ColorLab: The Matlab Toolbox for Colorimetry and Color Vision
ColorLab is a color computation and visualization toolbox for the MATLAB environment. It is intended to deal with color in general-purpose quantitative colorimetric applications such as color image processing and psychophysical experimentation.
ColorLab uses colorimetrically meaningful representations of color and color images (tristimulus values; chromatic coordinates and luminance; or dominant wavelength, purity and luminance) in any system of primaries of tristimulus colorimetry (including CIE standards such as CIE XYZ or CIE RGB). ColorLab relates this variety of colorimetric representations to the usual device-dependent discrete-color representation, i.e. it solves the problem of displaying a colorimetrically specified scene on the monitor within the accuracy of the VGA.
A number of other useful color representations are also provided, such as CIE uniform color spaces (CIE Lab and CIE Luv), opponent color representations based on advanced color vision models, and color appearance representations (RLab, LLab, SVF and CIECAM). All these representations are invertible, so the result of image processing carried out in these colorimetrically meaningful representations can always be inverted back to the tristimulus representation at hand and displayed. ColorLab includes visualization routines to represent colors in the tristimulus space or in the chromatic diagram of any color basis, as well as an advanced vector quantization scheme for color palette design. An extensive color database is also included, with the CIE 1931 color matching functions, reflectance data of 1250 chips from the Munsell Book of Color, the McAdam ellipses, normalized spectra of a number of standard CIE illuminants, matrices for changing between a number of tristimulus representations, and calibration data of an ordinary CRT monitor.
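To give a flavor of the kind of conversions involved, here is a minimal plain-MATLAB sketch of two of them: tristimulus values to chromatic coordinates plus luminance, and tristimulus values to CIE Lab. The formulas are the standard CIE ones; the variable names and the D65 white point are illustrative assumptions, not ColorLab's actual API (the toolbox functions are documented in the user guide).

```matlab
% Standard CIE conversions, written in plain MATLAB for illustration only.
XYZ  = [30 40 20];             % tristimulus values of a sample (X, Y, Z)
XYZn = [95.05 100 108.88];     % assumed white point (D65), needed for CIE Lab

% Chromatic coordinates and luminance
x = XYZ(1) / sum(XYZ);
y = XYZ(2) / sum(XYZ);
Y = XYZ(2);                    % luminance (in cd/m^2 if XYZ are absolute)

% CIE Lab (cube-root function with its linear branch near zero)
f = @(t) (t >  (6/29)^3) .* t.^(1/3) + ...
         (t <= (6/29)^3) .* (t/(3*(6/29)^2) + 4/29);
L = 116 * f(XYZ(2)/XYZn(2)) - 16;
a = 500 * (f(XYZ(1)/XYZn(1)) - f(XYZ(2)/XYZn(2)));
b = 200 * (f(XYZ(2)/XYZn(2)) - f(XYZ(3)/XYZn(3)));
```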
The standard tools in ColorLab (and in VistaLab) are the building blocks needed to develop the more sophisticated vision models included in the dedicated site VistaModels.
Table of Contents
- Colorfulness editing using the purity
- Hue-based segmentation and editing using the dominant wavelength
- Luminance editing in cd/m²
- Changing the spectral illumination (standard and user defined illuminants)
- Playing with McAdam ellipses and Munsell chips
- Chromatic induction in LLab
Colorfulness editing using the purity
Colorimetric Purity and Excitation Purity are the descriptors of colorfulness in Tristimulus Colorimetry. Both of them are available in ColorLab. In the example below we analyze the colors of an image in the CIE XYZ system and reduce the excitation purity by a constant factor, leaving the luminance and the dominant wavelength unaltered, in order to obtain an image with reduced colorfulness. Other possibilities to obtain this effect with ColorLab include using any other tristimulus representation or changing the colorfulness descriptors in a number of the available non-linear color appearance models.
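The core of the operation can be sketched in a few lines of plain MATLAB. Since excitation purity measures how far a chromaticity lies between the white point and the spectrum locus, scaling its distance to the white point by a constant factor scales the purity by that same factor. The variable names (im_xyY, xy_w) and the D65 white point are assumptions for illustration, not ColorLab's own functions.

```matlab
% im_xyY : N-by-3 matrix of (x, y, Y) values, one row per pixel
% xy_w   : chromatic coordinates of the white point
xy_w   = [0.3127 0.3290];                           % assumed D65 white
factor = 0.5;                                       % purity reduction factor

xy_new   = xy_w + factor * (im_xyY(:,1:2) - xy_w);  % shrink towards white
im_desat = [xy_new, im_xyY(:,3)];                   % luminance Y unchanged

% Back to tristimulus values for display: X = x*Y/y, Z = (1-x-y)*Y/y
X = im_desat(:,1) .* im_desat(:,3) ./ im_desat(:,2);
Z = (1 - im_desat(:,1) - im_desat(:,2)) .* im_desat(:,3) ./ im_desat(:,2);
XYZ_desat = [X, im_desat(:,3), Z];
```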
Hue-based segmentation and editing using the dominant wavelength
The Dominant Wavelength is the descriptor of hue in Tristimulus Colorimetry. In the example below we first segment the flowers by selecting a range of wavelengths (in the CIE XYZ chromatic diagram) and then we modify their hue by applying a rotation to the chromatic coordinates. Other possibilities to obtain this effect with ColorLab include using any other tristimulus representation or changing (rotating) the hue descriptor in a number of the available non-linear color appearance models.
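A rough plain-MATLAB sketch of the segmentation-plus-rotation step follows. The per-pixel dominant wavelengths (dom_wl), the chromatic coordinates (xy), the white point and the selected wavelength range are all illustrative assumptions; ColorLab provides its own routines for computing the dominant wavelength.

```matlab
% dom_wl : N-by-1 dominant wavelength of each pixel (nm)
% xy     : N-by-2 chromatic coordinates of each pixel
xy_w  = [0.3127 0.3290];                   % assumed white point
mask  = dom_wl >= 570 & dom_wl <= 610;     % e.g. select yellow-orange pixels
theta = 60 * pi/180;                       % hue rotation angle (radians)
R     = [cos(theta) -sin(theta); sin(theta) cos(theta)];

d          = xy(mask,:) - xy_w;            % coordinates relative to white
xy(mask,:) = d * R' + xy_w;                % rotate around white, shift back
```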
Luminance editing in cd/m²
Luminance is the descriptor of brightness in Tristimulus Colorimetry. In the example below we reduce the luminance by reducing the length of the tristimulus vectors by a constant factor in an arbitrary (RGB) tristimulus space (note how the chromatic diagram is twisted). Of course, the chromatic coordinates remain the same (as can be seen in the figures below). Other possibilities to obtain this effect with ColorLab include using any other tristimulus representation or changing the brightness descriptor in a number of the available non-linear color appearance models.
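A plain-MATLAB sketch of the operation, assuming T holds one tristimulus vector per row: scaling all tristimulus values by the same factor changes the luminance but not the chromatic coordinates, because x = X/(X+Y+Z) and y = Y/(X+Y+Z) are invariant to a common scale.

```matlab
% T : N-by-3 matrix of tristimulus values in any linear tristimulus space
factor = 0.4;                    % luminance reduction factor
T_dark = factor * T;             % shorter tristimulus vectors, same direction

% Chromatic coordinates before and after (identical up to rounding)
c_before = T      ./ sum(T, 2);
c_after  = T_dark ./ sum(T_dark, 2);
```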
Changing the spectral illumination (standard and user defined illuminants)
ColorLab is able to deal with the spectro-radiometric description of color images, or to estimate it from their (usual) colorimetric description by using the Munsell reflectance data set. In this way, the effect of changing the spectral radiance of the illuminant may be simulated by computing the new tristimulus values under the new illuminant. In the example below, each pixel of the original image is assumed to be a patch with a given (or estimated) reflectance under white-light illumination. The user may define a different illuminant (in this case a purple radiation) and apply it to the reflectances, thus obtaining the new image and the new (tristimulus) colors. Of course, this can be done in any tristimulus representation. Better still, if non-linear color appearance models are used together with the corresponding-pair procedure [JOSA A 04], color constancy may be predicted.
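The computation behind the relighting step reduces to integrating reflectance times illuminant against the color matching functions. The sketch below is plain MATLAB with assumed variable names (cmf, R, S); ColorLab's database supplies the CIE 1931 color matching functions and the Munsell reflectances.

```matlab
% cmf : Nlambda-by-3 CIE 1931 color matching functions (xbar, ybar, zbar)
% R   : Nlambda-by-Npix reflectances, one column per pixel
% S   : Nlambda-by-1 spectral power distribution of the new illuminant
k       = 100 / (cmf(:,2)' * S);     % normalization so that Y(white) = 100
XYZ_new = k * cmf' * (R .* S);       % 3-by-Npix tristimulus under the new light
```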
Playing with McAdam ellipses and Munsell chips
Now you can easily check the non-uniformity of the tristimulus space on your computer screen! Since ColorLab comes with the McAdam ellipses database and the Munsell chips database, its color reproduction ability allows you to generate the right colors to show that color discrimination is not Euclidean in this space. In the first example below, we distort two given colors (green and blue) by a constant factor in the chromatic diagram along the principal directions of the ellipses. Despite the possible inaccuracies introduced by the use of a generic calibration, it is clear that the blues differ more from each other (their ellipse is smaller!) and that, in both cases, the distortion is more noticeable when it is applied along the short direction of the ellipse. The second example shows a set of Munsell chips of different chroma chosen to differ from each other by a constant number of JNDs.
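The first demonstration can be sketched as follows, assuming the McAdam ellipse around a chromaticity xy0 is given as a 2-by-2 matrix M such that points d on its boundary (relative to xy0) satisfy d*M*d' = 1. The matrix representation, the step size and the variable names are assumptions for illustration, not the format of the ColorLab database.

```matlab
% xy0 : 1-by-2 chromatic coordinates of the test color
% M   : 2-by-2 matrix of its McAdam ellipse, d*M*d' = 1 on the boundary
[V, D]  = eig(M);                 % eigenvectors = principal directions
[~, i]  = sort(diag(D));          % small eigenvalue <-> long semi-axis
step    = 0.02;                   % same Euclidean step in the chromatic diagram

xy_long  = xy0 + step * V(:, i(1))';  % along the long axis: fewer JNDs away
xy_short = xy0 + step * V(:, i(2))';  % along the short axis: more JNDs away
```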
Chromatic induction in LLab
The perception of a test stimulus is modified by the stimuli in its surround. This is referred to as chromatic induction. In the example below, the (physically constant) gray test in the center appears increasingly bluish as the surround gets more yellow. Non-linear color appearance models are required to account for this effect.
Key Capabilities
- Visualization: Visualize color in tristimulus spaces or chromatic diagrams.
- Transformation: Move between tristimulus and non-linear color models like CIECAM.
- Quantitative Processing: Apply functions for color purity, luminance, and hue manipulation.
- Extensive Color Database: Includes CIE color matching functions, Munsell chips, McAdam ellipses, and more.
Download ColorLab
- Toolbox: Colorlab.zip (15MB)
- User Guide: ColorLab_userguide.pdf (12MB)
References
- ColorLab: the Matlab toolbox for Colorimetry and Color Vision. J. Malo & M.J. Luque. Univ. Valencia (2002)
- Corresponding-pair procedure: a new approach to simulation of dichromatic color perception. P. Capilla, M. Diez, M.J. Luque, & J. Malo. JOSA A 21(2): 176-186 (2004)
- Nonlinearities and Adaptation of Color Vision from Sequential Principal Curves Analysis. V. Laparra, S. Jimenez, G. Camps & J. Malo. Neural Computation 24(10): 2751-2788 (2012)
- Spatio-Chromatic Adaptation via Higher-Order Canonical Correlation Analysis of Natural Images. M. Gutmann, V. Laparra, A. Hyvarinen & J. Malo. PLoS ONE 9(2): e86481 (2014)
- Visual aftereffects and sensory nonlinearities from a single statistical framework. V. Laparra & J. Malo. Frontiers in Human Neuroscience 9:557 (2015)
- Effect of a Yellow Filter on Brightness Evaluated by Asymmetric Matching: Measurements and Predictions. M.J. Luque, et al. J. Opt. A: Pure Appl. Opt. (Inst. of Physics) 8(5): 398-408 (2006)
- Analyzing the metrics of the perceptual space in a new multistage physiological colour vision model. E. Chorro, F.M. Martínez-Verdú, D. de Fez, P. Capilla, & M.J. Luque. Color Res. Appl. 34: 359-366 (2009)
- Images Perceived after Chromatic or Achromatic Contrast Sensitivity Losses. M.J. Luque, et al. Optom. Vision Sci. 87(5): 313-322 (2010)
- Simulating Images Seen by Patients with Inhomogeneous Sensitivity Losses. P. Capilla, M.J. Luque & M. Diez. Optom. Vision Sci. 89(10): 1543-1556 (2012)
- Software for simulating dichromatic perception of video streams. M.J. Luque, D. de Fez & P. Acevedo. Color Res. Appl. 39: 486-491 (2014)