Journal of Sensors
Volume 2016 (2016), Article ID 6731572, 7 pages
Research Article

Colorimetric Analysis Using Scene-Adaptive Color Conversion Matrix of Calibrated CIS

School of Electronics Engineering, Kyungpook National University, 80 Daehakro, Bukgu, Daegu 702-701, Republic of Korea

Received 14 July 2016; Accepted 6 September 2016

Academic Editor: Yasuko Y. Maruo

Copyright © 2016 Sung-Hak Lee. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The RGB signals of different CISs (color image sensors) do not register the same values for the same viewing scene owing to differences in their spectral sensitivities and white balance mechanisms. Thus, CISs must be characterized against CIE standard observer data for colorimetric purposes. One general method for characterizing a CIS is least squares polynomial modeling, which derives the colorimetric transfer matrix between RGB outputs and CIE tristimulus inputs. However, a transfer matrix obtained under standard CIE illumination cannot accurately characterize a CIS operated under illuminations of varying chromaticity and luminance. Therefore, repeated experiments are necessary to obtain accurate colorimetric analysis results. This paper presents a scene-adaptive colorimetric analysis method using images captured by a general consumer camera under various environments.

1. Introduction

Generally, the RGB outputs generated by CISs (color image sensors) are device-dependent. This means that the RGB signals do not correspond to device-independent tristimulus values based on the CIE CMFs (color matching functions) [1]. This is because the spectral sensitivity of CISs is not matched to the standard CMFs, and the characteristics of color sensors used in consumer cameras usually depend on the surrounding environment when capturing a scene [2]. Therefore, it is necessary to determine the transfer characteristics defining the relationship between RGB signals and standard color stimuli for post-color matching and enhancement processes such as color reproduction, color correction, and HDR (high dynamic range) rendering to facilitate accurate scene rendering in general display devices [3–8].

The derivation of a transform relationship between RGB signals and CIE stimuli is known as CIS characterization [9], and it can be performed by spectral-sensitivity-based and color-target-based methods. The color-target-based method uses reference colors and their measured values on a color chart and is thus relatively simple and practical compared to the spectral-sensitivity-based method, which requires spectral analysis of the camera. The polynomial regression method based on least squares fitting has been widely adopted by color researchers to calculate the transfer matrix from the captured RGB values and their corresponding tristimulus values [9–13]. However, camera characterization under a specific standard illuminant does not provide good estimates for the different ambient light conditions in which an image may be captured. Accordingly, for more accurate color analysis, flexible and rigorous experiments under various white balance conditions should be performed.

Generally, most color images from CISs should be chromatically calibrated to remove a specific color cast. The changed transfer matrix for any other white balance condition can be obtained from the surrounding illumination and the phosphor primaries of a camera [12, 13], and color measurement performance can be enhanced using a transfer matrix recalculated through illuminant estimation techniques. Commonly, illuminant estimation is referred to as white point estimation. It is assumed that the CIS is then calibrated to produce similar RGB values for white under any illuminant; however, it is uncertain which illuminant will be used and how similar the RGB values will be to each other. In theory, color correction with unknown white balance is not problematic, because the calibrating coefficients used to scale the sensor parameters are absorbed into the color transformation required for the color correction [14].

In this paper, a novel colorimetric analysis method is proposed using images captured by a consumer digital camera. In general cases, it is hard to estimate the internal calibration state of the camera sensor and the surrounding illumination. For this reason, without an illumination estimation process, the color conversion function for various RGB levels is derived using only two images, captured in the AWB (auto white balance) mode and the preset mode. The difference between the two images characterizes the internal calibration process that corrects the color balance, making it possible to estimate the color conversion of the image sensor through the color transfer function. In the simulation results, performance comparisons are made with cameras characterized using a single matrix and multiple matrices, demonstrating that the proposed camera characterization method performs effectively relative to previous methods.

2. Image Calibration

In handling captured images, an important issue concerns the color balance status of the image sensor. In most consumer digital cameras, the sensor color channels are calibrated according to sensor response to the intensity and wavelength of the scene. The AWB techniques find the optimal color balanced condition for varying surroundings, by using various illumination estimation methods such as grey world, white patch based, neural network based, and bootstrapping methods [14]. Moreover, conventional digital cameras adopt a knee curve for dynamic range compression and the nonsaturated highlight range [15].
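As a concrete illustration of the simplest of the illumination estimation methods named above, the following is a minimal gray-world white balance sketch (the function name and the [0, 1] float image convention are assumptions, not from the paper):

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world AWB sketch: assume the scene average is achromatic and
    scale each channel so that all channel means match the global mean.

    img: float RGB array of shape (H, W, 3) with values in [0, 1].
    """
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gains = means.mean() / means              # pull each channel toward gray
    return np.clip(img * gains, 0.0, 1.0)
```

Real camera pipelines combine such estimators with the gamma and knee controls mentioned above, which is why the resulting calibration is nonlinear.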

Figure 1 shows the difference between the sensor responses of Nikon and Canon cameras. Both cameras are balanced for identical illumination, under the A and D65 illuminants, for the same scene. This allows a comparison of color balance transformations between different digital cameras. Sensor responses are plotted in the relative RGB space and show how the RGB values are calibrated between white balanced and unbalanced images. For some color patches, the RGB values are synthesized with the different sensor characteristics of the two cameras. It can be seen that the RGB channel gains are controlled differently for different signal levels and that the RGB responses vary significantly even though the light conditions are consistent. These differences are caused by the camera-specific color constancy algorithms. In other words, the color sensors of the two cameras perform differently even when color balanced for the same illumination. The internal calibrating parameters are absorbed into the color transformation and cause a nonlinear transformation owing to the gain, gamma, and knee control properties.

Figure 1: Comparisons of relative RGB transform relationships obtained from color balanced images by two different cameras for the same illuminant: (a) under illuminant A, (b) under illuminant D65.

3. Colorimetric Analysis

Color imaging systems using CISs can be characterized by the colorimetric method. The resulting camera characterization generates a transfer matrix for the colorimetric analysis of captured images. The color-target-based method uses reference colors and their corresponding values to determine an approximated linear transformation between CIE values and camera RGB signals. The transfer characterization matrix is derived through polynomial regression based on measurements of known color samples. The color target-based characterization using polynomial regression is generally adopted owing to its simplicity and effectiveness.

The transformation between the tristimulus values of the color test targets and the corresponding camera RGB signals is shown in the following:

[X Y Z]^T = M [R G B]^T. (1)

The camera characterization matrix M is determined to minimize the color differences over all test targets. Because the RGB values should correspond to the XYZ tristimulus values through the matrix, M is derived by using the least squares method [16].
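The least squares derivation above can be sketched in a few lines of numpy. The patch data here are synthetic (the 3×3 matrix and the 24 random patches are illustrative assumptions, not measured values):

```python
import numpy as np

# Hypothetical training data: RGB responses of 24 color-chart patches and the
# XYZ tristimulus values measured for the same patches (synthetic here).
rng = np.random.default_rng(0)
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.uniform(0.05, 1.0, size=(24, 3))   # 24 Macbeth-style patches
xyz = rgb @ M_true.T                          # noiseless synthetic targets

# Least squares solution of XYZ = M . RGB over all patches:
# solve rgb @ M.T ~ xyz in one call, then transpose.
M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = M.T                                       # 3x3 characterization matrix
```

With noiseless data the fit recovers the generating matrix exactly; with real chart measurements it minimizes the squared residual over all targets, as described in the text.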

Nevertheless, this method is accurate only under certain standard illuminants because the characteristics of CISs are dependent on the surrounding illumination. In addition, the internal AWB process controls the variations of chromaticity and intensity for correct imaging. The cameras have different colorimetric characteristics according to white balance conditions. Accordingly, the transfer matrix should be recalculated for these inconsistent cases and thus illumination-adaptive measuring methods have been proposed.

A recently proposed method uses the estimated phosphor primaries and reference white points to calculate a constant tristimulus transfer matrix for certain illumination conditions [12]. However, this requires the phosphor chromaticities for the RGB and white references to have been measured in advance. An alternative method is color measurement based on multicharacterization and illuminant estimation [13]. First, representative camera transfer matrices are measured for the A, D50, and D65 illuminants. Then, the CCT (correlated color temperature) of an illuminant is estimated to select the correct representative transfer matrix for that illuminant. Finally, the tristimulus weighting factors from ALC (auto luminance control) functions are modified to control signal gains for sustained luminance levels. This enables more precise estimation results through the selective transfer matrix.

4. Colorimetric Analysis Based on a Calibrated Color Conversion Matrix

The characteristics and calibration of CISs are dependent on the surrounding illuminants. To obtain a precise camera characteristic that changes under different ambient conditions, it is necessary to know calibrating parameters such as the status of the ALC registers and the balanced white points. However, the characteristics of balanced images and the sensor mechanism are unknown for consumer cameras; thus, it is very difficult to determine the condition of the calibrated sensors in an arbitrary ambient situation. The previous analytical methods are applicable only under restricted measuring circumstances, in which it is possible to inspect the sensor's signal processing by interfacing with the internal registers. Thus, in this section, an ambient-independent method is proposed for enhanced colorimetric measurements. It derives the resulting color transformation from the characterizations of the uncalibrated and calibrated images in the preset and AWB modes, respectively.

First, the RGB input signals are converted to normalized values that are compensated linearly with the inverse of the camera gamma:

C' = (C / (2^n − 1))^(1/γ_C), C ∈ {R, G, B}, (2)

where R', G', and B' are the gamma-compensated, normalized outputs of the RGB inputs, n is the bit-depth, and γ_R, γ_G, and γ_B are the gamma values of each RGB channel.
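The normalization and gamma compensation step can be sketched as follows; the default encoding gammas of 0.45 (≈1/2.2) are assumptions, as a real sensor's per-channel gammas would have to be measured:

```python
import numpy as np

def normalize_linear(rgb, n_bits=8, gamma=(0.45, 0.45, 0.45)):
    """Normalize n-bit RGB codes to [0, 1] and compensate with the inverse of
    the per-channel camera gamma to recover linear signals.

    gamma: assumed encoding gammas (~0.45, i.e. 1/2.2) per R, G, B channel.
    """
    rgb = np.asarray(rgb, dtype=float) / (2 ** n_bits - 1)
    return rgb ** (1.0 / np.asarray(gamma, dtype=float))
```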

Then, the camera characterization for the transfer function should be performed under the reference illuminant. The color transfer matrix is calculated for a camera set to the preset mode. Herein, the 5000 K illuminant is adopted as the reference, and the RGB channel gains are set internally such that the RGB outputs are in a ratio of 1 : 1 : 1 for white and grey samples under that illuminant. Thus, the preset mode serves as the reference mode for the D50 illuminant. The transfer characterization of a CIS can be conducted using polynomial regression with the least squares method. For the reference illuminant, the XYZ tristimulus values are estimated from the derived matrix (M_ref) and the RGB signals in the preset mode as per the following:

[X Y Z]^T = M_ref [R' G' B']^T. (3)

However, for an arbitrary illuminant condition, the single characterization does not provide an adequate estimation of the reflected color signal. The transfer matrix should be modified to correspond to the changed surroundings.

As shown in Figure 1, the calibrated images represent the converted color balances, and it is possible to find the weights of the primaries and channel gains by using the difference between the uncalibrated and calibrated images. The difference between the uncalibrated image in the preset (or reference) mode and the calibrated image in the specific AWB condition can be used to determine a color conversion for different signal levels. The transfer gains of each R, G, and B channel for the lower part (g_low) and upper part (g_up) are calculated approximately as per the following:

g_low = mean(C_low_AWB) / mean(C_low_ref), g_up = mean(C_up_AWB) / mean(C_up_ref), (4)

where mean(·) denotes the average value, the subscripts "low_AWB" and "low_ref" refer to the lower part (inputs below the average of the total reference-mode inputs) in the AWB mode and the reference mode, respectively, and "up_AWB" and "up_ref" refer to the upper part in each mode. The calculated data are sampled at a rate of 10 : 1 to reduce the computing burden.
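The gain computation above can be sketched per channel as follows (the function name and array conventions are assumptions; the 10 : 1 subsampling follows the text):

```python
import numpy as np

def channel_gains(ref, awb, step=10):
    """Lower/upper transfer gains for one color channel from paired pixels of
    the reference-mode image (ref) and the AWB-mode image (awb), split at the
    mean of the reference inputs and subsampled 10:1 to cut the computing load.
    """
    ref = np.asarray(ref, dtype=float).ravel()[::step]
    awb = np.asarray(awb, dtype=float).ravel()[::step]
    thr = ref.mean()                      # lower/upper split point
    low, up = ref < thr, ref >= thr
    g_low = awb[low].mean() / ref[low].mean()
    g_up = awb[up].mean() / ref[up].mean()
    return g_low, g_up
```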

According to the separated signals, the approximation function for the transfer gains consists of two parts. The bi-segmental channel transfer function is derived to reflect the nonlinear transformation property as per the following:

Ĉ_AWB = g_low · C_ref for C_ref below the reference average; Ĉ_AWB = a·C_ref² + b·C_ref + c otherwise, (5)

where Ĉ_AWB is the estimated AWB output, C̄_low is the average of the reference values less than the overall reference average, C̄_up is the average of the reference values greater than it, and C_max,ref and C_max,AWB are the maximum values in the reference and AWB modes, respectively; these quantities determine the quadratic coefficients a, b, and c.

The AWB output has been estimated nonlinearly using a quadratic function for the region of reference inputs above the average, while the transfer function has been derived linearly for the relatively narrow region of lower values below it. The construction of the channel transfer function using the bi-segmental color gains, g_low and g_up, is illustrated in Figure 2.
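A possible implementation of the bi-segmental function is sketched below. Fitting the quadratic upper segment through the three anchor points formed by the gain-scaled region averages and the matched maxima is an assumed construction, chosen to be consistent with the quantities defined in the text and with Figure 2:

```python
import numpy as np

def bisegmental(ref, g_low, g_up, low_avg, up_avg, ref_max, awb_max):
    """Bi-segmental channel transfer function: linear with gain g_low below
    the lower-region average, quadratic above it. The quadratic is fit through
    (low_avg, g_low*low_avg), (up_avg, g_up*up_avg) and (ref_max, awb_max),
    an assumed anchoring consistent with Figure 2."""
    x = np.array([low_avg, up_avg, ref_max], dtype=float)
    y = np.array([g_low * low_avg, g_up * up_avg, awb_max], dtype=float)
    a, b, c = np.polyfit(x, y, 2)          # exact quadratic through 3 points
    ref = np.asarray(ref, dtype=float)
    return np.where(ref < low_avg, g_low * ref, a * ref**2 + b * ref + c)
```

When both gains are 1 and the maxima coincide, the function reduces to the identity, as expected for an image that needs no conversion.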

Figure 2: Illustration of the estimation of the channel transfer function using the lower and upper color gains, g_low and g_up.

Then, the new color transfer matrix of the CIS is reconstructed using a color conversion matrix (M_CC) between the uncalibrated and calibrated images as per (6). Finally, the tristimulus values of color objects in a scene can be estimated using the generated transfer matrix:

[X Y Z]^T = M_ref · M_CC^−1 · [R G B]_AWB^T, (6)

where [R G B]_AWB are the calibrated inputs in the AWB mode, M_CC^−1 recovers the uncalibrated inputs in the reference mode, and the resulting XYZ are the tristimulus values for general nonstandard illuminant environments.
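The final estimation step can be sketched as follows. As a simplification, M_CC is modeled here as a diagonal matrix of per-channel gains (the level-dependent bi-segmental gains of the text would select these per pixel); the function name is an assumption:

```python
import numpy as np

def estimate_xyz(rgb_awb, M_ref, gains):
    """Estimate scene XYZ from AWB-mode RGB. The color conversion matrix
    M_CC is sketched as diagonal per-channel gains (gR, gG, gB) mapping
    reference-mode inputs to AWB-mode inputs; inverting it recovers the
    reference inputs, to which the reference matrix M_ref then applies."""
    M_cc = np.diag(np.asarray(gains, dtype=float))
    rgb_ref = np.linalg.inv(M_cc) @ np.asarray(rgb_awb, dtype=float)
    return M_ref @ rgb_ref
```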

When the illuminant changes, the color conversion matrix controls the three channel gains according to the image calibration performed by AWB processing. The proposed colorimetric analysis flow is shown in Figure 3(b), in comparison with the multi-matrix method of Figure 3(a). The multi-matrix method requires finding characteristic transfer matrices under various standard illuminations and estimating the illuminant. On the other hand, the adaptive-matrix method estimates illuminants and color information by obtaining the color conversion matrix from images captured in two different modes, without illuminant estimation or the AWB information of the image sensor.

Figure 3: Block diagrams of color analysis processes: (a) the multi-matrix method and (b) the scene-adaptive-matrix method.

5. Simulation Results

The proposed algorithm has been tested on images containing the 24 Macbeth color samples and various objects under the representative D50, D65, and A illuminants using a Macbeth lighting booth. A schematic diagram of the camera characterization and color estimation is shown in Figure 4. The viewing conditions considered for the verification of the method are restricted to indoor environments and the three illuminants. A CMOS CIS (TOSHIBA TCM8230MD) module including a control interface was used for the experiments. To evaluate the multi-matrix method against the proposed method, the illuminant was first estimated using a single characterization matrix (D50) on the higher-luminance pixels corresponding to the upper 10% of signals. Then, the closest match among the three illuminants was selected and the corresponding characterization matrix was used for the color estimations. For the preset mode, the internal parameters of the camera were fixed so that the RGB outputs maintained the white balance state under the 5000 K illuminant.

Figure 4: Schematic of the camera characterization and color estimation for the evaluation.

Tables 1, 2, and 3 show the relative performance of the color and illuminant estimations as color differences between the measured and predicted values, along with a comparison of the processing steps between the multi-matrix and adaptive-matrix methods. The average errors were computed in the CIELAB color space. Comparisons of average color differences for the color samples and illuminants for the three methods are shown in Tables 1 and 2, respectively. The results show that the adaptive-matrix method outperforms the multi-matrix method by 16% and 65% for the estimation of color samples and illuminants, respectively. The most meaningful results are the color estimations under illuminant D65 and the white estimations for the D50, D65, and A illuminants. In comparison with the conventional multicharacterization for typical illuminants through an illuminant estimation process, the proposed method requires only a single characterization, no ALC information, and no selection based on illuminant estimation; however, it does require acquiring images in two modes.
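The color-difference metric used for these comparisons can be computed directly; a minimal sketch of the CIE 1976 ΔE*ab formula (assuming the tables report this simple Euclidean variant) is:

```python
import numpy as np

def delta_e_ab(lab1, lab2):
    """CIE 1976 color difference (Delta E*ab): the Euclidean distance
    between two CIELAB triplets (L*, a*, b*)."""
    d = np.asarray(lab1, dtype=float) - np.asarray(lab2, dtype=float)
    return float(np.sqrt((d ** 2).sum()))
```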

Table 1: Comparison of estimation performance among the single-matrix, multi-matrix, and adaptive-matrix methods in terms of color differences.
Table 2: Comparison of illuminant estimation among the single-matrix, multi-matrix, and adaptive-matrix methods in terms of color differences.
Table 3: Comparison between the multi-matrix and adaptive-matrix methods in terms of processing steps and results.

As shown in Table 3, the multi-matrix method requires finding the characteristic transfer matrices for various standard illuminations. Additionally, its illuminant estimation for matrix selection, which uses pixel samples at higher levels, is inaccurate. In contrast, the adaptive-matrix method demonstrates satisfactory results for illuminants and color information by obtaining the color conversion matrix from images captured in two different modes, without any other interfacing with the CIS.

6. Conclusions

In color imaging systems, colorimetric measurement is necessary to reproduce color images correctly. This is generally performed by techniques such as chromatic adaptation transformation, which requires absolute luminance and chrominance information.

It is also indispensable to estimate illuminants and object colors in order to correct the degradation caused by tone mapping and color distortion when rendering high dynamic range scenes on low dynamic range displays. However, it is difficult to obtain accurate colorimetric measurements because the AE (auto exposure) and AWB of cameras operate according to the exposed environment. The status of sensor calibration is unknown because it differs across imaging systems. In general, color conversion techniques have been developed to convert between different illuminants based strictly on physical quantities, and thus the measured results depend on the chosen illuminant.

In this paper, to solve this problem, the transfer matrix for given white balance conditions is estimated without using the internal CIS register values. The color conversion matrix is derived solely from the difference between the uncalibrated and calibrated images. This method of color analysis therefore remains consistent without assuming known color processing or illuminants. Experimental results show that the proposed method is valid in terms of measuring performance. The prediction of the luminance and chromaticity of scenes is applicable to CISs in automatic systems responding to various environments.

Competing Interests

The author declares that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2015R1D1A1A01059929).


References

1. CIE, Publication No. 15.2, Colorimetry, Central Bureau of the CIE, Vienna, Austria, 2nd edition, 1986.
2. S. Hullfish and J. Fowler, Color Correction for Video, Focal Press, 2nd edition, 2008.
3. P.-C. Hung, “Colorimetric calibration in electronic imaging devices using a look-up-table model and interpolations,” Journal of Electronic Imaging, vol. 2, no. 1, pp. 53–61, 1993.
4. M. D. Fairchild, Color Appearance Models, John Wiley & Sons, New York, NY, USA, 2nd edition, 2005.
5. G. W. Larson, H. Rushmeier, and C. Piatko, “A visibility matching tone reproduction operator for high dynamic range scenes,” IEEE Transactions on Visualization and Computer Graphics, vol. 3, no. 4, pp. 291–306, 1997.
6. E. Reinhard, M. Stark, P. Shirley, and J. Ferwerda, “Photographic tone reproduction for digital images,” ACM Transactions on Graphics, vol. 21, no. 3, pp. 267–276, 2002.
7. N. Moroney, M. D. Fairchild, R. W. G. Hunt, C. Li, M. R. Luo, and T. Newman, “The CIECAM02 color appearance model,” in Proceedings of the IS&T's Color and Imaging Conference (CIC '02), pp. 23–27, Scottsdale, Ariz, USA, 2002.
8. R. Mantiuk, R. Mantiuk, A. Tomaszewska, and W. Heidrich, “Color correction for tone mapping,” Computer Graphics Forum, vol. 28, no. 2, pp. 193–202, 2009.
9. G. Hong, M. R. Luo, and P. A. Rhodes, “A study of digital camera colorimetric characterization based on polynomial modeling,” Color Research and Application, vol. 26, no. 1, pp. 76–84, 2001.
10. T. Johnson, “Methods for characterizing colour scanners and digital cameras,” Displays, vol. 16, no. 4, pp. 183–191, 1996.
11. H. R. Kang, The Color Technology for Electronic Imaging Devices, SPIE Optical Engineering Press, Bellingham, Wash, USA, 1997.
12. E.-S. Kim, S.-H. Lee, S.-W. Jang, and K.-I. Sohng, “Adaptive colorimetric characterization of camera for the variation of white balance,” IEICE Transactions on Electronics, vol. E88-C, no. 11, pp. 2086–2089, 2005.
13. S.-H. Lee, J.-H. Lee, and K.-I. Sohng, “An illumination-adaptive colorimetric measurement using color image sensor,” IEICE Transactions on Electronics, vol. 91, no. 10, pp. 1608–1610, 2008.
14. V. C. Cardei, B. Funt, and K. Barnard, “White point estimation for uncalibrated images,” in Proceedings of the IS&T 7th Color Imaging Conference, pp. 97–100, 1999.
15. Y. Monobe, H. Yamashita, T. Kurosawa, and H. Kotera, “Dynamic range compression preserving local image contrast for digital video camera,” IEEE Transactions on Consumer Electronics, vol. 51, no. 1, pp. 1–10, 2005.
16. D. G. Zill and M. R. Cullen, Advanced Engineering Mathematics, Jones and Bartlett Publishers International, 2nd edition, 1999.