Active and Passive Electronic Components
Volume 2016, Article ID 7467165, 16 pages
http://dx.doi.org/10.1155/2016/7467165
Research Article

Color Calibration for Colorized Vision System with Digital Sensor and LED Array Illuminator

School of Electrical and Electronic Engineering, East China Jiaotong University, Nanchang, Jiangxi 330013, China

Received 26 February 2016; Accepted 21 April 2016

Academic Editor: Huikai Xie

Copyright © 2016 Zhenmin Zhu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Color measurement by a colorized vision system is a superior method for evaluating color objectively and continuously. However, the accuracy of color measurement is influenced by the spectral responses of the digital sensor and the spectral mismatch of the illumination. In this paper, a color vision system with a digital sensor, illuminated by an artificial D65 illuminator and an LED array respectively, is presented. The Polynomial-Based Regression method is applied to solve the problem of color calibration in the sRGB and CIELAB color spaces. By mapping the tristimulus values from RGB to the sRGB color space, the color difference between the estimated values and the reference values is reduced to an unnoticeable level. Additionally, the mapping matrix proves effective in reducing the color difference and is subsequently introduced into the proposed colorized vision system for better color measurement. Printed matter of clothes and colored ceramic tile are chosen as the application experiment samples of our colorized vision system. As shown in the experimental data, the average color difference of the calibrated images falls within the unnoticeable range, which indicates that better color measurement performance is obtained via the proposed colorized vision system.

1. Introduction

Color measurement is essential for a very wide range of industrial applications including paint, paper, oil, skin, printing, food, plastics, and ceramic tile [1–8]. The superficial appearance and the color are regarded as the most important elements evaluated by consumers and are critical factors in the acceptance of products. Traditionally, color measurement is performed by a commercial colorimeter or by human observers. Colorimeter instruments can measure only small, nonrepresentative areas of a few square centimeters and cannot provide automatic measurement over pixel-resolved images [7]. What is worse, these devices are expensive and can only be utilized to calculate average tristimulus values. The colors of product images are usually complex and nonuniform, so they cannot be measured accurately by a colorimeter. On the other hand, the automatic visual measurement of colors in an industrial production process can improve the overall quality of the products. The advantage of color measurement by a colorized vision system is that machines can evaluate the color continuously and objectively [1]. In this paper, automatic color measurement performed by a proposed colorized vision system with a digital sensor and an LED array illuminator is discussed.

In our previous work, a colorized vision system with a digital sensor and an optimal LED array illuminator was investigated for color vision [9–11]. Nevertheless, the uncertainty of the LED array illuminator would be a critical factor for color measurement due to stray light errors, the instabilities of the colorimetric system, and the spectral mismatch of the illumination. Compared with the standardized D65 illuminator, the average color reproduction error of our LED array illuminator is up to 7.63, which poses a considerable error for color measurement [12]. Therefore, in order to achieve a lower uncertainty of the color measurement, color calibration for optimizing the illumination is necessary. In this paper, color calibration is performed under the standardized D65 illuminator and the LED array illuminator, respectively, and it is applied to the same scenes captured by the same digital sensor.

Furthermore, the color images generated by a digital sensor are usually device-dependent. Via the CIE 1931 2° color matching functions (CMFs), the standard spectral tristimulus values were established by CIE 1931 and are utilized for device-independent color representations on the basis of the properties of the human vision system. Besides, different digital sensors produce different RGB responses. As illustrated in Figures 1(a) and 1(b), where the CIE 2° color matching functions and the spectral response functions of our digital sensor are represented respectively, there exists a remarkable difference between the responses of the digital sensor and the CIE 2° color matching functions. Since color measurement is performed on tristimulus instruments, even small deviations of the spectral response functions from the CMFs can produce significant color measurement errors [13]. Hence the tristimulus values produced by the digital sensor and the standard tristimulus values of the same scene will be obviously different, and color calibration for the digital sensor is definitely necessary. Plenty of researchers have used optimal tristimulus filters to design surface color measurement devices [13–17]. Wolski et al. [14] formulated a nonlinear optimization of sensor response functions to minimize the color space error for colorimetry of reflective and emissive surfaces. Ng et al. [16] developed an imaging colorimeter in order to obtain color coordinates for measuring tooth color. Kosztyán et al. [13] put forward a matrix-based color correction to reduce the systematic errors of color measurement; they investigated the matrix elements that minimize the spectral mismatch errors for a given digital sensor and different illumination distributions. However, although these works designed optimal filters to realize the CIE color matching functions, such filters are expensive and impractical for low-cost real-time industrial color measurement.
In this paper, some simple methods are proposed to decrease the tristimulus errors resulting from the spectral mismatch between the digital sensor and the CIE standardized color matching functions.

Figure 1: Comparison of spectral response: (a) CIE 2° color matching functions and (b) spectral response functions of our digital sensor.

Several color calibration algorithms have been proposed for various tasks in different device-independent color spaces. The most commonly used methods include the Polynomial-Based Regression, the Neural Network mapping algorithms, the Support Vector Regression, and the Ridge Regression [18–29]. Moreover, these color calibration methods have been applied to general imaging devices [16, 18–33], such as digital cameras [16, 18–21, 25–27], digital colposcopy [22], scanners [23, 24, 30], printers and cathode ray tube/liquid crystal display (CRT/LCD) monitors [23, 28], and tristimulus colorimeters [31–33]. Most related researches put emphasis on the comparison of color calibration methods for various tasks. Wang and Zhang [21] put forward an optimized calibration scheme for tongue images; in a comparison with several popular color calibration algorithms, Polynomial-Based Regression was selected as the most suitable method for tongue image calibration in the sRGB color space. Shi and Healey [24] used the Polynomial-Based Regression approach to calibrate a color scanner; by treating color scanner calibration as a reflectance estimation problem, linear reflectance models were applied in the calibration process. Hong et al. [25] used a polynomial model to derive the colorimetric mapping between camera RGB signals and CIE tristimulus values, studying how the number of reference samples and the number of calibration matrix terms affect the color calibration accuracy. Li et al. [22] presented a calibration system for digital colposcopy in the CIELAB color space, using a polynomial transformation matrix to calibrate the colposcopy images. Cheung et al. [20] performed a comparative study between artificial neural networks and the polynomial transformation for camera characterization, concluding that the polynomial transformation offers a better alternative owing to its simple principle and shorter training time.
Thus, the Polynomial-Based Regression method is selected as a simple calibration method in this paper. However, the literature mentioned above merely selected a single device-independent color space without explaining why that color space was chosen. The comparison of calibration results across different device-independent color spaces has been totally overlooked, and no actual data comparison after color calibration has been reported. Hence, this paper compares two common device-independent color spaces from the aspects of the calibration accuracy, the convenience of image display, and the complexity of the calibration algorithms of the proposed colorized vision system. Significantly, the objective of this paper is to present a novel color calibration method for decreasing the tristimulus value errors [34] caused by the spectral mismatch of the digital sensor and the color reproduction error of the LED illuminator.

This paper is organized as follows: the choice of two commonly used device-independent color spaces for the colorized vision system and the transformation relationship between them are discussed in Section 2. The proposed colorized vision system, the experiment conditions, and the training samples are explained in Section 3. Then, the colorized vision system mathematical model on which the color calibration procedure is based and the Polynomial-Based Regression method are discussed in Section 4, with a detailed explanation of the Polynomial-Based Regression transformation from device-dependent to device-independent tristimulus values. In Section 5, the comparison of the sRGB and CIELAB color spaces in the color calibration process is provided elaborately; besides the calibration accuracy, the average color difference, the chromaticity diagram, and the average chromaticity coordinates of images before and after calibration are discussed. Printed matter of clothes and colored ceramic tile are used as samples in Section 6, in an application experiment that confirms the validity of the colorized vision system for color measurement. Ultimately, the discussion and conclusion are given in Section 7.

2. Color Space for Calibration

In this section, the selection of two commonly used device-independent color spaces is described. Color space plays a central role in the development of a color vision system, and it has a significant influence on the calibration process as well.

Color space, as a mathematical model, presents a color by three or four values of color components for the purpose of describing colors under a particular standard, such as the RGB, CMYK, and HSV color spaces. Color spaces can be broadly split into two basic types: device-independent color spaces and device-dependent color spaces. A device-independent color space produces the same color whatever color input or output device is used, while a device-dependent color space describes tristimulus values defined by the characteristics of a specific imaging device. The purpose of color calibration is to transform device-dependent into device-independent tristimulus values. The commonly used color spaces are listed in Table 1, but not all of them have the same effect when used for color calibration. Plenty of researches have selected sRGB [19, 21, 27], CIEXYZ [20, 25, 28, 30], and CIELAB [22] as the device-independent color space. One of the objectives of this paper is to compare the color calibration results calculated in different device-independent color spaces.

Table 1: The commonly used color spaces.

The sRGB color space was developed by an IEC technical committee and has been endorsed by many CRT (cathode ray tube) companies. It has several advantages for color calibration. First, the color space has been adopted by many CRT-based display devices, including computer monitors, printers, and the Internet, which means that images calibrated in the sRGB color space can be displayed on a computer monitor immediately. Second, the standardized D65 illuminant (color temperature 6500 K) has been recommended by the International Commission on Illumination (CIE) to simulate daylight sunshine; images captured by the digital sensor under a D65 illuminator show an optimum display effect, and sRGB has the white point of the D65 illuminator in the middle of its color gamut [12]. Finally, the sRGB tristimulus values have a constant transformational relationship with CIEXYZ. For linearized sRGB values (R, G, B), it is expressed (per IEC 61966-2-1) as follows:

X = 0.4124R + 0.3576G + 0.1805B,
Y = 0.2126R + 0.7152G + 0.0722B,  (1)
Z = 0.0193R + 0.1192G + 0.9505B,

where the 3 × 3 coefficient matrix represents the transformation model built from the sRGB color space to the CIEXYZ color space.

The color difference is the calibration evaluation criterion in the CIELAB color space. The tristimulus values XYZ are transformed nonlinearly to CIELAB [35]; the standard transformation is defined as below:

L* = 116 f(Y/Yn) − 16,
a* = 500 [f(X/Xn) − f(Y/Yn)],  (2)
b* = 200 [f(Y/Yn) − f(Z/Zn)],

with f(t) = t^(1/3) for t > 0.008856 and f(t) = 7.787t + 16/116 otherwise, where (Xn, Yn, Zn) is the reference white point of D65 in the sRGB color space, assigned the value (94.81, 100.0, 107.32). The color difference of two specific colors (L1*, a1*, b1*) and (L2*, a2*, b2*) is calculated by the Euclidean distance:

ΔE = [(L1* − L2*)² + (a1* − a2*)² + (b1* − b2*)²]^(1/2).  (3)

Generally speaking, a ΔE value of less than 1 is theoretically a totally unnoticeable difference, a value below 3 goes largely unnoticed, and a value between 3 and 6 is a small difference observable by human beings. During the subsequent analysis, (3) explained above is adopted to calculate the color difference and the color chromatic aberration. For the sake of convenience, a nonlinear transformation T is introduced which indicates the transformation from sRGB to CIELAB; it is defined as the composition of the sRGB-to-CIEXYZ conversion and the CIEXYZ-to-CIELAB conversion:

(L*, a*, b*) = T(R, G, B).  (4)
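As a minimal illustrative sketch, the pipeline above (sRGB linearization, the standard IEC 61966-2-1 sRGB-to-XYZ matrix, the CIELAB transform with the D65 white point quoted in the text, and the Euclidean ΔE) can be written in Python. The function names are ours, and the matrix coefficients are the published IEC values rather than anything fitted to this paper's system:

```python
import math

# Reference white point of D65 used in the text: (Xn, Yn, Zn)
WHITE = (94.81, 100.0, 107.32)

def srgb_to_xyz(r, g, b):
    """sRGB in [0, 1] -> CIEXYZ (Y of white ~ 100), IEC 61966-2-1 matrix."""
    def linearize(u):
        # Undo the sRGB gamma encoding before applying the linear matrix.
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    x = 100.0 * (0.4124 * rl + 0.3576 * gl + 0.1805 * bl)
    y = 100.0 * (0.2126 * rl + 0.7152 * gl + 0.0722 * bl)
    z = 100.0 * (0.0193 * rl + 0.1192 * gl + 0.9505 * bl)
    return x, y, z

def xyz_to_lab(x, y, z, white=WHITE):
    """CIEXYZ -> CIELAB with the standard piecewise cube-root function."""
    def f(t):
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = f(x / white[0]), f(y / white[1]), f(z / white[2])
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def delta_e(lab1, lab2):
    """Euclidean color difference between two CIELAB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))
```

Feeding pure white (1.0, 1.0, 1.0) through the chain gives L* = 100 exactly, which is a quick sanity check on the implementation.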

This transformation is perfectly invertible, and so is the CIEXYZ-to-sRGB transformation. CIEXYZ, a color space based on human vision characteristics, is the colorimetric system commonly utilized to describe a specific color in terms of the CIE standard observer. The CIE-like device-independent color spaces all have a one-to-one mapping into the CIEXYZ color space; therefore, many researchers have selected it as the device-independent color space for color calibration. As for CIELAB, a uniform color space, it was also recommended by the CIE in 1976 and is used for evaluating the color difference. Currently, it is the most popular color space for analyzing the difference of diverse colors. Some researchers have proved that it performs better than RGB in color texture analysis and image segmentation [22], so it is convenient to perform image processing and understanding in this color space.

In conclusion, the CIEXYZ and CIELAB color spaces both belong to the CIE family. The color difference can be calculated rapidly in the CIELAB color space, which is also well suited to the later image processing. Therefore, we select the sRGB and CIELAB color spaces, which represent the RGB-like and CIE-like color spaces, respectively.

3. Color Vision System and Condition

The colorized vision system presented in this paper, together with the experiment conditions and the training samples, is explained in detail in this section. The Munsell ColorChecker and standardized color patches are often selected as training samples for color calibration. On account of the limitations of the digital camera, and in order to reduce the influence of nonuniform images on calibration, this study selects Pantone color patches, whose colors are similar to those of the Munsell ColorChecker, as the training and testing samples. The standard tristimulus values of the Pantone color patches are measured by the X-Rite SP60 colorimeter, and these values constitute the matrices of standardized tristimulus values used below. Furthermore, the colorized vision system includes the IMPERX digital camera and the LED array illuminator.

The IMPERX (IPX-2M30-GCCI) digital camera has 12-bit resolution for each channel and is coupled with a standard C-mount zoom lens (LM35HC); the spatial resolution is 1600 by 1200 pixels. Besides, the camera has a high sensitivity in the spectral range of 400–1000 nm. To extend the application of this system, the calibration is performed under two illumination conditions: the artificial D65 illuminator and the LED array illuminator [35]. When the proportion of the three primary colors of the LED illuminator is set to (R : G : B) = (254 : 237 : 90), the correlated color temperature of 6504 K is very close to D65 [12].

In order to ensure the accuracy of color calibration, the illumination and viewing geometry should approximate the 45/0° geometry recommended by the CIE, which means that the angle between the optical axis of the illumination and the normal direction of the color patches is 45°, and the angle between the camera axis and the normal direction of the color patches is less than 10°. In the colorized vision system for color calibration, the color patches are placed on the 45° grey board and the camera is fixed on a tripod perpendicular to the grey board. Figure 2 illustrates the applied artificial D65 illumination and the scene of calibration for the colorized vision system with the LED array illuminator. Besides, the LED illuminator has 94.70% lateral uniformity of irradiance distribution within a diameter of 80 mm at a panel-target distance of 200 mm [9]. To minimize the impact of the external environment on the illumination, the whole imaging system is installed in a dark cell to capture the color patch images. The image acquisition system is operated by a PC equipped with GigE software.

Figure 2: Colorized vision system for calibration: (a) the artificial D65 illuminator and (b) the scene of calibration for the color vision system.

4. Calibration Methods

4.1. Theoretical Basis

In this subsection, the mathematical model of color vision is discussed in the device-independent and device-dependent color spaces, and then the model of the calibration process of the colorized vision system is explained. A color image is always the result of a complex interaction among three components: the physical content of the scene, the illumination incident on the scene, and the characteristics of the digital camera [35–37]. The tristimulus values of an image pixel in the device-dependent color space obtained by the digital sensor can be written as

ρi(c) = k ∫ e(λ) si(λ) fc(λ) dλ,  c ∈ {R, G, B},  (5)

where e(λ) is the spectral distribution of the illumination, si(λ) is the surface reflectance of the object at pixel i, fR(λ), fG(λ), and fB(λ) are the camera spectral response functions for the red, green, and blue color bands, respectively, and k is a scale factor. To simplify the integral operation, it is sufficient to approximate the various continuous spectra by their values at N discrete sample points; thus (5) can be transformed to

ρi(c) = k Σj e(λj) si(λj) fc(λj),  j = 1, …, N.  (6)

When the camera spectral response functions and the surface reflectance of the object along the wavelength axis are represented by the N × 3 matrix F and the N × 1 vector si, respectively, (6) can be expressed in matrix notation as

ρi = k Fᵀ E si,  (7)

where E is an N × N diagonal matrix containing the illumination samples e(λj) along the diagonal. This simplified mathematical model has been widely used in research [17, 38, 39]. The tristimulus values in the device-independent color space can also be expressed in matrix notation:

ci = k Aᵀ E65 si,  (8)

where si is the same parameter as in (7), A is an N × 3 matrix of the color matching functions as illustrated in Figure 1(a), and E65 is an N × N diagonal matrix whose diagonal elements are the samples of the standardized CIE illuminant.
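To make the discrete model concrete, the following sketch evaluates the sampled sum of (6) for a single pixel with made-up five-sample spectra. All the curve values here are illustrative assumptions, not measured data from the paper's camera or illuminators:

```python
# Toy discrete rendering of the camera model: the response in each band is a
# weighted sum over N sampled wavelengths of illuminant x reflectance x sensor.
N = 5  # number of spectral samples (assumed for illustration)

illum = [0.9, 1.0, 1.1, 1.0, 0.8]     # e(lambda_j): illuminant samples
reflect = [0.2, 0.4, 0.6, 0.5, 0.3]   # s_i(lambda_j): surface reflectance
# f_c(lambda_j): per-channel sensor response samples, rows = R, G, B
sensor = [
    [0.0, 0.1, 0.3, 0.7, 0.9],  # red channel, sensitive at long wavelengths
    [0.2, 0.8, 0.9, 0.4, 0.1],  # green channel, peaks mid-spectrum
    [0.9, 0.6, 0.2, 0.0, 0.0],  # blue channel, peaks at short wavelengths
]
k = 1.0  # scale factor

def camera_response(e, s, f, scale=1.0):
    """rho_c = k * sum_j e(lambda_j) * s(lambda_j) * f_c(lambda_j), per band."""
    return [scale * sum(ej * sj * fcj for ej, sj, fcj in zip(e, s, fc))
            for fc in f]

rho = camera_response(illum, reflect, sensor, k)  # device-dependent (R, G, B)
```

Swapping the sensor rows for sampled CIE color matching functions in the same sum yields the device-independent values of (8); the difference between the two outputs is exactly the mismatch the calibration must correct.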

Comparing (7) and (8), an obvious difference between the tristimulus values in the device-independent and device-dependent color spaces is embodied, and this color difference results from the illuminant and from the spectral response functions of the digital sensor. Although an artificial standardized CIE D65 illuminator is adopted, its color temperature is about 6675 K (standard D65 is 6500 K) and its average color reproduction error is up to 3.25. This result implies that illumination calibration on the matrix E is needed. Hence the calibration of the proposed colorized vision system is performed under the artificial standardized CIE D65 illuminator and the LED array illuminator, respectively. The calibration process can be viewed as a mapping problem whose purpose is to find the optimal mapping Φ such that

ci(D65) = Φ(ρi) + ε,  (9)

where ci(D65) is the matrix of standardized tristimulus values under the D65 illuminant and ε is the error matrix in the color space. When the artificial D65 illuminator or the LED illuminator is used for the colorized vision system, the illumination matrix E in (7) is defined as

E = EaD65  (10)  or  E = ELED,  (11)

where EaD65 and ELED carry the spectral distribution samples of the artificial D65 and LED illuminators on the diagonal, respectively. Substituting (10) and (11) into (9), the profiles of the colorized vision system under the artificial D65 and LED illuminators can be expressed as ΦaD65 and ΦLED, respectively. Nowadays, many optimization regression algorithms are applied to solve this mapping problem; in the next subsection, the Polynomial-Based Regression method is explained as the solution algorithm.

4.2. The Polynomial-Based Regression

The commonly used color calibration algorithms, including the Polynomial-Based Regression, the Neural Network mapping algorithms, the Support Vector Regression, and the Ridge Regression, are widely applied to obtain the mapping matrix, and comparisons among these regression algorithms have been performed in many studies [20, 21, 31]. The Polynomial-Based Regression is considered the most commonly used method due to its simple execution and short training time. In this subsection, the Polynomial-Based Regression method is applied to obtain the mapping Φ in the sRGB and CIELAB color spaces.

In the sRGB color space, the mapping can be expressed by a coefficient matrix M. The basic principle of the polynomial transform is as follows. Initially, assume that the number of reference Pantone color patches is n. Subsequently, according to the analysis of Sections 3 and 4.1, the average tristimulus values of the images captured by the digital camera can be represented by an n × 3 matrix V, and the matrix of standard tristimulus values measured by the SP60 in sRGB is the n × 3 matrix Vs. Then the mapping coefficient matrix M with t terms from RGB to sRGB can be written as a t × 3 coefficient array:

M = [m(j, c)],  j = 1, …, t,  c ∈ {R, G, B}.  (12)

When the term number t is 3, the transformation is only a linear transform; but the spectral response functions of the digital camera are a nonlinear combination of the color matching functions, and the purpose of the polynomial transformation is to add more terms to increase the transformation accuracy. Therefore, the 4-, 6-, and 11-term combinations take the following forms:

t = 3: [R, G, B];
t = 4: [R, G, B, 1];
t = 6: [R, G, B, RG, RB, GB];
t = 11: [R, G, B, RG, RB, GB, R², G², B², RGB, 1].

Taking the 11-term polynomial combination as an example, the transformation model for one color patch can be expressed as

(Rs, Gs, Bs) = (R, G, B, RG, RB, GB, R², G², B², RGB, 1) M,  (13)

and (13) can also be written in matrix format as

Vs = P M,  (14)

where M is the mapping coefficient matrix and P is the n × t matrix generated by the polynomial combination of the camera RGB values. This equation can easily be solved in a least-squares sense. Then (9) can be expressed as

Vs = P M + ε.  (15)
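A minimal sketch of the polynomial term expansion follows. The paper only names the term counts, so the exact term sets below are assumptions taken from common practice in camera characterization (e.g. the sets popularized by Hong et al.):

```python
def expand(r, g, b, terms=11):
    """Expand one RGB triple into a polynomial feature row of `terms` entries.

    Term sets are assumed from common practice; 3 terms is a pure linear map,
    4 adds an offset, 6 adds cross products, and 11 adds squares, RGB, and 1.
    """
    if terms == 3:
        return [r, g, b]
    if terms == 4:
        return [r, g, b, 1.0]
    if terms == 6:
        return [r, g, b, r * g, r * b, g * b]
    if terms == 11:
        return [r, g, b, r * g, r * b, g * b,
                r * r, g * g, b * b, r * g * b, 1.0]
    raise ValueError("unsupported term count")
```

Stacking one expanded row per color patch produces the n × t matrix P used in the matrix form of the regression.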

According to the batch least-squares algorithm, the solution to (14) and (15) is

M = (PᵀP)⁻¹ Pᵀ Vs,  (16)

where (PᵀP)⁻¹ is the inverse matrix of PᵀP. The 11-term polynomial combination is recommended by some researches [19, 25]. Hence, in this paper, polynomial regressions with 4 and 11 terms are performed to calibrate the colorized vision system, and their accuracies are compared. By means of the mapping M, the tristimulus values of the captured color patches (R, G, B) in the RGB color space are transformed to the estimated values (Rs, Gs, Bs) in the sRGB color space:

(Rs, Gs, Bs) = (polynomial terms of R, G, B) M.  (17)
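The normal-equations solution above can be sketched in pure Python for small matrices; this is a generic illustrative implementation (Gaussian elimination with partial pivoting), not the paper's MATLAB code:

```python
def transpose(a):
    return [list(col) for col in zip(*a)]

def matmul(a, b):
    bt = transpose(b)
    return [[sum(x * y for x, y in zip(row, col)) for col in bt] for row in a]

def solve(a, b):
    """Solve the square system A x = b by Gaussian elimination with pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def least_squares(p, v):
    """Normal-equations least squares: M = (P^T P)^(-1) P^T V, column by column."""
    pt = transpose(p)
    ptp = matmul(pt, p)
    ptv = matmul(pt, v)
    cols = [solve(ptp, [row[j] for row in ptv]) for j in range(len(ptv[0]))]
    return transpose(cols)  # t x 3 mapping matrix for 3-channel targets
```

In practice one would call a library least-squares routine (MATLAB's backslash operator, or `numpy.linalg.lstsq`), which is also numerically safer than forming the normal equations explicitly; the sketch only exposes the algebra of the solution.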

In the sRGB color space, however, the mapping error of the calibrated images is hard to interpret in terms of human perception.

The estimated sRGB values (Rs, Gs, Bs) are therefore transformed to CIELAB values under the certain transformation T for calculating the color difference and performing the subsequent image processing:

(L*, a*, b*) = T(Rs, Gs, Bs).  (18)

The mapping error is determined between the standardized values (L*, a*, b*) and the estimated values (L̂*, â*, b̂*) and is calculated by the color difference formula:

ΔE = [(L* − L̂*)² + (a* − â*)² + (b* − b̂*)²]^(1/2).  (19)

There is no universally appropriate formula to measure the color difference of two images; (19) only computes the color difference of two specific tristimulus values. Generally, the color of a calibration image is not uniform, so the color difference between the mean value of the image and the standard value is utilized as the evaluation criterion; but images represented by mean tristimulus values alone are not comprehensive enough. Therefore, in this paper, the difference between the chromaticity coordinate of each pixel and the chromaticity coordinate of the standardized value is calculated to evaluate the chromaticity differences of the calibrated images. Using the certain transformation T, the values of the image pixels before and after color calibration are obtained, where i denotes the ith pixel of the image. The chromaticity coordinates (xi, yi) are then computed from the tristimulus values as

xi = Xi / (Xi + Yi + Zi),  yi = Yi / (Xi + Yi + Zi).  (20)

The chromaticity diagrams of the images are used to compare the calibration results intuitively between the images before and after calibration. The average Euclidean distance between the pixel chromaticity coordinates (xi, yi) and the standard chromaticity coordinate (xs, ys) is used as the evaluation criterion of color difference, defined as

d = (1/n) Σi [(xi − xs)² + (yi − ys)²]^(1/2),  (21)

where n is the number of image pixels. In the subsequent sections, the average Euclidean distance of the chromaticity coordinates and the color difference ΔE are used to quantitatively assess the calibration effect.
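The per-pixel chromaticity evaluation can be sketched as follows; the function names are ours, and the input is assumed to be a list of per-pixel XYZ triples:

```python
import math

def chromaticity(x, y, z):
    """(x, y) chromaticity coordinates from XYZ tristimulus values."""
    s = x + y + z
    return x / s, y / s

def avg_chromaticity_distance(pixels_xyz, standard_xyz):
    """Average Euclidean distance between each pixel's chromaticity point
    and the standard chromaticity point on the chromaticity diagram."""
    xs, ys = chromaticity(*standard_xyz)
    total = 0.0
    for p in pixels_xyz:
        xp, yp = chromaticity(*p)
        total += math.hypot(xp - xs, yp - ys)
    return total / len(pixels_xyz)
```

Unlike a single mean-value ΔE, this criterion penalizes images whose pixels scatter away from the standard point even when their average happens to be close to it.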

In conclusion, the whole color calibration process for the colorized vision system in the sRGB color space has been completed; the schematic diagram of the color calibration algorithm in the sRGB color space is illustrated in Figure 3. Under the two illumination conditions, the mapping coefficient matrices are denoted MD65 and MLED. The color calibration process in the CIELAB color space for the color vision system proceeds as follows. The RGB values of the captured color patch images are assumed to be sRGB values and are transformed to the CIELAB color space under the certain transformation T. The mapping coefficient matrix with 4 and 11 terms is then obtained by polynomial regression using (12)–(18). Using this mapping matrix, the assumed tristimulus values of the captured color patches in the sRGB color space are transformed to the calibrated values in the CIELAB color space.

Figure 3: Schematic diagram of the calibration algorithm in sRGB color space.

The mapping error is determined by (19). According to the analysis of Section 4.1, the mappings ΦLED and ΦaD65 represent the mapping matrices of the color vision system under the LED illuminator and the artificial D65 illuminator, respectively. Furthermore, Figure 4 illustrates the schematic diagram of the calibration algorithm in the CIELAB color space. The evaluation of the mapping coefficient matrices with 4 and 11 terms and the final choice for our color vision system are reviewed when discussing the experiment results.

Figure 4: Schematic diagram of the calibration algorithm in the CIELAB color space.

5. Experiment Results and Discussion

Comparing Figure 3 with Figure 4, it can be seen that each calibration process has its advantages and disadvantages. In the sRGB color space, it is very convenient to perform the color calibration because the captured images need no processing before calibration, and the calibrated images can be immediately displayed on a CRT monitor for human perception. However, the sRGB color space is unsuitable for later image processing because each color channel depends on the luminosity; the transformation to the CIELAB color space is nevertheless just the fixed transformation T. The CIELAB color space is a perceptually uniform color space in which the mapping error can be detected conveniently and color texture analysis performs better than in sRGB [40]. However, the transformation of the images is necessary before calibration, and the calibration accuracy will be influenced by the accuracy of this conversion.

The color calibration of the colorized vision system is performed in the CIELAB and sRGB color spaces under the LED array illuminator and the artificial D65 illuminator, respectively; the experiment conditions are already described in Section 3, and all the algorithms are implemented in MATLAB. The mappings for the two color spaces under the two illuminators are obtained by using Polynomial-Based Regression with 4 and 11 terms. With this calibration accuracy, the color measurements made by the proposed colorized vision system are close to those of the reference instrument, the X-Rite SP60. In order to quantify the color calibration accuracy, the maximum mapping error and the average mapping error of the twenty-four color patches are calculated. Tables 2 and 3 illustrate the accuracy comparison in the sRGB color space and the CIELAB color space, respectively [41].

Table 2: The accuracy comparison (ΔE) in the sRGB color space obtained by the mapping coefficient matrix.
Table 3: The accuracy comparison (ΔE) in the CIELAB color space obtained by the mapping coefficient matrix.

Comparing Table 2 with Table 3, it is shown that whether the Polynomial-Based Regression uses 4 or 11 terms, and whether the color vision system is illuminated by the artificial D65 illuminator or by the white field (R : G : B) = (254 : 237 : 90) of the LED illuminator, the maximum mapping error and the average mapping error of the twenty-four color patches in the sRGB color space show higher accuracy than in the CIELAB color space. The average mapping errors of the sRGB mappings with 11 terms are 2.56 and 2.39, which lie in the unnoticeable range; the corresponding mapping errors in the CIELAB color space are 4.92 and 4.94, obviously on the high side due to the redundant transformation process before calibration in the CIELAB color space. Integrating the analysis of the color calibration schematic diagrams, the sRGB color space is strongly recommended to serve as the device-independent color space for color calibration, for its smaller transformation effort before calibration, its higher accuracy, and its greater convenience for displaying images after calibration. Besides, the sRGB tristimulus values can be conveniently transformed to the CIEXYZ or CIELAB color space under the certain transformation for evaluating the calibrated images and performing the later image processing.

Comparing the Polynomial-Based Regression with 4 and 11 terms, the mapping errors with 4 terms under both illuminators are still higher than those with 11 terms. When few terms are used for the polynomial regression, the problem of finding an optimal mapping is treated as a simple linear transformation. The experiments illustrate that this problem cannot be regarded as a linear transformation, because the spectral response functions of the digital camera are not a linear combination of the color matching functions. To solve the optimal mapping Φ problem, a higher-order polynomial transformation is suitable; in this paper, the polynomial regression with 11 terms in the sRGB color space is applied in the following application.

Figures 5(a) and 5(b) illustrate the comparison of the color differences of the twenty-four color patches before color calibration and the mapping errors after calibration, for the colorized vision system illuminated under the LED illuminator and the artificial D65 illuminator. In the literature [21], images captured under standardized CIE lighting conditions with an industrial camera serve as the reference images. However, in our experiments, the average color difference of the images captured by the proposed colorized vision system under the artificial D65 illuminator is still up to 16.09 because of the mismatch of the spectral response functions of the digital sensor and the color reproduction error of the artificial D65 illuminator; it is therefore essential to perform color calibration even on the "reference" images. Due to the narrow spectral width of the LED illuminator, the average color difference before calibration is up to 27.68, which is much higher than under the artificial D65 illuminator (see Figures 5(a) and 5(b)). After mapping the RGB tristimulus values to sRGB tristimulus values, the average color difference under the artificial D65 illuminator is reduced from 16.09 to 2.56, showing that the color calibration for the digital sensor is successful. Under the LED illuminator the value is reduced from 27.68 to 2.39, which means that the color vision system can also be applied to automatic color measurement.

Figure 5: The comparison of color difference of color patches. (a) Mapping under the LED illuminator and (b) mapping under the artificial D65 illuminator.

Evaluating the calibrated images themselves is also important, because mean tristimulus values cannot represent whole images for color measurement. Two color patches of the captured images, 225 M and 368 M, are calibrated to evaluate the calibrated images; 225 M and 368 M are the identification serial numbers of Pantone color patches.

Figures 6(a), 6(b), 7(a), and 7(b) illustrate the images of the 225 M and 368 M patches captured by the same digital sensor under the artificial D65 illuminator and the LED illuminator before calibration, respectively. The color temperatures of the D65 and LED illuminators are 6675 K and 6505 K, respectively, both very close to the standardized D65 illuminant (6504 K). However, an obvious color discrepancy exists in the images before calibration: the color differences between the average color of the images and the standardized tristimulus values reach 13.73, 18.54 and 31.59, 50.96 for the two color patches. Figures 6(c), 6(d), 7(c), and 7(d) illustrate the color images obtained by the two mappings, and Figures 6(e) and 7(e) show the virtual images generated from the standardized tristimulus values. The colors of the calibrated and standardized images look very similar to human eye perception. As shown in Table 4, the color differences between the mean color values of the calibrated images and the standard tristimulus values are also computed; for the two color patches they are 1.42, 0.74 and 1.76, 1.55, all of which lie in the totally unnoticeable or unnoticeable range.

Table 4: Comparison of the average color difference of images and the chromaticity distance C between captured images and theoretical images before and after calibration.
Figure 6: The comparison of 368 M color patches. (a) Before calibration under the D65 illuminator; (b) before calibration under the LED illuminator; (c) and (d) obtained by the two mappings; (e) theoretical images of the standardized tristimulus values.
Figure 7: The comparison of 225 M color patches. (a) Before calibration under the D65 illuminator; (b) before calibration under the LED illuminator; (c) and (d) obtained by the two mappings; (e) theoretical images of the standardized tristimulus values.

The colors of actual calibration images are usually not uniform and cannot be fully represented by mean values, so using only the average color difference of images as the evaluation criterion is not objective enough. Therefore, the chromaticity diagram of all image pixels before and after color calibration is used to compare the calibration results more rigorously. The average Euclidean distance between the chromaticity coordinate of every pixel and the standard chromaticity coordinate is also used for quantitative analysis. Figures 8(a) and 8(b) illustrate the chromaticity diagrams of the two color patches before and after color calibration, transformed from Figures 6(a)–6(e) and 7(a)–7(e) by means of (1)–(19).
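The chromaticity-based criterion above can be sketched as follows. The code assumes the pixels are already tristimulus values (e.g., CIE XYZ) and uses the standard CIE definition of chromaticity, normalizing each channel by the channel sum; the function names are illustrative.

```python
import numpy as np

def chromaticity(xyz):
    """CIE (x, y) chromaticity coordinates from (N, 3) XYZ tristimulus values:
    x = X / (X + Y + Z), y = Y / (X + Y + Z)."""
    s = xyz.sum(axis=1, keepdims=True)
    return xyz[:, :2] / s

def mean_chromaticity_distance(xyz_pixels, xy_ref):
    """Average Euclidean distance between each pixel's (x, y) chromaticity
    coordinate and a standard reference coordinate."""
    xy = chromaticity(np.asarray(xyz_pixels, dtype=float))
    return float(np.mean(np.linalg.norm(xy - np.asarray(xy_ref), axis=1)))
```

Evaluating this distance over every pixel, rather than over the image mean, is what makes the criterion sensitive to the nonuniformity the authors describe.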

Figure 8: Chromaticity diagram before and after calibration. (a) 225 M color patch and (b) 368 M color patch.

Although the color of the captured color patches is uniform, Figure 8 shows a nonuniform characteristic in the chromaticity diagram, due to nonuniform illumination, irregularity of the color patches, and so forth. Figures 8(a) and 8(b) show that, before calibration, the chromaticity coordinates of images captured under the D65 illuminator lie closer to the standardized chromaticity coordinate than those captured under the LED illuminator. The average chromaticity distances for the 225 M and 368 M color patches are 0.0539, 0.0501 and 0.1558, 0.0954, respectively, confirming that, with the same digital sensor, images captured under the D65 illuminator are better than those under the LED illuminator. The chromaticity coordinate distributions of images calibrated by the two mappings almost overlap each other. From Figure 8 we can also see that all the chromaticity coordinates of the calibrated images cluster around the standard chromaticity coordinate as the central point of the chromaticity diagram. The average chromaticity distances of the captured images after color calibration decrease to 0.0098, 0.0105 and 0.0135, 0.0044, respectively. Such small chromaticity distances cannot even be perceived by human beings, and the color differences confirm these results. The improvement ratios all stand at about the 80% level. All the data are given in Table 4. These evaluation criteria will therefore be applied to evaluate the results of the subsequent applications.

sRGB and CIELAB, as device-independent color spaces, are used to conduct the color calibration experiments, and the Polynomial-Based Regression method is adopted to compute the resulting color differences. As shown in Tables 2 and 3, a lower color difference is obtained in the sRGB color space, while a considerably larger difference remains in the CIELAB color space, which indicates that the sRGB color space is the better model for improving color calibration. Moreover, for images taken under different illuminators, the better calibration performance is obtained by the mapping fitted for the corresponding illuminator.

The experimental results show that sRGB is a better choice of device-independent color space than CIELAB, whether the polynomial regression uses 4 or 11 terms and whether the colorized vision system captures images under the artificial D65 illuminator or the LED illuminator. The images captured under the artificial D65 illuminator cannot serve as the reference images, since their average color difference is as high as 16.09. The images acquired under the various illuminator conditions by the same digital sensor are successfully transformed by the mapping coefficient matrix in the sRGB color space. The average color difference of images, the average Euclidean distance of each pixel's chromaticity coordinate, and the chromaticity diagram are adopted as the evaluation criteria. These results confirm that the calibration of the proposed colorized vision system is useful and beneficial.

In this section, two mapping matrices, one for each illuminator, are introduced, and the color calibration experiments are conducted under the different illuminators, respectively. Through Polynomial-Based Regression with 4 and 11 terms, the better-performing mapping matrix is obtained and is applied in the application experiments.

6. Application Experiments

To further verify that the calibrated colorized vision system can be applied to color measurement, the printed matter of clothes and the colored ceramic tile are chosen as experiment samples. Unlike the comparison and selection experiments, in this real application all images are acquired by the colorized vision system under the LED illuminator, because the results in Section 5 show that the calibration accuracy under the LED illuminator is as high as that under the D65 illuminator. Apart from calibration accuracy, the LED illuminator has the advantages of luminous efficacy, compactness, durability, and convenience for the industrial scene. The white balance of the LED illuminator, (R : G : B) = (254 : 237 : 90), is used to capture the images.
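Assuming the white-balance triple denotes the sensor's raw response to the illuminator's white, one simple interpretation is per-channel gain correction that maps that white to a neutral gray. This is a sketch under that assumption; the function names are illustrative, not the authors' implementation.

```python
import numpy as np

def white_balance_gains(white_rgb):
    """Per-channel gains that map the illuminator's white response
    (e.g. 254 : 237 : 90) to equal channel values, normalized so the
    green-channel gain is 1."""
    w = np.asarray(white_rgb, dtype=float)
    return w[1] / w

def apply_white_balance(img, white_rgb):
    """Scale each channel of an (H, W, 3) image by its gain, clipped to [0, 255]."""
    return np.clip(img * white_balance_gains(white_rgb), 0.0, 255.0)
```

Under this scheme a pixel equal to the white response (254, 237, 90) is mapped to the neutral value (237, 237, 237), so the heavy blue-channel boost reflects the LED array's weak blue output.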

Figure 9 illustrates the images of the printed matter of clothes before and after calibration. An obvious distinction can be seen between the two images. We invited several observers with normal vision to compare the actual clothes color with the calibrated color in Figure 9; all of them judged the color of Figure 9(b) to be close to the color of the clothes. The average color difference and the average Euclidean chromaticity distance of the blue printed matter in Figure 9 are also computed: before calibration the values are 17.63 and 0.0236, and after calibration they fall to 4.92 and 0.0092. The average color difference lies in the range of little difference. Figure 12(a) illustrates the chromaticity diagram of the blue printed matter; the chromaticity coordinate of the standardized value lies at the edge of the chromaticity diagram of the calibrated images. The standard tristimulus value is measured by an X-Rite SP60 colorimeter and is only an average over a circle of three-centimeter diameter, whereas the colors of the measured samples are not uniform. Hence the standardized tristimulus value is not very accurate and is used only as a reference.
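The paper does not state which color-difference formula it uses. A minimal sketch of the classic CIE76 ΔE*ab (Euclidean distance in CIELAB) is shown below, under an assumed D65 reference white; if the authors used a newer formula such as CIEDE2000, the numbers would differ.

```python
import numpy as np

def xyz_to_lab(xyz, white=(95.047, 100.0, 108.883)):
    """CIE XYZ -> CIELAB conversion, here under an assumed D65 white point."""
    t = np.asarray(xyz, dtype=float) / np.asarray(white)
    # Piecewise cube-root function from the CIELAB definition.
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    return np.linalg.norm(np.asarray(lab1) - np.asarray(lab2), axis=-1)
```

Averaging `delta_e76` over all pixels against the reference Lab value yields an image-level color difference of the kind reported in the text (e.g., 17.63 before and 4.92 after calibration).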

Figure 9: The images of printed matter of clothes. (a) Before calibration and (b) after calibration.

Colored ceramic tile is an important and widely used construction material. The color of ceramic tile easily becomes nonuniform because of inaccurate control of the stove temperature and the conveying speed. Nowadays, the color of ceramic tile is still measured by human vision or a colorimeter. Figure 10 shows the ceramic tile images before and after color calibration. From Figures 10(a) and 10(b) we can see that the mapping coefficient matrix is sensitive to specular reflection, because ceramic tile is a material with high specular reflection: there are many black pixels in the middle of the calibrated image, and the whole image appears distorted. Hence, a linear polarizer is placed in front of the digital sensor to capture the image, and the mapping is then used to calibrate the images. As shown in Figure 11, the specular reflection is eliminated and the calibrated images show the normal color. We again invited several observers with normal vision to compare the ceramic tile color, and they unanimously agreed that the color of the image after calibration is close to that of the ceramic tile.

Figure 10: The images of ceramic tile. (a) Before calibration and (b) after calibration.
Figure 11: The images captured by a linear polarizer before and after calibration. (a) Before calibration and (b) after calibration.
Figure 12: The chromaticity diagram. (a) Blue printed matter and (b) middle region of ceramic tile.

We select the middle of the images to calculate the average color difference and to draw the chromaticity diagram. As shown in Table 5, the average color difference and the average Euclidean chromaticity distance are 34.57 and 0.0166, which fall to 5.69 and 0.0115 after color calibration. Figure 12(b) illustrates the chromaticity diagram of the middle region of the ceramic tile. It can be seen that the improvement ratio of the chromaticity distance and the chromaticity diagram are not in accordance with the average color difference: the Euclidean distance between the average RGB tristimulus values before and after calibration is as large as 106, but the chromaticity coordinates, which represent only the proportions of the tristimulus values, remain close to each other. In this situation, comparing chromaticity coordinates says little about the calibration results, and the average color difference should be used to estimate the change before and after calibration.
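This mismatch between the two criteria can be reproduced numerically: scaling an RGB triple changes the Euclidean distance between tristimulus values but leaves the chromaticity coordinates, which encode only channel proportions, unchanged. A small sketch with illustrative values:

```python
import numpy as np

def chromaticity(rgb):
    """Chromaticity coordinates: each channel divided by the channel sum."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb / rgb.sum()

a = np.array([60.0, 40.0, 20.0])
b = 1.8 * a  # same channel proportions, much brighter

dist_rgb = np.linalg.norm(a - b)                                  # large
dist_chroma = np.linalg.norm(chromaticity(a) - chromaticity(b))   # zero
```

A calibration that mostly rescales overall intensity therefore shows a large tristimulus change while barely moving points on the chromaticity diagram, which is exactly the behavior observed for the ceramic tile.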

Table 5: Comparison of the average color difference of images and the chromaticity distance C between captured images and the reference values measured by the SP60 before and after calibration.

From the discussion above, we can conclude that the tristimulus values obtained by the mapping are close to the values measured by the SP60 colorimeter. The calibration accuracy in this real application is lower than in the calibration experiments because the reference values measured by the SP60 are not that accurate. After calibration, the colorized vision system can replace the colorimeter for pixel-wise color measurement under certain conditions. The mapping algorithm is sensitive to specular reflection, so a linear polarizer should be added to the proposed colorized vision system when the sample is a material with high specular reflection.

7. Conclusion

This paper presents a colorized vision system with a digital sensor and an LED array illuminator for color measurement. To improve the measurement accuracy, color calibration is implemented for the color vision system under the artificial D65 illuminator and the LED array illuminator. First, the mathematical model of the calibration process is derived from the tristimulus principle of CIE 1931, and the calibration is converted into solving an optimal mapping problem. Second, Polynomial-Based Regression is used to obtain the mapping coefficient matrix; with this matrix, the tristimulus values in a device-independent color space are obtained from the RGB values of images captured by the color vision system. Third, color calibration for the proposed system is performed by the polynomial regression method in two commonly used color spaces, CIELAB and sRGB. The sRGB color space is recommended as the device-independent color space because it requires less transformation before color calibration, achieves higher accuracy, and is more convenient for displaying the calibrated images. The images illuminated by the artificial D65 illuminator cannot serve as the reference images, because the average color difference of the color patches before calibration is as high as 16.09. The mapping matrix proves effective, reducing the average color difference to 2.56 and 2.39 under the two illuminators and improving the average chromaticity coordinate of the images by more than 80%.

Finally, the printed matter of clothes and colored ceramic tile are used as application samples for color measurement. The mapping matrix is sensitive to pixels with specular reflection; when the sample is a material with high specular reflection, a linear polarizer should be added to reduce the specular component for our color vision system. The average color differences between the estimated values and the values measured by the SP60 colorimeter are 4.92 and 5.69 for the two samples. The experimental results show that the colorized vision system is feasible for color measurement.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

This research is supported by the National Natural Science Foundation of China (51305137), the National Science Foundation of Jiangxi Province (20151BBE50116), and the National Science Foundation of Jiangxi Province (GJJ14388).

References

  1. H. M. G. Stokman, T. Gevers, and J. J. Koenderink, "Color measurement by imaging spectrometry," Computer Vision and Image Understanding, vol. 79, no. 2, pp. 236–249, 2000.
  2. K. Kılıç, B. Onal-Ulusoy, and I. H. Boyacı, "A novel method for color determination of edible oils in Lab format," European Journal of Lipid Science and Technology, vol. 109, no. 2, pp. 157–164, 2007.
  3. A. C. M. de Oliveira and M. O. Balaban, "Comparison of a colorimeter with a machine vision system in measuring color of Gulf of Mexico sturgeon fillets," Applied Engineering in Agriculture, vol. 22, no. 4, pp. 583–587, 2006.
  4. C. Boukouvalas, J. Kittler, R. Marik, and M. Petrou, "Automatic color grading of ceramic tiles using machine vision," IEEE Transactions on Industrial Electronics, vol. 44, no. 1, pp. 132–135, 1997.
  5. C. Boukouvalas, J. Kittler, R. Marik, and M. Petrou, "Color grading of randomly textured ceramic tiles using color histograms," IEEE Transactions on Industrial Electronics, vol. 46, no. 1, pp. 219–226, 1999.
  6. J. Pladellorens, A. Pintó, A. J. Segura et al., "A device for the color measurement and detection of spots on the skin," Skin Research and Technology, vol. 14, no. 1, pp. 65–70, 2008.
  7. K. León, D. Mery, F. Pedreschi, and J. León, "Color measurement in Lab units from RGB digital images," Food Research International, vol. 39, no. 10, pp. 1084–1091, 2006.
  8. J. Pospíšil, J. Hrdý, and J. Hrdý Jr., "Basic methods for measuring the reflectance color of iron oxides," Optik, vol. 118, no. 6, pp. 278–288, 2007.
  9. Z.-M. Zhu, X.-H. Qu, G.-X. Jia, and J.-F. Ouyang, "Uniform illumination design by configuration of LED array and diffuse reflection surface for color vision application," Journal of Display Technology, vol. 7, no. 2, pp. 84–89, 2011.
  10. Z. M. Zhu, X. H. Qu, H. Y. Liang, and G. X. Jia, "Effect of color illumination on color contrast in color vision application," in Optical Metrology and Inspection for Industrial Applications, 785510, vol. 7855 of Proceedings of SPIE, pp. 1–8, November 2010.
  11. Z. Zhu, X. Qu, and G.-X. Jia, "Wavelength intervals selection of illumination for separating objects from backgrounds in color vision applications," Journal of Modern Optics, vol. 58, no. 9, pp. 777–785, 2011.
  12. Z.-M. Zhu, X.-G. Qu, B. Chao, G.-X. Jia, and F.-M. Zhang, "Study on colorimetric properties of LED array sources for color vision application," Acta Physica Sinica, vol. 61, no. 2, Article ID 020702, 5 pages, 2012.
  13. Z. T. Kosztyán, G. P. Eppeldauer, and J. D. Schanda, "Matrix-based color measurement corrections of tristimulus colorimeters," Applied Optics, vol. 49, no. 12, pp. 2288–2301, 2010.
  14. M. Wolski, C. A. Bouman, J. P. Allebach, and E. Walowit, "Optimization of sensor response functions for colorimetry of reflective and emissive objects," IEEE Transactions on Image Processing, vol. 5, no. 3, pp. 507–517, 1996.
  15. M. J. Vrhel, H. J. Trussell, and J. Bosch, "Design and realization of optimal color filters for correction," Journal of Electronic Imaging, vol. 4, no. 1, pp. 6–14, 1995.
  16. D. Ng, J. P. Allebach, M. Analoui, and Z. Pizlo, "Non-contact imaging colorimeter for human tooth color assessment using a digital camera," Journal of Imaging Science and Technology, vol. 47, pp. 531–542, 2003.
  17. D.-Y. Ng and J. P. Allebach, "A subspace matching color filter design methodology for a multispectral imaging system," IEEE Transactions on Image Processing, vol. 15, no. 9, pp. 2631–2643, 2006.
  18. L.-C. Chiu and C.-S. Fuh, "Dynamic color restoration method in real time image system equipped with digital image sensors," Journal of the Chinese Institute of Engineers, vol. 33, no. 2, pp. 243–250, 2010.
  19. Y. V. Haeghen, J. Naeyaert, I. Lemahieu, and W. Philips, "An imaging system with calibrated color image acquisition for use in dermatology," IEEE Transactions on Medical Imaging, vol. 19, no. 7, pp. 722–730, 2000.
  20. V. Cheung, S. Westland, D. Connah, and C. Ripamonti, "A comparative study of the characterisation of colour cameras by means of neural networks and polynomial transforms," Coloration Technology, vol. 120, no. 1, pp. 19–25, 2004.
  21. X. Wang and D. Zhang, "An optimized tongue image color correction scheme," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 6, pp. 1355–1364, 2010.
  22. W. Li, M. Soto-Thompson, and U. Gustafsson, "A new image calibration system in digital colposcopy," Optics Express, vol. 14, no. 26, pp. 12887–12901, 2006.
  23. M. J. Vrhel and H. J. Trussell, "Color device calibration: a mathematical formulation," IEEE Transactions on Image Processing, vol. 8, no. 12, pp. 1796–1806, 1999.
  24. M. Shi and G. Healey, "Using reflectance models for color scanner calibration," Journal of the Optical Society of America A: Optics and Image Science, and Vision, vol. 19, no. 4, pp. 645–656, 2002.
  25. G. Hong, M. R. Luo, and P. A. Rhodes, "A study of digital camera colorimetric characterization based on polynomial modeling," Color Research and Application, vol. 26, no. 1, pp. 76–84, 2001.
  26. Y.-C. Chang and J. F. Reid, "RGB calibration for color image analysis in machine vision," IEEE Transactions on Image Processing, vol. 5, no. 10, pp. 1414–1422, 1996.
  27. W.-C. Kao, S.-H. Wang, C.-C. Kao, C.-W. Huang, and S.-Y. Lin, "Color reproduction for digital imaging systems," in Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS '06), pp. 4599–4602, Island of Kos, Greece, May 2006.
  28. B. Bastani, B. Cressman, and B. Funt, "Calibrated color mapping between LCD and CRT displays: a case study," Color Research and Application, vol. 30, no. 6, pp. 438–447, 2005.
  29. H. D. Cheng, X. Cai, and R. Min, "A novel approach to color normalization using neural network," Neural Computing and Applications, vol. 18, no. 3, pp. 237–247, 2009.
  30. G. Sharma, "Target-less scanner color calibration," Journal of Imaging Science and Technology, vol. 44, no. 4, pp. 301–307, 2000.
  31. J. L. Gardner, "Comparison of calibration methods for tri-stimulus colorimeters," Journal of Research of the National Institute of Standards and Technology, vol. 112, no. 3, pp. 129–138, 2007.
  32. J. L. Gardner, "Tristimulus colorimeter calibration matrix uncertainties," Color Research & Application, vol. 38, no. 4, pp. 251–258, 2013.
  33. G. Eppeldauer, "Spectral response based calibration method of tristimulus colorimeters," Journal of Research of the National Institute of Standards and Technology, vol. 103, no. 6, pp. 615–619, 1998.
  34. D. D. Testa and M. Rossi, "Lightweight lossy compression of biometric patterns via denoising autoencoders," IEEE Signal Processing Letters, vol. 22, no. 12, pp. 2304–2307, 2015.
  35. M. Jackowski, A. Goshtasby, S. Bines, D. Roseman, and C. Yu, "Correcting the geometry and color of digital images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 10, pp. 1152–1158, 1997.
  36. L. G. Corzo, J. A. Peñaranda, and P. Peer, "Estimation of a fluorescent lamp spectral distribution for color image in machine vision," Machine Vision and Applications, vol. 16, no. 5, pp. 306–311, 2005.
  37. G. E. Healey and R. Kondepudy, "Radiometric CCD camera calibration and noise estimation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 3, pp. 267–276, 1994.
  38. G. Sapiro, "Color and illuminant voting," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 11, pp. 1210–1215, 1999.
  39. P. L. Vora and H. J. Trussell, "Mathematical methods for the design of color scanning filters," IEEE Transactions on Image Processing, vol. 6, no. 2, pp. 312–320, 1997.
  40. M. J. Vrhel and H. J. Trussell, "Optimal color filters in the presence of noise," IEEE Transactions on Image Processing, vol. 4, no. 6, pp. 814–823, 1995.
  41. G. Paschos, "Perceptually uniform color spaces for color texture analysis: an empirical evaluation," IEEE Transactions on Image Processing, vol. 10, no. 6, pp. 932–937, 2001.