Advances in Multimedia
Volume 2012 (2012), Article ID 273723, 6 pages
http://dx.doi.org/10.1155/2012/273723
Research Article

Color Image Quality Assessment Based on CIEDE2000

1Key IC&SP Laboratory of Ministry of Education, Anhui University, Hefei 230039, China
2MOE-Microsoft Key Laboratory of Multimedia Computing & Communication, University of Science and Technology of China, Hefei 230027, China

Received 23 April 2012; Accepted 22 July 2012

Academic Editor: Qi Tian

Copyright © 2012 Yang Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Combining the CIEDE2000 color difference formula with the printing industry's standard for visual verification, we present an objective color image quality assessment method correlated with subjective visual perception. An objective score conformed to subjective perception (OSCSP), Q, is proposed to reflect subjective visual perception directly. In addition, we present a general method to calibrate the correction factors of the color difference formula under real experimental conditions. Our experimental results show that the present DE2000-based metric is consistent with the human visual system in a general application environment.

1. Introduction

Image quality assessment is one of the basic technologies of image information engineering. Many researchers seek an objective quality assessment metric that is simple to compute yet accurately reflects the subjective quality of human perception [1–3]. Most of these studies aim to reduce the deviation between subjective and objective assessment results. With the wide application of color images, color image quality assessment has become increasingly important. To represent the visual perception of a color image, the three attributes (brightness, hue, and saturation) must be exploited [4]. At present, most color image quality assessment methods convert a color image to a space corresponding to brightness, hue, and saturation (including XYZ, YUV, HSL, and opponent color spaces) and then apply gray-scale image quality assessment to each channel [5–8]. However, this kind of conversion cannot completely describe the nonlinear perception of brightness, hue, and saturation in a color image. It is well known that color image quality assessment results depend on how closely the color space coincides with visual perception. Thus, it is highly desirable to convert the color image to a color space that reflects subjective visual characteristics more properly. In the present color image assessment method, we propose to convert the color image into the uniform color-difference color space.

Based on the JND (just noticeable difference) idea of the XYZ system in colorimetry, the uniform color-difference space was proposed to establish a linear relationship between changes of brightness, hue, and saturation and visual perception through subjective visual perception experiments. It has been experimentally shown that uniform color-difference space assessment results are much better than those of the opponent color space and the YUV color space [9]. Presently, CIEDE2000 is regarded as the uniform color-difference model that best coincides with subjective visual perception. It normalizes the brightness, hue, and saturation of visual perception to the same unit [10–12]. With this model, we can directly obtain a numerical result, the color-difference parameter ΔE, which reflects the color difference between two images. Many researchers have exploited this color-difference parameter ΔE from the CIEDE2000 formula for image quality assessment [13, 14]. However, this objective parameter ΔE cannot directly correspond to subjective visual perception, such as the subjective five-level assessment metric. It is noted that in the printing industry, a visual perception metric based on the National Bureau of Standards (NBS) unit (or modified Judd) relates the value of ΔE to visual perception [15].

In the present color image quality assessment method, we adopt the NBS-unit idea to convert the color-difference parameter ΔE from CIEDE2000 to a precise objective score conformed to subjective perception (OSCSP) Q that directly reflects subjective visual perception. Different from the well-known subjective five-level metrics, our OSCSP Q converted from ΔE can be any real number Q ∈ [0, 5] consistent with subjective perception. Our experiments on various distorted images show that the present color image quality assessment metric gives objective assessment results nicely coinciding with subjective visual perception.

2. CIEDE2000 Color Difference Formula and Its Environmental Parameters Calibration

The CIEDE2000 color difference formula presents the relationship of the color difference value ΔE with the lightness difference ΔL, hue difference ΔH, and chroma difference ΔC. It is defined as

$$\Delta E = \sqrt{\left(\frac{\Delta L}{K_L S_L}\right)^{2} + \left(\frac{\Delta C}{K_C S_C}\right)^{2} + \left(\frac{\Delta H}{K_H S_H}\right)^{2} + R_T\,\frac{\Delta C}{K_C S_C}\,\frac{\Delta H}{K_H S_H}}. \tag{1}$$

Here the parametric factors K_L, K_C, K_H are correction factors related to the observation environment. The lightness, chroma, and hue weighting functions S_L, S_C, S_H, respectively, describe the action of visual perception on the three attributes. The rotation factor R_T corrects the deflection of the ellipse axis direction in the blue region of visual perception. Figure 1 indicates the basic steps to calculate the color-difference value ΔE based on the CIEDE2000 formula.
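Given precomputed attribute differences and weights, the combination step of (1) can be sketched in Python. This is only the final step: the inputs ΔL, ΔC, ΔH, S_L, S_C, S_H, and R_T are assumed to come from the full CIEDE2000 computation outlined in Figure 1.

```python
import math

def ciede2000_combine(dL, dC, dH, SL, SC, SH, RT, KL=1.0, KC=1.0, KH=1.0):
    """Final combination step of the CIEDE2000 formula, Eq. (1).

    dL, dC, dH : lightness, chroma, and hue differences
    SL, SC, SH : weighting functions for the three attributes
    RT         : rotation term correcting the blue-region ellipse axis
    KL, KC, KH : parametric correction factors for the viewing environment
    """
    tL = dL / (KL * SL)
    tC = dC / (KC * SC)
    tH = dH / (KH * SH)
    return math.sqrt(tL**2 + tC**2 + tH**2 + RT * tC * tH)
```

Under CIE standard observation conditions the defaults K_L = K_C = K_H = 1 apply; the calibrated values of Section 4 can be passed in otherwise.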

273723.fig.001
Figure 1: CIEDE2000 color difference formula step diagram.

Under CIE standard observation conditions, the parametric factors K_L = K_C = K_H = 1. As it is impossible to fully meet the standard observation conditions in a real experiment, we have to calibrate these three correction factors K_L, K_C, K_H based on the idea of JND (just noticeable difference). The so-called visual JND means that the human eye can just feel the difference of lightness, hue, or chroma between two objects. We change one of these three attributes (lightness, hue, or chroma) of one object and keep the other two unchanged to obtain a just noticeable difference between this object and the reference one. This JND case corresponds to the condition ΔE = 0.5 according to Table 1. For example, we can determine the lightness correction factor K_L as follows. First, choose two test images which have only lightness distortion ΔL, without hue distortion ΔH or chroma distortion ΔC. Second, change the lightness of one image until we perceive a just noticeable difference at a certain ΔL. Last, we can determine the proper K_L through (1), since the JND condition means ΔE = 0.5. In this way, we can fit all the values of K_L, K_C, K_H satisfying ΔE = 0.5 under real experimental conditions.
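In the lightness-only case (ΔC = ΔH = 0), Eq. (1) reduces to ΔE = |ΔL|/(K_L S_L), so the JND condition ΔE = 0.5 can be solved for K_L in one line. The sketch below is illustrative: the JND lightness difference and the weight S_L are assumed to be measured as described above.

```python
def calibrate_KL(dL_jnd, SL, dE_jnd=0.5):
    """Solve Eq. (1) for K_L when only lightness differs.

    dL_jnd : lightness difference at which the JND is just perceived
    SL     : lightness weighting function value for the test stimulus
    dE_jnd : color difference assigned to the JND condition (0.5 per Table 1)
    """
    # dE_jnd = |dL_jnd| / (K_L * SL)  =>  K_L = |dL_jnd| / (dE_jnd * SL)
    return abs(dL_jnd) / (dE_jnd * SL)
```

K_C and K_H follow analogously from chroma-only and hue-only test pairs; averaging over several test signals gives the final calibrated values.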

tab1
Table 1: Subjective assessment metric based on CIEDE2000 Color difference.

3. Subjective Measurement Standard Establishment and Related Notes

In order to directly relate the color-difference parameter ΔE from CIEDE2000 to subjective visual perception, we adopt the idea of the NBS unit to convert ΔE to an objective score conformed to subjective perception (OSCSP) Q through the nonlinear transformation (2). To get this transformation, we first extend the five-level metric by including two extreme states, the minimum color difference and the maximum color difference, to obtain the subjective assessment metric shown in Table 1 through experiment. Then, we define the OSCSP Q for the present subjective assessment metric as

$$Q = \begin{cases} 5, & \Delta E < 0.5,\ k = 1,\\[4pt] 7 - k - \dfrac{\Delta E - \Delta E_{\min}(k)}{\Delta E_{\max}(k) - \Delta E_{\min}(k)}, & 0.5 \le \Delta E \le 24,\ k \in \{2,3,4,5,6\},\\[4pt] 0, & \Delta E > 24,\ k = 7, \end{cases} \tag{2}$$

where ΔE_min(k) and ΔE_max(k) are the boundaries of the color-difference range of level k in Table 1.

Table 1 presents the detailed relationship of the OSCSP 𝑄, perception of color difference, and NBS units.
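As an illustration, the mapping (2) can be sketched in Python. The band boundaries below are hypothetical placeholders standing in for the (ΔE_min(k), ΔE_max(k)) values of Table 1; only the endpoints 0.5 and 24 are fixed by (2).

```python
# Hypothetical ΔE bands for levels k = 2..6; the actual boundaries
# come from Table 1 (NBS-unit perception grades).
BANDS = [(0.5, 1.5), (1.5, 3.0), (3.0, 6.0), (6.0, 12.0), (12.0, 24.0)]

def oscsp_q(dE, bands=BANDS):
    """Map a mean color difference ΔE to the OSCSP Q of Eq. (2)."""
    if dE < 0.5:           # k = 1: difference below the JND
        return 5.0
    if dE > 24.0:          # k = 7: extreme color difference
        return 0.0
    for k, (emin, emax) in enumerate(bands, start=2):
        if emin <= dE <= emax:
            # Linear interpolation inside level k
            return 7.0 - k - (dE - emin) / (emax - emin)
```

Because the interpolation is continuous across band boundaries, Q is a real number in [0, 5] rather than one of five discrete grades.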

For any distorted image, we can obtain the OSCSP Q through the following processing.
(1) First, we get the primary display of the original image R1[i, j], G1[i, j], B1[i, j] and of the distorted image R2[i, j], G2[i, j], B2[i, j], where i ∈ [1, M], j ∈ [1, N].
(2) Second, we calculate the color difference value ΔE[i, j] of each pixel according to (1) and obtain the average color difference ΔE = (1/(MN)) Σ_{1≤i≤M} Σ_{1≤j≤N} ΔE[i, j].
(3) Finally, Q is calculated according to (2) for each image.

Compared with previous methods, the present color image quality assessment method proposes an objective score conformed to subjective perception (OSCSP) Q, which can not only be obtained directly by objective numerical calculation but also reflect subjective visual perception more accurately. Compared with the traditional subjective five-level metrics based on human scoring, the present objective metric is more convenient and can operate in real time for online color image assessment.
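The three processing steps above can be sketched as a small pipeline. Here `delta_e_fn` and `q_fn` are assumed callables: the former maps a reference/distorted image pair to the per-pixel ΔE array via (1), and the latter implements the OSCSP mapping (2).

```python
def assess(ref_rgb, dist_rgb, delta_e_fn, q_fn):
    """OSCSP assessment pipeline: per-pixel ΔE, spatial average, then Q.

    ref_rgb, dist_rgb : M×N RGB arrays of the reference and distorted images
    delta_e_fn        : callable returning the M×N array ΔE[i, j] per Eq. (1)
    q_fn              : callable implementing the OSCSP mapping of Eq. (2)
    """
    dE = delta_e_fn(ref_rgb, dist_rgb)   # step 2a: per-pixel color difference
    mean_dE = dE.mean()                  # step 2b: average over all M*N pixels
    return q_fn(mean_dE)                 # step 3: objective score Q
```

Because every stage is a plain numerical computation, the whole pipeline needs no human scoring and can run online.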

4. Experimental Systems and Assessment Results

To prove that the present OSCSP Q can accurately reflect subjective visual perception, we have performed experiments on various distorted images from the image database provided by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin [16]. Our experimental environment is set according to the basic conditions for an observation room. The experimental monitor is a Founder FN980-WT with a resolution of 1440×900 and 32-bit true color. To get repeatable experimental results, we calibrate the color temperature to 6500 K, brightness to 80, contrast to 70, and observable color grade to 61.

After setting up the experimental environment, we first need to determine the correction parameters K_L, K_C, K_H under this real experimental condition based on the idea of JND as mentioned above. To get a better calibration, we obtain sets of K_L, K_C, K_H values, respectively, using the R, G, B primary color signals and multiple random signals, and then average them to reach the final calibrated correction parameters: K_L = 0.65, K_C = 1.0, and K_H = 4.0. The following experimental results use these correction parameters.

The LIVE database contains nearly 1000 images with five types of distortions: JPEG2000 and JPEG compression with various compression ratios, images contaminated by white Gaussian noise (WN), Gaussian-blurred images (gblur), and JPEG2000-compressed images transmitted over a simulated fast-fading Rayleigh channel with bit errors typical of wireless transmission (FF). For these images, the Differential Mean Opinion Score (DMOS) values range between 0 and 100; the smaller the DMOS, the better the image quality. We normalized the DMOS to a subjective score (SS) ranging between 0 and 5 by the expression SS = 5(100 − DMOS)/100 so that it can be compared with the subjective assessment metric. The higher the SS, the better the image quality.
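The DMOS normalization is a simple linear rescaling, sketched below for clarity.

```python
def dmos_to_ss(dmos):
    """Normalize DMOS in [0, 100] to SS in [0, 5]: SS = 5 * (100 - DMOS) / 100.

    DMOS = 0 (best quality) maps to SS = 5; DMOS = 100 (worst) maps to SS = 0,
    so SS shares the orientation and range of the OSCSP Q scale.
    """
    return 5.0 * (100.0 - dmos) / 100.0
```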

The popular LIVE database contains 29 different source images and a total of 982 images (reference and distorted). As an example, we choose five different images, named “womanhat,” “sailing2,” “woman,” “lighthouse,” and “statue,” and a total of 170 images (including reference and distorted) in our experiment. The references of these five images are shown in Figure 2. We compare our DE2000-based method with the well-known metrics PSNR and structural similarity (SSIM). We first calculate the objective results of PSNR, SSIM, and our objective score conformed to subjective perception (OSCSP) Q for all the images. Through a least-squares fit between the subjective scores (SS) of these images and the objective results from PSNR, SSIM, and our method, we obtain a set of values, named predicted subjective scores (SS_p), to reflect the conformity between objective scores and subjective perception.
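The least-squares step can be sketched as follows. A first-order (linear) fit is assumed here for illustration; the text does not state the order of the fitting function used.

```python
import numpy as np

def predict_ss(objective, ss):
    """Fit SS ≈ a * objective + b by least squares; return predictions SS_p.

    objective : array of objective scores (PSNR, SSIM, or OSCSP Q)
    ss        : array of subjective scores for the same images
    """
    objective = np.asarray(objective, dtype=float)
    # Design matrix [objective, 1] for the affine model a * x + b
    A = np.vstack([objective, np.ones_like(objective)]).T
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(ss, dtype=float), rcond=None)
    return a * objective + b
```

The closer the resulting SS_p values lie to the diagonal SS_p = SS, the better the objective metric conforms to subjective perception.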

273723.fig.002
Figure 2: The five reference images.

In Figure 3, we present the linear correlation graphs from PSNR, SSIM, and our DE2000-based method for the five typical types of distorted images. For all five types, the present DE2000-based algorithm gives correlation results much closer to the diagonal than PSNR and SSIM. Correspondingly, the correlation coefficient (CC) values of our DE2000-based method are found to be the highest.

fig3
Figure 3: The linear correlation graphs from PSNR, SSIM, and DE2000-based method. (a1) to (a5), (b1) to (b5), and (c1) to (c5), respectively show the results for PSNR, SSIM, and DE2000 with five types of distortions.

The linear correlation coefficient (CC) and the mean absolute error (MAE) are used to quantitatively compare the image quality assessment (IQA) results of the present DE2000-based method with those of PSNR and SSIM. Tables 2 and 3, respectively, show the CC and MAE of PSNR, SSIM, and CIEDE2000. The parameters CC and MAE describe the correlation between objective scores and subjective scores (SS): the higher the CC, or the lower the MAE, the more closely the objective assessment results coincide with subjective visual perception. Compared with PSNR and SSIM, our DE2000-based method gives a larger CC and a smaller MAE for all these images.
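Both figures of merit are standard and can be sketched directly with NumPy.

```python
import numpy as np

def cc_and_mae(ss_pred, ss):
    """Linear correlation coefficient and mean absolute error between SS_p and SS."""
    ss_pred = np.asarray(ss_pred, dtype=float)
    ss = np.asarray(ss, dtype=float)
    cc = np.corrcoef(ss_pred, ss)[0, 1]   # Pearson linear correlation
    mae = np.abs(ss_pred - ss).mean()     # mean absolute deviation
    return cc, mae
```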

tab2
Table 2: CC between algorithm and DMOS.
tab3
Table 3: MAE between algorithm and DMOS.

5. Conclusion

By exploiting the idea of the NBS unit, we have established a color image quality assessment metric based on the CIEDE2000 color difference formula. We propose an objective score conformed to subjective perception (OSCSP) Q, obtained directly by objective numerical calculation, to reflect the subjective visual perception of any color image. In addition, we present a general method to calibrate the correction factors of the CIEDE2000 color difference formula under real experimental conditions, so that we can experimentally compare the present metric with other objective IQA methods such as PSNR and SSIM in a general application environment. The experimental results prove that the present DE2000-based metric can assess color image quality accurately.

Acknowledgments

This work was supported by Fundamental Research Funds for the Central Universities (no. WK2100230002), National Science and Technology Major Project (no. 2010ZX03004-003), National Natural Science Foundation of China (no. 60872162), and Young Research Foundation of Anhui University (no. KJQN1012).

References

1. A. C. Bovik, “Perceptual video processing: seeing the future,” Proceedings of the IEEE, vol. 98, no. 11, pp. 1799–1803, 2010.
2. A. C. Bovik, “What you see is what you learn,” IEEE Signal Processing Magazine, vol. 27, no. 5, pp. 117–123, 2010.
3. S. O. Lee and D. G. Sim, “Objectification of perceptual image quality for mobile video,” Optical Engineering, vol. 50, no. 6, Article ID 067404, 2011.
4. C. F. Hall, Digital color image compression in a perceptual space [Ph.D. thesis], University of Southern California, 1978.
5. N. Thakur and S. Devi, “A new method for color image quality assessment,” International Journal of Computer Applications, vol. 15, no. 2, pp. 10–17, 2011.
6. A. Toet and M. P. Lucassen, “A new universal colour image fidelity metric,” Displays, vol. 24, no. 4-5, pp. 197–207, 2003.
7. P. Le Callet and D. Barba, “A robust quality metric for color image quality assessment,” in Proceedings of the International Conference on Image Processing (ICIP'03), pp. 437–440, September 2003.
8. C. J. van den Branden Lambrecht, “Color moving pictures quality metric,” in Proceedings of the IEEE International Conference on Image Processing (ICIP'96), pp. 885–888, September 1996.
9. V. Monga, W. S. Geisler, and B. L. Evans, “Linear color-separable human visual system models for vector error diffusion halftoning,” IEEE Signal Processing Letters, vol. 10, no. 4, pp. 93–97, 2003.
10. M. R. Luo, G. Cui, and B. Rigg, “The development of the CIE 2000 colour-difference formula: CIEDE2000,” Color Research and Application, vol. 26, no. 5, pp. 340–350, 2001.
11. R. G. Kuehni, “CIEDE2000, milestone or final answer?” Color Research and Application, vol. 27, no. 2, pp. 126–127, 2002.
12. M. R. Luo, G. Cui, and B. Rigg, “Further comments on CIEDE2000,” Color Research and Application, vol. 27, no. 2, pp. 127–128, 2002.
13. G. M. Johnson and M. D. Fairchild, “A top down description of S-CIELAB and CIEDE2000,” Color Research and Application, vol. 28, no. 6, pp. 425–435, 2003.
14. S. Chen, A. Beghdadi, and A. Chetouani, “Color image assessment using spatial extension to CIE DE2000,” in Proceedings of the International Conference on Consumer Electronics (ICCE'08), Digest of Technical Papers, pp. 1–2, Las Vegas, Nev, USA, January 2008.
15. C. Hu, Printing Color and Chromaticity, Printing Industry Press, 1993.
16. H. R. Sheikh, Z. Wang, L. Cormack, and A. C. Bovik, “LIVE image quality assessment database release 2,” http://live.ece.utexas.edu/research/quality.