Advances in Multimedia
Volume 2012 (2012), Article ID 273723, 6 pages
Color Image Quality Assessment Based on CIEDE2000
1Key IC&SP Laboratory of Ministry of Education, Anhui University, Hefei 230039, China
2MOE-Microsoft Key Laboratory of Multimedia Computing & Communication, University of Science and Technology of China, Hefei 230027, China
Received 23 April 2012; Accepted 22 July 2012
Academic Editor: Qi Tian
Copyright © 2012 Yang Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Combining the CIEDE2000 color difference formula with the printing industry's standard for visual verification, we present an objective color image quality assessment method correlated with subjective visual perception. An objective score conformed to subjective perception (OSCSP), Q, is proposed to directly reflect subjective visual perception. In addition, we present a general method to calibrate the correction factors of the color difference formula under real experimental conditions. Our experimental results show that the present DE2000-based metric is consistent with the human visual system in general application environments.
1. Introduction
Image quality assessment is one of the basic technologies for image information engineering. Many researchers are seeking an objective quality assessment metric that can be calculated simply but still accurately reflects the subjective quality of human perception [1–3]. Most of these studies aim at reducing the deviation between subjective and objective quality assessment results. Due to the wide application of color images, color image quality assessment has become more and more important. To represent the visual perception of a color image, the three attributes (brightness, hue, and saturation) must be exploited [4]. At present, most color image quality assessment methods convert a color image into a space corresponding to brightness, hue, and saturation (including XYZ, YUV, HSL, and opponent color spaces) and then apply gray-scale image quality assessment to each channel [5–8]. However, this kind of conversion cannot completely describe the nonlinear perception of brightness, hue, and saturation in a color image. It is well known that color image quality assessment results depend on how closely the color space coincides with visual perception. Thus, it is highly desirable to convert a color image into a color space that reflects subjective visual characteristics more properly. In the present color image assessment method, we propose to convert the color image into a uniform color-difference color space.
Based on the JND (just noticeable difference) idea of the XYZ system in colorimetry, uniform color-difference space was proposed to establish a linear relationship between visual perception and changes of brightness, hue, and saturation, through subjective visual perception experiments. It has been experimentally shown that assessment results in a uniform color-difference space are much better than those in opponent color space and YUV color space [9]. Presently, CIEDE2000 is regarded as the best uniform color-difference model coinciding with subjective visual perception. It normalizes brightness, hue, and saturation of the visual perception to the same unit [10–12]. With this model, we can directly obtain a numerical result, named the color-difference parameter ΔE00, which reflects the color difference between two images. Many researchers have applied this color-difference parameter from the CIEDE2000 formula to image quality assessment [13, 14]. However, this objective parameter ΔE00 cannot directly correspond to subjective visual perception, such as the subjective five-level assessment metric. It is noted that in the printing industry, a visual perception metric based on the National Bureau of Standards (NBS) unit (or modified Judd) relates the value of ΔE00 to visual perception [15].
In the present color image quality assessment method, we adopt the NBS unit idea to convert the color-difference parameter ΔE00 from CIEDE2000 into a precise objective score conformed to subjective perception (OSCSP) that directly reflects subjective visual perception. Different from the well-known subjective five-level metrics, our OSCSP converted from ΔE00 can be a real number consistent with subjective perception. Our experiments on various distorted images show that the present color image quality assessment metric gives objective assessment results that coincide well with subjective visual perception.
2. CIEDE2000 Color Difference Formula and Its Environmental Parameters Calibration
The CIEDE2000 color difference formula expresses the color difference value ΔE00 in terms of the lightness difference ΔL′, hue difference ΔH′, and chroma difference ΔC′. It is defined as

ΔE00 = [(ΔL′/(kL·SL))² + (ΔC′/(kC·SC))² + (ΔH′/(kH·SH))² + RT·(ΔC′/(kC·SC))·(ΔH′/(kH·SH))]^(1/2). (1)
Here the parameter factors kL, kC, kH are correction factors related to the observation environment. The lightness, chroma, and hue weighting factors SL, SC, SH, respectively, describe the action of visual perception on the three attributes. The rotation factor RT corrects the deflection of the ellipse axis direction in the blue region for visual perception. Figure 1 indicates the basic steps to calculate the color-difference value ΔE00 based on the CIEDE2000 formula.
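As a concrete reference for the computation steps just described, the full evaluation of (1) can be sketched in Python. This is a standard implementation of the published CIEDE2000 formula, not the authors' own code; the kL, kC, kH defaults correspond to the CIE standard observation condition.

```python
import math

def ciede2000(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    """CIEDE2000 color difference between two CIELAB colors (L, a, b)."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    # Step 1: chroma and the a-axis rescaling factor G
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    Cbar = (C1 + C2) / 2.0
    G = 0.5 * (1 - math.sqrt(Cbar**7 / (Cbar**7 + 25.0**7)))
    a1p, a2p = (1 + G) * a1, (1 + G) * a2
    C1p, C2p = math.hypot(a1p, b1), math.hypot(a2p, b2)
    h1p = math.degrees(math.atan2(b1, a1p)) % 360.0
    h2p = math.degrees(math.atan2(b2, a2p)) % 360.0
    # Step 2: differences dL', dC', dH'
    dLp = L2 - L1
    dCp = C2p - C1p
    dh = h2p - h1p
    if C1p * C2p == 0:
        dh = 0.0
    elif dh > 180:
        dh -= 360.0
    elif dh < -180:
        dh += 360.0
    dHp = 2 * math.sqrt(C1p * C2p) * math.sin(math.radians(dh) / 2)
    # Step 3: mean values, weighting functions SL, SC, SH, rotation term RT
    Lbar = (L1 + L2) / 2.0
    Cbarp = (C1p + C2p) / 2.0
    hsum = h1p + h2p
    if C1p * C2p == 0:
        hbar = hsum
    elif abs(h1p - h2p) <= 180:
        hbar = hsum / 2.0
    else:
        hbar = (hsum + 360.0) / 2.0 if hsum < 360 else (hsum - 360.0) / 2.0
    T = (1 - 0.17 * math.cos(math.radians(hbar - 30))
           + 0.24 * math.cos(math.radians(2 * hbar))
           + 0.32 * math.cos(math.radians(3 * hbar + 6))
           - 0.20 * math.cos(math.radians(4 * hbar - 63)))
    SL = 1 + 0.015 * (Lbar - 50)**2 / math.sqrt(20 + (Lbar - 50)**2)
    SC = 1 + 0.045 * Cbarp
    SH = 1 + 0.015 * Cbarp * T
    dtheta = 30 * math.exp(-(((hbar - 275) / 25.0)**2))
    RC = 2 * math.sqrt(Cbarp**7 / (Cbarp**7 + 25.0**7))
    RT = -math.sin(math.radians(2 * dtheta)) * RC
    # Step 4: combine the three weighted differences per (1)
    tL = dLp / (kL * SL)
    tC = dCp / (kC * SC)
    tH = dHp / (kH * SH)
    return math.sqrt(tL * tL + tC * tC + tH * tH + RT * tC * tH)
```

For the first standard test pair from the CIEDE2000 literature, `ciede2000((50, 2.6772, -79.7751), (50, 0.0, -82.7485))` evaluates to about 2.04.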
Under CIE standard observation conditions, the parameter factors are kL = kC = kH = 1. As it is impossible to fully meet the standard observation conditions in a real experiment, we have to calibrate these three correction factors kL, kC, kH based on the idea of JND (just noticeable difference). The so-called visual JND means that the human eye can just perceive the difference in lightness, hue, or chroma between two objects. We change one of these three attributes (lightness, hue, or chroma) of one object and keep the other two unchanged to obtain a just noticeable difference between this object and the reference one. This JND case corresponds to the color-difference value ΔE00 given in Table 1. For example, we can determine the lightness correction factor kL as follows. First, choose two test images that have only lightness distortion ΔL′, without hue distortion (ΔH′ = 0) or chroma distortion (ΔC′ = 0). Second, change the lightness of one image until the just noticeable difference is perceived at a certain ΔL′. Last, we can determine the proper factor kL through (1), since the JND condition fixes the value of ΔE00. In this way, we can fit all the values of kL, kC, kH under real experimental conditions.
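The lightness-only calibration step above can be illustrated with a short sketch: when ΔH′ = ΔC′ = 0, (1) reduces to ΔE00 = |ΔL′|/(kL·SL), so kL follows directly from the lightness step observed at the JND and the ΔE00 value that Table 1 assigns to the JND. The threshold value 0.5 below is an assumed placeholder, not the paper's calibrated figure.

```python
import math

def calibrate_kL(L_ref, L_jnd, de00_jnd=0.5):
    """Solve (1) for kL in the lightness-only case dC' = dH' = 0.

    de00_jnd is the color-difference value that Table 1 assigns to a
    just noticeable difference; 0.5 is an assumed placeholder here.
    """
    Lbar = (L_ref + L_jnd) / 2.0
    SL = 1 + 0.015 * (Lbar - 50)**2 / math.sqrt(20 + (Lbar - 50)**2)
    # dE00 = |dL'| / (kL * SL)  =>  kL = |dL'| / (dE00 * SL)
    return abs(L_jnd - L_ref) / (de00_jnd * SL)
```

Averaging such single-attribute estimates over several test signals gives the final calibrated factors, as described in Section 4.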
3. Subjective Measurement Standard Establishment and Related Notes
In order to directly relate the color-difference value ΔE00 from CIEDE2000 to subjective visual perception, we adopt the idea of the NBS unit to convert ΔE00 to an objective score conformed to subjective perception (OSCSP), which reflects subjective visual perception through the nonlinear transformation in (2). To obtain this transformation, we first extend the five-level metric by including two extreme states, the minimum color difference and the maximum color difference, to obtain through experiment the subjective assessment metric shown in Table 1. Then, we define the OSCSP Q for the present subjective assessment metric as in (2).
Table 1 presents the detailed relationship between the OSCSP Q, the perception of color difference, and NBS units.
For any distorted image, we can obtain the OSCSP Q through the following processing.
(1) First, we get the primary display values of each pixel of the original image and the distorted image.
(2) Second, we calculate the color difference value ΔE00 of each pixel according to (1) and take the average color difference over the whole image.
(3) Finally, Q is calculated according to (2) for each image.
Compared with previous methods, the present color image quality assessment method proposes an objective score conformed to subjective perception (OSCSP) Q, which can not only be obtained directly by objective numerical calculation but also reflect subjective visual perception more accurately. Compared with the traditional subjective five-level metrics based on human scoring, the present objective metric is more convenient and can be operated in real time for online color image assessment.
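Step (1) requires converting display RGB values to CIELAB before (1) can be applied per pixel. A minimal sketch, assuming the common sRGB encoding and a D65 white point (the monitor in Section 4 is calibrated to 6500 K, which is close to D65):

```python
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 white), per the sRGB specification."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # sRGB -> XYZ (D65) matrix
    X = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    Y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    Z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> Lab with the D65 reference white
    Xn, Yn, Zn = 0.95047, 1.0, 1.08883
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d**3 else t / (3 * d**2) + 4.0 / 29.0
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

With per-pixel Lab values in hand, step (2) is a plain average of the per-pixel color differences over the image.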
4. Experimental Systems and Assessment Results
To prove that the present OSCSP can accurately reflect subjective visual perception, we have performed experiments on various distorted images from the image database provided by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin [16]. Our experimental environment is set according to the basic conditions for an observation room. The experimental monitor is a Founder FN980-WT displaying 32-bit true color. To get repeatable experimental results, we calibrate the color temperature to 6500 K, brightness to 80, contrast to 70, and observable color grade to 61.
After setting up the experimental environment, we first determine the correction parameters kL, kC, kH under this real experimental condition based on the idea of JND as mentioned above. To get a better calibration, we obtain sets of kL, kC, kH values using the R, G, B primary color signals and multiple random signals, respectively, and then average them to reach the final calibrated correction parameters. The following experimental results use these correction parameters.
The LIVE database contains nearly 1000 images with five types of distortions: JPEG2000 and JPEG compression with various compression ratios, images contaminated by white Gaussian noise (WN), Gaussian blurred images (gblur), and JPEG2000 compressed images transmitted over a simulated fast-fading Rayleigh channel with bit errors typical of wireless transmission (FF). For these images, the Differential Mean Opinion Score (DMOS) values range between 0 and 100. The smaller the DMOS, the better the image quality. We normalized the DMOS to a subjective score (SS) ranging between 0 and 5 so that it can be compared with the subjective assessment metric. The higher the SS, the better the image quality.
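The DMOS-to-SS normalization can be sketched as follows; the linear form below is an assumption consistent with the stated ranges and orderings, not necessarily the paper's exact expression.

```python
def dmos_to_ss(dmos):
    """Map DMOS in [0, 100] (lower = better) to SS in [0, 5] (higher = better).

    A simple linear rescaling; this is an assumed form, since the paper's
    exact normalization expression is not reproduced here.
    """
    return 5.0 * (1.0 - dmos / 100.0)
```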
The popular LIVE database contains 29 different source images and a total of 982 images (reference and distorted). As an example, we choose five source images named "womanhat," "sailing2," "woman," "lighthouse," and "statue," and a total of 170 images (including reference and distorted) in our experiment. The references of these five images are shown in Figure 2. We compare our DE2000-based method with the well-known metrics PSNR and structural similarity (SSIM). We first calculate the objective results of PSNR, SSIM, and our objective score conformed to subjective perception (OSCSP) for all the images. Through least-squares fitting between the subjective scores (SS) of these images and the objective results from PSNR, SSIM, and our method, we obtain a set of values named prediction subjective scores that reflect the conformity between objective scores and subjective perception.
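The fitting step above can be sketched as an ordinary linear least-squares regression from objective scores to SS. The helper name is hypothetical, and the linear model is an assumption: IQA studies often use a logistic fit instead, and the paper does not specify the exact model.

```python
def fit_prediction_scores(objective, ss):
    """Least-squares line ss ≈ a*obj + b; returns the predicted scores."""
    n = len(objective)
    mo = sum(objective) / n
    ms = sum(ss) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(objective, ss))
    var = sum((o - mo) ** 2 for o in objective)
    a = cov / var          # slope
    b = ms - a * mo        # intercept
    return [a * o + b for o in objective]
```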
In Figure 3, we present linear correlation graphs for PSNR, SSIM, and our DE2000-based method on five typical types of distorted images. For all five distortion types, the present DE2000-based algorithm gives correlation results much closer to the diagonal than PSNR and SSIM. Correspondingly, the correlation coefficient (CC) values of our DE2000-based method are found to be the highest.
The linear correlation coefficient (CC) and the mean absolute error (MAE) are used to quantitatively compare the image quality assessment (IQA) results of the present DE2000-based method with those of PSNR and SSIM. Tables 2 and 3, respectively, show the CC and MAE of PSNR, SSIM, and CIEDE2000. The parameters CC and MAE measure the agreement between the objective scores and the subjective scores (SS): the higher the CC, or the lower the MAE, the more closely the objective assessment results coincide with subjective visual perception. Compared with PSNR and SSIM, our DE2000-based method gives a larger CC and a smaller MAE for all these images.
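The two agreement measures used in Tables 2 and 3 follow standard definitions and can be computed as below (hypothetical helper names):

```python
import math

def pearson_cc(x, y):
    """Linear (Pearson) correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_absolute_error(x, y):
    """Mean absolute error between predicted and subjective scores."""
    return sum(abs(a - b) for a, b in zip(x, y)) / len(x)
```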
5. Conclusion
By exploiting the idea of the NBS unit, we have established a color image quality assessment metric based on the CIEDE2000 color difference formula. We propose an objective score conformed to subjective perception (OSCSP), obtained directly by objective numerical calculation, to reflect the subjective visual perception of any color image. In addition, we present a general method to calibrate the correction factors of the CIEDE2000 color difference formula under real experimental conditions, so that the present metric can be experimentally compared with other objective IQA methods such as PSNR and SSIM in general application environments. The experimental results prove that the present DE2000-based metric can assess color image quality effectively.
Acknowledgments
This work was supported by the Fundamental Research Funds for the Central Universities (no. WK2100230002), the National Science and Technology Major Project (no. 2010ZX03004-003), the National Natural Science Foundation of China (no. 60872162), and the Young Research Foundation of Anhui University (no. KJQN1012).
- A. C. Bovik, “Perceptual video processing: seeing the future,” Proceedings of the IEEE, vol. 98, no. 11, pp. 1799–1803, 2010.
- A. C. Bovik, “What you see is what you learn,” IEEE Signal Processing Magazine, vol. 27, no. 5, pp. 117–123, 2010.
- S. O. Lee and D. G. Sim, “Objectification of perceptual image quality for mobile video,” Optical Engineering, vol. 50, no. 6, Article ID 067404, 2011.
- C. F. Hall, Digital color image compression in a perceptual space [Ph.D. thesis], University of Southern California, 1978.
- N. Thakur and S. Devi, “A new method for color image quality assessment,” International Journal of Computer Applications, vol. 15, no. 2, pp. 10–17, 2011.
- A. Toet and M. P. Lucassen, “A new universal colour image fidelity metric,” Displays, vol. 24, no. 4-5, pp. 197–207, 2003.
- P. Le Callet and D. Barba, “A robust quality metric for color image quality assessment,” in Proceedings of the International Conference on Image Processing (ICIP'03), pp. 437–440, September 2003.
- C. J. van den Branden Lambrecht, “Color moving pictures quality metric,” in Proceedings of the IEEE International Conference on Image Processing (ICIP'96), pp. 885–888, September 1996.
- V. Monga, W. S. Geisler, and B. L. Evans, “Linear color-separable human visual system models for vector error diffusion halftoning,” IEEE Signal Processing Letters, vol. 10, no. 4, pp. 93–97, 2003.
- M. R. Luo, G. Cui, and B. Rigg, “The development of the CIE 2000 colour-difference formula: CIEDE2000,” Color Research and Application, vol. 26, no. 5, pp. 340–350, 2001.
- R. G. Kuehni, “CIEDE2000, milestone or final answer?” Color Research and Application, vol. 27, no. 2, pp. 126–127, 2002.
- M. R. Luo, G. Cui, and B. Rigg, “Further comments on CIEDE2000,” Color Research and Application, vol. 27, no. 2, pp. 127–128, 2002.
- G. M. Johnson and M. D. Fairchild, “A top down description of S-CIELAB and CIEDE2000,” Color Research and Application, vol. 28, no. 6, pp. 425–435, 2003.
- S. Chen, A. Beghdadi, and A. Chetouani, “Color image assessment using spatial extension to CIE DE2000,” in Proceedings of the International Conference on Consumer Electronics (ICCE'08), Digest of Technical Papers, pp. 1–2, Las Vegas, Nev, USA, January 2008.
- C. Hu, Printing Color and Chromaticity, Printing Industry Press, 1993.
- H. R. Sheikh, Z. Wang, L. Cormack, and A. C. Bovik, “LIVE image quality assessment database release 2,” http://live.ece.utexas.edu/research/quality.