Research Article | Open Access

Yang Yang, Jun Ming, Nenghai Yu, "Color Image Quality Assessment Based on CIEDE2000", Advances in Multimedia, vol. 2012, Article ID 273723, 6 pages, 2012. https://doi.org/10.1155/2012/273723

Color Image Quality Assessment Based on CIEDE2000

Academic Editor: Qi Tian
Received: 23 Apr 2012
Accepted: 22 Jul 2012
Published: 30 Aug 2012

Abstract

Combining the CIEDE2000 color difference formula with the printing industry standard for visual verification, we present an objective color image quality assessment method that correlates with subjective visual perception. An objective score conformed to subjective perception (OSCSP), Q, is proposed to directly reflect subjective visual perception. In addition, we present a general method to calibrate the correction factors of the color difference formula under real experimental conditions. Our experimental results show that the present DE2000-based metric is consistent with the human visual system in general application environments.

1. Introduction

Image quality assessment is one of the basic technologies of image information engineering. Many researchers have sought an objective quality assessment metric that is simple to compute yet accurately reflects the subjective quality of human perception [1–3]. Most of these studies aim to reduce the deviation between subjective and objective assessment results. With the wide application of color images, color image quality assessment is becoming increasingly important. To represent the visual perception of a color image, its three attributes (brightness, hue, and saturation) must be exploited [4]. At present, most color image quality assessment methods convert a color image into a space whose channels correspond to brightness, hue, and saturation (including XYZ, YUV, HSL, and opponent color spaces) and then apply a gray-scale image quality assessment to each channel [5–8]. However, this kind of conversion cannot completely describe the nonlinear perception of brightness, hue, and saturation in a color image. It is well known that color image quality assessment results depend on how closely the color space matches visual perception. Thus, it is highly desirable to convert a color image into a color space that reflects subjective visual characteristics more properly. In the present color image assessment method, we propose to convert the color image into a uniform color-difference space.

Based on the JND (just noticeable difference) idea of the XYZ system in colorimetry, uniform color-difference spaces were proposed to establish, through subjective visual perception experiments, a linear relationship between visual perception and changes of brightness, hue, and saturation. It has been experimentally shown that assessment results in a uniform color-difference space are much better than those in opponent color spaces or the YUV color space [9]. At present, CIEDE2000 is regarded as the uniform color-difference model that best coincides with subjective visual perception. It normalizes brightness, hue, and saturation of visual perception to the same unit [10–12]. With this model, we can directly obtain a numerical result, the color-difference parameter ΔE, which reflects the color difference between two images. Many researchers have applied the color-difference parameter ΔE from the CIEDE2000 formula to image quality assessment [13, 14]. However, this objective parameter cannot directly correspond to subjective visual perception, such as the subjective five-level assessment metric. It is noted that in the printing industry, a visual perception metric based on the National Bureau of Standards (NBS) unit (or modified Judd) relates the value of ΔE to visual perception [15].

In the present color image quality assessment method, we adopt the NBS unit idea to convert the color-difference parameter ΔE from CIEDE2000 into a precise objective score conformed to subjective perception (OSCSP) Q that directly reflects subjective visual perception. Unlike the well-known subjective five-level metrics, our OSCSP Q converted from ΔE can be a real number Q ∈ [0, 5] consistent with subjective perception. Our experiments on various distorted images show that the present color image quality assessment metric gives objective assessment results that coincide nicely with subjective visual perception.

2. CIEDE2000 Color Difference Formula and Its Environmental Parameters Calibration

The CIEDE2000 color difference formula relates the color difference value ΔE to the lightness difference ΔL′, hue difference ΔH′, and chroma difference ΔC′. It is defined as

\[
\Delta E = \sqrt{\left(\frac{\Delta L'}{K_L S_L}\right)^2 + \left(\frac{\Delta C'}{K_C S_C}\right)^2 + \left(\frac{\Delta H'}{K_H S_H}\right)^2 + R_T\left(\frac{\Delta C'}{K_C S_C}\right)\left(\frac{\Delta H'}{K_H S_H}\right)} \tag{1}
\]

Here the parameter factors K_L, K_C, K_H are correction factors related to the observation environment. The lightness, chroma, and hue weighting factors S_L, S_C, S_H, respectively, describe the action of visual perception on the three attributes. The rotation factor R_T corrects the deflection of the ellipse axis direction in the blue region for visual perception. Figure 1 indicates the basic steps to calculate the color-difference value ΔE based on the CIEDE2000 formula.
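The final combination step of (1) can be sketched in Python as follows. This is only the last step of the formula: the differences ΔL′, ΔC′, ΔH′, the weighting factors S_L, S_C, S_H, and the rotation term R_T are assumed to have been precomputed from the full CIEDE2000 definition, and the function name is illustrative.

```python
import math

def ciede2000_combine(dL, dC, dH, SL, SC, SH, RT, KL=1.0, KC=1.0, KH=1.0):
    """Combine the precomputed terms of Eq. (1) into the color difference dE.

    dL, dC, dH are the lightness, chroma, and hue differences (dL', dC', dH');
    SL, SC, SH are the weighting factors, RT the rotation factor, and
    KL, KC, KH the environment correction factors (1.0 under CIE standard
    observation conditions).
    """
    tL = dL / (KL * SL)
    tC = dC / (KC * SC)
    tH = dH / (KH * SH)
    return math.sqrt(tL ** 2 + tC ** 2 + tH ** 2 + RT * tC * tH)
```

With R_T = 0 the formula reduces to a weighted Euclidean distance over the three attribute differences; the rotation term only matters in the blue region, where the perceptual ellipse axes tilt.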

Under CIE standard observation conditions, the parameter factors K_L = K_C = K_H = 1. As it is impossible to fully meet the standard observation conditions in a real experiment, we have to calibrate the three correction factors K_L, K_C, K_H based on the idea of JND (just noticeable difference). Visual JND means that the human eye can just perceive the difference in lightness, hue, or chroma between two objects. We change one of the three attributes (lightness, hue, or chroma) of one object and keep the other two unchanged to obtain a just noticeable difference between this object and the reference one. This JND case corresponds to the condition ΔE = 0.5 according to Table 1. For example, we can determine the lightness correction factor K_L as follows. First, choose two test images that have only lightness distortion ΔL′, without hue distortion ΔH′ or chroma distortion ΔC′. Second, change the lightness of one image until we perceive a just noticeable difference at a certain ΔL′. Last, determine the proper K_L factor through (1), since the JND condition means ΔE = 0.5. In this way, we can fit all the values of K_L, K_C, K_H satisfying ΔE = 0.5 under real experimental conditions.
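When only the lightness differs (ΔC′ = ΔH′ = 0), Eq. (1) collapses to ΔE = ΔL′/(K_L · S_L), so the calibration step is a one-line solve. A minimal sketch, with an illustrative function name, where delta_L_jnd is the just-noticeable ΔL′ measured in the experiment:

```python
def calibrate_kl(delta_L_jnd, SL, target_dE=0.5):
    """Solve Eq. (1) for K_L in the lightness-only JND case.

    With dC' = dH' = 0, Eq. (1) reduces to dE = dL' / (K_L * SL),
    so the K_L that makes the just-noticeable dL' correspond to
    dE = target_dE (0.5 per Table 1) is dL' / (target_dE * SL).
    """
    return abs(delta_L_jnd) / (target_dE * SL)
```

The chroma and hue factors K_C and K_H follow the same pattern with their own weighting factors; the paper averages the values fitted from R, G, B primary signals and random signals.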


๐‘˜ ฮ” ๐ธ m i n ( ๐‘˜ ) ฮ” ๐ธ m a x ( ๐‘˜ ) Perception of color difference ๐‘„

10.0 0.5Hardly 5
20.51.5Slight 5 โˆ’ ( ฮ” ๐ธ โˆ’ 0 . 5 )
31.53.0Noticeable 4 โˆ’ ( ฮ” ๐ธ โˆ’ 1 . 5 ) / 1 . 5
43.06.0Appreciable 3 โˆ’ ( ฮ” ๐ธ โˆ’ 3 ) / 3
56.012.0Much 2 โˆ’ ( ฮ” ๐ธ โˆ’ 6 ) / 6
612.024.0Very much 1 โˆ’ ( ฮ” ๐ธ โˆ’ 1 2 ) / 1 2
724.0 โˆž Strongly 0

In order to directly relate the color-difference parameter ΔE from CIEDE2000 to subjective visual perception, we adopt the idea of the NBS unit to convert ΔE into an objective score conformed to subjective perception (OSCSP) Q through the nonlinear transformation (2). To obtain this transformation, we first extend the five-level metric by including two extreme states, the minimum color difference and the maximum color difference, to obtain the subjective assessment metric shown in Table 1 through experiment. Then we define the OSCSP Q for the present subjective assessment metric as

\[
Q = \begin{cases}
5, & \Delta E < 0.5,\; k = 1, \\
(7 - k) - \dfrac{\Delta E - \Delta E_{\min}(k)}{\Delta E_{\max}(k) - \Delta E_{\min}(k)}, & 0.5 \le \Delta E \le 24,\; k \in \{2, 3, 4, 5, 6\}, \\
0, & \Delta E > 24,\; k = 7.
\end{cases} \tag{2}
\]
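The piecewise map (2) over the bands of Table 1 can be implemented directly; this is a sketch with illustrative names, hard-coding the five interior (ΔE_min, ΔE_max) bands for k = 2..6:

```python
# NBS-style bands (dE_min, dE_max) from Table 1 for k = 2..6.
BANDS = [(0.5, 1.5), (1.5, 3.0), (3.0, 6.0), (6.0, 12.0), (12.0, 24.0)]

def oscsp_q(dE):
    """Map a color-difference value dE to the OSCSP Q of Eq. (2)."""
    if dE < 0.5:      # k = 1: difference hardly perceptible
        return 5.0
    if dE > 24.0:     # k = 7: difference strongly perceptible
        return 0.0
    for k, (lo, hi) in enumerate(BANDS, start=2):
        if lo <= dE <= hi:
            # Linear interpolation inside the band, from 7-k down to 6-k.
            return (7 - k) - (dE - lo) / (hi - lo)
```

Note that Q decreases continuously from 5 to 0 as ΔE grows: the band endpoints agree (e.g., ΔE = 1.5 gives Q = 4 from either the k = 2 or the k = 3 formula), so the score is well defined on the boundaries.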

Table 1 presents the detailed relationship of the OSCSP Q, the perception of color difference, and NBS units.

For any distorted image, we obtain the OSCSP Q through the following processing.

(1) First, we obtain the primary display of the original image R1[i, j], G1[i, j], B1[i, j] and of the distorted image R2[i, j], G2[i, j], B2[i, j], where i ∈ [1, M], j ∈ [1, N].
(2) Second, we calculate the color difference value ΔE[i, j] of each pixel according to (1) and obtain the average color difference \(\overline{\Delta E} = \frac{1}{MN}\sum_{1 \le i \le M}\sum_{1 \le j \le N} \Delta E[i, j]\).
(3) Finally, Q is calculated according to (2) for each image.

Compared with previous methods, the present color image quality assessment method proposes an objective score conformed to subjective perception (OSCSP) Q, which can not only be obtained directly by objective numerical calculation but also reflects subjective visual perception more accurately. Compared with the traditional subjective five-level metrics based on human scoring, the present objective metric is more convenient and can operate in real time for online color image assessment.
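The averaging of step (2) is the only glue between the per-pixel formula (1) and the score (2). A stdlib-only sketch on a nested-list ΔE map, with an illustrative function name (the per-pixel ΔE values are assumed to have been computed with (1)):

```python
def mean_delta_e(dE_map):
    """Average the per-pixel dE[i, j] values over an M-by-N image.

    dE_map is a list of M rows, each a list of N per-pixel color
    differences already computed with Eq. (1).
    """
    M = len(dE_map)
    N = len(dE_map[0])
    return sum(sum(row) for row in dE_map) / (M * N)
```

The resulting mean ΔE is then fed to the piecewise map (2) to produce one Q per image.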

4. Experimental Systems and Assessment Results

To verify that the present OSCSP Q accurately reflects subjective visual perception, we performed experiments on various distorted images from the database provided by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin [16]. Our experimental environment is set according to the basic conditions for an observation room. The experimental monitor is a Founder FN980-WT with a resolution of 1440×900 at 32-bit true color. To obtain repeatable experimental results, we calibrate the color temperature to 6500 K, the brightness to 80, the contrast to 70, and the observable color grade to 61.

After setting up the experimental environment, we first determine the correction parameters K_L, K_C, K_H under this real experimental condition based on the idea of JND mentioned above. To obtain a better calibration, we derive a set of K_L, K_C, K_H values using the R, G, B primary color signals and multiple random signals, respectively, and then average them to reach the final calibrated correction parameters: K_L = 0.65, K_C = 1.0, and K_H = 4.0. The following experimental results use these correction parameters.

The LIVE database contains nearly 1000 images with five types of distortion: JPEG2000 and JPEG compression at various compression ratios, images contaminated by white Gaussian noise (WN), Gaussian blurred images (gblur), and JPEG2000-compressed images transmitted over a simulated fast-fading Rayleigh channel with bit errors typical of wireless transmission (FF). For these images, the Differential Mean Opinion Score (DMOS) values range between 0 and 100; the smaller the DMOS, the better the image quality. We normalized the DMOS to a subjective score (SS) ranging between 0 and 5 by the expression SS = 5 × (100 − DMOS)/100 so that it can be compared with the subjective assessment metric. The higher the SS, the better the image quality.
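The normalization above is a simple linear rescaling that flips the orientation of the score; as a sketch (illustrative function name):

```python
def dmos_to_ss(dmos):
    """Normalize a DMOS value in [0, 100] to a subjective score SS in [0, 5].

    DMOS = 0 (best quality) maps to SS = 5; DMOS = 100 (worst) maps to SS = 0.
    """
    return 5 * (100 - dmos) / 100
```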

The popular LIVE database contains 29 different source images and a total of 982 images (reference and distorted). As an example, we choose five different images, named "womanhat", "sailing2", "woman", "lighthouse", and "statue", for a total of 170 images (including reference and distorted) in our experiment. The references of these five images are shown in Figure 2. We compare our DE2000-based method with the well-known PSNR metric and the structural similarity (SSIM) metric. We first calculate the objective results of PSNR, SSIM, and our objective score conformed to subjective perception (OSCSP) Q for all the images. By least-squares fitting between the subjective scores (SS) of these images and the objective results from PSNR, SSIM, and our method, we obtain a set of predicted subjective scores (SS_p) that reflects the conformity between objective scores and subjective perception.
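The paper does not state the regression model beyond "least-squares", so as an assumption this sketch uses a plain linear fit SS ≈ a·x + b of each objective score x to the subjective scores, returning the predicted scores SS_p (names are illustrative):

```python
def fit_prediction(objective, ss):
    """Linear least-squares map from objective scores to predicted SS_p.

    Fits ss ~ a * objective + b in closed form and returns the
    predicted subjective scores for the same inputs.
    """
    n = len(objective)
    mx = sum(objective) / n
    my = sum(ss) / n
    sxx = sum((x - mx) ** 2 for x in objective)
    sxy = sum((x - mx) * (y - my) for x, y in zip(objective, ss))
    a = sxy / sxx
    b = my - a * mx
    return [a * x + b for x in objective]
```

IQA studies often use a monotonic nonlinear (e.g., logistic) fit instead; the linear version shown here is the simplest instance of the least-squares step described in the text.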

In Figure 3, we present linear correlation graphs for PSNR, SSIM, and our DE2000-based method on the five typical types of distorted images. For all five distortion types, the present DE2000-based algorithm gives correlation results much closer to the diagonal than PSNR and SSIM. Correspondingly, the CC values of our DE2000-based method are the highest.

The linear correlation coefficient (CC) and the mean absolute error (MAE) are used to quantitatively compare the image quality assessment (IQA) results of the present DE2000-based method with those of PSNR and SSIM. Tables 2 and 3, respectively, show the CC and MAE of PSNR, SSIM, and CIEDE2000. The parameters CC and MAE measure the correlation between the objective results and the subjective scores (SS): the higher the CC, or the lower the MAE, the more closely the objective assessment results coincide with subjective visual perception. Compared with PSNR and SSIM, our DE2000-based method gives a larger CC and a smaller MAE for all these images.
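Both figures of merit are standard; a minimal stdlib-only sketch (illustrative names), computing CC between predicted and subjective scores and MAE between them:

```python
import math

def pearson_cc(x, y):
    """Linear (Pearson) correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

def mae(x, y):
    """Mean absolute error between predicted and subjective scores."""
    return sum(abs(a - b) for a, b in zip(x, y)) / len(x)
```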


Table 2: CC between objective results and subjective scores for CIEDE2000, PSNR, and SSIM.

Model     | Image        | WN     | Blur   | FF     | JPEG   | JP2K
CIEDE2000 | "womanhat"   | 0.9760 | 0.9800 | 0.9210 | 0.9130 | 0.9460
          | "sailing2"   | 0.9670 | 0.9570 | 0.8530 | 0.9100 | 0.8760
          | "woman"      | 0.9640 | 0.9920 | 0.9630 | 0.9730 | 0.9920
          | "lighthouse" | 0.9860 | 0.9680 | 0.9970 | 0.9430 | 0.9670
          | "statue"     | 0.9730 | 0.9940 | 0.9640 | 0.9680 | 0.9640
          | Average      | 0.9732 | 0.9782 | 0.9396 | 0.9414 | 0.9490
PSNR      | Average      | 0.9066 | 0.8358 | 0.9034 | 0.8526 | 0.8534
SSIM      | Average      | 0.8386 | 0.9192 | 0.8720 | 0.8128 | 0.8264


Table 3: MAE between objective results and subjective scores for CIEDE2000, PSNR, and SSIM.

Model     | Image        | WN     | Blur   | FF     | JPEG   | JP2K
CIEDE2000 | "womanhat"   | 0.041  | 0.029  | 0.049  | 0.074  | 0.066
          | "sailing2"   | 0.040  | 0.063  | 0.068  | 0.072  | 0.062
          | "woman"      | 0.047  | 0.025  | 0.049  | 0.042  | 0.021
          | "lighthouse" | 0.030  | 0.047  | 0.013  | 0.060  | 0.045
          | "statue"     | 0.042  | 0.020  | 0.039  | 0.050  | 0.049
          | Average      | 0.0400 | 0.0368 | 0.0436 | 0.0596 | 0.0486
PSNR      | Average      | 0.0764 | 0.0994 | 0.0700 | 0.0992 | 0.0880
SSIM      | Average      | 0.0938 | 0.0762 | 0.0780 | 0.0928 | 0.0552

5. Conclusion

By exploiting the idea of NBS, we have established a color image quality assessment metric based on the CIEDE2000 color difference formula. We propose an objective score conformed to subjective perception (OSCSP) Q, obtained directly by objective numerical calculation, to reflect the subjective visual perception of any color image. In addition, we present a general method to calibrate the correction factors of the CIEDE2000 color difference formula under real experimental conditions, so that we can experimentally compare the present metric with other objective IQA methods, such as PSNR and SSIM, in a general application environment. The experimental results show that the present DE2000-based metric assesses color image quality accurately.

Acknowledgments

This work was supported by Fundamental Research Funds for the Central Universities (no. WK2100230002), National Science and Technology Major Project (no. 2010ZX03004-003), National Natural Science Foundation of China (no. 60872162), and Young Research Foundation of Anhui University (no. KJQN1012).

References

  1. A. C. Bovik, "Perceptual video processing: seeing the future," Proceedings of the IEEE, vol. 98, no. 11, pp. 1799–1803, 2010.
  2. A. C. Bovik, "What you see is what you learn," IEEE Signal Processing Magazine, vol. 27, no. 5, pp. 117–123, 2010.
  3. S. O. Lee and D. G. Sim, "Objectification of perceptual image quality for mobile video," Optical Engineering, vol. 50, no. 6, Article ID 067404, 2011.
  4. C. F. Hall, Digital Color Image Compression in a Perceptual Space, Ph.D. thesis, University of Southern California, 1978.
  5. N. Thakur and S. Devi, "A new method for color image quality assessment," International Journal of Computer Applications, vol. 15, no. 2, pp. 10–17, 2011.
  6. A. Toet and M. P. Lucassen, "A new universal colour image fidelity metric," Displays, vol. 24, no. 4-5, pp. 197–207, 2003.
  7. P. Le Callet and D. Barba, "A robust quality metric for color image quality assessment," in Proceedings of the International Conference on Image Processing (ICIP'03), pp. 437–440, September 2003.
  8. C. J. van den Branden Lambrecht, "Color moving pictures quality metric," in Proceedings of the IEEE International Conference on Image Processing (ICIP'96), pp. 885–888, September 1996.
  9. V. Monga, W. S. Geisler, and B. L. Evans, "Linear color-separable human visual system models for vector error diffusion halftoning," IEEE Signal Processing Letters, vol. 10, no. 4, pp. 93–97, 2003.
  10. M. R. Luo, G. Cui, and B. Rigg, "The development of the CIE 2000 colour-difference formula: CIEDE2000," Color Research and Application, vol. 26, no. 5, pp. 340–350, 2001.
  11. R. G. Kuehni, "CIEDE2000, milestone or final answer?" Color Research and Application, vol. 27, no. 2, pp. 126–127, 2002.
  12. M. R. Luo, G. Cui, and B. Rigg, "Further comments on CIEDE2000," Color Research and Application, vol. 27, no. 2, pp. 127–128, 2002.
  13. G. M. Johnson and M. D. Fairchild, "A top down description of S-CIELAB and CIEDE2000," Color Research and Application, vol. 28, no. 6, pp. 425–435, 2003.
  14. S. Chen, A. Beghdadi, and A. Chetouani, "Color image assessment using spatial extension to CIE DE2000," in Proceedings of the International Conference on Consumer Electronics (ICCE'08), Digest of Technical Papers, pp. 1–2, Las Vegas, Nev, USA, January 2008.
  15. C. Hu, Printing Color and Chromaticity, Printing Industry Press, 1993.
  16. H. R. Sheikh, Z. Wang, L. Cormack, and A. C. Bovik, "LIVE image quality assessment database release 2," http://live.ece.utexas.edu/research/quality.

Copyright © 2012 Yang Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

