Abstract

Combining the CIEDE2000 color difference formula with the printing industry's standard for visual verification, we present an objective color image quality assessment method correlated with subjective visual perception. An objective score conformed to subjective perception (OSCSP) Q is proposed to directly reflect subjective visual perception. In addition, we present a general method to calibrate the correction factors of the color difference formula under real experimental conditions. Our experimental results show that the present DE2000-based metric is consistent with the human visual system in a general application environment.

1. Introduction

Image quality assessment is one of the basic technologies of image information engineering. Many researchers seek an objective quality assessment metric that can be calculated simply yet accurately reflects the subjective quality of human perception [1–3]. Most of these studies aim at reducing the deviation between subjective and objective quality assessment results. With the wide application of color images, color image quality assessment is becoming more and more important. To represent the visual perception of a color image, its three attributes (brightness, hue, and saturation) must be exploited [4]. At present, most color image quality assessment methods convert a color image into a space corresponding to brightness, hue, and saturation (such as XYZ, YUV, HSL, or an opponent color space) and then apply a gray-scale image quality assessment to each channel [5–8]. However, this kind of conversion cannot completely describe the nonlinear perception of brightness, hue, and saturation in a color image. It is well known that color image quality assessment results depend on how closely the color space coincides with visual perception. Thus, it is highly desirable to convert the color image into a color space that reflects the subjective visual characteristics more properly. In the present color image assessment method, we propose to convert the color image into a uniform color-difference color space.

Based on the JND (just noticeable difference) idea of the XYZ system in colorimetry, uniform color-difference spaces were proposed to establish, through subjective visual perception experiments, a linear relationship between visual perception and changes of brightness, hue, and saturation. It has been experimentally shown that assessment results in a uniform color-difference space are much better than in an opponent color space or the YUV color space [9]. At present, CIEDE2000 is regarded as the uniform color-difference model that best coincides with subjective visual perception; it normalizes the brightness, hue, and saturation of visual perception to the same unit [10–12]. With this model, we can directly obtain a numerical result, the color-difference parameter ΔE, which reflects the color difference between two images. Many researchers have exploited the color-difference parameter ΔE of the CIEDE2000 formula for image quality assessment [13, 14]. However, this objective parameter cannot directly correspond to subjective visual perception, such as a subjective five-level assessment metric. It is noted that in the printing industry, a visual perception metric based on the National Bureau of Standards (NBS) unit (or modified Judd) relates the value of ΔE to visual perception [15].

In the present color image quality assessment method, we adopt the NBS unit idea to convert the color-difference parameter ΔE from CIEDE2000 into a precise objective score conformed to subjective perception (OSCSP) Q that directly reflects subjective visual perception. Different from the well-known subjective five-level metrics, our OSCSP Q converted from ΔE can be a real number Q ∈ [0, 5] consistent with subjective perception. Our experiments on various distorted images show that the present color image quality assessment metric gives objective assessment results nicely coinciding with subjective visual perception.

2. CIEDE2000 Color Difference Formula and Its Environmental Parameters Calibration

The CIEDE2000 color difference formula relates the color difference value ΔE to the lightness difference ΔL′, chroma difference ΔC′, and hue difference ΔH′. It is defined as

ΔE = √[ (ΔL′/(K_L·S_L))² + (ΔC′/(K_C·S_C))² + (ΔH′/(K_H·S_H))² + R_T·(ΔC′/(K_C·S_C))·(ΔH′/(K_H·S_H)) ].  (1)

Here the parametric factors K_L, K_C, and K_H are correction factors related to the observation environment. The lightness, chroma, and hue weighting functions S_L, S_C, and S_H, respectively, describe the action of visual perception on the three attributes. The rotation factor R_T corrects the deflection of the ellipse axis direction in the blue region of visual perception. Figure 1 shows the basic steps for calculating the color-difference value ΔE based on the CIEDE2000 formula.
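The steps of Figure 1 can be sketched as a direct transcription of the standard CIEDE2000 formula. The following minimal Python implementation is not the authors' code; it is a conventional rendering of (1) with the parametric factors kL, kC, kH exposed, checked against a published CIEDE2000 reference pair.

```python
import math

def ciede2000(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    """CIEDE2000 difference between two CIELAB triples (L*, a*, b*).

    kL, kC, kH are the parametric correction factors of equation (1).
    """
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2

    # Step 1: adjusted chroma C' and hue angle h' via the a* rescaling G
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    Cbar = (C1 + C2) / 2.0
    G = 0.5 * (1.0 - math.sqrt(Cbar**7 / (Cbar**7 + 25.0**7)))
    a1p, a2p = (1.0 + G) * a1, (1.0 + G) * a2
    C1p, C2p = math.hypot(a1p, b1), math.hypot(a2p, b2)
    h1p = math.degrees(math.atan2(b1, a1p)) % 360.0
    h2p = math.degrees(math.atan2(b2, a2p)) % 360.0

    # Step 2: differences dL', dC', dH'
    dLp = L2 - L1
    dCp = C2p - C1p
    if C1p * C2p == 0.0:
        dhp = 0.0
    else:
        dhp = h2p - h1p
        if dhp > 180.0:
            dhp -= 360.0
        elif dhp < -180.0:
            dhp += 360.0
    dHp = 2.0 * math.sqrt(C1p * C2p) * math.sin(math.radians(dhp) / 2.0)

    # Step 3: weighting functions S_L, S_C, S_H and rotation term R_T
    Lbp = (L1 + L2) / 2.0
    Cbp = (C1p + C2p) / 2.0
    if C1p * C2p == 0.0:
        hbp = h1p + h2p
    elif abs(h1p - h2p) <= 180.0:
        hbp = (h1p + h2p) / 2.0
    elif h1p + h2p < 360.0:
        hbp = (h1p + h2p + 360.0) / 2.0
    else:
        hbp = (h1p + h2p - 360.0) / 2.0
    T = (1.0 - 0.17 * math.cos(math.radians(hbp - 30.0))
         + 0.24 * math.cos(math.radians(2.0 * hbp))
         + 0.32 * math.cos(math.radians(3.0 * hbp + 6.0))
         - 0.20 * math.cos(math.radians(4.0 * hbp - 63.0)))
    dTheta = 30.0 * math.exp(-(((hbp - 275.0) / 25.0) ** 2))
    RC = 2.0 * math.sqrt(Cbp**7 / (Cbp**7 + 25.0**7))
    SL = 1.0 + 0.015 * (Lbp - 50.0) ** 2 / math.sqrt(20.0 + (Lbp - 50.0) ** 2)
    SC = 1.0 + 0.045 * Cbp
    SH = 1.0 + 0.015 * Cbp * T
    RT = -math.sin(math.radians(2.0 * dTheta)) * RC

    # Step 4: combine the three weighted terms as in equation (1)
    tL = dLp / (kL * SL)
    tC = dCp / (kC * SC)
    tH = dHp / (kH * SH)
    return math.sqrt(tL * tL + tC * tC + tH * tH + RT * tC * tH)
```

With the defaults kL = kC = kH = 1 this reproduces the standard CIE values; the calibrated factors of Section 4 can be passed in for the real viewing environment.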

Under CIE standard observation conditions, the parametric factors are K_L = K_C = K_H = 1. As it is impossible to fully meet the standard observation conditions in a real experiment, we have to calibrate the three correction factors K_L, K_C, and K_H based on the idea of the JND (just noticeable difference). Visual JND means that the human eye can just perceive the difference in lightness, hue, or chroma between two objects. We change one of the three attributes (lightness, hue, or chroma) of one object while keeping the other two unchanged until the difference between this object and the reference one is just noticeable. This JND case corresponds to a color-difference parameter ΔE = 0.5 according to Table 1. For example, we can determine the lightness correction factor K_L as follows. First, choose two test images that have only a lightness distortion ΔL′, with no hue distortion ΔH′ and no chroma distortion ΔC′. Second, change the lightness of one image until the difference becomes just noticeable at a certain ΔL′. Finally, determine the proper K_L through (1), since the JND condition means ΔE = 0.5. In this way, we can fit all the values of K_L, K_C, and K_H satisfying ΔE = 0.5 under real experimental conditions.
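For a lightness-only distortion, (1) reduces to ΔE = |ΔL′| / (K_L·S_L), so once the just-noticeable lightness step has been measured, K_L follows directly from ΔE = 0.5. A minimal sketch of this calibration step is given below; the measurement values are illustrative, not the ones used in the experiment.

```python
import math

def lightness_weight(L_mean):
    """S_L of CIEDE2000, evaluated at the mean lightness of the image pair."""
    d = L_mean - 50.0
    return 1.0 + 0.015 * d * d / math.sqrt(20.0 + d * d)

def calibrate_kl(delta_L_jnd, L_mean, jnd_de=0.5):
    """Solve dE = |dL'| / (K_L * S_L) = jnd_de for K_L."""
    return abs(delta_L_jnd) / (jnd_de * lightness_weight(L_mean))

# Illustrative (hypothetical) JND measurements as (mean L*, just-noticeable dL');
# averaging several such estimates gives the final K_L, as done in Section 4.
measurements = [(50.0, 0.33), (60.0, 0.34)]
kL = sum(calibrate_kl(dL, Lm) for Lm, dL in measurements) / len(measurements)
```

K_C and K_H are calibrated the same way, using chroma-only and hue-only distortions with the corresponding weighting functions S_C and S_H.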

3. Objective Score Conformed to Subjective Perception

In order to directly relate the color-difference parameter ΔE from CIEDE2000 to subjective visual perception, we adopt the idea of the NBS unit and convert ΔE into an objective score conformed to subjective perception (OSCSP) Q through the nonlinear transformation (2). To obtain this transformation, we first extend the five-level metric with two extreme states, the minimum color difference and the maximum color difference, which yields through experiment the subjective assessment metric shown in Table 1. The OSCSP Q for this subjective assessment metric is then defined as

Q = 5,  if ΔĒ < 0.5, k = 1;
Q = 7 − k − (ΔĒ − ΔE_min(k)) / (ΔE_max(k) − ΔE_min(k)),  if 0.5 ≤ ΔĒ ≤ 24, k ∈ {2, 3, 4, 5, 6};
Q = 0,  if ΔĒ > 24, k = 7,  (2)

where ΔĒ is the mean color difference of the image and ΔE_min(k), ΔE_max(k) are the band limits of level k in Table 1.

Table 1 presents the detailed relationship of the OSCSP 𝑄, perception of color difference, and NBS units.
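Equation (2) is piecewise linear over the bands of Table 1. Since the exact ΔE_min(k)/ΔE_max(k) band edges of Table 1 are not reproduced in this text, the sketch below assumes the conventional NBS band boundaries 0.5, 1.5, 3, 6, 12, 24; the edges actually listed in Table 1 should be substituted if they differ.

```python
# Assumed NBS-style band edges [dE_min(k), dE_max(k)] for k = 2..6;
# replace with the values of Table 1 if they differ.
BANDS = [(0.5, 1.5), (1.5, 3.0), (3.0, 6.0), (6.0, 12.0), (12.0, 24.0)]

def oscsp_q(de_mean):
    """Map a mean CIEDE2000 color difference to the OSCSP Q of equation (2)."""
    if de_mean < 0.5:          # k = 1: imperceptible difference
        return 5.0
    if de_mean > 24.0:         # k = 7: beyond the worst perceptual grade
        return 0.0
    for k, (de_min, de_max) in enumerate(BANDS, start=2):
        if de_min <= de_mean <= de_max:
            # linear interpolation inside band k
            return 7.0 - k - (de_mean - de_min) / (de_max - de_min)
    return 0.0  # unreachable for finite inputs
```

Note that the mapping is continuous: Q = 5 at ΔĒ = 0.5 and Q = 0 at ΔĒ = 24, decreasing by one unit per band.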

For any distorted image, we obtain the OSCSP Q through the following steps.

(1) First, obtain the primary display values of the original image R1[i,j], G1[i,j], B1[i,j] and of the distorted image R2[i,j], G2[i,j], B2[i,j], where i ∈ [1, M] and j ∈ [1, N].

(2) Second, calculate the color difference value ΔE[i,j] of each pixel according to (1) and obtain the mean color difference ΔĒ = (1/(M·N)) Σ_{1≤i≤M} Σ_{1≤j≤N} ΔE[i,j].

(3) Finally, calculate Q according to (2) for each image.

Compared with previous methods, the present color image quality assessment method provides an objective score conformed to subjective perception (OSCSP) Q, which not only can be obtained directly by objective numerical calculation but also reflects subjective visual perception more accurately. Compared with the traditional subjective five-level metrics based on human scoring, the present objective metric is more convenient and can operate in real time for online color image assessment.
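Step (2) above is a plain per-pixel average. The sketch below assumes the displayed RGB values have already been converted to CIELAB triples (via the monitor's characterized transform); the per-pixel CIEDE2000 difference is passed in as a function so the sketch stays self-contained.

```python
def mean_color_difference(lab_ref, lab_dist, de):
    """Average per-pixel color difference over an M x N image.

    lab_ref, lab_dist: M x N grids (nested lists) of (L*, a*, b*) triples,
    assumed already converted from the displayed RGB values.
    de: a per-pixel CIEDE2000 color-difference function.
    """
    M, N = len(lab_ref), len(lab_ref[0])
    total = sum(de(lab_ref[i][j], lab_dist[i][j])
                for i in range(M) for j in range(N))
    return total / (M * N)
```

The returned ΔĒ is then fed into equation (2) to yield the OSCSP Q of the distorted image.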

4. Experimental Systems and Assessment Results

To prove that the present OSCSP Q can accurately reflect subjective visual perception, we performed experiments on various distorted images from the database provided by the Laboratory for Image and Video Engineering (LIVE) at the University of Texas at Austin [16]. Our experimental environment is set according to the basic conditions for an observation room. The experimental monitor is a Founder FN980-WT with a resolution of 1440×900 at 32-bit true color. To obtain repeatable experimental results, we calibrate the color temperature to 6500 K, the brightness to 80, the contrast to 70, and the observable color grade to 61.

After setting up the experimental environment, we first determine the correction parameters K_L, K_C, and K_H under this real experimental condition based on the JND idea described above. For a better calibration, we obtain sets of K_L, K_C, K_H values using the R, G, B primary color signals and multiple random signals, respectively, and then average them to reach the final calibrated correction parameters: K_L = 0.65, K_C = 1.0, and K_H = 4.0. The following experimental results use these correction parameters.

The LIVE database contains nearly 1000 images with five types of distortion: JPEG2000 and JPEG compression at various compression ratios, images contaminated by white Gaussian noise (WN), Gaussian-blurred images (gblur), and JPEG2000-compressed images transmitted over a simulated fast-fading Rayleigh channel with bit errors typical of wireless transmission (FF). For these images, the Differential Mean Opinion Score (DMOS) values range between 0 and 100; the smaller the DMOS, the better the image quality. We normalize the DMOS to subjective scores (SS) ranging between 0 and 5 by the expression SS = 5 × (100 − DMOS)/100 so that it can be compared with the subjective assessment metric; the higher the SS, the better the image quality.
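The normalization is a one-line linear rescaling that also flips the polarity (DMOS: lower is better; SS: higher is better):

```python
def dmos_to_ss(dmos):
    """Normalize a LIVE DMOS value in [0, 100] to a subjective score SS in [0, 5]."""
    return 5.0 * (100.0 - dmos) / 100.0
```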

The popular LIVE database contains 29 different source images and a total of 982 images (reference and distorted). As an example, we choose five source images, named "womanhat," "sailing2," "woman," "lighthouse," and "statue," and a total of 170 images (including reference and distorted) for our experiment. The references of these five images are shown in Figure 2. We compare our DE2000-based method with the well-known metrics PSNR and structural similarity (SSIM). We first calculate the objective results of PSNR, SSIM, and our objective score conformed to subjective perception (OSCSP) Q for all the images. By least-squares fitting between the subjective scores (SS) of these images and the objective results from PSNR, SSIM, and our method, we obtain a set of values named predicted subjective scores (SS_p) that reflect the conformity between the objective scores and subjective perception.
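The least-squares mapping can be sketched as follows. A simple linear fit is assumed here as the most direct reading of the text; a monotonic nonlinear regression could be substituted without changing the evaluation that follows.

```python
import numpy as np

def predicted_subjective_scores(objective, ss):
    """Fit ss ~ a * objective + b by least squares and return SS_p.

    objective: raw objective scores (PSNR, SSIM, or Q) for a set of images.
    ss: the corresponding subjective scores.
    """
    objective = np.asarray(objective, dtype=float)
    ss = np.asarray(ss, dtype=float)
    a, b = np.polyfit(objective, ss, 1)   # ordinary least squares, degree 1
    return a * objective + b
```

The resulting SS_p values are what Figure 3 plots against SS; a metric that conforms well to perception produces points close to the diagonal.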

In Figure 3, we present linear correlation graphs for PSNR, SSIM, and our DE2000-based method on the five typical types of distorted images. For all five distortion types, the present DE2000-based algorithm gives correlation results much closer to the diagonal than PSNR and SSIM. Correspondingly, the CC values of our DE2000-based method are found to be the highest.

The linear correlation coefficient (CC) and the mean absolute error (MAE) are used to quantitatively compare the image quality assessment (IQA) results of the present DE2000-based method with those of PSNR and SSIM. Tables 2 and 3, respectively, show the CC and MAE of PSNR, SSIM, and CIEDE2000. Both parameters measure the agreement between the objective scores and the subjective scores (SS): the higher the CC, or the lower the MAE, the better the objective assessment results coincide with subjective visual perception. Compared with PSNR and SSIM, our DE2000-based method gives a larger CC and a smaller MAE for all these images.
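Both figures of merit are standard and can be computed directly from SS and SS_p:

```python
import numpy as np

def cc_and_mae(ss, ss_p):
    """Pearson linear correlation coefficient and mean absolute error
    between subjective scores SS and predicted scores SS_p."""
    ss = np.asarray(ss, dtype=float)
    ss_p = np.asarray(ss_p, dtype=float)
    cc = np.corrcoef(ss, ss_p)[0, 1]      # Pearson CC
    mae = np.mean(np.abs(ss - ss_p))      # mean absolute error
    return cc, mae
```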

5. Conclusion

By exploiting the idea of the NBS unit, we have established a color image quality assessment metric based on the CIEDE2000 color difference formula. We propose an objective score conformed to subjective perception (OSCSP) Q, obtained directly by objective numerical calculation, to reflect the subjective visual perception of any color image. In addition, we present a general method to calibrate the correction factors of the CIEDE2000 color difference formula under real experimental conditions, so that we can experimentally compare the present metric with other objective IQA methods such as PSNR and SSIM in a general application environment. The experimental results prove that the present DE2000-based metric can assess color image quality accurately.

Acknowledgments

This work was supported by Fundamental Research Funds for the Central Universities (no. WK2100230002), National Science and Technology Major Project (no. 2010ZX03004-003), National Natural Science Foundation of China (no. 60872162), and Young Research Foundation of Anhui University (no. KJQN1012).