Research Article  Open Access
Hui Wang, Xiaojuan Hu, Hui Xu, Shiyin Li, Zhaolin Lu, "No-Reference Quality Assessment Method for Blurriness of SEM Micrographs with Multiple Texture", Scanning, vol. 2019, Article ID 4271761, 15 pages, 2019. https://doi.org/10.1155/2019/4271761
No-Reference Quality Assessment Method for Blurriness of SEM Micrographs with Multiple Texture
Abstract
Scanning electron microscopy (SEM) plays an important role in the intuitive understanding of microstructures because it can provide ultrahigh magnification. Tens or hundreds of images are regularly generated and saved during a typical microscopy imaging process. Given the subjectivity of a microscopist’s focusing operation, blurriness is an important distortion that debases the quality of micrographs. The selection of high-quality micrographs using subjective methods is expensive and time-consuming. This study proposes a new no-reference quality assessment method for evaluating the blurriness of SEM micrographs. The human visual system is more sensitive to the distortions of cartoon components than to those of redundant textured components according to the Gestalt perception psychology and the entropy masking property. Micrographs are initially decomposed into cartoon and textured components. Then, the spectral and spatial sharpness maps of the cartoon components are extracted. One metric is calculated by combining the spatial and spectral sharpness maps of the cartoon components. The other metric is calculated on the basis of the edge of the maximum local variation map of the cartoon components. Finally, the two metrics are combined as the final metric. The objective scores generated using this method exhibit high correlation and consistency with the subjective scores.
1. Introduction
Scanning electron microscopy (SEM) helps researchers intuitively understand microstructures because it can provide ultrahigh magnification. SEM is playing an increasingly important role in various research areas, such as medical imaging, automated inspection, bioimaging, and ore detection. At present, microscopists must deal with a considerable number of images because tens or hundreds of images are regularly generated and saved during a typical microscopy imaging process [1–3]. Images obtained via SEM may be blurred because of the imaging equipment used or the operators who performed the process. Frequently, only a few images are useful for further analysis. Given the subjectivity of the SEM operation, blurring is a major distortion in SEM images [4–6]. Postek and Vladár qualitatively and quantitatively analysed the sharpness of micrographs in the Fourier domain [7, 8]. Their work provided valuable information to the SEM community, and their approach was easy to understand and interpret. However, Postek and Vladár disregarded the characteristics of the human visual system (HVS), which is the final receiver of the images. Thus, HVS characteristics should be considered.
Image quality assessment (IQA) is a useful method for finding clear images [9–11]. Human beings are the ultimate receivers of processed images, and they judge image quality. Automatic approaches that can assess image quality consistently with human subjective evaluation must be developed [10]. Subjective IQA methods require numerous observers to participate in experiments. These methods are not only expensive and time-consuming but also cannot be incorporated into automatic or real-time image systems [9]. Hence, objective quality methods that can automatically and accurately assess image quality must be developed. Objective IQA methods can be classified into three categories on the basis of the availability of the reference images: full reference (FR) [9, 10], in which complete reference images are required; reduced reference (RR) [12, 13], in which partial information about the reference images is available; and no reference (NR) [14–16], in which no information about the reference images is necessary. Given that no reference images are available in microscopy applications, the use of the FR and RR methods is limited.
NR assessment methods for blurriness or sharpness can be divided into three categories [17]. (1) Edge-based approaches assume that image edges will spread when blurriness occurs. Marziliano et al. detected vertical edges by using the Sobel operator and then obtained a blurriness measure from the edge map; the average width of all edge pixels was regarded as a metric for blurriness [18]. Ferzli and Karam combined Weber’s law with Marziliano’s method and proposed the “just noticeable blur” method, which considers HVS properties [14]. (2) Transform-based approaches: blurriness leads to distortion in different domains, and thus, many methods assess blurriness on the basis of transform domains, such as the discrete cosine transform [19], discrete wavelet transform (DWT) [20], and sparse transform [21]. Some methods utilise information from two or more domains. Chen and Bovik used the comprehensive spatial and gradient information of the DWT as the final blurriness metric [16]. Vu et al. combined spatial features with spectral features as a sharpness measurement [22]. Li et al. proposed a robust method by learning multiscale features extracted in the spatial and spectral domains [23]. (3) Statistical pixel information-based approaches analyse the distribution laws of the pixels of original images or their maps. Considering that blurriness decreases the variance of differences in the intensity of adjacent pixels, Tsomko and Kim calculated the block difference variance and regarded it as a blurriness metric [24]. Bahrami and Kot developed a method based on the maximum local variation (MLV) distribution of each pixel, and the standard deviation of the distribution was used to measure sharpness [25]. Li et al. combined the sum of squared non-DC moment (SSM) values of a gradient map, computed on the basis of Tchebichef moments, with block variances and visual saliency to measure blurriness [26].
Each category has drawbacks. Edge-based methods rely excessively on the image content; if an image has only a few sharp edges, then edge-based methods may be inaccurate. Transform-based methods assume that distortion in certain domains can be easily extracted and computed; however, these methods occasionally overlook human visual perception, which plays a key role in assessment. Statistical pixel information-based methods are not robust because they are sensitive to noise.
Samples must be preprocessed before SEM imaging, and this procedure makes micrographs differ from natural images. The samples are typically ores, which are polished and placed on conductive tapes; therefore, the micrograph content contains distinct edge areas. Figure 1 shows that the final micrographs have many textures. The micrographs used in the database have more strong edges and textures than natural images. Images can be decomposed into cartoon and textured components, and owing to the entropy masking property of human visual perception [27], the two components are perceived differently. The cartoon components of images contain strong edges and flat areas, whereas the textured components contain middle- and high-frequency information, including noise and textures. In [28], Attneave indicated that image information is concentrated on contours. Certain areas and objects are described simply by the HVS on the basis of the Gestalt perception psychology [29]. Moreover, the HVS is more sensitive to changes in cartoon components than to changes in textured components.
The experiments in the succeeding sections also prove that the cartoon components exhibit better blurriness features than the original micrographs. This study proposes a new NR assessment method that initially decomposes SEM micrographs into cartoon and textured components and then assesses the blurriness of the cartoon components. Two metrics are calculated separately, one from the combination of spatial-spectral information and one from the edges of the spatial map, and the final metric is obtained via the weighted summation of the two metrics after normalization. The experiments demonstrate the good performance of the proposed method.
2. Cartoon+Textured Components with Isotropic Nonlinear Filters
Before assessing the quality of the SEM micrographs, the micrographs are initially decomposed into cartoon and textured components. An original SEM micrograph is denoted as $f$, which can be decomposed into the cartoon component $u$ and the textured component $v$. The decomposition process is defined as $f = u + v$. The general variational framework for the decomposition model is provided in Meyer’s models [30] as an energy minimisation problem:
$$\inf_{(u, v) \in X_1 \times X_2} \left\{ E_1(u) + \lambda E_2(v) : f = u + v \right\},$$
where $E_1$ and $E_2$ are energy functionals and $X_1$ and $X_2$ are spaces of functions or distributions; $E_1(u)$ and $E_2(v)$ are small if and only if $u \in X_1$ and $v \in X_2$, respectively. $\lambda$ is a tuning parameter. The cartoon component $u$ contains the strong edges and low-frequency information of a micrograph, which can be described by the space of functions of bounded variation. By contrast, the textured component $v$ contains the noise and texture of a micrograph, which can be described by Meyer’s space of oscillating functions.
A fast and approximate solution for the general variational problem was proposed in [31, 32] by applying a nonlinear low-pass/high-pass filter pair. For each point $x$ of a micrograph $f$, when the micrograph is filtered with a low-pass filter, the total variation (TV) around $x$ barely decreases if $x$ lies in a cartoon region; by contrast, if $x$ lies in a textured region, then the TV decreases rapidly. A nonlinear filter was proposed in that solution on the basis of this characteristic. The local total variation ($\mathrm{LTV}$) at $x$ is defined as follows:
$$\mathrm{LTV}_\sigma(f)(x) = (L_\sigma * |\nabla f|)(x),$$
where $L_\sigma$ is a low-pass filter with a Gaussian kernel and a standard deviation of $\sigma$, $x \in \Omega$, $\Omega$ denotes the entire micrograph region, and $*$ is the convolution symbol.
$\lambda_\sigma(x)$ is the relative reduction rate of the $\mathrm{LTV}$, which is defined as follows:
$$\lambda_\sigma(x) = \frac{\mathrm{LTV}_\sigma(f)(x) - \mathrm{LTV}_\sigma(L_\sigma * f)(x)}{\mathrm{LTV}_\sigma(f)(x)}.$$
This formula denotes the decrease rate of $f$’s $\mathrm{LTV}$ when $f$ is filtered using the low-pass filter $L_\sigma$. If $\lambda_\sigma(x)$ is close to 0, then the $\mathrm{LTV}$ is only slightly reduced and pixel $x$ belongs to the cartoon region. If $\lambda_\sigma(x)$ is close to 1, then the reduction is large and $x$ belongs to the textured region. The proposed fast nonlinear low-pass and high-pass filter pair is defined as follows:
$$u(x) = w(\lambda_\sigma(x)) \, (L_\sigma * f)(x) + \big(1 - w(\lambda_\sigma(x))\big) f(x), \qquad v(x) = f(x) - u(x),$$
where $w(\cdot)$ is the soft threshold function. The function is defined as follows:
$$w(\lambda) = \begin{cases} 0, & \lambda \le a_1, \\ (\lambda - a_1)/(a_2 - a_1), & a_1 < \lambda < a_2, \\ 1, & \lambda \ge a_2, \end{cases}$$
and its parameters $a_1$ and $a_2$ are fixed in the experiments to the values recommended in [32].
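The filter pair above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a Gaussian low-pass filter from scipy, approximates the LTV as the smoothed gradient magnitude, and uses illustrative soft-threshold parameters (`a1 = 0.25`, `a2 = 0.5`); the function names are ours.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ltv(img, sigma):
    """Local total variation: Gaussian-smoothed gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    return gaussian_filter(np.hypot(gx, gy), sigma)

def soft_threshold(rho, a1=0.25, a2=0.5):
    """Piecewise-linear weight w(lambda); a1, a2 are illustrative values."""
    return np.clip((rho - a1) / (a2 - a1), 0.0, 1.0)

def cartoon_texture(img, sigma=3.0):
    """Fast cartoon + texture split in the spirit of [31, 32] (a sketch)."""
    img = img.astype(float)
    low = gaussian_filter(img, sigma)
    ltv_f = ltv(img, sigma)
    ltv_low = ltv(low, sigma)
    # relative reduction rate: ~0 in cartoon regions, ~1 in textured regions
    rho = np.where(ltv_f > 1e-12,
                   (ltv_f - ltv_low) / np.maximum(ltv_f, 1e-12), 0.0)
    w = soft_threshold(rho)
    # textured points take the low-pass value; cartoon points keep the original
    cartoon = w * low + (1.0 - w) * img
    texture = img - cartoon
    return cartoon, texture
```

By construction the two components sum exactly back to the input micrograph, mirroring $f = u + v$.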
The attenuation of high-frequency content is caused by the blurriness of an image. Therefore, the more blurred an image is, the more similar its cartoon component is to the original image. Figure 2 compares the cartoon and textured decompositions of micrographs with different degrees of blurriness but the same content; the parameter $\sigma$ of the nonlinear filter is 3. Compared with that of the sharp micrograph, the cartoon component of the blurred micrograph looks more similar to its original micrograph because more of the pixels’ $\lambda_\sigma$ values in the blurred micrograph are close to 0. Thus, more regions of the micrograph belong to the cartoon component, and the textured part of the blurred micrograph is less evident than that of the sharp micrograph.
(a) Sharp micrograph
(b) Blurred micrograph
(c) Cartoon part of (a)
(d) Cartoon part of (b)
(e) Textured part of (a)
(f) Textured part of (b)
Figure 3 presents the decomposition results for different values of $\sigma$. With the larger $\sigma$, more regions of the original micrograph belong to the textured component. Although more noise and texture are separated, the edges of the cartoon component exhibit a stronger zigzag effect. If $\sigma$ is small, then the cartoon component still contains texture. If $\sigma$ is large, then some edges are regarded as texture, and a zigzag effect also occurs on the cartoon component.
(a) Original micrograph
(b) Cartoon part of (a) with the smaller $\sigma$
(c) Textured part of (a) with the smaller $\sigma$
(d) Cartoon part of (a) with the larger $\sigma$
(e) Textured part of (a) with the larger $\sigma$
In this study, $\sigma = 3$. The cartoon component is a simplified description of the original micrograph, and it contains strong edges that are perceived more easily by human visual perception when blurriness occurs. As previously mentioned, human visual perception is sensitive to distortion at edges. The cartoon component retains the original edges, simplifies the original micrograph, and does not lose distortion information. Although the textured component is also affected by blurriness, human visual perception is less sensitive to distortion in this part because of perceptual redundancy. Furthermore, noise and repetitive texture reduce the performance of assessment methods, as proven in the following sections. The experiments also prove that assessing the quality of the cartoon component works better than assessing that of the original micrograph.
3. NR IQA Method for Blurriness
Distortion at the edges should be given attention in accordance with the characteristics of human visual perception; thus, this method is primarily edge-based. Apart from strong edges, the cartoon component also contains other frequency information that is utilised to measure the attenuation of high-frequency information caused by blurriness. Attenuation is measured using a transform-based approach. The final metric is a weighted summation of the edge-based and transform-based metrics.
A famous property called the $1/f$ law exists in the spectral domain. This property describes the amplitude spectrum of a natural image as an approximately straight line on a log-log scale [33, 34]. If blurriness appears in an image, then the absolute value of the line’s slope increases, particularly for the high-frequency content [35]. For the original images, the tails of the curves overlap (red mark in Figure 4), which decreases the accuracy of quality assessment. The curves of the cartoon components retain the trend caused by blurriness and avoid overlapping tails (blue mark in Figure 4). The blue solid line in Figure 4 is the log-log spectrum curve of the image in Figure 2(a), whereas the yellow line belongs to the image in Figure 2(b). The brown line is for the image in Figure 2(c), and the purple curve is for the image in Figure 2(d). The green dashed line is the fitted straight line of the brown curve at high frequencies, whereas the red dashed line is the fitted straight line of the purple curve at high frequencies.
The spectral map generated using the method mentioned in the spectral and spatial sharpness (S3) algorithm [22] intuitively proves that the cartoon component plays a more important role than its original micrograph in quality assessment. The spectral map is denoted as $S_1$ and is calculated using the previously mentioned slope, whose absolute value is denoted as $\alpha_x$. To obtain $\alpha_x$, the algorithm initially calculates the 2-D discrete Fourier transform $Y(u, v)$ of a micrograph block $x$. In polar coordinates, the radial frequency $f$ and orientation $\theta$ are computed using the following:
$$f = \sqrt{u^2 + v^2}, \qquad \theta = \arctan(v/u),$$
where $u$ and $v$ are the horizontal and vertical frequency coordinates. $z(f)$ is the summed magnitude spectrum, as given by
$$z(f) = \sum_{\theta} |Y(f, \theta)|.$$
The algorithm finds the line that best fits $\log z(f)$ as a function of $\log f$ in the least-squares sense; $\alpha_x$ is the absolute value of the slope of this line, and the fit is taken over all radial frequencies $f$. Finally, $S_1$ is defined by
$$S_1(x) = 1 - \frac{1}{1 + e^{\tau_1 (\alpha_x - \tau_2)}},$$
where $\tau_1$ and $\tau_2$ are fixed parameters of the sigmoid [22]. For additional details, refer to [22]. Figure 5 shows that the spectral map of the cartoon component contains no noise and efficiently distinguishes high- from low-frequency content. By contrast, the spectral map of the original micrograph misconstrues some low-frequency content as high-frequency content (marked by the blue ellipses). The blurriness of the high-frequency content is still captured (marked by the red ellipses), thereby indicating that the blurriness in the cartoon component conforms well to the HVS.
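As a rough illustration of the slope computation, the sketch below estimates the spectral slope by radially binning the magnitude spectrum of a block and fitting a line on log-log axes. It is our simplification, not the S3 implementation: it omits the windowing and block processing of [22], and the `tau1`/`tau2` values in `s1` are placeholders rather than the parameters used in the paper.

```python
import numpy as np

def spectral_slope(block):
    """Estimate alpha from z(f) on log-log axes: z(f) ~ f^(-alpha)."""
    n = block.shape[0]
    spec = np.abs(np.fft.fftshift(np.fft.fft2(block - block.mean())))
    yy, xx = np.indices(spec.shape)
    r = np.hypot(yy - n // 2, xx - n // 2).astype(int)
    # sum magnitudes over each radial frequency bin (all orientations)
    z = np.bincount(r.ravel(), weights=spec.ravel())
    f = np.arange(1, n // 2)               # skip DC, stay below Nyquist
    valid = z[f] > 0
    logf, logz = np.log(f[valid]), np.log(z[f][valid])
    slope, _ = np.polyfit(logf, logz, 1)   # least-squares line fit
    return -slope                          # alpha = absolute slope magnitude

def s1(alpha, tau1=-3.0, tau2=2.0):
    """Sigmoid mapping of alpha to a sharpness value (tau values illustrative)."""
    return 1.0 - 1.0 / (1.0 + np.exp(tau1 * (alpha - tau2)))
```

Blurring a block steepens the decay of $z(f)$, so the estimated slope magnitude increases, which the sigmoid then maps to a lower sharpness value.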
(a) Spectral map of the sharp micrograph
(b) Spectral map of the sharp micrograph’s cartoon part
(c) Spectral map of the blurred micrograph
(d) Spectral map of the blurred micrograph’s cartoon part
The cartoon component extracts blurriness features well in the spectral and spatial domains. The spatial maps in the spectral and spatial sharpness (S3) algorithm [22] and the MLV method [25] are generated from the local variation of an image. In S3, the spatial map $S_2$ is generated on the basis of the total variation (TV). The TV at pixel $x_{i,j}$ of micrograph $x$ is defined as
$$\mathrm{TV}(x_{i,j}) = \frac{1}{255} \sum_{m,n} |x_{i,j} - x_{m,n}|,$$
where $x_{m,n}$ are the eight-neighbour pixels of $x_{i,j}$. $S_2$ is computed using
$$S_2(x) = \frac{1}{4} \max_{\xi \in x} \mathrm{TV}(\xi),$$
where $\xi$ is a block of $x$. The final map is defined as follows:
$$S_3(x) = S_1(x)^{\gamma} \times S_2(x)^{1-\gamma},$$
where $\gamma$ controls the relative contribution of the two maps; $\gamma$ was set as 0.5 in [22]. To reduce the effect of noise, the average sharpness is computed over the highest 1% of the values of $S_3$. Additional details are provided in [22]. The MLV method also generates a map. The MLV of pixel $x_{i,j}$ is defined as follows:
$$\psi(x_{i,j}) = \max_{m,n} |x_{i,j} - x_{m,n}|,$$
where $x_{m,n}$ are the eight neighbours of $x_{i,j}$. Given a micrograph with size $M \times N$, the MLV is calculated for each pixel at location $(i, j)$ using this formula. The final MLV map is generated via
$$\Psi(x) = \big[\psi(x_{i,j})\big]_{M \times N}.$$
Additional details are available in [25]. Given the separation of texture, the spatial map of the cartoon components focuses on edges. Figures 6 and 7 provide the intuitive details. In Figure 6, the region marked by the blue ellipse in the blurred micrograph shows that the textures and edges are mixed, making it difficult to measure distortions at the edges. In its cartoon component, we observe that more pixels are considered edges. The red-marked region indicates that the cartoon component still contains the distortions caused by blurriness. The same condition appears in Figure 7. The yellow-marked region confirms that the MLV can extract blur distortions better than the sum of the local variations because of the clear edges of the MLV map.
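The MLV map itself is straightforward to compute. The following sketch (our own, using numpy shifts with edge padding) returns, for each pixel, the largest absolute difference to its eight neighbours:

```python
import numpy as np

def mlv_map(img):
    """Maximum local variation: largest absolute difference between each
    pixel and any of its eight neighbours (edge-replicated at borders)."""
    x = img.astype(float)
    h, w = x.shape
    p = np.pad(x, 1, mode='edge')
    shifts = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0)]
    # one shifted copy per neighbour direction, then take the pointwise maximum
    diffs = [np.abs(x - p[1 + i:1 + i + h, 1 + j:1 + j + w]) for i, j in shifts]
    return np.max(diffs, axis=0)
```

On a flat region the map is zero; at a step edge it equals the step height, which is why sharp edges stand out in the MLV map.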
(a) Map of the sharp micrograph
(b) Map of the sharp micrograph’s cartoon part
(c) Map of the blurred micrograph
(d) Map of the blurred micrograph’s cartoon part
(a) Map of the sharp micrograph
(b) Map of the sharp micrograph’s cartoon part
(c) Map of the blurred micrograph
(d) Map of the blurred micrograph’s cartoon part
Blurriness is well known to cause the spread of edges. Figure 8 shows the edge detection of the cartoon component’s MLV spatial map. From the details of the micrographs in Figures 7(a) and 7(c), we observe that the edge detection of the blurry MLV map yields more edge pixels than that of the sharp one, and its edge widths are wider than those of the sharp one.
(a) Edge detection of the sharp cartoon part’s MLV map
(b) Zoom-in of the red-marked region in (a)
(c) Edge detection of the blurred cartoon part’s MLV map
(d) Zoom-in of the red-marked region in (c)
In [25], the textured components and edges exhibit high MLV, thereby indicating that high variations in pixel intensities are better indicators of sharpness than low variations. The cartoon component has edges and blank content, but blurriness does not change the blank content. Thus, we do not utilise the statistics of the MLV distribution as [25] did. In this research, we detect the edges of the MLV map and calculate the sparsity of the edge pixels as a blurriness metric. We define the sparsity of edge pixels as the average distance between pixels at edges. For each pixel that corresponds to an edge location, the start and end positions of the edge are defined as the locations of the local luminance extrema closest to the edge, and the edge width is defined as the length between the start and end positions [18]. The final blur metric, sparsity, is generated from these edge widths and the detected edge pixels.
Blurriness causes the spread of edges and also produces more edge pixels during edge detection. The experiments indicate that the sparsity of edge pixels is lower when the micrograph is more blurred.
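One way to measure the edge width described above (Marziliano-style [18]) is to walk outwards from an edge pixel to the nearest luminance extrema along a one-dimensional profile. The sketch below is an illustrative reading, not the paper's code, and it does not reproduce how the per-pixel widths and edge counts are combined into the published sparsity value:

```python
import numpy as np

def edge_width(profile, col):
    """Distance between the local luminance extrema bracketing an edge pixel
    at index `col` of a 1-D profile (Marziliano-style edge width [18])."""
    p = np.asarray(profile, dtype=float)
    rising = p[min(col + 1, len(p) - 1)] >= p[max(col - 1, 0)]
    lo, hi = col, col
    if rising:
        while lo > 0 and p[lo - 1] < p[lo]:   # walk left to the local minimum
            lo -= 1
        while hi < len(p) - 1 and p[hi + 1] > p[hi]:  # right to the local maximum
            hi += 1
    else:
        while lo > 0 and p[lo - 1] > p[lo]:   # falling edge: left to the maximum
            lo -= 1
        while hi < len(p) - 1 and p[hi + 1] < p[hi]:  # right to the minimum
            hi += 1
    return hi - lo
```

A sharp step yields a width of one pixel, while a blurred ramp yields a width spanning the whole transition, which matches the observation that blur widens edges.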
On the basis of the preceding analysis, this study proposes a new assessment method, whose flowchart is presented in Figure 9. An original micrograph is initially decomposed into cartoon and textured components. Then, the spectral and spatial features are combined with the sparsity of edge pixels, and the final score is obtained via weighted summation. The method extracts spectral and spatial features using the S3 algorithm, and the resulting metric is denoted as $S_{ss}$. We separately calculate the sparsity of edge pixels in the vertical and horizontal directions, denoted as $\mathrm{Sp}_v$ and $\mathrm{Sp}_h$, respectively. The final sparsity $\mathrm{Sp}$ combines $\mathrm{Sp}_v$ and $\mathrm{Sp}_h$, and the final score is obtained using
$$\mathrm{Score} = w \cdot S_{ss} + (1 - w) \cdot \mathrm{Sp},$$
where $w$ is a weighting coefficient and both metrics are normalized before summation.
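A minimal sketch of this weighted combination follows. It assumes, hypothetically, that the two directional sparsities are simply averaged and that both metrics have already been normalized; the exact combination rule in the paper is not reproduced here.

```python
def final_score(s_ss, sp_v, sp_h, w=0.3):
    """Weighted summation of the normalized spectral-spatial metric and the
    edge-pixel sparsity. Averaging sp_v and sp_h is an assumption of this
    sketch, not a rule taken from the paper."""
    sparsity = 0.5 * (sp_v + sp_h)
    return w * s_ss + (1.0 - w) * sparsity
```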
4. Analysis and Discussion of Experiment Results
4.1. SEM Micrographs and Their Quality Assessment Results
The SEM micrographs used in this research were taken at the Modern Analysis and Computing Centre of the China University of Mining and Technology. We selected 50 samples. For every sample, we obtained three micrographs with different degrees of blurriness by artificially adjusting the SEM focus parameter. After the 150 micrographs were obtained, 30 SEM users without image-processing expertise participated in the subjective experiment, so every micrograph received 30 scores. To reduce experimental error, we screened the 30 scores against a confidence interval and eliminated the 5 scores that fell outside it. The final mean opinion score (MOS) was the average of the remaining 25 scores. Apart from the two samples presented in Figures 1 and 2, three other samples are illustrated in Figure 10.
(a) Sample a
(b) Sample b
(c) Sample c
The blurriness extent increases from the first column to the last column; in Table 1, a higher blurriness level corresponds to a more blurred micrograph. The subjective and objective assessment scores are also provided in Table 1: MOS is the mean opinion score, $S_{ss}$ is the objective score obtained from the combination of spectral and spatial features, $\mathrm{Sp}$ is the sparsity of edge pixels, and Score is the final objective score. For all of these quantities, lower values indicate a more blurred micrograph. This observation matches the analysis in Section 3.

4.2. Performance Analysis of the Proposed Objective Method
In this study, three performance indexes are adopted to evaluate the proposed objective method.
(1) Pearson linear correlation coefficient (PLCC):
$$\mathrm{PLCC} = \frac{\sum_{i=1}^{N}(s_i - \bar{s})(o_i - \bar{o})}{\sqrt{\sum_{i=1}^{N}(s_i - \bar{s})^2} \sqrt{\sum_{i=1}^{N}(o_i - \bar{o})^2}},$$
where $s_i$ are the subjective scores, $o_i$ are the objective scores, and $\bar{s}$ and $\bar{o}$ are their respective means. PLCC measures how well the objective scores correlate with the subjective scores; a higher PLCC indicates better correlation.
(2) Root-mean-square error (RMSE):
$$\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (s_i - o_i)^2}.$$
RMSE measures the absolute error between the subjective and objective scores; a good algorithm is expected to have a low RMSE.
(3) Spearman’s rank-order correlation coefficient (SROCC):
$$\mathrm{SROCC} = 1 - \frac{6 \sum_{i=1}^{N} d_i^2}{N(N^2 - 1)},$$
where $d_i = r_i - c_i$, and $r_i$ and $c_i$ are the rank positions of $s_i$ and $o_i$ in their respective score arrays. SROCC measures the relative monotonicity between the subjective and objective scores; a high SROCC indicates a good algorithm.
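The three indexes can be computed directly from paired score arrays; a minimal numpy sketch follows (ties in the rank computation are not handled, unlike a full Spearman implementation):

```python
import numpy as np

def plcc(s, o):
    """Pearson linear correlation coefficient."""
    s, o = np.asarray(s, float), np.asarray(o, float)
    sc, oc = s - s.mean(), o - o.mean()
    return (sc * oc).sum() / np.sqrt((sc ** 2).sum() * (oc ** 2).sum())

def rmse(s, o):
    """Root-mean-square error between subjective and objective scores."""
    s, o = np.asarray(s, float), np.asarray(o, float)
    return np.sqrt(np.mean((s - o) ** 2))

def srocc(s, o):
    """Spearman rank-order correlation via the rank-difference formula
    (assumes no tied scores)."""
    rank = lambda a: np.argsort(np.argsort(a))
    d = rank(np.asarray(s)) - rank(np.asarray(o))
    n = len(d)
    return 1.0 - 6.0 * (d.astype(float) ** 2).sum() / (n * (n * n - 1))
```

A perfectly monotone objective metric gives SROCC of 1 even when its scale differs from the MOS scale, which is why PLCC and RMSE are usually reported alongside it.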
Figure 11 presents eight pairwise comparisons between the original micrographs and their cartoon components. We obtained the three performance indexes and the fitted curves of the subjective and objective scores using eight different methods [14, 18, 22, 25, 36–38]. The fitted curves of the cartoon components are evidently better than those of the original micrographs, and the performance indexes validate this finding. Therefore, when blurriness occurs, the distortion in the cartoon components of the micrographs conforms better to the distortion perceived by the HVS.
(a) Marziliano, original
(b) Marziliano, cartoon
(c) JNB, original
(d) JNB, cartoon
(e) CPBD, original
(f) CPBD, cartoon
(g) LPC, original
(h) LPC, cartoon
(i) FISH, original
(j) FISH, cartoon
(k) FISHbb, original
(l) FISHbb, cartoon
(m) S3, original
(n) S3, cartoon
(o) MLV, original
(p) MLV, cartoon
(q) Proposed method
The last graph in Figure 11 shows the fitted curve of the subjective and objective scores obtained using the proposed method, with the three performance indexes appended at the top left corner. Figure 12 presents the analysis of the weighting coefficient $w$. The values of PLCC, RMSE, and SROCC are the best when $w = 0.3$; therefore, the weighting coefficient in the proposed method is set to 0.3. This also indicates that the HVS is more sensitive to blur distortion at the edges. Table 2 provides a summary of the performance indexes generated using the different methods.

The top two indexes are marked in boldface. We draw two conclusions from Table 2. (1) Cartoon components reflect blurriness characteristics better than original micrographs. (2) The indexes of the proposed method are the best among all nine methods compared. Thus, the proposed method conforms most closely to HVS perception characteristics.
5. Conclusion
This study proposes a new method for evaluating the blurriness of SEM micrographs. The HVS is more sensitive to the distortion of cartoon components than to that of redundant textured components according to the Gestalt perception psychology and the entropy masking property. The method initially decomposes original micrographs into cartoon and textured components. Then, blurriness features are extracted from the cartoon components. When assessing the quality of the cartoon components, the method combines the micrographs’ spectral-spatial features with the sparsity of edge pixels of the MLV spatial map. Finally, the final quality scores are obtained via the weighted summation of the two metrics. The experiments demonstrate that the proposed method conforms better to human visual perception than other state-of-the-art methods when assessing the quality of SEM micrographs.
Data Availability
The data, including the database of blurred micrographs, the MOS (mean opinion score) values, and the objective scores used to support the findings of this study, are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
This work was supported by the National Natural Science Foundation of China (Grant Nos. 51604271 and 61771474), the Natural Science Foundation of Jiangsu Province (Grant No. BK20170273), and the Fundamental Research Funds for the Central Universities (Grant No. 2015XKMS100).
References
[1] S. Koho, E. Fazeli, J. E. Eriksson, and P. E. Hänninen, “Image quality ranking method for microscopy,” Scientific Reports, vol. 6, no. 1, article 28962, 2016.
[2] M. D. Zotta, Y. Han, M. D. Bergkoetter, and E. Lifshin, “An evaluation of image quality metrics for scanning electron microscopy,” Microscopy and Microanalysis, vol. 22, no. S3, pp. 572–573, 2016.
[3] M. Zeder, E. Kohler, and J. Pernthaler, “Automated quality assessment of autonomously acquired microscopic images of fluorescently stained bacteria,” Cytometry Part A, vol. 77, no. 1, pp. 76–85, 2010.
[4] L. Firestone, K. Cook, K. Culp, N. Talsania, and K. Preston, “Comparison of autofocus methods for automated microscopy,” Cytometry, vol. 12, no. 3, pp. 195–206, 1991.
[5] J. F. Brenner, B. S. Dew, J. B. Horton, T. King, P. W. Neurath, and W. D. Selles, “An automated microscope for cytologic research: a preliminary evaluation,” Journal of Histochemistry & Cytochemistry, vol. 24, no. 1, pp. 100–111, 1976.
[6] S. L. Ellenberger, Influence of Defocus on Measurements in Microscope Images, M.S. thesis, Delft University of Technology, Delft, 2000.
[7] M. T. Postek and A. E. Vladár, “Image sharpness measurement in scanning electron microscopy—part I,” Scanning, vol. 20, no. 1, 9 pages, 1998.
[8] A. E. Vladár, M. T. Postek, and M. P. Davidson, “Image sharpness measurement in scanning electron microscopy—part II,” Scanning, vol. 20, no. 1, 34 pages, 1998.
[9] Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, “Image quality assessment: from error visibility to structural similarity,” IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600–612, 2004.
[10] H. R. Sheikh, A. C. Bovik, and G. de Veciana, “An information fidelity criterion for image quality assessment using natural scene statistics,” IEEE Transactions on Image Processing, vol. 14, no. 12, pp. 2117–2128, 2005.
[11] H. R. Sheikh and A. C. Bovik, “Image information and visual quality,” IEEE Transactions on Image Processing, vol. 15, no. 2, pp. 430–444, 2006.
[12] Z. Wang and A. C. Bovik, “Reduced- and no-reference image quality assessment,” IEEE Signal Processing Magazine, vol. 28, no. 6, pp. 29–40, 2011.
[13] R. Soundararajan and A. C. Bovik, “RRED indices: reduced reference entropic differencing for image quality assessment,” IEEE Transactions on Image Processing, vol. 21, no. 2, pp. 517–526, 2012.
[14] R. Ferzli and L. J. Karam, “A no-reference objective image sharpness metric based on the notion of just noticeable blur (JNB),” IEEE Transactions on Image Processing, vol. 18, no. 4, pp. 717–728, 2009.
[15] P. Ye, J. Kumar, L. Kang, and D. Doermann, “Unsupervised feature learning framework for no-reference image quality assessment,” in 2012 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1098–1105, Providence, RI, USA, June 2012.
[16] M.-J. Chen and A. C. Bovik, “No-reference image blur assessment using multiscale gradient,” EURASIP Journal on Image and Video Processing, vol. 3, 11 pages, 2011.
[17] Z.-M. Wang, “Review of no-reference image quality assessment,” Acta Automatica Sinica, vol. 41, no. 6, pp. 1062–1079, 2015.
[18] P. Marziliano, F. Dufaux, S. Winkler, and T. Ebrahimi, “Perceptual blur and ringing metrics: application to JPEG2000,” Signal Processing: Image Communication, vol. 19, no. 2, pp. 163–172, 2004.
[19] M. A. Saad, A. C. Bovik, and C. Charrier, “Blind image quality assessment: a natural scene statistics approach in the DCT domain,” IEEE Transactions on Image Processing, vol. 21, no. 8, pp. 3339–3352, 2012.
[20] R. Ferzli and L. J. Karam, “No-reference objective wavelet based noise immune image sharpness metric,” in IEEE International Conference on Image Processing 2005, p. I-405, Genova, Italy, September 2005.
[21] L. Li, D. Wu, J. Wu, H. Li, W. Lin, and A. C. Kot, “Image sharpness assessment by sparse representation,” IEEE Transactions on Multimedia, vol. 18, no. 6, pp. 1085–1097, 2016.
[22] C. T. Vu, T. D. Phan, and D. M. Chandler, “S3: a spectral and spatial measure of local perceived sharpness in natural images,” IEEE Transactions on Image Processing, vol. 21, no. 3, pp. 934–945, 2012.
[23] L. Li, W. Xia, W. Lin, Y. Fang, and S. Wang, “No-reference and robust image sharpness evaluation based on multiscale spatial and spectral features,” IEEE Transactions on Multimedia, vol. 19, no. 5, pp. 1030–1040, 2017.
[24] E. Tsomko and H. J. Kim, “Efficient method of detecting globally blurry or sharp images,” in 2008 Ninth International Workshop on Image Analysis for Multimedia Interactive Services, pp. 171–174, Klagenfurt, Austria, May 2008.
[25] K. Bahrami and A. C. Kot, “A fast approach for no-reference image sharpness assessment based on maximum local variation,” IEEE Signal Processing Letters, vol. 21, no. 6, pp. 751–755, 2014.
[26] L. Li, W. Lin, X. Wang, G. Yang, K. Bahrami, and A. C. Kot, “No-reference image blur assessment based on discrete orthogonal moments,” IEEE Transactions on Cybernetics, vol. 46, no. 1, pp. 39–50, 2016.
[27] A. B. Watson, R. Borthwick, and M. Taylor, “Image quality and entropy masking,” in Proceedings Volume 3016, Human Vision and Electronic Imaging II, pp. 2–12, San Jose, CA, USA, June 1997.
[28] F. Attneave, “Some informational aspects of visual perception,” Psychological Review, vol. 61, no. 3, pp. 183–193, 1954.
[29] K. Koffka, Principles of Gestalt Psychology, Routledge, 2013.
[30] Y. Meyer, Oscillating Patterns in Image Processing and Nonlinear Evolution Equations: The Fifteenth Dean Jacqueline B. Lewis Memorial Lectures, American Mathematical Society, 2001.
[31] A. Buades, T. M. Le, J. M. Morel, and L. A. Vese, “Fast cartoon + texture image filters,” IEEE Transactions on Image Processing, vol. 19, no. 8, pp. 1978–1986, 2010.
[32] A. Buades and J. L. Lisani, “Directional filters for cartoon + texture image decomposition,” Image Processing On Line, vol. 5, pp. 75–88, 2016.
[33] D. L. Ruderman, “The statistics of natural images,” Network: Computation in Neural Systems, vol. 5, no. 4, pp. 517–548, 1994.
[34] A. Srivastava, A. B. Lee, E. P. Simoncelli, and S. C. Zhu, “On advances in statistical modeling of natural images,” Journal of Mathematical Imaging and Vision, vol. 18, no. 1, pp. 17–33, 2003.
[35] D. J. Field and N. Brady, “Visual sensitivity, blur and the sources of variability in the amplitude spectra of natural scenes,” Vision Research, vol. 37, no. 23, pp. 3367–3383, 1997.
[36] N. D. Narvekar and L. J. Karam, “A no-reference image blur metric based on the cumulative probability of blur detection (CPBD),” IEEE Transactions on Image Processing, vol. 20, no. 9, pp. 2678–2683, 2011.
[37] R. Hassen, Z. Wang, and M. M. A. Salama, “Image sharpness assessment based on local phase coherence,” IEEE Transactions on Image Processing, vol. 22, no. 7, pp. 2798–2810, 2013.
[38] P. V. Vu and D. M. Chandler, “A fast wavelet-based algorithm for global and local image sharpness estimation,” IEEE Signal Processing Letters, vol. 19, no. 7, pp. 423–426, 2012.
Copyright
Copyright © 2019 Hui Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.