Journal of Sensors
Volume 2016, Article ID 9794723, 8 pages
http://dx.doi.org/10.1155/2016/9794723
Research Article

Using the Dual-Tree Complex Wavelet Transform for Improved Fabric Defect Detection

Central University of Technology, Free State, 20 President Brand Street, Bloemfontein 9301, South Africa

Received 25 July 2016; Accepted 10 October 2016

Academic Editor: Calogero M. Oddo

Copyright © 2016 Hermanus Vermaak et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The dual-tree complex wavelet transform (DTCWT) solves the problems of shift variance and low directional selectivity in two and higher dimensions found with the commonly used discrete wavelet transform (DWT). It has been proposed for applications such as texture classification and content-based image retrieval. In this paper, the performance of the dual-tree complex wavelet transform for fabric defect detection is evaluated. As experimental samples, the fabric images from TILDA, a textile texture database from the Workgroup on Texture Analysis of the German Research Council (DFG), are used. The mean energies of the real and imaginary parts of the complex wavelet coefficients, taken separately, are identified as effective features for the purpose of fabric defect detection. It is then shown that the dual-tree complex wavelet transform outperforms the undecimated wavelet transform (UDWT), with a detection rate between 4.5 and 15.8 percentage points higher, depending on the fabric type.

1. Introduction

The discrete wavelet transform has been commonly used for feature extraction in fabric defect detection research. For example, Lambert and Bock [1] performed experiments using several wavelet bases and compared the defect detection performance in fabric images. Other related research results have been published [2–5].

However, the discrete wavelet transform (DWT) in its critically sampled form suffers from the problem of shift variance, which makes it unsuitable for pattern recognition applications such as fabric defect detection [6]. The solution that has usually been used is the undecimated discrete wavelet transform (UDWT) [7–9].

Even though the UDWT solves the problem of shift variance, it has a high redundancy rate of (3J + 1):1 for image representation, where J is the number of wavelet decomposition levels. That leads to increased computational requirements [10]. In addition, it does not solve the shortcoming of poor directional selectivity for diagonal features in 2D. This weakness of the UDWT lowers the discriminating power of its texture features.

The dual-tree complex wavelet transform (DTCWT), first introduced by Kingsbury in 1998 [10, 11], is approximately shift invariant and allows directional wavelets in two and higher dimensions with only 2x redundancy in 1D (2^d for d-dimensional signals in general) [12]. It has been used in several research papers, especially for texture characterisation. For example, Costin and Ignat [13] discussed the effectiveness of the cosine similarity measure, the Pearson coefficient, and the Frobenius norm applied to the magnitudes of DTCWT coefficients of texture images in order to decide on their similarities. Other applications include content-based image retrieval [14, 15], image segmentation [16], and texture classification [17, 18].

Specific to textiles, Wang et al. used the 2D dual-tree complex wavelet transform to separate the fabric texture and the pilling information from the image of the pilled nonwoven fabric. They then used that pilling information and a supervised neural network classifier for objective pilling evaluation [19].

However, there is limited literature on the application of the dual-tree complex wavelet transform to fabric defect detection. In this paper, the dual-tree complex wavelet transform is used to extract texture features, and a Euclidean distance classifier is then applied in order to detect defects in textile fabrics. The objective is twofold: (i) to identify the best features for the DTCWT and (ii) to show that the features extracted using the DTCWT are more powerful in discriminating fabric defects from sound fabric than those extracted using the UDWT.

The remaining part of this paper is organised as follows. Section 2 describes the wavelet transform methods with emphasis on the dual-tree complex wavelet transform. Section 3 briefly describes the Euclidean distance classifier while Section 4 describes the dataset and the experiments. Section 5 presents the results as well as their interpretation and, finally, Section 6 concludes the paper.

2. The Wavelet Transform Methods

This section briefly introduces the discrete wavelet transform, the undecimated discrete wavelet transform, and the dual-tree complex wavelet transform.

2.1. The Discrete Wavelet Transform

The one-dimensional discrete wavelet transform allows one to decompose a digital signal into a low-frequency component called "approximation" and a high-frequency component called "details." The low-frequency component can further be decomposed into approximation and details, and so forth. This process is illustrated in Figure 1 for a three-level decomposition. The low-pass and high-pass decomposition filters form a system of specially designed filters called "quadrature mirror filters."

Figure 1: Three-level DWT decomposition with low-pass and high-pass filters. The original signal is the input at level 1; cA1, cA2, and cA3 denote the approximation coefficients at levels 1, 2, and 3, respectively, while cD1, cD2, and cD3 denote the detail coefficients at levels 1, 2, and 3, respectively.

For two-dimensional signals (images) the decomposition algorithm is applied in two phases. In phase 1, the filtering (high-pass and low-pass) followed by downsampling is done along the columns while in phase 2 the filtering (high-pass and low-pass) followed by downsampling is applied to the results of phase 1 along the rows [20].
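The filter-and-downsample recursion described above can be sketched in a few lines. The sketch below assumes the Haar quadrature mirror pair for illustration (the paper does not fix a basis at this point):

```python
import numpy as np

def dwt_level(signal, lo, hi):
    """One level of the critically sampled DWT: filter, then downsample by 2."""
    approx = np.convolve(signal, lo)[1::2]   # low-pass branch -> approximation
    detail = np.convolve(signal, hi)[1::2]   # high-pass branch -> details
    return approx, detail

# Haar quadrature mirror pair (analysis filters) -- illustrative choice
lo = np.array([1.0, 1.0]) / np.sqrt(2.0)
hi = np.array([1.0, -1.0]) / np.sqrt(2.0)

s = np.arange(8, dtype=float)               # toy input signal
cA1, cD1 = dwt_level(s, lo, hi)             # level 1
cA2, cD2 = dwt_level(cA1, lo, hi)           # level 2 (decompose the approximation)
cA3, cD3 = dwt_level(cA2, lo, hi)           # level 3
```

Because the transform is critically sampled, each level halves the approximation length; the retained bands {cA3, cD3, cD2, cD1} hold exactly as many samples as the input.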

2.2. The Undecimated Discrete Wavelet Transform

The DWT as described in the previous subsection is common but is inappropriate in some applications because it is shift variant. In such cases, the UDWT can be used. One-level UDWT is similar to one-level DWT except that the results of filtering the original signal are not downsampled. Therefore, one ends up with approximation coefficients with as many samples as the original signal and detail coefficients with as many samples as the original, so the overall result contains twice as much data as the original signal. For multilevel decomposition, the filter coefficients are upsampled from one level to the next. Figure 2 illustrates a two-level UDWT decomposition [20].

Figure 2: Two-level UDWT decomposition. The level-2 filters are upsampled versions of the level-1 low-pass and high-pass filters, respectively. cA1, cA2, cD1, and cD2 are wavelet coefficients as described in Figure 1.
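The "a trous" scheme above (filter without downsampling, upsampling the filters between levels) can be sketched as follows; the Haar pair and the periodic boundary handling are assumptions for illustration:

```python
import numpy as np

def upsample_filter(f):
    """Insert a zero between filter taps (the 'a trous' dilation)."""
    up = np.zeros(2 * len(f) - 1)
    up[::2] = f
    return up

def udwt_level(signal, lo, hi):
    """One UDWT level: filter without downsampling, periodic boundary."""
    n = len(signal)
    ext = np.concatenate([signal, signal[:len(lo)]])  # periodic extension
    approx = np.convolve(ext, lo)[len(lo) - 1:len(lo) - 1 + n]
    detail = np.convolve(ext, hi)[len(hi) - 1:len(hi) - 1 + n]
    return approx, detail

lo = np.array([1.0, 1.0]) / np.sqrt(2.0)   # Haar pair, illustrative
hi = np.array([1.0, -1.0]) / np.sqrt(2.0)

s = np.random.default_rng(0).standard_normal(16)
cA1, cD1 = udwt_level(s, lo, hi)                                       # level 1
cA2, cD2 = udwt_level(cA1, upsample_filter(lo), upsample_filter(hi))   # level 2

# Every band keeps the full signal length: no decimation, hence shift invariance.
assert cA1.shape == cD2.shape == s.shape
```

With the periodic boundary, a circular shift of the input produces exactly the same circular shift of every band, which is the shift-invariance property the paper relies on.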
2.3. The Dual-Tree Complex Wavelet Transform

Figure 3 shows the implementation of the 1D dual-tree complex wavelet transform using Finite Impulse Response (FIR) real-coefficient filters. The low-pass and high-pass filters of the two trees are designed so that the corresponding wavelets ψ_h(t) and ψ_g(t) form approximately a Hilbert pair. Similarly, the resulting scaling functions φ_h(t) and φ_g(t) should be such that φ_g(t) is approximately the Hilbert transform of φ_h(t). Therefore the complex wavelet and complex scaling function described by the following equation are approximately analytic:

ψ(t) = ψ_h(t) + jψ_g(t),  φ(t) = φ_h(t) + jφ_g(t). (1)

Consequently, referring to Figure 3, the output coefficients of the top tree (tree a) and those of the bottom tree (tree b) can be considered, respectively, as the real and imaginary parts of the complex wavelet coefficients.

Figure 3: Implementation of 1D dual-tree complex wavelet transform using FIR real-coefficient filters.

The conditions specified in the previous paragraph can be met if the filters satisfy the following requirements [12]:

(i) they meet the perfect reconstruction conditions;
(ii) one of the two low-pass filters should be approximately a half-sample shift of the other, g0(n) ≈ h0(n − 0.5);
(iii) the first-stage filters of one tree should be shifted by one sample with respect to those of the other tree.

One way to meet those requirements is to design orthogonal Q-shift filters by minimizing energy in the frequency domain, as proposed by Kingsbury [21]. The extension to two dimensions is achieved by a 2D complex separable wavelet described by (2) and a 2D complex separable scaling function described by (3), implemented by separable filtering along columns and then rows:

ψ(x, y) = ψ(x)ψ(y), (2)
φ(x, y) = φ(x)φ(y), (3)

where ψ and φ are as shown in (1).

Therefore, the 2D DTCWT is implemented separably by two trees used for the rows of the image and two trees for the columns. The resulting wavelet coefficients are then combined by simple sum and difference operations to give the real and imaginary parts of the complex wavelet coefficients. This gives six approximately shift-invariant wavelets oriented at ±15°, ±45°, and ±75°. The reader can find the details in [12].

3. Euclidean Distance Classifier

For a Euclidean distance classifier, each pattern class ω_j is characterised by a vector m_j, which is the mean vector of the feature vectors of the training patterns of that class, as described by the following equation:

m_j = (1/N_j) Σ_{x ∈ ω_j} x, (4)

where N_j is the number of training pattern vectors for class ω_j and the summation is taken over these vectors.

Determining the class membership of an unknown pattern with feature vector x consists in computing the distance measures D_j(x) given by (5) and assigning x to the class for which D_j(x) is the smallest:

D_j(x) = ‖x − m_j‖,  j = 1, 2, …, W, (5)

where W is the number of classes.

The above procedure of training the Euclidean distance classifier, whereby the characteristic feature vector of each pattern class is the mean of the corresponding feature vectors in the training set, is called "maximum likelihood" (ML) training [9].
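A minimal sketch of ML training and Euclidean-distance classification, following (4) and (5); the function names are illustrative:

```python
import numpy as np

def train_ml(features, labels):
    """ML training: the prototype of each class is the mean feature vector (4)."""
    classes = np.unique(labels)
    protos = np.array([features[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def classify(x, classes, protos):
    """Assign x to the class whose prototype is nearest in Euclidean distance (5)."""
    d = np.linalg.norm(protos - x, axis=1)
    return classes[np.argmin(d)]

# Toy two-class example (e.g. defect-free = 0, defective = 1)
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
classes, protos = train_ml(X, y)
assert classify(np.array([0.1, 0.0]), classes, protos) == 0
assert classify(np.array([1.0, 0.9]), classes, protos) == 1
```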

The “Minimum Classification Error” (MCE) training of the classifier provides a better way of obtaining the classifier characteristic feature vectors of different pattern classes [9]. The MCE training algorithm starts with the characteristic feature vectors obtained by the ML method and then adjusts them adaptively in order to achieve the highest classification rate of the feature vectors in the training set. The mentioned process is illustrated by Figure 4 for the experiments of this paper.

Figure 4: MCE training of a Euclidean distance classifier.

The fabric images of the training set are submitted to a feature extraction operation using a wavelet transform method. The features are extracted for each 32 × 32 window of the fabric image. The features of each window are compared to the reference feature vector (Λ) of the defective and defect-free classes and the window is classified as defective or defect-free. The detection results are then compared to the true training sample labels and the value of the detection loss function is evaluated. Using that loss value, the reference vectors (Λ) are adjusted in order to decrease the value of the loss function. New detection results are computed using the adjusted reference vectors and the new loss value of detection is evaluated. That process continues until the loss value of detection is minimized or is below a predefined threshold.
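The refinement loop of Figure 4 can be sketched as follows. The sigmoid-weighted update rule below is an assumed simplification of the MCE algorithm in [9], shown only to convey the idea of nudging the ML prototypes to reduce a smooth detection loss:

```python
import numpy as np

def mce_refine(X, y, protos, lr=0.05, epochs=50):
    """Sketch of MCE-style refinement for a two-class Euclidean classifier
    (assumed form, loosely after [9]): start from the ML prototypes and,
    for each sample, weight the update by a sigmoid of the
    misclassification measure d = D_true - D_rival."""
    m = protos.copy()
    for _ in range(epochs):
        for x, c in zip(X, y):
            rival = 1 - c                               # two classes: 0 and 1
            d = np.sum((x - m[c]) ** 2) - np.sum((x - m[rival]) ** 2)
            w = 1.0 / (1.0 + np.exp(-d))                # smooth 0/1 error count
            m[c] += lr * w * (x - m[c])                 # pull true prototype in
            m[rival] -= lr * w * (x - m[rival])         # push rival prototype away
    return m

X = np.array([[0.0, 0.0], [0.2, 0.0], [1.0, 0.0], [1.2, 0.0]])
y = np.array([0, 0, 1, 1])
ml = np.array([X[y == c].mean(axis=0) for c in (0, 1)])  # ML starting point
refined = mce_refine(X, y, ml)

# All training samples remain nearest their own class prototype.
preds = [int(np.argmin(((refined - x) ** 2).sum(axis=1))) for x in X]
assert preds == list(y)
```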

4. Experiments

4.1. Experimental Samples

The experiments were performed using the fabric images from the TILDA dataset [22]. That dataset contains images of four different classes of fabrics:

Class 1: very fine fabrics with or without visible internal structure;
Class 2: fabrics with a low-variance stochastic structure, whose surface contains no imprints;
Class 3: fabrics with a clearly visible periodic structure;
Class 4: printed materials with no apparent periodicity.

For each of the 4 fabric classes, the dataset contains two representatives named R1, R2, or R3. Figure 5 shows samples of the four fabric classes contained in the dataset.

Figure 5: Samples of the 4 fabric classes contained in the TILDA dataset.

Fabric images with four types of defects were considered: holes and cuts; oil stains and colour fading; thread errors; and, finally, a foreign body on the fabric. Figure 6 shows examples of those defect types for the fabric of class 2.

Figure 6: Examples of the 4 types of defects considered in this paper.

Table 1 shows the detailed numbers of images from the TILDA dataset considered in this paper. In total 1600 images of different fabric classes and containing different types of defects were used. Each image had a size of 512 × 768 pixels.

Table 1: The TILDA images used in this paper.

Each fabric image was divided into nonoverlapping windows of size 32 × 32 pixels. That gave a total number of 614,400 windows from all the 1,600 images. Among those 614,400 windows, 37,546 were defective while the rest were defect-free.
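The windowing arithmetic can be checked directly; the reshape below is one standard way to split an image into non-overlapping tiles:

```python
import numpy as np

rows, cols, win = 512, 768, 32
image = np.zeros((rows, cols))          # stand-in for one TILDA image

# Split the image into non-overlapping 32 x 32 windows:
# (16, 32, 24, 32) -> (16, 24, 32, 32) -> (384, 32, 32)
windows = (image
           .reshape(rows // win, win, cols // win, win)
           .swapaxes(1, 2)
           .reshape(-1, win, win))

assert windows.shape[0] == 384              # 16 x 24 windows per image
assert windows.shape[0] * 1600 == 614_400   # windows over all 1,600 images
```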

All the 37,546 defective windows, together with 37,546 randomly chosen defect-free windows, were selected to make up the experimental dataset. The defect-free samples were selected in such a way as to obtain from each image an equal number of defective and defect-free samples. The experimental dataset was then divided into two parts of the same size, the training set and the testing set, each containing the same number of defective and defect-free samples.

Therefore the overall experimental dataset was as follows:
(i) training set: 37,546 samples (18,773 defective and 18,773 defect-free);
(ii) testing set: 37,546 samples (18,773 defective and 18,773 defect-free).

4.2. Experimental Procedures

The fabric defect detection experiments were divided into two phases: (i) training the classifier using the training set and (ii) classifying the samples of the testing set as defective or not using the trained classifier. The steps of training the classifier were as follows.

(i) A 5-level wavelet decomposition of each fabric image in the dataset was performed using the dual-tree complex wavelet transform. The filters used to implement the wavelet transform were Q-shift FIR filters of length 14, designed according to the method proposed by Kingsbury [21]. The results of the wavelet decomposition were six complex directional subbands for each of the five decomposition levels, corresponding to six orientations: −75°, −45°, −15°, +15°, +45°, and +75°.

(ii) For each sample of the training set, the corresponding complex wavelet coefficients for each level of wavelet decomposition and each of the directional subbands were extracted. The ith complex wavelet coefficient from the subband with decomposition level l and orientation θ is denoted by w_i(l, θ).

(iii) From the complex wavelet coefficients extracted from each subband as mentioned in step (ii), the following features were calculated:

(a) mean energy of the real parts of the wavelet coefficients,
E_R(l, θ) = (1/N) Σ_{i=1}^{N} [Re(w_i(l, θ))]^2; (6)

(b) mean energy of the imaginary parts of the wavelet coefficients,
E_I(l, θ) = (1/N) Σ_{i=1}^{N} [Im(w_i(l, θ))]^2; (7)

(c) mean energy of the wavelet coefficients taking into account both real and imaginary parts,
E(l, θ) = (1/N) Σ_{i=1}^{N} |w_i(l, θ)|^2; (8)

(d) mean magnitude of the wavelet coefficients,
M(l, θ) = (1/N) Σ_{i=1}^{N} |w_i(l, θ)|; (9)

(e) variance of the magnitudes of the wavelet coefficients,
V(l, θ) = (1/N) Σ_{i=1}^{N} (|w_i(l, θ)| − M(l, θ))^2. (10)

In (6) through (10), w_i(l, θ) denotes the ith complex wavelet coefficient from the subband with decomposition level l and orientation θ corresponding to the current sample, while Re(w_i(l, θ)) and Im(w_i(l, θ)) denote its real and imaginary parts, respectively. N denotes the total number of complex wavelet coefficients for the current sample for decomposition level l and orientation θ.

(iv) The features calculated for each subband were then grouped into the following feature sets:

(a) Set 1: the mean energies of the real parts only, E_R(l, θ);
(b) Set 2: the mean energies of the real parts, E_R(l, θ), together with the mean energies of the imaginary parts, E_I(l, θ);
(c) Set 3: the mean energies of the complex wavelet coefficients, E(l, θ);
(d) Set 4: the means, M(l, θ), and variances, V(l, θ), of the magnitudes of the wavelet coefficients.

Therefore, for each sample in the training dataset, Set 1 and Set 3 have 30 features each (6 orientations × 5 levels), while Set 2 and Set 4 have 60 features each.

(v) For each feature set, all the feature values were normalized to fall into the range [0, 1] for the samples of the training set using

x' = (x − x_min) / (x_max − x_min), (11)

where x' is the normalized feature value, x_min is the minimum value of that feature over all the feature vectors of the training set, and x_max is the corresponding maximum value.

(vi) For each feature set, the normalized features were used to train a Euclidean distance classifier using the Minimum Classification Error (MCE) algorithm [9].
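The per-subband feature computation and the min-max normalization described above can be sketched as follows; the function names and the feature layout are illustrative:

```python
import numpy as np

def subband_features(w):
    """The five per-subband features of (6)-(10) for complex coefficients w."""
    re, im, mag = w.real, w.imag, np.abs(w)
    return {
        "E_real": np.mean(re ** 2),      # (6) mean energy of real parts
        "E_imag": np.mean(im ** 2),      # (7) mean energy of imaginary parts
        "E_complex": np.mean(mag ** 2),  # (8) = (6) + (7)
        "mag_mean": np.mean(mag),        # (9) mean magnitude
        "mag_var": np.var(mag),          # (10) variance of magnitudes
    }

def minmax_normalize(train, test):
    """(11): scale each feature into [0, 1] using the training-set extrema,
    and apply the same parameters to the corresponding test features."""
    lo, hi = train.min(axis=0), train.max(axis=0)
    return (train - lo) / (hi - lo), (test - lo) / (hi - lo)

rng = np.random.default_rng(1)
w = rng.standard_normal(64) + 1j * rng.standard_normal(64)  # stand-in subband
f = subband_features(w)
assert np.isclose(f["E_complex"], f["E_real"] + f["E_imag"])
```

Reusing the training-set extrema for the test features, as in `minmax_normalize`, is exactly the reuse of scaling parameters described for the testing experiments.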

For each of the four feature sets (Set 1, Set 2, Set 3, and Set 4) a testing experiment was performed using all the samples of the testing dataset, in order to identify the most powerful feature set among the four. Each testing experiment was performed as follows:

(i) the features of the current feature set were calculated for all the samples of the testing dataset, as described in steps (i) through (iv) of the classifier training process;
(ii) the obtained feature values were normalized using the same scaling parameters (x_min and x_max) as those used for normalizing the corresponding features of the training set;
(iii) the normalized features were fed into the trained classifier to classify the testing samples as defective or defect-free, and the correct classification rate was recorded.

To compare the performance of DTCWT-based features with UDWT-based features, similar experiments were performed on the same dataset using UDWT-based features instead. The filters used to implement the UDWT were also of length 14 and were designed using a cascade-form factorization procedure of the low-pass and high-pass filters as described by Yang [9]. The UDWT was chosen because it is also shift invariant, which makes it suitable for fabric defect detection applications.

As the wavelet coefficients obtained using the UDWT decomposition are real, they can only lead to feature sets Set 3 and Set 4 (as defined in step (iv) of the training process above). Set 2 is not applicable, as it involves the imaginary parts of wavelet coefficients, while Set 1 is the same as Set 3 in that context.

5. Results and Interpretation

Figure 7 compares the fabric defect detection rates obtained for the four different DTCWT-based feature sets, in order to assess their relative discriminating power. Set 1 consists of the mean energies of the real parts of the wavelet coefficients for all six directional subbands and all five decomposition levels. Set 2 adds to Set 1 the mean energies of the imaginary parts of the wavelet coefficients for the same subbands and levels. It can be seen from Figure 7 that this addition significantly improved the detection rate. The features of Set 3 are the mean energies of the complex wavelet coefficients; for each directional subband and each decomposition level, a Set 3 feature can be obtained by summing the two corresponding features of Set 2. The defect detection rate drops with respect to Set 2 (the mean energies of the real and imaginary parts taken separately) but generally remains higher than that obtained using the mean energies of the real parts alone (Set 1). The use of the mean and variance of the magnitudes of the complex wavelet coefficients (Set 4) leads to a slightly lower detection rate than Set 2 for most of the considered fabric categories and defect types; Set 4 outperforms Set 2 only for fabric category C4, representative R1, and defect type E3 of the TILDA dataset.

Figure 7: Comparison of detection rate of the 4 sets of the DTCWT-based features for different fabric categories and different defect types.

In the experiments, the entropies of the real parts, imaginary parts, and magnitudes of the DTCWT coefficients were also tried as features, but they did not vary significantly across most of the experimental samples. They were therefore not considered further, as they clearly would not improve the detection performance.

Therefore the use of the mean energies of the real and imaginary parts of the complex wavelet coefficients obtained from the dual-tree complex wavelet decomposition is recommended for fabric defect detection. If, for any reason, those features cannot be used, then the mean and variance of the magnitudes of the complex wavelet coefficients can be used instead.

Figure 8 compares the fabric defect detection performance of DTCWT- and UDWT-based features. The comparison is made for two feature sets: Set 3 (mean energy of wavelet coefficients) and Set 4 (mean and variance of magnitudes of wavelet coefficients). To strengthen the comparison, samples from each of the four fabric classes and from both representatives of each class were considered. Additionally, defective samples were taken from different types of defects.

Figure 8: Comparison of detection rate of DTCWT- and UDWT-based features for different fabric categories and defect types.

It can be seen clearly that, for every fabric category and every defect type, the DTCWT-based features outperformed the UDWT-based features for each of the two considered feature sets. The difference in fabric defect detection rate varies from 4.5% to 15.8%. One possible explanation for that performance difference is the analyticity of the complex wavelet implemented by the DTCWT, which allows discriminating texture in six different directions. The UDWT, on the other hand, implements a real wavelet and can thus discriminate texture in only three directions (horizontal, vertical, and diagonal). Furthermore, for the diagonal orientation, the UDWT cannot distinguish texture features oriented at +45° from those oriented at −45°.

6. Conclusions

In this paper, the fabric defect detection performance of features extracted using the dual-tree complex wavelet transform (DTCWT) was investigated. One of the advantages of that method is its approximate shift invariance, a property that is important in pattern recognition applications such as fabric defect detection. It was shown that the mean energies of the real and imaginary parts of the complex wavelet coefficients, taken separately, are effective features for the purpose of fabric defect detection, outperforming the mean and variance of the magnitudes of the coefficients as well as the mean energies of the real parts alone or the mean total energies of the coefficients. The undecimated discrete wavelet transform (UDWT) also has the shift invariance property. However, it was shown that the defect detection performance of features obtained with the DTCWT is substantially higher than that obtained using the UDWT.

Competing Interests

The authors declare that they have no competing interests.

Acknowledgments

The work described in this paper was supported by the Innovation Fund of the Central University of Technology, Free State (Grant no. 210000414). The authors fully appreciate the financial support.

References

  1. G. Lambert and F. Bock, "Wavelet methods for texture defect detection," in Proceedings of the International Conference on Image Processing, vol. 3, pp. 201–204, October 1997.
  2. S. Arivazhagan and L. Ganesan, "Texture classification using wavelet transform," Pattern Recognition Letters, vol. 24, no. 9-10, pp. 1513–1521, 2003.
  3. S. Guan, X. Shi, H. Cui, and Y. Song, "Fabric defect detection based on wavelet characteristics," in Proceedings of the Pacific-Asia Workshop on Computational Intelligence and Industrial Application (PACIIA '08), vol. 1, pp. 366–370, Wuhan, China, December 2008.
  4. S. G. Liu and P. G. Qu, "Inspection of fabric defects based on wavelet analysis and BP neural network," in Proceedings of the International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR '08), pp. 232–236, August 2008.
  5. J.-W. Wang, C.-H. Chen, W.-M. Chien, and C.-M. Tsai, "Texture classification using non-separable two-dimensional wavelets," Pattern Recognition Letters, vol. 19, no. 13, pp. 1225–1234, 1998.
  6. S. Mallat, A Wavelet Tour of Signal Processing, Academic Press, San Diego, Calif, USA, 1999.
  7. X. Z. Yang, G. K. H. Pang, and N. H. C. Yung, "Fabric defect classification using wavelet frames and minimum classification error training," in Proceedings of the 37th IAS Annual Meeting and World Conference on Industrial Applications of Electrical Energy, pp. 290–296, Pittsburgh, Pa, USA, October 2002.
  8. Y. X. Zhi, G. K. H. Pang, and N. H. C. Yung, "Fabric defect detection using adaptive wavelet," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '01), vol. 6, pp. 3697–3700, Salt Lake City, Utah, USA, May 2001.
  9. X. Z. Yang, Discriminative fabric defect detection and classification using adaptive wavelet [Ph.D. thesis], University of Hong Kong, 2003.
  10. N. G. Kingsbury, "The dual-tree complex wavelet transform: a new technique for shift invariance and directional filters," in Proceedings of the 8th IEEE DSP Workshop, vol. 8, Bryce Canyon, Utah, USA, August 1998.
  11. N. Kingsbury, "The dual-tree complex wavelet transform: a new efficient tool for image restoration and enhancement," in Proceedings of the 9th European Signal Processing Conference (EUSIPCO '98), Rhodes, Greece, September 1998.
  12. I. W. Selesnick, R. G. Baraniuk, and N. G. Kingsbury, "The dual-tree complex wavelet transform," IEEE Signal Processing Magazine, vol. 22, no. 6, pp. 123–151, 2005.
  13. M. Costin and A. Ignat, "Pitfalls in using dual tree complex wavelet transform for texture featuring: a discussion," in Proceedings of the 7th IEEE International Symposium on Intelligent Signal Processing (WISP '11), pp. 110–115, Floriana, Malta, September 2011.
  14. H. Wang, X. He, and W. Zai, "Texture image retrieval using dual-tree complex wavelet," in Proceedings of the International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR '07), pp. 230–234, Beijing, China, 2007.
  15. A. H. Kam, T. T. Ng, N. G. Kingsbury, and W. J. Fitzgerald, "Content based image retrieval through object extraction and querying," in Proceedings of the IEEE Workshop on Content-based Access of Image and Video Libraries, pp. 91–95, Hilton Head Island, SC, USA, June 2000.
  16. E. H. S. Lo, M. R. Pickering, M. R. Frater, and J. F. Arnold, "Image segmentation using invariant texture features from the double dyadic dual-tree complex wavelet transform," in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '07), pp. I609–I612, Honolulu, Hawaii, USA, April 2007.
  17. S. Hatipoglu, K. M. Sanjit, and N. G. Kingsbury, "Texture classification using dual-tree complex wavelet transform," in Proceedings of the 7th International Conference on Image Processing and Its Applications (Conf. Publ. No. 465), vol. 1, pp. 344–347, IET, Manchester, UK, July 1999.
  18. T. Celik and T. Tjahjadi, "Multiscale texture classification using dual-tree complex wavelet transform," Pattern Recognition Letters, vol. 30, no. 3, pp. 331–339, 2009.
  19. L. Wang, D. Zhongmin, and W. Xungai, "Application of wavelet transform method for textile material feature extraction," in Wavelet Transforms and Their Recent Applications in Biology and Geoscience, D. Baleanu, Ed., pp. 207–224, InTech, 2012.
  20. M. Misiti, Y. Misiti, G. Oppenheim, and J. M. Poggi, Wavelet Toolbox™ 4—Getting Started, Mathworks®, 2011.
  21. N. Kingsbury, "Design of Q-shift complex wavelets for image processing using frequency domain energy minimization," in Proceedings of the International Conference on Image Processing (ICIP '03), vol. 1, pp. 1013–1016, Barcelona, Spain, September 2003.
  22. Technical University Hamburg-Harburg and Technical Information Technology, "A reference dataset for evaluation of visual inspection procedure for textile," Internal Report, 1996.