Journal of Sensors
Volume 2017, Article ID 9702612, 14 pages
https://doi.org/10.1155/2017/9702612
Research Article

An Unsupervised Algorithm for Change Detection in Hyperspectral Remote Sensing Data Using Synthetically Fused Images and Derivative Spectral Profiles

1School of Convergence & Fusion System Engineering, Kyungpook National University, Sangju 37224, Republic of Korea
2School of Engineering and Computing Sciences, Texas A&M University-Corpus Christi, 6300 Ocean Dr., Corpus Christi, TX 78412, USA
3School of Civil Engineering, Chungbuk National University, 1, Chungdae-ro, Seowon-gu, Cheongju, Chungbuk 28644, Republic of Korea

Correspondence should be addressed to Jaewan Choi; jaewanchoi@chungbuk.ac.kr

Received 25 April 2017; Accepted 9 July 2017; Published 10 August 2017

Academic Editor: Hyung-Sup Jung

Copyright © 2017 Youkyung Han et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Multitemporal hyperspectral remote sensing data have the potential to detect altered areas on the earth’s surface. However, the dissimilar radiometric and geometric properties between the multitemporal data, which arise from the acquisition time or position of the sensors, must be resolved before hyperspectral imagery can be used to detect changes in natural and human-impacted areas. In addition, data noise in the hyperspectral imagery spectrum decreases the change-detection accuracy when general change-detection algorithms are applied to hyperspectral images. To address these problems, we present an unsupervised change-detection algorithm based on statistical analyses of spectral profiles; the profiles are generated from a synthetic image fusion method for multitemporal hyperspectral images. This method aims to minimize the noise between the spectra at identical locations, thereby increasing the change-detection rate and decreasing the false-alarm rate without reducing the dimensionality of the original hyperspectral data. Using a quantitative comparison on an actual dataset acquired by airborne hyperspectral sensors, we demonstrate that the proposed method provides superior change-detection results relative to state-of-the-art unsupervised change-detection algorithms.

1. Introduction

Hyperspectral imaging based on spaceborne or airborne imagery has strong potential for applications in remote sensing because each pixel of the hyperspectral data, which are composed of a continuous spectral profile, includes a detailed description of the spectral features of the image. These descriptions allow for an analysis of specific differences in the characteristics of the earth’s surface. Several satellite-based hyperspectral sensors, such as the Hyperion spectrometer of the National Aeronautics and Space Administration (NASA), and various airborne sensors, including the Airborne Imaging Spectrometer for Applications (AISA), Compact Airborne Spectrographic Imager (CASI), and HyMap, are currently available. In addition, several hyperspectral sensors are planned for future launches, including the Hyperspectral Imager SUIte (HISUI), the Hyperspectral Precursor of the Application Mission (PRISMA), HYPerspectral-X IMagery (HYPXIM), and the Environmental Mapping and Analysis Program (EnMAP) [1]. Thus, the development of core technologies for various applications of future hyperspectral sensor systems is necessary.

Change detection is one of the most common applications of remote sensing. In general, change detection is defined as the use of remotely sensed data of the same area at different times to identify altered areas on the earth’s surface. When studying human-induced or natural disasters such as floods, earthquakes, landslides, oil spills, and industrial accidents, change-detection technologies based on remotely sensed data can be effectively used to detect and estimate the extent of the damaged area. Several researchers have analyzed the existing change-detection methods for remote sensing data, including image differencing, image ratioing, vegetation indices, regression, data transformation, change vectors, postclassification comparison, and GIS-based methods [2–5]. According to the literature, change-detection techniques comprise two types: supervised change detection and unsupervised change detection [5]. Unsupervised change-detection techniques are generally preferred for remote sensing applications because training data do not need to be manually collected, even though supervised methods can provide change aspects of specific land-cover classes and “from-to” information of change detection [3].

Traditional unsupervised change-detection algorithms focus on proposing change-detection indices and applying automatic thresholding methods to generate a binary map of the changed area. For example, change-detection indices such as change vector analysis (CVA), image correlation, and the Relative Dimensionless Global Error (ERGAS) have been proposed [6, 7]. In addition, various thresholding techniques have been developed and applied to multitemporal remote sensing data [8]. Molina et al. [9] integrated different change-detection indices to improve their statistical properties for single change detection. Bruzzone and Prieto proposed modified image differencing and thresholding of adaptive parcels based on homogeneous regions [10]. Markov random fields (MRFs), expectation-maximization (EM) algorithms, and neural networks have been used to determine optimal thresholds for automatic change detection [11, 12]. In addition, Bruzzone and Bovolo [13] analyzed change-detection techniques for very-high-resolution (VHR) satellite imagery and developed a novel top-down framework that considers the radiometric characteristics of multitemporal images.

With the launches and high availability of hyperspectral sensors, new change-detection methodologies have been developed, and existing methods have been modified to make them appropriate for hyperspectral imagery. Eismann et al. [14] proposed an algorithm based on linear predictors to detect subtle targets against a complex background, and Kim [15] modified matched filtering using target signal exclusion. Song et al. [16] proposed unsupervised change-detection algorithms using spectral unmixing and iterative error analysis (IEA). Hao et al. [17] applied hyperspectral data to detect changes in urban forest resources in natural disaster zones. Image transformation techniques, such as multivariate alteration detection (MAD) and principal component analysis (PCA), which can be applied to general multispectral images, have been extended to hyperspectral data [18, 19]. In particular, iterative regularized multivariate alteration detection (IR-MAD) is considered a state-of-the-art change-detection algorithm due to its stability and outstanding change-detection results [20–22]. However, dimensionality reduction through PCA or the minimum noise fraction (MNF) is required prior to the application of IR-MAD-based algorithms. Meola et al. [23] proposed a model-based change-detection approach that reduces false alarms caused by shadow differences using calibrated hyperspectral data, and Liu et al. [24] proposed a hierarchical change-detection algorithm using endmember detection and cluster merging in multitemporal hyperspectral images. Júnior et al. [25] used CVA based on distance (Euclidean distance and Mahalanobis distance) and similarity (spectral angle mapper, SAM, and spectral correlation mapper, SCM) measurements and concluded that CVA using the Euclidean distance and SAM is superior to the other measures. Wu et al. [26] developed a subspace-based algorithm using two types of information from Hyperion and from the Chinese HJ-1A satellite, proving that measurements based on the SAM are identical to orthogonal subspace projection- (OSP-) based measurements. Based on this concept, the authors proposed local and adaptive measures based on OSP to minimize registration errors. Other advanced algorithms based on target detection, anomaly detection, and change detection have also been developed recently [27–29].

These related approaches have generally focused on developing change-detection algorithms that increase accuracy by effectively using hyperspectral bands to maximize the advantages of hyperspectral sensors; in other words, they have focused on exploiting abundant spectral information. However, the dissimilarity of the radiometric and geometric properties of multitemporal data, which arises from the acquisition time or position of the sensors, must be resolved before hyperspectral imagery can be used to detect changes in natural and human-impacted areas. Moreover, data noise in the spectra of hyperspectral imagery has received little consideration, even though it can decrease the change-detection accuracy when general change-detection algorithms are applied to hyperspectral images without dimensionality reduction.

Most studies have also considered hyperspectral data with low or medium spatial resolutions due to the technical limitations of building hyperspectral sensors with high spatial resolutions. However, considering the future development of hyperspectral sensors, hyperspectral data with high spatial resolutions will become increasingly available. Thus, change-detection approaches should consider high spectral and spatial resolutions simultaneously. High-resolution images acquired at different times exhibit local geometric misalignments and spectral variation caused by phenological or temporal differences between acquisition dates, even when the data are georeferenced. These geometric differences can degrade change-detection results by producing false alarms in the changed area [30]. Information obtained exclusively from the data acquired at the two times, that is, before and after the change occurs, is insufficient for accurate change detection.

The main objective of this study is to develop a change-detection index suitable for hyperspectral imagery. We apply unsupervised change detection using an image fusion technique and an ordered combination of similarity measures to identify the changed regions. To minimize the radiometric and geometric dissimilarities between multitemporal images with high spatial and spectral resolutions, a synthetic image fusion method is applied that generates combined images from the two datasets acquired at different times. A combination of the synthetically fused images is then used to optimize the change-detection results. We propose a new measure for change detection that is appropriate for hyperspectral data and does not require dimensionality reduction. To evaluate the proposed method, we constructed a test bed and acquired airborne hyperspectral images of it at different times. Finally, the experimental results are compared with the performance of state-of-the-art algorithms.

2. Study Site and Dataset Description

The proposed change-detection algorithm was applied to airborne hyperspectral images collected by CASI. The CASI images for this experiment were collected over Gangnae-myeon, Cheongwon-gun, Chungcheongbuk-do, Korea (127° 22′ E, 36° 34′ N), on 22 June 2013. The data specifications are described in Table 1.

Table 1: CASI data specifications.

A total of 45 spectral bands with wavelengths of 413.4–1044.9 nm, excluding noise bands, were used, and a subset of the images was extracted for the experiments. The study site is shown in Figure 1(a). To artificially construct changed areas within the study site, we installed two types of targets: camouflage nets and artificial turf, whose colors are similar to the vegetation and ground in the study site (Figures 1(b) and 1(c)). In addition, we covered a car with a camouflage net, as shown on the left side of Figure 1(b). The ginseng field (yellow circle in Figure 1(a)) includes an inclined canvas, giving rise to different reflectance values depending on the position of the airborne hyperspectral sensor.

Figure 1: Study site with targets for change detection: (a) CASI image of study site generated with an RGB composite (red: 657 nm, green: 557 nm, and blue: 471 nm), (b) example of the area changed by camouflage nets (red circle of Figure 1(a)), and (c) example of the area changed by artificial turf (green circle of Figure 1(a)).

Figures 2(a) and 2(b) show the reference and target data for the experiment, and Figures 2(c) and 2(d) show the ground-truth data of the altered regions for the evaluation of the accuracy using image interpretation [31].

Figure 2: Study area: (a) reference data, (b) target data, (c) ground-truth data, and (d) specification of changed area in the study area.

3. Methodology

The workflow of our method is illustrated in Figure 3. Following geometric and radiometric preprocessing of the hyperspectral data acquired at different times, synthetic image fusion is applied. Then, a spectral similarity is calculated using the fused images. Finally, the changed area is detected by the proposed unsupervised similarity measure. A detailed explanation of each step is provided below.

Figure 3: Workflow of the proposed method.
3.1. Preprocessing

Prior to change detection, geometric and radiometric corrections must be applied to the multitemporal images. In our proposed method, image-to-image registration using manually extracted ground control points (GCPs) is used to obtain georeferenced images. The registered hyperspectral images are atmospherically corrected using the atmospheric and topographic correction (ATCOR) module and then radiometrically adjusted using an empirical line-calibration algorithm based on linear predictions [32]. Four tarps with different reflectances (3.5%, 23%, 35%, and 53%) are used as reference targets for the radiometric calibration. After this preprocessing step, spectra at the same location should exhibit nearly identical characteristics, although some noise remains.
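The empirical line calibration above can be sketched as a per-band least-squares fit between the image values observed over the four reference tarps and their known reflectances. The function name and array layout below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def empirical_line_calibration(image, tarp_values, tarp_reflectances):
    """Per-band empirical line calibration.

    image: (rows, cols, bands) at-sensor values.
    tarp_values: (n_targets, bands) mean image values over each tarp.
    tarp_reflectances: (n_targets,) known reflectances, e.g. [0.035, 0.23, 0.35, 0.53].
    Returns a surface-reflectance image of the same shape as `image`.
    """
    n_targets, bands = tarp_values.shape
    calibrated = np.empty_like(image, dtype=float)
    for b in range(bands):
        # least-squares fit of the line: reflectance = gain * value + offset
        A = np.column_stack([tarp_values[:, b], np.ones(n_targets)])
        (gain, offset), *_ = np.linalg.lstsq(A, tarp_reflectances, rcond=None)
        calibrated[..., b] = gain * image[..., b] + offset
    return calibrated
```

With four tarps per band, the fit is overdetermined, so the regression also absorbs small measurement noise in the reference targets.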

3.2. Generation of Synthetically Fused Images

Synthetically fused images are generated based on the cross-sharpening methodology that was first developed for VHR satellite-based multitemporal data to minimize spatial dissimilarities while limiting spectral distortion [33]. A general pansharpening algorithm for VHR images is applied to obtain a high-spatial-resolution multispectral image from high-spatial-resolution panchromatic and low-spatial-resolution multispectral images acquired at the same time and at the same sensor position. The cross-sharpening algorithm is a modified version of the algorithm of Chen et al. [33, 34]. Cross-sharpened images are effective for minimizing change-detection errors caused by geometric displacement and spectral variation, including the noise in multitemporal images [33]. The cross-sharpening algorithm uses panchromatic and multispectral datasets acquired at different times and at different sensor positions. However, general airborne hyperspectral datasets do not include high-spatial-resolution panchromatic data. Therefore, to apply the cross-sharpening algorithm to the hyperspectral dataset, we generate a synthetic dataset with high-spatial-resolution multispectral images and low-spatial-resolution hyperspectral images and then employ a block-based fusion algorithm for sharpening the hyperspectral images using the multispectral images. Initially, the hyperspectral imagery obtained from the CASI sensor is grouped into three blocks: 413.4–585.9 nm (corresponding to bands 1–13 of the hyperspectral images from CASI), 600.3–686.4 nm (corresponding to bands 14–20), and 700.7–1044.9 nm (corresponding to bands 21–45). Then, we generate a synthetic dataset with high-spatial-resolution multispectral images and low-spatial-resolution hyperspectral images corresponding to each block [35, 36].
Here, and are the original hyperspectral images with high spatial resolutions before the change (time ) and after the change (time ), respectively. The workflow for the synthetic dataset generation is shown in Figure 4.
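The synthetic dataset generation described above degrades the original hyperspectral image in two ways: spectrally, by averaging the bands of each block into one multispectral band, and spatially, by blurring with a Gaussian PSF and decimating (0.5 m to 1 m). A rough sketch, assuming a (rows, cols, bands) array layout, the three CASI blocks given in the text, and SciPy's Gaussian filter as a stand-in PSF:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# 0-based band slices of the three CASI blocks described in the text
# (bands 1-13, 14-20, and 21-45).
BLOCKS = [slice(0, 13), slice(13, 20), slice(20, 45)]

def degrade_spectrally(hs):
    """Simulate 3-band multispectral data by averaging the bands in each block."""
    return np.stack([hs[..., b].mean(axis=-1) for b in BLOCKS], axis=-1)

def degrade_spatially(hs, factor=2, sigma=1.0):
    """Simulate low-spatial-resolution data: Gaussian PSF blur along the two
    spatial axes, then decimation (0.5 m -> 1 m corresponds to factor=2)."""
    blurred = gaussian_filter(hs, sigma=(sigma, sigma, 0))
    return blurred[::factor, ::factor, :]
```

The PSF width (`sigma`) is an illustrative parameter; the paper does not report the exact value used.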

Figure 4: Workflow of the synthetic multitemporal dataset generation corresponding to each block of hyperspectral imagery.

The multispectral data with high spatial resolution () at time are obtained by spectrally degrading by averaging all bands in each block of the hyperspectral image [37]. Then, hyperspectral data with low spatial resolution are created by downscaling via a Gaussian point spread function (PSF) [38]. In the downscaling process, the original hyperspectral data with 0.5 m spatial resolution are downscaled to 1 m spatial resolution. By applying this process to , we obtain two synthetic multitemporal datasets, and . After the synthetic multitemporal images are generated, a cross-sharpening algorithm is applied. Cross-sharpening is defined as a special image fusion that combines hyperspectral and multispectral images obtained at the same time or at different times [35], where is the specific fusion algorithm and is the synthetically fused image with high spatial resolution generated using the multispectral image at time and the hyperspectral image at time . To generate the cross-sharpened hyperspectral image, we employ a block-based fusion algorithm using the multispectral band corresponding to the wavelength range of the hyperspectral bands [35, 36, 39]. In contrast to typical fusion processing with multispectral and panchromatic data, the block-based fusion algorithm regards each multispectral band as a panchromatic image and the corresponding hyperspectral bands as the multispectral dataset. As mentioned above, we divide the wavelength range into three blocks. Accordingly, the multispectral image is composed of three bands, , , and , and each band of is regarded as a panchromatic image for pansharpening. The hyperspectral image is likewise partitioned into three hyperspectral images , , and , corresponding to the wavelength of each band of .
The pansharpening algorithm is applied using generated panchromatic and partitioned hyperspectral images. After applying the block-based pansharpening approach, fused multispectral bands of each block are integrated as . Figure 5 represents the workflow of block-based fusion using multispectral bands and hyperspectral bands .
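The cross-sharpening combination itself can be sketched as a driver that fuses every pairing of multispectral and hyperspectral times, block by block. Here `pansharpen` is a placeholder for whichever block-wise fusion algorithm is used (GS in the paper), and all names are illustrative:

```python
import numpy as np

def cross_sharpen(ms1, ms2, hs1_lr, hs2_lr, pansharpen, blocks):
    """Produce the four cross-sharpened images F(i, j) = f(MS_i, HS_j).

    Each 1-band MS block image acts as the panchromatic input for the
    low-resolution hyperspectral bands of the same wavelength block;
    the fused blocks are concatenated back into a full 45-band cube.
    """
    fused = {}
    for i, ms in ((1, ms1), (2, ms2)):
        for j, hs in ((1, hs1_lr), (2, hs2_lr)):
            parts = [pansharpen(ms[..., k], hs[..., blk])
                     for k, blk in enumerate(blocks)]
            fused[(i, j)] = np.concatenate(parts, axis=-1)
    return fused
```

The pairs F(1, 1)/F(1, 2) and F(2, 1)/F(2, 2) share the spatial detail of a single MS image, which is what later suppresses geometric false alarms.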

Figure 5: Workflow of the block-based pansharpening algorithm.

In the case of the pansharpening algorithm, we use the Gram-Schmidt (GS) method, which is a representative and efficient pansharpening algorithm [40]. In (1)–(4), the spatial resolution, the number of bands, and spectral wavelength of the synthetically fused images are equivalent to those of the original hyperspectral image. In addition, fused images corresponding to the same multispectral images have similar spatial characteristics, indicating that some of the geometric errors in the change detection can be minimized using a pair of fused images with similar spatial information. Figure 6 shows an example in which the geometric error is minimized in the synthetically fused images. The spatial characteristics of in the tiled roof of the house are more similar to than to .
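A minimal sketch of GS pansharpening in its common detail-injection form is shown below. It assumes the hyperspectral block has already been upsampled to the panchromatic grid and simulates the low-resolution pan as the band mean; this is an illustration of the general technique, not necessarily the exact GS variant used in the paper:

```python
import numpy as np

def gs_pansharpen(pan, hs_up):
    """Gram-Schmidt pansharpening, detail-injection form.

    A simulated low-resolution pan is built from the hyperspectral bands,
    and the missing spatial detail (pan - pan_sim) is injected into each
    band with a gain proportional to its covariance with the simulated pan.
    """
    pan_sim = hs_up.mean(axis=-1)          # simulated low-resolution pan
    detail = pan - pan_sim                 # spatial detail to inject
    flat_sim = pan_sim.ravel()
    var = flat_sim.var() + 1e-12
    fused = np.empty_like(hs_up)
    for b in range(hs_up.shape[-1]):
        band = hs_up[..., b]
        gain = np.cov(band.ravel(), flat_sim)[0, 1] / var
        fused[..., b] = band + gain * detail
    return fused
```

When the panchromatic image carries no extra detail (pan equals the simulated pan), the injection term vanishes and the hyperspectral bands pass through unchanged.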

Figure 6: Example of an image fusion result using synthetic hyperspectral images: (a) , (b) , and (c) .
3.3. Spectral Similarity Measure for Detecting Change

In this study, two representative normalized spectral measures are used. First, the spectral angle distance (SAD) measure is used to calculate the spectral similarity between the spectra of two pixels in the original multitemporal hyperspectral images [41]. We assume that x is a spectrum of the reference data and that y is a spectrum of the target data in a hyperspectral dataset with L bands. According to these spectral vectors, the SAD is calculated as SAD(x, y) = cos⁻¹(⟨x, y⟩ / (‖x‖ ‖y‖)), where ⟨x, y⟩ represents the inner product of the two spectral vectors and ‖x‖ and ‖y‖ represent the magnitudes of the spectra x and y, respectively. The SAD range is [0, π/2].
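In NumPy terms, the SAD between two pixel spectra can be computed as follows (the clipping guards against floating-point values slightly outside the domain of arccos):

```python
import numpy as np

def sad(x, y):
    """Spectral angle distance between two spectra, in radians.

    For nonnegative reflectance spectra the result lies in [0, pi/2]:
    0 for parallel (identical-shape) spectra, pi/2 for orthogonal ones.
    """
    cos_angle = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```

Because the angle depends only on spectral shape, SAD is insensitive to uniform brightness scaling, which complements the magnitude-sensitive ED below.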

In addition to the SAD measure, the Euclidean distance (ED) is used to measure the spectral dissimilarity between pixels. The ED, a representative distance measure, is defined as the square root of the sum of the squared differences between corresponding spectral bands: ED(x, y) = sqrt(Σ (x_l − y_l)²), with the sum taken over the L bands. The ED is mathematically simple, conducive to rapid processing, and sensitive to absolute reflectance values. However, its values are not normalized. Therefore, the ED values are rescaled using the minimum and maximum of the calculated distances, ED_norm = (ED − ED_min) / (ED_max − ED_min), where ED_max and ED_min are the maximum and minimum values of the calculated ED, respectively. With this normalization, the ED value lies in [0, 1].
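A per-pixel ED map over two coregistered image cubes and its min-max normalization can be sketched as:

```python
import numpy as np

def ed_map(ref, tgt):
    """Per-pixel Euclidean distance between two (rows, cols, bands) images."""
    return np.sqrt(((ref - tgt) ** 2).sum(axis=-1))

def minmax_normalize(d):
    """Rescale a distance map to [0, 1] using its minimum and maximum.

    The small epsilon avoids division by zero for a constant map.
    """
    return (d - d.min()) / (d.max() - d.min() + 1e-12)
```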

3.4. Unsupervised Change Detection Using a Derivative Spectral Profile

When change detection between and of (1)-(2) is applied based on the SAD and ED measures, some unchanged areas may be identified as changed areas owing to the dissimilar geometric and radiometric characteristics of the features. However, as noted in Section 3.2, the synthetically fused image tends to adhere to the spatial properties of the multispectral image at time 1 () and to the spectral properties of the hyperspectral image at time 2 (). Therefore, if the fused image is used together with the fused image at time 1, that is, , for change detection, then the errors from the geometric difference between the two images are minimized while the spectral difference between the two datasets is maintained. This characteristic is even more pronounced in the derivative pixel spectra [42]. Figure 7 illustrates the spectral profile (Figure 7(b)) and second-derivative spectral profile (Figure 7(c)) of a pixel that actually changed (Figure 7(a)), corresponding to a camouflage net that exists only at time 1. Comparing the second-derivative spectral profiles of the fused images and , we find that exhibits a high profile variation, which aids the identification of the changed region.

Figure 7: Spectral property comparison of fused images for detecting changed areas: (a) sample target of fused images , , and , (b) spectral profile of pixel values, and (c) spectral profile of the second-derivative pixel values (black line: , red dotted line: , and blue dotted line: ).

Therefore, the SAD and ED measures are integrated into a new similarity distance (the integrated similarity distance, ISD) based on a combination of the original and derivative spectral profiles of the synthetically fused images. The ISD is defined as follows, where is the second-derivative spectral profile, min is the minimum value, and is the convolution filter. If a pixel has an ISD value larger than a predefined threshold, it is determined to be a changed pixel. SAD measured on the original spectral profiles of the two hyperspectral datasets captures their different spectral and spatial characteristics; however, these results may include false-positive regions of change detection. Because the synthetically fused image pair exhibits similar spatial characteristics, the ED is calculated on the second-derivative spectral profiles of the hyperspectral data. In particular, the overall information acquired from the synthetic image fusion pairs ( and ) is used for change detection to optimize the accuracy of the change-detection results.
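The second-derivative profile used here is a simple discrete second difference along the band axis. The helpers below sketch that ingredient; the paper's full ISD formula, with its minimum term and convolution filter, is not reproduced:

```python
import numpy as np

# discrete second-difference kernel: p[l-1] - 2*p[l] + p[l+1]
KERNEL = np.array([1.0, -2.0, 1.0])

def second_derivative(profile):
    """Second-derivative spectral profile of one pixel (length L-2)."""
    return np.convolve(profile, KERNEL, mode="valid")

def derivative_ed(p, q):
    """Euclidean distance between the second-derivative profiles of two
    pixels: the ED ingredient the ISD combines with SAD on the original
    profiles."""
    return float(np.linalg.norm(second_derivative(p) - second_derivative(q)))
```

Because the second difference annihilates any linear trend in a spectrum, this distance emphasizes changes in spectral curvature while suppressing smooth illumination offsets, which is what makes the derivative profiles in Figure 7(c) more discriminative than the raw profiles.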

4. Results and Discussion

4.1. Change-Detection Results

To estimate the performance of the proposed ISD measure, various unsupervised state-of-the-art change-detection algorithms are applied. The magnitude of the change vector (Euclidean distance of CVA or CVAED) and the spectral information divergence (SID) similarity measures are applied to detect the changed area [24, 41]. Subspace-based change-detection (SCD) algorithms, that is, original SCD, adaptive SCD (ASCD), and local SCD (LSCD), are also employed [26]. ASCD and LSCD are applied to estimate the effects of varying sensor positions or misregistration between the multitemporal hyperspectral data. Finally, IR-MAD is chosen as the change-detection algorithm for quantitative estimations because it is frequently cited as a representative unsupervised change-detection algorithm. In the case of ASCD and LSCD, a window size of is selected while considering the size of the camouflage nets at the test site. To apply IR-MAD, PCA is applied as preprocessing for dimensionality reduction, and the top five principal components, including approximately 99% of the information, are used for calculating IR-MAD.

Each change-detection result map is calculated from two corresponding hyperspectral datasets and then normalized over a range of . Figure 8 presents the results of the state-of-the-art and proposed methods, displayed using a gamma correction. The change-detection results obtained from the CVAED (Figure 8(d)) have higher values in the mountains than those of the other change-detection algorithms. The results using the SID (Figure 8(e)) and the SCD-based algorithms (Figures 8(f)–8(h)) show highly dissimilar values in the ginseng field in the lower part of the site. In the case of ASCD and LSCD, some unchanged edge or linear features were effectively removed relative to the result obtained by SCD. However, small changed areas similar in size to the sliding window exhibit lower values because they are blurred by the local processing of the moving window in ASCD and LSCD. In addition, the proposed algorithm (Figure 8(j)) shows more distinguishable and contrasting values over the camouflage nets and artificial turf than the other change-detection methods, including the IR-MAD algorithm (Figure 8(i)), even though IR-MAD produces better change-detection output than the other existing algorithms.

Figure 8: Comparison of the change-detection results: (a) reference hyperspectral image, (b) target hyperspectral image, (c) ground-truth image, (d) result using the CVAED, (e) result using the SID, (f) result using the SCD, (g) result using the ASCD, (h) result using the LSCD, (i) result using the IR-MAD, and (j) result using the proposed algorithm.

The change-detection results can be quantitatively evaluated by estimating the receiver operating characteristic (ROC) curve and by evaluating the binary change map produced by a thresholding technique. The ROC curve provides a graphical plot for estimating performance and selecting an optimal model from the class distribution [26, 43, 44]. The ROC curve plots the cumulative distribution function of the detection rate against that of the false-alarm rate; from the curve, an appropriate threshold for separating changed and unchanged areas can be selected. From the ROC curve, the area under the curve (AUC) can be calculated. The AUC describes the probability that the change-detection algorithm will rank a randomly chosen positive data point higher than a randomly chosen negative data point [44]. Figure 9 shows the ROC curves corresponding to each algorithm and demonstrates that the proposed algorithm detects changed regions better than the other methods regardless of the threshold used for the unsupervised change-detection process. The poorer results of ASCD and LSCD relative to SCD are due to the blurring effect of sliding-window processing, as shown in Figures 8(g) and 8(h). IR-MAD shows a higher value than the other existing algorithms but requires a dimensionality reduction process; moreover, its AUC is lower than that of the proposed algorithm. The efficiency of our algorithm is demonstrated by its high AUC value, obtained without dimensionality reduction techniques such as PCA and MNF, as shown in Table 2. The ROC analysis indicates that the proposed method can optimize the change-detection results while minimizing the false-alarm rate.
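An ROC curve and its AUC can be computed directly from a change-score map and a binary ground-truth map. The following self-contained sketch sorts pixels by decreasing score and integrates the resulting curve with the trapezoidal rule:

```python
import numpy as np

def roc_auc(scores, truth):
    """ROC curve and AUC from a change-score map and binary ground truth.

    Sweeping a threshold down the sorted scores traces the detection rate
    (TPR) against the false-alarm rate (FPR).
    Returns (fpr, tpr, auc).
    """
    order = np.argsort(-scores.ravel(), kind="stable")
    t = truth.ravel().astype(bool)[order]
    tpr = np.cumsum(t) / t.sum()
    fpr = np.cumsum(~t) / (~t).sum()
    # trapezoidal integration of the ROC curve, anchored at the origin
    fpr_ = np.concatenate([[0.0], fpr])
    tpr_ = np.concatenate([[0.0], tpr])
    auc = float(np.sum(np.diff(fpr_) * (tpr_[1:] + tpr_[:-1]) / 2.0))
    return fpr, tpr, auc
```

A detector that ranks every changed pixel above every unchanged pixel yields an AUC of 1, while chance-level ranking yields about 0.5, matching the probabilistic interpretation cited above.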

Table 2: AUC values corresponding to each similarity measure for change detection.
Figure 9: ROC curves corresponding to each similarity measure for change detection.

An automatic thresholding technique is then applied to the unsupervised change-detection results presented in Figure 8. The threshold determination is based on Rosin’s threshold algorithm [45]. Rosin’s threshold, a representative unimodal thresholding method, assumes that the histogram of the similarity-distance image is a unimodal function and that the nonchanged class of pixels (e.g., the background) is considerably larger than the changed class of pixels in the image. Rosin’s method fits a straight line from the peak of the histogram to the end of its tail; the point of maximum deviation between this line and the histogram curve is then selected as the threshold. Using the estimated threshold values, we extract binary images corresponding to the changed area. After thresholding, a 3-by-3 median filter is applied to account for the minimum size of the changed regions.
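Rosin's threshold can be sketched as follows, operating on a histogram of the similarity-distance image. Taking the last nonzero bin after the peak as the line's far end is one common convention for "the end of the tail":

```python
import numpy as np

def rosin_threshold(hist, bin_centers):
    """Rosin's unimodal threshold.

    Draw a line from the histogram peak to the end of its tail and pick
    the bin with the maximum perpendicular distance to that line.
    """
    peak = int(np.argmax(hist))
    end = peak + int(np.nonzero(hist[peak:])[0][-1])   # last nonzero bin
    x1, y1 = bin_centers[peak], hist[peak]
    x2, y2 = bin_centers[end], hist[end]
    best, best_d = peak, -1.0
    for i in range(peak, end + 1):
        # numerator of point-to-line distance (the constant denominator
        # does not affect the argmax)
        d = abs((y2 - y1) * bin_centers[i]
                - (x2 - x1) * hist[i]
                + x2 * y1 - y2 * x1)
        if d > best_d:
            best, best_d = i, d
    return bin_centers[best]
```

The method works well precisely in the situation the text describes: a large unchanged background forms the single histogram peak, and the sparse changed pixels populate the tail beyond the knee.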

Figure 10 depicts the change-detection results using the existing and proposed algorithms with automatic thresholding for unsupervised change detection. Figure 11 presents magnified results near the targets for visual assessment. As shown in Figures 10(a) and 11(c), the ground-truth data comprise a total of 863 changed pixels. In the change-detection results using the CVAED, nearly all of the changed regions are represented as nonchanged areas (Figure 10(b)). The change-detection results based on the SID- and SCD-based algorithms are presented in Figures 10(c)–10(f). In these results, the actually changed regions, for example, the camouflage net and the artificial turf in the middle of the site, as well as the ginseng field in the lower part of the site, are detected as changed areas. The latter is attributed to the inclined canvas on the ginseng field, which exhibits varying radiometric and spatial properties with the off-nadir angle of the given scene. Thus, many false alarms arise in these change-detection results. In addition, the region corresponding to the brown camouflage net, whose spectral properties are similar to those of the neighboring pixels, is not detected as a changed region. In the results of the IR-MAD algorithm, the artificial turf and certain building roofs (left side of Figure 10(a)) are not detected as changed areas. The result of the proposed algorithm, integrated by convolving the similarity measures between the synthetically fused images, is presented in Figure 10(h). The ginseng field is identified as an unchanged area, whereas the camouflage net and artificial turf regions are effectively detected as changed areas (Figure 11(j)) relative to the IR-MAD result (Figure 11(i)). However, in all cases, the small camouflage-net targets are not detected due to the limited spatial resolution of the data.
With the exception of the small targets, all of the changed regions are detected by the proposed algorithm, although some regions that did not actually change are also detected. These errors are attributed to the atmospheric correction and weather conditions: as shown in Figure 10(b), cloud shadows occurred on the right side of the datasets, producing some change-detection errors in these areas. To quantitatively evaluate the change-detection results, the number of detected changes, overall accuracy, completeness, correctness, and false-alarm rate are calculated based on the ground-truth data. The evaluation factors are estimated in numbers of pixels, and the results are presented in Table 3. The change-detection result based on the proposed algorithm shows higher completeness than the results of the existing algorithms; that is, our algorithm detected the highest portion of the actually changed areas. The CVAED, IR-MAD, and proposed algorithms showed better overall accuracy, correctness, and false-alarm rates than the SID- and SCD-based algorithms. However, the high correctness and low false-alarm rates of the CVAED and IR-MAD results come at the cost of missed detections: these algorithms could not adequately detect the changed areas, in contrast to the proposed algorithm. Therefore, we conclude that our algorithm detects changed areas efficiently and minimizes the overall error in the change detection without the use of dimensionality reduction.
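The evaluation factors above follow directly from the pixel counts of the binary detection and ground-truth maps; a minimal sketch:

```python
import numpy as np

def change_metrics(detected, truth):
    """Completeness, correctness, false-alarm rate, and overall accuracy
    from binary change-detection and ground-truth maps."""
    d = detected.astype(bool).ravel()
    t = truth.astype(bool).ravel()
    tp = int(np.sum(d & t))    # correctly detected changes
    fp = int(np.sum(d & ~t))   # false alarms
    fn = int(np.sum(~d & t))   # missed changes
    tn = int(np.sum(~d & ~t))  # correctly rejected background
    return {
        "completeness": tp / (tp + fn),      # fraction of true changes found
        "correctness": tp / (tp + fp),       # fraction of detections that are real
        "false_alarm_rate": fp / (fp + tn),
        "overall_accuracy": (tp + tn) / d.size,
    }
```

Note that on a scene dominated by unchanged background, overall accuracy rewards conservative detectors, which is why the text weighs completeness alongside it.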

Table 3: Accuracy of the change-detection maps using the existing and proposed algorithms.
Figure 10: Comparison of the change-detection maps: (a) ground-truth image, (b) result using the CVAED, (c) result using the SID, (d) result using the SCD, (e) result using the ASCD, (f) result using the LSCD, (g) result using the IR-MAD, and (h) result using the proposed algorithm.
Figure 11: Comparison of the magnified change-detection maps: (a) reference hyperspectral image, (b) target hyperspectral image, (c) ground-truth image, (d) result using the CVAED, (e) result using the SID, (f) result using the SCD, (g) result using the ASCD, (h) result using the LSCD, (i) result using the IR-MAD, and (j) result using the proposed algorithm.

5. Conclusions

In this study, we proposed a novel algorithm for detecting changes in hyperspectral data without a dimensionality reduction step, which can discard information specific to the data. Synthetically fused images that preserve spectral differences while minimizing the spatial differences between the two images acquired at different times were used in the change-detection procedure. Based on the proposed similarity distance and the application of the second-derivative spectral profiles of pixels, most of the changed regions, except for certain small targets, were effectively detected. The change-detection results using our algorithm showed the highest AUC value relative to the results of the other state-of-the-art algorithms; a high AUC value indicates that the change-detection rate can be maximized while the false-alarm rate is minimized. In the evaluation of the binary change-detection maps, the CVAED and IR-MAD methods showed higher overall accuracies than the proposed approach, which arose from their higher correctness and lower false-alarm rates. The proposed method produced the highest completeness values, implying that it detected the largest portion of the truly changed areas. Therefore, we conclude that our methodology can be used to detect subtle changes in hyperspectral data acquired at different times using the properties of the derivative spectral profiles of pixels.
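To illustrate why second-derivative spectral profiles help here, the curvature of a pixel spectrum can be approximated by central finite differences; this sketch is a generic illustration, not the paper's exact implementation, and the uniform band spacing is an assumption:

```python
import numpy as np

def second_derivative_profile(spectrum, band_spacing=1.0):
    """Approximate the second derivative of a pixel spectrum across bands.

    The second difference cancels any additive offset and any linear trend
    in the spectrum, so smooth radiometric differences between acquisition
    dates are suppressed while absorption-feature curvature is preserved."""
    s = np.asarray(spectrum, float)
    return (s[2:] - 2.0 * s[1:-1] + s[:-2]) / band_spacing ** 2

# Same spectral shape observed twice, the second time with an offset and a
# linear gain difference (a simple stand-in for radiometric dissimilarity).
bands = np.linspace(0.0, 1.0, 50)
spectrum = np.sin(4 * np.pi * bands)
shifted = spectrum + 0.3 + 0.1 * bands
d2a = second_derivative_profile(spectrum)
d2b = second_derivative_profile(shifted)
```

Because the offset and linear terms vanish under the second difference, the two profiles coincide, which is the property that makes derivative spectra robust to date-to-date radiometric differences.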

However, certain regions that did not change between the two datasets were incorrectly detected due to the spectral dissimilarity caused by atmospheric effects and cloud shadows. Moreover, small targets were not detected due to the limited spatial resolution of the acquired hyperspectral data. These problems will be addressed in our future work using various hyperspectral datasets together with modifications of the similarity distance measure.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Space Core Technology Development Program through the National Research Foundation of Korea (NRF) and was funded by the Ministry of Science, ICT & Future Planning (NRF-2012M1A3A3A02033469 and NRF-2014M1A3A3A03034798).
