Research Article | Open Access
Sijie Kong, Jin Zhou, Wenli Ma, "The Experimental Demonstration of Correcting the Atmospheric Dispersion Using Image Processing Based on Edge Extension", International Journal of Optics, vol. 2019, Article ID 5680956, 9 pages, 2019. https://doi.org/10.1155/2019/5680956
The Experimental Demonstration of Correcting the Atmospheric Dispersion Using Image Processing Based on Edge Extension
We present an image processing algorithm based on edge extension to correct the influence of atmospheric dispersion. The Edlén model is used to estimate the dispersion index of the image caused by atmospheric dispersion, and the image affected by atmospheric dispersion is treated as the result of a convolution of the original image. When direct deconvolution is used to compensate for the blur of the stars, the border effect and the ill-posedness of the problem make the result unacceptable. To solve these problems, we preprocess the images and apply an edge extension method before the deconvolution. Simulated analysis and experimental results from a 300 mm telescope system show that the proposed method can effectively correct the influence of atmospheric dispersion even at a relatively low signal-to-noise ratio (SNR < 2). Compared with the traditional prism correction and fiber correction methods, this technique greatly reduces the complexity of the optical system.
1. Introduction
With the increase in the aperture and focal length of ground-based astronomical telescopes, image quality is strongly affected by differential atmospheric refraction, i.e., atmospheric dispersion, especially at large zenith angles and short wavelengths. Atmospheric dispersion arises because the refractive index of air depends on the wavelength of light: rays of shorter wavelength are refracted more than those of longer wavelength. For a telescope observing a star, this means that the apparent location of the star depends on the wavelength. As a result, over a given filter bandwidth the star appears elongated, with the red end closer to the horizon and the blue end closer to the zenith. In an astronomical observation system with a monochrome CCD camera, the stars become short stripes, which lowers the SNR (signal-to-noise ratio) and obscures fine details in the images.
The traditional method of atmospheric dispersion correction is to add a dispersion-correction lens to the optical system [1–6]. When the telescope aperture is large and wide-field imaging is also required, the design and manufacture of such a lens become difficult: it must accommodate both a large field of view and a large diameter while correcting atmospheric dispersion, which complicates the optical design. The corrector also usually requires high-dispersion optical materials, which tend to have high short-wave absorption and thus reduce the transmittance of the optical system. Moreover, because atmospheric dispersion changes with the observation elevation angle, the corrector must be adjusted as the telescope elevation changes; during this adjustment, the eccentricity tolerance and the imaging quality of the optical system must be maintained, so a high-precision adjustment mechanism is required. Because the optical correction method leads to a complex system and a difficult technique, we instead adopt an image processing technique to solve the problem of atmospheric dispersion. The image processing method reduces the difficulty of telescope optical design, fabrication, testing, and alignment.
Atmospheric dispersion severely limits the resolution of ground-based telescopes. Over the years, various image processing methods have been proposed to overcome related limitations, including speckle imaging, blind deconvolution, deconvolution with wave-front sensing, and phase diversity. Speckle imaging recovers the original target image through statistical analysis of the atmospheric optical point spread function [8–11]. Blind deconvolution estimates both the original object and the point spread function from the degraded image [12–14]. Deconvolution with wave-front sensing compensates for image degradation using a measured wave front [15, 16]. The phase diversity method estimates both the object and the phase distribution of the wave front by collecting one or more pairs of short-exposure images simultaneously [17–19]. These methods have achieved good results on image blur caused by atmospheric turbulence and offer guidance for the atmospheric dispersion correction algorithm. Unlike the point spread function of atmospheric turbulence, however, the dispersion kernel of atmospheric dispersion can be estimated relatively accurately, which makes non-blind deconvolution possible in this paper.
In this paper, an image processing method is proposed to restore images blurred by atmospheric dispersion. The Edlén model is used to estimate the dispersion index of the image, and the image affected by atmospheric dispersion is treated as the result of a convolution of the original image. When direct deconvolution is used to compensate for the blur of the stars, the ill-posedness of the problem makes the result unacceptable. To solve this problem, we apply an edge extension method to the images before the deconvolution. Simulated analysis and experimental results from a 300 mm telescope system show that the proposed method can effectively correct the influence of atmospheric dispersion even at a relatively low signal-to-noise ratio (SNR < 2). Because it effectively corrects the effects of atmospheric dispersion and makes the targets and stars in the image clearer, the method is useful in astronomical observation, space debris tracking, and satellite-to-ground laser communications.
2. Atmospheric Dispersion
Because of gravity and other factors, the density of the atmosphere around the Earth decreases with height. Since the refractive index of the atmosphere is proportional to the air density, the refractive index at different heights is different. When the observed object is at the zenith, the light it emits is perpendicularly incident on the atmosphere and is not deflected. When the object is away from the zenith, however, its rays enter the atmosphere obliquely, and by the law of refraction the light is deflected at each boundary between media of different refractive index. Because the refractive index of the atmosphere also depends on wavelength, light of different wavelengths from the same target is deflected by slightly different amounts. As a result, over a given filter bandwidth the star looks elongated, with the red end closer to the horizon and the blue end closer to the zenith, as shown in Figure 1.
With the increase in ground-based telescope apertures, many factors affect the resolution of the telescope. On the one hand, because of atmospheric refraction, the apparent position of a target in the nonzenith region differs from its actual position. On the other hand, because of atmospheric dispersion, nonzenith targets are widened in the vertical direction: the star point on the image plane spreads from a point into a short spectrum, the energy is no longer concentrated, and the image becomes blurred. Since the refractive index of the atmosphere varies with wavelength, the PSF (point spread function) on the image plane is stretched in one direction, and the star point looks like an ellipse on the image plane.
Many factors affect atmospheric dispersion, such as the wavelength, zenith distance, observation latitude, temperature, pressure, humidity, and altitude. In this paper, we use the Edlén model to calculate atmospheric dispersion. The refractive index of the atmosphere n is computed from the Edlén formula (1) as a function of the wavelength of light λ, the temperature T, and the atmospheric pressure P. If the zenith distance is not too large, the atmospheric dispersion can be approximated by

Δθ ≈ (n1 − n2) tan z, (2)

where z is the zenith distance and n1 and n2 are the atmospheric refractive indices at the two ends of the observation bandwidth. Because of the atmospheric dispersion, the number of diffuse pixels produced by a star photographed with the ground-based telescope is

n = f Δθ / d, (3)

where f is the focal length of the telescope and d is the pixel size of the detector.
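As a concrete illustration, the refractive index and the resulting pixel smear can be estimated numerically. The following Python sketch uses a standard form of the Edlén (1966) dispersion formula together with the approximations (2) and (3) above; the telescope parameters (focal length, pixel size, passband, zenith distance) are illustrative assumptions, not the values of the experimental system.

```python
import math

def edlen_n(lambda_um, t_c=15.0, p_pa=101325.0):
    """Refractive index of air via a standard form of the Edlen (1966) formula.

    lambda_um: vacuum wavelength in micrometres
    t_c:       temperature in degrees Celsius
    p_pa:      pressure in Pa
    """
    s2 = (1.0 / lambda_um) ** 2          # wavenumber squared, um^-2
    # refractivity of standard air (15 C, 101325 Pa, dry)
    ns = 1.0 + 1e-8 * (8342.13 + 2406030.0 / (130.0 - s2)
                       + 15997.0 / (38.9 - s2))
    # temperature/pressure scaling of the refractivity
    x = (ns - 1.0) * (p_pa / 96095.43) * (
        1.0 + 1e-8 * (0.601 - 0.00972 * t_c) * p_pa) / (1.0 + 0.0036610 * t_c)
    return 1.0 + x

def dispersion_pixels(l1_um, l2_um, zenith_deg, f_mm, pixel_um):
    """Smear length in pixels for a passband [l1, l2] at zenith distance z:
    delta_theta = (n1 - n2) tan z, image-plane smear = f * delta_theta / d."""
    dn = edlen_n(l1_um) - edlen_n(l2_um)
    delta = dn * math.tan(math.radians(zenith_deg))   # radians
    return f_mm * 1e3 * delta / pixel_um

# e.g. a 0.4-0.7 um band at z = 60 deg, f = 1500 mm, 9 um pixels (assumed)
n_pix = dispersion_pixels(0.4, 0.7, 60.0, 1500.0, 9.0)
```

For these assumed parameters the smear is on the order of a couple of pixels, consistent with the elongated star images described above.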
3. Dispersion Correction by Image Processing
3.1. Basic Concept
When a ground-based telescope gathers light from distant stars and galaxies, the star points become strips because of atmospheric dispersion. Fine details of the stars can no longer be recognized, and the SNR is low, as shown in Figure 2. A linear system model is adopted to approximate the image degradation process:

g(x, y) = H[f(x, y)] + n(x, y) + b(x, y), (4)

where f(x, y) is the original star image, g(x, y) is the image photographed by the ground-based telescope, n(x, y) is the noise, and b(x, y) is the nonuniform background caused by, e.g., thin cloud and lighting conditions. The image f can be represented with the two-dimensional Dirac delta function:

f(x, y) = ∬ f(α, β) δ(x − α, y − β) dα dβ, (5)

where δ(x, y) is zero everywhere except at the origin (6) and integrates to one (7). Substituting (5) into (4) gives

g(x, y) = H[∬ f(α, β) δ(x − α, y − β) dα dβ] + n(x, y) + b(x, y). (8)

Because H is linear and shift invariant, it is characterized by its point spread function h(x, y) = H[δ(x, y)], and (8) becomes

g(x, y) = h(x, y) * f(x, y) + n(x, y) + b(x, y). (9)

According to the convolution theorem, the frequency-domain counterpart of (9) is

G(u, v) = H(u, v) F(u, v) + N(u, v) + B(u, v), (10)

where the capital letters denote the Fourier transforms of the corresponding functions.
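The degradation model and the convolution theorem can be checked numerically. The sketch below is a noise-free toy example, not the paper's implementation: it applies a circular convolution to a one-dimensional image column with an assumed three-tap kernel, and then recovers the column exactly by division in the frequency domain, which is only possible in this idealized noise-free case.

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.random(64)                  # original column f(x) (synthetic)
h = np.array([0.5, 0.3, 0.2])      # assumed dispersion kernel h(x)

F = np.fft.fft(f)
H = np.fft.fft(h, n=f.size)        # zero-padded kernel spectrum

# degradation (circular convolution, no noise): G(u) = H(u) F(u)
g = np.real(np.fft.ifft(F * H))

# with h known and no noise, frequency-domain division recovers f exactly
f_rec = np.real(np.fft.ifft(np.fft.fft(g) / H))
```

The chosen kernel has no zeros on the unit circle, so the division is well defined; as the next paragraphs explain, real data with noise and borders makes this naive inversion fail.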
The image affected by atmospheric dispersion is thus regarded as the result of a convolution of the original image; once the effect of atmospheric dispersion is corrected, the original image is recovered. As illustrated in (10), a deconvolution operation can in principle recover the original image. However, naive deconvolution suffers from severe artifacts due to border effects, as shown in Figure 3(c). Moreover, deconvolution is ill posed, which also makes the restoration unacceptable; the situation can be formulated as

F̂(u, v) = G(u, v) / H(u, v) = F(u, v) + [N(u, v) + B(u, v)] / H(u, v). (11)

H(u, v) is usually a low-pass operator, so 1/H(u, v) tends to diverge at high frequencies, and the noise and background nonuniformity are greatly amplified. To obtain a usable restoration, the background and noise in the image should be removed first, or candidate targets could be segmented and processed individually. In practice, however, many targets are too faint to be detected confidently, and because of the large field of view there may be hundreds of thousands of stars in a frame, which is too many to process one by one. To solve these problems in image deconvolution, edge extension is performed before the deconvolution. Figure 2 shows the image restoration process.
The image affected by atmospheric dispersion contains noise and a nonuniform background, so image preprocessing is necessary in this method. Morphological operations are used to remove background fluctuations caused by thin cloud and lighting conditions, and highlighted noise is removed with a 3 × 3 median filter; multiframe comparison is also effective for noise removal. For images affected by atmospheric dispersion, the stars spread along the column direction of the image, so the system only needs to extract each column of the image, deconvolve it, and then recombine the columns into a complete image. The role of edge extension is to make the deconvolution effective: the diffusion of target energy caused by atmospheric dispersion differs from a true full convolution at the image borders, so direct deconvolution does not restore the image well. The grayscale profile of a column of the image is shown in Figure 3(a), and the profile of the same column after atmospheric dispersion is shown in Figure 3(b). If a direct deconvolution is performed on the column in Figure 3(b), the result is unacceptable, as shown in Figure 3(c).
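A minimal sketch of this preprocessing, assuming SciPy's morphological and median filters and entirely synthetic data (the frame size, background gradient, streak, and filter window sizes are illustrative assumptions, not the paper's actual parameters):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# synthetic frame: smooth nonuniform background + impulsive noise + a star streak
y, x = np.mgrid[0:128, 0:128]
background = 20.0 + 0.1 * y                        # e.g. a thin-cloud gradient
frame = background + rng.normal(0.0, 1.0, (128, 128))
frame[40:52, 63:66] += 80.0                        # star smeared along a column
frame[rng.random((128, 128)) > 0.999] += 200.0     # isolated "highlighted" noise

# a 3x3 median filter removes the isolated highlighted noise
clean = ndimage.median_filter(frame, size=3)

# morphological (grayscale) opening with a structure larger than any star
# estimates the slowly varying background, which is then subtracted
est_bg = ndimage.grey_opening(clean, size=(25, 25))
flat = clean - est_bg
```

After this step the background is approximately flat and the star streak is preserved, so the column-wise deconvolution described next operates on targets plus near-zero-mean noise.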
3.2. Edge Extension
The edge extension model is adopted to match the image affected by atmospheric dispersion to a direct (full) convolution result as closely as possible; the deconvolution algorithm can then be used to restore the image. The target energy diffusion caused by atmospheric dispersion generally exists only in the vertical direction; if the image is rotated, it must first be multiplied by a rotation matrix to align the dispersion with the column direction.
Atmospheric dispersion expands the star points along the column direction of the image, as shown in Figure 3(b). The affected image is regarded as the result of a convolution between the original image and a convolution kernel, and the edge extension method is used to match the affected image to a full convolution image as closely as possible.
For a full convolution of an image column of N pixels with a convolution kernel of n pixels, the result has N + n − 1 samples, so the number of equations equals the number of unknowns and the system can be solved exactly. In this paper, atmospheric dispersion is regarded as a convolution effect, but the recorded diffusion is not a full convolution: the detector keeps only the central N samples. Therefore, the edge extension method is adopted to supplement the image edges after dispersion so that the column again corresponds to a full convolution. The parts to be supplemented are roughly the [n/2] pixels beyond each end of the column. In the supplementary formula, I_in and I_ext represent the grayscale values of the input image column and of the column after edge extension, respectively, i is the image column coordinate, n is the number of diffuse pixels caused by atmospheric dispersion as calculated by (3), and [n/2] denotes the integer part of n/2.
After edge extension, the image column closely matches the full convolution of the original column with the convolution kernel, and direct deconvolution can be used to restore the image. Stars affected by atmospheric dispersion are smeared along the column direction, as shown in Figure 4(a); the grayscale profile of a column of the dispersed image is shown in Figure 4(b). After the column is extended with the edge extension model described above, the result is shown in Figure 4(c), and the image repaired by direct deconvolution is shown in Figure 4(d). Most of the star points are effectively recovered, but some stripes remain in the image, and where these stripes exist the energy of the star points cannot be effectively restored. Further analysis of the restored image shows that the stripes occur in columns that have a bright target at the edge, so the basic edge extension model is not applicable to such columns. The model is improved as follows: first, detect whether there is a bright target at the edge of the column; if so, remove it and fill the target area with the background gray level; then carry out the edge extension of the column according to the model above; finally, deconvolve each column of the dispersed image to restore the star map. The results of the improved algorithm are shown in Figure 5(b).
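The column-wise restoration can be sketched as follows. This toy example assumes a uniform smear kernel and a dark background, in which case extending each end of the observed column with the background level (zero here) reproduces the full convolution exactly, and polynomial-division deconvolution recovers the column; the real algorithm estimates the extension from the border pixels as described above.

```python
import numpy as np
from scipy.signal import deconvolve

n = 5                                    # diffuse pixels, e.g. from (3) (assumed)
h = np.ones(n) / n                       # assumed uniform smear kernel
x = np.zeros(64)
x[30] = 100.0                            # one star in a dark image column

y_full = np.convolve(x, h, mode="full")  # length 64 + n - 1
y_obs = y_full[n // 2 : n // 2 + x.size] # the detector records only this crop

# edge extension: supplement n//2 pixels at each end so the column again
# matches a full convolution; with a dark background, padding with the
# background level (zero) is sufficient
y_ext = np.pad(y_obs, n // 2)

# direct deconvolution (polynomial division) now recovers the column
x_rec, remainder = deconvolve(y_ext, h)
```

With the star away from the borders, the padded column equals the full convolution exactly, so the quotient matches the original column and the remainder is negligible; a bright target at the column edge breaks this assumption, which is exactly the failure case handled by the improved model above.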
4. Experimental Results
To verify the effectiveness of the algorithm in compensating for atmospheric dispersion, stars were observed with the CCD detector of a 300 mm telescope at the Lijiang Astronomical Observatory in Yunnan Province, China. For the experiment, the dispersion correction lens was removed in order to observe the influence of the dispersion. The exposure time of the camera was 0.1 s, so the motion of the stars can be ignored. Images were captured at different zenith angles, and the algorithm was used to remove the effect of atmospheric dispersion.
At small zenith angles, the effect of atmospheric dispersion is not serious, as shown in Figure 6(a). As the zenith angle increases, as shown in Figure 7(a), the stars become more blurred. After the image processing with edge extension, the stars become clearer and rounder, as shown in Figures 6(b) and 7(b).
At even greater zenith angles, the situation becomes much more serious. Figure 8(a) shows the original image of stars captured by the system at a large zenith angle. Five stars are highlighted (marked with red boxes in Figure 8). Under the influence of atmospheric dispersion, these five targets turn into long dashes in the same direction, which makes it difficult to detect and identify the stars or space targets.
Figure 8(b) shows the image after the compensation. The SNRs of the five targets are calculated and listed in Figure 9. The SNR of a star is defined as

SNR = ((1/N) Σᵢ Iᵢ − μ) / σ,

where N is the number of pixels in the calculation area, Iᵢ is the grayscale value of the i-th target pixel, and μ and σ are the mean grayscale value and the standard deviation of the background noise, respectively.
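A small helper illustrating this SNR definition (the patch sizes and gray levels below are synthetic, not the measured values from Figure 9):

```python
import numpy as np

def snr(target_pixels, background_pixels):
    """SNR of a star: mean target gray level minus the background mean,
    divided by the background standard deviation."""
    t = np.asarray(target_pixels, dtype=float)
    b = np.asarray(background_pixels, dtype=float)
    return (t.mean() - b.mean()) / b.std()

rng = np.random.default_rng(2)
bg = rng.normal(100.0, 5.0, 1000)                 # background patch
star = 115.0 + rng.normal(0.0, 5.0, 25)          # faint target, ~3 sigma
val = snr(star, bg)
```

Concentrating a smeared star's energy back into a compact spot raises the mean target gray level over the same area, which is why the corrected targets in Figure 9 show higher SNR.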
In the high-SNR case, for example target 1, the SNR value increases threefold after the correction; in the low-SNR case, for example target 5, it increases twofold. The experimental results show that the algorithm described above can effectively correct the influence of atmospheric dispersion at different SNRs.
At present, the optical correction method is still the most widely used approach to atmospheric dispersion correction. It can essentially eliminate the influence of atmospheric dispersion on telescope resolution and recovers about 91% of the target energy. The edge extension based image algorithm proposed in this paper has the advantage of computational simplicity, and its restored target energy is about 89%, close to that of the optical method. The algorithm is intended for systems with a large field of view, such as satellite capture and tracking systems for satellite-to-ground laser communications, in which the effect of atmospheric turbulence can be ignored. In the future, we will study an improved algorithm for dispersion correction behind an adaptive optics system.
5. Conclusion
In this paper, we use an image processing algorithm to correct star images influenced by atmospheric dispersion. The objective of the algorithm is to overcome the singular, ill-conditioned nature of deconvolution; to this end, we establish an edge extension model that converts the blurred images into direct (full) convolution images. An experiment was carried out with a 300 mm telescope, and the results show that the algorithm can effectively correct the influence of atmospheric dispersion at different SNRs. Compared with the traditional prism correction and fiber correction methods, this technique greatly reduces the complexity of optical system designs.
Data Availability
The data used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
The research was supported by the National Natural Science Foundation of China (NSFC) under project No. 60978050.
References
- J. Zheng, W. Saunders, J. S. Lawrence, and S. Richards, “On-sky demonstration of a fluid atmospheric dispersion corrector,” Publications of the Astronomical Society of the Pacific, vol. 125, no. 924, pp. 183–195, 2013.
- A. V. Goncharov, N. Devaney, and C. Dainty, “Atmospheric dispersion compensation for extremely large telescopes,” Optics Express, vol. 15, no. 4, pp. 1534–1542, 2007.
- W. Saunders, B. Goran, G. Smith et al., “Prime focus wide-field corrector designs with lossless atmospheric dispersion correction,” Proceedings of SPIE, Astronomical Telescopes + Instrumentation, vol. 9151, Article ID 91511M, 2014.
- M. Bahrami and A. V. Goncharov, “The achromatic design of an atmospheric dispersion corrector for extremely large telescopes,” Optics Express, vol. 19, no. 18, pp. 17099–17113, 2011.
- D. Kopon, L. M. Close, J. R. Males, and V. Gasho, “Design, implementation, and on-sky performance of an advanced apochromatic triplet atmospheric dispersion corrector for the Magellan adaptive optics system and VisAO camera,” Publications of the Astronomical Society of the Pacific, vol. 125, no. 930, pp. 966–975, 2013.
- S. Egner, B. L. Ellerbroek, M. Hart et al., “Atmospheric dispersion correction for the Subaru AO system,” Proceedings of SPIE, Adaptive Optics Systems II, vol. 7736, Article ID 77364V, 2010.
- B. J. Bauman, D. Cramptone, J. E. Larkin, A. M. Moore, C. N. Niehaus, and A. C. Phillips, “The infrared imaging spectrograph (IRIS) for TMT: the atmospheric dispersion corrector,” in Proceedings of the SPIE - The International Society for Optical Engineering, vol. 7735, USA, July 2010.
- A. Labeyrie, “Attainment of diffraction-limited resolution in large telescopes by Fourier analyzing speckle patterns in star images,” Astron and Astrophys, vol. 6, no. 1, pp. 85–87, 1970.
- D. Korff, “Analysis of a method for obtaining near-diffraction-limited information in the presence of atmospheric turbulence,” Journal of the Optical Society of America, vol. 63, no. 8, p. 971, 1973.
- T. W. Lawrence, D. M. Goodman, E. M. Johansson, and J. P. Fitch, “Speckle imaging of satellites at the U.S. Air Force Maui Optical Station,” Applied Optics, vol. 31, no. 29, pp. 6307–6321, 1992.
- Y. Lian-chen and S. Mang-zuo, “Simulation for high-resolution speckle imaging of extended objects,” Opto-Electronic Engineering, vol. 27, no. 4, pp. 7–10, 2000.
- R. G. Lane and R. H. T. Bates, “Automatic multidimensional deconvolution,” Journal of the Optical Society of America A: Optics and Image Science, and Vision, vol. 4, no. 1, pp. 180–188, 1987.
- T. J. Holmes, “Blind deconvolution of quantum-limited incoherent imagery: Maximum-likelihood approach,” Journal of the Optical Society of America A: Optics and Image Science, and Vision, vol. 9, no. 7, pp. 1052–1061, 1992.
- E. Thiebaut and J.-M. Conan, “Strict a priori constraints for maximum-likelihood blind deconvolution,” Journal of the Optical Society of America A: Optics and Image Science, and Vision, vol. 12, no. 3, pp. 485–492, 1995.
- L. M. Mugnier, C. Robert, J.-M. Conan, V. Michau, and S. Salem, “Myopic deconvolution from wave-front sensing,” Journal of the Optical Society of America A: Optics and Image Science, and Vision, vol. 18, no. 4, pp. 862–872, 2001.
- L. M. Mugnier, A. Blanc, and J. Idier, “Phase diversity: a technique for wave-front sensing and for diffraction-limited imaging,” Advances in Imaging and Electron Physics, vol. 141, no. 5, pp. 1–76, 2006.
- Q. Li, S. Liao, H. Wei, and M. Shen, “Restoration of solar and star images with phase diversity-based blind deconvolution,” Chinese Optics Letters, vol. 5, no. 4, pp. 201–203, 2007.
- L. Qiang and S. Mangzuo, “The study of high-resolution imaging of astronomical object based on phase-diversity method,” Acta Astronomica Sinica, vol. 48, no. 1, pp. 113–120, 2007.
- J. Bardsley, S. Jefferies, J. Nagy, and R. Plemmons, “A computational method for the restoration of images with an unknown, spatially-varying blur,” Optics Express, vol. 14, no. 5, pp. 1767–1782, 2006.
- C. Lucchini, J. Gaignebet, and J. Hatat, “Validation of two color laser ranging comparison between: index integrated on the trajectory and index at the station,” in Proceedings of the Ninth International Workshop on Laser Ranging Instrumentation, vol. 2, pp. 628–634, 1996.
Copyright © 2019 Sijie Kong et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.