Abstract

The citizen science project GLORIA (GLObal Robotic-telescopes Intelligent Array) is the first free and open-access network of robotic telescopes in the world. It provides a web-based environment where users can do research in astronomy by observing with robotic telescopes and/or by analyzing data that other users have acquired with GLORIA or obtained from other free-access databases. The network of 17 telescopes allows users to control selected telescopes in real time or to schedule more demanding observations. This paper deals with the new opportunities that the GLORIA project provides to teachers and students at various levels of education. At the moment, educational materials are prepared for events such as a solar eclipse (measuring local atmosphere changes), Aurora Borealis (calculation of the height of the Northern Lights), or the transit of Venus (measurement of the Earth-Sun distance). Students should be able to learn the principles of CCD imaging, spectral analysis, basic calibration such as dark frame subtraction, or advanced methods of noise suppression. Every user of the network can design their own experiment. We propose an advanced experiment aimed at obtaining astronomical image data with high dynamic range. We also introduce methods of objective image quality evaluation in order to discover how HDR methods affect astronomical measurements.

1. Introduction

The GLORIA (GLObal Robotic-telescopes Intelligent Array) project is targeted at both professional and amateur scientists. While the first group has a particular interest in the scientific content of image data, the second group may also be interested in imaging itself, including imaging with high dynamic range. High dynamic range imaging (HDRi) is a very active area of research and of the consumer electronics market today. HDRi methods, which may be used for both multimedia and scientific image data [1, 2], could help observers discover unexpected features in their images. HDRi is primarily used in scenes where the radiance covers a range that exceeds what can be captured in a single film or digital exposure.

Astrophotography would seem to be a natural fit for this technique, since the objects we photograph cover an enormous range of brightness, from the Sun (magnitude −26) to faint nebulae and galaxies (magnitude 10 and beyond). Objects like nebulae or galaxies often hide stars that, due to the low dynamic range of the image sensor, disappear in the much brighter object. This problem can be solved by using a shorter exposure time, but the shape and details of the object itself will then most probably be lost. Another frequently used technique is stacking, the combining of frames with the same exposure time [3]. The main goal of stacking is to increase the signal-to-noise ratio in order to detect faint sources [4].

Although a high dynamic range image can be captured directly using special, expensive high dynamic range sensors, these sensors are not very suitable for scientific purposes since the end user has no control over the processes inside the camera. Therefore, various HDR synthesis methods combining multiple low dynamic range (LDR) images of the same scene taken with different exposure times have been introduced in the last couple of years. Among all the methods, exposure bracketing [5, 6] is the most widely used, as no hardware modification is required. LDR frames record different ranges of light intensities and should be acquired within a short period of time to minimize changes in the lighting conditions as well as object movement. Then, provided the radiometric calibration of the nonlinear camera is known, an HDR radiance map is recovered by using the best-exposed information in the different LDR frames.

The paper is organized as follows. Section 2 gives a brief overview of the GLORIA network and the experiments currently available. Section 3 introduces the high dynamic range technique and a method of camera response function estimation. Sections 4 and 5 present the results of measurements on processed images, and finally Section 6 concludes the paper.

2. GLORIA Network

As mentioned in the Introduction, GLORIA aims to become the first free-access global network of robotic telescopes allowing everybody around the world to do research in astronomy [7]. The project partners put their 17 robotic telescopes (see Figure 1) into this network as a seed, in order to do ICT (Information and Communication Technologies) research helping GLORIA to grow in the future and to continue operation after the life of the project. This should allow all citizens to do astronomy research by giving them free access to an expensive world-wide infrastructure of robotic telescopes and to a free database of astronomical images taken by the members of the GLORIA robotic telescope network. Users who love astronomy thus have the opportunity to do research, perhaps generate new knowledge and make it available to everybody, and learn and teach astronomy.

With the provided software, anybody can design a web application for conducting research into a specific astronomical issue. Applications can directly access individual components of the telescopes: cameras, focusers, and filter wheels. Citizen scientists, therefore, may implement advanced methods of image processing, like the HDRi technique, which is the subject of this paper.

3. High Dynamic Range Imaging in Astronomy

Astronomical image data have very specific features which make common multimedia-oriented HDR methods almost unusable:

(i) The exposure values are typically very high; that is, the noise in the image is increased.
(ii) The noise has a significant component based on photon counting (Poisson noise), the signal-to-noise ratio may differ greatly in different regions of the image, and stellar objects may be distorted differently on different frames.
(iii) The image content is affected by atmospheric turbulence and cosmic rays.
(iv) Telescopes (or lenses) tend to have long focal lengths; that is, the photographed stellar object must be well guided during the exposure.
(v) Wide-field lenses tend to have a spatially variant point spread function.
(vi) The typically used 16-bit quantization increases numerical complexity and decreases the efficiency of postprocessing methods.
(vii) The spatial distribution of pixel intensity is nonhomogeneous and pixels are typically grouped in isolated spots, the stellar objects.

3.1. Image Acquisition

HDRi is based on the construction of a radiance map of the imaged scene from a set of frames with different exposures. For a detailed overview see Figure 2. There are many approaches to choosing exposure sequences [8, 9]. The quality of the radiance map is strongly affected by the choice of proper LDR frames [10] and by their number and exposure times [11, 12]. Combining the frames leads to better contrast mapping of the scene and it changes the noise level in the synthesized HDR image.

Let us have a set of images of the same scene obtained with exposure times $\Delta t_i$, that is, each image differing only in its relative exposure ratio $k_i$. Thus, a set of functions
$$f_i(x) = f\bigl(k_i q(x)\bigr),$$
where $k_i$ is a scalar constant for each photoquantity $q(x)$, is known as a Wyckoff set [13]. For each image in a Wyckoff set there exists $f_i(x, y)$ representing the pixel at location $(x, y)$ of the image with the $i$th exposure.

The photoquantity $q$ is the measure of the quantity of light integrated on a sensor element of the camera system. The photoquantity is weighted by the sensor response and thus can accurately contain all the information we need to recover a mapping $f$ that is linearly responsive to the quantity of light arriving at the sensor. The mapping $f$ is known as the camera response function.

3.2. Camera Response Function

Let an image captured by a digital camera be a nonlinear function of the sensor exposure, defined as the multiplication of the sensor irradiance $E$ and the exposure time $\Delta t$ (see Figure 3). Then, the set of images of the same scene with different exposure values can be described as
$$Z_i(x, y) = f\bigl(E(x, y)\,\Delta t_i\bigr),$$
where $f$ is the camera response function (CRF), $x, y$ are the indexes of a pixel in the image, and $i$ is the index of a frame in the set.

To map pixel values $Z_i(x, y)$ to irradiance values $E(x, y)$, we need to find the inverse function $f^{-1}$ so that the sensor exposure
$$E(x, y)\,\Delta t_i = f^{-1}\bigl(Z_i(x, y)\bigr).$$
If the exposure time $\Delta t_i$ is known, one can obtain the irradiance $E(x, y)$ using (2).

The function $f$, or $f^{-1}$, respectively, may be obtained by precise measurement of the digital camera, for example, using a uniformly illuminated chart containing patches of known reflectance [14]. However, this process is quite complicated and can be performed only under laboratory conditions. Numerous papers have shown that a set of differently exposed images of the same scene contains enough information to recover the response $f$ from the images themselves [15, 16].

Mann et al., authors of the papers regarded as seed papers for the area of quantigraphic HDR, assumed that the CRF is a gamma function [8],
$$f(q) = \alpha + \beta q^{\gamma},$$
based on the density curve of photographic film. The parameter $\alpha$ can be estimated by taking an image with the lens covered, which is then subtracted from the other images. The equation is solved by using comparametric equations [17]; when the function $f$ is monotonic, the comparametric plot of $f(kq)$ against $f(q)$ can be expressed as a monotonic function not involving $q$. The idea of comparametric image compositing is widely recognized as a computationally efficient HDRi method [18].
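Under this model, a minimal sketch of the exponent estimation could look as follows. It assumes two dark-subtracted frames of the same scene whose exposure times differ by a known ratio $k$, so that the power law reduces to a constant pixel ratio $k^{\gamma}$; the function name, thresholds, and saturation level are illustrative assumptions, not part of the original method.

```python
import numpy as np

def estimate_gamma(z_short, z_long, k, sat_level=60000):
    """Estimate the exponent gamma of the power-law CRF f(q) = alpha + beta * q**gamma.

    z_short, z_long : dark-subtracted frames of the same scene whose exposure
                      times differ by the known ratio k (> 1); with alpha removed,
                      the model implies z_long / z_short = k**gamma.
    """
    a = z_short.astype(float)
    b = z_long.astype(float)
    # use only pixels that are reasonably exposed and unsaturated in both frames
    mask = (a > 50) & (b > 50) & (a < sat_level) & (b < sat_level)
    gamma = np.log(np.median(b[mask] / a[mask])) / np.log(k)
    return gamma
```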

Grossberg and Nayar showed that the recovery of $f$ exhibits an exponential ambiguity [19]. This ambiguity can only be broken by adopting certain assumptions. Debevec and Malik [20] showed that the reconstruction of the function $f$ depends only on the determination of several functional values of $f^{-1}$. Mitsunaga and Nayar showed that $f^{-1}$ can be modeled by a polynomial [21].

Once the CRF is estimated, the irradiance map can be expressed as
$$E(x, y) = \frac{\sum_{i=1}^{P} w\bigl(Z_i(x, y)\bigr)\, f^{-1}\bigl(Z_i(x, y)\bigr)/\Delta t_i}{\sum_{i=1}^{P} w\bigl(Z_i(x, y)\bigr)},$$
where $w$ is a weighting function involved to emphasize the smoothness and fitting terms toward the middle of the CRF curve and $P$ is the number of frames in the set. The selection of the weighting function will be further discussed in Section 4; a comprehensive overview of various weighting functions may be found in [22].
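A minimal sketch of this weighted merge is given below, assuming the inverse CRF and the weighting function are supplied as callables; the names and the division-by-zero guard are illustrative.

```python
import numpy as np

def merge_hdr(frames, exposure_times, inv_crf, weight):
    """Weighted merge of LDR frames into an irradiance map, in the spirit of Eq. (5).

    frames         : list of 2-D arrays with pixel values Z_i(x, y)
    exposure_times : list of exposure times dt_i in seconds
    inv_crf        : callable mapping a pixel-value array to f^{-1}(Z)
    weight         : callable mapping a pixel-value array to weights w(Z)
    """
    num = np.zeros(frames[0].shape, dtype=float)
    den = np.zeros(frames[0].shape, dtype=float)
    for z, dt in zip(frames, exposure_times):
        w = weight(z)
        num += w * inv_crf(z) / dt
        den += w
    return num / np.maximum(den, 1e-12)  # guard against pixels with zero total weight
```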

3.3. The Signal-to-Noise Ratio in HDR Imaging

Noise is always included in captured images and its influence can be described by well-known parameters [23]. The noise and image defects have many components and they depend on the sensor quality, the exposure value, the electronic circuits, and the photon noise present in the photon flux from the detected scene. Image acquisition in astronomy is a never-ending compromise among exposure value, noise occurrence, and imaging of details in a high contrast scene [24]. The distribution of values can be evaluated by known parameters. The first one is the root mean square error (RMSE) of the $i$th frame of the set,
$$\mathrm{RMSE}_i = \sqrt{\frac{1}{MN}\sum_{x=1}^{M}\sum_{y=1}^{N}\bigl(Z_i(x, y) - \bar{Z}_i\bigr)^2},$$
where the symbol $\bar{Z}_i$ is used for the mean value of the image and $M \times N$ is the image size. The noise influence is described by the signal-to-noise ratio (SNR):
$$\mathrm{SNR}_i = \frac{\bar{Z}_i}{\mathrm{RMSE}_i}.$$
The SNR of the HDR radiance can be expressed as
$$\mathrm{SNR}_{\mathrm{HDR}} = \frac{\sum_{i=1}^{P} A E \Delta t_i}{\sqrt{\sum_{i=1}^{P} \sigma_i^2}},$$
where $P$ is the number of images in the LDR set, $A$ is the active area of the CCD pixel, $E$ is the radiance, and $\sigma_i^2$ is the variance of the $i$th frame in the set.

These relations can be simplified when a linear camera response function is assumed. For more details about SNR in HDR, see [25, 26].
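The per-frame statistics and the theoretical HDR SNR defined in this section can be computed with a few lines of NumPy. The sketch below follows the reconstructed forms above and, for simplicity, treats the radiance and pixel area as scalars.

```python
import numpy as np

def rmse(frame):
    """RMSE of a frame with respect to its mean value, as defined in Section 3.3."""
    return np.sqrt(np.mean((frame - frame.mean()) ** 2))

def snr(frame):
    """Signal-to-noise ratio of a single frame: mean signal over RMSE."""
    return frame.mean() / rmse(frame)

def snr_hdr(radiance, exposure_times, variances, pixel_area=1.0):
    """Theoretical SNR of the merged radiance, as in Eq. (8): the accumulated
    signal over the quadrature sum of the per-frame noise variances."""
    signal = sum(pixel_area * radiance * dt for dt in exposure_times)
    return signal / np.sqrt(sum(variances))
```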

3.4. Image Aligning

To combine a set of differently exposed images of the same scene into one HDR frame, it is necessary to find the correspondence between pixels in the individual exposures. Basically, the process of aligning two or more images requires choosing one of the images as a reference and finding the geometric transformation that spatially registers the other images to the reference image. Registered images are often convolved with an appropriate kernel to degrade the point spread function of the registered image to match that of the reference image. The convolved images are also scaled to match the intensity and background sky level of the reference image [27]. Registration is typically performed with subpixel accuracy [28]. Phillips and Davis provide details about how aligning can be done with the IRAF package [29]. There are also recent works on simultaneous HDR reconstruction and optic flow computation [30–32].
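For frames that differ only by a translation, a simple way to estimate the offset is the peak of the FFT-based cross-correlation. The sketch below works at integer-pixel accuracy and leaves out the subpixel refinement and PSF matching mentioned above; it is an illustration, not the registration procedure used by the cited packages.

```python
import numpy as np

def estimate_shift(reference, image):
    """Estimate the (dy, dx) translation that registers `image` onto `reference`
    via the peak of their FFT-based cross-correlation (integer-pixel accuracy)."""
    cc = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(image)))
    peak = np.unravel_index(np.argmax(np.abs(cc)), cc.shape)
    dims = np.array(cc.shape)
    shift = np.array(peak, dtype=int)
    shift[shift > dims // 2] -= dims[shift > dims // 2]  # wrap to signed offsets
    return shift

# applying the estimated shift (a circular shift is enough for small offsets):
# aligned = np.roll(image, tuple(estimate_shift(reference, image)), axis=(0, 1))
```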

For our purposes, we can assume that the single frames in the datasets will be acquired within a relatively short time (i.e., as one observing session or as a batch) using one GLORIA telescope; that is, the PSF will be constant.

3.5. Tone Mapping

Since a reconstructed HDR image has a much larger dynamic range than most display devices, tone mapping operators (TMO) are involved in the process of image reproduction. These operators, global or local, transform the dynamic range of the image to the dynamic range of a particular display device; more about TMO design can be found in [5].
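As an illustration only (not the operator used later in the paper), a simple global tone mapping often applied to astronomical data is an asinh stretch of the normalized radiance; the softening parameter is an assumed value.

```python
import numpy as np

def asinh_tonemap(radiance, softening=0.01):
    """Global tone mapping sketch: normalize the radiance map to [0, 1] and
    apply an asinh stretch so that faint structure becomes visible while
    bright stars are compressed; `softening` is an illustrative parameter."""
    x = (radiance - radiance.min()) / (radiance.max() - radiance.min() + 1e-12)
    return np.arcsinh(x / softening) / np.arcsinh(1.0 / softening)
```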

4. Experiments

For the purpose of the measurement, we used the GLORIA interactive experiment with the BART telescope located in Ondřejov, Czech Republic [33]. BART (an acronym for Burst Alert Robotic Telescope) is one of the oldest fully autonomous robotic telescopes in the world; it has been operating since early 2000. Nowadays it uses a German-type Losmandy Titan mount, which holds two optical telescopes: the narrow-field (NF) one, a 0.25 m Schmidt-Cassegrain from Meade, and the wide-field (WF) one, a 0.1 m Maksutov-Cassegrain from Rubinar.

We used the wide-field camera G21600 (equipped with the Kodak KAF1603ME full-frame CCD image sensor) from Moravian Instruments to obtain two sets of LDR images with different exposure times: 32 s, 64 s, 128 s, 256 s, and 512 s. Since there is a constant ratio between the exposure times, the images can be considered a Wyckoff set. For galaxy M33, see Figure 4, and for galaxy M101, see Figure 5. Four LDR frames (32 s, 64 s, 128 s, and 256 s) of both image sets were used for the reconstruction of the HDR image, which was later compared with the 512-second exposure. All single frames were calibrated with proper dark and flat frames.
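A minimal sketch of this dark and flat calibration step is shown below; the FITS file names are hypothetical placeholders, and for simplicity the same dark is subtracted from the flat (in practice a matching flat dark or bias would be used). Only astropy is assumed, for FITS reading.

```python
import numpy as np
from astropy.io import fits

def load_calibrated(light_path, dark_path, flat_path):
    """Load a raw light frame and apply dark subtraction and flat-field correction.
    Dark frames are assumed to match the exposure time of the light frame."""
    light = fits.getdata(light_path).astype(float)
    dark = fits.getdata(dark_path).astype(float)
    flat = fits.getdata(flat_path).astype(float)
    flat_norm = (flat - dark) / np.median(flat - dark)   # normalized flat field
    return (light - dark) / np.maximum(flat_norm, 1e-6)  # avoid division by zero

# one exposure set of the experiment (the 32 s ... 256 s frames used for the HDR merge)
exposure_times = [32, 64, 128, 256]
frames = [load_calibrated(f"m33_{t}s.fits", f"dark_{t}s.fits", "flat.fits")
          for t in exposure_times]
```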

4.1. Camera Response Function Estimation

Pixel selection for the purpose of camera response calculation is a delicate matter. Obviously we need at least two differently exposed images; more are better. Selected pixels should meet the following requirements:

(i) Having a good spatial distribution.
(ii) Covering the entire irradiance range in the image as well as possible.
(iii) Being selected from zones with small irradiance variance.

Some sources state that, for a range of 256 values, typical for a color channel of a multimedia image, it is sufficient to use 50 pixels to obtain an overdetermined system of equations [34]. As mentioned before, a typical astronomical image has 16-bit resolution and the spatial distribution of pixel intensities differs from the case of multimedia image data. Therefore, due to the large variance of pixel values over the LDR set, we propose choosing multiple patches across the frames, containing stellar objects of different sizes and fluxes, and calculating the CRF as a median. Figure 6 shows the estimation of the function $f^{-1}$ from the observed images using the procedure proposed by Reinhard et al. [5] in comparison to the method proposed by Mann.
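A rough sketch of this patch-median idea, reusing the power-law estimator from Section 3.2, is shown below. The patch locations, thresholds, and minimum sample count are hypothetical choices of the observer, not values prescribed by the paper.

```python
import numpy as np

def median_crf_gamma(frame_pairs, patch_slices, k, sat_level=60000):
    """Estimate the CRF exponent on several patches containing stellar objects
    of different sizes and fluxes, and return the median of the estimates.

    frame_pairs  : list of (z_short, z_long) dark-subtracted frames whose
                   exposure times differ by the known ratio k
    patch_slices : list of (row_slice, col_slice) tuples selected by the user
    """
    estimates = []
    for z_short, z_long in frame_pairs:
        for rows, cols in patch_slices:
            a = z_short[rows, cols].astype(float)
            b = z_long[rows, cols].astype(float)
            good = (a > 50) & (b > 50) & (a < sat_level) & (b < sat_level)
            if good.sum() < 20:          # skip patches with too few usable pixels
                continue
            estimates.append(np.log(np.median(b[good] / a[good])) / np.log(k))
    return np.median(estimates)
```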

4.2. HDR Image Reconstruction

For the reconstruction of the radiance map, we used the procedure described by (5). An important point of the procedure is obviously the choice of the weighting function mentioned in Section 3.2. Debevec and Malik proposed in [20] a hat function that assigns higher weights to middle-gray values, as they are farthest from both underexposed and saturated outputs. Given the nature of astronomical data, an irradiance map obtained with this weighting function is not very useful. For this reason, we used a weighting function based on the standard deviation, estimated directly from the LDR frames [35]. We also included a term lowering the significance of saturated pixels.
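A possible form of such a weighting map, assuming inverse-variance weights from a local standard deviation estimate and a smooth saturation roll-off, is sketched below; the window size, saturation level, and softness are illustrative and not the exact values used for the figures in this paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def weight_from_std(frames, sat_level=60000, sat_softness=500.0):
    """Per-frame weighting maps: inverse-variance weights estimated from the
    local standard deviation of each LDR frame, multiplied by a smooth term
    that suppresses pixels approaching saturation (16-bit sensor assumed)."""
    weights = []
    for z in frames:
        z = z.astype(float)
        local_mean = uniform_filter(z, size=7)
        local_sq = uniform_filter(z ** 2, size=7)
        local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 1e-6))
        w = 1.0 / local_std ** 2                                     # inverse-variance term
        w *= 1.0 / (1.0 + np.exp((z - sat_level) / sat_softness))    # saturation roll-off
        weights.append(w)
    return weights
```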

The central part of the final HDR frame can be seen in Figure 7(b). One can see a comparison between the HDR frame and the LDR frame captured with an exposure time of 512 s. The galaxy and bright stars have lower brightness in the HDR frame, and the structure of the galaxy is more visible.

4.3. Stellar Profile Analysis

Methods of objective quality evaluation can be divided into two major groups. The first group comprises classic objective metrics, for example, SNR or MSE, established in Section 3. The second group comprises methods based on data-evaluation algorithms. These methods include fitting stellar profiles (measuring the PSF (point spread function) of the stars), aperture photometry, astrometry (position error), or successful detection of the stars.

There are two common functions for fitting stellar profiles, according to Bendinelli et al. [36]. The aim is to match a star's profile as well as possible with a Gaussian or Moffat profile. If a star were ideal, it would be represented by a small dot. However, because of many different distortions, including the application of generally nonlinear postprocessing algorithms, the dot is "blurred" all around, and the star's profile is close to a Gaussian function. The Gaussian function is
$$I(r) = I_0 \exp\left(-\frac{r^2}{2\sigma^2}\right)$$
and the Moffat function is
$$I(r) = I_0 \left(1 + \frac{r^2}{\alpha^2}\right)^{-\beta},$$
where $r$ stands for the radial distance from the middle of the star (with maximal brightness $I_0$), $\sigma$ is the standard deviation, $I(r)$ is the brightness at the radial distance $r$, and $\alpha$, $\beta$ are the Moffat coefficients. The center of a star usually has a profile closer to the Gaussian function, while the more distant parts are closer to the Moffat function. So an ideal fitting function combines the Gaussian and Moffat profiles. For evaluation, the parameter FWHM (Full Width at Half Maximum) is typically used, defined as twice the value of $r$ at which $I(r) = I_0/2$.
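Such a radial fit can be sketched with scipy.optimize.curve_fit; the starting values are illustrative, and the FWHM of the Gaussian follows from $I(r) = I_0/2$ at $r = \sigma\sqrt{2\ln 2}$.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_profile(r, i0, sigma):
    """Radial Gaussian profile of a star."""
    return i0 * np.exp(-r ** 2 / (2.0 * sigma ** 2))

def moffat_profile(r, i0, alpha, beta):
    """Radial Moffat profile of a star (shown for completeness)."""
    return i0 * (1.0 + (r / alpha) ** 2) ** (-beta)

def fit_fwhm_gaussian(r, intensity):
    """Fit a Gaussian to a measured radial profile and return the FWHM,
    i.e. twice the radius at which the brightness drops to half of I0."""
    (i0, sigma), _ = curve_fit(gaussian_profile, r, intensity,
                               p0=[intensity.max(), 2.0])
    return 2.0 * abs(sigma) * np.sqrt(2.0 * np.log(2.0))
```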

Figure 9 shows how the high dynamic range reconstruction changes the shape of a stellar object. Although this change usually does not affect the results of algorithms that search for stellar objects, the HDR algorithm may obviously be further optimized for wide-field astronomical image data. The objects' ellipticity seems to be statistically unchanged; for details see Figure 10.

Figure 11 shows the distribution of stellar objects' fluxes. The fluxes grow with longer exposures and objects tend to become overexposed. It seems that the reconstruction with high dynamic range provides approximately the same number of objects, but the objects are not overexposed; for a detailed analysis of two selected stars, see Figure 8.

5. Results

In order to evaluate the obtained frame with high dynamic range, various tests were performed. Primarily these are global statistics, summarized in Table 1 for the M33 image set and Table 2 for the M101 image set, respectively: the number of objects detected by SExtractor [37], the median of the FWHM values (Gaussian fit), the median of the stellar objects' ellipticity, and the signal-to-noise ratio (SNR).
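The paper used the SExtractor binary for the detections; as an illustration only, equivalent statistics can be sketched with the Python sep library (a wrapper around the SExtractor core routines). The detection threshold and the Gaussian-based FWHM approximation below are assumptions, not the exact settings behind Tables 1 and 2.

```python
import numpy as np
import sep  # Python source-extraction library based on SExtractor

def global_statistics(image):
    """Per-frame statistics of the kind reported in Tables 1 and 2:
    object count, median FWHM estimate, median ellipticity, and SNR."""
    data = image.astype(np.float32)
    bkg = sep.Background(data)
    objects = sep.extract(data - bkg.back(), 1.5, err=bkg.globalrms)
    ellipticity = 1.0 - objects["b"] / objects["a"]
    # FWHM approximation from the second moments, assuming Gaussian profiles
    fwhm = 2.0 * np.sqrt(np.log(2.0) * (objects["a"] ** 2 + objects["b"] ** 2))
    snr_value = data.mean() / np.sqrt(np.mean((data - data.mean()) ** 2))
    return len(objects), np.median(fwhm), np.median(ellipticity), snr_value
```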

Figure 12 shows the relationship between the number of stellar objects found in the image and the signal-to-noise ratio of the image. There is a noticeable benefit of the HDRi approach: the ratio between the number of objects detected and the SNR is better for the HDR image than for any LDR frame.

According to (8), the HDR frames of M33 and M101 should, in the ideal case, reach the theoretical SNR values given by the equation. Although these theoretical values were not achieved, the measured SNR of the HDR frame is better than the SNR of the 512-second exposure for M33, and the same holds for M101.

6. Conclusions

High dynamic range imaging offers a new opportunity for users of GLORIA, the global network of robotic telescopes. In this paper, we presented an evaluation of an irradiance map reconstructed from a set of frames with astronomical content. We compared two methods of camera response function estimation and proposed a simple rule for selecting pixels for the estimation from astronomical image data. The results of both methods are very similar; the estimation of the CRF strongly depends on the data present in the dataset. This is mostly due to the fact that it is impossible to select equally appropriate samples from different sets of LDR frames. It should be noted that the small differences between the estimated CRFs have no significant impact; the choice of the weighting function has the biggest influence on the quality of the resulting image.

According to the performed tests, it seems that high dynamic range reconstruction may have a positive impact on the objective quality of image data. Although HDR reconstruction is a nonlinear operation, the reconstructed images kept their nature in terms of photometric and astrometric properties. Moreover, it is evident that the SNR of the reconstructed irradiance map is significantly better than the SNR of any LDR image. The structure of the galaxy in the HDR image is clearly visible, while bright stars are not saturated. The distortion of the shape of the stellar objects, which is evident in the resulting HDR image, may be due to the fact that the BART telescope does not have guiding and longer exposures are slightly blurred. There is undoubtedly a need for further tests, such as an inspection of the noise models in the LDR frames and a comparison between the noise ratios of stacked images and HDR-reconstructed images.

Competing Interests

The authors declare that there are no competing interests regarding the publication of this paper.

Acknowledgments

This work was supported by Grant no. 14-25251S, Nonlinear Imaging Systems with Spatially Variant Point Spread Function, of the Czech Science Foundation.