Abstract

Accurate and reliable measurements of the 3D flame temperature profile are highly desirable for an in-depth understanding of combustion and pollutant formation processes. In this paper, a measurement method for reconstructing a 3D flame temperature profile was proposed using a light field camera. It combines the convolution imaging model and the radiative transfer equation and takes into account the emission, absorption, and scattering characteristics of a semitransparent flame. According to the point spread function characteristics of the imaging system, the number and positions of the refocus planes were set by jointly considering reconstruction accuracy and efficiency. The feasibility of the present method was demonstrated by numerical simulation and by an experiment on a candle flame. The method achieves the reconstruction of a 3D asymmetric flame profile through a single exposure of a single camera, which overcomes the complexity of a multicamera system and the time delay of a conventional scanning camera system.

1. Introduction

At present, combustion still provides most of the energy consumed in the world. It is widely used in power plant boilers, chemical plant reactors, jet engines, gas turbines, internal combustion engines, and other applications [1-4]. Despite the continuing search for alternative energy, the combustion of fossil fuels will remain important for a considerable time to come [5]. Thus, there is an urgent need to understand the subtle processes of combustion in order to increase combustor efficiency and to control pollutants, specifically soot particles and NOx [6]. Accurate and reliable measurements of the flame temperature profile are highly desirable to achieve an in-depth understanding of these combustion and pollutant formation processes [7]. To investigate the behavior of the flame in detail, three-dimensional (3D) distributions of the flame temperature have become increasingly important to combustion engineers.

For 3D temperature reconstruction, the common practice is to use a multicamera system, or to scan with a single camera, to capture multiple 2D image sequences from different views [8-12]. A multicamera imaging system suffers from limitations such as restricted space for layout, difficulty in system calibration, slow data transmission and storage, inconsistent imaging quality, difficulty in synchronization, and high hardware cost. A single-camera scanning system, in turn, sacrifices temporal resolution to obtain multiview 2D image sequences [13-17]; it is therefore only suitable for 3D temperature measurement of a relatively stable flame. In recent years, a new type of image acquisition device, the light field camera, has been developed based on computational imaging theory [18-20]. Compared with a conventional camera, it records the intensities and directions of the light field simultaneously by placing a microlens array between the sensor and the main lens. It thus obtains multiview 2D image sequences with a single camera through one exposure, which provides a basis for 3D temperature reconstruction, and it has consequently drawn increasing attention worldwide.

Niu et al. [21] simultaneously reconstructed the 3D temperature distribution and optical properties (absorption and scattering coefficients) of a cylindrical participating medium from a raw light field image using a hybrid least-squares QR decomposition-stochastic particle swarm optimization (LSQR-SPSO) algorithm. The temperature distribution was assumed to be axisymmetric, and the absorption and scattering coefficients were assumed to be uniform. Qi et al. [22] reconstructed the 3D temperature and absorption coefficient distributions simultaneously using a hybrid algorithm combining the Levenberg-Marquardt method with boundary constraint and nonnegative least squares (LMBC-NNLS). Although axisymmetric and uniform constraints were not imposed in the reconstruction, the spatial resolution and retrieval efficiency were low. Zhao et al. [23] presented a method based on optical sectioning tomography for 3D flame temperature measurement. The method has high computational efficiency and spatial resolution; however, it did not take into account the absorption and scattering of the flame.

In this paper, a single light field camera is used for imaging a candle flame. Then, several 2D image sequences at different depths are obtained by a digital refocusing technique based on a raw light field image. Furthermore, the transmitted radiative source terms at different preset depths are obtained by image deconvolution. By eliminating the effects of the absorption and scattering, the emission intensity distribution at these preset depths is obtained. Finally, the 3D flame temperature profile is reconstructed by associating the emission luminance and the temperature. The paper is organized as follows. In Section 2, the digital refocusing principle of a light field camera and imaging model of emitting, absorbing, and scattering flame are introduced. In Section 3, the feasibility of the proposed method is verified by several numerical simulations; then, two experimental systems are set up to measure the absorption and scattering coefficients and 3D temperature profile of a candle flame, respectively. Finally, the main conclusions and perspectives are provided.

2. Fundamental Measurement Principles

2.1. Imaging Model of a Semitransparent Flame

In the conventional imaging process, the images of scenes at different depths overlap, and the absorption and scattering of the intervening space are not considered. In other words, the rays emitted from a scene do not change their energy or direction before they reach the entrance pupil of the imaging system. As shown in Figure 1, a 3D flame of finite thickness extends along the z-axis, which is generally selected as the main optical axis of the imaging system. If the imaging system forms an image on a given image plane, the gray distribution of that image can be considered as the overlap of the intensity distributions of the focus plane and the other, defocused planes. For a linear shift-invariant imaging system, the gray distribution of the image is the convolution of the original intensity distribution with the corresponding point spread function (PSF) according to Fourier optics theory (Equation (1)). In Equation (1), each object plane is associated with a PSF with respect to the image plane, the two in-plane coordinates are measured relative to the center of the image spot, and the combination of the intensity distribution and the PSF is a convolution operation.
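For clarity, a minimal LaTeX sketch of the convolution imaging model described above is given below. The symbols g_k, f_m, and h_{m,k} are introduced here purely for illustration and are not the notation of the original manuscript; the sketch reflects the verbal description, not the exact original Equation (1).

```latex
% Convolution imaging model (illustrative notation):
% g_k     : gray distribution of the image focused on plane k
% f_m     : original intensity distribution of object plane m
% h_{m,k} : PSF of object plane m with respect to image plane k
\begin{equation}
  g_k(x, y) \;=\; \sum_{m=1}^{M} f_m(x, y) \otimes h_{m,k}(\Delta x, \Delta y),
\end{equation}
% where (\Delta x, \Delta y) are coordinates relative to the center of the
% image spot and \otimes denotes a 2D convolution.
```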

If the flame is divided into several sections, and the raw light field image can be used to compute a conventional photograph at any depth, Equation (1) can be discretized accordingly (Equation (2)). In the discretized form, two indices denote the horizontal and vertical pixel positions of the refocus image, and a third index identifies the refocus image; three further indices locate the flame grid along the x-, y-, and z-axes; two relative indices give the horizontal and vertical offsets from the center of the image spot; and the two summation limits are the total number of refocus images and the total number of grids along the z-axis.
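As an illustration of this discretized forward model, the following Python sketch builds a synthetic refocus image as the sum of per-section intensity distributions convolved with section-dependent PSFs. The array names and the Gaussian PSF shape are assumptions made for the example and do not represent the paper's implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size, sigma):
    """Simple isotropic Gaussian PSF used as a stand-in for the true defocus PSF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def refocus_image_forward(sections, sigmas):
    """Discretized convolution model: the refocus image is the overlap of every
    flame section convolved with its PSF relative to the chosen refocus plane.

    sections : list of 2D arrays, intensity distribution of each flame section
    sigmas   : list of PSF widths, one per section (depends on defocus distance)
    """
    image = np.zeros_like(sections[0], dtype=float)
    for f_m, sigma in zip(sections, sigmas):
        h_mk = gaussian_psf(size=31, sigma=max(sigma, 1e-3))
        image += fftconvolve(f_m, h_mk, mode="same")
    return image

# Example: three 64x64 sections; the middle one is in focus (smallest PSF width).
sections = [np.random.rand(64, 64) for _ in range(3)]
g_k = refocus_image_forward(sections, sigmas=[3.0, 0.5, 3.0])
```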

The flame is an emitting, absorbing, and scattering medium and interacts with any ray that passes through it. Therefore, the imaging formula of the flame should be rewritten to include a transmitted radiative source term and an outgoing radiative source term (Equations (3)-(5)). The outgoing source term depends on the absorption and scattering coefficients of the flame; this model does not consider the spatial or spectral dependence of these radiative properties. It also depends on the blackbody radiative intensity of the flame, which is related to the temperature through Stefan-Boltzmann's law, and on the radiative intensities arriving from all incident directions, which are obtained by solving the radiative transfer equation and are weighted by the scattering phase function between the incident and exit directions and by the solid angle of each of the discretized directions. The transmitted source term is the outgoing source term attenuated, according to Beer's law, over the distance that the ray travels in the flame.
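A plausible explicit form of these source terms, written in notation introduced here for illustration (kappa_a absorption coefficient, sigma_s scattering coefficient, I_b blackbody intensity, Phi scattering phase function), is sketched below. It follows standard radiative transfer practice and is consistent with the description above, but it is not copied from the original equations.

```latex
% Outgoing radiative source term of a control volume (illustrative notation):
\begin{equation}
  S \;=\; \kappa_a I_b
        \;+\; \frac{\sigma_s}{4\pi}
              \sum_{l'=1}^{N_\Omega} I(\boldsymbol{s}^{\,l'})\,
              \Phi(\boldsymbol{s}^{\,l'}, \boldsymbol{s})\,\Delta\Omega^{l'} ,
\end{equation}
% Transmitted radiative source term after Beer-law attenuation over the
% in-flame path length s_f toward the camera:
\begin{equation}
  S_t \;=\; S \, \exp\!\bigl[-(\kappa_a + \sigma_s)\, s_f\bigr].
\end{equation}
```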

2.2. Digital Refocusing of a Light Field Camera

The imaging model of a conventional camera is a two-dimensional projection subspace sampling of the seven-dimensional plenoptic function [24]. It has a high sampling capability only in the 2D spatial dimensions, and its ability to sample information in the other dimensions is extremely limited [25]. By contrast, the light field camera places a microlens array between the sensor and the main lens of a conventional camera (see Figure 2). These microlenses image the subapertures of the main lens; in other words, the camera achieves directional resampling. If coordinate systems are established on the main lens plane and the microlens array plane, respectively, then a sampling ray can be parameterized by a four-dimensional function of its intersection points with these two planes.

For a light field camera, the spatial resolution depends on the number of microlenses, and the directional resolution is determined by the number of pixels covered by each microlens. The energy irradiated to a point on the microlens array plane is equal to a weighted integral of the intensities coming through the main lens [26] (Equation (6)). In this relation, the weighting involves the distance between the main lens plane and the microlens array plane, the intensity along the ray traveling from a position on the main lens plane to a position on the microlens array plane, and the angle between that ray and the normal of the microlens array plane; an additional variable is defined to shorten the equations.
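The standard form of this weighted integral, as usually written in light field photography, is sketched below in notation introduced here for illustration: (u, v) and (s, t) are coordinates on the main lens plane and the microlens array plane, l is the separation between the two planes, and theta is the angle to the plane normal. The original Equation (6) is assumed to take this form.

```latex
% Irradiance on the microlens array plane (standard light field formulation):
\begin{equation}
  E_{l}(s, t) \;=\; \frac{1}{l^{2}}
      \iint L(u, v, s, t)\,\cos^{4}\theta \;\mathrm{d}u\,\mathrm{d}v .
\end{equation}
```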

The light field recorded by the light field camera can be used to compute a conventional photograph at any refocus depth, which does not need to coincide with the physical distance between the main lens plane and the microlens array plane. This is done by reparameterizing the recorded light field onto the new refocus plane. Applying Equation (6) to the reparameterized light field, the refocus image is obtained by summing the intensities of all resampling points along each oblique line.
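A minimal Python sketch of shift-and-sum digital refocusing from subaperture images is given below. The scale factor alpha (the ratio of the virtual refocus depth to the physical microlens-array distance), the shape of the subaperture stack, and the use of scipy.ndimage.shift are assumptions made for illustration; the paper's own implementation may differ in detail.

```python
import numpy as np
from scipy.ndimage import shift

def refocus(subapertures, alpha):
    """Shift-and-sum refocusing of a 4D light field.

    subapertures : array of shape (Nu, Nv, Ns, Nt); subapertures[u, v] is the
                   subaperture image seen through main-lens position (u, v).
    alpha        : ratio between the virtual refocus depth and the physical
                   main-lens-to-microlens-array distance.
    """
    Nu, Nv, Ns, Nt = subapertures.shape
    uc, vc = (Nu - 1) / 2.0, (Nv - 1) / 2.0   # center of the main-lens aperture
    out = np.zeros((Ns, Nt))
    for u in range(Nu):
        for v in range(Nv):
            # Each subaperture image is translated in proportion to its
            # distance from the aperture center before being accumulated.
            du = (u - uc) * (1.0 - 1.0 / alpha)
            dv = (v - vc) * (1.0 - 1.0 / alpha)
            out += shift(subapertures[u, v], (du, dv), order=1, mode="nearest")
    return out / (Nu * Nv)

# Example: refocus a random light field slightly behind the nominal focal plane.
lf = np.random.rand(9, 9, 64, 64)
img = refocus(lf, alpha=1.05)
```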

3. Results and Discussions

3.1. Numerical Simulation

In order to verify the feasibility of the measurement method proposed in this paper, several numerical cases are designed to reconstruct flame temperatures with different characteristics. Firstly, the absorption and scattering characteristics of the flame are not considered. As shown in Figure 3, three point light source arrays are arranged on three layers at depths of 107 mm, 112 mm, and 117 mm, respectively. The size of each point light source array is , and it is divided into grids. A light field camera is placed with its main optical axis passing through the center of the point light source arrays, and its focus plane is selected as the central section of the arrays. The focal length of the main lens is , and the diameter of the aperture is . The distance between the main lens plane and the microlens array plane is . The focal length of the microlens is , and the size of its aperture is . The resolution of the microlens array is . The number of pixels covered by each microlens is ; the collection of these pixels is called a macropixel. The size of each pixel is .

According to the principle of geometrical optics, the raw light field image of these point light source arrays is obtained as shown in Figure 4. It can be seen from Figure 4 that the point light source arrays are imaged inversely on the CCD sensor. Unlike the image from a conventional camera, the light field image is composed of circular spots with a uniform gray level when the point light source array is on the focal plane. This is due to the interaction of the main lens aperture and the microlens array. When the point light source array is on a defocus plane, the circular spots in the light field image are not full, especially at the edges.

According to the principle of digital refocusing described above, subaperture images with a resolution of pixels are extracted from the raw light field image. Then, three refocused images at depths of 107 mm, 112 mm, and 117 mm are obtained by shifting and summing these subaperture images. As shown in Figure 5, the letter on the refocus plane is sharp while the others are blurred, which confirms the correctness of the refocusing method in this paper. The PSFs are calculated from the distances between the object planes and the refocused image planes, and the distribution of the point light source array of each section is then obtained by a deconvolution operation. As shown in Figure 6, the reconstructed distribution is very similar to the original distribution, which demonstrates the feasibility of the measurement method.
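As an illustration of the per-section deconvolution step, the following Python sketch applies a simple frequency-domain, Wiener-type regularized deconvolution of one refocused image with its computed PSF. This is a generic stand-in for the deconvolution operation mentioned above; the function name and the regularization constant are assumptions, not the paper's algorithm.

```python
import numpy as np

def wiener_deconvolve(image, psf, k_reg=1e-2):
    """Frequency-domain Wiener-type deconvolution of a blurred image by a known PSF.

    image : 2D blurred (refocused) image
    psf   : 2D PSF, normalized internally to unit sum
    k_reg : regularization constant suppressing noise amplification where |H| is small
    """
    # Pad the PSF to the image size and roll it so its peak sits at (0, 0).
    psf_pad = np.zeros_like(image, dtype=float)
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf / psf.sum()
    psf_pad = np.roll(psf_pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_pad)
    G = np.fft.fft2(image)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + k_reg)   # Wiener-type filter
    return np.real(np.fft.ifft2(F_hat))

# Usage sketch: restored_section = wiener_deconvolve(refocused_image, section_psf)
```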

To illustrate the effect of the absorption and scattering coefficients of the flame on light field imaging, a 3D axisymmetric flame with isotropic scattering is selected. Setting the center of the flame as the origin of coordinates, the size of the flame is and the grids are divided as . The temperature distribution of the flame is defined by the following formula and shown in Figure 7.

The main parameters of the light field camera are shown in Table 1. According to the principle of geometrical optics, rays of light are randomly sampled from each pixel, and their intersections with the flame outer surface and their directions of incidence are recorded. Inside the flame, the light path is generated randomly according to the extinction coefficient using the Monte Carlo method, and absorption or scattering of the ray is determined by a random number and the scattering albedo. If the ray leaves the flame or is absorbed by the flame, the trace of this ray ends. The index of the control volume is recorded when the ray is absorbed by that control volume. In this way, the relationship between each pixel and the control volumes of the flame can be established. Unlike conventional imaging of opaque objects, light from inside the flame can pass through the front surface of the flame and form an image on the sensor, which makes the imaging more complicated; however, it is precisely this property that associates the flame image with every control volume of the flame and makes reconstruction of the flame interior possible. According to the digital refocusing technique mentioned above, the refocus image at any depth can be obtained from the raw light field image.
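The following Python sketch illustrates the Monte Carlo treatment of a single ray inside a homogeneous flame: the free path is sampled from the extinction coefficient, and a random number compared with the scattering albedo decides between absorption and isotropic scattering. The box geometry, its dimensions, and all names are assumptions made for the example and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def trace_ray(origin, direction, beta, albedo, box_min, box_max, max_events=1000):
    """Trace one ray through a homogeneous emitting/absorbing/scattering box.

    beta   : extinction coefficient (1/mm), sum of absorption and scattering
    albedo : scattering albedo, sigma_s / beta
    Returns the absorption position (to look up its control volume), or None if
    the ray exits the flame.
    """
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    for _ in range(max_events):
        # Free path sampled from the Beer-law distribution: s = -ln(R) / beta.
        s = -np.log(rng.random()) / beta
        pos = pos + s * d
        if np.any(pos < box_min) or np.any(pos > box_max):
            return None                      # ray leaves the flame
        if rng.random() > albedo:
            return pos                       # absorbed: record this control volume
        # Otherwise scattered: draw a new direction (isotropic scattering).
        mu = 2.0 * rng.random() - 1.0        # cosine of the polar angle
        phi = 2.0 * np.pi * rng.random()
        sin_t = np.sqrt(1.0 - mu * mu)
        d = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), mu])
    return None

# Example: a ray entering an assumed 20 x 20 x 60 mm box along -z,
# with beta = 0.1 /mm and albedo = 0.2.
hit = trace_ray([0.0, 0.0, 30.0], [0.0, 0.0, -1.0], beta=0.1, albedo=0.2,
                box_min=np.array([-10, -10, -30]), box_max=np.array([10, 10, 30]))
```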

Firstly, the influence of the extinction coefficient (the sum of the absorption and scattering coefficients) of the flame is investigated. The scattering albedo (the ratio of the scattering coefficient to the extinction coefficient) is kept constant at 0.5, and the extinction coefficient is set to 0.001 mm⁻¹, 0.01 mm⁻¹, 0.1 mm⁻¹, and 1 mm⁻¹, respectively. The light field images of the flame with these coefficients are shown in Figure 8. It can be seen that as the extinction coefficient increases, the gray levels of the light field image gradually decrease. This is because light emitted from the high-temperature zone on the back side can hardly pass through the front of the flame and reach the CCD sensor; when the extinction coefficient is large, the light field image is essentially an image of the front of the flame only.

The extinction coefficient of the flame is then kept constant at 0.1 mm⁻¹, and the scattering albedo is set to 0.8, 0.5, 0.2, and 0, respectively. The resulting light field images are shown in Figure 9. It can be seen that as the scattering albedo decreases, the gray levels of the light field image gradually increase. This is because the absorption coefficient is positively correlated with the emission capacity, whereas the attenuation capacity is related only to the extinction coefficient. The extinction coefficient is the same in all four conditions, while the absorption coefficient increases as the scattering albedo decreases.

Therefore, the influence of the absorption and scattering coefficients of the flame on light field imaging is obvious and cannot be ignored. In this paper, a flame with an extinction coefficient of 0.1 mm⁻¹ and a scattering albedo of 0.2 is taken as an example. The refocus image can be regarded as an overlap of the images, at the current refocus plane, of all sections in the flame. In the usual image-processing approach, when the temperature profile of one section needs to be reconstructed, the refocus image at that section is deconvolved with the corresponding PSF, and a deblurring technique is then applied to improve the reconstruction quality [27, 28]. However, this approach is only suitable for the 3D reconstruction of discrete media and does not work for the 3D reconstruction of continuous media such as a flame. An overall deconvolution approach is needed to eliminate the contributions from the other sections.

In the proposed method, the number and locations of the refocus sections are not limited by the grid division of the flame. The depth range of the refocus sections is expanded to amplify the differences between the PSFs, and the number of refocus sections is increased to make fuller use of the acquired information. In general, the accuracy of the reconstruction increases with the number of refocus sections, but at the cost of reconstruction efficiency. In this paper, 41 sections are divided equally within of the flame center. The refocus image sequences are shown in Figure 10. It can be seen from Figure 10 that as the distance from the focus plane increases, the refocus images become increasingly blurred.

Then, the PSFs of the 11 reconstructed sections of the flame with respect to these 41 refocus sections are calculated and used to build a large deconvolution matrix. Using truncated singular value decomposition (TSVD), the transmitted radiative source terms are obtained from the 41 refocus image sequences. To demonstrate the effect of measurement error on the reconstructed results, random noise is added to each pixel of the raw light field image. The measured value of the intensity distribution of the raw light field image is expressed as the true value perturbed by the measurement error multiplied by a random number drawn from a uniform distribution on the interval [-1, 1].
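A minimal Python sketch of the TSVD inversion and of the noise model described above is given below. The matrix A stands in for the assembled deconvolution matrix, k_trunc is the truncation level, and the multiplicative noise form Y_mea = Y_exact * (1 + gamma * zeta), with zeta uniform on [-1, 1], is inferred from the verbal description rather than copied from the original equation.

```python
import numpy as np

rng = np.random.default_rng(1)

def add_measurement_noise(y_exact, gamma):
    """Multiplicative measurement noise: Y_mea = Y_exact * (1 + gamma * zeta),
    with zeta drawn uniformly from [-1, 1] for every pixel."""
    zeta = rng.uniform(-1.0, 1.0, size=y_exact.shape)
    return y_exact * (1.0 + gamma * zeta)

def tsvd_solve(A, y, k_trunc):
    """Truncated SVD solution of A x = y, keeping only the k largest singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k_trunc] = 1.0 / s[:k_trunc]
    return Vt.T @ (s_inv * (U.T @ y))

# Example with a synthetic ill-conditioned system standing in for the
# refocus-image / source-term deconvolution problem.
A = np.random.rand(200, 120)
x_true = np.random.rand(120)
y = add_measurement_noise(A @ x_true, gamma=0.10)   # 10% measurement error
x_rec = tsvd_solve(A, y, k_trunc=60)
```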

The reconstruction results with no measurement error and with 10% measurement error are shown in Figures 11(a) and 11(b), respectively. The positive and negative signs of the depth coordinate in Figure 11 indicate the directions toward and away from the camera, respectively, and its magnitude indicates the distance from the reconstructed plane to the flame center. It can be seen from Figure 11 that the results without measurement error are smoother than those with 10% measurement error. The effect of the extinction of the flame is also visible: the farther a plane is from the camera, the more its light energy is attenuated.

According to ray tracing based on the geometrical optics principle, the distance traveled in the flame by the ray from each control volume can be calculated. The extinction ratio at each position is then calculated according to Beer's law, and the outgoing radiative source term is obtained. Substituting it into the radiative transfer equation, the radiative intensity at each position in each direction is calculated by the finite volume method (FVM) [29, 30]. The radiative transfer equation involves the position and direction vectors, the cosines of the angles between the propagation direction and the positive x-, y-, and z-axes, the radiative intensity at a given position in a given direction, and the outgoing radiative source term at that position.
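The radiative transfer equation referred to above, written in a standard discrete-ordinates form consistent with the quantities described (direction cosines mu, xi, eta; extinction coefficient beta = kappa_a + sigma_s; outgoing source term S), is sketched below in notation introduced here for illustration.

```latex
% Radiative transfer equation along direction s with direction cosines (mu, xi, eta):
\begin{equation}
  \mu \frac{\partial I(\boldsymbol{r}, \boldsymbol{s})}{\partial x}
  + \xi \frac{\partial I(\boldsymbol{r}, \boldsymbol{s})}{\partial y}
  + \eta \frac{\partial I(\boldsymbol{r}, \boldsymbol{s})}{\partial z}
  \;=\;
  -\,\beta\, I(\boldsymbol{r}, \boldsymbol{s}) \;+\; S(\boldsymbol{r}, \boldsymbol{s}),
\end{equation}
% where beta = kappa_a + sigma_s is the extinction coefficient and S(r, s)
% collects the emission and in-scattering contributions of the control volume.
```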

Substituting the calculated radiative intensity back into Equation (5), the blackbody radiative intensity of the flame can be calculated, and the flame temperature can be obtained according to Stefan-Boltzmann's law. When there is no measurement error, the maximum absolute reconstruction error is , and the average and standard deviation of the absolute reconstruction error are . When there is measurement error, the maximum absolute reconstruction error is , and the average and standard deviation of the absolute reconstruction error are . The reconstructed 3D temperature profiles of the flame are shown in Figure 12. It can be seen from Figure 12 that, when there is measurement error, the temperature reconstruction error is larger than the reconstruction error of the outgoing radiative source term, especially in the low-temperature regions. Compared with the prescribed temperature range ( K), however, the absolute error is acceptable. This demonstrates the feasibility of the proposed measurement method.
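The temperature retrieval step relies on the standard relation between the total blackbody radiative intensity and temperature; the exact form used in the paper is not reproduced here, but a minimal sketch consistent with the statement that I_b is related to T through Stefan-Boltzmann's law is:

```latex
% Total blackbody intensity and the inverted temperature retrieval once I_b
% has been reconstructed (sigma is the Stefan-Boltzmann constant):
\begin{equation}
  I_b \;=\; \frac{\sigma T^{4}}{\pi}
  \qquad\Longrightarrow\qquad
  T \;=\; \left( \frac{\pi I_b}{\sigma} \right)^{1/4},
  \qquad \sigma = 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}} .
\end{equation}
```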

3.2. Experimental Measurement

A candle flame is used as the measurement object to demonstrate the performance of the proposed method under laboratory conditions. Before the temperature profile is reconstructed, it is necessary to measure the absorption and scattering coefficients, calibrate the optical parameters of the measurement system, and perform some pretreatment. The experimental system for measuring the absorption and scattering coefficients of the candle flame is shown in Figure 13.

As shown in Figure 13, a laser is used to irradiate the candle flame, and the transmitted and scattered light is received by a diffused white screen. An aperture is used to block stray light around the laser spot, and an attenuation slice is used to reduce the laser intensity to the same level as that of the candle flame. The diameter of the candle flame is . The size of the diffused white screen is , and the distance between the screen and the center of the candle flame is . The laser, aperture, attenuation slice, candle flame, and screen are adjusted to the same height; the reflected intensity distribution from the screen is then recorded by a conventional camera with fixed parameters. Figure 14 shows the reflected intensity distributions from the diffused white screen under three different conditions.

Assuming that the absorption and scattering coefficients of the flame do not change with spectrum or space, they can be solved from two equations formed over a collection of pixels within the imaging area of the laser spot on the diffused white screen, using the gray values of these pixels (indexed by their horizontal and vertical positions) in Figures 14(a)-14(c).
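The exact pair of equations is not reproduced here. As one plausible illustration consistent with Beer's law and the three recorded conditions, assumed here to be the laser spot without the flame, the laser spot through the flame, and the flame emission alone, the transmittance of the laser spot constrains the extinction coefficient as sketched below (Pi is the pixel set of the spot, D the path length through the flame); this assignment of Figures 14(a)-14(c) and the notation are assumptions.

```latex
% Illustrative Beer-law constraint on the extinction coefficient (assumed setup):
% G_a : laser spot without the flame, G_b : laser spot through the flame,
% G_c : flame emission alone (laser off), D : path length through the flame.
\begin{equation}
  \frac{\displaystyle\sum_{(i,j)\in \Pi} \bigl[G_b(i,j) - G_c(i,j)\bigr]}
       {\displaystyle\sum_{(i,j)\in \Pi} G_a(i,j)}
  \;=\;
  \exp\!\bigl[-(\kappa_a + \sigma_s)\, D\bigr];
\end{equation}
% a second, analogous relation involving the light scattered outside the spot
% is needed to separate kappa_a from sigma_s.
```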

Based on the gray values of the reflected image on the diffused white screen, the absorption and scattering coefficients of the candle flame are roughly measured as and . This shows that absorption in the candle flame is much stronger than scattering. The imaging model of the present light field camera is shown in Figure 2. The dynamic range of the camera (Imperx B2320) is 12 bits. Some of the geometric parameters of the imaging system, such as , , and , are difficult to measure directly and need to be obtained by calibration. Using the measurement system to image a calibration plate, as shown in Figure 15, the key parameters of the measurement system can be obtained according to geometrical optics theory. These key geometric and optical parameters are listed in Table 2.

Unlike in the simulation, the number of pixels covered by each microlens in a real experiment is not an odd number, or even an integer. Therefore, to increase the accuracy of the imaging model, a circular array is fitted to the pixels covered by each microlens; the center of each macropixel and the number of pixels covered by each microlens are then obtained, as shown in Figure 16(b). Since the candle flame has an elongated shape, many pixels in the full image are not used. To improve the reconstruction efficiency, the raw light field image is cropped to a rectangle containing the candle flame, as shown in Figure 16(a).
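A minimal Python sketch of fitting a circle to the set of lit pixels behind one microlens, using an algebraic least-squares circle fit, is given below. The thresholding step and the variable names are assumptions made for illustration and do not represent the paper's calibration code.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares circle fit: solve for center (a, b) and radius r
    from the linear model x^2 + y^2 = 2 a x + 2 b y + (r^2 - a^2 - b^2)."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs**2 + ys**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

def macropixel_center(raw_patch, threshold=0.1):
    """Estimate the center and radius of one macropixel from a cropped patch of
    the raw light field image by fitting a circle to its illuminated pixels."""
    ys, xs = np.nonzero(raw_patch > threshold * raw_patch.max())
    return fit_circle(xs.astype(float), ys.astype(float))
```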

In order to ensure that the refocusing error is within 1 pixel, a concentric square area with a side of 15 pixels is selected within each macropixel for the refocusing process. The calculation area is divided into grids. The refocus image planes are set at 20 equally spaced positions from to . The resulting refocus image sequences are shown in Figure 17. It can be seen from Figure 17 that when the refocus plane is on the far side of the flame from the camera, the differences between the refocus images are not very apparent, whereas the differences are obvious when the refocus plane is on the near side. Therefore, more refocus planes should be set on the side near the camera.

According to the aforementioned method, the transmitted radiative source terms and the temperature profile of the candle flame are reconstructed as shown in Figure 18. Through blackbody furnace calibration, the transmitted radiative source term ranges from 4333 W/m3 to 26423 W/m3. It can be seen from Figure 18(a) that the distributions of the transmitted radiative source terms, which are reconstructed directly from the image sequences, change drastically, especially at the borders. The image of the outer flame is the brightest, and the intensity gradually decreases toward the surrounding air and the inner flame, which conforms to the theoretical structure of a candle flame. Following the procedure of the numerical simulation described above, the outgoing radiative source term, the radiative intensities, the blackbody radiative intensity, and the temperature are determined successively. The temperature ranges from 704 K to 1083 K. As can be seen from Figure 18(b), the trend of the temperature is much gentler than that of the transmitted radiative source terms, because the transmitted radiative source term scales with the fourth power of the temperature. The temperature profile is therefore judged to be reliable from this qualitative analysis. Owing to the lack of a widely accepted standard method for measuring the flame temperature distribution, the measured temperature result is not compared with measurements by other methods in this paper.

4. Conclusions

In this paper, a measurement method for a 3D flame temperature profile was proposed using a light field camera. It combines the convolution imaging model and the radiative transfer equation and takes into account the emission, absorption, and scattering characteristics of a semitransparent flame. Using digital image processing, the 3D distribution of the transmitted radiative source terms of the flame is reconstructed directly; the 3D temperature profile is then reconstructed indirectly by solving the radiative transfer equation. During the reconstruction, the information extracted from the raw light field image is increased by adding refocus planes without being limited by the depth grids of the flame; at the same time, reconstruction efficiency is taken into account, and reconstruction accuracy is improved by setting the refocus planes according to the PSF characteristics of the imaging system. The present method was shown to be feasible through numerical simulation and experiment. It achieves the reconstruction of a 3D asymmetric flame profile through a single exposure of a single camera and overcomes the complexity of a multicamera system and the time delay of a conventional scanning camera system. Future work will focus on verifying the reconstruction accuracy of the absolute temperature distribution and on applying the method to a wider variety of flames.

Nomenclature

:The collection of pixels in the imaging area of the laser spot
:The size of the diffused white screen (mm)
:The diameter of the aperture (mm)
:The diameter of the microlens aperture or the flame (mm)
:The energy irradiated to the microlens array plane (W)
:The focal length of the main lens (mm)
:The focal length of the microlens (μm)
:The gray values of the pixel
:The point spread function
:The radiation intensity (W/m2·sr)
:The radiation intensity of blackbody (W/m2·sr)
:The index of the flame in axes , , and
:The distance between the main lens plane and the microlens array plane or refocus plane (m)
:The index of the refocus image
:The indexes of the pixel in the and directions
:The indexes of the pixel in the and directions relative to the center of the image spot
:The total number
:The size of the pixel (μm)
:The position vector
:The distance that the ray travels in the flame (mm)
:The transmitted radiative source terms (W/m3)
:The outgoing radiative source term (W/m3)
:The temperature (K)
:The time (s)
:The coordinates on the main lens plane
:The size of the flame in directions (mm)
:The coordinates of components
:The and coordinates relative to the center of the image spot.
Greek Symbols
:The geometrical parameter for refocus depth
:The angle of resampling points in the coordinate system
:The thickness of the flame (mm)
:The scattering phase function
:The angle between ray and the microlens array plane normal (rad)
:The azimuthal angle (rad)
:The absorption coefficient (m−1)
:The wave length (μm)
:The cosines of the angle between and the positive direction of the , , and axes
:The polar angle (rad)
:The scattering coefficient (m−1)
:The radiation direction
:The size of the solid angle (sr).
Subscripts
:Figures 14(a), 14(b), and 14(c)
:CCD
:Flame
:The control volume of the flame
:Refocus positions
:The refocus section
:Microlens and main lens
:The microlens
:The screen and the candle flame
:Object plane with respect to image plane
:Incident direction to exit direction .
Superscripts
:The indexes of directions along and .

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The supports of this work by the National Natural Science Foundation of China (No. 51676044, No. 51976038), the Aero-engine Thermal Environment and Structure Key Laboratory of the Ministry of Industry and Information Technology (No. CEPE2018005), and the Fundamental Research Funds for the Central Universities (2242019K1G024) are gratefully acknowledged.