Abstract

By means of UWB radar sensors, the tasks of material characterisation and object recognition can be performed on the basis of a prior imaging of the whole environment. A UWB version of the microwave ellipsometry method is applied to estimate the permittivity of homogeneous objects. The object recognition task is performed with bistatic sensor nodes on the basis of radar measurements. Simulation-based performance evaluations show a very robust behavior due to suitable preprocessing of the radar data. The applications comprise the detection of fire sources, the detection of metallic objects hidden under clothing, and the recognition of building structures.

1. Introduction

The background of the work described below is the vision of a scenario in which security robots perform inspection tours in unknown or frequently changing indoor environments. The robots have the task of identifying hazardous situations such as a fire or other emergencies. Based on localization and imaging, the position and the rough shape of all objects in the supervised room have to be determined first. Afterwards, an inspection of each object with a polarimetric UWB radar sensor has to determine further details of the object's material and shape and has to classify the object.

2. Material Characterization

As described in more detail in [1], the material characterization is based on a method known from optics, ellipsometry, which has been adapted for implementation with a UWB radar system. The basic idea of the ellipsometry method is to use the relation between the reflection coefficients of two orthogonal polarizations. Assuming a wavelength larger than the dimensions of the roughness of a smooth or slightly rough surface, the reflection can be described by means of the modified Fresnel formulas. From that, for nonmagnetic materials the permittivity can be determined as

$$\varepsilon_r = \sin^2\theta \left[ 1 + \tan^2\theta \left( \frac{1-\rho}{1+\rho} \right)^{2} \right], \qquad \rho = \frac{r_\mathrm{p}}{r_\mathrm{s}} = \frac{E_{\mathrm{r,p}}/E_{\mathrm{i,p}}}{E_{\mathrm{r,s}}/E_{\mathrm{i,s}}}.$$

The angle $\theta$ is the angle of incidence. $E_\mathrm{i}$ and $E_\mathrm{r}$ are the electric field strengths of the incident and reflected waves, respectively, at the object's surface.
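As an illustration of this relation, the following minimal Python sketch evaluates the above formula for a given incidence angle and a given ratio $\rho = r_\mathrm{p}/r_\mathrm{s}$ of the two reflection coefficients; the function name and the example values are purely illustrative and not part of the measurement system described in [1].

    import numpy as np

    def permittivity_from_ellipsometry(rho, theta_deg):
        """Estimate the relative permittivity of a nonmagnetic, smooth
        dielectric from the ratio rho = r_p / r_s of the parallel and
        perpendicular reflection coefficients measured at the incidence
        angle theta_deg (classical ellipsometry relation)."""
        theta = np.deg2rad(theta_deg)
        return np.sin(theta)**2 * (1.0 + np.tan(theta)**2 *
                                   ((1.0 - rho) / (1.0 + rho))**2)

    # Example: a measured ratio of -0.3 at 60 degrees incidence
    print(permittivity_from_ellipsometry(-0.3, 60.0))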

Figures 1 and 2 exemplify measurements performed on a medium-density fibreboard and a sand-lime brick wall. There the reflectance factor of both polarizations is plotted over the viewing angle. The limiting factor at high angles of inclination is the high level of crosstalk between the two antennas. This degrading effect could be reduced moderately by using UWB antennas with very small opening angles and by installing an absorbing aperture in the direct sight path between the antennas. At lower angles of inclination the difference between the two polarizations becomes very small and hampers the measurement. Further limiting factors are the humidity content of the material and the required minimum size and form of the material.

3. Application in Fire Detection

In [2] it was shown that radiometric microwave measurements are able to identify fire sources with solid burning material. When scanning the environment, results similar to those obtained with IR cameras can be achieved, with the advantage that such measurements can also be performed in very dusty or smoke-filled environments.

The main weakness of the radiometric measurement method is the influence of the antenna fill factor when observing small objects. Because of the differing distances to the objects, the antenna fill factor may vary over several orders of magnitude. To overcome this problem, a joint microwave and UWB radar fire detection method was proposed in [3, 4], which relies on the determination of the object's size and its distance to the sensor. Several object surfaces with different angles at the corners were investigated.
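The role of the fill factor can be illustrated with a simple radiometric mixing model in which the measured antenna temperature is a fill-factor-weighted sum of the object and background brightness temperatures; knowing the object size and distance then allows the object temperature to be recovered. The following Python sketch assumes an idealized conical beam; the beam width, function names, and numbers are assumptions for illustration only and do not reproduce the sensor model of [3, 4].

    import numpy as np

    def fill_factor(object_width_m, distance_m, beamwidth_deg):
        """Fraction of an idealized conical antenna footprint covered by
        an object of the given width at the given distance (clipped to 1
        when the object fills the whole beam)."""
        footprint = 2.0 * distance_m * np.tan(np.deg2rad(beamwidth_deg) / 2.0)
        return min((object_width_m / footprint)**2, 1.0)

    def object_temperature(t_measured, t_background, fill):
        """Invert the radiometric mixing model
        T_measured = fill * T_object + (1 - fill) * T_background."""
        return (t_measured - (1.0 - fill) * t_background) / fill

    f = fill_factor(object_width_m=0.3, distance_m=2.0, beamwidth_deg=20.0)
    print(object_temperature(t_measured=305.0, t_background=293.0, fill=f))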

The distance could be estimated with satisfactory results, with an error of only a few millimetres, and the overall dimensions could likewise be determined with a small error. For this purpose the objects were scanned in a plane parallel to the object's surface with a bistatic setup of two double-ridged horn antennas. Figure 3 shows the resulting wave-front extraction of a triangular object.

4. Object Recognition

The development of the recognition algorithm was performed simulation-based, using ray-tracing algorithms whose basics are described in [5, 6]. For the first investigations, 12 objects with simple canonical and noncanonical complex cross sections were assumed. The investigated scenarios comprised antennas moving on a circular track around an object. The UWB antennas used are double-ridged horn antennas in a bistatic configuration with small opening angles. Figure 4 shows an example of a simulation-based radargram of an object with a rectangular cross section of 30 cm by 12 cm.

For comparison, Figure 5 shows the radargram of a real object with the same dimensions obtained during a UWB measurement campaign. For the other objects as well, the similarity between simulations and measurements was high. The angle-dependent impulse responses are combined such that each impulse response forms one column of the radargram, in which the time of flight is expressed as a distance and plotted versus the rotation angle of the antennas. The colour bar indicates the power of the impulse responses in dB. The radargram is treated as an image, so that image processing methods can be applied. Several preprocessing steps for enhanced robustness are performed, starting with an extraction and time windowing to focus on object-relevant data.
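A minimal Python sketch of this radargram assembly is given below; it assumes a quasi-monostatic geometry for the time-of-flight-to-distance conversion and uses placeholder window limits, so it only illustrates the preprocessing idea rather than the exact processing chain.

    import numpy as np

    C0 = 299_792_458.0  # speed of light in m/s

    def build_radargram(impulse_responses, fs, d_min=0.2, d_max=2.0):
        """Stack angle-dependent impulse responses column-wise into a
        radargram: rows correspond to distance, columns to the rotation
        angle of the antennas.  Amplitudes are converted to dB and a
        distance window keeps only object-relevant data.
        `impulse_responses` has shape (n_samples, n_angles), sampled at
        rate `fs`."""
        n_samples = impulse_responses.shape[0]
        # one-way distance, assuming a quasi-monostatic geometry
        distance = np.arange(n_samples) / fs * C0 / 2.0
        mask = (distance >= d_min) & (distance <= d_max)
        power_db = 20.0 * np.log10(np.abs(impulse_responses[mask, :]) + 1e-12)
        return distance[mask], power_db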

To achieve a high resolution of the radar system, wave fronts have to be detected accurately and, in the case of multiple reflections, overlapping pulses must be separated by a suitable algorithm. Taking only the maximum values of the measured signal to estimate the time of flight is not the best solution, since the magnitude of a peak can nearly vanish due to destructive interference of different reflections. Furthermore, a high peak in the signal can be caused by antenna ringing. Thus, a correlation with a reference signal is a more suitable approach. For this purpose the cross-correlation is computed between a reference pulse and the measured pulse under investigation. The reflection of a large metal plate is used to obtain the reference pulse, while the target information is contained in the backscattered reflection. This correlation-based algorithm was introduced in [7].
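The following Python sketch illustrates such a correlation-based detector; the 20% relative peak height is a placeholder and not necessarily the threshold used in [7].

    import numpy as np
    from scipy.signal import correlate, find_peaks

    def detect_wavefronts(measured, reference, fs, min_rel_height=0.2):
        """Correlation-based wave-front detection: the measured impulse
        response is cross-correlated with a reference pulse (e.g. the
        reflection of a large metal plate) and the peaks of the
        normalized correlation magnitude are taken as candidate times of
        flight.  Returns the delays in seconds and the peak magnitudes."""
        corr = correlate(measured, reference, mode='full')
        lags = np.arange(-len(reference) + 1, len(measured))
        mag = np.abs(corr)
        mag /= mag.max()
        peaks, props = find_peaks(mag, height=min_rel_height)
        return lags[peaks] / fs, props['peak_heights']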

The detected wave fronts, plotted as distance versus the rotation angle of the antennas, are shown in Figure 6 for a complex cascaded object. High peak levels correspond to specular reflections, whereas low levels correspond to diffraction and scattering effects.

An evaluation of the distance between the antennas and the surface of the object allows a representation in polar coordinates that converges to the contour of the object. Figure 7 shows the data of Figure 6 evaluated in this way for the simulated complex cascaded object. For reference, the exact contour of the object is highlighted in red, whereas the detected wave fronts are marked according to the legend. A real object was built with the same dimensions as the simulated one and measured with the same setup; its evaluation is shown for comparison in Figure 8.
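A possible implementation of this polar evaluation is sketched below in Python; it assumes that the antennas move on a circular track of known radius and always point at the centre of the track, which corresponds to the conditions of the simulated scenario.

    import numpy as np

    def wavefronts_to_contour(angles_deg, distances_m, track_radius_m):
        """Convert wave fronts detected at antenna rotation angles into
        points in the plane of the circular antenna track.  Each detected
        distance is measured from the antenna towards the centre of the
        track, so the point lies at (track radius - distance) along the
        line of sight."""
        theta = np.deg2rad(np.asarray(angles_deg, dtype=float))
        r = track_radius_m - np.asarray(distances_m, dtype=float)
        return np.column_stack((r * np.cos(theta), r * np.sin(theta)))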

Due to noise and imprecise position data, the exact position of the objects may differ from the reference data. In this case the UWB antennas do not focus exactly on the centre of the object during the inspection tour. This inaccuracy distorts the polar representation because the determined distance no longer has the zero point as its origin. An example of a distorted radargram is shown in Figure 9, where the rectangular object of Figure 4 is shifted 5 cm towards and 5 cm across the antenna.

To be invariant against translation effects and thus to obtain a robust recognition, an algorithm based on the analysis of specular reflections only was investigated. For this purpose, the detected wave fronts of Figure 5 exceeding a 20% threshold are considered. Figure 10 shows these normalized magnitudes of the wave fronts for the shifted object of Figure 9.

The choice of the threshold depends on the directional radiation pattern of the UWB antennas. Because the objects are no longer in the main lobes due to the object shift, the reflected waves are received at lower power levels. It can be assumed that peak samples that lie close together and exceed the threshold correspond to a planar surface of the object that causes a specular reflection in this direction.

If every estimated planar surface, which is orthogonal to the direction of incidence, is evaluated and combined with the corresponding distance information, the rough surrounding boundary of the object can be determined. In Figure 11 the surrounding boundary and the object translation are estimated with very little error for the shifted object of Figure 9.
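The following Python sketch outlines one way to implement this grouping of specular reflections; the maximum angular gap used to merge neighbouring samples is an assumed parameter and not taken from the original algorithm.

    import numpy as np

    def specular_surfaces(angles_deg, magnitudes, distances_m,
                          threshold=0.2, max_gap_deg=3.0):
        """Group angle samples whose normalized wave-front magnitude
        exceeds the threshold into candidate planar surfaces: samples
        closer together than max_gap_deg are merged, and each group is
        summarized by its mean viewing angle and mean distance.  The
        surface element is then assumed to be orthogonal to that viewing
        direction."""
        angles = np.asarray(angles_deg, dtype=float)
        mags = np.asarray(magnitudes, dtype=float)
        dists = np.asarray(distances_m, dtype=float)
        keep = np.flatnonzero(mags >= threshold)
        groups, current = [], []
        for i in keep:
            if current and angles[i] - angles[current[-1]] > max_gap_deg:
                groups.append(current)
                current = []
            current.append(i)
        if current:
            groups.append(current)
        return [(angles[g].mean(), dists[g].mean()) for g in groups]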

Rotation invariance can be achieved if both the scanned objects and the reference patterns are rotated into a standard orientation, for instance, using the orientation of the moment tensor [8]. The orientation of the object is defined as the angle between the $x$-axis and the axis around which the object can be rotated with minimum inertia. This angle is given by

$$\vartheta = \frac{1}{2}\arctan\!\left(\frac{2\,\mu_{1,1}}{\mu_{2,0}-\mu_{0,2}}\right).$$

Here $\mu_{2,0}$, $\mu_{0,2}$, and $\mu_{1,1}$ are the three second-order central moments of the object. This algorithm can be applied to the polar diagram data of the objects.
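For discrete contour points, for example the samples of the polar diagram, the orientation angle can be computed as in the following Python sketch; treating the samples as a point set rather than an image is an assumption made here for simplicity.

    import numpy as np

    def object_orientation(points):
        """Orientation of a 2-D point set as the angle between the
        x-axis and the axis of minimum inertia, computed from the three
        second-order central moments.  np.arctan2 is used to resolve the
        quadrant ambiguity of arctan."""
        pts = np.asarray(points, dtype=float)
        x = pts[:, 0] - pts[:, 0].mean()
        y = pts[:, 1] - pts[:, 1].mean()
        mu20, mu02, mu11 = np.sum(x * x), np.sum(y * y), np.sum(x * y)
        return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)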

Finally, the moment invariant method is applied. Based on 5 central moments, 7 further moments can be determined which are invariant to translation, rotation, and scaling. The algorithm was improved with respect to the original version described in [9] by using 3 additional features: the object area, the eccentricity, and the circumference.
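A sketch of such a feature vector, assuming the polar diagram has been rasterized into a binary image, is given below using OpenCV; the eccentricity definition via the equivalent ellipse and the exact normalization may differ from the improved algorithm based on [9].

    import cv2
    import numpy as np

    def invariant_features(binary_image):
        """Compute the 7 Hu moment invariants (invariant to translation,
        rotation, and scaling) of a binary object image (uint8, object
        pixels non-zero) and append three further features: object area,
        eccentricity, and circumference."""
        m = cv2.moments(binary_image, binaryImage=True)
        hu = cv2.HuMoments(m).flatten()

        contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea)
        area = cv2.contourArea(contour)
        circumference = cv2.arcLength(contour, True)

        # eccentricity of the equivalent ellipse from the second-order
        # central moments mu20, mu02, mu11
        mu20, mu02, mu11 = m['mu20'], m['mu02'], m['mu11']
        root = np.sqrt(4.0 * mu11**2 + (mu20 - mu02)**2)
        lam_max = (mu20 + mu02 + root) / 2.0
        lam_min = (mu20 + mu02 - root) / 2.0
        eccentricity = np.sqrt(1.0 - lam_min / (lam_max + 1e-12))

        return np.concatenate([hu, [area, eccentricity, circumference]])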

During a measurement campaign, 5 objects were built with the same dimensions as the corresponding reference objects. After scanning them with the UWB radar and some preprocessing, the improved moment invariant algorithm was applied to the polar plots of the detected wave fronts of first order. With median denoising, the object recognition algorithm correctly identified all 5 objects out of the set of 12 reference objects.

In subsequent investigations the performance of the object recognition could be improved further by means of polar Fourier descriptors. The recognition algorithm proposed in [6] now makes use of up to 5 detected wave fronts and could thus clearly recognize 6 objects from the set of 12 reference objects.
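One common variant of polar Fourier descriptors, based on the magnitude spectrum of the radius-over-angle signature, is sketched below in Python; the concrete descriptor definition used in [6] may differ.

    import numpy as np

    def polar_fourier_descriptors(radii, n_coeffs=10):
        """Fourier descriptors of a closed contour given as radius
        samples over equally spaced rotation angles (polar signature).
        Using only the magnitudes discards the starting angle (rotation
        invariance); dividing by the DC component removes the overall
        scale."""
        spectrum = np.abs(np.fft.fft(np.asarray(radii, dtype=float)))
        return spectrum[1:n_coeffs + 1] / (spectrum[0] + 1e-12)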

Acknowledgments

The authors thank the Deutsche Forschungsgemeinschaft (DFG) for the support of the work as part of the “Cooperative Localisation and Object Recognition in Autonomous UWB Sensor Networks” (CoLOR) project within the UKoLoS priority program.