Radar systems are largely employed for surveillance of wide and remote areas; the recent advent of drones gives the opportunity to exploit radar sensors on board unmanned aerial platforms. Nevertheless, whereas drone radars are currently available for military applications, their employment in the civilian domain is still limited. The present research focuses on the design, prototyping, and testing of an agile, low-cost, mini radar system, to be carried on board Remotely Piloted Aircraft (RPAs) or tethered aerostats. In particular, the paper addresses the challenge of integrating the in-house developed radar sensor with a low-cost navigation board, which is used to estimate attitude and positioning data. In fact, a suitable synchronization between radar and navigation data is essential to properly reconstruct the radar picture whenever the platform is moving or the radar is scanning different azimuthal sectors. Preliminary results, relative to tests conducted in preoperational conditions, are provided and exploited to assess the consistency of the obtained radar pictures. The results show a high consistency between the radar images and the picture of the surrounding environment; finally, the comparison of radar images obtained in different scans demonstrates the stability of the platform.

1. Introduction

For more than a century, radar technology has been recognized as the primary tool for remote sensing and surveillance. Currently, radar systems are used in many fields (e.g., automotive, aerospace, geoscience, and medicine) and for multiple applications (e.g., ranging, surveillance, discovery, and mapping) [1]. In particular, radars have become a fundamental tool for surveillance in the maritime domain, where this task can be carried out exploiting various assets and technologies:
(i) Coastal radars cover relatively small areas, which barely exceed territorial waters [2].
(ii) Satellite radars are suitable for focused surveillance, but their employment is always subject to satellite availability [3].
(iii) Shipborne radars are limited in range by their height Above Sea Level (ASL), according to the Line Of Sight (LOS) operating principle.
(iv) High Frequency (HF) and Very High Frequency (VHF) radars provide a larger coverage, but at the cost of a lower resolution [4].
(v) Airborne radars are characterized by both a relatively large coverage and a high resolution, but their employment is limited by endurance and running costs [5].
The abovementioned approaches have intrinsic limitations in terms of range, resolution, endurance, and costs. In order to enhance the surveillance capability by overcoming such limitations, a mini radar sensor has been designed, prototyped, and tested. The device is a low-cost platform, and hence a large number of units can be deployed to create a network of sensors; moreover, the sensor is conceived for air-based applications, being carried by small flying vehicles such as Remotely Piloted Aircraft (RPAs) and tethered aerostats [6]. The joint employment of these assets could significantly improve the surveillance capability required in the Mediterranean Sea to perform migration control and to coordinate Search and Rescue missions.
In fact, whereas an aerostat-radar is able to guarantee continuous coverage of large areas, a rapidly deployable drone-radar can provide focused coverage over selected spots (in a slew-to-cue surveillance mode).

Nevertheless, one of the main challenges in the development of the mini radar system is the necessity to match the requirements of compactness, lightness, and reduced power consumption, while at the same time adopting a low-cost philosophy and guaranteeing the reliability of the overall system.

Although some mini radars are already available on the market, their applications are definitely different from the one proposed in this work. Among the available mini radars, Google Soli, XSight, and MESA-K-DEV are the most popular systems. The Soli radar, from Google ATAP, has been designed for automatic gesture recognition by computer users; hence it has a very limited range and a high resolution. XSight is a millimeter wave radar in W band; it is mainly used in airports for Foreign Object Debris (FOD) detection. It is a compact and powerful tool for ground-based applications. MESA-K-DEV is a very compact device suitable for different applications such as Unmanned Aerial Vehicle (UAV) airborne detect-and-avoid and autonomous vehicle radar vision. With respect to the proposed system, it has lower range and resolution and it is not fully configurable.

The proposed radar prototype exploits a well-known technology; it is a coherent Frequency Modulated Continuous Wave (FMCW) radar operating in Ku band.

It has been designed to provide medium-short range coverage. The developed platform is a prototype and has to be considered a feasibility study; in its current version, the radar system allows a maximum range of four kilometers, with a resulting range resolution of about 1.5 meters. These limitations are due to the prototype nature of the device.

Radar data, acquired on board a moving platform, have to be associated with the attitude of the platform in order to obtain a radar picture. To properly reconstruct a radar image, the pointing direction of the antenna has to be carefully estimated. Hence, a navigation board has to be integrated and synchronized with the radar. In the present research, a low-cost GPS/Inertial Measurement Unit (IMU) integrated platform is used: a Commercial Off-The-Shelf (COTS) board has been integrated and synchronized with the low-cost radar platform. The paper describes the design, assembly, and testing of the radar and navigation systems, with a particular focus on their synchronization. Several tests, in preoperational conditions, have been carried out to validate the developed prototype. During the tests, the radar performed several azimuthal scans: the comparison of the different polar plots aims at demonstrating that no drifts are present in the platform attitude estimation. In fact, if only small differences between two consecutive scans can be appreciated, then the consistency of the radar data and the proper synchronization between radar and navigation systems are validated. To this end, the correlation among radar pictures relative to different scans needs to be computed. Moreover, the comparison of the obtained radar images with the actual environment of the test provides the opportunity to further validate the chosen approach.

The rest of the paper is structured as follows: the different sensor components, including the radar front-end and back-end, are described in Section 2, whereas the navigation board and the algorithm for the attitude determination are presented in Section 3. The integration and synchronization strategies are discussed in Section 4. In Section 5, the setup adopted for the tests is described, while the obtained results are commented on in Section 6. Finally, Section 7 concludes the paper.

2. Radar

The implementation of a low-cost agile radar system is a challenge: weight, overall dimensions, power consumption, and cost are severe requirements which have to be fulfilled. The developed prototype is a monostatic radar, based on the FMCW concept; although this type of sensor exploits a long-established technology [7], such sensors are currently employed for several applications, thanks to their small overall dimensions, reduced consumption, and limited costs.

FMCW radar concept is briefly introduced in Section 2.1; in Section 2.2 the assembled radar system is described; finally, the radar signal processing is illustrated in Section 2.3.

2.1. FMCW Radar Concept

The developed prototype operates in Ku band; specifically the carrier is centred at 17.2 GHz and the maximum bandwidth is 1.4 GHz. Continuous Wave (CW) radars are not able to measure the distance of the targets [8]; in order to have ranging capabilities the carrier has to be modulated by a periodic function. Different functions can be used for the modulation [1, 9]; in the present research a replicated ramp (“sawtooth wave”) is used.

The FMCW radar estimates the target distance from the frequency difference between the transmitted and received waves, which, for a steady target, is directly proportional to the delay between the transmitted and received signals; this concept is graphically represented in Figure 1. The ramp of the received signal (in red) is a delayed replica of the transmitted one (in blue), and between the two ramps there is a frequency shift; when the target is moving with respect to the transmitter, the Doppler effect introduces a further shift of the received ramp. Although considering a static target can appear a strong hypothesis, it is widely accepted for radar applications in the maritime domain, where the relative speed between the target and the transmitter is limited [10].

The key parameters characterizing a radar system are as follows:
(i) The maximum range resolution $\Delta R$, which is inversely proportional to the maximum radar bandwidth $B$:
$$\Delta R = \frac{c}{2B},$$
where $c$ is the speed of light.
(ii) The sweep rate $\alpha$, defined as
$$\alpha = \frac{B}{T_s},$$
where $T_s$ is the sweep time. The swath time $\tau$ represents the time delay between the transmitted and received signals. Finally, the beat frequency is $f_b = \alpha\tau$.
(iii) The unambiguous range $R_{ua}$, defined as the maximum distance measurable by the radar:
$$R_{ua} = \frac{c}{2\,\mathrm{PRF}},$$
which is inversely proportional to the Pulse Repetition Frequency (PRF), defined as the number of pulses of a repeating signal in a specific time unit.
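As an illustration, the relations above can be evaluated numerically. In the following sketch, only the 1.4 GHz bandwidth comes from the prototype specifications; the sweep time, target range, and PRF values are hypothetical and chosen for illustration.

```python
# Illustrative FMCW parameter calculations. The 1.4 GHz bandwidth matches the
# prototype; sweep time, target range, and PRF below are hypothetical values.
C = 3e8  # speed of light [m/s]

def range_resolution(bandwidth_hz):
    """Maximum range resolution: dR = c / (2 B)."""
    return C / (2 * bandwidth_hz)

def beat_frequency(target_range_m, bandwidth_hz, sweep_time_s):
    """Beat frequency f_b = alpha * tau, with alpha = B / T_s, tau = 2 R / c."""
    alpha = bandwidth_hz / sweep_time_s   # sweep rate [Hz/s]
    tau = 2 * target_range_m / C          # round-trip delay [s]
    return alpha * tau

def unambiguous_range(prf_hz):
    """Unambiguous range R_ua = c / (2 PRF)."""
    return C / (2 * prf_hz)

dr = range_resolution(1.4e9)            # ~0.107 m with the full bandwidth
fb = beat_frequency(1000, 1.4e9, 1e-3)  # hypothetical 1 ms sweep, 1 km target
rua = unambiguous_range(1.5e6)          # a 1.5 MHz PRF would give 100 m
```

Note how, with the full 1.4 GHz bandwidth, the theoretical resolution is of the order of a decimeter, consistent with the sub-meter resolutions quoted later for the tests.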

The scheme representing the functional principle of the FMCW radar is shown in Figure 2. The scheme is composed of two chains: the transmitting chain in the upper part and the receiving chain in the bottom part. In the red dashed box, there are the elements of the radar back-end: the Single Board Computer (SBC), a Digital-to-Analog Converter (DAC), and an Analog-to-Digital Converter (ADC), whereas the front-end elements are shown in the green dashed box.

In the transmission chain, the SBC digitally designs the ramp (blue line in Figure 1), which in turn is converted into an analog signal by the DAC. The mixer, exploiting the Local Oscillator (LO), upconverts the signal from Intermediate Frequency (IF) to Radio Frequency (RF). The upconverted signal is split into two parts: one is fed to the receiving chain and used as a reference for the downconversion process; the remaining part is amplified and finally transmitted.

In the receiving chain, the received signal is amplified by a Low Noise Amplifier (LNA). Then it is fed to a mixer together with the reference signal for the downconversion. After the downconversion, the signal is amplified by the Intermediate Frequency Amplifier (IFA) and digitized by the ADC. Finally, the SBC performs the signal processing described in Section 2.3.

The elements of both chains are described in the following section.

2.2. Radar Assembly

The radar system can be divided, according to Figure 2, into two main functional blocks: processing unit (blue box) and transceiver (yellow box).

The employed processing unit is an Intel-based SBC that runs Windows 7. This unit carries out the following tasks:
(i) System initialization
(ii) Parameters setting
(iii) Digital signal processing
(iv) Data storage
(v) Remote control.
The abovementioned functionalities are based on MATLAB and C in-house implemented functions.

The transceiver is composed of two subblocks: the Intermediate Frequency part of the back-end (intersection between red and yellow boxes) and the front-end (green box).

The first subblock, which represents the boundary between the digital and analog parts of the radar, consists of two converters:
(i) The DAC, which is a single channel, 4 GSps, 12-bit resolution, arbitrary waveform generator with an internal memory of 4 MB
(ii) The ADC, which is an acquisition device with a maximum sampling rate of 500 MSps, 14-bit resolution, and an internal memory of 2 GB.

The front-end is made of the following:
(i) The RF module, which includes amplifiers, splitters, filters, mixers, and a Local Oscillator
(ii) The antenna system.
Whereas the back-end has been built up with COTS components, the front-end has been designed and manufactured ex novo, by assembling RF connectorized modules. It operates at a central frequency of 17.2 GHz, with a 1.4 GHz bandwidth. The front-end tests have shown the following features:
(i) Significant side-lobe isolation, higher than −34 dB
(ii) A power output of about 30 dBm
(iii) Linearity within the bandwidth lower than 5 dB.
Besides, in order to meet the compactness and low-weight requirements, the front-end has been enclosed in a 3D printed case, made of a light Acrylonitrile-Butadiene-Styrene (ABS) terpolymer. The aluminum plate on which the radar is placed, together with a mini fan, guarantees the necessary heat dissipation (see the details of Figure 3(a)).

As clearly appears from Figure 3, two versions of the integrated system have been manufactured: one to be carried by a multirotor drone (a) and the other suitable to be exploited by an unmanned, tethered aerostat (b). The two versions are very similar; the main differences are in terms of support and antennas:
(i) The drone-radar is equipped with a couple of conventional horn antennas with a 3 dB beam-width of about 37 degrees in both the horizontal and the vertical planes (Figure 3(a)).
(ii) A couple of microstrip patch antennas, designed to have a 3 dB beam-width of 2 degrees in the horizontal plane and 10 degrees in the vertical plane, are mounted on the aerostat-radar (Figure 3(b)).
In both configurations, a digital microcamera is inserted between the two antennas, allowing pictures of the radar footprint to be acquired and stored at regular intervals, to be employed as ground truth when evaluating the radar picture. Whereas for the aerostat a remotely controlled gimbal allows orienting the radar, in the case of the drone the radar is rigidly attached to the body of the RPA. This means that the azimuth pointing direction of the radar depends on the heading of the drone; on the other hand, the elevation of the radar beam can be varied thanks to a remotely piloted servomotor.

2.3. Radar Data Processing

An implemented Graphical User Interface (GUI), shown in Figure 9, provides the interface between the operator and the radar, allowing to
(i) set the main radar parameters;
(ii) design the modulating sawtooth;
(iii) enable/disable the transmission of the radar signal;
(iv) record the received echo;
(v) display the radar waterfall;
(vi) load prerecorded datasets.
As previously mentioned, the whole radar processing is conducted in digital form [11] by the SBC. After being digitized, the raw radar signal needs to be properly cut, according to the key radar parameters. This procedure is graphically described in Figure 4: only the first part of each received sweep is selected for further processing, while the last part, overlapping the next transmitted ramp, is discarded.

At this point, the selected data can be stored in the SBC to be processed offline. Alternatively, in order to show the operator a waterfall plot of the radar data in quasi-real-time, a basic processing can be performed online. The signal processing is shown in Figure 4 and includes the following steps: integration of a variable number of sweeps and, optionally, weighting through a tapering function, before the Fast Fourier Transform (FFT) is performed.
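The chain described above (sweep integration, tapering, FFT) can be sketched in a few lines. The following is a minimal simulation, not the prototype's MATLAB code: the sampling rate, sweep time, and number of samples per sweep are illustrative assumptions, while the 1.4 GHz bandwidth matches the front-end.

```python
import numpy as np

# Minimal sketch of the online processing chain: sweep integration, Hann
# tapering, and FFT to a range profile. Parameters are illustrative; only the
# 1.4 GHz bandwidth comes from the paper.
C = 3e8
B, T_S = 1.4e9, 1e-3       # bandwidth [Hz]; hypothetical sweep time [s]
FS, N = 2e6, 2048          # hypothetical ADC rate [Hz] and samples per sweep
ALPHA = B / T_S            # sweep rate [Hz/s]

def beat_signal(target_range_m, n_sweeps, noise_sigma=0.5, seed=0):
    """Simulate the dechirped (beat) signal of a static point target."""
    rng = np.random.default_rng(seed)
    t = np.arange(N) / FS
    f_b = 2 * target_range_m * ALPHA / C   # beat frequency for this range
    clean = np.cos(2 * np.pi * f_b * t)
    return clean + noise_sigma * rng.standard_normal((n_sweeps, N))

def range_profile(sweeps):
    """Integrate sweeps, apply a Hann taper, and FFT to a range profile."""
    integrated = sweeps.mean(axis=0)        # sweep integration raises the SNR
    windowed = integrated * np.hanning(N)   # tapering lowers the side lobes
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(N, d=1 / FS)
    ranges = freqs * C / (2 * ALPHA)        # beat frequency -> slant range
    return ranges, spectrum

ranges, profile = range_profile(beat_signal(50.0, n_sweeps=32))
estimated = ranges[np.argmax(profile[1:]) + 1]   # peak search, skipping DC
```

The peak of the range profile falls at the simulated 50 m target, showing how the beat frequency maps directly to slant range.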

The sweep integration increases the Signal-to-Noise Ratio (SNR), but it needs to be done carefully if the radar is not stationary and pointing in a fixed direction, because it can cause a defocusing of the radar picture (due to the reduction of the azimuth resolution) [12]. Whenever the radar is scanning, the number of sweeps to be integrated is automatically computed from the navigation data, according to the chosen resolution of the polar plot.
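The sweep-count computation amounts to a simple ratio between the dwell time on one angular sector and the sweep period. In the sketch below, the 0.25-degree resolution and the 9 deg/s rotation (1.5 rpm) come from the tests described later; the 500 Hz sweep rate is an assumption (it is the digitizer's maximum trigger rate).

```python
# Sketch of the automatic computation of sweeps to integrate per angular
# sector. The 500 Hz sweep rate is an assumption; 9 deg/s and 0.25 deg match
# the test conditions reported in the paper.
def sweeps_per_sector(sweep_rate_hz, angular_speed_deg_s, az_resolution_deg):
    """Number of radar sweeps falling inside one sector of the polar plot."""
    time_per_sector = az_resolution_deg / angular_speed_deg_s  # dwell time [s]
    return int(sweep_rate_hz * time_per_sector)

n = sweeps_per_sector(500, 9.0, 0.25)   # sweeps integrated per 0.25-deg sector
```

A faster rotation shrinks the dwell time per sector, so fewer sweeps can be integrated, trading SNR for scan speed.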

Through the FFT, one obtains a matrix (slant range versus sweep number) of the radar return. After its normalization and conversion into logarithmic scale, the power matrix can be displayed through the waterfall plot (Figure 14).

Further processing, based on the synchronization of the radar and navigation units, is required to obtain the polar plot (Figure 15). This is true when the radar is stationary and scanning angular sectors, but even more so when the radar acquisition is conducted while the platform is moving. In fact, both azimuth and position of the platform must be properly associated with each radar sweep in order to obtain a suitable, georeferenced radar picture. This part of the digital processing is described in Section 4.

3. Navigation

In this section, the navigation board and the algorithm for the computation of the Euler angles are described.

3.1. Navigation Board

As mentioned before, position and attitude of the platform are key enablers for the proper reconstruction of the Plan Position Indicator (PPI) image.

Using low-cost devices, the attitude determination of a moving platform is a challenge, because of the stringent accuracy requirements. GPS/IMU integration has been identified as a suitable technical solution for the navigation engine.

In order to retrieve navigation data for the platform under development, a cheap electronic board, featuring a variety of embedded sensors, was selected as the navigation engine. The navigation unit, shown in Figure 5, is based on the combination of a SBC (Figure 5(b)) and a navigation board (Figure 5(a)). Specifically, the SBC is a FOX Board G20, a ready-to-use, low-cost, Linux embedded SBC based on a 400 MHz ARM9 processor. A COTS board, namely, "Daisy-7," has been used to determine the attitude and the position of the radar. The adoption of the Daisy-7 keeps the cost of the system contained, since the board, together with the relative controller, costs less than 150 Euros. Moreover, it has very limited dimensions. Finally, the board integrates a Global Positioning System (GPS) receiver, a barometer, 3 magnetometers, 3 gyroscopes, and 3 accelerometers. The outputs of the board are
(i) user position: latitude, longitude, and altitude [m];
(ii) user velocity [m/s];
(iii) angular velocity (gyroscope measurements) [deg/s] and acceleration [m/s2];
(iv) quaternions [13].

3.2. From Quaternions to Euler Angles

Different strategies can be implemented to compute the attitude of a moving object [14–16]; moreover, several methods can be used to describe the orientation of a moving object: rotation vectors [13], which describe a rotation by an axis of rotation and an angle around it [17]; Euler angles, which represent the rotation of a rigid body by decomposing it into consecutive rotations around three different axes [18]; and quaternions [13]. Although Euler angles have singularities and are less accurate than quaternions when used to integrate incremental changes in attitude over time, they are the most widely used. Different names and notations can be associated with the Euler angles, which in this work are identified, respectively, as roll, tilt (pitch), and heading (yaw).

The relationship between the quaternions and the Euler angles is detailed in [13, 19]:
$$\phi = \operatorname{atan2}\!\left(2(q_0 q_1 + q_2 q_3),\; 1 - 2(q_1^2 + q_2^2)\right)$$
$$\theta = \arcsin\!\left(2(q_0 q_2 - q_3 q_1)\right)$$
$$\psi = \operatorname{atan2}\!\left(2(q_0 q_3 + q_1 q_2),\; 1 - 2(q_2^2 + q_3^2)\right)$$
where $q_0$, $q_1$, $q_2$, and $q_3$ are the elements of the quaternion [13], and $\phi$, $\theta$, and $\psi$ denote roll, tilt, and heading, respectively.
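A minimal Python sketch of the standard quaternion-to-Euler conversion follows; a scalar-first quaternion convention is assumed, which may differ from the board's output ordering.

```python
import math

# Standard quaternion-to-Euler conversion (scalar-first convention assumed).
def quat_to_euler(q0, q1, q2, q3):
    """Convert a unit quaternion to roll, tilt (pitch), heading (yaw) [deg]."""
    roll = math.atan2(2 * (q0 * q1 + q2 * q3), 1 - 2 * (q1**2 + q2**2))
    # Clamp the asin argument against round-off to avoid a domain error.
    tilt = math.asin(max(-1.0, min(1.0, 2 * (q0 * q2 - q3 * q1))))
    heading = math.atan2(2 * (q0 * q3 + q1 * q2), 1 - 2 * (q2**2 + q3**2))
    return tuple(math.degrees(a) for a in (roll, tilt, heading))

# A 90-degree rotation around the vertical axis maps to heading = 90 degrees:
s = math.sin(math.pi / 4)
angles = quat_to_euler(math.cos(math.pi / 4), 0.0, 0.0, s)
```

The clamp on the asin argument guards against the gimbal-lock singularity region, where small numerical errors can push the argument slightly outside [−1, 1].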

4. Integration

This section is focused on the integration of the radar sensor with the navigation system.

The aim of the project is to achieve a low-cost, agile radar system exploitable on board unmanned moving platforms; the radar system should allow operating in two different modes:
(i) Scan of angular sectors with variable speed and rotation direction
(ii) Fixed pointing acquisition, in order to increase the SNR through higher sweep integration.
A diagram describing the integration between the navigation and radar systems is shown in Figure 6. The red dashed box contains the radar elements, whereas the green dashed box contains the navigation devices; the synchronization is performed exploiting a common time-stamp provided by the shared SBC. As previously mentioned, without an accurate reference for the platform attitude and position, this radar agility would not be possible. Whereas modern military radar systems typically provide this feature, the same capability is seldom achieved by civilian radars and is never available in low-cost radar systems.

The radar acquisition device, that is, the ADC, is the time master of the radar system. In fact, it triggers the waveform generator, that is, the DAC, to send a pulse train while, at the same time, it enables the acquisition window, in order to receive and convert the radar echo (see Figure 7).

The waveform generator can be programmed by the operator to send one or several pulses when it receives the trigger from the ADC (see the "number of pulses" field on the right side of Figure 9). The length of the pulse train is limited by the internal memory of the waveform generator, which stores the designed transmitted wave: a longer ramp allows a lower number of transmitted pulses.
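This memory trade-off can be quantified with a back-of-the-envelope calculation. The 4 MB memory and 4 GSps rate are from the DAC specifications; the ramp durations and the assumption that each 12-bit sample occupies 2 bytes are ours.

```python
# Back-of-the-envelope pulse-train sizing for the waveform generator memory.
# 4 MB and 4 GSps are from the DAC specs; the 2-bytes-per-sample storage and
# the ramp durations are illustrative assumptions.
def max_pulses(memory_bytes, sweep_time_s, dac_rate_sps, bytes_per_sample=2):
    """Pulses that fit in memory for a ramp of the given duration."""
    samples_per_ramp = int(sweep_time_s * dac_rate_sps)
    return memory_bytes // (samples_per_ramp * bytes_per_sample)

MEM, DAC_RATE = 4 * 1024**2, 4e9           # 4 MB memory, 4 GSps
short = max_pulses(MEM, 0.25e-6, DAC_RATE)  # 1000-sample ramp
long = max_pulses(MEM, 0.5e-6, DAC_RATE)    # 2000-sample ramp: half the pulses
```

Doubling the ramp duration halves the number of pulses that fit in memory, which is exactly the constraint stated above.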

The length of the acquisition window is automatically set according to the length of the transmitted signal. Obviously, the acquisition window accounts for the expected signal delay (referred to as $\tau$ in Figure 1). The chosen digitizer, ADQ14DC-2A-VG by SP Devices, allows an acquisition rate up to 500 Hz. The Daisy-7 board provides positioning data from its various sensors at a frequency of 300 Hz. Both systems associate each recorded datum with its own time-stamp, exploiting the SBC internal clock as reference.

Due to the different data rates of the two devices, an interpolation of the positioning parameters is required in order to match the radar and navigation time-stamps. This interpolation is possible thanks to the difference between the dynamics of the radar platform and the data rate of the navigation engine. Specifically, in the proposed experimental results (Section 6), the radar platform was rotating at an average speed of 1.5 rpm, which corresponds to 25 mHz: four orders of magnitude smaller than the data rate of the Daisy-7.
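The matching step can be sketched as a linear interpolation of the navigation output onto the radar time-stamps. The 300 Hz and 500 Hz rates and the 9 deg/s rotation are from the paper; the time-stamps and heading values themselves are synthetic, and the paper does not specify the interpolation scheme, so linear interpolation is our assumption.

```python
import numpy as np

# Sketch of matching 300 Hz navigation data to 500 Hz radar time-stamps.
# Linear interpolation is an assumption; data below are synthetic.
def interpolate_heading(nav_t, nav_heading_deg, radar_t):
    """Interpolate heading onto radar time-stamps, handling the 360-degree
    wrap by unwrapping before interpolation."""
    unwrapped = np.degrees(np.unwrap(np.radians(nav_heading_deg)))
    return np.interp(radar_t, nav_t, unwrapped) % 360.0

nav_t = np.arange(0, 1.0, 1 / 300)        # 300 Hz navigation samples
nav_heading = (9.0 * nav_t) % 360.0       # 9 deg/s rotation (1.5 rpm)
radar_t = np.arange(0, 1.0, 1 / 500)      # 500 Hz radar sweep time-stamps
heading_at_sweeps = interpolate_heading(nav_t, nav_heading, radar_t)
```

The unwrap step matters in practice: without it, a naive interpolation across the 359-to-0 degree jump would produce spurious intermediate headings.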

A second matching, between the navigation data and the polar plot grid, is required in order to properly display the radar picture. In fact, according to the angular speed of the platform and to the azimuth resolution selected for the polar plot, the number of radar sweeps to be integrated for each angular sector of the plot is automatically computed. As for the rest of the signal processing, the synchronization task is also achieved through ad hoc MATLAB implemented functions. Besides, in this preliminary stage of the presented research, both the radar-navigation and navigation-polar associations are carried out offline, but the overall computational cost does not prevent an online implementation on the SBC. Nevertheless, because of the required interpolation of the positioning parameters, a minimum time delay of 33 ms is introduced, leading to a quasi-real-time operative mode.

5. Experimental Setup

In order to demonstrate the radar capabilities of the developed prototype and to validate the approach for the synchronization between the radar and the navigation engine, several data collections were carried out. The datasets considered herein were collected on 5 October 2016; in order to simulate real acquisition conditions (i.e., radar mounted on board a flying vehicle), the radar platform, including the radar and the navigation board, was placed on a remotely controlled gimbal, as shown in Figure 3(b). Clockwise and counterclockwise rotations were considered (including a dataset with a change of rotation direction) to analyse the influence of this parameter. Specifically, data collections with different numbers of rotations, from two to six consecutive rotations, were performed to evaluate the stability of the platform with respect to the number of rotations. During the tests, the radar performed several azimuthal scans with the antennas at a fixed elevation angle. Two different datasets were recorded on the doorstep of the "MELISSA Lab," within the Joint Research Centre (JRC) site in Ispra (Italy). In order to have a current image of the real environment, a snapshot from Google Earth is shown in Figure 8. The green line is the maximum unambiguous range (100 meters) resulting from the specific radar settings adopted for the tests. The red line is the limit of the polar plots (80 meters); this limit was set to better identify the objects in the radar images, hence it is only for visualization purposes. From Figure 8, different elements of the real environment can be clearly identified; these elements are useful to properly understand the radar results presented in Section 6.3.

The developed prototype is a flexible, fully configurable device; radar parameters such as the operating frequencies, the maximum unambiguous range of the waveform, and the tapering function used for the processing can be set using a GUI developed in MATLAB, shown in Figure 9. With the specific parameters used for the tests, a range resolution of about 20 centimeters is obtained. The complete list of the parameters used for the data campaigns is reported in Table 1.

6. Experimental Results

This section presents the experimental results obtained in preoperational conditions: two datasets, collected on 5 October 2016, are considered.

At first the navigation results are discussed in Section 6.1, then the radar results are presented in Section 6.2, and finally the results obtained integrating the navigation and radar data are discussed in Section 6.3.

6.1. Navigation Results

As mentioned before, the data relative to the experiments of 5 October 2016 are considered. From the navigation point of view, the most relevant data are the attitude information.

In the following, the output of the Daisy-7 board is analysed. In Figure 10(a), the angular velocity measured by the three gyroscopes of the navigation board is plotted as a function of time. From the figure, it can be noted that the angular velocities around the horizontal axes have zero mean. The rotations were performed around the vertical axis only: the red line has a mean different from zero, specifically almost 10 deg/s. The same periodic variation can be appreciated on the horizontal axes, due to the nonperfect alignment of the sensor plane with the horizontal plane. The direction of the rotation can be clearly identified by the sign of the angular velocity measured by the gyroscope on the vertical axis. In the first data collection, 6 clockwise rotations were performed with a constant angular speed, and in this case the angular velocity is positive. In the second data collection, the change of the rotation direction can be noted from the change of the sign of the angular velocity: in the first part it is negative and then it becomes positive. This measurement is extremely noisy and cannot be used for attitude determination per se.

In Figure 10(b), the acceleration measured by the three accelerometers is shown as a function of time. Also in this case, a periodic behavior is present in the measurements of the three devices. The accelerometer on the vertical axis mainly measures the gravity acceleration (9.8 m/s2); hence its mean is almost 9.8. The slight deviation between the sensor plane and the horizontal one is also confirmed by the accelerometer measurements: the acceleration measured on one horizontal axis has zero mean, whereas a small bias can be noted in the measurements on the other. The accelerometer output would have to be integrated (increasing the noise) to obtain attitude information; moreover, low-cost devices have to be calibrated to avoid the growth of the attitude error [20].

The measurements of the magnetometers on the horizontal axes are shown in Figure 11. Magnetic measurements are distorted by different phenomena, which can be divided into two categories: hard iron and soft iron. Hard iron distortions are due to objects producing a magnetic field, whereas soft iron distortions are due to alterations of the existing magnetic field. These distortions modify the measured field depending on the direction in which the field acts relative to the sensor. To limit the effect of these phenomena, a simple calibration process has been performed; the results of the calibration are evident from the figure: the calibrated measurements are centred on the origin and no deformation of the circle can be appreciated.
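A simple hard-iron correction of the kind described above can be sketched as follows. The paper does not detail its calibration procedure, so this min/max-midpoint estimator and the synthetic data are our assumptions; soft-iron correction (rescaling the ellipse axes) is omitted.

```python
import numpy as np

# Minimal hard-iron calibration sketch: estimate the offset of the measurement
# circle and re-centre it on the origin. The estimator choice and the data
# are assumptions; soft-iron correction is not shown.
def hard_iron_calibrate(mx, my):
    """Remove the hard-iron bias, estimated as the midpoint of the field
    extremes on each axis."""
    bias_x = (mx.max() + mx.min()) / 2
    bias_y = (my.max() + my.min()) / 2
    return mx - bias_x, my - bias_y

theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
mx = 30 * np.cos(theta) + 12.0   # circle of radius 30, offset by a (12, -7) bias
my = 30 * np.sin(theta) - 7.0
cal_x, cal_y = hard_iron_calibrate(mx, my)
```

After calibration, the measurement circle is centred on the origin, which is exactly the behavior reported for the board in Figure 11.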

For the attitude estimation, a quaternion-based method has been used.

In Figure 12, the quaternions are shown as a function of time; the periodic behavior of the four quaternions clearly identifies the rotations, and it is most evident in the blue and cyan lines, whose positive and negative peaks mark each rotation. In the first data collection, the rotation was performed in the clockwise direction, whereas the second data collection started in the counterclockwise direction and the rotation was then changed to clockwise. The change of direction clearly appears from the behavior of the quaternions: in the first data collection (clockwise), the peaks of the blue line are in advance with respect to the cyan peaks, while the opposite holds in the initial phase of the second data collection, where the platform was rotating in the counterclockwise direction. Quaternions are useful to represent the attitude of a rotating body and are easily converted to the Euler angles, as shown in (4). Hence, from the quaternions output by the navigation board, the Euler angles are computed. Roll, tilt, and heading angles are shown as functions of time in Figure 13: in the upper box of the figure the tilt angle is plotted, in the central box the roll angle is considered, and in the lower box the heading angle is shown. In the first two cases (tilt and roll), only small variations of the angles can be appreciated; however, a periodic behavior can be noted in both cases. From the upper box, the break between the two data collections clearly appears, and a variation of the tilt angle of almost two degrees can be appreciated. To properly reconstruct radar images when the platform is rotating around the vertical axis, the foremost angle is the heading angle, which provides information on the direction in which the antennas are pointing. The heading angle as a function of time is shown in the lower box of Figure 13.
From the figure, the separation between the two data collections is evident (the area in the grey box). During the first data collection, the platform performed six clockwise rotations: the start and end points of each rotation are reported in the bottom part of the figure. A high consistency between the different rotations can be appreciated, and no drifts or biases are evident. In the second data collection, the platform performed almost six rotations (the last rotation is incomplete): at first, two counterclockwise rotations were performed; then the direction was inverted and the other three rotations were carried out. This type of data collection was performed to demonstrate the robustness of the navigation system with respect to an unexpected change of the rotation direction. Also in this case, the navigation engine was able to properly estimate the heading of the platform: the change of rotation direction clearly appears from the figure.

6.2. Radar Results

After the basic signal processing described in Section 2.3, the radar data can be plotted in the waterfall diagram. Figure 14 shows the waterfalls of the two proposed data collections, each consisting of 60,000 radar sweeps. In both plots, showing the power of the radar echo in dBm on range versus sweep-number axes, the six azimuth scans can be easily identified and compared with the navigation data proposed in the previous section. It clearly appears that in both cases the radar is scanning at a nearly constant angular speed, since each scan includes more or less the same number of sweeps. Besides, a sharp observer could notice the inversion of the rotating direction between scans two and three (close to sweep number 2000) in the lower plot.

Unfortunately, the waterfall plot is hardly interpretable by an inexperienced radar user, and it is definitely not comparable with the actual experimental scenario shown in Figure 8. As a matter of fact, without the integration of the navigation data, and in particular of the platform attitude data, it is impossible to associate the radar sweeps with their actual azimuth in the case of nonhomogeneous rotation.

6.3. Integrated Results

To demonstrate the radar capabilities of the developed prototype, radar and attitude data are associated in order to obtain the radar polar plot. In fact, only a suitable matching between the radar and the navigation data allows obtaining a proper polar radar picture.
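One simple way to realize this matching is to interpolate the (lower-rate) navigation headings at the timestamps of the radar sweeps. The sketch below is an assumption about how such an association could be implemented, not the authors' code; sample rates and the synthetic heading profile are illustrative.

```python
import numpy as np

# Navigation samples: 10 Hz over a 40 s scan at ~9 deg/s (one full rotation)
nav_t = np.linspace(0.0, 40.0, 401)
nav_heading = np.unwrap(np.deg2rad((nav_t * 9.0) % 360.0))  # unwrap removes 360->0 jumps

# Radar sweep timestamps: 250 sweeps per second (assumed rate)
radar_t = np.linspace(0.0, 40.0, 10000)

# Interpolate the unwrapped heading at each sweep time, then wrap to [0, 360)
sweep_heading = np.rad2deg(np.interp(radar_t, nav_t, nav_heading)) % 360.0
```

Each radar sweep is thereby tagged with an azimuth, after which the sweeps can be binned into the angular sectors of the polar plot regardless of whether the rotation is homogeneous.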

A qualitative analysis is performed by comparing the radar polar plots with the Google Earth image in Figure 8, from which the main environmental features are clearly identifiable. Moreover, the distances of the main features identified have been measured with a laser meter and compared with the distances obtained from the radar polar plots. Specifically, the distance of the car parked in front of the MELISSA lab was 8.25 m, whereas the distances of the light poles were 43.3 m, 29.9 m, and 40.55 m, respectively.

The presented datasets include six azimuthal scans each, but, for visualization reasons, only two polar plots, relative to the first scan of each data collection, are presented here. Nevertheless, two plots are enough to assess the consistency of the obtained results with respect to the actual experimental scenario.

Besides, in order to evaluate the achieved synchronization level between the radar and navigation units, a comparison among the radar images obtained from consecutive scans is carried out through correlation analysis.

As previously indicated, the objective of the research is the implementation of a remotely controlled, flexible, and agile radar platform allowing different types of data acquisition; therefore, the developed software needs to estimate the angular velocity and to consequently evaluate the number of radar sweeps to integrate, according to the acquisition rate and to the azimuth resolution of the polar plot. This feature is already available in the tested prototype. In fact, although during the proposed data collections the rotating speed of the radar platform was almost constant, the system was able to evaluate the angular velocity through the navigation data and to accordingly optimize the number of integrated sweeps for each angular sector of the polar plot. The presented results were obtained using an angular resolution of 0.25 degrees. The averaged angular velocity and the number of integrated sweeps per angular sector are reported in Table 2 for each of the six scans in the two datasets. The polar plots relative to the first scan of each dataset are shown in Figure 15, where the power of the radar echo is shown on a scale ranging from −65 to 0 dBm. For graphical reasons and to avoid the repetition of similar findings, only two polar plots out of twelve are shown. Specifically, Figure 15(a) shows the polar plot obtained from the first data collection, whereas Figure 15(b) shows the image reconstructed using the data of the second campaign.
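The relation between angular velocity, acquisition rate, and sweeps per sector can be made explicit with a short sketch. All numbers here are illustrative assumptions (constant 9 deg/s rotation, 250 sweeps/s, 0.25-degree sectors), not the values of Table 2.

```python
import numpy as np

sweep_rate = 250.0     # radar sweeps per second (assumed)
sector_width = 0.25    # polar-plot angular resolution [deg]

# Navigation headings over one 40 s scan, already unwrapped [deg]
t = np.linspace(0.0, 40.0, 401)
heading = 9.0 * t

# Average angular velocity estimated from the heading record [deg/s]
omega = (heading[-1] - heading[0]) / (t[-1] - t[0])

# Time spent inside one angular sector determines how many sweeps
# are available for (noncoherent) integration in that sector
sweeps_per_sector = sweep_rate * sector_width / omega
print(omega, sweeps_per_sector)
```

With a nonconstant rotation, `omega` would be evaluated per sector (e.g., from finite differences of the interpolated heading), so the number of integrated sweeps adapts to the instantaneous angular speed.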

Although the gathered data are relative to a maximum nonambiguous range of 100 meters, only the first 80 meters are shown in the polar plots, to better identify the reflecting objects. For analogous reasons, the intense backscattering from objects closer than three meters has been cropped. Both polar plots are oriented according to Figure 8, which allows a simple comparison between the actual environment and the radar images. On the right side of each polar plot, reflecting objects are visible only within the first two circles (range < 20 meters); in fact, this area corresponds to the interior of the MELISSA laboratory. On the contrary, on the left side of both radar pictures, several features of the actual environment can be identified at various ranges. A radial line at an azimuth of almost 165 degrees is particularly evident and corresponds to the external wall of the building. Remaining on the left side of the polar plots, two parallel lines of reflecting objects can be identified at ten and thirty meters, respectively. These features are also clearly identifiable in Figure 8, where a line of parked cars is at almost ten meters from the radar position and a line of trees is present at about thirty meters. Among the trees, three more intense reflections are visible, corresponding to as many light poles. Finally, in the lower part of both radar images, the external staircase of the building is clearly identifiable. Although in both radar polar images the main features of the environment can be properly identified, small differences between the two images can also be appreciated; the same holds when comparing the other scans with each other. The most probable reason for these small variations is the dynamic and uncontrolled nature of the experimental environment.
The proposed datasets are relative to forty-second scans, so that, even between two consecutive scans, some variation of the scenario could be detected (e.g., maneuvering cars, pedestrians, or cyclists). In Figure 16, the distances of the main environmental features are shown. From the figure, only small differences between the radar and laser-meter distances can be noted, further confirming the reliability of the radar measurements. In order to verify whether a bias or a drift is present in the association of attitude information and radar data, the correlation coefficient between the azimuthal sectors of the first radar scan and the corresponding sectors of the further scans is computed. The index of the maximum correlation coefficient is plotted in Figure 17. In order to verify the effect of the angular resolution, the correlation coefficient is computed for different azimuthal resolutions of the polar plot, from 0.1 to 1 degree. From the figures, it emerges that the indices of the maximum correlation coefficient lie on the main diagonal of the plane identified by the index of the angular sector. This result demonstrates that the maximum correlation coefficient is obtained for two corresponding angular sectors. Moreover, the lines relative to the correlation of the first scan with the remaining five are almost superimposed; hence, no drift can be appreciated between the various scans. The correlation shows consistency of the radar picture up to an azimuth resolution of 0.1 degrees, which is well below the resolution provided by the most directive antenna employed (2 degrees of azimuthal resolution for the microstrip patch antennas). It is worth remarking that the presented results are relative to preliminary tests conducted in an uncontrolled, dynamic scenario, characterized by unknown multipath effects.
Notwithstanding this, the correlation between different scans reveals a general consistency of the radar pictures, validating the exploited association between radar and attitude data. Further investigation should also consider scans with different angular speeds and acquisitions with the platform moving, in order to fully exploit the navigation data and to validate the georeferencing of the radar picture.

7. Conclusions

The paper presents the design, prototyping, and testing of a low-cost mini radar system, with specific interest in the synchronization of radar and navigation data.

The presented system has been conceived for radar applications on board unmanned aerial platforms. Hence, the prototype needs to be agile, flexible, and reliable, able to operate in different modes according to the requirements of the application. The employment of the radar on board RPAs involves severe limitations in terms of weight, power consumption, and overall dimensions. The required radar capabilities and the moving nature of the sensor imply a fine synchronization between radar and navigation data. Moreover, this task is challenging, considering the low-cost nature of both the navigation and radar units.

Two prototypes of the system have been manufactured: one to be carried by a multirotor drone and the other suitable to be exploited by an unmanned, tethered aerostat. The main differences between the two prototypes are the antennas and the supports joining the radar with the aerial platforms.

The radar is a coherent FMCW sensor operating in the Ku band. The implemented prototypes have a configurable unambiguous range of up to four kilometers, corresponding to a range resolution of 1.5 meters. The unambiguous range of 100 meters set for the presented results corresponds to a range resolution of about 20 centimeters. The maximum bandwidth is 1.4 GHz and the sample rate is 25 MSps.
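For an FMCW radar, the theoretical range resolution follows from the sweep bandwidth as dR = c / (2B). The sketch below checks the quoted figures under this standard relation; the intermediate bandwidths are back-computed assumptions, since the text only states the 1.4 GHz maximum.

```python
c = 299_792_458.0  # speed of light [m/s]

def range_resolution(bandwidth_hz):
    """Theoretical FMCW range resolution dR = c / (2B) for sweep bandwidth B."""
    return c / (2.0 * bandwidth_hz)

print(range_resolution(1.4e9))   # full 1.4 GHz bandwidth: ~0.107 m
print(range_resolution(750e6))   # ~0.20 m, consistent with the 100 m setup
print(range_resolution(100e6))   # ~1.5 m, consistent with the 4 km setup
```

Under this assumption, the 20 cm and 1.5 m resolutions quoted for the 100 m and 4 km configurations would correspond to sweep bandwidths of roughly 750 MHz and 100 MHz, both within the stated 1.4 GHz maximum.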

The navigation unit is based on an integrated IMU/GPS board, namely, the “Daisy-7,” including a GPS receiver, a barometer, 3 magnetometers, 3 gyroscopes, and 3 accelerometers. A MATLAB algorithm has been developed to compute the Euler angles of the radar platform. The algorithm is based on the quaternion outputs of the Daisy-7 board.

The true challenge of the processing is the reconstruction of the radar picture based on data from the low-cost navigation unit. In fact, in order to obtain the polar plots, the radar data acquired during free rotations of the sensor need to be associated with the correct attitude of the platform. Such association is achieved through interpolation and matching of the time-referenced navigation and radar data.

Aiming to demonstrate the radar capabilities of the integrated platform, several tests were designed and performed. Two datasets were collected on 5 October 2016, with the integrated radar-navigation platform placed on a remotely controlled gimbal performing several azimuthal scans. Clockwise and counterclockwise rotations were considered (including a dataset with a change of rotation direction).

These repeatability tests were carried out to validate the synchronization between the radar and navigation data. In particular, the consistency between different radar images, acquired in different scans, was analysed.

From the obtained radar images, it emerges that several features of the actual environment can be clearly identified at various ranges. A radial line at an azimuth of almost 165 degrees is particularly evident and corresponds to the external wall of the building. Two parallel lines of reflecting objects can be identified at ten and thirty meters, corresponding to a line of parked cars and a line of trees, respectively. Among the trees, three more intense reflections are visible, corresponding to as many light poles. The positions of the identified features in the radar picture appear consistent across the considered scans.

The correlation coefficient between the azimuthal sectors of the first radar scan and the corresponding sectors of the further scans was computed. From the analysis, it can be noted that the indices of the maximum correlation coefficient lie on the main diagonal of the plane identified by the index of the angular sector. This result demonstrates that the maximum correlation coefficient is obtained for two corresponding angular sectors. Moreover, the lines relative to the correlation of the first scan with the remaining five are almost superimposed; hence, no drift can be appreciated between the various scans.

The encouraging results demonstrate the feasibility of a low-cost agile radar system, which takes advantage of the attitude determination performed with a COTS integrated board.

Further investigation should consider both scans with different angular speeds and acquisitions with the platform moving, in order to fully exploit the navigation data and to also validate the georeferencing of the radar picture. Moreover, it is worth remarking that the provided results are relative to an analysis based only on the signal amplitude. Nevertheless, since the implemented radar is a coherent system, the phase information could also be exploited. This opportunity shall be investigated in future work.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments


The authors would like to thank Jorge Figueiredo-Morgado and Raimondo Giuliani (JRC) for the indispensable technical contribution to the manufacturing of the radar systems.