Journal of Sensors
Volume 2018, Article ID 4515828, 19 pages
https://doi.org/10.1155/2018/4515828
Research Article

Cooperative Virtual Sensor for Fault Detection and Identification in Multi-UAV Applications

1Robotics, Vision and Control Group, Universidad de Sevilla, Camino de los Descubrimientos s/n, 41092 Seville, Spain
2Aerospace Advanced Technology Center (CATEC), Aeropolis, 41309 Seville, Spain

Correspondence should be addressed to Guillermo Heredia; guiller@us.es

Received 9 November 2017; Accepted 28 March 2018; Published 30 April 2018

Academic Editor: Calogero M. Oddo

Copyright © 2018 Alejandro Suarez et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper considers the problem of fault detection and identification (FDI) in applications carried out by a group of unmanned aerial vehicles (UAVs) with visual cameras. In many cases, the UAVs have cameras mounted onboard for other applications, and these cameras can be used as bearing-only sensors to estimate the relative orientation of another UAV. The idea is to exploit the redundant information provided by these sensors onboard each of the UAVs to increase safety and reliability, detecting faults on UAV internal sensors that cannot be detected by the UAVs themselves. Fault detection is based on the generation of residuals which compare the expected position of a UAV, considered as target, with the measurements taken by one or more UAVs acting as observers that are tracking the target UAV with their cameras. Depending on the available number of observers and the way they are used, a set of strategies and policies for fault detection are defined. When the target UAV is being visually tracked by two or more observers, it is possible to obtain an estimation of its 3D position that could replace damaged sensors. Accuracy and reliability of this vision-based cooperative virtual sensor (CVS) have been evaluated experimentally in a multivehicle indoor testbed with quadrotors, injecting faults on data to validate the proposed fault detection methods.

1. Introduction

Reliability and fault tolerance have always been important issues in UAVs [1], where fault detection and identification (FDI) techniques play an important role in the efforts to increase system reliability [2]. This is even more important when teams of aerial vehicles cooperate closely with each other and with the environment, as is the case in multi-UAV missions and formation flight. Most FDI techniques for single UAVs that appear in the literature use model-based methods, which try to diagnose faults using the redundancy of some mathematical description of the system dynamics and sensors onboard the UAV. Model-based component-level FDI (CL-FDI) has been applied to general aircraft sensors and actuators [3, 4] and also to unmanned aircraft [5], either fixed-wing UAVs [6, 7] or helicopter UAVs [8-10]. On the other hand, cooperative FDI makes use of all the sensors available in the multi-UAV fleet to detect faults in any of the single UAVs. In most published works on cooperative FDI, each UAV estimates its own state and broadcasts it to the rest of the fleet through the communication channel [11]. What has not been thoroughly explored is the use of the sensors onboard the other vehicles of the team for detection of faults in an autonomous vehicle, which requires sensing the state of a vehicle from the other team members. This scheme requires the computation of the relative position of a UAV from another UAV. Several techniques are being used for relative position estimation between autonomous vehicles. One of the most widely used is visual tracking with onboard cameras. Object tracking using vision-based techniques has been well studied in the last decades (see [12, 13] for a review and classification of object tracking techniques).

The use of sensors that measure the angle to a target for robot localization has become very popular in recent years. These sensors measure the bearing angle in 2D and the azimuth and elevation angles in 3D environments and are commonly known as bearing-only sensors. The most widely used bearing-only sensors are visual cameras, along with ultrasound sensors for underwater vehicles [14], radar, lidar, and others. For autonomous vehicles (aerial or underwater), the measurements of the 3D target location with bearing-only sensors are inherently nonlinear in the single UAV case because the observed state variables are measured angles to the target. As such, in many cases, nonlinear estimators are used to determine the target state. By geometric considerations, a single observation from a single camera can only determine a ray along which the target must lie. Thus, observations from multiple bearing-only sensors (at least two) are required to provide depth perception and to obtain a good estimate of the target's position and orientation [15]. There has been significant interest in recent years in the computation of the relative position between aerial vehicles using vision sensors [16], especially in autonomous formation flight [17] and the closely related field of autonomous aerial refueling [18]. In some cases, uniquely identifiable light markers (beacons) are placed on the leader aircraft and on the refueling drogue to facilitate relative navigation. The beacons can be one [19] or several [20] light-emitting diodes (LEDs) that emit structured light modulated with a known waveform. Active contours and Kalman filtering have also been employed to track the leader aircraft across several image frames, without uniquely identifiable optical markers on the leader aircraft [20]. Other techniques have been used for vehicle relative position estimation and applied to FDI systems.
Work [21] shows this cooperative FDI approach with small ground vehicles in a laboratory experiment, using cameras to detect the position of other ground robots. In [22], a cooperative FDI system for odometry sensors is presented, in which the relative velocity of a robot with respect to the leader robot is obtained by matching a laser range sensor (LRS) image transmitted by the leader robot with the LRS image obtained by its own laser sensor.

Cooperative FDI has also been researched on UAVs [23], using homography-based techniques to estimate the position of a UAV relative to another UAV from the images that both take of the same scene. Each UAV estimates its ego-motion by applying visual odometry with landmarks on flat surfaces. In [24], visual tracking is used to estimate the position of a ground robot from a fixed-wing UAV, and this estimation is transmitted to the ground robot as an external position measurement for FDI. Virtual sensors [25] are software modules which utilize measurable signals (virtual sensor inputs) in order to reconstruct a signal of interest (virtual sensor output). Virtual sensors are useful in replacing physical sensors, thus reducing hardware redundancy and acquisition cost, or as part of fault detection methodologies by having their output compared to that of a corresponding actual sensor. Several researchers propose nonlinear virtual sensors in aerospace applications [26-28], although they have primarily been applied to single aircraft problems.

This paper describes the development of a FDI system that integrates the sensors available in a multi-UAV fleet (i.e., visual cameras) in order to detect faults in the sensors of one of its member UAVs. The FDI system is based on a cooperative virtual sensor (CVS) that estimates the position of a single UAV using sensors onboard other UAVs of the team and thus needs the cooperation of the UAVs. Once a fault has been detected, the CVS estimation can replace the internal sensor so the affected UAV can be driven to a safer state. The proposed FDI scheme based on the CVS may, in general, use any sensor that provides real-time information on the relative position of the UAVs, including bearing-only sensors. In this work, the CVS has been developed using visual tracking with the cameras onboard several other UAVs of the team. Object recognition and tracking are not the main focus of this paper, and thus a modified CAMShift algorithm [29, 30] has been used for tracking the target UAV, since it is simple and fast and therefore suitable for real-time implementation with low computational cost. However, the FDI system has been designed so that any tracking algorithm can be used in the CVS, allowing for advanced object or vehicle tracking systems tailored to the specific application [31]. The CVS uses the estimations obtained from the cameras onboard other UAVs, which are transmitted to the target UAV along with the state of each UAV. An extensive set of experimental tests has been carried out in the CATEC multi-UAV indoor testbed. This testbed is equipped with a Vicon Motion Capture System, whose measurements are used as UAV location ground truth for comparison. Figure 1 shows the urban scenario and the four quadrotors employed in the experiments: three observers equipped with a visual tracking module (camera and computer board) and the target identified by a blue marker.
The paper also shows results of robustness experiments in which the system has been injected with artificial UAV location and orientation errors. Preliminary results of this work were presented in [32]. In this paper, we generalize and complete the results in our previous work, considering an arbitrary number of UAVs in the fleet and an extensive analysis of case studies, proposing appropriate FDI policies.

Figure 1: Indoor testbed (15 × 15 × 5 m size) with four quadrotors used for validating the developed FDI system. The target UAV is visually tracked by observers 1 and 2 while observer 3 has landed.

The rest of the paper is organized as follows. Section 2 introduces the vision-based fault detection and identification problem in multi-UAV systems. Potential applications in which the vision-based FDI system may be useful are also mentioned, identifying typical faults on UAVs and proposing several strategies and methods for their detection and identification. Section 3 describes the design of the FDI system, including the methods based on a variable threshold, the FDI policies, the application of the CVS for target UAV recovery, and some safety issues related to its practical application. Section 4 presents experimental results that validate the CVS and the fault detection and identification methods, while Section 5 contains the conclusions of this work.

2. Vision-Based Multi-UAV FDI

From now on, the vehicle under observation will be called the target UAV, and the UAVs that are visually tracking the target will be called observers. In the general case, any of the vehicles in the fleet can play the role of target or observer. This work also considers that sensor faults can occur in any of the UAVs (either target or observer).

Multi-UAV FDI is based on the availability of vehicle relative positioning information (red vectors in Figure 2). In general, the sensors used for relative positioning will be mounted onboard the vehicles, and in some cases, sensors that are employed for other purposes (i.e., cameras) can be reused for FDI. The sensors that can be used for vehicle relative positioning can be classified as follows:
(i) Full 3D position estimation sensors: these sensors can estimate the full relative position of two vehicles. An example is a pair of real-time kinematic DGPS receivers on the observer and the target vehicles working in relative mode.
(ii) Bearing-only sensors: these sensors estimate the bearing of the target vehicle, that is, the elevation and azimuth angles from the observer vehicle. An example is a visual or IR camera with a tracking algorithm that tracks the position of the other vehicle.
(iii) Range-only sensors: these sensors estimate the distance to the target vehicle. An example is RF range sensing.

Figure 2: A scenario illustrating the vision-based multi-UAV FDI. UAV-1 and UAV-2 act as observers, focusing their cameras on target UAV-3.

The type of sensor determines the number of degrees of freedom (dof) that a sensor mounted on a single observer vehicle can estimate: 3 dof in the case of full position estimation sensors, 2 dof in the case of bearing-only sensors, and 1 dof for range-only sensors. Thus, estimating the 3D relative position of two vehicles requires at least one full position estimation sensor (3 dof), two bearing-only sensors (4 dof), or three range-only sensors (3 dof), each one on a different vehicle.

In this paper, bearing-only sensors have been used for multi-UAV FDI, since visual cameras are proven, cost-effective sensors in widespread use in autonomous vehicles. Camera technologies are mature, and there exists a wide base of algorithms for image processing. Furthermore, cameras are routinely used in many UAV missions, so existing onboard cameras may be reused for FDI.

On the other hand, multi-UAV FDI also uses the observer UAVs' own position estimations (light blue vectors in Figure 2), which can also have faults. When a sensor fault is detected by the system, it is important to identify whether the fault is in the target UAV or in one of the observer UAVs. FDI techniques are based on sensor redundancy, and thus more degrees of freedom (more sensors) are needed in order to identify which sensor is failing (i.e., in the target or one of the observer UAVs), not only that a fault exists. Since each camera provides 2 dof to the FDI system, the following situations may arise depending on the number of available observer UAVs:
(i) One observer UAV: the camera on the observer UAV provides 2 dof, so it is not possible to estimate the 3D relative position of the target UAV. However, the information provided by the visual sensor can still be used for the detection of some faults.
(ii) Two observer UAVs: the two cameras on the observer UAVs provide 4 dof, so it is possible to estimate the 3D position of the target UAV. Furthermore, with the additional dof, it is possible to distinguish whether the fault is in the target or in the observer UAVs, although it is not possible to decide which observer UAV has the fault.
(iii) Three (or more) observer UAVs: the cameras onboard three observer UAVs provide 6 dof, so it is possible to estimate the 3D position of the target UAV, detect whether there is a fault in the positioning sensors, and decide whether the fault is in the target or in one of the observer UAVs. The same holds in the general case with more than three observer UAVs, and higher detection accuracy can be achieved since more estimations from the cameras are available.

Thus, depending on how many observers are available and how they are used, a set of FDI policies and methods can be defined, making it possible to determine the most probable faulty UAV with at least two observers. The next subsection presents a general description of the problem. Then, potential applications are briefly discussed, and the different types of faults that may occur are analyzed. The following subsection discusses the policies for multi-UAV FDI depending on the number of available observers, in the three cases outlined above.

2.1. Problem Statement

Consider a group of UAVs executing a certain task, whether independently or jointly. Each of the UAVs carries onboard a visual tracking module, which is a device consisting of a camera, a lightweight computer board for image and data processing, and a wireless communication device for data interchange. Every UAV is identified with respect to the rest of the vehicles by a color marker placed on a visible part, preferably with a spherical shape so that its projection on the image plane of the cameras is independent of the point of view. It is assumed that UAV position and orientation sensors are subject to failures, in the sense that at any time they may stop working properly and start giving erroneous measurements. In order to increase system robustness and reliability against sensor faults, the visual information provided by the tracking modules can be exploited to check whether the UAVs of the fleet behave as expected. In the event that a particular sensor device is found to be faulty, it could even be replaced by a cooperative virtual sensor (CVS) that integrates the measurements of the tracking modules to estimate the position of the affected UAV, so it can be guided to a safer location using this estimation instead of the internal measurement, avoiding potential accidents.

Consider the scenario shown in Figure 2 as an illustrative example. Three quadrotors (named UAV-1, UAV-2, and UAV-3) are executing a certain task. At a given instant, UAVs 1 and 2 (observers) are requested to check whether there is a fault on UAV-3 (target), for example, because a human operator or UAV-3 itself has detected strange behavior and wants to verify it from an external source. The three vehicles then interrupt their tasks: the target quadrotor (UAV-3) stays hovering, while both observer UAVs move to observation points around UAV-3 and start to track it with their cameras. The observers transmit the information from their cameras, together with their own position and attitude estimations, to the target, which then executes the CVS to compare the expected position given by its internal sensors with the visual information. Finally, a fault detection and identification report is generated by the target and transmitted back to the observers, indicating the most probable source of the fault.

2.2. Applications

The proposed vision-based CVS and FDI system may be considered in any application involving three or more UAVs flying at close or medium range so that they can visually detect each other. Designing the visual tracking module as a plug-in device capable of communicating with other tracking modules through a dedicated wireless network makes its integration into the aerial platforms easier. The idea is to provide the fleet with a redundant estimation of the 3D position and velocity of any of its members in a cooperative way, along with fault detection and identification capabilities as additional features of the system, in such a way that the main application is affected as little as possible. Some scenarios where this may be useful are the following.

2.2.1. Target Tracking in Urban Canyons

In high-density urban environments like big cities, the GPS signal may be degraded by the presence of buildings. Consider, for example, a quadrotor visually tracking a car in such a scenario; the UAV needs to know its own position in order to report the localization of the tracked vehicle. For that purpose, a pair of observers flying outside the GPS-denied area can implement an instance of the CVS, visually tracking the tracker quadrotor and sending it its position estimation.

2.2.2. Formation Flight

In military applications where a team of UAVs moves jointly from one point to another, the safety and reliability of the operation could be increased with a vision-based fault-tolerant system capable of detecting and quickly recovering from faults in the navigation sensors of any member of the fleet. By deploying on the aircraft a camera that can be focused on any of the UAVs, it would be possible to periodically check whether the vehicles are located in the expected positions. If a certain UAV is not found, or is far from its expected position, a search and recovery phase would be initiated in a cooperative way to prevent the loss of the vehicle or its collision with other UAVs.

2.2.3. Flight Tests with Experimental or Under-Development Aircraft

The first flight tests of aerial vehicles under development usually carry a higher probability of accident. To mitigate this, the deployment of a group of two or more UAVs specifically dedicated to detecting potential faults in the navigation sensors of the vehicle being tested (the target) is proposed. These UAVs, acting as observers, would move jointly with the target during the take-off, landing, and flight phases, keeping the target in their field of view at all times, with a relative speed between them close to zero so that the accuracy and reliability of the CVS are as high as possible. As the take-off and landing operations are more critical, a group of observers capable of hovering might be properly placed to ensure that these operations are executed without problems.

2.2.4. UGV Recovery after Collision in Industrial Environments

In industrial environments with large surfaces where a significant number of UGVs move around autonomously, the presence of unexpected obstacles like persons, objects, or other vehicles may cause a certain UGV to crash, temporarily losing its navigation system. If the factory is equipped with a team of small quadrotors carrying visual tracking modules for fault detection and recovery, the affected UGV could be guided, using the CVS provided by these observers, to a safe place for maintenance. This eliminates the need for a person or another ground vehicle to intervene, without interfering with the operation of the factory.

2.3. Identification and Modeling of UAV Faults

A fault can be defined as any situation in which a component of a system presents an unexpected and undesirable behavior. The fault may occur suddenly, as in the case of crashes or electric short circuits, or may cause a progressive degradation of system performance that could be avoided with periodic maintenance. Some typical faults associated with lightweight UAVs like quadrotors are listed in Table 1, classified depending on which component is affected.

Table 1: Identification of possible faults in lightweight UAVs.

This paper is focused on the detection and identification of positioning sensor faults in multi-UAV applications, using image sensors for this purpose. The basic idea is to compare the expected position of the target UAV given by its internal sensors with the information provided by the observers. As will be explained later, it is necessary to know the orientation of the observers to estimate the target position, so attitude sensor faults can also be detected.

It is convenient to have an idea of what kinds of faults can affect a position sensor and how they will affect the performance of UAV control. Table 2 contains a classification of different failures that approximately describe the behavior of some positioning devices or methods, including the mathematical model of each fault, its effect on the position of the UAV, and an assessment of the boundedness of the response error. The following notation has been used:
(i) The real position of the UAV affected by faults
(ii) The position estimation given by the UAV sensor affected by faults
(iii) The time instant when the fault occurs
(iv) The time instant when the fault disappears
(v) The step function, zero before the fault occurs and one afterwards
(vi) The drift constants in the XYZ axes
(vii) The offset errors in the XYZ axes
(viii) The noise signals in the XYZ axes
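As an illustration, the drift, offset, and noise fault models summarized in Table 2 can be written as a simple fault injector of the kind used later to validate the detection methods. The function below is a hypothetical sketch; the names, signature, and parameter choices are ours, not part of the original system:

```python
import random

def inject_fault(p_true, t, t0, t1, mode,
                 drift=(0.0, 0.0, 0.0), offset=(0.0, 0.0, 0.0),
                 noise_std=0.0):
    """Return the faulty position reading at time t (illustrative sketch).

    'drift'  -> error grows linearly after the fault onset t0,
    'offset' -> constant bias while the fault is active,
    'noise'  -> zero-mean Gaussian noise while the fault is active.
    The fault is active on the interval [t0, t1).
    """
    active = (t >= t0) and (t < t1)  # step function over the fault interval
    if not active:
        return tuple(p_true)
    if mode == "drift":
        return tuple(p + k * (t - t0) for p, k in zip(p_true, drift))
    if mode == "offset":
        return tuple(p + o for p, o in zip(p_true, offset))
    if mode == "noise":
        return tuple(p + random.gauss(0.0, noise_std) for p in p_true)
    raise ValueError("unknown fault mode: " + mode)
```

For instance, an offset fault on the x-axis shifts the reading by a constant bias while active, whereas a drift fault grows linearly with the time elapsed since the fault onset.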

Table 2: Models of different position estimation faults and their consequences in UAV control.

Table 3 lists some typical position estimation technologies and their associated faults. A more in-depth study of the reliability and fault detection capabilities of the sensors being used would be advisable when designing a FDI system for a particular application, as in some cases the device itself may provide useful information for its own fault diagnosis that should be exploited.

Table 3: Identification of possible faults in some position estimation sensors and methods.
2.4. Methods and Strategies for Fault Detection and Identification

The objective of this section is to identify situations that may occur during the fault detection and identification phase that need to be analyzed separately, and to propose a particular method for the treatment of each. Fault detection should be carried out as fast as possible, using the minimum number of resources, and being as reliable as possible. As there are a large number of variations in the detection of faults depending on the number of UAVs involved, which UAVs are affected by the fault, the presence of obstacles, and other relevant factors, this analysis will try to consider the most significant case studies. From now on, subindices i and j are associated with observer UAVs, while subindex k corresponds to the target UAV, imposing that i ≠ j ≠ k.

2.4.1. Fault Detection with One Observer

Although with a single observer it is not possible to estimate the 3D position of a tracked target, it is possible to detect deviations in the target position sensor or in the observer pose. On the one hand, the tracking algorithm provides two signals: a tracking loss flag (with value TRUE or FALSE) indicating whether the target could not be detected by the tracking module, and the measured position c_ik of target UAV-k on the image plane of observer UAV-i. On the other hand, if the target position and the observer pose are known, it is possible to compute the expected position ĉ_ik of the target on the image plane of the observer. Let d_ik be the expected distance between target k and observer i, taking into account the possible presence of a fault in the position measurement of the target and/or the observer. If f denotes the focal length of the camera, then the following transversal error, projected in a plane parallel to the image plane of the observer camera, is defined:

e_ik = (d_ik / f)(c_ik - ĉ_ik)

This has been represented in Figure 3. The single observer will report an error if ||e_ik|| > γ_ik, that is, if the norm of the transversal projection error exceeds a certain threshold γ_ik that depends on the relative position between target and observer. Section 3 will cover the definition of the detection thresholds in more detail. Now, the following FDI algorithm is proposed:
(i) If the tracking loss flag is TRUE, then observer UAV-i was not able to detect target UAV-k in its field of view, and a fault in the target or in the observer is reported.
(ii) If ||e_ik|| > γ_ik, then a fault is detected in the target UAV-k position sensor or in the observer UAV-i pose sensors.
(iii) If ||e_ik|| ≤ γ_ik, then no fault is detected on UAVs i or k.
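A minimal sketch of this single-observer test, assuming pixel coordinates for the projections, a target-observer distance in metres, and a focal length expressed in pixels (all names here are illustrative, not taken from the original implementation):

```python
def transversal_error(c_meas, c_exp, dist, focal):
    """Transversal projection error (metres) from the pixel deviation.

    c_meas, c_exp: measured and expected target projections (pixels).
    dist: expected target-observer distance (m); focal: focal length (pixels).
    Scaling the image-plane deviation by dist/focal converts it into a
    displacement in a plane parallel to the image plane at the target.
    """
    du = c_meas[0] - c_exp[0]
    dv = c_meas[1] - c_exp[1]
    return (dist / focal) * (du * du + dv * dv) ** 0.5

def single_observer_fdi(lost, c_meas, c_exp, dist, focal, threshold):
    """Three-case single-observer policy: 'lost', 'fault', or 'ok'."""
    if lost:                       # tracking loss flag is TRUE
        return "lost"
    if transversal_error(c_meas, c_exp, dist, focal) > threshold:
        return "fault"             # fault in target or observer sensors
    return "ok"
```

For example, a 50-pixel deviation seen at 5 m with a 500-pixel focal length corresponds to a 0.5 m transversal error, which is then compared against the position-dependent threshold.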

Figure 3: Transversal projection error due to deviations between measured and expected projection points.

Note, however, that in case a fault is detected, it is not possible to determine whether it is located in the target, in the observer, or in both UAVs. Furthermore, the fault cannot be detected if the error in the position sensor is in the direction of the projection ray of the camera, although it could be detected by changing the point of view (POV). These two situations are illustrated in Figure 4. On the other hand, errors in the orientation sensor of the observer UAV will be detected by this method, as the expected projection of the target position on the image plane depends on the orientation of the observer.

Figure 4: Fault detection with a single observer. Fault injected on target UAV (a) and injected on observer (b). Depending on the point of view (POV), the fault can or cannot be detected.

Finally, as mentioned before, it may occur that the target cannot be found by the tracking algorithm of the observer. This may happen if the fault in the position or orientation sensor is such that the target is out of the field of view of the observer. Depending on the particular application, an emergency landing or a target search phase should be requested.

2.4.2. Fault Detection and Identification with Two Observers

In this case, both observers are used simultaneously to estimate directly the 3D position of the target, creating an instance of the CVS. In normal conditions with no faults, the estimation provided by the observers should be quite similar to the estimation given by the target sensor. If the distance between them exceeds a conveniently defined threshold, then a fault is detected, although it is not possible to determine whether the fault is in the target's or the observers' sensors. There are two ways of estimating the target position from the measurements provided by the observers: geometrically, considering the closest points between the projection rays (triangulation), or using a nonlinear probabilistic estimator like the EKF or the UKF. In the geometric method, the projection rays obtained from the projection points of the target on the image planes of the observers should ideally intersect at a point corresponding to the position of the target. When the projection rays do not cross, it is necessary to consider the closest points between these two lines. Intuitively, the distance between these points gives an idea of how reliable the estimation is. Errors in the position or orientation measurements of the observers' sensors will cause the distance between these two points to increase, which can be exploited to detect faults on the observers. This is represented in Figure 5(a), which shows the ideal fault-free case in which the projection rays cross exactly at the target. In Figure 5(b), on the other hand, there is a fault in the position sensor of observer 1; this causes the projection rays not to cross in space, and the minimum distance between these two rays can be used to detect faults in the observers.
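The closest points between the two projection rays can be computed with the standard skew-line formula. The sketch below assumes each ray is given by the observer's camera centre and a direction vector; the helper names are ours, not from the original implementation:

```python
def closest_points(o1, d1, o2, d2):
    """Closest points P1, P2 between rays o1 + s*d1 and o2 + t*d2.

    o1, o2: camera centres; d1, d2: (not necessarily unit) ray directions.
    Standard skew-line least-distance formula; for intersecting rays the
    two returned points coincide at the intersection.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    w = [x - y for x, y in zip(o1, o2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    den = a * c - b * b
    if abs(den) < 1e-12:            # parallel rays: fix s = 0 on ray 1
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / den
        t = (a * e - b * d) / den
    p1 = [o + s * k for o, k in zip(o1, d1)]
    p2 = [o + t * k for o, k in zip(o2, d2)]
    return p1, p2
```

The midpoint of the two returned points can then serve as the triangulated target position, and their separation as the reliability residual discussed above.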

Figure 5: Ideal case of projection ray intersection on target (a). Fault in observer 1 (b).

Let P_i and P_j denote the closest points between the projection rays from observers i and j to target k. The target position given by the internal sensor and that estimated by the CVS will be represented by p_k and p̂_k, respectively. Finally, δ_ij will be the fault detection threshold for the sensors of observers i and j when they are focused on target k, while ε_k will be the fault detection threshold for the target position sensors. Both thresholds will be defined taking into account the relative position between target and observers, as will be explained in Section 3. Then, the following three situations are considered for the peer-observer fault detection case:
(i) If ||P_i - P_j|| > δ_ij, then a fault is detected in the position and/or attitude sensors of observer UAV-i or observer UAV-j when they are focused on target UAV-k.
(ii) If ||P_i - P_j|| ≤ δ_ij and ||p_k - p̂_k|| > ε_k, then a fault is detected in the target UAV-k position sensor when it is observed by UAV-i and UAV-j.
(iii) If ||P_i - P_j|| ≤ δ_ij and ||p_k - p̂_k|| ≤ ε_k, then it is considered that there is no fault in the positioning sensors.

Note one important aspect related to the detection of faults in the observers' sensors: only those errors in the position and/or attitude measurements that tend to increase the distance between the closest points can be detected. For example, if both projection rays are parallel to the XY plane, then only deviations in the z-axis position and in the pitch angle will influence the term ||P_i - P_j||.

In case the target position is estimated using a Kalman filter, the error covariance matrix could be used to estimate the observers' reliability, although this option will not be analyzed here. Assuming that the observers are not affected by sensor faults, either geometric or Kalman filter estimators can be used to detect faults in the target position sensor by simply comparing the distance between the internal and cooperative estimations with the detection threshold ε_k.
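The three-case peer-observer policy can be sketched as follows, taking the midpoint of the closest points between the projection rays as the CVS position estimate (a common choice in the geometric method; the names and threshold values here are illustrative):

```python
def peer_observer_fdi(p1, p2, p_internal, delta_obs, eps_target):
    """Two-observer FDI decision (Section 2.4.2), illustrative sketch.

    p1, p2: closest points between the two projection rays;
    p_internal: target position reported by its own sensor;
    delta_obs: observer-pair threshold; eps_target: target threshold.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    if dist(p1, p2) > delta_obs:
        return "observer fault"     # rays too far apart: observer sensors suspect
    cvs = [(x + y) / 2.0 for x, y in zip(p1, p2)]  # CVS estimate: midpoint
    if dist(cvs, p_internal) > eps_target:
        return "target fault"       # internal sensor disagrees with the CVS
    return "ok"
```

Note that the observer check is evaluated first, since a large ray gap invalidates the CVS estimate used in the target check.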

2.4.3. Fault Detection and Identification with Three or More Observers: General Case

Fault detection with three or more observers can be treated as an extension of the case with two observers. Every pair of observers can obtain an independent estimation of the position of the target UAV. With N vehicles involved, the total number of possible fault situations is 2^N, including no fault in any UAV, fault in all UAVs, or any combination of faulty UAVs, whether target or observers. Each situation will have an associated probability that can be computed from the observations provided by individual observers or by peer-observers. By comparing the estimations given by different pairs of observers, it is even possible to identify individual UAVs affected by faults, which could not be done with two observers.

With N observers, there are N(N − 1)/2 pairs of observers that can provide a 3D estimation of the target UAV position using the geometric method. As in the previous case with two observers, fault detection on the observers’ sensors should be performed prior to fault detection on the target sensor. Now, with three or more pairs of observers, it is possible to identify the particular UAV affected by the fault. Consider a situation with four UAVs, where UAVs 1, 2, and 3 act as observers and UAV-4 acts as target. Imagine that the closest-point distance of observer pair 1-2 exceeds its threshold, so a fault is detected on that pair. In order to identify which is the affected observer, the other two possible combinations are evaluated; if pair 1-3 also exceeds its threshold while pair 2-3 does not, then the faulty UAV must be UAV-1. Note that the computation of the distance between the closest points of the projection rays does not depend on the target position, so a fault on the target will not influence the detection of faults on the observers’ sensors. If observers i and j are reliable, that is, if their closest-point distance is below the threshold, then their observations can be used to obtain the cooperative 3D position estimation of target k, whether using the geometric method or a nonlinear Kalman filter estimator. The second option is more appropriate when there are multiple sources of information generating measurements asynchronously and at different rates, and it also provides a certain level of rejection against noise and outliers. This requires redefining the target sensor fault detection condition in terms of a variable threshold that depends on the number of available observers and on the relative position between them and the target.
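The pairwise identification step described above can be sketched as a simple voting rule: the faulty observer is the one common to every pair that reports a fault. The function name and data layout are illustrative assumptions.

```python
from itertools import combinations

def identify_faulty_observer(pair_distance, observers, threshold):
    """pair_distance maps frozenset({i, j}) to the closest-point distance of
    observer pair (i, j). Returns the single observer present in every faulty
    pair, or None if no pair is faulty or the fault cannot be isolated."""
    faulty_pairs = [p for p in combinations(observers, 2)
                    if pair_distance[frozenset(p)] > threshold]
    if not faulty_pairs:
        return None
    suspects = set(faulty_pairs[0])
    for p in faulty_pairs[1:]:
        suspects &= set(p)            # intersect: the faulty UAV is in all pairs
    return suspects.pop() if len(suspects) == 1 else None
```

In the four-UAV example of the text, pairs 1-2 and 1-3 exceed the threshold while 2-3 does not, so the intersection singles out observer 1.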

3. FDI System Design

3.1. Fault Detection Based on Variable Threshold

As mentioned in the previous section, fault detection is performed by comparing the distance between the expected target position and the vision-based position estimation, defining index values known as residuals (either in the 3D space or in the observers’ 2D image plane). Independent residuals are constructed for each different sensor failure. Residuals are designed so that they respond to an individual failure and not to the others. In general, residuals are functions of the squared differences between the real and the estimated sensor outputs, weighted by coefficients that are determined for each failure based on experience and experimentation. In this case, the estimated sensor outputs are provided by the cooperative virtual sensor.
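A minimal sketch of such a weighted squared-difference residual; the function name and the flat vector layout are illustrative, not the paper's notation.

```python
import numpy as np

def residual(measured, estimated, weights):
    """Weighted sum of squared differences between real sensor outputs and the
    estimates provided by the cooperative virtual sensor."""
    measured, estimated, weights = map(np.asarray, (measured, estimated, weights))
    return float(np.sum(weights * (measured - estimated) ** 2))
```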

Ideally, if no fault is present, the residual would be zero. In practice, the residual will take nonzero values due to estimation errors, sensor noise, perturbations, and so on. Usually, the residual for a specific sensor will be bounded, and therefore a threshold level can be defined so that the residual is always below it in the absence of failures. The selection of the threshold is very important, because it should enable both the diagnosis of incipient faults and the minimization of the false alarm rate. In many FDI systems, a constant threshold level is used since the sensor noise and errors do not vary significantly during operation, as is the case in FDI of individual vehicles that use internal sensors for fault detection [9]. In multi-UAV systems that use range sensors and cameras for estimation and tracking of other vehicles [15] and mutual localization [36], adaptive thresholding is used. This is also the situation with the CVS, since there are many factors that affect the accuracy and even the availability of the CVS output, and therefore a variable threshold strategy for FDI is most appropriate [2, 23, 37]. The fault detection procedure is designed to decide whether the observed changes in the residual signal r_k can be justified in terms of disturbances (measurement noise) and/or modeling uncertainty, as opposed to failures. It is critical to minimize the detection delay associated with a “true” fault; furthermore, the false alarm rate should be minimized while, at the same time, no “true” faults should remain undetected.

Statistical change detection filters such as the CUSUM filter [33] can be used for robust detection of faults. This filter detects both positive and negative changes in the mean value of the residual caused by the occurrence of a fault. Although the CUSUM filter balances the detection delay with the false alarm rate, in this paper, fast fault detection has been considered the most critical aspect of the FDI system, and, therefore, a direct comparison of the residual with the threshold has been implemented, with some modifications to increase robustness. First, outliers (single points of the residual with a large change in value, which have been found to be relatively common in experiments) are discarded. Then, to activate the fault detection flag, a number of consecutive residual points are required to be above the threshold (typically 5–7).
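The detection logic above can be sketched as follows: a consecutive-sample counter raises the flag only after several successive residuals exceed the threshold, so isolated outlier spikes never accumulate enough counts and are implicitly discarded. This is an illustrative sketch, not the paper's code.

```python
def fault_flag(residuals, thresholds, n_consec=5):
    """Raise the fault flag only after n_consec consecutive residual samples
    exceed the (possibly time-varying) threshold. Isolated spikes reset the
    counter on the next in-range sample, so single-point outliers are ignored."""
    count = 0
    for r, thr in zip(residuals, thresholds):
        count = count + 1 if r > thr else 0
        if count >= n_consec:
            return True
    return False
```

For example, a single spike of value 9 followed by five consecutive over-threshold samples triggers the flag only because of the latter run.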

There are several works dealing with the optimal placement of bearing-only sensors for object position estimation and tracking (see, e.g., [34] and the references therein). In the following, a qualitative analysis of the factors that influence the accuracy of the estimation is presented.

The threshold for fault detection, which depends on the accuracy of the estimation, will in general depend on the relative position between the target and the observers and also on the number of observers actively tracking the target, being more tolerant (i.e., taking higher values) when the reliability and accuracy of the CVS estimation are lower. In general, CVS estimation accuracy will improve if (i)the distance between observers and target decreases;(ii)the number of observers actively tracking the target increases;(iii)the projection rays from observers to target are orthogonal to each other [34].

However, it is not feasible to analyze analytically the influence of these three factors on the position estimation error, not even when the position estimation is computed using the geometric method. For that reason, in this work, a heuristic approach has been followed, proposing expressions with an intuitive geometric interpretation.

3.1.1. Variable Threshold with One Observer

The transversal projection error in the image plane of a certain observer (see (1)) is compared with respect to a variable threshold defined in the following form:

Here, a constant offset threshold compensates nominal estimation errors in fault-free conditions (i.e., sensor measurement noise), and a second term grows with the expected distance from observer UAV-i to target UAV-k through a positive constant whose value is determined empirically, taking into account that lower values imply faster detections but a higher probability of false-positive detections. This definition is based on the assumption that, in normal conditions, the transversal projection error should be approximately proportional to the distance to the target. Table 4 shows typical values of these parameters obtained from the experimental results.
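A reconstruction of the variable threshold consistent with the description above, with symbol names of our own choosing since the original equation did not survive into this text, would be

```latex
\tau_i^k(t) \;=\; \tau_0 \;+\; K_d\,\hat{d}_i^k(t)
```

where $\tau_0$ is the constant offset, $\hat{d}_i^k(t)$ the expected observer-target distance, and $K_d$ the empirically tuned positive constant.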

Table 4: Typical values of the variable threshold parameters (one observer).
3.1.2. Variable Threshold with Two or More Observers

As explained in Sections 2.4.2 and 2.4.3, when the target is visually tracked by two or more observers, it is possible to detect two types of faults. On the one hand, the distance between the closest points of the projection rays (see Figure 5(a)) is associated with errors in the position/orientation measurements of any of the observers, whereas the distance between the target position given by its internal sensors and the geometrical estimation will reflect any fault in the position of the target or in the pose of the observers. This leads to the definition of two different detection thresholds. The first one allows the detection of position/orientation errors in the observers, and it is based on the separation angle of the observers and on the expected distance to the target.

Here, an offset term compensates nominal errors, and the threshold additionally depends on the separation angle between the projection rays of observers i and j when they are tracking target k, on the expected distances to the target, and on a tolerance constant that determines how restrictive the threshold is. Intuitively, the threshold will be more tolerant as the mean distance between the observers and the target increases. The separation angle compensates the uncertainty in the vision-based position estimation, which is minimal when the projection rays of both observers are orthogonal (90° separation) and increases in the depth axis when the optical axes tend to align. The degenerate case corresponds to a zero separation angle, where both axes are parallel and it is not possible to obtain an estimation. In this case, the fault detection system is not reliable, requiring the repositioning of the observers. Figure 6 represents graphically how the uncertainty in the vision-based position estimation varies with the observation angle. Reference [34] analyzes in more detail the propagation of errors in the 3D position estimation using image sensors, modelling the estimation error as a Gaussian process whose standard deviation depends on the separation angle.
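A threshold with this qualitative behavior — an offset plus a term that grows with the mean observation distance and with the inverse sine of the separation angle — can be sketched as follows. The functional form and the constants are illustrative assumptions consistent with the description, not the paper's tuned equation (4).

```python
import math

def observer_pair_threshold(d_i, d_j, gamma, tau0=0.05, k=0.02):
    """Variable threshold for the closest-point distance of observers i and j.
    gamma is the separation angle between the projection rays (radians); the
    1/sin(gamma) factor reflects the growing depth uncertainty as the optical
    axes align. tau0 and k are illustrative values, not the paper's parameters."""
    if math.sin(gamma) < 1e-3:            # degenerate case: near-parallel rays
        raise ValueError("observers must be repositioned")
    return tau0 + k * 0.5 * (d_i + d_j) / math.sin(gamma)
```

At orthogonal observation (gamma = 90°) the distance term is smallest; as gamma shrinks, the threshold loosens until the degenerate parallel case is rejected outright.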

Figure 6: Uncertainty in the estimated target position associated with the separation angle between the observers. Orthogonal observation with minimum uncertainty (a) and optical axes close to the degenerate case (b). The fault detection threshold should be adjusted according to this angle.

On the other hand, if the position and orientation sensors of the observers, and thus the CVS, are reliable, then a second threshold can be defined for detecting position sensor faults in the target UAV, as described in Sections 2.4.2 and 2.4.3. The distance between the target position given by the onboard sensors and the estimation obtained from the observers is compared to a variable threshold that takes into account the observation distances, the separation angle of the observers, and the number of observers actively tracking the target:

Similarly to the threshold defined in (4), an offset term compensates nominal errors and a tuneable tolerance constant scales the rest of the expression, which involves the mean separation angle between all the combinations of observer pairs (which should be approximately in the range 20 to 90°), the mean distance between the observers and the target, the number of observers actively tracking the target, and a tuneable parameter whose value is close to one. An inverse exponential term in the number of observers accounts heuristically for the increase in estimation accuracy with a higher number of observers; this form is suitable since the variation of the accuracy and reliability of the CVS and FDI system is most significant when the number of observers is between 1 and 4.
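A sketch of a target-sensor threshold with these ingredients; again, the exact functional form and all constants are our own illustrative assumptions, not the paper's equation (5).

```python
import math

def target_threshold(mean_dist, mean_gamma, n_obs, tau0=0.05, k=0.03, lam=1.0):
    """Variable threshold for the |internal - CVS| target position residual.
    mean_gamma is the mean separation angle over all observer pairs (radians);
    the exp(-lam*(n_obs - 1)) factor tightens the threshold as more observers
    track the target, with the largest effect for small n_obs."""
    return tau0 + k * (mean_dist / math.sin(mean_gamma)) * math.exp(-lam * (n_obs - 1))
```

With one observer the exponential factor is 1 (loosest threshold); each additional observer shrinks the distance-dependent term, mirroring the accuracy gain described in the text.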

Although the thresholds defined in (4) and (5) are similar, their application is different according to the assumption of reliability regarding the observers’ sensors and the way they are evaluated. Table 5 summarizes the features of both thresholds.

Table 5: Similarities and differences of the variable thresholds defined in (4) and (5).

Table 6 shows typical values, obtained in our experiments, of the parameters of the variable threshold for two observers. In practice, the parameters are chosen heuristically, observing the residuals in long fault-free experiments and selecting them so that the variable threshold stays above the residuals. For example, the experiment described in Section 4.1, used to evaluate the accuracy of the CVS estimation, allows determining the offset term and an initial guess of the tolerance constant. The injection of different fault patterns in the position and orientation measurements of the observers is evaluated in Section 4.2.

Table 6: Typical values of the variable threshold parameters (two observers).
3.2. FDI Policies

The usual approach for single UAV FDI follows the scheme shown by the white background blocks in Figure 7. The FDI subsystem performs the tasks of failure detection and identification by continuously monitoring the system inputs and the outputs of the sensors. Under nominal conditions, these follow predictable patterns, within a tolerance determined by the amount of uncertainty introduced by random system disturbances and measurement noise in the sensors. Usually, sensor FDI tasks are accomplished by observing when the output of a failed sensor deviates from its predicted pattern. On the other hand, in multi-UAV teams, individual UAVs can use all the available information for FDI, and this includes the measurements of the sensors onboard other UAVs, shown as the “external sensors” block with shaded background in Figure 7. The system proposed in this paper is valid for any number of UAVs of the team tracking the target UAV equal to two or greater.

Figure 7: Single UAV and cooperative FDI schemes. The block with shaded background, “external sensors,” is only available in cooperative FDI.

In cooperative fault detection, it is important to note that the observers may have to stop working on their own tasks to dedicate their time and resources to the computation of the CVS. Depending on the way the observers are managed, four fault detection policies can be defined: (i)Cooperative fault detection by continuous monitoring: a group of one or more UAVs is specifically focused on the visual tracking of a particular UAV. This approach requires continuous dedication of the observers, but it provides the fastest response in case of a fault in the target, which may be useful if this vehicle is executing a critical task. The design parameters are the number of observers required and their relative positions to the target.(ii)Cooperative fault detection on demand: one of the UAVs or a human supervisor requests a group of observers to focus on a particular UAV in order to check or prevent a possible failure. For example, if a pair of observers detects a fault during a periodic observation phase, they may require the collaboration of other UAVs in order to confirm the fault and identify its source, as described in Section 2.4.3.(iii)Cooperative fault detection by periodic observation: one or more UAVs assume the role of observers while the rest are considered as targets. Observers stop working on their own tasks and start the fault detection phase in which they focus sequentially on the rest of the vehicles to check for failures in the positioning sensors. This is illustrated in Figure 8. The observation conditions (duration and positions) and the observation period or frequency must be specified.(iv)Cooperative opportunistic fault detection: if, eventually, a certain UAV (considered as target) enters the FOV of any other UAV while the latter is executing its own task, the latter may exploit the situation by checking whether the projection on the image plane corresponds to the position given by the sensors of the target UAV.
This avoids requesting UAVs to spend their time acting as observers. The performance of this strategy could be improved, approaching the continuous monitoring policy, if the orientation of the onboard cameras were controlled independently of the aerial platform, using a pan-and-tilt system for this purpose. In this way, the tracking modules can perform their observation task while the UAVs move around during the execution of their own tasks without any interference between them.

Figure 8: Different configurations for the cooperative fault detection by periodic observation: single UAV observer (a) and two observers (b).
3.3. Safe Performance in FDI

It should be noted that the FDI procedure itself may cause an accident due to the reconfiguration of the UAVs required for the positioning of the observers around the target. Special care should be taken if any of the involved UAVs is affected by unbounded errors (as is the case of lock-in-place or drift errors) that may cause collisions between the vehicles. For this reason, it is convenient to plan the trajectories of target and observers in such a way that the probability of impact during the FDI procedure is reduced as much as possible. One possible solution is illustrated in Figure 9. In vertical take-off and landing (VTOL) vehicles like quadrotors, height is usually a reliable measurement. This feature can be exploited by making each observer and the target fly at different altitudes to prevent collisions between them. Furthermore, the order of the operation should be (1) move to a higher plane and then (2) move to the desired XY observation position.

Figure 9: Space allocation for collision avoidance FDI. Observer and target are constrained to fly at different altitudes to avoid crashes between them.

One critical issue in the fault detection and recovery process is how much time the FDI system has to react and prevent the consequences of possible failures. In this sense, the reaction time against failure (RTAF) is defined as the time available from the instant the fault occurs until it causes damage to the vehicle or the environment. In general, it will be a random variable unless the conditions of the fault are well known, although it can be bounded in terms of UAV velocity, height, and distance to obstacles. Let us consider the situation depicted in Figure 10. The position sensors of the faulty UAV cause a displacement of the vehicle in an unknown direction at a certain speed. Two obstacles are considered: another quadrotor and a wall, each at a known distance and with a known width; the width of the target quadrotor is also taken into account. The goal is to compute the RTAF for every obstacle, as well as to estimate the probability of collision against the obstacles, assuming that the heading angle of the quadrotor is a uniformly distributed random variable in the range [0, 2π). Applying simple geometric relationships, the RTAF and the collision probability for the ith obstacle can be derived.
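Under the stated assumptions (straight-line drift at constant speed, uniformly random heading), the RTAF and collision probability might be computed as below. The geometric expressions are our reconstruction of the "simple geometric relationships" mentioned in the text, not the paper's exact formulas.

```python
import math

def rtaf(d_i, v):
    """Reaction time against failure: time for a UAV drifting at speed v to
    cover the distance d_i to obstacle i."""
    return d_i / v

def collision_probability(d_i, w_i, w_t):
    """Probability that a heading drawn uniformly from [0, 2*pi) points inside
    the angular sector subtended by obstacle i (width w_i) at distance d_i, as
    seen from the drifting UAV of width w_t. Simplified small-obstacle geometry."""
    half_angle = math.atan((w_i + w_t) / (2.0 * d_i))
    return (2.0 * half_angle) / (2.0 * math.pi)
```

Both quantities shrink as the obstacle distance grows, which matches the intuition that farther obstacles give the FDIR system more time and a smaller chance of impact.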

Figure 10: Model considered for the estimation of the RTAF with two obstacles and the respective collision probabilities.

These two variables should be monitored in real time, and, in case they exceed a certain threshold, the FDIR system should be activated to increase safety and reliability.

4. Experimental Validation

This section presents experimental results (a video of the experiments can be found in [35]) that validate the vision-based cooperative virtual sensor and the fault detection methods developed in this work for the case with two observers and one target. A set of experiments have been carried out in the CATEC multivehicle indoor testbed, which is equipped with a Vicon Motion Capture System that provides position and orientation ground truth of the objects in a 15 × 15 × 5 m volume with submillimeter accuracy. The UAVs employed in the experiments were three Hummingbird quadcopters from Ascending Technologies (see Figure 11), also provided by CATEC. Two of them were considered observers and equipped with a tracking module consisting of an Odroid U3 computer board, a Logitech C525 USB camera, and a dual-band WiFi module, while the third played the role of target, so it was endowed with a color marker. The specifications of the tracking modules and the aerial platform are listed in Table 7.

Figure 11: Hummingbird quadrotors with the tracking module (a) and color marker (b).
Table 7: Specifications of the tracking module and the aerial platforms employed in the experiments.
4.1. Accuracy of the CVS Estimation

Several experiments were conducted in the testbed to evaluate the accuracy of the CVS, implemented with an extended Kalman filter that integrates the measurements provided by the tracking modules (pose of the onboard camera and target centroid on the image plane), obtaining the Cartesian position and velocity of the tracked target. The knowledge extracted from the analysis of the experimental results leads to the definition of the fault detection methods based on variable thresholds described in Sections 2.4 and 3.1. In particular, the offset component of the threshold is determined from the mean value of the estimation error in static observation conditions, whereas the tolerance constants can be tuned taking into account the variation of the error for different observation distances and separation angles. Figure 12 represents the CVS position estimation error for the case with two observers. The shaded area indicates an interval in which the estimation becomes monocular because one of the tracking modules loses the target. The y-axis corresponds to the direction of the bisector line between the rays from the observers to the target and thus usually shows a larger error than the x-axis, which is perpendicular to it.

Figure 12: Position error (X, Y, Z) of the CVS estimation with respect to the Vicon ground truth. Experiment with two observers at a relative angle of about 75° and a distance to the target of about 1.5 m. The shaded area corresponds to a CVS estimation with only one observer.

As can be seen in Figure 12, the estimated Cartesian position presents a different offset error in each axis. This happens frequently and is due to camera calibration variations. When the CVS is enabled, these offset terms are subtracted from the error signal by averaging it over the first period of the experiment. As an example, Table 8 presents the standard deviation of the estimation errors of the CVS with respect to the Vicon ground truth for a mean distance of the observers to the target of 1.5 m and a relative angle of about 75°.

Table 8: Standard deviation of the CVS estimation errors with respect to the Vicon ground truth.
4.2. CVS for Target UAV Position Estimation

In these experiments, the sequence of time-stamped images and the corresponding measurements were processed offline to obtain the cooperative estimation of the target position. These data were used to evaluate the fault detection methods explained above, as the flight-time limitation imposed by the batteries of the quadcopters does not allow analyzing the data properly in real time.

The 3D trajectories followed by the target and the two observers during the fault detection and recovery phases are represented in Figure 13, while the target position ground truth given by Vicon and the estimation provided by the CVS are compared in Figure 14. The corresponding estimation error is represented in Figure 15. The relative positions between the UAVs were kept constant to make the visual tracking more reliable: the two observers moved jointly with the target, maintaining a constant relative position between each observer and the target. The observation positions in the x-, y-, and z-axes (taking the target position as origin) were [−1, 1, 1] and [1] for observers 1 and 2.

Figure 13: Trajectory performed by the two observers and the target in the fault recovery experiment.
Figure 14: Target position given by Vicon (taken as ground truth) and CVS for the fault detection by periodic observation experiment. CVS computations were done offline. The different phases and transitions during the experiment have been marked. Target speed was set to 0.2 m/s.
Figure 15: Target position estimation error associated to the CVS in the fault detection by periodic observation experiment. CVS estimations were computed offline. Target speed was set to 0.2 m/s.
4.3. CVS for FDI on Target UAV

The fault detection and identification capabilities are demonstrated graphically by injecting offline the fault pattern described in Table 9 into the Vicon position data of one of the observers and of the target quadrotor.

Table 9: Fault pattern injected over observers and target position sensors.

For the identification of the fault, a simple rule is proposed when observers i and j are focused on target k: (i)If observer-i detects a fault but observer-j does not, then the fault is located on observer-i.(ii)If both observer-i and observer-j detect a fault, then the fault is located on the target.
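The rule above can be expressed directly in code (an illustrative sketch; names are ours):

```python
def locate_fault(obs_i_detects, obs_j_detects):
    """Identification rule for observers i and j focused on target k: both
    agree -> target fault; exactly one detects -> that observer is faulty;
    neither detects -> no fault located."""
    if obs_i_detects and obs_j_detects:
        return "target"
    if obs_i_detects != obs_j_detects:
        return "observer_i" if obs_i_detects else "observer_j"
    return None
```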

The interpretation of these results has to be made taking into account the observation conditions. Figure 16 shows the relative positions between the observers and the target in the XY plane. Both observers were hovering one meter above the target, maintaining a constant distance to it.

Figure 16: Relative position in the XY plane between the observers and the target for the fault detection and identification experiment. The observers are flying one meter above the target.
4.3.1. Fault Injected on Observer 1 Position Measurement

Figure 17 shows the projection error associated with observer 1 and observer 2. Recall that the projection error is defined as the distance between the expected and measured projection points of the target on the image sensor of the observer. The expected point is computed from the target position given by its internal sensor (subject to failures) and from the camera pose, while the measured point is given by the tracking algorithm. The red line in the plot is a constant threshold of 150 pixels (note that the image resolution was 640 × 480) that was adjusted taking into account the offset projection error in normal conditions, although in general it will depend on the distance between target and observer. As expected, only observer 1 reports a fault when it is injected in the position measurements of the associated UAV. It is interesting to observe in the left picture of Figure 17 that sensor faults are more easily detectable in some particular directions.
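The expected projection point and the resulting pixel-space error can be sketched with a standard pinhole camera model. The intrinsics, frame convention, and function signature are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

def projection_error(target_pos, cam_pos, cam_R, fx, fy, cx, cy, measured_px):
    """Distance in pixels between the target's expected projection, computed
    from its reported position and the observer's camera pose, and the point
    returned by the tracker. cam_R rotates world coordinates into the camera
    frame (z forward); fx, fy, cx, cy are pinhole intrinsics."""
    p_cam = cam_R @ (np.asarray(target_pos) - np.asarray(cam_pos))
    if p_cam[2] <= 0:
        raise ValueError("target behind the camera")
    u = fx * p_cam[0] / p_cam[2] + cx          # perspective projection
    v = fy * p_cam[1] / p_cam[2] + cy
    return float(np.hypot(u - measured_px[0], v - measured_px[1]))
```

A fault injected into the target's reported position shifts the expected point while the tracker's measured point stays put, which is exactly the residual plotted in Figure 17.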

Figure 17: Distance between expected and measured projection points in observer 1 and in observer 2 image plane when the fault pattern is injected in observer 1 position sensor. The detection threshold has been represented in red.
4.3.2. Fault Injected on Target Position Measurement

The same fault pattern as in the previous case is injected this time into the position measurement given by Vicon of the target UAV. As seen in Figure 18, both observers detect the fault, so it is inferred that the fault is located on the target.

Figure 18: Distance between expected and measured projection points in observer 1 and observer 2 image plane when the fault pattern is injected in target position sensor.
4.3.3. Fault Injected on Observer 1 Heading Measurement

As the projection error depends on the expected projection point, which in turn depends on the camera orientation, heading (yaw angle) errors can also be detected. Errors in roll and pitch are not considered because UAV stability critically depends on both angles, although they are detectable too. Figure 19 represents the fault detection report provided by observer 1 when different offset errors are injected on the yaw angle (the attitude was represented using Euler-XYZ angles).

Figure 19: Distance between expected and measured projection points in observer 1 image plane when different offset errors are injected in yaw angle.

5. Conclusions and Future Work

This article has presented several methods, strategies, and policies for vision-based fault detection and identification in the position and orientation sensors of any UAV member of a fleet in a multi-UAV application. The visual information provided by the tracking modules onboard the UAVs is exploited in two ways. On the one hand, with two or more observers, it is possible to create a cooperative virtual sensor (CVS) that provides a redundant estimation of the target position as long as the observers’ sensors are reliable. On the other hand, by comparing the expected target position given by its internal sensors with the measured position given by the tracking algorithms, the observers are able to report, individually or cooperatively, any fault on those sensors. Several experiments have been carried out in a multivehicle indoor testbed equipped with a Vicon Motion Capture System and three quadcopters, demonstrating the accuracy of the CVS and the FDI capabilities.

Two research lines are proposed as future work. Firstly, a method to search for and redetect a target lost after a position sensor fault should be investigated, using, for example, the last reported position and velocity measurements as an initial guess. This is necessary since the CVS requires that the target be within the field of view of the observers during the initialization phase. Different cooperative search strategies can be defined depending on the number of UAVs involved, the way the search space is divided between the participants, and the way the trajectories are planned. Secondly, it would be interesting to develop tracking modules that allow orienting the onboard camera with a pan-and-tilt system, instead of using the aerial platform for this purpose. This makes it possible to execute the UAV task and the FDI process in parallel, improving the response time.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work has been supported by the AEROARMS (H2020, Reference 644271) and EC-SAFEMOBIL (FP7, Reference 288082) projects, funded by the European Commission, and by the AEROCROS (DPI2015-71524-R) and AEROMAIN (DPI2014-5983-C2-1-R) projects, funded by the Spanish Ministerio de Economia, Industria y Competitividad. The research activity of Alejandro Suarez is supported by the FPU Program of the Spanish Ministerio de Educacion, Cultura y Deporte. The authors wish to thank Miguel Angel Trujillo for his help in the experiments and CATEC for its support during the experiments carried out in its testbed.

Supplementary Materials

Video files attached to this paper: FIDR_2Observers_1Target.mp4: self-explanatory video of the fault detection and recovery experiments with three quadrotors, two of them acting as observers and one as the faulty UAV (target). The UAVs execute a patrolling task until the target stops due to an internal sensor fault. A help request message is sent to the observers, who interrupt their task and position themselves around the target. Then, the CVS instance is created and the recovery operation is executed until the target UAV has landed. FIDR_3Observers_1Target.mp4: video of an experiment with three observers and one target UAV. FDIR_OnboardCamera_Observer1.mp4: RGB and backprojection images from the onboard camera of the observer 1 UAV. The sequence covers the detection of the target UAV and its tracking until it has landed. The video files are available at the following link: https://hdvirtual.us.es/discovirt/index.php/s/yJzu3JPyMsW3sVB. (Supplementary Materials)

References

  1. A. Ollero and L. Merino, “Control and perception techniques for aerial robotics,” Annual Reviews in Control, vol. 28, no. 2, pp. 167–178, 2004.
  2. R. Isermann, Fault-Diagnosis Systems: An Introduction from Fault Detection to Fault Tolerance, Springer-Verlag, 2006.
  3. C. Hajiyev and F. Caliskan, Fault Diagnosis and Reconfiguration in Flight Control Systems, Springer, Boston, MA, USA, 2003.
  4. S. Gururajan, M. Fravolini, M. Rhudy, and A. Moschitta, “Evaluation of sensor failure detection, identification and accommodation (SFDIA) performance following common-mode failures of pitot tubes,” SAE Technical Paper 2014-01-2164, pp. 1–9, 2014.
  5. G. Ducard, K. Rudin, S. Omari, and R. Siegwart, “Strategies for sensor-fault compensation on UAVs: review, discussions & additions,” in 2014 European Control Conference (ECC), pp. 1963–1968, Strasbourg, France, 2014.
  6. G. Ducard, “Actuator fault detection in UAVs,” in UAV Handbook, pp. 1071–1122, Springer, Dordrecht, Netherlands, 2014.
  7. S. Gururajan, M. Fravolini, H. Chao, M. Rhudy, and R. Napolitano, “Performance evaluation of neural network based approaches for airspeed sensor failure accommodation on a small UAV,” in 21st Mediterranean Conference on Control and Automation, pp. 603–608, Chania, Crete, Greece, 2013.
  8. X. Qi, J. Qi, D. Theilliol et al., “A review on fault diagnosis and fault tolerant control methods for single-rotor aerial vehicles,” Journal of Intelligent & Robotic Systems, vol. 73, no. 1-4, pp. 535–555, 2014.
  9. G. Heredia, A. Ollero, M. Bejar, and R. Mahtani, “Sensor and actuator fault detection in small autonomous helicopters,” Mechatronics, vol. 18, no. 2, pp. 90–99, 2008.
  10. G. Heredia and A. Ollero, “Detection of sensor faults in small helicopter UAVs using observer/Kalman filter identification,” Mathematical Problems in Engineering, vol. 2011, Article ID 174618, 20 pages, 2011.
  11. N. Léchevin, C. A. Rabbath, M. Shanmugavel, A. Tsourdos, and B. A. White, “An integrated decision, control and fault detection scheme for cooperating unmanned vehicle formations,” in 2008 American Control Conference, pp. 1997–2002, Seattle, WA, USA, 2008.
  12. H. Yang, L. Shao, F. Zheng, L. Wang, and Z. Song, “Recent advances and trends in visual tracking: a review,” Neurocomputing, vol. 74, no. 18, pp. 3823–3831, 2011.
  13. A. Yilmaz, O. Javed, and M. Shah, “Object tracking: a survey,” ACM Computing Surveys, vol. 38, no. 4, pp. 13–es, 2006.
  14. D. Moreno-Salinas, A. Pascoal, and J. Aranda, “Sensor networks for optimal target localization with bearings-only measurements in constrained three-dimensional scenarios,” Sensors, vol. 13, no. 8, pp. 10386–10417, 2013.
  15. B. Bethke, M. Valenti, and J. How, “Cooperative vision based estimation and tracking using multiple UAVs,” in Advances in Cooperative Control and Optimization, M. J. Hirsch, P. M. Pardalos, R. Murphey, and D. Grundel, Eds., vol. 369 of Lecture Notes in Control and Information Sciences, pp. 179–189, Springer, Berlin, Heidelberg, 2007.
  16. R. Sattigeri, Adaptive Estimation and Control with Application to Vision-Based Autonomous Formation Flight, [Ph.D. thesis], Georgia Institute of Technology, 2007.
  17. D. B. Wilson, A. H. Goktogan, and S. Sukkarieh, “Vision-aided guidance and navigation for close formation flight,” Journal of Field Robotics, vol. 33, no. 5, pp. 661–686, 2016. View at Publisher · View at Google Scholar · View at Scopus
  18. D. Wilson, S. Sukkarieh, and A. Goktogan, “Experimental validation of a drogue estimation algorithm for autonomous aerial refueling,” in 2015 IEEE International Conference on Robotics and Automation (ICRA), pp. 5318–5323, Seattle, WA, USA, 2015. View at Publisher · View at Google Scholar · View at Scopus
  19. L. Pollini, R. Mati, and M. Innocenti, “Experimental evaluation of vision algorithms for formation flight and aerial refueling,” in AIAA Modeling and Simulation Technologies Conference, Providence, RI, USA, 2004. View at Publisher · View at Google Scholar
  20. J. Ha, C. Alvino, G. Pryor, M. Niethammer, E. Johnson, and A. Tannenbaum, “Active contours and optical flow for automatic tracking of flying vehicles,” in Proceedings of the 2004 American Control Conference, pp. 3441–3446, Boston, MA, USA, 2004.
  21. R. A. Carrasco, F. Nuñez, and A. Cipriano, “Fault detection and isolation in cooperative mobile robots using multilayer architecture and dynamic observers,” Robotica, vol. 29, no. 4, pp. 555–562, 2011. View at Publisher · View at Google Scholar · View at Scopus
  22. M. Hashimoto, T. Ishii, and K. Takahashi, “Sensor fault detection and isolation for mobile robots in a multi-robot team,” in 2009 35th Annual Conference of IEEE Industrial Electronics, pp. 2348–2353, Porto, Portugal, 2009. View at Publisher · View at Google Scholar · View at Scopus
  23. G. Heredia, F. Caballero, I. Maza, L. Merino, A. Viguria, and A. Ollero, “Multi-unmanned aerial vehicle (UAV) cooperative fault detection employing differential global positioning (DGPS), inertial and vision sensors,” Sensors, vol. 9, no. 9, pp. 7566–7579, 2009. View at Publisher · View at Google Scholar · View at Scopus
  24. G. Heredia, J. Martín, and L. de Paz, “Fixed wing aerial robot with deployable ground robot for soil sampling,” in Workshop “Aerial robots physically interacting with the environment”, European Robotics Forum, Odense, Denmark, 2012.
  25. P. Samara, G. Fouskitakis, J. Sakellariou, and S. Fassois, “Aircraft angle-of-attack virtual sensor design via a functional pooling NARX methodology,” in 2003 European Control Conference (ECC), pp. 1816–1821, Cambridge, UK, 2003.
  26. M. Oosterom and R. Babuska, “Virtual sensor for fault detection and isolation in flight control systems - fuzzy modeling approach,” in Proceedings of the 39th IEEE Conference on Decision and Control (Cat. No.00CH37187), pp. 2645–2650, Sydney, NSW, Australia, 2000. View at Publisher · View at Google Scholar
  27. A. Tomczyk, “Simple virtual attitude sensors for general aviation aircraft,” Aircraft Engineering and Aerospace Technology, vol. 78, no. 4, pp. 310–314, 2006. View at Publisher · View at Google Scholar · View at Scopus
  28. G. Heredia and A. Ollero, “Virtual sensor for failure detection, identification and recovery in the transition phase of a morphing aircraft,” Sensors, vol. 10, no. 3, pp. 2188–2201, 2010. View at Publisher · View at Google Scholar · View at Scopus
  29. G. R. Bradsky, “Computer vision face tracking for use in a perceptual user interface,” Intel Technology Journal, vol. Q2, pp. 214–219, 1998. View at Google Scholar
  30. D. Exner, E. Bruns, D. Kurz, A. Grundhöfer, and O. Bimber, “Fast and robust CAMShift tracking,” in 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, pp. 9–16, San Francisco, CA, USA, 2010. View at Publisher · View at Google Scholar · View at Scopus
  31. A. Andreopoulos and J. K. Tsotsos, “50 years of object recognition: directions forward,” Computer Vision and Image Understanding, vol. 117, no. 8, pp. 827–891, 2013. View at Publisher · View at Google Scholar · View at Scopus
  32. A. Suarez, G. Heredia, and A. Ollero, “Cooperative sensor fault recovery in multi-UAV systems,” in 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 1188–1193, Stockholm, Sweden, 2016. View at Publisher · View at Google Scholar · View at Scopus
  33. G. Campa, M. L. Fravolini, B. Seanor et al., “On-line learning neural networks for sensor validation for the flight control system of a B777 research scale model,” International Journal of Robust and Nonlinear Control, vol. 12, no. 11, pp. 987–1007, 2002. View at Publisher · View at Google Scholar · View at Scopus
  34. J. N. Sanders-Reed, “Error propagation in two-sensor three-dimensional position estimation,” Optical Engineering, vol. 40, no. 4, pp. 627–636, 2001. View at Publisher · View at Google Scholar · View at Scopus
  35. “Video of the experiments,” https://hdvirtual.us.es/discovirt/index.php/s/yJzu3JPyMsW3sVB.
  36. M. Cognetti, P. Stegagno, A. Franchi, and G. Oriolo, “Two measurement scenarios for anonymous mutual localization in multi-UAV systems,” IFAC Proceedings Volumes, vol. 45, no. 28, pp. 13–18, 2012. View at Publisher · View at Google Scholar · View at Scopus
  37. G. Heredia, F. Caballero, I. Maza, L. Merino, A. Viguria, and A. Ollero, “Multi-UAV cooperative fault detection employing vision-based relative position estimation,” IFAC Proceedings Volumes, vol. 41, no. 2, pp. 12093–12098, 2008. View at Publisher · View at Google Scholar