# Multi-Unmanned Aerial Vehicle (UAV) Cooperative Fault Detection Employing Differential Global Positioning (DGPS), Inertial and Vision Sensors


## Abstract


## 1. Introduction

## 2. Sensor FDI in Small Autonomous Helicopters

#### 2.1. Inertial, magnetometer and altitude sensor faults

The coefficients α_{i,j} and β_{i,j,k}, with i = 1, …, m, of the model have to be determined by the identification approach. The term ε_{i}(t) accounts for the modeling error, which is due to process noise, parameter variations, etc. The ARX models are chosen with the structure that achieves the smallest Akaike's Information Theoretic Criterion (AIC) [12], according to a simple search algorithm in which the first half of the data is used for estimation and the second half for cross-validation.

The residuals r_{k} are functions of the squared difference between the real (c_{i}) and estimated (ĉ_{i}) sensor outputs. The weighting coefficients are determined for each failure based on experience and experimentation. Ideally, if no fault is present, the residual would be zero. In practice, the residual takes non-zero values due to estimation errors, sensor noise, etc. Usually, the residual for a specific sensor is bounded, and therefore a "threshold level" can be defined so that the residual always stays below it in the absence of failures. The system has been tested with different sensors and failure types, and the implemented sensor FDI system is able to detect many of these errors.
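As a rough illustration only (not the authors' implementation), a single-output ARX estimator and the weighted squared-error residual described above can be sketched as follows; the function names and the least-squares fitting procedure are assumptions:

```python
import numpy as np

def fit_arx(y, u, na, nb):
    """Least-squares fit of a single-output ARX model:
    y[t] = a_1*y[t-1] + ... + a_na*y[t-na] + b_1*u[t-1] + ... + b_nb*u[t-nb]."""
    n = max(na, nb)
    # One regressor row per sample: past outputs, then past inputs.
    Phi = np.array([np.concatenate((y[t - na:t][::-1], u[t - nb:t][::-1]))
                    for t in range(n, len(y))])
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta

def arx_residual(y, u, theta, na, nb, weight=1.0):
    """Weighted squared difference between the measured output c_i
    and the ARX-estimated output ĉ_i."""
    n = max(na, nb)
    r = np.zeros(len(y))
    for t in range(n, len(y)):
        phi = np.concatenate((y[t - na:t][::-1], u[t - nb:t][::-1]))
        r[t] = weight * (y[t] - phi @ theta) ** 2
    return r
```

Splitting the data as in the text, `fit_arx` would run on the first half and the residual would be evaluated on the second half for cross-validation.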

#### 2.2. Differential GPS sensor failures

The upper plot of the figure shows the residual signal (r_{1}) generated by the ARX estimator. The fault occurs at t = 18 s; shortly afterwards, the residual rises above the threshold value (dashed line). The lower plot shows the fault detection signal (D, which equals 1 when a fault is detected, and is zero otherwise).
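A minimal sketch of the thresholded detection signal D described above; the persistence count is an assumption added here for robustness against single noisy samples (the text only specifies a threshold level):

```python
import numpy as np

def detection_signal(residual, threshold, persistence=1):
    """Binary fault signal D(t): 1 once the residual has stayed above the
    threshold for `persistence` consecutive samples, 0 otherwise."""
    D = np.zeros(len(residual), dtype=int)
    run = 0
    for t, r in enumerate(residual):
        run = run + 1 if r > threshold else 0
        D[t] = int(run >= persistence)
    return D
```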

## 3. Multi-UAV Vision-Based Relative Position Estimation

**A**_{1} and **A**_{2} are the intrinsic calibration matrices of both cameras, **t**_{2} is the relative translation of the second camera point of view in the first camera coordinate frame, **n** is a unitary vector normal to the plane in the first camera coordinate frame (in the outward camera direction), w = 1/z, z being the distance from the first camera to the plane, and **R**_{2} is the rotation matrix that transforms a vector in the first camera coordinate frame into a vector expressed in the second camera coordinate frame. Thus, the relative displacement between two UAVs is obtained by computing the plane-induced homography matrix between their cameras (**H**^{12}).
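The paper's homography equation is not reproduced above, so as an assumed sketch the standard plane-induced form **H** = **A**_{2}(**R**_{2} + w **t**_{2}**n**^{T})**A**_{1}^{-1} is used here (sign conventions may differ from the original); the demo verifies that a point on the plane projects consistently through both cameras:

```python
import numpy as np

def plane_homography(A1, A2, R2, t2, n, z):
    """Plane-induced homography mapping homogeneous pixels of camera 1
    to camera 2 for 3D points lying on the plane at distance z."""
    w = 1.0 / z
    return A2 @ (R2 + w * np.outer(t2, n)) @ np.linalg.inv(A1)

def project(A, X):
    """Pinhole projection of a 3D point expressed in the camera frame."""
    p = A @ X
    return p[:2] / p[2]
```

Here `plane_homography` and `project` are illustrative names, not functions from the authors' software.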

**H**^{12} is the homography matrix that relates the two images gathered by UAVs 1 and 2 at image 0, computed by blob matching. The homography matrix **H**^{j}_{0i} between image i and image 0 for UAV j is computed by composing the homography matrices between consecutive images. For instance, **H**^{2}_{02} = **H**^{2}_{01} **H**^{2}_{12}. Combining the homographies **H**^{2}_{0i} with **H**^{12} determines the homography matrices of UAV 2 at different time instants along its trajectory with respect to the view of UAV 1 at image 0; for image 2, this combined homography is **H**^{12} **H**^{2}_{01} **H**^{2}_{12}. More details are given in [2].
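The chaining of consecutive homographies, and the h_{33} = 1 normalization used for them, amount to plain 3×3 matrix products; a small sketch (helper names are assumptions, not the authors' code):

```python
import numpy as np

def chain(*Hs):
    """Compose homographies by matrix product: chain(Ha, Hb) applies Hb
    first and then Ha, matching H_02 = H_01 @ H_12."""
    out = np.eye(3)
    for H in Hs:
        out = out @ H
    return out

def normalize(H):
    """Rescale a homography so that its (3,3) entry equals 1."""
    return H / H[2, 2]
```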

The homography matrices are normalized so that h_{33} = 1. This standard deviation has been computed with the data obtained in many experiments with the real vehicles. From these results it can be stated that, in general, the computed relative position is very good if these deviations are lower than 1, good between 1 and 4, acceptable if they are between 4 and 7, and usually useless for values higher than 8. Notice that the correlation with the GPS has been intentionally discarded.
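The qualitative bands above can be encoded directly; this hypothetical helper takes the larger of the two deviations (an assumption, since the text does not say how UDev and VDev are combined) and treats values above 7 as unusable, although the text leaves the 7–8 range unspecified:

```python
def relative_position_quality(udev, vdev):
    """Map homography standard deviations to the qualitative bands
    reported in the text."""
    d = max(udev, vdev)
    if d < 1.0:
        return "very good"
    if d <= 4.0:
        return "good"
    if d <= 7.0:
        return "acceptable"
    return "usually useless"
```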

## 4. Planning for Multi-UAV FDI

## 5. Experimental Results

## 6. Conclusions

## Acknowledgments

## References and Notes

1. Ollero, A.; Merino, L. Control and perception techniques for aerial robotics. Annu. Rev. Control **2004**, 28, 167–178.
2. Merino, L.; Wiklund, J.; Caballero, F.; Moe, A.; De Dios, J.R.M.; Forssén, P.E.; Nordberg, K.; Ollero, A. Vision-based multi-UAV position estimation. IEEE Robot. Autom. Mag. **2006**, 13, 53–62.
3. Napolitano, M.; Windon, D.; Casanova, J.; Innocenti, M.; Silvestri, G. Kalman filters and neural-network schemes for sensor validation in flight control systems. IEEE Trans. Control Syst. Technol. **1998**, 6, 596–611.
4. Drozeski, G.; Saha, B.; Vachtsevanos, G. A fault detection and reconfigurable control architecture for unmanned aerial vehicles. Proceedings of the IEEE Aerospace Conference, Big Sky, MT, USA, March 5–12, 2005.
5. Heredia, G.; Ollero, A.; Bejar, M.; Mahtani, R. Sensor and actuator fault detection in small autonomous helicopters. Mechatronics **2008**, 18, 90–99.
6. Heredia, G.; Ollero, A. Sensor fault detection in small autonomous helicopters using observer/Kalman filter identification. Proceedings of the 5th IEEE International Conference on Mechatronics, Málaga, Spain, April 2009.
7. Prieto, J.C.; Croux, C.; Jimenez, A.R. RoPEUS: A new robust algorithm for static positioning in ultrasonic systems. Sensors **2009**, 9, 4211–4229.
8. Hewitson, S.; Wang, J. GNSS receiver autonomous integrity monitoring with a dynamic model. J. Navig. **2007**, 60, 247–263.
9. Patton, R.J.; Chen, J. Observer-based fault detection and isolation: Robustness and applications. Control Eng. Pract. **1997**, 5, 671–682.
10. Napolitano, M.; An, Y.; Seanor, B. A fault tolerant flight control system for sensor and actuator failures using neural networks. Aircr. Des. **2000**, 3, 103–128.
11. Samy, I.; Postlethwaite, I.; Gu, D. Detection of additive sensor faults in an Unmanned Air Vehicle (UAV) model using neural networks. Proceedings of the UKACC International Conference on Control, Manchester, UK, September 2–4, 2008; paper Tu05.05.
12. Ljung, L. System Identification: Theory for the User, 2nd ed.; Prentice-Hall: Upper Saddle River, NJ, USA, 1999.
13. Yun-chun, Y.; Hatch, R.R.; Sharpe, R.T. Minimizing the integer ambiguity search space for RTK. Wuhan Univ. J. Nat. Sci. **2003**, 8, 485–491.
14. Goodman, J. The space shuttle and GPS: A safety-critical navigation upgrade. Proceedings of the 2nd International Conference on COTS-Based Software Systems, Ottawa, Canada, February 10–12, 2003.
15. Caballero, F.; Merino, L.; Ferruz, J.; Ollero, A. Improving vision-based planar motion estimation for unmanned aerial vehicles through online mosaicing. Proceedings of the IEEE International Conference on Robotics and Automation, Orlando, FL, USA, May 15–19, 2006; pp. 2860–2865.
16. Forssén, P.E. Low and Medium Level Vision Using Channel Representations. PhD thesis No. 858, Linköping University, Linköping, Sweden, 2004.
17. Sandholm, T. An implementation of the contract net protocol based on marginal cost calculations. Proceedings of the 11th International Workshop on Distributed Artificial Intelligence, Washington, DC, USA, January 1993; pp. 256–262.
18. Smith, G. The contract net protocol: High-level communication and control in a distributed problem solver. IEEE Trans. Comput. **1980**, 29, 1104–1113.
19. Remuss, V.; Musial, M.; Hommel, G. MARVIN: An autonomous flying robot based on mass market. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Lausanne, Switzerland, September 30–October 4, 2002; pp. 23–28.

**Figure 2.** Outlier GPS-z sensor failure detection. Upper plot: residual signal. Lower plot: fault detection signal.

**Figure 3.** Slow-growing drift GPS-z sensor failure detection. Upper plot: residual signal r_{1}. Lower plot: fault detection signal.

**Figure 6.** An example illustrating the re-planning process: (a) UAVs following the initial plan. (b) Dynamic replanning: the plan of UAV B changes to include wp6.

**Figure 7.** Residual r_{2} for a position estimation sequence with initial standard deviations of UDev = 5.8 and VDev = 6.1, and a threshold level of 4.5.

**Figure 8.** Residual r_{2} for a position estimation sequence with initial standard deviations of UDev = 2.3 and VDev = 1.5, and a threshold level of 1.5.

| | wp1 | wp2 | wp3 | wp4 | wp5 |
|---|---|---|---|---|---|
| UAV A | 1 | 1 | 0 | 0 | 0 |
| UAV B | 0 | 0 | 0 | 1 | 1 |
| UAV C | 0 | 0 | 1 | 0 | 0 |

© 2009 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

**MDPI and ACS Style**

Heredia, G.; Caballero, F.; Maza, I.; Merino, L.; Viguria, A.; Ollero, A.
Multi-Unmanned Aerial Vehicle (UAV) Cooperative Fault Detection Employing Differential Global Positioning (DGPS), Inertial and Vision Sensors. *Sensors* **2009**, *9*, 7566-7579.
https://doi.org/10.3390/s90907566
