Open Access Article
Sensors 2015, 15(5), 10948–10972

Autonomous Aerial Refueling Ground Test Demonstration—A Sensor-in-the-Loop, Non-Tracking Method

Advanced Scientific Concepts Inc., 135 East Ortega Street, Santa Barbara, CA 93101, USA
Author to whom correspondence should be addressed.
Academic Editor: Felipe Gonzalez Toro
Received: 15 October 2014 / Revised: 30 April 2015 / Accepted: 4 May 2015 / Published: 11 May 2015
(This article belongs to the Special Issue UAV Sensors for Environmental Monitoring)
An essential capability for an unmanned aerial vehicle (UAV) to extend its airborne duration without increasing the size of the aircraft is autonomous aerial refueling (AAR). This paper proposes a sensor-in-the-loop, non-tracking method for probe-and-drogue-style autonomous aerial refueling tasks that combines sensitivity adjustments of a 3D Flash LIDAR camera with computer-vision-based image-processing techniques. The method overcomes the inherent ambiguity that arises when reconstructing 3D information from traditional 2D images by taking advantage of ready-to-use 3D point cloud data from the camera, followed by well-established computer vision techniques. These techniques include curve-fitting algorithms and outlier removal with the random sample consensus (RANSAC) algorithm to reliably estimate the drogue center in 3D space and to establish the relative position between the probe and the drogue. To demonstrate the feasibility of the proposed method on a real system, a ground navigation robot was designed and fabricated. Results presented in the paper show that, using images acquired from a 3D Flash LIDAR camera as real-time visual feedback, the ground robot is able to track a moving simulated drogue and continuously narrow the gap between itself and the target autonomously.
Keywords: 3D Flash LIDAR; autonomous aerial refueling; computer vision; UAV; probe and drogue; markerless
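The RANSAC-based center estimation mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the assumption that the drogue rim appears as a circular ring of 3D points with a known radius, and the three-point circle-fitting scheme are all ours, chosen only to show the general idea of fitting a minimal model and scoring it by inlier count.

```python
import random
import numpy as np

def circumcenter_3d(p1, p2, p3):
    """Center of the unique circle through three non-collinear 3D points."""
    a, b = p2 - p1, p3 - p1
    n = np.cross(a, b)
    denom = 2.0 * np.dot(n, n)
    if denom < 1e-12:                      # (near-)collinear sample: no unique circle
        return None
    return p1 + np.cross(np.dot(a, a) * b - np.dot(b, b) * a, n) / denom

def ransac_ring_center(points, radius, tol=0.05, iters=500, seed=0):
    """RANSAC estimate of a ring's center from 3D points with outliers.

    Repeatedly fits a circle through three randomly sampled points and keeps
    the candidate center that the largest number of points agree with, where
    'agree' means the point's distance to the center matches the known radius.
    Returns (best_center, inlier_count); best_center is None if every sample
    was degenerate.
    """
    rng = random.Random(seed)
    pts = [np.asarray(p, dtype=float) for p in points]
    best_center, best_count = None, 0
    for _ in range(iters):
        p1, p2, p3 = rng.sample(pts, 3)
        c = circumcenter_3d(p1, p2, p3)
        if c is None:
            continue
        # Score: points on the ring sit at exactly `radius` from its center.
        count = sum(abs(np.linalg.norm(p - c) - radius) < tol for p in pts)
        if count > best_count:
            best_center, best_count = c, count
    return best_center, best_count
```

Because outlier points almost never lie at the correct distance from a well-fitted center, candidates seeded by outliers score poorly and are discarded, which is what makes the estimate robust without any frame-to-frame tracking.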
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Chen, C.-I.; Koseluk, R.; Buchanan, C.; Duerner, A.; Jeppesen, B.; Laux, H. Autonomous Aerial Refueling Ground Test Demonstration—A Sensor-in-the-Loop, Non-Tracking Method. Sensors 2015, 15, 10948-10972.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.