Open Access Article

Fusion of Enhanced and Synthetic Vision System Images for Runway and Horizon Detection

1 Department of Electrical Engineering, University of Kirkuk, Kirkuk 36001, Iraq
2 Department of Electrical and Computer Engineering, Southern Illinois University, Carbondale, IL 62901, USA
3 Department of Biomedical Engineering, University of Reading, Whiteknights, Reading RG6 6AH, UK
4 Department of Mechanical Engineering, Imperial College London, London SW7 1AL, UK
* Author to whom correspondence should be addressed.
Sensors 2019, 19(17), 3802; https://doi.org/10.3390/s19173802
Received: 9 August 2019 / Revised: 30 August 2019 / Accepted: 1 September 2019 / Published: 3 September 2019
(This article belongs to the Special Issue Unmanned Aerial Vehicle Networks, Systems and Applications)
Networked operation of unmanned air vehicles (UAVs) demands fusion of information from disparate sources for accurate flight control. In this investigation, a novel sensor fusion architecture for detecting runways and horizons, and for enhancing awareness of the surrounding terrain, is introduced based on fusion of enhanced vision system (EVS) and synthetic vision system (SVS) images. EVS and SVS image fusion has yet to be implemented in real-world situations because the two signals are misaligned. We address this with a registration step that aligns the EVS and SVS images. Four fusion rules combining discrete wavelet transform (DWT) sub-bands are formulated, implemented, and evaluated. The resulting procedure is tested on real EVS-SVS image pairs and on pairs containing simulated turbulence. Evaluations reveal that runways and horizons can be detected accurately even in poor visibility, and that different DWT fusion rules emphasize different aspects of the EVS and SVS images. The procedure operates autonomously throughout landing, irrespective of weather. The fusion architecture developed in this study holds promise for incorporation into manned heads-up displays (HUDs) and UAV remote displays to assist pilots landing aircraft in poor lighting and varying weather. The algorithm also provides a basis for rule selection in other signal fusion applications.
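The DWT-based fusion described above can be illustrated with a minimal sketch. The snippet below is not the paper's exact four rules; it shows one common rule family under simple assumptions (registered, equal-size, even-dimensioned grayscale images, single-level Haar DWT): average the approximation sub-bands and keep the maximum-magnitude detail coefficients from the two sources.

```python
import numpy as np

def haar2(x):
    """One level of the 2-D Haar DWT (even-sized input assumed).
    Returns (LL, LH, HL, HH) sub-bands."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    ll = (a + b + c + d) / 2
    lh = (a + b - c - d) / 2
    hl = (a - b + c - d) / 2
    hh = (a - b - c + d) / 2
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Inverse of haar2: reconstruct the image from its sub-bands."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2
    out[0::2, 1::2] = (ll + lh - hl - hh) / 2
    out[1::2, 0::2] = (ll - lh + hl - hh) / 2
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2
    return out

def fuse_dwt(evs, svs):
    """Fuse two registered images with one illustrative DWT rule:
    mean of the approximation (LL) sub-bands, max-magnitude
    selection for the detail (LH, HL, HH) sub-bands."""
    e, s = haar2(evs), haar2(svs)
    ll = (e[0] + s[0]) / 2
    details = [np.where(np.abs(de) >= np.abs(ds), de, ds)
               for de, ds in zip(e[1:], s[1:])]
    return ihaar2(ll, *details)
```

Swapping the per-sub-band combination operators (mean, max, source-select, weighted sum) yields the kind of rule variants the paper compares; a multi-level transform would apply the same choice at each decomposition level.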
Keywords: unmanned aircraft (UAV); sensing; intelligent transportation; image fusion; signal alignment; runway detection; image registration; wavelet transform; Hough transform
MDPI and ACS Style

Fadhil, A.F.; Kanneganti, R.; Gupta, L.; Eberle, H.; Vaidyanathan, R. Fusion of Enhanced and Synthetic Vision System Images for Runway and Horizon Detection. Sensors 2019, 19, 3802.

