Vision-Based SLAM System for Unmanned Aerial Vehicles
Abstract
This paper describes a vision-based simultaneous localization and mapping (SLAM) system for Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a novel estimator based on an Extended Kalman Filter (EKF), designed to fuse the measurements obtained from: (i) an orientation sensor (AHRS); (ii) a position sensor (GPS); and (iii) a monocular camera. The estimated state consists of the full state of the vehicle, position and orientation and their first derivatives, as well as the locations of the landmarks observed by the camera. The position sensor is used only during an initialization period in order to recover the metric scale of the world. Afterwards, the estimated map of landmarks is used to perform fully vision-based navigation when the position sensor is not available. Experimental results obtained with simulations and real data show the benefits of including camera measurements in the system: the estimated trajectory of the vehicle improves considerably compared with the estimates obtained using only the measurements from the position sensor, which are commonly low-rate and highly noisy.
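The predict/correct cycle of an EKF-based fusion scheme like the one described can be sketched as follows. This is a minimal illustration, not the paper's estimator: it assumes a constant-velocity motion model, tracks only position and velocity (omitting the orientation, AHRS, camera, and landmark states), and uses hypothetical noise values.

```python
import numpy as np

class SimpleEKF:
    """Minimal Kalman filter sketch: constant-velocity model, GPS position update.

    State x = [px, py, pz, vx, vy, vz]. All noise parameters are
    hypothetical placeholders, not values from the paper.
    """

    def __init__(self, dt):
        self.x = np.zeros(6)            # state estimate
        self.P = np.eye(6)              # state covariance
        # Constant-velocity transition: p <- p + dt * v
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = 0.01 * np.eye(6)       # process noise (assumed)
        # GPS observes position only
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = 0.01 * np.eye(3)       # GPS measurement noise (assumed)

    def predict(self):
        """Propagate state and covariance through the motion model."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update_gps(self, z):
        """Correct the state with a 3D position measurement z."""
        y = z - self.H @ self.x                     # innovation
        S = self.H @ self.P @ self.H.T + self.R     # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```

In the full system of the paper, the same cycle would additionally apply correction steps for the AHRS orientation measurements and for the camera's landmark observations (the latter through a nonlinear projection model, hence the *extended* Kalman filter), and the state vector would be augmented with the landmark locations.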
Cite This Article
Munguía, R.; Urzua, S.; Bolea, Y.; Grau, A. Vision-Based SLAM System for Unmanned Aerial Vehicles. Sensors 2016, 16, 372.