Open Access Article
Sensors 2016, 16(3), 372; doi:10.3390/s16030372

Vision-Based SLAM System for Unmanned Aerial Vehicles

1. Department of Automatic Control, Technical University of Catalonia UPC, Barcelona 08036, Spain
2. Department of Computer Science, CUCEI, University of Guadalajara, Guadalajara 44430, Mexico
* Authors to whom correspondence should be addressed.
Academic Editor: Gonzalo Pajares Martinsanz
Received: 8 December 2015 / Revised: 7 March 2016 / Accepted: 9 March 2016 / Published: 15 March 2016
(This article belongs to the Special Issue State-of-the-Art Sensors Technology in Spain 2015)
Abstract

This paper describes a vision-based simultaneous localization and mapping (SLAM) system for Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a novel estimator based on an Extended Kalman Filter (EKF). The estimator is designed to fuse the measurements obtained from: (i) an orientation sensor (AHRS); (ii) a position sensor (GPS); and (iii) a monocular camera. The estimated state comprises the full state of the vehicle: position and orientation and their first derivatives, as well as the locations of the landmarks observed by the camera. The position sensor is used only during an initialization period in order to recover the metric scale of the world. Afterwards, the estimated map of landmarks is used to perform fully vision-based navigation when the position sensor is not available. Experimental results obtained with simulations and real data show the benefits of including camera measurements in the system: the estimated trajectory of the vehicle is considerably improved compared with estimates obtained using only the position sensor, whose measurements are typically low-rate and highly noisy.
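To illustrate the kind of filtering the abstract describes, the sketch below shows a minimal Kalman filter that fuses low-rate, noisy position fixes (a stand-in for the GPS sensor) into a smooth position/velocity estimate. This is a hypothetical 1-D constant-velocity example, not the authors' implementation: their EKF additionally linearizes a camera measurement model and carries orientation and landmark states. All names (`kf_fuse`, the noise parameters `q` and `r`) are illustrative assumptions.

```python
# Hypothetical 1-D constant-velocity Kalman filter: fuses sparse, noisy
# position fixes (GPS-like) into a position/velocity estimate. This is an
# illustration of the fusion principle only, not the paper's EKF.
import numpy as np

def kf_fuse(z_list, dt=0.1, q=0.01, r=4.0):
    """Run the filter over z_list; entries are position fixes or None (no fix)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity motion model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],    # process noise covariance
                      [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])                   # we observe position only
    R = np.array([[r]])                          # measurement noise covariance
    x = np.zeros((2, 1))                         # state: [position, velocity]
    P = np.eye(2) * 10.0                         # initial uncertainty
    estimates = []
    for z in z_list:
        # Predict step: propagate state and covariance through the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: runs only when a (low-rate) measurement arrives.
        if z is not None:
            y = np.array([[z]]) - H @ x          # innovation
            S = H @ P @ H.T + R                  # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

In the paper's system, the update step would also ingest AHRS orientation readings and camera landmark observations through a linearized (extended) measurement model; the position sensor serves only to bootstrap the metric scale, after which the landmark map sustains the estimate on its own.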
Keywords: state estimation; unmanned aerial vehicle; monocular vision; localization; mapping
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Munguía, R.; Urzua, S.; Bolea, Y.; Grau, A. Vision-Based SLAM System for Unmanned Aerial Vehicles. Sensors 2016, 16, 372.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland