Open Access Article
Sensors 2017, 17(2), 325; doi:10.3390/s17020325

Improved Omnidirectional Odometry for a View-Based Mapping Approach

1 System Engineering and Automation Department, Miguel Hernández University, Elche (Alicante) 03202, Spain
2 Q-Bot Ltd., Riverside Business Park, London SW18 4UQ, UK
3 Dyson School of Design Engineering, Imperial College, London SW7 1NA, UK
* Author to whom correspondence should be addressed.
Academic Editor: Vittorio M. N. Passaro
Received: 9 December 2016 / Revised: 3 February 2017 / Accepted: 6 February 2017 / Published: 9 February 2017
(This article belongs to the Section Physical Sensors)

Abstract

This work presents an improved visual odometry method based on omnidirectional images. Its main purpose is to generate a reliable prior input that enhances the SLAM (Simultaneous Localization and Mapping) estimation tasks in mobile robot navigation, replacing the robot's internal odometry data. Standard SLAM approaches typically rely on odometry as the main prior input to localize the robot, and on sensory data acquired with GPS receivers, laser rangefinders or digital cameras to re-estimate the solution. Nonetheless, modeling the main prior is crucial, and it becomes especially challenging when non-systematic error terms are involved, such as those associated with the internal odometer; these errors can be considerably harmful and may compromise the convergence of the system. The proposed omnidirectional odometry relies on adaptive feature point matching driven by the propagation of the current uncertainty of the system. It is then fused as the main prior input in an EKF (Extended Kalman Filter) view-based SLAM system, together with an adaptation of the epipolar constraint to the omnidirectional geometry. Several improvements have been added to the initial visual odometry proposal to achieve better performance. We present experiments with real data to validate the proposal and to demonstrate its benefits over the internal odometry. Furthermore, SLAM results are included to assess the robustness and accuracy of the system when the proposed omnidirectional odometry is used as the prior.
Keywords: visual odometry; omnidirectional images; visual SLAM; feature matching; mapping
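
As a rough illustration of the kind of prior described in the abstract, the following Python sketch shows a generic EKF prediction step in which a visual-odometry increment is used as the motion input and the pose uncertainty is propagated through the motion model. It is only a minimal sketch under assumed conventions: the function ekf_predict, the planar state layout [x, y, theta] and the covariances P and Q are hypothetical names chosen for illustration, not the authors' implementation, and the adaptive feature matching and omnidirectional epipolar constraint described in the paper are not reproduced here.

import numpy as np

def ekf_predict(x, P, u, Q):
    # EKF prediction for a planar pose x = [x, y, theta], using a
    # visual-odometry increment u = [dx, dy, dtheta] (robot frame) as prior.
    theta = x[2]
    c, s = np.cos(theta), np.sin(theta)
    # Compose the current pose with the odometry increment.
    x_pred = np.array([
        x[0] + c * u[0] - s * u[1],
        x[1] + s * u[0] + c * u[1],
        (x[2] + u[2] + np.pi) % (2 * np.pi) - np.pi,  # wrap angle to [-pi, pi)
    ])
    # Jacobians of the motion model w.r.t. the state and the input.
    Fx = np.array([
        [1.0, 0.0, -s * u[0] - c * u[1]],
        [0.0, 1.0,  c * u[0] - s * u[1]],
        [0.0, 0.0,  1.0],
    ])
    Fu = np.array([
        [c,  -s,  0.0],
        [s,   c,  0.0],
        [0.0, 0.0, 1.0],
    ])
    # Propagate the pose uncertainty through the motion model.
    P_pred = Fx @ P @ Fx.T + Fu @ Q @ Fu.T
    return x_pred, P_pred

# Example: one prediction step with 0.5 m forward motion and a 5 degree turn,
# under an assumed input covariance Q for the visual odometry increment.
x = np.zeros(3)
P = np.diag([0.01, 0.01, 0.005])
u = np.array([0.5, 0.0, np.deg2rad(5.0)])
Q = np.diag([0.02**2, 0.02**2, np.deg2rad(1.0)**2])
x, P = ekf_predict(x, P, u, Q)
print(x, np.diag(P))

The propagated covariance P_pred is the quantity an adaptive matching scheme, such as the one outlined in the abstract, can use to bound the search region for feature correspondences in the next omnidirectional image.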

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

SciFeed Share & Cite This Article

MDPI and ACS Style

Valiente, D.; Gil, A.; Reinoso, Ó.; Juliá, M.; Holloway, M. Improved Omnidirectional Odometry for a View-Based Mapping Approach. Sensors 2017, 17, 325.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.