Open Access Proceedings

Dynamic Catadioptric Sensory Data Fusion for Visual Localization in Mobile Robotics

1	Department of Systems Engineering and Automation, Miguel Hernández University, Av. de la Universidad s/n. Ed. Innova., 03202 Elche (Alicante), Spain
2	Centre for Automation and Robotics (CAR), UPM-CSIC, Technical University of Madrid, C/ José Gutiérrez Abascal, 2, 28006 Madrid, Spain
*	Author to whom correspondence should be addressed.
Presented at the 7th International Symposium on Sensor Science, Napoli, Italy, 9–11 May 2019.
Proceedings 2019, 15(1), 2; https://doi.org/10.3390/proceedings2019015002
Published: 5 July 2019
(This article belongs to the Proceedings of the 7th International Symposium on Sensor Science)

Abstract

This work presents a localization technique for mobile robotics based on visual sensory data fusion. A regression inference framework is designed with the aid of informative data models of the system, supported by probabilistic techniques such as Gaussian Processes. As a result, the visual data acquired with a catadioptric sensor are fused between robot poses to produce a probability distribution of visual information in the robot's 3D global reference frame. In addition, a prediction technique based on filter gain is defined to improve the matching of visual information extracted from the probability distribution. This work presents an enhanced matching technique for visual information in both the image reference frame and the 3D global reference frame. Results with real data are presented to confirm the validity of the approach in a mobile robotic application for visual localization. Furthermore, a comparison against standard visual matching techniques is also presented. The suitability and robustness of the contributions are assessed in these experiments.
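For illustration only, the Gaussian Process based fusion described in the abstract can be sketched in a few lines of Python. The data, kernel choice, and use of scikit-learn's GaussianProcessRegressor are assumptions of this sketch, not the authors' implementation; it simply shows how observations of a visual landmark taken from several robot poses could be fused into a predictive mean and uncertainty in a global reference frame.

# Minimal sketch (assumed, not the authors' code): fuse per-pose visual
# observations of one landmark into a predictive distribution in the
# 3D global frame via Gaussian Process regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: robot poses (x, y, heading) paired with the
# 3D landmark position estimated from the catadioptric image at each pose.
poses = np.array([
    [0.0, 0.0, 0.00],
    [0.5, 0.1, 0.05],
    [1.0, 0.2, 0.10],
    [1.5, 0.2, 0.12],
])
landmark_xyz = np.array([
    [2.10, 1.00, 0.52],
    [2.00, 1.10, 0.50],
    [2.05, 1.05, 0.51],
    [2.02, 1.08, 0.49],
])

# GP prior: smooth spatial correlation (RBF) plus observation noise.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(poses, landmark_xyz)

# Predict the fused landmark position and its uncertainty for a new pose,
# i.e. a probability distribution of visual information in the global frame.
new_pose = np.array([[2.0, 0.3, 0.15]])
mean, std = gp.predict(new_pose, return_std=True)
print("fused 3D estimate:", mean[0], "uncertainty (std):", std[0])

The predictive standard deviation returned here plays the role of the uncertainty that a gain-based prediction step could exploit when matching visual features, in the spirit of the approach summarized above.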
Keywords: catadioptric sensor; visual data fusion; mobile robotics
This is an open access article distributed under the Creative Commons Attribution (CC BY 4.0) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Valiente, D.; Payá, L.; Sebastián, J.M.; Jiménez, L.M.; Reinoso, O. Dynamic Catadioptric Sensory Data Fusion for Visual Localization in Mobile Robotics. Proceedings 2019, 15, 2.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.


Proceedings EISSN 2504-3900. Published by MDPI AG, Basel, Switzerland.