Open Access Article
Sensors 2018, 18(7), 2041; https://doi.org/10.3390/s18072041

Visual Information Fusion through Bayesian Inference for Adaptive Probability-Oriented Feature Matching

1. Department of Systems Engineering and Automation, Miguel Hernández University, Av. de la Universidad s/n. Ed. Innova., 03202 Elche (Alicante), Spain
2. Centre for Automation and Robotics (CAR), UPM-CSIC, Technical University of Madrid, C/ José Gutiérrez Abascal, 2, 28006 Madrid, Spain
These authors contributed equally to this work.
* Author to whom correspondence should be addressed.
Received: 5 April 2018 / Revised: 23 June 2018 / Accepted: 24 June 2018 / Published: 26 June 2018
(This article belongs to the Special Issue Visual Sensors)

Abstract

This work presents a visual information fusion approach for robust probability-oriented feature matching. It relies on omnidirectional imaging and is tested in a visual localization framework for mobile robotics. General visual localization methods have been extensively studied and optimized in terms of performance; however, one of the main threats that jeopardizes the final estimation is the presence of outliers. In this paper, we present several contributions to deal with that issue. First, 3D information data, associated with SURF (Speeded-Up Robust Feature) points detected on the images, is inferred under the Bayesian framework established by Gaussian processes (GPs). This information represents a probability distribution for the existence of feature points, which is successively fused and updated across the robot's poses. Secondly, this distribution can be sampled and projected onto the next 2D image frame, at time t+1, by means of a filter-motion prediction. This strategy yields relevant areas in the image reference system in which probable matches can be detected, in terms of the accumulated probability of feature existence. The approach entails an adaptive probability-oriented matching search, which concentrates on significant areas of the image but also considers unseen parts of the scene, thanks to an internal modulation of the probability distribution's domain computed from the current uncertainty of the system. The main outcomes confirm a robust feature matching, which permits producing consistent localization estimates, aided by an odometry prior to estimate the scale factor. Publicly available datasets have been used to validate the design and operation of the approach. Moreover, the proposal has been compared, first with a standard feature matching and second with a localization method based on an inverse depth parametrization. The results confirm the validity of the approach in terms of feature matching, localization accuracy, and time consumption.
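The core idea of the abstract, accumulating a probability surface for feature existence with a Gaussian process and restricting the matching search to high-probability regions (widened where uncertainty is high), can be illustrated with a minimal sketch. This is not the authors' implementation: the RBF kernel, the grid resolution, and the uncertainty-modulated threshold are all illustrative assumptions.

```python
# Sketch (assumed, not the paper's code): GP regression over observed keypoint
# locations produces a probability-like surface for feature existence; the
# acceptance threshold is lowered where predictive uncertainty is high, so
# unseen parts of the scene are still explored.
import numpy as np

def rbf(a, b, length_scale=0.1):
    """Squared-exponential kernel between two sets of 2D points."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

rng = np.random.default_rng(0)
# Image coordinates (normalized) of SURF-like keypoints seen at previous poses,
# each with a pseudo-observation "a feature exists here" (value 1.0).
keypoints = rng.uniform(0.0, 1.0, size=(40, 2))
y = np.ones(len(keypoints))

# Standard GP regression equations via a Cholesky factorization.
K = rbf(keypoints, keypoints) + 1e-3 * np.eye(len(keypoints))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# Evaluate on a coarse grid: candidate matching regions for frame t+1.
u, v = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
grid = np.column_stack([u.ravel(), v.ravel()])
Ks = rbf(grid, keypoints)
mean = Ks @ alpha                                  # accumulated probability
w = np.linalg.solve(L, Ks.T)
sigma = np.sqrt(np.maximum(1.0 - (w**2).sum(0), 0.0))  # predictive std

# Adaptive, probability-oriented search: high uncertainty relaxes the threshold.
threshold = 0.5 - 0.5 * sigma                      # illustrative modulation
search_mask = mean > threshold
print(f"{search_mask.sum()} of {len(grid)} grid cells kept for matching")
```

In a full pipeline, `search_mask` would gate which image regions are searched for SURF correspondences at t+1, and the keypoint set and GP posterior would be re-fused after each new pose.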
Keywords: omnidirectional imaging; visual localization; catadioptric sensor; visual information fusion
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Valiente, D.; Payá, L.; Jiménez, L.M.; Sebastián, J.M.; Reinoso, Ó. Visual Information Fusion through Bayesian Inference for Adaptive Probability-Oriented Feature Matching. Sensors 2018, 18, 2041.


Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.