Article

Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization

by Yanqing Liu 1,2, Yuzhang Gu 1,2, Jiamao Li 1,2 and Xiaolin Zhang 1,2,*
1 Bio-Vision System Laboratory, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai 200050, China
2 University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Sensors 2017, 17(10), 2339; https://doi.org/10.3390/s17102339
Received: 28 August 2017 / Revised: 9 October 2017 / Accepted: 11 October 2017 / Published: 13 October 2017
(This article belongs to the Section Physical Sensors)
In this paper, we present a novel approach to stereo visual odometry with robust motion estimation that is faster and more accurate than standard RANSAC (Random Sample Consensus). Our method improves RANSAC in three respects: first, hypotheses are generated preferentially by sampling the input feature points in order of feature age and similarity; second, hypotheses are evaluated with the SPRT (Sequential Probability Ratio Test), which discards bad hypotheses quickly without verifying all data points; third, we aggregate the three best hypotheses to obtain the final estimate instead of selecting only the single best hypothesis. The first two improvements speed up RANSAC by generating good hypotheses and discarding bad ones early, respectively; the third improves the accuracy of motion estimation. Our method was evaluated on the KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) and New Tsukuba datasets. Experimental results show that the proposed method outperforms standard RANSAC in both speed and accuracy.
Keywords: visual odometry; stereovision; robust estimation; motion estimation; RANSAC
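For illustration, the sketch below shows how the three modifications described in the abstract could fit together in a generic RANSAC loop. It is not the authors' implementation: the model-fitting interface (fit_model, point_error), the SPRT constants (epsilon, delta, decision_threshold), the progressive widening of the sampling pool, and the simple averaging used to aggregate the three best hypotheses are all assumptions made for this sketch.

```python
# Minimal sketch (not the paper's code) of an improved RANSAC loop with:
#   1. preferential sampling ordered by feature age and similarity,
#   2. SPRT-based early rejection of bad hypotheses,
#   3. aggregation of the three best hypotheses.
import numpy as np

def quality_order(ages, similarities):
    """Order feature indices so long-tracked, well-matched features come first."""
    score = -(np.asarray(ages, float) + np.asarray(similarities, float))
    return np.argsort(score)          # ascending score = best features first

def sprt_accepted(model, points, point_error, inlier_thresh,
                  epsilon=0.5, delta=0.05, decision_threshold=1e2):
    """Evaluate a hypothesis with a Sequential Probability Ratio Test.
    epsilon: assumed inlier probability under a good model,
    delta:   assumed inlier probability under a bad model.
    Returns (accepted, inlier_mask); rejection may happen before all points are seen."""
    lam = 1.0
    inliers = np.zeros(len(points), dtype=bool)
    for i, p in enumerate(points):
        good = point_error(model, p) < inlier_thresh
        inliers[i] = good
        lam *= (delta / epsilon) if good else ((1 - delta) / (1 - epsilon))
        if lam > decision_threshold:  # strong evidence the hypothesis is bad
            return False, inliers
    return True, inliers

def improved_ransac(points, ages, similarities, fit_model, point_error,
                    sample_size=3, inlier_thresh=1.0, max_iters=200, rng=None):
    """Sketch of the full loop; models are assumed to be parameter vectors."""
    if len(points) < sample_size:
        return None
    rng = np.random.default_rng(rng)
    order = quality_order(ages, similarities)
    ranked = [points[i] for i in order]
    scored = []                                   # (inlier_count, model)
    for it in range(max_iters):
        # Progressively widen the sampling pool over the quality-ordered features.
        pool = min(len(ranked), sample_size + it)
        idx = rng.choice(pool, size=sample_size, replace=False)
        model = fit_model([ranked[i] for i in idx])
        if model is None:
            continue
        ok, inliers = sprt_accepted(model, points, point_error, inlier_thresh)
        if ok:
            scored.append((int(inliers.sum()), model))
    # Aggregate the three hypotheses with the most inliers (here: element-wise mean).
    scored.sort(key=lambda s: s[0], reverse=True)
    best = [m for _, m in scored[:3]]
    return np.mean(best, axis=0) if best else None
```

Averaging the top three hypotheses is the simplest possible aggregation; the paper's exact aggregation scheme, as well as its motion model and SPRT parameter adaptation, are described in the full text.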

MDPI and ACS Style

Liu, Y.; Gu, Y.; Li, J.; Zhang, X. Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization. Sensors 2017, 17, 2339. https://doi.org/10.3390/s17102339

AMA Style

Liu Y, Gu Y, Li J, Zhang X. Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization. Sensors. 2017; 17(10):2339. https://doi.org/10.3390/s17102339

Chicago/Turabian Style

Liu, Yanqing, Yuzhang Gu, Jiamao Li, and Xiaolin Zhang. 2017. "Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization" Sensors 17, no. 10: 2339. https://doi.org/10.3390/s17102339

