Open Access Article
Sensors 2016, 16(10), 1704; doi:10.3390/s16101704

A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera

1 School of Optical-Electrical and Computer Engineering, University of Shanghai for Science & Technology, Shanghai 200093, China
2 School of Electric Power Engineering, Nanjing Normal University Taizhou College, Taizhou 225300, China
* Author to whom correspondence should be addressed.
Academic Editor: Felipe Jimenez
Received: 12 July 2016 / Revised: 8 September 2016 / Accepted: 30 September 2016 / Published: 17 October 2016
(This article belongs to the Special Issue Sensors for Autonomous Road Vehicles)

Abstract

Visual odometry estimates the ego-motion of an agent (e.g., a vehicle or robot) from image information and is a key component of autonomous vehicles and robotics. This paper proposes a robust and precise method for estimating 6-DoF ego-motion using a stereo rig with optical flow analysis. An objective function fitted to a set of feature points is created by establishing the mathematical relationship between optical flow, depth and the camera ego-motion parameters through the camera’s 3-dimensional motion and planar imaging model. The six motion parameters are then computed by minimizing this objective function with the iterative Levenberg–Marquardt method. A key requirement for visual odometry is that the feature points selected for the computation contain as many inliers as possible. In this work, the feature points and their optical flows are initially detected using the Kanade–Lucas–Tomasi (KLT) algorithm. Circle matching is then applied to remove outliers caused by KLT mismatches, and a space position constraint is imposed to filter out moving points from the detected point set. The Random Sample Consensus (RANSAC) algorithm is employed to further refine the feature point set, i.e., to eliminate the effects of the remaining outliers. The resulting inlier points are tracked to estimate the ego-motion parameters in subsequent frames. The approach presented here is tested on real traffic videos, and the results demonstrate the robustness and precision of the method.
Keywords: visual odometry; ego-motion; stereovision; optical flow; RANSAC algorithm; space position constraint
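As an illustration of the pipeline summarized in the abstract, the sketch below chains KLT feature tracking, RANSAC-based outlier rejection and Levenberg–Marquardt refinement of the six motion parameters. It is only a minimal sketch under stated assumptions: OpenCV and SciPy stand in for the authors' implementation, cv2.solvePnPRansac is used as a generic RANSAC step, and a plain reprojection residual replaces the paper's optical-flow-based objective function; the circle matching and space position constraint steps are omitted.

```python
# Minimal, illustrative sketch only (assumed tooling: OpenCV + SciPy; the
# residual below is a plain reprojection error, not the paper's exact
# optical-flow objective; circle matching and the space position
# constraint from the paper are not reproduced here).
import numpy as np
import cv2
from scipy.optimize import least_squares


def track_klt(prev_gray, curr_gray, max_corners=500):
    """Detect corners in the previous frame and track them into the
    current frame with the pyramidal KLT tracker."""
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=7)
    pts1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    ok = status.ravel() == 1
    return pts0.reshape(-1, 2)[ok], pts1.reshape(-1, 2)[ok]


def reprojection_residual(params, pts3d_prev, pts2d_curr, K):
    """Residual between the tracked image points and the previous-frame 3-D
    points re-projected under a candidate 6-DoF motion (Rodrigues rotation
    vector rvec + translation tvec), given the intrinsic matrix K."""
    rvec = params[:3].reshape(3, 1)
    tvec = params[3:].reshape(3, 1)
    proj, _ = cv2.projectPoints(pts3d_prev, rvec, tvec, K, None)
    return (proj.reshape(-1, 2) - pts2d_curr).ravel()


def estimate_ego_motion(pts3d_prev, pts2d_curr, K):
    """RANSAC outlier rejection followed by Levenberg-Marquardt refinement
    of the six ego-motion parameters on the inlier set."""
    pts3d = np.asarray(pts3d_prev, dtype=np.float64).reshape(-1, 3)
    pts2d = np.asarray(pts2d_curr, dtype=np.float64).reshape(-1, 2)

    # RANSAC step: keep only consensus points (drops mismatches and, to a
    # degree, independently moving points).
    _, rvec0, tvec0, inliers = cv2.solvePnPRansac(pts3d, pts2d, K, None)
    idx = inliers.ravel()
    x0 = np.hstack([rvec0.ravel(), tvec0.ravel()])

    # Levenberg-Marquardt minimization of the residual over the inliers.
    sol = least_squares(reprojection_residual, x0, method='lm',
                        args=(pts3d[idx], pts2d[idx], K))
    return sol.x[:3], sol.x[3:]  # Rodrigues rotation vector, translation
```

In a full system, track_klt would supply the 2-D correspondences, and stereo triangulation (not shown) would supply the 3-D points fed to estimate_ego_motion.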
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Ci, W.; Huang, Y. A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera. Sensors 2016, 16, 1704.
