Lightweight Visual Odometry for Autonomous Mobile Robots
Abstract
Vision-based motion estimation is an effective means for mobile robot localization and is often used in conjunction with other sensors for navigation and path planning. This paper presents a low-overhead, real-time ego-motion estimation (visual odometry) system based on either a stereo or RGB-D sensor. The algorithm's accuracy outperforms typical frame-to-frame approaches by maintaining a limited local map, while requiring significantly less memory and computational power than the global maps common in full visual SLAM methods. The algorithm is evaluated on common publicly available datasets that span different use-cases, and performance is compared to other comparable open-source systems in terms of accuracy, frame rate, and memory requirements. This paper accompanies the release of the source code as a modular software package for the robotics community, compatible with the Robot Operating System (ROS).
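The abstract contrasts frame-to-frame pose chaining with an approach that keeps a limited local map. A minimal sketch of that idea in 2-D (illustrative names and structure only; this is not the released ROS package's API, and the paper's actual system works with 3-D poses and feature maps):

```python
from collections import deque
import math

def compose(a, b):
    """Compose two 2-D poses (x, y, theta): apply relative pose b in a's frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

class LocalMapVO:
    """Toy ego-motion estimator: chains per-frame relative poses as in
    frame-to-frame visual odometry, but retains only a bounded window of
    recent keyframe poses (a stand-in for the paper's limited local map),
    so memory stays constant regardless of trajectory length."""

    def __init__(self, window=5):
        self.pose = (0.0, 0.0, 0.0)           # accumulated ego-motion estimate
        self.local_map = deque(maxlen=window)  # bounded local map; old entries drop out

    def update(self, relative_pose):
        """Integrate one frame's relative motion and record it in the local map."""
        self.pose = compose(self.pose, relative_pose)
        self.local_map.append(self.pose)
        return self.pose
```

In a real system the entries of `local_map` would hold keyframes with observed landmarks, against which new frames are matched to reduce the drift inherent in pure frame-to-frame chaining; the bounded `deque` captures only the constant-memory aspect.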
Cite This Article
MDPI and ACS Style: Aladem, M.; Rawashdeh, S.A. Lightweight Visual Odometry for Autonomous Mobile Robots. Sensors 2018, 18, 2837.
AMA Style: Aladem M, Rawashdeh SA. Lightweight Visual Odometry for Autonomous Mobile Robots. Sensors. 2018; 18(9):2837.
Chicago/Turabian Style: Aladem, Mohamed, and Samir A. Rawashdeh. 2018. "Lightweight Visual Odometry for Autonomous Mobile Robots." Sensors 18, no. 9: 2837.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.