Landmark-Based Homing Navigation Using Omnidirectional Depth Information
Abstract: A number of landmark-based navigation algorithms have been studied using feature extraction from visual information. In this paper, we apply distance information about the surrounding environment to a landmark-based navigation model. We mount a depth sensor on a mobile robot to obtain omnidirectional distance information. The surrounding environment is represented as a circular arrangement of landmark vectors, which forms a snapshot. Inspired by the snapshot model, the depth snapshots at the current position and the target position are compared to determine the homing direction. Here, we suggest a holistic view of panoramic depth information for homing navigation, in which each sample point is taken as a landmark. The results are shown as a map of homing vectors. The performance of the suggested method is evaluated based on angular errors and the homing success rate. We demonstrate that this holistic approach with omnidirectional depth information achieves effective homing navigation, making omnidirectional depth a promising source for landmark-based homing.
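The snapshot comparison described in the abstract can be sketched with an average-landmark-vector (ALV) style computation over two panoramic depth snapshots. The function names and the (bearing, depth) snapshot format below are illustrative assumptions, not the authors' implementation; the sketch also assumes the two snapshots share a common orientation (e.g., via a compass), as is typical in snapshot models.

```python
import math

def average_landmark_vector(snapshot):
    """Average landmark vector of a depth snapshot.

    snapshot: iterable of (bearing_rad, depth) samples; each sample
    contributes the metric vector depth * (cos b, sin b), so every
    sample point acts as a landmark.
    """
    samples = list(snapshot)
    n = len(samples)
    x = sum(d * math.cos(b) for b, d in samples) / n
    y = sum(d * math.sin(b) for b, d in samples) / n
    return x, y

def homing_vector(current, home):
    """Difference of the two average landmark vectors.

    With metric depth and the same landmarks visible in both
    snapshots, this difference points from the current position
    toward the home (target) position.
    """
    cx, cy = average_landmark_vector(current)
    hx, hy = average_landmark_vector(home)
    return cx - hx, cy - hy
```

For example, with four landmarks symmetric about the origin, a robot displaced to (2, 1) recovers the homing vector (-2, -1) back to a home position at the origin; aggregating such vectors over a grid of positions yields the kind of homing-vector map the paper evaluates.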
Lee, C.; Yu, S.-E.; Kim, D. Landmark-Based Homing Navigation Using Omnidirectional Depth Information. Sensors 2017, 17, 1928.