A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application
Abstract: Autonomous driving on public roads requires precise localization, within a range of a few centimeters. Even the best current precise localization systems based on the Global Navigation Satellite System (GNSS) cannot always reach this level of precision, especially in urban environments, where the signal is disturbed by surrounding buildings and artifacts. Laser range finders and stereo vision have been used successfully for obstacle detection, mapping, and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDAR) sensors are very expensive, and stereo vision requires powerful dedicated hardware to process the camera data. In this context, this article presents a low-cost sensor architecture and data fusion algorithm capable of autonomous driving on narrow two-way roads. Our approach exploits a combination of a short-range visual lane-marking detector and a dead reckoning system to build a long and precise perception of the lane markings behind the vehicle. This information is used to localize the vehicle on a map that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system in a real autonomous driving situation.
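The core idea of the abstract — accumulating short-range lane-marking detections into a long rear-view trace via dead reckoning — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the unicycle odometry model, the sample rate, and the synthetic detection stream (`stream`) are all assumptions made for the example.

```python
import math

def dead_reckoning_step(pose, v, omega, dt):
    """Integrate one odometry step (assumed unicycle model: speed v, yaw rate omega)."""
    x, y, th = pose
    return (x + v * dt * math.cos(th),
            y + v * dt * math.sin(th),
            th + omega * dt)

def to_world(pose, p_vehicle):
    """Transform a lane-marking point from the vehicle frame into the world frame."""
    x, y, th = pose
    px, py = p_vehicle
    return (x + px * math.cos(th) - py * math.sin(th),
            y + px * math.sin(th) + py * math.cos(th))

# Hypothetical stream of (speed m/s, yaw rate rad/s, short-range detections in the
# vehicle frame): driving straight while seeing a marking ~1.8 m to each side.
stream = [(5.0, 0.0, [(4.0, 1.8), (4.0, -1.8)]) for _ in range(10)]

pose = (0.0, 0.0, 0.0)
trace = []  # long lane-marking perception accumulated behind the vehicle
for v, omega, detections in stream:
    pose = dead_reckoning_step(pose, v, omega, dt=0.1)
    trace.extend(to_world(pose, p) for p in detections)
```

The accumulated `trace` is the kind of long, dead-reckoned lane-marking perception that can then be matched against a stored map for localization.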
Cite This Article
Vivacqua, R.; Vassallo, R.; Martins, F. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application. Sensors 2017, 17, 2359.