Article

A Grid-Based Framework for Collective Perception in Autonomous Vehicles

Centre for Automation and Robotics (CSIC-UPM), Ctra. M300 Campo Real, Km 0.200, Arganda del Rey, 28500 Madrid, Spain
* Author to whom correspondence should be addressed.
Academic Editor: Miguel Sepulcre
Sensors 2021, 21(3), 744; https://doi.org/10.3390/s21030744
Received: 22 December 2020 / Revised: 15 January 2021 / Accepted: 19 January 2021 / Published: 22 January 2021
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
Today, perception solutions for Automated Vehicles rely on sensors on board the vehicle, which are limited by the line of sight and by occlusions caused by other elements on the road. As an alternative, Vehicle-to-Everything (V2X) communications allow vehicles to cooperate and enhance their perception capabilities. Besides announcing their own presence and intentions, services such as the Collective Perception Service (CPS) aim to share information about perceived objects as a high-level description. This work proposes a perception framework for fusing information from on-board sensors and data received via Collective Perception Messages (CPMs). To that end, the environment is modeled using an occupancy grid in which occupied, free, and uncertain space are considered. For each sensor, including V2X, independent grids are calculated from sensor measurements and uncertainties and then fused in terms of both occupancy and confidence. Moreover, the implementation of a Particle Filter allows cell occupancy to evolve from one step to the next, enabling object tracking. The proposed framework was validated in a set of experiments using real vehicles and infrastructure sensors to sense static and dynamic objects. Results showed good performance even under significant uncertainties and delays, hence validating the viability of the proposed framework for Collective Perception.
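The abstract describes computing an independent occupancy grid per sensor (including V2X) and then fusing them cell by cell. The paper's exact fusion rule combines both occupancy and confidence; as a minimal illustrative sketch, the snippet below fuses per-sensor occupancy probabilities in log-odds space under an independence assumption, with 0.5 denoting uncertain cells that contribute no evidence. The function name and grids are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_grids(grids):
    """Fuse per-sensor occupancy grids in log-odds space.

    Each grid holds P(occupied) per cell in [0, 1]. Cells at 0.5 are
    uncertain (zero log-odds) and leave the fused belief unchanged.
    Illustrative only: the paper also fuses a per-cell confidence,
    which this sketch omits.
    """
    eps = 1e-6  # avoid log(0) at fully free/occupied cells
    log_odds = np.zeros_like(grids[0], dtype=float)
    for g in grids:
        p = np.clip(g, eps, 1 - eps)
        log_odds += np.log(p / (1 - p))  # accumulate evidence
    return 1.0 / (1.0 + np.exp(-log_odds))  # back to probability

# Hypothetical 2x2 grids: on-board sensing and a grid built from CPM data.
onboard = np.array([[0.8, 0.5], [0.2, 0.5]])
v2x     = np.array([[0.7, 0.5], [0.5, 0.3]])
fused = fuse_grids([onboard, v2x])
```

Where two sources agree a cell is occupied, the fused belief strengthens beyond either input; where one source is uncertain (0.5), the other's estimate passes through unchanged.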
Keywords: autonomous driving; connected vehicles; cooperative perception; collective perception service; V2X; occupancy grid
MDPI and ACS Style

Godoy, J.; Jiménez, V.; Artuñedo, A.; Villagra, J. A Grid-Based Framework for Collective Perception in Autonomous Vehicles. Sensors 2021, 21, 744. https://doi.org/10.3390/s21030744

AMA Style

Godoy J, Jiménez V, Artuñedo A, Villagra J. A Grid-Based Framework for Collective Perception in Autonomous Vehicles. Sensors. 2021; 21(3):744. https://doi.org/10.3390/s21030744

Chicago/Turabian Style

Godoy, Jorge, Víctor Jiménez, Antonio Artuñedo, and Jorge Villagra. 2021. "A Grid-Based Framework for Collective Perception in Autonomous Vehicles" Sensors 21, no. 3: 744. https://doi.org/10.3390/s21030744

