Cooperative Perception for Intelligent Vehicles

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensor Networks".

Deadline for manuscript submissions: closed (15 June 2021) | Viewed by 45526

Special Issue Editors


Dr. Miguel Sepulcre
Guest Editor
Associate Professor, Universidad Miguel Hernandez de Elche, Spain
Interests: V2X communications; congestion control; heterogeneous wireless networks; industrial wireless networks; 5G; LTE-V; C-V2X

Dr. Michele Rondinone
Guest Editor
Hyundai Motor Europe Technical Center GmbH, Rüsselsheim am Main, Germany
Interests: V2X; C-ITS; cooperative automated driving; vehicular ad hoc networks; wireless sensor networks

Dr. Andreas Leich
Guest Editor
Institute of Transportation Systems, German Aerospace Center (DLR), Germany
Interests: Image processing and data fusion for applications in intelligent transportation systems

Dr. Meng Lu
Guest Editor
Dynniq Nederland B.V., The Netherlands
Interests: intelligent transport systems; cooperative and automated driving; traffic control and management; V2X; 5G

Special Issue Information

Dear Colleagues,

Automated vehicles are expected to have a significant impact on the transport sector over the coming decades, improving road safety and traffic efficiency, reducing energy consumption, and increasing user comfort. Automated vehicles rely on a set of onboard sensors (e.g., cameras, radars, and lidars) that perceive the surrounding environment, and a set of actuators that control their longitudinal and lateral movements. One development objective is to perform driving tasks automatically with less driver intervention, or even none. Several studies have shown that the sensors used in the perception process have limitations that can degrade the performance of automated vehicles. For example, in adverse weather conditions (such as rain, snow, and fog), cameras cannot adequately capture the environment, and when a sensor's field of view is blocked (by other vehicles or buildings), none of the current sensors can detect beyond the position of the obstacle. To overcome these limitations and improve vehicles' perception capabilities, cooperative perception enables the wireless exchange of sensor information between vehicles, and between vehicles and infrastructure nodes. Cooperative perception, also known as cooperative sensing or collective perception, enables vehicles and infrastructure nodes to detect objects (e.g., non-connected vehicles, pedestrians, obstacles) beyond their local sensing capabilities. It can be key to the extended and timely detection of the surrounding environment, and it can also enable cooperative applications by compensating for the low penetration rates of connected road users, thus facilitating the future deployment of automated vehicles.

The purpose of this Special Issue is to present and discuss major research challenges, latest developments, and recent advances in cooperative perception. This Special Issue solicits the submission of high-quality papers from academia and industry that aim to solve open technical problems or challenges in the context of cooperative perception. Original and innovative contributions on all aspects, both theoretical and experimental, are welcome.

Potential topics include, but are not limited to, the following:

  • Application development and validation based on cooperative perception
  • V2X communication algorithms and protocols for cooperative perception
  • Communication technologies for cooperative perception
  • Radio resource allocation for cooperative perception
  • Congestion control for cooperative perception
  • Infrastructure-assisted solutions for cooperative perception
  • Security analysis and algorithms for cooperative perception
  • Artificial intelligence and machine learning-based cooperative perception
  • Sensor design and configuration for cooperative perception
  • Sensor architectures and technologies for cooperative perception
  • The impact of sensor data and sensor data fusion quality on the effectiveness of cooperative perception
  • Sensor data fusion problems, algorithms and architectures in the context of cooperative perception
  • Simulation platforms and experimental testbeds for cooperative perception

Dr. Miguel Sepulcre
Dr. Michele Rondinone
Dr. Andreas Leich
Dr. Meng Lu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Cooperative perception (also known as cooperative sensing or collective perception)
  • Connected automated vehicles
  • V2X communications
  • Simulation platforms
  • Experimental testbeds

Published Papers (12 papers)


Research

23 pages, 10648 KiB  
Article
ECPC-ICP: A 6D Vehicle Pose Estimation Method by Fusing the Roadside Lidar Point Cloud and Road Feature
by Bo Gu, Jianxun Liu, Huiyuan Xiong, Tongtong Li and Yuelong Pan
Sensors 2021, 21(10), 3489; https://doi.org/10.3390/s21103489 - 17 May 2021
Cited by 8 | Viewed by 2921
Abstract
In the vehicle pose estimation task based on roadside lidar in cooperative perception, the measurement distance, angle, and laser resolution directly affect the quality of the target point cloud. For incomplete and sparse point clouds, current methods are either less accurate in the correspondences solved by local descriptors or not robust enough due to the reduction of effective boundary points. In response to these weaknesses, this paper proposes a registration algorithm, Environment Constraint Principal Component–Iterative Closest Point (ECPC-ICP), which integrates road information constraints. The road normal feature is extracted, and the principal component of the vehicle point cloud matrix under the road normal constraint is calculated as the initial pose result. Then, an accurate 6D pose is obtained through point-to-point ICP registration. According to the measurement characteristics of roadside lidars, this paper defines a point cloud sparseness description. Existing algorithms were tested on point cloud data with different degrees of sparseness. The simulated experimental results showed that the positioning MAE of ECPC-ICP was about 0.5% of the vehicle scale, the orientation MAE was about 0.26°, and the average registration success rate was 95.5%, demonstrating an improvement in accuracy and robustness over current methods. In the real test environment, the positioning MAE was about 2.6% of the vehicle scale, and the average time cost was 53.19 ms, proving the accuracy and effectiveness of ECPC-ICP in practical applications.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
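The initial-pose step described in the abstract above can be illustrated with a small sketch (illustrative only, not the authors' implementation): project the vehicle point cloud onto the road plane, defined by a known road normal, and take the dominant principal component of the planar cloud as the initial heading.

```python
import numpy as np

def initial_yaw(points, road_normal):
    """Estimate an initial heading for a vehicle point cloud by removing
    the component along the road normal and taking the dominant principal
    component of the remaining planar cloud (defined up to 180 degrees)."""
    n = road_normal / np.linalg.norm(road_normal)
    centered = points - points.mean(axis=0)
    # Project each point onto the road plane: p - (p . n) n
    planar = centered - np.outer(centered @ n, n)
    # Eigenvector of the scatter matrix with the largest eigenvalue
    eigvals, eigvecs = np.linalg.eigh(planar.T @ planar)
    principal = eigvecs[:, np.argmax(eigvals)]
    return np.arctan2(principal[1], principal[0])
```

A subsequent point-to-point ICP step, seeded with this pose, would then refine it to the full 6D estimate.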

24 pages, 1733 KiB  
Article
Impact of Safety Message Generation Rules on the Awareness of Vulnerable Road Users
by Tomás Lara, Alexis Yáñez, Sandra Céspedes and Abdelhakim Senhaji Hafid
Sensors 2021, 21(10), 3375; https://doi.org/10.3390/s21103375 - 12 May 2021
Cited by 9 | Viewed by 2302
Abstract
In the face of cooperative intelligent transportation systems (C-ITS) advancements, the inclusion of vulnerable road users (VRU), i.e., pedestrians, cyclists, and motorcyclists, has just recently become a part of the discussion. Including VRU in C-ITS presents new challenges, most notably the trade-off between the increase in VRU safety and the aggravation in channel congestion resulting from VRU-generated messages. However, previous studies mainly focus on network-related metrics without giving much consideration to VRU safety-related metrics. In this context, we evaluated such a trade-off with a study of motion-based message generation rules for VRU transmissions. The rules were analyzed using theoretical and simulation-based evaluations. In addition to studying the message generation rules using channel load metrics, such as channel busy ratio (CBR) and packet delivery ratio (PDR), we introduced a new metric: the VRU Awareness Probability (VAP). VAP uses the exchange of messages from active VRU to measure the probability of VRU detection by nearby vehicles. Results show that fixed message-filtering mechanisms reduce the overall channel load, but they could negatively impact VRU detection. We established the importance of quantifying the VRU awareness and its inclusion in C-ITS analysis because of its direct impact on VRU safety. We also discussed approaches that include VRU context and dynamism to improve the definition of message generation rules. Full article
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
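An awareness metric of the kind the abstract introduces can be sketched as follows (a hedged illustration of the idea, not the paper's exact VAP formula): take the fraction of vehicles within some radius of the VRU that received at least one of its messages within a recent time window.

```python
import math

def vru_awareness(vehicles, vru_pos, receptions, radius, window, now):
    """Fraction of vehicles within `radius` of the VRU that received at
    least one of its messages during the last `window` seconds.
    `vehicles` maps vehicle id -> (x, y); `receptions` maps vehicle id ->
    list of reception timestamps for this VRU's messages."""
    nearby = [vid for vid, (x, y) in vehicles.items()
              if math.hypot(x - vru_pos[0], y - vru_pos[1]) <= radius]
    if not nearby:
        return None  # metric undefined with no vehicles in range
    aware = sum(1 for vid in nearby
                if any(now - t <= window for t in receptions.get(vid, [])))
    return aware / len(nearby)
```

A message-filtering rule that drops transmissions while the VRU is stationary would lower channel load but also shrink the reception lists above, which is exactly the trade-off the paper quantifies.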

26 pages, 1948 KiB  
Article
Automated Driving with Cooperative Perception Using Millimeter-Wave V2V Communications for Safe Overtaking
by Ryuichi Fukatsu and Kei Sakaguchi
Sensors 2021, 21(8), 2659; https://doi.org/10.3390/s21082659 - 10 Apr 2021
Cited by 14 | Viewed by 3119
Abstract
The combination of onboard sensors with wireless communication has great advantages over conventional driving systems in terms of safety and reliability. This technique is often called cooperative perception. Cooperative perception is expected to compensate for blind spots in dynamic maps, which are caused by obstacles. Fewer blind spots in dynamic maps can improve the safety and reliability of driving thanks to information beyond the reach of the onboard sensors. In this paper, we analyzed the sensor data rate required for cooperative perception to enable a new level of safe and reliable automated driving in an overtaking scenario. The required sensor data rate was calculated by combining recognition and vehicle movement to adopt realistic assumptions. At the end of the paper, we compared the required sensor data rate with the outage data rate realized by conventional V2V communication and millimeter-wave communication. The results showed the indispensability of millimeter-wave communications in automated driving systems.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
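The notion of a "required sensor data rate" can be made concrete with a back-of-envelope calculation (illustrative numbers, not the paper's figures): the raw rate for sharing full sensor frames is points per frame times bytes per point times frame rate.

```python
def required_data_rate_bps(points_per_frame, bytes_per_point, frames_per_s):
    """Raw sensor data rate in bits per second for sharing full frames."""
    return points_per_frame * bytes_per_point * frames_per_s * 8

# Example: 100k points/frame, 16 bytes/point (x, y, z, intensity as
# 4-byte floats), 10 Hz -- hypothetical lidar-grade numbers.
rate = required_data_rate_bps(100_000, 16, 10)
# This is on the order of 100+ Mbit/s, well beyond what conventional
# V2V links (tens of Mbit/s at best) sustain -- the kind of gap that
# motivates millimeter-wave V2V communication.
```

Object-level sharing (as in collective perception messages) reduces this by orders of magnitude, which is why the raw-versus-processed trade-off matters for the analysis.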

23 pages, 3661 KiB  
Article
Analysis of Cooperative Perception in Ant Traffic and Its Effects on Transportation System by Using a Congestion-Free Ant-Trail Model
by Prafull Kasture and Hidekazu Nishimura
Sensors 2021, 21(7), 2393; https://doi.org/10.3390/s21072393 - 30 Mar 2021
Cited by 1 | Viewed by 2158
Abstract
We investigated agent-based model simulations that mimic an ant transportation system to analyze cooperative perception and communication in the system. On a trail, ants use cooperative perception through chemotaxis to maintain a constant average velocity irrespective of their density, thereby avoiding traffic jams. Using model simulations and approximate mathematical representations, we analyzed various aspects of the communication system and their effects on cooperative perception in ant traffic. Based on this analysis, insights into the cooperative perception of ants, which facilitates decentralized self-organization, are presented. We also present values of the communication parameters in ant traffic through which the system conveys traffic conditions to individual ants, which the ants use to self-organize and avoid traffic jams. The mathematical analysis also verifies our findings and provides a better understanding of various model parameters, leading to model improvements.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)

29 pages, 7918 KiB  
Article
Driving Environment Perception Based on the Fusion of Vehicular Wireless Communications and Automotive Remote Sensors
by Minjin Baek, Jungwi Mun, Woojoong Kim, Dongho Choi, Janghyuk Yim and Sangsun Lee
Sensors 2021, 21(5), 1860; https://doi.org/10.3390/s21051860 - 7 Mar 2021
Cited by 4 | Viewed by 3261
Abstract
Driving environment perception for automated vehicles is typically achieved by the use of automotive remote sensors such as radars and cameras. A vehicular wireless communication system can be viewed as a new type of remote sensor that plays a central role in connected and automated vehicles (CAVs), which are capable of sharing information with each other and also with the surrounding infrastructure. In this paper, we present the design and implementation of driving environment perception based on the fusion of vehicular wireless communications and automotive remote sensors. A track-to-track fusion of high-level sensor data and vehicular wireless communication data was performed to accurately and reliably locate the remote target in the vehicle surroundings and predict the future trajectory. The proposed approach was implemented and evaluated in vehicle tests conducted at a proving ground. The experimental results demonstrate that using vehicular wireless communications in conjunction with the on-board sensors enables improved perception of the surrounding vehicle located at varying longitudinal and lateral distances. The results also indicate that vehicle future trajectory and potential crash involvement can be reliably predicted with the proposed system in different cut-in driving scenarios.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
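One standard way to perform the track-to-track fusion the abstract mentions (a hedged sketch; the paper's exact fusion rule may differ) is covariance intersection, which fuses two Gaussian track estimates without assuming knowledge of their cross-correlation:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega=0.5):
    """Fuse two Gaussian estimates (mean, covariance) of the same target,
    e.g. an onboard-sensor track and a V2X-reported track. `omega` in
    (0, 1) weights the two information matrices; it is often chosen to
    minimize the trace or determinant of the fused covariance."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * I1 + (1 - omega) * I2)
    x = P @ (omega * I1 @ x1 + (1 - omega) * I2 @ x2)
    return x, P
```

Unlike a Kalman update, covariance intersection is consistent even when the two tracks share common process noise, which is typical when both sources observe the same remote vehicle.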

17 pages, 4135 KiB  
Article
Research on Cooperative Perception of MUSVs in Complex Ocean Conditions
by Lili Yin, Rubo Zhang, Hengwen Gu and Peng Li
Sensors 2021, 21(5), 1657; https://doi.org/10.3390/s21051657 - 28 Feb 2021
Cited by 2 | Viewed by 1760
Abstract
Since the working environment of Multiple Unmanned Surface Vehicles (MUSVs) involves many uncertainties and various hazards, the perception of complex marine environments by MUSVs is the first problem that must be solved to ensure their collision avoidance capability. A cooperative perception framework with uncertain event detection, cooperative collision avoidance pattern recognition, and an environmental ontology model is proposed to realize the cooperative perception process of MUSVs using ontology and Bayesian network theory. The cooperative perception approach was validated in simulation experiments. Results show the effectiveness of the approach.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)

21 pages, 15633 KiB  
Article
A Grid-Based Framework for Collective Perception in Autonomous Vehicles
by Jorge Godoy, Víctor Jiménez, Antonio Artuñedo and Jorge Villagra
Sensors 2021, 21(3), 744; https://doi.org/10.3390/s21030744 - 22 Jan 2021
Cited by 22 | Viewed by 3876
Abstract
Today, perception solutions for automated vehicles rely on onboard sensors, which are limited by the line of sight and by occlusions caused by other elements on the road. As an alternative, Vehicle-to-Everything (V2X) communications allow vehicles to cooperate and enhance their perception capabilities. Besides announcing a vehicle's own presence and intentions, services such as Collective Perception (CPS) aim to share information about perceived objects as a high-level description. This work proposes a perception framework for fusing information from onboard sensors and data received via CPS messages (CPMs). To that end, the environment is modeled using an occupancy grid in which occupied, free, and uncertain space is considered. For each sensor, including V2X, independent grids are calculated from sensor measurements and uncertainties and then fused in terms of both occupancy and confidence. Moreover, a particle filter allows cell occupancy to evolve from one step to the next, enabling object tracking. The proposed framework was validated in a set of experiments using real vehicles and infrastructure sensors to sense static and dynamic objects. Results showed good performance even under significant uncertainties and delays, validating the viability of the proposed framework for collective perception.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
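The fusion of per-sensor occupancy grids can be sketched in log-odds form, a standard technique for combining independent occupancy evidence (the paper's framework additionally tracks free/uncertain space and per-cell confidence, which this minimal sketch omits):

```python
import numpy as np

def fuse_grids_log_odds(prob_grids):
    """Fuse independent per-sensor occupancy grids (cell probabilities
    strictly in (0, 1)) by summing their log-odds, then converting back
    to probability. Cells at 0.5 (no information) leave the result
    unchanged; agreeing sensors reinforce each other."""
    log_odds = sum(np.log(g / (1.0 - g)) for g in prob_grids)
    return 1.0 / (1.0 + np.exp(-log_odds))
```

A V2X-derived grid built from CPM object lists can enter this sum exactly like a local radar or lidar grid, which is what makes the grid representation a convenient common currency for collective perception.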

31 pages, 7873 KiB  
Article
Demonstrations of Cooperative Perception: Safety and Robustness in Connected and Automated Vehicle Operations
by Mao Shan, Karan Narula, Yung Fei Wong, Stewart Worrall, Malik Khan, Paul Alexander and Eduardo Nebot
Sensors 2021, 21(1), 200; https://doi.org/10.3390/s21010200 - 30 Dec 2020
Cited by 50 | Viewed by 6429
Abstract
Cooperative perception, or collective perception (CP), is an emerging and promising technology for intelligent transportation systems (ITS). It enables an ITS station (ITS-S) to share its local perception information with others by means of vehicle-to-X (V2X) communication, thereby achieving improved efficiency and safety in road transportation. In this paper, we present our recent progress on the development of a connected and automated vehicle (CAV) and an intelligent roadside unit (IRSU). The main contribution of the work lies in investigating and demonstrating the use of the CP service within intelligent infrastructure to improve awareness of vulnerable road users (VRUs) and thus safety for CAVs in various traffic scenarios. We demonstrate in experiments that a connected vehicle (CV) can “see” a pedestrian around a corner. More importantly, we demonstrate how CAVs can autonomously and safely interact with walking and running pedestrians, relying only on the CP information from the IRSU received through vehicle-to-infrastructure (V2I) communication. This is one of the first demonstrations of urban vehicle automation using only CP information. We also address the handling of collective perception messages (CPMs) received from the IRSU, passing them through a pipeline of CP information coordinate transformation with uncertainty, multiple road user tracking, and eventually path planning/decision-making within the CAV. The experimental results were obtained with a manually driven CV, a fully autonomous CAV, and an IRSU retrofitted with vision and laser sensors and a road user tracking system.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
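The "coordinate transformation with uncertainty" step of such a CPM pipeline can be sketched for the planar case (illustrative only, assuming the IRSU's detections are already georeferenced in a shared world frame and the vehicle pose is known exactly; the paper also accounts for uncertainty in the transform itself):

```python
import numpy as np

def to_vehicle_frame(p_world, P_world, t_vehicle, theta):
    """Map a detected road user's 2D position and 2x2 covariance from the
    shared world frame into a vehicle frame at position `t_vehicle` with
    heading `theta`. The transform is linear, so the covariance simply
    rotates: P' = R P R^T."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])      # world -> vehicle rotation
    p_vehicle = R @ (p_world - t_vehicle)
    P_vehicle = R @ P_world @ R.T
    return p_vehicle, P_vehicle
```

Keeping the covariance attached to each transformed detection is what lets the downstream tracker weight IRSU reports correctly against the CAV's own sensors.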

23 pages, 3958 KiB  
Article
Collective Perception: A Safety Perspective
by Florian A. Schiegg, Ignacio Llatser, Daniel Bischoff and Georg Volk
Sensors 2021, 21(1), 159; https://doi.org/10.3390/s21010159 - 29 Dec 2020
Cited by 33 | Viewed by 5362
Abstract
Vehicle-to-everything (V2X) communication is seen as one of the main enabling technologies for automated vehicles. Collective perception is especially promising, as it allows connected traffic participants to “see through the eyes of others” by sharing sensor-detected objects via V2X communication. Its benefit is typically assessed in terms of the increased object update rate, redundancy, and awareness. To determine the safety improvement thanks to collective perception, the authors introduce new metrics, which quantify the environmental risk awareness of the traffic participants. The performance of the V2X service is then analyzed with the help of the test platform TEPLITS, using real traffic traces from German highways, amounting to over 100 h of total driving time. The results in the considered scenarios clearly show that collective perception not only contributes to the accuracy and integrity of the vehicles’ environmental perception, but also that a V2X market penetration of at least 25% is necessary to increase traffic safety from a “risk of serious traffic accidents” to a “residual hypothetical risk of collisions without minor injuries” for traffic participants equipped with non-redundant 360° sensor systems. These results support the ongoing worldwide standardization efforts of the collective perception service.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)

21 pages, 957 KiB  
Article
Semantic Distributed Data for Vehicular Networks Using the Inter-Planetary File System
by Victor Ortega and Jose F. Monserrat
Sensors 2020, 20(22), 6404; https://doi.org/10.3390/s20226404 - 10 Nov 2020
Cited by 4 | Viewed by 3203
Abstract
Vehicular networks provide a means to distribute data among intelligent vehicles, increasing their efficiency and the safety of their occupants. While connected to these networks, vehicles have access to various kinds of information shared by other vehicles and road-side units (RSUs). This information includes helpful resources, such as traffic state or remote sensors. An efficient and fast system for accessing this information is important but unproductive if the data are not appropriately structured, accessible, and easy to process. This paper proposes the creation of a semantic distributed network using content-addressed networking and peer-to-peer (P2P) connections. In this open and collaborative network, RSUs and vehicles use ontologies to semantically represent information and facilitate the development of intelligent autonomous agents capable of navigating and processing the shared data. To create this P2P network, this paper makes use of the Inter-Planetary File System (IPFS), an open-source solution that provides secure, reliable, and efficient content-addressed distributed storage over standard IP networks using the new QUIC protocol. This paper highlights the feasibility of this proposal and compares it with the state of the art. Results show that IPFS is a promising technology that offers a great balance between functionality, performance, and security.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
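The content-addressing principle that IPFS builds on can be sketched with a plain SHA-256 digest as the address (real IPFS CIDs use multihash and multibase encodings on top of this; the sketch shows only the underlying idea):

```python
import hashlib

def content_address(data: bytes) -> str:
    """Address data by the hash of its content, not by its location.
    Identical content always yields the same address, so peers can
    verify what they receive and deduplicate storage."""
    return hashlib.sha256(data).hexdigest()

store = {}  # stand-in for a distributed peer-to-peer store

def put(data: bytes) -> str:
    addr = content_address(data)
    store[addr] = data
    return addr

def get(addr: str) -> bytes:
    data = store[addr]
    # retrieval is self-verifying: recompute the hash and compare
    assert content_address(data) == addr
    return data
```

This self-verifying property is what makes content addressing attractive for an open vehicular network: a vehicle can fetch a resource from any untrusted peer and still confirm its integrity locally.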

21 pages, 28075 KiB  
Article
Networked Roadside Perception Units for Autonomous Driving
by Manabu Tsukada, Takaharu Oi, Masahiro Kitazawa and Hiroshi Esaki
Sensors 2020, 20(18), 5320; https://doi.org/10.3390/s20185320 - 17 Sep 2020
Cited by 37 | Viewed by 5466
Abstract
Vehicle-to-Everything (V2X) communication enhances the capability of autonomous driving through better safety, efficiency, and comfort. In particular, sensor data sharing, known as cooperative perception, is a crucial technique to accommodate vulnerable road users in a cooperative intelligent transport system (ITS). In this paper, we describe a roadside perception unit (RSPU) that combines sensors and roadside units (RSUs) for infrastructure-based cooperative perception. We propose a software called AutoC2X that we designed to realize cooperative perception for RSPUs and vehicles. We also propose the concept of networked RSPUs, which is the inter-connection of RSPUs along a road over a wired network, and helps realize broader cooperative perception. We evaluated the RSPU system and the networked RSPUs through a field test, numerical analysis, and simulation experiments. Field evaluation showed that, even in the worst case, our RSPU system can deliver messages to an autonomous vehicle within 100 ms. The simulation result shows that the proposed priority algorithm achieves a wide perception range with a high delivery ratio and low latency, especially under heavy road traffic conditions. Full article
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
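The general shape of a transmission-priority rule like the one evaluated above can be sketched as follows (our own illustrative rule, not the paper's AutoC2X algorithm): under a per-message object budget, prefer the perceived objects that were transmitted least recently, so every object is eventually refreshed even under heavy load.

```python
import heapq

def select_objects(objects, budget, now):
    """Pick up to `budget` perceived objects to include in the next
    message, preferring those least recently transmitted (illustrative
    rule only). `objects` maps object id -> last transmission time and
    is updated in place for the chosen objects."""
    ranked = heapq.nsmallest(budget, objects.items(), key=lambda kv: kv[1])
    chosen = [oid for oid, _ in ranked]
    for oid in chosen:
        objects[oid] = now  # mark as freshly transmitted
    return chosen
```

A real prioritization scheme would also weight object dynamics and relevance to receivers; the budget here is what keeps per-message airtime bounded when road traffic is dense.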

27 pages, 5064 KiB  
Article
Collective Perception Using UAVs: Autonomous Aerial Reconnaissance in a Complex Urban Environment
by Petr Stodola, Jan Drozd, Karel Šilinger, Jan Hodický and Dalibor Procházka
Sensors 2020, 20(10), 2926; https://doi.org/10.3390/s20102926 - 21 May 2020
Cited by 12 | Viewed by 2368
Abstract
This article examines autonomous reconnaissance in a complex urban environment using unmanned aerial vehicles (UAVs). Environments with many buildings and other types of obstacles and/or uneven terrain are harder to explore, as occlusion of the objects of interest may often occur. First, the problem of autonomous reconnaissance in a complex urban environment via a swarm of UAVs is formulated. Then, an algorithm based on a metaheuristic approach is proposed as a solution. This solution lies in deploying a number of waypoints in the area of interest to be explored, from which monitoring is performed, and planning routes for the available UAVs among these waypoints so that the monitored area is as large as possible and the operation as short as possible. In the last part of this article, two types of experiments based on computer simulations are designed to verify the proposed algorithms. The first type focuses on comparing the results achieved on benchmark instances with the optimal solutions. The second presents and discusses the results obtained from a number of scenarios based on typical reconnaissance operations in real environments.
(This article belongs to the Special Issue Cooperative Perception for Intelligent Vehicles)
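The waypoint-deployment objective can be made concrete with a greedy set-cover sketch (the paper uses a metaheuristic; greedy is shown here only to illustrate the coverage objective): repeatedly pick the candidate waypoint that monitors the most not-yet-covered cells of the area of interest.

```python
def greedy_waypoints(coverage, max_waypoints):
    """coverage: candidate waypoint -> set of area cells it can monitor
    (e.g. cells with line of sight despite buildings). Greedily choose
    up to `max_waypoints` waypoints, each maximizing newly covered cells."""
    covered, chosen = set(), []
    for _ in range(max_waypoints):
        best = max(coverage, key=lambda w: len(coverage[w] - covered))
        gain = coverage[best] - covered
        if not gain:
            break  # no waypoint adds coverage; stop early
        chosen.append(best)
        covered |= gain
    return chosen, covered
```

Routing the available UAVs among the chosen waypoints to minimize operation time is then a separate (vehicle-routing-style) subproblem, which is where the metaheuristic search does its work.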
