Simulating the Effects of Sensor Failures on Autonomous Vehicles for Safety Evaluation
Abstract
1. Introduction
- Integration of CARLA and Autoware: We established an integration that enables structured, automated testing of autonomous driving behavior under sensor failures using scenario-based simulations.
- Sensor Fault Injection Mechanism: We designed and implemented a fault injection system capable of simulating various sensor anomalies, including silent failures and noise. This mechanism targets key sensors such as LiDAR, IMU (gyroscope, accelerometer, and quaternion), and GNSS.
- Evaluation of AV System Response: We conducted a structured fault injection campaign to assess how Autoware responds under different sensor failure conditions. By injecting faults during scenario execution, we analyzed the system’s behavior and measured its ability to maintain safe and functional operation.
2. Background Concepts
2.1. Architecture of Autonomous Driving Systems
2.1.1. Perception
2.1.2. Planning and Decision
2.1.3. Motion and Vehicle Control
2.1.4. System Supervision
- Health Monitoring: The supervision module continuously checks the status of both hardware and software throughout the vehicle. If any module is not operating correctly, such as when a critical sensor drops out or the planner stops producing new trajectories, the supervision layer detects this [17] (a minimal staleness-check sketch is given after this list).
- Fault Management and Fail-Safe Mechanisms: Upon detecting an anomaly or failure, system supervision can initiate appropriate fail-safe mechanisms. This might mean alerting the driver or a remote operator or transitioning the vehicle into a safe state (such as gradually coming to a stop) if the autonomous system can no longer operate safely. This supervisory function is crucial for meeting functional safety requirements (ISO 26262 and similar standards) in a safety-critical system [17].
- Mission and Mode Management: It can manage transitions between autonomous driving and manual control, authorize engagement of self-driving when system checks pass, and abort or override the autonomy if necessary. It may also handle route initialization and high-level mission planning or interface with fleet management (in the context of robotaxis, for example), though these functions are sometimes considered separate. Additionally, system supervision can manage the Human–Machine Interface (HMI), for example, conveying system status to passengers or requesting driver takeover when needed [17].
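To make the health-monitoring idea concrete, the following minimal sketch flags a data stream as unhealthy when no message has arrived within a configurable staleness window. It is an illustrative assumption about how such a check could be written, not Autoware's actual diagnostics implementation; the class name and threshold are hypothetical.

```python
import time


class TopicHealthMonitor:
    """Flags a data stream as unhealthy if no message arrives within the staleness window."""

    def __init__(self, staleness_limit_s: float = 0.5):
        self.staleness_limit_s = staleness_limit_s
        self.last_message_time: float | None = None

    def on_message(self) -> None:
        # Called whenever a new message (e.g., a sensor sample or a trajectory) arrives.
        self.last_message_time = time.monotonic()

    def is_healthy(self) -> bool:
        # Unhealthy if nothing has ever arrived or the last message is too old.
        if self.last_message_time is None:
            return False
        return (time.monotonic() - self.last_message_time) < self.staleness_limit_s
```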
2.2. Sensors Used in AVs
2.2.1. Ultrasonic Sensors
2.2.2. RADAR: Radio Detection and Ranging
2.2.3. LiDAR: Light Detection and Ranging
2.2.4. Camera
2.2.5. GNSS
- Clock bias and synchronization, due to residual satellite-clock error and receiver clock drift/jitter; the receiver's absolute clock bias is estimated jointly with its position (x, y, z), so it is the bias's stability, rather than its absolute offset, that governs positioning accuracy [49] (a worked pseudorange form is given after this list).
- Signal delays, caused by propagation through the ionosphere and troposphere.
- Multipath effect.
- Satellite orbit uncertainties.
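To show where these error sources enter, the standard single-satellite pseudorange observation can be written as follows. This is a textbook form given for illustration, not an equation reproduced from the cited works.

```latex
\rho = \sqrt{(x - x_s)^2 + (y - y_s)^2 + (z - z_s)^2}
     + c\,(\delta t_r - \delta t_s) + I + T + \varepsilon
```

Here (x, y, z) is the receiver position, (x_s, y_s, z_s) the satellite position, δt_r and δt_s the receiver and satellite clock errors, I and T the ionospheric and tropospheric delays, and ε the remaining error (multipath, orbit uncertainty, noise). Because δt_r is solved for together with (x, y, z), only its stability, not its absolute value, affects the position solution.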
2.2.6. Inertial Measurement Units
2.2.7. Sensor Fusion
2.3. CARLA
- The CARLA server handles the simulation environment, including physics, rendering, traffic behavior, and all the actors in the scene.
- The client, which may use a TCP/IP API or operate through a ROS interface, communicates with the server to control vehicles, place sensors, modify environment settings, and access simulation data.
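For illustration, a minimal client-side interaction with the server through the CARLA Python API might look as follows. The host, port, vehicle blueprint choice, and sensor placement are illustrative assumptions, not the configuration used in this work.

```python
import carla

# Connect to a CARLA server assumed to be running locally on the default port.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Spawn an ego vehicle at the first available spawn point (illustrative choice).
blueprints = world.get_blueprint_library()
vehicle_bp = blueprints.filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)

# Attach a LiDAR sensor above the vehicle and report each incoming sweep.
lidar_bp = blueprints.find("sensor.lidar.ray_cast")
lidar_tf = carla.Transform(carla.Location(x=0.0, z=2.5))
lidar = world.spawn_actor(lidar_bp, lidar_tf, attach_to=vehicle)
lidar.listen(lambda data: print(f"LiDAR sweep received at frame {data.frame}"))
```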
2.3.1. Scenario Management and Ego-Vehicle
2.3.2. CARLA—Autonomous Driving System Communication Using ROS
- Providing Sensor Data (Lidar, Semantic lidar, Cameras, GNSS, Radar, IMU)
- Providing Object Data (Traffic light status, Visualization markers, Collision, Lane invasion)
- Controlling AD Agents (Steer/Throttle/Brake)
- Controlling CARLA (Play/pause simulation, Set simulation parameters)
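As a sketch of how a client can sit between the bridge and the driving stack, the following rclpy node subscribes to an IMU topic published on the CARLA side and republishes it toward the consumer. The topic names follow common carla_ros_bridge conventions but are assumptions, not the exact names used in this framework; the relay callback also marks the natural interception point for fault injection.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu


class ImuRelay(Node):
    """Relays IMU messages from the CARLA bridge toward the AD stack."""

    def __init__(self):
        super().__init__("imu_relay")
        # Topic names are illustrative assumptions.
        self.pub = self.create_publisher(Imu, "/sensing/imu/imu_raw", 10)
        self.sub = self.create_subscription(
            Imu, "/carla/ego_vehicle/imu", self.on_imu, 10)

    def on_imu(self, msg: Imu) -> None:
        # A fault injector could modify or drop `msg` here before republishing.
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(ImuRelay())


if __name__ == "__main__":
    main()
```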
2.4. Autoware
2.4.1. Architecture
- Perception: Object detection, segmentation, and tracking using data from LiDAR and cameras.
- Localization: Using GNSS, IMU, and LiDAR-based SLAM (Simultaneous Localization and Mapping) or map-matching for accurate vehicle positioning.
- Planning: Path planning, behavior planning, and route generation based on real-time map and object data.
- Control: Computes low-level vehicle actuation commands, such as throttle, brake, and steering, which the vehicle interface executes through the actuators.
- Interface: Tools for vehicle-to-platform communication and human–machine interaction.
2.4.2. Sensing
- GNSS: Provides global positioning (latitude, longitude, altitude) and orientation information. It is used primarily for global localization and navigation.
- IMU: Supplies measurements of vehicle motion, including acceleration and angular velocity. This data supports orientation tracking and is essential for sensor fusion techniques.
- LiDAR: Generates detailed 3D representations of the vehicle’s surroundings. It is used for localization (e.g., scan matching) and object detection.
- Camera: Delivers visual information for scene understanding, such as lane markings, traffic lights, and object classification.
- Radar: Detects objects by measuring distance, speed, and direction. Its robustness to adverse weather makes it valuable for tracking other road users.
2.4.3. Map
- To the sensing module, the projection information (used to convert GNSS data to the local coordinate system).
- To the localization module, the point cloud (used for LiDAR-based localization) and vector maps (used for localization methods based on road markings).
- To the perception module, the point cloud map (used for obstacle segmentation) and vector map (used for vehicle trajectory prediction).
- To the planning module, the vector map (for behavior planning).
2.4.4. Localization
2.4.5. Perception
2.4.6. Planning
2.4.7. Control
2.4.8. Vehicle Interface
2.5. Fault Injection in AV Systems
3. Related Work
3.1. Fault Injection in Autonomous Vehicles
3.2. Simulator-Based AV Testing
4. Framework Implementation
4.1. Framework Overview
4.2. Scenario Runner Modifications
4.3. Sensor Fault Injection System Implementation
4.4. Logging and Results Collection
5. Experimental Setup
5.1. Simulation Environment and System Under Test
5.2. Fault Model
- Location—the location of the fault refers to the specific sensor affected. In this case, 5 different locations were considered: LiDAR, GNSS, and the three components of the IMU (gyroscope, accelerometer, and quaternion).
- Type—the fault type includes Silent Failures, where the sensor stops transmitting data, and noise, representing random deviations beyond the sensor’s normal operating conditions; we refer to the latter as Severe Noise.
- Trigger—the trigger defines the position the ego vehicle must reach to start the fault injection. We defined 5 fault triggers, each activated when the ego vehicle reaches a specific location within the scenario map.
- Duration—the duration of the fault specifies how long the fault remains active since the trigger. We defined a permanent duration, meaning the fault remains active until the end of the test run.
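Assuming one run per combination, the fault space above can be enumerated programmatically. The sketch below is illustrative (the names are hypothetical); it produces 5 locations × 3 injected variants (silent, nominal noise, severe noise) × 5 triggers = 75 configurations, matching the single-fault runs listed in Appendix A.

```python
from itertools import product

# Illustrative enumeration of the fault space described above; identifiers are hypothetical.
LOCATIONS = ["lidar", "gnss", "imu_gyroscope", "imu_accelerometer", "imu_quaternion"]
TYPES = ["silent", "noise", "severe_noise"]   # nominal noise runs appear as "Noise" in Appendix A
TRIGGERS = [1, 2, 3, 4, 5]                    # map positions defined in Section 5.4

campaign = [
    {"location": loc, "type": ftype, "trigger": trig, "duration": "permanent"}
    for loc, ftype, trig in product(LOCATIONS, TYPES, TRIGGERS)
]
print(len(campaign))  # 75 single-fault configurations, as in Appendix A
```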
5.3. Severe Noise per Fault Location
5.3.1. LiDAR
5.3.2. GNSS
5.3.3. IMU
5.3.4. Summary of Fault Type Values per Location
5.4. Fault Trigger
5.4.1. Trigger 1: Starting Point
5.4.2. Trigger 2: Stop Sign
5.4.3. Trigger 3: Right Turn at Intersection
5.4.4. Trigger 4: Pedestrian Crosswalk Encounter
5.4.5. Trigger 5: Final Intersection
5.5. Results Classification
- Collision: The ego vehicle collides with another object, pedestrian, or road infrastructure.
- Out: The vehicle deviates from its designated lane or route but does not collide.
- Timeout: The vehicle fails to reach the destination within the time limit (3 min); it most likely stopped, but did not collide or deviate from the designated lane or route.
- OK: The vehicle completes the route without any incident.
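A simple way to encode this classification, assuming the framework exposes collision, route-deviation, and goal-arrival flags plus the elapsed time (all hypothetical names, not the actual logging interface), is sketched below.

```python
TIME_LIMIT_S = 180  # 3-minute timeout used in the experiments


def classify_outcome(collided: bool, left_route: bool,
                     reached_goal: bool, elapsed_s: float) -> str:
    """Maps a single test run to one of the four outcome classes defined above."""
    if collided:
        return "Collision"
    if left_route:
        return "Out"
    if not reached_goal and elapsed_s >= TIME_LIMIT_S:
        return "Timeout"
    return "OK"
```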
5.6. Fault Injection Process
5.7. Experimental Setup Validation Process
6. Experimental Execution and Results
6.1. Golden Runs
6.2. Experiment Execution and Analysis
7. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
AD | Autonomous Driving |
ADS | Autonomous Driving System |
AI | Artificial Intelligence |
API | Application Programming Interface |
AV | Autonomous Vehicle |
AVFI | Autonomous Vehicle Fault Injection |
CARLA | Car Learning to Act (Simulator) |
CPU | Central Processing Unit |
CVC | Computer Vision Center |
DVS | Dynamic Vision Sensor |
FMCW | Frequency-Modulated Continuous Wave |
GNSS | Global Navigation Satellite System |
GPS | Global Positioning System |
HMI | Human–Machine Interface |
IMU | Inertial Measurement Unit |
INS | Inertial Navigation System |
IP | Internet Protocol |
ISO | International Organization for Standardization |
JSON | JavaScript Object Notation |
LGSVL | LG Simulator for Autonomous Vehicles |
MPU | Microprocessor Unit |
NDT | Normal Distributions Transform |
RADAR | Radio Detection and Ranging |
RGB | Red Green Blue (color model) |
ROS | Robot Operating System |
SLAM | Simultaneous Localization and Mapping |
TCP | Transmission Control Protocol |
WGS | World Geodetic System |
XML | eXtensible Markup Language |
Appendix A
Fault ID | Location | Type | Trigger | Outcome |
---|---|---|---|---|
1 | IMU—Gyroscope | Silent | Scenario 1 | OK |
2 | IMU—Gyroscope | Silent | Scenario 2 | OK |
3 | IMU—Gyroscope | Silent | Scenario 3 | OK |
4 | IMU—Gyroscope | Silent | Scenario 4 | OK |
5 | IMU—Gyroscope | Silent | Scenario 5 | OK |
6 | IMU—Gyroscope | Noise | Scenario 1 | OK |
7 | IMU—Gyroscope | Noise | Scenario 2 | OK |
8 | IMU—Gyroscope | Noise | Scenario 3 | OK |
9 | IMU—Gyroscope | Noise | Scenario 4 | OK |
10 | IMU—Gyroscope | Noise | Scenario 5 | OK |
11 | IMU—Gyroscope | Severe | Scenario 1 | Collision |
12 | IMU—Gyroscope | Severe | Scenario 2 | Collision |
13 | IMU—Gyroscope | Severe | Scenario 3 | Timeout |
14 | IMU—Gyroscope | Severe | Scenario 4 | Collision |
15 | IMU—Gyroscope | Severe | Scenario 5 | Out |
16 | IMU—Accelerometer | Silent | Scenario 1 | OK |
17 | IMU—Accelerometer | Silent | Scenario 2 | OK |
18 | IMU—Accelerometer | Silent | Scenario 3 | OK |
19 | IMU—Accelerometer | Silent | Scenario 4 | OK |
20 | IMU—Accelerometer | Silent | Scenario 5 | OK |
21 | IMU—Accelerometer | Noise | Scenario 1 | OK |
22 | IMU—Accelerometer | Noise | Scenario 2 | OK |
23 | IMU—Accelerometer | Noise | Scenario 3 | OK |
24 | IMU—Accelerometer | Noise | Scenario 4 | OK |
25 | IMU—Accelerometer | Noise | Scenario 5 | OK |
26 | IMU—Accelerometer | Severe | Scenario 1 | OK |
27 | IMU—Accelerometer | Severe | Scenario 2 | OK |
28 | IMU—Accelerometer | Severe | Scenario 3 | OK |
29 | IMU—Accelerometer | Severe | Scenario 4 | OK |
30 | IMU—Accelerometer | Severe | Scenario 5 | OK |
31 | IMU—Quaternion | Silent | Scenario 1 | OK |
32 | IMU—Quaternion | Silent | Scenario 2 | OK |
33 | IMU—Quaternion | Silent | Scenario 3 | OK |
34 | IMU—Quaternion | Silent | Scenario 4 | OK |
35 | IMU—Quaternion | Silent | Scenario 5 | OK |
36 | IMU—Quaternion | Noise | Scenario 1 | OK |
37 | IMU—Quaternion | Noise | Scenario 2 | OK |
38 | IMU—Quaternion | Noise | Scenario 3 | OK |
39 | IMU—Quaternion | Noise | Scenario 4 | OK |
40 | IMU—Quaternion | Noise | Scenario 5 | OK |
41 | IMU—Quaternion | Severe | Scenario 1 | OK |
42 | IMU—Quaternion | Severe | Scenario 2 | OK |
43 | IMU—Quaternion | Severe | Scenario 3 | OK |
44 | IMU—Quaternion | Severe | Scenario 4 | OK |
45 | IMU—Quaternion | Severe | Scenario 5 | OK |
46 | LiDAR | Silent | Scenario 1 | Collision |
47 | LiDAR | Silent | Scenario 2 | Collision |
48 | LiDAR | Silent | Scenario 3 | Collision |
49 | LiDAR | Silent | Scenario 4 | Collision |
50 | LiDAR | Silent | Scenario 5 | Collision |
51 | LiDAR | Noise | Scenario 1 | OK |
52 | LiDAR | Noise | Scenario 2 | OK |
53 | LiDAR | Noise | Scenario 3 | OK |
54 | LiDAR | Noise | Scenario 4 | OK |
55 | LiDAR | Noise | Scenario 5 | OK |
56 | LiDAR | Severe | Scenario 1 | Timeout |
57 | LiDAR | Severe | Scenario 2 | Timeout |
58 | LiDAR | Severe | Scenario 3 | Timeout |
59 | LiDAR | Severe | Scenario 4 | Timeout |
60 | LiDAR | Severe | Scenario 5 | Timeout |
61 | GNSS | Silent | Scenario 1 | OK |
62 | GNSS | Silent | Scenario 2 | OK |
63 | GNSS | Silent | Scenario 3 | OK |
64 | GNSS | Silent | Scenario 4 | OK |
65 | GNSS | Silent | Scenario 5 | OK |
66 | GNSS | Noise | Scenario 1 | OK |
67 | GNSS | Noise | Scenario 2 | OK |
68 | GNSS | Noise | Scenario 3 | OK |
69 | GNSS | Noise | Scenario 4 | OK |
70 | GNSS | Noise | Scenario 5 | OK |
71 | GNSS | Severe | Scenario 1 | OK |
72 | GNSS | Severe | Scenario 2 | OK |
73 | GNSS | Severe | Scenario 3 | OK |
74 | GNSS | Severe | Scenario 4 | OK |
75 | GNSS | Severe | Scenario 5 | OK |
76 | IMU—Gyroscope | Severe | Scenario 1 | Timeout |
77 | IMU—Gyroscope | Severe | Scenario 1 | Collision |
78 | IMU—Gyroscope | Severe | Scenario 1 | Out |
79 | IMU—Gyroscope | Severe | Scenario 1 | Collision |
80 | IMU—Gyroscope | Severe | Scenario 1 | Collision |
81 | IMU—Gyroscope | Severe | Scenario 2 | Out |
82 | IMU—Gyroscope | Severe | Scenario 2 | Collision |
83 | IMU—Gyroscope | Severe | Scenario 2 | Collision |
84 | IMU—Gyroscope | Severe | Scenario 2 | Collision |
85 | IMU—Gyroscope | Severe | Scenario 2 | Timeout |
86 | IMU—Gyroscope | Severe | Scenario 3 | Out |
87 | IMU—Gyroscope | Severe | Scenario 3 | Collision |
88 | IMU—Gyroscope | Severe | Scenario 3 | Out |
89 | IMU—Gyroscope | Severe | Scenario 3 | Collision |
90 | IMU—Gyroscope | Severe | Scenario 3 | Collision |
91 | IMU—Gyroscope | Severe | Scenario 4 | Collision |
92 | IMU—Gyroscope | Severe | Scenario 4 | Timeout |
93 | IMU—Gyroscope | Severe | Scenario 4 | Collision |
94 | IMU—Gyroscope | Severe | Scenario 4 | Collision |
95 | IMU—Gyroscope | Severe | Scenario 4 | Timeout |
96 | IMU—Gyroscope | Severe | Scenario 5 | Collision |
97 | IMU—Gyroscope | Severe | Scenario 5 | Collision |
98 | IMU—Gyroscope | Severe | Scenario 5 | Timeout |
99 | IMU—Gyroscope | Severe | Scenario 5 | Timeout |
100 | IMU—Gyroscope | Severe | Scenario 5 | Out |
References
- Silva, Ó.; Cordera, R.; González-González, E.; Nogués, S. Environmental impacts of autonomous vehicles: A review of the scientific literature. Sci. Total Environ. 2022, 830, 154615. [Google Scholar] [CrossRef]
- Luca, O.; Andrei, L.; Iacoboaea, C.; Gaman, F. Unveiling the Hidden Effects of Automated Vehicles on “Do No Significant Harm’’ Components. Sustainability 2023, 15, 11265. [Google Scholar] [CrossRef]
- Ahangar, M.N.; Ahmed, Q.Z.; Khan, F.A.; Hafeez, M. A Survey of Autonomous Vehicles: Enabling Communication Technologies and Challenges. Sensors 2021, 21, 706. [Google Scholar] [CrossRef]
- Pandharipande, A.; Cheng, C.-H.; Dauwels, J.; Gurbuz, S.Z.; Ibanez-Guzman, J.; Li, G.; Piazzoni, A.; Wang, P.; Santra, A. Sensing and Machine Learning for Automotive Perception: A Review. IEEE Sens. J. 2023, 23, 11097–11115. [Google Scholar] [CrossRef]
- Jha, S.; Banerjee, S.S.; Cyriac, J.; Kalbarczyk, Z.T.; Iyer, R.K. AVFI: Fault Injection for Autonomous Vehicles. In Proceedings of the 48th Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops, DSN-W 2018, Luxembourg, 25–28 June 2018; pp. 55–56. [Google Scholar] [CrossRef]
- Maleki, M.; Farooqui, A.; Sangchoolie, B. CarFASE: A Carla-based Tool for Evaluating the Effects of Faults and Attacks on Autonomous Driving Stacks. In Proceedings of the 53rd Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops, DSN-W 2023, Porto, Portugal, 27–30 June 2023; pp. 92–99. [Google Scholar] [CrossRef]
- Jha, S.; Banerjee, S.; Tsai, T.; Hari, S.K.S.; Sullivan, M.B.; Kalbarczyk, Z.T.; Keckler, S.W.; Iyer, R.K. ML-Based Fault Injection for Autonomous Vehicles: A Case for Bayesian Fault Injection. In Proceedings of the 2019 49th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), Portland, OR, USA, 24–27 June 2019; pp. 112–124. [Google Scholar] [CrossRef]
- Maleki, M.; Sangchoolie, B. SUFI: A Simulation-based Fault Injection Tool for Safety Evaluation of Advanced Driver Assistance Systems Modelled in SUMO. In Proceedings of the 2021 17th European Dependable Computing Conference, EDCC 2021, Munich, Germany, 13–16 September 2021; pp. 45–52. [Google Scholar] [CrossRef]
- Saraoglu, M.; Morozov, A.; Janschek, K. MOBATSim: MOdel-Based Autonomous Traffic Simulation Framework for Fault-Error-Failure Chain Analysis. IFAC-PapersOnLine 2019, 52, 239–244. [Google Scholar] [CrossRef]
- Gosavi, M.A.; Rhoades, B.B.; Conrad, J.M. Application of Functional Safety in Autonomous Vehicles Using ISO 26262 Standard: A Survey. In Proceedings of the IEEE SOUTHEASTCON 2018, St. Petersburg, FL, USA, 19–22 April 2018. [Google Scholar] [CrossRef]
- Gat, E.; Bonnasso, R.P.; Murphy, R. On Three-Layer Architectures. Artif. Intell. Mob. Robot. 1998, 195, 210. [Google Scholar]
- Subsumption Control of a Mobile Robot. Available online: https://www.researchgate.net/publication/2875073_Subsumption_Control_of_a_Mobile_Robot (accessed on 14 August 2025).
- The 3T Intelligent Control Architecture|Download Scientific Diagram. Available online: https://www.researchgate.net/figure/The-3T-Intelligent-Control-Architecture_fig1_2851637 (accessed on 14 August 2025).
- Iovino, M.; Scukins, E.; Styrud, J.; Ögren, P.; Smith, C. A survey of Behavior Trees in robotics and AI. Rob. Auton. Syst. 2022, 154, 104096. [Google Scholar] [CrossRef]
- García, C.E.; Prett, D.M.; Morari, M. Model predictive control: Theory and practice—A survey. Automatica 1989, 25, 335–348. [Google Scholar] [CrossRef]
- Pendleton, S.D.; Andersen, H.; Du, X.; Shen, X.; Meghjani, M.; Eng, Y.H.; Rus, D.; Ang, M.H. Perception, Planning, Control, and Coordination for Autonomous Vehicles. Machines 2017, 5, 6. [Google Scholar] [CrossRef]
- Velasco-Hernandez, G.; Yeong, D.J.; Barry, J.; Walsh, J. Autonomous Driving Architectures, Perception and Data Fusion: A Review. In Proceedings of the 2020 IEEE 16th International Conference on Intelligent Computer Communication and Processing (ICCP), Cluj-Napoca, Romania, 3–5 September 2020; pp. 315–321. [Google Scholar]
- Kumar, D.; Muhammad, N. A Survey on Localization for Autonomous Vehicles. IEEE Access 2023, 11, 115865–115883. [Google Scholar] [CrossRef]
- Chalvatzaras, A.; Pratikakis, I.; Amanatiadis, A.A. A Survey on Map-Based Localization Techniques for Autonomous Vehicles. IEEE Trans. Intell. Veh. 2023, 8, 1574–1596. [Google Scholar] [CrossRef]
- Karle, P.; Geisslinger, M.; Betz, J.; Lienkamp, M. Scenario Understanding and Motion Prediction for Autonomous Vehicles—Review and Comparison. IEEE Trans. Intell. Transp. Syst. 2022, 23, 16962–16982. [Google Scholar] [CrossRef]
- Wang, X.; Maleki, M.A.; Azhar, M.W.; Trancoso, P. Moving Forward: A Review of Autonomous Driving Software and Hardware Systems. arXiv 2024, arXiv:2411.10291. [Google Scholar] [CrossRef]
- He, Z.; Nie, L.; Yin, Z.; Huang, S. A Two-Layer Controller for Lateral Path Tracking Control of Autonomous Vehicles. Sensors 2020, 20, 3689. [Google Scholar] [CrossRef]
- Chen, G.; Zhao, X.; Gao, Z.; Hua, M. Dynamic Drifting Control for General Path Tracking of Autonomous Vehicles. IEEE Trans. Intell. Veh. 2023, 8, 2527–2537. [Google Scholar] [CrossRef]
- Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
- Ignatious, H.A.; Sayed, H.-E.; Khan, M. An overview of sensors in Autonomous Vehicles. Procedia Comput. Sci. 2022, 198, 736–741. [Google Scholar] [CrossRef]
- Matos, F.; Bernardino, J.; Durães, J.; Cunha, J. A Survey on Sensor Failures in Autonomous Vehicles: Challenges and Solutions. Sensors 2024, 24, 5108. [Google Scholar] [CrossRef]
- Ortiz, F.M.; Sammarco, M.; Costa, L.H.M.K.; Detyniecki, M. Applications and Services Using Vehicular Exteroceptive Sensors: A Survey. IEEE Trans. Intell. Veh. 2023, 8, 949–969. [Google Scholar] [CrossRef]
- Budisusila, E.N.; Khosyi’in, M.; Prasetyowati, S.A.D.; Suprapto, B.Y.; Nawawi, Z. Ultrasonic Multi-Sensor Detection Patterns on Autonomous Vehicles Using Data Stream Method. In Proceedings of the 2021 8th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), Semarang, Indonesia, 20–21 October 2021; pp. 144–150. [Google Scholar]
- Paidi, V.; Fleyeh, H.; Håkansson, J.; Nyberg, R.G. Smart parking sensors, technologies and applications for open parking lots: A review. IET Intell. Transp. Syst. 2018, 12, 735–741. [Google Scholar] [CrossRef]
- Vargas, J.; Alsweiss, S.; Toker, O.; Razdan, R.; Santos, J. An Overview of Autonomous Vehicles Sensors and Their Vulnerability to Weather Conditions. Sensors 2021, 21, 5397. [Google Scholar] [CrossRef] [PubMed]
- Rosique, F.; Navarro, P.J.; Fernández, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648. [Google Scholar] [CrossRef]
- Wang, Z.; Wu, Y.; Niu, Q. Multi-Sensor Fusion in Automated Driving: A Survey. IEEE Access 2020, 8, 2847–2868. [Google Scholar] [CrossRef]
- Komissarov, R.; Kozlov, V.; Filonov, D.; Ginzburg, P. Partially coherent radar unties range resolution from bandwidth limitations. Nat. Commun. 2019, 10, 1423. [Google Scholar] [CrossRef]
- Skaria, S.; Al-Hourani, A.; Evans, R.J.; Sithamparanathan, K.; Parampalli, U. Interference Mitigation in Automotive Radars Using Pseudo-Random Cyclic Orthogonal Sequences. Sensors 2019, 19, 4459. [Google Scholar] [CrossRef]
- Pirkani, A.; Norouzian, F.; Hoare, E.; Cherniakov, M.; Gashinova, M. Automotive interference statistics and their effect on radar detector. IET Radar Sonar Navig. 2022, 16, 9–21. [Google Scholar] [CrossRef]
- Wu, Z.; Song, Y.; Liu, J.; Chen, Y.; Sha, H.; Shi, M.; Zhang, H.; Qin, L.; Liang, L.; Jia, P.; et al. Advancements in Key Parameters of Frequency-Modulated Continuous-Wave Light Detection and Ranging: A Research Review. Appl. Sci. 2024, 14, 7810. [Google Scholar] [CrossRef]
- Staffas, T.; Elshaari, A.; Zwiller, V. Frequency modulated continuous wave and time of flight LIDAR with single photons: A comparison. Opt. Express 2024, 32, 7332–7341. [Google Scholar] [CrossRef]
- Ma, J.; Zhuo, S.; Qiu, L.; Gao, Y.; Wu, Y.; Zhong, M.; Bai, R.; Sun, M.; Chiang, P.Y.; Ma, J.; et al. A review of ToF-based LiDAR. J. Semicond. 2024, 45, 101201. [Google Scholar] [CrossRef]
- Li, N.; Ho, C.P.; Xue, J.; Lim, L.W.; Chen, G.; Fu, Y.H.; Lee, L.Y.T. A Progress Review on Solid-State LiDAR and Nanophotonics-Based LiDAR Sensors. Laser Photon Rev. 2022, 16, 2100511. [Google Scholar] [CrossRef]
- Holzhüter, H.; Bödewadt, J.; Bayesteh, S.; Aschinger, A.; Blume, H. Technical concepts of automotive LiDAR sensors: A review. Opt. Eng. 2023, 62, 031213. [Google Scholar] [CrossRef]
- Li, Y.; Ibanez-Guzman, J. Lidar for Autonomous Driving: The Principles, Challenges, and Trends for Automotive Lidar and Perception Systems. IEEE Signal Process. Mag. 2020, 37, 50–61. [Google Scholar] [CrossRef]
- Dreissig, M.; Scheuble, D.; Piewak, F.; Boedecker, J. Survey on LiDAR Perception in Adverse Weather Conditions. In Proceedings of the 2023 IEEE Intelligent Vehicles Symposium (IV), Anchorage, AK, USA, 4–7 June 2023; pp. 1–8. [Google Scholar]
- Damodaran, D.; Mozaffari, S.; Alirezaee, S.; Ahamed, M.J. Experimental Analysis of the Behavior of Mirror-like Objects in LiDAR-Based Robot Navigation. Appl. Sci. 2023, 13, 2908. [Google Scholar] [CrossRef]
- Roszyk, K.; Nowicki, M.R.; Skrzypczyński, P. Adopting the YOLOv4 Architecture for Low-Latency Multispectral Pedestrian Detection in Autonomous Driving. Sensors 2022, 22, 1082. [Google Scholar] [CrossRef]
- Sun, C.; Chen, Y.; Qiu, X.; Li, R.; You, L. MRD-YOLO: A Multispectral Object Detection Algorithm for Complex Road Scenes. Sensors 2024, 24, 3222. [Google Scholar] [CrossRef]
- Xie, Y.; Zhang, L.; Yu, X.; Xie, W. YOLO-MS: Multispectral Object Detection via Feature Interaction and Self-Attention Guided Fusion. IEEE Trans. Cogn. Dev. Syst. 2023, 15, 2132–2143. [Google Scholar] [CrossRef]
- Altay, F.; Velipasalar, S. The Use of Thermal Cameras for Pedestrian Detection. IEEE Sens. J. 2022, 22, 11489–11498. [Google Scholar] [CrossRef]
- Ceccarelli, A.; Secci, F. RGB Cameras Failures and Their Effects in Autonomous Driving Applications. IEEE Trans. Dependable Secur. Comput. 2023, 20, 2731–2745. [Google Scholar] [CrossRef]
- Maciuk, K. Determination of GNSS receiver elevation-dependent clock bias accuracy. Measurement 2021, 168, 108336. [Google Scholar] [CrossRef]
- Raveena, C.S.; Sravya, R.S.; Kumar, R.V.; Chavan, A. Sensor Fusion Module Using IMU and GPS Sensors for Autonomous Car. In Proceedings of the 2020 IEEE International Conference for Innovation in Technology (INOCON), Bangluru, India, 6–8 November 2020; pp. 1–6. [Google Scholar]
- Yusefi, A.; Durdu, A.; Bozkaya, F.; Tığlıoğlu, Ş.; Yılmaz, A.; Sungur, C. A Generalizable D-VIO and Its Fusion with GNSS/IMU for Improved Autonomous Vehicle Localization. IEEE Trans. Intell. Veh. 2024, 9, 2893–2907. [Google Scholar] [CrossRef]
- Xia, X.; Hashemi, E.; Xiong, L.; Khajepour, A.; Xu, N. Autonomous Vehicles Sideslip Angle Estimation: Single Antenna GNSS/IMU Fusion With Observability Analysis. IEEE Internet Things J. 2021, 8, 14845–14859. [Google Scholar] [CrossRef]
- Shahian Jahromi, B.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar] [CrossRef] [PubMed]
- Nobis, F.; Geisslinger, M.; Weber, M.; Betz, J.; Lienkamp, M. A Deep Learning-based Radar and Camera Sensor Fusion Architecture for Object Detection. In Proceedings of the 2019 Sensor Data Fusion: Trends, Solutions, Applications (SDF), Bonn, Germany, 15–17 October 2019; pp. 1–7. [Google Scholar]
- Intellias. How Sensor Fusion for Autonomous Cars Helps Avoid Deaths on the Road. Intellias Blog. Available online: https://intellias.com/sensor-fusion-autonomous-cars-helps-avoid-deaths-road/ (accessed on 14 June 2025).
- Brena, R.F.; Aguileta, A.A.; Trejo, L.A.; Molino-Minero-Re, E.; Mayora, O. Choosing the Best Sensor Fusion Method: A Machine-Learning Approach. Sensors 2020, 20, 2350. [Google Scholar] [CrossRef] [PubMed]
- Xiang, C.; Feng, C.; Xie, X.; Shi, B.; Lu, H.; Lv, Y.; Yang, M.; Niu, Z. Multi-Sensor Fusion and Cooperative Perception for Autonomous Driving: A Review. IEEE Intell. Transp. Syst. Mag. 2023, 15, 36–58. [Google Scholar] [CrossRef]
- Fayyad, J.; Jaradat, M.A.; Gruyer, D.; Najjaran, H. Deep Learning Sensor Fusion for Autonomous Vehicle Perception and Localization: A Review. Sensors 2020, 20, 4220. [Google Scholar] [CrossRef]
- Gu, S.; Zhang, Y.; Yang, J.; Alvarez, J.M.; Kong, H. Two-View Fusion based Convolutional Neural Network for Urban Road Detection. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 4–8 November 2019; pp. 6144–6149. [Google Scholar]
- Alfred Daniel, J.; Chandru Vignesh, C.; Muthu, B.A.; Senthil Kumar, R.; Sivaparthipan, C.; Marin, C.E.M. Fully convolutional neural networks for LIDAR–camera fusion for pedestrian detection in autonomous vehicle. Multimed. Tools Appl. 2023, 82, 25107–25130. [Google Scholar] [CrossRef]
- Gao, L.; Xia, X.; Zheng, Z.; Ma, J. GNSS/IMU/LiDAR fusion for vehicle localization in urban driving environments within a consensus framework. Mech. Syst. Signal Process. 2023, 205, 110862. [Google Scholar] [CrossRef]
- Kim, J.; Kim, J.; Cho, J. An advanced object classification strategy using YOLO through camera and LiDAR sensor fusion. In Proceedings of the 2019 13th International Conference on Signal Processing and Communication Systems (ICSPCS), Gold Coast, Australia, 16–18 December 2019; pp. 1–5. [Google Scholar]
- Banerjee, K.; Notz, D.; Windelen, J.; Gavarraju, S.; He, M. Online Camera LiDAR Fusion and Object Detection on Hybrid Data for Autonomous Driving. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 1632–1638. [Google Scholar]
- Pollach, M.; Schiegg, F.; Knoll, A. Low Latency and Low-Level Sensor Fusion for Automotive Use-Cases. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 6780–6786. [Google Scholar]
- Wang, X.; Li, K.; Chehri, A. Multi-Sensor Fusion Technology for 3D Object Detection in Autonomous Driving: A Review. IEEE Trans. Intell. Transp. Syst. 2023, 25, 1148–1165. [Google Scholar] [CrossRef]
- Zhao, X.; Sun, P.; Xu, Z.; Min, H.; Yu, H. Fusion of 3D LIDAR and Camera Data for Object Detection in Autonomous Vehicle Applications. IEEE Sens. J. 2020, 20, 4901–4913. [Google Scholar] [CrossRef]
- Bijelic, M.; Gruber, T.; Mannan, F.; Kraus, F.; Ritter, W.; Dietmayer, K.; Heide, F. Seeing Through Fog Without Seeing Fog: Deep Multimodal Sensor Fusion in Unseen Adverse Weather. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 11679–11689. [Google Scholar]
- AlZu’bi, S.; Jararweh, Y. Data Fusion in Autonomous Vehicles Research, Literature Tracing from Imaginary Idea to Smart Surrounding Community. In Proceedings of the 2020 Fifth International Conference on Fog and Mobile Edge Computing (FMEC), Paris, France, 30 June–3 July 2020; pp. 306–311. [Google Scholar]
- Hasanujjaman, M.; Chowdhury, M.Z.; Jang, Y.M. Sensor Fusion in Autonomous Vehicle with Traffic Surveillance Camera System: Detection, Localization, and AI Networking. Sensors 2023, 23, 3335. [Google Scholar] [CrossRef]
- Ogunrinde, I.; Bernadin, S. Deep Camera–Radar Fusion with an Attention Framework for Autonomous Vehicle Vision in Foggy Weather Conditions. Sensors 2023, 23, 6255. [Google Scholar] [CrossRef]
- Yao, S.; Guan, R.; Huang, X.; Li, Z.; Sha, X.; Yue, Y.; Lim, E.G.; Seo, H.; Man, K.L.; Zhu, X.; et al. Radar-Camera Fusion for Object Detection and Semantic Segmentation in Autonomous Driving: A Comprehensive Review. IEEE Trans. Intell. Veh. 2024, 9, 2094–2128. [Google Scholar] [CrossRef]
- Choi, J.D.; Kim, M.Y. A Sensor Fusion System with Thermal Infrared Camera and LiDAR for Autonomous Vehicles: Its Calibration and Application. In Proceedings of the 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), Jeju, Republic of Korea, 17–20 August 2021; pp. 361–365. [Google Scholar]
- Wang, S.; Mei, L.; Yin, Z.; Li, H.; Liu, R.; Jiang, W.; Lu, C.X. End-to-End Target Liveness Detection via mmWave Radar and Vision Fusion for Autonomous Vehicles. ACM Trans. Sens. Netw. 2024, 20, 1–26. [Google Scholar] [CrossRef]
- Shi, J.; Tang, Y.; Gao, J.; Piao, C.; Wang, Z. Multitarget-Tracking Method Based on the Fusion of Millimeter-Wave Radar and LiDAR Sensor Information for Autonomous Vehicles. Sensors 2023, 23, 6920. [Google Scholar] [CrossRef] [PubMed]
- Lai, Z.; Le, L.; Silva, V.; Bräunl, T. A Comprehensive Comparative Analysis of Carla and Awsim: Open-Source Autonomous Driving Simulators. 2025. Available online: https://ssrn.com/abstract=5096777 (accessed on 16 June 2025). [CrossRef]
- Autoware Universe Documentation. Available online: https://autowarefoundation.github.io/autoware_universe/main/ (accessed on 16 June 2025).
- Autoware Overview—Autoware. Available online: https://autoware.org/autoware-overview/ (accessed on 12 July 2025).
- Arlat, J.; Aguera, M.; Amat, L.; Crouzet, Y.; Fabre, J.C.; Laprie, J.C.; Martins, E.; Powell, D. Fault Injection for Dependability Validation: A Methodology and Some Applications. IEEE Trans. Softw. Eng. 1990, 16, 166–182. [Google Scholar] [CrossRef]
- Hong, D.; Moon, C. Autonomous Driving System Architecture with Integrated ROS2 and Adaptive AUTOSAR. Electronics 2024, 13, 1303. [Google Scholar] [CrossRef]
- Rong, G.; Shin, B.H.; Tabatabaee, H.; Lu, Q.; Lemke, S.; Možeiko, M.; Boise, E.; Uhm, G.; Gerow, M.; Mehta, S.; et al. LGSVL Simulator: A High Fidelity Simulator for Autonomous Driving. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems, ITSC 2020, Rhodes, Greece, 20–23 September 2020. [Google Scholar] [CrossRef]
- Raju, V.M.; Gupta, V.; Lomate, S. Performance of Open Autonomous Vehicle Platforms: Autoware and Apollo. In Proceedings of the 2019 IEEE 5th International Conference for Convergence in Technology, I2CT 2019, Bombay, India, 29–31 March 2019. [Google Scholar] [CrossRef]
- Shah, S.; Dey, D.; Lovett, C.; Kapoor, A. AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles. In Field and Service Robotics; Springer: Cham, Switzerland, 2018; Volume 5, pp. 621–635. [Google Scholar] [CrossRef]
- Cao, L.; Feng, X.; Liu, J.; Zhou, G. Automatic Generation System for Autonomous Driving Simulation Scenarios Based on PreScan. Appl. Sci. 2024, 14, 1354. [Google Scholar] [CrossRef]
- Kemeny, A.; Mérienne, F. Trends in Driving Simulation Design and Experiments. In Driving Simulation Conference Europe 2010 Proceedings; Actes: Paris, France, 2010. [Google Scholar]
- Winner, H.; Lemmer, K.; Form, T.; Mazzega, J. PEGASUS—First Steps for the Safe Introduction of Automated Driving. In Road Vehicle Automation 5; Springer: Cham, Switzerland, 2019; pp. 185–195. [Google Scholar] [CrossRef]
- Safety Pool—Powered by Deepen AI and WMG University of Warwick. Available online: https://www.safetypool.ai/ (accessed on 13 July 2025).
- Li, Q.; Peng, Z.; Feng, L.; Zhang, Q.; Xue, Z.; Zhou, B. MetaDrive: Composing Diverse Driving Scenarios for Generalizable Reinforcement Learning. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 45, 3461–3475. [Google Scholar] [CrossRef]
- Kaljavesi, G.; Kerbl, T.; Betz, T.; Mitkovskii, K.; Diermeyer, F. CARLA-Autoware-Bridge: Facilitating Autonomous Driving Research with a Unified Framework for Simulation and Module Development. In Proceedings of the IEEE Intelligent Vehicles Symposium, Proceedings, Jeju, Republic of Korea, 2–5 June 2024; pp. 224–229. [Google Scholar] [CrossRef]
- Elmquist, A.; Negrut, D. Methods and Models for Simulating Autonomous Vehicle Sensors. IEEE Trans. Intell. Veh. 2020, 5, 684–692. [Google Scholar] [CrossRef]
- Lethander, K.; Taylor, C. Conservative estimation of inertial sensor errors using allan variance data. In Proceedings of the 34th International Technical Meeting of the Satellite Division of the Institute of Navigation, ION GNSS+ 2021, St. Louis, MO, USA, 20–24 September 2021; pp. 2556–2564. [Google Scholar] [CrossRef]
- Fang, X.; Song, D.; Shi, C.; Fan, L.; Hu, Z. Multipath Error Modeling Methodology for GNSS Integrity Monitoring Using a Global Optimization Strategy. Remote Sens. 2022, 14, 2130. [Google Scholar] [CrossRef]
- Kim, J.; Park, B.J.; Kim, J. Empirical Analysis of Autonomous Vehicle’s LiDAR Detection Performance Degradation for Actual Road Driving in Rain and Fog. Sensors 2023, 23, 2972. [Google Scholar] [CrossRef]
- Cannot Get the Traffic_Light_Rois and the Image Raw in rviz2 When Executing Autoware+AWSIM Simulation. Issue #5567, Autowarefoundation/Autoware_Universe. Available online: https://github.com/autowarefoundation/autoware_universe/issues/5567 (accessed on 13 August 2025).
- Integrate Trafficlight Detection as an Exemplary Case for Town10. Issue #8, TUMFTM/Carla-Autoware-Bridge. Available online: https://github.com/TUMFTM/Carla-Autoware-Bridge/issues/8 (accessed on 13 August 2025).
- Burnett, K.; Schoellig, A.P.; Barfoot, T.D. Continuous-Time Radar-Inertial and Lidar-Inertial Odometry using a Gaussian Process Motion Prior. IEEE Trans. Robot. 2024, 41, 1059–1076. [Google Scholar] [CrossRef]
- Sun, C.; Sun, P.; Wang, J.; Guo, Y.; Zhao, X. Understanding LiDAR Performance for Autonomous Vehicles Under Snowfall Conditions. IEEE Trans. Intell. Transp. Syst. 2024, 25, 16462–16472. [Google Scholar] [CrossRef]
- Habib, A.F.; Al-Durgham, M.; Kersting, A.P.; Quackenbush, P. Error Budget of Lidar Systems and Quality Control of the Derived Point Cloud. In The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences; International Society of Photogrammetry and Remote Sensing: Beijing, China, 2008. [Google Scholar]
- Meng, X.; Wang, H.; Liu, B. A Robust Vehicle Localization Approach Based on GNSS/IMU/DMI/LiDAR Sensor Fusion for Autonomous Vehicles. Sensors 2017, 17, 2140. [Google Scholar] [CrossRef]
- Enge, P.K. The Global Positioning System: Signals, measurements, and performance. Int. J. Wirel. Inf. Netw. 1994, 1, 83–105. [Google Scholar] [CrossRef]
- Bosch Sensortec GmbH. BMI160: Small, Low-Power Inertial Measurement Unit—Data Sheet; Document No. BST-BMI160-DS000-09, Revision 1.0; published 25 November 2020. Available online: https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bmi160-ds000.pdf (accessed on 14 July 2025).
- InvenSense Inc. MPU-6000 and MPU-6050 Product Specification, Document No. PS-MPU-6000A-00, Revision 3.4, released 19 August 2013. Available online: https://invensense.tdk.com/wp-content/uploads/2015/02/MPU-6000-Datasheet1.pdf (accessed on 14 July 2025).
- Analog Devices, Inc. ADXL345: Digital Accelerometer Data Sheet, Rev. G, published 26 October 2015. Available online: https://www.analog.com/media/en/technical-documentation/data-sheets/adxl345.pdf (accessed on 14 July 2025).
- Sabatini, A.M. Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing. IEEE Trans. Biomed. Eng. 2006, 53, 1346–1356. [Google Scholar] [CrossRef]
- Xiao, X.; Zhang, Y.; Li, H.; Wang, H.; Li, B. Camera-IMU Extrinsic Calibration Quality Monitoring for Autonomous Ground Vehicles. IEEE Robot. Autom. Lett. 2022, 7, 4614–4621. [Google Scholar] [CrossRef]
- Espineira, J.P.; Robinson, J.; Groenewald, J.; Chan, P.H.; Donzella, V. Realistic LiDAR with Noise Model for Real-Time Testing of Automated Vehicles in a Virtual Environment. IEEE Sens. J. 2021, 21, 9919–9926. [Google Scholar] [CrossRef]
- IEEE Xplore Full-Text PDF. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9925708 (accessed on 16 June 2025).
- Wang, Y.; Hwang, J.N.; Wang, G.; Liu, H.; Kim, K.J.; Hsu, H.M.; Cai, J.; Zhang, H.; Jiang, Z.; Gu, R. ROD2021 challenge: A summary for radar object detection challenge for autonomous driving applications. In Proceedings of the ICMR 2021—Proceedings of the 2021 International Conference on Multimedia Retrieval, Taipei, Taiwan, 21–24 August 2021; pp. 553–559. [Google Scholar] [CrossRef]
- Carballo, A.; Lambert, J.; Monrroy, A.; Wong, D.; Narksri, P.; Kitsukawa, Y.; Takeuchi, E.; Kato, S.; Takeda, K. LIBRE: The Multiple 3D LiDAR Dataset. In Proceedings of the IEEE Intelligent Vehicles Symposium, Proceedings, Las Vegas, NV, USA, 19 October–13 November 2020; pp. 1094–1101. [Google Scholar] [CrossRef]
Location | Nominal Values | Severe Noise Values |
---|---|---|
IG—IMU—gyroscope | Deviation up to 5% of angular velocity | Deviation between 5% and 50% of angular velocity |
IA—IMU—accelerometer | Deviation up to 5% of linear acceleration | Deviation between 5% and 50% of linear acceleration |
IQ—IMU—quaternion | Rotational disturbance up to 0.01 rad (~0.57°) | Disturbance of 0.2 rad (~11.5°)
LiDAR | Point deviation up to 2% of point range (distance from sensor) | Point deviation between 2% and 10% of point range
GNSS | ±2 m positional jitter (≈±0.00002°) | ±20 m jitter (≈±0.0002°)
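The following sketch illustrates how severe-noise values of this magnitude could be applied to a gyroscope sample and a GNSS fix. The perturbation model (uniform multiplicative deviation for the gyroscope, additive positional jitter for GNSS) is an assumption made for illustration and is not necessarily the exact model implemented in the framework.

```python
import random


def severe_gyro_noise(angular_velocity: float) -> float:
    # Deviation between 5% and 50% of the measured angular velocity, with random sign.
    factor = random.uniform(0.05, 0.50) * random.choice([-1.0, 1.0])
    return angular_velocity * (1.0 + factor)


def severe_gnss_noise(lat_deg: float, lon_deg: float) -> tuple[float, float]:
    # Roughly ±20 m of jitter, i.e., about ±0.0002 degrees at mid-latitudes.
    jitter = lambda: random.uniform(-0.0002, 0.0002)
    return lat_deg + jitter(), lon_deg + jitter()
```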
Location | Trigger 1 | Trigger 2 | Trigger 3 | Trigger 4 | Trigger 5 |
---|---|---|---|---|---|
GNSS | OK | OK | OK | OK | OK |
IMU—Accelerometer | OK | OK | OK | OK | OK |
IMU—Gyroscope | OK | OK | OK | OK | OK |
IMU—Quaternion | OK | OK | OK | OK | OK |
LiDAR | OK | OK | OK | OK | OK |
Location | Type | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 |
---|---|---|---|---|---|---|
GNSS | Severe | OK | OK | OK | OK | OK |
GNSS | Silent | OK | OK | OK | OK | OK |
IMU—Accelerometer | Severe | OK | OK | OK | OK | OK |
IMU—Accelerometer | Silent | OK | OK | OK | OK | OK |
IMU—Gyroscope | Severe | Collision | Collision | Timeout | Collision | Out |
IMU—Gyroscope | Silent | OK | OK | OK | OK | OK |
IMU—Quaternion | Severe | OK | OK | OK | OK | OK |
IMU—Quaternion | Silent | OK | OK | OK | OK | OK |
LiDAR | Severe | Timeout | Timeout | Timeout | Timeout | Timeout |
LiDAR | Silent | Collision | Collision | Collision | Collision | Collision |
Location | Type | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 |
---|---|---|---|---|---|---|
IMU—Gyroscope | Severe | Collision | Collision | Collision | Collision | Collision
 | | Collision | Collision | Collision | Collision | Collision
 | | Collision | Collision | Collision | Collision | Out
 | | Out | Out | Out | Collision | Timeout
 | | Timeout | Timeout | Out | Out | Timeout
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Matos, F.; Durães, J.; Cunha, J. Simulating the Effects of Sensor Failures on Autonomous Vehicles for Safety Evaluation. Informatics 2025, 12, 94. https://doi.org/10.3390/informatics12030094
Matos F, Durães J, Cunha J. Simulating the Effects of Sensor Failures on Autonomous Vehicles for Safety Evaluation. Informatics. 2025; 12(3):94. https://doi.org/10.3390/informatics12030094
Chicago/Turabian StyleMatos, Francisco, João Durães, and João Cunha. 2025. "Simulating the Effects of Sensor Failures on Autonomous Vehicles for Safety Evaluation" Informatics 12, no. 3: 94. https://doi.org/10.3390/informatics12030094
APA StyleMatos, F., Durães, J., & Cunha, J. (2025). Simulating the Effects of Sensor Failures on Autonomous Vehicles for Safety Evaluation. Informatics, 12(3), 94. https://doi.org/10.3390/informatics12030094