
Sensors and Sensor Fusion for Future Mobility Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: closed (30 April 2023) | Viewed by 20448

Special Issue Editors


Guest Editor
1. COSYS Department for Automated Vehicle Research, Université Gustave Eiffel, 25 Allée des Marronniers, 78000 Versailles, France
2. PICS-L Lab, COSYS Department, Université Gustave Eiffel, 25 Allée des Marronniers, 78000 Versailles, France
3. The International Associated Lab ICCAM (France-Australia), Université Gustave Eiffel, 25 Allée des Marronniers, 78000 Versailles, France
Interests: automated driving; multisensor data fusion; cooperative systems; environment perception; extended perception; sensors simulation for ADAS prototyping

Guest Editor
Deputy Director of the PICS-L Lab, COSYS Department, Université Gustave Eiffel, 25 Allée des Marronniers, 78000 Versailles, France
Interests: automated driving; optimal path planning; ecomobility; ecoconsumption

Guest Editor
School of Engineering, University of British Columbia, Kelowna, BC V1V 1V7, Canada
Interests: artificial intelligence; sensor fusion; machine learning; computer vision with applications in unmanned vehicles, robotics, and industrial automation

Guest Editor
Centre for Accident Research & Road Safety, Queensland University of Technology, Brisbane, QLD 4059, Australia
Interests: automated driving; cooperative systems; path planning; control theory; risk assessment

Special Issue Information

Dear Colleagues,

With the development of advanced driver assistance systems (ADAS) for connected and automated mobility, functional safety has become a critical issue. Building automated driving systems and applications involves four main stages: environment sensing and understanding, decision-making, trajectory generation, and action. However, the quality and performance of the last three stages depend on the quality of both the perception stage and the sensors. Existing sensor technologies are impaired by effects such as interference and adverse weather conditions; their field of view can be obstructed, and the resulting scene understanding can therefore be misleading. It is thus important to develop multisensor data fusion architectures and sensor performance monitoring systems that guarantee robust and reliable perception in all conditions and situations. Currently, combining sensors that operate in different wavebands and have different characteristics appears to be a relevant way to address these issues.

This Special Issue will provide an overview of recent research on the new generation of sensor technologies, sensor processing and fusion, and fusion architectures for the development of efficient, reliable, and robust connected, smart, and future mobility systems. Indeed, before a high level of accuracy and safety can be ensured in deployed connected and automated driving applications, it is necessary to guarantee a high level of information and perception quality, and to monitor sensor and perception performance. Therefore, research contributions addressing sensor data improvement, sensor data restoration methods, sensor reliability/robustness/integrity assessment, adaptive and innovative perception methods, AI-based fusion approaches, cooperative and connected perception, risk assessment, and safe operation methods for sensor failure detection, identification, and correction are welcome.

Dr. Dominique Gruyer
Dr. Olivier Orfila
Prof. Dr. Homayoun Najjaran
Prof. Sebastien Glaser
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • New sensors for automated vehicles
  • Sensor modeling
  • Sensor data fusion
  • Information processing
  • Cooperative perception
  • Fusion for connected and automated vehicles
  • Automated driving
  • Sensor fusion architecture
  • Smart sensors
  • AI-based sensor fusion

Published Papers (5 papers)


Research


19 pages, 862 KiB  
Article
DOPESLAM: High-Precision ROS-Based Semantic 3D SLAM in a Dynamic Environment
by Jesse Roch, Jamil Fayyad and Homayoun Najjaran
Sensors 2023, 23(9), 4364; https://doi.org/10.3390/s23094364 - 28 Apr 2023
Cited by 1 | Viewed by 1943
Abstract
Recent advancements in deep learning techniques have accelerated the growth of robotic vision systems. One way this technology can be applied is to use a mobile robot to automatically generate a 3D map and identify objects within it. This paper addresses the important challenge of labeling objects and generating 3D maps in a dynamic environment. It explores a solution to this problem by combining Deep Object Pose Estimation (DOPE) with Real-Time Appearance-Based Mapping (RTAB-Map) by means of loosely coupled parallel fusion. DOPE's abilities are enhanced by leveraging its belief map system to filter uncertain key points, which increases precision to ensure that only the best object labels end up on the map. Additionally, DOPE's pipeline is modified to enable shape-based object recognition using depth maps, allowing it to identify objects in complete darkness. Three experiments are performed to find the ideal training dataset, quantify the increased precision, and evaluate the overall performance of the system. The results show that the proposed solution outperforms existing methods in most intended scenarios, such as in unilluminated scenes. The proposed key point filtering technique has demonstrated an improvement in the average inference speed, achieving a speedup of 2.6× and improving the average distance to the ground truth compared to the original DOPE algorithm.
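The belief-map filtering described in the abstract amounts to discarding keypoints whose confidence falls below a threshold before they are fused into the map. A minimal sketch of that idea (the keypoint representation and threshold value are illustrative assumptions, not the paper's actual parameters):

```python
def filter_keypoints(keypoints, min_belief=0.5):
    """Keep only keypoints whose belief-map confidence meets the threshold.

    keypoints: list of (x, y, belief) tuples -- a hypothetical stand-in
    for DOPE's per-keypoint belief-map peaks. min_belief is illustrative.
    """
    return [(x, y) for (x, y, belief) in keypoints if belief >= min_belief]
```

Only the surviving keypoints are then passed to pose estimation, which is why the filter trades a little recall for the precision gain reported in the abstract.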
(This article belongs to the Special Issue Sensors and Sensor Fusion for Future Mobility Systems)

28 pages, 17671 KiB  
Article
Impact Evaluation of Cyberattacks on Connected and Automated Vehicles in Mixed Traffic Flow and Its Resilient and Robust Control Strategy
by Ting Wang, Meiting Tu, Hao Lyu, Ye Li, Olivier Orfila, Guojian Zou and Dominique Gruyer
Sensors 2023, 23(1), 74; https://doi.org/10.3390/s23010074 - 21 Dec 2022
Cited by 2 | Viewed by 1921
Abstract
Connected and automated vehicles (CAVs) present significant potential for improving road safety and mitigating traffic congestion for the future mobility system. However, cooperative driving vehicles are more vulnerable to cyberattacks when communicating with each other, which will introduce a new threat to the transportation system. In order to guarantee safety aspects, it is also necessary to ensure a high level of information quality for CAVs. To the best of our knowledge, this is the first investigation on the impacts of cyberattacks on CAVs in mixed traffic (large vehicles, medium vehicles, and small vehicles) from the perspective of vehicle dynamics. The paper aims to explore the influence of cyberattacks on the evolution of CAV mixed traffic flow and propose a resilient and robust control strategy (RRCS) to alleviate the threat of cyberattacks. First, we propose a CAV mixed traffic car-following model considering cyberattacks based on the Intelligent Driver Model (IDM). Furthermore, an RRCS against cyberattacks is developed by setting the acceleration control switch, and its impacts on the mixed traffic flow are explored for different cyberattack types. Finally, sensitivity analyses are conducted for different platoon compositions, vehicle distributions, and cyberattack intensities. The results show that the proposed RRCS is robust and can resist the negative threats of cyberattacks on the CAV platoon, thereby providing a theoretical basis for restoring the stability and improving the safety of CAVs.
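The car-following baseline named in the abstract, the Intelligent Driver Model (IDM), computes each vehicle's acceleration from its speed, the gap to the leader, and the approach rate. A minimal sketch of the standard IDM equation follows; the parameter values are generic textbook defaults, not the paper's calibrated values:

```python
import math

def idm_acceleration(v, v_lead, gap,
                     v0=33.3, T=1.5, a_max=1.0, b=1.5, s0=2.0, delta=4):
    """Standard IDM: a = a_max * (1 - (v/v0)**delta - (s_star/gap)**2).

    v, v_lead: follower and leader speeds (m/s); gap: bumper-to-bumper gap (m).
    v0 desired speed, T time headway, a_max/b comfortable accel/decel,
    s0 jam distance -- all generic defaults, for illustration only.
    """
    dv = v - v_lead  # approach rate (positive when closing in)
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v0) ** delta - (s_star / gap) ** 2)
```

A spoofed position or speed reading enters this law through `gap` or `v_lead`, which is how a cyberattack on the communicated state propagates into the platoon dynamics that the paper studies.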

16 pages, 4282 KiB  
Article
Lightweight Security Transmission in Wireless Sensor Networks through Information Hiding and Data Flipping
by Lan Zhou, Ming Kang and Wen Chen
Sensors 2022, 22(3), 823; https://doi.org/10.3390/s22030823 - 21 Jan 2022
Cited by 2 | Viewed by 1802
Abstract
Eavesdroppers can easily intercept the data transmitted in a wireless sensor network (WSN) because of the network’s open properties and constrained resources. Therefore, it is important to ensure data confidentiality in WSN with highly efficient security mechanisms. We proposed a lightweight security transmission method based on information hiding and random data flipping to ensure that the ally fusion center (AFC) can achieve confidential data transmission over insecure open links. First, the sensors’ local measurements are coded into a customized binary string, and then before data transmission, some parts of the string are flipped by the sensors according to the outputs of a pre-deployed pseudo-random function. The AFC can recover the flipped binaries using the same function and extract the measurement hidden in the string, while the enemy fusion center (EFC) cannot distinguish flipped and non-flipped data at all, and it cannot restore the measurement correctly as long as one bit in the string is not correctly recovered. We proved the security and anti-interference of the scheme through both simulations and physical experiments. Furthermore, real power-consumption tests show that the proposed method is more efficient, consuming less power than traditional digital encryption.
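The flipping scheme described above can be illustrated with a shared pseudo-random mask: sensor and AFC derive the same bit mask from a pre-shared key and a counter, the sensor XOR-flips the coded string with it, and the AFC undoes the flip. The SHA-256-based mask below is an illustrative stand-in for the paper's pre-deployed pseudo-random function, not its actual construction:

```python
import hashlib

def prf_mask(key: bytes, counter: int, nbits: int):
    """Derive nbits pseudo-random mask bits shared by sensor and AFC.

    Illustrative PRF: SHA-256 over key || counter (supports nbits <= 256).
    """
    digest = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(nbits)]

def flip(bits, mask):
    """XOR-flip the masked positions; applying the same mask twice restores them."""
    return [b ^ m for b, m in zip(bits, mask)]
```

Without the key, an eavesdropper cannot tell flipped from non-flipped positions; with it, `flip(flip(bits, mask), mask)` recovers the hidden measurement exactly.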

22 pages, 16815 KiB  
Article
Mapping Urban Air Quality from Mobile Sensors Using Spatio-Temporal Geostatistics
by Yacine Mohamed Idir, Olivier Orfila, Vincent Judalet, Benoit Sagot and Patrice Chatellier
Sensors 2021, 21(14), 4717; https://doi.org/10.3390/s21144717 - 09 Jul 2021
Cited by 6 | Viewed by 2780
Abstract
With the advancement of technology and the arrival of miniaturized environmental sensors that offer greater performance, the idea of building mobile network sensing for air quality has quickly emerged to increase our knowledge of air pollution in urban environments. However, with these new techniques arises the difficulty of building mathematical models capable of aggregating all these data sources in order to provide precise mapping of air quality. In this context, we explore spatio-temporal geostatistics methods as a solution for such a problem and evaluate three different methods: Simple Kriging (SK) on residuals, Ordinary Kriging (OK), and Kriging with External Drift (KED). On average, geostatistical models showed a 26.57% improvement in the Root Mean Squared Error (RMSE) compared to the standard Inverse Distance Weighting (IDW) technique in interpolation scenarios (27.94% for KED, 26.05% for OK, and 25.71% for SK). The results showed less significant scores in extrapolation scenarios (a 12.22% decrease in the RMSE for geostatistical models compared to IDW). We conclude that univariable geostatistics is suitable for interpolating this type of data but is less appropriate for extrapolation to non-sampled places since it does not create any information.
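The Inverse Distance Weighting baseline that the kriging models are compared against is simple to state: each prediction is a weighted average of the sampled values, with weights decaying as a power of the distance to each sample. A minimal 2-D sketch (the power parameter of 2 is the common default, not necessarily the paper's setting):

```python
def idw(x, y, samples, power=2.0):
    """Inverse Distance Weighting at (x, y) from (sx, sy, value) samples."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value  # exact at sampled locations
        w = d2 ** (-power / 2.0)  # w = 1 / distance**power
        num += w * value
        den += w
    return num / den
```

Kriging replaces these purely geometric weights with weights derived from a fitted spatio-temporal covariance model, which is what gives it the reported RMSE edge in interpolation while still offering nothing new when extrapolating beyond the sampled region.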

Review


39 pages, 4385 KiB  
Review
Visibility Enhancement and Fog Detection: Solutions Presented in Recent Scientific Papers with Potential for Application to Mobile Systems
by Răzvan-Cătălin Miclea, Vlad-Ilie Ungureanu, Florin-Daniel Sandru and Ioan Silea
Sensors 2021, 21(10), 3370; https://doi.org/10.3390/s21103370 - 12 May 2021
Cited by 11 | Viewed by 10790
Abstract
In mobile systems, fog, rain, snow, haze, and sun glare are natural phenomena that can be very dangerous for drivers. In addition to the visibility problem, the driver must also choose an appropriate driving speed. The main effects of fog are a decrease in contrast and a fading of color. Rain and snow also strongly perturb the driver, while glare caused by the sun or by other traffic participants can be very dangerous even for a short period. In the field of autonomous vehicles, visibility is of the utmost importance. To solve this problem, researchers have proposed a variety of solutions and methods. It is therefore useful to focus on what the scientific literature of the past ten years has presented on these concerns. This synthesis, together with the technological evolution of sensors, communications, and data processing, can be the basis of new ways of approaching these problems. This paper summarizes the methods and systems found and considered relevant, which estimate or even improve visibility in adverse weather conditions. Searching the recent scientific literature for research on mitigating the problems that environmental factors cause for mobile systems, we found that fog is the most dangerous phenomenon. Our focus is therefore on fog, and we present published research on methods based on image processing, optical power measurement, systems of sensors, etc.
