Sensors Fusion for Vehicle Detection and Control

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (31 May 2021) | Viewed by 16442

Special Issue Editor


Prof. Dr. Maria Jesús López Boada
Guest Editor
Mechanical Engineering Department, Institute for Automotive Vehicle Safety (ISVA), Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganés, Madrid, Spain
Interests: vehicle control; vehicle safety; internet of things; sensor fusion; intelligent vehicles; neural networks

Special Issue Information

Dear Colleagues,

Road vehicles are the most prevalent transportation system. Over the last several decades, vehicles have been equipped with systems that prevent accidents and improve comfort and energy consumption. These systems, especially those related to safety, are of even greater interest in autonomous vehicles (AVs). An AV is a vehicle capable of making decisions to move safely without driver supervision. In this regard, obstacle detection and avoidance during navigation has become one of the most challenging problems in the field of AVs. On the other hand, further advances in control systems are needed to guarantee good performance and stability of AVs over a wide range of parameter changes and disturbances. In all these fields, a large number of sensors are required, not only for perceiving and understanding the nearby environment but also for determining the dynamic behavior of the vehicle. Nowadays, sensor fusion plays a key role because it combines the advantages of different sensors to obtain better results. Although great advances have been made in recent years in automotive applications, there are still aspects that need to be addressed.

The topics of interest include, but are not limited to:

  • Vehicle dynamics control
  • Obstacle detection
  • Sensor fusion
  • Learning techniques/deep learning
  • Internet of Things/5G
  • Estimation techniques
  • Autonomous vehicles

Prof. Dr. Maria Jesús López Boada
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Vehicle dynamics control
  • Obstacle detection
  • Sensor fusion
  • Learning techniques/deep learning
  • Internet of Things/5G
  • Estimation techniques
  • Autonomous vehicles

Published Papers (5 papers)

Research

20 pages, 2284 KiB  
Article
Human-Machine Shared Driving Control for Semi-Autonomous Vehicles Using Level of Cooperativeness
by Anh-Tu Nguyen, Jagat Jyoti Rath, Chen Lv, Thierry-Marie Guerra and Jimmy Lauber
Sensors 2021, 21(14), 4647; https://doi.org/10.3390/s21144647 - 07 Jul 2021
Cited by 16 | Viewed by 3456
Abstract
This paper proposes a new haptic shared control concept between the human driver and the automation for lane keeping in semi-autonomous vehicles. Based on the principle of human-machine interaction during lane keeping, the level of cooperativeness for completion of the driving task is introduced. Using the proposed human-machine cooperative status along with the driver workload, the required level of haptic authority is determined according to the driver’s performance characteristics. Then, a time-varying assistance factor is developed to modulate the assistance torque, which is designed from an integrated driver-in-the-loop vehicle model taking into account the yaw-slip dynamics, the steering dynamics, and the human driver dynamics. To deal with the time-varying nature of both the assistance factor and the vehicle speed involved in the driver-in-the-loop vehicle model, a new linear parameter varying control technique is proposed. The predefined specifications of the driver-vehicle system are guaranteed using Lyapunov stability theory. The proposed haptic shared control method is validated under various driving tests conducted with high-fidelity simulations. Extensive performance evaluations are performed to highlight the effectiveness of the new method in terms of driver-automation conflict management.
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)
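A minimal sketch of the shared-control idea described in this abstract is given below. It only illustrates how a time-varying assistance factor could blend driver and automation steering torques; the function names, the weighting law, and the numbers are illustrative assumptions, not the authors' formulation (which is derived from a driver-in-the-loop LPV model with Lyapunov-based guarantees).

```python
import numpy as np

def assistance_factor(cooperativeness: float, workload: float) -> float:
    """Hypothetical time-varying assistance factor lambda(t) in [0, 1].

    High human-machine cooperativeness and low driver workload reduce the
    level of haptic authority granted to the automation; low cooperativeness
    or high workload increases it. The law used in the paper differs.
    """
    raw = 0.5 * (1.0 - cooperativeness) + 0.5 * workload
    return float(np.clip(raw, 0.0, 1.0))

def shared_steering_torque(tau_driver: float, tau_automation: float,
                           cooperativeness: float, workload: float) -> float:
    """Blend driver and automation torques with the assistance factor."""
    lam = assistance_factor(cooperativeness, workload)
    return tau_driver + lam * tau_automation

# Example: a distracted driver (high workload, low cooperativeness)
# receives a larger share of the lane-keeping assistance torque.
print(shared_steering_torque(tau_driver=0.5, tau_automation=2.0,
                             cooperativeness=0.2, workload=0.8))
```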

15 pages, 3505 KiB  
Article
Feasibility of Using a MEMS Microphone Array for Pedestrian Detection in an Autonomous Emergency Braking System
by Alberto Izquierdo, Lara del Val and Juan J. Villacorta
Sensors 2021, 21(12), 4162; https://doi.org/10.3390/s21124162 - 17 Jun 2021
Cited by 5 | Viewed by 2370
Abstract
Pedestrian detection by a car is typically performed using camera, LIDAR, or RADAR-based systems. The first two systems, based on the propagation of light, do not work in foggy or poor visibility environments, and the latter are expensive and the probability associated with their ability to detect people is low. It is necessary to develop systems that are not based on light propagation, with reduced cost and with a high detection probability for pedestrians. This work presents a new sensor that satisfies these three requirements. An active sound system, with a sensor based on a 2D array of MEMS microphones, working in the 14 kHz to 21 kHz band, has been developed. The architecture of the system is based on an FPGA and a multicore processor that allow the system to operate in real time. The algorithms developed are based on a beamformer, range and lane filters, and a CFAR (Constant False Alarm Rate) detector. In this work, tests have been carried out with different people and in different ranges, calculating, in each case and globally, the Detection Probability and the False Alarm Probability of the system. The results obtained verify that the developed system allows the detection and estimation of the position of pedestrians, ensuring that a vehicle travelling at up to 50 km/h can stop and avoid a collision.
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)
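The detection chain in this abstract ends in a CFAR test. The sketch below shows a generic 1D cell-averaging CFAR over a beamformed power profile; the window sizes, the false-alarm rate, and the synthetic data are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def ca_cfar(power: np.ndarray, num_train: int = 16, num_guard: int = 4,
            pfa: float = 1e-3) -> np.ndarray:
    """Generic cell-averaging CFAR over a 1D power profile.

    For each cell under test, the noise level is estimated from `num_train`
    training cells on each side, skipping `num_guard` guard cells, and the
    threshold scale alpha is set from the desired false-alarm probability.
    """
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    n_train_total = 2 * num_train
    # Threshold multiplier for CA-CFAR with exponentially distributed noise.
    alpha = n_train_total * (pfa ** (-1.0 / n_train_total) - 1.0)
    for i in range(num_train + num_guard, n - num_train - num_guard):
        lead = power[i - num_guard - num_train: i - num_guard]
        lag = power[i + num_guard + 1: i + num_guard + 1 + num_train]
        noise = (lead.sum() + lag.sum()) / n_train_total
        detections[i] = power[i] > alpha * noise
    return detections

# Example: a synthetic range profile with one strong echo at cell 60.
profile = np.random.exponential(1.0, 128)
profile[60] += 40.0
print(np.flatnonzero(ca_cfar(profile)))
```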

20 pages, 3580 KiB  
Article
Optimal H∞ Control for Lateral Dynamics of Autonomous Vehicles
by Gianfranco Gagliardi, Marco Lupia, Gianni Cario and Alessandro Casavola
Sensors 2021, 21(12), 4072; https://doi.org/10.3390/s21124072 - 13 Jun 2021
Cited by 4 | Viewed by 2935
Abstract
This paper presents the design and validation of a model-based H∞ vehicle lateral controller for autonomous vehicles in a simulation environment. The controller was designed so that the position and orientation tracking errors are minimized and so that the vehicle is able to follow a trajectory computed in real-time by exploiting proper video-processing and lane-detection algorithms. From a computational point of view, the controller is obtained by solving a suitable LMI optimization problem and ensures that the closed-loop system is robust with respect to variations in the vehicle’s longitudinal speed. In order to show the effectiveness of the proposed control strategy, simulations have been undertaken by taking advantage of a co-simulation environment jointly developed in Matlab/Simulink © and Carsim 8 ©. The simulation activity shows that the proposed control approach allows for good control performance to be achieved.
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)
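As a simplified illustration of the LMI machinery behind H∞ design, the sketch below uses CVXPY to minimize an H∞ norm bound of an arbitrary stable system via the bounded-real lemma. The paper's synthesis LMIs, which additionally produce controller gains and account for longitudinal-speed variations, are more involved; the system matrices here are placeholders.

```python
import numpy as np
import cvxpy as cp

# Arbitrary stable example system  x' = A x + B w,  z = C x + D w
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

n, m = B.shape
p = C.shape[0]

P = cp.Variable((n, n), symmetric=True)
gamma = cp.Variable(nonneg=True)

# Bounded-real lemma: ||G||_inf < gamma iff this LMI is feasible with P > 0.
lmi = cp.bmat([
    [A.T @ P + P @ A, P @ B,              C.T],
    [B.T @ P,         -gamma * np.eye(m), D.T],
    [C,               D,                  -gamma * np.eye(p)],
])
eps = 1e-6
constraints = [P >> eps * np.eye(n), lmi << -eps * np.eye(n + m + p)]

prob = cp.Problem(cp.Minimize(gamma), constraints)
prob.solve(solver=cp.SCS)
print("estimated H-infinity norm bound:", gamma.value)
```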

22 pages, 37616 KiB  
Article
Estimation of Vehicle Attitude, Acceleration, and Angular Velocity Using Convolutional Neural Network and Dual Extended Kalman Filter
by Minseok Ok, Sungsuk Ok and Jahng Hyon Park
Sensors 2021, 21(4), 1282; https://doi.org/10.3390/s21041282 - 11 Feb 2021
Cited by 7 | Viewed by 4433
Abstract
The acceleration of a vehicle is important information about the vehicle state. The vehicle acceleration is measured by an inertial measurement unit (IMU). However, gravity affects the IMU when there is a transition in vehicle attitude; thus, the IMU produces an incorrect signal output. Therefore, vehicle attitude information is essential for obtaining correct acceleration information. This paper proposes a convolutional neural network (CNN) for attitude estimation. Using sequential data of a vehicle’s chassis sensor signals, the roll and pitch angles of a vehicle can be estimated without using a high-cost sensor such as a global positioning system or a six-dimensional IMU. This paper also proposes a dual-extended Kalman filter (DEKF), which can accurately estimate acceleration/angular velocity based on the estimated roll/pitch information. The proposed method is validated by real-car experiment data and CarSim, a vehicle simulator. It accurately estimates the attitude with limited sensors, and the exact acceleration/angular velocity is estimated considering the roll and pitch angles, with a de-noising effect. In addition, the DEKF can improve the modeling accuracy and can estimate the roll and pitch rates.
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)
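The role of the estimated roll and pitch angles can be illustrated with a small gravity-compensation sketch. The ZYX (yaw-pitch-roll) convention, the body axes (x forward, y left, z up), and the sign choices below are assumptions for illustration rather than the paper's exact formulation.

```python
import numpy as np

G = 9.81  # m/s^2

def gravity_reading_in_body_frame(roll: float, pitch: float) -> np.ndarray:
    """At-rest accelerometer reading caused by gravity, expressed in the
    body frame (ZYX convention, x forward, y left, z up), for roll and
    pitch given in radians."""
    return G * np.array([
        -np.sin(pitch),
        np.sin(roll) * np.cos(pitch),
        np.cos(roll) * np.cos(pitch),
    ])

def compensate_acceleration(a_measured: np.ndarray,
                            roll: float, pitch: float) -> np.ndarray:
    """Remove the gravity component from an IMU specific-force reading so
    that only the vehicle's kinematic acceleration remains."""
    return a_measured - gravity_reading_in_body_frame(roll, pitch)

# Example: a stationary vehicle on a 5-degree uphill slope should show
# (approximately) zero kinematic acceleration after compensation.
pitch = np.deg2rad(5.0)
a_imu = gravity_reading_in_body_frame(roll=0.0, pitch=pitch)
print(compensate_acceleration(a_imu, roll=0.0, pitch=pitch))
```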

19 pages, 7569 KiB  
Article
3D Fast Object Detection Based on Discriminant Images and Dynamic Distance Threshold Clustering
by Baifan Chen, Hong Chen, Dian Yuan and Lingli Yu
Sensors 2020, 20(24), 7221; https://doi.org/10.3390/s20247221 - 17 Dec 2020
Cited by 4 | Viewed by 2387
Abstract
The object detection algorithm based on vehicle-mounted lidar is a key component of the perception system on autonomous vehicles. It can provide high-precision and highly robust obstacle information for the safe driving of autonomous vehicles. However, most algorithms are often based on a large amount of point cloud data, which makes real-time detection difficult. To solve this problem, this paper proposes a 3D fast object detection method based on three main steps: First, the ground segmentation by discriminant image (GSDI) method is used to convert point cloud data into discriminant images for ground point segmentation, which avoids computing directly on the point cloud data and improves the efficiency of ground point segmentation. Second, the image detector is used to generate the region of interest of the three-dimensional object, which effectively narrows the search range. Finally, the dynamic distance threshold clustering (DDTC) method is designed for different densities of the point cloud data, which improves the detection of long-distance objects and avoids the over-segmentation phenomenon generated by traditional algorithms. Experiments have shown that this algorithm can meet the real-time requirements of autonomous driving while maintaining high accuracy.
(This article belongs to the Special Issue Sensors Fusion for Vehicle Detection and Control)
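The dynamic-threshold idea can be sketched as Euclidean region growing whose neighbor radius widens with range, reflecting the fact that lidar returns become sparser far from the sensor. The linear threshold law and its coefficients below are illustrative assumptions, not the paper's DDTC parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def ddtc_like_clustering(points: np.ndarray, base: float = 0.3,
                         gain: float = 0.01, min_size: int = 5):
    """Cluster non-ground lidar points with a range-dependent distance
    threshold: points far from the sensor may be farther apart and still
    belong to the same object. Returns a list of index arrays."""
    tree = cKDTree(points)
    ranges = np.linalg.norm(points, axis=1)      # distance to the sensor
    radii = base + gain * ranges                 # per-point neighbor radius
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            idx = queue.pop()
            for nb in tree.query_ball_point(points[idx], r=radii[idx]):
                if nb in unvisited:
                    unvisited.remove(nb)
                    queue.append(nb)
                    cluster.append(nb)
        if len(cluster) >= min_size:
            clusters.append(np.array(cluster))
    return clusters

# Example: two synthetic blobs, one near and one far from the sensor.
near = np.random.normal([5.0, 0.0, 0.0], 0.1, (50, 3))
far = np.random.normal([40.0, 10.0, 0.0], 0.4, (50, 3))
print([len(c) for c in ddtc_like_clustering(np.vstack([near, far]))])
```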