Review

Research Scenarios of Autonomous Vehicles, the Sensors and Measurement Systems Used in Experiments

by Leon Prochowski 1,2, Patryk Szwajkowski 2,3,* and Mateusz Ziubiński 1

1 Institute of Vehicles and Transportation, Military University of Technology (WAT), ul. gen. Sylwestra Kaliskiego 2, 00-908 Warsaw, Poland
2 Łukasiewicz Research Network—Automotive Industry Institute (Łukasiewicz-PIMOT), ul. Jagiellońska 55, 03-301 Warsaw, Poland
3 Doctoral School, Military University of Technology (WAT), ul. gen. Sylwestra Kaliskiego 2, 00-908 Warsaw, Poland
* Author to whom correspondence should be addressed.
Sensors 2022, 22(17), 6586; https://doi.org/10.3390/s22176586
Submission received: 8 August 2022 / Revised: 26 August 2022 / Accepted: 29 August 2022 / Published: 31 August 2022

Abstract:
Automated and autonomous vehicles are in an intensive development phase, one that requires extensive modelling and experimental research. Experimental research into these vehicles is still at an early stage, and there are no standardized findings or recommendations for organizing and creating research scenarios. Creating such scenarios is difficult: a large number of systems must be checked simultaneously, and the vehicles themselves have a very complicated structure. A review of current publications allowed the research scenarios for vehicles and their components, as well as the measurement systems used, to be systematized. These include perception systems, automated responses to threats, and critical situations in the area of road safety. The scenarios analyzed ensure that the planned research tasks can be carried out, including the investigation of systems that enable autonomous driving. Such studies use passenger cars equipped with highly sophisticated sensor systems and localization devices. Perception systems are essential equipment in this research: they provide recognition of the environment, mainly through vision sensors (cameras) and lidars. The research tasks include autonomous driving along a detected road lane on a curvilinear track, with the effectiveness of keeping the vehicle in that lane being assessed. The studies are typically conducted on specialized research tracks, on which stationary or moving obstacles are often placed.

1. Introduction

Currently, there are a large number of publications on autonomous vehicles. Most of these papers, along with their included research and discussions, deal with technological aspects. The dominant topics are vehicle control, avoiding road obstacles, and sending information about the current situation of the vehicle and possible threats. This is happening because these vehicles are in a phase of intensive development. A very important stage of development work is model and experimental research.
The test vehicles, often conventionally referred to as autonomous in many publications, perform strictly planned tasks during research; thus, they operate as automated vehicles. In this paper, the abbreviation AV is used to denote an automated or autonomous vehicle. According to the SAE J3016 standard [1,2], vehicles classified between levels 3 and 5 are considered autonomous (AV).
However, experimental research related to AVs is at an initial stage. This is confirmed by the small number of studies of complete control systems for such cars; for example, environment perception systems and vehicle control are often investigated separately. Research is still at the stage of gathering knowledge about the performance of different systems under different conditions. Research plans are individual, adapted to the current cognitive needs of the research organizer. This situation results from a lack of recommendations and standardized guidelines for the scenarios used in such studies. This is a significant problem, and binding arrangements (e.g., in the EU and USA) for the organization of safe AV testing in road traffic are sorely needed [3]. In particular, the requirements for the scenarios used in such studies have not been specified. Difficulties with these scenarios also result from the fact that planned research first takes into account local, readily available technical resources and equipment. In autonomous vehicles, an infinite number of behavior variants arise as a result of autonomous decision-making by the control system; hence, there may be a very large number of research scenarios. Particular difficulties arise in organizing research on critical traffic situations [4,5].
The main difficulties that are faced during the creation of research scenarios with a planned course (that is, in automatic vehicle control mode) can be summarized as follows:
- the number of systems to be checked at the same time is significant and the vehicles have a very complex structure;
- the number of places where conducting research is possible is limited;
- the trajectory of the tested vehicle may deviate significantly from the planned trajectory;
- the legal regulation of AVs’ participation in traffic is inconsistent and unclear in different countries.
A complication in the organization of the tests also arises from the fact that, in traffic, all the vehicle’s systems will be operating simultaneously or in a sequence that is difficult to plan [6].
The development of scenarios for experimental research related to road traffic can be based on the results of previous studies (available in significant numbers), namely:
- the results of model tests and computer simulations in which the behavior of the vehicle in virtual space can be safely observed (technologies: MIL [7], SIL [8]);
- the results of analysis focusing on cooperation between computer models and the systems of a real vehicle (technologies: HIL [8]);
- research results of complete vehicle actuators in combination with road tests and the simulation environment (technologies: VIL [7,9]);
- results of the validation of the human–machine interface through the interaction between the driver and the autonomous vehicle (technologies: DIL [7]);
- research results of the vehicle at the proving grounds (closed area, dedicated track).
There is a general consensus that the scenarios of such research should result from the observation of road traffic and drivers’ behavior [10,11,12].
Currently, there are significantly more simulation studies than experimental studies. In [13,14], the benefits of implementing mixed/hybrid research are described. In such a method, signals for perception systems during experimental research are supplemented by signals generated from computer simulations. This allows for better repeatability of test conditions and realization of more complex scenarios. On the other hand, in [15], the concept of so-called parallel research was presented, in which a real autonomous car reacts to signals from movies shown on screens placed around it. This ensures the research has high repeatability due to the possibility of reproducing the same scenarios.
The above-described research and all other experimental research related to AVs has been conducted according to specific scenarios. The experimental research scenario requires a lot of information to be established:
- aim of the research;
- test vehicle (platform) data and the method of their preparation and configuration;
- the use of the environment (surroundings, test track) in which the research will be conducted;
- a set of additional (complementary) elements and objects required to be included in the research;
- initial conditions and the planned course of the study (test sequence);
- measuring equipment, its tasks, and method of use;
- the quantities necessary to measure, the level of accuracy of the measurements, and the method of evaluation of the results;
- safety procedures.
Similar findings are included in scenarios in current normative research regulations, e.g., SAE [16], Euro NCAP [17], and EU [18].
The aim of the article is to present the main information about AV research scenarios, and the study tasks undertaken in this area, in an orderly manner. The collected material aims to systematize the information that will facilitate the planning, preparation, and execution of experimental AV research. The developed material will allow test preparation time to be shortened, the repetition of already performed research to be avoided, and the number of wrong decisions made at this stage to be reduced (including selection of the research aim, measuring and recording equipment, localization devices, etc.). The systematization of available information on normative and non-normative scenarios will facilitate their analysis and the assessment of their applicability when planning AV research.
The material gathered in the article will help identify the following:
- research tasks that currently dominate;
- elements to be included in the AV research scenario;
- which normative documents are useful for organizing research;
- what measuring and recording equipment is necessary.
This paper presents the currently used normative scenarios of experimental AV research. Scenarios used by various research centers, in which the plan and course are adapted to the current needs of researchers (nonnormative scenarios), are also characterized. Attention is also paid to the technological aspects of the research scenarios, as well as the measurement systems and sensors used in the research.
On the basis of the literature analysis, a hierarchical scheme of information necessary for the creation of AV research scenarios has been built. The layout of this scheme is shown in Figure 1. The article presents individual elements of this set.
There is a lack of publications focused on the preparation and organization of AV research. In publications containing the results of experimental studies, the authors mainly focus on the results themselves, and information about their preparation and scenario is limited. This additionally justifies the need to gather and systematize knowledge in this area. During the preparation of the article, our own database of scientific publications about AV research was used. The database was created over many years of research carried out at the Łukasiewicz Research Network—Automotive Industry Institute in Warsaw on the improvement of AVs.
The collection of research material for the article was based on scientific publication databases such as Springer, Science Direct, MDPI, Taylor & Francis, IEEE Xplore, ResearchGate, and SAE International. Moreover, online resources from many organizations were analyzed separately, including Euro NCAP, NHTSA, ISO, EU, UNECE, and IIHS. Tagging topics of interest and research areas also produced a set of keywords, which was the starting point for the publication search in the databases. Selected articles were categorized into three areas (AV research scenarios, technical aspects of AV research, and sensors and measuring equipment). These areas form the main content of the article.

2. Objectives and Scope of AV Research

Experimental AV research considers many multidisciplinary issues related to the road traffic control process. One of the most frequently studied issues is the problem of object detection by the environment perception system. An example of such research conducted by the Łukasiewicz Research Network—Automotive Industry Institute in Warsaw is shown in Figure 2. Researchers analyze the time it takes for AVs to detect a suddenly appearing obstacle (red car), which emerges from behind a car leaving the road intersection (blue car). Research was carried out for various driving velocities, obstacle shapes and colors, and varying illumination (day, night). Many authors present publicly available and proprietary models for object detection. Obstacle classes such as cars [19], pedestrians [20,21], bicyclists/motorcyclists [20], traffic signs [22], and others [23,24] are recognized. These models are based on vision sensors in the form of RGB cameras, from which fragments of the surrounding image are analyzed by neural network models seeking the characteristic features of a given object. Perception systems based on several devices are increasingly being considered, which makes it possible to eliminate the disadvantages of individual sensors. The most common way to combine information is sensor fusion of the lidar and camera signals [23,24,25]. The detected objects make it possible to generate the output signal from the perception system, which is the basis for further decision-making processes in the AV control system.
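As an illustration of the camera–lidar fusion mentioned above, a common first step is projecting lidar returns into the camera image so that point-cloud measurements can be associated with detected image regions. The sketch below is a minimal pinhole-model projection; the intrinsic matrix K and the extrinsics R, t are placeholder calibration values, not parameters from any cited study.

```python
import numpy as np

def project_lidar_to_image(points_xyz, K, R, t):
    """Project 3D lidar points (N x 3, lidar frame) into pixel coordinates.

    K    : 3x3 camera intrinsic matrix
    R, t : rotation (3x3) and translation (3,) from lidar to camera frame
    Returns (N x 2) pixel coordinates and a mask of points in front
    of the camera (z > 0)."""
    cam = (R @ points_xyz.T).T + t        # lidar frame -> camera frame
    in_front = cam[:, 2] > 0              # keep points ahead of the lens
    uvw = (K @ cam.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]         # perspective division
    return uv, in_front

# Toy example: identity extrinsics, simple intrinsics
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 10.0],         # straight ahead, 10 m away
                [1.0, 0.0, 10.0]])        # 1 m to the side
uv, mask = project_lidar_to_image(pts, K, np.eye(3), np.zeros(3))
```

Once projected, each lidar point can be matched against 2D detections (e.g., bounding boxes) to attach depth to camera-detected objects.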
A large number of papers published in this area discuss the problem of road edge and lane detection by AVs [26,27,28]. For this task, the image from the camera of the perception system is most commonly used. Usually, the edges of the road, the lines limiting the road lane, or free spaces in front of the vehicle are marked. Figure 3 presents a part of the research conducted by the authors: the result of lane and road edge detection by the AV’s perception system while driving on a three-lane road. The road edges determined by the perception system are marked in red and yellow (left and right edge, respectively), while green and blue indicate the edges of the occupied lane (left and right edge, respectively). Even though parts of the lines are obscured by other vehicles, the perception system estimates their location in the missing areas.
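The behavior described above, in which the perception system estimates lane-line positions in areas obscured by other vehicles, is often achieved by fitting a low-order polynomial to the visible marking pixels and extrapolating into the occluded rows. A minimal numpy sketch on synthetic data (function name and values are illustrative, not from any cited system):

```python
import numpy as np

def extrapolate_lane_edge(ys, xs, query_ys, degree=2):
    """Fit a polynomial x = f(y) to detected lane-marking pixels and
    evaluate it at image rows where the marking is occluded."""
    coeffs = np.polyfit(ys, xs, degree)
    return np.polyval(coeffs, query_ys)

# Detected pixels of a straight lane edge, with rows 40-60 occluded
ys = np.array([0.0, 10.0, 20.0, 30.0, 70.0, 80.0])
xs = 0.5 * ys + 100.0                    # synthetic straight edge
missing_rows = np.array([40.0, 50.0, 60.0])
est = extrapolate_lane_edge(ys, xs, missing_rows)
```

A second-order fit also handles gently curving lanes; real systems typically add outlier rejection before the fit.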
Another research problem is avoiding obstacles on the road. Planning an avoidance maneuver requires information from the perception system about the position and dimensions of the obstacle. There is a separate need to define the area in which the AV can continue driving after passing the obstacle (detected road lane, free spaces, road edges). Based on this, the decision-making system generates a trajectory for further driving (path planning). Usually, these are trajectories that safely avoid obstacles in road traffic [29].
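A trajectory that safely avoids an obstacle is often generated as a smooth lateral shift whose slope and curvature vanish at both ends, e.g., a quintic polynomial. The sketch below is one such generic textbook formulation, not the specific planner of any cited work; S (maneuver length) and D (lateral offset) are assumed parameters.

```python
def lateral_shift(s, S, D):
    """Lateral offset y(s) of a quintic avoidance path: shifts the
    vehicle by D metres over a longitudinal distance S, with zero
    lateral slope and curvature at both ends."""
    r = s / S
    return D * (10.0 * r**3 - 15.0 * r**4 + 6.0 * r**5)

# Example: a 3 m lateral shift completed over a 30 m avoidance distance
samples = [lateral_shift(s, 30.0, 3.0) for s in (0.0, 15.0, 30.0)]
```

The zero end conditions keep the planned lateral acceleration continuous, which matters at the higher driving velocities discussed below.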
Maintaining the planned path strongly depends on the curvature of the path and driving velocity. These factors influence the lateral force values (centrifugal force), which causes sideslip and lateral skidding of the tires. The result is a deviation of the real trajectory from the planned path. This deviation can be measured as shown in Figure 4. It is calculated as:
Δy_MA = y_M(x) − y_CA(x);  Δψ_MA = ψ_M − ψ_CA,
where the individual parameters are the geometric quantities indicated in Figure 4.
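The deviations defined above can be computed directly once the planned path and the recorded trajectory are sampled at common stations x. A minimal sketch (function names are hypothetical; the heading wrap is an added assumption, not stated in the text):

```python
import numpy as np

def trajectory_deviation(y_M, y_CA):
    """Lateral deviation dy_MA(x) = y_M(x) - y_CA(x), with both paths
    sampled at the same longitudinal stations x."""
    return np.asarray(y_M) - np.asarray(y_CA)

def heading_deviation(psi_M, psi_CA):
    """Angular deviation dpsi_MA = psi_M - psi_CA, wrapped to
    [-pi, pi) so that near-opposite headings compare correctly."""
    d = psi_M - psi_CA
    return (d + np.pi) % (2.0 * np.pi) - np.pi
```

In practice, the measured trajectory would first be resampled (interpolated) onto the stations of the planned path before differencing.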
This is exemplified by the planned path y_M(x) and the car driving trajectory y_CA(x) in Figure 5a. The planned path is the result of the control system’s calculation based on information from the perception system. The driving trajectory is the actual trace of the vehicle’s movement (the car’s response to the control process) over the obstacle avoidance distance, which in the drawing is 30 m long. Figure 5b,c show examples of deviations of the driving trajectory from the planned path for several driving velocities during a suddenly appearing obstacle avoidance maneuver. These are results of simulations that the authors partially presented in [4]. There are visible differences between the planned path and the actual AV trajectory, and these differences increase with driving velocity.
The figures show the influence of driving velocity on the deviation of the driving trajectory from the planned path. The lateral (centrifugal) forces increase with driving velocity on a curvilinear path, which causes sideslip and lateral skidding of the tires. As a result, an increase in the lateral deviation Δy_MA of the driving trajectory from the planned path is observed, together with an increased angular deviation Δψ_MA of the longitudinal vehicle axis. This issue is one of the most important problems in experimental AV research.
AV’s contribution to road traffic is based on many activities. These include:
- recognition of the road situation;
- planning a collision-free path;
- current assessment of the real trajectory in relation to the planned path and making the necessary corrections.
Path planning is based on:
- navigation and digital maps of the driving area;
- location of the vehicle in relation to the road infrastructure;
- environment perception to recognize the situation around the vehicle.
The problem of planning and maintaining a planned path is often the aim of experimental research or is part of other research in which the planned path is an indirect factor.
The primary task during AV movement is performed by the perception system. The result should be at least the detection of the road lane edge or its centerline and an estimate of the relative position of the car in that lane. This is the dominant problem of experimental research in the aspect described here.

3. Experimental Research Scenarios

3.1. Scope and Features of Typical AV Research

Experimental AV research scenarios found in the literature have a certain set of features. These often depend on the aim of the research. The main features of the AV research scenarios, in terms of self-driving and the effectiveness of lane-keeping by the vehicle, are summarized in Table 1. These are automated driving scenarios, i.e., following a planned path or a preset road lane.
Table 2 shows examples of research scenarios that aim to analyze the behavior of a vehicle in urban areas, namely:
- effectiveness of road line (lane boundary) detection;
- the planning of a safe path for avoiding a slower moving vehicle.
The next scenario concerns research into avoiding suddenly appearing obstacles (Table 3); thus, this research differs from the previously described scenarios.

3.2. Normative and Unified AV Test Procedures

Normative AV test scenarios and procedures are used to test finished products and automation systems, e.g., for homologation. An example of a type of system for which normative testing is carried out is ADAS. Driver assistance systems include FCW, AEB, LKA, and ESP; ADAS solutions are at most level 2 according to SAE [2,35,36]. They form the basis of higher-level systems. Such test scenarios have a number of features that should also be found in the research scenarios considered in this paper. Therefore, the results of a review of normative AV testing scenarios and procedures are presented. These are defined by the following organizations:
- EU [18];
- UNECE [37,38,39];
- ISO [40,41,42];
- Euro NCAP [17,22,43];
- NHTSA [44,45,46];
- SAE [16,47];
- IIHS [48,49].
Normative scenarios allow for repeated testing of entire vehicles and their systems. Unified procedures, equipment, and methods of evaluation of test results make it possible to compare the various solutions used in AVs. This makes it easier to decide on appropriate directions of development for active safety systems in vehicles. A structured overview of normative testing scenarios is presented in Table 4.
Normative test scenarios are focused on the testing of finished systems in AVs. These are most often certification, homologation, and ranking tests (these tests are not of a cognitive nature). The analysis of normative test scenarios shows that the largest part of studies is currently carried out for driver warning, emergency braking, and lane keeping systems. The effectiveness of these systems is assessed in terms of vehicle response to the detected threat (only the effect of the system’s reaction is assessed). This may include the threat signals for the driver or the effectiveness of an automatic defensive maneuver. Normative scenarios are mostly carried out on test tracks/proving grounds with the necessary infrastructure. Selected scenarios enable tests to be conducted on public roads. In these cases, critical situations are ignored and AV control is not implemented. The analyzed normative scenarios indicate that AV tests are currently focused on critical situations in road traffic. This is the basis for building nonnormative research scenarios. These nonnormative scenarios should focus on the cognitive aspects of autonomous vehicle operation. The reasons for the AV’s reaction to the threat should also be assessed (e.g., inference about the behavior of the perception or control system).

3.3. Nonnormative Research Scenarios

Nonnormative test scenarios are created to conduct experimental research related to the development and analysis of the performance of AV systems. This is therefore scientific research with a different basis than the normative test scenarios described in the previous section: these scenarios are focused on the cognitive aspect of experimental research. Such scenarios are also needed during component/system research, e.g., object detection by the perception system [67,68,69], path planning systems [19,33], and road line/edge detection [26,27,70]. The analyzed nonnormative AV test scenarios were organized and grouped based on the scope and aim of the conducted experimental research. These scenarios take into account:
- AV path planning [33,34], where the goal is usually to develop a system for determining a collision-free path or smoothly avoiding slower/stationary objects;
- following the preceding vehicle, e.g., as part of movement within two circles [19];
- road line detection and traffic lane detection [26,27,70], where the aim is to detect an area where the driving process can be continued by the AV;
- AV lane keeping:
  - driving at high velocities on highways [30,32];
  - driving on a racing track on wet and dry surfaces with different strategies of AV path planning [31];
- emergency braking before an obstacle [34,71,72], where the aim is to minimize the risk of collision;
- avoiding a suddenly appearing obstacle in a critical situation where it is not possible to stop the AV [34,72];
- 2D object detection, localization, and tracking:
  - car detection [70,73,74] and distance estimation to other vehicles [27,75];
  - pedestrian detection [20,21,69];
  - bicyclist/motorcyclist detection [20,73,74];
- 3D object detection on the road [72,76] using lidar and sensor fusion (e.g., lidar and camera), where the aim is to obtain information about the obstacle in the spatial coordinate system:
  - car detection [19,75,77];
  - pedestrian detection [23,24,78];
  - bicyclist/motorcyclist detection [23,25,73];
  - cone detection [67];
  - detection of walls, trees, bushes, and poles [78];
  - detection and tracking of unknown objects [79].
The analyzed nonnormative research scenarios indicate that single AV systems (components) are currently the main direction of development within experimental research. Few publications discuss scenarios where a complete AV system is analyzed in terms of its control. Some focus on critical situations, and these are conducted on tracks. Algorithms such as lane keeping, braking, or obstacle avoidance are investigated. The number of these publications is growing, but it is still small in relation to research on single AV systems (components). This may be influenced by the cost of conducting research on complete AV systems, which is also characterized by complex preparation of the research scenario and a very large number of variants. The issue of perception systems focused on object detection and line/edge detection on the road is widely researched, and a significant number of research scenarios for these systems are evident. They are often carried out in road traffic because they do not reduce the level of safety for other road users; these studies are focused on calm driving without the risk of critical situations.

4. Technical Aspects of AV Research

The organization of research according to specific scenarios requires meeting the technical requirements contained therein. Technical aspects in the research scenarios include the characteristics of the research environment and the characteristics and parameters of the vehicle. The relevant issues in this area will be presented below.

4.1. Research Vehicles

Research according to the described scenarios is usually carried out with the use of passenger cars [31]. The movement of the vehicle during the research is automated or controlled by the driver. Depending on the subject and scope of the research, the following are also used:
- buses [79];
- trucks [67];
- microcars (single person) [34];
- go-karts [32];
- research platforms (autonomous system carriers) or physical models of the car at a reduced scale [80,81].
The use of reduced-scale car models allows for significant reductions in research costs, the space required for their realization, as well as the risk of possible damages in case of a defective realization of the scenario. It should be noted, however, that conducting research on physical models may not provide complete information on the dynamics of real car motion, and the problem of scaling vehicles is a complex and problematic issue [82,83,84].

4.2. Research Site

Many countries still do not allow the testing of autonomous vehicles on public roads. Therefore, in addition to limited road fragments, special places are prepared for the realization of experimental research of autonomous cars, such as:
- the proving grounds for testing and validation of AVs in Zalaegerszeg (Hungary) and Immendingen (Germany) [85,86];
- proving grounds such as Mcity, Almono (Uber ATG), and K-City [13,87,88];
- proving grounds containing an oval track (such as KATRI—Korea Automobile Testing and Research Institute) [30];
- tracks used in car racing (Monza in Italy) [31];
- fragments of road infrastructure excluded from public traffic which have been specially selected for conducting research according to specific scenarios [89];
- test areas with an unstructured environment [90,91], which sometimes lack road infrastructure elements such as vertical and horizontal signs.
The number of public roads where research is allowed is small, but has been increasing recently [92].

4.3. Infrastructure Elements and Other Objects

The realization of experimental research often requires the use of additional objects. Their presence within the range of the car’s perception system is the basis for the operation of the control system. In road traffic, these include other vehicles, pedestrians, cyclists, or fixed obstacles. If the course of the research is not related to the realization of critical maneuvers, the scenarios depend on the state of road traffic. Research results according to such scenarios are available in [25,67,70].
Due to limitations surrounding the possibility of conducting research on public roads, it is necessary to conduct research in a closed area. In such an area, real objects (e.g., other road users) are replaced with various types of dummies, decorations, and physical models. Table 5 summarizes examples of artificial objects used in automated vehicles test scenarios, e.g., in the role of obstacles in their planned path.
Figure 6 shows the objects (cf. Table 3) used in AV experimental research. In [34], boxes and cardboard boxes were used to mark the road lane; their arrangement around the vehicle was the basis for controlling AV movement (Figure 6a). Figure 6b shows a soft passenger car target mounted on a moving platform. This target was used in scenarios of avoiding a suddenly appearing obstacle [17,28]; for example, the car model, moving on a collision course with the AV, forced a change of the AV’s driving path and trajectory.
Dummies of pedestrians, children, and bicyclists are used in a similar role (Figure 6c [43] and Figure 6d [56]). Figure 6d also shows an example of an obstacle in the AV path, built in the form of a soft wall covered with a reflective material to facilitate detection by the vehicle perception system [56].
Conducting experimental AV research requires the provision of appropriate facilities and technical conditions. The AV system is located on the research vehicle, which is usually a passenger car. However, there are test scenarios where other vehicles, go-karts, or reduced-scale vehicles are used. Due to the limited possibility of conducting research in road traffic, most AV research is carried out on test tracks and proving grounds. They are usually equipped with road infrastructure to replicate actual road conditions. It is necessary to use dummies, models, and mannequins of other road users, including pedestrians, children, bicyclists, or other vehicles. Their role in research varies, but usually their presence in the environment is the basis for AV control.

5. Measuring Systems, Sensors, and Measurement Ranges

The considered AV research scenarios describe the methods of testing individual systems of the autonomy system, e.g., control during uninterrupted movement (lane keeping on highways) and during critical situations (active safety systems). This type of experimental research requires many sensors. These sensors can be divided into sensors for the AV’s perception and control systems, as well as devices recording the results of research (measuring equipment).

5.1. Sensors Used for Perception and Control Systems

AV perception systems are an integral part of any control system, regardless of their level of sophistication. Single sensors are used in simple systems [27,44,55]. Many different sensors are used for more advanced solutions, which receive signals that complement each other and provide complementary information [24,25,79]. Sensors in AV systems are devices that generate and transmit information about the traffic environment for further analysis (e.g., information about road infrastructure and locations of other objects/obstacles). They are also sensors for measuring the vehicle motion parameters necessary for driving planning and control (the AV’s position, vehicle velocity, acceleration, angular velocities, etc.).
The sensors of the perception systems are mainly vision sensors (cameras) recording RGB images [31,32,70], radars using radio waves [55,57,71], and lidars based on laser beams [24,25,33]. Most solutions based on vision sensors use single-lens cameras that provide a 2D image for the road situation analysis system [24,32,72]. There are also vehicles and systems that use stereo cameras [31,67,74]. These sensors provide an image from two lenses, and this creates an image with depth. This solution enables 3D observation of the surroundings. Both types of cameras are relatively low-cost, which allows them to be widely used [20]. More complex solutions use thermal imaging cameras to observe the AV’s surroundings [20,95]. These sensors provide information about the temperature of the surroundings. This makes it possible to detect objects and obstacles based on the temperature gradient. The disadvantage of these sensors is their high price, low resolution, and slower image refresh rate compared to RGB cameras [20]. A summary of the advantages and disadvantages of sensors in AV perception systems is presented in Table 6.
Radars are commonly used sensors which provide information about obstacles and vehicle movement parameters. Radars are used in emergency braking systems [55], active cruise control, and for object detection in a vehicle’s blind spots [44]. The advantage of this type of sensor is its operation at high driving velocities, especially over long ranges: the maximum range of a radar can exceed 150 m [72] (compared with a maximum of 25 m for the Stereolabs ZED camera [31,96]). The disadvantage of radars compared to cameras is the small amount of information acquired about the spatial dimensions of objects. Radars generate a scan of the environment in the form of a point cloud based on the reflection of a radio wave; such a scan contains a small number of points, which makes it difficult to recognize objects [20]. This disadvantage is eliminated by lidar sensors (laser sensors), which are characterized by a large number of layers that scan the area around the vehicle. Lidar sensors generate a scan of the environment in the form of a point cloud, which enables objects and infrastructure elements to be detected and classified. The collected information is similar to the depth image of a stereo camera, but with a much greater scanning range. Lidars are characterized by high accuracy of object localization (e.g., 3 cm, the typical measurement accuracy of the Velodyne VLP-16 lidar). The disadvantage of lidars is their very high price (especially for sensors with a large number of scanning layers), which in many solutions prevents their widespread use [20].
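Lidar point clouds of the kind described here are usually pre-filtered before object detection, e.g., by discarding returns beyond the sensor's reliable range and returns from the road surface. A minimal numpy sketch (the thresholds and function name are illustrative, not values from the cited sensors):

```python
import numpy as np

def filter_cloud(cloud, max_range=150.0, ground_z=-1.5):
    """Keep lidar returns closer than max_range metres and above an
    assumed ground-plane height ground_z (sensor frame, z up)."""
    dist = np.linalg.norm(cloud[:, :3], axis=1)
    keep = (dist < max_range) & (cloud[:, 2] > ground_z)
    return cloud[keep]

cloud = np.array([[1.0, 0.0, 0.0],     # nearby return, kept
                  [200.0, 0.0, 0.0],   # beyond range, dropped
                  [3.0, 4.0, -2.0]])   # below ground plane, dropped
kept = filter_cloud(cloud)
```

Real pipelines replace the fixed ground threshold with plane fitting (e.g., RANSAC), but the range gate works the same way.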
AVs are equipped with additional sensors for control systems, namely:
GPS [19,33,74];
GNSS RTK [31,75,95];
IMU [19,34,95].
The GPS/GNSS system enables localization of the vehicle's position using a satellite signal. It is a very common sensor, usually used for global path planning of the AV. The disadvantage is the low accuracy of the localization, usually 1–10 m. An extension of this system complements GPS/GNSS with RTK, which allows precise determination of the vehicle's position in real time. In addition to the GPS receiver in the car, this kit uses an additional signal from a stationary reference station, which also receives the GPS signal. The reference station sends position-correction data to the receiver in the car, which increases localization accuracy to approximately 4 cm.
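The correction mechanism can be sketched as follows. This is a simplified differential scheme with made-up coordinates, not the carrier-phase processing that real RTK receivers perform to reach the ~4 cm accuracy mentioned above.

```python
def differential_correction(ref_true, ref_measured, rover_measured):
    """Apply a reference-station correction to a rover fix (local ENU metres).

    The station's surveyed position is known, so the error in its own GNSS
    fix approximates the error common to a nearby rover (simplified sketch).
    """
    correction = tuple(t - m for t, m in zip(ref_true, ref_measured))
    return tuple(r + c for r, c in zip(rover_measured, correction))

# Station surveyed at (0, 0) m, but its GNSS fix reads (2.1, -1.4) m;
# the same error is assumed to affect the rover's raw fix:
fix = differential_correction((0.0, 0.0), (2.1, -1.4), (102.1, 48.6))
```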
The IMU sensor is a device that integrates an accelerometer, gyroscope, and magnetometer. This sensor enables the determination of vehicle acceleration, angular position, and angular velocity. IMU is very often found as a supplement to GPS and GNSS RTK systems.
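A minimal sketch (illustrative only, with assumed sample values) of how a control system turns IMU angular velocity samples into angular position is direct numerical integration; the drift such integration accumulates is one reason the IMU is paired with GPS and GNSS RTK:

```python
def integrate_yaw(yaw_deg, yaw_rate_dps, dt_s):
    """One Euler integration step: update yaw angle from the gyroscope yaw rate."""
    return yaw_deg + yaw_rate_dps * dt_s

# A 100 Hz IMU stream reporting a constant 5 deg/s yaw rate for 2 s
# accumulates 10 deg of heading change:
yaw = 0.0
for _ in range(200):
    yaw = integrate_yaw(yaw, 5.0, 0.01)
```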

5.2. Sensors in Research Scenarios

5.2.1. Cameras

In experimental research, AV cameras are used to detect lines on the road [27] or during lane keeping [31,32,79]. Their widest area of application is object detection on roads. There are research scenarios that take into account 2D detection of objects and obstacles [20,70,75]. There are separate scenarios for the detection and localization of objects in 3D space, based on the sensory fusion of several perception sensors (cameras with lidars or radars) [24,67,77]. Cameras are used in emergency braking systems and in systems warning about an obstacle in front of the vehicle (e.g., another vehicle, pedestrian, or bicyclist) [51,56,72]. They are also used for monitoring and warning about objects in the blind spot of the vehicle [44]. Cameras from different manufacturers are used in AV research. Camera properties are usually characterized by a set of parameters: H, V, R, f, and range.
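The H/V and R parameters jointly determine how finely a camera resolves the scene; a rough sketch of the relation (small-angle approximation, with assumed example values):

```python
def deg_per_pixel(fov_deg, pixels):
    """Approximate angular resolution along one camera axis."""
    return fov_deg / pixels

# Example (assumed values): H of 90 deg across 1920 horizontal pixels
# gives roughly 0.047 deg per pixel:
res = deg_per_pixel(90.0, 1920)
```

At a given distance, this angular resolution fixes the smallest object feature the camera can distinguish, which is why field of view and resolution are traded off against each other.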
Table 7 presents examples of the use of cameras in AV research. The research scope of the selected research scenarios has been summarized and the sensor models with selected parameters have been specified.

5.2.2. Radars

The signal from the radar can be used to calculate the object’s motion parameters (e.g., position, velocity) [57,71,72]. Radars are also used in the sensory fusion of several sensors:
radar–camera–GPS fusion [72];
radar–camera fusion [51].
Long-range radars are mainly used for object detection at high driving velocities, while short-range radars are used for precise maneuvers at low velocities. Radars are used in braking and obstacle avoidance scenarios [71], 3D object detection [72], and research into emergency braking systems and systems warning about obstacles in front of AVs (cars [57], pedestrians [50], bicyclists [56]). Radars from various manufacturers are used in AV research. Their properties are usually characterized by a set of parameters: H, f, and range.
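The velocity information radars provide comes from the Doppler effect; a sketch of the standard relation, where the 77 GHz automotive carrier and the Doppler shift value are assumptions for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz, carrier_hz):
    """Radial (closing) velocity of a target from the measured Doppler shift."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 5.14 kHz Doppler shift on a 77 GHz carrier corresponds to a closing
# speed of about 10 m/s (roughly 36 km/h):
v = radial_velocity(5140.0, 77e9)
```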
Table 8 presents information about the radars used in AV research according to the analyzed scenarios. The research scope of selected research scenarios has been summarized and the sensor models with selected parameters have been specified.

5.2.3. Lidars

Lidars create an image of the environment in the form of a point cloud [25,44]. Based on the arrangement of these points, a map of the surroundings can be built [33], road edges can be detected [34], or other objects on the road can be identified [76]. Lidars are the most commonly used sensor for sensory fusion with a camera [17,50,72]. The properties of lidars can be characterized by parameters: H, V, R, f, range, and layers.
Lidar sensors with a wide horizontal field of view (e.g., H: 360 deg) and a large number of scanning layers (e.g., Layers: 64) have high scanning resolution with a wide vertical measurement range. Reducing the number of scanning layers while maintaining the vertical measurement range limits the resolution of the sensor (reducing the information collected about objects). Maintaining high resolution with a lower number of scanning layers can adversely affect the observation of tall objects. Lidars with a narrower horizontal scope of observation are also used in AV research (e.g., H: 90 deg). Sensors with a 360 deg field of view allow the entire environment surrounding the AV to be scanned with a single device (usually mounted on the roof of the vehicle). The disadvantage of this solution is the lack of information about obstacles in close proximity to the AV due to the insufficient vertical measuring range. The use of sensors with a smaller horizontal measuring range requires the installation of several devices around the vehicle to observe the entire environment. This increases the vertical measuring range in areas close to the vehicle; the disadvantage is the high cost of many devices. Increasing the number of scanning layers, the resolution, and the horizontal measurement range of the sensors increases the size of the collected files. This may have an adverse effect on the time taken by the AV's perception system to detect objects.
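The trade-off between layer count and vertical coverage can be quantified with a simple sketch (assuming evenly spaced layers, which not all sensors use; the field-of-view values are illustrative):

```python
def vertical_resolution_deg(vertical_fov_deg, layers):
    """Angular spacing between adjacent scanning layers (even spacing assumed)."""
    return vertical_fov_deg / (layers - 1)

# Over the same 30 deg vertical field of view, 16 layers are spaced 2 deg
# apart, while 64 layers shrink the spacing to under 0.5 deg:
r16 = vertical_resolution_deg(30.0, 16)
r64 = vertical_resolution_deg(30.0, 64)
```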
Table 9 summarizes the lidars used in experimental AV research for different scenarios. The research scope of selected research scenarios has been summarized and the sensor models with selected parameters have been specified.

5.2.4. Other Sensors

Table 10 summarizes the types and models of sensors other than those described in Table 7, Table 8 and Table 9. These are sensors and devices used in the AV localization and control process. In [19,33], the GPS sensor is used in path planning and object detection scenarios. The Swift Navigation GNSS RTK sensor has a similar application in [31,75], where it is used in lane keeping, localization, and object tracking scenarios. In [74], a NovAtel SPAN-CPT GPS/INS device was used in object detection and tracking scenarios (e.g., pedestrians, cars, bicyclists). The GPS/INS sensor is used in sensory fusion with the lidar to improve the accuracy of the AV's position in space relative to the detected 3D obstacle. The IMU sensor for measuring acceleration and angular velocity was used in [33,34], where scenarios related to path planning, braking, and obstacle avoidance were considered. This type of sensor is useful in scenarios where the influence of AV dynamics on the control process is analyzed.
The analysis of the sensors used in the perception and control systems shows that the most commonly used solutions in AVs are based on cameras (2D detection) and lidar sensors (3D detection). Many scenarios use sensory fusion of these two sensors. This eliminates the disadvantages of the individual sensors and extends the measurement capabilities of the AV's perception system. Lidars that fully scan the environment around the vehicle are increasingly used, which enables the use of a single sensor on the vehicle. A trend toward sensors with a larger number of scanning layers is also visible in AV research. In experimental research, radars are mainly used in normative scenarios as the basis for ADAS solutions (e.g., AEB). Additionally, in ADAS and in nonnormative scenarios, cameras are used to test LKA, LSS, or perception systems which detect lines or road signs. There is a growing number of research results which take into account sensors such as GPS and GNSS RTK, enabling precise determination of the AV's position (e.g., in relation to an obstacle). There is increasing research into the impact of AV dynamics and kinematics on its control. For this purpose, IMU sensors are increasingly being used to measure quantities such as the acceleration or angular velocity of the vehicle.
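Camera–lidar fusion of the kind summarized above typically relies on projecting lidar points into the camera image; a minimal pinhole-model sketch with illustrative intrinsic parameters (the extrinsic calibration between the two sensors is omitted here):

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates onto the image plane (pinhole model)."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera: not visible
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

# A lidar point 10 m ahead and 1 m to the right, with assumed intrinsics
# (focal lengths 800 px, principal point at the centre of a 1280 x 720 image):
pixel = project_to_image((1.0, 0.0, 10.0), 800.0, 800.0, 640.0, 360.0)
```

Once a lidar point is associated with a pixel in this way, the camera's 2D detection can be enriched with the point's measured distance, which is the core of the fusion schemes cited above.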

5.3. Measuring Equipment

Experimental AV research uses a wide range of sensors which are necessary for perception and control systems. Verification of the operation of the AV system requires external measuring equipment, which will enable the parallel measurement of selected physical quantities (e.g., position, velocity, and acceleration) and other signals. These recorded data are the basis for assessing the performance of AV systems in the considered research scenario.
The measuring equipment is only used to record the course of the research and does not affect the decision-making or AV control process. The measuring equipment used depends on the considered research scenario; however, several characteristic areas of measurement can be identified:
waveforms related to the movement of the vehicle/obstacle (e.g., position and acceleration);
information and warning signals from the AV system interface;
course of the experiment (image from internal and external cameras installed on vehicle);
signals related to the vehicle control process.
The measurement of waveforms related to the movement of the vehicle or obstacle for a given scenario involves quantities such as the position of objects, driving velocity, acceleration (linear, angular), angular velocity, etc. For this purpose, information transmitted by GPS/GNSS, GNSS RTK, and IMU sensors is used [95]. In scenarios related to research into emergency braking and collision warning systems, an ADMA or DGNSS device has been used to measure waveforms [50,51]. In a similar scenario in [55,56], waveforms from the wheel's angular velocity sensor and the GNSS RTK module were also measured.
Normative scenarios specify the range of vehicle dynamics parameters and the obstacle parameters that must be recorded during the tests. Either the sampling frequency (e.g., min. 100 Hz) or the accuracy of the measurements is indicated (velocity: 0.1 km/h; position localization: 0.03 m; angular position: 0.1 deg; angular velocity: 0.1 deg/s; acceleration: 0.1 m/s²; steering wheel angular velocity: 1.0 deg/s). These requirements are developed for scenarios related to emergency braking systems, collision warning systems [17,43], and testing of lane keeping systems [28].
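As an illustrative sketch (not a normative procedure), a requirement such as "min. 100 Hz" can be checked directly on the timestamp channel of a recorded signal:

```python
def meets_sampling_requirement(timestamps_s, min_hz=100.0):
    """Check that a recording's mean sampling rate meets a required minimum."""
    if len(timestamps_s) < 2:
        return False
    duration = timestamps_s[-1] - timestamps_s[0]
    mean_hz = (len(timestamps_s) - 1) / duration
    return mean_hz >= min_hz

# One second of data: 101 samples at 10 ms spacing meets the 100 Hz
# requirement, while 51 samples at 20 ms spacing (50 Hz) does not:
ok = meets_sampling_requirement([i * 0.01 for i in range(101)])
too_slow = meets_sampling_requirement([i * 0.02 for i in range(51)])
```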
In [44] (a research scenario for blind spot monitoring systems) and [75] (a 3D object detection scenario), for the measurement of the AV’s movement and obstacle parameters, OXTS equipment was used. Specifically, the RT3000, RT Hunter, RT Target, and RT Range were the devices used.
Experimental research into systems whose main task is to warn the driver about dangers (e.g., FCW or LDW systems) requires the recording of messages appearing during a critical situation. The normative test scenarios for these systems are described in [17,28,43]. Visual and sound-based warning signals for the driver are recorded indirectly using cameras located inside the vehicle. In [50,51] (an emergency braking/collision warning system research scenario), GoPro cameras were used to record visual warning signals on the AV's dashboard and audible warning signals. In [44], warning signals for the driver (audible and visual) related to an object in the blind spot were recorded. In [22], visual signals about the recognition of a road sign were recorded.
Cameras are used to record the progress of an experiment. They can record the course of the research inside the vehicle or can be mounted on the AV’s body and record the surroundings of the vehicle. For this, SLR cameras can be used, which can additionally be activated at a defined point in time in order to synchronize data recording [50,51] (emergency braking/collision warning system research scenario).
The last area related to measuring the course of research is the signals generated by the AV system for controlling vehicle movement. In [55,56], signals such as the steering wheel torque or the force on the brake pedal were recorded. This information makes it easier to evaluate the performance of the systems, especially in normative testing where there is no access to the inside of the AV system. In [30], for the AV lane keeping test scenario, the values of the steering wheel's turning angle and the steering torque were measured. The sensors used were components of the vehicle's systems, and the signals were read from the CAN bus of the vehicle.
In [44], the blind spot monitoring and braking scenario for the lane change assistant was considered. The braking maneuver was carried out using the Heitz Automotive Testing device. This apparatus enables the measurement of data related to braking intensity and the resulting braking deceleration.
The role of measuring equipment is crucial for the measurement of quantities that enable the evaluation of the course and results of experimental AV research. This is especially true of scenarios in normative testing of ADAS systems. ADAS systems do not expose the physical quantities which form the basis of vehicle control, for example, signals from the perception or decision system. Only the final result of the AV's control system can be observed (a sound or visual signal of the system interface for the driver, execution of the maneuver by the vehicle, etc.). The measurement equipment used is often characterized by high measurement accuracy, which allows the measured results to be used as a reference for signals from the AV. This mainly applies to nonnormative research scenarios.

6. Summary and Conclusions

Experimental research is carried out according to increasingly complex scenarios. A scenario makes it easier to organize and conduct research. It contains a lot of important information, e.g., the aim and scope of the research, the measurement equipment, and the safety procedures. The analyzed scenarios were divided into two groups, which were conventionally named:
- normative and unified (e.g., SAE, Euro NCAP), which are applicable when testing vehicles and their systems in a repeatable manner (testing, inspection tests); unification of scenarios will make it possible to compare the different solutions used in AVs;
- nonnormative, which were developed in various research centers; their course is adapted to the current needs of researchers. The research is of a cognitive nature and indicates the ways AVs can be improved and developed.
There are a significant number of scenarios that focus on the research of AV systems. The leading area of research regarding components is perception systems. These systems are dominated by two methods of object detection: 2D and 3D detection. Systems based on vision sensors use single-lens cameras that provide 2D images. On the other hand, systems with stereo cameras provide a spatial image from two lenses. This solution enables 3D observation, typically at distances of up to 30 m. An advantageous but costly solution is to use information from radar and lidar (especially with a large number of scanning layers). This arrangement enables the spatial localization of objects and road infrastructure at greater distances than a stereo camera. It makes it easier to detect obstacles and increases the accuracy of object detection, determining their distances to the AV and their dimensions. Increasingly, at the stage of processing information from sensors, sensory fusion methods (combining information from cameras and lidars) and neural network models are used for the detection of objects (obstacles) in road traffic.
AV research has been carried out in dedicated areas of road infrastructure. In such areas, real objects (e.g., other road users) are often replaced by various types of dummies, targets, and physical models. Their location is described in the test scenarios, and the results have an impact on the effectiveness of the AV control process, including obstacle avoidance. This process affects the realization of the planned path and road safety. The AV control system requires current information about the position of the vehicle and the road lane. The investigated scenarios showed that this process is based on information from sensors and systems such as GPS, GNSS RTK, and IMU.
The current experience, resulting from the analysis of AV tests, indicates the need for further standardization of research procedures. However, the limitations in the conducted analysis result from several factors:
- there is a lack of publications reporting experimental research related to AVs;
- little information about the experimental research scenarios can be obtained from the analyzed publications;
- the lack of many legal regulations in the field of these studies means that a large part of the scenarios should be treated as individual ideas and solutions.
Expanding activities in this area will increase the possibility of comparing and summarizing experimental AV research results, which will create an increasingly complete picture of the behavior of autonomous vehicles in road traffic. The knowledge gathered in this way should significantly improve road safety and set directions for the further development of AVs. Future scenarios should also include the study of necessary changes to road infrastructure to facilitate the operation of AVs. Defining these scenarios should take into account the current capabilities and advancement level of AV systems.
This article indicates the currently dominant research tasks and methods of AV research. Another important area of systematized information is the list of standards describing standardized procedures for ADAS systems testing. In addition, an important part is the comparison of frequently used AV sensors with the available research topics. This makes it possible to compare the results of research centered on the use of a given sensor model with the results of other works in this field.
The designation AV as used in this work also covers the terms contained in Regulation (EU) 2019/2144 of the European Parliament and of the Council of 27 November 2019 [98], where an "automated vehicle" means a motor vehicle designed and constructed to move autonomously for certain periods of time without continuous driver supervision, but in respect of which driver intervention is still expected or required, and a "fully automated vehicle" means a motor vehicle that has been designed and constructed to move autonomously without any driver supervision.

Author Contributions

Conceptualization, L.P., P.S. and M.Z.; methodology, L.P., P.S. and M.Z.; formal analysis, L.P. and M.Z.; investigation, L.P., M.Z. and P.S.; resources, P.S. and M.Z.; data curation, P.S. and M.Z.; writing—original draft preparation, P.S., L.P. and M.Z.; writing—review and editing, P.S., L.P. and M.Z.; visualization, M.Z., L.P. and P.S.; supervision, L.P.; project administration, P.S.; funding acquisition, L.P., P.S. and M.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

ACC: Active Cruise Control
ADAS: Advanced Driver Assistance Systems
ADMA: Automotive Dynamic Motion Analyzer
AEB: Autonomous Emergency Braking
AEB–C2C: Autonomous Emergency Braking–Car-to-Car
AEB–VRU: Autonomous Emergency Braking–Vulnerable Road User
AV: Autonomous/Automated Vehicle
CAN: Controller Area Network
CBFA: Car-to-Bicyclist Farside Adult
CBLA: Car-to-Bicyclist Longitudinal Adult
CBNA: Car-to-Bicyclist Nearside Adult
CBNAO: Car-to-Bicyclist Nearside Adult Obstructed
CCFtap: Car-to-Car Front Turn-Across-Path
CCRb: Car-to-Car Rear Braking
CCRm: Car-to-Car Rear Moving
CCRs: Car-to-Car Rear Stationary
CMOS: Complementary Metal-Oxide-Semiconductor
CPFA: Car-to-Pedestrian Farside Adult
CPLA: Car-to-Pedestrian Longitudinal Adult
CPNA: Car-to-Pedestrian Nearside Adult
CPNC: Car-to-Pedestrian Nearside Child
CPRA: Car-to-Pedestrian Reverse Adult
CPTA: Car-to-Pedestrian Turning Adult
DGNSS: Differential GNSS
DIL: Driver in the Loop
ELK: Emergency Lane Keeping
ESP: Electronic Stability Program
EU: European Union (European Parliament and the Council)
EURO NCAP: European New Car Assessment Programme
f: Frequency/refresh rate
FCW: Forward Collision Warning
FEB: Forward Emergency Braking
GLONASS: Global Navigation Satellite System
GNSS RTK: Global Navigation Satellite Systems Real Time Kinematic
GPS: Global Positioning System
H: Horizontal field of view
HIL: Hardware in the Loop
IIHS: Insurance Institute for Highway Safety
IMU: Inertial Measurement Unit
INS: Inertial Navigation System
ISO: International Organization for Standardization
KATRI: Korea Automobile Testing and Research Institute
Layers: number of scanning layers
LDW: Lane Departure Warning
LKA: Lane Keeping Assist
LSS: Lane Support Systems
MIL: Model in the Loop
NHTSA: National Highway Traffic Safety Administration
OXTS: Oxford Technical Solutions
P-AEB: Pedestrian-Autonomous Emergency Braking
R: Resolution
Range: maximum scanning distance
RGB: Red Green Blue
SAE: Society of Automotive Engineers
SAS: Speed Assist Systems
SIL: Software in the Loop
SLIF: Speed Limit Information Function
SLR: Single-Lens Reflex
UAV: Unmanned Aerial Vehicle
UNECE: United Nations Economic Commission for Europe
V: Vertical field of view
VIL: Vehicle in the Loop

References

  1. J3016_201609; Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles—SAE International. Available online: https://www.sae.org/standards/content/j3016_201609/ (accessed on 2 August 2022).
  2. J3016_201806; Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles—SAE International. Available online: https://www.sae.org/standards/content/j3016_201806/ (accessed on 2 August 2022).
  3. Guerra, E. Planning for Cars That Drive Themselves: Metropolitan Planning Organizations, Regional Transportation Plans, and Autonomous Vehicles. J. Plan. Educ. Res. 2016, 36, 210–224.
  4. Prochowski, L.; Ziubiński, M.; Szwajkowski, P.; Gidlewski, M.; Pusty, T.; Stańczyk, T.L. Impact of Control System Model Parameters on the Obstacle Avoidance by an Autonomous Car-Trailer Unit: Research Results. Energies 2021, 14, 2958.
  5. Prochowski, L.; Ziubiński, M.; Szwajkowski, P.; Pusty, T.; Gidlewski, M. Experimental and Simulation Examination of the Impact of the Control Model on the Motion of a Motorcar with a Trailer in a Critical Situation. In Proceedings of the 15th International Conference Dynamical Systems-Theory and Applications DSTA, Łódź, Poland, 4 December 2019.
  6. Winner, H. Safety Assurance for Highly Automated Driving—The PEGASUS Approach. In Proceedings of the Automated Vehicle Symposium (AVS) 2017, San Francisco, CA, USA, 11–13 July 2017.
  7. Nabhan, M. Models and Algorithms for the Exploration of the Space of Scenarios: Toward the Validation of the Autonomous Vehicle. Ph.D. Thesis, Université Paris-Saclay, Paris, France, 2020.
  8. Pietruch, M.; Młyniec, A.; Wetula, A. An overview and review of testing methods for the verification and validation of ADAS, active safety systems, and autonomous driving. Min. Inform. Autom. Electr. Eng. 2020, 58, 19–27.
  9. Duleba, S.; Tettamanti, T.; Nyerges, Á.; Szalay, Z. Ranking the Key Areas for Autonomous Proving Ground Development Using Pareto Analytic Hierarchy Process. IEEE Access 2021, 9, 51214–51230.
  10. Aparicio, A. Badania Walidacyjne Samochodów Autonomicznych—Trudne Wyzwanie | Polska Izba Motoryzacji. Available online: https://pim.pl/badania-walidacyjne-samochodow-autonomicznych-trudne-wyzwanie/ (accessed on 2 August 2022).
  11. Morris, A.P.; Haworth, N.; Filtness, A.; Nguatem, D.-P.A.; Brown, L.; Rakotonirainy, A.; Glaser, S. Autonomous Vehicles and Vulnerable Road-Users—Important Considerations and Requirements Based on Crash Data from Two Countries. Behav. Sci. 2021, 11, 101.
  12. Jurecki, R.S.; Stańczyk, T.L. A Methodology for Evaluating Driving Styles in Various Road Conditions. Energies 2021, 14, 3570.
  13. Huang, W.; Wang, K.; Lv, Y.; Zhu, F. Autonomous Vehicles Testing Methods Review. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 163–168.
  14. Németh, H.; Háry, A.; Szalay, Z.; Tihanyi, V.; Tóth, B. Proving Ground Test Scenarios in Mixed Virtual and Real Environment for Highly Automated Driving. In Mobilität in Zeiten der Veränderung: Technische und Betriebswirtschaftliche Aspekte; Proff, H., Ed.; Springer Fachmedien: Wiesbaden, Germany, 2019; pp. 199–210. ISBN 978-3-658-26107-8.
  15. Li, L.; Huang, W.-L.; Liu, Y.; Zheng, N.-N.; Wang, F.-Y. Intelligence Testing for Autonomous Vehicles: A New Approach. IEEE Trans. Intell. Veh. 2016, 1, 158–166.
  16. J3045_201808; Truck and Bus Lane Departure Warning Systems Test Procedure and Minimum Performance Requirements—SAE International. Available online: https://www.sae.org/standards/content/j3045_201808 (accessed on 2 August 2022).
  17. Euro-Ncap-Aeb-C2c-Test-Protocol-V303.Pdf. Available online: https://cdn.euroncap.com/media/62794/euro-ncap-aeb-c2c-test-protocol-v303.pdf (accessed on 2 August 2022).
  18. Commission Delegated Regulation (EU) 2021/1958 of 23 June 2021 Supplementing Regulation (EU) 2019/2144 of the European Parliament and of the Council by Laying down Detailed Rules Concerning the Specific Test Procedures and Technical Requirements for the Type-Approval of Motor Vehicles with Regard to Their Intelligent Speed Assistance Systems and for the Type-Approval of Those Systems as Separate Technical Units and Amending Annex II to That Regulation (Text with EEA Relevance). Available online: http://data.europa.eu/eli/reg_del/2021/1958/oj (accessed on 2 August 2022).
  19. Spencer, M.; Jones, D.; Kraehling, M.; Stol, K. Trajectory Based Autonomous Vehicle Following Using a Robotic Driver. In Proceedings of the 2009 Australasian Conference on Robotics and Automation, ACRA 2009, Sydney, Australia, 2–4 December 2009.
  20. Ahmed, S.; Huda, M.N.; Rajbhandari, S.; Saha, C.; Elshaw, M.; Kanarachos, S. Pedestrian and Cyclist Detection and Intent Estimation for Autonomous Vehicles: A Survey. Appl. Sci. 2019, 9, 2335.
  21. Ragesh, N.K.; Rajesh, R. Pedestrian Detection in Automotive Safety: Understanding State-of-the-Art. IEEE Access 2019, 7, 47864–47890.
  22. Euro-Ncap-Sas-Test-Protocol-V20.pdf. Available online: https://cdn.euroncap.com/media/32290/euro-ncap-sas-test-protocol-v20.pdf (accessed on 2 August 2022).
  23. Negahbani, F.; Töre, O.B.; Güney, F.; Akgun, B. Frustum Fusion: Pseudo-LiDAR and LiDAR Fusion for 3D Detection. arXiv 2021, arXiv:2111.04780.
  24. Kumar, G.A.; Lee, J.H.; Hwang, J.; Park, J.; Youn, S.H.; Kwon, S. LiDAR and Camera Fusion Approach for Object Distance Estimation in Self-Driving Vehicles. Symmetry 2020, 12, 324.
  25. Meyer, G.P.; Charland, J.; Hegde, D.; Laddha, A.; Vallespi-Gonzalez, C. Sensor Fusion for Joint 3D Object Detection and Semantic Segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Long Beach, CA, USA, 16–17 June 2019.
  26. Nkosi, M.P.; Hancke, G.P.; dos Santos, R.M.A. Autonomous Pedestrian Detection. In Proceedings of the AFRICON 2015, Addis Ababa, Ethiopia, 14–17 September 2015; pp. 1–5.
  27. Wu, C.-F.; Lin, C.-J.; Lee, C.-Y. Applying a Functional Neurofuzzy Network to Real-Time Lane Detection and Front-Vehicle Distance Measurement. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 577–589.
  28. Euro-Ncap-Lss-Test-Protocol-V302.Pdf. Available online: https://cdn.euroncap.com/media/64973/euro-ncap-lss-test-protocol-v302.pdf (accessed on 2 August 2022).
  29. Gidlewski, M.; Jackowski, J.; Jemioł, L.; Żardecki, D. Sensitivity of a Vehicle Lane Change Control System to Disturbances and Measurement Signal Errors—Modeling and Numerical Investigations. Mech. Syst. Signal Process. 2021, 147, 107081.
  30. Kang, C.M.; Lee, J.; Yi, S.G.; Jeon, S.J.; Son, Y.S.; Kim, W.; Lee, S.-H.; Chung, C.C. Lateral Control for Autonomous Lane Keeping System on Highways. In Proceedings of the 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Korea, 13–16 October 2015; pp. 1728–1733.
  31. Cudrano, P.; Mentasti, S.; Matteucci, M.; Bersani, M.; Arrigoni, S.; Cheli, F. Advances in Centerline Estimation for Autonomous Lateral Control. In Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 9 October–13 November 2020.
  32. Törő, O.; Bécsi, T.; Aradi, S. Design of Lane Keeping Algorithm of Autonomous Vehicle. Period. Polytech. Transp. Eng. 2016, 44, 60–68.
  33. Li, X.; Sun, Z.; He, Z.; Zhu, Q.; Liu, D. A Practical Trajectory Planning Framework for Autonomous Ground Vehicles Driving in Urban Environments. In Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), Seoul, Korea, 28 June–1 July 2015; pp. 1160–1166.
  34. Hayashi, R.; Isogai, J.; Raksincharoensak, P.; Nagai, M. Autonomous Collision Avoidance System by Combined Control of Steering and Braking Using Geometrically Optimised Vehicular Trajectory. Veh. Syst. Dyn. 2012, 50, 151–168.
  35. Favarò, F.M.; Nader, N.; Eurich, S.O.; Tripp, M.; Varadaraju, N. Examining Accident Reports Involving Autonomous Vehicles in California. PLoS ONE 2017, 12, e0184952.
  36. Katzorke, N.; Moosmann, M.; Imdahl, R.; Lasi, H. A Method to Assess and Compare Proving Grounds in the Context of Automated Driving Systems. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 20–23 September 2020; pp. 1–6.
  37. Regulation No 130 of the Economic Commission for Europe of the United Nations (UN/ECE)—Uniform Provisions Concerning the Approval of Motor Vehicles with Regard to the Lane Departure Warning System (LDWS). Available online: http://data.europa.eu/eli/reg/2014/130/oj (accessed on 2 August 2022).
  38. UN Regulation No 152—Uniform Provisions Concerning the Approval of Motor Vehicles with Regard to the Advanced Emergency Braking System (AEBS) for M1 and N1 Vehicles [2020/1597]. Available online: https://op.europa.eu/en/publication-detail/-/publication/fc2d3589-1a7c-11eb-b57e-01aa75ed71a1 (accessed on 2 August 2022).
  39. UN Regulation No 157—Uniform Provisions Concerning the Approval of Vehicles with Regards to Automated Lane Keeping Systems [2021/389]. Available online: https://op.europa.eu/en/publication-detail/-/publication/36fd3041-807a-11eb-9ac9-01aa75ed71a1 (accessed on 2 August 2022).
  40. ISO 17361:2017. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/23/72349.html (accessed on 2 August 2022).
  41. ISO 15623:2013. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/05/66/56655.html (accessed on 2 August 2022).
  42. ISO 21202:2020. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/00/70072.html (accessed on 2 August 2022).
  43. Euro-Ncap-Aeb-Vru-Test-Protocol-V304.Pdf. Available online: https://cdn.euroncap.com/media/62795/euro-ncap-aeb-vru-test-protocol-v304.pdf (accessed on 2 August 2022).
  44. Forkenbrock, G.; Hoover, R.L.; Gerdus, E.; Buskirk, T.V.; Heitz, M. Blind Spot Monitoring in Light Vehicles—System Performance. Available online: https://www.nhtsa.gov/sites/nhtsa.gov/files/812045_blind-spot-monitoring-in-light-vehicles-system-performance.pdf (accessed on 2 August 2022).
  45. Howe, G.; Xu, G.; Hoover, D.; Elsasser, D.; Barickman, F. Commercial Connected Vehicle Test Procedure Development and Test Results—Blind Spot Warning/Lane Change Warning. 2016. Available online: https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/812317_connectedveh.pdf (accessed on 2 August 2022).
  46. Thorn, E.; Kimmel, S.; Chaka, M. A Framework for Automated Driving System Testable Cases and Scenarios. 2018. Available online: https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/13882-automateddrivingsystems_092618_v1a_tag.pdf (accessed on 2 August 2022).
  47. J2808_201701; Lane Departure Warning Systems: Information for the Human Interface—SAE International. Available online: https://www.sae.org/standards/content/j2808_201701 (accessed on 2 August 2022).
  48. IIHS Test Protocol: Autonomous Emergency Braking (test_protocol_aeb.pdf). Available online: https://www.iihs.org/media/a582abfb-7691-4805-81aa-16bbdf622992/REo1sA/Ratings/Protocols/current/test_protocol_aeb.pdf (accessed on 2 August 2022).
  49. IIHS Test Protocol: Pedestrian Autonomous Emergency Braking (test_protocol_pedestrian_aeb.pdf). Available online: https://www.iihs.org/media/f6a24355-fe4b-4d71-bd19-0aab8b39aa7e/TfEBAA/Ratings/Protocols/current/test_protocol_pedestrian_aeb.pdf (accessed on 2 August 2022).
  50. Paula, D.; Böhm, K.; Kubjatko, T.; Schweiger, H.-G. Autonomous Emergency Braking (AEB) Experiments for Traffic Accident Reconstruction. In Proceedings of the 25th International Scientific Conference Transport Means, Kaunas, Lithuania, 6–8 October 2021. [Google Scholar]
  51. Böhm, K.; Paula, D.; Geidl, B.; Graßl, L.; Kubjatko, T.; Schweiger, H.-G. Reliability and Performance of the AEB System of a Tesla Model X under Different Conditions. In Proceedings of the 29th Annual Congress of the European Association for Accident Research, Haifa, Israel, 6–7 October 2021. [Google Scholar]
  52. ISO 19237:2017. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/06/41/64111.html (accessed on 2 August 2022).
  53. ISO 22078:2020. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/25/72508.html (accessed on 2 August 2022).
  54. ISO 22839:2013. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/04/53/45339.html (accessed on 2 August 2022).
  55. Ivanov, A.M.; Shadrin, S.S.; Kristalniy, S.R.; Popov, N.V. Possible Scenarios of Autonomous Vehicles’ Testing in Russia. IOP Conf. Ser. Mater. Sci. Eng. 2019, 534, 012001. [Google Scholar] [CrossRef]
  56. Ivanov, A.M.; Kristalniy, S.R.; Popov, N.V.; Toporkov, M.A.; Isakova, M.I. New Testing Methods of Automatic Emergency Braking Systems and the Experience of Their Application. IOP Conf. Ser. Mater. Sci. Eng. 2018, 386, 012019. [Google Scholar] [CrossRef]
  57. Ivanov, A.M.; Shadrin, S.S. System of Requirements and Testing Procedures for Autonomous Driving Technologies. IOP Conf. Ser. Mater. Sci. Eng. 2020, 819, 012016. [Google Scholar] [CrossRef]
  58. Ivanov, A.; Kristalniy, S.; Popov, N. Russian National Non-Commercial Vehicle Safety Rating System RuNCAP. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1159, 012088. [Google Scholar] [CrossRef]
  59. ISO 11270:2014. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/05/03/50347.html (accessed on 2 August 2022).
  60. ISO 17387:2008. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/04/36/43654.html (accessed on 2 August 2022).
  61. ISO 11067:2015. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/05/00/50091.html (accessed on 2 August 2022).
  62. ISO 15622:2018. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/15/71515.html (accessed on 2 August 2022).
  63. ISO 22178:2009. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/04/07/40752.html (accessed on 2 August 2022).
  64. ISO 16787:2017. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/37/73768.html (accessed on 2 August 2022).
  65. ISO 17386:2010. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/05/14/51448.html (accessed on 2 August 2022).
  66. ISO 22737:2021. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/07/37/73767.html (accessed on 2 August 2022).
  67. Bruno, D.; Peres Nunes Matias, L.; Amaro, J.; Osório, F.S.; Wolf, D. Computer Vision System with 2D and 3D Data Fusion for Detection of Possible Auxiliaries Routes in Stretches of Interdicted Roads. In Proceedings of the 52nd Hawaii International Conference on System Sciences HICSS, Grand Wailea, Maui, HI, USA, 8–11 January 2019; ISBN 978-0-9981331-2-6. [Google Scholar]
  68. Madawy, K.E.; Rashed, H.; Sallab, A.E.; Nasr, O.; Kamel, H.; Yogamani, S. RGB and LiDAR Fusion Based 3D Semantic Segmentation for Autonomous Driving. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019. [Google Scholar]
  69. Yang, Z.; Li, J.; Li, H. Real-Time Pedestrian Detection for Autonomous Driving. In Proceedings of the 2018 International Conference on Intelligent Autonomous Systems (ICoIAS), Singapore, 1–3 March 2018; pp. 9–13. [Google Scholar]
  70. Wu, B.F.; Lin, C.-T.; Chen, C.-J. Real-Time Lane and Vehicle Detection Based on A Single Camera Model. Int. J. Comput. Appl. 2010, 32, 149–159. [Google Scholar] [CrossRef]
  71. Matsubayashi, K.; Yamad, Y.; Iyoda, M.; Koike, S.; Kawasaki, T.; Tokuda, M. Development of Rear Pre-Crash Safety System for Rear-End Collisions. In Proceedings of the 20th International Technical Conference on the Enhanced Safety of Vehicles (ESV), Lyon, France, 18–21 June 2007. [Google Scholar]
  72. Jansson, J. Collision Avoidance Theory: With Application to Automotive Collision Mitigation. Ph.D. Thesis, Linköping Studies in Science and Technology Dissertations; Linköping University Electronic Press: Linköping, Sweden, 2005. [Google Scholar]
  73. Janai, J.; Güney, F.; Behl, A.; Geiger, A. Computer Vision for Autonomous Vehicles: Problems, Datasets and State of the Art. Found. Trends Comput. Graph. Vis. 2017, 12, 1–308. [Google Scholar] [CrossRef]
  74. Aryal, M. Object Detection, Classification, and Tracking for Autonomous Vehicle. Master’s Thesis, Grand Valley State University, Grand Rapids, MI, USA, 2018. [Google Scholar]
  75. Blachut, K.; Danilowicz, M.; Szolc, H.; Wasala, M.; Kryjak, T.; Komorkiewicz, M. Automotive Perception System Evaluation with Reference Data from a UAV’s Camera Using ArUco Markers and DCNN. J. Sign Process Syst. 2022, 94, 675–692. [Google Scholar] [CrossRef]
  76. Gao, F.; Li, C.; Zhang, B. A Dynamic Clustering Algorithm for Lidar Obstacle Detection of Autonomous Driving System. IEEE Sens. J. 2021, 21, 25922–25930. [Google Scholar] [CrossRef]
  77. Lim, B.; Woo, T.; Kim, H. Integration of Vehicle Detection and Distance Estimation Using Stereo Vision for Real-Time AEB System. In Proceedings of the 3rd International Conference on Vehicle Technology and Intelligent Transport Systems—VEHITS, Porto, Portugal, 22–24 April 2017; pp. 211–216, ISBN 978-989-758-242-4. [Google Scholar] [CrossRef]
  78. Song, W.; Zou, S.; Tian, Y.; Fong, S.; Cho, K. Classifying 3D Objects in LiDAR Point Clouds with a Back-Propagation Neural Network. Hum. Cent. Comput. Inf. Sci. 2018, 8, 29. [Google Scholar] [CrossRef]
  79. Shilo, A. Detection and Tracking of Unknown Objects on the Road Based on Sparse LiDAR Data for Heavy Duty Vehicles. Master’s Thesis, KTH School of Electrical Engineering and Computer Science (EECS), Stockholm, Sweden, 2018. Available online: http://www.diva-portal.org/smash/get/diva2:1256042/FULLTEXT01.pdf (accessed on 2 August 2022).
  80. Vincke, B.; Rodriguez Florez, S.; Aubert, P. An Open-Source Scale Model Platform for Teaching Autonomous Vehicle Technologies. Sensors 2021, 21, 3850. [Google Scholar] [CrossRef]
  81. Scheffe, P.; Maczijewski, J.; Kloock, M.; Kampmann, A.; Derks, A.; Kowalewski, S.; Alrifaee, B. Networked and Autonomous Model-Scale Vehicles for Experiments in Research and Education. IFAC-PapersOnLine 2020, 53, 17332–17337. [Google Scholar] [CrossRef]
  82. Lapapong, S.; Gupta, V.; Callejas, E.; Brennan, S. Fidelity of Using Scaled Vehicles for Chassis Dynamic Studies. Veh. Syst. Dyn. 2009, 47, 1401–1437. [Google Scholar] [CrossRef]
  83. Liburdi, A. Development of a Scale Vehicle Dynamics Test Bed. Master’s Thesis, University of Windsor, Windsor, ON, Canada, 2010. Available online: https://scholar.uwindsor.ca/etd/195/ (accessed on 2 August 2022).
  84. Park, Y.; Kim, B.; Ahn, C. Scaled Experiment with Dimensional Analysis for Vehicle Lateral Dynamics Maneuver. In Proceedings of the Advances in Dynamics of Vehicles on Roads and Tracks; Klomp, M., Bruzelius, F., Nielsen, J., Hillemyr, A., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 1288–1294. [Google Scholar]
  85. Szalay, Z.; Tettamanti, T.; Esztergár-Kiss, D.; Varga, I.; Bartolini, C. Development of a Test Track for Driverless Cars: Vehicle Design, Track Configuration, and Liability Considerations. Period. Polytech. Transp. Eng. 2018, 46, 29–35. [Google Scholar] [CrossRef]
  86. Katzorke, N. Proving Ground Requirements for Automated Vehicle Testing. In Proceedings of the ADAS & Autonomous Vehicle Technology Conference, San Jose, CA, USA, 7–8 September 2022. [Google Scholar] [CrossRef]
  87. Chen, R.; Arief, M.; Zhang, W.; Zhao, D. How to Evaluate Proving Grounds for Self-Driving? A Quantitative Approach. IEEE Trans. Intell. Transp. Syst. 2021, 22, 5737–5748. [Google Scholar] [CrossRef]
  88. Chen, R.; Arief, M.; Zhao, D. An “Xcity” Optimization Approach to Designing Proving Grounds for Connected and Autonomous Vehicles. arXiv 2018, arXiv:1808.03089. [Google Scholar]
  89. Fremont, D.J.; Kim, E.; Pant, Y.V.; Seshia, S.A.; Acharya, A.; Bruso, X.; Wells, P.; Lemke, S.; Lu, Q.; Mehta, S. Formal Scenario-Based Testing of Autonomous Vehicles: From Simulation to the Real World. In Proceedings of the 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 20–23 September 2020; pp. 1–8. [Google Scholar]
  90. Xiong, L.; Fu, Z.; Zeng, D.; Leng, B. An Optimized Trajectory Planner and Motion Controller Framework for Autonomous Driving in Unstructured Environments. Sensors 2021, 21, 4409. [Google Scholar] [CrossRef]
  91. Yang, S.M.; Lin, Y.A. Development of an Improved Rapidly Exploring Random Trees Algorithm for Static Obstacle Avoidance in Autonomous Vehicles. Sensors 2021, 21, 2244. [Google Scholar] [CrossRef] [PubMed]
  92. Szalay, Z. Structure and Architecture Problems of Autonomous Road Vehicle Testing and Validation. In Proceedings of the 15th Mini Conference on Vehicle System Dynamics, Identification and Anomalies-VSDIA, Budapest, Hungary, 7–9 November 2016; Volume 24, pp. 229–236, ISBN 978-963-313-266-1. [Google Scholar]
  93. Aparicio, A.; Boltshauser, S.; Lesemann, M.; Jacobson, J.; Eriksson, H.; Herard, J. Status of Test Methods for Active Safety Systems; SAE International: Warrendale, PA, USA, 2012. [Google Scholar]
  94. Aparicio, A.; Lesemann, M.; Eriksson, H. Status of Test Methods for Autonomous Emergency Braking Systems—Results from the Active Test Project. In Proceedings of the SAE 2013 World Congress and Exhibition, Detroit, MI, USA, 16–18 April 2013. [Google Scholar]
  95. Christiansen, P.H. TractorEYE: Vision-Based Real-Time Detection for Autonomous Vehicles in Agriculture; Aarhus University Library Scholarly: Aarhus, Denmark, 7 November 2018; ISBN 978-87-7507-426-6. [Google Scholar] [CrossRef]
  96. Haris, M.; Hou, J. Obstacle Detection and Safely Navigate the Autonomous Vehicle from Unexpected Obstacles on the Driving Lane. Sensors 2020, 20, 4719. [Google Scholar] [CrossRef] [PubMed]
  97. Bae, H.; Lee, G.; Yang, J.; Shin, G.; Choi, G.; Lim, Y. Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles. Sensors 2021, 21, 3124. [Google Scholar] [CrossRef]
  98. Regulation (EU) 2019/2144 of the European Parliament and of the Council of 27 November 2019 on Type-Approval Requirements for Motor Vehicles and Their Trailers, and Systems, Components and Separate Technical Units Intended for Such Vehicles, as Regards Their General Safety and the Protection of Vehicle Occupants and Vulnerable Road Users, Amending Regulation (EU) 2018/858 of the European Parliament and of the Council and Repealing Regulations (EC) No 78/2009, (EC) No 79/2009 and (EC) No 661/2009 of the European Parliament and of the Council and Commission Regulations (EC) No 631/2009, (EU) No 406/2010, (EU) No 672/2010, (EU) No 1003/2010, (EU) No 1005/2010, (EU) No 1008/2010, (EU) No 1009/2010, (EU) No 19/2011, (EU) No 109/2011, (EU) No 458/2011, (EU) No 65/2012, (EU) No 130/2012, (EU) No 347/2012, (EU) No 351/2012, (EU) No 1230/2012 and (EU) 2015/166 (Text with EEA Relevance). Available online: http://data.europa.eu/eli/reg/2019/2144/oj (accessed on 2 August 2022).
Figure 1. Hierarchical information diagram for AV research scenarios.
Figure 2. Part of the research: identification of the suddenly appearing obstacle at a road intersection by the AV’s perception system (Łukasiewicz Research Network—Automotive Industry Institute). Two frames from the perception system’s camera are shown; the numerical value in the figure indicates the probability of object identification by the AV system.
Figure 3. Part of the research: lane and road edge detection by the AV’s perception system (Łukasiewicz Research Network—Automotive Industry Institute).
Figure 4. Example of the course of the planned driving path y_M(x) and the actual resultant trajectory y_CA(x); the method for measuring the lateral deviation (Δy_MA) and the angular deviation of the car’s longitudinal axis (Δψ_MA) is marked.
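The deviations shown in Figure 4 can be computed directly from sampled path data; a minimal sketch (not the authors’ measurement code), assuming the planned path y_M(x) and the trajectory y_CA(x) are sampled on a common x grid:

```python
import numpy as np

def path_deviations(x, y_planned, y_actual):
    """Lateral deviation dy_MA(x) and angular deviation dpsi_MA(x)
    between a planned path y_M(x) and a measured trajectory y_CA(x),
    both sampled on the same x grid (Figures 4 and 5)."""
    dy = y_actual - y_planned                       # lateral deviation dy_MA
    # heading of each curve from finite differences of y with respect to x
    psi_planned = np.arctan(np.gradient(y_planned, x))
    psi_actual = np.arctan(np.gradient(y_actual, x))
    dpsi = psi_actual - psi_planned                 # angular deviation dpsi_MA
    return dy, dpsi

# Example: straight planned path, trajectory offset laterally by 0.2 m
x = np.linspace(0.0, 100.0, 201)
dy, dpsi = path_deviations(x, np.zeros_like(x), np.full_like(x, 0.2))
```

For a constant offset, the lateral deviation is 0.2 m everywhere and the angular deviation is zero, matching the geometric definitions in the caption.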
Figure 5. Planned driving path and resultant trajectory of the car for several driving velocities, v = 50, 60, 70, 80 km/h: (a) planned driving path y_M(x) and trajectory y_CA(x); (b) lateral deviation Δy_MA of the trajectory (movement of the center of mass) from the planned driving path; (c) angular deviation Δψ_MA of the car’s longitudinal axis from the tangent to the planned driving path.
Figure 6. Examples of obstacles used in experimental research scenarios: (a) pieces of cardboard marking the road lane and the obstacle [34]; (b) soft car target on a moving platform [17,28]; (c) pedestrian, child, and bicyclist dummies on a moving platform [43]; (d) soft wall covered with a metallized mirrored film [56]; (e) motorcyclist and human-figure dummies [56].
Table 1. The main features of the test scenario for planning and maintaining the vehicle’s driving path [30,31,32].
Task and aim of the research: The research task in the described scenarios is automated driving along a detected road lane on a curvilinear track. The aim of the research is to keep the vehicle in the road lane. The lane is kept by controlling the steered wheels or, additionally, by selecting the driving velocity.
Research object: The research used passenger cars equipped with systems enabling autonomous driving [30,31] or a go-kart performing AV functions [32]. For this purpose, elements of the perception system, path planning, and the steering wheel control system were used. In this research, the path-planning system determines the reference trajectory along the center line of the road lane.
Research site/environment: The locations used for this research are specialized research tracks: KATRI [30] and the Aci-Sara Lainate and Monza Eni Circuit tracks [31]. They provide road infrastructure, in the form of closed loops, for safe and repeatable research. In [32], the perception system used track elements in the form of road signs to select the driving velocity.
Vehicle control system: The control process was carried out by turning the vehicle’s wheels. The steering angle follows from the need to keep the vehicle in the road lane. The necessary correction takes into account the lateral and angular deviations (Figure 4 and Figure 5) between the planned path and the actual vehicle trajectory. The deviations are measured from the center line of the road lane [31] or from the reference trajectory [30].
For example, the control process in [30] was carried out in a feedback loop that took the lateral deviation of the vehicle from the planned path into account. This deviation was the basis for calculating the steering angle of the vehicle’s wheels. An actuator with a stepper motor executed the turn.
Additional information: In [32], HIL technology was used, in which a computer simulation provides the positions of the road lines and the vehicle. The position of the vehicle was simulated with the so-called bicycle model (CarSim–MATLAB software). Images from the computer screen, observed through a camera, were the basis (in the feedback loop) for evaluating the vehicle’s position relative to the computer simulation. The difference between these positions makes it possible to calculate the necessary correction in the form of a steering-wheel adjustment. However, the camera is sensitive to shocks, and even a minimal displacement makes it difficult to detect the lines on the computer screen and to calculate the correct deviation from the planned path.
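The feedback idea described above — a steering angle computed from the lateral and angular deviation — can be sketched with a kinematic (so-called bicycle) model. The gains, wheelbase, speed, and actuator limit below are illustrative assumptions, not values from [30,31,32]:

```python
import math

def simulate_lane_keeping(v=10.0, wheelbase=2.7, k_y=0.5, k_psi=1.0,
                          dt=0.01, steps=2000):
    """Kinematic bicycle model with a proportional steering correction
    that drives the lateral deviation from the lane center (y = 0) and
    the heading deviation to zero. All parameter values are illustrative."""
    x, y, psi = 0.0, 1.0, 0.0                   # start 1 m off the lane center
    for _ in range(steps):
        delta = -k_y * y - k_psi * psi          # feedback steering law
        delta = max(-0.5, min(0.5, delta))      # actuator limit (rad)
        x += v * math.cos(psi) * dt             # forward Euler integration
        y += v * math.sin(psi) * dt
        psi += v / wheelbase * math.tan(delta) * dt
    return y

final_lateral_deviation = simulate_lane_keeping()   # near 0 m after 20 s
```

The closed loop mirrors the scheme in [30]: the measured lateral deviation feeds the steering-angle computation, which the (here idealized) actuator applies each cycle.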
Table 2. The main features of the test scenarios for the effectiveness of lane detection and keeping [27,33].
Task and aim of the research: The research task in the described scenarios is automated lane detection and keeping [27], as well as avoiding a slower-moving vehicle [33]. The aim of the research has two aspects: effective detection of road-lane boundary lines in traffic with other vehicles, and determination of a non-collision (safe) trajectory while passing a slower car.
In [27], a scenario was used in which the perception system was tested for its effectiveness in detecting lines on the road. The scenario in [33] covers a more complex research task, which additionally includes determining a collision-free path of movement so as to avoid a slower-moving car smoothly.
Research object: Passenger cars and their measuring equipment were used during the research. Little information is provided about the vehicle equipment in [27]; the scope of the planned research indicates that the vehicle’s perception system (camera and software) was involved.
Research site/environment: In both scenarios, research was planned in an urban area. In [33], a section of road was simulated in a closed area (e.g., with dummies). In this area, a two-lane roadway was separated, which made it possible to avoid vehicles; driving velocities reached up to 25 km/h.
In [27], a limited stretch of road with traffic was used in the scenario. The car’s perception system was used to analyze the effectiveness of detecting the road-lane lines.
Vehicle control system: During the research in scenario [27], attention focused on the effective detection of lane boundaries in urban conditions. Vehicles in scenario [33] performed automated driving by turning the wheels and selecting the driving velocity. The choice of velocity took into account the limits of driving safety and smoothness.
Additional information: The IMU sensors used are inertial; hence, when measuring the deviation angle of the vehicle’s longitudinal axis, the rate signal must be accumulated over time, which makes the result susceptible to drift [33]. In scenario [27], a neural network was used to estimate the safe driving distance.
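The drift noted above for the IMU-based yaw measurement [33] arises because any constant rate bias accumulates linearly once the gyro signal is integrated; a toy illustration with an invented bias value:

```python
import numpy as np

def integrate_yaw(gyro_rate, dt):
    """Yaw angle from integrating a gyro rate signal (rad/s -> rad).
    A constant bias in the rate accumulates linearly in the angle."""
    return np.cumsum(gyro_rate) * dt

dt = 0.01                              # 100 Hz IMU sampling (assumed)
t = np.arange(0.0, 60.0, dt)           # one minute at standstill
bias = 0.001                           # 0.001 rad/s rate bias (invented)
true_rate = np.zeros_like(t)           # vehicle is not rotating
measured = true_rate + bias
yaw = integrate_yaw(measured, dt)
drift_after_60s = yaw[-1]              # approx. bias * 60 s = 0.06 rad
```

Even a small bias thus produces a yaw error that grows without bound, which is why such measurements are typically corrected with an absolute reference (e.g., GNSS heading or lane-line observations).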
Table 3. The main features of the scenario including avoidance of suddenly appearing obstacles [34].
Task and aim of the research: The research task relates to avoiding a suddenly appearing obstacle. This research is a reference for active safety technology aimed at preventing road accidents. The scenario involved two aspects of the research: obstacle avoidance and braking.
Research object: The research object was a single-person AV that drove autonomously while avoiding the obstacle. The functioning of the perception, risk-analysis, and obstacle-avoidance path-planning systems was investigated.
Research site/environment: The research was conducted in a closed area. Cardboard boxes were used to define the research area (roads and obstacles). A velocity limit of 30 km/h was introduced.
Vehicle control system: The vehicle was controlled through the braking and steering systems. The locations of the road edge and the obstacle allowed geometric determination (in global coordinates) of the obstacle-avoidance trajectory. This made it possible to calculate, in the vehicle control system, the necessary correction of the trajectory relative to the planned path.
Additional information: Cardboard boxes are often used to mark road boundaries and obstacles.
Table 4. Normative test scenarios for active safety and control systems in AVs.
Scope and aim of the research: Tests of the emergency braking system/collision warning system against an obstacle (vehicle, pedestrian, bicyclist). The aim of the test was to assess the effectiveness of a given AV system. Images from cameras and system interface signals were recorded. Measured values for the car: position, driving velocity, angular velocities, acceleration, steering wheel angular velocity, and the intensity of acceleration and braking.
Scenario identification: Euro NCAP: AEB C2C (CCRs, CCRm, CCRb, CCFtap) [17], AEB VRU (CPFA, CPNA, CPNC [50,51], CPLA, CPTA, CPRA, CBNA, CBNAO, CBFA, CBLA) [43]; ISO: 15623 [41], 19237 [52], 22078 [53], 22839 [54]; IIHS: AEB [48], P-AEB [49]; UNECE: 152 [38]; RuNCAP: AEB [55,56,57,58]
Scope and aim of the research: Tests related to lane keeping, driver lane departure warnings, semi-automatic lane changes, and emergency lane keeping systems. The aim of the test was to assess the system’s effectiveness in different driving conditions (types of road lines and specific obstacles on the road). Images from cameras and system interface signals were recorded. Measured values for the car: position, driving velocity, angular velocities, and steering wheel angular velocity.
Scenario identification: Euro NCAP: LSS (ELK, LKA, LDW) [28]; SAE: J2808 [47], J3045 [16]; ISO: 11270 [59], 17361 [40], 21202 [42]; UNECE: 130 [37], 157 [39]
Scope and aim of the research: Tests of the system responsible for informing about speed limits and of a system that adjusts the driving velocity to road limits. The aim of the test was to assess the system’s effectiveness for different road types and driving velocities. Road infrastructure (speed limit signs) and the signaling of restrictions via the AV interface were recorded.
Scenario identification: Euro NCAP: SAS (SLIF) [22]; EU 2021/1958 [18]
Scope and aim of the research: Tests of the vehicle’s blind spot monitoring system and lane change support system. The aim of the test was to assess the system’s effectiveness in various road maneuvers. Images from cameras and system interface signals about lane departure hazards were recorded. Measured values for the car: driving velocity, acceleration, and steering wheel angle.
Scenario identification: NHTSA: 812045 [44], 812317 [45]; ISO: 17387 [60]
Scope and aim of the research: Tests of the driver warning system for excessive driving velocity on a road curve. The aim of the test was to assess the correctness of the signaling of excessive velocity on the curve through the system interface. The arc radius and overspeed signaling via the system interface were recorded. Measured values for the car: position and driving velocity.
Scenario identification: ISO: 11067 [61]
Scope and aim of the research: Tests of the active cruise control system and the system that controls following the vehicle ahead at low speeds (traffic jam assistant). The aim of the test was to assess the system’s effectiveness in different driving modes. Vehicles detected ahead of the AV were recorded. Measured values for the car: AV motion parameters and the distance to the obstacle.
Scenario identification: ISO: 15622 [62], 22178 [63]
Scope and aim of the research: Tests of the assisted parking system, maneuvering aid systems, and the system for predefined routes for low-speed operation. The aim of the test was to assess the effectiveness of parking area detection, obstacle detection, scanning of the space around the vehicle, path planning, and control. Images from cameras, information about obstacles, and signals from the scanning sensors were recorded. Measured values for the car: position and distance to the obstacle.
Scenario identification: ISO: 16787 [64], 17386 [65], 22737 [66]
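Several of the AEB protocols listed above express their warning and intervention criteria through time-to-collision (TTC); a minimal constant-velocity TTC sketch (the 1.5 s threshold is illustrative and not taken from any of the cited regulations):

```python
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """Constant-velocity time-to-collision between a following car and a
    lead vehicle/target; returns None when the gap is not closing."""
    closing_speed = v_follow_ms - v_lead_ms
    if closing_speed <= 0.0:
        return None                    # no collision course
    return gap_m / closing_speed

# CCRm-like case: AV at 50 km/h approaching a soft target moving at 20 km/h
ttc = time_to_collision(gap_m=25.0, v_follow_ms=50 / 3.6, v_lead_ms=20 / 3.6)
brake_now = ttc is not None and ttc < 1.5   # illustrative AEB threshold (s)
```

With a 25 m gap and a 30 km/h closing speed, the TTC is 3.0 s, so an AEB system with this threshold would not yet intervene.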
Table 5. Examples of artificial objects and their roles in the research scenario.
Aim of the Research | Artificial Objects | Role in the Research | Publication
Avoiding obstacles, braking in front of obstacles | Cardboard boxes | Marking a road lane | [34]
Control of the vehicle braking process | Dummy parts of the rear of a car body (stationary or movable) | Imitation of a car on the road | [72,93,94]
2D and 3D obstacle identification | Road cones | Objects to be identified by the perception system | [67]
Critical maneuvers to avoid collisions with suddenly appearing obstacles | Soft wall covered with a metallized mirrored film | Obstacle on the AV’s driving path (stationary, mobile) | [55]
Critical maneuvers to avoid front-end collisions with suddenly appearing obstacles | Pedestrian, child, and bicyclist dummies on a moving platform; soft car target | Moving objects on a path intersecting the AV’s driving path | [51,56]
Defensive maneuvers before the collision; the lane change problem | Susceptible obstacle on a moving platform | Moving objects on a path intersecting the AV’s driving path | [17,28]
Table 6. Properties of sensors in AV perception systems [20].
Parameter | Camera | Thermal Camera | Radar | Lidar
Resolution | Good | Good | Fair | Fair
Illumination | Poor | Good | Good | Good
Weather | Fair | Good | Fair | Good
Cost | Good | Fair | Poor | Poor
Table 7. Cameras used in perception and control systems during AV research.
Research Scope of the Scenario | Type, Manufacturer, Model, and Selected Sensor Parameters | Publication
Lane keeping by AVs | Camera, Bosch (first generation, CMOS) | [32]
3D obstacle detection, braking and avoiding the obstacle | RGB camera (H: 48 deg) | [72]
2D obstacle detection, localization and object tracking, lane/road detection | Camera, Hitachi KP-F3 (R: 644 × 493 px) | [70]
3D obstacle detection | RGB camera (H: 90 deg, V: 30 deg, R: 1920 × 640 px) | [25]
2D and 3D obstacle detection, localization, tracking | Stereo camera | [21,23,67,77]
2D and 3D obstacle detection, localization, tracking, research of emergency braking/collision warning systems (car, pedestrian, bicyclist) | RGB camera | [20,26,50,68,69]
2D object detection, localization, tracking | Thermal camera | [20]
Research of emergency braking/collision warning systems (car, pedestrian, bicyclist) | Camera in the Subaru EyeSight system | [55]
Research of emergency braking/collision warning systems (car, pedestrian, bicyclist) | Camera in the Infinity FEB system | [56]
Research of emergency braking/collision warning systems (car, pedestrian, bicyclist); research of lane keeping/lane departure warning/semi-automatic lane change/emergency lane keeping systems; research of blind spot monitoring/lane change assist systems | RGB camera | [17,28,43,44,51]
Lane keeping by the AV, 3D object detection | Stereo camera, Stereolabs ZED (R: 672 × 376 px, f: 100 Hz, Range: 25 m) | [31,79,96]
Lane/road detection, 3D object detection | Camera, Sony PC-350 | [27]
3D object detection | Camera, Sekonix SF3321 (f: 30 Hz) | [24]
2D and 3D object detection, localization, tracking | Camera, Logitech HD Pro C920 (R: 1920 × 1080 px, f: 30 Hz); Camera, Giroptic HD (H: 360 deg, R: 2048 × 1024 px, f: 30 Hz); Thermal camera, FLIR A320 (R: 380 × 240 px, f: 9 Hz); Thermal camera, FLIR A65 (R: 640 × 512 px, f: 30 Hz); Thermal camera, Tonbo Imaging Inc. HawkVision (R: 640 × 480 px, f: 25 Hz); Camera, New Imaging Technology NSC1003 (CMOS, R: 1280 × 1024 px, f: 25 Hz); Camera, Point Grey Two Flea/FL3-GE-28S4C-C (R: 1928 × 1448 px, f: 15 Hz); Camera, Carnegie Robotics Multi Sense S21 (f: 30 Hz) | [95]
2D and 3D object detection, localization, tracking | Camera (H: 85 deg, R: 3840 × 2160 px, f: 30 Hz) | [75]
2D and 3D object detection, localization, tracking | Stereo camera, Point Grey Bumblebee XB3 (BBX3-13S2C-38) (R: 1280 × 960 px, f: 16 Hz, H: 66 deg); Camera, Point Grey Grasshopper2 (GS2-FW-14S5C-C) (R: 1024 × 1024 px, f: 11 Hz, H: 180 deg) | [74]
2D and 3D object detection | Logitech StreamCam (R: 1280 × 720 px, f: 60 Hz, H: 78 deg) | [97]
3D obstacle detection | Leopard Imaging AR023ZWDR | [89]
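The resolution (R) and horizontal field-of-view (H) parameters in Table 7 determine how much lateral detail a camera resolves at a given distance; a small pinhole-model sketch (lens distortion ignored):

```python
import math

def pixels_per_meter(h_fov_deg, h_res_px, distance_m):
    """Pixels available per lateral meter at a given distance, for a
    pinhole camera with horizontal FOV h_fov_deg and h_res_px columns."""
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    return h_res_px / scene_width_m

# Logitech StreamCam entry from Table 7: 1280 px across a 78 deg FOV
ppm_at_20m = pixels_per_meter(h_fov_deg=78, h_res_px=1280, distance_m=20.0)
```

At 20 m this works out to roughly 40 px per lateral meter, which bounds how far away small targets (e.g., pedestrian dummies) can still be classified reliably.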
Table 8. Radars used in perception and control systems during AV research.
Research Scope of the ScenarioType, Manufacturer, Model, and Selected Sensor ParametersPublication
Braking and avoiding the obstacleShort-range radar (H: 30 deg, Range: 30 m, f: 76.5 GHz);
Long-range radar (H: 20 deg, Range: 150 m, f: 76.5 GHz).
[71]
Braking and avoiding the obstacle, 3D object detectionRadar, Fujitsu Ten (Range: 120 m, H: 16 deg);
Radar, Mitsubishi (Range: 150 m, H: 12–16 deg);
Radar, Denso (Range: 150 m, H: 20 deg);
Radar, Nec (Range: 120 m, H: 16 deg);
Radar, Hitachi (Range: 120 m, H: 16 deg);
Radar, A.D.C. (Range: 150 m, H: 10 deg);
Radar, Bosch (Range: 150 m, H: 8 deg);
Radar, Autocruise (Range: 150 m, H: 12 deg);
Radar, Delphi (Range: 150 m, H:16 deg);
Radar, Eaton (Range: 150 m, H:12 deg);
Radar, Visteon (Range: 150 m, H: 12 deg).
[72]
Research into emergency braking/collision warning systems (car, pedestrian, bicyclist)Radar, Continental ARS510 (H: 9 deg, Range: 120 m);
Radar, Continental SRR510 (H: 90 deg, Range: 30 m).
[17,43,50,51,55,57]
Research into emergency braking/collision warning systems (car, pedestrian, bicyclist)Radar in the Infinity FEB system.[56]
3D object detectionRadar, Delphi ESR 64.[95]
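The H (horizontal field of view) and Range columns above define a simple coverage cone for each radar. As an illustration (a planar-geometry sketch of our own, assuming the FOV is symmetric about the boresight), one can check whether a target position lies inside a given radar's coverage:

```python
import math

def target_in_radar_coverage(x_m, y_m, fov_deg, range_m):
    """Check whether a point target at (x, y) in the radar frame
    (x forward, y left) lies inside the radar's horizontal FOV cone
    and maximum range, using the table's H and Range columns."""
    distance = math.hypot(x_m, y_m)
    bearing = math.degrees(math.atan2(y_m, x_m))
    return distance <= range_m and abs(bearing) <= fov_deg / 2

# Continental SRR510 short-range radar from the table: H: 90 deg, Range: 30 m
print(target_in_radar_coverage(20.0, 5.0, 90, 30))   # True
# Same target against the long-range ARS510 (H: 9 deg, Range: 120 m):
print(target_in_radar_coverage(20.0, 5.0, 9, 120))   # False (bearing ~14 deg)
```

This is why test vehicles combine a wide short-range radar with a narrow long-range one: the example target is visible only to the short-range unit.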
Table 9. Lidars used in perception and control systems during AV research.
Research Scope of the Scenario | Type, Manufacturer, Model, and Selected Sensor Parameters | Publication
Planning the driving path of an AV, 3D object detection | Lidar, Velodyne HDL-64E (H: 360 deg, Layers: 64, Range: 120 m). | [23,24,25,33,79]
Planning the driving path of an AV, braking and avoiding the obstacle | Lidar, Sick. | [34]
Braking and avoiding the obstacle, 3D object detection | Lidar, Mitsubishi (H: 12 deg, V: 4 deg, Range: 130 m); Lidar, Denso (H: 16 deg, V: 4.4 deg, Range: 120 m); Lidar, Denso (gen. II, H: 40 deg, V: 4.4 deg, Range: 120 m); Lidar, Nec (H: 20 deg, V: 3 deg, Range: 100 m); Lidar, Omron (H: 10.5 deg, V: 3.3 deg, Range: 150 m); Lidar, Omron (gen. II, H: 20–30 deg, V: 6.5 deg, Range: 150 m); Lidar, Kansei (H: 12 deg, V: 3.5 deg, Range: 120 m); Lidar, A.D.C. (H: 17 deg, Range: 150 m). | [72]
3D object detection | Lidar, Velodyne HDL-32E (H: 360 deg, Layers: 32, Range: 100 m, f: 10 Hz). | [67,78,95]
3D object detection | Lidar (H: 90 deg, Layers: 64). | [68]
3D object detection | Lidar, Ibeo Lux 8 (H: 110 deg, Layers: 8, Range: 50 m). | [76]
Research into blind spot monitoring/lane change assist systems for AVs | Lidar. | [44]
3D object detection | Lidar, Velodyne VLP-16 (H: 360 deg, Layers: 16, Range: 100 m). | [79,97]
3D object detection | Lidar, Hesai Pandar 40P (H: 360 deg, V: 40 deg, Layers: 40, Range: 200 m, f: 10 Hz). | [75]
3D object detection | Lidar, SICK LMS-151 2D (H: 270 deg, f: 50 Hz, Range: 50 m); Lidar, SICK LD-MRS 3D (H: 85 deg, V: 3.2 deg, Layers: 4, f: 12.5 Hz, Range: 50 m). | [74]
Planning the driving path of an AV, 3D object detection | Lidar, Sick LD-MRS (H: 94 deg, Layers: 8, Range: 200 m). | [19]
3D obstacle detection | Lidar, Velodyne VLS-128 (H: 360 deg, Layers: 128, Range: 245 m). | [89]
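For spinning lidars, the Layers, V (vertical FOV), and f (rotation rate) columns largely determine how densely the scene is sampled. The rough sketch below is our own illustration; it assumes uniform layer spacing (real sensors often concentrate layers near the horizon) and an assumed 0.2 deg horizontal resolution, which the table does not list:

```python
def lidar_scan_stats(layers, v_fov_deg, h_res_deg, rate_hz):
    """Rough scan statistics for a spinning lidar: vertical angular
    spacing between adjacent layers (assuming uniform spacing) and
    the approximate number of returns per second (single-return mode)."""
    v_spacing = v_fov_deg / (layers - 1)           # deg between adjacent layers
    pts_per_rev = layers * round(360 / h_res_deg)  # points in one 360 deg sweep
    return v_spacing, pts_per_rev * rate_hz

# Hesai Pandar 40P row: 40 layers, V: 40 deg, f: 10 Hz
v, pps = lidar_scan_stats(40, 40, 0.2, 10)
print(round(v, 2), pps)   # ~1.03 deg spacing, 720000 points/s
```

Such estimates matter when sizing the compute budget of the 3D detection pipelines the cited studies evaluate.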
Table 10. Other sensors used in perception and control systems during AV research.
Research Scope of the Scenario | Type, Manufacturer, Model, and Selected Sensor Parameters | Publication
Planning the driving path of an AV, 3D object detection | GPS. | [19,33]
Planning the driving path of an AV, braking and avoiding the obstacle | IMU. | [33,34]
Lane keeping by the AV, 2D and 3D object detection, localization, tracking | GNSS RTK, Swift Navigation. | [31,75]
2D and 3D object detection, localization, tracking | IMU, Vectornav VN-100; Single RTK GPS, Trimble AG GPS361; Differential RTK GPS, Trimble BD982 dual antenna. | [95]
2D and 3D object detection, localization, tracking | OXTS RT-Range. | [75]
2D and 3D object detection, localization, tracking | GPS/INS, NovAtel SPAN-CPT Align inertial and GPS navigation system; GPS/GLONASS dual antenna. | [74]
3D object detection | IMU, Xsens MTi-G. | [19]
2D and 3D object detection | RTK GNSS GPS, MRP-2000. | [97]
3D obstacle detection | GPS/IMU with RTK, NovAtel PwrPak7 dual antenna. | [89]
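The GNSS/RTK receivers above report geodetic coordinates, while track-test evaluation (e.g., lane-keeping error on a curvilinear track) is usually done in a local metric frame. A minimal sketch of such a conversion (an equirectangular approximation of our own, adequate over the few hundred metres of a test track but not for long routes):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def gnss_to_local_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
    """Project a GNSS fix to local east/north metres around a fixed
    reference point using an equirectangular (flat-Earth) approximation."""
    d_lat = math.radians(lat_deg - ref_lat_deg)
    d_lon = math.radians(lon_deg - ref_lon_deg)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(ref_lat_deg))
    north = EARTH_RADIUS_M * d_lat
    return east, north

# 0.001 deg of latitude corresponds to roughly 111 m of northing
e, n = gnss_to_local_xy(52.001, 21.000, 52.000, 21.000)
print(round(n, 1))   # ~111.2 m
```

Production GNSS/INS units such as those in the table perform far more rigorous geodetic transformations internally; this only illustrates the order of magnitude involved.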
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Prochowski, L.; Szwajkowski, P.; Ziubiński, M. Research Scenarios of Autonomous Vehicles, the Sensors and Measurement Systems Used in Experiments. Sensors 2022, 22, 6586. https://doi.org/10.3390/s22176586
