Article

A Novel V2V Cooperative Collision Warning System Using UWB/DR for Intelligent Vehicles

Institute of Intelligent Vehicles, School of Automotive Studies, Tongji University, No. 4800 Cao’an Highway, Jiading District, Shanghai 201804, China
* Authors to whom correspondence should be addressed.
Sensors 2021, 21(10), 3485; https://doi.org/10.3390/s21103485
Submission received: 11 April 2021 / Revised: 13 May 2021 / Accepted: 14 May 2021 / Published: 17 May 2021
(This article belongs to the Special Issue Advanced Sensing and Control for Connected and Automated Vehicles)

Abstract:
The collision warning system (CWS) plays an essential role in vehicle active safety. However, traditional distance-measuring solutions, e.g., millimeter-wave radars, ultrasonic radars, and lidars, fail to reflect vehicles' relative attitude and motion trends. In this paper, we propose a vehicle-to-vehicle (V2V) cooperative collision warning system (CCWS) consisting of an ultra-wideband (UWB) relative positioning/directing module and a dead reckoning (DR) module based on wheel-speed sensors. In the presented configuration, each vehicle carries four UWB modules on its body corners and two wheel-speed sensors on its rear wheels. An over-constrained localization method is proposed to calculate the relative position and orientation more accurately from the UWB data. Vehicle velocities and yaw rates are measured by the wheel-speed sensors. An extended Kalman filter (EKF) based on the relative kinematic model is applied to fuse the UWB and DR data. Finally, the time to collision (TTC) is estimated from the predicted vehicle collision position. Furthermore, through UWB signals, vehicles can simultaneously communicate with each other and share information, e.g., velocity and yaw rate, which brings the potential for enhanced real-time performance. Simulation and experimental results show that the proposed method significantly improves the positioning, directing, and velocity estimation accuracy, and that the proposed system can efficiently provide collision warnings.

1. Introduction

The Global Status Report on Road Safety 2018, launched by the WHO in December 2018, highlighted that the number of annual road traffic deaths had reached 1.35 million [1]. Two-vehicle and multi-vehicle collisions were the most severe types of accidents. Studies showed that more than 80% of road traffic accidents resulted from drivers' belated responses, and more than 65% of these were rear-end collisions [2]. Research indicates that more than 80% of accidents could have been averted if drivers had stayed focused and reacted correctly in the three seconds before the accident [3].
In recent years, more and more researchers have focused on advanced driver assistance systems (ADAS) to raise consumers' awareness of safety devices and to reduce the risk of accidents caused by careless driving. As an essential component of the collision warning system, the forward collision warning system (FCWS) measures the distance to the leading vehicle and warns the driver when the distance between vehicles falls below the safe distance. At present, FCWSs using active sensors, such as laser [4,5], radar [6], vision sensors [7,8,9], and infrared [10], have been widely studied. Sanberg et al. [11] presented a stereo-vision-based CWS suited for real-time execution in a car. Hernandez et al. [12] designed an object collision warning system for high-conflict vehicle-pedestrian zones using a laser. Coelingh et al. [13,14] proposed a collision avoidance and automatic braking system using a car equipped with radar and a camera. Srinivasa et al. [15] proposed an improved CWS combining data from a forward-looking camera and a radar. Although these sensors have high accuracy, they cannot work robustly in bad weather such as rain, snow, and fog, nor can they effectively identify dangerous vehicles in visual blind areas. Many advanced algorithms have been proposed to overcome these sensor defects [16,17]. However, such algorithms are usually limited to particular scenarios, e.g., lane changing [18] and turning [19].
The CCWS is an effective solution to this issue; it combines traditional CWS with vehicle-to-infrastructure (V2I) and V2V communication [20]. In a CCWS, the sensor defects of a single vehicle are compensated by acquiring information from other vehicles or from infrastructure. A V2V-based system shares information among the on-board units (OBU) of vehicles, whereas in V2I systems, accidents and hazardous events are detected by roadside units (RSU) and sent to the OBUs of vehicles [21]. Since vehicles can communicate directly through V2V without depending on infrastructure, V2V is more suitable for CWS than V2I. Yang et al. [22] proposed a novel FCWS that used license plate recognition and V2V communication to warn the drivers of both vehicles. Xiang et al. [23] proposed an FCWS based on dedicated short-range communication (DSRC) and the global positioning system (GPS). Yang et al. [24] proposed an FCWS combining the differential global positioning system (DGPS) and DSRC. Patra et al. [25] proposed a novel FCWS in which GPS provides the relative positioning information and vehicles communicate through a vehicular network integrated with smartphones. In general, a CCWS can overcome the limitations of in-vehicle sensor-based CWSs by sharing information such as vehicle speed, location, and heading with surrounding vehicles. However, current V2V-based CWSs implement relative positioning and communication separately with different technologies, e.g., predicting collisions based on radars but communicating through Wi-Fi, which may degrade real-time performance.
To address this issue, a UWB-based CCWS seamlessly combines CWS and V2V without extra delay. UWB is a communication technology that uses nanosecond-level narrow pulses to transmit data and measure distances, and it has become an effective transmission technology in location-aware sensor networks [26]. Inherently, UWB-based ranging has high time resolution and can achieve centimeter-level accuracy [27]. UWB is also more adaptable to different environments than the traditional sensors used in CWSs [28]. There has been some research on UWB-based CCWS. Sun et al. [29] proposed a UWB/INS (inertial navigation system)-based collision avoidance system for automatic guided vehicles (AGV). Liu et al. [30] designed a vehicle collision avoidance system based on UWB wireless sensor networks. Pittokopiti and Grammenos [31] used UWB to obtain distance information and calculated the collision time to provide collision warnings for construction workers. Kianfar et al. [32] presented a CWS for underground mines, which predicted collisions using the distances between workers and the mining vehicle measured by UWB. In summary, existing UWB-based CCWSs follow two main technical routes, based on absolute positioning and relative positioning, respectively. The former is hard to popularize due to the small coverage area and high cost of base stations. For the latter, most existing research considers only the relative distance between targets rather than their positions and ignores information such as relative velocity and orientation.
To deal with the above problems, a CCWS based on UWB and DR is proposed in this paper. In the proposed system, relative positioning and communication are implemented by UWB simultaneously, which contributes to better real-time performance. Four UWB modules are installed on each vehicle, which makes it possible to calculate not only two-dimensional (2D) relative positions but also relative orientations. An over-constrained method is proposed to improve the positioning/directing accuracy. The accuracy and stability of the system are then further improved, and the TTC can be estimated, through the integration of DR.
This paper is organized as follows: Section 2 introduces the three subsystems of the CCWS. Section 3 presents a simulation to evaluate the performance of the system. Section 4 describes the experiments and analyzes the results. Finally, Section 5 summarizes the conclusions.

2. Algorithm and Modeling

The CWS consists of three parts: the UWB-based relative positioning and directing system, the DR system based on wheel-speed sensors, and the TTC estimation system. In the following sections, the UWB-based relative positioning/directing system is shortened to the UWB system. In this section, the UWB and DR subsystems are established. Then, an EKF-based fusion algorithm is proposed to integrate UWB with DR, which significantly improves the accuracy of the relative position, orientation, and velocity. Finally, the TTC estimation method for several different collision scenarios is put forward.

2.1. The Relative Positioning and Directing System

According to the vehicle axis system regulated by ISO 8855:2011 [33] and shown in Figure 1, the origin is located at the center of the rear axle, the X-axis points forward, and the Y-axis points to the left. All systems proposed in this paper are established on this axis system.
Figure 2 shows the UWB system model. XOY represents the coordinate system of vehicle 1, and X′O′Y′ represents the coordinate system of vehicle 2. Points 1, 2, 3, and 4 represent the UWB modules on vehicle 1, and points M, N, P, and Q represent the UWB modules on vehicle 2. The coordinates of each UWB module in its own vehicle axis system are known at installation. As shown in Figure 2, $X_K = [x_K, y_K]^T$ is defined as the position of module K in the axis system of vehicle 1, and $X'_K = [x'_K, y'_K]^T$ as its position in the axis system of vehicle 2, where $K \in \{1, 2, 3, 4, M, N, P, Q, C, O, O'\}$.
With the distances measured by UWB and the coordinates of the UWB modules, the relative position and orientation $[x, y, \beta]^T$ can be calculated. Here, $[x, y]^T$ is the position of vehicle 2 in the axis system of vehicle 1, and β is the relative orientation, i.e., the intersection angle of the two vehicles' driving directions.
As the ranging precision of UWB is very sensitive to non-line-of-sight (NLOS) conditions, not all UWB modules can be used at the same time. Therefore, only four modules, two on each vehicle, in line-of-sight (LOS) condition are selected at any time; the other modules are used to help distinguish multiple solutions. Thanks to the high time resolution and low multipath effect of UWB signals, distinguishing NLOS from LOS signals is not complex.
Figure 2 shows a typical driving scenario: vehicle 2 is changing lanes to move in front of vehicle 1. Apparently, a rear-end collision risk exists if vehicle 1 drives faster than vehicle 2 and does not brake. Since the CWS is especially necessary in this condition, we take it as an example to explain our algorithm. In this case, points 1, 2, M, and N are in LOS. Define $d_1$, $d_2$, $d_3$, and $d_4$ as the real distances shown in Figure 2, and $\hat{d}_1$, $\hat{d}_2$, $\hat{d}_3$, and $\hat{d}_4$ as the corresponding UWB range measurements. Other known parameters include $X_1 = [x_1, y_1]^T$, $X_2 = [x_2, y_2]^T$, $X'_M = [x'_M, y'_M]^T$, and $X'_N = [x'_N, y'_N]^T$. Then, we have
$$\begin{cases} d_1 = \sqrt{(x_M - x_1)^2 + (y_M - y_1)^2} \\ d_2 = \sqrt{(x_M - x_2)^2 + (y_M - y_2)^2} \\ d_3 = \sqrt{(x_N - x_1)^2 + (y_N - y_1)^2} \\ d_4 = \sqrt{(x_N - x_2)^2 + (y_N - y_2)^2} \end{cases} \tag{1}$$
As $d_1$, $d_2$, $d_3$, and $d_4$ are unknown, $\hat{d}_1$, $\hat{d}_2$, $\hat{d}_3$, and $\hat{d}_4$ are substituted into Equation (1), yielding the estimated positions of M and N, $\hat{X}_M = [\hat{x}_M, \hat{y}_M]^T$ and $\hat{X}_N = [\hat{x}_N, \hat{y}_N]^T$. Then, the estimated distance between M and N can be calculated by Equation (2).
$$\hat{d}_5 = \sqrt{(\hat{x}_M - \hat{x}_N)^2 + (\hat{y}_M - \hat{y}_N)^2} \tag{2}$$
However, once the UWB modules are installed, the real distance between M and N is a fixed constant, which can be calculated by Equation (3).
$$d_5 = \sqrt{(x'_M - x'_N)^2 + (y'_M - y'_N)^2} \tag{3}$$
When ranging error exists, $\hat{d}_5 \neq d_5$. In order to obtain the least squares (LS) solution that best fits all the measured distances, we rewrite Equation (1) as Equation (4).
$$\begin{cases} d_1 = \sqrt{(x_M - x_1)^2 + (y_M - y_1)^2} \\ d_2 = \sqrt{(x_M - x_2)^2 + (y_M - y_2)^2} \\ d_3 = \sqrt{(x_N - x_1)^2 + (y_N - y_1)^2} \\ d_4 = \sqrt{(x_N - x_2)^2 + (y_N - y_2)^2} \\ d_5 = \sqrt{(x_M - x_N)^2 + (y_M - y_N)^2} \end{cases} \tag{4}$$
Notably, this is an overdetermined nonlinear equation set with five equations and four unknowns; when ranging error exists, it has no exact solution. We therefore define the function g as shown in Equation (5).
$$\begin{aligned} g(x_M, y_M, x_N, y_N) ={} & \left(\hat{d}_1 - \sqrt{(x_M - x_1)^2 + (y_M - y_1)^2}\right)^2 + \left(\hat{d}_2 - \sqrt{(x_M - x_2)^2 + (y_M - y_2)^2}\right)^2 \\ & + \left(\hat{d}_3 - \sqrt{(x_N - x_1)^2 + (y_N - y_1)^2}\right)^2 + \left(\hat{d}_4 - \sqrt{(x_N - x_2)^2 + (y_N - y_2)^2}\right)^2 \\ & + \left(\hat{d}_5 - \sqrt{(x_M - x_N)^2 + (y_M - y_N)^2}\right)^2 \end{aligned} \tag{5}$$
Then, the positioning algorithm is converted into an optimization problem with objective function g. According to the first-order necessary condition of optimality, the partial derivatives of g should be zero, that is,
$$\frac{\partial g}{\partial x_M} = \frac{\partial g}{\partial y_M} = \frac{\partial g}{\partial x_N} = \frac{\partial g}{\partial y_N} = 0 \tag{6}$$
Several sets of locally optimal solutions may be derived from Equation (6). Define $[x^*_M, y^*_M, x^*_N, y^*_N]$ as the global LS solution that minimizes the objective function g. Then, we have
$$\left[x^*_M, y^*_M, x^*_N, y^*_N\right]^T = \underset{x_M,\, y_M,\, x_N,\, y_N}{\operatorname{argmin}}\ g(x_M, y_M, x_N, y_N) \tag{7}$$
The solutions of Equation (7) are much more accurate than those of Equation (1), as will be demonstrated by simulation in Section 3. When no real solution can be obtained from Equation (7), we fall back to the solutions of Equation (1).
In the example scenario, we obtain two sets of solutions that are symmetric about the line through points 1 and 2, as shown in Figure 3. To resolve this ambiguity, ranging information between other UWB modules can be used; for example, in Figure 3, the distances $\overline{M4}$ and $\overline{Q2}$ can be used to distinguish the two sets of solutions.
After $[x^*_M, y^*_M, x^*_N, y^*_N]$ is solved, the relative orientation β and position $[x, y]^T$ can be derived as
$$\beta = \operatorname{atan2}\left(y^*_M - y^*_N,\ x^*_M - x^*_N\right) - \frac{\pi}{2}, \qquad \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} \bar{x}^* \\ \bar{y}^* \end{bmatrix} - \begin{bmatrix} \cos\beta & -\sin\beta \\ \sin\beta & \cos\beta \end{bmatrix} \begin{bmatrix} \bar{x}' \\ \bar{y}' \end{bmatrix} \tag{8}$$

where $\begin{bmatrix} \bar{x}^* \\ \bar{y}^* \end{bmatrix} = \dfrac{1}{2}\begin{bmatrix} x^*_M + x^*_N \\ y^*_M + y^*_N \end{bmatrix}$, $\begin{bmatrix} \bar{x}' \\ \bar{y}' \end{bmatrix} = \dfrac{1}{2}\begin{bmatrix} x'_M + x'_N \\ y'_M + y'_N \end{bmatrix}$, and $\operatorname{atan2}(y, x) = 2\arctan\dfrac{y}{\sqrt{x^2 + y^2} + x}$.
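To make the over-constrained step concrete, the following Python sketch minimizes g of Equation (5) with a generic least-squares routine and recovers the relative pose via Equation (8). The module coordinates, measured ranges, and initial guess are illustrative assumptions rather than values from the paper.

```python
# A minimal sketch of Equations (5)-(8); module layout and ranges are assumed.
import numpy as np
from scipy.optimize import least_squares

X1, X2 = np.array([1.8, 0.8]), np.array([1.8, -0.8])      # modules 1, 2 on vehicle 1
XMp, XNp = np.array([-0.9, 0.8]), np.array([-0.9, -0.8])  # M, N in vehicle 2's own frame
d5 = np.linalg.norm(XMp - XNp)                            # known constant of Eq. (3)

def residuals(p, d_hat):
    """Residuals of the overdetermined set in Eq. (4)."""
    M, N = p[:2], p[2:]
    return np.array([
        d_hat[0] - np.linalg.norm(M - X1),
        d_hat[1] - np.linalg.norm(M - X2),
        d_hat[2] - np.linalg.norm(N - X1),
        d_hat[3] - np.linalg.norm(N - X2),
        d5       - np.linalg.norm(M - N),   # the extra d5 constraint
    ])

def solve_pose(d_hat, p0):
    """Minimize g (Eq. (5)) and recover [x, y, beta] via Eq. (8)."""
    xm, ym, xn, yn = least_squares(residuals, p0, args=(d_hat,)).x
    beta = np.arctan2(ym - yn, xm - xn) - np.pi / 2
    R = np.array([[np.cos(beta), -np.sin(beta)],
                  [np.sin(beta),  np.cos(beta)]])
    mid, mid_p = np.array([(xm + xn) / 2, (ym + yn) / 2]), (XMp + XNp) / 2
    return mid - R @ mid_p, beta

xy, beta = solve_pose(np.array([10.2, 10.3, 9.1, 9.4]), p0=np.array([8.0, 1.0, 8.0, -1.0]))
```

In practice, the initial guess can come from the previous time step or from the closed-form solution of Equation (1), and the mirrored second solution is rejected with the extra module distances as described above.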

2.2. The DR System Based on Wheel-Speed Sensors

The proposed DR system uses four wheel-speed sensors, two installed on the rear wheels of each vehicle. According to the Ackermann steering model shown in Figure 4, the instantaneous center of rotation of a vehicle lies on the extension of the rear axle. The velocity v, yaw rate ω, and turning radius r can be derived as shown in Equation (9).
$$\begin{cases} v = \dfrac{v_r + v_l}{2} \\[4pt] \omega = \dfrac{v_r - v_l}{L} \\[4pt] r = \dfrac{v}{\omega} \end{cases} \tag{9}$$
where $v_r$ denotes the speed of the right wheel, $v_l$ the speed of the left wheel, and L the rear track width, i.e., the distance between the two rear wheels.
Then, the position $[x_{t+\Delta t}, y_{t+\Delta t}]^T$ and yaw angle $yaw_{t+\Delta t}$ of the vehicle in the global axis system at time $t + \Delta t$ can be reckoned from $[x_t, y_t]^T$, $v_t$, $\omega_t$, and $yaw_t$ at time t, as shown in Equation (10).
$$\begin{bmatrix} x_{t+\Delta t} \\ y_{t+\Delta t} \\ yaw_{t+\Delta t} \end{bmatrix} = \begin{bmatrix} x_t + v_t \Delta t \cos(yaw_t) \\ y_t + v_t \Delta t \sin(yaw_t) \\ yaw_t + \omega_t \Delta t \end{bmatrix} \tag{10}$$
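The DR subsystem is only a few lines of code. The sketch below implements Equations (9) and (10); the wheel speeds, track width L, and time step are illustrative assumptions.

```python
# A minimal sketch of the DR propagation in Eqs. (9) and (10).
import math

def dr_step(x, y, yaw, v_r, v_l, L, dt):
    """Advance the pose one step using rear wheel speeds (Ackermann model)."""
    v = (v_r + v_l) / 2.0        # vehicle velocity, Eq. (9)
    omega = (v_r - v_l) / L      # yaw rate, Eq. (9)
    x += v * dt * math.cos(yaw)  # Eq. (10)
    y += v * dt * math.sin(yaw)
    yaw += omega * dt
    return x, y, yaw

pose = (0.0, 0.0, 0.0)
for _ in range(100):             # 1 s of 10 ms steps with assumed wheel speeds
    pose = dr_step(*pose, v_r=5.2, v_l=5.0, L=1.6, dt=0.01)
```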

2.3. The EKF-Based UWB/DR Fusion Model

We define $X_k$ as the state vector at time k. It contains the relative position/orientation $P_k = [x_k, y_k, \beta_k]^T$ as well as the yaw rates and velocities of the two vehicles, $S_k = [\omega_{1k}, \omega_{2k}, v_{1k}, v_{2k}]^T$, and can be expressed as Equation (11).
$$X_k = \left[x_k, y_k, \beta_k, \omega_{1k}, \omega_{2k}, v_{1k}, v_{2k}\right]^T \tag{11}$$
We define Δt as the time period from time k − 1 to time k. $X_k$ can be predicted from $X_{k-1}$ based on the relative kinematic model shown in Figure 5. The state equation can be expressed, on the basis of Equation (10), as Equation (12).
Figure 5. The relative kinematic model.
$$X_k = \begin{bmatrix} x_k \\ y_k \\ \beta_k \\ \omega_{1k} \\ \omega_{2k} \\ v_{1k} \\ v_{2k} \end{bmatrix} = f(X_{k-1}, W) = \begin{bmatrix} C\cos\theta + D\sin\theta \\ -C\sin\theta + D\cos\theta \\ \beta_{k-1} - \theta + \omega_{2,k-1}\Delta t + \frac{1}{2}W_{\omega_2}\Delta t^2 \\ \omega_{1,k-1} + W_{\omega_1}\Delta t \\ \omega_{2,k-1} + W_{\omega_2}\Delta t \\ v_{1,k-1} + W_{v_1}\Delta t \\ v_{2,k-1} + W_{v_2}\Delta t \end{bmatrix} \tag{12}$$

where

$$C = x_{k-1} - v_{1,k-1}\Delta t + v_{2,k-1}\cos(\beta_{k-1})\Delta t - W_{v_1}\Delta t^2/2 + W_{v_2}\cos(\beta_{k-1})\Delta t^2/2,$$

$$D = y_{k-1} + v_{2,k-1}\sin(\beta_{k-1})\Delta t + W_{v_2}\sin(\beta_{k-1})\Delta t^2/2,$$

$$\theta = \omega_{1,k-1}\Delta t + W_{\omega_1}\Delta t^2/2.$$
Then, the transition matrix A of the state vector can be derived as Equation (13).
$$A = \frac{\partial f}{\partial X} = \begin{bmatrix} \cos\theta & \sin\theta & A_{1,3} & A_{1,4} & 0 & -\cos\theta\,\Delta t & A_{1,7} \\ -\sin\theta & \cos\theta & A_{2,3} & A_{2,4} & 0 & \sin\theta\,\Delta t & A_{2,7} \\ 0 & 0 & 1 & -\Delta t & \Delta t & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix} \tag{13}$$

where

$$A_{1,3} = \left(-\cos\theta\sin\beta_{k-1} + \sin\theta\cos\beta_{k-1}\right) v_{2,k-1}\Delta t, \qquad A_{1,4} = \left(-C\sin\theta + D\cos\theta\right)\Delta t,$$

$$A_{1,7} = \left(\cos\theta\cos\beta_{k-1} + \sin\theta\sin\beta_{k-1}\right)\Delta t,$$

$$A_{2,3} = \left(\sin\theta\sin\beta_{k-1} + \cos\theta\cos\beta_{k-1}\right) v_{2,k-1}\Delta t, \qquad A_{2,4} = \left(-C\cos\theta - D\sin\theta\right)\Delta t,$$

$$A_{2,7} = \left(-\sin\theta\cos\beta_{k-1} + \cos\theta\sin\beta_{k-1}\right)\Delta t.$$
Similarly, the transition matrix of process noise is:
$$G = \frac{\partial f}{\partial W} = \begin{bmatrix} G_{1,1} & 0 & G_{1,3} & G_{1,4} \\ G_{2,1} & 0 & G_{2,3} & G_{2,4} \\ -\Delta t^2/2 & \Delta t^2/2 & 0 & 0 \\ \Delta t & 0 & 0 & 0 \\ 0 & \Delta t & 0 & 0 \\ 0 & 0 & \Delta t & 0 \\ 0 & 0 & 0 & \Delta t \end{bmatrix} \tag{14}$$

where

$$G_{1,1} = \left(-C\sin\theta + D\cos\theta\right)\Delta t^2/2, \qquad G_{1,3} = -\cos\theta\,\Delta t^2/2,$$

$$G_{1,4} = \left(\cos\theta\cos\beta_{k-1} + \sin\theta\sin\beta_{k-1}\right)\Delta t^2/2,$$

$$G_{2,1} = \left(-C\cos\theta - D\sin\theta\right)\Delta t^2/2, \qquad G_{2,3} = \sin\theta\,\Delta t^2/2,$$

$$G_{2,4} = \left(-\sin\theta\cos\beta_{k-1} + \cos\theta\sin\beta_{k-1}\right)\Delta t^2/2.$$
The error covariance matrix Q of process noise consists of error covariances of speeds and yaw rates, that is:
$$Q = \operatorname{cov}(W) = \begin{bmatrix} \sigma^2_{\omega_1} & 0 & 0 & 0 \\ 0 & \sigma^2_{\omega_2} & 0 & 0 \\ 0 & 0 & \sigma^2_{v_1} & 0 \\ 0 & 0 & 0 & \sigma^2_{v_2} \end{bmatrix} \tag{15}$$
Thus, the prediction step of the model is:
$$\hat{X}^-_k = f\left(\hat{X}_{k-1}\right), \qquad P^-_k = A P_{k-1} A^T + G Q G^T \tag{16}$$
We define $Z_k$ as the observation vector, which contains the relative position and orientation of vehicle 2 measured by the UWB system and the four wheel speeds measured by the DR system; $V_k$ denotes the observation noise. The observation equation can be expressed as Equation (17).
$$Z_k = \left[x_{UWB,k},\ y_{UWB,k},\ \beta_{UWB,k},\ v_{r1,k},\ v_{l1,k},\ v_{r2,k},\ v_{l2,k}\right]^T = H X_k + V_k \tag{17}$$
Referring to Equation (9), the wheel speeds measured by the wheel-speed sensors can be expressed in terms of the velocities and yaw rates of the two vehicles, as in Equation (18).
$$\begin{bmatrix} v_{r1} \\ v_{l1} \\ v_{r2} \\ v_{l2} \end{bmatrix} = \begin{bmatrix} L_1/2 & 0 & 1 & 0 \\ -L_1/2 & 0 & 1 & 0 \\ 0 & L_2/2 & 0 & 1 \\ 0 & -L_2/2 & 0 & 1 \end{bmatrix} \begin{bmatrix} \omega_1 \\ \omega_2 \\ v_1 \\ v_2 \end{bmatrix} \tag{18}$$
Then, the Jacobian matrix H is obtained as Equation (19).
$$H = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & L_1/2 & 0 & 1 & 0 \\ 0 & 0 & 0 & -L_1/2 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & L_2/2 & 0 & 1 \\ 0 & 0 & 0 & 0 & -L_2/2 & 0 & 1 \end{bmatrix} \tag{19}$$
The update step is:
$$K_k = P^-_k H^T \left(H P^-_k H^T + R\right)^{-1}, \qquad \hat{X}_k = \hat{X}^-_k + K_k\left(Z_k - H\hat{X}^-_k\right), \qquad P_k = P^-_k - K_k H P^-_k \tag{20}$$
In Equation (20), R represents the error covariance matrix of Zk. It can be divided into the error covariance matrix of the UWB system RUWB and the error covariance matrix of the DR system RDR. That is:
$$R = \operatorname{cov}(V_k) = \begin{bmatrix} R_{UWB} & 0 \\ 0 & R_{DR} \end{bmatrix} \tag{21}$$

where $R_{UWB} = \operatorname{diag}\left(\sigma^2_x, \sigma^2_y, \sigma^2_\beta\right)$ and $R_{DR} = \operatorname{diag}\left(\sigma^2_{v_{r1}}, \sigma^2_{v_{l1}}, \sigma^2_{v_{r2}}, \sigma^2_{v_{l2}}\right)$.
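As a sketch of how Equations (12), (16), and (19)-(21) fit together, the following Python prototype runs one EKF cycle. It approximates the analytic Jacobians A and G of Equations (13) and (14) by finite differences, which is a simplification of, not a substitute for, the closed forms above; the track widths and noise levels are assumed placeholder values.

```python
# A minimal EKF sketch for the UWB/DR fusion model (assumed parameters).
import numpy as np

def f(x, w, dt):
    """State transition of Eq. (12); w = [W_w1, W_w2, W_v1, W_v2]."""
    X, Y, beta, w1, w2, v1, v2 = x
    Ww1, Ww2, Wv1, Wv2 = w
    th = w1 * dt + Ww1 * dt**2 / 2
    C = (X - v1 * dt + v2 * np.cos(beta) * dt
         - Wv1 * dt**2 / 2 + Wv2 * np.cos(beta) * dt**2 / 2)
    D = Y + v2 * np.sin(beta) * dt + Wv2 * np.sin(beta) * dt**2 / 2
    return np.array([C * np.cos(th) + D * np.sin(th),
                     -C * np.sin(th) + D * np.cos(th),
                     beta - th + w2 * dt + Ww2 * dt**2 / 2,
                     w1 + Ww1 * dt, w2 + Ww2 * dt,
                     v1 + Wv1 * dt, v2 + Wv2 * dt])

def jacobian(fun, x0, eps=1e-6):
    """Forward-difference Jacobian, standing in for the analytic A and G."""
    y0 = fun(x0)
    J = np.zeros((y0.size, x0.size))
    for i in range(x0.size):
        xi = x0.copy()
        xi[i] += eps
        J[:, i] = (fun(xi) - y0) / eps
    return J

def ekf_step(x, P, z, Q, R, H, dt):
    """One predict/update cycle per Eqs. (16) and (20)."""
    w0 = np.zeros(4)
    A = jacobian(lambda s: f(s, w0, dt), x)
    G = jacobian(lambda w: f(x, w, dt), w0)
    x_pred = f(x, w0, dt)
    P_pred = A @ P @ A.T + G @ Q @ G.T
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    return x_pred + K @ (z - H @ x_pred), P_pred - K @ H @ P_pred

# Observation matrix of Eq. (19); L1, L2 are assumed rear track widths.
L1 = L2 = 1.6
H = np.zeros((7, 7))
H[0, 0] = H[1, 1] = H[2, 2] = 1.0
H[3, 3], H[3, 5] = L1 / 2, 1.0
H[4, 3], H[4, 5] = -L1 / 2, 1.0
H[5, 4], H[5, 6] = L2 / 2, 1.0
H[6, 4], H[6, 6] = -L2 / 2, 1.0
Q = np.diag([np.deg2rad(5)**2] * 2 + [0.2**2] * 2)  # Eq. (15), assumed sigmas
```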
$R_{DR}$ is determined directly by the measurement errors of the wheel-speed sensors, whereas $R_{UWB}$ is determined by the positioning and directing errors, which in turn depend on the ranging errors of the UWB modules. Define $D = [d_1, d_2, d_3, d_4]^T$. On the basis of Equation (4), we can derive the relationship between the deviation of D and the deviations of the UWB module positions $X_M$ and $X_N$ as Equation (22); $d_5$ is excluded because it is not a measurement but a constant, which means $\mathrm{d}d_5 = 0$.
$$\mathrm{d}D = \begin{bmatrix} \mathrm{d}d_1 \\ \mathrm{d}d_2 \\ \mathrm{d}d_3 \\ \mathrm{d}d_4 \end{bmatrix} = \frac{\partial D}{\partial (x_M, y_M, x_N, y_N)} \begin{bmatrix} \mathrm{d}x_M \\ \mathrm{d}y_M \\ \mathrm{d}x_N \\ \mathrm{d}y_N \end{bmatrix} = F_D \begin{bmatrix} \mathrm{d}x_M \\ \mathrm{d}y_M \\ \mathrm{d}x_N \\ \mathrm{d}y_N \end{bmatrix} \tag{22}$$
$F_D$ can be derived as Equation (23).

$$F_D = \begin{bmatrix} \dfrac{x_M - x_1}{d_1} & \dfrac{y_M - y_1}{d_1} & 0 & 0 \\ \dfrac{x_M - x_2}{d_2} & \dfrac{y_M - y_2}{d_2} & 0 & 0 \\ 0 & 0 & \dfrac{x_N - x_1}{d_3} & \dfrac{y_N - y_1}{d_3} \\ 0 & 0 & \dfrac{x_N - x_2}{d_4} & \dfrac{y_N - y_2}{d_4} \end{bmatrix} \tag{23}$$

where $F_D$ is evaluated at $d_i = \hat{d}_i$ and $x_M = x^*_M$, $y_M = y^*_M$, $x_N = x^*_N$, $y_N = y^*_N$.
From Equation (8), we can obtain the relationship between the deviation of the relative vehicle pose $X_{UWB} = [x, y, \beta]^T$ and the deviations of the UWB module positions $X_M$ and $X_N$ as Equation (24).
$$\mathrm{d}X_{UWB} = \begin{bmatrix} \mathrm{d}x \\ \mathrm{d}y \\ \mathrm{d}\beta \end{bmatrix} = \frac{\partial X_{UWB}}{\partial (x_M, y_M, x_N, y_N)} \begin{bmatrix} \mathrm{d}x_M \\ \mathrm{d}y_M \\ \mathrm{d}x_N \\ \mathrm{d}y_N \end{bmatrix} = F_{X_{UWB}} \begin{bmatrix} \mathrm{d}x_M \\ \mathrm{d}y_M \\ \mathrm{d}x_N \\ \mathrm{d}y_N \end{bmatrix} \tag{24}$$
$F_{X_{UWB}}$ can be derived as Equation (25).

$$F_{X_{UWB}} = \begin{bmatrix} F_{1,1} & F_{1,2} & F_{1,3} & F_{1,4} \\ F_{2,1} & F_{2,2} & F_{2,3} & F_{2,4} \\ -\dfrac{y_M - y_N}{d_5^2} & \dfrac{x_M - x_N}{d_5^2} & \dfrac{y_M - y_N}{d_5^2} & -\dfrac{x_M - x_N}{d_5^2} \end{bmatrix} \tag{25}$$

where

$$F_{1,1} = \frac{1}{2} + \left[\bar{x}'_{MN}\cos\beta - \bar{y}'_{MN}\sin\beta\right]\frac{y_M - y_N}{d_5^2}, \qquad F_{1,2} = -\left[\bar{x}'_{MN}\cos\beta - \bar{y}'_{MN}\sin\beta\right]\frac{x_M - x_N}{d_5^2},$$

$$F_{1,3} = \frac{1}{2} - \left[\bar{x}'_{MN}\cos\beta - \bar{y}'_{MN}\sin\beta\right]\frac{y_M - y_N}{d_5^2}, \qquad F_{1,4} = \left[\bar{x}'_{MN}\cos\beta - \bar{y}'_{MN}\sin\beta\right]\frac{x_M - x_N}{d_5^2},$$

$$F_{2,1} = \left[\bar{y}'_{MN}\cos\beta + \bar{x}'_{MN}\sin\beta\right]\frac{y_M - y_N}{d_5^2}, \qquad F_{2,2} = \frac{1}{2} - \left[\bar{y}'_{MN}\cos\beta + \bar{x}'_{MN}\sin\beta\right]\frac{x_M - x_N}{d_5^2},$$

$$F_{2,3} = -\left[\bar{y}'_{MN}\cos\beta + \bar{x}'_{MN}\sin\beta\right]\frac{y_M - y_N}{d_5^2}, \qquad F_{2,4} = \frac{1}{2} + \left[\bar{y}'_{MN}\cos\beta + \bar{x}'_{MN}\sin\beta\right]\frac{x_M - x_N}{d_5^2},$$

$$\bar{x}'_{MN} = \frac{x'_M + x'_N}{2}, \qquad \bar{y}'_{MN} = \frac{y'_M + y'_N}{2},$$

evaluated at $x_M = x^*_M$, $y_M = y^*_M$, $x_N = x^*_N$, $y_N = y^*_N$.
Then, RUWB can be expressed as Equation (26).
$$R_{UWB} = F_{X_{UWB}} \left(F_D^T F_D\right)^{-1} F_D^T R_D F_D \left(F_D^T F_D\right)^{-1} F_{X_{UWB}}^T \tag{26}$$

where $R_D = \operatorname{diag}\left(\sigma^2_{d_1}, \sigma^2_{d_2}, \sigma^2_{d_3}, \sigma^2_{d_4}\right)$ is determined directly by the UWB ranging error covariance.
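Once $F_D$ and $F_{X_{UWB}}$ have been evaluated at the LS solution, Equation (26) is a single matrix expression. A sketch follows, assuming a 5 cm ranging standard deviation.

```python
# Error propagation of Eq. (26); the ranging sigma below is an assumption.
import numpy as np

def propagate_uwb_cov(F_xuwb, F_d, R_d):
    """R_UWB = F_XUWB (F_D^T F_D)^-1 F_D^T R_D F_D (F_D^T F_D)^-1 F_XUWB^T."""
    pinv = np.linalg.inv(F_d.T @ F_d) @ F_d.T   # left pseudo-inverse of F_D
    return F_xuwb @ pinv @ R_d @ pinv.T @ F_xuwb.T

R_d = np.diag([0.05**2] * 4)                    # 5 cm UWB ranging std, assumed
```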

2.4. The Collision Warning Model

A CWS mainly works in one of two ways: headway measurement warning (HMW) and TTC-based warning [34]. Both need to measure the distance to the front vehicle, but they estimate the collision time using different speeds, as in Equation (27).
$$\text{Headway Collision Time} = \frac{\text{Headway}}{v_{\text{RearVehicle}}}, \qquad TTC = \frac{\text{Headway}}{v_{\text{RearVehicle}} - v_{\text{FrontVehicle}}} \tag{27}$$
The TTC-based method takes the relative velocity into account, so it provides a more accurate collision warning. In the proposed system, vehicles share information such as velocities through UWB, so the TTC method is clearly the better choice.
Two vehicles driving on the road may collide in various ways, such as head-on, rear-end, and side collisions. Different kinds of collisions may happen at different times, which means all cases need to be considered in order to obtain the exact TTC. Before establishing the collision warning model, we simplify the shape of a vehicle to a rectangle. With this assumption, all kinds of collisions can be described as point-to-edge collisions: edge-to-edge and point-to-point collisions are covered by point-to-edge collisions as degenerate cases, as shown in Figure 6.
After unifying the different collision types, the TTC can be calculated in a uniform way. We take the collision type shown in Figure 7 as an example, in which the front left corner of vehicle 2 collides with the right edge of vehicle 1. As defined in Section 2.1, the coordinates of a point in the axis system of vehicle 1 are expressed as $X_k = [x_k, y_k]^T$, and as $X'_k = [x'_k, y'_k]^T$ in the axis system of vehicle 2. $R_i$ (i = 1, 2, 3, 4) represents the four corners of vehicle 1, and $F_i$ (i = 1, 2, 3, 4) represents the four corners of vehicle 2. Therefore, the coordinate of $R_i$ is $X_{Ri} = [x_{Ri}, y_{Ri}]^T$, which is known from the dimensions of vehicle 1. Similarly, $X'_{Fi} = [x'_{Fi}, y'_{Fi}]^T$ is known from the dimensions of vehicle 2. The relative position $X = [x, y]^T$ and the relative orientation β are estimated by the UWB/DR system. Then, the coordinates of vehicle 2's corners in the axis system of vehicle 1 can be derived as Equation (28).
$$\left[X_{F1}, X_{F2}, X_{F3}, X_{F4}\right] = R\left[X'_{F1}, X'_{F2}, X'_{F3}, X'_{F4}\right] + X\left[1, 1, 1, 1\right] \tag{28}$$

where $R = \begin{bmatrix} \cos\beta & -\sin\beta \\ \sin\beta & \cos\beta \end{bmatrix}$.
We define the points at the collision time as $R_{Ci}$ and $F_{Ci}$, with coordinates $X_{RCi} = [x_{RCi}, y_{RCi}]^T$ and $X_{FCi} = [x_{FCi}, y_{FCi}]^T$. The velocity vectors of the two vehicles are known from the UWB/DR system: $V_R = [v_R\cos\beta_R,\ v_R\sin\beta_R]^T$ ($\beta_R = 0$) and $V_F = [v_F\cos\beta_F,\ v_F\sin\beta_F]^T$ ($\beta_F = \beta$). Assume that point $F_i$ collides with the edge between $R_j$ and $R_k$ at time $t_{F_i,R_{jk}}$. Then $X_{FCi}$, $X_{RCj}$, and $X_{RCk}$ can be expressed as Equation (29).
$$X_{FCi} = X_{Fi} + V_F\, t_{F_i,R_{jk}}, \qquad X_{RCj} = X_{Rj} + V_R\, t_{F_i,R_{jk}}, \qquad X_{RCk} = X_{Rk} + V_R\, t_{F_i,R_{jk}} \tag{29}$$
Saying that point $F_i$ collides with the edge between $R_j$ and $R_k$ means that $F_{Ci}$ lies on the segment $\overline{R_{Cj} R_{Ck}}$, which can be expressed as Equation (30).
$$\overrightarrow{F_{Ci}R_{Cj}} \cdot \overrightarrow{F_{Ci}R_{Ck}} = -\left|\overrightarrow{F_{Ci}R_{Cj}}\right|\left|\overrightarrow{F_{Ci}R_{Ck}}\right| \tag{30}$$
The solution t of Equation (30) is the collision time under the condition that a corner of vehicle 2 collides with an edge of vehicle 1, covering 16 different combinations. The collision times for the other 16 combinations, in which corners of vehicle 1 collide with edges of vehicle 2, can be calculated similarly, so 32 collision times are calculated in total. Ignoring negative values, the minimum of the remaining values is the TTC. That is:
$$TTC = \min\left(t_{R_i,F_{jk}},\ t_{F_i,R_{jk}}\right), \qquad i = 1, 2, 3, 4;\ jk = 12, 23, 34, 41;\ t_{R_i,F_{jk}} \geq 0,\ t_{F_i,R_{jk}} \geq 0 \tag{31}$$
When TTC → ∞ or TTC < 0, there is no risk of collision.
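The 32-case search of Equations (28)-(31) reduces to a corner-versus-moving-edge test. Because both endpoints of an edge move with the same velocity vector, the corner's motion relative to the edge is linear, and the crossing time has a closed form. The following sketch assumes corner lists are given counterclockwise in vehicle 1's frame, vehicle 2's corners having been transformed by Equation (28).

```python
# A minimal sketch of the TTC search over all corner/edge pairs.
import numpy as np

def corner_edge_time(F, vF, Rj, Rk, vR):
    """Time at which moving corner F meets the moving edge Rj-Rk, or None."""
    e = Rk - Rj                    # edge direction (endpoints share velocity vR)
    u = vF - vR                    # corner velocity relative to the edge
    cross = lambda a, b: a[0] * b[1] - a[1] * b[0]
    if abs(cross(u, e)) < 1e-9:    # moving parallel to the edge: never crosses
        return None
    t = -cross(F - Rj, e) / cross(u, e)   # corner collinear with edge at time t
    if t < 0:                             # collisions in the past are ignored
        return None
    s = np.dot((F + u * t) - Rj, e) / np.dot(e, e)
    return t if 0.0 <= s <= 1.0 else None  # hit must land between Rj and Rk

def ttc(corners1, v1, corners2, v2):
    """Minimum nonnegative collision time over all 32 pairs, Eq. (31)."""
    times = []
    for pts, vp, quad, vq in ((corners2, v2, corners1, v1),
                              (corners1, v1, corners2, v2)):
        for F in pts:
            for j in range(4):
                t = corner_edge_time(F, vp, quad[j], quad[(j + 1) % 4], vq)
                if t is not None:
                    times.append(t)
    return min(times) if times else np.inf  # inf means no collision risk
```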

3. Simulation

In this section, simulations are conducted to evaluate our algorithms. First, the accuracy of the UWB positioning and directing system is validated by comparing the algorithm with and without the constraint of $d_5$. Second, the accuracy of the EKF-based UWB/DR fusion model is compared to that of UWB and DR separately. Finally, a large number of driving scenarios are generated to evaluate the success rate of the CWS.

3.1. Simulation of the Over-Constrained UWB Positioning and Directing System

In Section 2.1, a relative positioning/directing algorithm with the constraint of $d_5$ was proposed; its performance is simulated in this section. First, a driving scenario is built in the MATLAB driving scenario designer, as shown in Figure 8. The blue cube represents vehicle 1, and the red cube represents vehicle 2; the blue and red lines denote their driving tracks. The kinematic parameters of the vehicles and the positions of the UWB modules and wheel-speed sensors in their own vehicle axis systems are defined in the model. The UWB ranging error is set to σd = 0.05 m, and the wheel-speed error to σv = 0.2 m/s, referring to the sensors used in the experiments. The calculated results of our algorithm are compared to the ground-truth values exported by the model.
Solutions of the algorithm with and without the constraint of $d_5$ are compared in Figure 9 and Table 1. The improvement in accuracy brought by the $d_5$ constraint is evident, especially for x and β. In Table 1, the root mean square error (RMSE) is used to compare the accuracy quantitatively.

3.2. Simulation of the UWB/DR Fusion Algorithm

We again take the scenario in Section 3.1 as an example to validate the performance of the UWB/DR fusion algorithm. The comparison results are shown in Figure 10 and Table 2: the proposed EKF-based UWB/DR fusion significantly improves the accuracy and stability of positioning and directing.
Figure 11, Figure 12, and Table 3 compare the accuracy of the yaw rates and velocities estimated by UWB/DR to those from DR alone. They improve significantly as well, which contributes to better TTC prediction accuracy in the next section.

3.3. Simulation of the CWS Based on TTC Estimation

In this section, we generate a large number of driving scenarios with different velocities, relative positions, and relative orientations, as shown in Figure 13. The parameter ranges are set as outlined in Table 4.
TTCreal is fixed once a scenario is established, and TTCest estimated by the CWS is calculated every 10 ms. The collision warning threshold is set to 3.0 s, which means that when TTCest ≤ 3.0 s, the CWS sends an alert. TTCerr = TTCest − TTCreal denotes the TTC error at the warning time, as illustrated in Figure 14.
In order to guarantee driving safety, we set 2.7 s as the latest acceptable warning time: if the system has not warned when the vehicle is within 2.7 s of colliding, the evaluation is failed. In addition, in order not to disturb the driver unnecessarily, if the system sends an alert when the vehicles have no risk of collision within 4 s, we regard the warning as false. TTCerr can thus be divided into three ranges corresponding to three collision warning evaluations, as in Equation (32); a small classification helper is sketched after the list below.
$$TTC_{err} \begin{cases} > 0.3 & \Rightarrow \text{Failed} \\ \in [-1,\ 0.3] & \Rightarrow \text{Correct} \\ < -1 & \Rightarrow \text{False} \end{cases} \tag{32}$$
  • “Failed” denotes warning too late or not warning;
  • “Correct” denotes warning in the proper time period;
  • “False” denotes warning too early or warning by mistake.
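The helper below matches Equation (32); the thresholds follow directly from the 3.0 s warning threshold, the 2.7 s latest warning time (3.0 − 2.7 = 0.3 s), and the 4 s false-warning bound (3.0 − 4.0 = −1 s).

```python
def evaluate_warning(ttc_err):
    """Classify the TTC error (seconds) at warning time per Eq. (32)."""
    if ttc_err > 0.3:
        return "Failed"    # warned too late or not at all
    if ttc_err >= -1.0:
        return "Correct"   # warned within the proper window
    return "False"         # warned too early or by mistake
```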
The scenario marked with a gray background is the typical rear-end collision scenario, the most critical case for a collision warning system. One hundred and ninety-six rear-end collision scenarios are generated, and Table 5 shows the results. Of the 196 simulated scenarios, two produce false warnings, meaning the collision warning is triggered too early; all the others perform correctly. This shows the reliability of the proposed CWS in the most common rear-end collision scenarios.
Then, we simulate other scenarios in which the two vehicles drive in any lanes, from any positions, in any directions defined in Figure 13 and Table 4. The results are shown in Table 6. Scenarios with an initial TTCreal of less than 3 s are not considered. In the remaining 10,823 scenarios, 10,596 perform correctly, a collision warning success rate of 97.9%.

4. Experiments

In this section, the experiments are divided into two parts: straight driving experiments and curved driving experiments. The straight driving experiments are conducted with reference to JT/T 883-2014, the standard for FCWS tests published by the Ministry of Transport of the People's Republic of China (MOT). As JT/T 883-2014 regulates only straight driving experiments, curved driving experiments are additionally conducted to further validate the performance of our system. Since the CWS is built on the UWB/DR relative positioning/directing system, the positioning/directing accuracy reflects the performance of the CWS. Therefore, in the curved experiments, we drive through complex routes and compare the positioning/directing accuracy to the specifications of a commercial millimeter-wave radar (MMWR) used for collision warning.

4.1. Experimental Equipment and Environment

Figure 15 shows the equipment used in the experiments. Two vehicles are required for relative positioning and directing. UWB modules are installed on the corners of the vehicles, and four wheel-speed sensors designed by our team are installed at the centers of the wheels. The wheel-speed measurements are transmitted wirelessly to a receiver inside the vehicle, which collects the velocity information from the four wheels and sends it to the controller area network (CAN) bus. In the proposed system, only the speeds of the rear wheels are used. The UWB modules, also developed by our team, are based on the DW1000, and the two vehicles share data through UWB. All data are transferred to the CAN bus and recorded by a computer through a USB-CAN adapter. A computing terminal receives the sensor data from the CAN bus and calculates the relative position, direction, velocity, and TTC. Results from the computing terminal are compared to the measurements of a high-precision integrated positioning system, which combines dual-antenna real-time kinematic (RTK)-GPS and INS. A long-range radio (LoRa) antenna receives differential signals from the RTK-GPS base station installed in the testing ground. A total station is used to measure the coordinates of the UWB modules relative to the main RTK-GPS antenna. It should be noted that the main GPS antenna is not directly above the center of the rear axle; this offset is derived from the total station measurements and compensated in the algorithm.
Figure 16 shows the testing ground in which we conduct experiments. The driving routes of the two types of experiments are also marked in Figure 16.

4.2. Straight Driving Experiments

According to JT/T 883-2014 [35], the FCWS experiments consist of three tests. Each test is repeated seven times and is evaluated as passed only if at least five runs pass and no two consecutive runs fail. In the standard experiments, the headway distances, velocities, and accelerations of the vehicles must be controlled around specified values, so we designed the software shown in Figure 17, which displays the necessary parameters to help the drivers control the vehicles and records the required data. The TTC derived from the RTK-GPS/INS data is taken as the real TTC.

4.2.1. Test 1

Test 1 is designed as shown in Figure 18. The rear vehicle approaches the parked front vehicle at 72 km/h from an initial headway distance of 150 m. If the collision warning is triggered before the real TTC reaches 2.7 s, the test is passed; otherwise, it is failed.

4.2.2. Test 2

Test 2 is designed as shown in Figure 19. The rear vehicle drives at 72 km/h toward the front vehicle, which drives at 32 km/h, from an initial headway distance of 150 m. If the collision warning is triggered before the real TTC reaches 2.1 s, the test is passed; otherwise, it is failed.

4.2.3. Test 3

Test 3 is designed as shown in Figure 20. The rear vehicle drives at 72 km/h toward the front vehicle, which drives at 32 km/h and then decelerates at −0.3 g. If the collision warning is triggered before the real TTC reaches 2.4 s, the test is passed; otherwise, it is failed.

4.2.4. Results Analysis of the Straight Driving Experiments

During each test, two TTC values are calculated: (1) TTCreal, derived from the RTK-GPS/INS data, and (2) TTCCWS, estimated from the UWB/DR measurements. Since the terminating conditions of the three tests differ, to satisfy all three and reserve some margin, we set the warning threshold of TTCCWS to 3.0 s. The software in Figure 17 issues a warning when either TTCreal or TTCCWS reaches its marginal value. If TTCreal reaches the regulated marginal value while TTCCWS is still greater than 3.0 s, the test is terminated and evaluated as failed. The standard regulates only the minimum threshold of the collision warning time, not the maximum; in other words, it only cares about how safe the warning is, not how accurate. However, as explained in Section 3, overly early warnings are annoying and intrusive, so we set 4.0 s as the upper limit: if the CWS is triggered when TTCreal > 4.0 s, we also regard the test as failed. According to JT/T 883-2014, each test is repeated seven times. Table 7, Table 8 and Table 9 show the results of the three tests, respectively.
According to Table 7, Table 8 and Table 9, all the tests were passed, which proves that the proposed system satisfies the MOT requirements and can provide collision warnings for vehicles in time.

4.3. Curved Driving Experiments

JT/T 883-2014 regulates only straight driving experiments and gives no guidance on curved driving. However, to further validate the superiority of our system, we conduct curved driving experiments and compare its accuracy to a commercial MMWR used in CWSs, the Aptiv ESR 2.5 (Electronically Scanning Radar). The MMWR measures the relative distance, relative azimuth, and relative velocity. Table 10 shows the accuracy of the Aptiv ESR 2.5 according to its datasheet, where ρ, θ, and v represent the relative distance, azimuth angle, and velocity, respectively. The MMWR has two working modes, middle-distance and long-distance, with different accuracies.
To facilitate comparison, the curved experiments are likewise divided into a middle-distance experiment with vehicle distances within 50 m and a long-distance experiment with vehicle distances within 100 m. The estimated relative position [x, y] is converted to polar coordinates [ρ, θ], and the velocities of the two vehicles [v1, v2] are converted to the relative velocity v, as shown in Figure 21. Note that the relative orientation β cannot be measured by the MMWR directly.
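For reference, the conversion used for this comparison can be sketched as below. The relative-velocity convention shown, the magnitude of the velocity-vector difference with vehicle 2's heading rotated by β, is our assumption about the transformation illustrated in Figure 21.

```python
# A minimal sketch of the Cartesian-to-polar conversion for the comparison.
import numpy as np

def to_polar(x, y):
    """Relative position [x, y] -> polar coordinates [rho, theta]."""
    return np.hypot(x, y), np.degrees(np.arctan2(y, x))

def relative_speed(v1, v2, beta):
    """Two vehicle speeds -> one scalar relative velocity (assumed convention)."""
    V1 = np.array([v1, 0.0])                               # vehicle 1 along its X-axis
    V2 = np.array([v2 * np.cos(beta), v2 * np.sin(beta)])  # vehicle 2 rotated by beta
    return np.linalg.norm(V2 - V1)
```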

4.3.1. Middle-Distance Experiments

Both the proposed CWS and the MMWR are dynamic systems, so the vehicle distance is not kept constant but varies during the experiment. In the middle-distance experiment, the vehicle distance varies between 10 and 50 m. In our system, the vehicle distance denotes the distance between the rear axle centers of the two vehicles, so it cannot be zero.

4.3.2. Long-Distance Experiments

During the long-distance experiment, the vehicle distance varies between 10 and 100 m.

4.3.3. Results Analysis of the Curved Experiments

According to Figure 22 and Figure 23, the accuracy of the proposed system improves significantly after fusion, matching the simulation results shown in Figure 12. The relative position is described in Cartesian coordinates [x, y] in the simulation but in polar coordinates [ρ, θ] in the experiments. Both x and y improve after fusion, as shown in Table 2, whereas only θ, not ρ, improves after fusion according to Table 11 and Table 12. This is because an accuracy improvement in θ contributes to better accuracy in both x and y, as shown in Figure 21. Therefore, the experimental results are consistent with the simulation. The comparison of the proposed system and the MMWR is shown in Table 13, which combines Table 11 and Table 12 with Table 10.
The distance accuracy of the proposed system is much better than that of the MMWR, with or without fusion. The azimuth accuracy without fusion is about 0.76° to 0.77° in both experiments, better than the middle-distance MMWR but inferior to the long-distance MMWR, and the velocity accuracy without fusion is worse than the MMWR in both modes. With fusion, the relative distance and azimuth accuracy are significantly better than the MMWR, and the relative velocity accuracy improves to a level similar to the MMWR in both middle- and long-distance modes. Table 14 shows the accuracy improvement rates of the proposed system over the MMWR.
In addition, the proposed system can provide the relative orientation, which is not available directly in the MMWR system.

5. Conclusions

In this paper, we proposed a CCWS combining UWB and DR. An improved UWB-based relative positioning/directing algorithm was presented, and a DR model based on the rear-wheel speeds was established. Then, a fusion algorithm using an EKF was proposed to improve the accuracy of the relative position, orientation, and velocity. The advantages of the proposed system were first verified by simulation. Finally, experiments were conducted to further validate the performance of our system, and the results were compared to a commercial MMWR used in CWSs. The main conclusions are summarized as follows:
  • The proposed relative positioning/directing algorithm with an additional distance constraint significantly improves the relative positioning/directing accuracy, especially the directing accuracy, as shown in Figure 9 and Table 1.
  • The fusion method significantly improves the relative positioning/directing accuracy and slightly improves the velocity accuracy according to the simulation and experiment results.
  • The proposed CWS passes the regulated tests in JT/T883-2014 published by MOT, which proves the feasibility of the proposed system.
  • In middle-distance mode (up to 50 m), compared to the MMWR, the proposed system improves the accuracy of the relative distance, azimuth angle, and velocity by 44%, 69%, and 8%, respectively. In long-distance mode, the improvement is 66% and 38% for the relative distance and azimuth angle, respectively, while the relative velocity accuracy is similar to that of the MMWR.
  • In both middle- and long-distance modes, the proposed system provides the relative orientation with an RMSE of no more than 0.4°; this quantity is not directly available from MMWR systems but is very beneficial to the CWS.
The main inadequacy of the proposed system is the velocity accuracy. Although it performs at the same level as the MMWR, it can be further improved. To facilitate comparison with the MMWR, the experimental velocity data are shown as relative velocity; we also analyzed the accuracy of the absolute velocities of the two vehicles. In the middle-distance experiment, $RMSE_{v1} = 0.16$ m/s and $RMSE_{v2} = 0.13$ m/s, and in the long-distance experiment, $RMSE_{v1} = 0.16$ m/s and $RMSE_{v2} = 0.16$ m/s; both are inferior to the simulation results. This is because the DR system is built on an idealized Ackermann steering model, which ignores the compliance of the suspension and tires; in practice, vehicle dynamic parameters such as side-slip angles also affect the precision of the algorithm. Therefore, our future research direction is a system with a more accurate vehicle dynamics model and more integrated sensors, such as an IMU and GPS.

Author Contributions

Conceptualization, M.W. and Y.S.; Funding acquisition, X.C. and Y.S.; Investigation, M.W.; Methodology, M.W. and P.L.; Software, M.W., B.J. and P.L.; Validation, M.W. and B.J.; Writing—original draft, M.W.; Writing—review and editing, W.W. and Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Key R&D Program of China (Grant No. 2018YFB0104802) and Industry University Research Project of Shanghai Automotive Industry Science and Technology Development Foundation (Grant No. 1705).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

The authors are grateful to the subjects in the experiment and appreciate the reviewers for their helpful comments and suggestions in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Health Organization (WHO). Global Status Report on Road Safety 2018. Available online: https://www.who.int/violence_injury_prevention/road_safety_status/2018/en/ (accessed on 7 October 2020).
  2. Kusano, K.D.; Gabler, H.C. Safety Benefits of Forward Collision Warning, Brake Assist, and Autonomous Braking Systems in Rear-End Collisions. IEEE Trans. Intell. Transp. Syst. 2012, 13, 1546–1555. [Google Scholar] [CrossRef]
  3. Owens, J.M.; Dingus, T.A.; Guo, F.; Fang, Y.; Perez, M.; McClafferty, J.; Tefft, B. Prevalence of Drowsy Driving Crashes: Estimates from a Large-Scale Naturalistic Driving Study; AAA Foundation for Traffic Safety: Washington, DC, USA, 2018. [Google Scholar]
  4. Ewald, A.; Willhoeft, V. Laser Scanners for Obstacle Detection in Automotive Applications. In Proceedings of the IEEE Intelligent Vehicles Symposium 2000 (Cat. No.00TH8511), Dearborn, MI, USA, 5 October 2000; pp. 682–687. [Google Scholar]
  5. Chen, S.-K.; Parikh, J.S. Developing a Forward Collision Warning System Simulation. In Proceedings of the IEEE Intelligent Vehicles Symposium 2000 (Cat. No.00TH8511), Dearborn, MI, USA, 5 October 2000; pp. 338–343. [Google Scholar]
  6. Kim, J.; Han, D.S.; Senouci, B. Radar and Vision Sensor Fusion for Object Detection in Autonomous Vehicle Surroundings. In Proceedings of the 2018 Tenth International Conference on Ubiquitous and Future Networks (ICUFN), Prague, Czech Republic, 3–6 July 2018; pp. 76–78. [Google Scholar]
  7. Peng, W.; Zhiqiang, L. The Study of Intelligent Vehicle Anti-Collision Forewarning Technology by Multi-Information Detection. In Proceedings of the 2013 Third International Conference on Intelligent System Design and Engineering Applications, Hong Kong, China, 16–18 January 2013; pp. 1557–1561. [Google Scholar]
  8. Srinivasa, N. Vision-Based Vehicle Detection and Tracking Method for Forward Collision Warning in Automobiles. In Proceedings of the Intelligent Vehicle Symposium, 2002. IEEE, Versailles, France, 17–21 June 2003; Volume 2, pp. 626–631. [Google Scholar]
  9. Liu, J.-F.; Su, Y.-F.; Ko, M.-K.; Yu, P.-N. Development of a Vision-Based Driver Assistance System with Lane Departure Warning and Forward Collision Warning Functions. In Proceedings of the 2008 Digital Image Computing: Techniques and Applications, Canberra, Australia, 1–3 December 2008; pp. 480–485. [Google Scholar]
  10. Shieh, W.-Y.; Hsu, C.-C.J.; Chen, H.-C.; Wang, T.-H.; Chen, C.-C. Construction of Infrared Signal-Direction Discriminator for Intervehicle Communication. IEEE Trans. Veh. Technol. 2015, 64, 2436–2447. [Google Scholar] [CrossRef]
  11. Sanberg, W.P.; Dubbelman, G. From Stixels to Asteroids: Towards a Collision Warning System Using Stereo Vision. Electron. Imaging 2019, 2019, 34-1–34-7. [Google Scholar] [CrossRef]
  12. Hernandez, D.C.; Filonenko, A.; Hariyono, J.; Shahbaz, A. Kang-Hyun Jo Laser Based Collision Warning System for High Conflict Vehicle-Pedestrian Zones. In Proceedings of the 2016 IEEE 25th International Symposium on Industrial Electronics (ISIE), Santa Clara, CA, USA, 8–10 June 2016; pp. 935–939. [Google Scholar]
  13. Coelingh, E.; Jakobsson, L.; Lind, H.; Lindman, M. Collision Warning with Auto Brake—A Real-Life Safety Perspective, Innovations for Safety: Opportunities and Challenges. In Proceedings of the 20th International Technical Conference on the Enhanced Safety of Vehicles (ESV), Lyon, France, 18–21 June 2017. [Google Scholar]
  14. Coelingh, E.; Eidehall, A.; Bengtsson, M. Collision Warning with Full Auto Brake and Pedestrian Detection—A Practical Example of Automatic Emergency Braking. In Proceedings of the 13th International IEEE Conference on Intelligent Transportation Systems, Funchal, Portugal, 19–22 September 2010; pp. 155–160. [Google Scholar]
  15. Srinivasa, N.; Chen, Y.; Daniell, C. A Fusion System for Real-Time Forward Collision Warning in Automobiles. In Proceedings of the 2003 IEEE International Conference on Intelligent Transportation Systems, Shanghai, China, 12–15 October 2003; pp. 457–462. [Google Scholar]
  16. Huang, C.; Lv, C.; Hang, P.; Xing, Y. Toward Safe and Personalized Autonomous Driving: Decision-Making and Motion Control with DPF and CDT Techniques. IEEE ASME Trans. Mechatron. 2021, 26, 611–620. [Google Scholar] [CrossRef]
  17. Huang, C.; Lv, C.; Hang, P.; Hu, Z.; Xing, Y. Human-Machine Adaptive Shared Control for Safe Automated Driving under Automation Degradation. arXiv 2021, arXiv:2103.04563. [Google Scholar]
  18. Huang, C.; Huang, H.; Hang, P.; Gao, H.; Wu, J.; Huang, Z.; Lv, C. Personalized Trajectory Planning and Control of Lane-Change Maneuvers for Autonomous Driving. IEEE Trans. Veh. Technol. 2021, 1. [Google Scholar] [CrossRef]
  19. Hang, P.; Lv, C. Human-Like Decision Making for Autonomous Driving: A Noncooperative Game Theoretic Approach. arXiv 2020, arXiv:2005.11064. [Google Scholar] [CrossRef]
  20. Hang, P.; Lv, C.; Huang, C.; Xing, Y.; Hu, Z. Cooperative Decision Making of Connected Automated Vehicles at Multi-Lane Merging Zone: A Coalitional Game Approach. arXiv 2021, arXiv:210307887. [Google Scholar]
  21. Chen, J.; Tian, S.; Xu, H.; Yue, R.; Sun, Y.; Cui, Y. Architecture of Vehicle Trajectories Extraction with Roadside LiDAR Serving Connected Vehicles. IEEE Access 2019, 7, 100406–100415. [Google Scholar] [CrossRef]
  22. Yang, W.; Wan, B.; Qu, X. A Forward Collision Warning System Using Driving Intention Recognition of the Front Vehicle and V2V Communication. IEEE Access 2020, 8, 11268–11278. [Google Scholar] [CrossRef]
  23. Xiang, X.; Qin, W.; Xiang, B. Research on a DSRC-Based Rear-End Collision Warning Model. IEEE Trans. Intell. Transp. Syst. 2014, 15, 1054–1065. [Google Scholar] [CrossRef]
  24. Yang, T.; Zhang, Y.; Tan, J.; Qiu, T.Z. Research on Forward Collision Warning System Based on Connected Vehicle V2V Communication. In Proceedings of the 2019 5th International Conference on Transportation Information and Safety (ICTIS), Liverpool, UK, 14–17 July 2019; pp. 1174–1181. [Google Scholar]
  25. Patra, S.; Veelaert, P.; Calafate, C.; Cano, J.-C.; Zamora, W.; Manzoni, P.; González, F. A Forward Collision Warning System for Smartphones Using Image Processing and V2V Communication. Sensors 2018, 18, 2672. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Marano, S.; Gifford, W.; Wymeersch, H.; Win, M. NLOS Identification and Mitigation for Localization Based on UWB Experimental Data. IEEE J. Sel. Areas Commun. 2010, 28, 1026–1035. [Google Scholar] [CrossRef] [Green Version]
  27. Lu, Y.; Yi, J.; He, L.; Zhu, X.; Liu, P. A Hybrid Fusion Algorithm for Integrated INS/UWB Navigation and Its Application in Vehicle Platoon Formation Control. In Proceedings of the 2018 International Conference on Computer Science, Electronics and Communication Engineering (CSECE 2018); Wuhan, China, 7–8 February 2018, Atlantis Press: Sanya, China, 2018. [Google Scholar]
  28. Wang, M.; Zhou, A.; Chen, X.; Shen, Y.; Li, Z. A Novel Asynchronous UWB Positioning System for Autonomous Trucks in an Automated Container Terminal. SAE Int. J. Adv. Curr. Pract. Mobil. 2020, 2, 3413–3422. [Google Scholar] [CrossRef]
  29. Sun, S.; Hu, J.; Li, J.; Liu, R.; Shu, M.; Yang, Y. An INS-UWB Based Collision Avoidance System for AGV. Algorithms 2019, 12, 40. [Google Scholar] [CrossRef] [Green Version]
  30. Liu, X.; Jin, F.; Lv, X.; Zhan, Y.S.; Zhang, D. Design and Development of Vehicle Collision-Avoidance System Based on UWB Wireless Sensor Networks. In Proceedings of the 7th International Conference on Computer Engineering and Networks—PoS (CENet2017); Shanghai, China, 22–23 July 2017, Sissa Medialab: Shanghai, China, 2017; p. 037. [Google Scholar]
  31. Pittokopiti, M.; Grammenos, R. Infrastructureless UWB Based Collision Avoidance System for the Safety of Construction Workers. In Proceedings of the 2019 26th International Conference on Telecommunications (ICT), Hanoi, Vietnam, 8–10 April 2019; pp. 490–495. [Google Scholar]
  32. Kianfar, A.E.; Uth, F.; Baltes, R.; Clausen, E. Development of a Robust Ultra-Wideband Module for Underground Positioning and Collision Avoidance. Min. Metall. Explor. 2020, 37, 1821–1825. [Google Scholar] [CrossRef]
  33. ISO 8855:2011. Available online: https://www.iso.org/cms/render/live/en/sites/isoorg/contents/data/standard/05/11/51180.html (accessed on 6 April 2021).
  34. Forkenbrock, G.J.; O'Hara, B. A Forward Collision Warning (FCW) Performance Evaluation. In Proceedings of the International Technical Conference on the Enhanced Safety of Vehicles, Stuttgart, Germany, 15–18 June 2009; National Highway Traffic Safety Administration: Washington, DC, USA, 2009; Volume 2009. [Google Scholar]
  35. Transportation Industry Standard of the People’s Republic of China. In JT/T 883-2014: Commercial Vehicle Driving Dangerous Warning System Technical Requirements and Test Procedures; China Communications Press: Beijing, China, 2014.
Figure 1. Vehicle axis system.
Figure 2. The UWB-based relative positioning system model.
Figure 3. Two sets of solutions.
Figure 4. Ackermann steering model.
Figure 6. Collision types. (a) Point-to-edge collision; (b) Edge-to-edge collision; (c) Point-to-point collision.
Figure 7. The collision warning model.
Figure 8. The virtual scenario in the driving scenario designer.
Figure 9. Comparison of the relative positioning and directing algorithm with and without the constraint of d5: (a) the relative longitudinal position x; (b) the relative lateral position y; (c) the relative orientation β.
Figure 10. Comparison of positioning and directing performance using UWB and the fusion of UWB/DR: (a) the relative longitudinal position x; (b) the relative lateral position y; (c) the relative orientation β.
Figure 11. Comparison of yaw rates measured by DR and estimated by UWB/DR: (a) yaw rate of vehicle 1; (b) yaw rate of vehicle 2.
Figure 12. Comparison of velocities measured by DR and estimated by UWB/DR: (a) velocity of vehicle 1; (b) velocity of vehicle 2.
Figure 13. TTC simulation scenarios.
Figure 14. TTC estimation error.
Figure 15. Experimental equipment.
Figure 16. The testing ground and vehicle driving routes.
Figure 17. Vehicle state display software.
Figure 18. Test 1 in the straight driving experiments.
Figure 19. Test 2 in the straight driving experiments.
Figure 20. Test 3 in the straight driving experiments.
Figure 21. The transformation from Cartesian coordinates to polar coordinates.
Figure 22. Results of the middle-distance experiments: (a) relative distance; (b) relative azimuth angle; (c) relative velocity; (d) relative orientation.
Figure 23. Results of the long-distance experiments: (a) relative distance; (b) relative azimuth angle; (c) relative velocity; (d) relative orientation.
Table 1. RMSE of the algorithm with and without d5.

Algorithm    RMSEx (m)   RMSEy (m)   RMSEβ (°)
Without d5   0.70        0.73        46.29
With d5      0.21        0.58        2.42
Table 2. RMSE of position and orientation estimated by UWB and UWB/DR.

Algorithm        RMSEx (m)   RMSEy (m)   RMSEβ (°)
UWB              0.21        0.58        2.42
UWB + DR (EKF)   0.06        0.17        0.83
Table 3. RMSE of velocities and yaw rates estimated by DR and UWB/DR.

Algorithm        RMSEω1 (°/s)   RMSEω2 (°/s)   RMSEv1 (m/s)   RMSEv2 (m/s)
DR               8.92           8.71           0.14           0.15
UWB + DR (EKF)   5.07           4.60           0.12           0.08
Table 4. Ranges of parameters in the TTC simulation.

Parameter        Range
v1 & v2 (km/h)   0 to 75
x (m)            −200 to 200
y (m)            −15 to 15
β (°)            0 to 360
Table 5. Collision warning evaluation in rear-end scenarios.

Evaluation   Quantity
Failed       0
Correct      194
False        2
Table 6. Collision warning evaluation in random scenarios.

Evaluation   Quantity
Failed       0
Correct      10,596
False        227
Table 7. Results of Test 1.

Run          1        2        3        4        5        6        7
TTC(CWS)     2.9987   2.9907   2.9759   2.9814   2.9729   2.9722   2.9799
TTC(Real)    3.0047   3.0069   2.9925   2.9963   2.9902   3.0136   3.0219
Evaluation   Pass     Pass     Pass     Pass     Pass     Pass     Pass
Table 8. Results of Test 2.

Run          1        2        3        4        5        6        7
TTC(CWS)     2.9863   2.9810   2.9987   2.9804   2.9954   2.9899   2.9673
TTC(Real)    3.0423   3.1166   3.0245   3.0354   3.1283   3.1269   3.0226
Evaluation   Pass     Pass     Pass     Pass     Pass     Pass     Pass
Table 9. Results of Test 3.

Run          1        2        3        4        5        6        7
TTC(CWS)     2.9782   2.9905   2.9947   2.9789   2.9942   2.8623   2.8958
TTC(Real)    2.7560   2.8110   2.7877   2.6851   2.6511   2.5831   2.8975
Evaluation   Pass     Pass     Pass     Pass     Pass     Pass     Pass
Table 10. The accuracy of the MMWR.

Mode              Coverage (m)   RMSEρ (m)   RMSEθ (°)   RMSEv (m/s)
Middle Distance   50             0.25        1           0.12
Long Distance     100            0.5         0.5         0.12
Table 11. The accuracy of the proposed system in the middle-distance experiments.

Mode        RMSEρ (m)   RMSEθ (°)   RMSEv (m/s)   RMSEβ (°)
No Fusion   0.14        0.76        0.22          1.84
Fusion      0.14        0.31        0.11          0.39
Table 12. The accuracy of the proposed system in the long-distance experiments.

Mode        RMSEρ (m)   RMSEθ (°)   RMSEv (m/s)   RMSEβ (°)
No Fusion   0.18        0.77        0.22          1.86
Fusion      0.17        0.31        0.12          0.40
Table 13. Accuracy comparison of the proposed system and the MMWR.

Mode              System                        RMSEρ (m)   RMSEθ (°)   RMSEv (m/s)   RMSEβ (°)
Middle Distance   MMWR                          0.25        1           0.12          None
                  Proposed System (No Fusion)   0.14        0.76        0.22          1.84
                  Proposed System (Fusion)      0.14        0.31        0.11          0.39
Long Distance     MMWR                          0.5         0.5         0.12          None
                  Proposed System (No Fusion)   0.18        0.77        0.24          1.86
                  Proposed System (Fusion)      0.17        0.31        0.12          0.40
Table 14. Accuracy enhancement rate of the proposed system (with fusion) relative to the MMWR.

Mode              ρ     θ     v
Middle Distance   44%   69%   8%
Long Distance     66%   38%   0%
