
Sensors 2019, 19(7), 1584; https://doi.org/10.3390/s19071584

Article
Research on a Lane Compensation Method Based on Multi-Sensor Fusion
1 College of Transportation, Shandong University of Science and Technology, Qingdao 266590, China
2 State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100084, China
3 School of Mechanical and Automotive Engineering, Liaocheng University, Liaocheng 252059, China
* Author to whom correspondence should be addressed.
Received: 10 January 2019 / Accepted: 29 March 2019 / Published: 2 April 2019

Abstract

The lane curvature output by a vision sensor can jump over short periods because of shadows, changes in lighting and broken lane markings, which causes serious problems for unmanned driving control. It is therefore particularly important to predict or compensate the real lane in real time while the sensor output jumps. This paper presents a lane compensation method based on the multi-sensor fusion of a global positioning system (GPS) receiver, an inertial measurement unit (IMU) and a vision sensor. To compensate the lane, a cubic polynomial in the longitudinal distance is selected as the lane model. In this method, a Kalman filter estimates vehicle velocity and yaw angle from GPS and IMU measurements, and a vehicle kinematics model describes the vehicle motion. The geometric relationship between the vehicle and the relative lane motion at the current moment is used to solve for the coefficients of the lane polynomial at the next moment. Simulation and vehicle test results show that the predicted information can compensate for the failure of the vision sensor with good real-time performance, robustness and accuracy.
Keywords:
sensor fusion; kinematics; lane detection; vision; virtual lane

1. Introduction

In recent years, intelligent driving vehicles have received widespread attention because they can play a positive role in the daily traffic environment: a mature intelligent driving control system is more reliable than drivers of varying skill. Important intelligent driving functions, such as lane-keeping assist systems (LKAs) and lane-change systems (LCSs), have been widely studied. To ensure that autonomous vehicles can drive safely on the road, high-precision lane-level localization is required. Currently, lane-level localization can be realized through Lidar, GPS/INS or cameras. However, Lidar costs more than the other sensors, and when GPS and INS are used for high-precision positioning, they provide absolute position information, which must be matched against a high-precision map to obtain the relative road position. Therefore, high-efficiency, low-cost environmental perception based on vision will become the main direction of the future industrialization of intelligent driving vehicles [1].
Lane recognition [2,3] is an important part of lane-level localization. At present, the mainstream approach is to obtain lane images through vision sensors installed on intelligent vehicles, and then use the edge, color, texture or frequency-domain features of lanes to separate lane lines from the road area. According to the image processing strategy, existing methods can generally be divided into model-based methods [4,5,6,7,8] and feature-based methods [9,10,11,12,13]. However, the acquisition of lane information from vision sensors may fail due to complex shadows, missing lane markers, changes in lighting, varying lane brightness or vision sensor failures [14]. Under these abnormal conditions, the sensor recognition algorithm temporarily exits. To address this problem, scholars have put forward two main solutions. The first compensates at the image-processing level. A method using a fuzzy system and a line segment detector algorithm to overcome various lighting problems was presented in [15]. References [16,17,18,19,20,21] improve the robustness of lane recognition in different environments through clustering, feature extraction and curve fitting, respectively. These image-processing methods can improve the accuracy of lane detection to a certain extent, but they remain limited by the sensor accuracy. The second solution improves the robustness of vision sensing through sensor fusion. Positioning systems combining a global positioning system (GPS), an inertial measurement unit (IMU), wheel speed sensors, a single front-facing camera and a digital map were proposed in [22,23,24,25,26,27,28,29]. This kind of method relies on digital maps and increases the computational burden. References [30,31] used a vehicle kinematics model and the position of the vehicle ahead, obtained from radar or V2X, to predict the lane.
However, this method places high requirements on knowing the trajectory of the vehicle ahead. A vehicle kinematic lateral motion model and road constraints are used to handle sensor failures in [32,33,34,35]. The kinematics and dynamics models are combined for lane prediction in the high-speed range in [36]. Nevertheless, these methods require high model precision, and the accuracy of the model determines the performance of the system. Therefore, how to realize information compensation when the lane signal is discontinuous is still an important technical problem. The importance of lane information stability for automatic vehicle control is verified in [37,38,39,40]. A real-time vision-sensor compensation algorithm ensures that the vehicle trajectory remains stable when the positioning signal is lost, and provides sufficient system response time when the lane information continues to fail. It can therefore complete driver wake-up and control transfer, handing control of the vehicle back to the driver smoothly and safely.
In this paper, a lane prediction system based on sensor fusion is presented for vision sensor failures. First, a low-cost GPS, an IMU and dead reckoning (DR) are integrated to obtain a high-precision vehicle trajectory. Secondly, in the multi-sensor system, the relative position of the vehicle path is used to predict the lane coefficients, and the prediction algorithm is synchronized with the control sample time. Therefore, even when the sensor has not failed, the proposed method can verify the lane information collected by the vision sensor under normal circumstances. Finally, the performance of the method is evaluated by a hardware-in-the-loop (HIL) simulation and vehicle tests at a test site. The proposed method effectively compensates for the vision sensor in the failure state and remains effective for 1 s after a failure; even if the vision sensor breaks down on a bend, this is enough time for the driver to take over control.
The rest of this paper is organized as follows: in Section 2, a low-cost GPS and IMU are used to estimate the vehicle state, and the vehicle trajectory is fitted by dead reckoning based on the vehicle kinematics. The lane coefficient is predicted by using the relative position relationship between the vehicle trajectory and the lane in the multi-rate system in Section 3. The simulation and the experimental results are given in Section 4. Finally, Section 5 outlines the conclusions and discusses the limitations of this method and possible future work.

2. Sensor Fusion and Lane Modeling

The GPS/IMU fusion algorithm, the dead-reckoning vehicle kinematics, and the lane polynomial function are described in this section. Fusing GPS and IMU information yields more accurate yaw angle and longitudinal velocity values and corrects the drift that accumulates over the IMU's long integration time; combining these with dead reckoning gives the vehicle pose in real time.

2.1. Yaw Angle and Longitudinal Velocity Estimation

This section focuses on estimation of the vehicle trajectory using an IMU and a GPS receiver. The vehicle velocity and yaw angle can be estimated by a sensor fusion algorithm using the data obtained from an IMU and a GPS. The main source of inertial sensor error is drift caused by sensor deviation and gravity effects. Therefore, the method in this paper aims to model these error sources and ignore the influence of cross-coupling errors and sensor scale factor errors. The IMU offers six degrees of freedom and consists of a three-axis gyroscope and a three-axis accelerometer which are installed on the carrier. In this paper, only the z-axis of the gyro and x-axis of the accelerometer are modeled as required.

2.1.1. Gyro Modeling

The output of the gyroscope is modeled as the true vehicle yaw rate plus a constant bias and white noise, as shown in Equation (1):

$$ g_r = r + b_r + \omega_{gyro} \qquad (1) $$

where $g_r$ is the gyro output, $r$ is the vehicle yaw rate, $b_r$ is the constant offset (bias) of the gyro, and $\omega_{gyro}$ is zero-mean white sensor noise, assumed normally distributed with sampled covariance $E[\omega_{gyro}^2] = \sigma_{gyro}^2$.
The gyro bias is represented by a first-order Markov process:

$$ \dot{b}_r = -\frac{1}{T_g} b_r + \omega_{g\_bias} \qquad (2) $$

where $T_g$ is the correlation time and $\omega_{g\_bias}$ is the process driving noise of the gyro, normally distributed with zero mean and sampled covariance $E[\omega_{g\_bias}^2] = \sigma_{bg}^2$.

2.1.2. Accelerometer Modeling

The accelerometer is modeled in the same way as the gyro; Coriolis and gravity accelerations are neglected:

$$ a = \ddot{x} + b_a + \omega_{accel} \qquad (3) $$

where $a$ is the accelerometer output, $b_a$ is its constant offset (bias), and $\omega_{accel}$ is zero-mean white noise, normally distributed with sampled covariance $E[\omega_{accel}^2] = \sigma_{accel}^2$.
The accelerometer bias is also represented by a first-order Markov process:

$$ \dot{b}_a = -\frac{1}{T_a} b_a + \omega_{a\_bias} \qquad (4) $$

where $T_a$ is the correlation time and $\omega_{a\_bias}$ is the process driving noise of the accelerometer, normally distributed with zero mean and sampled covariance $E[\omega_{a\_bias}^2] = \sigma_{ba}^2$.
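As a minimal illustrative sketch of the two error models above (Equations (1)–(4)), the snippet below simulates gyro and accelerometer outputs with first-order Markov biases. All correlation times and noise levels here are assumed placeholder values, not the paper's calibration.

```python
import numpy as np

def simulate_imu_errors(true_rate, true_accel, dt, T_g=300.0, T_a=300.0,
                        sigma_gyro=0.01, sigma_accel=0.05,
                        sigma_bg=1e-4, sigma_ba=1e-3, seed=0):
    """Simulate gyro/accelerometer outputs with constant-plus-Markov bias.

    true_rate, true_accel : arrays of the true yaw rate and acceleration.
    All noise parameters are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    n = len(true_rate)
    b_r, b_a = 0.0, 0.0
    g_r = np.empty(n)
    a_out = np.empty(n)
    for k in range(n):
        # First-order Markov bias: b_dot = -(1/T) * b + w
        b_r += (-b_r / T_g + rng.normal(0.0, sigma_bg)) * dt
        b_a += (-b_a / T_a + rng.normal(0.0, sigma_ba)) * dt
        # Sensor output = truth + bias + white noise
        g_r[k] = true_rate[k] + b_r + rng.normal(0.0, sigma_gyro)
        a_out[k] = true_accel[k] + b_a + rng.normal(0.0, sigma_accel)
    return g_r, a_out
```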

2.1.3. Kalman Filter Establishment

Yaw angle, gyro bias, velocity and accelerometer bias are selected as the state variables, $x = [\psi \;\; b_r \;\; v \;\; b_a]^T$. The yaw rate and longitudinal acceleration measured by the IMU form the input, $u = [g_r \;\; a]^T$. From Equations (1)–(4), the state equation of the system is:

$$ \dot{x} = A x + B u + \omega \qquad (5) $$

where:

$$ A = \begin{bmatrix} 0 & -1 & 0 & 0 \\ 0 & -1/T_g & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 0 & 0 & -1/T_a \end{bmatrix} \qquad (6) $$

$$ B = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}^T \qquad (7) $$
The yaw angle and velocity measured by GPS are selected as the external observations, and the system measurement equation is established as:

$$ y = C x + \mu \qquad (8) $$

where $\mu = [\mu_\psi \;\; \mu_v]^T$ is the sensor noise, with $E[\mu_\psi^2] = \sigma_\psi^2$ and $E[\mu_v^2] = \sigma_v^2$, and:

$$ y = \begin{bmatrix} \psi_{GPS} \\ v_{GPS} \end{bmatrix} \qquad (9) $$

$$ C = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \qquad (10) $$
The covariance matrices Q and R of the process noise and the measured noise are solved, and the linear Kalman filter is used for the optimal estimation. Readers may refer to [41] for details.
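The state-space model above can be sketched as a discrete linear Kalman filter. This is an illustrative implementation under a first-order discretization; the correlation times and the covariances Q and R below are assumed values, not the paper's tuning.

```python
import numpy as np

class YawVelocityKF:
    """Linear Kalman filter for x = [psi, b_r, v, b_a]^T driven by the IMU
    input u = [g_r, a]^T and corrected by GPS y = [psi_GPS, v_GPS]^T."""

    def __init__(self, dt, T_g=300.0, T_a=300.0):
        A = np.array([[0.0, -1.0,       0.0,  0.0],
                      [0.0, -1.0 / T_g, 0.0,  0.0],
                      [0.0,  0.0,       0.0, -1.0],
                      [0.0,  0.0,       0.0, -1.0 / T_a]])
        B = np.array([[1.0, 0.0],
                      [0.0, 0.0],
                      [0.0, 1.0],
                      [0.0, 0.0]])
        self.F = np.eye(4) + A * dt   # first-order discretization
        self.G = B * dt
        self.C = np.array([[1.0, 0.0, 0.0, 0.0],
                           [0.0, 0.0, 1.0, 0.0]])
        # Illustrative process / measurement noise covariances
        self.Q = np.diag([1e-5, 1e-8, 1e-4, 1e-7])
        self.R = np.diag([1e-3, 1e-2])
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def predict(self, u):
        """Propagate the state with the IMU input (runs at the IMU rate)."""
        self.x = self.F @ self.x + self.G @ u
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, y):
        """Correct with a GPS yaw-angle/velocity measurement."""
        S = self.C @ self.P @ self.C.T + self.R
        K = self.P @ self.C.T @ np.linalg.inv(S)
        self.x = self.x + K @ (y - self.C @ self.x)
        self.P = (np.eye(4) - K @ self.C) @ self.P
```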

2.2. Vehicle Kinematics

The sensor fusion information is used to build the vehicle kinematics model and thereby obtain the vehicle trajectory. The vehicle motion is first simplified and described as motion in a two-dimensional plane, as shown in Figure 1. The kinematics model uses three parameters to describe the motion of the vehicle: the current abscissa $X(t)$, the current ordinate $Y(t)$ and the current yaw angle $\psi(t)$. A global coordinate system and a local coordinate system are established, the latter at the center of mass of the vehicle. The axes of the global coordinate system are $X$ and $Y$; those of the local coordinate system are $x$ and $y$. In Figure 1, $x_0$ and $y_0$ are the axes of the local coordinate system established by the vehicle at time $t_0$. $\psi$ is the angle between the longitudinal axis of the vehicle and the X-axis, $V$ is the velocity at the center of mass, and $V_X$ and $V_Y$ are the projections of the vehicle velocity onto the X-axis and Y-axis of the global coordinate system. The kinematics equations of the vehicle in the global coordinate system are:
$$ \dot{X}(t) = V \cos(\psi(t) + \beta) \qquad (11) $$

$$ \dot{Y}(t) = V \sin(\psi(t) + \beta) \qquad (12) $$

$$ \dot{\psi}(t) = \gamma \qquad (13) $$
From Equations (11)–(13), it can be seen that the vehicle's position is determined by the yaw rate, the velocity of the center of mass and the sideslip angle.
In the global coordinate system, the vehicle position is described as $[X \;\; Y \;\; \psi]$. Assuming the vehicle's position at the initial time $t_0$ is $P(t_0) = [X(t_0) \;\; Y(t_0) \;\; \psi(t_0)]^T$, the position at time $t_1$ can be expressed as:

$$ P(t_1) = \begin{bmatrix} X(t_1) \\ Y(t_1) \\ \psi(t_1) \end{bmatrix} = \begin{bmatrix} X(t_0) + \int_{t_0}^{t_1} V_X(\tau)\, d\tau \\ Y(t_0) + \int_{t_0}^{t_1} V_Y(\tau)\, d\tau \\ \psi(t_0) + \int_{t_0}^{t_1} \gamma(\tau)\, d\tau \end{bmatrix} \qquad (14) $$

2.3. Vehicle Trajectory and Lane Polynomial

The vision sensor performs lane detection in a local coordinate system. Therefore, the vehicle trajectory and lane are defined in the local coordinate system of the vehicle. Lane curves and vehicle motion trajectories are shown in Figure 2.
Assuming that the vehicle moves at constant velocity $V_x$ and yaw rate $\dot{\psi}$, the vehicle trajectory can be approximated by a parabola. The trajectory $f_v(x)$ is expressed as:

$$ f_v(x) = \frac{\rho_v}{2} x^2 = \frac{\dot{\psi}}{2 V_x} x^2 \qquad (15) $$
At present, the general method is to describe the two-dimensional geometry of the lane through a clothoid (spiral) model. Taking an expressway as an example, the lane radius is generally more than 100 m, so the curvature and curvature rate are usually small and the geometry of the lane can be expressed by a cubic polynomial. The left and right lanes can thus be described as:
$$ f_L(x) = c_{L0} + c_{L1} x + c_{L2} x^2 + c_{L3} x^3 \qquad (16) $$

$$ f_R(x) = c_{R0} + c_{R1} x + c_{R2} x^2 + c_{R3} x^3 \qquad (17) $$

where, at the current moment, $c_{L0}$ and $c_{R0}$ are the lateral offsets between the vehicle and the left and right lane, $c_{L1}$ and $c_{R1}$ are the heading angles, $c_{L2}$ and $c_{R2}$ are the curvatures, and $c_{L3}$ and $c_{R3}$ are the curvature rates of the left and right lane, respectively.
Averaging the left and right lanes gives the polynomial of the road centerline $f_M(x)$:

$$ f_M(x) = c_0 + c_1 x + c_2 x^2 + c_3 x^3 \qquad (18) $$

where $c_0 = (c_{L0} + c_{R0})/2$, $c_1 = (c_{L1} + c_{R1})/2$, $c_2 = (c_{L2} + c_{R2})/2$ and $c_3 = (c_{L3} + c_{R3})/2$.
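A small sketch of the lane-model bookkeeping described above: averaging the left/right cubic coefficients to obtain the centerline and evaluating the polynomial. The coefficient values in the usage below are hypothetical.

```python
def centerline_coeffs(cL, cR):
    """Average left/right cubic-lane coefficients [c0, c1, c2, c3]
    to get the road-centerline polynomial coefficients."""
    return [(l + r) / 2.0 for l, r in zip(cL, cR)]

def eval_lane(c, x):
    """Evaluate f(x) = c0 + c1*x + c2*x^2 + c3*x^3."""
    return c[0] + c[1] * x + c[2] * x ** 2 + c[3] * x ** 3
```

For example, a left lane offset 1 m to one side and a right lane 3 m to the other, with equal heading terms, yield a centerline offset of 2 m.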

3. Lane Parameters Estimation

The system is a multi-rate system. The visual sensor, IMU, GPS and vehicle controller operate at different update rates, and the data can be obtained through the vehicle CAN bus. The information transmission of sensors and controllers is shown in Figure 3.
In Figure 3, the update period of the vision sensor is $T_{cam}$, that of the IMU is $T_c$, and that of the GPS is $T_G$, with $T_G > T_{cam} > T_c$. Since the Kalman filter and the IMU operate at the same rate, only the relationship between $T_{cam}$ and $T_c$ needs to be considered. Assuming there is an integer $n$ ($n > 1$) such that $T_{cam} = n T_c$, and following the update-rate difference between the IMU and the camera shown in Figure 4, the time instant $t$ is defined as:

$$ t = \left(k + \frac{i}{n}\right) T_{cam} \qquad (19) $$

where $k$ ($k = 0, 1, \ldots$) indexes the update periods of the vision sensor and $i$ ($i = 0, 1, \ldots, n-1$) those of the IMU.
It is assumed that at time $k$ the vision sensor obtains stable lane information, and that the longitudinal velocity and yaw angle are obtained by fusing the IMU and GPS data. Owing to the difference in sensor update rates, the vehicle trajectory can be computed continuously within one vision-sensor update interval. Assuming the initial vehicle position at time $k$ is $P(t_0) = [0 \;\; 0 \;\; 0]^T$, the vehicle position at time $k+1$ after discretization is:

$$ \hat{\psi}(k|i) = \sum_{j=0}^{i} \psi(k|j) \qquad (20) $$

$$ \hat{x}(k+1|0) = \sum_{i=0}^{n-1} V_x(k|i) \cos(\hat{\psi}(k|i))\, T_c \qquad (21) $$

$$ \hat{y}(k+1|0) = \sum_{i=0}^{n-1} V_x(k|i) \sin(\hat{\psi}(k|i))\, T_c \qquad (22) $$

where $V_x$ is the component of the speed $V$ in the local coordinate system of the vehicle.
The vehicle position and trajectory equation are solved using Equations (15) and (20)–(22). The prediction method derived below predicts the coefficients of the lane polynomial at time $k+1$.
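The discrete position update of Equations (20)–(22) can be sketched as follows. As an assumption of this sketch, the `psi` array holds the per-step yaw increments $\psi(k|j)$, which are accumulated into $\hat{\psi}(k|i)$.

```python
import numpy as np

def dead_reckon(vx, psi, dt):
    """Dead reckoning over one camera period from the local origin.

    vx  : array of per-step longitudinal speeds Vx(k|i)
    psi : array of per-step yaw increments psi(k|j) (assumed form)
    dt  : IMU sample time Tc
    Returns (x_hat, y_hat, psi_hat) at time k+1.
    """
    psi_cum = np.cumsum(psi)                      # psi_hat(k|i)
    x = np.sum(vx * np.cos(psi_cum) * dt)         # Eq. (21)
    y = np.sum(vx * np.sin(psi_cum) * dt)         # Eq. (22)
    return x, y, psi_cum[-1]
```

Driving straight at 20 m/s for 10 IMU steps of 10 ms covers 2 m with no lateral motion.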

3.1. Lateral Offset Estimation

In the local coordinate system, the relationship between the vehicle center of mass and the lane is used to predict the lane coefficient, which is shown in Figure 5.
The line perpendicular to the tangent of the vehicle trajectory at point $P_{k+1}$ intersects the lane at point $Q_{k+1}$. Let $g(x)$ be the line through $P_{k+1}$ and $Q_{k+1}$:

$$ g(x) = a x + b \qquad (23) $$

In the plane rectangular coordinate system, the angle between $g(x)$ and the x-axis is $\frac{\pi}{2} + \hat{\psi}(k+1|0)$, and the slope $a$ and intercept $b$ are:

$$ a = \tan\left(\frac{\pi}{2} + \hat{\psi}(k+1|0)\right) \qquad (24) $$

$$ b = \hat{y}(k+1|0) - a\, \hat{x}(k+1|0) \qquad (25) $$
Setting Equation (18) equal to Equation (23) gives:

$$ c_0 + c_1 x + c_2 x^2 + c_3 x^3 = a x + b \qquad (26) $$

The horizontal coordinate of point $Q_{k+1}$ can be solved by the secant method. Define the coordinates $Q_{k+1}(\hat{x}_Q(k+1|0),\; f_L(\hat{x}_Q(k+1|0)))$. The Euclidean distance between $P_{k+1}$ and $Q_{k+1}$ then gives the lateral offset $\hat{c}_0(k+1|0)$:

$$ \hat{c}_0(k+1|0) = \operatorname{sgn}(f_L(\hat{x}_Q(k+1|0))) \sqrt{X_e^2 + Y_e^2} \qquad (27) $$

where:

$$ X_e = \hat{x}_Q(k+1|0) - \hat{x}(k+1|0) \qquad (28) $$

$$ Y_e = f_L(\hat{x}_Q(k+1|0)) - f_v(\hat{x}(k+1|0)) \qquad (29) $$
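A sketch of this lateral-offset step: intersect the cubic lane with the perpendicular line $g(x) = ax + b$ by the secant method, then take the signed Euclidean distance. The secant start points bracketing the vehicle position are an assumption of this sketch, not specified in the paper.

```python
import math

def secant(f, x0, x1, tol=1e-9, max_iter=50):
    """Secant root-finder for f(x) = 0."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1 - f0) < 1e-15:       # avoid division by ~0
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

def lateral_offset(cL, a, b, xp, yp):
    """Solve cubic lane = line a*x + b for the foot point Q, then return
    the signed distance from the vehicle point P = (xp, yp) to Q."""
    lane = lambda x: cL[0] + cL[1] * x + cL[2] * x ** 2 + cL[3] * x ** 3
    f = lambda x: lane(x) - (a * x + b)
    xq = secant(f, xp - 1.0, xp + 1.0)  # assumed start points around P
    yq = lane(xq)
    # Sign follows the side of the lane point, as in the sgn(.) term
    return math.copysign(math.hypot(xq - xp, yq - yp), yq)
```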

3.2. Heading Angle Estimation

As shown in Figure 5, the heading angle is the angle between the tangent of the vehicle trajectory and the tangent of the lane. Using the geometric relations, the heading angle $\hat{c}_1$ is calculated from the slopes of the tangents at points $P_{k+1}$ and $Q_{k+1}$:

$$ \tan(\theta_1(k+1|0)) = f_v'(x)\big|_{x = \hat{x}(k+1|0)} \qquad (30) $$

$$ \tan(\theta_2(k+1|0)) = f_L'(x)\big|_{x = \hat{x}_Q(k+1|0)} \qquad (31) $$

As shown in Figure 5, $\hat{c}_1(k+1|0)$ is the difference between $\theta_1(k+1|0)$ and $\theta_2(k+1|0)$, and its tangent is:

$$ \tan(\hat{c}_1(k+1|0)) = \tan(\theta_1(k+1|0) - \theta_2(k+1|0)) = \frac{m_1 - m_2}{1 + m_1 m_2} \qquad (32) $$

where $m_1 = \tan(\theta_1(k+1|0))$ and $m_2 = \tan(\theta_2(k+1|0))$.

When the heading angle is small, $\hat{c}_1(k+1|0)$ is approximately:

$$ \hat{c}_1(k+1|0) = \tan^{-1}\left(\frac{m_1 - m_2}{1 + m_1 m_2}\right) \approx \frac{m_1 - m_2}{1 + m_1 m_2} \qquad (33) $$
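The heading-angle computation can be sketched directly from the tangent slopes of the trajectory parabola and the cubic lane; under the small-angle approximation, the tangent value itself is returned.

```python
def traj_slope(psi_dot, vx, x):
    """Slope of the trajectory parabola f_v(x) = (psi_dot / (2*vx)) * x**2."""
    return psi_dot / vx * x

def lane_slope(c, x):
    """Slope of the cubic lane: f'(x) = c1 + 2*c2*x + 3*c3*x**2."""
    return c[1] + 2.0 * c[2] * x + 3.0 * c[3] * x ** 2

def heading_angle(m1, m2):
    """tan(c1) = (m1 - m2) / (1 + m1*m2); for small heading angles c1 is
    approximated by this tangent value directly."""
    return (m1 - m2) / (1.0 + m1 * m2)
```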

3.3. Curvature and Curvature Rate Estimation

According to the design characteristics of expressways, the curvature rate of the lane is approximately constant within a single period. From Equation (18), $f_L''(x) = 2 c_2 + 6 c_3 x$; since the vehicle position at the current moment is the origin of the local coordinate system, $f_L''(0) = 2 c_2$. The curvature and curvature rate at time $k+1$ are then:

$$ \hat{c}_2(k+1|0) = \ddot{f}_L(x)\big|_{x = \hat{x}(k+1|0)} = 2 c_2(k|0) + 6 c_3(k|0)\, \hat{x}(k+1|0) \qquad (34) $$

$$ \hat{c}_3(k+1|0) = \frac{\hat{c}_2(k+1|0) - c_2(k|0)}{\kappa} \qquad (35) $$

where $\kappa$ is a tuning parameter related to the vehicle speed, approximately equal to the sampling time of the vision sensor.
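A literal sketch of this curvature and curvature-rate update, implementing the two expressions above as written; κ is the speed-related tuning parameter, taken here as the camera sample time. The numeric values in the test are hypothetical.

```python
def curvature_update(c2_k, c3_k, x_next, kappa):
    """Predict the curvature coefficient c2 and curvature-rate coefficient
    c3 at time k+1 from their values at time k.

    c2_k, c3_k : current coefficients c2(k|0), c3(k|0)
    x_next     : predicted longitudinal position x_hat(k+1|0)
    kappa      : tuning parameter, roughly the camera sample time
    """
    # c2_hat(k+1|0) = f_L''(x_hat) = 2*c2 + 6*c3*x_hat
    c2_next = 2.0 * c2_k + 6.0 * c3_k * x_next
    # c3_hat(k+1|0) = (c2_hat(k+1|0) - c2(k|0)) / kappa
    c3_next = (c2_next - c2_k) / kappa
    return c2_next, c3_next
```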
If the sensor fails to provide lane information within several sampling cycles, the predictive compensation method derived above is used for road compensation. However, when the sensor is ineffective for a long time, the performance of lane prediction will decline due to the lack of road information feedback.

4. Experimental Results

The method is evaluated using a hardware-in-the-loop (HIL) simulation platform and vehicle experiments.

4.1. Simulation

In the hardware-in-the-loop simulation, Carsim software is used to simulate an A-class vehicle. The velocity is set to 25 m/s (i.e., 90 km/h). The expected path is a third-order polynomial curve that conforms to clothoid constraints. The parameters used in Carsim are the nominal values of the test vehicle. The HIL platform is shown in Figure 6, and the HIL process is shown in Figure 7.
As shown in Figure 7, the 27-DOF nonlinear vehicle model of Carsim simulates the virtual vehicle. The vehicle state input, steering controller feedback and compensation algorithm all run on a dSPACE AutoBox, and the communication between the parts is realized over the CAN bus.
The prediction performance of the method is verified on the HIL platform, and the steering change caused by sensor failure during lane restoration is simulated. In the simulation, the following three scenarios were considered: (a) the vision sensor detects the road information normally; (b) the vision sensor fails every 10 s, with variable failure duration; (c) when the vision sensor fails, the compensation algorithm restores the lane.
Figure 8 shows the comparison of lateral offset, heading angle, curvature and curvature rate in the three scenarios. The blue line shows the data measured by the virtual sensor. The red line shows the measured data which contains the failure of the virtual sensor. The green line shows the data predicted using the compensation algorithm. On the road, the failure occurred at 5, 15, 25, 35, 45 and 55 s, respectively. The first 50 s shown in the figure are a variable curvature road and the remaining time is an approximately straight road. The results show that the method can achieve good results on both straight and curved roads.
Figure 9 shows the errors between the measured data and the restored data. The statistical information of the errors is shown in Table 1. The lateral offset error is less than ±2 × 10−2 m, the heading angle error is less than ±4 × 10−4 rad, the curvature error is less than ±1.2 × 10−5 m−1, and the curvature-rate error is less than 1.5 × 10−7 m−2. The RMSE of each variable is far smaller than the error bound, which indicates that the predicted lane is very close to the measured lane. The steering angle error caused by the maximum prediction error is far less than 1°, so the influence of the prediction error on control accuracy can be neglected.
For safety reasons, the HIL platform is used to verify lane keeping based on the compensation method. We designed a lane-keeping controller based on PID theory, and bad points were inserted manually in the program. In Figure 10, the blue line indicates lane-keeping control with sudden bad points, while the red dotted line represents lane-keeping control after compensation by the algorithm; the tracking performance is represented by the steering wheel angle. At 25 s, the bad points cause the uncompensated control to produce a sudden change in steering wheel angle, which destabilizes the controlled vehicle and can easily cause accidents. After compensation, the control remains stable and the vehicle transitions smoothly until the sensor returns to normal.

4.2. Vehicle Test

Figure 11 shows the experimental vehicle and the experimental road section. The experimental vehicle is equipped with sensors including a Mobileye camera, an IMU and GPS. The update rate of each sensor is shown in Table 2. The update periods of the ECU controller, vision sensor, IMU and GPS all differ, making this a typical multi-rate system. The ECU communicates with the sensors through the CAN bus. The vehicle velocity is controlled between 60 km/h and 70 km/h. The prediction method is compared with the measured data of the vision sensor. The characteristics of the experimental road are as follows:
  • Total length: 4000 m (straight line section: 1500 m; curve line section: 2500 m)
  • Width: 2 lanes (each lane is 3.5 m wide)
  • Curve radius: 250 m and 400 m.
Figure 12 shows the performance of the estimation method in a vehicle test. The red line shows the data measured by the vision sensor. The blue dotted line shows the data estimated by the compensation algorithm. During the test, when one of the two lane lines could not be detected, this was also considered a failure. The compensation effect is good in both straight and curved sections.
Figure 13 shows the lane estimation between 20 s and 30 s in four frames. The blue lines represent the lanes measured by the camera. The black dotted lines represent the center lane calculated from the left and right lines. The green asterisks show that, when the camera fails temporarily, the method proposed in this paper predicts the lane well. The predicted lane meets the requirements of lane-keeping control.

5. Conclusions

In this study, a lane compensation method based on sensor fusion is proposed to compensate for the short-time failure of vision sensors. The method is divided into two parts. The first part is the vehicle trajectory acquisition based on sensor fusion, and the second part is the lane prediction using the relative position of the vehicle and road. The hardware-in-the-loop and vehicle experiments show that the short-time failure of vision sensors can be compensated by this method.
The algorithm has two main contributions. First, a lane compensation method based on sensor fusion is proposed: through the fusion of low-cost sensors, high-precision vehicle state information is obtained, and the relative position of the vehicle and road is then used to predict the road coefficients. Second, simulation and vehicle experiments verify the effectiveness and real-time performance of the lane compensation algorithm, validating its applicability to discontinuous bad points of short duration.
In the future, we will verify the algorithm under more complex working conditions and consider combining other sensors to extend the compensation time. On this basis, we will consider the impact of speed on the algorithm to improve the compensation method. In addition, lane detection methods will also be a focus of our research.

Author Contributions

Conceptualization and Methodology, Y.L.; Writing-Original Draft Preparation and Investigation, W.Z.; Funding Acquisition and Resources, X.J.; Formal Analysis, C.R.; Writing-Review & Editing and Resources, J.W.

Funding

This research was funded by [Nature Science Foundation of Shandong Province] grant number [ZR2016EEQ06] and [Tsinghua University Initiative Scientific Research Program] grant number [20161080033].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chinese Society for Artificial Intelligence. China Artificial Intelligence Series White Paper Intelligent Driving; Chinese Society for Artificial Intelligence: Beijing, China, 2017; p. 11. [Google Scholar]
  2. Jung, S.; Youn, J.; Sull, S. Efficient Lane Detection Based on Spatiotemporal Images. IEEE Trans. Intell. Transp. Syst. 2015, 17, 289–295. [Google Scholar] [CrossRef]
  3. Sivaraman, S.; Trivedi, M.M. Integrated lane and vehicle detection, localization, and tracking: A synergistic approach. IEEE Trans. Intell. Transp. Syst. 2013, 14, 906–917. [Google Scholar] [CrossRef]
  4. Low, C.Y.; Zamzuri, H.; Mazlan, S.A. Simple robust road lane detection algorithm. In Proceedings of the 2014 5th International Conference on Intelligent and Advanced Systems (ICIAS), Kuala Lumpur, Malaysia, 3–5 June 2014. [Google Scholar]
  5. Tan, T.; Yin, S.; Quyang, P. Efficient lane detection system based on monocular camera. In Proceedings of the 2015 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 9–12 January 2015. [Google Scholar]
  6. Chen, Q.; Wang, H. A real-time lane detection algorithm based on a hyperbola-pair model. In Proceedings of the Intelligent Vehicles Symposium, Tokyo, Japan, 13–15 June 2006. [Google Scholar]
  7. Wang, Y.; Teoh, E.K.; Shen, D. Lane detection and tracking using B-Snake. Image Vis. Comput. 2004, 22, 269–280. [Google Scholar] [CrossRef][Green Version]
  8. Wang, Y.; Shen, D.; Teoh, E.K. Lane detection using spline model. Pattern Recognit. Lett. 2000, 21, 677–689. [Google Scholar] [CrossRef]
  9. Chiu, K.Y.; Lin, S.F. Lane detection using color-based segmentation. In Proceedings of the Intelligent Vehicles Symposium, Las Vegas, NV, USA, 6–8 June 2005. [Google Scholar]
  10. Ma, C.; Mao, L.; Zhang, Y.F. Lane detection using heuristic search methods based on color clustering. In Proceedings of the 2010 International Conference on Communications, Circuits and Systems (ICCCAS), Chengdu, China, 28–30 July 2010. [Google Scholar]
  11. Liu, G.; Li, S.; Liu, W. Lane detection algorithm based on local feature extraction. In Proceedings of the Chinese Automation Congress (CAC), Changsha, China, 7–8 November 2013. [Google Scholar]
  12. Wu, P.C.; Chang, C.Y.; Lin, C.H. Lane-mark extraction for automobiles under complex conditions. Pattern Recognit. 2014, 47, 2756–2767. [Google Scholar] [CrossRef]
  13. You, F.; Zhang, R.; Zhong, L.; Wang, H.; Xu, J. Lane detection algorithm for night-time digital image based on distribution feature of boundary pixels. J. Opt. Soc. Korea 2013, 17, 188–199. [Google Scholar] [CrossRef]
  14. Du, X.; Tan, K.K. Vision-based approach towards lane line detection and vehicle localization. Mach. Vis. Appl. 2016, 27, 175–191. [Google Scholar] [CrossRef]
  15. Hoang, T.M.; Baek, N.R.; Cho, S.W.; Kim, K.W.; Park, K.R. Road lane detection robust to shadows based on a fuzzy system using a visible light camera sensor. Sensors 2017, 17, 2475. [Google Scholar] [CrossRef] [PubMed]
  16. Lei, G.; Amp, W.J. Lane Detection Under Vehicles Disturbance. Automot. Eng. 2007, 29, 372–376. [Google Scholar]
  17. Kim, Z. Robust lane detection and tracking in challenging scenarios. IEEE Trans. Intell. Transp. Syst. 2008, 9, 16–26. [Google Scholar] [CrossRef]
  18. Assidiq, A.A.M.; Khalifa, O.O.; Islam, R.; Khan, S. Real time lane detection for autonomous vehicles. In Proceedings of the International Conference on Computer and Communication Engineering, Kuala Lumpur, Malaysia, 13–15 May 2008. [Google Scholar]
  19. Satzoda, R.K.; Sathyanarayana, S.; Srikanthan, T. Robust extraction of lane markings using gradient angle histograms and directional signed edges. In Proceedings of the Intelligent Vehicles Symposium, Madrid, Spain, 3–7 June 2012; pp. 754–759. [Google Scholar]
  20. Niu, J.W.; Lu, J.; Xu, M.L.; Liu, P. Robust Lane Detection using Two-stage Feature Extraction with Curve Fitting. Pattern Recognit. 2016, 59, 225–233. [Google Scholar] [CrossRef]
  21. Tapia-Espinoza, R.; Torres-Torriti, M. Robust lane sensing and departure warning under shadows and occlusions. Sensors 2013, 13, 3270–3298. [Google Scholar] [CrossRef] [PubMed]
  22. Suhr, J.K.; Jang, J.; Min, D.; Jung, H.G. Sensor Fusion-Based Low-Cost Vehicle Localization System for Complex Urban Environments. IEEE Trans. Intell. Transp. Syst. 2016, 18, 1–9. [Google Scholar] [CrossRef]
23. Gu, Y.L.; Hsu, L.T.; Kamijo, S. Passive Sensor Integration for Vehicle Self-Localization in Urban Traffic Environment. Sensors 2015, 15, 30199–30220. [Google Scholar] [CrossRef]
  24. Kang, C.; Lee, S.H.; Chung, C.C. On-road vehicle localization with GPS under long term failure of a vision sensor. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems (ITSC), Gran Canaria, Spain, 15–18 September 2015. [Google Scholar]
  25. Hu, Z.C.; Uchimura, K. Fusion of Vision, GPS and 3D Gyro Data in Solving Camera Registration Problem for Direct Visual Navigation. Int. J. ITS Res. 2006, 4, 3–12. [Google Scholar]
  26. Tao, Z.; Bonnifait, P.; Frémont, V. Mapping and localization using GPS, lane markings and proprioceptive sensors. In Proceedings of the IEEE RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 406–412. [Google Scholar]
  27. Jung, J.; Bae, S.H. Real-Time Road Lane Detection in Urban Areas Using LiDAR Data. Electronics 2018, 7, 276. [Google Scholar] [CrossRef]
  28. Vivacqua, R.; Vassallo, R.; Martins, F. A low cost sensors approach for accurate vehicle localization and autonomous driving application. Sensors 2017, 17, 2359. [Google Scholar] [CrossRef]
  29. Meng, X.; Wang, H.; Liu, B. A robust vehicle localization approach based on gnss/imu/dmi/lidar sensor fusion for autonomous vehicles. Sensors 2017, 17, 2140. [Google Scholar] [CrossRef]
  30. Lee, G.I.; Kang, C.M.; Lee, S.H. Multi object-based predictive virtual lane. In Proceedings of the IEEE International Conference on Intelligent Transportation Systems, Yokohama, Japan, 16–19 October 2018. [Google Scholar]
  31. Song, M.; Kim, C.; Kim, M.; Yi, K. Robust lane tracking algorithm for forward target detection of automated driving vehicles. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2018. [Google Scholar] [CrossRef]
  32. Kang, C.M.; Lee, S.H.; Chung, C.C. Lane estimation using a vehicle kinematic lateral motion model under clothoidal road constraints. In Proceedings of the 2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014. [Google Scholar]
  33. Son, Y.S.; Kim, W.; Lee, S.H.; Chung, C.C. Robust multirate control scheme with predictive virtual lanes for lane-keeping system of autonomous highway driving. IEEE Trans. Veh. Technol. 2015, 64, 3378–3391. [Google Scholar] [CrossRef]
  34. Kang, C.M.; Lee, S.H.; Kee, S.C.; Chung, C.C. Kinematics-based Fault-tolerant Techniques: Lane Prediction for an Autonomous Lane Keeping System. Int. J. Control Autom. Syst. 2018, 16, 1293–1302. [Google Scholar] [CrossRef]
  35. Son, Y.S.; Lee, S.H.; Chung, C.C. Predictive virtual lane method using relative motions between a vehicle and lanes. Int. J. Control Autom. Syst. 2015, 13, 146–155. [Google Scholar] [CrossRef]
  36. Lee, S.H.; Chung, C.C. Robust multirate on-road vehicle localization for autonomous highway driving vehicles. IEEE Trans. Control Syst. Technol. 2017, 25, 577–589. [Google Scholar] [CrossRef]
  37. Adrian, Z.; Marc, W. Control algorithm for hands-off lane centering on motorways. In Proceedings of the Aachen Colloquium Automobile and Engine Technology, Aachen, Germany, 12 October 2011; pp. 1–13. [Google Scholar]
  38. Ji, X.; Wu, J.; Zhao, Y.; Liu, Y.; Zhan, X. A new robust control method for active front steering considering the intention of the driver. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2015, 229, 518–531. [Google Scholar] [CrossRef]
  39. Wu, J.; Cheng, S.; Liu, B.; Liu, C. A Human-Machine-Cooperative-Driving Controller Based on AFS and DYC for Vehicle Dynamic Stability. Energies 2017, 10, 1737. [Google Scholar] [CrossRef]
  40. Ji, X.; Liu, Y.; He, X.; Yang, K.; Na, X.; Lv, C. Interactive Control Paradigm based Robust Lateral Stability Controller Design for Autonomous Automobile Path Tracking with Uncertain Disturbance: A Dynamic Game Approach. IEEE Trans. Veh. Technol. 2018, 67, 6906–6920. [Google Scholar] [CrossRef]
  41. Kalman, R.E. A new approach to linear filtering and prediction problems. J. Basic Eng. 1960, 82, 35–45. [Google Scholar] [CrossRef]
Figure 1. Vehicle kinematics model.
Figure 2. Vehicle trajectory and lane polynomial.
Figure 3. System block diagram.
Figure 4. Update periods of the IMU and camera sensor.
Figure 5. Relationship between vehicle’s trajectory and lane polynomial.
Figure 6. Hardware-in-the-loop (HIL) platform.
Figure 7. Flow chart of HIL.
Figure 8. Comparison in three scenarios.
Figure 9. Errors between measured and restored values.
Figure 10. Comparison of steering wheel angle.
Figure 11. Experimental vehicles and test sites.
Figure 12. Lane test results.
Figure 13. Lane estimation at time k, k + 2, k + 4, k + 6.
Table 1. Lane prediction results.
Parameter               Error Boundaries    RMSE
Lateral offset [m]      1 × 10−2            3.9 × 10−3
Heading angle [rad]     4 × 10−4            1.18 × 10−4
Curvature [1/m]         1.2 × 10−5          4 × 10−6
Curvature rate [1/m²]   1.5 × 10−7          3.24 × 10−8
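The four rows of Table 1 correspond to the four coefficients of the cubic lane model described in the abstract. As a minimal sketch (not the authors' implementation), the lane can be written as y(x) = c0 + c1·x + c2·x² + c3·x³, where c0 is the lateral offset, c1 approximates the heading angle, 2·c2 the curvature, and 6·c3 the curvature rate; the function and coefficient values below are illustrative only.

```python
def lane_lateral_offset(coeffs, x):
    """Lateral position of the lane at longitudinal distance x [m]."""
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * x + c2 * x ** 2 + c3 * x ** 3

def lane_heading(coeffs, x):
    """Lane heading (slope dy/dx of the polynomial) at distance x [m]."""
    _, c1, c2, c3 = coeffs
    return c1 + 2 * c2 * x + 3 * c3 * x ** 2

# Illustrative coefficients: 0.1 m lateral offset, 0.01 rad heading,
# 1e-3 1/m curvature (c2 = curvature/2), 1e-5 1/m^2 curvature rate (c3 = rate/6).
coeffs = (0.1, 0.01, 1e-3 / 2, 1e-5 / 6)
y_at_10m = lane_lateral_offset(coeffs, 10.0)
```

Evaluating the polynomial and its derivative this way makes the mapping between the fitted coefficients and the physical quantities in Table 1 explicit.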
Table 2. Update period of sensor.
Sensor/Parameters       Update Period (ms)
ECU                     10
Vision (Mobileye)       70
GPS (Trimble)           500
IMU (BOSCH)             10
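The update periods in Table 2 imply a multirate problem: the 10 ms control loop outpaces the 70 ms vision frames and 500 ms GPS fixes, so the lane polynomial must be predicted on the ticks with no fresh measurement. A hedged sketch of that timing (dictionary and function names are illustrative, not from the paper):

```python
# Update periods from Table 2, in milliseconds.
UPDATE_PERIOD_MS = {"ECU": 10, "Vision": 70, "GPS": 500, "IMU": 10}

def sensors_refreshed_at(t_ms):
    """Sensors that deliver a new sample on the control tick at time t_ms."""
    return sorted(s for s, period in UPDATE_PERIOD_MS.items() if t_ms % period == 0)

# Over one 500 ms GPS period the 10 ms loop runs 50 times, but only a few
# of those ticks carry a fresh vision measurement; on all other ticks the
# lane must be propagated from the vehicle kinematics.
vision_ticks = [t for t in range(0, 500, 10) if t % 70 == 0]
```

Counting the vision-bearing ticks shows why the compensation scheme is needed: most control cycles run on predicted, not measured, lane coefficients.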

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).