# A Data-Driven Path-Tracking Model Based on Visual Perception Behavior Analysis and ANFIS Method


## Abstract


## 1. Introduction

1. Based on the visual behavior observed in path tracking, a visual perception module was established to extract visual inputs resembling those of human drivers, accounting for the compensation, preview, and anticipation characteristics of driving.
2. Based on the ANFIS method, a decision-making module is proposed to generate the steering wheel angle; it learns steering behavior from human driving data and mimics the fuzzy description and reasoning mechanism of human drivers.
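To make the second contribution concrete: ANFIS realizes a first-order Takagi–Sugeno fuzzy system whose membership and consequent parameters are fit to data. The sketch below shows one Sugeno inference step, the rule structure that ANFIS tunes; all parameter values in the example are illustrative placeholders, not the trained model from this paper.

```python
import numpy as np

def gauss_mf(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def sugeno_infer(inputs, centers, sigmas, consequents):
    """One first-order Takagi-Sugeno inference step.

    inputs:      (d,)     crisp input vector, e.g. [v, e_l, e_theta]
    centers:     (r, d)   membership-function centers, one row per rule
    sigmas:      (r, d)   membership-function widths
    consequents: (r, d+1) linear consequent parameters [p_1..p_d, bias]
    """
    # Rule firing strengths: product of per-input memberships (product T-norm).
    w = np.prod(gauss_mf(inputs, centers, sigmas), axis=1)
    w_norm = w / np.sum(w)  # normalization layer
    # Each rule's output is a linear function of the inputs.
    rule_out = consequents[:, :-1] @ inputs + consequents[:, -1]
    return float(w_norm @ rule_out)  # weighted (defuzzified) output
```

In ANFIS these parameters would be identified from the recorded driving data by the hybrid least-squares/backpropagation procedure.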

## 2. Human-like Driver Model and Visual Perception

#### 2.1. Human-like Driver Model

- Vehicle longitudinal velocity v, which, besides road curvature, is a notable factor influencing human steering behavior.
- Lateral deviation ${e}_{l}$ (${e}_{l}=\frac{{D}_{L}-{D}_{R}}{2}$), which reflects the vehicle’s lateral position in the lane in the near zone of the road ahead. It is derived from the relationship between the vehicle heading and the visible lane lines and does not require information about the invisible centerline. ${D}_{L}$ denotes the distance between the vehicle heading and the left lane line 6 m ahead; ${D}_{R}$ denotes the distance between the vehicle heading and the right lane line 6 m ahead.
- Heading angle error ${e}_{\theta}$ (${e}_{\theta}={\theta}_{t}$ when the TP exists; ${e}_{\theta}={\theta}_{f}$ when the TP does not exist), which reflects the road curvature in the far zone of the future road, as well as the effect of the road geometry on the driver’s preview distance. ${\theta}_{t}$ denotes the angle between the vehicle heading and the line segment connecting the VP and the TP; ${\theta}_{f}$ denotes the angle between the vehicle heading and the line segment connecting the VP and FP′.
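A minimal sketch of how these two visual inputs could be computed from the quantities defined above. The function names are our own, and treating the VP as a point with planar coordinates is an assumption; the formulas themselves follow the definitions in the text.

```python
import math

def lateral_deviation(d_left, d_right):
    """e_l = (D_L - D_R) / 2, from the distances between the vehicle
    heading and the left/right lane lines 6 m ahead."""
    return (d_left - d_right) / 2.0

def heading_angle_error(heading, vp, far_point):
    """Angle between the vehicle heading and the line segment from the
    VP to the far point (the TP when it exists, FP' otherwise).
    heading is the vehicle heading in radians; vp and far_point are
    (x, y) coordinates."""
    angle_to_point = math.atan2(far_point[1] - vp[1], far_point[0] - vp[0])
    e = angle_to_point - heading
    # Wrap the result into (-pi, pi].
    return math.atan2(math.sin(e), math.cos(e))
```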

#### 2.2. Steering Control

## 3. Collection of Human Driving Data

#### 3.1. Setup

#### 3.2. Test Road

#### 3.3. Procedure

#### 3.4. Tangent Point Information

1. The left and right lane lines are represented by equidistant discrete points: ${L}_{1}({X}_{l1},{Y}_{l1},{C}_{l1}),\dots,{L}_{n}({X}_{ln},{Y}_{ln},{C}_{ln}),\dots$ and ${R}_{1}({X}_{r1},{Y}_{r1},{C}_{r1}),\dots,{R}_{n}({X}_{rn},{Y}_{rn},{C}_{rn}),\dots$. ${L}_{n}$ and ${R}_{n}$ denote the n-th points of the left and right lane lines; ${X}_{ln}$, ${Y}_{ln}$, and ${C}_{ln}$ (respectively ${X}_{rn}$, ${Y}_{rn}$, and ${C}_{rn}$) denote the X-coordinate, Y-coordinate, and curvature at those points.
2. Compute the first-order difference quotient of adjacent points as an approximation of the slope of the tangent to the lane line: ${K}_{ln}=\frac{{Y}_{l(n+1)}-{Y}_{ln}}{{X}_{l(n+1)}-{X}_{ln}}$; ${K}_{rn}=\frac{{Y}_{r(n+1)}-{Y}_{rn}}{{X}_{r(n+1)}-{X}_{rn}}$. These slopes approximate the road heading at the corresponding positions.
3. Determine the start point (S) of the search: the lane-line point nearest to the vehicle’s current position $P(X,Y)$.
4. Determine the calculation range: search the lane-line points 10–30 m ahead of the vehicle’s current position, i.e., $10\le \sqrt{{(X-{X}_{ln})}^{2}+{(Y-{Y}_{ln})}^{2}}\le 30$ and $10\le \sqrt{{(X-{X}_{rn})}^{2}+{(Y-{Y}_{rn})}^{2}}\le 30$.
5. Compute the slopes of the sight lines from the vehicle’s center of gravity (CG) to each lane-line point: ${K}_{vln}=\frac{{Y}_{ln}-Y}{{X}_{ln}-X}$; ${K}_{vrn}=\frac{{Y}_{rn}-Y}{{X}_{rn}-X}$.
6. Compute the absolute slope differences ${E}_{ln}=|{K}_{vln}-{K}_{ln}|$ and ${E}_{rn}=|{K}_{vrn}-{K}_{rn}|$.
7. If ${E}_{ln}<T$ (a threshold value) and ${C}_{ln}>0$, the TP is the point minimizing ${E}_{ln}$; if ${E}_{rn}<T$ and ${C}_{rn}<0$, the TP is the point minimizing ${E}_{rn}$. The coordinates of the TP $({X}_{t},{Y}_{t})$, the curvature at the TP ${C}_{t}$, and the distance ${D}_{t}$ between the TP and the vehicle are then obtained. Otherwise, set ${X}_{t}=0$, ${Y}_{t}=0$, ${C}_{t}=0$, and ${D}_{t}=30$.
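The steps above can be sketched as a vectorized search over one lane line. The threshold T is not specified in the text, so the default below is a placeholder; step (3)'s start point is subsumed here by the 10–30 m range filter of step (4).

```python
import numpy as np

def find_tangent_point(lane_xy, lane_curv, veh_xy, threshold=0.05,
                       d_min=10.0, d_max=30.0, left=True):
    """Tangent-point search following steps (1)-(7).

    lane_xy:   (n, 2) discrete lane-line points (X, Y)
    lane_curv: (n,)   curvature at each point
    veh_xy:    (2,)   vehicle CG position P(X, Y)
    Returns (X_t, Y_t, C_t, D_t), or (0, 0, 0, 30) when no TP is found.
    """
    x, y = lane_xy[:, 0], lane_xy[:, 1]
    # Step (2): tangent slope by first-order difference quotient.
    k_lane = np.diff(y) / np.diff(x)
    # Step (4): restrict to points 10-30 m from the vehicle.
    d = np.hypot(x[:-1] - veh_xy[0], y[:-1] - veh_xy[1])
    in_range = (d >= d_min) & (d <= d_max)
    # Step (5): slope of the sight line from the CG to each point.
    k_view = (y[:-1] - veh_xy[1]) / (x[:-1] - veh_xy[0])
    # Step (6): absolute slope difference.
    e = np.abs(k_view - k_lane)
    # Step (7): curvature sign depends on the side (left: C > 0, right: C < 0).
    curv_ok = lane_curv[:-1] > 0 if left else lane_curv[:-1] < 0
    cand = in_range & curv_ok & (e < threshold)
    if not np.any(cand):
        return 0.0, 0.0, 0.0, 30.0  # TP absent in the far zone
    i = np.flatnonzero(cand)[np.argmin(e[cand])]
    return x[i], y[i], lane_curv[i], d[i]
```

On a straight segment (zero curvature) the curvature test fails everywhere, so the function returns the fallback values from step (7).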

#### 3.5. Data Processing

## 4. Model Simulation and Result Analysis

#### 4.1. Validation Scenario and Speed

#### 4.1.1. Validation on the Original Test Road

#### 4.1.2. Validation on a New Test Road

#### 4.2. Model Evaluation

#### 4.3. Result and Analysis

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## References

1. Chen, B.; Sun, D.; Zhou, J.; Wong, W.; Ding, Z. A future intelligent traffic system with mixed autonomous vehicles and human-driven vehicles. *Inf. Sci.* **2020**, *529*, 59–72.
2. Meng, L.Z.Q.; Chen, H. Kalman Filter-Based Fusion Estimation Method of Steering Feedback Torque for Steer-by-Wire Systems. *Automot. Innov.* **2021**, *4*, 430–439.
3. Li, L.; Kaoru, O.; Dong, M. Human-like driving: Empirical decision-making system for autonomous vehicles. *IEEE Trans. Veh. Technol.* **2018**, *67*, 6814–6823.
4. McRuer, D.T.; Krendel, E.S. The human operator as a servo system element. *J. Frankl. Inst.* **1959**, *267*, 381–403.
5. Kondo, M.; Ajimine, A. Driver’s sight point and dynamics of the driver-vehicle-system related to it. In *SAE Technical Paper*; SAE International: Warrendale, PA, USA, 1968.
6. MacAdam, C.C. Application of an optimal preview control for simulation of closed-loop automobile driving. *IEEE Trans. Syst. Man Cybern.* **1981**, *11*, 393–399.
7. Zhou, X.; Jiang, H.; Li, A.; Ma, S. A new single point preview-based human-like driver model on urban curved roads. *IEEE Access* **2020**, *8*, 107452–107464.
8. Li, B.; Ouyang, Y.; Li, X. Mixed-Integer and Conditional Trajectory Planning for an Autonomous Mining Truck in Loading/Dumping Scenarios: A Global Optimization Approach. *IEEE Trans. Intell. Veh.* **2023**, *8*, 1512–1522.
9. Tan, Y.; Shen, H.; Huang, M.; Mao, J. Driver directional control using two-point preview and fuzzy decision. *J. Appl. Math. Mech.* **2016**, *80*, 459–465.
10. Li, Y.; Nan, Y.; He, J.; Feng, Q.; Zhang, Y.; Fan, J. Study on lateral assisted control for commercial vehicles. In Proceedings of the 2019 14th IEEE Conference on Industrial Electronics and Applications (ICIEA), Xi’an, China, 19–21 June 2019; pp. 567–572.
11. Li, B.; Zhang, Y.; Zhang, T. Embodied Footprints: A Safety-Guaranteed Collision-Avoidance Model for Numerical Optimization-Based Trajectory Planning. *arXiv* **2023**, arXiv:2302.07622.
12. Land, M.; Horwood, J. Which parts of the road guide steering? *Nature* **1995**, *377*, 339–340.
13. Li, A.; Jiang, H.; Li, Z.; Zhou, J.; Zhou, X. Human-like trajectory planning on curved road: Learning from human drivers. *IEEE Trans. Intell. Transp. Syst.* **2020**, *21*, 3388–3397.
14. Macadam, C.C. Understanding and modeling the human driver. *Veh. Syst. Dyn.* **2003**, *40*, 101–134.
15. Marino, R.; Scalzi, S.; Netto, M. Nested PID steering control for lane keeping in autonomous vehicles. *Control Eng. Pract.* **2011**, *19*, 1459–1467.
16. Piao, C.; Liu, X.; Lu, C. Lateral control using parameter self-tuning LQR on autonomous vehicle. In Proceedings of the 2019 International Conference on Intelligent Computing, Automation and Systems (ICICAS), Chongqing, China, 6–8 December 2019; pp. 913–917.
17. Jiang, H.; Tian, H.; Hua, Y. Model predictive driver model considering the steering characteristics of the skilled drivers. *Adv. Mech. Eng.* **2019**, *11*, 1687814019829337.
18. Ling, R.; Shen, H.; Gu, J.; Mao, J.; Zhang, Y.; Miao, X.; Zhang, H. Vision based steering controller for intelligent vehicles via fuzzy logic. In Proceedings of the 2011 International Conference on Electronics and Optoelectronics, Dalian, China, 29–31 July 2011; Volume 4, pp. 417–420.
19. Li, A.; Jiang, H.; Zhou, J.; Zhou, X. Implementation of human-like driver model based on recurrent neural networks. *IEEE Access* **2019**, *7*, 98094–98106.
20. Sentouh, C.; Chevrel, P.; Mars, F.; Claveau, F. A sensorimotor driver model for steering control. In Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics, San Antonio, TX, USA, 11–14 October 2009; pp. 2462–2467.
21. Nash, C.J.; Cole, D.J. Modelling the influence of sensory dynamics on linear and nonlinear driver steering control. *Veh. Syst. Dyn.* **2018**, *56*, 689–718.
22. Nash, C.J.; Cole, D.J.; Bigler, R.S. A review of human sensory dynamics for application to models of driver steering and speed control. *Biol. Cybern.* **2016**, *110*, 91–116.
23. van der El, K.; Pool, D.M.; Mulder, M. Measuring and modeling driver steering behavior: From compensatory tracking to curve driving. *Transp. Res. Part F Traffic Psychol. Behav.* **2019**, *61*, 337–346.
24. Nash, C.; Cole, D. Identification and validation of a driver steering control model incorporating human sensory dynamics. *Veh. Syst. Dyn.* **2020**, *58*, 495–517.
25. Abduljabbar, R.; Dia, H.; Liyanage, S.; Bagloee, S.A. Applications of artificial intelligence in transport: An overview. *Sustainability* **2019**, *11*, 189.
26. Jang, J.R. ANFIS: Adaptive-network-based fuzzy inference system. *IEEE Trans. Syst. Man Cybern.* **1993**, *23*, 665–685.
27. Salvucci, D.D. Modeling driver behavior in a cognitive architecture. *Hum. Factors* **2006**, *48*, 362–380.
28. Wang, W.; Mao, Y.; Jin, J.; Wang, X.; Guo, H.; Ren, X.; Ikeuchi, K. Driver’s various information process and multi-ruled decision-making mechanism: A fundamental of intelligent driving shaping model. *Int. J. Comput. Intell. Syst.* **2011**, *4*, 297–305.
29. Schwarting, W.; Alonso-Mora, J.; Rus, D. Planning and decision-making for autonomous vehicles. *Annu. Rev. Control Robot. Auton. Syst.* **2018**, *1*, 187–210.
30. Plöchl, M.; Edelmann, J. Driver models in automobile dynamics application. *Veh. Syst. Dyn.* **2007**, *45*, 699–741.
31. Wolfe, B.; Dobres, J.; Rosenholtz, R.; Reimer, B. More than the useful field: Considering peripheral vision in driving. *Appl. Ergon.* **2017**, *65*, 316–325.
32. Xing, D.; Li, X.; Zheng, X.; Ren, Y.; Ishiwatari, Y. Study on driver’s preview time based on field tests. In Proceedings of the 2017 4th International Conference on Transportation Information and Safety (ICTIS), Banff, AB, Canada, 8–10 August 2017; pp. 575–580.
33. Lappi, O. Future path and tangent point models in the visual control of locomotion in curve driving. *J. Vis.* **2017**, *14*, 21.
34. Kandil, F.I.; Rotter, A.; Lappe, M. Car drivers attend to different gaze targets when negotiating closed vs. open bends. *J. Vis.* **2010**, *10*, 24.
35. Lappi, O.; Rinkkala, P.; Pekkanen, J. Systematic observation of an expert driver’s gaze strategy—An on-road case study. *Front. Psychol.* **2017**, *8*, 620.
36. Erwin, R. What preview elements do drivers need? *IFAC-PapersOnLine* **2016**, *49*, 102–107.
37. Schnelle, S.; Wang, J.; Su, H.; Jagacinski, R. A personalizable driver steering model capable of predicting driver behaviors in vehicle collision avoidance maneuvers. *IEEE Trans. Hum. Mach. Syst.* **2017**, *47*, 625–635.

**Figure 2.** Diagram for visual inputs of the decision-making module: (**a**) when the TP exists in the defined far zone; (**b**) when the TP does not exist in the defined far zone.

**Figure 6.** “Human View” setting in PreScan. It is a camera used as a visual aid during simulation; it was linked to the vehicle and placed at the position of the driver’s eyes, so it reproduces a real driver’s view angle.

**Figure 9.** TP information of one driving process: (**a**) TP distribution; (**b**) distance between the TP and the vehicle. The distance between the vehicle and the TP is denoted by ${D}_{t}$; the serial number of the sampling data reflects, to some degree, the distance traveled along the road. Note that ${D}_{t}$ is defined as 30 m when the TP lies outside the far zone, so ${D}_{t}$ differs from the actual distance to the TP when no TP exists within 30 m.

**Figure 12.** Simulation results of lateral position of the proposed driver model: (**a**) on the original test road; (**b**) on the new test road.

**Figure 13.** Comparison of steering wheel angles: (**a**) original test road, 20 km/h; (**b**) original test road, 30 km/h; (**c**) original test road, 40 km/h; (**d**) original test road, 50 km/h; (**e**) original test road, 60 km/h; (**f**) new test road, 36 km/h; (**g**) new test road, 54 km/h.

| Sensory System | Information Relevant to Driving |
|---|---|
| Visual | Spatial location and geometry of the future road, such as road curvature and heading [22], and the vehicle’s motion relative to the surrounding environment, such as lateral, longitudinal, and yaw velocity [22]. |
| Vestibular | Rotational and translational motion, such as angular velocity and acceleration [22]. |
| Somatosensory | Angle and torque of the steering wheel [22]; displacement and force of the pedals [22]. |

| Description | Value | Unit |
|---|---|---|
| Total mass of the vehicle | 1480 | kg |
| Moment of inertia of the vehicle body around the z-axis | 2562 | kg·m${}^{2}$ |
| Cornering stiffness of the front/rear tires (left, right merged) | 62,191/98,727 | N/rad |
| Vehicle width | 1.86 | m |
| Distance from the center of gravity to the front/rear axle | 1.059/1.641 | m |
| Steering ratio | 20 | – |
| Maximum steering wheel angle | 500 | deg |
| Maximum steering wheel angle rate | 1200 | deg/s |
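The parameters in the table are sufficient to set up the classic linear two-degree-of-freedom (sideslip/yaw) bicycle model commonly used in lateral-dynamics simulation. The sketch below is that standard textbook model, not necessarily the exact vehicle model used in PreScan.

```python
# Parameters from the vehicle-parameter table above.
m   = 1480.0   # total mass [kg]
Iz  = 2562.0   # yaw moment of inertia [kg·m^2]
Cf  = 62191.0  # front cornering stiffness, left + right [N/rad]
Cr  = 98727.0  # rear cornering stiffness, left + right [N/rad]
lf  = 1.059    # CG to front axle [m]
lr  = 1.641    # CG to rear axle [m]
i_s = 20.0     # steering ratio

def bicycle_step(beta, r, v, delta_sw, dt=0.01):
    """One Euler step of the linear 2-DOF bicycle model.

    beta:     sideslip angle [rad]
    r:        yaw rate [rad/s]
    v:        longitudinal velocity [m/s]
    delta_sw: steering-wheel angle [rad]; road-wheel angle is delta_sw / i_s
    """
    delta = delta_sw / i_s
    # Lateral force balance (small angles, linear tires).
    beta_dot = (-(Cf + Cr) / (m * v) * beta
                + ((Cr * lr - Cf * lf) / (m * v**2) - 1.0) * r
                + Cf / (m * v) * delta)
    # Yaw moment balance about the CG.
    r_dot = ((Cr * lr - Cf * lf) / Iz * beta
             - (Cf * lf**2 + Cr * lr**2) / (Iz * v) * r
             + Cf * lf / Iz * delta)
    return beta + dt * beta_dot, r + dt * r_dot
```

A positive steering-wheel input from rest produces positive sideslip and yaw-rate increments, as expected for a left turn in this sign convention.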

| Validation Scenario | Speed (km/h) | Human-like Model PCC | Human-like Model RMSE | Human-like Model MAE | Single-Point Model PCC | Single-Point Model RMSE | Single-Point Model MAE |
|---|---|---|---|---|---|---|---|
| Original test road | 20 | 0.9980 | 1.3772 | 1.0207 | 0.9959 | 1.9572 | 1.4915 |
| Original test road | 30 | 0.9977 | 1.6330 | 1.2146 | 0.9977 | 1.6166 | 1.2427 |
| Original test road | 40 | 0.9967 | 2.2237 | 1.6110 | 0.9971 | 2.0803 | 1.4797 |
| Original test road | 50 | 0.9955 | 2.9782 | 2.1483 | 0.9952 | 3.1237 | 2.2065 |
| Original test road | 60 | 0.9946 | 3.8313 | 2.7433 | 0.9882 | 5.7221 | 3.7759 |
| New test road | 36 | 0.9951 | 2.1811 | 1.4564 | 0.9952 | 2.1643 | 1.4462 |
| New test road | 54 | 0.9957 | 2.6366 | 1.9420 | 0.9924 | 3.4846 | 2.3408 |
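The three metrics above (PCC, RMSE, MAE) compare the modeled steering wheel angle against the recorded human angle. They can be computed as follows (the function names are ours):

```python
import numpy as np

def pcc(y_true, y_pred):
    """Pearson correlation coefficient between measured and modeled series."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))
```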


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Hu, Z.; Yu, Y.; Yang, Z.; Zhu, H.; Liu, L.; Zhou, Y.
A Data-Driven Path-Tracking Model Based on Visual Perception Behavior Analysis and ANFIS Method. *Electronics* **2024**, *13*, 61.
https://doi.org/10.3390/electronics13010061
