Developing Artificial Intelligence-Based Car-Following Models Using Improved Permutation Entropy Analysis Results
Abstract
1. Introduction
- Applying a new prediction method by using the IPE to analyze lead vehicle position data and incorporating the analysis results to develop artificial intelligence-based car-following models.
- Using and comparing four different models (ANN, LSTM, Transformer, and IDM) to examine the prediction method for follower acceleration forecasting.
- Conducting different tests to provide a comprehensive evaluation of the models’ performance, using four evaluation metrics (RMSE, MAE, MASE, and R2), statistical testing, ablation experiments, and feature contribution analysis.
- Offering practical insights into the models’ ability to predict the follower vehicle acceleration.
2. Materials and Methods
2.1. IPE Algorithm and Its Parameter Setting
2.1.1. First Hypothetical Case: Stopping Without Movement
2.1.2. Second Hypothetical Case: One Type of Movement (Free Flow or Congestion) Without Stopping
2.1.3. Third Hypothetical Case: Two Types of Movement (Free Flow and Congestion) Without Stopping
2.1.4. Fourth Hypothetical Case: All Types of Movements (Free Flow, Congestion, Stop, and Free Flow)
2.2. Data Preparation for IPE Applications
2.3. Applying IPE to Analyze Real Vehicle Trajectory Data
3. Model Formulation
3.1. Prediction Methods
3.2. Data Processing
3.3. Evaluation Methods
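The four evaluation metrics named above (RMSE, MAE, MASE, and R2) can be sketched as plain functions. Note that the MASE scaling below uses the one-step naive (persistence) forecast as its baseline, which is the common convention; the paper's exact scaling series is an assumption here.

```python
from math import sqrt

def rmse(y, yhat):
    """Root mean squared error."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mase(y, yhat):
    """Mean absolute scaled error: MAE scaled by the MAE of a
    one-step naive (persistence) forecast on the same series."""
    scale = sum(abs(y[i] - y[i - 1]) for i in range(1, len(y))) / (len(y) - 1)
    return mae(y, yhat) / scale

def r2(y, yhat):
    """Coefficient of determination."""
    ybar = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - ybar) ** 2 for a in y)
    return 1 - ss_res / ss_tot
```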
4. Prediction Models
4.1. ANNs
4.2. LSTMs
- Input gate: Using the sigmoid function to determine which new information from the current inputs will be used to update the cell state.
- Forget gate: Using the sigmoid function to determine which information from the previous cell state should be discarded.
- Output gate: Using the sigmoid function to determine which information from the cell state will be output as the hidden state.
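The three gates above can be written as one minimal scalar LSTM step; the weights here are illustrative placeholders (biases omitted), not values from the trained network.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step for scalar input and state; w is a dict of
    illustrative scalar weights."""
    i = sigmoid(w["xi"] * x + w["hi"] * h_prev)    # input gate
    f = sigmoid(w["xf"] * x + w["hf"] * h_prev)    # forget gate
    o = sigmoid(w["xo"] * x + w["ho"] * h_prev)    # output gate
    g = math.tanh(w["xg"] * x + w["hg"] * h_prev)  # candidate cell state
    c = f * c_prev + i * g                         # updated cell state
    h = o * math.tanh(c)                           # new hidden state
    return h, c
```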
4.3. Transformer
- 1- Input embeddings.
- 2- Positional encoding.
- 3- Generating the query, key, and value vectors.
- 4- Self-attention mechanisms.
- 5- Multi-head attention.
- 6- Layer normalization and residual connections.
- 7- Output layer.
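Steps 4 and 5 center on scaled dot-product attention, softmax(QKᵀ/√d_k)V; a minimal single-head sketch, independent of any framework:

```python
import math

def softmax(row):
    m = max(row)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on lists-of-lists matrices:
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = len(K[0])
    scores = [[sum(q * k for q, k in zip(qr, kr)) / math.sqrt(d_k)
               for kr in K] for qr in Q]
    weights = [softmax(row) for row in scores]
    # each output row is a convex combination of the value rows
    return [[sum(w * V[j][c] for j, w in enumerate(wr))
             for c in range(len(V[0]))] for wr in weights]
```

Multi-head attention simply runs several such heads on learned projections of Q, K, and V and concatenates the results.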
4.4. IDM
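The IDM baseline follows the standard Treiber et al. formulation, a_IDM = a_max[1 − (v/v0)^δ − (s*/s)²] with desired gap s* = s0 + vT + vΔv/(2√(a_max·b)); the parameter values below are illustrative defaults, not the calibrated values used in this study.

```python
import math

def idm_acceleration(v, dv, s, v0=33.3, T=1.5, a_max=1.0,
                     b=2.0, s0=2.0, delta=4):
    """Intelligent Driver Model acceleration.
    v  : follower speed (m/s)
    dv : approach rate v - v_lead (m/s)
    s  : gap to the leader (m)
    Parameters: desired speed v0, time headway T, maximum acceleration
    a_max, comfortable deceleration b, jam gap s0, exponent delta."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a_max * b))  # desired gap
    return a_max * (1 - (v / v0) ** delta - (s_star / s) ** 2)
```

A stopped follower far behind its leader accelerates toward the desired speed, while a fast vehicle closing on a short gap decelerates, as expected.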
5. Experimental Results and Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A


References
- Zhong, R.X.; Fu, K.Y.; Sumalee, A.; Ngoduy, D.; Lam, W.H.K. A cross-entropy method and probabilistic sensitivity analysis framework for calibrating microscopic traffic models. Transp. Res. Part C Emerg. Technol. 2016, 63, 147–169. [Google Scholar] [CrossRef]
- Ahmed, H.U.; Huang, Y.; Lu, P. A review of car-following models and modeling tools for human and autonomous-ready driving behaviors in micro-simulation. Smart Cities 2021, 4, 314–335. [Google Scholar] [CrossRef]
- Qin, P.; Li, X.; Bin, S.; Wu, F.; Pang, Y. Research on transformer and long short-term memory neural network car-following model considering data loss. Math. Biosci. Eng. 2023, 20, 19617–19635. [Google Scholar] [CrossRef]
- Zhu, K.; Yang, X.; Zhang, Y.; Liang, M.; Wu, J. A Heterogeneity-Aware Car-Following Model: Based on the XGBoost Method. Algorithms 2024, 17, 68. [Google Scholar] [CrossRef]
- Zhang, L. Swarm Intelligent Car-Following Model for Autonomous Vehicle Platoon Based on Particle Swarm Optimization Theory. Electronics 2025, 14, 1851. [Google Scholar] [CrossRef]
- Gazis, D.C.; Herman, R.; Rothery, R.W. Nonlinear Follow-The-Leader Models of Traffic Flow. Oper. Res. 1961, 9, 545–567. [Google Scholar]
- Newell, G.F. Nonlinear Effects in the Dynamics of Car Following. Oper. Res. 1961, 9, 209–229. [Google Scholar] [CrossRef]
- Gipps, P.G. A Behavioural Car-Following Model for Computer Simulation. Transp. Res. Part B Methodol. 1981, 15, 105–111. [Google Scholar] [CrossRef]
- Treiber, M.; Hennecke, A.; Helbing, D. Congested Traffic States in Empirical Observations and Microscopic Simulations. arXiv 2000. [Google Scholar] [CrossRef] [PubMed]
- Jiang, R.; Wu, Q.; Zhu, Z. Full velocity difference model for a car-following theory. Phys. Rev. E 2001, 64, 017101. [Google Scholar] [CrossRef]
- Zhang, T.; Jin, P.J.; McQuade, S.T.; Bayen, A.; Piccoli, B. Car-Following Models: A Multidisciplinary Review. IEEE Trans. Intell. Veh. 2025, 10, 92–116. [Google Scholar] [CrossRef]
- Kehtarnavaz, N.; Member, S.; Griswold, N.; Miller, K.; Lescoe, P. A Transportable Neural-Network Approach to Autonomous Vehicle Following. IEEE Trans. Veh. Technol. 1998, 47, 694–702. [Google Scholar] [CrossRef]
- Panwai, S.; Dia, H. Neural agent car-following models. IEEE Trans. Intell. Transp. Syst. 2007, 8, 60–70. [Google Scholar] [CrossRef]
- Khodayari, A.; Ghaffari, A.; Kazemi, R.; Braunstingl, R. A modified car-following model based on a neural network model of the human driver effects. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2012, 42, 1440–1449. [Google Scholar] [CrossRef]
- Zhou, M.; Qu, X.; Li, X. A recurrent neural network based microscopic car following model to predict traffic oscillation. Transp. Res. Part C Emerg. Technol. 2017, 84, 245–264. [Google Scholar] [CrossRef]
- Huang, X.; Sun, J.; Sun, J. A car-following model considering asymmetric driving behavior based on long short-term memory neural networks. Transp. Res. Part C Emerg. Technol. 2018, 95, 346–362. [Google Scholar] [CrossRef]
- Wu, F.; Work, D.B. Connections between classical car following models and artificial neural networks. In Proceedings of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; IEEE: New York, NY, USA, 2018. [Google Scholar]
- Ma, L.; Qu, S. A sequence to sequence learning based car-following model for multi-step predictions considering reaction delay. Transp. Res. Part C Emerg. Technol. 2020, 120, 102785. [Google Scholar] [CrossRef]
- Colombaroni, C.; Fusco, G.; Isaenko, N. Modeling Car following with Feed-Forward and Long-Short Term Memory Neural Networks. Transp. Res. Procedia 2021, 52, 195–202. [Google Scholar] [CrossRef]
- Zhou, J.; Wan, J.; Zhu, F. Transfer Learning Based Long Short-Term Memory Car-Following Model for Adaptive Cruise Control. IEEE Trans. Intell. Transp. Syst. 2022, 23, 21345–21359. [Google Scholar] [CrossRef]
- Qu, D.; Wang, S.; Liu, H.; Meng, Y. A Car-Following Model Based on Trajectory Data for Connected and Automated Vehicles to Predict Trajectory of Human-Driven Vehicles. Sustainability 2022, 14, 7045. [Google Scholar] [CrossRef]
- Qin, P.; Li, H.; Li, Z.; Guan, W.; He, Y. A CNN-LSTM Car-Following Model Considering Generalization Ability. Sensors 2023, 23, 660. [Google Scholar] [CrossRef]
- Adewale, A.; Lee, C. Prediction of Car-Following Behavior of Autonomous Vehicle and Human-Driven Vehicle Based on Drivers’ Memory and Cooperation with Lead Vehicle. Transp. Res. Rec. 2024, 2678, 248–266. [Google Scholar] [CrossRef]
- Wang, K.; Liu, Y.; Zhang, J. Modeling Human-Like Car-Following Model for Intelligent Vehicles Based on Deep Reinforcement Learning. Int. J. Automot. Technol. 2025, 27, 163–172. [Google Scholar] [CrossRef]
- Li, T.; Halatsis, A.; Stern, R. RACER: Rational Artificial Intelligence Car-Following-Model Enhanced by Reality. IEEE Trans. Intell. Transp. Syst. 2025, 26, 21199–21214. [Google Scholar] [CrossRef]
- Liu, Y.; Wang, S.; Panicker, A.; Embry, K.; Asanova, A.; Li, T. A Phase-Aware AI Car-Following Model for Electric Vehicles with Adaptive Cruise Control: Development and Validation Using Real-World Data. arXiv 2025. [Google Scholar] [CrossRef]
- Petrov, A.I. Information and Entropy Aspects of the Specifics of Regional Road Traffic Accident Rate in Russia. Information 2023, 14, 138. [Google Scholar] [CrossRef]
- Azami, H.; Sanei, S.; Rajji, T.K. Ensemble entropy: A low bias approach for data analysis. Knowl. Based Syst. 2022, 256, 109876. [Google Scholar] [CrossRef]
- Zanin, M.; Zunino, L.; Rosso, O.A.; Papo, D. Permutation entropy and its main biomedical and econophysics applications: A review. Entropy 2012, 14, 1553–1577. [Google Scholar] [CrossRef]
- Liu, Z.; Xu, C.; Chen, L.; Zhou, S. Dynamic Traffic Flow Entropy Calculation Based on Vehicle Spacing. In IOP Conference Series: Earth and Environmental Science; Institute of Physics Publishing: Bristol, UK, 2019. [Google Scholar] [CrossRef]
- Lyu, W.; Gonçalves, R.; Guo, F.; Torrão, G.; Radhakrishnan, V.; Guillen, P.P.; Louw, T.; Merat, N. Applying Entropy to Understand Drivers’ Uncertainty during Car-following. In Proceedings of the 12th International Conference on Measuring Behavior and the 6th Seminar on Behavioral Methods, Krakow, Poland, 13–15 October 2021; pp. 13–15. [Google Scholar] [CrossRef]
- Zhang, Z.; Xiang, Z.; Chen, Y.; Xu, J. Fuzzy permutation entropy derived from a novel distance between segments of time series. AIMS Math. 2020, 5, 6244–6260. [Google Scholar] [CrossRef]
- Liu, Z.; Wang, Y.; Cheng, Q.; Yang, H. Analysis of the Information Entropy on Traffic Flows. IEEE Trans. Intell. Transp. Syst. 2022, 23, 18012–18023. [Google Scholar] [CrossRef]
- Tang, L. Application of information entropy algorithm in safety risk prediction of road traffic driving behaviors. Adv. Transp. Stud. 2023, 1, 147–158. [Google Scholar] [CrossRef]
- Baldini, G. On the application of entropy measures with sliding window for intrusion detection in automotive in-vehicle networks. Entropy 2020, 22, 1044. [Google Scholar] [CrossRef] [PubMed]
- Guo, C.; Zhang, J.; Cao, Z. Traffic congestion recognition based on information entropy. In Proceedings of the International Conference on Smart Transportation and City Engineering; SPIE: Bellingham, WA, USA, 2021; pp. 523–528. [Google Scholar] [CrossRef]
- Cui, Z.; Chen, G.; Liu, B.; Li, D. A Multiscale Symbolic Dynamic Entropy Analysis of Traffic Flow. J. Adv. Transp. 2022, 2022, 8389229. [Google Scholar] [CrossRef]
- Ye, W.; Xu, Y.; Shi, X.; Shiwakoti, N.; Ye, Z.; Zheng, Y. A macroscopic safety indicator for road segment: Application of entropy theory. Phys. A Stat. Mech. Its Appl. 2024, 642, 129787. [Google Scholar] [CrossRef]
- Zhou, J.; Li, Y.; Wang, M. Research on the Threshold Determination Method of the Duffing Chaotic System Based on Improved Permutation Entropy and Poincaré Mapping. Entropy 2023, 25, 1654. [Google Scholar] [CrossRef] [PubMed]
- Zhou, J.; Hao, B.; Li, Y.; Yang, X. Underwater Small Target Detection Method Based on the Short-Time Fourier Transform and the Improved Permutation Entropy. Acoustics 2024, 6, 870–884. [Google Scholar] [CrossRef]
- Zheng, K.; Gan, H.S.; Chaw, J.K.; Teh, S.H.; Chen, Z. Generalized Gaussian Distribution Improved Permutation Entropy: A New Measure for Complex Time Series Analysis. Entropy 2024, 26, 960. [Google Scholar] [CrossRef]
- Bandt, C.; Pompe, B. Permutation Entropy: A Natural Complexity Measure for Time Series. Phys. Rev. Lett. 2002, 88, 174102. [Google Scholar] [CrossRef]
- Duran, O.; Fumagalli, L.; Arata, A. Permutation Entropy and Ordinal Patterns as a Resilience Indicator. In Proceedings of the 2023 7th International Conference on System Reliability and Safety, ICSRS 2023; Institute of Electrical and Electronics Engineers Inc.: New York, NY, USA, 2023; pp. 384–388. [Google Scholar] [CrossRef]
- Cuesta-Frau, D. Permutation entropy: Influence of amplitude information on time series classification performance. Math. Biosci. Eng. 2019, 16, 6842–6857. [Google Scholar] [CrossRef]
- Chen, Z.; Li, Y.; Liang, H.; Yu, J. Improved permutation entropy for measuring complexity of time series under noisy condition. Complexity 2019, 2019, 1403829. [Google Scholar] [CrossRef]
- Shahatha, A.M.; Şahin, İ. Permutation Entropy and Improved Permutation Entropy Applied to Traffic Flow Variables. In Graduate Student Conference—Transportation—2025 Fall Semester 22/11/2025; TMMOB Chamber of Civil Engineers Istanbul Branch: Istanbul, Turkey, 2025; Available online: https://ogrenci.imo.org.tr/index-eng.html#kitap (accessed on 14 December 2025).
- Zunino, L.; Olivares, F.; Scholkmann, F.; Rosso, O.A. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions. Phys. Lett. Sect. A Gen. At. Solid State Phys. 2017, 381, 1883–1892. [Google Scholar] [CrossRef]
- Mansourian, N.; Sarafan, S.; Ghirmai, T.; Cao, H.; Azar, F.T. Novel QRS Detection Based on the Adaptive Improved Permutation Entropy. Biomed. Signal Process. Control 2022, 80, 104270. [Google Scholar] [CrossRef]
- Federal Highway Administration. Next Generation Simulation (NGSIM) Vehicle Trajectory Data, US-101 Study Area (Cahuenga Blvd. On-Ramp/Off-Ramp). 2007. Available online: https://ops.fhwa.dot.gov/trafficanalysistools/ngsim.htm (accessed on 23 February 2026).
- Moller, M.F. A scaled conjugate gradient algorithm for fast supervised learning. Neural Netw. 1993, 6, 525–533. [Google Scholar] [CrossRef]
- Haykin, S.S. Neural Networks and Learning Machines, 3rd ed.; Prentice Hall/Pearson: Hamilton, ON, Canada, 2009. [Google Scholar]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Tepetidis, N.; Koutsoyiannis, D.; Iliopoulou, T.; Dimitriadis, P. Investigating the Performance of the Informer Model for Streamflow Forecasting. Water 2024, 16, 2882. [Google Scholar] [CrossRef]
- Krichen, M.; Mihoub, A. Long Short-Term Memory Networks: A Comprehensive Survey. AI 2025, 6, 215. [Google Scholar] [CrossRef]
- Lindemann, B.; Maschler, B.; Sahlab, N.; Weyrich, M. A Survey on Anomaly Detection for Technical Systems using LSTM Networks. Comput. Ind. 2024, 131, 103498. [Google Scholar] [CrossRef]
- Hua, Y.; Zhao, Z.; Li, R.; Chen, X.; Liu, Z.; Zhang, H. Deep Learning with Long Short-Term Memory for Time Series Prediction. arXiv 2018. [Google Scholar] [CrossRef]
- Understanding LSTM Networks—Colah’s Blog. Available online: https://colah.github.io/posts/2015-08-Understanding-LSTMs/ (accessed on 6 March 2026).
- Zhao, J.; Li, X.; Xue, Q.; Zhang, W. Spatial-Channel Transformer Network for Trajectory Prediction on the Traffic Scenes. arXiv 2021. [Google Scholar] [CrossRef]
- Pazho, A.D.; Noghre, G.A.; Katariya, V.; Tabkhi, H. VT-Former: An Exploratory Study on Vehicle Trajectory Prediction for Highway Surveillance Through Graph Isomorphism and Transformer. arXiv 2024. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention Is All You Need. arXiv 2023. [Google Scholar] [CrossRef]
- What Is a Transformer Model?|IBM. Available online: https://www.ibm.com/think/topics/transformer-model#1280257394 (accessed on 23 February 2026).
- Dong, S.; Wang, P.; Abbas, K. A Survey on Deep Learning and Its Applications; Elsevier Ireland Ltd.: Amsterdam, The Netherlands, 2021. [Google Scholar] [CrossRef]
- Srinivas, A.; Lin, T.Y.; Parmar, N.; Shlens, J.; Abbeel, P.; Vaswani, A. Bottleneck transformers for visual recognition. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition; IEEE Computer Society: New York, NY, USA, 2021; pp. 16514–16524. [Google Scholar] [CrossRef]
- Hung, W.-C.; Kretzschmar, H.; Lin, T.-Y.; Chai, Y.; Yu, R.; Yang, M.-H.; Anguelov, D. SoDA: Multi-Object Tracking with Soft Data Association. arXiv 2020. [Google Scholar] [CrossRef]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image Is Worth 16 × 16 Words: Transformers for Image Recognition at Scale. arXiv 2020. [Google Scholar] [CrossRef]
- Khan, S.; Naseer, M.; Hayat, M.; Zamir, S.W.; Khan, F.S.; Shah, M. Transformers in Vision: A Survey. ACM Comput. Surv. 2022, 54, 1–41. [Google Scholar] [CrossRef]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. arXiv 2020. [Google Scholar] [CrossRef]
- Zhou, S.; Zheng, S.; Tian, J.; Jiang, R.; Zhang, H.M. Twenty-Five Years of the Intelligent Driver Model: Foundations, Extensions, Applications, and Future Directions. arXiv 2025. [Google Scholar] [CrossRef]
| L \ D | 3 | 4 |
|---|---|---|
| 2 | 8 | 16 |
| 3 | 27 | 81 |
| 4 | 64 | 256 |
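The table above lists L^D, the number of possible symbol patterns for L quantization levels and word length D. As an illustrative reading of how an amplitude-aware improved permutation entropy produces this count (in the spirit of the IPE variants cited in the references, not the paper's exact algorithm), one can quantize each sample into L levels, slide a window of length D, and take the normalized Shannon entropy of the resulting words:

```python
from collections import Counter
from math import log

def improved_permutation_entropy(series, D=3, L=4):
    """Sketch of an amplitude-quantizing permutation entropy:
    map each sample to one of L levels, form words of length D,
    and normalize the Shannon entropy by log(L**D), the log of
    the number of possible patterns."""
    lo, hi = min(series), max(series)
    if hi == lo:                 # constant signal (e.g., stopped vehicle)
        return 0.0
    # quantize each sample into an integer level in 0..L-1
    levels = [min(int((x - lo) / (hi - lo) * L), L - 1) for x in series]
    words = Counter(tuple(levels[i:i + D])
                    for i in range(len(levels) - D + 1))
    n = sum(words.values())
    H = -sum((c / n) * log(c / n) for c in words.values())
    return H / log(L ** D)       # normalized to [0, 1]
```

A constant position series (the stopped-vehicle hypothetical case) yields zero entropy, while mixed free-flow/congestion movement yields a strictly positive value.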
| Group ID | Number of Vehicles | First Veh_ID in Group | Last Veh_ID in Group |
|---|---|---|---|
| G1 | 48 | 719 | 1058 |
| Prediction Method | Input Layer | Network Size | Output Layer |
|---|---|---|---|
| 1 | 3 neurons (∆x, ∆v, Vn) | 2 hidden layers with 5 neurons each | 1 neuron (follower acceleration) |
| 2 | 4 neurons (∆x, ∆v, Vn, IPE) | 2 hidden layers with 5 neurons each | 1 neuron (follower acceleration) |
| Parameter | Specification |
|---|---|
| Input layer size | 4 input neurons |
| Hidden layers | 2 hidden layers |
| Hidden layer 1 | 5 fully connected neurons |
| Hidden layer 2 | 5 fully connected neurons |
| Output layer | 1 fully connected neuron |
| Activation function (hidden) | Nonlinear activation tansig (default) |
| Training algorithm (optimization method) | Scaled conjugate gradient (trainscg) |
| Loss function | MSE |
| Epochs | 1000 (default) |
| Performance goal | 0 (default) |
| Minimum gradient | 1 × 10⁻⁶ |
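A forward pass matching the architecture in the table above (4 inputs → 5 → 5 → 1, tansig hidden activations, linear output) can be sketched as follows; the weights here are random placeholders, since the trained values would come from the scaled conjugate gradient (trainscg) training and are not listed.

```python
import math
import random

def tansig(z):
    # MATLAB's tansig is the hyperbolic tangent sigmoid
    return math.tanh(z)

def dense(x, W, b, act):
    """One fully connected layer: act(Wx + b)."""
    return [act(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

def ann_forward(x, params):
    """4-5-5-1 feed-forward pass: tansig hidden layers,
    linear output (predicted follower acceleration)."""
    h1 = dense(x, params["W1"], params["b1"], tansig)
    h2 = dense(h1, params["W2"], params["b2"], tansig)
    out = dense(h2, params["W3"], params["b3"], lambda z: z)
    return out[0]

def rand_mat(rows, cols):
    return [[random.uniform(-1, 1) for _ in range(cols)] for _ in range(rows)]

# illustrative random parameters, not a trained network
random.seed(0)
params = {"W1": rand_mat(5, 4), "b1": [0.0] * 5,
          "W2": rand_mat(5, 5), "b2": [0.0] * 5,
          "W3": rand_mat(1, 5), "b3": [0.0]}
```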
| Parameter | Specification |
|---|---|
| Input layer | Sequence input with 4 features |
| LSTM layer | 20 hidden units |
| Fully connected layer 1 | 10 neurons |
| Output layer | 1 fully connected neuron |
| Activation function | ReLU |
| Optimizer | RMSProp |
| Epochs | 100 |
| Learning rate | 0.001 |
| Parameter | Specification |
|---|---|
| Input embedding layer | Linear layer: 4 → d_model (64) |
| Embedding size (d_model) | 64 |
| Transformer encoder layers | 3 layers |
| Self-attention | Multi-head attention with 4 heads |
| Feedforward dimension | 128 |
| Output layer | Linear layer: d_model (64) → 1 |
| Activation function | ReLU (default) |
| Optimizer | Adam |
| Loss function | MSE |
| Learning rate | 0.001 |
| Epochs | 40 |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| IDM | 0.3953 | 0.3048 | 6.4268 | 0.5421 | 0.9497 | 0.7497 | 13.9983 | 0.8587 | 7.0966 | 5.4225 | 4.4596 | 0.9952 |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| ANN | 0.3736 | 0.2981 | 6.2438 | 0.5632 | 1.0678 | 0.8301 | 15.628 | 0.8591 | 8.1775 | 6.3999 | 5.2565 | 0.9956 |
| ANN & IPE | 0.3717 | 0.2956 | 6.1874 | 0.5664 | 1.0341 | 0.8062 | 15.158 | 0.8685 | 8.0029 | 6.2619 | 5.1372 | 0.9957 |
| Percentage of Improvement | 0.49% | 0.81% | 0.9% | 0.57% | 3.16% | 2.88% | 3.01% | 1.09% | 2.13% | 2.15% | 2.27% | 0.01% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| LSTM | 0.3732 | 0.2979 | 6.2012 | 0.5682 | 1.0861 | 0.8479 | 15.888 | 0.8519 | 7.6267 | 5.8705 | 4.8332 | 0.9962 |
| LSTM & IPE | 0.3712 | 0.2954 | 6.1595 | 0.5707 | 0.9537 | 0.7443 | 14.098 | 0.8792 | 6.5993 | 4.9967 | 4.1151 | 0.9968 |
| Percentage of Improvement | 0.52% | 0.85% | 0.67% | 0.44% | 12.2% | 12.2% | 11.3% | 3.21% | 13.5% | 14.9% | 14.9% | 0.06% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| Transformer | 0.3686 | 0.2891 | 6.0267 | 0.5661 | 1.0311 | 0.8110 | 14.446 | 0.8241 | 10.016 | 7.5448 | 6.2142 | 0.9717 |
| Transformer & IPE | 0.3611 | 0.2850 | 5.9529 | 0.5809 | 0.8692 | 0.6718 | 12.676 | 0.9009 | 6.0891 | 4.6686 | 3.8571 | 0.9972 |
| Percentage of Improvement | 2.04% | 1.42% | 1.22% | 2.62% | 15.7% | 17.2% | 12.3% | 9.32% | 39.2% | 38.1% | 37.9% | 2.62% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| IDM | 0.3953 | 0.3048 | 6.4268 | 0.5421 | 0.9497 | 0.7497 | 13.998 | 0.8587 | 7.0966 | 5.4225 | 4.4596 | 0.9952 |
| ANN & IPE | 0.3717 | 0.2956 | 6.1874 | 0.5664 | 1.0341 | 0.8062 | 15.158 | 0.8685 | 8.0029 | 6.2619 | 5.1372 | 0.9957 |
| Percentage of Improvement | 5.96% | 3.01% | 3.72% | 4.49% | −8.9% | −7.5% | −8.3% | 1.15% | −12% | −15% | −15% | 0.05% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| IDM | 0.3953 | 0.3048 | 6.4268 | 0.5421 | 0.9497 | 0.7497 | 13.998 | 0.8587 | 7.0966 | 5.4225 | 4.4596 | 0.9952 |
| LSTM & IPE | 0.3712 | 0.2954 | 6.1595 | 0.5707 | 0.9537 | 0.7443 | 14.098 | 0.8792 | 6.5993 | 4.9967 | 4.1151 | 0.9968 |
| Percentage of Improvement | 6.09% | 3.1% | 4.16% | 5.28% | −0.4% | 0.71% | −0.7% | 2.4% | 7% | 7.85% | 7.73% | 0.16% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | RMSE | MAE | MASE | R2 | |
| IDM | 0.3953 | 0.3048 | 6.4268 | 0.5421 | 0.9497 | 0.7497 | 13.998 | 0.8587 | 7.0966 | 5.4225 | 4.4596 | 0.9952 |
| Transformer & IPE | 0.3611 | 0.2850 | 5.9529 | 0.5809 | 0.8692 | 0.6718 | 12.676 | 0.9009 | 6.0891 | 4.6686 | 3.8571 | 0.9972 |
| Percentage of Improvement | 8.64% | 6.51% | 7.37% | 7.15% | 8.47% | 10.4% | 9.44% | 4.92% | 14.2% | 13.9% | 13.5% | 0.2% |
| Model Type | Follower Acceleration | Follower Speed | Follower Position | |||
|---|---|---|---|---|---|---|
| t Stat | p-Value | t Stat | p-Value | t Stat | p-Value | |
| ANN & IPE vs. ANN | −4.62 | 1.93 × 10−6 | −16.72 | 1.82 × 10−62 | −13.69 | 1.08 × 10−42 |
| LSTM & IPE vs. LSTM | −4.93 | 3.97 × 10−7 | −40.36 | 0 | −52.97 | 0 |
| Transformer & IPE vs. Transformer | −6.27 | 1.79 × 10−10 | −16.00 | 2.21 × 10−57 | −15.23 | 3.16 × 10−52 |
| ANN & IPE vs. IDM | −4.24 | 1.08 × 10−5 | 15.68 | 3.58 × 10−55 | 33.10 | 3.23 × 10−230 |
| LSTM & IPE vs. IDM | −5.75 | 4.39 × 10−9 | 4.34 | 7.02 × 10−6 | −1.91 | 0.028 |
| Transformer & IPE vs. IDM | −13.97 | 2.42 × 10−44 | −11.43 | 2.05 × 10−30 | −8.16 | 1.79 × 10−16 |
| Input Variable Scenario | Follower Acceleration RMSE | Follower Speed RMSE | Follower Position RMSE |
|---|---|---|---|
| (∆x, ∆v, Vn, IPE) | 0.3611 | 0.8692 | 6.0890 |
| (∆v, Vn, IPE) No ∆x | 0.5001 | 2.7549 | 24.9696 |
| (∆x, Vn, IPE) No ∆v | 0.6297 | 3.7290 | 61.8244 |
| (∆x, ∆v, IPE) No Vn | 0.3799 | 0.9296 | 6.9903 |
| Origin (∆x, ∆v, Vn) | 0.3686 | 1.0311 | 10.0169 |
| (∆x, ∆v, Vn, Xι *) | 0.3887 | 1.2129 | 15.3703 |
| (∆v, Vn, Xι) No ∆x | 0.4649 | 2.5667 | 49.5486 |
| (∆x, Vn, Xι) No ∆v | 0.7989 | 5.5709 | 70.2001 |
| (∆x, ∆v, Xι) No Vn | 0.3705 | 0.9642 | 9.2909 |
| Variable Contribution | Perturbation Analysis | Gradient Based Attribution | SHAP Values | Permutation Importance |
|---|---|---|---|---|
| ∆x | 26.66 | 32.24 | 34.75 | 31.26 |
| ∆v | 33.19 | 1.66 | 6.85 | 52.67 |
| Vn | 31.85 | 60.64 | 57.7 | 43.83 |
| IPE | 8.3 | 5.46 | 0.7 | 1.02 |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Shahatha, A.M.; Şahin, İ. Developing Artificial Intelligence-Based Car-Following Models Using Improved Permutation Entropy Analysis Results. Appl. Sci. 2026, 16, 4224. https://doi.org/10.3390/app16094224

