# A Hybrid DNN Model for Travel Time Estimation from Spatio-Temporal Features


## Abstract


## 1. Introduction

- A stacked autoencoder reduces the dimensionality of the temporal feature map;
- A graph neural network uses the spatial characteristics (road-connectivity graph) to learn the strength of connections, i.e., node-level spatial characteristics;
- The graph-convolved feature map is fed into a ResNet consisting of 26 fully connected layers with skip connections;
- The GNN and ResNet are hybridized so that the spatial and temporal characteristics are learned simultaneously.
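The four stages above can be sketched end to end. The following is a minimal NumPy sketch, not the authors' implementation: the layer sizes, the random graph, and all weight values are illustrative, and the real model stacks 26 fully connected layers rather than the single residual block shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Illustrative sizes: N road links, 50 raw temporal features reduced to 16.
N, D_IN, D_ENC = 32, 50, 16

X = rng.normal(size=(N, D_IN))                 # raw temporal features per link
A = (rng.random((N, N)) < 0.1).astype(float)   # random stand-in connectivity graph
A = np.maximum(A, A.T)                         # undirected
np.fill_diagonal(A, 1.0)                       # self-loops

# 1) Trained SAE encoder compresses 50 -> 16 temporal features.
W_enc = rng.normal(size=(D_IN, D_ENC)) * 0.1
T = relu(X @ W_enc)

# 2) One GNN propagation step mixes each link's features with its neighbours'.
W_gnn = rng.normal(size=(D_ENC, D_ENC)) * 0.1
Z = relu(A @ T @ W_gnn)

# 3) Residual fully connected block: the skip connection adds the block input.
W1 = rng.normal(size=(D_ENC, D_ENC)) * 0.1
W2 = rng.normal(size=(D_ENC, D_ENC)) * 0.1
H = relu(Z @ W1)
H = relu(H @ W2 + Z)

# 4) Regression head: one travel-time estimate per link.
w_out = rng.normal(size=(D_ENC, 1)) * 0.1
travel_time = H @ w_out
print(travel_time.shape)  # (32, 1)
```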

## 2. Related Works

## 3. Methodology

#### 3.1. Dimensionality Reduction Using Stacked Autoencoder

The input temporal data is represented as X = {x₁, x₂, …, xₖ}, where xᵢ ∈ R^d. The characteristic sequence T = {t₁, t₂, …, tₖ}, where tᵢ ∈ R^l, of the original data is obtained using Equation (1). The output of the encoder is used as the input of the decoder. Using the characteristic sequence T, the decoder reconstructs the original data as Y = {y₁, y₂, …, yₖ}, where yᵢ ∈ R^d, via Equation (2). The purpose of decoding is to verify that the extracted features are valid and represent the original input sequence before and after the encoding process. Once the stacked layers of the autoencoder complete training, the encoder alone is used to extract the characteristics of the original data, yielding the dimensionally reduced feature map of the temporal data. The bottleneck layer has a reduced number of nodes.

tᵢ = f(W_t · xᵢ + b_t)   (1)

yᵢ = g(W_y · tᵢ + b_y)   (2)

where W_t is the encoder's weight matrix and b_t its bias vector, and W_y is the decoder's weight matrix and b_y its bias vector. The reconstruction error in the decoding layers is calculated using Equation (3); the difference between the reconstructed data sequence Y and the original data sequence X is infinitesimally small (of order 10⁻⁵), which signifies that the characteristic sequence T is valid. For the Chengdu dataset, the stacked autoencoder (SAE) with 16 hidden layers is sufficient to reconstruct the characteristic sequence, which implies that the 50 temporal features in the original temporal feature space can be represented using 16 temporal features, thereby reducing the dimensionality of the temporal feature space.
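For linear activations f and g, the trained encoder/decoder pair of Equations (1) and (2) converges to the principal subspace of the data, which makes the validity check on T easy to demonstrate in closed form. The NumPy sketch below is illustrative only: the synthetic rank-16 data and the SVD shortcut stand in for the Chengdu features and the iterative SAE training.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the temporal feature map: k samples with d = 50
# features that actually lie on an l = 16-dimensional subspace, mirroring
# the paper's 50 -> 16 reduction.
k, d, l = 200, 50, 16
X = rng.normal(size=(k, l)) @ rng.normal(size=(l, d))

# With linear f and g, the optimal encoder/decoder weights are the top-l
# right singular vectors of X (what gradient training would converge to).
_, _, Vt = np.linalg.svd(X, full_matrices=False)
W_t = Vt[:l].T          # encoder weights, d x l  (Eq. 1)
W_y = Vt[:l]            # tied decoder weights, l x d  (Eq. 2)

T = X @ W_t             # characteristic sequence, k x 16
Y = T @ W_y             # reconstruction, k x 50

# Reconstruction error of Eq. (3): near zero because the 16 features
# capture the whole subspace, so T is a valid reduced representation.
mse = float(np.mean((Y - X) ** 2))
print(mse < 1e-10)      # True
```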

#### 3.2. Graph Neural Network (GNN)

- The feature matrix of the graph is represented by X with dimension N × F⁰ where N is the number of nodes and F⁰ is the number of input features for each node.

The propagation rule of the GNN, with Z⁰ = X, is

Zⁱ⁺¹ = f(Zⁱ, A) = σ(A Zⁱ Wⁱ)

where Wⁱ is the weight matrix that corresponds to layer i, and σ is the ReLU non-linear activation function. The weight matrix has dimensions Fⁱ × Fⁱ⁺¹, so the size of its second dimension determines the number of features at the following layer i + 1.
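A single propagation step can be written directly from the rule above. This NumPy sketch uses a hypothetical 5-node graph and illustrative feature sizes; the `gnn_layer` name is ours, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

N, F0, F1 = 5, 16, 8   # illustrative: 5 nodes, 16 input -> 8 output features

# Adjacency of the road-connectivity graph; self-loops keep each node's
# own features in the aggregation.
A = np.array([[1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1],
              [1, 0, 0, 1, 1]], dtype=float)

Z0 = rng.normal(size=(N, F0))         # feature matrix X, i.e. Z^0 (N x F^0)
W0 = rng.normal(size=(F0, F1)) * 0.1  # layer weights, F^i x F^{i+1}

def gnn_layer(A, Z, W):
    """One propagation step: Z^{i+1} = ReLU(A Z^i W^i)."""
    return np.maximum(A @ Z @ W, 0.0)

Z1 = gnn_layer(A, Z0, W0)
print(Z1.shape)  # (5, 8): the weight matrix's second dimension sets the width
```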

#### 3.3. Residual Network

## 4. Experimental Analysis

## 5. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

- Liao, B.; Zhang, J.; Cai, M.; Tang, S.; Gao, Y.; Wu, C.; Yang, S.; Zhu, W.; Guo, Y.; Wu, F. Dest-ResNet. In Proceedings of the 26th ACM International Conference on Multimedia, Seoul, Korea, 22–26 October 2018; ACM: New York, NY, USA, 2018; pp. 1883–1891.
- Zivot, E.; Wang, J. Modeling Financial Time Series with S-PLUS®; Springer: New York, NY, USA, 2006; ISBN 978-0-387-27965-7.
- Williams, B.M.; Hoel, L.A. Modeling and Forecasting Vehicular Traffic Flow as a Seasonal ARIMA Process: Theoretical Basis and Empirical Results. J. Transp. Eng. 2003, 129, 664–672.
- Yin, X.; Wu, G.; Wei, J.; Shen, Y.; Qi, H.; Yin, B. Deep Learning on Traffic Prediction: Methods, Analysis, and Future Directions. IEEE Trans. Intell. Transp. Syst. 2022, 23, 4927–4943.
- Zhang, C.; Yu, J.J.Q.; Liu, Y. Spatial-Temporal Graph Attention Networks: A Deep Learning Approach for Traffic Forecasting. IEEE Access 2019, 7, 166246–166256.
- Wang, S.; Cao, J.; Yu, P.S. Deep Learning for Spatio-Temporal Data Mining: A Survey. IEEE Trans. Knowl. Data Eng. 2022, 34, 3681–3700.
- Ma, X.; Dai, Z.; He, Z.; Ma, J.; Wang, Y.; Wang, Y. Learning Traffic as Images: A Deep Convolutional Neural Network for Large-Scale Transportation Network Speed Prediction. Sensors 2017, 17, 818.
- Guo, S.; Lin, Y.; Feng, N.; Song, C.; Wan, H. Attention Based Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting. Proceedings of the AAAI Conference on Artificial Intelligence 2019, 33, 922–929.
- Lv, Y.; Duan, Y.; Kang, W.; Li, Z.; Wang, F.-Y. Traffic Flow Prediction with Big Data: A Deep Learning Approach. IEEE Trans. Intell. Transp. Syst. 2014, 16, 865–873.
- Okutani, I.; Stephanedes, Y.J. Dynamic prediction of traffic volume through Kalman filtering theory. Transp. Res. Part B Methodol. 1984, 18, 1–11.
- Dubey, P.P.; Borkar, P. Review on techniques for traffic jam detection and congestion avoidance. In Proceedings of the 2015 2nd International Conference on Electronics and Communication Systems (ICECS), Coimbatore, India, 26–27 February 2015; pp. 434–440.
- Song, F.; Zhou, Y.-T.; Wang, Y.; Zhao, T.-M.; You, I.; Zhang, H.-K. Smart collaborative distribution for privacy enhancement in moving target defense. Inf. Sci. 2019, 479, 593–606.
- Stathopoulos, A.; Karlaftis, M.G. A multivariate state space approach for urban traffic flow modeling and prediction. Transp. Res. Part C Emerg. Technol. 2003, 11, 121–135.
- Kamarianakis, Y.; Prastacos, P. Space–time modeling of traffic flow. Comput. Geosci. 2005, 31, 119–133.
- Min, W.; Wynter, L. Real-time road traffic prediction with spatio-temporal correlations. Transp. Res. Part C Emerg. Technol. 2011, 19, 606–616.
- Williams, B.M.; Durvasula, P.K.; Brown, D.E. Urban Freeway Traffic Flow Prediction: Application of Seasonal Autoregressive Integrated Moving Average and Exponential Smoothing Models. Transp. Res. Rec. J. Transp. Res. Board 1998, 1644, 132–141.
- Xie, Y.; Zhang, Y.; Ye, Z. Short-Term Traffic Volume Forecasting Using Kalman Filter with Discrete Wavelet Decomposition. Comput. Civ. Infrastruct. Eng. 2007, 22, 326–334.
- Zhang, Y.; Xie, Y. Forecasting of short-term freeway volume with v-support vector machines. Transp. Res. Rec. 2007, 92–99.
- Lee, J.; Park, B.; Yun, I. Cumulative Travel-Time Responsive Real-Time Intersection Control Algorithm in the Connected Vehicle Environment. J. Transp. Eng. 2013, 139, 1020–1029.
- Wu, Y.; Tan, H.; Qin, L.; Ran, B.; Jiang, Z. A hybrid deep learning based traffic flow prediction method and its understanding. Transp. Res. Part C Emerg. Technol. 2018, 90, 166–180.
- Liang, Z.; Wakahara, Y. City traffic prediction based on real-time traffic information for Intelligent Transport Systems. In Proceedings of the 2013 13th International Conference on ITS Telecommunications (ITST), Tampere, Finland, 5–7 November 2013; pp. 378–383.
- Khan, N.A. Real Time Predictive Monitoring System for Urban Transport. Ph.D. Thesis, Kingston University, Kingston upon Thames, UK, 2017.
- Kari, D.; Wu, G.; Barth, M.J. Development of an agent-based online adaptive signal control strategy using connected vehicle technology. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014; pp. 1802–1807.
- Luo, Q.; Juan, Z.; Sun, B.; Jia, H. Method Research on Measuring the External Costs of Urban Traffic Congestion. J. Transp. Syst. Eng. Inf. Technol. 2007, 7, 9–12.
- Padiath, A.; Vanajakshi, L.; Subramanian, S.C.; Manda, H. Prediction of traffic density for congestion analysis under Indian traffic conditions. In Proceedings of the 2009 12th International IEEE Conference on Intelligent Transportation Systems, St. Louis, MO, USA, 4–7 October 2009; pp. 1–6.
- Mohan Rao, A.; Ramachandra Rao, K. Measuring Urban Traffic Congestion—A Review. Int. J. Traffic Transp. Eng. 2012, 2, 286–305.
- Kumar, S.V.; Vanajakshi, L. Short-term traffic flow prediction using seasonal ARIMA model with limited input data. Eur. Transp. Res. Rev. 2015, 7, 21.
- Sun, S.; Zhang, C.; Yu, G. A Bayesian Network Approach to Traffic Flow Forecasting. IEEE Trans. Intell. Transp. Syst. 2006, 7, 124–132.
- Pan, T.L.; Sumalee, A.; Zhong, R.X.; Indra-payoong, N. Short-Term Traffic State Prediction Based on Temporal–Spatial Correlation. IEEE Trans. Intell. Transp. Syst. 2013, 14, 1242–1254.
- Xu, Y.; Kong, Q.-J.; Klette, R.; Liu, Y. Accurate and Interpretable Bayesian MARS for Traffic Flow Prediction. IEEE Trans. Intell. Transp. Syst. 2014, 15, 2457–2469.
- Zheng, X.; Chen, W.; Wang, P.; Shen, D.; Chen, S.; Wang, X.; Zhang, Q.; Yang, L. Big Data for Social Transportation. IEEE Trans. Intell. Transp. Syst. 2016, 17, 620–630.
- Cheng, N.; Lyu, F.; Chen, J.; Xu, W.; Zhou, H.; Zhang, S.; Shen, X. Big Data Driven Vehicular Networks. IEEE Netw. 2018, 32, 160–167.
- Tan, M.C.; Wong, S.C.; Xu, J.M.; Guan, Z.R.; Zhang, P. An Aggregation Approach to Short-Term Traffic Flow Prediction. IEEE Trans. Intell. Transp. Syst. 2009, 10, 60–69.
- Ahmed, M.S.; Cook, A.R. Analysis of Freeway Traffic Time-Series Data by Using Box-Jenkins Techniques. Transp. Res. Rec. 1979, 722, 1–9.
- Liu, W.; Wang, Z. Dynamic Router Real-Time Travel Time Prediction Based on a Road Network. In Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2011; Volume 86; ISBN 9783642198526.
- Xu, T.; Li, X.; Claramunt, C. Trip-oriented travel time prediction (TOTTP) with historical vehicle trajectories. Front. Earth Sci. 2018, 12, 253–263.
- Zhang, Y.; Liu, Y. Comparison of parametric and nonparametric techniques for non-peak traffic forecasting. World Acad. Sci. Eng. Technol. 2009, 39, 242–248.
- Wang, J.; Gu, Q.; Wu, J.; Liu, G.; Xiong, Z. Traffic Speed Prediction and Congestion Source Exploration: A Deep Learning Method. In Proceedings of the 2016 IEEE 16th International Conference on Data Mining (ICDM), Barcelona, Spain, 12–15 December 2016; pp. 499–508.
- Yang, L.; Ma, R.; Zhang, H.M.; Guan, W.; Jiang, S. Driving behavior recognition using EEG data from a simulated car-following experiment. Accid. Anal. Prev. 2018, 116, 30–40.
- Kim, Y.; Wang, P.; Zhu, Y.; Mihaylova, L. A Capsule Network for Traffic Speed Prediction in Complex Road Networks. In Proceedings of the 2018 Sensor Data Fusion: Trends, Solutions, Applications (SDF), Bonn, Germany, 9–11 October 2018; pp. 1–6.
- Tan, H.; Xuan, X.; Wu, Y.; Zhong, Z.; Ran, B. A Comparison of Traffic Flow Prediction Methods Based on DBN. In Proceedings of the CICTP 2016; American Society of Civil Engineers: Reston, VA, USA, 2016; pp. 273–283.
- Fu, R.; Zhang, Z.; Li, L. Using LSTM and GRU neural network methods for traffic flow prediction. In Proceedings of the 2016 31st Youth Academic Annual Conference of Chinese Association of Automation (YAC), Hubei, China, 11–13 November 2016; pp. 324–328.
- Chen, Q.; Song, X.; Yamada, H.; Shibasaki, R. Learning Deep Representation from Big and Heterogeneous Data for Traffic Accident Inference. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 12–17 February 2016.
- Lu, Z.; Lv, W.; Cao, Y.; Xie, Z.; Peng, H.; Du, B. LSTM variants meet graph neural networks for road speed prediction. Neurocomputing 2020, 400, 34–45.
- Xu, D.; Wei, C.; Peng, P.; Xuan, Q.; Guo, H. GE-GAN: A novel deep learning framework for road traffic state estimation. Transp. Res. Part C Emerg. Technol. 2020, 117, 102635.
- Shen, Y.; Jin, C.; Hua, J.; Huang, D. TTPNet: A Neural Network for Travel Time Prediction Based on Tensor Decomposition and Graph Embedding. IEEE Trans. Knowl. Data Eng. 2022, 34, 4514–4526.
- Jin, G.; Wang, M.; Zhang, J.; Sha, H.; Huang, J. STGNN-TTE: Travel time estimation via spatial–temporal graph neural network. Future Gener. Comput. Syst. 2022, 126, 70–81.
- Wei, W.; Wu, H.; Ma, H. An AutoEncoder and LSTM-Based Traffic Flow Prediction Method. Sensors 2019, 19, 2946.
- Ouyang, K.; Liang, Y.; Liu, Y.; Tong, Z.; Ruan, S.; Rosenblum, D.; Zheng, Y. Fine-Grained Urban Flow Inference. IEEE Trans. Knowl. Data Eng. 2020, 34, 6.
- Vélez-Serrano, D.; Álvaro-Meca, A.; Sebastián-Huerta, F.; Vélez-Serrano, J. Spatio-Temporal Traffic Flow Prediction in Madrid: An Application of Residual Convolutional Neural Networks. Mathematics 2021, 9, 1068.
- Jiber, M.; Mbarek, A.; Yahyaouy, A.; Sabri, M.A.; Boumhidi, J. Road Traffic Prediction Model Using Extreme Learning Machine: The Case Study of Tangier, Morocco. Information 2020, 11, 542.
- Xu, D.; Dai, H.; Wang, Y.; Peng, P.; Xuan, Q.; Guo, H. Road traffic state prediction based on a graph embedding recurrent neural network under the SCATS. Chaos Interdiscip. J. Nonlinear Sci. 2019, 29, 103125.
- Ren, C.; Chai, C.; Yin, C.; Ji, H.; Cheng, X.; Gao, G.; Zhang, H. Short-Term Traffic Flow Prediction: A Method of Combined Deep Learnings. J. Adv. Transp. 2021, 2021, 9928073.
- Liao, B.; Zhang, J.; Wu, C.; McIlwraith, D.; Chen, T.; Yang, S.; Guo, Y.; Wu, F. Deep Sequence Learning with Auxiliary Information for Traffic Prediction. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, London, UK, 19–23 August 2018; ACM: New York, NY, USA, 2018; pp. 537–546.
- Jiang, W.; Luo, J. Graph neural network for traffic forecasting: A survey. Expert Syst. Appl. 2022, 207, 117921.
- Sun, S.; Chen, J.; Sun, J. Traffic congestion prediction based on GPS trajectory data. Int. J. Distrib. Sens. Netw. 2019, 15.
- Xu, M.; Dai, W.; Liu, C.; Gao, X.; Lin, W.; Qi, G.-J.; Xiong, H. Spatial-Temporal Transformer Networks for Traffic Flow Forecasting. arXiv 2020, arXiv:2001.02908.
- Qi, Y.; Ishak, S. A Hidden Markov Model for short term prediction of traffic conditions on freeways. Transp. Res. Part C Emerg. Technol. 2014, 43, 95–111.
- Xie, Y.; Zhao, K.; Sun, Y.; Chen, D. Gaussian Processes for Short-Term Traffic Volume Forecasting. Transp. Res. Rec. J. Transp. Res. Board 2010, 2165, 69–78.
- Wang, S.-H.; Govindaraj, V.V.; Górriz, J.M.; Zhang, X.; Zhang, Y.-D. Covid-19 classification by FGCNet with deep feature fusion from graph convolutional network and convolutional neural network. Inf. Fusion 2021, 67, 208–229.
- Staff, T.P.O. Correction: Multi-view classification with convolutional neural networks. PLoS ONE 2021, 16, e0250190.
- Guarino, A.; Lettieri, N.; Malandrino, D.; Zaccagnino, R.; Capo, C. Adam or Eve? Automatic users' gender classification via gestures analysis on touch devices. Neural Comput. Appl. 2022, 34, 18473–18495.
- Xiang, L. Simulation System of Car Crash Test in C-NCAP Analysis Based on an Improved Apriori Algorithm. Phys. Procedia 2012, 25, 2066–2071.
- Jimenez-Martinez, M. Artificial Neural Networks for Passive Safety Assessment. Eng. Lett. 2022, 30, 1–9.

**Figure 1.**Proposed Hybrid Deep Learning Pipeline that learns spatio-temporal features for travel time prediction.

**Figure 8.**Results on Google Maps using our GNN + ResNet model, with estimated travel time (in seconds) for distant points.

**Figure 9.**Results on Google Maps using our GNN + ResNet model, with estimated travel time (in seconds) for nearby points.

**Figure 10.**Results on Google Maps using our GNN + ResNet model, with estimated travel time (in seconds) for very close points.

**Table 1.**Summary of different models in the literature, with their performance metrics and loss functions.

| Ref. No. | Model | Dataset | RMSE | MAE | MAPE | Loss Function |
|---|---|---|---|---|---|---|
| [4] | GCNN | METR-LA | 7.24 | 3.47 | 9.57 | N.A. |
| [49] | STGNN-TTE | Chengdu1 | 88.03 | 60.71 | 0.2606 | Average loss |
| | | Chengdu2 | 121.14 | 63.38 | 0.2568 | |
| | | Porto | 52.35 | 39.25 | 0.1474 | |
| [50] | Graph Phased LSTM | Xi'an | 6.814 | 4.921 | 0.295 | Optimal loss |
| | | Beijing | 12.558 | 9.956 | 0.665 | |
| [51] | GE-GAN | Caltrans PeMS | 18.40 | 13.11 | 4.11 | Discriminator |
| [52] | TTPNet | Beijing | N.A. | 118.59 | 11.42 | First-order proximity |
| [53] | AE-LSTM | PeMS | 28.23 | 20.16 | 0.072 | |
| [54] | GERNN | SCATS | 8.827 | 6.327 | N.A. | True traffic |
| [55] | Convolutional Residual Neural Network | Madrid City Council | N.A. | 39.65 | 19.08 | MSE |
| [56] | UrbanPy | TaxiBJ | 8.030 | 1.790 | 0.531 | KL-divergence |
| | | Happy Valley | 8.332 | 1.732 | 0.508 | |
| [57] | Extreme Learning Machine (ELM) | Moroccan | 2.8779 | 1.9666 | 14.3785 | Symmetric |
| [58] | Combined Deep Learning Prediction (CDLP) | Jiangxi Road | 11.42 | 7.84 | 21.84 | CDLP |
| | | South Fuzhou | 6.52 | 3.26 | 5.12 | |
| [59] | Hybrid | Q-Traffic | N.A. | N.A. | 9.22 | N.A. |
| [60] | Spatial-Temporal Transformer Network | PeMS-Bay | 4.50 | 1.95 | 4.58 | Mean absolute loss |
| | | PeMSD7 (M) | 6.17 | 3.12 | 7.89 | |
| [61] | ARIMA | Chengdu | 12.44 | 5.32 | 154.64 | N.A. |

| Distribution | Weekday Peak | Weekday Off-Peak | Weekend Peak | Weekend Off-Peak |
|---|---|---|---|---|
| No. of points (roads) | 5943 | | | |
| Spatial distribution (start) (Lat, Long) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) |
| Spatial distribution (end) (Lat, Long) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) | (103.9299735, 30.56811912) |
| Avg moving time (s) | 63.7096 | 46.4763 | 61.3718 | 46.5097 |
| Max moving time (s) | 337.9256 | 250.6087 | 293.6739 | 253.8487 |
| Min moving time (s) | 4.9793 | 3.3162 | 12.8908 | 2.8709 |
| Median (s) | 53.4519 | 37.6102 | 51.4952 | 37.675 |

| Model | Chengdu1 RMSE | Chengdu1 MAE | Chengdu1 MAPE | Chengdu2 RMSE | Chengdu2 MAE | Chengdu2 MAPE |
|---|---|---|---|---|---|---|
| ARIMA | 147.32 | 91.69 | 0.5621 | 192.48 | 102.44 | 0.5485 |
| TEMP | 116.38 | 77.5 | 0.3903 | 155.34 | 81.19 | 0.4057 |
| LightGBM | 107.89 | 71.97 | 0.3518 | 142.56 | 74.1 | 0.3316 |
| MlpTTE | 96.17 | 66.58 | 0.2963 | 128.52 | 68.31 | 0.2839 |
| RnnTTE | 95.29 | 65.74 | 0.2915 | 128.79 | 68.43 | 0.2841 |
| DeepTTE | 93.68 | 64.03 | 0.2864 | 127.24 | 67.39 | 0.2807 |
| STGCN | 94.37 | 65.02 | 0.2787 | 127.89 | 67.58 | 0.272 |
| DRCNN | 96.78 | 66.34 | 0.2817 | 130.11 | 69.57 | 0.2786 |
| Graph Wavenet | 92.65 | 63.57 | 0.2725 | 126.58 | 67.1 | 0.2701 |
| STGCN TTE | 88.03 | 60.71 | 0.2606 | 121.14 | 63.38 | 0.2568 |
| GNN ResNet (ours) | 88.79 | 57.48 | 0.2616 | 121.24 | 61.65 | 0.2628 |

**Table 4.**Performance of our proposed GNN + ResNet model compared with other graph-based models for estimating per-link travel times on two datasets.

| Spatio-Temporal GCN Model | Chengdu1 RMSE | Chengdu1 MAE | Chengdu1 MAPE | Chengdu2 RMSE | Chengdu2 MAE | Chengdu2 MAPE |
|---|---|---|---|---|---|---|
| STGCN | 33.87 | 18.79 | 0.5301 | 52.33 | 21.13 | 0.5345 |
| DRCNN | 35.75 | 19.42 | 0.5457 | 54.21 | 21.88 | 0.5519 |
| Graph Wavenet | 33.12 | 18.23 | 0.5264 | 51.08 | 20.76 | 0.533 |
| Spatio-Temporal GNN TTE | 31.28 | 17.57 | 0.5152 | 49.77 | 20.01 | 0.5217 |
| GNN + ResNet (ours) | 31.06 | 17.23 | 0.5028 | 49.63 | 18.43 | 0.5314 |

**Table 5.**Performance comparison of the proposed GNN + residual network under different temporal scenarios on the Chengdu1 dataset.

| Model (Chengdu1) | RMSE (Rush) | MAE (Rush) | MAPE (Rush) | RMSE (Non-Rush) | MAE (Non-Rush) | MAPE (Non-Rush) |
|---|---|---|---|---|---|---|
| ARIMA | 132.58 | 93.46 | 0.5731 | 120.41 | 88.93 | 0.551 |
| Empirical | 110.41 | 79.41 | 0.4022 | 102.33 | 75.44 | 0.3896 |
| Light Gradient Boosting Machine | 101.34 | 74.67 | 0.3582 | 93.48 | 70.1 | 0.3476 |
| Multiple Layer Perceptron | 90.84 | 67.01 | 0.2991 | 86.95 | 61.36 | 0.2931 |
| Recurrent Neural Network TTE | 89.49 | 66.16 | 0.2967 | 84.09 | 60.14 | 0.2884 |
| Deep Convolution TTE | 87.24 | 64.93 | 0.2889 | 83.05 | 59.51 | 0.2851 |
| Spatio Temporal Graph Convolutional TTE | 88.27 | 65.35 | 0.2810 | 83.56 | 59.43 | 0.2726 |
| Deep Regions with Convolution Neural Network | 90.57 | 66.72 | 0.2883 | 86.83 | 61.05 | 0.2785 |
| Deep Spatial-Temporal Graph | 86.49 | 64.10 | 0.2789 | 82.19 | 59.36 | 0.2692 |
| Spatio Temporal Graph Neural Network | 82.65 | 61.45 | 0.2662 | 77.34 | 56.55 | 0.2581 |
| GNN + ResNet (ours) | 71.79 | 46.99 | 0.2616 | 68.79 | 53.93 | 0.2142 |

**Table 6.**Performance of the GNN + ResNet model compared with other graph-based models under different temporal scenarios on the Chengdu2 dataset.

| Model (Chengdu2) | RMSE (Rush) | MAE (Rush) | MAPE (Rush) | RMSE (Non-Rush) | MAE (Non-Rush) | MAPE (Non-Rush) |
|---|---|---|---|---|---|---|
| ARIMA | 175.84 | 105.74 | 0.5733 | 158.32 | 100.65 | 0.5387 |
| Empirical | 144.66 | 83.52 | 0.4341 | 136.55 | 80.02 | 0.3982 |
| Light Gradient Boosting Machine | 131.94 | 76.63 | 0.3589 | 125.84 | 72.38 | 0.3224 |
| Multiple Layer Perceptron | 122.67 | 69.16 | 0.2908 | 117.15 | 68.02 | 0.2806 |
| Recurrent Neural Network TTE | 122.12 | 69.21 | 0.2911 | 117.47 | 68.18 | 0.281 |
| Deep Convolution TTE | 121.56 | 68.63 | 0.2885 | 117.1 | 67.07 | 0.2793 |
| Spatio Temporal Graph Convolutional TTE | 121.35 | 68.41 | 0.2798 | 116.27 | 67.19 | 0.2703 |
| Deep Regions with Convolution Neural Network | 125.22 | 70.14 | 0.2823 | 120.17 | 69.08 | 0.2759 |
| Deep Spatial-Temporal Graph | 120.62 | 68.16 | 0.2759 | 115.86 | 65.91 | 0.2684 |
| Spatio Temporal Graph Neural Network | 114.25 | 64.45 | 0.2611 | 110.83 | 62.77 | 0.2551 |
| GNN + ResNet (ours) | 112.74 | 66.32 | 0.2535 | 108.21 | 63.74 | 0.2387 |

| Model | Chengdu1 RMSE | Chengdu1 MAE | Chengdu1 MAPE | Chengdu2 RMSE | Chengdu2 MAE | Chengdu2 MAPE |
|---|---|---|---|---|---|---|
| FCN | 92.31 | 72.3 | 0.3221 | 112.21 | 63.74 | 0.2387 |
| GNN ResNet (ours) | 88.79 | 57.48 | 0.2616 | 26.26 | 16.98 | 0.48 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Rajagopal, B.G.; Kumar, M.; Samui, P.; Kaloop, M.R.; Shahdah, U.E.
A Hybrid DNN Model for Travel Time Estimation from Spatio-Temporal Features. *Sustainability* **2022**, *14*, 14049.
https://doi.org/10.3390/su142114049

**AMA Style**

Rajagopal BG, Kumar M, Samui P, Kaloop MR, Shahdah UE.
A Hybrid DNN Model for Travel Time Estimation from Spatio-Temporal Features. *Sustainability*. 2022; 14(21):14049.
https://doi.org/10.3390/su142114049

**Chicago/Turabian Style**

Rajagopal, Balaji Ganesh, Manish Kumar, Pijush Samui, Mosbeh R. Kaloop, and Usama Elrawy Shahdah.
2022. "A Hybrid DNN Model for Travel Time Estimation from Spatio-Temporal Features" *Sustainability* 14, no. 21: 14049.
https://doi.org/10.3390/su142114049