Scatter-GNN: A Scatter Graph Neural Network for Prediction of High-Speed Railway Station—A Case Study of Yinchuan–Chongqing HSR
Abstract
Featured Application
1. Introduction
- (1)
- The Scatter-GNN model is used to model the main national high-speed railway lines and to calculate the priority of two candidate planned routes for the Yinchuan–Chongqing high-speed railway, which may become a new application scenario for GNN classifiers;
- (2)
- Compared with the graph representation learning method GAT, the Scatter-GNN model improves accuracy by 0.03 percentage points.
2. Related Work
2.1. High-Speed Railway Line Planning Based on Mathematical Model
2.2. High-Speed Railway Line Planning Based on Machine Learning Model
2.3. Graph Neural Network Model Based on Edge Graph Structure
3. Route Planning Scheme Based on Scatter-GNN
3.1. Data Selection
3.2. Problem Definition
3.3. Adaptive Function Calculation and Edge Site Location Search
Algorithm 1: Calculate Undirected Node Degree and Neighbor Vector

Require: Graph G (using networkx, imported as nx)
Ensure:
1. G = nx.barabasi_albert_graph(…)
2. Return the nodes whose degree falls below the threshold (G.degree < …)
3. For n = 1, 2, …, N do
4.   Generate neighborhood node lists …
5.   for k = 1, 2, …, K do
6.     …
7.     g.add_edges_from(…)
8. output
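The degree computation and edge-site search in Algorithm 1 can be sketched as a short networkx script. The degree threshold and the Barabási–Albert stand-in graph below are illustrative assumptions, not values taken from the paper:

```python
import networkx as nx

def find_edge_sites(G, degree_threshold=2):
    """Return low-degree ('edge') nodes and their neighbor lists.

    degree_threshold is an illustrative assumption; the paper's actual
    cutoff for edge sites is not reproduced here.
    """
    edge_sites = [n for n, d in G.degree() if d < degree_threshold]
    neighbors = {n: list(G.neighbors(n)) for n in edge_sites}
    return edge_sites, neighbors

# Small Barabasi-Albert graph as a stand-in for the railway network
G = nx.barabasi_albert_graph(20, 1, seed=42)
sites, nbrs = find_edge_sites(G)
```

With m = 1 the generated graph is a tree, so it contains many degree-1 leaves that play the role of edge sites here.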
3.4. Completion of Edge Site Neighbors
Algorithm 2: Learning Vector Representation

Require: Graph G, Dimensions d, Walks per node r, Walk length l, Context size k, Return p, In-out q
G′ = PreprocessModifiedWeights(G, p, q)
Initialize walks to Empty
Ensure:
1. For iter = 1 to r do
2.   for all nodes u do
3.     walk = node2vecWalk(G′, u, l)
4.     Append walk to walks
5. f = StochasticGradientDescent(k, d, walks)
6. Return f
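The biased walk inside Algorithm 2 can be sketched directly. This is a minimal node2vec-style walk under the standard return (p) and in-out (q) weighting; the karate-club test graph and walk parameters are illustrative assumptions:

```python
import random
import networkx as nx

def node2vec_walk(G, start, length, p=1.0, q=1.0):
    """One biased random walk as in node2vecWalk (Algorithm 2).

    Unnormalized transition weights follow node2vec's scheme:
    1/p to return to the previous node, 1 for a neighbor of the
    previous node, 1/q to move outward.
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = list(G.neighbors(cur))
        if not nbrs:
            break
        if len(walk) == 1:
            walk.append(random.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:
                weights.append(1.0 / p)   # return to previous node
            elif G.has_edge(prev, x):
                weights.append(1.0)       # stays close to previous node
            else:
                weights.append(1.0 / q)   # moves outward
        walk.append(random.choices(nbrs, weights=weights)[0])
    return walk

random.seed(0)
G = nx.karate_club_graph()
# r = 2 walks per node, walk length l = 10
walks = [node2vec_walk(G, u, 10) for u in G.nodes() for _ in range(2)]
```

The resulting walks would then be fed to a Word2vec-style skip-gram trainer, which is the StochasticGradientDescent step in the pseudocode.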
Algorithm 3: Link Prediction

Graph forward propagation:
1. Def forward(self, h, adj):
2.   calculate the attention coefficient e for adjacent nodes and obtain: …
4.   if self.concat is False:
5.     the output layer does not perform a linear transformation

Mosaic feature vector:
6. Def _prepare_attentional_mechanism_input(self, Wh):
7.   For each eigenvector in the characteristic matrix of n nodes do
8.     all_combinations_matrix = Wh_…
9.   return all_combinations_matrix.view(…)

Attention networks:
10. Def link_prediction(epoch, loss, P):
11. For i = 1 to epoch do
12.   for edge = 1 to N do
13.     Get nodes p and q by dividing edge: …
14.     …
15.     for k = 1 to K layers do
16.       calculate loss = … and the gradient
17.       update parameters w and b
18. save model
19. Return P

Prediction:
20. For n = 1, 2, …, N do
21.   Generate neighborhood node lists …
22.   for k = 1, 2, …, K do
23.     input to model
24.     if p > 0.5:
25.       g.add_edges_from(…)
26.     else: g.delete_edges_from(…)
27. output
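The attention-coefficient step in Algorithm 3 (e_ij = LeakyReLU(aᵀ[Wh_i || Wh_j]), masked to neighbors and softmax-normalized) can be sketched in NumPy. Shapes, the ring adjacency, and the 0.2 LeakyReLU slope are illustrative assumptions:

```python
import numpy as np

def attention_coefficients(Wh, a, adj):
    """GAT-style attention for Algorithm 3.

    Wh:  (N, F') linearly transformed node features
    a:   (2F',)  attention vector
    adj: (N, N)  0/1 adjacency with self-loops
    """
    N = Wh.shape[0]
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]), slope 0.2
            z = a @ np.concatenate([Wh[i], Wh[j]])
            e[i, j] = z if z > 0 else 0.2 * z
    # mask out non-neighbors, then softmax over each row
    e = np.where(adj > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)
    alpha = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
    return alpha

rng = np.random.default_rng(0)
Wh = rng.normal(size=(5, 4))
a = rng.normal(size=8)
adj = np.eye(5)                 # ring graph with self-loops
for i in range(5):
    adj[i, (i + 1) % 5] = 1
    adj[i, (i - 1) % 5] = 1
alpha = attention_coefficients(Wh, a, adj)
```

Each row of `alpha` sums to 1, and non-neighbor entries receive (numerically) zero weight, which is what the masked softmax in the pseudocode achieves.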
3.5. Calculation of Existing Route Probability Based on Scatter-GNN
Algorithm 4: Scatter-GNN

Require: Graph G, Node Features Matrix X, Neighborhood Sample Layer Num K, Each Layer Neighborhood Sample Size S_1, S_2, …, S_K, Learning Rate for Adam
Ensure: Updated Node Features Matrix
1. Initialize the parameters
2. while not converged do
3.   for i = 1, 2, …, N do
4.     Generate neighborhood node lists by K-layer neighborhood sampling according to S_1, S_2, …, S_K
5.     for k = 1, 2, …, K do
6.       …
7.       Update the node representation
8.   Calculate loss = … and the gradients by Adam
9.   Update the parameters
10. Return the updated node features matrix

Prediction:
11. input to model
12. Return P
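The K-layer neighborhood sampling in Algorithm 4 can be sketched as follows. The adjacency-list format, the toy station graph, and the per-layer sample sizes are illustrative assumptions, not the paper's actual configuration:

```python
import random

def sample_neighborhoods(adj_list, node, num_layers, sizes):
    """K-layer neighborhood sampling (Algorithm 4, step 4).

    adj_list: dict mapping node -> list of neighbor nodes
    sizes[k]: sample size at layer k (without replacement)
    Returns one node list per layer, layer 0 being the target node.
    """
    layers = [[node]]
    for k in range(num_layers):
        frontier = []
        for u in layers[-1]:
            nbrs = adj_list.get(u, [])
            take = min(sizes[k], len(nbrs))
            frontier.extend(random.sample(nbrs, take))
        layers.append(frontier)
    return layers

random.seed(0)
# Toy station graph: node -> neighbor list
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
layers = sample_neighborhoods(adj, 0, 2, [2, 2])
```

The sampled layers would then be aggregated inward (layer K toward the target node) to produce the updated node representation.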
4. Experimental Verification
4.1. Baseline Algorithm
- DeepWalk: DeepWalk [51] obtains node sequences by random walks in the graph, uses the Word2vec algorithm to obtain vector representations of the nodes, and finally applies a logistic regression classifier for prediction. The parameters of the logistic regression classifier are consistent with those used for Scatter-GNN;
- Node2vec: Node2vec [52] is an improvement on DeepWalk. It obtains node sequences by performing biased random walks in the graph and then derives node vector representations with the Word2vec algorithm. As with DeepWalk, the resulting node vectors are input to a logistic regression classifier for prediction, with classifier parameters consistent with Scatter-GNN;
- GCN: GCN [7] uses the convolution kernel to extract the structural features of the graph, and then combines the node features to train on the whole graph;
- GAT: GAT [8] enhances the vector representation of the target node by assigning different weights to each neighbor, and it is one of the most advanced GNN models in graph representation learning so far.
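The downstream step shared by the DeepWalk and Node2vec baselines, feeding node embeddings into a logistic regression classifier, can be sketched in NumPy. The synthetic "embeddings", dimensions, and training hyperparameters below are illustrative assumptions:

```python
import numpy as np

def train_logreg(X, y, lr=0.1, epochs=500):
    """Plain logistic regression by gradient descent, standing in for
    the classifier applied to DeepWalk/Node2vec embeddings."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid probabilities
        w -= lr * (X.T @ (p - y)) / len(y)       # gradient of log-loss
        b -= lr * (p - y).mean()
    return w, b

# Linearly separable toy "node embeddings" (not the paper's data)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=-1.0, size=(50, 8)),
               rng.normal(loc=+1.0, size=(50, 8))])
y = np.array([0] * 50 + [1] * 50)
w, b = train_logreg(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
acc = (pred == y).mean()
```

In the baselines, the embedding step is what differs (uniform vs. biased walks); the classifier and its parameters are held fixed so the comparison isolates the representation quality.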
4.2. Parameter Setting
4.3. Experimental Analysis
5. Conclusions
- (1)
- Since the Scatter-GNN model must repeatedly judge whether to add sites and routes before formal training, it is less efficient than traditional GNN classifiers;
- (2)
- Due to the extremely limited number of railway stations, the prediction accuracy of the graph deep learning model is not high;
- (3)
- The adaptive function defined in the Scatter-GNN model is based on the assumption …. However, stations in the railway line network eventually become saturated, which limits the model's prediction performance.
- (1)
- How to divide the scope of the adaptive function more carefully, so as to further improve the prediction performance of the model;
- (2)
- Explore ways to limit the number of potential neighbor sites to improve prediction efficiency.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Appendix B
Order | Symbol | Illustration |
---|---|---|
1 | ⊙ | Element-wise (Hadamard) product of two vectors |
2 | σ | Ordinary activation function |
3 | exp | Exponential function with the natural constant e as base |
4 | \|\| | Concatenation operation of two vectors |
5 | Softmax | Activation function for multi-class classification and normalization |
6 | LeakyReLU | Leaky Rectified Linear Unit, linear activation function |
7 | ℝ | Vector space of real numbers |
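The vector operators listed in the notation table can be demonstrated on small NumPy arrays; the sample values and the 0.2 LeakyReLU slope are illustrative:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

hadamard = u * v                      # element-wise product (symbol 1)
concat = np.concatenate([u, v])       # vector concatenation || (symbol 4)
x = u - 2.0
leaky = np.where(x > 0, x, 0.2 * x)   # LeakyReLU, slope 0.2 (symbol 6)
softmax = np.exp(u) / np.exp(u).sum() # softmax normalization (symbol 5)
```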
References
- Singh, P.; Elmi, Z.; Meriga, V.K.; Pasha, J.; Dulebenets, M.A. Internet of Things for sustainable railway transportation: Past, present, and future. Clean. Logist. Supply Chain 2022, 4, 100065. [Google Scholar] [CrossRef]
- Khan, M.Z.; Khan, F.N. Estimating the demand for rail freight transport in Pakistan: A time series analysis. J. Rail Transp. Plan. Manag. 2020, 14, 100176. [Google Scholar] [CrossRef]
- Beuthe, M.; Jourquin, B.; Geerts, J.-F.; Ha, C.K.N. Freight transportation demand elasticities: A geographic multimodal transportation network analysis. Transp. Res. Part E Logist. Transp. Rev. 2001, 37, 253–266. [Google Scholar] [CrossRef]
- Atack, J.; Margo, R.A. The impact of access to rail transportation on agricultural improvement: The American Midwest as a test case, 1850–1860. J. Transp. Land Use 2011, 4, 5–18. [Google Scholar] [CrossRef]
- Ghofrani, F.; He, Q.; Goverde, R.M.; Liu, X. Recent applications of big data analytics in railway transportation systems: A survey. Transp. Res. Part C Emerg. Technol. 2018, 90, 226–246. [Google Scholar] [CrossRef]
- Gao, M.; Cong, J.; Xiao, J.; He, Q.; Li, S.; Wang, Y.; Yao, Y.; Chen, R.; Wang, P. Dynamic modeling and experimental investigation of self-powered sensor nodes for freight rail transport. Appl. Energy 2020, 257, 113969. [Google Scholar] [CrossRef]
- Liang, Y.; Zhou, K.; Li, X.; Zhou, Z.; Sun, W.; Zeng, J. Effectiveness of high-speed railway on regional economic growth for less developed areas. J. Transp. Geogr. 2020, 82, 102621. [Google Scholar] [CrossRef]
- Yin, M.; Bertolini, L.; Duan, J. The effects of the high-speed railway on urban development: International experience and potential implications for China. Prog. Plan. 2015, 98, 1–52. [Google Scholar] [CrossRef] [Green Version]
- Bešinović, N. Resilience in railway transport systems: A literature review and research agenda. Transp. Rev. 2020, 40, 457–478. [Google Scholar] [CrossRef]
- Singh, P.; Dulebenets, M.A.; Pasha, J.; Gonzalez, E.D.R.S.; Lau, Y.-Y.; Kampmann, R. Deployment of autonomous trains in rail transportation: Current trends and existing challenges. IEEE Access 2021, 9, 91427–91461. [Google Scholar] [CrossRef]
- Singh, P.; Pasha, J.; Khorram-Manesh, A.; Goniewicz, K.; Roshani, A.; Dulebenets, M. A Holistic Analysis of Train-Vehicle Accidents at Highway-Rail Grade Crossings in Florida. Sustainability 2021, 13, 8842. [Google Scholar] [CrossRef]
- Grechi, D.; Maggi, E. The importance of punctuality in rail transport service: An empirical investigation on the delay determinants. Eur. Transp.-Trasp. Eur. 2018, 70, 1–23. [Google Scholar]
- Yan, C. Research on Route Planning of Xi’an to Hancheng Intercity Railway. Railw. Stand. Des. 2016, 11, 1562. [Google Scholar]
- Assad, A.A. Modelling of rail networks: Toward a routing/makeup model. Transp. Res. Part B Methodol. 1980, 14, 101–114. [Google Scholar] [CrossRef]
- Lee, S.D. Strategic environment assessment and biological diversity conservation in the Korean high-speed railway project. J. Environ. Assess. Policy Manag. 2005, 7, 287–298. [Google Scholar] [CrossRef]
- He, G.; Lu, Y. Public protests against the Beijing–Shenyang high-speed railway in China. Transp. Res. Part D Transp. Environ. 2016, 43, 1–16. [Google Scholar] [CrossRef] [Green Version]
- Yildirim, V.; Bediroglu, S. A geographic information system-based model for economical and eco-friendly high-speed railway route determination using analytic hierarchy process and least-cost-path analysis. Expert Syst. 2019, 36, e12376. [Google Scholar] [CrossRef]
- Hamilton, W.L.; Ying, Z.; Leskovec, J. Inductive representation learning on large graphs. NeurIPS 2017, 2, 1024–1034. [Google Scholar]
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. In Proceedings of the ICLR, Toulon, France, 24–26 April 2017. [Google Scholar]
- Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Liò, P.; Bengio, Y. Graph attention networks. In Proceedings of the ICLR, Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
- Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S. How powerful are graph neural networks? In Proceedings of the ICLR, New Orleans, LA, USA, 6–9 May 2019.
- Liu, Z.; Fang, Y.; Liu, C.; Hoi, S.C.H. Node-wise localization of graph neural networks. In Proceedings of the IJCAI, Montreal, QC, Canada, 19–27 August 2021. [Google Scholar]
- Pei, H.; Wei, B.; Chang, K.C.-C.; Lei, Y.; Yang, B. Geom-GCN: Geometric graph convolutional networks. In Proceedings of the ICLR, Addis Ababa, Ethiopia, 26–30 April 2020. [Google Scholar]
- Chang, Y.H.; Yeh, C.H.; Shen, C.C. A multi objective model for passenger train services planning: Application to Taiwan’s high-speed rail line. Transp. Res. Part B 2000, 34, 91–106. [Google Scholar] [CrossRef]
- Yin, Y. Multi objective bilevel optimization for transportation planning and management problems. J. Adv. Transp. 2002, 36, 93–105. [Google Scholar] [CrossRef]
- Gallo, M.; Montella, B.; Acierno, L.D. The transit network design problem with elastic demand and internalization of external costs: An application to rail frequency optimization. Transp. Res. Part C 2011, 19, 1276–1305. [Google Scholar] [CrossRef]
- Liu, D.; Javier, D.M.; Lu, G.U.; Peng, Q.Y.; Ning, J.; Pieter, V. A Matheuristic Iterative Approach for Profit-Oriented Line Planning Applied to the Chinese High-Speed Railway Network. J. Adv. Transp. 2020, 18, 4294195. [Google Scholar] [CrossRef]
- Scholl, S. Customer-Oriented Line Planning; University of Kaiserslautern: Kaiserslautern, Germany, 2005; pp. 23–56. [Google Scholar]
- Schöbel, A.; Scholl, S. Line planning with minimal transfers. In Proceedings of the 5th Workshop on Algorithmic Methods and Models for Optimization of Railways, Dagstuhl, Germany, 1 October 2005. [Google Scholar]
- Schöbel, A.; Scholl, S. Line Planning with Minimal Traveling Time; 5th Workshop on Algorithmic Methods and Models for Optimization of Railways (ATMOS’05); Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Mallorca, Spain, 2006; pp. 1–16. [Google Scholar]
- Fu, H.L. Research on Theory and Methods of Line Planning for High-Speed Railways; Beijing Jiaotong University: Beijing, China, 2010. (In Chinese) [Google Scholar]
- Zhao, S.; Wu, R.; Shi, F. A line planning approach for high-speed railway network with time-varying demand. Comput. Ind. Eng. 2021, 160, 107547. [Google Scholar] [CrossRef]
- Wang, L.; Jia, L.; Qin, Y.; Li, H. A two-layer optimization model for high-speed railway line planning. J. Zhejiang Univ. Sci. A 2011, 12, 902–912. [Google Scholar] [CrossRef]
- Zhao, L.; Song, Y.; Zhang, C.; Liu, Y.; Wang, P.; Lin, T.; Deng, M. T-gcn: A temporal graph convolutional network for traffic prediction. IEEE Trans. Intell. Transp. Syst. 2019, 21, 3848–3858. [Google Scholar] [CrossRef] [Green Version]
- Huang, W.; Song, G.; Hong, H.; Xie, K. Deep architecture for traffic flow prediction: Deep belief networks with multitask learning. IEEE Trans. Intell. Transp. Syst. 2014, 15, 2191–2201. [Google Scholar] [CrossRef]
- Shi, M.; Tang, Y.F.; Zhu, X.Q.; Wilson, D.A.; Liu, J.X. Multi-Class Imbalanced Graph Convolutional Network Learning. In Proceedings of the 29th International Joint Conference on Artificial Intelligence, IJCAI, Virtual, 19–26 August 2020; pp. 2879–2885. [Google Scholar]
- Ghorbani, M.; Kazi, A.; Baghshah, M.S.; Rabiee, H.R.; Navab, N. RA-GCN: Graph Convolutional Network for Disease Prediction Problems with Imbalanced Data. arXiv 2021, arXiv:2103.00221. [Google Scholar] [CrossRef]
- Patel, H.; Singh, R.D.; Thippa, R.G.; Xie, K. A review on classification of imbalanced data for wireless sensor networks. Int. J. Distrib. Sens. Netw. 2020, 16, 1550147720916404. [Google Scholar] [CrossRef]
- Yang, Y.Z.; Xu, Z. Rethinking the value of labels for improving class-imbalanced learning. In Proceedings of the Advances in Neural Information Processing Systems 33, Annual Conference on Neural Information Processing Systems, Virtual, 6–12 December 2020. [Google Scholar]
- Chen, D.L.; Lin, Y.K.; Zhao, G.X.; Ren, X.C.; Li, P.; Zhou, J.; Sun, X. Topology-Imbalance Learning for Semi-Supervised Node Classification. In Proceedings of the NeurIPS, Virtual, 6–14 December 2021; pp. 29885–29897. [Google Scholar]
- Huang, C.; Li, Y.N.; Loy, C.C.; Tang, X. Learning Deep Representation for Imbalanced Classification. In Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition, CVPR, Las Vegas, NV, USA, 27–30 June 2016; pp. 5375–5384. [Google Scholar]
- Ren, M.Y.; Zeng, W.Y.; Yang, B.; Urtasun, B. Learning to Reweight Examples for Robust Deep Learning. In Proceedings of the International Conference on Machine Learning, PMLR, Stockholm, Sweden, 10–15 July 2018; pp. 4334–4343. [Google Scholar]
- Cui, Y.; Li, M.; Jia, T.; Lin, Y.; Song, Y.; Belongie, S.J. Class-Balanced Loss Based on Effective Number of Samples. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, CVPR, Long Beach, CA, USA, 16–20 June 2019; pp. 9268–9277. [Google Scholar]
- Cao, K.D.; Wei, C.; Gaidon, A.; Aréchiga, N.; Ma, T.Y. Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss. In Proceedings of the Advances in Neural Information Processing Systems 32, Annual Conference on Neural Information Processing Systems 2019, NeurIPS, Vancouver, BC, Canada, 8–14 December 2019; pp. 1565–1576. [Google Scholar]
- Liu, Z.M.; Nguyen, T.K.; Fang, Y. Tail-GNN: Tail-Node Graph Neural Networks. In Proceedings of the KDD, Virtual, 14–18 August 2021; pp. 1109–1119. [Google Scholar]
- Leroy, V.; Cambazoglu, B.B.; Bonchi, F. Cold start link prediction. In Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Washington, DC, USA, 24–28 July 2010; pp. 393–402. [Google Scholar]
- Wang, Z.; Liang, J.; Li, R.; Qian, Y. An approach to cold-start link prediction: Establishing connections between non-topological and topological information. IEEE Trans. Knowl. Data Eng. 2016, 28, 2857–2870. [Google Scholar] [CrossRef]
- Wang, H.; Zhang, F.; Hou, M.; Qi, Y. Shine: Signed heterogeneous information network embedding for sentiment link prediction. In Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining, Marina Del Rey, CA, USA, 5–9 February 2018; pp. 592–600. [Google Scholar]
- Ge, L.; Zhang, A. Pseudo cold start link prediction with multiple sources in social networks. In Proceedings of the 2012 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics 2012, Anaheim, CA, USA, 26–28 April 2012; pp. 768–779. [Google Scholar]
- Tang, M.; Wang, W. Cold-start link prediction integrating community information via multi-nonnegative matrix factorization. Chaos Solitons Fractals 2022, 162, 112421. [Google Scholar] [CrossRef]
- Bryan, P.; Rami, A.; Steven, S. DeepWalk: Online learning of social representations. In Proceedings of the KDD, New York, NY, USA, 24–27 August 2014; pp. 701–710. [Google Scholar]
- Grover, A.; Leskovec, J. node2vec: Scalable Feature Learning for Networks. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, ACM, San Francisco, CA, USA, 13–17 August 2016; pp. 855–864. [Google Scholar]
Order | Route |
---|---|
1 | Beijing–Tianjin–Jinan–Xuzhou–Bengbu–Nanjing–Shanghai |
2 | Beijing–Shijiazhuang–Zhengzhou–Wuhan–Changsha–Guangzhou–Shenzhen–Jiulong |
3 | Beijing–Chengde–Chaoyang–Fuxin–Shenyang–Tieling–Siping–Changchun–Harbin |
4 | Shenyang–Anshan–Yingkou–Dalian |
5 | Hangzhou–Ningbo–Taizhou–Wenzhou–Fuzhou–Xiamen–Shenzhen |
6 | Nanjing–Hefei–Wuhan–Chongqing–Chengdu |
7 | Lianyungang–Xuzhou–Shangqiu–Zhengzhou–Luoyang–Xi’an–Baoji–Lanzhou–Xining–Wulumuqi |
8 | Shanghai–Hangzhou–Nanchang–Changsha–Guiyang–Kunming |
9 | Qingdao–Jinan–Dezhou–Shijiazhuang–Taiyuan |
… | … |
Methods | ACC | Macro-F |
---|---|---|
DeepWalk | 47.6003 | 43.0754 |
Node2vec | 48.4542 | 45.2780 |
GCN | 57.35 ± 1.4 | 54.02 ± 1.2 |
Scatter-GCN | 57.49 ± 1.3 | 54.05 ± 1.2 |
GAT | 60.21 ± 1.4 | 58.18 ± 1.6 |
Scatter-GAT | 60.24 ± 1.5 | 58.20 ± 1.6 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ma, M.; Zhang, Y.; Li, Y.; Li, X.; Liu, Y. Scatter-GNN: A Scatter Graph Neural Network for Prediction of High-Speed Railway Station—A Case Study of Yinchuan–Chongqing HSR. Appl. Sci. 2023, 13, 150. https://doi.org/10.3390/app13010150