Streamflow Prediction of Spatio-Temporal Graph Neural Network with Feature Enhancement Fusion
Abstract
1. Introduction
- We develop a dynamic graph construction mechanism that autonomously identifies and encodes spatial correlations between historical rainfall and flow data through adaptive graph convolution operations.
- We present a physically informed feature enhancement layer that incorporates soil moisture and evaporation processes into the temporal feature propagation, enabling context-aware weighting of historical inputs.
- We introduce a hybrid Graph-LSTM module that synergizes global spatial context learning with local temporal dependency modeling, effectively capturing both basin-scale hydrological interactions and station-specific temporal patterns.
2. Conceptual Framework
2.1. ST-GNN Architectural Framework
2.2. Multi-Source Input Modules
2.2.1. History Streamflow and Rainfall Data
- Distance graph
- Correlation graph (a construction sketch for both graphs follows this list)
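
The exact forms of Equations (1) and (2) are not reproduced here. As a minimal sketch, assuming the distance graph uses a thresholded Gaussian kernel over pairwise station distances and the correlation graph uses the Pearson correlation coefficient [37] between the stations' historical series, the two adjacency matrices could be built as follows; `distance_graph`, `correlation_graph`, and the `sigma`/`eps`/`threshold` parameters are illustrative names, not taken from the paper.

```python
# Hedged sketch of the two input graphs: a Gaussian-kernel distance graph and a
# Pearson-correlation graph. Forms and parameter names are assumptions, not the
# paper's Equations (1) and (2) verbatim.
import numpy as np


def distance_graph(coords: np.ndarray, sigma: float = 10.0, eps: float = 0.1) -> np.ndarray:
    """coords: (N, 2) station coordinates -> (N, N) distance-based adjacency."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    adj = np.exp(-(dist ** 2) / (2 * sigma ** 2))   # Gaussian kernel (assumed form)
    adj[adj < eps] = 0.0                            # drop very weak links
    return adj


def correlation_graph(history: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """history: (N, T) historical rainfall/flow series -> (N, N) correlation adjacency."""
    corr = np.corrcoef(history)                     # Pearson correlation matrix
    return np.where(np.abs(corr) >= threshold, np.abs(corr), 0.0)


# Toy usage: 4 stations, 100 hourly records each.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 50, size=(4, 2))
history = rng.gamma(shape=2.0, scale=3.0, size=(4, 100))
A_dist = distance_graph(coords)
A_corr = correlation_graph(history)
A = np.maximum(A_dist, A_corr)  # one simple way to combine the two graphs (assumption)
```
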
2.2.2. Watershed State Indicators
2.2.3. Future Rainfall Trends
2.3. Hierarchical Processing Modules
2.3.1. Spatial GCN Module
2.3.2. Feature Enhancement Fusion Module
2.3.3. Graph-LSTM Network Module
2.4. Output Prediction Module
3. Case Study
3.1. Data Source and Description
3.2. Baseline Models
3.2.1. Benchmarks
- LSTM: long short-term memory network, which is specifically designed to overcome the long-term dependency problem of the vanilla RNN [45] and is well suited to prediction over longer time series (a minimal baseline sketch follows this list).
- STA-LSTM: spatio-temporal attention long short-term memory model [46], which combines an attention mechanism with LSTM to predict floods.
- GCN: graph convolutional network. Zhao et al. [47] use a GCN for feature extraction together with an additional gated recurrent unit (GRU) decoding module.
- ST-GCN: spatio-temporal graph convolutional network, which has been widely applied in recent time-series prediction tasks such as groundwater level prediction [48].
- AGCLSTM: graph convolution-based spatial-temporal attention LSTM [49], which combines an ST-GCN with an attention mechanism to predict floods.
- ST-GNN: the spatio-temporal graph neural network model proposed in this paper.
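
For concreteness, the following is a minimal PyTorch sketch of an LSTM baseline of the kind described in the first item above, not the exact architecture or hyperparameters used in the comparison; the layer sizes, input dimensions, and forecast horizon are illustrative assumptions.

```python
# A generic LSTM baseline for multi-step streamflow forecasting.
# Shapes and hyperparameters are illustrative, not the paper's settings.
import torch
import torch.nn as nn


class LSTMBaseline(nn.Module):
    def __init__(self, n_features: int, hidden_dim: int = 64, horizon: int = 8):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, horizon)   # predict all lead times at once

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, timesteps, n_features), e.g. past rainfall + streamflow
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])                    # (batch, horizon)


# Toy usage: batch of 32 sequences, 24 past hours, 7 features, 8 h forecast horizon.
model = LSTMBaseline(n_features=7)
y_hat = model(torch.randn(32, 24, 7))
print(y_hat.shape)  # torch.Size([32, 8])
```
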
3.2.2. Model Variants
- ST-GNN-rglstm: the ST-GNN variant with the Graph-LSTM network module removed.
- ST-GNN-rgcn: the ST-GNN variant with the GCN module removed.
- ST-GNN-rfi: the ST-GNN variant with the future rainfall input module removed.
- ST-GNN-rfe: the ST-GNN variant with the feature enhancement fusion module removed.
3.3. Evaluation Indices
- Root mean square error (RMSE)
- Deterministic coefficient (DC)
- Relative difference of RMSE (%) (a computation sketch for these three indices follows this list)
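
The paper's exact formulas are not reproduced above. As a sketch, assuming RMSE has its standard definition, DC is the Nash–Sutcliffe-style deterministic coefficient commonly used in hydrological forecasting, and the relative difference of RMSE measures a model's percentage improvement over a baseline, the three indices could be computed as follows; confirm the conventions against Section 3.3 before reuse.

```python
# Hedged sketch of the three evaluation indices used in the case study.
import numpy as np


def rmse(obs: np.ndarray, sim: np.ndarray) -> float:
    """Root mean square error between observed and simulated streamflow."""
    return float(np.sqrt(np.mean((obs - sim) ** 2)))


def dc(obs: np.ndarray, sim: np.ndarray) -> float:
    """Deterministic coefficient: 1 - SSE / variance of observations (assumed form)."""
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))


def relative_rmse_diff(rmse_baseline: float, rmse_model: float) -> float:
    """Percent reduction of RMSE relative to a baseline (assumed convention)."""
    return 100.0 * (rmse_baseline - rmse_model) / rmse_baseline


obs = np.array([120.0, 180.0, 250.0, 210.0, 150.0])
sim = np.array([110.0, 190.0, 240.0, 220.0, 140.0])
print(rmse(obs, sim), dc(obs, sim), relative_rmse_diff(78.02, 72.35))
```
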
3.4. Experimental Methods
4. Results and Discussion
4.1. Comprehensive Evaluation of ST-GNN Predictive Performance and Discussion
4.1.1. Comprehensive Evaluation of ST-GNN Predictive Performance
4.1.2. Discussion
4.2. Optimization Analysis of ST-GNN Input Timesteps
4.3. Analysis of ST-GNN’s Long-Term Prediction Stability
4.4. Ablation Study on Key Modules of ST-GNN Model
4.5. Watershed Adaptability Analysis of Feature Enhancement Fusion Module
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Zhou, G.; Zhao, Q.; Zhang, S.; Zhang, D.; Li, C. Rolling forecast of snowmelt floods in data-scarce mountainous regions using weather forecast products to drive distributed energy balance hydrological model. J. Hydrol. 2024, 637, 131384. [Google Scholar] [CrossRef]
- Izadi, A.; Zarei, N.; Nikoo, M.R.; Al-Wardy, M.; Yazdandoost, F. Exploring the potential of deep learning for streamflow forecasting: A comparative study with hydrological models for seasonal and perennial rivers. Expert Syst. Appl. 2024, 252, 124139. [Google Scholar] [CrossRef]
- Kim, S.; Matsumi, Y.; Pan, S.; Mase, H. A real-time forecast model using artificial neural network for after-runner storm surges on the Tottori Coast, Japan. Ocean Eng. 2016, 122, 44–53. [Google Scholar] [CrossRef]
- Tareke, K.A.; Awoke, A.G. Hydrological drought forecasting and monitoring system development using artificial neural network (ANN) in Ethiopia. Heliyon 2023, 9, e13287. [Google Scholar] [CrossRef]
- Araghinejad, S.; Azmi, M.; Kholghi, M. Application of artificial neural network ensembles in probabilistic hydrological forecasting. J. Hydrol. 2011, 407, 94–104. [Google Scholar]
- Zhang, J.; Zhu, Y.; Zhang, X.; Ye, M.; Yang, J. Developing a Long Short-Term Memory (LSTM) based model for predicting water table depth in agricultural areas. J. Hydrol. 2018, 561, 918–929. [Google Scholar] [CrossRef]
- Gao, S.; Huang, Y.; Zhang, S.; Han, J.; Wang, G.; Zhang, M.; Lin, Q. Short-term runoff prediction with GRU and LSTM networks without requiring time step optimization during sample generation. J. Hydrol. 2020, 589, 125188. [Google Scholar] [CrossRef]
- Ni, L.; Wang, D.; Singh, V.P.; Wu, J.; Wang, Y.; Tao, Y.; Zhang, J. Streamflow and rainfall forecasting by two long short-term memory-based models. J. Hydrol. 2020, 583, 124296. [Google Scholar] [CrossRef]
- Deng, H.; Chen, W.; Huang, G. Deep insight into daily runoff forecasting based on a CNN-LSTM model. Nat. Hazards 2022, 113, 1675–1696. [Google Scholar] [CrossRef]
- Khorram, S.; Jehbez, N. A hybrid CNN-LSTM approach for monthly reservoir inflow forecasting. Water Resour. Manag. 2023, 37, 4097–4121. [Google Scholar] [CrossRef]
- Dehghani, A.; Moazam, H.M.Z.H.; Mortazavizadeh, F.; Ranjbar, V.; Mirzaei, M.; Mortezavi, S.; Ng, J.L.; Dehghani, A. Comparative evaluation of LSTM, CNN, and ConvLSTM for hourly short-term streamflow forecasting using deep learning approaches. Ecol. Inform. 2023, 75, 102119. [Google Scholar] [CrossRef]
- Malik, H.; Feng, J.; Shao, P.; Abduljabbar, Z.A. Improving flood forecasting using time-distributed CNN-LSTM model: A time-distributed spatiotemporal method. Earth Sci. Inform. 2024, 17, 3455–3474. [Google Scholar] [CrossRef]
- Jin, A.; Wang, Q.; Zhan, H.; Zhou, R. Comparative performance assessment of physical-based and data-driven machine-learning models for simulating streamflow: A case study in three catchments across the US. J. Hydrol. Eng. 2024, 29, 05024004. [Google Scholar] [CrossRef]
- Hu, X.; Shi, L.; Lin, G.; Lin, L. Comparison of physical-based, data-driven and hybrid modeling approaches for evapotranspiration estimation. J. Hydrol. 2021, 601, 126592. [Google Scholar] [CrossRef]
- Sabzipour, B.; Arsenault, R.; Troin, M.; Martel, J.L.; Brissette, F.; Brunet, F.; Mai, J. Comparing a long short-term memory (LSTM) neural network with a physically-based hydrological model for streamflow forecasting over a Canadian catchment. J. Hydrol. 2023, 627, 130380. [Google Scholar] [CrossRef]
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. 2008, 20, 61–80. [Google Scholar] [CrossRef]
- Feng, Z.; Wang, R.; Wang, T.; Song, M.; Wu, S.; He, S. A comprehensive survey of dynamic graph neural networks: Models, frameworks, benchmarks, experiments and challenges. IEEE Trans. Knowl. Data Eng. 2025, 38, 26–46. [Google Scholar] [CrossRef]
- Liu, H.; Yang, D.; Liu, X.; Chen, X.; Liang, Z.; Wang, H.; Cui, Y.; Gu, J. Todynet: Temporal dynamic graph neural network for multivariate time series classification. Inf. Sci. 2024, 677, 120914. [Google Scholar] [CrossRef]
- Lu, G.; Bu, S. Online Power System Dynamic Security Assessment: A GNN–FNO Approach Learning From Multisource Spatial–Temporal Data. IEEE Trans. Ind. Inform. 2025, 21, 7598–7608. [Google Scholar] [CrossRef]
- Gu, W.; Sun, M.; Liu, B.; Xu, K.; Sui, M. Adaptive Spatio-Temporal Aggregation for Temporal Dynamic Graph-Based Fraud Risk Detection. J. Comput. Technol. Softw. 2024, 3, 1–5. [Google Scholar]
- Liu, J.; Liu, J.; Zhao, K.; Tang, Y.; Chen, W. TP-GNN: Continuous dynamic graph neural network for graph classification. In Proceedings of the 2024 IEEE 40th International Conference on Data Engineering (ICDE), Utrecht, The Netherlands, 13–16 May 2024; IEEE: New York, NY, USA, 2024; pp. 2848–2861. [Google Scholar]
- Wang, T.; Ding, Z.; Chang, Z.; Yang, X.; Chen, Y.; Li, M.; Xu, S.; Wang, Y. A novel graph neural network framework for resting-state functional MRI spatiotemporal dynamics analysis. Phys. A Stat. Mech. Its Appl. 2025, 669, 130582. [Google Scholar] [CrossRef]
- Molaei, S.; Niknam, G.; Ghosheh, G.O.; Chauhan, V.K.; Zare, H.; Zhu, T.; Pan, S.; Clifton, D.A. Temporal dynamics unleashed: Elevating variational graph attention. Knowl.-Based Syst. 2024, 299, 112110. [Google Scholar] [CrossRef]
- Liu, Y.; Hou, G.; Huang, F.; Qin, H.; Wang, B.; Yi, L. Directed graph deep neural network for multi-step daily streamflow forecasting. J. Hydrol. 2022, 607, 127515. [Google Scholar] [CrossRef]
- Kazadi, A.N.; Doss-Gollin, J.; Sebastian, A.; Silva, A. Flood prediction with graph neural networks. Clim. Change AI 2022. Available online: https://s3.us-east-1.amazonaws.com/climate-change-ai/papers/neurips2022/75/paper.pdf (accessed on 25 January 2026).
- Bai, T.; Tahmasebi, P. Graph neural network for groundwater level forecasting. J. Hydrol. 2023, 616, 128792. [Google Scholar] [CrossRef]
- Taghizadeh, M.; Zandsalimi, Z.; Nabian, M.A.; Shafiee-Jood, M.; Alemazkoor, N. Interpretable physics-informed graph neural networks for flood forecasting. Comput.-Aided Civ. Infrastruct. Eng. 2025, 40, 2629–2649. [Google Scholar] [CrossRef]
- Ashraf, I.; Strotherm, J.; Hermes, L.; Hammer, B. Physics-informed graph neural networks for water distribution systems. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 20–27 February 2024; Volume 38, pp. 21905–21913. [Google Scholar]
- Yang, B.; Chen, L.; Yi, B.; Li, S.; Leng, Z. Local Weather and Global Climate Data-Driven Long-Term Runoff Forecasting Based on Local–Global–Temporal Attention Mechanisms and Graph Attention Networks. Remote Sens. 2024, 16, 3659. [Google Scholar] [CrossRef]
- Li, J.; Hsu, K.-L.; Jiang, A.L.; Zhang, Y.; Ye, A.; Sorooshian, S. Improving Rainfall-runoff Modeling Using an Attention-based Model. Authorea Prepr. 2025. [Google Scholar] [CrossRef]
- Yuan, W.; Yan, H. Enhancing runoff prediction with causal lag-aware attention and multi-scale fusion in transformer models. J. Hydrol. 2025, 664, 134369. [Google Scholar] [CrossRef]
- Sheng, Z.; Cao, Y.; Yang, Y.; Feng, Z.K.; Shi, K.; Huang, T.; Wen, S. Residual temporal convolutional network with dual attention mechanism for multilead-time interpretable runoff forecasting. IEEE Trans. Neural Netw. Learn. Syst. 2024, 36, 8757–8771. [Google Scholar] [CrossRef]
- Zhou, R. Multi-scale dynamic spatiotemporal graph attention network for forecasting karst spring discharge. J. Hydrol. 2025, 659, 133289. [Google Scholar] [CrossRef]
- Taccari, M.L.; Wang, H.; Nuttall, J.; Chen, X.; Jimack, P.K. Spatial-temporal graph neural networks for groundwater data. Sci. Rep. 2024, 14, 24564. [Google Scholar] [CrossRef]
- Xue, L.; Zhu, Y. Dynamic hydrological flow prediction with self-iterative spatiotemporal graph neural network: Modeling long-and short-period topological dynamics. J. Hydrol. 2025, 663, 134122. [Google Scholar] [CrossRef]
- Jin, G.; Liang, Y.; Fang, Y.; Shao, Z.; Huang, J.; Zhang, J.; Zheng, Y. Spatio-temporal graph neural networks for predictive learning in urban computing: A survey. IEEE Trans. Knowl. Data Eng. 2023, 36, 5388–5408. [Google Scholar] [CrossRef]
- Benesty, J.; Chen, J.; Huang, Y.; Cohen, I. Pearson correlation coefficient. In Noise Reduction in Speech Processing; Springer: Berlin/Heidelberg, Germany, 2009; pp. 1–4. [Google Scholar]
- Mahanama, S.; Livneh, B.; Koster, R.; Lettenmaier, D.; Reichle, R. Soil moisture, snow, and seasonal streamflow forecasts in the United States. J. Hydrometeorol. 2012, 13, 189–203. [Google Scholar] [CrossRef]
- Koster, R.D.; Mahanama, S.P.; Livneh, B.; Lettenmaier, D.P.; Reichle, R.H. Skill in streamflow forecasts derived from large-scale estimates of soil moisture and snow. Nat. Geosci. 2010, 3, 613–616. [Google Scholar] [CrossRef]
- Atwood, J.; Towsley, D. Diffusion-convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; pp. 1993–2001. [Google Scholar]
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907. [Google Scholar]
- Li, Y.; Yu, R.; Shahabi, C.; Liu, Y. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. In Proceedings of the International Conference on Learning Representations, Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
- Shu, X.; Zhang, L.; Sun, Y.; Tang, J. Host–parasite: Graph LSTM-in-LSTM for group activity recognition. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 663–674. [Google Scholar] [CrossRef]
- Xu, W.; Chen, J.; Zhang, X.J. Scale effects of the monthly streamflow prediction using a state-of-the-art deep learning model. Water Resour. Manag. 2022, 36, 3609–3625. [Google Scholar] [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef] [PubMed]
- Ding, Y.; Zhu, Y.; Feng, J.; Zhang, P.; Cheng, Z. Interpretable spatio-temporal attention LSTM model for flood forecasting. Neurocomputing 2020, 403, 348–359. [Google Scholar] [CrossRef]
- Zhao, Q.; Zhu, Y.; Shu, K.; Wan, D.; Yu, Y.; Zhou, X.; Liu, H. Joint Spatial and Temporal Modeling for Hydrological Prediction. IEEE Access 2020, 8, 78492–78503. [Google Scholar] [CrossRef]
- Chen, L.; Zhang, D.; Xu, J.; Zhou, Z.; Jin, J.; Luan, J.; Wulamu, A. Enhancing the accuracy of groundwater level prediction at different scales using spatio-temporal graph convolutional model. Earth Sci. Inform. 2025, 18, 250. [Google Scholar]
- Feng, J.; Sha, H.; Ding, Y.; Yan, L.; Yu, Z. Graph convolution based spatial-temporal attention LSTM model for flood forecasting. In Proceedings of the 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–8. [Google Scholar]

| Module | Function & Description | Input/Output Specifications |
|---|---|---|
| Graph Constructor | Constructs adjacency matrices based on (1) physical distance (Equation (1)) and (2) statistical correlation (Equation (2)); the final adjacency combines both graphs. | Input: station coordinates; historical features. Output: adjacency matrices. |
| Spatial GCN | Performs K-layer graph convolution (Equation (6)) using diffusion convolution, aggregating spatial information from neighboring stations with summation across diffusion steps. | Input: node features; adjacency matrices. Output: spatially aggregated features. Parameters: K layers, hidden dimension D. |
| Feature Fusion | Enhances the spatial GCN features with watershed state features via a gating mechanism (Equation (7)); the gating vector is obtained by applying a fully connected network (FCN) to the concatenated watershed state features from Equation (3). | Input: spatial GCN features; watershed state features. Output: fused features. |
| Graph-LSTM | Processes the temporal sequence of fused features together with future rainfall information to capture spatio-temporal dependencies. Each LSTM unit (Equations (8)–(14)) aggregates information from neighboring nodes and incorporates future rainfall trends (from Equation (4)) through an additional input gate. | Input: fused feature sequence; future rainfall trends. Output: hidden state sequence. Parameters: hidden dimension. |
| Output Layer | Applies a linear projection to produce multi-step streamflow forecasts. | Input: Graph-LSTM hidden states. Output: multi-step streamflow forecasts. Parameters: projection weight matrix and bias. |
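
The table above summarizes the processing chain without reproducing Equations (1)–(14). The PyTorch sketch below is a rough, assumption-laden rendering of that chain for a single timestep: a K-step diffusion-style graph convolution, a gated enhancement of the spatial features by watershed state features (soil moisture, evaporation), an LSTM-style update that aggregates neighbor hidden states and sees a future-rainfall summary, and a linear multi-step output head. Tensor shapes, the adjacency normalization, and the exact gate placement are illustrative choices, not the paper's equations.

```python
# Assumption-laden sketch of the ST-GNN processing chain described in the table.
# Shapes: N stations, F input features, D hidden dim, K diffusion steps, H horizon.
import torch
import torch.nn as nn


class DiffusionGCN(nn.Module):
    """K-step diffusion-style graph convolution, summed over steps (cf. Equation (6))."""

    def __init__(self, in_dim: int, out_dim: int, k_steps: int = 2):
        super().__init__()
        self.k_steps = k_steps
        self.theta = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in range(k_steps + 1)])

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, F); adj: (N, N) row-normalized transition matrix (assumed)
        out, h = self.theta[0](x), x
        for k in range(1, self.k_steps + 1):
            h = adj @ h                              # one diffusion step over the graph
            out = out + self.theta[k](h)
        return torch.relu(out)


class GatedFusion(nn.Module):
    """Gate spatial features with watershed state indicators (cf. Equation (7))."""

    def __init__(self, spatial_dim: int, state_dim: int):
        super().__init__()
        self.fcn = nn.Linear(state_dim, spatial_dim)  # FCN over concatenated state features

    def forward(self, spatial: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.fcn(state))         # context-aware weights in (0, 1)
        return gate * spatial                         # element-wise enhancement


class GraphLSTMStep(nn.Module):
    """One LSTM-style update with neighbor aggregation and a future-rainfall gate
    (loosely mirroring Equations (8)-(14); the extra-gate placement is assumed)."""

    def __init__(self, in_dim: int, hidden_dim: int, rain_dim: int):
        super().__init__()
        self.gates = nn.Linear(in_dim + hidden_dim, 4 * hidden_dim)
        self.rain_gate = nn.Linear(rain_dim, hidden_dim)

    def forward(self, x, h, c, adj, future_rain):
        h_nbr = adj @ h                                  # aggregate neighbor hidden states
        i, f, g, o = self.gates(torch.cat([x, h_nbr], dim=-1)).chunk(4, dim=-1)
        r = torch.sigmoid(self.rain_gate(future_rain))   # additional future-rainfall gate
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g) * r
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


# Toy end-to-end pass for one timestep, then a linear multi-step output head.
N, F, D, S, R, H = 8, 4, 32, 2, 3, 8
adj = torch.softmax(torch.randn(N, N), dim=-1)           # stand-in normalized adjacency
gcn, fuse = DiffusionGCN(F, D), GatedFusion(D, S)
cell, head = GraphLSTMStep(D, D, R), nn.Linear(D, H)
h = c = torch.zeros(N, D)
spatial = gcn(torch.randn(N, F), adj)                    # Spatial GCN module
fused = fuse(spatial, torch.randn(N, S))                 # Feature enhancement fusion
h, c = cell(fused, h, c, adj, torch.randn(N, R))         # Graph-LSTM step
forecast = head(h)                                       # (N, H) multi-step streamflow
```
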
| Watershed | Type of Stations | Count | Time Granularity | Training Set | Test Set |
|---|---|---|---|---|---|
| Changhua | Hydrological station | 1 | 1 sample/h | 7483 | 1871 |
| | Rainfall station | 6 | | | |
| Chenhe | Hydrological station | 1 | 1 sample/h | 14,796 | 3699 |
| | Rainfall station | 8 | | | |
| Pingyao | Hydrological station | 1 | 1 sample/h | 12,324 | 3081 |
| | Rainfall station | 7 | | | |
| Daheba | Hydrological station | 1 | 1 sample/h | 10,247 | 2561 |
| | Rainfall station | 15 | | | |
| Data Set | Number of Samples | Max | Min | Average | Median | SD |
|---|---|---|---|---|---|---|
| Changhua data set | 9371 | 2100 | 0.58 | 146.65 | 80.32 | 202.49 |
| Chenhe data set | 18,496 | 1740 | 0.69 | 36.92 | 13.50 | 78.77 |
| Pingyao data set | 15,406 | 630 | 4.22 | 97.85 | 62.93 | 106.22 |
| Daheba data set | 12,809 | 3300 | 1.40 | 85.73 | 25.96 | 209.07 |
| Data Set | Number of Samples | Max | Min | Average | Median | SD |
|---|---|---|---|---|---|---|
| Changhua data set | 9371 | 36.91 | 0.00 | 0.73 | 0.00 | 2.06 |
| Chenhe data set | 18,496 | 7.42 | 0.00 | 0.20 | 0.00 | 0.57 |
| Pingyao data set | 15,406 | 32.00 | 0.00 | 0.70 | 0.15 | 1.61 |
| Daheba data set | 12,809 | 14.48 | 0.00 | 0.25 | 0.00 | 0.86 |
| Models | Metric | 1 h | 2 h | 3 h | 4 h | 5 h | 6 h | 7 h | 8 h |
|---|---|---|---|---|---|---|---|---|---|
| LSTM | RMSE | 78.02 | 79.82 | 81.85 | 83.99 | 85.61 | 88.67 | 92.08 | 96.98 |
| | DC | 0.961 | 0.952 | 0.945 | 0.933 | 0.916 | 0.893 | 0.871 | 0.842 |
| STA-LSTM | RMSE | 74.32 | 75.54 | 76.71 | 77.72 | 80.61 | 83.33 | 86.03 | 92.41 |
| | DC | 0.975 | 0.968 | 0.958 | 0.948 | 0.928 | 0.918 | 0.901 | 0.867 |
| GCN | RMSE | 78.33 | 78.92 | 80.75 | 82.79 | 84.61 | 87.37 | 91.08 | 95.48 |
| | DC | 0.966 | 0.952 | 0.944 | 0.932 | 0.912 | 0.903 | 0.889 | 0.842 |
| ST-GCN | RMSE | 75.32 | 76.54 | 77.71 | 78.72 | 82.61 | 85.33 | 88.03 | 95.41 |
| | DC | 0.971 | 0.963 | 0.951 | 0.944 | 0.921 | 0.914 | 0.892 | 0.853 |
| AGCLSTM | RMSE | 73.82 | 74.54 | 76.21 | 77.22 | 79.61 | 83.33 | 85.03 | 90.41 |
| | DC | 0.978 | 0.970 | 0.961 | 0.950 | 0.932 | 0.920 | 0.906 | 0.874 |
| ST-GNN | RMSE | 72.35 | 73.82 | 75.65 | 77.95 | 80.66 | 82.67 | 85.98 | 91.01 |
| | DC | 0.981 | 0.972 | 0.965 | 0.953 | 0.936 | 0.923 | 0.911 | 0.882 |
| Models | Metric | 1 h | 2 h | 3 h | 4 h | 5 h | 6 h | 7 h | 8 h |
|---|---|---|---|---|---|---|---|---|---|
| LSTM | RMSE | 60.32 | 61.82 | 63.32 | 65.91 | 67.62 | 68.61 | 72.02 | 76.98 |
| | DC | 0.946 | 0.932 | 0.927 | 0.921 | 0.905 | 0.873 | 0.851 | 0.822 |
| STA-LSTM | RMSE | 65.32 | 66.54 | 67.71 | 68.72 | 70.61 | 72.33 | 75.03 | 78.41 |
| | DC | 0.935 | 0.928 | 0.920 | 0.912 | 0.901 | 0.885 | 0.865 | 0.838 |
| GCN | RMSE | 60.62 | 61.32 | 63.14 | 64.99 | 67.64 | 68.67 | 71.91 | 75.88 |
| | DC | 0.945 | 0.935 | 0.925 | 0.922 | 0.899 | 0.872 | 0.849 | 0.821 |
| ST-GCN | RMSE | 58.65 | 59.35 | 61.32 | 63.23 | 65.61 | 66.64 | 70.12 | 74.51 |
| | DC | 0.942 | 0.933 | 0.914 | 0.905 | 0.892 | 0.871 | 0.857 | 0.831 |
| AGCLSTM | RMSE | 62.32 | 63.54 | 64.71 | 65.72 | 67.61 | 69.33 | 72.03 | 76.41 |
| | DC | 0.938 | 0.930 | 0.922 | 0.915 | 0.902 | 0.888 | 0.869 | 0.840 |
| ST-GNN | RMSE | 55.35 | 55.82 | 57.63 | 58.75 | 59.89 | 61.37 | 64.94 | 68.01 |
| | DC | 0.955 | 0.942 | 0.931 | 0.918 | 0.906 | 0.893 | 0.871 | 0.842 |
| Models | Metric | 1 h | 2 h | 3 h | 4 h | 5 h | 6 h | 7 h | 8 h |
|---|---|---|---|---|---|---|---|---|---|
| LSTM | RMSE | 47.05 | 48.22 | 49.39 | 51.41 | 52.74 | 53.52 | 56.18 | 60.04 |
| | DC | 0.951 | 0.942 | 0.935 | 0.923 | 0.906 | 0.884 | 0.862 | 0.833 |
| STA-LSTM | RMSE | 44.82 | 45.68 | 46.91 | 48.22 | 49.45 | 51.17 | 53.21 | 56.26 |
| | DC | 0.960 | 0.952 | 0.943 | 0.931 | 0.915 | 0.902 | 0.877 | 0.848 |
| GCN | RMSE | 47.37 | 48.42 | 49.54 | 51.21 | 52.41 | 53.52 | 56.26 | 59.84 |
| | DC | 0.950 | 0.941 | 0.934 | 0.926 | 0.908 | 0.888 | 0.861 | 0.831 |
| ST-GCN | RMSE | 45.31 | 46.42 | 47.55 | 50.22 | 51.45 | 53.17 | 55.21 | 58.26 |
| | DC | 0.952 | 0.945 | 0.938 | 0.926 | 0.906 | 0.894 | 0.865 | 0.836 |
| AGCLSTM | RMSE | 43.32 | 44.18 | 45.41 | 47.22 | 48.45 | 50.17 | 52.21 | 54.26 |
| | DC | 0.965 | 0.957 | 0.948 | 0.935 | 0.920 | 0.908 | 0.885 | 0.855 |
| ST-GNN | RMSE | 41.51 | 41.87 | 43.22 | 44.06 | 44.92 | 46.03 | 48.71 | 51.01 |
| | DC | 0.974 | 0.961 | 0.950 | 0.936 | 0.924 | 0.911 | 0.888 | 0.859 |
| Models | Metric | 1 h | 2 h | 3 h | 4 h | 5 h | 6 h | 7 h | 8 h |
|---|---|---|---|---|---|---|---|---|---|
| LSTM | RMSE | 100.65 | 102.97 | 105.59 | 108.35 | 110.44 | 114.38 | 118.78 | 125.10 |
| | DC | 0.896 | 0.887 | 0.881 | 0.870 | 0.854 | 0.832 | 0.812 | 0.785 |
| STA-LSTM | RMSE | 97.45 | 99.12 | 101.85 | 104.62 | 107.25 | 110.38 | 114.72 | 121.15 |
| | DC | 0.905 | 0.898 | 0.890 | 0.878 | 0.863 | 0.846 | 0.830 | 0.808 |
| GCN | RMSE | 101.91 | 103.51 | 106.52 | 109.86 | 110.76 | 113.31 | 117.71 | 124.44 |
| | DC | 0.895 | 0.881 | 0.879 | 0.862 | 0.846 | 0.831 | 0.811 | 0.789 |
| ST-GCN | RMSE | 98.92 | 100.52 | 103.53 | 106.71 | 108.72 | 112.35 | 116.72 | 123.11 |
| | DC | 0.899 | 0.892 | 0.883 | 0.865 | 0.855 | 0.836 | 0.816 | 0.797 |
| AGCLSTM | RMSE | 96.15 | 97.82 | 100.55 | 103.32 | 105.95 | 109.08 | 113.42 | 119.85 |
| | DC | 0.910 | 0.903 | 0.895 | 0.883 | 0.868 | 0.851 | 0.835 | 0.813 |
| ST-GNN | RMSE | 94.06 | 95.97 | 98.35 | 101.34 | 104.86 | 107.47 | 111.77 | 118.31 |
| | DC | 0.914 | 0.906 | 0.899 | 0.888 | 0.872 | 0.860 | 0.849 | 0.822 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.

