Enhancing Energy Market Forecasting with Graph Convolutional Networks: A Multi-Node Time-Series Analysis Framework
Abstract
1. Introduction
2. Literature Review
3. Materials and Methods
3.1. Graph Construction and Normalization
- Augment adjacency: $\tilde{A} = A + I_N$ (add self-loops).
- Augment degree: $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$.
- $\hat{A} = \tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}$: Normalized adjacency matrix.
- $\tilde{D}^{-1/2}$: A diagonal matrix containing the inverse square roots of the augmented degrees.
- $\hat{A}$ is symmetric.
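The augmentation and symmetric normalization above can be sketched in a few lines of NumPy; the 3-node path graph is illustrative, not taken from the paper.

```python
import numpy as np

def normalize_adjacency(A: np.ndarray) -> np.ndarray:
    """Symmetric renormalization: A_hat = D~^{-1/2} (A + I) D~^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])         # augment adjacency with self-loops
    d = A_tilde.sum(axis=1)                  # augmented degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # diagonal of inverse square roots
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

# Illustrative 3-node path graph: 0 - 1 - 2
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
A_hat = normalize_adjacency(A)
print(np.allclose(A_hat, A_hat.T))  # symmetric, as noted above
```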
3.1.1. GCN Layer as a First-Order Polynomial
The layer update is $H^{(l+1)} = \sigma\!\left(\hat{A}\,H^{(l)}\,W^{(l)}\right)$, where:
- $H^{(l)}$ is the l-th layer’s node feature matrix;
- $W^{(l)}$ are trainable weights;
- $\sigma$ is a nonlinear activation (ReLU).
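A single layer of this form, with illustrative dimensions; the hidden size, weight initialization, and the identity stand-in for the normalized adjacency are placeholders, not the paper's settings.

```python
import numpy as np

def gcn_layer(A_hat: np.ndarray, H: np.ndarray, W: np.ndarray) -> np.ndarray:
    """First-order GCN layer: H_next = ReLU(A_hat @ H @ W)."""
    return np.maximum(A_hat @ H @ W, 0.0)   # ReLU keeps the positive part

rng = np.random.default_rng(0)
N, F, h = 3, 2, 4                 # nodes, input features, hidden size (illustrative)
A_hat = np.eye(N)                 # identity stands in for a normalized adjacency
H0 = rng.standard_normal((N, F))
W0 = rng.standard_normal((F, h))
H1 = gcn_layer(A_hat, H0, W0)
print(H1.shape)                   # (3, 4)
```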
3.1.2. Relation to Normalized Laplacian
3.2. Feature Construction
- Current value $x_i(t-1)$: The latest observed value at time (t − 1).
- Moving average $\bar{x}_i(t)$: A smoothed version of the data that reduces noise, computed as $\bar{x}_i(t) = \frac{1}{n}\sum_{k=1}^{n} x_i(t-k)$, where n is a tunable parameter that determines the window size, typically n = 5 (averaging over 5 time steps).
- Rate of change $\Delta x_i(t) = x_i(t-1) - x_i(t-2)$: The difference between the two most recent observations, capturing short-term trends and variations in energy demand or supply.

These features are concatenated to form $X(t) \in \mathbb{R}^{N \times F}$, where N is the number of nodes and F is the number of features. All feature dimensions are normalized (zero mean, unit variance) using statistics computed on the training set; normalization is applied per feature across nodes unless a domain reason suggests per-node scaling. Collecting these for all nodes and time steps forms a tensor of shape $(T - n) \times N \times F$, where T is the total number of time steps and n is the number of steps in the moving average (here, 5).
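The three features can be assembled per node as below; the synthetic ramp series and the indexing convention (features at time t use observations up to t − 1) are assumptions for illustration.

```python
import numpy as np

def build_features(series: np.ndarray, t: int, n: int = 5) -> np.ndarray:
    """Feature matrix X(t) for all nodes: columns are
    [current value x(t-1), moving average over the last n steps, rate of change].
    `series` has shape (T, N); returns an (N, 3) array."""
    current = series[t - 1]                    # latest observed value
    moving_avg = series[t - n:t].mean(axis=0)  # smooths noise over n steps
    rate = series[t - 1] - series[t - 2]       # difference of two most recent obs
    return np.stack([current, moving_avg, rate], axis=1)

# Synthetic two-node series: each node climbs by 2 per time step
series = np.arange(20, dtype=float).reshape(10, 2)
X = build_features(series, t=6, n=5)
print(X[0])   # [10.  6.  2.]
```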
3.3. Integrated GCN-Based Model Architecture
3.3.1. GCN for Forecasting
- h is the size of the hidden dimension;
- σ is the ReLU activation function (σ(x) = max(0, x));
- $H'$ is the output representation after convolution.
3.3.2. Self-Attention Mechanism
- $QK^T$: The dot product between queries and keys across all nodes, computing pairwise similarity.
- The softmax of the scaled scores yields the attention weight matrix, whose entry (i, j) shows how much node i attends to node j.
- ⊙ denotes element-wise multiplication, and the scaling factor $\sqrt{d_k}$ prevents large dot products from destabilizing the softmax gradients.
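A minimal NumPy sketch of the scaled dot-product attention described above; the node count and projection sizes are illustrative.

```python
import numpy as np

def self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over nodes.
    alpha[i, j] is how much node i attends to node j; scores are divided by
    sqrt(d_k) so large dot products do not destabilize the softmax."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    d_k = K.shape[1]
    scores = (Q @ K.T) / np.sqrt(d_k)
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum(axis=1, keepdims=True)     # each row sums to 1
    return alpha @ V, alpha

rng = np.random.default_rng(1)
N, F, d = 4, 3, 2                     # nodes, feature dim, projection dim
H = rng.standard_normal((N, F))
Wq, Wk, Wv = (rng.standard_normal((F, d)) for _ in range(3))
out, alpha = self_attention(H, Wq, Wk, Wv)
print(alpha.sum(axis=1))              # each row sums to 1
```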
3.3.3. Final Prediction Head
3.3.4. Physical Constraint Integration
- A. Load Balancing and Generation Mix
- Load balancing:
- Load shedding:
- Generation mix cost (minimization):
- B. Demand Management Constraints
- C. Vehicle-Grid Integration
- D. AC Load Flow Equations
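For reference, the polar-form AC load flow equations at bus $i$ take the standard textbook form (the paper's exact notation may differ):

```latex
P_i = \sum_{j=1}^{N} |V_i||V_j|\left(G_{ij}\cos\theta_{ij} + B_{ij}\sin\theta_{ij}\right),
\qquad
Q_i = \sum_{j=1}^{N} |V_i||V_j|\left(G_{ij}\sin\theta_{ij} - B_{ij}\cos\theta_{ij}\right),
```

where $G_{ij} + jB_{ij}$ is the $(i, j)$ entry of the bus admittance matrix and $\theta_{ij} = \theta_i - \theta_j$ is the voltage-angle difference.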
3.3.5. Output Layer
- A. Loss Function and Backpropagation
- B. Training Strategy
3.4. Evaluation and Forecasting
Forecasting Procedures
| Algorithm 1: Training Physics-Aware GCN + Attention Forecasting Model |
| Input: Historical feature sequences; ground-truth targets Y(t); graph adjacency A; constraint definitions and parameters (capacity bounds, voltage limits, reserve margin, EV SoC bounds, etc.); hyperparameter search space for λ; learning rate schedule parameters; early stopping patience p; fusion method Fuse(·,·) (concatenate + linear, or residual); batch size or rolling-window setup for time-series CV |
| Preprocessing: 1. Compute the normalized adjacency: Equation (2). 2. Compute feature normalization statistics on the training set. 3. Normalize the features for all t using these statistics. |
| Initialize: GCN weights and biases; attention projection weights; fusion head parameters (if applicable); MLP prediction head parameters; optimizer state (e.g., Adam); best validation loss ← ∞; early stopping counter ← 0 |
| Hyperparameter tuning loop (e.g., grid or Bayesian search over λ): For each candidate λ combination: Reset model parameters (or use a warm-start strategy). For epoch = 1 to max_epochs: For each time step t in the training fold (respecting temporal order): 1. Input X(t) 2. GCN forward: Equations (9) and (10) 3. Attention: Equations (11)–(13) 4. Fusion: Equation (14) 5. Prediction: Equation (15) 6. Compute physical constraint penalties: Equations (18), (20)–(22) 7. Base loss 8. Total loss: Equation (43) 9. Backpropagate and update parameters per the optimizer with the current learning rate η_epoch. End for (time steps) 10. Update the learning rate via the cosine annealing schedule. 11. Evaluate on the validation fold: compute the validation loss (same decomposition) and record constraint violation statistics. 12. Early stopping: If the validation loss improved: best validation loss ← current; save a model snapshot; early stopping counter ← 0. Else: early stopping counter += 1; if counter ≥ p, break the epoch loop. End for (epochs) |
| Record validation metrics and constraint satisfaction for this λ. Select the λ that yields the best trade-off (e.g., lowest validation error with acceptable constraint violations). |
| Output: Trained model parameters (with the selected λ); validation and constraint diagnostics (violation frequencies and magnitudes) |
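The total objective in step 8 combines the base forecasting loss with weighted physics penalties. The sketch below is a hypothetical instance: the `penalties` names and forms (e.g., the non-negativity penalty) are placeholders, not the paper's Equations (18), (20)–(22).

```python
import numpy as np

def total_loss(y_pred, y_true, penalties, lambdas):
    """Base MSE plus a weighted sum of non-negative constraint penalties."""
    base = float(np.mean((y_pred - y_true) ** 2))
    physics = sum(lambdas[name] * fn(y_pred) for name, fn in penalties.items())
    return base + physics

# Hypothetical penalty: squared negative part discourages negative forecasts
penalties = {"nonneg": lambda y: float(np.sum(np.minimum(y, 0.0) ** 2))}
lambdas = {"nonneg": 10.0}

y_true = np.zeros(3)
y_pred = np.array([1.0, -1.0, 0.0])
loss = total_loss(y_pred, y_true, penalties, lambdas)
print(loss)   # 2/3 base MSE + 10 * 1 penalty
```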
| Algorithm 2: Inference/Multi-Step Forecasting with Constraint Monitoring |
| Input: Trained model (GCN weights, attention weights, fusion, MLP); feature generator (to produce X(t) from incoming raw measurements); normalized graph adjacency; forecast horizon H (for recursive multi-step); initial history; constraint thresholds |
| Procedure: Initialize forecast sequence F ← empty; current time index ← current time. For step = 1 to H: 1. Construct a normalized feature matrix X(t). 2. GCN forward: Equations (9) and (10) 3. Attention: Equations (11)–(13) 4. Fusion: Equation (14) 5. Prediction: Equation (15) 6. Constraint evaluation (for monitoring; the forecast is not modified unless post-processing is applied): compute Equations (18), (20)–(22); log any violations beyond thresholds (e.g., line flow > limit). 7. Optionally apply a lightweight projection or correction (if implemented) to enforce hard constraints. 8. Append Ŷ to F. 9. If recursive multi-step forecasting uses its own previous forecasts in the features, update the feature generator accordingly. 10. Increment the time index. |
| Return: Forecast sequence F Constraint violation report per step |
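The recursive loop of Algorithm 2 can be sketched as follows; `model` stands in for the trained GCN+attention network, the feature layout follows Section 3.2, and the persistence model in the usage example is only for illustration.

```python
import numpy as np

def recursive_forecast(model, history, horizon, n=5):
    """Multi-step forecasting: each prediction is appended to the history and
    fed back through the feature generator on the next step."""
    hist = [np.asarray(h, dtype=float) for h in history]
    forecasts = []
    for _ in range(horizon):
        series = np.stack(hist)                   # (T, N)
        X = np.stack([series[-1],                 # current value
                      series[-n:].mean(axis=0),   # moving average
                      series[-1] - series[-2]],   # rate of change
                     axis=1)
        y_hat = np.maximum(model(X), 0.0)         # post-process: non-negativity
        forecasts.append(y_hat)
        hist.append(y_hat)                        # feed forecast back as history
    return np.stack(forecasts)

persistence = lambda X: X[:, 0]                   # trivial stand-in model
history = [np.full(2, 5.0)] * 6
F = recursive_forecast(persistence, history, horizon=3)
print(F)   # three steps of constant 5.0 for both nodes
```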
| Algorithm 3: Subroutine: Constraint Violation Logging |
| Input: Current forecast Ŷ and computed physical quantities (flows, voltages, SoC, etc.). For each constraint type: if a violation is detected, record the node or line where it occurred, the magnitude of the violation (e.g., flow − limit), and the time step. |
| Aggregate over evaluation window: Frequency: fraction of time steps with violation per constraint Severity: average magnitude when violated |
| Return violation summary |
| Subroutine: Hyperparameter (λ) Selection Heuristic |
| 1. Define the candidate set for each λ (e.g., an exponential grid). |
| 2. For each tuple in the Cartesian product: train the model via Algorithm 1 for a limited number of epochs (or with early stopping and warm-starting), then compute the validation prediction error (MSE, MAE) and the constraint violation metrics. |
| 3. Compute the Pareto frontier between accuracy and feasibility. |
| 4. Select the λ that lies near the elbow of the trade-off curve (small marginal accuracy loss for a large feasibility gain) and satisfies user-defined maximum allowable violation thresholds. |
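A simple implementation of this selection heuristic, under the assumption that each candidate λ setting is summarized by a (validation error, violation metric) pair; the final pick here uses the violation budget rather than an explicit elbow detector.

```python
def pareto_frontier(candidates):
    """Keep points not dominated by any other: a point dominates another if it
    is no worse on both axes and strictly better on at least one."""
    frontier = []
    for i, (e_i, v_i) in enumerate(candidates):
        dominated = any(
            e_j <= e_i and v_j <= v_i and (e_j < e_i or v_j < v_i)
            for j, (e_j, v_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            frontier.append((e_i, v_i))
    return sorted(frontier)

def select_candidate(frontier, max_violation):
    """Lowest-error frontier point within the violation budget."""
    feasible = [p for p in frontier if p[1] <= max_violation]
    return min(feasible) if feasible else None

# Hypothetical (error, violation) pairs for four lambda settings
cands = [(1.0, 0.5), (1.2, 0.1), (2.0, 0.6), (1.1, 0.3)]
front = pareto_frontier(cands)
print(select_candidate(front, max_violation=0.3))  # (1.1, 0.3)
```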
3.5. Data Input and Processing
3.5.1. Assumptions
- Grid topology is static, represented by a normalized adjacency matrix from the graph of branches.
- Loads are non-zero only at buses with original Pd > 0 (buses 1–10, 13–16, and 18–20); loads at all other buses are set to zero.
- Synthetic data assumes diurnal peaking, mild trend/seasonal components, and noise levels typical of power systems; no extreme events like outages.
- EV behavior is deterministic (fixed night-charging and daytime periods), with 95% efficiency and no battery degradation.
- Generation profiles assume constant availability; renewables use randomized/periodic proxies without real weather data.
- Normalization is per-bus for loads (mean and standard deviation over time) and global for features, to handle scale differences.
- The data split uses 300 samples for training, the samples up to index 350 for validation, and the remainder for testing; multi-step forecasting (10 steps) iterates predictions, with post-processing for non-negativity and power balance.
3.5.2. Validation and Simulation Software
- A.
- Validation Strategy
- Mean Squared Error (MSE): Measures prediction accuracy, with baseline showing an average CV MSE of 1.8412.
- Mean Absolute Error (MAE): Quantifies absolute prediction errors.
- Mean Absolute Percentage Error (MAPE): Assesses relative errors, with a test MAPE of 38.52%, indicating challenges with extreme fluctuations.
- Sliding-window CV: Train on [t, t + K), validate on [t + K, t + K + L).
- Early Stopping: Monitor validation MAE; stop if no improvement over 10 epochs.
- Repeats: 5 independent splits to report mean ± std.
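The sliding-window scheme above can be generated as index pairs; the stride default (non-overlapping validation windows) is an assumption, as the paper does not state it.

```python
def sliding_window_splits(T, K, L, step=None):
    """Yield (train, val) index ranges: train on [t, t+K), validate on
    [t+K, t+K+L), sliding forward so temporal order is respected."""
    step = step if step is not None else L
    splits, t = [], 0
    while t + K + L <= T:
        splits.append((range(t, t + K), range(t + K, t + K + L)))
        t += step
    return splits

for train, val in sliding_window_splits(T=10, K=4, L=2):
    print(list(train), list(val))
# [0, 1, 2, 3] [4, 5]
# [2, 3, 4, 5] [6, 7]
# [4, 5, 6, 7] [8, 9]
```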
- B.
- Simulation Software
3.6. Flowchart
4. Results
AC Load Flow Analysis Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
| Parameter | Value | Unit |
|---|---|---|
| Load scale factor applied | 0.143181 | – |
| Total load | 1485.02 | MW |
| Total explicit generation (excluding slack) | 1266.18 | MW |
| Network losses | 17.632 | MW |
| Slack real generation (power balance) | 236.471 | MW |
| Slack real generation (from voltage solution) | 454.302 | MW |
| Power balance residual | 0.000000 | MW |
| Line (i → j) | P_ij (MW) | Q_ij (MVAr) | P_ji (MW) | Q_ji (MVAr) |
|---|---|---|---|---|
| 1 → 2 | −414.25 | −151.79 | 414.71 | −306.84 |
| 1 → 3 | 78.53 | −50.41 | −78.17 | −5.42 |
| 1 → 5 | −413.25 | 92.34 | 417.21 | −99.92 |
| 2 → 4 | −309.26 | 58.98 | 312.59 | −80.47 |
| 2 → 6 | 57.35 | −47.99 | −57.17 | −3.36 |
| Parameter | Value | Unit |
|---|---|---|
| Total load | 1485.02 | MW |
| Total explicit generation | 1266.18 | MW |
| Losses | 17.632 | MW |
| Slack generation (balance) | 236.471 | MW |
| Slack generation (voltage) | 454.302 | MW |
| Slack discrepancy | 217.831 | MW |
| Power balance residual | 0.000000 | MW |
| Model | Spatial | Temporal | MSE (MW²) | MAE (MW) | MAPE (%) |
|---|---|---|---|---|---|
| ARIMA | No | Yes | 2.31 | 1.42 | 46.8 |
| LSTM | No | Yes | 2.05 | 1.27 | 42.3 |
| GCN-LSTM | Yes | Yes | 1.98 | 1.21 | 40.6 |
| Transformer | No | Yes | 1.92 | 1.18 | 39.1 |
| Proposed GCN-Attention-Physics | Yes | Yes | 1.84 | 1.05 | 38.5 |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Otshwe, J.N.; Li, B.; Ngouokoua, J.C.; Qi, B.; Tabaro, C.M.; Guo, Q.; Kang, Y. Enhancing Energy Market Forecasting with Graph Convolutional Networks: A Multi-Node Time-Series Analysis Framework. Energies 2026, 19, 280. https://doi.org/10.3390/en19010280

