A Review of Recent Trends in Electricity Price Forecasting Using Deep Learning Techniques
Abstract
1. Introduction
1.1. Research Gap and Contributions
1.2. Paper Organization
2. Literature Review Methodology
3. Results
3.1. Error Metrics
3.2. Deep Learning Architectures in Electricity Market Forecasts
3.3. Input Variables
4. Discussion
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| AI | Artificial Intelligence |
| ANN | Artificial Neural Network |
| AR | Autoregressive |
| ARIMA | Autoregressive Integrated Moving Average |
| AT | Attention-based |
| Bi | Bidirectional |
| CAM | Class Activation Mapping |
| CNN | Convolutional Neural Network |
| CORR | Correlation |
| DLMLP | Deeply Learnt Multi-Layer Perceptron |
| DNN | Deep Neural Network |
| ECRF | Error-Compensated Random Forest |
| ELM | Extreme Learning Machine |
| EMD | Empirical Mode Decomposition |
| EPEX | European Power Exchange |
| GPU | Graphics Processing Units |
| GRU | Gated Recurrent Unit |
| GWO | Grey Wolf Optimization |
| ICEEMDAN | Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise |
| ISO-NE | Independent System Operator New England |
| LIME | Local Interpretable Model-Agnostic Explanations |
| LSTM | Long Short-Term Memory |
| MAE | Mean Absolute Error |
| MAPE | Mean Absolute Percentage Error |
| MH | Multi-Head |
| MI | Mutual Information |
| MLP | Multi-Layer Perceptron |
| MoDWT | Maximum overlap Discrete Wavelet Transform |
| MPA | Marine Predators Algorithm |
| MSE | Mean Squared Error |
| N-BEATS | Neural Basis Expansion Analysis for Interpretable Time Series Forecasting |
| NLP | Natural Language Processing |
| PatchTST | Patch Time Series Transformer |
| PSO | Particle Swarm Optimization |
| RF | Random Forest |
| RMSE | Root Mean Squared Error |
| RS | Random Search |
| RVFL | Random Vector Functional Link |
| SA | Seasonal Attention-based |
| SARIMA | Seasonal Autoregressive Integrated Moving Average |
| SHAP | Shapley Additive Explanations |
| SSA | Singular Spectrum Analysis |
| SSDAE | Stacked Sparse Denoising Auto-Encoder |
| STGNN | Spatial-Temporal Graph Neural Network |
| SVMD | Successive Variational Mode Decomposition |
| TCN | Temporal Convolutional Network |
| VMD | Variational Mode Decomposition |
| WD | Wavelet Decomposition |
| WPD | Wavelet Packet Decomposition |
| WT | Wavelet Transform |
| XAI | Explainable Artificial Intelligence |
References
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention Is All You Need. arXiv 2023, arXiv:1706.03762.
- Bahdanau, D.; Cho, K.; Bengio, Y. Neural Machine Translation by Jointly Learning to Align and Translate. arXiv 2016, arXiv:1409.0473.
- Wang, Q.; Yuan, Y. Stock Price Forecast: Comparison of LSTM, HMM, and Transformer. In Proceedings of the 2nd International Academic Conference on Blockchain, Information Technology and Smart Finance (ICBIS 2023), Hangzhou, China, 17–19 February 2023; pp. 126–136.
- Li, X. Comparative analysis and prospect of RNN and Transformer. Appl. Comput. Eng. 2024, 75, 178–184.
- Mashru, P.D. Comparative Analysis of CNN, RNN, LSTM, and Transformer Architectures in Deep Learning. Educ. Adm. Theory Pract. 2023, 29, 5439–5443.
- Su, L.; Zuo, X.; Li, R.; Wang, X.; Zhao, H.; Huang, B. A systematic review for transformer-based long-term series forecasting. Artif. Intell. Rev. 2025, 58, 80.
- Kim, J.; Kim, H.; Kim, H.; Lee, D.; Yoon, S. A comprehensive survey of deep learning for time series forecasting: Architectural diversity and open challenges. Artif. Intell. Rev. 2025, 58, 216.
- Koutsandreas, D.; Spiliotis, E.; Petropoulos, F.; Assimakopoulos, V. On the selection of forecasting accuracy measures. J. Oper. Res. Soc. 2022, 73, 937–954.
- Hewamalage, H.; Ackermann, K.; Bergmeir, C. Forecast evaluation for data scientists: Common pitfalls and best practices. Data Min. Knowl. Discov. 2023, 37, 788–832.
- Almeida, V.; Gama, J. Prediction intervals for electric load forecast: Evaluation for different profiles. In Proceedings of the 2015 18th International Conference on Intelligent System Application to Power Systems (ISAP), Porto, Portugal, 11–16 September 2015; IEEE: New York, NY, USA, 2015; pp. 1–6.
- Jasiński, T. A new approach to modeling cycles with summer and winter demand peaks as input variables for deep neural networks. Renew. Sustain. Energy Rev. 2022, 159, 112217.
- Wagner, A.; Ramentol, E.; Schirra, F.; Michaeli, H. Short- and long-term forecasting of electricity prices using embedding of calendar information in neural networks. J. Commod. Mark. 2022, 28, 100246.
- Pedram, O.; Soares, A.; Moura, P. A Review of Methodologies for Photovoltaic Energy Generation Forecasting in the Building Sector. Energies 2025, 18, 5007.
- Agakishiev, I.; Härdle, W.K.; Kopa, M.; Kozmik, K.; Petukhina, A. Multivariate probabilistic forecasting of electricity prices with trading applications. Energy Econ. 2025, 141, 108008.
- Willmott, C.; Matsuura, K. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance. Clim. Res. 2005, 30, 79–82.
- Naser, M.Z.; Alavi, A.H. Error Metrics and Performance Fitness Indicators for Artificial Intelligence and Machine Learning in Engineering and Sciences. Archit. Struct. Constr. 2023, 3, 499–517.
- Hodson, T.O. Root-mean-square error (RMSE) or mean absolute error (MAE): When to use them or not. Geosci. Model Dev. 2022, 15, 5481–5487.
- Armstrong, J.S.; Collopy, F. Error measures for generalizing about forecasting methods: Empirical comparisons. Int. J. Forecast. 1992, 8, 69–80.
- Buturac, G. Measurement of Economic Forecast Accuracy: A Systematic Overview of the Empirical Literature. J. Risk Financ. Manag. 2021, 15, 1.
- Koponen, P.; Ikäheimo, J.; Koskela, J.; Brester, C.; Niska, H. Assessing and Comparing Short Term Load Forecasting Performance. Energies 2020, 13, 2054.
- Chen, Z.; Yang, Y. Assessing Forecast Accuracy Measures. Preprint, 2004.
- Kvålseth, T.O. Note on the R2 measure of goodness of fit for nonlinear models. Bull. Psychon. Soc. 1983, 21, 79–80.
- Failing, J.M.; Segarra-Tamarit, J.; Cardo-Miota, J.; Beltran, H. Deep learning-based prediction models for spot electricity market prices in the Spanish market. Math. Comput. Simul. 2026, 240, 96–104.
- Zhang, B.; Song, C.; Jiang, X.; Li, Y. Electricity price forecast based on the STL-TCN-NBEATS model. Heliyon 2023, 9, e13029.
- Bakir, A.; Rami, A. Enhanced electricity price forecasting in smart grids using an optimized hybrid convolutional Multi-Layer Perceptron deep network with Marine Predators Algorithm for feature selection. Energy Sources Part B Econ. Plan. Policy 2025, 20, 2456058.
- Khan, A.A.A.; Ullah, M.H.; Tabassum, R.; Kabir, M.F. Enhanced Transformer-BiLSTM Deep Learning Framework for Day-Ahead Energy Price Forecasting. IEEE Trans. Ind. Appl. 2025, 1–15, in press.
- Gomez, W.; Wang, F.-K.; Amogne, Z.E. Electricity Load and Price Forecasting Using a Hybrid Method Based Bidirectional Long Short-Term Memory with Attention Mechanism Model. Int. J. Energy Res. 2023, 2023, 3815063.
- Jia, H.; Guo, Y.; Zhang, X.; Ma, Q.; Yang, Z.; Zheng, Y.; Zeng, D.; Liu, D. Forecasting electricity prices in the spot market utilizing wavelet packet decomposition integrated with a hybrid deep neural network. Global Energy Interconnect. 2025, 8, 874–890.
- Sowiński, R.; Komorowska, A. Forecasting electricity prices in the Polish Day-Ahead Market using machine learning models. Polityka Energetyczna Energy Policy J. 2025, 28, 211–230.
- Bozlak, Ç.B.; Yaşar, C.F. An optimized deep learning approach for forecasting day-ahead electricity prices. Electr. Power Syst. Res. 2024, 229, 110129.
- Laitsos, V.; Vontzos, G.; Bargiotas, D.; Daskalopulu, A.; Tsoukalas, L.H. Data-Driven Techniques for Short-Term Electricity Price Forecasting through Novel Deep Learning Approaches with Attention Mechanisms. Energies 2024, 17, 1625.
- Mubarak, H.; Abdellatif, A.; Ahmad, S.; Zohurul Islam, M.; Muyeen, S.M.; Abdul Mannan, M.; Kamwa, I. Day-Ahead electricity price forecasting using a CNN-BiLSTM model in conjunction with autoregressive modeling and hyperparameter optimization. Int. J. Electr. Power Energy Syst. 2024, 161, 110206.
- Deng, S.; Inekwe, J.; Smirnov, V.; Wait, A.; Wang, C. Seasonality in deep learning forecasts of electricity imbalance prices. Energy Econ. 2024, 137, 107770.
- Ehsani, B.; Pineau, P.-O.; Charlin, L. Price forecasting in the Ontario electricity market via TriConvGRU hybrid model: Univariate vs. multivariate frameworks. Appl. Energy 2024, 359, 122649.
- Pourdaryaei, A.; Mohammadi, M.; Mubarak, H.; Abdellatif, A.; Karimi, M.; Gryazina, E.; Terzija, V. A new framework for electricity price forecasting via multi-head self-attention and CNN-based techniques in the competitive electricity market. Expert Syst. Appl. 2024, 235, 121207.
- Shi, W.; Feng Wang, Y. A robust electricity price forecasting framework based on heteroscedastic temporal Convolutional Network. Int. J. Electr. Power Energy Syst. 2024, 161, 110177.
- Yang, N.; Bi, G.; Li, Y.; Wang, X.; Luo, Z.; Shen, X. A BiGRUSA-ResSE-KAN Hybrid Deep Learning Model for Day-Ahead Electricity Price Prediction. Symmetry 2025, 17, 805.
- Abdellatif, A.; Mubarak, H.; Ahmad, S.; Mekhilef, S.; Abdellatef, H.; Mokhlis, H.; Kanesan, J. Electricity Price Forecasting One Day Ahead by Employing Hybrid Deep Learning Model. In Proceedings of the 2023 IEEE IAS Global Conference on Renewable Energy and Hydrogen Technologies (GlobConHT), Male, Maldives, 11–12 March 2023; IEEE: New York, NY, USA, 2023; pp. 1–5.
- Ghimire, S.; Deo, R.C.; Casillas-Pérez, D.; Sharma, E.; Salcedo-Sanz, S.; Barua, P.D.; Rajendra Acharya, U. Half-hourly electricity price prediction with a hybrid convolution neural network-random vector functional link deep learning approach. Appl. Energy 2024, 374, 123920.
- Ghimire, S.; Deo, R.C.; Casillas-Pérez, D.; Salcedo-Sanz, S. Two-step deep learning framework with error compensation technique for short-term, half-hourly electricity price forecasting. Appl. Energy 2024, 353, 122059.
- Ghimire, S.; Nguyen-Huy, T.; Deo, R.C.; Casillas-Pérez, D.; Masrur Ahmed, A.A.; Salcedo-Sanz, S. Novel deep hybrid model for electricity price prediction based on dual decomposition. Appl. Energy 2025, 395, 126197.
- Rawal, K.; Ahmad, A. Mining latent patterns with multi-scale decomposition for electricity demand and price forecasting using modified deep graph convolutional neural networks. Sustain. Energy Grids Netw. 2024, 39, 101436.
- Chen, Z.; Zhang, B.; Du, C.; Yang, C.; Gui, W. Outlier-adaptive-based non-crossing quantiles method for day-ahead electricity price forecasting. Appl. Energy 2025, 382, 125328.
- Tan, Y.Q.; Shen, Y.X.; Yu, X.Y.; Lu, X. Day-ahead electricity price forecasting employing a novel hybrid frame of deep learning methods: A case study in NSW, Australia. Electr. Power Syst. Res. 2023, 220, 109300.
- Bisoi, R.; Dash, P.K.; Perla, S. Improved decomposition strategy based recurrent ensemble deep random vector functional link network for forecasting short-term electricity price. e-Prime Adv. Electr. Eng. Electron. Energy 2025, 12, 101024.
- Huang, S.; Shi, J.; Wang, B.; An, N.; Li, L.; Hou, X.; Wang, C.; Zhang, X.; Wang, K.; Li, H.; et al. A hybrid framework for day-ahead electricity spot-price forecasting: A case study in China. Appl. Energy 2024, 373, 123863.
- Mao, X.; Chen, S.; Yu, H.; Duan, L.; He, Y.; Chu, Y. Simplicity in dynamic and competitive electricity markets: A case study on enhanced linear models versus complex deep-learning models for day-ahead electricity price forecasting. Appl. Energy 2025, 383, 125201.
- Gundu, V.; Simon, S.P. Forecasting of price signals using deep recurrent models. Int. J. Syst. Assur. Eng. Manag. 2024, 15, 5378–5388.
- Shejul, K.; Harikrishnan, R.; Gupta, H. The improved integrated Exponential Smoothing based CNN-LSTM algorithm to forecast the day ahead electricity price. MethodsX 2024, 13, 102923.
- Kaya, M.; Karan, M.B.; Telatar, E. Electricity price estimation using deep learning approaches: An empirical study on Turkish markets in normal and Covid-19 periods. Expert Syst. Appl. 2023, 224, 120026.
- Li, W.; Becker, D.M. Day-ahead electricity price prediction applying hybrid models of LSTM-based deep learning methods and feature selection algorithms under consideration of market coupling. Energy 2021, 237, 121543.
- Failing, J.M.; Cardo-Miota, J.; Pérez, E.; Beltran, H.; Segarra-Tamarit, J. Deep learning approaches for predicting the upward and downward energy prices in the Spanish automatic Frequency Restoration Reserve market. Energy 2025, 320, 135245.
- Cerasa, A.; Zani, A. Enhancing electricity price forecasting accuracy: A novel filtering strategy for improved out-of-sample predictions. Appl. Energy 2025, 383, 125357.
- Poggi, A.; Di Persio, L.; Ehrhardt, M. Electricity Price Forecasting via Statistical and Deep Learning Approaches: The German Case. AppliedMath 2023, 3, 316–342.
- Kılıç, D.K.; Nielsen, P.; Thibbotuwawa, A. Intraday Electricity Price Forecasting via LSTM and Trading Strategy for the Power Market: A Case Study of the West Denmark DK1 Grid Region. Energies 2024, 17, 2909.
- Kitsatoglou, A.; Georgopoulos, G.; Papadopoulos, P.; Antonopoulos, H. An ensemble approach for enhanced Day-Ahead price forecasting in electricity markets. Expert Syst. Appl. 2024, 256, 124971.
- Yang, H.; Schell, K.R. ATTnet: An explainable gated recurrent unit neural network for high frequency electricity price forecasting. Int. J. Electr. Power Energy Syst. 2024, 158, 109975.
- Wang, Y.; Ren, J. Application of KNN Algorithm Based on Particle Swarm Optimization in Fire Image Segmentation. J. Electr. Eng. Technol. 2019, 14, 1707–1715.
- El-Azab, H.-A.I.; Swief, R.A.; El-Amary, N.H.; Temraz, H.K. Machine and deep learning approaches for forecasting electricity price and energy load assessment on real datasets. Ain Shams Eng. J. 2024, 15, 102613.
- Ansari, I.; Hassani, K.; Malakouti, S.M.; Suratgar, A.A. Predicting electrical energy consumption for Finland using CNN-LSTM hybrid model. Next Res. 2025, 2, 100580.
- Luo, H.; Shao, Y. Advanced Optimal System for Electricity Price Forecasting Based on Hybrid Techniques. Energies 2024, 17, 4833.
- Aliyon, K.; Ritvanen, J. Deep learning-based electricity price forecasting: Findings on price predictability and European electricity markets. Energy 2024, 308, 132877.
- Wang, Z.; Mae, M.; Yamane, T.; Ajisaka, M.; Nakata, T.; Matsuhashi, R. Enhanced Day-Ahead Electricity Price Forecasting Using a Convolutional Neural Network–Long Short-Term Memory Ensemble Learning Approach with Multimodal Data Integration. Energies 2024, 17, 2687.
- Cleveland, R.B.; Cleveland, W.S.; McRae, J.E.; Terpenning, I. STL: A Seasonal-Trend Decomposition Procedure Based on Loess. J. Off. Stat. 1990, 6, 3–73.
- Guo, Y.; Du, Y.; Wang, P.; Tian, X.; Xu, Z.; Yang, F.; Chen, L.; Wan, J. A hybrid forecasting method considering the long-term dependence of day-ahead electricity price series. Electr. Power Syst. Res. 2024, 235, 110841.
- Islam, M.R.; Rashed-Al-Mahfuz, M.; Ahmad, S.; Molla, M.K.I. Multiband Prediction Model for Financial Time Series with Multivariate Empirical Mode Decomposition. Discret. Dyn. Nat. Soc. 2012, 2012, 593018.
- Karaev, A.K.; Gorlova, O.S.; Ponkratov, V.V.; Sedova, M.L.; Shmigol, N.S.; Vasyunina, M.L. A Comparative Analysis of the Choice of Mother Wavelet Functions Affecting the Accuracy of Forecasts of Daily Balances in the Treasury Single Account. Economies 2022, 10, 213.
- Yang, Y.; Guo, J.; Li, Y.; Zhou, J. Forecasting day-ahead electricity prices with spatial dependence. Int. J. Forecast. 2024, 40, 1255–1270.
- Nie, Y.; Li, P.; Wang, J.; Zhang, L. A novel multivariate electrical price bi-forecasting system based on deep learning, a multi-input multi-output structure and an operator combination mechanism. Appl. Energy 2024, 366, 123233.
- Xu, Y.; Huang, X.; Zheng, X.; Zeng, Z.; Jin, T. VMD-ATT-LSTM electricity price prediction based on grey wolf optimization algorithm in electricity markets considering renewable energy. Renew. Energy 2024, 236, 121408.
- Wu, Z.; Huang, N.E. Ensemble empirical mode decomposition: A noise-assisted data analysis method. Adv. Adapt. Data Anal. 2009, 1, 1–41.
- Yan, W.; Wang, P.; Xu, R.; Han, R.; Chen, E.; Han, Y.; Zhang, X. A novel mid- and long-term time-series forecasting framework for electricity price based on hierarchical recurrent neural networks. J. Frankl. Inst. 2025, 362, 107590.
- Belenguer, E.; Segarra-Tamarit, J.; Pérez, E.; Vidal-Albalate, R. Short-term electricity price forecasting through demand and renewable generation prediction. Math. Comput. Simul. 2025, 229, 350–361.
- Belenguer, E.; Segarra-Tamarit, J.; Redondo, J.; Pérez, E. Neural Network Model for Aggregated Photovoltaic Generation Forecasting. In Proceedings of the International Conference of the IMACS TC1 Committee, Nancy, France, 16–19 May 2022; pp. 29–40.
- Zhang, Z.; Hua, H.; Chen, X.; Shao, J.; Zheng, J.; Wang, B.; Gan, L.; Yu, K. Spot Market Electricity Price Forecast via the Combination of Transformer and Ornstein-Uhlenbeck Process. In Proceedings of the 2025 21st International Conference on the European Energy Market (EEM), Lisbon, Portugal, 27–29 May 2025; pp. 1–6.
- Lim, B.; Arik, S.O.; Loeff, N.; Pfister, T. Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting. Int. J. Forecast. 2021, 37, 1748–1764.
- Peng, Y.; Wang, Z.; Castillo, I.; Gunnell, L.; Jiang, S. A New Modeling Framework for Real-Time Extreme Electricity Price Forecasting. IFAC-PapersOnLine 2024, 58, 899–904.
- Gunduz, S.; Ugurlu, U.; Oksuz, I. Transfer learning for electricity price forecasting. Sustain. Energy Grids Netw. 2023, 34, 100996.
- Tashman, L.J. Out-of-sample tests of forecasting accuracy: An analysis and review. Int. J. Forecast. 2000, 16, 437–450.
- Pourdaryaei, A.; Mohammadi, M.; Karimi, M.; Mokhlis, H.; Illias, H.A.; Kaboli, S.H.A.; Ahmad, S. Recent Development in Electricity Price Forecasting Based on Computational Intelligence Techniques in Deregulated Power Market. Energies 2021, 14, 6104.
- Lei, Y.; Liang, Z.; Ruan, P. Evaluation on the impact of digital transformation on the economic resilience of the energy industry in the context of artificial intelligence. Energy Rep. 2023, 9, 785–792.
- Xu, B.; Yang, G. Interpretability research of deep learning: A literature survey. Inf. Fusion 2025, 115, 102721.
- Papapetrou, P.; Lee, Z. Interpretable and Explainable Time Series Mining. In Proceedings of the 2024 IEEE 11th International Conference on Data Science and Advanced Analytics (DSAA), San Diego, CA, USA, 6–10 October 2024; pp. 1–3.
- Şahin, E.; Arslan, N.N.; Özdemir, D. Unlocking the black box: An in-depth review on interpretability, explainability, and reliability in deep learning. Neural Comput. Appl. 2025, 37, 859–965.




| Name | Abbreviation | Formula | Comments |
|---|---|---|---|
| Mean Absolute Error | MAE | $\mathrm{MAE}=\frac{1}{N}\sum_{i=1}^{N}\lvert y_i-\hat{y}_i\rvert$ (1) | an intuitive measure of average error magnitude [15] |
| Mean Squared Error | MSE | $\mathrm{MSE}=\frac{1}{N}\sum_{i=1}^{N}(y_i-\hat{y}_i)^2$ (2) | sensitive to outliers (thereby promoting lower errors) [16] |
| Root Mean Squared Error | RMSE | $\mathrm{RMSE}=\sqrt{\frac{1}{N}\sum_{i=1}^{N}(y_i-\hat{y}_i)^2}$ (3) | highly sensitive (like MSE) to the proportion of the dataset employed, which reduces reliability [16]; expressed in the same units as the dependent variable $y$, conveniently representing the typical or "standard" error magnitude under the assumption of normally distributed residuals [17] |
| Mean Absolute Percentage Error | MAPE | $\mathrm{MAPE}=\frac{100}{N}\sum_{i=1}^{N}\left\lvert\frac{y_i-\hat{y}_i}{y_i}\right\rvert$ (4) | not suitable when large errors are expected, as it tends to be biased towards lower forecasts [18]; its distribution can become highly skewed when the true values are near zero [19] |
| relative Mean Absolute Error | rMAE | $\mathrm{rMAE}=\frac{\mathrm{MAE}}{\mathrm{MAE}_{\mathrm{naive}}}$ (5), where $\mathrm{MAE}_{\mathrm{naive}}=\frac{1}{N}\sum_{i=1}^{N}\lvert y_i-\hat{y}_i^{\mathrm{naive}}\rvert$ (6) | relative measures may use multiple forecasts for the same series [19], or they can be expressed as the ratio of the error to the average absolute value of the series |
| relative Root Mean Squared Error | rRMSE | $\mathrm{rRMSE}=\frac{\mathrm{RMSE}}{\mathrm{RMSE}_{\mathrm{naive}}}$ (7) | |
| normalized Root Mean Squared Error | nRMSE | $\mathrm{nRMSE}=\frac{\mathrm{RMSE}}{\bar{y}}$ (8), where $\bar{y}=\frac{1}{N}\sum_{i=1}^{N}y_i$ (9) | a relative error metric designed to enable comparison across different experiments or datasets; its definition varies considerably in the literature, as several alternative normalization approaches for RMSE have been proposed [20] |
| symmetric Mean Absolute Percentage Error | sMAPE | $\mathrm{sMAPE}=\frac{100}{N}\sum_{i=1}^{N}\frac{\lvert y_i-\hat{y}_i\rvert}{(\lvert y_i\rvert+\lvert\hat{y}_i\rvert)/2}$ (10) | symmetrical, unlike MAPE [21] |
| Coefficient of determination | R2 | $R^2=1-\frac{\sum_{i=1}^{N}(y_i-\hat{y}_i)^2}{\sum_{i=1}^{N}(y_i-\bar{y})^2}$ (11) | often provides misleading information for nonlinear models [22] |
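The metrics above translate directly into code. The following is a minimal NumPy sketch (function names are ours, and the naive reference forecast used for rMAE is one common convention; relative-measure definitions vary in the literature, as the table notes):

```python
import numpy as np

def mae(y, yhat):
    """Mean Absolute Error: average magnitude of the errors."""
    return float(np.mean(np.abs(y - yhat)))

def rmse(y, yhat):
    """Root Mean Squared Error: penalizes large errors more than MAE."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mape(y, yhat):
    """Mean Absolute Percentage Error (unstable when y is near zero)."""
    return float(100.0 * np.mean(np.abs((y - yhat) / y)))

def smape(y, yhat):
    """Symmetric MAPE: bounded and symmetric, unlike MAPE."""
    return float(100.0 * np.mean(np.abs(y - yhat) / ((np.abs(y) + np.abs(yhat)) / 2)))

def rmae(y, yhat, y_naive):
    """Relative MAE: model error scaled by the error of a naive reference forecast."""
    return mae(y, yhat) / mae(y, y_naive)

def r2(y, yhat):
    """Coefficient of determination: 1 minus residual over total sum of squares."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

For day-ahead electricity prices, `y_naive` is typically the price series shifted by one day or one week, so rMAE directly answers whether the model beats a seasonal-persistence benchmark.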
| DNN Type | Forecasting Accuracy | Geographical Location | Year/Ref. | |||||||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| MAE (€/MWh) | RMSE (€/MWh) | rMAE (%) | rRMSE (%) | |||||||||||||
| Convolutional Neural Network (CNN) | 11.728 | 15.96 | 13.65 | 18.61 | Spain | 2026 [23] | ||||||||||
| Long Short-Term Memory (LSTM) | 13.579 | 18.986 | 15.801 | 22.094 | ||||||||||||
| RMSE (€/MWh) | MAPE (%) | |||||||||||||||
| Gated Recurrent Unit (GRU) | 9.7026 | 11.3675 | Spain | 2023 [24] | ||||||||||||
| LSTM | 8.9455 | 8.3972 | ||||||||||||||
| Temporal Convolutional Network (TCN)-LSTM | 6.8857 | 6.6380 | ||||||||||||||
| STL-TCN-N-BEATS 1 | 3.7441 | 4.5044 | ||||||||||||||
| MAE ($/MWh) | RMSE ($/MWh) | MAPE (%) | DP-MAPE (%) | R2 | ||||||||||||
| GRU | 0.619 | 0.646 | 2.277 | 1.275 | 0.983 | USA (ISO-NE) | 2025 [25] | |||||||||
| CNN | 1.088 | 1.103 | 3.928 | 1.445 | 0.950 | |||||||||||
| CNN-LSTM | 1.027 | 1.051 | 3.721 | 1.964 | 0.955 | |||||||||||
| CNN-LSTM-Multi-Layer Perceptron (MLP) | 0.591 | 0.612 | 2.127 | 1.580 | 0.985 | |||||||||||
| CNN-GRU-MLP | 0.710 | 0.778 | 2.678 | 1.121 | 0.975 | |||||||||||
| Basic CNN-MLP | 0.594 | 0.708 | 2.313 | 1.179 | 0.979 | |||||||||||
| Optimized CNN-MLP | 0.410 | 0.440 | 1.483 | 0.631 | 0.992 | |||||||||||
| MAE ($/MWh) | RMSE ($/MWh) | MAPE (%) | sMAPE (%) | R2 | ||||||||||||
| GRU | 3.5310 | 7.4869 | 8.2806 | 7.1245 | 0.9194 | USA (NYISO) | 2025 [26] | |||||||||
| CNN | 6.8251 | 13.2971 | 17.0186 | 18.8426 | 0.7455 | |||||||||||
| LSTM | 3.2656 | 7.1573 | 7.8171 | 7.3802 | 0.9262 | |||||||||||
| BiLSTM | 3.1027 | 6.7663 | 7.4780 | 8.3563 | 0.9381 | |||||||||||
| Transformer | 3.3511 | 7.3436 | 8.2084 | 7.5236 | 0.9223 | |||||||||||
| CNN-LSTM | 3.3455 | 7.1131 | 8.3043 | 8.0200 | 0.9247 | |||||||||||
| CNN-BiLSTM | 3.4926 | 7.2303 | 8.5599 | 8.4712 | 0.9271 | |||||||||||
| Transformer-BiLSTM | 2.7818 | 6.4937 | 6.6060 | 6.3741 | 0.9393 | |||||||||||
| RMSE ($/MWh) 2 periods: {2, 10, 30} days | MAPE (%) 2 periods: {2, 10, 30} days | R2 2 periods: {2, 10, 30} days | ||||||||||||||
| LSTM | {11.280, 10.930, 10.419} | {15.4, 16.4, 14.5} | {0.545, 0.643, 0.571} | USA (PJM) | 2023 [27] | |||||||||||
| BiLSTM | {11.393, 10.993, 10.668} | {13.2, 10.8, 14.7} | {0.536, 0.639, 0.551} | |||||||||||||
| Attention based (AT)-LSTM | {12.082, 10.620, 10.406} | {13.9, 11.8, 15.6} | {0.478, 0.663, 0.572} | |||||||||||||
| Empirical Mode Decomposition (EMD)-AT-LSTM | {7.859, 6.853, 7.203} | {17.8, 12.2, 12.9} | {0.779, 0.889, 0.842} | |||||||||||||
| Ensemble EMD-AT-LSTM | {5.134, 4.890, 4.392} | {8.9, 10.1, 10.0} | {0.906, 0.929, 0.924} | |||||||||||||
| MAE (£/MWh) | RMSE (£/MWh) | MAPE (%) | ||||||||||||||
| TCN | 8.46 | 9.41 | 13.7 | UK | 2025 [28] | |||||||||||
| LSTM | 7.91 | 8.56 | 14.1 | |||||||||||||
| CNN | 9.56 | 10.45 | 15.7 | |||||||||||||
| CNN-LSTM | 7.02 | 8.21 | 11.9 | |||||||||||||
| TCN-LSTM | 7.55 | 8.71 | 11.1 | |||||||||||||
| AT-LSTM | 7.94 | 8.99 | 11.5 | |||||||||||||
| Wavelet Packet Decomposition (WPD)-LSTM | 4.91 | 5.21 | 11.9 | |||||||||||||
| WPD-TCN | 5.71 | 6.64 | 11.4 | |||||||||||||
| WPD-TCN-LSTM | 3.12 | 3.79 | 9.2 | |||||||||||||
| Variational Mode Decomposition (VMD)-TCN-LSTM | 4.65 | 6.17 | 10.4 | |||||||||||||
| Fourier-TCN-LSTM | 4.13 | 5.62 | 9.9 | |||||||||||||
| Complete Ensemble EMD with Adaptive Noise (CEEMDAN)-TCN-LSTM | 5.43 | 7.61 | 10.7 | |||||||||||||
| Wavelet Decomposition (WD)-TCN-LSTM (denoised data—soft thresholding) | 4.51 | 6.32 | 10.2 | |||||||||||||
| WD-TCN-LSTM (denoised data—hard thresholding) | 3.85 | 4.66 | 9.8 | |||||||||||||
| WPD-TCN-LSTM (denoised data—soft thresholding) | 3.62 | 4.21 | 9.7 | |||||||||||||
| WPD-TCN-LSTM (denoised data—hard thresholding) | 3.12 | 3.79 | 9.2 | |||||||||||||
| MAE (PLN/MWh) | RMSE (PLN/MWh) | R2 | ||||||||||||||
| CNN | 76.48 | 105.77 | 0.58 | Poland | 2025 [29] | |||||||||||
| LSTM | 96.02 | 129.18 | 0.37 | |||||||||||||
| GRU | 89.90 | 123.17 | 0.42 | |||||||||||||
| CNN-LSTM | 75.21 | 103.64 | 0.59 | |||||||||||||
| CNN-GRU | 79.34 | 107.75 | 0.56 | |||||||||||||
| MAE (€/MWh) 3 year: {2019, 2020, 2021} | RMSE (€/MWh) 3 year: {2019, 2020, 2021} | MAPE (%) 3 year: {2019, 2020, 2021} | ||||||||||||||
| LSTM (1-step forecast) | {2.62, 4.24, 12.62} | {5.22, 7.40, 15.66} | {5.98, 6.77, 6.39} | Germany | 2024 [30] | |||||||||||
| CNN-LSTM (1-step forecast) | {2.56, 3.02, 10.03} | {6.17, 5.03, 12.96} | {5.84, 4.82, 5.08} | |||||||||||||
| LSTM (64-step forecast) | {5.85, 7.32, 52.04} | {2.61, 2.88, 7.71} | {4.37, 3.91, 11.92} | |||||||||||||
| CNN-LSTM (64-step forecast) | {11.29, 11.46, 52.31} | {3.5, 3.66, 7.79} | {7.69, 6.13, 11.98} | |||||||||||||
| MAE (€/MWh) | RMSE (€/MWh) | MAPE (%) | ||||||||||||||
| AT-CNN | 9.849 | 12.771 | 10.699 | Germany | 2024 [31] | |||||||||||
| Multi-Head (MH) AT Model (Transformer) | 6.342 | 10.484 | 6.889 | |||||||||||||
| AT-CNN-GRU | 5.832 | 10.523 | 6.333 | |||||||||||||
| Stacking Ensemble (AT-CNN-GRU, MH Transformer) | 8.377 | 11.330 | 9.100 | |||||||||||||
| Soft Voting Ensemble Model (AT-CNN-GRU, MH Transformer) | 5.639 | 10.401 | 6.125 | |||||||||||||
| MAE (€/MWh) | RMSE (€/MWh) | |||||||||||||||
| CNN-Bidirectional LSTM (BiLSTM)-AutoRegressive (AR) | 3.337 | 4.874 | Germany | 2024 [32] | ||||||||||||
| Random Search (RS)-CNN-BiLSTM-AR | 3.076 | 4.649 | ||||||||||||||
| Genetic Algorithm (GA)-CNN-BiLSTM-AR | 2.87 | 4.463 | ||||||||||||||
| Particle Swarm Optimization (PSO)-CNN-BiLSTM-AR | 2.554 | 4.058 | ||||||||||||||
| MAE (£/MWh) | RMSE (£/MWh) | |||||||||||||||
| CNN-BiLSTM-AR | 2.4492 | 3.802 | UK | 2024 [32] | ||||||||||||
| RS-CNN-BiLSTM-AR | 2.165 | 3.619 | ||||||||||||||
| GA-CNN-BiLSTM-AR | 2.212 | 3.7105 | ||||||||||||||
| PSO-CNN-BiLSTM-AR | 2.009 | 3.465 | ||||||||||||||
| MAE (£/MWh) 4 prediction steps: {1,2,3,4} | RMSE (£/MWh) 4 prediction steps: {1,2,3,4} | MAPE (%) 4 prediction steps: {1,2,3,4} | ||||||||||||||
| Seasonal Attention-based (SA)-BiLSTM | {9.257, 11.595, 13.078, 13.759} | {12.709, 15.216, 16.516, 17.169} | {37.326, 51.830, 61.978, 68.836} | UK | 2024 [33] | |||||||||||
| GRU | {9.279, 11.704, 13.249, 14.035} | {12.894, 15.450, 16.922, 17.558} | {40.113, 52.766, 63.016, 69.794} | |||||||||||||
| LSTM | {9.215, 11.817, 13.234, 13.831} | {12.875, 15.465, 16.915, 17.235} | {38.372, 52.624, 62.440, 68.962} | |||||||||||||
| BiLSTM | {9.225, 11.753, 13.296, 14.159} | {12.963, 15.477, 16.853, 17.589} | {40.128, 53.995, 64.936, 72.093} | |||||||||||||
| TCN | {9.356, 12.154, 13.873, 14.772} | {13.454, 16.067, 17.597, 18.475} | {42.527, 58.097, 68.940, 75.404} | |||||||||||||
| Transformer | {9.369, 11.860, 13.389, 14.948} | {13.553, 15.824, 17.367, 18.783} | {42.289, 55.930, 64.110, 81.016} | |||||||||||||
| MAEh,t ($/MWh) 5 | RMSEh,t ($/MWh) 5 | |||||||||||||||
| LSTM | Canada (Ontario) | 2024 [34] | ||||||||||||||
| GRU | ||||||||||||||||
| CNN | ||||||||||||||||
| CNN-GRU (consecutive) | ||||||||||||||||
| CNN-GRU (parallel) | ||||||||||||||||
| TriConvGRU | ||||||||||||||||
| MAE ($/MWh) 6 {spring, summer, autumn, winter} | RMSE ($/MWh) 6 {spring, summer, autumn, winter} | MAPE (%) 6 {spring, summer, autumn, winter} | ||||||||||||||
| AT-CNN | {0.007287, 0.007352, 0.007359, 0.007216} | {0.008708, 0.008519, 0.008642, 0.00852} | {1.746826, 1.759079, 1.720309, 1.796719} | Canada (Ontario) | 2024 [35] | |||||||||||
| AT-LSTM | {0.00834, 0.008433, 0.008164, 0.00854} | {0.009676, 0.009617, 0.009735, 0.009512} | {2.032756, 2.026645, 1.980703, 2.034482} | |||||||||||||
| CNN | {0.008167, 0.008102, 0.008487, 0.008341} | {0.010101, 0.010104, 0.010043, 0.0101} | {2.233926, 2.277373, 2.24329, 2.255633} | |||||||||||||
| LSTM | {0.008276, 0.008493, 0.008372, 0.008351} | {0.010458, 0.010446, 0.010601, 0.010517} | {2.395201, 2.423371, 2.35098, 2.344382} | |||||||||||||
| Model | MAE ($/MWh or €/MWh) 7 | RMSE ($/MWh or €/MWh) 7 | sMAPE (%) | Market | Year [Ref.] |
|---|---|---|---|---|---|
| Heteroscedastic TCN | 2.035 | 3.693 | 5.89 | Europe (Nord Pool) | 2024 [36] |
| | 3.064 | 5.420 | 11.96 | USA (PJM) | |
| | 6.335 | 16.408 | 15.13 | Belgium | |
| | 4.354 | 12.016 | 12.77 | France | |
| | 4.423 | 7.333 | 17.27 | Germany | |
| DeepAR | 2.433 | 4.422 | 6.85 | Europe (Nord Pool) | |
| | 4.389 | 6.987 | 17.24 | USA (PJM) | |
| | 8.623 | 18.729 | 21.29 | Belgium | |
| | 6.617 | 14.764 | 18.44 | France | |
| | 5.491 | 8.865 | 21.69 | Germany | |
| Temporal Fusion Transformer | 2.417 | 4.595 | 6.79 | Europe (Nord Pool) | |
| | 4.035 | 6.767 | 15.15 | USA (PJM) | |
| | 7.467 | 17.254 | 17.08 | Belgium | |
| | 4.835 | 12.523 | 13.65 | France | |
| | 5.750 | 9.120 | 21.54 | Germany | |
| Unshared CNN-GRU | 2.139 | 3.761 | 6.27 | Europe (Nord Pool) | |
| | 3.649 | 6.377 | 13.76 | USA (PJM) | |
| | 6.705 | 16.482 | 16.09 | Belgium | |
| | 4.407 | 11.602 | 12.79 | France | |
| | 4.479 | 7.359 | 17.64 | Germany | |
| BiGRU Self-Attention-ResSE-KAN | 0.732 | 1.438 | 2.082 | Europe (Nord Pool) | 2025 [37] |
| | 1.224 | 1.736 | 5.212 | USA (PJM) | |
| | 4.730 | 11.116 | 12.066 | Belgium | |
| | 2.736 | 10.264 | 7.415 | France | |
| | 2.113 | 4.141 | 9.235 | Germany | |
| Model | MAE (€/MWh) | RMSE (€/MWh) | Market | Year [Ref.] |
|---|---|---|---|---|
| CNN-LSTM | 5.899 | 9.471 | Europe (Nord Pool) | 2023 [38] |
| CNN-BiLSTM | 5.653 | 9.08 | | |
| CNN-LSTM-AR | 5.645 | 9.154 | | |
| CNN-BiLSTM-AR | 5.434 | 8.674 | | |
| Model | MAE (A$/MWh) | RMSE (A$/MWh) | R2 | Market | Year [Ref.] |
|---|---|---|---|---|---|
| Maximum overlap Discrete Wavelet Transform (MoDWT)-CNN-Random Vector Functional Link (RVFL) | 6.030 | 8.617 | 0.999 | Australia | 2024 [39] |
| MoDWT-LSTM | 22.495 | 37.163 | 0.993 | | |
| VMD-CNN-LSTM-VMD-Error-Compensated Random Forest (ECRF) | 5.154 | 10.220 | 0.999 | | 2024 [40] |
| VMD-CNN-LSTM | 13.178 | 21.855 | 0.914 | | |
| LSTM | 34.886 | 61.125 | 0.906 | | |
| Model | MAE (A$/MWh) | RMSE (A$/MWh) | rMAE (%) | Market | Year [Ref.] |
|---|---|---|---|---|---|
| VMD-Empirical Wavelet Transform (EWT)-Multi-Resolution Convolution (MRC)-BiLSTM | 6.707 | 10.555 | 11.115 | Australia | 2025 [41] |
| VMD-Attention-based LSTM | 11.095 | 24.790 | 12.449 | | |
| VMD-GRU | 60.774 | 35.462 | 59.031 | | |
| Model | sMAPE (%) | Market | Year [Ref.] |
|---|---|---|---|
| LSTM | 51.946 | Australia | 2024 [42] |
| | 27.930 | Singapore | |
| LSTM-Wavelet Transform (WT) | 32.395 | Australia | |
| | 16.701 | Singapore | |
| LSTM-Empirical Mode Decomposition (EMD) | 59.103 | Australia | |
| | 39.874 | Singapore | |
| Modified deep Graph Convolutional Neural Network (GCNN)-WT-Correlation (CORR) | 6.380 | Australia | |
| | 7.800 | Singapore | |
| GCNN-WT-Mutual Information (MI) | 35.207 | Australia | |
| | 6.676 | Singapore | |
| GCNN-EMD-CORR | 4.937 | Australia | |
| | 2.009 | Singapore | |
| GCNN-EMD-MI | 9.825 | Australia | |
| | 8.753 | Singapore | |
| Model | MAE ($/MWh) 8 {spring, summer, autumn, winter} | RMSE ($/MWh) 8 {spring, summer, autumn, winter} | Market | Year [Ref.] |
|---|---|---|---|---|
| EMD-TCN | {61.01, 35.19, 42.51, 11.49} | {74.81, 55.73, 56.92, 15.99} | Singapore | 2025 [43] |
| VMD-LSTM | {46.47, 19.18, 20.82, 20.02} | {65.7, 27.34, 35.89, 23.4} | | |
| VMD-GRU | {40.26, 17.11, 19.78, 8.90} | {57.55, 24.8, 33.09, 14.59} | | |
| Outlier-adaptive non-crossing quantile regression (Skip-Connection Recurrent Neural Network based) | {31.08, 14.38, 16.19, 6.28} | {46.31, 19.74, 28.05, 11.65} | | |
| Model | MAE (no unit) 9,10 {spring, summer, autumn, winter} | RMSE (no unit) 9,10 {spring, summer, autumn, winter} | MAPE (%) 10 {spring, summer, autumn, winter} | Market | Year [Ref.] |
|---|---|---|---|---|---|
| LSTM | {0.01923, 0.01643, 0.02149, 0.02847} | {0.02939, 0.02264, 0.03103, 0.03576} | {13.007, 13.909, 14.831, 16.785} | Australia | 2023 [44] |
| GRU | {0.01558, 0.02221, 0.02514, 0.01518} | {0.02065, 0.03393, 0.03114, 0.02911} | {12.738, 12.278, 14.451, 14.873} | | |
| CNN | {0.02445, 0.02073, 0.02871, 0.03422} | {0.03206, 0.02810, 0.03768, 0.04557} | {16.667, 15.720, 16.879, 16.778} | | |
| CNN-LSTM | {0.01518, 0.01352, 0.01655, 0.01998} | {0.02651, 0.02225, 0.02245, 0.02861} | {11.453, 11.433, 13.501, 13.342} | | |
| CNN-GRU | {0.01122, 0.01911, 0.02145, 0.02287} | {0.01475, 0.02354, 0.02802, 0.02924} | {10.963, 10.901, 12.435, 12.896} | | |
| CNN-Stacked Sparse Denoising Auto-Encoder (SSDAE) | {0.00699, 0.01345, 0.00544, 0.01081} | {0.01015, 0.02161, 0.00738, 0.01733} | {8.402, 8.363, 9.504, 9.696} | | |
| WT-CNN-SSDAE | {0.02096, 0.00934, 0.01878, 0.02418} | {0.02747, 0.01316, 0.03099, 0.03388} | {11.967, 10.004, 11.864, 12.360} | | |
| Singular Spectrum Analysis (SSA)-CNN-SSDAE | {0.00861, 0.01419, 0.03637, 0.01103} | {0.01185, 0.01578, 0.05427, 0.01581} | {10.242, 11.064, 11.665, 10.406} | | |
| Ensemble EMD-CNN-SSDAE | {0.01155, 0.01367, 0.01265, 0.01044} | {0.01842, 0.02173, 0.01439, 0.01138} | {9.771, 8.858, 9.645, 9.504} | | |
| Improved CEEMDAN (ICEEMDAN)-CNN | {0.01765, 0.01889, 0.02098, 0.01972} | {0.01959, 0.02027, 0.02175, 0.02228} | {14.689, 15.979, 16.909, 17.419} | | |
| ICEEMDAN-LSTM | {0.02422, 0.02455, 0.03047, 0.03164} | {0.03068, 0.03305, 0.04423, 0.04690} | {13.477, 12.839, 14.446, 15.592} | | |
| ICEEMDAN-GRU | {0.01985, 0.01581, 0.01752, 0.01785} | {0.02815, 0.02164, 0.02584, 0.02751} | {10.684, 11.372, 12.326, 11.800} | | |
| ICEEMDAN-CNN-LSTM | {0.01637, 0.00898, 0.00965, 0.00592} | {0.02221, 0.01359, 0.01343, 0.00754} | {9.910, 9.436, 9.771, 9.683} | | |
| ICEEMDAN-CNN-GRU | {0.00750, 0.00817, 0.00712, 0.00981} | {0.01013, 0.01376, 0.00955, 0.00705} | {8.417, 7.951, 8.677, 8.235} | | |
| ICEEMDAN-CNN-SSDAE | {0.00128, 0.00434, 0.00536, 0.00472} | {0.00277, 0.00549, 0.00732, 0.00566} | {6.632, 6.003, 6.970, 6.745} | | |
| Model | MAE ($/MWh) 11 {Feb, Apr, Jul, Oct} | RMSE ($/MWh) 11 {Feb, Apr, Jul, Oct} | MAPE (%) 11 {Feb, Apr, Jul, Oct} | Market | Year [Ref.] |
|---|---|---|---|---|---|
| SVMD-REDRVFLN | {1.2761, 1.5125, 3.0048, 4.0033} | {4.0206, 3.2968, 7.3523, 8.1349} | {6.3615, 6.8403, 10.1361, 10.3847} | USA (PJM) | 2025 [45] |
| | {4.0603, 4.1416, 4.4162, 4.3588} | {6.0679, 6.3452, 8.0083, 7.2661} | {6.4946, 7.8435, 9.8743, 6.2153} | Australia | |
| SVMD-RDRVFLN | {2.0128, 2.7792, 4.6164, 5.8238} | {5.9623, 5.0675, 9.2645, 10.1452} | {8.3426, 9.1547, 12.4237, 13.7914} | USA (PJM) | |
| | {6.0165, 6.0652, 6.7239, 5.9432} | {7.8777, 8.1376, 9.7326, 8.8327} | {8.5508, 9.7033, 11.8248, 7.9756} | Australia | |
| VMD-REDRVFLN | {2.2594, 3.1137, 4.9328, 6.3668} | {6.2019, 5.3933, 9.5134, 10.4945} | {8.6276, 9.4233, 12.8947, 14.2316} | USA (PJM) | |
| | {6.3127, 6.4835, 8.2647, 6.3318} | {8.2291, 8.5233, 10.1579, 9.3345} | {8.9168, 10.1457, 12.3913, 8.4371} | Australia | |
| EEMD-REDRVFLN | {4.2667, 5.0058, 6.6213, 8.0827} | {8.0345, 7.2569, 10.6325, 12.2681} | {10.2573, 11.6879, 14.6469, 15.5681} | USA (PJM) | |
| | {7.5986, 7.8131, 9.8331, 7.8245} | {9.7311, 10.1566, 11.5854, 12.8638} | {10.2634, 11.5537, 13.9856, 10.2744} | Australia | |
| EMD-REDRVFLN | {5.4562, 6.1753, 7.3724, 9.3315} | {8.9366, 7.9975, 11.4116, 13.3634} | {11.0965, 12.8123, 15.1855, 16.5774} | USA (PJM) | |
| | {8.6044, 8.1627, 10.1256, 8.6382} | {10.6893, 11.1872, 12.4165, 13.9817} | {11.3213, 12.4658, 14.3774, 11.0085} | Australia | |
| EWT-REDRVFLN | {6.4139, 7.6245, 8.2406, 10.1884} | {9.4382, 8.6008, 11.9374, 14.1855} | {11.8116, 13.0024, 16.7994, 17.4243} | USA (PJM) | |
| | {9.4193, 9.1853, 11.4078, 10.0038} | {11.7066, 12.2349, 13.3347, 15.0068} | {12.2742, 13.4916, 15.0038, 11.9128} | Australia | |
| WT-REDRVFLN | {6.9831, 8.5608, 8.9117, 10.7276} | {10.3217, 9.2673, 12.4356, 14.6973} | {12.3047, 13.8208, 19.2634, 18.1108} | USA (PJM) | |
| | {10.1985, 9.8238, 12.1337, 10.7318} | {12.6557, 13.1567, 14.1336, 15.9233} | {12.8325, 14.1534, 15.7352, 12.6315} | Australia | |
| Model | MAE (CNY/MWh) | RMSE (CNY/MWh) | Market | Year [Ref.] |
|---|---|---|---|---|
| Deeply Learnt MLP (DLMLP) | 0.228 | 0.263 | China (Shandong Province) | 2024 [46] |
| Day Similarity Algorithm (DSA)-DLMLP | 0.165 | 0.197 | | |
| DSA-adaptive tree-structured Parzen estimator (ATPE)-DLMLP | 0.156 | 0.187 | | |
| DSA-Embedded Feature Selection (EFS)-DLMLP | 0.154 | 0.184 | | |
| DSA-EFS-ATPE-DLMLP | 0.138 | 0.166 | | |
| Model | MAE (CNY/MWh) Power plant no. {1, 2, 3} | RMSE (CNY/MWh) Power plant no. {1, 2, 3} | MAPE (%) Power plant no. {1, 2, 3} | Market | Year [Ref.] |
|---|---|---|---|---|---|
| LSTM | {60.77, 74.21, 60.21} | {74.88, 91.36, 74.57} | {17.24, 19.76, 14.79} | China (Guangdong) | 2025 [47] |
| Transformer | {78.52, 73.79, 110.00} | {101.70, 94.89, 133.50} | {22.08, 19.44, 28.09} | | |
| Model | Smallest average MAPE (%) 12 {day-ahead, similar day, similar day-day-ahead} | Market | Year [Ref.] |
|---|---|---|---|
| LSTM | {9.42, 10.66, 7.50} | India | 2024 [48] |
| GRU | {8.60, 9.72, 7.21} | | |
| PSO-LSTM | {8.12, 9.59, 6.82} | | |
| PSO-GRU | {7.72, 8.27, 6.14} | | |
| Model | MAE (Rs./MWh) | RMSE (Rs./MWh) | MAPE (%) | R2 | Market | Year [Ref.] |
|---|---|---|---|---|---|---|
| LSTM | 584.9 | 864.4 | 7.12 | 0.9496 | India | 2024 [49] |
| CNN-LSTM | 498.9 | 842.5 | 6.56 | 0.9472 | | |
| Exponential Smoothing-CNN-LSTM | 109.4 | 169.7 | 1.53 | 0.9973 | | |
| Model | MAE ($/MWh) | RMSE ($/MWh) | R2 | Market | Year [Ref.] |
|---|---|---|---|---|---|
| Transformer Encoder–Decoder | 2.82 | 3.14 | 0.94 | Turkey | 2023 [50] |
| GRU | 4.48 | 7.61 | 0.84 | | |
| stacked GRU | 4.15 | 7.61 | 0.84 | | |
| BiGRU | 4.54 | 7.62 | 0.84 | | |
| stacked BiGRU | 4.43 | 7.54 | 0.85 | | |
| LSTM | 4.49 | 7.69 | 0.84 | | |
| stacked LSTM | 4.12 | 7.69 | 0.84 | | |
| BiLSTM | 4.59 | 7.76 | 0.84 | | |
| stacked BiLSTM | 4.96 | 8.31 | 0.81 | | |
| CNN-GRU | 6.06 | 9.28 | 0.77 | | |
| CNN-LSTM | 7.57 | 12.32 | 0.59 | | |
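The accuracy figures above all rest on a small set of error metrics (MAE, RMSE, MAPE, sMAPE). As a reference for how they are computed, the following minimal NumPy sketch implements them; the function name and the masking of zero-price hours are illustrative choices, not taken from any of the surveyed papers.

```python
import numpy as np

def error_metrics(y_true, y_pred):
    """Compute the error metrics most often reported in the surveyed papers."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    # MAPE is undefined for zero prices, so those hours are masked out here.
    nz = y_true != 0
    mape = 100.0 * np.mean(np.abs(err[nz] / y_true[nz]))
    # sMAPE bounds the relative error when prices approach zero or turn negative;
    # a zero denominator can only occur when the error is also zero.
    denom = np.abs(y_true) + np.abs(y_pred)
    smape = 100.0 * np.mean(2.0 * np.abs(err) / np.where(denom == 0, 1.0, denom))
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "sMAPE": smape}
```

Note that MAPE and sMAPE generally disagree, which is one reason direct cross-paper comparison of the tabulated values is difficult.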
| Category | Description | Selected Examples |
|---|---|---|
| Architecture hybridization | The neural network structure is built as a combination of different types of architectures or processing algorithms. | CNN-(Bi)LSTM, CNN-GRU, TCN-LSTM, TriConvGRU |
| Data hybridization | Hybridization refers to data processing rather than to the architecture itself; the data are often transformed by an additional model that acts as an initial feature extractor. | Empirical Mode Decomposition (EMD)-AT-LSTM, Ensemble EMD-AT-LSTM, Wavelet Packet Decomposition (WPD)-LSTM, WPD-TCN |
| Training/optimization process hybridization | Neural network training is supported by additional optimization algorithms, meta-heuristics, or analytical methods. | Particle Swarm Optimization (PSO)-LSTM, PSO-GRU, Genetic Algorithm (GA)-LSTM, Random Search (RS)-LSTM |
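The third category couples a network with a meta-heuristic such as PSO, which searches the hyperparameter space by moving a swarm of candidate solutions toward each particle's personal best and the swarm's global best. A minimal NumPy sketch of the PSO loop itself is given below; the function name, coefficient values, and the toy objective are illustrative assumptions. In a PSO-LSTM setup the `loss` callback would train a network with the candidate hyperparameters and return its validation error.

```python
import numpy as np

def pso_minimize(loss, bounds, n_particles=20, n_iters=60, seed=0):
    """Minimal Particle Swarm Optimization: each particle is a candidate
    hyperparameter vector; velocities are pulled toward the particle's own
    best position and the swarm's best position."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T  # bounds: list of (low, high)
    pos = rng.uniform(lo, hi, size=(n_particles, lo.size))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and attraction coefficients
    for _ in range(n_iters):
        r1, r2 = rng.random((2, *pos.shape))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)  # keep particles inside the box
        val = np.array([loss(p) for p in pos])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```

Because every loss evaluation means training a full network, the swarm size and iteration count dominate the computational cost of this kind of hybridization.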
| Input Variable | References |
|---|---|
| Gas or other fuel prices | [23,25,44,47,52,56,77] |
| Demand/load/consumption (lagged, real-time or forecasted) | [23,25,28,31,34,35,36,37,44,46,47,48,50,52,53,55,56,57,59,62,68,69,70,73,77] |
| Total generation (actual or forecasted) | [36,37,44,50,53,63,68] |
| Generation from combustion-based energy sources and nuclear power (scheduled or forecasted) | [23,24,28,31,34,52,56] |
| Generation from non-combustion/direct renewable sources (forecasted) | [23,24,28,31,34,37,44,52,53,55,56,62,73] |
| Weather data | [24,31,44,46,47,57,59,63,75,77] |
| Temporal data | [25,31,36,44,46,50,57,59,62,63,68,77] |
| Other | [23,31,44,46,52,56,73] |
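Most of the inputs listed above enter a model as lagged series or encoded calendar features. The sketch below is a hypothetical NumPy example of assembling such a design matrix from past prices, past load, and a cyclical hour-of-day encoding; the function name, the 24-hour lag window, and the choice of sine/cosine encoding are illustrative assumptions rather than a recipe from any cited paper.

```python
import numpy as np

def build_features(price, load, hours, n_lags=24):
    """Stack lagged prices, lagged load, and a cyclical hour-of-day encoding
    into a design matrix (X) and target vector (y) for one-step-ahead
    price forecasting."""
    T = len(price)
    rows, targets = [], []
    for t in range(n_lags, T):
        hour_enc = [np.sin(2 * np.pi * hours[t] / 24),
                    np.cos(2 * np.pi * hours[t] / 24)]
        rows.append(np.concatenate([price[t - n_lags:t],
                                    load[t - n_lags:t],
                                    hour_enc]))
        targets.append(price[t])
    return np.array(rows), np.array(targets)
```

The cyclical encoding avoids the artificial discontinuity between hour 23 and hour 0 that a raw integer hour would introduce; recurrent architectures consume the same lagged windows as sequences rather than flattened vectors.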
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Jasiński, T. A Review of Recent Trends in Electricity Price Forecasting Using Deep Learning Techniques. Energies 2025, 18, 6422. https://doi.org/10.3390/en18246422