Long-Term Power Load Forecasting Using LSTM-Informer with Ensemble Learning
Abstract
1. Introduction
1.1. Background and Literature Review
1.2. Research Gap
1.3. Contribution
- Pearson correlation analysis and a random forest (RF) model were used to screen the multivariate data for the variables most relevant to electricity load forecasting.
- A relatively new ensemble-learning LSTM-Informer model was constructed. The bottom layer of this model uses LSTM models as base learners to capture the short-term temporal correlation of the power load, and the top layer uses the Informer model to handle long-term dependencies between inputs and outputs. The model can therefore predict long-term power load accurately while still capturing short-term temporal correlation.
- On the analyzed data, the proposed model performs well in both prediction accuracy and goodness of fit.
2. Problem Statement
3. Deep Learning Forecast Model
3.1. Long Short-Term Memory Networks
3.2. Informer
3.3. LSTM-Informer
4. Experiment
4.1. Datasets
4.2. Variable Selection
4.3. Data Normalization
4.4. Methods for Comparison
4.5. Evaluation Metrics
4.6. Method Comparison and Analysis
4.6.1. Experimental Results
4.6.2. Results Analysis
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Hidden_Size | Num_Layers | Batch_Size |
---|---|---|
15 | 2 | 32 |
Statistic | Temperature (°C) | Humidity (%) | Wind Speed (m/s) | General Diffuse Flows (W/m²) | Diffuse Flows (W/m²) | PowerC_Zone1 (kW) | PowerC_Zone2 (kW) | PowerC_Zone3 (kW) |
---|---|---|---|---|---|---|---|---|
count | 52,416.00 | 52,416.00 | 52,416.00 | 52,416.00 | 52,416.00 | 52,416.00 | 52,416.00 | 52,416.00 |
mean | 18.81 | 68.26 | 1.96 | 182.70 | 75.03 | 32,344.97 | 21,042.51 | 17,835.41 |
std | 5.82 | 15.55 | 2.35 | 264.40 | 124.21 | 7130.56 | 5201.47 | 6622.17 |
min | 3.25 | 11.34 | 0.05 | 0.00 | 0.01 | 13,895.70 | 8560.08 | 5935.17 |
max | 40.01 | 94.80 | 6.48 | 1163.00 | 936.00 | 52,204.40 | 37,408.86 | 47,598.33 |
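Given the widely differing ranges in the statistics above, Section 4.3 normalizes the data before training. Assuming the common min-max scheme (the exact scheme is not restated in this outline), a minimal sketch is:

```python
def min_max_scale(values):
    """Scale a numeric sequence to [0, 1] via min-max normalization."""
    lo, hi = min(values), max(values)
    span = hi - lo
    if span == 0:
        return [0.0 for _ in values]  # constant series: map everything to 0
    return [(v - lo) / span for v in values]

# Example: the temperature column above spans 3.25 °C to 40.01 °C
temps = [3.25, 18.81, 40.01]
scaled = min_max_scale(temps)  # min maps to 0.0, max maps to 1.0
```

The same transformation is applied per feature column, so that no single variable (e.g., zone power in the tens of kilowatts) dominates the loss during training.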
Feature | ρ | Correlation |
---|---|---|
Temperature | 0.49 | + |
Humidity | −0.23 | − |
WindSpeed | 0.28 | + |
GeneralDiffuseFlows | 0.063 | + |
DiffuseFlows | −0.039 | − |
PowerConsumption_Zone1 | 0.75 | + |
PowerConsumption_Zone2 | 0.57 | + |
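The coefficients ρ above are standard Pearson correlations between each feature and the target load. A minimal implementation, on hypothetical data, is:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly linearly related series give rho = 1
r = pearson([1, 2, 3, 4], [2, 4, 6, 8])
```

The sign of ρ gives the direction of the relationship (the "Correlation" column), and features with |ρ| close to zero, such as DiffuseFlows, are candidates for removal.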
Feature | The Importance of Feature X |
---|---|
PowerConsumption_Zone1 | 0.580088 |
PowerConsumption_Zone2 | 0.137847 |
Temperature | 0.135731 |
GeneralDiffuseFlows | 0.046054 |
DiffuseFlows | 0.035704 |
WindSpeed | 0.033177 |
Humidity | 0.031397 |
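Importance scores like those above are typically the impurity-based importances of a fitted random forest. A sketch using scikit-learn's `RandomForestRegressor` (assumed available; synthetic data, not the paper's dataset) is:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                  # three synthetic features
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)  # target driven mainly by feature 0

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
importances = rf.feature_importances_         # impurity-based importances, sum to 1
```

Ranking features by these scores, as in the table above, lets low-importance inputs be dropped before the forecasting model is trained.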
Methods | LSTM-Informer | | Informer | | Autoformer | | Transformer | | Reformer | | LSTM | |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Metric | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE |
24 | 0.0617 | 0.1988 | 0.0597 | 0.1929 | 0.2464 | 0.3997 | 0.1153 | 0.2943 | 0.2983 | 0.4738 | 0.0787 | 0.2245 |
48 | 0.0846 | 0.2431 | 0.0910 | 0.2424 | 1.2121 | 0.9072 | 0.1855 | 0.3766 | 0.3196 | 0.4922 | 0.1057 | 0.2533 |
72 | 0.0856 | 0.2415 | 0.0964 | 0.2584 | 0.2771 | 0.4115 | 0.1866 | 0.3779 | 0.4423 | 0.5895 | 0.1299 | 0.2988 |
96 | 0.1064 | 0.2755 | 0.1165 | 0.2879 | 0.8902 | 0.7659 | 0.2354 | 0.4396 | 0.4228 | 0.5731 | 0.1233 | 0.2879 |
120 | 0.1098 | 0.2802 | 0.1159 | 0.2848 | 0.4739 | 0.5371 | 0.2106 | 0.4043 | 0.4412 | 0.5858 | 0.1372 | 0.3157 |
Count | 7 | | 3 | | 0 | | 0 | | 0 | | 0 | |
Methods | LSTM-Informer | | Informer | | Autoformer | | Transformer | | Reformer | | LSTM | |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Metric | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE |
144 | 0.2085 | 0.3963 | 0.2690 | 0.4561 | 0.7061 | 0.6181 | 0.2692 | 0.4517 | 0.8949 | 0.8672 | 0.7509 | 0.7675 |
192 | 0.2644 | 0.4477 | 0.2979 | 0.4813 | 1.0563 | 0.7284 | 0.2736 | 0.4566 | 0.8762 | 0.8449 | 0.8128 | 0.8054 |
240 | 0.2407 | 0.4253 | 0.2691 | 0.4542 | 0.4664 | 0.5451 | 0.2510 | 0.4353 | 0.8868 | 0.8447 | 0.5970 | 0.6731 |
288 | 0.2490 | 0.4232 | 0.2544 | 0.4305 | 1.3293 | 0.7713 | 0.2111 | 0.3920 | 0.9092 | 0.8578 | 0.7085 | 0.7285 |
432 | 0.2030 | 0.3783 | 0.2786 | 0.4440 | 0.4519 | 0.5148 | 0.2357 | 0.4173 | 0.9253 | 0.8592 | 0.7808 | 0.7629 |
Count | 8 | | 0 | | 0 | | 2 | | 0 | | 0 | |
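The MSE and MAE values reported in the tables above are the standard error metrics, computed on the (normalized) predictions:

```python
def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example (not the paper's data):
y_true = [1.0, 2.0, 3.0]
y_pred = [1.1, 1.9, 3.2]
# mse = (0.01 + 0.01 + 0.04) / 3 = 0.02
# mae = (0.1 + 0.1 + 0.2) / 3 ≈ 0.1333
```

Lower values are better for both; MSE penalizes large deviations more heavily, while MAE weights all errors linearly.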
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, K.; Zhang, J.; Li, X.; Zhang, Y. Long-Term Power Load Forecasting Using LSTM-Informer with Ensemble Learning. Electronics 2023, 12, 2175. https://doi.org/10.3390/electronics12102175