Analysis for Non-Residential Short-Term Load Forecasting Using Machine Learning and Statistical Methods with Financial Impact on the Power Market
Abstract
1. Introduction
1.1. Context and Importance of Load Forecasting
1.2. Literature Review: Traditional Approach
1.3. Literature Review: Machine Learning
2. Materials and Methods
2.1. Case Study Data Analysis—Hourly Industrial and Commercial Demand
2.2. Software Resources
2.3. Forecasting Using Traditional Methods
2.4. Forecasting Using Machine Learning
Algorithm 1: Machine learning forecasting algorithm (h = forecast horizon)
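The body of Algorithm 1 did not survive extraction. As a rough, hedged sketch only (not a reproduction of the authors' procedure), the following Python fragment illustrates the generic sliding-window workflow such an algorithm follows for a forecast horizon h: build lagged input windows, split them chronologically into train and test sets, fit a model, and predict the next h hours. The window length, the synthetic series, and the scikit-learn MLP used here are illustrative assumptions.

```python
# Hypothetical sketch of a sliding-window forecasting loop for horizon h;
# the window length, model, and synthetic data are illustrative assumptions,
# not a reproduction of the paper's Algorithm 1.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, n_lags, h):
    """Build supervised pairs: n_lags past hours -> next h hours."""
    X, y = [], []
    for i in range(len(series) - n_lags - h + 1):
        X.append(series[i:i + n_lags])
        y.append(series[i + n_lags:i + n_lags + h])
    return np.array(X), np.array(y)

def day_ahead_forecast(series, n_lags=14 * 24, h=24):
    X, y = make_windows(series, n_lags, h)
    split = int(0.8 * len(X))                       # chronological train/test split
    model = MLPRegressor(hidden_layer_sizes=(300, 200), max_iter=200)
    model.fit(X[:split], y[:split])
    return model.predict(X[split:]), y[split:]      # forecasts vs. actuals on the test window

if __name__ == "__main__":
    hourly_load = 2.0 + np.sin(np.linspace(0, 180 * np.pi, 90 * 24))  # synthetic 90-day hourly series
    preds, actuals = day_ahead_forecast(hourly_load)
    print(preds.shape, actuals.shape)
```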
2.4.1. Multilayer Perceptron (MLP)
2.4.2. Recurrent Neural Networks
2.4.3. Long Short-Term Memory
2.4.4. Gated Recurrent Unit (GRU)
2.4.5. LSTM Encoder-Decoder
2.4.6. CNN (Convolutional Neural Network)-LSTM Architecture
3. Results
4. Discussion
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Term | Coeff. | Std. Error | t Stat | p-Value | Lower 95% | Upper 95% |
|---|---|---|---|---|---|---|
| Intercept | 0.334 | 0.049 | 6.768 | 0.0000000 | 0.237 | 0.431 |
| T-1 | 0.370 | 0.011 | 32.485 | 0.0000000 | 0.348 | 0.392 |
| T-2 | −0.016 | 0.012 | −1.277 | 0.2015198 | −0.040 | 0.008 |
| T-3 | 0.039 | 0.012 | 3.220 | 0.0012885 | 0.015 | 0.063 |
| T-4 | 0.071 | 0.012 | 5.791 | 0.0000000 | 0.047 | 0.095 |
| T-5 | −0.008 | 0.012 | −0.653 | 0.5135898 | −0.032 | 0.016 |
| T-6 | 0.025 | 0.012 | 2.092 | 0.0364606 | 0.002 | 0.048 |
| T-7 | 0.576 | 0.011 | 51.288 | 0.0000000 | 0.554 | 0.598 |
| T-8 | −0.295 | 0.012 | −25.119 | 0.0000000 | −0.318 | −0.272 |
| T-9 | −0.013 | 0.012 | −1.087 | 0.2769461 | −0.037 | 0.011 |
| T-10 | −0.040 | 0.012 | −3.244 | 0.0011830 | −0.064 | −0.016 |
| T-11 | −0.071 | 0.012 | −5.792 | 0.0000000 | −0.095 | −0.047 |
| T-12 | 0.006 | 0.012 | 0.497 | 0.6189813 | −0.018 | 0.030 |
| T-13 | −0.023 | 0.012 | −1.920 | 0.0548434 | −0.046 | 0.000 |
| T-14 | 0.303 | 0.011 | 28.200 | 0.0000000 | 0.282 | 0.324 |
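To make the use of the coefficients above concrete, the autoregressive model described in the methods table below (one coefficient per daily lag of the same hour over the past 14 days) can be written as the following worked equation; the symbols are ours and the exact formulation is a hedged reconstruction:

$$
\hat{y}_{d,h} \;=\; c \;+\; \sum_{k=1}^{14} \phi_k \, y_{d-k,\,h},
\qquad c \approx 0.334,\;\; \phi_1 \approx 0.370,\;\; \phi_7 \approx 0.576,\;\; \phi_{14} \approx 0.303,
$$

where $y_{d-k,h}$ denotes the load at hour $h$ on day $d-k$ and $\phi_k$ is the coefficient for lag T-$k$. The dominant weights at the 1-, 7-, and 14-day lags are consistent with strong daily and weekly periodicity in the consumption profiles.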
| Method | Parameters Considered |
|---|---|
| Naïve I | Hourly values from day d-7 used for the hourly forecast of day d |
| Naïve II | Hourly values from day d-1 used for the hourly forecast of day d |
| ES | Triple exponential smoothing, 24 time periods (smoothing coefficients α = 0.9331, β = 0.0014, γ = 1) |
| AR | Autoregressive prediction; each hour is forecast from the same hour of the past 14 days. Coefficients are presented in Table 1 |
| MA | Moving average of order 3 |
| SARIMA | Seasonal autoregressive integrated moving average (2,0,2)(1,0,0)[24] |
| MLP | Multilayer perceptron. Input matrix [24, 20]. Input variables: past 14 days, day of week, working/non-working day and hours, special days, temperature. 2 × hidden layers (300, 200). Output layer: Dense; activation: ReLU; optimizer: Adam; loss: mean squared error; epochs: 20–100 |
| RNN | Recurrent neural network. Input matrix [24, 20]. Input variables: past 14 days, day of week, working/non-working day and hours, special days, temperature. 3 × hidden layers (100, 100, 96). Output layer: Dense; activations: tanh, sigmoid; optimizer: Adam; loss: mean squared error; epochs: 20–100 |
| LSTM | Long short-term memory. Input matrix [24, 20]. Input variables: past 14 days, day of week, working/non-working day and hours, special days, temperature. 3 × hidden layers (200, 200, 168). Output layer: Dense; activations: tanh, sigmoid; optimizer: Adam; loss: mean squared error; epochs: 20–100 (see the sketch after this table) |
| LSTM encoder-decoder | Input matrix [24, 20]. Input variables: past 14 days, day of week, working/non-working day and hours, special days, temperature. 2 × hidden layers (100, 100), 1 × RepeatVector layer, 1 × TimeDistributed layer (96). Activations: tanh, sigmoid; optimizer: Adam; loss: mean squared error; epochs: 20–100 |
| GRU | Gated recurrent unit. Input matrix [24, 20]. Input variables: past 14 days, day of the week, working/non-working day and hours, special days, temperature. 3 × hidden layers (200, 200, 168). Output layer: Dense; activations: tanh, sigmoid; optimizer: Adam; loss: mean squared error; epochs: 20–100 |
| CNN-LSTM | Convolutional neural network combined with LSTM. Input matrix [24, 20]. Input variables: past 14 days, day of week, working/non-working day and hours, special days, temperature. 2 × 1D convolutional layers (24, 48); 1 × MaxPooling1D layer; 1 × Flatten layer; 2 × LSTM layers; output layer: TimeDistributed Dense; activations: tanh, sigmoid; optimizer: Adam; loss: mean squared error; epochs: 20–100 |
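As an illustration of how the LSTM configuration in the table above could be assembled, the Keras sketch below stacks three LSTM layers of 200, 200, and 168 units on an input window of shape [24, 20] and maps the result to 24 hourly forecasts. Only the layer widths and the input shape come from the table; the return-sequence flags, the dense output dimension, and the overall wiring are assumptions.

```python
# Hedged sketch of the LSTM configuration summarised above; only the layer widths
# (200, 200, 168) and the [24, 20] input shape come from the table, the rest
# (return_sequences flags, 24-value dense output) are assumptions.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

TIMESTEPS, FEATURES, HORIZON = 24, 20, 24          # input matrix [24, 20], day-ahead horizon

model = Sequential([
    LSTM(200, return_sequences=True, input_shape=(TIMESTEPS, FEATURES)),
    LSTM(200, return_sequences=True),
    LSTM(168),                                     # last recurrent layer emits a single vector
    Dense(HORIZON),                                # dense output, one value per forecast hour
])
model.summary()
```

The final LSTM layer drops `return_sequences` so that a single dense layer can map the learned state to the 24 hourly forecast values.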
| Parameter | Details | Value |
|---|---|---|
| Optimization algorithm | Keras built-in gradient-descent optimizer | Adam |
| Initial learning rate | Step size at each iteration | 0.001 |
| Activations | Activation functions used in the layers | tanh, ReLU |
| Hidden layers | How deep the network is built | >3 |
| Hidden units | Number of neurons in each hidden layer | 240/240/168 |
| Max epochs | Maximum number of passes of the training data through the network (see the training sketch after this table) | 200 |
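A minimal training sketch using the hyperparameters above (Adam with an initial learning rate of 0.001, hidden units 240/240/168, up to 200 epochs, mean squared error loss); the placeholder data, validation split, and early stopping are illustrative assumptions, not values taken from the paper.

```python
# Hedged training sketch for the hyperparameters listed above (Adam, initial
# learning rate 0.001, hidden units 240/240/168, max 200 epochs, MSE loss);
# the placeholder data, validation split, and early stopping are illustrative assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import EarlyStopping

X_train = np.random.rand(500, 24, 20)              # placeholder windows with the stated input shape
y_train = np.random.rand(500, 24)                  # placeholder day-ahead targets

model = Sequential([
    LSTM(240, return_sequences=True, input_shape=(24, 20)),
    LSTM(240, return_sequences=True),
    LSTM(168),
    Dense(24),
])
model.compile(optimizer=Adam(learning_rate=0.001), loss="mse")
history = model.fit(
    X_train, y_train,
    epochs=200,                                    # max epochs from the table
    validation_split=0.2,
    callbacks=[EarlyStopping(patience=10, restore_best_weights=True)],
    verbose=0,
)
```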
| Mean Absolute Error | Root Mean Square Error | Mean Absolute Percentage Error |
|---|---|---|
| $\mathrm{MAE} = \frac{1}{n}\sum_{t=1}^{n}\lvert y_t - \hat{y}_t\rvert$ | $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(y_t - \hat{y}_t\right)^2}$ | $\mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left\lvert\frac{y_t - \hat{y}_t}{y_t}\right\rvert$ |

where $y_t$ is the measured load, $\hat{y}_t$ the forecast value, and $n$ the number of forecast points.
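These three accuracy metrics can be computed directly, for example with NumPy; the functions below follow the standard definitions given above.

```python
# NumPy implementations of the three accuracy metrics defined above.
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred):
    # Expressed in percent; assumes no zero values in y_true (hourly load is strictly positive).
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

y_true = np.array([1.2, 1.5, 1.1, 1.8])
y_pred = np.array([1.1, 1.6, 1.0, 1.7])
print(mae(y_true, y_pred), rmse(y_true, y_pred), mape(y_true, y_pred))
```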
| Load | Set | Metric | Naïve I | Naïve II | ES | AR | MA | SARIMA | MLP | RNN | LSTM | LSTM ED | GRU | CNN-LSTM |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Agg | train | MAPE | 10.49% | 12.89% | 9.40% | 10.02% | 10.28% | 7.78% | 4.79% | 4.14% | 3.96% | 3.29% | 3.64% | 5.72% |
| Agg | train | RMSE | 0.7077 | 1.4385 | 0.5678 | 0.5788 | 0.6345 | 0.4901 | 0.4334 | 0.3995 | 0.3969 | 0.3816 | 0.3895 | 0.4516 |
| Agg | train | MAE | 0.3675 | 0.9149 | 0.3568 | 0.3557 | 0.3906 | 0.3029 | 0.2030 | 0.1825 | 0.1767 | 0.1555 | 0.1662 | 0.2397 |
| Agg | test | MAPE | 7.64% | 9.72% | 9.35% | 6.76% | 9.20% | 7.56% | 6.00% | 6.61% | 5.67% | 6.23% | 5.28% | 6.97% |
| Agg | test | RMSE | 0.2009 | 1.8903 | 0.3280 | 0.1430 | 0.2944 | 0.1308 | 0.2334 | 0.2196 | 0.1784 | 0.2060 | 0.1696 | 0.2444 |
| Agg | test | MAE | 0.3282 | 0.8921 | 0.4019 | 0.2897 | 0.4022 | 0.3089 | 0.2754 | 0.3011 | 0.2607 | 0.2822 | 0.2420 | 0.3308 |
| PF | train | MAPE | 20.78% | 28.55% | 14.97% | 19.07% | 19.30% | 9.05% | 8.85% | 8.85% | 4.52% | 5.25% | 4.61% | 11.00% |
| PF | train | RMSE | 0.4190 | 0.5668 | 0.3379 | 0.3577 | 0.1656 | 0.2155 | 0.2129 | 0.1544 | 0.1599 | 0.1495 | 0.2489 | 0.0000 |
| PF | train | MAE | 0.2253 | 0.3749 | 0.2158 | 0.2248 | 0.2252 | 0.1252 | 0.1172 | 0.1214 | 0.0764 | 0.0861 | 0.0748 | 0.1499 |
| PF | test | MAPE | 25.16% | 24.85% | 19.76% | 19.12% | 29.32% | 14.47% | 20.91% | 15.12% | 15.98% | 15.44% | 13.82% | 14.84% |
| PF | test | RMSE | 0.1780 | 0.2472 | 0.0908 | 0.1859 | 0.0244 | 0.2065 | 0.1112 | 0.1387 | 0.1358 | 0.0887 | 0.0880 | 0.0000 |
| PF | test | MAE | 0.2881 | 0.3211 | 0.2615 | 0.2283 | 0.3231 | 0.1217 | 0.2684 | 0.2136 | 0.2157 | 0.2132 | 0.1817 | 0.1965 |
| FF | train | MAPE | 14.52% | 14.26% | 14.82% | 14.53% | 14.51% | 11.84% | 7.19% | 7.30% | 4.36% | 4.01% | 4.06% | 3.19% |
| FF | train | RMSE | 0.4327 | 0.8891 | 0.2429 | 0.3519 | 0.4051 | 0.2386 | 0.2525 | 0.2240 | 0.1953 | 0.1950 | 0.1894 | 0.1600 |
| FF | train | MAE | 0.1984 | 0.5490 | 0.2046 | 0.1988 | 0.2365 | 0.2047 | 0.1212 | 0.1259 | 0.0866 | 0.0776 | 0.0771 | 0.1675 |
| FF | test | MAPE | 5.00% | 12.18% | 6.42% | 5.18% | 6.91% | 10.56% | 4.94% | 5.63% | 5.15% | 6.28% | 4.71% | 10.31% |
| FF | test | RMSE | 0.0207 | 0.7136 | 0.0223 | 0.0202 | 0.0617 | 0.0876 | 0.0341 | 0.0478 | 0.0440 | 0.0481 | 0.0100 | 0.0853 |
| FF | test | MAE | 0.1066 | 0.5158 | 0.1174 | 0.1098 | 0.1592 | 0.2508 | 0.1055 | 0.1510 | 0.1403 | 0.1432 | 0.1276 | 0.2014 |
| SM | train | MAPE | 6.78% | 5.56% | 8.01% | 4.76% | 5.67% | 15.68% | 4.49% | 7.35% | 4.58% | 4.25% | 4.97% | 5.73% |
| SM | train | RMSE | 0.0319 | 0.0266 | 0.0408 | 0.0223 | 0.0267 | 0.0699 | 0.0312 | 0.0432 | 0.0315 | 0.0303 | 0.0456 | 0.0345 |
| SM | train | MAE | 0.0250 | 0.0201 | 0.0315 | 0.0173 | 0.0209 | 0.0560 | 0.0175 | 0.0310 | 0.0179 | 0.0166 | 0.0349 | 0.0213 |
| SM | test | MAPE | 5.54% | 4.99% | 10.52% | 4.22% | 5.27% | 18.96% | 4.29% | 7.13% | 4.33% | 4.20% | 4.37% | 5.34% |
| SM | test | RMSE | 0.0005 | 0.0004 | 0.0027 | 0.0003 | 0.0005 | 0.0046 | 0.0021 | 0.0027 | 0.0021 | 0.0021 | 0.0029 | 0.0022 |
| SM | test | MAE | 0.0183 | 0.0164 | 0.0394 | 0.0139 | 0.0173 | 0.0580 | 0.0184 | 0.0297 | 0.0182 | 0.0180 | 0.0344 | 0.0218 |
| FP | train | MAPE | 27.09% | 29.57% | 17.38% | 25.43% | 30.69% | 17.52% | 12.17% | 11.00% | 6.51% | 5.52% | 8.17% | 8.98% |
| FP | train | RMSE | 0.0706 | 0.1006 | 0.0904 | 0.0583 | 0.0676 | 0.0589 | 0.0295 | 0.0375 | 0.0192 | 0.0182 | 0.0214 | 0.0249 |
| FP | train | MAE | 0.0467 | 0.0620 | 0.0634 | 0.0413 | 0.0477 | 0.0475 | 0.0202 | 0.0279 | 0.0114 | 0.0101 | 0.0138 | 0.0163 |
| FP | test | MAPE | 21.07% | 24.85% | 14.05% | 29.78% | 26.60% | 23.32% | 16.58% | 17.21% | j% | 13.43% | 14.94% | 14.84% |
| FP | test | RMSE | 0.0046 | 0.0087 | 0.0063 | 0.0032 | 0.0043 | 0.0026 | 0.0018 | 0.0021 | 0.0014 | 0.0015 | 0.0016 | 0.0015 |
| FP | test | MAE | 0.0447 | 0.0583 | 0.0529 | 0.0396 | 0.0462 | 0.0415 | 0.0278 | 0.0328 | 0.0218 | 0.0217 | 0.0242 | 0.0243 |
| AS | train | MAPE | 8.75% | 15.78% | 19.48% | 7.70% | 7.63% | 15.46% | 6.51% | 7.02% | 6.46% | 4.35% | 8.84% | 7.02% |
| AS | train | RMSE | 0.0085 | 0.0150 | 0.0138 | 0.0070 | 0.0072 | 0.0088 | 0.0061 | 0.0099 | 0.0054 | 0.0047 | 0.0065 | 0.0066 |
| AS | train | MAE | 0.0044 | 0.0076 | 0.0091 | 0.0038 | 0.0038 | 0.0068 | 0.0033 | 0.0075 | 0.0031 | 0.0023 | 0.0041 | 0.0037 |
| AS | test | MAPE | 8.55% | 15.13% | 17.92% | 8.05% | 7.87% | 17.45% | 8.91% | 8.98% | 9.71% | 7.73% | 7.69% | 9.36% |
| AS | test | RMSE | 0.0001 | 0.0002 | 0.0002 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | 0.0001 |
| AS | test | MAE | 0.0043 | 0.0073 | 0.0075 | 0.0040 | 0.0040 | 0.0073 | 0.0046 | 0.0085 | 0.0047 | 0.0040 | 0.0055 | 0.0048 |