Prediction of Electricity Generation Using Onshore Wind and Solar Energy in Germany
Abstract
1. Introduction
Overview of the Paper
2. Materials and Methods
2.1. Models
2.1.1. LSTM
2.1.2. Transformer
2.1.3. Informer
2.1.4. Autoformer
2.1.5. FEDformer
2.1.6. Non-Stationary Transformer
2.2. Data
2.3. General Preprocessing
2.4. PV
Preprocessing and Data Overview
2.5. Onshore Wind
Data Overview
3. Methodology and Results
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- McLennan, M. The Global Risks Report 2022, 17th ed.; World Economic Forum: Cologny, Switzerland, 2022; p. 14. Available online: https://www3.weforum.org/docs/WEF_The_Global_Risks_Report_2022.pdf (accessed on 2 July 2023).
- Wöhrle, D. Kohlenstoffkreislauf und Klimawandel. Chem. Unserer Zeit 2020, 55, 112–113.
- Wilke, S. Energiebedingte Emissionen von Klimagasen und Luftschadstoffen. Umweltbundesamt, 6 June 2023. Available online: https://www.umweltbundesamt.de/daten/energie/energiebedingte-emissionen#energiebedingte-kohlendioxid-emissionen-durch-stromerzeugung (accessed on 4 July 2023).
- Aktuelle Fakten zur Photovoltaik in Deutschland; Fraunhofer ISE; p. 48. Available online: https://www.ise.fraunhofer.de/content/dam/ise/de/documents/publications/studies/aktuelle-fakten-zur-photovoltaik-in-deutschland.pdf (accessed on 22 September 2022).
- Umweltbundesamt. Emissionsbilanz erneuerbarer Energieträger: Bestimmung der vermiedenen Emissionen im Jahr 2018; November 2019; pp. 47–50. Available online: https://www.umweltbundesamt.de/sites/default/files/medien/1410/publikationen/2019-11-07_cc-37-2019_emissionsbilanz-erneuerbarer-energien_2018.pdf (accessed on 2 July 2023).
- Ahmad, T.; Zhang, D.; Huang, C.; Zhang, H.; Dai, N.; Song, Y.; Chen, H. Artificial intelligence in sustainable energy industry: Status quo, challenges and opportunities. J. Clean. Prod. 2021, 289, 125834.
- Entezari, A.; Aslani, A.; Zahedi, R.; Noorollahi, Y. Artificial intelligence and machine learning in energy systems: A bibliographic perspective. Energy Strategy Rev. 2023, 45, 101017.
- Botterud, A. Forecasting renewable energy for grid operations. In Renewable Energy Integration; Elsevier: Amsterdam, The Netherlands, 2017; pp. 133–143.
- Potter, C.W.; Archambault, A.; Westrick, K. Building a Smarter Smart Grid through Better Renewable Energy Information. 1 March 2009. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4840110 (accessed on 18 January 2024).
- Hong, T.; Pinson, P.; Wang, Y.; Weron, R.; Yang, D.; Zareipour, H. Energy Forecasting: A Review and Outlook. 2020. Available online: https://ieeexplore.ieee.org/document/9218967 (accessed on 16 January 2024).
- Chen, X.; Zhang, X.; Dong, M.; Huang, L.; Guo, Y.; He, S. Deep learning-based prediction of wind power for multi-turbines in a wind farm. Front. Energy Res. 2021, 9, 723775.
- Gensler, A.; Henze, J.; Sick, B.; Raabe, N. Deep Learning for Solar Power Forecasting—An Approach Using AutoEncoder and LSTM Neural Networks. 1 October 2016. Available online: https://ieeexplore.ieee.org/document/7844673 (accessed on 16 January 2024).
- Bethel, B.J.; Sun, W.; Dong, C.; Wang, D. Forecasting hurricane-forced significant wave heights using a long short-term memory network in the Caribbean Sea. Ocean Sci. 2022, 18, 419–436.
- Gao, C.; Zhou, L.; Zhang, R. A Transformer-based deep learning model for successful predictions of the 2021 second-year La Niña condition. Geophys. Res. Lett. 2023, 50, e2023GL104034.
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. arXiv 2020, arXiv:2012.07436.
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430.
- Zhou, T.; Ma, Z.; Wen, Q.; Wang, X.; Sun, L.; Jin, R. FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting. In Proceedings of the International Conference on Machine Learning, PMLR, Baltimore, MD, USA, 17–23 July 2022.
- Liu, Y.; Wu, H.; Wang, J.; Long, M. Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting. Adv. Neural Inf. Process. Syst. 2022, 35, 9881–9893.
- Taylor, S.J.; Letham, B. Forecasting at scale. Am. Stat. 2018, 72, 37–45.
- Benti, N.E.; Chaka, M.D.; Semie, A.G. Forecasting Renewable Energy Generation with Machine Learning and Deep Learning: Current Advances and Future Prospects. Sustainability 2023, 15, 7087.
- Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. Available online: https://www.bioinf.jku.at/publications/older/2604.pdf (accessed on 7 July 2023).
- Zaremba, W.; Sutskever, I.; Vinyals, O. Recurrent Neural Network Regularization. arXiv 2014, arXiv:1409.2329.
- Lai, G.; Chang, W.-C.; Yang, Y.; Liu, H. Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks. In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, MI, USA, 8–12 July 2018.
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2020, arXiv:2010.11929.
- Jamil, S.; Piran, J.; Kwon, O.-J. A Comprehensive Survey of Transformers for Computer Vision. Drones 2023, 7, 287.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. Adv. Neural Inf. Process. Syst. 2017, 30, 5998–6008.
- Zhouhaoyi. Informer2020: GitHub repository for the paper "Informer", accepted by AAAI 2021. Available online: https://github.com/zhouhaoyi/Informer2020 (accessed on 15 October 2023).
- Lin, Y.; Koprinska, I.; Rana, M. Temporal Convolutional Attention Neural Networks for Time Series Forecasting. In Proceedings of the 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18–22 July 2021. Available online: https://ieeexplore.ieee.org/document/9534351 (accessed on 7 July 2023).
- Wan, R.; Tian, C.; Zhang, W.; Deng, W.-D.; Yang, F. A Multivariate Temporal Convolutional Attention Network for Time-Series Forecasting. Electronics 2022, 11, 1516.
- Qin, Y.; Song, D.; Chen, H.; Cheng, W.; Jiang, G.; Cottrell, G.W. A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction. arXiv 2017, arXiv:1704.02971.
- Wang, D.; Chen, C. Spatiotemporal Self-Attention-Based LSTNet for Multivariate Time Series Prediction. Int. J. Intell. Syst. 2023, 2023, 9523230.
- Bundesnetzagentur. SMARD—Strommarktdaten, Stromhandel und Stromerzeugung in Deutschland. Available online: https://www.smard.de/home (accessed on 16 June 2023).
- Deutscher Wetterdienst. Climate Data Center. Available online: https://cdc.dwd.de/portal/ (accessed on 24 June 2023).
- Energie-Atlas Bayern—Das zentrale Informationsportal zur Energiewende in Bayern. Available online: https://www.energieatlas.bayern.de/ (accessed on 25 January 2024).
- MAZiqing. FEDformer: GitHub repository. Available online: https://github.com/MAZiqing/FEDformer (accessed on 15 October 2023).
- Thuml. Nonstationary_Transformers: Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022). Available online: https://github.com/thuml/Nonstationary_Transformers (accessed on 15 October 2023).
- Zeng, A.; Chen, M.; Zhang, L.; Xu, Q. Are transformers effective for time series forecasting? arXiv 2022, arXiv:2205.13504.
- Wu, H.; Hu, T.; Liu, Y.; Zhou, H.; Wang, J.; Long, M. TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis. arXiv 2022, arXiv:2210.02186.
- Nie, Y.; Nguyen, N.H.; Sinthong, P.; Kalagnanam, J. A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. arXiv 2022, arXiv:2211.14730.
- Chen, S.; Li, C.-L.; Yoder, N.-C.; Arık, S.Ö.; Pfister, T. TSMixer: An All-MLP architecture for time series forecasting. arXiv 2023, arXiv:2303.06053.
Layer No. | Layer | PV Scenario: Neurons (Rate for Dropout) | Onshore Wind Scenario: Neurons (Rate for Dropout)
---|---|---|---
1 | Input layer | 100 | 100
2 | LSTM | 30 | 30
3 | LSTM | 50 | 50
4 | Dropout | 0.3 | 0.3
5 | LSTM | 30 | 50
6 | Dropout | 0.4 | 0.4
7 | Dense | 24, 48, 72, 96, 168, 336 | 24, 48, 72, 96, 168, 336
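The stacked LSTM described in the table can be sketched as follows. This is a hedged illustration, not the authors' code: the paper's deep-learning framework is not shown in this excerpt, PyTorch is assumed here, and taking the last time step before the dense head is likewise an assumption. The input layer of 100 corresponds to the 100-step input window; the dense layer's width equals the chosen forecast horizon; the PV and wind scenarios differ only in the third LSTM layer (30 vs. 50 neurons).

```python
import torch
import torch.nn as nn


class StackedLSTM(nn.Module):
    """Three stacked LSTM layers with dropout, per the architecture table.

    hidden3 is 30 for the PV scenario and 50 for the onshore wind scenario.
    """

    def __init__(self, n_features: int, pred_len: int, hidden3: int = 30):
        super().__init__()
        self.lstm1 = nn.LSTM(n_features, 30, batch_first=True)
        self.lstm2 = nn.LSTM(30, 50, batch_first=True)
        self.drop1 = nn.Dropout(0.3)
        self.lstm3 = nn.LSTM(50, hidden3, batch_first=True)
        self.drop2 = nn.Dropout(0.4)
        self.head = nn.Linear(hidden3, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 100 time steps, n_features)
        x, _ = self.lstm1(x)
        x, _ = self.lstm2(x)
        x = self.drop1(x)
        x, _ = self.lstm3(x)
        # Assumption: the last time step feeds the dense output layer.
        x = self.drop2(x[:, -1, :])
        return self.head(x)


# Example: PV scenario, 24-step horizon, batch of 8 windows of length 100.
model = StackedLSTM(n_features=12, pred_len=24, hidden3=30)
out = model(torch.randn(8, 100, 12))  # shape (8, 24)
```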
Parameter | PV Scenario (Selected Parameters/All Parameters) | Wind Scenario (Selected Parameters/All Parameters)
---|---|---|
seq_len (length of input sequence) | 100 | 100 |
pred_len (prediction length) | 24, 48, 72, 96, 168, 336 | 24, 48, 72, 96, 168, 336 |
e_layers (number of layers in encoder) | 4 | 4 |
d_layers (number of layers in decoder) | 2 | 2 |
enc_in (dimensionality of input encoder) | 12/4 | 12/3 |
dec_in (dimensionality of input decoder) | 12/4 | 12/3 |
c_out (number of output channels) | 1 | 1 |
d_ff (dimension of feed-forward network) | 1024 | 1024
d_model (dimension of model) | 256 | 256 |
n_heads | 8 | 8 |
dropout | 0.1 | 0.1 |
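The table's fields map directly onto the argument sets used by the public Informer/FEDformer code bases cited in the references. A minimal sketch of how one experiment configuration could be assembled (the helper `make_config` is hypothetical; the input dimensionality follows the enc_in/dec_in row, i.e., 12 or 4 features for PV and 12 or 3 for wind, with the pairing of values to feature subsets as given by the table header):

```python
def make_config(pred_len: int, enc_dim: int) -> dict:
    """Build one experiment configuration from the hyperparameter table.

    pred_len is one of 24, 48, 72, 96, 168, 336; enc_dim is the input
    dimensionality of encoder and decoder for the chosen feature subset.
    """
    return {
        "seq_len": 100,    # length of input sequence
        "pred_len": pred_len,
        "e_layers": 4,     # encoder layers
        "d_layers": 2,     # decoder layers
        "enc_in": enc_dim,
        "dec_in": enc_dim,
        "c_out": 1,        # single output channel (generation)
        "d_ff": 1024,
        "d_model": 256,
        "n_heads": 8,
        "dropout": 0.1,
    }


# One run per forecast horizon, e.g. for the 12-feature PV variant:
pv_runs = [make_config(p, 12) for p in (24, 48, 72, 96, 168, 336)]
```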
Model | Non-Stationary Transformer | FEDformer-w | Autoformer | Informer | Transformer | LSTM | SARIMAX | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Pred. Length | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE
24 | 0.108 | 0.183 | 0.376 | 0.349 | 0.108 | 0.201 | 0.102 | 0.167 | 0.104 | 0.186 | 0.108 | 0.171 | 0.177 | 0.210 |
48 | 0.177 | 0.213 | 0.410 | 0.383 | 0.153 | 0.255 | 0.142 | 0.197 | 0.153 | 0.219 | 0.152 | 0.202 | 0.207 | 0.256 |
72 | 0.198 | 0.226 | 0.445 | 0.399 | 0.198 | 0.273 | 0.158 | 0.225 | 0.168 | 0.230 | 0.166 | 0.242 | 0.232 | 0.266 |
96 | 0.193 | 0.234 | 0.436 | 0.395 | 0.201 | 0.275 | 0.176 | 0.227 | 0.186 | 0.229 | 0.185 | 0.245 | 0.324 | 0.335 |
168 | 0.192 | 0.238 | 0.465 | 0.408 | 0.221 | 0.300 | 0.173 | 0.226 | 0.189 | 0.231 | 0.188 | 0.246 | 0.349 | 0.385 |
336 | 0.204 | 0.245 | 0.487 | 0.419 | 0.241 | 0.310 | 0.207 | 0.256 | 0.250 | 0.271 | 0.218 | 0.267 | 0.389 | 0.437 |
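The MSE and MAE reported in this and the following tables are the standard definitions; whether they are computed on standardized or raw generation values is not stated in this excerpt. A minimal sketch of the formulas:

```python
def mse(y_true: list[float], y_pred: list[float]) -> float:
    """Mean squared error over a forecast horizon."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)


def mae(y_true: list[float], y_pred: list[float]) -> float:
    """Mean absolute error over a forecast horizon."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)


# Toy 4-step horizon:
print(mse([1.0, 2.0, 3.0, 4.0], [1.5, 2.0, 2.5, 4.0]))  # 0.125
print(mae([1.0, 2.0, 3.0, 4.0], [1.5, 2.0, 2.5, 4.0]))  # 0.25
```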
Model | Non-Stationary Transformer | FEDformer-w | Autoformer | Informer | Transformer | LSTM | SARIMAX | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Pred. Length | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE
24 | 0.098 | 0.181 | 0.097 | 0.176 | 0.119 | 0.220 | 0.092 | 0.159 | 0.095 | 0.163 | 0.088 | 0.143 | 0.167 | 0.259 |
48 | 0.137 | 0.205 | 0.133 | 0.204 | 0.164 | 0.251 | 0.127 | 0.192 | 0.129 | 0.193 | 0.121 | 0.189 | 0.183 | 0.278 |
72 | 0.165 | 0.209 | 0.158 | 0.209 | 0.181 | 0.262 | 0.149 | 0.206 | 0.151 | 0.206 | 0.143 | 0.206 | 0.214 | 0.298 |
96 | 0.171 | 0.225 | 0.166 | 0.245 | 0.196 | 0.273 | 0.157 | 0.215 | 0.162 | 0.220 | 0.150 | 0.209 | 0.282 | 0.318 |
168 | 0.190 | 0.232 | 0.183 | 0.255 | 0.197 | 0.278 | 0.170 | 0.221 | 0.172 | 0.226 | 0.165 | 0.211 | 0.325 | 0.361 |
336 | 0.196 | 0.240 | 0.203 | 0.270 | 0.300 | 0.326 | 0.197 | 0.242 | 0.199 | 0.237 | 0.175 | 0.230 | 0.375 | 0.420 |
Model | Non-Stationary Transformer | FEDformer-w | Autoformer | Informer | Transformer | LSTM | ARIMAX | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Pred. Length | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE
24 | 0.694 | 0.584 | 1.238 | 0.776 | 0.976 | 0.721 | 0.621 | 0.603 | 0.739 | 0.618 | 0.599 | 0.594 | 8.255 | 1.621 |
48 | 1.012 | 0.733 | 1.604 | 0.917 | 1.362 | 0.856 | 0.932 | 0.690 | 1.126 | 0.783 | 0.876 | 0.750 | 9.007 | 1.705 |
72 | 1.295 | 0.833 | 1.824 | 0.986 | 1.433 | 0.891 | 1.143 | 0.790 | 1.251 | 0.846 | 0.930 | 0.775 | 9.173 | 1.724 |
96 | 1.468 | 0.894 | 1.812 | 0.982 | 1.667 | 0.955 | 1.178 | 0.841 | 1.280 | 0.844 | 1.001 | 0.820 | 9.231 | 1.720 |
168 | 1.621 | 0.984 | 1.938 | 1.025 | 2.064 | 1.052 | 1.231 | 0.855 | 1.369 | 0.867 | 1.041 | 0.846 | 9.199 | 1.707 |
336 | 1.718 | 1.004 | 1.999 | 1.032 | 2.123 | 1.068 | 1.508 | 0.971 | 1.796 | 1.022 | 1.177 | 0.875 | 9.156 | 1.667 |
Model | Non-Stationary Transformer | FEDformer-w | Autoformer | Informer | Transformer | LSTM | ARIMAX | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Pred. Length | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE
24 | 0.620 | 0.555 | 0.645 | 0.567 | 0.548 | 0.519 | 0.541 | 0.525 | 0.540 | 0.518 | 0.499 | 0.502 | 3.067 | 1.349 |
48 | 0.954 | 0.699 | 0.997 | 0.718 | 0.920 | 0.688 | 0.854 | 0.678 | 0.903 | 0.681 | 0.794 | 0.659 | 3.161 | 1.348 |
72 | 1.147 | 0.840 | 1.154 | 0.738 | 1.127 | 0.780 | 0.991 | 0.737 | 1.068 | 0.776 | 0.899 | 0.712 | 3.216 | 1.345 |
96 | 1.107 | 0.798 | 1.243 | 0.799 | 1.172 | 0.834 | 1.078 | 0.782 | 1.250 | 0.813 | 0.943 | 0.743 | 3.250 | 1.349 |
168 | 1.502 | 0.966 | 1.327 | 0.839 | 1.239 | 0.853 | 1.145 | 0.814 | 1.358 | 0.874 | 1.011 | 0.783 | 3.311 | 1.351 |
336 | 1.625 | 0.983 | 1.414 | 0.876 | 1.296 | 0.878 | 1.269 | 0.857 | 1.441 | 0.896 | 1.064 | 0.818 | 3.324 | 1.339 |
Forecasting Scenario | PV with All Parameters | PV with Selected Parameters | Onshore Wind with All Parameters | Onshore Wind with Selected Parameters | ||||
---|---|---|---|---|---|---|---|---|
Pred. Length | MSE | MAE | MSE | MAE | MSE | MAE | MSE | MAE
24 | −5.88% | −2.32% | +4.55% | +11.19% | +3.67% | +1.55% | +8.42% | +4.59% |
48 | −7.07% | −2.54% | +4.96% | +1.59% | +6.39% | −8.00% | +7.55% | +2.88%
72 | −5.06% | −7.55% | +4.20% | +0.00% | +22.90% | +1.90% | +10.23% | +3.51% |
96 | −5.11% | −7.93% | +4.66% | +2.90% | +17.68% | +2.56% | +14.32% | +5.25% |
168 | −8.60% | −8.89% | +3.33% | +4.74% | +18.25% | +1.01% | +13.25% | +4.00% |
336 | −5.31% | −4.30% | +12.60% | +5.22% | +28.12% | +10.97% | +19.27% | +4.77%
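Percentage tables like the one above reduce to a simple signed relative-change computation on the MSE/MAE values; a hedged sketch, where the pairing of model and baseline behind each entry is not stated in this excerpt and the figures below are illustrative only:

```python
def relative_change(baseline: float, value: float) -> float:
    """Signed relative difference in percent.

    For error metrics such as MSE/MAE, a positive result means `value`
    improves on (is lower than) `baseline`.
    """
    return (baseline - value) / baseline * 100.0


# Toy example with MSE-like magnitudes:
print(round(relative_change(0.108, 0.102), 2))  # 5.56
```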
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Walczewski, M.J.; Wöhrle, H. Prediction of Electricity Generation Using Onshore Wind and Solar Energy in Germany. Energies 2024, 17, 844. https://doi.org/10.3390/en17040844