Wind Speed Modeling for Wind Farms Based on Deterministic Broad Learning System
Abstract
1. Introduction
- The primary contribution of this research is the introduction of the Deterministic Broad Learning System (DBLS), which addresses the data-saturation and local-minima issues that commonly arise when the sample data are continuously updated. By fixing the system input to a constant width, the proposed model achieves good prediction accuracy.
- The collection and establishment of a sample dataset of wind speed from 10 wind farms in Gansu Province, China.
- The application of the DBLS algorithm to wind speed prediction in wind farms, with comparative experiments conducted against RF, SVR, ELM, and BLS. The experimental results demonstrate that the DBLS algorithm performs well in terms of stability and prediction accuracy.
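The fixed-width input idea from the first contribution can be sketched as a simple sliding-window preprocessor. The helper name and the window width of 4 (chosen to match the 4 features per sample reported in the dataset table) are illustrative assumptions, not the authors' exact feature construction.

```python
import numpy as np

def make_fixed_width_samples(series, width=4):
    """Turn a 1-D wind-speed series into fixed-width input rows:
    each sample is the `width` most recent readings, and the target is
    the next reading. A fixed width keeps the model's input size constant
    as new data arrive, instead of letting the sample matrix grow without
    bound (the data-saturation problem the DBLS is designed to avoid)."""
    series = np.asarray(series, dtype=float)
    # Windows over series[:-1]; row i is series[i : i + width].
    X = np.lib.stride_tricks.sliding_window_view(series[:-1], width)
    # Target for row i is the reading immediately after its window.
    y = series[width:]
    return X, y
```

For example, the series `[1, 2, 3, 4, 5, 6]` with `width=4` yields two samples, `[1, 2, 3, 4] → 5` and `[2, 3, 4, 5] → 6`.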
2. BLS
- (1) The input layer of the BLS performs a linear random mapping of the samples to form feature nodes, which makes the network easy to integrate with other neural networks when required.
- (2) In the output layer, the BLS takes the feature nodes and the enhancement nodes together as the input of the output layer. Because the feature nodes are a linear mapping of the input samples, this effectively reduces the loss of sample features.
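These two properties can be illustrated with a minimal BLS forward pass and its closed-form training step. The ridge (regularized pseudoinverse) solve for the output weights is the standard BLS formulation; the tanh enhancement activation, Gaussian random weights, and helper names here are assumptions for illustration only.

```python
import numpy as np

def bls_fit(X, Y, num_win=7, num_fea=10, num_enhan=300, s=0.8, C=2**-30, seed=0):
    """Minimal BLS sketch: random linear feature nodes, nonlinear enhancement
    nodes, and output weights solved in closed form by ridge regression."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Feature nodes: num_win windows, each a linear random mapping of X.
    Ws = [rng.standard_normal((d, num_fea)) for _ in range(num_win)]
    bs = [rng.standard_normal(num_fea) for _ in range(num_win)]
    Z = np.hstack([X @ W + b for W, b in zip(Ws, bs)])  # linear: preserves sample features
    # Enhancement nodes: nonlinear mapping of all feature nodes, scaled by s.
    Wh = rng.standard_normal((Z.shape[1], num_enhan))
    bh = rng.standard_normal(num_enhan)
    H = np.tanh(s * (Z @ Wh + bh))
    # Output-layer input: feature nodes and enhancement nodes concatenated.
    A = np.hstack([Z, H])
    # Ridge solution W = (A^T A + C I)^(-1) A^T Y for the output weights.
    W = np.linalg.solve(A.T @ A + C * np.eye(A.shape[1]), A.T @ Y)
    return (Ws, bs, Wh, bh, s), W
```

Because only `W` is learned (the random mappings are fixed once drawn), training reduces to a single regularized least-squares solve.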
3. Incremental Broad Learning System
4. Deterministic Broad Learning System
Algorithm 1: Deterministic Broad Learning System
Input: training sample X
Output: W
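The body of Algorithm 1 is not reproduced above. As a hedged sketch only: the symbols listed in the Abbreviations and Symbols section (the intermediate layer of the BLS, its inverse, a middle-layer addition variable, and a difference equation) suggest a recursive least-squares style update of the output weight W, which could look like the following. This is a plausible reconstruction, not the authors' verbatim procedure.

```python
import numpy as np

class RecursiveRidge:
    """Hedged sketch of an incremental output-weight update: maintain the
    inverse P = (A^T A + C I)^(-1) of the intermediate-layer Gram matrix and
    fold in each new intermediate-layer row `a` via the Sherman-Morrison
    identity, so W is refreshed without re-solving on the full sample set."""

    def __init__(self, dim, out_dim, C=2**-30):
        self.P = np.eye(dim) / C           # inverse of (A^T A + C I) with no samples yet
        self.W = np.zeros((dim, out_dim))  # output weights

    def update(self, a, y):
        a = a.reshape(-1, 1)
        y = np.atleast_1d(y)
        Pa = self.P @ a
        k = Pa / (1.0 + a.T @ Pa)          # gain vector
        self.P -= k @ Pa.T                 # Sherman-Morrison rank-1 update of the inverse
        self.W += k @ (y - a.T @ self.W)   # correct W by the prediction error (difference equation)
        return self.W
```

Each `update` costs O(dim^2) regardless of how many samples have been seen, which is what makes a fixed-width incremental scheme attractive for continuously updated wind data.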
5. Validation and Analysis
- The initial values of the DBLS algorithm were derived offline using the batch BLS.
- Number of feature windows (NumWin): Selected from {5, 6, …, 20}, and in this experiment, NumWin was set to 7.
- Number of feature nodes (NumFea): Selected from {5, 6, …, 30}, and in this experiment, NumFea was set to 10.
- Number of enhancement nodes (NumEnhan): Selected from {100, 200, …, 1000}, and in this experiment, NumEnhan was set to 300.
- Scaling factor (s) for enhancement nodes: Set to 0.8 in this experiment.
- Regularization parameter (C): Set to 2⁻³⁰ in this experiment.
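The settings listed above can be collected into a single configuration fragment (the key names follow the table headers used later; the values are exactly those reported in this section):

```python
# DBLS hyperparameters as reported in Section 5, with the grid-search
# ranges kept as comments.
DBLS_PARAMS = {
    "NumWin": 7,     # feature windows, searched over {5, 6, ..., 20}
    "NumFea": 10,    # feature nodes per window, searched over {5, 6, ..., 30}
    "NumEnhan": 300, # enhancement nodes, searched over {100, 200, ..., 1000}
    "s": 0.8,        # scaling factor for enhancement nodes
    "C": 2 ** -30,   # ridge regularization parameter
}
```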
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations and Symbols
DBLS | Deterministic Broad Learning System |
BLS | Broad Learning System |
RF | Random Forest |
SVR | Support Vector Regression |
ELM | Extreme Learning Machine |
RMSE | Root Mean Square Error |
MAPE | Mean Absolute Percentage Error |
ARMA | Auto Regression Moving Average |
ARIMA | Auto Regressive Integrated Moving Average |
SARIMA | Seasonal Auto Regressive Integrated Moving Average |
STL | Seasonal-Trend decomposition using Loess |
GARCH | Generalized Auto Regressive Conditional Heteroskedasticity |
ANNs | Artificial Neural Networks |
LSTM | Long Short-Term Memory |
CNNs | Convolutional Neural Networks |
SLFNs | Single-Layer Feedforward Neural Networks |
RVFLNN | Random Vector Functional Link Neural Network |
| the ith feature window |
| the linear mapping function |
| the weights of the linear mapping function |
| the biases of the linear mapping function |
| the mapping function of the enhancement node |
| the weights of the enhancement mapping function |
| the biases of the enhancement mapping function |
W | the output weight |
X | the input sample |
| the intermediate layer of the BLS |
| the middle-layer addition variable corresponding to the added input sample |
| the inverse of |
| the inverse of |
| the difference equation |
References
- Tao, T.; Shi, P.; Wang, H.; Ai, W. Short-term prediction of downburst winds: A double-step modification enhanced approach. J. Wind Eng. Ind. Aerodyn. 2021, 211, 104561.
- Dolatabadi, A.; Abdeltawab, H.; Mohamed, Y.A.R.I. Deep spatial-temporal 2-D CNN-BLSTM model for ultrashort-term LiDAR-assisted wind turbine’s power and fatigue load forecasting. IEEE Trans. Ind. Inform. 2021, 18, 2342–2353.
- Ran, X.; Xu, C.; Ma, L.; Xue, F. Wind Power Interval Prediction with Adaptive Rolling Error Correction Based on PSR-BLS-QR. Energies 2022, 15, 4137.
- Wang, L.; Wang, Y.; Chen, J.; Shen, X. A PM2.5 Concentration Prediction Model Based on CART–BLS. Atmosphere 2022, 13, 1674.
- Jiang, Y.; Liu, S.; Zhao, N.; Xin, J.; Wu, B. Short-term wind speed prediction using time varying filter-based empirical mode decomposition and group method of data handling-based hybrid model. Energy Convers. Manag. 2020, 220, 113076.
- Han, L.; Li, M.J.; Wang, X.J.; Lu, P.P. Wind power forecast based on broad learning system and simplified long short term memory network. IET Renew. Power Gener. 2022, 16, 3614–3628.
- Li, D.; Chen, D.; Jin, B.; Shi, L.; Goh, J.; Ng, S.K. MAD-GAN: Multivariate anomaly detection for time series data with generative adversarial networks. In Proceedings of the International Conference on Artificial Neural Networks, Munich, Germany, 17–19 September 2019; pp. 703–716.
- Kim, H.S.; Han, K.M.; Yu, J.; Kim, J.; Kim, K.; Kim, H. Development of a CNN+LSTM Hybrid Neural Network for Daily PM2.5 Prediction. Atmosphere 2022, 13, 2124.
- Löwe, S.; Madras, D.; Zemel, R.; Welling, M. Amortized causal discovery: Learning to infer causal graphs from time-series data. In Proceedings of the Conference on Causal Learning and Reasoning, Eureka, CA, USA, 11–13 April 2022; pp. 509–525.
- Pappas, S.S.; Ekonomou, L.; Karamousantas, D.C.; Chatzarakis, G.E.; Katsikas, S.K.; Liatsis, P. Electricity demand loads modeling using AutoRegressive Moving Average (ARMA) models. Energy 2008, 33, 1353–1360.
- Nelson, B.K. Time series analysis using autoregressive integrated moving average (ARIMA) models. Acad. Emerg. Med. 1998, 5, 739–744.
- Adachi, Y.; Makita, K. Method of Time Series Analysis of Meat Inspection Data Using Seasonal Autoregressive Integrated Moving Average Model. J. Jpn. Vet. Med. Assoc. 2015, 68, 189–197.
- Cleveland, R.B.; Cleveland, W.S.; McRae, J.E.; Terpenning, I. STL: A seasonal-trend decomposition. J. Off. Stat. 1990, 6, 3–73.
- Bollerslev, T. Generalized autoregressive conditional heteroskedasticity. J. Econom. 1986, 31, 307–327.
- Agatonovic-Kustrin, S.; Beresford, R. Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research. J. Pharm. Biomed. Anal. 2000, 22, 717–727.
- Graves, A. Long short-term memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45.
- Kattenborn, T.; Leitloff, J.; Schiefer, F.; Hinz, S. Review on Convolutional Neural Networks (CNN) in vegetation remote sensing. ISPRS J. Photogramm. Remote Sens. 2021, 173, 24–49.
- Albawi, S.; Mohammed, T.A.; Al-Zawi, S. Understanding of a convolutional neural network. In Proceedings of the International Conference on Engineering and Technology (ICET), Antalya, Turkey, 21–23 August 2017; pp. 1–6.
- Awad, M.; Khanna, R. Support vector regression: Theories, concepts, and applications for engineers and system designers. In Efficient Learning Machines; Springer: Berlin/Heidelberg, Germany, 2015; pp. 67–80.
- Ding, S.; Zhang, Z.; Guo, L.; Sun, Y. An optimized twin support vector regression algorithm enhanced by ensemble empirical mode decomposition and gated recurrent unit. Inf. Sci. 2022, 598, 101–125.
- Biau, G.; Scornet, E. A random forest guided tour. Test 2016, 25, 197–227.
- Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31.
- Rigatti, S.J. Random forest. J. Insur. Med. 2017, 47, 31–39.
- Yoo, H.; Kim, E.; Chung, J.W.; Cho, H.; Jeong, S.; Kim, H.; Jang, D.; Kim, H.; Yoon, J.; Lee, G.H.; et al. Silent Speech Recognition with Strain Sensors and Deep Learning Analysis of Directional Facial Muscle Movement. ACS Appl. Mater. Interfaces 2022, 14, 54157–54169.
- Janiesch, C.; Zschech, P.; Heinrich, K. Machine learning and deep learning. Electron. Mark. 2021, 31, 685–695.
- Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73.
- Mueller, C.M.; Schatz, G.C. An Algorithmic Approach Based on Data Trees and Genetic Algorithms to Understanding Charged and Neutral Metal Nanocluster Growth. J. Phys. Chem. A 2022, 126, 5864–5872.
- Pao, Y.-H.; Takefuji, Y. Functional-link net computing: Theory, system architecture, and functionalities. Computer 1992, 25, 76–79.
- Pao, Y.-H.; Park, G.-H.; Sobajic, D.J. Learning and generalization characteristics of the random vector functional-link net. Neurocomputing 1994, 6, 163–180.
- Liu, Z.; Zhou, J.; Chen, C.L.P. Broad learning system: Feature extraction based on K-means clustering algorithm. In Proceedings of the International Conference on Information, Cybernetics and Computational Social Systems (ICCSS), Dalian, China, 24–26 July 2017; pp. 683–687.
- Chen, C.L.P.; Liu, Z. Broad Learning System: An Effective and Efficient Incremental Learning System without the Need for Deep Architecture. IEEE Trans. Neural Netw. Learn. Syst. 2018, 29, 10–24.
- Igelnik, B.; Pao, Y.H. Stochastic choice of basis functions in adaptive function approximation and the functional-link net. IEEE Trans. Neural Netw. 1995, 6, 1320–1329.
- Cheng, Y.; Le, H.; Li, C.; Huang, J.; Liu, P.X. A Decomposition-Based Improved Broad Learning System Model for Short-Term Load Forecasting. J. Electr. Eng. Technol. 2022, 17, 2703–2716.
- Wang, M.; Ge, Q.; Li, C.; Sun, C. Charging diagnosis of power battery based on adaptive STCKF and BLS for electric vehicles. IEEE Trans. Veh. Technol. 2022, 71, 8251–8265.
- Pu, X.; Li, C. Online semisupervised broad learning system for industrial fault diagnosis. IEEE Trans. Ind. Inform. 2021, 17, 6644–6654.
- Chu, F.; Liang, T.; Chen, C.L.P.; Wang, X.; Ma, X. Weighted broad learning system and its application in nonlinear industrial process modeling. IEEE Trans. Neural Netw. Learn. Syst. 2019, 31, 3017–3031.
- Chen, C.L.P.; Liu, Z.; Feng, S. Universal approximation capability of broad learning system and its structural variations. IEEE Trans. Neural Netw. Learn. Syst. 2018, 30, 1191–1204.
- Wang, S.; Yuen, K.-V.; Yang, X.; Zhang, Y. Tropical Cyclogenesis Detection from Remotely Sensed Sea Surface Winds Using Graphical and Statistical Features-based Broad Learning System. IEEE Trans. Geosci. Remote Sens. 2023, 61, 4203815.
- Jiao, X.; Zhang, D.; Song, D.; Mu, D.; Tian, Y.; Wu, H. Wind Speed Prediction Based on VMD-BLS and Error Compensation. J. Mar. Sci. Eng. 2023, 11, 1082.
- Tao, T.; He, J.; Wang, H.; Zhao, K. Efficient simulation of non-stationary non-homogeneous wind field: Fusion of multi-dimensional interpolation and NUFFT. J. Wind Eng. Ind. Aerodyn. 2023, 236, 105394.
- Xu, Y.-L.; Hu, L.; Kareem, A. Conditional Simulation of Nonstationary Fluctuating Wind Speeds for Long-Span Bridges. J. Eng. Mech. 2014, 140, 61–73.
- Tao, T.; Wang, H.; Zhao, K. Efficient simulation of fully non-stationary random wind field based on reduced 2D hermite interpolation. Mech. Syst. Signal Process. 2021, 150, 107265.
Dataset | Training Samples | Testing Samples | Features | Classifications |
---|---|---|---|---|
Wind farm 1 | 28,790 | 6240 | 4 | 5 |
Wind farm 2 | 28,800 | 6200 | 4 | 5 |
Wind farm 3 | 28,700 | 6300 | 4 | 5 |
Wind farm 4 | 28,650 | 6400 | 4 | 5 |
Wind farm 5 | 28,880 | 6100 | 4 | 5 |
Wind farm 6 | 28,780 | 6250 | 4 | 5 |
Wind farm 7 | 28,790 | 6250 | 4 | 5 |
Wind farm 8 | 28,700 | 6240 | 4 | 5 |
Wind farm 9 | 28,800 | 6200 | 4 | 5 |
Wind farm 10 | 28,850 | 6300 | 4 | 5 |
Dataset | RF Estimators | RF Min Samples Split | RF Max Depth | SVR C | SVR γ | ELM C | ELM Hidden Num | BLS Num Win | BLS Num Fea | BLS Num Enhan | BLS C | DBLS Num Win | DBLS Num Fea | DBLS Num Enhan | DBLS C
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Wind farm 1 | 100 | 200 | 20 | 10 | 0.01 | 10 | 300 | 7 | 8 | 800 | 2⁻³⁰ | 6 | 10 | 400 | 2⁻³⁰
Wind farm 2 | 200 | 300 | 20 | 1 | 0.1 | 1 | 200 | 6 | 10 | 900 | 2⁻³⁰ | 8 | 12 | 500 | 2⁻³⁰
Wind farm 3 | 100 | 400 | 30 | 1 | 0.01 | 10 | 300 | 5 | 13 | 600 | 2⁻³⁰ | 7 | 10 | 300 | 2⁻³⁰
Wind farm 4 | 150 | 300 | 40 | 10 | 0.1 | 1 | 200 | 10 | 15 | 500 | 2⁻³⁰ | 9 | 12 | 500 | 2⁻³⁰
Wind farm 5 | 120 | 200 | 40 | 1 | 0.01 | 10 | 200 | 8 | 8 | 700 | 2⁻³⁰ | 10 | 10 | 600 | 2⁻³⁰
Wind farm 6 | 140 | 300 | 50 | 10 | 0.1 | 10 | 300 | 6 | 7 | 400 | 2⁻³⁰ | 6 | 9 | 400 | 2⁻³⁰
Wind farm 7 | 150 | 400 | 40 | 1 | 0.1 | 1 | 200 | 7 | 10 | 500 | 2⁻³⁰ | 5 | 8 | 500 | 2⁻³⁰
Wind farm 8 | 200 | 200 | 20 | 10 | 0.01 | 10 | 200 | 10 | 16 | 700 | 2⁻³⁰ | 6 | 9 | 600 | 2⁻³⁰
Wind farm 9 | 150 | 200 | 20 | 10 | 1 | 10 | 300 | 9 | 14 | 400 | 2⁻³⁰ | 7 | 9 | 400 | 2⁻³⁰
Wind farm 10 | 200 | 300 | 30 | 1 | 0.1 | 1 | 300 | 10 | 15 | 700 | 2⁻³⁰ | 7 | 10 | 300 | 2⁻³⁰
Dataset | RF RMSE (m/s) | RF MAPE (%) | SVR RMSE (m/s) | SVR MAPE (%) | ELM RMSE (m/s) | ELM MAPE (%) | BLS RMSE (m/s) | BLS MAPE (%) | DBLS RMSE (m/s) | DBLS MAPE (%)
---|---|---|---|---|---|---|---|---|---|---
Wind farm 1 | 1.562 | 0.429 | 1.242 | 0.403 | 0.956 | 0.390 | 0.805 | 0.170 | 0.765 | 0.140
Wind farm 2 | 1.559 | 0.426 | 1.230 | 0.402 | 0.955 | 0.389 | 0.808 | 0.171 | 0.767 | 0.142
Wind farm 3 | 1.575 | 0.424 | 1.248 | 0.405 | 0.958 | 0.392 | 0.811 | 0.173 | 0.768 | 0.142
Wind farm 4 | 1.572 | 0.420 | 1.249 | 0.406 | 0.957 | 0.393 | 0.815 | 0.175 | 0.766 | 0.140
Wind farm 5 | 1.580 | 0.427 | 1.242 | 0.401 | 0.952 | 0.390 | 0.817 | 0.176 | 0.762 | 0.138
Wind farm 6 | 1.574 | 0.423 | 1.243 | 0.402 | 0.955 | 0.391 | 0.819 | 0.179 | 0.765 | 0.140
Wind farm 7 | 1.582 | 0.430 | 1.246 | 0.408 | 0.961 | 0.392 | 0.821 | 0.181 | 0.769 | 0.143
Wind farm 8 | 1.585 | 0.432 | 1.249 | 0.410 | 0.965 | 0.395 | 0.825 | 0.184 | 0.772 | 0.146
Wind farm 9 | 1.587 | 0.435 | 1.251 | 0.413 | 0.968 | 0.397 | 0.827 | 0.186 | 0.776 | 0.149
Wind farm 10 | 1.583 | 0.431 | 1.259 | 0.416 | 0.969 | 0.398 | 0.828 | 0.189 | 0.771 | 0.145
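The two error metrics reported in the table follow their standard definitions; a small sketch (in practice, zero wind-speed readings would need to be masked before computing MAPE):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Square Error, in the units of the data (m/s for wind speed)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, expressed as a percentage."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)
```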
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, L.; Xue, A. Wind Speed Modeling for Wind Farms Based on Deterministic Broad Learning System. Atmosphere 2023, 14, 1308. https://doi.org/10.3390/atmos14081308