Machine Learning Innovations in Renewable Energy Systems with Integrated NRBO-TXAD for Enhanced Wind Speed Forecasting Accuracy
Abstract
1. Introduction
- This study combines the IQR method with an AMAF for data preprocessing. The IQR method identifies outliers effectively, while the AMAF adapts its filter window dynamically. Together, they reduce noise while preserving essential information, providing a more reliable foundation for subsequent modeling.
- To improve wind speed prediction accuracy, an NRBO–Transformer model is proposed. The NRBO tunes the Transformer's hyperparameters, improving both training convergence and prediction performance.
- An error compensation mechanism is introduced to address the limitations of a single model. XGBoost, a powerful ensemble learning algorithm, models nonlinear relationships effectively and corrects the Transformer's prediction errors.
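A minimal sketch of the preprocessing described in the first contribution, assuming Tukey's 1.5 × IQR fences and a simple variance-driven window rule; the paper's exact AMAF adaptation rule may differ, and all function and parameter names here (`iqr_correct`, `adaptive_moving_average`, `base_window`, `max_window`) are illustrative:

```python
import numpy as np

def iqr_correct(x, k=1.5):
    """Clip values outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return np.clip(x, q1 - k * iqr, q3 + k * iqr)

def adaptive_moving_average(x, base_window=3, max_window=9):
    """Toy AMAF: widen the averaging window where local variance is high."""
    x = np.asarray(x, dtype=float)
    global_std = x.std() + 1e-12
    out = np.empty_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - base_window), min(len(x), i + base_window + 1)
        # Scale the window by how noisy this neighborhood is vs. the whole series.
        ratio = min(x[lo:hi].std() / global_std, 1.0)
        w = base_window + int((max_window - base_window) * ratio)
        lo, hi = max(0, i - w), min(len(x), i + w + 1)
        out[i] = x[lo:hi].mean()
    return out
```

Applying `adaptive_moving_average(iqr_correct(series))` yields the cleaned series that feeds the Transformer.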
2. Model Frameworks
2.1. Transformer
2.1.1. Embeddings and Positional Encoding
2.1.2. Encoder–Decoder Module
2.1.3. Classifier
2.2. Newton–Raphson-Based Optimizer
2.2.1. Population Initialization
2.2.2. Newton–Raphson Search Rule
2.2.3. Trap Avoidance Operation
2.3. XGBoost
2.4. Framework of the NRBO-TXAD Model
3. Case Study Analysis
3.1. Dataset Description and Preprocessing
3.1.1. IQR Outlier Detection and Correction
3.1.2. Adaptive Moving Average Filter
3.2. Evaluation Metrics
3.3. Simulation Environment
4. Experimental Results
4.1. Parameter Setting and Comparison Model
4.2. Fitness Curve
4.3. Prediction and Error Plots
4.4. The Impact of Time Steps on Prediction Results
4.5. Uncertainty Analysis
4.6. Sensitivity Analysis
4.6.1. Hyperparameter Sensitivity
4.6.2. Input Perturbation Sensitivity
4.7. Real-World Applicability and Computational Cost
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Nomenclature
ADM | Atmospheric dynamics model |
AMAF | Adaptive moving average filter |
ANN | Artificial neural network |
ARIMA | Autoregressive integrated moving average |
BLM | Boundary layer model |
BPNN | Backpropagation neural network |
CART | Classification and regression tree |
CNN | Convolutional neural network |
DE | Differential evolution algorithm |
ELM | Extreme learning machine |
ENN | Elman neural network |
GA | Genetic algorithm |
GBDT | Gradient boosted decision tree |
IQR | Interquartile range |
LSTM | Long short-term memory |
lr | Learning rate |
MAPE | Mean absolute percentage error |
NGO | Northern Goshawk optimization |
NRBO | Newton–Raphson-based optimizer |
NRM | Newton–Raphson method |
NRSR | Newton–Raphson search rule |
numHeads | Number of attention heads |
NWP | Numerical weather prediction |
PSO | Particle swarm optimization |
RNN | Recurrent neural network |
RMSE | Root mean square error |
SVM | Support vector machine |
TAO | Trap avoidance operator |
TS | Taylor series |
WOA | Whale optimization algorithm |
XGBoost | Extreme gradient boosting |
References
- Dablander, F.; Hickey, C.; Sandberg, M.; Zell-Ziegler, C.; Grin, J. Embracing Sufficiency to Accelerate the Energy Transition. Energy Res. Social. Sci. 2025, 120, 103907. [Google Scholar] [CrossRef]
- Sun, Y.; Du, R.; Chen, H. Energy Transition and Policy Perception Acuity: An Analysis of 335 High-Energy-Consuming Enterprises in China. Appl. Energy 2025, 377, 124627. [Google Scholar] [CrossRef]
- Liu, H.; Mi, X.; Li, Y. Smart Deep Learning Based Wind Speed Prediction Model Using Wavelet Packet Decomposition, Convolutional Neural Network and Convolutional Long Short Term Memory Network. Energy Convers. Manag. 2018, 166, 120–131. [Google Scholar] [CrossRef]
- Shokri Gazafroudi, A. Assessing the Impact of Load and Renewable Energies’ Uncertainty on a Hybrid System. Int. J. Electr. Power Energy Syst. 2016, 5, 1. [Google Scholar] [CrossRef]
- Kim, S.-Y.; Kim, S.-H. Study on the Prediction of Wind Power Generation Based on Artificial Neural Network. J. Inst. Control Robot. Syst. 2011, 17, 1173–1178. [Google Scholar] [CrossRef]
- Qin, X.; Yuan, L.; Dong, X.; Zhang, S.; Shi, H. Short Term Wind Speed Prediction Based on CEESMDAN and Improved Seagull Optimization Kernel Extreme Learning Machine. Earth Sci. Inform. 2025, 18, 141. [Google Scholar] [CrossRef]
- Cai, H.; Wu, Z.; Huang, C.; Huang, D. Wind Power Forecasting Based on Ensemble Empirical Mode Decomposition with Generalized Regression Neural Network Based on Cross-Validated Method. J. Electr. Eng. Technol. 2019, 14, 1823–1829. [Google Scholar] [CrossRef]
- Khan, S.; Muhammad, Y.; Jadoon, I.; Awan, S.E.; Raja, M.A.Z. Leveraging LSTM-SMI and ARIMA Architecture for Robust Wind Power Plant Forecasting. Appl. Soft Comput. 2025, 170, 112765. [Google Scholar] [CrossRef]
- Melalkia, L.; Berrezzek, F.; Khelil, K.; Saim, A.; Nebili, R. A Hybrid Error Correction Method Based on EEMD and ConvLSTM for Offshore Wind Power Forecasting. Ocean. Eng. 2025, 325, 120773. [Google Scholar] [CrossRef]
- Global Wind Energy Council Launched. Refocus 2005, 6, 11. [CrossRef]
- Phan, Q.B.; Nguyen, T.T. Enhancing Wind Speed Forecasting Accuracy Using a GWO-Nested CEEMDAN-CNN-BiLSTM Model. ICT Express 2024, 10, 485–490. [Google Scholar] [CrossRef]
- Countries That Produce the Most Wind Energy. Available online: https://www.evwind.es/2023/01/14/countries-that-produce-the-most-wind-energy/89725 (accessed on 20 April 2025).
- Pinson, P.; Nielsen, H.A.; Madsen, H.; Kariniotakis, G. Skill Forecasting from Ensemble Predictions of Wind Power. Appl. Energy 2009, 86, 1326–1334. [Google Scholar] [CrossRef]
- Barbosa De Alencar, D.; De Mattos Affonso, C.; Limão De Oliveira, R.; Moya Rodríguez, J.; Leite, J.; Reston Filho, J. Different Models for Forecasting Wind Power Generation: Case Study. Energies 2017, 10, 1976. [Google Scholar] [CrossRef]
- Okumus, I.; Dinler, A. Current Status of Wind Energy Forecasting and a Hybrid Method for Hourly Predictions. Energy Convers. Manag. 2016, 123, 362–371. [Google Scholar] [CrossRef]
- Maděra, J.; Kočí, J.; Černý, R. Computational Modeling of the Effect of External Environment on the Degradation of High-Performance Concrete. In AIP Conference Proceedings; American Institute of Physics: Istanbul, Turkey, 2017; Volume 1809, p. 020032. [Google Scholar]
- Georgilakis, P.S. Technical Challenges Associated with the Integration of Wind Power into Power Systems. Renew. Sustain. Energy Rev. 2008, 12, 852–863. [Google Scholar] [CrossRef]
- Pan, J.-S.; Liu, F.-F.; Tian, A.-Q.; Kong, L.; Chu, S.-C. Parameter Extraction Model of Wind Turbine Based on A Novel Pigeon-Inspired Optimization Algorithm. J. Internet Technol. 2024, 25, 561–573. [Google Scholar] [CrossRef]
- Vargas, S.A.; Esteves, G.R.T.; Maçaira, P.M.; Bastos, B.Q.; Cyrino Oliveira, F.L.; Souza, R.C. Wind Power Generation: A Review and a Research Agenda. J. Clean. Prod. 2019, 218, 850–870. [Google Scholar] [CrossRef]
- Shu, Y.; Chen, G.; He, J.; Zhang, F. Building a New Electric Power System Based on New Energy Sources. Chin. J. Eng. Sci. 2021, 23, 61. [Google Scholar] [CrossRef]
- Shahid, F.; Zameer, A.; Muneeb, M. A Novel Genetic LSTM Model for Wind Power Forecast. Energy 2021, 223, 120069. [Google Scholar] [CrossRef]
- Sideratos, G.; Hatziargyriou, N.D. An Advanced Statistical Method for Wind Power Forecasting. IEEE Trans. Power Syst. 2007, 22, 258–265. [Google Scholar] [CrossRef]
- Colak, I.; Sagiroglu, S.; Yesilbudak, M. Data Mining and Wind Power Prediction: A Literature Review. Renew. Energy 2012, 46, 241–247. [Google Scholar] [CrossRef]
- Guo, L.; Xu, C.; Yu, T.; Wumaier, T.; Han, X. Ultra-Short-Term Wind Power Forecasting Based on Long Short-Term Memory Network with Modified Honey Badger Algorithm. Energy Rep. 2024, 12, 3548–3565. [Google Scholar] [CrossRef]
- Bryce, R. Solar PV, Wind Generation, and Load Forecasting Dataset for ERCOT 2018: Performance-Based Energy Resource Feedback, Optimization, and Risk Management (P.E.R.F.O.R.M.); National Renewable Energy Laboratory (NREL): Golden, CO, USA, 2023. [Google Scholar]
- Balkissoon, S.; Fox, N.; Lupo, A.; Haupt, S.E.; Penny, S.G. Classification of Tall Tower Meteorological Variables and Forecasting Wind Speeds in Columbia, Missouri. Renew. Energy 2023, 217, 119123. [Google Scholar] [CrossRef]
- Tian, Z. Modes Decomposition Forecasting Approach for Ultra-Short-Term Wind Speed. Appl. Soft Comput. 2021, 105, 107303. [Google Scholar] [CrossRef]
- Xiong, X.; Zou, R.; Sheng, T.; Zeng, W.; Ye, X. An Ultra-Short-Term Wind Speed Correction Method Based on the Fluctuation Characteristics of Wind Speed. Energy 2023, 283, 129012. [Google Scholar] [CrossRef]
- Saini, V.K.; Kumar, R.; Al-Sumaiti, A.S.; Sujil, A.; Heydarian-Forushani, E. Learning Based Short Term Wind Speed Forecasting Models for Smart Grid Applications: An Extensive Review and Case Study. Electr. Power Syst. Res. 2023, 222, 109502. [Google Scholar] [CrossRef]
- Han, Y.; Mi, L.; Shen, L.; Cai, C.S.; Liu, Y.; Li, K.; Xu, G. A Short-Term Wind Speed Prediction Method Utilizing Novel Hybrid Deep Learning Algorithms to Correct Numerical Weather Forecasting. Appl. Energy 2022, 312, 118777. [Google Scholar] [CrossRef]
- Shirzadi, N.; Nizami, A.; Khazen, M.; Nik-Bakht, M. Medium-Term Regional Electricity Load Forecasting through Machine Learning and Deep Learning. Designs 2021, 5, 27. [Google Scholar] [CrossRef]
- Ávila, L.; Mine, M.R.M.; Kaviski, E.; Detzel, D.H.M. Evaluation of Hydro-Wind Complementarity in the Medium-Term Planning of Electrical Power Systems by Joint Simulation of Periodic Streamflow and Wind Speed Time Series: A Brazilian Case Study. Renew. Energy 2021, 167, 685–699. [Google Scholar] [CrossRef]
- Ban, G.; Chen, Y.; Xiong, Z.; Zhuo, Y.; Huang, K. The Univariate Model for Long-Term Wind Speed Forecasting Based on Wavelet Soft Threshold Denoising and Improved Autoformer. Energy 2024, 290, 130225. [Google Scholar] [CrossRef]
- Hayes, L.; Stocks, M.; Blakers, A. Accurate Long-Term Power Generation Model for Offshore Wind Farms in Europe Using ERA5 Reanalysis. Energy 2021, 229, 120603. [Google Scholar] [CrossRef]
- Omidkar, A.; Es’haghian, R.; Song, H. Using Machine Learning Methods for Long-Term Technical and Economic Evaluation of Wind Power Plants. Green. Energy Resour. 2025, 3, 100115. [Google Scholar] [CrossRef]
- Wang, J.; Che, J.; Li, Z.; Gao, J.; Zhang, L. Hybrid Wind Speed Optimization Forecasting System Based on Linear and Nonlinear Deep Neural Network Structure and Data Preprocessing Fusion. Future Gener. Comput. Syst. 2025, 164, 107565. [Google Scholar] [CrossRef]
- Geng, D.; Zhang, Y.; Zhang, Y.; Qu, X.; Li, L. A Hybrid Model Based on CapSA-VMD-ResNet-GRU-Attention Mechanism for Ultra-Short-Term and Short-Term Wind Speed Prediction. Renew. Energy 2025, 240, 122191. [Google Scholar] [CrossRef]
- Raju, S.K.; Periyasamy, M.; Alhussan, A.A.; Kannan, S.; Raghavendran, S.; El-kenawy, E.-S.M. Machine Learning Boosts Wind Turbine Efficiency with Smart Failure Detection and Strategic Placement. Sci. Rep. 2025, 15, 1485. [Google Scholar] [CrossRef]
- Sanda, M.G.; Emam, M.; Ookawara, S.; Hassan, H. Techno-Enviro-Economic Evaluation of on-Grid and off-Grid Hybrid Photovoltaics and Vertical Axis Wind Turbines System with Battery Storage for Street Lighting Application. J. Clean. Prod. 2025, 491, 144866. [Google Scholar] [CrossRef]
- Xu, H.; Zhao, Y.; Dajun, Z.; Duan, Y.; Xu, X. Exploring the Typhoon Intensity Forecasting through Integrating AI Weather Forecasting with Regional Numerical Weather Model. npj Clim. Atmos. Sci. 2025, 8, 38. [Google Scholar] [CrossRef]
- Han, S.; Song, W.; Yan, J.; Zhang, N.; Wang, H.; Ge, C.; Liu, Y. Integrating Intra-Seasonal Oscillations with Numerical Weather Prediction for 15-Day Wind Power Forecasting. IEEE Trans. Power Syst. 2025, 1–14. [Google Scholar] [CrossRef]
- Duca, V.E.L.A.; Fonseca, T.C.O.; Cyrino Oliveira, F.L. A Generalized Dynamical Model for Wind Speed Forecasting. Renew. Sustain. Energy Rev. 2021, 136, 110421. [Google Scholar] [CrossRef]
- Efthimiou, G.C.; Kumar, P.; Giannissi, S.G.; Feiz, A.A.; Andronopoulos, S. Prediction of the Wind Speed Probabilities in the Atmospheric Surface Layer. Renew. Energy 2019, 132, 921–930. [Google Scholar] [CrossRef]
- Van De Wiel, B.J.H.; Moene, A.F.; Jonker, H.J.J.; Baas, P.; Basu, S.; Donda, J.M.M.; Sun, J.; Holtslag, A.A.M. The Minimum Wind Speed for Sustainable Turbulence in the Nocturnal Boundary Layer. J. Atmos. Sci. 2012, 69, 3116–3127. [Google Scholar] [CrossRef]
- Cheng, Y.; Brutsaert, W. Flux-Profile Relationships for Wind Speed and Temperature in the Stable Atmospheric Boundary Layer. Bound.-Layer. Meteorol. 2005, 114, 519–538. [Google Scholar] [CrossRef]
- Feng, L.; Zhou, Y.; Luo, Q.; Wei, Y. Complex-Valued Artificial Hummingbird Algorithm for Global Optimization and Short-Term Wind Speed Prediction. Expert. Syst. Appl. 2024, 246, 123160. [Google Scholar] [CrossRef]
- Castorrini, A.; Gentile, S.; Geraldi, E.; Bonfiglioli, A. Increasing Spatial Resolution of Wind Resource Prediction Using NWP and RANS Simulation. J. Wind. Eng. Ind. Aerodyn. 2021, 210, 104499. [Google Scholar] [CrossRef]
- Wu, C.; Huang, H.; Zhang, L.; Chen, J.; Tong, Y.; Zhou, M. Towards automated 3D evaluation of water leakage on a tunnel face via improved GAN and self-attention DL model. Tunn. Undergr. Space Technol. 2023, 142, 105432. [Google Scholar] [CrossRef]
- Kavasseri, R.G.; Seetharaman, K. Day-Ahead Wind Speed Forecasting Using f-ARIMA Models. Renew. Energy 2009, 34, 1388–1393. [Google Scholar] [CrossRef]
- Chen, W. A novel Tree-augmented Bayesian network for predicting rock weathering degree using incomplete dataset. Int. J. Rock Mech. Min. Sci. 2024, 183, 105933. [Google Scholar] [CrossRef]
- Torres, J.L.; García, A.; De Blas, M.; De Francisco, A. Forecast of Hourly Average Wind Speed with ARMA Models in Navarre (Spain). Sol. Energy 2005, 79, 65–77. [Google Scholar] [CrossRef]
- Liu, M.-D.; Ding, L.; Bai, Y.-L. Application of Hybrid Model Based on Empirical Mode Decomposition, Novel Recurrent Neural Networks and the ARIMA to Wind Speed Prediction. Energy Convers. Manag. 2021, 233, 113917. [Google Scholar] [CrossRef]
- Jiang, Y.; Huang, G.; Peng, X.; Li, Y.; Yang, Q. A Novel Wind Speed Prediction Method: Hybrid of Correlation-Aided DWT, LSSVM and GARCH. J. Wind. Eng. Ind. Aerodyn. 2018, 174, 28–38. [Google Scholar] [CrossRef]
- García, I.; Huo, S.; Prado, R.; Bravo, L. Dynamic Bayesian Temporal Modeling and Forecasting of Short-Term Wind Measurements. Renew. Energy 2020, 161, 55–64. [Google Scholar] [CrossRef]
- Ak, R.; Fink, O.; Zio, E. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction. IEEE Trans. Neural Netw. Learning Syst. 2016, 27, 1734–1747. [Google Scholar] [CrossRef] [PubMed]
- Abdelghany, E.S.; Farghaly, M.B.; Almalki, M.M.; Sarhan, H.H.; Essa, M.E.-S.M. Machine Learning and Iot Trends for Intelligent Prediction of Aircraft Wing Anti-Icing System Temperature. Aerospace 2023, 10, 676. [Google Scholar] [CrossRef]
- Wu, C.; Huang, H.; Ni, Y.-Q.; Zhang, L.; Zhang, L. Evaluation of Tunnel Rock Mass Integrity Using Multi-Modal Data and Generative Large Models: Tunnelrip-Gpt. SSRN, 2025; preprint. [Google Scholar] [CrossRef]
- Duan, J.; Chang, M.; Chen, X.; Wang, W.; Zuo, H.; Bai, Y.; Chen, B. A Combined Short-Term Wind Speed Forecasting Model Based on CNN–RNN and Linear Regression Optimization Considering Error. Renew. Energy 2022, 200, 788–808. [Google Scholar] [CrossRef]
- Xu, M. Comparative Analysis of Machine Learning Models for Weather Forecasting: A Heathrow Case Study. TE 2024, 1, 1–12. [Google Scholar] [CrossRef]
- Ren, C.; An, N.; Wang, J.; Li, L.; Hu, B.; Shang, D. Optimal Parameters Selection for BP Neural Network Based on Particle Swarm Optimization: A Case Study of Wind Speed Forecasting. Knowl.-Based Syst. 2014, 56, 226–239. [Google Scholar] [CrossRef]
- Yu, C.; Li, Y.; Zhang, M. Comparative Study on Three New Hybrid Models Using Elman Neural Network and Empirical Mode Decomposition Based Technologies Improved by Singular Spectrum Analysis for Hour-Ahead Wind Speed Forecasting. Energy Convers. Manag. 2017, 147, 75–85. [Google Scholar] [CrossRef]
- Yang, Y.; Solomin, E.V. Wind Direction Prediction Based on Nonlinear Autoregression and Elman Neural Networks for the Wind Turbine Yaw System. Renew. Energy 2025, 241, 122284. [Google Scholar] [CrossRef]
- Santhosh, M.; Venkaiah, C.; Vinod Kumar, D.M. Ensemble Empirical Mode Decomposition Based Adaptive Wavelet Neural Network Method for Wind Speed Prediction. Energy Convers. Manag. 2018, 168, 482–493. [Google Scholar] [CrossRef]
- Banik, A.; Behera, C.; Sarathkumar, T.V.; Goswami, A.K. Uncertain Wind Power Forecasting Using LSTM-based Prediction Interval. IET Renew. Power Gen. 2020, 14, 2657–2667. [Google Scholar] [CrossRef]
- Nair, K.R.; Vanitha, V.; Jisma, M. Forecasting of Wind Speed Using ANN, ARIMA and Hybrid Models. In Proceedings of the 2017 International Conference on Intelligent Computing, Instrumentation and Control Technologies (ICICICT), Kannur, Kerala State, India, 6–7 July 2017; pp. 170–175. [Google Scholar]
- Liu, W.; Bai, Y.; Yue, X.; Wang, R.; Song, Q. A Wind Speed Forecasting Model Based on Rime Optimization Based VMD and Multi-Headed Self-Attention-LSTM. Energy 2024, 294, 130726. [Google Scholar] [CrossRef]
- Qian, Z.; Pei, Y.; Zareipour, H.; Chen, N. A Review and Discussion of Decomposition-Based Hybrid Models for Wind Energy Forecasting Applications. Appl. Energy 2019, 235, 939–953. [Google Scholar] [CrossRef]
- Mi, X.; Zhao, S. Wind Speed Prediction Based on Singular Spectrum Analysis and Neural Network Structural Learning. Energy Convers. Manag. 2020, 216, 112956. [Google Scholar] [CrossRef]
- Liang, T.; Chai, C.; Sun, H.; Tan, J. Wind Speed Prediction Based on Multi-Variable Capsnet-BILSTM-MOHHO for WPCCC. Energy 2022, 250, 123761. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, L.; Polosukhin, I. Attention Is All You Need. arXiv 2017, arXiv:1706.03762. [Google Scholar]
- Wang, C.; Chen, Y.; Zhang, S.; Zhang, Q. Stock Market Index Prediction Using Deep Transformer Model. Expert. Syst. Appl. 2022, 208, 118128. [Google Scholar] [CrossRef]
- Chandra, A.; Tünnermann, L.; Löfstedt, T.; Gratz, R. Transformer-Based Deep Learning for Predicting Protein Properties in the Life Sciences. eLife 2023, 12, e82819. [Google Scholar] [CrossRef] [PubMed]
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. arXiv 2021, arXiv:2106.13008. [Google Scholar]
- Qu, K.; Si, G.; Shan, Z.; Kong, X.; Yang, X. Short-Term Forecasting for Multiple Wind Farms Based on Transformer Model. Energy Rep. 2022, 8, 483–490. [Google Scholar] [CrossRef]
- Yan, D.; Lu, Y. Recent Advances in Particle Swarm Optimization for Large Scale Problems. J. Auton. Intell. 2018, 1, 22. [Google Scholar] [CrossRef]
- Kumar, R.; Kumar, A. Application of Differential Evolution for Wind Speed Distribution Parameters Estimation. Wind. Eng. 2021, 45, 1544–1556. [Google Scholar] [CrossRef]
- Sivanandam, S.N.; Deepa, S.N. (Eds.) Introduction to Genetic Algorithms; Springer: Berlin/Heidelberg, Germany, 2008; ISBN 9783540731894. [Google Scholar]
- Dehghani, M.; Hubalovsky, S.; Trojovsky, P. Northern Goshawk Optimization: A New Swarm-Based Algorithm for Solving Optimization Problems. IEEE Access 2021, 9, 162059–162080. [Google Scholar] [CrossRef]
- Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
- Fan, X.; Wang, R.; Yang, Y.; Wang, J. Transformer–BiLSTM Fusion Neural Network for Short-Term PV Output Prediction Based on NRBO Algorithm and VMD. Appl. Sci. 2024, 14, 11991. [Google Scholar] [CrossRef]
- Wu, H.; Meng, K.; Fan, D.; Zhang, Z.; Liu, Q. Multistep Short-Term Wind Speed Forecasting Using Transformer. Energy 2022, 261, 125231. [Google Scholar] [CrossRef]
- Novotný, V.; Štefánik, M.; Ayetiran, E.F.; Sojka, P.; Řehůřek, R. When FastText Pays Attention: Efficient Estimation of Word Representations Using Constrained Positional Weighting. J. Univ. Comput. Sci. 2022, 28, 181–201. [Google Scholar] [CrossRef]
- Sowmya, R.; Premkumar, M.; Jangir, P. Newton-Raphson-Based Optimizer: A New Population-Based Metaheuristic Algorithm for Continuous Optimization Problems. Eng. Appl. Artif. Intell. 2024, 128, 107532. [Google Scholar] [CrossRef]
- Magreñán, A.A.; Argyros, I.K. A Contemporary Study of Iterative Methods: Convergence, Dynamics and Applications; Academic Press: London, UK, 2018; ISBN 9780128092149. [Google Scholar]
- Friedman, J.H. Greedy Function Approximation: A Gradient Boosting Machine. Ann. Statist. 2001, 29, 1189–1232. [Google Scholar] [CrossRef]
- Wang, Y.; Guo, Y. Forecasting Method of Stock Market Volatility in Time Series Data Based on Mixed Model of ARIMA and XGBoost. China Commun. 2020, 17, 205–221. [Google Scholar] [CrossRef]
- Gunawan, R.G.; Handika, E.S.; Ismanto, E. Pendekatan Machine Learning Dengan Menggunakan Algoritma Xgboost (Extreme Gradient Boosting) Untuk Peningkatan Kinerja Klasifikasi Serangan Syn. CoSciTech 2022, 3, 453–463. [Google Scholar] [CrossRef]
- Deng, X.; Ye, A.; Zhong, J.; Xu, D.; Yang, W.; Song, Z.; Zhang, Z.; Guo, J.; Wang, T.; Tian, Y.; et al. Bagging—XGBoost Algorithm Based Extreme Weather Identification and Short-Term Load Forecasting Model. Energy Rep. 2022, 8, 8661–8674. [Google Scholar] [CrossRef]
- Semmelmann, L.; Henni, S.; Weinhardt, C. Load Forecasting for Energy Communities: A Novel LSTM-XGBoost Hybrid Model Based on Smart Meter Data. Energy Inform. 2022, 5, 24. [Google Scholar] [CrossRef]
- Leng, Z.; Chen, L.; Yi, B.; Liu, F.; Xie, T.; Mei, Z. Short-Term Wind Speed Forecasting Based on a Novel KANInformer Model and Improved Dual Decomposition. Energy 2025, 322, 135551. [Google Scholar] [CrossRef]
- Hua, Z.; Yang, Q.; Chen, J.; Lan, T.; Zhao, D.; Dou, M.; Liang, B. Degradation Prediction of PEMFC Based on BiTCN-BiGRU-ELM Fusion Prognostic Method. Int. J. Hydrogen Energy 2024, 87, 361–372. [Google Scholar] [CrossRef]
- Chen, G.; Tang, B.; Zeng, X.; Zhou, P.; Kang, P.; Long, H. Short-Term Wind Speed Forecasting Based on Long Short-Term Memory and Improved BP Neural Network. Int. J. Electr. Power Energy Syst. 2022, 134, 107365. [Google Scholar] [CrossRef]
- Fang, Y.; Wu, Y.; Wu, F.; Yan, Y.; Liu, Q.; Liu, N.; Xia, J. Short-Term Wind Speed Forecasting Bias Correction in the Hangzhou Area of China Based on a Machine Learning Model. Atmos. Ocean. Sci. Lett. 2023, 16, 100339. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting. Assoc. Adv. Artif. Intell. 2021, 35, 11106–11115. [Google Scholar] [CrossRef]
- Shao, B.; Song, D.; Bian, G.; Zhao, Y. Wind Speed Forecast Based on the LSTM Neural Network Optimized by the Firework Algorithm. Adv. Mater. Sci. Eng. 2021, 2021, 4874757. [Google Scholar] [CrossRef]
- Shan, S.; Ni, H.; Chen, G.; Lin, X.; Li, J. A Machine Learning Framework for Enhancing Short-Term Water Demand Forecasting Using Attention-BiLSTM Networks Integrated with XGBoost Residual Correction. Water 2023, 15, 3605. [Google Scholar] [CrossRef]
- Shivam, K.; Tzou, J.-C.; Wu, S.-C. Multi-Step Short-Term Wind Speed Prediction Using a Residual Dilated Causal Convolutional Network with Nonlinear Attention. Energies 2020, 13, 1772. [Google Scholar] [CrossRef]
Type | Time Frame | Application Scenarios | Characteristics |
---|---|---|---|
Ultra-short-term [27,28] | A few minutes to within one hour | Real-time power control of wind farms and emergency grid response | High temporal resolution, rapid response, frequent data updates, high requirement for real-time data, capable of effectively addressing sudden wind power fluctuations |
Short-term [29,30] | One hour to within several hours | Intraday power dispatch and optimization of wind farms | Relatively high temporal resolution, suitable for short-term dispatch, sensitive to weather changes, allows for early adjustment of wind farm operating strategies |
Medium-term [31,32] | Several hours to a few days | Daily dispatch planning of wind farms and power market trading | Moderate temporal resolution, integrates weather forecasts and load forecasting, high requirement for system reliability of grid-connected wind power, helps optimize market participation strategies |
Long-term [33,34,35] | One week to several weeks | Weekly dispatch planning of wind farms and resource allocation | Low temporal resolution, considers seasonal variations and long-term trends, important for strategic planning and resource optimization of wind farms, supports long-term operation management |
Method | Ref. | Description | Advantages | Disadvantages
---|---|---|---|---
NWP | [40,41] | Based on atmospheric physics; uses numerical calculations to predict wind speed. | |
ADM | [42,43] | Simulates atmospheric circulation and dynamic processes to predict wind speed changes. | |
BLM | [44,45] | Concentrates on boundary layer wind speed, accounting for surface friction. | |
Method Name | Brief Description | Advantages | Disadvantages |
---|---|---|---|
Recurrent neural network (RNN) [58] | Processes sequential data to capture temporal dependencies in wind speed. | Handles time series data well, captures temporal patterns. | Susceptible to vanishing/exploding gradients, slow training. |
Convolutional neural network (CNN) [3] | Uses convolutional layers to extract spatial features from wind speed data. | Good at processing spatial data, captures local features effectively. | Less effective for dynamic time series data. |
Support vector machine (SVM) [59] | Employs statistical learning theory to find optimal hyperplanes for wind speed prediction. | Effective with small datasets, suitable for high-dimensional data. | Sensitive to parameter selection, high computational complexity. |
Backpropagation neural network (BPNN) [60] | Trained via backpropagation for nonlinear wind speed approximation. | Simple structure, easy to implement. | Prone to local optima, long training times. |
Extreme learning machine (ELM) [61] | Fast training algorithm for single-hidden-layer feedforward neural networks. | Rapid training, good generalization. | Sensitive to parameter selection, risk of overfitting. |
Elman neural network (ENN) [62] | Recurrent network with context layers for dynamic wind speed prediction. | Processes dynamic systems, has memory function. | Complex training process, risk of overfitting. |
Adaptive wavelet neural network (AWNN) [63] | Integrates wavelet transform with neural networks for nonlinear wind speed processing. | Handles nonlinear signals, self-adaptive capability. | High model complexity, long training times. |
Long short-term memory (LSTM) [64] | Modified RNN architecture that captures long-term dependencies in wind speed. | Captures long-term patterns, mitigates vanishing gradient issues. | Longer training times, high model complexity. |
Artificial neural network combined with autoregressive integrated moving average (ANN-ARIMA) [65] | Integrates neural networks with time series models for wind speed prediction. | Reduces nonlinearity in wind speed sequences. | Complex model structure, sensitive to parameter selection. |
Model Type | Reference | Description | Advantages | Disadvantages |
---|---|---|---|---|
PSO (Particle Swarm Optimization) | [75] | Simulates bird foraging behavior through group cooperation and information sharing for optimization. | Simple implementation, few parameters, good scalability. | Prone to local optima, poor performance on high-dimensional problems. |
DE (Differential Evolution Algorithm) | [76] | A global optimization algorithm based on differential operations, suitable for continuous parameter optimization. | Simple and effective, suitable for high-dimensional problems, easy to parallelize. | May converge to local optima, sensitive to parameter settings. |
GA (Genetic Algorithm) | [77] | Simulates natural evolution processes through selection, crossover, and mutation operations for optimization. | Good global search ability, suitable for non-convex and discrete problems. | High computational complexity, convergence speed may be slow. |
NGO (Northern Goshawk Optimization) | [78] | Simulates the hunting behavior of northern goshawks, balancing exploration and exploitation. | Strong optimization ability, can effectively avoid local optima. | High computational cost for high-dimensional problems, complex parameter settings. |
WOA (Whale Optimization Algorithm) | [79] | Simulates the bubble net hunting behavior of whales. | Easy to implement, strong search ability. | Local search ability is insufficient, sensitive to parameter settings. |
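As a concrete instance of the population-based search pattern all five algorithms above share, here is a minimal particle swarm optimization sketch (row one of the table). The inertia and acceleration coefficients (0.7 and 1.5) are common textbook defaults rather than values from this study, and `pso` is an illustrative name:

```python
import numpy as np

def pso(f, dim, n_particles=20, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Minimal PSO: each particle blends its own best and the swarm's best."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()      # global best (information sharing)
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lb, ub)
        val = np.apply_along_axis(f, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())
```

Metaheuristics such as NRBO follow the same initialize–update–select loop with a different update rule (the Newton–Raphson search rule and trap avoidance operator of Section 2.2).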
Module | Key Methods | Function |
---|---|---|
Data preprocessing | IQR method, adaptive moving average filter | Removes outliers, smooths the series, and ensures high-quality input |
Transformer prediction | NRBO optimization, self-attention mechanism | Extracts temporal features and provides initial predictions |
Error compensation | XGBoost residual modeling | Learns residual patterns and improves prediction accuracy |
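The error-compensation pattern in the last row can be illustrated with a dependency-free stand-in. The paper uses XGBoost as the residual learner; since only the train-on-residuals, add-back-at-prediction pattern matters for the sketch, an ordinary least-squares model plays that role here, and all function names are illustrative:

```python
import numpy as np

def fit_linear(X, y):
    """Least-squares stand-in for the XGBoost residual learner."""
    Xb = np.hstack([X, np.ones((len(X), 1))])     # add intercept column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_linear(w, X):
    return np.hstack([X, np.ones((len(X), 1))]) @ w

def residual_corrected_forecast(X_tr, y_tr, base_pred_tr, X_te, base_pred_te):
    """Learn the base model's residuals on training data, add them back at test time."""
    residuals = y_tr - base_pred_tr               # what the base model got wrong
    w = fit_linear(X_tr, residuals)
    return base_pred_te + predict_linear(w, X_te)
```

Swapping `fit_linear`/`predict_linear` for an XGBoost regressor recovers the framework's actual error-compensation module.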
Dataset | Processing Method | Mean | Median | Std | Skewness | Kurtosis | Min | Max | Optimal Window
---|---|---|---|---|---|---|---|---|---
Case 1 | Raw | 5.4455 | 5.3500 | 2.7438 | 0.4403 | 3.3986 | 0.0200 | 14.0300 | -
Case 1 | IQR | 5.3994 | 5.3500 | 2.6305 | 0.2205 | 2.9122 | 0.0200 | 11.2762 | -
Case 1 | AMAF | 5.3994 | 5.3500 | 2.5905 | 0.2451 | 2.9376 | 0.0200 | 11.2762 | 3
Case 2 | Raw | 4.5150 | 4.7000 | 1.7975 | −0.2607 | 2.7650 | 0.0200 | 9.1600 | -
Case 2 | IQR | 4.5150 | 4.7000 | 1.7975 | −0.2607 | 2.7650 | 0.0200 | 9.1600 | -
Case 2 | AMAF | 4.5149 | 4.6983 | 1.7643 | −0.2512 | 2.7510 | 0.0600 | 8.8300 | 3
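The summary statistics reported in the table can be computed as below. It is an assumption that population moments and non-excess kurtosis are used (a normal distribution then scores ≈ 3, which matches the magnitudes reported); `series_stats` is an illustrative name:

```python
import numpy as np

def series_stats(x):
    """Mean, median, std, skewness, and (non-excess) kurtosis of a series."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()              # population standard deviation
    z = (x - m) / s
    return {
        "mean": float(m),
        "median": float(np.median(x)),
        "std": float(s),
        "skewness": float((z ** 3).mean()),   # Fisher-Pearson moment skewness
        "kurtosis": float((z ** 4).mean()),   # normal distribution gives ~3
    }
```

The drop in skewness and kurtosis from Raw to IQR in Case 1 reflects the removal of the extreme values above the upper fence.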
Model Name | Model Description |
---|---|
BP neural network [92] | A feedforward neural network trained through backpropagation, capable of learning complex nonlinear relationships between inputs and outputs. |
XGBoost [93] | An ensemble learning algorithm based on gradient boosting that combines multiple weak learners to improve predictive performance. |
Transformer [94] | Utilizes a self-attention mechanism to capture long-range dependencies in sequential data. |
Informer [95] | Employs the ProbSparse self-attention mechanism and self-attention distillation to reduce computational complexity. |
LSTM [96] | A special type of recurrent neural network that uses gating mechanisms to effectively address the vanishing gradient problem in traditional RNNs. |
NRBO–Transformer | Combines the Transformer model with the NRBO optimization algorithm, optimizing network structure and parameters to further enhance long sequence processing capabilities. |
XGBoost–Transformer | Combines the error compensation value predicted by the XGBoost model with the predicted values from the Transformer model to obtain the final wind speed forecasting results. |
Model | Hyperparameter Settings |
---|---|
BP neural network | Hidden layer nodes: 11; initial learning rate: 0.001; epochs: 100; min_grad: 1 × 10⁻⁶. |
XGBoost | Estimators: 600; maximum depth: 4; minimum child weight: 1; initial learning rate: 0.002; epochs: 100. |
Transformer | Heads: 8; dimension (dm): 128; hidden neurons: 64; dropout rate: 0.1; initial learning rate: 0.001; epochs: 100; encoder–decoders: 1. |
Informer | Heads: 4; dimension (dm): 128; hidden neurons: 128; dropout rate: 0.1; initial learning rate: 0.002; epochs: 100; encoder–decoders: 2. |
LSTM | Hidden size: 128; number of layers: 2; dropout rate: 0.1; initial learning rate: 0.002; epochs: 100. |
NRBO–Transformer | Heads, L2 regularization, learning rate: to be optimized; for other settings, refer to Transformer. |
XGBoost–Transformer | Refer to the settings of XGBoost and Transformer. |
Case | Parameter | Value |
---|---|---|
Case 1 | lr | 0.00578 |
Case 1 | numHeads | 8 |
Case 1 | l2 | 0.0001002 |
Case 2 | lr | 0.00629 |
Case 2 | numHeads | 3 |
Case 2 | l2 | 0.0001000 |
Case | Method | MAPE/% | RMSE |
---|---|---|---|
Case 1 | BP | 52.52 | 1.2001 |
Case 1 | XGBoost | 26.36 | 0.7223 |
Case 1 | Transformer | 20.31 | 0.6883 |
Case 1 | Informer | 18.77 | 0.6198 |
Case 1 | LSTM | 19.90 | 0.6815 |
Case 1 | NRBO–Transformer | 14.89 | 0.4518 |
Case 1 | XGBoost–Transformer | 13.82 | 0.4094 |
Case 1 | Ours | 11.24 | 0.2551 |
Case 2 | BP | 13.27 | 0.6612 |
Case 2 | XGBoost | 16.39 | 0.5891 |
Case 2 | Transformer | 11.88 | 0.5940 |
Case 2 | Informer | 12.47 | 0.7239 |
Case 2 | LSTM | 23.29 | 0.7017 |
Case 2 | NRBO–Transformer | 6.77 | 0.3486 |
Case 2 | XGBoost–Transformer | 11.05 | 0.5845 |
Case 2 | Ours | 4.90 | 0.2976 |
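The MAPE and RMSE values reported throughout these tables follow their standard definitions. A minimal sketch, with made-up actual and predicted values purely for illustration:

```python
import math

def mape(actual, pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root-mean-square error, in the same units as the data (m/s here)."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

# Illustrative wind-speed samples (not from the paper's datasets).
actual = [5.2, 4.8, 6.1, 5.5]
pred = [5.0, 5.0, 5.9, 5.6]
```

Note that MAPE is undefined where the actual value is zero, which is why wind-speed studies typically evaluate it only on nonzero observations.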
Method | Dataset | Single-Step MAPE/% | Single-Step RMSE | Two-Step MAPE/% | Two-Step RMSE | Three-Step MAPE/% | Three-Step RMSE |
---|---|---|---|---|---|---|---|
BP neural network | Case 1 | 32.79 | 0.5413 | 42.67 | 0.9613 | 52.52 | 1.2001 |
BP neural network | Case 2 | 8.33 | 0.5209 | 10.90 | 0.5933 | 13.27 | 0.6612 |
XGBoost | Case 1 | 15.64 | 0.3238 | 23.26 | 0.6427 | 26.36 | 0.7223 |
XGBoost | Case 2 | 11.28 | 0.3756 | 14.45 | 0.5242 | 16.39 | 0.5891 |
Transformer | Case 1 | 10.65 | 0.3267 | 15.42 | 0.5075 | 20.31 | 0.6883 |
Transformer | Case 2 | 8.92 | 0.4699 | 9.91 | 0.5592 | 11.88 | 0.5940 |
Informer | Case 1 | 9.03 | 0.3618 | 14.56 | 0.4489 | 18.77 | 0.6198 |
Informer | Case 2 | 9.45 | 0.3637 | 10.29 | 0.5624 | 12.47 | 0.7239 |
LSTM | Case 1 | 10.36 | 0.2090 | 15.13 | 0.4954 | 19.90 | 0.6815 |
LSTM | Case 2 | 9.58 | 0.5514 | 17.30 | 0.6240 | 23.29 | 0.7017 |
NRBO–Transformer | Case 1 | 7.57 | 0.2320 | 11.20 | 0.3354 | 14.89 | 0.4518 |
NRBO–Transformer | Case 2 | 5.13 | 0.2685 | 5.49 | 0.2806 | 6.77 | 0.3486 |
XGBoost–Transformer | Case 1 | 6.27 | 0.2281 | 10.26 | 0.3049 | 13.82 | 0.4094 |
XGBoost–Transformer | Case 2 | 8.86 | 0.3168 | 9.95 | 0.4786 | 11.05 | 0.5845 |
Ours | Case 1 | 6.73 | 0.1155 | 8.57 | 0.1929 | 11.24 | 0.2551 |
Ours | Case 2 | 2.32 | 0.2173 | 3.56 | 0.2352 | 4.90 | 0.2976 |
Model | Step | Case 1 t (MAPE/%) | Case 1 t (RMSE) | Case 2 t (MAPE/%) | Case 2 t (RMSE) |
---|---|---|---|---|---|
BPNN | Single-Step | −21.515 | −30.308 | −21.145 | −18.506 |
BPNN | Two-Step | −29.882 | −32.474 | −22.82 | −23.136 |
BPNN | Three-Step | −29.113 | −27.094 | −22.105 | −22.518 |
XGBoost | Single-Step | −21.065 | −24.782 | −33.52 | −13.109 |
XGBoost | Two-Step | −24.262 | −25.211 | −29.676 | −19.603 |
XGBoost | Three-Step | −19.64 | −20.571 | −22.683 | −14.869 |
Transformer | Single-Step | −10.108 | −27.52 | −22.212 | −18.98 |
Transformer | Two-Step | −13.666 | −23.255 | −21.922 | −23.012 |
Transformer | Three-Step | −21.09 | −26.422 | −21.991 | −17.62 |
Informer | Single-Step | −2.77 | −16.783 | −27.757 | −17.638 |
Informer | Two-Step | −6.409 | −13.239 | −26.843 | −18.815 |
Informer | Three-Step | −7.258 | −14.299 | −26.921 | −22.209 |
NRBO–Transformer | Single-Step | −9.827 | −14.724 | −29.412 | −11.162 |
NRBO–Transformer | Two-Step | −15.23 | −19.183 | −27.564 | −20.168 |
NRBO–Transformer | Three-Step | −13.33 | −20.62 | −20.724 | −19.332 |
LSTM | Single-Step | −6.827 | −23.465 | −14.914 | −4.69 |
LSTM | Two-Step | −11.793 | −18.7 | −10.084 | −3.979 |
LSTM | Three-Step | −13.61 | −18.691 | −9.547 | −4.068 |
XGBoost–Transformer | Single-Step | −1.533 | −15.683 | −32.052 | −8.511 |
XGBoost–Transformer | Two-Step | −4.759 | −11.455 | −20.995 | −16.95 |
XGBoost–Transformer | Three-Step | −5.85 | −12.849 | −20.35 | −18.059 |
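The negative t values above indicate that the proposed model's errors are significantly lower than each baseline's. A paired t statistic of this kind can be computed as the mean of the per-run differences divided by its standard error; the per-run error values below are illustrative assumptions, not the paper's data.

```python
import math

def paired_t(x, y):
    """t statistic for paired samples: mean difference over its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance of differences
    return mean_d / math.sqrt(var_d / n)

# Illustrative per-run RMSE of two models over 5 repeated runs (assumed values).
ours = [0.25, 0.27, 0.24, 0.26, 0.25]
base = [0.45, 0.47, 0.44, 0.46, 0.43]
t = paired_t(ours, base)  # strongly negative: "ours" has the lower errors
```

The statistic would then be compared against the Student's t distribution with n − 1 degrees of freedom to obtain a p-value (e.g., via `scipy.stats.ttest_rel`).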
Model | Single-Step MAPE/% | Single-Step RMSE | Two-Step MAPE/% | Two-Step RMSE | Three-Step MAPE/% | Three-Step RMSE |
---|---|---|---|---|---|---|
BPNN | (−28.605, −23.515) | (−0.455, −0.396) | (−36.497, −31.703) | (−0.818, −0.719) | (−44.259, −38.301) | (−1.018, −0.872) |
XGBoost | (−9.799, −8.021) | (−0.226, −0.191) | (−15.962, −13.418) | (−0.487, −0.412) | (−16.737, −13.503) | (−0.515, −0.419) |
Transformer | (−4.735, −3.105) | (−0.227, −0.195) | (−7.903, −5.797) | (−0.343, −0.286) | (−9.973, −8.166) | (−0.468, −0.399) |
Informer | (−1.477, −0.203) | (−0.131, −0.102) | (−3.492, −1.768) | (−0.165, −0.12) | (−4.707, −2.593) | (−0.226, −0.168) |
NRBO–Transformer | (−4.406, −2.854) | (−0.107, −0.08) | (−7.465, −5.655) | (−0.336, −0.269) | (−10.025, −7.295) | (−0.47, −0.383) |
LSTM | (−3.008, −1.592) | (−0.268, −0.224) | (−7.057, −4.923) | (−0.285, −0.227) | (−8.692, −6.368) | (−0.406, −0.324) |
XGBoost–Transformer | (−1.17, −0.09) | (−0.128, −0.098) | (−2.436, −0.944) | (−0.133, −0.091) | (−3.506, −1.653) | (−0.18, −0.129) |
Model | Single-Step MAPE/% | Single-Step RMSE | Two-Step MAPE/% | Two-Step RMSE | Three-Step MAPE/% | Three-Step RMSE |
---|---|---|---|---|---|---|
BPNN | (−6.607, −5.413) | (−0.338, −0.269) | (−8.016, −6.664) | (−0.391, −0.326) | (−9.166, −7.574) | (−0.398, −0.33) |
XGBoost | (−9.522, −8.398) | (−0.184, −0.133) | (−11.661, −10.119) | (−0.32, −0.258) | (−12.554, −10.426) | (−0.333, −0.25) |
Transformer | (−7.224, −5.976) | (−0.281, −0.225) | (−6.959, −5.741) | (−0.354, −0.294) | (−7.647, −6.313) | (−0.332, −0.261) |
LSTM | (−7.81, −6.711) | (−0.374, −0.294) | (−14.815, −12.665) | (−0.432, −0.345) | (−19.825, −16.955) | (−0.442, −0.366) |
Informer | (−7.639, −6.621) | (−0.174, −0.119) | (−7.243, −6.217) | (−0.361, −0.293) | (−8.337, −6.803) | (−0.473, −0.38) |
NRBO–Transformer | (−3.206, −2.414) | (−0.074, −0.028) | (−2.332, −1.528) | (−0.069, −0.021) | (−2.282, −1.459) | (−0.077, −0.025) |
XGBoost–Transformer | (−6.969, −6.111) | (−0.124, −0.075) | (−7.029, −5.751) | (−0.274, −0.213) | (−6.785, −5.515) | (−0.32, −0.254) |
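Intervals of the form (lower, upper), as in the two tables above, can be produced as t-based 95% confidence intervals for the mean paired difference in a metric. A minimal sketch; the critical value 2.776 assumes 5 paired runs (df = 4), and the difference values are illustrative, not the paper's:

```python
import math

def ci95_mean_diff(diffs, t_crit=2.776):
    """Two-sided 95% CI for the mean of paired differences.

    t_crit is the 0.975 quantile of Student's t; 2.776 corresponds to
    df = 4 (5 paired runs), an assumption made for this sketch.
    """
    n = len(diffs)
    mean = sum(diffs) / n
    se = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1) / n)
    return (mean - t_crit * se, mean + t_crit * se)

# Illustrative per-run RMSE differences (ours minus baseline).
diffs = [-0.20, -0.20, -0.20, -0.20, -0.18]
lo, hi = ci95_mean_diff(diffs)  # interval entirely below zero
```

An interval that excludes zero, as every row above does, indicates a statistically significant improvement at the 5% level.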
Case | lr | numHeads | l2 | MAPE (%) | RMSE |
---|---|---|---|---|---|
Opt-1 | 0.00578 | 8 | 0.0001002 | 11.24 | 0.2551 |
Var-1 | 0.00450 | 6 | 0.0010000 | 13.39 | 0.3172 |
Var-2 | 0.00720 | 2 | 0.0000500 | 15.02 | 0.3604 |
Var-3 | 0.00900 | 4 | 0.0005000 | 14.85 | 0.3491 |
Case | lr | numHeads | l2 | MAPE (%) | RMSE |
---|---|---|---|---|---|
Opt-2 | 0.00629 | 3 | 0.0001000 | 4.90 | 0.2976 |
Var-1 | 0.00500 | 5 | 0.0000500 | 6.67 | 0.3457 |
Var-2 | 0.00750 | 2 | 0.0010000 | 7.89 | 0.3842 |
Var-3 | 0.00820 | 4 | 0.0003000 | 6.21 | 0.3525 |
Dataset | Noise Std (σ) | Description | MAPE (%) | RMSE |
---|---|---|---|---|
Case 1 | 0 | Clean input | 11.24 | 0.2551 |
Case 1 | 0.01 | Slight noise | 11.63 | 0.2640 |
Case 1 | 0.05 | Moderate noise | 12.87 | 0.2873 |
Case 1 | 0.10 | Strong noise | 15.42 | 0.3415 |
Case 1 | 0.20 | Severe noise | 22.13 | 0.4812 |
Case 2 | 0 | Clean input | 4.90 | 0.2976 |
Case 2 | 0.01 | Slight noise | 5.12 | 0.3028 |
Case 2 | 0.05 | Moderate noise | 5.78 | 0.3260 |
Case 2 | 0.10 | Strong noise | 8.94 | 0.3917 |
Case 2 | 0.20 | Severe noise | 14.28 | 0.5526 |
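The perturbation protocol implied by the table adds zero-mean Gaussian noise with the listed standard deviations to the model inputs and re-evaluates the metrics. A minimal sketch, in which the input series and the re-evaluation step are placeholders, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

def perturb(x, sigma):
    """Add zero-mean Gaussian noise with standard deviation sigma."""
    return x + rng.normal(0.0, sigma, size=x.shape)

# Stand-in for a normalized wind-speed input series.
x = np.linspace(4.0, 8.0, 500)

for sigma in (0.01, 0.05, 0.10, 0.20):  # noise levels used in the table
    x_noisy = perturb(x, sigma)
    # ...feed x_noisy to the trained model and re-compute MAPE/RMSE here...
```

Graceful degradation under increasing σ, as the table shows, suggests the model is not overfitting to fine-grained noise in the training inputs.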
Model | Case 1 Training Time (s/Epoch) | Case 1 Inference Time (ms/Instance) | Case 2 Training Time (s/Epoch) | Case 2 Inference Time (ms/Instance) |
---|---|---|---|---|
BP Neural Network | 0.41 | 0.32 | 0.46 | 0.37 |
XGBoost | 0.58 | 0.24 | 0.58 | 0.28 |
LSTM | 1.25 | 0.47 | 1.29 | 0.48 |
Transformer | 1.38 | 0.35 | 1.30 | 0.31 |
Informer | 1.65 | 0.42 | 1.74 | 0.49 |
NRBO–Transformer | 2.17 | 0.38 | 2.13 | 0.35 |
XGBoost–Transformer | 1.82 | 0.36 | 1.66 | 0.30 |
Ours | 2.35 | 0.41 | 2.29 | 0.38 |
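Per-epoch training times and per-instance inference latencies like those tabulated above can be measured with a monotonic wall-clock timer averaged over repeated calls. A minimal sketch; `fake_infer` is a placeholder for a real model's predict call, not part of the paper's code:

```python
import time

def time_per_call(fn, n_calls=100):
    """Average wall-clock time of fn over n_calls, in seconds."""
    start = time.perf_counter()
    for _ in range(n_calls):
        fn()
    return (time.perf_counter() - start) / n_calls

def fake_infer():
    # Placeholder workload standing in for one model inference.
    sum(i * i for i in range(1000))

avg_s = time_per_call(fake_infer)
avg_ms = avg_s * 1000.0  # the table reports inference time in ms per instance
```

Averaging over many calls (and, in practice, discarding a few warm-up calls) smooths out scheduler jitter and one-time costs such as JIT compilation or cache warm-up.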
Hou, Z.; Liu, J.; Shao, Z.; Ma, Q.; Liu, W. Machine Learning Innovations in Renewable Energy Systems with Integrated NRBO-TXAD for Enhanced Wind Speed Forecasting Accuracy. Electronics 2025, 14, 2329. https://doi.org/10.3390/electronics14122329