Prediction of Blade Root Loads for Wind Turbine Based on RBMO-VMD and TCN-BiLSTM-Attention
Abstract
1. Introduction
2. Basic Principle
2.1. Blade Element Theory
2.2. Red-Billed Blue Magpie Optimizer Algorithm
2.2.1. Population Initialization
2.2.2. Foraging Phase
2.2.3. Attacking Prey
2.2.4. Food Storage
2.3. VMD Decomposition Principle
3. TCN-BiLSTM-Attention Model
3.1. Temporal Convolutional Network
3.2. Bidirectional Long Short-Term Memory Network
3.3. Attention Mechanism
3.4. TCN-BiLSTM-Attention Model Construction
4. VMD-TCN-BiLSTM-Attention Prediction Model Optimized by RBMO Algorithm
4.1. VMD Optimized Based on RBMO Algorithm
- (1) Initialization: Initialize the population using Equations (5) and (6). Set the population size n = 40 and dimension dim = 2, where j = 1 and j = 2 represent K and α, respectively. Set the maximum iteration count T = 60, with K ranging over [4, 10] and α over [100, 3000];
- (2) Fitness evaluation: Calculate the sample entropy corresponding to each individual;
- (3) Foraging phase: Each individual adjusts its position based on the difference between the average position of the reference group and the position of a randomly selected individual within the population; that is, the search area is updated according to Equations (7) and (8);
- (4) Prey attack phase: The swarm detects small prey (local optima) and conducts a fine-grained search within a small radius centered on the prey. Upon discovering large prey (a potential global optimum), the swarm concentrates its efforts on an intensive search around it. Specifically, the values of K and α are updated according to Equations (9) and (10);
- (5) Food storage phase: Retain the parameter combination with the better fitness value according to Equation (11);
- (6) If the iteration termination condition is not met, continue iterating; otherwise, output the optimal solution for K and α.
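As a compact illustration of steps (1)–(6), the loop below sketches the RBMO search over (K, α) within the stated ranges. The true fitness (sample entropy of the VMD decomposition) is replaced by a hypothetical smooth stand-in so the sketch is self-contained, and the update rules are simplified paraphrases of Equations (7)–(11), not their exact forms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Search ranges from Section 4.1: K in [4, 10], alpha in [100, 3000].
LB = np.array([4.0, 100.0])
UB = np.array([10.0, 3000.0])
n, dim, T = 40, 2, 60

def fitness(pos):
    """Stand-in for the true objective (sample entropy of the VMD result);
    a smooth toy function is used here so the sketch runs on its own."""
    k, alpha = round(pos[0]), pos[1]
    return (k - 6) ** 2 + ((alpha - 1500.0) / 1000.0) ** 2

# (1) Initialization: uniform population inside the bounds.
pop = LB + rng.random((n, dim)) * (UB - LB)
fit = np.array([fitness(p) for p in pop])

for t in range(T):
    best = pop[fit.argmin()]
    for i in range(n):
        if rng.random() < 0.5:
            # (3) Foraging: move using the mean of a random subgroup and a
            # randomly chosen individual (paraphrase of Eqs. (7)-(8)).
            group = pop[rng.choice(n, size=rng.integers(2, 6), replace=False)]
            cand = pop[i] + rng.random(dim) * (group.mean(axis=0) - pop[rng.integers(n)])
        else:
            # (4) Attacking prey: fine-grained search toward the current
            # best solution (paraphrase of Eqs. (9)-(10)).
            cand = pop[i] + rng.standard_normal(dim) * (best - pop[i])
        cand = np.clip(cand, LB, UB)
        # (5) Food storage: keep the candidate only if its fitness is
        # better than the stored position (Eq. (11)).
        f = fitness(cand)
        if f < fit[i]:
            pop[i], fit[i] = cand, f

best_pos = pop[fit.argmin()]
K_opt, alpha_opt = round(best_pos[0]), float(best_pos[1])
print(K_opt, round(alpha_opt, 1))
```

In the actual model, `fitness(pos)` would run VMD with the candidate (K, α) and return the sample entropy of the decomposition.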
4.2. RBMO Optimization of Hyperparameters for the TCN-BiLSTM-Attention Model
- (1) Initialize the population using Equations (5) and (6), with a population size n = 60 and maximum iteration count T = 100. Set the range for each hyperparameter; detailed values are shown in Table 1;
- (2) Design the fitness function;
- (3) Foraging phase: Exploration is conducted using population-level social information. The position of each individual within the search population is updated according to Equations (7) and (8), thereby updating the search area;
- (4) Prey attack phase: Based on Equations (9) and (10), perform a fine-grained search around the current optimal solution and update the model hyperparameters;
- (5) Model training: Construct the corresponding TCN-BiLSTM-Attention model for each set of hyperparameters, evaluate it on the validation set, and recalculate the individual fitness values;
- (6) Food storage phase: According to Equation (11), compare the fitness value of the new position with that of the previous update and retain the better one;
- (7) If the iteration termination condition is not met, continue iterating; otherwise, output the optimal hyperparameter solution.
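One practical detail of step (1) is mapping a continuous RBMO position vector to the mixed integer/continuous search space of Table 1. A minimal sketch of such a decoder follows; the parameter names and the [0, 1] position encoding are illustrative assumptions, not from the paper:

```python
# Search ranges from Table 1; the third tuple element marks the
# dimensions that must be rounded to integers.
BOUNDS = {
    "tcn_layers":    (3, 6, int),
    "kernel_size":   (3, 7, int),
    "bilstm_layers": (1, 3, int),
    "hidden_units":  (64, 256, int),
    "attn_dim":      (64, 256, int),
    "batch_size":    (32, 128, int),
    "lr":            (1e-4, 1e-3, float),
    "dropout":       (0.2, 0.5, float),
}

def decode(position):
    """Map a position in [0, 1]^8 to a concrete hyperparameter dict,
    clipping out-of-range values and rounding the integer dimensions."""
    params = {}
    for x, (name, (lo, hi, kind)) in zip(position, BOUNDS.items()):
        v = lo + min(max(x, 0.0), 1.0) * (hi - lo)
        params[name] = int(round(v)) if kind is int else float(v)
    return params

print(decode([0.0, 1.0, 0.5, 0.5, 0.5, 0.25, 0.0, 1.0]))
```

Each decoded dictionary then parameterizes one candidate TCN-BiLSTM-Attention model in step (5).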
4.3. Constructing Prediction Models
- (1) Data preprocessing (including data cleaning and normalization);
- (2) Determine the K and α values for VMD using the RBMO algorithm;
- (3) Employ the VMD algorithm to decompose the non-stationary sequence into multiple stationary sub-sequences;
- (4) Combine each sub-sequence with environmental and operational state parameters (wind speed, rotor speed, pitch angle, etc.) to form the input components;
- (5) Divide the data into training, validation, and test sets;
- (6) Determine the optimal hyperparameter combination for each component of the TCN-BiLSTM-Attention prediction model using the RBMO algorithm (training the model on the training set and evaluating performance on the validation set);
- (7) Feed each component into the TCN-BiLSTM-Attention model for prediction;
- (8) Sum and reconstruct the predicted values of all components, then apply inverse normalization to obtain the final predicted blade root load of the wind turbine.
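The decompose–predict–reconstruct wiring of the steps above can be sketched as follows. The VMD step and the trained TCN-BiLSTM-Attention predictors are replaced by stand-ins (random components that sum to the signal, and a persistence forecast), so only the normalize, sum, and inverse-normalize bookkeeping is exercised:

```python
import numpy as np

def minmax_fit(x):
    return x.min(), x.max()

def minmax_apply(x, lo, hi):
    return (x - lo) / (hi - lo)

def minmax_invert(x, lo, hi):
    return x * (hi - lo) + lo

rng = np.random.default_rng(1)
load = rng.normal(13400.0, 1800.0, size=200)   # synthetic root-load series
lo, hi = minmax_fit(load)
norm = minmax_apply(load, lo, hi)

# Stand-in for VMD: K random components that sum to the normalized signal.
K = 3
weights = rng.dirichlet(np.ones(K), size=200).T   # shape (K, 200)
components = weights * norm

def predict_component(c):
    # Stand-in one-step "prediction": persistence of the last value.
    return c[-1]

# Steps (7)-(8): predict each component, sum, then inverse-normalize.
pred_norm = sum(predict_component(c) for c in components)
pred = minmax_invert(pred_norm, lo, hi)
print(round(float(pred), 1))
```

With the persistence stand-in, the reconstructed prediction equals the last observed load value, which confirms that normalization, component summation, and inverse normalization cancel correctly.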
4.4. Evaluation Indicators for the Model
4.5. Model Parameter Settings
5. Method Validation
5.1. Data Selection
5.2. Data Cleansing
5.3. Data Normalization
5.4. Sliding Window Design
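A sliding window converts the load time series into supervised input–output pairs for the model. A minimal sketch, assuming a single-step-ahead target (the window length and horizon here are illustrative, not the paper's settings):

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Build (X, y) pairs with a sliding window: each input is `window`
    consecutive steps, and the target is the value `horizon` steps after
    the window ends."""
    X, y = [], []
    for i in range(len(series) - window - horizon + 1):
        X.append(series[i:i + window])
        y.append(series[i + window + horizon - 1])
    return np.array(X), np.array(y)

data = np.arange(10.0)
X, y = make_windows(data, window=4)
print(X.shape, y[:3])
```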
5.5. Data Pre-Processing
5.6. Experimental Results and Analysis
5.6.1. Analysis of Prediction Results After Incorporating VMD
5.6.2. Analysis of Prediction Results Incorporating Different Intelligent Optimization Algorithms
5.6.3. Comparison with Other Models
5.6.4. Ablation Experiment
5.7. Statistical Analysis
6. Conclusions
- (1) The RBMO-VMD-TCN-BiLSTM-Attention model demonstrates superior performance in predicting blade root loads for wind turbines compared with single models such as ELM, BiGRU, BiLSTM, Transformer, and Informer.
- (2) The VMD-TCN-BiLSTM-Attention model outperforms the TCN-BiLSTM-Attention model. This shows that the VMD algorithm can overcome the high volatility and non-stationarity of the original wind turbine sequence, helping the prediction model extract the hidden information in the data and thereby improving the prediction accuracy of the blade root load.
- (3) To further enhance prediction accuracy, the RBMO algorithm was employed to optimize the parameters of the VMD and the hyperparameters of the TCN-BiLSTM-Attention model. Experiments demonstrate that the resulting combined RBMO-VMD and TCN-BiLSTM-Attention prediction model achieves a significant improvement in prediction accuracy.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Xiao, Z.; Cao, Z.H.; Deng, J.W.; Duan, S.Y.; Zhao, Q.C.; Dai, J.C.; Tao, J. A wind turbine load prediction method integrating fibre optic sensing data and gated recurrent neural network. J. Mech. Eng. 2025, 61, 272–282.
- Chen, Z.H.; Huang, S.; Hu, Q.; Chen, F.; Han, G.W.; Han, W.F.; Pai, Y.F. A GIS Isolator Contact Status Assessment Method Based on Weighted Naive Bayes. High Volt. Appar. 2025, 61, 50–57.
- Manzali, Y.; Barry, K.A.; Flouchi, R.; Balouki, Y.; Elfar, M. A feature weighted K-nearest neighbor algorithm based on association rules. J. Ambient. Intell. Humaniz. Comput. 2024, 15, 2995–3008.
- Chen, S.C.; Huang, L.J.; Pan, Y.J.; Hu, Y.C.; Shen, D.L.; Dai, J.G. Decision tree-based prediction approach for improving stable energy management in smart grids. J. High Speed Netw. 2023, 29, 295–305.
- Qin, B.; Yi, H.Y.; Wang, X. Modelling of blade root load identification for wind turbines based on extreme learning machine. Vib. Shock. 2018, 37, 257–262.
- Xu, Y.; Cai, A.M.; Zhang, L.W.; Lin, W.R.; Li, C.; Li, S.Q. Wind turbine load prediction and analysis based on BP neural network and multi-factor weighting method. Therm. Power Gener. 2022, 51, 42–49.
- Han, L.; Wang, X.; Yu, Y.; Wang, D. Power load forecast based on CS-LSTM neural network. Mathematics 2024, 12, 1402.
- Song, G.J.; Wang, L.; Zhang, P.C.; Zhang, C.Q.; Bao, Z.Z. Short-Term Load Forecasting Based on Dual-Channel Feature Extraction VMD-CNN-BiLSTM. Trans. Electr. Power Syst. Autom. 2025, 1–14.
- Wu, K.T.; Peng, X.G.; Chen, Z.W.; Su, H.K.; Quan, H.; Liu, H.Y. A novel short-term household load forecasting method combining BiLSTM with trend feature extraction. Energy Rep. 2023, 9, 1013–1022.
- Wang, D.; Li, S.; Fu, X. Short-term power load forecasting based on secondary cleaning and CNN-BILSTM-Attention. Energies 2024, 17, 4142.
- Qian, C.; Liu, Y.C.; Li, H.X.; Chen, L.J.; Chen, J.X. Research on Highway Tunnel Structural Condition Prediction Method Based on HO–CNN–BiLSTM. J. Saf. Environ. 2025, 1–12.
- Laitsos, V.; Vontzos, G.; Bargiotas, D.; Daskalopulu, A.; Tsoukalas, L.H. Enhanced Automated Deep Learning Application for Short-Term Load Forecasting. Mathematics 2023, 11, 2912.
- Li, X.J.; Zhang, Y.; Wang, Z.W.; Song, Z.Y. Short-Term Wind Power Prediction Based on Optimized VMD and LSTM. Energy Eng. 2025, 122, 4603–4619.
- Fu, X. A Study on Multi-Feature Short-Term Electricity Load Forecasting Based on VMD and TCN-KAN. Master's Thesis, Harbin Institute of Technology, Harbin, China, 2025.
- Zhang, Y.Q.; Cheng, Q.Z.; Jiang, W.J.; Liu, X.F.; Shen, L.; Chen, Z.H. An EMD-PCA-LSTM-Based Photovoltaic Power Forecasting Model. Chin. J. Sol. Energy 2021, 42, 62–69.
- Shen, H.B.; Wang, L.Z.; Deng, L.Y.; Cheng, X.L.; Wu, H.J. Short-Term Wind Power Portfolio Forecasting Model Based on CEEMDAN-PCA-BiLSTM-LSTNet. Renew. Energy 2025, 43, 902–910.
- Zhang, Y.; Liu, K.; Qin, L. Deterministic and probabilistic interval prediction for short-term wind power generation based on variational mode decomposition and machine learning methods. Energy Convers. Manag. 2016, 112, 208–219.
- Shi, P.Z.; Wei, X.; Zhang, C.M.; Xie, L.R.; Ye, J.H.; Yang, J.L. Short-time wind power prediction based on VMD-BOA-LSSVM-AdaBoost. Acta Energiae Solaris Sin. 2014, 45, 226–233.
- Liu, H.M.; Tang, X.X.; Zhang, Z.K.; Li, Y. Research on independent variable pitch control strategy based on joint feedback of azimuth and load. Chin. J. Electr. Eng. 2016, 36, 3798–3806.
- Liu, H.; Zhou, B.; Sun, X.M.; Zhou, H. Modelling and dynamic characterization of large-scale constant-frequency variable-speed wind turbine blade. Renew. Energy 2014, 32, 54–57.
- Fu, S.W.; Li, K.; Huang, H.S.; Ma, C.; Fan, Q.S.; Zhu, Y.W. Red-billed blue magpie optimizer: A novel metaheuristic algorithm for 2D/3D UAV path planning and engineering design problems. Artif. Intell. Rev. 2024, 57, 134.
- Wang, G.; Wang, X.; Wang, Z.; Ma, C.; Song, Z. A VMD–CISSA–LSSVM Based Electricity Load Forecasting Model. Mathematics 2022, 10, 28.
- Zhao, B.; Zhang, L.X.; Zhang, L.Q.; Chen, Z.; Liu, X.W.; Xie, C.J. PEMFC life fusion prediction method based on joint iteration of TCN and AUKF. Chin. J. Electr. Eng. 2024, 45, 3609–3624.
- Wu, J.B.; Zeng, G.H.; Zhang, Z.H.; Huang, B.; Liu, J. Short-Term Photovoltaic Power Forecasting Using a Dynamic Combination of TCN-GRU and LSTM Based on K-Means Hierarchical Clustering. Renew. Energy 2023, 41, 1015–1022.
- Su, L.C.; Zhu, J.J.; Li, Y.W. Short-term wind power prediction based on time-convolutional network residual correction. J. Sol. Energy 2023, 44, 427–435.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780.
- Liu, M.J.; Lu, Y.S.; Long, S.; Bai, J.Y.; Lian, W.M. An attention-based CNN-BiLSTM hybrid neural network enhanced with features of discrete wavelet transformation for fetal acidosis classification. Expert Syst. Appl. 2021, 186, 115714.
- Ai, C.Y.; He, S.; Hu, H.; Fan, X.C.; Wang, W.Q. Chaotic time series wind power interval prediction based on quadratic decomposition and intelligent optimization algorithm. Chaos Solitons Fractals 2023, 177, 114222.
| Hyperparameter Name | Search Scope |
|---|---|
| Number of TCN layers | [3, 6] |
| Size of the TCN convolution kernel | [3, 7] |
| Number of BiLSTM Layers | [1, 3] |
| Number of hidden layer units in BiLSTM | [64, 256] |
| Attention Dimension | [64, 256] |
| Batch size | [32, 128] |
| Learning rate | [1 × 10−4, 1 × 10−3] |
| Dropout Rate | [0.2, 0.5] |
| Parameter Category | Parameter Name | Value |
|---|---|---|
| Model Parameters | Number of TCN layers | 3 |
| | Size of the TCN convolution kernel | 5 |
| | TCN dropout rate | 0.2 |
| | TCN dilation factors | [1, 2, 4] |
| | Number of BiLSTM layers | 2 |
| | Number of hidden layer units in BiLSTM | 128 |
| | BiLSTM dropout rate | 0.3 |
| | Attention dimension | 128 |
| Training Parameters | Batch size | 64 |
| | Learning rate | 1 × 10−3 |
| | Optimizer | Adam |
| | Loss function | MSE |
| Sample | Quantity | Load Type | Maximum | Minimum | Average | Standard Deviation |
|---|---|---|---|---|---|---|
| All samples | 2500 | Edgewise moment | 11,624.9 | −8001.9 | 556.3 | 6150.2 |
| | | Flapwise moment | 19,883.8 | 11,217.1 | 13,411.2 | 1865.3 |
| Training samples | 1500 | Edgewise moment | 11,624.9 | −8001.9 | 589.3 | 6147.3 |
| | | Flapwise moment | 19,883.8 | 11,217.1 | 13,529.8 | 1892.3 |
| Validation samples | 500 | Edgewise moment | 11,592.9 | −7658.3 | 466.2 | 6135.9 |
| | | Flapwise moment | 19,728.9 | 11,296.5 | 13,125.9 | 1749.3 |
| Test samples | 500 | Edgewise moment | 11,589.3 | −7892.7 | 546.6 | 6173.7 |
| | | Flapwise moment | 19,668.2 | 11,396.5 | 13,341.5 | 1861.9 |
| Number of Decomposition Modes K | Sample Entropy (SE) |
|---|---|
| 4 | 0.212 |
| 5 | 0.245 |
| 6 | 0.208 |
| 7 | 0.217 |
| 8 | 0.225 |
| 9 | 0.233 |
| 10 | 0.249 |
| Model | R2 | MAE | MSE | RMSE | MAPE% |
|---|---|---|---|---|---|
| TCN-BiLSTM-Attention | 0.9752 | 6.8217 | 98.5476 | 7.9526 | 3.5503 |
| EMD-TCN-BiLSTM-Attention | 0.9769 | 6.2275 | 81.1586 | 7.6315 | 3.2175 |
| CEEMDAN-TCN-BiLSTM-Attention | 0.9775 | 6.6072 | 79.2239 | 7.3143 | 3.4507 |
| VMD-TCN-BiLSTM-Attention | 0.9865 | 6.0232 | 72.7055 | 6.8276 | 3.2092 |
| Model | R2 | MAE | MSE | RMSE | MAPE% |
|---|---|---|---|---|---|
| VMD-TCN-BiLSTM-Attention | 0.9865 | 6.0232 | 72.7055 | 6.8276 | 3.2092 |
| PSO-VMD-TCN-BiLSTM-Attention | 0.9893 | 4.1326 | 49.8872 | 4.0319 | 2.7132 |
| GWO-VMD-TCN-BiLSTM-Attention | 0.9885 | 3.8875 | 52.1537 | 4.2681 | 2.7567 |
| WOA-VMD-TCN-BiLSTM-Attention | 0.9881 | 4.1893 | 58.6631 | 3.9269 | 2.9954 |
| RBMO-VMD-TCN-BiLSTM-Attention | 0.9972 | 2.9073 | 32.7528 | 3.2356 | 2.1825 |
| Model | R2 | MAE | MSE | RMSE | MAPE% |
|---|---|---|---|---|---|
| ELM | 0.8922 | 15.3508 | 178.2559 | 16.2632 | 11.5507 |
| BiGRU | 0.9326 | 12.3618 | 158.6603 | 12.1662 | 8.3215 |
| BiLSTM | 0.9512 | 10.0116 | 149.1436 | 10.3265 | 7.0179 |
| Transformer | 0.9601 | 9.0345 | 126.2586 | 8.9122 | 5.2518 |
| Informer | 0.9603 | 9.0173 | 129.5139 | 8.8642 | 5.0937 |
| RBMO-VMD-TCN-BiLSTM-Attention | 0.9972 | 2.9073 | 32.7528 | 3.2356 | 2.1825 |
| Model | R2 | MAE | MSE | RMSE | MAPE% |
|---|---|---|---|---|---|
| BiLSTM | 0.9512 | 10.0116 | 149.1436 | 10.3265 | 7.0179 |
| TCN-BiLSTM | 0.9663 | 8.7396 | 123.1519 | 9.7283 | 5.7816 |
| TCN-BiLSTM-Attention | 0.9752 | 6.8217 | 98.5476 | 7.9526 | 3.5503 |
| VMD-TCN-BiLSTM-Attention | 0.9865 | 6.0232 | 72.7055 | 6.8276 | 3.2092 |
| RBMO-VMD-TCN-BiLSTM-Attention | 0.9972 | 2.9073 | 32.7528 | 3.2356 | 2.1825 |
| Experimental Group | Model A | Model B | p-Value | Significant |
|---|---|---|---|---|
| | RBMO-VMD-TCN-BiLSTM-Attention | Observed values | 0.3826 | No |
| Experiment 1 | VMD-TCN-BiLSTM-Attention | TCN-BiLSTM-Attention | 0.0089 | Yes |
| | | EMD-TCN-BiLSTM-Attention | 0.0128 | Yes |
| | | CEEMDAN-TCN-BiLSTM-Attention | 0.0177 | Yes |
| Experiment 2 | RBMO-VMD-TCN-BiLSTM-Attention | VMD-TCN-BiLSTM-Attention | 0.0052 | Yes |
| | | PSO-VMD-TCN-BiLSTM-Attention | 0.0236 | Yes |
| | | GWO-VMD-TCN-BiLSTM-Attention | 0.0213 | Yes |
| | | WOA-VMD-TCN-BiLSTM-Attention | 0.0228 | Yes |
| Experiment 3 | RBMO-VMD-TCN-BiLSTM-Attention | ELM | <0.0001 | Yes |
| | | BiGRU | <0.0001 | Yes |
| | | BiLSTM | <0.0001 | Yes |
| | | Transformer | <0.0001 | Yes |
| | | Informer | <0.0001 | Yes |
| Ablation Experiment | RBMO-VMD-TCN-BiLSTM-Attention | BiLSTM | <0.0001 | Yes |
| | | TCN-BiLSTM | <0.0001 | Yes |
| | | TCN-BiLSTM-Attention | 0.0002 | Yes |
| | | VMD-TCN-BiLSTM-Attention | 0.0052 | Yes |
Share and Cite
Liu, Y.; Cheng, J. Prediction of Blade Root Loads for Wind Turbine Based on RBMO-VMD and TCN-BiLSTM-Attention. Mathematics 2026, 14, 218. https://doi.org/10.3390/math14020218
