Research and Application of a Model Selection Forecasting System for Wind Speed and Theoretical Power Generation
Abstract
1. Introduction
2. Conceptual Framework of the MSFSC Model
2.1. Framework of Predictive Models and Classifiers
2.2. Classification-Integrated Framework for Model Selection Forecasting
3. Experiment and Analysis
3.1. Data Source
3.2. Evaluation Criteria
3.3. Experiment I: Forecasting System for Model Selection Driven by Classification
3.4. Experiment II: Analysis of Classification and Forecasting Outcomes for Various Wind Turbine Categories
3.5. Experiment III: Theoretical Assessment of Power Generation for Individual Wind Turbines
- Pressure reduction with altitude: describing the exponential decay of atmospheric pressure as elevation increases.
- Thermal expansion of air: reflecting the ideal-gas relationship between density and temperature.
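The two corrections above combine into a single air-density estimate. A minimal Python sketch, assuming standard-atmosphere constants and an isothermal barometric formula (the paper's exact coefficients may differ):

```python
import math

# Standard-atmosphere constants (assumed here; the paper's exact values may differ)
P0 = 101325.0        # sea-level pressure, Pa
T0 = 288.15          # sea-level temperature, K
G = 9.80665          # gravitational acceleration, m/s^2
M = 0.0289644        # molar mass of dry air, kg/mol
R = 8.31446          # universal gas constant, J/(mol*K)
R_DRY = 287.05       # specific gas constant for dry air, J/(kg*K)

def pressure_at_altitude(h_m: float) -> float:
    """Isothermal barometric formula: exponential decay of pressure with elevation."""
    return P0 * math.exp(-G * M * h_m / (R * T0))

def air_density(pressure_pa: float, temp_k: float) -> float:
    """Ideal-gas law: at fixed pressure, density falls as temperature rises."""
    return pressure_pa / (R_DRY * temp_k)

# Example: a hub at 1500 m elevation with 10 degC ambient air
p_hub = pressure_at_altitude(1500.0)
rho_hub = air_density(p_hub, 283.15)
rho_sea = air_density(P0, 288.15)   # ISA reference density, about 1.225 kg/m^3
```

Because theoretical turbine power scales linearly with air density, the lower density at elevated or warmer sites directly reduces the theoretical output for the same wind speed.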
4. Discussion
Analysis of the Proposed Model’s Significance and Uncertainty
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
1. Model Description
| Model | Hyperparameter | Default | Structural Complexity | Computational Complexity | Training Time |
|---|---|---|---|---|---|
| CNN-LSTM | NumFilters | 32 | CNN filters = 32, kernel size = 3; LSTM hidden units = 64; parameter scale: medium-to-low; two-stage hybrid structure (CNN → LSTM), a typical mixed model | CNN: O(32 × k × L); LSTM: O(L × 64²); overall complexity: O(L × 4k + L × 4096), medium level | 10–20 s; runs efficiently on GPU/CPU; suitable for medium-scale real-time forecasting |
| | FilterSize | 3 | | | |
| | NumHiddenUnits | 64 | | | |
| | Dropout | 0.1 | | | |
| | InitialLearnRate | 1.00 × 10⁻³ | | | |
| | MiniBatchSize | 16 | | | |
| | MaxEpochs | 150 | | | |
| Transformers | NumEncoderLayers | 2 | Self-attention + FFN; parameter scale: medium; small d_model, considered a lightweight Transformer | Multi-head attention: O(L² × d_model); FFN: O(L × 64 × 32); the main bottleneck is the quadratic attention term L² | 8–18 s; significantly higher cost than RNN/CNN for long sequences |
| | dModel | 32 | | | |
| | NumHeads | 4 | | | |
| | NumHiddenUnits | 64 | | | |
| | Dropout | 0.1 | | | |
| | InitialLearnRate | 1.00 × 10⁻³ | | | |
| TCN-LSTM | NumFilters | 32 | 32 filters × dilation = [1, 2, 4, 8] × 2 residual blocks; LSTM hidden units = 64; parameter scale: medium-to-high | TCN (dilated convolution): O(L × 32 × 3 × blocks); LSTM: O(L × 64²); more complex than CNN-LSTM but does not suffer from the L² cost of Transformers | 15–25 s; lighter than Transformers; considered a medium-complexity model |
| | FilterSize | 3 | | | |
| | DilationFactor | [1 2 4 8] | | | |
| | NumResidualBlocks | 2 | | | |
| | NumHiddenUnits | 64 | | | |
| | Dropout | 0.1 | | | |
| CNN-GRU | NumFilters | 32 | CNN filters = 32; GRU hidden units = 64; parameter scale similar to CNN-LSTM, but GRU is lighter than LSTM | CNN: O(L × 32 × 3); GRU: O(L × 64²) × 3 gates; lower than LSTM (which has 4 gates) | 15–30 s; typically faster than LSTM; medium-to-low complexity |
| | FilterSize | 3 | | | |
| | NumHiddenUnits | 64 | | | |
| | Dropout | 0.1 | | | |
| | InitialLearnRate | 1.00 × 10⁻³ | | | |
| LSTM-XGBoost | NumHiddenUnits | 32 | LSTM hidden units = 32 (smaller than other hybrid models); XGBoost trees = 200, depth = 3; overall architecture: hybrid but not heavy | LSTM: O(L × 32²) (small scale); XGBoost: O(trees × depth × features); overall complexity: medium-to-high depending on the number of trees | 20–40 s; XGBoost training can be slow on CPU |
| | Dropout | 0.1 | | | |
| | max_depth | 3 | | | |
| | learning_rate | 0.05 | | | |
| | n_estimators | 200 | | | |
| | subsample | 0.8 | | | |
| | colsample_bytree | 0.8 | | | |
| GNN-TCN | NumHiddenUnits | 32 | GNN layers = 2, hidden size = 32; TCN filters = 32 with dilation = [1, 2, 4]; requires graph convolution based on an adjacency matrix; parameter scale: relatively high | GNN: O(N_edges × 32); TCN: O(L × 32 × 3 × layers); combined complexity is higher than pure CNN/RNN models | 25–45 s; runs efficiently on GPU/CPU |
| | NumGraphLayers | 2 | | | |
| | NumFilters | 32 | | | |
| | DilationFactor | [1 2 4] | | | |
| | Dropout | 0.1 | | | |
| | InitialLearnRate | 1.00 × 10⁻³ | | | |
- (1) Low-Complexity Models
- (2) Medium-Complexity Models
- (3) High-Complexity Models: The Transformer and GNN-TCN exhibit the highest structural and computational complexity.
| Category | Description |
|---|---|
| Model Type | Transformer encoder classifier |
| Hyperparameters | |
| Training Input | 10,080 samples × 7 features |
| Testing Input | 4320 samples × 7 features |
| Structural Complexity | |
| Computational Complexity | O(NL²d + NLd·d_ff); with L = 7, d = 32, d_ff = 64 this is extremely low in practice |
| Training Time | 3–8 s on RTX 4090 for 100 epochs (a few tens of seconds on CPU) |
| Inference Time | Millisecond-level for 4320 samples (negligible overhead in MSFS) |
| Practical Notes | |
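The classifier's O(NL²d + NLd·d_ff) cost can be sanity-checked numerically with the table's settings (L = 7, d = 32, d_ff = 64). A rough per-sample, per-layer multiply count, dropping constant factors and the per-head split:

```python
def encoder_layer_ops(L: int, d_model: int, d_ff: int) -> int:
    """Rough per-sample multiply count for one Transformer encoder layer,
    up to constant factors: L^2 * d_model for the attention score and
    context products, plus two L * d_model * d_ff feed-forward linear maps."""
    attention = L * L * d_model
    ffn = 2 * L * d_model * d_ff
    return attention + ffn

# Classifier settings from the table: sequence length 7, d_model = 32, d_ff = 64
ops = encoder_layer_ops(7, 32, 64)   # 1568 + 28672 = 30240 multiplies per layer
```

At roughly 3 × 10⁴ multiplies per sample per layer, even a full pass over the 10,080-sample training set stays far below a GFLOP, which is consistent with the seconds-scale training time reported above.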
| NO | Distribution | Parameter | CNN-LSTM | CNN-GRU | GNN-TCN | LSTM-XGBoost | TCN-LSTM | Transformers | MSFS |
|---|---|---|---|---|---|---|---|---|---|
| Turbine 1 | normal | ‘mu’ | −0.0009 | −0.0028 | 0.0010 | −0.0037 | −0.0024 | −0.0012 | −0.0005 |
| | | ‘sigma’ | 0.1403 | 0.1304 | 0.0452 | 0.1329 | 0.1415 | 0.1244 | 0.0826 |
| | | SSE | 22.7525 | 23.7055 | 36.4974 | 29.2852 | 19.0430 | 25.0445 | 48.5762 |
| | logistic | ‘mu’ | 0.0008 | −0.0023 | 0.0011 | −0.0024 | −0.0002 | −0.0008 | 0.0002 |
| | | ‘sigma’ | 0.0663 | 0.0627 | 0.0240 | 0.0602 | 0.0680 | 0.0571 | 0.0369 |
| | | SSE | 10.9773 | 12.1976 | 33.0165 | 12.6268 | 8.6780 | 9.6383 | 13.5334 |
| | generalized extreme value | ‘k’ | −0.1654 | −0.1418 | −0.1932 | −0.2102 | −0.2666 | −0.2168 | −0.1488 |
| | | ‘sigma’ | 0.1634 | 0.1472 | 0.0473 | 0.1673 | 0.1632 | 0.1498 | 0.1008 |
| | | ‘mu’ | −0.0591 | −0.0580 | −0.0167 | −0.0561 | −0.0541 | −0.0498 | −0.0350 |
| | | SSE | 33.4475 | 34.1444 | 61.6511 | 44.6973 | 27.0634 | 38.3722 | 85.7774 |
| | tlocationscale | ‘mu’ | 0.0035 | 0.0002 | 0.0011 | −0.0001 | 0.0016 | 0.0006 | 0.0014 |
| | | ‘sigma’ | 0.0632 | 0.0618 | 0.0351 | 0.0553 | 0.0701 | 0.0561 | 0.0398 |
| | | ‘nu’ | 1.8106 | 1.8891 | 5.1360 | 1.7513 | 2.0201 | 1.9176 | 2.2681 |
| | | SSE | 1.5362 | 2.9071 | 36.4651 | 1.0454 | 1.1662 | 1.0650 | 7.6752 |
| Turbine 2 | normal | ‘mu’ | −0.0031 | −0.0015 | −0.0086 | −0.0018 | −0.0168 | 0.0003 | −0.0036 |
| | | ‘sigma’ | 0.3063 | 0.3297 | 0.4203 | 0.0946 | 0.2939 | 0.2798 | 0.3412 |
| | | SSE | 8.7714 | 3.3898 | 6.5115 | 37.4751 | 9.2970 | 13.3985 | 7.4163 |
| | logistic | ‘mu’ | −0.0061 | −0.0064 | −0.0018 | −0.0021 | −0.0146 | 0.0002 | −0.0077 |
| | | ‘sigma’ | 0.1454 | 0.1551 | 0.1803 | 0.0410 | 0.1377 | 0.1306 | 0.1601 |
| | | SSE | 5.2544 | 1.7794 | 3.9987 | 10.9500 | 5.4140 | 8.7500 | 4.0357 |
| | generalized extreme value | ‘k’ | −0.1761 | −0.1463 | −0.1541 | −0.1438 | −0.1429 | −0.2046 | −0.1859 |
| | | ‘sigma’ | 0.3316 | 0.3671 | 0.4861 | 0.1142 | 0.3283 | 0.3138 | 0.3697 |
| | | ‘mu’ | −0.1252 | −0.1370 | −0.1828 | −0.0411 | −0.1404 | −0.1088 | −0.1378 |
| | | SSE | 10.7998 | 4.5071 | 7.9097 | 59.8068 | 12.1690 | 16.0977 | 9.2571 |
| | tlocationscale | ‘mu’ | −0.0025 | −0.0060 | 0.0029 | −0.0017 | −0.0073 | 0.0015 | −0.0087 |
| | | ‘sigma’ | 0.1288 | 0.1348 | 0.0979 | 0.0408 | 0.1222 | 0.0980 | 0.1424 |
| | | ‘nu’ | 1.6361 | 1.5962 | 1.0650 | 2.0172 | 1.6481 | 1.3634 | 1.6557 |
| | | SSE | 1.2709 | 0.1652 | 0.1060 | 4.4801 | 1.3445 | 1.3377 | 0.7493 |
| Turbine 3 | normal | ‘mu’ | 0.0044 | 0.0150 | −0.0059 | −0.0130 | 0.0268 | 0.0086 | 0.0023 |
| | | ‘sigma’ | 0.3934 | 0.7600 | 0.7122 | 0.7975 | 0.6672 | 0.6798 | 0.2491 |
| | | SSE | 2.8572 | 0.0547 | 0.0254 | 0.0442 | 0.4714 | 0.0623 | 9.7058 |
| | logistic | ‘mu’ | 0.0102 | 0.0157 | −0.0041 | −0.0057 | 0.0225 | 0.0129 | 0.0049 |
| | | ‘sigma’ | 0.1836 | 0.4143 | 0.3930 | 0.4338 | 0.3550 | 0.3654 | 0.1151 |
| | | SSE | 1.4026 | 0.0251 | 0.0100 | 0.0230 | 0.2635 | 0.0134 | 5.1727 |
| | generalized extreme value | ‘k’ | −0.2296 | −0.2292 | −0.2837 | −0.1937 | −0.2190 | −0.2342 | −0.1907 |
| | | ‘sigma’ | 0.4551 | 0.7775 | 0.7374 | 0.8335 | 0.6929 | 0.7220 | 0.2968 |
| | | ‘mu’ | −0.1474 | −0.2726 | −0.2567 | −0.3324 | −0.2290 | −0.2491 | −0.0988 |
| | | SSE | 3.9163 | 0.0958 | 0.0530 | 0.1073 | 0.6244 | 0.1347 | 13.7061 |
| | tlocationscale | ‘mu’ | 0.0109 | 0.0157 | −0.0029 | −0.0047 | 0.0196 | 0.0138 | 0.0071 |
| | | ‘sigma’ | 0.1674 | 0.6265 | 0.6173 | 0.6578 | 0.4909 | 0.5391 | 0.1082 |
| | | ‘nu’ | 1.7131 | 6.0411 | 7.9591 | 6.2193 | 3.9390 | 5.2836 | 1.7880 |
| | | SSE | 0.1603 | 0.0245 | 0.0093 | 0.0230 | 0.1997 | 0.0112 | 1.4037 |
| Turbine 4 | normal | ‘mu’ | −0.0005 | −0.0024 | −0.0012 | −0.0028 | −0.0037 | −0.0009 | 0.0010 |
| | | ‘sigma’ | 0.0826 | 0.1415 | 0.1244 | 0.1304 | 0.1329 | 0.1403 | 0.0452 |
| | | SSE | 48.5762 | 19.0430 | 25.0445 | 23.7055 | 29.2852 | 22.7525 | 36.4974 |
| | logistic | ‘mu’ | 0.0002 | −0.0002 | −0.0008 | −0.0023 | −0.0024 | 0.0008 | 0.0011 |
| | | ‘sigma’ | 0.0369 | 0.0680 | 0.0571 | 0.0627 | 0.0602 | 0.0663 | 0.0240 |
| | | SSE | 13.5334 | 8.6780 | 9.6383 | 12.1976 | 12.6268 | 10.9773 | 33.0165 |
| | generalized extreme value | ‘k’ | −0.1488 | −0.2666 | −0.2168 | −0.1418 | −0.2102 | −0.1654 | −0.1932 |
| | | ‘sigma’ | 0.1008 | 0.1632 | 0.1498 | 0.1472 | 0.1673 | 0.1634 | 0.0473 |
| | | ‘mu’ | −0.0350 | −0.0541 | −0.0498 | −0.0580 | −0.0561 | −0.0591 | −0.0167 |
| | | SSE | 85.7774 | 27.0634 | 38.3722 | 34.1444 | 44.6973 | 33.4475 | 61.6511 |
| | tlocationscale | ‘mu’ | 0.0014 | 0.0016 | 0.0006 | 0.0002 | −0.0001 | 0.0035 | 0.0011 |
| | | ‘sigma’ | 0.0398 | 0.0701 | 0.0561 | 0.0618 | 0.0553 | 0.0632 | 0.0351 |
| | | ‘nu’ | 2.2681 | 2.0201 | 1.9176 | 1.8891 | 1.7513 | 1.8106 | 5.1360 |
| | | SSE | 7.6752 | 1.1662 | 1.0650 | 2.9071 | 1.0454 | 1.5362 | 36.4651 |
| Turbine 5 | normal | ‘mu’ | −0.0086 | −0.0080 | −0.0076 | −0.0079 | −0.0104 | −0.0113 | −0.0013 |
| | | ‘sigma’ | 0.1656 | 0.2200 | 0.2354 | 0.2457 | 0.2388 | 0.2550 | 0.0923 |
| | | SSE | 36.8238 | 17.7037 | 10.3676 | 10.8713 | 4.8369 | 12.6825 | 31.4714 |
| | logistic | ‘mu’ | −0.0040 | −0.0072 | −0.0082 | −0.0082 | −0.0082 | −0.0091 | −0.0012 |
| | | ‘sigma’ | 0.0592 | 0.0949 | 0.1075 | 0.1111 | 0.1025 | 0.1183 | 0.0343 |
| | | SSE | 14.7899 | 9.5452 | 5.1610 | 5.6601 | 1.6649 | 7.9237 | 5.1605 |
| | generalized extreme value | ‘k’ | −0.1141 | −0.1097 | −0.1397 | −0.1112 | −0.1733 | −0.1687 | −0.0567 |
| | | ‘sigma’ | 0.2331 | 0.2430 | 0.2670 | 0.2689 | 0.3552 | 0.3180 | 0.1090 |
| | | ‘mu’ | −0.0797 | −0.0997 | −0.1056 | −0.1113 | −0.1105 | −0.1181 | −0.0391 |
| | | SSE | 51.4102 | 22.3621 | 14.2799 | 14.3606 | 9.0344 | 17.2393 | 47.7776 |
| | tlocationscale | ‘mu’ | 0.0003 | 0.0004 | −0.0047 | −0.0033 | −0.0045 | −0.0068 | −0.0002 |
| | | ‘sigma’ | 0.0420 | 0.0715 | 0.0982 | 0.1010 | 0.0884 | 0.1003 | 0.0362 |
| | | ‘nu’ | 1.4376 | 1.4075 | 1.7277 | 1.7181 | 1.6342 | 1.5550 | 2.3046 |
| | | SSE | 0.1742 | 0.6407 | 0.9189 | 1.1361 | 0.1456 | 1.9040 | 0.2898 |
| Turbine 6 | normal | ‘mu’ | −0.0059 | −0.0100 | −0.0099 | −0.0088 | −0.0119 | 0.0002 | −0.0011 |
| | | ‘sigma’ | 0.1343 | 0.1889 | 0.1788 | 0.2056 | 0.1860 | 0.2007 | 0.0695 |
| | | SSE | 36.4818 | 9.4050 | 15.2219 | 17.5967 | 14.5273 | 10.4848 | 44.0969 |
| | logistic | ‘mu’ | −0.0014 | −0.0044 | −0.0060 | −0.0063 | −0.0048 | 0.0006 | 0.0005 |
| | | ‘sigma’ | 0.0515 | 0.0897 | 0.0820 | 0.0995 | 0.0852 | 0.0949 | 0.0316 |
| | | SSE | 14.0928 | 4.4375 | 7.7534 | 10.0849 | 7.5479 | 5.4956 | 9.3376 |
| | generalized extreme value | ‘k’ | −0.1777 | −0.2106 | −0.1901 | −0.2204 | −0.2112 | −0.2020 | −0.2341 |
| | | ‘sigma’ | 0.1924 | 0.2202 | 0.2093 | 0.2246 | 0.2308 | 0.2358 | 0.0884 |
| | | ‘mu’ | −0.0647 | −0.0854 | −0.0831 | −0.0875 | −0.0872 | −0.0787 | −0.0283 |
| | | SSE | 53.5973 | 13.8068 | 20.7958 | 22.2693 | 20.7974 | 14.7346 | 86.6710 |
| | tlocationscale | ‘mu’ | 0.0000 | −0.0016 | −0.0039 | −0.0031 | −0.0015 | 0.0006 | 0.0005 |
| | | ‘sigma’ | 0.0398 | 0.0886 | 0.0691 | 0.0945 | 0.0737 | 0.0885 | 0.0353 |
| | | ‘nu’ | 1.5459 | 1.9040 | 1.5669 | 1.7844 | 1.6105 | 1.7506 | 2.4024 |
| | | SSE | 0.3054 | 0.6006 | 0.4354 | 2.2271 | 0.8118 | 0.7631 | 5.4124 |
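The parameters above come from fitting candidate error distributions and scoring them by SSE. A minimal scipy sketch under assumed conventions (the quoted names ‘mu’, ‘sigma’, ‘k’, ‘nu’ follow MATLAB's fitdist; scipy's genextreme shape parameter has the opposite sign to MATLAB's k, and scipy's t with fitted loc/scale plays the role of the t location-scale family); the SSE here is measured between an error histogram and the fitted PDF, since the paper's binning is not specified:

```python
import numpy as np
from scipy import stats

# Synthetic forecast errors for illustration only (the paper fits real residuals)
rng = np.random.default_rng(0)
errors = rng.normal(0.0, 0.14, size=4320)

# Candidate families from the table (scipy equivalents)
candidates = {
    "normal": stats.norm,
    "logistic": stats.logistic,
    "generalized extreme value": stats.genextreme,
    "tlocationscale": stats.t,
}

# Empirical density of the errors
density, edges = np.histogram(errors, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

fits = {}
for name, dist in candidates.items():
    params = dist.fit(errors)  # maximum-likelihood parameter estimates
    sse = float(np.sum((density - dist.pdf(centers, *params)) ** 2))
    fits[name] = (params, sse)
```

Reading the table the same way, the lowest-SSE row per turbine identifies the best-matching family; for most turbines that is the t location-scale distribution, whose heavier tails accommodate occasional large forecast misses.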
References
- Global Wind Energy Council. Global Wind Report 2024; Global Wind Energy Council (GWEC): Brussels, Belgium, 2024.
- Yang, D.Z.; Wang, W.T.; Hong, T. A historical weather forecast dataset from the European Centre for Medium-Range Weather Forecasts (ECMWF) for energy forecasting. Sol. Energy 2022, 232, 263–274.
- Alsamamra, H.R.; Salah, S.; Shoqeir, J.H. Performance analysis of ARIMA model for wind speed forecasting in Jerusalem, Palestine. Energy Explor. Exploit. 2024, 42, 1727–1746.
- Dupuy, F.; Durand, P.; Hedde, T. Downscaling of surface wind forecasts using convolutional neural networks. Nonlin. Process. Geophys. 2023, 30, 553–570.
- Daenens, S.; Verstraeten, T.; Daems, P.J.; Nowé, A.; Helsen, J. Spatio-temporal graph neural networks for power prediction in offshore wind farms using SCADA data. Wind Energy Sci. 2025, 10, 1137–1152.
- Liao, W.; Fang, J.; Ye, L.; Bak-Jensen, B.; Yang, Z.; Porte-Agel, F. Can we trust explainable artificial intelligence in wind power forecasting? Appl. Energy 2024, 376, 124273.
- Liu, Z.; Guo, H.; Zhang, Y.; Zuo, Z. A comprehensive review of wind power prediction based on machine learning: Models, applications, and challenges. Energies 2025, 18, 350.
- Wang, Y.; Zhang, F.; Kou, H.; Zou, R.; Hu, Q.; Wang, J.; Srinivasan, D. A review of predictive uncertainty modeling techniques and evaluation metrics in probabilistic wind speed and wind power forecasting. Appl. Energy 2025, 396, 126234.
- Zhao, Y.; Liao, H.; Zhao, Y.; Pan, S. Data-augmented trend-fluctuation representations by interpretable contrastive learning for wind power forecasting. Appl. Energy 2025, 380, 125052.
- Li, J.; Jia, L.; Zhou, C. Deep fuzzy inference system fused with probability density function control for wind power forecasting with asymmetric error distribution. J. Clean. Prod. 2025, 511, 145590.
- Chen, H.; Jiang, X.; Hui, H.; Zhang, K.; Meng, W.; Cheynet, E. Enhancing probabilistic wind speed forecasting by integrating self-adaptive Bayesian wavelet denoising with deep Gaussian process regression under uncertainties. Renew. Energy 2026, 256, 123966.
- He, Y.Y.; Yu, N.N.; Wang, B. Online probability density prediction of wind power considering virtual and real concept drift detection. Appl. Energy 2025, 396, 126318.
- Gao, J.; Cheng, Y.; Zhang, D.; Chen, Y. Physics-constrained wind power forecasting aligned with probability distributions for noise-resilient deep learning. Appl. Energy 2025, 383, 125295.
- Huang, H.; Xue, S.; Zhao, L.; Wang, W.; Wu, H. Privacy-preserving smart energy management by consumer-electronic chips and federated learning. IEEE Trans. Consum. Electron. 2024, 70, 2200–2209.
- Wang, X.R.; Zhou, Y.Z. Privacy-preserving probabilistic wind power forecasting: An adaptive federated approach. Appl. Energy 2025, 396, 126177.
- Nayak, A.K.; Sharma, K.C.; Bhakar, R.; Tiwari, H. Probabilistic online learning framework for short-term wind power forecasting using ensemble bagging regression model. Energy Convers. Manag. 2025, 323, 119142.
- Leal, J.I.; Pitombeira-Neto, A.R.; Bueno, A.V.; Rocha, P.A.C.; de Andrade, C.F. Probabilistic wind speed forecasting via Bayesian DLMs and its application in green hydrogen production. Appl. Energy 2025, 382, 125286.
- Chen, H. A novel wind model downscaling with statistical regression and forecast for the cleaner energy. J. Clean. Prod. 2024, 434, 140217.
- Zhao, L.; Liu, C.; Yang, C.; Liu, S.; Zhang, Y.; Li, Y. A location-centric Transformer framework for multi-location short-term wind speed forecasting. Energy Convers. Manag. 2025, 328, 119627.
- Ma, J.; Du, J.; Chen, Q.; Jiang, X.; Pan, L. Multi-feature extraction spatio-temporal interaction graph network for wind speed forecasting in windfarm. Energy 2025, 333, 137229.
- Resifi, S.; Al Aawar, E.; Dasari, H.P.; Jebari, H.; Hoteit, I. A novel deep learning approach for regional high-resolution spatio-temporal wind speed forecasting for energy applications. Energy 2025, 328, 136356.
- Zhang, Z.G.; Yin, J.C. Incremental principal component analysis based depthwise separable UNet model for complex wind system forecasting. Energy 2025, 334, 137751.
- Zhang, J.; Zhang, Y.; Liu, K.; Zhao, C. Multi-step prediction of spatio-temporal wind speed based on the multimodal coupled ST-DFNet model. Energy 2025, 334, 137670.
- Hu, D.; He, F.; Fan, W.; Feng, W. DBANN: Dual-branch attention neural networks with hierarchical spatiotemporal-perception for multi-node offshore wind power forecasting. Energy 2025, 334, 137521.
- Zhao, Y.; Zhao, Y.; Liao, H.; Pan, S.; Zheng, Y. Interpreting LASSO regression model by feature space matching analysis for spatio-temporal correlation based wind power forecasting. Appl. Energy 2025, 380, 124954.
- Verdone, A.; Panella, M.; De Santis, E.; Rizzi, A. A review of solar and wind energy forecasting: From single-site to multi-site paradigm. Appl. Energy 2025, 392, 126016.
- Wan, H.; Wang, J.; Gan, Q.; Xia, Y.; Chang, Y.; Yan, H. Addressing intermittency in medium-term photovoltaic and wind power forecasting using a hybrid xLSTM-TCCNN model with numerical weather predictions. Renew. Energy 2025, 253, 123618.
- Ignatev, E.; Deriugina, G.; Suslov, K.; Balaban, G. Development of a hybrid model for medium-term wind farm power output forecasting. Renew. Energy 2025, 249, 123200.
- Michalakopoulos, V.; Zakynthinos, A.; Sarmas, E.; Marinakis, V.; Askounis, D. Hybrid short-term wind power forecasting model using theoretical power curves and temporal fusion transformers. Renew. Energy 2026, 256, 124008.
- Xu, X.; Cao, Q.; Deng, R.; Guo, Z.; Chen, Y.; Yan, J. A cross-dataset benchmark for neural network-based wind power forecasting. Renew. Energy 2025, 254, 123463.
- Dong, Y.; Zhou, B.; Zhang, H.; Yang, G.; Ma, S. A deep time-frequency augmented wind power forecasting model. Renew. Energy 2026, 256, 123550.
- Ullah, S.; Chen, X.; Han, H.; Wu, J.; Dong, J.; Liu, R.; Ding, W.; Liu, M.; Li, Q.; Qi, H.; et al. A novel hybrid ensemble approach for wind speed forecasting with dual-stage decomposition strategy using optimized GRU and Transformer models. Energy 2025, 329, 136739.
- Li, S.; Guo, L.; Zhu, J.; Liu, M.; Chen, J.; Meng, Z. Short-term multi-step wind speed forecasting with multi-feature inputs using variational mode decomposition, a novel artificial intelligence network, and the polar lights optimizer. Renew. Energy 2026, 256, 123965.
- Ma, C.; Zhang, C.; Yao, J.; Zhang, X.; Nazir, M.S.; Peng, T. Enhancement of wind speed forecasting using optimized decomposition technique, entropy-based reconstruction, and evolutionary PatchTST. Energy Convers. Manag. 2025, 333, 119819.
- Wu, X.; Wang, D.; Yang, M.; Liang, C. CEEMDAN-SE-HDBSCAN-VMD-TCN-BiGRU: A two-stage decomposition-based parallel model for multi-altitude ultra-short-term wind speed forecasting. Energy 2025, 330, 136660.
- Hong, J.T.; Han, S.; Yan, J.; Liu, Y.Q. Dual-path frequency Mamba-Transformer model for wind power forecasting. Energy 2025, 332, 137225.
- Wu, B.; Lin, J.; Liu, R.; Wang, L. A multi-dimensional interpretable wind speed forecasting model with two-stage feature exploring. Renew. Energy 2026, 256, 124028.
- Zeng, H.; Wu, B.; Fang, H.; Lin, J. Interpretable wind speed forecasting through two-stage decomposition with comprehensive relative importance analysis. Appl. Energy 2025, 392, 126015.
- Xu, R.; Fang, H.; Zeng, H.; Wu, B. A novel interpretable wind speed forecasting based on the multivariate variational mode decomposition and temporal fusion transformer. Energy 2025, 331, 136497.
- Liang, B.J.; Tian, Z.R. ISI Net: A novel paradigm integrating interpretability and intelligent selection in ensemble learning for accurate wind power forecasting. Energy Convers. Manag. 2025, 332, 119752.
- Wang, Q.; Xu, F.; He, J.; Luo, K.; Fan, J. A new fusion model for enhanced ultra-short-term offshore wind power forecasting. Renew. Energy 2026, 256.
- Li, M.; Zhang, K.; Kou, M.; Ma, Y. An offshore wind speed forecasting system based on feature enhancement, deep time series clustering, and extended LSTM. Energy 2025, 333, 137335.
- Cui, X.; Yu, X.; Niu, H.; Niu, D.; Liu, D. A novel data-driven multi-step wind power point-interval prediction framework integrating sliding window-based two-layer adaptive decomposition and multi-objective optimization for balancing prediction accuracy and stability. Appl. Energy 2025, 397, 126348.
- He, X.; Zhao, K.; Chu, X. AutoML: A survey of the state-of-the-art. Knowl.-Based Syst. 2021, 212, 106622.
- Zöller, M.A.; Huber, M.F. Benchmark and survey of automated machine learning frameworks. J. Artif. Intell. Res. 2021, 70, 409–472.
- Zhou, Y.; Wu, X.; Wu, J.; Feng, L.; Tan, K.C. HM3: Hierarchical multi-objective model merging for pretrained models. arXiv 2024, arXiv:2409.18893.
- Telikani, A.; Tahmassebi, A.; Banzhaf, W.; Gandomi, A.H. Evolutionary machine learning: A survey. ACM Comput. Surv. 2021, 54, 1–35.
- Cerqueira, V.; Torgo, L.; Pinto, F.; Soares, C. Arbitrated ensembles for time series forecasting in online learning scenarios. Expert Syst. Appl. 2019, 118, 271–282.
- Jiang, W.; Liu, B.; Liang, Y.; Gao, H.; Lin, P.; Zhang, D.; Hu, G. Applicability analysis of Transformer to wind speed forecasting by a novel deep learning framework with multiple atmospheric variables. Appl. Energy 2024, 353, 122155.
- Li, W.; Li, Y.; Garg, A.; Gao, L. Enhancing real-time degradation prediction of lithium-ion battery: A digital twin framework with CNN-LSTM-attention model. Energy 2024, 286, 129681.
- Liu, X.; Zhang, L.; Wang, J.; Zhou, Y.; Gan, W. A unified multi-step wind speed forecasting framework based on numerical weather prediction grids and wind farm monitoring data. Renew. Energy 2023, 211, 948–963.
- Su, T.S.; Weng, X.Y.; Vincent, F.Y.; Wu, C.C. Optimal maintenance planning for offshore wind farms under an uncertain environment. Ocean Eng. 2023, 283, 115033.
- Semmelmann, L.; Henni, S.; Weinhardt, C. Load forecasting for energy communities: A novel LSTM-XGBoost hybrid model based on smart meter data. Energy Inform. 2022, 5, 24.
- Hedegaard, L.; Heidari, N.; Iosifidis, A. Continual spatio-temporal graph convolutional networks. Pattern Recognit. 2023, 140, 109528.
- Huang, H.; Lin, L.; Zhao, L.; Ding, S.; Huang, H. Time series focused neural network for accurate wireless human gesture recognition. IEEE Trans. Netw. Sci. Eng. 2025, 113, 118–129.
- Goldstein, S. On the vortex theory of screw propellers. Proc. R. Soc. A Math. Phys. Eng. Sci. 1929, 123, 440–465.
- McKight, P.E.; Najab, J. Kruskal–Wallis test. In The Corsini Encyclopedia of Psychology; John Wiley & Sons: Hoboken, NJ, USA, 2010; Volume 4.
- Xiaojia, H.; Wang, C.; Zhang, S. Research and application of a model selection forecasting system for wind speed and theoretical power generation in wind farms based on classification and wind conversion. Energy 2024, 293, 130606.
- Tedeschi, P.; Sciancalepore, S.; Pietro, R. Modelling a communication channel under jamming: Experimental model and applications. In Proceedings of the Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom), New York, NY, USA, 30 September–3 October 2021.
| Metric | Definition | Mathematical Expression |
|---|---|---|
| Upper Bound | Upper bound of the wind speed forecast interval | |
| Lower Bound | Lower bound of the wind speed forecast interval | |
| FICP | Forecast interval coverage probability of the testing dataset | |
| FINAW | Forecast interval normalized average width of the testing dataset | |
| AWDi | Accumulated width deviation of testing sample i | |
| AWD | Accumulated width deviation of the testing dataset | |
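Assuming the standard formulations of these interval indices (width normalized by the observed range, and AWD accumulating the relative excursion of points that fall outside their interval; the paper's exact normalization may differ), a compact sketch:

```python
import numpy as np

def interval_metrics(y, lower, upper):
    """FICP, FINAW, and AWD for a set of forecast intervals [lower, upper]."""
    y, lower, upper = (np.asarray(a, dtype=float) for a in (y, lower, upper))
    width = upper - lower
    inside = (y >= lower) & (y <= upper)
    ficp = inside.mean()                            # coverage probability
    finaw = width.mean() / (y.max() - y.min())      # width normalized by target range
    # AWD_i: relative excursion of a target outside its interval, 0 if covered
    below = np.where(y < lower, (lower - y) / width, 0.0)
    above = np.where(y > upper, (y - upper) / width, 0.0)
    return ficp, finaw, float((below + above).sum())

# Toy example: four targets, the first falls below its interval
ficp, finaw, awd = interval_metrics(
    y=[1.0, 2.0, 3.0, 4.0],
    lower=[1.5, 1.5, 2.5, 3.5],
    upper=[2.5, 2.5, 3.5, 4.5],
)
```

A good interval forecaster keeps FICP high while holding FINAW and AWD low, since coverage alone can always be inflated by widening the intervals.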
| Index | Description | Mathematical Expression |
|---|---|---|
| MAE | MAE measures the average magnitude of forecasting errors without considering their direction. | |
| RMSE | Evaluates the square root of the mean squared error; more sensitive to large deviations and outliers. | |
| MAPE | MAPE expresses forecasting accuracy as a percentage; lower MAPE means higher accuracy. | |
| STDAPE | The Standard Deviation of the Absolute Percentage Error is used to measure the stability of a forecasting model. | |
| DA | Measures the ability of the model to correctly forecast the direction (increase/decrease) of changes. Higher DA means better trend following capability. | |
| TIC | This indicator quantifies the relative accuracy of forecasts, with values ranging from 0 to 1; smaller values correspond to higher predictive accuracy. | |
| R2 | Represents the proportion of variance in observations explained by the model. Values close to 1 indicate strong predictive correlation. | |
| ACC | Accuracy (ACC) is a widely used measure in machine learning for assessing the effectiveness of a classification model. | |
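A compact implementation of the point-forecast indices, assuming their standard definitions (DA here compares the signs of successive changes in observation and forecast; tie handling and the exact DA variant may differ from the paper's):

```python
import numpy as np

def point_metrics(y, yhat):
    """Standard point-forecast indices for observations y and forecasts yhat."""
    y, yhat = np.asarray(y, dtype=float), np.asarray(yhat, dtype=float)
    err = yhat - y
    ape = np.abs(err / y)                      # assumes no zero observations
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    # DA: share of steps where forecast and target move in the same direction
    da = np.mean(np.sign(np.diff(y)) == np.sign(np.diff(yhat))) * 100.0
    tic = rmse / (np.sqrt((y ** 2).mean()) + np.sqrt((yhat ** 2).mean()))
    r2 = 1.0 - (err ** 2).sum() / ((y - y.mean()) ** 2).sum()
    return {
        "MAE": mae, "RMSE": rmse,
        "MAPE": ape.mean() * 100.0, "STDAPE": ape.std() * 100.0,
        "DA": da, "TIC": tic, "R2": r2,
    }

metrics = point_metrics([5.0, 6.0, 7.0, 6.0], [5.2, 5.9, 7.1, 6.3])
```

MAE, RMSE, MAPE, STDAPE, and TIC are error measures (lower is better), while DA and R² are agreement measures (higher is better), which is why the tables below report them side by side.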
| NO | Model | MAE | RMSE | STDAPE | DA | TIC | MAPE | R2 |
|---|---|---|---|---|---|---|---|---|
| Turbine #1 | CNN-LSTM | 0.1382 | 0.1936 | 17.30% | 73.88% | 0.0164 | 4.12% | 0.9587 |
| | Transformers | 0.1515 | 0.2055 | 26.45% | 73.86% | 0.0177 | 4.97% | 0.9487 |
| | TCN-LSTM | 0.1505 | 0.1877 | 17.61% | 67.12% | 0.0155 | 5.48% | 0.9561 |
| | CNN-GRU | 0.1018 | 0.1554 | 21.51% | 70.56% | 0.0159 | 3.88% | 0.9660 |
| | LSTM-XGBoost | 0.1077 | 0.1547 | 24.47% | 76.17% | 0.0151 | 4.20% | 0.9652 |
| | GNN-TCN | 0.1112 | 0.1661 | 25.66% | 76.73% | 0.0168 | 4.55% | 0.9602 |
| Turbine #2 | CNN-LSTM | 0.1429 | 0.2100 | 9.95% | 74.84% | 0.0170 | 4.09% | 0.9555 |
| | Transformers | 0.1554 | 0.2138 | 13.23% | 74.37% | 0.0171 | 4.70% | 0.9498 |
| | TCN-LSTM | 0.1430 | 0.2004 | 20.89% | 74.45% | 0.0169 | 5.63% | 0.9591 |
| | CNN-GRU | 0.1711 | 0.2384 | 16.19% | 72.34% | 0.0197 | 5.57% | 0.9396 |
| | LSTM-XGBoost | 0.1128 | 0.1654 | 21.29% | 74.67% | 0.0145 | 4.82% | 0.9703 |
| | GNN-TCN | 0.1309 | 0.1843 | 22.36% | 74.20% | 0.0173 | 5.73% | 0.9590 |
| Turbine #3 | CNN-LSTM | 0.1256 | 0.1776 | 9.18% | 75.08% | 0.0160 | 4.30% | 0.9599 |
| | Transformers | 0.1380 | 0.1792 | 14.59% | 72.47% | 0.0150 | 4.72% | 0.9586 |
| | TCN-LSTM | 0.1469 | 0.2064 | 11.85% | 69.22% | 0.0180 | 4.80% | 0.9600 |
| | CNN-GRU | 0.1471 | 0.1897 | 21.17% | 74.50% | 0.0166 | 5.21% | 0.9488 |
| | LSTM-XGBoost | 0.1315 | 0.1888 | 13.52% | 76.71% | 0.0166 | 4.47% | 0.9590 |
| | GNN-TCN | 0.1336 | 0.1862 | 9.83% | 75.81% | 0.0155 | 4.56% | 0.9571 |
| Turbine #4 | CNN-LSTM | 0.1251 | 0.1811 | 14.52% | 74.27% | 0.0156 | 4.10% | 0.9597 |
| | Transformers | 0.1277 | 0.1690 | 14.79% | 72.50% | 0.0149 | 4.09% | 0.9616 |
| | TCN-LSTM | 0.1341 | 0.1756 | 20.08% | 66.78% | 0.0158 | 5.14% | 0.9595 |
| | CNN-GRU | 0.1438 | 0.2080 | 24.13% | 73.07% | 0.0183 | 5.44% | 0.9460 |
| | LSTM-XGBoost | 0.0966 | 0.1528 | 14.49% | 70.23% | 0.0148 | 3.94% | 0.9683 |
| | GNN-TCN | 0.0925 | 0.1387 | 21.73% | 74.66% | 0.0139 | 4.53% | 0.9721 |
| Turbine #5 | CNN-LSTM | 0.1259 | 0.1937 | 18.93% | 70.85% | 0.0163 | 4.67% | 0.9629 |
| | Transformers | 0.1268 | 0.2032 | 21.29% | 67.52% | 0.0159 | 4.75% | 0.9592 |
| | TCN-LSTM | 0.1374 | 0.2037 | 22.50% | 68.06% | 0.0163 | 5.51% | 0.9562 |
| | CNN-GRU | 0.1371 | 0.1749 | 23.44% | 70.66% | 0.0144 | 6.12% | 0.9671 |
| | LSTM-XGBoost | 0.1394 | 0.1967 | 17.50% | 74.43% | 0.0160 | 4.06% | 0.9582 |
| | GNN-TCN | 0.1347 | 0.2103 | 17.55% | 74.79% | 0.0169 | 4.12% | 0.9553 |
| Turbine #6 | CNN-LSTM | 0.1088 | 0.1673 | 17.63% | 73.94% | 0.0142 | 4.31% | 0.9665 |
| | Transformers | 0.1239 | 0.1633 | 15.84% | 74.17% | 0.0161 | 4.13% | 0.9616 |
| | TCN-LSTM | 0.1146 | 0.1556 | 23.83% | 73.49% | 0.0144 | 4.91% | 0.9689 |
| | CNN-GRU | 0.1300 | 0.1844 | 26.65% | 74.57% | 0.0165 | 5.31% | 0.9577 |
| | LSTM-XGBoost | 0.1422 | 0.1887 | 18.60% | 68.14% | 0.0159 | 4.75% | 0.9595 |
| | GNN-TCN | 0.1350 | 0.1932 | 14.70% | 73.80% | 0.0169 | 5.15% | 0.9571 |
| NO | Model | MAE | RMSE | STDAPE | DA | TIC | MAPE | R2 | OP | SOP | SA |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Turbine #1 | CNN-LSTM | 0.0690 | 0.0948 | 8.31% | 88.66% | 0.0079 | 2.15% | 0.9861 | 1925 | 1821 | 94.59% |
| Transformers | 0.0768 | 0.1053 | 13.13% | 88.64% | 0.0088 | 2.50% | 0.9852 | 2737 | 2547 | 93.05% | |
| TCN-LSTM | 0.0748 | 0.0976 | 9.23% | 80.54% | 0.0082 | 2.62% | 0.9802 | 2638 | 2492 | 94.48% | |
| CNN-GRU | 0.0523 | 0.0772 | 10.75% | 84.67% | 0.0076 | 2.04% | 0.9887 | 2139 | 2017 | 94.29% | |
| LSTM-XGBoost | 0.0530 | 0.0779 | 12.26% | 91.40% | 0.0076 | 2.10% | 0.9887 | 2031 | 1831 | 90.16% | |
| GNN-TCN | 0.0574 | 0.0830 | 12.75% | 92.07% | 0.0081 | 2.34% | 0.9860 | 2930 | 2668 | 91.04% | |
| MSFS | 0.0650 | 0.1052 | 5.32% | 93.83% | 0.0088 | 1.71% | 0.9934 | 14,400 | 13,376 | 92.89% | |
| Turbine #2 | CNN-LSTM | 0.0686 | 0.1006 | 5.10% | 89.81% | 0.0084 | 2.12% | 0.9896 | 2430 | 2228 | 91.69% |
| Transformers | 0.0748 | 0.1066 | 6.67% | 89.24% | 0.0089 | 2.44% | 0.9891 | 1807 | 1683 | 93.13% | |
| TCN-LSTM | 0.0724 | 0.0965 | 10.45% | 89.34% | 0.0081 | 2.77% | 0.9803 | 1948 | 1754 | 90.03% | |
| CNN-GRU | 0.0841 | 0.1204 | 8.02% | 86.80% | 0.0100 | 2.79% | 0.9798 | 2432 | 2239 | 92.05% | |
| LSTM-XGBoost | 0.0570 | 0.0811 | 10.31% | 89.61% | 0.0071 | 2.46% | 0.9855 | 2764 | 2556 | 92.48% | |
| GNN-TCN | 0.0669 | 0.0950 | 10.81% | 89.03% | 0.0084 | 2.85% | 0.9790 | 3019 | 2850 | 94.40% | |
| MSFS | 0.0607 | 0.1030 | 7.22% | 90.25% | 0.0086 | 1.66% | 0.9919 | 14,400 | 13,309 | 92.43% | |
| Turbine #3 | CNN-LSTM | 0.0653 | 0.0904 | 4.71% | 90.09% | 0.0078 | 2.09% | 0.9899 | 2717 | 2575 | 94.78% |
| Transformers | 0.0693 | 0.0904 | 6.97% | 86.97% | 0.0078 | 2.40% | 0.9886 | 1768 | 1613 | 91.26% | |
| TCN-LSTM | 0.0726 | 0.1019 | 5.90% | 83.06% | 0.0088 | 2.44% | 0.9865 | 2064 | 1944 | 94.20% | |
| CNN-GRU | 0.0722 | 0.0935 | 10.87% | 89.40% | 0.0082 | 2.73% | 0.9800 | 2004 | 1895 | 94.54% | |
| LSTM-XGBoost | 0.0661 | 0.0908 | 6.55% | 92.05% | 0.0079 | 2.17% | 0.9890 | 2655 | 2450 | 92.28% | |
| GNN-TCN | 0.0662 | 0.0918 | 4.99% | 90.97% | 0.0079 | 2.18% | 0.9888 | 3192 | 2948 | 92.36% | |
| MSFS | 0.0608 | 0.0999 | 4.59% | 93.02% | 0.0086 | 1.64% | 0.9954 | 14,400 | 13,426 | 93.23% | |
| Turbine #4 | CNN-LSTM | 0.0653 | 0.0901 | 7.21% | 89.12% | 0.0078 | 2.11% | 0.9895 | 1892 | 1788 | 94.51% |
| | Transformers | 0.0638 | 0.0879 | 7.73% | 87.01% | 0.0076 | 2.13% | 0.9883 | 2435 | 2250 | 92.39% |
| | TCN-LSTM | 0.0696 | 0.0902 | 10.11% | 80.13% | 0.0078 | 2.70% | 0.9821 | 1808 | 1658 | 91.71% |
| | CNN-GRU | 0.0754 | 0.1041 | 12.09% | 87.68% | 0.0090 | 2.79% | 0.9816 | 2170 | 2020 | 93.10% |
| | LSTM-XGBoost | 0.0495 | 0.0745 | 7.44% | 84.28% | 0.0074 | 1.97% | 0.9945 | 2536 | 2335 | 92.07% |
| | GNN-TCN | 0.0484 | 0.0697 | 10.70% | 89.59% | 0.0069 | 2.20% | 0.9860 | 3559 | 3335 | 93.69% |
| | MSFS | 0.0573 | 0.0954 | 9.08% | 91.77% | 0.0083 | 1.74% | 0.9950 | 14,400 | 13,386 | 92.96% |
| Turbine #5 | CNN-LSTM | 0.0633 | 0.0934 | 9.15% | 85.02% | 0.0079 | 2.33% | 0.9853 | 2687 | 2441 | 90.84% |
| | Transformers | 0.0660 | 0.0983 | 10.50% | 81.03% | 0.0083 | 2.41% | 0.9829 | 2002 | 1892 | 94.51% |
| | TCN-LSTM | 0.0700 | 0.1015 | 11.26% | 81.68% | 0.0086 | 2.67% | 0.9792 | 1936 | 1746 | 90.17% |
| | CNN-GRU | 0.0665 | 0.0881 | 11.45% | 84.80% | 0.0075 | 2.93% | 0.9762 | 2099 | 1989 | 94.74% |
| | LSTM-XGBoost | 0.0700 | 0.0974 | 8.34% | 89.32% | 0.0079 | 2.13% | 0.9882 | 2517 | 2324 | 92.33% |
| | GNN-TCN | 0.0705 | 0.1007 | 8.82% | 89.75% | 0.0082 | 2.16% | 0.9871 | 3159 | 2859 | 90.52% |
| | MSFS | 0.0577 | 0.0965 | 6.35% | 91.95% | 0.0082 | 1.73% | 0.9906 | 14,400 | 13,251 | 92.02% |
| Turbine #6 | CNN-LSTM | 0.0562 | 0.0797 | 8.56% | 88.73% | 0.0073 | 2.09% | 0.9895 | 2635 | 2381 | 90.37% |
| | Transformers | 0.0595 | 0.0854 | 8.01% | 89.01% | 0.0078 | 2.14% | 0.9889 | 1874 | 1690 | 90.19% |
| | TCN-LSTM | 0.0583 | 0.0768 | 11.70% | 88.18% | 0.0071 | 2.53% | 0.9865 | 1767 | 1642 | 92.90% |
| | CNN-GRU | 0.0634 | 0.0893 | 13.02% | 89.48% | 0.0082 | 2.57% | 0.9816 | 2229 | 2056 | 92.25% |
| | LSTM-XGBoost | 0.0700 | 0.0974 | 9.01% | 81.77% | 0.0082 | 2.45% | 0.9877 | 3326 | 3090 | 92.90% |
| | GNN-TCN | 0.0708 | 0.1007 | 7.60% | 88.56% | 0.0085 | 2.47% | 0.9871 | 2569 | 2322 | 90.38% |
| | MSFS | 0.0495 | 0.0837 | 9.63% | 92.68% | 0.0077 | 1.60% | 0.9911 | 14,400 | 13,181 | 91.53% |
| NO | Model | MAE | RMSE | STDAPE | DA | TIC | MAPE | R2 | OP | SOP | SA |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Turbine #1 | CNN-LSTM | 0.1382 | 0.2116 | 1.57% | 89.94% | 0.0128 | 1.66% | 0.9953 | 169 | 152 | 90.21% |
| | Transformers | 0.1145 | 0.1788 | 1.89% | 85.16% | 0.0144 | 1.99% | 0.9959 | 128 | 118 | 92.95% |
| | TCN-LSTM | 0.1350 | 0.2145 | 3.87% | 88.68% | 0.0276 | 3.55% | 0.9609 | 159 | 143 | 90.29% |
| | CNN-GRU | 0.1101 | 0.1742 | 2.87% | 90.70% | 0.0162 | 2.63% | 0.9923 | 172 | 154 | 90.00% |
| | LSTM-XGBoost | 0.0636 | 0.1079 | 1.82% | 85.96% | 0.0120 | 1.56% | 0.9937 | 178 | 164 | 92.42% |
| | GNN-TCN | 0.1269 | 0.1880 | 3.05% | 94.06% | 0.0198 | 2.87% | 0.9776 | 202 | 190 | 94.42% |
| | MSFS | 0.1147 | 0.1834 | 2.96% | 89.38% | 0.0176 | 2.55% | 0.9908 | 1008 | 921 | 91.37% |
| Turbine #2 | CNN-LSTM | 0.2286 | 0.3823 | 2.88% | 89.68% | 0.0224 | 2.53% | 0.9802 | 126 | 115 | 91.34% |
| | Transformers | 0.0724 | 0.1345 | 2.22% | 94.12% | 0.0130 | 1.66% | 0.9953 | 170 | 161 | 94.97% |
| | TCN-LSTM | 0.1258 | 0.1917 | 3.43% | 91.88% | 0.0244 | 3.44% | 0.9817 | 160 | 149 | 93.23% |
| | CNN-GRU | 0.1512 | 0.2272 | 3.89% | 89.43% | 0.0246 | 3.72% | 0.9644 | 123 | 113 | 91.96% |
| | LSTM-XGBoost | 0.1211 | 0.1834 | 3.35% | 85.86% | 0.0217 | 3.20% | 0.9785 | 191 | 177 | 92.95% |
| | GNN-TCN | 0.1444 | 0.2141 | 3.22% | 92.44% | 0.0244 | 3.38% | 0.9772 | 238 | 216 | 90.94% |
| | MSFS | 0.1394 | 0.2244 | 3.28% | 90.67% | 0.0225 | 3.15% | 0.9850 | 1008 | 931 | 92.36% |
| Turbine #3 | CNN-LSTM | 0.1703 | 0.3118 | 2.51% | 93.72% | 0.0221 | 2.12% | 0.9862 | 191 | 172 | 90.05% |
| | Transformers | 0.1287 | 0.2448 | 5.28% | 87.74% | 0.0273 | 3.33% | 0.9721 | 155 | 140 | 90.76% |
| | TCN-LSTM | 0.1269 | 0.2341 | 3.96% | 88.59% | 0.0320 | 3.28% | 0.9442 | 149 | 141 | 94.85% |
| | CNN-GRU | 0.1521 | 0.2310 | 3.52% | 91.13% | 0.0229 | 3.42% | 0.9759 | 124 | 114 | 92.56% |
| | LSTM-XGBoost | 0.1184 | 0.1889 | 3.13% | 90.76% | 0.0229 | 2.87% | 0.9789 | 184 | 173 | 94.52% |
| | GNN-TCN | 0.1125 | 0.1709 | 2.99% | 93.17% | 0.0214 | 2.95% | 0.9791 | 205 | 192 | 93.68% |
| | MSFS | 0.1292 | 0.2212 | 3.50% | 91.07% | 0.0239 | 2.97% | 0.9802 | 1008 | 932 | 92.46% |
| Turbine #4 | CNN-LSTM | 0.1302 | 0.2816 | 2.43% | 90.45% | 0.0234 | 1.82% | 0.9887 | 199 | 184 | 92.56% |
| | Transformers | 0.1904 | 0.2836 | 5.22% | 90.43% | 0.0322 | 4.78% | 0.9575 | 188 | 174 | 92.64% |
| | TCN-LSTM | 0.1105 | 0.1771 | 2.90% | 90.40% | 0.0184 | 2.55% | 0.9898 | 198 | 178 | 90.40% |
| | CNN-GRU | 0.1512 | 0.2556 | 4.43% | 92.44% | 0.0250 | 3.44% | 0.9832 | 119 | 109 | 91.83% |
| | LSTM-XGBoost | 0.1095 | 0.1868 | 3.69% | 86.21% | 0.0231 | 2.86% | 0.9695 | 145 | 136 | 93.85% |
| | GNN-TCN | 0.1540 | 0.2471 | 4.05% | 86.16% | 0.0359 | 4.01% | 0.7544 | 159 | 148 | 93.71% |
| | MSFS | 0.1419 | 0.2401 | 4.06% | 89.38% | 0.0261 | 3.34% | 0.9767 | 1008 | 929 | 92.16% |
| Turbine #5 | CNN-LSTM | 0.1436 | 0.2565 | 2.12% | 89.76% | 0.0132 | 1.57% | 0.9931 | 166 | 154 | 93.11% |
| | Transformers | 0.1027 | 0.2091 | 3.12% | 85.38% | 0.0171 | 2.13% | 0.9922 | 171 | 162 | 94.91% |
| | TCN-LSTM | 0.1177 | 0.1944 | 3.57% | 90.28% | 0.0191 | 2.90% | 0.9907 | 216 | 194 | 90.06% |
| | CNN-GRU | 0.1535 | 0.2340 | 2.67% | 85.85% | 0.0208 | 2.93% | 0.9854 | 106 | 95 | 90.26% |
| | LSTM-XGBoost | 0.1059 | 0.1492 | 2.78% | 93.22% | 0.0182 | 2.86% | 0.9856 | 177 | 164 | 92.74% |
| | GNN-TCN | 0.1049 | 0.1503 | 3.06% | 90.70% | 0.0241 | 3.25% | 0.8206 | 172 | 155 | 90.48% |
| | MSFS | 0.1197 | 0.1932 | 2.99% | 89.48% | 0.0178 | 2.77% | 0.9920 | 1008 | 924 | 91.67% |
| Turbine #6 | CNN-LSTM | 0.1464 | 0.2681 | 2.06% | 93.87% | 0.0170 | 1.63% | 0.9889 | 163 | 153 | 94.36% |
| | Transformers | 0.0808 | 0.1198 | 2.60% | 91.95% | 0.0123 | 2.34% | 0.9950 | 174 | 162 | 93.34% |
| | TCN-LSTM | 0.0760 | 0.1153 | 2.35% | 93.21% | 0.0106 | 1.89% | 0.9960 | 162 | 146 | 90.18% |
| | CNN-GRU | 0.1332 | 0.2111 | 3.83% | 85.42% | 0.0219 | 3.31% | 0.9884 | 144 | 132 | 92.21% |
| | LSTM-XGBoost | 0.0727 | 0.1157 | 2.97% | 86.21% | 0.0146 | 2.24% | 0.9914 | 174 | 158 | 90.86% |
| | GNN-TCN | 0.1218 | 0.1922 | 2.85% | 90.58% | 0.0212 | 2.97% | 0.9872 | 191 | 172 | 90.15% |
| | MSFS | 0.1052 | 0.1774 | 2.93% | 90.28% | 0.0174 | 2.51% | 0.9912 | 1008 | 923 | 91.57% |
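The point-forecast metrics reported in the tables above (MAE, RMSE, MAPE, STDAPE, DA, TIC, R2) can be computed as in the following sketch. Standard textbook definitions are assumed throughout; the paper's exact variants (for example, how STDAPE is defined or how TIC is normalized) may differ.

```python
import numpy as np

def point_forecast_metrics(y_true, y_pred):
    """Compute common point-forecast accuracy metrics (standard definitions assumed)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true

    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    ape = np.abs(err) / np.abs(y_true)        # assumes y_true is nonzero
    mape = np.mean(ape)
    stdape = np.std(ape)                      # spread of the absolute percentage errors
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    # Theil's inequality coefficient (one common normalization)
    tic = rmse / (np.sqrt(np.mean(y_pred ** 2)) + np.sqrt(np.mean(y_true ** 2)))
    # Directional accuracy: fraction of steps where the predicted change
    # has the same sign as the actual change
    da = np.mean(np.sign(np.diff(y_pred)) == np.sign(np.diff(y_true)))
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "STDAPE": stdape,
            "R2": r2, "TIC": tic, "DA": da}
```

A perfect forecast yields MAE = RMSE = 0, R2 = 1, and DA = 100%, which is a quick sanity check when reproducing table values.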
[Figure: power-curve characteristics of the wind turbine — standard power curve (standard air density), power coefficient, actual power curve, air density correction function, and actual power coefficient curves (lower bound, standard, upper bound).]
| Parameter | Value |
|---|---|
| Type | SL1500/82 |
| Manufacturer | Sinovel |
| Rated Wind Speed | 10.5 m/s |
| Rotor Diameter | 82 m |
| Hub Height | 70 m |
| Cut-In Wind Speed | 3 m/s |
| Cut-Out Wind Speed | 22 m/s |
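Given the SL1500/82 parameters above, the theoretical output of a single turbine can be sketched from the power equation P = ½ρC<sub>p</sub>Av³ with an altitude/temperature air-density correction. This is a minimal sketch, not the paper's exact correction function: the isothermal barometric formula, the constant power coefficient, and the 1500 kW rated power (inferred from the model designation) are all simplifying assumptions.

```python
import math

R_AIR = 287.05   # specific gas constant for dry air, J/(kg*K)
P0 = 101325.0    # standard sea-level pressure, Pa
G = 9.80665      # gravitational acceleration, m/s^2

def air_density(altitude_m, temp_k):
    """Ideal-gas density with exponential pressure decay over altitude
    (isothermal barometric formula, a simplifying assumption)."""
    pressure = P0 * math.exp(-G * altitude_m / (R_AIR * temp_k))
    return pressure / (R_AIR * temp_k)

def theoretical_power_kw(wind_speed, altitude_m=0.0, temp_k=288.15, cp=0.45):
    """P = 0.5 * rho * Cp * A * v^3, capped at rated power above rated speed.
    Turbine parameters follow the SL1500/82 spec table; Cp and the 1500 kW
    rating are assumptions for illustration."""
    rotor_diameter = 82.0        # m, from the spec table
    rated_speed = 10.5           # m/s
    cut_in, cut_out = 3.0, 22.0  # m/s
    rated_power_kw = 1500.0      # assumed from the SL1500 designation

    if wind_speed < cut_in or wind_speed > cut_out:
        return 0.0
    v = min(wind_speed, rated_speed)
    area = math.pi * (rotor_diameter / 2.0) ** 2
    rho = air_density(altitude_m, temp_k)
    power_kw = 0.5 * rho * cp * area * v ** 3 / 1000.0
    return min(power_kw, rated_power_kw)
```

At higher site altitudes the exponential pressure term lowers ρ, which is exactly why an air-density correction matters when converting forecast wind speed to theoretical generation.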
| NO | Model | NRMD | RMSE | R2 | MPE | NO | Model | NRMD | RMSE | R2 | MPE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Turbine #1 | CNN-LSTM | 4.778% | 71.5522 | 0.9928 | −4.760% | Turbine #4 | CNN-LSTM | 4.155% | 61.2387 | 0.9958 | −4.042% |
| | Transformers | 4.973% | 75.5793 | 0.9936 | −5.434% | | Transformers | 4.561% | 68.8929 | 0.9949 | −4.371% |
| | TCN-LSTM | 4.984% | 78.0304 | 0.9943 | −5.699% | | TCN-LSTM | 4.467% | 66.3624 | 0.9968 | −4.253% |
| | CNN-GRU | 4.802% | 72.2187 | 0.9920 | −5.348% | | CNN-GRU | 4.400% | 64.7776 | 0.9959 | −4.186% |
| | LSTM-XGBoost | 4.886% | 75.4574 | 0.9937 | −5.413% | | LSTM-XGBoost | 4.377% | 64.5617 | 0.9952 | −4.176% |
| | GNN-TCN | 4.813% | 72.8123 | 0.9930 | −5.379% | | GNN-TCN | 4.361% | 64.3825 | 0.9960 | −4.056% |
| | MSFS | 4.438% | 67.6901 | 0.9957 | −3.823% | | MSFS | 4.037% | 58.9629 | 0.9973 | −3.253% |
| Turbine #2 | CNN-LSTM | 4.820% | 71.2050 | 0.9932 | −4.725% | Turbine #5 | CNN-LSTM | 3.960% | 57.1472 | 0.9975 | −4.252% |
| | Transformers | 4.973% | 74.1480 | 0.9934 | −5.530% | | Transformers | 4.006% | 57.8198 | 0.9971 | −4.276% |
| | TCN-LSTM | 4.968% | 73.3896 | 0.9936 | −5.428% | | TCN-LSTM | 4.070% | 58.4351 | 0.9973 | −4.351% |
| | CNN-GRU | 4.737% | 68.7901 | 0.9938 | −4.542% | | CNN-GRU | 3.823% | 54.7854 | 0.9978 | −4.194% |
| | LSTM-XGBoost | 5.038% | 74.9899 | 0.9940 | −5.598% | | LSTM-XGBoost | 3.980% | 57.3913 | 0.9973 | −4.266% |
| | GNN-TCN | 4.829% | 71.5200 | 0.9948 | −4.926% | | GNN-TCN | 3.928% | 56.7533 | 0.9974 | −4.228% |
| | MSFS | 4.262% | 64.6974 | 0.9961 | −3.329% | | MSFS | 3.808% | 54.6707 | 0.9979 | −3.253% |
| Turbine #3 | CNN-LSTM | 4.186% | 59.5443 | 0.9974 | −4.649% | Turbine #6 | CNN-LSTM | 7.737% | 115.4382 | 0.9836 | −3.901% |
| | Transformers | 4.228% | 59.8509 | 0.9978 | −4.708% | | Transformers | 7.901% | 117.7301 | 0.9931 | −5.837% |
| | TCN-LSTM | 4.167% | 59.3860 | 0.9974 | −4.577% | | TCN-LSTM | 7.704% | 114.5872 | 0.9828 | −3.253% |
| | CNN-GRU | 4.122% | 58.6405 | 0.9976 | −4.545% | | CNN-GRU | 6.092% | 113.6973 | 0.9819 | −2.887% |
| | LSTM-XGBoost | 4.213% | 59.6998 | 0.9976 | −4.707% | | LSTM-XGBoost | 7.754% | 115.7431 | 0.9903 | −5.735% |
| | GNN-TCN | 4.117% | 58.6096 | 0.9975 | −4.452% | | GNN-TCN | 7.613% | 113.7532 | 0.9828 | −2.643% |
| | MSFS | 4.067% | 57.8520 | 0.9979 | −3.897% | | MSFS | 5.458% | 83.5771 | 0.9828 | −1.882% |
| NO | Comparison | A- | p-Value (KW) | p-Value (KS) | FICP | FINAW | AWD |
|---|---|---|---|---|---|---|---|
| Turbine #1 | CNN-LSTM vs. Actual Data | −2.128 | 1.0000 | 0.9935 | 74.21% | 0.02827 | 0.5032 |
| | Transformers vs. Actual Data | −0.604 | 1.0000 | 0.9881 | 73.61% | 0.03073 | 0.5470 |
| | TCN-LSTM vs. Actual Data | 0.062 | 1.0000 | 0.9969 | 67.06% | 0.03385 | 0.6026 |
| | CNN-GRU vs. Actual Data | −0.200 | 1.0000 | 0.9995 | 69.84% | 0.03163 | 0.5631 |
| | LSTM-XGBoost vs. Actual Data | −1.375 | 1.0000 | 0.9999 | 87.10% | 0.01723 | 0.3066 |
| | GNN-TCN vs. Actual Data | 1.258 | 1.0000 | 1.0000 | 68.25% | 0.03355 | 0.5971 |
| | MSFS vs. Actual Data | −1.346 | 1.0000 | 0.9935 | 95.24% | 0.01005 | 0.1788 |
| Turbine #2 | CNN-LSTM vs. Actual Data | 0.882 | 1.0000 | 0.9987 | 61.90% | 0.06732 | 1.3600 |
| | Transformers vs. Actual Data | −0.764 | 1.0000 | 0.9125 | 69.94% | 0.06432 | 1.2992 |
| | TCN-LSTM vs. Actual Data | 10.351 | 1.0000 | 0.9798 | 64.98% | 0.07450 | 1.5048 |
| | CNN-GRU vs. Actual Data | −4.923 | 1.0000 | 0.8580 | 62.10% | 0.10678 | 2.1569 |
| | LSTM-XGBoost vs. Actual Data | −2.086 | 1.0000 | 0.9684 | 66.07% | 0.06840 | 1.3816 |
| | GNN-TCN vs. Actual Data | −1.239 | 1.0000 | 0.9881 | 65.58% | 0.07384 | 1.4916 |
| | MSFS vs. Actual Data | 9.611 | 1.0000 | 0.9535 | 93.45% | 0.01725 | 0.3484 |
| Turbine #3 | CNN-LSTM vs. Actual Data | −0.030 | 1.0000 | 0.9987 | 87.60% | 0.07548 | 1.6983 |
| | Transformers vs. Actual Data | −9.782 | 1.0000 | 0.9125 | 56.15% | 0.13604 | 3.0609 |
| | TCN-LSTM vs. Actual Data | −0.765 | 1.0000 | 0.9798 | 58.04% | 0.12665 | 2.8497 |
| | CNN-GRU vs. Actual Data | 8.904 | 1.0000 | 0.8580 | 52.98% | 0.14186 | 3.1917 |
| | LSTM-XGBoost vs. Actual Data | 0.954 | 1.0000 | 0.8265 | 65.77% | 0.12122 | 2.7274 |
| | GNN-TCN vs. Actual Data | 8.791 | 1.0000 | 0.9881 | 62.80% | 0.12189 | 2.7424 |
| | MSFS vs. Actual Data | −1.833 | 1.0000 | 0.9935 | 94.94% | 0.04645 | 1.0450 |
| Turbine #4 | CNN-LSTM vs. Actual Data | −2.128 | 1.0000 | 0.9935 | 87.10% | 0.01723 | 0.3066 |
| | Transformers vs. Actual Data | −0.604 | 1.0000 | 0.9881 | 73.61% | 0.03073 | 0.5470 |
| | TCN-LSTM vs. Actual Data | 0.062 | 1.0000 | 0.9969 | 74.21% | 0.02827 | 0.5032 |
| | CNN-GRU vs. Actual Data | −0.200 | 1.0000 | 0.9995 | 69.84% | 0.03163 | 0.5631 |
| | LSTM-XGBoost vs. Actual Data | −1.375 | 1.0000 | 0.9999 | 67.06% | 0.03385 | 0.6026 |
| | GNN-TCN vs. Actual Data | 1.258 | 1.0000 | 1.0000 | 68.25% | 0.03355 | 0.5971 |
| | MSFS vs. Actual Data | −1.346 | 1.0000 | 0.9935 | 95.24% | 0.01005 | 0.1788 |
| Turbine #5 | CNN-LSTM vs. Actual Data | 0.990 | 1.0000 | 1.0000 | 86.11% | 0.02955 | 0.5378 |
| | Transformers vs. Actual Data | 4.540 | 1.0000 | 0.9999 | 72.62% | 0.05190 | 0.9446 |
| | TCN-LSTM vs. Actual Data | 4.511 | 1.0000 | 0.9987 | 66.57% | 0.05419 | 0.9863 |
| | CNN-GRU vs. Actual Data | 3.547 | 1.0000 | 0.9999 | 65.38% | 0.05611 | 1.0212 |
| | LSTM-XGBoost vs. Actual Data | 5.337 | 1.0000 | 1.0000 | 69.54% | 0.05220 | 0.9500 |
| | GNN-TCN vs. Actual Data | 2.429 | 1.0000 | 0.9999 | 63.89% | 0.06312 | 1.1487 |
| | MSFS vs. Actual Data | 4.853 | 1.0000 | 0.9995 | 94.35% | 0.01510 | 0.2748 |
| Turbine #6 | CNN-LSTM vs. Actual Data | 0.600 | 1.0000 | 1.0000 | 87.90% | 0.02369 | 0.4595 |
| | Transformers vs. Actual Data | 5.788 | 1.0000 | 0.9995 | 69.15% | 0.04155 | 0.8061 |
| | TCN-LSTM vs. Actual Data | 7.281 | 1.0000 | 0.9987 | 75.20% | 0.04037 | 0.7833 |
| | CNN-GRU vs. Actual Data | 7.008 | 1.0000 | 0.9881 | 66.37% | 0.04717 | 0.9151 |
| | LSTM-XGBoost vs. Actual Data | 0.817 | 1.0000 | 0.9935 | 71.83% | 0.04124 | 0.8001 |
| | GNN-TCN vs. Actual Data | 6.554 | 1.0000 | 0.9881 | 68.65% | 0.04514 | 0.8758 |
| | MSFS vs. Actual Data | 2.952 | 1.0000 | 0.9999 | 93.35% | 0.01339 | 0.2598 |
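The interval metrics in the last three columns can be sketched as below. This assumes FICP is the empirical coverage probability of the forecast intervals, FINAW the average interval width normalized by the target range, and AWD an accumulated width deviation of points falling outside their interval (all standard PICP/PINAW-style definitions); the paper's exact formulations may differ.

```python
import numpy as np

def interval_metrics(y, lower, upper):
    """Coverage (FICP), normalized average width (FINAW), and accumulated
    width deviation (AWD) of prediction intervals; definitions assumed."""
    y, lower, upper = map(np.asarray, (y, lower, upper))
    width = upper - lower
    covered = (y >= lower) & (y <= upper)

    ficp = covered.mean()                        # fraction of points inside the interval
    finaw = width.mean() / (y.max() - y.min())   # mean width over the target range
    # deviation of uncovered points, measured relative to the interval width
    below = np.where(y < lower, (lower - y) / width, 0.0)
    above = np.where(y > upper, (y - upper) / width, 0.0)
    awd = (below + above).mean()
    return ficp, finaw, awd
```

Under these definitions a good interval forecast has FICP near the nominal confidence level with small FINAW and AWD, which matches the pattern of the MSFS rows above.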
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Cite as: Zeng, M.; Jia, Q.; Wen, Z.; Mao, F.; Huang, H.; Pan, J. Research and Application of a Model Selection Forecasting System for Wind Speed and Theoretical Power Generation. Future Internet 2026, 18, 7. https://doi.org/10.3390/fi18010007
