Short-Term Wind Speed Prediction for Bridge Site Area Based on Wavelet Denoising OOA-Transformer
Abstract
1. Introduction
- A novel outlier processing method combining quartile analysis with a Wavelet Denoising strategy is proposed. Outliers are identified from the upper and lower quartiles of the data, and wavelet coefficients whose magnitudes fall below a threshold are shrunk or set to zero. By combining outlier processing and noise reduction, the quality of the wind speed data series is effectively enhanced.
- A short-term wind speed prediction model for the bridge site area based on the OOA-Transformer model is proposed. The model adopts the Transformer to capture short-term and long-term dependencies and nonlinear wind speed characteristics in the wind speed series. The OOA is used to optimize the key parameters of the Transformer model to improve prediction accuracy and stability.
- Experiments conducted on the actual wind speed dataset of the Xuefeng Lake Bridge demonstrate that the proposed model outperforms other models in all key evaluation metrics (MAPE, RMSE, and R²).
2. Related Works
3. Materials and Methods
3.1. Data Sources and Pre-processing
3.2. Methods
3.2.1. Wavelet Denoising
3.2.2. Transformer
- Positional Encoding: Since the Transformer model does not include recurrent or convolutional operations, positional encoding retains the temporal attributes associated with each data point, enabling the model to use this information to understand the sequential relationships in the series. Positional encoding is generated by alternating sine and cosine functions, with each dimension corresponding to a frequency: PE(pos, 2i) = sin(pos/10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos/10000^(2i/d_model)).
- Multi-Head Attention allows the model to learn information simultaneously from different representational subspaces. This mechanism enhances the model’s focus and learning capabilities, enabling it to capture information at different levels of the sequence. Scaled dot-product attention is defined as Attention(Q, K, V) = softmax(QKᵀ/√d_k)V.
- Feed-Forward Network (FFN) provides additional nonlinear processing capability in each layer of the Transformer model. The feed-forward network consists of two linear transformations separated by a nonlinear activation function: FFN(x) = max(0, xW₁ + b₁)W₂ + b₂.
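As a concrete illustration of the components above, the sinusoidal positional encoding and scaled dot-product attention can be sketched in NumPy (a minimal sketch of the standard formulations from Vaswani et al.; the learned projections and multi-head splitting are omitted):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Fixed sine/cosine positional encoding; d_model must be even."""
    positions = np.arange(seq_len)[:, None]                       # (seq_len, 1)
    div_terms = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)                   # even dims: sine
    pe[:, 1::2] = np.cos(positions * div_terms)                   # odd dims: cosine
    return pe

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)                # row-wise softmax
    return weights @ V
```

In a full model the encoding is added to the input embeddings before the first encoder layer, and the attention is applied per head on projected Q, K, V.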
3.2.3. OOA-Transformer
- Initialization: Identify the hyperparameters of the Transformer model to be optimized, and set the parameters of the OOA: the size of the osprey population (the number of candidate Transformer configurations), the dimension of the search space (set to 1 for single-variable wind speed prediction), and the maximum number of iterations. First, form an initial solution set according to Equations (12) and (13):
- Global Exploration (First Phase): For each Transformer model (candidate solution) in the population, first define the fitness function:
- Local Exploitation (Second Phase): At each iteration, and for each Transformer model, generate a new candidate solution using the following equation:
- Repeat: Repeat Steps 2 and 3 until the termination condition (maximum number of iterations) is met.
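The four steps above can be sketched as a simplified single-variable OOA loop. This is illustrative only: the fitness function, bounds, and update rules here are stand-ins for the paper's Equations (12)-(13) and phase equations, which are not reproduced in this outline.

```python
import random

def ooa_minimize(fitness, lb, ub, pop_size=10, max_iter=50, seed=0):
    """Simplified sketch of the Osprey Optimization Algorithm
    (Dehghani & Trojovsky, 2023) for one scalar hyperparameter."""
    rng = random.Random(seed)
    # Initialization: random solutions inside the search bounds.
    pop = [lb + rng.random() * (ub - lb) for _ in range(pop_size)]
    fit = [fitness(x) for x in pop]
    for t in range(1, max_iter + 1):
        for i in range(pop_size):
            # Phase 1 (global exploration): move toward a randomly chosen
            # better solution (a detected "fish position").
            fishes = [pop[j] for j in range(pop_size) if fit[j] < fit[i]]
            target = rng.choice(fishes) if fishes else pop[i]
            cand = pop[i] + rng.random() * (target - rng.randint(1, 2) * pop[i])
            cand = min(max(cand, lb), ub)
            if fitness(cand) < fit[i]:
                pop[i], fit[i] = cand, fitness(cand)
            # Phase 2 (local exploitation): perturbation that shrinks as the
            # iteration counter t grows; greedy acceptance keeps improvements.
            cand = pop[i] + (lb + rng.random() * (ub - lb)) / t
            cand = min(max(cand, lb), ub)
            if fitness(cand) < fit[i]:
                pop[i], fit[i] = cand, fitness(cand)
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

In the paper's setting, `fitness` would train or validate a Transformer with the candidate hyperparameter value and return a validation error.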
3.2.4. The Overall Framework of Our Proposed Model
- Step 1: The process begins by applying the Interquartile Range (IQR) method to the initial wind speed data to remove potential errors and outliers. This method calculates the IQR and excludes the data points lying outside Q1 − 1.5·IQR and Q3 + 1.5·IQR, improving data reliability. Following this preprocessing, the data are denoised using a Bior2.2 wavelet, which addresses non-stationary features in the data through wavelet decomposition, soft thresholding, and reconstruction.
- Step 2: After the Wavelet Denoising, min-max normalization is employed to scale the values to a [0, 1] range. This normalization facilitates the efficient processing of data. The Transformer model’s parameters are optimized by the Osprey Optimization Algorithm, thus enhancing model performance.
- Step 3: In the final step, the processed data are fed into the OOA-Transformer model to predict future wind speeds. The predictions undergo inverse normalization to revert them to their original scale, thus allowing for comparison with the actual data over different time intervals, as displayed in the diagram.
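The preprocessing operations in Steps 1-2 and the inverse normalization of Step 3 can be sketched as follows. The wavelet decomposition/reconstruction itself is left to a library (e.g., PyWavelets provides the Bior2.2 basis); the function names here are illustrative, not the authors' code.

```python
import numpy as np

def iqr_filter(x: np.ndarray) -> np.ndarray:
    """Drop points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return x[(x >= q1 - 1.5 * iqr) & (x <= q3 + 1.5 * iqr)]

def soft_threshold(coeffs: np.ndarray, thr: float) -> np.ndarray:
    """Soft thresholding applied to wavelet detail coefficients."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def min_max_normalize(x: np.ndarray):
    """Scale to [0, 1]; return the (min, max) needed for inversion."""
    lo, hi = x.min(), x.max()
    return (x - lo) / (hi - lo), (lo, hi)

def inverse_normalize(y: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Revert predictions to the original wind speed scale."""
    return y * (hi - lo) + lo
```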
4. Results and Discussion
4.1. The Results of Wavelet Denoising
4.2. Compared with Other Prediction Models
4.2.1. Parameter Settings
- window_len: This parameter defines the length of the input sequence for the model. In time series analysis (such as wind speed prediction), a window_len of 1 means using data from the previous hour to predict the wind speed at the next time point (or points);
- target_len: This specifies the output sequence length from the model. A target_len of 1 indicates that the model predicts wind speed for a single future time point;
- num_encoder_layers: The number of encoder layers in the Transformer model is 2. Encoder layers are responsible for processing the input sequence, and each additional layer can increase the model’s ability to capture complex relationships;
- num_decoder_layers: The number of decoder layers in the Transformer model is also 2. Decoder layers are used to generate the output sequence;
- d_model: The dimensionality of the output for all layers within the model is 64. For both encoder and decoder layers, d_model is the dimension size of their output vectors;
- num_heads: The model uses a multi-head attention mechanism with two heads. This allows the model to learn information from different representational subspaces at the same time;
- feedforward_dim: The dimension of the feedforward network in both encoder and decoder layers is 64. This represents the size of the hidden layer, which performs a separate, fully connected transformation on each position following the self-attention layer;
- Dropout: The dropout rate for regularization in the model is 0.1. Dropout is a common regularization technique that prevents overfitting by randomly dropping (setting to zero) a portion of the feature values;
- positional_encoding: “sinusoidal”. Since the Transformer model lacks recurrent or convolutional structures, it uses positional encoding to incorporate sequence order information, and “sinusoidal” refers to using fixed sinusoidal and cosinusoidal functions for positional encoding.
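Gathered together, the settings above can be expressed as a configuration sketch. The dataclass and field names are assumptions for illustration; they mirror the parameter list rather than the authors' actual code.

```python
from dataclasses import dataclass

@dataclass
class TransformerConfig:
    """Baseline Transformer settings listed in Section 4.2.1."""
    window_len: int = 1            # input sequence length (time steps)
    target_len: int = 1            # predict one future time point
    num_encoder_layers: int = 2
    num_decoder_layers: int = 2
    d_model: int = 64              # layer output dimensionality
    num_heads: int = 2             # multi-head attention heads
    feedforward_dim: int = 64      # FFN hidden size
    dropout: float = 0.1
    positional_encoding: str = "sinusoidal"

    def __post_init__(self):
        # d_model must split evenly across attention heads
        assert self.d_model % self.num_heads == 0
```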
4.2.2. Evaluation Criteria
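The evaluation metrics reported in the experiments (MAPE, RMSE, and R²) follow their standard definitions, which can be sketched as:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def rmse(y_true, y_pred):
    """Root mean squared error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination R^2."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

Lower MAPE and RMSE and an R² closer to 1 indicate better predictions.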
4.2.3. The Experiment Results
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Liu, Y.; Lin, P.; Liang, X. Study of Wind Resistance Measures for Installing of Arches of a Long Span Steel Truss Girder Bridge with Flexible Arches on High-Speed Railway. Bridge Constr. 2018, 48, 40–44. [Google Scholar]
- Zhang, T.; Tang, M.; Li, M.; Li, C. Study of Aerostatic Safety of Long-Span Suspension Bridge in Process of Stiffening Girder Hoisting. Bridge Constr. 2019, 49, 6. [Google Scholar]
- Ding, Y. In-Situ Casting Technique for 49.2m Concrete Box Beam in Complicated Sea Area Using Offshore Bridge Building Machine. World Bridg. 2020, 48, 5. [Google Scholar]
- Changqing, W.; Zhitian, Z.; Xiaobo, W. Influences of wind cables on dynamic properties and aerostatic stability of pedestrian suspension bridges. Bridge Constr. 2017, 47, 77–82. [Google Scholar]
- Fan, W.; Zhang, M.; Chen, N. Wind-Resistant Measures for Long Cantilever Erection of Steel Truss Girder of Huanggang Changjiang River Rail-cum-Road Bridge. Bridge Constr. 2013, 43, 5. [Google Scholar]
- JTG/T3650—2020; Technical Specification for Construction of Highway Bridge and Culvert. China First Highway Engineering Co., Ltd.: Beijing, China, 2020.
- Tang, Q.; Shi, R.; Fan, T.; Ma, Y.; Huang, J. Prediction of financial time series based on LSTM using wavelet transform and singular spectrum analysis. Math. Probl. Eng. 2021, 2021, 9942410. [Google Scholar] [CrossRef]
- Zhang, X. Financial Time Series Forecasting Based on LSTM Neural Network Optimized by Wavelet Denoising and Whale Optimization Algorithm. Acad. J. Comput. Inf. Sci. 2022, 5, 1–9. [Google Scholar]
- Dehghani, M.; Trojovskỳ, P. Osprey optimization algorithm: A new bio-inspired metaheuristic algorithm for solving engineering optimization problems. Front. Mech. Eng. 2023, 8, 1126450. [Google Scholar] [CrossRef]
- Li, F.; Ren, G.; Lee, J. Multi-step wind speed prediction based on turbulence intensity and hybrid deep neural networks. Energy Convers. Manag. 2019, 186, 306–322. [Google Scholar] [CrossRef]
- Yang, M.; Guo, Y.; Huang, Y. Wind power ultra-short-term prediction method based on NWP wind speed correction and double clustering division of transitional weather process. Energy 2023, 282, 128947. [Google Scholar] [CrossRef]
- Wang, H.; Han, S.; Liu, Y.; Yan, J.; Li, L. Sequence transfer correction algorithm for numerical weather prediction wind speed and its application in a wind power forecasting system. Appl. Energy 2019, 237, 1–10. [Google Scholar] [CrossRef]
- Zhang, Y.; Zhao, Y.; Shen, X.; Zhang, J. A comprehensive wind speed prediction system based on Monte Carlo and artificial intelligence algorithms. Appl. Energy 2022, 305, 117815. [Google Scholar] [CrossRef]
- Zhang, Y.; Zhao, Y.; Kong, C.; Chen, B. A new prediction method based on VMD-PRBF-ARMA-E model considering wind speed characteristic. Energy Convers. Manag. 2020, 203, 112254. [Google Scholar] [CrossRef]
- Yunus, K.; Thiringer, T.; Chen, P. ARIMA-based frequency-decomposed modeling of wind speed time series. IEEE Trans. Power Syst. 2015, 31, 2546–2556. [Google Scholar] [CrossRef]
- Box, G.E.; Pierce, D.A. Distribution of residual autocorrelations in autoregressive-integrated moving average time series models. J. Am. Stat. Assoc. 1970, 65, 1509–1526. [Google Scholar] [CrossRef]
- Yu, E.; Wei, H.; Han, Y.; Hu, P.; Xu, G. Application of time series prediction techniques for coastal bridge engineering. Adv. Bridge Eng. 2021, 2, 6. [Google Scholar] [CrossRef]
- Wu, C.; Wang, J.; Chen, X.; Du, P.; Yang, W. A novel hybrid system based on multi-objective optimization for wind speed forecasting. Renew. Energy 2020, 146, 149–165. [Google Scholar] [CrossRef]
- Alexiadis, M.; Dokopoulos, P.; Sahsamanoglou, H.; Manousaridis, I. Short-term forecasting of wind speed and related electrical power. Sol. Energy 1998, 63, 61–68. [Google Scholar] [CrossRef]
- Shao, L.; Wen, S. Research on Obtaining Average Wind Speed of Roadway Based on GRU Neural Network. Gold Sci. Technol. 2021, 29, 709–718. [Google Scholar]
- Alves, D.; Mendonça, F.; Mostafa, S.S.; Morgado-Dias, F. The Potential of Machine Learning for Wind Speed and Direction Short-Term Forecasting: A Systematic Review. Computers 2023, 12, 206. [Google Scholar] [CrossRef]
- Pan, H.; Tang, Y.; Wang, G. A Stock Index Futures Price Prediction Approach Based on the MULTI-GARCH-LSTM Mixed Model. Mathematics 2024, 12, 1677. [Google Scholar] [CrossRef]
- Ma, G.; Yue, X.; Zhu, J.; Liu, Z.; Lu, S. Deep learning network based on improved sparrow search algorithm optimization for rolling bearing fault diagnosis. Mathematics 2023, 11, 4634. [Google Scholar] [CrossRef]
- Noh, J.; Park, H.J.; Kim, J.S.; Hwang, S.J. Gated recurrent unit with genetic algorithm for product demand forecasting in supply chain management. Mathematics 2020, 8, 565. [Google Scholar] [CrossRef]
- Qian, Z.; Pei, Y.; Zareipour, H.; Chen, N. A review and discussion of decomposition-based hybrid models for wind energy forecasting applications. Appl. Energy 2019, 235, 939–953. [Google Scholar] [CrossRef]
- Zhu, Q.; Ye, L.; Zhao, Y.; Lang, Y.; Song, X. Methods for elimination and reconstruction of abnormal power data in wind farms. Dianli Xitong Baohu Yu Kongzhi/Power Syst. Prot. Control 2015, 43, 38–45. [Google Scholar]
- Ma, Q.; Liu, S.; Fan, X.; Chai, C.; Wang, Y.; Yang, K. A time series prediction model of foundation pit deformation based on empirical wavelet transform and NARX network. Mathematics 2020, 8, 1535. [Google Scholar] [CrossRef]
- Heidari, A.A.; Akhoondzadeh, M.; Chen, H. A wavelet PM2. 5 prediction system using optimized kernel extreme learning with Boruta-XGBoost feature selection. Mathematics 2022, 10, 3566. [Google Scholar] [CrossRef]
- Liu, B.; Zhao, S.; Yu, X.; Zhang, L.; Wang, Q. A Novel Deep Learning Approach for Wind Power Forecasting Based on WD-LSTM Model. Energies 2020, 13, 4964. [Google Scholar] [CrossRef]
- Graves, A. Long short-term memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45. [Google Scholar]
- Hongfeng, C.; He, W.; Yan, L.; Min, X. Short-Term Wind Speed Prediction by Combining Two-Step Decomposition and ARIMA-LSTM. Acta Energiae Solaris Sin. 2024, 45, 164–171. [Google Scholar]
- Liu, M.D.; Ding, L.; Bai, Y.L. Application of hybrid model based on empirical mode decomposition, novel recurrent neural networks and the ARIMA to wind speed prediction. Energy Convers. Manag. 2021, 233, 113917. [Google Scholar] [CrossRef]
- Bai, Y.T.; Jia, W.; Jin, X.B.; Su, T.L.; Kong, J.L.; Shi, Z.G. Nonstationary time series prediction based on deep echo state network tuned by Bayesian optimization. Mathematics 2023, 11, 1503. [Google Scholar] [CrossRef]
- Shaoqin, W.; Jingshu, L.; Minghao, G. Short-Term Wind Speed Prediction in Bridge Area Based on EMD-SSA-BiLSTM Method. Comput. Integr. Manuf. Syst. 2023, 12, 1–8. [Google Scholar]
- Suo, L.; Peng, T.; Song, S.; Zhang, C.; Wang, Y.; Fu, Y.; Nazir, M.S. Wind speed prediction by a swarm intelligence based deep learning model via signal decomposition and parameter optimization using improved chimp optimization algorithm. Energy 2023, 276, 127526. [Google Scholar] [CrossRef]
- Li, Y.; Sun, K.; Yao, Q.; Wang, L. A dual-optimization wind speed forecasting model based on deep learning and improved dung beetle optimization algorithm. Energy 2024, 286, 129604. [Google Scholar] [CrossRef]
- Zhu, A.; Zhao, Q.; Yang, T.; Zhou, L.; Zeng, B. Wind speed prediction and reconstruction based on improved grey wolf optimization algorithm and deep learning networks. Comput. Electr. Eng. 2024, 114, 109074. [Google Scholar] [CrossRef]
- Sun, P.; Liu, Z.; Wang, J.; Zhao, W. Interval forecasting for wind speed using a combination model based on multiobjective artificial hummingbird algorithm. Appl. Soft Comput. 2024, 150, 111090. [Google Scholar] [CrossRef]
- Guo, X.; Zhu, C.; Hao, J.; Kong, L.; Zhang, S. A Point-Interval Forecasting Method for Wind Speed Using Improved Wild Horse Optimization Algorithm and Ensemble Learning. Sustainability 2023, 16, 94. [Google Scholar] [CrossRef]
- Conte, T.; Oliveira, R. Comparative Analysis between Intelligent Machine Committees and Hybrid Deep Learning with Genetic Algorithms in Energy Sector Forecasting: A Case Study on Electricity Price and Wind Speed in the Brazilian Market. Energies 2024, 17, 829. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30. [Google Scholar] [CrossRef]
- Zhang, Z.; Chang, L.; Tian, M.; Deng, L.; Chang, J.; Dong, L. Power Consumption Time Series Forecast Based on CATPCA for Optimal Transformer Satellite. Trans. Beijing Inst. Technol. 2023, 43, 744–754. [Google Scholar]
- Minghao, W.; Xinghua, X.; Juntao, C.; Guangjun, S.; Weifei, H. Dam Deformation Prediction Research Based on LSTM and Transformer. China Rural. Water Hydropower 2024, 4, 250–257. [Google Scholar] [CrossRef]
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430. [Google Scholar]
- Zhou, T.; Ma, Z.; Wen, Q.; Wang, X.; Sun, L.; Jin, R. Fedformer: Frequency enhanced decomposed transformer for long-term series forecasting. In Proceedings of the International Conference on Machine Learning, Baltimore, MD, USA, 17–23 July 2022; pp. 27268–27286. [Google Scholar]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 20–27 February 2021; Volume 35, pp. 11106–11115. [Google Scholar]
- Qu, K.; Si, G.; Shan, Z.; Kong, X.; Yang, X. Short-term forecasting for multiple wind farms based on transformer model. Energy Rep. 2022, 8, 483–490. [Google Scholar] [CrossRef]
- Shan, L.; Liu, Y.; Tang, M.; Yang, M.; Bai, X. CNN-BiLSTM hybrid neural networks with attention mechanism for well log prediction. J. Pet. Sci. Eng. 2021, 205, 108838. [Google Scholar] [CrossRef]
Dataset | Min | Max | Mean | Std | Skewness | Kurtosis |
---|---|---|---|---|---|---|
Raw | 0.0000 | 25.2060 | 8.0777 | 5.1068 | 0.6619 | −0.3347 |
IQR | 0.2052 | 22.8112 | 8.0336 | 5.0457 | 0.6317 | −0.4290 |
Wavelet | SNR | MSE | NCC |
---|---|---|---|
Db4 | 22.2305 | 0.5384 | 0.9917 |
Sym4 | 22.1369 | 0.5502 | 0.9918 |
Coif1 | 21.7374 | 0.6032 | 0.9910 |
Bior2.2 | 22.9762 | 0.4535 | 0.9931 |
Model | Dropout | lr | Lookback |
---|---|---|---|
LSTM | 0.2 | 0.001 | 100 |
LSTM-AT | 0.2 | 0.001 | 100 |
BiLSTM-AT | 0.2 | 0.001 | 100 |
CNN-BiLSTM-AT | 0.2 | 0.001 | 100 |
Parameter | Min | Max |
---|---|---|
window_len | 1 | 50 |
num_encoder_layers | 1 | 10 |
num_decoder_layers | 1 | 10 |
d_model | 16 | 512 |
num_heads | 1 | 16 |
feedforward_dim | 16 | 512 |
dropout |  | 0.5 |
Model | Parameter | Optimum Value |
---|---|---|
OOA-CNN-BiLSTM-AT | lookback | 148 |
 | dropout | 0.0105 |
 | learning rate (lr) | 0.0031 |
OOA-Transformer | window_len | 1 |
 | num_encoder_layers | 1 |
 | num_decoder_layers | 1 |
 | d_model | 16 |
 | num_heads | 1 |
 | feedforward_dim | 16 |
 | dropout | |
Model | MAPE (%) | RMSE | R² |
---|---|---|---|
LSTM | 10.36 | 0.0406 | 0.9679 |
LSTM-AT | 9.35 | 0.0383 | 0.9716 |
BiLSTM-AT | 9.21 | 0.0375 | 0.9727 |
CNN-BiLSTM-AT | 8.93 | 0.0350 | 0.9762 |
OOA-CNN-BiLSTM-AT | 4.57 | 0.0188 | 0.9930 |
Transformer | 6.53 | 0.0296 | 0.9830 |
OOA-Transformer | 4.16 | 0.0152 | 0.9955 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gao, Y.; Cao, B.; Yu, W.; Yi, L.; Guo, F. Short-Term Wind Speed Prediction for Bridge Site Area Based on Wavelet Denoising OOA-Transformer. Mathematics 2024, 12, 1910. https://doi.org/10.3390/math12121910