Hybrid Wavelet–Transformer–XGBoost Model Optimized by Chaotic Billiards for Global Irradiance Forecasting
Abstract
1. Introduction
1.1. Background and Challenges
- Strong temporal variability and seasonality;
- Rapid short-term fluctuations caused by cloud cover;
- Multi-scale irradiance dynamics ranging from seconds to hours;
- Spatial heterogeneity across regions and driving trajectories.
- Multi-scale signal decomposition to capture diverse temporal patterns;
- Attention-based learning to model long-range dependencies;
- Ensemble refinement to mitigate residual non-linear errors;
- Advanced optimization strategies to efficiently tune hyperparameters.
1.2. Main Contributions
- A hierarchical hybrid forecasting framework, termed WTX–CBO, which integrates Discrete Wavelet Transform (DWT), an encoder–decoder Transformer, and XGBoost in a sequential residual learning architecture rather than a parallel or stacked combination.
- A frequency-aware attention mechanism is introduced by feeding multi-resolution wavelet subbands directly into the Transformer, enabling effective modeling of long-range temporal dependencies under highly non-stationary and rapidly fluctuating solar irradiance conditions.
- A two-stage temporal learning strategy in which the Transformer captures global and multi-scale dynamics, while XGBoost explicitly learns structured residual errors that persist after attention-based forecasting, improving generalization across diverse climatic regimes.
- A joint global optimization scheme based on the Chaotic Billiards Optimizer (CBO) is developed to simultaneously tune the hyperparameters of both Transformer and XGBoost components within a unified search space, mitigating premature convergence in high-dimensional hybrid models.
- The proposed framework is computationally efficient and deployment-oriented, achieving fast convergence and low-latency inference suitable for real-time decision-support in solar-powered and embedded energy systems, including solar electric vehicle applications.
- Extensive experiments conducted on real-world datasets from multiple climate regions demonstrate that WTX–CBO consistently outperforms state-of-the-art deep learning and hybrid forecasting models in terms of accuracy, robustness, and scalability.
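The sequential residual architecture named in the contributions above (a global predictor followed by a learner that fits its structured residuals) can be illustrated with a deliberately simplified stand-in. This is not the authors' implementation: least-squares models replace the Transformer and XGBoost, and all names and the toy series are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_windows(series, T):
    """Slice a 1-D series into look-back windows X and targets y."""
    X = np.stack([series[i:i + T] for i in range(len(series) - T)])
    y = series[T:]
    return X, y

# Toy irradiance-like signal: two seasonal components plus noise.
t = np.arange(600)
series = (np.sin(2 * np.pi * t / 50)
          + 0.3 * np.sin(2 * np.pi * t / 7)
          + 0.05 * rng.normal(size=t.size))
X, y = make_windows(series, T=10)

# Stage 1: linear autoregressive predictor (stand-in for the Transformer).
w1, *_ = np.linalg.lstsq(X, y, rcond=None)
stage1 = X @ w1
resid = y - stage1  # structured errors left over from stage 1

# Stage 2: a second regressor fits the residuals on augmented features
# (stand-in for the XGBoost refinement layer).
Z = np.hstack([X, X ** 2])
w2, *_ = np.linalg.lstsq(Z, resid, rcond=None)
final = stage1 + Z @ w2  # sequential residual correction

rmse1 = np.sqrt(np.mean(resid ** 2))
rmse2 = np.sqrt(np.mean((y - final) ** 2))
```

In-sample, the second stage can never increase the squared error (the zero correction is always feasible), which is the sense in which the residual hierarchy is "sequential" rather than a parallel ensemble.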
2. Proposed Methodology
- (i) Decomposition and Normalization: The input global horizontal irradiance time series $G(t)$ is decomposed into multi-scale subcomponents using the Discrete Wavelet Transform (DWT):
  $$G(t) = A_J(t) + \sum_{j=1}^{J} D_j(t),$$
  where $A_J(t)$ and $D_j(t)$ denote the approximation (low-frequency) and detail (high-frequency) coefficients at decomposition level $j$, respectively.
- (ii) Deep Forecasting: The decomposed subseries are embedded and passed through the Transformer encoder–decoder:
  $$\hat{G}_{\mathrm{Tr}}(t) = f_{\mathrm{Trans}}\big(A_J, D_1, \ldots, D_J;\ \theta_T\big),$$
  where $f_{\mathrm{Trans}}$ represents the multi-head attention and feed-forward sequence learning function.
- (iii) Refinement and Optimization: The Transformer's output is refined by XGBoost regression:
  $$\hat{G}(t) = \hat{G}_{\mathrm{Tr}}(t) + f_{\mathrm{XGB}}\big(\mathbf{x}_t;\ \theta_X\big),$$
  while the CBO algorithm adaptively tunes the hyperparameters $\theta_T$, $\theta_X$ to minimize the objective
  $$\min_{\theta_T,\ \theta_X} L\big(G, \hat{G};\ \theta_T, \theta_X\big).$$
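Step (i) can be sketched concretely. The paper does not specify the mother wavelet at this point, so the sketch below uses a hand-rolled Haar DWT for self-containment; a practical pipeline would typically call a library such as PyWavelets instead.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT.

    Returns approximation (low-frequency) and detail (high-frequency)
    coefficients; len(x) must be even.
    """
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation A
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail D
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt: perfect reconstruction of the input."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def wavedec(x, level):
    """Multi-level decomposition: returns (A_J, [D_1, ..., D_J])."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(level):
        a, d = haar_dwt(a)
        details.append(d)
    return a, details

# Decompose a toy irradiance-like series into A_2 and details D_1, D_2.
g = np.array([0.0, 0.1, 0.4, 0.9, 1.0, 0.8, 0.3, 0.1])
a2, (d1, d2) = wavedec(g, level=2)
```

Because the Haar transform is orthonormal, energy is preserved across subbands, which is what makes the subband-wise error analysis in later sections meaningful.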
2.1. Datasets Description
2.2. Data Preprocessing and Decomposition Module
2.2.1. Data Cleaning and Normalization
2.2.2. Feature Selection
2.2.3. Wavelet Decomposition
2.2.4. Input Structuring for Deep Learning
2.3. Forecasting Module
2.3.1. Encoder–Decoder Transformer
2.3.2. XGBoost Refinement Layer
2.3.3. Output Reconstruction
2.4. Optimization Module
| Algorithm 1 WTX–CBO: Wavelet–Transformer with XGBoost optimized by Chaotic Billiards (CBO) |
|---|
2.4.1. Motivation for Metaheuristic Optimization
2.4.2. Mathematical Formulation
2.4.3. Integration with the Hybrid Framework
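The CBO update equations themselves are not reproduced in this outline. As a hedged illustration of the ingredient these subsections rely on, the sketch below replaces pseudo-random draws with a deterministic logistic-map chaotic sequence inside a simple global/local search loop; it is a generic chaotic-search sketch, not the billiard-dynamics update of the actual CBO, and the function names are illustrative.

```python
import numpy as np

def logistic_map(z: float) -> float:
    """Chaotic logistic map z -> 4 z (1 - z), dense and non-repeating in (0, 1)."""
    return 4.0 * z * (1.0 - z)

def chaotic_search(f, lo, hi, iters=300, z=0.7):
    """Minimize f on [lo, hi] using chaotic global and local candidates.

    Each iteration draws one exploration candidate over the full range
    and one exploitation candidate near the incumbent; the chaotic
    sequence plays the role randomness plays in GA/PSO, which is the
    mechanism credited with avoiding premature convergence.
    """
    best = 0.5 * (lo + hi)
    f_best = f(best)
    for _ in range(iters):
        z = logistic_map(z)
        cand_global = lo + z * (hi - lo)                  # exploration
        z = logistic_map(z)
        cand_local = best + (z - 0.5) * 0.1 * (hi - lo)   # exploitation
        for c in (cand_global, cand_local):
            fc = f(c)
            if fc < f_best:
                best, f_best = c, fc
    return best, f_best

# Toy objective standing in for the validation loss L(theta).
best, loss = chaotic_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

In the hybrid framework, `f` would be the validation error of the full WTX pipeline evaluated at a candidate hyperparameter vector, making each objective call expensive; this is why fast convergence (Section 3.5) matters.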
2.5. Metrics and Hyperparameter Optimization
2.5.1. Evaluation Metrics
Mean Absolute Error (MAE) [22]
Mean Squared Error (MSE) [22]
Root Mean Squared Error (RMSE) [22]
Coefficient of Determination (R2) [22]
Mean Absolute Percentage Error (MAPE) [22]
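The five metrics listed above have standard definitions, implemented compactly below. Note that MAPE is undefined where the target is zero, so for GHI it is normally computed over daytime samples only.

```python
import numpy as np

def evaluate(y_true, y_pred):
    """MAE, MSE, RMSE, R2, and MAPE for a forecast series."""
    y = np.asarray(y_true, dtype=float)
    yhat = np.asarray(y_pred, dtype=float)
    err = y - yhat
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    # R2: variance explained relative to the mean predictor.
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y - y.mean()) ** 2)
    # MAPE: undefined at y == 0 (e.g., nighttime GHI); filter beforehand.
    mape = 100.0 * np.mean(np.abs(err / y))
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "R2": r2, "MAPE": mape}
```

For example, `evaluate([2.0, 4.0], [1.0, 5.0])` yields MAE = 1.0, RMSE = 1.0, MAPE = 37.5%, and R2 = 0.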
2.5.2. Hyperparameter Optimization
Stage 1: Empirical Grid Search
Stage 2: Fine-Tuning via CBO
3. Experimental Results and Analysis
3.1. Computational Environment
3.2. Evaluation of the Proposed Model
3.3. Decomposition Performance
3.4. Forecasting Accuracy
3.5. Optimization Efficiency
3.6. Ablation and Component Analysis
3.7. Computational Scalability
3.8. Statistical Validation
3.9. Visualization and Interpretation
3.10. Physical Interpretation and Error Regime Analysis
3.11. Summary
Implications for Solar Electric Vehicle Applications
4. Conclusions and Future Work
Future Work
- Integration with vehicle dynamics: Future versions will couple irradiance prediction with battery state of health (SOH) and vehicle motion models to enable holistic SEV energy optimization.
- Uncertainty quantification: Incorporating probabilistic inference or Monte Carlo dropout could provide confidence intervals for the predicted irradiance, enhancing reliability under abrupt weather transitions.
- Edge deployment and real-time adaptation: Model compression, pruning, and knowledge distillation will be explored to deploy WTX–CBO on low-power embedded systems within SEV control units.
- Multi-modal data fusion: Combining satellite imagery, sky cameras, and ground-based meteorological data can further improve spatial generalization and context awareness.
- Meta-optimization: Extending the Chaotic Billiards Optimizer toward multi-objective optimization will allow simultaneous tuning for accuracy, complexity, and energy efficiency.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Nomenclature
| Abbreviations | |
| AI | Artificial Intelligence |
| AM | Attention Mechanism |
| ANN | Artificial Neural Network |
| BSA | Backtracking Search Algorithm |
| BiLSTM | Bidirectional Long Short-Term Memory |
| CBO | Chaotic Billiards Optimizer |
| CNN | Convolutional Neural Network |
| CWS | CNN–Wavelet–SVR hybrid model |
| DWT | Discrete Wavelet Transform |
| DL | Deep Learning |
| DM Test | Diebold–Mariano Test |
| EV | Electric Vehicle |
| FCM | Fuzzy C-Means |
| GA | Genetic Algorithm |
| GHI | Global Horizontal Irradiance |
| GRU | Gated Recurrent Unit |
| KNN | K-Nearest Neighbors |
| LSTM | Long Short-Term Memory |
| MAE | Mean Absolute Error |
| MABE | Mean Absolute Bias Error |
| MAPE | Mean Absolute Percentage Error |
| MBE | Mean Bias Error |
| MLPNN | Multilayer Perceptron Neural Network |
| MR-ESN | Multi-Reservoir Echo State Network |
| NREL | National Renewable Energy Laboratory |
| NSRDB | National Solar Radiation Database |
| PV | Photovoltaic |
| PVGIS | Photovoltaic Geographical Information System |
| PSO | Particle Swarm Optimization |
| rRMSE | Relative Root Mean Square Error |
| RMSE | Root Mean Square Error |
| R2 | Coefficient of Determination |
| SEV | Solar Electric Vehicle |
| SVM/SVR | Support Vector Machine/Support Vector Regression |
| SoC | State of Charge (Battery) |
| V2G | Vehicle-to-Grid |
| VMD | Variational Mode Decomposition |
| WNN | Wavelet Neural Network |
| WT | Wavelet Transform |
| WTX–CBO | Wavelet Transformer XGBoost optimized by CBO |
| XGB/XGBoost | Extreme Gradient Boosting |
| Symbols | |
| $G(t)$ | Global horizontal irradiance at time t (W/m²) |
| $\hat{G}(t)$ | Predicted global horizontal irradiance |
| $A_j(t)$ | Approximation coefficients of irradiance (wavelet low-frequency component) |
| $D_j(t)$ | Detail coefficients of irradiance at level j (high-frequency components) |
| $\mathbf{x}_t$ | Input feature vector |
| T | Look-back window size |
| H | Forecast horizon |
| $\theta$ | Set of trainable and meta-parameters |
| L | Loss function |
| | Damping coefficient of the Chaotic Billiards Optimizer (CBO) |
| | Chaotic random factors in CBO |
References
- Oluwalana, O.J.; Grzesik, K. Solar-powered electric vehicles: Comprehensive review of technology advancements, challenges, and future prospects. Energies 2025, 18, 3650.
- Li, T.T.; Zhao, A.P.; Wang, Y.; Li, S.; Fei, J.; Wang, Z.; Xiang, Y. Integrating solar-powered electric vehicles into sustainable energy systems. Nat. Rev. Electr. Eng. 2025, 2, 467–479.
- Colucci, R.; Mahgoub, I. Generalizable Solar Irradiance Prediction for Battery Operation Optimization in IoT-Based Microgrid Environments. J. Sens. Actuator Netw. 2025, 14, 3.
- Akinyoola, J.A.; Oluleye, A.; Gbode, I.E. A review of atmospheric aerosol impacts on regional extreme weather and climate events. Aerosol Sci. Eng. 2024, 8, 249–274.
- Isman Okieh, O.; Seker, S.; Gokce, S.; Dennenmoser, M. An Enhanced Forecasting Method of Daily Solar Irradiance in Southwestern France: A Hybrid Nonlinear Autoregressive with Exogenous Inputs with Long Short-Term Memory Approach. Energies 2024, 17, 3965.
- Zhou, B.; Chen, X.; Li, G.; Gu, P.; Huang, J.; Yang, B. XGBoost–SFS and Double Nested Stacking Ensemble Model for Photovoltaic Power Forecasting under Variable Weather Conditions. Sustainability 2023, 15, 13146.
- Sansine, V.; Ortega, P.; Hissel, D.; Ferrucci, F. Hybrid Deep Learning Model for Mean Hourly Irradiance Probabilistic Forecasting. Atmosphere 2023, 14, 1192.
- Al-Ali, E.M.; Hajji, Y.; Said, Y.; Hleili, M.; Alanzi, A.M.; Laatar, A.H.; Atri, M. Solar Energy Production Forecasting Based on a Hybrid CNN-LSTM-Transformer Model. Mathematics 2023, 11, 676.
- Tercha, W.; Tadjer, S.A.; Chekired, F.; Canale, L. Machine Learning-Based Forecasting of Temperature and Solar Irradiance for Photovoltaic Systems. Energies 2024, 17, 1124.
- Lopes, F.M.; Silva, H.G.; Salgado, R.; Cavaco, A.; Canhoto, P.; Collares-Pereira, M. Short-term forecasts of GHI and DNI for solar energy systems operation: Assessment of the ECMWF integrated forecasting system in southern Portugal. Sol. Energy 2018, 170, 14–30.
- Pereira, S.; Canhoto, P.; Salgado, R.; Costa, M.J. Development of an ANN based corrective algorithm of the operational ECMWF global horizontal irradiation forecasts. Sol. Energy 2019, 185, 387–405.
- Ahmed, U.; Khan, A.R.; Mahmood, A.; Rafiq, I.; Ghannam, R.; Zoha, A. Short-term global horizontal irradiance forecasting using weather classified categorical boosting. Appl. Soft Comput. 2024, 155, 111441.
- Yadav, A.K.; Malik, H.; Chandel, S. Selection of most relevant input parameters using WEKA for artificial neural network based solar radiation prediction models. Renew. Sustain. Energy Rev. 2014, 31, 509–519.
- Wang, L.; Kisi, O.; Zounemat-Kermani, M.; Salazar, G.A.; Zhu, Z.; Gong, W. Solar radiation prediction using different techniques: Model evaluation and comparison. Renew. Sustain. Energy Rev. 2016, 61, 384–397.
- Olatomiwa, L.; Mekhilef, S.; Shamshirband, S.; Mohammadi, K.; Petković, D.; Sudheer, C. A support vector machine–firefly algorithm-based model for global solar radiation prediction. Sol. Energy 2015, 115, 632–644.
- Gupta, P.; Singh, R. Combining simple and less time complex ML models with multivariate empirical mode decomposition to obtain accurate GHI forecast. Energy 2023, 263, 125844.
- Ghimire, S.; Deo, R.C.; Raj, N.; Mi, J. Deep solar radiation forecasting with convolutional neural network and long short-term memory network algorithms. Appl. Energy 2019, 253, 113541.
- Cannizzaro, D.; Aliberti, A.; Bottaccioli, L.; Macii, E.; Acquaviva, A.; Patti, E. Solar radiation forecasting based on convolutional neural network and ensemble learning. Expert Syst. Appl. 2021, 181, 115167.
- Mchara, W.; Khalfa, M.A.; Manai, L. Advanced health state intelligent diagnosis of lithium-ion batteries based on CNN-WNN-WBiLSTM model with attention mechanism. Automatika 2025, 66, 154–173.
- Tomar, V.; Bansal, M.; Singh, P. Metaheuristic algorithms for optimization: A brief review. Eng. Proc. 2024, 59, 238.
- NASA Power. Prediction of Worldwide Energy Resource (POWER) Project. 2019. Available online: https://power.larc.nasa.gov (accessed on 12 December 2024).
- Mchara, W.; Manai, L.; Khalfa, M.A.; Raissi, M. Intelligent Health State Diagnosis of Lithium-Ion Batteries for Electric Vehicles Using Wavelet-Enhanced Hybrid Deep Learning Integrated with an Attention Mechanism. Clean Energy 2025, 6, zkaf019.
- Khalfa, M.A.; Manai, L.; Mchara, W. Advanced artificial intelligence model for solar irradiance forecasting for solar electric vehicles. Int. J. Dyn. Control 2025, 13, 101.
- Ondo Ekogha, E.; Owolawi, P.A. Comparative Analysis of Supervised Learning Techniques for Forecasting PV Current in South Africa. Forecasting 2024, 7, 1.
- Albayram, M.; Yılmaz, A.; Bayrak, G.; Basaran, K.; Georgeta Popescu, L. Effectiveness of un-decimated wavelet transform in time-series forecasting: A PV power calculation case study in BTU. Renew. Energy 2026, 256, 124062.
- Berrezzek, F.; Khelil, K.; Bouadjila, T. Efficient wind speed forecasting using discrete wavelet transform and artificial neural networks. Rev. Intell. Artif. 2019, 33, 447–452.
- Niccolai, A.; Orooji, S.; Matteri, A.; Ogliari, E.; Leva, S. Irradiance nowcasting by means of deep-learning analysis of infrared images. Forecasting 2022, 4, 338–348.
- Legrene, I.; Wong, T.; Dessaint, L.A. Enhancing neural architecture search using transfer learning and dynamic search spaces for global horizontal irradiance prediction. Forecasting 2025, 7, 43.
- Ndong, J.; Soubdhan, T. Extracting Statistical Properties of Solar and Photovoltaic Power Production for the Scope of Building a Sophisticated Forecasting Framework. Forecasting 2022, 5, 1–21.
- Lee, J.H.; Okwuosa, C.N.; Shin, B.C.; Hur, J.W. A Spectral-Based Blade Fault Detection in Shot Blast Machines with XGBoost and Feature Importance. J. Sens. Actuator Netw. 2024, 13, 64.
- Benavides-Cesar, L.; Manso-Callejo, M.Á.; Cira, C.I. Methodology Based on BERT (Bidirectional Encoder Representations from Transformers) to Improve Solar Irradiance Prediction of Deep Learning Models Trained with Time Series of Spatiotemporal Meteorological Information. Forecasting 2025, 7, 5.
- Elnaghi, B.E.; Abelwhab, M.; Ismaiel, A.M.; Mohammed, R.H. Solar hydrogen variable speed control of induction motor based on chaotic billiards optimization technique. Energies 2023, 16, 1110.






| Model | Main Advantages | Main Limitations |
|---|---|---|
| CNN–LSTM | Captures local feature patterns and short-term temporal dependencies; widely adopted and computationally efficient | Limited ability to model long-range dependencies; prone to smoothing effects during abrupt irradiance transitions |
| VMD–CNN | Effective decomposition of non-stationary signals; improved noise robustness | Lacks explicit temporal dependency modeling; performance sensitive to decomposition parameters |
| WT–LSTM | Multi-scale signal representation; improved handling of non-stationarity | Sequential modeling may struggle with long-term dependencies and residual nonlinear errors |
| Transformer-based models | Strong capability for long-range temporal dependency modeling | Performance degrades under high-frequency fluctuations without explicit multi-scale preprocessing |
| Hybrid DL + ensemble models | Improved generalization through error correction | Often rely on parallel stacking without explicit residual hierarchy or coordinated optimization |
| Parameter | Tested Range | Selected | Role/Impact on Performance |
|---|---|---|---|
| Learning rate | {, , } | | Controls convergence speed and training stability |
| Dropout rate | {0.1, 0.2, 0.3} | 0.2 | Reduces overfitting and improves generalization |
| Feed-forward dimension | {128, 256, 512} | 256 | Determines nonlinear feature transformation capacity |
| Model dimension | {64, 128, 256} | 128 | Influences representation richness and temporal modeling ability |
| Attention heads | {2, 4, 8} | 4 | Enables multi-head attention to capture diverse temporal patterns |
| Encoder layers | {2, 3, 4} | 4 | Controls model depth and long-range dependency learning |
| Parameter | Tested Range | Selected | Role/Impact on Performance |
|---|---|---|---|
| Subsample ratio | {0.6, 0.8, 1.0} | 0.8 | Introduces stochasticity to improve generalization and reduce overfitting |
| Learning rate | {0.01, 0.05, 0.1} | 0.05 | Controls the contribution of each tree and convergence behavior |
| Maximum depth | {3, 5, 7} | 5 | Regulates model complexity and bias–variance trade-off |
| Number of trees | {100, 200, 300} | 200 | Determines ensemble capacity and residual learning ability |
| Parameter | Value |
|---|---|
| Cue movement coefficient | 0.9 |
| Acceleration coefficient (a) | 1.0 |
| Maximum iterations | 100 |
| Population size | 30 |
| Termination criterion | Convergence or max iterations |
| Decomposition Method | MAE | RMSE | MAPE (%) | R2 | Time (s) | Observation |
|---|---|---|---|---|---|---|
| No Decomposition (Raw) | 0.0286 | 0.0391 | 2.84 | 0.973 | 118 | Baseline without multi-scale learning |
| EMD-Based Hybrid | 0.0238 | 0.0349 | 2.41 | 0.978 | 142 | Captures non-stationary features |
| VMD-Based Hybrid | 0.0212 | 0.0318 | 2.10 | 0.981 | 166 | Smooth frequency separation |
| WT-Based Hybrid (Proposed) | 0.0189 | 0.0283 | 1.87 | 0.987 | 132 | Superior time–frequency localization |
| Dataset | Model | MAE | MSE | RMSE | MAPE (%) | R2 |
|---|---|---|---|---|---|---|
| Switzerland | LSTM | 0.0341 | 0.0021 | 0.0459 | 4.52 | 0.961 |
| | BiLSTM | 0.0314 | 0.0018 | 0.0424 | 3.96 | 0.968 |
| | CNN–LSTM | 0.0286 | 0.0015 | 0.0387 | 3.42 | 0.972 |
| | Transformer | 0.0259 | 0.0012 | 0.0346 | 2.85 | 0.976 |
| | XGBoost | 0.0248 | 0.0011 | 0.0331 | 2.62 | 0.978 |
| | WTX–CBO (Proposed) | 0.0189 | 0.0008 | 0.0283 | 1.87 | 0.987 |
| Italy | LSTM | 0.0356 | 0.0023 | 0.0481 | 4.88 | 0.958 |
| | BiLSTM | 0.0328 | 0.0019 | 0.0436 | 4.05 | 0.967 |
| | CNN–LSTM | 0.0303 | 0.0016 | 0.0402 | 3.54 | 0.972 |
| | Transformer | 0.0272 | 0.0013 | 0.0363 | 2.91 | 0.975 |
| | XGBoost | 0.0261 | 0.0011 | 0.0339 | 2.68 | 0.978 |
| | WTX–CBO (Proposed) | 0.0195 | 0.0009 | 0.0291 | 1.79 | 0.986 |
| Austria | LSTM | 0.0338 | 0.0020 | 0.0448 | 4.42 | 0.963 |
| | BiLSTM | 0.0309 | 0.0017 | 0.0413 | 3.85 | 0.969 |
| | CNN–LSTM | 0.0284 | 0.0014 | 0.0374 | 3.33 | 0.974 |
| | Transformer | 0.0256 | 0.0012 | 0.0342 | 2.74 | 0.978 |
| | XGBoost | 0.0243 | 0.0010 | 0.0318 | 2.53 | 0.981 |
| | WTX–CBO (Proposed) | 0.0181 | 0.0007 | 0.0269 | 1.72 | 0.989 |
| Germany | LSTM | 0.0367 | 0.0024 | 0.0490 | 4.97 | 0.958 |
| | BiLSTM | 0.0332 | 0.0019 | 0.0437 | 4.09 | 0.967 |
| | CNN–LSTM | 0.0308 | 0.0016 | 0.0398 | 3.51 | 0.973 |
| | Transformer | 0.0274 | 0.0013 | 0.0364 | 2.96 | 0.977 |
| | XGBoost | 0.0260 | 0.0011 | 0.0336 | 2.71 | 0.981 |
| | WTX–CBO (Proposed) | 0.0191 | 0.0008 | 0.0282 | 1.84 | 0.987 |
| Model | Architecture | MAE | RMSE | MAPE (%) | R2 | Reference |
|---|---|---|---|---|---|---|
| CNN–LSTM | Deep hybrid RNN | 0.0287 | 0.0387 | 3.42 | 0.972 | [5] |
| FCM–CNN–WNN–AM | Multi-resolution attention | 0.0234 | 0.0329 | 2.46 | 0.981 | [9] |
| EWT–BiLSTM | Empirical wavelet-based DL | 0.0225 | 0.0314 | 2.29 | 0.982 | [12] |
| CWS (CNN–WT–SVR) | Hybrid regression-based | 0.0210 | 0.0302 | 2.10 | 0.984 | [10] |
| WTX–CBO (Proposed) | Wavelet–Transformer–XGBoost | 0.0189 | 0.0283 | 1.87 | 0.987 | This study |
| Optimizer | MAE | RMSE | R2 | Convergence Epochs | Stability | Observation |
|---|---|---|---|---|---|---|
| Adam | 0.0235 | 0.0338 | 0.978 | 87 | Stable | Local optimizer; slower convergence |
| GA | 0.0227 | 0.0325 | 0.980 | 72 | Moderate | Prone to premature convergence |
| PSO | 0.0218 | 0.0312 | 0.982 | 65 | High | Balanced exploration/exploitation |
| CBO (Proposed) | 0.0189 | 0.0283 | 0.987 | 59 | Very High | Fast and stable chaotic convergence |
| Configuration | MAE | RMSE | MAPE (%) | R2 |
|---|---|---|---|---|
| Without WT Decomposition | 0.0226 | 0.0335 | 2.32 | 0.978 |
| Without Transformer (WT–XGBoost) | 0.0218 | 0.0321 | 2.21 | 0.981 |
| Without XGBoost (WT–Transformer) | 0.0207 | 0.0310 | 2.05 | 0.983 |
| Without CBO Optimization | 0.0199 | 0.0298 | 1.96 | 0.985 |
| Full Model (WTX–CBO) | 0.0189 | 0.0283 | 1.87 | 0.987 |
| Model | Epochs to Converge | Training Time (s) | Validation R2 | Observation |
|---|---|---|---|---|
| LSTM | 120 | 415 | 0.961 | Gradual convergence |
| CNN–LSTM | 105 | 372 | 0.972 | Improved feature extraction |
| Transformer | 98 | 350 | 0.976 | Strong long-term memory |
| XGBoost | 89 | 312 | 0.978 | Fast but less generalizable |
| WTX–CBO (Proposed) | 78 | 312 | 0.987 | Fastest and most stable convergence |
| Metric | Value | Observation |
|---|---|---|
| Training Time (epochs = 100) | 312 s | 25% faster than Transformer |
| Inference Time/Window | 0.038 s | Suitable for real-time SEV use |
| Memory Usage | 1.2 GB | Lightweight hybrid design |
| Model Parameters | 2.14 M | Moderate complexity |
| Hardware Used | Ryzen 9/A100 GPU | 128 GB RAM |
| Model | Latency (s) | Memory (GB) | Observation |
|---|---|---|---|
| LSTM | 0.056 | 1.35 | Recurrent overhead |
| CNN–LSTM | 0.049 | 1.42 | Convolutional operations cost |
| Transformer | 0.045 | 1.28 | Attention complexity grows quadratically with window length |
| XGBoost | 0.041 | 1.10 | Lightweight tree ensemble |
| WTX–CBO (Proposed) | 0.038 | 1.20 | Optimized for real-time SEV deployment |
| Comparison Model | p-Value | Significance () |
|---|---|---|
| WTX–CBO vs. LSTM | 0.0042 | Significant |
| WTX–CBO vs. BiLSTM | 0.0061 | Significant |
| WTX–CBO vs. CNN–LSTM | 0.0095 | Significant |
| WTX–CBO vs. Transformer | 0.0087 | Significant |
| WTX–CBO vs. XGBoost | 0.0074 | Significant |
| Comparison Model | DM Statistic | p-Value |
|---|---|---|
| WTX–CBO vs. LSTM | 2.846 | 0.0045 |
| WTX–CBO vs. BiLSTM | 2.513 | 0.0084 |
| WTX–CBO vs. CNN–LSTM | 2.297 | 0.0121 |
| WTX–CBO vs. Transformer | 2.181 | 0.0163 |
| WTX–CBO vs. XGBoost | 2.057 | 0.0198 |
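The DM statistics in the table above compare forecast accuracy pairwise. As a hedged sketch of the test under squared-error loss and a one-step horizon (so no autocovariance correction is needed), with illustrative names:

```python
import math
import numpy as np

def diebold_mariano(e1, e2):
    """Diebold-Mariano test under squared-error loss, horizon h = 1.

    e1, e2: forecast-error series of two competing models on the same
    test set. Returns (DM statistic, two-sided normal p-value); DM > 0
    means model 1 has the larger loss. Assumes the loss differential
    has nonzero variance.
    """
    d = np.asarray(e1, float) ** 2 - np.asarray(e2, float) ** 2
    n = len(d)
    dm = d.mean() / math.sqrt(d.var(ddof=0) / n)
    p = math.erfc(abs(dm) / math.sqrt(2.0))  # 2 * (1 - Phi(|DM|))
    return dm, p

# Example: model 2's errors are half of model 1's -> significant DM.
rng = np.random.default_rng(0)
e2 = rng.normal(0.0, 1.0, size=200)
e1 = 2.0 * e2
dm, p = diebold_mariano(e1, e2)
```

For multi-step horizons the denominator would use a Newey-West long-run variance with h - 1 autocovariance lags; the simple form above matches the one-step setting reported here.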
| Feature | Gain Importance (%) |
|---|---|
| Frost Point | 1.0 |
| Precipitation Rate | 2.7 |
| Dew Point Temperature | 4.1 |
| Wind Direction | 5.9 |
| Surface Pressure | 7.2 |
| Precipitable Water | 8.4 |
| Wind Speed | 9.7 |
| Relative Humidity | 13.5 |
| Air Temperature | 18.9 |
| Global Horizontal Irradiance (GHI) | 28.6 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Mchara, W.; Cicceri, G.; Manai, L.; Raissi, M.; Albaqami, H. Hybrid Wavelet–Transformer–XGBoost Model Optimized by Chaotic Billiards for Global Irradiance Forecasting. J. Sens. Actuator Netw. 2026, 15, 12. https://doi.org/10.3390/jsan15010012

