Proceeding Paper

A Comprehensive Benchmarking of Evolutionary, Swarm-Intelligence, and Surrogate-Assisted Optimization for Residual Demand Forecasting in South African Microgrids †

by Pfano Nemakonde 1,*, Fhulufhelo Nemangwele 1, Mukovhe Ratshitanga 2,3 and Komla Agbenyo Folly 2

1 Department of Physics, Faculty of Science, Engineering and Agriculture, University of Venda, Thohoyandou 0950, South Africa
2 Department of Electrical Engineering, University of Cape Town, Rondebosch 7701, South Africa
3 Department of Electrical, Electronic and Computer Engineering, Cape Peninsula University of Technology, Bellville 7535, South Africa
* Author to whom correspondence should be addressed.
Presented at the 34th Southern African Universities Power Engineering Conference (SAUPEC 2026), Durban, South Africa, 30 June–1 July 2026.
Eng. Proc. 2026, 140(1), 17; https://doi.org/10.3390/engproc2026140017
Published: 14 May 2026

Abstract

Accurate residual demand forecasting (RDF) is essential for stable peer-to-peer energy trading in developing economies. This study benchmarks three hyperparameter optimization paradigms, HEBO, PSO, and GP-BO, applied to XGBoost (2.1.4) forecasting using seven-fold TimeSeriesSplit validation on South African hourly grid data. Results demonstrate a fundamental trade-off between accuracy and efficiency: PSO achieves superior accuracy (0.47% MAPE) at the cost of substantial computation (23.4 h), while GP-BO offers revolutionary speed (19 min) with acceptable accuracy trade-offs. HEBO provides balanced performance with stable convergence. Crucially, we identify a “data–optimizer coupling” effect where optimal scaling methods are algorithm-dependent. These findings provide context-specific deployment strategies for microgrid operators addressing energy trilemma challenges.

1. Introduction

The global energy sector’s transformation through decentralization, digitalization, and decarbonization poses complex challenges for grid stability [1]. This “Energy Trilemma”—balancing energy security, equity, and sustainability—is particularly acute in developing economies with ageing infrastructure [2]. South Africa exemplifies these challenges, facing chronic load-shedding while pursuing ambitious renewable targets under its Just Energy Transition framework [3].
Peer-to-peer energy trading enables direct renewable exchange but requires accurate residual demand forecasting (total load minus renewable generation) for market stability, enabling dynamic pricing and grid balancing [4,5]. Renewable generation’s stochastic nature and complex consumption patterns make this particularly challenging in microgrids [6].
While machine learning approaches like XGBoost show promise for energy forecasting [7], their effectiveness depends critically on hyperparameter optimization. Traditional methods become computationally prohibitive in high-dimensional spaces [8], necessitating intelligent strategies. The current literature lacks comprehensive HPO benchmarking for microgrid forecasting in resource-constrained environments [9]. This study addresses these gaps through rigorous empirical benchmarking of three HPO paradigms: Heteroscedastic Evolutionary Bayesian Optimization (HEBO) [10], Particle Swarm Optimization (PSO) [11], and Surrogate-Assisted Gaussian Process Bayesian Optimization (GP-BO) [12].
Contributions include advanced feature engineering, rigorous temporal validation, data–optimizer coupling analysis, computational efficiency benchmarking, and context-specific deployment guidelines.

2. Literature Review

2.1. Machine Learning in Energy Forecasting

Energy forecasting has progressed from statistical to machine-learning approaches. Traditional methods like ARIMA assume stationarity and linearity [13], assumptions frequently violated by renewable integration [14]. Modern ML captures complex non-linear relationships with tree-based ensembles like Extreme Gradient Boosting (XGBoost) [7] and Light Gradient Boosting Machine (LightGBM) [15] architectures, proving particularly effective for energy data. Deep learning shows promise for sequence forecasting [16] but requires substantial resources, making it impractical for resource-constrained microgrids [17]. This positions gradient boosting as the preferred approach, though its performance depends heavily on hyperparameter configuration [18].

2.2. Hyperparameter Optimization Paradigms

Hyperparameter optimization critically determines automated ML performance. Population-based methods like PSO [11] navigate high-dimensional spaces but require substantial evaluations, making them computationally expensive [19]. Recent metaheuristic advances address these limitations. Chou and Nguyen [20] proposed the Age of Exploration-Inspired Optimizer (AEIO), integrating DBSCAN clustering to balance exploration and exploitation, outperforming 11 established optimizers on energy forecasting tasks, with lower MAPE and reduced computational burden.
Hybrid architectures combining multiple paradigms have also emerged. Chen et al. [21] introduced a Chaotic Quasi-Reverse Artificial Lemming Algorithm (CQALA) within a multi-scale convolutional Kolmogorov–Arnold network (MCKAN), achieving a 27.6–33.4% lower mean absolute error in wind and solar forecasting.
Bayesian Optimization (BO) offers sample efficiency through probabilistic surrogates [12], though standard GP-BO assumes homoscedastic noise and scales poorly [22]. PSO-optimized Bayesian LSTM frameworks address this, achieving 22.84% MAPE reduction for solar PV forecasting [23]. Despite these advances, hybrid and adaptive optimizers for microgrid residual demand forecasting—particularly in resource-constrained developing economies—remain critically underexplored [10,24].

2.3. Data-Centric Optimization Gaps

A persistent HPO limitation is treating data preprocessing as static [25]. The “No Free Lunch” theorems [26] imply that optimizer performance depends on alignment with objective function geometry—which is inherently shaped by data scaling. Recent work beyond energy corroborates this interdependence. Murtiningsih et al. [27] demonstrated that boosting algorithm efficacy for hypertension prediction was highly sensitive to feature selection, resampling, and tuning protocols. Similarly, the performance of the HPO method in heart failure prediction varied significantly across imputation techniques and validation strategies [28]. These findings reinforce treating preprocessing and optimization as a coupled system—a principle absent from most of the energy forecasting literature. This study addresses this gap by systematically investigating “data–optimizer coupling” for microgrid residual demand forecasting.
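The coupling investigation described above amounts to evaluating every scaler–optimizer pairing rather than fixing preprocessing up front. A minimal sketch of that experimental structure is shown below, using synthetic data and a Ridge regressor as a lightweight stand-in for the paper's XGBoost model (both stand-ins are assumptions for illustration only):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import TimeSeriesSplit, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler, RobustScaler, StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 8))                      # synthetic features
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)

# the four scaling regimes benchmarked in the study
scalers = {
    "Raw": None,
    "MinMax": MinMaxScaler(),
    "Standard": StandardScaler(),
    "Robust": RobustScaler(),
}

cv = TimeSeriesSplit(n_splits=5)
scores = {}
for name, scaler in scalers.items():
    # preprocessing lives inside the pipeline so each CV fold is scaled
    # on its own training window only (no temporal leakage)
    model = Ridge() if scaler is None else make_pipeline(scaler, Ridge())
    scores[name] = cross_val_score(
        model, X, y, cv=cv, scoring="neg_mean_absolute_error"
    ).mean()

for name, s in scores.items():
    print(f"{name:>8}: MAE = {-s:.4f}")
```

In a full coupling study, the inner model would itself be tuned by each optimizer for each scaler, so the score table becomes a scaler × optimizer grid rather than a single column.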

3. Methodology

This study uses hourly residual demand data from Eskom (2018–2025) to benchmark hyperparameter optimization methods for microgrid forecasting. Residual demand, calculated as total grid load minus renewable generation, supports P2P energy trading. An advanced feature engineering pipeline generates 108 high-value predictors capturing seasonality, trends, and system state signals. XGBoost is used with a 12-dimensional hyperparameter space, optimized via HEBO, PSO, and GP-BO. Rigorous 7-fold TimeSeriesSplit cross-validation with an RMSLE objective evaluates performance across four scaling methods (Raw, MinMax, Standard, Robust). GPU acceleration (NVIDIA RTX A4500; NVIDIA Corporation, Santa Clara, CA, USA) and computational resource monitoring ensure efficient experimentation, enabling data–optimizer coupling effects to be investigated.
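The residual demand target and a few representative engineered predictors can be sketched as follows. The series here is synthetic and illustrative only (the study uses the Eskom data and 108 predictors; the specific features below are assumed examples of the kind described):

```python
import numpy as np
import pandas as pd

# synthetic hourly series standing in for the Eskom data (illustrative only)
idx = pd.date_range("2018-04-08", periods=24 * 14, freq="h")
df = pd.DataFrame({
    "total_load": 25000 + 3000 * np.sin(2 * np.pi * idx.hour / 24),
    "renewable_gen": 1500 + 1000 * np.clip(
        np.sin(2 * np.pi * (idx.hour - 6) / 24), 0, None),
}, index=idx)

# residual demand: total grid load minus renewable generation
df["residual_demand"] = df["total_load"] - df["renewable_gen"]

# a few representative engineered predictors (the paper generates 108)
df["hour"] = df.index.hour                                   # daily seasonality
df["dayofweek"] = df.index.dayofweek                         # weekly seasonality
df["lag_24h"] = df["residual_demand"].shift(24)              # same hour yesterday
df["roll_mean_168h"] = df["residual_demand"].rolling(168).mean()  # weekly trend

df = df.dropna()  # drop warm-up rows where lags/rollings are undefined
print(df[["residual_demand", "lag_24h", "roll_mean_168h"]].head())
```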

4. Assessment Metrics and Forecasting Results

4.1. Performance Evaluation Metrics

The models were evaluated using four metrics, namely Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), Root Mean Squared Logarithmic Error (RMSLE), and the Coefficient of Determination ($R^2$), to validate performance against benchmarks. These metrics are defined as follows:

$$\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|y_i - \hat{y}_i\right|$$

$$\mathrm{MAPE} = \frac{100\%}{N}\sum_{i=1}^{N}\left|\frac{y_i - \hat{y}_i}{y_i}\right|$$

$$\mathrm{RMSLE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\log(1 + y_i) - \log(1 + \hat{y}_i)\right)^2}$$

$$R^2 = 1 - \frac{\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{N}\left(y_i - \bar{y}\right)^2}$$

where $N$ is the number of samples, $y_i$ is the actual value, $\hat{y}_i$ is the predicted value, and $\bar{y}$ is the mean of the actual values.
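The four definitions above translate directly into vectorized NumPy, as in the following sketch (a minimal reference implementation, not the paper's code):

```python
import numpy as np

def mae(y, y_hat):
    # mean absolute error
    return np.mean(np.abs(y - y_hat))

def mape(y, y_hat):
    # mean absolute percentage error, in percent (y must be nonzero)
    return 100.0 * np.mean(np.abs((y - y_hat) / y))

def rmsle(y, y_hat):
    # root mean squared logarithmic error; log1p(x) == log(1 + x)
    return np.sqrt(np.mean((np.log1p(y) - np.log1p(y_hat)) ** 2))

def r2(y, y_hat):
    # coefficient of determination
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = np.array([100.0, 120.0, 90.0, 110.0])
y_pred = np.array([98.0, 123.0, 92.0, 108.0])
print(mae(y_true, y_pred))    # 2.25
print(r2(y_true, y_pred))     # 0.958
```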

4.2. Comprehensive Performance Benchmarking

Table 1 reveals a clear accuracy–efficiency trade-off. PSO achieves superior forecasting accuracy (0.47% MAPE), making it suitable for mission-critical applications where precision outweighs computational cost. HEBO provides comparable accuracy with slightly better computational characteristics. GP-BO demonstrates revolutionary optimization speed (19 min versus 20+ h for population-based methods), though with reduced accuracy (4.77% MAPE).
Figure 1 illustrates distinct convergence patterns: GP-BO (a) shows rapid initial improvement characteristic of sample-efficient surrogate methods, while PSO (b) and HEBO (c) exhibit a more gradual but ultimately superior convergence. HEBO’s stable trajectory (c) suggests robust handling of the complex, multi-modal search space.

4.3. Multi-Horizon Forecasting Analysis

Figure 2 demonstrates temporal consistency across forecasting horizons. PSO (b) and HEBO (c) maintain stable performance from 1-h to 7-day forecasts, indicating robust capture of temporal dependencies.
GP-BO (a) shows degradation at longer horizons, suggesting limitations in identifying hyperparameters that generalize across multiple timescales. This has significant implications for microgrid operations requiring both short-term dispatch and long-term planning.

4.4. Residual Error Characterization

Figure 3 reveals error distribution characteristics critical for P2P trading risk assessment. PSO (b) exhibits the tightest error concentration around zero (±0.02), minimizing extreme forecasting errors that could destabilize energy markets. HEBO (c) shows slightly wider but well-behaved distributions. GP-BO’s broader, multi-modal error distribution (a) explains its higher MAPE despite competitive R2 values, indicating consistent but larger-magnitude errors.

5. Case Study: Input Data and Setting

5.1. Input Data

The dataset spans 8 April 2018 to 22 September 2025, capturing critical transitions in South Africa’s energy policy and infrastructure. This temporal scope enables analysis of forecasting performance during both stable and crisis periods, providing robust insights for operational deployment.

5.2. Experimental Configuration

The model is implemented using Python 3.12.7 and CUDA 12.8.93, with libraries including scikit-optimize 0.9 and HEBO 0.3.0. The computational environment features an AMD Ryzen 9 5950X 16-core processor at 3.40 GHz (Advanced Micro Devices, Santa Clara, CA, USA), an NVIDIA RTX A4500 GPU with 20 GB of memory, 64 GB RAM, and a 64-bit OS on x86_64 architecture.
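The GP-BO loop that scikit-optimize wraps can be illustrated with a hand-rolled sketch: fit a Gaussian-process surrogate to the evaluations seen so far, then pick the next point by a lower-confidence-bound acquisition. Everything below is an assumption-laden toy (a 1-D objective standing in for the 12-dimensional CV loss surface, and sklearn's GaussianProcessRegressor rather than the study's actual optimizer stack):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):
    # toy 1-D stand-in for the hyperparameter -> CV-loss mapping
    return np.sin(3 * x) + 0.1 * x ** 2

# small random initial design, as BO typically starts with
X = rng.uniform(-3, 3, size=(5, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
candidates = np.linspace(-3, 3, 200).reshape(-1, 1)

for _ in range(15):
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # lower confidence bound: exploit low predicted loss, explore uncertainty
    lcb = mu - 1.96 * sigma
    x_next = candidates[np.argmin(lcb)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

best = X[np.argmin(y)].item()
print(f"best x ~ {best:.3f}, f(x) ~ {y.min():.3f}")
```

The sample efficiency visible here (20 evaluations total) is the mechanism behind GP-BO's 0.32 h runtime versus 372 evaluations for the population-based methods in Table 3.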

5.3. Validation Protocol

The seven-fold TimeSeriesSplit validation preserves temporal integrity: training periods use sequentially expanding windows, validation periods are fixed 3-month intervals, and test periods are the subsequent 1-month intervals. This approach mimics real-world deployment, where models are periodically retrained on expanding historical data.
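The expanding-window behavior of the protocol can be sketched with scikit-learn's TimeSeriesSplit (the synthetic hourly index below is an assumption; the paper's fixed validation lengths would be set via the `test_size` parameter):

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

# 3 years of hourly observations as a stand-in for the Eskom series
n_hours = 3 * 365 * 24
X = np.arange(n_hours).reshape(-1, 1)

tscv = TimeSeriesSplit(n_splits=7)
for fold, (train_idx, test_idx) in enumerate(tscv.split(X), start=1):
    # the training window grows each fold; the test block is always
    # the contiguous chunk that immediately follows it
    print(f"fold {fold}: train size = {len(train_idx):6d}, "
          f"test range = [{test_idx[0]}, {test_idx[-1]}]")
```

Because each test block lies strictly after its training window, no future information leaks into training, which is what makes the scheme mimic periodic retraining in deployment.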

6. In-Depth Discussion of Findings

6.1. Engineering Significance and Operational Implications

These results extend beyond comparative algorithm performance to provide actionable intelligence for engineering design and operational management of AI-enabled microgrid systems.

The Accuracy–Efficiency Trade-Off as a Deployment Decision Variable

The findings (Table 2) show that optimizer choice is a strategic decision affecting system reliability, operational agility, and cost. PSO with Standard scaling delivers the highest accuracy (0.47% MAPE) but is computationally intensive (23.4 h, Table 3), making it best suited to offline, periodic retraining. GP-BO with Robust scaling is 64–73× faster and 6.2× more sample-efficient, enabling rapid, even sub-daily retraining, and is therefore well suited to adaptive control under evolving grid conditions. HEBO with Standard scaling balances accuracy and efficiency, making it suitable for edge deployment on resource-constrained controllers. Two practical lessons follow: preprocessing and optimization strategies must be co-designed, and standardized, one-size-fits-all preprocessing workflows are inadequate. GP-BO's efficiency also aligns with environmental sustainability mandates and regulatory trends, offering a pathway to reduce the carbon footprint of model training without compromising forecasting integrity [29].

7. Conclusions and Future Work

This study benchmarks hyperparameter optimization methods for microgrid residual demand forecasting, revealing accuracy-efficiency trade-offs. PSO achieves 0.47% MAPE but is computationally intensive, making it suitable for accuracy-critical applications. GP-BO offers revolutionary efficiency with 19-min optimization and acceptable accuracy trade-offs, enabling rapid model iteration. HEBO provides balanced performance with stable convergence and efficient model architectures.
Data–optimizer coupling is critical, requiring algorithm-specific preprocessing strategies. Optimal scaling methods vary by algorithm, necessitating co-design of data preprocessing and optimization. This insight has broad applicability beyond energy forecasting to domains employing automated machine learning pipelines.
The results guide policymakers and microgrid operators in deploying ML solutions, enabling practical AI-driven forecasting in resource-constrained environments. This supports the transition toward decentralized, sustainable energy systems. Future research directions include hybrid optimization, federated learning [30], transfer learning [31], explainable AI [32], and real-time adaptive optimization.

Author Contributions

Conceptualization, P.N. and M.R.; methodology, P.N. and M.R.; formal analysis, P.N. and M.R.; investigation, P.N.; writing—original draft preparation, P.N. and M.R.; writing—review and editing, M.R., F.N. and K.A.F.; supervision, F.N. and K.A.F.; funding acquisition, F.N. and M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data available on reasonable request.

Acknowledgments

The Eskom Tertiary Education Support Programme (TESP), South Africa, is acknowledged for its continued support of research initiatives.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
RDF: Residual Demand Forecasting
HEBO: Heteroscedastic Evolutionary Bayesian Optimization
GP-BO: Gaussian Process Bayesian Optimization
PSO: Particle Swarm Optimization
CQALA: Chaotic Quasi-Reverse Artificial Lemming Algorithm
AEIO: Age of Exploration-Inspired Optimizer

References

  1. Strbac, G.; Papadaskalopoulos, D.; Chrysanthopoulos, N.; Estanqueiro, A.; Algarvio, H.; Lopes, F.; de Vries, L.; Morales-Espana, G.; Sijm, J.; Hernandez-Serna, R.; et al. Decarbonization of Electricity Systems in Europe: Market Design Challenges. IEEE Power Energy Mag. 2021, 19, 53–63. [Google Scholar] [CrossRef]
  2. Kim, J.E. Sustainable energy transition in developing countries: The role of energy aid donors. Clim. Policy 2019, 19, 1–16. [Google Scholar] [CrossRef]
  3. Baker, L.; Newell, P.; Phillips, J. The Political Economy of Energy Transitions: The Case of South Africa. New Political Econ. 2014, 19, 791–818. [Google Scholar] [CrossRef]
  4. Sousa, T.; Soares, T.; Pinson, P.; Moret, F.; Baroche, T.; Sorin, E. Peer-to-peer and community-based markets: A comprehensive review. Renew. Sustain. Energy Rev. 2019, 104, 367–378. [Google Scholar] [CrossRef]
  5. Hong, T. Energy Forecasting: Past, Present, and Future. Foresight Int. J. Appl. Forecast. 2014, 32, 43–48. [Google Scholar]
  6. Shahzad, S.; Abbasi, M.A.; Ali, H.; Iqbal, M.; Munir, R.; Kilic, H. Possibilities, Challenges, and Future Opportunities of Microgrids: A Review. Sustainability 2023, 15, 6366. [Google Scholar] [CrossRef]
  7. Chen, T.; Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Association for Computing Machinery, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar] [CrossRef]
  8. Yang, L.; Shami, A. On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing 2020, 415, 295–316. [Google Scholar] [CrossRef]
  9. Razak, T.R.; Ismail, M.H.; Darus, M.Y.; Jarimi, H.; Su, Y. Artificial Intelligence in Renewable Energy: A Systematic Review of Trends in Solar, Wind, and Smart Grid Applications. Res. Rev. Sustain. 2025, 1, 1–22. [Google Scholar] [CrossRef]
  10. Cowen-Rivers, A.I.; Lyu, W.; Tutunov, R.; Wang, Z.; Grosnit, A.; Griffiths, R.R.; Maraval, A.M.; Jianye, H.; Wang, J.; Peters, J.; et al. HEBO: An Empirical Study of Assumptions in Bayesian Optimisation. J. Artif. Intell. Res. 2022, 74, 1269–1349. [Google Scholar] [CrossRef]
  11. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar] [CrossRef]
  12. Shahriari, B.; Swersky, K.; Wang, Z.; Adams, R.P.; de Freitas, N. Taking the Human Out of the Loop: A Review of Bayesian Optimization. Proc. IEEE 2016, 104, 148–175. [Google Scholar] [CrossRef]
  13. Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles and Practice; OTexts: Melbourne, Australia, 2018. [Google Scholar]
  14. Dudek, G. Pattern-based local linear regression models for short-term load forecasting. Electr. Power Syst. Res. 2016, 130, 139–147. [Google Scholar] [CrossRef]
  15. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. Lightgbm: A highly efficient gradient boosting decision tree. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Volume 30, pp. 3149–3157. [Google Scholar]
  16. Jalalifar, R.; Delavar, M.R.; Ghaderi, S.F. SAC-ConvLSTM: A novel spatio-temporal deep learning-based approach for a short term power load forecasting. Expert Syst. Appl. 2024, 237, 121487. [Google Scholar] [CrossRef]
  17. Yang, J. Deep learning methods for smart grid data analysis. In Proceedings of the 2024 4th International Conference on Smart Grid and Energy Internet (SGEI), Shenyang, China, 13–15 December 2024; pp. 717–720. [Google Scholar] [CrossRef]
  18. Adib, A.; Nduka, O.S. Hyperparameter Optimization Techniques for Enhanced Machine Learning Energy Forecasting: A Comparative Analysis. In Proceedings of the 2025 13th International Conference on Smart Grid (icSmartGrid), Glasgow, UK, 27–29 May 2025; pp. 220–224. [Google Scholar] [CrossRef]
  19. Bischl, B.; Binder, M.; Lang, M.; Pielok, T.; Richter, J.; Coors, S.; Thomas, J.; Ullmann, T.; Becker, M.; Boulesteix, A.; et al. Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2023, 13, e1484. [Google Scholar] [CrossRef]
  20. Chou, J.S.; Nguyen, H.M. Advancing energy predictive models in smart and sustainable buildings through the age of exploration-inspired optimization of machine and deep learning. Build. Environ. 2026, 290, 114113. [Google Scholar] [CrossRef]
  21. Chen, S.; Wan, H.; Peng, B.; Quan, R.; Chang, Y.; Derigent, W. Accurate multi-step wind and solar power forecasting based on multi-scale convolutional Kolmogorov-Arnold network and improved Lemming-optimized attention fusion. Eng. Appl. Artif. Intell. 2026, 163, 112832. [Google Scholar] [CrossRef]
  22. Snoek, J.; Larochelle, H.; Adams, R.P. Practical bayesian optimization of machine learning algorithms. Adv. Neural Inf. Process. Syst. 2012, 25, 2960–2968. [Google Scholar]
  23. Sadheesh Kumar, S.J. Next-Gen Solar Forecasting: PSO-Optimized Bayesian LSTM for Enhanced Accuracy. J. Oper. Autom. Power Eng. 2025. [Google Scholar] [CrossRef]
  24. Ejiyi, C.J.; Cai, D.; Thomas, D.; Obiora, S.; Osei-Mensah, E.; Acen, C.; Eze, F.O.; Sam, F.; Zhang, Q.; Bamisile, O.O. Comprehensive review of artificial intelligence applications in renewable energy systems: Current implementations and emerging trends. J. Big Data 2025, 12, 169. [Google Scholar] [CrossRef]
  25. Zha, D.; Bhat, Z.P.; Lai, K.-H.; Yang, F.; Hu, X. Data-centric AI: Perspectives and Challenges. In 2023 SIAM International Conference on Data Mining (SDM); Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2023; pp. 945–948. [Google Scholar] [CrossRef]
  26. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  27. Murtiningsih, D.A.; Sari, B.W.; Fajri, I.N. Comparison of Light Gradient Boosting Machine, eXtreme Gradient Boosting, and CatBoost with Balancing and Hyperparameter Tuning for Hypertension Risk Prediction on Clinical Dataset. J. Appl. Inform. Comput. 2025, 9, 2753–2763. [Google Scholar] [CrossRef]
  28. Hidayaturrohman, Q.A.; Hanada, E. A Comparative Analysis of Hyper-Parameter Optimization Methods for Predicting Heart Failure Outcomes. Appl. Sci. 2025, 15, 3393. [Google Scholar] [CrossRef]
  29. Hyperlocalized Energy Sharing Market Size & Trends Estimation. Available online: https://www.htfmarketintelligence.com/report/global-hyperlocalized-energy-sharing-market (accessed on 14 February 2026).
  30. Yang, Q.; Liu, Y.; Chen, T.; Tong, Y. Federated Machine Learning: Concept and Applications. ACM Trans. Intell. Syst. Technol. 2019, 10, 1–19. [Google Scholar] [CrossRef]
  31. Yang, Q.; Lin, Y.; Kuang, S.; Wang, D. A novel short-term load forecasting approach for data-poor areas based on K-MIFS-XGBoost and transfer-learning. Electr. Power Syst. Res. 2024, 229, 110151. [Google Scholar] [CrossRef]
  32. Barredo Arrieta, A.; Díaz-Rodríguez, N.; Del Ser, J.; Bennetot, A.; Tabik, S.; Barbado, A.; Garcia, S.; Gil-Lopez, S.; Molina, D.; Benjamins, R.; et al. Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI. Inf. Fusion 2020, 58, 82–115. [Google Scholar] [CrossRef]
Figure 1. Optimizer loss convergence for (a) GP-BO; (b) PSO; (c) HEBO.
Figure 2. Multi-horizon RMSE performance for (a) GP-BO; (b) PSO; (c) HEBO.
Figure 3. Residual error distributions for (a) GP-BO; (b) PSO; (c) HEBO.
Table 1. Optimizer performance across scaling methods.

Optimizer | Best Scaling | MAPE (%) | RMSLE   | Optimization Time
PSO       | Standard     | 0.469    | 0.00227 | 23.38 h
HEBO      | Standard     | 0.482    | 0.00231 | 20.63 h
GP-BO     | Robust       | 4.770    | 0.01768 | 0.32 h
Table 2. Optimal scaling method by optimizer.

Optimizer | Optimal Scaling | Performance Drop with Alternative
PSO       | Standard        | 41% higher MAPE with Robust
HEBO      | Standard        | 47% higher MAPE with Robust
GP-BO     | Robust          | 24% higher MAPE with Standard
Table 3. Computational resource utilization.

Metric          | PSO     | HEBO    | GP-BO
Time            | 23.38 h | 20.63 h | 0.32 h
Evaluations     | 372     | 372     | 60
CPU Utilization | 56.3%   | 56.1%   | 5.6%
Model Size      | 7.05 MB | 3.52 MB | -
