HyperDE: An Adaptive Hyper-Heuristic for Global Optimization
Abstract
1. Introduction
1.1. Problem
1.2. State of the Art
1.3. Our Contribution
- The high-level adaptive algorithm, a variant of the genetic algorithm (GA), is responsible for the online learning of the hyper-parameters of the basic heuristics.
- The low-level optimization algorithm can be any optimization method characterized by a set of hyper-parameters that are tuned by the high-level algorithm. The low-level algorithm acts through agents that directly attempt to solve the optimization problem by exploring the solution space. A minimal conceptual sketch of this two-level loop is given after this list.
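To make the two-level idea concrete, here is a self-contained Python sketch: the low-level heuristic is a deliberately simple random perturbation search with a single hyper-parameter (the step size), and the high level evolves that hyper-parameter with a tiny GA-like rule. This is only an illustration of the architecture under our own assumptions (objective, operators, and all numeric settings are ours), not the HyperDE implementation.

```python
import random

def sphere(x):
    # Toy objective used only for illustration: minimize the sum of squares.
    return sum(v * v for v in x)

def low_level_search(obj, dim, step, n_iter=200):
    # Deliberately simple low-level heuristic: random perturbation search whose
    # behaviour depends on a single hyper-parameter, `step`.
    x = [random.uniform(-5, 5) for _ in range(dim)]
    best = obj(x)
    for _ in range(n_iter):
        cand = [v + random.gauss(0, step) for v in x]
        f = obj(cand)
        if f < best:
            x, best = cand, f
    return best

def hyper_loop(obj, dim=10, n_agents=8, n_outer=15):
    # High level: evolve the hyper-parameter `step` with a tiny GA-like rule
    # (truncation selection plus Gaussian mutation of the surviving settings).
    agents = [random.uniform(0.01, 2.0) for _ in range(n_agents)]
    best_overall = float("inf")
    for _ in range(n_outer):
        scored = sorted((low_level_search(obj, dim, s), s) for s in agents)
        best_overall = min(best_overall, scored[0][0])
        survivors = [s for _, s in scored[: n_agents // 2]]
        agents = survivors + [abs(s + random.gauss(0, 0.1)) for s in survivors]
    return best_overall

print(hyper_loop(sphere))
```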
1.4. Content
2. Differential Evolution Algorithm Overview
Algorithm 1 Differential evolution algorithm

NP ← population size, F ← weighting factor, CR ← crossover probability
Initialize randomly all individuals x_i, i = 1, …, NP; t ← 0
while the stopping criterion is not met do
    for i = 1, …, NP do
        Randomly choose x_{r1}, x_{r2}, x_{r3} from the current population
        MUTATION: form the donor vector v_i using Formula (1)
        CROSSOVER: form the trial vector u_i with (2)
        EVALUATE: if f(u_i) ≤ f(x_i), replace x_i with the trial vector
    end for
    t ← t + 1
end while
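For reference, the sketch below implements the classic DE/rand/1/bin variant of Algorithm 1 with NumPy. It is our own illustrative code (function and parameter names are ours), not the implementation used in the paper.

```python
import numpy as np

def de_rand_1_bin(obj, bounds, NP=30, F=0.8, CR=0.9, n_iter=500, rng=None):
    # Classic DE/rand/1/bin: mutation, binomial crossover, greedy selection.
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds, dtype=float).T
    d = lo.size
    pop = rng.uniform(lo, hi, size=(NP, d))
    fit = np.array([obj(x) for x in pop])
    for _ in range(n_iter):
        for i in range(NP):
            # Choose three distinct individuals, all different from i.
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            donor = pop[r1] + F * (pop[r2] - pop[r3])                # mutation
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True                            # keep at least one donor gene
            trial = np.clip(np.where(cross, donor, pop[i]), lo, hi)  # crossover
            f_trial = obj(trial)
            if f_trial <= fit[i]:                                    # selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()
```

For example, `de_rand_1_bin(lambda x: float(np.sum(x**2)), [(-5.0, 5.0)] * 10)` minimizes a 10-dimensional sphere function.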
- 0: “DE/rand/1”, which is (1): v_i = x_{r1} + F (x_{r2} - x_{r3})
- 1: “DE/best/1”: v_i = x_{best} + F (x_{r1} - x_{r2})
- 2: “DE/rand/2”: v_i = x_{r1} + F (x_{r2} - x_{r3}) + F (x_{r4} - x_{r5})
- 3: “DE/best/2”: v_i = x_{best} + F (x_{r1} - x_{r2}) + F (x_{r3} - x_{r4})
- 4: “DE/current-to-best/1”: v_i = x_i + F (x_{best} - x_i) + F (x_{r1} - x_{r2})
- 5: “DE/current-to-rand/1”: v_i = x_i + K (x_{r1} - x_i) + F (x_{r2} - x_{r3})
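The six strategies differ only in how the donor vector is built. The helper below writes each rule explicitly; it is our own illustration, and the coefficient K used by current-to-rand/1 is often drawn at random per individual rather than fixed as assumed here.

```python
import numpy as np

def donor_vector(strategy, pop, i, best, F, K=0.5, rng=None):
    # Donor (mutant) vector for the standard DE mutation strategies.
    # `pop` is an (NP, d) array, `i` the current index, `best` the index of the
    # best individual; at least six individuals are needed for the */2 variants.
    rng = rng or np.random.default_rng()
    r = rng.choice([j for j in range(len(pop)) if j != i], 5, replace=False)
    x = pop
    if strategy == "rand/1":
        return x[r[0]] + F * (x[r[1]] - x[r[2]])
    if strategy == "best/1":
        return x[best] + F * (x[r[0]] - x[r[1]])
    if strategy == "rand/2":
        return x[r[0]] + F * (x[r[1]] - x[r[2]]) + F * (x[r[3]] - x[r[4]])
    if strategy == "best/2":
        return x[best] + F * (x[r[0]] - x[r[1]]) + F * (x[r[2]] - x[r[3]])
    if strategy == "current-to-best/1":
        return x[i] + F * (x[best] - x[i]) + F * (x[r[0]] - x[r[1]])
    if strategy == "current-to-rand/1":
        return x[i] + K * (x[r[0]] - x[i]) + F * (x[r[1]] - x[r[2]])
    raise ValueError(f"unknown strategy: {strategy}")
```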
3. Other Parameterized Heuristics
4. Genetic Algorithm Overview
Algorithm 2 Genetic algorithm

Initialize randomly the population P
while the stopping criterion is not met do
    Compute the fitness values of P
    Select parents and apply crossover and mutation to obtain the offspring population Q
    Compute the fitness values of Q
    Form the new population P from P and Q
end while
return the best individual found
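As a reference point, a generic GA skeleton for minimization looks as follows. This is our own sketch, with truncation selection and generational replacement chosen for brevity; the selection, crossover, and mutation operators used in the paper may differ.

```python
import random

def genetic_algorithm(fitness, random_individual, crossover, mutate,
                      pop_size=50, n_gen=100):
    # Generic GA skeleton (minimization): evaluate, select, recombine, mutate, replace.
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness)                       # evaluate and rank
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        pop = parents + children                    # replacement with elitism
    return min(pop, key=fitness)
```

It can be instantiated, for instance, with real-vector individuals, arithmetic crossover, and Gaussian mutation.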
5. HyperDE Algorithm
Algorithm 3 Computation of the fitness function for all agents

procedure AgentsFitness(agents, H, n_window)
    for each DE agent g in agents do
        …
    end for
    for each iteration do
        for each DE agent g in agents do
            …
        end for
        Sort in ascending/descending order by fitness the set …
        for each DE agent g in agents do
            …
        end for
    end for
    return the fitness values of the agents
end procedure
Algorithm 4 Evaluate and Explore algorithm

procedure EvaluateAndExplore(agents, S, n_quota, n_window)
    for each DE agent g in agents do
        … ← solutions generated by the DE algorithm run for n_window iterations on population S
    end for
    call AgentsFitness(…)
    for each DE agent g in agents do
        …
    end for
    return …
end procedure
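Because the pseudocode above survives only as a skeleton, the following Python sketch is one plausible reading of Algorithms 3 and 4 rather than the authors' exact procedure: each agent (a set of DE hyper-parameters) explores the shared population for n_window generations, agents are ranked by the best value they reach (a simplified stand-in for AgentsFitness), and each agent contributes its n_quota best solutions to the next shared population. Everything beyond the names visible in the pseudocode is our assumption.

```python
import random

def de_step(pop, objective, F, CR):
    # One DE/rand/1/bin generation on `pop` (a list of real-valued lists);
    # the population must contain at least four individuals.
    out = []
    for i, x in enumerate(pop):
        r1, r2, r3 = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        donor = [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]
        jrand = random.randrange(len(x))
        trial = [d if (random.random() < CR or j == jrand) else v
                 for j, (v, d) in enumerate(zip(x, donor))]
        out.append(trial if objective(trial) <= objective(x) else x)
    return out

def evaluate_and_explore(agents, population, objective, n_quota, n_window):
    # Each agent is a dict of DE hyper-parameters, e.g. {"F": 0.8, "CR": 0.9}.
    explored = {}
    for k, agent in enumerate(agents):
        pop = list(population)
        for _ in range(n_window):
            pop = de_step(pop, objective, agent["F"], agent["CR"])
        explored[k] = pop
        # Simplified stand-in for AgentsFitness: score agents by their best value.
        agent["fitness"] = min(objective(x) for x in pop)
    # Each agent contributes its n_quota best solutions to the next population.
    new_population = []
    for k in explored:
        new_population.extend(sorted(explored[k], key=objective)[:n_quota])
    return agents, new_population
```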
Algorithm 5 HyperDE algorithm
6. Results
Algorithm 6 Evaluation, relative error

for each problem in n_problems do
    for each test in n_tests do
        if … then
            …
        else
            …
        end if
    end for
end for
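The if/else in Algorithm 6 most likely distinguishes the case of a zero optimum, where a relative error is undefined; a common convention, assumed here rather than taken from the paper, is:

```python
def relative_error(f_found, f_opt, eps=1e-12):
    # Relative error when the known optimum is nonzero, absolute error otherwise
    # (a common convention; the paper's exact rule is not visible in the source).
    if abs(f_opt) > eps:
        return abs(f_found - f_opt) / abs(f_opt)
    return abs(f_found - f_opt)
```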
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Ma, Z.; Wu, G.; Suganthan, P.N.; Song, A.; Luo, Q. Performance assessment and exhaustive listing of 500+ nature-inspired metaheuristic algorithms. Swarm Evol. Comput. 2023, 77, 101248.
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: New York, NY, USA, 1995; Volume 4, pp. 1942–1948.
- Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
- Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39.
- Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471.
- Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
- Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
- Yang, X.S. Firefly algorithms for multimodal optimization. In Proceedings of the International Symposium on Stochastic Algorithms, Sapporo, Japan, 26–28 October 2009; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178.
- Storn, R.; Price, K. Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
- Tanabe, R.; Fukunaga, A. Success-history based parameter adaptation for differential evolution. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; IEEE: New York, NY, USA, 2013; pp. 71–78.
- Tanabe, R.; Fukunaga, A.S. Improving the search performance of SHADE using linear population size reduction. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; IEEE: New York, NY, USA, 2014; pp. 1658–1665.
- Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34.
- Alsattar, H.A.; Zaidan, A.; Zaidan, B. Novel meta-heuristic bald eagle search optimisation algorithm. Artif. Intell. Rev. 2020, 53, 2237–2264.
- Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
- Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
- Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst. Appl. 2021, 177, 114864.
- Camacho-Villalón, C.L.; Dorigo, M.; Stützle, T. Exposing the grey wolf, moth-flame, whale, firefly, bat, and antlion algorithms: Six misleading optimization techniques inspired by bestial metaphors. Int. Trans. Oper. Res. 2023, 30, 2945–2971.
- Talatahari, S.; Azizi, M. Chaos Game Optimization: A novel metaheuristic algorithm. Artif. Intell. Rev. 2021, 54, 917–1004.
- Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323.
- Gárate-Escamilla, A.K.; Amaya, I.; Cruz-Duarte, J.M.; Terashima-Marín, H.; Ortiz-Bayliss, J.C. Identifying hyper-heuristic trends through a text mining approach on the current literature. Appl. Sci. 2022, 12, 10576.
- Burke, E.K.; Hyde, M.R.; Kendall, G.; Ochoa, G.; Ozcan, E.; Woodward, J.R. Exploring hyper-heuristic methodologies with genetic programming. In Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; pp. 177–201.
- Burke, E.K.; McCollum, B.; Meisels, A.; Petrovic, S.; Qu, R. A graph-based hyper-heuristic for educational timetabling problems. Eur. J. Oper. Res. 2007, 176, 177–192.
- Hsiao, P.C.; Chiang, T.C.; Fu, L.C. A VNS-based hyper-heuristic with adaptive computational budget of local search. In Proceedings of the 2012 IEEE Congress on Evolutionary Computation, Brisbane, Australia, 10–15 June 2012; IEEE: New York, NY, USA, 2012; pp. 1–8.
- Chen, P.C.; Kendall, G.; Berghe, G.V. An ant based hyper-heuristic for the travelling tournament problem. In Proceedings of the 2007 IEEE Symposium on Computational Intelligence in Scheduling, Honolulu, HI, USA, 1–5 April 2007; IEEE: New York, NY, USA, 2007; pp. 19–26.
- Burke, E.K.; Kendall, G.; Soubeiga, E. A tabu-search hyperheuristic for timetabling and rostering. J. Heuristics 2003, 9, 451–470.
- Cowling, P.I.; Chakhlevitch, K. Using a large set of low level heuristics in a hyperheuristic approach to personnel scheduling. In Evolutionary Scheduling; Springer: Berlin/Heidelberg, Germany, 2007; pp. 543–576.
- Han, L.; Kendall, G. Guided operators for a hyper-heuristic genetic algorithm. In Proceedings of the Australasian Joint Conference on Artificial Intelligence, Perth, Australia, 3–5 December 2003; Springer: Berlin/Heidelberg, Germany, 2003; pp. 807–820.
- Bai, R.; Kendall, G. An investigation of automated planograms using a simulated annealing based hyper-heuristic. In Metaheuristics: Progress as Real Problem Solvers; Springer: Berlin/Heidelberg, Germany, 2005; pp. 87–108.
- Resende, M.G.; de Sousa, J.P.; Nareyek, A. Choosing search heuristics by non-stationary reinforcement learning. In Metaheuristics: Computer Decision-Making; Springer: Berlin/Heidelberg, Germany, 2004; pp. 523–544.
- Lim, K.C.W.; Wong, L.P.; Chin, J.F. Simulated-annealing-based hyper-heuristic for flexible job-shop scheduling. In Engineering Optimization; Taylor & Francis: Abingdon, UK, 2022; pp. 1–17.
- Qin, W.; Zhuang, Z.; Huang, Z.; Huang, H. A novel reinforcement learning-based hyper-heuristic for heterogeneous vehicle routing problem. Comput. Ind. Eng. 2021, 156, 107252.
- Lin, J.; Zhu, L.; Gao, K. A genetic programming hyper-heuristic approach for the multi-skill resource constrained project scheduling problem. Expert Syst. Appl. 2020, 140, 112915.
- Oliva, D.; Martins, M.S. A Bayesian based Hyper-Heuristic approach for global optimization. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; IEEE: New York, NY, USA, 2019; pp. 1766–1773.
- Cruz-Duarte, J.M.; Amaya, I.; Ortiz-Bayliss, J.C.; Conant-Pablos, S.E.; Terashima-Marín, H.; Shi, Y. Hyper-heuristics to customise metaheuristics for continuous optimisation. Swarm Evol. Comput. 2021, 66, 100935.
- Burke, E.K.; Hyde, M.R.; Kendall, G.; Ochoa, G.; Özcan, E.; Woodward, J.R. A classification of hyper-heuristic approaches: Revisited. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 453–477.
- Adam, S.P.; Alexandropoulos, S.A.N.; Pardalos, P.M.; Vrahatis, M.N. No free lunch theorem: A review. In Approximation and Optimization: Algorithms, Complexity and Applications; Springer: Berlin/Heidelberg, Germany, 2019; pp. 57–82.
- Georgioudakis, M.; Plevris, V. A comparative study of differential evolution variants in constrained structural optimization. Front. Built Environ. 2020, 6, 102.
- Thieu, N.V.; Mirjalili, S. MEALPY: A Framework of The State-of-The-Art Meta-Heuristic Algorithms in Python. 2022. Available online: https://zenodo.org/record/6684223 (accessed on 18 September 2023).
- Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80, 8091–8126.
Heuristic Name | Relative Error
---|---
HyperDE | 0.045 |
LSHADE | 0.078 |
SHADE | 0.085 |
HyperSSA | 1.068 |
HyperBES | 1.073 |
Chaos Game Optimization (CGO) | 1.213 |
Whale Optimization Algorithm (WOA) | 1.303 |
Sparrow Search Algorithm (SSA) | 1.462 |
Slime Mold Algorithm (SMA) | 1.741 |
Hunger Games Search (HGS) | 1.795 |
Bald Eagle Search (BES) | 2.068 |
Harris Hawks Optimization (HHO) | 2.103 |
Differential Evolution (DE) | 3.603 |
Heuristic Name | Average Rank
---|---
HyperDE | 1.875 |
SHADE | 2.680 |
LSHADE | 3.097 |
Slime Mold Algorithm (SMA) | 4.222 |
HyperBES | 4.916 |
Sparrow Search Algorithm (SSA) | 5.097 |
Differential Evolution (DE) | 5.458 |
Chaos Game Optimization (CGO) | 5.638 |
Whale Optimization Algorithm (WOA) | 6.041 |
HyperSSA | 6.208 |
Hunger Games Search (HGS) | 6.791 |
Bald Eagle Search (BES) | 8.166 |
Harris Hawks Optimization (HHO) | 8.305 |
Function | HyperDE | HyperSSA | HyperBES | LSHADE | SHADE |
---|---|---|---|---|---|
F1 | 4.09 | 6.61 | 10.72 | 2.43 | 2.34 |
F2 | 4.12 | 6.45 | 10.83 | 2.33 | 2.37 |
F3 | 4.29 | 7.16 | 11.39 | 2.36 | 2.37 |
F4 | 4.30 | 6.56 | 11.39 | 2.37 | 2.37 |
F5 | 4.20 | 6.51 | 11.09 | 2.34 | 2.32 |
F6 | 4.88 | 7.82 | 12.75 | 2.76 | 2.76 |
F7 | 5.41 | 6.71 | 14.22 | 3.17 | 3.22 |
F8 | 5.76 | 9.23 | 15.16 | 3.59 | 3.47 |
F9 | 4.51 | 7.26 | 11.87 | 2.62 | 2.65 |
F10 | 4.54 | 6.97 | 11.94 | 2.73 | 2.69 |
F11 | 4.76 | 7.42 | 12.48 | 2.80 | 2.82 |
F12 | 4.77 | 7.23 | 12.54 | 2.78 | 2.78 |
Problem | Method | Best | Mean | Std. Dev
---|---|---|---|---
F1 | HyperDE | 300.0 | 300.0 | 
F1 | SHADE | 300.000000106882 | 300.0000004073815 | 
F1 | LSHADE | 300.0000001404671 | 300.00000041613555 | 
F1 | DE | 1235.4661553234967 | 4130.32659367826 | 1205.22
F2 | HyperDE | 400.00006692489313 | 405.7762007938934 | 3.18
F2 | SHADE | 400.3938197535978 | 407.5481893148344 | 2.13
F2 | LSHADE | 400.3851128433223 | 408.0876491551797 | 1.55
F2 | DE | 410.80352316352435 | 417.77563978207604 | 3.27
F3 | HyperDE | 600.0 | 600.0 | 
F3 | SHADE | 600.0 | 600.0 | 0.0
F3 | LSHADE | 600.0 | 600.0 | 0.0
F3 | DE | 600.0000112734627 | 600.0000363216427 | 
F4 | HyperDE | 800.0971590987756 | 800.29929127267 | 0.19
F4 | SHADE | 800.2721523050697 | 800.4602925668243 | 0.09
F4 | LSHADE | 800.2506108454668 | 800.397114829017 | 0.08
F4 | DE | 800.2004337865635 | 800.4850824190988 | 0.09
F5 | HyperDE | 900.0 | 900.0436061309064 | 0.11
F5 | SHADE | 900.0000000000498 | 900.0000000002758 | 
F5 | LSHADE | 900.0000000000752 | 900.0000000001007 | 
F5 | DE | 900.0308300706507 | 900.2115266993348 | 0.04
F6 | HyperDE | 1800.550008569961 | 2349.376836704244 | 1611.13
F6 | SHADE | 2527.2166249124543 | 3318.210953638849 | 515.35
F6 | LSHADE | 2448.651779056926 | 3156.2346074618913 | 433.08
F6 | DE | 18,245.220034908845 | 55,938.99918382257 | 23,898.85
F7 | HyperDE | 2001.494972244204 | 2020.428895032187 | 8.00
F7 | SHADE | 2023.69169017458 | 2028.6419318989665 | 1.74
F7 | LSHADE | 2014.5645721366564 | 2027.7440931323101 | 2.78
F7 | DE | 2024.5036568238922 | 2028.4449058504215 | 1.99
F8 | HyperDE | 2200.226924720861 | 2218.3297256215615 | 6.73
F8 | SHADE | 2209.5886930417746 | 2218.297973263973 | 4.32
F8 | LSHADE | 2208.325106401539 | 2217.383152114902 | 4.02
F8 | DE | 2230.0941806086953 | 2245.28243788769 | 9.28
F9 | HyperDE | 2300.0 | 2437.838567116613 | 174.83
F9 | SHADE | 2300.0001089307398 | 2300.0048868578 | 0.03
F9 | LSHADE | 2300.0001212533975 | 2300.0061751439484 | 0.03
F9 | DE | 2396.4069384108525 | 2646.9062600644074 | 60.17
F10 | HyperDE | 2598.5455167771906 | 2605.156197642855 | 25.86
F10 | SHADE | 2598.5468912550587 | 2598.5620204350707 | 0.07
F10 | LSHADE | 2598.5468210921954 | 2598.551948117506 | 0.00
F10 | DE | 2608.799360629708 | 2614.908013661977 | 3.19
F11 | HyperDE | 2600.0 | 2601.8934468491198 | 6.28
F11 | SHADE | 2600.0000050187055 | 2600.0000090069157 | 
F11 | LSHADE | 2600.0000049464516 | 2600.000009874206 | 
F11 | DE | 2607.8865627810346 | 2624.76104595252 | 4.06
F12 | HyperDE | 2863.76941226926 | 2865.838163178578 | 1.26
F12 | SHADE | 2821.118856642622 | 2864.1391105809807 | 5.62
F12 | LSHADE | 2864.224719298684 | 2864.904468801826 | 0.50
F12 | DE | 2866.84004856121 | 2867.9027213119152 | 0.50
Problem | Method | p-Value
---|---|---
F1 | HyperDE vs. SHADE | 
F1 | HyperDE vs. LSHADE | 
F1 | HyperDE vs. DE | 
F2 | HyperDE vs. SHADE | 
F2 | HyperDE vs. LSHADE | 
F2 | HyperDE vs. DE | 
F3 | HyperDE vs. SHADE | 
F3 | HyperDE vs. LSHADE | 
F3 | HyperDE vs. DE | 
F4 | HyperDE vs. SHADE | 
F4 | HyperDE vs. LSHADE | 
F4 | HyperDE vs. DE | 
F5 | HyperDE vs. SHADE | 
F5 | HyperDE vs. LSHADE | 
F5 | HyperDE vs. DE | 
F6 | HyperDE vs. SHADE | 
F6 | HyperDE vs. LSHADE | 
F6 | HyperDE vs. DE | 
F7 | HyperDE vs. SHADE | 
F7 | HyperDE vs. LSHADE | 
F7 | HyperDE vs. DE | 
F8 | HyperDE vs. SHADE | 
F8 | HyperDE vs. LSHADE | 
F8 | HyperDE vs. DE | 
F9 | HyperDE vs. SHADE | 
F9 | HyperDE vs. LSHADE | 
F9 | HyperDE vs. DE | 
F10 | HyperDE vs. SHADE | 
F10 | HyperDE vs. LSHADE | 
F10 | HyperDE vs. DE | 
F11 | HyperDE vs. SHADE | 
F11 | HyperDE vs. LSHADE | 
F11 | HyperDE vs. DE | 
F12 | HyperDE vs. SHADE | 
F12 | HyperDE vs. LSHADE | 
F12 | HyperDE vs. DE | 