Simulated Annealing with Exploratory Sensing for Global Optimization
Abstract
1. Introduction
2. Simulated Annealing and Sensing Memory
2.1. Simulated Annealing Algorithm
- If $\Delta E \le 0$, meaning the new state has a smaller or equal cost, then the new state is chosen as the current state, as downhill moves are always accepted.
- If $\Delta E > 0$ and the probability of accepting the new state is larger than a random value, i.e., $\exp(-\Delta E / T) > r$, then the new state is chosen as the current state, where $T$ is a control parameter known as the temperature. This parameter is gradually decreased during the search process, making the algorithm greedier as the probability of accepting uphill moves decreases over time. Moreover, $r$ is a randomly generated number, where $r \in [0, 1)$. Accepting uphill moves is important for the algorithm to avoid being stuck in local minima.
- The initial temperature value.
- A function to decrease the temperature value gradually.
- The final temperature value.
- The length of each homogeneous Markov chain. A Markov chain is a sequence of trials in which the probability of each trial's outcome depends only on the outcome of the previous trial. It is classified as homogeneous when the transition probabilities do not depend on the trial number [62]. A minimal sketch combining these four components is given below.
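To make these components concrete, the following is a minimal Python sketch of SA built from the acceptance rule and the four cooling-schedule ingredients above. The geometric cooling rule, the default parameter values, and the neighborhood function are illustrative assumptions, not the exact settings of the proposed method.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, T0=1.0, alpha=0.9,
                        T_final=1e-4, chain_len=100):
    """Minimal SA: homogeneous Markov chains with Metropolis acceptance."""
    x, fx = x0, f(x0)
    best, f_best = x, fx
    T = T0                                   # initial temperature value
    while T > T_final:                       # final-temperature stopping value
        for _ in range(chain_len):           # one homogeneous chain at fixed T
            y = neighbor(x)
            dE = f(y) - fx
            # Downhill moves (dE <= 0) are always accepted; uphill moves are
            # accepted when exp(-dE / T) > r with r drawn uniformly from [0, 1).
            if dE <= 0 or math.exp(-dE / T) > random.random():
                x, fx = y, fx + dE
                if fx < f_best:
                    best, f_best = x, fx
        T *= alpha                           # cooling function: geometric decrease
    return best, f_best

# Example: minimize a one-dimensional quadratic.
print(simulated_annealing(lambda x: (x - 3.0) ** 2, 10.0,
                          lambda x: x + random.uniform(-1.0, 1.0)))
```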
2.2. Sensing Search Memory
Gene Matrix (GM)
3. Simulated Annealing with Exploratory Sensing (SAES) Algorithm
- (1) The exploration phase. This phase runs several iterations within several Markov chains and aims to explore the constrained search space, within the lower and upper limits of the solutions' values, without affecting how SA works in each Markov chain. Instead of starting a new Markov chain from the last accepted state, applying the GM directs the search to visit solutions in non-visited partitions of the search space. The cooling schedule and the acceptance criteria are not affected by applying the GM, and the temperature is changed once in each Markov chain. Therefore, in each Markov chain, an intensification process is carried out in the new partition of the search space, while the search keeps records of the best-found solutions. The SA settings should be adjusted so that the search space is explored in this first phase. Extensive exploration is maintained in our proposed method by a diversification index, defined as the ratio of the number of visited partitions to the number of all partitions in the GM, which is measured after each Markov chain. This phase can then be ended when the diversification index shows that a pre-defined ratio of the partitions has been visited. The numerical experiments shown later indicate that an appropriate value for this parameter is 0.9, which means visiting more than 90% of the GM partitions. Another parameter, called the diversification threshold, is applied to control the redirection to unvisited partitions of the search space; this is called diversification sensing. This parameter is a smaller anticipated increase of the diversification index. For example, a diversification threshold of 0.04 means that when the search in one Markov chain has not increased the diversification index by at least this threshold, the diversification process is invoked to move to new search partitions. Otherwise, the diversification process is not used, as the search is achieving good diversification through the escaping mechanism. A sketch of this Gene Matrix bookkeeping is given after this list.
- (2) The intensification phase. After reaching a specified level of diversification, where most of the search partitions have been visited, the search is directed to restart from the best-found state and perform a more local search in that region. Although diversification is supposed to be achieved mainly in the first phase using the GM, it is essential to choose the initial temperature wisely to avoid getting trapped in local minima early in this phase. We choose a static cooling schedule that retains a sufficiently high temperature after the exploration phase ends. The search ends upon reaching the maximum number of iterations, which equals the number of Markov chains multiplied by the length of each Markov chain.
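The GM bookkeeping described in item (1) can be sketched as follows. The representation of the GM as a boolean matrix with one row per variable and one column per partition, and the helper names below, are illustrative assumptions based on the description above, not the authors' exact data structure.

```python
import numpy as np

class GeneMatrix:
    """Tracks which partitions of each variable's range have been visited."""

    def __init__(self, lower, upper, m):
        self.lower = np.asarray(lower, dtype=float)   # lower bounds, shape (n,)
        self.upper = np.asarray(upper, dtype=float)   # upper bounds, shape (n,)
        self.m = m                                    # partitions per variable
        self.visited = np.zeros((len(self.lower), m), dtype=bool)

    def record(self, x):
        """Mark the partition containing each component of solution x."""
        idx = (np.asarray(x) - self.lower) / (self.upper - self.lower) * self.m
        idx = np.clip(idx.astype(int), 0, self.m - 1)
        self.visited[np.arange(len(idx)), idx] = True

    def diversification_index(self):
        """Ratio of visited partitions to all partitions in the GM."""
        return self.visited.sum() / self.visited.size

    def unvisited_partition(self, i):
        """An index of a not-yet-visited partition of variable i, or None."""
        free = np.flatnonzero(~self.visited[i])
        return None if free.size == 0 else int(np.random.choice(free))
```

After each Markov chain, the exploration phase would compare `diversification_index()` against the 0.9 target and check whether the last chain raised it by at least the 0.04 threshold; if not, diversification sensing samples a new starting point inside an unvisited partition.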
3.1. Neighborhood Representation
3.2. Initial Solution
3.3. Objective Function
3.4. Initial Temperature Settings
Cooling Schedule
3.5. Markov Chains Configurations
3.6. The Diversification and Intensification Settings
3.7. Stopping Criterion
3.8. The SAES Algorithm
Algorithm 1 SAES
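Since the boxed pseudo-code of Algorithm 1 does not survive this format, the following Python outline restates the two-phase flow described above. It is a sketch assembled from the text, reusing the hypothetical `GeneMatrix` class from Section 2.2; the neighborhood move, step size, and cooling factor are illustrative placeholders rather than the authors' exact choices.

```python
import math
import random

import numpy as np

def saes(f, lower, upper, gm, M=60, chain_len=50, alpha=0.9,
         div_target=0.9, div_threshold=0.04, step=0.1):
    """Two-phase SAES outline: GM-guided exploration, then intensification."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    n = len(lower)

    def neighbor(x):                       # illustrative uniform neighborhood move
        y = x + np.random.uniform(-1.0, 1.0, n) * step * (upper - lower)
        return np.clip(y, lower, upper)

    x = np.random.uniform(lower, upper)    # random initial solution
    fx = f(x)
    best, f_best = x.copy(), fx
    # Initial temperature: standard deviation of the costs of 100 random moves [66].
    T = np.std([f(np.random.uniform(lower, upper)) for _ in range(100)])
    exploring = True

    for _ in range(M):                     # M Markov chains in total
        prev_div = gm.diversification_index()
        for _ in range(chain_len):         # one homogeneous chain at fixed T
            y = neighbor(x)
            dE = f(y) - fx
            if dE <= 0 or math.exp(-dE / T) > random.random():
                x, fx = y, fx + dE
                gm.record(x)
                if fx < f_best:
                    best, f_best = x.copy(), fx
        T *= alpha                         # the temperature changes once per chain

        if exploring:
            div = gm.diversification_index()
            if div >= div_target:            # enough partitions visited:
                exploring = False            # switch to the intensification phase,
                x, fx = best.copy(), f_best  # restarting from the best-found state
            elif div - prev_div < div_threshold:
                # Diversification sensing: jump into an unvisited partition.
                i = random.randrange(n)
                j = gm.unvisited_partition(i)
                if j is not None:
                    x = x.copy()
                    x[i] = lower[i] + (j + random.random()) / gm.m * (upper[i] - lower[i])
                    fx = f(x)
    return best, f_best                    # a final local intensification follows in SAES
```

For example, `saes(lambda x: float(np.sum(x * x)), [-100] * 10, [100] * 10, GeneMatrix([-100] * 10, [100] * 10, 30))` runs the outline on a 10-dimensional sphere.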
4. Numerical Simulation
4.1. Test Functions
4.2. Parameter Setting and Configuration
- The SAES without the diversification phase and the final intensification, which reduces to the standard simulated annealing algorithm and is denoted by SA.
- The SAES method without the final intensification.
- The complete SAES method.
- Function evaluations in the initialization: 100 function evaluations.
- Function evaluations in the diversification phase: $M_1 \times l$, where $M_1$ is the number of Markov chains in the diversification phase (at most 18) and $l$ is the length of each chain.
- Function evaluations in the intensification phase: $(M - M_1) \times l$, where $M$ is the total number of Markov chains.
- Function evaluations in the final intensification: at most $500n$. These budgets are summarized in the formula below.
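Under the settings above, with $M = 60$ Markov chains in total, the overall budget adds up as follows; the symbols $M_1$ and $l$ are introduced here only to summarize the list:

$$
\underbrace{100}_{\text{initialization}}
+ \underbrace{M_1\, l}_{\text{diversification}}
+ \underbrace{(M - M_1)\, l}_{\text{intensification}}
+ \underbrace{500\, n}_{\text{final intensification}}
\;\le\; 100 + M l + 500 n .
$$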
4.3. Statistical Tests
- The sums of positive and negative ranks, $R^+$ and $R^-$.
- The p-value. A sketch of how these quantities are computed is given below.
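As an illustration of how these quantities are obtained, the following sketch computes the two rank sums and the p-value for paired error samples of two methods. The helper name is ours; SciPy's `wilcoxon` is used only for the p-value, since its statistic is the smaller of the two rank sums.

```python
import numpy as np
from scipy.stats import rankdata, wilcoxon

def wilcoxon_ranks(errors_a, errors_b):
    """Signed-rank sums R+ / R- and p-value for paired error samples."""
    d = np.asarray(errors_a, float) - np.asarray(errors_b, float)
    d = d[d != 0]                    # zero differences are discarded
    ranks = rankdata(np.abs(d))      # ranks of |d|; ties share mean ranks
    r_plus = ranks[d > 0].sum()      # cases where method A has the larger error
    r_minus = ranks[d < 0].sum()     # cases where method B has the larger error
    p = wilcoxon(d).pvalue           # two-sided signed-rank test
    return r_plus, r_minus, p
```

For 25 paired samples with no zero differences, the two sums always total $25 \cdot 26 / 2 = 325$, which matches, e.g., the $300 + 25$ row in the summary tables below.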
4.4. Results and Discussion
- There are no significant differences between the errors obtained by the SA and SAES methods, although the latter beats the former on almost all of the test functions used.
- The SAES is significantly better than the other two methods in terms of the errors obtained.
- The processing time of the SAES method is slightly longer than that of the SA and SAES methods, with no significant differences among the methods' processing times except on the hard test functions.
- BLX-GL50 [81]: Hybrid real-coded genetic algorithms with female and male differentiation.
- BLX-MA [82]: Adaptive local search parameters for real-coded memetic algorithms.
- CoEVO [83]: Real-parameter optimization using the mutation step co-evolution.
- DE [84]: Real-parameter optimization with differential evolution.
- DMS-L-PSO [85]: Dynamic multi-swarm particle swarm optimizer with local search.
- EDA [86]: A simple continuous estimated distribution algorithm.
- IPOP-CMA-ES [87]: Restart with increasing population size Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
- LR-CMA-ES [88]: Local restart CMA-ES.
- K-PCX [89]: A population-based, steady-state procedure for real-parameter optimization.
- L-SaDE [90]: Self-adaptive differential evolution algorithm.
- SPC-PNX [91]: Scaled probabilistic crowding genetic algorithm with parent centric normal crossover.
- cHS [92]: Cellular harmony search algorithm.
- HC [93]: Hill climbing.
- β-HC [94]: β-Hill climbing.
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A. Classical Test Functions
Appendix A.1. Sphere Function (f 1)
Appendix A.2. Schwefel Function (f 2)
Appendix A.3. Schwefel Function (f 3)
Appendix A.4. Schwefel Function (f 4)
Appendix A.5. Rosenbrock Function (f 5)
Appendix A.6. Step Function (f 6)
Appendix A.7. Quadratic Function with Noise (f 7)
Appendix A.8. Schwefel Functions (f 8)
Appendix A.9. Rastrigin Function (f 9)
Appendix A.10. Ackley Function (f 10)
Appendix A.11. Griewank Function (f 11)
Appendix A.12. Levy Functions (f 12, f 13)
Appendix A.13. Shekel Foxholes Function (f 14)
Appendix A.14. Kowalik Function (f 15)
Appendix A.15. Hump Function (f 16)
Appendix A.16. Branin RCOS Function (f 17)
Appendix A.17. Goldstein & Price Function (f 18)
Appendix A.18. Hartmann Function (f 19)
Appendix A.19. Hartmann Function (f 20)
Appendix A.20. Shekel Functions (f 21, f 22, f 23)
Appendix A.21. Function (f 24)
Appendix A.22. Function (f 25)
Appendix B. Hard Test Functions
h | Function Name | Bounds | Global Minimum
---|---|---|---
h1 | Shifted Sphere | [−100,100] | −450
h2 | Shifted Schwefel’s 1.2 | [−100,100] | −450
h3 | Shifted rotated high conditioned elliptic | [−100,100] | −450
h4 | Shifted Schwefel’s 1.2 with noise in fitness | [−100,100] | −450
h5 | Schwefel’s 2.6 with global optimum on bounds | [−100,100] | −310
h6 | Shifted Rosenbrock’s | [−100,100] | 390
h7 | Shifted rotated Griewank’s without bounds | [0,600] | −180
h8 | Shifted rotated Ackley’s with global optimum on bounds | [−32,32] | −140
h9 | Shifted Rastrigin’s | [−5,5] | −330
h10 | Shifted rotated Rastrigin’s | [−5,5] | −330
h11 | Shifted rotated Weierstrass | [−0.5,0.5] | 90
h12 | Schwefel’s 2.13 | [−100,100] | −460
h13 | Expanded extended Griewank’s + Rosenbrock’s | [−3,1] | −130
h14 | Expanded rotated extended Scaffer’s F6 | [−100,100] | −300
h15 | Hybrid composition function 1 | [−5,5] | 120
h16 | Rotated hybrid composition function 1 | [−5,5] | 120
h17 | Rotated hybrid composition function 1 with noise in fitness | [−5,5] | 120
h18 | Rotated hybrid composition function 2 | [−5,5] | 10
h19 | Rotated hybrid composition function 2 with narrow basin global optimum | [−5,5] | 10
h20 | Rotated hybrid composition function 2 with global optimum on the bounds | [−5,5] | 10
h21 | Rotated hybrid composition function 3 | [−5,5] | 360
h22 | Rotated hybrid composition function 3 with high condition number matrix | [−5,5] | 360
h23 | Non-continuous rotated hybrid composition function 3 | [−5,5] | 360
h24 | Rotated hybrid composition function 4 | [−5,5] | 260
h25 | Rotated hybrid composition function 4 without bounds | – | 260
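To see how the bias values in this table define the global minima, consider the first entry: the shifted sphere is the plain sphere function translated by a shift vector $o$ and offset by a bias, so its minimum value is the bias itself. A minimal sketch follows; the random shift vector is a stand-in, since the benchmark supplies fixed shift data.

```python
import numpy as np

F_BIAS = -450.0                      # global minimum value of h1 (shifted sphere)

def shifted_sphere(x, o):
    """h1: f(x) = sum((x - o)^2) + f_bias, minimized at x = o."""
    z = np.asarray(x) - np.asarray(o)
    return float(np.sum(z * z)) + F_BIAS

# At the shift vector the function attains exactly its bias value:
o = np.random.uniform(-100, 100, size=10)  # stand-in for the benchmark's shift data
assert shifted_sphere(o, o) == F_BIAS
```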
References
- Glover, F.; Laguna, M. Tabu Search; Kluwer Academic Publishers: Boston, MA, USA, 1997.
- Kirkpatrick, S.; Gelatt, C.; Vecchi, M. Optimization by simulated annealing. Science 1983, 220, 671–680.
- Goldberg, D.E. Genetic Algorithms in Search, Optimization and Machine Learning; Addison Wesley: Reading, MA, USA, 1989.
- Yasear, S.A.; Ku-Mahamud, K.R. Taxonomy of Memory Usage in Swarm Intelligence-Based Metaheuristics. Baghdad Sci. J. 2019, 16, 445–452.
- Tarawneh, H.; Ayob, M.; Ahmad, Z. A hybrid Simulated Annealing with Solutions Memory for Curriculum-based Course Timetabling Problem. J. Appl. Sci. 2013, 13, 262–269.
- Azizi, N.; Zolfaghari, S.; Liang, M. Hybrid simulated annealing with memory: An evolution-based diversification approach. Int. J. Prod. Res. 2010, 48, 5455–5480.
- Zou, D.; Wang, G.G.; Sangaiah, A.K.; Kong, X. A memory-based simulated annealing algorithm and a new auxiliary function for the fixed-outline floorplanning with soft blocks. J. Ambient Intell. Humaniz. Comput. 2017, 1–12.
- Gao, H.; Feng, B.; Zhu, L. Adaptive SAGA based on mutative scale chaos optimization strategy. In Proceedings of the 2005 International Conference on Neural Networks and Brain, Beijing, China, 13–15 October 2005; IEEE: Piscataway, NJ, USA, 2005; Volume 1, pp. 517–520.
- Skaggs, R.; Mays, L.; Vail, L. Simulated annealing with memory and directional search for ground water remediation design. J. Am. Water Resour. Assoc. 2001, 37, 853–866.
- Mohammadi, H.; Sahraeian, R. Bi-objective simulated annealing and adaptive memory procedure approaches to solve a hybrid flow shop scheduling problem with unrelated parallel machines. In Proceedings of the 2012 IEEE International Conference on Industrial Engineering and Engineering Management, Hong Kong, China, 10–13 October 2012; pp. 528–532.
- Lo, C.C.; Hsu, C.C. An annealing framework with learning memory. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 1998, 28, 648–661.
- Javidrad, F.; Nazari, M. A new hybrid particle swarm and simulated annealing stochastic optimization method. Appl. Soft Comput. 2017, 60, 634–654.
- Assad, A.; Deep, K. A Hybrid Harmony search and Simulated Annealing algorithm for continuous optimization. Inf. Sci. 2018, 450, 246–266.
- Mafarja, M.M.; Mirjalili, S. Hybrid whale optimization algorithm with simulated annealing for feature selection. Neurocomputing 2017, 260, 302–312.
- Vincent, F.Y.; Redi, A.P.; Hidayat, Y.A.; Wibowo, O.J. A simulated annealing heuristic for the hybrid vehicle routing problem. Appl. Soft Comput. 2017, 53, 119–132.
- Li, Z.; Tang, Q.; Zhang, L. Minimizing energy consumption and cycle time in two-sided robotic assembly line systems using restarted simulated annealing algorithm. J. Clean. Prod. 2016, 135, 508–522.
- Yu, V.; Iswari, T.; Normasari, N.; Asih, A.; Ting, H. Simulated annealing with restart strategy for the blood pickup routing problem. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2018; Volume 337, p. 012007.
- Hedar, A.R.; Fukushima, M. Minimizing multimodal functions by simplex coding genetic algorithm. Optim. Methods Softw. 2003, 18, 265–282.
- Li, Y.; Zeng, X. Multi-population co-genetic algorithm with double chain-like agents structure for parallel global numerical optimization. Appl. Intell. 2010, 32, 292–310.
- Sawyerr, B.A.; Ali, M.M.; Adewumi, A.O. A comparative study of some real-coded genetic algorithms for unconstrained global optimization. Optim. Methods Softw. 2011, 26, 945–970.
- Hansen, N. The CMA evolution strategy: A comparing review. In Towards a New Evolutionary Computation; Springer: Berlin/Heidelberg, Germany, 2006; pp. 75–102.
- Hedar, A.R.; Fukushima, M. Evolution strategies learned with automatic termination criteria. In Proceedings of the SCIS-ISIS, Tokyo, Japan, 20–24 September 2006; J-STAGE: Tokyo, Japan, 2006; pp. 1126–1134.
- Hedar, A.R.; Fukushima, M. Directed evolutionary programming: Towards an improved performance of evolutionary programming. In Proceedings of the 2006 IEEE International Conference on Evolutionary Computation, Vancouver, BC, Canada, 16–21 July 2006; pp. 1521–1528.
- Lee, C.Y.; Yao, X. Evolutionary programming using mutations based on the Lévy probability distribution. IEEE Trans. Evol. Comput. 2004, 8, 1–13.
- Hedar, A.R.; Fukushima, M. Tabu search directed by direct search methods for nonlinear global optimization. Eur. J. Oper. Res. 2006, 170, 329–349.
- Lozano, M.; Herrera, F.; Krasnogor, N.; Molina, D. Real-coded memetic algorithms with crossover hill-climbing. Evol. Comput. 2004, 12, 273–302.
- Nguyen, Q.H.; Ong, Y.S.; Lim, M.H. A probabilistic memetic framework. IEEE Trans. Evol. Comput. 2009, 13, 604–623.
- Noman, N.; Iba, H. Accelerating differential evolution using an adaptive local search. IEEE Trans. Evol. Comput. 2008, 12, 107–125.
- Gandomi, A.H.; Yang, X.S.; Talatahari, S.; Deb, S. Coupled eagle strategy and differential evolution for unconstrained and constrained global optimization. Comput. Math. Appl. 2012, 63, 191–200.
- Brest, J.; Maučec, M.S. Population size reduction for the differential evolution algorithm. Appl. Intell. 2008, 29, 228–247.
- Das, S.; Abraham, A.; Chakraborty, U.K.; Konar, A. Differential evolution using a neighborhood-based mutation operator. IEEE Trans. Evol. Comput. 2009, 13, 526–553.
- Qin, A.K.; Huang, V.L.; Suganthan, P.N. Differential evolution algorithm with strategy adaptation for global numerical optimization. IEEE Trans. Evol. Comput. 2009, 13, 398–417.
- Al-Tashi, Q.; Rais, H.; Abdulkadir, S.J. Hybrid swarm intelligence algorithms with ensemble machine learning for medical diagnosis. In Proceedings of the 2018 4th International Conference on Computer and Information Sciences (ICCOINS), Kuala Lumpur, Malaysia, 13–14 August 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6.
- Liang, J.J.; Qin, A.K.; Suganthan, P.N.; Baskar, S. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput. 2006, 10, 281–295.
- De Oca, M.A.M.; Stützle, T.; Birattari, M.; Dorigo, M. Frankenstein’s PSO: A composite particle swarm optimization algorithm. IEEE Trans. Evol. Comput. 2009, 13, 1120–1132.
- Salahi, M.; Jamalian, A.; Taati, A. Global minimization of multi-funnel functions using particle swarm optimization. Neural Comput. Appl. 2013, 23, 2101–2106.
- Vasumathi, B.; Moorthi, S. Implementation of hybrid ANN–PSO algorithm on FPGA for harmonic estimation. Eng. Appl. Artif. Intell. 2012, 25, 476–483.
- Duarte, A.; Martí, R.; Glover, F.; Gortazar, F. Hybrid scatter tabu search for unconstrained global optimization. Ann. Oper. Res. 2011, 183, 95–123.
- Hvattum, L.M.; Duarte, A.; Glover, F.; Martí, R. Designing effective improvement methods for scatter search: An experimental study on global optimization. Soft Comput. 2013, 17, 49–62.
- Chen, Z.; Wang, R.L. Ant colony optimization with different crossover schemes for global optimization. Clust. Comput. 2017, 20, 1247–1257.
- Ciornei, I.; Kyriakides, E. Hybrid ant colony-genetic algorithm (GAAPI) for global continuous optimization. IEEE Trans. Syst. Man Cybern. Part B 2011, 42, 234–245.
- Socha, K.; Dorigo, M. Ant colony optimization for continuous domains. Eur. J. Oper. Res. 2008, 185, 1155–1173.
- Ghanem, W.A.; Jantan, A. Hybridizing artificial bee colony with monarch butterfly optimization for numerical optimization problems. Neural Comput. Appl. 2017, 1–19.
- Zhang, B.; Liu, T.; Zhang, C.; Wang, P. Artificial bee colony algorithm with strategy and parameter adaptation for global optimization. Neural Comput. Appl. 2016, 28, 1–16.
- Hansen, P.; Mladenović, N.; Pérez, J.A.M. Variable neighbourhood search: Methods and applications. Ann. Oper. Res. 2010, 175, 367–407.
- Mladenović, N.; Dražić, M.; Kovačevic-Vujčić, V.; Čangalović, M. General variable neighborhood search for the continuous optimization. Eur. J. Oper. Res. 2008, 191, 753–770.
- Liu, H.; Cai, Z.; Wang, Y. Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl. Soft Comput. 2010, 10, 629–640.
- Vrugt, J.; Robinson, B.; Hyman, J.M. Self-adaptive multimethod search for global optimization in real-parameter spaces. IEEE Trans. Evol. Comput. 2009, 13, 243–259.
- Li, S.; Tan, M.; Tsang, I.W.; Kwok, J.T.Y. A hybrid PSO-BFGS strategy for global optimization of multimodal functions. IEEE Trans. Syst. Man Cybern. Part B 2011, 41, 1003–1014.
- Sahnehsaraei, M.A.; Mahmoodabadi, M.J.; Taherkhorsandi, M.; Castillo-Villar, K.K.; Yazdi, S.M. A hybrid global optimization algorithm: Particle swarm optimization in association with a genetic algorithm. In Complex System Modelling and Control Through Intelligent Soft Computations; Springer: Berlin/Heidelberg, Germany, 2015; pp. 45–86.
- Ting, T.; Yang, X.S.; Cheng, S.; Huang, K. Hybrid metaheuristic algorithms: Past, present, and future. In Recent Advances in Swarm Intelligence and Evolutionary Computation; Springer: Berlin/Heidelberg, Germany, 2015; pp. 71–83.
- Zhang, L.; Liu, L.; Yang, X.S.; Dai, Y. A novel hybrid firefly algorithm for global optimization. PLoS ONE 2016, 11, e0163230.
- Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513.
- Cheng, M.Y.; Prayogo, D. Fuzzy adaptive teaching–learning-based optimization for global numerical optimization. Neural Comput. Appl. 2016, 29, 309–327.
- Strumberger, I.; Bacanin, N.; Tuba, M.; Tuba, E. Resource scheduling in cloud computing based on a hybridized whale optimization algorithm. Appl. Sci. 2019, 9, 4893.
- Fong, S.; Deb, S.; Yang, X.S. A heuristic optimization method inspired by wolf preying behavior. Neural Comput. Appl. 2015, 26, 1725–1738.
- de Melo, V.V.; Banzhaf, W. Drone Squadron Optimization: A novel self-adaptive algorithm for global numerical optimization. Neural Comput. Appl. 2017, 30, 1–28.
- Hedar, A.R.; Ong, B.T.; Fukushima, M. Genetic Algorithms with Automatic Accelerated Termination; Technical Report; Department of Applied Mathematics and Physics, Kyoto University: Kyoto, Japan, 2007; Volume 2.
- Hedar, A.R.; Deabes, W.; Amin, H.H.; Almaraashi, M.; Fukushima, M. Global Sensing Search for Nonlinear Global Optimization. J. Glob. Optim. 2020, submitted.
- Aarts, E.H.L.; Eikelder, H.M.M.T. Simulated Annealing. In Handbook of Applied Optimization; Pardalos, P., Resende, M., Eds.; Oxford University Press: Oxford, UK, 2002; pp. 209–220.
- Drack, L.; Zadeh, H. Soft computing in engineering design optimisation. J. Intell. Fuzzy Syst. 2006, 17, 353–365.
- Aarts, E.; Lenstra, J. Local Search in Combinatorial Optimization; Princeton University Press: Princeton, NJ, USA, 2003.
- Miki, M.; Hiroyasu, T.; Ono, K. Simulated annealing with advanced adaptive neighborhood. In Second International Workshop on Intelligent Systems Design and Application; Dynamic Publishers, Inc.: Atlanta, GA, USA, 2002; pp. 113–118.
- Locatelli, M. Simulated annealing algorithms for continuous global optimization. Handb. Glob. Optim. 2002, 2, 179–229.
- Nolle, L.; Goodyear, A.; Hopgood, A.; Picton, P.; Braithwaite, N. On Step Width Adaptation in Simulated Annealing for Continuous Parameter Optimisation. In Computational Intelligence. Theory and Applications; Reusch, B., Ed.; Springer: Berlin/Heidelberg, Germany, 2001; Volume 2206, pp. 589–598.
- White, S.R. Concepts of scale in simulated annealing. In AIP Conference Proceedings; American Institute of Physics: Melville, NY, USA, 1984; pp. 261–270.
- Hedar, A.R.; Fukushima, M. Hybrid simulated annealing and direct search method for nonlinear unconstrained global optimization. Optim. Methods Softw. 2002, 17, 891–912.
- Hedar, A.R.; Fukushima, M. Heuristic pattern search and its hybridization with simulated annealing for nonlinear global optimization. Optim. Methods Softw. 2004, 19, 291–308.
- Henderson, D.; Jacobson, S.H.; Johnson, A.W. The theory and practice of simulated annealing. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2003; pp. 287–319.
- Garibaldi, J.; Ifeachor, E. Application of simulated annealing fuzzy model tuning to umbilical cord acid-base interpretation. IEEE Trans. Fuzzy Syst. 1999, 7, 72–84.
- Hedar, A.R.; Ali, A.F. Tabu search with multi-level neighborhood structures for high dimensional problems. Appl. Intell. 2012, 37, 189–206.
- Liang, J.; Suganthan, P.; Deb, K. Novel composition test functions for numerical global optimization. In Proceedings of the 2005 IEEE Swarm Intelligence Symposium (SIS 2005), Pasadena, CA, USA, 8–10 June 2005; IEEE: Piscataway, NJ, USA, 2005; pp. 68–75.
- Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. KanGAL Rep. 2005, 2005005.
- Hedar, A.R.; Ali, A.F. Genetic algorithm with population partitioning and space reduction for high dimensional problems. In Proceedings of the 2009 International Conference on Computer Engineering & Systems, Cairo, Egypt, 14–16 December 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 151–156.
- Hedar, A.R.; Ali, A.F.; Abdel-Hamid, T.H. Genetic algorithm and tabu search based methods for molecular 3D-structure prediction. Numer. Algebr. Control Optim. 2011, 1, 191.
- García, S.; Fernández, A.; Luengo, J.; Herrera, F. A study of statistical techniques and performance measures for genetics-based machine learning: Accuracy and interpretability. Soft Comput. 2009, 13, 959.
- Sheskin, D.J. Handbook of Parametric and Nonparametric Statistical Procedures; CRC Press: Boca Raton, FL, USA, 2003.
- Zar, J.H. Biostatistical Analysis; Pearson Higher Ed: San Francisco, CA, USA, 2013.
- Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18.
- García, S.; Molina, D.; Lozano, M.; Herrera, F. A study on the use of non-parametric tests for analyzing the evolutionary algorithms’ behaviour: A case study on the CEC’2005 special session on real parameter optimization. J. Heuristics 2009, 15, 617–644.
- García-Martínez, C.; Lozano, M. Hybrid real-coded genetic algorithms with female and male differentiation. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 896–903.
- Molina, D.; Herrera, F.; Lozano, M. Adaptive local search parameters for real-coded memetic algorithms. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 888–895.
- Posik, P. Real-Parameter Optimization Using the Mutation Step Co-evolution. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; pp. 872–879.
- Ronkkonen, J.; Kukkonen, S.; Price, K.V. Real-parameter optimization with differential evolution. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 506–513.
- Liang, J.J.; Suganthan, P.N. Dynamic multi-swarm particle swarm optimizer with local search. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 522–528.
- Yuan, B.; Gallagher, M. Experimental results for the special session on real-parameter optimization at CEC 2005: A simple, continuous EDA. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 2, pp. 1792–1799.
- Auger, A.; Hansen, N. A restart CMA evolution strategy with increasing population size. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 2, pp. 1769–1776.
- Auger, A.; Hansen, N. Performance evaluation of an advanced local search evolutionary algorithm. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 2, pp. 1777–1784.
- Sinha, A.; Tiwari, S.; Deb, K. A population-based, steady-state procedure for real-parameter optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 514–521.
- Qin, A.K.; Suganthan, P.N. Self-adaptive differential evolution algorithm for numerical optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 2, pp. 1785–1791.
- Ballester, P.J.; Stephenson, J.; Carter, J.N.; Gallagher, K. Real-parameter optimization performance study on the CEC-2005 benchmark with SPC-PNX. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, IEEE CEC 2005, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 498–505.
- Al-Betar, M.A.; Khader, A.T.; Awadallah, M.A.; Alawan, M.H.; Zaqaibeh, B. Cellular harmony search for optimization problems. J. Appl. Math. 2013, 2013.
- Al-Betar, M.A.; Khader, A.T.; Doush, I.A. Memetic techniques for examination timetabling. Ann. Oper. Res. 2014, 218, 23–50.
- Al-Betar, M.A. β-Hill climbing: An exploratory local search. Neural Comput. Appl. 2017, 28, 153–168.
Common Settings Used by Both Methods |
---|---
No. of variables (n) | As described in the function definitions (Appendix A and Appendix B).
Lower bound of each variable | As described in the function definitions (Appendix A and Appendix B).
Upper bound of each variable | As described in the function definitions (Appendix A and Appendix B).
No. of Markov chains (M) | 60
Length of each Markov chain (l) |
Maximum no. of iterations | M × l
Initial solution | Random moves.
Initial temperature | The standard deviation of the costs of 100 random moves, based on [66].
Cooling schedule | T ← αT, where 0 < α < 1.
Termination criteria | Reaching the maximum number of iterations.
No. of independent runs for each function | 30
SAES configurations |
No. of GM partitions |
Diversification index | No. of visited partitions / No. of all partitions.
Diversification threshold | 0.04
Diversification stopping criteria | Diversification index ≥ 0.9, or reaching 30% of the Markov chains (at most 18 of the 60).
Final intensification budget | 500n function evaluations.
Table: average and minimum errors of SA and the two SAES variants on the classical test functions.
Table: average and minimum errors of SA and the two SAES variants on the hard test functions.
Table: per-function processing times of SA and the two SAES variants.
Criterion | Method 1 | Method 2 | No. of Beats | R+ | R− | p-Value | Best Method
---|---|---|---|---|---|---|---
Average Errors | SA | SAES | 9/14 | 205.5 | 119.5 | 0.9536 | –
Average Errors | SA | SAES | 3/22 | 300 | 25 | 0.0062 | SAES
Average Errors | SAES | SAES | 2/23 | 306 | 19 | 0.0066 | SAES
Minimum Errors | SA | SAES | 7/15 | 222 | 103 | 0.8996 | –
Minimum Errors | SA | SAES | 3/22 | 303 | 22 | 0.0025 | SAES
Minimum Errors | SAES | SAES | 3/22 | 304 | 21 | 0.0026 | SAES
Processing Time | SA | SAES | 9/9 | 177 | 174 | 1.0000 | –
Processing Time | SA | SAES | 26/0 | 0 | 351 | 0.6018 | –
Processing Time | SAES | SAES | 26/0 | 0 | 351 | 0.6341 | –
Criterion | Method 1 | Method 2 | No. of Beats | R+ | R− | p-Value | Best Method
---|---|---|---|---|---|---|---
Average Errors | SA | SAES | 8/17 | 215 | 110 | 0.8614 | –
Average Errors | SA | SAES | 2/23 | 320 | 5 | 0.0049 | SAES
Average Errors | SAES | SAES | 1/24 | 324 | 1 | 0.0049 | SAES
Minimum Errors | SA | SAES | 12/12 | 187.5 | 137.5 | 0.9459 | –
Minimum Errors | SA | SAES | 8/17 | 259 | 66 | 0.0036 | SAES
Minimum Errors | SAES | SAES | 6/19 | 288 | 37 | 0.0030 | SAES
Processing Time | SA | SAES | 21/4 | 23.5 | 301.5 | 0.7636 | –
Processing Time | SA | SAES | 24/1 | 1 | 324 | 0.0052 | SA
Processing Time | SAES | SAES | 24/1 | 1 | 324 | 0.0052 | SAES
Table: average errors of BLX-GL50, BLX-MA, CoEVO, DE, DMS-L-PSO, EDA, IPOP-CMA-ES, K-PCX, LR-CMA-ES, L-SaDE, SPC-PNX, cHS, HC, β-HC, and SAES on the hard test functions h1–h25.
Criterion | Method 1 | Method 2 | No. of Beats | R+ | R− | p-Value | Best Method
---|---|---|---|---|---|---|---
Average Errors | BLX-GL50 | SAES | 16/8 | 133.5 | 191.5 | 0.9458 | –
Average Errors | BLX-MA | SAES | 11/14 | 209 | 116 | 0.7415 | –
Average Errors | CoEVO | SAES | 10/15 | 232 | 93 | 0.6344 | –
Average Errors | DE | SAES | 17/7 | 88.5 | 236.5 | 0.5935 | –
Average Errors | DMS-L-PSO | SAES | 13/11 | 151.5 | 173.5 | 0.5802 | –
Average Errors | EDA | SAES | 12/12 | 169.5 | 155.5 | 1.0000 | –
Average Errors | IPOP-CMA-ES | SAES | 18/4 | 59 | 266 | 0.3361 | –
Average Errors | K-PCX | SAES | 11/13 | 205.5 | 119.5 | 0.8537 | –
Average Errors | LR-CMA-ES | SAES | 13/11 | 176.5 | 148.5 | 0.7634 | –
Average Errors | L-SaDE | SAES | 16/7 | 113.5 | 211.5 | 0.5605 | –
Average Errors | SPC-PNX | SAES | 15/9 | 133 | 192 | 0.9304 | –
Average Errors | cHS | SAES | 3/22 | 311 | 14 | 0.0055 | SAES
Average Errors | HC | SAES | 2/23 | 318 | 7 | 0.0034 | SAES
Average Errors | β-HC | SAES | 11/13 | 205.5 | 119.5 | 0.4788 | –
Function Evaluations | All methods | SAES | 0/25 | 325 | 0 | 2.77E−12 | SAES
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).