Refining the Eel and Grouper Optimizer with Intelligent Modifications for Global Optimization
Abstract
1. Introduction
1. Neutralism, where two species can live together without affecting each other.
2. Predation, where one organism feeds on another, causing its death.
3. Parasitism, where one species benefits by causing harm to another.
4. Competition, where organisms of the same or of different species compete for the same resources.
5. Mutualism, where both species benefit from the interaction.
- The addition of a sampling technique based on the K-means method [69,70,71]. The sampling points make it possible to locate the global minimum of the function more efficiently, and points that lie close to each other are discarded. Population initialization is a crucial factor in evolutionary techniques, since it can help them locate the global minimum more efficiently, and a multitude of research works in this direction have appeared in recent years. For example, Maaranen et al. [72] applied quasi-random sequences to the initial population of a genetic algorithm, and Paul et al. [73] suggested a Vari-begin and Vari-diversity (VV) seeding method for initializing the population of genetic algorithms. Ali et al. proposed a series of initialization methods for the Differential Evolution method [74]. A novel method that initializes the population of evolutionary algorithms using clustering and Cauchy deviates is suggested in the work of Bajer et al. [75], and a systematic review of initialization techniques for evolutionary algorithms can be found in the work of Kazimipour et al. [76].
- The use of a termination technique based on stochastic measurements. At each iteration of the algorithm, the minimum value found so far is recorded; when this value remains constant for a predefined number of iterations, the process is terminated. The method therefore stops without wasting execution time on unnecessary iterations, avoiding needless consumption of computing resources. Several techniques for terminating optimization methods can be found in the recent bibliography. An overview of methods used to terminate evolutionary algorithms can be found in the work of Jain et al. [77], and Zielinski et al. outlined stopping rules used specifically in the Differential Evolution method [78]. Recently, Ghoreishi et al. published a literature study of various termination criteria for evolutionary algorithms [79], and Ravber et al. performed extended research on the impact of the maximum number of iterations on the effectiveness of evolutionary algorithms [80].
- The application of randomness in defining the range of positions of the candidate solutions.
2. The Proposed Method
2.1. The Main Steps of the Algorithm
Algorithm 1 EGO algorithm
1. The members of the population are initialized using a procedure that incorporates the K-means algorithm; this procedure is fully described in Section 2.2. The purpose of this sampling technique is to produce, through systematic clustering, points that lie close to the local minima of the objective function, which can significantly reduce the time required for the technique to complete. The samples used in the proposed algorithm are the centers computed by the K-means algorithm. This sampling technique has also been utilized in some recent papers on global optimization, such as the extended version of the Optimal Foraging algorithm [82] or the new genetic algorithm proposed by Charilogis et al. [83].
2. The second modification is the stopping rule invoked at every step of the algorithm. This rule measures the difference between the best fitness values obtained in consecutive iterations. If this difference remains small for a consecutive number of iterations, the algorithm is unlikely to find a lower value for the global minimum and should stop. This stopping rule has been applied in recent years in various methods, such as the improved parallel PSO method of Charilogis and Tsoulos [84], the improved version of the Giant-Armadillo optimization method suggested by Kyrou et al. [85], and the extended version of the Optimal Foraging Algorithm proposed by Kyrou et al. [82]. Of course, this rule is general enough to be applied in any global optimization procedure.
3. The third modification is the m flag, which controls the randomness in the range of candidate solutions. When this flag is set to 2, the critical parameters are calculated using random numbers.
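The difference-based stopping rule of modification 2 can be sketched in a few lines. The function name, the tolerance `tol`, and the patience window below are illustrative assumptions, not the paper's exact parameters:

```python
def should_terminate(best_values, tol=1e-8, patience=5):
    """Stop when the best fitness value has changed by at most `tol`
    for `patience` consecutive iterations (illustrative sketch)."""
    if len(best_values) <= patience:
        return False
    recent = best_values[-(patience + 1):]
    # every consecutive difference in the window must be small
    return all(abs(recent[i] - recent[i + 1]) <= tol
               for i in range(patience))
```

In use, the optimizer would append the best value found at each iteration to a history list and call `should_terminate(history)` before starting the next iteration.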
Algorithm 2 The basic steps of the algorithm accompanied by the proposed modifications
2.2. The Used Sampling Procedure
Algorithm 3 K-means algorithm
3. Results
3.1. Test Functions
- Ackley’s function:
- Bf1 (Bohachevsky 1) function:
- Bf2 (Bohachevsky 2) function:
- Bf3 (Bohachevsky 3) function:
- Branin function:
- Camel function:
- Easom function:
- Equal maxima function, defined as:
- Exponential function, with the following definition: The values were used in the conducted experiments.
- F9 test function:
- Extended F10 function:
- F14 function:
- F15 function:
- F17 function:
- Five-uneven-peak trap function:
- Himmelblau’s function:
- Griewank2 function:
- Griewank10 function, given by the equation
- Goldstein and Price’s function
- Hansen’s function:
- Hartman 3 function:
- Hartman 6 function:
- Potential function: this function represents the energy of a molecular conformation of N atoms, whose interaction is determined by the Lennard–Jones potential [101]. The definition of this potential is: For the conducted experiments, the values were used.
- Rastrigin’s function.
- Rosenbrock function. The values were incorporated in the conducted experiments.
- Shekel 5 function.
- Shekel 7 function.
- Shekel 10 function.
- Sinusoidal function, defined as: The values were incorporated in the conducted experiments.
- Schaffer’s function:
- Schwefel221 function:
- Schwefel222 function:
- Shubert function:
- Sphere function:
- Test2N function: The values were incorporated for the conducted experiments.
- Test30N function: For the conducted experiments, the values were used.
- Uneven decreasing maxima function:
- Vincent’s function:
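Most of the benchmarks above have standard forms in the global-optimization literature. As a hedged illustration (the paper's exact parameterizations, bounds, and dimensions may differ), two of them can be coded as:

```python
import numpy as np

def ackley(x):
    """Commonly used form of Ackley's function; global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
            + 20.0 + np.e)

def rastrigin(x):
    """Commonly used form of Rastrigin's function; global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))
```

Both are highly multimodal, which is what makes them useful for stressing the exploration behavior of population-based optimizers.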
3.2. Experimental Results
- The column Function denotes the test function used.
- The column EGO denotes the original method, without the modifications suggested in this work.
- The column EEGO denotes the proposed method; the corresponding settings are shown in Table 2.
- The row Sum reports the total number of function calls over all problems.
- If a method failed to find the global minimum in every run, this is noted in the corresponding table by a success rate enclosed in parentheses next to the average number of function calls.
1. The column uniform denotes the use of uniform sampling in the current method.
2. The column triangular denotes the use of the triangular distribution [104] for sampling.
3. The column Maxwell denotes the application of the Maxwell distribution [105] to produce the initial samples for the method.
4. The column K-means denotes the use of the method described in Section 2.2 to produce the initial samples for the method.
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Törn, A.; Ali, M.M.; Viitanen, S. Stochastic global optimization: Problem classes and solution techniques. J. Glob. Optim. 1999, 14, 437–447. [Google Scholar] [CrossRef]
- Floudas, C.A.; Pardalos, P.M. (Eds.) State of the Art in Global Optimization: Computational Methods and Applications; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Horst, R.; Pardalos, P.M. Handbook of Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013; Volume 2. [Google Scholar]
- Intriligator, M.D. Mathematical Optimization and Economic Theory; Society for Industrial and Applied Mathematics, SIAM: New Delhi, India, 2002. [Google Scholar]
- Cánovas, M.J.; Kruger, A.; Phu, H.X.; Théra, M. Marco A. López, a Pioneer of Continuous Optimization in Spain. Vietnam. J. Math. 2020, 48, 211–219. [Google Scholar] [CrossRef]
- Mahmoodabadi, M.J.; Nemati, A.R. A novel adaptive genetic algorithm for global optimization of mathematical test functions and real-world problems. Eng. Sci. Technol. Int. J. 2016, 19, 2002–2021. [Google Scholar] [CrossRef]
- Li, J.; Xiao, X.; Boukouvala, F.; Floudas, C.A.; Zhao, B.; Du, G.; Su, X.; Liu, H. Data-driven mathematical modeling and global optimization framework for entire petrochemical planning operations. AIChE J. 2016, 62, 3020–3040. [Google Scholar] [CrossRef]
- Iuliano, E. Global optimization of benchmark aerodynamic cases using physics-based surrogate models. Aerosp. Sci. Technol. 2017, 67, 273–286. [Google Scholar] [CrossRef]
- Duan, Q.; Sorooshian, S.; Gupta, V. Effective and efficient global optimization for conceptual rainfall-runoff models. Water Resour. Res. 1992, 28, 1015–1031. [Google Scholar] [CrossRef]
- Yang, L.; Robin, D.; Sannibale, F.; Steier, C.; Wan, W. Global optimization of an accelerator lattice using multiobjective genetic algorithms. Nucl. Instrum. Methods Phys. Res. Sect. Accel. Spectrom. Detect. Assoc. Equip. 2009, 609, 50–57. [Google Scholar] [CrossRef]
- Heiles, S.; Johnston, R.L. Global optimization of clusters using electronic structure methods. Int. J. Quantum Chem. 2013, 113, 2091–2109. [Google Scholar] [CrossRef]
- Shin, W.H.; Kim, J.K.; Kim, D.S.; Seok, C. GalaxyDock2: Protein–ligand docking using beta-complex and global optimization. J. Comput. Chem. 2013, 34, 2647–2656. [Google Scholar] [CrossRef] [PubMed]
- Liwo, A.; Lee, J.; Ripoll, D.R.; Pillardy, J.; Scheraga, H.A. Protein structure prediction by global optimization of a potential energy function. Proc. Natl. Acad. Sci. USA 1999, 96, 5482–5485. [Google Scholar] [CrossRef]
- Houssein, E.H.; Hosney, M.E.; Mohamed, W.M.; Ali, A.A.; Younis, E.M. Fuzzy-based hunger games search algorithm for global optimization and feature selection using medical data. Neural Comput. Appl. 2023, 35, 5251–5275. [Google Scholar] [CrossRef]
- Ion, I.G.; Bontinck, Z.; Loukrezis, D.; Römer, U.; Lass, O.; Ulbrich, S.; Schöps, S.; De Gersem, H. Robust shape optimization of electric devices based on deterministic optimization methods and finite-element analysis with affine parametrization and design elements. Electr. Eng. 2018, 100, 2635–2647. [Google Scholar] [CrossRef]
- Cuevas-Velásquez, V.; Sordo-Ward, A.; García-Palacios, J.H.; Bianucci, P.; Garrote, L. Probabilistic model for real-time flood operation of a dam based on a deterministic optimization model. Water 2020, 12, 3206. [Google Scholar] [CrossRef]
- Pereyra, M.; Schniter, P.; Chouzenoux, E.; Pesquet, J.C.; Tourneret, J.Y.; Hero, A.O.; McLaughlin, S. A survey of stochastic simulation and optimization methods in signal processing. IEEE J. Sel. Top. Signal Process. 2015, 10, 224–241. [Google Scholar] [CrossRef]
- Hannah, L.A. Stochastic optimization. Int. Encycl. Soc. Behav. Sci. 2015, 2, 473–481. [Google Scholar]
- Kizielewicz, B.; Sałabun, W. A new approach to identifying a multi-criteria decision model based on stochastic optimization techniques. Symmetry 2020, 12, 1551. [Google Scholar] [CrossRef]
- Chen, T.; Sun, Y.; Yin, W. Solving stochastic compositional optimization is nearly as easy as solving stochastic optimization. IEEE Trans. Signal Process. 2021, 69, 4937–4948. [Google Scholar] [CrossRef]
- Wolfe, M.A. Interval methods for global optimization. Appl. Math. Comput. 1996, 75, 179–206. [Google Scholar]
- Csendes, T.; Ratz, D. Subdivision direction selection in interval methods for global optimization. SIAM J. Numer. Anal. 1997, 34, 922–938. [Google Scholar] [CrossRef]
- Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 453. [Google Scholar] [CrossRef]
- Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
- Liu, J.; Lampinen, J. A fuzzy adaptive differential evolution algorithm. Soft Comput. 2005, 9, 448–462. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, USA, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization: An overview. Swarm Intell. 2007, 1, 33–57. [Google Scholar] [CrossRef]
- Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett. 2003, 85, 317–325. [Google Scholar] [CrossRef]
- Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
- Socha, K.; Dorigo, M. Ant colony optimization for continuous domains. Eur. J. Oper. Res. 2008, 185, 1155–1173. [Google Scholar] [CrossRef]
- Goldberg, D. Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley Publishing Company: Reading, MA, USA, 1989. [Google Scholar]
- Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs; Springer: Berlin/Heidelberg, Germany, 1999. [Google Scholar]
- Abdel-Basset, M.; El-Shahat, D.; Jameel, M.; Abouhawwash, M. Exponential distribution optimizer (EDO): A novel math-inspired algorithm for global optimization and engineering problems. Artif. Intell. Rev. 2023, 56, 9329–9400. [Google Scholar] [CrossRef]
- Ma, L.; Cheng, S.; Shi, Y. Enhancing learning efficiency of brain storm optimization via orthogonal learning design. IEEE Trans. Syst. Man Cybern. Syst. 2020, 51, 6723–6742. [Google Scholar] [CrossRef]
- Zhou, Y.; Tan, Y. GPU-based parallel particle swarm optimization. In Proceedings of the 2009 IEEE Congress on Evolutionary Computation, Trondheim, Norway, 18–21 May 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 1493–1500. [Google Scholar]
- Dawson, L.; Stewart, I. Improving Ant Colony Optimization performance on the GPU using CUDA. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1901–1908. [Google Scholar]
- Barkalov, K.; Gergel, V. Parallel global optimization on GPU. J. Glob. Optim. 2016, 66, 3–20. [Google Scholar] [CrossRef]
- Hassanien, A.E.; Emary, E. Swarm Intelligence: Principles, Advances, and Applications; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Tang, J.; Liu, G.; Pan, Q. A review on representative swarm intelligence algorithms for solving optimization problems: Applications and trends. IEEE/CAA J. Autom. Sin. 2021, 8, 1627–1643. [Google Scholar] [CrossRef]
- Brezočnik, L.; Fister, I., Jr.; Podgorelec, V. Swarm intelligence algorithms for feature selection: A review. Appl. Sci. 2018, 8, 1521. [Google Scholar] [CrossRef]
- Chu, Y.; Mi, H.; Liao, H.; Ji, Z.; Wu, Q.H. A fast bacterial swarming algorithm for high-dimensional function optimization. In Proceedings of the 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–6 June 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 3135–3140. [Google Scholar]
- Neshat, M.; Sepidnam, G.; Sargolzaei, M.; Toosi, A.N. Artificial fish swarm algorithm: A survey of the state-of-the-art, hybridization, combinatorial and indicative applications. Artif. Intell. Rev. 2014, 42, 965–997. [Google Scholar] [CrossRef]
- Wu, T.Q.; Yao, M.; Yang, J.H. Dolphin swarm algorithm. Front. Inf. Technol. Electron. Eng. 2016, 17, 717–729. [Google Scholar] [CrossRef]
- Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
- Nasiri, J.; Khiyabani, F.M. A whale optimization algorithm (WOA) approach for clustering. Cogent Math. Stat. 2018, 5, 1483565. [Google Scholar] [CrossRef]
- Gharehchopogh, F.S.; Gholizadeh, H. A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm Evol. Comput. 2019, 48, 1–24. [Google Scholar] [CrossRef]
- Wang, J.; Bei, J.; Song, H.; Zhang, H.; Zhang, P. A whale optimization algorithm with combined mutation and removing similarity for global optimization and multilevel thresholding image segmentation. Appl. Soft Comput. 2023, 137, 110130. [Google Scholar] [CrossRef]
- Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
- Wan, Y.; Mao, M.; Zhou, L.; Zhang, Q.; Xi, X.; Zheng, C. A novel nature-inspired maximum power point tracking (MPPT) controller based on SSA-GWO algorithm for partially shaded photovoltaic systems. Electronics 2019, 8, 680. [Google Scholar] [CrossRef]
- Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
- Bairathi, D.; Gopalani, D. Salp swarm algorithm (SSA) for training feed-forward neural networks. In Proceedings of the Soft Computing for Problem Solving: SocProS, Bhubaneswar, India, 23–24 December 2017; Springer: Singapore, 2019; Volume 1, pp. 521–534. [Google Scholar]
- Abualigah, L.; Shehab, M.; Alshinwan, M.; Alabool, H. Salp swarm algorithm: A comprehensive survey. Neural Comput. Appl. 2020, 32, 11195–11215. [Google Scholar] [CrossRef]
- Karaboga, D.; Akay, B. A comparative study of artificial bee colony algorithm. Appl. Math. Comput. 2009, 214, 108–132. [Google Scholar] [CrossRef]
- Abdullahi, M.; Ngadi, M.A.; Dishing, S.I.; Abdulhamid, S.I.M.; Usman, M.J. A survey of symbiotic organisms search algorithms and applications. Neural Comput. Appl. 2020, 32, 547–566. [Google Scholar] [CrossRef]
- Wang, Y.; DeAngelis, D.L. A mutualism-parasitism system modeling host and parasite with mutualism at low density. Math. Biosci. Eng. 2012, 9, 431–444. [Google Scholar]
- Aubier, T.G.; Joron, M.; Sherratt, T.N. Mimicry among unequally defended prey should be mutualistic when predators sample optimally. Am. Nat. 2017, 189, 267–282. [Google Scholar] [CrossRef]
- Addicott, J.F. Competition in mutualistic systems. In The biology of Mutualism: Ecology and Evolution; Croom Helm: London, UK, 1985; pp. 217–247. [Google Scholar]
- Bshary, R.; Hohner, A.; Ait-el-Djoudi, K.; Fricke, H. Interspecific communicative and coordinated hunting between groupers and giant moray eels in the Red Sea. PLoS Biol. 2006, 4, e431. [Google Scholar] [CrossRef]
- Mohammadzadeh, A.; Mirjalili, S. Eel and Grouper Optimizer: A Nature-Inspired Optimization Algorithm; Springer Science+Business Media, LLC.: Berlin/Heidelberg, Germany, 2024. [Google Scholar]
- Gogu, A.; Nace, D.; Dilo, A.; Meratnia, N.; Ortiz, J.H. Review of optimization problems in wireless sensor networks. In Telecommunications Networks—Current Status and Future Trends; BoD: Norderstedt, Germany, 2012; pp. 153–180. [Google Scholar]
- Goudos, S.K.; Boursianis, A.D.; Mohamed, A.W.; Wan, S.; Sarigiannidis, P.; Karagiannidis, G.K.; Suganthan, P.N. Large Scale Global Optimization Algorithms for IoT Networks: A Comparative Study. In Proceedings of the 2021 17th International Conference on Distributed Computing in Sensor Systems (DCOSS), Pafos, Cyprus, 14–16 July 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 272–279. [Google Scholar]
- Arayapan, K.; Warunyuwong, P. Logistics Optimization: Application of Optimization Modeling in Inbound Logistics. Master’s Thesis, Mälardalen University, Västerås, Sweden, 2009. [Google Scholar]
- Singh, S.P.; Dhiman, G.; Juneja, S.; Viriyasitavat, W.; Singal, G.; Kumar, N.; Johri, P. A New QoS Optimization in IoT-Smart Agriculture Using Rapid Adaption Based Nature-Inspired Approach. IEEE Internet Things J. 2023, 11, 5417–5426. [Google Scholar] [CrossRef]
- Wang, H.; Ersoy, O.K. A novel evolutionary global optimization algorithm and its application in bioinformatics. ECE Tech. Rep. 2005, 65. Available online: https://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=1065&context=ecetr (accessed on 7 September 2024).
- Cassioli, A.; Di Lorenzo, D.; Locatelli, M.; Schoen, F.; Sciandrone, M. Machine learning for global optimization. Comput. Optim. Appl. 2012, 51, 279–303. [Google Scholar] [CrossRef]
- Houssein, E.H.; Helmy, B.E.D.; Elngar, A.A.; Abdelminaam, D.S.; Shaban, H. An improved tunicate swarm algorithm for global optimization and image segmentation. IEEE Access 2021, 9, 56066–56092. [Google Scholar] [CrossRef]
- Torun, H.M.; Swaminathan, M. High-dimensional global optimization method for high-frequency electronic design. IEEE Trans. Microw. Theory Tech. 2019, 67, 2128–2142. [Google Scholar] [CrossRef]
- Wang, L.; Kan, J.; Guo, J.; Wang, C. 3D path planning for the ground robot with improved ant colony optimization. Sensors 2019, 19, 815. [Google Scholar] [CrossRef] [PubMed]
- Arora, P.; Varshney, S. Analysis of k-means and k-medoids algorithm for big data. Procedia Comput. Sci. 2016, 78, 507–512. [Google Scholar] [CrossRef]
- Ahmed, M.; Seraj, R.; Islam, S.M.S. The k-means algorithm: A comprehensive survey and performance evaluation. Electronics 2020, 9, 1295. [Google Scholar] [CrossRef]
- MacQueen, J.B. Some Methods for classification and Analysis of Multivariate Observations. In Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, Oakland, CA, USA, 21 June–18 July 1967; pp. 281–297. [Google Scholar]
- Maaranen, H.; Miettinen, K.; Mäkelä, M.M. Quasi-random initial population for genetic algorithms. Comput. Math. Appl. 2004, 47, 1885–1895. [Google Scholar] [CrossRef]
- Paul, P.V.; Dhavachelvan, P.; Baskaran, R. A novel population initialization technique for genetic algorithm. In Proceedings of the 2013 International Conference on Circuits, Power and Computing Technologies (ICCPCT), Nagercoil, India, 20–21 March 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1235–1238. [Google Scholar]
- Ali, M.; Pant, M.; Abraham, A. Unconventional initialization methods for differential evolution. Appl. Math. Comput. 2013, 219, 4474–4494. [Google Scholar] [CrossRef]
- Bajer, D.; Martinović, G.; Brest, J. A population initialization method for evolutionary algorithms based on clustering and Cauchy deviates. Expert Syst. Appl. 2016, 60, 294–310. [Google Scholar] [CrossRef]
- Kazimipour, B.; Li, X.; Qin, A.K. A review of population initialization techniques for evolutionary algorithms. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 2585–2592. [Google Scholar]
- Jain, B.J.; Pohlheim, H.; Wegener, J. On termination criteria of evolutionary algorithms. In Proceedings of the 3rd Annual Conference on Genetic and Evolutionary Computation, San Francisco, CA, USA, 7–11 July 2001; p. 768. [Google Scholar]
- Zielinski, K.; Weitkemper, P.; Laur, R.; Kammeyer, K.D. Examination of stopping criteria for differential evolution based on a power allocation problem. In Proceedings of the 10th International Conference on Optimization of Electrical and Electronic Equipment, Brasov, Romania, 18–20 May 2006; Volume 3, pp. 149–156. [Google Scholar]
- Ghoreishi, S.N.; Clausen, A.; Jørgensen, B.N. Termination Criteria in Evolutionary Algorithms: A Survey. In Proceedings of the IJCCI, Funchal, Portugal, 1–3 November 2017; pp. 373–384. [Google Scholar]
- Ravber, M.; Liu, S.H.; Mernik, M.; Črepinšek, M. Maximum number of generations as a stopping criterion considered harmful. Appl. Soft Comput. 2022, 128, 109478. [Google Scholar] [CrossRef]
- Charilogis, V.; Tsoulos, I.G. Toward an ideal particle swarm optimizer for multidimensional functions. Information 2022, 13, 217. [Google Scholar] [CrossRef]
- Kyrou, G.; Charilogis, V.; Tsoulos, I.G. EOFA: An Extended Version of the Optimal Foraging Algorithm for Global Optimization Problems. Computation 2024, 12, 158. [Google Scholar] [CrossRef]
- Charilogis, V.; Tsoulos, I.G.; Stavrou, V.N. An Intelligent Technique for Initial Distribution of Genetic Algorithms. Axioms 2023, 12, 980. [Google Scholar] [CrossRef]
- Charilogis, V.; Tsoulos, I.G.; Tzallas, A. An improved parallel particle swarm optimization. SN Comput. Sci. 2023, 4, 766. [Google Scholar] [CrossRef]
- Kyrou, G.; Charilogis, V.; Tsoulos, I.G. Improving the Giant-Armadillo Optimization Method. Analytics 2024, 3, 225–240. [Google Scholar] [CrossRef]
- Li, Y.; Wu, H. A clustering method based on K-means algorithm. Phys. Procedia 2012, 25, 1104–1109. [Google Scholar] [CrossRef]
- Ali, H.H.; Kadhum, L.E. K-means clustering algorithm applications in data mining and pattern recognition. Int. J. Sci. Res. (IJSR) 2017, 6, 1577–1584. [Google Scholar]
- Krishna, K.; Murty, M.N. Genetic K-means algorithm. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1999, 29, 433–439. [Google Scholar] [CrossRef]
- Sinaga, K.P.; Yang, M.S. Unsupervised K-means clustering algorithm. IEEE Access 2020, 8, 80716–80727. [Google Scholar] [CrossRef]
- Ay, M.; Özbakır, L.; Kulluk, S.; Gülmez, B.; Öztürk, G.; Özer, S. FC-Kmeans: Fixed-centered K-means algorithm. Expert Syst. Appl. 2023, 211, 118656. [Google Scholar] [CrossRef]
- Oti, E.U.; Olusola, M.O.; Eze, F.C.; Enogwe, S.U. Comprehensive review of K-Means clustering algorithms. Criterion 2021, 12, 22–23. [Google Scholar] [CrossRef]
- Ali, M.M.; Khompatraporn, C.; Zabinsky, Z.B. A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems. J. Glob. Optim. 2005, 31, 635–672. [Google Scholar] [CrossRef]
- Floudas, C.A.; Pardalos, P.M.; Adjiman, C.; Esposito, W.R.; Gümüs, Z.H.; Harding, S.T.; Klepeis, J.L.; Meyer, C.A.; Schweiger, C.A. Handbook of Test Problems in Local and Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013; Volume 33. [Google Scholar]
- Ali, M.M.; Kaelo, P. Improved particle swarm algorithms for global optimization. Appl. Math. Comput. 2008, 196, 578–593. [Google Scholar] [CrossRef]
- Koyuncu, H.; Ceylan, R. A PSO based approach: Scout particle swarm algorithm for continuous global optimization problems. J. Comput. Des. Eng. 2019, 6, 129–142. [Google Scholar] [CrossRef]
- Siarry, P.; Berthiau, G.; Durdin, F.; Haussy, J. Enhanced simulated annealing for globally minimizing functions of many-continuous variables. ACM Trans. Math. Softw. (TOMS) 1997, 23, 209–228. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Lagaris, I.E. GenMin: An enhanced genetic algorithm for global optimization. Comput. Phys. Commun. 2008, 178, 843–851. [Google Scholar] [CrossRef]
- LaTorre, A.; Molina, D.; Osaba, E.; Poyatos, J.; Del Ser, J.; Herrera, F. A prescription of methodological guidelines for comparing bio-inspired optimization algorithms. Swarm Evol. Comput. 2021, 67, 100973. [Google Scholar] [CrossRef]
- Li, X.; Engelbrecht, A.; Epitropakis, M.G. Benchmark Functions for CEC’2013 Special Session and Competition on Niching Methods for Multimodal Function Optimization; Technical Report; RMIT University, Evolutionary Computation and Machine Learning Group: Melbourne, Australia, 2013. [Google Scholar]
- Gaviano, M.; Kvasov, D.E.; Lera, D.; Sergeyev, Y.D. Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization. ACM Trans. Math. Softw. (TOMS) 2003, 29, 469–480. [Google Scholar] [CrossRef]
- Jones, J.E. On the Determination of Molecular Fields.—II. From the Equation of State of a Gas; Series A, Containing Papers of a Mathematical and Physical Character; Royal Society: London, UK, 1924; Volume 106, pp. 463–477. [Google Scholar]
- Powell, M.J.D. A Tolerant Algorithm for Linearly Constrained Optimization Calculations. Math. Program 1989, 45, 547–566. [Google Scholar] [CrossRef]
- Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Appl. Math. Comput. 2008, 203, 598–607. [Google Scholar] [CrossRef]
- Stein, W.E.; Keblis, M.F. A new method to simulate the triangular distribution. Math. Comput. Model. 2009, 49, 1143–1147. [Google Scholar] [CrossRef]
- Sharma, V.K.; Bakouch, H.S.; Suthar, K. An extended Maxwell distribution: Properties and applications. Commun. Stat. Simul. Comput. 2017, 46, 6982–7007. [Google Scholar] [CrossRef]
- Sengupta, R.; Pal, M.; Saha, S.; Bandyopadhyay, S. Uniform distribution driven adaptive differential evolution. Appl. Intell. 2020, 50, 3638–3659. [Google Scholar] [CrossRef]
- Glickman, T.S.; Xu, F. Practical risk assessment with triangular distributions. Int. J. Risk Assess. Manag. 2009, 13, 313–327. [Google Scholar] [CrossRef]
- Ishaq, A.I.; Abiodun, A.A. The Maxwell–Weibull distribution in modeling lifetime datasets. Ann. Data Sci. 2020, 7, 639–662. [Google Scholar] [CrossRef]
- Beretta, L.; Cohen-Addad, V.; Lattanzi, S.; Parotsidis, N. Multi-swap k-means++. Adv. Neural Inf. Process. Syst. 2023, 36, 26069–26091. [Google Scholar]
- Gropp, W.; Lusk, E.; Doss, N.; Skjellum, A. A high-performance, portable implementation of the MPI message passing interface standard. Parallel Comput. 1996, 22, 789–828. [Google Scholar] [CrossRef]
- Chandra, R. Parallel Programming in OpenMP; Academic Press: Cambridge, MA, USA, 2001. [Google Scholar]
Function | Dimension |
---|---|
Ackley | |
Bf1 | |
Bf2 | |
Bf3 | |
Branin | |
Camel | |
Easom | |
EQUAL_MAXIMA | |
EXP | |
EXTENDEDF10 | |
FIVE_UNEVEN | |
F9 | |
F14 | |
F15 | |
F17 | |
HIMMELBLAU | |
GKLS | |
GRIEWANK2 | |
GRIEWANK10 | |
HANSEN | |
HARTMAN3 | |
HARTMAN6 | |
POTENTIAL | |
RASTRIGIN | |
ROSENBROCK | |
SHEKEL5 | |
SHEKEL7 | |
SHEKEL10 | |
SHUBERT | |
SCHWEFEL221 | |
SCHWEFEL222 | |
SPHERE | |
TEST2N | |
SINU | |
TEST30N | |
UNEVEN_MAXIMA | |
VINCENT |
PARAMETER | MEANING | VALUE |
---|---|---|
| Number of chromosomes/particles | 200 |
| Maximum number of allowed iterations | 200 |
| Number of samples for the K-means | |
| Number of maximum iterations for the stopping rule | 5 |
| Selection rate of the genetic algorithm | 0.1 |
| Mutation rate of the genetic algorithm | 0.05 |
Function | Genetic | PSO | DE | EGO | EEGO |
---|---|---|---|---|---|
ACKLEY | 6749 (869) | 6885 (1108) | 10,220 (1342) | 8714 (1009) | 4199 (768) |
BF1 | 4007 (308) | 4142 (397) | 8268 (299) | 4762 (379) | 3228 (356) |
BF2 | 3794 (350) | 3752 (302) | 7913 (320) | 4299 (325) | 2815 (310) |
BF3 | 3480 (266) | 3306 (261) | 10,270 (294) | 3747 (303) | 2501 (218) |
BRANIN | 2376 (136) | 2548 (142) | 4101 (559) | 2659 (150) | 1684 (180) |
CAMEL | 2869 (195) | 2933 (180) | 5609 (530) | 3317 (229) | 2262 (295) |
EASOM | 1958 (67) | 1982 (92) | 2978 (344) | 2235 (196) | 1334 (133) |
EQUAL_MAXIMA | 2651 (131) | 1499 (176) | 2374 (117) | 2013 (157) | 1286 (121) |
EXP4 | 2946 (196) | 3404 (201) | 5166 (430) | 3392 (313) | 2166 (243) |
EXP8 | 3120 (187) | 3585 (212) | 5895 (615) | 3347 (288) | 2802 (304) |
EXP16 | 3250 (234) | 3735 (206) | 6498 (784) | 3345 (233) | 3279 (416) |
EXP32 | 3561 (233) | 3902 (233) | 7606 (475) | 3332 (250) | 3430 (401) |
EXTENDEDF10 | 4862 (959) | 3653 (329) | 5728 (976) (0.87) | 4737 (428) | 2609 (321) |
FIVE_UNEVEN | 3412 (422) (0.67) | 3913 (378) (0.87) | 4042 (391) (0.14) | 5006 (519) (0.97) | 3849 (388) (0.90) |
F9 | 2604 (197) | 1888 (306) | 2271 (364) | 2748 (333) | 1439 (229) |
F14 | 6686 (1503) | 5498 (550) | 5279 (1988) (0.63) | 9228 (3725) (0.94) | 6063 (1835) |
F15 | 4373 (624) | 6696 (856) | 5874 (1880) (0.80) | 7342 (1441) | 4397 (1031) |
F17 | 3667 (267) | 3805 (333) | 10,441 (1435) | 4057 (339) | 2766 (368) |
HIMMELBLAU | 2481 (27) | 1013 (34) | 6636 (282) | 1718 (84) | 1119 (81) |
GKLS250 | 2280 (184) | 2411 (158) | 3834 (416) | 3332 (203) | 1603 (150) |
GKLS350 | 2613 (269) | 2234 (225) | 3919 (469) | 2493 (233) | 1298 (178) |
GOLDSTEIN | 3687 (278) | 3865 (356) | 6781 (483) | 4015 (323) | 2784 (290) |
GRIEWANK2 | 4501 (918) | 3076 (218) (0.73) | 7429 (1472) | 4682 (586) | 2589 (532) (0.96) |
GRIEWANK10 | 6410 (1264) (0.97) | 8006 (732) | 18,490 (1716) | 8772 (1138) | 7435 (900) |
HANSEN | 3210 (458) | 2856 (208) | 4185 (627) | 3789 (864) | 2484 (417) |
HARTMAN3 | 2752 (188) | 3140 (215) | 5190 (294) | 3078 (267) | 1793 (231) |
HARTMAN6 | 3219 (217) | 3710 (235) | 5968 (548) | 3583 (309) | 2478 (282) |
POTENTIAL3 | 4352 (461) | 4865 (468) | 6118 (1047) | 6027 (793) | 4081 (391) |
POTENTIAL5 | 7705 (892) | 9183 (1180) | 9119 (570) | 9968 (1171) | 8886 (1146) |
RASTRIGIN | 4107 (729) | 3477 (348) | 6216 (428) | 4201 (401) | 2304 (302) |
ROSENBROCK4 | 3679 (393) | 6372 (549) | 8452 (643) | 6137 (720) | 4019 (324) |
ROSENBROCK8 | 5270 (514) | 8284 (849) | 11,530 (1632) | 8569 (872) | 6801 (633) |
ROSENBROCK16 | 8509 (1026) | 11,872 (1018) | 17,432 (1738) | 11,777 (1268) | 11,996 (1331) |
SHEKEL5 | 3325 (227) | 4259 (305) | 6662 (974) | 3948 (340) | 2495 (310) |
SHEKEL7 | 3360 (283) | 4241 (300) | 6967 (1035) | 4043 (379) | 2432 (240) |
SHEKEL10 | 3488 (240) | 4237 (268) | 6757 (897) | 3932 (355) | 2516 (326) |
SHUBERT2 | 3567 (413) | 2123 (188) | 3526 (885) | 3622 (446) | 2300 (527) |
SHUBERT4 | 3358 (380) | 1823 (166) | 3067 (699) | 3593 (674) | 1967 (323) (0.97) |
SHUBERT8 | 3569 (357) | 2348 (203) | 3120 (452) (0.94) | 2862 (383) | 2267 (296) |
SCHAFFER | 18,787 (3105) | 15,176 (2401) | 6315 (1548) | 28,679 (6267) | 23,531 (5904) |
SCHWEFEL221 | 2667 (416) | 2529 (163) | 5415 (1161) | 3426 (787) | 2203 (478) |
SCHWEFEL222 | 33,725 (4809) | 42,898 (6978) | 12,200 (1737) | 51,654 (7150) | 38,876 (5659) |
SPHERE | 1588 (23) | 1521 (12) | 3503 (199) | 1642 (38) | 1162 (84) |
TEST2N4 | 3331 (392) | 3437 (223) | 6396 (999) | 3695 (411) | 2277 (464) |
TEST2N5 | 4000 (815) | 3683 (287) | 6271 (1017) | 4234 (636) | 2734 (479) (0.96) |
TEST2N6 | 4312 (901) (0.93) | 3781 (241) | 5410 (822) (0.93) | 4599 (589) | 2905 (832) (0.86) |
TEST2N7 | 4775 (905) (0.90) | 4060 (312) | 7074 (1523) (0.97) | 5146 (767) | 3559 (763) (0.73) |
SINU4 | 2991 (298) | 3504 (231) | 5953 (1509) | 3478 (463) | 2005 (332) |
SINU8 | 3442 (293) | 4213 (309) | 6973 (1637) | 4420 (560) | 3158 (467) |
SINU16 | 4320 (458) | 5019 (312) | 6979 (634) | 7033 (1017) | 5891 (553) |
TEST30N3 | 3211 (1207) | 4610 (1615) | 6168 (1297) | 3971 (1348) | 2362 (884) |
TEST30N4 | 3679 (1435) | 4629 (1708) | 7006 (2745) | 4908 (1333) | 2978 (1719) |
UNEVEN_MAXIMA | 2969 (283) | 2729 (292) | 2393 (319) | 2972 (288) | 1560 (208) |
VINCENT2 | 12,779 (563) | 1797 (140) (0.87) | 6216 (1152) | 2094 (148) | 1834 (388) (0.90) |
VINCENT4 | 19,385 (1179) | 1830 (190) (0.67) | 4691 (1019) (0.83) | 2674 (244) | 2697 (1443) (0.64) |
VINCENT8 | 19,882 (2548) | 2717 (256) | 4417 (1118) (0.77) | 3423 (358) | 4368 (2253) (0.77) |
SUM | 297,650 | 268,654 | 288,413 | 320,469 | 231,856 |
Function | Uniform | Triangular | Maxwell | K-means |
---|---|---|---|---|
ACKLEY | 6118 (736) | 5912 (741) | 5986 (845) | 4199 (768) |
BF1 | 4513 (423) | 4318 (368) | 4055 (346) | 3228 (356) |
BF2 | 3959 (333) | 3879 (363) | 3587 (339) | 2815 (310) |
BF3 | 3506 (286) | 3344 (254) | 3129 (298) | 2501 (218) |
BRANIN | 2282 (162) | 2131 (174) | 2066 (167) | 1684 (180) |
CAMEL | 3156 (251) | 2919 (244) | 2848 (261) | 2262 (295) |
EASOM | 1756 (138) | 1650 (140) | 1321 (106) | 1334 (133) |
EQUAL_MAXIMA | 1445 (130) | 1293 (115) | 1165 (139) | 1286 (121) |
EXP4 | 3438 (335) | 3273 (232) | 3194 (330) | 2166 (243) |
EXP8 | 3432 (284) | 3387 (332) | 3152 (266) | 2802 (304) |
EXP16 | 3369 (349) | 3326 (329) | 3291 (224) | 3279 (416) |
EXP32 | 3216 (274) | 3225 (207) | 3344 (374) | 3430 (401) |
EXTENDEDF10 | 4304 (683) | 3913 (596) | 3718 (673) | 2609 (321) |
FIVE_UNEVEN | 4685 (427) | 4330 (474) | 4358 (544) | 3849 (388) |
F9 | 1958 (168) | 1681 (188) | 1878 (186) | 1439 (229) |
F14 | 7552 (1377) | 7573 (2049) | 6148 (724) | 6063 (1835) |
F15 | 6806 (1065) | 6466 (722) | 6543 (974) | 4397 (1031) |
F17 | 3805 (326) | 3700 (404) | 3543 (236) | 2766 (368) |
HIMMELBLAU | 1333 (91) | 1173 (72) | 1114 (102) | 1119 (81) |
GKLS250 | 2268 (177) | 2023 (170) | 1778 (208) | 1603 (150) |
GKLS350 | 2151 (291) | 1841 (190) | 2069 (893) | 1298 (178) |
GOLDSTEIN | 3855 (291) | 3731 (335) | 3530 (309) | 2784 (290) |
GRIEWANK2 | 4310 (1205) | 4510 (1310) | 4035 (995) | 2589 (532) |
GRIEWANK10 | 8640 (1106) | 8773 (1265) | 8232 (911) | 7435 (900) |
HANSEN | 3329 (653) | 3071 (466) | 2734 (521) | 2484 (417) |
HARTMAN3 | 2849 (240) | 2673 (246) | 2678 (317) | 1793 (231) |
HARTMAN6 | 3456 (344) | 3249 (330) | 3119 (299) | 2478 (282) |
POTENTIAL3 | 4554 (511) | 5095 (469) | 3928 (509) | 4081 (391) |
POTENTIAL5 | 8356 (702) | 10,032 (1078) | 7504 (996) | 8886 (1146) |
RASTRIGIN | 3310 (414) | 3187 (336) | 2751 (600) | 2304 (302) |
ROSENBROCK4 | 6566 (719) | 6353 (742) | 5588 (599) | 4019 (324) |
ROSENBROCK8 | 8379 (864) | 8717 (882) | 7783 (782) | 6801 (633) |
ROSENBROCK16 | 11,921 (1389) | 12,471 (1025) | 11,677 (1212) | 11,996 (1331) |
SHEKEL5 | 3946 (333) | 3731 (398) | 3859 (460) | 2495 (310) |
SHEKEL7 | 3990 (361) | 3646 (442) | 3944 (326) | 2432 (240) |
SHEKEL10 | 3836 (316) | 3630 (385) | 3694 (379) | 2516 (326) |
SHUBERT2 | 3288 (562) | 3212 (728) | 2631 (373) | 2300 (527) |
SHUBERT4 | 3116 (548) | 2919 (514) | 2499 (422) | 1967 (323) |
SHUBERT8 | 2815 (500) | 2810 (513) | 2337 (296) | 2267 (296) |
SCHAFFER | 55,131 (10,062) | 40,715 (9866) | 60,717 (9950) | 23,531 (5904) |
SCHWEFEL221 | 2724 (406) | 2901 (547) | 2799 (505) | 2203 (478) |
SCHWEFEL222 | 53,118 (7011) | 54,593 (8763) | 55,354 (7330) | 38,876 (5659) |
SPHERE | 1346 (68) | 1188 (72) | 1084 (82) | 1162 (84) |
TEST2N4 | 3345 (416) | 3233 (401) | 2867 (253) | 2277 (464) |
TEST2N5 | 3937 (757) | 3742 (584) | 3094 (278) | 2734 (479) |
TEST2N6 | 4008 (718) | 4473 (1060) | 3266 (297) | 2905 (832) |
TEST2N7 | 4545 (1169) | 4612 (1046) | 3549 (421) | 3559 (763) |
SINU4 | 3128 (410) | 2879 (240) | 3559 (750) | 2005 (332) |
SINU8 | 4126 (339) | 3767 (410) | 5637 (964) | 3158 (467) |
SINU16 | 6774 (1166) | 5977 (601) | 7739 (1791) | 5891 (553) |
TEST30N3 | 3704 (1289) | 3384 (1083) | 3175 (1012) | 2362 (884) |
TEST30N4 | 4262 (1805) | 4327 (1838) | 3491 (920) | 2978 (1719) |
UNEVEN_MAXIMA | 1877 (236) | 1810 (239) | 1666 (189) | 1560 (208) |
VINCENT2 | 1598 (119) | 1482 (133) | 1849 (121) | 1834 (388) |
VINCENT4 | 2471 (229) | 2282 (167) | 2296 (206) | 2697 (1443) |
VINCENT8 | 3074 (524) | 2797 (201) | 2883 (423) | 4368 (2253) |
SUM | 324,736 | 307,329 | 315,835 | 231,856 |
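The table above compares uniform, triangular, and Maxwell sampling against the proposed K-means scheme for initializing the population. As a rough illustration of that scheme as described in the introduction (a pool of uniform samples is clustered and only the cluster centers are kept, so nearby points are discarded), the sketch below runs a plain Lloyd's K-means iteration; the function name and the sample and cluster counts are illustrative, since the excerpt does not specify them (the parameter table leaves the number of K-means samples blank).

```python
import numpy as np

def kmeans_population(lower, upper, n_samples, n_agents, iters=50, seed=0):
    """Draw n_samples uniform points in the box [lower, upper] and run Lloyd's
    K-means with n_agents clusters; the cluster centers become the initial
    population, so clusters of nearby samples collapse into one representative."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pts = rng.uniform(lower, upper, size=(n_samples, lower.size))
    # start the centers from a random subset of the samples
    centers = pts[rng.choice(n_samples, n_agents, replace=False)]
    for _ in range(iters):
        # assign every sample to its nearest center
        labels = np.argmin(((pts[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its cluster (keep it if the cluster emptied)
        new = np.array([pts[labels == k].mean(0) if np.any(labels == k) else centers[k]
                        for k in range(n_agents)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

# e.g. a population of 10 agents inside [-5, 5]^2, distilled from 200 samples
pop = kmeans_population([-5.0, -5.0], [5.0, 5.0], n_samples=200, n_agents=10)
```

Because each center is the mean of the samples assigned to it, the resulting population stays inside the search box while spreading out over its distinct regions, which is the stated aim of the initialization step.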
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kyrou, G.; Charilogis, V.; Tsoulos, I.G. Refining the Eel and Grouper Optimizer with Intelligent Modifications for Global Optimization. Computation 2024, 12, 205. https://doi.org/10.3390/computation12100205