EOFA: An Extended Version of the Optimal Foraging Algorithm for Global Optimization Problems
Abstract
1. Introduction
- The incorporation of a sampling technique that utilizes the K-means method [52]. The points that are sampled lead more accurately and quickly to the global minimum of the function, while nearby points that another sampling technique might have retained are discarded.
- The use of a termination technique based on stochastic observations. At each iteration of the algorithm, the smallest function value is recorded; if it remains constant for a predetermined number of iterations, the method terminates. This way, the method stops in time, without needlessly spending computing time on iterations that would simply reproduce the same result.
- The improvement of the offspring produced by the method through a few steps of a local minimization method. In the current work, a BFGS variant [53] was used as the local search procedure.
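The stochastic termination idea listed above can be sketched in a few lines of Python; the names `should_terminate`, `best_history`, `patience`, and `tol` below are hypothetical illustrations, not the paper's notation.

```python
def should_terminate(best_history, patience=5, tol=1e-12):
    """Stop when the smallest recorded function value has stayed
    (almost) constant for `patience` consecutive iterations."""
    if len(best_history) <= patience:
        return False
    recent = best_history[-(patience + 1):]
    return max(recent) - min(recent) <= tol
```

Here `best_history` holds the smallest function value recorded at each iteration, and `patience` plays the role of the predetermined number of iterations mentioned above.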
2. The Proposed Method
2.1. The Main Steps of the Algorithm
- Initialization step.
- (a)
- Define the number of elements in the population.
- (b)
- Define the maximum number of allowed iterations.
- (c)
- Randomly initialize the members of the population inside the set S. This forms the initial population, denoted as P. The sampling is performed using the procedure of Section 2.2.
- (d)
- Create the QOP population, which stands for the quasi-opposite population of P, by calculating the quasi-opposite positions of P.
- (e)
- Set the iteration counter to zero.
- (f)
- Select the fittest solutions from the population set using the QOBL method proposed in [51].
- Calculation step.
- (a)
- Produce the new offspring solutions based on the ranking order using the following equation:
- (b)
- For every member of the population, do
- i.
- If the corresponding stochastic condition holds, then apply a limited number of steps of a local search procedure LS(x) to the current point; a dedicated parameter denotes the number of local search steps. In the current work, the BFGS local optimization procedure is used, but other local search methods, such as Gradient Descent [54] or Steepest Descent [55], could also be employed.
- ii.
- If the acceptance criterion is satisfied, then select the offspring for the next population; otherwise, retain the current solution. The values used in this criterion are uniformly distributed random numbers.
- (c)
- End For.
- (d)
- Sort all solutions in the current population from best to worst according to their function values.
- Termination check step.
- (a)
- Increase the iteration counter by one.
- (b)
- Evaluate the stopping rule proposed in the work of Charilogis [56]. This termination method is described in Section 2.3.
- (c)
- If the termination criteria are not met, then go to the Calculation step; otherwise, terminate.
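The main steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `quasi_opposite` follows the usual QOBL construction (a uniform draw between the interval centre and the opposite point), while the offspring rule is a simple placeholder perturbation standing in for the paper's ranking-based update equation. All names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def quasi_opposite(P, lo, hi):
    """QOBL-style quasi-opposite points: for each coordinate, draw a
    uniform value between the interval centre and the opposite point."""
    centre = (lo + hi) / 2.0
    opposite = lo + hi - P
    return rng.uniform(np.minimum(centre, opposite),
                       np.maximum(centre, opposite))

def eofa_skeleton(f, lo, hi, n_pop=20, max_iters=50):
    """Skeleton of the main steps: QOBL initialization, offspring
    generation, greedy selection, and sorting by function value."""
    P = rng.uniform(lo, hi, size=(n_pop, lo.size))
    # keep the n_pop fittest members of P united with its quasi-opposite QOP
    merged = np.vstack([P, quasi_opposite(P, lo, hi)])
    vals = np.array([f(x) for x in merged])
    P = merged[np.argsort(vals)[:n_pop]]
    for _ in range(max_iters):
        # placeholder offspring rule (the paper uses a ranking-based equation)
        offspring = np.clip(P + rng.normal(scale=0.1, size=P.shape), lo, hi)
        for i in range(n_pop):
            if f(offspring[i]) < f(P[i]):   # greedy selection
                P[i] = offspring[i]
        P = P[np.argsort([f(x) for x in P])]  # sort from best to worst
    return P[0], f(P[0])
```

For instance, `eofa_skeleton(lambda x: float(np.sum(x * x)), np.full(2, -1.0), np.full(2, 1.0))` should return a point near the origin for the Sphere function.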
2.2. The Proposed Sampling Procedure
- Define k as the number of clusters.
- Randomly draw the initial points from the domain of the objective function.
- Randomly assign each point to a cluster.
- For every cluster do
- (a)
- Count the number of points in the cluster.
- (b)
- Compute the center of the cluster as the mean of the points assigned to it.
- EndFor.
- Repeat
- (a)
- Empty every cluster.
- (b)
- For each point do
- i.
- Find the center nearest to the point, where the distance between two points is measured by the Euclidean distance.
- ii.
- Assign the point to the cluster of its nearest center.
- (c)
- EndFor.
- (d)
- For each center do
- i.
- Update the center as the mean of the points currently assigned to its cluster.
- (e)
- EndFor.
- If there is no significant change in the centers, terminate the algorithm and return the k centers as the final set of samples.
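The procedure above is standard K-means; a compact NumPy sketch is given below. For brevity, the centres are seeded from k randomly chosen points rather than from a random-assignment step, a common equivalent initialization; all names are illustrative.

```python
import numpy as np

def kmeans_samples(points, k, iters=100, tol=1e-6, seed=0):
    """Run plain K-means on the drawn points and return the k cluster
    centres, which serve as the final set of samples."""
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # assign every point to its nearest centre (Euclidean distance)
        dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute each centre as the mean of its assigned points
        new_centres = np.array([points[labels == j].mean(axis=0)
                                if np.any(labels == j) else centres[j]
                                for j in range(k)])
        if np.linalg.norm(new_centres - centres) < tol:  # no significant change
            return new_centres
        centres = new_centres
    return centres
```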
2.3. The Termination Rule Used
3. Results
3.1. Test Functions
- Ackley function: $f(x) = -20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$.
- Bf1 (Bohachevsky 1) function: $f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos(3\pi x_1) - \frac{4}{10}\cos(4\pi x_2) + \frac{7}{10}$.
- Bf2 (Bohachevsky 2) function: $f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos(3\pi x_1)\cos(4\pi x_2) + \frac{3}{10}$.
- Bf3 (Bohachevsky 3) function: $f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos(3\pi x_1 + 4\pi x_2) + \frac{3}{10}$.
- Branin function: $f(x) = \left(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos(x_1) + 10$, with $-5 \le x_1 \le 10$, $0 \le x_2 \le 15$.
- Camel function: $f(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$, $x \in [-5, 5]^2$.
- Easom function: $f(x) = -\cos(x_1)\cos(x_2)\exp\left(-(x_1-\pi)^2 - (x_2-\pi)^2\right)$.
- Exponential function, defined as $f(x) = -\exp\left(-0.5\sum_{i=1}^{n} x_i^2\right)$, $-1 \le x_i \le 1$. In the conducted experiments, the values $n = 4, 8, 16, 32$ were used.
- ExtendedF10 function:
- F12 function:
- F14 function:
- F15 function:
- F17 function:
- Griewank2 function: $f(x) = 1 + \frac{1}{200}\sum_{i=1}^{2} x_i^2 - \prod_{i=1}^{2}\frac{\cos(x_i)}{\sqrt{i}}$.
- Griewank10 function: the Griewank function in $n = 10$ dimensions, given by the equation $f(x) = 1 + \frac{1}{200}\sum_{i=1}^{10} x_i^2 - \prod_{i=1}^{10}\frac{\cos(x_i)}{\sqrt{i}}$.
- Gkls function: $\mathrm{Gkls}(x, n, w)$ is a constructed function with $w$ local minima, presented in [67], with $x \in [-1, 1]^n$. For the conducted experiments, the values $n = 2, 3$ and $w = 50$ were utilized.
- Goldstein and Price function: $f(x) = \left[1 + (x_1 + x_2 + 1)^2\left(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2\right)\right]\left[30 + (2x_1 - 3x_2)^2\left(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2\right)\right]$, $x \in [-2, 2]^2$.
- Hansen function: $f(x) = \sum_{i=1}^{5} i\cos\left[(i-1)x_1 + i\right]\sum_{j=1}^{5} j\cos\left[(j+1)x_2 + j\right]$, $x \in [-10, 10]^2$.
- Hartman 3 function:
- Hartman 6 function:
- Potential function: This function stands for the energy of a molecular conformation of $N$ atoms that interact via the Lennard-Jones potential [68]. The pairwise potential is defined as $V_{LJ}(r) = 4\epsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right]$. For the conducted experiments, the values $N = 3, 5$ were used.
- Rastrigin function: $f(x) = x_1^2 + x_2^2 - \cos(18x_1) - \cos(18x_2)$, $x \in [-1, 1]^2$.
- Rosenbrock function: $f(x) = \sum_{i=1}^{n-1}\left(100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2\right)$. The values $n = 4, 8, 16$ were used in the conducted experiments.
- Shekel 5 function: $f(x) = -\sum_{i=1}^{5}\frac{1}{(x-a_i)(x-a_i)^T + c_i}$, $x \in [0, 10]^4$, with the standard Shekel parameters $a_i$ and $c_i$.
- Shekel 7 function: $f(x) = -\sum_{i=1}^{7}\frac{1}{(x-a_i)(x-a_i)^T + c_i}$, $x \in [0, 10]^4$, with the standard Shekel parameters $a_i$ and $c_i$.
- Shekel 10 function: $f(x) = -\sum_{i=1}^{10}\frac{1}{(x-a_i)(x-a_i)^T + c_i}$, $x \in [0, 10]^4$, with the standard Shekel parameters $a_i$ and $c_i$.
- Sinusoidal function, defined as $f(x) = -\left(2.5\prod_{i=1}^{n}\sin(x_i - z) + \prod_{i=1}^{n}\sin\left(5(x_i - z)\right)\right)$, $0 \le x_i \le \pi$. The values $n = 4, 8, 16$ and $z = \frac{\pi}{6}$ were used in the conducted experiments.
- Sphere function: $f(x) = \sum_{i=1}^{n} x_i^2$.
- Schwefel function: $f(x) = 418.9829n - \sum_{i=1}^{n} x_i\sin\left(\sqrt{|x_i|}\right)$.
- Schwefel 2.21 function: $f(x) = \max_{1 \le i \le n}|x_i|$.
- Schwefel 2.22 function: $f(x) = \sum_{i=1}^{n}|x_i| + \prod_{i=1}^{n}|x_i|$.
- Test2N function: $f(x) = \frac{1}{2}\sum_{i=1}^{n}\left(x_i^4 - 16x_i^2 + 5x_i\right)$, $x_i \in [-5, 5]$. For the conducted experiments, the values $n = 4, 5, 6, 7$ were used.
- Test30N function. The values $n = 3, 4$ were used in the conducted experiments.
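For reference, two of the simpler benchmarks above can be written directly in code. These sketches use the standard textbook forms, which may differ in constants or domains from the exact variants used in the paper.

```python
import numpy as np

def sphere(x):
    """Sphere function: sum of squared coordinates, minimum 0 at the origin."""
    return float(np.sum(np.square(x)))

def rosenbrock(x):
    """Rosenbrock function, minimum 0 at (1, ..., 1)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (x[:-1] - 1.0) ** 2))
```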
3.2. Experimental Results
- The column FUNCTION denotes the name of the objective problem.
- The column GENETIC denotes the application of a genetic algorithm to the objective problem. The genetic algorithm has 500 chromosomes, and the maximum number of allowed generations was set to 200.
- The column PSO stands for the application of a Particle Swarm Optimizer to every objective problem. The number of particles was set to 500, and the maximum number of allowed iterations was set to 200.
- The column EOFA represents the application of the proposed method using the values for the parameters shown in Table 1.
- The row SUM represents the sum of function calls for all test functions.
- The column UNIFORM stands for the application of uniform sampling in the proposed method.
- The column TRIANGULAR represents the application of the triangular distribution [69] to sample the initial points of the proposed method.
- The column KMEANS represents the sampling method presented in the current work.
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Intriligator, M.D. Mathematical Optimization and Economic Theory; Society for Industrial and Applied Mathematics: Philadelphia, PA, USA, 2002. [Google Scholar]
- Törn, A.; Ali, M.M.; Viitanen, S. Stochastic global optimization: Problem classes and solution techniques. J. Glob. Optim. 1999, 14, 437–447. [Google Scholar] [CrossRef]
- Yang, L.; Robin, D.; Sannibale, F.; Steier, C.; Wan, W. Global optimization of an accelerator lattice using multiobjective genetic algorithms. Nucl. Instrum. Methods Phys. Res. Sect. A Accel. Spectrometers Detect. Assoc. Equip. 2009, 609, 50–57. [Google Scholar] [CrossRef]
- Iuliano, E. Global optimization of benchmark aerodynamic cases using physics-based surrogate models. Aerosp. Sci. Technol. 2017, 67, 273–286. [Google Scholar] [CrossRef]
- Duan, Q.; Sorooshian, S.; Gupta, V. Effective and efficient global optimization for conceptual rainfall-runoff models. Water Resour. Res. 1992, 28, 1015–1031. [Google Scholar] [CrossRef]
- Heiles, S.; Johnston, R.L. Global optimization of clusters using electronic structure methods. Int. J. Quantum Chem. 2013, 113, 2091–2109. [Google Scholar] [CrossRef]
- Shin, W.H.; Kim, J.K.; Kim, D.S.; Seok, C. GalaxyDock2: Protein–ligand docking using beta-complex and global optimization. J. Comput. Chem. 2013, 34, 2647–2656. [Google Scholar] [CrossRef] [PubMed]
- Liwo, A.; Lee, J.; Ripoll, D.R.; Pillardy, J.; Scheraga, H.A. Protein structure prediction by global optimization of a potential energy function. Proc. Natl. Acad. Sci. USA 1999, 96, 5482–5485. [Google Scholar] [CrossRef] [PubMed]
- Gaing, Z.-L. Particle swarm optimization to solving the economic dispatch considering the generator constraints. IEEE Trans. Power Syst. 2003, 18, 1187–1195. [Google Scholar] [CrossRef]
- Maranas, C.D.; Androulakis, I.P.; Floudas, C.A.; Berger, A.J.; Mulvey, J.M. Solving long-term financial planning problems via global optimization. J. Econ. Dyn. Control 1997, 21, 1405–1425. [Google Scholar] [CrossRef]
- Lee, E.K. Large-Scale Optimization-Based Classification Models in Medicine and Biology. Ann. Biomed. Eng. 2007, 35, 1095–1109. [Google Scholar] [CrossRef]
- Cherruault, Y. Global optimization in biology and medicine. Math. Comput. Model. 1994, 20, 119–132. [Google Scholar] [CrossRef]
- Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. Int. Trans. Oper. Res. 2005, 12, 263–285. [Google Scholar] [CrossRef]
- Choi, S.H.; Manousiouthakis, V. Global optimization methods for chemical process design: Deterministic and stochastic approaches. Korean J. Chem. Eng. 2002, 19, 227–232. [Google Scholar] [CrossRef]
- Wolfe, M.A. Interval methods for global optimization. Appl. Math. Comput. 1996, 75, 179–206. [Google Scholar]
- Csendes, T.; Ratz, D. Subdivision Direction Selection in Interval Methods for Global Optimization. SIAM J. Numer. Anal. 1997, 34, 922–938. [Google Scholar] [CrossRef]
- Price, W.L. Global optimization by controlled random search. J. Optim. Theory Appl. 1983, 40, 333–348. [Google Scholar] [CrossRef]
- Krivy, I.; Tvrdik, J. The controlled random search algorithm in optimizing regression models. Comput. Stat. Data Anal. 1995, 20, 229–234. [Google Scholar] [CrossRef]
- Ali, M.M.; Torn, A.; Viitanen, S. A Numerical Comparison of Some Modified Controlled Random Search Algorithms. J. Glob. Optim. 1997, 11, 377–385. [Google Scholar] [CrossRef]
- Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
- Ingber, L. Very fast simulated re-annealing. Math. Comput. Model. 1989, 12, 967–973. [Google Scholar] [CrossRef]
- Eglese, R.W. Simulated annealing: A tool for operational research. Eur. J. Oper. Res. 1990, 46, 271–281. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Lagaris, I.E. MinFinder: Locating all the local minima of a function. Comput. Phys. Commun. 2006, 174, 166–179. [Google Scholar] [CrossRef]
- Liu, Y.; Tian, P. A multi-start central force optimization for global optimization. Appl. Soft Comput. 2015, 27, 92–98. [Google Scholar] [CrossRef]
- Perez, M.; Almeida, F.; Moreno-Vega, J.M. Genetic algorithm with multistart search for the p-Hub median problem. In Proceedings of the 24th EUROMICRO Conference (Cat. No. 98EX204), Vasteras, Sweden, 27 August 1998; Volume 2, pp. 702–707. [Google Scholar]
- Oliveira, H.C.B.d.; Vasconcelos, G.C.; Alvarenga, G. A Multi-Start Simulated Annealing Algorithm for the Vehicle Routing Problem with Time Windows. In Proceedings of the 2006 Ninth Brazilian Symposium on Neural Networks (SBRN’06), Ribeirao Preto, Brazil, 23–27 October 2006; pp. 137–142. [Google Scholar]
- Larson, J.; Wild, S.M. Asynchronously parallel optimization solver for finding multiple minima. Math. Program. Comput. 2018, 10, 303–332. [Google Scholar] [CrossRef]
- Bolton, H.P.J.; Schutte, J.F.; Groenwold, A.A. Multiple Parallel Local Searches in Global Optimization. In Recent Advances in Parallel Virtual Machine and Message Passing Interface; EuroPVM/MPI 2000. Lecture Notes in Computer Science; Dongarra, J., Kacsuk, P., Podhorszki, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2000; Volume 1908. [Google Scholar]
- Kamil, R.; Reiji, S. An Efficient GPU Implementation of a Multi-Start TSP Solver for Large Problem Instances. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation, Philadelphia, PA, USA, 7–11 July 2012; pp. 1441–1442. [Google Scholar]
- Van Luong, T.; Melab, N.; Talbi, E.G. GPU-Based Multi-start Local Search Algorithms. In Learning and Intelligent Optimization. LION 2011; Lecture Notes in Computer Science; Coello, C.A.C., Ed.; Springer: Berlin/Heidelberg, Germany, 2011; Volume 6683. [Google Scholar]
- Sepulveda, A.E.; Epstein, L. The repulsion algorithm, a new multistart method for global optimization. Struct. Optim. 1996, 11, 145–152. [Google Scholar] [CrossRef]
- Chen, Z.; Qiu, H.; Gao, L.; Li, X.; Li, P. A local adaptive sampling method for reliability-based design optimization using Kriging model. Struct. Multidisc. Optim. 2014, 49, 401–416. [Google Scholar] [CrossRef]
- Homem-de-Mello, T.; Bayraksan, G. Monte Carlo sampling-based methods for stochastic optimization. Surv. Oper. Res. Manag. Sci. 2014, 19, 56–85. [Google Scholar] [CrossRef]
- Park, J.; Sandberg, I.W. Universal Approximation Using Radial-Basis-Function Networks. Neural Comput. 1991, 3, 246–257. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Tzallas, A.; Tsalikakis, D. Use RBF as a Sampling Method in Multistart Global Optimization Method. Signals 2022, 3, 857–874. [Google Scholar] [CrossRef]
- Hart, W.E. Sequential stopping rules for random optimization methods with applications to multistart local search. SIAM J. Optim. 1998, 9, 270–290. [Google Scholar] [CrossRef]
- Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Appl. Math. Comput. 2008, 203, 598–607. [Google Scholar] [CrossRef]
- Ohsaki, M.; Yamakawa, M. Stopping rule of multi-start local search for structural optimization. Struct. Multidisc. Optim. 2018, 57, 595–603. [Google Scholar] [CrossRef]
- Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
- Liu, J.; Lampinen, J. A Fuzzy Adaptive Differential Evolution Algorithm. Soft Comput. 2005, 9, 448–462. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization: An overview. Swarm Intell. 2007, 1, 33–57. [Google Scholar] [CrossRef]
- Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett. 2003, 85, 317–325. [Google Scholar] [CrossRef]
- Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
- Socha, K.; Dorigo, M. Ant colony optimization for continuous domains. Eur. J. Oper. Res. 2008, 185, 1155–1173. [Google Scholar] [CrossRef]
- Goldberg, D. Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley Publishing Company: Reading, MA, USA, 1989. [Google Scholar]
- Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs; Springer: Berlin/Heidelberg, Germany, 1996. [Google Scholar]
- Zhu, G.Y.; Zhang, W.B. Optimal foraging algorithm for global optimization. Appl. Soft Comput. 2017, 51, 294–313. [Google Scholar] [CrossRef]
- Fu, Y.; Zhang, W.; Qu, C.; Huang, B. Optimal Foraging Algorithm Based on Differential Evolution. IEEE Access 2020, 8, 19657–19678. [Google Scholar] [CrossRef]
- Jian, Z.; Zhu, G. Optimal foraging algorithm with direction prediction. Appl. Soft Comput. 2021, 111, 107660. [Google Scholar] [CrossRef]
- Ding, C.; Zhu, G. Improved optimal foraging algorithm for global optimization. Computing 2024, 106, 2293–2319. [Google Scholar] [CrossRef]
- Ahmed, M.; Seraj, R.; Islam, S.M.S. The k-means algorithm: A comprehensive survey and performance evaluation. Electronics 2020, 9, 1295. [Google Scholar] [CrossRef]
- Powell, M.J.D. A Tolerant Algorithm for Linearly Constrained Optimization Calculations. Math. Program. 1989, 45, 547–566. [Google Scholar] [CrossRef]
- Amari, S.I. Backpropagation and stochastic gradient descent method. Neurocomputing 1993, 5, 185–196. [Google Scholar] [CrossRef]
- Meza, J.C. Steepest descent. Wiley Interdiscip. Rev. Comput. Stat. 2010, 2, 719–722. [Google Scholar] [CrossRef]
- Charilogis, V.; Tsoulos, I.G. Toward an Ideal Particle Swarm Optimizer for Multidimensional Functions. Information 2022, 13, 217. [Google Scholar] [CrossRef]
- MacQueen, J.B. Some Methods for classification and Analysis of Multivariate Observations. In Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, Oakland, CA, USA, 21 June–18 July 1967; pp. 281–297. [Google Scholar]
- Li, Y.; Wu, H. A clustering method based on K-means algorithm. Phys. Procedia 2012, 25, 1104–1109. [Google Scholar] [CrossRef]
- Arora, P.; Varshney, S. Analysis of k-means and k-medoids algorithm for big data. Procedia Comput. Sci. 2016, 78, 507–512. [Google Scholar] [CrossRef]
- Ali, M.M.; Khompatraporn, C.; Zabinsky, Z.B. A Numerical Evaluation of Several Stochastic Algorithms on Selected Continuous Global Optimization Test Problems. J. Glob. Optim. 2005, 31, 635–672. [Google Scholar]
- Floudas, C.A.; Pardalos, P.M.; Adjiman, C.; Esposoto, W.; Gümüs, Z.; Harding, S.; Klepeis, J.; Meyer, C.; Schweiger, C. Handbook of Test Problems in Local and Global Optimization; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1999. [Google Scholar]
- Ali, M.M.; Kaelo, P. Improved particle swarm algorithms for global optimization. Appl. Math. Comput. 2008, 196, 578–593. [Google Scholar] [CrossRef]
- Koyuncu, H.; Ceylan, R. A PSO based approach: Scout particle swarm algorithm for continuous global optimization problems. J. Comput. Des. Eng. 2019, 6, 129–142. [Google Scholar] [CrossRef]
- Siarry, P.; Berthiau, G.; Durdin, F.; Haussy, J. Enhanced simulated annealing for globally minimizing functions of many-continuous variables. ACM Trans. Math. Softw. 1997, 23, 209–228. [Google Scholar] [CrossRef]
- Tsoulos, I.G.; Lagaris, I.E. GenMin: An enhanced genetic algorithm for global optimization. Comput. Phys. Commun. 2008, 178, 843–851. [Google Scholar] [CrossRef]
- LaTorre, A.; Molina, D.; Osaba, E.; Poyatos, J.; Del Ser, J.; Herrera, F. A prescription of methodological guidelines for comparing bio-inspired optimization algorithms. Swarm Evol. Comput. 2021, 67, 100973. [Google Scholar] [CrossRef]
- Gaviano, M.; Ksasov, D.E.; Lera, D.; Sergeyev, Y.D. Software for generation of classes of test functions with known local and global minima for global optimization. ACM Trans. Math. Softw. 2003, 29, 469–480. [Google Scholar] [CrossRef]
- Lennard-Jones, J.E. On the Determination of Molecular Fields. Proc. R. Soc. Lond. A 1924, 106, 463–477. [Google Scholar]
- Stein, W.E.; Keblis, M.F. A new method to simulate the triangular distribution. Math. Comput. Model. 2009, 49, 1143–1147. [Google Scholar] [CrossRef]
- Gropp, W.; Lusk, E.; Doss, N.; Skjellum, A. A high-performance, portable implementation of the MPI message passing interface standard. Parallel Comput. 1996, 22, 789–828. [Google Scholar] [CrossRef]
- Chandra, R. Parallel Programming in OpenMP; Morgan Kaufmann: Cambridge, MA, USA, 2001. [Google Scholar]
Parameter | Value |
---|---|
Number of chromosomes/particles | 500 |
Maximum number of allowed iterations | 200 |
Number of initial samples for K-means | |
Number of iterations for stopping rule | 5 |
Selection rate for the genetic algorithm | 0.1 |
Mutation rate for the genetic algorithm | 0.05 |
Number of iterations for BFGS | 3 |
FUNCTION | GENETIC | PSO | EOFA |
---|---|---|---|
ACKLEY | 16,754 | 24,518 | 1346 |
BF1 | 10,466 | 9912 | 1856 |
BF2 | 10,059 | 9364 | 1738 |
BF3 | 7290 | 9847 | 1408 |
BRANIN | 10,032 | 5940 | 1479 |
CAMEL | 11,069 | 7132 | 1540 |
EASOM | 10,587 | 4922 | 1304 |
EXP4 | 10,231 | 7382 | 1651 |
EXP8 | 10,622 | 7644 | 1891 |
EXP16 | 10,458 | 8050 | 2145 |
EXP32 | 10,202 | 8800 | 2165 |
EXTENDEDF10 | 10,739 | 14,363 | 1361 |
F12 | 31,277 | 23,705 | 4340 |
F14 | 14,598 | 26,296 | 1617 |
F15 | 8215 | 13,599 | 1620 |
F17 | 9202 | 11,851 | 1438 |
GKLS250 | 10,198 | 5488 | 1404 |
GKLS350 | 9861 | 6029 | 1325 |
GOLDSTEIN | 11,901 | 9244 | 1751 |
GRIEWANK2 | 13,612 | 10,315 | 1602 |
GRIEWANK10 | 14,750 | 15,721 | 3092 |
HANSEN | 13,053 | 7636 | 1559 |
HARTMAN3 | 10,066 | 6897 | 1664 |
HARTMAN6 | 11,119 | 8061 | 1873 |
POTENTIAL3 | 16,325 | 10,728 | 2093 |
POTENTIAL5 | 34,284 | 19,307 | 3397 |
RASTRIGIN | 13,354 | 9783 | 1528 |
ROSENBROCK4 | 12,618 | 9266 | 2134 |
ROSENBROCK8 | 15,019 | 12,854 | 2985 |
ROSENBROCK16 | 17,150 | 21,074 | 4151 |
SHEKEL5 | 13,927 | 8383 | 2021 |
SHEKEL7 | 13,688 | 8491 | 2029 |
SHEKEL10 | 13,722 | 8746 | 2120 |
TEST2N4 | 10,522 | 7815 | 1608 |
TEST2N5 | 10,847 | 8393 | 1658 |
TEST2N6 | 11,180 | 9385 | 1782 |
TEST2N7 | 11,485 | 10,561 | 1876 |
SPHERE | 3942 | 8330 | 1199 |
SCHWEFEL | 4688 | 9037 | 1242 |
SCHWEFEL2.21 | 6233 | 7872 | 1286 |
SCHWEFEL2.22 | 81,980 | 88,227 | 5309 |
SINU4 | 12,920 | 7250 | 1700 |
SINU8 | 12,703 | 8202 | 2202 |
SINU16 | 12,404 | 10,640 | 2188 |
TEST30N3 | 16,692 | 7750 | 1912 |
TEST30N4 | 19,159 | 10,036 | 1820 |
SUM | 651,203 | 564,846 | 91,409 |
FUNCTION | UNIFORM | TRIANGULAR | KMEANS |
---|---|---|---|
ACKLEY | 1803 | 1330 | 1346 |
BF1 | 2637 | 2146 | 1856 |
BF2 | 2470 | 2011 | 1738 |
BF3 | 1863 | 1366 | 1408 |
BRANIN | 2023 | 1518 | 1479 |
CAMEL | 2174 | 1671 | 1540 |
EASOM | 1755 | 1264 | 1304 |
EXP4 | 2428 | 1844 | 1651 |
EXP8 | 2500 | 1939 | 1891 |
EXP16 | 2568 | 2026 | 2145 |
EXP32 | 2534 | 1976 | 2165 |
EXTENDEDF10 | 1812 | 1360 | 1361 |
F12 | 6085 | 4196 | 4340 |
F14 | 2225 | 1691 | 1617 |
F15 | 1950 | 1512 | 1620 |
F17 | 1932 | 1424 | 1438 |
GKLS250 | 1895 | 1403 | 1404 |
GKLS350 | 1789 | 1249 | 1325 |
GOLDSTEIN | 2516 | 2013 | 1751 |
GRIEWANK2 | 2242 | 1711 | 1602 |
GRIEWANK10 | 3776 | 3416 | 3092 |
HANSEN | 2163 | 1678 | 1559 |
HARTMAN3 | 2289 | 1757 | 1664 |
HARTMAN6 | 2541 | 2065 | 1873 |
POTENTIAL3 | 2841 | 2366 | 2093 |
POTENTIAL5 | 4029 | 3960 | 3397 |
RASTRIGIN | 2105 | 1779 | 1528 |
ROSENBROCK4 | 3255 | 2949 | 2134 |
ROSENBROCK8 | 4059 | 3468 | 2985 |
ROSENBROCK16 | 4963 | 4391 | 4151 |
SPHERE | 1533 | 1033 | 1199 |
SCHWEFEL | 1635 | 1139 | 1242 |
SCHWEFEL2.21 | 1724 | 1230 | 1286 |
SCHWEFEL2.22 | 8363 | 8663 | 5309 |
SHEKEL5 | 2945 | 2515 | 2021 |
SHEKEL7 | 3008 | 2567 | 2029 |
SHEKEL10 | 3160 | 2683 | 2120 |
TEST2N4 | 2339 | 1804 | 1608 |
TEST2N5 | 2388 | 1847 | 1658 |
TEST2N6 | 2478 | 1962 | 1782 |
TEST2N7 | 2552 | 2044 | 1876 |
SINU4 | 2444 | 1954 | 1700 |
SINU8 | 2881 | 2388 | 2202 |
SINU16 | 3694 | 3300 | 2188 |
TEST30N3 | 2189 | 1654 | 1912 |
TEST30N4 | 2282 | 2374 | 1820 |
SUM | 124,837 | 102,636 | 91,309 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Kyrou, G.; Charilogis, V.; Tsoulos, I.G. EOFA: An Extended Version of the Optimal Foraging Algorithm for Global Optimization Problems. Computation 2024, 12, 158. https://doi.org/10.3390/computation12080158