NeuralMinimizer: A Novel Method for Global Optimization
Abstract
1. Introduction
2. Method Description
2.1. RBF Preliminaries
The RBF network used here computes its output as

$$ R(\vec{x}) = \sum_{i=1}^{k} w_i \, \phi\left( \left\| \vec{x} - \vec{c}_i \right\| \right), \qquad \phi(r) = \exp\left( -\frac{r^2}{\sigma^2} \right) $$

where:
- The vector $\vec{x}$ is called the input pattern of the equation.
- The vectors $\vec{c}_i,\ i = 1, \dots, k$ are called the center vectors.
- The vector $\vec{w} = \left(w_1, w_2, \dots, w_k\right)$ stands for the output weights of the RBF network.

The network is trained in two phases:
- In the first phase, the K-Means algorithm [65] is used to approximate the k centers and the corresponding variances.
- In the second phase, the weight vector is calculated by solving a linear system of equations as follows:
- (a) Set $W = \left(w_1, w_2, \dots, w_k\right)^T$, the vector of output weights.
- (b) Set $\Phi$ the $M \times k$ matrix with elements $\Phi_{ij} = \phi\left(\left\| \vec{x}_i - \vec{c}_j \right\|\right)$, where $\vec{x}_i,\ i = 1, \dots, M$ are the training patterns.
- (c) Set $T = \left(t_1, t_2, \dots, t_M\right)^T$, the vector of target values for the training patterns.
- (d) The system to be solved is identified as $\Phi^T \left( T - \Phi W \right) = 0$, with the solution $W = \left(\Phi^T \Phi\right)^{-1} \Phi^T T = \Phi^{\dagger} T$, where $\Phi^{\dagger}$ denotes the pseudo-inverse of $\Phi$. A numerical sketch of this two-phase training procedure is given below.
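The following sketch illustrates the two phases in Python with NumPy and scikit-learn's KMeans. The helper names (rbf_design_matrix, train_rbf, rbf_predict) and the choice of each $\sigma_j$ as the mean intra-cluster distance are illustrative assumptions, not details taken from the paper's implementation.

```python
# Minimal sketch of two-phase RBF training: K-Means for centers/widths,
# then a pseudo-inverse solve for the output weights.
import numpy as np
from sklearn.cluster import KMeans

def rbf_design_matrix(X, centers, sigmas):
    """Phi[i, j] = exp(-||x_i - c_j||^2 / sigma_j^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / sigmas[None, :] ** 2)

def train_rbf(X, y, k=10):
    # Phase 1: K-Means locates the k centers; each sigma_j is taken here
    # as the mean distance of a cluster's points to its center (one
    # common heuristic -- an assumption, not the paper's prescription).
    km = KMeans(n_clusters=k, n_init=10).fit(X)
    centers = km.cluster_centers_
    dists = [np.linalg.norm(X[km.labels_ == j] - centers[j], axis=1).mean()
             for j in range(k)]
    sigmas = np.maximum(np.nan_to_num(dists), 1e-6)  # guard tiny/empty clusters
    # Phase 2: solve Phi^T (y - Phi w) = 0 for the output weights,
    # i.e. w = pinv(Phi) @ y.
    Phi = rbf_design_matrix(X, centers, sigmas)
    w = np.linalg.pinv(Phi) @ y
    return centers, sigmas, w

def rbf_predict(X, centers, sigmas, w):
    return rbf_design_matrix(X, centers, sigmas) @ w
```

The pseudo-inverse is used instead of an explicit matrix inversion because $\Phi^T \Phi$ can be ill-conditioned when centers lie close together.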
2.2. The Main Algorithm
- 1. Initialization step.
- (a) Set $k$, the number of weights in the RBF network.
- (b) Set $N_S$, the number of initial samples that will be taken from the objective function $f(x)$.
- (c) Set $N_T$, the number of samples that will be used in every iteration as starting points for the local optimization procedure.
- (d) Set $N_R$, the number of samples that will be drawn from the RBF network at every iteration, with $N_R \gg N_T$.
- (e) Set ITERMAX, the maximum number of allowed iterations.
- (f) Set Iter = 0, the iteration number.
- (g) Set $\left(x^{*}, y^{*}\right)$ as the global minimum. Initially, $y^{*} = \infty$.
- 2. Creation Step.
- (a) Set $T = \emptyset$, the training set for the RBF network.
- (b) For $i = 1, \dots, N_S$ do
- i. Take a new random sample $x_i$ from the search space $S$.
- ii. Calculate $y_i = f\left(x_i\right)$.
- iii. $T = T \cup \left\{ \left(x_i, y_i\right) \right\}$.
- (c) End For
- (d) Train the RBF network on the training set T.
- 3. Sampling Step.
- (a) Set $T_R = \emptyset$.
- (b) For $i = 1, \dots, N_R$ do
- i. Take a random sample $\left(x_i, y_i\right)$ from the RBF network; that is, draw a random point $x_i$ from the search space and set $y_i = R\left(x_i\right)$, the network's prediction, which is far cheaper to evaluate than $f\left(x_i\right)$.
- ii. Set $T_R = T_R \cup \left\{ \left(x_i, y_i\right) \right\}$.
- (c) End For
- (d) Sort $T_R$ according to the y values in ascending order.
- 4. Optimization Step.
- (a) For $i = 1, \dots, N_T$ do
- i. Take the next sample $\left(x_i, y_i\right)$ from $T_R$.
- ii. $x_m = \mathrm{LS}\left(x_i\right)$, where LS(x) is a predefined local search method.
- iii. $T = T \cup \left\{ \left(x_m, f\left(x_m\right)\right) \right\}$; this step updates the training set of the RBF network.
- iv. Train the RBF network on the set T.
- v. If $f\left(x_m\right) < y^{*}$, then set $x^{*} = x_m$ and $y^{*} = f\left(x_m\right)$.
- vi. Check the termination rule as suggested in [63]. If it holds, then report $\left(x^{*}, y^{*}\right)$ as the located global minimum and terminate.
- (b) End For
- 5. Set Iter = Iter + 1.
- 6. If Iter < ITERMAX, go to the Sampling Step; otherwise, terminate. A condensed sketch of this loop appears below.
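To make the control flow concrete, here is a condensed sketch of the loop above, reusing train_rbf and rbf_predict from the previous listing. Two choices are assumptions for illustration only: SciPy's L-BFGS-B stands in for the local search LS(x), which the paper does not fix to a particular optimizer in this excerpt, and a plain iteration cap replaces the stopping rule of [63].

```python
# Condensed sketch of the NeuralMinimizer iteration (not the authors' code):
# fit an RBF surrogate, rank cheap surrogate samples, locally optimize the
# most promising ones, and feed every local minimum back into the surrogate.
import numpy as np
from scipy.optimize import minimize

def neural_minimizer_sketch(f, bounds, k=10, N_S=50, N_T=100, N_R=10_000,
                            iter_max=200, rng=np.random.default_rng(0)):
    lo, hi = np.asarray(bounds, dtype=float).T
    # Creation step: N_S uniform samples of the true objective.
    X = rng.uniform(lo, hi, size=(N_S, len(lo)))
    y = np.apply_along_axis(f, 1, X)
    x_best, y_best = X[y.argmin()], y.min()
    for _ in range(iter_max):
        model = train_rbf(X, y, k=k)          # surrogate on all data so far
        # Sampling step: draw N_R candidates, rank them by the surrogate's
        # prediction, keep the N_T most promising as local-search starts.
        C = rng.uniform(lo, hi, size=(N_R, len(lo)))
        order = np.argsort(rbf_predict(C, *model))
        for x0 in C[order[:N_T]]:
            # Optimization step: local search from each candidate, then
            # update the training set and retrain the surrogate (step iv).
            res = minimize(f, x0, method="L-BFGS-B",
                           bounds=list(zip(lo, hi)))
            X = np.vstack([X, res.x])
            y = np.append(y, res.fun)
            model = train_rbf(X, y, k=k)
            if res.fun < y_best:
                x_best, y_best = res.x, res.fun
    return x_best, y_best
```

Because the surrogate is resampled and retrained after every local search, expensive calls to $f$ are concentrated on regions the RBF network currently predicts to be low, which is the source of the reduction in function calls reported in the experiments.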
3. Experiments
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A
- Bent Cigar function. The function is
  $$ f(x) = x_1^2 + 10^6 \sum_{i=2}^{n} x_i^2 $$
- Bf1 function. The Bohachevsky 1 function is given by the equation
  $$ f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right) - \frac{4}{10}\cos\left(4\pi x_2\right) + \frac{7}{10} $$
- Bf2 function. The Bohachevsky 2 function is given by the equation
  $$ f(x) = x_1^2 + 2x_2^2 - \frac{3}{10}\cos\left(3\pi x_1\right)\cos\left(4\pi x_2\right) + \frac{3}{10} $$
- Branin function. The function is defined by
  $$ f(x) = \left(x_2 - \frac{5.1}{4\pi^2} x_1^2 + \frac{5}{\pi} x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos\left(x_1\right) + 10 $$
  with $-5 \le x_1 \le 10$, $0 \le x_2 \le 15$. The value of the global minimum is 0.397887.
- CM function. The Cosine Mixture function is given by the equation
  $$ f(x) = \sum_{i=1}^{n} x_i^2 - \frac{1}{10} \sum_{i=1}^{n} \cos\left(5\pi x_i\right), \quad x \in [-1, 1]^n $$
- Camel function. The Six Hump Camel function is given by
  $$ f(x) = 4x_1^2 - 2.1 x_1^4 + \frac{x_1^6}{3} + x_1 x_2 - 4 x_2^2 + 4 x_2^4, \quad x \in [-5, 5]^2 $$
  The global minimum has the value of $-1.0316$.
- Discus function. The function is defined as
  $$ f(x) = 10^6 x_1^2 + \sum_{i=2}^{n} x_i^2 $$
- Easom function. The function is given by the equation
  $$ f(x) = -\cos\left(x_1\right)\cos\left(x_2\right)\exp\left(-\left(x_1 - \pi\right)^2 - \left(x_2 - \pi\right)^2\right) $$
- Exponential function. The function is given by
  $$ f(x) = -\exp\left(-\frac{1}{2}\sum_{i=1}^{n} x_i^2\right), \quad -1 \le x_i \le 1 $$
  The values $n = 4, 16, 64$ were used here, and the corresponding function names are EXP4, EXP16, and EXP64.
- Griewank10 function, defined in ten dimensions as:
  $$ f(x) = 1 + \frac{1}{4000}\sum_{i=1}^{10} x_i^2 - \prod_{i=1}^{10} \cos\left(\frac{x_i}{\sqrt{i}}\right) $$
- Hansen function.
  $$ f(x) = \sum_{i=1}^{5} i \cos\left[\left(i-1\right) x_1 + i\right] \sum_{j=1}^{5} j \cos\left[\left(j+1\right) x_2 + j\right], \quad x \in [-10, 10]^2 $$
  The global minimum of the function is −176.541793.
- Hartman 3 function. The function is given by
  $$ f(x) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{3} a_{ij}\left(x_j - p_{ij}\right)^2\right), \quad x \in [0, 1]^3 $$
  with the standard Hartman coefficients $a_{ij}$, $c_i$, and $p_{ij}$. The value of the global minimum is −3.862782.
- Hartman 6 function. The same form in six dimensions,
  $$ f(x) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{6} a_{ij}\left(x_j - p_{ij}\right)^2\right), \quad x \in [0, 1]^6 $$
  The value of the global minimum is −3.322368.
- High Conditioned Elliptic function, defined as
  $$ f(x) = \sum_{i=1}^{n} \left(10^6\right)^{\frac{i-1}{n-1}} x_i^2 $$
- Potential function, used to represent the lowest energy for the molecular conformation of N atoms via the Lennard–Jones potential [69]. The pairwise potential is defined as:
  $$ V_{LJ}(r) = 4\epsilon \left[ \left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6} \right] $$
  and the objective is the total energy over all atom pairs. In the current experiments, two different cases were studied: N = 3 and N = 5 (POTENTIAL3 and POTENTIAL5 in the result tables).
- Rastrigin function. The function is given by
  $$ f(x) = x_1^2 + x_2^2 - \cos\left(18 x_1\right) - \cos\left(18 x_2\right), \quad x \in [-1, 1]^2 $$
- Shekel 5, Shekel 7, and Shekel 10 functions. These share the form
  $$ f(x) = -\sum_{i=1}^{m} \frac{1}{\left(x - a_i\right)\left(x - a_i\right)^T + c_i}, \quad x \in [0, 10]^4 $$
  with $m = 5, 7, 10$, respectively, and the standard coefficient vectors $a_i$ and $c_i$.
- Sinusoidal function. The function is given by
  $$ f(x) = -\left( 2.5 \prod_{i=1}^{n} \sin\left(x_i - z\right) + \prod_{i=1}^{n} \sin\left(5\left(x_i - z\right)\right) \right), \quad 0 \le x_i \le \pi $$
  The global minimum is located at $x^{*} = \left(2.09435, \dots, 2.09435\right)$ with $f\left(x^{*}\right) = -3.5$. For the conducted experiments, the cases of $n = 4$ and $n = 8$ with $z = \frac{\pi}{6}$ were studied.
- Test2N function. This function is given by the equation
  $$ f(x) = \frac{1}{2} \sum_{i=1}^{n} \left( x_i^4 - 16 x_i^2 + 5 x_i \right), \quad x_i \in [-5, 5] $$
  The function has $2^n$ local minima in the specified range and, in our experiments, we used $n = 4, 5, 6, 7$.
- Test30N function. This function is given by
  $$ f(x) = \frac{1}{10} \sin^2\left(3\pi x_1\right) \sum_{i=2}^{n-1} \left(x_i - 1\right)^2 \left(1 + \sin^2\left(3\pi x_{i+1}\right)\right) + \left(x_n - 1\right)^2 \left(1 + \sin^2\left(2\pi x_n\right)\right) $$
  with $x \in [-10, 10]^n$; the cases $n = 3$ and $n = 4$ were studied. Two of these functions are written out in code below for concreteness.
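For concreteness, here is how two entries of this list translate into code. The NumPy implementations below use the Shekel coefficient tables that are standard in the literature; they are illustrative and not taken from the paper's test harness.

```python
# Two benchmark functions from the appendix, written out in NumPy:
# the n-dimensional Exponential function and Shekel 5.
import numpy as np

def exponential(x):
    """EXP function: f(x) = -exp(-0.5 * sum(x_i^2)), x in [-1, 1]^n."""
    x = np.asarray(x, dtype=float)
    return -np.exp(-0.5 * np.sum(x ** 2))

# Standard Shekel coefficients (m = 5 rows of a, and the vector c).
SHEKEL_A = np.array([[4, 4, 4, 4], [1, 1, 1, 1], [8, 8, 8, 8],
                     [6, 6, 6, 6], [3, 7, 3, 7]], dtype=float)
SHEKEL_C = np.array([0.1, 0.2, 0.2, 0.4, 0.4])

def shekel5(x):
    """Shekel 5: f(x) = -sum_i 1 / ((x - a_i)(x - a_i)^T + c_i)."""
    x = np.asarray(x, dtype=float)
    return -np.sum(1.0 / (np.sum((x - SHEKEL_A) ** 2, axis=1) + SHEKEL_C))
```

As a quick check, shekel5(np.array([4.0, 4.0, 4.0, 4.0])) returns approximately −10.153, the usual reported global minimum of Shekel 5.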
References
1. Honda, M. Application of genetic algorithms to modelings of fusion plasma physics. Comput. Phys. Commun. 2018, 231, 94–106.
2. Luo, X.L.; Feng, J.; Zhang, H.H. A genetic algorithm for astroparticle physics studies. Comput. Phys. Commun. 2020, 250, 106818.
3. Aljohani, T.M.; Ebrahim, A.F.; Mohammed, O. Single and Multiobjective Optimal Reactive Power Dispatch Based on Hybrid Artificial Physics–Particle Swarm Optimization. Energies 2019, 12, 2333.
4. Pardalos, P.M.; Shalloway, D.; Xue, G. Optimization methods for computing global minima of nonconvex potential energy functions. J. Glob. Optim. 1994, 4, 117–133.
5. Liwo, A.; Lee, J.; Ripoll, D.R.; Pillardy, J.; Scheraga, H.A. Protein structure prediction by global optimization of a potential energy function. Proc. Natl. Acad. Sci. USA 1999, 96, 5482–5485.
6. An, J.; He, G.; Qin, F.; Li, R.; Huang, Z. A new framework of global sensitivity analysis for the chemical kinetic model using PSO-BPNN. Comput. Chem. Eng. 2018, 112, 154–164.
7. Gaing, Z.-L. Particle swarm optimization to solving the economic dispatch considering the generator constraints. IEEE Trans. Power Syst. 2003, 18, 1187–1195.
8. Basu, M. A simulated annealing-based goal-attainment method for economic emission load dispatch of fixed head hydrothermal power systems. Int. J. Electr. Power Energy Syst. 2005, 27, 147–153.
9. Cherruault, Y. Global optimization in biology and medicine. Math. Comput. Model. 1994, 20, 119–132.
10. Lee, E.K. Large-Scale Optimization-Based Classification Models in Medicine and Biology. Ann. Biomed. Eng. 2007, 35, 1095–1109.
11. Price, W.L. Global optimization by controlled random search. J. Optim. Theory Appl. 1983, 40, 333–348.
12. Křivý, I.; Tvrdík, J. The controlled random search algorithm in optimizing regression models. Comput. Stat. Data Anal. 1995, 20, 229–234.
13. Ali, M.M.; Törn, A.; Viitanen, S. A Numerical Comparison of Some Modified Controlled Random Search Algorithms. J. Glob. Optim. 1997, 11, 377–385.
14. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
15. Ingber, L. Very fast simulated re-annealing. Math. Comput. Model. 1989, 12, 967–973.
16. Eglese, R.W. Simulated annealing: A tool for operational research. Eur. J. Oper. Res. 1990, 46, 271–281.
17. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359.
18. Liu, J.; Lampinen, J. A Fuzzy Adaptive Differential Evolution Algorithm. Soft Comput. 2005, 9, 448–462.
19. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948.
20. Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization: An overview. Swarm Intell. 2007, 1, 33–57.
21. Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett. 2003, 85, 317–325.
22. Dorigo, M.; Birattari, M.; Stützle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39.
23. Socha, K.; Dorigo, M. Ant colony optimization for continuous domains. Eur. J. Oper. Res. 2008, 185, 1155–1173.
24. Goldberg, D. Genetic Algorithms in Search, Optimization and Machine Learning; Addison-Wesley: Reading, MA, USA, 1989.
25. Michalewicz, Z. Genetic Algorithms + Data Structures = Evolution Programs; Springer: Berlin, Germany, 1996.
26. Grady, S.A.; Hussaini, M.Y.; Abdullah, M.M. Placement of wind turbines using genetic algorithms. Renew. Energy 2005, 30, 259–270.
27. Floudas, C.A.; Gounaris, C.E. A review of recent advances in global optimization. J. Glob. Optim. 2009, 45, 3–38.
28. Da, Y.; Xiurun, G. An improved PSO-based ANN with simulated annealing technique. Neurocomputing 2005, 63, 527–533.
29. Liu, H.; Cai, Z.; Wang, Y. Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl. Soft Comput. 2010, 10, 629–640.
30. Pan, X.; Xue, L.; Lu, Y.; Sun, N. Hybrid particle swarm optimization with simulated annealing. Multimed. Tools Appl. 2019, 78, 29921–29936.
31. Ali, M.M.; Storey, C. Topographical multilevel single linkage. J. Glob. Optim. 1994, 5, 349–358.
32. Salhi, S.; Queen, N.M. A hybrid algorithm for identifying global and local minima when optimizing functions with many minima. Eur. J. Oper. Res. 2004, 155, 51–67.
33. Tsoulos, I.G.; Lagaris, I.E. MinFinder: Locating all the local minima of a function. Comput. Phys. Commun. 2006, 174, 166–179.
34. Betrò, B.; Schoen, F. Optimal and sub-optimal stopping rules for the multistart algorithm in global optimization. Math. Program. 1992, 57, 445–458.
35. Hart, W.E. Sequential stopping rules for random optimization methods with applications to multistart local search. SIAM J. Optim. 1998, 9, 270–290.
36. Lagaris, I.E.; Tsoulos, I.G. Stopping Rules for Box-Constrained Stochastic Global Optimization. Appl. Math. Comput. 2008, 197, 622–632.
37. Schutte, J.F.; Reinbolt, J.A.; Fregly, B.J.; Haftka, R.T.; George, A.D. Parallel global optimization with the particle swarm algorithm. Int. J. Numer. Methods Eng. 2004, 61, 2296–2315.
38. Larson, J.; Wild, S.M. Asynchronously parallel optimization solver for finding multiple minima. Math. Program. Comput. 2018, 10, 303–332.
39. Tsoulos, I.G.; Tzallas, A.; Tsalikakis, D. PDoublePop: An implementation of parallel genetic algorithm for function optimization. Comput. Phys. Commun. 2016, 209, 183–189.
40. Rocki, K.; Suda, R. An Efficient GPU Implementation of a Multi-Start TSP Solver for Large Problem Instances. In Proceedings of the 14th Annual Conference Companion on Genetic and Evolutionary Computation, Philadelphia, PA, USA, 7–11 July 2012; pp. 1441–1442.
41. Van Luong, T.; Melab, N.; Talbi, E.-G. GPU-Based Multi-start Local Search Algorithms. In Learning and Intelligent Optimization; Coello, C.A.C., Ed.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2011; Volume 6683.
42. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. QANA: Quantum-based avian navigation optimizer algorithm. Eng. Appl. Artif. Intell. 2021, 104, 104314.
43. Gharehchopogh, F.S. An Improved Tunicate Swarm Algorithm with Best-random Mutation Strategy for Global Optimization Problems. J. Bionic Eng. 2022, 19, 1177–1202.
44. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616.
45. Nadimi-Shahraki, M.H.; Asghari Varzaneh, Z.; Zamani, H.; Mirjalili, S. Binary Starling Murmuration Optimizer Algorithm to Select Effective Features from Medical Data. Appl. Sci. 2023, 13, 564.
46. Nadimi-Shahraki, M.H.; Zamani, H. DMDE: Diversity-maintained multi-trial vector differential evolution algorithm for non-decomposition large-scale global optimization. Expert Syst. Appl. 2022, 198, 116895.
47. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L. An improved moth-flame optimization algorithm with adaptation mechanism to solve numerical and mechanical engineering problems. Entropy 2021, 23, 1637.
48. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf mongoose optimization algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570.
49. Li, W. A Parallel Multi-start Search Algorithm for Dynamic Traveling Salesman Problem. In Experimental Algorithms; Pardalos, P.M., Rebennack, S., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2011; Volume 6630.
50. Martí, R.; Resende, M.G.C.; Ribeiro, C.C. Multi-start methods for combinatorial optimization. Eur. J. Oper. Res. 2013, 226, 1–8.
51. Pandiri, V.; Singh, A. Two multi-start heuristics for the k-traveling salesman problem. Opsearch 2020, 57, 1164–1204.
52. Wu, Q.; Hao, J.K. An adaptive multistart tabu search approach to solve the maximum clique problem. J. Comb. Optim. 2013, 26, 86–108.
53. Djeddi, Y.; Haddadene, H.A.; Belacel, N. An extension of adaptive multi-start tabu search for the maximum quasi-clique problem. Comput. Ind. Eng. 2019, 132, 280–292.
54. Bräysy, O.; Hasle, G.; Dullaert, W. A multi-start local search algorithm for the vehicle routing problem with time windows. Eur. J. Oper. Res. 2004, 159, 586–605.
55. Michallet, J.; Prins, C.; Amodeo, L.; Yalaoui, F.; Vitry, G. Multi-start iterated local search for the periodic vehicle routing problem with time windows and time spread constraints on services. Comput. Oper. Res. 2014, 41, 196–207.
56. Peng, K.; Pan, Q.K.; Gao, L.; Li, X.; Das, S.; Zhang, B. A multi-start variable neighbourhood descent algorithm for hybrid flowshop rescheduling. Swarm Evol. Comput. 2019, 45, 92–112.
57. Mao, J.Y.; Pan, Q.K.; Miao, Z.H.; Gao, L. An effective multi-start iterated greedy algorithm to minimize makespan for the distributed permutation flowshop scheduling problem with preventive maintenance. Expert Syst. Appl. 2021, 169, 114495.
58. Park, J.; Sandberg, I.W. Approximation and Radial-Basis-Function Networks. Neural Comput. 1993, 5, 305–316.
59. Yoo, S.H.; Oh, S.K.; Pedrycz, W. Optimized face recognition algorithm using radial basis function neural networks and its practical applications. Neural Netw. 2015, 69, 111–125.
60. Huang, G.B.; Saratchandran, P.; Sundararajan, N. A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation. IEEE Trans. Neural Netw. 2005, 16, 57–67.
61. Majdisova, Z.; Skala, V. Radial basis function approximations: Comparison and applications. Appl. Math. Model. 2017, 51, 728–743.
62. Kuo, B.C.; Ho, H.H.; Li, C.H.; Hung, C.C.; Taur, J.S. A Kernel-Based Feature Selection Method for SVM with RBF Kernel for Hyperspectral Image Classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 317–326.
63. Tsoulos, I.G. Modifications of real code genetic algorithm for global optimization. Appl. Math. Comput. 2008, 203, 598–607.
64. Powell, M.J.D. A Tolerant Algorithm for Linearly Constrained Optimization Calculations. Math. Program. 1989, 45, 547–566.
65. MacQueen, J. Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Davis, CA, USA, 27 December 1965–7 January 1966; Volume 1, pp. 281–297.
66. Ali, M.M.; Khompatraporn, C.; Zabinsky, Z.B. A Numerical Evaluation of Several Stochastic Algorithms on Selected Continuous Global Optimization Test Problems. J. Glob. Optim. 2005, 31, 635–672.
67. Floudas, C.A.; Pardalos, P.M.; Adjiman, C.; Esposito, W.R.; Gümüs, Z.H.; Harding, S.; Klepeis, J.; Meyer, C.; Schweiger, C. Handbook of Test Problems in Local and Global Optimization; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1999.
68. Kaelo, P.; Ali, M.M. Integrated crossover rules in real coded genetic algorithms. Eur. J. Oper. Res. 2007, 176, 60–76.
69. Lennard-Jones, J.E. On the Determination of Molecular Fields. Proc. R. Soc. Lond. A 1924, 106, 463–477.
Parameter | Meaning | Value
---|---|---
$k$ | Number of weights in the RBF network | 10
$N_S$ | Number of start samples | 50
$N_T$ | Number of samples used as starting points | 100
$N_R$ | Number of samples drawn from the RBF network | |
| Chromosomes, particles, or agents in the compared methods | 100
| Maximum number of iterations | 200
The entries in the following table are the average number of objective function calls for each method; a value in parentheses gives the fraction of runs that located the global minimum (omitted when every run succeeded).

Function | Genetic | PSO | DE | Proposed
---|---|---|---|---|
BF1 | 7150 | 9030 (0.87) | 5579 | 1051 |
BF2 | 7504 | 6505 (0.67) | 5598 | 921 |
BRANIN | 6135 | 6865 (0.93) | 5888 | 460 |
CAMEL | 6564 | 5162 | 6403 | 778 |
CIGAR10 | 11,813 | 18,803 | 13,313 | 1896 |
CM4 | 10,537 | 11,124 | 9018 | 1877 (0.87) |
DISCUS10 | 20,208 | 6039 | 7797 | 478 |
EASOM | 5281 | 2037 | 7917 | 258 |
ELP10 | 20,337 | 16,731 | 2863 | 2263 |
EXP4 | 10,537 | 9155 | 5944 | 750 |
EXP16 | 20,131 | 14,061 | 3653 | 885 |
EXP64 | 20,140 | 8958 | 3692 | 948 |
GRIEWANK10 | 20,151 (0.10) | 17,497 (0.03) | 16,469 (0.03) | 2697 |
POTENTIAL3 | 18,902 | 9936 | 5452 | 1192 |
POTENTIAL5 | 18,477 | 12,385 | 3972 | 2399 |
HANSEN | 10,708 | 9104 | 14,016 | 2370 (0.93) |
HARTMAN3 | 8481 | 12,971 | 4677 | 642 |
HARTMAN6 | 17,723 (0.60) | 15,174 (0.57) | 14,372 (0.90) | 883 |
RASTRIGIN | 6744 | 7639 (0.97) | 6148 | 1408 (0.80) |
ROSENBROCK4 | 20,815 (0.63) | 11,526 | 16,763 | 1619 |
ROSENBROCK8 | 20,597 (0.67) | 16,967 | 16,631 | 2444 |
SHEKEL5 | 14,456 (0.73) | 15,082 (0.47) | 13,178 | 2333 (0.87) |
SHEKEL7 | 16,786 (0.83) | 14,625 (0.40) | 12,050 | 1844 (0.93) |
SHEKEL10 | 15,586 (0.80) | 12,628 (0.53) | 13,107 | 2451 |
SINU4 | 11,908 | 10,659 | 9048 | 802 |
SINU8 | 20,115 | 13,912 | 16,210 | 1500 (0.97) |
TEST2N4 | 13,943 | 12,948 | 10,864 | 878 (0.93) |
TEST2N5 | 15,814 | 13,936 (0.90) | 15,259 | 971 (0.77) |
TEST2N6 | 18,987 | 15,449 (0.70) | 12,839 | 997 (0.70) |
TEST2N7 | 20,035 | 16,020 (0.50) | 8185 (0.97) | 1084 (0.30) |
TEST30N3 | 13,029 | 7239 | 4839 | 1061 |
TEST30N4 | 12,889 | 8051 | 5070 | 854 |
Total | 472,596 (0.89) | 368,218 (0.86) | 296,814 (0.96) | 42,994 (0.94)
Function | |||
---|---|---|---|
BF1 | 1051 | 1116 | 1224 |
BF2 | 921 | 949 | 1058 |
BRANIN | 460 | 506 | 599 |
CAMEL | 778 | 676 | 739 |
CIGAR10 | 1896 | 1934 | 2042 |
CM4 | 1877 (0.87) | 1859 (0.93) | 1877 (0.90) |
DISCUS10 | 478 | 531 | 634 |
EASOM | 258 | 307 | 450 |
ELP10 | 2263 | 2339 | 3130 |
EXP4 | 750 | 778 | 884 |
EXP16 | 885 | 932 | 1030 |
EXP64 | 948 | 998 | 1091 |
GRIEWANK10 | 2697 | 2647 | 2801 |
POTENTIAL3 | 1192 | 1228 | 1305 |
POTENTIAL5 | 2399 | 2417 | 2544 |
HANSEN | 2370 (0.93) | 2602 (0.93) | 2578 (0.97) |
HARTMAN3 | 642 | 696 | 798 |
HARTMAN6 | 883 | 940 | 1038 |
RASTRIGIN | 1408 (0.80) | 989 (0.83) | 1041 |
ROSENBROCK4 | 1619 | 1674 | 1751 |
ROSENBROCK8 | 2444 | 2499 | 2583 |
SHEKEL5 | 2333 (0.87) | 1267 | 1878 (0.97) |
SHEKEL7 | 1844 (0.93) | 1517 (0.93) | 1685 (0.97) |
SHEKEL10 | 2451 | 2695 | 1498 |
SINU4 | 802 | 821 | 901 |
SINU8 | 1500 (0.97) | 1216 | 1247 |
TEST2N4 | 878 (0.93) | 934 | 850 (0.97) |
TEST2N5 | 971 (0.77) | 941 (0.80) | 993 |
TEST2N6 | 997 (0.70) | 1087 (0.77) | 1098 |
TEST2N7 | 1084 (0.30) | 1160 (0.53) | 1313 (0.57) |
TEST30N3 | 1061 | 998 | 1320 |
TEST30N4 | 854 | 830 | 1108 |
Total | 42,994 (0.94) | 42,083 (0.96) | 45,088 (0.97)
Atoms | Genetic | PSO | Proposed |
---|---|---|---|
3 | 18,902 | 9936 | 1192 |
4 | 17,806 | 12,560 | 1964 |
5 | 18,477 | 12,385 | 2399 |
6 | 19,069 (0.20) | 9683 | 3198 |
7 | 16,390 (0.33) | 10,533 (0.17) | 3311 (0.97) |
8 | 15,924 (0.50) | 8053 (0.50) | 3526 |
9 | 15,041 (0.27) | 9276 (0.17) | 4338 |
10 | 14,817 (0.03) | 7548 (0.17) | 5517 (0.87) |
11 | 13,885 (0.03) | 6864 (0.13) | 6588 (0.80) |
12 | 14,435 (0.17) | 12,182 (0.07) | 7508 (0.83) |
13 | 14,457 (0.07) | 10,748 (0.03) | 6717 (0.77) |
14 | 13,906 (0.07) | 14,235 (0.13) | 6201 (0.93) |
15 | 12,832 (0.10) | 12,980 (0.10) | 7802 (0.90) |
Total | 205,941 (0.37) | 137,134 (0.42) | 60,258 (0.93)