Abstract
Several population-based metaheuristic optimization algorithms have been proposed in recent decades; according to the No Free Lunch (NFL) theorem, none of them can outperform all existing algorithms on every optimization problem. Many of these algorithms behave effectively, under a correct setting of their control parameter(s), when solving different engineering problems. Their optimization behavior can be boosted by various strategies, including hybridization and the use of chaotic maps instead of pseudo-random number generators (PRNGs). Hybrid algorithms are suitable for a large number of engineering applications, in which they behave more effectively than standalone optimization algorithms. However, they increase the difficulty of correctly setting control parameters, and they are sometimes designed to solve particular problems. This paper presents three hybrid algorithms, dubbed HYBPOP, HYBSUBPOP, and HYBIND, that combine up to seven algorithms free of control parameters. Each hybrid proposal uses a different strategy to switch the algorithm in charge of generating each new individual. These algorithms are Jaya, the sine cosine algorithm (SCA), Rao’s algorithms, teaching-learning-based optimization (TLBO), and chaotic Jaya. The experimental results show that the proposed algorithms perform better than the original algorithms, which indicates that the constituent algorithms are used effectively according to the problem being solved. A further advantage of the hybrid algorithms is that no prior process of control parameter tuning is needed.
1. Introduction
It is well known that metaheuristic optimization methods are widely used to solve problems in several fields of science and engineering. Population-based metaheuristic methods iteratively generate new populations to increase diversity in the current generation, which increases the probability of reaching the optimum of the considered problem. These algorithms are proposed as a replacement for exact optimization algorithms when the latter cannot reach an acceptable solution. This inability may be due either to the characteristics of the objective function or to a search space so wide that an exhaustive search is impractical. In addition, classical optimization methods, such as greedy-based algorithms, rely on assumptions that make it hard to solve the considered problem.
Metaheuristic methods, on the one hand, impose no restrictions on the objective function; on the other hand, each optimization method proposes its own rules for evolving the population towards the optimum. These algorithms are suitable for general problems, but each one has different skills in global exploration and local exploitation.
Some proposed algorithms that have proven effective in several areas of science and engineering are the following: the mine blast algorithm (MBA) [1], based on the mine bomb explosion concept; the manta ray foraging optimization method (MRFO) [2], based on intelligent behaviors of manta rays; the crow search algorithm (CSA) [3], based on the behavior of crows; the ant colony optimization (ACO) algorithm [4], which imitates the foraging behavior of ant colonies; the biogeography-based optimization (BBO) algorithm [5], which improves solutions stochastically and iteratively; the grenade explosion method (GEM) algorithm [6], based on the characteristics of a grenade explosion; the particle swarm optimization (PSO) algorithm [7], based on the social behavior of fish schooling or bird flocking; the firefly (FF) algorithm [8], inspired by the flashing behavior of fireflies; the artificial bee colony (ABC) algorithm [9], inspired by the foraging behavior of honey bees; the gravitational search algorithm (GSA) [10], based on Newton’s law of gravity; and the shuffled frog leaping (SFL) algorithm [11], which imitates the collaborative behavior of frogs; among others. Many of them require configuration parameters that must be correctly tuned according to the problem to be solved; see, for example, [12]. Otherwise, their exploitation and exploration skills can degrade: if the exploitation capacity degrades, the number of generated populations must be increased, while if the exploration capacity deteriorates, the quality of the solution may worsen.
Other proposed algorithms that have also been shown to be effective in various areas of science and engineering, but have no algorithm-specific parameters, are: the sine cosine algorithm (SCA) [13], based on the sine and cosine trigonometric functions; the teaching-learning-based optimization (TLBO) algorithm [14], based on the processes of teaching and learning; the supply-demand-based optimization method (SDO) [15], based on both the demand relation of consumers and the supply relation of producers; the Jaya algorithm [16], based on geometric distances and random processes; the Harris hawks optimization method (HHO) [17], based on the cooperative behavior and chasing style of Harris’ hawks; and Rao’s optimization algorithms [18]; among others.
One widely used technique to improve optimization algorithms is chaos theory, which studies nonlinear dynamic systems characterized by a high sensitivity to their initial conditions [19,20]. Chaotic maps can replace pseudo-random number generators (PRNGs) in producing the control parameters or performing local searches [21,22,23,24,25,26,27,28,29,30,31,32,33,34]. However, the improvement obtained by using chaotic systems instead of PRNGs may be restricted to the problem under consideration or to a set of problems with similar characteristics.
Hybridization is a well-known strategy that boosts the capacity of optimization algorithms. Since no metaheuristic optimization algorithm can outperform all others on every problem, hybridization merges the capabilities of different algorithms into one system [35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52]. Many of these algorithms require the correct setting of control parameters, and merging several of them into a single solution increases the complexity of accurately adjusting those parameters. Furthermore, some hybridization techniques can be complicated if the management and replacement strategies of individuals in the populations differ. On the other hand, when chaos is applied, hybrid algorithms can provide excellent performance for only a limited number of applications.
The proposed algorithms consist of hybridizations of seven of the best optimization algorithms that satisfy two requirements: (i) they must be free of algorithm-specific control parameters, and (ii) population management should allow hybridization not only at the population level but also at the individual level.
The remainder of this paper is organized as follows. Section 2 presents a brief description of the optimization algorithms used for the hybridizations. Section 3 describes the hybrid algorithms in detail, analyses of which are provided in Section 4. Finally, concluding remarks are drawn in Section 5.
2. Preliminaries
As mentioned above, among the best algorithms free of control parameters are the Jaya algorithm [16], the SCA algorithm [13], the supply-demand-based optimization method [15], Rao’s optimization algorithms [18], the Harris hawks optimization method (HHO) [17], and the teaching-learning-based optimization (TLBO) algorithm [14]. Among these proposals, the HHO algorithm is the most complex. It consists of two phases; during the first phase, the elements of the population are replaced without comparing the fitness of the associated solutions, which is an unwanted strategy for hybrid algorithms. In addition, the SDO algorithm, which offers impressive initial results, works with two populations, which prevents its integration into our hybrid proposals.
The Jaya optimization algorithm and the three new Rao’s optimization algorithms (i.e., RAO1, RAO2, and RAO3) are described in Algorithm 1. The Jaya optimization algorithm has been successfully used to solve a large number of large-scale industrial problems [53,54,55,56,57,58,59]. The three new Rao’s optimization algorithms are metaphor-less algorithms based on the best and worst solutions obtained during the optimization process and on random interactions between the candidate solutions [60,61,62]. The notation used in Algorithms 1–3 and 5–9 comprises: the number of generations; the number of individuals in the population; the number of variables of the objective function F; the mth individual in the current population; the low and high bounds of the kth variable of F; the best and worst individuals of the current population; the mth new individual that may replace the current mth individual of the population; and a uniformly distributed random number.
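The update rules of Jaya and RAO1 can be sketched in C (the paper's implementation language). This is an illustrative sketch of the published per-variable rules, not the authors' code, and it omits the bound handling and greedy replacement of Algorithm 1; the helper `urand` stands in for the uniform random draw:

```c
#include <stdlib.h>
#include <math.h>
#include <assert.h>

/* Uniformly distributed random number in [0, 1). */
static double urand(void) { return (double)rand() / ((double)RAND_MAX + 1.0); }

/* Jaya update of one variable: move towards the best solution and away
   from the worst one. */
double jaya_update(double x, double best, double worst) {
    double r1 = urand(), r2 = urand();
    return x + r1 * (best - fabs(x)) - r2 * (worst - fabs(x));
}

/* RAO1 update of one variable: driven only by the best-worst difference. */
double rao1_update(double x, double best, double worst) {
    return x + urand() * (best - worst);
}
```

In a full run, these updates are applied to every variable of every individual in each generation, and the new individual replaces the current one only when it improves the fitness.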
The SCA algorithm, presented in Algorithm 2, has been proven efficient in several applications [63,64,65,66,67,68,69].
The TLBO algorithm, described in Algorithm 3, is a two-phase algorithm consisting of a teacher phase and a learner phase. It has been proven effective in solving various engineering problems [70,71,72,73,74,75,76].
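The two TLBO phases can be sketched per variable as follows; again, this is an illustrative sketch of the published update rules [14], where `teacher` is the best solution, `mean` is the population mean of the variable, and the partner `xj` is a randomly chosen learner:

```c
#include <stdlib.h>
#include <assert.h>

static double u01(void) { return (double)rand() / ((double)RAND_MAX + 1.0); }

/* Teacher phase: move towards the teacher (best solution) and away from
   TF times the class mean; the teaching factor TF is 1 or 2. */
double tlbo_teacher(double x, double teacher, double mean) {
    int tf = 1 + rand() % 2;
    return x + u01() * (teacher - tf * mean);
}

/* Learner phase (minimization): learn from a random partner j, moving
   towards it when it is fitter than individual i, away otherwise. */
double tlbo_learner(double xi, double xj, double fi, double fj) {
    double r = u01();
    return (fi < fj) ? xi + r * (xi - xj) : xi + r * (xj - xi);
}
```

Since both phases run for each individual, TLBO evaluates the objective function twice as often per iteration as the single-phase algorithms, which matters for the cost comparison in Section 4.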
As mentioned earlier, the use of chaotic maps can improve the behavior of some metaheuristic methods. The 2D chaotic map reported in [33] has significantly improved the convergence rate of the Jaya algorithm [33,77]. The generation of the 2D chaotic map is shown in Algorithm 4; its initial conditions and the range of the computed chaotic values are given in [33]. The chaotic Jaya algorithm (in short, CJaya) is shown in Algorithm 5, where the random numbers are replaced by chaotic values randomly extracted from the 2D chaotic map. Other chaotic maps have been applied to Jaya in [32,78]; however, they do not surpass the chaotic behavior of the aforementioned 2D map.
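Since the 2D chaotic map of [33] is not reproduced here, the following sketch uses a 1D logistic map purely as a stand-in to illustrate how chaotic draws can replace PRNG calls in the Jaya update; it is not the CJaya of Algorithm 5:

```c
#include <math.h>
#include <assert.h>

/* Stand-in for the 2D map of [33] (not reproduced here): a logistic map
   illustrates the idea of replacing PRNG draws with a chaotic sequence. */
static double chaos_state = 0.7;  /* initial condition; avoid the fixed points */

static double chaotic_rand(void) {
    chaos_state = 4.0 * chaos_state * (1.0 - chaos_state);  /* logistic map, r = 4 */
    return chaos_state;
}

/* Chaotic Jaya update: the Jaya rule with chaotic draws instead of rand(). */
double cjaya_update(double x, double best, double worst) {
    double c1 = chaotic_rand(), c2 = chaotic_rand();
    return x + c1 * (best - fabs(x)) - c2 * (worst - fabs(x));
}
```

The deterministic but aperiodic sequence tends to cover the unit interval less uniformly than a PRNG, which is precisely what can help some problems and hurt others, as noted above.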
As they present a similar structure, Algorithms 1–5 are used for designing our hybrid algorithms.
| Algorithm 1 Jaya and Rao algorithms |
|
| Algorithm 2 SCA optimization algorithm |
|
| Algorithm 3 TLBO algorithm |
|
| Algorithm 4 2D chaotic map |
|
| Algorithm 5 Chaotic 2D Jaya algorithm |
|
3. Hybrid Algorithms
The proposed hybrid algorithms are designed using the seven algorithms described in Section 2. These algorithms have been selected for their performance in solving constrained and unconstrained functions, but also because they share a similar structure that allows the implementation of different hybridization strategies.
Algorithm 6 shows the skeleton of the proposed hybrid algorithms, which includes all the common and algorithm-specific tasks except the updating procedure of the current population. Since the TLBO algorithm is a two-phase algorithm, the proposed hybrid algorithms apply these two phases consecutively to each individual. In contrast to the other algorithms, where a single phase is executed, an internal flag is used to process the same individual twice when the TLBO algorithm is selected (see lines 24–29 of Algorithm 6). In Algorithms 6–9, a selector variable determines the algorithm responsible for producing each new individual (see line 17 of Algorithm 6).
Given that only algorithms free of control parameters have been considered, proposals that would require the inclusion of control parameters have been discarded. Following these guidelines, we have designed three hybrid algorithms, an analysis of which is provided in Section 4. The first proposed hybrid algorithm, shown in Algorithm 7 and referred to as the HYBPOP algorithm, processes the entire population in each iteration using the same algorithm. This is the most straightforward hybridization technique, in which it is not mandatory for all algorithms to follow the structure given by Algorithm 6. In Algorithms 7–9, the notation also includes the number of algorithms free of control parameters involved in the hybrid proposals.
| Algorithm 6 Skeleton of hybrid algorithms |
|
| Algorithm 7 HYBPOP: Hybrid algorithm based on population |
|
The second algorithm, named HYBSUBPOP, is described in Algorithm 8. It logically splits the population into sub-populations. During the optimization process, each sub-population is processed by one of the seven algorithms mentioned previously.
| Algorithm 8 HYBSUBPOP: Hybrid algorithm based on sub-populations |
|
Algorithm 9 shows the third proposed hybrid algorithm, dubbed HYBIND, in which each individual of the population is handled by a different algorithm in each iteration.
| Algorithm 9 HYBIND: Hybrid algorithm based on individuals |
|
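The difference between the switching strategies can be sketched in C. The round-robin selection policies below are illustrative assumptions (the paper's actual switching rules are given in Algorithms 7–9), and the shared signature is a simplification, since TLBO's two phases need more inputs:

```c
#include <assert.h>

#define N_ALG 7  /* algorithms in the pool (Jaya, CJaya, SCA, RAO1-3, TLBO) */

/* All pooled update rules share one signature, which is what allows
   switching algorithms per population, sub-population, or individual. */
typedef double (*update_fn)(double x, double best, double worst);

/* HYBPOP-style selection (illustrative round-robin): one algorithm per
   iteration, applied to the whole population. */
int hybpop_select(int iteration) { return iteration % N_ALG; }

/* HYBIND-style selection (illustrative): the algorithm changes from one
   individual to the next within the same iteration. */
int hybind_select(int iteration, int individual) {
    return (iteration + individual) % N_ALG;
}

/* Greedy replacement shared by all three hybrids: a new individual
   replaces the current one only if it improves the fitness. */
int accept_new(double f_new, double f_cur) { return f_new < f_cur; }
```

The uniform signature and the mandatory greedy replacement are the two properties that let the skeleton of Algorithm 6 host any of the pooled algorithms without extra control parameters.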
It is worth noting that the aim of the proposed hybrid algorithms is neither to improve the convergence ratio of each of the used algorithms separately nor to perform optimally for a particular problem; rather, it is to show outstanding performance for a large number of problems without adjusting any control parameters of the considered algorithms.
4. Numerical Experiments
In this section, the performance of the proposed hybrid algorithms is analyzed by solving 28 well-known unconstrained functions (see Table 1), the definitions of which can be found in [77]. The hybrid proposals, along with the original algorithms, were implemented in the C language using GCC v.4.4.7 [79] and run on an Intel Xeon E5-2620 v2 processor at 2.1 GHz. The C implementations of the original algorithms are not available on the Internet; however, their Java/Matlab implementations are commonly available.
Table 1.
Benchmark functions. Names and parameters.
The data collected from the experimental analysis are as follows:
- NoR-AI: the total number of replacements for any individual.
- NoR-BI: the total number of replacements for the current best individual.
- NoR-BwT: the total number of replacements for the current best individual with an error of less than .
- LtI-AI: the last iteration in which a replacement of any individual occurs.
- LtI-BI: the last iteration in which a replacement of the best individual occurs.
Three of the five analyzed data (NoR-) indicate the number of times the current individual is replaced by a new individual that provides a better fitness value (see line 21 of Algorithm 6), while the remaining two (LtI-) refer to the last generation (iteration) in which at least one individual has been replaced.
All data given below have been obtained over 50 runs, with a fixed number of iterations and two population sizes (140 and 210). The maximum values of the analyzed data are listed in Table 2.
Table 2.
Maximum values of the analyzed data.
Table 3, Table 4 and Table 5 show the data of all the considered algorithms independently, i.e., without hybridization. As expected, the behavior of the different algorithms does not follow a common pattern; in addition, it depends on the objective function. Regarding global convergence, both TLBO and CJaya behave better, but with a higher order of complexity (see [77,80]). Moreover, note that when using TLBO, two new individuals are generated in each iteration: one in the teacher phase and the other in the learner phase. The values in brackets in Table 3, Table 4 and Table 5 refer to the standard deviation of the data over 50 runs. Note that heuristic optimization algorithms are partially based on randomness, which leads to high values of standard deviation. The average standard deviations are approximately equal to 16%, 22%, 15%, 30%, 23%, 23%, and 22% for Jaya, Chaotic Jaya, SCA, RAO1, RAO2, RAO3, and TLBO, respectively.
Table 3.
Analysis of Jaya and chaotic Jaya on functions F1–F28 with a population size of 140.
Table 4.
Analysis of the sine cosine algorithm (SCA) and Rao’s optimization algorithm 1 (RAO1) on functions F1–F28 with a population size of 140.
Table 5.
Analysis of RAO2, RAO3, and teaching-learning-based optimization (TLBO) on functions F1–F28 with a population size of 140.
An important aspect not shown in Table 3, Table 4 and Table 5 is whether the solution obtained by each algorithm is acceptable. In particular, the original algorithms fail to reach an error tolerance of less than for 3, 8, 2, 4, 7, 5, and 2 functions for Jaya, CJaya, SCA, RAO1, RAO2, RAO3, and TLBO, respectively. Therefore, considering only the original algorithms, no algorithm always behaves best, which justifies the development of a generalist hybrid system that can solve a large number of benchmark functions and engineering problems.
Comparing the quality of the solutions obtained by the proposed hybrid algorithms, it can be concluded that the HYBSUBPOP algorithm is the worst one, because the same standalone algorithm is always applied to the same sub-population, which degrades performance for a small population. Contrary to HYBSUBPOP, the HYBPOP and HYBIND algorithms apply the selected algorithms to all individuals, which leads to hybridizations that exploit the population better. The HYBSUBPOP algorithm fails to reach a solution tolerance of less than in 3 functions (F11, F23, and F27), whereas the HYBPOP and HYBIND algorithms fail in only one function (F27 and F11, respectively). If the population size is increased to 210, the HYBIND algorithm succeeds on all functions; thus, the HYBIND algorithm performs slightly better than HYBPOP.
As stated above, local exploitation improves both in the HYBPOP method and especially in the HYBIND method. Figure 1 and Figure 2 show the convergence curves of all the individual methods and of the three proposed hybrid methods for the first 1000 and 100 iterations, respectively, for functions F1, F8, F11, and F18. Each point in both figures is the average of the data obtained from 10 runs. As shown in these figures, the curves of the three hybrid methods are similar to the curves of the best single algorithms for each function. Therefore, global exploration, while not improved over all methods, behaves similarly to that of the best single methods for each function; note that the best individual method is not always the same.
Figure 1.
Convergence curves. The population size is set as 140 and the number of iterations to 1000.
Figure 2.
Convergence curves. The population size is set as 140 and the number of iterations to 100.
Table 6 sorts the algorithms according to the number of iterations required to obtain an error of less than ; if an algorithm is missing from a row, it did not reach an acceptable solution. As seen from this table, no algorithm outperforms all the others. Moreover, a computational cost analysis is necessary to classify them correctly. Table 7 shows the computational cost of the different algorithms. This table reveals that the hybrid algorithms are mid-ranked in terms of computational cost, and that HYBIND is computationally less expensive than HYBPOP.
Table 6.
Ranking of the algorithms according to the number of iterations required to achieve an error of less than . The population size is set as 140.
Table 7.
Computational times (s.) for 50 runs. The population size is set as 140 and is set to 50,000.
An analysis of the contribution of each algorithm to the HYBPOP and HYBIND algorithms is presented in Table 8, Table 9 and Table 10. Table 8 indicates the number of times that an individual has been replaced in each algorithm. A replacement is accepted when the new individual improves the fitness of the current solution. As seen from Table 8, the HYBIND algorithm performs more replacements of individuals. In addition, the numbers of replacements per individual are nearly equal across the contributing algorithms, except for the RAO1 algorithm, whose contribution to replacements is limited. The standard deviations (from 50 runs) are given in brackets. We found that, on average, the standard deviations for the HYBPOP and HYBIND algorithms are both equal to 14%.
Table 8.
Contribution of each algorithm to the replacements of the individuals (NoR-AI). The population size is set as 140.
Table 9.
Last iteration in which a replacement of any individual occurs (LtI-AI). The population size is set as 140.
Table 10.
Last iteration in which a replacement of the best individual occurs (LtI-BI). The population size is set as 140.
Table 9 shows the last iteration in which each optimization algorithm replaces an individual in the population, i.e., the point after which it no longer brings improvement to the hybrid algorithm. As can be seen, the optimization algorithms, except the RAO1 algorithm, work efficiently within the hybrid algorithms. It is also revealed that the considered algorithms contribute over more generations in the HYBIND algorithm. The mean value of the standard deviation rises to 28% and 23% for HYBPOP and HYBIND, respectively, due to random behavior and the lower LtI-AI values.
Finally, Table 10 shows the last iteration in which each algorithm obtains a new optimum. A careful analysis of the results in Table 10 reveals that, in the HYBPOP algorithm, the seven algorithms contribute similarly to reaching a better solution as new populations are produced. By contrast, in the HYBIND algorithm, the dominant algorithms are CJaya and TLBO. It should be noted that the CJaya algorithm extracts random individuals from the population to generate new individuals, while the TLBO algorithm uses all the individuals of the population to obtain new ones. Therefore, these algorithms exploit the results obtained by the rest of the algorithms to converge towards the optimum. This is due to the nature of these algorithms, in which the best solution correctly guides the individuals. The mean value of the standard deviation is high because the LtI-BI is strongly affected by random behavior.
It has been found that the HYBSUBPOP algorithm does not reach excellent optimization performance because of the lack of harmony between the original algorithms, so it has been left without further analysis. On the other hand, the exploration phases of the HYBPOP and HYBIND algorithms are similar, whereas the HYBIND algorithm outperforms HYBPOP in terms of exploitation. The hybridization of the original algorithms is implemented at the individual level in the HYBIND algorithm, contrary to the HYBPOP algorithm, in which hybridization is performed at the population level. Finally, the HYBPOP algorithm can include algorithms that update the population without analyzing the fitness of the associated solutions, while this fitness check is mandatory in the HYBIND algorithm.
5. Conclusions
This paper proposed a hybridization strategy for seven well-known algorithms. Three hybrid algorithms free of control parameters, dubbed HYBSUBPOP, HYBPOP, and HYBIND, were designed. These algorithms are derived from a dynamic skeleton that allows the inclusion of any metaheuristic optimization algorithm, enabling further improvements. The only requirement for merging a new optimization algorithm into the proposed skeleton is to know whether the replacement of an individual in that algorithm is conditioned on an improvement of the cost function. Moreover, both chaotic algorithms and multi-phase algorithms have been employed in designing the proposed hybrid algorithms, which proves the versatility of the proposed hybridization skeleton. The experimental results show that the HYBPOP and HYBIND algorithms effectively exploit the capabilities of all the considered algorithms. They present an excellent ability to solve a large number of benchmark functions while improving the quality of the obtained solutions. Generally speaking, hybridization at the individual level is better than hybridization at the population level, which explains why the performance of the HYBSUBPOP algorithm is inferior to that of the other hybrid algorithms. As future lines of work, we intend to integrate more efficient algorithms into the proposed hybridization skeleton, to evaluate new hybridization versions, and to extend the performance analysis of the most promising algorithms to more complex functions and real-world engineering problems.
Author Contributions
H.M., A.B., J.-L.S.-R., H.R., and A.J.-M. conceived the hybrid algorithms; A.B. conceived the Chaotic 2D Jaya algorithm; H.M. designed the hybrid algorithms; H.M. codified the hybrid algorithms; H.M., J.-L.S.-R., and H.R. performed numerical experiments; H.M., A.B., and A.J.-M. analyzed the data; H.M. wrote the original draft. A.B., J.-L.S.-R., H.R., and A.J.-M. reviewed and edited the manuscript. All authors have read and agreed to the published version of the manuscript.
Funding
This research and APC was funded by the Spanish Ministry of Science, Innovation and Universities and the Research State Agency under Grant RTI2018-098156-B-C54 co-financed by FEDER funds, and by the Spanish Ministry of Economy and Competitiveness under Grant TIN2017-89266-R, co-financed by FEDER funds.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine Blast Algorithm: A New Population Based Algorithm for Solving Constrained Engineering Optimization Problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
- Zhao, W.; Zhang, Z.; Wang, L. Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications. Eng. Appl. Artif. Intell. 2020, 87, 103300. [Google Scholar] [CrossRef]
- Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
- Dorigo, M.; Di Caro, G. New Ideas in Optimization; Chapter The Ant Colony Optimization Meta-Heuristic; McGraw-Hill Ltd.: Maidenhead, UK, 1999; pp. 11–32. [Google Scholar]
- Ma, H.; Simon, D.; Siarry, P.; Yang, Z.; Fei, M. Biogeography-Based Optimization: A 10-Year Review. IEEE Trans. Emerg. Top. Comput. Intell. 2017, 1, 391–407. [Google Scholar] [CrossRef]
- Ahrari, A.; Atai, A.A. Grenade Explosion Method—A novel tool for optimization of multimodal functions. Appl. Soft Comput. 2010, 10, 1132–1140. [Google Scholar] [CrossRef]
- Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization. Swarm Intell. 2007, 1, 33–57. [Google Scholar] [CrossRef]
- Xin-She, Y. Firefly Algorithm, Lévy Flights and Global Optimization. Res. Dev. Intell. Syst. XXVI 2009, 209–218. [Google Scholar] [CrossRef]
- Karaboga, D.; Basturk, B. On the Performance of Artificial Bee Colony (ABC) Algorithm. Appl. Soft Comput. 2008, 8, 687–697. [Google Scholar] [CrossRef]
- Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
- Eusuff, M.; Lansey, K.; Pasha, F. Shuffled frog-leaping algorithm: A memetic meta-heuristic for discrete optimization. Eng. Optim. 2006, 38, 129–154. [Google Scholar] [CrossRef]
- Szénási, S.; Felde, I. Configuring genetic algorithm to solve the inverse heat conduction problem. Acta Polytech. Hung. 2017, 14, 133–152. [Google Scholar] [CrossRef]
- Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
- Rao, R.V.; Savsani, V.; Vakharia, D. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
- Zhao, W.; Wang, L.; Zhang, Z. Supply-Demand-Based Optimization: A Novel Economics-Inspired Algorithm for Global Optimization. IEEE Access 2019, 7, 73182–73206. [Google Scholar] [CrossRef]
- Rao, R.V. Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int. J. Ind. Eng. Comput. 2016, 7, 19–34. [Google Scholar] [CrossRef]
- Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
- Rao, R.V. Rao algorithms: Three metaphor-less simple algorithms for solving optimization problems. Int. J. Ind. Eng. Comput. 2020, 11, 107–130. [Google Scholar] [CrossRef]
- Majumdar, M.; Mitra, T.; Nishimura, K. Optimization and Chaos; Springer: New York, NY, USA, 2000. [Google Scholar]
- Ott, E. Frontmatter. In Chaos in Dynamical Systems, 2nd ed.; Cambridge University Press: Cambridge, UK, 2002; pp. i–iv. [Google Scholar]
- Gandomi, A.; Yang, X.S.; Talatahari, S.; Alavi, A. Firefly algorithm with chaos. Commun. Nonlinear Sci. Numer. Simul. 2013, 18, 89–98. [Google Scholar] [CrossRef]
- Gokhale, S.; Kale, V. An application of a tent map initiated Chaotic Firefly algorithm for optimal overcurrent relay coordination. Int. J. Electr. Power Energy Syst. 2016, 78, 336–342. [Google Scholar] [CrossRef]
- Ma, Z.S. Chaotic populations in genetic algorithms. Appl. Soft Comput. 2012, 12, 2409–2424. [Google Scholar] [CrossRef]
- Yan, X.F.; Chen, D.Z.; Hu, S.X. Chaos-genetic algorithms for optimizing the operating conditions based on RBF-PLS model. Comput. Chem. Eng. 2003, 27, 1393–1404. [Google Scholar] [CrossRef]
- Hong, W.C. Traffic flow forecasting by seasonal SVR with chaotic simulated annealing algorithm. Neurocomputing 2011, 74, 2096–2107. [Google Scholar] [CrossRef]
- Mingjun, J.; Huanwen, T. Application of chaos in simulated annealing. Chaos Solitons Fractals 2004, 21, 933–941. [Google Scholar] [CrossRef]
- Saremi, S.; Mirjalili, S.; Lewis, A. Biogeography-based optimisation with chaos. Neural Comput. Appl. 2014, 25, 1077–1097. [Google Scholar] [CrossRef]
- Wang, X.; Duan, H. A hybrid biogeography-based optimization algorithm for job shop scheduling problem. Comput. Ind. Eng. 2014, 73, 96–114. [Google Scholar] [CrossRef]
- Jia, D.; Zheng, G.; Khan, M.K. An effective memetic differential evolution algorithm based on chaotic local search. Inf. Sci. 2011, 181, 3175–3187. [Google Scholar] [CrossRef]
- Peng, C.; Sun, H.; Guo, J.; Liu, G. Dynamic economic dispatch for wind-thermal power system using a novel bi-population chaotic differential evolution algorithm. Int. J. Electr. Power Energy Syst. 2012, 42, 119–126. [Google Scholar] [CrossRef]
- Alatas, B. Chaotic bee colony algorithms for global numerical optimization. Expert Syst. Appl. 2010, 37, 5682–5687. [Google Scholar] [CrossRef]
- Yu, J.; Kim, C.H.; Wadood, A.; Khurshiad, T.; Rhee, S.B. A Novel Multi-Population Based Chaotic JAYA Algorithm with Application in Solving Economic Load Dispatch Problems. Energies 2018, 11, 1946. [Google Scholar] [CrossRef]
- Farah, A.; Belazi, A. A novel chaotic Jaya algorithm for unconstrained numerical optimization. Nonlinear Dyn. 2018, 93, 1451–1480. [Google Scholar] [CrossRef]
- Kumar, Y.; Singh, P.K. A chaotic teaching learning based optimization algorithm for clustering problems. Appl. Intell. 2019, 49, 107–130. [Google Scholar] [CrossRef]
- Moayedi, H.; Gör, M.; Khari, M.; Foong, L.K.; Bahiraei, M.; Bui, D.T. Hybridizing four wise neural-metaheuristic paradigms in predicting soil shear strength. Measurement 2020, 156, 107576. [Google Scholar] [CrossRef]
- Nguyen, B.M.; Tran, T.; Nguyen, T.; Nguyen, G. Hybridization of Galactic Swarm and Evolution Whale Optimization for Global Search Problem. IEEE Access 2020, 8, 74991–75010. [Google Scholar] [CrossRef]
- Kayabekir, A.E.; Toklu, Y.C.; Bekdaş, G.; Nigdeli, S.M.; Yücel, M.; Geem, Z.W. A Novel Hybrid Harmony Search Approach for the Analysis of Plane Stress Systems via Total Potential Optimization. Appl. Sci. 2020, 10, 2301. [Google Scholar] [CrossRef]
- Punurai, W.; Azad, M.S.; Pholdee, N.; Bureerat, S.; Sinsabvarodom, C. A novel hybridized metaheuristic technique in enhancing the diagnosis of cross-sectional dent damaged offshore platform members. Comput. Intell. 2020, 36, 132–150. [Google Scholar] [CrossRef]
- Pellegrini, R.; Serani, A.; Liuzzi, G.; Rinaldi, F.; Lucidi, S.; Diez, M. Hybridization of Multi-Objective Deterministic Particle Swarm with Derivative-Free Local Searches. Mathematics 2020, 8, 546. [Google Scholar] [CrossRef]
- Yue, Z.; Zhang, S.; Xiao, W. A Novel Hybrid Algorithm Based on Grey Wolf Optimizer and Fireworks Algorithm. Sensors 2020, 20, 2147. [Google Scholar] [CrossRef]
- Seifi, A.; Ehteram, M.; Singh, V.P.; Mosavi, A. Modeling and Uncertainty Analysis of Groundwater Level Using Six Evolutionary Optimization Algorithms Hybridized with ANFIS, SVM, and ANN. Sustainability 2020, 12, 4023. [Google Scholar] [CrossRef]
- Chen, X.; Yu, K. Hybridizing cuckoo search algorithm with biogeography-based optimization for estimating photovoltaic model parameters. Sol. Energy 2019, 180, 192–206. [Google Scholar] [CrossRef]
- Zhang, X.; Shen, X.; Yu, Z. A Novel Hybrid Ant Colony Optimization for a Multicast Routing Problem. Algorithms 2019, 12, 18. [Google Scholar] [CrossRef]
- Aljohani, T.M.; Ebrahim, A.F.; Mohammed, O. Single and Multiobjective Optimal Reactive Power Dispatch Based on Hybrid Artificial Physics—Particle Swarm Optimization. Energies 2019, 12, 2333. [Google Scholar] [CrossRef]
- Ahmadian, A.; Elkamel, A.; Mazouz, A. An Improved Hybrid Particle Swarm Optimization and Tabu Search Algorithm for Expansion Planning of Large Dimension Electric Distribution Network. Energies 2019, 12, 3052. [Google Scholar] [CrossRef]
- Li, G.; Liu, P.; Le, C.; Zhou, B. A Novel Hybrid Meta-Heuristic Algorithm Based on the Cross-Entropy Method and Firefly Algorithm for Global Optimization. Entropy 2019, 21, 494. [Google Scholar] [CrossRef]
- Cherki, I.; Chaker, A.; Djidar, Z.; Khalfallah, N.; Benzergua, F. A Sequential Hybridization of Genetic Algorithm and Particle Swarm Optimization for the Optimal Reactive Power Flow. Sustainability 2019, 11, 3862. [Google Scholar] [CrossRef]
- Ghanem, W.A.H.M.; Jantan, A. Hybridizing artificial bee colony with monarch butterfly optimization for numerical optimization problems. Neural Comput. Appl. 2018, 30, 163–181. [Google Scholar] [CrossRef]
- Das, P.; Behera, H.; Panigrahi, B. A hybridization of an improved particle swarm optimization and gravitational search algorithm for multi-robot path planning. Swarm Evol. Comput. 2016, 28, 14–28. [Google Scholar] [CrossRef]
- Wang, G.G.; Gandomi, A.H.; Zhao, X.; Chu, H.C.E. Hybridizing harmony search algorithm with cuckoo search for global numerical optimization. Soft Comput. 2016, 20, 273–285. [Google Scholar] [CrossRef]
- Zhu, A.; Xu, C.; Li, Z.; Wu, J.; Liu, Z. Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC. J. Syst. Eng. Electron. 2015, 26, 317–328. [Google Scholar] [CrossRef]
- Javaid, N.; Ahmed, A.; Iqbal, S.; Ashraf, M. Day Ahead Real Time Pricing and Critical Peak Pricing Based Power Scheduling for Smart Homes with Different Duty Cycles. Energies 2018, 11, 1464. [Google Scholar] [CrossRef]
- Mishra, S.; Ray, P.K. Power quality improvement using photovoltaic fed DSTATCOM based on JAYA optimization. IEEE Trans. Sustain. Energy 2016, 7, 1672–1680. [Google Scholar] [CrossRef]
- Huang, C.; Wang, L.; Yeung, R.S.; Zhang, Z.; Chung, H.S.; Bensoussan, A. A Prediction Model-Guided Jaya Algorithm for the PV System Maximum Power Point Tracking. IEEE Trans. Sustain. Energy 2018, 9, 45–55. [Google Scholar] [CrossRef]
- Abhishek, K.; Kumar, V.R.; Datta, S.; Mahapatra, S.S. Application of JAYA algorithm for the optimization of machining performance characteristics during the turning of CFRP (epoxy) composites: Comparison with TLBO, GA, and ICA. Eng. Comput. 2016, 1–19. [Google Scholar] [CrossRef]
- Choudhary, A.; Kumar, M.; Unune, D.R. Investigating effects of resistance wire heating on AISI 1023 weldment characteristics during ASAW. Mater. Manuf. Process. 2018, 33, 759–769. [Google Scholar] [CrossRef]
- Dinh-Cong, D.; Dang-Trung, H.; Nguyen-Thoi, T. An efficient approach for optimal sensor placement and damage identification in laminated composite structures. Adv. Eng. Softw. 2018, 119, 48–59. [Google Scholar] [CrossRef]
- Singh, S.P.; Prakash, T.; Singh, V.; Babu, M.G. Analytic hierarchy process based automatic generation control of multi-area interconnected power system using Jaya algorithm. Eng. Appl. Artif. Intell. 2017, 60, 35–44. [Google Scholar] [CrossRef]
- Cruz, N.C.; Redondo, J.L.; Álvarez, J.D.; Berenguel, M.; Ortigosa, P.M. A parallel Teaching–Learning-Based Optimization procedure for automatic heliostat aiming. J. Supercomput. 2017, 73, 591–606. [Google Scholar] [CrossRef]
- Rao, R.V.; Pawar, R.B. Self-adaptive Multi-population Rao Algorithms for Engineering Design Optimization. Appl. Artif. Intell. 2020, 34, 187–250. [Google Scholar] [CrossRef]
- Rao, R.; Pawar, R. Constrained design optimization of selected mechanical system components using Rao algorithms. Appl. Soft Comput. 2020, 89, 106141. [Google Scholar] [CrossRef]
- Rao, R.V.; Keesari, H.S. Rao algorithms for multi-objective optimization of selected thermodynamic cycles. Eng. Comput. 2020. [Google Scholar] [CrossRef]
- Kumar-Majhi, S. An Efficient Feed Foreword Network Model with Sine Cosine Algorithm for Breast Cancer Classification. Int. J. Syst. Dyn. Appl. (IJSDA) 2018, 7, 1–14. [Google Scholar] [CrossRef]
- Rajesh, K.S.; Dash, S.S. Load frequency control of autonomous power system using adaptive fuzzy based PID controller optimized on improved sine cosine algorithm. J. Ambient. Intell. Humaniz. Comput. 2019, 10, 2361–2373. [Google Scholar] [CrossRef]
- Khezri, R.; Oshnoei, A.; Tarafdar Hagh, M.; Muyeen, S. Coordination of Heat Pumps, Electric Vehicles and AGC for Efficient LFC in a Smart Hybrid Power System via SCA-Based Optimized FOPID Controllers. Energies 2018, 11, 420. [Google Scholar] [CrossRef]
- Ramanaiah, M.L.; Reddy, M.D. Sine Cosine Algorithm for Loss Reduction in Distribution System with Unified Power Quality Conditioner. i-Manag. J. Power Syst. Eng. 2017, 5, 10–16. [Google Scholar] [CrossRef]
- Dhundhara, S.; Verma, Y.P. Capacitive energy storage with optimized controller for frequency regulation in realistic multisource deregulated power system. Energy 2018, 147, 1108–1128. [Google Scholar] [CrossRef]
- Singh, V.P. Sine cosine algorithm based reduction of higher order continuous systems. In Proceedings of the 2017 International Conference on Intelligent Sustainable Systems (ICISS), Palladam, India, 7–8 December 2017; pp. 649–653. [Google Scholar] [CrossRef]
- Das, S.; Bhattacharya, A.; Chakraborty, A.K. Solution of short-term hydrothermal scheduling using sine cosine algorithm. Soft Comput. 2018, 22, 6409–6427. [Google Scholar] [CrossRef]
- Singh, M.; Panigrahi, B.; Abhyankar, A. Optimal coordination of directional over-current relays using Teaching Learning-Based Optimization (TLBO) algorithm. Int. J. Electr. Power Energy Syst. 2013, 50, 33–41. [Google Scholar] [CrossRef]
- Niknam, T.; Azizipanah-Abarghooee, R.; Narimani, M.R. A new multi objective optimization approach based on TLBO for location of automatic voltage regulators in distribution systems. Eng. Appl. Artif. Intell. 2012, 25, 1577–1588. [Google Scholar] [CrossRef]
- Li, D.; Zhang, C.; Shao, X.; Lin, W. A multi-objective TLBO algorithm for balancing two-sided assembly line with multiple constraints. J. Intell. Manuf. 2016, 27, 725–739. [Google Scholar] [CrossRef]
- Arya, L.; Koshti, A. Anticipatory load shedding for line overload alleviation using Teaching learning based optimization (TLBO). Int. J. Electr. Power Energy Syst. 2014, 63, 862–877. [Google Scholar] [CrossRef]
- Mohanty, B. TLBO optimized sliding mode controller for multi-area multi-source nonlinear interconnected AGC system. Int. J. Electr. Power Energy Syst. 2015, 73, 872–881. [Google Scholar] [CrossRef]
- Yan, J.; Li, K.; Bai, E.; Yang, Z.; Foley, A. Time series wind power forecasting based on variant Gaussian Process and TLBO. Neurocomputing 2016, 189, 135–144. [Google Scholar] [CrossRef]
- Baghban, A.; Kardani, M.N.; Mohammadi, A.H. Improved estimation of Cetane number of fatty acid methyl esters (FAMEs) based biodiesels using TLBO-NN and PSO-NN models. Fuel 2018, 232, 620–631. [Google Scholar] [CrossRef]
- Migallón, H.; Jimeno-Morenilla, A.; Sánchez-Romero, J.; Belazi, A. Efficient parallel and fast convergence chaotic Jaya algorithms. Swarm Evol. Comput. 2020, 100698. [Google Scholar] [CrossRef]
- Ravipudi, J.L.; Neebha, M. Synthesis of linear antenna arrays using Jaya, self-adaptive Jaya and chaotic Jaya algorithms. AEU-Int. J. Electron. Commun. 2018, 92, 54–63. [Google Scholar] [CrossRef]
- Free Software Foundation, Inc. GCC, the GNU Compiler Collection. Available online: https://www.gnu.org/software/gcc/index.html (accessed on 10 March 2017).
- García-Monzó, A.; Migallón, H.; Jimeno-Morenilla, A.; Sánchez-Romero, J.L.; Rico, H.; Rao, R.V. Efficient Subpopulation Based Parallel TLBO Optimization Algorithms. Electronics 2018, 8, 19. [Google Scholar] [CrossRef]
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).