Article

Settings-Free Hybrid Metaheuristic General Optimization Methods

by Héctor Migallón 1,*,†, Akram Belazi 2,†, José-Luis Sánchez-Romero 3,†, Héctor Rico 3,† and Antonio Jimeno-Morenilla 3,†
1 Department of Computer Engineering, Miguel Hernández University, E-03202 Elche, Alicante, Spain
2 Laboratory RISC-ENIT (LR-16-ES07), Tunis El Manar University, Tunis 1002, Tunisia
3 Department of Computer Technology, University of Alicante, E-03071 Alicante, Spain
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2020, 8(7), 1092; https://doi.org/10.3390/math8071092
Submission received: 2 June 2020 / Revised: 30 June 2020 / Accepted: 1 July 2020 / Published: 3 July 2020
(This article belongs to the Special Issue Advances of Metaheuristic Computation)

Abstract:
Several population-based metaheuristic optimization algorithms have been proposed in recent decades, none of which is able either to outperform all existing algorithms or to solve all optimization problems, in accordance with the No Free Lunch (NFL) theorem. Many of these algorithms behave effectively, under a correct setting of the control parameter(s), when solving different engineering problems. The optimization behavior of these algorithms is boosted by applying various strategies, which include the hybridization technique and the use of chaotic maps instead of pseudo-random number generators (PRNGs). Hybrid algorithms are suitable for a large number of engineering applications, in which they behave more effectively than the thoroughbred optimization algorithms. However, they increase the difficulty of correctly setting control parameters, and sometimes they are designed to solve particular problems. This paper presents three hybridizations, dubbed HYBPOP, HYBSUBPOP, and HYBIND, of up to seven algorithms free of control parameters. Each hybrid proposal uses a different strategy to switch the algorithm in charge of generating each new individual. These algorithms are Jaya, the sine cosine algorithm (SCA), Rao's algorithms, teaching-learning-based optimization (TLBO), and chaotic Jaya. The experimental results show that the proposed algorithms perform better than the original algorithms, which implies an optimal use of these algorithms according to the problem to be solved. One more advantage of the hybrid algorithms is that no prior process of control parameter tuning is needed.

1. Introduction

It is well known that metaheuristic optimization methods are widely used to solve problems in several fields of science and engineering. Population-based metaheuristic methods iteratively generate new populations to increase diversity in the current generation, which increases the probability of reaching the optimum of the considered problem. These algorithms are proposed to replace exact optimization algorithms when the latter are unable to reach an acceptable solution. This inability may be due either to the characteristics of the objective function or to a search space so wide that an exhaustive search is useless. In addition, classical optimization methods, such as greedy-based algorithms, need to make several assumptions that make it hard to solve the considered problem.
When metaheuristic methods are used, on the one hand, no restrictions are placed on the objective function; on the other hand, each optimization method proposes its own rules for the evolution of the population towards the optimum. These algorithms are suitable for general problems, but each one has different skills in global exploration and local exploitation.
Some of the proposed algorithms that have proven to be effective in several areas of science and engineering include: the mine blast algorithm (MBA) [1], based on the mine bomb explosion concept; the manta ray foraging optimization method (MRFO) [2], based on the intelligent behaviors of manta rays; the crow search algorithm (CSA) [3], based on the behavior of crows; the ant colony optimization (ACO) algorithm [4], which imitates the foraging behavior of ant colonies; the biogeography-based optimization (BBO) algorithm [5], which improves solutions stochastically and iteratively; the grenade explosion method (GEM) algorithm [6], based on the characteristics of the explosion of a grenade; the particle swarm optimization (PSO) algorithm [7], based on the social behavior of fish schooling or bird flocking; the firefly (FF) algorithm [8], inspired by the flashing behavior of fireflies; the artificial bee colony (ABC) algorithm [9], inspired by the foraging behavior of honey bees; the gravitational search algorithm (GSA) [10], based on Newton's law of gravity; and the shuffled frog leaping (SFL) algorithm [11], which imitates the collaborative behavior of frogs; among others. Many of them require configuration parameters that must be correctly tuned according to the problem to be solved (see, for example, [12]); otherwise, their exploitation and exploration skills can be degraded. If the exploitation capacity degrades, the number of generated populations must be increased, while if the exploration capacity deteriorates, the quality of the solution may worsen.
Other proposed algorithms that have also been shown to be effective in various areas of science and engineering but have no algorithm-specific parameters are: the sine cosine algorithm (SCA) [13], based on the sine and cosine trigonometric functions; the teaching-learning-based optimization (TLBO) algorithm [14], based on the processes of teaching and learning; the supply-demand-based optimization method (SDO) [15], based on both the demand relation of consumers and the supply relation of producers; the Jaya algorithm [16], based on geometric distances and random processes; the Harris hawks optimization method (HHO) [17], based on the cooperative behavior and chasing style of Harris' hawks; and Rao's optimization algorithms [18]; among others.
One of the widely used techniques to improve optimization algorithms is chaos theory, which studies nonlinear dynamic systems characterized by a high sensitivity to their initial conditions [19,20]. Chaotic maps can be applied to replace pseudo-random number generators (PRNGs) in producing the control parameters or in performing local searches [21,22,23,24,25,26,27,28,29,30,31,32,33,34]. However, improving an optimization algorithm using chaotic systems instead of PRNGs may be restricted to the problem under consideration or to a set of problems with similar characteristics.
Hybridization is a well-known strategy that boosts the capacity of optimization algorithms. Since no single metaheuristic optimization algorithm can outperform all other algorithms on every problem, hybridization can be a solution that merges the capabilities of different algorithms in one system [35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52]. Many of these algorithms require the correct setting of control parameters, and merging several of them into a single solution increases the complexity of accurately adjusting those parameters. Furthermore, some hybridization techniques can be complicated if the management and replacement strategies of individuals in the populations are not similar. On the other hand, when chaos is applied, hybrid algorithms may provide excellent performance for only a limited number of applications.
The proposed algorithms consist of hybridizations of seven of the best optimization algorithms that satisfy two requirements: (i) they must be free of algorithm-specific control parameters, and (ii) population management should allow hybridization not only at the population level but also at the individual level.
The remainder of this paper is organized as follows. Section 2 presents a brief description of the optimization algorithms used for the hybridizations. Section 3 describes the hybrid algorithms in detail, analyses of which are provided in Section 4. Finally, concluding remarks are drawn in Section 5.

2. Preliminaries

As mentioned above, among the best control-parameter-free algorithms are the Jaya algorithm [16], the SCA algorithm [13], the supply-demand-based optimization method [15], Rao's optimization algorithms [18], the Harris hawks optimization method (HHO) [17], and the teaching-learning-based optimization (TLBO) algorithm [14]. Among these proposals, the HHO algorithm is the most complex. It consists of two phases; during the first phase, the elements of the population are replaced without comparing the fitness of the associated solutions, which is an unwanted strategy for hybrid algorithms. In addition, the SDO algorithm, which offers impressive initial results, works with two populations, preventing its integration into our hybrid proposals.
The Jaya optimization algorithm and the three new Rao's optimization algorithms (i.e., RAO1, RAO2, and RAO3) are described in Algorithm 1. The Jaya optimization algorithm has been successfully used for solving a large number of large-scale industrial problems [53,54,55,56,57,58,59]. The three new Rao's optimization algorithms are metaphor-less algorithms based on the best and worst solutions obtained during the optimization process and on random interactions between the candidate solutions [60,61,62]. In Algorithms 1–3 and 5–9, max_ITs is the number of generations; popSize is the number of individuals in population Pop; numDesignVars is the number of variables of the objective function F; Pop_m is the mth individual in the current population; MinValue_k and MaxValue_k are the lower and upper bounds of the kth variable of F, respectively; BestPop and WorstPop are the best and worst individuals of the current population Pop, respectively; newPop_m is the mth new individual that can replace the current mth individual Pop_m; and rand0..1 is a uniformly distributed random number in [0, 1].
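To make the Jaya update rule concrete, the following minimal Python sketch generates one candidate for a single individual and applies the greedy replacement described above. It is an illustration only, not the authors' C implementation; the function names and the two-variable sphere objective are our own choices.

```python
import random

def jaya_candidate(pop_m, best, worst, min_val, max_val):
    """One Jaya update: move towards the best and away from the worst solution."""
    new_ind = []
    for k in range(len(pop_m)):
        r1, r2 = random.random(), random.random()
        x = (pop_m[k]
             + r1 * (best[k] - abs(pop_m[k]))
             - r2 * (worst[k] - abs(pop_m[k])))
        # Clamp to the bounds of design variable k (lines 40-45 of Algorithm 1).
        x = max(min_val[k], min(max_val[k], x))
        new_ind.append(x)
    return new_ind

def sphere(x):
    """Example objective: the sphere function (global minimum at the origin)."""
    return sum(v * v for v in x)

# Greedy replacement: keep the new individual only if it improves the fitness.
pop_m = [0.5, -0.8]
best, worst = [0.1, 0.0], [0.9, -1.0]
cand = jaya_candidate(pop_m, best, worst, [-1.0, -1.0], [1.0, 1.0])
if sphere(cand) < sphere(pop_m):
    pop_m = cand
```

The same skeleton holds for RAO1–RAO3; only the expression inside the loop changes.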
The SCA algorithm, presented in Algorithm 2, has been proven to be efficient in several applications [63,64,65,66,67,68,69].
The TLBO algorithm, described in Algorithm 3, is a two-phase algorithm comprising a teacher phase and a learner phase. It has been proven effective in solving various engineering problems [70,71,72,73,74,75,76].
As mentioned earlier, the use of chaotic maps can improve the behavior of some metaheuristic methods. The 2D chaotic map reported in [33] has significantly improved the convergence rate of the Jaya algorithm [33,77]. The generation of the 2D chaotic map is shown in Algorithm 4, where the initial conditions are chA_1 = 0.2, chB_1 = 0.3, k = i, and dimMap = 500. The computed values of chA_i and chB_i lie in [−1, 1]. The chaotic Jaya algorithm (in short, CJaya) is shown in Algorithm 5, where ch_x, x ∈ [1..6], are chaotic values randomly extracted from the 2D chaotic map. Other chaotic maps have been applied to Jaya in [32,78]; however, they do not surpass the chaotic behavior of the aforementioned 2D map.
As they present a similar structure, Algorithms 1–5 are used for designing our hybrid algorithms.
Algorithm 1 Jaya and Rao algorithms
1: Set max_ITs and the population size (iterator over individuals: m)
2: Define the cost function F (iterator over design variables: k)
3: Generate the initial population Pop
4: for m = 0 to popSize do
5:   for k = 1 to numDesignVars do
6:     r1 = rand0..1
7:     Pop_mk = MinValue_k + (MaxValue_k − MinValue_k) · r1
8:   end for
9:   Compute and store the function fitness F(Pop_m)
10: end for
11: for iterator = 1 to max_ITs do
12:   Search for the current BestPop and WorstPop
13:   for m = 0 to popSize do
14:     Select a random individual RandPop ≠ Pop_m {only for RAO2 and RAO3}
15:     for k = 1 to numDesignVars do
16:       if Jaya then
17:         r1, r2 = rand0..1
18:         newPop_mk = Pop_mk + r1 (BestPop_k − |Pop_mk|) − r2 (WorstPop_k − |Pop_mk|)
19:       end if
20:       if RAO1 then
21:         r1 = rand0..1
22:         newPop_mk = Pop_mk + r1 (BestPop_k − WorstPop_k)
23:       end if
24:       if RAO2 then
25:         r1, r2 = rand0..1
26:         if Pop_mk < RandPop_k then
27:           newPop_mk = Pop_mk + r1 (BestPop_k − WorstPop_k) + r2 (|Pop_mk| − |RandPop_k|)
28:         else
29:           newPop_mk = Pop_mk + r1 (BestPop_k − WorstPop_k) + r2 (|RandPop_k| − |Pop_mk|)
30:         end if
31:       end if
32:       if RAO3 then
33:         r1, r2 = rand0..1
34:         if Pop_mk < RandPop_k then
35:           newPop_mk = Pop_mk + r1 (BestPop_k − |WorstPop_k|) + r2 (|Pop_mk| − |RandPop_k|)
36:         else
37:           newPop_mk = Pop_mk + r1 (BestPop_k − |WorstPop_k|) + r2 (|RandPop_k| − |Pop_mk|)
38:         end if
39:       end if
40:       if newPop_mk < MinValue_k then
41:         newPop_mk = MinValue_k
42:       end if
43:       if newPop_mk > MaxValue_k then
44:         newPop_mk = MaxValue_k
45:       end if
46:     end for
47:     if F(newPop_m) < F(Pop_m) then
48:       Pop_m = newPop_m {replace the current population}
49:     end if
50:   end for
51: end for
52: Search for the current BestPop
Algorithm 2 SCA optimization algorithm
1: Set iniValue_r1 = 2
2: Set max_ITs and the population size (iterator over individuals: m)
3: Define the cost function F (iterator over design variables: k)
4: Generate the initial population Pop {lines 4–10 of Algorithm 1}
5: for iterator = 1 to max_ITs do
6:   Search for the current BestPop
7:   r1 = iniValue_r1 − iterator · (iniValue_r1 / max_ITs)
8:   for m = 0 to popSize do
9:     for k = 1 to numDesignVars do
10:      r2 = 2π · rand0..1; r3 = 2 · rand0..1; r4 = rand0..1
11:      if r4 < 0.5 then
12:        newPop_mk = Pop_mk + (r1 sin(r2) · |r3 BestPop_k − Pop_mk|)
13:      else
14:        newPop_mk = Pop_mk + (r1 cos(r2) · |r3 BestPop_k − Pop_mk|)
15:      end if
16:      Check the bounds of newPop_mk {lines 40–45 of Algorithm 1}
17:    end for
18:    if F(newPop_m) < F(Pop_m) then
19:      Pop_m = newPop_m {replace the current population}
20:    end if
21:  end for
22: end for
23: Search for the current BestPop
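The SCA update is driven by the linearly decreasing amplitude r1 and a random sine/cosine oscillation around the best solution. The following Python sketch (our own illustration with hypothetical function names, not the paper's C code) shows both pieces:

```python
import math
import random

def sca_r1(iterator, max_its, ini_value_r1=2.0):
    """Linearly decreasing amplitude of line 7 of Algorithm 2: 2 at the start, 0 at the end."""
    return ini_value_r1 - iterator * (ini_value_r1 / max_its)

def sca_component(pop_mk, best_k, r1):
    """One SCA component update, oscillating around the best solution found so far."""
    r2 = 2.0 * math.pi * random.random()
    r3 = 2.0 * random.random()
    r4 = random.random()
    if r4 < 0.5:
        return pop_mk + r1 * math.sin(r2) * abs(r3 * best_k - pop_mk)
    return pop_mk + r1 * math.cos(r2) * abs(r3 * best_k - pop_mk)
```

As r1 shrinks over the run, the step sizes contract, shifting the search from global exploration towards local exploitation; with r1 = 0 the candidate equals the current component.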
Algorithm 3 TLBO algorithm
1: Set iniValue_r1 = 2
2: Set max_ITs and the population size (iterator over individuals: m)
3: Define the cost function F (iterator over design variables: k)
4: Generate the initial population Pop {lines 4–10 of Algorithm 1}
5: Set Phase = TeacherPhase
6: for iterator = 1 to max_ITs do
7:   Search for the current BestPop
8:   Set the teaching factor TF (an integer random value in [1, 2])
9:   for k = 1 to numDesignVars do
10:    AveragePop_k = (Σ_m Pop_mk) / popSize
11:  end for
12:  for m = 0 to popSize do
13:    Select a random individual RandPop ≠ Pop_m
14:    for k = 1 to numDesignVars do
15:      if Phase = TeacherPhase then
16:        r1 = rand0..1
17:        newPop_mk = Pop_mk + r1 (BestPop_k − TF · AveragePop_k)
18:      end if
19:      if Phase = LearnerPhase then
20:        r1 = rand0..1
21:        if Pop_mk < RandPop_k then
22:          newPop_mk = Pop_mk + r1 (Pop_mk − RandPop_k)
23:        else
24:          newPop_mk = Pop_mk + r1 (RandPop_k − Pop_mk)
25:        end if
26:      end if
27:      Check the bounds of newPop_mk {lines 40–45 of Algorithm 1}
28:    end for
29:    if F(newPop_m) < F(Pop_m) then
30:      Pop_m = newPop_m {replace the current population}
31:    end if
32:  end for
33:  if Phase = TeacherPhase then
34:    Phase = LearnerPhase
35:  else
36:    Phase = TeacherPhase
37:  end if
38: end for
39: Search for the current BestPop
Algorithm 4 2D chaotic map
1: Set chA_1, chB_1 and dimMap
2: for i = 1 to dimMap do
3:   chA_{i+1} = cos(k · arccos(chB_i))
4:   chB_{i+1} = 16 chA_i^5 − 20 chA_i^3 + 5 chA_i
5: end for
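A minimal Python sketch of this recurrence follows, using the initial conditions chA_1 = 0.2 and chB_1 = 0.3 stated in the text; as the text indicates k = i, we treat the Chebyshev degree as the iteration index. The function name and the round-off clamp are our own additions.

```python
import math

def chaotic_2d_map(cha1=0.2, chb1=0.3, dim_map=500):
    """Generate the two chaotic sequences chA and chB of Algorithm 4."""
    cha = [cha1]
    chb = [chb1]
    for i in range(1, dim_map):
        # chA_{i+1} = cos(k * arccos(chB_i)), with k taken as the index i.
        cha.append(math.cos(i * math.acos(chb[-1])))
        # chB_{i+1} = 16*chA_i^5 - 20*chA_i^3 + 5*chA_i  (Chebyshev T5 of chA_i).
        a = cha[-2]
        t5 = 16 * a**5 - 20 * a**3 + 5 * a
        # Clamp against floating-point round-off so arccos stays in its domain.
        chb.append(max(-1.0, min(1.0, t5)))
    return cha, chb
```

All generated values remain in [−1, 1], matching the range stated above; CJaya then draws its ch_1..ch_6 values from these sequences.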
Algorithm 5 Chaotic 2D Jaya algorithm
1: Set iniValue_r1 = 2
2: Set max_ITs and the population size (iterator over individuals: m)
3: Define the cost function F (iterator over design variables: k)
4: Generate the initial population Pop {lines 4–10 of Algorithm 1}
5: for iterator = 1 to max_ITs do
6:   Search for the current BestPop
7:   Search for the current WorstPop
8:   Set the scaling factor SF (an integer random value in [1, 2])
9:   for m = 0 to popSize do
10:    Select a random individual RandPop ≠ Pop_m
11:    r1, r2 = rand0..1
12:    ra = min(r1, r2)
13:    rb = max(r1, r2)
14:    for k = 1 to numDesignVars do
15:      Extract ch1, ch2, ch3, ch4, ch5, ch6
16:      if ch1 < ra then
17:        newPop_mk = ch2 RandPop_k + ch3 (Pop_mk − ch4 RandPop_k)
18:                    + ch5 (BestPop_k − ch6 RandPop_k)
19:      end if
20:      if ra ≤ ch1 < rb then
21:        newPop_mk = ch2 RandPop_k + ch3 (Pop_mk − ch4 RandPop_k)
22:                    + ch5 (WorstPop_k − ch6 RandPop_k)
23:      end if
24:      if ch1 ≥ rb then
25:        newPop_mk = ch2 BestPop_k + ch3 (RandPop_k − SF · BestPop_k)
26:      end if
27:      Check the bounds of newPop_mk {lines 40–45 of Algorithm 1}
28:    end for
29:    if F(newPop_m) < F(Pop_m) then
30:      Pop_m = newPop_m {replace the current population}
31:    end if
32:  end for
33: end for
34: Search for the current BestPop

3. Hybrid Algorithms

The proposed hybrid algorithms are designed using the seven algorithms described in Section 2. These algorithms have been selected for their performance in solving constrained and unconstrained functions, but also because they share a similar structure that allows the implementation of different hybridization strategies.
Algorithm 6 shows the skeleton of the proposed hybrid algorithms, which includes all the common and uncommon tasks without any updating procedure of the current population. Since the TLBO algorithm is a two-phase algorithm, the proposed hybrid algorithms apply these two phases consecutively to each individual. In contrast to the other algorithms, in which a single phase is executed, a control variable Phase is used to process the same individual twice when the TLBO algorithm is selected (see lines 24–29 of Algorithm 6). The algorithm used to obtain a new individual is determined by AlgSelected (see line 17 of Algorithm 6). In Algorithms 6–9, AlgSelected determines the algorithm responsible for producing a new individual.
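The phase toggling for TLBO can be sketched in Python as follows. This shows the control flow only; `tlbo_phase_schedule` is our own illustrative name, and the actual per-individual update step is omitted:

```python
def tlbo_phase_schedule(pop_size, alg_selected):
    """Order in which individuals are processed in one population pass.

    For TLBO, each individual is visited twice (teacher phase, then learner
    phase) before advancing; single-phase algorithms visit each individual once.
    """
    phase = "TeacherPhase"
    m = 0
    passes = []  # record of (individual index, phase) pairs
    while m < pop_size:
        passes.append((m, phase if alg_selected == "TLBO" else "SinglePhase"))
        if alg_selected == "TLBO":
            if phase == "TeacherPhase":
                phase = "LearnerPhase"
                continue  # m is not advanced: repeat the same individual
            phase = "TeacherPhase"
        m += 1
    return passes
```

Decrementing the loop counter (m = m − 1 in the skeleton) and skipping the increment here are equivalent ways of revisiting the same individual.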
Given that only algorithms free of control parameters have been considered, proposals that would require the inclusion of control parameters have been discarded. Following these guidelines, we have designed three hybrid algorithms, an analysis of which is provided in Section 4. The first proposed hybrid algorithm, shown in Algorithm 7 and referred to as the HYBPOP algorithm, processes the entire population in each iteration using the same algorithm. This is the most straightforward hybridization technique, in which it is not mandatory for all algorithms to follow the structure given by Algorithm 6. In Algorithms 7–9, NumOfAlgorithms is the number of control-parameter-free algorithms involved in the hybrid proposals.
Algorithm 6 Skeleton of hybrid algorithms
1: Set max_ITs and the population size (iterator over individuals: m)
2: Define the cost function F (iterator over design variables: k)
3: Set iniValue_r1 = 2; Phase = TeacherPhase; RepeatTLBO = false
4: Generate the initial population Pop {lines 4–10 of Algorithm 1}
5: for iterator = 1 to max_ITs do
6:   Search for the current BestPop and WorstPop
7:   Set the scaling factor SF and the teaching factor TF (integer random values in [1, 2])
8:   for k = 1 to numDesignVars do
9:     AveragePop_k = (Σ_m Pop_mk) / popSize
10:  end for
11:  r1 = iniValue_r1 − iterator · (iniValue_r1 / max_ITs)
12:  for m = 0 to popSize do
13:    Select a random individual RandPop ≠ Pop_m
14:    r2, r3 = rand0..1
15:    ra = min(r2, r3); rb = max(r2, r3)
16:    for k = 1 to numDesignVars do
17:      ⇒ (AlgSelected) Compute newPop_mk using AlgSelected (one from Algorithms 1–5)
18:      Check the bounds of newPop_mk {lines 40–45 of Algorithm 1}
19:    end for
20:    if F(newPop_m) < F(Pop_m) then
21:      Pop_m = newPop_m {replace the current population}
22:    end if
23:    if TLBO then
24:      if Phase = TeacherPhase then
25:        Phase = LearnerPhase; RepeatTLBO = true; m = m − 1
26:      else
27:        Phase = TeacherPhase; RepeatTLBO = false
28:      end if
29:    end if
30:  end for
31: end for
32: Search for the current BestPop
Algorithm 7 HYBPOP: Hybrid algorithm based on population
1: NumOfAlgorithms = 7
2: ALGS[NumOfAlgorithms] = {Jaya, Chaotic Jaya, SCA, RAO1, RAO2, RAO3, TLBO}
3: Selection = 0
4: for iterator = 1 to max_ITs do
5:   AlgSelected = ALGS[Selection]
6:   for m = 0 to popSize do
7:     for k = 1 to numDesignVars do
8:       Compute newPop_mk using AlgSelected
9:     end for
10:  end for
11:  Selection = Selection + 1
12:  if Selection ≥ NumOfAlgorithms then
13:    Selection = 0
14:  end if
15: end for
16: Search for the current BestPop
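The algorithm rotation of HYBPOP reduces to a round-robin schedule over iterations: the whole population is processed by one algorithm, and the next iteration moves to the next algorithm in the list. A small Python sketch of this selection logic (an illustration; the name `hybpop_schedule` is ours):

```python
ALGS = ["Jaya", "Chaotic Jaya", "SCA", "RAO1", "RAO2", "RAO3", "TLBO"]

def hybpop_schedule(max_its, algs=ALGS):
    """Algorithm selected for each HYBPOP iteration (round-robin over the list)."""
    schedule = []
    selection = 0
    for _ in range(max_its):
        schedule.append(algs[selection])
        selection = (selection + 1) % len(algs)
    return schedule
```

Every seventh iteration therefore reuses the same algorithm, so over a long run each algorithm generates roughly the same number of populations.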
The second algorithm, named HYBSUBPOP, is described in Algorithm 8. It logically splits the population into sub-populations; during the optimization process, each sub-population is processed by one of the seven algorithms mentioned previously.
Algorithm 8 HYBSUBPOP: Hybrid algorithm based on sub-populations
1: NumOfAlgorithms = 7
2: Split popSize into NumOfAlgorithms sub-populations
3: ALGS[NumOfAlgorithms] = {Jaya, Chaotic Jaya, SCA, RAO1, RAO2, RAO3, TLBO}
4: for iterator = 1 to max_ITs do
5:   for m = 0 to popSize do
6:     subPopID = sub-population index of individual m
7:     AlgSelected = ALGS[subPopID]
8:     for k = 1 to numDesignVars do
9:       Compute newPop_mk using AlgSelected
10:    end for
11:  end for
12: end for
13: Search for the current BestPop
Algorithm 9 shows the third proposed hybrid algorithm, dubbed HYBIND, in which each individual of the population is handled by a different algorithm in each iteration.
Algorithm 9 HYBIND: Hybrid algorithm based on individuals
1: NumOfAlgorithms = 7
2: ALGS[NumOfAlgorithms] = {Jaya, Chaotic Jaya, SCA, RAO1, RAO2, RAO3, TLBO}
3: Selection = 0
4: for iterator = 1 to max_ITs do
5:   Selection = iterator % NumOfAlgorithms
6:   for m = 0 to popSize do
7:     AlgSelected = ALGS[Selection]
8:     for k = 1 to numDesignVars do
9:       Compute newPop_mk using AlgSelected
10:    end for
11:    Selection = Selection + 1
12:    if Selection ≥ NumOfAlgorithms then
13:      Selection = 0
14:    end if
15:  end for
16: end for
17: Search for the current BestPop
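In HYBIND the rotation happens per individual rather than per iteration: the starting algorithm shifts with the iteration counter (Selection = iterator % NumOfAlgorithms) and then advances once for each individual, wrapping around the list. A Python sketch of this assignment (our own illustration; `hybind_assignment` is a hypothetical name):

```python
ALGS = ["Jaya", "Chaotic Jaya", "SCA", "RAO1", "RAO2", "RAO3", "TLBO"]

def hybind_assignment(iterator, pop_size, algs=ALGS):
    """Algorithm assigned to each individual in one HYBIND iteration."""
    selection = iterator % len(algs)  # starting point rotates with the iteration
    assigned = []
    for _ in range(pop_size):
        assigned.append(algs[selection])
        selection = (selection + 1) % len(algs)  # advance once per individual
    return assigned
```

Because the starting offset changes every iteration, each individual is processed by all seven algorithms over consecutive generations, which is the individual-level hybridization discussed in Section 4.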
It is worth noting that the aim of the proposed hybrid algorithms is neither to improve the convergence ratio of each constituent algorithm separately nor to perform optimally for a particular problem, but to show outstanding performance for a large number of problems without adjusting any control parameters of the considered algorithms.

4. Numerical Experiments

In this section, the performance of the proposed hybrid algorithms is analyzed by solving 28 well-known unconstrained functions (see Table 1), the definitions of which can be found in [77]. The hybrid proposals, along with the original algorithms, were implemented in the C language, compiled with GCC v4.4.7 [79], and run on an Intel Xeon E5-2620 v2 processor at 2.1 GHz. C implementations of the original algorithms are not available on the Internet; however, their Java/Matlab implementations are commonly available.
The data collected from the experimental analysis are as follows:
  • NoR-AI: the total number of replacements of any individual.
  • NoR-BI: the total number of replacements of the current best individual.
  • NoR-BwT: the total number of replacements of the current best individual with an error of less than 0.001.
  • LtI-AI: the last iteration (iterator) in which a replacement of any individual occurs.
  • LtI-BI: the last iteration (iterator) in which a replacement of the best individual occurs.
Three of the five analyzed metrics (NoR-) indicate the number of times the current individual (Pop_m) is replaced by a new individual (newPop_m) that provides a better fitness (see line 21 of Algorithm 6), while the remaining two (LtI-) refer to the last generation (iterator) in which at least one individual was replaced.
All data given below were obtained over 50 runs, with 50,000 iterations (max_ITs = 50,000) and two population sizes (popSize = 140 and 210). The maximum values of the analyzed data are listed in Table 2.
Table 3, Table 4 and Table 5 show the data of all the considered algorithms independently, i.e., without hybridization. As expected, the behavior of the different algorithms does not follow a common pattern; in addition, it depends on the objective function. Regarding global convergence, both TLBO and CJaya behave better but with a higher order of complexity (see [77,80]). Moreover, note that when using TLBO, two new individuals are generated in each iteration: one in the teacher phase and the other in the learner phase. The values in brackets in Table 3, Table 4 and Table 5 are the standard deviations of the data over the 50 runs. Note that heuristic optimization algorithms are partially based on randomness, which leads to high standard deviations. The average standard deviations are approximately 16%, 22%, 15%, 30%, 23%, 23%, and 22% for Jaya, Chaotic Jaya, SCA, RAO1, RAO2, RAO3, and TLBO, respectively.
An important aspect, not shown in Table 3, Table 4 and Table 5, is whether the solution obtained by each algorithm is acceptable or not. In particular, the original algorithms fail to reach a solution tolerance of less than 0.001 for 3, 8, 2, 4, 7, 5, and 2 functions for Jaya, CJaya, SCA, RAO1, RAO2, RAO3, and TLBO, respectively. Therefore, considering only the original algorithms, no algorithm always behaves best, which justifies the development of a generalist hybrid system that can solve a large number of benchmark functions and engineering problems.
Comparing the quality of the solutions obtained by the proposed hybrid algorithms, it can be concluded that the HYBSUBPOP algorithm is the worst one, because the same thoroughbred algorithm is always applied to the same sub-population, which degrades performance for small populations. Contrary to HYBSUBPOP, the HYBPOP and HYBIND algorithms apply the selected algorithms to all individuals, which leads to hybridizations with better exploitation. The HYBSUBPOP algorithm fails to reach a solution tolerance of less than 0.001 for 3 functions (F11, F23, and F27), whereas the HYBPOP and HYBIND algorithms fail for only one function (F27 and F11, respectively). If the population size is increased to 210, the HYBIND algorithm succeeds on all functions; thus, the HYBIND algorithm performs slightly better than HYBPOP.
As stated above, local exploration improves in the HYBPOP method and especially in the HYBIND method. Figure 1 and Figure 2 show the convergence curves of all the individual methods and of the three proposed hybrid methods for the first 1000 and 100 iterations, respectively, for functions F1, F8, F11, and F18. Each point in both figures is the average of the data obtained from 10 runs. As shown in these figures, the curves of the three hybrid methods are similar to the curves of the best single algorithms for each function. Therefore, global exploitation, while not improving on all methods, behaves similarly to the best single methods for each function. It should be noted that the best individual methods are not always the same across functions, whereas the hybrid methods track the best of them in each case.
Table 6 sorts the algorithms according to the number of iterations required to obtain an error of less than 0.001; if an algorithm is missing from a row, it did not reach an acceptable solution. As seen from this table, no algorithm outperforms all the others. Moreover, a computational cost analysis is necessary to classify them correctly. Table 7 shows the computational cost of the different algorithms; it reveals that the hybrid algorithms are mid-ranked in terms of computational cost, and that HYBIND is computationally less expensive than HYBPOP.
An analysis of the contribution of each algorithm within the HYBPOP and HYBIND algorithms is presented in Table 8, Table 9 and Table 10. Table 8 indicates the number of times an individual has been replaced by each algorithm; a replacement is accepted when the new individual improves the fitness of the current solution. As seen from Table 8, the HYBIND algorithm performs more replacements of individuals. In addition, the numbers of replacements per individual are nearly equal across the contributing algorithms, except for the RAO1 algorithm, whose contribution to replacements is limited. The standard deviations (over the 50 runs) are given in brackets. We found that, on average, the standard deviations for the HYBPOP and HYBIND algorithms are both equal to 14%.
Table 9 shows the last iteration in which each optimization algorithm replaces an individual in the population, i.e., the point after which it no longer brings improvement to the hybrid algorithm. As can be seen, the optimization algorithms, except the RAO1 algorithm, work efficiently in the hybrid algorithms. It is also revealed that the considered algorithms contribute over more generations in the HYBIND algorithm. The mean standard deviation rises to 28% and 23% for HYBPOP and HYBIND, respectively, due to randomness and the lower LtI-AI values.
Finally, Table 10 shows the last iteration in which each algorithm obtains a new optimum. A careful analysis of the results in Table 10 reveals that, in the HYBPOP algorithm, the seven algorithms contribute similarly to reaching a better solution as new populations are produced. By contrast, in the HYBIND algorithm, the most powerful algorithms are CJaya and TLBO. It should be noted that the CJaya algorithm draws random individuals from the population to generate new individuals, while the TLBO algorithm uses all the individuals of the population to obtain new ones. Therefore, these algorithms exploit the results obtained by the rest of the algorithms to converge towards the optimum. This behavior stems from the nature of these algorithms, in which the best solution correctly guides the individuals. The mean value of the standard deviation is high because the LtI-BI is strongly affected by random behavior.
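The guidance by the best solution mentioned above comes directly from the update rules of these algorithms. As a sketch under the standard formulations (our simplified reading, not the authors' code): the Jaya move pulls each individual toward the best solution and away from the worst, and the TLBO teacher phase moves it toward the best individual (the teacher) relative to the population mean.

```python
import random

def jaya_move(x, best, worst):
    # Standard Jaya rule: x' = x + r1*(best - |x|) - r2*(worst - |x|)
    return [xi + random.random() * (b - abs(xi))
               - random.random() * (w - abs(xi))
            for xi, b, w in zip(x, best, worst)]

def tlbo_teacher_move(x, teacher, mean):
    # TLBO teacher phase: x' = x + r*(teacher - TF*mean), with TF in {1, 2}
    tf = random.randint(1, 2)
    r = random.random()
    return [xi + r * (t - tf * m) for xi, t, m in zip(x, teacher, mean)]
```

Both moves reference the current best individual, which is why they can capitalize on improvements produced by the other algorithms in the hybrid.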
It has been found that the HYBSUBPOP algorithm does not reach excellent optimization performance because of the lack of harmony between the original algorithms, so it has been left out of further analysis. The exploration phases of the HYBPOP and HYBIND algorithms are similar; in contrast, the HYBIND algorithm outperforms the HYBPOP one in terms of exploitation. The hybridization of the original algorithms is implemented at the individual level in the HYBIND algorithm, contrary to the HYBPOP algorithm, in which hybridization is performed at the population level. Finally, the HYBPOP algorithm includes algorithms that update the population without analyzing the fitness of the associated solutions, while fitness-based replacement is mandatory in the HYBIND algorithm.

5. Conclusions

This paper proposed a strategy for hybridizing seven well-known algorithms. Three hybrid algorithms free of control parameters, dubbed HYBSUBPOP, HYBPOP, and HYBIND, were designed. These algorithms are derived from a dynamic skeleton that allows the inclusion of any metaheuristic optimization algorithm that may yield further improvements. The only requirement for merging a new optimization algorithm into the proposed skeleton is to know whether the replacement of an individual in that algorithm is based on the enhancement of the cost function or not. Moreover, both chaotic algorithms and multi-phase algorithms have been employed in the proposed hybrid algorithms, which proves the versatility of the proposed hybridization skeleton. The experimental results show that the HYBPOP and HYBIND algorithms effectively exploit the capabilities of all the considered algorithms. They present an excellent ability to solve a large number of benchmark functions while improving the quality of the solutions obtained. Generally speaking, hybridization at the individual level is better than hybridization at the population level, which explains why the performance of the HYBSUBPOP algorithm is inferior to that of the other hybrid algorithms. As future lines of work, we intend to integrate more efficient algorithms into the proposed hybridization skeleton, to evaluate new versions of hybridization, and to extend the performance analysis of the most promising algorithms to more complex functions and real-world engineering problems.
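The individual-level switching that distinguishes HYBIND can be sketched as follows. This is our illustrative reading of the scheme, not the authors' implementation: the operator set is reduced to two toy moves sharing one signature, and the greedy, fitness-based replacement is the property the skeleton requires of each merged algorithm.

```python
import random

def hybind_step(population, fitness, cost, operators):
    """One individual-level hybrid iteration: each individual is updated by a
    randomly chosen operator, and the candidate is kept only if it improves
    the cost function (the fitness-based replacement mandatory in HYBIND)."""
    best = min(population, key=cost)
    worst = max(population, key=cost)
    for i in range(len(population)):
        op = random.choice(operators)            # per-individual switching
        cand = op(population[i], best, worst)
        c = cost(cand)
        if c < fitness[i]:
            population[i], fitness[i] = cand, c

# Two toy operators sharing the op(x, best, worst) signature.
def toward_best(x, best, worst):
    r = random.random()
    return [xi + r * (b - xi) for xi, b in zip(x, best)]

def perturb(x, best, worst):
    return [xi + random.uniform(-0.1, 0.1) for xi in x]

sphere = lambda x: sum(xi * xi for xi in x)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
fit = [sphere(p) for p in pop]
for _ in range(50):
    hybind_step(pop, fit, sphere, [toward_best, perturb])
```

Because replacement is greedy, the fitness of each individual is non-increasing over iterations regardless of which operator is drawn.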

Author Contributions

H.M., A.B., J.-L.S.-R., H.R., and A.J.-M. conceived the hybrid algorithms; A.B. conceived the chaotic 2D Jaya algorithm; H.M. designed and coded the hybrid algorithms; H.M., J.-L.S.-R., and H.R. performed the numerical experiments; H.M., A.B., and A.J.-M. analyzed the data; H.M. wrote the original draft; A.B., J.-L.S.-R., H.R., and A.J.-M. reviewed and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research, including the APC, was funded by the Spanish Ministry of Science, Innovation and Universities and the Research State Agency under Grant RTI2018-098156-B-C54, co-financed by FEDER funds, and by the Spanish Ministry of Economy and Competitiveness under Grant TIN2017-89266-R, co-financed by FEDER funds.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Convergence curves. The population size is set as 140 and the number of iterations to 1000.
Figure 2. Convergence curves. The population size is set as 140 and the number of iterations to 100.
Table 1. Benchmark functions. Names and parameters.

Id.   Name                Num. vars (D)   Domain (Min, Max)
F1    Sphere              30              (-100, 100)
F2    SumSquares          30              (-10, 10)
F3    Beale               2               (-4.5, 4.5)
F4    Easom               2               (-100, 100)
F5    Matyas              2               (-10, 10)
F6    Colville            4               (-10, 10)
F7    Trid 6              6               (-D^2, D^2)
F8    Trid 10             10              (-D^2, D^2)
F9    Zakharov            10              (-5, 10)
F10   Schwefel_1.2        30              (-100, 100)
F11   Rosenbrock          30              (-30, 30)
F12   Dixon-Price         5               (-10, 10)
F13   Foxholes            2               (-2^16, 2^16)
F14   Branin              2               x1: (-5, 10); x2: (0, 15)
F15   Bohachevsky_1       2               (-100, 100)
F16   Booth               2               (-10, 10)
F17   Michalewicz_2       2               (0, π)
F18   Michalewicz_5       5               (0, π)
F19   Bohachevsky_2       2               (-100, 100)
F20   Bohachevsky_3       2               (-100, 100)
F21   GoldStein-Price     2               (-2, 2)
F22   Perm                4               (-D, D)
F23   Hartman_3           3               (0, 1)
F24   Ackley              30              (-32, 32)
F25   Penalized_2         30              (-50, 50)
F26   Langermann_2        2               (0, 10)
F27   Langermann_5        5               (0, 10)
F28   Fletcher-Powell_5   5               x_i, α_i: (-π, π); a_ij, b_ij: (-100, 100)
Table 2. Maximum values of the analyzed data.

Population size   70          140         210
NoR-AI            3,500,000   7,000,000   10,500,000
NoR-BI            3,500,000   7,000,000   10,500,000
NoR-BwT           3,500,000   7,000,000   10,500,000
LtI-AI            49,999      49,999      49,999
LtI-BI            49,999      49,999      49,999
Table 3. Analysis of Jaya and chaotic Jaya on functions F1-F28 with a population size of 140.
         Jaya                                              Chaotic Jaya
NoR-AI   NoR-BI   NoR-BwT   LtI-AI   LtI-BI      NoR-AI   NoR-BI   NoR-BwT   LtI-AI   LtI-BI
F1293,5163676343449,99949,97175,4351580152511681157
(1323)(37)(34)(0)(27)(421)(58)(55)(5)(5)
F2294,0703653343549,99949,99075,3241581152611631151
(1444)(75)(68)(0)(8)(449)(51)(50)(8)(8)
F389577060862778239420949,71928,810
(107)(7)(7)(164)(163)(52)(4)(2)(168)(8641)
F450454324451412245326849,77327,786
(69)(4)(4)(77)(74)(41)(5)(2)(214)(12,758)
F596,251739729835182361930148418
(253)(32)(34)(65)(65)(528)(11)(7)(12)(6)
F617,56815111225,29322,919326454049,84341,426
(129)(15)(11)(1564)(1752)(358)(19)(0)(122)(11,336)
F711,6831155649,04410,094385660049,72237,363
(80)(8)(4)(1099)(8036)(275)(10)(0)(339)(7744)
F818,0751977949,35828,6006030114049,72235,566
(658)(28)(6)(561)(9924)(743)(23)(0)(303)(10,334)
F9249,0272288220649,99949,96827,271443421552446
(939)(38)(40)(0)(18)(12,031)(289)(264)(281)(210)
F10752270049,99348,95076,9781465141115971552
(380)(12)(0)(6)(1026)(1835)(109)(109)(37)(116)
F1159,19395637549,99249,6609367197049,92543,912
(7991)(239)(174)(11)(773)(1936)(48)(0)(47)(7236)
F127648742741,23625,124351956049,76534,756
(891)(11)(10)(14,771)(18,788)(70)(5)(0)(320)(11,038)
F13265823049,77930,524322037049,66638,222
(73)(3)(0)(272)(10,879)(553)(10)(0)(292)(9294)
F1432612902171917234202321049,71332,070
(925)(7)(0)(18,396)(16,324)(111)(5)(0)(253)(11,911)
F156084422227523416392183011
(62)(4)(3)(9)(9)(258)(12)(3)(2)(6)
F16282724849,67838,126241327949,70934,212
(71)(4)(1)(261)(10,321)(41)(5)(3)(227)(12,119)
F1749453614098174213418149,73934,560
(52)(6)(0)(1714)(12)(47)(4)(0)(333)(12,597)
F18995896048,3448329216220049,64836,491
(134)(10)(0)(723)(1549)(60)(3)(0)(320)(10,256)
F196247462532927317081963210
(53)(6)(6)(20)(18)(315)(13)(2)(4)(5)
F20625848246574441597164317
(70)(6)(4)(251)(27)(217)(9)(1)(2)(2)
F21259918449,81837,0232543281449,65127,864
(52)(4)(1)(141)(10,059)(36)(5)(2)(241)(14,620)
F22184115048,78418,724190717049,85024,366
(176)(3)(0)(1361)(10,505)(92)(3)(0)(123)(11,825)
F235976510394180217019049,66223,570
(84)(6)(0)(81)(9)(35)(4)(0)(323)(12,705)
F2441,5754152059044487465091175512495
(269)(23)(12)(5088)(2387)(158)(25)(19)(11)(17)
F2558,07984062186528613555975049,81040,191
(593)(25)(20)(111)(111)(852)(20)(0)(255)(7868)
F26506343013,85411,630238922049,84832,604
(110)(5)(0)(8815)(5926)(200)(7)(0)(116)(13,247)
F279071792325,5369665212721049,69831,794
(452)(9)(25)(18,586)(3926)(140)(4)(0)(284)(11,266)
F28211915049,42724,846215621049,63533,067
(159)(2)(0)(751)(11,088)(124)(6)(0)(353)(12,178)
Table 4. Analysis of the sine cosine algorithm (SCA) and Rao's optimization Algorithm 1 (RAO1) on functions F1-F28 with a population size of 140.
         SCA                                               RAO1
NoR-AI   NoR-BI   NoR-BwT   LtI-AI   LtI-BI      NoR-AI   NoR-BI   NoR-BwT   LtI-AI   LtI-BI
F1209,3881397133912,3874261282,2044109384749,99949,985
(323)(24)(23)(440)(181)(671)(35)(30)(0)(15)
F2208,8241397134412,1294222282,6754105386749,99949,990
(335)(16)(15)(232)(99)(1315)(47)(43)(0)(7)
F33263241149,99949,90810,49071591037937
(59)(5)(2)(0)(252)(104)(5)(5)(122)(124)
F43522301049,99949,98359214123819588
(71)(5)(3)(0)(27)(61)(5)(3)(431)(79)
F581,84163162241132540102,60573972860775996
(285)(25)(24)(89)(177)(355)(23)(23)(21)(23)
F6661342049,99949,99720,96415411145,87441,787
(158)(6)(0)(0)(2)(211)(8)(6)(1909)(2043)
F7741660049,99949,99914,5011145748,66919,652
(89)(7)(0)(0)(0)(132)(9)(6)(1124)(15,645)
F811,41190049,99949,99821,8681907749,78223,351
(197)(12)(0)(0)(0)(505)(14)(4)(242)(13,762)
F9137,4141134109813,6437062394,4783436335049,99249,865
(296)(26)(24)(141)(415)(748)(58)(58)(15)(102)
F10131,2161416135220,31413,65950,68857329949,99949,912
(572)(18)(18)(684)(992)(792)(13)(16)(0)(58)
F1122,671173049,99949,99849,03173011649,98649,824
(509)(11)(0)(0)(1)(5791)(88)(56)(15)(147)
F12711659849,99949,99618,0051449715,1519236
(232)(8)(3)(0)(5)(6110)(51)(49)(12,276)(6218)
F13408337049,99949,99611749098118799
(70)(7)(0)(0)(3)(2874)(21)(0)(24,036)(21,554)
F14316122049,99949,985234218045,87828,668
(53)(6)(0)(0)(36)(185)(3)(0)(4245)(18,831)
F15597643222006664684624263217
(33)(6)(5)(19)(6)(87)(5)(5)(28)(9)
F163314271049,99949,99810,0227053352298
(57)(3)(2)(0)(1)(135)(11)(9)(9)(4)
F17303024049,99949,99457173914279221
(30)(2)(0)(0)(9)(80)(5)(1)(749)(12)
F1854614504,999949,99812,198100047,5667285
(100)(4)(0)(0)(2)(76)(11)(0)(3038)(1664)
F19620141222596866484525318268
(71)(7)(2)(30)(4)(89)(5)(3)(12)(7)
F20575142234931016677422510,328389
(66)(3)(4)(46)(10)(88)(5)(6)(19,703)(26)
F213330261249,99948,2366390483141,48522,033
(45)(4)(3)(0)(4088)(126)(5)(3)(6441)(13,841)
F22298625049,99949,975193113049,47024,078
(76)(6)(0)(0)(37)(162)(2)(0)(805)(13,173)
F23369929049,99949,9516505530670215
(97)(6)(0)(0)(109)(2170)(17)(0)(379)(72)
F2422,1091286226,470424745,46844121788184045
(3516)(7)(4)(9520)(6275)(342)(15)(15)(7583)(500)
F2523,889206049,99949,99859,02484361810,89510,755
(117)(12)(0)(0)(1)(431)(18)(11)(61)(61)
F26320521049,99949,958585041018,75717,964
(42)(4)(0)(0)(48)(130)(6)(0)(11,293)(11,291)
F27537647049,99949,9976280471032,65914,141
(621)(8)(1)(0)(2)(4993)(36)(23)(10,816)(8996)
F28596942049,99949,997407733045,60230,802
(241)(3)(0)(0)(2)(1375)(14)(0)(4353)(13,495)
Table 5. Analysis of RAO2, RAO3, and teaching-learning-based optimization (TLBO) on functions F1-F28 with a population size of 140.
        RAO2                                            RAO3                                            TLBO
NoR-AI  NoR-BI  NoR-BwT  LtI-AI  LtI-BI      NoR-AI  NoR-BI  NoR-BwT  LtI-AI  LtI-BI      NoR-AI  NoR-BI  NoR-BwT  LtI-AI  LtI-BI
F1158,2071833158249,99949,975402,0406191599012,84512,588317,5486636643218161806
(321)(35)(25)(0)(20)(1764)(59)(52)(171)(152)(2268)(62)(62)(5)(4)
F2158,6511838160049,99949,964400,5076179599012,74812,545314,2166582638818021792
(522)(36)(30)(0)(30)(1638)(77)(71)(140)(122)(1213)(82)(79)(5)(4)
F3985574596966109933746066558310,0987260296166
(93)(11)(9)(40)(45)(111)(6)(6)(32)(32)(209)(7)(7)(81)(8)
F4543544252038201555094425108810635455482512783
(79)(9)(6)(899)(897)(75)(6)(5)(614)(616)(124)(5)(4)(19)(4)
F5107,11173672788858769115,70075674412,29612,161151,05676475112611229
(218)(10)(10)(81)(77)(411)(26)(24)(180)(190)(1162)(26)(25)(5)(6)
F616,6461299349,99549,66916,1541278649,98449,81046,79718513654633573
(542)(12)(8)(3)(402)(615)(11)(10)(20)(159)(3371)(17)(7)(677)(250)
F711,7371085648,902785411,4211156148,50813,02113,4591397049,6385639
(91)(5)(5)(865)(5216)(172)(8)(3)(2285)(9489)(501)(12)(9)(279)(1539)
F817,8692186349,54319,78817,2551887648,89625,615202,4353127141849,44024,482
(1951)(45)(32)(413)(5571)(574)(15)(9)(1466)(9085)(46,668)(779)(176)(444)(14,497)
F9158,1991411132449,99949,965209,3342857278517,78417,582277,0403261318029852958
(509)(41)(38)(0)(34)(765)(50)(50)(301)(277)(1370)(53)(55)(11)(12)
F10388931049,94346,96658,36371653649,99849,889409,5795276509783378270
(267)(4)(0)(52)(2679)(4038)(60)(62)(1)(103)(1597)(55)(60)(65)(71)
F1134,536524849,94049,10647,480905749,98449,7614,325,39465,32826,53242,85738,436
(5740)(107)(7)(79)(790)(2960)(71)(8)(17)(197)(204,520)(3160)(3689)(3655)(2036)
F1219,1041871361472139312,595123576960357025,139234172393299
(215)(13)(12)(917)(85)(5136)(54)(66)(2906)(619)(3571)(11)(17)(190)(161)
F136409049,68021,886128514049,64930,857525149048,8571302
(44)(2)(0)(301)(9803)(794)(5)(0)(532)(13,644)(368)(10)(0)(1119)(467)
F144472300471614144423320611534654862330251864
(752)(7)(0)(684)(683)(767)(7)(0)(3259)(3260)(125)(5)(0)(225)(11)
F156939432266734864744323319275842048266351
(126)(5)(3)(166)(31)(72)(4)(2)(17)(17)(320)(5)(3)(1)(1)
F1698126953534464290422749,83832,61411,723695410386
(85)(8)(6)(29)(20)(42)(5)(2)(177)(13,514)(260)(5)(5)(2)(1)
F1750894012641149510340121871445888351213756
(55)(5)(0)(726)(7)(70)(4)(1)(425)(9)(330)(5)(0)(713)(7)
F18946695048,4111938949393049151210214,358122049,2081466
(106)(8)(0)(1305)(786)(136)(6)(0)(579)(570)(3227)(23)(0)(668)(866)
F197104492783354766774624561509781243217358
(123)(6)(5)(77)(96)(99)(4)(4)(225)(220)(327)(2)(3)(3)(2)
F20684742233782167757304526111510448677472510375
(122)(4)(2)(894)(867)(77)(7)(5)(568)(569)(563)(6)(5)(6)(5)
F216889452849,20010,483262721449,80333,3146407463027,5929567
(86)(4)(2)(1009)(4233)(82)(4)(1)(141)(10,046)(348)(8)(6)(16,851)(11,094)
F22205015047,79122,625238123049,44135,49527,8752185549,99747,112
(134)(3)(0)(3154)(14,711)(502)(8)(0)(729)(14,282)(2436)(147)(1)(2)(3039)
F2361134902411516095520249147718557014957
(58)(4)(0)(19)(3)(67)(5)(0)(26)(4)(323)(6)(0)(27)(2)
F2438,38937118232,50115,91627,517355177153284125,248402201237167
(18515)(12)(91)(12,156)(12,065)(262)(19)(10)(987)(36)(557)(16)(15)(17)(8)
F2552,0147585544811479949,68274154515841570140,740236083721,1627098
(331)(34)(24)(210)(209)(265)(33)(25)(37)(35)(43,787)(744)(1381)(22,046)(3282)
F2651533901061883526339063021436668938015,868758
(121)(5)(0)(882)(870)(242)(5)(0)(1120)(970)(956)(7)(0)(4085)(373)
F279039931614,65643829148922319,748267412,64722212421,1515807
(372)(5)(27)(543)(5900)(289)(5)(28)(23,981)(2360)(1424)(63)(42)(11,703)(3567)
F28399333043,66135,225231818049,41126,33238,28523816219,500897
(525)(8)(0)(10,700)(13,441)(291)(5)(0)(561)(13,590)(4443)(13)(19)(19,909)(345)
Table 6. Ranking of the algorithms according to the number of iterations required to achieve an error of less than 0.001 . The population size is set as 140.
Table 6. Ranking of the algorithms according to the number of iterations required to achieve an error of less than 0.001 . The population size is set as 140.
Rank (left to right): 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
F1: CJaya, TLBO, HYBPOP, SCA, HYBIND, RAO3, HYBSUBPOP, RAO1, Jaya, RAO2
F2: CJaya, TLBO, HYBPOP, SCA, HYBIND, HYBSUBPOP, RAO3, RAO1, Jaya, RAO2
F3: TLBO, HYBPOP, CJaya, HYBSUBPOP, HYBIND, RAO3, SCA, RAO2, Jaya, RAO1
F4: CJaya, HYBPOP, TLBO, HYBSUBPOP, HYBIND, SCA, Jaya, RAO1, RAO2, RAO3
F5: HYBPOP, SCA, HYBIND, HYBSUBPOP, TLBO, RAO3, Jaya, RAO1, RAO2, CJaya
F6: TLBO, HYBPOP, Jaya, HYBIND, HYBSUBPOP, RAO3, RAO1, RAO2, SCA, CJaya
F7: TLBO, HYBPOP, RAO3, Jaya, RAO2, RAO1, HYBIND, HYBSUBPOP, SCA, CJaya
F8: HYBPOP, Jaya, RAO3, TLBO, RAO1, HYBIND, HYBSUBPOP, SCA
F9: CJaya, HYBPOP, TLBO, HYBIND, HYBSUBPOP, SCA, RAO3, RAO1, Jaya, RAO2
F10: CJaya, HYBPOP, TLBO, HYBIND, HYBSUBPOP, SCA, RAO3, RAO1
F11: HYBPOP, Jaya, RAO1, TLBO, RAO3, RAO2
F12: TLBO, RAO2, RAO1, HYBIND, Jaya, HYBSUBPOP, HYBPOP, SCA
F13: HYBPOP, SCA, HYBIND, HYBSUBPOP, TLBO, Jaya
F14: TLBO, CJaya, HYBSUBPOP, RAO2, HYBIND, HYBPOP, RAO3, Jaya, SCA, RAO1
F15: CJaya, HYBPOP, SCA, HYBSUBPOP, HYBIND, TLBO, Jaya, RAO1, RAO3, RAO2
F16: TLBO, HYBPOP, RAO1, CJaya, HYBSUBPOP, HYBIND, RAO2, RAO3, Jaya, SCA
F17: TLBO, HYBPOP, HYBIND, Jaya, HYBSUBPOP, RAO2, RAO3, RAO1, CJaya, SCA
F18: HYBIND, HYBSUBPOP, Jaya, RAO1, SCA, HYBPOP
F19: SCA, HYBPOP, HYBIND, TLBO, Jaya, RAO1, RAO3, RAO2, CJaya, HYBSUBPOP
F20: SCA, HYBPOP, HYBSUBPOP, HYBIND, TLBO, Jaya, RAO1, RAO3, RAO2, CJaya
F21: CJaya, TLBO, HYBPOP, HYBIND, HYBSUBPOP, SCA, RAO1, RAO3, RAO2, Jaya
F22: TLBO, HYBSUBPOP, HYBPOP, HYBIND, RAO3, RAO2, Jaya, SCA, CJaya, RAO1
F23: SCA, HYBIND, HYBPOP, RAO2, CJaya, Jaya, RAO3, TLBO
F24: CJaya, HYBPOP, TLBO, HYBIND, RAO3, HYBSUBPOP, SCA, Jaya, RAO1
F25: RAO1, Jaya, RAO2, HYBPOP, RAO3, HYBIND, HYBSUBPOP, SCA
F26: HYBPOP, HYBIND, HYBSUBPOP, TLBO, Jaya, SCA, RAO2, CJaya, RAO1, RAO3
F27: TLBO, HYBIND
F28: TLBO, HYBPOP, HYBIND, HYBSUBPOP, SCA
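A ranking such as the one in Table 6 follows directly from sorting the algorithms by the mean number of iterations each needed to reach the target error. The sketch below illustrates the idea only; the iteration counts are invented for this example and are not taken from the paper's experiments.

```python
# Hypothetical mean iterations-to-target for one benchmark function
# (values invented purely to illustrate the ranking procedure).
iters = {
    "CJaya": 1200, "TLBO": 1350, "HYBPOP": 1400, "SCA": 2100,
    "HYBIND": 2300, "RAO3": 2500, "HYBSUBPOP": 2600, "RAO1": 3100,
    "Jaya": 3400, "RAO2": 3900,
}

# Rank 1 goes to the algorithm needing the fewest iterations; algorithms
# that never reach the error threshold would simply be left out of 'iters',
# giving a shorter row as seen for some functions in Table 6.
ranking = sorted(iters, key=iters.get)
print(ranking)
```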
Table 7. Computational times (s) for 50 runs. The population size is set as 140 and max_ITs is set to 50,000.
F | Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO | HYBSUBPOP | HYBPOP | HYBIND
F1 | 275.0 | 3263.8 | 1355.0 | 178.0 | 306.5 | 785.6 | 1515.0 | 743.2 | 1346.7 | 856.2
F2 | 300.5 | 3050.8 | 1406.5 | 193.3 | 324.2 | 811.8 | 764.9 | 777.9 | 1394.7 | 861.1
F3 | 50.5 | 231.5 | 105.0 | 43.5 | 55.6 | 55.7 | 44.0 | 92.3 | 89.7 | 85.1
F4 | 48.5 | 233.4 | 120.9 | 41.8 | 53.1 | 60.6 | 45.9 | 104.1 | 91.2 | 94.9
F5 | 95.3 | 184.8 | 157.2 | 90.4 | 99.7 | 93.8 | 97.9 | 69.5 | 53.8 | 121.5
F6 | 39.5 | 381.4 | 137.3 | 25.0 | 49.6 | 49.6 | 25.7 | 97.7 | 107.7 | 97.2
F7 | 58.8 | 544.8 | 221.1 | 33.8 | 68.8 | 68.0 | 33.1 | 144.2 | 157.9 | 138.7
F8 | 94.6 | 898.2 | 370.2 | 58.5 | 108.8 | 108.7 | 52.6 | 238.5 | 258.1 | 228.1
F9 | 140.4 | 932.0 | 583.1 | 119.3 | 151.5 | 311.0 | 311.0 | 312.9 | 328.0 | 313.7
F10 | 411.4 | 3435.2 | 1732.1 | 291.7 | 444.8 | 425.5 | 748.5 | 889.7 | 1446.7 | 803.1
F11 | 291.9 | 2652.7 | 1023.1 | 182.3 | 315.4 | 313.5 | 153.9 | 688.9 | 752.0 | 661.2
F12 | 48.8 | 457.8 | 186.0 | 30.8 | 61.0 | 59.5 | 28.7 | 122.8 | 130.2 | 118.4
F13 | 1760.6 | 1933.4 | 1850.7 | 1638.7 | 1619.4 | 1766.0 | 1738.0 | 1745.4 | 2057.6 | 1801.4
F14 | 33.6 | 210.9 | 94.2 | 26.7 | 37.4 | 36.9 | 28.9 | 71.6 | 72.9 | 69.9
F15 | 34.9 | 208.1 | 91.9 | 26.6 | 39.4 | 38.5 | 28.6 | 99.8 | 71.0 | 68.0
F16 | 20.2 | 187.3 | 75.8 | 12.7 | 25.6 | 28.3 | 15.1 | 56.2 | 54.9 | 52.4
F17 | 93.3 | 307.2 | 247.7 | 87.4 | 101.9 | 98.8 | 97.2 | 212.3 | 165.1 | 159.9
F18 | 295.7 | 763.3 | 615.6 | 281.3 | 296.2 | 296.0 | 263.3 | 683.1 | 442.5 | 472.1
F19 | 31.5 | 200.4 | 89.5 | 23.9 | 36.9 | 32.5 | 26.2 | 89.6 | 64.6 | 62.4
F20 | 31.8 | 198.4 | 79.0 | 22.1 | 37.7 | 36.4 | 23.4 | 75.2 | 63.8 | 60.2
F21 | 23.2 | 196.0 | 79.1 | 15.1 | 31.6 | 31.0 | 17.4 | 60.9 | 58.1 | 57.9
F22 | 305.9 | 650.5 | 443.7 | 298.8 | 319.5 | 312.6 | 295.2 | 388.9 | 426.4 | 382.9
F23 | 79.7 | 342.2 | 166.2 | 67.6 | 85.1 | 89.3 | 66.6 | 126.4 | 138.3 | 128.4
F24 | 197.3 | 1381.3 | 620.6 | 143.5 | 351.1 | 205.8 | 124.8 | 618.4 | 427.2 | 433.5
F25 | 4336.8 | 3192.1 | 1694.5 | 4063.5 | 4614.0 | 4018.7 | 1470.0 | 2253.0 | 4266.2 | 1493.1
F26 | 161.7 | 360.4 | 244.9 | 153.7 | 166.4 | 167.2 | 159.9 | 209.7 | 226.8 | 206.5
F27 | 191.5 | 651.4 | 361.7 | 198.1 | 194.4 | 197.9 | 174.5 | 306.0 | 316.4 | 298.1
F28 | 561.6 | 969.3 | 697.7 | 533.4 | 553.5 | 559.0 | 514.1 | 638.8 | 698.8 | 617.7
Table 8. Contribution of each algorithm to the replacements of the individuals (NoR-AI). The population size is set as 140.
HYBPOP (first seven columns) | HYBIND (last seven columns)
Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO | Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO
F125,12514,55596,302717,725332,520111,73051,946934743,566525,863490158,819
(224)(80)(832)(3)(165)(3125)(566)(201)(683)(1647)(2)(189)(709)(2781)
F225,24612,58596,155917,679333,19498,34751,874905143,800625,999562157,287
(153)(3435)(691)(3)(187)(2382)(24,920)(204)(2019)(1249)(1)(196)(916)(8893)
F328731561823232594666668616132881854962517,3442543
(51)(9)(24)(53)(53)(208)(120)(46)(25)(14)(10)(50)(8810)(65)
F41368232511211374234835906333342751560255,7891321
(31)(12)(29)(4)(40)(239)(87)(43)(20)(18)(4)(41)(19,009)(40)
F54483102839,63133470552,409144411,98054319,0852912,800134615,300
(164)(134)(434)(4)(189)(1294)(200)(71)(219)(141)(5)(54)(247)(170)
F6113728525989780146,00931,69238536678543447417014,094
(58)(14)(34)(14)(43)(7004)(1833)(73)(40)(256)(8)(62)(1343)(873)
F7290925313587338515720589856741973732119560,5368801
(60)(14)(18)(95)(56)(183)(117)(44)(29)(116)(9)(53)(14,046)(1958)
F83570369180840477713,34215,701672410111222116220,32169,118
(93)(25)(23)(111)(125)(565)(760)(50)(39)(164)(5)(89)(13,307)(2852)
F996587123641651162912216,54823,51943,762445028,8017611,319448725,089
(145)(4361)(451)(11)(67)(3530)(14,231)(570)(2101)(120)(11)(91)(871)(3527)
F10105513,63427,55051198535,232109,601228412,82723,8453726727,71455,571
(73)(3278)(741)(8)(13)(7868)(25,934)(233)(2881)(1573)(7)(35)(10,418)(8180)
F1146175368348183734124,739301,7581165494400635305319123,365
(1330)(22)(360)(442)(1758)(38,358)(42,543)(52)(13)(80)(1)(27)(1354)(553)
F1238253934214885401622519,53161148210903086339119347
(1912)(23)(69)(438)(1662)(3380)(4751)(164)(32)(32)(7)(176)(1281)(1420)
F134340916172153080154041256627663397332,2101423
(5)(29)(71)(1)(83)(1388)(512)(16)(39)(25)(1)(35)(18,471)(95)
F141100912391149855423217468243481554137791319
(456)(8)(102)(20)(387)(106)(75)(101)(37)(39)(10)(73)(107)(219)
F15670926258138651311112747844981340297855101209
(35)(100)(55)(5)(33)(109)(141)(31)(54)(29)(4)(29)(68)(28)
F16272142155120481710,28564042014002464592045,6292258
(15)(9)(14)(18)(105)(384)(93)(18)(24)(17)(7)(66)(10,723)(46)
F17161480625041609214630085591951685151169,4411290
(53)(4)(9)(28)(36)(162)(40)(26)(11)(22)(6)(32)(12,934)(42)
F18293185642224076423410,1623512199065246111224273
(747)(7)(9)(67)(996)(1275)(2047)(78)(23)(24)(5)(81)(132)(966)
F19512975302038489255013688185081398318185431252
(54)(63)(130)(6)(66)(291)(93)(22)(68)(22)(7)(22)(28)(33)
F20594985228938531478812708575001199297698161199
(32)(66)(71)(7)(21)(171)(89)(29)(76)(41)(7)(30)(103)(33)
F2116418930220927088114378214353128137107810,6181440
(10)(14)(27)(18)(99)(480)(97)(18)(16)(18)(9)(139)(6726)(50)
F22380665582403788665278247894151142427404710
(23)(5)(40)(25)(41)(85)(1041)(27)(14)(18)(17)(25)(47)(909)
F23188562456892101269832146642142144857197,3021593
(43)(6)(5)(49)(42)(166)(55)(54)(24)(17)(9)(36)(14,611)(93)
F2469591882794127540250,26770734512122256232427815324525
(200)(24)(216)(3)(151)(786)(106)(105)(18)(65)(5)(63)(116)(91)
F256308368164277421,12428,99529,346774737448141835793882,428
(1141)(23)(19)(647)(4220)(2630)(4062)(33)(37)(74)(3)(102)(1834)(9703)
F2615241223061461496197137723643334807734217,5291563
(28)(10)(44)(15)(35)(155)(322)(181)(38)(132)(17)(162)(24,235)(199)
F2714887786931119093090770829019388410337811583392
(1027)(6)(802)(138)(1379)(2155)(1192)(46)(47)(51)(13)(95)(403)(613)
F281841001674210875106527,356165316100310447162616,520
(18)(9)(225)(37)(130)(192)(4070)(22)(54)(37)(10)(71)(62)(6926)
Table 9. Last iteration in which a replacement of any individual occurs (LtI-AI). The population size is set as 140.
HYBPOP (first seven columns) | HYBIND (last seven columns)
Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO | Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO
F127,70615,95227,672827,68727,63815,97441,97816,71917,985641,95812,56016,529
(133)(78)(116)(4)(139)(111)(86)(123)(825)(439)(3)(122)(642)(841)
F227,59814,20527,566727,58527,53514,22041,85516,14718,153641,85112,74716,037
(133)(3194)(131)(4)(133)(144)(3210)(401)(3033)(471)(3)(401)(940)(2917)
F31115158192107711151091109249,97536,43648,75217149,88549,9591642
(53)(43)(60)(52)(50)(56)(54)(36)(7356)(3444)(103)(142)(63)(67)
F466813848513867166053349,89227,08949,9985349,91749,998570
(92)(17)(93)(137)(93)(93)(16)(207)(11,247)(1)(18)(191)(2)(26)
F58136301812424814681002977280267660725728147262744
(83)(42)(76)(13)(75)(70)(41)(154)(97)(109)(10)(155)(627)(52)
F649,8891032853473049,91549,74046,41149,05948,33549,999103449,76349,81647,269
(109)(448)(186)(3277)(101)(493)(1886)(1626)(1169)(0)(907)(415)(230)(2201)
F742,8141208041,57345,09645,05540,20249,99543,63849,9995949,99649,99846,348
(5594)(18)(9)(9922)(3793)(4291)(4886)(4)(4013)(0)(23)(2)(1)(3456)
F848,71532025646,06247,12248,56449,17749,99045,54949,9991649,99349,99749,803
(1000)(47)(52)(3367)(2979)(1758)(742)(9)(3461)(0)(4)(4)(2)(290)
F930,943473230,9056730,88430,842473349,998462318,8786549,99811,2716159
(525)(2638)(536)(17)(536)(530)(2642)(1)(2084)(479)(16)(1)(963)(1442)
F1049,43427,37549,9994446,70449,99927,43549,98729,38649,9994549,94649,78029,077
(873)(6641)(0)(14)(2947)(0)(6661)(26)(6347)(0)(10)(62)(206)(5971)
F1114,618256140,44613,23214,52549,77649,99949,68324,94549,999749,60846,95949,999
(1006)(745)(19,107)(8804)(13,913)(563)(0)(135)(10,461)(0)(5)(197)(426)(0)
F1293573025310505510,95314,699965849,07243,35349,99944349,82249,9228679
(7181)(45)(81)(4578)(6044)(14,643)(5204)(1526)(6153)(0)(178)(375)(190)(13,743)
F1314,277106314,850183046,40947,58946,80416,92049,24849,99970349,92249,99839259
(2793)(102)(2653)(5416)(3263)(3459)(4071)(2372)(721)(0)(690)(70)(1)(8692)
F143044717430,6206273042827404298849,50743,44249,99935949,51349,8703924
(20,306)(88)(22,023)(873)(20,332)(19,241)(1819)(780)(5080)(0)(768)(782)(240)(6743)
F153031813023730329718330816729528307285184
(6)(6)(6)(17)(7)(7)(10)(20)(9)(25)(21)(19)(22)(7)
F16267821101571631152786325,14348,48149,9966449,89749,970888
(60)(15)(21)(43)(72)(71)(20)(13,492)(1381)(6)(17)(145)(81)(34)
F1744027264266542783223344449,98930,26949,9987549,98749,99816,011
(1252)(18)(15)(595)(1104)(729)(707)(13)(7299)(0)(24)(17)(1)(5032)
F1844,394665028843,18548,62546,91046,65049,36745,60049,99973649,35449,67926,117
(5789)(12519)(205)(5242)(986)(1507)(3721)(677)(4301)(0)(1541)(658)(319)(11,794)
F193851953753038035919447917444737474435192
(27)(13)(31)(6)(26)(25)(5)(101)(16)(97)(26)(99)(94)(6)
F206632116403965361520970821063352694623245
(34)(21)(30)(18)(16)(21)(14)(62)(25)(70)(32)(66)(70)(17)
F2142515852941,47448,52047,21045,157853849,32049,7675149,44349,94047,027
(69)(30)(106)(8300)(1006)(1836)(4091)(12362)(529)(439)(27)(677)(85)(2504)
F2244,93915,05049,99925,38146,46044,79649,99047,53644,06949,99918,74646,94947,93249,985
(4800)(11,628)(0)(15,057)(5448)(5873)(8)(2663)(4699)(0)(13,839)(3518)(2883)(15)
F23443593435547242143549,99539,61549,9995849,99149,9981276
(35)(13)(4)(29)(52)(26)(40)(3)(9172)(0)(29)(8)(1)(274)
F2422,208172325,8561113,6804089143117,557154924,593347912,641946943,818
(14,958)(151)(1126)(4)(13,093)(85)(170)(14,169)(127)(1456)(10,405)(3699)(3695)(7627)
F2512,9724991683805113,17511,75912,30549,94237,07149,99910,73049,99249,99043,810
(13,910)(468)(66)(6134)(14,209)(11,152)(13,043)(56)(2584)(0)(16,943)(5)(10)(8205)
F26418225693561234142114082367446,01045,34849,999130347,09149,81613,990
(1571)(1527)(1527)(1990)(1552)(1569)(1588)(7701)(6555)(0)(2433)(3918)(277)(9456)
F2726,95134425,400921223,80030,52431,15043,48044,97749,999995347,56046,58227,162
(17,884)(92)(18,353)(3658)(16,455)(17,310)(16,665)(6458)(6360)(0)(14,823)(2370)(4419)(21,555)
F2820,38764149,999720445,86547,67442,10323,92348,89449,999243643,17845,98745,473
(12935)(559)(0)(3331)(4309)(1666)(14,129)(9779)(900)(0)(4258)(5956)(2088)(13,321)
Table 10. Last iteration in which a replacement of the best individual occurs (LtI-BI). The population size is set as 140.
HYBPOP (first seven columns) | HYBIND (last seven columns)
Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO | Jaya | CJaya | SCA | RAO1 | RAO2 | RAO3 | TLBO
F1015,863000015,730015,94614,47300015,509
(0)(84)(0)(0)(0)(0)(116)(0)(1848)(4866)(0)(0)(0)(2075)
F2012,473000012,344015,34812,87300015,188
(0)(5550)(0)(0)(0)(0)(5609)(0)(4498)(6459)(0)(0)(0)(4560)
F3475332408329765943616720311288
(382)(31)(2)(367)(404)(275)(55)(15)(13)(7)(6)(0)(26)(48)
F4148197016409027274026450
(2)(27)(18)(11)(0)(19)(16)(0)(15)(18)(10)(0)(26)(20)
F50381000017068210048
(0)(33)(15)(0)(0)(0)(19)(0)(62)(20)(0)(0)(7)(13)
F624966321685032134415723908535,901
(41)(143)(19)(5)(42)(2653)(1169)(9)(14)(93)(89)(0)(112)(1651)
F717642311194110775315581102121024624,451
(1964)(24)(22)(2372)(1547)(7355)(8283)(0)(18)(4)(23)(6)(34)(11,193)
F856817808600873513,76618,31604500019344,652
(8188)(33)(0)(9945)(12,438)(13,950)(11,867)(0)(16)(0)(0)(0)(329)(5431)
F903322000132410330803013086
(0)(2657)(0)(0)(0)(2)(2667)(0)(1939)(0)(5)(0)(0)(1971)
F10025,452000125,355027,404792600027,330
(0)(9341)(0)(0)(0)(2)(9451)(0)(7900)(12,114)(0)(0)(0)(7991)
F1110,5861241012,384851829,73449,9770357000049,996
(11,106)(809)(0)(14,485)(8532)(6041)(28)(0)(109)(0)(0)(0)(0)(4)
F1275084228677417582766171251684202504035
(753)(40)(44)(760)(793)(752)(812)(49)(96)(32)(128)(60)(101)(679)
F131330711138260147637421214,327
(1)(158)(86)(1)(3)(4)(5143)(1)(228)(162)(5)(1)(3)(8720)
F141150115414052116560
(22)(7)(0)(2)(2)(9)(45)(0)(11)(3)(1)(1)(7)(126)
F150621000025041220026
(0)(50)(17)(0)(0)(0)(31)(0)(28)(35)(0)(0)(3)(8)
F1612942391970511428123753
(2)(23)(8)(39)(12)(20)(6)(1)(10)(3)(24)(1)(16)(35)
F1716850154114195239210016645
(96)(13)(0)(70)(97)(57)(15)(4)(1)(0)(0)(1)(8)(188)
F1864936472566678367757682743205348314,666
(12,607)(12,609)(14)(12,188)(11,997)(12,953)(12,092)(6)(4)(0)(1594)(18)(4)(12,688)
F191598002110665100612
(1)(42)(9)(0)(0)(3)(21)(0)(42)(45)(0)(0)(10)(23)
F200491001370613400037
(0)(43)(1)(0)(0)(3)(41)(0)(47)(22)(0)(0)(0)(58)
F21147888206337909854991321020485016,989
(3)(24)(13)(5982)(15,046)(16,086)(12,564)(1)(26)(15)(4)(0)(14,515)(17,876)
F22882764855191974418330074399946,093
(164)(827)(104)(84)(42)(166)(12,373)(8)(0)(0)(183)(81)(139)(7310)
F2321150177151213247180613564
(36)(7)(0)(64)(87)(46)(11)(2)(10)(0)(12)(1)(4)(46)
F24010340000786011440000757
(0)(74)(0)(0)(0)(0)(203)(0)(85)(0)(0)(0)(0)(177)
F253665232039483720368737920116466210,7040043,752
(679)(134)(0)(312)(479)(456)(485)(0)(45)(13,984)(16,898)(0)(0)(8269)
F262659269563211599111020900
(32)(52)(4)(14)(11)(8)(141)(25)(67)(20)(22)(19)(36)(422)
F27501165004328350280213,48540587820,00127253274611,151
(919)(32)(14,995)(483)(477)(5567)(17,667)(12,128)(87)(24,493)(71)(752)(8058)(16,657)
F282362184911474013330111611,993
(2)(67)(3)(22)(7)(11)(8711)(0)(23)(5)(39)(24)(15)(7744)
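The bookkeeping metrics reported in Tables 8–10 (NoR-AI, the number of replacements each constituent algorithm contributes; LtI-AI, the last iteration in which any individual is replaced; LtI-BI, the last iteration in which the best individual is replaced) can be collected with a few counters inside the hybrid's main loop. The sketch below is not the paper's implementation: the function name `track_hybrid`, the round-robin algorithm switch, and the toy proposal step are all assumptions made for illustration.

```python
import random

def track_hybrid(fitness, pop, algorithms, max_its=1000, seed=0):
    """Toy hybrid minimization loop that records NoR-AI, LtI-AI and
    LtI-BI style counters. 'algorithms' maps a name to a proposal
    function new = f(individual, rng)."""
    rng = random.Random(seed)
    nor = {name: 0 for name in algorithms}   # NoR-AI per constituent algorithm
    lti_ai = 0                               # last iteration with any replacement
    lti_bi = 0                               # last iteration improving the best
    best = min(pop, key=fitness)
    names = list(algorithms)
    for it in range(1, max_its + 1):
        name = names[it % len(names)]        # round-robin switch between algorithms
        for i, ind in enumerate(pop):
            cand = algorithms[name](ind, rng)
            if fitness(cand) < fitness(ind): # greedy replacement of the individual
                pop[i] = cand
                nor[name] += 1
                lti_ai = it
                if fitness(cand) < fitness(best):
                    best = cand
                    lti_bi = it
    return nor, lti_ai, lti_bi, best

# Usage: minimize f(x) = x^2 with a toy random-perturbation "algorithm".
f = lambda x: x * x
step = lambda ind, rng: ind + rng.uniform(-0.5, 0.5)
nor, lti_ai, lti_bi, best = track_hybrid(
    f, [3.0, -2.0, 1.5], {"alg_a": step, "alg_b": step}, max_its=200)
```

By construction LtI-BI can never exceed LtI-AI, since every improvement of the best individual is also a replacement; the same ordering can be observed between Tables 9 and 10.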
