Article

A Bio-Inspired Method for Mathematical Optimization Inspired by Arachnida Salticidae

by
Hernán Peraza-Vázquez
1,*,
Adrián Peña-Delgado
2,*,
Prakash Ranjan
3,
Chetan Barde
3,
Arvind Choubey
3 and
Ana Beatriz Morales-Cepeda
4
1
Instituto Politécnico Nacional, Research Center for Applied Science and Advanced Technology (CICATA), km.14.5 Carretera Tampico-Puerto Industrial Altamira, Altamira 89600, Tamaulipas, Mexico
2
Departamento de Mecatrónica y Energías Renovables, Universidad Tecnológica de Altamira, Boulevard de los Ríos km.3 + 100, Puerto Industrial Altamira, Altamira 89601, Tamaulipas, Mexico
3
Department of Electronics and Communication Engineering, Indian Institute of Information Technology Bhagalpur, Bhagalpur 813210, Bihar, India
4
Division of Graduate Studies and Research, Instituto Tecnológico de Ciudad Madero (TecNM), Juventino Rosas y Jesús Urueta s/n, Col. Los Mangos, Cd. Madero 89318, Tamaulipas, Mexico
*
Authors to whom correspondence should be addressed.
Mathematics 2022, 10(1), 102; https://doi.org/10.3390/math10010102
Submission received: 16 November 2021 / Revised: 12 December 2021 / Accepted: 23 December 2021 / Published: 29 December 2021

Abstract

This paper proposes a new meta-heuristic called the Jumping Spider Optimization Algorithm (JSOA), inspired by the hunting habits of Arachnida Salticidae (jumping spiders). The proposed algorithm mimics the behavior of these spiders in nature and mathematically models their hunting strategies: search, persecution, and jumping on the prey. These strategies provide a fine balance between exploitation and exploration over the solution search space for solving global optimization problems. JSOA is tested on 20 well-known benchmark mathematical problems taken from the literature. Further studies include the tuning of a Proportional-Integral-Derivative (PID) controller, the selective harmonic elimination problem, and a few real-world single-objective bound-constrained numerical optimization problems taken from CEC 2020. Additionally, the JSOA's performance is tested against several well-known bio-inspired algorithms taken from the literature. The statistical results show that the proposed algorithm outperforms recent algorithms from the literature and is capable of solving challenging real-world problems with unknown search spaces.

1. Introduction

Currently, meta-heuristic algorithms are widely used as one of the main techniques to obtain an optimal (or near-optimal) solution to complex problems in several engineering and scientific research areas. In practice, they can be applied to almost any linear or non-linear optimization problem expressed through one or several objective functions subject to intrinsic restrictions. For example, they are quite useful for solving very complex problems where deterministic algorithms get trapped in a local optimum. Nowadays, meta-heuristics have become effective alternatives for solving NP-hard problems due to their versatility in exploring several local solutions in real-world applications, e.g., the optimal design of structural engineering problems [1,2], logistics and industrial manufacturing [3], renewable energy systems [4], and the optimization of Deep Neural Network (DNN) models [5], among other applications. Metaheuristics generate several search agents through stochastic processes within the solution space to find the optimal value or a value close to it. The goal is to achieve efficient exploitation (intensification) and exploration (diversification) of the search space, where exploitation and exploration are associated with local and global search, respectively. In any metaheuristic design, a fine balance between these search processes should be a desired objective. Metaheuristics are not perfect; one weakness is that the quality of the solution depends on the number of search agents and on the stop condition of the algorithm, commonly determined by the number of iterations.
On the other hand, the scientific community continues to develop new metaheuristic algorithms because no single algorithm yields good results in all engineering and research fields. An algorithm may obtain good results when optimizing specific problems in a particular field, yet fail to find the global optimum in other fields [6,7].
Many metaheuristics have been developed in recent years; they can be classified into the following categories:
Swarm-Based Algorithms, which include Particle Swarm Optimization (PSO) [8], Salp Swarm Algorithm (SSA) [9], Whale Optimization Algorithm (WOA) [10], Chameleon Swarm Algorithm (CSA) [11], Orca Predation Algorithm (OPA) [12], African Vultures Optimization Algorithm (AVOA) [13], Dingo Optimization Algorithm (DOA) [14], Black Widow Optimization Algorithm (BWOA) [15], Coot Bird Algorithm (COOT) [16], Mexican Axolotl Optimization (MAO) [17], Golden Eagle Optimizer (GEO) [18], Ant Lion Optimizer (ALO) [19], Coronavirus Optimization (CRO) [20], Archimedes Optimization Algorithm (AOA) [21], Arithmetic Optimization Algorithm (ArOA) [22], Gradient-based Optimizer (GBO) [23], Hunger Game Search (HGS) [24], Henry Gas Solubility Optimization (HGSO) [25], and Harris Hawks Optimization (HHO) [26], among others.
Evolutionary Algorithms, which include Differential Evolution (DE) [27], Genetic Algorithm (GA) [28], Genetic Programming (GP) [29], and the Biogeography-Based Optimizer (BBO) [30], among others.
Physics-Based Algorithms, which include Multi-Verse Optimization (MVO) [31], Water Wave Optimization (WWO) [32], Thermal Exchange Optimization (TEO) [33], Cyclical Parthenogenesis Algorithm (CPA) [34], Magnetic Charged System Search (MCSS) [34], and Colliding Bodies Optimization (CBO) [34], among others.
Human-Based Algorithms, which include Harmony Search (HS) [35,36], the Ali Baba and the Forty Thieves algorithm (AFT) [37], the Fireworks Algorithm (FWA) [38,39], and Soccer-Inspired metaheuristics (SI) [40], among others.
Mathematics-Based Algorithms, which include the Sine Cosine Algorithm (SCA) [41], Chaos Game Optimization (CGO) [42], Stochastic Fractal Search (SFS) [43], and the Hyper-Spherical Search (HSS) algorithm [44], among the most prominent.
The generic structure of a swarm-based bio-inspired algorithm is depicted in Figure 1. It consists of four main phases, described as follows (a minimal code sketch of this generic loop is given after the list):
Phase 1: A set of vectors is randomly generated; the cardinality of the set is the size of the initial population. The length of each vector equals the number of variables in the problem (its dimension). Each vector represents a search agent; for example, it can model an animal (reptile, mammal, bird, amphibian, or insect) or even a physical or chemical phenomenon. This initial population evolves at each iteration.
Phase 2: Vector recombination functions (search agents). In this phase, the mathematical model that represents the behavior of the modeled living being is proposed, e.g., hunting, breeding, or mating. The model must maintain a good balance between exploration (diversification) and exploitation (intensification) over the search solution space.
Phase 3: In this phase, the set of vectors is evaluated in the objective function, with or without restrictions; the result of evaluating each vector is called its fitness. For instance, when the objective is to minimize, the lowest fitness value corresponds to the best vector.
Phase 4: The best fitness of the previous iteration (generation) is compared with the best fitness of the current iteration. If a newly obtained value improves upon the best fitness found so far (i.e., it is lower when minimizing), that value and its corresponding vector are updated. In this way, in the worst case the best fitness of an iteration equals that of the previous iteration, but it never gets worse. It is essential to highlight that the performance of a bio-inspired algorithm can be affected by the size of the population and the number of iterations.
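To make these phases concrete, the following minimal MATLAB sketch implements the generic loop for an assumed sphere objective; the bounds, population size, and the simple move-toward-the-best recombination used in Phase 2 are illustrative placeholders, not the JSOA operators introduced later:
% Minimal sketch of the generic four-phase loop (illustrative, not the paper's code)
fobj = @(x) sum(x.^2);                      % assumed objective: sphere function
N = 30; dim = 10; maxIter = 200;            % population size, dimension, iterations
lb = -5.12; ub = 5.12;                      % box bounds of the search space

X = lb + (ub - lb) .* rand(N, dim);         % Phase 1: random initial population
fitness = zeros(N, 1);
for i = 1:N, fitness(i) = fobj(X(i, :)); end
[bestFit, idx] = min(fitness);  bestX = X(idx, :);

for t = 1:maxIter
    for i = 1:N
        % Phase 2: recombination (placeholder move toward the best agent)
        Xnew = X(i, :) + rand(1, dim) .* (bestX - X(i, :));
        Xnew = min(max(Xnew, lb), ub);      % keep the agent inside the bounds
        fNew = fobj(Xnew);                  % Phase 3: fitness evaluation
        if fNew < fitness(i)                % greedy replacement of the agent
            X(i, :) = Xnew;  fitness(i) = fNew;
        end
    end
    % Phase 4: update the best solution found so far (it never gets worse)
    [curBest, idx] = min(fitness);
    if curBest < bestFit, bestFit = curBest;  bestX = X(idx, :); end
end
fprintf('Best fitness found: %g\n', bestFit);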
Here, a new population-based bio-inspired algorithm, namely the Jumping Spider Optimization Algorithm (JSOA), is proposed for solving optimization problems. JSOA mathematically models the persecution, search, and jumping-on-the-prey hunting strategies of jumping spiders. The rest of the paper is arranged as follows:
Section 2 illustrates the JSOA details, including the inspiration, mathematical model, time complexity, and a population size analysis. The proficiency and robustness of the proposed approach are benchmarked with several numerical examples, and their comparison with state-of-the-art metaheuristics are presented in Section 3. The results and discussion of the proposed algorithm are shown in Section 4. Real-world applications are solved in Section 5. The final section of the paper summarizes the conclusions and future work.

2. Jumping Spider Optimization Algorithm (JSOA)

In this section, the inspiration of the proposed method is first discussed. Then, the mathematical model is provided.

2.1. Biological Fundamentals

Jumping spiders (Salticidae) are found in a wide range of microhabitats [45]. Salticidae is the largest family of spiders; its members are agile and dexterous jumpers a few millimeters in length. They move at high speed and are capable of long and accurate jumps. They are generally active diurnal hunters and do not build webs. Their bodies are covered with hairs that are scaly and sometimes iridescent. The palps of males, but not females, are often large and showy and are used during courtship. The front legs are somewhat larger and are used to hold the prey when the spider falls on it. Thus, running and jumping on the prey is their primary hunting method. When they move from one place to another, and especially before jumping, they attach a silk thread to the surface where they are perched to protect themselves in case the jump fails; in that case, they climb back up the thread. The silk threads are impregnated with pheromones that play a role in reproduction, social communication, and possibly navigation. In addition, jumping spiders have complex eyes and acute vision. They are known for their bright colors and elaborate ornamentation, where males are generally brighter than females. Some species can remember and recognize colors and adapt their hunting style accordingly. Further details about the jumping spiders' biology and locomotion behavior can be found in [46,47]. A photograph of a jumping spider can be seen in Figure 2.

2.2. Mathematical Model and Optimization Algorithm

In this section, the mathematical model of the jumping spider's different hunting strategies is first provided. The JSOA algorithm is then proposed. The hunting strategies considered are attacking by persecution, searching, and jumping on the prey. In addition, a model is also included to represent the pheromone rate of the spider.

2.2.1. Strategy 1: Persecution

When the spider is not within a distance from which it can catch its prey by jumping, it moves closer with stealthy movements until the prey is within reach of a jump. The persecution strategy can be represented by uniformly accelerated rectilinear motion, see Equation (1). Thereby, the spider moves along the coordinate axis and its velocity increases (or decreases) linearly with time under constant acceleration.
$$x_i = \frac{1}{2} a t^2 + v_0 t \qquad (1)$$
where $x_i$ denotes the position of the $i$th follower spider, $t$ is the time, and $v_0$ is the initial speed. The acceleration is given by $a = v/t$, where $v = x - x_0$.
Here, for the optimization, each iteration is considered as a time step, the difference between consecutive iterations is equal to 1, and the initial speed is set to zero, $v_0 = 0$. Thereby, Equation (1) can be re-defined as follows:
$$x_i(g+1) = \frac{1}{2}\left(x_i(g) - x_r(g)\right) \qquad (2)$$
where $x_i(g+1)$ is the new position of a search agent (jumping spider) for generation $g+1$, $x_i(g)$ is the current $i$th search agent in generation $g$, and $x_r(g)$ is the $r$th search agent, selected at random with $i \neq r$, where $r$ is a random integer generated in the interval from 1 to the maximum number of search agents. A representation of this strategy can be seen in Figure 3.
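As an illustration, Equation (2) amounts to a couple of lines of MATLAB applied to one row of a population matrix; the matrix below and the index of the spider being moved are placeholders:
% Sketch of the persecution move of Equation (2)
N = 30; dim = 10;
X = rand(N, dim);                        % illustrative current population, one spider per row
i = 1;                                   % spider being moved
r = randi(N);
while r == i, r = randi(N); end          % random partner with r ~= i
Xnew = 0.5 * (X(i, :) - X(r, :));        % x_i(g+1) = (1/2)(x_i(g) - x_r(g))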

2.2.2. Strategy 2: Jumping on the Prey

The jumping spider follows its prey and pounces on it. The hunting strategy of jumping on the prey can be represented as a projectile motion; see Figure 4.
The equations of projectile motion, which result from the composition of a uniform motion along the X-axis and a uniformly accelerated motion along the Y-axis, are as follows:
The horizontal position and its derivative with respect to time are represented by Equation (3).
$$x = v_0 \cos(\alpha)\, t\, \hat{\imath}, \qquad \frac{dx}{dt} = V_x = v_0 \cos(\alpha)\, \hat{\imath} \qquad (3)$$
Similarly, the vertical position and its derivative with respect to time are represented by Equation (4).
$$y = \left(v_0 \sin(\alpha)\, t - \frac{1}{2} g t^2\right)\hat{\jmath}, \qquad \frac{dy}{dt} = V_y = \left(v_0 \sin(\alpha) - g t\right)\hat{\jmath} \qquad (4)$$
The time is represented similarly to strategy 1. Thereby, we obtain the equation of the trajectory, as seen in Equation (5).
$$y = x \tan(\alpha) - \frac{g x^2}{2 V_0^2 \cos^2(\alpha)} \qquad (5)$$
Finally, the trajectory depicted in Figure 4 can be expressed as follows:
$$x_i(g+1) = x_i(g) \tan(\alpha) - \frac{g\, x_i^2(g)}{2 V_0^2 \cos^2(\alpha)}, \qquad \alpha = \frac{\phi \pi}{180} \qquad (6)$$
where $x_i(g+1)$ is the new position of a search agent (indicating the jumping spider's movement), $x_i(g)$ is the current $i$th search agent, $V_0$ is set to 100 mm/s, $g$ is the gravitational acceleration (9.80665 m/s$^2$), and the angle $\alpha$ is calculated from an angle value $\phi$ randomly generated in the interval (0, 1).
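A direct element-wise transcription of Equation (6) is sketched below; the population matrix is a placeholder, while $V_0$ and the gravity constant take the values stated above:
% Sketch of the jump-on-the-prey move of Equation (6)
N = 30; dim = 10;
X = rand(N, dim);                        % illustrative current population
i = 1;                                   % spider being moved
V0 = 100;                                % initial speed (100 mm/s, as stated above)
grav = 9.80665;                          % gravitational acceleration
phi = rand;                              % random value in (0, 1)
alpha = phi * pi / 180;                  % alpha = phi*pi/180
Xnew = X(i, :) .* tan(alpha) - grav .* X(i, :).^2 ./ (2 * V0^2 * cos(alpha)^2);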

2.2.3. Strategy 3: Searching for Prey

The jumping spider performs a random search around its environment to locate prey. Two mathematical functions, a local search and a global search, are proposed to model this search technique; see Figure 5.
The local search is described in Equation (7)
$$x_i(g+1) = x_{best}(g) + walk\left(\frac{1}{2} - \varepsilon\right) \qquad (7)$$
where $x_i(g+1)$ is the new position of a search agent, $x_{best}(g)$ is the best search agent found in the previous iteration, $walk$ is a pseudo-random number uniformly distributed in the interval (−2, 2), and $\varepsilon$ is a normally distributed pseudo-random number in the interval (0, 1).
On the other hand, the Global search is formulated by Equation (8).
$$x_i(g+1) = x_{best}(g) + \left(x_{best}(g) - x_{worst}(g)\right)\lambda \qquad (8)$$
where $x_i(g+1)$ is the new position of a search agent, $x_{best}(g)$ and $x_{worst}(g)$ are the best and worst search agents found in the previous iteration, respectively, and $\lambda$ is a Cauchy random number with $\mu$ set to 0 and $\theta$ set to 1.
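Both search moves can be sketched as follows; the best and worst agents are placeholders, $\varepsilon$ is drawn with randn, the scalar step is added to every coordinate, and the Cauchy number is generated by inverse-CDF sampling, all of which are implementation assumptions rather than details fixed by the paper:
% Sketch of the local search, Equation (7), and the global search, Equation (8)
dim = 10;
xBest  = rand(1, dim);                   % best agent from the previous iteration (placeholder)
xWorst = rand(1, dim);                   % worst agent from the previous iteration (placeholder)

walk = -2 + 4 * rand;                    % uniform pseudo-random number in (-2, 2)
epsv = randn;                            % normally distributed pseudo-random number
xLocal = xBest + walk * (0.5 - epsv);    % Equation (7)

lambda = tan(pi * (rand - 0.5));         % standard Cauchy random number (location 0, scale 1)
xGlobal = xBest + (xBest - xWorst) * lambda;   % Equation (8)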

2.2.4. Strategy 4: Jumping Spider's Pheromone Rates

Pheromones are chemical substances produced and secreted by an individual; they are perceived olfactorily by other individuals of the same species and cause a behavioral change. Pheromones are produced by many animals, among them insects and spiders. In some spiders, such as the black widow, pheromones play a notable role in courtship and mating, whereas in the jumping spider courtship is based on its striking colors. Nevertheless, jumping spiders also produce pheromones; the modeling of the pheromone rate is taken from [15] and defined in the following equation:
$$pheromone(i) = \frac{Fitness_{max} - Fitness(i)}{Fitness_{max} - Fitness_{min}} \qquad (9)$$
where $Fitness_{max}$ and $Fitness_{min}$ are the worst and the best fitness values in the current generation, respectively, whereas $Fitness(i)$ is the current fitness value of the $i$th search agent. Equation (9) normalizes the fitness value to the interval [0, 1], where 0 is the worst pheromone rate and 1 is the best.
The criterion is that, for low pheromone rates (values equal to or less than 0.3), the following equation is applied [15]:
$$x_i(g) = x_{best}(g) + \frac{1}{2}\left(x_{r_1}(g) - (-1)^{\sigma}\, x_{r_2}(g)\right) \qquad (10)$$
where $x_i(g)$ is the search agent (jumping spider) with a low pheromone rate that will be updated, $r_1$ and $r_2$ are random integers generated in the interval from 1 to the maximum number of search agents, with $r_1 \neq r_2$, $x_{r_1}(g)$ and $x_{r_2}(g)$ are the selected $r_1$th and $r_2$th search agents, $x_{best}(g)$ is the best search agent found in the previous iteration, and $\sigma$ is a randomly generated binary number, $\sigma \in \{0, 1\}$. The pheromone procedure is shown in Algorithm 1; a code sketch of this procedure is given after the listing.
Algorithm 1 Pheromone procedure
1:
 Begin procedure
2:
Compute pheromone rate for all spiders (search agents) by Equation (9)
3:
for i = 1 to sizePopulation do
4:
      if pheromone(i) ≤ 0.3 then
5:
            search agent updated by Equation (10)
6:
      end if
7:
end for
8:
return x
9:
End procedure
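A compact sketch of Algorithm 1 is given below; the population, its fitness values, and the best agent are illustrative placeholders, and the small eps guard in the denominator is an implementation assumption to avoid division by zero when all fitness values coincide:
% Sketch of Algorithm 1: pheromone rates, Equation (9), and repair, Equation (10)
N = 30; dim = 10;
X = rand(N, dim);                        % illustrative population
fit = sum(X.^2, 2);                      % illustrative fitness values (minimization)
[~, b] = min(fit);  xBest = X(b, :);     % best agent of the current generation

phero = (max(fit) - fit) ./ (max(fit) - min(fit) + eps);   % Equation (9), values in [0, 1]

for i = 1:N
    if phero(i) <= 0.3                                     % low pheromone rate
        r1 = randi(N);  r2 = randi(N);
        while r2 == r1, r2 = randi(N); end                 % r1 ~= r2
        sigma = randi([0 1]);                              % binary number
        X(i, :) = xBest + 0.5 * (X(r1, :) - (-1)^sigma .* X(r2, :));   % Equation (10)
    end
end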

2.2.5. Pseudo Code for JSOA

The pseudocode for the JSOA is presented in Algorithm 2, whereas the overall flow is shown in Figure 6. A compact code sketch of this main loop is given after the listing.
Algorithm 2 Jumping Spider Optimizer Algorithm
1:
Procedure JSOA
2:
Generate the initial population randomly (A set of spiders, search agents)
3:
while iteration < Max Number of Iterations do
4:
    if random < 0.5 then (Attack or Search?)
5:
        if random < 0.5 then
6:
            Strategy 1: Attack by persecution, Equation (2)
7:
        else
8:
            Strategy 2: Attack by jumping on the prey, Equation (6)
9:
        end if
10:
  else
11:
      if random < 0.5 then
12:
          Strategy 3: Search for prey by local search, Equation (7).
13:
      else
14:
          Strategy 3: Search for prey by global search, Equation (8).
15:
      end if
16:
  end if
17:
  Update search agents that have low pheromone rate (computed by Equations (9) and (10)). See Algorithm 1.
18:
   Calculate x_new, the fitness value of the search agents
19:
   if x_new < x* then
20:
           x* = x_new
21:
   end if
22:
Iteration = Iteration + 1
23:
  end while
24:
Display x*, the best optimal solution
25:
End procedure
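The following compact MATLAB sketch follows Algorithm 2, reusing Equations (2) and (6)-(10) exactly as written above; the objective function, the bounds, the boundary clamping, and the policy of always replacing an agent by its new position are implementation assumptions, and this is not the authors' released code:
% Compact sketch of Algorithm 2 (JSOA main loop); fobj, bounds, and sizes are illustrative
fobj = @(x) sum(x.^2);                          % example objective (sphere)
N = 30; dim = 10; maxIter = 200; lb = -5.12; ub = 5.12;
V0 = 100; grav = 9.80665;

X = lb + (ub - lb) .* rand(N, dim);             % initial population of spiders
fit = arrayfun(@(i) fobj(X(i, :)), (1:N)');
[bestFit, b] = min(fit);  xBest = X(b, :);

for t = 1:maxIter
    [~, w] = max(fit);  xWorst = X(w, :);
    for i = 1:N
        if rand < 0.5                           % attack
            if rand < 0.5                       % Strategy 1: persecution, Equation (2)
                r = randi(N); while r == i, r = randi(N); end
                Xn = 0.5 * (X(i, :) - X(r, :));
            else                                % Strategy 2: jump on the prey, Equation (6)
                alpha = rand * pi / 180;
                Xn = X(i, :) * tan(alpha) - grav * X(i, :).^2 / (2 * V0^2 * cos(alpha)^2);
            end
        else                                    % search
            if rand < 0.5                       % local search, Equation (7)
                Xn = xBest + (-2 + 4 * rand) * (0.5 - randn);
            else                                % global search, Equation (8)
                Xn = xBest + (xBest - xWorst) * tan(pi * (rand - 0.5));
            end
        end
        X(i, :) = min(max(Xn, lb), ub);         % clamp to the bounds (assumption)
        fit(i) = fobj(X(i, :));
    end
    % Pheromone-based repair of weak agents, Equations (9) and (10); see Algorithm 1
    phero = (max(fit) - fit) ./ (max(fit) - min(fit) + eps);
    for i = find(phero <= 0.3)'
        r1 = randi(N); r2 = randi(N); while r2 == r1, r2 = randi(N); end
        X(i, :) = xBest + 0.5 * (X(r1, :) - (-1)^randi([0 1]) .* X(r2, :));
        X(i, :) = min(max(X(i, :), lb), ub);
        fit(i) = fobj(X(i, :));
    end
    [curBest, b] = min(fit);
    if curBest < bestFit, bestFit = curBest; xBest = X(b, :); end
end
fprintf('Best fitness found: %g\n', bestFit);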

2.3. JSOA Algorithm Analysis

In this section, an in-depth analysis of the JSOA algorithm is carried out. This analysis includes the JSOA’s time complexity and the associated effects of the population size in the algorithm’s performance.

2.3.1. Time Complexity

Without any loss of generality, let f be any optimization problem and suppose that O(f) is the computational time complexity of evaluating its function value. Thereby, the JSOA computational time complexity is defined as O (f * tMax * nSpiders), where tMax is the maximum number of iterations, and nSpiders is the number of spiders (population size).

2.3.2. Population Size Analysis

The effects of the population size on the performance of the JSOA algorithm are studied by fixing the number of iterations to 100 and varying the population size over 50, 100, 200, and 500 for the Griewank function (F2). The output of these tests is summarized in Figure 7, where a convergence graph is used to determine whether the measured quantity (fitness) converged acceptably. The results were compared graphically under the setup mentioned above. Figure 8, in turn, compares the JSOA algorithm with the four most recent bio-inspired algorithms from the first half of 2021: the Coot Bird Algorithm (COOT) [16], Chaos Game Optimization (CGO) [42], Mexican Axolotl Optimization (MAO) [17], and the Golden Eagle Optimizer (GEO) [18].
Additionally, the algorithms were tested as micro-algorithms with a reduced number of iterations and population size, 30 and 50, respectively. Figure 9 shows the results of this analysis.
Notice that JSOA outperforms all the other algorithms, even as a micro-algorithm. Based on this, for the rest of the paper, JSOA tests will be conducted with a population size of 200 for testbench functions and 100 for real-world problems.

3. Experimental Setup

The numerical efficiency, effectiveness, and stability of the JSOA algorithm developed in this study were tested by solving 20 classical, well-known benchmark optimization functions reported in the literature [48]. The classification of the testbench functions can be observed in Table A1. The unimodal functions allow testing the exploitation ability (local search), since they have only one global optimum [14]. In contrast, the multimodal functions test the exploration ability (global search), since they include many local optima [14]. Table A2 summarizes these testbench functions, where Dim indicates the dimension of the function, Interval is the boundary of the function's search space, and $f_{min}$ is the optimum value. The shapes of the testbench functions considered in this study are shown in Appendix A. The JSOA algorithm was compared with ten state-of-the-art bio-inspired algorithms taken from the literature: Coot Bird Algorithm (COOT) [16], Chaos Game Optimization (CGO) [42], Mexican Axolotl Optimization (MAO) [17], Golden Eagle Optimizer (GEO) [18], Archimedes Optimization Algorithm (AOA) [21], Arithmetic Optimization Algorithm (ArOA) [22], Gradient-based Optimizer (GBO) [23], Hunger Game Search (HGS) [24], Henry Gas Solubility Optimization (HGSO) [25], and Harris Hawks Optimization (HHO) [26]. For each testbench function, the eleven algorithms were run 30 times; the size of the population (search agents) and the number of iterations were set to 30 and 200, respectively. The complete parameter settings for all algorithms are shown in Table 1.
Our approach is implemented in MATLAB R2018a. All computations were carried out on a standard PC (Linux Kubuntu 20.04 LTS, Intel Core i7, 2.50 GHz, 32 GB RAM).

4. Results and Discussion

The comparison results are shown in Table 2, where $f_{min}$, the best, the mean, and the standard deviation values are reported. The convergence graphs of all functions for all algorithms selected in this study are summarized in Figure 10. In order to investigate the significant differences between the results of the proposed JSOA and the other algorithms, the Wilcoxon rank-sum non-parametric statistical test was carried out at a 5% significance level. This statistical test returns a p-value that determines the significance of the difference between two algorithms. In this experimentation, an algorithm is considered statistically significantly different if and only if the calculated p-value is less than 0.05. Table 3 summarizes the results of this test.
Moreover, to evaluate the quality of the results, the Mean Absolute Error (MAE) criterion was used to measure the error between the best-known $f_{min}$ (fitness) and the computed mean fitness, as described in Equation (11). See Table A2.
$$\mathrm{MAE} = \frac{\sum_{i=1}^{n} |m_i - p_i|}{n} \qquad (11)$$
where $m_i$ indicates the mean of the optimal values obtained (computed), $p_i$ is the corresponding global optimal value (observed), and $n$ represents the number of test functions. Note that the absolute error for each function is $e_i = |m_i - p_i|$, and the MAE is its arithmetic average; it shows how far the results are from the actual values. The eleven algorithms were ranked by computing their Mean Absolute Error (MAE). Table 4 shows the average error rates obtained on the 20 testbench functions, while the ranking of all the algorithms based on their MAE values is illustrated in Table 5.
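Equation (11) amounts to a one-line computation; the two vectors below are illustrative placeholders, not values from Table 4:
% Sketch of the MAE criterion of Equation (11)
meanFitness = [1.2e-3; 0.0; 4.5e-2];     % m_i: mean fitness obtained on each test function (placeholder)
trueOptimum = [0.0;    0.0; 0.0   ];     % p_i: known global optimum of each test function (placeholder)
MAE = sum(abs(meanFitness - trueOptimum)) / numel(trueOptimum);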
According to the statistical results given in Table 2, the Jumping Spider Optimization Algorithm (JSOA) provides outstanding results. In the exploitation analysis (unimodal functions), the JSOA outperforms all algorithms on functions F1, F3, F4, F5, and F10, whereas competitive results were found on functions F2, F6, F7, F8, and F9 against the Chaos Game Optimization (CGO), Archimedes Optimization Algorithm (AOA), Gradient-based Optimizer (GBO), Hunger Game Search (HGS), Henry Gas Solubility Optimization (HGSO), Harris Hawks Optimization (HHO), and Arithmetic Optimization Algorithm (ArOA) algorithms. Moreover, in the exploration analysis (multimodal functions), the JSOA outperforms all algorithms on functions F14, F16, F17, F18, and F19, whereas it also shows competitive results with COOT, CGO, HGSO, HHO, HGS, and ArOA on the F11, F12, F13, F15, and F20 functions. On the other hand, in the Wilcoxon rank-sum test, the p-values in Table 3 confirm the meaningful advantage of JSOA compared to the other bio-inspired algorithms in many cases. Additionally, in the Mean Absolute Error (MAE) analysis, the JSOA algorithm is ranked in the second position, with a slight difference of 1.43 × 10⁻² with respect to the first position, as shown in Table 5.
In all the tests carried out, the number of iterations was set to 200, a lower value than that commonly used in the literature (500 or 1000) [16,17,18]. Therefore, the JSOA algorithm is efficient in finding optimal (near-optimal) solutions with a smaller number of iterations.

5. Real-World Applications

In this section, constrained optimization problems are considered. The JSOA algorithm was tested with four Real-World Single Objective Bound Constrained Numerical Optimization problems, Process Flow Sheeting, Process Synthesis, Optimal Design of an Industrial Refrigeration System and Welded Beam Design, taken from the CEC2020 special session [49].
On the other hand, the JSOA algorithm was also tested to find the optimal tuning parameters of a Proportional-Integral-Derivative (PID) controller and to solve the Selective Harmonics Elimination Problem, taken from [14,15], respectively.
For all the real-world application problems solved, JSOA was tested against COOT, CGO, MAO, GEO, and the best ranked in the MAE test (HHO), with population size and maximum iteration equal to 30 and 100, respectively.

5.1. Constraint Handling

The constraint handling method used is based on the Penalization of Constraints (PCSt) [14] and on computing the Mean Constraint Violation (MCV) [49].
The PCSt handling is based on the penalization of infeasible solutions, that is, solutions for which at least one constraint is violated. This method is formulated in Equation (12) as follows:
$$F(x) = \begin{cases} f(x), & \text{if } \nu(x) \le 0 \\ f_{max} + \nu(x), & \text{otherwise} \end{cases} \qquad (12)$$
where $f_{max}$ is the fitness function value of the worst feasible solution in the population, $f(x)$ is the fitness function value of a feasible solution, and $\nu(x)$ is the value of the MCV. The MCV handling is depicted in Equation (13).
$$\nu(x) = \frac{\sum_{i=1}^{p} G_i(x) + \sum_{j=1}^{m} H_j(x)}{p + m} \qquad (13)$$
where $G_i(x)$ accounts for the $p$ inequality constraints and $H_j(x)$ for the $m$ equality constraints. $G_i(x)$ and $H_j(x)$ are formulated in Equations (14) and (15), respectively.
$$G_i(x) = \begin{cases} 0, & \text{if } g_i(x) \le 0 \\ g_i(x), & \text{otherwise} \end{cases} \qquad (14)$$
$$H_j(x) = \begin{cases} 0, & \text{if } |h_j(x)| - \delta \le 0 \\ |h_j(x)|, & \text{otherwise} \end{cases} \qquad (15)$$
If the inequality constraint $g_i(x)$ or the equality constraint $h_j(x)$ is not violated, a value of zero is returned; otherwise, the value of the violated constraint is returned. That is to say, the fitness of an infeasible solution is penalized by the fitness of the worst feasible solution in the current population plus the mean value of the violated constraints. In Equation (15), the $\delta$ value is set to 0.0001.
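The constraint handling of Equations (12)-(15) can be sketched for a single candidate as follows; the objective, the constraint handles, the candidate point, and the worst feasible fitness are illustrative assumptions:
% Sketch of the PCSt/MCV constraint handling, Equations (12)-(15), for one candidate x
f = @(x) sum(x.^2);                         % objective (illustrative)
g = @(x) [x(1) + x(2) - 1;  -x(1)];         % inequality constraints g_i(x) <= 0 (illustrative)
h = @(x) x(1) - 0.5;                        % equality constraints h_j(x) = 0 (illustrative)
delta = 1e-4;                               % tolerance stated after Equation (15)

x = [0.7, 0.6];                             % candidate solution (illustrative)
gv = max(0, g(x));                          % Equation (14): G_i(x)
hv = abs(h(x));  hv(hv <= delta) = 0;       % Equation (15): H_j(x)
nu = (sum(gv) + sum(hv)) / (numel(gv) + numel(hv));   % Equation (13): mean constraint violation

fmaxFeasible = 1.25;                        % worst feasible fitness in the population (illustrative)
if nu <= 0
    F = f(x);                               % feasible: keep its own fitness
else
    F = fmaxFeasible + nu;                  % infeasible: penalized fitness, Equation (12)
end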

5.2. Process Flow Sheeting Problem

This problem is formulated as a non-convex constrained optimization problem [49]. In this test, there are three decision variables and three inequality constraints. It is formulated as shown in Equation (16). The best-known feasible objective function value is $f(x) = 1.0765430833$.
$$\begin{aligned} \text{Minimize } \quad & f(x) = -0.7 x_3 + 5\,(0.5 - x_1)^2 + 0.8 \\ \text{Subject to } \quad & g_1(x) = -e^{(x_1 - 0.2)} - x_2 \le 0 \\ & g_2(x) = x_2 + 1.1 x_3 \le -1.0 \\ & g_3(x) = x_1 - x_3 \le 0.2 \\ \text{with bounds: } \quad & 0.2 \le x_1 \le 1, \quad -2.22554 \le x_2 \le -1, \quad x_3 \in \{0, 1\} \end{aligned} \qquad (16)$$
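Following the formulation in Equation (16), the problem can be encoded for the constraint handling sketched above as shown next; rounding x(3) to enforce the binary variable and the trial point used to sanity-check the encoding are assumptions of this sketch, not part of the paper:
% Sketch of an encoding of the Process Flow Sheeting problem, Equation (16)
fFlow = @(x) -0.7 * round(x(3)) + 5 * (0.5 - x(1))^2 + 0.8;
gFlow = @(x) [ -exp(x(1) - 0.2) - x(2);          % g1(x) <= 0
                x(2) + 1.1 * round(x(3)) + 1.0;  % g2(x) <= 0
                x(1) - round(x(3)) - 0.2 ];      % g3(x) <= 0
lbFlow = [0.2, -2.22554, 0];  ubFlow = [1, -1, 1];

xTry = [0.94194, -2.1, 1];                       % illustrative trial point near the best-known value
fVal = fFlow(xTry);                              % approximately 1.07654
gVal = gFlow(xTry);                              % all entries <= 0 when the point is feasible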
In Table 6, the comparison results show that the JSOA, CGO, COOT, and HHO algorithms reported feasible solutions, whereas MAO and GEO are infeasible, as seen in the convergence graph in Figure 11. An infeasible solution does not satisfy one or more constraints, so it is not valid for the problem. The JSOA algorithm is ranked as the first best-obtained solution. The difference with respect to the best-known feasible objective function value is 1.17347 × 10⁻⁵.

5.3. Process Synthesis Problem

This problem has non-linearities in both real and binary variables [50]. Here, there are seven decision variables and nine inequality constraints. The mathematical formulation of the optimization problem is described in Equation (17). The best-known feasible objective function value is $f(x) = 2.9248305537$.
M i n i m i z e f ( x ) = ( 1 x 1 ) 2 + ( 2 x 2 ) 2 + ( 3 x 3 ) 2 + ( 1 x 4 ) 2 + ( 1 x 5 ) 2 + ( 1 x 6 ) 2 ln ( 1 + x 7 ) S u b j e c t   to g 1 ( x ) = x 1 + x 2 + x 3 + x 4 + x 5 + x 6 5 g 2 ( x ) = x 1 2 + x 2 2 + x 3 2 + x 6 2 5.5 g 3 ( x ) = x 1 + x 4 1.2 g 4 ( x ) = x 2 + x 5 1.8 g 5 ( x ) = x 3 + x 6 2.5 g 6 ( x ) = x 1 + x 7 1.2 g 7 ( x ) = x 2 2 + x 5 2 1.64 g 8 ( x ) = x 3 2 + x 6 2 4.25 g 9 ( x ) = x 3 2 + x 5 2 4.64 w i t h   bounds 0 x 1 , x 2 , x 3 1 , x 4 , x 5 , x 6 , x 7 { 0 , 1 }
In Table 7, the comparison results show that the JSOA and COOT algorithms reported feasible solutions, whereas MAO, CGO, GEO, and HHO are infeasible, as seen in the convergence graph in Figure 12. Note that the JSOA algorithm is ranked as the first best-obtained solution. The difference with respect to the best-known feasible objective function value is 1.21 × 10⁻⁴.

5.4. Optimal Design of an Industrial Refrigeration System

The problem is formulated as a non-linear inequality-constrained optimization problem [50]. Here, there are fourteen decision variables and fifteen inequality constraints. The best-known feasible objective function value is $f(x) = 3.22130008 \times 10^{-2}$. This problem can be stated as follows:
M i n i m i z e f ( x ) = 63098.88 x 2 x 4 x 12 + 5441.5 x 2 2 x 12 + 115055.5 x 2 1.664 x 6 + 6172.27 x 2 2 x 6 + 63098.88 x 1 x 3 x 11 + 5441.5 x 1 2 x 11 + 115055.5 x 1 1.664 x 5 + 6172.27 x 1 2 x 5 + 140.53 x 1 x 11 281.29 x 3 x 11 + 70.26 x 1 2 + 281.29 x 3 2 + 14437 x 8 1.8812 x 12 0.3424 x 10 x 14 1 x 1 2 x 7 x 9 1 + 20470.2 x 7 2.893 x 11 0.316 x 1 2 S u b j e c t   to g 1 ( x ) = 1.524 x 7 1 1 g 2 ( x ) = 1.524 x 8 1 1 g 3 ( x ) = 0.07789 x 1 2 x 7 1 x 9 1 g 4 ( x ) = 7.05305 x 9 1 x 1 2 x 10 x 8 1 x 2 1 x 14 1 1 g 5 ( x ) = 0.0833 x 13 1 x 14 1 g 6 ( x ) = 47.136 x 2 0.333 x 10 1 x 12 1.333 x 8 x 13 2.1195 + 62.08 x 13 2.1195 x 12 1 x 8 0.2 x 10 1 1 g 7 ( x ) = 0.04771 x 10 x 8 1.8812 x 12 0.3424 1 g 8 ( x ) = 0.0488 x 9 x 7 1.893 x 11 0.316 1 g 9 ( x ) = 0.0099 x 1 x 3 1 1 g 10 ( x ) = 0.0193 x 2 x 4 1 1 g 11 ( x ) = 0.0298 x 1 x 5 1 1 g 12 ( x ) = 0.056 x 2 x 6 1 1 g 13 ( x ) = 2 x 9 1 1 g 14 ( x ) = 2 x 10 1 1 g 1 ( x ) = x 12 x 11 1 1 w i t h   bounds 0.001 x i 5 ,   i = 1 , 2 , , 14
In Table 8, the comparison results show that the JSOA and COOT algorithms reported feasible solutions, whereas MAO, CGO, GEO, and HHO are infeasible, as seen in the convergence graph illustrated in Figure 13. Note that the JSOA algorithm is ranked as the first best-obtained solution. The difference with respect to the best-known feasible objective function value is 2.77 × 10⁻³.

5.5. Welded Beam Design

The main objective of this problem is to design a welded beam with minimum cost [50]. This problem contains five inequality constraints and four decision variables used to design the welded beam [51]. The best-known feasible objective function value is $f(x) = 1.6702177263$. The mathematical description of this problem is as follows:
M i n i m i z e f ( x ) = 1.10471   x 1 2 x 2 + 0.04811 x 3 x 4 ( x 2 + 14 ) S u b j e c t   t o g 1 ( x ) = x 1 x 4 0 g 2 ( x ) = δ ( x ) δ max 0 g 3 ( x ) = P P c ( x ) g 4 ( x ) = τ max τ ( x ) g 5 ( x ) = σ ( x ) σ max 0 w h e r e , τ = τ ι 2 + τ ι ι 2 + 2 τ ι τ ι ι x 2 2 R ,   τ ι ι = R M J ,   τ ι = P 2 x 2 x 1 ,   M = P ( x 2 2 + L ) , R = x 2 2 4 + ( x 1 + x 3 2 ) 2   ,   J = 2 ( ( x 2 2 4 + ( x 1 + x 3 2 ) 2 ) 2 x 1 x 2 ) ,   σ ( x ) = 6 P L x 4 x 3 2 , L = 14 i n ,   P = 6000   l b ,   E = 30.10 6 p s i ,   σ max = 30 , 000   p s i ,   τ max = 13 , 600   p s i , δ max = 0.25 i n , w i t h   bounds : 0.125 x 1 2 , 0.1 x 2 , x 3 10 , 0.1 x 4 2
In Table 9, the comparison results show that the JSOA, MAO, COOT, and CGO algorithms reported feasible solutions, whereas GEO and HHO are infeasible, as seen in the convergence graph in Figure 14. Furthermore, the COOT algorithm showed a competitive result, whereas the JSOA algorithm is ranked as the first best-obtained solution. The difference with respect to the best-known feasible objective function value is 2.27 × 10⁻⁸.

5.6. Tuning of a Proportional-Integral-Derivative (PID) Controller: Sloshing Dynamics Problem

This problem is taken from [14]. The goal is to tune a Proportional-Integral-Derivative (PID) controller for the Sloshing dynamics phenomenon (SDP). The SDP is a well-known problem in fluid dynamics. It is related to the movement of a liquid inside another object, altering the system dynamics [14]. Sloshing is an important effect on mobile vehicles carrying liquids, e.g., ships, spacecraft, aircraft, and trucks. A deficient sloshing control causes instability and accidents.
Sloshing dynamics can be depicted as a Ball and Hoop System (BHS). This setup illustrates the dynamics of a steel ball that is free to roll on the inner surface of a rotating circular hoop [52]. The ball exhibits an oscillatory motion caused by the hoop being continuously rotated by a motor. The ball tends to move in the direction of the hoop rotation and falls back, at some point, when gravity overcomes the frictional forces [14]. Seven variables describe the BHS behavior: the hoop radius ($R$), the hoop angle ($\theta$), the input torque to the hoop ($T(t)$), the ball position on the hoop ($y$), the ball radius ($r$), the ball mass ($m$), and the ball angle with the vertical (slosh angle, $\psi$) [14]. A schematic representation is shown in Figure 15. The transfer function of the BHS system, taken from [14], is formulated in Equation (20), where $\theta$ is the input and $y$ is the output of the BHS system.
$$G_{BHS}(s) = \frac{y(s)}{\theta(s)} = \frac{1}{s^4 + 6 s^3 + 11 s^2 + 6 s} \qquad (20)$$
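For reference, one candidate PID controller can be evaluated against the plant of Equation (20) with MATLAB as sketched below; the Control System Toolbox requirement, the placeholder gains, and the scalar cost combining settling time and overshoot are assumptions of this sketch, not the tuning procedure or values reported in Table 10:
% Sketch of evaluating one PID candidate on the BHS plant of Equation (20)
G = tf(1, [1 6 11 6 0]);                 % G_BHS(s) = 1/(s^4 + 6s^3 + 11s^2 + 6s)
Kp = 2; Ki = 0.1; Kd = 1;                % candidate gains proposed by the optimizer (placeholders)
C = pid(Kp, Ki, Kd);                     % PID controller
T = feedback(C * G, 1);                  % closed-loop transfer function (unity feedback)
S = stepinfo(T);                         % rise time, settling time, overshoot, peak time
cost = S.SettlingTime + S.Overshoot;     % illustrative scalar cost to be minimized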
Table 10 summarizes the comparison results for solving the BHS via the optimal tuning of a Proportional-Integral-Derivative (PID) controller. The transient response parameters of the PID controller are the rise time, settling time, peak time, and peak overshoot. The PID controller is designed to minimize the overshoot and settling time so that the liquid can remain as stable as possible under any perturbation and, if it moves, can rapidly return to its steady state [14]. It is to be noticed that JSOA and HHO show competitive results and are better than the rest of the algorithms in this test, obtaining the lowest rise time, settling time, and peak overshoot values, see Table 11. In addition, the PID controller step response for the six algorithms is shown in Figure 16. This figure shows that JSOA (blue line) and HHO (magenta line) are more stable, with very fine control and without exceeding the setpoint (dotted line).

5.7. Selective Harmonics Elimination (SHE) Problem

The selective harmonic elimination (SHE) problem is taken from [15]; it is a widely used control strategy applied to multilevel inverters (MLI) that aims at the elimination of unwanted low-order harmonics by setting them equal to zero, while the fundamental component is kept equal to the desired amplitude. Figure 17 shows the typical staircase output waveform of a single-phase multilevel inverter. This waveform is generated by the correct synchronization and switching angles of the MLI power semiconductor devices, which form cascaded H-bridge inverter modules with isolated direct current (dc) sources connected in cascade or in series.
The Fourier series expansion of this waveform is defined in Equation (21). Due to the quarter-wave symmetry of the waveform, the dc component and the Fourier coefficients $A_n$ are both equal to 0.
$$f(t) = \underbrace{A_0}_{\text{dc}} + \underbrace{\sum_{n=1}^{\infty} \left(A_n \cos(n\alpha) + B_n \sin(n\alpha)\right)}_{\text{ac}}, \qquad \omega_0 = \frac{2\pi}{T} \qquad (21)$$
By applying the Fourier transform to the staircase output waveform, it can be described as in Equation (22).
$$f(t) = V^{+} = \begin{cases} \dfrac{4 V_{dc}}{n \pi}\left[\cos(n \alpha_1) + \cdots + \cos(n \alpha_n)\right], & \text{for odd } n \\[4pt] 0, & \text{for even } n \end{cases} \qquad (22)$$
Therefore, a series of nonlinear equations must be solved for unknown angles. The mathematical relationship between the MLI isolated dc sources (S) and the number of levels (n) is shown in Equation (23).
$$n = 2S + 1 \qquad (23)$$
where the switching angles ($\alpha$) are subject to:
$$0 \le \alpha_1 \le \alpha_2 \le \cdots \le \alpha_{(s-1)} \le \alpha_s \le 90^{\circ} \qquad (24)$$
As a case study, a three-phase ($3\phi$) eleven-level multilevel inverter is selected with a modulation index equal to 1. Thus, the set of selective harmonic elimination equations that eliminates the fifth, seventh, eleventh, and thirteenth harmonics can be written as follows:
$$\begin{aligned} \cos(\alpha_1) + \cos(\alpha_2) + \cdots + \cos(\alpha_5) &= M \\ \cos(5\alpha_1) + \cos(5\alpha_2) + \cdots + \cos(5\alpha_5) &= 0 \\ \cos(7\alpha_1) + \cos(7\alpha_2) + \cdots + \cos(7\alpha_5) &= 0 \\ \cos(11\alpha_1) + \cos(11\alpha_2) + \cdots + \cos(11\alpha_5) &= 0 \\ \cos(13\alpha_1) + \cos(13\alpha_2) + \cdots + \cos(13\alpha_5) &= 0 \end{aligned} \qquad (25)$$
where $M = V_1^{*} / \left(\frac{4 V_{dc}}{\pi}\right)$ and the modulation index is defined as $m = M/5$ for $0 \le m \le 1$. $V_1^{*}$ is defined as the desired peak voltage and $V_{dc}$ is equal to the direct voltage of the isolated dc power supplies.
Therefore, the objective function [15] is defined as:
$$\min f(\alpha_1, \alpha_2, \ldots, \alpha_5) = \left[\sum_{i=1}^{5} \cos(\alpha_i) - M\right]^2 + \left[\sum_{i=1}^{5} \cos(5\alpha_i)\right]^2 + \cdots + \left[\sum_{i=1}^{5} \cos(13\alpha_i)\right]^2 \qquad (26)$$
subject to the switching angles described in Equation (24). Some of the algorithms chosen for comparison are the Whale Optimization Algorithm (WOA), the Modified Grey Wolf Optimization Algorithm (MGWOA), and the Black Widow Optimization Algorithm (BWOA). Table 12 was taken from [15] and updated with the results of the following algorithms: Coot Bird Algorithm (COOT) [16], Chaos Game Optimization (CGO) [42], Mexican Axolotl Optimization (MAO) [17], Golden Eagle Optimizer (GEO) [18], Harris Hawks Optimization (HHO) [26], and the Jumping Spider Optimization Algorithm (JSOA). The obtained near-optimal angles are then fed to a Simulink model that reproduces the staircase output waveform. Then, the Total Harmonic Distortion (THD) is calculated, and a Fourier spectrum is computed to show the correct elimination of the unwanted low-order harmonics. From Table 12, it can be seen that JSOA obtains the best fitness values, whereas Figure 18 shows the correct elimination of the fifth, seventh, eleventh, and thirteenth order harmonics. A code sketch of the objective in Equation (26) is given below.
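The sketch below writes the objective of Equation (26) for angles in degrees, using the relation $M = 5m$ stated above; the trial angles are illustrative placeholders and not the optimized angles reported in Table 12:
% Sketch of the SHE objective of Equation (26); angles are in degrees
m = 1;  M = 5 * m;                        % modulation index m = 1 for the case study, so M = 5m
sheCost = @(a) (sum(cosd(a))      - M)^2 + ...
               (sum(cosd(5  * a))    )^2 + ...
               (sum(cosd(7  * a))    )^2 + ...
               (sum(cosd(11 * a))    )^2 + ...
               (sum(cosd(13 * a))    )^2;

alphaTry = [10, 20, 30, 40, 50];          % placeholder angles with 0 <= a1 <= ... <= a5 <= 90
cost = sheCost(alphaTry);                 % value that JSOA minimizes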

6. Conclusions

This work presented a new swarm-based optimization algorithm inspired by the hunting behavior of Arachnida Salticidae, named the Jumping Spider Optimization Algorithm (JSOA). The proposed method includes three operators to simulate the spider's hunting methods (search, persecution, and jumping on the prey). These strategies work as mathematical recombination functions of vectors (spiders, also named search agents) that provide a fine balance between exploitation and exploration over the solution search space. The algorithm's performance was benchmarked on 20 test functions, four real-world optimization problems, the tuning of a Proportional-Integral-Derivative (PID) controller, and the selective harmonic elimination problem. In addition, the performance of the proposed JSOA algorithm was compared with ten of the most recent algorithms in the literature: COOT, CGO, MAO, GEO, AOA, ArOA, GBO, HGS, HGSO, and HHO. The statistical results show that the JSOA algorithm outperforms or obtains competitive results against these algorithms. This study leads to the following conclusions:
  • The JSOA algorithm does not have algorithm-specific parameters to configure that affect its performance.
  • Exploitation and exploration of the JSOA are intrinsically high on problems involving unimodal and multimodal test functions, respectively.
  • The algorithm has outstanding results with few iterations, 200 for the testbench functions and 100 for the real-world problems.
  • The JSOA algorithm models the pheromone of spiders, whose rate is used as a criterion to repair vectors with a low pheromone level. The vectors (spiders, also named search agents) with the worst fitness are replaced in each iteration. This repair helps achieve better performance in the exploration of the search space.
  • The Wilcoxon rank-sum test p-values confirm the meaningful advantage of JSOA compared to the other bio-inspired algorithms in many cases. In addition, the MAE statistical results show that the JSOA is ranked among the highest positions compared to the other algorithms.
  • JSOA can tune a Proportional-Integral-Derivative (PID) controller with very fine control, without exceeding the setpoint and with zero percent peak overshoot, for the Sloshing Dynamics Problem.
  • JSOA can solve the Selective Harmonic Elimination problem with the best fitness values compared to the WOA, MGWOA, BWOA, COOT, CGO, MAO, GEO, and HHO algorithms, and with competitive results against the BWOA algorithm regarding the Total Harmonic Distortion (THD).
  • JSOA can solve real-world problems with unknown search spaces.
The evaluation criterion for the pheromone rate lacks a sensitivity analysis of the fixed 0.3 threshold, which was determined empirically. An updated version of the JSOA algorithm that addresses this issue and solves multi- and many-objective functions is currently under development as future work.

Author Contributions

Conceptualization, H.P.-V.; methodology, H.P.-V.; software, H.P.-V. and A.P.-D.; validation, P.R., C.B. and A.B.M.-C.; formal analysis, A.C. and A.P.-D.; investigation, P.R., C.B. and A.B.M.-C.; resources, H.P.-V.; data curation, A.P.-D. and A.C.; writing—original draft preparation, H.P.-V.; writing—review and editing, H.P.-V., A.P.-D. and P.R.; visualization, C.B. and A.B.M.-C.; supervision, H.P.-V. and A.P.-D.; project administration, H.P.-V.; funding acquisition, H.P.-V. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Instituto Politécnico Nacional (IPN) through grant number SIP-20211364.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The source code used to support the findings of this study has been deposited in the MathWorks repository at https://www.mathworks.com/matlabcentral/fileexchange/104045-a-bio-inspired-method-inspired-by-arachnida-salticidade (accessed on 16 October 2021).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Classification of Testbench functions.
IDFunction NameUnimodalMultimodaln-DimensionalNon-SeparableConvexDifferentiableContinuousNon-ConvexNon-DifferentiableSeparableRandom
F1BrownX XXXX
F2GriewankX XX XX
F3Schwefel 2.20X XX XXX
F4Schwefel 2.21X X X X XX
F5Schwefel 2.22X X X X XX
F6Schwefel 2.23X X XXX X
F7SphereX X XXX X
F8Sum SquaresX X XXX X
F9Xin-She Yang N. 3X XX X X
F10ZakharovX X X X
F11Ackley XX XXX
F12Ackley N. 4 XXX X X
F13Periodic XX XXXX
F14Quartic XX XX XX
F15Rastrigin XX XXX X
F16Rosenbrock XXX XXX
F17Salomon XXX XXX
F18Xin-She Yang XX XXX
F19Xin-She Yang N. 2 XXX XX
F20Xin-She Yang N. 4 XXX XX
Table A2. Description of the Testbench Functions.
IDFunctionDimInterval f min
F1 f ( x ) = f ( x 1 , , x n ) = i = 1 n 1 ( x i 2 ) ( x i + 1 2 + 1 ) + ( x i + 1 2 ) ( x i 2 + 1 ) 30[−1, 4]0
F2 f ( x ) = f ( x 1 , , x n ) = 1 + i = 1 n x i 2 4000 i = 1 n cos ( x i i ) 30[−600, 600]0
F3 f ( x ) = f ( x 1 , , x n ) = i = 1 n | x i | 30[−100, 100]0
F4 f ( x ) = f ( x 1 , , x n ) = max i = 1 , , n | x i | 30[−100, 100]0
F5 f ( x ) = f ( x 1 , , x n ) = i = 1 n | x i | + i = 1 n | x i | 30[−100, 100]0
F6 f ( x ) = f ( x 1 , , x n ) = i = 1 n x i 10 30[−10, 10]0
F7 f ( x ) = f ( x 1 , x 2 , , x n ) = i = 1 n x i 2 30[−5.12, 5.12]0
F8 f ( x ) = f ( x 1 , , x n ) = i = 1 n i x i 2 30[−10, 10]0
F9 f ( x ) = f ( x 1 , , x n ) = e x p ( i = 1 n ( x i / β ) 2 m ) 2 e x p ( i = 1 n x i 2 ) i = 1 n cos 2 ( x i ) 30[−2π, 2π], m = 5,
β = 15
−1
F10 f ( x ) = f ( x 1 , , x n ) = i = 1 n x i 2 + ( i = 1 n 0.5 i x i ) 2 + ( i = 1 n 0.5 i x i ) 4 30[−5, 10]0
F11 f ( x ) = f ( x 1 , , x n ) = a . e x p ( b 1 n i = 1 n x i 2 ) e x p ( 1 d i = 1 n cos ( c x i ) ) + a + e x p ( 1 ) 30[−32, 32], a = 20,
b = 0.3, c = 2π
0
F12 f ( x ) = f ( x 1 , , x n ) = i = 1 n 1 ( e 0.2 x i 2 + x i + 1 2 + 3 ( cos ( 2 x i ) + sin ( 2 x i + 1 ) ) ) 2[−35, 35]−5.901 × 1014
F13 f ( x ) = f ( x 1 , , x n ) = 1 + i = 1 n sin 2 ( x i ) 0.1 e ( i = 1 n x i 2 ) 30[−10, 10]0.9
F14 f ( x ) = f ( x 1 , , x n ) = i = 1 n i x i 4 + random [ 0 , 1 ) 30[−1.28, 1.28]0 + random noise
F15 f ( x , y ) = 10 n + i = 1 n ( x i 2 10 cos ( 2 π x i ) ) 30[−5.12, 5.12]0
F16 f ( x 1 x n ) = i = 1 n 1 ( 100 ( x i 2 x i + 1 ) 2 + ( 1 x i ) 2 ) 30[−5, 10]0
F17 f ( x ) = f ( x 1 , , x n ) = 1 cos ( 2 π i = 1 D x i 2 ) + 0.1 i = 1 D x i 2 30[−100, 100]0
F18 f ( x ) = f ( x 1 , , x n ) = i = 1 n ϵ i | x i | i 30[−5, 5], ε random0
F19 f ( x ) = f ( x 1 , , x n ) = ( i = 1 n | x i | ) e x p ( i = 1 n sin ( x i 2 ) ) 30[−2π, 2π]0
F20 f ( x ) = f ( x 1 , , x n ) = ( i = 1 n sin 2 ( x i ) e i = 1 n x i 2 ) e i = 1 n sin 2 | x i | 30[−10, 10]−1
Figure A1. Testbench Unimodal functions.
Figure A2. Testbench Multimodal functions.

References

  1. Kumar, A. Application of nature-inspired computing paradigms in optimal design of structural engineering problems—A review. Nat.-Inspired Comput. Paradig. Syst. 2021, 63–74. [Google Scholar] [CrossRef]
  2. Shaukat, N.; Ahmad, A.; Mohsin, B.; Khan, R.; Khan, S.U.-D.; Khan, S.U.-D. Multiobjective Core Reloading Pattern Optimization of PARR-1 Using Modified Genetic Algorithm Coupled with Monte Carlo Methods. Sci. Technol. Nucl. Install. 2021, 2021, 1–13. [Google Scholar] [CrossRef]
  3. Lodewijks, G.; Cao, Y.; Zhao, N.; Zhang, H. Reducing CO₂ Emissions of an Airport Baggage Handling Transport System Using a Particle Swarm Optimization Algorithm. IEEE Access 2021, 9, 121894–121905. [Google Scholar] [CrossRef]
  4. Malik, H.; Iqbal, A.; Joshi, P.; Agrawal, S.; Bakhsh, F.I. (Eds.) Metaheuristic and Evolutionary Computation: Algorithms and Applications; Springer: Berlin/Heidelberg, Germany, 2021; Volume 916. [Google Scholar] [CrossRef]
  5. Elaziz, M.A.; Dahou, A.; Abualigah, L.; Yu, L.; Alshinwan, M.; Khasawneh, A.M.; Lu, S. Advanced metaheuristic optimization techniques in applications of deep neural networks: A review. Neural Comput. Appl. 2021, 33, 14079–14099. [Google Scholar] [CrossRef]
  6. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  7. Ho, Y.; Pepyne, D. Simple Explanation of the No-Free-Lunch Theorem and Its Implications. J. Optim. Theory Appl. 2002, 115, 549–570. [Google Scholar] [CrossRef]
  8. Mirjalili, S.; Song Dong, J.; Lewis, A.; Sadiq, A.S. Particle Swarm Optimization: Theory, Literature Review, and Application in Airfoil Design. In Nature-Inspired Optimizers: Theories, Literature Reviews and Applications; Mirjalili, S., Song Dong, J., Lewis, A., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 167–184. [Google Scholar]
  9. Castelli, M.; Manzoni, L.; Mariot, L.; Nobile, M.S.; Tangherloni, A. Salp Swarm Optimization: A critical review. Expert Syst. Appl. 2021, 189, 116029. [Google Scholar] [CrossRef]
  10. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  11. Braik, M.S. Chameleon Swarm Algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Syst. Appl. 2021, 174, 114685. [Google Scholar] [CrossRef]
  12. Jiang, Y.; Wu, Q.; Zhu, S.; Zhang, L. Orca predation algorithm: A novel bio-inspired algorithm for global optimization problems. Expert Syst. Appl. 2021, 188, 116026. [Google Scholar] [CrossRef]
  13. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  14. Peraza-Vázquez, H.; Peña-Delgado, A.F.; Echavarría-Castillo, G.; Morales-Cepeda, A.B.; Velasco-Álvarez, J.; Ruiz-Perez, F. A Bio-Inspired Method for Engineering Design Optimization Inspired by Dingoes Hunting Strategies. Math. Probl. Eng. 2021, 2021, 1–19. [Google Scholar] [CrossRef]
  15. Peña-Delgado, A.F.; Peraza-Vázquez, H.; Almazán-Covarrubias, J.H.; Cruz, N.T.; García-Vite, P.M.; Morales-Cepeda, A.B.; Ramirez-Arredondo, J.M. A novel bio-inspired algorithm applied to selective harmonic elimination in a three-phase eleven-level inverter. Math. Probl. Eng. 2020, 2020, 1–10. [Google Scholar] [CrossRef]
  16. Naruei, I.; Keynia, F. A new optimization method based on COOT bird natural life model. Expert Syst. Appl. 2021, 183, 115352. [Google Scholar] [CrossRef]
  17. Villuendas-Rey, Y.; Velázquez-Rodríguez, J.; Alanis-Tamez, M.; Moreno-Ibarra, M.-A.; Yáñez-Márquez, C. Mexican Axolotl Optimization: A Novel Bioinspired Heuristic. Mathematics 2021, 9, 781. [Google Scholar] [CrossRef]
  18. Mohammadi-Balani, A.; Nayeri, M.D.; Azar, A.; Taghizadeh-Yazdi, M. Golden eagle optimizer: A nature-inspired metaheuristic algorithm. Comput. Ind. Eng. 2021, 152, 107050. [Google Scholar] [CrossRef]
  19. Abualigah, L.; Shehab, M.; Alshinwan, M.; Mirjalili, S.; Elaziz, M.A. Ant Lion Optimizer: A Comprehensive Survey of Its Variants and Applications. Arch. Comput. Methods Eng. 2021, 28, 1397–1416. [Google Scholar] [CrossRef]
  20. Salehan, A.; Deldari, A. Corona virus optimization (CVO): A novel optimization algorithm inspired from the Corona virus pandemic. J. Supercomput. 2021. [Google Scholar] [CrossRef]
  21. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551. [Google Scholar] [CrossRef]
  22. Abualigah, L.; Diabat, A.; Mirjalili, S.; Elaziz, M.A.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  23. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar] [CrossRef]
  24. Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst. Appl. 2021, 177, 114864. [Google Scholar] [CrossRef]
  25. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667. [Google Scholar] [CrossRef]
  26. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  27. Singh, S.; Tiwari, A.; Agrawal, S. Differential Evolution Algorithm for Multimodal Optimization: A Short Survey. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2021; pp. 745–756. [Google Scholar]
  28. Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80, 8091–8126. [Google Scholar] [CrossRef] [PubMed]
  29. Liu, W.-L.; Yang, J.; Zhong, J.; Wang, S. Genetic programming with separability detection for symbolic regression. Complex Intell. Syst. 2021, 7, 1185–1194. [Google Scholar] [CrossRef]
  30. Sang, X.; Liu, X.; Zhang, Z.; Wang, L. Improved Biogeography-Based Optimization Algorithm by Hierarchical Tissue-Like P System with Triggering Ablation Rules. Math. Probl. Eng. 2021, 2021, 1–24. [Google Scholar] [CrossRef]
  31. Fu, Y.; Zhou, M.; Guo, X.; Qi, L.; Sedraoui, K. Multiverse Optimization Algorithm for Stochastic Biobjective Disassembly Sequence Planning Subject to Operation Failures. In Proceedings of the Transactions on System, Man, and Cybernetics: Systems, Virtual. 17–20 October 2021. [Google Scholar] [CrossRef]
  32. Kaur, A.; Kumar, Y. A new metaheuristic algorithm based on water wave optimization for data clustering. Evol. Intell. 2021, 1–25. [Google Scholar] [CrossRef]
  33. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  34. Kaveh, A. Advances in Metaheuristic Algorithms for Optimal Design of Structures; Springer: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  35. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  36. Abbasi, M.; Abbasi, E.; Mohammadi-Ivatloo, B. Single and multi-objective optimal power flow using a new differen-tial-based harmony search algorithm. J. Ambient. Intell. Humaniz. Comput. 2020, 12, 851–871. [Google Scholar] [CrossRef]
  37. Braik, M.; Ryalat, M.H.; Al-Zoubi, H. A novel meta-heuristic algorithm for solving numerical optimization problems: Ali Baba and the forty thieves. Neural Comput. Appl. 2021, 1–47. [Google Scholar] [CrossRef]
  38. Qi, Y.; Liu, J.; Yu, J. A Fireworks algorithm based path planning method for amphibious robot. In Proceedings of the 2021 IEEE International Conference on Real-time Computing and Robotics (RCAR), Xining, China, 15–19 July 2021. [Google Scholar] [CrossRef]
  39. Tan, Y.; Zhu, Y. Fireworks Algorithm for Optimization. In Lecture Notes in Computer Science; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2010; pp. 355–364. [Google Scholar]
  40. Osaba, E.; Yang, X.-S. Soccer-Inspired Metaheuristics: Systematic Review of Recent Research and Applications. Appl. Optim. Swarm Intell. 2021, 81–102. [Google Scholar] [CrossRef]
  41. Gabis, A.B.; Meraihi, Y.; Mirjalili, S.; Ramdane-Cherif, A. A comprehensive survey of sine cosine algorithm: Variants and applications. Artif. Intell. Rev. 2021, 54, 5469–5540. [Google Scholar] [CrossRef] [PubMed]
  42. Talatahari, S.; Azizi, M. Chaos Game Optimization: A Novel Metaheuristic Algorithm; Springer: Berlin/Heidelberg, Germany, 2021; Volume 54. [Google Scholar] [CrossRef]
  43. Sasmito, A.; Pratiwi, A.B. Stochastic fractal search algorithm in permutation flowshop scheduling problem. In Proceedings of the International Conference on Mathematics, Computational Sciences and Statistics 2020, Online. 29 September 2020; Volume 2329, p. 050003. [Google Scholar] [CrossRef]
  44. Karami, H.; Sanjari, M.J.; Gharehpetian, G.B. Hyper-Spherical Search (HSS) algorithm: A novel meta-heuristic algorithm to optimize nonlinear functions. Neural Comput. Appl. 2014, 25, 1455–1465. [Google Scholar] [CrossRef]
  45. Aguilar-Arguello, S.; Taylor, A.H.; Nelson, X.J. Jumping spiders attend to information from multiple modalities when preparing to jump. Anim. Behav. 2021, 171, 99–109. [Google Scholar] [CrossRef]
  46. Göttler, C. Locomotion of Spiders—What Robotics can Learn from Spiders and Vice Versa. Ph.D. Thesis, ETH Zurich, Zurich, Switzerland, 2021. [Google Scholar] [CrossRef]
  47. Brandt, E.E.; Sasiharan, Y.; Elias, D.O.; Mhatre, N. Jump takeoff in a small jumping spider. J. Comp. Physiol. A 2021, 207, 153–164. [Google Scholar] [CrossRef] [PubMed]
  48. GitHub-Mazhar-Ansari-Ardeh/BenchmarkFcns: A Collection of Mathematical Test Functions for Benchmarking Optimization Algorithms. Available online: https://github.com/mazhar-ansari-ardeh/BenchmarkFcns (accessed on 20 October 2021).
  49. Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization; Nanyang Technological University: Singapore, 2005. [Google Scholar]
  50. Kumar, A.; Wu, G.; Ali, M.Z.; Mallipeddi, R.; Suganthan, P.N.; Das, S. A test-suite of non-convex constrained optimization problems from the real-world and some baseline results. Swarm Evol. Comput. 2020, 56, 100693. [Google Scholar] [CrossRef]
  51. Vazquez, H.P.; Torres-Huerta, A.M.; Flores-Vela, A. Self-Adaptive Differential Evolution Hyper-Heuristic with Applications in Process Design. Comput. Sist. 2016, 20, 173–193. [Google Scholar] [CrossRef]
  52. Jain, N.; Parmar, G.; Gupta, R.; Khanam, I. Performance evaluation of GWO/PID approach in control of ball hoop system with different objective functions and perturbation. Cogent Eng. 2018, 5, 1465328. [Google Scholar] [CrossRef]
Figure 1. General structure of a bio-inspired algorithm.
Figure 2. Salticidae Jumping Spider. Photography by Thomas Shahan (published under a CC BY 2.0 license).
Figure 3. Representation of Jumping Spider’s persecution strategy.
Figure 4. Jumping on the prey [47].
Figure 5. Local and global search.
Figure 6. JSOA flowchart.
Figure 7. Convergence graph of population size analysis.
Figure 8. JSOA, COOT, CGO, MAO, and GEO Convergence graph population size analysis.
Figure 9. Convergence graph of the population size analysis as micro-algorithms.
Figure 10. Convergence curves.
Figure 11. Convergence graph of the Process Flow Sheeting Problem. The dotted line represents an infeasible solution shown by an algorithm.
Figure 12. Convergence graph of the Process Synthesis Problem. The dotted line represents an infeasible solution shown by an algorithm.
Figure 13. Convergence graph of the Optimal Design of an Industrial Refrigeration System. The dotted line represents an infeasible solution shown by an algorithm.
Figure 14. Convergence graph of Welded Beam Design. The dotted line represents an infeasible solution shown by an algorithm.
Figure 15. Schema of the ball and hoop system [14].
Figure 16. Comparative results of step response for the PID controller.
Figure 17. Single phase multilevel inverter staircase output waveform [15].
Figure 18. Fourier transform spectrum calculated from the JSOA set of angles described in Table 12.
Table 1. Parameter settings.
Algorithm | Parameter | Value
All algorithms | Population size (all problems) | 30
All algorithms | Maximum iterations (testbench functions) | 200
All algorithms | Number of replications (testbench functions) | 30
All algorithms | Maximum iterations (real-world problems) | 100
JSOA | Does not use additional parameters | -
CGO | Does not use additional parameters | -
COOT | Does not use additional parameters | -
GEO | Propensity to attack (pa) | [0.5, 2.0]
GEO | Propensity to cruise (pc) | [1.0, 0.5]
MAO | Crossover probability (cop) | 0.5
MAO | Damage probability (dp) | 0.5
MAO | Regeneration probability (rp) | 0.1
MAO | Tournament size (k) | 2
MAO | Lambda value (λ) | 0.5
AOA | Object number | 30
AOA | C1, C2 | 2, 6
AOA | C3, C4 (CEC and engineering problems) | 2, 0.5
ArOA | α | 5
ArOA | µ | 0.5
GBO | βmin, βmax | 0.2, 0.6
GBO | pr | 0.5
HGSO | Number of gases | 50
HGSO | Number of clusters | 5
HGSO | M1, M2 | 0.1, 0.2
HGSO | β, α, K | 1, 1, 1
HGSO | l1, l2, l3 (benchmark functions) | 5 × 10−3, 100, 1 × 10−2
HGSO | l1, l2, l3 (engineering problems) | 1, 10, 1
HHO | Number of Harris hawks | 30
HHO | E0 | varies from −1 to 1 (default)
HGS | Does not use additional parameters | -
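For reference, the experimental protocol summarized in Table 1 (30 search agents, 200 iterations, and 30 independent replications per testbench function) can be expressed as a short driver loop. The sketch below is illustrative only: run_algorithm and benchmark are hypothetical helpers, and the random search standing in for JSOA or any competitor is a placeholder, not code from the paper.

import numpy as np

POP_SIZE, MAX_ITERS, REPLICATIONS = 30, 200, 30   # values from Table 1

def run_algorithm(objective, lower, upper, dim, pop_size=POP_SIZE, max_iters=MAX_ITERS):
    # Placeholder random search standing in for JSOA or any competitor;
    # it only illustrates the evaluation budget (pop_size * max_iters calls).
    rng = np.random.default_rng()
    best = np.inf
    for _ in range(max_iters):
        pop = rng.uniform(lower, upper, size=(pop_size, dim))
        best = min(best, min(objective(x) for x in pop))
    return best

def benchmark(objective, lower, upper, dim):
    runs = [run_algorithm(objective, lower, upper, dim) for _ in range(REPLICATIONS)]
    return np.min(runs), np.mean(runs), np.std(runs)   # the Best / Ave / Std columns of Table 2

# Example: sphere function in 30 dimensions, as used for several testbench functions.
print(benchmark(lambda x: float(np.sum(x ** 2)), -100.0, 100.0, 30))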
Table 2. JSOA—Performance and comparison.
F | f_min | JSOA (Best, Ave, Std) | COOT (Best, Ave, Std) | CGO (Best, Ave, Std) | MAO (Best, Ave, Std) | GEO (Best, Ave, Std)
F104.1113 × 10−768.36 × 10−672.74 × 10−661.44 × 10−112.62 × 10−111.0 × 10−111.36 × 10−81.87 × 10−54.4 × 10−51.98 × 1022.77 × 1045.96 × 1044.30 × 10−17.05 × 10−12.7 × 10−1
F200.00000.00000.00001.32 × 10−83.20 × 10−81.61 × 10−70.00005.54 × 1029.16 × 10−27.98 × 103.04 × 1028.31 × 101.05 × 1001.11 × 105.2 × 10−2
F303.54 × 10−515.54 × 10−391.90 × 10−381.9910 × −54.13 × 10−42.07 × 10−32.32 × 10−112.80 × 10−13.60 × 10−14.92 × 1027.80 × 1021.14 × 1021.06 × 102.83 × 101.19 × 10
F400.00005.35 × 10−402.02 × 10−391.04 × 10−43.31 × 10−51.57 × 10−45.22 × 10−21.91 × 10−21.81 × 10−25.79 × 107.52 × 106.91053.19125.57141.3723
F504.448 × 10−454.448 × 10−451.390 × 10−391.85 × 10−55.55 × 10−13.041234.97 × 10−143.79 × 10−13.80 × 10−17.37 × 10263.59 × 10381.47 × 10391.10 × 1021.06 × 1075.80 × 107
F600.00005.87 × 10−2960.00003.14 × 10−441.05 × 10−455.7 × 10−451.77 × 10−244.21 × 10−232.1 × 10−222.78 × 1076.68 × 1087.15 × 1081.25 × 10−62.50 × 10−38.2 × 10−3
F709.12 × 10−793.61 × 10−661.46 × 10−653.43 × 10−119.72 × 10−95.30 × 10−81.93 × 10−44.02 × 10−55.58 × 10−53.91 × 109.40 × 102.57 × 101.53 × 10−23.63 × 10−22.3 × 10−2
F801.57 × 10−801.20 × 10−664.30 × 10−668.69 × 10−72.90 × 10−81.59 × 10−71.17 × 10−34.40 × 10−31.54 × 10−21.90 × 1034.94 × 1031.27 × 1031.86 × 107.48 × 103.87 × 10
F9−1−1.0000−1.00000.0000−1.0000−2.64 × 10−19.92 × 10−1−1.0000−1.00007.43 × 10−59.99 × 10−19.99 × 10−11.36 × 10−49.95 × 10−19.95 × 10−14.9 × 10−6
F1003.78 × 10−744.93 × 10−622.62 × 10−612.98 × 10−108.12 × 104.45 × 104.16 × 10−29.55 × 10−13.78 × 104.60 × 1076.59 × 1072.34 × 1082.55 × 105.42 × 101.80 × 10
F110−8.88 × 10−16−8.88 × 10−160.00002.40 × 10−61.82 × 10−49.85 × 10−4−8.88 × 10−162.28 × 10−22.14 × 10−21.78 × 101.91 × 107.81 × 10−11.69002.48004.0 × 10−1
F12−4.59−4.5900−4.59001.29 × 10−15−4.5900−4.59001.60 × 10−3−4.5900−4.42002.24 × 10−1−4.06004.09 × 10−12.82001.78 × 1022.89 × 1025.21 × 10
F130.90.90000.90004.52 × 10−160.90001.51001.34000.90009.00 × 10−13.39 × 10−48.77009.57009.94 × 10−11.96004.84001.8300
F1401.70 × 10−64.46 × 10−44.01 × 10−41.93 × 10−31.53 × 10−21.13 × 10−22.21 × 10−45.56 × 10−34.07 × 10−33.53 × 103.93 × 102.49 × 102.42 × 10−24.94 × 10−22.2 × 10−2
F1500.00000.00000.00001.05 × 10−51.74 × 10−49.41 × 10−42.06 × 10−34.84 × 10−35.07 × 10−32.89 × 1023.45 × 1023.68 × 102.98 × 107.61 × 103.62 × 10
F1605.33 × 10−213.32 × 10−17.32 × 10−12.88 × 103.17 × 109.66001.23 × 10−62.49 × 10−33.78 × 10−32.36 × 1047.18 × 1043.02 × 1043.14 × 103.99 × 101.05 × 10
F1702.17 × 10−453.40 × 10−384.32 × 10−389.99 × 10−27.56 × 10−24.32 × 10−25.21 × 10−98.54 × 10−28.10 × 10−21.43 × 101.83 × 102.52001.11001.60002.3 × 10−1
F1803.85 × 10−403.16 × 10−289.61 × 10−281.73 × 10−37.43 × 10−53.17 × 10−42.26 × 10−42.63 × 10−43.60 × 10−41.28 × 1075.60 × 10112.68 × 10124.83 × 10−41.07 × 10−22.5 × 10−2
F1900.00001.17 × 10−136.41 × 10−131.91 × 10−117.39 × 10−125.4 × 10−123.51 × 10−123.51 × 10−121.1 × 10−151.54 × 10−45.31 × 10−46.99 × 10−43.01 × 10−114.78 × 10−102.4 × 10−9
F20−1−1.0000−1.00000.0000−9.99 × 10−1−4.91 × 10−14.88 × 10−1−1.0000−9.55 × 10−14.19 × 10−23.31 × 10−94.35 × 10−93.71 × 10−96.08 × 10−136.60 × 10−132.1 × 10−13
F | AOA (Best, Ave, Std) | GBO (Best, Ave, Std) | HGSO (Best, Ave, Std)
F16.40 × 10−351.87 × 10−336.02 × 10−331.11 × 10−441.13 × 10−425.34 × 10−421.14 × 10−721.27 × 10−616.97 × 10−61
F20.00004.17 × 10−32.29 × 10−20.00000.00000.00000.00000.00000.0000
F35.84 × 10−178.53 × 10−172.64 × 10−163.96 × 10−213.35 × 10−178.92 × 10−175.04 × 10−411.53 × 10−338.09 × 10−33
F46.15 × 10−192.81 × 10−167.59 × 10−161.07 × 10−172.73 × 10−171.35 × 10−163.00 × 10−312.30 × 10−281.26 × 10−27
F51.02 × 10−214.53 × 10−171.72 × 10−161.59 × 10−202.07 × 10−184.19 × 10−186.95 × 10−415.22 × 10−352.67 × 10−34
F65.69 × 10−1491.88 × 10−1471.03 × 10−1462.21 × 10−1784.10 × 10−1740.00000.00000.00000.0000
F77.26 × 10−483.96 × 10−342.04 × 10−337.29 × 10−452.01 × 10−417.12 × 10−412.5010−771.46 × 10−617.99 × 10−61
F83.03 × 10−433.38 × 10−349.76 × 10−342.02 × 10−392.45 × 10−378.83 × 10−375.72 × 10−751.43 × 10−567.86 × 10−56
F90.00000.00000.0000−1.0000−1.00001.04 × 10−100.00000.00000.0000
F106.89 × 10−121.84 × 10−71.01 × 10−64.63 × 10−294.75 × 10−251.95 × 10−243.45 × 10−298.63 × 10−214.72 × 10−20
F112.00 × 101.93 × 103.6500−8.88 × 10−16−8.88 × 10−160.0000−8.88 × 10−16−4.14 × 10−161.23 × 10−15
F12−4.5100−4.55005.00 × 10−2−4.5900−4.59001.81 × 10−15−4.5900−4.51002.22 × 10−1
F136.03007.61001.62009.00 × 10−19.11 × 10−14.26 × 10−28.44008.47003.0400
F141.33 × 10−32.44 × 10−31.58 × 10−33.12 × 10−33.02 × 10−32.56 × 10−33.37 × 10−48.49 × 10−47.63 × 10−4
F151.70 × 1025.69003.12 × 100.00000.00000.00000.00000.00000.0000
F162.89 × 102.89 × 108.71 × 10−22.79 × 102.68 × 106.48 × 10−12.77 × 102.86 × 103.41 × 10−1
F172.00 × 10−11.10 × 10−13.05 × 10−21.49 × 10−66.76 × 10−32.53 × 10−29.99 × 10−21.01 × 10−19.53 × 10−4
F181.18 × 10−109.89 × 10−54.14 × 10−45.54 × 10−245.48 × 10−112.62 × 10−104.22 × 10−333.3810−281.52 × 10−27
F193.36 × 10−112.90 × 10−113.12 × 10−123.53 × 10−124.34 × 10−121.89 × 10−123.38 × 10−113.38 × 10−112.63 × 10−26
F204.77 × 10−116.09 × 10−112.70 × 10−11−1.0000−1.00001.08 × 10−36.21 × 10−104.07 × 10−102.31 × 10−10
F | HHO (Best, Ave, Std) | HGS (Best, Ave, Std) | ArOA (Best, Ave, Std)
F17.70 × 10−516.49 × 10−433.48 × 10−421.93 × 10−608.17 × 10−543.12 × 10−532.31 × 10−15.18001.8400
F20.00000.00000.00000.00000.00000.00001.15 × 10−24.90 × 10−12.90 × 10−1
F31.45 × 10−203.01 × 10−201.22 × 10−191.95 × 10−261.93 × 10−235.41 × 10−232.45 × 10−317.30 × 10−122.85 × 10−11
F46.63 × 10−241.12 × 10−204.14 × 10−203.17 × 10−202.47 × 10−131.35 × 10−125.81 × 10−203.28 × 10−21.84 × 10−2
F59.14 × 10−233.76 × 10−191.95 × 10−182.66 × 10−266.74 × 10−206.74 × 10−201.52 × 10−321.43 × 10−105.07 × 10−10
F63.42 × 10−2334.96 × 10−2120.00000.00007.22 × 10−883.96 × 10−870.00000.00000.0000
F71.93 × 10−513.26 × 10−411.19 × 10−409.00 × 10−637.46 × 10−254.08 × 10−240.00000.00000.0000
F82.49 × 10−575.81 × 10−421.51 × 10−411.57 × 10−683.45 × 10−271.89 × 10−260.00003.40 × 10−2050.0000
F9−1.0000−1.00000.0000−1.0000−6.01 × 10−18.12 × 10−1−1.0000−7.34 × 10−16.91 × 10−1
F101.43 × 10−221.09 × 10−104.45 × 10−102.26 × 10−135.50 × 109.98 × 102.11 × 1023.89 × 1029.14 × 10
F11−8.88 × 10−16−8.88 × 10−160.0000−8.88 × 10−162.90 × 10−151.56 × 10−14−8.88 × 10−16−7.70 × 10−166.49 × 10−16
F12−4.5900−4.59003.69 × 10−5−4.5900−4.56001.61 × 10−1−4.5900−4.59001.46 × 10−5
F139.00 × 10−19.00 × 10−14.52 × 10−169.00 × 10−19.00 × 10−16.49 × 10−169.00 × 10−19.00 × 10−14.52 × 10−16
F143.28 × 10−44.82 × 10−46.36 × 10−44.8610 × −45.78 × 10−39.35 × 10−33.24 × 10−61.42 × 10−41.42 × 10−4
F150.00000.00000.00000.00000.00000.00000.00000.00000.0000
F161.06 × 10−24.64 × 10−26.23 × 10−27.63 × 10−21.82 × 101.26 × 102.87 × 102.88 × 103.76 × 10−2
F174.48 × 10−231.68 × 10−207.96 × 10−205.28 × 10−281.43 × 10−24.34 × 10−29.99 × 10−29.99 × 10−21.88 × 10−7
F182.34 × 10−291.45 × 10−107.83 × 10−104.02 × 10−112.61 × 10−38.28 × 10−30.00000.00000.0000
F193.51 × 10−123.57 × 10−122.40 × 10−133.54 × 10−121.38 × 10−111.02 × 10−111.58 × 10−77.8610 × 10−51.55 × 10−4
F20−1.0000−1.00000.0000−1.0000−2.98 × 10−14.63 × 10−11.22 × 10−81.43 × 10−71.13 × 10−7
Table 3. The p-values of the Wilcoxon rank-sum test with 5% significance for JSOA vs. other algorithms (20 benchmark functions with 30 dimensions). NaN means “Not a Number” returned by the test.
F | CGO | COOT | GEO | MAO | AOA | GBO | HGSO | HHO | HGS | ArOA
F11.929 × 10−33.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−111.09 × 10−103.02 × 10−111.018 × 10−13.02 × 10−11
F21.306 × 10−71.21 × 10−121.21 × 10−121.21 × 10−124.193 × 10−2NaNNaNNaNNaN1.21 × 10−12
F31.791 × 10−23.01 × 10−113.01 × 10−113.01 × 10−113.01 × 10−113.01 × 10−112.708 × 10−23.01 × 10−111.253 × 10−23.01 × 10−11
F46.708 × 10−53.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.112 × 10−13.02 × 10−116.710 × 10−53.02 × 10−11
F52.195 × 10−23.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−116.669 × 10−33.02 × 10−113.261 × 10−13.02 × 10−11
F62.548 × 10−31.249 × 10−61.249 × 10−61.249 × 10−61.249 × 10−61.249 × 10−61.045 × 10−31.249 × 10−63.361 × 10−11.152 × 10−4
F73.947 × 10−43.02 × 10−113.02 × 10−113.020 × 10−113.02 × 10−113.02 × 10−117.043 × 10−73.02 × 10−117.952 × 10−11.21 × 10−12
F83.947 × 10−43.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−115.07 × 10−103.02 × 10−119.941 × 10−11.52 × 10−10
F91.09 × 10−123.90 × 10−131.09 × 10−121.21 × 10−121.68 × 10−14NaN5.63 × 10−13NaNNaN2.157 × 10−2
F102.72 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−116.710 × 10−53.02 × 10−11
F111.13 × 10−121.21 × 10−121.21 × 10−121.21 × 10−124.61 × 10−13NaN3.337 × 10−1NaN3.337 × 10−13.337 × 10−1
F123.96 × 10−112.33 × 10−112.36 × 10−122.95 × 10−126.02 × 10−111.608 × 10−15.42 × 10−111.124 × 10−91.608 × 10−11.019 × 10−9
F135.852 × 10−91.21 × 10−121.21 × 10−121.21 × 10−121.21 × 10−122.158 × 10−25.90 × 10−10NaNNaNNaN
F149.75 × 10−103.02 × 10−113.02 × 10−113.02 × 10−111.473 × 10−77.088 × 10−87.172 × 10−13.711 × 10−15.012 × 10−29.514 × 10−6
F153.453 × 10−71.21 × 10−121.21 × 10−121.21 × 10−124.193 × 10−2NaNNaNNaNNaNNaN
F161.413 × 10−13.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−116.100 × 10−12.491 × 10−63.02 × 10−11
F179.460 × 10−63.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.02 × 10−113.351 × 10−53.02 × 10−11
F182.691 × 10−23.02 × 10−113.02 × 10−113.02 × 10−113.69 × 10−113.02 × 10−111.091 × 10−58.89 × 10−103.02 × 10−111.21 × 10−12
F191.891 × 10−61.64 × 10−121.72 × 10−121.72 × 10−121.72 × 10−122.41 × 10−122.70 × 10−141.01 × 10−118.68 × 10−111.72 × 10−12
F201.701 × 10−81.21 × 10−121.21 × 10−121.21 × 10−121.21 × 10−121.701 × 10−81.21 × 10−12NaN3.337 × 10−11.21 × 10−12
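The p-values in Table 3 come from pairwise Wilcoxon rank-sum tests over the 30 replications per function. As a minimal sketch (assuming the 30 final best-fitness values of each algorithm are available as arrays; the placeholder samples below are synthetic, not the paper’s results), such a test could be run with SciPy as follows:

import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Placeholder samples: 30 final best-fitness values per algorithm on one test function.
jsoa_runs = rng.normal(loc=0.0, scale=1e-6, size=30)
rival_runs = rng.normal(loc=1e-2, scale=1e-3, size=30)

stat, p_value = ranksums(jsoa_runs, rival_runs)
verdict = "significant" if p_value < 0.05 else "not significant"
print(f"Wilcoxon rank-sum p = {p_value:.3e} ({verdict} at the 5% level)")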
Table 4. Error rates results (20 benchmark functions).
F | JSOA | COOT | CGO | MAO | GEO | AOA | GBO | HGSO | HHO | HGS | ArOA
F18.36 × 10−672.62 × 10−121.87 × 10−52.77 × 1047.05 × 10−11.87 × 10−331.13 × 10−421.27 × 10−616.49 × 10−438.17 × 10−545.1800
F20.00003.20 × 10−85.54 × 10−23.04 × 1021.11004.17 × 10−30.00000.00000.00000.00004.90 × 10−1
F35.54 × 10−394.13 × 10−42.80 × 10−17.80 × 1022.83 × 108.53 × 10−173.35 × 10−171.53 × 10−333.01 × 10−201.93 × 10−237.30 × 10−12
F45.35 × 10−403.31 × 10−51.91 × 10−27.52 × 105.57002.81 × 10−162.73 × 10−172.30 × 10−281.12 × 10−202.47 × 10−133.28 × 10−2
F57.90 × 10−405.55 × 10−13.79 × 10−13.59 × 10381.06 × 1074.53 × 10−172.07 × 10−185.22 × 10−353.76 × 10−196.74 × 10−201.43 × 10−10
F65.87 × 10−2961.05 × 10−454.2 × 10−236.68 × 1082.50 × 10−31.88 × 10−1474.1 × 10−1740.00004.96 × 10−2127.22 × 10−880.0000
F73.61 × 10−669.72 × 10−94.02 × 10−59.40 × 103.63 × 10−23.96 × 10−342.01 × 10−411.46 × 10−613.26 × 10−417.46 × 10−250.0000
F81.20 × 10−662.90 × 10−84.40 × 10−34.94 × 1037.48003.38 × 10−342.45 × 10−371.43 × 10−565.81 × 10−423.45 × 10−273.40 × 10−205
F90.00007.36 × 10−16.00 × 10−52.00002.00001.00000.00001.00000.00003.99 × 10−12.66 × 10−1
F104.93 × 10−628.12009.55 × 10−16.59 × 1075.42 × 101.84 × 10−74.75 × 10−258.63 × 10−211.09 × 10−105.50 × 103.89 × 102
F118.88 × 10−161.82 × 10−42.28 × 10−21.91 × 102.48001.93 × 108.88 × 10−164.14 × 10−168.88 × 10−162.90 × 10−157.70 × 10−16
F121.63 × 10−63.02 × 10−41.70 × 10−15.00002.94 × 1023.78 × 10−21.63 × 10−67.54 × 10−21.63 × 10−62.94 × 10−21.63 × 10−6
F130.00006.12 × 10−12.40 × 10−48.67003.94006.71001.07 × 10−27.57000.00000.00000.0000
F144.46 × 10−41.53 × 10−25.56 × 10−33.93 × 104.94 × 10−22.44 × 10−33.02 × 10−38.49 × 10−44.82 × 10−45.78 × 10−31.42 × 10−4
F150.00001.74 × 10−44.84 × 10−33.45 × 1027.61 × 105.69000.00000.00000.00000.00000.0000
F163.32 × 10−13.17 × 102.49 × 10−37.18 × 1043.99 × 102.89 × 102.68 × 102.86 × 104.64 × 10−21.82 × 102.88 × 10
F173.40 × 10−387.56 × 10−28.54 × 10−21.83 × 101.60001.10 × 10−16.76 × 10−31.01 × 10−11.68 × 10−201.43 × 10−29.99 × 10−2
F183.16 × 10−287.43 × 10−52.63 × 10−45.60 × 10111.07 × 10−29.89 × 10−55.48 × 10−113.38 × 10−281.45 × 10−102.61 × 10−30.0000
F191.17 × 10−137.39 × 10−123.5 × 10−125.31 × 10−44.78 × 10−102.90 × 10−114.34 × 10−123.38 × 10−113.57 × 10−121.38 × 10−117.86 × 10−5
F200.00005.09 × 10−14.49 × 10−21.00001.00001.00002.00 × 10−41.00000.00007.02 × 10−11.0000
Table 5. Rank of algorithms using MAE.
Algorithm | MAE | Rank
HHO | 2.35 × 10−3 | 1
JSOA | 1.66 × 10−2 | 2
CGO | 1.01 × 10−1 | 3
GBO | 1.3400 | 4
HGSO | 1.9200 | 5
COOT | 2.1200 | 6
AOA | 3.1400 | 7
HGS | 3.7200 | 8
ArOA | 2.12 × 10 | 9
GEO | 5.31 × 105 | 10
MAO | 1.80 × 1037 | 11
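The ranking in Table 5 follows from averaging each algorithm’s absolute error rates of Table 4 over the 20 functions and sorting by the resulting MAE. A minimal sketch, where the constant placeholder arrays are chosen only so their means match three entries of the MAE column and do not reproduce the per-function values of Table 4:

import numpy as np

error_rates = {               # algorithm -> 20 per-function absolute errors (placeholders)
    "HHO":  np.full(20, 2.35e-3),
    "JSOA": np.full(20, 1.66e-2),
    "CGO":  np.full(20, 1.01e-1),
}

mae = {alg: float(np.mean(np.abs(err))) for alg, err in error_rates.items()}
for rank, (alg, value) in enumerate(sorted(mae.items(), key=lambda kv: kv[1]), start=1):
    print(f"{rank:2d}. {alg:5s} MAE = {value:.3e}")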
Table 6. Process Flow Sheeting problem—Comparison Results.
Algorithm | x1 | x2 | x3 | f_min
JSOA0.94194−2.1000011.076554818
MAO0.22351−1.033101.1822336005 (infeasible *)
CGO0.20000−1.000001.2500000000
COOT0.20000−1.000001.2500000000
GEO0.36341−1.551010.1932841405 (infeasible *)
HHO0.20000−1.000001.2500000000
* The solution does not satisfy one or more constraints.
Table 7. Comparison results for the Process Synthesis problem.
Algorithm | x1 | x2 | x3 | x4 | x5 | x6 | x7 | f_min
JSOA0.194681.28061.955910012.92495120667677
MAO7.412982.0895513.617101115.69266 × 1017 (infeasible *)
CGO11.640032.778747.092611013.9149 × 1022 (infeasible *)
COOT0.142140.788821.907711013.00121860556942
GEO96.557247.327786.253810103.6795 × 1043 (infeasible *)
HHO0.389221.12882.01800004.7224 (infeasible *)
* The solution does not satisfy one or more constraints.
Table 8. Comparison results for the Optimal Design of an Industrial Refrigeration System.
Algorithm | x1 | x2 | x3 | x4 | x5 | x6 | x7 | x8 | x9 | x10 | x11 | x12 | x13 | x14 | f_min
JSOA0.0010.00100030.0010.0010.0010.0011.581.52414.95992.14210.0010.0010.0075080.0897840.034988
MAO3.4464.03694.00251.52230.291350.409691.44764.613.49634.78131.25344.53394.0073.19368.07 × 106 (infeasible *)
* CGO52.889852.33833.319253.209853.2024553.876453.49826.19 × 107 (infeasible *)
COOT0.0010.0010.0010.0014.76354.8392.03531.66353.37894.85192.08430.587340.113291.123812.659065
GEO3.05741.91802.22073.36531.93151.85282.17553.74801.82492.32420.00102.08663.17552.19605.94 × 106 (infeasible *)
HHO0.0010.0010.0010.212051.87242.62672.7952.79983.09034.99014.99084.99371.93682.75197.55 × 10 (infeasible *)
* The solution does not satisfy one or more constraints.
Table 9. Comparison results for Welded Beam Design.
Algorithm | x1 | x2 | x3 | x4 | f_min
JSOA0.1988323122219733.3373652606665609.1920242092555300.1988323123316521.67021774898897
MAO1.03652.07566.19191.16498.04184550085786
CGO0.51316.85152.53710.221072.5553 (infeasible *)
COOT0.198083.3849.17310.199771.67928562217482
GEO0.930083.68765.68680.937988.06300000000000
HHO0.341672.3077.01380.341532.17679623002226 (infeasible *)
* The solution does not satisfy one or more constraints.
Table 10. Comparison results of optimized PID parameters.
Algorithm | Kp | Ki | Kd
JSOA3.67290.00584.9697
MAO3.99160.592424.4857
CGO2.53030.352415
COOT3.84960.572925
GEO3.67990.309452.2922
HHO3.65040.014.9338
Table 11. Comparison results of transient response parameters.
Algorithm | Rise Time (s) | Settling Time (s) | Peak Time (s) | Peak Overshoot (%)
JSOA2.00593.29154.0270
MAO1.748415.33164.920922.1115
CGO2.472323.96627.279352.2769
COOT1.745516.14945.65917.9202
GEO2.086322.92625.312427.432
HHO2.02053.31814.07930
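The transient metrics of Table 11 are read off the closed-loop step response obtained with the PID gains of Table 10. The sketch below shows one way such metrics could be extracted with SciPy; the second-order plant is a placeholder, not the ball-and-hoop model of [14], so the printed numbers will not match the table.

import numpy as np
from scipy import signal

Kp, Ki, Kd = 3.6729, 0.0058, 4.9697               # JSOA-tuned gains from Table 10
plant_num, plant_den = [1.0], [1.0, 2.0, 0.0]     # placeholder plant G(s), not the model of [14]

# Unity-feedback closed loop with C(s) = (Kd s^2 + Kp s + Ki)/s.
num = np.polymul([Kd, Kp, Ki], plant_num)
den = np.polyadd(np.polymul([1.0, 0.0], plant_den), num)
t, y = signal.step(signal.TransferFunction(num, den), T=np.linspace(0.0, 20.0, 5000))

y_final = num[-1] / den[-1]                       # DC gain of the closed loop
rise = t[np.argmax(y >= 0.9 * y_final)] - t[np.argmax(y >= 0.1 * y_final)]
outside = np.where(np.abs(y - y_final) > 0.02 * abs(y_final))[0]
settling = t[outside[-1]] if outside.size else 0.0
overshoot = max(0.0, 100.0 * (y.max() - y_final) / abs(y_final))
print(f"rise = {rise:.3f} s, settling = {settling:.3f} s, "
      f"peak time = {t[np.argmax(y)]:.3f} s, overshoot = {overshoot:.2f} %")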
Table 12. Comparison results for the selective harmonic elimination problem.
Algorithm | α1 | α2 | α3 | α4 | α5 | THD (%) | f_min
JSOA7.8619.3629.6547.6863.205.013.49 × 10−31
BWOA7.8619.3729.6547.6863.215.011.29 × 10−28
WOA4.1920.2922.1241.9761.156.93.93 × 10−2
MGWOA0.4914.7425.6140.5789.165.7116.04 × 10−2
MAO9.6722.3733.0451.9164.635.263.64 × 10−1
CGO13.2526.3745.9346.0587.2213.999.76 × 10−2
COOT7.9719.4629.7947.7463.205.043.43 × 10−4
GEO8.6210.6038.7676.5983.1714.834.64 × 10−1
HHO8.1619.5030.1548.2163.355.162.69 × 10−4
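The THD values and the objective f_min of Table 12 are functions of the five switching angles through the Fourier series of the staircase waveform in Figure 17. A minimal sketch using the standard per-unit harmonic expression for a cascaded multilevel inverter; the exact harmonic range and objective used in the paper may differ, so this is illustrative only.

import numpy as np

alphas = np.deg2rad([7.86, 19.36, 29.65, 47.68, 63.20])   # JSOA angles from Table 12

def harmonic(n, angles):
    # Per-unit amplitude of the n-th odd harmonic of the staircase output
    # (standard Fourier-series expression for a cascaded multilevel inverter).
    return (4.0 / (n * np.pi)) * np.sum(np.cos(n * angles))

h1 = harmonic(1, alphas)
odd_orders = np.arange(3, 50, 2)                           # odd harmonics up to the 49th (assumed range)
thd = np.sqrt(sum(harmonic(n, alphas) ** 2 for n in odd_orders)) / abs(h1)
print(f"fundamental = {h1:.4f} p.u., THD = {100.0 * thd:.2f} %")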