Biomimetics
  • Editor’s Choice
  • Article
  • Open Access

20 November 2022

Serval Optimization Algorithm: A New Bio-Inspired Approach for Solving Optimization Problems

Department of Mathematics, Faculty of Science, University of Hradec Králové, 500 03 Hradec Králové, Czech Republic
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Bio-Inspired Design and Optimisation of Engineering Systems

Abstract

This article introduces a new metaheuristic algorithm called the Serval Optimization Algorithm (SOA), which imitates the natural behavior of the serval. The fundamental inspiration of SOA is the serval’s hunting strategy, in which it attacks the selected prey and then pursues it in a chasing process. The steps of SOA implementation are mathematically modeled in two phases, exploration and exploitation. The capability of SOA in solving optimization problems is evaluated on thirty-nine standard benchmark functions from the CEC 2017 and CEC 2019 test suites. For further evaluation, the proposed SOA approach is compared with twelve well-known metaheuristic algorithms. The optimization results show that, by appropriately balancing exploration and exploitation, the proposed SOA approach provides better solutions for most of the mentioned benchmark functions and has superior performance compared to the competing algorithms. SOA implementation on the CEC 2011 test suite and four engineering design challenges shows the high efficiency of the proposed approach in handling real-world optimization applications.

1. Introduction

An optimization problem is a problem that has several feasible solutions. Optimization is the process of searching for the best solution among the possible solutions to a problem []. Optimization is used in various science, engineering, technology, and real-world applications []. Finding the global optimum can yield multiple benefits, such as reduced costs, maximized profits, and improved equipment efficiency. For this reason, finding suitable and effective solutions for optimization applications is a fundamental challenge for scientists. From a mathematical point of view, an optimization problem is characterized by three main parts: decision variables, objectives, and constraints [].
Problem-solving techniques for optimization tasks are classified into two groups: deterministic and stochastic approaches []. Deterministic methods, in the two categories of gradient-based and non-gradient-based techniques, are effective in handling optimization problems that are linear, convex, differentiable, and have a continuous search space []. Among the deterministic methods of solving optimization problems are approaches such as dynamic programming, Newton methods, linear programming, gradient methods, quadratic programming, and simplex methods []. However, as optimization problems become more complex and the number of decision variables increases, deterministic approaches lose their effectiveness for many real-world applications. Features such as complexity, non-linearity, non-convexity, non-differentiability, discreteness, high-dimensional objective functions, and discrete search spaces characterize many modern optimization problems and real-world applications. Such characteristics degrade the efficiency of deterministic approaches and cause them to get stuck in local optima []. These difficulties have led researchers to develop stochastic approaches, in particular metaheuristic algorithms, to deal with optimization problems. Advantages such as conceptual simplicity, easy implementation, independence from the type of problem, no need for derivative information, and efficiency in non-linear, non-convex, NP-hard, complex, high-dimensional, and unknown search spaces have made metaheuristic algorithms popular and widespread [].
The process of solution finding in metaheuristic algorithms starts with the random generation of a number of feasible solutions in the search space. Then, these solutions are improved during iterations of the algorithm based on different steps of updating the metaheuristic algorithms. Finally, after the full implementation of the algorithm, the best feasible solution found during the iterations is presented as a solution to the problem []. Metaheuristic algorithms perform the search process in the problem-solving space at both global and local levels. Global search with the concept of exploration ability leads to scanning different regions of the problem-solving space and avoiding getting stuck in local optima. Local search with the concept of exploitation ability leads to finding better solutions in promising areas of search space. In addition to the appropriate quality in exploration and exploitation, the primary key to the success of metaheuristic algorithms in solving optimization problems is to create a balance between exploration and exploitation during algorithm iterations [].
The random nature of the search in metaheuristic algorithms means that there is no guarantee that the solutions obtained from these methods are the global optimum of the problem. However, these solutions are acceptable as quasi-optimal solutions. A metaheuristic algorithm that provides quasi-optimal solutions closer to the global optimum is superior in competition with other metaheuristic algorithms. The desire of scientists to achieve better and more effective solutions for optimization applications has led to the introduction of numerous metaheuristic algorithms [].
Given that numerous metaheuristic algorithms have already been developed, the main research question is: does the world still need newer metaheuristic algorithms? The No Free Lunch (NFL) theorem [] answers this question: the effective performance of a metaheuristic algorithm on one set of optimization problems does not guarantee the same performance in all optimization applications. According to the NFL theorem, no a priori assumption can be made about the efficiency or inefficiency of an algorithm in handling a given optimization problem. Therefore, it can only be claimed that a particular algorithm is the best optimizer for some optimization tasks, not for all of them. The NFL theorem thus encourages and motivates researchers to provide more effective solutions for optimization tasks by designing new metaheuristic algorithms. This theorem has also inspired the authors of this article to develop a new metaheuristic algorithm to deal with optimization problems.
Considering the NFL theorem, the randomness of the search process in metaheuristic algorithms, the lack of a guarantee of reaching the global optimum, and the inability of any single metaheuristic algorithm to provide similar performance in all optimization applications, there is always a need for newer metaheuristic algorithms that provide more effective solutions for optimization problems. In this regard, the goal of this paper is to introduce a new metaheuristic algorithm that gives researchers an effective problem-solving tool for achieving better solutions to optimization tasks. The proposed metaheuristic algorithm is developed by simulating the natural behavior of the serval during its hunting and chasing process. The search process is modeled in two phases: (i) exploration, which increases the global search power of the algorithm in order to identify the main optimal region and prevent getting stuck in local optima, and (ii) exploitation, which increases the local search power in order to achieve better solutions. Through these two phases, the proposed method is expected to obtain solutions that are closer to the global optimum.
This paper’s novelty and innovation lie in the design of a new optimizer called the Serval Optimization Algorithm (SOA) to deal with optimization tasks in different sciences. The main contributions of this paper are listed as follows:
  • SOA is a nature-inspired approach that simulates natural serval behaviors.
  • The essential inspiration of SOA is the serval strategy when hunting in three stages: selection, attack, and chase.
  • The mathematical model of SOA is presented in two phases: exploration and exploitation.
  • SOA capability is benchmarked in optimizing the CEC 2017 and CEC 2019 test suites.
  • The performance of SOA in handling real-world applications is evaluated on the CEC 2011 test suite and four engineering design challenges.
  • The performance of the proposed SOA approach is challenged in comparison with twelve well-known metaheuristic algorithms.
The article is organized as follows: a literature review is presented in Section 2. The proposed SOA approach is introduced and modeled in Section 3. Simulation studies and results are presented in Section 4. The effectiveness of SOA in handling real-world applications is challenged in Section 5. Conclusions and suggestions for future research are provided in Section 6.

2. Literature Review

Metaheuristic algorithms have been developed based on the simulation of various natural phenomena, natural behaviors of animals, birds, aquatic animals, insects, and other living creatures in the wild, physical laws and phenomena, biological sciences, genetics, human behaviors and interactions, rules of games, and other evolutionary phenomena. Therefore, based on the main idea used in the design, metaheuristic algorithms are classified into five groups: swarm-based, evolutionary-based, physics-based, game-based, and human-based.
Swarm-based algorithms are developed by drawing inspiration from the swarming behavior of living organisms such as animals, birds, insects, and aquatic creatures in nature. Among the most famous algorithms of this group are Particle Swarm Optimization (PSO) [], Artificial Bee Colony (ABC) [], and Ant Colony Optimization (ACO) []. PSO is developed based on the simulation of the movement of flocks of birds or schools of fish searching for food. ABC is inspired by the activities of a honey bee colony in obtaining food resources. ACO is designed based on modeling the ability of ants to find the optimal route between the nest and a food source. Searching for food resources and hunting strategies are natural behaviors among animals that have been employed in the design of numerous metaheuristic algorithms, such as the Coati Optimization Algorithm (COA) [], Reptile Search Algorithm (RSA) [], White Shark Optimizer (WSO) [], Honey Badger Algorithm (HBA) [], Golden Jackal Optimization (GJO) [], African Vultures Optimization Algorithm (AVOA) [], Grey Wolf Optimizer (GWO) [], Whale Optimization Algorithm (WOA) [], Marine Predator Algorithm (MPA) [], and Tunicate Swarm Algorithm (TSA) [].
Evolutionary-based algorithms are introduced with inspiration from biological and genetics sciences, random operators, concepts of natural selection, and survival of the fittest. Genetic Algorithm (GA) [] and Differential Evolution (DE) [] are among the most well-known and widely used metaheuristic algorithms that are designed based on reproduction simulation, Darwin’s theory of evolution, and stochastic operators such as selection, crossover, and mutation.
Physics-based algorithms are designed with inspiration from phenomena, concepts, and laws in physics. The Simulated Annealing (SA) [] algorithm is one of the most famous physics-based approaches. The modeling of the metal annealing phenomenon in metallurgy has been the main idea in its design. Physical forces are the origin of the creation of algorithms such as the Spring Search Algorithm (SSA) [] based on spring tensile force, the Gravitational Search Algorithm (GSA) [] based on gravitational attraction force, and the Momentum Search Algorithm (MSA) [] based on momentum force. The phenomenon of physical changes in water has been the main idea in Water Cycle Algorithm (WCA) design []. Concepts of cosmology have been the origin of Black Hole Algorithm (BHA) design []. Some of the most popular physics-based methods are: Equilibrium Optimizer (EO) [], Electro-Magnetism Optimization (EMO) [], Multi-Verse Optimizer (MVO) [], Archimedes Optimization Algorithm (AOA) [], Thermal Exchange Optimization (TEO) [], and Lichtenberg Algorithm (LA) [].
Game-based algorithms are developed with inspiration from various individual and group games, the behavior of players, coaches, referees, and other people influencing the game. Football Game Based Optimization (FGBO) [] and Volleyball Premier League (VPL) [] are two game-based approaches that are designed based on the modeling of holding league competitions. The common aspect of many games is the effort of players to earn points, which is the origin of the design of algorithms, including Darts Game Optimizer (DGO) [], Puzzle Optimization Algorithm (POA) [], Hide Object Game Optimizer (HOGO) [], Archery Algorithm (AA) [], and Tug of War Optimization (TWO) [].
Human-based algorithms are introduced by taking inspiration from human behaviors, interactions, and thoughts. One of this group’s most widely used algorithms is Teaching-Learning Based Optimization (TLBO) [], which is introduced based on the modeling of human behaviors between students and teachers in the classroom. Teammates’ efforts to achieve team goals have been the design idea of the Teamwork Optimization Algorithm (TOA) []. The therapeutic activities of doctors in treating patients have inspired the design of Doctor and Patient Optimization (DPO) []. Some of the other popular human-based methods are: Ali Baba and the Forty Thieves (AFT) [], Coronavirus Herd Immunity Optimizer (CHIO) [], War Strategy Optimization (WSO) [], and Gaining Sharing Knowledge based Algorithm (GSK) [].
To the best of the authors’ knowledge from the literature review, no metaheuristic algorithm has been designed so far based on the simulation of the natural behaviors of servals. At the same time, the serval’s strategy during hunting and capturing prey is an intelligent process with the potential to inspire an optimizer. In order to address this research gap, in this paper, the natural hunting behavior of servals is employed in the design of a new bio-inspired metaheuristic algorithm, which is introduced and modeled in the next section.

3. Serval Optimization Algorithm

This section is dedicated to the introduction and mathematical modeling of the proposed Serval Optimization Algorithm (SOA) approach.

3.1. Inspiration of SOA

The serval is a skilled predator that hunts its prey in three stages. First, using its strong sense of hearing, it identifies the position of the prey and observes it for up to 15 min without moving. Then, in the second stage, it moves towards the prey, jumps up to a height of 4 m into the air with all four feet off the ground, and attacks the prey with its front paws. Finally, in the third stage, in a chasing process of running and jumping to catch the fleeing prey, the serval kills the prey and starts eating it [].
Serval’s strategy during hunting is one of the most characteristic natural behaviors of this animal. This strategy is an intelligent process that can inspire the design of a new metaheuristic algorithm. Modeling the three-stage serval strategy during hunting is employed in SOA design, which is discussed below.

3.2. Algorithm Initialization

The proposed SOA approach is a population-based optimizer that can provide suitable solutions for optimization problems using the search power of its search agents. Servals searching for prey in nature behave similarly to search agents seeking the optimal solution. For this reason, from a mathematical point of view, the servals form the SOA population that searches for the optimal solution in the search space. Each serval is therefore a candidate solution whose position in the search space determines the values of the decision variables. Mathematically, each serval is a vector, and together the servals form the SOA population matrix, which is represented in Equation (1). The initial positions of the servals in the search space are generated randomly at the beginning of the algorithm using Equation (2).
$$X = \begin{bmatrix} X_1 \\ \vdots \\ X_i \\ \vdots \\ X_N \end{bmatrix}_{N \times d} = \begin{bmatrix} x_{1,1} & \cdots & x_{1,j} & \cdots & x_{1,d} \\ \vdots & & \vdots & & \vdots \\ x_{i,1} & \cdots & x_{i,j} & \cdots & x_{i,d} \\ \vdots & & \vdots & & \vdots \\ x_{N,1} & \cdots & x_{N,j} & \cdots & x_{N,d} \end{bmatrix}_{N \times d}, \qquad (1)$$
$$x_{i,j} = lb_j + r_{i,j} \cdot (ub_j - lb_j), \quad i = 1, 2, \ldots, N \ \text{ and } \ j = 1, 2, \ldots, d, \qquad (2)$$
where $X$ denotes the population matrix of serval locations, $X_i$ is the $i$th serval (candidate solution), $x_{i,j}$ is its $j$th dimension in the search space (decision variable), $N$ denotes the number of servals, $d$ is the number of decision variables, $r_{i,j}$ are random numbers in the interval $[0,1]$, and $lb_j$ and $ub_j$ are the lower and upper bounds of the $j$th decision variable, respectively.
Since each serval is a candidate solution for the problem, the objective function of the problem can be evaluated based on the proposed values of each serval for the decision variables. Then, according to Equation (3), a vector can represent the values of the problem’s objective function.
$$F = \begin{bmatrix} F_1 \\ \vdots \\ F_i \\ \vdots \\ F_N \end{bmatrix}_{N \times 1} = \begin{bmatrix} F(X_1) \\ \vdots \\ F(X_i) \\ \vdots \\ F(X_N) \end{bmatrix}_{N \times 1}, \qquad (3)$$
where $F$ denotes the vector of objective function values and $F_i$ denotes the objective function value obtained from the $i$th serval.
Among the calculated values of the objective function, the best value identifies the best candidate solution, and the corresponding member is determined as the best member of the population. Since the positions of all population members are updated in each SOA iteration, the best member must also be updated in each iteration.
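As a minimal illustration of Equations (1)–(3), the following Python sketch generates a random serval population within the bounds and evaluates the objective vector; the helper name soa_initialize, the use of NumPy, and the sphere test function are choices made for this example rather than part of the original algorithm description.

```python
import numpy as np

def soa_initialize(objective, lb, ub, N, rng=None):
    """Randomly place N servals in the search space (Eq. (2)) and
    evaluate the objective vector F (Eq. (3))."""
    if rng is None:
        rng = np.random.default_rng()
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    X = lb + rng.random((N, lb.size)) * (ub - lb)   # population matrix X, Eq. (1)
    F = np.array([objective(x) for x in X])         # objective vector F, Eq. (3)
    return X, F

# Example on a simple sphere function (placeholder test problem).
sphere = lambda x: float(np.sum(x ** 2))
X, F = soa_initialize(sphere, lb=[-100] * 5, ub=[100] * 5, N=20)
print("best initial member:", X[np.argmin(F)], "objective:", F.min())
```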

3.3. Mathematical Modelling of SOA

The process of updating SOA population members in the search space has two phases based on simulating the serval hunting strategy in nature. These phases are intended to model exploration in global search and exploitation in local search in SOA design.

3.3.1. Phase 1: Prey Selection and Attacking (Exploration)

The serval is an efficient predator that uses its strong sense of hearing to identify the location of its prey and then attacks it. In the first phase of SOA, the positions of servals are updated based on the simulation of these two strategies. This update causes large changes in the positions of servals and leads to an extensive scan of the search space. The purpose of this phase of SOA is to increase the exploration power of SOA in the global search and to identify the main optimal region.
In the SOA design, the position of the population’s best member is considered the prey position. First, the new position for the serval is calculated using Equation (4) to model the serval’s attack on the prey. Then, if this new position improves the value of the objective function, it replaces the previous serval position according to Equation (5).
$$x_{i,j}^{P1} = x_{i,j} + r_{i,j} \cdot (P_j - I_{i,j} \cdot x_{i,j}), \quad i = 1, 2, \ldots, N \ \text{ and } \ j = 1, 2, \ldots, d, \qquad (4)$$
$$X_i = \begin{cases} X_i^{P1}, & F_i^{P1} < F_i, \\ X_i, & \text{else}, \end{cases} \qquad (5)$$
where $X_i^{P1}$ denotes the new position of the $i$th serval based on the first phase of SOA, $x_{i,j}^{P1}$ is its $j$th dimension, $F_i^{P1}$ is its objective function value, $r_{i,j}$ are random numbers in the interval $[0,1]$, $P$ denotes the prey location (the best member of the population), $P_j$ is its $j$th dimension, $I_{i,j}$ are numbers randomly selected from the set $\{1, 2\}$, $N$ is the total number of servals in the population, and $d$ is the number of decision variables.
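Purely as an illustration of Equations (4) and (5) (the vectorized form, the helper name soa_phase1, and the NumPy random generator are this example's assumptions, not the authors' reference code), the exploration update can be sketched as follows.

```python
import numpy as np

def soa_phase1(X, F, objective, prey, rng=None):
    """Exploration step, Eqs. (4)-(5): move each serval toward the prey
    (best population member) and accept a move only if it improves F."""
    if rng is None:
        rng = np.random.default_rng()
    N, d = X.shape
    r = rng.random((N, d))                    # r_ij drawn uniformly from [0, 1]
    I = rng.integers(1, 3, size=(N, d))       # I_ij drawn from the set {1, 2}
    X_new = X + r * (prey - I * X)            # Eq. (4)
    F_new = np.array([objective(x) for x in X_new])
    improved = F_new < F                      # greedy acceptance, Eq. (5)
    X[improved], F[improved] = X_new[improved], F_new[improved]
    return X, F
```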

3.3.2. Phase 2: Chase Process (Exploitation)

After attacking the prey, the serval tries to stop the fleeing prey by leaping in a chase process, then kills it and feeds on it. In the second phase of SOA, this strategy is employed to update the positions of the SOA population. The simulation of the chase process causes small changes in the positions of the servals in the search space. The purpose of this phase is to increase the exploitation power of SOA in the local search and to find better solutions. In order to mathematically model the chasing process between the serval and the prey, a new random position near each serval is calculated using Equation (6). This new position, provided that it improves the value of the objective function, replaces the previous position of the corresponding serval according to Equation (7).
$$x_{i,j}^{P2} = x_{i,j} + r_{i,j} \cdot \frac{ub_j - lb_j}{t}, \quad i = 1, 2, \ldots, N, \quad j = 1, 2, \ldots, d, \ \text{ and } \ t = 1, 2, \ldots, T, \qquad (6)$$
$$X_i = \begin{cases} X_i^{P2}, & F_i^{P2} < F_i, \\ X_i, & \text{else}, \end{cases} \qquad (7)$$
where $X_i^{P2}$ represents the new position of the $i$th serval based on the second phase of SOA, $x_{i,j}^{P2}$ is its $j$th dimension, $F_i^{P2}$ denotes its objective function value, $t$ is the iteration counter of the algorithm, and $T$ is the total number of algorithm iterations.
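A matching sketch for Equations (6) and (7) is given below; as before, the helper name and vectorized form are illustrative, and bound handling is omitted because the paper does not specify it.

```python
import numpy as np

def soa_phase2(X, F, objective, lb, ub, t, rng=None):
    """Exploitation step, Eqs. (6)-(7): sample a nearby position whose step
    size shrinks as the iteration counter t grows, keeping only improvements."""
    if rng is None:
        rng = np.random.default_rng()
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    N, d = X.shape
    r = rng.random((N, d))                    # r_ij drawn uniformly from [0, 1]
    X_new = X + r * (ub - lb) / t             # Eq. (6); step shrinks with t
    F_new = np.array([objective(x) for x in X_new])
    improved = F_new < F                      # greedy acceptance, Eq. (7)
    X[improved], F[improved] = X_new[improved], F_new[improved]
    return X, F
```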

3.4. Repetition Process, Pseudocode, and Flowchart of SOA

By updating all servals based on the first and second phases of SOA, the first iteration of the algorithm is completed. Then, based on the new positions of the servals and the new values obtained for the objective function, the algorithm enters the next iteration. The operation of updating the positions of servals is repeated until the last iteration of the algorithm based on Equations (4)–(7). After the complete implementation of SOA, the best candidate solution obtained during the algorithm’s execution is introduced as the solution to the problem. The SOA implementation process is presented in the form of a flowchart in Figure 1, and its pseudo code is presented in Algorithm 1.
Figure 1. Flowchart of the proposed SOA.
Algorithm 1 Pseudocode of the SOA.
Start SOA.
1. Input problem information: variables, the objective function, and constraints.
2. Set the population size (N) and the total number of iterations (T).
3. Generate the initial population matrix at random.
4. Evaluate the objective function.
5. For t = 1 to T
6.   For i = 1 to N
7.   Phase 1: Prey selection and attacking (exploration)
8.     Update the best member of the population as the prey location.
9.     Calculate the new position of the ith SOA member based on the attack simulation using Equation (4): $x_{i,j}^{P1} \leftarrow x_{i,j} + r_{i,j} \cdot (P_j - I_{i,j} \cdot x_{i,j})$.
10.     Update the ith SOA member using Equation (5): $X_i \leftarrow X_i^{P1}$ if $F_i^{P1} < F_i$, else keep $X_i$.
11.   Phase 2: Chase process (exploitation)
12.     Calculate the new position of the ith SOA member based on the chase simulation using Equation (6): $x_{i,j}^{P2} \leftarrow x_{i,j} + r_{i,j} \cdot (ub_j - lb_j)/t$.
13.     Update the ith SOA member using Equation (7): $X_i \leftarrow X_i^{P2}$ if $F_i^{P2} < F_i$, else keep $X_i$.
14.   end
15.   Save the best candidate solution so far.
16. end.
17. Output the best quasi-optimal solution obtained with the SOA.
End SOA.
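For completeness, the pseudocode of Algorithm 1 can be translated into a standalone Python sketch such as the one below. It is an illustrative interpretation of the description above, not the authors' reference implementation; the function name, the NumPy-based random generator, and the sphere test problem are assumptions made for this example.

```python
import numpy as np

def serval_optimization(objective, lb, ub, N=30, T=500, seed=None):
    """Standalone sketch of Algorithm 1 (illustrative, not the authors' code).
    objective: function mapping a d-dimensional vector to a scalar (minimized).
    lb, ub: per-dimension bounds; N: population size; T: number of iterations."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    d = lb.size

    # Steps 3-4: random initialization (Eq. (2)) and objective evaluation (Eq. (3)).
    X = lb + rng.random((N, d)) * (ub - lb)
    F = np.array([objective(x) for x in X])
    best_i = int(np.argmin(F))
    best_x, best_f = X[best_i].copy(), float(F[best_i])

    for t in range(1, T + 1):
        prey = X[np.argmin(F)].copy()          # step 8: best member as prey location

        for i in range(N):
            # Phase 1: prey selection and attacking (exploration), Eqs. (4)-(5).
            r = rng.random(d)
            I = rng.integers(1, 3, size=d)     # I_ij in {1, 2}
            x_p1 = X[i] + r * (prey - I * X[i])
            f_p1 = objective(x_p1)
            if f_p1 < F[i]:
                X[i], F[i] = x_p1, f_p1

            # Phase 2: chase process (exploitation), Eqs. (6)-(7).
            r = rng.random(d)
            x_p2 = X[i] + r * (ub - lb) / t    # step size shrinks with iteration t
            f_p2 = objective(x_p2)
            if f_p2 < F[i]:
                X[i], F[i] = x_p2, f_p2

        # Step 15: save the best candidate solution found so far.
        i_min = int(np.argmin(F))
        if F[i_min] < best_f:
            best_x, best_f = X[i_min].copy(), float(F[i_min])

    return best_x, best_f

# Example: minimize a 10-dimensional sphere function (placeholder test problem).
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = serval_optimization(sphere, lb=[-100] * 10, ub=[100] * 10, seed=0)
    print("best objective value found:", f_best)
```

In practice one would also add bound handling (for example, clipping candidate positions to [lb, ub]); the description above does not specify how out-of-bound positions are treated.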

3.5. Computational Complexity of SOA

This subsection is dedicated to the computational complexity analysis of the proposed SOA approach. The SOA initialization operation has a complexity of $O(Nd)$, where $N$ is the number of servals and $d$ is the number of decision variables. The process of updating the SOA population and evaluating the objective function in the two phases has a complexity of $O(2NdT)$, where $T$ is the maximum number of algorithm iterations. Therefore, the total computational complexity of SOA is $O(Nd(1 + 2T))$.

4. Simulation Studies and Results

This section is dedicated to evaluating the performance of SOA in solving optimization problems. For this purpose, thirty-nine standard benchmark functions from the CEC 2017 and CEC 2019 test suites have been employed. The CEC 2017 test suite has 30 benchmark functions, including 3 unimodal functions (C17-F1 to C17-F3), 7 multimodal functions (C17-F4 to C17-F10), 10 hybrid functions (C17-F11 to C17-F20), and 10 composition functions (C17-F21 to C17-F30). The C17-F2 function is not considered in the simulations due to its unstable behavior. Complete information on the CEC 2017 test suite is given in []. The CEC 2019 test suite has 10 hard benchmark functions, described in detail in []. The quality of the SOA approach is compared with the performance of twelve well-known metaheuristic algorithms: (i) widely used and famous methods: GA and PSO; (ii) highly cited methods: GSA, TLBO, MVO, GWO, and WOA; and (iii) recently published methods: MPA, TSA, RSA, AVOA, and WSO. The values chosen for the control parameters of the competitor algorithms are listed in Table 1.
Table 1. Control parameter values.
SOA and competitor algorithms are employed to optimize the 39 benchmark functions mentioned above. Simulation results are presented using six indicators: mean, best, worst, standard deviation (std), median, and rank.

4.1. Evaluation of the CEC 2017 Test Suite

To analyze the quality of SOA and competitor algorithms in handling optimization problems, they have been implemented on the CEC 2017 test suite for dimensions d equal to 10, 30, 50, and 100. Table 2, Table 3, Table 4 and Table 5 present the results obtained from these implementations.
Table 2. Optimization results of the CEC 2017 test suite (for the dimension d = 10 ).
Table 3. Optimization results of the CEC 2017 test suite (for the dimension d = 30 ).
Table 4. Optimization results of the CEC 2017 test suite (for the dimension d = 50 ).
Table 5. Optimization results of the CEC 2017 test suite (for the dimension d = 100 ).
A comparison of the simulation results shows that, for the dimension d = 10, SOA is the best optimizer for the functions C17-F3, C17-F6, C17-F7, C17-F10, C17-F19 to C17-F24, and C17-F26 to C17-F30. For the dimension d = 30, SOA is the best optimizer for the functions C17-F1, C17-F3 to C17-F5, C17-F7 to C17-F11, C17-F14 to C17-F16, C17-F20, C17-F21, C17-F23, C17-F26, C17-F27, C17-F29, and C17-F30. For the dimension d = 50, SOA is the best optimizer for the functions C17-F1, C17-F3 to C17-F10, C17-F12 to C17-F20, C17-F22, C17-F23, and C17-F25 to C17-F30. For the dimension d = 100, SOA is the best optimizer for the functions C17-F1, C17-F3 to C17-F5, C17-F7 to C17-F13, C17-F15, C17-F16, C17-F18 to C17-F22, C17-F24 to C17-F26, C17-F29, and C17-F30.
The optimization results show that the proposed SOA approach has provided superior performance compared to competing algorithms in the CEC 2017 test suite optimization by creating a suitable balance between exploration and exploitation. The performance of SOA and competitor algorithms in the optimization of the CEC 2017 test suite is drawn as a boxplot diagram in Figure 2, Figure 3, Figure 4 and Figure 5.
Figure 2. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2017 test suite (for the dimension d = 10 ).
Figure 3. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2017 test suite (for the dimension d = 30 ).
Figure 4. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2017 test suite (for the dimension d = 50 ).
Figure 5. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2017 test suite (for the dimension d = 100 ).

4.2. Evaluation of the CEC 2019 Test Suite

This subsection tests the effectiveness of the proposed SOA approach and competing algorithms in solving the CEC 2019 test suite. The optimization results of C19-F1 to C19-F10 functions are published in Table 6.
Table 6. Optimization results of the CEC 2019 test suite.
What is evident from the comparison of the simulation results is that the proposed SOA approach is the best optimizer for the C19-F1 to C19-F4 and C19-F6 to C19-F9 functions against the competitor algorithms. The optimization results show that the proposed SOA approach handles the CEC 2019 test suite better than the competitor algorithms, winning first rank overall. The performance of SOA and the competitor algorithms in the optimization of the CEC 2019 test suite is drawn as a boxplot diagram in Figure 6.
Figure 6. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2019 test suite.

4.3. Statistical Analysis

In this subsection, a statistical analysis of the simulation results is provided to investigate whether the superiority of the proposed SOA approach over the competitor algorithms is statistically significant. For this purpose, the Wilcoxon rank sum test [] is utilized, which determines whether there is a significant difference between two data samples. The results of applying the Wilcoxon rank sum test to the performance of the proposed SOA approach and the competitor algorithms are reported in Table 7. Based on the obtained p-values, in cases where the p-value is less than 0.05, the proposed SOA approach has a statistically significant superiority over the corresponding competitor algorithm.
Table 7. Wilcoxon rank sum test results.
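For readers who wish to reproduce this kind of analysis, the test is available in standard statistical software; the short Python sketch below uses scipy.stats.ranksums on two hypothetical vectors of per-run objective values (the numbers are illustrative only and are not taken from Table 7).

```python
from scipy.stats import ranksums

# Hypothetical best-objective values from 20 independent runs of two algorithms
# (illustrative numbers only, not results from this paper).
soa_runs = [100.2, 100.5, 100.1, 100.3, 100.4, 100.2, 100.6, 100.1, 100.3, 100.2,
            100.5, 100.4, 100.2, 100.3, 100.1, 100.6, 100.2, 100.4, 100.3, 100.5]
rival_runs = [104.9, 103.7, 105.2, 104.1, 103.9, 105.0, 104.4, 103.8, 104.6, 105.1,
              104.0, 104.8, 103.6, 104.3, 105.3, 104.2, 104.7, 103.9, 104.5, 104.9]

statistic, p_value = ranksums(soa_runs, rival_runs)
print(f"p-value = {p_value:.3e}")
# p-value < 0.05 -> the difference between the two samples is statistically significant.
```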

5. SOA for Real-World Applications

This section is dedicated to analyzing the effectiveness of the proposed SOA approach in handling real-world applications. In this regard, SOA and competitor algorithms are employed to optimize the CEC 2011 test suite and four engineering design problems.

5.1. Evaluation of the CEC 2011 Test Suite

This collection contains twenty-two real-world optimization problems (the C11-F3 function was excluded in the simulation studies). The CEC 2011 test suite details are described in []. The optimization results of the CEC 2011 test suite using SOA and competitor algorithms are published in Table 8.
Table 8. Optimization results of the CEC 2011 test suite.
The simulation results imply that the proposed SOA approach is the best optimizer for functions C11-F1, C11-F2, C11-F4 to C11-F6, C11-F8 to C11-F10, C11-F12, C11-F15, C11-F18, and C11-F20 to C11-F22. A comparison of the simulation results indicates that the proposed SOA approach has acceptable efficiency in dealing with real-world optimization problems against the competitor algorithms. Additionally, the results of applying the Wilcoxon rank sum test to the performance of SOA and the competitor algorithms on the CEC 2011 test suite show the statistically significant superiority of SOA over the compared algorithms. The performance of SOA and the competitor algorithms in dealing with the CEC 2011 test suite is plotted as a boxplot diagram in Figure 7.
Figure 7. Boxplot diagram of SOA and competitor algorithms performances on the CEC 2011 test suite.

5.2. The SOA Testing on Engineering Optimization Problems

In this subsection, the performance of SOA in solving four engineering design problems from real-world applications is evaluated.

5.2.1. Pressure Vessel Design Problem

The pressure vessel design is a real-world challenge in engineering studies where the goal is to minimize the design cost. The schematic of this design is provided in Figure 8.
Figure 8. Schematic of the pressure vessel design.
The mathematical model of the pressure vessel design problem is as follows []:
Consider: $X = [x_1, x_2, x_3, x_4] = [T_s, T_h, R, L]$.
Minimize: $f(x) = 0.6224 x_1 x_3 x_4 + 1.778 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$.
Subject to:
$g_1(x) = -x_1 + 0.0193 x_3 \le 0$, $g_2(x) = -x_2 + 0.00954 x_3 \le 0$,
$g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0$, $g_4(x) = x_4 - 240 \le 0$.
With
$0 \le x_1, x_2 \le 100$ and $10 \le x_3, x_4 \le 200$.
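To connect this model with the algorithm sketches in Section 3, the constrained problem can be turned into an unconstrained one with a simple static penalty; the Python sketch below is one common way to do this (the penalty scheme and weight are this example's assumptions, not necessarily what the authors used). The other three engineering problems in this section can be wrapped in the same way.

```python
import numpy as np

def pressure_vessel_penalized(x, penalty=1e6):
    """Pressure vessel cost plus a static penalty for violated constraints.
    x = [Ts, Th, R, L]; the penalty weight 1e6 is an illustrative choice."""
    x1, x2, x3, x4 = x
    cost = (0.6224 * x1 * x3 * x4 + 1.778 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)
    g = [
        -x1 + 0.0193 * x3,                                            # g1 <= 0
        -x2 + 0.00954 * x3,                                           # g2 <= 0
        -np.pi * x3 ** 2 * x4 - (4 / 3) * np.pi * x3 ** 3 + 1296000,  # g3 <= 0
        x4 - 240,                                                     # g4 <= 0
    ]
    violation = sum(max(0.0, gi) ** 2 for gi in g)
    return cost + penalty * violation

# This penalized objective can be minimized by any of the compared algorithms,
# e.g., with the SOA sketch from Section 3.4:
# x_best, f_best = serval_optimization(pressure_vessel_penalized,
#                                      lb=[0, 0, 10, 10], ub=[100, 100, 200, 200])
```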
The optimization results of pressure vessel design using SOA and competitor algorithms are released in Table 9 and Table 10.
Table 9. Performance of optimization algorithms on the pressure vessel design problem.
Table 10. Statistical results of optimization algorithms on pressure vessel design problem.
Based on the simulation results, the proposed SOA approach has provided the optimal solution with design variable values equal to (0.778027, 0.384579, 40.31228, 200) and an objective function value equal to 5882.901. The analysis of the results shows that, compared to the competitor algorithms, SOA has provided better performance in dealing with the pressure vessel design problem. The convergence curve of SOA in achieving the solution for the pressure vessel design problem is drawn in Figure 9.
Figure 9. SOA’s performance convergence curve on the pressure vessel design.

5.2.2. Speed Reducer Design Problem

The speed reducer design is an engineering subject aiming to minimize the speed reducer’s weight. The schematic of this design is provided in Figure 10.
Figure 10. Schematic of speed reducer design.
The mathematical model of the speed reducer design problem is as follows [,]:
Consider: $X = [x_1, x_2, x_3, x_4, x_5, x_6, x_7] = [b, m, p, l_1, l_2, d_1, d_2]$.
Minimize: $f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)$.
Subject to:
$g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0$, $g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0$,
$g_3(x) = \frac{1.93 x_4^3}{x_2 x_3 x_6^4} - 1 \le 0$, $g_4(x) = \frac{1.93 x_5^3}{x_2 x_3 x_7^4} - 1 \le 0$,
$g_5(x) = \frac{1}{110 x_6^3}\sqrt{\left(\frac{745 x_4}{x_2 x_3}\right)^2 + 16.9 \cdot 10^6} - 1 \le 0$,
$g_6(x) = \frac{1}{85 x_7^3}\sqrt{\left(\frac{745 x_5}{x_2 x_3}\right)^2 + 157.5 \cdot 10^6} - 1 \le 0$,
$g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0$, $g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0$,
$g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0$, $g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0$,
$g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0$.
With
$2.6 \le x_1 \le 3.6$, $0.7 \le x_2 \le 0.8$, $17 \le x_3 \le 28$, $7.3 \le x_4 \le 8.3$, $7.8 \le x_5 \le 8.3$, $2.9 \le x_6 \le 3.9$, and $5 \le x_7 \le 5.5$.
The implementation results of the proposed SOA and competitor algorithms on the speed reducer design problem are released in Table 11 and Table 12.
Table 11. Performance of optimization algorithms on the speed reducer design problem.
Table 12. Statistical results of optimization algorithms on the speed reducer design problem.
Based on the simulation results, the proposed SOA approach has provided the optimal solution with the values of the design variables equal to (3.5, 0.7, 17, 7.3, 7.8, 3.350215, 5.286683) and the objective function equal to 2996.348. Analysis of the results shows that SOA has provided better performance in handling speed reducer design compared to competitor algorithms. The convergence curve of SOA while solving the speed reducer design problem is drawn in Figure 11.
Figure 11. SOA’s performance convergence curve on the speed reducer design.

5.2.3. Welded Beam Design

The welded beam design is a real-world application with the aim of minimizing the fabrication cost of the welded beam. The schematic of welded beam design problem is provided in Figure 12.
Figure 12. Schematic of the welded beam design.
The mathematical model of the welded beam design problem is as follows []:
Consider: $X = [x_1, x_2, x_3, x_4] = [h, l, t, b]$.
Minimize: $f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$.
Subject to:
$g_1(x) = \tau(x) - 13600 \le 0$, $g_2(x) = \sigma(x) - 30000 \le 0$,
$g_3(x) = x_1 - x_4 \le 0$, $g_4(x) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14 + x_2) - 5.0 \le 0$,
$g_5(x) = 0.125 - x_1 \le 0$, $g_6(x) = \delta(x) - 0.25 \le 0$,
$g_7(x) = 6000 - p_c(x) \le 0$.
Here
$\tau(x) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2}$, $\tau' = \frac{6000}{\sqrt{2}\, x_1 x_2}$, $\tau'' = \frac{MR}{J}$,
$M = 6000\left(14 + \frac{x_2}{2}\right)$, $R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}$,
$J = 2\left\{\sqrt{2}\, x_1 x_2 \left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}$, $\sigma(x) = \frac{504000}{x_4 x_3^2}$,
$\delta(x) = \frac{65856000}{(30 \cdot 10^6)\, x_4 x_3^3}$, $p_c(x) = \frac{4.013 (30 \cdot 10^6)\sqrt{\frac{x_3^2 x_4^6}{36}}}{196}\left(1 - \frac{x_3}{28}\sqrt{\frac{30 \cdot 10^6}{4 (12 \cdot 10^6)}}\right)$.
With
$0.1 \le x_1, x_4 \le 2$ and $0.1 \le x_2, x_3 \le 10$.
The results of using the SOA and competing algorithms on the problem of welded beam design are released in Table 13 and Table 14.
Table 13. Performance of optimization algorithms on the welded beam design problem.
Table 14. Statistical results of optimization algorithms on the welded beam design problem.
Based on the simulation results, the proposed SOA approach has provided the optimal solution with design variable values equal to (0.20573, 3.470489, 9.036624, 0.20573) and an objective function value equal to 1.724852. Based on the statistical indicators, it is clear that SOA has provided a more effective capability in handling the welded beam design problem compared to the competitor algorithms. The SOA convergence curve during welded beam design optimization is drawn in Figure 13.
Figure 13. SOA’s performance convergence curve on the welded beam design.

5.2.4. Tension/Compression Spring Design

The tension/compression spring design is a real-world issue with the goal of minimizing the weight of tension/compression spring. The schematic of this design is provided in Figure 14.
Figure 14. Schematic of the tension/compression spring design.
The mathematical model of the tension/compression spring design problem is as follows []:
Consider: $X = [x_1, x_2, x_3] = [d, D, P]$.
Minimize: $f(x) = (x_3 + 2)\, x_2 x_1^2$.
Subject to:
$g_1(x) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0$, $g_2(x) = \frac{4 x_2^2 - x_1 x_2}{12566 (x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0$,
$g_3(x) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0$, $g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0$.
With
$0.05 \le x_1 \le 2$, $0.25 \le x_2 \le 1.3$, and $2 \le x_3 \le 15$.
The simulation results of the tension/compression spring design problem using the SOA and competitor algorithms are released in Table 15 and Table 16.
Table 15. Performance of optimization algorithms on the tension/compression spring design problem.
Table 16. Statistical results of optimization algorithms on the tension/compression spring design problem.
Based on the simulation results, the proposed SOA approach has provided the optimal solution with the values of the design variables equal to (0.051689, 0.356718, 11.28897) and the objective function equal to 0.012665. Comparing the obtained results indicates the superiority of SOA in dealing with the tension/compression spring design problem compared to competing algorithms. The SOA convergence curve while achieving the optimal design for the tension/compression spring design problem is drawn in Figure 15.
Figure 15. SOA’s performance convergence curve on the tension/compression spring.

6. Conclusions and Future Works

This paper introduced a new swarm-based metaheuristic algorithm named the Serval Optimization Algorithm (SOA), based on the simulation of serval behaviors in nature. The serval strategy during hunting, in the three stages of prey selection, attack, and chase, is the fundamental inspiration of SOA. The different steps of SOA were stated and mathematically modeled in two phases of exploration and exploitation. The effectiveness of SOA in solving optimization problems was tested on thirty-nine benchmark functions from the CEC 2017 and CEC 2019 test suites. The SOA results were compared with the performance of twelve other well-known metaheuristic algorithms. The optimization results showed that SOA performed better by balancing exploration and exploitation and had superior performance compared to the competitor algorithms. Employing the proposed approach to optimize the CEC 2011 test suite and four engineering design challenges demonstrated SOA's ability to address real-world applications.
The introduction of SOA opens several research directions for future studies. Designing a multi-objective version of SOA for multi-objective optimization problems, and developing a binary version of SOA for applications that require binary algorithms, such as feature selection, are among the most notable suggestions for future studies. The use of SOA in various optimization problems in science and real-world applications is among the other recommendations of this article for future research.

Author Contributions

Conceptualization, P.T.; methodology, P.T.; software, M.D.; validation, P.T. and M.D.; formal analysis, M.D.; investigation, P.T.; resources, P.T.; data curation, P.T. and M.D.; writing—original draft preparation, P.T. and M.D.; writing—review and editing, P.T. and M.D.; visualization, P.T.; supervision, P.T.; project administration, M.D.; funding acquisition, P.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Project of Excellence Faculty of Science, University of Hradec Králové, grant number 2210/2022.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank University of Hradec Králové for support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075. [Google Scholar] [CrossRef]
  2. Jahani, E.; Chizari, M. Tackling global optimization problems with a novel algorithm-Mouth Brooding Fish algorithm. Appl. Soft Comput. 2018, 62, 987–1002. [Google Scholar] [CrossRef]
  3. Sergeyev, Y.D.; Kvasov, D.; Mukhametzhanov, M. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 1–9. [Google Scholar] [CrossRef]
  4. Liberti, L.; Kucherenko, S. Comparison of deterministic and stochastic approaches to global optimization. Int. Trans. Oper. Res. 2005, 12, 263–285. [Google Scholar] [CrossRef]
  5. Koc, I.; Atay, Y.; Babaoglu, I. Discrete tree seed algorithm for urban land readjustment. Eng. Appl. Artif. Intell. 2022, 112, 104783. [Google Scholar] [CrossRef]
  6. Renard, P.; Alcolea, A.; Ginsbourger, D. Stochastic versus deterministic approaches. Environ. Model. Find. Simplicity Complex. 2013, 8, 133–149. [Google Scholar]
  7. Dehghani, M.; Trojovská, E.; Trojovský, P. A new human-based metaheuristic algorithm for solving optimization problems on the base of simulation of driving training process. Sci. Rep. 2022, 12, 9924. [Google Scholar] [CrossRef] [PubMed]
  8. Zeidabadi, F.-A.; Dehghani, M.; Trojovský, P.; Hubálovský, Š.; Leiva, V.; Dhiman, G. Archery Algorithm: A Novel Stochastic Optimization Algorithm for Solving Optimization Problems. Comput. Mater. Contin. 2022, 72, 399–416. [Google Scholar] [CrossRef]
  9. De Armas, J.; Lalla-Ruiz, E.; Tilahun, S.L.; Voß, S. Similarity in metaheuristics: A gentle step towards a comparison methodology. Nat. Comput. 2022, 21, 265–287. [Google Scholar] [CrossRef]
  10. Trojovská, E.; Dehghani, M.; Trojovský, P. Zebra Optimization Algorithm: A New Bio-Inspired Optimization Algorithm for Solving Optimization Algorithm. IEEE Access 2022, 10, 49445–49473. [Google Scholar] [CrossRef]
  11. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Malik, O.P.; Morales-Menendez, R.; Dhiman, G.; Nouri, N.; Ehsanifar, A.; Guerrero, J.M.; Ramirez-Mendoza, R.A. Binary spring search algorithm for solving various optimization problems. Appl. Sci. 2021, 11, 1286. [Google Scholar] [CrossRef]
  12. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  13. Kennedy, J.; Eberhart, R. Particle Swarm Optimization, Proceedings of ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Perth, WA, Australia, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  14. Karaboga, D.; Basturk, B. Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems; International fuzzy systems association world congress; Springer: Berlin/Heidelberg, Germany, 2007; pp. 789–798. [Google Scholar]
  15. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B 1996, 26, 29–41. [Google Scholar] [CrossRef]
  16. Dehghani, M.; Montazeri, Z.; Trojovská, E.; Trojovský, P. Coati Optimization Algorithm: A new bio-inspired metaheuristic algorithm for solving optimization problems. Knowl. Based Syst. 2022, 259, 110011. [Google Scholar] [CrossRef]
  17. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  18. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl. Based Syst. 2022, 243, 108457. [Google Scholar] [CrossRef]
  19. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
  20. Chopra, N.; Ansari, M.M. Golden Jackal Optimization: A Novel Nature-Inspired Optimizer for Engineering Applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar] [CrossRef]
  21. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  22. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  23. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  24. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  25. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
  26. Goldberg, D.E.; Holland, J.H. Genetic Algorithms and Machine Learning. Mach. Learn. 1988, 3, 95–99. [Google Scholar] [CrossRef]
  27. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  28. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  29. Dehghani, M.; Montazeri, Z.; Dhiman, G.; Malik, O.; Morales-Menendez, R.; Ramirez-Mendoza, R.A.; Dehghani, A.; Guerrero, J.M.; Parra-Arroyo, L. A spring search algorithm applied to engineering optimization problems. Appl. Sci. 2020, 10, 6173. [Google Scholar] [CrossRef]
  30. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  31. Dehghani, M.; Samet, H. Momentum search algorithm: A new meta-heuristic optimization algorithm inspired by momentum conservation law. SN Appl. Sci. 2020, 2, 1–15. [Google Scholar] [CrossRef]
  32. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm–A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 2012, 110, 151–166. [Google Scholar] [CrossRef]
  33. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  34. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl. Based Syst. 2020, 191, 105190. [Google Scholar] [CrossRef]
  35. Cuevas, E.; Oliva, D.; Zaldivar, D.; Pérez-Cisneros, M.; Sossa, H. Circle detection using electro-magnetism optimization. Inf. Sci. 2012, 182, 40–55. [Google Scholar] [CrossRef]
  36. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513. [Google Scholar] [CrossRef]
  37. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551. [Google Scholar] [CrossRef]
  38. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  39. Pereira, J.L.J.; Francisco, M.B.; Diniz, C.A.; Oliver, G.A.; Cunha Jr, S.S.; Gomes, G.F. Lichtenberg algorithm: A novel hybrid physics-based meta-heuristic for global optimization. Expert Syst. Appl. 2021, 170, 114522. [Google Scholar] [CrossRef]
  40. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.; Kumar, V. Football game based optimization: An application to solve energy commitment problem. Int. J. Intell. Eng. Syst. 2020, 13, 514–523. [Google Scholar] [CrossRef]
  41. Moghdani, R.; Salimifard, K. Volleyball premier league algorithm. Appl. Soft Comput. 2018, 64, 161–185. [Google Scholar] [CrossRef]
  42. Dehghani, M.; Montazeri, Z.; Givi, H.; Guerrero, J.M.; Dhiman, G. Darts game optimizer: A new optimization technique based on darts game. Int. J. Intell. Eng. Syst. 2020, 13, 286–294. [Google Scholar] [CrossRef]
  43. Zeidabadi, F.A.; Dehghani, M. POA: Puzzle Optimization Algorithm. Int. J. Intell. Eng. Syst. 2022, 15, 273–281. [Google Scholar]
  44. Dehghani, M.; Montazeri, Z.; Saremi, S.; Dehghani, A.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M. HOGO: Hide objects game optimization. Int. J. Intell. Eng. Syst. 2020, 13, 216–225. [Google Scholar] [CrossRef]
  45. Kaveh, A.; Zolghadr, A. A novel Meta-Heuristic algorithm: Tug of war optimization. Int. J. Optim. Civ. Eng. 2016, 6, 469–492. [Google Scholar]
  46. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  47. Dehghani, M.; Trojovský, P. Teamwork Optimization Algorithm: A New Optimization Approach for Function Minimization/Maximization. Sensors 2021, 21, 4567. [Google Scholar] [CrossRef]
  48. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Ramirez-Mendoza, R.A.; Matas, J.; Vasquez, J.C.; Parra-Arroyo, L. A new “Doctor and Patient” optimization algorithm: An application to energy commitment problem. Appl. Sci. 2020, 10, 5791. [Google Scholar] [CrossRef]
  49. Braik, M.; Ryalat, M.H.; Al-Zoubi, H. A novel meta-heuristic algorithm for solving numerical optimization problems: Ali Baba and the forty thieves. Neural Comput. Appl. 2022, 34, 409–455. [Google Scholar] [CrossRef]
  50. Al-Betar, M.A.; Alyasseri, Z.A.A.; Awadallah, M.A.; Abu Doush, I. Coronavirus herd immunity optimizer (CHIO). Neural Comput. Appl. 2021, 33, 5011–5042. [Google Scholar] [CrossRef]
  51. Ayyarao, T.L.; RamaKrishna, N.; Elavarasam, R.M.; Polumahanthi, N.; Rambabu, M.; Saini, G.; Khan, B.; Alatas, B. War Strategy Optimization Algorithm: A New Effective Metaheuristic Algorithm for Global Optimization. IEEE Access 2022, 10, 25073–25105. [Google Scholar] [CrossRef]
  52. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems: A novel nature-inspired algorithm. Int. J. Mach. Learn. Cybern. 2020, 11, 1501–1529. [Google Scholar] [CrossRef]
  53. Smithers, R.H. The serval Felis serval Schreber, 1776. S. Afr. J. Wildl. Res. 1978, 8, 29–37. [Google Scholar]
  54. Awad, N.; Ali, M.; Liang, J.; Qu, B.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Nanyang Technological University: Singapore, 2016. [Google Scholar]
  55. Price, K.V.; Awad, N.H.; Ali, M.Z.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the 100-Digit Challenge Special Session and Competition on Single Objective Numerical Optimization; Nanyang Technological University: Singapore, 2018. [Google Scholar]
  56. Wilcoxon, F. Individual comparisons by ranking methods. In Breakthroughs in Statistics; Springer: Berlin/Heidelberg, Germany, 1992; pp. 196–202. [Google Scholar]
  57. Das, S.; Suganthan, P.N. Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems. Jadavpur Univ. Nanyang Technol. Univ. Kolkata 2010, 6, 341–359. [Google Scholar]
  58. Kannan, B.; Kramer, S.N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J. Mech. Des. 1994, 116, 405–411. [Google Scholar] [CrossRef]
  59. Gandomi, A.H.; Yang, X.-S. Benchmark problems in structural optimization. In Computational Optimization, Methods and Algorithms; Springer: Berlin/Heidelberg, Germany, 2011; pp. 259–281. [Google Scholar]
  60. Mezura-Montes, E.; Coello, C.A.C. Useful Infeasible Solutions in Engineering Optimization with Evolutionary Algorithms; Mexican international conference on artificial intelligence; Springer: Berlin/Heidelberg, Germany, 2005; pp. 652–662. [Google Scholar]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
