Article

An Adaptive Sand Cat Swarm Algorithm Based on Cauchy Mutation and Optimal Neighborhood Disturbance Strategy

1 School of Science, Xi’an University of Technology, Xi’an 710054, China
2 School of Science, Chang’an University, Xi’an 710064, China
* Author to whom correspondence should be addressed.
Biomimetics 2023, 8(2), 191; https://doi.org/10.3390/biomimetics8020191
Submission received: 4 April 2023 / Revised: 26 April 2023 / Accepted: 26 April 2023 / Published: 4 May 2023
(This article belongs to the Special Issue Nature-Inspired Computer Algorithms)

Abstract: The sand cat swarm optimization algorithm (SCSO) is a potent and straightforward meta-heuristic algorithm derived from the keen sense of hearing of sand cats, and it performs well on some large-scale optimization problems. However, SCSO still has several disadvantages, including sluggish convergence, low convergence precision, and a tendency to become trapped in local optima. To overcome these shortcomings, an adaptive sand cat swarm optimization algorithm based on Cauchy mutation and an optimal neighborhood disturbance strategy (COSCSO) is provided in this study. First and foremost, the introduction of a nonlinear adaptive parameter that scales up the global search helps to retrieve the global optimum from a vast search space and prevents the algorithm from being caught in a local optimum. Secondly, the Cauchy mutation operator perturbs the search step, accelerating convergence and improving search efficiency. Finally, the optimal neighborhood disturbance strategy diversifies the population, broadens the search space, and enhances exploitation. To assess the performance of COSCSO, it was compared with alternative algorithms on the CEC2017 and CEC2020 competition suites. Furthermore, COSCSO was deployed to solve six engineering optimization problems. The experimental results show that COSCSO is strongly competitive and capable of being deployed to solve practical problems.

1. Introduction

Throughout history, optimization problems have arisen in every dimension of people’s lives, such as finance, science, and engineering. Nevertheless, with the development of society, optimization problems have become progressively more intricate. Traditional optimization methods, such as the Lagrange multiplier method, the complex method, and queuing theory, require explicit descriptions of the problem conditions and can only solve smaller optimization problems; larger ones cannot be tackled exactly in a limited time. At the same time, for nonlinear engineering problems with a large number of constraints and decision variables, traditional optimization methods tend to get caught in a local optimum instead of finding the global optimal solution. Therefore, drawing inspiration from numerous phenomena in nature, researchers have devised a host of powerful and accessible meta-heuristic algorithms that, it is worth noting, can strike a superior balance between escaping local optima and converging, in order to arrive at a global optimum and solve sophisticated optimization problems.
These algorithms have been grouped into five principal categories based on the inspiration used to create them. (1) Human-based optimization algorithms are designed based on human brain thinking, systems, organs, and social evolution. An example is the well-known neural network algorithm (NNA) [1], which tackles problems in ways informed by the message transmission of neural networks in the human brain. The harmony search (HS) [2,3] algorithm simulates a musician’s ability to achieve a pleasing harmonic state by repeatedly adjusting the pitch from memory. (2) Those that emulate natural evolution are classified as evolutionary optimization algorithms. The genetic algorithm (GA) [4] is the most classical model for simulating evolution, in which chromosomes pass through a cycle of stages to form descendants, leaving more adaptive individuals through survival of the fittest. The differential evolution (DE) [5,6] algorithm, imperialist competitive algorithm (ICA) [7], and memetic algorithm (MA) [8] also belong to the algorithms based on evolutionary mechanisms. (3) Population-based optimization algorithms model the reproduction, predation, migration, and other behaviors of a colony of organisms. In this class of algorithm, the individuals in the population are conceived as massless particles seeking the best position. Ant colony optimization (ACO) [9,10] exploits the way ants search for the shortest distance from the nest to food. Particle swarm optimization (PSO) [11], stemming from the foraging of birds, is the most broadly adopted swarm intelligence algorithm. The moth-flame optimization (MFO) [12] algorithm is a mathematical model built by simulating the special navigation of a moth, which spirals toward a light source until it reaches the “flame”. Other swarm intelligence algorithms include the grey wolf optimization (GWO) [13] algorithm, manta ray foraging optimization (MRFO) algorithm [14], artificial hummingbird algorithm (AHA) [15], dwarf mongoose optimization (DMO) [16], and chimpanzee optimization algorithm (CHOA) [17,18]. (4) Plant growth-based optimization algorithms are devised from the properties of plants, such as photosynthesis, flower pollination, and seed dispersal. The dandelion optimization (DO) [19] algorithm is inspired by the seed’s process of rising, drifting, and landing in different weather conditions depending on the wind. The algorithm that simulates the aggressive invasion of weeds, which search for a suitable living space and utilize natural resources to grow and reproduce rapidly, is denoted invasive weed optimization (IWO) [20]. (5) Physics-based optimization algorithms are created in accordance with physical phenomena and laws of nature. The gravitational search algorithm (GSA) [21], which is derived from gravity, has a robust global search capability and a fast convergence rate. The artificial raindrop algorithm (ARA) [22] is designed on the basis of the processes of raindrop formation, landing, collision, confluence, and final evaporation as water vapor.
In particular, many algorithms have been applied to practical engineering problems on account of their excellent performance, such as feature selection [23,24,25], image segmentation [26,27], signal processing [28], construction of water facilities [29], path planning for walking robots [30,31], job-shop scheduling problems [32], and piping and wiring problems in industrial and agricultural production [33]. Unlike gradient-based optimization algorithms, meta-heuristic algorithms rely on probabilistic search rather than gradient information. With no centralized control constraints, the failure of a single individual does not affect the solution of the whole problem, ensuring a more stable search process. As a general rule, the first step is to appropriately set up the essential parameters of the algorithm and produce a stochastic collection of initial solutions. Next, the search mechanism of the algorithm is applied to seek the optimum until the stopping criterion is met or the optimum is discovered [34]. Nevertheless, every algorithm has two sides, merits and demerits, and its performance fluctuates with the problem being addressed. The no free lunch (NFL) theorem [35] states that an algorithm capable of addressing one or more optimization problems has no guarantee of successfully tackling other optimization problems. Therefore, when facing particular problems, it is sensible to propose a variety of strategies to enhance the efficiency of the algorithm.
The sand cat swarm optimization algorithm is a recently proposed swarm intelligence algorithm. In [36], SCSO is tested against other popular algorithms (such as PSO and GWO) on different test functions and achieves better or at least comparable results, but these can still be improved. As a result, this paper provides an enhanced variant to tackle the optimization problem, with the following primary contributions:
(1) The COSCSO algorithm, with better performance, is designed by adding three strategies to SCSO:
First, nonlinear adaptive parameters replace the original linear parameters to strengthen the global search and prevent the algorithm from being caught in a local optimum.
Second, the Cauchy mutation operator strategy expedites the convergence speed.
Finally, the optimal neighborhood disturbance strategy enriches population diversity.
(2) The enhanced algorithm is tested on test suites of different dimensions and on real engineering optimization problems:
The balance between exploration and exploitation in COSCSO is analyzed on the 30-dimensional CEC2020 test suite.
COSCSO is compared with other competitive algorithms on the CEC2017 test suite and on the 30- and 50-dimensional CEC2020 test suites.
The improved algorithm is deployed on six engineering optimization problems alongside nine other algorithms.
The remainder of the paper is organized as follows. Section 2 describes related work on SCSO, and Section 3 summarizes the original sand cat swarm algorithm for searching for and attacking prey. Section 4 elaborates on the three improvement strategies in detail. Section 5 presents a comparative analysis of COSCSO, SCSO, and other optimization algorithms, illustrating the superiority of COSCSO. In Section 6, six engineering examples are used to verify the ability of COSCSO and other algorithms to address real-world problems. The final section concludes the paper.

2. Related Works

Since the emergence of the sand cat swarm optimization algorithm, it has received considerable attention from researchers due to its excellent performance. Aghaei, Seyyedabbasi et al. [37] applied SCSO to three diverse nonlinear control systems: an inverted pendulum, a Furuta pendulum, and an Acrobot robotic arm. Simulation experiments showed that SCSO is simple and accessible and can be a viable candidate for real-world control and engineering problems. In addition, several researchers have optimized SCSO for greater performance. Firstly, Li et al. [38] designed an elite collaboration strategy with stochastic variation that selects the top three sand cats in the population; the three elites, assigned different weights, cooperate to form a new sand cat position to guide the search process, avoiding the dilemma of being trapped in a local optimum. Secondly, Amir Seyyedabbasi et al. [39] combined SCSO with reinforcement learning techniques to better balance the exploration and exploitation processes and to solve the mobile node localization problem in wireless sensor networks. Finally, the ISCSO proposed by Lu et al. [40] effectively boosts the fault diagnosis performance of power transformers.

3. The Sand Cat Swarm Optimization

The sand cat swarm optimization (SCSO) algorithm is a recent meta-heuristic optimization algorithm proposed by Amir Seyyedabbasi et al. in 2022. Sand cats live in very barren deserts and mountainous areas. Gerbils, hares, snakes, and insects are their dominant sources of food. In appearance, sand cats are similar to domestic cats, but one big difference is that their hearing is very sensitive: they can detect low-frequency noise below 2 kHz. They use this special skill to find and attack their prey very quickly. The process from discovering to capturing prey is shown in Figure 1. The sand cat’s predation can be compared to the process of finding the optimal value, which is the inspiration for the algorithm.

3.1. Initialization

Initially, the population is generated in a randomized manner so that the sand cats are evenly distributed in the search space:
$$X_0 = lb + rand(0,1) \cdot (ub - lb) \tag{1}$$
where lb and ub are the lower and upper bounds of the variables, and rand(0,1) is a random number between 0 and 1.
The resulting initial matrix is shown below:
$$Cat = \begin{bmatrix} x_{1,1} & x_{1,2} & \cdots & x_{1,M} \\ x_{2,1} & x_{2,2} & \cdots & x_{2,M} \\ \vdots & \vdots & \ddots & \vdots \\ x_{N,1} & x_{N,2} & \cdots & x_{N,M} \end{bmatrix} \tag{2}$$
where $x_{i,j}$ denotes the jth dimension of the ith individual, for a total of N individuals and M variables. Meanwhile, the matrix of fitness values is shown below:
$$Fitness = \begin{bmatrix} f(x_{1,1}, x_{1,2}, \ldots, x_{1,M}) \\ f(x_{2,1}, x_{2,2}, \ldots, x_{2,M}) \\ \vdots \\ f(x_{N,1}, x_{N,2}, \ldots, x_{N,M}) \end{bmatrix} \tag{3}$$
After comparing all fitness values, the minimum value is found, and the individual corresponding to it is the current optimal one.
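As a concrete illustration, the following minimal Python/NumPy sketch implements this initialization and fitness evaluation (function and variable names are illustrative, not from the original paper):

```python
import numpy as np

def initialize_population(n_cats, n_dims, lb, ub, rng=None):
    """Scatter N sand cats uniformly inside [lb, ub], as in Eq. (1)."""
    rng = rng or np.random.default_rng()
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    # One uniform draw per individual and per dimension.
    return lb + rng.random((n_cats, n_dims)) * (ub - lb)

def evaluate(population, fitness_fn):
    """Fitness vector: one objective value per row of the population (Eq. (3))."""
    return np.apply_along_axis(fitness_fn, 1, population)
```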

3.2. Searching for Prey (Exploration)

The sand cat searches for prey mainly using its very sharp sense of hearing, which can detect low-frequency noise below 2 kHz. Then its mathematical model in the prey-finding stage is shown as follows:
$$S_e = S_M - \left(S_M \times \frac{t}{T}\right) \tag{4}$$
$$r_e = S_e \times rand(0,1) \tag{5}$$
$$X(t+1) = r_e \cdot \left(X_a(t) - rand(0,1) \cdot X(t)\right) \tag{6}$$
where $S_M = 2$; $S_e$ denotes the general sensitivity range of the sand cats, whose value decreases linearly from 2 to 0; and $r_e$ is the sensitivity range of a particular sand cat in the swarm. t is the current iteration count, and T is the maximum number of iterations for the entire search process. $X_a(t)$ is a randomly selected member of the population, and $X(t)$ is the current position of the sand cat. Notably, when $S_e = 0$, $r_e = 0$, and the new position of the sand cat given by Equation (6) is also 0, which still lies in the search space. Furthermore, in order to guarantee a steady state between the exploration and exploitation phases, the parameter $R_e \in [-2, 2]$ is introduced; its value is given by Equation (7).
$$R_e = 2 \times S_e \times rand(0,1) - S_e \tag{7}$$

3.3. Grabbing Prey (Exploitation)

As the search process progresses, the sand cat attacks the prey found in the previous stage. The mathematical model of the prey-attack phase is as follows:
$$dist = \left| rand(0,1) \cdot X_{best}(t) - X(t) \right| \tag{8}$$
$$X(t+1) = X_{best}(t) - dist \cdot \cos(\theta) \cdot r_e \tag{9}$$
where dist is the distance between the best individual and the current individual, and θ is a random angle between 0° and 360°.

3.4. Bridging Phase

The conversion of SCSO from the exploration phase to exploitation is governed by the parameter $R_e$. When $|R_e| \le 1$, the sand cat closes in and captures the prey (the exploitation phase); when $|R_e| > 1$, it continues to search different regions to find the location of the prey (the exploration phase). The pseudo-code of SCSO is given in [36]. The combined position update is:
$$X(t+1) = \begin{cases} X_{best}(t) - dist \cdot \cos(\theta) \cdot r_e, & |R_e| \le 1 \ \text{(exploitation)} \\ r_e \cdot \left(X_a(t) - rand(0,1) \cdot X(t)\right), & |R_e| > 1 \ \text{(exploration)} \end{cases} \tag{10}$$
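A compact Python sketch of one SCSO iteration, combining Equations (4)–(10), may make the switch between the two phases clearer (a reconstruction from the equations above, not the authors' code):

```python
import numpy as np

def scso_step(pop, best, t, T, rng):
    """One SCSO iteration over the whole swarm (Eqs. (4)-(10))."""
    n, d = pop.shape
    SM = 2.0
    Se = SM - SM * t / T                      # Eq. (4): linear decay from 2 to 0
    new_pop = np.empty_like(pop)
    for i in range(n):
        re = Se * rng.random()                # Eq. (5)
        Re = 2.0 * Se * rng.random() - Se     # Eq. (7)
        if abs(Re) <= 1:                      # exploitation: Eqs. (8)-(9)
            theta = rng.uniform(0.0, 2.0 * np.pi)
            dist = np.abs(rng.random(d) * best - pop[i])
            new_pop[i] = best - dist * np.cos(theta) * re
        else:                                 # exploration: Eq. (6)
            a = pop[rng.integers(n)]          # random member of the swarm
            new_pop[i] = re * (a - rng.random(d) * pop[i])
    return new_pop
```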

4. Improved Sand Cat Swarm Optimization

In SCSO, the sand cat uses its ability to perceive low-frequency noise below 2 kHz to capture prey. The algorithm is straightforward, easy to implement, and iterates quickly toward the best position. However, it has some shortcomings, such as a tendency to become stuck in local optima and premature convergence. This paper therefore improves the algorithm with three strategies: a nonlinear adaptive parameter, a Cauchy mutation strategy, and an optimal neighborhood disturbance strategy.

4.1. Nonlinear Adaptive Parameters

In SCSO, the parameter Se plays a very prominent role. Firstly, it represents the sensitivity range of the sand cat’s hearing. Secondly, it determines the size of the parameter Re, which in turn is responsible for balancing the global search and local exploitation phases of the iterative process; thus, Se is also the parameter that coordinates exploration and exploitation. Finally, it is a crucial component of the convergence factor re, which affects the speed of convergence during the iteration. In the original algorithm, Se decreases linearly from 2 to 0. This idealized law is not representative of the actual sand cat’s predation ability, so a nonlinear adaptive parameter strategy is now utilized, with the formula given in Equation (11).
$$S_e = 2 q_t \left(1 - \left(\frac{t}{T}\right)^2\right)\left(1 - \frac{t}{T}\right)^{\frac{1}{4}} + 2\left(1 - \frac{t}{T}\right)^{\frac{1}{4}} \tag{11}$$
Here, $q_t = 1 - 2(q_{t-1})^2$, with $q_t \in (0,1)$ and $q_t \ne 0.5$.
The variation curves of the parameter Se before and after the improvement are displayed in Figure 2. Comparing the two curves, the modified Se takes larger values in the early portion of the optimization process, which favors the global search. Moreover, due to the perturbation of qt, the value of Se occasionally dips during this stage, which caters to the local search at those moments, yielding faster convergence and more precise search accuracy. In the later portion of the optimization process, the value stays on the lower side, focusing on the local search, while the perturbation of qt occasionally raises Se, which helps the algorithm avoid becoming bogged down in local optima.
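A short Python sketch of this schedule follows. Since the exact chaotic update of $q_t$ is only partially specified in the text, the fold via abs() below is an assumption made to keep $q_t$ inside the stated interval (0, 1):

```python
def nonlinear_se(t, T, q_prev):
    """Nonlinear adaptive sensitivity Se (Eq. (11)) with a chaotic q_t update.
    The abs() fold is an assumption to keep q_t in (0, 1) as the text requires."""
    q = abs(1.0 - 2.0 * q_prev**2)            # chaotic perturbation of q_t
    decay = (1.0 - t / T) ** 0.25             # the (1 - t/T)^(1/4) factor
    se = 2.0 * q * (1.0 - (t / T) ** 2) * decay + 2.0 * decay
    return se, q
```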

4.2. Cauchy Mutation Strategy

The Cauchy distribution is distinguished by long tails at both ends and a pronounced peak at the central origin. Introducing the Cauchy mutation operator [41,42,43] into the mutation step gives each sand cat a greater likelihood of jumping to a better place. When the search stagnates at a local optimum, the Cauchy mutation operator enlarges the step size, causing the sand cat to jump away from the local optimal position; conversely, when an individual is pursuing the global optimum, the operator shrinks the step size and speeds up convergence. The Cauchy mutation has been integrated into many algorithms, such as MFO and CSO. The Cauchy distribution function and probability density function are as follows:
$$F(x) = \frac{1}{\pi}\arctan\left(\frac{x - x_0}{\gamma}\right) + \frac{1}{2} \tag{12}$$
$$f(x) = \frac{1}{\pi} \cdot \frac{\gamma}{(x - x_0)^2 + \gamma^2} \tag{13}$$
where $x_0$ is the location parameter, at which the density attains its maximum, and γ is the scale parameter, equal to half the width of the peak at half its maximum. With $x_0 = 0$ and $\gamma = 1$, the standard Cauchy distribution is obtained; its probability density function is given in Equation (14), and Figure 3 shows the corresponding curve.
$$f(x) = \frac{1}{\pi(1 + x^2)} \tag{14}$$
To diminish the probability of dropping into a local optimum, this paper uses the Cauchy mutation operator to promote the global optimization-seeking ability of the algorithm, expedite the convergence speed, and increase population diversity. The individual update then becomes
$$X(t+1) = X_{best}(t) - C(0,1) \cdot r_e \cdot dist \cdot \cos(\theta) \tag{15}$$
where $C(0,1)$ is a random number drawn from the standard Cauchy distribution.
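In code, a standard Cauchy sample can be drawn by inverting Equation (12): $x = \tan(\pi(u - 0.5))$ for $u \sim U(0,1)$. The following sketch applies it to the exploitation step of Equation (15) (names are illustrative):

```python
import numpy as np

def cauchy_exploit_step(x, best, re, rng):
    """Exploitation step with a Cauchy-perturbed step size (Eq. (15))."""
    d = x.shape[0]
    theta = rng.uniform(0.0, 2.0 * np.pi)
    dist = np.abs(rng.random(d) * best - x)   # distance to the best individual
    c = np.tan(np.pi * (rng.random() - 0.5))  # standard Cauchy sample C(0,1)
    return best - c * re * dist * np.cos(theta)
```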

4.3. Optimal Neighborhood Disturbance Strategy

When a sand cat swarm is feeding, all individuals move towards the location of the prey, which tends to homogenize the population and hampers the global search. Therefore, an optimal neighborhood disturbance strategy [44] is utilized: whenever the global optimum is updated, a further search is performed around it. This enriches population diversity and helps avoid stagnation at a local optimum. The optimal neighborhood disturbance is defined as follows:
$$X'_{best}(t) = \begin{cases} X_{best}(t) + 0.5\, r_1\, X_{best}(t), & r_2 < 0.5 \\ X_{best}(t), & r_2 \ge 0.5 \end{cases} \tag{16}$$
where $X'_{best}(t)$ is the new individual generated after the disturbance, and $r_1, r_2 \in [0,1]$ are random numbers.
After the optimal neighborhood search, a greedy strategy decides whether to accept the disturbed individual. The specific formula is as follows:
$$X_{best}(t) = \begin{cases} X'_{best}(t), & f(X'_{best}(t)) < f(X_{best}(t)) \\ X_{best}(t), & f(X'_{best}(t)) \ge f(X_{best}(t)) \end{cases} \tag{17}$$
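A minimal Python sketch of Equations (16)–(17) (illustrative only):

```python
import numpy as np

def neighborhood_disturbance(best, fitness_fn, rng):
    """Optimal neighborhood disturbance with greedy acceptance (Eqs. (16)-(17))."""
    r1, r2 = rng.random(), rng.random()
    candidate = best + 0.5 * r1 * best if r2 < 0.5 else best.copy()
    # Greedy selection: keep the disturbed best only if it improves fitness.
    if fitness_fn(candidate) < fitness_fn(best):
        return candidate
    return best
```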

4.4. COSCSO Steps

In this work, a nonlinear adaptive parameter, a Cauchy mutation strategy, and an optimal neighborhood disturbance strategy are combined to modify the standard SCSO algorithm to form the COSCSO algorithm. The fundamental steps of COSCSO are as follows:
Step 1. Initialize the population, setting the population size N, the maximum number of iterations T, and the required parameters.
Step 2. Compute and compare the fitness value of each sand cat and record the current best position.
Step 3. Update the nonlinear parameter Se and the parameters re and Re by means of Equations (11), (5), and (7).
Step 4. Generate the Cauchy mutation operator.
Step 5. Update the individual position of the sand cat using Equation (6) if $|R_e| > 1$; otherwise, use Equation (15).
Step 6. Compare the fitness value of each updated individual with that of the best individual, and if the former is better, update the best position.
Step 7. Generate a new individual by perturbing the current best individual according to the optimal neighborhood disturbance strategy, Equation (16).
Step 8. Compare the fitness values of the freshly generated individual and the best individual according to the greedy strategy, Equation (17), and update the best position if the former is preferable.
Step 9. Return to Step 3 if the maximum number of iterations T has not been reached; otherwise, continue with Step 10.
Step 10. Output the global best position and the corresponding fitness value.
For a more concise description of the procedures of the COSCSO algorithm, the pseudo-code of the algorithm is given in Table 1 and the flowchart in Figure 4.
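For readers who prefer runnable code to pseudo-code, the following compact sketch wires together the helper functions sketched above (initialize_population, evaluate, nonlinear_se, cauchy_exploit_step, neighborhood_disturbance). It is an illustrative reconstruction of Steps 1–10, not the authors' implementation:

```python
import numpy as np

def coscso(fitness_fn, n_cats, n_dims, lb, ub, T, seed=0):
    """COSCSO main loop (Steps 1-10); an illustrative reconstruction."""
    rng = np.random.default_rng(seed)
    pop = initialize_population(n_cats, n_dims, lb, ub, rng)   # Step 1
    fit = evaluate(pop, fitness_fn)                            # Step 2
    best = pop[np.argmin(fit)].copy()
    q = 0.7                                     # any q in (0, 1), q != 0.5
    for t in range(T):
        se, q = nonlinear_se(t, T, q)           # Step 3: Eq. (11)
        for i in range(n_cats):
            re = se * rng.random()              # Eq. (5)
            Re = 2.0 * se * rng.random() - se   # Eq. (7)
            if abs(Re) > 1:                     # Step 5: exploration, Eq. (6)
                a = pop[rng.integers(n_cats)]
                pop[i] = re * (a - rng.random(n_dims) * pop[i])
            else:                               # Step 5: Cauchy exploitation, Eq. (15)
                pop[i] = cauchy_exploit_step(pop[i], best, re, rng)
            pop[i] = np.clip(pop[i], lb, ub)    # keep individuals inside bounds
            if fitness_fn(pop[i]) < fitness_fn(best):           # Step 6
                best = pop[i].copy()
        best = neighborhood_disturbance(best, fitness_fn, rng)  # Steps 7-8
    return best, fitness_fn(best)               # Step 10
```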

4.5. Computational Complexity of COSCSO Algorithm

The computational complexity of an algorithm is the volume of resources it consumes during execution. When the COSCSO program runs, updating one D-dimensional individual costs O(D). For a population of N individuals, one iteration therefore costs O(N × D), and since the search is executed over T iterations, the total complexity is O(T × N × D). For example, with the settings used later (N = 50, T = 1000, D = 30), this amounts to on the order of 1.5 × 10⁶ coordinate updates. In the following sections, we test the capability of COSCSO on different test suites and concrete engineering problems.

5. Numerical Experiments and Analysis

In this chapter, the balance between the exploration and exploitation processes of COSCSO is first discussed. Then, the more challenging CEC2017 and CEC2020 test suites are selected to test the overall performance of COSCSO. COSCSO is evaluated against the standard SCSO as well as an extensive variety of meta-heuristic algorithms; the parameter values for all algorithms are specified in Table 2. All statistical experiments are conducted on the same computer. In addition, each algorithm is run 20 times independently on each function, with N = 50 and T = 1000, and the optimization results are compared by analyzing the average and standard deviation of the best solutions.

5.1. Exploration and Exploitation Analysis

Exploration and exploitation play an integral role in the optimization process. Therefore, when evaluating algorithm performance, it is vital to discuss not only the ultimate consequences of the algorithm but also the nature of the balance between exploration and exploitation [45]. Figure 5 gives a diagram of the exploration and exploitation of COSCSO on the 30-dimensional CEC2020 test suite.
As we can observe from the figure, the algorithm progressively transitions from the exploration phase to the exploitation phase. On the simpler basic functions F2 and F4 and the most complex composition function F9, COSCSO moves to the exploitation phase around the 10th iteration and rapidly reaches the top of the exploitation phase, illustrating the greatly enhanced convergence accuracy of COSCSO. On the hybrid functions F5, F6, and F7, COSCSO also preserves a strong exploration ability in the middle and late stages, effectively refraining from plunging into a local optimum.

5.2. Comparison and Analysis on the CEC2017 Test Suite

Firstly, a running test is performed on the 30-dimensional CEC2017 test suite. The specific formulas for these functions are given in [46]. Then, COSCSO is compared and analyzed with SCSO and eight other competitive optimization algorithms, which include: PSO, RSA [47], BWO [48], DO, AOA [49], HHO [50,51], NCHHO [52], and ATOA [53].
The results obtained by running COSCSO and the other competing algorithms 20 times are given in Table 3. COSCSO ranks first on 24 test functions, accounting for about 82.76% of all test functions. First, on the unimodal test functions, COSCSO has a distinct superiority over the others with regard to the mean value and achieves a smaller standard deviation. Next, on the multimodal test functions, although COSCSO is weaker than PSO on F5 and F6, it is more competitive than the other nine algorithms. Furthermore, on the hybrid functions, except for F15 and F19, COSCSO is clearly superior to the other algorithms, especially on F12–F14, F16, and F18, where COSCSO leads in both mean and standard deviation. Finally, on the composition functions, COSCSO is far ahead on F22, F28, and F30, but on F21 it is marginally weaker than PSO and SCSO. The last row of the table shows the average ranking of the ten algorithms: COSCSO > HHO > SCSO > PSO > DO > ATOA > AOA > NCHHO = BWO > RSA. In summary, the COSCSO algorithm has superior optimization ability on the CEC2017 test suite; this fully demonstrates that the three strategies effectively boost convergence accuracy and efficiency and greatly reduce the defects of the original algorithm.
Table 3 also reports the Wilcoxon rank-sum test p-values [54] obtained from 20 runs of each meta-heuristic algorithm on the 30-dimensional CEC2017 problems at the 95% significance level ($\alpha = 0.05$), using COSCSO as the benchmark. The last row shows the statistical results: “+” indicates the number of functions on which an algorithm outperforms COSCSO, “=” indicates the number of functions with no appreciable difference between the two algorithms at $\alpha = 0.05$, and “-” indicates the number of functions on which COSCSO outperforms the other algorithm. Combining this with the ranking of each algorithm, COSCSO is significantly superior to RSA, BWO, DO, AOA, NCHHO, and ATOA on all test functions, worse than PSO on F6 and F21, and clearly preferable to PSO on 14 test functions. Altogether, COSCSO has markedly better competence than the other algorithms and is a wise choice for solving the CEC2017 problems.
Figure 6 illustrates the convergence curves of COSCSO and the other algorithms on the CEC2017 test functions. Observing the curves, COSCSO is a dramatic enhancement over SCSO. Although COSCSO is at a disadvantage compared to PSO on F5, F6, and F21, and inferior to ATOA on F15 and F19, it remains superior to the other algorithms. On the remaining functions, COSCSO clearly converges faster and with higher accuracy than SCSO. These advantages are attributed to the three major strategies, the adaptive parameter, the Cauchy mutation operator, and the optimal neighborhood disturbance, which keep the algorithm from dropping into local optima and from premature convergence.
Figure 7 depicts the box plots of COSCSO and the other algorithms on the CEC2017 test functions. The height of a box reflects the spread of the data; a narrower box represents more concentrated data and a more stable algorithm. Points beyond the normal range of the data are marked as outliers with a “+”. From the figure, on F1, F3, F4, F11, F12, F14, F15, F17, F18, F27, F28, and F30, the boxes of COSCSO are significantly narrower than those of the other algorithms. In addition, except for F22, COSCSO has almost no outliers. This implies that its operation is more stable and that it has good robustness in solving the CEC2017 test functions.
Radar maps, also known as spider-web maps, plot multi-dimensional data onto radial axes and give an indication of the relative weight of each variable. Figure 8 shows the radar maps of COSCSO and the other algorithms, plotted from the rankings of the ten meta-heuristic algorithms on the CEC2017 test functions. It can be observed that COSCSO encloses the smallest shaded area, which further illustrates that COSCSO is ahead of the other nine comparative algorithms. The shaded area of HHO ranks second, indicating that HHO offers some competition to COSCSO.

5.3. Comparison and Analysis on the CEC2020 Test Suite

In order to further test COSCSO’s optimization ability, it is also evaluated on the 30-dimensional and 50-dimensional CEC2020 test suites. The CEC2020 test suite [55] is composed of parts of the CEC2014 test suite [56] and the CEC2017 test suite. Besides SCSO, eight other optimization algorithms are compared: WOA [57], RSA, PSO, CHOA, AOA, HHO, NCHHO, and ATOA. All parameter settings remain identical except for the number of dimensions.
The experimental results of each algorithm on the 30-dimensional CEC2020 test suite are given in Table 4. From the data, COSCSO is ahead of SCSO and the other comparative algorithms on nine test functions. On F6, HHO ranks first and COSCSO second, still better than the other eight algorithms. The smallest standard deviations on F1, F5, and F7 indicate that COSCSO is more steady on these test functions. The overall ranking is COSCSO > HHO > SCSO > PSO > WOA > ATOA > CHOA > AOA > RSA > NCHHO. The average rank of COSCSO is 1.1, first overall, and the average rank of HHO is 2.8, second overall, which shows that COSCSO consistently places first among all algorithms.
In addition, Table 4 lists the p-values for each algorithm, from which it can be seen that COSCSO as a whole outperforms all compared algorithms; against WOA, RSA, PSO, CHOA, AOA, NCHHO, and ATOA in particular, COSCSO is far ahead. For HHO and SCSO, there is no major difference on a few test functions. This reveals that COSCSO is extremely effective for solving the 30-dimensional CEC2020 problems.
Figure 9 presents the convergence curves of COSCSO and the other algorithms on the 30-dimensional CEC2020 test suite. Together with the data in the table, it visually illustrates that COSCSO converges faster and more accurately on F1, F2, F5, F7, and F8; it is poorer than HHO on F6.
Figure 10 displays the box plots of COSCSO and the other algorithms on the 30-dimensional CEC2020 test functions. The COSCSO algorithm has the smallest median on F1, F2, F5, F7, and F8 compared to the other nine algorithms. In the plots of F1, F5, F7, F8, and F10, the box of COSCSO is narrower, suggesting that the algorithm is more stable and has relatively good robustness on these functions.
Figure 11 presents the radar maps based on the ranking of the COSCSO with the other nine algorithms in the 30-dimensional CEC2020 test suite. Depending on the area of the radar maps, it is easy to see that COSCSO ranks at the top in all functions, which very intuitively shows the superiority of COSCSO and its applicability in solving the 30-dimensional CEC2020 problem.
Table 5 contains the experimental data for each algorithm and each metric on the 50-dimensional CEC2020 test functions. In this experiment, COSCSO achieved better fitness values on eight test functions. Although inferior to the original algorithm on F2 and F3, the COSCSO algorithm performed competitively compared to the other eight algorithms. The third row from the bottom gives the average rank of the ten algorithms; COSCSO has an average rank of 1.4, ranking first. The combined ranking of the algorithms is: COSCSO > SCSO > HHO > PSO > WOA > ATOA > CHOA > RSA > AOA > NCHHO. This fully reflects the ability of the COSCSO algorithm to solve the CEC2020 problems.
Rank-sum tests are also documented in Table 5. Similarly, with COSCSO as the benchmark, the other meta-heuristic algorithms were run 20 times on the 50-dimensional CEC2020 problems at the 95% significance level ($\alpha = 0.05$). Looking at the last row, COSCSO clearly excelled over SCSO on six test functions; moreover, COSCSO outperformed the other algorithms on most test functions.
The convergence plots of each function in Figure 12 show COSCSO’s performance in solving the CEC2020 problems more directly. COSCSO surpasses all other algorithms and ranks first except on F2, F3, and F4.
In Figure 13, the median is the same for all algorithms except the PSO algorithm on F4. The median of COSCSO is lower than the other algorithms except for F3, F6, and F9. The box plots of COSCSO on F1, F5, F7, and F10 are extremely narrow, indicating its good stability and robustness.
Figure 14 shows the radar maps of COSCSO with other algorithms. Observing the area of each graph, it can be detected that the shaded area of COSCSO is the smallest as well as relatively more rounded, which indicates that COSCSO has more stable and remarkable capability, and COSCSO can be deployed to solve the 50-dimensional CEC2020 problem.

6. Engineering Problems

This chapter tests the ability of COSCSO to solve practical problems [58]. In the following, ten algorithms are applied to six practical engineering problems: the welded beam, pressure vessel, gas transmission compressor, heat exchanger, tubular column, and piston lever design problems. In particular, the constrained problems are converted into unconstrained problems by means of penalty functions. In the comparison experiments, N = 30, T = 500, and the number of independent runs is set to 20.
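The paper does not specify the penalty form, so the following static quadratic-penalty wrapper is one plausible sketch (the value of rho and the g(x) <= 0 convention are assumptions):

```python
def penalized(objective, constraints, rho=1e6):
    """Static penalty wrapper: each g in `constraints` is feasible when g(x) <= 0."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + rho * violation
    return wrapped
```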

6.1. Welded Beam Design

The objective of this problem is to construct a welded beam [59] with minimal cost under constraints on the shear stress (η), bending stress (β), buckling load ($Q_C$), and end deflection (μ) of the beam. It takes the weld thickness h, the joint length l, the beam height t, and the beam thickness b as variables; the design schematic is shown in Figure 15. Let $K = [k_1, k_2, k_3, k_4] = [h, l, t, b]$; the mathematical model of this problem is given in Equation (18).
$$\min f(K) = 1.10471 k_1^2 k_2 + 0.04811 k_3 k_4 (14.0 + k_2), \tag{18}$$
Subject to:
$$\begin{aligned} y_1(K) &= \eta(K) - \eta_{\max} \le 0, \quad y_2(K) = \beta(K) - \beta_{\max} \le 0, \quad y_3(K) = \mu(K) - \mu_{\max} \le 0, \\ y_4(K) &= k_1 - k_4 \le 0, \quad y_5(K) = M - Q_C(K) \le 0, \quad y_6(K) = 0.125 - k_1 \le 0, \\ y_7(K) &= 1.10471 k_1^2 + 0.04811 k_3 k_4 (14.0 + k_2) - 5.0 \le 0, \end{aligned}$$
Variable range:
$$0.1 \le k_1 \le 2, \quad 0.1 \le k_2 \le 10, \quad 0.1 \le k_3 \le 10, \quad 0.1 \le k_4 \le 2,$$
where
$$\begin{aligned} \eta(K) &= \sqrt{(\eta')^2 + 2\eta'\eta''\frac{k_2}{2R} + (\eta'')^2}, \quad \eta' = \frac{M}{\sqrt{2} k_1 k_2}, \quad \eta'' = \frac{W R}{J}, \\ W &= M\left(S + \frac{k_2}{2}\right), \quad R = \sqrt{\frac{k_2^2}{4} + \left(\frac{k_1 + k_3}{2}\right)^2}, \quad J = 2\left\{\sqrt{2} k_1 k_2 \left[\frac{k_2^2}{12} + \left(\frac{k_1 + k_3}{2}\right)^2\right]\right\}, \\ \beta(K) &= \frac{6 M S}{k_4 k_3^2}, \quad \mu(K) = \frac{4 M S^3}{D k_3^3 k_4}, \quad Q_C(K) = \frac{4.013 D \sqrt{k_3^2 k_4^6 / 36}}{S^2}\left(1 - \frac{k_3}{2S}\sqrt{\frac{D}{4G}}\right), \\ M &= 6000 \text{ lb}, \quad S = 14 \text{ in}, \quad \mu_{\max} = 0.25 \text{ in}, \quad D = 30 \times 10^6 \text{ psi}, \\ G &= 12 \times 10^6 \text{ psi}, \quad \eta_{\max} = 13{,}600 \text{ psi}, \quad \beta_{\max} = 30{,}000 \text{ psi}. \end{aligned}$$
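As a usage example of the penalty wrapper above, the welded beam objective of Equation (18) can be coded directly; only the two purely geometric constraints are shown for brevity, so the stress, deflection, and buckling terms would still need to be added:

```python
def welded_beam_cost(k):
    """Objective of Eq. (18); k = [h, l, t, b]."""
    return 1.10471 * k[0]**2 * k[1] + 0.04811 * k[2] * k[3] * (14.0 + k[1])

geometric_constraints = [
    lambda k: k[0] - k[3],        # y4: weld thickness must not exceed bar thickness
    lambda k: 0.125 - k[0],       # y6: minimum weld thickness
]
f = penalized(welded_beam_cost, geometric_constraints)
```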
Ten competitive meta-heuristic algorithms are used to solve this problem: COSCSO, SCSO, WOA, AO [60], SCA [61], RSA, HS, BWO, HHO, and AOA. The optimal cost obtained by each algorithm and the corresponding decision variables are given in Table 6. It is apparent from the table that COSCSO achieves the lowest cost. Table 7 shows the statistical results of all algorithms over 20 runs; COSCSO obtains the best ranking on all indicators. In conclusion, COSCSO is highly competitive in solving the welded beam design problem.

6.2. Pressure Vessel Design

The main purpose of this problem is to fabricate a pressure vessel [62] at the least cost under a host of constraints. It treats the shell thickness T1, head thickness T2, inner radius R, and the length S of the cylindrical section without the head as variables; let $K = [k_1, k_2, k_3, k_4] = [T_1, T_2, R, S]$. The design schematic is presented in Figure 16. The mathematical model of the problem is shown in Equation (19).
$$\min f(K) = 0.6224 k_1 k_3 k_4 + 1.7781 k_2 k_3^2 + 3.1661 k_1^2 k_4 + 19.84 k_1^2 k_3, \tag{19}$$
Subject to:
$$\begin{aligned} y_1(K) &= -k_1 + 0.0193 k_3 \le 0, \quad y_2(K) = -k_2 + 0.00954 k_3 \le 0, \\ y_3(K) &= -\pi k_3^2 k_4 - \frac{4}{3}\pi k_3^3 + 1{,}296{,}000 \le 0, \quad y_4(K) = k_4 - 240 \le 0, \end{aligned}$$
Variable range:
$$1 \times 0.0625 \le k_1, k_2 \le 99 \times 0.0625, \quad 10 \le k_3, k_4 \le 200.$$
This problem is solved by ten algorithms: COSCSO, SCSO, WOA, AO, HS, RSA, SCA, BWO, BSA [63], and AOA. Table 8 contains the optimal costs of COSCSO and the compared algorithms together with the corresponding decision variables. Four further statistics for each algorithm are included in Table 9. The result of COSCSO is the best among the ten algorithms and is relatively stable.

6.3. Gas Transmission Compressor Design Problem

The key target of this problem [64] is to minimize the total cost of transmitting 100 million cubic feet of gas per day. There are three design variables: the distance between the two compressors (L), the pressure ratio between the first and second compressors (δ), and the inside diameter of the natural gas pipeline (H). The gas transmission compressor is shown in Figure 17. Let $K = [k_1, k_2, k_3] = [L, \delta, H]$. The model is given in Equation (20).
$$\min f(K) = 8.61 \times 10^5 \, k_1^{\frac{1}{2}} k_2 \left(k_2^2 - 1\right)^{-\frac{1}{2}} k_3^{-\frac{2}{3}} + 3.69 \times 10^4 \, k_3 + 7.72 \times 10^8 \, k_1^{-1} k_2^{0.219} - 765.43 \times 10^6 \, k_1^{-1}, \tag{20}$$
Variable range:
$$10 \le k_1 \le 55, \quad 1.1 \le k_2 \le 2, \quad 10 \le k_3 \le 40.$$
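Because Equation (20) was reconstructed from a garbled source, the coefficients in the sketch below should be checked against [64] before reuse:

```python
def gas_compressor_cost(k):
    """Objective of Eq. (20); k = [L, delta, H] with delta > 1."""
    return (8.61e5 * k[0]**0.5 * k[1] * (k[1]**2 - 1)**-0.5 * k[2]**(-2.0 / 3.0)
            + 3.69e4 * k[2]
            + 7.72e8 * k[1]**0.219 / k[0]
            - 765.43e6 / k[0])
```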
In addition to SCSO, we pick RSA, BWO, SOA [65], WOA, SCA, HS, AO, and AOA to compare with COSCSO. The best results of the different algorithms and the corresponding decision variables are summarized in Table 10. The best result of COSCSO is substantially smaller than those of the other algorithms. The statistical results of all algorithms are collected in Table 11; COSCSO’s standard deviation is the smallest, indicating high stability.

6.4. Heat Exchanger Design

This is a minimization problem for heat exchanger design [66] with eight variables and six inequality constraints. It is specified in Equation (21).
$$\min f(K) = k_1 + k_2 + k_3, \tag{21}$$
Subject to:
$$\begin{aligned} y_1(K) &= 0.0025(k_4 + k_6) - 1 \le 0, \quad y_2(K) = 0.0025(k_5 + k_7 - k_4) - 1 \le 0, \\ y_3(K) &= 0.01(k_8 - k_5) - 1 \le 0, \quad y_4(K) = k_1 k_6 - 833.33252 k_4 - 100 k_1 + 83{,}333.333 \ge 0, \\ y_5(K) &= k_2 k_7 - 1250 k_5 - k_2 k_4 + 1250 k_4 \ge 0, \quad y_6(K) = k_3 k_8 - k_3 k_5 + 2500 k_5 - 1{,}250{,}000 \ge 0, \end{aligned}$$
Variable range:
$$100 \le k_1 \le 10{,}000, \quad 1000 \le k_2, k_3 \le 10{,}000, \quad 10 \le k_i \le 1000 \ (i = 4, \ldots, 8).$$
For this problem, nine algorithms, including WOA and HHO, are compared with COSCSO. Table 12 lists the best results of COSCSO and the other algorithms and the corresponding best decision variables. The statistics of each algorithm are given in Table 13. Clearly, the COSCSO algorithm obtains better results and is very competitive among the ten algorithms.

6.5. Tubular Column Design

The goal of this problem is to minimize the cost of designing a tubular column [67] to bear a compressive load under six constraints. It contains two decision variables: the average diameter of the column (D) and the thickness of the tube (b); let $K = [k_1, k_2] = [D, b]$. The design schematic is depicted in Figure 18. The model of this problem is given in Equation (22).
$$\min f(K) = 9.8 k_1 k_2 + 2 k_1, \tag{22}$$
Subject to:
$$\begin{aligned} y_1(K) &= \frac{Q}{\pi k_1 k_2 \delta_y} - 1 \le 0, \quad y_2(K) = \frac{8 Q H^2}{\pi^3 E k_1 k_2 (k_1^2 + k_2^2)} - 1 \le 0, \\ y_3(K) &= \frac{2.0}{k_1} - 1 \le 0, \quad y_4(K) = \frac{k_1}{14} - 1 \le 0, \\ y_5(K) &= \frac{0.2}{k_2} - 1 \le 0, \quad y_6(K) = \frac{k_2}{8} - 1 \le 0, \end{aligned}$$
Variable range:
$$2 \le k_1 \le 14, \quad 0.2 \le k_2 \le 0.8,$$
where
$$\delta_y = 500, \quad E = 0.84 \times 10^6.$$
Table 14 shows the optimal costs and variables for COSCSO and the other nine algorithms. Observing the four indicators in Table 15, COSCSO obtained better values for all of them.

6.6. Piston Lever Design

The primary goal of this problem [68] is to minimize the volume of oil consumed when the piston lever is tilted from 0° to 45° under four constraints, thereby determining the geometry parameters H, B, D, and X, collected as $K = [k_1, k_2, k_3, k_4]$. The schematic is seen in Figure 19. The mathematical expression of the problem is Equation (23).
$$\min f(K) = \frac{1}{4}\pi k_3^2 (A_2 - A_1), \tag{23}$$
Subject to:
$$\begin{aligned} y_1(K) &= M A \cos\theta - R F \le 0 \quad (\theta = 45°), \\ y_2(K) &= M (A - k_4) - N_{\max} \le 0, \\ y_3(K) &= 1.2 (A_2 - A_1) - A_1 \le 0, \\ y_4(K) &= k_3 / 3 - k_2 \le 0, \end{aligned}$$
where
$$\begin{aligned} R &= \frac{\left| -k_4 (k_4 \sin\theta + k_1) + k_1 (k_2 - k_4 \cos\theta) \right|}{\sqrt{(k_4 - k_2)^2 + k_1^2}}, \quad F = \frac{\pi C k_3^2}{4}, \\ A_1 &= \sqrt{(k_4 - k_2)^2 + k_1^2}, \quad A_2 = \sqrt{(k_4 \sin 45° + k_1)^2 + (k_2 - k_4 \cos 45°)^2}, \\ M &= 10{,}000 \text{ lbs}, \quad C = 1500 \text{ psi}, \quad A = 240 \text{ in}, \quad N_{\max} = 1.8 \times 10^6 \text{ lbs·in}. \end{aligned}$$
Besides COSCSO and SCSO, algorithms including SOA, MVO [69], and HHO were also included in the experiment. From Table 16 and Table 17, COSCSO is the best choice among these ten algorithms for solving this problem.

7. Conclusions and Future Work

In this paper, an SCSO variant based on adaptive parameters, Cauchy mutation, and an optimal neighborhood disturbance strategy is proposed. The nonlinear adaptive parameter replaces the linear parameter and strengthens the global search, which helps prevent premature convergence and puts exploration and exploitation in a more balanced state. The Cauchy mutation operator perturbs the search step to speed up convergence and improve search efficiency. The optimal neighborhood disturbance strategy enriches population diversity and prevents the algorithm from getting stuck in local optima. COSCSO was evaluated against the standard SCSO and other challenging swarm intelligence optimization algorithms on CEC2017 and CEC2020 in distinct dimensions, comparing averages and standard deviations, convergence, stability, and statistical tests. The results prove that COSCSO converges more rapidly, with higher accuracy, and stays more stable; in contrast to the other algorithms, COSCSO is more advanced. What is more, COSCSO was deployed to solve six engineering problems, and the experimental results show that it also has the potential to solve practical problems.
The COSCSO algorithm has strong exploration ability, which effectively avoids falling into local optima and prevents premature convergence. However, its exploitation ability is weaker and its convergence speed relatively slow. In the future, more novel strategies can be used to improve the algorithm and further raise its convergence speed, so that it can tackle more high-dimensional optimization problems and be employed in various fields, such as feature selection, path planning, image segmentation, and fuzzy recognition.

Author Contributions

Conceptualization, X.W.; Methodology, Q.L.; Software, Q.L. and L.Z.; Formal analysis, L.Z.; Investigation, X.W.; Resources, Q.L.; Writing—original draft, L.Z.; Funding acquisition, X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Natural Science Basic Research Plan in Shaanxi Province of China (2023-JC-YB-023, 2021JM-320).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The figures used to support the findings of this study are included in the article.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Sadollah, A.; Sayyaadi, H.; Yadav, A. A dynamic metaheuristic optimization model inspired by biological nervous systems: Neural network algorithm. Appl. Soft Comput. 2018, 71, 747–782. [Google Scholar] [CrossRef]
  2. Qin, F.; Zain, A.M.; Zhou, K.-Q. Harmony search algorithm and related variants: A systematic review. Swarm Evol. Comput. 2022, 74, 101126. [Google Scholar] [CrossRef]
  3. Abualigah, L.; Diabat, A.; Geem, Z.W. A Comprehensive Survey of the Harmony Search Algorithm in Clustering Applications. Appl. Sci. 2020, 10, 3827. [Google Scholar] [CrossRef]
  4. Rajeev, S.; Krishnamoorthy, C.S. Discrete optimization of structures using genetic algorithms. J. Struct. Eng. 1992, 118, 1233–1250. [Google Scholar] [CrossRef]
  5. Storn, R.; Price, K. Differential evolution–A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  6. Houssein, E.H.; Rezk, H.; Fathy, A.; Mahdy, M.A.; Nassef, A.M. A modified adaptive guided differential evolution algorithm applied to engineering applications. Eng. Appl. Artif. Intell. 2022, 113, 104920. [Google Scholar] [CrossRef]
  7. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667. [Google Scholar]
  8. Priya, R.D.; Sivaraj, R.; Anitha, N.; Devisurya, V. Tri-staged feature selection in multi-class heterogeneous datasets using memetic algorithm and cuckoo search optimization. Expert Syst. Appl. 2022, 209, 118286. [Google Scholar] [CrossRef]
  9. Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization-Artificial ants as a computational intelligence technique. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
  10. Zhao, D.; Lei, L.; Yu, F.; Heidari, A.A.; Wang, M.; Oliva, D.; Muhammad, K.; Chen, H. Ant colony optimization with horizontal and vertical crossover search: Fundamental visions for multi-threshold image segmentation. Expert. Syst. Appl. 2021, 167, 114122. [Google Scholar] [CrossRef]
  11. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  12. Ma, M.; Wu, J.; Shi, Y.; Yue, L.; Yang, C.; Chen, X. Chaotic Random Opposition-Based Learning and Cauchy Mutation Improved Moth-Flame Optimization Algorithm for Intelligent Route Planning of Multiple UAVs. IEEE Access 2022, 10, 49385–49397. [Google Scholar] [CrossRef]
  13. Yu, X.; Wu, X. Ensemble grey wolf Optimizer and its application for image segmentation. Expert Syst. Appl. 2022, 209, 118267. [Google Scholar] [CrossRef]
  14. Zhao, W.; Zhang, Z.; Wang, L. Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications. Eng. Appl. Artif. Intell. 2020, 87, 103300. [Google Scholar] [CrossRef]
  15. Zhao, W.; Wang, L.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  16. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf Mongoose Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  17. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958. [Google Scholar] [CrossRef]
  18. Houssein, E.H.; Saad, M.R.; Ali, A.A.; Shaban, H. An efficient multi-objective gorilla troops optimizer for minimizing energy consumption of large-scale wireless sensor networks. Expert Syst. Appl. 2023, 212, 118827. [Google Scholar] [CrossRef]
  19. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075. [Google Scholar] [CrossRef]
  20. Beşkirli, M. A novel Invasive Weed Optimization with levy flight for optimization problems: The case of forecasting energy demand. Energy Rep. 2022, 8 (Suppl. S1), 1102–1111. [Google Scholar] [CrossRef]
  21. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  22. Jiang, Q.; Wang, L.; Lin, Y.; Hei, X.; Yu, G.; Lu, X. An efficient multi-objective artificial raindrop algorithm and its application to dynamic optimization problems in chemical processes. Appl. Soft Comput. 2017, 58, 354–377. [Google Scholar] [CrossRef]
  23. Houssein, E.H.; Hosney, M.E.; Mohamed, W.M.; Ali, A.A.; Younis, E.M.G. Fuzzy-based hunger games search algorithm for global optimization and feature selection using medical data. Neural Comput. Appl. 2022, 35, 5251–5275. [Google Scholar] [CrossRef] [PubMed]
  24. Houssein, E.H.; Oliva, D.; Çelik, E.; Emam, M.M.; Ghoniem, R.M. Boosted sooty tern optimization algorithm for global optimization and feature selection. Expert Syst. Appl. 2023, 113, 119015. [Google Scholar] [CrossRef]
  25. Hu, G.; Du, B.; Wang, X.; Wei, G. An enhanced black widow optimization algorithm for feature selection. Knowl.-Based Syst. 2022, 235, 107638. [Google Scholar] [CrossRef]
  26. Abualigah, L.; Almotairi, K.H.; Elaziz, M.A. Multilevel thresholding image segmentation using meta-heuristic optimization algorithms: Comparative analysis, open challenges and new trends. Appl. Intell. 2022, 52, 1–51. [Google Scholar] [CrossRef]
  27. Houssein, E.H.; Hussain, K.; Abualigah, L.; Elaziz, M.A.; Alomoush, W.; Dhiman, G.; Djenouri, Y.; Cuevas, E. An improved opposition-based marine predators algorithm for global optimization and multilevel thresholding image segmentation. Knowl.-Based Syst. 2021, 229, 107348. [Google Scholar] [CrossRef]
  28. Sharma, P.; Dinkar, S.K. A Linearly Adaptive Sine–Cosine Algorithm with Application in Deep Neural Network for Feature Optimization in Arrhythmia Classification using ECG Signals. Knowl. Based Syst. 2022, 242, 108411. [Google Scholar] [CrossRef]
  29. Guo, Y.; Tian, X.; Fang, G.; Xu, Y.-P. Many-objective optimization with improved shuffled frog leaping algorithm for inter-basin water transfers. Adv. Water Resour. 2020, 138, 103531. [Google Scholar] [CrossRef]
  30. Das, P.K.; Behera, H.S.; Panigrahi, B.K. A hybridization of an improved particle swarm optimization and gravitational search algorithm for multi-robot path planning. Swarm Evol. Comput. 2016, 28, 14–28. [Google Scholar] [CrossRef]
  31. Yu, X.; Jiang, N.; Wang, X.; Li, M. A hybrid algorithm based on grey wolf optimizer and differential evolution for UAV path planning. Expert Syst. Appl. 2022, 215, 119327. [Google Scholar] [CrossRef]
  32. Gao, D.; Wang, G.-G.; Pedrycz, W. Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism. IEEE Trans. Fuzzy Syst. 2020, 28, 3265–3275. [Google Scholar] [CrossRef]
  33. Dong, Z.R.; Bian, X.Y.; Zhao, S. Ship pipe route design using improved multi-objective ant colony optimization. Ocean. Eng. 2022, 258, 111789. [Google Scholar] [CrossRef]
  34. Hu, G.; Wang, J.; Li, M.; Hussien, A.G.; Abbas, M. EJS: Multi-strategy enhanced jellyfish search algorithm for engineering applications. Mathematics 2023, 11, 851. [Google Scholar] [CrossRef]
  35. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  36. Seyyedabbasi, A.; Kiani, F. Sand cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2022, 38, 1–15. [Google Scholar] [CrossRef]
  37. Aghaei, V.T.; SeyyedAbbasi, A.; Rasheed, J.; Abu-Mahfouz, A.M. Sand cat swarm optimization-based feedback controller design for nonlinear systems. Heliyon 2023, 9, e13885. [Google Scholar] [CrossRef]
  38. Li, Y.; Wang, G. Sand cat swarm optimization based on stochastic variation with elite collaboration. IEEE Access 2022, 10, 89989–90003. [Google Scholar] [CrossRef]
  39. Seyyedabbasi, A. A reinforcement learning-based metaheuristic algorithm for solving global optimization problems. Adv. Eng. Softw. 2023, 178, 103411. [Google Scholar] [CrossRef]
  40. Lu, W.; Shi, C.; Fu, H.; Xu, Y. A Power Transformer Fault Diagnosis Method Based on Improved Sand Cat Swarm Optimization Algorithm and Bidirectional Gated Recurrent Unit. Electronics 2023, 12, 672. [Google Scholar] [CrossRef]
  41. Zhao, X.; Fang, Y.; Liu, L.; Xu, M.; Li, Q. A covariance-based Moth–flame optimization algorithm with Cauchy mutation for solving numerical optimization problems. Soft Comput. 2022, 119, 108538. [Google Scholar] [CrossRef]
  42. Wang, W.C.; Xu, L.; Chau, K.W.; Xu, D.M. Yin-Yang firefly algorithm based on dimensionally Cauchy mutation. Expert Syst. Appl. 2020, 150, 113216. [Google Scholar] [CrossRef]
  43. Ou, X.; Wu, M.; Pu, Y.; Tu, B.; Zhang, G.; Xu, Z. Cuckoo search algorithm with fuzzy logic and Gauss–Cauchy for minimizing localization error of WSN. Soft Comput. 2022, 125, 109211. [Google Scholar] [CrossRef]
  44. Hu, G.; Zhong, J.; Du, B.; Wei, G. An enhanced hybrid arithmetic optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 394, 114901. [Google Scholar] [CrossRef]
  45. Hussain, K.; Salleh, M.N.M.; Cheng, S.; Shi, Y. On the exploration and exploitation in popular swarm-based metaheuristic algorithms. Neural Comput. Appl. 2019, 31, 7665–7683. [Google Scholar] [CrossRef]
  46. Awad, N.; Ali, M.; Liang, J.; Qu, B.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization. Tech. Rep. 2016. [Google Scholar]
  47. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  48. Zhong, C.; Li, G.; Meng, Z. Beluga whale optimization: A novel nature-inspired metaheuristic algorithm. Knowl. Based Syst. 2022, 251, 109215. [Google Scholar] [CrossRef]
  49. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  50. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  51. Abualigah, L.; Diabat, A.; Svetinovic, D.; Elaziz, M.A. Boosted Harris Hawks gravitational force algorithm for global optimization and industrial engineering problems. J. Intell. Manuf. 2022, 34, 1–13. [Google Scholar] [CrossRef]
  52. Dehkordi, A.A.; Sadiq, A.S.; Mirjalili, S.; Ghafoor, K.Z. Nonlinear-based Chaotic Harris Hawks Optimizer: Algorithm and Internet of Vehicles application. Appl. Soft Comput. 2021, 109, 107574. [Google Scholar] [CrossRef]
  53. Arun Mozhi Devan, P.; Hussin, F.A.; Ibrahim, R.B.; Bingi, K.; Nagarajapandian, M.; Assaad, M. An Arithmetic-Trigonometric Optimization Algorithm with Application for Control of Real-Time Pressure Process Plant. Sensors 2022, 22, 617. [Google Scholar] [CrossRef] [PubMed]
  54. Hu, G.; Zhu, X.; Wei, G.; Chang, C.T. An improved marine predators algorithm for shape optimization of developable Ball surfaces. Eng. Appl. Artif. Intell. 2021, 105, 104417. [Google Scholar] [CrossRef]
  55. Mohamed, A.W.; Hadi, A.A.; Awad, N.H. Evaluating the performance of adaptive Gaining Sharing knowledge based algorithm on CEC 2020 benchmark problems. In Proceedings of the 2020 IEEE Congress on Evolutionary Computation (CEC), Glasgow, UK, 19–24 July 2020. [Google Scholar]
  56. Liang, J.; Qu, B.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization. Tech. Rep.; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013. [Google Scholar]
  57. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  58. Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl. Based Syst. 2018, 163, 283–304. [Google Scholar] [CrossRef]
  59. Zhao, W.; Wang, L.; Zhang, Z. Artificial ecosystem-based optimization: A novel nature-inspired meta-heuristic algorithm. Neural Comput. Appl. 2020, 32, 9383–9425. [Google Scholar] [CrossRef]
  60. Abualigah, L.; Yousri, D.; Elaziz, M.A.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  61. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  62. Seyyedabbasi, A. WOASCALF: A new hybrid whale optimization algorithm based on sine cosine algorithm and levy flight to solve global optimization problems. Adv. Eng. Softw. 2022, 173, 103272. [Google Scholar] [CrossRef]
  63. Wang, L.; Wang, H.; Han, X.; Zhou, W. A novel adaptive density-based spatial clustering of application with noise based on bird swarm optimization algorithm. Comput. Commun. 2021, 174, 205–214. [Google Scholar] [CrossRef]
  64. Kumar, N.; Mahato, S.K.; Bhunia, A.K. Design of an efficient hybridized CS-PSO algorithm and its applications for solving constrained and bound constrained structural engineering design problems. Results Control. Optim. 2021, 5, 100064. [Google Scholar] [CrossRef]
  65. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl. Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  66. Jaberipour, M.; Khorram, E. Two improved harmony search algorithms for solving engineering optimization problems. Commun. Nonlinear Sci. Numer. Simul. 2010, 15, 3316–3331. [Google Scholar] [CrossRef]
  67. Hu, G.; Yang, R.; Qin, X.; Wei, G. MCSA: Multi-strategy boosted chameleon-inspired optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 403, 115676. [Google Scholar] [CrossRef]
  68. Ong, K.M.; Ong, P.; Sia, C.K. A new flower pollination algorithm with improved convergence and its application to engineering optimization. Decis. Anal. J. 2022, 5, 100144. [Google Scholar] [CrossRef]
  69. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-Verse Optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513. [Google Scholar] [CrossRef]
Figure 1. Sand cat capturing prey diagram.
Figure 2. The curve of the variation of parameter Se.
Figure 3. Curve of the probability density function of the Cauchy distribution.
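For reference, the density plotted in Figure 3 is that of the Cauchy distribution with location parameter x0 and scale parameter γ; its heavy tails are what allow the Cauchy mutation operator to generate occasional long jumps in the search step:

```latex
f(x;\,x_0,\gamma)=\frac{1}{\pi\gamma\left[1+\left(\frac{x-x_0}{\gamma}\right)^{2}\right]},
\qquad
f(x;\,0,1)=\frac{1}{\pi\,(1+x^{2})}.
```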
Figure 4. Flow chart of the COSCSO algorithm.
Figure 5. Diagram of COSCSO exploration and exploitation.
Figure 6. Convergence curves of COSCSO with other algorithms (CEC2017).
Figure 7. Box plots of COSCSO with other algorithms (CEC2017).
Figure 8. Radar maps of COSCSO with other algorithms (CEC2017).
Figure 9. Convergence curves of COSCSO with other algorithms (30-dimensional CEC2020).
Figure 10. Box plots of COSCSO with other algorithms (30-dimensional CEC2020).
Figure 11. Radar maps of COSCSO with other algorithms (30-dimensional CEC2020).
Figure 12. Convergence curves of COSCSO with other algorithms (50-dimensional CEC2020).
Figure 13. Box plots of COSCSO with other algorithms (50-dimensional CEC2020).
Figure 14. Radar maps of COSCSO with other algorithms (50-dimensional CEC2020).
Figure 15. Welded beam design problem.
Figure 16. Pressure vessel design problem.
Figure 17. Gas transmission compressor design problem.
Figure 18. Tubular column design problem.
Figure 19. Piston lever design.
Table 1. Pseudo-code of the COSCSO algorithm.
Algorithm: The COSCSO algorithm
Initialize individuals Xi (i = 1, 2, …, N)
Calculate the fitness values of all individuals
1: While (t < T)
2:   Update the parameters Se, re, and Re
3:   For each individual
4:     Get a random angle θ based on roulette wheel selection (0° ≤ θ ≤ 360°)
5:     If (|Re| ≤ 1)
6:       Update the individual position according to Equation (15)
7:     Else
8:       Update the individual position according to Equation (6)
9:     End
10:    Calculate the fitness values of the individuals and produce Xbest(t)
11:    Produce X*best(t) according to Equation (16)
12:    Calculate its fitness and update Xbest(t)
13:  End
14:  t = t + 1
15: End
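To make the control flow of Table 1 concrete, the sketch below walks through the same loop in Python. Equations (6), (15), and (16) are not reproduced in this back matter, so the position-update and disturbance rules shown here are illustrative stand-ins (standard SCSO-style moves combined with the Cauchy mutation and optimal neighborhood disturbance ideas described above), not the exact formulas of the paper.

```python
import numpy as np

def coscso_sketch(f, lb, ub, n=30, T=500, seed=0):
    # Illustrative stand-in for the COSCSO loop of Table 1; the update rules
    # marked as Eq. (6)/(15)/(16) stand-ins are assumptions, not the paper's formulas.
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = lb + rng.random((n, dim)) * (ub - lb)       # initialize individuals
    best = min(X, key=f).copy()                     # current best sand cat
    for t in range(T):
        Se = 2.0 * (1.0 - t / T) ** 2               # assumed nonlinear adaptive decay of Se
        for i in range(n):
            re = Se * rng.random()                  # step-size parameter re
            Re = (2.0 * rng.random() - 1.0) * Se    # transition parameter Re
            theta = rng.random() * 2.0 * np.pi      # random angle (roulette wheel simplified)
            if abs(Re) <= 1.0:
                # Exploitation (Eq. (15) stand-in): approach the best position
                # with a Cauchy-perturbed step.
                dist = np.abs(rng.random(dim) * best - X[i])
                X[i] = best - re * dist * np.cos(theta) * (1.0 + rng.standard_cauchy(dim))
            else:
                # Exploration (Eq. (6) stand-in): search relative to a random mate.
                X[i] = re * (X[rng.integers(n)] - rng.random(dim) * X[i])
            X[i] = np.clip(X[i], lb, ub)
            if f(X[i]) < f(best):
                best = X[i].copy()
        # Optimal neighborhood disturbance (Eq. (16) stand-in): perturb the
        # best solution and keep the perturbation only if it improves fitness.
        cand = np.clip(best + 0.05 * (ub - lb) * rng.standard_normal(dim), lb, ub)
        if f(cand) < f(best):
            best = cand
    return best, f(best)

# Example usage on the 10-dimensional sphere function:
xb, fb = coscso_sketch(lambda x: float(np.sum(x * x)),
                       lb=-100 * np.ones(10), ub=100 * np.ones(10))
```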
Table 2. Parameter settings of the compared algorithms.
Algorithm | Parameter | Value
PSO | Self-learning factor o1 | 0.5
PSO | Group learning factor o2 | 0.5
PSO | Inertia weight ω | 0.8
RSA | Sensitive parameter α | 0.1
RSA | Control parameter β | 0.05
BWO | Balance factor Bf | (0, 1)
DO | Adaptive parameter α | [0, 1]
AOA | Control parameter σ | 0.499
AOA | Sensitive parameter v | 0.5
HHO | Initial energy E0 | [−1, 1]
ATOA | Sensitive parameter α | 5
NCHHO | Control parameter c | [0, 2]
NCHHO | Control parameter a1 | 4
WOA | Control parameter m | Linearly decreases from 2 to 0
WOA | Constant n | 1
CHOA | Parameter f | Linearly decreases from 2 to 0
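One convenient way to keep these comparison settings reproducible is to collect them in a single mapping. The encoding below is a hypothetical convenience, not part of the original experiments; tuples mark intervals a parameter is drawn from, and strings mark scheduled parameters.

```python
# Hypothetical encoding of the Table 2 settings.
COMPARISON_PARAMS = {
    "PSO":   {"o1": 0.5, "o2": 0.5, "omega": 0.8},
    "RSA":   {"alpha": 0.1, "beta": 0.05},
    "BWO":   {"Bf": (0.0, 1.0)},          # balance factor drawn from (0, 1)
    "DO":    {"alpha": (0.0, 1.0)},       # drawn from [0, 1]
    "AOA":   {"sigma": 0.499, "v": 0.5},
    "HHO":   {"E0": (-1.0, 1.0)},         # drawn from [-1, 1]
    "ATOA":  {"alpha": 5},
    "NCHHO": {"c": (0.0, 2.0), "a1": 4},
    "WOA":   {"m": "linear 2 -> 0", "n": 1},
    "CHOA":  {"f": "linear 2 -> 0"},
}
```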
Table 3. Comparison results on functions of CEC2017 (bold indicates the best value).
F | Results | PSO | RSA | BWO | DO | AOA | HHO | NCHHO | ATOA | SCSO | COSCSO
F1Mean8.08E+094.64E+104.76E+101.67E+094.86E+101.55E+074.37E+102.20E+103.88E+093.48E+05
Std6.54E+096.30E+095.54E+099.53E+081.10E+103.75E+069.77E+098.28E+091.74E+097.03E+05
Rank58931027641
p6.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-08
F3Mean5.84E+047.86E+047.69E+048.56E+047.68E+042.13E+048.96E+048.14E+044.37E+041.11E+04
Std2.57E+045.33E+034.27E+035.64E+037.18E+035.48E+034.00E+039.29E+037.06E+033.26E+03
Rank47596210831
p9.17E-086.80E-086.80E-086.80E-086.80E-081.38E-066.80E-086.80E-086.80E-08
F4Mean1.35E+038.84E+031.15E+046.91E+021.28E+045.41E+021.14E+042.43E+037.47E+025.05E+02
Std9.28E+022.22E+031.12E+021.02E+023.34E+032.74E+013.18E+031.40E+033.01E+021.96E+01
Rank57931028641
p6.80E-086.80E-086.80E-081.23E-076.80E-081.04E-046.80E-086.80E-082.56E-07
F5Mean6.99E+029.14E+029.11E+027.84E+028.92E+027.39E+028.93E+028.38E+027.32E+027.24E+02
Std4.04E+012.62E+012.12E+014.84E+012.80E+012.45E+014.40E+013.27E+015.03E+014.88E+01
Rank11095748632
p1.08E-016.80E-086.80E-089.21E-046.80E-081.33E-011.23E-073.94E-075.79E-01
F6Mean6.49E+026.87E+026.86E+026.68E+026.74E+026.64E+026.83E+026.65E+026.60E+026.59E+02
Std8.98E+006.15E+003.79E+004.16E+005.69E+005.90E+007.64E+009.24E+009.60E+008.97E+00
Rank11096748532
p1.79E-046.80E-086.80E-081.16E-047.95E-071.08E-017.90E-082.56E-029.46E-01
F7Mean1.15E+031.38E+031.36E+031.25E+031.33E+031.29E+031.35E+031.24E+031.13E+031.12E+03
Std1.58E+023.95E+014.66E+011.03E+026.59E+017.03E+017.41E+016.09E+019.85E+011.14E+02
Rank31095768421
p6.17E-016.80E-089.17E-083.38E-049.13E-072.60E-055.23E-073.05E-047.35E-01
F8Mean9.81E+021.13E+031.13E+031.01E+031.10E+039.72E+021.13E+031.09E+039.91E+029.71E+02
Std4.46E+012.16E+011.19E+012.50E+012.29E+012.76E+013.60E+013.31E+013.16E+013.56E+01
Rank39857210641
p5.43E-016.80E-086.80E-083.05E-046.80E-088.82E-016.80E-087.90E-086.79E-02
F9Mean5.62E+031.01E+041.06E+046.90E+036.95E+036.88E+038.88E+031.24E+045.58E+035.11E+03
Std2.29E+039.13E+026.65E+026.96E+021.12E+038.05E+021.54E+033.16E+038.87E+023.22E+02
Rank38956471021
p6.17E-016.80E-086.80E-086.80E-081.10E-057.95E-076.80E-086.80E-086.87E-04
F10Mean5.48E+038.25E+038.55E+035.75E+037.44E+035.71E+038.26E+037.40E+035.88E+035.33E+03
Std6.40E+023.65E+023.58E+024.41E+025.05E+025.50E+026.11E+025.96E+027.38E+028.03E+02
Rank28104739651
p5.25E-016.80E-086.80E-082.75E-022.22E-079.09E-029.17E-086.01E-072.75E-02
F11Mean1.55E+038.06E+036.60E+032.52E+037.76E+031.28E+039.79E+037.20E+031.73E+031.27E+03
Std1.88E+022.79E+036.51E+026.82E+022.19E+034.84E+012.43E+034.81E+033.10E+026.12E+01
Rank39658210741
p2.96E-076.80E-086.80E-086.80E-086.80E-086.95E-016.80E-086.80E-089.17E-08
F12Mean6.49E+081.38E+109.76E+098.43E+071.16E+101.98E+071.76E+077.35E+091.40E+086.89E+06
Std6.66E+083.54E+092.11E+097.05E+072.96E+091.48E+071.32E+072.93E+091.77E+088.68E+06
Rank61084932751
p1.66E-076.80E-086.80E-081.92E-076.80E-085.63E-046.80E-086.80E-081.06E-07
F13Mean4.21E+081.08E+105.55E+091.29E+067.67E+094.45E+054.30E+093.00E+081.44E+071.38E+05
Std9.20E+084.05E+091.62E+094.16E+064.65E+091.42E+052.08E+093.34E+082.67E+071.03E+05
Rank61083927541
p1.44E-046.80E-086.80E-083.97E-036.80E-081.05E-066.80E-086.80E-082.07E-02
F14Mean1.25E+057.35E+062.27E+061.58E+061.15E+062.85E+056.36E+061.67E+064.34E+054.44E+04
Std1.89E+059.53E+061.02E+069.04E+051.33E+062.30E+055.46E+061.57E+067.20E+053.31E+04
Rank21086539741
p1.23E-026.80E-086.80E-086.80E-082.36E-065.90E-056.80E-087.58E-061.35E-03
F15Mean7.03E+046.22E+082.15E+086.75E+042.31E+065.44E+041.27E+081.41E+042.29E+054.53E+04
Std4.11E+043.12E+081.08E+087.62E+041.02E+073.43E+042.27E+081.07E+045.94E+053.22E+04
Rank51094738162
p1.55E-026.80E-086.80E-083.23E-013.06E-033.79E-016.80E-089.75E-061.81E-01
F16Mean3.15E+035.64E+035.24E+033.50E+034.78E+033.17E+035.00E+033.54E+033.12E+033.08E+03
Std3.54E+021.47E+035.18E+024.37E+027.45E+023.86E+026.24E+023.41E+023.66E+023.36E+02
Rank31095748621
p3.79E-016.80E-086.80E-083.64E-031.06E-074.90E-016.80E-085.09E-045.08E-01
F17Mean2.53E+035.34E+033.74E+032.56E+033.31E+032.55E+033.08E+032.69E+032.47E+032.44E+03
Std2.74E+023.62E+033.82E+023.21E+025.48E+023.57E+025.03E+021.44E+022.82E+022.86E+02
Rank31095847621
p5.61E-016.80E-087.90E-082.98E-019.13E-074.41E-013.29E-052.14E-037.15E-01
F18Mean2.34E+063.35E+072.70E+076.72E+061.19E+071.86E+065.49E+074.21E+061.20E+069.94E+05
Std6.10E+062.73E+071.40E+075.32E+068.24E+061.76E+064.93E+073.08E+061.11E+061.01E+06
Rank49867310521
p9.03E-016.80E-086.80E-088.60E-061.92E-077.64E-022.96E-071.25E-052.85E-01
F19Mean1.62E+076.19E+083.07E+081.14E+061.79E+063.46E+052.20E+083.41E+044.23E+062.44E+05
Std4.56E+072.77E+081.30E+089.55E+057.77E+042.09E+052.70E+084.54E+049.10E+064.08E+05
Rank71094538162
p7.35E-016.80E-086.80E-083.75E-049.13E-079.05E-036.80E-087.58E-068.29E-05
F20Mean2.72E+032.95E+032.91E+032.99E+032.78E+032.72E+032.99E+032.89E+032.69E+032.66E+03
Std2.65E+021.31E+021.03E+022.56E+022.17E+021.96E+022.19E+021.91E+021.83E+021.62E+02
Rank48710539621
p4.90E-015.87E-062.69E-067.41E-056.79E-022.73E-012.92E-056.87E-047.35E-01
F21Mean2.49E+032.69E+032.71E+032.55E+032.65E+032.55E+032.70E+032.62E+032.51E+032.54E+03
Std4.39E+014.15E+013.47E+011.05E+024.79E+015.16E+015.23E+014.29E+013.48E+014.51E+01
Rank18105749623
p4.16E-046.80E-086.80E-082.18E-012.06E-069.89E-011.66E-077.41E-051.93E-02
F22Mean6.20E+038.35E+038.30E+035.30E+038.25E+035.74E+039.38E+037.63E+033.49E+033.02E+03
Std1.79E+031.06E+036.84E+022.65E+031.08E+032.06E+031.04E+032.21E+031.30E+031.77E+03
Rank58739410621
p1.41E-053.42E-071.66E-079.75E-063.42E-072.30E-051.43E-071.38E-061.29E-04
F23Mean2.97E+033.25E+033.28E+033.07E+033.51E+033.13E+033.67E+033.07E+032.93E+032.92E+03
Std8.40E+017.39E+016.25E+011.48E+021.45E+021.40E+021.39E+026.62E+014.15E+016.87E+01
Rank37859610421
p8.59E-026.80E-086.80E-082.47E-046.80E-083.07E-066.80E-081.05E-068.39E-01
F24Mean3.16E+033.47E+033.52E+033.19E+033.80E+033.11E+033.80E+033.28E+033.09E+033.08E+03
Std7.63E+011.59E+027.24E+017.65E+012.48E+021.84E+012.38E+026.80E+016.91E+016.44E+01
Rank47851039621
p4.32E-036.80E-086.80E-082.60E-056.80E-081.99E-016.80E-081.06E-076.95E-01
F25Mean3.17E+034.85E+034.31E+033.03E+035.28E+032.92E+034.60E+033.43E+033.09E+032.93E+03
Std3.42E+026.52E+021.63E+025.05E+018.73E+022.30E+014.43E+022.85E+027.83E+012.17E+01
Rank59731028641
p9.17E-086.80E-086.80E-086.80E-086.80E-086.55E-016.80E-086.80E-087.90E-08
F26Mean6.73E+039.68E+031.03E+046.96E+031.04E+047.56E+031.05E+048.28E+036.63E+036.52E+03
Std8.80E+028.20E+023.43E+021.76E+031.02E+031.30E+031.11E+031.13E+031.41E+031.86E+03
Rank37849510621
p9.46E-013.94E-076.80E-082.73E-011.06E-074.68E-021.06E-071.23E-039.89E-01
F27Mean3.33E+033.89E+033.90E+033.39E+034.39E+033.42E+034.53E+033.42E+033.36E+033.32E+03
Std7.58E+015.05E+021.34E+027.83E+013.28E+021.14E+025.01E+028.17E+017.70E+018.83E+01
Rank27859410631
p3.94E-013.42E-076.80E-082.22E-046.80E-081.61E-047.90E-083.71E-052.23E-02
F28Mean4.26E+035.89E+036.20E+033.47E+036.89E+033.29E+036.31E+034.23E+033.53E+033.26E+03
Std8.73E+029.02E+022.92E+025.79E+019.24E+021.77E+018.83E+024.62E+021.37E+022.50E+01
Rank67831029541
p6.80E-086.80E-086.80E-086.80E-086.80E-081.44E-046.80E-086.80E-086.80E-08
F29Mean4.63E+037.40E+036.51E+034.88E+037.12E+034.60E+036.63E+034.70E+034.57E+034.55E+03
Std4.96E+023.56E+035.43E+023.71E+021.06E+034.09E+029.81E+023.68E+023.61E+024.22E+02
Rank31076948521
p5.43E-011.06E-071.06E-078.35E-036.80E-085.25E-011.23E-072.08E-015.25E-01
F30Mean6.67E+072.76E+098.24E+081.11E+071.14E+093.33E+066.27E+085.55E+078.36E+071.95E+06
Std2.56E+089.66E+083.24E+085.53E+061.01E+092.02E+064.99E+085.30E+079.62E+061.41E+06
Rank51083927461
p3.97E-036.80E-086.80E-082.22E-076.80E-082.39E-026.80E-083.42E-071.58E-06
Mean rank | 3.6897 | 8.3793 | 8.4828 | 4.7931 | 7.8621 | 3.2759 | 8.3793 | 5.8276 | 3.3103 | 1.2414
Result | 4 | 8 | 9 | 5 | 7 | 2 | 8 | 6 | 3 | 1
+/=/− | 2/13/14 | 0/0/29 | 0/0/29 | 0/4/25 | 0/1/28 | 0/13/16 | 0/0/29 | 1/1/27 | 0/12/17
Table 4. Comparison results on functions of 30-dimensional CEC2020.
F | Results | WOA | RSA | PSO | CHOA | AOA | HHO | NCHHO | ATOA | SCSO | COSCSO
F1Mean5.27E+085.14E+106.81E+092.83E+104.80E+101.72E+074.31E+102.26E+104.65E+096.08E+05
Std3.25E+088.10E+095.60E+095.05E+097.81E+094.11E+068.72E+097.55E+093.17E+098.65E+05
Rank31057928641
p6.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-08
F2Mean6.46E+037.65E+035.48E+038.01E+037.20E+035.44E+038.44E+037.32E+036.15E+035.22E+03
Std8.01E+024.01E+026.19E+026.76E+025.69E+026.31E+026.41E+025.48E+027.64E+026.89E+02
Rank58396210741
p9.75E-066.80E-083.10E-011.06E-072.22E-073.65E-012.36E-061.43E-075.63E-04
F3Mean1.27E+031.38E+031.14E+031.26E+031.36E+031.27E+031.33E+031.26E+031.15E+031.14E+03
Std9.35E+012.98E+011.58E+021.83E+016.55E+018.55E+017.13E+016.16E+016.24E+011.01E+02
Rank71025968431
p4.60E-047.90E-082.85E-011.04E-047.90E-081.63E-035.23E-072.39E-027.35E-01
F4Mean1.90E+031.90E+031.95E+031.90E+031.90E+031.90E+031.90E+031.91E+031.90E+031.90E+03
Std0.00E+000.00E+003.04E+010.00E+000.00E+000.00E+000.00E+005.47E+000.00E+000.00E+00
Rank1131111211
pNaNNaN8.01E-09NaNNaNNaNNaN8.01E-09NaN
F5Mean9.80E+066.41E+074.72E+062.53E+077.13E+072.47E+068.03E+079.09E+062.96E+061.26E+06
Std6.58E+064.36E+076.21E+061.76E+074.66E+071.62E+064.61E+076.31E+063.64E+061.00E+06
Rank68479210531
p6.01E-076.80E-082.80E-036.80E-086.80E-081.14E-026.80E-086.80E-081.90E-01
F6Mean3.43E+034.49E+032.94E+033.43E+033.99E+032.59E+034.57E+032.78E+032.70E+032.67E+03
Std5.77E+026.81E+023.37E+023.51E+021.25E+033.41E+028.14E+022.95E+023.61E+023.09E+02
Rank79568110432
p5.17E-066.80E-081.93E-022.06E-063.99E-062.85E-016.80E-082.98E-019.89E-01
F7Mean5.60E+064.32E+078.30E+057.52E+062.25E+076.40E+053.22E+072.99E+062.44E+063.42E+05
Std3.92E+062.21E+076.16E+054.42E+061.57E+074.88E+052.16E+072.12E+063.40E+063.36E+05
Rank61037829541
p2.22E-076.80E-081.95E-031.66E-076.80E-089.79E-036.80E-082.96E-071.48E-03
F8Mean7.11E+038.74E+036.36E+038.19E+038.69E+035.61E+038.94E+038.83E+034.12E+033.26E+03
Std1.56E+038.15E+021.74E+031.32E+031.12E+032.29E+031.37E+031.00E+032.02E+032.00E+03
Rank58467310921
p7.58E-061.92E-071.60E-051.20E-063.94E-075.26E-053.42E-072.22E-073.75E-04
F9Mean3.24E+033.48E+033.12E+033.31E+033.78E+033.41E+033.91E+033.31E+033.09E+033.07E+03
Std8.48E+012.20E+021.56E+013.24E+011.75E+021.29E+022.66E+021.04E+025.90E+019.11E+01
Rank48369710521
p5.17E-066.80E-082.34E-031.66E-076.80E-081.43E-076.80E-087.95E-072.08E-01
F10Mean3.06E+034.72E+033.24E+034.54E+035.06E+032.94E+034.50E+033.56E+033.09E+032.93E+03
Std5.77E+015.67E+022.41E+024.58E+027.40E+022.29E+016.05E+023.13E+029.65E+012.41E+01
Rank39581027641
p6.80E-086.80E-086.80E-086.80E-086.80E-082.50E-016.80E-086.80E-087.90E-08
Mean rank | 4.7 | 8.1 | 3.7 | 6.2 | 7.6 | 2.8 | 8.3 | 5.3 | 3.0 | 1.1
Result | 5 | 9 | 4 | 7 | 8 | 2 | 10 | 6 | 3 | 1
+/=/− | 0/1/9 | 0/1/9 | 0/2/8 | 0/1/9 | 0/1/9 | 0/3/7 | 0/1/9 | 0/1/9 | 0/4/6
Table 5. Comparison results on functions of 50-dimensional CEC2020.
F | Results | WOA | RSA | PSO | CHOA | AOA | HHO | NCHHO | ATOA | SCSO | COSCSO
F1Mean3.63E+099.43E+103.50E+105.87E+101.13E+118.59E+079.44E+105.35E+101.69E+101.11E+07
Std1.31E+098.75E+091.62E+102.76E+096.18E+091.88E+079.67E+097.04E+096.50E+091.06E+07
Rank38571029641
p6.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-086.80E-08
F2Mean1.20E+041.44E+048.93E+031.51E+041.37E+049.90E+031.52E+041.42E+041.01E+049.15E+03
Std1.15E+035.84E+021.01E+036.70E+026.81E+021.09E+038.54E+027.02E+028.38E+028.92E+02
Rank58196310742
p3.42E-076.80E-083.94E-016.80E-086.80E-083.60E-026.80E-086.80E-083.97E-03
F3Mean1.80E+031.94E+032.28E+031.77E+031.92E+031.82E+031.98E+031.80E+031.57E+031.62E+03
Std8.51E+014.61E+013.32E+025.16E+014.70E+017.88E+017.71E+011.23E+021.17E+021.43E+02
Rank47103869512
p3.29E-056.80E-086.92E-071.60E-056.80E-087.58E-066.80E-084.60E-047.64E-02
F4Mean1.90E+031.90E+033.13E+041.90E+031.90E+031.90E+031.90E+031.93E+031.90E+031.90E+03
Std0.00E+000.00E+003.83E+040.00E+008.80E-060.00E+000.00E+001.09E+010.00E+000.00E+00
Rank1141211311
pNaNNaN8.01E-09NaN6.68E-05NaNNaN8.01E-09NaN
F5Mean1.26E+083.85E+083.00E+076.87E+074.35E+089.20E+063.73E+083.81E+079.72E+062.49E+06
Std6.57E+072.20E+083.03E+072.41E+071.87E+085.57E+061.41E+082.48E+075.11E+061.55E+06
Rank79461028531
p6.80E-086.80E-087.95E-076.80E-086.80E-082.36E-066.80E-086.80E-083.42E-07
F6Mean6.15E+037.76E+034.43E+036.11E+037.41E+034.44E+037.81E+035.80E+034.69E+034.30E+03
Std7.63E+025.94E+024.02E+024.00E+021.44E+035.78E+021.36E+038.39E+027.27E+027.90E+02
Rank79268310541
p1.58E-066.80E-083.94E-011.23E-077.90E-084.57E-016.80E-081.10E-059.62E-02
F7Mean1.59E+078.32E+078.53E+062.35E+075.78E+075.29E+069.07E+071.73E+074.30E+061.40E+06
Std8.79E+065.05E+076.39E+066.08E+063.21E+072.80E+064.58E+079.63E+063.82E+069.03E+05
Rank69478310521
p4.54E-066.80E-082.06E-066.80E-086.80E-082.36E-066.80E-081.23E-071.12E-03
F8Mean1.36E+041.71E+041.04E+041.71E+041.56E+041.12E+041.64E+041.59E+041.10E+041.00E+04
Std1.02E+033.92E+028.71E+025.94E+026.30E+027.42E+026.67E+026.78E+022.02E+032.06E+03
Rank59210648731
p1.43E-076.80E-089.25E-016.80E-086.80E-081.93E-026.80E-086.80E-083.37E-02
F9Mean3.80E+034.17E+033.65E+034.05E+034.88E+034.25E+034.86E+033.88E+033.44E+033.44E+03
Std1.36E+024.42E+021.58E+024.76E+012.48E+022.46E+024.38E+021.66E+021.28E+021.27E+02
Rank47361089521
p5.23E-076.80E-084.68E-056.80E-086.80E-086.80E-086.80E-081.06E-079.25E-01
F10Mean3.72E+031.27E+046.14E+031.04E+041.58E+043.19E+031.39E+048.21E+034.32E+033.12E+03
Std2.48E+021.71E+031.93E+038.38E+021.66E+033.95E+011.78E+031.62E+034.74E+022.59E+01
Rank38571029641
p6.80E-086.80E-086.80E-086.80E-086.80E-085.23E-076.80E-086.80E-086.80E-08
Mean rank | 4.5 | 7.5 | 4.0 | 6.4 | 7.8 | 3.4 | 8.3 | 5.4 | 2.8 | 1.4
Result | 5 | 8 | 4 | 7 | 9 | 3 | 10 | 6 | 2 | 1
+/=/− | 0/1/9 | 0/1/9 | 0/3/6 | 0/1/9 | 0/0/10 | 0/2/8 | 0/1/9 | 0/0/10 | 0/4/6
Table 6. The optimal result of welded beam design.
Algorithms | k1 | k2 | k3 | k4 | Optimum Cost
WOA | 0.176794286 | 3.891577393 | 9.255735323 | 0.204661441 | 1.764910575
AO | 0.202106856 | 3.319686034 | 9.081046814 | 0.212260217 | 1.755925500
SCA | 0.165196847 | 4.294568905 | 8.967124254 | 0.210654365 | 1.792045440
RSA | 0.175670680 | 3.624398648 | 9.999449030 | 0.205952534 | 1.869756941
HS | 0.177543770 | 4.098912342 | 8.878245918 | 0.217531907 | 1.824393253
BWO | 0.218273731 | 3.015636000 | 9.273401650 | 0.220363497 | 1.831589741
HHO | 0.201474520 | 3.378944473 | 8.960620234 | 0.218644819 | 1.719100771
AOA | 0.209983002 | 3.017257244 | 10.00000000 | 0.212237971 | 1.884562862
SCSO | 3.267355241 | 3.267355241 | 9.035052358 | 0.205802427 | 1.696622053
COSCSO | 0.205747115 | 3.252954257 | 9.036236158 | 0.205747352 | 1.695317058
Table 7. Statistical results of welded beam design.
Algorithms | Best | Worst | Mean | Std
WOA | 1.764910575 | 3.268065060 | 2.360271979 | 0.437518374
AO | 1.755925500 | 2.621445835 | 2.074867078 | 0.197032112
SCA | 1.792045440 | 1.992645879 | 1.862883723 | 0.051244352
RSA | 1.869756941 | 27.442208090 | 3.681496058 | 5.601754100
HS | 1.824393253 | 3.706355014 | 2.719342102 | 0.498207128
BWO | 1.831589741 | 2.952547258 | 2.251305241 | 0.284732148
HHO | 1.719100771 | 2.313552829 | 1.850862237 | 0.147918368
AOA | 1.884562862 | 2.914716233 | 2.309549772 | 0.313092730
SCSO | 1.696622053 | 4.242983833 | 2.004451488 | 0.765112742
COSCSO | 1.695317058 | 1.781726391 | 1.713814493 | 0.021561647
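For orientation, the objective behind Tables 6 and 7 is, in its standard formulation, the fabrication cost of the welded beam in the four design variables (k1, k2, k3, k4) = (weld thickness h, weld length l, bar height t, bar thickness b). A minimal sketch assuming that standard formulation, with constraint handling omitted:

```python
def welded_beam_cost(k):
    # Standard welded-beam fabrication cost; k = (h, l, t, b) = (k1, k2, k3, k4).
    h, l, t, b = k
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

# The COSCSO variables from Table 6 reproduce the reported optimum cost:
print(welded_beam_cost([0.205747115, 3.252954257, 9.036236158, 0.205747352]))  # ~1.69531
```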
Table 8. The optimal result of pressure vessel design.
Algorithms | k1 | k2 | k3 | k4 | Optimum Cost
WOA | 0.795338105 | 0.654521576 | 40.86741439 | 192.5123053 | 6736.71434
AO | 0.801045506 | 0.418921940 | 41.48734742 | 185.7712132 | 6030.23905
HS | 0.978539020 | 0.500973483 | 49.50650039 | 103.1850619 | 6547.72658
RSA | 0.972997715 | 0.512637098 | 47.36047546 | 200 | 6305.50049
SCA | 0.854015786 | 0.520118487 | 44.11567303 | 158.4223506 | 5885.88792
BWO | 0.938950393 | 0.563152725 | 48.41429245 | 115.1168137 | 5885.33405
BSA | 0.778164873 | 0.493776584 | 40.31964778 | 199.9995955 | 7303.71890
AOA | 0.793199905 | 0.458365929 | 40.34176154 | 200 | 5885.31787
SCSO | 0.796043593 | 0.406047822 | 41.24545087 | 187.5713345 | 5885.32021
COSCSO | 0.778539806 | 0.385198905 | 40.33914125 | 199.7284276 | 5885.31757
Table 9. Statistical results of pressure vessel design.
Algorithms | Best | Worst | Mean | Std
WOA | 6736.71434 | 15021.66294 | 9511.21867 | 2551.89041
AO | 6030.23905 | 7756.65903 | 6882.12163 | 533.52205
HS | 6547.72658 | 12757.34328 | 8720.20447 | 1699.11330
RSA | 9269.85205 | 68265.01725 | 32898.82493 | 15501.85116
SCA | 6518.94915 | 9160.64385 | 7511.17067 | 772.99225
BWO | 6772.30663 | 9518.89193 | 7538.79379 | 683.87846
BSA | 6200.76520 | 30037.17906 | 11444.68588 | 6484.51516
AOA | 6211.62984 | 18842.91882 | 11254.75474 | 3546.15966
SCSO | 5956.21327 | 23310.15051 | 7614.91057 | 3715.97323
COSCSO | 5887.02011 | 7318.91872 | 6569.02774 | 517.46245
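Analogously, the pressure vessel problem of Tables 8 and 9 minimizes, in its standard formulation, the material and forming cost over (k1, k2, k3, k4) = (shell thickness Ts, head thickness Th, inner radius R, length L). A minimal sketch under that assumption, constraints again omitted:

```python
def pressure_vessel_cost(k):
    # Standard pressure-vessel cost; k = (Ts, Th, R, L) = (k1, k2, k3, k4).
    Ts, Th, R, L = k
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

# The COSCSO variables from Table 8 evaluate close to the reported 5885.3;
# small differences reflect the rounding of the printed decision variables.
print(pressure_vessel_cost([0.778539806, 0.385198905, 40.33914125, 199.7284276]))
```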
Table 10. The optimal result of gas transmission compressor design.
Algorithms | k1 | k2 | k3 | Optimum Cost
RSA | 54.99999359 | 1.194623188 | 25.00083760 | 2964527.52459
BWO | 55 | 1.193088691 | 24.53523919 | 2964639.69065
SOA | 53.65973171 | 1.190449899 | 24.74449640 | 2964378.80113
WOA | 53.44872314 | 1.190109928 | 24.71816871 | 2964375.49576
SCA | 55 | 1.195189878 | 24.73268345 | 2964474.41040
HS | 53.38712267 | 1.189241767 | 24.74999745 | 2964380.11714
AO | 53.47061054 | 1.190026373 | 24.64034062 | 2964384.51386
AOA | 55 | 1.200762326 | 24.62935858 | 2964730.55260
SCSO | 53.45101427 | 1.190109067 | 24.71872247 | 2964375.49653
COSCSO | 53.44671239 | 1.190100716 | 24.71857897 | 2964375.49533
Table 11. Statistical results of gas transmission compressor design.
Algorithms | Best | Worst | Mean | Std
RSA | 2964527.52459 | 3016290.08652 | 2978147.32748 | 14122.03566
BWO | 2964639.69065 | 2985654.73236 | 2968570.17953 | 5235.597886
SOA | 2964378.80113 | 2964502.32306 | 2964430.85626 | 39.36551384
WOA | 2964375.49576 | 2964376.19841 | 2964375.57737 | 0.15661415
SCA | 2964474.41040 | 2965960.51365 | 2964893.53068 | 407.6793260
HS | 2964380.11714 | 2972724.66633 | 2965744.95719 | 2116.81140
AO | 2964384.51386 | 2974563.67888 | 2966781.11226 | 3285.30995
AOA | 2964730.55260 | 2985136.92184 | 2970250.27062 | 6037.25127
SCSO | 2964375.49653 | 2987124.04245 | 2966285.86187 | 5997.65711
COSCSO | 2964375.49533 | 2964375.49533 | 2964375.49533 | 5.32331E-09
Table 12. The optimal result of heat exchanger design.
Algorithms | k1 | k2 | k3 | k4 | k5 | k6 | k7 | k8 | Optimum Cost
WOA | 8.857E+02 | 4.785E+03 | 4.419E+03 | 7.565E+01 | 3.233E+02 | 8.793E+01 | 1.503E+02 | 4.233E+02 | 10089.27612
AO | 2.657E+03 | 4.867E+03 | 6.451E+03 | 1.227E+02 | 3.026E+02 | 1.735E+02 | 1.903E+02 | 3.791E+02 | 13975.23738
HS | 2.319E+03 | 2.737E+03 | 5.757E+03 | 2.414E+02 | 3.254E+02 | 1.583E+02 | 3.023E+02 | 4.252E+02 | 10812.90588
RSA | 1.120E+03 | 4.272E+03 | 4.772E+03 | 7.793E+01 | 3.003E+02 | 3.712E+02 | 1.469E+02 | 4.051E+02 | 27223.59476
BSA | 2.869E+03 | 3.093E+03 | 5.411E+03 | 1.051E+02 | 2.840E+02 | 2.013E+02 | 1.774E+02 | 3.838E+02 | 11372.90767
BWO | 4.001E+03 | 6.410E+03 | 4.806E+03 | 1.448E+02 | 3.183E+02 | 1.470E+02 | 1.852E+02 | 4.132E+02 | 15217.14817
HHO | 1.174E+03 | 1.000E+03 | 6.899E+03 | 1.253E+02 | 2.241E+02 | 1.606E+02 | 2.711E+02 | 3.241E+02 | 9073.094947
AOA | 5.052E+03 | 9.071E+03 | 9.071E+03 | 3.209E+01 | 2.132E+02 | 1.668E+02 | 2.137E+02 | 3.069E+02 | 23194.46529
SCSO | 5.119E+02 | 2.451E+03 | 4.512E+03 | 1.640E+02 | 3.196E+02 | 2.226E+02 | 2.442E+02 | 4.195E+02 | 7475.073844
COSCSO | 1.084E+03 | 1.103E+03 | 5.271E+03 | 1.997E+02 | 2.891E+02 | 1.948E+02 | 3.071E+02 | 3.891E+02 | 7458.396002
Table 13. Statistical results of heat exchanger design.
Algorithms | Best | Worst | Mean | Std
WOA | 10089.27612 | 196031.898 | 48225.06524 | 52285.7206
AO | 13975.23738 | 178736.7572 | 42861.35149 | 37560.63068
HS | 10812.90588 | 124467.6644 | 60095.18323 | 29532.60344
RSA | 27223.59476 | 315644.1889 | 147440.3931 | 61985.48452
BSA | 11372.90767 | 161404.451 | 48673.30656 | 37923.7065
BWO | 15217.14817 | 120323.2952 | 60086.46685 | 27717.89898
HHO | 9073.094947 | 77151.91678 | 21087.5583 | 15916.02815
AOA | 23194.46529 | 155972.8617 | 49743.46087 | 29775.26723
SCSO | 7475.073844 | 212854.4675 | 30578.91536 | 12694.59629
COSCSO | 7458.396002 | 33159.19512 | 12647.27427 | 7803.060165
Table 14. The optimal result of tubular column design.
Algorithms | k1 | k2 | Optimum Cost
HS | 5.445466505 | 0.293088136 | 26.531748937
SOA | 5.450525166 | 0.291989004 | 26.497685800
RSA | 5.518677222 | 0.288200889 | 26.624133745
WOA | 5.450602426 | 0.291877924 | 26.492127934
SCA | 5.452775183 | 0.291738699 | 26.495248656
CHOA | 5.449801688 | 0.292230441 | 26.507063312
BWO | 5.434579699 | 0.29571633 | 26.618680219
AOA | 5.427423919 | 0.303376181 | 26.991049027
SCSO | 5.452249069 | 0.291622918 | 26.486505805
COSCSO | 5.452180739 | 0.291626429 | 26.486361480
Table 15. Statistical results of tubular column design.
Algorithms | Best | Worst | Mean | Std
HS | 26.531748937 | 29.380491179 | 27.101286566 | 0.682159826
SOA | 26.497685800 | 26.651702678 | 26.546770890 | 0.043476606
RSA | 26.624133745 | 31.482014142 | 28.600884307 | 1.378250449
WOA | 26.492127934 | 28.062545561 | 27.067123704 | 0.475587468
SCA | 26.495248656 | 26.911864726 | 26.663842655 | 0.103145967
CHOA | 26.507063312 | 26.664362638 | 26.592798770 | 0.049578003
BWO | 26.618680219 | 28.701867216 | 27.247581963 | 0.524397843
AOA | 26.991049027 | 28.660769408 | 27.781832446 | 0.601084705
SCSO | 26.486505805 | 26.488283717 | 26.487303513 | 0.000514557
COSCSO | 26.486361480 | 26.486429135 | 26.486367501 | 0.000016150
Table 16. The optimal result of piston lever design.
Algorithms | k1 | k2 | k3 | k4 | Optimum Cost
WOA | 0.086602041 | 2.079956862 | 4.093800868 | 119.228613317 | 8.955469786
SOA | 0.050780514 | 2.044519184 | 4.083248703 | 120 | 8.432800807
MVO | 0.05 | 2.050052620 | 4.087058119 | 119.979915073 | 8.463145662
CHOA | 0.073876185 | 2.081364562 | 4.095381685 | 120 | 8.847198681
SCA | 0.05 | 2.053817149 | 4.093726603 | 120 | 8.505719113
BWO | 0.05 | 2.105295277 | 4.096763691 | 120 | 8.723224918
HHO | 0.050019685 | 2.041900656 | 4.083032582 | 119.999227818 | 8.414435140
AOA | 0.271028410 | 0.271028410 | 4.162257291 | 120 | 57.99492127
SCSO | 0.05 | 2.041589027 | 4.083079945 | 120 | 8.413213831
COSCSO | 0.05 | 2.041513591 | 4.083027180 | 120 | 8.412698328
Table 17. Statistical results of piston lever design.
Algorithms | Best | Worst | Mean | Std
WOA | 8.955469786 | 342.7630967 | 55.49855602 | 95.85126869
SOA | 8.432800807 | 9.954077528 | 43.99002860 | 0.503774548
MVO | 8.463145662 | 314.1339038 | 9.861960059 | 88.25491267
CHOA | 8.847198681 | 11.80905189 | 0.871394022 | 0.815693522
SCA | 8.505719113 | 10.23160760 | 9.410095331 | 0.542224958
BWO | 8.723224918 | 10.37487240 | 9.479639241 | 0.535563793
HHO | 8.414435140 | 411.9250502 | 96.24528231 | 122.8529460
AOA | 57.99492127 | 577.6401065 | 343.1464542 | 160.6837497
SCSO | 8.413213831 | 56.92881071 | 10.87687234 | 10.83963680
COSCSO | 8.412698328 | 8.525870642 | 8.426024955 | 0.028570619