Article

Sine Cosine Algorithm for Elite Individual Collaborative Search and Its Application in Mechanical Optimization Designs

College of Information Science and Technology, Gansu Agricultural University, Lanzhou 730070, China
*
Author to whom correspondence should be addressed.
Biomimetics 2023, 8(8), 576; https://doi.org/10.3390/biomimetics8080576
Submission received: 9 October 2023 / Revised: 16 November 2023 / Accepted: 20 November 2023 / Published: 1 December 2023
(This article belongs to the Special Issue Nature-Inspired Metaheuristic Optimization Algorithms)

Abstract

To address the shortcomings of the sine cosine algorithm, such as low search accuracy, slow convergence, and a tendency to fall into local optima, a sine cosine algorithm for elite individual collaborative search was proposed. Firstly, tent chaotic mapping was used to initialize the population, and the hyperbolic tangent function was applied to non-linearly adjust the control parameter of the sine cosine algorithm, which enhanced the uniformity of the population distribution and balanced the global exploration and local exploitation abilities. Secondly, the search method of the sine cosine algorithm was improved by combining the basic search strategy of the sine cosine algorithm, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy; the three search strategies were executed alternately, which achieved collaboration, improved the convergence accuracy, and prevented the algorithm from falling into local optima. Finally, a greedy selection strategy was employed to retain the better individuals in the population, which accelerated the convergence of the sine cosine algorithm. The simulation results illustrated that the sine cosine algorithm for elite individual collaborative search demonstrated a better optimization performance than the sine cosine algorithm, other improved sine cosine algorithms, other chaos-based algorithms, and other intelligent optimization algorithms. In addition, the feasibility and applicability of the proposed algorithm were further demonstrated by two mechanical optimization design experiments.

1. Introduction

Most of the optimization problems that exist today are inherently NP-hard and difficult to solve using basic mathematical methods. Many real-world optimization problems are also difficult to represent analytically, and the complexity of the problems to be solved keeps increasing. Therefore, the excellent performance of swarm intelligence optimization algorithms in finding optimal values for specific systems and problems has attracted many experts and scholars to conduct research in this area.
In recent years, the proposed swarm intelligence optimization algorithms include the whale optimization algorithm (WOA) [1], grey wolf optimization algorithm [2], and Harris hawks optimization algorithm [3], as well as the salp swarm algorithm [4], Olympic optimization algorithm [5], sand cat swarm optimization algorithm [6,7], and quadratic interpolation optimization algorithm [8]. The above algorithms not only enrich the research field, but also solve many practical problems in real life, such as engineering optimization [9,10], shop floor scheduling [11], intelligent transportation [12], software module clustering [13], wireless sensor networks [14,15], intelligent agriculture [16], EDM molding process parameters [17], centrifugal pump design [18], logistics and transport [19], etc. Work on these optimization problems has, in turn, made substantial contributions to other research fields.
However, according to the “no free lunch” theorem [20], no algorithm is perfect and can solve all problems. Therefore, it is necessary to propose different emerging algorithms or improve existing ones. Researchers can keep improving algorithms so that they come closer to the optimal solution with lower complexity, a shorter running time, and a higher optimization accuracy, while maintaining a stable performance on global optimization problems and retaining the algorithm's random search capability.

1.1. The Motivation

Population initialization in the SCA [21] relies excessively on randomness and does not distribute the population efficiently. Relying only on random initialization may cause the initial population to cluster in a certain region of the search space, so that the population distribution is poorly uniform and the searching individuals fall into a blind search in the early stage. The algorithm is then prone to fall into local optima after many iterations, which in turn leads to slow convergence. In addition, the overall iterative process of the SCA uses the parameter r1 for linear regulation to balance the global search and local exploitation performance. The parameter r1 is set to decrease linearly from 2 to 0 with the number of iterations: at the beginning of the iterations, r1 takes a large value, which favors the global search; as the iterations approach the end, the parameter decreases linearly to a small value, and a local search is carried out.
However, under this linear variation, the two phases are easily coordinated unevenly: the control parameter is already in rapid linear decline while the algorithm is still in the global search phase and cannot perform fully, causing an imbalance between the global search and local exploitation of the algorithm. Finally, the search strategy of the SCA uses the current individual xi, guided by the global optimal individual Pbest, to converge quickly toward the optimal solution. However, this strategy only uses the current individual xi as a guide; although it has a strong global search ability, the convergence speed is slow and the optimization accuracy is low, it easily falls into local optima, and it may miss potential solutions that exist around the optimal individual.
To this end, this paper proposes the sine cosine algorithm for elite individual collaborative search (SCAEICS), which employs the tent chaos mapping strategy to initialize the population and efficiently and reasonably distribute the population to enhance the performance of the algorithm. The tent chaos mapping strategy has a wide range of applications [22], and adopting a tent chaos mapping population allows the algorithm to generate a uniformly distributed population of individuals in initializing the distribution, which is much more efficient and effective compared to the traditional method of population distribution.
Meanwhile, the hyperbolic tangent function has shown a better performance as an excitation function in the field of machine learning. The introduction of the hyperbolic tangent function in the r1 parameter effectively balances the equilibrium state between the global search and the local exploitation. In the field of optimization, the idea of elite individuals has been shown to improve the performance of algorithms, and the application of the idea of elite individuals can be seen in the literature [23,24,25]. Combining the SCA basic search strategy, the m-neighborhood locally optimal individual-guided search strategy, and the globally optimal individual-guided search strategy effectively enhances the algorithm. The three search strategies are executed alternately to realize a collaborative search, which improves the convergence accuracy and prevents the algorithm from falling into a local optimum.
In addition, a greedy selection strategy is employed to select the population to accelerate the convergence speed. Finally, two mechanical optimization design problems are simulated to verify the effectiveness of the proposed algorithm in this paper.

1.2. The Contribution

Based on the above ideas and strategies, this paper proposes a sine cosine algorithm for the collaborative search of elite individuals to improve the performance of the SCA. The tent chaotic mapping strategy is used in the initialization phase to enhance the uniformity of the population distribution, and the hyperbolic tangent function strategy is introduced to balance global search and local exploitation. In addition, to improve the convergence accuracy and convergence speed, and to prevent the algorithm from falling into local optima, the idea of elite individuals is introduced, which effectively solves these problems. The performance of the improved algorithm is evaluated through 23 benchmark functions, the CEC2020 functions, and 2 engineering design problems. Therefore, the main contributions of this study are summarized as follows:
(1)
A sine cosine algorithm for elite individual collaborative search was proposed; compared to the SCA, the SCAEICS exhibits a faster convergence speed, a higher convergence accuracy, and an effective escape from local optima;
(2)
In the improvement process, the tent chaotic mapping strategy and the hyperbolic tangent function strategy are adopted, which effectively solve the defect of the randomness of the population distribution and balance the global search and local exploitation;
(3)
In addition, the concept of the collaborative search of elite individuals is combined with SCA and used to improve the search performance of SCA;
(4)
The proposed SCAEICS was validated by 23 benchmark functions, CEC2020 functions, and in two mechanical engineering optimization problems, and it outperformed the basic SCA in terms of convergence performance.

1.3. The Structure of the Paper

The rest of the paper is organized as follows. Section 2 reviews related research on improvements to the SCA. Section 3 describes the basic principles of the SCA and analyzes its drawbacks. Section 4 describes the improvement ideas of the elite individual collaborative search sine cosine algorithm in detail. Section 5 performs comparative experiments and analysis using 23 benchmark test functions and the CEC2020 test functions, and applies the SCAEICS to the optimization of mechanical designs. A discussion of the proposed approach follows, and the last section summarizes the findings of this paper and points out directions for future research.

2. Related Research

The sine cosine algorithm (SCA) [21] was proposed by S. Mirjalili in 2016. The SCA is a population-based intelligent optimization algorithm that seeks the optimal solution by modeling the periodic oscillations of the sine and cosine functions, and it has the advantages of few parameters, a simple structure, and easy implementation. Like most algorithms, the sine cosine algorithm still has the defects of a low search accuracy, a slow convergence speed, and easily falling into local optima. Therefore, researchers have worked to improve the sine cosine algorithm in the following two aspects.
(1) Improve the initialization, parameter setting, and algorithmic structure of the SCA.
For example, a Q-learning embedded sine cosine algorithm (QLESCA) was proposed [26], in which the algorithm controls the SCA parameters by forming Q-tables for different individuals during operation, which effectively improves the convergence speed of the SCA. A new backbone sine cosine algorithm based on domain structure was proposed [27], which mainly introduced domain structure and Gaussian sampling learning through the backbone optimization idea in the update process of the sine cosine algorithm, effectively enhancing the population exploration ability and improving the population diversity. A doubly adaptive randomized standby enhanced sine cosine algorithm was proposed by introducing the doubly adjusted weight strategy and the randomized standby strategy [28], which balanced the weight factor between exploitation and exploration, accelerated the convergence speed, and enhanced the exploration ability.
An improved sine cosine algorithm with Lévy flight was proposed [29], which multiplied the Lévy flight distribution element-wise with the position vectors of the sine cosine population and used a non-linear parameter adjustment method based on spatial distance, effectively enhancing the convergence accuracy and improving the convergence speed of the algorithm. A spectral feature peak identification and localization method based on the improved sine cosine algorithm was proposed [30], which improved the sine cosine algorithm by adopting a dynamic conversion probability and significantly improved performance measures such as the spectral identification rate and localization accuracy.
Furthermore, the sine cosine algorithm introducing the backward learning strategy was proposed [31], which effectively solved the problem of late evolutionary stagnation and improved the global optimization performance. A sine cosine algorithm based on orthogonal parallel information was proposed [32], which increased the diversity and enhanced the global search capability of the algorithm by adopting a multiple orthogonal parallel information strategy. An improved sine cosine algorithm for text feature selection was proposed [33] mainly using individual coding and adaptive weighting strategies, which improved the classification accuracy compared with other feature-selection algorithms. A population-based sine cosine algorithm for application to economic load scheduling was proposed [34], which indicated a high performance compared to other techniques. An enhanced parallel sine cosine algorithm with a single-stage synchronous and asynchronous strategy designed for solving constrained and unconstrained problems was proposed [35], which effectively sped up the convergence of the algorithm. An improved sine cosine algorithm for solving high-dimensional global optimization problems was proposed [36], in which an inertia weight factor was introduced to modify the original algorithm formula, the Gaussian function was used to reduce the algorithm parameters nonlinearly, and 24 high-dimensional functions and other large-scale global optimization problems were used to evaluate the effectiveness of the algorithm, which showed that the modified algorithm effectively avoids falling into local optimization and accelerates the convergence speed.
(2) Complement the SCA with other algorithms.
For example, an enhanced brain storm sine cosine algorithm was proposed [37] to improve population diversity by introducing the enhanced brain storm strategy and two new individual updating strategies, which achieved an effective balance between the global search and local exploitation. Meanwhile, by introducing a cloud model strategy to adaptively adjust the control parameters, a cloud model-based sine cosine algorithm was proposed [38], and the experimental results show that the improved algorithm outperformed the original algorithm in solving global optimization problems. A sine cosine algorithm embedded with differential evolution and inertia weights was proposed [39], which embedded a differential evolution algorithm with dynamic variation and better balanced global search and local exploitation by introducing adaptive inertia weights. A hybrid multi-objective firefly algorithm and sine cosine algorithm was proposed [40], and the results show that the modified algorithm was effective for multi-objective optimization problems. An improved sine cosine algorithm hybridized with a particle swarm algorithm for registering lung CT images of COVID-19-infected patients was proposed [41], which has a high practicality in the field of medical image registration.
The above algorithms use different strategies to improve the SCA, and the improvements are effective. However, analysis of the results shows that these algorithms still have some limitations. Therefore, it is necessary to continue improving the SCA to make it more applicable to practical real-life problems and to overcome the challenging new problems brought by the development of society, which is the main motivation of the current research.

3. Basic Sine Cosine Algorithm

3.1. Principle of the Sine Cosine Algorithm

The SCA converges to the global optimal solution by modeling the periodic oscillatory nature of the sine and cosine functions, moving progressively from the global search of the exploration phase to the local exploitation of the exploitation phase. The basic principle of the SCA is shown in Figure 1. When the value of the sine or cosine term lies in [1, 2] or [−2, −1], the current individual is guided to perform a global search of the solution space in the exploration phase. When the value of the sine or cosine term lies in [−1, 1], the neighborhood of the target optimal solution is exploited and a local search of the solution space is performed in the exploitation phase. The sine and cosine functions complement each other in the SCA search process, so that the SCA eventually converges to the global optimal solution.
For solving the minimization problem,
$$\min f(x) = \min f(x_1, x_2, \ldots, x_D)$$
where x is the feasible solution to the problem and D is the spatial dimension.
(I) Suppose the population size is N and the search space is D-dimensional. The position of the ith individual is denoted as xi = (xi,1, xi,2, …, xi,D), i ∈ {1, 2, …, N}; the optimal individual position found during the iterative update of the N individuals is denoted as pbest = (pbest,1, pbest,2, …, pbest,D); and the ith individual in the population updates its spatial position according to Equation (1).
$$x_{i,j}^{t+1} = \begin{cases} x_{i,j}^{t} + r_1 \cdot \cos(r_2) \cdot \left| r_3 \, p_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 \geq 0.5 \\ x_{i,j}^{t} + r_1 \cdot \sin(r_2) \cdot \left| r_3 \, p_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 < 0.5 \end{cases}$$
where xi,j is the jth dimensional component of the ith candidate solution and pbest,j is the jth dimensional component of the global optimal solution in the current iteration, j ∈ {1, 2, …, D}; r2 ∈ (0, 2π), r3 ∈ (0, 2), and r4 ∈ (0, 1) are three uniformly distributed random numbers.
(II) In the SCA, the parameter r1 balances the global search and local exploitation performance; as it decreases linearly with the number of iterations, the algorithm gradually converges to the globally optimal solution. The parameter r1 is updated according to Equation (2).
$$r_1 = a - \frac{a\,t}{T}$$
where a is a constant, generally taking the value 2, t is the current number of iterations, and T is the maximum number of iterations.
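As an illustrative sketch (the vectorized form and variable names are ours, not the authors'), one iteration of the basic SCA update of Equations (1) and (2) can be written as:

```python
import numpy as np

def sca_step(X, p_best, t, T, a=2.0, rng=np.random):
    """One iteration of the basic SCA position update (Equations (1) and (2))."""
    N, D = X.shape
    r1 = a - t * a / T                       # Equation (2): linear decrease from a to 0
    r2 = rng.uniform(0, 2 * np.pi, (N, D))   # movement direction
    r3 = rng.uniform(0, 2, (N, D))           # random weight on the destination
    r4 = rng.uniform(0, 1, (N, D))           # sine/cosine switch
    dist = np.abs(r3 * p_best - X)           # |r3 * p_best - x_i|
    step = np.where(r4 < 0.5, np.sin(r2), np.cos(r2))
    return X + r1 * step * dist
```

Note that at t = T the control parameter r1 reaches 0, so the population stops moving, mirroring the linear schedule of Equation (2).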

3.2. Disadvantage Analysis of the Sine Cosine Algorithm

A good initial population is crucial for swarm intelligence algorithms, but the SCA population initialization relies excessively on randomness, without an efficient initialization scheme. Relying only on random initialization of the population may lead to the initial population clustering in a certain region of the space, resulting in a poorly uniform population distribution and the searching individuals falling into a blind search in the early stage. The algorithm then tends to fall into a local optimum after several iterations, which in turn leads to a slow convergence rate. Therefore, a tent chaotic mapping strategy was adopted in the SCAEICS to initialize the population, distribute the population efficiently and reasonably, and enhance the performance of the SCA.
Any swarm intelligence optimization algorithm needs to consider how to balance the global search and local exploitation performance. The global search explores a broader space so that the population maintains good diversity and avoids getting trapped in local optima. The SCA uses the parameter r1 for linear regulation to balance the global search and local exploitation performance. The parameter r1 is set to decrease linearly from 2 to 0 with the number of iterations, with r1 taking a large value at the beginning of the iterations to facilitate the global search. As the iterations approach the end, the parameter value decreases linearly to a small value, and the local search is carried out. However, under this linear variation, the two phases are easily coordinated unevenly: the control parameter is already in rapid linear decline while the algorithm is still in the global search phase and cannot perform fully, causing an imbalance between the global search and local exploitation of the algorithm. To solve this problem, the hyperbolic tangent function was used in the SCAEICS to non-linearly adjust the control parameter r1.
The search strategy of the SCA uses the current individual xi, guided by the global optimal individual Pbest, to converge quickly toward the optimal solution. However, this strategy only uses the current individual xi as a guide; although it has a strong global search capability, the convergence speed is slow and the optimization accuracy is low, it is easy to fall into local optima, and potential solutions around the optimal individual may be missed. Therefore, the SCAEICS was proposed in this paper, combining the basic SCA search strategy, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy. The three search strategies are executed alternately to achieve a collaborative search, improve the convergence accuracy, and prevent the algorithm from falling into a local optimum.

4. Sine Cosine Algorithm for Collaborative Search of Elite Individuals

4.1. Modified Strategies of the SCAEICS Algorithm

(I).
Tent chaos mapping initialization
Tent chaotic mapping [42] was used by the SCAEICS to initialize the population, so that the population was reasonably evenly distributed in the search space, the algorithm performance was improved, and the population distribution uniformity was maintained effectively. The comparison results between the tent chaos mapping initialization and random population initialization are shown in Figure 2. Tent chaos mapping not only has the characteristics of traversal uniformity and low complexity, but also preserves the initialization randomness of the SCA. The function expressions are updated according to Equation (3).
$$s_{i+1,d} = \begin{cases} u \, (1 - s_{i,d}), & s_{i,d} \geq 0.5 \\ u \, s_{i,d}, & s_{i,d} < 0.5 \end{cases}$$
where the population undergoes tent chaos mapping to generate a chaotic sequence si = (si,1, si,2, …, si,D), i = 1, 2, …, N, with u ∈ (0, 2); the larger u is, the stronger the chaotic behavior, and the system is in a fully chaotic state when u = 2.
Mapping the values of the resulting chaotic sequence into the search space yields the population X = (X1, X2, …, XN), Xi = (xi,1, xi,2, …, xi,D), and the population individuals are generated according to Equation (4).
$$x_{i,d} = (1 + s_{i,d}) \cdot \frac{ub - lb}{2} + lb$$
where ub and lb are the upper and lower bounds of the search space.
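A minimal sketch of this initialization, assuming a single seed value s0 and taking u slightly below 2 to avoid the finite-precision collapse of the tent map at exactly u = 2 (both choices are ours, not the authors'):

```python
import numpy as np

def tent_init(N, D, lb, ub, u=1.99, s0=0.37):
    """Tent-chaos population initialization (Equations (3) and (4))."""
    s = np.empty((N, D))
    x = s0
    for i in range(N):
        for d in range(D):
            # Equation (3): tent chaotic map on (0, 1)
            x = u * x if x < 0.5 else u * (1.0 - x)
            s[i, d] = x
    # Equation (4): map the chaotic sequence into the search space [lb, ub]
    return (1.0 + s) * (ub - lb) / 2.0 + lb
```

The chaotic iterates traverse (0, 1) more evenly than independent uniform draws, which is what gives the initial population its improved spread.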
(II).
The hyperbolic tangent function non-linear adjustment control parameter r1
The hyperbolic tangent function [43] was adopted by the SCAEICS to improve the parameter r1. The tanh function is commonly used as an activation function in the field of machine learning and performs well for non-linear adjustment. The expression of the tanh function is Equation (5).
$$\tanh(x) = \frac{\sinh(x)}{\cosh(x)} = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$
where sinh(x) and cosh(x) are the hyperbolic sine and hyperbolic cosine functions, respectively. By introducing the tanh function into the SCAEICS, the parameter can be decreased non-linearly: the global search capability in the early stage is enhanced, and the local exploitation performance in the later stage is stably maintained, achieving a balance between global search and local exploitation. The expression for the improved control parameter is Equation (6).
$$r_1 = (a_{max} - a_{min}) \tanh\!\left(S - \frac{S}{T}\, t\right)$$
where amax and amin are the initial and termination values of parameter a, respectively, t is the current number of iterations, and T is the maximum number of iterations; S is the adjustment parameter, and the comparison of parameter r1 before and after improvement is shown in Figure 3.
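To illustrate the difference between the two schedules, a small sketch is given below; the shape parameter value S = 3 and amin = 0 are assumptions for illustration only, and the schedule is assumed to decrease from near amax to amin as t runs from 0 to T:

```python
import numpy as np

def r1_linear(t, T, a=2.0):
    """Original linear control parameter (Equation (2))."""
    return a - t * a / T

def r1_tanh(t, T, a_max=2.0, a_min=0.0, S=3.0):
    """Non-linear tanh control parameter (Equation (6)): stays large
    longer in the early iterations (global search) and then drops
    smoothly toward a_min (local exploitation)."""
    return (a_max - a_min) * np.tanh(S - S * t / T)
```

With S = 3, r1 starts near 2 and is still above the linear schedule at mid-run, delaying the switch from global search to local exploitation.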
(III).
The elite individual collaborative search strategies
The elite individual-guided search strategy uses the best individuals of the population to guide the search process, improving the optimization accuracy and avoiding missing potential solutions and falling into local optima. The elite individual-guided search strategy is divided into the m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy.
(IV).
The m-neighborhood locally optimal individual-guided search strategy
The m (m ≤ N) individuals are randomly selected from the population, and the location of the best among them is denoted as Lbest = (lbest,1, lbest,2, …, lbest,D). Lbest is the locally optimal individual in the m-neighborhood, and the search is performed according to Equation (7) to obtain a new individual Ui = (ui,1, ui,2, …, ui,D), i = 1, 2, …, N. This strategy searches in the m-neighborhood around the locally optimal individual Lbest. The guidance of Lbest is effectively utilized in the search, considering both the search accuracy and the global search capability, to prevent the algorithm from falling into local extremes.
$$u_{i,j}^{t+1} = \begin{cases} l_{best,j}^{t} + r_1 \cdot \cos(r_2) \cdot \left| r_3 \, l_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 \geq 0.5 \\ l_{best,j}^{t} + r_1 \cdot \sin(r_2) \cdot \left| r_3 \, l_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 < 0.5 \end{cases}$$
(V).
The global optimal individual-guided search strategy
The new individual Ui is obtained by searching around the globally optimal individual Pbest, starting from the current individual xi, according to the search strategy in Equation (8).
$$u_{i,j}^{t+1} = \begin{cases} p_{best,j}^{t} + r_1 \cdot \cos(r_2) \cdot \left| r_3 \, p_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 \geq 0.5 \\ p_{best,j}^{t} + r_1 \cdot \sin(r_2) \cdot \left| r_3 \, p_{best,j}^{t} - x_{i,j}^{t} \right|, & r_4 < 0.5 \end{cases}$$
This strategy searches in the vicinity of the global optimal individual Pbest, in a sine or cosine manner with |r3 · pbest − xit| as the radius, which not only lets the global optimal individual Pbest guide the search, but also improves the optimization accuracy.
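Both elite-guided updates share the same form and differ only in which best individual anchors the search; a sketch (function name and vectorization ours):

```python
import numpy as np

def elite_guided_step(x_i, best, r1, rng):
    """One elite-individual-guided update: pass the m-neighborhood local
    best L_best for Equation (7) or the global best P_best for Equation (8)."""
    D = x_i.shape[0]
    r2 = rng.uniform(0, 2 * np.pi, D)
    r3 = rng.uniform(0, 2, D)
    r4 = rng.uniform(0, 1, D)
    dist = np.abs(r3 * best - x_i)             # search radius |r3 * best - x_i|
    step = np.where(r4 < 0.5, np.sin(r2), np.cos(r2))
    return best + r1 * step * dist             # anchored at the elite individual
```

Unlike the basic SCA step, the result is centered on the elite individual rather than on the current individual, which is what concentrates the search around promising regions.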
(VI).
The collaborative search strategy
In the search strategy of the SCA in Equation (1), the search is carried out near the current individual xi, in a sine or cosine manner with |r3 · pbestt − xit| as the radius, so that the current individual xi moves closer to or farther from the global optimal individual.
To better balance the global and local search capabilities of the SCA, enhance the optimization accuracy, prevent falling into local extremes, and improve the quality of the optimal solution, the search strategy of the SCA, the m-neighborhood locally optimal individual-guided search strategy, and the globally optimal individual-guided search strategy are combined, and the three search strategies are executed alternately to realize a collaborative search. The collaborative search strategy preserves the search mechanism of the SCA, making full use of its global search ability and preventing the algorithm from falling into local extremes. The elite individual-guided search mechanism lets the global optimal individual pbest and the local optimal individual lbest guide the search process, conducting a local search in their vicinity to improve the optimization accuracy and enhance the quality of the algorithm's optimal solution. At the same time, the guidance of the local optimal individual lbest balances the global and local search abilities.
(VII).
The greedy selection strategy
The new individual ui derived from the elite individual collaborative search strategy is compared with the individual xi by greedy selection, and the better of the two is retained to improve the accuracy of the algorithm and accelerate the convergence speed. In Equation (9), f(·) is the objective function fitness value.
$$x_{i}^{t+1} = \begin{cases} x_{i}^{t}, & f(u_{i}^{t}) \geq f(x_{i}^{t}) \\ u_{i}^{t}, & f(u_{i}^{t}) < f(x_{i}^{t}) \end{cases}$$
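The greedy selection of Equation (9) amounts to a row-wise comparison of fitness values; a brief sketch for a minimization objective:

```python
import numpy as np

def greedy_select(X, U, f):
    """Greedy selection (Equation (9)): keep whichever of x_i and the
    candidate u_i has the lower objective value."""
    fx = np.array([f(x) for x in X])
    fu = np.array([f(u) for u in U])
    keep_new = fu < fx                     # accept u_i only on strict improvement
    return np.where(keep_new[:, None], U, X)
```

Because an individual is replaced only on strict improvement, the best fitness in the population decreases monotonically over iterations.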

4.2. Algorithm Implementation Steps

The SCAEICS algorithm is implemented in the following steps.
Step 1: Initialize the population with tent chaos mapping; set the population size N, the current iteration number t, the maximum number of iterations T of the algorithm, the spatial dimension D, the number of neighbors m, and the parameter h of the elite individual collaborative search strategy.
Step 2: Calculate the fitness of each individual and find the globally optimal individual pbest.
Step 3: Calculate the control parameter r1.
Step 4: Execute the search strategy of the SCA according to Equation (1) when t mod 2 == 0, and turn to Step 6, otherwise execute Step 5.
Step 5: Execute the elite individual-guided search mechanism. Generate a random number h between [0,1], and when h > 0.5, execute the m-neighborhood locally optimal individual-guided search strategy according to Equation (7). Otherwise, execute the global optimal individual-guided search strategy according to Equation (8).
Step 6: Execute the greedy selection strategy according to Equation (9) and update the current individual.
Step 7: Update the global optimal individual.
Step 8: If t > T, stop the iteration and output the global optimal solution. Otherwise, t = t + 1 and move to Step 3.
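Steps 1–8 can be sketched end to end as follows. This is an illustrative reimplementation under stated assumptions (a plain uniform draw stands in for the tent-map initialization for brevity, S = 3 is assumed in the tanh schedule, and positions are clipped to the bounds), not the authors' exact code:

```python
import numpy as np

def scaeics(f, lb, ub, N=30, D=10, T=200, m=5, seed=0):
    """Illustrative sketch of the SCAEICS loop (Steps 1-8)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (N, D))                  # Step 1 (tent map omitted for brevity)
    fit = np.array([f(x) for x in X])
    p_best = X[np.argmin(fit)].copy()                # Step 2
    for t in range(T):
        r1 = 2.0 * np.tanh(3.0 - 3.0 * t / T)        # Step 3 (Equation (6), S = 3 assumed)
        idx = rng.choice(N, size=m, replace=False)
        l_best = X[idx[np.argmin(fit[idx])]].copy()  # m-neighborhood local best
        U = np.empty_like(X)
        for i in range(N):
            if t % 2 == 0:                           # Step 4: SCA search, Equation (1)
                base, guide = X[i], p_best
            elif rng.random() > 0.5:                 # Step 5: Equation (7)
                base, guide = l_best, l_best
            else:                                    # Step 5: Equation (8)
                base, guide = p_best, p_best
            r2 = rng.uniform(0, 2 * np.pi, D)
            r3 = rng.uniform(0, 2, D)
            r4 = rng.uniform(0, 1, D)
            step = np.where(r4 < 0.5, np.sin(r2), np.cos(r2))
            U[i] = np.clip(base + r1 * step * np.abs(r3 * guide - X[i]), lb, ub)
        fu = np.array([f(u) for u in U])             # Step 6: greedy selection, Equation (9)
        better = fu < fit
        X[better], fit[better] = U[better], fu[better]
        p_best = X[np.argmin(fit)].copy()            # Step 7: update the global best
    return p_best, float(fit.min())                  # Step 8
```

For example, minimizing a sphere function over [−10, 10]^5 with this sketch drives the best fitness steadily downward, since the greedy selection of Step 6 never accepts a worse individual.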

4.2.1. Pseudo-Code for the SCAEICS Algorithm

The pseudo-code of the SCAEICS algorithm is shown in Algorithm 1.
Algorithm 1: Sine cosine algorithm for the collaborative search of elite individuals (SCAEICS)
Enter parameters and initialize.
Set the population size N, use the tent chaos mapping strategy to generate the initialized population (xi, i = 1, 2, …, N), and set the maximum number of iterations T, the number of neighborhood individuals m, the parameter h, and the spatial dimension D (where the functions f14~f23 have fixed dimensions).
Calculate the individual fitness values f(xi) (i = 1, 2, …, N) and find the globally optimal individual and its location.
t = 0;
While (t < T) do
                Identifying locally optimal individuals and their locations.
         for i = 1 to N do
                        Calculate the value of the control parameter r1 according to Equation (6).
               if (t mod 2==0)
                               The SCA search strategy is executed according to Equation (1).
                       else if (h > 0.5)
                     Execute the m-neighborhood locally optimal individual-guided search strategy according to Equation (7).
                                 else if
                                           Execute the globally optimal individual guided search strategy according to Equation (8).
                end if
                      end if
              end if
  Execute the greedy selection strategy according to Equation (9).
         end for
                     Updating the current optimal individual and position.
t = t + 1;
end while
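Since Equations (1) and (6)–(9) are not reproduced alongside the pseudo-code, the control flow of Algorithm 1 can be sketched in Python with simplified, assumed forms of those update rules; the tent-map constant, the tanh schedule for r1, and the two elite-guided moves below are illustrative placeholders, not the paper's exact formulas.

```python
import math
import random

def tent_map_init(n, dim, lb, ub, seed=1):
    """Step 1 (assumed form): tent chaotic map sequence scaled to [lb, ub]."""
    random.seed(seed)
    z = random.random()
    pop = []
    for _ in range(n):
        row = []
        for _ in range(dim):
            z = z / 0.7 if z < 0.7 else (1.0 - z) / 0.3  # tent map iteration
            row.append(lb + z * (ub - lb))
        pop.append(row)
    return pop

def scaeics(f, dim, lb, ub, n=30, t_max=500, m=6, a_max=2.0):
    """Illustrative SCAEICS main loop following Algorithm 1."""
    pop = tent_map_init(n, dim, lb, ub)
    fit = [f(x) for x in pop]
    g = min(range(n), key=lambda i: fit[i])              # global best index
    for t in range(t_max):
        # Eq. (6) (assumed form): tanh-based non-linear decrease of r1
        r1 = a_max * (1.0 - math.tanh(2.5 * t / t_max))
        for i in range(n):
            if t % 2 == 0:
                # Eq. (1): classical SCA sine/cosine move towards the best
                r2 = random.uniform(0.0, 2.0 * math.pi)
                r3, r4 = random.uniform(0.0, 2.0), random.random()
                osc = math.sin(r2) if r4 < 0.5 else math.cos(r2)
                cand = [x + r1 * osc * abs(r3 * p - x)
                        for x, p in zip(pop[i], pop[g])]
            elif random.random() > 0.5:                  # random h, Step 5
                # Eq. (7) (assumed form): best of the m ring neighbours leads
                nbrs = [(i + k) % n for k in range(-(m // 2), m // 2 + 1)]
                lb_i = min(nbrs, key=lambda j: fit[j])
                cand = [x + random.random() * (p - x)
                        for x, p in zip(pop[i], pop[lb_i])]
            else:
                # Eq. (8) (assumed form): global best individual leads
                cand = [x + random.random() * (p - x)
                        for x, p in zip(pop[i], pop[g])]
            cand = [min(max(c, lb), ub) for c in cand]   # keep within bounds
            fc = f(cand)
            if fc < fit[i]:                              # Eq. (9): greedy selection
                pop[i], fit[i] = cand, fc
        g = min(range(n), key=lambda i: fit[i])          # Step 7: update global best
    return pop[g], fit[g]
```

On the sphere function, for example, `scaeics(lambda x: sum(v*v for v in x), dim=5, lb=-100.0, ub=100.0)` drives the best fitness towards zero.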

4.2.2. Flowchart of the SCAEICS Algorithm

The flowchart of the SCAEICS algorithm is shown in Figure 4.

4.3. Analysis of Algorithm Convergence and Diversity

The convergence properties of an intelligent algorithm largely determine its performance. Therefore, in work on improving intelligent algorithms, many scholars have analyzed convergence in detail. For example, the convergence of the ABC algorithm was analyzed [44] using the relationship between the algorithm's variables and the general solution of the objective optimal solution update equation.
A convergence analysis of the ABC algorithm based on von Neumann stability and convergence conditions was proposed [45], as were a convergence analysis of a PSO algorithm guided by backward learning [46], a convergence analysis of an improved SCA using a population-diversity measure defined by the population's center of gravity [47], and a Markov-chain convergence analysis of an improved SCA [48].
The iterative update of the SCAEICS constitutes a Markov chain of stochastic search. Based on the above literature, the SCAEICS combines the SCA search mechanism with the elite individual-guided search mechanism and converges to the globally optimal solution through the greedy selection strategy. Balancing global exploration and local exploitation is an important evaluation criterion for optimization algorithms, and population diversity plays a central role in this balance, so a diversity analysis of the SCAEICS is of real significance. This article follows the diversity analysis carried out in reference [22]. Owing to the tent chaotic mapping initialization, the population is uniformly distributed and good population diversity is maintained.
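As a concrete illustration of diversity, the mean distance of individuals to the population centroid (one common variant of the population-centre measure used in [47]) can be computed as follows; the function and test points are illustrative, not the paper's exact definition.

```python
import math

def population_diversity(pop):
    """Mean Euclidean distance of the individuals to the population centroid."""
    n, dim = len(pop), len(pop[0])
    centroid = [sum(ind[d] for ind in pop) / n for d in range(dim)]
    return sum(math.sqrt(sum((ind[d] - centroid[d]) ** 2 for d in range(dim)))
               for ind in pop) / n

# A uniformly spread population carries more diversity than a clustered one.
spread = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]]
clustered = [[5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [5.1, 5.1]]
assert population_diversity(spread) > population_diversity(clustered)
```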

5. Simulation Experiments

The experimental environment comprised an Intel(R) Core(TM) i7-10750H CPU @ 2.30 GHz, 16 GB RAM, and the Windows 10 operating system; all algorithms were implemented on the MATLAB R2020b simulation platform.
(I).
Benchmark functions and parameter settings
To analyze the performance of the proposed algorithm, 23 benchmark functions and the CEC2020 benchmark functions were used for the simulation experiments; the function expressions are detailed in the literature [49]. Other parameters of the functions are shown in Table 1 and Table 2: f1 to f7 are unimodal (single-peaked) functions, f8 to f13 are multimodal (multi-peaked) functions, and f14 to f23 are fixed-dimension multimodal functions. A unimodal function has a single extremum in the test dimension and is used to verify the convergence speed and accuracy of the algorithm, whereas a fixed-dimension multimodal function has multiple complex local extrema and is used to verify the global search performance of the algorithm and its ability to escape from local optima. F1 to F10 comprise unimodal, multimodal, new hybrid, and composite functions, and are significant for verifying the performance of the SCAEICS.
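To make the distinction concrete, two representatives can be written out under their usual definitions (an assumption here, since the exact expressions are given in [49]): the unimodal sphere function f1 and the multimodal Rastrigin function f9, both minimized at the origin.

```python
import math

def sphere(x):
    """f1 in the standard 23-function suite: unimodal, minimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f9 in the standard suite: highly multimodal, minimum 0 at the origin."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

assert sphere([0.0] * 30) == 0.0
assert rastrigin([0.0] * 30) == 0.0
assert rastrigin([0.0, 1.0]) > 0.0   # x2 = 1 sits in a nearby local basin
```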
(II).
Parameter settings of other algorithms involved in the following comparison
To verify the effectiveness and superiority of the SCAEICS, the 23 benchmark functions and the CEC2020 benchmark functions were used to compare it with other algorithms. To ensure fair results, all algorithms in the comparison used a population size of N = 30 and a maximum of T = 500 iterations, and each algorithm was run 30 times independently. The parameters of the SCAEICS were set to amax = 2, amin = 0, S = 2, m = 6, and h = 1; the parameters of the other algorithms involved in the comparison are shown in Table 3.

5.1. Comparative Analysis of the SCAEICS with the SCA and Other Intelligent Algorithms

To verify the performance of the SCAEICS, 23 benchmark functions were used to compare it with the SCA and other swarm intelligence algorithms proposed in recent years, which are the whale optimization algorithm (WOA) [1], grey wolf optimization algorithm (GWO) [2], Harris hawks algorithm (HHO) [3], and salp swarm algorithm (SSA) [4]. The parameters of the algorithms involved in the comparison are detailed above, the best values of the comparison results are bolded, and the experimental results are shown in Table 4.
As can be seen from Table 4, the SCAEICS outperforms the SCA on 22 of the 23 functions, the exception being f16. The experimental results show that the SCAEICS achieves a significant optimization performance: for the unimodal functions f1~f7, the accuracy on four functions (f1, f2, f3, and f4) improves by a large margin; for the multimodal functions f8 to f13, three functions (f8, f9, and f10) reach the theoretical optimum; and for the fixed-dimension multimodal functions f14 to f23, five functions (f16, f18, f21, f22, and f23) reach the theoretical optimum. This is because the SCAEICS retains the search strategy of the SCA while adding the m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy; executing the three strategies alternately achieves a collaborative search with good cooperative capability and a desirable balance between global search and local exploitation.
To further analyze the stability and superiority of the SCAEICS, comparative experiments were conducted with other algorithms proposed in recent years, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences between the algorithms. The decision results (+/=/−) indicate the number of functions in which the compared algorithms are “better/equal/worse” than the SCAEICS, respectively. The Wilcoxon rank sum test results show that the SCAEICS outperformed the SCA on 21 functions, the WOA on 19 functions, the GWO on 18 functions, the HHO on 13 functions, and the SSA on 18 functions.
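The rank-sum decisions above can be reproduced with a short, dependency-free routine; the sketch below uses the normal approximation (adequate for 30 runs per algorithm) with average ranks for ties. SciPy's `scipy.stats.ranksums` is an equivalent library alternative.

```python
import math

def ranksum_p(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation."""
    n1, n2 = len(a), len(b)
    pooled = sorted(list(a) + list(b))
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2.0        # mean of ranks i+1 .. j (ties)
        i = j
    w = sum(rank[v] for v in a)                    # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))      # two-sided p-value

# Two clearly separated sets of 30 run results differ at the 5% level;
# identical sets do not.
assert ranksum_p([1e-9] * 30, [1e-3] * 30) < 0.05
assert ranksum_p([1e-9] * 30, [1e-9] * 30) > 0.05
```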
Figure 5 shows the convergence curves, averaged over 30 runs, for six functions selected from the Table 4 comparison experiment: f2, f5, f8, f13, f21, and f23. The vertical coordinate is the logarithm of the average best value of the function, and the horizontal coordinate is the number of iterations.
From the analysis in Figure 5, it can be seen that among the single-peaked dimensional functions: For f2, the SCAEICS converges significantly faster and has a higher optimization accuracy compared to the other five algorithms. For f5, the SCAEICS converges significantly faster than the other five algorithms, starting to converge at about 230 iterations until convergence to the global optimal solution. Among the multi-peaked dimensional functions, the f8 function has difficulty in finding an optimum due to its high number of local optima. For f8, the SCAEICS converges slightly faster compared to the HHO, starting to converge at about 10 iterations until it converges to the global optimum. For f13, the SCAEICS converges faster than the other five algorithms at the beginning of the iteration and starts to converge at about 250 iterations until it converges to the global optimum.
In the fixed multi-peaked dimensional function, for f21, although the convergence rate is slightly lower than that of the HHO at the beginning of the iteration compared to the other five algorithms, the SCAEICS starts to converge ahead of the other five algorithms until the theoretical optimum at about 50 iterations. For f23, the SCAEICS shows significant outperformance in the early iterations compared to the other five algorithms and starts to converge to the theoretical optimum at about 70 iterations. The convergence curves demonstrate the remarkable optimization-seeking performance of the SCAEICS, which reaches global optimality on most functions and has a higher convergence accuracy compared to the SCA and the other algorithms.
In the study of intelligent optimization algorithms, the choice of dimension has a certain influence on the experimental results. Therefore, to verify that the SCAEICS maintains a stable optimization performance in different dimensions, this paper extends the f1~f13 functions to 60 and 100 dimensions with the other parameters unchanged, and uses the Wilcoxon rank sum test with a significance level of 5% to analyze the significant differences on the function optimization problems. The experimental results are shown in Table 5 and Table 6.
As can be seen from Table 5, in the 60-dimensional comparison, the SCAEICS outperforms the other algorithms on 10 functions; on the remaining 3 functions (f9, f10, and f11), its results are comparable with those of the other algorithms. As can be seen from Table 6, the 100-dimensional comparison shows the same pattern: the SCAEICS outperforms the other algorithms on 10 functions and is comparable on f9, f10, and f11. The analysis shows that the SCAEICS achieves a better optimization effect within a given dimension and a stable optimization performance across dimensions.
To analyze the stability and superiority of the SCAEICS in different dimensions, the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences between the SCAEICS and the other algorithms. The decision results (+/=/−) indicate the number of functions on which the compared algorithm is "better than/equal to/worse than" the SCAEICS, respectively. In the 60-dimensional comparison, the test shows that the SCAEICS outperformed the HHO on 9 functions and the SCA, the WOA, the GWO, and the SSA on 13 functions.
In the 100-dimensional comparative experimental results, the SCAEICS outperformed the SCA on 13 functions, the WOA on 12 functions, the GWO on 13 functions, the HHO on 8 functions, and the SSA on 13 functions. The results show that the SCAEICS maintains a stable performance in the search for excellence in different dimensions.

5.2. Comparative Analysis of the SCAEICS with Other Improved Algorithms

To further validate the performance of the SCAEICS, 23 benchmark functions were used to compare it with other improved SCA algorithms: an alternating sine cosine algorithm based on an elite chaotic search strategy (COSCA) [48], a memory-guided sine cosine algorithm for global optimization (MGSCA) [50], a sine cosine algorithm based on differential evolution (SCADE) [51], and a cloud model-based sine cosine algorithm (CSCA) [38]. A comparative analysis was carried out, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences. The algorithm parameters are detailed above; the SCADE data were taken from the literature, while the rest of the data were obtained from our experiments. The best values of the comparison results are bolded, and the experimental results are shown in Table 7.
As can be seen from Table 7, compared with the other improved algorithms, the SCAEICS obtained significantly better optimization results on 18 functions (f1~f13, f16, and f20~f23), and several functions achieved their theoretical optimal values. To analyze the stability and superiority of the SCAEICS more precisely, the Wilcoxon rank sum test with a significance level of 5% was again applied; the meanings of the symbols in the decision results are given above. The test shows that the SCAEICS outperforms the COSCA on 15 functions, the SCADE on 13 functions, the MGSCA on 18 functions, and the CSCA on 19 functions.
In summary, compared to the other improved algorithms, the SCAEICS has a higher optimization accuracy and faster convergence, which indicates that the modified algorithm is able to handle optimization problems under different conditions.

5.3. Comparison and Analysis of the SCAEICS with Other Chaos-Based Algorithms

To further validate the performance of the SCAEICS, the CEC2020 benchmark functions were used to compare it with other chaos-based algorithms: the chaotic manta ray foraging optimization algorithm integrated with chaotic initialization and opposition-based learning (CMRFO) [52], the chaos marine predators algorithm (CMPA) [49], and the sine cosine algorithm (SCA) [21]. The comparative analysis was carried out, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences. The algorithm parameters are detailed above, the best values of the comparison results are bolded, and the experimental results are shown in Table 8.
As can be seen from Table 8, compared with the CMRFO, the SCAEICS obtains significantly better optimization results on six functions (F2, F5, F6, F8, F9, and F10), is comparable on three functions (F3, F4, and F7), and is inferior on the remaining function. Compared with the CMPA, the SCAEICS is significantly better on three functions (F1, F3, and F4), comparable on four functions (F6, F8, F9, and F10), and inferior on the remaining three. Compared with the SCA, the SCAEICS obtains significantly better optimization results on all 10 functions.
To analyze the stability and superiority of the SCAEICS more precisely, the Wilcoxon rank sum test with a significance level of 5% was used to further analyze the significant differences of the SCAEICS, and the meanings of the symbols in the decision results are shown above. The Wilcoxon rank sum test shows that the SCAEICS outperforms the CMRFO algorithm on 6 functions, the CMPA algorithm on 3 functions, and the SCA algorithm on 10 functions.
In summary, compared to the other chaos-based algorithms, the SCAEICS has a higher optimization accuracy and faster convergence, which indicates that the modified algorithm is able to handle optimization problems under different conditions.

5.4. Analysis of Important Parameters

The local population size m of the locally optimal individual lbest is one of the more important parameters in the SCAEICS and has a large impact on the optimization accuracy for different problem functions. Too large a local population size harms the local optimization performance and risks premature convergence of the algorithm, while too small a size has too little effect to achieve the desired performance. Therefore, the local population size m was set to 4, 5, 6, and 7, respectively, and its effect on the algorithm's performance was tested by simulation experiments. In total, 9 of the 23 test functions were used: the unimodal functions f1, f3, and f6, the multimodal functions f8 and f13, and the fixed-dimension multimodal functions f17, f19, f21, and f23. The significant differences among the results were assessed with the Friedman test at a significance level α of 5%: a p-value less than 0.05 indicates a significant difference among the parameter settings, whereas a value greater than 0.05 indicates no significant difference. The results were analyzed with the professional statistical software IBM SPSS Statistics 21; the reported indicators are the mean best value, the standard deviation, the ranking, and the Friedman test p-value.
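The Friedman decision can be reproduced without SPSS; the sketch below computes the Friedman chi-square statistic for the n = 9 functions and k = 4 settings of m, with the p-value then following from the chi-square distribution with k − 1 degrees of freedom. The data layout is an assumption for illustration.

```python
def friedman_statistic(results):
    """Friedman chi-square statistic over n problems and k parameter settings.
    results[i][j] is the mean best value of setting j on function i (lower is
    better); ties within a row receive their average rank.
    Returns (chi2, degrees of freedom)."""
    n, k = len(results), len(results[0])
    rank_sums = [0.0] * k
    for row in results:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j < k and row[order[j]] == row[order[i]]:
                j += 1
            for t in range(i, j):                  # mean of ranks i+1 .. j
                ranks[order[t]] = (i + 1 + j) / 2.0
            i = j
        for j in range(k):
            rank_sums[j] += ranks[j]
    chi2 = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
    return chi2, k - 1

# If one setting of m always ranked first across the 9 functions, the statistic
# would far exceed the 5% chi-square critical value of 7.815 for df = 3.
chi2, df = friedman_statistic([[1, 2, 3, 4]] * 9)
assert df == 3 and chi2 > 7.815
```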
As can be seen from Table 9, the Friedman p-values for the nine functions f1, f3, f6, f8, f13, f17, f19, f21, and f23 are all less than 0.05, indicating significant differences among the four local population size values; the algorithm performance is optimal when the local population size m is set to six, as adopted in this paper.

5.5. Time Complexity Analysis

The time complexity of an algorithm has a large impact on its convergence speed. For the SCA, assume that the population size is N, the problem dimension is D, and the maximum number of iterations is T. The time complexity of the SCA is then given by Equation (10).
O(SCA) = O(N·D·T)
In the SCAEICS, the other parameters are consistent with the SCA. Tent chaotic mapping is used to initialize the population, with a time complexity of O(N·D). The m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy add O(N·T·h·(m−1)/2); since this part involves only comparisons, it has little impact on the overall complexity. The greedy selection strategy, which keeps the better of the current and candidate individuals, adds O(N·T). Therefore, the time complexity of the SCAEICS is given by Equation (11).
O(SCAEICS) ≈ O(N·D·T)
From the above analysis, the time complexity of the SCAEICS is approximately the same as that of the SCA.

6. Applications

In this paper, the proposed SCAEICS was used to solve mechanical design optimization problems and to further validate the feasibility and applicability of the SCAEICS.
(I).
Mechanical design optimization
Mechanical optimization design problems [53] are classical problems in the field of machinery: a mathematical model is built by selecting design variables, an objective function, and constraints. Research on this class of problems therefore has real significance, and their mathematical model can generally be expressed as the constrained optimization problem of Equation (12):
min f(x)
s.t. g_p(x) ≤ 0, p = 1, 2, …, j;
     h_m(x) = 0, m = 1, 2, …, y;
     x_lb ≤ x_i ≤ x_ub, i = 1, 2, …, n;
where x = (x1, x2, …, xn) is the design variable and f(x) is the objective function; gp is the pth inequality constraint; hm is the mth equality constraint; and xub and xlb are the upper and lower bounds of the design variable, respectively.
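The paper does not state how the constraints of Equation (12) are enforced during the search; a common choice when applying a metaheuristic such as the SCAEICS is a static penalty that folds constraint violations into the objective. A minimal sketch under that assumption (the weight and tolerance are illustrative):

```python
def penalized(f, inequality=(), equality=(), weight=1e6, tol=1e-4):
    """Static-penalty wrapper for Equation (12): each g(x) <= 0 in `inequality`
    and each h(x) = 0 in `equality`; violations are added to f(x)."""
    def wrapped(x):
        v = sum(max(0.0, g(x)) ** 2 for g in inequality)
        v += sum(max(0.0, abs(h(x)) - tol) ** 2 for h in equality)
        return f(x) + weight * v
    return wrapped

# Feasible designs are ranked by f alone; infeasible ones are pushed away.
fit = penalized(lambda x: x[0] + x[1], inequality=[lambda x: x[0] - 1.0])
assert fit([0.5, 0.5]) == 1.0              # g = -0.5 <= 0, no penalty
assert fit([2.0, 0.0]) > fit([0.5, 0.5])   # g = 1 > 0, heavily penalized
```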
(II).
Example of mechanical design optimization
To verify the feasibility and practicality of the SCAEICS, it was applied to two problems, cantilever beam optimization [54] and three-bar truss optimization [55], and compared with the whale optimization algorithm, grey wolf optimizer, Harris hawks optimization, salp swarm algorithm, and sine cosine algorithm. To ensure the fairness of the experiment, the parameters were set as follows: population size N = 30 and a maximum of T = 1000 iterations; each algorithm was run 30 times independently and the average value was taken.
(III).
Example of optimized design of a cantilever beam
The optimization objective in the cantilever beam optimization design problem is to make the mass of the rectangular section of the cantilever beam as small as possible, the mathematical expression for which is Equation (13):
min f(x) = 0.0624 Σ_{i=1}^{5} x_i
s.t. 61/x1³ + 37/x2³ + 19/x3³ + 7/x4³ + 1/x5³ ≤ 1,
     0.01 ≤ x_i ≤ 100, i = 1, 2, 3, 4, 5;
where the minimum value of the function f(x) is the minimum mass of the rectangular section of the cantilever beam and the design variable xi denotes the height or width of the ith unit beam.
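Equation (13) is straightforward to evaluate directly; the sketch below encodes the objective and its single constraint and checks them at a near-optimal design widely reported in the literature for this benchmark (the design vector is quoted from that literature, not from this paper's tables).

```python
def cantilever_mass(x):
    """Objective of Equation (13): mass of the stepped cantilever beam."""
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    """Left-hand side of the constraint of Equation (13); feasible if <= 1."""
    coeffs = [61.0, 37.0, 19.0, 7.0, 1.0]
    return sum(c / xi ** 3 for c, xi in zip(coeffs, x))

# A near-optimal design reported for this benchmark, with mass about 1.34.
x = [6.016, 5.309, 4.494, 3.502, 2.153]
assert cantilever_constraint(x) <= 1.01
assert abs(cantilever_mass(x) - 1.34) < 0.01
```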
Table 10 compares the performance of the SCAEICS with the other algorithms on the cantilever beam optimization problem. The optimal solution of the SCAEICS attains the smallest value of f(x) among all algorithms; therefore, the SCAEICS yields the best mass of the rectangular section of the cantilever beam.
(IV).
Example of optimized design of a three-bar truss
The objective of the optimal design of a three-rod truss is to find the optimal three-rod truss volume by adjusting the cross-sectional area. The problem has a non-linear fitness function, three inequality constraints, and two decision optimization variables. The mathematical expression is Equation (14).
min f(x) = (2√2·x1 + x2)·l
s.t. g1(x) = (√2·x1 + x2)/(√2·x1² + 2·x1·x2)·p − σ ≤ 0;
     g2(x) = x2/(√2·x1² + 2·x1·x2)·p − σ ≤ 0;
     g3(x) = 1/(x1 + √2·x2)·p − σ ≤ 0;
     0 ≤ x_i ≤ 1, i = 1, 2;
     l = 100 cm, p = 2 kN/cm², σ = 2 kN/cm²
where the minimum value of the function f(x) is the optimal volume of the three-bar truss, l is the bar length, p is the load acting on the truss, σ is the stress constraint of the truss members, and x1 and x2 are the cross-sectional areas of the side bars, which determine the optimal cross-section.
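Equation (14) can likewise be encoded directly; the sketch below follows the standard formulation of this benchmark and checks the widely reported near-optimal design x ≈ (0.7887, 0.4082), whose volume is about 263.9.

```python
import math

L_BAR, P_LOAD, SIGMA = 100.0, 2.0, 2.0   # l = 100 cm, p = 2 kN/cm^2, sigma = 2 kN/cm^2

def truss_volume(x):
    """Objective of Equation (14): material volume of the three-bar truss."""
    x1, x2 = x
    return (2.0 * math.sqrt(2.0) * x1 + x2) * L_BAR

def truss_constraints(x):
    """Stress constraints g1..g3 of Equation (14); each must be <= 0."""
    x1, x2 = x
    d = math.sqrt(2.0) * x1 * x1 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / d * P_LOAD - SIGMA
    g2 = x2 / d * P_LOAD - SIGMA
    g3 = 1.0 / (x1 + math.sqrt(2.0) * x2) * P_LOAD - SIGMA
    return (g1, g2, g3)

# A design near the known optimum of this benchmark.
x = [0.7887, 0.4082]
assert all(g <= 1e-3 for g in truss_constraints(x))
assert abs(truss_volume(x) - 263.9) < 0.1
```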
As can be seen from Table 11, the optimal solution of the SCAEICS for the three-bar truss optimization problem is 263.9055, which is better than the optimization results of the other algorithms, indicating that the truss volume obtained by the SCAEICS is optimal.

7. Discussion

The sine cosine algorithm for elite individual collaborative search proposed in this paper is a stochastic search algorithm that seeks a suitable solution to the optimization problem through an iterative process. Faster convergence, higher convergence accuracy, and the ability to escape local optima are therefore important indicators for evaluating the merits of the algorithm. In this paper, benchmark functions were used for performance testing of the SCAEICS against other algorithms, and two mechanical optimization design problems were used for practicality testing.

7.1. The Practical Managerial Significance (PMS)

In this section, we discuss the practical implications of the algorithm performance comparisons. Any proposed optimization algorithm has certain advantages, but its effectiveness and potential implications in real applications must also be considered.
Therefore, we compared the SCAEICS with other algorithms on the widely used set of 23 benchmark functions together with the CEC2020 test set. Table 4 shows that the SCAEICS outperforms other optimization algorithms proposed in recent years, and Table 5 and Table 6 show that its optimization performance carries over to different dimensions. Table 7 shows that the SCAEICS outperforms other improved SCA algorithms, and Table 8 shows that it outperforms other chaos-based algorithms. In Table 9, we analyzed the important parameter of the algorithm in detail and obtained its most suitable value.
Furthermore, we used the Wilcoxon rank sum test to compare the algorithms and the Friedman test to analyze the significance of the important parameter; this statistical analysis shows that the SCAEICS has excellent optimization capabilities and is significantly competitive from a theoretical point of view.
In addition, to verify the practical application performance of the SCAEICS, two classical problems in mechanical optimal design problems are used to test the practicality: (1) the cantilever beam optimization problem, (2) the three-bar truss optimization problem. As can be seen from Table 10 and Table 11, the optimal values obtained by the SCAEICS in the cantilever beam optimization problem and the three-bar truss optimization problem are 1.3401 and 263.9055, respectively, which are better than the other algorithms involved in the comparison.
From the analysis of the above-mentioned results, it is clear that the SCAEICS has a superior performance and robustness and has the potential to solve problems existing in other areas. Therefore, the SCAEICS can be effectively applied to specific scenarios in mechanical engineering, road and bridge, agriculture, and water-conservancy construction, as well as other mechanical design fields.

7.2. Open Research Questions (ORQ)

In single-objective optimization problems, the SCAEICS proposed in this paper is significantly competitive. However, multi-objective problems still deserve attention in the field of optimization and the next research will focus on multi-objective optimization problems and solving more practical problems.
Next, researchers can focus on more practical problems such as shop-floor scheduling problems, image processing, text classification, transportation problems, logistic scheduling problems, agricultural water problems, other mechanical optimization design problems, and hyper-parametric optimization problems faced for machine learning and deep learning. Finally, interested researchers can further analyze its performance by improving the method.

8. Conclusions

To improve the optimization performance of the SCA, the problems of slow convergence, low search accuracy, and the tendency to fall into local optima were addressed, and a sine cosine algorithm for the collaborative search of elite individuals was proposed, with the following main improvements.
(1). Tent chaotic mapping was used to initialize the population, and a hyperbolic tangent function non-linearly adjusted the control parameter r1, enhancing the uniformity of the population distribution and balancing the global search and local exploitation performance.
(2). By combining the search strategy of the SCA, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy, the search method of the original algorithm was improved. The above three search strategies were executed alternately to achieve a collaborative search, which effectively improved the convergence accuracy and prevented the algorithm from falling into a local optimum.
(3). A greedy selection strategy was used to select individuals of the population on merit to speed up convergence.
(4). Simulation experiments on 23 basic test functions and CEC2020 functions were conducted to compare the sine cosine algorithm for the collaborative search of elite individuals (SCAEICS) with the SCA, other improved SCA, other chaos-based algorithms, and other intelligent optimization algorithms, and the experimental results show that the SCAEICS had a better optimization performance.
(5). The feasibility and applicability of the SCAEICS were further verified by optimizing two example problems in mechanical design, which could provide new ideas for research in the field of mechanical design.
For future work, we plan to test the improved algorithm using a more novel set of CEC test functions. In addition, we are going to apply the improved algorithm to agricultural water resources for practical applications.

Author Contributions

Conceptualization, J.T. and L.W.; methodology, J.T. and L.W.; software, J.T.; writing—original draft, J.T.; writing—review & editing, L.W.; visualization, J.T.; project administration, L.W.; funding acquisition, L.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Research and Development Program of Gansu Province, grant number 21YF5GA088.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data supporting the reported results are available from the authors upon reasonable request.

Acknowledgments

We are grateful to the anonymous reviewers’ hard work and comments, which allowed us to improve the quality of this paper.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  2. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  3. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar]
  4. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  5. Arasteh, B.; Sadegi, R.; Arasteh, K.; Gunes, P.; Kiani, F.; Torkamanian-Afshar, M. A bioinspired discrete heuristic algorithm to generate the effective structural model of a program source code. J. King Saud Univ. Comput. Inf. Sci. 2023, 35, 101655. [Google Scholar] [CrossRef]
  6. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2023, 39, 2627–2651. [Google Scholar] [CrossRef]
  7. Kiani, F.; Anka, F.A.; Erenel, F. PSCSO: Enhanced sand cat swarm optimization inspired by the political system to solve complex problems. Adv. Eng. Softw. 2023, 178, 103423. [Google Scholar] [CrossRef]
  8. Zhao, W.; Wang, L.; Zhang, Z.; Mirjalili, S.; Khodadadi, N.; Ge, Q. Quadratic Interpolation Optimization (QIO): A new optimization algorithm based on generalized quadratic interpolation and its applications to real-world engineering problems. Comput. Methods Appl. Mech. Eng. 2023, 417, 116446. [Google Scholar] [CrossRef]
  9. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S. An improved grey wolf optimizer for solving engineering problems. Expert Syst. Appl. 2021, 166, 917–940. [Google Scholar] [CrossRef]
  10. Yang, W.; Xia, K.; Fan, S.; Wang, L.; Li, T.; Zhang, J.; Feng, Y. A Multi-Strategy Whale Optimization Algorithm and Its Application. Eng. Appl. Artif. Intell. 2022, 108, 111–139. [Google Scholar] [CrossRef]
  11. Zhang, C.; Han, Y.; Wang, Y.; Li, J.; Gao, K. A Distributed Blocking Flowshop Scheduling with Setup Times Using Multi-Factory Collaboration Iterated Greedy Algorithm. Mathematics 2023, 11, 581. [Google Scholar] [CrossRef]
  12. Zhang, D.; Wang, J.; Fan, H. New method of traffic flow forecasting based on quantum particle swarm optimization strategy for intelligent transportation system. Int. J. Commun. Syst. 2020, 34, e4647.1–e4647.20. [Google Scholar] [CrossRef]
  13. Arasteh, B.; Fatolahzadeh, A.; Kiani, F. Savalan: Multi objective and homogeneous method for software modules clustering. J. Softw. Evol. Process 2022, 34, e2408. [Google Scholar] [CrossRef]
  14. Seyyedabbasi, A.; Kiani, F.; Allahviranloo, T.; Fernandez-Gamiz, U.; Noeiaghdam, S. Optimal data transmission and pathfinding for WSN and decentralized IoT systems using I-GWO and Ex-GWO algorithms. Alex. Eng. J. 2023, 63, 339–357. [Google Scholar] [CrossRef]
  15. Kiani, F.; Seyyedabbasi, A.; Nematzadeh, S. Improving the performance of hierarchical wireless sensor networks using the metaheuristic algorithms: Efficient cluster head selection. Sens. Rev. 2021, 41, 368–381. [Google Scholar] [CrossRef]
  16. Kiani, F.; Randazzo, G.; Yelmen, I.; Seyyedabbasi, A.; Nematzadeh, S.; Anka, F.A.; Erenel, F.; Zontul, M.; Lanza, S.; Muzirafuti, A. A Smart and Mechanized Agricultural Application: From Cultivation to Harvest. Appl. Sci. 2022, 12, 6021. [Google Scholar] [CrossRef]
  17. Singh, N.; Bhatia, O.S. Optimization of Process Parameters in Die Sinking EDM—A Review. Int. J. Sci. Technol. Eng. 2016, 2, 808–813. [Google Scholar]
  18. Jaiswal, A.K.; Siddique, M.H.; Paul, A.R.; Samad, A. Surrogate-based design optimization of a centrifugal pump impeller. Eng. Optim. 2022, 54, 1395–1412. [Google Scholar] [CrossRef]
  19. Shadkam, E. Cuckoo optimization algorithm in reverse logistics: A network design for COVID-19 waste management. Waste Manag. Res. 2022, 40, 458–469. [Google Scholar] [CrossRef]
  20. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  21. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  22. Kiani, F.; Nematzadeh, S.; Anka, F.A.; Findikli, M.A. Chaotic Sand Cat Swarm Optimization. Mathematics 2023, 11, 2340. [Google Scholar] [CrossRef]
  23. Duan, J.; Jiang, Z. Joint Scheduling Optimization of a Short-Term Hydrothermal Power System Based on an Elite Collaborative Search Algorithm. Energies 2022, 15, 4633. [Google Scholar] [CrossRef]
  24. Jiang, Z.; Duan, J.; Xiao, Y.; He, S. Elite collaborative search algorithm and its application in power generation scheduling optimization of cascade reservoirs. J. Hydrol. 2022, 615, 128684. [Google Scholar] [CrossRef]
  25. Wu, L.; Li, Z.; Ge, W.; Zhao, X. An adaptive differential evolution algorithm with elite gaussian mutation and bare-bones strategy. Math. Biosci. Eng. 2022, 19, 8537–8553. [Google Scholar] [CrossRef]
  26. Qsha, C.; Hs, B.; Sas, A.; Ms, A. Q-Learning embedded sine cosine algorithm (QLESCA). Expert Syst. Appl. 2022, 193, 0957–0972. [Google Scholar]
  27. Zhao, Y.Q.; Zou, F.; Chen, D.B. A new bare bone sine cosine algorithm based on neighborhood structure. J. Chang. Norm. Univ. 2019, 88, 16–25. [Google Scholar]
  28. Hussien, G.; Liang, G.; Chen, H.; Lin, H. A double adaptive random spare reinforced sine cosine algorithm. Comput. Model Eng. 2023, 136, 2267–2289. [Google Scholar] [CrossRef]
  29. Chao, Z.; Yi, Y. Improved sine cosine algorithm for large-scale optimization problems. J. Shenzhen Univ. Sci. Eng. 2022, 39, 684–692. [Google Scholar]
  30. Kun, Y.; Qingliang, J.; Zilong, L.; Qin, J.Y. Positioning of characterstic spectral peaks based on improved sine cosine algorithm. Acta Opt. Sin. 2019, 39, 411–417. [Google Scholar]
  31. Elaziz, M.A.; Oliva, D.; Xiong, S. An Improved Opposition-Based Sine Cosine Algorithm for Global Optimization. Expert Syst. Appl. 2017, 90, 484–500. [Google Scholar] [CrossRef]
  32. Rizk-Allah, R.M. An improved sine-cosine algorithm based on orthogonal parallel information for global optimization. Soft Comput. Meth. Appl. 2019, 23, 7135–7161. [Google Scholar] [CrossRef]
  33. Wu, W.; Yuhui, W.; Zhiyun, W. Text feature selection based on sine and cosine algorithm. Comput. Eng. Sci. 2022, 44, 1467–1473. [Google Scholar]
  34. Verma, D.; Soni, J.; Kalathia, D.; Bhattacharjee, K. Sine cosine algorithm for solving economic load dispatch problem with penetration of renewables. Intern. Swarm Intell. Res. 2022, 13, 1–21. [Google Scholar] [CrossRef]
  35. Belazi, A.; Jimenomorenilla, A.; Sanchezromero, J.L. Enhanced parallel sine cosine algorithm for constrained and unconstrained optimization. Mathematics 2022, 10, 1166. [Google Scholar] [CrossRef]
  36. Long, W.; Wu, T.; Liang, X.; Xu, S. Solving high-dimensional global optimization problems using an improved sine cosine algorithm. Expert Syst. Appl. 2019, 123, 108–126. [Google Scholar] [CrossRef]
  37. Li, C.; Luo, Z.; Song, Z.; Yang, F.; Fan, J.; Liu, P.X. An enhanced brain storm sine cosine algorithm for global optimization problems. IEEE Access 2019, 5, 102–113. [Google Scholar] [CrossRef]
  38. Cheng, J.; Duan, Z. Cloud model based sine cosine algorithm for solving optimization problems. Evol. Intell. 2019, 12, 503–514. [Google Scholar] [CrossRef]
  39. Ning, Z.; He, X.; Yang, X.; Zhao, X. Sine cosine algorithm embedded with differential evolution and inertia weight. Trans. Micro. Tech. 2022, 41, 131–135. [Google Scholar]
  40. Guo, G.; Zhang, N. A hybrid multi-objective firefly-sine cosine algorithm for multi-objective optimization problem. Int. J. Comput. Inf. Eng. 2020, 10, 71–82. [Google Scholar]
  41. Dida, H.; Charif, F.; Benchabane, A. Image registration of computed tomography of lung infected with COVID-19 using an improved sine cosine algorithm. Med. Biol. Eng. Comput. 2022, 60, 2521–2535. [Google Scholar] [CrossRef]
  42. Li, Y.; Han, M.; Guo, Q. Modified whale optimization algorithm based on tent chaotic mapping and its application in structural optimization. KSCE J. Civ. Eng. 2020, 24, 3703–3713. [Google Scholar] [CrossRef]
  43. Xiao, F.; Honma, Y.; Kono, T. A simple algebraic interface capturing scheme using hyperbolic tangent function. Int. J. Numer. Methods Fluids 2005, 48, 1023–1040. [Google Scholar] [CrossRef]
  44. Elkhateeb, N.; Badr, R. A novel variable population size artificial bee colony algorithm with convergence analysis for optimal parameter tuning. Int. J. Comput. Intell. Appl. 2017, 16, 175–189. [Google Scholar] [CrossRef]
  45. Bansal, J.C.; Gopal, A.; Nagar, A.K. Stability analysis of Artificial Bee Colony optimization algorithm. Swarm Evol. Comput. 2018, 41, 9–19. [Google Scholar] [CrossRef]
  46. Gandomi, A.H.; Yun, G.J.; Yang, X.-S.; Talatahari, S. Chaos-enhanced accelerated particle swarm optimization. Commun. Nonlinear Sci. Numer. Simul. 2013, 18, 327–340. [Google Scholar] [CrossRef]
  47. Yuanxia, S.; Xuefeng, Z.; Xin, F.; Xiaoyan, W. A multi-scale sine cosine algorithm for optimization problems. Control Decis. 2022, 37, 2860–2868. [Google Scholar] [CrossRef]
  48. Wenyan, G.; Yuan, W.; Fang, D.; Ting, L. Alternating sine cosine algorithm based on elite chaotic search strategy. Control Decis. 2019, 34, 1654–1662. [Google Scholar] [CrossRef]
  49. Kumar, S.; Yildiz, B.S.; Mehta, P.; Panagant, N.; Sait, S.M.; Mirjalili, S.; Yildiz, A.R. Chaotic marine predators algorithm for global optimization of real-world engineering problems. Knowl.-Based Syst. 2023, 261, 110192. [Google Scholar] [CrossRef]
  50. Gupta, S.; Deep, K.; Engelbrecht, A.P. A memory guided sine cosine algorithm for global optimization. Eng. Appl. Artif. Intell. 2020, 93, 476–489. [Google Scholar] [CrossRef]
  51. Xiaojuan, L.; Lianguo, W. A sine cosine algorithm based on differential evolution. Chin. J. Eng. 2020, 42, 1674–1684. [Google Scholar]
  52. Yang, J.; Liu, Z.; Zhang, X.; Hu, G. Elite Chaotic Manta Ray Algorithm Integrated with Chaotic Initialization and Opposition-Based Learning. Mathematics 2022, 10, 2960. [Google Scholar] [CrossRef]
  53. Cong, S. Mechanical optimization design. China’s Foreign Trade 2011, 13, 12–25. [Google Scholar]
  54. Canbaz, B.; Yannou, B.; Yvars, P.A.B.T.-I. A new framework for collaborative set-based design: Application to the design problem of a hollow cylindrical cantilever beam. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Washington, DC, USA, 28–31 August 2011; pp. 1345–1356. [Google Scholar]
  55. Teknillinen, T.; Konstruktiotekniikan, Y.; Tutkimusraportti, L. Multicriterion compliance minimization and stress-constrained minimum weight design of a three-bar truss. Astrophysical 2011, 6, 321–332. [Google Scholar]
Figure 1. Schematic diagram of the sine cosine algorithm. The yellow striped area indicates the global search space, the black striped area indicates the local exploitation space, xi indicates the current individual, and x* indicates the target individual.
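Figure 1 reflects the position update of the original SCA: in each dimension, an individual moves toward the destination (best-so-far) point by a sine-weighted step when the random switch r4 < 0.5 and by a cosine-weighted step otherwise. A minimal sketch of one iteration, using the SCA parameter ranges listed in Table 3, is:

```python
import math
import random

def sca_step(population, best, r1, lb, ub):
    """One SCA position update (Mirjalili's formulation).

    population: list of position vectors; best: destination point;
    r1 controls the search radius (exploration vs. exploitation);
    lb/ub clip every coordinate back into the search box.
    """
    new_pop = []
    for x in population:
        new_x = []
        for j, xj in enumerate(x):
            r2 = random.uniform(0.0, 2.0 * math.pi)   # step direction
            r3 = random.uniform(-2.0, 2.0)            # destination weight (Table 3 range)
            r4 = random.random()                      # sine/cosine switch
            step = r1 * (math.sin(r2) if r4 < 0.5 else math.cos(r2))
            xj_new = xj + step * abs(r3 * best[j] - xj)
            new_x.append(min(max(xj_new, lb), ub))
        new_pop.append(new_x)
    return new_pop
```

Calling sca_step repeatedly with a shrinking r1 reproduces the transition from the global search region to the local exploitation region sketched in the figure.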
Figure 2. Comparison of population initialization: subfigure (a) shows the effect of SCA population initialization, and (b) shows the effect of population initialization after applying tent chaotic mapping.
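The tent-chaotic initialization compared in Figure 2 can be sketched as follows; the breakpoint 0.7 is one common choice in the chaotic-initialization literature and is an assumption here, since the paper's exact map parameters are not reproduced in this excerpt:

```python
def tent_sequence(x0, n, mu=0.7):
    """Generate n chaotic values in [0, 1] with a tent map.

    Breakpoint mu = 0.7 is an assumed common variant; the classic
    tent map uses mu = 0.5.
    """
    seq, x = [], x0
    for _ in range(n):
        x = x / mu if x < mu else (1.0 - x) / (1.0 - mu)
        seq.append(x)
    return seq

def tent_init(pop_size, dim, lb, ub, x0=0.376):
    """Map a tent-chaotic sequence onto the search box [lb, ub]^dim."""
    flat = tent_sequence(x0, pop_size * dim)
    return [[lb + c * (ub - lb) for c in flat[i * dim:(i + 1) * dim]]
            for i in range(pop_size)]
```

The chaotic sequence visits the unit interval more evenly than independent uniform draws over short runs, which is the distribution-uniformity effect the figure illustrates.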
Figure 3. Comparison of the parameter r1 curves.
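Figure 3 contrasts the original linear decay of r1 with the nonlinear hyperbolic-tangent adjustment. The exact SCAEICS schedule is defined in the paper's method section; the tanh form below, with an assumed steepness k, is only illustrative of the behavior the figure shows: slow decay early (exploration), fast decay later (exploitation).

```python
import math

def r1_linear(t, T, a=2.0):
    """Original SCA schedule: r1 falls linearly from a to 0 over T iterations."""
    return a - a * t / T

def r1_tanh(t, T, a=2.0, k=2.5):
    """Illustrative tanh schedule (assumed form, not the paper's exact one)."""
    return a * (1.0 - math.tanh(k * t / T))
```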
Figure 4. The flowchart of the proposed algorithm.
Figure 5. Plot of average convergence curves: (a) f2; (b) f5; (c) f8; (d) f13; (e) f21; (f) f23.
Table 1. Basic test functions.
Functions | Function Name | Dimensionality | Search Space | Theoretical Optimal Value
f1 | Sphere | 30 | [−100,100] | 0
f2 | Schwefel 2.22 | 30 | [−10,10] | 0
f3 | Schwefel 1.2 | 30 | [−100,100] | 0
f4 | Schwefel 2.21 | 30 | [−100,100] | 0
f5 | Rosenbrock | 30 | [−30,30] | 0
f6 | Step | 30 | [−100,100] | 0
f7 | Quartic with noise | 30 | [−1.28,1.28] | 0
f8 | Schwefel 2.26 | 30 | [−500,500] | −12,569.48
f9 | Rastrigin | 30 | [−5.12,5.12] | 0
f10 | Ackley | 30 | [−32,32] | 0
f11 | Griewank | 30 | [−600,600] | 0
f12 | Penalized 1 | 30 | [−50,50] | 0
f13 | Penalized 2 | 30 | [−50,50] | 0
f14 | Shekel foxholes | 2 | [−65,65] | 1
f15 | Kowalik | 4 | [−5,5] | 0.0003075
f16 | Six-hump camel back | 2 | [−5,5] | −1.0316
f17 | Branin | 2 | [−5,5] | 0.398
f18 | Goldstein–Price | 2 | [0,2] | 3
f19 | Hartman 1 | 3 | [0,1] | −3.86
f20 | Hartman 2 | 6 | [0,1] | −3.32
f21 | Shekel_5 | 4 | [0,10] | −10.1532
f22 | Shekel_7 | 4 | [0,10] | −10.4028
f23 | Shekel_10 | 4 | [0,10] | −10.5363
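Several of the Table 1 functions have simple closed forms; the sketches below (Sphere, Rastrigin, Ackley, and Griewank) can be used to reproduce the unimodal/multimodal test setup:

```python
import math

def sphere(x):                      # f1, optimum 0 at the origin
    return sum(v * v for v in x)

def rastrigin(x):                   # f9, optimum 0 at the origin
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):                      # f10, optimum 0 at the origin
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

def griewank(x):                    # f11, optimum 0 at the origin
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0
```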
Table 2. CEC2020 benchmark functions.
Type | Functions | Function Name | Dimensionality | Search Space | Theoretical Optimal Value
Unimodal Function | F1 | Shifted and Rotated Bent Cigar Function | 10 | [−100,100] | 100
Basic Functions | F2 | Shifted and Rotated Schwefel's Function | 10 | [−100,100] | 1100
Basic Functions | F3 | Shifted and Rotated Lunacek bi-Rastrigin Function | 10 | [−100,100] | 700
Basic Functions | F4 | Expanded Rosenbrock's plus Griewangk's Function | 10 | [−100,100] | 1900
Hybrid Functions | F5 | Hybrid Function 1 | 10 | [−100,100] | 1700
Hybrid Functions | F6 | Hybrid Function 2 | 10 | [−100,100] | 1600
Hybrid Functions | F7 | Hybrid Function 3 | 10 | [−100,100] | 2100
Composition Functions | F8 | Composition Function 1 | 10 | [−100,100] | 2200
Composition Functions | F9 | Composition Function 2 | 10 | [−100,100] | 2400
Composition Functions | F10 | Composition Function 3 | 10 | [−100,100] | 2500
Table 3. Table of parameter settings for the participation comparison algorithm.
Algorithms | Parameter Settings
WOA | a ∈ [0,2]; b = 1; l ∈ [−2,1]
GWO | a ∈ [0,2]; r1 ∈ [0,1]; r2 ∈ [0,1]
HHO | E0 ∈ [−1,1]; E ∈ [0,1]; r ∈ [0,1]; u ∈ [0,1]; v ∈ [0,1]; β = 1.5
SSA | C1 ∈ [2,0]; C2 ∈ [0,1]; C3 ∈ [0,1]
SCA | r2 ∈ [0,2π]; r3 ∈ [−2,2]; r4 ∈ [0,1]; a = 2
COSCA | η = 1; astart = 1; aend = 0; pr = 0.1
SCADE | nlim = 50; cr = 0.3; kmax = 3; h = 10; δ2max = 0.6; δ2min = 0.0001; a = 2
MGSCA | r2 ∈ [0,2π]; r3 ∈ [−2,2]; r4 ∈ [0,1]; a = 2
CSCA | r2 ∈ [0,2π]; r3 ∈ [−2,2]; r4 ∈ [0,1]
CMRFO | S = 2; p = 0.1
CMPA | r ∈ [0,1]; p = 0.5; v = 0.1; u ∈ [0,1]; FADs = 0.2
Table 4. Comparison of the SCAEICS with the SCA and other intelligent algorithms.
Function | Evaluation Criterion | WOA | GWO | HHO | SSA | SCA | SCAEICS
f1 | Ave | 2.02 × 10^−72 | 1.39 × 10^−27 | 3.95 × 10^−96 | 1.70 × 10^−7 | 8.10 × 10^1 | 9.72 × 10^−232
f1 | Std | 1.09 × 10^−71 | 1.59 × 10^−27 | 1.66 × 10^−95 | 2.11 × 10^−7 | 1.15 × 10^2 | 0
f2 | Ave | 2.16 × 10^−50 | 1.37 × 10^−16 | 9.15 × 10^−51 | 1.85 | 1.76 × 10^−1 | 3.10 × 10^−123
f2 | Std | 1.10 × 10^−49 | 8.53 × 10^−17 | 3.99 × 10^−50 | 1.59 | 2.57 × 10^−1 | 1.69 × 10^−122
f3 | Ave | 4.47 × 10^4 | 3.16 × 10^−5 | 2.85 × 10^−80 | 1.57 × 10^3 | 1.42 × 10^4 | 3.71 × 10^−230
f3 | Std | 1.53 × 10^4 | 1.44 × 10^−4 | 1.20 × 10^−79 | 1.15 × 10^3 | 7.42 × 10^3 | 0
f4 | Ave | 4.64 × 10^1 | 9.54 × 10^−7 | 2.51 × 10^−50 | 1.06 × 10^1 | 4.80 × 10^1 | 1.45 × 10^−124
f4 | Std | 2.86 × 10^1 | 1.20 × 10^−6 | 7.44 × 10^−50 | 3.62 | 1.01 × 10^1 | 7.73 × 10^−124
f5 | Ave | 2.80 × 10^1 | 2.72 × 10^1 | 1.19 × 10^−2 | 1.13 × 10^2 | 3.43 × 10^5 | 3.29 × 10^−8
f5 | Std | 4.50 × 10^−1 | 7.31 × 10^−1 | 1.69 × 10^−2 | 1.41 × 10^2 | 5.99 × 10^5 | 3.77 × 10^−8
f6 | Ave | 3.17 × 10^−1 | 7.43 × 10^−1 | 9.25 × 10^−5 | 2.55 × 10^−7 | 1.04 × 10^2 | 2.62 × 10^−10
f6 | Std | 1.82 × 10^−1 | 4.07 × 10^−1 | 1.73 × 10^−4 | 3.82 × 10^−7 | 1.45 × 10^2 | 1.12 × 10^−9
f7 | Ave | 3.10 × 10^−3 | 2.06 × 10^−3 | 1.66 × 10^−4 | 1.68 × 10^−1 | 3.02 × 10^−1 | 7.65 × 10^−6
f7 | Std | 4.58 × 10^−3 | 1.36 × 10^−3 | 1.75 × 10^−4 | 5.77 × 10^−2 | 4.11 × 10^−1 | 7.67 × 10^−6
f8 | Ave | −10,672.1305 | −6090.7165 | −12,554.3583 | −6374.5963 | −3785.7654 | −12,569.4865
f8 | Std | 1704.9911 | 887.8937 | 6.2766 | 743.4421 | 273.8353 | 0.00029551
f9 | Ave | 0 | 2.30 | 0 | 6.18 × 10^1 | 5.28 × 10^1 | 0
f9 | Std | 0 | 5.34 | 0 | 1.61 × 10^1 | 4.14 × 10^1 | 0
f10 | Ave | 4.44 × 10^−15 | 1.03 × 10^−13 | 8.88 × 10^−16 | 2.57 | 1.62 × 10^1 | 8.88 × 10^−16
f10 | Std | 2.09 × 10^−15 | 2.02 × 10^−14 | 0 | 8.19 × 10^−1 | 7.34 | 0
f11 | Ave | 1.11 × 10^−2 | 6.80 × 10^−3 | 0 | 1.57 × 10^−2 | 1.60 | 0
f11 | Std | 6.07 × 10^−2 | 1.31 × 10^−2 | 0 | 1.32 × 10^−2 | 6.49 × 10^−1 | 0
f12 | Ave | 1.71 × 10^−2 | 4.55 × 10^−2 | 6.33 × 10^−6 | 6.64 | 6.12 × 10^5 | 1.45 × 10^−9
f12 | Std | 7.51 × 10^−3 | 2.28 × 10^−2 | 8.21 × 10^−6 | 2.95 | 1.86 × 10^6 | 2.97 × 10^−9
f13 | Ave | 4.94 × 10^−1 | 7.19 × 10^−1 | 1.02 × 10^−4 | 2.00 × 10^1 | 1.44 × 10^6 | 1.69 × 10^−10
f13 | Std | 2.51 × 10^−1 | 2.21 × 10^−1 | 1.19 × 10^−4 | 1.64 × 10^1 | 2.02 × 10^6 | 3.02 × 10^−10
f14 | Ave | 2.57 | 4.98 | 1.33 | 1.16 | 1.33 | 9.98 × 10^−1
f14 | Std | 2.99 | 4.53 | 9.47 × 10^−1 | 4.58 × 10^−1 | 7.50 × 10^−1 | 1.06 × 10^−6
f15 | Ave | 0.00071818 | 0.0024681 | 0.00034688 | 0.00099878 | 0.001211 | 0.00033444
f15 | Std | 0.00054231 | 0.0060734 | 3.251 × 10^−5 | 0.00028419 | 0.00034186 | 1.84 × 10^−4
f16 | Ave | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316
f16 | Std | 8.63 × 10^−10 | 2.82 × 10^−8 | 8.49 × 10^−10 | 3.06 × 10^−14 | 9.52 × 10^−5 | 4.45 × 10^−2
f17 | Ave | 0.39789 | 0.39789 | 0.39789 | 0.39789 | 0.4016 | 0.39789
f17 | Std | 1.22 × 10^−5 | 2.14 × 10^−3 | 7.01 × 10^−6 | 2.16 × 10^−3 | 2.20 × 10^−3 | 8.47 × 10^−5
f18 | Ave | 3.0001 | 3 | 3 | 3.0005 | 3.0003 | 3
f18 | Std | 4.93 | 5.05 × 10^−5 | 6.38 × 10^−7 | 5.01 × 10^−4 | 3.71 × 10^−4 | 6.49 × 10^−7
f19 | Ave | −3.8582 | −3.8615 | −3.8595 | −3.8626 | −3.8536 | −3.8633
f19 | Std | 5.88 × 10^−2 | 2.72 × 10^−3 | 2.70 × 10^−3 | 2.59 × 10^−4 | 3.10 × 10^−3 | 9.69 × 10^−3
f20 | Ave | −3.1983 | −3.2541 | −3.133 | −3.2072 | −2.8409 | −3.3263
f20 | Std | 9.35 × 10^−2 | 7.38 × 10^−2 | 1.01 × 10^−1 | 7.48 × 10^−2 | 2.46 × 10^−1 | 1.66 × 10^−1
f21 | Ave | −8.4313 | −8.6048 | −5.0507 | −8.3476 | −1.8336 | −10.153
f21 | Std | 2.27 | 2.46 | 1.24 | 3.16 | 1.63 | 1.13 × 10^−3
f22 | Ave | −8.1586 | −9.6926 | −5.0821 | −9.4209 | −3.4597 | −10.4024
f22 | Std | 3.23 | 1.62 | 1.58 | 2.04 | 1.81 | 3.95 × 10^−4
f23 | Ave | −8.4151 | −10.0841 | −5.2933 | −9.3126 | −3.8672 | −10.5362
f23 | Std | 3.14 | 9.79 × 10^−1 | 9.83 × 10^−1 | 2.40 | 1.03 | 5.24 × 10^−4
Decision result | +/=/− | 2/2/19 | 2/3/18 | 3/7/13 | 3/2/18 | 1/1/21 | --
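The "+/=/−" row counts on how many functions each competitor is significantly better than, statistically equal to, or worse than the SCAEICS. Such counts are typically produced with a rank-sum test at the 5% significance level; since this excerpt does not state the paper's exact test, the following is a generic sketch using a normal approximation with midranks for ties:

```python
import math

def _midranks(values):
    """Assign ranks to values, averaging ranks across ties (midranks)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def rank_sum_decision(a, b, alpha=0.05):
    """Compare two samples of final objective values (minimization).

    Returns '+' if sample a is significantly better (lower) than b,
    '-' if significantly worse, '=' otherwise, using the Wilcoxon
    rank-sum statistic with a normal approximation.
    """
    n1, n2 = len(a), len(b)
    ranks = _midranks(list(a) + list(b))
    r1 = sum(ranks[:n1])
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (r1 - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value
    if p >= alpha:
        return '='
    return '+' if z < 0 else '-'
```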
Table 5. Comparison of the SCAEICS with the SCA and other intelligent algorithms (D = 60).
Function | Evaluation Criterion | WOA | GWO | HHO | SSA | SCA | SCAEICS
f1 | Ave | 8.79 × 10^−71 | 1.03 × 10^−27 | 1.20 × 10^−94 | 3.02 × 10^2 | 7.82 × 10^1 | 4.79 × 10^−224
f1 | Std | 3.71 × 10^−70 | 1.68 × 10^−27 | 6.53 × 10^−94 | 8.75 × 10^1 | 1.24 × 10^2 | 0
f2 | Ave | 1.29 × 10^−50 | 9.79 × 10^−17 | 2.57 × 10^−51 | 1.22 × 10^1 | 2.06 × 10^−1 | 4.23 × 10^−127
f2 | Std | 4.22 × 10^−50 | 5.86 × 10^−17 | 8.54 × 10^−51 | 1.97 | 4.10 × 10^−1 | 1.11 × 10^−126
f3 | Ave | 4.33 × 10^4 | 4.22 × 10^−6 | 9.12 × 10^−70 | 6.58 × 10^3 | 1.56 × 10^4 | 4.11 × 10^−234
f3 | Std | 1.36 × 10^4 | 1.08 × 10^−5 | 4.99 × 10^−69 | 2.85 × 10^3 | 6.59 × 10^3 | 0
f4 | Ave | 5.10 × 10^1 | 6.11 × 10^−7 | 2.04 × 10^−48 | 1.84 × 10^1 | 4.61 × 10^1 | 3.40 × 10^−125
f4 | Std | 2.94 × 10^1 | 6.59 × 10^−7 | 9.61 × 10^−48 | 3.64 | 1.27 × 10^1 | 1.84 × 10^−124
f5 | Ave | 2.80 × 10^1 | 2.68 × 10^1 | 1.46 × 10^−2 | 2.62 × 10^4 | 5.35 × 10^5 | 3.77 × 10^−8
f5 | Std | 3.90 × 10^−1 | 5.50 × 10^−1 | 1.64 × 10^−2 | 2.91 × 10^4 | 1.70 × 10^6 | 7.36 × 10^−10
f6 | Ave | 3.84 × 10^−1 | 8.13 × 10^−1 | 9.96 × 10^−5 | 3.20 × 10^2 | 1.29 × 10^2 | 2.98 × 10^−9
f6 | Std | 2.72 × 10^−1 | 3.82 × 10^−1 | 1.44 × 10^−4 | 1.14 × 10^2 | 1.74 × 10^2 | 5.37 × 10^−9
f7 | Ave | 2.56 × 10^−3 | 1.88 × 10^−3 | 1.64 × 10^−4 | 3.31 × 10^−1 | 2.16 × 10^−1 | 9.11 × 10^−5
f7 | Std | 2.79 × 10^−3 | 1.22 × 10^−3 | 1.40 × 10^−4 | 1.08 × 10^−1 | 4.31 × 10^−1 | 8.65 × 10^−5
f8 | Ave | −9016.3152 | −6709.0284 | −12,569.3642 | −6699.3213 | −3794.8247 | −12,569.4851
f8 | Std | 1.69 × 10^3 | 8.88 × 10^2 | 5.88 × 10^2 | 7.95 × 10^2 | 2.45 × 10^2 | 7.73 × 10^−4
f9 | Ave | 1.89 × 10^−15 | 2.21 | 0 | 1.33 × 10^2 | 5.44 × 10^1 | 0
f9 | Std | 1.04 × 10^−14 | 3.79 | 0 | 2.50 × 10^1 | 4.35 × 10^1 | 0
f10 | Ave | 3.85 × 10^−15 | 9.94 × 10^−14 | 8.88 × 10^−16 | 6.37 | 1.42 × 10^1 | 8.88 × 10^−16
f10 | Std | 2.81 × 10^−15 | 1.96 × 10^−14 | 0 | 9.36 × 10^−1 | 8.09 | 0
f11 | Ave | 5.91 × 10^−3 | 4.42 × 10^−3 | 0 | 3.66 | 1.81 | 0
f11 | Std | 3.24 × 10^−2 | 8.23 × 10^−3 | 0 | 8.74 × 10^−1 | 1.10 | 0
f12 | Ave | 2.25 × 10^−2 | 3.74 × 10^−2 | 8.11 × 10^−6 | 3.61 × 10^1 | 1.94 × 10^6 | 3.69 × 10^−10
f12 | Std | 1.78 × 10^−2 | 2.04 × 10^−2 | 9.99 × 10^−6 | 3.92 × 10^1 | 9.39 × 10^6 | 1.01 × 10^−9
f13 | Ave | 5.56 × 10^−1 | 5.72 × 10^−1 | 7.79 × 10^−5 | 2.73 × 10^3 | 1.57 × 10^6 | 7.07 × 10^−9
f13 | Std | 2.20 × 10^−1 | 1.77 × 10^−1 | 7.82 × 10^−5 | 1.05 × 10^4 | 2.78 × 10^6 | 1.21 × 10^−8
Decision result | +/=/− | 0/0/13 | 0/0/13 | 1/3/9 | 0/0/13 | 0/0/13 | --
Table 6. Comparison of the SCAEICS with the SCA and other intelligent algorithms (D = 100).
Function | Evaluation Criterion | WOA | GWO | HHO | SSA | SCA | SCAEICS
f1 | Ave | 9.88 × 10^−74 | 7.23 × 10^−28 | 8.49 × 10^−96 | 3.18 × 10^2 | 9.23 × 10^1 | 2.56 × 10^−216
f1 | Std | 4.30 × 10^−73 | 9.27 × 10^−28 | 4.64 × 10^−95 | 1.14 × 10^2 | 1.42 × 10^2 | 0
f2 | Ave | 2.23 × 10^−51 | 1.13 × 10^−16 | 7.58 × 10^−51 | 1.23 × 10^1 | 1.57 × 10^−1 | 2.77 × 10^−123
f2 | Std | 6.13 × 10^−51 | 6.47 × 10^−17 | 3.98 × 10^−50 | 3.13 | 2.01 × 10^−1 | 1.38 × 10^−122
f3 | Ave | 4.72 × 10^4 | 3.15 × 10^−5 | 2.58 × 10^−70 | 5.94 × 10^3 | 1.33 × 10^4 | 1.32 × 10^−238
f3 | Std | 1.24 × 10^4 | 9.29 × 10^−5 | 1.41 × 10^−69 | 2.46 × 10^3 | 7.85 × 10^3 | 0
f4 | Ave | 5.35 × 10^1 | 6.68 × 10^−7 | 5.78 × 10^−49 | 1.87 × 10^1 | 4.46 × 10^1 | 1.28 × 10^−123
f4 | Std | 2.48 × 10^1 | 5.59 × 10^−7 | 2.22 × 10^−48 | 3.54 | 1.29 × 10^1 | 6.96 × 10^−123
f5 | Ave | 2.81 × 10^1 | 2.69 × 10^1 | 1.02 × 10^−2 | 2.59 × 10^4 | 2.66 × 10^5 | 1.52 × 10^−7
f5 | Std | 5.13 × 10^−1 | 6.93 × 10^−1 | 1.11 × 10^−2 | 2.22 × 10^4 | 4.40 × 10^5 | 2.04 × 10^−7
f6 | Ave | 3.32 × 10^−1 | 7.66 × 10^−1 | 1.46 × 10^−4 | 3.02 × 10^2 | 8.19 × 10^1 | 7.83 × 10^−9
f6 | Std | 2.07 × 10^−1 | 3.71 × 10^−1 | 2.17 × 10^−4 | 7.90 × 10^1 | 1.24 × 10^2 | 1.99 × 10^−8
f7 | Ave | 3.19 × 10^−3 | 2.05 × 10^−3 | 1.56 × 10^−4 | 3.25 × 10^−1 | 3.47 × 10^−1 | 8.38 × 10^−5
f7 | Std | 4.50 × 10^−3 | 8.65 × 10^−4 | 1.62 × 10^−4 | 1.43 × 10^−1 | 3.00 × 10^−1 | 6.56 × 10^−5
f8 | Ave | −12,507.8053 | −5308.7488 | −12,568.5282 | −5959.9149 | −4434.8286 | −12,569.4857
f8 | Std | 1.61 × 10^3 | 9.96 × 10^2 | 4.08 × 10^1 | 8.14 × 10^2 | 2.66 × 10^2 | 2.63 × 10^−4
f9 | Ave | 0 | 2.55 | 0 | 1.30 × 10^2 | 6.79 × 10^1 | 0
f9 | Std | 0 | 3.34 | 0 | 2.07 × 10^1 | 4.15 × 10^1 | 0
f10 | Ave | 4.32 × 10^−15 | 1.08 × 10^−13 | 8.88 × 10^−16 | 6.37 | 1.60 × 10^1 | 8.88 × 10^−16
f10 | Std | 2.72 × 10^−15 | 2.07 × 10^−14 | 0 | 8.91 × 10^−1 | 7.21 | 0
f11 | Ave | 1.25 × 10^−2 | 3.06 × 10^−3 | 0 | 3.60 | 1.89 | 0
f11 | Std | 4.79 × 10^−2 | 5.85 × 10^−3 | 0 | 9.60 × 10^−1 | 1.63 | 0
f12 | Ave | 1.86 × 10^−2 | 5.16 × 10^−2 | 1.14 × 10^−5 | 2.27 × 10^1 | 4.08 × 10^5 | 8.64 × 10^−10
f12 | Std | 1.34 × 10^−2 | 3.00 × 10^−2 | 1.85 × 10^−5 | 1.33 × 10^1 | 1.16 × 10^6 | 1.97 × 10^−9
f13 | Ave | 5.31 × 10^−1 | 5.71 × 10^−1 | 1.04 × 10^−4 | 1.23 × 10^3 | 1.81 × 10^6 | 6.16 × 10^−9
f13 | Std | 2.90 × 10^−1 | 2.24 × 10^−1 | 1.41 × 10^−4 | 2.63 × 10^3 | 3.87 × 10^6 | 1.39 × 10^−8
Decision result | +/=/− | 0/1/12 | 0/0/13 | 1/4/8 | 0/0/13 | 0/0/13 | --
Table 7. Comparison table of the SCAEICS with other improved algorithms.
Function | Evaluation Criterion | COSCA | SCADE | MGSCA | CSCA | SCAEICS
f1 | Ave | 2.44 × 10^−78 | 9.58 × 10^−95 | 7.62 × 10^−23 | 7.49 × 10^−2 | 9.72 × 10^−232
f1 | Std | 3.21 × 10^−94 | 4.92 × 10^−94 | 1.67 × 10^−22 | 1.93 × 10^−1 | 0
f2 | Ave | 1.52 × 10^−44 | 6.14 × 10^−63 | 1.92 × 10^−17 | 5.09 × 10^−7 | 3.10 × 10^−123
f2 | Std | 1.94 × 10^−60 | 2.73 × 10^−62 | 4.63 × 10^−17 | 8.73 × 10^−7 | 1.69 × 10^−122
f3 | Ave | 1.78 × 10^−15 | 1.93 × 10^−4 | 2.80 × 10^−3 | 5.83 × 10^3 | 3.71 × 10^−230
f3 | Std | 1.45 × 10^−30 | 9.81 × 10^−4 | 8.78 × 10^−3 | 5.29 × 10^3 | 0
f4 | Ave | 5.27 × 10^−35 | 2.85 × 10^−9 | 8.11 × 10^−3 | 1.26 × 10^1 | 1.45 × 10^−124
f4 | Std | 1.91 × 10^−50 | 1.53 × 10^−8 | 2.34 × 10^−2 | 9.50 | 7.73 × 10^−124
f5 | Ave | 2.84 × 10^1 | 2.69 × 10^1 | 2.75 × 10^1 | 4.17 × 10^3 | 3.29 × 10^−8
f5 | Std | 7.94 × 10^−16 | 1.47 × 10^−1 | 7.07 × 10^−1 | 1.80 × 10^4 | 3.77 × 10^−8
f6 | Ave | 3.82 | 7.54 × 10^−5 | 1.39 | 5.09 | 2.62 × 10^−10
f6 | Std | 7.94 × 10^−16 | 9.46 × 10^−5 | 5.59 × 10^−1 | 9.42 × 10^−1 | 1.12 × 10^−9
f7 | Ave | 3.21 × 10^−4 | 8.44 × 10^−3 | 3.87 × 10^−3 | 7.74 × 10^−2 | 7.65 × 10^−6
f7 | Std | 6.06 × 10^−21 | 7.37 × 10^−3 | 2.65 × 10^−3 | 5.34 × 10^−2 | 7.67 × 10^−6
f8 | Ave | −3.31 × 10^3 | −1.20 × 10^4 | −6.36 × 10^3 | −3.37 × 10^3 | −12,569.4865
f8 | Std | 2.64 × 10^−12 | 2.53 × 10^2 | 6.41 × 10^2 | 2.95 × 10^2 | 0.00029551
f9 | Ave | 0 | 0 | 2.86 × 10^−1 | 4.63 × 10^1 | 0
f9 | Std | 0 | 0 | 8.88 × 10^−1 | 4.93 × 10^1 | 0
f10 | Ave | 2.48 × 10^−15 | 2.13 × 10^−15 | 7.39 | 2.89 × 10^−2 | 8.88 × 10^−16
f10 | Std | 7.05 × 10^−31 | 1.76 × 10^−15 | 9.88 | 7.60 × 10^−2 | 0
f11 | Ave | 0 | 0 | 1.01 × 10^−2 | 4.78 × 10^−1 | 0
f11 | Std | 0 | 0 | 1.85 × 10^−2 | 3.69 × 10^−1 | 0
f12 | Ave | 3.68 × 10^−1 | 3.45 × 10^−5 | 1.00 × 10^−1 | 9.39 | 1.45 × 10^−9
f12 | Std | 1.73 × 10^−16 | 1.66 × 10^−4 | 4.78 × 10^−2 | 4.22 × 10^1 | 2.97 × 10^−9
f13 | Ave | 2.04 | 8.13 × 10^−3 | 1.46 | 6.35 × 10^2 | 1.69 × 10^−10
f13 | Std | 1.20 × 10^−15 | 2.29 × 10^−2 | 3.20 × 10^−1 | 3.21 × 10^3 | 3.02 × 10^−10
f14 | Ave | 3.56 | 9.98 × 10^−1 | 1.13 | 2.26 | 9.98 × 10^−1
f14 | Std | 5.95 × 10^−16 | 5.00 × 10^−16 | 5.03 × 10^−1 | 2.48 | 1.06 × 10^−6
f15 | Ave | 7.87 × 10^−4 | 7.52 × 10^−4 | 0.0006979 | 0.0006694 | 0.00033444
f15 | Std | 7.75 × 10^−19 | 1.54 × 10^−4 | 3.45 × 10^−4 | 2.44 × 10^−4 | 1.84 × 10^−4
f16 | Ave | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316
f16 | Std | 1.09 × 10^−16 | 4.44 × 10^−16 | 2.04 × 10^−8 | 3.05 × 10^−5 | 4.45 × 10^−2
f17 | Ave | 0.39789 | 0.39789 | 0.39789 | 0.4003 | 0.39789
f17 | Std | 0 | 0 | 4.36 × 10^−6 | 2.84 × 10^−3 | 8.47 × 10^−5
f18 | Ave | 3 | 3 | 3 | 3.0001 | 3
f18 | Std | 7.94 × 10^−16 | 3.18 × 10^−7 | 4.99 × 10^−6 | 1.19 × 10^−4 | 6.49 × 10^−7
f19 | Ave | −3.8589 | −3.8628 | −3.8585 | −3.8572 | −3.8633
f19 | Std | 1.39 × 10^−15 | 7.64 × 10^−13 | 3.85 × 10^−3 | 3.33 × 10^−3 | 9.69 × 10^−3
f20 | Ave | −3.1561 | −3.3119 | −3.1134 | −3.1229 | −3.3263
f20 | Std | 9.93 × 10^−16 | 3.05 × 10^−2 | 1.80 × 10^−1 | 7.41 × 10^−2 | 1.66 × 10^−1
f21 | Ave | −9.8534 | −9.7526 | −7.3713 | −4.2256 | −10.153
f21 | Std | 6.35 × 10^−15 | 8.91 × 10^−1 | 2.91 | 1.14 | 1.13 × 10^−3
f22 | Ave | −10.3208 | −10.4029 | −8.3904 | −4.4632 | −10.4024
f22 | Std | 4.36 × 10^−15 | 1.45 | 3.21 | 8.68 × 10^−1 | 3.95 × 10^−4
f23 | Ave | −10.4821 | −10.5364 | −8.7732 | −4.5667 | −10.5362
f23 | Std | 3.97 × 10^−15 | 3.45 × 10^−14 | 3.04 | 1.34 | 5.24 × 10^−4
Decision result | +/=/− | 2/6/15 | 2/8/13 | 2/3/18 | 2/2/19 | --
Table 8. Comparison table of the SCAEICS with other chaos-based algorithms.
Function | Evaluation Criterion | CMRFO | CMPA | SCA | SCAEICS
F1 | Ave | 2.56 × 10^3 | 1.732 × 10^6 | 5.45 × 10^10 | 1.35 × 10^4
F1 | Std | 7.12 × 10^6 | 1.178 × 10^6 | 6.69 × 10^19 | 3.79 × 10^5
F2 | Ave | 8.57 × 10^3 | 2.191 × 10^3 | 1.52 × 10^4 | 4.30 × 10^3
F2 | Std | 9.48 × 10^5 | 2.191 × 10^3 | 1.80 × 10^5 | 2.18 × 10^2
F3 | Ave | 1.51 × 10^3 | 2.191 × 10^3 | 1.76 × 10^3 | 1.67 × 10^3
F3 | Std | 6.04 × 10^4 | 2.191 × 10^3 | 9.22 × 10^3 | 1.98 × 10^3
F4 | Ave | 1.90 × 10^3 | 2.191 × 10^3 | 2.04 × 10^3 | 1.90 × 10^3
F4 | Std | 0 | 6.347 × 10^1 | 1.75 × 10^4 | 1.62
F5 | Ave | 4.10 × 10^5 | 2.066 × 10^3 | 7.65 × 10^7 | 2.94 × 10^3
F5 | Std | 4.32 × 10^10 | 8.949 × 10^1 | 1.60 × 10^15 | 1.34 × 10^3
F6 | Ave | 3.36 × 10^3 | 1.601 × 10^3 | 6.34 × 10^3 | 1.60 × 10^3
F6 | Std | 2.07 × 10^5 | 1.601 × 10^3 | 4.45 × 10^5 | 1.10 × 10^1
F7 | Ave | 2.36 × 10^5 | 1.601 × 10^3 | 2.06 × 10^7 | 2.53 × 10^5
F7 | Std | 1.79 × 10^10 | 6.108 × 10^1 | 7.40 × 10^13 | 4.33 × 10^4
F8 | Ave | 8.85 × 10^3 | 2.295 × 10^3 | 1.68 × 10^4 | 2.31 × 10^3
F8 | Std | 9.47 × 10^6 | 2.472 × 10^1 | 2.79 × 10^5 | 4.63 × 10^1
F9 | Ave | 3.30 × 10^3 | 2.575 × 10^3 | 3.80 × 10^3 | 2.61 × 10^3
F9 | Std | 1.07 × 10^4 | 7.444 × 10^1 | 4.87 × 10^3 | 4.79 × 10^1
F10 | Ave | 3.06 × 10^3 | 2.897 × 10^3 | 7.43 × 10^3 | 2.87 × 10^3
F10 | Std | 1.07 × 10^3 | 2.338 × 10^1 | 5.82 × 10^5 | 1.05 × 10^1
Decision result | +/=/− | 1/3/6 | 3/4/3 | 0/0/10 | --
Table 9. Effect of local population size m on the performance of the SCAEICS.
f1 (p = 3.3074 × 10^−76)
m | Average Optimal Value | Standard Deviation | Rank
4 | 7.60 × 10^−232 | 0 | 1
5 | 2.06 × 10^−181 | 0 | 3
6 | 5.94 × 10^−219 | 0 | 2
7 | 9.01 × 10^−170 | 0 | 4
f3 (p = 6.8459 × 10^−223)
m | Average Optimal Value | Standard Deviation | Rank
4 | 6.78 × 10^−205 | 0 | 1
5 | 7.40 × 10^−155 | 1.05 × 10^−154 | 3
6 | 1.42 × 10^−203 | 0 | 2
7 | 2.63 × 10^−141 | 3.72 × 10^−141 | 4
f6 (p = 1.0364 × 10^−141)
m | Average Optimal Value | Standard Deviation | Rank
4 | 7.34 × 10^−9 | 1.04 × 10^−8 | 3
5 | 4.24 × 10^−11 | 3.92 × 10^−11 | 1
6 | 9.81 × 10^−9 | 1.24 × 10^−8 | 2
7 | 8.59 × 10^−10 | 2.05 × 10^−10 | 4
f8 (p = 1.8302 × 10^−72)
m | Average Optimal Value | Standard Deviation | Rank
4 | −12,569.4857 | 9.44 × 10^−4 | 3
5 | −12,569.4866 | 5.78 × 10^−4 | 2
6 | −12,569.4866 | 4.61 × 10^−4 | 1
7 | −12,569.4852 | 2.34 × 10^−3 | 4
f13 (p = 2.3981 × 10^−187)
m | Average Optimal Value | Standard Deviation | Rank
4 | 9.21 × 10^−9 | 1.30 × 10^−8 | 2
5 | 2.17 × 10^−8 | 3.02 × 10^−8 | 3
6 | 3.89 × 10^−10 | 8.74 × 10^−11 | 1
7 | 1.96 × 10^−8 | 2.59 × 10^−8 | 4
f17 (p = 3.1354 × 10^−278)
m | Average Optimal Value | Standard Deviation | Rank
4 | 3.98 × 10^−1 | 4.58 × 10^−7 | 2
5 | 3.98 × 10^−1 | 1.42 × 10^−4 | 4
6 | 3.98 × 10^−1 | 1.72 × 10^−7 | 1
7 | 3.98 × 10^−1 | 6.08 × 10^−8 | 3
f19 (p = 6.4911 × 10^−198)
m | Average Optimal Value | Standard Deviation | Rank
4 | −3.44 | 3.11 × 10^−2 | 4
5 | −3.68 | 1.69 × 10^−1 | 1
6 | −3.71 | 7.85 × 10^−2 | 2
7 | −3.56 | 2.39 × 10^−1 | 3
f21 (p = 3.562 × 10^−100)
m | Average Optimal Value | Standard Deviation | Rank
4 | −10.1531 | 2.73 × 10^−4 | 4
5 | −10.1531 | 7.06 × 10^−4 | 2
6 | −10.1532 | 2.38 × 10^−4 | 1
7 | −10.1521 | 1.46 × 10^−4 | 3
f23 (p = 1.2524 × 10^−142)
m | Average Optimal Value | Standard Deviation | Rank
4 | −10.5321 | 3.91 × 10^−5 | 4
5 | −10.5363 | 1.73 × 10^−3 | 1
6 | −10.5362 | 1.14 × 10^−3 | 2
7 | −10.5362 | 9.17 × 10^−5 | 3
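Table 9 tunes the neighborhood size m used by the m-neighborhood locally optimal individual-guided search strategy. Assuming an index-ring neighborhood (the paper defines the exact neighborhood structure), selecting each individual's local guide can be sketched as:

```python
def local_best_guides(population, fitness, m):
    """For each individual i, return the best individual (lowest fitness)
    among an m-sized ring neighborhood centered on index i, wrapping
    around the population ends."""
    n = len(population)
    half = m // 2
    guides = []
    for i in range(n):
        neighborhood = [(i + k) % n for k in range(-half, m - half)]
        best = min(neighborhood, key=lambda j: fitness[j])
        guides.append(population[best])
    return guides
```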
Table 10. Performance comparison of different algorithms for cantilever beam optimization problems.
Algorithms | x1 | x2 | x3 | x4 | x5 | f(x)
WOA | 6.7223 | 5.6496 | 4.8678 | 42.7854 | 1.5343 | 1.7224
GWO | 6.0505 | 5.3133 | 4.4703 | 3.5221 | 2.1857 | 1.3402
HHO | 6.2829 | 5.2835 | 4.4123 | 3.6826 | 2.0938 | 1.3415
SSA | 6.7314 | 4.3729 | 4.4344 | 2.9367 | 4.1988 | 1.8389
SCA | 5.0881 | 5.2855 | 3.5683 | 3.6543 | 3.8544 | 1.4901
SCAEICS | 6.9328 | 5.8733 | 4.9051 | 4.5246 | 2.4903 | 1.3401
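Table 10's cantilever beam problem is commonly formulated as minimizing the beam weight 0.0624·(x1 + x2 + x3 + x4 + x5) subject to a tip-deflection constraint. The constants below follow the standard literature formulation and are assumed, not taken from this excerpt:

```python
def cantilever_weight(x):
    """Objective: beam weight, proportional to the sum of the five
    section heights (standard constant 0.0624)."""
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    """Tip-deflection limit; the design is feasible when g(x) <= 0."""
    c = (61.0, 37.0, 19.0, 7.0, 1.0)
    return sum(ci / xi ** 3 for ci, xi in zip(c, x)) - 1.0
```

At a known near-optimal design, x = (6.016, 5.309, 4.494, 3.502, 2.153), the weight evaluates to roughly 1.34 with the constraint active, consistent in magnitude with the best f(x) values reported in the table.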
Table 11. Comparison of the performance of different algorithms for the three-rod truss optimization problem.
Algorithms | Maximum Value | Minimum Value | Standard Deviation | x1 | x2 | f(x)
WOA | 267.7761 | 263.8988 | 1.2117 | 0.7975 | 0.3837 | 265.1009
GWO | 264.2903 | 271.0781 | 2.2715 | 0.8146 | 0.3410 | 267.6625
HHO | 263.8972 | 265.0614 | 0.2509 | 0.7879 | 0.4286 | 264.0875
SSA | 263.8973 | 263.9679 | 0.0160 | 0.7873 | 0.4121 | 263.9181
SCA | 263.929 | 282.8427 | 6.4548 | 0.7935 | 0.3984 | 266.6682
SCAEICS | 263.8962 | 263.9349 | 0.0075 | 0.7816 | 0.3404 | 263.9055
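Table 11's three-bar truss problem minimizes the structure weight (2√2·x1 + x2)·l subject to three stress constraints. The standard constants l = 100 cm, P = 2 kN/cm², and σ = 2 kN/cm² are assumed here; the paper's exact formulation may differ:

```python
import math

def truss_weight(x1, x2, l=100.0):
    """Objective: truss weight for cross-sectional areas x1 (outer bars)
    and x2 (middle bar)."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l

def truss_constraints(x1, x2, P=2.0, sigma=2.0):
    """Three stress constraints; the design is feasible when all g <= 0."""
    s2 = math.sqrt(2.0)
    d = s2 * x1 * x1 + 2.0 * x1 * x2
    return (
        (s2 * x1 + x2) / d * P - sigma,
        x2 / d * P - sigma,
        1.0 / (x1 + s2 * x2) * P - sigma,
    )
```

At the well-known near-optimal point (x1, x2) ≈ (0.7887, 0.4082), the weight is about 263.9, which matches the best minimum values reported in the table.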
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Tang, J.; Wang, L. Sine Cosine Algorithm for Elite Individual Collaborative Search and Its Application in Mechanical Optimization Designs. Biomimetics 2023, 8, 576. https://doi.org/10.3390/biomimetics8080576
