Sine Cosine Algorithm for Elite Individual Collaborative Search and Its Application in Mechanical Optimization Designs

To address the shortcomings of the sine cosine algorithm, such as its low search accuracy, slow convergence speed, and tendency to fall into local optima, a sine cosine algorithm for elite individual collaborative search was proposed. Firstly, tent chaotic mapping was used to initialize the population and the hyperbolic tangent function was applied to non-linearly adjust the parameters of the sine cosine algorithm, which enhanced the uniformity of the population distribution and balanced the global exploration and local exploitation abilities. Secondly, the search method of the sine cosine algorithm was improved by combining the search strategy of the sine cosine algorithm, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy; the three search strategies were executed alternately, which achieved collaboration, improved the convergence accuracy, and prevented the algorithm from falling into local optima. Finally, a greedy selection strategy was employed to select the best individuals for the population, which accelerated the convergence speed of the sine cosine algorithm. The simulation results illustrated that the sine cosine algorithm for elite individual collaborative search demonstrated a better optimization performance than the sine cosine algorithm, other improved sine cosine algorithms, other chaos-based algorithms, and other intelligent optimization algorithms. In addition, the feasibility and applicability of the sine cosine algorithm for elite individual collaborative search were further demonstrated by two mechanical optimization design experiments.


Introduction
Most of the optimization problems that exist today are inherently NP-hard and difficult to solve using basic mathematical methods. Many real-world optimization problems are also difficult to formulate exactly, and the complexity of the problems to be solved keeps increasing. Therefore, the excellent performance of swarm intelligence optimization algorithms in finding optimal values for specific systems and problems has attracted many experts and scholars to conduct research in this area.

The Contribution
Based on the above ideas and strategies, this paper proposes a sine cosine algorithm for elite individual collaborative search (SCAEICS) to improve the performance of the SCA. The tent chaotic mapping strategy is used in the initialization phase to enhance the uniformity of the population distribution, and the hyperbolic tangent function strategy is introduced to balance global search and local exploitation. In addition, to improve the convergence accuracy and convergence speed and to prevent the algorithm from falling into local optima, the idea of elite individuals is introduced, which effectively addresses these problems. The performance of the improved algorithm is evaluated on 23 benchmark functions, the CEC2020 functions, and 2 engineering design problems. The main contributions of this study are summarized as follows: (1) A sine cosine algorithm for elite individual collaborative search is proposed; compared to the SCA, the SCAEICS exhibits a faster convergence speed, higher convergence accuracy, and an effective escape from local optima. (2) In the improvement process, the tent chaotic mapping strategy and the hyperbolic tangent function strategy are adopted, which remedy the purely random population distribution and balance global search and local exploitation. (3) The concept of the collaborative search of elite individuals is combined with the SCA and used to improve its search performance. (4) The proposed SCAEICS is validated on 23 benchmark functions, the CEC2020 functions, and two mechanical engineering optimization problems, and it outperforms the basic SCA in terms of convergence performance.

The Structure of Organization
The rest of the paper is organized as follows. Section 2 describes the basic principles of the SCA and analyzes its drawbacks. Section 3 describes the improvement ideas of the elite individual collaborative search sine cosine algorithm in detail. Section 4 presents comparative experiments and analysis using the 23 benchmark test functions and the CEC2020 test functions. Section 5 applies the SCAEICS to the optimization of mechanical designs. Section 6 gives a discussion of the proposed approach. The last section summarizes the findings of this paper and points out directions for future research.

Related Research
The sine cosine algorithm (SCA) [21] was proposed by S. Mirjalili in 2016. The SCA is a population-based intelligent optimization algorithm that seeks the optimal solution by modeling the periodic oscillations of the sine and cosine functions; it has the advantages of few parameters, a simple structure, and easy implementation. Like most algorithms, however, the SCA still suffers from a low search accuracy, a slow convergence speed, and a tendency to fall into local optima. Therefore, researchers have worked to improve the sine cosine algorithm in the following two directions.
(1) Improve the initialization, parameter settings, and algorithmic structure of the SCA. For example, a Q-learning embedded sine cosine algorithm (QLESCA) was proposed [26], in which Q-tables formed for different individuals during operation control the SCA parameters, effectively improving the convergence speed of the SCA. A new backbone sine cosine algorithm based on domain structure was proposed [27], which introduced a domain structure and Gaussian sampling learning into the update process of the sine cosine algorithm through the backbone optimization idea, effectively enhancing the population exploration ability and improving the population diversity. A doubly adaptive randomized standby enhanced sine cosine algorithm was proposed [28] by introducing a doubly adjusted weight strategy and a randomized standby strategy, which balanced the weight factor between exploitation and exploration, accelerated the convergence speed, and enhanced the exploration ability.
An improved sine cosine algorithm with Lévy flight was proposed [29], in which the Lévy flight distribution was multiplied element-wise with the position vectors of the population individuals and combined with a non-linear parameter adjustment method based on spatial distance, which effectively enhanced the convergence accuracy and improved the convergence speed of the algorithm. A spectral feature peak identification and localization method based on an improved sine cosine algorithm was proposed [30], which improved the sine cosine algorithm by adopting a dynamic conversion probability and significantly improved performance measures such as the spectral identification rate and the localization accuracy.
Furthermore, a sine cosine algorithm introducing a backward learning strategy was proposed [31], which effectively solved the problem of late evolutionary stagnation and improved the global optimization performance. A sine cosine algorithm based on orthogonal parallel information was proposed [32], which increased the diversity and enhanced the global search capability of the algorithm by adopting a multiple orthogonal parallel information strategy. An improved sine cosine algorithm for text feature selection was proposed [33], mainly using individual coding and adaptive weighting strategies, which improved the classification accuracy compared with other feature-selection algorithms. A population-based sine cosine algorithm for economic load scheduling was proposed [34], which showed a high performance compared to other techniques. An enhanced parallel sine cosine algorithm with single-stage synchronous and asynchronous strategies, designed for solving constrained and unconstrained problems, was proposed [35], which effectively sped up the convergence of the algorithm. An improved sine cosine algorithm for solving high-dimensional global optimization problems was proposed [36], in which an inertia weight factor was introduced to modify the original update formula and a Gaussian function was used to decrease the algorithm parameters non-linearly; 24 high-dimensional functions and other large-scale global optimization problems were used to evaluate the algorithm, showing that the modified algorithm effectively avoids falling into local optima and accelerates the convergence speed.
(2) Combine the SCA with other algorithms. For example, an enhanced brain storm sine cosine algorithm was proposed [37] to improve population diversity by introducing an enhanced brain storm strategy and two new individual updating strategies, which achieved an effective balance between global search and local exploitation. Meanwhile, by introducing a cloud model strategy to adaptively adjust the control parameters, a cloud model-based sine cosine algorithm was proposed [38], and the experimental results show that the improved algorithm outperformed the original algorithm in solving global optimization problems. A sine cosine algorithm embedded with differential evolution and inertia weights was proposed [39], which embedded a differential evolution algorithm with dynamic variation and better balanced global search and local exploitation by introducing adaptive inertia weights. A hybrid multi-objective firefly algorithm and sine cosine algorithm was proposed [40], whose results show that the modified algorithm is effective for multi-objective optimization problems. An improved sine cosine algorithm hybridized with a particle swarm algorithm for registering lung CT images of COVID-19 infected patients was proposed [41], which has a high practicality in the field of medical image alignment.
The above algorithms use different strategies to improve the performance of the SCA, and the improvements are effective. However, an analysis of the results shows that these algorithms still have some limitations. Therefore, it is necessary to continuously improve the SCA so that it is more applicable to real-life practical problems and can cope with the challenging new problems brought by the development of society; this is the main motivation of the current research and of the work presented next.

Principle of the Sine Cosine Algorithm
The SCA converges to the global optimal solution by modeling the periodic oscillatory nature of the sine and cosine functions, moving progressively from the global search of the exploration phase to the local exploitation of the exploitation phase. The basic principle of the SCA is shown in Figure 1. When the value of the sine or cosine term lies in [1, 2] or [−2, −1], the current individual is guided to perform a global search of the solution space (exploration phase). When the value lies in [−1, 1], the neighborhood of the target optimal solution is exploited locally (exploitation phase). The sine and cosine functions complement each other during the search, so that the SCA ultimately converges to the global optimal solution.

For solving the minimization problem, x denotes a feasible solution and D is the spatial dimension.
(I) Suppose the population size is N and the search space is D-dimensional. The position of the ith individual is denoted x_i = (x_{i,1}, x_{i,2}, ..., x_{i,D}), i ∈ {1, 2, ..., N}, and the optimal individual position found during the iterative update of the N-individual population is denoted p_best = (p_{best,1}, p_{best,2}, ..., p_{best,D}). The ith individual updates its spatial position according to Equation (1):

x_{i,j}^{t+1} = x_{i,j}^{t} + r_1 · sin(r_2) · | r_3 · p_{best,j} − x_{i,j}^{t} |,  r_4 < 0.5
x_{i,j}^{t+1} = x_{i,j}^{t} + r_1 · cos(r_2) · | r_3 · p_{best,j} − x_{i,j}^{t} |,  r_4 ≥ 0.5        (1)

where x_{i,j} is the jth dimensional component of the ith candidate solution, p_{best,j} is the jth dimensional component of the global optimal solution in the current iteration, j ∈ {1, 2, ..., D}, and r_2 ∈ (0, 2π), r_3 ∈ (0, 2), and r_4 ∈ (0, 1) are uniformly distributed random numbers.
(II) In the SCA, the parameter r_1 effectively balances the global search and local exploitation performance; it decreases linearly with the number of iterations so that the algorithm gradually converges to the globally optimal solution. The parameter r_1 is updated according to Equation (2):

r_1 = a − a · t / T        (2)
where a is a constant, generally taking a value of 2, t is the current iteration number, and T is the maximum number of iterations.
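As a concrete reference for Equations (1) and (2), the basic SCA loop can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the sphere test function, the bound clamping, and the random seed are assumptions made here:

```python
import math
import random

def sca(f, dim, lb, ub, n=30, T=500, a=2.0, seed=1):
    """Minimal sketch of the basic SCA (Equations (1) and (2))."""
    rng = random.Random(seed)
    # Random population initialization (the SCAEICS replaces this step
    # with tent chaotic mapping).
    X = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in X]
    b = min(range(n), key=lambda i: fit[i])
    p_best, f_best = X[b][:], fit[b]
    for t in range(T):
        r1 = a - a * t / T  # Equation (2): linear decay from a to 0
        for i in range(n):
            for j in range(dim):
                r2 = rng.uniform(0.0, 2.0 * math.pi)
                r3 = rng.uniform(0.0, 2.0)
                r4 = rng.random()
                trig = math.sin(r2) if r4 < 0.5 else math.cos(r2)
                # Equation (1): sine/cosine move guided by p_best
                X[i][j] += r1 * trig * abs(r3 * p_best[j] - X[i][j])
                X[i][j] = min(max(X[i][j], lb), ub)  # keep inside the bounds
            fi = f(X[i])
            if fi < f_best:  # track the global best found so far
                p_best, f_best = X[i][:], fi
    return p_best, f_best

# Sphere function as an assumed test problem
best, val = sca(lambda x: sum(v * v for v in x), dim=10, lb=-100.0, ub=100.0)
```

With these settings the best fitness typically drops by several orders of magnitude from the initial random population, illustrating the convergence behavior the paper analyzes.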

Disadvantage Analysis of the Sine Cosine Algorithm
Good initial populations are crucial for swarm intelligence algorithms, but the SCA relies entirely on random population initialization, without an efficient initialization scheme. Relying only on random initialization may cause the initial population to cluster in a certain region of the space, resulting in a poorly homogeneous population distribution and a blind search by the individuals in the early stage. The algorithm then tends to fall into a local optimum after several iterations, which in turn slows the convergence rate. Therefore, a tent chaotic mapping strategy was adopted in the SCAEICS to initialize the population, distribute the population efficiently and reasonably, and enhance the performance of the SCA.
Any swarm intelligence optimization algorithm needs to balance global search and local exploitation. Global search explores a broader space so that the population maintains good diversity and avoids being trapped in local optima. The SCA uses the parameter r_1, regulated linearly, to balance the two: r_1 decreases linearly from 2 to 0 with the number of iterations, taking a large value at the beginning of the iteration to favor global search and decreasing to a small value towards the end, where local search is carried out. However, with this linear schedule the two phases are easily coordinated unevenly: the control parameter is already decreasing rapidly while the algorithm is still in the global search phase, so the global search cannot be fully performed, and the global search and local exploitation of the algorithm fall out of balance. To solve this problem, the SCAEICS uses the hyperbolic tangent function to non-linearly adjust the control parameter r_1.
The search strategy of the SCA uses the global optimal individual P_best to guide the current individual x_i towards the objective, so that the current individual quickly converges towards the global optimal individual in search of an optimal solution. However, because this strategy is guided only by a single global optimal individual, although it has a strong global search capability, the convergence speed is slow, the optimization accuracy is low, the algorithm easily falls into local optima, and potential solutions around the optimal individual may be missed. Therefore, this paper proposes the SCAEICS, which combines the SCA search strategy, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy. The three search strategies are executed alternately to achieve a collaborative search, improve the convergence accuracy, and prevent the algorithm from falling into a local optimum.

Modified Strategies of the SCAEICS Algorithm (I). Tent chaos mapping initialization
Tent chaotic mapping [42] was used by the SCAEICS to initialize the population, so that the population is distributed reasonably evenly in the search space, the algorithm performance is improved, and the uniformity of the population distribution is maintained effectively. The comparison between tent chaos mapping initialization and random population initialization is shown in Figure 2. Tent chaos mapping not only has the characteristics of traversal uniformity and low complexity, but also preserves the initialization randomness of the SCA. The map is given by Equation (3):

s_{k+1} = u · s_k,            s_k < 0.5
s_{k+1} = u · (1 − s_k),      s_k ≥ 0.5        (3)

where the tent chaos mapping generates a chaotic sequence s_i = (s_{i,d}, d = 1, 2, ..., D), i = 1, 2, ..., N, and u ∈ (0, 2); the larger u is, the stronger the chaos, and the system is fully chaotic when u = 2. Mapping the values of the chaotic sequence into the search space yields the population X = (X_i, i = 1, 2, ..., N), X_i = (X_{i,d}, d = 1, 2, ..., D), with the individuals given by Equation (4):

X_{i,d} = lb + s_{i,d} · (ub − lb)        (4)

where ub and lb are the upper and lower bounds of the search space.
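A sketch of Equations (3) and (4) in Python may make the initialization concrete. The re-seeding guard for the degenerate fixed points of the tent map at u = 2 is an implementation assumption, not part of the paper:

```python
import random

def tent_init(n, dim, lb, ub, u=2.0, seed=1):
    """Tent chaotic mapping initialization (Equations (3) and (4))."""
    rng = random.Random(seed)
    s = rng.random()  # random seed value preserves the SCA's randomness
    pop = []
    for _ in range(n):
        row = []
        for _ in range(dim):
            # Equation (3): tent map iteration (fully chaotic when u = 2)
            s = u * s if s < 0.5 else u * (1.0 - s)
            # In floating point the orbit can collapse onto 0; re-seed it.
            if s <= 0.0 or s >= 1.0:
                s = rng.random()
            row.append(lb + s * (ub - lb))  # Equation (4): map into [lb, ub]
        pop.append(row)
    return pop

pop = tent_init(n=30, dim=10, lb=-100.0, ub=100.0)
```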

(II). The hyperbolic tangent function non-linear adjustment of the control parameter r_1
The hyperbolic tangent function [43] was adopted by the SCAEICS to improve the parameter r_1. The tanh function, commonly used as an activation function in the field of machine learning, performs well for non-linear adjustments. The tanh function is given by Equation (5):

tanh(x) = sinh(x) / cosh(x) = (e^x − e^{−x}) / (e^x + e^{−x})        (5)

where sinh(x) and cosh(x) are the hyperbolic sine and hyperbolic cosine functions, respectively. By introducing the tanh function into the SCAEICS, the parameter decreases non-linearly: the global search capability is enhanced in the early stage and the local exploitation performance is stably maintained in the later stage, achieving a balance between global search and local exploitation. The improved control parameter is given by Equation (6).

where a_max and a_min are the initial and termination values of the parameter a, respectively, t is the current iteration number, T is the maximum number of iterations, and S is the adjustment parameter. The comparison of the parameter r_1 before and after the improvement is shown in Figure 3.
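For comparison with Figure 3, the two schedules can be evaluated numerically. Since the exact form of Equation (6) is not reproduced here, the tanh schedule below is a plausible stand-in that decays from a_max towards a_min with steepness S; treat it as an assumption, not the paper's formula:

```python
import math

def r1_linear(t, T, a=2.0):
    # Equation (2): linear decay from a to 0
    return a - a * t / T

def r1_tanh(t, T, a_max=2.0, a_min=0.0, S=2.0):
    # Hypothetical tanh-shaped decay standing in for Equation (6)
    return a_min + (a_max - a_min) * (1.0 - math.tanh(S * t / T))

T = 500
lin = [r1_linear(t, T) for t in range(T + 1)]
nonlin = [r1_tanh(t, T) for t in range(T + 1)]
```

Both schedules start at 2 and decrease monotonically; the non-linear one changes shape with S, which gives the kind of adjustable balance between exploration and exploitation that the paper attributes to the tanh function.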

(III). The elite individual collaborative search strategies
The elite individual-guided search strategy uses the best individuals of the population to guide the search process, improving the optimization accuracy and avoiding missing potential solutions and falling into local optima. The elite individual search strategy is divided into the m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy.

(IV). The m-neighborhood locally optimal individual-guided search strategy
M (m ≤ N) individuals are randomly selected from the population, and the location of the optimal individual among them is denoted L_best = (l_{best,1}, l_{best,2}, ..., l_{best,D}). L_best is the locally optimal individual in the m-neighborhood, and the search is performed according to Equation (7):

u_{i,j}^{t+1} = x_{i,j}^{t} + r_1 · sin(r_2) · | r_3 · l_{best,j} − x_{i,j}^{t} |,  r_4 < 0.5
u_{i,j}^{t+1} = x_{i,j}^{t} + r_1 · cos(r_2) · | r_3 · l_{best,j} − x_{i,j}^{t} |,  r_4 ≥ 0.5        (7)

This yields a new individual U_i = (u_{i,1}, u_{i,2}, ..., u_{i,D}), i = 1, 2, ..., N. This strategy searches within the m-neighborhood and around the locally optimal individual L_best, so the guidance of L_best is effectively utilized, considering both the search accuracy and the global search capability to prevent the algorithm from falling into local extremes.

(V). The global optimal individual-guided search strategy
The new individual U_i is obtained by searching from the current individual x_i towards the globally optimal individual P_best according to Equation (8):

u_{i,j}^{t+1} = p_{best,j}^{t} + r_1 · sin(r_2) · | r_3 · p_{best,j}^{t} − x_{i,j}^{t} |,  r_4 < 0.5
u_{i,j}^{t+1} = p_{best,j}^{t} + r_1 · cos(r_2) · | r_3 · p_{best,j}^{t} − x_{i,j}^{t} |,  r_4 ≥ 0.5        (8)

This strategy searches in the vicinity of the global optimal individual P_best, with | r_3 · p_best − x_i^t | as the radius, in a sine or cosine manner, which not only lets the global optimal individual P_best guide the search, but also improves the optimization accuracy.

(VI). The collaborative search strategy
In the search strategy of Equation (1) of the SCA, the current individual x_i searches near itself with | r_3 · p_best − x_i^t | as the radius, in a sine or cosine manner, so that the current individual x_i moves closer to or farther from the global optimal individual.
To better balance the global and local search capabilities of the SCA, enhance the optimization accuracy, prevent falling into local extremes, and improve the quality of the optimal solution, the search strategy of the SCA, the m-neighborhood locally optimal individual-guided search strategy, and the globally optimal individual-guided search strategy are combined, and the three search strategies are executed alternately to realize a collaborative search. The collaborative search strategy preserves the search mechanism of the SCA, making full use of its global search ability and preventing the algorithm from falling into local extremes. The elite individual-guided search mechanism lets the global optimal individual p_best and the local optimal individual l_best guide the search process and conducts a local search in their vicinity, improving the optimization accuracy and enhancing the quality of the optimal solution of the algorithm. At the same time, the guidance of the local optimal individual l_best balances the global and local search abilities.

(VII). The greedy selection strategy
The new individual u_i derived from the elite individual collaborative search strategy is compared greedily with the individual x_i, and the better of the two is retained, improving the accuracy of the algorithm and accelerating the convergence speed, as in Equation (9):

x_i^{t+1} = u_i^{t+1},  if f(u_i^{t+1}) < f(x_i^t)
x_i^{t+1} = x_i^t,      otherwise        (9)

where f(·) is the objective function fitness value.
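Equation (9) amounts to a one-line elitist replacement rule; a minimal sketch follows (the sphere fitness is only an example, not the paper's objective):

```python
def greedy_select(x, u, f):
    """Equation (9): keep the better of the current individual x and the
    candidate u produced by the elite individual collaborative search."""
    return u if f(u) < f(x) else x

sphere = lambda v: sum(c * c for c in v)
kept = greedy_select([3.0, 4.0], [1.0, 1.0], sphere)  # candidate is fitter
```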

Algorithm Implementation Steps
The SCAEICS algorithm is implemented in the following steps.
Step 1: Initialize the population with tent chaos mapping, and set the population size N, the current iteration number t, the maximum number of iterations T of the algorithm, the spatial dimension D, the number of neighbors m, and the iteration interval h of the elite individual collaborative search strategy.
Step 2: Calculate the fitness of each individual and find the globally optimal individual p best .
Step 3: Calculate the control parameter r 1 .
Step 4: When t mod 2 == 0, execute the search strategy of the SCA according to Equation (1) and go to Step 6; otherwise, execute Step 5.
Step 5: Execute the elite individual-guided search mechanism. Generate a random number h in [0, 1]; when h > 0.5, execute the m-neighborhood locally optimal individual-guided search strategy according to Equation (7). Otherwise, execute the global optimal individual-guided search strategy according to Equation (8).
Step 6: Execute the greedy selection strategy according to Equation (9) and update the current individual.
Step 7: Update the global optimal individual.
Step 8: If t > T, stop the iteration and output the global optimal solution. Otherwise, set t = t + 1 and go to Step 3.

Pseudo-Code for the SCAEICS Algorithm
The pseudo-code of the SCAEICS algorithm is shown in Algorithm 1.

Algorithm 1: Pseudo-code of the SCAEICS algorithm
Input parameters and initialize: set the population size N, use the tent chaos mapping strategy to generate the initial population (x_i, i = 1, 2, ..., N), and set the maximum number of iterations T, the number of neighborhood individuals m, the integer h, and the spatial dimension D (the functions f14 to f23 have fixed dimensions).
Calculate the individual fitness values f(x_i), i = 1, 2, ..., N, and find the globally optimal individual and its location.
t = 0;
While (t < T) do
    Identify the locally optimal individual and its location.
    for i = 1 to N do
        Calculate the value of the control parameter r_1 according to Equation (6).
        if (t mod 2 == 0)
            Execute the SCA search strategy according to Equation (1).
        else if (h > 0.5)
            Execute the m-neighborhood locally optimal individual-guided search strategy according to Equation (7).
        else
            Execute the globally optimal individual-guided search strategy according to Equation (8).
        end if
        Execute the greedy selection strategy according to Equation (9).
    end for
    Update the current optimal individual and position.
    t = t + 1;
end while
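The pseudo-code above can be condensed into a runnable Python sketch. The tanh form of Equation (6) and the exact shapes of Equations (7) and (8) are assumptions reconstructed from the text, and the sphere function is only a demonstration problem:

```python
import math
import random

def scaeics(f, dim, lb, ub, n=30, T=500, m=6,
            a_max=2.0, a_min=0.0, S=2.0, seed=1):
    """Sketch of the SCAEICS main loop (Algorithm 1)."""
    rng = random.Random(seed)
    # Tent chaotic initialization (Equations (3) and (4)), simplified
    s = rng.random()
    X = []
    for _ in range(n):
        row = []
        for _ in range(dim):
            s = 2.0 * s if s < 0.5 else 2.0 * (1.0 - s)
            if s <= 0.0 or s >= 1.0:  # guard against orbit collapse
                s = rng.random()
            row.append(lb + s * (ub - lb))
        X.append(row)
    fit = [f(x) for x in X]
    b = min(range(n), key=lambda i: fit[i])
    p_best, f_best = X[b][:], fit[b]

    def sc_step(base, center, ref, r1):
        # Shared sine/cosine move used by Equations (1), (7), and (8):
        # new_j = base_j + r1 * sin/cos(r2) * |r3 * center_j - ref_j|
        out = []
        for j in range(dim):
            r2 = rng.uniform(0.0, 2.0 * math.pi)
            r3 = rng.uniform(0.0, 2.0)
            trig = math.sin(r2) if rng.random() < 0.5 else math.cos(r2)
            v = base[j] + r1 * trig * abs(r3 * center[j] - ref[j])
            out.append(min(max(v, lb), ub))
        return out

    for t in range(T):
        # Assumed tanh schedule standing in for Equation (6)
        r1 = a_min + (a_max - a_min) * (1.0 - math.tanh(S * t / T))
        # m-neighborhood locally optimal individual L_best
        idx = rng.sample(range(n), m)
        l_best = X[min(idx, key=lambda i: fit[i])][:]
        for i in range(n):
            if t % 2 == 0:
                u = sc_step(X[i], p_best, X[i], r1)    # SCA strategy, Eq. (1)
            elif rng.random() > 0.5:
                u = sc_step(X[i], l_best, X[i], r1)    # L_best strategy, Eq. (7)
            else:
                u = sc_step(p_best, p_best, X[i], r1)  # P_best strategy, Eq. (8)
            fu = f(u)
            if fu < fit[i]:  # greedy selection, Equation (9)
                X[i], fit[i] = u, fu
                if fu < f_best:
                    p_best, f_best = u[:], fu
    return p_best, f_best

best, val = scaeics(lambda x: sum(v * v for v in x), dim=10, lb=-100.0, ub=100.0)
```

Because the greedy selection never accepts a worse individual, the best fitness is monotonically non-increasing over the iterations, which matches the convergence behavior claimed for the algorithm.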

Flowchart of the SCAEICS Algorithm
The flowchart of the SCAEICS algorithm is shown in Figure 4.


Analysis of Algorithm Convergence and Diversity
The quality of convergence of an intelligent algorithm largely determines its performance. Therefore, in work on improving intelligent algorithms, many scholars have analyzed convergence in detail. For example, an analysis of the convergence of the ABC algorithm was proposed [44] using the relationship between the algorithm variables and the general solution of the update equation for the objective optimal solution.
A convergence analysis of the ABC algorithm based on von Neumann stability and convergence was proposed [45]. A convergence analysis of the PSO algorithm guided by backward learning was proposed [46]. A convergence analysis of an improved SCA algorithm using population diversity defined by the population center of gravity was proposed [47]. A convergence analysis of an improved SCA algorithm based on Markov chains was proposed [48].
The iterative update of the SCAEICS is a Markov chain of stochastic search. Therefore, based on the above literature, it is clear that the SCAEICS combines the SCA search mechanism with the elite individual-guided search mechanism and converges to the globally optimal solution using the greedy selection strategy. How to balance global exploration and local exploitation is an important evaluation indicator for optimization algorithms, and population diversity plays an important role in this process. Therefore, conducting a diversity analysis of the SCAEICS is meaningful. This article follows the diversity analysis of the SCAEICS conducted in reference [22]. Because tent chaos mapping is used for initialization, the population distribution is uniform and a good population diversity is maintained.

Simulation Experiments
The experimental environment consisted of an Intel(R) Core(TM) i7-10750H CPU @ 2.30 GHz, 16 GB RAM, and the Windows 10 operating system; all algorithms were implemented on the MATLAB R2020b simulation platform.

(I). Benchmark functions and parameter settings
To analyze the performance of the proposed algorithm, 23 benchmark functions and the CEC2020 benchmark functions were used for the simulation experiments; the function expressions are detailed in the literature [49]. The other parameters of the functions are shown in Tables 1 and 2: f1 to f7 are single-peaked functions, f8 to f13 are multi-peaked functions, and f14 to f23 are fixed-dimension multi-peaked functions. The single-peaked functions have one local extremum in the test dimension and are used to verify the convergence speed and performance of the algorithm, while the fixed-dimension multi-peaked functions have multiple complex local extrema and are used to verify the global search performance of the algorithm and its ability to escape from local optima. F1 to F10 are unimodal, multimodal, new hybrid, and composite functions, which are significant for verifying the SCAEICS performance. To verify the effectiveness and superiority of the SCAEICS, the 23 benchmark functions and the CEC2020 benchmark functions were used to compare the SCAEICS with other algorithms. To ensure fair results, all algorithms in the comparison used a population size of N = 30 and a maximum number of iterations T = 500, and each algorithm was run 30 times independently. The parameters of the SCAEICS were set to a_max = 2, a_min = 0, S = 2, m = 6, and h = 1; the parameters of the other algorithms involved in the following comparisons are shown in Table 3.

Comparative Analysis of the SCAEICS with the SCA and Other Intelligent Algorithms
To verify the performance of the SCAEICS, the 23 benchmark functions were used to compare it with the SCA and other swarm intelligence algorithms proposed in recent years: the whale optimization algorithm (WOA) [1], the grey wolf optimization algorithm (GWO) [2], the Harris hawks algorithm (HHO) [3], and the salp swarm algorithm (SSA) [4]. The parameters of the algorithms involved in the comparison are detailed above, the best values of the comparison results are bolded, and the experimental results are shown in Table 4. As can be seen from Table 4, except for function f16, the SCAEICS outperforms the SCA on the other 22 functions. The experimental results show that the SCAEICS has a significant optimization performance: for the single-peaked functions f1 to f7, four functions, f1, f2, f3, and f4, are improved by a large margin. For the multi-peaked functions f8 to f13, three of them, f8, f9, and f10, reach the theoretical optimum. For the fixed-dimension multi-peaked functions f14 to f23, five of them, f16, f18, f21, f22, and f23, reach the theoretical optimum. This is because the SCAEICS retains the search strategy of the SCA and combines it with the m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy; the three search strategies are executed alternately to achieve a collaborative search, presenting a better collaborative capability, and global search and local exploitation reach a desirable balance.
To further analyze the stability and superiority of the SCAEICS, comparative experiments were conducted with other algorithms proposed in recent years, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences between the algorithms. The decision results (+/=/−) indicate the number of functions on which a compared algorithm is better than, equal to, or worse than the SCAEICS, respectively. The Wilcoxon rank sum test results show that the SCAEICS outperformed the SCA on 21 functions, the WOA on 19 functions, the GWO on 18 functions, the HHO on 13 functions, and the SSA on 18 functions. Figure 5 shows the convergence curves, averaged over 30 runs, of six functions selected from the Table 4 comparison experiment: f2, f5, f8, f13, f21, and f23. The vertical co-ordinate is the logarithm of the average best value of the function, and the horizontal co-ordinate is the number of iterations.
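A hedged sketch of the 5%-level Wilcoxon rank sum comparison applied to two algorithms' 30-run samples. This is a pure-Python, large-sample approximation (ties ignored) standing in for a statistics-package routine; the sample data are illustrative, not the paper's results:

```python
from statistics import NormalDist

def ranksum_p(a, b):
    """Two-sided p-value of the Wilcoxon rank sum test
    (normal approximation; ties are ignored for simplicity)."""
    n1, n2 = len(a), len(b)
    ranked = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    w = sum(i + 1 for i, (_, tag) in enumerate(ranked) if tag == 0)  # rank sum of a
    mean = n1 * (n1 + n2 + 1) / 2
    sd = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (w - mean) / sd
    return 2 * (1 - NormalDist().cdf(abs(z)))

# two clearly separated 30-run samples -> significant difference at the 5% level
sample_a = [i * 1e-8 for i in range(30)]      # hypothetical SCAEICS best values
sample_b = [1 + i * 1e-2 for i in range(30)]  # hypothetical competitor best values
print(ranksum_p(sample_a, sample_b) < 0.05)  # True
```

A p-value below 0.05 marks a significant difference; the sign of the median gap then decides the +/=/− verdict.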
From the analysis in Figure 5, it can be seen that, among the single-peaked dimensional functions, for f2 the SCAEICS converges significantly faster and with a higher optimization accuracy than the other five algorithms. For f5, the SCAEICS converges significantly faster than the other five algorithms, starting to converge at about 230 iterations and continuing until it reaches the global optimal solution. Among the multi-peaked dimensional functions, f8 is difficult to optimize due to its high number of local optima; for f8, the SCAEICS converges slightly faster than the HHO, starting to converge at about 10 iterations until it reaches the global optimum. For f13, the SCAEICS converges faster than the other five algorithms at the beginning of the iterations and starts to converge at about 250 iterations until it reaches the global optimum.
In the fixed multi-peaked dimensional functions, for f21, although the convergence rate of the SCAEICS is slightly lower than that of the HHO at the beginning of the iterations, it starts to converge ahead of the other five algorithms and reaches the theoretical optimum at about 50 iterations. For f23, the SCAEICS shows significant outperformance in the early iterations compared to the other five algorithms and starts to converge to the theoretical optimum at about 70 iterations. The convergence curves demonstrate the remarkable optimization-seeking performance of the SCAEICS, which reaches global optimality on most functions and has a higher convergence accuracy than the SCA and the other algorithms.
In the study of intelligent optimization algorithms, the setting of different dimensions has a certain influence on the experimental results. Therefore, to verify that the SCAEICS can maintain a stable performance while searching for the optimum in different dimensions, this paper extends the dimensions of the f1~f13 functions to 60 and 100 dimensions, with the other parameters unchanged. The Wilcoxon rank sum test with a significance level of 5% was then used to analyze the significant differences of the SCAEICS on the function optimization problems in the different dimensions, and the experimental results are shown in Tables 5 and 6. As can be seen from Table 5, in the 60-dimensional comparative experimental results, the SCAEICS outperforms the other algorithms on 10 functions; on the remaining 3 functions (f9, f10, and f11), its optimization results are comparable with those of the other algorithms. As can be seen from Table 6, in the 100-dimensional comparative experimental results, the SCAEICS likewise outperforms the other algorithms on 10 functions and is comparable with them on f9, f10, and f11. The analysis of the results shows that the SCAEICS has a better optimization effect in the same dimension and a stable optimization-seeking performance across different dimensions.
To analyze the stability and superiority of the SCAEICS in different dimensions, the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences between the SCAEICS and the other algorithms. The decision results (+/=/−) indicate the number of functions on which the comparison algorithm is better than, equal to, or worse than the SCAEICS, respectively. The Wilcoxon rank sum test shows that, in the 60-dimensional comparison experiment, the SCAEICS outperformed the HHO on 9 functions and each of the SCA, the WOA, the GWO, and the SSA on 13 functions.
In the 100-dimensional comparative experimental results, the SCAEICS outperformed the SCA on 13 functions, the WOA on 12 functions, the GWO on 13 functions, the HHO on 8 functions, and the SSA on 13 functions. The results show that the SCAEICS maintains a stable optimization-seeking performance in different dimensions.

Comparative Analysis of the SCAEICS with Other Improved Algorithms
To further validate the performance of the SCAEICS, 23 benchmark functions were used to compare it with other improved SCA algorithms: an alternating sine cosine algorithm based on an elite chaotic search strategy (COSCA) [48], a memory-guided sine cosine algorithm for global optimization (MGSCA) [50], a sine cosine algorithm based on differential evolution (SCADE) [51], and a cloud model-based sine cosine algorithm (CSCA) [38]. The comparative analysis was carried out, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences of the SCAEICS. The algorithm parameters are detailed above; the SCADE data were taken from the literature, while the rest of the data were obtained from the experiments. The best values of the comparison results are bolded, and the experimental results are shown in Table 7.
As can be seen from Table 7, when compared with the other improved algorithms, the SCAEICS obtained significantly better optimization results on 18 functions, including f1~f13, f16, and f20~f23, and several functions achieved theoretical optimal values. To analyze the stability and superiority of the SCAEICS more precisely, the Wilcoxon rank sum test with a significance level of 5% was used to further analyze the significant differences of the SCAEICS; the meanings of the symbols in the decision results are given above. The Wilcoxon rank sum test shows that the SCAEICS outperforms the COSCA on 15 functions, the SCADE on 13 functions, the MGSCA on 18 functions, and the CSCA on 19 functions. In summary, compared to the other improved algorithms, the SCAEICS has a stronger optimization accuracy and a faster convergence, which indicates that the modified algorithm is able to handle optimization search problems under different conditions.

Comparison and Analysis of the SCAEICS with Other Chaos-Based Algorithms
To further validate the performance of the SCAEICS, the CEC2020 benchmark functions were used to compare it with other chaos-based algorithms: the elite chaotic manta ray foraging algorithm integrated with chaotic initialization and opposition-based learning (CMRFO) [52], the chaos marine predators algorithm (CMPA) [49], and the sine cosine algorithm (SCA) [21]. The comparative analysis was carried out, and the Wilcoxon rank sum test with a significance level of 5% was used to analyze the significant differences of the SCAEICS. The algorithm parameters are detailed above, the best values of the comparison results are bolded, and the experimental results are shown in Table 8. Compared with the CMRFO, the SCAEICS obtains significantly better optimization results on six functions (F2, F5, F6, F8, F9, and F10), is comparable on three functions (F3, F4, and F7), and is inferior on the remaining function. Compared with the CMPA, the SCAEICS obtains significantly better optimization results on three functions (F1, F3, and F4), is comparable on four functions (F6, F8, F9, and F10), and is inferior on the remaining three functions. Compared with the SCA, the SCAEICS obtains significantly better optimization results on all 10 functions.
To analyze the stability and superiority of the SCAEICS more precisely, the Wilcoxon rank sum test with a significance level of 5% was used to further analyze the significant differences of the SCAEICS; the meanings of the symbols in the decision results are given above. The Wilcoxon rank sum test shows that the SCAEICS outperforms the CMRFO on 6 functions, the CMPA on 3 functions, and the SCA on 10 functions.
In summary, compared to the other chaos-based algorithms, the SCAEICS has a stronger optimization accuracy and a faster convergence, which indicates that the modified algorithm is able to handle optimization search problems under different conditions.

Analysis of Important Parameters
The local population size m used for the locally optimal individual l_best is one of the more important parameters in the SCAEICS and has a large impact on the optimization accuracy for different problem functions. Setting the local population size too large harms the local optimization performance and risks premature convergence of the algorithm, while setting it too small has too little influence to achieve the desired algorithm performance. Therefore, the local population size m was set to 4, 5, 6, and 7, respectively, and the effect of the local population size on the performance of the algorithm was tested by simulation experiments. In total, 9 of the 23 test functions were used for the experiments: the single-peaked dimensional functions f1, f3, and f6; the multi-peaked dimensional functions f8 and f13; and the fixed multi-peaked dimensional functions f17, f19, f21, and f23. The significance of the differences in the experimental results was assessed by the Friedman test with a significance level α of 5%: a p-value of less than 0.05 indicates a significant difference between the parameter settings, whereas a p-value greater than 0.05 indicates no significant difference. The results were analyzed with the statistical software IBM SPSS Statistics 21, and the reported indicators are the mean best value, the standard deviation, the ranking, and the p-value of the Friedman test.
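A hedged sketch of the Friedman test used here: each test function is a block, the four m settings are treatments ranked by their mean best value, and the chi-square statistic is checked against the 5% level. With k = 4 settings the statistic has df = 3, for which the chi-square survival function has a closed form; the sample scores are illustrative, not the paper's Table 9 data:

```python
import math

def friedman_p(blocks):
    """Friedman test over k = 4 settings; returns (chi2, p).
    blocks: one row per test function, one score per m setting (smaller = better).
    The p-value formula assumes df = k - 1 = 3 (closed-form chi-square CDF)."""
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):  # rank 1 = best score
            rank_sums[j] += rank
    chi2 = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3 * n * (k + 1)
    # chi-square CDF for df = 3: erf(sqrt(x/2)) - sqrt(2x/pi) * exp(-x/2)
    cdf = math.erf(math.sqrt(chi2 / 2)) - math.sqrt(2 * chi2 / math.pi) * math.exp(-chi2 / 2)
    return chi2, 1 - cdf

# illustrative scores for 9 functions under m = 4/5/6/7: m = 6 (index 2) always best
scores = [(3.0, 2.0, 1.0, 4.0)] * 9
chi2, p = friedman_p(scores)
print(p < 0.05)  # True: perfectly consistent rankings give a significant difference
```

In practice a package routine would also handle tied ranks; this sketch assumes distinct scores per row.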
As can be seen from Table 9, the Friedman test p-values of all nine functions (f1, f3, f6, f8, f13, f17, f19, f21, and f23) are less than 0.05, indicating that the algorithms corresponding to the four local population sizes differ significantly; the algorithm performance is optimal when the local population size m is set to six, which is the value adopted in this paper.

Time Complexity Analysis
The time complexity of an algorithm has a large impact on its convergence speed. In the SCA, it is assumed that the population size is N, the problem dimension is D, and the maximum number of iterations is T; the time complexity of the SCA is then given by Equation (10).
In the SCAEICS, the other parameters are consistent with the SCA. Tent chaotic mapping is used to initialize the population, and the time complexity of this part is O(N·D). For the m-neighborhood locally optimal individual-guided search strategy and the global optimal individual-guided search strategy, the time complexity is O(N·T·h·(m−1)/2); since this part is only a comparison process, it has little impact on the complexity of the algorithm. A greedy selection strategy is used to select the better individuals of the population, and the time complexity of this part is O(N·T). Therefore, the time complexity of the SCAEICS is given by Equation (11).
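Equations (10) and (11) are cited but not reproduced in this section; a hedged reconstruction from the component costs just listed, consistent with the standard per-iteration cost of the SCA, is:

```latex
% Hedged reconstruction: the equations themselves are not shown in this excerpt;
% each term corresponds to a component cost named in the text.
O_{\mathrm{SCA}} = O(N \cdot D \cdot T) \tag{10}

O_{\mathrm{SCAEICS}}
  = \underbrace{O(N \cdot D)}_{\text{tent initialization}}
  + \underbrace{O(N \cdot D \cdot T)}_{\text{search strategies}}
  + \underbrace{O\!\left(N \cdot T \cdot h \cdot \tfrac{m-1}{2}\right)}_{\text{neighborhood comparisons}}
  + \underbrace{O(N \cdot T)}_{\text{greedy selection}}
  \approx O(N \cdot D \cdot T) \tag{11}
```

Since the extra terms are dominated by O(N·D·T), the overall order matches the SCA, as the text concludes.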
From the above analysis, it can be concluded that the time complexity of the SCAEICS is approximately the same as that of the SCA.
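The greedy selection step charged at O(N·T) above amounts to a keep-if-better update per individual; a minimal sketch (the sphere objective is illustrative):

```python
def greedy_select(pop, fits, candidates, objective):
    """Replace each parent only when its candidate improves the fitness
    (minimization), so the population never degrades between iterations."""
    for i, cand in enumerate(candidates):
        f_new = objective(cand)
        if f_new < fits[i]:
            pop[i], fits[i] = cand, f_new
    return pop, fits

sphere = lambda x: sum(v * v for v in x)
pop = [[3.0, 4.0]]
fits = [sphere(pop[0])]  # 25.0
pop, fits = greedy_select(pop, fits, [[1.0, 1.0]], sphere)
print(fits)  # [2.0]
```

One comparison and at most one assignment per individual per iteration gives the O(N·T) cost stated in the text.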

Applications
In this paper, the proposed SCAEICS was applied to mechanical design optimization problems to further validate its feasibility and applicability.

(I). Mechanical design optimization
Mechanical optimization design problems [53] are classical problems in the field of machinery; a mathematical model is built by selecting design variables, an objective function, and constraints, so the research application of such problems has a certain significance. The mathematical model of this type of problem can generally be expressed as the following constrained optimization problem, Equation (12):
min f(x)
s.t. g_p(x) ≤ 0, p = 1, 2, ..., j;
h_m(x) = 0, m = 1, 2, ..., y;
x_lb ≤ x_i ≤ x_ub, i = 1, 2, ..., n; (12)
where x = (x_1, x_2, ..., x_n) is the design variable vector, f(x) is the objective function, g_p is the pth inequality constraint, h_m is the mth equality constraint, and x_ub and x_lb are the upper and lower bounds of the design variables, respectively.
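One common way (an assumption here, not the paper's stated method) to feed a problem of the form of Equation (12) to an optimizer such as the SCAEICS is a static penalty that folds constraint violations into a single fitness value; the penalty weight and the toy constraint are illustrative:

```python
def penalized_fitness(f, ineq, x, weight=1e6):
    """f(x) plus a quadratic penalty for every violated inequality g_p(x) <= 0."""
    violation = sum(max(0.0, g(x)) ** 2 for g in ineq)
    return f(x) + weight * violation

# toy example: minimize x0 + x1 subject to 1 - x0*x1 <= 0
f = lambda x: x[0] + x[1]
g = [lambda x: 1.0 - x[0] * x[1]]
print(penalized_fitness(f, g, [1.0, 1.0]))  # 2.0 (feasible -> plain objective)
```

Equality constraints h_m(x) = 0 can be handled the same way by penalizing |h_m(x)| beyond a small tolerance.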

(II). Example of mechanical design optimization
To verify the feasibility and practicality of the SCAEICS, it was applied to two problems, cantilever beam optimization [54] and three-bar truss optimization [55], and compared with the whale optimization algorithm, grey wolf optimization algorithm, Harris hawks algorithm, salp swarm algorithm, and sine cosine algorithm. To ensure the fairness of the experiment, the parameters of the algorithms were set as follows: population size N = 30 and maximum number of iterations T = 1000; each algorithm was run 30 times independently and the average value was taken.

(III). Example of optimized design of a cantilever beam
The optimization objective in the cantilever beam design problem is to make the mass of the rectangular-section cantilever beam as small as possible; the mathematical expression is Equation (13), where the minimum value of the function f(x) corresponds to the minimum mass of the rectangular section of the cantilever beam, and the design variable x_i denotes the height or width of the different unit beams.
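Since Equation (13) is not reproduced legibly in this excerpt, the following sketch uses the commonly stated form of the cantilever beam benchmark from the literature: five section variables x1..x5 and one deflection-style constraint. The near-optimal design values are the ones typically reported in the literature, not the paper's results:

```python
def cantilever_objective(x):
    """Beam mass (to be minimized), proportional to the sum of the sections."""
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    """Deflection constraint, required to satisfy g(x) <= 0."""
    coeffs = [61, 37, 19, 7, 1]
    return sum(c / xi ** 3 for c, xi in zip(coeffs, x)) - 1.0

# near-optimal design commonly reported in the literature
x = [6.016, 5.309, 4.494, 3.502, 2.153]
print(round(cantilever_objective(x), 4), cantilever_constraint(x) <= 1e-3)
```

At this design the constraint is active (close to zero), which is typical of the optimum for this problem.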
The results of comparing the performance of the SCAEICS with the different algorithms for the cantilever beam optimization problem can be seen in Table 10. The optimal solution of the SCAEICS, in terms of the minimum value of the function f(x), is better than those of the other algorithms; therefore, the SCAEICS yields the best mass of the rectangular section of the cantilever beam.
The objective of the optimal design of a three-bar truss is to find the optimal truss volume by adjusting the cross-sectional areas. The problem has a non-linear fitness function, three inequality constraints, and two decision variables. The mathematical expression is Equation (14),
where the minimum value of the function f(x) is the optimal volume of the three-bar truss; l, P, and σ enter the deflection, buckling, and stress constraints of the truss members; and x_1 and x_2 are the design variables of the bars on both sides used to determine the optimal cross-section.
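Equation (14) is likewise not reproduced in this excerpt; this sketch uses the commonly stated form of the three-bar truss problem, with l = 100 cm and P = σ = 2 kN/cm² as the usual literature constants, and a near-optimal design from the literature rather than the paper's result:

```python
from math import sqrt

L, P, SIGMA = 100.0, 2.0, 2.0  # usual literature constants (assumed)

def truss_volume(x1, x2):
    """Truss volume (to be minimized)."""
    return (2 * sqrt(2) * x1 + x2) * L

def truss_constraints(x1, x2):
    """The three stress constraints, all required to be <= 0."""
    d = sqrt(2) * x1 ** 2 + 2 * x1 * x2
    return [(sqrt(2) * x1 + x2) / d * P - SIGMA,
            x2 / d * P - SIGMA,
            1.0 / (x1 + sqrt(2) * x2) * P - SIGMA]

# near-optimal design commonly reported in the literature
x1, x2 = 0.7887, 0.4082
print(round(truss_volume(x1, x2), 2), all(g <= 1e-3 for g in truss_constraints(x1, x2)))
```

This yields a volume of about 263.9, consistent with the 263.9055 reported for the SCAEICS below.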
As can be seen from Table 11, the optimal solution of the SCAEICS for the three-bar truss optimization problem is 263.9055, which is better than the optimization results of the other algorithms, indicating that the cross-sectional volume of the three-bar truss obtained by the SCAEICS is optimal. Future research will focus on multi-objective optimization problems and on solving more practical problems.
Researchers can next focus on more practical problems such as shop-floor scheduling, image processing, text classification, transportation, logistics scheduling, agricultural water resource allocation, other mechanical optimization design problems, and the hyper-parameter optimization problems faced in machine learning and deep learning. Finally, interested researchers can further analyze the performance of the algorithm by improving the method.

Conclusions
To improve the optimization performance of the SCA, the problems of slow convergence, low search accuracy, and the tendency to fall into local optima were addressed, and a sine cosine algorithm for the collaborative search of elite individuals was proposed. The main contributions of this work are as follows.
(1). Tent chaotic mapping was used to initialize the population, and a hyperbolic tangent function was used to non-linearly adjust the control parameter r1, which enhanced the uniformity of the population distribution and balanced the global exploration and local exploitation performance.
(2). By combining the search strategy of the SCA, the m-neighborhood locally optimal individual-guided search strategy, and the global optimal individual-guided search strategy, the search method of the original algorithm was improved. The three search strategies were executed alternately to achieve a collaborative search, which effectively improved the convergence accuracy and prevented the algorithm from falling into a local optimum.
(3). A greedy selection strategy was used to select the better individuals of the population to speed up convergence.
(4). Simulation experiments on the 23 basic test functions and the CEC2020 functions were conducted to compare the sine cosine algorithm for the collaborative search of elite individuals (SCAEICS) with the SCA, other improved SCAs, other chaos-based algorithms, and other intelligent optimization algorithms; the experimental results show that the SCAEICS had a better optimization performance.
(5).The feasibility and applicability of the SCAEICS were further verified by optimizing two example problems in mechanical design, which could provide new ideas for research in the field of mechanical design.
For future work, we plan to test the improved algorithm using a more novel set of CEC test functions.In addition, we are going to apply the improved algorithm to agricultural water resources for practical applications.

Figure 1 .
Figure 1.Schematic diagram of the sine cosine algorithm.The yellow striped area indicates the global search space, the black striped area indicates the local exploitation space, xi indicates the current individual, and x* indicates the target individual.

Figure 2 .
Figure 2. Comparison chart of population initialization (the subfigure (a) shows the effect of SCA population initialization, and (b) shows the effect of population initialization after using tent chaotic mapping).

Figure 3 .
Figure 3.Comparison chart of parameter r 1 curve.

Algorithm 1 :
Sine cosine algorithm for the collaborative search of elite individuals (SCAEICS)

Figure 4 .
Figure 4.The flowchart of the proposed algorithm.

Sine cosine algorithm for the collaborative search of elite individuals (SCAEICS): enter the parameters and initialize. Set the population size N; use the tent chaotic mapping strategy to generate the initialized population (x_i, i = 1, 2, ..., N); set the maximum number of iterations T, the neighborhood size m, the integer h, and the spatial dimension D (the functions f14~f23 have fixed dimensions). Calculate the individual fitness values f(x_i), i = 1, 2, ..., N, and find the globally optimal individual and its location.
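The tent-chaos initialization named in Algorithm 1, together with a hyperbolic-tangent schedule for r1, can be sketched as follows. The tent parameter (0.7) and the tanh scaling S are assumptions where the paper's equations are not reproduced here:

```python
import math

def tent_sequence(x0, length, mu=0.7):
    """Tent chaotic map on (0, 1): x <- x/mu if x < mu else (1 - x)/(1 - mu)."""
    seq, x = [], x0
    for _ in range(length):
        x = x / mu if x < mu else (1 - x) / (1 - mu)
        seq.append(x)
    return seq

def init_population(n, dim, lb, ub, x0=0.123):
    """Map one tent sequence onto the search box [lb, ub]^dim."""
    chaos = tent_sequence(x0, n * dim)
    return [[lb + chaos[i * dim + j] * (ub - lb) for j in range(dim)]
            for i in range(n)]

def r1_tanh(t, t_max, a_max=2.0, a_min=0.0, s=2.0):
    """Non-linear r1 schedule, decreasing from a_max toward a_min over t_max steps."""
    return a_min + (a_max - a_min) * (1 - math.tanh(s * t / t_max))

pop = init_population(30, 10, -100.0, 100.0)
print(len(pop), len(pop[0]), round(r1_tanh(0, 500), 2))  # 30 10 2.0
```

The chaotic sequence spreads the initial individuals more evenly than uniform sampling typically does, and the tanh schedule shifts the search from exploration (large r1) to exploitation (small r1), matching the balancing behaviour described in the text.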

Table 1 .
Basic test functions.

Table 3 .
Table of parameter settings for the participation comparison algorithm.

Table 4 .
Comparison of the SCAEICS with the SCA and other intelligent algorithms.

Table 5 .
Comparison of the SCAEICS with the SCA and other intelligent algorithms (D = 60).

Table 6 .
Comparison of the SCAEICS with the SCA and other intelligent algorithms (D = 100).

Table 7 .
Comparison table of the SCAEICS with other improved algorithms.

Table 8 .
Comparison table of the SCAEICS with other chaos-based algorithms.

Table 9 .
Effect of local population size m on the performance of the SCAEICS.

Table 10 .
Performance comparison of different algorithms for cantilever beam optimization problems.
(IV). Example of optimized design of a three-bar truss