Hybrid Sine Cosine Algorithm for Solving Engineering Optimization Problems

Abstract: Engineering design optimization problems are difficult to solve because the objective function is often complex, with a mix of continuous and discrete design variables and various design constraints. Our research presents a novel hybrid algorithm that integrates the benefits of the sine cosine algorithm (SCA) and artificial bee colony (ABC) to address engineering design optimization problems. The SCA is a recently developed metaheuristic algorithm with many advantages, such as good search ability and reasonable execution time, but it may suffer from premature convergence. An enhanced SCA search equation is proposed to avoid this drawback and reach a preferable balance between exploitation and exploration abilities. In the proposed hybrid method, named HSCA, the SCA with the improved search strategy and the ABC algorithm with two distinct search equations are run alternately while working on the same population. The ABC with multiple search equations can provide proper diversity in the population, so that both algorithms complement each other to create beneficial cooperation from their merger. Certain feasibility rules are incorporated in the HSCA to steer the search towards feasible areas of the search space. The HSCA is applied to fifteen demanding engineering design problems to investigate its performance. The presented experimental results indicate that the developed method performs better than the basic SCA and ABC. The HSCA achieves highly competitive results compared to other recent state-of-the-art methods.


Introduction
Design optimization problems arise in diverse engineering areas. Some examples are structural design problems in civil engineering, synthesis problems in chemical engineering, and mechanical problems in mechanical engineering [1]. Most engineering design optimization problems are hard to solve due to the complicated objective function and different types of constraints. An additional difficulty is that these problems might have objective functions with continuous and discrete variables.
An engineering design optimization problem tries to minimize or maximize an objective function with respect to design variables while satisfying the constraints established on the search space. This problem can be formulated as follows:

minimize f(x), x = (x_1, x_2, ..., x_D)   (1)
with respect to l_i ≤ x_i ≤ u_i, i = 1, 2, ..., D,
subject to g_j(x) ≤ 0, j = 1, ..., l,
h_j(x) = 0, j = l + 1, ..., m.
In Equation (1), a problem solution is denoted as x, f(x) is the objective function, the number of design variables is denoted as D, inequality constraints are indicated as g_j(x), while h_j(x) denote equality constraints. In this problem formulation, design variables are discrete or continuous.
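As an illustration, the general formulation in Equation (1) can be encoded directly. The toy objective, constraint, and bounds below are hypothetical stand-ins for illustration only, not one of the fifteen design problems studied later:

```python
# A hypothetical instance of the formulation in Equation (1):
# minimize f(x) subject to bounds l_i <= x_i <= u_i and g_j(x) <= 0.

def objective(x):
    # f(x) = (x1 - 1)^2 + (x2 - 2)^2  (assumed toy objective)
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def inequality_constraints(x):
    # g1(x) = x1 + x2 - 4 <= 0  (assumed toy constraint)
    return [x[0] + x[1] - 4.0]

BOUNDS = [(-5.0, 5.0), (-5.0, 5.0)]  # l_i <= x_i <= u_i

def is_feasible(x):
    # A point is feasible if it lies inside the box bounds and
    # satisfies every inequality constraint g_j(x) <= 0.
    in_bounds = all(l <= xi <= u for xi, (l, u) in zip(x, BOUNDS))
    return in_bounds and all(g <= 0.0 for g in inequality_constraints(x))
```

The unconstrained optimum of this toy objective, x = (1, 2), happens to be feasible here; in the engineering problems of Section 4, the constraints typically exclude the unconstrained optimum.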
Diverse exact and approximate algorithms have been proposed to solve constrained engineering design problems [2]. The efficacy of exact optimization algorithms is solid for some design problems. However, these methods have to make certain presumptions that cannot be justified in some cases. For instance, gradient-based algorithms belong to the group of exact algorithms, and these methods require the functions to be smooth enough. Exact methods are often inefficient when solving non-linear optimization problems which possess multiple local optima, since these problems are demanding to solve [3].
Since features of objective and constraint functions, such as modality, convexity, and smoothness, have limited the applicability of exact methods, numerous approximate algorithms have been developed in the last few decades [4][5][6]. Two groups of approximate algorithms are specific heuristics and metaheuristics. Specific heuristics are developed to solve a particular problem, while metaheuristics are applicable to a wide range of optimization problems. These algorithms use randomness in their search equations to help them avoid local optima and effectively explore the search space. Even though it is not guaranteed that the global optimum solution will be reached, metaheuristics can tackle optimization problems by finding solutions of sufficient quality in a reasonable time.
Many metaheuristic algorithms mimic natural metaphors to solve diverse challenging optimization problems. An essential class of these methods includes population-based metaheuristic algorithms, which reach appropriate solutions by iteratively choosing and combining solutions from the population. These algorithms are separated into five categories: evolutionary-based, swarm-based, physics-based, human-based, and math-based metaheuristics [5]. Evolutionary algorithms, such as genetic and differential evolution algorithms, draw inspiration from nature and employ selection, recombination, and mutation search operators. Swarm intelligence methods are inspired by the collective behavior of animal societies. Numerous prominent metaheuristic algorithms belong to this category; some notable representatives are the artificial bee colony (ABC) algorithm [7], particle swarm optimization (PSO) [8], cuckoo search (CS) [9], the bat algorithm (BA) [10], and the firefly algorithm (FA) [11]. Human-based techniques, such as teaching-learning-based algorithms, are motivated by distinct human-made phenomena. Metaheuristics that mimic rules of physics, such as the gravitational search algorithm (GSA) [12], fall into the category of physics-based algorithms. Math-based metaheuristics emulate mathematical rules; one popular recently proposed math-based algorithm is the sine cosine algorithm (SCA) [13]. Additionally, in recent years, applying machine learning to develop efficient metaheuristic algorithms has been studied [14,15]. Basic metaheuristic algorithms, after their creation, are often modified or hybridized with other methods to enhance their performance for some problem classes.
The sine cosine algorithm (SCA), proposed by Mirjalili, is inspired by the properties of the trigonometric sine and cosine functions. This technique has several modified and hybridized variants that are used to solve various optimization problems. In [5], it was noticed that the SCA has to be combined with some other metaheuristic algorithm to solve specific real-world optimization problems. Even though this algorithm has many advantages, such as simplicity, adaptability, robustness, and reasonable execution time, it has been pointed out that the SCA may suffer from premature convergence. To escape from a local optimum point, the SCA should maintain suitable diversity in the population. Hybridization has been a broadly established approach to produce differences among individuals during the search process [16].
Motivated by the above observations, a hybrid method (HSCA) is developed to solve complex engineering optimization problems. The foundations of this method are the modified SCA and ABC metaheuristics. The proposed approach is a collaborative hybrid algorithm with a sequential structure. In the HSCA, both methods, the modified SCA and the ABC with two distinct search equations, execute alternately until the convergence criterion is met. To arrive at a preferable equilibrium between exploitation and exploration abilities, a modified SCA search equation is proposed. In the developed SCA strategy, the base vector is selected randomly from the population and another difference vector is added to the base vector. Since the ABC metaheuristic has superior exploration ability, the ABC method with two search strategies is combined with the modified SCA. In the employed ABC variant, both ABC search equations are engaged in generating candidate solutions to provide suitable diversity in the population. Additionally, a mechanism to deal with the constraints based on three feasibility rules is incorporated in the proposed HSCA. The proposed HSCA is tested on fifteen challenging engineering design optimization problems. The achieved experimental results are compared to those of the basic ABC, the SCA, and recent prominent metaheuristics developed to tackle design problems.
The structure of this paper is as follows. A brief review of the SCA and ABC is presented in Section 2. Section 3 describes the novel hybrid sine cosine algorithm. Section 4 presents fifteen engineering design optimization problems. The experimental design and the reached results are given in Section 5. Concluding remarks are provided in Section 6.

Sine Cosine Algorithm
The search strategy of the SCA is based on the cyclic pattern of the sine and cosine functions [13]. The search equation employed to update the positions of solutions in the population is given as follows:

x_{i,j}^{t+1} = x_{i,j}^t + r_1 · sin(r_2) · |r_3 · y_j − x_{i,j}^t|,  if r_4 < 0.5,
x_{i,j}^{t+1} = x_{i,j}^t + r_1 · cos(r_2) · |r_3 · y_j − x_{i,j}^t|,  otherwise,   (2)

where i = 1, ..., SP and j = 1, ..., D, such that SP is the size of the population and D is the number of design variables. In Equation (2), x_{i,j}^t is the jth component of solution x_i at iteration t, and y_j is the jth component of the best achieved solution at iteration t. The specific control parameters r_1, r_2, r_3, and r_4 are randomly chosen. The parameter r_1 regulates the location of the new solution. If r_1 < 1, the new solution will be located in the space between the current solution and the best solution reached so far; if r_1 > 1, the new solution will be positioned outside that space. The distance towards or away from the best-reached solution is determined by the parameter r_2, which is chosen from the range [0, 2π]. The value of r_3 represents a randomly chosen weight for the best-reached solution. The random parameter r_4 takes values from [0, 1] and switches between the sine and cosine components in Equation (2).
In the SCA, the optimization process starts with randomly created solutions. The method updates the solutions from the population using Equation (2). The optimization process of the SCA is finished when the iteration counter exceeds the maximal number of iterations (MNI).
Exploration, the process of generating solutions with a great deal of diversity, and exploitation, the process of searching a local area around good solutions already found, are essential components of any metaheuristic algorithm. These two processes have to be well balanced to achieve quality results. To balance the impact of exploration and exploitation in the SCA, the r_1 value is updated during the search by the equation

r_1 = a − t · (a / MNI),   (3)

where t is the current cycle and a is a constant. The pseudocode of the SCA is given as Algorithm 1.
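The position update of Equation (2) together with the r_1 schedule of Equation (3) can be sketched as follows. The range [0, 2] used for the random weight r_3 is an assumption, since the text only states that r_3 is a randomly chosen weight:

```python
import math
import random

def sca_update(x, best, t, mni, a=2.0):
    # One SCA position update (Equation (2)) with the linearly
    # decreasing r1 of Equation (3). x and best are lists of floats.
    r1 = a - t * a / mni                      # Equation (3)
    new_x = []
    for j in range(len(x)):
        r2 = random.uniform(0.0, 2.0 * math.pi)
        r3 = random.uniform(0.0, 2.0)         # assumed range for the random weight
        r4 = random.random()
        if r4 < 0.5:                          # sine branch of Equation (2)
            step = r1 * math.sin(r2) * abs(r3 * best[j] - x[j])
        else:                                 # cosine branch of Equation (2)
            step = r1 * math.cos(r2) * abs(r3 * best[j] - x[j])
        new_x.append(x[j] + step)
    return new_x
```

Note that at the final iteration (t = MNI) the factor r_1 reaches zero, so the update leaves the solution unchanged, which reflects the shift from exploration to pure exploitation over the run.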

Algorithm 1
The pseudocode of the SCA

1: Generate the initial population of SP random solutions and evaluate them
2: while the maximal number of iterations MNI is not reached do
3:     Record the best solution y reached so far
4:     for each solution x_i in the population do
5:         Generate a new solution by employing Equation (2) and evaluate it
6:     end for
7:     Update control parameters r_1, r_2, r_3, r_4
8: end while
9: return the best solution y

The basic SCA was proposed to solve continuous optimization problems. After its development, several modified variants have been developed to enhance its performance [5]. These modified versions are divided into nine categories: binary, chaotic, opposition-based learning, orthogonal-based learning, Lévy flight, mutation, fuzzy, adaptive, and improved. For instance, a binary SCA was proposed to discover beneficial features [17]. The S-shaped and V-shaped transfer functions were employed to convert the continuous form of the SCA into the respective binary algorithm. A chaotic oppositional SCA that uses an opposition-based learning mechanism was proposed to improve the performance of the SCA for solving global optimization in [18]. In [19], an original band selection method based on the SCA and Lévy flight was proposed.
The SCA is also combined with other metaheuristic methods to improve its performance in general or for a specific group of problems. For example, a hybrid metaheuristic method that combines the Lévy flight distribution, PSO, and SCA for unconstrained numerical function optimization and constrained engineering design problems was proposed in [20]. This algorithm incorporates the death penalty technique to handle constraints. A joint procedure arising from the SCA and ABC for solving multilevel thresholding problems has recently been developed [21]. In the proposed approach, the SCA was employed to achieve the global optimal solution, while the ABC was used to shorten the search area. In general, the SCA method is used to solve optimization models from various fields, for instance, electrical engineering, image processing, computer engineering, classification, and control engineering [5].

Artificial Bee Colony Algorithm
The behavior of a honey bee swarm in search of food influenced the emergence of the ABC metaheuristic [7]. Three types of individuals exist in the swarm of the ABC: employed, onlooker, and scout bees. Employed bees try to find sources of food. Onlookers search around the previously found sources of food. The scout bees are created from certain employed bees that leave unfavorable sources to search for new ones. The framework of the ABC is described as Algorithm 2.

Algorithm 2
The framework of the ABC algorithm

1: Initialize the control parameters and generate a random population of SP solutions
2: while the maximal number of iterations MNI is not reached do
3:     Employed bee phase
4:     Onlooker bee phase
5:     Scout bee phase
6:     Memorize the best solution found so far
7: end while

In the initialization part of the ABC, the control parameters of the algorithm are initialized, and a population of solutions x_i, i = 1, 2, ..., SP, is randomly generated in the search space. The ABC employs two standard control parameters, the maximum number of iterations (MNI) and the size of the population (SP), and the specific parameter limit, which defines the number of attempts after which a food source is abandoned. There are three main phases in the ABC algorithm. In both the employed and onlooker stages, the same search strategy is used to generate a candidate solution v_i from the current one x_i. This strategy is described by the equation

v_{i,j} = x_{i,j} + φ · (x_{i,j} − x_{k,j}),   (4)

where j is a randomly selected index, x_{i,j} is the jth component of the current solution x_i, x_k is a randomly picked solution distinct from x_i, and φ is a random number from (−1, 1). After a candidate solution is created, the boundary constraint-handling mechanism is applied to v_i, and the greedy selection process is applied between x_i and v_i. In the employed stage, each solution from the population is subjected to the update process described by Equation (4).
The same search strategy is used in the onlooker stage as in the employed stage, but fitness-proportionate selection determines which solutions are subjected to the update process. The greedy selection between the candidate and the current solution decides whether the current solution will be updated. In the scout phase, a solution that does not change over a predetermined number of trials is replaced with a randomly generated solution.
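A minimal sketch of the employed-bee pass, combining the search strategy of Equation (4) with greedy selection. The unconstrained, minimization form is assumed here; the constrained variants discussed later replace the greedy comparison with Deb's rules:

```python
import random

def abc_employed_phase(population, fitness, objective):
    # One employed-bee pass: each solution generates a candidate via
    # Equation (4) and keeps it only if it improves the objective
    # (greedy selection, minimization assumed).
    sp = len(population)
    for i in range(sp):
        x = population[i]
        j = random.randrange(len(x))                    # random component index
        k = random.choice([s for s in range(sp) if s != i])
        phi = random.uniform(-1.0, 1.0)
        v = list(x)
        v[j] = x[j] + phi * (x[j] - population[k][j])   # Equation (4)
        fv = objective(v)
        if fv < fitness[i]:                             # greedy selection
            population[i], fitness[i] = v, fv
    return population, fitness
```

Because the greedy step never accepts a worse candidate, the recorded objective value of every solution is non-increasing from one pass to the next.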
The ABC algorithm is one of the most notable swarm intelligence techniques. Many modified and hybridized ABC algorithms for continuous optimization problems have been developed since its invention. For instance, in [22], the ABC algorithm incorporated a constraint-handling method based on three feasibility rules into its selection step with the intention of solving engineering optimization problems. A modified variant of the ABC algorithm for solving constrained design optimization problems, which included the crossover operator in the scout bee stage and a modified ABC search equation in the onlooker stage, was presented in [23]. A variant of the ABC algorithm combined with a dual-search mechanism and differential self-perturbation for solving engineering optimization problems was presented in [24]. A novel ABC variant, named I-ABC greedy, has recently been developed for solving mechanical engineering design problems. The I-ABC greedy algorithm integrates three modifications into the basic ABC: the use of an opposition-based learning concept, a modified ABC search equation, and a constraint-handling technique based on Deb's rules [25].
Despite the fact that the standard ABC algorithm was developed to tackle optimization problems in the continuous domain, nowadays there are many ABC variants for combinatorial, binary, and integer programming problems [26][27][28]. To be suitable for solving these types of problems, diverse encoding types, search equations, and selection operators are incorporated into the ABC algorithm.

Proposed Approach: HSCA
The performance of the SCA relies on its solution-seeking strategy, which has characteristics that ensure exploration and exploitation abilities. The use of the sine and cosine functions allows an agent to be created near another agent. This strategy ensures the exploitation of the space bounded between two agents. According to Equation (2), exploration is guaranteed since the algorithm can search outside the space bounded by the current solution and the best solution reached so far. Even though the SCA has good search ability, simple implementation, a small number of control parameters, and good execution time, it lacks accuracy of the final solution in solving specific real-world optimization problems.
The problem of premature convergence is related to maintaining solution variety during the search [16]. The diversity between the agents of a population is a prerequisite for exploration, and it is often high in the initial part of a search. It is expected that these differences among agents decrease as specific individuals find favorable search areas and the population progresses towards the global optimum [29].
Since the search strategy of the ABC algorithm has a strong exploration ability, an approach that combines the SCA and ABC could efficiently sustain diversity in the population. Hence, a collaborative hybrid method with a sequential structure based on the SCA and ABC is developed. In the proposed HSCA, the SCA with an enhanced search strategy and the ABC algorithm with two distinct search equations are run alternately, working on the same population, until the MNI is reached. Both the SCA with the modified search equation and the multi-strategy ABC algorithm employ a selection mechanism based on three feasibility rules for handling constraints.
A novel mutation operator is proposed and used in the SCA to achieve a better ratio between exploitation and exploration and thus a more productive search. This search strategy generates the candidate solution v_i from the current solution x_i according to the following equation:

v_{i,j} = x_{n1,j} + r_1 · sin(r_2) · |y_j − x_{n1,j}| + rand_i · (y_j − x_{n2,j}),  if R_{i,j} < 0.5,
v_{i,j} = x_{n1,j} + r_1 · cos(r_2) · |y_j − x_{n1,j}| + rand_i · (y_j − x_{n2,j}),  otherwise,   (5)

where x_{n1} and x_{n2} are two neighbourhood solutions chosen randomly from the population, y is the best solution achieved so far, rand_i is a random number from (0, 1], and j = 1, ..., D. As in Equation (2), r_1 regulates the location of the candidate solution and is updated by Equation (3); the parameter r_2 determines the distance towards or away from the best-reached solution and takes values from the range [0, 2π]; and the parameter R_{i,j} takes values from the range [0, 1] and switches between the sine and cosine components in Equation (5). In the proposed mutation strategy, the two vectors x_{n1} and x_{n2} are picked randomly from the population, and the base vector is then chosen at random between these two solutions. In Equation (5), two difference vectors are added to the base vector. The second term of Equation (5) is the same as the corresponding term of Equation (2), except that the random weight for the best-found solution is not used, since it may provide too much exploration and an inefficient search. The best solution found and the randomly picked neighbourhood solution form the third term of Equation (5). The use of the third term and the randomly picked base vector might lead to better perturbation.
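The mutation strategy described above can be sketched as follows. Since the exact form of Equation (5) is described only verbally here, this reconstruction, with the base vector picked at random between the two neighbourhood solutions, an unweighted sine/cosine difference towards the best solution, and a second difference term pairing the best solution with the other neighbourhood, is an assumption:

```python
import math
import random

def hsca_mutation(x_n1, x_n2, best, r1):
    # Sketch of the modified SCA search equation (Equation (5)),
    # reconstructed from its textual description; the exact term
    # placement is an assumption.
    base, other = (x_n1, x_n2) if random.random() < 0.5 else (x_n2, x_n1)
    v = []
    for j in range(len(base)):
        r2 = random.uniform(0.0, 2.0 * math.pi)
        trig = math.sin(r2) if random.random() < 0.5 else math.cos(r2)
        rand_i = random.uniform(1e-12, 1.0)   # rand_i from (0, 1]
        v.append(base[j]
                 + r1 * trig * abs(best[j] - base[j])      # difference to the best
                 + rand_i * (best[j] - other[j]))          # second difference term
    return v
```

Note that both perturbation terms vanish when the chosen neighbourhoods coincide with the best solution, so the candidate then equals the base vector.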
The selection process based on three feasibility rules is incorporated in the proposed approach to steer the search toward feasible areas. Hence, after the candidate solution v_i is created by Equation (5), these rules are applied to determine the acceptability of the new solution v_i in the population.
Deb developed the feasibility rules using the following criteria to compare two solutions [30]: (1) if both solutions are feasible, the one with the better objective function value is chosen; (2) any feasible solution is favored over any infeasible solution; (3) between two infeasible solutions, the one with the lowest sum of constraint violations (CV) is chosen. This sum is calculated by

CV(x) = Σ_{j=1}^{l} max(0, g_j(x)) + Σ_{j=l+1}^{m} |h_j(x)|.   (6)

In the proposed ABC algorithm, two search strategies with a superior mixing ability coexist through the search process. The proposed ABC algorithm does not include three distinct phases; only the employed bee stage is performed. In this stage, each agent randomly chooses a search strategy for creating a candidate solution. After generating a potential new solution, Deb's rules are applied to determine whether the new solution will join the population.
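Deb's three rules and the violation sum of Equation (6) translate directly into code. The small tolerance used to relax equality constraints is a common convention and an assumption here, since the text does not state one:

```python
def constraint_violation(g_values, h_values, eps=1e-4):
    # Sum of constraint violations (Equation (6)): positive parts of the
    # inequality constraints plus absolute equality violations, with
    # equalities relaxed by an assumed tolerance eps.
    cv = sum(max(0.0, g) for g in g_values)
    cv += sum(max(0.0, abs(h) - eps) for h in h_values)
    return cv

def deb_better(f_a, cv_a, f_b, cv_b):
    # True if solution A is preferred over solution B by Deb's rules.
    feas_a, feas_b = cv_a == 0.0, cv_b == 0.0
    if feas_a and feas_b:
        return f_a < f_b        # rule 1: better objective value wins
    if feas_a != feas_b:
        return feas_a           # rule 2: feasible beats infeasible
    return cv_a < cv_b          # rule 3: smaller violation sum wins
```

Because the comparison never weighs the objective value against the violation sum, no penalty coefficients need to be tuned, which is one reason these rules are popular in constrained metaheuristics.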
Both search operators used in the developed ABC approach were previously used in ABC variants for constrained optimization [22,23]. The first search strategy creates a candidate solution v_i from the current solution x_i according to the following equation:

v_{i,j} = x_{i,j} + φ_i · (x_{i,j} − x_{k,j}),  if R_j < MR,
v_{i,j} = x_{i,j},  otherwise,   (7)

where φ_i is a uniform random number from [−1, 1], x_k is another agent picked randomly from the population, and R_j is a real number picked at random from [0, 1], for j = 1, 2, ..., D.
The quantity MR in Equation (7) is a modification rate control parameter that controls the potential changes of the design parameters. The MR value is increased from 0.1 to the user-specified value MR_max in the first P · MNI cycles to reduce the chances of skipping the exploitation of current agents from the population. This parameter is set to MR_max in the remaining cycles. Hence, the MR value is updated by

MR(t) = 0.1 + (MR_max − 0.1) · t / (P · MNI),  if t ≤ P · MNI,
MR(t) = MR_max,  otherwise,   (8)

where the control parameter P takes values from (0, 1) and regulates the number of starting cycles in which lower values of MR are employed.
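The MR schedule can be sketched as below. The linear ramp from 0.1 to MR_max over the first P·MNI cycles is an assumption consistent with the description:

```python
def modification_rate(t, mni, mr_max, p):
    # MR schedule of Equation (8): MR grows from 0.1 to mr_max over the
    # first p * mni iterations (a linear ramp is assumed), then stays
    # at mr_max for the remaining cycles.
    ramp_end = p * mni
    if t >= ramp_end:
        return mr_max
    return 0.1 + (mr_max - 0.1) * t / ramp_end
```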
The second search strategy used in the proposed multi-strategy ABC algorithm is described by

v_{i,j} = x_{i,j} + φ_i · (x_{v,j} − x_{k,j}),  j = 1, 2, ..., D,   (9)

where x_v and x_k are distinct neighbourhood solutions chosen randomly from the population and φ_i is a uniform random number from the range [−1, 1].
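The second strategy might be sketched as follows. Taking the current solution x_i as the base vector and applying the same random difference to every component are assumptions drawn from the textual description:

```python
import random

def abc_strategy_s2(x, x_v, x_k):
    # Sketch of the second ABC search strategy (Equation (9)): every
    # component of the current solution x is perturbed by the scaled
    # difference of two randomly chosen neighbourhood solutions.
    phi = random.uniform(-1.0, 1.0)
    return [x[j] + phi * (x_v[j] - x_k[j]) for j in range(len(x))]
```

Since no modification rate gates the components, this strategy perturbs the whole solution at once and therefore produces larger moves than Equation (7) at low MR values.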
Let us denote Equation (7) as S_1 and Equation (9) as S_2. The pseudocode of the HSCA is given as Algorithm 3.
In each iteration of the HSCA, one of the three search strategies described by Equations (5), (7), or (9) is applied to each solution from the population with the intention of creating a potential new solution. Whenever a candidate solution oversteps the boundaries of the design variables, new values are created by the well-known boundary constraint-handling technique described in [31]. Discrete design variables are treated as though they were continuous; whenever a new solution is generated, these values are rounded to the closest available discrete values.
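The boundary handling and discrete rounding steps can be sketched as follows. Simple clamping to the violated bound is used here as a stand-in for the technique of [31], which is not reproduced in the text:

```python
def clamp_to_bounds(v, bounds):
    # A common boundary constraint-handling rule: components that
    # overstep a bound are reset to that bound (a stand-in for the
    # technique of [31]).
    return [min(max(vj, l), u) for vj, (l, u) in zip(v, bounds)]

def round_discrete(v, discrete_sets):
    # Round each discrete design variable to its closest allowed value;
    # entries with None are continuous and are left unchanged.
    out = []
    for vj, allowed in zip(v, discrete_sets):
        if allowed is None:
            out.append(vj)
        else:
            out.append(min(allowed, key=lambda a: abs(a - vj)))
    return out
```

For a pressure-vessel-style variable restricted to multiples of 0.0625, for example, `discrete_sets` would hold the list of admissible multiples for that component.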

Algorithm 3
The pseudocode of the HSCA

1: Initialize the control parameters and generate a random population of SP solutions
2: while the maximal number of iterations MNI is not reached do
3:     Record the best solution y reached so far
4:     if the modified SCA phase is scheduled for this cycle then
5:         for each solution x_i: create a candidate v_i by Equation (5) and apply Deb's rules
6:     else
7:         for each solution x_i: create a candidate v_i by S_1 or S_2, chosen at random, and apply Deb's rules
8:     end if
9:     Update r_1 by using Equation (3)
10:    Update MR by using Equation (8)
11: end while
12: return the best solution y

The HSCA employs four specific control parameters: the parameter a, which represents the initial value of the parameter r_1; the parameter r_2; MR_max; and P. The control parameters a and r_2 are used in the search strategy of the modified SCA algorithm, while the parameters MR_max and P are employed in Equation (7) of the proposed multi-strategy ABC algorithm. In addition, two standard control parameters for population-based algorithms, the size of the population SP and the maximum number of iterations MNI, are used in the HSCA. It can be seen from Algorithm 3 that in each cycle of the HSCA each agent is updated at most once. Hence, the computational time complexity of the proposed HSCA is O(MNI · SP · f), where O(f) is the computational time complexity of evaluating the objective function value for a specific problem f. The performance of the HSCA relies on the three employed search strategies and the incorporated constraint-handling technique. The good exploitation capacity of the modified SCA strategy is reached by using the current best solution in both difference vectors. In the second term of Equation (5), the value of the parameter r_1 is in the range [0, a], where a is a non-negative constant. Since lower values of this parameter encourage the exploitation ability of this search equation, the value of the parameter a is set to 0.75. The exploitation is further increased by gradually reducing the parameter r_1 during the search by using Equation (3). The control parameter r_2 takes values from the range [0, 2π], as in the basic SCA [13]. The exploration capacity of the modified SCA search equation is enhanced by selecting the base vector randomly from the population and adding another difference vector to the base vector. To better maintain diversity in the population, the modified SCA optimizer is combined with the multi-strategy ABC algorithm. In the ABC search operator given by Equation (7), higher values of the MR_max parameter and lower values of the P parameter produce greater diversity in a solution during the search. The ABC search strategy given by Equation (9) does not use the MR parameter, i.e., when this equation is used, each component of an old solution is updated to create a potential new solution.

Design Problems
The performance of the proposed HSCA is evaluated on fifteen challenging structural and engineering design problems. These problems are: speed reducer, tension/compression spring, pressure vessel, welded beam, gear train, stepped cantilever beam, multiple disk clutch brake, cantilever beam, helical compression spring, hollow shaft, hydro-static thrust bearing, Belleville spring, car side impact, gas transmission compressor, and reinforced concrete beam. A brief description of these problems is given as follows.
The speed reducer problem aims to minimize the weight. This problem has six continuous and one integer optimization variable and eleven inequality constraints. The most precise recorded rounded value to achieve (VTA) for this problem is 2994.470166, while its mathematical description can be found in [32].
The tension/compression spring problem tries to minimize the weight. This problem has three continuous optimization variables and four inequality constraints. The best recorded rounded VTA is 0.012665 [33]. The mathematical definition of this problem is presented in [22].

Minimization of the material, forming, and welding costs is the goal of the pressure vessel problem. This problem has two design variables which have to be integer multiples of 0.0625, two continuous variables, and three inequality constraints. The global optimum for this problem is 6059.714335048436 [34]. The mathematical description of this problem is given in [22].
The target of the welded beam problem is to minimize the fabrication cost of the beam. This problem involves a non-linear objective function of four continuous variables and seven inequality constraints. The best recorded rounded VTA for this problem is 1.724852 [33]. The mathematical description of this problem is given in [22].
The target of the gear train problem is to minimize the error in the gear ratio of a gear train. This problem has four integer optimization variables. The optimal solution for this problem is 2.700857 × 10^−12. The model of this problem is presented in [22].
The target of the stepped cantilever beam problem is to minimize the volume of a stepped cantilever beam. This problem has four continuous and six discrete design variables and eleven inequality constraints. The value 63,893.43 is the best reported rounded VTA for this problem [35]. The mathematical description of this problem is given in [36].
The multiple disk clutch brake problem aims to minimize the mass. This problem has five integer variables and eight inequality constraints. The best achieved VTA for this problem is 2.3524245790 × 10^−1, while its mathematical description can be found in [37].
The cantilever beam problem tries to minimize the overall weight of a cantilever beam. This problem has five continuous variables and one inequality constraint. The global optimal solution is 1.339956367 [34]. The mathematical description of this problem is presented in [34].
The helical compression spring design problem tries to minimize the volume of spring steel. This problem has one continuous variable, two discrete variables, and eight inequality constraints. The best reported rounded VTA for this problem is 2.658559, while its mathematical description can be found in [4].
The hollow shaft problem tries to minimize the weight of a hollow shaft. This problem has two continuous variables and two non-linear constraints. The best recorded rounded VTA is 2.370398, while its mathematical description can be found in [4].
The hydro-static thrust bearing problem tries to minimize the bearing power loss. This problem has four continuous decision variables and seven inequality constraints. The best-known VTA for this problem is 1.6254428092 × 10^3 [37]. A mathematical description of this problem is presented in [38].
The Belleville spring design problem aims to reach the minimum volume. It includes four continuous variables and seven non-linear constraints. The most precise recorded rounded VTA is 1.979675 [35]. The mathematical definition of this problem can be found in [38].
The goal of the car side impact problem is to minimize the weight. This problem has nine continuous and two discrete variables and ten inequality constraints. The best calculated VTA is 22.84297, while its mathematical description is presented in [38].
The gas transmission compressor problem has four design variables and one inequality constraint. The best known VTA for this problem is 2.9648954173 × 10^6, while its mathematical description is presented in [37].
In the reinforced concrete beam problem, the target is to design the beam for minimum cost. There are three discrete decision variables and two inequality constraints. The value 359.2080 is the best reported rounded VTA for this problem [35]. The mathematical description of this problem is presented in [36].

Experimental Study
This section presents the computational results achieved by the proposed HSCA for the considered fifteen structural and mechanical engineering design problems. Since the HSCA is based on the combination of the SCA and ABC, these optimizers are compared with our developed method. A direct comparison, where the results achieved by the HSCA are compared with the results obtained by our implementations of the SCA and ABC, is conducted. These algorithms have been implemented in the Java programming language. To promote a fair comparison, the same changes as those described in the original paper [22] are employed in our implementation of the ABC algorithm. Also, in our implementation of the SCA, Deb's rules are employed to favor feasible areas of the total search space. Additionally, the results obtained by the HSCA are compared with the results recently achieved for the same design problems by other optimizers in preceding research studies. The results obtained by these metaheuristic optimizers are taken from the specialized literature.
In the SCA, the parameter SP was also set to 30, while this algorithm used the same specific parameter values as those reported in the original paper [13]. The number of function evaluations (NFEs) used by the ABC, SCA, and HSCA is specified in each test case. For each design optimization problem, 50 runs were carried out.

Computational Results
In Tables 1-15, "Best", "Mean", "Worst", "SD", and "NFEs" denote, respectively, the best-found value, the average result, the worst-reached result, the standard deviation, and the number of function evaluations. The best results among the compared optimizers are shown in bold in these tables. The capacity of an optimizer to accomplish the best-known solution is decided by the best results. Algorithmic robustness, or algorithmic stability, is the capability of an algorithm to consistently converge to low values of objective functions without depending on the search space and initial solutions. Average and standard deviation results decide the stability or robustness of an optimizer. The NFEs represent the measure of the convergence rate. In Tables 1-4, NA denotes that results are not available in the respective paper.

The speed reducer design problem was lately solved by the mine blast algorithm (MBA) [39], enhanced Jaya (EJAYA) [40], improved accelerated PSO (IAPSO) [32], crossover-based ABC (CB-ABC) [23], and I-ABC greedy [25]. The experimental results achieved by the HSCA, SCA, ABC, and these optimizers are presented in Table 1.
The results given in Table 1 indicate that the HSCA greatly enhances the capacity of the SCA and ABC to reach the optimal result, as well as the stability of these algorithms. In addition, only the IAPSO and HSCA accomplished the best-rounded VTA in all runs. The HSCA converges to the optimal solution faster than all the compared optimizers except the IAPSO; the IAPSO and HSCA used the same NFEs to achieve these results.
From the results presented in Table 2, it can be observed that only the HSCA reached the best-rounded VTA in all runs. The number of function evaluations needed by the HSCA to solve the tension/compression spring design problem is less than or equal to that of six optimizers (AOS, QSA, EJAYA, CB-ABC, ABC, and SCA), and higher than that of the MBA, ECO, IAPSO, and I-ABC greedy. For the ISCA optimizer, the number of function evaluations is not reported in the respective paper. The pressure vessel design problem was recently solved by the following optimizers: ECO [41], QSA [43], IAPSO [32], ISCA [44], CB-ABC [23], and I-ABC greedy [25]. The statistical results reached by the HSCA, SCA, ABC, and the other optimizers are presented in Table 3. As can be noticed from Table 3, the best-rounded VTA was obtained by all compared optimizers except the ECO, ISCA, ABC, and SCA. It can also be observed that the HSCA achieved the smallest average and standard deviation results among all compared optimizers with the smallest NFEs.
The welded beam design problem was recently examined by employing diverse metaheuristic algorithms, including the FA [36], MBA [39], ECO [41], AOS [42], QSA [43], EJAYA [40], IAPSO [32], CB-ABC [23], and I-ABC greedy [25]. The statistical results achieved by the HSCA, SCA, ABC, and the other optimizers are given in Table 4. From Table 4 it can be seen that all considered optimizers obtain the best-rounded VTA, except the FA, ECO, and SCA. The HSCA, EJAYA, QSA, and MBA reached mean results equal to the best-rounded VTA among all compared techniques. Even though the HSCA obtained a worse standard deviation value than the QSA and MBA, it achieved these statistical results with the fewest NFEs among the considered optimizers.
The gear train design problem was solved by applying the MBA [39], IAPSO [35], and I-ABC greedy [25]. The comparative results obtained by the HSCA, SCA, ABC, and the other approaches are presented in Table 5. From Table 5 it can be observed that all considered optimizers obtain the optimal solution, except the I-ABC greedy algorithm. The SCA, ABC, and HSCA reached the optimal solution with fewer NFEs than the MBA and IAPSO. Additionally, the proposed HSCA obtained smaller average and standard deviation results than the other five considered optimizers.
The stepped cantilever beam design problem was recently solved by employing the FA [36], diversity-guided PSO [35], and I-ABC greedy [25] approaches. The results reached by the HSCA, SCA, ABC, and the other optimizers are presented in Table 6. The presented results show that only the HSCA, ABC, and diversity-guided PSO could obtain the best-rounded VTA. The ABC and HSCA reached the best-rounded VTA with fewer NFEs than the diversity-guided PSO. Also, only the HSCA and diversity-guided PSO accomplished the best-rounded VTA in all runs. The multiple-disc clutch brake design problem was recently solved by the Rao-1, Rao-2, and Rao-3 [4], AOS [42], IAPSO [32], and I-ABC greedy [25] algorithms. The comparative results obtained by the HSCA, SCA, ABC, and the other methods are presented in Table 7. From Table 7 it can be seen that the best-known rounded VTA was obtained only by the ABC, SCA, and HSCA, with the same number of NFEs. Moreover, the HSCA reached smaller average and standard deviation values than the other eight optimizers.
For the cantilever beam design problem, the global optimum result 1.339956367 was found in [34]. This design problem was recently solved by the AOS algorithm [45]. The statistical results of the AOS, ABC, SCA, and HSCA are given in Table 8. The results arranged in Table 8 confirm that only the proposed HSCA can find the best-rounded solution. Additionally, the HSCA obtained smaller average and standard deviation values than the other compared optimizers, with the smallest NFEs. The helical compression spring design problem was recently solved by the FA method [36], diversity-guided PSO [35], and Rao-1, Rao-2, and Rao-3 [4]. Table 9 presents the comparative results obtained by the FA, diversity-guided PSO, Rao-1, Rao-2, Rao-3, ABC, SCA, and HSCA. From these statistical results, it can be noticed that most optimizers reached the best-rounded VTA; the only exceptions are the FA and SCA approaches, which found slightly worse best results. The diversity-guided PSO converges to the best VTA faster than the other compared optimizers. On the other hand, the proposed HSCA produced the smallest mean and standard deviation results among the eight compared metaheuristic algorithms.
The comparative results reached by the Rao-1, Rao-2, Rao-3 [4], ABC, SCA, and HSCA for the hollow shaft design problem are given in Table 10. It can be observed that each metaheuristic approach, except the SCA, was able to obtain the best-rounded VTA. Among these optimizers, only the Rao-1, Rao-2, Rao-3, and HSCA accomplished the best-rounded VTA in each run. The HSCA used the smallest NFEs to achieve these results.
The results reached by the EJAYA [40], ABC, SCA, and HSCA for the hydrostatic thrust bearing design problem are presented in Table 11. From Table 11 it can be observed that the HSCA found the best-rounded VTA and the best average and standard deviation results among all compared optimizers. Also, the HSCA reached these results with the smallest NFEs.
The Belleville spring design problem was recently solved by the MBA [39] and diversity-guided PSO [35]. Table 12 presents the comparative results obtained by the MBA, diversity-guided PSO, ABC, SCA, and HSCA. The best VTA was obtained by the MBA, diversity-guided PSO, and HSCA optimizers. The MBA obtained the best VTA with the smallest NFEs, but only the diversity-guided PSO and HSCA reached the best-rounded solution in each run among all compared optimizers.
The car side impact design problem was recently solved by the FA [36] and EJAYA [40]. The statistical results achieved by the FA, EJAYA, SCA, ABC, and HSCA are arranged in Table 13. From Table 13, it is observable that only the HSCA can find the best-rounded solution. Also, the proposed HSCA obtained smaller average and standard deviation values than the other compared optimizers. The HSCA outperforms the EJAYA, ABC, and SCA in computational efficiency.
The gas transmission compressor design problem was recently solved by the AOS algorithm [45]. The comparative results of the AOS, ABC, SCA, and HSCA are presented in Table 14. From Table 14 it can be observed that the HSCA and AOS can find the best-rounded solution. Also, the HSCA performs more stably and converges to the optimal solution faster than the other compared optimizers.
The comparative results obtained by the FA method [36], diversity-guided PSO [35], QSA [43], ABC, SCA, and HSCA for the reinforced concrete beam design problem are presented in Table 15. From Table 15, it can be noticed that only the proposed HSCA, diversity-guided PSO, and QSA obtained the best-rounded solution in each run. The HSCA achieved the best VTA with the smallest NFEs in comparison with the other five optimizers.
Figure 1 shows the convergence curves of the average solutions achieved by the ABC, SCA, and HSCA over 50 runs for six design problems. This figure indicates that the proposed HSCA converges faster than the ABC and SCA on the chosen design problems. The statistical analysis derived by Wilcoxon's test between the HSCA and SCA and between the HSCA and ABC for these fifteen design problems at the 0.05 significance level yielded a p-value of 0.001. A summary of the results obtained by the HSCA shows that our proposed approach can significantly enhance the performance of the SCA and ABC concerning the quality of the reached results, stability, and computational efficiency for all fifteen design problems. Table 16 presents results based on the performance of the metaheuristic optimizers for the fifteen design problems. In Table 16, "Best", "Mean", and "Worst" denote, respectively, the optimizers which achieved the "best" best, the "best" mean, and the "best" worst solutions, while "NFEs" denotes the optimizers that reached the "best" best solution with the smallest NFEs. From Table 16 it can be observed that the HSCA reached the "best" best, mean, and worst solutions for twelve problems (speed reducer, pressure vessel, welded beam, gear train, stepped cantilever beam, multiple disc clutch brake, cantilever beam, hollow shaft, hydrostatic thrust bearing, car side impact, gas transmission compressor, and reinforced concrete beam) with the smallest NFEs. For the tension/compression spring problem, the IAPSO and I-ABC greedy obtained the "best" best solution with lower NFEs than the HSCA. For the helical compression spring problem, the diversity-guided PSO reached the "best" best solution with lower NFEs than the HSCA, while for the Belleville spring problem, the MBA obtained the "best" best solution with lower NFEs than the HSCA. However, the HSCA showed good performance in solving these three design problems, since it reached the best-rounded VTA in each run with acceptable NFEs. It can be concluded that the comparative results with the other metaheuristic optimizers show that the HSCA has efficient performance regarding the quality and stability of the results, and convergence speed.

Analyses
In this section, the behavior of the HSCA, ABC, and SCA when solving design problems is analyzed with regard to population diversity. Diversity indicates the differences between agents, and preserving a proper amount of diversity during the search is an essential factor in avoiding local optima and balancing exploration and exploitation. Exploration and exploitation are intertwined in the three mutation operators employed in the HSCA, and it is expected that these strategies can produce a suitable amount of diversity when it is needed.
To measure the diversity of the entire population, we employ the dimension-wise diversity metric, which is described by the following equations [46]:

Div_j = (1/SP) ∑_{i=1}^{SP} |median_j − x_{i,j}| (10)

Div = (1/D) ∑_{j=1}^{D} Div_j (11)

In Equation (10), median_j is the median of the jth element in the entire population, x_{i,j} is the jth component of the ith agent, and SP is the total number of agents in the population. The calculated value Div_j represents the diversity of the jth dimension. Equation (11), in which D is the number of dimensions, is used to calculate the population diversity in the tth iteration.
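The dimension-wise diversity metric described above can be computed directly from the population matrix. The following is a small self-contained sketch in Java (the implementation language used here); the class and method names are illustrative, not taken from the paper's code.

```java
import java.util.Arrays;

// Dimension-wise population diversity: for each dimension, average the
// absolute deviation of every agent from that dimension's median, then
// average those per-dimension values over all dimensions.
public class DiversityMetric {
    /** pop is an SP-by-D matrix: one row per agent, one column per variable. */
    public static double diversity(double[][] pop) {
        int sp = pop.length;       // number of agents (SP)
        int d  = pop[0].length;    // number of dimensions (D)
        double div = 0.0;
        for (int j = 0; j < d; j++) {
            // Collect and sort the jth column to find its median.
            double[] col = new double[sp];
            for (int i = 0; i < sp; i++) col[i] = pop[i][j];
            Arrays.sort(col);
            double median = (sp % 2 == 1) ? col[sp / 2]
                          : 0.5 * (col[sp / 2 - 1] + col[sp / 2]);
            // Mean absolute deviation from the median (Div_j).
            double divJ = 0.0;
            for (int i = 0; i < sp; i++) divJ += Math.abs(median - pop[i][j]);
            div += divJ / sp;
        }
        return div / d;            // average over dimensions (Div)
    }
}
```

Evaluating this quantity once per iteration produces the diversity curves of the kind shown in Figure 2; it is zero only when all agents coincide.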
The diversity behavior of the ABC, SCA, and HSCA for six design problems (speed reducer, tension/compression spring, pressure vessel, welded beam, Belleville spring, and car side impact) is presented in Figure 2. From Figure 2 it can be seen that the diversity of the HSCA is high in the first part of the search process and then gradually decreases for all selected problems. High diversity in the initial iterations indicates that the HSCA may be capable of searching for a global optimum solution with greater accuracy. Further, low diversity in the final iterations may signify that promising areas have been found and that fine adjustment of the reached results is performed. The population diversity of the ABC and SCA is also higher in the initial generations, but it decreases very slowly or remains at a similar level in later iterations for most of the chosen problems. Since encouraging high diversity in every stage of the search process might be inefficient, this diversity behavior may point to slow convergence and a lack of precision in the final solution.
The fact that the HSCA outperformed the ABC and SCA in tackling the design problems, together with the presented diversity behavior of these algorithms, indicates that the HSCA can maintain appropriate diversity throughout the search. Hence, it can be concluded that the use of three mutation operators and Deb's rules in the HSCA enables a preferable balance between exploration and exploitation.

Conclusions
This paper proposes a hybridization of the sine cosine algorithm, called HSCA, for solving structural and mechanical engineering design optimization problems. The HSCA is a collaborative hybrid algorithm with a sequential structure. In the proposed method, the modified SCA and the ABC with two different search strategies execute alternately until the convergence criterion is met. The modified SCA search strategy is developed to reach a suitable balance between local exploitation and global exploration abilities. In the proposed equation, global exploration is guided by choosing the base vector randomly from the population and adding another difference vector to the base vector. Also, local exploitation is enhanced by using the current best solution in both difference vectors of this search operator. Further, the modified SCA is combined with the multi-strategy ABC algorithm to better maintain diversity in the population. In the employed ABC variant, two ABC search equations with good global exploration abilities are engaged in creating candidate solutions. The modified SCA and ABC employ a constraint-handling technique based on Deb's rules to steer the search toward feasible areas.
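The sequential, collaborative structure described above can be summarized as two phases taking turns on a shared population. The sketch below shows only this alternation in Java; the phase bodies, interface, and names are placeholders of our own, and the paper's actual SCA and ABC update equations are deliberately not reproduced.

```java
import java.util.Random;

// High-level sketch of an alternating hybrid: a modified-SCA phase and a
// multi-strategy ABC phase improve the SAME population in turn. The Phase
// interface and all names here are illustrative assumptions, not the
// paper's implementation.
public class HscaLoopSketch {
    /** One improvement pass over the shared population. */
    interface Phase { void improve(double[][] population, Random rng); }

    public static void run(double[][] population, Phase modifiedSca,
                           Phase multiStrategyAbc, int maxIterations) {
        Random rng = new Random(42);
        for (int t = 0; t < maxIterations; t++) {
            if (t % 2 == 0) {
                modifiedSca.improve(population, rng);      // SCA's turn
            } else {
                multiStrategyAbc.improve(population, rng); // ABC's turn
            }
        }
    }
}
```

Because both phases mutate one population, improvements found by either operator are immediately available to the other, which is what makes the cooperation between the two algorithms possible.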
The performance of the developed hybrid optimizer was explored on fifteen complex engineering problems. The experimental results of the direct comparison of the HSCA with the standard ABC and SCA revealed that the HSCA performs significantly better than these two metaheuristics. The proposed approach has also demonstrated highly competitive performance compared with the other optimizers studied by preceding researchers in terms of the quality and stability of the results. Additionally, the HSCA showed an effective convergence speed in reaching these results. Therefore, the developed HSCA has a great capacity to deal with mixed-type design variables concurrently while satisfying various complicated design constraints.
The proposed hybrid method is easy to implement. The HSCA has a simple architecture that does not use extra components and is computationally inexpensive. This algorithm uses three search operators with distinct benefits that coexist during the search. The analyses indicated that the combination of the modified SCA mutation operator and the two ABC search equations, together with the use of three feasibility rules, properly sustains the balance between exploration and exploitation for these design problems. The HSCA does not increase the number of control parameters of the SCA and ABC optimizers. However, the common difficulty of finding proper settings for algorithm-dependent parameters is present in the HSCA, as in many other metaheuristic optimizers. Additionally, the issue of more sophisticated control of the balance between exploitation and exploration during different stages of the search process remains open in the proposed approach.
The considered fifteen design problems have different numbers of inequality constraints. To solve problems with equality constraints, an appropriate equality constraint handling technique needs to be employed. It is also important to note that the performance of the HSCA may differ depending on the constraint-handling technique employed. The efficiency of other constraint-handling techniques for complex constraints will be investigated in future work. Extending the HSCA to multi-objective problems will also be studied.

Figure 1 .
Figure 1. Convergence curves of the average solutions achieved by the ABC, SCA, and HSCA over 50 runs for some design problems: (a) Speed reducer, (b) Tension/compression spring, (c) Pressure vessel, (d) Welded beam, (e) Belleville spring, (f) Car side impact.

Table 1 .
Performance of HSCA and other optimizers for the speed reducer design problem.

Table 2 .
Performance of HSCA and other optimizers for the tension/compression spring design problem.

Table 3 .
Performance of HSCA and other optimizers for the pressure vessel design problem.

Table 4 .
Performance of HSCA and other optimizers for the welded beam design problem.

Table 5 .
Performance of HSCA and other optimizers for the gear train design problem.

Table 6 .
Performance of HSCA and other optimizers for the stepped cantilever beam design problem.

Table 7 .
Performance of HSCA and other optimizers for the multiple disc clutch brake design problem.

Table 8 .
Performance of HSCA and other optimizers for the cantilever beam design problem.

Table 9 .
Performance of HSCA and other optimizers for the helical compression spring design problem.

Table 10 .
Performance of HSCA and other optimizers for the hollow shaft design problem.

Table 11 .
Performance of HSCA and other optimizers for the hydrostatic thrust bearing design problem.

Table 12 .
Performance of HSCA and other optimizers for the Belleville spring design problem.

Table 13 .
Performance of HSCA and other optimizers for the car side impact design problem.

Table 14 .
Performance of HSCA and other optimizers for the gas transmission compressor design problem.

Table 15 .
Performance of HSCA and other optimizers for the reinforced concrete beam design problem.

Table 16 .
Results based on the performance of metaheuristic optimizers for fifteen design problems.