A Spring Search Algorithm Applied to Engineering Optimization Problems

At present, optimization algorithms are used extensively. One particular type comprises random-based heuristic population optimization algorithms, which may be created by modeling scientific phenomena such as physical processes. The present article proposes a novel optimization algorithm based on Hooke's law, called the spring search algorithm (SSA), which aims to solve single-objective constrained optimization problems. In the SSA, search agents are weights joined through springs which, per Hooke's law, exert a force proportional to their displacement. The mathematics behind the algorithm are presented in the text. In order to test its functionality, it is executed on 38 established benchmark test functions and weighed against eight other optimization algorithms: a genetic algorithm (GA), a gravitational search algorithm (GSA), a grasshopper optimization algorithm (GOA), particle swarm optimization (PSO), teaching–learning-based optimization (TLBO), a grey wolf optimizer (GWO), a spotted hyena optimizer (SHO), as well as an emperor penguin optimizer (EPO). To test the SSA's usability, it is employed on five engineering optimization problems. The SSA delivered better results than the other algorithms on the unimodal and multimodal objective functions, the CEC 2015 functions, and the engineering optimization problems.


Introduction
As the demand for quick and accurate solutions to increasingly complex problems grows, classical methods are being replaced by more robust approaches. One proposal is the use of heuristic random-based algorithms in place of searching the defined problem space exhaustively [1][2][3][4][5]. Heuristic algorithms are applicable to a variety of scientific fields, such as: logistics [6], bioinformatics [7], data mining [8], chemical physics [9], energy [10], security [11], electrical engineering [12][13][14][15][16], energy carriers [17,18], as well as other fields that aim to discover the optimal solution.
Each population-based algorithm represents the exchange of information and the interaction between its elements in a different way. For example, genetic algorithms simulate evolution [19] while annealing algorithms simulate thermodynamics [20]; immunity algorithms, the human immune system [21]; colony optimization strategies, ants' search for food [22]; and particle swarm optimization approaches, the behavior of birds while searching for food [23].
There are many laws of nature that may serve as inspiration, such as Newton's universal law of gravitation, Hooke's spring law, the laws of motion, the laws of energy and mass conservation, as well as the laws that dictate the electromagnetic force. A novel optimization algorithm was proposed based on Hooke's spring law, with preliminary results detailed in [24]. This algorithm is detailed and analyzed in the current paper with its improved equations. The SSA's capabilities were evaluated through 23 benchmark test functions, as well as a group of problems from the CEC 2015 technical report on constrained single-objective real-parameter optimization. The SSA was further corroborated by being weighed against eight established algorithms found in the literature, as well as being used to solve a selection of engineering problems.
Section 2 provides a greater insight into other established optimization approaches. Section 3 elucidates Hooke's law while Section 4 outlines the SSA. Section 5 assesses the algorithm's search ability while Section 6, its proficiency. Section 7 includes the outcome of the evaluation through the standard benchmark test functions. Section 8 includes the implementation of the algorithm on select engineering design problems. Finally, Section 9 encompasses conclusions.

A Brief History of Intelligent Algorithms
An algorithm is considered intelligent when it finds a suitable solution to an optimization problem in the shortest possible time while using the least amount of available information [25]. Using a more complete definition, the heuristic method is a strategy that sacrifices part of the information to reach a solution in the shortest possible time and with good precision [26]. Heuristic algorithms are typically based on natural processes, that is, biological processes or laws that explain physical phenomena. This approach has been widely considered in the last ten years, and numerous algorithms have been suggested. These algorithms have been classified into different categories, such as swarm-based algorithms, evolution-based algorithms, and physics-based algorithms.

Swarm-Based Algorithms
These techniques were developed from the analysis of several naturally occurring processes, such as the growth or symbiosis of plants, the feeding behavior of insects, and the behavior and social organization of animals [27]. The particle swarm optimization (PSO) algorithm is a stochastic (random) search method that was developed around 1995 for function optimization [28]. This algorithm was developed by analyzing the movement of birds flying as a group while looking for food. The algorithm is based on the premise that a flock of birds searches for food at random and that there is only one portion of food in the area (the search space) in question, but none of the birds know where the food is. One of the most successful strategies is for the birds to follow the bird that is closest to the food, this being the bird most likely to find it. This strategy is, in fact, the source of the algorithm. In the algorithm, each solution, called a particle, is equivalent to a bird. Each particle (bird) has a fitness value calculated by a success function, as well as a velocity that controls its movement. By following the current optimal particles, each agent keeps moving through the solution space. The firefly algorithm (FA) is another algorithm inspired by a natural system, in this case based on swarms, for problems where bounded optimal solutions are sought [29]. The algorithm is inspired by the analysis of the radiation behavior of these insects. Fireflies live in groups and emit rhythmic flashes that vary from low to higher light intensity, with distinct light patterns among individuals. Fireflies use these lights for two main purposes: finding mates and looking for food.
These lights can also serve as a protection mechanism or strategy. The whale optimization algorithm (WOA) [30] is another nature-inspired optimization algorithm; as the name implies, it mimics the social behavior of humpback whales. The most striking aspect of humpback whales is their hunting strategy, called the bubble-net feeding method. Humpback whales like to hunt schools of krill or small fish near the surface of the ocean, and this foraging has been observed to be done by creating distinctive bubbles along a circular or "9"-shaped path. Some of the other swarm-based algorithms are: artificial bee colony (ABC) [31], bat-inspired algorithm (BA) [32], spotted hyena optimizer (SHO) [33], cuckoo search (CS) [34], monkey search (MS) [35], group optimization (GO) [36], artificial fish-swarm algorithm (AFSA) [37], hunting search (HS) [38], moth-flame optimization algorithm (MFO) [39], dolphin partner optimization (DPO) [40], orientation search algorithm (OSA) [41], binary orientation search algorithm (BOSA) [42], dice game optimizer (DGO) [43], shell game optimization (SGO) [44], hide objects game optimization (HOGO) [45], donkey theorem optimization (DTO) [46], following optimization algorithm (FOA) [47], rat swarm optimizer (RSO) [48], darts game optimizer (DGO) [49], football game based optimization (FGBO) [50], grey wolf optimizer (GWO) [51], grasshopper optimization algorithm (GOA) [52], coupled spring forced bat algorithm (SFBA) [53], adaptive granularity learning distributed particle swarm optimization (AGLDPSO) [54], multi leader optimizer (MLO) [55], doctor and patient optimization (DPO) [56], and emperor penguin optimizer (EPO) [57].
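The PSO mechanics outlined above — particles pulled toward their own best position and the swarm's best position — can be sketched in a few lines of Python. This is a generic textbook-style illustration, not taken from any of the cited implementations; the inertia weight `w` and acceleration coefficients `c1`, `c2` are typical default values:

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization: each particle is pulled
    toward its personal best and the swarm's global best position."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the 5-dimensional sphere function as a smoke test.
best, val = pso(lambda x: sum(xi * xi for xi in x), dim=5, bounds=(-10, 10))
```

On a simple unimodal function like the sphere, this sketch converges quickly toward the origin, which is the behavior the bird-flocking metaphor describes.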

Evolution-Based Algorithms
An algorithm is considered evolutionary when it combines aspects of natural selection with hereditary continuity. These algorithms are based on structures that simulate the rules of selection, recombination, mutation, and survival found in genetics, hence the name. In this method, the environment determines each individual's fitness within a population, and the most fit individuals are used to reproduce [58]. Evolutionary algorithms are random search procedures that use genetic mechanisms and natural selection [59]. Genetic algorithms (GA) were developed as an optimization method built on fundamental operations in genetic biology [60]. The first record of using these concepts to create an optimization method dates from 1967 [61]. The GA is a particular type of evolutionary algorithm that exploits basic biological concepts such as inheritance and mutation [62] and has produced satisfactory results in different scientific domains.
On the other hand, differential evolution (DE) [63] is another population-based intelligent optimization algorithm, introduced in 1995 [64]. Its initial version was used to solve problems with continuous variables, and variants of this algorithm have since been implemented to solve optimization problems with discrete variables [65]. Other algorithms based on the theory of evolution include: evolutionary programming (EP) [66], biogeography-based optimizer (BBO) [67], enhanced quantum-inspired differential evolution algorithm (IQDE) [68], genetic programming (GP) [69] and evolution strategy (ES) [70].

Physics-Based Algorithms
Physics-based algorithms, as the name implies, are inspired by the laws of physics. Simulated annealing (SA) is one of the best-known and most popular optimization algorithms. SA was developed in 1983 [71], inspired by the annealing of metals, for example in the artisanal process once used to forge swords and knives. The process consists of first heating the metal to a very high temperature and then cooling it by gradually reducing the temperature so that the metal crystallizes and hardens. In this process, when the temperature of the metal increases, the speed of atomic movement increases dramatically; the subsequent gradual reduction in temperature then causes specific patterns to form, based on the locations of the atoms [20]. The cooling schedule is one of the tuning parameters of this algorithm. The gravitational search algorithm (GSA) [72] is inspired by the universal law of gravitation developed by Isaac Newton. In this algorithm, objects such as planets in a galaxy are defined as search agents, and the optimal region, similar to a black hole, absorbs the planets. Information about the fitness of any object is stored as its gravitational and inertial mass, and the exchange of information and the effects of objects on each other are governed by the attractive force of gravitation [73]. Several algorithms have been developed based on laws and/or theories of physics, such as: charged system search (CSS) [74], galaxy-based search algorithm (GBSA) [75], curved space optimization (CSO) [76], ray optimization (RO) algorithm [77], artificial chemical reaction optimization algorithm (ACROA) [78], small world optimization algorithm (SWOA) [79], central force optimization (CFO) [80], black hole (BH) [81] and big-bang big-crunch (BBBC) [82].
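The annealing analogy described above reduces to a simple acceptance rule: a worse candidate is accepted with a probability that shrinks as the temperature cools. A minimal sketch of a generic Metropolis-style SA follows (an illustration of the idea, not the exact variant of [71]; step size, initial temperature and cooling rate are arbitrary choices):

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000):
    """Accept a worse neighbor with probability exp(-delta/T); the
    temperature T decays geometrically each step ("cooling")."""
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)  # random neighbor
        fc = objective(candidate)
        delta = fc - fx
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, fx = candidate, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # gradual reduction in temperature
    return best, fbest

# Smoke test on a 1-D quadratic with minimum at x = 3.
best, fbest = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

Early on, the high temperature lets the walk escape poor regions; as the temperature drops, the acceptance rule becomes effectively greedy, mirroring the hardening of the cooled metal.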

Spring Force Law
If the work done by a force that moves an object along a closed path (forward and backward) does not depend on the object's trajectory, that force is conservative. Equivalently, a force is conservative if the work it does along any path depends only on the initial and final points. The spring force is a conservative force [83].
Consider a spring that imposes a force on a particle of mass m. The particle moves horizontally in the x direction, and when the particle is at the origin (x = 0), the spring is at equilibrium. An external force (F_ext) acts on the object in the direction opposite to the spring force. The external force is always equal in magnitude to the spring force; thus, the particle is always in equilibrium.
Now consider that the particle is moved a distance x from its initial location x = 0. When the external factor imposes the force F_ext on the particle, the spring exerts a resistance force F_s on the particle. This force is described by the spring force law, or Hooke's law:

F_s = −kx, (1)

where k is the spring force constant and x denotes the spring displacement (stretch or compression) from the balance point. Most real springs follow Hooke's law closely up to a limit [84].
The system behavior is investigated in an isolated system as shown in Figure 1. It is assumed that only the spring force is imposed on the object.

In Figure 1, the forces imposed on object j can be grouped into two variables: F_total^R, described in Equation (2), is the sum of the forces imposed from the right, and F_total^L, shown in Equation (3), is the sum of the forces imposed from the left. Note that the springs attached to the object, from either the right or the left, are fixed to rigid points at their other ends:

F_total^R = Σ_{i=1}^{n_R} K_{i,j} · x_{i,j}, (2)

F_total^L = Σ_{l=1}^{n_L} K_{l,j} · x_{l,j}, (3)
where n_R and n_L are the numbers of right and left spring forces, x_{i,j} and x_{l,j} denote the distances between object j and the fixed right and left points, and K_{i,j} and K_{l,j} are the spring stiffness coefficients between object j and those fixed points. The object is initially balanced, with no net force exerted on it. Then, as the spring forces are applied, the object is pulled from the right and the left. Depending on the magnitudes of these forces, the object shifts to the left or right until the system reaches a new equilibrium position. If the right and left forces are equal, the object remains at its original position.
Considering the stiffness coefficients of the springs connected to the object, two new parameters may be defined as below:

K_j^R = Σ_{i=1}^{n_R} K_{i,j}, (4)

K_j^L = Σ_{l=1}^{n_L} K_{l,j}, (5)

where K_j^R and K_j^L are the right and the left aggregate spring constants, respectively. Considering Equation (1), the displacement values at each side may be defined as follows:

dX_j^R = F_total^R / K_j^R, (6)

dX_j^L = F_total^L / K_j^L, (7)

where dX_j^R and dX_j^L are the displacement values of object j to the right and left, respectively. Therefore, the total displacement may be defined as follows:

dX_j = dX_j^R − dX_j^L, (8)

where dX_j is the final displacement value of object j, which may be positive or negative.
In Equation (9), X_j = X_j^0 + dX_j, X_j is the location of the new balance point of the system for object j, and X_j^0 is the initial balance position of object j.
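The balance-point computation in Equations (2)–(9) can be sketched directly. The code below is our illustrative reading of those equations (variable names are ours): each spring is a (stiffness, distance) pair, the per-side displacement is total force over total stiffness per Hooke's law, and the net displacement shifts the balance point.

```python
def net_displacement(right_springs, left_springs):
    """Each spring is a (stiffness, distance) pair between the object and a
    fixed point on that side. Returns (dX_R, dX_L, dX): per Hooke's law the
    displacement toward each side is the total force divided by the total
    stiffness, and the net displacement is their difference."""
    f_r = sum(k * x for k, x in right_springs)   # Eq. (2): total right force
    f_l = sum(k * x for k, x in left_springs)    # Eq. (3): total left force
    k_r = sum(k for k, _ in right_springs)       # Eq. (4): aggregate right stiffness
    k_l = sum(k for k, _ in left_springs)        # Eq. (5): aggregate left stiffness
    dx_r = f_r / k_r if k_r else 0.0             # Eq. (6)
    dx_l = f_l / k_l if k_l else 0.0             # Eq. (7)
    return dx_r, dx_l, dx_r - dx_l               # Eq. (8): net displacement

# One spring on each side; the right spring is stiffer and more stretched.
dx_r, dx_l, dx = net_displacement([(2.0, 1.5)], [(1.0, 1.0)])
# New balance point, Eq. (9): x_new = x_0 + dx
x_new = 0.0 + dx
```

Here the right side wins, so the object settles to the right of its original position; with symmetric forces the net displacement would be zero, as the text notes.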
By simulating Hooke's law in the discrete time domain, a new optimization algorithm, the spring search algorithm, was designed; it is explained further in the following section.

Spring Search Algorithm (SSA)
In this article, the spring search algorithm is run in an artificial system with discrete time. The system space is the defined domain of the problem, and the spring force law serves as a tool to convey information. The designed optimization algorithm may be applied to solve any optimization problem, as long as a candidate answer can be defined as a position within the space and its similarity to other candidate answers can be expressed through comparisons of spring stiffness. The stiffness of each spring is established relative to the objective function.
The SSA consists of two general steps: constructing a discrete-time artificial system within the problem space by defining the initial positions of the objects, determining the governing laws and setting the parameters; and then letting the algorithm run until it reaches a stop condition.

Setting the System, Determining Laws and Arranging Parameters
First, the system space is determined, with the multi-dimensional coordinate system in which the problem is defined. Any point in this space may be the solution of the optimization problem. The search agents are sets of objects attached to each other by springs. Each object has its own position as well as stiffness coefficients for the springs attached to it. An object's position is a point in space where the solution of the problem may be found, and the stiffness value of each spring is computed from the fitness of the two objects it connects. After setting up the system, its governing laws are determined: only the spring force law and the laws of motion are observed in the system. The general patterns of these rules are similar to natural laws and are defined below.
In physics and the science of mechanics and elasticity, Hooke's law is an approximation stating that the change in an object's length is directly proportional to its load. Most materials follow this law with reasonable accuracy as long as the deformation remains within the elastic limit. Deviations from Hooke's law grow with the amount of deformation, so that for large deformations, once the material leaves its linear elastic domain, Hooke's law no longer applies. The present article assumes that Hooke's law is valid at all times.
According to the laws of motion, the present location of any object equals the sum of its previous location and its displacement, and any object's displacement may be determined with the aid of the spring force law.
Consider a system as a set of m objects where the position of each object is a point in the search space and a solution to the problem.
The position of object i in dimension d is designated x_i^d in Equation (10):

X_i = (x_i^1, ..., x_i^d, ..., x_i^m), (10)

The initial positions of the objects are defined randomly within the search space. These objects tend to return to an equilibrium position by means of the forces exerted by the springs.
Equation (11) is employed in order to compute the spring stiffness coefficient.
In Equation (11), K_{i,j} is the spring stiffness coefficient between objects i and j; K_max is the maximum value of the spring stiffness coefficient, determined according to the type of problem in question; and F_i^n and F_j^n are the normalized objective function values of objects i and j, respectively. Equations (12) and (13) are used to normalize the objective function.
In the above equations, F_obj is the objective function and f_i^obj is the objective function value for object i.
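The explicit forms of Equations (12) and (13) are not reproduced in the text; a common min–max normalization consistent with the description (better objects receive larger normalized fitness, which then feeds the stiffness in Equation (11)) would look like the following hypothetical sketch:

```python
def normalize_fitness(obj_values):
    """Hypothetical min-max normalization standing in for Eqs. (12)-(13),
    whose exact form is not reproduced in the text. For a minimization
    problem it maps each objective value into [0, 1] so that the best
    (smallest) value normalizes to 1 and the worst to 0."""
    worst, best = max(obj_values), min(obj_values)
    if worst == best:
        return [1.0] * len(obj_values)  # degenerate case: all objects equal
    return [(worst - f) / (worst - best) for f in obj_values]

norm = normalize_fitness([4.0, 2.0, 3.0])  # best value 2.0 -> 1.0, worst 4.0 -> 0.0
```

With such a normalization, multiplying by K_max bounds every stiffness coefficient by the problem-dependent maximum, as the description of Equation (11) requires.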
In an m-variable problem, it is possible to assume that the problem has m dimensions, with one coordinate axis per dimension. Therefore, the system can be drawn on the corresponding axis for each variable. On each axis, the strong points to the left and right of an object are determined by comparing objective function values: stronger points relative to a given object are those positioned at more optimal locations. Therefore, along each axis two general summative forces act on object j: the sum of the right forces, Equation (14), and the sum of the left forces, Equation (15).
In the above equations, F_total^{j,d,R} stands for the sum of the right forces and F_total^{j,d,L} for the sum of the left forces imposed on object j in dimension d; n_R^d and n_L^d are the numbers of right and left strong points in dimension d; x_{i,j}^d and x_{l,j}^d represent the distances of object j from the right and left strong points; and K_{i,j} and K_{l,j} are the stiffness coefficients of the springs connecting object j to the strong points. Now, considering Hooke's law in dimension d, dX_{j,d}^R and dX_{j,d}^L are the displacements of object j to the right and to the left in dimension d, respectively. The total displacement may then be calculated as in Equation (18), where dX_{j,d} is the final displacement of object j in dimension d; its direction may be positive or negative. Equation (19) relates X_{j,d}^new, the new position and balance point of object j in dimension d, to X_{j,d}^0, the initial balance point of object j in that dimension. Here, r_1 is a random number drawn uniformly from [0, 1], which is used to preserve the stochastic character of the search.
In the last step, after reaching balance the objects undergo a small additional displacement due to slipping, which is simulated in Equation (20). Here, X_{j,d} is the updated position of object j in dimension d, T is the maximum number of iterations, t is the iteration counter, and r_2 is a random number drawn uniformly from [0, 1], which is used to preserve the stochastic character of the search.

Time Passing and Parameters Updating
When the system is initially formed, each object is placed at a random point in the space, any of which may be the problem's solution. At each time step, the objects are assessed and their displacements are calculated using Equations (11) through (18); each object is then moved to its newly computed position. The parameter of interest is the spring stiffness coefficient, which is updated at each stage based on Equation (11). The stop condition is met after the algorithm has run for a given number of iterations. The steps of the SSA are as follows, and a flowchart encompassing them is shown in Figure 2:
1- Determine the system space
2- Initialize the objects at random positions
3- Set the algorithm parameters
4- Evaluate the objective function for all objects
5- Compute the spring stiffness coefficients (Equation (11))
6- Compute the forces imposed on each object (Equations (14) and (15))
7- Compute object displacements (Equations (16) through (18))
8- Update object locations
9- Repeat steps 4 through 8 until the stop condition is satisfied
10- Print best solution
11- End
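The loop described above can be sketched in Python. This is a schematic reading of the algorithm, not the authors' reference implementation: the stiffness formula and the slip term are simplified stand-ins, since the exact Equations (11) and (20) are not reproduced in the text, and only better-placed objects are treated as "strong points".

```python
import random

def ssa(objective, dim, bounds, n_objects=30, iters=1000, k_max=1.0):
    """Schematic spring search algorithm: in each dimension an object is
    pulled toward better-placed objects through springs whose stiffness
    grows with the neighbors' normalized fitness."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_objects)]
    best, best_val = None, float("inf")
    for t in range(1, iters + 1):
        vals = [objective(p) for p in pos]
        worst, bst = max(vals), min(vals)
        norm = [1.0 if worst == bst else (worst - v) / (worst - bst) for v in vals]
        i_best = min(range(n_objects), key=lambda i: vals[i])
        if vals[i_best] < best_val:
            best, best_val = pos[i_best][:], vals[i_best]
        for j in range(n_objects):
            for d in range(dim):
                f_r = f_l = k_r = k_l = 0.0
                for i in range(n_objects):
                    if i == j or vals[i] >= vals[j]:
                        continue  # only better-placed objects act as strong points
                    k = k_max * norm[i]  # simplified stand-in for Eq. (11)
                    if pos[i][d] >= pos[j][d]:
                        f_r += k * (pos[i][d] - pos[j][d]); k_r += k
                    else:
                        f_l += k * (pos[j][d] - pos[i][d]); k_l += k
                dx = (f_r / k_r if k_r else 0.0) - (f_l / k_l if k_l else 0.0)
                pos[j][d] += random.random() * dx                 # cf. Eq. (19)
                pos[j][d] += (1 - t / iters) * random.uniform(-0.05, 0.05)  # slip, cf. Eq. (20)
                pos[j][d] = min(hi, max(lo, pos[j][d]))
    return best, best_val

# Smoke test on the 2-D sphere function.
best, val = ssa(lambda x: sum(xi * xi for xi in x), dim=2, bounds=(-5, 5),
                n_objects=15, iters=200)
```

Note how the per-dimension displacement reuses the force-over-stiffness balance of Equations (2)–(8), while the decaying slip term injects shrinking random exploration as iterations pass.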

Properties of the Proposed SSA
The above algorithm is a proposed optimization method that applies the spring force law. Its strategy is to tie the spring stiffness coefficient to fitness. In this algorithm, a set of objects searches the space randomly, using the spring force as a tool to transfer information between objects; under the influence of the other objects, each one may arrive at a rough understanding of its surrounding space. The algorithm must be steered such that the objects' locations improve as time passes.
Thus, springs with larger stiffness coefficients are attached to objects with better fitness values. These springs attract other objects, allowing a suitable force to be exerted on each object, and objects tend to move toward better conditions as time goes by. Accordingly, objects placed in better locations must take slower, shorter steps: as an object arrives at a better condition, its stiffness coefficient increases, and stiffer objects search their surrounding environment with more precision. This behavior, referred to here as adjustment, is similar to tuning the learning rate in a neural network. The spring stiffness coefficient also becomes smaller with the passing of time, because the objects tend to concentrate around better locations and need to search the space with smaller, more precise steps.
Each object influences a neighborhood whose radius depends on its fitness value. As indicated in Figure 3, each object moves under the influence of the spring forces imposed on it.


Exploration and Exploitation in SSA
The optimization method must address two issues: exploration and exploitation. For exploration, the algorithm must have enough power to search the problem space broadly rather than being confined to a few specific regions. Exploitation means focusing the search on the most promising regions found. A population algorithm must first search the designated space comprehensively; hence, during the initial iterations it should focus on the general area of the solution, and as time passes it should position itself more precisely with the aid of the population's findings [85].

The SSA searches the space with a suitable number of objects. The algorithm improves its search power through the spring force effect, by varying the spring stiffness coefficient and consequently controlling the spring forces among the objects. During the initial iterations, the problem requires thorough exploration; as time passes, the population arrives at better results and the spring stiffness coefficient value is tightened accordingly. A proper value is chosen at the start, and as time passes that value decreases through the spring stiffness coefficient (Equation (11)) until it reaches its minimum.

Experimental Results and Discussion
This section presents the results from the evaluation of the SSA's performance on twenty-three standard benchmark test functions. A detailed description of these benchmark functions is presented below. Furthermore, the results are compared to eight existing optimization algorithms.

Benchmark Test Functions
The performance of the SSA was assessed using 23 benchmark test functions [86]. The experiments were run on MATLAB R2014a (8.3.0.532) under Microsoft Windows 7, on a 64-bit Core i7 processor at 2.40 GHz with 16 GB of main memory. The averages and standard deviations of the best solutions found are displayed in Tables 1-3. For each benchmark test function, the SSA was run 20 independent times, with each run employing 1000 iterations.
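The evaluation protocol — independent runs on each benchmark, reporting the mean and standard deviation of the best value found — can be reproduced with a small harness. A sketch follows; `random_search` is only a placeholder standing in for any of the compared optimizers:

```python
import random
import statistics

def benchmark(optimizer, objective, runs=20):
    """Run the optimizer independently `runs` times on one benchmark
    function and report the mean and standard deviation of the best
    objective values found, as in Tables 1-3."""
    results = [optimizer(objective) for _ in range(runs)]
    return statistics.mean(results), statistics.stdev(results)

def random_search(objective, iters=1000, dim=2, lo=-5.0, hi=5.0):
    """Placeholder optimizer: best of `iters` uniform random samples."""
    return min(objective([random.uniform(lo, hi) for _ in range(dim)])
               for _ in range(iters))

# Example: 20 runs of the placeholder optimizer on the 2-D sphere function.
mean, std = benchmark(random_search, lambda x: sum(xi * xi for xi in x))
```

Reporting both statistics is what allows the tables to distinguish algorithms that are accurate on average from those that are merely lucky on occasional runs.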

Algorithms Used for Comparison
In order to prove the potency of the SSA, it was also compared with eight optimization algorithms on unimodal, multimodal, fixed-dimension multimodal and composite optimization problems, assessed by solving a set of minimization problems introduced in the CEC 2015 technical report on constrained single-objective real-parameter optimization [86]. The eight optimization algorithms were: GA [87], PSO [29], GSA [73], TLBO [88], GOA [53], GWO [52], SHO [34], and EPO [58].
The parameter values of optimization algorithms are given in the next table.

Evaluation of Unimodal Test Function with High Dimensions
Functions F1 to F7 are unimodal. The mean results of 20 independent runs of each algorithm are displayed in Table 1. These results show that the SSA performs better than the other algorithms on all of the functions F1 to F7.

Evaluation of Multimodal Test Functions with High Dimensions
In multimodal functions F8 to F13, the number of local optima increases exponentially with the number of dimensions. Therefore, reaching the global minimum of these functions is difficult, and arriving at a solution close to the ideal one denotes a high capability to avoid poor local optima. The results obtained for F8 to F13 after 20 runs of the SSA and the compared algorithms are presented in Table 2. The SSA demonstrated the best performance on all of the functions analyzed.

Evaluation of Multimodal Test Functions with Low Dimensions
Functions F14 to F23 have low dimensions and few local optima. The results obtained from 20 runs of each algorithm are shown in Table 3. They indicate that the SSA performs well on all of functions F14 to F23.

Evaluation of CEC 2015 Test Functions
This section evaluates the algorithms on the CEC 2015 test suite for single-objective optimization; all of its test functions are minimization problems. Table 4 shows the performance of the SSA and the other algorithms on the CEC 2015 tests, and indicates that the SSA is the most efficient optimizer on all of the benchmark test functions studied.

SSA for Engineering Design Problems
In this section, the SSA is applied to five constrained engineering design problems.

Pressure Vessel Design
The mathematical model of this problem was adapted from Kannan and Kramer [89]. Tables 5 and 6 compare the performance of the SSA with the other algorithms. The SSA found an optimal solution at (0.778099, 0.383241, 40.315121, 200.00000), with a corresponding fitness value of 5880.0700.
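The cost function of the pressure vessel problem is standard in the literature, with four design variables: shell thickness Ts, head thickness Th, inner radius R, and shell length L. As a sanity check, evaluating the commonly used formulation at the solution reported above reproduces the reported fitness up to the rounding of the printed decision variables:

```python
def pressure_vessel_cost(x):
    """Commonly used pressure vessel cost function (material, forming, welding)."""
    ts, th, r, l = x  # shell thickness, head thickness, inner radius, length
    return (0.6224 * ts * r * l
            + 1.7781 * th * r * r
            + 3.1661 * ts * ts * l
            + 19.84 * ts * ts * r)

x_ssa = (0.778099, 0.383241, 40.315121, 200.0)  # solution reported in the text
print(pressure_vessel_cost(x_ssa))  # close to the reported 5880.0700
```

The small residual discrepancy comes from the limited number of decimal places in the printed solution, not from the formulation.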

Speed Reducer Design Problem
This problem is modeled mathematically in [90,91]. The results of the optimization problem are presented in Tables 7 and 8. The optimal solution was provided by the SSA at (3.50123, 0.7, 17, 7.3, 7.8, 3.33421, 5.26536) with a corresponding fitness value equal to 2994.2472.

Welded Beam Design
The mathematical model of a welded beam design was adapted from [31]. The results of this optimization problem are presented in Tables 9 and 10. The SSA provides an optimal solution at (0.205411, 3.472341, 9.035215, 0.201153) with a corresponding fitness value of 1.723589.

Tension/Compression Spring Design Problem
The mathematical model of this problem was adapted from [31]. The results of this optimization problem are displayed in Tables 11 and 12. The SSA provides the optimal solution at (0.051087, 0.342908, 12.0898), with a corresponding fitness value of 0.012656987.
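The objective of this problem is the spring's weight, commonly written as (N + 2) D d² with wire diameter d, mean coil diameter D, and number of active coils N. Evaluating this standard formulation at the reported solution agrees with the reported fitness up to the rounding of the printed decision variables:

```python
def spring_weight(x):
    """Standard tension/compression spring weight: (N + 2) * D * d**2."""
    d, D, N = x  # wire diameter, mean coil diameter, number of active coils
    return (N + 2.0) * D * d * d

x_ssa = (0.051087, 0.342908, 12.0898)  # solution reported in the text
print(spring_weight(x_ssa))  # close to the reported 0.012656987
```

As with the pressure vessel problem, the residual discrepancy is attributable to the rounding of the printed variables.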

Rolling Element Bearing Design Problem
The mathematical model of this problem is adapted from [92]. The results of this optimization problem are presented in Tables 13 and 14; the SSA provides an optimal solution at (125, 21.41890, 10.94113, 0.515, 0.515, 0.4, 0.7, 0.3, 0.02, 0.6) with a corresponding fitness value of 85,067.983.

Conclusions
Many optimization problems across the scientific domains must be solved with algorithms suited to each problem's characteristics. In this work, a new optimization algorithm called the spring search algorithm (SSA) was developed; its basic concept is derived from Hooke's law. The search agents of the proposed method are weights connected by several springs. The system starts from a transient state and stabilizes at its equilibrium point, in accordance with the spring law.
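The spring mechanics described above can be illustrated with a toy one-dimensional sketch: each agent is pulled toward the current best agent by a Hooke's-law force proportional to their separation (F = k·Δx), so the population contracts toward an equilibrium near the best solution found. This is only an illustrative interpretation with an arbitrarily chosen spring constant k, not the authors' exact SSA update equations.

```python
def toy_spring_step(positions, fitness, k=0.1):
    """One illustrative step: every agent is pulled toward the current best
    agent by a spring force proportional to the separation (F = k * dx)."""
    best = positions[min(range(len(positions)), key=fitness.__getitem__)]
    return [x + k * (best - x) for x in positions]

# Toy run: minimize f(x) = x**2 with three agents on a line.
pos = [3.0, -2.0, 0.5]
for _ in range(50):
    fit = [x * x for x in pos]
    pos = toy_spring_step(pos, fit)
print(pos)  # the agents have contracted toward the best-found region
```

Because the best agent exerts the pull while remaining stationary, the population's best fitness never worsens and its spread shrinks, mirroring the settling toward equilibrium described in the text.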
To evaluate the algorithm, almost 40 standard objective functions, including unimodal and multimodal functions as well as the CEC 2015 suite, were used to assess its performance on optimization problems of different natures. The results were reviewed and analyzed by comparing them with those of eight widely known optimization algorithms: GA, PSO, GSA, TLBO, GWO, GOA, SHO, and EPO.
The results on these functions show the superior exploration and exploitation capabilities of the SSA compared to the other optimization algorithms, for both unimodal and multimodal functions. The same holds for the CEC 2015 simulations comparing the SSA with the eight selected algorithms, which demonstrate the SSA's high aptitude for solving this type of problem. In addition to the work on these functions, the SSA was evaluated on five engineering design optimization problems to assess its performance in real-world settings, where it proved highly competitive with the other algorithms.
For future work, it is suggested to develop a binary version of the SSA and to apply the algorithm to multi-objective problems. Likewise, extending the concept of Hooke's law to more complex models with more adjustment parameters could generate additional advantages, even though it would reduce the algorithm's simplicity.

Conflicts of Interest:
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.