Article

A Spring Search Algorithm Applied to Engineering Optimization Problems

1 Department of Electrical and Electronics Engineering, Shiraz University of Technology, Shiraz 71557-13876, Iran
2 Department of Computer Science, Government Bikram College of Commerce, Patiala, Punjab 147004, India
3 Department of Electrical and Computer Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
4 School of Engineering and Sciences, Tecnologico de Monterrey, Monterrey 64849, Mexico
5 Department of Civil Engineering, Islamic Azad University of Estahban, Estahban 74, Iran
6 CROM Center for Research on Microgrids, Department of Energy Technology, Aalborg University, 9220 Aalborg, Denmark
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(18), 6173; https://doi.org/10.3390/app10186173
Submission received: 18 August 2020 / Revised: 31 August 2020 / Accepted: 2 September 2020 / Published: 4 September 2020

Abstract

At present, optimization algorithms are used extensively. One particular class comprises random-based heuristic population optimization algorithms, which may be created by modeling scientific phenomena, such as physical processes. The present article proposes a novel optimization algorithm based on Hooke’s law, called the spring search algorithm (SSA), which aims to solve single-objective constrained optimization problems. In the SSA, the search agents are weights joined through springs which, as Hooke’s law states, exert forces proportional to their displacements. The mathematics behind the algorithm is presented in the text. In order to test its functionality, the SSA is executed on 38 established benchmark test functions and weighed against eight other optimization algorithms: a genetic algorithm (GA), a gravitational search algorithm (GSA), a grasshopper optimization algorithm (GOA), particle swarm optimization (PSO), teaching–learning-based optimization (TLBO), a grey wolf optimizer (GWO), a spotted hyena optimizer (SHO), and an emperor penguin optimizer (EPO). To test the SSA’s applicability, it is further employed on five engineering optimization problems. The SSA delivered better results than the other algorithms on the unimodal and multimodal objective functions, the CEC 2015 test functions, and the engineering optimization problems.

1. Introduction

As the demand for quick and accurate solutions to increasingly complex problems grows, classical methods are being replaced by more robust approaches. One proposal is the use of heuristic random-based algorithms in place of exhaustively searching the defined problem space [1,2,3,4,5]. Heuristic algorithms are applicable to a variety of scientific fields, such as logistics [6], bioinformatics [7], data mining [8], chemical physics [9], energy [10], security [11], electrical engineering [12,13,14,15,16], and energy carriers [17,18], as well as other fields that aim to discover optimal solutions.
Each population-based algorithm may represent the exchange of information and the interaction between elements in a different way. For example, genetic algorithms simulate evolution [19]; annealing algorithms, thermodynamics [20]; immunity algorithms, the human immune system [21]; colony optimization strategies, ants’ search for food [22]; and particle swarm optimization approaches, the behavior of birds searching for food [23].
There are many laws of nature that may serve as inspiration, such as Newton’s universal law of gravitation, Hooke’s spring law, the laws of motion, the laws of energy and mass conservation, as well as the laws that dictate the electromagnetic force. A novel optimization algorithm based on Hooke’s spring law was proposed, with its precursory results detailed in [24]. This algorithm is detailed and analyzed in the current paper, with its improved equations. The SSA’s capabilities were evaluated on 23 benchmark test functions, as well as on a group of problems from the Constrained Single Objective Real-Parameter Optimization Technical Report, CEC 2015. The SSA was further corroborated by being weighed against eight established algorithms from the literature, as well as by being used to solve a selection of engineering problems.
Section 2 provides greater insight into other established optimization approaches. Section 3 elucidates Hooke’s law, while Section 4 outlines the SSA. Section 5 assesses the algorithm’s search ability and Section 6 its exploration and exploitation behavior. Section 7 presents the outcomes of the evaluation on the standard benchmark test functions. Section 8 covers the implementation of the algorithm on select engineering design problems. Finally, Section 9 presents the conclusions.

2. A Brief History of Intelligent Algorithms

An algorithm is considered intelligent when it finds a suitable answer to an optimization problem in the shortest possible time and using the least amount of available information [25]. Using a more complete definition, a heuristic method is a strategy that sacrifices part of the information in order to reach a solution quickly and with good precision [26]. Heuristic algorithms are frequently based on natural processes, typically biological processes or laws that explain physical phenomena. This approach has been widely pursued in the last ten years, and numerous algorithms have been suggested. These algorithms can be classified into different categories, such as swarm-based algorithms, evolution-based algorithms, and physics-based algorithms.

2.1. Swarm-Based Algorithms

These techniques were developed from the analysis of several naturally occurring processes, such as the growth or symbiosis of plants, the feeding behavior of insects, and the behavior and social organization of animals [27]. The particle swarm optimization (PSO) algorithm is a random search method developed around 1995 for functional optimization [28]. It was developed by analyzing the movement of birds as they search for food in a group. The algorithm is based on the premise that a group of birds looks for food at random and that there is only one portion of food in the area in question (the search space), but none of the birds knows where the food is. One successful strategy is for each bird to follow the bird that is closest to the food, which in turn is the bird most likely to find it. This strategy is, in fact, the source of the algorithm. In the algorithm, each solution, called a particle, is equivalent to a bird. Each particle has a fitness value calculated by a success function, as well as a velocity that controls its movement. By following the best particles, the agents keep moving through the solution space. The firefly algorithm (FA) is another algorithm inspired by a natural system, in this case a swarm, for problems where constrained optimal solutions are sought [29]. The algorithm is inspired by the radiation behavior of these insects. Fireflies live in groups and generate rhythmic light, cycling between low and higher light intensities in distinctive patterns. Fireflies use these lights for two main purposes: finding mates and looking for food; the lights can also serve as a protection mechanism. The whale optimization algorithm (WOA) [30] is another nature-inspired optimization algorithm; as the name implies, it mimics the social behavior of humpback whales. The most striking feature of humpback whales is their hunting strategy, called the bubble-net feeding method. Humpback whales like to hunt schools of krill or small fish near the surface of the ocean, foraging by creating distinctive bubbles along a circle or a path shaped like the number “9”.
Some of the other swarm-based algorithms are: artificial bee colony (ABC) [31], bat-inspired algorithm (BA) [32], spotted hyena optimizer (SHO) [33], cuckoo search (CS) [34], monkey search (MS) [35], group optimization (GO) [36], artificial fish-swarm algorithm (AFSA) [37], hunting search (HS) [38], moth-flame optimization algorithm (MFO) [39], dolphin partner optimization (DPO) [40], orientation search algorithm (OSA) [41], binary orientation search algorithm (BOSA) [42], dice game optimizer (DGO) [43], shell game optimization (SGO) [44], hide objects game optimization (HOGO) [45], donkey theorem optimization (DTO) [46], following optimization algorithm (FOA) [47], rat swarm optimizer (RSO) [48], darts game optimizer (DGO) [49], football game based optimization (FGBO) [50], grey wolf optimizer (GWO) [51], grasshopper optimization algorithm (GOA) [52], coupled spring forced bat algorithm (SFBA) [53], adaptive granularity learning distributed particle swarm optimization (AGLDPSO) [54], multi leader optimizer (MLO) [55], doctor and patient optimization (DPO) [56], and emperor penguin optimizer (EPO) [57].

2.2. Evolution-Based Algorithms

An algorithm is considered evolutionary when it combines aspects of natural selection and heredity. These algorithms are based on structures that simulate the rules of selection, recombination, mutation, and survival found in genetics, hence the adjective “evolutionary”. These structures are based on sets of genes. In this method, the environment determines each individual’s fitness within a population, and the fittest individuals are used to reproduce [58]. Evolutionary algorithms are random search procedures that use genetic mechanisms and natural selection [59]. Genetic algorithms (GA) were developed as an optimization method built on fundamental operations from genetic biology [60]. The first record of using these concepts to create an optimization method dates to 1967 [61]. The GA is a particular type of evolutionary algorithm that exploits basic biological concepts such as inheritance and mutation [62] and has produced satisfactory results in different scientific domains.
On the other hand, differential evolution (DE) [63] is another population-based intelligent optimization algorithm, introduced in 1995 [64]. The initial version of this algorithm was used to solve problems with continuous variables, and variants have since been implemented to solve optimization problems with discrete variables [65]. Other algorithms based on the theory of evolution include evolutionary programming (EP) [66], the biogeography-based optimizer (BBO) [67], the enhanced quantum-inspired differential evolution algorithm (IQDE) [68], genetic programming (GP) [69], and evolution strategy (ES) [70].

2.3. Physics-Based Algorithms

Physics-based algorithms, as the name implies, are inspired by the laws of physics. Simulated annealing (SA) is one of the best known and most popular optimization algorithms. SA was developed in 1983 [71], inspired by the annealing of metals, for example in the artisanal process historically used to make swords or knives. The process consists of first heating the metal to a very high temperature and then cooling it by gradually reducing the temperature so that the metal hardens. In this process, when the temperature of the metal increases, the speed of atomic movement increases dramatically; the subsequent gradual reduction in temperature causes specific patterns to form, based on the locations of the atoms [20]. The rate of temperature change is one of the adjustment parameters of this algorithm. The gravitational search algorithm (GSA) [72] is inspired by the universal law of gravitation formulated by Isaac Newton. In this algorithm, objects such as planets in a galaxy are defined as search agents. The optimal region, similar to a black hole, absorbs the planets. Information about the fitness of each object is stored as its gravitational and inertial mass, and the exchange of information among objects is governed by the attractive force of gravitation [73]. Several other algorithms have been developed based on laws and theories of physics, such as charged system search (CSS) [74], the galaxy-based search algorithm (GbSA) [75], curved space optimization (CSO) [76], the ray optimization (RO) algorithm [77], the artificial chemical reaction optimization algorithm (ACROA) [78], the small world optimization algorithm (SWOA) [79], central force optimization (CFO) [80], black hole (BH) [81], and big-bang big-crunch (BBBC) [82].

3. Spring Force Law

If the work done by a force that moves an object around a closed path (forward and back) is not affected by the object’s trajectory, that force is conservative. Equivalently, a force is conservative if the work it does along any path depends only on the initial and final points. The spring force is a conservative force [83].
Consider a spring that imposes a force on a particle of mass m that moves horizontally in the x direction. When the particle is at the origin (x = 0), the spring is at its balance point. An external force $F_{ext}$ acts on the object in the direction opposite to the spring force and is always equal to it in magnitude; thus, the particle is always in balance.
Now suppose the particle is moved a distance x from its initial location at x = 0. When the external agent imposes a force $F_{ext}$ on the particle, the spring opposes it with a resistance force $F_s$. This force is described by the spring force law, or Hooke’s law, as follows:
$F_s = kx$ (1)
where k is the spring stiffness constant and x denotes the spring displacement (stretch or compression) from the balance point; the force acts opposite to the displacement. Most real springs follow Hooke’s law closely up to a limit [84].
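As a quick numerical illustration (a minimal sketch with assumed example values, not taken from the paper), Hooke’s law can be evaluated directly:

```python
# Minimal illustration of Hooke's law; k and x are assumed example values.
k = 200.0   # spring stiffness coefficient (N/m)
x = 0.05    # displacement from the balance point (m)

F_s = k * x  # magnitude of the restoring force exerted by the spring
print(F_s)   # -> 10.0 (N), directed opposite to the displacement
```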
The system behavior is investigated in an isolated system, as shown in Figure 1, where it is assumed that only spring forces act on the object.
In Figure 1, the forces imposed on object j can be grouped into two: $F_{totalR}$, which, as described in Equation (2), is the sum of the forces imposed from the right, and $F_{totalL}$, which, as shown in Equation (3), is the sum of the forces imposed from the left. It should be mentioned that the springs attached to the object from either the right or the left are anchored to fixed points at their other ends.
$F_{totalR}^{j} = \sum_{i=1}^{n_R} K_{i,j} x_{i,j}$ (2)

$F_{totalL}^{j} = \sum_{l=1}^{n_L} K_{l,j} x_{l,j}$ (3)
where $n_R$ and $n_L$ are the numbers of right and left spring forces, $x_{i,j}$ and $x_{l,j}$ denote the distances between object j and the fixed right and left points, and $K_{i,j}$ and $K_{l,j}$ are the spring stiffness coefficients between object j and the fixed points.
The object is initially balanced, with no net force exerted on it. Then, by applying the spring forces, the object is pulled from the right and the left. Depending on the magnitudes of these forces, the object shifts to the left or right until the system reaches a new equilibrium position. If the right and left forces are equal, the object remains at its original position.
Considering the stiffness coefficients of the springs connected to the object, two new parameters may be defined as follows:
$K_{equalR}^{j} = \sum_{i=1}^{n_R} K_{i,j}$ (4)

$K_{equalL}^{j} = \sum_{l=1}^{n_L} K_{l,j}$ (5)
$K_{equalR}^{j}$ and $K_{equalL}^{j}$ are the equivalent right and left spring constants, respectively. Considering Equation (1), the displacement values on each side may be defined as follows:
$dX_R^{j} = \frac{F_{totalR}^{j}}{K_{equalR}^{j}}$ (6)

$dX_L^{j} = \frac{F_{totalL}^{j}}{K_{equalL}^{j}}$ (7)
Here, $dX_R^{j}$ and $dX_L^{j}$ are the displacements of object j to the right and left, respectively. Therefore, the total displacement may be defined as follows:
$dX^{j} = dX_R^{j} + dX_L^{j}$ (8)

where $dX^{j}$ is the final displacement of object j, which may be positive or negative.
$X^{j} = X_0^{j} + dX^{j}$ (9)
In Equation (9), $X^{j}$ is the location of the new balance point of the system for object j, and $X_0^{j}$ is the initial balance point of object j.
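To make the force balance concrete, the following Python sketch traces Equations (2) through (9) for a single object. The sign convention (right-hand forces act in the positive direction, left-hand forces in the negative direction) and the function name are illustrative assumptions, not part of the original formulation:

```python
def new_balance_point(x0_j, right_springs, left_springs):
    """Compute the new balance point of object j via Equations (2)-(9).

    right_springs / left_springs are lists of (K, x) pairs: the stiffness
    of each spring and the distance between object j and the fixed point
    it is anchored to. Right-hand forces are taken as positive and
    left-hand forces as negative (an assumed sign convention).
    """
    F_right = sum(K * x for K, x in right_springs)        # Equation (2)
    F_left = -sum(K * x for K, x in left_springs)         # Equation (3)
    K_right = sum(K for K, _ in right_springs)            # Equation (4)
    K_left = sum(K for K, _ in left_springs)              # Equation (5)

    dX_right = F_right / K_right if K_right > 0 else 0.0  # Equation (6)
    dX_left = F_left / K_left if K_left > 0 else 0.0      # Equation (7)

    dX = dX_right + dX_left                               # Equation (8)
    return x0_j + dX                                      # Equation (9)
```

For instance, `new_balance_point(0.0, [(1.0, 2.0)], [(1.0, 1.0)])` returns 1.0, pulling the object towards the more distant right-hand anchor point.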
By simulating Hooke’s law in the discrete time domain, a new optimization algorithm, the spring search algorithm, was designed; it is explained in the following section.

4. Spring Search Algorithm (SSA)

In this article, the spring search algorithm is run in an artificial system with discrete time. The system space is the defined domain of the problem, and the spring force law serves as a tool to convey information. The designed optimizer may be applied to any optimization problem, as long as a candidate answer can be defined as a position within this space and its quality relative to other candidate answers can be expressed through spring stiffness comparisons. The stiffness of each spring is established relative to the objective function.
The SSA consists of two general steps: making a discrete-time artificial system within the problem space by defining the initial positions of the objects, determining the governing laws, and arranging the parameters; and letting the algorithm run until it reaches a stop condition.

4.1. Setting the System, Determining Laws and Arranging Parameters

First, the system space is determined, with a multi-dimensional coordinate system in which the problem is defined. Any point in this space may be a solution of the optimization problem. The search agents are a set of objects attached to each other by springs. Each object has its own position as well as stiffness coefficients pertaining to the springs attached to it. The object’s position is a point in space where a solution of the problem may be found. The stiffness values of the springs are computed from the fitness of the two objects they connect.
After setting up the system, its governing laws are determined. Only the spring force law and the laws of motion are observed in the system. The general patterns of these rules are similar to the natural laws and are defined below.
In mechanics and elasticity, Hooke’s law is an approximation stating that the change in an object’s length is directly proportional to the load applied to it. Most materials follow this law with reasonable accuracy as long as the force does not exceed their elastic limit. Deviations from Hooke’s law grow with the amount of deformation, such that for large deformations, when the material leaves its linear elastic domain, Hooke’s law no longer applies. The present article assumes that Hooke’s law remains valid at all times.
According to the laws of motion, the present location of any object equals the sum of its previous location and its displacement. The displacement of any object may be determined with the aid of the spring force law.
Consider a system as a set of m objects, where the position of each object is a point in the search space and a candidate solution to the problem.
$X^{i} = (x_i^1, \ldots, x_i^d, \ldots, x_i^n)$ (10)
In Equation (10), $x_i^d$ designates the position of object i in dimension d. The initial positions of the objects are defined randomly within the search space. These objects tend to return to an equilibrium position by means of the forces exerted by the springs.
Equation (11) is employed in order to compute the spring stiffness coefficient.
$K_{i,j} = K_{max} \times |F_n^{i} - F_n^{j}| \times \max(F_n^{i}, F_n^{j})$ (11)
In Equation (11), $K_{i,j}$ is the spring stiffness coefficient between objects i and j, $K_{max}$ is the maximum value of the spring stiffness coefficient, determined according to the type of problem in question, and $F_n^{i}$ and $F_n^{j}$ are the normalized objective function values of objects i and j, respectively. Equations (12) and (13) are used to normalize the objective function.
$F_n^{i} = \frac{f_{obj}^{i}}{\min(f_{obj})}$ (12)

$F_n^{i} = \min(F_n^{i}) \times \frac{1}{F_n^{i}}$ (13)
In the above equations, $f_{obj}$ is the objective function and $f_{obj}^{i}$ is the objective function value for object i.
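Under the above reading of Equations (11) through (13) (the flattened source makes the exact form of the normalization uncertain, so this is a hedged reconstruction), the stiffness matrix can be sketched in Python as:

```python
import numpy as np

def spring_stiffness(f_obj, K_max=1.0):
    """Normalize the objective values (Equations (12)-(13), as reconstructed
    here) and build the pairwise stiffness matrix of Equation (11).
    Assumes minimization and strictly positive objective values; both
    assumptions are illustrative."""
    f_obj = np.asarray(f_obj, dtype=float)
    Fn = f_obj / f_obj.min()   # Equation (12): ratio to the best value
    Fn = Fn.min() / Fn         # Equation (13): the best object gets Fn = 1
    # Equation (11): springs are stiffer between objects whose normalized
    # fitness values differ more, scaled by the better of the two values.
    diff = np.abs(Fn[:, None] - Fn[None, :])
    K = K_max * diff * np.maximum(Fn[:, None], Fn[None, :])
    return Fn, K
```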
In an m-variable problem, it is possible to assume that the problem has m dimensions and that there is a coordinate axis for each dimension. Therefore, the system can be drawn on the coordinate axis related to each variable. On each axis, the strong points to the left or right of an object are determined by comparing objective function values; points that are stronger relative to a given object are those positioned at more optimal locations. Therefore, on each axis, two aggregate forces are applied to object j: the sum of the right forces, Equation (14), and the sum of the left forces, Equation (15).
$F_{totalR}^{j,d} = \sum_{i=1}^{n_R^d} K_{i,j} x_{i,j}^d$ (14)

$F_{totalL}^{j,d} = \sum_{l=1}^{n_L^d} K_{l,j} x_{l,j}^d$ (15)
In the above equations, $F_{totalR}^{j,d}$ stands for the sum of the right forces and $F_{totalL}^{j,d}$ for the sum of the left forces imposed on object j in dimension d; $n_R^d$ and $n_L^d$ are the numbers of right and left strong points in dimension d; $x_{i,j}^d$ and $x_{l,j}^d$ represent the distances of object j from the right and left strong points; and $K_{i,j}$ and $K_{l,j}$ are the stiffness coefficients of the springs attached between object j and the strong points.
Now, applying Hooke’s law in dimension d:
$dX_R^{j,d} = \frac{F_{totalR}^{j,d}}{K_{equalR}^{j}}$ (16)

$dX_L^{j,d} = \frac{F_{totalL}^{j,d}}{K_{equalL}^{j}}$ (17)
Here, $dX_R^{j,d}$ and $dX_L^{j,d}$ are the displacements of object j to the right and left in dimension d, respectively. The total displacement may be calculated as follows:
$dX^{j,d} = dX_R^{j,d} + dX_L^{j,d}$ (18)
where $dX^{j,d}$ is the final displacement of object j in dimension d; its direction may be the positive or negative x direction.
$X_{new}^{j,d} = X_0^{j,d} + r_1 \times dX^{j,d}$ (19)
Equation (19) gives $X_{new}^{j,d}$, the new position and balance point of object j in dimension d, where $X_0^{j,d}$ is the initial balance point of object j in dimension d. Here, $r_1$ is a random number with a uniform distribution over $[0, 1]$, used to preserve the random character of the search.
In the last step, after reaching balance, the objects and springs undergo a small additional displacement due to slipping, which is simulated in Equation (20).
$X^{j,d} = X_{new}^{j,d} + \frac{2(T - t)}{T} \times (0.2 + r_2 \times 0.4) \times X_{new}^{j,d}$ (20)
Here, $X^{j,d}$ is the updated position of object j in dimension d, T is the maximum number of iterations, t is the iteration counter, and $r_2$ is a random number with a uniform distribution over $[0, 1]$, used to preserve the random character of the search.
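A per-dimension sketch of the update in Equations (14) through (20) follows; the signed distances (anchor position minus object position) and the variable names are our assumptions:

```python
import numpy as np

rng = np.random.default_rng()

def update_dimension(x_jd, right_pts, K_right, left_pts, K_left, t, T):
    """Move object j along dimension d using Equations (14)-(20).

    right_pts / left_pts hold the positions (in dimension d) of the strong
    points to the right and left of the object; K_right / K_left hold the
    stiffness of the springs tying the object to those points.
    """
    F_R = sum(K * (p - x_jd) for K, p in zip(K_right, right_pts))  # Eq. (14)
    F_L = sum(K * (p - x_jd) for K, p in zip(K_left, left_pts))    # Eq. (15)
    dX_R = F_R / sum(K_right) if K_right else 0.0                  # Eq. (16)
    dX_L = F_L / sum(K_left) if K_left else 0.0                    # Eq. (17)
    dX = dX_R + dX_L                                               # Eq. (18)

    x_new = x_jd + rng.random() * dX                               # Eq. (19)
    # Eq. (20): a small slip around the new balance point that shrinks
    # linearly as the iteration counter t approaches the maximum T.
    return x_new + 2 * (T - t) / T * (0.2 + rng.random() * 0.4) * x_new
```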

4.2. Time Passing and Parameters Updating

While initially forming the system, each object is randomly placed at a point in space that may be the problem’s answer. At any given time, the objects are assessed, their displacements are calculated using Equations (11) through (18), and their positions are updated via Equations (19) and (20). Thereafter, each object is placed in its newly computed position. The parameter of interest is the spring stiffness coefficient, which is updated at each stage based on Equation (11). The stop condition is reached after the algorithm has run for a given number of iterations. The steps of the SSA are listed below, a flowchart encompassing them is shown in Figure 2, and a compact code sketch of the loop follows the list:
1. Start.
2. Determine the system environment and the problem information.
3. Create the initial population of objects.
4. Evaluate and normalize the fitness (objective) function.
5. Update parameter K.
6. Formulate the spring force and the laws of motion for each object.
7. Compute the object displacement quantities.
8. Update the object locations.
9. Repeat steps 4 through 8 until the stop condition is satisfied.
10. Print the best solution.
11. End.
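The sketch below ties the steps above into a single loop. It is a compact reading of the algorithm, not the authors’ reference implementation: the right and left force sums of Equations (14) and (15) are merged into one signed sum over the stronger points, the normalization of Equations (12) and (13) is shifted so it stays positive for objectives with negative values, and the slip term of Equation (20) is omitted for brevity:

```python
import numpy as np

def spring_search(f_obj, bounds, m=50, T=1000, K_max=1.0, seed=0):
    """Skeleton of the SSA main loop (steps 1-11 of the flowchart)."""
    rng = np.random.default_rng(seed)
    low, high = map(np.asarray, bounds)
    n = low.size
    X = low + rng.random((m, n)) * (high - low)    # step 3: initial objects

    best_x, best_f = None, np.inf
    for t in range(1, T + 1):
        f = np.apply_along_axis(f_obj, 1, X)       # step 4: evaluate
        if f.min() < best_f:
            best_f, best_x = f.min(), X[f.argmin()].copy()

        Fn = f - f.min() + 1e-12                   # shifted variant of Eq. (12)
        Fn = Fn.min() / Fn                         # Eq. (13): best object -> 1
        K = K_max * np.abs(Fn[:, None] - Fn[None, :]) \
            * np.maximum(Fn[:, None], Fn[None, :]) # step 5: update K (Eq. (11))

        for j in range(m):                         # steps 6-8
            stronger = Fn > Fn[j]                  # points in better positions
            if not stronger.any():
                continue
            for d in range(n):
                F = np.sum(K[stronger, j] * (X[stronger, d] - X[j, d]))
                dX = F / np.sum(K[stronger, j])    # Eqs. (14)-(18), merged
                X[j, d] += rng.random() * dX       # Eq. (19)
            X[j] = np.clip(X[j], low, high)        # keep objects in bounds
    return best_x, best_f                          # step 10: best solution
```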

5. Properties of the Proposed SSA

The above algorithm is a proposed optimization method that applies the spring force law, using the spring stiffness coefficient as its central mechanism. In this algorithm, a set of objects searches the space randomly, using the spring force as a tool to transfer information between objects. Under the influence of the other objects, each object may arrive at a rough understanding of its surrounding space. The algorithm must be steered so that the objects’ locations improve as time passes.
Thus, stiffer springs connect objects to those with better fitness values. These springs pull the other objects towards them, so that a suitable force is exerted on each object, and the objects move towards better conditions as time goes by. Accordingly, objects placed in better locations must take slower and shorter steps. As an object reaches a better condition, the stiffness coefficients of its springs increase, and objects attached to stiffer springs search their surrounding environment with more precision. Indeed, this behavior, referred to here as adjustment, is similar to tuning the learning rate in a neural network. The spring stiffness coefficient becomes smaller with the passing of time as a result of the spring force. Another reason the spring stiffness coefficient decreases with time is that the objects tend to concentrate around better locations and need to search the space with smaller, more precise steps.
Each object influences a neighborhood whose radius depends on its fitness value. As indicated in Figure 3, each object moves as dictated by the spring forces imposed on it.

6. Exploration and Exploitation in SSA

An optimization method must address two issues: exploration and exploitation. Regarding exploration, the algorithm must have enough power to search the problem space thoroughly and not be limited to only a few specific locations. The algorithm addresses exploitation by focusing the search around the optimal locations found so far. A population-based algorithm must first search the designated space comprehensively; hence, the algorithm must focus on locating the general area of the solution during the initial iterations and, as time passes, position itself more precisely with the aid of the population’s findings [85].
The SSA can search the space by employing a suitable number of objects. The algorithm improves its exploration power through the spring force effect, by varying the spring stiffness coefficients and consequently controlling the spring forces among the objects. During the initial iterations, the problem needs thorough searching; as time passes, the population arrives at better results and the spring stiffness coefficient values are controlled accordingly. A proper value is chosen at the initial time and decreases over time through the spring stiffness coefficient (Equation (11)) until it reaches its minimum value.

7. Experimental Results and Discussion

This section presents the results from the evaluation of the SSA’s performance on twenty-three standard benchmark test functions. A detailed description of these benchmark functions is presented below. Furthermore, the results are compared to eight existing optimization algorithms.

7.1. Benchmark Test Functions

The performance of the SSA was assessed using 23 benchmark test functions [86]. The experiments were run in MATLAB R2014a (8.3.0.532) on Microsoft Windows 7, using a 64-bit Intel Core i7 processor at 2.40 GHz with 16 GB of main memory. The averages and standard deviations of the best solutions obtained are displayed in Table 1, Table 2 and Table 3. For each benchmark test function, the SSA was given 20 independent runs, each with 1000 iterations.
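A sketch of this evaluation protocol (20 independent runs of 1000 iterations each, reporting the mean and standard deviation of the best values found) is given below; the sphere-type function used as the example F1, together with its bounds and dimension, is a common convention in this benchmark suite and is stated here as an assumption:

```python
import numpy as np

def benchmark(algorithm, f_obj, bounds, runs=20, iters=1000):
    """Run `algorithm` independently `runs` times and report the mean and
    standard deviation of the best objective values found."""
    best = [algorithm(f_obj, bounds, T=iters, seed=r)[1] for r in range(runs)]
    return float(np.mean(best)), float(np.std(best))

# Example with a sphere-type function as F1 (definition assumed):
ave, std = benchmark(spring_search, lambda x: float(np.sum(x**2)),
                     (np.full(30, -100.0), np.full(30, 100.0)))
```

Here `spring_search` refers to the loop sketched in Section 4.2.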

7.2. Algorithms Used for Comparison

In order to demonstrate the potency of the SSA, it was compared with eight optimization algorithms on unimodal, multimodal, fixed-dimension multimodal, and composite functions. The algorithms were also assessed by solving a set of minimization problems introduced in the Constrained Single Objective Real-Parameter Optimization Technical Report, CEC 2015 [86]. The eight optimization algorithms used to validate the performance of the SSA were: GA [87], PSO [28], GSA [72], TLBO [88], GOA [52], GWO [51], SHO [33], and EPO [57].
The parameter settings of the compared optimization algorithms are as follows:
- GA: population size N = 80; crossover probability 0.9; mutation probability 0.05.
- PSO: swarm size S = 50; inertia weight decreasing linearly from 0.9 to 0.4; C1 (individual-best acceleration factor) increasing linearly from 0.5 to 2.5; C2 (global-best acceleration factor) decreasing linearly from 2.5 to 0.5.
- GSA: number of objects N = 50; acceleration coefficient a = 20; initial gravitational constant G0 = 100.
- TLBO: swarm size S = 50.
- GOA: search agents N = 100; $c_{max} = 1$; $c_{min} = 4 \times 10^{-5}$; l = 1.5; f = 0.5.
- GWO: number of wolves = 50; variable a decreasing linearly from 2 to 0.
- SHO: search agents N = 80; constant M in [0.5, 1]; control parameter h in [5, 0].
- EPO: search agents N = 80; temperature profile in [1, 1000]; constant A in [−1.5, 1.5]; function S() in [0, 1.5]; parameter M = 2; parameter f in [2, 3]; parameter l in [1.5, 2].

7.2.1. Evaluation of Unimodal Test Functions with High Dimensions

Functions F1 to F7 are unimodal. The mean results of 20 independent runs of each algorithm are displayed in Table 1. These results show that the SSA performs better than the other algorithms on all of functions F1 to F7.

7.2.2. Evaluation of Multimodal Test Functions with High Dimensions

In the multimodal functions F8 to F13, the number of local optima increases exponentially with the function dimension, so reaching the global minimum of these functions is hard. For this type of function, arriving at a response close to the ideal one denotes the algorithm’s high capability in avoiding false local optima. The results obtained for F8 to F13 after 20 runs of the SSA and of the compared algorithms are presented in Table 2. The SSA demonstrated the best performance on all of the functions analyzed.

7.2.3. Evaluation of Multimodal Test Functions with Low Dimensions

Functions F14 to F23 have low dimensions and few local optima. The results obtained from 20 runs of each algorithm are shown in Table 3. These results indicate that the SSA performs suitably on all of functions F14 to F23.

7.2.4. Evaluation of CEC 2015 Test Functions

This section is devoted to realistic approaches and techniques for solving single-objective optimization problems; all of these test functions are minimization problems. Table 4 shows the performance of the SSA and of the other algorithms on the CEC 2015 test functions and exhibits that the SSA is the most efficient optimizer across the benchmark test functions studied.

8. SSA for Engineering Design Problems

In this section, the SSA is applied to five constrained engineering design problems.

8.1. Pressure Vessel Design

The mathematical model of this problem was adapted from a paper by Kannan and Kramer [89]. Table 5 and Table 6 show the performance of the SSA along with the other algorithms. The SSA provides an optimal solution at (0.778099, 0.383241, 40.315121, 200.00000), with a corresponding fitness value of 5880.0700.
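For reference, a sketch of the widely used formulation of this problem (design variables: shell thickness, head thickness, inner radius, and vessel length) is given below, with a simple static penalty for the four constraints; the penalty-based constraint handling is an illustrative choice, as the paper does not specify its mechanism:

```python
import numpy as np

def pressure_vessel(x, penalty=1e6):
    """Pressure vessel design cost in the standard formulation adapted
    from [89], with an assumed static penalty for constraint violations."""
    x1, x2, x3, x4 = x  # shell thickness, head thickness, radius, length
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [
        -x1 + 0.0193 * x3,                                    # shell stress
        -x2 + 0.00954 * x3,                                   # head stress
        -np.pi * x3**2 * x4 - 4/3 * np.pi * x3**3 + 1296000,  # working volume
        x4 - 240,                                             # length limit
    ]
    return cost + penalty * sum(max(0.0, gi) for gi in g)
```

At the solution reported above, the cost term alone evaluates to approximately 5880.07.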

8.2. Speed Reducer Design Problem

This problem is modeled mathematically in [90,91]. The results of the optimization problem are presented in Table 7 and Table 8. The optimal solution was provided by the SSA at (3.50123, 0.7, 17, 7.3, 7.8, 3.33421, 5.26536) with a corresponding fitness value equal to 2994.2472.

8.3. Welded Beam Design

The mathematical model of the welded beam design was adapted from [31]. The results for this optimization problem are presented in Table 9 and Table 10. The SSA provides an optimal solution at (0.205411, 3.472341, 9.035215, 0.201153) with a corresponding fitness value equal to 1.723589.

8.4. Tension/Compression Spring Design Problem

The mathematical model of this problem was adapted from [31]. The results to this optimization problem are displayed in Table 11 and Table 12. The SSA provides the optimal solution at (0.051087, 0.342908, 12.0898), with a corresponding fitness value of 0.012656987.

8.5. Rolling Element Bearing Design Problem

The mathematical model of this problem is adapted from [92]. The results of this optimization problem are included in Table 13 and Table 14 and prove that the SSA provides an optimal solution at (125, 21.41890, 10.94113, 0.515, 0.515, 0.4, 0.7, 0.3, 0.02, 0.6) with a corresponding fitness value equal to 85,067.983.

9. Conclusions

There are many optimization problems in different scientific domains, and each must be solved with an algorithm suited to the characteristics of the case. In this work, a new optimization algorithm called the spring search algorithm (SSA) was developed; Hooke’s law is its starting point and basic concept. The search agents of the proposed method are weights connected by several springs. The system starts from a transitory state and stabilizes at the equilibrium point, in accordance with the spring law.
To evaluate and validate the algorithm, almost 40 standard objective functions, including unimodal and multimodal functions in addition to the CEC 2015 set, were used to assess the performance of the proposed algorithm in solving optimization problems of a different nature. To review and analyze the algorithm’s results, they were compared with eight widely known optimization algorithms: GA, PSO, GSA, TLBO, GWO, GOA, SHO, and EPO.
The results on these functions show the superior exploration and exploitation capabilities of the SSA compared with the other optimization algorithms, for both unimodal and multimodal functions. The same holds for the simulations on CEC 2015, which show the SSA’s high aptitude for solving this type of problem. In addition to the work carried out on the almost 40 functions, the SSA was evaluated on five engineering design optimization problems to gauge its performance in solving optimization problems in real situations, showing that it is highly competitive with the other algorithms.
For future work, it is suggested to develop a binary version of the SSA and to apply the algorithm to multi-objective problems. Likewise, the concept of Hooke’s law could be extended to more complex models with more adjustment parameters, which, even though it would reduce simplicity, could bring additional advantages.

Author Contributions

Conceptualization, M.D., Z.M., R.M.-M., R.A.R.-M., A.D. and J.M.G.; methodology, M.D. and Z.M.; software, M.D.; validation, J.M.G., G.D., R.A.R.-M., R.M.-M., A.D. and O.P.M.; formal analysis, A.D., O.P.M.; investigation, M.D. and O.P.M.; resources, J.M.G.; data curation, G.D.; writing—original draft preparation, M.D. and Z.M.; writing—review and editing, O.P.M., R.A.R.-M., R.M.-M., L.P.-A., G.D., and J.M.G.; visualization, M.D.; supervision, M.D. and Z.M.; project administration, M.D. and Z.M.; funding acquisition, R.A.R.-M. and R.M.-M. All authors have read and agreed to the published version of the manuscript.

Funding

The current project was funded by Tecnologico de Monterrey and FEMSA Foundation (grant CAMPUSCITY project).

Conflicts of Interest

The authors declare no conflict of interest. The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Abbreviations

ABC: Artificial Bee Colony
ACROA: Artificial Chemical Reaction Optimization Algorithm
AFSA: Artificial Fish-Swarm Algorithm
BA: Bat-inspired Algorithm
BBO: Biogeography-Based Optimizer
BH: Black Hole
BOSA: Binary Orientation Search Algorithm
BBBC: Big-Bang Big-Crunch
CFO: Central Force Optimization
CS: Cuckoo Search
CSO: Curved Space Optimization
CSS: Charged System Search
DGO: Darts Game Optimizer
DPO: Dolphin Partner Optimization
DGO: Dice Game Optimizer
DE: Differential Evolution
DTO: Donkey Theorem Optimization
EP: Evolutionary Programming
ES: Evolution Strategy
EPO: Emperor Penguin Optimizer
FA: Firefly Algorithm
FOA: Following Optimization Algorithm
FGBO: Football Game Based Optimization
GP: Genetic Programming
GO: Group Optimization
GOA: Grasshopper Optimization Algorithm
GSA: Gravitational Search Algorithm
GbSA: Galaxy-based Search Algorithm
GWO: Grey Wolf Optimizer
HOGO: Hide Objects Game Optimization
HS: Hunting Search
MFO: Moth-flame Optimization Algorithm
MS: Monkey Search
OSA: Orientation Search Algorithm
PSO: Particle Swarm Optimization
RSO: Rat Swarm Optimizer
RO: Ray Optimization
SHO: Spotted Hyena Optimizer
SGO: Shell Game Optimization
SWOA: Small World Optimization Algorithm
WOA: Whale Optimization Algorithm

References

  1. Cortés-Toro, E.M.; Crawford, B.; Gómez-Pulido, J.A.; Soto, R.; Lanza-Gutiérrez, J.M. A new metaheuristic inspired by the vapour-liquid equilibrium for continuous optimization. Appl. Sci. 2018, 8, 2080. [Google Scholar] [CrossRef] [Green Version]
  2. Pelusi, D.; Mascella, R.; Tallini, L. A fuzzy gravitational search algorithm to design optimal IIR filters. Energies 2018, 11, 736. [Google Scholar] [CrossRef] [Green Version]
  3. Díaz, P.; Pérez-Cisneros, M.; Cuevas, E.; Avalos, O.; Gálvez, J.; Hinojosa, S.; Zaldivar, D. An improved crow search algorithm applied to energy problems. Energies 2018, 11, 571. [Google Scholar] [CrossRef] [Green Version]
  4. Chiu, C.-Y.; Shih, P.-C.; Li, X. A dynamic adjusting novel global harmony search for continuous optimization problems. Symmetry 2018, 10, 337. [Google Scholar] [CrossRef] [Green Version]
  5. Sengupta, S.; Basak, S.; Peters, R.A. Particle Swarm Optimization: A survey of historical and recent developments with hybridization perspectives. Mach. Learn. Knowl. Extr. 2019, 1, 157–191. [Google Scholar] [CrossRef] [Green Version]
  6. Stripling, E.; Broucke, S.V.; Antonio, K.; Baesens, B.; Snoeck, M. Profit maximizing logistic model for customer churn prediction using genetic algorithms. Swarm Evol. Comput. 2018, 40, 116–130. [Google Scholar] [CrossRef] [Green Version]
  7. Antonov, I.V.; Mazurov, E.; Borodovsky, M.; Medvedeva, Y.A. Prediction of lncRNAs and their interactions with nucleic acids: Benchmarking bioinformatics tools. Brief. Bioinform. 2019, 20, 551–564. [Google Scholar] [CrossRef]
  8. Djenouri, Y.; Belhadi, A.; Belkebir, R. Bees swarm optimization guided by data mining techniques for document information retrieval. Expert Syst. Appl. 2018, 94, 126–136. [Google Scholar] [CrossRef]
  9. Artrith, N.; Urban, A.; Ceder, G. Constructing first-principles phase diagrams of amorphous Li x Si using machine-learning-assisted sampling with an evolutionary algorithm. J. Chem. Phys. 2018, 148, 241711. [Google Scholar] [CrossRef] [Green Version]
  10. Dehghani, M.; Montazeri, Z.; Malik, O. Energy commitment: A planning of energy carrier based on energy consumption. Электрoтехника и электрoмеханика 2019, 4, 69–72. [Google Scholar] [CrossRef]
  11. Ehsanifar, A.; Dehghani, M.; Allahbakhshi, M. Calculating the leakage inductance for transformer inter-turn fault detection using finite element method. In Proceedings of the 2017 Iranian Conference on Electrical Engineering (ICEE), Tehran, Iran, 2–4 May 2017; pp. 1372–1377. [Google Scholar]
  12. Dehghani, M.; Montazeri, Z.; Malik, O. Optimal sizing and placement of capacitor banks and distributed generation in distribution systems using spring search algorithm. Int. J. Emerg. Electr. Power Syst. 2020, 21. [Google Scholar] [CrossRef] [Green Version]
  13. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M.; Dhiman, G. A New Methodology Called Dice Game Optimizer for Capacitor Placement in Distribution Systems. Электрoтехника и электрoмеханика 2020, 1, 61–64. [Google Scholar] [CrossRef] [Green Version]
  14. Dehbozorgi, S.; Ehsanifar, A.; Montazeri, Z.; Dehghani, M.; Seifi, A. Line loss reduction and voltage profile improvement in radial distribution networks using battery energy storage system. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22 December 2017; pp. 215–219. [Google Scholar]
  15. Montazeri, Z.; Niknam, T. Optimal utilization of electrical energy from power plants based on final energy consumption using gravitational search algorithm. Электрoтехника и электрoмеханика 2018, 4, 70–73. [Google Scholar] [CrossRef] [Green Version]
  16. Dehghani, M.; Mardaneh, M.; Montazeri, Z.; Ehsanifar, A.; Ebadi, M.; Grechko, O. Spring search algorithm for simultaneous placement of distributed generation and capacitors. Электрoтехника и электрoмеханика 2018, 6, 68–73. [Google Scholar] [CrossRef]
  17. Dehghani, M.; Montazeri, Z.; Ehsanifar, A.; Seifi, A.; Ebadi, M.; Grechko, O. Planning of energy carriers based on final energy consumption using dynamic programming and particle swarm optimization. Электрoтехника и электрoмеханика 2018, 5, 62–71. [Google Scholar] [CrossRef] [Green Version]
  18. Montazeri, Z.; Niknam, T. Energy carriers management based on energy consumption. In Proceedings of the2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22 December 2017; pp. 539–543. [Google Scholar]
  19. Mirjalili, S. Genetic Algorithm. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 43–55. [Google Scholar]
  20. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  21. Farmer, J.D.; Packard, N.H.; Perelson, A.S. The immune system, adaptation, and machine learning. Phys. D Nonlinear Phenom. 1986, 22, 187–204. [Google Scholar] [CrossRef]
  22. Mirjalili, S. Ant Colony Optimisation. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 33–42. [Google Scholar]
  23. Mirjalili, S. Particle Swarm Optimisation. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 15–31. [Google Scholar]
  24. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Seifi, A. Spring search algorithm: A new meta-heuristic optimization algorithm inspired by Hooke’s law. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22 December 2017; pp. 210–214. [Google Scholar]
  25. Mirjalili, S. Biogeography-Based Optimisation. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 57–72. [Google Scholar]
  26. Gigerenzer, G.; Gaissmaier, W. Heuristic decision making. Annu. Rev. Psychol. 2011, 62, 451–482. [Google Scholar] [CrossRef] [Green Version]
  27. Lim, S.M.; Leong, K.Y. A Brief Survey on Intelligent Swarm-Based Algorithms for Solving Optimization Problems. In Nature-inspired Methods for Stochastic, Robust and Dynamic Optimization; IntechOpen: London, UK, 2018. [Google Scholar]
  28. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  29. Yang, X.-S. Firefly algorithm, stochastic test functions and design optimization. arXiv 2010, arXiv:1003.1409. [Google Scholar]
  30. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  31. Karaboga, D.; Basturk, B. Artificial bee colony (ABC) optimization algorithm for solving constrained optimization. In Problems, LNCS: Advances in Soft Computing: Foundations of Fuzzy Logic and Soft Computing; Springer: Berlin/Heidelberg, Germany, 2007. [Google Scholar]
  32. Yang, X.-S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  33. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70. [Google Scholar] [CrossRef]
  34. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  35. Mucherino, A.; Seref, O. Monkey search: A novel metaheuristic search for global optimization. AIP Conf. Proc. 2007, 953, 162–173. [Google Scholar]
  36. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Malik, O.P. GO: Group Optimization. Gazi Univ. J. Sci. 2020, 33, 381–392. [Google Scholar] [CrossRef]
  37. Neshat, M.; Sepidnam, G.; Sargolzaei, M.; Toosi, A.N. Artificial fish swarm algorithm: A survey of the state-of-the-art, hybridization, combinatorial and indicative applications. Artif. Intell. Rev. 2014, 42, 965–997. [Google Scholar] [CrossRef]
  38. Oftadeh, R.; Mahjoob, M.; Shariatpanahi, M. A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search. Comput. Math. Appl. 2010, 60, 2087–2098. [Google Scholar] [CrossRef] [Green Version]
  39. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  40. Shiqin, Y.; Jianjun, J.; Guangxing, Y. A dolphin partner optimization. In Proceedings of the Global Congress on Intelligent Systems, Xiamen, China, 19–21 May 2009; pp. 124–128. [Google Scholar]
  41. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Ehsanifar, A.; Dehghani, A. OSA: Orientation Search Algorithm. Int. J. Ind. Electron. Control Optim. 2019, 2, 99–112. [Google Scholar]
  42. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Dhiman, G.; Kumar, V. BOSA: Binary Orientation Search Algorithm. Int. J. Innov. Technol. Explor. Eng. (IJITEE) 2019, 9, 5306–5310. [Google Scholar]
  43. Dehghani, M.; Montazeri, Z.; Malik, O.P. DGO: Dice Game Optimizer. Gazi Univ. J. Sci. 2019, 32, 871–882. [Google Scholar] [CrossRef] [Green Version]
  44. Mohammad, D.; Zeinab, M.; Malik, O.P.; Givi, H.; Guerrero, J.M. Shell Game Optimization: A Novel Game-Based Algorithm. Int. J. Intell. Eng. Syst. 2020, 13, 246–255. [Google Scholar]
  45. Dehghani, M.; Montazeri, Z.; Saremi, S.; Dehghani, A.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M. HOGO: Hide Objects Game Optimization. Int. J. Intell. Eng. Syst. 2020, 13, 216–225. [Google Scholar] [CrossRef]
  46. Dehghani, M.; Mardaneh, M.; Malik, O.P.; NouraeiPour, S.M. DTO: Donkey Theorem Optimization. In Proceedings of the 2019 27th Iranian Conference on Electrical Engineering (ICEE), Yazd, Iran, 30 April–2 May 2019; pp. 1855–1859. [Google Scholar]
  47. Dehghani, M.; Mardaneh, M.; Malik, O. FOA: ‘Following’Optimization Algorithm for solving Power engineering optimization problems. J. Oper. Autom. Power Eng. 2020, 8, 57–64. [Google Scholar]
  48. Dhiman, G.; Garg, M.; Nagar, A.K.; Kumar, V.; Dehghani, M. A Novel Algorithm for Global Optimization: Rat Swarm Optimizer. J. Ambient Intell. Humaniz. Comput. 2020, 1, 1–6. [Google Scholar]
  49. Dehghani, M.; Montazeri, Z.; Givi, H.; Guerrero, J.M.; Dhiman, G. Darts Game Optimizer: A New Optimization Technique Based on Darts Game. Int. J. Intell. Eng. Syst. 2020, 13, 286–294. [Google Scholar]
  50. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Kumar, V. Football Game Based Optimization: An Application to Solve Energy Commitment Problem. Int. J. Intell. Eng. Syst. 2020, 13, 514–523. [Google Scholar] [CrossRef]
  51. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  52. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  53. Zhang, H.; Hui, Q. A Coupled Spring Forced Bat Searching Algorithm: Design, Analysis and Evaluation. In Proceedings of the 2020 American Control Conference (ACC), Denver, CO, USA, 1–3 July 2020; pp. 5016–5021. [Google Scholar]
  54. Wang, Z.-J.; Zhan, Z.-H.; Kwong, S.; Jin, H.; Zhang, J. Adaptive Granularity Learning Distributed Particle Swarm Optimization for Large-Scale Optimization. IEEE Trans. Cybern. 2020, 1–14. [Google Scholar] [CrossRef]
  55. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Ramirez-Mendoza, R.A.; Samet, H.; Guerrero, J.M.; Dhiman, G. MLO: Multi Leader Optimizer. Int. J. Intell. Eng. Syst. 2020, 1, 1–11. [Google Scholar]
  56. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Ramirez-Mendoza, R.A.; Matas, J.; Vasquez, J.C.; Parra-Arroyo, L. A New “Doctor and Patient” Optimization Algorithm: An Application to Energy Commitment Problem. Appl. Sci. 2020, 10, 5791. [Google Scholar] [CrossRef]
  57. Dhiman, G.; Kumar, V. Emperor Penguin Optimizer: A Bio-inspired Algorithm for Engineering Problems. Knowl. Based Syst. 2018, 159, 20–50. [Google Scholar] [CrossRef]
  58. Karkalos, N.E.; Markopoulos, A.P.; Davim, J.P. Evolutionary-Based Methods. In Computational Methods for Application in Industry 4.0; Springer: Berlin/Heidelberg, Germany, 2019; pp. 11–31. [Google Scholar]
  59. Mirjalili, S. Introduction to Evolutionary Single-Objective Optimisation. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 3–14. [Google Scholar]
  60. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  61. Bageley, J. The Behavior of Adaptive Systems Which Employ Genetic and Correlation Algorithms. Ph.D. Thesis, University of Michigan, Ann Arbor, MI, USA, 1967. [Google Scholar]
  62. Bose, A.; Biswas, T.; Kuila, P. A Novel Genetic Algorithm Based Scheduling for Multi-core Systems. In Smart Innovations in Communication and Computational Sciences; Springer: Berlin/Heidelberg, Germany, 2019; pp. 45–54. [Google Scholar]
  63. Das, S.; Suganthan, P.N. Differential evolution: A survey of the state-of-the-art. IEEE Trans. Evol. Comput. 2011, 15, 4–31. [Google Scholar] [CrossRef]
  64. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces; ICSI: Berkeley, CA, USA, 1995. [Google Scholar]
  65. Chakraborty, U.K. Advances in Differential Evolution; Springer: Berlin/Heidelberg, Germany, 2008; Volume 143. [Google Scholar]
  66. Fogel, L.J.; Owens, A.J.; Walsh, M.J. Artificial Intelligence through Simulated Evolution; Wiley: New York, NY, USA, 1966. [Google Scholar]
  67. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
  68. Deng, W.; Liu, H.; Xu, J.; Zhao, H.; Song, Y. An improved quantum-inspired differential evolution algorithm for deep belief network. IEEE Trans. Instrum. Meas. 2020. [Google Scholar] [CrossRef]
  69. Koza, J.R. Genetic programming as a means for programming computers by natural selection. Stat. Comput. 1994, 4, 87–112. [Google Scholar] [CrossRef]
  70. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52. [Google Scholar] [CrossRef]
  71. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar]
  72. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  73. Rashedi, E.; Rashedi, E.; Nezamabadi-pour, H. A comprehensive survey on gravitational search algorithm. Swarm Evol. Comput. 2018, 41, 141–158. [Google Scholar] [CrossRef]
  74. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  75. Shah-Hosseini, H. Principal components analysis by the galaxy-based search algorithm: A novel metaheuristic for continuous optimisation. Int. J. Comput. Sci. Eng. 2011, 6, 132–140. [Google Scholar]
  76. Moghaddam, F.F.; Moghaddam, R.F.; Cheriet, M. Curved space optimization: A random search based on general relativity theory. arXiv 2012, arXiv:1208.2214. [Google Scholar]
  77. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray optimization. Comput. Struct. 2012, 112, 283–294. [Google Scholar] [CrossRef]
  78. Alatas, B. ACROA: Artificial chemical reaction optimization algorithm for global optimization. Expert Syst. Appl. 2011, 38, 13170–13180. [Google Scholar] [CrossRef]
  79. Du, H.; Wu, X.; Zhuang, J. Small-world optimization algorithm for function optimization. In International Conference on Natural Computation; Springer: Berlin/Heidelberg, Germany, 2006; pp. 264–273. [Google Scholar]
  80. Formato, R.A. Central force optimization: A new nature inspired computational framework for multidimensional search and optimization. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2007); Springer: Berlin/Heidelberg, Germany, 2008; pp. 221–238. [Google Scholar]
  81. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  82. Erol, O.K.; Eksin, I. A new optimization method: Big bang–big crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  83. Halliday, D.; Resnick, R.; Walker, J. Fundamentals of Physics; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  84. Eiben, A.E.; Schippers, C.A. On evolutionary exploration and exploitation. Fundam. Inform. 1998, 35, 35–50. [Google Scholar] [CrossRef]
  85. Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar]
  86. Chen, Q.; Liu, B.; Zhang, Q.; Liang, J.; Suganthan, P.; Qu, B. Problem Definition and Evaluation Criteria for CEC 2015 Special Session and Competition on Bound Constrained Single-Objective Computationally Expensive Numerical Optimization; Technical Report; Computational Intelligence Laboratory, Zhengzhou University, China and Nanyang Technological University: Singapore, 2014. [Google Scholar]
  87. Tang, K.-S.; Man, K.-F.; Kwong, S.; He, Q. Genetic algorithms and their applications. IEEE Signal Process. Mag. 1996, 13, 22–37. [Google Scholar] [CrossRef]
  88. Sarzaeim, P.; Bozorg-Haddad, O.; Chu, X. Teaching-Learning-Based Optimization (TLBO) Algorithm. In Advanced Optimization by Nature-Inspired Algorithms; Springer: Berlin/Heidelberg, Germany, 2018; pp. 51–58. [Google Scholar]
  89. Kannan, B.; Kramer, S.N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J. Mech. Des. 1994, 116, 405–411. [Google Scholar] [CrossRef]
  90. Gandomi, A.H.; Yang, X.-S. Benchmark problems in structural optimization. In Computational Optimization, Methods and Algorithms; Springer: Berlin/Heidelberg, Germany, 2011; pp. 259–281. [Google Scholar]
  91. Mezura-Montes, E.; Coello, C.A.C. Useful infeasible solutions in engineering optimization with evolutionary algorithms. In Mexican International Conference on Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 2005; pp. 652–662. [Google Scholar]
  92. Rao, B.R.; Tiwari, R. Optimum design of rolling element bearings using genetic algorithms. Mech. Mach. Theory 2007, 42, 233–250. [Google Scholar]
Figure 1. An isolated system composed of an object and spring forces.
Figure 2. Flowchart of the spring search algorithm.
Figure 3. Forces in the system of objects and springs.
Table 1. Results for the SSA and other algorithms on the unimodal test functions (Ave: average; std: standard deviation over independent runs).

| Function | Metric | GA | PSO | GSA | TLBO | GOA | GWO | SHO | EPO | SSA |
|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Ave | 1.95 × 10⁻¹² | 4.98 × 10⁻⁹ | 1.16 × 10⁻¹⁶ | 3.55 × 10⁻² | 2.81 × 10⁻¹ | 7.86 × 10⁻¹⁰ | 4.61 × 10⁻²³ | 5.71 × 10⁻²⁸ | 6.74 × 10⁻³⁵ |
| | std | 2.01 × 10⁻¹¹ | 1.40 × 10⁻⁸ | 6.10 × 10⁻¹⁷ | 1.06 × 10⁻¹ | 1.11 × 10⁻¹ | 8.11 × 10⁻⁹ | 7.37 × 10⁻²³ | 8.31 × 10⁻²⁹ | 9.17 × 10⁻³⁶ |
| F2 | Ave | 6.53 × 10⁻¹⁸ | 7.29 × 10⁻⁴ | 1.70 × 10⁻¹ | 3.23 × 10⁻⁵ | 3.96 × 10⁻¹ | 5.99 × 10⁻²⁰ | 1.20 × 10⁻³⁴ | 6.20 × 10⁻⁴⁰ | 7.78 × 10⁻⁴⁵ |
| | std | 5.10 × 10⁻¹⁷ | 1.84 × 10⁻³ | 9.29 × 10⁻¹ | 8.57 × 10⁻⁵ | 1.41 × 10⁻¹ | 1.11 × 10⁻¹⁷ | 1.30 × 10⁻³⁴ | 3.32 × 10⁻⁴⁰ | 3.48 × 10⁻⁴⁵ |
| F3 | Ave | 7.70 × 10⁻¹⁰ | 1.40 × 10¹ | 4.16 × 10² | 4.91 × 10³ | 4.31 × 10¹ | 9.19 × 10⁻⁵ | 1.00 × 10⁻¹⁴ | 2.05 × 10⁻¹⁹ | 2.63 × 10⁻²⁵ |
| | std | 7.36 × 10⁻⁹ | 7.13 × 10⁰ | 1.56 × 10² | 3.89 × 10³ | 8.97 × 10⁰ | 6.16 × 10⁻⁴ | 4.10 × 10⁻¹⁴ | 9.17 × 10⁻²⁰ | 9.83 × 10⁻²⁷ |
| F4 | Ave | 9.17 × 10¹ | 6.00 × 10⁻¹ | 1.12 × 10⁰ | 1.87 × 10¹ | 8.80 × 10⁻¹ | 8.73 × 10⁻¹ | 2.02 × 10⁻¹⁴ | 4.32 × 10⁻¹⁸ | 4.65 × 10⁻²⁶ |
| | std | 5.67 × 10¹ | 1.72 × 10⁻¹ | 9.89 × 10⁻¹ | 8.21 × 10⁰ | 2.50 × 10⁻¹ | 1.19 × 10⁻¹ | 2.43 × 10⁻¹⁴ | 3.98 × 10⁻¹⁹ | 4.68 × 10⁻²⁹ |
| F5 | Ave | 5.57 × 10² | 4.93 × 10¹ | 3.85 × 10¹ | 7.37 × 10² | 1.18 × 10² | 8.91 × 10² | 2.79 × 10¹ | 5.07 × 10⁰ | 5.41 × 10⁻¹ |
| | std | 4.16 × 10¹ | 3.89 × 10¹ | 3.47 × 10¹ | 1.98 × 10³ | 1.43 × 10² | 2.97 × 10² | 1.84 × 10⁰ | 4.90 × 10⁻¹ | 5.05 × 10⁻² |
| F6 | Ave | 3.15 × 10⁻¹ | 9.23 × 10⁻⁹ | 1.08 × 10⁻¹⁶ | 4.88 × 10⁰ | 3.15 × 10⁻¹ | 8.18 × 10⁻¹⁷ | 6.58 × 10⁻¹ | 7.01 × 10⁻¹⁹ | 8.03 × 10⁻²⁴ |
| | std | 9.98 × 10⁻² | 1.78 × 10⁻⁸ | 4.00 × 10⁻¹⁷ | 9.75 × 10⁻¹ | 9.98 × 10⁻² | 1.70 × 10⁻¹⁸ | 3.38 × 10⁻¹ | 4.39 × 10⁻²⁰ | 5.22 × 10⁻²⁶ |
| F7 | Ave | 6.79 × 10⁻⁴ | 6.92 × 10⁻² | 7.68 × 10⁻¹ | 3.88 × 10⁻² | 2.02 × 10⁻² | 5.37 × 10⁻¹ | 7.80 × 10⁻⁴ | 2.71 × 10⁻⁵ | 3.33 × 10⁻⁸ |
| | std | 3.29 × 10⁻³ | 2.87 × 10⁻² | 2.77 × 10⁰ | 5.79 × 10⁻² | 7.43 × 10⁻³ | 1.89 × 10⁻¹ | 3.85 × 10⁻⁴ | 9.26 × 10⁻⁶ | 1.18 × 10⁻⁶ |
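The function indices F1–F7 follow the classic benchmark suite of Yao et al. [85]. Assuming that suite, two representative unimodal objectives are the sphere function (F1) and the Rosenbrock function (F5); both have a global minimum of 0, so averages nearer zero indicate better convergence:

```latex
% Representative unimodal benchmarks, assuming the classic suite of Yao et al. [85].
\[
  F_1(\mathbf{x}) = \sum_{i=1}^{n} x_i^2, \qquad
  F_5(\mathbf{x}) = \sum_{i=1}^{n-1} \left[ 100\left(x_{i+1} - x_i^2\right)^2 + \left(x_i - 1\right)^2 \right].
\]
```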
Table 2. Results for the SSA and other algorithms on the multimodal test functions (Ave: average; std: standard deviation).

| Function | Metric | GA | PSO | GSA | TLBO | GOA | GWO | SHO | EPO | SSA |
|---|---|---|---|---|---|---|---|---|---|---|
| F8 | Ave | −5.11 × 10² | −5.01 × 10² | −2.75 × 10² | −3.81 × 10² | −6.92 × 10² | −4.69 × 10¹ | −6.14 × 10² | −8.76 × 10² | −1.2 × 10⁴ |
| | std | 4.37 × 10¹ | 4.28 × 10¹ | 5.72 × 10¹ | 2.83 × 10¹ | 9.19 × 10¹ | 3.94 × 10¹ | 9.32 × 10¹ | 5.92 × 10¹ | 9.14 × 10⁻¹² |
| F9 | Ave | 1.23 × 10⁻¹ | 1.20 × 10⁻¹ | 3.35 × 10¹ | 2.23 × 10¹ | 1.01 × 10² | 4.85 × 10⁻² | 4.34 × 10⁻¹ | 6.90 × 10⁻¹ | 8.76 × 10⁻⁴ |
| | std | 4.11 × 10¹ | 4.01 × 10¹ | 1.19 × 10¹ | 3.25 × 10¹ | 1.89 × 10¹ | 3.91 × 10¹ | 1.66 × 10⁰ | 4.81 × 10⁻¹ | 4.85 × 10⁻² |
| F10 | Ave | 5.31 × 10⁻¹¹ | 5.20 × 10⁻¹¹ | 8.25 × 10⁻⁹ | 1.55 × 10¹ | 1.15 × 10⁰ | 2.83 × 10⁻⁸ | 1.63 × 10⁻¹⁴ | 8.03 × 10⁻¹⁶ | 8.04 × 10⁻²⁰ |
| | std | 1.11 × 10⁻¹⁰ | 1.08 × 10⁻¹⁰ | 1.90 × 10⁻⁹ | 8.11 × 10⁰ | 7.87 × 10⁻¹ | 4.34 × 10⁻⁷ | 3.14 × 10⁻¹⁵ | 2.74 × 10⁻¹⁴ | 3.34 × 10⁻¹⁸ |
| F11 | Ave | 3.31 × 10⁻⁶ | 3.24 × 10⁻⁶ | 8.19 × 10⁰ | 3.01 × 10⁻¹ | 5.74 × 10⁻¹ | 2.49 × 10⁻⁵ | 2.29 × 10⁻³ | 4.20 × 10⁻⁵ | 4.23 × 10⁻¹⁰ |
| | std | 4.23 × 10⁻⁵ | 4.11 × 10⁻⁵ | 3.70 × 10⁰ | 2.89 × 10⁻¹ | 1.12 × 10⁻¹ | 1.34 × 10⁻⁴ | 5.24 × 10⁻³ | 4.73 × 10⁻⁴ | 5.11 × 10⁻⁷ |
| F12 | Ave | 9.16 × 10⁻⁸ | 8.93 × 10⁻⁸ | 2.65 × 10⁻¹ | 5.21 × 10¹ | 1.27 × 10⁰ | 1.34 × 10⁻⁵ | 3.93 × 10⁻² | 5.09 × 10⁻³ | 6.33 × 10⁻⁵ |
| | std | 4.88 × 10⁻⁷ | 4.77 × 10⁻⁷ | 3.14 × 10⁻¹ | 2.47 × 10² | 1.02 × 10⁰ | 6.23 × 10⁻⁴ | 2.42 × 10⁻² | 3.75 × 10⁻³ | 4.71 × 10⁻⁴ |
| F13 | Ave | 6.39 × 10⁻² | 6.26 × 10⁻² | 5.73 × 10⁻³² | 2.81 × 10² | 6.60 × 10⁻² | 9.94 × 10⁻⁸ | 4.75 × 10⁻¹ | 1.25 × 10⁻⁸ | 0.00 × 10⁰ |
| | std | 4.49 × 10⁻² | 4.39 × 10⁻² | 8.95 × 10⁻³² | 8.63 × 10² | 4.33 × 10⁻² | 2.61 × 10⁻⁷ | 2.38 × 10⁻¹ | 2.61 × 10⁻⁷ | 0.00 × 10⁰ |
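Under the same convention [85], F9 and F10 are the Rastrigin and Ackley functions; their dense fields of local minima are what make an algorithm's exploration ability visible in this table:

```latex
% Representative multimodal benchmarks, assuming the suite of [85].
\[
  F_9(\mathbf{x}) = \sum_{i=1}^{n} \left[ x_i^2 - 10\cos(2\pi x_i) + 10 \right],
\]
\[
  F_{10}(\mathbf{x}) = -20\exp\!\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right)
                       - \exp\!\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e.
\]
```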
Table 3. Results for the SSA and other algorithms on the low-dimensional multimodal test functions (Ave: average; std: standard deviation).

| Function | Metric | GA | PSO | GSA | TLBO | GOA | GWO | SHO | EPO | SSA |
|---|---|---|---|---|---|---|---|---|---|---|
| F14 | Ave | 4.39 × 10⁰ | 2.77 × 10⁰ | 3.61 × 10⁰ | 6.79 × 10⁰ | 9.98 × 10¹ | 1.26 × 10⁰ | 3.71 × 10⁰ | 1.08 × 10⁰ | 9.98 × 10⁻¹ |
| | std | 4.41 × 10⁻² | 2.32 × 10⁰ | 2.96 × 10⁰ | 1.12 × 10⁰ | 9.14 × 10⁻¹ | 6.86 × 10⁻¹ | 3.86 × 10⁰ | 4.11 × 10⁻² | 7.64 × 10⁻¹² |
| F15 | Ave | 7.36 × 10⁻² | 9.09 × 10⁻³ | 6.84 × 10⁻² | 5.15 × 10⁻² | 7.15 × 10⁻² | 1.01 × 10⁻² | 3.66 × 10⁻² | 8.21 × 10⁻³ | 3.3 × 10⁻⁴ |
| | std | 2.39 × 10⁻³ | 2.38 × 10⁻³ | 7.37 × 10⁻² | 3.45 × 10⁻³ | 1.26 × 10⁻¹ | 3.75 × 10⁻³ | 7.60 × 10⁻² | 4.09 × 10⁻³ | 1.25 × 10⁻⁵ |
| F16 | Ave | −1.02 × 10⁰ | −1.02 × 10⁰ | −1.02 × 10⁰ | −1.01 × 10⁰ | −1.02 × 10⁰ | −1.02 × 10⁰ | −1.02 × 10⁰ | −1.02 × 10⁰ | −1.03 × 10⁰ |
| | std | 4.19 × 10⁻⁷ | 0.00 × 10⁰ | 0.00 × 10⁰ | 3.64 × 10⁻⁸ | 4.74 × 10⁻⁸ | 3.23 × 10⁻⁵ | 7.02 × 10⁻⁹ | 9.80 × 10⁻⁷ | 5.12 × 10⁻¹⁰ |
| F17 | Ave | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ | 3.98 × 10⁻¹ |
| | std | 3.71 × 10⁻¹⁷ | 9.03 × 10⁻¹⁶ | 1.13 × 10⁻¹⁶ | 9.45 × 10⁻¹⁵ | 1.15 × 10⁻⁷ | 7.61 × 10⁻⁴ | 7.00 × 10⁻⁷ | 5.39 × 10⁻⁵ | 4.56 × 10⁻²¹ |
| F18 | Ave | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ | 3.00 × 10⁰ |
| | std | 6.33 × 10⁻⁷ | 6.59 × 10⁻⁵ | 3.24 × 10⁻² | 1.94 × 10⁻¹⁰ | 1.48 × 10¹ | 2.25 × 10⁻⁵ | 7.16 × 10⁻⁶ | 1.15 × 10⁻⁸ | 1.15 × 10⁻¹⁸ |
| F19 | Ave | −3.81 × 10⁰ | −3.80 × 10⁰ | −3.86 × 10⁰ | −3.73 × 10⁰ | −3.77 × 10⁰ | −3.75 × 10⁰ | −3.84 × 10⁰ | −3.86 × 10⁰ | −3.86 × 10⁰ |
| | std | 4.37 × 10⁻¹⁰ | 3.37 × 10⁻¹⁵ | 4.15 × 10⁻¹ | 9.69 × 10⁻⁴ | 3.53 × 10⁻⁷ | 2.55 × 10⁻³ | 1.57 × 10⁻³ | 6.50 × 10⁻⁷ | 5.61 × 10⁻¹⁰ |
| F20 | Ave | −2.39 × 10⁰ | −3.32 × 10⁰ | −1.47 × 10⁰ | −2.17 × 10⁰ | −3.23 × 10⁰ | −2.84 × 10⁰ | −3.27 × 10⁰ | −2.81 × 10⁰ | −3.31 × 10⁰ |
| | std | 4.37 × 10⁻¹ | 2.66 × 10⁻¹ | 5.32 × 10⁻¹ | 1.64 × 10⁻¹ | 5.37 × 10⁻² | 3.71 × 10⁻¹ | 7.27 × 10⁻² | 7.11 × 10⁻¹ | 4.29 × 10⁻⁵ |
| F21 | Ave | −5.19 × 10⁰ | −7.54 × 10⁰ | −4.57 × 10⁰ | −7.33 × 10⁰ | −7.38 × 10⁰ | −2.28 × 10⁰ | −9.65 × 10⁰ | −8.07 × 10⁰ | −10.15 × 10⁰ |
| | std | 2.34 × 10⁰ | 2.77 × 10⁰ | 1.30 × 10⁰ | 1.29 × 10⁰ | 2.91 × 10⁰ | 1.80 × 10⁰ | 1.54 × 10⁰ | 2.29 × 10⁰ | 1.25 × 10⁻² |
| F22 | Ave | −2.97 × 10⁰ | −8.55 × 10⁰ | −6.58 × 10⁰ | −1.00 × 10⁰ | −8.50 × 10⁰ | −3.99 × 10⁰ | −1.04 × 10⁰ | −10.01 × 10⁰ | −10.40 × 10⁰ |
| | std | 1.37 × 10⁻² | 3.08 × 10⁰ | 2.64 × 10⁰ | 2.89 × 10⁻⁴ | 3.02 × 10⁰ | 1.99 × 10⁰ | 2.73 × 10⁻⁴ | 3.97 × 10⁻² | 3.65 × 10⁻⁷ |
| F23 | Ave | −3.10 × 10⁰ | −9.19 × 10⁰ | −9.37 × 10⁰ | −2.46 × 10⁰ | −8.41 × 10⁰ | −4.49 × 10⁰ | −1.05 × 10¹ | −3.41 × 10⁰ | −10.53 × 10⁰ |
| | std | 2.37 × 10⁰ | 2.52 × 10⁰ | 2.75 × 10⁰ | 1.19 × 10⁰ | 3.13 × 10⁰ | 1.96 × 10⁰ | 1.81 × 10⁻⁴ | 1.11 × 10⁻² | 5.26 × 10⁻⁶ |
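Again assuming the suite of [85], F21–F23 are the Shekel functions with m = 5, 7, and 10 minima; their global optima are approximately −10.15, −10.40, and −10.53, which the SSA averages in Table 3 reach almost exactly:

```latex
% Shekel family, assuming the suite of [85]; a_i and c_i are fixed parameters.
\[
  F_{21,22,23}(\mathbf{x}) = -\sum_{i=1}^{m} \left[ (\mathbf{x}-\mathbf{a}_i)(\mathbf{x}-\mathbf{a}_i)^{\mathsf{T}} + c_i \right]^{-1},
  \qquad m \in \{5, 7, 10\}.
\]
```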
Table 4. Results for the SSA and other algorithms on the CEC 2015 benchmark functions [86] (Ave: average; std: standard deviation).

| Function | Metric | GA | PSO | GSA | TLBO | GOA | GWO | SHO | EPO | SSA |
|---|---|---|---|---|---|---|---|---|---|---|
| CEC-1 | Ave | 8.89 × 10⁶ | 3.20 × 10⁷ | 7.65 × 10⁶ | 1.47 × 10⁶ | 4.37 × 10⁵ | 2.02 × 10⁶ | 2.28 × 10⁶ | 6.06 × 10⁵ | 1.50 × 10⁵ |
| | std | 6.95 × 10⁶ | 8.37 × 10⁶ | 3.07 × 10⁶ | 2.63 × 10⁶ | 4.73 × 10⁵ | 2.08 × 10⁶ | 2.18 × 10⁶ | 5.02 × 10⁵ | 1.21 × 10⁵ |
| CEC-2 | Ave | 2.97 × 10⁵ | 6.58 × 10³ | 7.33 × 10⁸ | 1.97 × 10⁴ | 9.41 × 10³ | 5.65 × 10⁶ | 3.13 × 10⁵ | 1.43 × 10⁴ | 4.40 × 10³ |
| | std | 2.85 × 10³ | 1.09 × 10³ | 2.33 × 10⁸ | 1.46 × 10⁴ | 1.08 × 10⁴ | 6.03 × 10⁶ | 4.19 × 10⁵ | 1.03 × 10⁴ | 1.34 × 10⁸ |
| CEC-3 | Ave | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² | 3.20 × 10² |
| | std | 2.78 × 10⁻² | 1.11 × 10⁻⁵ | 7.53 × 10⁻² | 3.19 × 10⁻² | 9.14 × 10⁻² | 8.61 × 10⁻² | 7.08 × 10⁻² | 3.76 × 10⁻² | 1.16 × 10⁻⁶ |
| CEC-4 | Ave | 6.99 × 10² | 4.39 × 10² | 4.42 × 10² | 4.18 × 10² | 4.26 × 10² | 4.09 × 10² | 4.16 × 10² | 4.11 × 10² | 4.04 × 10² |
| | std | 6.43 × 10⁰ | 7.25 × 10⁰ | 7.72 × 10⁰ | 1.03 × 10¹ | 1.17 × 10¹ | 3.96 × 10⁰ | 1.03 × 10¹ | 1.71 × 10¹ | 5.61 × 10⁰ |
| CEC-5 | Ave | 1.26 × 10³ | 1.75 × 10³ | 1.76 × 10³ | 1.09 × 10³ | 1.33 × 10³ | 8.65 × 10² | 9.20 × 10² | 9.13 × 10² | 9.81 × 10² |
| | std | 1.86 × 10² | 2.79 × 10² | 2.30 × 10² | 2.81 × 10² | 3.45 × 10² | 2.16 × 10² | 1.78 × 10² | 1.85 × 10² | 1.06 × 10² |
| CEC-6 | Ave | 2.91 × 10⁵ | 3.91 × 10⁶ | 2.30 × 10⁴ | 3.82 × 10³ | 7.35 × 10³ | 1.86 × 10³ | 2.26 × 10⁴ | 1.29 × 10⁴ | 1.05 × 10³ |
| | std | 1.67 × 10⁵ | 2.70 × 10⁶ | 2.41 × 10⁴ | 2.44 × 10³ | 3.82 × 10³ | 1.93 × 10³ | 2.45 × 10⁴ | 1.15 × 10⁴ | 1.05 × 10³ |
| CEC-7 | Ave | 7.08 × 10² | 7.08 × 10² | 7.06 × 10² | 7.02 × 10² | 7.02 × 10² | 7.02 × 10² | 7.02 × 10² | 7.02 × 10² | 7.02 × 10² |
| | std | 2.97 × 10⁰ | 1.32 × 10⁰ | 9.07 × 10⁻¹ | 9.40 × 10⁻¹ | 1.10 × 10⁰ | 7.75 × 10⁻¹ | 7.07 × 10⁻¹ | 6.76 × 10⁻¹ | 5.50 × 10⁻¹ |
| CEC-8 | Ave | 5.79 × 10⁴ | 6.07 × 10⁵ | 6.73 × 10³ | 2.58 × 10³ | 9.93 × 10³ | 3.43 × 10³ | 3.49 × 10³ | 1.86 × 10³ | 1.47 × 10³ |
| | std | 2.76 × 10⁴ | 4.81 × 10⁵ | 3.36 × 10³ | 1.61 × 10³ | 8.74 × 10³ | 2.77 × 10³ | 2.04 × 10³ | 1.98 × 10³ | 1.34 × 10³ |
| CEC-9 | Ave | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ | 1.00 × 10³ |
| | std | 3.97 × 10⁰ | 5.33 × 10⁰ | 9.79 × 10⁻¹ | 5.29 × 10⁻² | 2.20 × 10⁻¹ | 7.23 × 10⁻² | 1.28 × 10⁻¹ | 1.43 × 10⁻¹ | 1.51 × 10⁻³ |
| CEC-10 | Ave | 4.13 × 10⁴ | 3.42 × 10⁵ | 9.91 × 10³ | 2.62 × 10³ | 8.39 × 10³ | 3.27 × 10³ | 4.00 × 10³ | 2.00 × 10³ | 1.23 × 10³ |
| | std | 2.39 × 10⁴ | 1.74 × 10⁵ | 8.83 × 10³ | 1.78 × 10³ | 1.12 × 10⁴ | 1.84 × 10³ | 2.82 × 10³ | 2.73 × 10³ | 1.51 × 10³ |
| CEC-11 | Ave | 1.36 × 10³ | 1.41 × 10³ | 1.35 × 10³ | 1.39 × 10³ | 1.37 × 10³ | 1.35 × 10³ | 1.40 × 10³ | 1.38 × 10³ | 1.35 × 10³ |
| | std | 5.39 × 10¹ | 7.73 × 10¹ | 1.11 × 10² | 5.42 × 10¹ | 8.97 × 10¹ | 1.12 × 10² | 5.81 × 10¹ | 2.42 × 10¹ | 1.01 × 10¹ |
| CEC-12 | Ave | 1.31 × 10³ | 1.31 × 10³ | 1.31 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ |
| | std | 1.65 × 10⁰ | 2.05 × 10⁰ | 1.54 × 10⁰ | 8.07 × 10⁻¹ | 9.14 × 10⁻¹ | 6.94 × 10⁻¹ | 6.69 × 10⁻¹ | 7.89 × 10⁻¹ | 1.50 × 10⁻¹ |
| CEC-13 | Ave | 1.35 × 10³ | 1.35 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ | 1.30 × 10³ |
| | std | 3.97 × 10¹ | 4.70 × 10¹ | 3.78 × 10⁻³ | 2.43 × 10⁻⁴ | 1.04 × 10⁻³ | 5.44 × 10⁻³ | 1.92 × 10⁻⁴ | 2.76 × 10⁻⁴ | 6.43 × 10⁻⁵ |
| CEC-14 | Ave | 8.96 × 10³ | 9.30 × 10³ | 7.51 × 10³ | 7.34 × 10³ | 7.60 × 10³ | 7.10 × 10³ | 7.29 × 10³ | 4.25 × 10³ | 3.22 × 10³ |
| | std | 6.32 × 10³ | 4.04 × 10² | 1.52 × 10³ | 2.47 × 10³ | 1.29 × 10³ | 3.12 × 10³ | 2.45 × 10³ | 1.73 × 10³ | 2.12 × 10² |
| CEC-15 | Ave | 1.63 × 10³ | 1.64 × 10³ | 1.62 × 10³ | 1.60 × 10³ | 1.61 × 10³ | 1.60 × 10³ | 1.61 × 10³ | 1.60 × 10³ | 1.60 × 10³ |
| | std | 3.67 × 10¹ | 1.12 × 10¹ | 3.64 × 10⁰ | 1.80 × 10⁻² | 1.13 × 10¹ | 2.66 × 10⁰ | 4.94 × 10⁰ | 3.76 × 10⁰ | 5.69 × 10⁻¹ |
Table 5. Comparison results for the pressure vessel design problem.

| Algorithm | Ts | Th | R | L | Optimum cost |
|---|---|---|---|---|---|
| SSA | 0.778099 | 0.383241 | 40.315121 | 200.00000 | 5880.0700 |
| EPO | 0.778210 | 0.384889 | 40.315040 | 200.00000 | 5885.5773 |
| SHO | 0.779035 | 0.384660 | 40.327793 | 199.65029 | 5889.3689 |
| GOA | 0.778961 | 0.384683 | 40.320913 | 200.00000 | 5891.3879 |
| GWO | 0.845719 | 0.418564 | 43.816270 | 156.38164 | 6011.5148 |
| TLBO | 0.817577 | 0.417932 | 41.74939 | 183.57270 | 6137.3724 |
| GSA | 1.085800 | 0.949614 | 49.345231 | 169.48741 | 11,550.2976 |
| PSO | 0.752362 | 0.399540 | 40.452514 | 198.00268 | 5890.3279 |
| GA | 1.099523 | 0.906579 | 44.456397 | 179.65887 | 6550.0230 |
Table 6. Statistical results for the pressure vessel design problem.

| Algorithm | Best | Mean | Worst | Std. dev. | Median |
|---|---|---|---|---|---|
| SSA | 5880.0700 | 5891.3099 | – | 24.341 | 5883.5153 |
| EPO | 5885.5773 | 5887.4441 | 5892.3207 | 2.893 | 5886.2282 |
| SHO | 5889.3689 | 5891.5247 | 5894.6238 | 13.910 | 5890.6497 |
| GOA | 5891.3879 | 6531.5032 | 7394.5879 | 534.119 | 6416.1138 |
| GWO | 6011.5148 | 6477.3050 | 7250.9170 | 327.007 | 6397.4805 |
| TLBO | 6137.3724 | 6326.7606 | 6512.3541 | 126.609 | 6318.3179 |
| GSA | 11,550.2976 | 23,342.2909 | 33,226.2526 | 5790.625 | 24,010.0415 |
| PSO | 5890.3279 | 6264.0053 | 7005.7500 | 496.128 | 6112.6899 |
| GA | 6550.0230 | 6643.9870 | 8005.4397 | 657.523 | 7586.0085 |
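For context, the pressure vessel problem minimizes fabrication cost over the shell thickness Ts, head thickness Th, inner radius R, and vessel length L. The widely used formulation, going back to Kannan and Kramer [89], is reproduced below; the exact variant used in the experiments is the one stated in the paper:

```latex
% Widely used pressure vessel formulation [89]; reproduced here for context.
\[
  \min\; f(T_s, T_h, R, L) = 0.6224\,T_s R L + 1.7781\,T_h R^2 + 3.1661\,T_s^2 L + 19.84\,T_s^2 R
\]
% subject to (in the commonly used constraint set):
\[
  T_s \ge 0.0193\,R, \qquad T_h \ge 0.00954\,R, \qquad
  \pi R^2 L + \tfrac{4}{3}\pi R^3 \ge 1{,}296{,}000, \qquad L \le 240.
\]
```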
Table 7. Comparison results for the speed reducer design problem.

| Algorithm | b | m | p | l1 | l2 | d1 | d2 | Optimum cost |
|---|---|---|---|---|---|---|---|---|
| SSA | 3.50123 | 0.7 | 17 | 7.3 | 7.8 | 3.33421 | 5.26536 | 2994.2472 |
| EPO | 3.50159 | 0.7 | 17 | 7.3 | 7.8 | 3.35127 | 5.28874 | 2998.5507 |
| SHO | 3.506690 | 0.7 | 17 | 7.380933 | 7.815726 | 3.357847 | 5.286768 | 3001.288 |
| GOA | 3.500019 | 0.7 | 17 | 8.3 | 7.8 | 3.352412 | 5.286715 | 3005.763 |
| GWO | 3.508502 | 0.7 | 17 | 7.392843 | 7.816034 | 3.358073 | 5.286777 | 3002.928 |
| TLBO | 3.508755 | 0.7 | 17 | 7.3 | 7.8 | 3.461020 | 5.289213 | 3030.563 |
| GSA | 3.600000 | 0.7 | 17 | 8.3 | 7.8 | 3.369658 | 5.289224 | 3051.120 |
| PSO | 3.510253 | 0.7 | 17 | 8.35 | 7.8 | 3.362201 | 5.287723 | 3067.561 |
| GA | 3.520124 | 0.7 | 17 | 8.37 | 7.8 | 3.366970 | 5.288719 | 3029.002 |
Table 8. Statistical results for the speed reducer design problem.

| Algorithm | Best | Mean | Worst | Std. dev. | Median |
|---|---|---|---|---|---|
| SSA | 2994.2472 | 2997.482 | 2999.092 | 1.78091 | 2996.318 |
| EPO | 2998.5507 | 2999.640 | 3003.889 | 1.93193 | 2999.187 |
| SHO | 3001.288 | 3005.845 | 3008.752 | 5.83794 | 3004.519 |
| GOA | 3005.763 | 3105.252 | 3211.174 | 79.6381 | 3105.252 |
| GWO | 3002.928 | 3028.841 | 3060.958 | 13.0186 | 3027.031 |
| TLBO | 3030.563 | 3065.917 | 3104.779 | 18.0742 | 3065.609 |
| GSA | 3051.120 | 3170.334 | 3363.873 | 92.5726 | 3156.752 |
| PSO | 3067.561 | 3186.523 | 3313.199 | 17.1186 | 3198.187 |
| GA | 3029.002 | 3295.329 | 3619.465 | 57.0235 | 3288.657 |
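The speed reducer (gear train) problem [90,91] minimizes the weight of the reducer over the seven variables of Table 7: face width b, module m, number of pinion teeth p, shaft lengths l1 and l2, and shaft diameters d1 and d2. A commonly cited form of the objective is:

```latex
% Commonly cited speed reducer objective [90,91]; the constraint set is omitted here.
\[
  \min\; f = 0.7854\,b\,m^2\left(3.3333\,p^2 + 14.9334\,p - 43.0934\right)
           - 1.508\,b\left(d_1^2 + d_2^2\right)
           + 7.4777\left(d_1^3 + d_2^3\right)
           + 0.7854\left(l_1 d_1^2 + l_2 d_2^2\right).
\]
```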
Table 9. Comparison results for the welded beam design problem.

| Algorithm | h | l | t | b | Optimum cost |
|---|---|---|---|---|---|
| SSA | 0.205411 | 3.472341 | 9.035215 | 0.201153 | 1.723589 |
| EPO | 0.205563 | 3.474846 | 9.035799 | 0.205811 | 1.725661 |
| SHO | 0.205678 | 3.475403 | 9.036964 | 0.206229 | 1.726995 |
| GOA | 0.197411 | 3.315061 | 10.00000 | 0.201395 | 1.820395 |
| GWO | 0.205611 | 3.472103 | 9.040931 | 0.205709 | 1.725472 |
| TLBO | 0.204695 | 3.536291 | 9.004290 | 0.210025 | 1.759173 |
| GSA | 0.147098 | 5.490744 | 10.00000 | 0.217725 | 2.172858 |
| PSO | 0.164171 | 4.032541 | 10.00000 | 0.223647 | 1.873971 |
| GA | 0.206487 | 3.635872 | 10.00000 | 0.203249 | 1.836250 |
Table 10. Statistical results for the welded beam design problem.

| Algorithm | Best | Mean | Worst | Std. dev. | Median |
|---|---|---|---|---|---|
| SSA | 1.723589 | 1.725124 | 1.727211 | 0.004325 | 1.724399 |
| EPO | 1.725661 | 1.725828 | 1.726064 | 0.000287 | 1.725787 |
| SHO | 1.726995 | 1.727128 | 1.727564 | 0.001157 | 1.727087 |
| GOA | 1.820395 | 2.230310 | 3.048231 | 0.324525 | 2.244663 |
| GWO | 1.725472 | 1.729680 | 1.741651 | 0.004866 | 1.727420 |
| TLBO | 1.759173 | 1.817657 | 1.873408 | 0.027543 | 1.820128 |
| GSA | 2.172858 | 2.544239 | 3.003657 | 0.255859 | 2.495114 |
| PSO | 1.873971 | 2.119240 | 2.320125 | 0.034820 | 2.097048 |
| GA | 1.836250 | 1.363527 | 2.035247 | 0.139485 | 1.9357485 |
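The welded beam problem [90] minimizes fabrication cost over the weld thickness h, weld length l, bar height t, and bar thickness b, subject to constraints on shear stress, bending stress, buckling load, and end deflection. The commonly cited cost function is:

```latex
% Commonly cited welded beam cost function [90]; constraints omitted here.
\[
  \min\; f(h, l, t, b) = 1.10471\,h^2 l + 0.04811\,t\,b\,(14.0 + l).
\]
```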
Table 11. Comparison results for the tension/compression spring design problem.

| Algorithm | d | D | p | Optimum cost |
|---|---|---|---|---|
| SSA | 0.051087 | 0.342908 | 12.0898 | 0.012656987 |
| EPO | 0.051144 | 0.343751 | 12.0955 | 0.012674000 |
| SHO | 0.050178 | 0.341541 | 12.07349 | 0.012678321 |
| GOA | 0.05000 | 0.310414 | 15.0000 | 0.013192580 |
| GWO | 0.05000 | 0.315956 | 14.22623 | 0.012816930 |
| TLBO | 0.050780 | 0.334779 | 12.72269 | 0.012709667 |
| GSA | 0.05000 | 0.317312 | 14.22867 | 0.012873881 |
| PSO | 0.05010 | 0.310111 | 14.0000 | 0.013036251 |
| GA | 0.05025 | 0.316351 | 15.23960 | 0.012776352 |
Table 12. Statistical results for the tension/compression spring design problem.

| Algorithm | Best | Mean | Worst | Std. dev. | Median |
|---|---|---|---|---|---|
| SSA | 0.012656987 | 0.012678903 | 0.012667902 | 0.001021 | 0.012676002 |
| EPO | 0.012674000 | 0.012684106 | 0.012715185 | 0.000027 | 0.012687293 |
| SHO | 0.012678321 | 0.012697116 | 0.012720757 | 0.000041 | 0.012699686 |
| GOA | 0.013192580 | 0.014817181 | 0.017862507 | 0.002272 | 0.013192580 |
| GWO | 0.012816930 | 0.014464372 | 0.017839737 | 0.001622 | 0.014021237 |
| TLBO | 0.012709667 | 0.012839637 | 0.012998448 | 0.000078 | 0.012844664 |
| GSA | 0.012873881 | 0.013438871 | 0.014211731 | 0.000287 | 0.013367888 |
| PSO | 0.013036251 | 0.014036254 | 0.016251423 | 0.002073 | 0.013002365 |
| GA | 0.012776352 | 0.013069872 | 0.015214230 | 0.000375 | 0.012952142 |
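The tension/compression spring problem [90,91] minimizes spring weight over the wire diameter d, mean coil diameter D, and number of active coils p, subject to constraints on minimum deflection, shear stress, and surge frequency. The standard objective, with the deflection constraint shown as one representative example, is:

```latex
% Standard spring-weight objective [90,91] with one representative constraint.
\[
  \min\; f(d, D, p) = (p + 2)\,D\,d^2
  \qquad \text{s.t.} \qquad
  1 - \frac{D^3 p}{71785\,d^4} \le 0, \;\ldots
\]
```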
Table 13. Comparison results for the rolling element bearing design problem.

| Algorithm | Dm | Db | Z | fi | fo | KDmin | KDmax | ε | e | ζ | Optimum cost |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SSA | 125 | 21.41890 | 10.94113 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.02 | 0.6 | 85,067.983 |
| EPO | 125 | 21.40732 | 10.93268 | 0.515 | 0.515 | 0.4 | 0.7 | 0.3 | 0.02 | 0.6 | 85,054.532 |
| SHO | 125.6199 | 21.35129 | 10.98781 | 0.515 | 0.515 | 0.5 | 0.68807 | 0.300151 | 0.03254 | 0.62701 | 84,807.111 |
| GOA | 125 | 20.75388 | 11.17342 | 0.515 | 0.515000 | 0.5 | 0.61503 | 0.300000 | 0.05161 | 0.60000 | 81,691.202 |
| GWO | 125.6002 | 21.32250 | 10.97338 | 0.515 | 0.515000 | 0.5 | 0.68782 | 0.301348 | 0.03617 | 0.61061 | 84,491.266 |
| TLBO | 125 | 21.14834 | 10.96928 | 0.515 | 0.515 | 0.5 | 0.7 | 0.3 | 0.02778 | 0.62912 | 83,431.117 |
| GSA | 125 | 20.85417 | 11.14989 | 0.515 | 0.517746 | 0.5 | 0.61827 | 0.304068 | 0.02000 | 0.624638 | 82,276.941 |
| PSO | 125 | 20.77562 | 11.01247 | 0.515 | 0.515000 | 0.5 | 0.61397 | 0.300000 | 0.05004 | 0.610001 | 82,773.982 |
| GA | 125 | 20.87123 | 11.16697 | 0.515 | 0.516000 | 0.5 | 0.61951 | 0.301128 | 0.05024 | 0.614531 | 81,569.527 |
Table 14. Statistical results for the rolling element bearing design problem.

| Algorithm | Best | Mean | Worst | Std. dev. | Median |
|---|---|---|---|---|---|
| SSA | 85,067.983 | 85,042.352 | 86,551.599 | 1877.09 | 85,056.095 |
| EPO | 85,054.532 | 85,024.858 | 85,853.876 | 186.68 | 85,040.241 |
| SHO | 84,807.111 | 84,791.613 | 84,517.923 | 137.186 | 84,960.147 |
| GOA | 81,691.202 | 50,435.017 | 32,761.546 | 13,962.150 | 42,287.581 |
| GWO | 84,491.266 | 84,353.685 | 84,100.834 | 392.431 | 84,398.601 |
| TLBO | 83,431.117 | 81,005.232 | 77,992.482 | 1710.777 | 81,035.109 |
| GSA | 82,276.941 | 78,002.107 | 71,043.110 | 3119.904 | 78,398.853 |
| PSO | 82,773.982 | 81,198.753 | 80,687.239 | 1679.367 | 8439.728 |
| GA | 81,569.527 | 80,397.998 | 79,412.779 | 1756.902 | 8347.009 |
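Unlike the four problems above, the rolling element bearing problem [92] is a maximization problem: the dynamic load-carrying capacity is maximized over the ten geometric variables of Table 13, so larger values are better in Tables 13 and 14. In the commonly cited formulation, for ball diameters Db ≤ 25.4 mm the objective takes the form below, where Z is the number of balls and fc is a factor determined by the bearing geometry:

```latex
% Commonly cited bearing-capacity objective [92]; f_c depends on the geometry.
\[
  \max\; C_d = f_c\, Z^{2/3} D_b^{1.8}, \qquad D_b \le 25.4\ \text{mm}.
\]
```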