Binary Spring Search Algorithm for Solving Various Optimization Problems

Abstract: One of the most powerful tools for solving optimization problems is population-based, nature-inspired optimization algorithms. These algorithms provide a solution to a problem by randomly searching the search space. Their central design ideas are derived from various natural phenomena, the behavior and living conditions of living organisms, laws of physics, etc. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. BSSA is based on a simulation of the famous Hooke's law (physics) for a traditional system of weights and springs. In this proposal, the population comprises weights that are connected by unique springs. The mathematical modeling of the proposed algorithm is presented and used to obtain solutions to optimization problems. The results were thoroughly validated on different unimodal and multimodal functions; additionally, the BSSA was compared with high-performance algorithms: the binary grasshopper optimization algorithm, binary dragonfly algorithm, binary bat algorithm, binary gravitational search algorithm, binary particle swarm optimization, and binary genetic algorithm. The results show the superiority of the BSSA, and the results of the Friedman test corroborate that the BSSA is more competitive.


Introduction
The optimization of a process or system is a concept that has critical applications in many fields of science. Many optimization algorithms have been introduced [1][2][3], which has led to greater availability of heuristic optimization techniques in recent years and their application in various fields, such as energy [4,5], protection [6], electrical engineering [7][8][9][10][11], filter design [12], and energy carriers [13,14], to achieve the optimal solution (under specific criteria). Lately, these methods have been modified, achieving better yields [15][16][17].
An optimization algorithm is intelligent when its approach is to find an adequate solution to an optimization problem in the shortest possible time with the least detailed information [18]. The word "heuristics" in ancient Greek means "to know," "to discover," "to find," or "a clue for an investigation" [19]. The heuristic approach ignores some of the information to make faster decisions with the maximum saving of time and utmost precision compared to complex approaches [20]. Processes in nature inspire many heuristic algorithms (biological processes, animals or groups of animals, and physics theories).
Optimization algorithms can be classified in different ways, using a widely accepted structure of four categories, based on the main design. These categories are physics-based, swarm, game-based, and evolution-based optimization algorithms.
Physics-based optimization algorithms are designed by simulating various physical phenomena and laws. The Momentum Search Algorithm (MSA) belongs to this group; it is based on a simulation of momentum and the laws of motion. Members of the MSA population are balls of different weights that move, based on impulse, toward a ball with a suitable position [1]. The Simulated Annealing (SA) algorithm is one of the oldest physics-based algorithms. SA is inspired by the process of smelting and cooling materials in metallurgy: under controlled temperature conditions, a material is subjected to a heat treatment that takes its molecular structure through different phases and changes its mechanical properties, increasing the material's strength and durability. Heating the material increases the energy of its atoms and allows them to move freely, while the slow cooling process allows a new lower-energy, higher-strength configuration to be discovered and exploited [21]. The efficiency of this algorithm stems from an essential connection between statistical mechanics and optimization processes that are multivariate or combinatorial in nature.
The analogous behavior of these processes lays the foundations for defining values that optimize the properties of extensive and/or complex systems, which is where the use of this algorithm is mainly justified.
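To make the annealing analogy concrete, the following is a minimal sketch of simulated annealing on a one-dimensional function. It is not taken from the paper or from reference [21]; the cooling schedule, step size, and acceptance rule are illustrative assumptions (`t0`, `cooling`, and the neighbor step `uniform(-1, 1)` are hypothetical parameters).

```python
import math
import random

random.seed(7)

def simulated_annealing(obj, x0, t0=1.0, cooling=0.95, steps=200):
    """Minimal sketch of simulated annealing (illustrative, not the paper's
    method): accept a worse move with probability exp(-delta/T), and cool T
    slowly so the search settles into a low-energy configuration."""
    x, fx, t = x0, obj(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + random.uniform(-1.0, 1.0)          # random neighbor
        fc = obj(cand)
        delta = fc - fx
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, fx = cand, fc                          # accept (always, if better)
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                                  # cooling schedule
    return best_x, best_f

# Minimize f(x) = x^2 starting from x = 4; the best value can only improve.
xb, fb = simulated_annealing(lambda v: v * v, x0=4.0)
```

At high temperature the acceptance test lets the search climb out of local minima; as `t` shrinks, worse moves are rejected almost surely and the search becomes purely greedy.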
Swarm-based optimization algorithms have been developed based on simulating the natural movements and behavior of animals and other living things. Particle Swarm Optimization (PSO), a heuristic global optimization method proposed in 1995, is the most famous such algorithm and is widely used by researchers [31]. It is based on swarm intelligence and emulates the behavior patterns of birds or fish looking for food: when one bird smells food, it communicates this to the rest, and the resulting coordination reproduces successful behavior through the cooperation of every bird. Owing to its simplicity and ease of implementation, this idea is widely exploited for optimization in many different areas of knowledge.
Game-based optimization algorithms are introduced and designed based on the simulation of different game rules and player behavior. Football Game-Based Optimization (FGBO) is a game-based optimization algorithm developed based on football league simulations and club performance [51]. Some of the other game-based algorithms are Binary Orientation Search Algorithm (BOSA) [52], Darts Game Optimizer (DGO) [53], Orientation Search Algorithm (OSA) [54], Shell Game Optimization (SGO) [55], Dice Game Optimizer (DGO) [56], and Hide Objects Game Optimization (HOGO) [57].
Evolutionary-based optimization algorithms are developed based on the combination of natural selection and continuity of coordination. These algorithms are based on structures that simulate the rules of selection, recombination, change, and survival. Genetic Algorithm (GA) is one of the oldest and most popular evolutionary-based optimization algorithms [58]. Some other evolutionary-based algorithms include Improved Quantum-Inspired Differential Evolution Algorithm (IQDE) [59], Differential Evolution (DE) [60], Biogeography-Based Optimizer (BBO) [61], Evolutionary Programming (EP) [62], Evolution Strategy (ES) [63], and Genetic Programming (GP) [64].
Most of these algorithms use statistical features and random phenomena in their structure. Some algorithms, such as central force optimization algorithms that are metaphors for the universal law of gravity, do not use such random phenomena; these algorithms are deterministic [29].
Population-based approaches have been inspired by social interactions between members of a community.
Based on learned experience and the guidance of neighboring particles, every particle tries to move towards the best position in the search space [65]. Physical and biological processes in nature have inspired heuristic search algorithms, the majority of which act as population-based algorithms.
Unlike classical techniques, heuristic search techniques act randomly and search the space in parallel; they do not use gradient information. These algorithms use only fitness functions to guide the search process, yet they can discover the solution thanks to their swarm intelligence. Swarm intelligence appears where there is a population of non-expert agents. These agents follow a simple behavior pattern in certain situations and interact locally. These localized interactions between members give rise to unexpected global behavior and guide the search to the optimal solution, allowing the system to find a solution without the need for a central controller. The members' behavior organizes the system internally, generating characteristics such as positive feedback, negative feedback, a balance of exploration and exploitation, and multiple interactions of different orders. This is called the self-organizing effect [66,67].
Although many heuristic algorithms have been developed, improved, and used, no single algorithm has been introduced that provides an efficient solution for every optimization problem in engineering or other sciences. This article analyzes a new heuristic algorithm that addresses these traditional shortcomings: an optimization algorithm based on the well-known Hooke's law is proposed, and preliminary results are presented [68].
The optimization algorithm called the Binary Spring Search Algorithm (BSSA) is described and analyzed. The rest of the paper is organized as follows. In the first section, a brief introduction to heuristic-based optimization algorithms is presented. The spring force law is discussed in the second section, and the binary version of the spring search algorithm is introduced in Section 3. The main features of the BSSA are shown in Section 4, and a computational complexity analysis is presented in Section 5. The proposed algorithm's exploration and exploitation characteristics are explained in Section 6, and the results are given in Section 7. Finally, concluding remarks are listed in the last section.

Spring Search Algorithm
The BSSA is a physics-based optimization algorithm that can be used to solve various optimization problems. The BSSA has a population matrix whose members are different weights that are moved in the search space in order to achieve the optimal solution. All desired weights are connected to each other in this system by a unique spring whose stiffness coefficient is determined based on the value of the objective function. The main idea of the proposed BSSA is to use Hooke's law between the weights and springs in order to reach the equilibrium point (solution).
Hooke's law is defined by Equation (1). All springs follow Hooke's law as long as they are not deformed [69].

F = -K x, (1)

where F is the spring force, K is the spring constant, and x is the spring's compression or stretch.
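As a small numerical illustration of Equation (1) (this example is ours, not the paper's):

```python
def spring_force(k, x):
    """Hooke's law, Eq. (1): restoring force F = -k*x of a spring with
    constant k compressed or stretched by x."""
    return -k * x

# A spring with constant k = 2.0 stretched by x = 0.5 pulls back with F = -1.0;
# compressing it (negative x) produces a positive (pushing) force.
f = spring_force(2.0, 0.5)
```

The sign convention matters: the force always opposes the displacement, which is what drives the weights in the BSSA toward equilibrium.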

BSSA Formulation
In this section, the mathematical formulation of the BSSA is modeled according to Hooke's law. Similar to population-based algorithms, the BSSA has a population matrix in which each row represents a member of the population as a weight. Thus, every member of the population is a vector, where each vector element determines the value of a variable of the optimization problem. In the BSSA, each member of the population is introduced using Equation (2).
X_i = (x_i^1, ..., x_i^d, ..., x_i^m), i = 1, ..., N, (2)

where X_i is the i'th member of the population matrix, x_i^d is the status of the d'th dimension of the i'th member of the population matrix, m is the number of the problem's variables, and N is the number of members of the population. The initial position of each member of the population is chosen randomly in the search space of the problem. Then, based on the forces that the springs exert on the weights, the members of the population move through the search space. The force of a spring is proportional to its spring constant, which is updated in each iteration using Equation (3).
Here, K_{i,j} is the spring constant of the spring that connects weight i to weight j, K_max is the maximum value of the spring constant for all springs (its value is 1), and Fn is the normalized objective function, where Fn_i denotes the normalized objective function value for the i'th member. In the BSSA, objective functions are normalized using Equations (4) and (5).
where F is the vector of objective function values, in which F_i denotes the objective function value for the i'th member. An m-variable problem has an m-dimensional search space; therefore, the search space has m coordinate axes, one per variable, and each member of the population has a value on each axis. For each member of the population, fixed points are defined to its right and left on each axis. The fixed points for a member are those members that have a better objective function value than that member. As a result, two separate forces, from the left and from the right, are applied to each member on each axis; they are determined using Equations (6) and (7).
where F_{i,d}^R and F_{i,d}^L are, respectively, the totals of the forces exerted on the d'th dimension of the i'th member of the population matrix from the right and from the left, n_R is the number of fixed points on the right of the d'th axis, and n_L is the number of fixed points on its left. Now, according to Hooke's law, the displacement to the right and to the left of each member on each axis can be calculated using Equations (8) and (9).
where dx_{j,d}^R is the displacement to the right of the j'th member on the d'th axis, and dx_{j,d}^L is the displacement to the left of the j'th member on the d'th axis. The final displacement is then obtained by merging Equations (8) and (9) according to Equation (10).
where dx_{j,d} is the final displacement of the j'th member on the d'th axis. After determining the displacement, the new position of each member in the search space is updated using Equation (11).
where x_{j,d}^0 is the previous position of the j'th member in the d'th dimension, and r_1 is a random number with a normal distribution in the range [0, 1].
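To make the formulation concrete, the normalization and displacement steps can be sketched in Python on a single coordinate axis. Because the display equations (2)-(11) are not reproduced in the text, the normalization rule and the exact force/displacement expressions below are hypothetical reconstructions from the surrounding descriptions and Hooke's law, not the paper's verified equations; `normalize_objective` and `update_axis` are our names.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_objective(f):
    """Hypothetical stand-in for Eqs. (4)-(5): for a minimization problem,
    maps objective values so the best member gets 1 and the worst gets 0."""
    f = np.asarray(f, dtype=float)
    worst, best = f.max(), f.min()
    if worst == best:               # all members equally fit
        return np.ones_like(f)
    return (worst - f) / (worst - best)

def update_axis(x, fn, k_max=1.0):
    """Hypothetical sketch of Eqs. (6)-(11) on one coordinate axis.
    Each member is pulled toward every better member ("fixed point") by a
    spring whose constant grows with that fixed point's normalized fitness;
    the merged displacement is scaled by a random factor in [0, 1]."""
    n = len(x)
    x_new = np.empty(n)
    for i in range(n):
        better = [j for j in range(n) if fn[j] > fn[i]]   # fixed points of member i
        f_right = sum(k_max * fn[j] * (x[j] - x[i]) for j in better if x[j] > x[i])   # Eq. (6)
        f_left  = sum(k_max * fn[j] * (x[j] - x[i]) for j in better if x[j] <= x[i])  # Eq. (7)
        dx = f_right + f_left                             # Eqs. (8)-(10): merged displacement
        x_new[i] = x[i] + rng.random() * dx               # Eq. (11): random step toward better members
    return x_new

fn = normalize_objective([1.0, 3.0, 5.0])                 # best member is index 0
x_new = update_axis(np.array([0.0, 1.0, 4.0]), fn)        # members 1 and 2 are pulled toward member 0
```

Note that the best member has no fixed points and therefore does not move, which matches the paper's intention that the best positions be searched with the shortest steps.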

BSSA Implementation
In the BSSA, the population of the algorithm is first defined randomly. Then, in each iteration, the position of each member of the population is updated according to Equations (3)-(11). In particular, the spring constant is updated in each iteration according to Equation (3). This process is repeated until the algorithm reaches the stopping condition. The steps of implementing the BSSA can therefore be expressed as follows:
Start
Step 1: Define the problem and determine its search space.
Step 2: Create the initial population randomly.
Step 3: Evaluate and normalize the objective function.
Step 4: Update the spring constant.
Step 5: Calculate the left and right displacements according to Hooke's law.
Step 6: Calculate the total displacement of each member.
Step 7: Update the position of each member of the population.
Step 8: Repeat steps 3-7 until the stop condition is reached.
Step 9: Return the best solution for the objective function.
End
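The steps above can be sketched end-to-end in Python. This is an illustrative reconstruction only: since the paper's equations are not reproduced in the text, the force form, the linearly decaying spring constant `k_max`, the clipping to the search bounds, and the function name `ssa_minimize` are all our assumptions, shown here on the sphere function.

```python
import numpy as np

def ssa_minimize(obj, dim=2, n=10, iters=50, lo=-5.0, hi=5.0, seed=1):
    """Hypothetical end-to-end sketch of Steps 1-9 (continuous SSA).
    Spring constants shrink linearly over the iterations so that late
    steps are small and the search becomes more exploitative."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n, dim))               # Step 2: random initial population
    best_x, best_f = None, np.inf
    for t in range(iters):
        f = np.array([obj(m) for m in x])                # Step 3: evaluate objective
        worst, best = f.max(), f.min()
        fn = np.ones(n) if worst == best else (worst - f) / (worst - best)
        if best < best_f:
            best_f, best_x = best, x[f.argmin()].copy()
        k_max = 1.0 - t / iters                          # Step 4: decaying spring constant
        x_new = x.copy()
        for i in range(n):
            for j in range(n):
                if fn[j] > fn[i]:                        # Steps 5-6: pull toward better members
                    x_new[i] += rng.random() * k_max * fn[j] * (x[j] - x[i])
            x_new[i] = np.clip(x_new[i], lo, hi)         # keep members inside the search space
        x = x_new                                        # Steps 7-8: move and repeat
    return best_x, best_f                                # Step 9: best solution found

sphere = lambda v: float(np.sum(v * v))
xb, fb = ssa_minimize(sphere)
```

The tracked best value can only improve over the iterations, so the sketch returns at worst the best member of the random initial population.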

Binary Spring Search Algorithm (BSSA)
In this section, a binary version of the spring search algorithm is developed. In the binary version of the SSA, real values are represented in binary using zeros and ones. The search space is discrete, and an appropriate number of binary digits must be used to represent each variable on its axis. Given that the binary version has only two values (zero and one), displacement is defined as changing the status from zero to one or from one to zero. To implement this concept of displacement, a probability function is used: based on its value, the new position of each member in each dimension of the problem may change or remain unchanged. In the BSSA, S(dx_{j,d}) is therefore the probability of x_{j,d} becoming zero or one. The methods for calculating the spring forces, the spring constants, the displacement of each member, and the update steps are the same in the binary and real versions; the two versions differ only in how the population is updated. Given that the probability function must return a number between zero and one, the probability of changing the position in each dimension of each member is calculated using Equation (12).
Therefore, based on the values of the probability function, the new position of each dimension of each member is updated using Equation (13).

If r < S(dx_{j,d}(t)), then x_{j,d}(t + 1) = complement(x_{j,d}(t)); otherwise, x_{j,d}(t + 1) = x_{j,d}(t). (13)

According to Equation (13), each member of the population changes its position with a certain probability: the higher the value of dx_{j,d}, the higher the probability of the member moving in dimension d. Here, r is a random number with a normal distribution in the range [0, 1]. The different steps of the BSSA are shown as a flowchart in Figure 1.
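The binary update of Equations (12) and (13) can be sketched as follows. The paper does not reproduce the formula of Equation (12), so the `|tanh|` transfer function used below is an assumption borrowed from common practice in binary optimizers, and `transfer` and `binary_update` are our names for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

def transfer(dx):
    """Hypothetical stand-in for Eq. (12): maps a real-valued displacement to
    a flip probability in [0, 1]. |tanh| is a common transfer function in
    binary optimizers; the paper's exact formula is not reproduced here."""
    return np.abs(np.tanh(dx))

def binary_update(x_bit, dx):
    """Eq. (13): flip the bit with probability transfer(dx), else keep it."""
    r = rng.random()                  # random number in [0, 1)
    if r < transfer(dx):
        return 1 - x_bit              # complement: 0 -> 1, 1 -> 0
    return x_bit

# A zero displacement gives flip probability 0, so the bit is always kept;
# a very large |dx| drives the flip probability to 1.
kept = binary_update(1, 0.0)
```

A monotone transfer function preserves the real-valued algorithm's intuition: the larger the computed displacement, the more likely the binary coordinate is to change state.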
In order to clearly illustrate how the proposed method seeks the optimal solution, consider a standard test function solved in two dimensions with 50 iterations and 10 bodies as the population. In the first iteration, the members of the population are placed randomly in the problem space. Figure 2 shows that the proposed algorithm searches extensively in the initial iterations, covering the defined space of the problem with high search capacity. Over time, the algorithm converges towards an optimal solution, and the members of the population concentrate in its vicinity, demonstrating the algorithm's capacity for quick exploration of the optimal solution. The numerical results of the test function over different iterations are shown in Table 1.

Features of the BSSA
In the proposed BSSA, a new optimizer was designed by simulating the spring force law. In the BSSA, the population members are a set of interconnected weights that move through the problem's search space, and the spring force is the tool for exchanging information between members of the population. Each object has only a rough understanding of its surroundings, influenced by the positions of the other objects, so the algorithm must be designed to improve the positions of population members over successive iterations. This is accomplished by adjusting the spring stiffness coefficients during the iterations of the algorithm. A spring with a higher stiffness coefficient connects to objects with a better fitness function and draws other objects towards them; for any object, a force proportional to the size of that object is applied. Objects in better positions should take shorter, slower steps; to achieve this, springs with higher stiffness coefficients are attributed to better weights. This makes the weights with better fitness values search the space around them more carefully. The stiffness coefficients of the springs, and consequently the spring forces, decrease over time: as objects accumulate around better positions, the space must be searched with smaller steps and greater precision. Figure 3 visualizes the forces applied to the system and the performance of the algorithm.

Computational Complexity
The time and space complexities of the proposed BSSA are discussed in this section.

Time Complexity
The initialization process of the BSSA takes O(n) time, and converting the algorithm to a binary version takes O(c) time. Processing the population members and evaluating the objective function require O(p) and O(f) time, respectively. The whole process is repeated up to a maximum number of iterations, contributing a factor of O(Maxiterations). Overall, the time complexity of the proposed BSSA is O(n + c·p·f·Maxiterations).

Space Complexity
The proposed BSSA's space complexity is determined by its initialization process, which requires O(n) space.

Exploration and Exploitation of the BSSA
The two most important indicators for evaluating the performance of optimization algorithms are exploitation power and exploration power. The exploitation index analyzes the ability of an optimization algorithm to achieve the optimal solution: an algorithm that can provide a solution closer to the true optimum has higher exploitation power. The exploration index analyzes the power of an optimization algorithm to search the defined search space of a problem thoroughly. This indicator is especially important for optimization problems that have several locally optimal points: an algorithm that can effectively scan the entire search space is able to move the population away from locally optimal points and direct it to the main optimal areas. According to these definitions, it is better for an optimization algorithm to have more exploratory power in the early iterations, in order to examine different areas of the search space, and then, as it approaches the final iterations, to increase its exploitation power in order to refine the solution [70,71].
The BSSA is able to scan the search space accurately through its population members. The main parameter used in the BSSA to maintain the balance between exploitation and exploration is the spring constant. The spring constant equation in the BSSA is designed to take large values in the initial iterations so that, according to Hooke's law and the spring force, different areas of the search space are scanned by the members of the population. Then, as the iterations progress and the algorithm approaches its final iterations, the spring constant takes smaller values and the optimal areas are searched more carefully in order to provide the best solution possible. This procedure is incorporated into the spring constant update to maintain the balance between exploitation power and exploration power.
Each optimization algorithm was applied independently 20 times, and the results were averaged. As can be seen from Tables 5-7, the BSSA provides better results for most functions.
The seven objective functions F1-F7 are suitable for evaluating the exploitation rate, and the optimization results in Table 5 show that the proposed algorithm performs best on them. The objective functions F8-F23 were selected to study and analyze the exploration index, and the optimization results in Tables 6 and 7 show the exploration capability of the algorithm. Although the simulation and optimization results, reported as the average of the best solutions and the standard deviation, indicate the superiority of the proposed BSSA, these results alone are not sufficient to guarantee it: even though all algorithms were run independently 20 times, it is still possible, albeit with low probability, that the advantage occurred by chance. Therefore, the Friedman rank test [78] was applied to analyze the results further. This statistical test has two settings that serve the same goal: either a quantitative variable is recorded two or more times in the same sample, or two or more quantitative variables are measured from the same sample; in both settings, the Friedman test compares the distributions of the quantitative variables. The results of this test are presented in Table 8 for three different groups of objective functions (unimodal, multimodal, and multimodal with fixed dimension) as well as for all objective test functions together. Based on these results, the proposed algorithm is ranked first by the Friedman test on all three groups of test functions. Furthermore, the overall results across all test functions (F1-F23) show that the BSSA is significantly superior to the other algorithms.

Conclusions
There are many optimization problems that must be solved by a suitable method. Different heuristic optimization algorithms have been proposed to overcome the shortcomings of traditional methods such as Linear Programming (LP), non-linear programming, and differential programming. Most of these algorithms are population-based and exploit the randomness of natural phenomena. A heuristic optimization algorithm called the Binary Spring Search Algorithm (BSSA) is proposed, based on the spring force law (Hooke's law). The proposal was mathematically modeled, and its efficiency was evaluated using 23 standard test functions. These test functions were selected from three different types, unimodal, multimodal, and multimodal with fixed dimension, to evaluate different aspects of the proposed algorithm. Six optimization algorithms (the binary genetic algorithm, binary particle swarm optimization, the binary gravitational search algorithm, the binary dragonfly algorithm, the binary bat algorithm, and the binary grasshopper optimization algorithm) were used for comparison to evaluate the performance of the proposed algorithm. Compared to the other algorithms, in all cases, the BSSA produced nearly optimal solutions. Friedman's rank test was used to analyze the performance of the BSSA further; the results of this test show the clear superiority of the proposed algorithm on the three different types of test functions. The overall results across all test functions (F1-F23) show that the BSSA is significantly superior to the other algorithms and ranks first among them. Based on the optimization results and the Friedman rank test results, the proposed BSSA performs well in solving optimization problems and is more competitive than similar algorithms.