Article

Binary Spring Search Algorithm for Solving Various Optimization Problems

Mohammad Dehghani, Zeinab Montazeri, Ali Dehghani, Om P. Malik, Ruben Morales-Menendez, Gaurav Dhiman, Nima Nouri, Ali Ehsanifar, Josep M. Guerrero and Ricardo A. Ramirez-Mendoza

1 Department of Electrical and Electronics Engineering, Shiraz University of Technology, Shiraz 71557-13876, Iran
2 Department of Civil Engineering, Islamic Azad University of Estahban, Estahban, Fars 74518-64747, Iran
3 Department of Electrical and Computer Engineering, University of Calgary, Calgary, AB T2N 1N4, Canada
4 School of Engineering and Sciences, Tecnologico de Monterrey, Monterrey, NL 64849, Mexico
5 Department of Computer Science, Government Bikram College of Commerce, Patiala, Punjab 147004, India
6 Department of Electrical Engineering, Yazd University, Yazd 89195-741, Iran
7 Department of Power and Control Engineering, School of Electrical and Computer Engineering, Shiraz University, Shiraz 71557-13876, Iran
8 CROM Center for Research on Microgrids, Department of Energy Technology, Aalborg University, 9220 Aalborg, Denmark
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(3), 1286; https://doi.org/10.3390/app11031286
Submission received: 19 December 2020 / Revised: 20 January 2021 / Accepted: 27 January 2021 / Published: 30 January 2021
(This article belongs to the Collection Bio-inspired Computation and Applications)

Abstract:
One of the most powerful tools for solving optimization problems is the family of population-based optimization algorithms inspired by nature. These algorithms provide a solution to a problem by searching the search space randomly. Their central design ideas are derived from various natural phenomena, the behavior and living conditions of living organisms, laws of physics, etc. A new population-based optimization algorithm called the Binary Spring Search Algorithm (BSSA) is introduced to solve optimization problems. The BSSA is based on a simulation of the well-known Hooke’s law for the traditional system of weights and springs. In this proposal, the population comprises weights that are connected by unique springs. The mathematical model of the proposed algorithm is presented for use in solving optimization problems. The results were thoroughly validated on different unimodal and multimodal functions; additionally, the BSSA was compared with high-performance algorithms: the binary grasshopper optimization algorithm, binary dragonfly algorithm, binary bat algorithm, binary gravitational search algorithm, binary particle swarm optimization, and binary genetic algorithm. The results show the superiority of the BSSA, and the results of the Friedman test corroborate that the BSSA is the more competitive algorithm.

1. Introduction

The optimization of a process or system is a concept that has critical applications in many fields of science. Many optimization algorithms have been introduced [1,2,3], which has led to greater availability of heuristic optimization techniques in recent years and their application in various fields, such as energy [4,5], protection [6], electrical engineering [7,8,9,10,11], filter design [12], and energy carriers [13,14], to achieve the optimal solution (under specific criteria). Lately, these methods have been modified, achieving better yields [15,16,17].
An optimization algorithm is considered intelligent when its approach finds an adequate solution to an optimization problem in the shortest possible time using the least detailed information [18]. The word “heuristics” in ancient Greek means “to know,” “to discover,” “to find,” or “a clue for an investigation” [19]. A heuristic approach deliberately ignores part of the available information in order to make faster decisions, saving time while retaining acceptable precision compared to more complex approaches [20]. Many heuristic algorithms are inspired by processes in nature (biological processes, animals or groups of animals, and theories of physics).
Optimization algorithms can be classified in different ways, using a widely accepted structure of four categories, based on the main design. These categories are physics-based, swarm, game-based, and evolution-based optimization algorithms.
Physics-based optimization algorithms are designed by simulating various phenomena and laws of physics. The Momentum Search Algorithm (MSA) is one of the algorithms belonging to this group. MSA is based on a simulation of momentum and the laws of motion: members of the MSA population are balls with different weights that move, based on impulse, in the direction of a ball with a suitable position [1]. The Simulated Annealing (SA) algorithm is one of the oldest physics-based algorithms. SA is inspired by the process of smelting and cooling materials in metallurgy. Under controlled temperature conditions, the material is subjected to a heat treatment that causes its molecular structure to pass through different phases, changing its mechanical properties and increasing its strength and durability. Heating the material increases the energy of its atoms and allows them to move freely, while the slow cooling process allows a new configuration with lower energy and higher strength to be discovered and exploited [21]. This algorithm’s efficiency stems from an essential connection between statistical mechanics and optimization processes that are multivariate or combinatorial in nature.
The analogous behavior of these processes lays the foundations for defining values that optimize the properties of extensive and/or complex systems, which is where the use of this algorithm is mainly justified.
Some other popular physics-based optimization algorithms include Galaxy-based Search Algorithm (GbSA) [22], Charged System Search (CSS) [23], Curved Space Optimization (CSO) [24], Artificial Chemical Reaction Optimization Algorithm (ACROA) [25], Ray Optimization (RO) algorithm [26], Small World Optimization Algorithm (SWOA) [27], Black Hole (BH) [28], Central Force Optimization (CFO) [29], and Big-Bang Big-Crunch (BBBC) [30].
Swarm-based optimization algorithms have been developed based on the simulation of natural processes, movements, and behavior of animals and other living things. Particle Swarm Optimization (PSO) is the most famous optimization algorithm and is often used by researchers. Particle swarm optimization is a heuristic global optimization method that was proposed in 1995 [31]. It is based on the intelligence of swarms and emulates the behavior patterns of birds/fish when looking for food. One of the birds smells food and communicates it to the rest; this coordination reproduces successful behavior due to the cooperation of each bird. This algorithmically structured idea, due to its simplicity and ease of implementation, is widely exploited for optimization in many different areas of knowledge.
Some of the other swarm-based algorithms are Bat-inspired Algorithm (BA) [32], Artificial Bee Colony (ABC) [33], Doctor and Patient Optimization (DPO) [34], Cuckoo Search (CS) [35], Spotted Hyena Optimizer (SHO) [36], Multi Leader Optimizer (MLO) [37], Group Optimization (GO) [38], Monkey Search (MS) [39], Grey Wolf Optimizer (GWO) [40], Artificial Fish-Swarm Algorithm (AFSA) [41], Hunting Search (HS) [42], Moth-Flame Optimization Algorithm (MFO) [43], Emperor Penguin Optimizer (EPO) [44], Dolphin Partner Optimization (DPO) [45], Donkey Theorem Optimization (DTO) [46], Rat Swarm Optimizer (RSO) [47], Grasshopper Optimization Algorithm (GOA) [48], Coupled Spring Forced Bat Algorithm (SFBA) [49], and Following Optimization Algorithm (FOA) [50].
Game-based optimization algorithms are introduced and designed based on the simulation of different game rules and player behavior. Football Game-Based Optimization (FGBO) is a game-based optimization algorithm developed based on football league simulations and club performance [51]. Some of the other game-based algorithms are Binary Orientation Search Algorithm (BOSA) [52], Darts Game Optimizer (DGO) [53], Orientation Search Algorithm (OSA) [54], Shell Game Optimization (SGO) [55], Dice Game Optimizer (DGO) [56], and Hide Objects Game Optimization (HOGO) [57].
Evolutionary-based optimization algorithms are developed based on the combination of natural selection and continuity of coordination. These algorithms are based on structures that simulate the rules of selection, recombination, change, and survival. Genetic Algorithm (GA) is one of the oldest and most popular evolutionary-based optimization algorithms [58]. Some other evolutionary-based algorithms include Improved Quantum-Inspired Differential Evolution Algorithm (IQDE) [59], Differential Evolution (DE) [60], Biogeography-Based Optimizer (BBO) [61], Evolutionary Programming (EP) [62], Evolution Strategy (ES) [63], and Genetic Programming (GP) [64].
These algorithms use statistical features and random phenomena in their structure. An exception is central force optimization, a metaphor of the universal law of gravity, which does not use such random phenomena and therefore behaves deterministically [29].
Population-based approaches have been inspired by social interactions between members of a community.
Based on the experience learned and the neighborhood particle guides, every particle tries to move towards the search space’s best position [65]. Physical and biological processes and nature have inspired heuristic search algorithms. The majority of them act as population-based algorithms.
Unlike classical techniques, heuristic search techniques act randomly and search the space in parallel; they do not use gradient information about the search space. These algorithms use only fitness functions to guide the search process, yet they can discover the solution thanks to their swarm intelligence. Swarm intelligence appears in cases where there is a population of non-expert agents. These agents follow simple behavior patterns in certain situations and interact locally. These localized relationships and interactions between members give rise to unexpected collective behavior that guides the search towards the optimal solution. This allows the system to find a solution without the need for a central controller. The behavior of the members organizes the system internally, generating characteristics such as positive feedback, negative feedback, balanced exploration-exploitation, and multiple interactions of different orders. This effect is called the self-organizing impact [66,67].
Although many heuristic algorithms have been developed, improved, and used, no single algorithm has been introduced that provides an efficient solution to every optimization problem in engineering or the other sciences. This article analyzes and discusses a new heuristic algorithm that addresses these traditional shortcomings. An optimization algorithm based on the well-known Hooke’s law is proposed, and preliminary results are presented [68].
The optimization algorithm called the Binary Spring Search Algorithm (BSSA) is described and analyzed. The rest of the paper is organized as follows. In the first section, a brief introduction to heuristic-based optimization algorithms is presented. The spring force law is discussed in Section 2, and the binary version of the spring search algorithm is introduced in Section 3. The main features of the BSSA are shown in Section 4, and a computational complexity analysis is presented in Section 5. The proposed algorithm’s exploration and exploitation characteristics are explained in Section 6, and the results are given in Section 7. Finally, concluding remarks are listed in the last section.

2. Spring Search Algorithm

The BSSA is a physics-based optimization algorithm that can be used to solve various optimization problems. The BSSA has a population matrix whose members are different weights that are moved in the search space in order to achieve the optimal solution. All desired weights are connected to each other in this system by a unique spring whose stiffness coefficient is determined based on the value of the objective function. The main idea of the proposed BSSA is to use Hooke’s law between the weights and springs in order to reach the equilibrium point (solution).
Hooke’s law is defined by Equation (1). All springs obey Hooke’s law as long as they are not deformed beyond their elastic limit [69].

F_s = kx \quad (1)

Here, F_s is the spring force, k is the spring constant, and x is the compression or stretch of the spring.

2.1. BSSA Formulation

In this section, the mathematical formulation of the BSSA is modeled according to Hooke’s law. Similar to other population-based algorithms, the BSSA has a population matrix in which each row represents one member of the population as a weight. Thus, every member of the population is a vector, and each element of that vector determines the value of one variable of the optimization problem. In the BSSA, each member of the population is defined using Equation (2).

X_i = (x_i^1, \ldots, x_i^d, \ldots, x_i^m) \quad \text{for } i = 1, 2, \ldots, N, \quad (2)

Here, X_i is the i’th member of the population matrix, x_i^d is the status of the d’th dimension of the i’th member, m is the number of problem variables, and N is the number of population members. The initial position of each member is set randomly within the search space of the problem. Then, based on the forces that the springs exert on the weights, the members of the population move through the search space. The force of each spring is proportional to its spring constant, which is updated in each iteration using Equation (3).
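To make the population representation concrete, the following minimal Python sketch initializes the N × m population matrix of Equation (2). The function name, the uniform sampling, and the explicit bound arguments are illustrative assumptions of this sketch, not details taken from a reference implementation.

```python
import numpy as np

def init_population(N, m, lower, upper, rng=None):
    """Hypothetical sketch: each row X_i = (x_i^1, ..., x_i^m) is one weight,
    placed uniformly at random inside the search space [lower, upper]^m."""
    rng = np.random.default_rng() if rng is None else rng
    return lower + (upper - lower) * rng.random((N, m))
```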
K_{i,j} = K_{max} \frac{|Fn_i - Fn_j|}{\max(Fn_i, Fn_j)} \quad (3)

Here, K_{i,j} is the constant of the spring that connects weight i to weight j, K_{max} is the maximum spring constant over all springs (set to 1), and Fn is the normalized objective function, where Fn_i denotes the normalized objective value of the i’th member. In the BSSA, objective functions are normalized using Equations (4) and (5).
Fn_i = fobj_i - \min(fobj), \quad (4)

Fn_i = \frac{\min(Fn_i)}{Fn_i}, \quad (5)

where fobj is the vector of objective values, in which fobj_i is the objective value of the i’th member. An m-variable problem has an m-dimensional search space; therefore, the search space has m coordinate axes, one per variable, and each member of the population has a value on each axis. For each member of the population, fixed points are defined on the right and on the left along each axis. The fixed points for a member are those members that have a better objective value than that member. As a result, two separate forces, one from the right and one from the left, are applied to each member along each axis; these are determined using Equations (6) and (7).
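The normalization and spring-constant updates of Equations (3)–(5) could be sketched as follows; the small epsilon guarding the divisions is an assumption of this sketch, since the text does not specify how ties or zero values are handled.

```python
import numpy as np

def normalize_objective(fobj):
    """Equations (4)-(5) as read from the text: shift the objective vector so
    the best value is zero, then normalize as min(Fn)/Fn elementwise."""
    eps = 1e-12                       # guard against division by zero (assumed)
    Fn = fobj - fobj.min() + eps      # Equation (4)
    return Fn.min() / Fn              # Equation (5); the best member maps to 1

def spring_constants(Fn, K_max=1.0):
    """Equation (3): K_ij = K_max * |Fn_i - Fn_j| / max(Fn_i, Fn_j)."""
    diff = np.abs(Fn[:, None] - Fn[None, :])
    denom = np.maximum(Fn[:, None], Fn[None, :])
    return K_max * diff / denom
```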
F_{totalR}^{j,d} = \sum_{i=1}^{n_R^d} K_{i,j} \, x_{i,j}^d \quad (6)

F_{totalL}^{j,d} = \sum_{l=1}^{n_L^d} K_{l,j} \, x_{l,j}^d \quad (7)

where F_{totalR}^{j,d} and F_{totalL}^{j,d} are, respectively, the total forces exerted from the right and from the left on the d’th dimension of the j’th member of the population matrix, n_R^d is the number of fixed points on the right along the d’th axis, and n_L^d is the number of fixed points on the left along the d’th axis. Now, according to Hooke’s law, the displacement to the right and to the left for each member along each axis can be calculated using Equations (8) and (9).
dX_R^{j,d} = \frac{F_{totalR}^{j,d}}{K_{equalR}^{j}} \quad (8)

dX_L^{j,d} = \frac{F_{totalL}^{j,d}}{K_{equalL}^{j}} \quad (9)

where dX_R^{j,d} is the displacement to the right for the j’th member along the d’th axis, dX_L^{j,d} is the displacement to the left for the j’th member along the d’th axis, and K_{equalR}^{j} and K_{equalL}^{j} are the equivalent spring constants acting on the j’th member from the right and the left, respectively. The final displacement can then be calculated by merging Equations (8) and (9) according to Equation (10).
dX^{j,d} = dX_R^{j,d} + dX_L^{j,d} \quad (10)

where dX^{j,d} is the final displacement of the j’th member along the d’th axis. After the displacement is determined, the new position of each member in the search space is updated using Equation (11).
X^{j,d} = X_0^{j,d} + r_1 \times dX^{j,d} \quad (11)

where X_0^{j,d} is the previous position of the j’th member in the d’th dimension, and r_1 is a random number in the range [0, 1].
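A sketch of the force and displacement computations of Equations (6)–(11) is given below, under this author's reading of the text: members with a better objective value act as fixed points, and the equivalent stiffness K_equal on each side is taken as the sum of the acting spring constants, which the text names but does not define explicitly. Both choices are assumptions of this sketch.

```python
import numpy as np

def displacement(X, Fn, K, rng=None):
    """Move every member per Equations (6)-(11). X is the N x m population,
    Fn the normalized objective (larger = better here), and K the N x N
    spring constant matrix from Equation (3)."""
    rng = np.random.default_rng() if rng is None else rng
    N, m = X.shape
    X_new = X.copy()
    for j in range(N):
        better = Fn > Fn[j]                    # fixed points for member j
        for d in range(m):
            right = better & (X[:, d] > X[j, d])
            left = better & (X[:, d] < X[j, d])
            F_R = np.sum(K[right, j] * (X[right, d] - X[j, d]))  # Eq. (6)
            F_L = np.sum(K[left, j] * (X[left, d] - X[j, d]))    # Eq. (7)
            K_R = np.sum(K[right, j]) or 1.0   # equivalent stiffness (assumed)
            K_L = np.sum(K[left, j]) or 1.0
            dX = F_R / K_R + F_L / K_L         # Eqs. (8)-(10)
            X_new[j, d] = X[j, d] + rng.random() * dX            # Eq. (11)
    return X_new
```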

2.2. BSSA Implementation

In the BSSA, the population of the algorithm is first initialized randomly. Then, in each iteration, the position of each member of the population is updated according to Equations (6)–(11), and the spring constants are updated according to Equation (3). This process is repeated until the algorithm reaches its stopping condition. The steps of implementing the BSSA can therefore be expressed as follows; a code sketch of the full loop is given after the listing:
Start
Step 1: Define the problem and determine the search space of the problem.
Step 2: Create the initial population randomly.
Step 3: Evaluate and normalize the objective function.
Step 4: Update the spring constant.
Step 5: Calculate the amount of left and right displacement according to Hooke’s law.
Step 6: Calculate final displacement.
Step 7: Update population.
Step 8: Repeat steps 3–7 until the stop condition is reached.
Step 9: Return best solution for objective function.
End
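The steps above could be assembled into a compact loop as follows. This sketch reuses the init_population, normalize_objective, spring_constants, and displacement helpers sketched earlier; the fixed iteration budget as the stopping condition and the bound clipping are assumptions of this sketch.

```python
import numpy as np

def ssa(fobj, m, lower, upper, N=30, max_iter=100, seed=0):
    """Minimization loop following Steps 1-9 of the listing above."""
    rng = np.random.default_rng(seed)
    X = init_population(N, m, lower, upper, rng)        # Steps 1-2
    best_x, best_f = None, np.inf
    for _ in range(max_iter):                           # Step 8
        f = np.apply_along_axis(fobj, 1, X)             # Step 3: evaluate
        i = f.argmin()
        if f[i] < best_f:
            best_x, best_f = X[i].copy(), f[i]
        Fn = normalize_objective(f)                     # Step 3: normalize
        K = spring_constants(Fn)                        # Step 4
        X = displacement(X, Fn, K, rng)                 # Steps 5-7
        X = np.clip(X, lower, upper)                    # stay in the search space
    return best_x, best_f                               # Step 9
```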

3. Binary Spring Search Algorithm (BSSA)

In this section, a binary version of the spring search algorithm is developed. In the binary version of the SSA, real values are represented in binary using zeros and ones. The search space is discrete, and a suitable number of bits must be used to represent each variable along its axis. Given that the binary version has only two states (zero and one), the concept of displacement is redefined as switching a bit from zero to one or from one to zero. To implement this concept of displacement in the binary version, a probability function is used. Based on the value of this probability function, the new position of each member in each dimension of the problem may change or remain unchanged. Therefore, in the BSSA, dX^{j,d} is interpreted as the probability of X^{j,d} becoming zero or one. The calculation of the spring forces, the spring constants, and the displacement of each member, as well as the update steps, are the same in the binary and real versions; the two versions differ only in how the population is updated. Given that the probability function must return a number between zero and one, the probability of changing the position in each dimension of each member is calculated using Equation (12).
S(dX^{j,d}(t)) = |\tanh(dX^{j,d}(t))| \quad (12)
Therefore, based on the values of the probability function, the new position of each dimension of each member is updated using Equation (13).
X^{j,d}(t+1) = \begin{cases} \mathrm{complement}(X^{j,d}(t)), & rand < S(dX^{j,d}(t)) \\ X^{j,d}(t), & \text{otherwise} \end{cases} \quad (13)
According to Equation (13), each member of the population changes its position with a certain probability: the higher the value of dX^{j,d}, the higher the probability of member j moving in dimension d. Here, rand is a random number in the range [0, 1].
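A sketch of this binary update is shown below. The layout of the bit array (one row per member, the bits of all variables concatenated) is an assumption of this sketch; Equations (12) and (13) themselves are applied as stated.

```python
import numpy as np

def binary_update(X_bin, dX, rng=None):
    """X_bin: N x n_bits array of 0/1 values; dX: matching displacement array."""
    rng = np.random.default_rng() if rng is None else rng
    S = np.abs(np.tanh(dX))                  # Equation (12): flip probability
    flip = rng.random(X_bin.shape) < S       # Equation (13): rand < S
    return np.where(flip, 1 - X_bin, X_bin)  # complement the flipped bits
```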
The different steps of the BSSA are shown as flowcharts in Figure 1.
In order to clearly illustrate how the proposed method seeks the optimal solution, let us consider the following standard function:
f(x) = \sum_{i=1}^{2} x_i^2
This problem is solved in two dimensions with 50 iterations and 10 bodies as the population. In the first iteration, the members of the population are placed randomly in the problem space. As observed in Figure 2, the proposed algorithm searches the space extensively in the initial iterations, covering the defined space of the problem with high search capacity. Over time, the algorithm converges towards an optimal solution, and the members of the population concentrate in its vicinity, demonstrating the algorithm's capacity for rapid exploration of the optimal solution. The numerical results of the test function at different iterations are shown in Table 1.
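For illustration, the real-valued sketch above can be run on this two-dimensional sphere function with the same budget as the example (10 members, 50 iterations). Since the run reported in Table 1 is a single random sample, exact trajectories will differ.

```python
import numpy as np

# Reuses the ssa() sketch defined earlier.
sphere = lambda x: float(np.sum(x ** 2))
x_best, f_best = ssa(sphere, m=2, lower=-100.0, upper=100.0, N=10, max_iter=50)
print(x_best, f_best)   # expected to approach (0, 0) with f near 0
```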

4. Features of the BSSA

In the proposed BSSA, a new optimizer was designed by simulating the spring force law. In the BSSA, the population members are a set of interconnected weights that move through the problem search space, and the spring force is the tool for exchanging information between them. Each object has only a rough understanding of its surroundings, as influenced by the positions of the other objects, so the algorithm must be designed so that the positions of the population members improve over successive iterations. This is accomplished by adjusting the spring stiffness coefficients during the iterations of the algorithm. A spring with a higher stiffness coefficient connects to objects with a better fitness value and draws other objects towards it. Objects that are in better positions should take shorter and slower steps; to achieve this, springs with higher stiffness coefficients are attributed to better weights. This makes the weights with better fitness values search the space around them more carefully. The stiffness coefficients of the springs, and as a result the spring forces, decrease over time: as objects accumulate around better positions, the space needs to be searched with smaller steps and more precision. Figure 3 shows a visualization of the forces applied to the system and the performance of the algorithm.

5. Computational Complexity

The time and space complexities of the proposed BSSA are discussed in this section.

5.1. Time Complexity

The initialization process of the BSSA takes O(n) time. Converting the algorithm into its binary version takes O(c) time. Processing the population members and evaluating the objective function require O(p) and O(f) time, respectively. The whole process is simulated until a maximum number of iterations is reached, which requires O(Max_iterations) time. Overall, the time complexity of the proposed BSSA is O(n + c·p·f·Max_iterations).

5.2. Space Complexity

The space complexity of the proposed BSSA is O(n), which is the space required by its initialization process.

6. Exploration and Exploitation of the BSSA

The two most important indicators recommended for evaluating the performance of optimization algorithms are exploitation power and exploration power. The exploitation index measures the ability of an optimization algorithm to converge on the optimal solution: an algorithm that provides a solution closer to the global optimum has higher exploitation power. The exploration index measures the power of an optimization algorithm to search the defined search space of a specific optimization problem thoroughly. This indicator is especially important for optimization problems that have several local optima: an algorithm that can effectively scan the entire search space is able to pull the population away from local optima and direct it toward the genuinely promising regions. Accordingly, optimization algorithms should have more exploratory power in the early iterations, in order to examine different areas of the search space, and then, as the final iterations approach, shift toward exploitation in order to refine the solution [70,71].
The BSSA is able to scan the search space accurately thanks to its population members. The main parameter used in the BSSA to maintain the balance between the two important indicators of exploitation and exploration is the spring constant. The spring constant equation in the BSSA is designed to take large values in the initial iterations; as a result, according to Hooke's law and the spring force, different areas of the search space are scanned by the members of the population. Then, as the algorithm approaches its final iterations, the spring constant takes smaller values, and the algorithm searches the promising areas more carefully in order to provide the best possible solution. This procedure, implemented through the spring constant of Equation (3), adjusts the step sizes and maintains the balance between exploitation power and exploration power.

7. Simulation Results

The performance of the BSSA was evaluated on 23 benchmark fitness test functions [72], as defined in Table 2, Table 3 and Table 4.
The performance of the BSSA was compared with other algorithms, such as Binary Genetic Algorithm (BGA) [73], Binary Particle Swarm Optimization (BPSO) [65], Binary Gravitational Search Algorithm (BGSA) [74], Binary Bat Algorithm (BBA) [75], Binary Dragonfly Algorithm (BDA) [76], and Binary Grasshopper Optimization Algorithm (BGOA) [77].
Each optimization algorithm was applied independently 20 times, and the results were averaged. As can be seen from Table 5, Table 6 and Table 7, the BSSA provides better results for most functions.
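The reporting protocol can be sketched as below, continuing the sphere example: 20 independent runs per function, with the mean ("Ave") and standard deviation ("std") of the best objective value reported, matching the row labels of Tables 5–7. Seeding each run by its index is an assumption of this sketch.

```python
import numpy as np

# Reuses the ssa() and sphere definitions sketched earlier.
best_values = np.array([
    ssa(sphere, m=2, lower=-100.0, upper=100.0, N=10, max_iter=50, seed=s)[1]
    for s in range(20)                  # 20 independent runs
])
print("Ave = %.2e, std = %.2e" % (best_values.mean(), best_values.std()))
```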
The seven objective functions F1–F7 are suitable for evaluating the exploitation rate. Based on the optimization results presented in Table 5, the proposed algorithm performs best on these functions. The objective functions F8–F23 were selected to study and analyze the exploration index. The optimization results for the objective functions in Table 6 and Table 7 show the exploration capability of the algorithm.

Statistical Testing

Although the simulation and optimization results, reported as the average of the best solutions and the standard deviation, indicate the superiority of the proposed BSSA, these results alone are not sufficient to guarantee it. Although all algorithms were run independently 20 times, it is still possible, albeit with low probability, that the observed advantage occurred by chance. Therefore, the Friedman rank test [78] was applied to analyze the results further. This statistical test covers two designs with the same goal: either one quantitative variable is measured two or more times on the same sample, or several quantitative variables are measured on the same sample; in both cases, the Friedman test compares the distributions of the measurements. The results of this test are presented in Table 8 and are reported separately for the three groups of objective functions (unimodal, multimodal, and multimodal with fixed dimensions) as well as for all test functions together. Based on these results, the proposed algorithm is positioned first in the Friedman rank test for all three groups of test functions. Furthermore, the overall results for all test functions (F1–F23) show that the BSSA is significantly superior to the other algorithms.
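As a hedged illustration of such a test, SciPy's implementation of the Friedman test can be applied to per-function results of several algorithms; the numbers below are placeholders, not the paper's data, and each argument holds one algorithm's measurements over the same set of test functions.

```python
from scipy.stats import friedmanchisquare

# Placeholder per-function results for three hypothetical algorithms.
alg_a = [6.7e-35, 7.8e-45, 2.6e-25, 4.7e-26, 5.4e-01]
alg_b = [5.7e-28, 6.2e-40, 2.1e-19, 4.3e-18, 5.1e+00]
alg_c = [4.6e-23, 1.2e-34, 1.0e-14, 2.0e-14, 2.8e+01]
stat, p = friedmanchisquare(alg_a, alg_b, alg_c)
print(f"chi-square = {stat:.3f}, p = {p:.4f}")  # low p: ranks differ significantly
```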

8. Conclusions

There are many optimization problems that must be solved using a suitable method. Different heuristic optimization algorithms have been proposed to overcome the shortcomings of traditional methods, such as Linear Programming (LP), non-linear programming, and differential programming. Most of these algorithms are population-based and exploit the randomness of natural phenomena. A heuristic optimization algorithm called the Binary Spring Search Algorithm (BSSA), based on Hooke's spring force law, is proposed. The proposal was mathematically modeled, and its efficiency was evaluated using 23 standard test functions. These test functions were selected from three different types, unimodal, multimodal, and multimodal with fixed dimensions, to evaluate different aspects of the proposed algorithm. Seven binary optimization algorithms, including the binary genetic algorithm, binary particle swarm optimization, the binary gravitational search algorithm, the binary dragonfly algorithm, the binary bat algorithm, and the binary grasshopper optimization algorithm, were used as competitors to evaluate the performance of the proposed algorithm. Compared to these algorithms, the BSSA produces nearly optimal solutions in all cases. Friedman's rank test was used to further analyze the performance of the BSSA. The results obtained from this test show the clear superiority of the proposed algorithm for all three types of test functions. The overall results for all test functions (F1–F23) show that the BSSA is significantly superior to the other algorithms and ranks first among them. Based on the optimization results and the Friedman rank test results, it is clear that the proposed BSSA performs well in solving optimization problems and is more competitive than similar algorithms.

Author Contributions

Conceptualization, M.D., Z.M., and A.D.; methodology, M.D., R.A.R.-M.; software, M.D., Z.M., A.D., G.D., and N.N.; validation, O.P.M., N.N., and A.E.; formal analysis, O.P.M., R.M.-M.; investigation, R.A.R.-M.; resources, J.M.G., A.E.; data curation, R.M.-M.; writing—original draft preparation, M.D., Z.M., and A.D.; writing—review and editing, O.P.M., R.A.R.-M., R.M.-M., G.D., and J.M.G.; visualization; supervision, M.D. and Z.M.; project administration, A.D.; funding acquisition, R.M.-M., R.A.R.-M. All authors have read and agreed to the published version of the manuscript.

Funding

The current project was funded by Tecnológico de Monterrey and FEMSA Foundation (grant CAMPUSCITY project).

Conflicts of Interest

The authors declare no conflict of interest. The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Dehghani, M.; Samet, H. Momentum search algorithm: A new meta-heuristic optimization algorithm inspired by momentum conservation law. SN Appl. Sci. 2020, 2, 1720. [Google Scholar] [CrossRef]
  2. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Samet, H.; Sotelo, C.; Sotelo, D.; Ehsanifar, A.; Malik, O.P.; Guerrero, J.M.; Dhiman, G.; et al. DM: Dehghani Method for Modifying Optimization Algorithms. Appl. Sci. 2020, 10, 7683. [Google Scholar] [CrossRef]
  3. Dehghani, M.; Montazeri, Z.; Dhiman, G.; Malik, O.P.; Morales-Menendez, R.; Ramirez-Mendoza, R.A.; Dehghani, A.; Guerrero, J.M.; Parra-Arroyo, L. A Spring Search Algorithm Applied to Engineering Optimization Problems. Appl. Sci. 2020, 10, 6173. [Google Scholar] [CrossRef]
  4. Dehghani, M.; Montazeri, Z.; Malik, O. Energy Commitment: A Planning of Energy Carrier Based on Energy Consumption. Electr. Eng. Electromech. 2019, 4, 69–72. [Google Scholar] [CrossRef]
  5. Dehghani, M.; Mardaneh, M.; Malik, O.P.; Guerrero, J.M.; Sotelo, C.; Sotelo, D.; Nazari-Heris, M.; Al-Haddad, K.; Ramirez-Mendoza, R.A. Genetic Algorithm for Energy Commitment in a Power System Supplied by Multiple Energy Carriers. Sustainability 2020, 12, 10053. [Google Scholar] [CrossRef]
  6. Ehsanifar, A.; Dehghani, M.; Allahbakhshi, M. Calculating the leakage inductance for transformer inter-turn fault detection using finite element method. In Proceedings of the 2017 Iranian Conference on Electrical Engineering (ICEE), Tehran, Iran, 2–4 May 2017; pp. 1372–1377. [Google Scholar]
  7. Dehghani, M.; Montazeri, Z.; Malik, O.P. Optimal Sizing and Placement of Capacitor Banks and Distributed Generation in Distribution Systems Using Spring Search Algorithm. Int. J. Emerg. Electr. Power Syst. 2020, 21. [Google Scholar] [CrossRef] [Green Version]
  8. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M.; Dhiman, G. A New Methodology Called Dice Game Optimizer for Capacitor Placement in Distribution Systems. Electr. Eng. Electromech. 2020, 1, 61–64. [Google Scholar] [CrossRef] [Green Version]
  9. Dehbozorgi, S.; Ehsanifar, A.; Montazeri, Z.; Dehghani, M.; Seifi, A. Line loss reduction and voltage profile improvement in radial distribution networks using battery energy storage system. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22–22 December 2017; pp. 0215–0219. [Google Scholar]
  10. Montazeri, Z.; Niknam, T. Optimal Utilization of Electrical Energy from Power Plants Based on Final Energy Consumption Using Gravitational Search Algorithm. Electr. Eng. Electromech. 2018, 4, 70–73. [Google Scholar] [CrossRef] [Green Version]
  11. Dehghani, M.; Mardaneh, M.; Montazeri, Z.; Ehsanifar, A.; Ebadi, M.J.; Grechko, O. Spring Search Algorithm for Simultaneous Placement of Distributed Generation and Capacitors. Electr. Eng. Electromech. 2018, 6, 68–73. [Google Scholar] [CrossRef]
  12. Pelusi, D.; Mascella, R.; Tallini, L.G. A Fuzzy Gravitational Search Algorithm to Design Optimal IIR Filters. Energies 2018, 11, 736. [Google Scholar] [CrossRef] [Green Version]
  13. Dehghani, M.; Montazeri, Z.; Ehsanifar, A.; Seifi, A.; Ebadi, M.; Grechko, O.M. Planning of Energy Carriers Based on Final Energy Consumption Using Dynamic Programming and Particle Swarm Optimization. Electr. Eng. Electromech. 2018, 5, 62–71. [Google Scholar] [CrossRef] [Green Version]
  14. Montazeri, Z.; Niknam, T. Energy carriers management based on energy consumption. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22–22 December 2017; pp. 0539–0543. [Google Scholar]
  15. Pelusi, D.; Mascella, R.; Tallini, L.G.; Nayak, J.; Naik, B.; Deng, Y. Improving exploration and exploitation via a Hyperbolic Gravitational Search Algorithm. Knowl. Based Syst. 2020, 193, 105404. [Google Scholar] [CrossRef]
  16. Pelusi, D.; Mascella, R.; Tallini, L.G.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization algorithm with hybrid search phase. Knowl. Based Syst. 2020, 191, 105277. [Google Scholar] [CrossRef]
  17. Pelusi, D.; Mascella, R.; Tallini, L.G.; Nayak, J.; Naik, B.; Abraham, A. Neural network and fuzzy system for the tuning of Gravitational Search Algorithm parameters. Expert Syst. Appl. 2018, 102, 234–244. [Google Scholar] [CrossRef]
  18. Gigerenzer, G.; Todd, P.M. Simple Heuristics that Make Us Smart; Oxford University Press: New York, NY, USA, 1999. [Google Scholar]
  19. Lazar, A. Heuristic knowledge discovery for archaeological data using genetic algorithms and rough sets. Heuristics Optim. Knowl. Discov. 2002, 263. [Google Scholar] [CrossRef]
  20. Gigerenzer, G.; Gaissmaier, W. Heuristic Decision Making. Annu. Rev. Psychol. 2011, 62, 451–482. [Google Scholar] [CrossRef] [Green Version]
  21. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680. [Google Scholar]
  22. Shah-Hosseini, H. Principal components analysis by the galaxy-based search algorithm: A novel metaheuristic for continuous optimisation. Int. J. Comput. Sci. Eng. 2011, 6, 132. [Google Scholar] [CrossRef]
  23. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  24. Moghaddam, F.F.; Moghaddam, R.F.; Cheriet, M. Curved space optimization: A random search based on general relativity theory. arXiv 2012, arXiv:1208.2214. [Google Scholar]
  25. Alatas, B. ACROA: Artificial Chemical Reaction Optimization Algorithm for global optimization. Expert Syst. Appl. 2011, 38, 13170–13180. [Google Scholar] [CrossRef]
  26. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112, 283–294. [Google Scholar] [CrossRef]
  27. Du, H.; Wu, X.; Zhuang, J. Small-World Optimization Algorithm for Function Optimization. In Proceedings of the Second International Conference on Advances in Natural Computation; Springer: Berlin/Heidelberg, Germany, 2006; pp. 264–273. [Google Scholar]
  28. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
  29. Formato, R.A. Central force optimization: A new nature inspired computational framework for multidimensional search and optimization. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2007); Springer: Berlin/Heidelberg, Germany, 2008; pp. 221–238. [Google Scholar]
  30. Erol, O.K.; Eksin, I. A new optimization method: Big Bang–Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  31. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; IEEE Service Center: Piscataway, NJ, USA; Volume 4, pp. 1942–1948. [Google Scholar]
  32. Yang, X.-S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  33. Karaboga, D.; Basturk, B. Artificial bee colony (ABC) optimization algorithm for solving constrained optimization, problems. In Proceedings of the 12th International Fuzzy Systems Association World Congress on Foundations of Fuzzy Logic and Soft Computing; Springer: Berlin/Heidelberg, Germany, 2007. [Google Scholar]
  34. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Ramirez-Mendoza, R.A.; Matas, J.; Vasquez, J.C.; Parra-Arroyo, L. A New “Doctor and Patient” Optimization Algorithm: An Application to Energy Commitment Problem. Appl. Sci. 2020, 10, 5791. [Google Scholar] [CrossRef]
  35. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  36. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70. [Google Scholar] [CrossRef]
  37. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Mendoza, R.R.; Samet, H.; Guerrero, J.M.; Dhiman, G. MLO: Multi Leader Optimizer. Int. J. Intell. Eng. Syst. 2020, 13, 364–373. [Google Scholar] [CrossRef]
  38. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Malik, O.P. GO: Group Optimization. GAZI Univ. J. Sci. 2020, 33, 381–392. [Google Scholar] [CrossRef]
  39. Mucherino, A.; Seref, O. Monkey Search: A Novel Metaheuristic Search for Global Optimization. AIP Conf. Proc. 2007, 953, 162–173. [Google Scholar]
  40. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  41. Neshat, M.; Sepidnam, G.; Sargolzaei, M.; Toosi, A.N. Artificial fish swarm algorithm: A survey of the state-of-the-art, hybridization, combinatorial and indicative applications. Artif. Intell. Rev. 2014, 42, 965–997. [Google Scholar] [CrossRef]
  42. Oftadeh, R.; Mahjoob, M.J.; Shariatpanahi, M. A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search. Comput. Math. Appl. 2010, 60, 2087–2098. [Google Scholar] [CrossRef] [Green Version]
  43. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  44. Dhiman, G.; Kumar, V. Emperor penguin optimizer: A bio-inspired algorithm for engineering problems. Knowl. Based Syst. 2018, 159, 20–50. [Google Scholar] [CrossRef]
  45. Shiqin, Y.; Jianjun, J.; Guangxing, Y. A dolphin partner optimization. In Proceedings of the Global Congress on Intelligent Systems, Xiamen, China, 19–21 May 2009; pp. 124–128. [Google Scholar]
  46. Dehghani, M.; Mardaneh, M.; Malik, O.P.; NouraeiPour, S.M. DTO: Donkey Theorem Optimization. In Proceedings of the 2019 27th Iranian Conference on Electrical Engineering (ICEE), Yazd, Iran, 30 April–2 May 2019; pp. 1855–1859. [Google Scholar]
  47. Dhiman, G.; Garg, M.; Nagar, A.; Kumar, V.; Dehghani, M. A novel algorithm for global optimization: Rat Swarm Optimizer. J. Ambient. Intell. Humaniz. Comput. 2020. [Google Scholar] [CrossRef]
  48. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  49. Zhang, H.; Hui, Q. A Coupled Spring Forced Bat Searching Algorithm: Design, Analysis and Evaluation. In Proceedings of the 2020 American Control Conference (ACC), Denver, CO, USA, 1–3 July 2020; pp. 5016–5021. [Google Scholar]
  50. Dehghani, M.; Mardaneh, M.; Malik, O. FOA: ‘Following’ Optimization Algorithm for solving power engineering optimization problems. J. Oper. Autom. Power Eng. 2020, 8, 57–64. [Google Scholar]
  51. Dehghani, M.; Mardaneh, M.; Guerrero, J.M.; Malik, O.P.; Kumar, V. Football Game Based Optimization: An Application to Solve Energy Commitment Problem. Int. J. Intell. Eng. Syst. 2020, 13, 514–523. [Google Scholar] [CrossRef]
  52. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Dhiman, G.; Kumar, V. BOSA: Binary Orientation Search Algorithm. Int. J. Innov. Technol. Explor. Eng. 2019, 9, 5306–5310. [Google Scholar]
  53. Dehghani, M.; Montazeri, Z.; Givi, H.; Guerrero, J.M.; Dhiman, G. Darts Game Optimizer: A New Optimization Technique Based on Darts Game. Int. J. Intell. Eng. Syst. 2020, 13, 286–294. [Google Scholar] [CrossRef]
  54. Dehghani, M.; Montazeri, Z.; Malik, O.P.; Ehsanifar, A.; Dehghani, A. OSA: Orientation Search Algorithm. Int. J. Ind. Electron. Control Optim. 2019, 2, 99–112. [Google Scholar]
  55. Mohammad, D.; Zeinab, M.; Malik, O.P.; Givi, H.; Guerrero, J.M. Shell Game Optimization: A Novel Game-Based Algorithm. Int. J. Intell. Eng. Syst. 2020, 13, 246–255. [Google Scholar] [CrossRef]
  56. Dehghani, M.; Montazeri, Z.; Malik, O.P. DGO: Dice Game Optimizer. GAZI Univ. J. Sci. 2019, 32, 871–882. [Google Scholar] [CrossRef] [Green Version]
  57. Dehghani, M.; Montazeri, Z.; Saremi, S.; Dehghani, A.; Malik, O.P.; Al-Haddad, K.; Guerrero, J.M. HOGO: Hide Objects Game Optimization. Int. J. Intell. Eng. Syst. 2020, 13, 216–225. [Google Scholar] [CrossRef]
  58. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  59. Deng, W.; Liu, H.; Xu, J.; Zhao, H.; Song, Y. An Improved Quantum-Inspired Differential Evolution Algorithm for Deep Belief Network. IEEE Trans. Instrum. Meas. 2020, 69, 7319–7327. [Google Scholar] [CrossRef]
  60. Das, S.; Suganthan, P.N. Differential Evolution: A Survey of the State-of-the-Art. IEEE Trans. Evol. Comput. 2011, 15, 4–31. [Google Scholar] [CrossRef]
  61. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef] [Green Version]
  62. Fogel, L.J.; Owens, A.J.; Walsh, M.J. Artificial Intelligence through Simulated Evolution; Wiley: New York, NY, USA, 1966. [Google Scholar]
  63. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52. [Google Scholar] [CrossRef]
  64. Koza, J.R. Genetic programming as a means for programming computers by natural selection. Stat. Comput. 1994, 4, 87–112. [Google Scholar] [CrossRef]
  65. Mirjalili, S. Particle Swarm Optimisation. In Evolutionary Algorithms and Neural Networks; Springer: Berlin/Heidelberg, Germany, 2019; pp. 15–31. [Google Scholar]
  66. Tarasewich, P.; McMullen, P.R. Swarm intelligence: Power in numbers. Commun. ACM 2002, 45, 62–67. [Google Scholar] [CrossRef]
  67. Kohonen, T. Self-Organization and Associative Memory; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012; Volume 8. [Google Scholar]
  68. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Nouri, N.; Seifi, A. BSSA: Binary spring search algorithm. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22–22 December 2017; pp. 220–224. [Google Scholar]
  69. Halliday, D.; Resnick, R.; Walker, J. Fundamentals of Physics; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  70. Eiben, A.E.; Schippers, C.A. On Evolutionary Exploration and Exploitation. Fundam. Inform. 1998, 35, 35–50. [Google Scholar] [CrossRef]
  71. Lynn, N.; Suganthan, P.N. Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm Evol. Comput. 2015, 24, 11–24. [Google Scholar] [CrossRef]
  72. Zhang, L.; Tang, Y.; Hua, C.; Guan, X. A new particle swarm optimization algorithm with adaptive inertia weight based on Bayesian techniques. Appl. Soft Comput. 2015, 28, 138–149. [Google Scholar] [CrossRef]
  73. Castillo, O.; Aguilar, L.T. Genetic Algorithms. In Type-2 Fuzzy Logic in Control of Nonsmooth Systems; Springer: Berlin/Heidelberg, Germany, 2019; pp. 23–39. [Google Scholar]
  74. Bala, I.; Yadav, A. Gravitational Search Algorithm: A State-of-the-Art Review. In Harmony Search and Nature Inspired Optimization Algorithms; Springer: Berlin/Heidelberg, Germany, 2019; pp. 27–37. [Google Scholar]
  75. Mirjalili, S.; Mirjalili, S.M.; Yang, X.-S. Binary bat algorithm. Neural Comput. Appl. 2014, 25, 663–681. [Google Scholar] [CrossRef]
  76. Mafarja, M.M.; Eleyan, D.; Jaber, I.; Hammouri, A.; Mirjalili, S. Binary Dragonfly Algorithm for Feature Selection. In Proceedings of the 2017 International Conference on New Trends in Computing Sciences (ICTCS), Amman, Jordan, 11–13 October 2017; pp. 12–17. [Google Scholar]
  77. Mafarja, M.; Aljarah, I.; Faris, H.; Hammouri, A.I.; Ala’M, A.-Z.; Mirjalili, S. Binary grasshopper optimisation algorithm approaches for feature selection problems. Expert Syst. Appl. 2019, 117, 267–286. [Google Scholar] [CrossRef]
  78. Daniel, W.W. Friedman two-way analysis of variance by ranks. In Applied Nonparametric Statistics; PWS-Kent: Boston, MA, USA, 1990; pp. 262–274. [Google Scholar]
Figure 1. The flowchart of the Binary Spring Search Algorithm (BSSA).
Figure 2. Search space for the test function in the BSSA.
Figure 3. Each object is displaced according to the spring forces applied to it in the BSSA.
Table 1. Numerical results for the test function at different iterations.

| Iteration = 1 | | | Iteration = 2 | | | Iteration = 3 | | |
| X1 | X2 | F(x) | X1 | X2 | F(x) | X1 | X2 | F(x) |
| 4.13E+01 | −3.70E+01 | 3.07E+03 | −1.90E+01 | −9.82E+00 | 4.59E+02 | 3.01E+01 | −9.82E+00 | 1.00E+03 |
| −4.72E+01 | −8.09E+01 | 8.77E+03 | −3.49E+01 | −6.17E+01 | 5.02E+03 | 2.47E+00 | −4.32E+01 | 1.87E+03 |
| 1.13E+01 | 3.96E+01 | 1.70E+03 | −3.69E+01 | −2.55E+01 | 2.01E+03 | −3.44E+01 | −1.65E+01 | 1.46E+03 |
| 7.58E+01 | 1.93E+01 | 6.11E+03 | 2.60E+01 | 1.93E+01 | 1.05E+03 | −3.62E+00 | 7.90E+00 | 7.55E+01 |
| −3.72E+01 | −2.03E+01 | 1.80E+03 | 9.97E+00 | −1.65E+01 | 3.73E+02 | −1.10E+01 | −4.78E+00 | 1.44E+02 |
| 4.90E+01 | 7.61E+01 | 8.20E+03 | 6.19E+00 | 7.57E+01 | 5.77E+03 | 6.19E+00 | 2.10E+01 | 4.78E+02 |
| 3.68E+01 | 5.27E+01 | 4.13E+03 | 7.16E+00 | 2.89E+01 | 8.85E+02 | −1.65E+01 | 2.13E+01 | 7.24E+02 |
| −9.73E+01 | −8.21E+01 | 1.62E+04 | −2.97E+01 | −6.76E+01 | 5.45E+03 | −2.16E+00 | −8.32E+00 | 7.39E+01 |
| −9.95E+00 | −8.96E+01 | 8.13E+03 | −9.95E+00 | −2.38E+01 | 6.63E+02 | 2.23E+01 | −1.52E+01 | 7.29E+02 |
| 3.74E+01 | −9.96E+01 | 1.13E+04 | −2.04E+01 | −2.28E+01 | 9.37E+02 | 9.06E+00 | 3.67E+01 | 1.43E+03 |

| Iteration = 4 | | | Iteration = 5 | | | Iteration = 10 | | |
| X1 | X2 | F(x) | X1 | X2 | F(x) | X1 | X2 | F(x) |
| 2.41E+00 | 3.35E+01 | 1.13E+03 | −3.54E+01 | 1.38E+01 | 1.45E+03 | −5.37E+00 | 2.83E+00 | 3.68E+01 |
| −1.08E+01 | −3.98E+01 | 1.70E+03 | 1.16E+01 | −2.78E+01 | 9.10E+02 | −8.39E+00 | 2.79E−01 | 7.05E+01 |
| −3.47E+00 | 1.06E−01 | 1.20E+01 | 3.05E+01 | 1.06E−01 | 9.32E+02 | 7.56E+00 | 1.06E−01 | 5.71E+01 |
| 7.98E+00 | −1.21E+01 | 2.11E+02 | −7.97E+00 | 1.45E+01 | 2.75E+02 | 1.13E+01 | −9.93E−01 | 1.29E+02 |
| 2.27E+01 | −4.78E+00 | 5.40E+02 | 2.06E+01 | 1.54E+01 | 6.64E+02 | −4.43E+00 | 7.19E−01 | 2.02E+01 |
| −2.05E+01 | 1.93E+01 | 7.89E+02 | 1.26E+01 | −1.54E+01 | 3.96E+02 | 5.68E+00 | 1.53E+00 | 3.46E+01 |
| −1.43E+01 | −4.64E+00 | 2.26E+02 | 1.77E+01 | 2.61E+01 | 9.96E+02 | 7.86E+00 | −4.35E+00 | 8.08E+01 |
| −2.16E+00 | 1.64E+01 | 2.73E+02 | −2.16E+00 | −1.74E+01 | 3.08E+02 | −5.92E+00 | −8.32E+00 | 1.04E+02 |
| −1.52E+01 | 3.26E+01 | 1.30E+03 | 1.92E+01 | 2.42E+01 | 9.55E+02 | 1.20E−01 | −3.89E+00 | 1.51E+01 |
| −3.94E+01 | 5.31E+00 | 1.58E+03 | −1.23E+01 | −3.17E+01 | 1.16E+03 | −9.92E+00 | 7.80E−01 | 9.89E+01 |

| Iteration = 20 | | | Iteration = 30 | | | Iteration = 50 | | |
| X1 | X2 | F(x) | X1 | X2 | F(x) | X1 | X2 | F(x) |
| −6.49E−01 | −2.71E−02 | 4.21E−01 | −7.88E−04 | 8.56E−03 | 7.39E−05 | −2.14E−06 | −1.58E−03 | 2.49E−06 |
| 3.51E−01 | −3.30E−01 | 2.32E−01 | −2.12E−02 | −1.86E−02 | 7.95E−04 | −3.02E−06 | 1.64E−05 | 2.79E−10 |
| 8.31E−03 | 3.10E−02 | 1.03E−03 | 6.29E−03 | −3.22E−03 | 4.99E−05 | −1.50E−06 | 9.95E−06 | 1.01E−10 |
| 2.81E−01 | 1.34E−01 | 9.70E−02 | −1.65E−02 | −1.36E−02 | 4.60E−04 | −7.90E−06 | −7.15E−06 | 1.14E−10 |
| −9.40E−02 | −4.12E−02 | 1.05E−02 | 1.36E−02 | −1.55E−02 | 4.25E−04 | 3.76E−06 | 1.10E−04 | 1.21E−08 |
| −2.46E−01 | 1.16E−01 | 7.39E−02 | 2.40E−03 | 1.02E−03 | 6.82E−06 | −3.30E−05 | −2.45E−06 | 1.09E−09 |
| −1.76E−01 | 2.19E+00 | 4.84E+00 | −1.56E−02 | 2.82E−01 | 8.00E−02 | −1.99E−03 | 2.34E−01 | 5.50E−02 |
| −6.43E−02 | 2.03E−01 | 4.52E−02 | −4.90E−04 | −4.00E−02 | 1.60E−03 | 8.19E−06 | −9.05E−06 | 1.49E−10 |
| 4.60E−01 | 7.55E−02 | 2.18E−01 | 1.66E−02 | 2.87E−03 | 2.82E−04 | 2.56E−05 | −1.30E−05 | 8.27E−10 |
| −8.30E−02 | 3.75E−01 | 1.47E−01 | −3.15E−02 | −8.20E−03 | 1.06E−03 | −7.89E−04 | −2.61E−06 | 6.22E−07 |
Table 2. Unimodal test functions.

F_1(x) = \sum_{i=1}^{m} x_i^2, search range [-100, 100]^m
F_2(x) = \sum_{i=1}^{m} |x_i| + \prod_{i=1}^{m} |x_i|, search range [-10, 10]^m
F_3(x) = \sum_{i=1}^{m} \left( \sum_{j=1}^{i} x_j \right)^2, search range [-100, 100]^m
F_4(x) = \max\{ |x_i|, \ 1 \le i \le m \}, search range [-100, 100]^m
F_5(x) = \sum_{i=1}^{m-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right], search range [-30, 30]^m
F_6(x) = \sum_{i=1}^{m} ([x_i + 0.5])^2, search range [-100, 100]^m
F_7(x) = \sum_{i=1}^{m} i x_i^4 + \mathrm{random}(0, 1), search range [-1.28, 1.28]^m
Table 3. Multimodal test functions.

F_8(x) = \sum_{i=1}^{m} -x_i \sin(\sqrt{|x_i|}), search range [-500, 500]^m
F_9(x) = \sum_{i=1}^{m} \left[ x_i^2 - 10 \cos(2\pi x_i) + 10 \right], search range [-5.12, 5.12]^m
F_{10}(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{m} \sum_{i=1}^{m} x_i^2} \right) - \exp\left( \frac{1}{m} \sum_{i=1}^{m} \cos(2\pi x_i) \right) + 20 + e, search range [-32, 32]^m
F_{11}(x) = \frac{1}{4000} \sum_{i=1}^{m} x_i^2 - \prod_{i=1}^{m} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1, search range [-600, 600]^m
F_{12}(x) = \frac{\pi}{m} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{m-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_m - 1)^2 \right\} + \sum_{i=1}^{m} u(x_i, 10, 100, 4), where y_i = 1 + \frac{x_i + 1}{4} and
u(x_i, a, k, n) = k (x_i - a)^n for x_i > a; 0 for -a \le x_i \le a; k (-x_i - a)^n for x_i < -a, search range [-50, 50]^m
F_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{m-1} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_{i+1}) \right] + (x_m - 1)^2 \left[ 1 + \sin^2(2\pi x_m) \right] \right\} + \sum_{i=1}^{m} u(x_i, 5, 100, 4), search range [-50, 50]^m
Table 4. Multimodal test functions with fixed dimensions.

F_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right)^{-1}, search range [-65.53, 65.53]^2
F_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2, search range [-5, 5]^4
F_{16}(x) = 4 x_1^2 - 2.1 x_1^4 + \frac{1}{3} x_1^6 + x_1 x_2 - 4 x_2^2 + 4 x_2^4, search range [-5, 5]^2
F_{17}(x) = \left( x_2 - \frac{5.1}{4\pi^2} x_1^2 + \frac{5}{\pi} x_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8\pi} \right) \cos x_1 + 10, search range [-5, 10] × [0, 15]
F_{18}(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right], search range [-5, 5]^2
F_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \right), search range [0, 1]^3
F_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \right), search range [0, 1]^6
F_{21}(x) = -\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}, search range [0, 10]^4
F_{22}(x) = -\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}, search range [0, 10]^4
F_{23}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}, search range [0, 10]^4
Table 5. Results for the BSSA and other algorithms in unimodal test functions.

| | | BSSA | BGOA | BMOA | BBA | BDA | BGSA | BPSO | BGA |
| F1 | Ave | 6.74E−35 | 5.71E−28 | 4.61E−23 | 7.86E−10 | 2.81E−01 | 1.16E−16 | 4.98E−09 | 1.95E−12 |
| | std | 9.17E−36 | 8.31E−29 | 7.37E−23 | 8.11E−09 | 1.11E−01 | 6.10E−17 | 1.40E−08 | 2.01E−11 |
| F2 | Ave | 7.78E−45 | 6.20E−40 | 1.20E−34 | 5.99E−20 | 3.96E−01 | 1.70E−01 | 7.29E−04 | 6.53E−18 |
| | std | 3.48E−45 | 3.32E−40 | 1.30E−34 | 1.11E−17 | 1.41E−01 | 9.29E−01 | 1.84E−03 | 5.10E−17 |
| F3 | Ave | 2.63E−25 | 2.05E−19 | 1.00E−14 | 9.19E−05 | 4.31E+01 | 4.16E+02 | 1.40E+01 | 7.70E−10 |
| | std | 9.83E−27 | 9.17E−20 | 4.10E−14 | 6.16E−04 | 8.97E+00 | 1.56E+02 | 7.13E+00 | 7.36E−09 |
| F4 | Ave | 4.65E−26 | 4.32E−18 | 2.02E−14 | 8.73E−01 | 8.80E−01 | 1.12E+00 | 6.00E−01 | 9.17E+01 |
| | std | 4.68E−29 | 3.98E−19 | 2.43E−14 | 1.19E−01 | 2.50E−01 | 9.89E−01 | 1.72E−01 | 5.67E+01 |
| F5 | Ave | 5.41E−01 | 5.07E+00 | 2.79E+01 | 8.91E+02 | 1.18E+02 | 3.85E+01 | 4.93E+01 | 5.57E+02 |
| | std | 5.05E−02 | 4.90E−01 | 1.84E+00 | 2.97E+02 | 1.43E+02 | 3.47E+01 | 3.89E+01 | 4.16E+01 |
| F6 | Ave | 8.03E−24 | 7.01E−19 | 6.58E−01 | 8.18E−17 | 3.15E−01 | 1.08E−16 | 9.23E−09 | 3.15E−01 |
| | std | 5.22E−26 | 4.39E−20 | 3.38E−01 | 1.70E−18 | 9.98E−02 | 4.00E−17 | 1.78E−08 | 9.98E−02 |
| F7 | Ave | 3.33E−08 | 2.71E−05 | 7.80E−04 | 5.37E−01 | 2.02E−02 | 7.68E−01 | 6.92E−02 | 6.79E−04 |
| | std | 1.18E−06 | 9.26E−06 | 3.85E−04 | 1.89E−01 | 7.43E−03 | 2.77E+00 | 2.87E−02 | 3.29E−03 |
Table 6. Results for the BSSA and other algorithms in multimodal test functions.

| | | BSSA | BGOA | BMOA | BBA | BDA | BGSA | BPSO | BGA |
| F8 | Ave | −1.2E+04 | −8.76E+02 | −6.14E+02 | −4.69E+01 | −6.92E+02 | −2.75E+02 | −5.01E+02 | −5.11E+02 |
| | std | 9.14E−12 | 5.92E+01 | 9.32E+01 | 3.94E+01 | 9.19E+01 | 5.72E+01 | 4.28E+01 | 4.37E+01 |
| F9 | Ave | 8.76E−04 | 6.90E−01 | 4.34E−01 | 4.85E−02 | 1.01E+02 | 3.35E+01 | 1.20E−01 | 1.23E−01 |
| | std | 4.85E−02 | 4.81E−01 | 1.66E+00 | 3.91E+01 | 1.89E+01 | 1.19E+01 | 4.01E+01 | 4.11E+01 |
| F10 | Ave | 8.04E−20 | 8.03E−16 | 1.63E−14 | 2.83E−08 | 1.15E+00 | 8.25E−09 | 5.20E−11 | 5.31E−11 |
| | std | 3.34E−18 | 2.74E−14 | 3.14E−15 | 4.34E−07 | 7.87E−01 | 1.90E−09 | 1.08E−10 | 1.11E−10 |
| F11 | Ave | 4.23E−10 | 4.20E−05 | 2.29E−03 | 2.49E−05 | 5.74E−01 | 8.19E+00 | 3.24E−06 | 3.31E−06 |
| | std | 5.11E−07 | 4.73E−04 | 5.24E−03 | 1.34E−04 | 1.12E−01 | 3.70E+00 | 4.11E−05 | 4.23E−05 |
| F12 | Ave | 6.33E−08 | 5.09E−03 | 3.93E−02 | 1.34E−05 | 1.27E+00 | 2.65E−01 | 8.93E−08 | 9.16E−08 |
| | std | 4.71E−04 | 3.75E−03 | 2.42E−02 | 6.23E−04 | 1.02E+00 | 3.14E−01 | 4.77E−07 | 4.88E−07 |
| F13 | Ave | 0.00E+00 | 1.25E−08 | 4.75E−01 | 9.94E−08 | 6.60E−02 | 5.73E−32 | 6.26E−02 | 6.39E−02 |
| | std | 0.00E+00 | 2.61E−07 | 2.38E−01 | 2.61E−07 | 4.33E−02 | 8.95E−32 | 4.39E−02 | 4.49E−02 |
Table 7. Results for the BSSA and other algorithms in multimodal test functions with fixed dimensions.

| | | BSSA | BGOA | BMOA | BBA | BDA | BGSA | BPSO | BGA |
| F14 | Ave | 9.98E−01 | 1.08E+00 | 3.71E+00 | 1.26E+00 | 9.98E+01 | 3.61E+00 | 2.77E+00 | 4.39E+00 |
| | std | 7.64E−12 | 4.11E−02 | 3.86E+00 | 6.86E−01 | 9.14E−01 | 2.96E+00 | 2.32E+00 | 4.41E−02 |
| F15 | Ave | 3.3E−04 | 8.21E−03 | 3.66E−02 | 1.01E−02 | 7.15E−02 | 6.84E−02 | 9.09E−03 | 7.36E−02 |
| | std | 1.25E−05 | 4.09E−03 | 7.60E−02 | 3.75E−03 | 1.26E−01 | 7.37E−02 | 2.38E−03 | 2.39E−03 |
| F16 | Ave | −1.03E+00 | −1.02E+00 | −1.02E+00 | −1.02E+00 | −1.02E+00 | −1.02E+00 | −1.02E+00 | −1.02E+00 |
| | std | 5.12E−10 | 9.80E−07 | 7.02E−09 | 3.23E−05 | 4.74E−08 | 0.00E+00 | 0.00E+00 | 4.19E−07 |
| F17 | Ave | 3.98E−01 | 3.98E−01 | 3.98E−01 | 3.98E−01 | 3.98E−01 | 3.98E−01 | 3.98E−01 | 3.98E−01 |
| | std | 4.56E−21 | 5.39E−05 | 7.00E−07 | 7.61E−04 | 1.15E−07 | 1.13E−16 | 9.03E−16 | 3.71E−17 |
| F18 | Ave | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 | 3.00E+00 |
| | std | 1.15E−18 | 1.15E−08 | 7.16E−06 | 2.25E−05 | 1.48E+01 | 3.24E−02 | 6.59E−05 | 6.33E−07 |
| F19 | Ave | −3.86E+00 | −3.86E+00 | −3.84E+00 | −3.75E+00 | −3.77E+00 | −3.86E+00 | −3.80E+00 | −3.81E+00 |
| | std | 5.61E−10 | 6.50E−07 | 1.57E−03 | 2.55E−03 | 3.53E−07 | 4.15E−01 | 3.37E−15 | 4.37E−10 |
| F20 | Ave | −3.32E+00 | −2.81E+00 | −3.27E+00 | −2.84E+00 | −3.23E+00 | −1.47E+00 | −3.32E+00 | −2.39E+00 |
| | std | 4.29E−05 | 7.11E−01 | 7.27E−02 | 3.71E−01 | 5.37E−02 | 5.32E−01 | 2.66E−01 | 4.37E−01 |
| F21 | Ave | −10.15E+00 | −8.07E+00 | −9.65E+00 | −2.28E+00 | −7.38E+00 | −4.57E+00 | −7.54E+00 | −5.19E+00 |
| | std | 1.25E−02 | 2.29E+00 | 1.54E+00 | 1.80E+00 | 2.91E+00 | 1.30E+00 | 2.77E+00 | 2.34E+00 |
| F22 | Ave | −10.40E+00 | −10.01E+00 | −1.04E+01 | −3.99E+00 | −8.50E+00 | −6.58E+00 | −8.55E+00 | −2.97E+00 |
| | std | 3.65E−07 | 3.97E−02 | 2.73E−04 | 1.99E+00 | 3.02E+00 | 2.64E+00 | 3.08E+00 | 1.37E−02 |
| F23 | Ave | −10.53E+00 | −3.41E+00 | −1.05E+01 | −4.49E+00 | −8.41E+00 | −9.37E+00 | −9.19E+00 | −3.10E+00 |
| | std | 5.26E−06 | 1.11E−02 | 1.81E−04 | 1.96E+00 | 3.13E+00 | 2.75E+00 | 2.52E+00 | 2.37E+00 |
Table 8. Results of the Friedman rank test.

| Test Functions | | BGA | BPSO | BGSA | BDA | BBA | BMOA | BGOA | BSSA |
| Unimodal (F1–F7) | Friedman value | 39 | 39 | 42 | 46 | 38 | 26 | 14 | 7 |
| | Friedman rank | 5 | 5 | 6 | 7 | 4 | 3 | 2 | 1 |
| Multimodal (F8–F13) | Friedman value | 25 | 21 | 36 | 40 | 28 | 31 | 22 | 6 |
| | Friedman rank | 4 | 2 | 7 | 8 | 5 | 6 | 3 | 1 |
| Multimodal with fixed dimensions (F14–F23) | Friedman value | 52 | 31 | 42 | 44 | 44 | 34 | 29 | 10 |
| | Friedman rank | 7 | 3 | 5 | 6 | 6 | 4 | 2 | 1 |
| All 23 test functions | Friedman value | 116 | 91 | 120 | 130 | 110 | 91 | 65 | 23 |
| | Friedman rank | 5 | 3 | 6 | 7 | 4 | 3 | 2 | 1 |