An Improved Novel Global Harmony Search Algorithm Based on Selective Acceptance

The novel global harmony search (NGHS) algorithm, proposed in 2010, is an improved harmony search (HS) algorithm that combines ideas from particle swarm optimization (PSO) and the genetic algorithm (GA). One of the main differences between the HS and NGHS algorithms lies in the mechanism used to update the harmony memory (HM). In the HS algorithm, in each iteration, the new harmony is accepted and replaces the worst harmony in the HM only if its fitness is better than that of the worst harmony. Conversely, in the NGHS algorithm, the new harmony replaces the worst harmony in the HM without any precondition. Beyond these two mechanisms, there is an older one, the selective acceptance mechanism, which is used in the simulated annealing (SA) algorithm. In this paper, we therefore propose the selective acceptance novel global harmony search (SANGHS) algorithm, which combines the NGHS algorithm with a selective acceptance mechanism. The advantage of the SANGHS algorithm is that it balances global exploration and local exploitation. Moreover, to verify the search ability of the SANGHS algorithm, we applied it to ten well-known benchmark continuous optimization problems and two engineering problems and compared the experimental results with those of other metaheuristic algorithms. The results show that the SANGHS algorithm has better search ability than four other harmony search algorithms on the ten continuous optimization problems. In addition, on the two engineering problems, the SANGHS algorithm provided competitive solutions compared with other state-of-the-art metaheuristic algorithms.


Introduction
In the past two decades, interest in research on metaheuristic algorithms has increased significantly because existing numerical methods cannot solve a number of complex optimization problems, such as discrete structural optimization [1], design of water distribution networks [2], and vehicle routing [3]. A metaheuristic algorithm simulates the randomness and rules found in natural phenomena. For example, the genetic algorithm (GA) [4] imitates the process of biological evolution and natural selection. Particle swarm optimization (PSO) [5] was inspired by animal behavior, namely the foraging behavior of bird flocking. In addition, many algorithms extend PSO and GA, such as the harmony search (HS) algorithm [6] and honey bee mating optimization (HBMO) [7]. These algorithms have proved their ability to find better solutions than numerical methods in extensive optimization problems [8][9][10]. However, compared with these algorithms, the HS algorithm, which imitates the improvisation process of a harmony, has two features. Firstly, it has fewer mathematical requirements and is easy to implement.
• Step 2: Initialize the harmony memory (HM)
The number of harmony vectors stored in the HM is equal to HMS. A harmony vector is a set of decision variables. The initial values of the decision variables x^0_{i,j} (i = 1, 2, ..., HMS; j = 1, 2, ..., D) are generated by Equation (1), in which r is a random number in the range [0,1] and x^k_{i,j} represents the jth decision variable of the ith harmony vector in the current iteration k:

x^0_{i,j} = x_{jL} + r × (x_{jU} − x_{jL})    (1)

The harmony memory (HM) is shown in Equation (2):

HM = [ x^0_{1,1}    x^0_{1,2}    ···  x^0_{1,D}
       x^0_{2,1}    x^0_{2,2}    ···  x^0_{2,D}
       ···
       x^0_{HMS,1}  x^0_{HMS,2}  ···  x^0_{HMS,D} ]    (2)

• Step 3: Improvise a new harmony
A new harmony vector x^k_{new} = (x^k_{new,1}, x^k_{new,2}, ..., x^k_{new,D}) is improvised by three mechanisms: memory consideration, pitch adjustment, and random selection. It is defined as in Algorithm 1.

Algorithm 1 The Improvisation of New Harmony in HS
1: For j = 1 to D do
2:   If r1 ≤ HMCR then
3:     x^k_{new,j} = x^k_{i,j}  % memory consideration
4:     If r2 ≤ PAR then
5:       x^k_{new,j} = x^k_{new,j} − BW + r3 × 2 × BW  % pitch adjustment
6:       If x^k_{new,j} > x_{jU} then
7:         x^k_{new,j} = x_{jU}
8:       Else if x^k_{new,j} < x_{jL} then
9:         x^k_{new,j} = x_{jL}
10:      End
11:    End
12:  Else
13:    x^k_{new,j} = x_{jL} + r4 × (x_{jU} − x_{jL})  % random selection
14:  End
15: End

Here, x^k_{new,j} is the jth decision variable of the new harmony improvised in the current iteration k. In the memory consideration mechanism, the index i of x^k_{i,j} is an integer selected randomly in the range [1, HMS]; r1, r2, r3, and r4 are random numbers in the range [0,1].
• Step 4: Update the harmony memory
If the new harmony vector x^k_{new} = (x^k_{new,1}, x^k_{new,2}, ..., x^k_{new,D}) has better fitness than the worst harmony in the harmony memory, the new harmony vector replaces the worst harmony vector.
• Step 5: Check the stopping iteration
If the current iteration is equal to the maximum number of iterations (NI), the computation is terminated. Otherwise, set k = k + 1 and go back to Step 3.
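The improvisation of Algorithm 1 can be sketched in Python as follows. This is a minimal illustration; the function and variable names are hypothetical, not from the paper.

```python
import random

def improvise(hm, lower, upper, hmcr=0.9, par=0.3, bw=0.01):
    """One HS improvisation (Algorithm 1). hm is a list of harmony vectors."""
    d = len(lower)
    new = [0.0] * d
    for j in range(d):
        if random.random() <= hmcr:
            # memory consideration: copy from a randomly chosen harmony in the HM
            new[j] = hm[random.randrange(len(hm))][j]
            if random.random() <= par:
                # pitch adjustment within +/- BW, clamped to the variable bounds
                new[j] += (2 * random.random() - 1) * bw
                new[j] = min(max(new[j], lower[j]), upper[j])
        else:
            # random selection from the whole search range
            new[j] = lower[j] + random.random() * (upper[j] - lower[j])
    return new
```

In Step 4, the resulting vector would replace the worst harmony in `hm` only if its fitness is better.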

Improved Harmony Search Algorithm
In the HS algorithm, the pitch adjustment mechanism is designed to improve the local search ability. However, PAR and BW are fixed during the search process, which means that the local search ability does not change between the early and final iterations. This is one reason why the HS algorithm is limited in balancing the global and local search abilities at different stages of the search process. To mitigate this deficiency, Mahdavi et al. [11] proposed a new strategy to dynamically adjust PAR and BW. At the early stage of optimization, small PAR values with a large BW enhance the global search ability and increase the diversity of the solution vectors. Large PAR values with a small BW improve the local search ability and make the algorithm converge to the optimal solution vector in the late stage of optimization. Therefore, PAR and BW are adjusted according to Equations (3) and (4), respectively:

PAR^k = PAR_min + ((PAR_max − PAR_min) / NI) × k    (3)

BW^k = BW_max × exp((ln(BW_min / BW_max) / NI) × k)    (4)

In Equation (3), PAR^k represents the pitch adjusting rate in the current iteration k, and PAR_min and PAR_max are the minimum and maximum pitch adjusting rates in the optimization process, respectively. In Equation (4), BW^k represents the bandwidth in the current iteration k, and BW_max and BW_min are the maximum and minimum bandwidths in the optimization process.
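The dynamic schedules of Equations (3) and (4) can be sketched as follows, assuming the standard IHS formulation of Mahdavi et al. (linear PAR growth and exponential BW decay); the function name and default parameter values are illustrative.

```python
import math

def ihs_par_bw(k, ni, par_min=0.01, par_max=0.99, bw_min=1e-4, bw_max=1.0):
    """IHS schedules: PAR grows linearly with k, BW decays exponentially."""
    par = par_min + (par_max - par_min) * k / ni          # Equation (3)
    bw = bw_max * math.exp(math.log(bw_min / bw_max) * k / ni)  # Equation (4)
    return par, bw
```

At k = 0 this gives (PAR_min, BW_max), and at k = NI it gives (PAR_max, BW_min), matching the intended shift from global to local search.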

Self-Adaptive Global Best Harmony Search Algorithm
In 2010, the SGHS algorithm was proposed by Pan et al. [12]. The SGHS and HS algorithms differ in three aspects. Firstly, the SGHS algorithm applies the pitch adjustment mechanism before the memory consideration mechanism. In addition, the memory consideration mechanism selects only the best harmony in the HM rather than selecting randomly. Thereby, the new harmony is improvised using Algorithm 2. Secondly, the parameters HMCR and PAR are dynamically adjusted by sampling from normal distributions, and BW is adjusted in each iteration according to Equation (5):

BW^k = BW_max − ((BW_max − BW_min) / NI) × 2k,  if k < NI/2    (5)
BW^k = BW_min,  if k ≥ NI/2

Furthermore, HMCR is generated with the mean HMCR_mean, which lies in the range [0.9, 1.0], and a static standard deviation of 0.01. PAR is generated with the mean PAR_mean, which lies in the range [0.0, 1.0], and a static standard deviation of 0.05. Finally, the SGHS algorithm introduces another parameter, the learning period (LP), to determine HMCR_mean and PAR_mean. Within a learning period, if the new harmony replaces the worst harmony in the HM in iteration k, the current HMCR^k and PAR^k are recorded. When the learning period is complete, HMCR_mean and PAR_mean are recalculated by averaging the recorded HMCR^k and PAR^k values, respectively; the recorded values are then discarded, and a new learning period begins.

Algorithm 2 The Improvisation of New Harmony in SGHS
1: For j = 1 to D do
2:   If r1 ≤ HMCR then
3:     x^k_{new,j} = x^k_{i,j} − BW + r2 × 2 × BW  % pitch adjustment
4:     If x^k_{new,j} > x_{jU} then
5:       x^k_{new,j} = x_{jU}
6:     Else if x^k_{new,j} < x_{jL} then
7:       x^k_{new,j} = x_{jL}
8:     End
9:     If r3 ≤ PAR then
10:      x^k_{new,j} = x^k_{B,j}  % memory consideration
11:    End
12:  Else
13:    x^k_{new,j} = x_{jL} + r4 × (x_{jU} − x_{jL})  % random selection
14:  End
15: End

Here, x^k_{B,j} represents the jth decision variable of the best harmony in the HM in iteration k. The meaning of the other variables in Algorithm 2 is the same as in Algorithm 1.
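The SGHS parameter self-adaptation described above (normal sampling with fixed standard deviations, and means re-estimated at the end of each learning period) can be sketched as follows; the function names are illustrative.

```python
import random

def sample_params(hmcr_mean, par_mean):
    """Draw HMCR and PAR from normal distributions (std 0.01 and 0.05)."""
    return random.gauss(hmcr_mean, 0.01), random.gauss(par_mean, 0.05)

def update_means(recorded_hmcr, recorded_par):
    """At the end of a learning period, re-estimate the means from the
    parameter values that produced successful HM replacements, then reset."""
    hmcr_mean = sum(recorded_hmcr) / len(recorded_hmcr)
    par_mean = sum(recorded_par) / len(recorded_par)
    recorded_hmcr.clear()
    recorded_par.clear()
    return hmcr_mean, par_mean
```

During optimization, `sample_params` is called each iteration, and the sampled values are appended to the recorded lists whenever the new harmony replaces the worst one.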

Novel Global Harmony Search Algorithm
In 2010, the NGHS algorithm was proposed by Zou et al. [13,17,18]. The algorithm combines the HS algorithm, GA, and the particle swarm optimization (PSO) algorithm. GA imitates the process of biological evolution [4]; its genetic mutation probability (p_m) was introduced to escape from local optima. PSO imitates the foraging process of bird flocking [5]. The position update in the NGHS algorithm was inspired by the main characteristic of PSO, namely that the position of each particle is affected by the best position of all particles [13]. Figure 1 illustrates the principle of the position update. step^k_j = |x^k_{best,j} − x^k_{worst,j}| is an adaptive step of the jth decision variable in the current iteration k. The range between P and R is referred to as the trust region, which is centered at the global best harmony. In other words, the jth decision variable is generated randomly within the trust region. In the early iterations, the solution vectors in the harmony memory are scattered and the trust region is wide, so the new harmony is generated randomly in a large range, which benefits the global search ability. In the late iterations, the solution vectors in the HM tend to move toward the global optimum; they concentrate in a small solution space, the trust region narrows, and the local search ability of NGHS is enhanced [13,17,18]. The NGHS algorithm works as follows.



• Step 1: Initialize the algorithm and problem parameters
The genetic mutation probability (p_m), the current iteration k, and the maximum number of iterations (NI) are set in this step. However, the harmony memory considering rate (HMCR), pitch adjusting rate (PAR), and the bandwidth (BW) are excluded from the NGHS algorithm.
• Step 3: Improvise a new harmony
A new harmony vector x^k_{new} = (x^k_{new,1}, x^k_{new,2}, ..., x^k_{new,D}) is improvised by Algorithm 3.

Algorithm 3 The Improvisation of New Harmony in NGHS
1: For j = 1 to D do
2:   x_R = 2 × x^k_{best,j} − x^k_{worst,j}
3:   If x_R > x_{jU} then
4:     x_R = x_{jU}
5:   Else if x_R < x_{jL} then
6:     x_R = x_{jL}
7:   End
8:   x^k_{new,j} = x^k_{worst,j} + r1 × (x_R − x^k_{worst,j})  % position update
9:   If r2 ≤ p_m then
10:    x^k_{new,j} = x_{jL} + r3 × (x_{jU} − x_{jL})  % genetic mutation
11:  End
12: End

Here, x^k_{best,j} and x^k_{worst,j} are the jth decision variables of the best and the worst harmony in the harmony memory in the current iteration k, respectively; r1, r2, and r3 are random numbers in the range [0,1].
• Step 4: Update the harmony memory
The new harmony replaces the worst harmony in the HM, even if the worst harmony in the HM has better fitness than the new harmony.
• Step 5: Check the stopping iteration
If the current iteration k is equal to the maximum number of iterations (NI), the algorithm is terminated. Otherwise, set k = k + 1 and go back to Step 3.
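The position update and genetic mutation of Algorithm 3 can be sketched as follows for a minimization problem; the identifiers are hypothetical.

```python
import random

def nghs_improvise(hm, fitness, lower, upper, pm=0.005):
    """One NGHS improvisation (Algorithm 3): draw each variable uniformly in
    the trust region between the worst harmony and the mirrored point x_R,
    then apply genetic mutation with probability pm."""
    best = min(range(len(hm)), key=lambda i: fitness[i])   # minimization
    worst = max(range(len(hm)), key=lambda i: fitness[i])
    d = len(lower)
    new = [0.0] * d
    for j in range(d):
        # mirror of the worst variable about the best, clamped to the bounds
        xr = 2.0 * hm[best][j] - hm[worst][j]
        xr = min(max(xr, lower[j]), upper[j])
        # position update inside the trust region
        new[j] = hm[worst][j] + random.random() * (xr - hm[worst][j])
        if random.random() <= pm:
            # genetic mutation: restart the variable anywhere in its range
            new[j] = lower[j] + random.random() * (upper[j] - lower[j])
    return new
```

In Step 4, the returned vector would unconditionally replace the worst harmony in `hm`.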

Selective Acceptance Novel Global Harmony Search Algorithm
Assad et al. [19] pointed out that the effectiveness of evolutionary algorithms relies on the balance between diversification, the ability of an algorithm to explore the high-performance regions of the search space, and intensification, the ability of an algorithm to search for better solutions within those high-performance regions during the optimization process. In other words, the key to improving the searching ability of evolutionary algorithms is to improve the balance between the diversification and intensification capabilities. As seen in Section 2, Step 3 (Improvise a new harmony) was gradually improved in the IHS, SGHS, and NGHS algorithms to balance these capabilities. However, in Step 4, the mechanism for updating the HM differs among these algorithms: the accept-the-better-only mechanism is used in the HS, IHS, and SGHS algorithms, while the accept-everything mechanism is used in the NGHS algorithm. This difference inspired us to explore whether the mechanism in Step 4 could affect the balance of the diversification and intensification capabilities and improve the searching ability of the NGHS algorithm. In the process of looking for a good mechanism for Step 4, the mechanism used in the SA algorithm attracted our attention. The SA algorithm uses the Boltzmann distribution to calculate an acceptance probability dynamically in each iteration and decides whether to accept the new solution based on that probability. Therefore, in this paper, we propose the selective acceptance novel global harmony search (SANGHS) algorithm, which combines the selective acceptance mechanism with the NGHS algorithm.
The main difference between the SANGHS and NGHS algorithms is that the SANGHS algorithm accepts the new harmony and replaces the worst harmony in the HM according to the selective acceptance mechanism. In this mechanism, the selective acceptance formula shown in Equation (6), which is dynamically adjusted in the optimization process as in the SA algorithm, is used to further balance the diversification and intensification capabilities. However, the selective acceptance formula differs from the Boltzmann distribution in the SA algorithm and does not need additional parameters. In Equation (6), AP^k represents the acceptable probability in the current iteration k. F^k_new represents the fitness value of the new harmony in the current iteration k. F^k_best and F^k_worst represent the fitness values of the best and the worst harmonies in the HM in the current iteration k, respectively. According to the selective acceptance formula, if the fitness of the new harmony is better than that of the worst harmony in the HM, AP is equal to 1, which means that the worst harmony in the HM will be replaced by the new harmony. However, if the fitness of the new harmony is worse than that of the worst harmony in the HM, the SANGHS algorithm accepts the new harmony with probability AP, which is calculated by the proportion formula. The proportion formula is the most important part of the selective acceptance formula. In it, F^k_worst − F^k_best represents the convergence degree of all harmonies in the HM. In the early iterations, all harmonies in the HM are dispersed in a large solution space, and the value of F^k_worst − F^k_best is large. Thus, the algorithm is more likely to accept a worse harmony and thereby expand the range of the trust region, which contributes to improving the diversification capabilities.
Conversely, in the late iterations, all harmonies in the HM have converged in a small solution space, and the value of F^k_worst − F^k_best is small. The HM is then less likely to accept a worse harmony, which reduces the range of the trust region and gives the SANGHS algorithm strong intensification capabilities. In general, the SANGHS algorithm contains the following steps:
• Step 1: Initialize the algorithm and problem parameters
The genetic mutation probability (p_m), the current iteration k, and the maximum number of iterations (NI) are set in this step.
• Step 3: Improvise a new harmony This step is the same as in the NGHS algorithm.

• Step 4: Calculate AP
If the new harmony has a better fitness value than the worst harmony in the HM, AP is set to 1. Otherwise, AP is calculated by Equation (6).
• Step 5: Update the harmony memory Firstly, generate a random number in [0,1]. If the random number is smaller than the AP, the new harmony replaces the worst harmony in the HM. Otherwise, the new harmony is discarded.
• Step 6: Check the stopping iteration If the current iteration k is equal to the stopping iteration (the maximum number of iterations is NI), the algorithm is terminated. Otherwise, set k = k + 1 and go back to Step 3.
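Steps 4 and 5 can be sketched as follows. Note that the exponential form of the acceptance probability below is an assumption consistent with the description of Equation (6) (AP = 1 for a better harmony, with a larger HM spread F^k_worst − F^k_best implying a larger AP for a worse one), not the paper's exact formula; all identifiers are hypothetical.

```python
import math
import random

def accept_probability(f_new, f_best, f_worst):
    """Selective acceptance probability (sketch of Equation (6), minimization).
    The exponential proportion form is an assumed stand-in."""
    if f_new <= f_worst:            # new harmony is no worse than the worst
        return 1.0
    spread = f_worst - f_best       # convergence degree of the HM
    if spread == 0.0:
        return 0.0                  # fully converged HM: reject worse harmonies
    return math.exp(-(f_new - f_worst) / spread)

def update_hm(hm, fitness, new, f_new):
    """Step 5: replace the worst harmony with probability AP."""
    best = min(range(len(hm)), key=lambda i: fitness[i])
    worst = max(range(len(hm)), key=lambda i: fitness[i])
    if random.random() < accept_probability(f_new, fitness[best], fitness[worst]):
        hm[worst], fitness[worst] = new, f_new
```

With this form, a dispersed HM (large spread) accepts worse harmonies often, while a converged HM (small spread) almost never does, matching the intended diversification/intensification trade-off.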

Computing Environment Settings
In this study, we implemented the algorithms in Python 3.5.1 (64-bit) to search for the trial solutions. The experiments were run on a machine with an Intel Core (TM) i5-8500 (3.0 GHz) CPU, 8 GB of memory, and the Windows 10 Professional edition (64-bit) operating system.

Ten Well-Known Benchmark Optimization Problems
To estimate the performance of the proposed acceptable probability mechanism, we considered ten well-known benchmark optimization problems [20,21]. Among these, problems 1 to 4 are unimodal problems, and problems 5 to 10 are multimodal problems with several local optima, where the number of local optima increases with the number of decision variables [12].
[The full function definitions are omitted here; each problem states the function, its global optimum of 0, and its search space. For example, the Rastrigin function is defined as min f(x) = Σ_{i=1}^{N} (x_i^2 − 10 cos(2π x_i) + 10), where the global optimum is 0 and the search space of x_i (i = 1, 2, ..., N) is [−5.12, 5.12].]

Comparison of Different Harmony Search Algorithms in Ten Benchmark Optimization Problems
To test the performance of the SANGHS algorithm on the ten aforementioned optimization problems, we compared it with four harmony search algorithms: the HS, IHS, SGHS, and NGHS algorithms. The parameters of these algorithms are provided in Table 1 [12,17,22,23]. We tested three different dimensions (D = 10, 30, and 100). Tables 2-4 give several experimental results. Firstly, in all problems (F1 to F10) with the three different dimensions, the SANGHS algorithm has smaller minimum (Min) and maximum (Max) fitness values than the other four HS algorithms. For example, in problem 1 with D = 10, SANGHS has the smallest minimum fitness value (9.6086 × 10^−111) and the smallest maximum fitness value (1.3918 × 10^−77). Secondly, the SANGHS algorithm has the smallest average (Mean) fitness value. For example, in problem 1 with D = 10, SANGHS has the smallest average fitness value (4.6889 × 10^−79), far less than the second-best one (8.2477 × 10^−38). These two results reveal that the SANGHS algorithm has a better search ability than the other four harmony search algorithms. Lastly, in the above problems (F1 to F10) with D = 10, 30, and 100, the SANGHS algorithm has the smallest standard deviation (Std) value. For example, in problem 1 with D = 10, SANGHS has the smallest standard deviation (2.4976 × 10^−78), also far less than the second-best one (3.8833 × 10^−37). This means that the search ability of the SANGHS algorithm is more robust than that of the other four harmony search algorithms. In addition, to further explore the difference in performance between the SANGHS algorithm and the other four harmony search algorithms, the experimental results were also analyzed with the Wilcoxon rank-sum test [24,25], as shown in Table 5. In Table 5, p-values greater than 0.05 are shown in bold.
Based on the p-values, the SANGHS algorithm is significantly better than the other four HS algorithms in all problems with the three different dimensions, except in problem 10 (F10) with D = 10 (6.3886 × 10^−2). Figure 2 shows typical convergence graphs of the five algorithms for the ten problems with D = 30. In Figure 2a-d, which are unimodal problems, the SANGHS algorithm converges faster than the other four harmony search algorithms as the number of iterations increases, which means that the SANGHS algorithm has a strong local search ability in unimodal problems. In Figure 2e,f,h,j, which are multimodal problems, we can easily observe that the SANGHS algorithm is more effective at escaping from local optima and has a better convergence ability than the other four harmony search algorithms. In other words, the SANGHS algorithm achieves a better balance between the global and local search abilities in multimodal problems.

Tension/Compression Spring Design Optimization Problem
The first engineering problem is the tension/compression spring design problem. The main purpose of this problem is to minimize the spring weight by determining three design variables while four constraints are satisfied. The three design variables are the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). The four constraints concern the shear stress, the surge frequency, and the minimum deflection. The objective function, the three design variables, and the four constraints are shown in Figure 3.
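Under the standard formulation of this problem found in the literature (minimize f(d, D, N) = (N + 2) D d^2 subject to four constraints g1..g4 ≤ 0), a penalized objective that any of the compared HS-family algorithms could minimize can be sketched as follows; the penalty weight rho is an assumption, not a parameter from the paper.

```python
def spring_cost(d, D, N):
    """Spring weight: f = (N + 2) * D * d^2."""
    return (N + 2) * D * d ** 2

def spring_penalized(d, D, N, rho=1e6):
    """Static-penalty objective with the four standard constraints g1..g4 <= 0
    (deflection, shear stress, surge frequency, and outer-diameter limit)."""
    g = [
        1 - (D ** 3 * N) / (71785 * d ** 4),
        (4 * D ** 2 - d * D) / (12566 * (D * d ** 3 - d ** 4))
        + 1 / (5108 * d ** 2) - 1,
        1 - 140.45 * d / (D ** 2 * N),
        (D + d) / 1.5 - 1,
    ]
    # quadratic penalty for each violated constraint
    return spring_cost(d, D, N) + rho * sum(max(0.0, gi) ** 2 for gi in g)
```

A harmony vector here is simply (d, D, N), and the fitness compared in the HM update step is the penalized value.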
We used the HS, IHS, SGHS, NGHS, and SANGHS algorithms to search for the best result for this problem. Table 6 shows the parameters of these five HS algorithms. The best results obtained by these five algorithms are shown in Table 7. Moreover, to further verify the searching performance of the SANGHS algorithm, its best result is also compared in Table 7 with other state-of-the-art algorithms that have been applied to this problem.
In Table 7, it can be seen that, compared with the HS, IHS, SGHS, and NGHS algorithms, the SANGHS algorithm provides the best fitness value for the tension/compression spring design engineering problem. It is worth noting that the SANGHS algorithm also provides a very competitive result compared with the other state-of-the-art algorithms in previous works.

Welded Beam Design Optimization Problem
The second problem is the welded beam design engineering problem. In this problem, four variables need to be determined: the thickness of the weld (x1), the length of the clamped bar (x2), the height of the bar (x3), and the thickness of the bar (x4). The main purpose of this problem is to minimize the manufacturing cost while seven constraints are satisfied. The constraints include limits on the shear stress, bending stress, buckling load and end deflection, and several size constraints. The welded beam and its design variables are shown in Figure 4. The seven constraints and the objective function of the manufacturing cost are defined as follows. We also used the HS, IHS, SGHS, NGHS, and SANGHS algorithms to search for the best result for this problem. Table 8 shows the parameters of these five HS algorithms. The best results obtained by these five algorithms are shown in Table 9. Moreover, to further verify the searching performance of the SANGHS algorithm, its best result is also compared in Table 9 with other state-of-the-art algorithms that have been applied to this problem.
In Table 9, it can be seen that, compared with the HS, IHS, SGHS, and NGHS algorithms, the SANGHS algorithm provides the minimum manufacturing cost for the welded beam design engineering problem. It is worth noting that the SANGHS algorithm also provides a very competitive result compared with other state-of-the-art algorithms in previous works.

Conclusions
In this paper, we presented a novel algorithm, the SANGHS algorithm, which combines the NGHS algorithm with a selective acceptance mechanism. In the SANGHS algorithm, the acceptable probability (AP) is generated by a selective acceptance formula in each iteration, and based on the value of AP the algorithm determines whether or not to accept a worse trial solution. Moreover, to verify the searching performance of the SANGHS algorithm, we applied it to ten well-known benchmark continuous optimization problems and two engineering problems and compared the experimental results with those of other state-of-the-art metaheuristic algorithms. The experimental results provide two findings that are worth noting.
Firstly, it was found that the SANGHS algorithm improves the searching performance compared with the NGHS algorithm. In the SANGHS algorithm, the selective acceptance mechanism dynamically adjusts the acceptable probability according to the selective acceptance formula. The most important part of the selective acceptance formula is the proportion formula, and the main concept of the proportion formula is the convergence degree of the HM. In Equation (6), F^k_worst − F^k_best represents the convergence degree of all harmonies in the HM. In the early iterations, the solutions in the HM have not yet converged. Therefore, the value of F^k_worst − F^k_best is large, and there is a larger acceptable probability of accepting a worse harmony; in the next iteration, there is then a higher probability of generating the new harmony in a larger trust region. In other words, there is strong global exploration ability in the early iterations. In the late iterations, the value of F^k_worst − F^k_best is small, since all solutions in the HM have converged. Therefore, there is a smaller acceptable probability of accepting a worse harmony, and in the next iteration a higher probability of generating the new harmony in a smaller trust region. In other words, there is strong local exploitation ability in the later iterations. According to the experimental results, the proposed proportion mechanism balances the global exploration and local exploitation abilities of the NGHS algorithm.
Finally, on the ten considered benchmark optimization problems with three different dimensions, the results of the Wilcoxon rank-sum test and the convergence graphs show that the searching performance of the SANGHS algorithm was significantly better than that of the other four HS algorithms. In Figure 2 in particular, we can easily observe that the SANGHS algorithm had better searching performance and convergence ability than the other algorithms in all problems, whereas the HS, IHS, and SGHS algorithms easily fall into local optima and rarely escape them. Moreover, in the two engineering optimization problems, the SANGHS algorithm not only found better solutions than the other four HS algorithms but also provided competitive results compared with other state-of-the-art algorithms. In conclusion, the SANGHS algorithm is a more efficient and effective algorithm.