Article

A Multi-Strategy Improved Northern Goshawk Optimization Algorithm for Optimizing Engineering Problems

1 School of Emergency Management, Institute of Disaster Prevention, Langfang 065201, China
2 Institute of Mineral Resources Research, China Metallurgical Geology Bureau, Beijing 101300, China
3 College of General Education, Hainan Vocational University, Haikou 570216, China
4 Guangxi Key Laboratory of Trusted Software, Guilin University of Electronic Technology, Guilin 541004, China
5 College of Computer Science and Technology, Jilin University, Changchun 130012, China
* Author to whom correspondence should be addressed.
Biomimetics 2024, 9(9), 561; https://doi.org/10.3390/biomimetics9090561
Submission received: 21 August 2024 / Revised: 12 September 2024 / Accepted: 14 September 2024 / Published: 16 September 2024

Abstract

Northern Goshawk Optimization (NGO) is an efficient optimization algorithm, but it has the drawbacks of easily falling into local optima and slow convergence. To address these drawbacks, an improved NGO algorithm named the Multi-Strategy Improved Northern Goshawk Optimization (MSINGO) algorithm was proposed by adding a cubic mapping strategy, a novel weighted stochastic difference mutation strategy, and a weighted sine and cosine optimization strategy to the original NGO. To verify the performance of MSINGO, a set of comparative experiments was performed with five highly cited and six recently proposed metaheuristic algorithms on the CEC2017 test functions. The comparative experimental results show that, in the vast majority of cases, MSINGO’s exploitation ability, exploration ability, local optimum avoidance, and scalability are superior to those of the competing algorithms. Finally, six real-world engineering problems demonstrated the merits and potential of MSINGO.

1. Introduction

Optimization problems arise in many real-world fields [1], such as medicine [2,3], transportation [4,5], engineering design [6,7], economics [8], feature selection [9,10,11], artificial neural networks [12,13,14], and so on. Such problems can be approached in two principal ways: traditional mathematical methods and metaheuristic algorithms [15]. Traditional mathematical methods are deterministic methods, including gradient descent [16], Newton’s method [17], conjugate gradient [18], and so on. They solve a problem by using the derivative information of the objective function, which is effective for continuously differentiable problems [19]. However, when the optimization problem is high-dimensional, non-differentiable, and has multiple local optima, traditional mathematical methods become ineffective [20]. Hence, metaheuristic algorithms have been extensively studied. Metaheuristic algorithms are stochastic methods that consist mainly of two stages: the exploration phase and the exploitation phase. In the exploration phase, the algorithm searches for solutions on a global scale to avoid falling into local optima. In the exploitation phase, the algorithm searches a local area to find a better solution. Metaheuristic algorithms are widely used to solve complex real-world optimization problems due to their simplicity, flexibility, derivative-free mechanism, and ability to avoid local optima.
Metaheuristic algorithms are generally classified into four categories [21]: (1) evolution-based algorithms; (2) swarm-based algorithms; (3) physics- or chemistry-based algorithms; and (4) social- or human-based algorithms. Some well-known and recently proposed metaheuristic algorithms are summarized in Table 1.
Evolution-based metaheuristic algorithms are mainly implemented by modeling the principles of species evolution in nature. An evolution-based metaheuristic algorithm usually consists of three operations: selection, crossover, and mutation. The Genetic Algorithm (GA) [22] and Differential Evolution (DE) [23] are two well-known evolution-based metaheuristic algorithms. The GA was first proposed by Holland in 1975, inspired by Darwin’s theory of natural selection. DE, proposed by Storn and Price in 1997, is known for its simplicity, ease of implementation, fast convergence, and robustness. Other evolution-based algorithms include Evolutionary Programming (EP) [24], Genetic Programming (GP) [25], Biogeography-Based Optimization (BBO) [26], Memetic Algorithms (MAs) [27], and Imperialist Competitive Algorithms (ICAs) [28].
Swarm-based metaheuristic algorithms are primarily inspired by group behavior, exploiting the interaction and information sharing among individuals in a population to find the global optimal solution. Particle Swarm Optimization (PSO) [29], which mimics the flocking behavior of birds, is one of the most famous swarm-based algorithms. Ant Colony Optimization (ACO) [30] is another popular swarm-based metaheuristic algorithm, inspired by the behavior of ants searching for the shortest path during foraging. Other swarm-based metaheuristic algorithms include the Whale Optimization Algorithm (WOA) [31], Cuckoo Search Algorithm (CSA) [32], Grey Wolf Optimizer (GWO) [33], Moth Flame Optimizer (MFO) [21], Sparrow Search Algorithm (SSA) [34], Dung Beetle Optimizer (DBO) [35], Beluga Whale Optimization (BWO) [19], Red Fox Optimization Algorithm (RFO) [36], Sea-horse Optimizer (SHO) [37], Coati Optimization Algorithm (COA) [38], Spider Wasp Optimizer (SWO) [39], Cleaner fish optimization (CFO) [40], and so on.
Physics- or chemistry-based metaheuristic algorithms are based on the simulation of various laws or phenomena in physics or chemistry. One of the well-known algorithms in this category is Simulated Annealing (SA) [41], which is inspired by the physical law of metal cooling and annealing. Other physics- or chemistry-based algorithms include the Gravitational Search Algorithm (GSA) [42], Artificial Chemical Reaction Optimization (ACRO) [43], Sine Cosine Optimization (SCA) [44], Thermal Exchange Optimization (TEO) [45], and the Kepler Optimization Algorithm (KOA) [46].
Social- or human-based algorithms mainly imitate human behaviors. Teaching and Learning Based Optimization (TLBO) [47] is a typical example of this category. It is derived from the behavior of teaching and learning. Other famous or recent social- or human-based algorithms include the Cultural Evolution Algorithm (CEA) [48], Social Learning Optimization Algorithm (SLOA) [49], Socio Evolution and Learning Optimization Algorithm (SELO) [50], and Volleyball Premier League Algorithm (VPL) [51].
Table 1. Metaheuristic algorithms.
Category | Algorithm | Authors | Year
Evolutionary | Genetic Algorithm (GA) [22] | Holland | 1992
Evolutionary | Genetic Programming (GP) [25] | Koza | 1994
Evolutionary | Differential Evolution (DE) [23] | Storn and Price | 1997
Evolutionary | Evolutionary Programming (EP) [24] | Yao et al. | 1999
Evolutionary | Memetic Algorithms (MAs) [27] | Moscato and Cotta | 2003
Evolutionary | Imperialist Competitive Algorithms (ICAs) [28] | Atashpaz-Gargari and Lucas | 2007
Evolutionary | Biogeography-Based Optimization (BBO) [26] | Simon | 2008
Swarm | Particle Swarm Optimization (PSO) [29] | Kennedy and Eberhart | 1995
Swarm | Ant Colony Optimization (ACO) [30] | Dorigo et al. | 1999
Swarm | Cuckoo Search Algorithm (CSA) [32] | Yang and Deb | 2009
Swarm | Grey Wolf Optimizer (GWO) [33] | Mirjalili et al. | 2014
Swarm | Moth Flame Optimizer (MFO) [21] | Mirjalili | 2015
Swarm | Whale Optimization Algorithm (WOA) [31] | Mirjalili and Lewis | 2016
Swarm | Seagull Optimization Algorithm (SOA) [52] | Dhiman and Kumar | 2019
Swarm | Sparrow Search Algorithm (SSA) [34] | Xue and Shen | 2020
Swarm | Red Fox Optimization Algorithm (RFO) [36] | Połap and Woźniak | 2021
Swarm | Northern Goshawk Optimization (NGO) [53] | Dehghani et al. | 2021
Swarm | Pelican Optimization Algorithm (POA) [54] | Trojovský and Dehghani | 2022
Swarm | Golden Jackal Optimization (GJO) [55] | Chopra and Ansari | 2022
Swarm | Beluga Whale Optimization (BWO) [19] | Zhong et al. | 2022
Swarm | Sea-horse Optimizer (SHO) [37] | Zhao et al. | 2023
Swarm | Dung Beetle Optimizer (DBO) [35] | Xue and Shen | 2023
Swarm | Coati Optimization Algorithm (COA) [38] | Dehghani et al. | 2023
Swarm | Spider Wasp Optimizer (SWO) [39] | Abdel-Basset et al. | 2023
Swarm | Cleaner fish optimization (CFO) [40] | Zhang et al. | 2024
Physics and Chemistry | Simulated Annealing (SA) [41] | Kirkpatrick et al. | 1983
Physics and Chemistry | Magnetic Optimization Algorithm (MOA) [56] | Tayarani-N and Akbarzadeh-T | 2008
Physics and Chemistry | Gravitational Search Algorithm (GSA) [42] | Rashedi et al. | 2009
Physics and Chemistry | Artificial Chemical Reaction Optimization (ACRO) [43] | Alatas | 2011
Physics and Chemistry | Lightning Search Algorithm (LSA) [57] | Shareef et al. | 2015
Physics and Chemistry | Sine Cosine Optimization (SCA) [44] | Mirjalili | 2016
Physics and Chemistry | Golden Sine Algorithm (GSA) [58] | Tanyildizi and Demir | 2017
Physics and Chemistry | Thermal Exchange Optimization (TEO) [45] | Kaveh and Dadras | 2017
Physics and Chemistry | Kepler Optimization Algorithm (KOA) [46] | Abdel-Basset et al. | 2023
Human | Teaching and Learning Based Optimization (TLBO) [47] | Rao et al. | 2011
Human | Cultural Evolution Algorithm (CEA) [48] | Kuo and Lin | 2013
Human | Election Algorithm (EA) [59] | Emami and Derakhshan | 2015
Human | Social Learning Optimization Algorithm (SLOA) [49] | Liu et al. | 2017
Human | Socio Evolution and Learning Optimization Algorithm (SELO) [50] | Kumar et al. | 2018
Human | Volleyball Premier League Algorithm (VPL) [51] | Moghdani and Salimifard | 2018
As mentioned earlier, a large number of metaheuristic algorithms have been developed. However, the No Free Lunch (NFL) theorem states that no single metaheuristic optimization algorithm can solve all optimization problems [60]. As real-world problems become increasingly challenging, it is necessary to improve existing optimization algorithms and design more effective and efficient ones for increasingly complex real-world optimization problems. Northern Goshawk Optimization (NGO) [53], inspired by the predatory behavior of the northern goshawk, has gained considerable attention since it was first proposed. For example, Chang et al. [61] used NGO to optimize the life-cycle costs of the power grid. El-Dabah et al. [62] utilized NGO to optimize the parameters of PV modules. Wu et al. [63] proposed a deep learning model, CNN-LSTM, for predicting PV power, where NGO was used to optimize the CNN-LSTM model. Although the NGO algorithm has achieved these successes, it still has some drawbacks: (1) the initial population is randomly generated and lacks diversity; (2) it exhibits slow convergence; (3) it easily falls into local optima.
To address the issues in the original NGO algorithm, this paper adds three strategies to it and proposes a Multi-Strategy Improved Northern Goshawk Optimization, called MSINGO. Then, the improved MSINGO is used to optimize six engineering problems.
The main contributions of this paper are summarized as follows:
(1)
To enhance the diversity of the initial population, this paper adds the cubic mapping strategy in the initialization phase of the original NGO algorithm;
(2)
To prevent NGO from being trapped in local optima, a novel weighted stochastic difference mutation strategy is introduced in the exploration phase, helping NGO jump out of local optima;
(3)
To accelerate the convergence speed, a weighted sine and cosine optimization strategy is added in the exploitation phase of the original NGO algorithm;
(4)
To evaluate the performance of our improved MSINGO, we compare it with five highly cited and six recently proposed metaheuristic algorithms on CEC2017 test functions and six engineering design problems.
The rest of this paper is structured as follows. Section 2 briefly introduces the original NGO. Section 3 describes the improved MSINGO in detail. Section 4 shows experiments on 29 benchmark functions on CEC2017. Section 5 evaluates the performance of improved MSINGO in solving six engineering problems. Section 6 summarizes this paper.

2. Overview of Original NGO

This section briefly presents the mathematical model of the NGO algorithm. NGO mainly simulates the predatory process of the northern goshawk and involves three main phases: initialization, the exploration phase, and the exploitation phase, which are described in detail below.

2.1. Initialization

NGO is a population-based algorithm. Each individual northern goshawk in the population is considered as a candidate solution. In the initial stage of the NGO algorithm, candidate solutions are randomly generated, as shown in Equation (1). All candidate solutions form the population matrix, as shown in Equation (2).
$$x_{i,j} = lb_j + r \times (ub_j - lb_j), \quad i = 1, 2, \dots, N, \; j = 1, 2, \dots, dim \tag{1}$$
$$X = \begin{bmatrix} X_1 \\ \vdots \\ X_i \\ \vdots \\ X_N \end{bmatrix}_{N \times dim} = \begin{bmatrix} x_{1,1} & \cdots & x_{1,j} & \cdots & x_{1,dim} \\ \vdots & \ddots & \vdots & \ddots & \vdots \\ x_{i,1} & \cdots & x_{i,j} & \cdots & x_{i,dim} \\ \vdots & \ddots & \vdots & \ddots & \vdots \\ x_{N,1} & \cdots & x_{N,j} & \cdots & x_{N,dim} \end{bmatrix}_{N \times dim} \tag{2}$$
where $X$ represents the population matrix, storing the positions of all candidate solutions; $X_i$ is the position of the $i$-th candidate solution, which is updated during optimization; $x_{i,j}$ indicates the $j$-th dimension of the $i$-th solution; $r$ is a random real number in the interval $[0, 1]$; $lb_j$ and $ub_j$ are the lower and upper bounds of the $j$-th variable, respectively; $N$ denotes the size of the population; and $dim$ represents the dimension to be optimized.
In the optimization process, the objective function is used to calculate the fitness value of each candidate solution. All fitness values are stored in the fitness matrix F ( X ) , as shown in Equation (3). The candidate solution with the minimum fitness is the optimal solution.
$$F(X) = \begin{bmatrix} F_1 = F(X_1) \\ \vdots \\ F_i = F(X_i) \\ \vdots \\ F_N = F(X_N) \end{bmatrix}_{N \times 1} \tag{3}$$
where $F$ is the objective function and $F_i$ is the fitness value of the $i$-th candidate solution.
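As a concrete illustration, the initialization and fitness evaluation of Equations (1)–(3) can be sketched in NumPy as follows (function names and the sphere objective are illustrative, not part of the original implementation):

```python
import numpy as np

def initialize_population(N, dim, lb, ub, rng=None):
    """Random initialization of the N x dim population matrix, Eqs. (1)-(2)."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.asarray(lb, dtype=float)   # lower bounds, shape (dim,)
    ub = np.asarray(ub, dtype=float)   # upper bounds, shape (dim,)
    r = rng.random((N, dim))           # r ~ U[0, 1]
    return lb + r * (ub - lb)          # x_{i,j} = lb_j + r * (ub_j - lb_j)

def evaluate_population(X, objective):
    """Fitness vector F(X) of Eq. (3); lower values are better."""
    return np.apply_along_axis(objective, 1, X)

# Usage with an illustrative sphere objective on a 5-dimensional problem
sphere = lambda x: float(np.sum(x ** 2))
X = initialize_population(N=30, dim=5, lb=[-10] * 5, ub=[10] * 5)
F = evaluate_population(X, sphere)
print("best initial fitness:", F.min())
```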

2.2. Exploration Phase

The exploration phase of NGO imitates a northern goshawk randomly selecting prey in the search space and attacking it. In this phase, the mathematical model is given by Equations (4) and (5).
$$X_i^{t+1} = \begin{cases} X_i^t + r \times (X_k^t - I \times X_i^t), & F_k^t < F_i^t \\ X_i^t + r \times (X_i^t - X_k^t), & F_k^t \geq F_i^t \end{cases} \tag{4}$$
$$X_i^t = \begin{cases} X_i^{t+1}, & F_i^{t+1} < F_i^t \\ X_i^t, & F_i^{t+1} \geq F_i^t \end{cases} \tag{5}$$
where $k$ is a random integer in the interval $[1, N]$; $X_k^t$ and $X_i^t$ are the positions of the $k$-th and $i$-th solutions in the $t$-th iteration; $X_i^{t+1}$ is the position of the $i$-th solution in iteration $t+1$; $F_k^t$ and $F_i^t$ are the fitness values of the $k$-th and $i$-th solutions in the $t$-th iteration; $F_i^{t+1}$ is the fitness value of the $i$-th solution in iteration $t+1$; $r$ is a random number in the interval $[0, 1]$; and $I$ is a random number equal to 1 or 2.
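A minimal sketch of one exploration sweep, assuming the greedy acceptance of Equation (5) (all names are illustrative):

```python
import numpy as np

def ngo_exploration_step(X, F, objective, rng=None):
    """One NGO exploration sweep following Eqs. (4) and (5) (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    N, dim = X.shape
    for i in range(N):
        k = rng.integers(N)                     # index of randomly selected prey
        r = rng.random(dim)                     # r ~ U[0, 1]
        I = rng.integers(1, 3)                  # I is 1 or 2
        if F[k] < F[i]:                         # prey is better: move toward it
            cand = X[i] + r * (X[k] - I * X[i])
        else:                                   # prey is worse: move away from it
            cand = X[i] + r * (X[i] - X[k])
        f_cand = objective(cand)
        if f_cand < F[i]:                       # greedy acceptance, Eq. (5)
            X[i], F[i] = cand, f_cand
    return X, F
```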

2.3. Exploitation Phase

The exploitation phase of the NGO algorithm mimics the behavior of a northern goshawk chasing and hunting prey. In this phase, the positions are updated using Equations (6)–(8).
$$X_i^{t+1} = X_i^t + R \times (2 \times r - 1) \times X_i^t \tag{6}$$
$$R = 0.02 \times \left(1 - \frac{t}{T}\right) \tag{7}$$
$$X_i^t = \begin{cases} X_i^{t+1}, & F_i^{t+1} < F_i^t \\ X_i^t, & F_i^{t+1} \geq F_i^t \end{cases} \tag{8}$$
where $t$ is the current iteration number, $T$ is the maximum number of iterations, and $r$ is a random number in the interval $[0, 1]$.
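The exploitation sweep of Equations (6)–(8) can be sketched in the same way (illustrative names, not the authors' code):

```python
import numpy as np

def ngo_exploitation_step(X, F, objective, t, T, rng=None):
    """One NGO exploitation sweep following Eqs. (6)-(8) (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    N, dim = X.shape
    R = 0.02 * (1 - t / T)                      # shrinking attack radius, Eq. (7)
    for i in range(N):
        r = rng.random(dim)
        cand = X[i] + R * (2 * r - 1) * X[i]    # local move around X_i, Eq. (6)
        f_cand = objective(cand)
        if f_cand < F[i]:                       # greedy acceptance, Eq. (8)
            X[i], F[i] = cand, f_cand
    return X, F
```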

3. Our Proposed MSINGO

We propose a Multi-Strategy Improved NGO (MSINGO) by adding three strategies to the original NGO algorithm. In this section, the three strategies (i.e., cubic mapping, weighted stochastic difference mutation, and weighted sine and cosine optimization) and the proposed MSINGO are introduced.

3.1. Cubic Mapping Strategy

In the original NGO algorithm, the initial population is randomly generated, which may lead to insufficient population diversity. Many researchers have shown that chaotic mapping can improve population diversity, making it easier for optimization algorithms to find the global optimal solution [64,65]. Cubic mapping is a common form of chaotic mapping [66]. We use it to initialize the population, which enhances the population’s diversity. The mathematical model of cubic mapping is shown in Equation (9).
$$z_{p+1} = a \times z_p \times (1 - z_p^2), \quad p = 0, 1, 2, \dots, N \times dim \tag{9}$$
where $a$ is a parameter set to 2.595; $z_p$ is the $p$-th chaotic value, with $z_0 = 0.3$; $N$ denotes the population size; and $dim$ indicates the dimension of the problem variables to be optimized.
The new population initialization formula with the cubic mapping strategy is shown in Equation (10).
$$x_{i,j} = lb_j + z_{p+1} \times (ub_j - lb_j), \quad i = 1, 2, \dots, N, \; j = 1, 2, \dots, dim \tag{10}$$
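A short sketch of the cubic-mapping initialization of Equations (9) and (10), using the parameter values $a = 2.595$ and $z_0 = 0.3$ stated above (function names are illustrative):

```python
import numpy as np

def cubic_map_sequence(length, a=2.595, z0=0.3):
    """Cubic chaotic sequence z_{p+1} = a * z_p * (1 - z_p^2), Eq. (9)."""
    z = np.empty(length)
    z[0] = z0
    for p in range(length - 1):
        z[p + 1] = a * z[p] * (1 - z[p] ** 2)
    return z

def cubic_initialize_population(N, dim, lb, ub):
    """Chaotic initialization of Eq. (10): chaotic values replace the uniform r."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    z = cubic_map_sequence(N * dim).reshape(N, dim)
    return lb + z * (ub - lb)

print(cubic_initialize_population(4, 3, [-10] * 3, [10] * 3))
```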

3.2. Weighted Stochastic Difference Mutation Strategy

In the exploration phase of the original NGO algorithm, individuals in the population are updated by randomly generating new individuals nearby. In this way, new individuals may be close to the old ones, which can lead to NGO falling into local optima. A mutation strategy can help the algorithm escape from local optima. In this paper, we propose a novel mutation strategy named the weighted stochastic difference mutation strategy, which helps NGO jump out of local optima. It consists of two parts: the differential variation value $Q$ and its weight $W$.
The differential variation value Q is calculated with Equation (11).
$$Q = R_1 \times (X_{best}^t - X_i^t) - R_2 \times (X_{random}^t - X_i^t) \tag{11}$$
where $X_{best}^t$ represents the current optimal position; $X_{random}^t$ denotes a randomly selected position; and $R_1$ and $R_2$ are random numbers in the interval $[0, 1]$.
The weight W is generated using the weighted Levy flight technique [67], as shown in Equations (12)–(15).
$$W = \omega \times Levy \tag{12}$$
$$\omega = e^{-10 \left(\frac{t}{T}\right)^2 + 10} \tag{13}$$
$$Levy = s \times \frac{u \times \sigma}{|v|^{\frac{1}{\eta}}} \tag{14}$$
$$\sigma = \left( \frac{\Gamma(1+\eta) \times \sin\left(\frac{\pi \eta}{2}\right)}{\Gamma\left(\frac{1+\eta}{2}\right) \times \eta \times 2^{\frac{\eta-1}{2}}} \right)^{\frac{1}{\eta}} \tag{15}$$
where $t$ is the current iteration number and $T$ is the maximum number of iterations; $s$ is set to 0.05; $\eta$ is set to 1.5; $u$ and $v$ are normally distributed random numbers; and $\Gamma$ is the gamma function.
Then, the weighted differential variation value is calculated with Equation (16).
$$WR = W \times Q \tag{16}$$
Finally, we replaced the original Equation (4) in the NGO with the new Equation (17).
$$X_i^{t+1} = \begin{cases} X_i^t + r \times (X_k^t - I \times X_i^t) + WR, & F_k^t < F_i^t \\ X_i^t + r \times (X_i^t - X_k^t) + WR, & F_k^t \geq F_i^t \end{cases} \tag{17}$$
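A sketch of the mutation term $WR$ of Equations (11)–(16); note that the sign convention inside the exponent of Equation (13) is reconstructed here and should be treated as an assumption of this sketch (all names are illustrative):

```python
import math
import numpy as np

def levy_flight(dim, s=0.05, eta=1.5, rng=None):
    """Levy step of Eqs. (14) and (15)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = (math.gamma(1 + eta) * math.sin(math.pi * eta / 2)
             / (math.gamma((1 + eta) / 2) * eta * 2 ** ((eta - 1) / 2))) ** (1 / eta)
    u = rng.normal(0.0, 1.0, dim)
    v = rng.normal(0.0, 1.0, dim)
    return s * u * sigma / np.abs(v) ** (1 / eta)

def weighted_difference_mutation(X, i, best, t, T, rng=None):
    """Weighted stochastic difference term WR of Eqs. (11)-(16) for individual i."""
    rng = np.random.default_rng() if rng is None else rng
    N, dim = X.shape
    R1, R2 = rng.random(dim), rng.random(dim)
    X_rand = X[rng.integers(N)]                    # randomly selected position
    Q = R1 * (best - X[i]) - R2 * (X_rand - X[i])  # differential variation, Eq. (11)
    omega = math.exp(-10 * (t / T) ** 2 + 10)      # decreasing weight, Eq. (13) (assumed sign)
    W = omega * levy_flight(dim, rng=rng)          # Levy-weighted factor, Eq. (12)
    return W * Q                                   # WR, Eq. (16)
```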

3.3. Weighted Sine and Cosine Optimization Strategy

The convergence speed of the original algorithm is slow. To solve this problem, we add the weighted sine and cosine optimization strategy to the exploitation phase. The mathematical model of the sine and cosine optimization strategy is shown in Equation (18).
$$X_i^{t+1} = \begin{cases} X_i^t + r_1 \times \sin(r_2) \times \left| r_3 \times X_{best}^t - X_i^t \right|, & r_4 < 0.5 \\ X_i^t + r_1 \times \cos(r_2) \times \left| r_3 \times X_{best}^t - X_i^t \right|, & r_4 \geq 0.5 \end{cases} \tag{18}$$
where $r_1 = 2 \times (1 - t/T)^2$; $r_2$ is a random number in the interval $[0, 2\pi]$; $r_3$ is a random number in the interval $[0, 2]$; $r_4 = r \times (1 - 0.5 \times t/T)$, where $r$ is a random number between 0 and 1.
Finally, we use Equation (19) instead of Equation (6) as the position updating formula in the exploitation phase.
$$X_i^{t+1} = \begin{cases} X_i^t + r_1 \times \sin(r_2) \times \left| r_3 \times X_{best}^t - X_i^t \right| + W, & r_4 < 0.5 \\ X_i^t + r_1 \times \cos(r_2) \times \left| r_3 \times X_{best}^t - X_i^t \right| + W, & r_4 \geq 0.5 \end{cases} \tag{19}$$
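A sketch of the exploitation sweep using the weighted sine and cosine rule of Equation (19); the Levy-based weight $W$ is assumed to be computed as in Section 3.2, and all names are illustrative:

```python
import numpy as np

def msingo_exploitation_step(X, F, objective, best, W, t, T, rng=None):
    """Exploitation sweep using the weighted sine-cosine rule of Eq. (19).
    W is the Levy-based weight of Eq. (12), supplied by the caller."""
    rng = np.random.default_rng() if rng is None else rng
    N, dim = X.shape
    r1 = 2 * (1 - t / T) ** 2                    # amplitude shrinking over the iterations
    for i in range(N):
        r2 = rng.uniform(0, 2 * np.pi, dim)
        r3 = rng.uniform(0, 2, dim)
        r4 = rng.random() * (1 - 0.5 * t / T)
        gap = np.abs(r3 * best - X[i])           # distance to the current best position
        if r4 < 0.5:
            cand = X[i] + r1 * np.sin(r2) * gap + W
        else:
            cand = X[i] + r1 * np.cos(r2) * gap + W
        f_cand = objective(cand)
        if f_cand < F[i]:                        # greedy acceptance, Eq. (8)
            X[i], F[i] = cand, f_cand
    return X, F
```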

3.4. The Detail of Our Proposed MSINGO

The pseudo-code of the proposed MSINGO algorithm is shown in Algorithm 1. The flow chart of MSINGO is shown in Figure 1, where the green parts are our improvements.
Algorithm 1. Pseudo-code of the MSINGO algorithm
Input: The initial parameters of MSINGO, including the maximum number of iterations T, the population size N, the dimension of the problem variables dim, and the lower and upper bounds of the problem variables lb and ub.
Output: Optimal fitness value
1: Set t = 1
2: Create the initial population using Equation (10)
3: While t ≤ T Do
4:   For i = 1 to N Do
5:     Exploration phase:
6:       Calculate the candidate position of the i-th solution using Equation (17)
7:       Update the position of the i-th solution using Equation (5)
8:     Exploitation phase:
9:       Calculate r4
10:      Calculate the candidate position of the i-th solution using Equation (19)
11:      Update the position of the i-th solution using Equation (8)
12:   End For
13:   Save the best solution found so far
14:   t = t + 1
15: End While
16: Output the best solution
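For reference, a compact, self-contained Python sketch of Algorithm 1 is given below; boundary clipping, the exponent sign in Equation (13), and the specific parameter values are assumptions of this sketch rather than details confirmed by the paper:

```python
import math
import numpy as np

def msingo(objective, dim, lb, ub, N=30, T=500, seed=0):
    """Compact, self-contained sketch of Algorithm 1 (assumptions noted above)."""
    rng = np.random.default_rng(seed)
    lb = np.full(dim, lb, dtype=float)
    ub = np.full(dim, ub, dtype=float)

    # Cubic-mapping initialization, Eqs. (9)-(10)
    z = np.empty(N * dim)
    z[0] = 0.3
    for p in range(N * dim - 1):
        z[p + 1] = 2.595 * z[p] * (1 - z[p] ** 2)
    X = lb + z.reshape(N, dim) * (ub - lb)
    F = np.array([objective(x) for x in X])

    eta, s = 1.5, 0.05
    sigma = (math.gamma(1 + eta) * math.sin(math.pi * eta / 2)
             / (math.gamma((1 + eta) / 2) * eta * 2 ** ((eta - 1) / 2))) ** (1 / eta)

    for t in range(1, T + 1):
        best = X[np.argmin(F)].copy()
        omega = math.exp(-10 * (t / T) ** 2 + 10)   # assumed sign, see Eq. (13)
        for i in range(N):
            # Exploration with the weighted stochastic difference mutation, Eq. (17)
            levy = s * rng.normal(size=dim) * sigma / np.abs(rng.normal(size=dim)) ** (1 / eta)
            W = omega * levy
            Q = rng.random(dim) * (best - X[i]) - rng.random(dim) * (X[rng.integers(N)] - X[i])
            k, r, I = rng.integers(N), rng.random(dim), rng.integers(1, 3)
            if F[k] < F[i]:
                cand = X[i] + r * (X[k] - I * X[i]) + W * Q
            else:
                cand = X[i] + r * (X[i] - X[k]) + W * Q
            cand = np.clip(cand, lb, ub)
            fc = objective(cand)
            if fc < F[i]:
                X[i], F[i] = cand, fc

            # Exploitation with the weighted sine-cosine rule, Eq. (19)
            r1 = 2 * (1 - t / T) ** 2
            r2, r3 = rng.uniform(0, 2 * np.pi, dim), rng.uniform(0, 2, dim)
            r4 = rng.random() * (1 - 0.5 * t / T)
            trig = np.sin(r2) if r4 < 0.5 else np.cos(r2)
            cand = np.clip(X[i] + r1 * trig * np.abs(r3 * best - X[i]) + W, lb, ub)
            fc = objective(cand)
            if fc < F[i]:
                X[i], F[i] = cand, fc
    return X[np.argmin(F)], float(F.min())

# Usage on an illustrative sphere function
x_best, f_best = msingo(lambda x: float(np.sum(x ** 2)), dim=10, lb=-10, ub=10, T=200)
print(f_best)
```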

3.5. Time Complexity Analysis

The time complexity of the MSINGO algorithm consists of three parts: population initialization, the exploration phase, and the exploitation phase. The primary parameters that affect the time complexity are the maximum number of iterations $T$, the dimension to be optimized $dim$, and the population size $N$. First, the population initialization of MSINGO has a time complexity of $O(N \times dim)$. Second, during the iterations, each individual is updated in both the exploration and exploitation phases, so each of these two phases has a time complexity of $O(T \times N \times dim)$. Consequently, the total time complexity of MSINGO is $O((2T + 1) \times N \times dim)$.

4. Experimental Results and Analysis

In this section, we evaluate the performance of the proposed MSINGO on the CEC2017 test functions. The whole experiment consists of the following four parts: (1) the impact of the three strategies on NGO; (2) qualitative analysis of the MSINGO algorithm; (3) comparison with 11 well-known metaheuristic algorithms; and (4) scalability analysis of the MSINGO algorithm. All experiments were run on a computer with an Intel(R) Core(TM) i7-8565U 1.80 GHz CPU (Intel Corporation, Santa Clara, CA, USA), and all algorithms were implemented in Python 3.7.

4.1. Benchmark Functions

To evaluate the performance of the MSINGO algorithm, we conducted experiments on 29 benchmark functions of IEEE CEC2017. The CEC2017 test suite contains 2 unimodal functions (C1–C2) to test the exploitation capability, 7 multimodal functions (C3–C9) to test the exploration capability, and 10 hybrid functions (C10–C19) and 10 composition functions (C20–C29) to test the algorithm’s ability to avoid local optima. Detailed descriptions of these benchmark functions are given in Table 2, where range denotes the boundary of the design variables and $f_{min}$ represents the optimal value.

4.2. Competitor Algorithms and Parameters Setting

MSINGO is compared with 5 highly cited algorithms, namely DE, MFO, WOA, SCA, and SOA, and 6 recently proposed metaheuristic algorithms, namely SSA, DBO, POA, BWO, GJO, and NGO. The parameter settings of all algorithms are shown in Table 3. The population size and the maximum number of iterations of all algorithms are set to 30 and 500, respectively. Furthermore, to decrease the impact of randomness, each algorithm was executed independently 30 times on each test function.

4.3. Influence of the Three Mechanisms

This section combines the cubic mapping strategy (C), weighted stochastic difference mutation strategy (WS), and weighted sine and cosine optimization strategy (WSC) with NGO, and analyzes their influence on improving NGO’s performance. The details of these different NGO algorithms are shown in Table 4, where ‘1’ indicates that the strategy is integrated with NGO and ‘0’ indicates the opposite.
The seven strategy-augmented NGO variants and the original NGO are tested on the CEC2017 test functions, with the dimension to be optimized (dim) set to 30. We calculated the average (Ave) and standard deviation (Std) of the results and highlighted the best algorithm in bold. The results are presented in Table 5.
Based on the average fitness (Ave) of each algorithm in Table 5, the Friedman test was carried out to rank all the algorithms. The ranking of each algorithm is shown in Table 6, where average rank represents the average ranking of each algorithm, overall rank represents the final ranking of each algorithm, and +/−/= represents the number of test functions on which MSINGO performs better than, worse than, or equal to each comparison algorithm. The smaller the average rank and overall rank, the better the performance of the algorithm.
According to the average rank in Table 6, the eight NGO variants are ranked as follows: MSINGO > WS_NGO > C_WS_NGO > WS_WSC_NGO > C_WSC_NGO > WSC_NGO > NGO > C_NGO. MSINGO performs better than WS_NGO on only 15 functions, indicating that the weighted stochastic difference mutation (WS) strategy plays a crucial role in improving the NGO algorithm. However, according to the average ranking, MSINGO is still superior to WS_NGO, indicating that the other two strategies also play an auxiliary role. Therefore, the performance of the algorithm is improved the most when the three strategies are added simultaneously.

4.4. Qualitative Analysis

The qualitative analysis of MSINGO in solving common unimodal and multimodal functions is shown in Figure 2; detailed and complete information about these functions is given in [24]. In the qualitative analysis, four well-known indicators are used to intuitively analyze the performance of the MSINGO algorithm: (1) the search history; (2) the trajectory of the first northern goshawk in the 1st dimension; (3) the average fitness of the northern goshawk population; and (4) the convergence curve of the best candidate solution.
The search history shows the position of each northern goshawk in the search space during the iterations. As can be seen in Figure 2b, the red dot in the search history represents the location of the global optimum and the blue dots represent the locations of the candidate best solutions during the iterations. It can be easily seen from the search history that MSINGO can search globally and converge quickly once finding the main optimal area, indicating MSINGO has a good ability to perform global exploration and local exploitation.
The trajectory of the first dimension refers to the position changes of the first northern goshawk in the first dimension, which indicates the primary exploratory behavior of MSINGO. As shown in Figure 2c, in the trajectory diagram of the first northern goshawk, the position of the first northern goshawk experienced rapid oscillations in the initial iterations, indicating that the MSINGO algorithm can quickly identify the main optimal region. In the subsequent iterations, there were slight oscillations, indicating that MSINGO searched around the optimal position and converged to it.
The average fitness curves and convergence curves for different benchmark functions are also provided in Figure 2d,e. Both decline rapidly in the initial iterations, indicating that MSINGO converges quickly.

4.5. Comparison with 11 Well-Known Metaheuristic Algorithms

In qualitative analysis, we intuitively demonstrated the exploitation and exploration capabilities of the proposed MSINGO algorithm. In this section, we quantitatively compared our proposed MSINGO algorithm with 11 metaheuristic algorithms (5 highly cited and 6 recently proposed) on CEC2017 test functions. The comparison algorithms are described in Section 4.2.
Comparative experiments include the following: (1) comparison of exploitation capabilities on unimodal benchmark functions (C1–C2); (2) comparison of exploration abilities on multimodal benchmark functions (C3–C9); (3) comparison of local optimal avoidance abilities on hybrid functions (C10–C19) and composition functions (C20–C29).
Furthermore, the Friedman test was used to evaluate the overall performance of the 12 metaheuristic algorithms, and the Wilcoxon signed-rank test was used to verify whether two sets of solutions are statistically significantly different.
In these comparative experiments, the dimension to be optimized (dim) was set to 30. For a fair comparison, each algorithm was run independently 30 times on each benchmark, and the Ave, Std, and Rank were calculated, where Ave and Std indicate the mean and standard deviation of the optimal values and Rank indicates the algorithm’s ranking. A lower Ave indicates better optimization performance, and a lower Std indicates more stable optimization performance. The algorithm with the best performance is highlighted in bold.
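Both statistical tests can be reproduced with SciPy roughly as follows; the data layout (per-run fitness lists for one function and a functions-by-algorithms table of means) is an assumption of this sketch:

```python
import numpy as np
from scipy import stats

def wilcoxon_vs_baseline(results, baseline="MSINGO"):
    """Pairwise Wilcoxon signed-rank tests against the baseline.
    `results` maps each algorithm name to its 30 per-run best fitness
    values on one test function (paired run-by-run)."""
    pvals = {}
    for alg, runs in results.items():
        if alg == baseline:
            continue
        _, p = stats.wilcoxon(results[baseline], runs)
        pvals[alg] = p          # p < 0.05: significant difference
    return pvals

def friedman_ranking(mean_table):
    """Friedman test over a (functions x algorithms) table of mean fitness values."""
    columns = [mean_table[:, j] for j in range(mean_table.shape[1])]
    chi2, p = stats.friedmanchisquare(*columns)
    avg_rank = stats.rankdata(mean_table, axis=1).mean(axis=0)  # lower is better
    return chi2, p, avg_rank

# Tiny synthetic demo: 29 functions x 12 algorithms
demo = np.random.default_rng(0).random((29, 12))
print(friedman_ranking(demo)[2])
```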
Below is a detailed discussion of these comparative experiments.

4.5.1. Exploitation Ability Analysis

To test the exploitation ability of MSINGO, we compare it with the other 11 metaheuristic algorithms on two unimodal functions (C1–C2). The quantitative results for these two unimodal functions are shown in Table 7. It can be seen from Table 7 that MSINGO ranks first on both unimodal functions, indicating that MSINGO has the best exploitation ability among all 12 algorithms.
The convergence curves of 12 metaheuristic algorithms on C1 and C2 are shown in Figure 3. Obviously, our MSINGO algorithm has the fastest convergence speed and minimum fitness value, which shows that the proposed MSINGO is the most competitive algorithm in these unimodal functions.
According to the Rank score in Table 7, the Friedman test was performed, and the result is shown in Figure 4, where average rank represents the average ranking of all algorithms, and overall rank represents the final ranking of all algorithms in unimodal functions. The smaller the average rank and overall rank, the better the performance of the algorithm. As can be seen from Figure 4, MSINGO ranks first, indicating that it is superior to other comparison algorithms in exploitation ability.
The p-values obtained via the Wilcoxon signed-rank test between MSINGO and each of the comparison algorithms are presented in Table 8. A p-value less than 0.05 indicates a significant difference between the comparison algorithm and MSINGO. Conversely, there is no significant difference. As seen in Table 8, on the two unimodal functions, all p-values are less than 0.05, indicating that MSINGO is significantly better than other comparison algorithms.
The results in this section showed that the MSINGO algorithm has the best exploitation ability among all 12 algorithms.

4.5.2. Exploration Ability Analysis

In this section, we tested the exploration ability of MSINGO on seven multimodal functions (C3–C9). Quantitative analysis is presented in Table 9, where MSINGO ranks first on five functions (C3, C4, C5, C8, and C9) and second on two functions (C6 and C7), indicating that MSINGO’s exploration ability is superior to comparative algorithms in the vast majority of cases.
The convergence curves from C3 to C9 are shown in Figure 5. It can be seen that in terms of convergence speed, MSINGO ranks first on three functions (C3, C4, and C8) and second on the other four functions; from the perspective of the optimal solution, MSINGO is the smallest among five functions (C3, C4, C5, C8, and C9) and the second smallest among two functions (C6 and C7).
Friedman test results for these seven multimodal functions are shown in Figure 6, where MSINGO ranks first, indicating it has the best exploration ability among the 12 algorithms.
Furthermore, Table 10 shows the p-value results of the Wilcoxon signed-rank test on MSINGO against 11 other algorithms. The vast majority of p-values are less than 0.05, indicating that MSINGO is better than the comparison algorithms in terms of exploration capability.
The results of this section indicate that MSINGO is superior to other comparison algorithms in exploration ability.

4.5.3. Local Optimal Avoidance Ability Analysis

In this section, 10 hybrid functions (C10–C19) and 10 composition functions (C20–C29) are selected to verify the ability of the 12 metaheuristic algorithms to avoid local optima. Quantitative statistical results are presented in Table 11 and Table 12. It can be seen that MSINGO ranks first on nineteen functions (C10 to C17 and C19 to C29) and second on C18. These results indicate that MSINGO has superior performance in avoiding local optima compared to the 11 other metaheuristic algorithms.
The convergence curves of hybrid functions and composition functions are shown in Figure 7 and Figure 8, respectively. For these 20 hybrid and composition functions, MSINGO ranks first on 19 functions (C10–C17 and C19–C29) and second on C18 in convergence speed and optimal solution.
The results of the Friedman test for 20 hybrid and composition functions are shown in Figure 9. It can be seen that MSINGO ranks first among 12 algorithms.
The p-values of Wilcoxon signed-rank tests on hybrid functions (C10–C19) and composition functions (C20–C29) are shown in Table 13. Obviously, the majority of the p-values are less than 0.05, indicating a significant difference between MSINGO and the majority of algorithms in terms of avoiding local optima.
The results of this section show that MSINGO is the best in avoiding local optima among all algorithms.
In Figure 10, we show the Sankey ranking of the 12 algorithms, demonstrating that our proposed algorithm mostly maintains the first position across the different test functions.
Among all the functions, MSINGO and NGO have similar optimal values on seven functions (C5, C8, C9, C15, C6, C24, and C26), indicating that MSINGO’s advantage is not very obvious on these functions. Meanwhile, the optimal values on functions C6, C7, and C18 are not as good as those of NGO. This shows that the strategies we designed are not optimal in every case and may need some adjustments, such as parameter tuning. However, MSINGO has clear advantages on the other functions, so the strategies need to be considered from multiple aspects.

4.6. Scalability Analysis

In this section, to test the scalability of MSINGO, the dimension of benchmark function to be optimized (dim) is set to 100. For all algorithms, the population size is set to 30 and the maximum number of iterations is set to 500. Meanwhile, to reduce the influence of random factors, each algorithm was executed independently 30 times on each test function, and the Ave, Std, and Rank were calculated. The algorithm with the best performance is bolded. The results are shown in Table 14.
Among all the algorithms, MSINGO ranked first on 26 test functions (C1 to C5, C8, and C10 to C29), second on C7, fourth on C9, and fifth on C6.
In Figure 11, we show the Sankey ranking of the 12 algorithms on the high-dimensional test functions, demonstrating that our proposed algorithm mostly maintains the first position across the different test functions.
The results of the Friedman test on all 100-dimensional benchmark functions (C1–C29) are shown in Figure 12. It can be seen that MSINGO ranks first in all algorithms, indicating that MSINGO is the best of the 12 algorithms when dealing with high-dimensional problems.

4.7. Memory Occupation

We selected one function each from the unimodal (C1), multimodal (C4), hybrid (C15), and composition (C27) categories for this experiment. The population size and the maximum number of iterations of all algorithms were set to 30 and 500, respectively, and the dimension to be optimized (dim) was set to 30. The results are shown in Table 15. It can be concluded that the memory occupation of MSINGO is not the smallest, indicating that its faster convergence comes at the cost of increased memory usage.

5. MSINGO for Engineering Optimization Problems

This section verifies the efficiency of MSINGO in dealing with real-world optimization applications in six practical engineering design problems, including a tension/compression spring design problem (T/CSD) [68], cantilever beam design problem (CBD) [69], pressure vessel design problem (PVD) [70], welded beam design problem (WBD) [71], speed reducer design problem (SRD) [72], and three-bar truss design problem (T-bTD) [73]. Parameter settings of MSINGO and the other 11 comparative metaheuristic algorithms in this section are identical to those in Section 4.2.

5.1. Tension/Compression Spring Design Problem (T/CSD)

The tension/compression spring design (T/CSD) is an optimization problem to minimize the weight of a tension/compression spring with constraints. Its schematic diagram is shown in Figure 13. There are three variables that require optimization: spring wire diameter ( d ), spring coil diameter ( D ), and the number of active coils ( P ). The mathematical formula of T/CSD is as follows:
Consider: $x = [x_1\ x_2\ x_3] = [d\ D\ P]$
Minimize: $f(x) = (x_3 + 2) x_2 x_1^2$
Subject to: $g_1(x) = 1 - \dfrac{x_2^3 x_3}{71785 x_1^4} \leq 0$
$g_2(x) = \dfrac{4 x_2^2 - x_1 x_2}{12566 (x_2 x_1^3 - x_1^4)} + \dfrac{1}{5108 x_1^2} - 1 \leq 0$
$g_3(x) = 1 - \dfrac{140.45 x_1}{x_2^2 x_3} \leq 0$
$g_4(x) = \dfrac{x_1 + x_2}{1.5} - 1 \leq 0$
Parameters range: $0.05 \leq x_1 \leq 2$, $0.25 \leq x_2 \leq 1.3$, $2 \leq x_3 \leq 15$
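A penalized-objective sketch of T/CSD that a metaheuristic such as MSINGO could minimize directly; the static penalty coefficient and the example point are illustrative choices, not taken from the paper:

```python
import numpy as np

def tcsd_objective(x, rho=1e6):
    """Penalized T/CSD objective: spring weight plus a static penalty for
    violated constraints. The penalty weight rho is an illustrative choice."""
    d, D, P = x
    weight = (P + 2) * D * d ** 2
    g = np.array([
        1 - (D ** 3 * P) / (71785 * d ** 4),
        (4 * D ** 2 - d * D) / (12566 * (D * d ** 3 - d ** 4)) + 1 / (5108 * d ** 2) - 1,
        1 - 140.45 * d / (D ** 2 * P),
        (d + D) / 1.5 - 1,
    ])
    return weight + rho * np.sum(np.maximum(g, 0) ** 2)

# A feasible (not optimal) example point
print(tcsd_objective(np.array([0.055, 0.42, 10.0])))
```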
Table 16 presents the experimental results of MSINGO and 11 competitor algorithms in achieving the optimal solution for T/CSD. Based on these results, MSINGO ranks first among all the algorithms.

5.2. Cantilever Beam Design Problem (CBD)

The cantilever beam design (CBD) is an engineering optimization problem to minimize the beam’s weight while meeting the constraint conditions. Schematic representation of the CBD is shown in Figure 14. A cantilever beam consists of five hollow blocks, each of which is a hollow square section with a constant thickness. There are five variables to be optimized. The optimization problem of CBD can be defined as follows:
Consider: $x = [x_1\ x_2\ x_3\ x_4\ x_5]$
Minimize: $f(x) = 0.6224 (x_1 + x_2 + x_3 + x_4 + x_5)$
Subject to: $g(x) = \dfrac{61}{x_1^3} + \dfrac{37}{x_2^3} + \dfrac{19}{x_3^3} + \dfrac{7}{x_4^3} + \dfrac{1}{x_5^3} - 1 \leq 0$
Parameter range: $0.01 \leq x_1, x_2, x_3, x_4, x_5 \leq 100$
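The same penalty approach applies to CBD; the objective coefficient follows the formulation given above, and the penalty weight and example point are illustrative:

```python
import numpy as np

def cbd_objective(x, rho=1e6):
    """Penalized cantilever-beam objective; rho is an illustrative penalty weight."""
    x1, x2, x3, x4, x5 = x
    weight = 0.6224 * (x1 + x2 + x3 + x4 + x5)
    g = 61 / x1 ** 3 + 37 / x2 ** 3 + 19 / x3 ** 3 + 7 / x4 ** 3 + 1 / x5 ** 3 - 1
    return weight + rho * max(g, 0.0) ** 2

# A feasible example point
print(cbd_objective(np.array([6.0, 5.3, 4.5, 3.5, 2.2])))
```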
Table 17 presents the optimal results. Based on Table 17, MSINGO outperforms all the 11 competitor algorithms.

5.3. Pressure Vessel Design Problem (PVD)

The pressure vessel design (PVD) problem aims at minimizing total vessel cost while satisfying the constraint conditions, as illustrated in Figure 15. This problem contains four optimization variables: thickness of the shell ( T S ), thickness of the head ( T h ), inner radius ( R ), and length of the cylindrical section excluding the head ( L ). The mathematical formula of the PVD optimization problem is as follows:
Consider: $x = [x_1\ x_2\ x_3\ x_4] = [T_s\ T_h\ R\ L]$
Minimize: $f(x) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$
Subject to: $g_1(x) = -x_1 + 0.0193 x_3 \leq 0$
$g_2(x) = -x_2 + 0.00954 x_3 \leq 0$
$g_3(x) = -\pi x_3^2 x_4 - \dfrac{4}{3} \pi x_3^3 + 1{,}296{,}000 \leq 0$
$g_4(x) = x_4 - 240 \leq 0$
Parameters range: $0 \leq x_1, x_2 \leq 99$, $10 \leq x_3, x_4 \leq 200$
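A corresponding penalized sketch of the PVD objective (the penalty weight and the example point are illustrative):

```python
import numpy as np

def pvd_objective(x, rho=1e6):
    """Penalized pressure-vessel cost; rho is an illustrative penalty weight."""
    Ts, Th, R, L = x
    cost = (0.6224 * Ts * R * L + 1.7781 * Th * R ** 2
            + 3.1661 * Ts ** 2 * L + 19.84 * Ts ** 2 * R)
    g = np.array([
        -Ts + 0.0193 * R,
        -Th + 0.00954 * R,
        -np.pi * R ** 2 * L - (4.0 / 3.0) * np.pi * R ** 3 + 1_296_000,
        L - 240,
    ])
    return cost + rho * np.sum(np.maximum(g, 0) ** 2)

# A feasible (not optimal) example point
print(pvd_objective(np.array([1.0, 0.5, 50.0, 150.0])))
```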
Table 18 lists the optimal results of MSINGO and the competitor algorithms. MSINGO ranks second in solving the PVD problem, second only to DE.

5.4. Welded Beam Design Problem (WBD)

The welded beam design (WBD) problem is to minimize the cost of a welded beam. Figure 16 shows a schematic representation of WBD, which has four optimization variables: weld thickness ($h$), clamping bar length ($l$), bar height ($t$), and bar thickness ($b$). The mathematical formula of the WBD optimization problem is as follows:
Consider: $x = [x_1\ x_2\ x_3\ x_4] = [h\ l\ t\ b]$
Minimize: $f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$
Subject to: $g_1(x) = \tau(x) - \tau_{max} \leq 0$
$g_2(x) = \sigma(x) - \sigma_{max} \leq 0$
$g_3(x) = \delta(x) - \delta_{max} \leq 0$
$g_4(x) = x_1 - x_4 \leq 0$
$g_5(x) = P - P_c(x) \leq 0$
$g_6(x) = 0.125 - x_1 \leq 0$
$g_7(x) = 1.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \leq 0$
Parameter range: $0.1 \leq x_1, x_4 \leq 2$, $0.1 \leq x_2, x_3 \leq 10$
Where: $\tau(x) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \dfrac{x_2}{2R} + (\tau'')^2}$
$\tau' = \dfrac{P}{\sqrt{2} x_1 x_2}, \quad \tau'' = \dfrac{M R}{J}$
$M = P \left( L + \dfrac{x_2}{2} \right)$
$R = \sqrt{\dfrac{x_2^2}{4} + \left( \dfrac{x_1 + x_3}{2} \right)^2}$
$J = 2 \left\{ \sqrt{2} x_1 x_2 \left[ \dfrac{x_2^2}{4} + \left( \dfrac{x_1 + x_3}{2} \right)^2 \right] \right\}$
$\sigma(x) = \dfrac{6 P L}{x_4 x_3^2}, \quad \delta(x) = \dfrac{6 P L^3}{E x_3^2 x_4}$
$P_c(x) = \dfrac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2} \left( 1 - \dfrac{x_3}{2L} \sqrt{\dfrac{E}{4G}} \right)$
$P = 6000$ lb, $L = 14$ in, $\delta_{max} = 0.25$ in
$E = 30 \times 10^6$ psi, $G = 12 \times 10^6$ psi
$\tau_{max} = 13{,}600$ psi, $\sigma_{max} = 30{,}000$ psi
The results of the WBD problem solved by MSINGO and the compared algorithms are summarized in Table 19, where MSINGO ranks first.

5.5. Speed Reducer Design Problem (SRD)

The speed reducer design (SRD) problem is an engineering optimization problem to minimize the weight of the reducer with constraints. A schematic representation of SRD is shown in Figure 17. It includes seven optimization variables: face width ( b ), module of teeth ( m ), pinion teeth count ( p ), length of the first shaft between bearings ( l 1 ), length of the second shaft between bearings ( l 2 ), diameter of the first shaft ( d 1 ), and diameter of the second shaft ( d 2 ). The mathematical formula of the SRD optimization problem is as follows:
Consider: $x = [x_1\ x_2\ x_3\ x_4\ x_5\ x_6\ x_7] = [b\ m\ p\ l_1\ l_2\ d_1\ d_2]$
Minimize: $f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)$
Subject to: $g_1(x) = \dfrac{27}{x_1 x_2^2 x_3} - 1 \leq 0$
$g_2(x) = \dfrac{397.5}{x_1 x_2^2 x_3^2} - 1 \leq 0$
$g_3(x) = \dfrac{1.93 x_4^3}{x_2 x_3 x_6^4} - 1 \leq 0$
$g_4(x) = \dfrac{1.93 x_5^3}{x_2 x_3 x_7^4} - 1 \leq 0$
$g_5(x) = \dfrac{\sqrt{\left( \frac{745 x_4}{x_2 x_3} \right)^2 + 16.9 \times 10^6}}{110.0 x_6^3} - 1 \leq 0$
$g_6(x) = \dfrac{\sqrt{\left( \frac{745 x_5}{x_2 x_3} \right)^2 + 157.5 \times 10^6}}{85.0 x_7^3} - 1 \leq 0$
$g_7(x) = \dfrac{x_2 x_3}{40} - 1 \leq 0$
$g_8(x) = \dfrac{5 x_2}{x_1} - 1 \leq 0$
$g_9(x) = \dfrac{x_1}{12 x_2} - 1 \leq 0$
$g_{10}(x) = \dfrac{1.5 x_6 + 1.9}{x_4} - 1 \leq 0$
$g_{11}(x) = \dfrac{1.1 x_7 + 1.9}{x_5} - 1 \leq 0$
Parameter range: $2.6 \leq x_1 \leq 3.6$, $0.7 \leq x_2 \leq 0.8$, $17.0 \leq x_3 \leq 28.0$, $7.3 \leq x_4 \leq 8.3$, $7.3 \leq x_5 \leq 8.3$, $2.9 \leq x_6 \leq 3.9$, $5.0 \leq x_7 \leq 5.5$
Table 20 displays the optimization results for SRD. MSINGO ranks first along with NGO, DE, and BWO.

5.6. Three-Bar Truss Design Problem (T-bTD)

The three-bar truss design (T-bTD) problem is an optimization problem in civil engineering where the objective is to minimize the volume of the three-bar truss. The schematic representation of T-bTD is shown in Figure 18. It has two optimization variables, namely x 1 and x 2 . The mathematical formula of T-bTD is defined as follows:
Consider: $x = [x_1\ x_2]$
Minimize: $f(x) = (2\sqrt{2} x_1 + x_2) \times l$
Subject to: $g_1(x) = \dfrac{\sqrt{2} x_1 + x_2}{\sqrt{2} x_1^2 + 2 x_1 x_2} P - \sigma \leq 0$
$g_2(x) = \dfrac{x_2}{\sqrt{2} x_1^2 + 2 x_1 x_2} P - \sigma \leq 0$
$g_3(x) = \dfrac{1}{\sqrt{2} x_2 + x_1} P - \sigma \leq 0$
Parameter range: $0 \leq x_1, x_2 \leq 1$
Where: $l = 100$ cm, $P = 2$ kN/cm$^2$, $\sigma = 2$ kN/cm$^2$
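A penalized sketch of T-bTD, assuming the usual allowable stress $\sigma = 2$ kN/cm$^2$ (the penalty weight and example point are illustrative):

```python
import numpy as np

def tbtd_objective(x, l=100.0, P=2.0, sigma=2.0, rho=1e6):
    """Penalized three-bar truss volume; sigma = 2 kN/cm^2 is the usual allowable
    stress assumed here, and rho is an illustrative penalty weight."""
    x1, x2 = x
    volume = (2 * np.sqrt(2) * x1 + x2) * l
    denom = np.sqrt(2) * x1 ** 2 + 2 * x1 * x2
    g = np.array([
        (np.sqrt(2) * x1 + x2) / denom * P - sigma,
        x2 / denom * P - sigma,
        1 / (np.sqrt(2) * x2 + x1) * P - sigma,
    ])
    return volume + rho * np.sum(np.maximum(g, 0) ** 2)

# A feasible example point
print(tbtd_objective(np.array([0.8, 0.4])))
```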
Table 21 presents the experimental results. MSINGO ranks second, behind DE.
Figure 19 shows a heat map of the 12 algorithms on the six engineering applications. MSINGO has the lowest overall rank, demonstrating its excellent performance in solving engineering problems.
Different engineering applications have different conditions, so one algorithm cannot be applicable to all engineering applications. In most cases, MSINGO outperforms other algorithms, but performs worse than DE in the pressure vessel design problem (PVD) and the three-bar truss design problem (T-bTD). Therefore, for different practical problems, algorithm improvement also requires targeted analysis.

6. Conclusions and Future Work

In this paper, we propose a Multi-Strategy Improved Northern Goshawk Optimization algorithm named MSINGO. First, cubic mapping is applied in the population initialization to improve the population diversity of the algorithm. Second, weighted stochastic difference mutation is added in the exploration phase to help the algorithm jump out of local optima. Finally, we use the weighted sine and cosine optimization strategy instead of the original exploitation formula to enhance the convergence speed of MSINGO. We analyzed the impact of the three strategies on NGO and verified the effectiveness of the proposed algorithm. Then, MSINGO was compared with 11 well-known algorithms on the CEC2017 test functions, covering exploitation ability, exploration ability, local optimum avoidance, and scalability. The comparative experimental results show that, in the vast majority of cases, MSINGO is superior to the competing algorithms in these respects. Additionally, applying MSINGO to six engineering design optimization problems demonstrated its high capability in real-world optimization.
In the future, we will verify our algorithm on larger problems, such as parameter selection for neural network models, and study its performance at scale. We will also investigate hybrid methods, such as combining it with faster gradient-based approaches, to further validate the performance of the algorithm.

Author Contributions

H.L.: Methodology, Writing—Review and Editing; J.X.: Software, Writing—Original draft; Y.Y.: Resources; S.Z.: Methodology; Y.C.: Methodology; R.Z.: Supervision; Y.M.: Supervision; M.W.: Resources; K.Z.: Resources. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Science Foundation of Hebei Province (D2023512004).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Inquiries about data availability should be directed to the author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Phan, H.D.; Ellis, K.; Barca, J.C.; Dorin, A. A Survey of Dynamic Parameter Setting Methods for Nature-Inspired Swarm Intelligence Algorithms. Neural Comput. Appl. 2020, 32, 567–588. [Google Scholar] [CrossRef]
  2. Nadimi-Shahraki, M.H.; Zamani, H.; Mirjalili, S. Enhanced Whale Optimization Algorithm for Medical Feature Selection: A COVID-19 Case Study. Comput. Biol. Med. 2022, 148, 105858. [Google Scholar] [CrossRef]
  3. Guo, X.; Hu, J.; Yu, H.; Wang, M.; Yang, B. A New Population Initialization of Metaheuristic Algorithms Based on Hybrid Fuzzy Rough Set for High-Dimensional Gene Data Feature Selection. Comput. Biol. Med. 2023, 166, 107538. [Google Scholar] [CrossRef]
  4. Wang, X.; Choi, T.-M.; Liu, H.; Yue, X. A Novel Hybrid Ant Colony Optimization Algorithm for Emergency Transportation Problems During Post-Disaster Scenarios. IEEE Trans. Syst. Man Cybern Syst. 2018, 48, 545–556. [Google Scholar] [CrossRef]
  5. Beheshtinia, M.A.; Jozi, A.; Fathi, M. Optimizing Disaster Relief Goods Distribution and Transportation: A Mathematical Model and Metaheuristic Algorithms. Appl. Math. Sci. Eng. 2023, 31, 2252980. [Google Scholar] [CrossRef]
  6. Shen, Y.; Zhang, C.; Soleimanian Gharehchopogh, F.; Mirjalili, S. An Improved Whale Optimization Algorithm Based on Multi-Population Evolution for Global Optimization and Engineering Design Problems. Expert Syst. Appl. 2023, 215, 119269. [Google Scholar] [CrossRef]
  7. Jiadong, Q.; Ohl, J.P.; Tran, T.-T. Predicting Clay Compressibility for Foundation Design with High Reliability and Safety: A Geotechnical Engineering Perspective Using Artificial Neural Network and Five Metaheuristic Algorithms. Reliab. Eng. Syst. Saf. 2024, 243, 109827. [Google Scholar] [CrossRef]
  8. Lin, W.-Y. A Novel 3D Fruit Fly Optimization Algorithm and Its Applications in Economics. Neural Comput. Appl. 2016, 27, 1391–1413. [Google Scholar] [CrossRef]
  9. Ewees, A.A.; Mostafa, R.R.; Ghoniem, R.M.; Gaheen, M.A. Improved Seagull Optimization Algorithm Using Lévy Flight and Mutation Operator for Feature Selection. Neural Comput. Appl. 2022, 34, 7437–7472. [Google Scholar] [CrossRef]
  10. Dutta, D.; Rath, S. Innovative Hybrid Metaheuristic Algorithms: Exponential Mutation and Dual-Swarm Strategy for Hybrid Feature Selection Problem. Int. J. Inf. Tecnol. 2024, 16, 77–89. [Google Scholar] [CrossRef]
  11. Yildirim, S.; Kaya, Y.; Kılıç, F. A Modified Feature Selection Method Based on Metaheuristic Algorithms for Speech Emotion Recognition. Appl. Acoust. 2021, 173, 107721. [Google Scholar] [CrossRef]
  12. Nasruddin; Sholahudin; Satrio, P.; Mahlia, T.M.I.; Giannetti, N.; Saito, K. Optimization of HVAC System Energy Consumption in a Building Using Artificial Neural Network and Multi-Objective Genetic Algorithm. Sustain. Energy Technol. Assess. 2019, 35, 48–57. [Google Scholar] [CrossRef]
  13. Eker, E.; Kayri, M.; Ekinci, S.; İzci, D. Comparison of Swarm-Based Metaheuristic and Gradient Descent-Based Algorithms in Artificial Neural Network Training. Adv. Distrib. Comput. Artif. Intell. J. 2023, 12, e29969. [Google Scholar] [CrossRef]
  14. Mirsaeidi, S.; Shang, F.; Ghaffari, K.; He, J.; Said, D.M.; Muttaqi, K.M. An Artificial Neural Network Based Strategy for Commutation Failure Forecasting in LCC-HVDC Transmission Networks. In Proceedings of the 2023 IEEE International Conference on Energy Technologies for Future Grids (ETFG), Wollongong, Australia, 3 December 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
  15. Zheng, R.; Hussien, A.G.; Qaddoura, R.; Jia, H.; Abualigah, L.; Wang, S.; Saber, A. A Multi-Strategy Enhanced African Vultures Optimization Algorithm for Global Optimization Problems. J. Comput. Des. Eng. 2023, 10, 329–356. [Google Scholar] [CrossRef]
  16. Hochreiter, S.; Younger, A.S.; Conwell, P.R. Learning to Learn Using Gradient Descent. In Proceedings of the Artificial Neural Networks—ICANN 2001, Vienna, Austria, 21–25 August 2001; Dorffner, G., Bischof, H., Hornik, K., Eds.; Springer: Berlin/Heidelberg, Germany, 2001; pp. 87–94. [Google Scholar]
  17. Polyak, B.T. Newton’s Method and Its Use in Optimization. Eur. J. Oper. Res. 2007, 181, 1086–1096. [Google Scholar] [CrossRef]
  18. Fletcher, R. Function Minimization by Conjugate Gradients. Comput. J. 1964, 7, 149–154. [Google Scholar] [CrossRef]
  19. Zhong, C.; Li, G.; Meng, Z. Beluga Whale Optimization: A Novel Nature-Inspired Metaheuristic Algorithm. Knowl.-Based Syst. 2022, 251, 109215. [Google Scholar] [CrossRef]
  20. Sahoo, S.K.; Saha, A.K.; Nama, S.; Masdari, M. An Improved Moth Flame Optimization Algorithm Based on Modified Dynamic Opposite Learning Strategy. Artif. Intell. Rev. 2023, 56, 2811–2869. [Google Scholar] [CrossRef]
  21. Mirjalili, S. Moth-Flame Optimization Algorithm: A Novel Nature-Inspired Heuristic Paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  22. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  23. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  24. Yao, X.; Liu, Y.; Lin, G. Evolutionary Programming Made Faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar] [CrossRef]
  25. Koza, J.R. Genetic Programming as a Means for Programming Computers by Natural Selection. Stat. Comput. 1994, 4, 87–112. [Google Scholar] [CrossRef]
  26. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef]
  27. Moscato, P.; Cotta Porras, C. An Introduction to Memetic Algorithms. Int. Artif. 2003, 7, 360. [Google Scholar] [CrossRef]
  28. Atashpaz-Gargari, E.; Lucas, C. Imperialist Competitive Algorithm: An Algorithm for Optimization Inspired by Imperialistic Competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 4661–4667. [Google Scholar]
  29. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  30. Dorigo, M.; Birattari, M.; Stutzle, T. Ant Colony Optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
  31. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  32. Yang, X.-S.; Deb, S. Cuckoo Search via Lévy Flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 210–214. [Google Scholar]
  33. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  34. Xue, J.; Shen, B. A Novel Swarm Intelligence Optimization Approach: Sparrow Search Algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34. [Google Scholar] [CrossRef]
  35. Xue, J.; Shen, B. Dung Beetle Optimizer: A New Meta-Heuristic Algorithm for Global Optimization. J. Supercomput. 2023, 79, 7305–7336. [Google Scholar] [CrossRef]
  36. Połap, D.; Woźniak, M. Red Fox Optimization Algorithm. Expert Syst. Appl. 2021, 166, 114107. [Google Scholar] [CrossRef]
  37. Zhao, S.; Zhang, T.; Ma, S.; Wang, M. Sea-Horse Optimizer: A Novel Nature-Inspired Meta-Heuristic for Global Optimization Problems. Appl. Intell. 2023, 53, 11833–11860. [Google Scholar] [CrossRef]
  38. Dehghani, M.; Montazeri, Z.; Trojovská, E.; Trojovský, P. Coati Optimization Algorithm: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2023, 259, 110011. [Google Scholar] [CrossRef]
  39. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Spider Wasp Optimizer: A Novel Meta-Heuristic Optimization Algorithm. Artif. Intell. Rev. 2023, 56, 11675–11738. [Google Scholar] [CrossRef]
  40. Zhang, W.; Zhao, J.; Liu, H.; Tu, L. Cleaner Fish Optimization Algorithm: A New Bio-Inspired Meta-Heuristic Optimization Algorithm. J. Supercomput. 2024, 80, 17338–17376. [Google Scholar] [CrossRef]
  41. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  42. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  43. Alatas, B. ACROA: Artificial Chemical Reaction Optimization Algorithm for Global Optimization. Expert Syst. Appl. 2011, 38, 13170–13180. [Google Scholar] [CrossRef]
  44. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  45. Kaveh, A.; Dadras, A. A Novel Meta-Heuristic Optimization Algorithm: Thermal Exchange Optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  46. Abdel-Basset, M.; Mohamed, R.; Azeem, S.A.A.; Jameel, M.; Abouhawwash, M. Kepler Optimization Algorithm: A New Metaheuristic Algorithm Inspired by Kepler’s Laws of Planetary Motion. Knowl.-Based Syst. 2023, 268, 110454. [Google Scholar] [CrossRef]
  47. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–Learning-Based Optimization: A Novel Method for Constrained Mechanical Design Optimization Problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  48. Kuo, H.C.; Lin, C.H. Cultural Evolution Algorithm for Global Optimizations and Its Applications. J. Appl. Res. Technol. 2013, 11, 510–522. [Google Scholar] [CrossRef]
  49. Liu, Z.; Qin, J.; Peng, W.; Chao, H. Effective Task Scheduling in Cloud Computing Based on Improved Social Learning Optimization Algorithm. Int. J. Online Eng. 2017, 13, 4. [Google Scholar] [CrossRef]
  50. Kumar, M.; Kulkarni, A.J.; Satapathy, S.C. Socio Evolution & Learning Optimization Algorithm: A Socio-Inspired Optimization Methodology. Future Gener. Comput. Syst. 2018, 81, 252–272. [Google Scholar] [CrossRef]
  51. Moghdani, R.; Salimifard, K. Volleyball Premier League Algorithm. Appl. Soft Comput. 2018, 64, 161–185. [Google Scholar] [CrossRef]
  52. Dhiman, G.; Kumar, V. Seagull Optimization Algorithm: Theory and Its Applications for Large-Scale Industrial Engineering Problems. Knowl.-Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  53. Dehghani, M.; Hubalovsky, S.; Trojovsky, P. Northern Goshawk Optimization: A New Swarm-Based Algorithm for Solving Optimization Problems. IEEE Access 2021, 9, 162059–162080. [Google Scholar] [CrossRef]
  54. Trojovský, P.; Dehghani, M. Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef]
  55. Chopra, N.; Mohsin Ansari, M. Golden Jackal Optimization: A Novel Nature-Inspired Optimizer for Engineering Applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar] [CrossRef]
  56. Tayarani-N, M.H.; Akbarzadeh-T, M.R. Magnetic Optimization Algorithms: A New Synthesis. In Proceedings of the 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–6 June 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 2659–2664. [Google Scholar]
  57. Shareef, H.; Ibrahim, A.A.; Mutlag, A.H. Lightning Search Algorithm. Appl. Soft Comput. 2015, 36, 315–333. [Google Scholar] [CrossRef]
  58. Tanyildizi, E.; Demir, G. Golden Sine Algorithm: A Novel Math-Inspired Algorithm. Adv. Electr. Comput. Eng. 2017, 17, 71–78. [Google Scholar] [CrossRef]
  59. Emami, H.; Derakhshan, F. Election Algorithm: A New Socio-Politically Inspired Strategy. AI Commun. 2015, 28, 591–603. [Google Scholar] [CrossRef]
  60. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  61. Chang, T.; Ge, Y.; Lin, Q.; Wang, Y.; Chen, R.; Wang, J. Optimal Configuration of Hybrid Energy Storage Capacity Based on Northern Goshawk Optimization. In Proceedings of the 2023 35th Chinese Control and Decision Conference (CCDC), Yichang, China, 20 May 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 301–306. [Google Scholar]
  62. El-Dabah, M.A.; El-Sehiemy, R.A.; Hasanien, H.M.; Saad, B. Photovoltaic Model Parameters Identification Using Northern Goshawk Optimization Algorithm. Energy 2023, 262, 125522. [Google Scholar] [CrossRef]
  63. Wu, X.; He, L.; Wu, G.; Liu, B.; Song, D. Optimizing CNN-LSTM Model for Short-Term PV Power Prediction Using Northern Goshawk Optimization. In Proceedings of the 2023 6th International Conference on Power and Energy Applications (ICPEA), Weihai, China, 24 November 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 248–252. [Google Scholar]
  64. Deng, H.; Liu, L.; Fang, J.; Qu, B.; Huang, Q. A Novel Improved Whale Optimization Algorithm for Optimization Problems with Multi-Strategy and Hybrid Algorithm. Math. Comput. Simul. 2023, 205, 794–817. [Google Scholar] [CrossRef]
  65. Fan, J.; Li, Y.; Wang, T. An Improved African Vultures Optimization Algorithm Based on Tent Chaotic Mapping and Time-Varying Mechanism. PLoS ONE 2021, 16, e0260725. [Google Scholar] [CrossRef]
  66. Rogers, T.D.; Whitley, D.C. Chaos in the Cubic Mapping. Math. Model 1983, 4, 9–25. [Google Scholar] [CrossRef]
  67. Chechkin, A.V.; Metzler, R.; Klafter, J.; Gonchar, V.Y. Introduction to the Theory of Lévy Flights. In Anomalous Transport; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2008; pp. 129–162. ISBN 978-3-527-62297-9. [Google Scholar]
  68. Belegundu, A.D.; Arora, J.S. A Study of Mathematical Programming Methods for Structural Optimization. Part II: Numerical Results. Int. J. Numer. Methods Eng. 1985, 21, 1601–1623. [Google Scholar] [CrossRef]
  69. Chickermane, H.; Gea, H.C. Structural Optimization Using a New Local Approximation Method. Int. J. Numer. Methods Eng. 1996, 39, 829–846. [Google Scholar] [CrossRef]
  70. Ghasemi, M.; Golalipour, K.; Zare, M.; Mirjalili, S.; Trojovský, P.; Abualigah, L.; Hemmati, R. Flood Algorithm (FLA): An Efficient Inspired Meta-Heuristic for Engineering Optimization. J. Supercomput. 2024, 80, 22913–23017. [Google Scholar] [CrossRef]
  71. Coello Coello, C.A. Use of a Self-Adaptive Penalty Approach for Engineering Optimization Problems. Comput. Ind. 2000, 41, 113–127. [Google Scholar] [CrossRef]
  72. Mezura-Montes, E.; Coello, C.A.C. Useful Infeasible Solutions in Engineering Optimization with Evolutionary Algorithms. In Proceedings of the MICAI 2005: Advances in Artificial Intelligence, Monterrey, Mexico, 14–18 November 2005; Gelbukh, A., de Albornoz, Á., Terashima-Marín, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2005; pp. 652–662. [Google Scholar]
  73. Ray, T.; Saini, P. Engineering Design Optimization Using a Swarm with an Intelligent Information Sharing among Individuals. Eng. Optim. 2001, 33, 735–748. [Google Scholar] [CrossRef]
Figure 1. The flow chart of the MSINGO algorithm.
Figure 2. Qualitative results of MSINGO, including (a) function’s landscape; (b) search history; (c) trajectory of 1st dimension; (d) average fitness; and (e) convergence curve.
Figure 3. Convergence curves of different algorithms on the unimodal functions (C1–C2, dimension = 30).
Figure 4. Radar maps of Friedman ranking of different algorithms on unimodal functions.
Figure 5. Convergence curves of different algorithms on the multimodal functions (C3–C9, dimension = 30).
Figure 6. Radar maps of Friedman ranking of different algorithms on multimodal functions.
Figure 7. Convergence curves of different algorithms on hybrid functions (C10–C19, dimension = 30).
Figure 8. Convergence curves of different algorithms on composition functions (C20–C29, dimension = 30).
Figure 9. Radar maps of Friedman ranking of different algorithms on hybrid functions and composition functions.
Figure 10. The Sankey ranking of different algorithms on CEC2017 (dimension = 30).
Figure 11. The Sankey ranking of different algorithms on CEC2017 (dimension = 100).
Figure 12. Radar maps of Friedman ranking of different algorithms on all functions.
Figure 13. Schematic representation of T/CSD.
Figure 14. Schematic representation of CBD.
Figure 15. Schematic representation of PVD.
Figure 16. Schematic representation of WBD.
Figure 17. Schematic representation of SRD.
Figure 18. Schematic representation of T-bTD.
Figure 19. The heat map of different algorithms on 6 engineering problems.
Table 2. Details of the CEC2017 benchmark functions.
Category | ID | Function | Range | f_min
Unimodal functions | C1 | Shifted and Rotated Bent Cigar Function | [−100, 100] | 100
| C2 | Shifted and Rotated Zakharov Function | [−100, 100] | 200
Multimodal functions | C3 | Shifted and Rotated Rosenbrock’s Function | [−100, 100] | 300
| C4 | Shifted and Rotated Rastrigin’s Function | [−100, 100] | 400
| C5 | Shifted and Rotated Schaffer’s F7 Function | [−100, 100] | 500
| C6 | Shifted and Rotated Lunacek Bi-Rastrigin’s Function | [−100, 100] | 600
| C7 | Shifted and Rotated Non-Continuous Rastrigin’s Function | [−100, 100] | 700
| C8 | Shifted and Rotated Levy Function | [−100, 100] | 800
| C9 | Shifted and Rotated Schwefel’s Function | [−100, 100] | 900
Hybrid functions | C10 | Hybrid Function 1 (N = 3) | [−100, 100] | 1000
| C11 | Hybrid Function 2 (N = 3) | [−100, 100] | 1100
| C12 | Hybrid Function 3 (N = 3) | [−100, 100] | 1200
| C13 | Hybrid Function 4 (N = 4) | [−100, 100] | 1300
| C14 | Hybrid Function 5 (N = 4) | [−100, 100] | 1400
| C15 | Hybrid Function 6 (N = 4) | [−100, 100] | 1500
| C16 | Hybrid Function 7 (N = 5) | [−100, 100] | 1600
| C17 | Hybrid Function 8 (N = 5) | [−100, 100] | 1700
| C18 | Hybrid Function 9 (N = 5) | [−100, 100] | 1800
| C19 | Hybrid Function 10 (N = 6) | [−100, 100] | 1900
Composition functions | C20 | Composition Function 1 (N = 3) | [−100, 100] | 2000
| C21 | Composition Function 2 (N = 3) | [−100, 100] | 2100
| C22 | Composition Function 3 (N = 4) | [−100, 100] | 2200
| C23 | Composition Function 4 (N = 4) | [−100, 100] | 2300
| C24 | Composition Function 5 (N = 5) | [−100, 100] | 2400
| C25 | Composition Function 6 (N = 5) | [−100, 100] | 2500
| C26 | Composition Function 7 (N = 6) | [−100, 100] | 2600
| C27 | Composition Function 8 (N = 6) | [−100, 100] | 2700
| C28 | Composition Function 9 (N = 3) | [−100, 100] | 2800
| C29 | Composition Function 10 (N = 3) | [−100, 100] | 2900
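For readers who want to reproduce the benchmark setting, the following sketch shows how a shifted and rotated CEC2017-style function such as C1 (Bent Cigar) can be evaluated. The shift vector and rotation matrix below are random placeholders (the official CEC2017 suite distributes its own data files), so this is an illustrative assumption rather than the exact benchmark code; the bias added at the end corresponds to the f_min column of Table 2, and the function name evaluate_c1 is ours.

```python
import numpy as np

def bent_cigar(z):
    # Basic Bent Cigar function: f(z) = z_1^2 + 10^6 * sum_{i>=2} z_i^2
    return z[0] ** 2 + 1.0e6 * np.sum(z[1:] ** 2)

def evaluate_c1(x, shift, rotation, bias=100.0):
    # Shift and rotate the candidate, then add the function bias (the f_min column of Table 2).
    z = rotation @ (x - shift)
    return bent_cigar(z) + bias

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dim = 30
    shift = rng.uniform(-80.0, 80.0, dim)                         # placeholder shift vector
    rotation, _ = np.linalg.qr(rng.standard_normal((dim, dim)))   # random orthogonal matrix as a stand-in
    x = rng.uniform(-100.0, 100.0, dim)                           # candidate drawn from the search range
    print(f"C1-style value: {evaluate_c1(x, shift, rotation):.4e}")
```

Any metaheuristic under test only needs to call such a routine as a black-box objective over the search range [−100, 100]^D listed in Table 2.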
Table 3. Parameter settings of the competitors and proposed MSINGO.
Category | Algorithm | Name of the Parameter | Value of the Parameter
Highly cited | DE | P_Cr, F | 0.8, 0.85
| MFO | b | 1
| WOA | a, a2, b | [0, 2], [−2, −1], 1
| SCA | a | 2
| SOA | fc | 2
Recently proposed | SSA | ST, PD, SD | 0.6, 0.7, 0.2
| DBO | P_ballRolling, P_broodBall, P_Small, P_thief, b, k, S | 0.2, 0.4, 0.2, 0.4, 0.3, 0.1, 0.5
| POA | R | 0.2
| BWO | wf | [0.1, 0.05]
| GJO | β | 0.5
| NGO | R | [0, 0.02]
Our proposed | MSINGO | a, β | 2.595, 1.5
Table 4. Various NGO algorithms from three mechanisms.
Algorithm | Cubic Mapping (C) | Weighted Stochastic Difference Variation (WS) | Weighted Sine and Cosine Optimization (WSC)
NGO | 0 | 0 | 0
C_NGO | 1 | 0 | 0
WS_NGO | 0 | 1 | 0
WSC_NGO | 0 | 0 | 1
C_WS_NGO | 1 | 1 | 0
C_WSC_NGO | 1 | 0 | 1
WS_WSC_NGO | 0 | 1 | 1
MSINGO | 1 | 1 | 1
Table 5. Experimental results of strategy comparison on CEC2017 test functions.
ID NGOC_NGOWS_NGOWSC_NGOC_WS_NGOC_WSC_NGOWS_WSC_NGOMSINGO
C1Ave1.8025 × 1071.4315 × 1085.5679 × 1033.8139 × 1065.3757 × 1034.0069 × 1068.2264 × 1031.0583 × 104
Std1.9969 × 1071.8369 × 1084.6062 × 1031.4174 × 1065.1807 × 1031.7799 × 1066.2044 × 1037.6418 × 10
C2Ave9.5971 × 1022.2737 × 1032.0586 × 1023.3318 × 1022.0162 × 1023.2309 × 1022.1121 × 1022.0902 × 102
Std4.8997 × 1028.6804 × 1021.3416 × 1015.7545 × 1011.93505.5625 × 1017.66825.5120
C3Ave4.2824 × 1024.7716 × 1023.5739 × 1023.9802 × 1023.7833 × 1023.9686 × 1023.5959 × 1023.5318 × 102
Std4.6788 × 1015.3148 × 1012.8096 × 1013.1249 × 1014.3003 × 1014.4813 × 1012.9606 × 1012.6328 × 101
C4Ave1.0209 × 1031.2609 × 1035.7725 × 1027.0194 × 1025.8034 × 1027.0097 × 1026.3334 × 1026.2263 × 102
Std2.1185 × 1025.0975 × 1023.1176 × 1015.1463 × 1012.9463 × 1013.8466 × 1013.0272 × 1013.0987 × 101
C5Ave5.0000 × 1025.0000 × 1025.0000 × 1025.0000 × 1025.0000 × 1025.0000 × 1025.0000 × 1025.0000 × 102
Std1.1436 × 10−32.5502 × 10−38.8501 × 10−43.6425 × 10−46.3949 × 10−43.7749 × 10−44.5111 × 10−44.1208 × 10−4
C6Ave3.4794 × 1031.9116 × 1043.2849 × 1031.5336 × 1042.6826 × 1031.4086 × 1048.7875 × 1038.4491 × 103
Std1.3695 × 1036.3324 × 1031.2963 × 1036.6900 × 1037.8237 × 1025.6652 × 1033.6049 × 1033.9353 × 103
C7Ave7.0013 × 1027.0103 × 1027.0010 × 1027.0022 × 1027.0011 × 1027.0020 × 1027.0014 × 1027.0015 × 102
Std6.0608 × 10−22.6993 × 10−17.1534 × 10−28.5846 × 10−24.2965 × 10−27.3934 × 10−26.2216 × 10−26.1209 × 10−2
C8Ave8.0554 × 1028.0587 × 1028.0154 × 1028.0425 × 1028.0158 × 1028.0471 × 1028.0199 × 1028.0156 × 102
Std1.95181.95251.43771.43771.223701.51041.39901.0497
C9Ave5.1677 × 1037.7212 × 1035.2430 × 1035.2093 × 1035.2409 × 1035.1227 × 1035.2556 × 1035.0619 × 103
Std4.5304 × 1023.6230 × 1024.0000 × 1025.7974 × 1024.0613 × 1027.1581 × 1024.8143 × 1025.4751 × 102
C10Ave3.5772 × 1042.9359 × 1043.8583 × 1041.2361 × 1042.5199 × 1041.3668 × 1042.2557 × 1041.6519 × 104
Std2.3458 × 1041.6215 × 1042.7401 × 1046.2903 × 1031.5051 × 1046.8203 × 1032.0052 × 1048.7681 × 103
C11Ave1.9569 × 1053.9197 × 1051.0143 × 1052.6041 × 1051.1863 × 1052.9005 × 1059.7033 × 1058.1669 × 104
Std1.9607 × 1053.3473 × 1058.6213 × 1042.3087 × 1051.0390 × 1053.2213 × 1058.3390 × 1045.8853 × 104
C12Ave2.5452 × 1042.3404 × 1045.7697 × 1031.0315 × 1056.2603 × 1034.9563 × 1046.5499 × 1037.6979 × 103
Std2.6804 × 1041.2787 × 1044.0495 × 1031.1173 × 1055.3015 × 1036.1465 × 1045.3117 × 1035.5292 × 103
C13Ave2.5841 × 1053.1302 × 1051.1252 × 1053.1334 × 1051.1695 × 1052.5456 × 1051.5680 × 1051.5022 × 105
Std1.4319 × 1051.3164 × 1053.6354 × 1041.1721 × 1053.9481 × 1041.0037 × 1058.8740 × 1044.4144 × 104
C14Ave2.6773 × 1042.4840 × 1042.1099 × 1042.7076 × 1042.0208 × 1042.2563 × 1041.6073 × 1041.3331 × 104
Std1.0410 × 1041.0861 × 1041.0634 × 1041.0504 × 1041.4964 × 1048.0369 × 1035.6588 × 1037.8585 × 103
C15Ave1.9647 × 1036.3238 × 1031.8444 × 1032.3482 × 1031.7647 × 1033.3227 × 1032.1792 × 1031.9586 × 103
Std3.4647 × 1026.1986 × 1033.5904 × 1028.2637 × 1022.5106 × 1021.4183 × 1031.2759 × 1033.6559 × 102
C16Ave7.3627 × 1034.5647 × 1037.2881 × 1033.3705 × 1036.4137 × 1033.4096 × 1035.0775 × 1034.4896 × 103
Std7.1224 × 1033.0930 × 1036.9245 × 1032.2489 × 1033.8862 × 1031.6588 × 1033.2278 × 1032.6157 × 103
C17Ave9.2979 × 1041.0940 × 1054.7520 × 1041.0834 × 1055.0553 × 1041.1181 × 1057.8437 × 1048.5337 × 104
Std4.5633 × 1043.8472 × 1041.6387 × 1042.7706 × 1041.6759 × 1043.6410 × 1042.1293 × 1043.0722 × 104
C18Ave1.7265 × 1048.2445 × 1032.0264 × 1041.5480 × 1043.7698 × 1041.3146 × 1042.5454 × 1042.2163 × 104
Std1.7684 × 1041.0789 × 1041.7719 × 1041.7940 × 1042.6475 × 1041.5760 × 1042.1266 × 1041.7829 × 104
C19Ave2.7377 × 1032.7541 × 1032.4082 × 1032.3599 × 1032.3419 × 1032.3971 × 1032.2661 × 1032.2469 × 103
Std2.7274 × 1023.4635 × 1022.7271 × 1021.9647 × 1022.8974 × 1022.3299 × 1021.4353 × 1021.6479 × 102
C20Ave2.6466 × 1032.6348 × 1032.2183 × 1032.3056 × 1032.2987 × 1032.3848 × 1032.2597 × 1032.3231 × 103
Std4.8847 × 1023.3976 × 1021.3459 × 1021.6827 × 1021.3206 × 1021.5305 × 1021.6844 × 1021.5802 × 102
C21Ave2.3416 × 1032.3513 × 1032.2794 × 1032.2876 × 1032.2796 × 1032.2742 × 1032.2998 × 1032.2752 × 103
Std2.7963 × 1014.3453 × 1016.62387.55337.81336.57091.1813 × 1016.6923
C22Ave3.0030 × 1033.4859 × 1032.4293 × 1032.6507 × 1032.4026 × 1032.6507 × 1032.4386 × 1032.4422 × 103
Std5.7231 × 1023.9136 × 1028.0678 × 1011.1595 × 1028.2796 × 10−11.1331 × 1029.8880 × 1011.1063 × 102
C23Ave2.8273 × 1033.2852 × 1032.5014 × 1032.6288 × 1032.5012 × 1032.6432 × 1032.5352 × 1032.5146 × 103
Std4.6000 × 1025.5984 × 1026.0767 × 10−12.8852 × 1015.1043 × 10−16.9075 × 1019.7281 × 1016.2048 × 101
C24Ave2.8767 × 1032.8928 × 1032.8249 × 1032.8460 × 1032.8281 × 1032.8510 × 1032.8305 × 1032.8249 × 103
Std2.2767 × 1013.8269 × 1017.68301.3856 × 1018.92611.5720 × 1019.37984.6406
C25Ave3.3543 × 1033.3604 × 1033.3580 × 1033.3287 × 1033.3595 × 1033.3276 × 1033.3310 × 1033.3352 × 103
Std1.8431 × 1012.2434 × 1011.8335 × 1012.07721.9466 × 1019.02044.41041.3340 × 101
C26Ave3.1376 × 1033.1378 × 1033.1446 × 1033.1253 × 1033.1450 × 1033.1339 × 1033.1338 × 1033.1284 × 103
Std2.1172 × 1011.8583 × 1011.8694 × 1011.8782 × 1012.5270 × 1012.5982 × 1012.7872 × 1011.9031 × 101
C27Ave2.9989 × 1033.0790 × 1032.9391 × 1032.7340 × 1032.8915 × 1032.7611 × 1032.7663 × 1032.7338 × 103
Std1.3136 × 1028.9258 × 1011.5966 × 1025.3534 × 1011.7577 × 1021.3843 × 1021.1840 × 1024.6613 × 101
C28Ave3.8901 × 1043.9198 × 1042.7089 × 1042.3630 × 1042.8494 × 1043.7390 × 1042.1817 × 1041.7619 × 104
Std3.8092 × 1043.4223 × 1042.1814 × 1042.2986 × 1041.6735 × 1049.4507 × 1041.6813 × 1041.5936 × 104
C29Ave7.7663 × 1044.4846 × 1054.6742 × 1056.5721 × 1041.0671 × 1054.1435 × 1042.5975 × 1044.2697 × 104
Std7.1995 × 1047.1173 × 1056.3771 × 1055.3622 × 1041.9750 × 1051.8536 × 1041.1253 × 1046.2395 × 104
Bold indicates the best result among all algorithms.
Table 6. Friedman test results of different strategies.
Algorithm | Overall Rank | Average Rank | +/−/=
NGO | 7 | 6.034 | 26/3/0
C_NGO | 8 | 7.276 | 28/1/0
WS_NGO | 2 | 3.449 | 15/14/0
WSC_NGO | 6 | 4.759 | 22/7/0
C_WS_NGO | 3 | 3.552 | 17/12/0
C_WSC_NGO | 5 | 4.552 | 23/6/0
WS_WSC_NGO | 4 | 3.621 | 20/9/0
MSINGO | 1 | 2.793 | ~
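Table 6 ranks the ablation variants by their Friedman average rank over the test functions. As a minimal sketch of how such ranks can be computed, the snippet below ranks each algorithm within every function and then averages the ranks. The three data rows are mean results transcribed from the C1–C3 rows of Table 5 for four of the eight variants, so the printed numbers are only a toy subset for illustration, not the full Table 6 computation.

```python
import numpy as np
from scipy.stats import rankdata, friedmanchisquare

# Mean results (rows: functions C1-C3 from Table 5; columns: four of the eight variants).
algorithms = ["NGO", "C_NGO", "WS_NGO", "MSINGO"]
results = np.array([
    [1.8025e7, 1.4315e8, 5.5679e3, 1.0583e4],   # C1
    [9.5971e2, 2.2737e3, 2.0586e2, 2.0902e2],   # C2
    [4.2824e2, 4.7716e2, 3.5739e2, 3.5318e2],   # C3
])

ranks = np.apply_along_axis(rankdata, 1, results)   # rank algorithms within each function (1 = best)
average_rank = ranks.mean(axis=0)                   # Friedman average rank per algorithm
stat, p_value = friedmanchisquare(*results.T)       # omnibus test; with only 3 functions this is indicative only

for name, r in zip(algorithms, average_rank):
    print(f"{name}: average rank = {r:.3f}")
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")
```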
Table 7. Comparison results on unimodal functions (C1–C2, dimension = 30).
DESSASCAMFOWOADBO
C1Avg7.1157 × 1091.8983 × 10102.1073 × 10102.0367 × 10101.6594 × 10106.5809 × 109
Std1.7525 × 1094.9407 × 1094.1159 × 1091.5291 × 10103.1904 × 1094.6532 × 109
Rank4810973
C2Avg5.2832 × 1045.5643 × 1044.5035 × 1046.5401 × 1045.2580 × 1045.0283 × 104
Std9.5379 × 1031.3621 × 1049.9155 × 1032.8648 × 1049.9181 × 1032.0199 × 104
Rank7631054
POASOABWOGJONGOMSINGO
C1Avg6.2347 × 10105.4888 × 10107.5221 × 1091.3628 × 10101.8025 × 1071.0583 × 104
Std5.2900 × 1095.5611 × 1091.8633 × 1092.3413 × 1091.9969 × 1077.6416 × 103
Rank12115621
C2Avg1.2412 × 1051.1839 × 1055.5750 × 1045.6468 × 1049.5971 × 1022.0902 × 102
Std3.7176 × 1042.2784 × 1041.1262 × 1047.0274 × 1034.8997 × 1025.5120
Rank12118921
Bold indicates the best result among all algorithms.
Table 8. Results for p-values from the Wilcoxon signed-rank test on unimodal functions (C1–C2, dimension = 30).
Ours vs. | DE | SSA | SCA | MFO | WOA | DBO | POA | SOA | BWO | GJO | NGO
C1 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11
C2 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11
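The pairwise p-values in Tables 8, 10 and 13 come from the Wilcoxon signed-rank test applied to the paired per-run results of MSINGO and each competitor. The sketch below illustrates that comparison for a single function; the two arrays are synthetic placeholders standing in for the per-run best fitness values, since the raw run data are not reproduced here.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2024)
runs = 30  # number of independent runs per algorithm and function

# Placeholder per-run best fitness values on one benchmark function.
msingo_runs = 1.06e4 + 7.6e3 * rng.random(runs)
competitor_runs = 1.80e7 + 2.0e7 * rng.random(runs)

stat, p_value = wilcoxon(msingo_runs, competitor_runs)  # paired, two-sided test
print(f"W = {stat:.1f}, p = {p_value:.2e}")
# A p-value below 0.05 indicates a statistically significant difference between the two algorithms.
```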
Table 9. Comparison results on multimodal functions (C3–C9, dimension = 30).
DESSASCAMFOWOADBO
C3Avg1.0418 × 1032.9128 × 1032.9573 × 1034.2844 × 1031.7357 × 1031.3672 × 103
Std1.8976 × 1021.1603 × 1039.5803 × 1023.0445 × 1036.9362 × 1029.3724 × 102
Rank3891076
C4Avg6.9329 × 1032.7862 × 1042.5376 × 1042.7085 × 1041.7964 × 1041.9735 × 104
Std1.2039 × 1037.4911 × 1035.2626 × 1031.4029 × 1046.8852 × 1031.1157 × 104
Rank3108967
C5Avg5.0003 × 1025.0001 × 1025.0003 × 1025.0001 × 1025.0001 × 1025.0001 × 102
Std4.9406 × 10−35.5137 × 10−39.6118 × 10−35.6648 × 10−34.7371 × 10−37.4612 × 10−3
Rank9511648
C6Avg4.5943 × 1041.5106 × 1045.3867 × 1041.0947 × 1043.3083 × 1043.9703 × 104
Std9.3287 × 1034.8363 × 1031.6393 × 1047.4864 × 1038.3105 × 1031.5718 × 104
Rank9411367
C7Avg7.0337 × 1027.0074 × 1027.0281 × 1027.0071 × 1027.0108 × 1027.0221 × 102
Std8.1562 × 10−16.9344 × 10−17.3905 × 10−16.6394 × 10−14.6689 × 10−18.8467 × 10−1
Rank11410368
C8Avg8.1667 × 1028.1785 × 1028.1640 × 1028.2556 × 1028.1415 × 1028.1363 × 102
Std4.21454.25744.34968.16396.10107.5496
Rank8971065
C9Avg8.5354 × 1036.0192 × 1038.6150 × 1035.6357 × 1036.9107 × 1037.6447 × 103
Std3.5466 × 1026.8392 × 1022.4630 × 1028.1562 × 1025.5922 × 1026.0268 × 102
Rank10411357
POASOABWOGJONGOMSINGO
C3Avg1.7796 × 1041.5998 × 1041.1487 × 1031.3002 × 1034.2824 × 1023.5319 × 102
Std2.9417 × 1034.9858 × 1032.6341 × 1023.9099 × 1024.6788 × 1012.6328 × 101
Rank12114521
C4Avg8.7121 × 1047.2960 × 1049.1229 × 1031.1006 × 1041.0209 × 1036.2263 × 102
Std1.1259 × 1041.1609 × 1042.5040 × 1033.8440 × 1032.1185 × 1023.0987 × 101
Rank12114521
C5Avg5.0004 × 1025.0003 × 1025.0001 × 1025.0001 × 1025.0000 × 1025.0000 × 102
Std1.1975 × 10−28.2257 × 10−36.4051 × 10−33.9437 × 10−31.1436 × 10−34.1208 × 10−4
Rank12107321
C6Avg7.5547 × 1045.3641 × 1044.3028 × 1042.5471 × 1043.4794 × 1038.4491 × 103
Std1.5711 × 1041.5298 × 1048.0718 × 1037.4351 × 1031.3695 × 1033.9353 × 103
Rank12108512
C7Avg7.0428 × 1027.0256 × 1027.0171 × 1027.0100 × 1027.0013 × 1027.0015 × 102
Std9.5099 × 10−18.0624 × 10−17.5813 × 10−13.4868 × 10−16.0608 × 10−26.1209 × 10−2
Rank1297512
C8Avg8.3070 × 1028.3700 × 1028.0984 × 1028.0862 × 1028.0554 × 1028.0156 × 102
Std6.75018.51593.12232.13071.95191.0497
Rank11124321
C9Avg8.1098 × 1039.1949 × 1037.9685 × 1037.3134 × 1035.1677 × 1035.0619 × 103
Std3.7913 × 1026.8997 × 1024.3993 × 1024.2661 × 1024.5304 × 1025.4751 × 102
Rank9128621
Bold indicates the best result among all algorithms.
Table 10. Results for p-values from the Wilcoxon signed-rank test on multimodal functions (C3–C9, dimension = 30).
Ours vs. | DE | SSA | SCA | MFO | WOA | DBO | POA | SOA | BWO | GJO | NGO
C3 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11
C4 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11
C5 | 2.87 × 10−11 | 3.66 × 10−9 | 2.87 × 10−11 | 2.49 × 10−8 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 6.80 × 10−8
C6 | 2.87 × 10−11 | 1.93 × 10−6 | 2.87 × 10−11 | 3.75 × 10−1 | 2.87 × 10−11 | 1.39 × 10−10 | 2.87 × 10−11 | 3.18 × 10−11 | 2.87 × 10−11 | 3.88 × 10−11 | 3.39 × 10−7
C7 | 2.87 × 10−11 | 8.51 × 10−7 | 2.87 × 10−11 | 3.21 × 10−6 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.16 × 10−11 | 2.87 × 10−11 | 2.04 × 10−1
C8 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 5.23 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 1.06 × 10−8
C9 | 2.87 × 10−11 | 7.32 × 10−7 | 2.87 × 10−11 | 1.56 × 10−3 | 4.29 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.18 × 10−11 | 4.25 × 10−1
Table 11. Comparison results on hybrid functions (C10–C19, dimension = 30).
DESSASCAMFOWOADBO
C10Ave2.3824 × 1052.5356 × 1053.8235 × 1067.4495 × 1053.0101 × 1051.9390 × 106
Std7.5769 × 1042.5043 × 1053.1173 × 1069.0509 × 1054.3615 × 1051.8982 × 106
Rank3410859
C11Ave4.1393 × 1085.3388 × 1083.1337 × 1091.9391 × 1091.0405 × 1094.3244 × 108
Std1.2903 × 1085.4696 × 1089.8563 × 1081.7738 × 1098.5502 × 1088.5984 × 108
Rank3511974
C12Ave1.3180 × 1081.0432 × 1092.2508 × 1091.2072 × 1098.6726 × 1081.3450 × 109
Std9.0952 × 1071.2250 × 1098.3328 × 1081.4706 × 1094.8556 × 1082.6294 × 109
Rank3610759
C13Ave6.0983 × 1053.7275 × 1062.7138 × 1061.4532 × 1061.9689 × 1061.3957 × 106
Std3.0735 × 1053.8287 × 1061.4229 × 1061.9387 × 1061.2815 × 1061.3469 × 106
Rank3119675
C14Ave5.2572 × 1073.1892 × 1087.2280 × 1081.2978 × 1082.0984 × 1081.4844 × 108
Std4.1440 × 1074.0858 × 1083.4740 × 1083.3414 × 1082.0304 × 1085.0534 × 108
Rank3910576
C15Ave2.4606 × 1042.4629 × 1072.7235 × 1078.1481 × 1071.0801 × 1079.4404 × 107
Std1.0443 × 1048.9424 × 1073.0023 × 1071.3306 × 1081.2275 × 1071.6710 × 108
Rank3679510
C16Ave1.2704 × 1072.9113 × 10125.8732 × 1091.3321 × 1084.6590 × 1072.9759 × 108
Std2.7267 × 1071.0842 × 10131.0318 × 10107.1205 × 1081.1157 × 1081.4980 × 109
Rank3119658
C17Ave2.5583 × 1051.2008 × 1058.7192 × 1058.7138 × 1054.0092 × 1052.7430 × 106
Std7.7286 × 1044.9587 × 1046.1177 × 1052.1572 × 1064.0770 × 1055.9845 × 106
Rank4387612
C18Ave2.6120 × 1071.1889 × 10111.0677 × 10116.0959 × 10102.1192 × 10103.3961 × 1011
Std1.7854 × 1075.5510 × 10112.9388 × 10115.2927 × 10102.3646 × 10106.6162 × 1011
Rank3876510
C19Ave4.4672 × 1031.1472 × 1045.8916 × 1034.4471 × 1035.2497 × 1034.9749 × 103
Std5.6679 × 1022.8782 × 1031.0770 × 1032.9891 × 1039.9786 × 1021.3788 × 103
Rank4118397
POASOABWOGJONGOMSINGO
C10Ave1.2174 × 1074.5419 × 1076.5171 × 1054.7743 × 1053.5772 × 1041.6519 × 104
Std2.0078 × 1074.9249 × 1073.6771 × 1051.2237 × 1062.3458 × 1048.7681 × 103
Rank11127621
C11Ave1.4374 × 10108.3053 × 1096.8625 × 1081.0873 × 1091.9569 × 1058.1669 × 104
Std4.3551 × 1093.2900 × 1092.3423 × 1088.3723 × 1081.9607 × 1055.8853 × 104
Rank12106821
C12Ave1.6366 × 10101.0709 × 10102.4212 × 1081.2453 × 1092.5452 × 1047.6979 × 103
Std2.5142 × 1095.2024 × 1099.0134 × 1078.0927 × 1082.6804 × 1045.5292 × 103
Rank12114821
C13Ave1.0717 × 1062.0740 × 1072.4878 × 1063.6762 × 1062.5841 × 1051.5022 × 105
Std7.0693 × 1052.4172 × 1071.8788 × 1061.8478 × 1061.4319 × 1054.4144 × 104
Rank41281021
C14Ave7.1133 × 1095.8169 × 1095.6701 × 1072.8762 × 1082.6773 × 1041.3331 × 104
Std2.3403 × 1093.2930 × 1092.7721 × 1072.2670 × 1081.0410 × 1047.8585 × 103
Rank12114821
C15Ave1.9921 × 1091.7567 × 1091.7753 × 1063.3905 × 1071.9647 × 1031.9586 × 103
Std2.2691 × 1092.2460 × 1091.3368 × 1062.2802 × 1073.4647 × 1023.6559 × 102
Rank12114821
C16Ave3.0927 × 10111.3140 × 10153.7858 × 1071.7921 × 1087.3627 × 1034.4896 × 103
Std8.1938 × 10113.2889 × 10156.9027 × 1074.1231 × 1087.1224 × 1032.6157 × 103
Rank10124721
C17Ave3.8567 × 1051.2542 × 1061.9249 × 1061.1930 × 1069.2979 × 1048.5337 × 104
Std2.8337 × 1056.3729 × 1052.4965 × 1061.2711 × 1064.5633 × 1043.0722 × 104
Rank51011921
C18Ave3.2735 × 10126.7178 × 10142.1412 × 1091.6957 × 10111.7265 × 1042.2163 × 104
Std1.3036 × 10131.5232 × 10151.1168 × 1093.3946 × 10111.7684 × 1041.7829 × 104
Rank11124912
C19Ave1.3295 × 1041.0568 × 1044.4772 × 1034.9503 × 1032.7377 × 1032.2469 × 103
Std3.0948 × 1032.6512 × 1036.9120 × 1027.2511 × 1022.7274 × 1021.6479 × 102
Rank12105621
Bold indicates the best result among all algorithms.
Table 12. Comparison results on composition functions (C20–C29, dimension = 30).
DESSASCAMFOWOADBO
C20Ave6.0889 × 1032.2039 × 1041.5167 × 1041.6832 × 1041.0976 × 1041.0256 × 104
Std2.3964 × 1039.6740 × 1036.1198 × 1031.3998 × 1044.6101 × 1036.2115 × 103
Rank3108975
C21Ave2.3752 × 1034.2211 × 1032.6210 × 1032.5464 × 1032.4818 × 1032.6688 × 103
Std1.4146 × 1016.4724 × 1028.0512 × 1011.0171 × 1025.4961 × 1011.6943 × 102
Rank3108769
C22Ave1.1479 × 1043.3571 × 1042.6421 × 1042.1697 × 1041.8355 × 1041.8080 × 104
Std8.7594 × 1028.7025 × 1033.8245 × 1038.7977 × 1033.0176 × 1035.0424 × 103
Rank3109876
C23Ave8.0055 × 1032.1791 × 1041.7674 × 1041.2769 × 1041.3186 × 1041.0497 × 104
Std5.3995 × 1024.8587 × 1032.6056 × 1033.4069 × 1032.7605 × 1032.5129 × 103
Rank3109784
C24Ave3.1773 × 1033.9174 × 1033.8390 × 1033.9923 × 1033.4682 × 1033.2048 × 103
Std1.1303 × 1023.3045 × 1022.4359 × 1029.2974 × 1022.3489 × 1023.8471 × 102
Rank4981075
C25Ave3.3657 × 1036.1522 × 1034.4970 × 1033.4268 × 1033.6861 × 1033.8545 × 103
Std1.4826 × 1011.3863 × 1034.1089 × 1024.1888 × 1011.2616 × 1025.3122 × 102
Rank3109568
C26Ave3.1445 × 1034.1549 × 1033.4964 × 1033.1968 × 1033.3498 × 1033.2549 × 103
Std1.2613 × 1015.5968 × 1027.2717 × 1014.2376 × 1015.9534 × 1019.5293 × 101
Rank3109486
C27Ave3.3147 × 1034.1378 × 1033.9593 × 1033.8085 × 1033.5068 × 1034.1153 × 103
Std3.6930 × 1015.1463 × 1022.1656 × 1021.6789 × 1021.4454 × 1024.6185 × 102
Rank3108759
C28Ave3.9188 × 1084.1002 × 1098.4025 × 1091.8019 × 10101.0726 × 1098.3308 × 108
Std4.2525 × 1086.3386 × 1099.6860 × 1099.5223 × 10107.2428 × 1081.4082 × 109
Rank3891065
C29Ave3.1419 × 1081.2640 × 10106.0558 × 1098.6999 × 1092.9297 × 1098.0939 × 1010
Std1.6203 × 1084.0154 × 10103.2377 × 1091.1304 × 10101.9782 × 1092.8023 × 1011
Rank3978510
POASOABWOGJONGOMSINGO
C20Ave3.0832 × 1045.1253 × 1046.4274 × 1031.0731 × 1042.6466 × 1032.3231 × 103
Std1.2701 × 1048.9562 × 1032.5167 × 1033.1797 × 1034.8847 × 1021.5802 × 102
Rank11124621
C21Ave4.9915 × 1035.1938 × 1032.4040 × 1032.4158 × 1032.3416 × 1032.2752 × 103
Std1.3506 × 1039.4591 × 1022.6864 × 1012.6154 × 1012.7963 × 1016.6923
Rank11124521
C22Ave5.7229 × 1046.3209 × 1041.3620 × 1041.7769 × 1043.0030 × 1032.4422 × 103
Std9.2948 × 1036.3493 × 1031.5893 × 1031.9906 × 1035.7231 × 1021.1063 × 102
Rank11124521
C23Ave3.2591 × 1043.3602 × 1041.0588 × 1041.2191 × 1042.8273 × 1032.5146 × 103
Std3.7133 × 1032.8012 × 1031.7990 × 1031.2763 × 1034.6000 × 1026.2048 × 101
Rank11125621
C24Ave7.3834 × 1037.5119 × 1033.1730 × 1033.3755 × 1032.8767 × 1032.8249 × 103
Std1.0436 × 1039.2234 × 1027.6243 × 1018.6234 × 1012.2767 × 1014.6406
Rank11123621
C25Ave9.1076 × 1031.2374 × 1043.4414 × 1033.7395 × 1033.3543 × 1033.3352 × 103
Std2.6356 × 1036.2058 × 1035.6777 × 1011.1559 × 1021.8431 × 1011.3340 × 101
Rank11124721
C26Ave3.5546 × 1034.3146 × 1033.1969 × 1033.3483 × 1033.1376 × 1033.1284 × 103
Std1.6801 × 1023.8833 × 1021.9053 × 1014.3705 × 1012.1172 × 1011.9031 × 101
Rank11125721
C27Ave7.8579 × 1036.3553 × 1033.3934 × 1033.5350 × 1032.9989 × 1032.7338 × 103
Std1.4815 × 1031.5897 × 1036.9237 × 1011.2377 × 1021.3136 × 1024.6613 × 101
Rank12114621
C28Ave4.0327 × 10117.0928 × 10124.2329 × 1082.8629 × 1093.8901 × 1041.7619 × 104
Std7.4454 × 10111.6879 × 10134.7836 × 1081.5993 × 1093.8092 × 1041.5936 × 104
Rank11124721
C29Ave1.6043 × 10111.5796 × 10124.9480 × 1083.4954 × 1097.7663 × 1044.2697 × 104
Std2.5786 × 10113.7520 × 10123.0719 × 1081.0517 × 1097.1995 × 1046.2395 × 104
Rank11124621
Bold indicates the best result among all algorithms.
Table 13. The p-values from the Wilcoxon signed-rank test on hybrid and composition functions (C10–C29, dimension = 30).
Ours vs. | DE | SSA | SCA | MFO | WOA | DBO | POA | SOA | BWO | GJO | NGO
C10 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.51 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 4.58 × 10−4
C11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 7.13 × 10−4
C12 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.98 × 10−6
C13 | 3.51 × 10−11 | 3.51 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 1.54 × 10−4
C14 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.58 × 10−6
C15 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 8.59 × 10−1
C16 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 4.51 × 10−1
C17 | 3.51 × 10−11 | 4.97 × 10−11 | 2.87 × 10−11 | 2.13 × 10−9 | 3.13 × 10−7 | 7.03 × 10−11 | 3.88 × 10−11 | 5.23 × 10−11 | 3.88 × 10−11 | 2.05 × 10−11 | 6.68 × 10−1
C18 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.08 × 10−1
C19 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 7.44 × 10−9 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 1.94 × 10−9
C20 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.19 × 10−2
C21 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.18 × 10−11
C22 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.06 × 10−9
C23 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.74 × 10−10
C24 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11
C25 | 8.86 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 1.67 × 10−6
C26 | 1.07 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 6.37 × 10−11 | 2.87 × 10−11 | 7.13 × 10−2
C27 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 1.54 × 10−10
C28 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 3.26 × 10−5
C29 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 2.87 × 10−11 | 5.79 × 10−5
Table 14. Comparison results of 12 algorithms on CEC2017 (dimension = 100).
DESSASCAMFOWOADBO
C1Ave1.8714 × 10112.2451 × 10112.1794 × 10112.0379 × 10111.5912 × 10118.9114 × 1010
Std3.3258 × 10101.9344 × 10101.3744 × 10105.5019 × 10101.7294 × 10101.9767 × 1010
Rank7109863
C2Ave5.8117 × 1053.4224 × 1052.9727 × 1054.6114 × 1052.3816 × 1053.6726 × 105
Std5.0427 × 1041.5552 × 1042.4056 × 1041.0256 × 1051.8431 × 1049.1226 × 104
Rank12751138
C3Ave3.1431 × 1048.7861 × 1045.8632 × 1045.0840 × 1042.7590 × 1042.2050 × 104
Std6.0826 × 1031.6582 × 1049.8769 × 1032.8771 × 1046.0483 × 1031.2966 × 104
Rank7109854
C4Ave1.7912 × 1052.8037 × 1052.3375 × 1052.3947 × 1051.6200 × 1059.1679 × 104
Std2.0952 × 1042.3223 × 1041.6550 × 1046.1382 × 1041.8946 × 1042.0503 × 104
Rank7108963
C5Ave5.0013 × 1025.0004 × 1025.0008 × 1025.0002 × 1025.0003 × 1025.0007 × 102
Std1.6821 × 10−21.1768 × 10−21.1816 × 10−27.0367 × 10−37.7224 × 10−32.1163 × 10−2
Rank127113510
C6Ave1.8110 × 1055.2966 × 1041.2772 × 1054.9471 × 1047.0580 × 1041.0061 × 105
Std2.1493 × 1041.1622 × 1041.5073 × 1041.5183 × 1041.1143 × 1041.5069 × 104
Rank12310268
C7Ave7.1334 × 1027.0367 × 1027.0846 × 1027.0264 × 1027.0420 × 1027.0682 × 102
Std1.60471.11081.25509.0368 × 10−18.9731 × 10−11.5332
Rank12410368
C8Ave1.0631 × 1031.0415 × 1031.0859 × 1031.0263 × 1039.8904 × 1021.0028 × 103
Std3.5307 × 1013.3135 × 1013.9201 × 1013.6753 × 1011.8707 × 1013.0361 × 101
Rank9811756
C9Ave3.3469 × 1042.5666 × 1043.3485 × 1042.3548 × 1042.9961 × 1043.1757 × 104
Std6.3076 × 1021.3784 × 1036.2937 × 1021.3766 × 1031.3469 × 1031.2608 × 103
Rank11312267
C10Ave1.0950 × 1085.2412 × 1099.6620 × 1081.7408 × 1093.9623 × 1081.0114 × 109
Std4.6414 × 1072.6803 × 1094.1832 × 1082.2862 × 1093.6290 × 1081.7108 × 109
Rank3107968
C11Ave3.9643 × 10101.6110 × 10119.8334 × 10106.6192 × 10105.7521 × 10104.5170 × 1010
Std6.0463 × 1091.6734 × 10109.9422 × 1092.3068 × 10101.5850 × 10101.4588 × 1010
Rank3109874
C12Ave6.3041 × 10102.5220 × 10111.6476 × 10111.2474 × 10118.8210 × 10105.2968 × 1010
Std1.3926 × 10103.9817 × 10102.5106 × 10107.7464 × 10102.5571 × 10102.5852 × 1010
Rank4109873
C13Ave9.0632 × 1071.5284 × 1081.0641 × 1083.6619 × 1073.2953 × 1073.9380 × 107
Std1.8954 × 1079.4472 × 1073.5987 × 1072.9756 × 1071.3111 × 1072.9826 × 107
Rank8109536
C14Ave1.7157 × 10104.1575 × 10106.4363 × 1092.1118 × 10101.3052 × 10105.3754 × 109
Std4.6542 × 1096.4363 × 1095.5286 × 1091.5106 × 10103.5989 × 1097.7236 × 109
Rank7109863
C15Ave2.7372 × 1089.3648 × 1093.5003 × 1091.3216 × 1091.6270 × 1095.6257 × 108
Std1.6406 × 1083.1402 × 1091.2131 × 1091.5564 × 1091.1732 × 1091.1270 × 109
Rank3109784
C16Ave1.6080 × 10132.1785 × 10151.5784 × 10143.4409 × 10121.7698 × 10138.0268 × 1012
Std1.4585 × 10132.1762 × 10151.4949 × 10146.3994 × 10122.7751 × 10132.4554 × 1013
Rank7109486
C17Ave4.7197 × 1073.5049 × 1087.8418 × 1072.3595 × 1072.5945 × 1073.7904 × 107
Std1.4321 × 1074.1373 × 1083.1270 × 1071.4758 × 1071.5739 × 1073.1630 × 107
Rank7119346
C18Ave1.2987 × 10111.5637 × 10155.4805 × 10133.4544 × 10132.3872 × 10134.9242 × 1013
Std4.8343 × 10101.1323 × 10155.6232 × 10136.7410 × 10134.6587 × 10131.7240 × 1014
Rank3109768
C19Ave1.4185 × 1044.0968 × 1042.0195 × 1041.8634 × 1041.8613 × 1041.5225 × 104
Std1.2408 × 1034.6721 × 1032.9459 × 1039.0316 × 1033.1425 × 1033.2202 × 103
Rank4129876
C20Ave1.8601 × 1042.3097 × 1042.1652 × 1051.9596 × 1051.5790 × 1051.0071 × 105
Std2.9660 × 1041.3891 × 1041.4003 × 1044.3676 × 1041.9050 × 1042.1569 × 104
Rank7109863
C21Ave4.2211 × 1032.4411 × 1041.4136 × 1041.5069 × 1047.1533 × 1031.2942 × 104
Std4.8456 × 1032.6404 × 1032.4611 × 1039.0638 × 1031.6881 × 1031.0667 × 104
Rank2108967
C22Ave6.2058 × 1041.2499 × 1051.2821 × 1058.1417 × 1041.1296 × 1057.5481 × 104
Std6.4505 × 1032.8138 × 1035.0529 × 1031.9281 × 1042.7483 × 1031.5720 × 104
Rank2910583
C23Ave7.9316 × 1041.6838 × 1051.6689 × 1051.1732 × 1051.4457 × 1059.0111 × 104
Std9.7544 × 1034.7708 × 1039.1002 × 1034.5696 × 1041.0299 × 1042.1932 × 104
Rank2109683
C24Ave1.9742 × 1042.8569 × 1042.5740 × 1042.3018 × 1041.3254 × 1049.1553 × 103
Std3.6475 × 1034.3503 × 1033.5689 × 1031.0784 × 1041.9038 × 1031.9815 × 103
Rank7109853
C25Ave7.8815 × 1031.9336 × 1058.1783 × 1049.6644 × 1033.6022 × 1041.7992 × 104
Std5.4080 × 1027.4500 × 1041.2994 × 1041.7413 × 1038.9004 × 1036.0362 × 103
Rank2109476
C26Ave4.1844 × 1031.0527 × 1047.4291 × 1034.3089 × 1035.7508 × 1034.7032 × 103
Std2.1289 × 1021.3710 × 1034.7861 × 1023.2853 × 1023.4251 × 1025.9115 × 102
Rank31110475
C27Ave7.8695 × 1032.1520 × 1041.5464 × 1049.5747 × 1038.9684 × 1038.2506 × 103
Std6.5597 × 1022.9496 × 1031.8744 × 1031.2426 × 1031.3524 × 1032.6792 × 103
Rank3109864
C28Ave4.2680 × 10121.8533 × 10151.2952 × 10141.2616 × 10159.7234 × 10134.9022 × 1013
Std4.7020 × 10121.9674 × 10151.1273 × 10143.3475 × 10151.2558 × 10141.5075 × 1014
Rank3108976
C29Ave4.5676 × 10129.1778 × 10148.3903 × 10139.0060 × 10146.5671 × 10134.3255 × 1013
Std3.7867 × 10128.7941 × 10148.0805 × 10131.6666 × 10156.8414 × 10131.3850 × 1014
Rank4108976
POASOABWOGJONGOMSINGO
C1Ave2.7197 × 10112.6571 × 10111.4584 × 10111.4130 × 10115.4890 × 10108.4928 × 108
Std6.9345 × 1091.1789 × 10101.3582 × 10101.1035 × 10109.7426 × 1091.6473 × 108
Rank12115421
C2Ave4.4141 × 1054.4752 × 1053.3394 × 1052.4770 × 1051.0009 × 1055.5465 × 104
Std9.2720 × 1049.7072 × 1042.5787 × 1042.0284 × 1041.0894 × 1048.4001 × 103
Rank9106421
C3Ave1.3761 × 1051.2197 × 1052.8413 × 1042.0899 × 1045.5971 × 1031.0925 × 103
Std1.3293 × 1041.7944 × 1045.0326 × 1033.4556 × 1031.3988 × 1031.0775 × 102
Rank12116321
C4Ave3.4659 × 1053.3982 × 1051.4924 × 1051.3076 × 1055.4008 × 1042.5471 × 103
Std1.2819 × 1041.7820 × 1041.4383 × 1041.4710 × 1048.0657 × 1032.2091 × 102
Rank12115421
C5Ave5.0007 × 1025.0007 × 1025.0004 × 1025.0003 × 1025.0001 × 1025.0001 × 102
Std1.6995 × 10−21.0127 × 10−27.9461 × 10−35.4425 × 10−35.8196 × 10−33.2887 × 10−3
Rank986421
C6Ave1.3280 × 1051.0459 × 1059.5042 × 1045.9436 × 1042.1376 × 1046.0034 × 104
Std1.0984 × 1041.1421 × 1048.8743 × 1031.1377 × 1043.1109 × 1031.0923 × 104
Rank1197415
C7Ave7.0894 × 1027.0714 × 1027.0615 × 1027.0381 × 1027.0154 × 1027.0214 × 102
Std1.29567.6338 × 10−18.9235 × 10−18.1803 × 10−12.9539 × 10−13.1753 × 10−1
Rank1197512
C8Ave1.0757 × 1031.1257 × 1039.8844 × 1029.5851 × 1028.8063 × 1028.3947 × 102
Std2.6452 × 1013.1883 × 1012.9761 × 1011.4760 × 1019.22428.9618
Rank10124321
C9Ave3.2439 × 1043.2912 × 1043.2839 × 1042.9635 × 1042.3546 × 1042.6675 × 104
Std4.4694 × 1029.2632 × 1028.8386 × 1021.6070 × 1038.4292 × 1029.3004 × 102
Rank8109514
C10Ave1.3643 × 10101.2385 × 10101.1090 × 1083.8427 × 1083.2330 × 1051.4126 × 105
Std4.1704 × 1094.2609 × 1096.2048 × 1071.9565 × 1086.8403 × 1044.7198 × 104
Rank12114521
C11Ave2.1603 × 10112.0048 × 10114.9144 × 10105.2750 × 10103.5261 × 1099.0874 × 107
Std1.8518 × 10102.2306 × 10107.7498 × 1099.9203 × 1091.3724 × 1092.6935 × 107
Rank12115621
C12Ave3.9166 × 10113.6647 × 10117.4745 × 10106.9642 × 10105.2550 × 1092.2443 × 107
Std3.8587 × 10104.1229 × 10101.4011 × 10101.7967 × 10103.3866 × 1091.2996 × 107
Rank12116521
C13Ave1.8003 × 1085.5452 × 1084.6331 × 1073.3962 × 1073.1420 × 1062.2309 × 106
Std7.2055 × 1072.7697 × 1081.8832 × 1071.8692 × 1071.3169 × 1061.2777 × 106
Rank11127421
C14Ave6.3373 × 10105.6712 × 10109.3942 × 1099.8012 × 1091.5295 × 1085.9354 × 105
Std4.9593 × 1097.4869 × 1092.4456 × 1092.3664 × 1091.0588 × 1088.2288 × 105
Rank11124521
C15Ave1.9384 × 10101.4827 × 10105.9723 × 1086.6147 × 1082.0594 × 1051.0052 × 104
Std4.0925 × 1094.1885 × 1092.9038 × 1085.6126 × 1082.6539 × 1054.0463 × 103
Rank12115621
C16Ave1.0971 × 10161.2848 × 10164.4153 × 10122.4385 × 10123.4450 × 1051.8396 × 104
Std6.4344 × 10158.5564 × 10155.6616 × 10123.5274 × 10121.0012 × 1061.2462 × 104
Rank11125321
C17Ave8.7034 × 1079.1846 × 1085.4226 × 1073.1966 × 1072.8924 × 1061.7515 × 106
Std4.4478 × 1078.0554 × 1082.5658 × 1072.5626 × 1071.6674 × 1068.3470 × 105
Rank10128521
C18Ave5.3588 × 10155.4466 × 10159.2231 × 10111.8409 × 10121.6219 × 1098.0681 × 104
Std2.3777 × 10153.3979 × 10151.0244 × 10123.1529 × 10121.6709 × 1093.9072 × 104
Rank11124521
C19Ave3.9360 × 1043.5902 × 1041.4653 × 1041.2227 × 1041.0594 × 1043.6227 × 103
Std2.7796 × 1034.5031 × 1032.4324 × 1032.0835 × 1032.4325 × 1036.7418 × 102
Rank11125321
C20Ave2.6774 × 1052.5812 × 1051.5194 × 1051.3359 × 1056.6361 × 1044.3931 × 103
Std6.9321 × 1031.0314 × 1041.1221 × 1041.0917 × 1041.5980 × 1042.6209 × 102
Rank12115421
C21Ave3.0604 × 1042.8400 × 1044.9006 × 1035.3387 × 1034.2665 × 1032.4277 × 103
Std1.8045 × 1031.4014 × 1034.8344 × 1026.1843 × 1021.2618 × 1031.9240 × 101
Rank12114531
C22Ave1.2974 × 1051.2954 × 1051.0658 × 1057.8477 × 1048.1696 × 1045.9174 × 103
Std1.1120 × 1032.1654 × 1032.2259 × 1037.6809 × 1038.4532 × 1031.0142 × 103
Rank12117461
C23Ave1.8700 × 1051.8323 × 1051.3544 × 1051.0518 × 1059.2968 × 1041.0313 × 104
Std2.9026 × 1036.8315 × 1034.7580 × 1031.1031 × 1041.1393 × 1042.3948 × 103
Rank12117541
C24Ave4.7859 × 1044.1915 × 1041.4354 × 1041.2213 × 1046.6933 × 1033.9888 × 103
Std4.0259 × 1035.9832 × 1031.6802 × 1031.2759 × 1036.2752 × 1021.0340 × 102
Rank12116421
C25Ave2.0646 × 1052.9442 × 1051.0520 × 1044.2624 × 1048.6799 × 1036.0679 × 103
Std6.2660 × 1049.8789 × 1042.7744 × 1036.6328 × 1032.3094 × 1036.1890 × 101
Rank11125831
C26Ave6.7638 × 1031.1428 × 1044.9680 × 1036.0710 × 1033.9788 × 1033.4510 × 103
Std5.5201 × 1021.2443 × 1033.2669 × 1023.3254 × 1022.1108 × 1026.4330 × 101
Rank9126821
C27Ave3.2399 × 1042.9115 × 1049.1847 × 1038.4986 × 1034.5246 × 1033.2770 × 103
Std2.8213 × 1032.5432 × 1031.0138 × 1038.4867 × 1022.6381 × 1024.9362 × 101
Rank12117521
C28Ave1.5165 × 10161.1634 × 10165.0930 × 10122.3691 × 10131.2282 × 1093.0941 × 106
Std1.0569 × 10161.0278 × 10167.2343 × 10128.2516 × 10131.4546 × 1093.4823 × 106
Rank12114521
C29Ave5.1701 × 10156.1968 × 10152.4814 × 10122.0838 × 10133.3664 × 1096.8307 × 106
Std3.6081 × 10155.4952 × 10152.9205 × 10122.7394 × 10132.0573 × 1094.7809 × 106
Rank11123521
Bold indicates the best result among all algorithms.
Table 15. Memory usage of all algorithms.
Algorithms | C1 | C4 | C15 | C27 | Average Memory Occupation | Rank
DE | 2.992 MB | 3.277 MB | 3.812 MB | 3.176 MB | 3.314 MB | 7
SSA | 2.949 MB | 2.906 MB | 3.738 MB | 3.484 MB | 3.269 MB | 4
SCA | 3.242 MB | 3.145 MB | 3.559 MB | 3.614 MB | 3.390 MB | 11
MFO | 2.918 MB | 3.211 MB | 3.469 MB | 3.079 MB | 3.169 MB | 1
WOA | 2.992 MB | 2.898 MB | 3.758 MB | 3.699 MB | 3.337 MB | 10
DBO | 3.020 MB | 3.156 MB | 3.492 MB | 3.367 MB | 3.259 MB | 2
POA | 2.891 MB | 2.895 MB | 3.725 MB | 3.559 MB | 3.267 MB | 3
SOA | 2.848 MB | 3.125 MB | 3.516 MB | 3.738 MB | 3.307 MB | 6
BWO | 3.270 MB | 2.965 MB | 3.492 MB | 3.594 MB | 3.330 MB | 9
GJO | 3.098 MB | 3.227 MB | 3.539 MB | 3.852 MB | 3.429 MB | 12
NGO | 2.984 MB | 2.891 MB | 3.266 MB | 3.988 MB | 3.282 MB | 5
MSINGO | 2.988 MB | 3.172 MB | 3.441 MB | 3.707 MB | 3.327 MB | 8
Table 16. Optimization results for T/CSD.
Algorithm | d | D | p | Optimal Value | Rank
MSINGO | 0.05166107 | 0.35617546 | 11.36666667 | 1.2673687 × 10−2 | 1
NGO | 0.05187427 | 0.36129583 | 11.06666667 | 1.2674502 × 10−2 | 2
DE | 0.05181379 | 0.35957038 | 11.16666667 | 1.2684364 × 10−2 | 3
SSA | 0.05060498 | 0.32639958 | 13.86666667 | 1.3146121 × 10−2 | 7
SCA | 0.05077799 | 0.33174986 | 14.03333333 | 1.3273039 × 10−2 | 10
MFO | 0.05057397 | 0.32630009 | 13.86666667 | 1.3048161 × 10−2 | 6
WOA | 0.05220499 | 0.36931603 | 11.36666667 | 1.2988707 × 10−2 | 5
DBO | 0.05541281 | 0.4763829 | 10.3 | 1.4084970 × 10−2 | 11
POA | 0.05203765 | 0.05203765 | 11.03333333 | 1.2751105 × 10−2 | 4
SOA | 0.05826311 | 0.53740755 | 6.03333333 | 1.4125533 × 10−2 | 12
BWO | 0.05109965 | 0.34071944 | 13.56666667 | 1.3256904 × 10−2 | 9
GJO | 0.0542935 | 0.42723403 | 9.3 | 1.3172539 × 10−2 | 8
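For context, the sketch below evaluates the tension/compression spring design (T/CSD) problem in the formulation commonly used in the literature, with the design vector ordered as (d, D, p) as in Table 16. The objective and constraint constants follow that standard version and are assumed here, since the exact statement of the problem is not reproduced in this section; in practice, constraint violations are usually folded into the objective with a penalty term (see, e.g., [71]) before a metaheuristic such as MSINGO is applied.

```python
import numpy as np

def tcsd_objective(x):
    d, D, p = x                      # wire diameter, mean coil diameter, number of active coils
    return (p + 2.0) * D * d ** 2    # spring weight to be minimized

def tcsd_constraints(x):
    # Standard inequality constraints of the literature formulation; feasible when every g_i <= 0.
    d, D, p = x
    g1 = 1.0 - (D ** 3 * p) / (71785.0 * d ** 4)
    g2 = (4.0 * D ** 2 - d * D) / (12566.0 * (D * d ** 3 - d ** 4)) + 1.0 / (5108.0 * d ** 2) - 1.0
    g3 = 1.0 - 140.45 * d / (D ** 2 * p)
    g4 = (D + d) / 1.5 - 1.0
    return np.array([g1, g2, g3, g4])

# Best MSINGO design reported in Table 16.
x_best = np.array([0.05166107, 0.35617546, 11.36666667])
print(f"objective = {tcsd_objective(x_best):.7e}")      # about 1.27e-2, close to the Table 16 value
print("constraints:", np.round(tcsd_constraints(x_best), 6))
# Residuals within about 1e-3 of zero may stem from rounding of the reported digits
# or from minor differences between this standard formulation and the paper's variant.
```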
Table 17. Optimization results for CBD.
Algorithm | x1 | x2 | x3 | x4 | x5 | Optimal Value | Rank
MSINGO | 6.01657085 | 5.3096501 | 4.49375284 | 3.50112914 | 2.15260674 | 1.3399595 × 100 | 1
NGO | 6.01503616 | 5.3111729 | 4.49155737 | 3.50361648 | 2.15242352 | 1.3399655 × 100 | 2
DE | 6.01600697 | 5.30935665 | 4.49684451 | 3.49909257 | 2.15264729 | 1.3399744 × 100 | 3
SSA | 5.98316931 | 5.34544619 | 4.486311 | 3.52788932 | 2.1950092 | 1.3439603 × 100 | 7
SCA | 6.32362784 | 5.46814471 | 4.75418554 | 3.78577415 | 2.19996678 | 1.4059780 × 100 | 10
MFO | 6.06815902 | 5.20953467 | 4.68780796 | 3.52243504 | 2.20412934 | 1.3535849 × 100 | 8
WOA | 6.00676534 | 5.33296728 | 4.50572794 | 3.52452514 | 2.16214472 | 1.3436049 × 100 | 6
DBO | 5.99704925 | 5.31424301 | 4.47630367 | 3.51815414 | 2.18458645 | 1.3409970 × 100 | 5
POA | 7.91371491 | 8.03542057 | 6.69820596 | 5.82551409 | 4.01106876 | 2.0269969 × 100 | 11
SOA | 9.95270722 | 8.45674586 | 8.00232444 | 7.41729296 | 7.59498781 | 2.5848612 × 100 | 12
BWO | 6.18643737 | 5.33534609 | 4.5406274 | 3.50471167 | 2.19537516 | 1.3579799 × 100 | 9
GJO | 6.00710951 | 5.32070619 | 4.50851846 | 3.49083636 | 2.15950187 | 1.3407684 × 100 | 4
Table 18. Optimization results for PVD.
Algorithm | Ts | Th | R | L | Optimal Value | Rank
MSINGO | 0.77837747 | 0.38476523 | 40.32807677 | 199.89104391 | 5.8862290 × 103 | 2
NGO | 0.7985807 | 0.40136672 | 41.34128664 | 188.15374517 | 5.9514389 × 103 | 3
DE | 0.77818553 | 0.38466527 | 40.32013235 | 199.99420876 | 5.8854618 × 103 | 1
SSA | 1.0757963 | 0.55534384 | 54.70469246 | 73.84185279 | 6.9301306 × 103 | 7
SCA | 1.12202045 | 0.6128027 | 54.85201213 | 82.4779741 | 7.6979528 × 103 | 10
MFO | 1.00113629 | 0.49486844 | 51.87208913 | 100.97244603 | 6.4884200 × 103 | 4
WOA | 1.01597595 | 0.54615418 | 50.65441508 | 108.48255737 | 6.9998825 × 103 | 8
DBO | 0.9185633 | 3.74627255 | 47.31901688 | 139.1836773 | 1.5813270 × 104 | 11
POA | 1.20890241 | 0.61547165 | 62.4021977 | 125.59088007 | 7.2397050 × 103 | 9
SOA | 3.58181582 | 14.21070275 | 57.06407472 | 56.44188856 | 1.4362201 × 105 | 12
BWO | 0.9268719 | 0.51659367 | 46.16063784 | 150.31386398 | 6.7770451 × 103 | 6
GJO | 1.03472148 | 0.52319187 | 53.3258698 | 85.35882201 | 6.6726795 × 103 | 5
Table 19. Optimization results for WBD.
Algorithm | h | l | t | b | Optimal Value | Rank
MSINGO | 0.19883231 | 3.33736532 | 9.19202432 | 0.19883231 | 1.6702177 × 100 | 1
NGO | 0.19883228 | 3.33736578 | 9.19202436 | 0.19883231 | 1.6702178 × 100 | 2
DE | 0.1988319 | 3.33737493 | 9.19202659 | 0.19883236 | 1.6702192 × 100 | 3
SSA | 0.20228294 | 3.97160609 | 8.96072583 | 0.22009406 | 1.8715148 × 100 | 7
SCA | 0.18381673 | 4.07243877 | 9.33506699 | 0.21005482 | 1.8504610 × 100 | 6
MFO | 0.13257496 | 5.96536894 | 9.44693237 | 0.19852796 | 1.9021302 × 100 | 8
WOA | 0.18998614 | 3.56385495 | 9.2142258 | 0.20062402 | 1.7032059 × 100 | 5
DBO | 0.17553749 | 5.41072847 | 9.37121339 | 0.3164207 | 2.1657368 × 100 | 11
POA | 0.13799674 | 7.04317317 | 8.63542248 | 0.24363732 | 2.1635826 × 100 | 10
SOA | 0.3204752 | 4.45095148 | 5.2948073 | 0.72813159 | 3.4197885 × 100 | 12
BWO | 0.14684037 | 5.61671156 | 9.3375609 | 0.21199649 | 1.9755729 × 100 | 9
GJO | 0.19090334 | 3.52253955 | 9.19913179 | 0.19909769 | 1.6855842 × 100 | 4
Table 20. Optimization results for SRD.
Algorithm | b | m | p | l1 | l2 | d1 | d2 | Optimal Value | Rank
MSINGO | 3.5 | 0.7 | 17 | 8.171 | 8.252 | 3.9 | 5.5 | 1.3415265 × 103 | 1
NGO | 3.5 | 0.7 | 17 | 8.174 | 8.265 | 3.9 | 5.5 | 1.3415265 × 103 | 1
DE | 3.5 | 0.7 | 17 | 8.134 | 8.180 | 3.9 | 5.5 | 1.3415265 × 103 | 1
SSA | 3.548 | 0.701 | 19.131 | 8.012 | 8.010 | 3.741 | 5.363 | 1.7946432 × 103 | 10
SCA | 3.535 | 0.7 | 17 | 8.173 | 8.286 | 3.849 | 5.497 | 1.3569351 × 103 | 8
MFO | 3.503 | 0.7 | 17 | 8.234 | 8.240 | 3.9 | 5.5 | 1.3428041 × 103 | 5
WOA | 3.505 | 0.7 | 17 | 8.189 | 8.237 | 3.878 | 5.493 | 1.3447225 × 103 | 6
DBO | 3.557 | 0.7 | 17.733 | 8.015 | 8.277 | 3.850 | 5.5 | 1.5282107 × 103 | 9
POA | 3.526 | 0.703 | 20.834 | 7.844 | 8.027 | 3.651 | 5.389 | 3.8108706 × 1017 | 11
SOA | 3.567 | 0.756 | 24.152 | 8.053 | 8.1004 | 3.776 | 5.414 | 9.8846695 × 1018 | 12
BWO | 3.5 | 0.7 | 17 | 8.201 | 8.259 | 3.9 | 5.5 | 1.3415265 × 103 | 1
GJO | 3.505 | 0.7 | 17 | 8.151 | 8.183 | 3.871 | 5.495 | 1.3448399 × 103 | 7
Table 21. Optimization results for T-bTD.
Algorithm | x1 | x2 | Optimal Value | Rank
MSINGO | 0.78866778 | 0.4082692 | 2.6389586 × 102 | 2
NGO | 0.78871489 | 0.40813758 | 2.6389602 × 102 | 3
DE | 0.78867514 | 0.40824829 | 2.6389584 × 102 | 1
SSA | 0.78884713 | 0.41517572 | 2.6463723 × 102 | 8
SCA | 0.79453751 | 0.40020516 | 2.6474966 × 102 | 9
MFO | 0.78737985 | 0.41270683 | 2.6397534 × 102 | 6
WOA | 0.82522025 | 0.36892016 | 2.7029955 × 102 | 12
DBO | 0.79085504 | 0.40224642 | 2.6391223 × 102 | 5
POA | 0.78794119 | 0.41042386 | 2.6390581 × 102 | 4
SOA | 0.79690303 | 0.41399512 | 2.6679773 × 102 | 10
BWO | 0.78871128 | 0.40914414 | 2.6399565 × 102 | 7
GJO | 0.83065798 | 0.36955195 | 2.7190075 × 102 | 11
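Similarly, the three-bar truss design (T-bTD) problem has a compact standard formulation. The sketch below evaluates it with the usual constants found in the literature (l = 100 cm, P = 2 kN/cm², σ = 2 kN/cm²), which are assumptions here rather than values quoted from the paper; evaluating the MSINGO design from Table 21 with this formulation reproduces an objective value consistent with the table.

```python
import math

L = 100.0      # bar length (cm), assumed literature constant
P = 2.0        # applied load (kN/cm^2), assumed literature constant
SIGMA = 2.0    # allowable stress (kN/cm^2), assumed literature constant

def tbtd_objective(x1, x2):
    # Total structural volume of the truss, to be minimized.
    return (2.0 * math.sqrt(2.0) * x1 + x2) * L

def tbtd_constraints(x1, x2):
    # Stress constraints of the standard formulation; feasible when every g_i <= 0.
    s2 = math.sqrt(2.0)
    g1 = (s2 * x1 + x2) / (s2 * x1 ** 2 + 2.0 * x1 * x2) * P - SIGMA
    g2 = x2 / (s2 * x1 ** 2 + 2.0 * x1 * x2) * P - SIGMA
    g3 = 1.0 / (s2 * x2 + x1) * P - SIGMA
    return (g1, g2, g3)

# Best MSINGO design reported in Table 21.
x1, x2 = 0.78866778, 0.4082692
print(f"objective = {tbtd_objective(x1, x2):.7f}")               # about 263.896, as in Table 21
print("constraints:", tuple(round(g, 6) for g in tbtd_constraints(x1, x2)))
```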