Article

Multi-Strategy Honey Badger Algorithm for Global Optimization

1 School of Mathematics and Statistics, Qiannan Normal University for Nationalities, Duyun 558000, China
2 Guangxi Key Laboratory of Hybrid Computation and IC Design Analysis, Guangxi University for Nationalities, Nanning 530006, China
3 College of Artificial Intelligence, Guangxi University for Nationalities, Nanning 530006, China
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(9), 581; https://doi.org/10.3390/biomimetics10090581
Submission received: 23 April 2025 / Revised: 1 August 2025 / Accepted: 5 August 2025 / Published: 2 September 2025
(This article belongs to the Special Issue Advances in Biological and Bio-Inspired Algorithms)

Abstract

The Honey Badger Algorithm (HBA) is a recently proposed metaheuristic optimization algorithm inspired by the foraging behavior of honey badgers. Its search mechanism is divided into two phases, a digging phase and a honey-seeking phase, which emulate exploration and exploitation of the search space. Despite this innovative design, the HBA suffers from slow convergence, an imbalanced trade-off between exploration and exploitation, and a tendency to become trapped in local optima. To address these issues, we propose an enhanced variant, the Multi-Strategy Honey Badger Algorithm (MSHBA), which incorporates a cubic chaotic mapping mechanism for population initialization to improve the uniformity and diversity of the initial population distribution. In the digging and honey-seeking stages, the position of each honey badger is updated based on the best fitness value within the population, a strategy that can lead to premature convergence as the population aggregates around the fittest individual. To counteract this tendency and enhance the algorithm's global optimization capability, we introduce a random search strategy. Furthermore, an elite tangent search and a differential mutation strategy are employed after three iterations without improvement of the population's best value, thereby enhancing the algorithm's efficacy. A comprehensive performance evaluation on a suite of established benchmark functions reveals that MSHBA excels on 26 of the 29 IEEE CEC 2017 benchmarks, and subsequent statistical analysis corroborates its superior performance. Moreover, MSHBA has been successfully applied to four engineering design problems, highlighting its capability for addressing constrained engineering design challenges and outperforming other optimization algorithms in this domain.

1. Introduction

Optimization refers to the process of finding the best solution for a given system from all possible values to maximize or minimize the output. Over the past few decades, as the complexity of problems has increased, the demand for new optimization techniques has become more pressing [1,2]. In the past, traditional mathematical techniques used to solve optimization problems were mostly deterministic, but a major issue was their tendency to become trapped in local optima. This has resulted in low efficiency when using these techniques to solve practical optimization problems over the past two decades [3,4], thereby increasing interest in stochastic optimization techniques. In general, most real-world optimization problems—such as those in engineering [5], wireless sensor networks [6], image processing [7], feature selection [8,9], tuning machine learning parameters [10], and bioinformatics [11]—are highly nonlinear and non-convex due to inherent complex constraints and numerous design variables. Therefore, solving these types of optimization problems is highly complex due to the presence of numerous inherent local minima. Additionally, there is no guarantee of finding a global optimal solution.
Optimization problem-solving algorithms are classified into five main categories based on heuristic creation principles. Human-based optimization algorithms are designed based on human brain thinking, systems, organs, and social evolution. An example is the well-known Neural Network Algorithm (NNA) [12], which solves problems based on the message transmission in neural networks of the human brain. The Harmony Search (HS) algorithm [13,14] simulates the process by which musicians achieve a harmonious state by iteratively adjusting pitches through memory recall.
Algorithms that mimic natural evolution are classified as evolutionary optimization algorithms. The Genetic Algorithm (GA) [15] is the most classic model that simulates evolution, where chromosomes form offspring through a series of stages in cycles and produce more adaptive individuals through selection and reproduction mechanisms. Additionally, the differential evolution (DE) algorithm [16,17], the Imperialist Competitive Algorithm (ICA) [18], and the Mimetic Algorithm (MA) [19] are also based on evolutionary mechanisms.
Population-based optimization algorithms simulate the behaviors of biological populations, including reproduction, predation, and migration. In these algorithms, individuals in the population are treated as massless particles searching for the best position. The Ant Colony Optimization (ACO) algorithm [20,21] utilizes the concept of ants finding the shortest path from the nest to food sources. The Particle Swarm Optimization (PSO) algorithm [22] is derived from the foraging behavior of birds and is widely recognized as a swarm intelligence algorithm. The Moth Flame Optimization (MFO) algorithm [23] is a mathematical model that simulates the unique navigation behavior of moths, which spiral towards a light source until they reach the "flame." Other swarm intelligence algorithms include the Grey Wolf Optimization (GWO) algorithm [24], the Manta Ray Foraging Optimization (MRFO) algorithm [25], the Artificial Hummingbird Algorithm (AHA) [26], the Dwarf Mongoose Optimization (DMO) algorithm [27], the Chimp Optimization Algorithm (ChOA) [28,29], the Coati Optimization Algorithm (COA) [30], the Dung Beetle Optimizer (DBO) [31], Harris Hawks Optimization (HHO) [32], and the Osprey Optimization Algorithm (OOA) [33].
Plant growth-based optimization algorithms. The inspiration for these algorithms comes from plant characteristics such as photosynthesis, flower pollination, and seed dispersal. The Dandelion Optimization (DO) algorithm [34] is inspired by the processes of rising, falling, and landing of dandelion seeds in various wind directions. Algorithms that mimic the aggressive invasion of weeds, their search for suitable living spaces, and their utilization of natural resources for rapid growth and reproduction are known as Invasive Weed Optimization (IWO) [35].
Physics-based optimization algorithms are developed based on natural physical phenomena and laws. The Gravity Search Algorithm (GSA) [36] originates from the concept of gravity, and it possesses powerful global search capabilities and fast convergence speed. The Artificial Raindrop Optimization Algorithm (ARA) [37] is designed based on the processes of raindrop formation, landing, collision, aggregation, and evaporation into water vapor.
In particular, due to their excellent performance, many algorithms have been applied to a wide range of practical engineering problems, such as feature selection [38,39,40], image segmentation [41,42], signal processing [43], hydraulic facility construction [44], walking robot path planning [45,46], job shop scheduling [47], and pipeline and wiring optimization in industrial and agricultural production [48]. Unlike gradient-based optimization algorithms, metaheuristic algorithms rely on probabilistic searches rather than gradient-based methods. In the absence of centralized control constraints, the failure of individual agents will not affect the overall problem-solving process, ensuring a more stable search process. Typically, as a first step, it is necessary to appropriately set the basic parameters of the algorithm and generate an initial population of random solutions. Next, the search mechanism of the algorithm is employed to locate the optimal value until a stopping criterion is met or the optimal value is identified [49]. However, it is evident that each algorithm possesses distinct advantages and disadvantages, and its performance may vary depending on the specific problem being addressed. The “No Free Lunch” (NFL) theorem [50] posits that while an algorithm may effectively solve certain optimization problems, there is no universal guarantee that it can successfully address other optimization problems. Therefore, when confronted with several specific problems, it is reasonable to propose multiple strategies to enhance the efficiency of the algorithm.
In order to identify more effective problem-solving approaches, numerous researchers have endeavored to develop new algorithms and enhance existing methods. Research on metaheuristic optimization algorithms has led to the development of effective search strategies for achieving global optimality. Due to the exponential growth of the search space in real-life optimization problems, which often exhibit multimodality, traditional optimization methods frequently yield suboptimal solutions. Over the past few decades, the development of numerous new metaheuristic algorithms [51] has demonstrated robust performance across a broader spectrum of complex problems.
The Honey Badger Algorithm (HBA) is a novel metaheuristic algorithm proposed by Fatma A. Hashim et al. in 2022, inspired by the foraging behavior of honey badgers in nature. The algorithm searches for the optimal solution to problems by simulating the dynamic foraging and digging behaviors of honey badgers. The algorithm is known for its strong search ability and fast convergence speed compared to other algorithms, but it also has the disadvantage of slower search performance in the late stages and a tendency to become trapped in local optimal solutions.
The proposed Multi-Strategy Honey Badger Algorithm (MSHBA) is introduced to address the aforementioned issues. Due to the random generation of initial populations in the basic Honey Badger search algorithm, it cannot guarantee the uniform distribution of individuals within the search space, thereby affecting the algorithm’s search efficiency and optimization performance. The Cubic chaos mapping mechanism is incorporated into the initialization process of the improved Honey Badger algorithm to enhance the traversal capability of the initial population. During the digging and honey-seeking phases of the Honey Badger Algorithm, the positions of the agents are updated based on the best value within the population. This approach can lead to premature convergence due to population clustering around the optimal individual. To enhance the global optimization capability of the Honey Badger Algorithm, a random search strategy is introduced. When the population identifies the same best value for three consecutive iterations, both the elite tangent search and the differential mutation strategy are executed. Finally, the MSHBA algorithm is tested and validated on the CEC 2017 benchmark suite, demonstrating improved convergence speed and accuracy, as well as high efficiency in solving engineering problems.

2. Fundamentals of Honey Badger Algorithm

In the standard Honey Badger Optimization Algorithm, the optimal solution to the optimization problem is obtained by updating the prey odor intensity factor and employing two distinct foraging strategies of honey badgers: the “digging phase” and the “honey phase,” each characterized by unique search trajectories.
During the initialization stage, the population size of honey badgers is N, and the position of each individual honey badger is determined by the following equation:
$$x_i = lb_i + r_1 \times (ub_i - lb_i) \tag{1}$$
where r_1 is a random number between 0 and 1, x_i is the position of the i-th honey badger, referring to a candidate solution, while lb_i and ub_i are the lower and upper bounds of the search space, respectively.
Defining Intensity (I): intensity is related to the concentration of the prey and its distance to the individual honey badger. I_i represents the intensity of the scent. If the odor concentration is high, the honey badger searches for prey more rapidly; conversely, if the concentration is low, the search is slower. Therefore, the intensity of the scent is directly proportional to the concentration of the prey and inversely proportional to the square of the distance from the honey badger. The specific definition is given by the following equation:
$$I_i = r_2 \times \frac{S}{4\pi d_i^2}, \qquad S = (x_i - x_{i+1})^2, \qquad d_i = x_{prey} - x_i \tag{2}$$
where r_2 is a randomly generated number ranging from 0 to 1, S is the prey concentration, and d_i denotes the distance between the prey and the i-th honey badger.
Update the density factor. The density factor α controls the time-varying randomization to ensure a smooth transition from exploration to exploitation; as the number of iterations increases, the intensity factor decreases, thereby reducing randomization over time, as given by the equation below:
$$\alpha = C \times \exp\!\left(-\frac{t}{t_{\max}}\right) \tag{3}$$
where t max is the maximum number of iterations, and C is a constant ≥1 (default = 2).
Digging phase
At this stage, the honey badger automatically locates beehives by scent and destroys them to obtain food. Its path follows the shape of a heart, and its location is updated as follows:
$$x_{new} = x_{prey} + F \times \beta \times I \times x_{prey} + F \times r_3 \times \alpha \times d_i \times \left|\cos(2\pi r_4) \times \left[1 - \cos(2\pi r_5)\right]\right| \tag{4}$$
$$F = \begin{cases} 1, & r_6 \le 0.5 \\ -1, & r_6 > 0.5 \end{cases} \tag{5}$$
where x_prey is the position of the prey, which is the best position found so far, that is, the global best position. β ≥ 1 (default = 6) is the ability of the honey badger to obtain food, and d_i is the distance between the prey and the i-th honey badger; see Equation (2). r_3, r_4, and r_5 are three different random numbers between 0 and 1. F works as a flag that alters the search direction; it is determined using Equation (5), where r_6 is a randomly generated number ranging from 0 to 1.
Honey phase
At this stage, the honey badger follows the honey guide bird to the beehive to find honey. This process can be described by the following formula:
$$x_{new} = x_{prey} + F \times r_7 \times \alpha \times d_i \tag{6}$$
Here, x_new refers to the new position of the honey badger, whereas x_prey denotes the location of the prey. The parameters F and α are determined using Equations (5) and (3), respectively. From Equation (6), it can be observed that the honey badger searches in the vicinity of the best position x_prey found so far, based on the distance information d_i. At this stage, the search behavior varies over time through the density factor α, and the honey badger may also encounter the direction disturbance F; r_7 is a random number between 0 and 1.
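To make the two search phases concrete, the following minimal Python sketch implements one HBA position update according to Equations (2)-(6). It is an illustration, not the authors' code (the paper's experiments were run in MATLAB); the coin-flip choice between the digging and honey phases and the small epsilon guarding the division are assumptions of this sketch.

```python
import numpy as np

def hba_update(x_i, x_next, x_prey, t, t_max, beta=6.0, C=2.0):
    """One position update of the standard HBA (Eqs. (2)-(6)).

    x_i    : position of the i-th honey badger
    x_next : position of the (i+1)-th badger, used for the prey concentration S
    x_prey : best position found so far (global best)
    """
    S = np.sum((x_i - x_next) ** 2)                  # prey concentration, Eq. (2)
    d = x_prey - x_i                                 # distance to the prey, Eq. (2)
    I = np.random.rand() * S / (4.0 * np.pi * np.sum(d ** 2) + 1e-30)  # smell intensity
    alpha = C * np.exp(-t / t_max)                   # density factor, Eq. (3)
    F = 1.0 if np.random.rand() <= 0.5 else -1.0     # direction flag, Eq. (5)
    r3, r4, r5, r7 = np.random.rand(4)
    if np.random.rand() < 0.5:                       # digging phase, Eq. (4)
        x_new = (x_prey + F * beta * I * x_prey
                 + F * r3 * alpha * d
                 * np.abs(np.cos(2 * np.pi * r4) * (1 - np.cos(2 * np.pi * r5))))
    else:                                            # honey phase, Eq. (6)
        x_new = x_prey + F * r7 * alpha * d
    return x_new                                     # clip to [lb, ub] in the caller
```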

3. Multi-Strategy Honey Badger Algorithm

Given the shortcomings of the standard HBA algorithm, such as its tendency to easily fall into local optima and its slow convergence speed in the later stages, this paper introduces a cubic chaotic mapping mechanism during the population initialization phase and incorporates a random search strategy into the position update process of the honey badgers. When the global best solution obtained by the honey badger population remains unchanged for three consecutive iterations, the elite tangent search and differential mutation strategies are introduced.

3.1. Cubic Chaotic Mapping

Since the initial population of the basic Honey Badger Algorithm is generated randomly, the uniform distribution of individuals in the search space cannot be guaranteed, which may negatively impact the convergence speed and optimization performance of the algorithm. To improve the initialization process of the Honey Badger Algorithm, cubic mapping [52] was introduced to enhance the diversity and coverage of the initial population, as shown below:
$$x_i = lb_i + (ub_i - lb_i) \times z_i, \quad i = 1, 2, \ldots, n \tag{7}$$
$$z_{i+1} = \rho \times z_i \left(1 - z_i^2\right) \tag{8}$$
where x_i denotes the position of the i-th honey badger; lb_i and ub_i represent the lower and upper bounds of the i-th variable, respectively; z_i is the i-th value of the cubic chaotic sequence; and ρ is the control parameter of the cubic map, typically set to a constant value of 3. Generating the individuals of the initial population in this way ensures a uniform and diverse distribution across the search space.
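As an illustration of Equations (7) and (8), a minimal Python sketch of the cubic chaotic initialization is given below; the seed z0 and the fold of the chaotic values back into [0, 1] are assumptions of this sketch, since the paper only fixes ρ = 3.

```python
import numpy as np

def cubic_chaotic_init(n_agents, dim, lb, ub, rho=3.0, z0=0.3):
    """Initial population from the cubic chaotic map, Eqs. (7)-(8)."""
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    z = np.empty(n_agents * dim)
    z[0] = z0                                           # arbitrary nonzero seed
    for k in range(1, z.size):
        z[k] = rho * z[k - 1] * (1.0 - z[k - 1] ** 2)   # cubic map, Eq. (8)
    # For rho = 3 the orbit stays within [-2/sqrt(3), 2/sqrt(3)]; fold it into [0, 1]
    z = np.minimum(np.abs(z), 1.0).reshape(n_agents, dim)
    return lb + (ub - lb) * z                           # map into the search space, Eq. (7)
```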

3.2. Introduction of Random Value Perturbation Strategy for Honey Badger Algorithm

In the Honey Badger Algorithm, the position of each honey badger is updated based on the global best solution x_prey, which may easily result in premature convergence as the population tends to cluster around the current best individual. To enhance the global optimization capability of the Honey Badger Algorithm, a random search strategy [53] is incorporated, and the position update mechanism is adaptively determined by the value of a control coefficient A. When |A| ≥ 1, a perturbation-based search strategy is applied to randomly selected individuals; otherwise, the position of the honey badger is updated based on the current global best solution x_prey. The expression is given by
$$A = 2m\cos(r) - m \tag{9}$$
In this expression, m = 2(1 − t/t_max) decreases linearly from 2 to 0 as the iterations proceed, and r denotes a random number drawn from the interval (0,1). In the early stages of iteration, the algorithm therefore frequently applies the random search strategy, which prevents premature convergence due to population clustering, enhances the exploration capability of the honey badgers across the search space, and improves global search performance.
The mathematical expressions for the digging phase and the honey phase of the Honey Badger Algorithm at the current stage are given as follows:
$$x_{new} = x_{rand} + F \times \beta \times I_i \times x_{rand} + F \times r_3 \times \alpha \times d_i \times \left|\cos(2\pi r_4) \times \left[1 - \cos(2\pi r_5)\right]\right| \tag{10}$$
$$x_{new} = x_{rand} + F \times r_7 \times \alpha \times d_i \tag{11}$$
where d_i = x_rand − x_i and x_rand is the position of a randomly selected honey badger.
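A minimal sketch of the A-controlled update of Equations (9)-(11) follows; as in the base HBA, the choice between the perturbed digging and honey moves is modeled here as a coin flip, which is an assumption of this sketch.

```python
import numpy as np

def mshba_update(x_i, X, x_prey, I_i, t, t_max, beta=6.0, C=2.0):
    """Position update with the random search switch, Eqs. (9)-(11)."""
    m = 2.0 * (1.0 - t / t_max)                      # decreases linearly from 2 to 0
    A = 2.0 * m * np.cos(np.random.rand()) - m       # control coefficient, Eq. (9)
    alpha = C * np.exp(-t / t_max)                   # density factor, Eq. (3)
    F = 1.0 if np.random.rand() <= 0.5 else -1.0     # direction flag, Eq. (5)
    r3, r4, r5, r7 = np.random.rand(4)
    if np.abs(A) >= 1.0:                             # explore around a random agent
        x_rand = X[np.random.randint(len(X))]
        d = x_rand - x_i
        if np.random.rand() < 0.5:                   # perturbed digging, Eq. (10)
            x_new = (x_rand + F * beta * I_i * x_rand
                     + F * r3 * alpha * d
                     * np.abs(np.cos(2 * np.pi * r4) * (1 - np.cos(2 * np.pi * r5))))
        else:                                        # perturbed honey phase, Eq. (11)
            x_new = x_rand + F * r7 * alpha * d
    else:                                            # exploit around the global best
        d = x_prey - x_i
        x_new = x_prey + F * r7 * alpha * d          # honey phase, Eq. (6)
    return x_new
```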

3.3. Elite Tangent Search and Differential Mutation Strategies Executed When the Best Value Remains Unchanged for Three Iterations

During each iteration, the population is divided into two subpopulations: the first half, consisting of individuals with lower fitness values, is designated as the elite subpopulation, while the second half serves as the exploration subpopulation. The elite subpopulation then executes the elite migration strategy.

3.3.1. Migration Strategy of Elite Subpopulation

When an individual approaches the current optimal solution, it explores the surrounding region—a behavior referred to as local search—which enhances convergence speed and solution accuracy. Since the fitness of the elite subpopulation is close to the current global best solution, enabling it to perform local search can improve the convergence rate and solution precision. The migration strategy of the elite subpopulation [54] involves updating the position by incorporating the Tangent Search Algorithm [55] (TSA). The mathematical formulation of the elite tangent search strategy is given as follows:
$$x_{new} = x_{prey} + step \times \tan(\theta) \times (rand \times x_{prey} - x), \quad \text{if } x = opt_s \tag{12}$$
$$x_{new} = x_{prey} + step \times \tan(\theta) \times (x_{prey} - x), \quad \text{if } x \ne opt_s \tag{13}$$
where step is the step size, θ is the tangent-flight angle, opt_s denotes the current optimum, and rand is a random number in (0,1). Owing to this perturbation applied to the elite individuals, the original position x is moved to a new position x_new in the neighborhood of x_prey.
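The elite tangent move of Equations (12) and (13) can be sketched as follows; the step size and the uniform draw of the tangent angle θ are assumptions borrowed from the TSA literature [55], not values fixed by the paper.

```python
import numpy as np

def elite_tangent_move(x, x_prey, is_current_best, step=1.0):
    """Elite tangent search, Eqs. (12)-(13)."""
    theta = (np.random.rand() - 0.5) * np.pi         # tangent angle in (-pi/2, pi/2)
    if is_current_best:                              # x = opt_s, Eq. (12)
        return x_prey + step * np.tan(theta) * (np.random.rand() * x_prey - x)
    return x_prey + step * np.tan(theta) * (x_prey - x)   # x != opt_s, Eq. (13)
```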

3.3.2. Exploration of Subpopulation Evolution Strategies

In the Honey Badger Algorithm, the position updating mechanism generates new individuals in the vicinity of the current individual x and the current global best individual x p r e y . In other words, other individuals in the population are guided toward the global best solution. However, if this solution is a local optimum, continued iterations may cause the honey badger individuals to converge around it, resulting in reduced population diversity and increasing the risk of premature convergence. To address these issues, a differential mutation strategy is adopted. Inspired by the mutation strategy in differential evolution, the current individual, the global best individual, and randomly selected individuals from the population are used to perform differentiation in order to generate new individuals. This process is described by the following equation:
$$X(t+1) = X_{rand1}(t) + F_0 \times \left(X_{rand2}(t) - X_{rand3}(t)\right) \tag{14}$$
where F 0 = 0.4 is the scaling factor for differential evolution; t denotes the current iteration number; and X r a n d 1 , X r a n d 2 , X r a n d 3 represent three randomly selected honey badger individuals.
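Equation (14) is the classic DE/rand/1 mutation; a short sketch that draws three mutually distinct individuals is given below.

```python
import numpy as np

def differential_mutation(X, F0=0.4):
    """DE/rand/1 mutation of Eq. (14) for one new individual."""
    r1, r2, r3 = np.random.choice(len(X), size=3, replace=False)
    return X[r1] + F0 * (X[r2] - X[r3])
```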
The flowchart of the MSHBA algorithm is shown in Figure 1. Algorithm 1 presents the pseudocode of MSHBA.
Algorithm 1 Pseudocode of MSHBA
Set parameters t_max, N, β, C.
Initialize the population using Equations (7) and (8).
while t ≤ t_max do
  Update the density factor α using Equation (3).
  for i = 1 to N do
    Calculate the intensity I_i using Equation (2).
    Calculate A using Equation (9).
    if |A| ≥ 1 then
      Update the position x_new using Equations (10) and (11).
    else
      Update the position x_new using Equations (4) and (6).
    end if
    if the best value of the population has not changed for 3 iterations then
      The first half of the population (elite subpopulation) performs the elite tangent search strategy: update x_new using Equations (12) and (13).
      The second half (exploration subpopulation) performs the differential mutation strategy: update x_new using Equation (14).
    else
      Perform digging and honey hunting: update the position using Equations (4) and (6).
    end if
  end for
  t = t + 1
end while
Output the optimal solution.

4. Experimental Results and Discussion

The experiments were conducted on the Windows 10 operating system, and all algorithms were implemented in MATLAB R2023b. The performance of MSHBA was compared with that of recently developed metaheuristics to evaluate its effectiveness in global optimization. Its numerical efficiency was evaluated on 29 benchmark functions from CEC 2017 and four engineering design problems, and the results were compared with four state-of-the-art optimization algorithms: COA [30], DBO [31], HHO [32], and OOA [33]. These competitors are swarm intelligence algorithms widely recognized in the metaheuristic literature and are also relatively recent methods that have demonstrated promising performance on the optimization problems considered in this study; they were selected to include both well-established and recently proposed algorithms, ensuring a fair comparison that demonstrates the overall effectiveness of the proposed approach. For fairness, a maximum of 500 iterations was set for each optimization problem.

4.1. Parameter Settings

Apart from the algorithm-specific parameter settings and the dimensions of the test functions listed in Table 1 and Table 2, the general settings common to all selected algorithms include a population size of 30 ( N = 30 ), a maximum of 500 iterations ( T max = 500 ), and 30 independent runs for each optimization problem.

4.2. Benchmark Testing Functions

The performance of the improved Honey Badger Optimization Algorithm is evaluated using the CEC2017 benchmark functions, which are categorized into unimodal functions, basic multimodal functions, hybrid functions, and composition functions, as presented in Table 1 below.
Function F2 has been excluded from the benchmark set, as it exhibits unstable behavior, particularly in high-dimensional cases, and shows significant variations in performance when the same algorithm is implemented in MATLAB.

4.3. Comparison of MSHBA Algorithm with Other Algorithms

Table 3 demonstrates that the MSHBA algorithm achieves superior performance in terms of minimum, worst-case, median, average, and standard deviation (std) values on the unimodal functions F1 and F3, compared to the Coati Optimization Algorithm (COA), Dung Beetle Optimizer (DBO), Harris Hawks Optimization (HHO), Osprey Optimization Algorithm (OOA), and the original Honey Badger Algorithm (HBA). These results indicate a significant improvement in the performance of the enhanced Honey Badger Algorithm (MSHBA).
Table 4 shows that the MSHBA algorithm outperforms the Coati Optimization Algorithm (COA), Dung Beetle Optimizer (DBO), Harris Hawks Optimization (HHO), Osprey Optimization Algorithm (OOA), and the original Honey Badger Algorithm (HBA) on a set of multimodal functions (F4–F10), in terms of minimum, worst-case, median, average, and standard deviation (std) values.
Table 5 shows that the MSHBA algorithm exhibits superior minimum, worst-case, median, average, and standard deviation (std) values on the hybrid function set F11–F20 compared to the Coati Optimization Algorithm (COA), Dung Beetle Optimizer (DBO), Harris Hawks Optimization (HHO), Osprey Optimization Algorithm (OOA), and the Honey Badger Algorithm (HBA).
Table 6 shows that the MSHBA algorithm demonstrates superior performance in terms of minimum, worst-case, median, average, and standard deviation (std) values on the composition functions F21, F23, F24, F26, and F30, compared to the Coati Optimization Algorithm (COA), Dung Beetle Optimizer (DBO), Harris Hawks Optimization (HHO), Osprey Optimization Algorithm (OOA), and the original Honey Badger Algorithm (HBA). However, its variance is slightly inferior to that of COA on function F22, and to that of HBA on functions F25 and F29. Additionally, its mean and variance are slightly weaker than those of HBA on functions F27 and F28.

4.4. MSHBA and Other Algorithms’ Rank-Sum Test

Table 7 shows that the Multi-Strategy Honey Badger Optimization Algorithm (MSHBA) demonstrates significantly superior performance compared to the other four algorithms (COA, DBO, OOA, and HHO) on test functions F1 and F3–F30, according to the Wilcoxon rank-sum test, as the obtained p-values are significantly lower than the given significance level. Overall, the improved algorithm outperforms the other four algorithms. However, on test functions F1, F12, F15, F18, F19, F20, F22, F25, F27, F28, and F29, the performance of the improved Honey Badger Algorithm is comparable to that of the original Honey Badger Algorithm. In general, the performance of the Honey Badger Algorithm has been significantly enhanced by introducing hybrid strategy operators.

4.5. Boxplot Comparison of MSHBA and Other Algorithms

As the boxplots in Figure 2 show, MSHBA outperforms the Coati Optimization Algorithm (COA), Dung Beetle Optimizer (DBO), Harris Hawks Optimization (HHO), Osprey Optimization Algorithm (OOA), and the original Honey Badger Algorithm (HBA) on the CEC2017 test functions F3, F5–F9, F12, F13, F15–F19, F21, F23, F24, F26, F27, and F29.
However, on functions F1, F4, F11, F14, F22, F25, and F30, MSHBA performs comparably to or slightly worse than the original Honey Badger Algorithm (HBA). On functions F10, F20, and F28, its performance is similar to that of HHO.
Overall, the improved Honey Badger Algorithm demonstrates enhanced optimization performance on the majority of the benchmark functions.

4.6. Fitness Change Curves of MSHBA on the Test Functions

The Multi-Strategy Honey Badger Algorithm (MSHBA) demonstrates superior convergence performance compared to the other five algorithms (COA, DBO, OOA, HHO, and HBA) on the benchmark functions F1, F3–F19, and F21–F30 (Figures 3–31). In particular, MSHBA outperforms COA, DBO, OOA, and HHO on functions F3, F20, and F29. Overall, the improved Honey Badger Algorithm exhibits significantly enhanced convergence behavior compared to the original HBA.

5. MSHBA for Solving Classical Engineering Problems

5.1. Weight Minimization of a Speed Reducer (WMSR) [51]

The reducer weight minimization problem is a typical engineering optimization problem: the weight of the reducer is to be minimized by optimizing the design variables while satisfying 11 design constraints, of which seven are nonlinear and four are linear. The design variables include the gear face width b (x1), gear module m (x2), number of teeth on the pinion z (x3), bearing span of the first shaft (x4), bearing span of the second shaft (x5), diameter of the first shaft d1 (x6), and diameter of the second shaft d2 (x7). The mathematical model is established as follows:
Minimize
$$F_1(\bar{x}) = 0.7854 x_2^2 x_1 \left(3.3333 x_3^2 + 14.9334 x_3 - 43.0934\right) + 0.7854\left(x_5 x_7^2 + x_4 x_6^2\right) - 1.508 x_1 \left(x_7^2 + x_6^2\right) + 7.477\left(x_7^3 + x_6^3\right)$$
subject to
$$\begin{aligned}
g_1(\bar{x}) &= -x_1 x_2^2 x_3 + 27 \le 0, \\
g_2(\bar{x}) &= -x_1 x_2^2 x_3^2 + 397.5 \le 0, \\
g_3(\bar{x}) &= -x_2 x_6^4 x_3 x_4^{-3} + 1.93 \le 0, \\
g_4(\bar{x}) &= -x_2 x_7^4 x_3 x_5^{-3} + 1.93 \le 0, \\
g_5(\bar{x}) &= 10 x_6^{-3} \sqrt{16.91 \times 10^6 + \left(745 x_4 x_2^{-1} x_3^{-1}\right)^2} - 1100 \le 0, \\
g_6(\bar{x}) &= 10 x_7^{-3} \sqrt{157.5 \times 10^6 + \left(745 x_5 x_2^{-1} x_3^{-1}\right)^2} - 850 \le 0, \\
g_7(\bar{x}) &= x_2 x_3 - 40 \le 0, \qquad g_8(\bar{x}) = -x_1 x_2^{-1} + 5 \le 0, \\
g_9(\bar{x}) &= x_1 x_2^{-1} - 12 \le 0, \qquad g_{10}(\bar{x}) = 1.5 x_6 - x_4 + 1.9 \le 0, \\
g_{11}(\bar{x}) &= 1.1 x_7 - x_5 + 1.9 \le 0
\end{aligned}$$
with bounds
$$2.6 \le x_1 \le 3.6, \quad 0.7 \le x_2 \le 0.8, \quad 17 \le x_3 \le 28, \quad 7.3 \le x_4, x_5 \le 8.3, \quad 2.9 \le x_6 \le 3.9, \quad 5 \le x_7 \le 5.5$$
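For illustration, the sketch below evaluates this model with a static penalty, which is one common way to hand a constrained design problem to a metaheuristic such as MSHBA; the penalty factor is an assumption of this sketch, not a value reported in the paper, and the same pattern carries over to the problems in Sections 5.2–5.4.

```python
import numpy as np

def speed_reducer_penalized(x, penalty=1e6):
    """Penalized WMSR objective F1 with constraints g1-g11 (Section 5.1)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x2**2 * x1 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         + 0.7854 * (x5 * x7**2 + x4 * x6**2)
         - 1.508 * x1 * (x7**2 + x6**2)
         + 7.477 * (x7**3 + x6**3))
    g = [
        -x1 * x2**2 * x3 + 27,
        -x1 * x2**2 * x3**2 + 397.5,
        -x2 * x6**4 * x3 / x4**3 + 1.93,
        -x2 * x7**4 * x3 / x5**3 + 1.93,
        10.0 / x6**3 * np.sqrt(16.91e6 + (745.0 * x4 / (x2 * x3))**2) - 1100.0,
        10.0 / x7**3 * np.sqrt(157.5e6 + (745.0 * x5 / (x2 * x3))**2) - 850.0,
        x2 * x3 - 40.0,
        -x1 / x2 + 5.0,
        x1 / x2 - 12.0,
        1.5 * x6 - x4 + 1.9,
        1.1 * x7 - x5 + 1.9,
    ]
    # Static penalty: add the squared violation of every broken constraint
    return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)
```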

5.2. Tension/Compression Spring Design (TCSD) [52]

The design of tension/compression springs is a classic engineering optimization problem. The objective is to optimize the structural parameters of the spring to achieve minimum mass or optimal performance, while satisfying specific mechanical properties and spatial requirements. The design variables include wire diameter d ( x 1 ) , mean coil diameter D ( x 2 ) , and number of active coils N ( x 3 ) . The constraints are shear stress constraint, spring vibration frequency constraint, and minimum deflection constraint. The established mathematical model is as follows:
Minimize
$$F_2(\bar{x}) = x_1^2 x_2 (2 + x_3)$$
subject to
$$\begin{aligned}
g_1(\bar{x}) &= 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0, \\
g_2(\bar{x}) &= \frac{4 x_2^2 - x_1 x_2}{12566\left(x_2 x_1^3 - x_1^4\right)} + \frac{1}{5108 x_1^2} - 1 \le 0, \\
g_3(\bar{x}) &= 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0, \\
g_4(\bar{x}) &= \frac{x_1 + x_2}{1.5} - 1 \le 0
\end{aligned}$$
with bounds
$$0.05 \le x_1 \le 2.00, \qquad 0.25 \le x_2 \le 1.30, \qquad 2.00 \le x_3 \le 15.0$$

5.3. Pressure Vessel Design (PVD) [52]

Pressure vessel design is a classic engineering optimization problem. The objective is to optimize the structural parameters of the pressure vessel to achieve cost minimization or performance optimization, while satisfying specific mechanical properties and safety standards. The design variables include shell thickness T s ( x 1 ) , head thickness T h ( x 2 ) , inner radius R ( x 3 ) , and shell length L ( x 4 ) . The constraints include thickness requirements, stress constraints, safety standard constraints, and geometric constraints. The established mathematical model is as follows:
Minimize
$$f(\bar{x}) = 1.7781 z_2 x_3^2 + 0.6224 z_1 x_3 x_4 + 3.1661 z_1^2 x_4 + 19.84 z_1^2 x_3$$
subject to
$$\begin{aligned}
g_1(\bar{x}) &= 0.00954 x_3 - z_2 \le 0, \\
g_2(\bar{x}) &= 0.0193 x_3 - z_1 \le 0, \\
g_3(\bar{x}) &= x_4 - 240 \le 0, \\
g_4(\bar{x}) &= -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0
\end{aligned}$$
where
$$z_1 = 0.0625 x_1, \qquad z_2 = 0.0625 x_2$$
with bounds
$$1 \le x_1, x_2 \le 99 \ \text{(integer variables)}, \qquad 10 \le x_3, x_4 \le 200$$
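Because x1 and x2 are integer multipliers of 0.0625 in, a continuous optimizer needs an integer-handling rule; rounding them inside the objective, as in the sketch below, is one common choice and an assumption of this illustration.

```python
import numpy as np

def pressure_vessel_penalized(x, penalty=1e6):
    """Penalized PVD objective with constraints g1-g4 (Section 5.3)."""
    x1, x2, x3, x4 = x
    z1 = 0.0625 * np.round(x1)      # shell thickness Ts from the integer variable x1
    z2 = 0.0625 * np.round(x2)      # head thickness Th from the integer variable x2
    f = (1.7781 * z2 * x3**2 + 0.6224 * z1 * x3 * x4
         + 3.1661 * z1**2 * x4 + 19.84 * z1**2 * x3)
    g = [
        0.00954 * x3 - z2,
        0.0193 * x3 - z1,
        x4 - 240.0,
        -np.pi * x3**2 * x4 - (4.0 / 3.0) * np.pi * x3**3 + 1296000.0,
    ]
    return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)
```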

5.4. Welded Beam Design (WBD) [52]

Welded beam design is a classic engineering optimization problem. The objective is to optimize the structural parameters of the welded beam to achieve the minimization of manufacturing costs or the optimization of performance, while satisfying specific mechanical properties and safety standards. The design variables include weld thickness h ( x 1 ) , length of the beam attached to the support l ( x 2 ) , height of the beam t ( x 3 ) , and thickness of the beam b ( x 4 ) . The constraints include shear stress constraint, bending stress constraint, buckling load constraint, end deflection constraint of the beam, and geometric constraints. The established mathematical model is as follows:
Minimize
$$f(\bar{x}) = 0.04811 x_3 x_4 (x_2 + 14) + 1.10471 x_1^2 x_2$$
subject to
$$\begin{aligned}
g_1(\bar{x}) &= x_1 - x_4 \le 0, \\
g_2(\bar{x}) &= \delta(\bar{x}) - \delta_{\max} \le 0, \\
g_3(\bar{x}) &= P - P_c(\bar{x}) \le 0, \\
g_4(\bar{x}) &= \tau(\bar{x}) - \tau_{\max} \le 0, \\
g_5(\bar{x}) &= \sigma(\bar{x}) - \sigma_{\max} \le 0
\end{aligned}$$
where
$$\begin{aligned}
\tau &= \sqrt{\tau'^2 + \tau''^2 + 2\tau'\tau''\frac{x_2}{2R}}, \qquad \tau' = \frac{P}{\sqrt{2}\, x_1 x_2}, \qquad \tau'' = \frac{MR}{J}, \qquad M = P\left(L + \frac{x_2}{2}\right), \\
R &= \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}, \qquad J = 2\left[\sqrt{2}\, x_1 x_2 \left(\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2\right)\right], \\
\sigma(\bar{x}) &= \frac{6PL}{x_4 x_3^2}, \qquad \delta(\bar{x}) = \frac{6PL^3}{E x_3^2 x_4}, \qquad P_c(\bar{x}) = \frac{4.013 E x_3 x_4^3}{6L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right), \\
L &= 14\ \text{in}, \quad P = 6000\ \text{lb}, \quad E = 30 \times 10^6\ \text{psi}, \quad \sigma_{\max} = 30{,}000\ \text{psi}, \\
\tau_{\max} &= 13{,}600\ \text{psi}, \quad G = 12 \times 10^6\ \text{psi}, \quad \delta_{\max} = 0.25\ \text{in}
\end{aligned}$$
with bounds
$$0.1 \le x_2, x_3 \le 10, \qquad 0.1 \le x_4 \le 2, \qquad 0.125 \le x_1 \le 2$$
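Since the model involves several intermediate quantities, the sketch below computes τ, σ, δ, and Pc exactly as defined above and folds the five constraints into the same static penalty scheme (the penalty factor is again an assumption of this sketch).

```python
import numpy as np

P, L, E, G = 6000.0, 14.0, 30e6, 12e6               # model constants
tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25

def welded_beam_penalized(x, penalty=1e6):
    """Penalized WBD objective with constraints g1-g5 (Section 5.4)."""
    x1, x2, x3, x4 = x
    f = 0.04811 * x3 * x4 * (x2 + 14.0) + 1.10471 * x1**2 * x2
    tau_p = P / (np.sqrt(2.0) * x1 * x2)             # tau'
    M = P * (L + x2 / 2.0)
    R = np.sqrt(x2**2 / 4.0 + ((x1 + x3) / 2.0)**2)
    J = 2.0 * np.sqrt(2.0) * x1 * x2 * (x2**2 / 4.0 + ((x1 + x3) / 2.0)**2)
    tau_pp = M * R / J                               # tau''
    tau = np.sqrt(tau_p**2 + tau_pp**2 + 2.0 * tau_p * tau_pp * x2 / (2.0 * R))
    sigma = 6.0 * P * L / (x4 * x3**2)
    delta = 6.0 * P * L**3 / (E * x3**2 * x4)
    Pc = (4.013 * E * x3 * x4**3 / (6.0 * L**2)
          * (1.0 - x3 / (2.0 * L) * np.sqrt(E / (4.0 * G))))
    g = [x1 - x4, delta - delta_max, P - Pc, tau - tau_max, sigma - sigma_max]
    return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)
```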
The MSHBA algorithm demonstrates strong convergence performance and stability when solving constrained engineering design problems, as evidenced by the results presented in Table 8, Table 9, Table 10 and Table 11 and Figure 32, Figure 33, Figure 34 and Figure 35. Additionally, it can be observed from Table 12, Table 13, Table 14 and Table 15 that the decision variables achieve optimal solutions when the objective function reaches its optimum. The algorithm is capable of reaching the optimal objective function value during the initial iterations and demonstrates excellent stability. In summary, the improved Honey Badger Algorithm demonstrates outstanding performance in solving constrained engineering design problems.

6. Conclusions and Future Works

This paper proposes a hybrid strategy to enhance the performance of the Honey Badger Optimization Algorithm. The proposed Multi-Strategy Honey Badger Algorithm (MSHBA) introduces a cubic chaotic map during population initialization to improve the diversity and exploration capability of the initial population. In the digging and honey-seeking stages of the HBA, the position update mechanism relies on the global best solution, which can cause premature convergence as the population clusters around the optimal individual; to improve the global exploration capability, a random search strategy is incorporated, which also enhances convergence speed and computational efficiency in the later iterations. Performance analysis confirms the superiority of MSHBA, as evidenced by its improved convergence rate and enhanced exploration–exploitation balance.
In subsequent studies, the proposed MSHBA algorithm will be further assessed for its effectiveness in addressing multi-objective, combinatorial, and real-world optimization problems with complex and uncertain search spaces. Furthermore, integrating hybrid strategies and parameter adaptation techniques into the traditional Honey Badger Algorithm (HBA) can enhance its capabilities in binary and multi-objective optimization. This approach aims to improve solution accuracy, achieve a better balance between exploration and exploitation, and accelerate global convergence, thereby enabling the algorithm to effectively solve a wide range of optimization problems.

Author Contributions

Conceptualization, D.G. and H.H.; methodology, D.G.; software, D.G.; validation, D.G. and H.H.; formal analysis, D.G.; investigation, D.G.; resources, D.G.; data curation, D.G.; writing—original draft preparation, D.G.; writing—review and editing, H.H.; visualization, D.G.; supervision, D.G.; project administration, D.G.; funding acquisition, H.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the 2024 Open Fund Project of the Guangxi Key Laboratory of Hybrid Computing and Integrated Circuit Design Analysis, grant number GXIC2402, which also funded the APC.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Houssein, E.H.; Helmy, B.E.-D.; Rezk, H.; Nassef, A.M. An enhanced archimedes optimization algorithm based on local escaping operator and orthogonal learning for PEM fuel cell parameter identification. Eng. Appl. Artif. Intell. 2021, 103, 104309. [Google Scholar] [CrossRef]
  2. Houssein, E.H.; Mahdy, M.A.; Fathy, A.; Rezk, H. A modified marine predator algorithm based on opposition based learning for tracking the global MPP of shaded PV system. Expert Syst. Appl. 2021, 183, 115253. [Google Scholar] [CrossRef]
  3. Spall, J.C. Introduction to Stochastic Search and Optimization; John Wiley and Sons: Hoboken, NJ, USA, 2003. [Google Scholar]
  4. Parejo, J.A.; Ruiz-Cortés, A.; Lozano, S.; Fernandez, P. Metaheuristic optimization frameworks: A survey and benchmarking. Soft Comput. 2012, 16, 527–561. [Google Scholar] [CrossRef]
  5. Hassan, M.H.; Houssein, E.H.; Mahdy, M.A.; Kamel, S. An improved manta ray foraging optimizer for cost-effective emission dispatch problems. Eng. Appl. Artif. Intell. 2021, 100, 104–155. [Google Scholar] [CrossRef]
  6. Ahmed, M.M.; Houssein, E.H.; Hassanien, A.E.; Taha, A.; Hassanien, E. Maximizing lifetime of large-scale wireless sensor networks using multi-objective whale optimization algorithm. Telecommun. Syst. 2019, 72, 243–259. [Google Scholar] [CrossRef]
  7. Houssein, E.H.; Helmy, B.E.-D.; Oliva, D.; Elngar, A.A.; Shaban, H. A novel black widow optimization algorithm for multilevel thresholding image segmentation. Expert Syst. Appl. 2021, 167, 114159. [Google Scholar] [CrossRef]
  8. Hussain, K.; Neggaz, N.; Zhu, W.; Houssein, E.H. An efficient hybrid sine-cosine harris hawks optimization for low and high-dimensional feature selection. Expert Syst. Appl. 2021, 176, 114778. [Google Scholar] [CrossRef]
  9. Neggaz, N.; Houssein, E.H.; Hussain, K. An efficient henry gas solubility optimization for feature selection. Expert Syst. Appl. 2020, 152, 113364. [Google Scholar] [CrossRef]
  10. Hassanien, A.E.; Kilany, M.; Houssein, E.H.; AlQaheri, H. Intelligent human emotion recognition based on elephant herding optimization tuned support vector regression. Biomed. Signal Process. Control 2018, 45, 182–191. [Google Scholar] [CrossRef]
  11. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. A modified henry gas solubility optimization for solving motif discovery problem. Neural Comput. Appl. 2019, 32, 10759–10771. [Google Scholar] [CrossRef]
  12. Sadollah, A.; Sayyaadi, H.; Yadav, A. A dynamic metaheuristic optimization model inspired by biological nervous systems: Neural network algorithm. Appl. Soft Comput. 2018, 71, 747–782. [Google Scholar] [CrossRef]
  13. Qin, F.; Zain, A.M.; Zhou, K.-Q. Harmony search algorithm and related variants: A systematic review. Swarm Evol. Comput. 2022, 74, 101126. [Google Scholar] [CrossRef]
  14. Abualigah, L.; Diabat, A.; Geem, Z.W. A Comprehensive Survey of the Harmony Search Algorithm in Clustering Applications. Appl. Sci. 2020, 10, 3827. [Google Scholar] [CrossRef]
  15. Rajeev, S.; Krishnamoorthy, C.S. Discrete optimization of structures using genetic algorithms. J. Struct. Eng. 1992, 118, 1233–1250. [Google Scholar] [CrossRef]
  16. Storn, R.; Price, K. Differential evolution–A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  17. Houssein, E.H.; Rezk, H.; Fathy, A.; Mahdy, M.A.; Nassef, A.M. A modified adaptive guided differential evolution algorithm applied to engineering applications. Eng. Appl. Artif. Intell. 2022, 113, 104920. [Google Scholar] [CrossRef]
  18. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667. [Google Scholar]
  19. Priya, R.D.; Sivaraj, R.; Anitha, N.; Devisurya, V. Tri-staged feature selection in multi-class heterogeneous datasets using memetic algorithm and cuckoo search optimization. Expert Syst. Appl. 2022, 209, 118286. [Google Scholar] [CrossRef]
  20. Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization-Artificial ants as a computational intelligence technique. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
  21. Zhao, D.; Lei, L.; Yu, F.; Heidari, A.A.; Wang, M.; Oliva, D.; Muhammad, K.; Chen, H. Ant colony optimization with horizontal and vertical crossover search: Fundamental visions for multi-threshold image segmentation. Expert. Syst. Appl. 2021, 167, 114122. [Google Scholar] [CrossRef]
  22. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  23. Ma, M.; Wu, J.; Shi, Y.; Yue, L.; Yang, C.; Chen, X. Chaotic Random Opposition-Based Learning and Cauchy Mutation Improved Moth-Flame Optimization Algorithm for Intelligent Route Planning of Multiple UAVs. IEEE Access 2022, 10, 49385–49397. [Google Scholar] [CrossRef]
  24. Yu, X.; Wu, X. Ensemble grey wolf Optimizer and its application for image segmentation. Expert Syst. Appl. 2022, 209, 118267. [Google Scholar] [CrossRef]
  25. Zhao, W.; Zhang, Z.; Wang, L. Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications. Eng. Appl. Artif. Intell. 2020, 87, 103300. [Google Scholar] [CrossRef]
  26. Zhao, W.; Wang, L.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  27. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf Mongoose Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  28. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958. [Google Scholar] [CrossRef]
  29. Houssein, E.H.; Saad, M.R.; Ali, A.A.; Shaban, H. An efficient multi-objective gorilla troops optimizer for minimizing energy consumption of large-scale wireless sensor networks. Expert Syst. Appl. 2023, 212, 118827. [Google Scholar] [CrossRef]
  30. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Dandelion Optimizer: A nature-inspired metaheuristic algorithm for engineering applications. Eng. Appl. Artif. Intell. 2022, 114, 105075. [Google Scholar] [CrossRef]
  31. Beşkirli, M. A novel Invasive Weed Optimization with levy flight for optimization problems: The case of forecasting energy demand. Energy Rep. 2022, 8 (Suppl. S1), 1102–1111. [Google Scholar] [CrossRef]
  32. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  33. Jiang, Q.; Wang, L.; Lin, Y.; Hei, X.; Yu, G.; Lu, X. An efficient multi-objective artificial raindrop algorithm and its application to dynamic optimization problems in chemical processes. Appl. Soft Comput. 2017, 58, 354–377. [Google Scholar] [CrossRef]
  34. Houssein, E.H.; Hosney, M.E.; Mohamed, W.M.; Ali, A.A.; Younis, E.M.G. Fuzzy-based hunger games search algorithm for global optimization and feature selection using medical data. Neural Comput. Appl. 2022, 35, 5251–5275. [Google Scholar] [CrossRef]
  35. Houssein, E.H.; Oliva, D.; Çelik, E.; Emam, M.M.; Ghoniem, R.M. Boosted sooty tern optimization algorithm for global optimization and feature selection. Expert Syst. Appl. 2023, 113, 119015. [Google Scholar] [CrossRef]
  36. Hu, G.; Du, B.; Wang, X.; Wei, G. An enhanced black widow optimization algorithm for feature selection. Knowl.-Based Syst. 2022, 235, 107638. [Google Scholar] [CrossRef]
  37. Abualigah, L.; Almotairi, K.H.; Elaziz, M.A. Multilevel thresholding image segmentation using meta-heuristic optimization algorithms: Comparative analysis, open challenges and new trends. Appl. Intell. 2022, 53, 11654–11704. [Google Scholar] [CrossRef]
  38. Houssein, E.H.; Hussain, K.; Abualigah, L.; Elaziz, M.A.; Alomoush, W.; Dhiman, G.; Djenouri, Y.; Cuevas, E. An improved opposition-based marine predators algorithm for global optimization and multilevel thresholding image segmentation. Knowl.-Based Syst. 2021, 229, 107348. [Google Scholar] [CrossRef]
  39. Sharma, P.; Dinkar, S.K. A Linearly Adaptive Sine–Cosine Algorithm with Application in Deep Neural Network for Feature Optimization in Arrhythmia Classification using ECG Signals. Knowl. Based Syst. 2022, 242, 108411. [Google Scholar] [CrossRef]
  40. Guo, Y.; Tian, X.; Fang, G.; Xu, Y.-P. Many-objective optimization with improved shuffled frog leaping algorithm for inter-basin water transfers. Adv. Water Resour. 2020, 138, 103531. [Google Scholar] [CrossRef]
  41. Das, P.K.; Behera, H.S.; Panigrahi, B.K. A hybridization of an improved particle swarm optimization and gravitational search algorithm for multi-robot path planning. Swarm Evol. Comput. 2016, 28, 14–28. [Google Scholar] [CrossRef]
  42. Yu, X.; Jiang, N.; Wang, X.; Li, M. A hybrid algorithm based on grey wolf optimizer and differential evolution for UAV path planning. Expert Syst. Appl. 2022, 215, 119327. [Google Scholar] [CrossRef]
  43. Gao, D.; Wang, G.-G.; Pedrycz, W. Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism. IEEE Trans. Fuzzy Syst. 2020, 28, 3265–3275. [Google Scholar] [CrossRef]
  44. Dong, Z.R.; Bian, X.Y.; Zhao, S. Ship pipe route design using improved multi-objective ant colony optimization. Ocean. Eng. 2022, 258, 111789. [Google Scholar] [CrossRef]
  45. Hu, G.; Wang, J.; Li, M.; Hussien, A.G.; Abbas, M. EJS: Multi-strategy enhanced jellyfish search algorithm for engineering applications. Mathematics 2023, 11, 851. [Google Scholar] [CrossRef]
  46. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  47. Yazdani, M.; Jolai, F. Lion optimization algorithm (LOA): A nature-inspired metaheuristic algorithm. J. Comput. Des. Eng. 2016, 3, 24–36. [Google Scholar] [CrossRef]
  48. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
  49. Lochert, C.; Scheuermann, B.; Mauve, M. A Survey on congestion control for mobile ad-hoc networks. Wiley Wirel. Commun. Mob. Comput. 2007, 7, 655–676. [Google Scholar] [CrossRef]
  50. Zheng, J.F.; Zhan, H.W.; Huang, W.; Zhang, H.; Wu, Z.X. Development of Levy Flight and Its Application in Intelligent Optimization Algorithm. Comput. Sci. 2021, 48, 190–206. [Google Scholar]
  51. Pant, M.; Thangaraj, R.; Singh, V. Optimization of mechanical design problems using improved differential evolution algorithm. Int. J. Recent Trends Eng. 2009, 1, 21. [Google Scholar]
  52. He, X.; Zhou, Y. Enhancing the performance of differential evolution with covariance matrix self-adaptation. Appl. Soft Comput. 2018, 64, 227–243. [Google Scholar] [CrossRef]
  53. Yahi, A.; Bekkouche, T.; Daachi, M.E.H.; Diffellah, N. A color image encryption scheme based on 1D cubic map. Optik 2022, 249, 168290. [Google Scholar] [CrossRef]
  54. He, S.; Xia, X. Random perturbation subsampling for rank regression with massive data. Stat. Sci. 2024, 35, 13–28. [Google Scholar] [CrossRef]
  55. Ting, H.; Yong, C.; Peng, C. Improved Honey Badger Algorithm Based on Elite Tangent Search and Differential Mutation with Applications in Fault Diagnosis. Processes 2025, 13, 256. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the MSHBA algorithm.
Figure 2. Boxplot Comparison of MSHBA with Other Algorithms.
Figure 3. Convergence Performance Comparison of CEC2017-F1 Test Function.
Figure 4. Convergence Performance Comparison of CEC2017-F3 Test Function.
Figure 5. Convergence Performance Comparison of CEC2017-F4 Test Function.
Figure 6. Convergence Performance Comparison of CEC2017-F5 Test Function.
Figure 7. Convergence Performance Comparison of CEC2017-F6 Test Function.
Figure 8. Convergence Performance Comparison of CEC2017-F7 Test Function.
Figure 9. Convergence Performance Comparison of CEC2017-F8 Test Function.
Figure 10. Convergence Performance Comparison of CEC2017-F9 Test Function.
Figure 11. Convergence Performance Comparison of CEC2017-F10 Test Function.
Figure 12. Convergence Performance Comparison of CEC2017-F11 Test Function.
Figure 13. Convergence Performance Comparison of CEC2017-F12 Test Function.
Figure 14. Convergence Performance Comparison of CEC2017-F13 Test Function.
Figure 15. Convergence Performance Comparison of CEC2017-F14 Test Function.
Figure 16. Convergence Performance Comparison of CEC2017-F15 Test Function.
Figure 17. Convergence Performance Comparison of CEC2017-F16 Test Function.
Figure 18. Convergence Performance Comparison of CEC2017-F17 Test Function.
Figure 19. Convergence Performance Comparison of CEC2017-F18 Test Function.
Figure 20. Convergence Performance Comparison of CEC2017-F19 Test Function.
Figure 21. Convergence Performance Comparison of CEC2017-F20 Test Function.
Figure 22. Convergence Performance Comparison of CEC2017-F21 Test Function.
Figure 23. Convergence Performance Comparison of CEC2017-F22 Test Function.
Figure 24. Convergence Performance Comparison of CEC2017-F23 Test Function.
Figure 25. Convergence Performance Comparison of CEC2017-F24 Test Function.
Figure 26. Convergence Performance Comparison of CEC2017-F25 Test Function.
Figure 27. Convergence Performance Comparison of CEC2017-F26 Test Function.
Figure 28. Convergence Performance Comparison of CEC2017-F27 Test Function.
Figure 29. Convergence Performance Comparison of CEC2017-F28 Test Function.
Figure 30. Convergence Performance Comparison of CEC2017-F29 Test Function.
Figure 31. Convergence Performance Comparison of CEC2017-F30 Test Function.
Figure 32. Convergence curve of problem 1.
Figure 33. Convergence curve of problem 2.
Figure 34. Convergence curve of problem 3.
Figure 35. Convergence curve of problem 4.
Table 1. Summary of the CEC 2017 test functions.

| Category | No. | Function | F* = F(x*) |
|---|---|---|---|
| Unimodal Functions | 1 | Shifted and Rotated Bent Cigar Function | 100 |
| | 3 | Shifted and Rotated Zakharov Function | 300 |
| Simple Multimodal Functions | 4 | Shifted and Rotated Rosenbrock's Function | 400 |
| | 5 | Shifted and Rotated Rastrigin's Function | 500 |
| | 6 | Shifted and Rotated Expanded Scaffer's F6 Function | 600 |
| | 7 | Shifted and Rotated Lunacek Bi_Rastrigin Function | 700 |
| | 8 | Shifted and Rotated Non-Continuous Rastrigin's Function | 800 |
| | 9 | Shifted and Rotated Levy Function | 900 |
| | 10 | Shifted and Rotated Schwefel's Function | 1000 |
| Hybrid Functions | 11 | Hybrid Function 1 (N = 3) | 1100 |
| | 12 | Hybrid Function 2 (N = 3) | 1200 |
| | 13 | Hybrid Function 3 (N = 3) | 1300 |
| | 14 | Hybrid Function 4 (N = 4) | 1400 |
| | 15 | Hybrid Function 5 (N = 4) | 1500 |
| | 16 | Hybrid Function 6 (N = 4) | 1600 |
| | 17 | Hybrid Function 6 (N = 5) | 1700 |
| | 18 | Hybrid Function 6 (N = 5) | 1800 |
| | 19 | Hybrid Function 6 (N = 5) | 1900 |
| | 20 | Hybrid Function 6 (N = 6) | 2000 |
| Composition Functions | 21 | Composition Function 1 (N = 3) | 2100 |
| | 22 | Composition Function 2 (N = 3) | 2200 |
| | 23 | Composition Function 3 (N = 4) | 2300 |
| | 24 | Composition Function 4 (N = 4) | 2400 |
| | 25 | Composition Function 5 (N = 5) | 2500 |
| | 26 | Composition Function 6 (N = 5) | 2600 |
| | 27 | Composition Function 7 (N = 6) | 2700 |
| | 28 | Composition Function 8 (N = 6) | 2800 |
| | 29 | Composition Function 9 (N = 3) | 2900 |
| | 30 | Composition Function 10 (N = 3) | 3000 |

Search range: [−100, 100]^D.
Table 2. Parameter settings of MSHBA and the selected algorithms.

| Algorithm | Parameters |
|---|---|
| COA | coati number = 30; t_max = 500 |
| HHO | harris hawk number = 30; t_max = 500 |
| DBO | dung beetle number = 30; t_max = 500 |
| OOA | osprey number = 30; t_max = 500 |
| HBA | honey badger number = 30; t_max = 500; β = 6; C = 2 |
| MSHBA | honey badger number = 30; t_max = 500; β = 6; C = 2 |
Table 3. Unimodal functions.

| Function | Index | MSHBA | HBA | COA | DBO | OOA | HHO |
|---|---|---|---|---|---|---|---|
| F1 | min | 36,332.21327 | 75,683.73185 | 44,680,368,018 | 24,217,484,929 | 42,002,907,856 | 100,012,066.4 |
| | std | 16,173,366.95 | 168,770,925.9 | 7,033,529,135 | 4,766,539,484 | 8,294,013,405 | 345,955,374.9 |
| | avg | 4,000,756.794 | 33,619,961.06 | 59,731,161,880 | 32,021,958,760 | 56,223,524,199 | 439,575,119.5 |
| | median | 601,797.5757 | 675,483.5145 | 59,521,568,792 | 31,651,854,438 | 55,292,058,030 | 359,305,702.1 |
| | worst | 89,307,263.31 | 925,953,472.7 | 70,695,016,541 | 45,351,521,182 | 69,887,934,634 | 1,665,291,319 |
| F3 | min | 8168.289781 | 19,637.28281 | 79,071.67443 | 66,751.90092 | 73,028.09308 | 29,344.23558 |
| | std | 4174.231446 | 7984.245853 | 5087.478975 | 7426.879035 | 8097.010339 | 7765.054029 |
| | avg | 15,583.03135 | 39,438.34153 | 90,069.39714 | 83,047.78088 | 92,758.2175 | 56,224.97662 |
| | median | 15,452.63833 | 39,285.10263 | 91,008.4233 | 84,055.46423 | 92,718.38541 | 57,878.3185 |
| | worst | 25,626.58143 | 53,569.16705 | 102,566.6383 | 97,344.63217 | 106,866.2551 | 65,724.612 |
Table 4. Simple multimodal functions.

Function | Index | MSHBA | HBA | COA | DBO | OOA | HHO
F4min471.2650759429.86531348676.5404382280.1883988831.397677575.0612356
F4std25.3396121843.354798883016.5033862310.151022999.80766492.66355947
F4avg515.5658011520.635770815,449.901377926.72670815,696.97266696.2467487
F4median515.2685906517.303681615,432.697297380.83614416,168.57155686.963974
F4worse577.3558248655.37461820,979.7553713,948.772921,098.8796988.4041393
F5min539.0543882579.7459582876.9745056800.7163328867.1474351728.1264447
F5std18.2079347229.6426440525.7620759425.1975321529.8776523231.15405432
F5avg574.301356632.031401924.9641367847.6968692932.0657073771.1397853
F5median572.9181152627.5478741924.0502838847.4039296940.4132239761.3208434
F5worse604.1481834682.386896973.0416522898.4303847973.4967331838.8982555
F6min601.0876771611.1461317665.2970351662.3393043672.4335121651.8707445
F6std1.7815964678.8784233747.2812242775.152571257.9092862486.135355419
F6avg603.6710611624.3606534689.514728672.996196686.6871553668.0205421
F6median603.3518742622.8009585690.0552519673.1831749686.0670717669.1685594
F6worse609.0684128643.7839895701.1380974688.384032700.7970491678.3770071
F7min768.6980104833.55506221293.0066291152.280461315.4490881094.740061
F7std28.6680736152.4989168952.5448379449.9592983451.0102746570.4876899
F7avg830.1325411918.17734281415.3369241235.8768651424.7877521296.86546
F7median825.2929227910.14631551423.4101951241.3280181431.4218631299.322915
F7worse890.78283721077.2438931490.8732071366.6846451521.8180791446.083503
F8min831.0104332858.90791831095.554037999.93130481100.350225893.4379959
F8std19.9077406323.6697524929.2969212728.4758963725.3458360429.25608386
F8avg870.4349257911.71214881149.8643641068.8827811142.133401978.6668202
F8median869.1223147914.02181591160.0785011071.8071611142.377033981.8118432
F8worse919.1643449944.77394961205.9768091129.4824331198.863561051.396815
F9min959.60801771423.4338416152.584145931.1183747591.5094476594.499632
F9std395.29170371300.9224091812.5902381337.3512721321.5661091082.513028
F9avg1378.2696593472.9488110,986.976988787.75361410,324.264378567.879885
F9median1309.0635023491.84309711,539.116998729.92330210,502.019878495.984721
F9worse2725.4324747361.93742213,698.3891811,540.9527912,481.8478311,199.1604
F10min4458.5876043460.7827877955.3268925450.4888977973.1991514769.375371
F10std1173.509975797.6259678404.9918699853.6256714387.2710695702.6015629
F10avg6400.1854645481.7357068748.5988568428.4488718846.0517366105.540435
F10median6220.9041275403.0672578732.7146258671.4134018974.0890126014.950217
F10worse9396.6403066946.5206199410.2823499321.1639979490.9140077373.924473
Table 5. Hybrid functions.

Function | Index | MSHBA | HBA | COA | DBO | OOA | HHO
F11min1151.1991561216.4234056495.475193090.3598455484.640631373.055653
F11std57.8355743875.425731481587.6263071310.3266282394.807543244.8888555
F11avg1262.3254041330.4534778809.9160415676.57589412.8568731623.867058
F11median1260.7350781309.6046598710.0042385763.6944089777.2358611560.535589
F11worse1423.2297041503.40043311763.676898596.09783513,794.184232627.061678
F12min74,900.7336354,677.465788,004,724,178789,307,016.78,182,408,48213,090,571.92
F12std1,345,797.8021,313,184.5773,913,546,2232,659,696,5513,045,040,81652,989,396.93
F12avg1,099,451.9741,568,902.39214,865,255,2556,982,785,08213,644,525,71584,803,095.81
F12median442,557.83681,176,808.6915,136,388,3507,476,784,29113,123,772,48068,615,743.72
F12worse5,792,037.1874,574,308.94620,744,229,98811,789,738,55020,395,893,476210,539,888.9
F13min4539.37186150.1656781,489,333,852397,260,5391,817,805,403286,183.994
F13std25,110.96118824,912.72054,505,373,7612,632,621,2635,200,050,627813,240.6509
F13avg32,644.09208,033.339910,597,566,2493,442,195,4669,670,358,6131,066,786.477
F13median19,401.2495445,344.8684110,172,352,0173,644,524,9548,417,691,801836,432.0973
F13worse72,977.09814,566,519.74324,141,559,32010,745,888,75020,809,917,3884,074,357.755
F14min1711.8316413326.949485112,724.079538,819.47772231,396.153147,760.74738
F14std5936.70015232,243.530522,155,628.537825,536.15059,825,484.425885,781.4279
F14avg6989.51810733,974.406062,191,192.564798,810.25277,100,756.591873,504.5218
F14median4602.07678519,658.032541,616,476.92627,218.52443,353,309.042514,502.6535
F14worse28,745.95484123,532.407711,557,981.023,576,983.46545,832,920.43,472,437.752
F15min2250.4462812959.7197955,121,496.847280,652.726955,165,095.9524,165.02678
F15std13,856.9489523,486.89054471,974,341.236,434,849.5561,614,606.549,771.67436
F15avg13,675.2831618,354.87093611,253,276.512,629,448.73690,445,609.7118,023.198
F15median5277.72809411,497.20992470,200,572.13,714,150.257456,713,103.9117,371.5719
F15worse43,960.42823121,903.62221,550,944,089203,183,4152,071,388,517253,191.8004
F16min2012.8479142080.6784154374.8774983423.9425144651.211373055.309339
F16std216.1472784286.7054735897.7786711365.81964071070.803158473.6407211
F16avg2386.5390972605.070655770.4669524112.252275881.9634043731.242326
F16median2421.919852585.417215780.020714072.1271745587.3761563654.007111
F16worse2859.6374343122.2626917372.0694954816.4399988935.9974575239.164841
F17min1742.3721391852.494372180.1869232443.3628492652.7816952163.695568
F17std173.9290205224.63737342928.131984270.87836345932.261031297.7302024
F17avg2070.2363172392.4847634779.3933822881.0514476107.2959082808.588621
F17median2043.8692382408.6859953782.0664172864.4868194961.4770452848.983717
F17worse2393.8955272833.38468517,991.165683314.36019834,438.946043373.572517
F18min20,702.7536731,131.697713,827,698.722753,574.3712705,790.5116126,809.6891
F18std216,050.89492,160,010.37138,061,181.035,430,566.94866,536,117.465,914,002.752
F18avg229,876.8588729,865.156742,928,859.66,213,346.65559,235,659.853,859,932.405
F18median163,273.2064238,578.337631,545,216.575,021,293.50938,632,192.561,369,412.165
F18worse944,167.691112,062,231.49162,754,395.124,131,433.9239,549,282.822,515,593.58
F19min2101.480232313.36281615,734,537.6824,800,002.0216,199,347.56106,105.1687
F19std14,150.4331319,935.19608535,984,424.3146,049,396400,151,587.51,256,830.774
F19avg12,701.8005818,755.67073550,228,302.6188,822,826.3583,411,721.31,485,477.19
F19median6789.8860636281.857242473,418,261.1162,652,180.2538,321,091.81,268,606.797
F19worse54,393.066456,697.177632,215,792,102813,804,4231,523,573,9226,134,312.848
F20min2184.8363642190.0381312669.0096952474.7252652737.7733962498.676162
F20std322.9574545255.6539361202.9663686186.1973247160.9241292214.2404172
F20avg2734.5547722629.8826763035.7710442818.5676683022.7430162840.438917
F20median2734.9545732613.0206213052.3046522857.1094623007.5003882863.102686
F20worse3375.4340613284.2371143347.4765993183.6888423397.7534913187.744902
Table 6. Composition functions.

Function | Index | MSHBA | HBA | COA | DBO | OOA | HHO
F21min2328.3747242361.6066552674.2555472298.816062645.3059012474.986844
F21std21.5456247738.0702635150.71681329100.646883852.2032774445.93516289
F21avg2361.6213562416.9341752756.9239062567.7873012724.2658942571.424771
F21median2357.6664692406.3451482746.034192589.1631522712.1866172576.02751
F21worse2414.8919532522.4465492874.99412686.1066012824.3961752680.325844
F22min2303.0112912303.8036687394.1264954952.5635346418.4410632476.604061
F22std2744.9844392699.275571931.1823853896.2348088942.41426361299.944318
F22avg5774.5713294396.51469541.6989296635.5782359424.1010497201.58516
F22median6346.9381722320.4667449879.1630076547.4120319661.9005117380.529293
F22worse10,872.1274610,131.1990210,808.016329530.78460510,457.535539091.500067
F23min2692.7016862716.749753341.9418973011.2745423417.2939072931.615137
F23std28.03445356108.1414638129.691958893.6055587176.4835198125.9967206
F23avg2725.1414622838.06163582.4341613151.8759583711.9980193224.611813
F23median2720.6025882815.5662273615.8769053150.7237363725.8848023241.020759
F23worse2799.6364233335.5415873865.7070133410.1748954027.5144633500.775722
F24min2854.034412898.5704283426.1044233221.320363653.2314633239.149426
F24std22.94711896182.4346651168.463600896.51085244238.7841852143.9425178
F24avg2895.8464853068.6652293811.6814413378.4312254082.8775263486.40142
F24median2894.7731752992.3075823803.2275573388.2834714042.7606743511.12618
F24worse2936.8303323561.446844170.9484343584.0097644569.0613773869.309011
F25min2888.03392889.8448674161.1533323749.6122394145.774822954.389479
F25std21.7783340319.55326434471.1370306294.308607546.96120731.74155552
F25avg2916.0415872920.0088985155.4401524240.8628755036.0068643011.140246
F25median2913.7829992921.4190635091.0192664245.5028214923.7892883011.853877
F25worse2977.4656592956.2929026111.9270674807.5831376440.7208253090.130785
F26min3868.106952822.4111289865.29356691.2167549515.4746644212.221834
F26std261.05558571040.547875864.1190973788.44142581171.6590091197.068098
F26avg4412.0296825011.39151511,468.693858328.35663211,584.749088015.494868
F26median4401.1373875037.25662711,510.989058549.82689311,562.827418317.378243
F26worse4954.5404896919.70349313,334.837689603.89665614,288.208739897.642704
F27min3203.5966863217.647193764.2987233397.7020453735.005743276.890401
F27std375.1604378173.3180073501.2971062150.2258588534.7853183205.0072178
F27avg3418.4626133342.3897014493.8578693669.406384841.3368333535.701627
F27median3304.9111693307.4595544424.1958153657.4606424843.9634823487.872732
F27worse4809.3360714068.6682995725.689923956.1276876093.6583524273.873259
F28min3207.5376473215.5391116219.8296554759.8581616276.8026583347.569615
F28std964.106931235.93018009596.6879353380.8594573614.042427768.96732179
F28avg3619.8892743282.6149297422.9444365542.1636897385.340413474.906644
F28median3273.3872113286.347067541.3521545660.782897489.3626393461.763521
F28worse6912.5901733382.6119698724.4154826185.4949388414.7458513666.251144
F29min3494.3226583467.1022385827.1799134377.031915186.8160564153.204096
F29std390.6260893287.90240781235.80597360.35937881843.805616395.5045007
F29avg4024.0481574096.3879977398.8692175018.8767528150.9404295056.387627
F29median3975.5678174111.5543427068.0209384970.1594967617.9725695160.819057
F29worse4891.9426314820.40462311,120.623935660.5664212,418.381645796.888337
F30min6596.265677878.446768568,099,420.922,276,646.83262,230,934.42,198,313.919
F30std6851.45347688,005.28982903,137,928.5289,290,425.31,130,305,0039,307,191.494
F30avg17,546.1297572,406.147811,713,122,036214,934,810.11,920,168,27311,292,240.97
F30median16,024.0003230,456.115831,407,603,022143,764,437.11,586,940,28410,002,707.62
F30worse42,228.44695386,843.32974,408,681,6251,646,995,8344,855,238,13041,377,423.53
Table 7. Rank-sum test p-values (MSHBA compared with each competitor algorithm).

Function | HBA | COA | DBO | OOA | HHO
F1 | 0.662734758 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F3 | 4.97517 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F4 | 0.589451169 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.68973 × 10−11
F5 | 1.07018 × 10−9 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F6 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F7 | 1.85673 × 10−9 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F8 | 9.83289 × 10−8 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 4.97517 × 10−11
F9 | 3.15889 × 10−10 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F10 | 0.001679756 | 8.48477 × 10−9 | 2.19589 × 10−7 | 5.96731 × 10−9 | 0.539510317
F11 | 0.000268057 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 4.97517 × 10−11
F12 | 0.072445596 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F13 | 0.027086318 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F14 | 9.06321 × 10−8 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F15 | 0.129670225 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 1.09367 × 10−10
F16 | 0.003033948 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F17 | 1.02773 × 10−6 | 7.38908 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 1.46431 × 10−10
F18 | 0.065671258 | 3.01986 × 10−11 | 4.07716 × 10−11 | 3.68973 × 10−11 | 3.19674 × 10−9
F19 | 0.239849991 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F20 | 0.180899533 | 0.000253058 | 0.162375022 | 0.00033679 | 0.122352926
F21 | 7.77255 × 10−9 | 3.01986 × 10−11 | 1.54652 × 10−9 | 3.01986 × 10−11 | 3.01986 × 10−11
F22 | 0.185766856 | 5.0922 × 10−8 | 0.529782491 | 1.35943 × 10−7 | 0.023243447
F23 | 1.41098 × 10−9 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F24 | 1.41098 × 10−9 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
F25 | 0.340288465 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 4.97517 × 10−11
F26 | 0.000268057 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 4.19968 × 10−10
F27 | 0.641423523 | 1.69472 × 10−9 | 7.69496 × 10−8 | 4.19968 × 10−10 | 4.63897 × 10−5
F28 | 0.8766349 | 8.99341 × 10−11 | 2.60151 × 10−8 | 8.99341 × 10−11 | 1.72903 × 10−6
F29 | 0.206205487 | 3.01986 × 10−11 | 9.7555 × 10−10 | 3.01986 × 10−11 | 1.41098 × 10−9
F30 | 2.95898 × 10−5 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11 | 3.01986 × 10−11
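The p-values in Table 7 are consistent with a two-sided Wilcoxon rank-sum test applied to 30 independent runs per algorithm; the recurring 3.01986 × 10−11 is the value such a test (with continuity correction, as in MATLAB's ranksum) reports when the two 30-run samples do not overlap at all. A minimal sketch of one such comparison, with synthetic placeholder data standing in for the per-run final fitness values:

import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Placeholder samples; in the paper each array would hold the 30 final
# fitness values of one algorithm on one CEC2017 function.
mshba_runs = rng.normal(loc=574.3, scale=18.2, size=30)
hba_runs = rng.normal(loc=632.0, scale=29.6, size=30)

stat, p = ranksums(mshba_runs, hba_runs)  # two-sided by default
print(f"z = {stat:.3f}, p = {p:.3g}")
# SciPy's ranksums omits the continuity correction, so its p-values can
# differ marginally from MATLAB's ranksum; p < 0.05 marks a significant
# difference between the two algorithms on that function.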
Table 8. Statistical results of the competitor algorithms for the WMSR problem.

Engineering 1 | Index | MSHBA | COA | DBO | OOA | HHO | HBA
F1 | min | 2994.424466 | 2994.424466 | 2994.424467 | 3005.476722 | 2994.424466 | 2994.424466
F1 | std | 2.848480566 | 5.169786894 | 5.09751196 | 5.028371006 | 2.368455165 | 2.54734021
F1 | avg | 2995.358009 | 3056.1574 | 3034.296393 | 3012.829612 | 2995.046823 | 2995.376420
F1 | median | 2994.424466 | 3038.368266 | 3033.701596 | 3011.346302 | 2994.424466 | 2994.424466
F1 | worse | 3003.75982 | 3188.264544 | 3188.264544 | 3025.967663 | 3003.75982 | 3002.57892
Table 9. Statistical results of the competitor algorithms for the TCSD problem.

Engineering 2 | Index | MSHBA | COA | DBO | OOA | HHO | HBA
F2 | min | 0.01266534 | 0.012676941 | 0.012669641 | 0.012669581 | 0.012667402 | 0.01266421
F2 | std | 0.00127634 | 0.00217847 | 0.000982581 | 0.000156243 | 0.001532633 | 0.001254356
F2 | avg | 0.013211802 | 0.014403461 | 0.013158595 | 0.012834463 | 0.013415017 | 0.013321041
F2 | median | 0.012719054 | 0.01312391 | 0.012843473 | 0.012763832 | 0.012719054 | 0.012718236
F2 | worse | 0.017773158 | 0.018026697 | 0.017773158 | 0.013347791 | 0.017773158 | 0.017764532
Table 10. Statistical results of the competitor algorithms for the PVD problem.

Engineering 3 | Index | MSHBA | COA | DBO | OOA | HHO | HBA
F3 | min | 6059.714335 | 6059.71435 | 6059.714552 | 6060.37146 | 6059.714335 | 6061.823546
F3 | std | 512.5757289 | 579.9510833 | 316.2129992 | 374.7160367 | 540.7771321 | 524.34210589
F3 | avg | 6513.097363 | 6594.239536 | 6451.562474 | 6238.452261 | 6688.493798 | 6534.0238912
F3 | median | 6370.779717 | 6338.958449 | 6410.086778 | 6065.4648 | 6728.854785 | 6380.2384513
F3 | worse | 7544.492518 | 7544.492518 | 7332.843509 | 7425.013265 | 7544.492518 | 7678.5642312
Table 11. Statistical results of the competitor algorithms for the WBD problem.

Engineering 4 | Index | MSHBA | COA | DBO | OOA | HHO | HBA
F4 | min | 1.670217919 | 1.672242773 | 1.670218795 | 1.6717035 | 1.670218056 | 1.670217919
F4 | std | 0.142683524 | 0.113967612 | 0.084676517 | 0.002774216 | 0.239656288 | 0.142683524
F4 | avg | 1.787526474 | 1.789799218 | 1.71024892 | 1.67635062 | 1.81837122 | 1.787526474
F4 | median | 1.72404532 | 1.771026028 | 1.670359559 | 1.676323843 | 1.724064807 | 1.72404532
F4 | worse | 2.183640071 | 2.282913448 | 1.979411719 | 1.682554819 | 2.679760746 | 2.183640071
Table 12. Best design variables obtained by each competitor algorithm for the WMSR problem.

Algorithm | X1 | X2 | X3 | X4 | X5 | X6 | X7
MSHBA | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.35054 | 5.28665
COA | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.35054 | 5.28665
DBO | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.35054 | 5.28665
OOA | 3.5 | 0.7 | 17 | 7.37735 | 7.93909 | 3.35091 | 5.28803
HHO | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.35054 | 5.28665
HBA | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.35054 | 5.28665
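As a consistency check between Tables 8 and 12, the speed-reducer weight can be evaluated directly. Coefficient conventions for this objective differ slightly across the literature; the variant sketched below (our choice, using the 7.477 coefficient) reproduces the shared best value of Table 8 to within rounding of the reported variables.

def wmsr_weight(x):
    # Speed-reducer (WMSR) objective, one common literature formulation.
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6**2 + x7**2)
            + 7.477 * (x6**3 + x7**3)
            + 0.7854 * (x4 * x6**2 + x5 * x7**2))

# Design reported for MSHBA (and most competitors) in Table 12:
print(wmsr_weight((3.5, 0.7, 17, 7.3, 7.71532, 3.35054, 5.28665)))
# ≈ 2994.42, matching the best value in Table 8.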
Table 13. Best design variables obtained by each competitor algorithm for the TCSD problem.

Algorithm | X1 | X2 | X3
MSHBA | 0.05 | 0.317425 | 14.0278
COA | 0.0526386 | 0.379993 | 10.0444
DBO | 0.0514621 | 0.351283 | 11.6148
OOA | 0.0505182 | 0.329028 | 13.1295
HHO | 0.0519519 | 0.363074 | 10.9258
HBA | 0.05 | 0.317425 | 14.0278
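For orientation, the standard TCSD objective in the literature is the spring weight

f(x) = (x3 + 2) x2 x1^2,

where x1 is the wire diameter, x2 the mean coil diameter, and x3 the number of active coils. Evaluating it at the MSHBA design in Table 13 gives (14.0278 + 2) × 0.317425 × 0.05^2 ≈ 0.012719, in line with the MSHBA median reported in Table 9.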
Table 14. Best design variables obtained by each competitor algorithm for the PVD problem.

Algorithm | X1 | X2 | X3 | X4
MSHBA | 12.84797 | 6.88227 | 42.09382 | 176.89271
COA | 12.58581 | 7.04886 | 41.63912 | 182.41281
DBO | 13.20307 | 6.57646 | 40.66877 | 195.19762
OOA | 15.07952 | 7.73834 | 48.57513 | 110.06771
HHO | 14.88946 | 8.10283 | 48.57513 | 110.06775
HBA | 13.43681 | 6.95521 | 42.08866 | 177.10523
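The PVD objective most often used in the literature is the fabrication cost

f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3^2 + 3.1661 x1^2 x4 + 19.84 x1^2 x3,

where x1 and x2 are the shell and head thicknesses, x3 the inner radius, and x4 the length of the cylindrical section. Variable scalings for the two thicknesses (often encoded as multiples of 0.0625 in) differ between papers, so this formulation is offered only as orientation for reading Table 14, not as the authors' exact model.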
Table 15. Best design variables obtained by each competitor algorithm for the WBD problem.

Algorithm | X1 | X2 | X3 | X4
MSHBA | 0.19883 | 3.33741 | 9.19213 | 0.19883
COA | 0.19795 | 3.36312 | 9.19134 | 0.19891
DBO | 0.19775 | 3.35981 | 9.19012 | 0.19914
OOA | 0.19563 | 3.48621 | 9.20923 | 0.19947
HHO | 0.19932 | 3.44251 | 9.15293 | 0.20123
HBA | 0.19673 | 3.37823 | 9.19134 | 0.19883
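The standard WBD objective is the fabrication cost

f(x) = 1.10471 x1^2 x2 + 0.04811 x3 x4 (14.0 + x2),

where x1 is the weld thickness, x2 the weld length, x3 the bar height, and x4 the bar thickness. Plugging in the MSHBA design from Table 15 gives approximately 1.6703, matching the best value in Table 11 and confirming that the two tables describe the same solution.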
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
