
A Particle Swarm Optimization-Guided Ivy Algorithm for Global Optimization Problems

1. School of Computer Science, Hubei University of Technology, Wuhan 430068, China
2. School of Computer Science and Technology, Taiyuan Normal University, Taiyuan 030619, China
3. College of Mechanical Engineering, Chongqing University of Technology, Chongqing 400054, China
4. Department of Engineering Science and Mechanics, Shibaura Institute of Technology, 3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan
5. Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, Chongqing 400714, China
* Authors to whom correspondence should be addressed.
Biomimetics 2025, 10(5), 342; https://doi.org/10.3390/biomimetics10050342
Submission received: 21 April 2025 / Revised: 16 May 2025 / Accepted: 17 May 2025 / Published: 21 May 2025

Abstract:
In recent years, metaheuristic algorithms have garnered significant attention for their efficiency in solving complex optimization problems. However, their performance critically depends on maintaining a balance between global exploration and local exploitation; a deficiency in either can result in premature convergence to local optima or low convergence efficiency. To address this challenge, this paper proposes an enhanced ivy algorithm guided by a particle swarm optimization (PSO) mechanism, referred to as IVYPSO. This hybrid approach integrates PSO’s velocity update strategy for global searches with the ivy algorithm’s growth strategy for local exploitation and introduces an ivy-inspired variable to intensify random perturbations. These enhancements collectively improve the algorithm’s ability to escape local optima and enhance the search stability. Furthermore, IVYPSO adaptively selects between local growth and global diffusion strategies based on the fitness difference between the current solution and the global best, thereby improving the solution diversity and convergence accuracy. To assess the effectiveness of IVYPSO, comprehensive experiments were conducted on 26 standard benchmark functions and three real-world engineering optimization problems, with the performance compared against 11 state-of-the-art intelligent optimization algorithms. The results demonstrate that IVYPSO outperformed most competing algorithms on the majority of benchmark functions, exhibiting superior search capability and robustness. In the stability analysis, IVYPSO consistently achieved the global optimum across multiple runs on the three engineering cases with reduced computational time, attaining a 100% success rate (SR), which highlights its strong global optimization ability and excellent repeatability.

1. Introduction

Metaheuristic algorithms have become vital tools for addressing complex optimization problems and are widely applied in fields such as engineering design [1,2], machine learning [3,4], and control systems [5,6]. Compared to traditional mathematical optimization techniques, such as gradient-based techniques [7], metaheuristics offer several advantages; they do not require gradient information, exhibit low sensitivity to initial conditions, and are well-suited for high-dimensional or multimodal problems [8,9]. Their flexibility in navigating the solution space and capability to avoid entrapment in local optima have made them a central focus in computational intelligence research [10,11].
In recent years, hybridization techniques have been increasingly employed to enhance the performance of metaheuristics by integrating complementary strategies from different algorithms, thereby improving the convergence speed, solution quality, and robustness [12,13,14,15]. Among these, particle swarm optimization (PSO) has received considerable attention due to its simplicity and rapid convergence [16]. However, PSO also suffers from limited local exploitation ability and a tendency toward premature convergence. To mitigate these issues, numerous studies have combined PSO with evolutionary operators such as genetic algorithms (GAs) [17] and differential evolution (DE) [18], adaptive parameter control techniques [19], and multi-strategy perturbation schemes to better balance global exploration and local exploitation [20].
In parallel, the ivy algorithm (IVYA) has emerged as a promising bio-inspired approach due to its straightforward design and strong local search capabilities [21]. By mimicking the natural growth and propagation behaviors of ivy plants, IVYA can efficiently exploit the neighborhood of promising regions while preserving population diversity. Nevertheless, the lack of a global guidance mechanism limits its convergence efficiency on large-scale optimization problems.
Despite the development of various hybrid metaheuristics, most existing methods primarily focus on accelerating convergence or maintaining diversity, without effectively addressing how to preserve the exploration–exploitation balance while avoiding premature convergence. Moreover, mechanisms that adaptively adjust search strategies based on dynamic fitness landscapes remain underexplored in the current literature.
To bridge these gaps, this paper introduces a novel hybrid optimization algorithm—IVYPSO—which synergistically combines the global guidance of PSO with the adaptive local search capabilities of IVYA. IVYPSO incorporates a biologically inspired ivy perturbation variable (GV) into the PSO velocity update equation, enabling adaptive directional control and stochastic perturbations based on the current fitness landscape. This design enhances the algorithm’s ability to escape local optima while improving the population diversity and convergence stability. The main contributions of this work are summarized as follows:
  • A novel PSO-guided hybrid optimization algorithm, IVYPSO, is proposed. It embeds the ivy growth strategy of IVYA into the velocity update process of PSO, thereby enabling dynamic switching between local and global search modes.
  • A balanced search framework was constructed by comparing the fitness of the current individual with that of the global best. The algorithm dynamically determines whether to perform local exploitation or global exploration, thereby improving its adaptability to complex search spaces.
  • A comprehensive experimental evaluation was conducted on 26 standard benchmark functions and three constrained real-world engineering design problems. The performance of the proposed IVYPSO algorithm was compared against eleven advanced metaheuristic optimization algorithms, including classical methods (such as PSO, IVY, BOA, WOA, and GOOSE), as well as recently developed hybrid algorithms from high-quality research studies (including HPSOBOA, FDC-AGDE, dFDB-LSHADE, NSM-BO, dFDB-SFS, and FDB-AGSK). The experimental results demonstrate that IVYPSO consistently outperformed the compared algorithms in terms of optimization accuracy, convergence speed, and robustness.
The rest of this paper is organized as follows. Section 2 reviews the related work concerning the design and development of metaheuristic search algorithms. Section 3 introduces the PSO and ivy algorithms. Section 4 presents the proposed IVYPSO hybrid algorithm. Section 5 covers its testing on benchmark functions and applications for practical engineering problems. Section 6 discusses the results and outlines future work.

2. Related Work

Metaheuristic algorithms have become essential tools for addressing complex, multimodal, and high-dimensional optimization problems. Inspired by natural, biological, or social phenomena, these gradient-free algorithms are particularly well-suited for real-world engineering applications. Recent advancements in metaheuristics have shifted the research focus toward enhancing the algorithmic performance by redesigning key components—namely, the guiding mechanisms, convergence strategies, and update schemes.
Among these components, the guiding mechanism is critical, as it governs how individuals or candidate solutions are influenced by others within the population. Traditional guidance approaches, such as those used in classical PSO, typically direct individuals toward global or local optima based on fitness values but often suffer from premature convergence and loss of population diversity. To address these challenges, advanced guiding strategies have been developed, including fitness distance balance (FDB) [22], adaptive FDB (AFDB) [23], dynamic FDB (DFDB) [24], and fitness distance constraint (FDC). By incorporating spatial distance information between solutions, these methods achieve a more balanced trade-off between exploration and exploitation, as demonstrated by their successful integration into algorithms such as FDC_AGDE [25], dFDB_LSHADE [26], dFDB_SFS [27], and FDB_AGSK [28], which exhibit strong performance on benchmark functions and engineering problems.
Convergence strategies, which determine how candidate solutions are updated each iteration, also play a pivotal role in directing search trajectories and influencing the convergence speed. The contemporary approaches include time-decreasing control parameters, multi-phase convergence schemes, and hybrid deterministic random update models. For instance, the GOOSE algorithm adapts its convergence behavior across different search stages [29], while HPSOBOA incorporates multiple convergence modes to sustain robustness across diverse problem domains [30].
Moreover, update schemes—particularly for the selection of surviving individuals—are vital for maintaining the solution quality and population diversity. Conventional methods relying solely on fitness-based ranking risk stagnation and premature convergence. To overcome this, the natural survivor method (NSM) integrates the fitness performance with historical success rates to judiciously select individuals for the next generation [31]. NSM has proven effective in algorithms such as NSM-BO and NSM-LSHADE-CnEpSin [32], markedly enhancing the stability and global search performance in constrained engineering problems.
Despite these advances, most existing studies focus on improving individual algorithmic components—such as the convergence behavior or update strategies—often at the expense of overall adaptability or diversity. A unified framework that simultaneously integrates global guidance and bio-inspired local searches with dynamic adaptability remains lacking.
To bridge this gap, we propose a hybrid algorithm, IVYPSO, which synergistically combines PSO’s global search capability with the ivy algorithm’s local growth and diffusion mechanisms. By embedding a dynamic ivy-inspired variable GV within the PSO velocity update formula, IVYPSO adaptively switches search strategies based on individual fitness, effectively balancing exploration and exploitation. This design not only preserves the convergence accuracy but also enhances the robustness and diversity, offering a novel high-performance optimization framework tailored for complex global optimization challenges.

3. Materials and Methods

To better illustrate our hybrid algorithm, this section introduces the inspiration behind, and the detailed formulation of, the PSO and ivy algorithms.

3.1. PSO Formulation

PSO is a nature-inspired swarm intelligence algorithm, proposed by Kennedy and Eberhart in 1995. Its core idea is based on simulating the foraging behavior of bird flocks or fish schools. By sharing information and collaborating among individuals, PSO strikes a balance between global exploration and local exploitation, effectively solving optimization problems.
In the PSO algorithm, each particle has a unique position X_i = [x_{i1}, x_{i2}, …, x_{ij}, …, x_{iD}] and velocity V_i = [v_{i1}, v_{i2}, …, v_{ij}, …, v_{iD}], each representing a potential solution, where j = 1, 2, …, D and D denotes the dimensionality of the search space. In each iteration, the position of the particle is dynamically updated based on the following three components:
(1)
Inertia component: Determined by the particle’s previous velocity, it is used to maintain the particle’s movement trend and balance the global exploration capability of the search.
(2)
Individual cognitive component: The particle adjusts its search direction based on its own historical best position P b e s t , simulating individual learning behavior.
(3)
Social component: The particle adjusts its direction based on the entire population’s global best position G b e s t , reflecting social learning and collaborative effects.
During the search process, the particle’s position is influenced by its best position in the neighborhood P b e s t , i and the global best position G b e s t of the entire population.
The particle position and velocity update formulas are given by Equations (1) and (2), respectively:
X_i^{k+1} = X_i^k + V_i^{k+1}    (1)
V_i^{k+1} = w V_i^k + c_1 r_1 (P_{best,i} − X_i^k) + c_2 r_2 (G_{best} − X_i^k)    (2)
where w is the inertia weight, which controls the balance between global and local searches; c_1 and c_2 are acceleration coefficients that determine the influence of individual and group learning, respectively; and r_1 and r_2 are random numbers in the range [0, 1], which enhance the randomness of the search.
The flowchart of the PSO algorithm is shown in Figure 1.
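The update rules in Equations (1) and (2) can be sketched in a few lines of NumPy. The vectorised (N, D) array layout and the `rng` argument are implementation choices of this sketch, not part of the original formulation:

```python
import numpy as np

def pso_step(X, V, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One PSO iteration: velocity update (Eq. 2), then position update (Eq. 1).

    X, V  : (N, D) arrays of particle positions and velocities
    pbest : (N, D) array of personal best positions
    gbest : (D,)  global best position
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.random(X.shape)  # uniform [0, 1), drawn per component
    r2 = rng.random(X.shape)
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    return X + V, V
```

Calling `pso_step` once per iteration, followed by fitness evaluation and best-position bookkeeping, reproduces the loop shown in the flowchart.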

3.2. IVYA Formulation

The ivy algorithm, derived from the growth behavior of ivy plants in nature, is a swarm intelligence optimization method. Ivy plants continuously grow, climb, and spread in the environment in search of sunlight, nutrients, and other resources for survival. This process serves as inspiration for addressing global optimization problems. The algorithm simulates the different life stages of ivy, including growth, ascent, and spreading [33]. The algorithm’s implementation process can be outlined in the subsequent four steps:
(1)
Initialize the population, where N and D represent the total number of members and the dimensionality of the problem, respectively. The i-th population member thus has the form I_i = (I_{i1}, …, I_{iD}), where i = 1, 2, …, N, and the total population is represented as I = (I_1, …, I_i, …, I_N). At the start of the algorithm, the initial positions of the population in the search space are determined using Equation (3):
I_i = I_{min} + rand(1, D) ⊗ (I_{max} − I_{min}),  i = 1, …, N    (3)
where rand(1, D) represents a vector of dimension D with random numbers uniformly distributed in the range [0, 1]; I_{max} and I_{min} denote the upper and lower limits of the search space, respectively; and ⊗ denotes the element-wise product of two vectors.
(2)
Coordinated and disciplined population growth. In the growth process of the ivy algorithm, we assume that the growth rate GV of the ivy is a function of time, expressed through a differential equation, as illustrated in Equation (4):
dGV(t)/dt = ψ GV(t) − φ(GV(t))    (4)
where GV, ψ, and φ represent the growth rate, growth velocity, and correction factor for growth deviation, respectively. The discrete-time growth update of member I_i is modeled by Equation (5):
ΔGV_i(t + 1) = rand² ⊗ (N(1, D) ⊗ ΔGV_i(t))    (5)
where ΔGV_i(t) and ΔGV_i(t + 1) represent the growth-rate increments at discrete time steps t and t + 1, respectively; rand is a random number in the range [0, 1]; and N(1, D) is a normally distributed random vector of dimension D.
(3)
Obtaining sunlight for growth. For ivy in nature, quickly finding a surface to attach to is crucial. The movement towards the light source is modeled by Equations (6)–(8). In the proposed algorithm, this behavior is simulated by the i-th individual I_i in the population selecting its closest and fittest neighbor I_ii (based on the value of the fitness function) as a reference for self-improvement, as shown in Figure 2.
I_ii = { I_{j−1}^s,  if I_i = I_j^s (j ≠ 1);  I_i,  if I_i = I_best }    (6)
I_i^{new1} = I_i + N(1, D) ⊗ (I_ii − I_i) + |N(1, D)| ⊗ ΔGV_i,  i = 1, 2, …, N    (7)
ΔGV_i = { I_i / (I_{max} − I_{min}),  Iter = 1;  rand² ⊗ (N(1, D) ⊗ ΔGV_i),  Iter > 1 }    (8)
where |N(1, D)| is a vector whose components are the absolute values of the corresponding components of N(1, D).
(4)
Growth and evolution of ivy. After the member I_i navigates the search space globally to reach its nearest and most significant neighbor I_ii, it enters a phase in which I_i strives to directly follow the optimal member I_best in the population. This stage corresponds to the pursuit of an improved optimal solution in the vicinity of I_best, as depicted in Equations (9) and (10):
I_i^{new} = I_best ⊗ (rand(1, D) + N(1, D) ⊗ ΔGV_i)    (9)
ΔGV_i^{new} = I_i^{new} / (I_{max} − I_{min})    (10)
The flowchart of the ivy algorithm is shown in Figure 3.
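As a rough illustration of the growth phase, the following NumPy sketch applies Equations (6)–(8) to a population assumed to be sorted by fitness. The rank-based neighbor rule, the boundary clipping, and the fact that Equations (9) and (10) are deferred to a later phase of the full loop are simplifying assumptions of this sketch, not a faithful reproduction of the complete algorithm:

```python
import numpy as np

def ivy_growth_phase(I, I_min, I_max, dGV, first_iter, rng=None):
    """Sketch of the ivy growth move (Eqs. 6-8).

    I is assumed sorted by fitness so that I[0] is the best member;
    dGV holds the (N, D) growth-rate increments and is updated in place.
    """
    rng = np.random.default_rng() if rng is None else rng
    N, D = I.shape
    I_new = np.empty_like(I)
    for i in range(N):
        # Eq. (6): the best member is its own reference; every other
        # member follows the member ranked immediately above it.
        I_ii = I[i] if i == 0 else I[i - 1]
        # Eq. (8): on the first iteration dGV is seeded from the position.
        if first_iter:
            dGV[i] = I[i] / (I_max - I_min)
        else:
            dGV[i] = rng.random() ** 2 * rng.standard_normal(D) * dGV[i]
        n = rng.standard_normal(D)  # one N(1, D) draw
        I_new[i] = I[i] + n * (I_ii - I[i]) + np.abs(n) * dGV[i]  # Eq. (7)
    return np.clip(I_new, I_min, I_max), dGV
```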

4. Proposed Optimization Formulation of IVYPSO

The IVYPSO hybrid algorithm integrates the global exploration capability of PSO with the local exploitation and adaptive perturbation mechanism inspired by the natural growth behavior of ivy plants. Specifically, the algorithm leverages the PSO’s velocity-position update rule to guide individuals toward promising areas in the solution space. Meanwhile, an ivy-inspired G V introduces fine-grained random perturbations to enhance the local search capability around high-quality solutions. To balance exploration and exploitation, IVYPSO adaptively switches between global and local strategies based on a dynamic quality threshold. A greedy selection strategy is also employed to retain elite solutions in each iteration. This coordinated hybridization approach improves the convergence speed, avoids local optima, and enhances the solution’s overall accuracy.

4.1. Initialization Phase

In this phase, the objective function f o b j is defined to calculate the fitness. The lower and upper bounds of the search space, l b and u b , respectively, are set to restrict the solution range; N individuals are randomly generated, and the position and velocity of each individual i are initialized, along with the ivy variables. The specific parameter initialization process is as follows.
The particles’ initial positions in the search space are randomly generated, as described in Equation (11):
X_i = unifrnd(lb, ub, 1, dim)    (11)
The velocity is initialized as a zero vector, as described in Equation (12):
V_i = zeros(1, dim)    (12)
The ivy growth variable G V is then initialized, as described in Equation (13):
GV = X_i / (ub − lb)    (13)
The ivy variable G V represents the relative growth behavior of an ivy plant within the bounded search space and is used to control the intensity of local random movement.
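A minimal NumPy sketch of this initialization step (Equations (11)–(13)), assuming scalar bounds lb and ub shared by all dimensions:

```python
import numpy as np

def initialize(N, dim, lb, ub, rng=None):
    """IVYPSO initialization: random positions (Eq. 11), zero
    velocities (Eq. 12), and the ivy growth variable (Eq. 13)."""
    rng = np.random.default_rng() if rng is None else rng
    X = lb + rng.random((N, dim)) * (ub - lb)  # uniform in [lb, ub]
    V = np.zeros((N, dim))                     # zero initial velocity
    GV = X / (ub - lb)                         # relative growth variable
    return X, V, GV
```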

4.2. Guidance Mechanism: PSO-Guided Velocity Update

This step updates the velocity for each individual, as described by the velocity update formula in Equation (14):
V_i^{t+1} = w V_i^t + c_1 r_1 (P_best − X_i^t) + c_2 r_2 (G_best − X_i^t)    (14)
where w = 0.7, c_1 = 1.5, and c_2 = 1.5, while r_1 and r_2 are randomly generated values within the interval [0, 1] that add randomness to the particle movement. This mechanism guides particles towards their personal best position P_best and the global best position G_best, promoting convergence toward high-quality areas.
This velocity update represents the guidance mechanism in a metaheuristic search (MHS), guiding particles toward personal and global best positions to balance exploration and exploitation. Recent state-of-the-art (SOTA) methods such as FDB and its variants improve the guidance by considering both the fitness and distance to avoid premature convergence.

4.3. Update Mechanism: Position Update with Ivy Perturbation

To achieve a dynamic balance between global and local searches, we introduce a dynamic control factor β_1, which adjusts the ratio between global and local search effort. Its expression is given in Equation (15):
β_1 = 1 + rand/2    (15)
where rand is a random value in the range [0, 1], so β_1 varies dynamically within [1, 1.5]. This introduces randomness and balances the strengths of global and local searches: the algorithm primarily focuses on global searches in the early stages and gradually enhances the local search capability in the later stages, thereby improving the convergence accuracy.
In each iteration, for each individual i, the update strategy is selected based on the relationship between its current fitness value Cost(X_i) and the global best fitness value Cost(G_best). The condition is given in Equation (16):
Cost(X_i) < β_1 × Cost(G_best)    (16)
where Cost(X_i) represents the fitness value of the current individual and Cost(G_best) represents the fitness value of the current global best solution. If the condition is met, Equation (17) is used for a local search (exploiting around a neighbor); otherwise, Equation (18) is applied for a global search (perturbing toward the best). Equations (17) and (18) are as follows:
X_i^{new} = X_i + |randn| × (x_neighbor − X_i) + randn × GV    (17)
X_i^{new} = G_best × (rand + randn × GV)    (18)
Here, G V controls the perturbation magnitude; r a n d n is a normally distributed random value that mimics the irregular yet directed growth of ivy tips toward better areas. The ivy growth variable is then updated using Equation (19).
GV = GV × (rand² × randn)    (19)
This adaptive update allows G V to decay or intensify based on the stochastic process, simulating flexible growth behaviors for refining solutions.
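Putting Equations (15)–(19) together, one iteration of the adaptive move for a single individual might look as follows. The `neighbor` argument stands in for whatever neighbor-selection rule the full algorithm uses, and minimization with positive fitness values is assumed; both are assumptions of this sketch:

```python
import numpy as np

def ivypso_position_update(X_i, gbest, gbest_cost, cost_i, neighbor, GV_i, rng=None):
    """Sketch of the adaptive move: local growth around a neighbor when
    the individual is competitive (Eqs. 15-17), otherwise a global jump
    perturbed around Gbest (Eq. 18), then the GV decay/intensify step (Eq. 19)."""
    rng = np.random.default_rng() if rng is None else rng
    beta1 = 1 + rng.random() / 2                       # Eq. (15)
    if cost_i < beta1 * gbest_cost:                    # Eq. (16): local search
        X_new = (X_i
                 + np.abs(rng.standard_normal(X_i.shape)) * (neighbor - X_i)
                 + rng.standard_normal(X_i.shape) * GV_i)          # Eq. (17)
    else:                                              # global search
        X_new = gbest * (rng.random(X_i.shape)
                         + rng.standard_normal(X_i.shape) * GV_i)  # Eq. (18)
    GV_i = GV_i * (rng.random() ** 2 * rng.standard_normal(GV_i.shape))  # Eq. (19)
    return X_new, GV_i
```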

4.4. Survivor Selection Strategy

In metaheuristic search algorithms, survivor selection is a crucial component of the update mechanism that determines which individuals are retained in the population to balance the convergence speed and solution diversity. In IVYPSO, we employ a greedy selection strategy, described in Equation (20), to preserve improved solutions:
If Cost(X_i^{new}) < Cost(X_i), then X_i(t + 1) = X_i^{new}    (20)
If the new position X_i^{new} yields a better fitness than the current position X_i, it replaces the current solution; otherwise, the original solution is retained. Additionally, the individual best P_{best,i} and global best G_best are updated synchronously upon improvement.
Furthermore, inspired by recent advances in metaheuristics, the NSM has been proposed as an effective survivor selection technique. NSM dynamically combines fitness and historical success information to select survivors, enhancing the stability and diversity in the population. Although IVYPSO does not explicitly implement NSM, its greedy selection strategy combined with ivy-inspired perturbations shares conceptual similarities with NSM’s goal of balancing exploitation and exploration, ensuring robustness against premature convergence. Future work may explore integrating NSM directly into IVYPSO to further improve the update mechanism performance.
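The greedy rule in Equation (20) can be vectorised over the whole population in NumPy; the boolean-mask formulation below is a design choice of this sketch:

```python
import numpy as np

def greedy_select(X, cost, X_new, cost_new):
    """Greedy survivor selection (Eq. 20): keep each trial position only
    where it improves the fitness. Vectorised over the population."""
    improved = cost_new < cost                  # (N,) boolean mask
    X = np.where(improved[:, None], X_new, X)   # replace improved rows only
    cost = np.where(improved, cost_new, cost)
    return X, cost, improved
```

The returned mask can also drive the synchronized updates of the personal and global bests mentioned above.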

4.5. Summary

Through PSO-guided global movement and ivy-inspired local perturbations, IVYPSO forms a complementary hybrid search system. The use of ivy growth variables enhances the adaptability and solution refinement, particularly in rugged or complex landscapes. The implementation process of IVYPSO is illustrated in Figure 4 and Algorithm 1.
Algorithm 1. IVYPSO
Input: N, Max_iteration, lb, ub, dim, fobj
Output: Destination_fitness, Destination_position, Convergence_curve
Initialize parameters:
    Set PSO parameters: inertia weight (w), cognitive factor (c1), social factor (c2)
    Initialize population size (N), maximum iterations (Max_iteration), search space (lb, ub)
    Define and initialize the ivy growth variable (GV)
Initialize population:
    For each particle i in population:
        Randomly initialize position Position_i within [lb, ub]
        Initialize velocity Velocity_i as a zero vector
        Evaluate fitness Cost_i = fobj(Position_i)
        Initialize ivy growth variable GV_i = Position_i/(ub - lb)
        Set personal best PBest_i = Position_i and PBest_Cost_i = Cost_i
    Set global best GBest as the particle with the lowest fitness value
Iteration loop (t = 1 to Max_iteration):
    For each particle i in population:
        Update velocity and position:
            Generate random vectors r1 and r2
            Velocity_i = w * Velocity_i
                + c1 * r1 * (PBest_i - Position_i)
                + c2 * r2 * (GBest - Position_i)
        Calculate dynamic control factor β:
          β = 1 + (random/2)
        Perform local or global search based on fitness comparison:
          If Cost_i < β * GBest_Cost:
            New_Position = Position_i
                    + |N(0,1)| * (Position_neighbor - Position_i)
                    + N(0,1) * GV_i
          Else:
            New_Position = GBest * (random + N(0,1) * GV_i)
        Boundary handling:
          Ensure New_Position is within [lb, ub]
        Update ivy growth variable:
          GV_i = GV_i * (random^2 * N(0,1))
        Evaluate and update solutions:
          New_Cost = fobj(New_Position)
          If New_Cost < Cost_i:
            Position_i = New_Position
            Cost_i = New_Cost
            If New_Cost < PBest_Cost_i:
                PBest_i = New_Position
                PBest_Cost_i = New_Cost
                If New_Cost < GBest_Cost:
                    GBest = New_Position
                    GBest_Cost = New_Cost
    Record the best fitness at the current iteration:
        Convergence_curve(t) = GBest_Cost
Return:
    Destination_fitness = GBest_Cost
    Destination_position = GBest
    Convergence_curve

5. Results and Analytical Evaluation of the Experiment

To verify the proposed algorithm’s reliability and performance, 26 standard benchmark functions were utilized. Moreover, we applied the algorithm to three real-world engineering optimization problems to evaluate its practical performance. The subsequent sections offer comprehensive details on the benchmark functions, parameter configurations, and performance metrics. The algorithm’s effectiveness was assessed through a comparative analysis with eleven widely recognized metaheuristic algorithms.
Setup for experiments: The proposed IVYPSO algorithm and other metaheuristic methods were implemented in MATLAB 2023a. All tests were conducted on a Windows 10 platform with an Intel(R) Core (TM) i9-14900KF processor (3.10 GHz) and 32 GB of RAM.

5.1. Global Optimization with 26 Benchmark Mathematical Test Functions

To assess the performance of the proposed IVYPSO algorithm in solving complex optimization problems, this study employed 26 widely recognized benchmark functions [34,35,36]. These functions were selected to ensure a comprehensive evaluation covering various optimization scenarios, and they are grouped here into two main categories: unimodal and multimodal functions. Each function was tested in a 30-dimensional space.
Unimodal functions: The initial set of 15 benchmark functions (F1–F15) is unimodal, characterized by the presence of a single global optimum, making it suitable for evaluating the convergence speed and local exploitation capability of optimization algorithms. These functions provide a smooth search landscape without local optima, allowing the assessment of an algorithm’s ability to quickly converge to the global minimum.
For instance, the Quartic function introduces a noise component that simulates real-world measurement errors, making it relevant for applications such as experimental data fitting. The Sum Power function, with its amplified penalization of higher-order dimensions, reflects challenges seen in structural reliability or robust design tasks in engineering optimization.
Multimodal functions: The latter set of functions (F16–F26) is multimodal, containing numerous local optima and being employed to test an algorithm’s ability to maintain diversity and avoid premature convergence. These functions simulate complex landscapes typically encountered in real-world scenarios such as material design, resource allocation, or non-linear process optimization.
For example, the Alpine function represents rugged fitness landscapes with a repetitive pattern, akin to multi-peak phenomena in signal processing or energy system optimization. The Weierstrass function, characterized by its fractal-like structure, is often used to test algorithms under highly irregular and non-differentiable conditions, making it applicable to domains such as financial modeling or dynamic system tuning.
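For concreteness, here are common textbook definitions of two of the functions named above; the exact forms and parameter settings used in Table 1 may differ slightly:

```python
import numpy as np

def quartic(x, rng=None):
    """Quartic function with noise: sum(i * x_i^4) plus a U[0, 1) noise
    term, simulating measurement error."""
    rng = np.random.default_rng() if rng is None else rng
    i = np.arange(1, x.size + 1)
    return np.sum(i * x**4) + rng.random()

def alpine(x):
    """Alpine N.1 function: sum(|x_i sin(x_i) + 0.1 x_i|), with global
    minimum 0 at x = 0 and a rugged, multi-peak landscape."""
    return np.sum(np.abs(x * np.sin(x) + 0.1 * x))
```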
By adopting this comprehensive set of 26 benchmark functions, this study evaluated IVYPSO across a wide range of conditions, from simple landscapes to highly complex multimodal terrains. This thorough testing ensures that the algorithm’s effectiveness, robustness, and generalization ability were rigorously validated. The full mathematical formulations and parameter settings of the benchmark functions are detailed in Table 1.

5.1.1. Performance Indicators

To objectively assess the effectiveness of the IVYPSO algorithm, this study used the following standard evaluation metrics to comprehensively assess its performance across different benchmark tests [37,38,39,40].
Average value ( A v g ): The average fitness value obtained from M independent runs of the algorithm, calculated as shown in Equation (21).
Avg = (1/M) Σ_{i=1}^{M} f_i    (21)
Standard deviation ( S t d ): The variability in the objective function values obtained from M independent runs of the algorithm. The standard deviation is calculated using Equation (22).
Std = √( (1/(M − 1)) Σ_{i=1}^{M} (f_i − Avg)² )    (22)
Best: The minimum fitness obtained from M independent runs of the algorithm, as shown in Equation (23).
Best = min_{1≤i≤M} f_i    (23)
where f i denotes the optimal fitness value attained during run i .
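The three indicators in Equations (21)–(23) reduce to a few lines of NumPy; note the sample standard deviation with M − 1 in the denominator, matching Equation (22):

```python
import numpy as np

def run_statistics(f):
    """Avg, Std (sample, M-1 denominator), and Best over M run results f."""
    f = np.asarray(f, dtype=float)
    return f.mean(), f.std(ddof=1), f.min()
```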

5.1.2. Parameter Settings and Performance Comparison Against Other Algorithms

Table 2 summarizes the parameter settings for the IVYPSO algorithm and its comparison algorithms. The proper adjustment of the parameters for each algorithm is crucial to ensure optimal performance. To maintain the fairness of the comparison, the initial settings and common parameters for all algorithms are set based on the standard values from the existing literature, while the remaining parameters are optimized through experimental procedures. The comparison algorithms selected in this study include classic metaheuristic methods such as PSO, IVY, BOA [41], and WOA [42], as well as several recently proposed improved algorithms, including GOOSE and HPSOBOA, and several advanced variant algorithms such as FDC-AGDE, dFDB-LSHADE, NSM-BO, dFDB-SFS, and FDB-AGSK. These algorithms have demonstrated strong performance in the IEEE CEC competition and in solving real-world engineering optimization problems. In this study, the evaluation of the objective function is terminated upon reaching the maximum number of iterations.

5.1.3. Analysis of Numerical Results

The proposed algorithm’s performance was evaluated and compared with that of several established and recently proposed algorithms.
Table 3 presents a detailed comparison of the average fitness values and standard deviations achieved by the IVYPSO algorithm and other algorithms across the test functions. Notably, IVYPSO attained the best average fitness on 21 out of the 26 test functions (F1–F4, F6, F8–F20, F24–F26), outperforming the other 11 algorithms. Particularly impressive was its performance on 17 functions (F1–F4, F8, F9, F11, F13–F17, F19, F20, F24–F26), where it achieved the best average fitness with a standard deviation of zero, demonstrating excellent stability and efficiency. On functions F5, F21, and F23, the IVYPSO algorithm ranked 6th, 5th, and 4th, respectively. However, its performance on F7 and F22 was relatively poor, ranking 11th and 12th, respectively.
Table 4 compares the best fitness values obtained by the IVYPSO algorithm and the other 11 algorithms across the test functions. Among the 26 functions, IVYPSO achieved superior results on 20 functions (F1–F4, F8, F9, F11–F20, F24–F26), outperforming the other 11 algorithms. For functions F6, F21, and F23, IVYPSO ranked 2nd, 4th, and 2nd, respectively, in terms of the best fitness value. Nevertheless, its performance was inferior on functions F5, F7, and F22, where it ranked 8th, 11th, and 10th, respectively.
In summary, these results indicate that IVYPSO is a highly competitive optimization algorithm.

5.1.4. Analysis of Convergence Behavior

Figure 5 compares the convergence behavior of the IVYPSO algorithm with that of the other algorithms over 500 iterations. The vertical axis represents the best fitness value obtained at each iteration, and the horizontal axis indicates the iteration count. On 16 of the 26 test functions (F1–F4, F6, F8–F15, F18, F20, and F25), IVYPSO converged faster and more efficiently than the other algorithms. Although IVYPSO exhibited strong global optimization capability overall, a few competing algorithms outperformed it in specific cases.

5.1.5. Analysis of Exploitation Capabilities

The unimodal benchmark functions (F1–F14) contain only one minimum and are therefore well suited to evaluating an algorithm's exploitation capability. For functions F1–F4, F8, F9, F11, F13, and F14, Figure 5 shows that IVYPSO reaches the theoretical optimum within approximately 50 iterations. Table 3 reveals that the average fitness and standard deviation on these functions are typically 0, reflecting IVYPSO's high precision and stability. These findings clearly indicate that IVYPSO outperforms most of the comparison algorithms on the benchmark functions. For the multi-modal functions F15–F17, F19, F20, and F24–F26, IVYPSO also attains the theoretical best fitness within approximately 50 iterations. Table 4 shows that most of the algorithms are capable of locating the theoretical optimal solution; moreover, Table 3 underscores that IVYPSO achieves a better average fitness and standard deviation than most competitors, highlighting its robust and stable search capabilities.

5.1.6. Wilcoxon Signed-Rank Analysis Results and Friedman Ranking Scores

To ensure statistically robust conclusions, the widely used non-parametric Wilcoxon signed-rank test [43,44,45,46] was applied to compare IVYPSO against the 11 other algorithms. Table 5 presents the results on the 26 standard test functions at a significance level of α = 0.05, using the mean objective value of each function as the test sample. A p-value below 0.05 indicates a statistically significant difference. The data in Table 5 show that IVYPSO yields p-values below 0.05 in all cases; this analysis therefore statistically validates the superiority of IVYPSO over the other 11 algorithms.
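As an illustration, this kind of pairwise Wilcoxon signed-rank comparison can be reproduced with SciPy. The paired mean objective values below are invented placeholders, not the paper's measurements:

```python
from scipy import stats

# Mean objective values of two algorithms on the same benchmark functions
# (illustrative placeholder numbers, NOT the paper's data).
ivypso_means = [0.0, 0.0, 1.2e-8, 3.4e-5, 0.0, 2.1e-3]
rival_means = [1.5e-2, 3.2e-1, 4.8e-3, 9.1e-2, 6.7e-4, 5.5e-1]

# Two-sided Wilcoxon signed-rank test on the paired samples.
statistic, p_value = stats.wilcoxon(ivypso_means, rival_means)

# A p-value below alpha = 0.05 indicates a statistically
# significant difference between the two algorithms.
significant = p_value < 0.05
```

With small samples and no ties, SciPy uses the exact null distribution of the signed-rank statistic, which is appropriate for the 26-function comparison performed here.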
To further evaluate the overall performance of the proposed IVYPSO algorithm on the 26 benchmark functions, Friedman scores were used to rank the average performance of the 12 compared algorithms [47,48,49]. Table 6 presents the Friedman scores and the corresponding ranks. Among all algorithms, IVYPSO achieved the lowest Friedman score of 1.9231, ranking first, which demonstrates its outstanding performance across all test functions. In contrast, traditional algorithms such as PSO and BOA obtained Friedman scores of 5.8846 and 6.7308, ranking 9th and 11th, respectively, indicating relatively weaker overall performance. Similarly, some improved algorithms, such as FDC-AGDE and dFDB-LSHADE, ranked 12th and 10th, respectively, also underperforming IVYPSO. These results indicate that IVYPSO exhibits strong robustness and competitiveness in solving complex optimization problems, and its overall performance surpasses that of both traditional and newly developed metaheuristic algorithms.
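The Friedman score of an algorithm is its mean rank across all test functions (rank 1 = best on a function, so a lower score is better). A minimal sketch with SciPy, using toy scores rather than the paper's data:

```python
import numpy as np
from scipy import stats

# Rows = benchmark functions, columns = algorithms; lower value is better.
# Illustrative placeholder scores, NOT the paper's data.
scores = np.array([
    [0.0, 1.2, 3.4],
    [0.1, 0.9, 2.2],
    [0.0, 2.5, 1.8],
    [0.2, 1.1, 0.9],
    [0.0, 3.0, 2.7],
])

# Friedman score = mean rank of each algorithm across functions.
ranks = stats.rankdata(scores, axis=1)
friedman_scores = ranks.mean(axis=0)

# Friedman chi-square test for overall differences among the algorithms.
chi2, p_value = stats.friedmanchisquare(*scores.T)
```

Here the first column wins every row, so its Friedman score is exactly 1.0, mirroring how IVYPSO's low score of 1.9231 reflects near-uniformly top ranks.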

5.1.7. Analysis of Computational Expenses

The time complexity of the IVYPSO algorithm comprises three parts: population initialization, iteration updates, and fitness evaluation. Let the population size be N, the maximum number of iterations be T, and the dimensionality of the decision variables be dim. In the initialization phase, randomly generating each particle's position and velocity and evaluating its fitness requires O(N × dim) time. In each iteration, the algorithm updates the velocity and position of each particle, adjusts the ivy growth variables, and evaluates the fitness. The velocity and position updates involve basic vector operations with O(dim) complexity, and the ivy growth variable update and the random perturbation for local or global searches are also O(dim). The fitness evaluation is usually the most time-consuming operation: if a single objective function evaluation costs O(f), evaluating the entire population in each iteration costs O(N × f). The overall time complexity of the IVYPSO algorithm can therefore be expressed as O(T × N × (dim + f)). As N, T, and dim increase, the time complexity grows linearly, making the algorithm suitable for large-scale, complex global optimization problems.
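The cost structure behind the O(T × N × (dim + f)) bound can be seen in a stripped-down loop skeleton. This is a structural sketch only; the simplified gbest-guided update below stands in for the full IVYPSO velocity and ivy-growth rules:

```python
import random

def swarm_sketch(objective, dim, n=30, t_max=500, lb=-100.0, ub=100.0):
    """Loop skeleton illustrating the O(T * N * (dim + f)) cost structure.

    A simplified gbest-guided update stands in for the full IVYPSO rules.
    """
    # Initialization: O(N * dim) position/velocity setup plus N evaluations.
    pos = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    fit = [objective(p) for p in pos]
    g = min(range(n), key=lambda i: fit[i])
    gbest, gfit = pos[g][:], fit[g]

    for _ in range(t_max):                 # T iterations
        for i in range(n):                 # N agents
            for d in range(dim):           # O(dim) vector update
                vel[i][d] = 0.7 * vel[i][d] + 1.5 * random.random() * (gbest[d] - pos[i][d])
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lb), ub)
            fit[i] = objective(pos[i])     # O(f) fitness evaluation
            if fit[i] < gfit:
                gbest, gfit = pos[i][:], fit[i]
    return gbest, gfit

random.seed(1)
sphere = lambda x: sum(v * v for v in x)
best_pos, best_val = swarm_sketch(sphere, dim=5, n=10, t_max=100)
```

Doubling any of T, N, or dim doubles the corresponding loop's work, which is exactly the linear growth stated above.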

5.2. Application of IVYPSO to Engineering Optimization Problems

This section presents the application of the IVYPSO algorithm to engineering optimization problems involving various inequality and equality constraints. For each problem, IVYPSO was evaluated over 20 independent runs with a population size of 30 individuals and a maximum of 500 iterations. Its performance was compared with the same 11 algorithms: PSO, IVY, BOA, WOA, GOOSE, HPSOBOA, FDC-AGDE, dFDB-LSHADE, NSM-BO, dFDB-SFS, and FDB-AGSK. Additionally, three stability analysis metrics were incorporated: the success rate (SR), average computation times (ACTs), and average fitness evaluations (AFEs). SR represents the proportion of independent runs in which the algorithm found solutions that satisfy all constraints and reach the global optimum or an acceptable level of precision. ACTs refers to the average time (in seconds) the algorithm spent completing each optimization task across multiple runs. AFEs indicates the average number of fitness function evaluations required to complete the optimization task over multiple experiments.
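For clarity, the three stability metrics can be computed directly from per-run records. The run data and precision threshold below are hypothetical, chosen only to illustrate the bookkeeping:

```python
# Each record: (best_cost, runtime_seconds, fitness_evaluations) for one
# independent run -- hypothetical numbers for illustration only.
runs = [
    (263.8523, 0.41, 9000),
    (263.8523, 0.39, 8700),
    (263.8524, 0.43, 9300),
    (263.8523, 0.40, 8850),
]
threshold = 263.8523 + 1e-3  # assumed acceptable-precision threshold

# SR: percentage of runs whose best cost reaches the threshold.
sr = 100.0 * sum(cost <= threshold for cost, _, _ in runs) / len(runs)
# ACTs: average computation time per run, in seconds.
act = sum(t for _, t, _ in runs) / len(runs)
# AFEs: average number of fitness evaluations per run.
afe = sum(e for _, _, e in runs) / len(runs)
```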

5.2.1. Gas Transmission Compressor Design (GTCD) Problem

Figure 6 illustrates the gas transmission compressor design (GTCD) problem, which is a representative and practical mechanical design case originally proposed by Beightler and Phillips [50]. This problem involves determining the optimal values of several design variables, such as the pipeline length, inlet and outlet pressures, and pipe diameter, with the objective of minimizing the total cost of a gas pipeline transmission system while ensuring the delivery of 100 million cubic feet of natural gas per day.
The GTCD problem reflects a realistic and complex engineering scenario widely encountered in the energy and process industries. It poses significant optimization challenges due to its non-linear, constrained, and multimodal nature. By applying the proposed IVYPSO algorithm to this problem, we aim to demonstrate its capability to handle real-world design constraints, achieve cost-effective solutions, and maintain robustness in practical optimization tasks. This problem has three decision variables: the length $L$ between two compressor stations; the compression ratio $r$ at the compressor inlet, where $r = P_1/P_2$, with $P_1$ being the pressure leaving the compressor station (psi) and $P_2$ the pressure entering it (psi); and the internal diameter $D$ of the pipeline (inches). The goal is to find the optimal values of $L$, $r$, and $D$ that minimize $C_1$.
In this problem, the total annual cost of the gas transmission system is defined as in Equation (24):
$$\min C_1(x) = 8.61 \times 10^{5}\, x_1^{1/2}\, x_2\, x_3^{-2/3} \left(x_2^{2} - 1\right)^{-1/2} + 3.69 \times 10^{4}\, x_3 - 7.6543 \times 10^{8}\, x_1^{-1} + 7.72 \times 10^{8}\, x_1^{-1} x_2^{0.219},$$
where $x_1 = L$, $x_2 = r$, $x_3 = D$, subject to:
$$10 \le x_1 \le 55,$$
$$1.1 \le x_2 \le 2,$$
$$10 \le x_3 \le 40.$$
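Equation (24) and its box constraints translate directly into code. The objective below follows the Beightler–Phillips formulation as reconstructed here, and the best design reported in Table 7 is used only as an example evaluation point (no claim is made about the resulting cost value):

```python
def gtcd_cost(x1, x2, x3):
    """Total annual cost C1 of Equation (24):
    x1 = L (station spacing), x2 = r (compression ratio), x3 = D (inches)."""
    return (8.61e5 * x1**0.5 * x2 * x3**(-2.0 / 3.0) * (x2**2 - 1.0)**(-0.5)
            + 3.69e4 * x3
            - 7.6543e8 / x1
            + 7.72e8 / x1 * x2**0.219)

def gtcd_feasible(x1, x2, x3):
    """Box constraints of the GTCD problem."""
    return 10.0 <= x1 <= 55.0 and 1.1 <= x2 <= 2.0 and 10.0 <= x3 <= 40.0

# Evaluate at the best design reported in Table 7.
cost = gtcd_cost(24.4960, 1.5867, 20.0)
```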
Table 7 presents a comparison of the results for solving the gas transmission compressor design problem using IVYPSO and other algorithms from the literature. The success rate threshold for this problem was set to 1,677,759.2755. The best result achieved by IVYPSO was $L = 24.4960$, $r = 1.5867$, $D = 20.0000$, with a minimum cost of 1,677,759.2755 and an SR value of 100%, and it required minimal time to complete the task. Table 8 presents a statistical analysis showing that IVYPSO achieved the lowest average cost of 1,677,759.2755. These results demonstrate that compared to other optimization algorithms, IVYPSO delivers a superior solution to the gas transmission compressor design problem.

5.2.2. Three-Bar Truss Design Problem

Figure 7 shows the three-bar truss design problem. The three-bar truss is typically a simple planar truss structure composed of three rods forming a triangular shape. This problem is widely applicable in engineering design, particularly in the field of structural optimization. The design variables typically include the cross-sectional area or dimensions of the rods, with the objective of minimizing the total mass of the truss while ensuring compliance with stress and geometric constraints. Such problems effectively reflect real-world engineering demands for structural safety and material efficiency, making them commonly used as benchmark problems for evaluating the performance of optimization algorithms.
The total mass of the three-bar truss can be expressed by Equation (25):
$$\min f(x) = \left(2\sqrt{2}\, x_1 + x_2\right) \times H,$$
where $x_1 = A_1$, $x_2 = A_2$, subject to:
$$g_1(x) = \frac{\sqrt{2}\, x_1 + x_2}{\sqrt{2}\, x_1^{2} + 2 x_1 x_2}\, P - \sigma \le 0,$$
$$g_2(x) = \frac{x_2}{\sqrt{2}\, x_1^{2} + 2 x_1 x_2}\, P - \sigma \le 0,$$
$$g_3(x) = \frac{P}{x_1 + \sqrt{2}\, x_2} - \sigma \le 0,$$
$$0 \le x_1, x_2 \le 1,$$
where $H = 1000\ \mathrm{mm}$, $P = 2\ \mathrm{kN/cm^2}$, $\sigma = 2\ \mathrm{kN/cm^2}$.
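Equation (25) and constraints g1–g3 are easy to sanity-check in code. In the sketch below, H = 1000 mm is treated as 100 cm so that the mass lands on the same scale as the reported optimum of roughly 263.85 (a unit assumption on our part, consistent with the kN/cm² data):

```python
import math

# Load (kN/cm^2), stress limit (kN/cm^2), bar length (H = 1000 mm as 100 cm).
P, SIGMA, H_CM = 2.0, 2.0, 100.0

def truss_mass(x1, x2):
    """Objective of Equation (25): weight of the three-bar truss."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * H_CM

def truss_constraints(x1, x2):
    """Stress constraints g1..g3; the design is feasible when all are <= 0."""
    denom = math.sqrt(2.0) * x1**2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = P / (x1 + math.sqrt(2.0) * x2) - SIGMA
    return g1, g2, g3

# Best design reported in Table 9.
mass = truss_mass(0.7884, 0.4081)
```

Note that g1 is active (near zero) at the optimum, so evaluating it at the four-decimal rounded design can give a tiny positive residual; g2 and g3 remain clearly satisfied.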
Table 9 presents a comparison of the results for solving the three-bar truss design problem using IVYPSO and other algorithms from the literature. The success rate threshold for this problem was set to 263.8523. The best result obtained by IVYPSO was $A_1 = 0.7884$, $A_2 = 0.4081$, with a minimum cost of 263.8523 and an SR value of 100%, and it required minimal time to complete the task. Table 10 provides a statistical analysis showing that IVYPSO achieved the lowest average cost of 263.8523. These results indicate that compared to other optimization algorithms, IVYPSO provides a better solution for the three-bar truss design problem.

5.2.3. Multiple-Disk Clutch Brake Design Problem

Figure 8 illustrates the multiple-disk clutch brake design problem, a classic engineering optimization problem commonly encountered in automation equipment, mechanical transmission systems, and the automotive industry [51]. This problem involves optimizing the design of the clutch and brake system to minimize the mass of the brake assembly while ensuring high operational efficiency and stability; the stopping time enters as a constraint ($0 \le T \le T_{max}$). Such design problems are highly relevant to real-world engineering scenarios, where the balance between performance, efficiency, and reliability is critical. By addressing this problem, the algorithm's ability to handle practical engineering challenges and improve the overall system design is demonstrated, reflecting its applicability in real-world applications. This problem involves five decision variables: the inner radius $r_i$ in millimeters, outer radius $r_o$ in millimeters, disk thickness $t$ in millimeters, driving force $F$, and number of friction surfaces $Z$.
The mass of the brake system, which serves as the objective function, can be expressed by Equation (26):
$$\min f(x) = \pi \left(x_2^{2} - x_1^{2}\right) x_3 \left(x_5 + 1\right) \rho,$$
where $x_1 = r_i$, $x_2 = r_o$, $x_3 = t$, $x_4 = F$, $x_5 = Z$, subject to:
$$g_1(x) = x_2 - x_1 - \Delta R \ge 0,$$
$$g_2(x) = L_{max} - (Z + 1)(t + \delta) \ge 0,$$
$$g_3(x) = p_{max} - p_{rz} \ge 0,$$
$$g_4(x) = p_{max} V_{sr,max} - p_{rz} V_{sr} \ge 0,$$
$$g_5(x) = V_{sr,max} - V_{sr} \ge 0,$$
$$g_6(x) = M_h - s\, M_s \ge 0,$$
$$g_7(x) = T \ge 0,$$
$$g_8(x) = T_{max} - T \ge 0,$$
$$60 \le x_1 \le 80\ \mathrm{mm}, \quad 90 \le x_2 \le 110\ \mathrm{mm}, \quad 1.5 \le x_3 \le 3\ \mathrm{mm}, \quad 0 \le x_4 \le 1000\ \mathrm{N}, \quad 2 \le x_5 \le 9,$$
where:
$$\rho = 0.0000078\ \mathrm{kg/mm^3}, \quad p_{max} = 1\ \mathrm{MPa}, \quad \mu = 0.5, \quad V_{sr,max} = 10\ \mathrm{m/s}, \quad s = 1.5, \quad T_{max} = 15\ \mathrm{s},$$
$$n = 250\ \mathrm{rpm}, \quad M_f = 3\ \mathrm{N\,m}, \quad I_z = 55\ \mathrm{kg\,m^2}, \quad \delta = 0.5\ \mathrm{mm}, \quad \Delta R = 20\ \mathrm{mm}, \quad L_{max} = 30\ \mathrm{mm},$$
$$M_h = \frac{2}{3}\, \mu\, x_4 x_5\, \frac{x_2^{3} - x_1^{3}}{x_2^{2} - x_1^{2}}\ \mathrm{N\,mm}, \quad \omega = \frac{\pi n}{30}\ \mathrm{rad/s}, \quad R_{sr} = \frac{2}{3}\, \frac{x_2^{3} - x_1^{3}}{x_2^{2} - x_1^{2}}\ \mathrm{mm}, \quad A = \pi \left(x_2^{2} - x_1^{2}\right)\ \mathrm{mm^2},$$
$$M_s = 40\ \mathrm{N\,m}, \quad p_{rz} = \frac{x_4}{A}\ \mathrm{N/mm^2}, \quad V_{sr} = \frac{\pi R_{sr} n}{30}\ \mathrm{mm/s}.$$
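The objective of Equation (26) can be checked numerically against the best design reported in Table 11: with the friction-material density ρ = 7.8 × 10⁻⁶ kg/mm³ from the problem data, it reproduces the reported minimum of about 0.2352.

```python
import math

RHO = 7.8e-6  # friction-material density rho (kg/mm^3), from the problem data

def clutch_mass(ri, ro, t, F, Z):
    """Objective of Equation (26): mass of the multiple-disk clutch brake.

    ri, ro, t are in mm; F (N) and Z are part of the design vector but do
    not enter the objective itself (they appear in the constraints)."""
    return math.pi * (ro**2 - ri**2) * t * (Z + 1) * RHO

# Best design reported in Table 11: ri = 70, ro = 90, t = 1, F = 1000, Z = 2.
best_mass = clutch_mass(70.0, 90.0, 1.0, 1000.0, 2)
```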
Table 11 presents a comparison of the results for solving the multiple-disk clutch brake design problem using IVYPSO and other algorithms from the literature. The success rate threshold for this problem was set to 0.2352. The best result obtained by IVYPSO was $r_i = 70$, $r_o = 90$, $t = 1$, $F = 1000$, $Z = 2$, with the lowest cost of 0.2352 and an SR value of 100%, and it required minimal time to complete the task. Table 12 presents a statistical analysis showing that IVYPSO achieved the lowest average cost of 0.2352. These results indicate that compared to other optimization algorithms, IVYPSO provides a better solution for the multiple-disk clutch brake design problem.

6. Conclusions

The proposed IVYPSO algorithm effectively enhances both the global exploration and local exploitation capabilities within complex search spaces by integrating ivy growth variables and dynamic control factors into the classical PSO framework. The ivy growth variables facilitate diverse search behaviors that prevent premature convergence, while the dynamic control factors adaptively balance the emphasis between global and local searches based on fitness evaluations.
Extensive experiments on 26 benchmark functions and three challenging engineering optimization problems—including the multiple-disk clutch brake design and gas transmission compressor design—demonstrated that IVYPSO achieves superior solution quality with rapid convergence. Comparative analyses against eleven state-of-the-art optimization algorithms further confirmed the algorithm's superiority in convergence accuracy, stability, and robustness, particularly in handling multi-modal and high-dimensional problems where the global optimization ability is critical.
This study lays a foundation for further research in several directions. Our future work will focus on extending IVYPSO to more complex engineering problems and multi-objective optimization scenarios, especially those involving dynamic, multi-modal, and high-dimensional environments. Moreover, the incorporation of adaptive parameter tuning strategies is planned to optimize the design of ivy growth variables and dynamic control factors, thereby improving the algorithm’s flexibility and performance across diverse problem domains.
In addition, hybridizing IVYPSO with other intelligent optimization techniques, such as genetic algorithms, differential evolution, and simulated annealing, to build multi-fusion frameworks may offer significant performance enhancements. This would leverage complementary algorithmic strengths and further improve the solution quality and convergence behavior.

Author Contributions

Conceptualization, K.Z. and F.Y.; methodology, K.Z. and Y.J.; software, Y.J. and Z.Z.; validation, Y.J. and Z.Z.; formal analysis, F.Y.; investigation, K.Z. and Z.Z.; data curation, Y.J.; writing—original draft preparation, K.Z.; writing—review and editing, Z.M. and Y.P.; supervision, F.Y.; project administration, F.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used and/or analyzed during the current study are publicly available and can be accessed without any restrictions. If needed, they can also be obtained by contacting the corresponding author.

Acknowledgments

This work was supported by equipment funded through the Intelligent Connected New Energy Vehicle Teaching System project of Chongqing University of Technology, under the national initiative “Promote Large-Scale Equipment Renewals and Trade-Ins of Consumer Goods”.

Conflicts of Interest

The authors declare no conflicts of interest and no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Sattar, D.; Salim, R. A smart metaheuristic algorithm for solving engineering problems. Eng. Comput. 2021, 37, 2389–2417. [Google Scholar] [CrossRef]
  2. Abualigah, L.; Elaziz, M.A.; Khasawneh, A.M.; Alshinwan, M.; Ibrahim, R.A.; Al-Qaness, M.A.A.; Mirjalili, S.; Sumari, P.; Gandomi, A.H. Meta-heuristic optimization algorithms for solving real-world mechanical engineering design problems: A comprehensive survey, applications, comparative analysis, and results. Neural Comput. Appl. 2022, 34, 4081–4110. [Google Scholar] [CrossRef]
  3. Talbi, E.G. Machine learning into metaheuristics: A survey and taxonomy. ACM Comput. Surv. (CSUR) 2021, 54, 1–32. [Google Scholar] [CrossRef]
  4. Kaveh, M.; Mesgari, M.S. Application of meta-heuristic algorithms for training neural networks and deep learning architectures: A comprehensive review. Neural Process. Lett. 2023, 55, 4519–4622. [Google Scholar] [CrossRef] [PubMed]
  5. Katebi, J.; Shoaei-Parchin, M.; Shariati, M.; Trung, N.T.; Khorami, M. Developed comparative analysis of metaheuristic optimization algorithms for optimal active control of structures. Eng. Comput. 2020, 36, 1539–1558. [Google Scholar] [CrossRef]
  6. Joseph, S.B.; Dada, E.G.; Abidemi, A.; Oyewola, D.O.; Khammas, B.M. Metaheuristic algorithms for PID controller parameters tuning: Review, approaches and open problems. Heliyon 2022, 8, e09399. [Google Scholar] [CrossRef]
  7. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar] [CrossRef]
  8. Abdel-Basset, M.; Mohamed, R.; Elkomy, O.M.; Abouhawwash, M. Recent metaheuristic algorithms with genetic operators for high-dimensional knapsack instances: A comparative study. Comput. Ind. Eng. 2022, 166, 107974. [Google Scholar] [CrossRef]
  9. Keivanian, F.; Chiong, R. A novel hybrid fuzzy–metaheuristic approach for multimodal single and multi-objective optimization problems. Expert Syst. Appl. 2022, 195, 116199. [Google Scholar] [CrossRef]
  10. Huang, Y.; Lu, S.; Liu, Q.; Han, T.; Li, T. GOHBA: Improved Honey Badger Algorithm for Global Optimization. Biomimetics 2025, 10, 92. [Google Scholar] [CrossRef]
  11. Zhu, X.; Zhang, J.; Jia, C.; Liu, Y.; Fu, M. A Hybrid Black-Winged Kite Algorithm with PSO and Differential Mutation for Superior Global Optimization and Engineering Applications. Biomimetics 2025, 10, 236. [Google Scholar] [CrossRef] [PubMed]
  12. Choi, K.; Jang, D.-H.; Kang, S.-I.; Lee, J.-H.; Chung, T.-K.; Kim, H.-S. Hybrid algorithm combing genetic algorithm with evolution strategy for antenna design. IEEE Trans. Magn. 2015, 52, 7209004. [Google Scholar] [CrossRef]
  13. Yu, X.; Jiang, N.; Wang, X.; Li, M. A hybrid algorithm based on grey wolf optimizer and differential evolution for UAV path planning. Expert Syst. Appl. 2023, 215, 119327. [Google Scholar] [CrossRef]
  14. Peng, Y.; Wang, Y.; Hu, F.; He, M.; Mao, Z.; Huang, X.; Ding, J. Predictive modeling of flexible EHD pumps using Kolmogorov–Arnold Networks. Biomim. Intell. Robot. 2024, 4, 100184. [Google Scholar] [CrossRef]
  15. Mao, Z.; Bai, X.; Peng, Y.; Shen, Y. Design, modeling, and characteristics of ring-shaped robot actuated by functional fluid. J. Intell. Mater. Syst. Struct. 2024, 35, 1459–1470. [Google Scholar] [CrossRef]
  16. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  17. Papazoglou, G.; Biskas, P. Review and comparison of genetic algorithm and particle swarm optimization in the optimal power flow problem. Energies 2023, 16, 1152. [Google Scholar] [CrossRef]
  18. Thangaraj, R.; Pant, M.; Abraham, A.; Bouvry, P. Particle swarm optimization: Hybridization perspectives and experimental illustrations. Appl. Math. Comput. 2011, 217, 5208–5226. [Google Scholar] [CrossRef]
  19. Sun, J.; Xu, W.; Feng, B. Adaptive parameter control for quantum-behaved particle swarm optimization on individual level. In Proceedings of the 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, HI, USA, 12 October 2015; IEEE: Piscataway, NJ, USA, 2005; Volume 4, pp. 3049–3054. [Google Scholar]
  20. Song, Y.; Liu, Y.; Chen, H.; Deng, W. A multi-strategy adaptive particle swarm optimization algorithm for solving optimization problem. Electronics 2023, 12, 491. [Google Scholar] [CrossRef]
  21. Ghasemi, M.; Zare, M.; Trojovský, P.; Rao, R.V.; Trojovská, E.; Kandasamy, V. Optimization based on the smart behavior of plants with its engineering applications: Ivy algorithm. Knowl. Based Syst. 2024, 295, 111850. [Google Scholar] [CrossRef]
  22. Kahraman, H.T.; Aras, S.; Gedikli, E. Fitness-distance balance (FDB): A new selection method for meta-heuristic search algorithms. Knowl. Based Syst. 2020, 190, 105169. [Google Scholar] [CrossRef]
  23. Duman, S.; Kahraman, H.T.; Kati, M. Economical operation of modern power grids incorporating uncertainties of renewable energy sources and load demand using the adaptive fitness-distance balance-based stochastic fractal search algorithm. Eng. Appl. Artif. Intell. 2023, 117, 105501. [Google Scholar] [CrossRef]
  24. Kahraman, H.T.; Bakir, H.; Duman, S.; Katı, M.; Aras, S.; Guvenc, U. Dynamic FDB selection method and its application: Modeling and optimizing of directional overcurrent relays coordination. Appl. Intell. 2022, 52, 4873–4908. [Google Scholar] [CrossRef]
  25. Ozkaya, B.; Kahraman, H.T.; Duman, S.; Guvenc, U. Fitness-Distance-Constraint (FDC) based guide selection method for constrained optimization problems. Appl. Soft Comput. 2023, 144, 110479. [Google Scholar] [CrossRef]
  26. Yildirim, I.; Bozkurt, M.H.; Kahraman, H.T.; Aras, S. Dental X-Ray image enhancement using a novel evolutionary optimization algorithm. Eng. Appl. Artif. Intell. 2025, 142, 109879. [Google Scholar] [CrossRef]
  27. Kahraman, H.T.; Hassan, M.H.; Katı, M.; Tostado-Véliz, M.; Duman, S.; Kamel, S. Dynamic-fitness-distance-balance stochastic fractal search (dFDB-SFS algorithm): An effective metaheuristic for global optimization and accurate photovoltaic modeling. Soft Comput. 2024, 28, 6447–6474. [Google Scholar] [CrossRef]
  28. Bakır, H.; Duman, S.; Guvenc, U.; Kahraman, H.T. Improved adaptive gaining-sharing knowledge algorithm with FDB-based guiding mechanism for optimization of optimal reactive power flow problem. Electr. Eng. 2023, 105, 3121–3160. [Google Scholar] [CrossRef]
  29. Hamad, R.K.; Rashid, T.A. GOOSE algorithm: A powerful optimization tool for real-world engineering challenges and beyond. Evol. Syst. 2024, 15, 1249–1274. [Google Scholar] [CrossRef]
  30. Zhang, M.; Long, D.; Qin, T.; Yang, J. A chaotic hybrid butterfly optimization algorithm with particle swarm optimization for high-dimensional optimization problems. Symmetry 2020, 12, 1800. [Google Scholar] [CrossRef]
  31. Kahraman, H.T.; Katı, M.; Aras, S.; Taşci, D.A. Development of the Natural Survivor Method (NSM) for designing an updating mechanism in metaheuristic search algorithms. Eng. Appl. Artif. Intell. 2023, 122, 106121. [Google Scholar] [CrossRef]
  32. Öztürk, H.T.; Kahraman, H.T. Metaheuristic search algorithms in Frequency Constrained Truss Problems: Four improved evolutionary algorithms, optimal solutions and stability analysis. Appl. Soft Comput. 2025, 171, 112854. [Google Scholar] [CrossRef]
  33. Ouyang, H.; Li, W.; Gao, F.; Huang, K.; Xiao, P. Research on Fault Diagnosis of Ship Diesel Generator System Based on IVY-RF. Energies 2024, 17, 5799. [Google Scholar] [CrossRef]
  34. Jamil, M.; Yang, X.S. A literature survey of benchmark functions for global optimisation problems. Int. J. Math. Model. Numer. Optim. 2013, 4, 150–194. [Google Scholar] [CrossRef]
  35. Zhan, Z.-H.; Shi, L.; Tan, K.C.; Zhang, J. A survey on evolutionary computation for complex continuous optimization. Artif. Intell. Rev. 2022, 55, 59–110. [Google Scholar] [CrossRef]
  36. Cai, T.; Zhang, S.; Ye, Z.; Zhou, W.; Wang, M.; He, Q.; Chen, Z.; Bai, W. Cooperative metaheuristic algorithm for global optimization and engineering problems inspired by heterosis theory. Sci. Rep. 2024, 14, 28876. [Google Scholar] [CrossRef]
  37. Mao, Z.; Hosoya, N.; Maeda, S. Flexible electrohydrodynamic fluid-driven valveless water pump via immiscible interface. Cyborg Bionic Syst. 2024, 5, 0091. [Google Scholar] [CrossRef]
  38. Lau, S.L.H.; Lim, J.; Chong, E.K.P.; Wang, X. Single-pixel image reconstruction based on block compressive sensing and convolutional neural network. Int. J. Hydromechatronics 2023, 6, 258–273. [Google Scholar] [CrossRef]
  39. Verma, H.; Siruvuri, S.V.; Budarapu, P.R. A machine learning-based image classification of silicon solar cells. Int. J. Hydromechatronics 2024, 7, 49–66. [Google Scholar] [CrossRef]
  40. Alawi, O.A.; Kamar, H.M.; Shawkat, M.M.; Al Ani, M.M.; Mohammed, H.A.; Homod, R.Z.; Wahid, M.A. Artificial intelligence-based viscosity prediction of polyalphaolefin-boron nitride nanofluids. Int. J. Hydromechatro. 2024, 7, 89–112. [Google Scholar] [CrossRef]
  41. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  42. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  43. Yuan, F.; Huang, X.; Zheng, L.; Wang, L.; Wang, Y.; Yan, X.; Gu, S.; Peng, Y. The Evolution and Optimization Strategies of a PBFT Consensus Algorithm for Consortium Blockchains. Information 2025, 16, 268. [Google Scholar] [CrossRef]
  44. Zhang, C.; Chen, J.; Li, J.; Peng, Y.; Mao, Z. Large language models for human–robot interaction: A review. Biomim. Intell. Robot. 2023, 3, 100131. [Google Scholar] [CrossRef]
  45. Peng, Y.; Yang, X.; Li, D.; Ma, Z.; Liu, Z.; Bai, X.; Mao, Z. Predicting flow status of a flexible rectifier using cognitive computing. Expert Syst. Appl. 2025, 264, 125878. [Google Scholar] [CrossRef]
  46. Bai, X.; Peng, Y.; Li, D.; Liu, Z.; Mao, Z. Novel soft robotic finger model driven by electrohydrodynamic (EHD) pump. J. Zhejiang Univ. Sci. A 2024, 25, 596–604. [Google Scholar] [CrossRef]
  47. Mao, Z.; Peng, Y.; Hu, C.; Ding, R.; Yamada, Y.; Maeda, S. Soft computing-based predictive modeling of flexible electrohydrodynamic pumps. Biomim. Intell. Robot. 2023, 3, 100114. [Google Scholar] [CrossRef]
  48. Peng, Y.; Sakai, Y.; Funabora, Y.; Yokoe, K.; Aoyama, T.; Doki, S. Funabot-Sleeve: A Wearable Device Employing McKibben Artificial Muscles for Haptic Sensation in the Forearm. IEEE Robot. Autom. Lett. 2025, 10, 1944–1951. [Google Scholar] [CrossRef]
  49. Mao, Z.; Kobayashi, R.; Nabae, H.; Suzumori, K. Multimodal strain sensing system for shape recognition of tensegrity structures by combining traditional regression and deep learning approaches. IEEE Robot. Autom. Lett. 2024, 9, 10050–10056. [Google Scholar] [CrossRef]
  50. Dai, L.; Zhang, L.; Chen, Z. GrS Algorithm for Solving Gas Transmission Compressor Design Problem. In Proceedings of the 2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), Niagara Falls, ON, Canada, 17–20 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 842–846. [Google Scholar]
  51. Dhiman, G.; Kaur, A. Spotted hyena optimizer for solving engineering design problems. In Proceedings of the 2017 International Conference on Machine Learning and Data Science (MLDS), Noida, India, 14–15 December 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 114–119. [Google Scholar]
Figure 1. Flowchart of PSO algorithm.
Figure 2. The $i$th member of the population, $I_i$, chooses its closest, most vital neighbor $I_{ii}$.
Figure 3. Flowchart of the ivy algorithm.
Figure 4. Flowchart of IVYPSO algorithm.
Figure 5. The convergence behaviors of IVYPSO and other algorithms for various test functions.
Figure 6. A gas pipeline transmission system for the GTCD problem.
Figure 7. Schematic diagram of the three-bar truss design.
Figure 8. Schematic view of multiple-disk clutch brake design problem.
Table 1. Details of the 26 test functions.
| S/N | Function Name | Formula | Category | Range | $f_{min}^{*}$ |
|---|---|---|---|---|---|
| F1 | Sphere | $f_1(x)=\sum_{i=1}^{dim} x_i^2$ | Unimodal | [−100, 100] | 0 |
| F2 | Schwefel 2.22 | $f_2(x)=\sum_{i=1}^{dim}\lvert x_i\rvert+\prod_{i=1}^{dim}\lvert x_i\rvert$ | Unimodal | [−10, 10] | 0 |
| F3 | Schwefel 1.2 | $f_3(x)=\sum_{i=1}^{dim}\big(\sum_{j=1}^{i}x_j\big)^2$ | Unimodal | [−100, 100] | 0 |
| F4 | Schwefel 2.21 | $f_4(x)=\max_i\{\lvert x_i\rvert\},\ 1\le i\le dim$ | Unimodal | [−100, 100] | 0 |
| F5 | Step | $f_5(x)=\sum_{i=1}^{dim}(\lfloor x_i+0.5\rfloor)^2$ | Unimodal | [−100, 100] | 0 |
| F6 | Quartic | $f_6(x)=\sum_{i=1}^{dim} i\,x_i^4+rand[0,1)$ | Unimodal | [−1.28, 1.28] | 0 |
| F7 | Exponential | $f_7(x)=\sum_{i=1}^{dim}\big(e^{x_i}-x_i\big)$ | Unimodal | [−10, 10] | 0 |
| F8 | Sum power | $f_8(x)=\sum_{i=1}^{dim}\lvert x_i\rvert^{\,i+1}$ | Unimodal | [−1, 1] | 0 |
| F9 | Sum square | $f_9(x)=\sum_{i=1}^{dim} i\,x_i^2$ | Unimodal | [−10, 10] | 0 |
| F10 | Rosenbrock | $f_{10}(x)=\sum_{i=1}^{dim-1}\big[100(x_{i+1}-x_i^2)^2+(x_i-1)^2\big]$ | Unimodal | [−5, 10] | 0 |
| F11 | Zakharov | $f_{11}(x)=\sum_{i=1}^{dim}x_i^2+\big(\sum_{i=1}^{dim}0.5\,i\,x_i\big)^2+\big(\sum_{i=1}^{dim}0.5\,i\,x_i\big)^4$ | Unimodal | [−5, 10] | 0 |
| F12 | Trid | $f_{12}(x)=\sum_{i=1}^{dim}(x_i-1)^2-\sum_{i=2}^{dim}x_i x_{i-1}$ | Unimodal | [−5, 10] | 0 |
| F13 | Elliptic | $f_{13}(x)=\sum_{i=1}^{dim}(10^6)^{(i-1)/(dim-1)}x_i^2$ | Unimodal | [−100, 100] | 0 |
| F14 | Cigar | $f_{14}(x)=x_1^2+10^6\sum_{i=2}^{dim}x_i^2$ | Unimodal | [−100, 100] | 0 |
| F15 | Rastrigin | $f_{15}(x)=\sum_{i=1}^{dim}\big[x_i^2-10\cos(2\pi x_i)+10\big]$ | Fixed | [−5.12, 5.12] | 0 |
| F16 | NCRastrigin | $f_{16}(x)=\sum_{i=1}^{dim}\big[y_i^2-10\cos(2\pi y_i)+10\big],\ y_i=\begin{cases}x_i,&\lvert x_i\rvert<0.5\\\mathrm{round}(2x_i)/2,&\text{otherwise}\end{cases}$ | Multimodal | [−5.12, 5.12] | 0 |
| F17 | Ackley | $f_{17}(x)=-20\exp\big(-0.2\sqrt{\tfrac{1}{dim}\sum_{i=1}^{dim}x_i^2}\big)-\exp\big(\tfrac{1}{dim}\sum_{i=1}^{dim}\cos(2\pi x_i)\big)+20+e$ | Multimodal | [−50, 50] | 0 |
| F18 | Griewank | $f_{18}(x)=1+\tfrac{1}{4000}\sum_{i=1}^{dim}x_i^2-\prod_{i=1}^{dim}\cos\big(\tfrac{x_i}{\sqrt{i}}\big)$ | Multimodal | [−600, 600] | 0 |
| F19 | Alpine | $f_{19}(x)=\sum_{i=1}^{dim}\lvert x_i\sin(x_i)+0.1x_i\rvert$ | Fixed | [−10, 10] | 0 |
| F20 | Penalized 1 | $f_{20}(x)=\tfrac{\pi}{dim}\big\{10\sin^2(\pi y_1)+\sum_{i=1}^{dim-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})]+(y_{dim}-1)^2\big\}+\sum_{i=1}^{dim}u(x_i,10,100,4)$, $y_i=1+\tfrac{x_i+1}{4}$, $u(x_i,a,k,m)=\begin{cases}k(x_i-a)^m,&x_i>a\\0,&-a\le x_i\le a\\k(-x_i-a)^m,&x_i<-a\end{cases}$ | Multimodal | [−100, 100] | 0 |
| F21 | Penalized 2 | $f_{21}(x)=0.1\big\{\sin^2(3\pi x_1)+\sum_{i=1}^{dim-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_{dim}-1)^2[1+\sin^2(2\pi x_{dim})]\big\}+\sum_{i=1}^{dim}u(x_i,5,100,4)$ | Multimodal | [−100, 100] | 0 |
| F22 | Schwefel | $f_{22}(x)=-\sum_{i=1}^{dim}x_i\sin\big(\sqrt{\lvert x_i\rvert}\big)$ | Fixed | [−100, 100] | 0 |
| F23 | Lévy | $f_{23}(x)=\sin^2(3\pi x_1)+\sum_{i=1}^{dim-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_{dim}-1)^2[1+\sin^2(2\pi x_{dim})]$ | Multimodal | [−10, 10] | 0 |
| F24 | Weierstrass | $f_{24}(x)=\sum_{i=1}^{dim}\sum_{k=0}^{k_{max}}a^k\cos\big(2\pi b^k(x_i+0.5)\big)-dim\sum_{k=0}^{k_{max}}a^k\cos(\pi b^k)$, $a=0.5$, $b=3$, $k_{max}=20$ | Multimodal | [−0.5, 0.5] | 0 |
| F25 | Solomon | $f_{25}(x)=1-\cos\big(2\pi\sqrt{\sum_{i=1}^{dim}x_i^2}\big)+0.1\sqrt{\sum_{i=1}^{dim}x_i^2}$ | Fixed | [−100, 100] | 0 |
| F26 | Bohachevsky | $f_{26}(x)=\sum_{i=1}^{dim-1}\big[x_i^2+2x_{i+1}^2-0.3\cos(3\pi x_i)-0.4\cos(4\pi x_{i+1})+0.7\big]$ | Fixed | [−10, 10] | 0 |
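A few of the benchmarks above translate directly into code. The following minimal Python sketch implements four representative functions (Sphere, Rastrigin, Ackley, Griewank) from the table's definitions; it is illustrative only and not the authors' test harness:

```python
import math

def sphere(x):
    # F1: sum of squares, global minimum 0 at the origin
    return sum(v * v for v in x)

def rastrigin(x):
    # F15: highly multimodal, global minimum 0 at the origin
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):
    # F17: nearly flat outer region with a central funnel, minimum 0 at the origin
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / d
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def griewank(x):
    # F18: the cosine product couples the variables, minimum 0 at the origin
    s = sum(v * v for v in x) / 4000
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return 1 + s - p
```

Each function evaluates to (numerically) zero at the origin, matching the $f_{min}^{*}$ column.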
Table 2. Parameter settings of IVYPSO and other algorithms.
| Algorithm | Parameter settings |
|---|---|
| ALL | Max iterations = 500, agents = 30, runs = 30 |
| IVYPSO | C1 = C2 = 1.5, w = 0.7, beta1 ∈ [1, 1.5], GV ∈ [0, 1] |
| PSO | C1 = C2 = 2, V ∈ (−6, 6), w ∈ (0.2, 0.9) |
| IVY | beta1 ∈ [1, 1.5], GV ∈ [0, 1] |
| BOA | a = 0.1, p = 0.6, c0 = 0.01 |
| WOA | a: linear decrease from 2 to 0, C ∈ [0, 2], a2: linear decrease from −1 to −2 |
| GOOSE | S_W_min = 5, S_W_max = 25, coe_min = 0.17 |
| HPSOBOA | w = 0.7, a ∈ (0.1, 0.3), V ∈ (−1, 1), C1 = C2 = 0.5, c0 = 0.01, p = 0.6 |
| dFDB_LSHADE | p_best_rate = 0.11, arc_rate = 1.4, memory_size = 5, memory_pos = 1 |
| FDC_AGDE | NW = [0.5, 0.5], Cr_All = zeros(1, 2) |
| NSM_BO | pxgm_initial = 0.03, scab = 1.25, scsb = 1.3, rcpp = 0.0035, tsgs_factor_max = 0.05 |
| dFDB_SFS | d = abs(X(A,:) − X(B,:)) |
| FDB_AGSK | l = rand() × 2 − 1, b = 1 |
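For context, the PSO-side parameters above (w = 0.7 and C1 = C2 = 1.5 for IVYPSO) enter through the standard PSO velocity update. The sketch below is a minimal, self-contained gbest PSO loop, not the authors' implementation; parameter defaults follow the table:

```python
import random

def pso_minimize(f, dim, bounds, w=0.7, c1=1.5, c2=1.5,
                 agents=30, iters=200, seed=1):
    """Minimal gbest PSO: velocity update v = w*v + c1*r1*(pbest-x) + c2*r2*(gbest-x)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(agents)]
    V = [[0.0] * dim for _ in range(agents)]
    P = [row[:] for row in X]                      # personal best positions
    pf = [f(p) for p in P]                         # personal best fitnesses
    g = min(range(agents), key=lambda i: pf[i])
    gbest, gfit = P[g][:], pf[g]                   # global best
    for _ in range(iters):
        for i in range(agents):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                # move and clamp to the search range
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gfit:
                    gbest, gfit = X[i][:], fx
    return gbest, gfit
```

On a low-dimensional Sphere function this configuration converges rapidly toward zero, which is the behavior the hybrid relies on for its global-search phase.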
Table 3. A comparison of IVYPSO’s average fitness values and standard deviation with 10 other algorithms for various test functions.
| Func | Metric | IVYPSO | PSO | IVY | BOA | WOA | GOOSE | HPSOBOA | dFDB_LSHADE | FDC_AGDE | NSM_BO | dFDB_SFS | FDB_AGSK |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Avg | 0 | 2.5159 | 5.6148 × 10^−160 | 7.5380 × 10^−11 | 4.4643 × 10^−74 | 28.045 | 1.6444 × 10^−291 | 4954.6436 | 6.1553 × 10^−3 | 0.5044 | 27.5647 | 6.0516 × 10^−96 |
| | Std | 0 | 1.1071 | 8.5891 × 10^−160 | 7.0143 × 10^−12 | 2.2966 × 10^−73 | 103.4571 | 0 | 1398.5169 | 3.6651 × 10^−3 | 1.9821 | 0.3053 | 3.3035 × 10^−95 |
| | Rank | 1 | 9 | 3 | 6 | 5 | 11 | 2 | 12 | 7 | 8 | 10 | 4 |
| F2 | Avg | 0 | 4.5489 | 0 | 2.3029 × 10^−8 | 2.5328 × 10^−50 | 1359.2299 | 1.0782 × 10^−145 | 43.1042 | 1.4138 × 10^−2 | 4.2033 × 10^−3 | 4.5363 | 3.9126 × 10^−60 |
| | Std | 0 | 1.1878 | 0 | 6.7460 × 10^−9 | 1.3725 × 10^−49 | 6860.7534 | 7.0216 × 10^−146 | 9.4746 | 3.4474 × 10^−3 | 8.0123 × 10^−3 | 1.1547 | 2.1335 × 10^−59 |
| | Rank | 1 | 10 | 1 | 6 | 5 | 12 | 3 | 11 | 8 | 7 | 9 | 4 |
| F3 | Avg | 0 | 82.4665 | 5.6472 × 10^−85 | 5.2628 × 10^−11 | 422.43 | 2.43 | 3.3155 × 10^−292 | 188.7533 | 8.4739 | 3.6188 | 174.9405 | 419.0353 |
| | Std | 0 | 24.2503 | 4.3479 × 10^−86 | 6.0949 × 10^−12 | 156.2203 | 0.8986 | 0 | 78.0896 | 13.0659 | 3.9852 | 48.3456 | 128.6289 |
| | Rank | 1 | 8 | 3 | 4 | 12 | 5 | 2 | 10 | 7 | 6 | 9 | 11 |
| F4 | Avg | 0 | 1.9179 | 0 | 2.6176 × 10^−8 | 3.8852 | 0.2502 | 5.4053 × 10^−147 | 4.4093 | 1.2657 | 2.2818 | 1.1509 | 5.7175 |
| | Std | 0 | 0.32 | 0 | 2.4363 × 10^−9 | 2.8659 | 0.2304 | 3.3527 × 10^−148 | 0.744 | 0.1467 | 0.4925 | 0.2463 | 3.5226 |
| | Rank | 1 | 8 | 1 | 4 | 10 | 5 | 3 | 11 | 7 | 9 | 6 | 12 |
| F5 | Avg | 0.241 | 2.2492 | 0.4888 | 5.3766 | 9.8934 × 10^−2 | 0.0114 | 0.0327 | 53.3583 | 6.2779 × 10^−5 | 3.2036 × 10^−3 | 0.2562 | 0.5742 |
| | Std | 0.1957 | 1.1515 | 0.4037 | 0.4651 | 7.3668 × 10^−2 | 3.7831 × 10^−3 | 2.3514 × 10^−2 | 16.69 | 3.1763 × 10^−5 | 7.9615 × 10^−3 | 8.6912 × 10^−2 | 0.5068 |
| | Rank | 6 | 10 | 8 | 11 | 5 | 3 | 4 | 12 | 1 | 2 | 7 | 9 |
| F6 | Avg | 6.7137 × 10^−5 | 14.4701 | 6.7440 × 10^−5 | 1.9729 × 10^−3 | 2.5420 × 10^−3 | 0.1336 | 1.1522 × 10^−4 | 1.9443 | 6.3089 × 10^−2 | 0.0984 | 7.7676 × 10^−2 | 1.3547 × 10^−3 |
| | Std | 5.9831 × 10^−5 | 10.0623 | 6.4272 × 10^−5 | 8.4663 × 10^−4 | 3.0980 × 10^−3 | 4.2711 × 10^−2 | 8.8115 × 10^−5 | 0.8096 | 1.4809 × 10^−2 | 3.9632 × 10^−2 | 2.4921 × 10^−2 | 2.5808 × 10^−3 |
| | Rank | 1 | 12 | 2 | 5 | 6 | 10 | 3 | 11 | 7 | 9 | 8 | 4 |
| F7 | Avg | 4.8103 × 10^−15 | 0 | 8.7899 × 10^−31 | 6.3334 × 10^−11 | 7.1751 × 10^−66 | 1.2216 × 10^−65 | 1.7703 × 10^−62 | 1.6598 × 10^−48 | 2.5562 × 10^−54 | 7.1751 × 10^−66 | 2.0229 × 10^−65 | 7.1751 × 10^−66 |
| | Std | 1.7831 × 10^−14 | 0 | 4.0250 × 10^−30 | 2.0811 × 10^−10 | 3.2167 × 10^−81 | 1.6721 × 10^−65 | 8.7838 × 10^−78 | 9.0229 × 10^−48 | 1.3999 × 10^−53 | 3.2167 × 10^−81 | 1.2398 × 10^−65 | 3.2167 × 10^−81 |
| | Rank | 1 | 11 | 10 | 12 | 2 | 5 | 7 | 9 | 8 | 3 | 6 | 4 |
| F8 | Avg | 0 | 0.183 | 0 | 9.7665 × 10^−14 | 3.5039 × 10^−104 | 1.7042 × 10^−5 | 1.2449 × 10^−295 | 7.6028 × 10^−4 | 9.4531 × 10^−20 | 2.5527 × 10^−20 | 1.4060 × 10^−10 | 4.4055 × 10^−153 |
| | Std | 0 | 0.1684 | 0 | 6.2767 × 10^−14 | 1.9152 × 10^−103 | 1.1127 × 10^−5 | 0 | 1.5541 × 10^−3 | 3.4998 × 10^−19 | 8.9169 × 10^−20 | 3.8320 × 10^−10 | 2.3589 × 10^−152 |
| | Rank | 1 | 12 | 1 | 8 | 5 | 10 | 3 | 11 | 7 | 6 | 9 | 4 |
| F9 | Avg | 0 | 26.7969 | 4.3686 × 10^−95 | 6.8638 × 10^−11 | 2.6339 × 10^−73 | 0.8848 | 2.6080 × 10^−291 | 658.4361 | 6.9232 × 10^−4 | 0.123 | 2.9677 | 1.3465 × 10^−94 |
| | Std | 0 | 14.5494 | 0 | 5.7608 × 10^−12 | 1.4319 × 10^−72 | 0.7549 | 0 | 209.486 | 4.2068 × 10^−4 | 0.4863 | 0.9921 | 5.1291 × 10^−94 |
| | Rank | 1 | 11 | 3 | 6 | 5 | 9 | 2 | 12 | 7 | 8 | 10 | 4 |
| F10 | Avg | 26.9025 | 914.4537 | 27.6343 | 28.9104 | 27.9035 | 86.349 | 28.7569 | 45,111.5779 | 60.6339 | 87.2424 | 162.7128 | 28.7208 |
| | Std | 0.8425 | 438.8139 | 0.2558 | 2.3631 × 10^−2 | 0.4628 | 68.1133 | 4.3639 × 10^−2 | 29,206.2521 | 43.6974 | 53.4909 | 63.2633 | 9.4422 × 10^−2 |
| | Rank | 1 | 11 | 2 | 6 | 3 | 8 | 5 | 12 | 7 | 9 | 10 | 4 |
| F11 | Avg | 0 | 111.8508 | 0 | 6.6299 × 10^−11 | 7.0248 × 10^−75 | 0.1491 | 1.5401 × 10^−291 | 679.9458 | 6.8789 × 10^−4 | 9.7422 × 10^−3 | 1.7317 | 2.7436 × 10^−92 |
| | Std | 0 | 52.0599 | 0 | 6.5461 × 10^−12 | 2.5674 × 10^−74 | 3.6406 × 10^−2 | 0 | 337.3768 | 3.7684 × 10^−4 | 1.8844 × 10^−2 | 0.5826 | 1.5027 × 10^−91 |
| | Rank | 1 | 11 | 1 | 6 | 5 | 9 | 3 | 12 | 7 | 8 | 10 | 4 |
| F12 | Avg | 0.6667 | 209.9059 | 0.6667 | 0.9707 | 0.667 | 2.1609 | 0.9959 | 1233.9148 | 0.8234 | 2.9415 | 3.9169 | 0.7818 |
| | Std | 9.4147 × 10^−8 | 128.9306 | 4.7975 × 10^−8 | 9.8902 × 10^−3 | 2.0518 × 10^−4 | 1.3914 | 8.0022 × 10^−4 | 870.8841 | 0.2879 | 1.8042 | 1.7217 | 0.1635 |
| | Rank | 1 | 11 | 2 | 6 | 3 | 8 | 7 | 12 | 5 | 9 | 10 | 4 |
| F13 | Avg | 0 | 1.7597 × 10^−23 | 0 | 1.2000 × 10^−21 | 0 | 3.3757 × 10^−4 | 5.2516 × 10^−2 | 8.6767 | 8.4768 × 10^−37 | 0 | 1.8101 × 10^−32 | 0 |
| | Std | 0 | 9.2757 × 10^−23 | 0 | 3.9643 × 10^−21 | 0 | 5.2807 × 10^−4 | 0.1031 | 7.2354 | 3.6192 × 10^−36 | 0 | 4.8001 × 10^−32 | 0 |
| | Rank | 1 | 8 | 1 | 9 | 1 | 10 | 11 | 12 | 6 | 1 | 7 | 1 |
| F14 | Avg | 0 | 1.2617 × 10^−17 | 0 | 6.5631 × 10^−15 | 2.0239 × 10^−103 | 1997.0369 | 0.0359 | 8.1314 | 2.6242 × 10^−25 | 0 | 2.3759 × 10^−19 | 9.7815 × 10^−151 |
| | Std | 0 | 4.3869 × 10^−17 | 0 | 3.5821 × 10^−14 | 9.0242 × 10^−103 | 2107.4933 | 0.1044 | 23.3506 | 1.4323 × 10^−24 | 0 | 5.3197 × 10^−19 | 5.3566 × 10^−150 |
| | Rank | 1 | 8 | 1 | 9 | 5 | 12 | 10 | 11 | 6 | 1 | 7 | 4 |
| F15 | Avg | 0 | 1.0826 × 10^−23 | 5.3697 × 10^−260 | 1.3855 × 10^−18 | 1.1287 × 10^−128 | 6.8549 × 10^−3 | 3.1054 × 10^−3 | 0.0557 | 1.6634 × 10^−36 | 0 | 1.8751 × 10^−30 | 1.6475 × 10^−192 |
| | Std | 0 | 4.3883 × 10^−23 | 6.3684 × 10^−259 | 4.4188 × 10^−18 | 6.1811 × 10^−128 | 1.1247 × 10^−2 | 6.5997 × 10^−3 | 0.1444 | 9.0070 × 10^−36 | 0 | 5.4607 × 10^−30 | 0 |
| | Rank | 1 | 8 | 3 | 9 | 5 | 11 | 10 | 12 | 6 | 1 | 7 | 4 |
| F16 | Avg | 0 | 169.139 | 0 | 26.6377 | 0 | 150.3139 | 0.3609 | 239.6035 | 46.6483 | 10.3208 | 214.2906 | 1.8948 × 10^−15 |
| | Std | 0 | 31.3264 | 0 | 69.0482 | 0 | 28.026 | 0.8057 | 18.0974 | 7.1298 | 3.6033 | 14.3203 | 1.0378 × 10^−14 |
| | Rank | 1 | 10 | 1 | 7 | 1 | 9 | 5 | 12 | 8 | 6 | 11 | 4 |
| F17 | Avg | 0 | 151.8179 | 0 | 109.6198 | 4.9084 | 194.1359 | 0.2559 | 216.455 | 30.7778 | 8.7194 | 197.9083 | 0 |
| | Std | 0 | 28.435 | 0 | 79.6068 | 26.8846 | 33.0475 | 0.4747 | 23.1197 | 3.1702 | 3.0756 | 19.4055 | 0 |
| | Rank | 1 | 9 | 1 | 8 | 5 | 10 | 4 | 12 | 7 | 6 | 11 | 1 |
| F18 | Avg | 4.4409 × 10^−16 | 2.5742 | 4.4409 × 10^−16 | 2.7789 × 10^−8 | 3.1678 × 10^−15 | 8.1913 | 4.4409 × 10^−16 | 9.3502 | 1.2613 × 10^−2 | 0.8511 | 2.3255 | 3.6415 × 10^−15 |
| | Std | 0 | 0.4745 | 0 | 2.7043 × 10^−9 | 2.5861 × 10^−15 | 7.6507 | 0 | 0.946 | 3.3014 × 10^−3 | 0.6142 | 0.4014 | 2.1580 × 10^−15 |
| | Rank | 1 | 10 | 1 | 6 | 4 | 11 | 1 | 12 | 7 | 8 | 9 | 5 |
| F19 | Avg | 0 | 0.1297 | 8.3647 × 10^−31 | 1.0441 × 10^−11 | 9.3855 × 10^−3 | 240.6283 | 0 | 51.1922 | 5.7896 × 10^−2 | 0.1998 | 1.2171 | 5.3087 × 10^−2 |
| | Std | 0 | 5.8069 × 10^−2 | 9.1238 × 10^−30 | 8.8748 × 10^−12 | 3.6288 × 10^−2 | 219.7319 | 0 | 16.6066 | 5.7987 × 10^−2 | 0.2448 | 9.1563 × 10^−2 | 0.2087 |
| | Rank | 1 | 8 | 3 | 4 | 5 | 12 | 1 | 11 | 7 | 9 | 10 | 6 |
| F20 | Avg | 0 | 5.5514 | 0 | 2.9520 × 10^−9 | 1.0727 × 10^−44 | 6.33 | 4.9382 × 10^−147 | 25.9518 | 3.0941 × 10^−2 | 1.1060 × 10^−3 | 10.4077 | 2.4419 × 10^−62 |
| | Std | 0 | 2.3383 | 0 | 8.8910 × 10^−9 | 5.8751 × 10^−44 | 2.6195 | 6.0974 × 10^−147 | 4.7219 | 1.0909 × 10^−2 | 1.6310 × 10^−3 | 2.5345 | 6.5463 × 10^−62 |
| | Rank | 1 | 9 | 1 | 6 | 5 | 10 | 3 | 12 | 8 | 7 | 11 | 4 |
| F21 | Avg | 2.5547 × 10^−2 | 5.2594 × 10^−2 | 2.9661 × 10^−2 | 0.5326 | 1.9728 × 10^−2 | 3.8312 | 2.2408 × 10^−3 | 3.2529 | 4.4463 × 10^−6 | 2.0755 × 10^−2 | 0.0312 | 3.5037 × 10^−2 |
| | Std | 9.7752 × 10^−3 | 4.9296 × 10^−2 | 1.4688 × 10^−2 | 0.1176 | 5.4402 × 10^−2 | 1.3479 | 1.6259 × 10^−3 | 1.2921 | 4.5629 × 10^−6 | 4.2171 × 10^−2 | 2.6494 × 10^−2 | 0.03 |
| | Rank | 5 | 9 | 6 | 10 | 3 | 12 | 2 | 11 | 1 | 4 | 7 | 8 |
| F22 | Avg | 2.9038 | 0.5688 | 2.9005 | 2.7472 | 0.1729 | 9.1851 × 10^−3 | 1.0042 | 2.7355 | 4.4222 × 10^−6 | 5.5565 × 10^−3 | 4.5368 × 10^−2 | 0.134 |
| | Std | 0.1621 | 0.247 | 0.19955 | 0.3013 | 0.1276 | 6.9460 × 10^−3 | 0.8045 | 0.8778 | 5.7875 × 10^−6 | 1.0655 × 10^−2 | 1.8046 × 10^−2 | 0.0886 |
| | Rank | 12 | 7 | 11 | 10 | 6 | 3 | 8 | 9 | 1 | 2 | 4 | 5 |
| F23 | Avg | 0.5537 | 6.2603 | 1.3854 | 12.1723 | 0.3206 | 0.8029 | 0.7788 | 5.8924 | 3.9947 × 10^−2 | 6.3801 × 10^−2 | 1.6454 | 0.5537 |
| | Std | 1.1402 | 3.5956 | 0.8307 | 2.4477 | 0.2835 | 0.6338 | 0.8349 | 1.7669 | 0.012 | 8.4765 × 10^−2 | 0.6333 | 1.1402 |
| | Rank | 4 | 11 | 8 | 12 | 3 | 7 | 6 | 10 | 1 | 2 | 9 | 4 |
| F24 | Avg | 0 | 3.7947 | 8.1486 × 10^−8 | 1.0973 | 0 | 9.6985 | 0 | 38.4245 | 0 | 1.4924 × 10^−5 | 0 | 0 |
| | Std | 0 | 3.1278 | 8.6874 × 10^−8 | 2.0728 | 0 | 6.5506 | 0 | 2.5571 | 0 | 5.6480 × 10^−5 | 0 | 0 |
| | Rank | 1 | 10 | 7 | 9 | 3 | 11 | 4 | 12 | 5 | 8 | 6 | 2 |
| F25 | Avg | 0 | 1.6918 | 0 | 0.714 | 0.1592 | 1.844 | 0.0233 | 20.7438 | 0.732 | 4.9581 | 1.0114 | 0.136 |
| | Std | 0 | 0.4571 | 0 | 0.2408 | 0.1373 | 0.4942 | 4.2871 × 10^−2 | 6.4717 | 0.2354 | 1.5861 | 0.2408 | 0.1394 |
| | Rank | 1 | 9 | 1 | 6 | 5 | 10 | 3 | 12 | 7 | 11 | 8 | 4 |
| F26 | Avg | 0 | 22.3883 | 0 | 7.8537 × 10^−11 | 0 | 4.9249 | 0 | 58.728 | 1.6200 × 10^−2 | 0.4135 | 4.5409 | 0 |
| | Std | 0 | 6.0981 | 0 | 8.3251 × 10^−12 | 0 | 2.077 | 0 | 12.0427 | 8.5765 × 10^−2 | 1.0675 | 1.4705 | 0 |
| | Rank | 1 | 11 | 1 | 6 | 1 | 10 | 1 | 12 | 7 | 8 | 9 | 1 |
| Paired rank (+/=/−) | | – | 24/0/2 | 11/12/3 | 25/0/1 | 18/3/5 | 23/0/3 | 19/3/4 | 24/0/2 | 21/0/5 | 18/3/5 | 24/0/2 | 20/4/2 |
| Avg. rank | | 2.26 | 9.27 | 2.92 | 7.35 | 4.54 | 8.92 | 4.31 | 11.35 | 5.96 | 6.08 | 8.42 | 4.81 |
| Overall rank | | 1 | 11 | 2 | 8 | 4 | 10 | 3 | 12 | 6 | 7 | 9 | 5 |
Note: The optimal values are highlighted in bold.
Table 4. A comparison of the best fitness values between IVYPSO and 10 other algorithms for various test functions.
| Func | Metric | IVYPSO | PSO | IVY | BOA | WOA | GOOSE | HPSOBOA | dFDB_LSHADE | FDC_AGDE | NSM_BO | dFDB_SFS | FDB_AGSK |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | Best | 0 | 1.9195 | 3.2648 × 10^−260 | 7.6243 × 10^−11 | 2.3137 × 10^−83 | 0.0134 | 2.2327 × 10^−291 | 6221.6016 | 0.0075 | 3.9426 × 10^−5 | 26.8336 | 3.2275 × 10^−104 |
| | Rank | 1 | 10 | 3 | 6 | 5 | 9 | 2 | 12 | 8 | 7 | 11 | 4 |
| F2 | Best | 0 | 4.8081 | 0 | 2.4927 × 10^−8 | 3.6914 × 10^−49 | 12.8456 | 7.8194 × 10^−146 | 32.7774 | 0.0182 | 0.0051 | 5.3565 | 5.2776 × 10^−63 |
| | Rank | 1 | 9 | 1 | 6 | 5 | 11 | 3 | 12 | 8 | 7 | 10 | 4 |
| F3 | Best | 0 | 53.1933 | 6.5785 × 10^−95 | 5.085 × 10^−11 | 428.7845 | 0.7136 | 2.9839 × 10^−292 | 169.5741 | 4.42 | 1.3085 | 161.3353 | 724.5566 |
| | Rank | 1 | 8 | 3 | 4 | 11 | 5 | 2 | 10 | 7 | 6 | 9 | 12 |
| F4 | Best | 0 | 1.8277 | 0 | 2.466 × 10^−8 | 0.4163 | 0.3557 | 5.2086 × 10^−147 | 2.9046 | 1.4183 | 2.6106 | 1.0204 | 8.0432 |
| | Rank | 1 | 9 | 1 | 4 | 6 | 5 | 3 | 11 | 8 | 10 | 7 | 12 |
| F5 | Best | 0.2523 | 0.9559 | 0.6904 | 5.1787 | 0.0725 | 0.007 | 0.0302 | 39.4716 | 5.2662 × 10^−5 | 3.4297 × 10^−5 | 0.2432 | 0.136 |
| | Rank | 8 | 10 | 9 | 11 | 5 | 3 | 4 | 12 | 2 | 1 | 7 | 6 |
| F6 | Best | 1.1454 × 10^−54 | 5.8573 | 3.2954 × 10^−5 | 0.0014 | 0.0011 | 0.1351 | 5.385 × 10^−5 | 2.6236 | 0.066 | 0.1496 | 0.0931 | 1.2591 × 10^−5 |
| | Rank | 2 | 12 | 1 | 6 | 5 | 9 | 4 | 11 | 7 | 10 | 8 | 3 |
| F7 | Best | 3.1914 × 10^−19 | 0 | 6.3819 × 10^−35 | 5.3719 × 10^−13 | 7.1751 × 10^−66 | 9.334 × 10^−66 | 1.7703 × 10^−62 | 8.2727 × 10^−52 | 7.1751 × 10^−66 | 7.1751 × 10^−66 | 1.1911 × 10^−65 | 7.1751 × 10^−66 |
| | Rank | 1 | 11 | 10 | 12 | 2 | 6 | 8 | 9 | 2 | 2 | 7 | 2 |
| F8 | Best | 0 | 0.3335 | 0 | 1.1492 × 10^−13 | 2.5197 × 10^−100 | 8.3454 × 10^−6 | 1.7958 × 10^−295 | 1.4189 × 10^−5 | 4.3298 × 10^−21 | 1.2446 × 10^−23 | 9.3771 × 10^−11 | 7.2831 × 10^−152 |
| | Rank | 1 | 12 | 1 | 8 | 5 | 10 | 3 | 11 | 7 | 6 | 9 | 4 |
| F9 | Best | 0 | 25.8431 | 4.3686 × 10^−95 | 7.5806 × 10^−11 | 1.2833 × 10^−82 | 0.7984 | 2.2617 × 10^−291 | 545.8044 | 0.0003 | 0.0003 | 4.3525 | 2.3672 × 10^−106 |
| | Rank | 1 | 11 | 4 | 6 | 5 | 9 | 2 | 12 | 8 | 7 | 10 | 3 |
| F10 | Best | 27.0059 | 673.4298 | 27.899 | 28.9468 | 27.7308 | 30.1997 | 28.8365 | 21,955.7713 | 29.2534 | 396.3681 | 203.3141 | 28.7384 |
| | Rank | 1 | 11 | 3 | 6 | 2 | 8 | 5 | 12 | 7 | 10 | 9 | 4 |
| F11 | Best | 0 | 85.9941 | 0 | 6.8182 × 10^−11 | 3.5927 × 10^−81 | 0.244 | 1.9516 × 10^−291 | 894.8034 | 0.0007 | 1.665 × 10^−5 | 0.9829 | 1.1737 × 10^−108 |
| | Rank | 1 | 11 | 1 | 6 | 5 | 9 | 3 | 12 | 8 | 7 | 10 | 4 |
| F12 | Best | 0.6667 | 222.886 | 0.66667 | 0.9632 | 0.6667 | 0.8154 | 0.9953 | 2331.6115 | 0.7225 | 6.9031 | 4.5096 | 0.6774 |
| | Rank | 1 | 11 | 1 | 7 | 1 | 6 | 8 | 12 | 5 | 10 | 9 | 4 |
| F13 | Best | 0 | 2.0826 × 10^−27 | 0 | 5.5035 × 10^−26 | 0 | 0.0017 | 0.0598 | 0.0089 | 4.8456 × 10^−42 | 0 | 2.4964 × 10^−33 | 0 |
| | Rank | 1 | 8 | 1 | 9 | 1 | 10 | 12 | 11 | 6 | 1 | 7 | 1 |
| F14 | Best | 0 | 7.0078 × 10^−22 | 0 | 7.8597 × 10^−20 | 5.0035 × 10^−125 | 0.013 | 0.0256 | 0.5158 | 5.1081 × 10^−27 | 0 | 1.1492 × 10^−19 | 1.0275 × 10^−152 |
| | Rank | 1 | 7 | 1 | 8 | 5 | 10 | 11 | 12 | 6 | 1 | 9 | 4 |
| F15 | Best | 0 | 1.1273 × 10^−30 | 8.3154 × 10^−262 | 5.7109 × 10^−22 | 4.1221 × 10^−133 | 0.0004 | 1.9605 × 10^−296 | 0.0002 | 2.3713 × 10^−40 | 0 | 2.0643 × 10^−31 | 1.7744 × 10^−188 |
| | Rank | 1 | 9 | 4 | 10 | 6 | 12 | 3 | 11 | 7 | 1 | 8 | 5 |
| F16 | Best | 0 | 207.2783 | 0 | 2.8422 × 10^−13 | 1.1369 × 10^−13 | 170.5284 | 0 | 227.4552 | 45.511 | 16.967 | 197.8528 | 0 |
| | Rank | 1 | 11 | 1 | 6 | 5 | 9 | 1 | 12 | 8 | 7 | 10 | 1 |
| F17 | Best | 0 | 174.921 | 0 | 167.5903 | 0 | 280.0004 | 0 | 217.8745 | 35.3575 | 6.0006 | 206.6572 | 0 |
| | Rank | 1 | 9 | 1 | 8 | 1 | 12 | 1 | 11 | 7 | 6 | 10 | 1 |
| F18 | Best | 4.4409 × 10^−16 | 2.8075 | 4.4409 × 10^−16 | 2.5252 × 10^−8 | 4.4409 × 10^−16 | 0.0649 | 4.4409 × 10^−16 | 9.5098 | 0.0115 | 0.9373 | 2.0932 | 3.9968 × 10^−15 |
| | Rank | 1 | 11 | 1 | 6 | 1 | 8 | 1 | 12 | 7 | 9 | 10 | 5 |
| F19 | Best | 0 | 0.1039 | 0 | 1.8527 × 10^−11 | 0 | 289.0103 | 0 | 57.1856 | 0.0152 | 0.1684 | 1.1599 | 0 |
| | Rank | 1 | 8 | 1 | 6 | 1 | 12 | 1 | 11 | 7 | 9 | 10 | 1 |
| F20 | Best | 0 | 2.3272 | 0 | 5.0467 × 10^−10 | 2.1642 × 10^−50 | 8.4344 | 6.1344 × 10^−147 | 23.0176 | 0.0272 | 0.0005 | 8.9652 | 1.3616 × 10^−67 |
| | Rank | 1 | 9 | 1 | 6 | 5 | 10 | 3 | 12 | 8 | 7 | 11 | 4 |
| F21 | Best | 0.0068 | 0.0125 | 0.0198 | 0.4861 | 0.0181 | 3.3439 | 0.0013 | 3.2316 | 4.0407 × 10^−6 | 2.7412 × 10^−7 | 0.0211 | 0.0107 |
| | Rank | 4 | 6 | 8 | 10 | 7 | 12 | 3 | 11 | 2 | 1 | 9 | 5 |
| F22 | Best | 2.9661 | 0.8753 | 2.9715 | 2.9968 | 0.0595 | 0.0147 | 0.0651 | 1.5015 | 2.0754 × 10^−6 | 7.3649 × 10^−9 | 0.052 | 0.2135 |
| | Rank | 10 | 8 | 11 | 12 | 5 | 3 | 6 | 9 | 2 | 1 | 4 | 7 |
| F23 | Best | 0.0134 | 5.2542 | 2.1048 | 6.4945 | 0.3179 | 0.2986 | 1.5223 | 5.6278 | 0.051 | 1.235 | 1.6193 | 0.0021 |
| | Rank | 2 | 10 | 9 | 12 | 5 | 4 | 7 | 11 | 3 | 6 | 8 | 1 |
| F24 | Best | 0 | 3.2678 | 0 | 0 | 0 | 2.0776 | 0 | 38.2506 | 0 | 0 | 0 | 0 |
| | Rank | 1 | 11 | 1 | 1 | 1 | 10 | 1 | 12 | 1 | 1 | 1 | 1 |
| F25 | Best | 0 | 2.4874 | 0 | 0.8955 | 0.398 | 1.5919 | 0.0995 | 25.468 | 0.8955 | 6.3676 | 0.9069 | 0.0995 |
| | Rank | 1 | 10 | 1 | 7 | 5 | 9 | 4 | 12 | 6 | 11 | 8 | 3 |
| F26 | Best | 0 | 14.0908 | 0 | 7.6919 × 10^−11 | 0 | 5.3631 | 0 | 54.1892 | 0.0006 | 0.0253 | 3.8484 | 0 |
| | Rank | 1 | 11 | 1 | 6 | 1 | 10 | 1 | 12 | 7 | 8 | 9 | 1 |
| Paired rank (+/=/−) | | – | 24/0/2 | 9/15/2 | 25/0/1 | 16/7/3 | 23/0/3 | 16/6/4 | 24/0/2 | 21/1/4 | 18/4/4 | 22/1/3 | 16/6/4 |
| Avg. rank | | 2.19 | 9.35 | 3.08 | 7.27 | 4.08 | 8.5 | 3.92 | 11.35 | 5.92 | 5.85 | 8.46 | 3.92 |
| Overall rank | | 1 | 11 | 2 | 8 | 5 | 10 | 3 | 12 | 7 | 6 | 9 | 3 |
Note: The optimal values are highlighted in bold.
Table 5. Results of the Wilcoxon signed-rank test for 26 test functions with α = 0.05.
| Comparison | Wilcoxon test p-value | Significant |
|---|---|---|
| IVYPSO vs. PSO | 2.4153 × 10^−5 | Yes |
| IVYPSO vs. IVY | 1.6357 × 10^−2 | Yes |
| IVYPSO vs. BOA | 2.9991 × 10^−5 | Yes |
| IVYPSO vs. WOA | 4.5685 × 10^−2 | Yes |
| IVYPSO vs. GOOSE | 5.0978 × 10^−5 | Yes |
| IVYPSO vs. HPSOBOA | 1.6067 × 10^−2 | Yes |
| IVYPSO vs. FDC-AGDE | 8.3166 × 10^−3 | Yes |
| IVYPSO vs. dFDB-LSHADE | 7.8847 × 10^−6 | Yes |
| IVYPSO vs. NSM-BO | 3.8919 × 10^−3 | Yes |
| IVYPSO vs. dFDB-SFS | 5.9619 × 10^−5 | Yes |
| IVYPSO vs. FDB-AGSK | 1.5664 × 10^−3 | Yes |
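The p-values above come from the two-sided Wilcoxon signed-rank test applied to the 26 paired per-function results. As a reference for how the underlying statistic is formed, the stdlib sketch below computes the test statistic W (zero differences dropped, tied absolute differences sharing average ranks); the p-value lookup against the W distribution is omitted here:

```python
def wilcoxon_signed_rank(a, b):
    """Two-sided Wilcoxon signed-rank statistic W: the smaller of the
    positive and negative rank sums over the paired differences."""
    diffs = [x - y for x, y in zip(a, b) if x != y]   # drop zero differences
    ranked = sorted((abs(d), d) for d in diffs)       # order by |difference|
    ranks = [0.0] * len(ranked)
    i = 0
    while i < len(ranked):
        j = i
        while j + 1 < len(ranked) and ranked[j + 1][0] == ranked[i][0]:
            j += 1                                    # extend the tied block
        avg = (i + j) / 2 + 1                         # average rank for ties
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    w_pos = sum(r for r, (_, d) in zip(ranks, ranked) if d > 0)
    w_neg = sum(r for r, (_, d) in zip(ranks, ranked) if d < 0)
    return min(w_pos, w_neg)
```

On a classic textbook sample of ten pairs, this reproduces the standard result W = 18.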
Table 6. Friedman ranking scores of IVYPSO and other competing algorithms.
| Algorithm | Friedman Score | Rank |
|---|---|---|
| IVYPSO | 1.9231 | 1 |
| PSO | 5.8846 | 9 |
| IVY | 3.7308 | 2 |
| BOA | 6.7308 | 11 |
| WOA | 4.8462 | 3 |
| GOOSE | 5.4615 | 6 |
| HPSOBOA | 4.9615 | 4 |
| FDC-AGDE | 7.0385 | 12 |
| dFDB-LSHADE | 6.4231 | 10 |
| NSM-BO | 5.4615 | 7 |
| dFDB-SFS | 5.5385 | 8 |
| FDB-AGSK | 5.0385 | 5 |
Note: The optimal values are highlighted in bold.
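The Friedman scores above are average ranks: each of the 26 functions ranks the 12 algorithms, and those per-function ranks are averaged per algorithm. A small stdlib sketch of that computation (ties share average ranks; lower scores are better):

```python
def friedman_avg_ranks(scores):
    """scores[p][a]: result of algorithm a on problem p (lower is better).
    Returns each algorithm's average rank across all problems."""
    n_alg = len(scores[0])
    totals = [0.0] * n_alg
    for row in scores:
        order = sorted(range(n_alg), key=lambda a: row[a])
        r = 0
        while r < n_alg:
            s = r
            while s + 1 < n_alg and row[order[s + 1]] == row[order[r]]:
                s += 1                      # extend the tied block
            avg = (r + s) / 2 + 1           # average rank for the tie
            for k in range(r, s + 1):
                totals[order[k]] += avg
            r = s + 1
    return [t / len(scores) for t in totals]
```

For instance, three problems ranking three algorithms as (1, 2, 3), (1, 3, 2), and (2, 1, 3) give average ranks of 4/3, 2, and 8/3 respectively.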
Table 7. The best values obtained by IVYPSO and other competing algorithms for the GTCD problem.
Table 7. The best values obtained by IVYPSO and other competing algorithms for the GTCD problem.
| Algorithm | L | r | D | Optimal Value | SR (%) | ACTs | AFEs |
|---|---|---|---|---|---|---|---|
| IVYPSO | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 100 | 0.1419 | 15,030 |
| PSO | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 80 | 0.1187 | 15,030 |
| IVY | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 80 | 0.172 | 15,030 |
| BOA | 20 | 1.1134 | 20 | 1,683,684.5457 | 0 | 0.1797 | 30,030 |
| WOA | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 85 | 0.0938 | 15,000 |
| GOOSE | 32.5256 | 1.2305 | 20 | 1,677,759.2854 | 0 | 0.101 | 15,000 |
| HPSOBOA | 21 | 1.0537 | 21 | 1,685,732.6804 | 0 | 0.1786 | 30,030 |
| FDC_AGDE | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 85 | 0.1087 | 15,030 |
| dFDB_LSHADE | 28.5541 | 1.1932 | 20 | 1,677,783.3132 | 0 | 0.0047 | 500 |
| NSM_BO | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 100 | 0.3239 | 15,000 |
| dFDB_SFS | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 80 | 0.1978 | 15,000 |
| FDB_AGSK | 24.496 | 1.5867 | 20 | 1,677,759.2755 | 100 | 0.1845 | 15,000 |
Note: The optimal values are highlighted in bold.
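SR in Tables 7–12 is the percentage of independent runs that reach the reference optimum. One hedged reading of that metric, under the assumption that "success" means landing within a small tolerance of the best-known value (the tolerance used here is illustrative, not the paper's):

```python
def success_rate(run_bests, f_star, tol=1e-4):
    """SR (%): share of runs whose best objective value lies within
    tol of the reference optimum f_star."""
    hits = sum(1 for f in run_bests if abs(f - f_star) <= tol)
    return 100.0 * hits / len(run_bests)
```

For example, with four run results of which three fall within the tolerance of the reference value, the function reports 75%.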
Table 8. Statistical assessment of various algorithms applied to the GTCD problem.
Table 8. Statistical assessment of various algorithms applied to the GTCD problem.
| Algorithm | Mean | Best | Worst | Median | Std | Rank |
|---|---|---|---|---|---|---|
| IVYPSO | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 0.0000 | 1 |
| PSO | 1,678,556.6157 | 1,677,759.2755 | 1,685,732.6774 | 1,677,759.2755 | 2521.4111 | 7 |
| IVY | 1,678,354.4837 | 1,677,759.2755 | 1,685,634.2755 | 1,677,759.2755 | 1932.6400 | 6 |
| BOA | 1,685,527.8813 | 1,683,684.5457 | 1,685,732.7254 | 1,685,732.6942 | 647.6821 | 8 |
| WOA | 1,677,759.2760 | 1,677,759.2755 | 1,677,759.2777 | 1,677,759.2756 | 0.0009 | 4 |
| GOOSE | 2,048,858.2698 | 1,677,759.2854 | 5,177,439.6140 | 1,698,815.7980 | 1,099,516.1908 | 12 |
| HPSOBOA | 1,685,748.4393 | 1,685,732.6804 | 1,685,810.0629 | 1,685,735.5707 | 25.3056 | 9 |
| FDC_AGDE | 1,778,373.1947 | 1,677,759.2755 | 2,675,925.0653 | 1,677,759.2755 | 315,377.5360 | 11 |
| dFDB_LSHADE | 1,678,255.5734 | 1,677,783.3132 | 1,679,765.5234 | 1,678,075.9952 | 619.2305 | 5 |
| NSM_BO | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 0 | 1 |
| dFDB_SFS | 1,694,760.8055 | 1,677,759.2755 | 1,762,766.9254 | 1,677,759.2755 | 35,842.3723 | 10 |
| FDB_AGSK | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 1,677,759.2755 | 0 | 1 |
Note: The optimal values are highlighted in bold.
Table 9. The best values obtained by IVYPSO and other competing algorithms for the three-bar truss design problem.
Table 9. The best values obtained by IVYPSO and other competing algorithms for the three-bar truss design problem.
| Algorithm | X1 | X2 | Optimal Value | SR (%) | ACTs | AFEs |
|---|---|---|---|---|---|---|
| IVYPSO | 0.7884 | 0.4081 | 263.8523 | 100 | 0.1531 | 15,030 |
| PSO | 0.7884 | 0.4081 | 263.8523 | 85 | 0.1663 | 15,030 |
| IVY | 0.7884 | 0.4081 | 263.8523 | 90 | 0.2151 | 15,030 |
| BOA | 0.7937 | 0.3938 | 263.8849 | 0 | 0.2743 | 30,030 |
| WOA | 0.7919 | 0.3984 | 263.8611 | 0 | 0.137 | 15,000 |
| GOOSE | 0.7884 | 0.4081 | 263.8523 | 100 | 0.1447 | 15,000 |
| HPSOBOA | 0.8374 | 0.4669 | 264.2474 | 0 | 0.2783 | 30,030 |
| FDC_AGDE | 0.7884 | 0.4081 | 263.8523 | 100 | 0.1518 | 15,030 |
| dFDB_LSHADE | 0.7884 | 0.4081 | 263.8523 | 90 | 0.0058 | 500 |
| NSM_BO | 0.7884 | 0.4081 | 263.8523 | 100 | 0.357 | 15,000 |
| dFDB_SFS | 0.7884 | 0.4081 | 263.8523 | 100 | 0.2654 | 15,000 |
| FDB_AGSK | 0.7884 | 0.4081 | 263.8523 | 100 | 0.2167 | 15,000 |
Note: The optimal values are highlighted in bold.
Table 10. Statistical assessment of various algorithms applied to the three-bar truss design problem.
Table 10. Statistical assessment of various algorithms applied to the three-bar truss design problem.
| Algorithm | Mean | Best | Worst | Median | Std | Rank |
|---|---|---|---|---|---|---|
| IVYPSO | 263.8523 | 263.8523 | 263.8523 | 263.8523 | 0 | 1 |
| PSO | 263.9375 | 263.8523 | 264.7016 | 263.8523 | 0.2685 | 9 |
| IVY | 263.8524 | 263.8523 | 263.8527 | 263.8524 | 6.97 × 10^−4 | 7 |
| BOA | 264.1555 | 263.8787 | 264.8682 | 264.0228 | 0.3227 | 10 |
| WOA | 265.3408 | 263.8591 | 268.7184 | 264.7459 | 1.8316 | 11 |
| GOOSE | 263.8524 | 263.8523 | 264.5934 | 263.8523 | 0.2397 | 6 |
| HPSOBOA | 271.5462 | 264.4945 | 279.0356 | 272.9894 | 4.4763 | 12 |
| FDC_AGDE | 263.8523 | 263.8523 | 263.8523 | 263.8523 | 0 | 1 |
| dFDB_LSHADE | 263.8573 | 263.8524 | 263.8955 | 263.8527 | 0.0134 | 8 |
| NSM_BO | 263.8523 | 263.8523 | 263.8523 | 263.8523 | 0 | 1 |
| dFDB_SFS | 263.8523 | 263.8523 | 263.8523 | 263.8523 | 0 | 1 |
| FDB_AGSK | 263.8523 | 263.8523 | 263.8523 | 263.8523 | 0 | 1 |
Note: The optimal values are highlighted in bold.
Table 11. The best values obtained by IVYPSO and other competing algorithms for the multiple-disk clutch brake design problem.
Table 11. The best values obtained by IVYPSO and other competing algorithms for the multiple-disk clutch brake design problem.
| Algorithm | X1 | X2 | X3 | X4 | X5 | Optimal Value | SR (%) | ACTs | AFEs |
|---|---|---|---|---|---|---|---|---|---|
| IVYPSO | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 100 | 0.2261 | 15,030 |
| PSO | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 90 | 0.2117 | 15,030 |
| IVY | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 95 | 0.2644 | 15,030 |
| BOA | 69.7384 | 90 | 1.1944 | 405.0447 | 2 | 0.2842 | 0 | 0.3671 | 30,030 |
| WOA | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 100 | 0.1843 | 15,000 |
| GOOSE | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 90 | 0.2395 | 15,000 |
| HPSOBOA | 67.2176 | 91 | 0.8157 | 788.6999 | 1.8812 | 0.2731 | 0 | 0.6078 | 30,030 |
| FDC_AGDE | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 100 | 0.2211 | 15,030 |
| dFDB_LSHADE | 69.9485 | 90 | 1 | 432.177 | 2 | 0.2358 | 0 | 0.0117 | 500 |
| NSM_BO | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 100 | 0.4497 | 15,000 |
| dFDB_SFS | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 85 | 0.3248 | 15,000 |
| FDB_AGSK | 70 | 90 | 1 | 1000 | 2 | 0.2352 | 100 | 0.2867 | 15,000 |
Note: The optimal values are highlighted in bold.
Table 12. Statistical assessment of various algorithms applied to the multiple-disk clutch brake design problem.
Table 12. Statistical assessment of various algorithms applied to the multiple-disk clutch brake design problem.
| Algorithm | Mean | Best | Worst | Median | Std | Rank |
|---|---|---|---|---|---|---|
| IVYPSO | 0.2352 | 0.2352 | 0.2352 | 0.2352 | 0 | 1 |
| PSO | 0.2381 | 0.2352 | 0.2638 | 0.2352 | 0.009 | 8 |
| IVY | 0.2354 | 0.2352 | 0.236 | 0.2352 | 0.0014 | 6 |
| BOA | 0.3149 | 0.2842 | 0.3308 | 0.3255 | 0.0234 | 11 |
| WOA | 0.2352 | 0.2352 | 0.2352 | 0.2352 | 0 | 1 |
| GOOSE | 0.2383 | 0.2352 | 0.2531 | 0.2352 | 0.006 | 9 |
| HPSOBOA | 0.3264 | 0.2731 | 0.3308 | 0.3308 | 0.0139 | 12 |
| FDC_AGDE | 0.2352 | 0.2352 | 0.2352 | 0.2352 | 0 | 1 |
| dFDB_LSHADE | 0.2376 | 0.2358 | 0.242 | 0.2371 | 0.002 | 7 |
| NSM_BO | 0.2352 | 0.2352 | 0.2352 | 0.2352 | 0 | 1 |
| dFDB_SFS | 0.2411 | 0.2352 | 0.2646 | 0.2352 | 0.0124 | 10 |
| FDB_AGSK | 0.2352 | 0.2352 | 0.2352 | 0.2352 | 0 | 1 |
Note: The optimal values are highlighted in bold.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Zhang, K.; Yuan, F.; Jiang, Y.; Mao, Z.; Zuo, Z.; Peng, Y. A Particle Swarm Optimization-Guided Ivy Algorithm for Global Optimization Problems. Biomimetics 2025, 10, 342. https://doi.org/10.3390/biomimetics10050342

