Biomimetics
  • Article
  • Open Access

8 April 2025

mESC: An Enhanced Escape Algorithm Fusing Multiple Strategies for Engineering Optimization

1 Faculty of Mechanical Engineering, Shaanxi University of Technology, Hanzhong 723000, China
2 Faculty of Art and Design, Xi’an University of Technology, Xi’an 710054, China
3 Department of Applied Mathematics, Xi’an University of Technology, Xi’an 710054, China
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Bio-Inspired Optimization Algorithms and Designs for Engineering Applications: 3rd Edition

Abstract

A multi-strategy enhanced version of the escape algorithm (mESC for short) is proposed to address the difficulty of balancing the exploration and exploitation stages and the low convergence accuracy of the escape algorithm (ESC). Firstly, an adaptive perturbation factor strategy is employed to maintain population diversity. Secondly, a restart mechanism is introduced to enhance the exploration capability of mESC. Thirdly, a dynamic centroid reverse learning strategy is designed to balance local exploitation. In addition, to accelerate global convergence, a boundary adjustment strategy based on an elite pool is proposed, which selects elite individuals to replace poor ones. Comparisons of mESC with recent metaheuristic algorithms and high-performance winner algorithms on the CEC2022 test suite confirm that mESC outperforms its competitors. Finally, the superiority of mESC is verified on several classic real-world optimization problems.

1. Introduction

1.1. Research Background

An optimization problem is one of optimizing an objective function subject to given constraints []. Such problems arise in many fields, such as physical chemistry [], biomedicine [], economics and finance [], logistics and operations management [], science and technology, and machine learning []. Optimization problems are also common in the real world, for example in path planning, image processing, and feature selection. Through optimization, high-quality solutions can be identified to improve overall performance.

1.2. Literature Review

Traditional methods for solving optimization problems, such as the conjugate gradient method and the momentum method, have obvious disadvantages, including low efficiency and unsatisfactory results. Metaheuristic algorithms offer a novel and efficient alternative. They include classic algorithms inspired by the concept of selective elimination, such as particle swarm optimization (PSO) [] and differential evolution (DE) []. They also include algorithms inspired by animals, such as the zebra optimization algorithm (ZOA), based on the foraging and predator-avoidance behavior of zebras []; moth flame optimization (MFO), inspired by the navigation behavior of moths []; spider wasp optimization (SWO), inspired by the survival behavior of spider wasps []; the seahorse optimizer (SHO) [], based on the biological habits of seahorses in the ocean; the artificial hummingbird algorithm (AHA) [], based on the flight and foraging of hummingbirds; and the dwarf mongoose optimization algorithm (DMOA) [], based on the collective foraging behavior of dwarf mongooses. Metaheuristic algorithms also include algorithms inspired by physics and chemistry, such as multi-verse optimization (MVO), based on the physical motion of celestial bodies in the universe, and the planet optimization algorithm (POA), inspired by Newton’s law of gravity []; algorithms influenced by human behavior, such as the imperialist competitive algorithm (ICA), inspired by the competition in which strong countries annex weak ones [], and teaching–learning-based optimization (TLBO), which simulates the human processes of “teaching” and “learning” []; as well as sled dog optimization (SDO) [], grey wolf optimization (GWO) [], the osprey optimization algorithm (OOA) [], dung beetle optimization (DBO) [], the gravitational search algorithm (GSA) [], big bang–big crunch (BBBC) [], and other algorithms. There are also improved versions of existing algorithms, such as the enhanced bottlenose dolphin optimizer (EMBDO) for drone path planning with four constraints [], the improved Kepler optimization algorithm (CGKOA) for engineering optimization problems [], the super eagle optimization algorithm (SEOA) for path planning [], and the improved artificial rabbit optimization MNEARO for several engineering problems [].
However, it is unrealistic to expect one algorithm to solve all problems, so continually proposing new algorithms and improving existing ones remains the most effective approach. ESC was proposed to meet this demand: inspired by crowd evacuation behavior, ESC [] simulates three types of crowd behavior. ESC demonstrated its superiority and competitiveness through comparisons with other algorithms on two test suites and several optimization problems. However, when balancing exploration and exploitation, and when handling high-dimensional cases, ESC may fall into local optima because of insufficient performance. Therefore, to better unleash the potential of ESC and further improve its performance, this article proposes mESC.
In mESC, the proposed adaptive perturbation factor strategy, elite-pool-based boundary adjustment strategy, dynamic centroid reverse learning strategy, and restart mechanism are combined to enhance the overall performance of ESC. The adaptive perturbation factor strategy maintains population diversity during the iterations. The restart mechanism strengthens the exploration capability of mESC and prevents premature convergence in the later stages of the iterations. The elite-pool-based boundary adjustment strategy screens more outstanding individuals as candidate solutions and accelerates convergence. The dynamic centroid reverse learning strategy balances the local exploitation of the algorithm to improve convergence accuracy and enhance local search, as sketched below.
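As an illustration only, and not the authors' exact formulation (whose update equations are given in Section 2), the following Python sketch shows how a dynamic centroid reverse (opposition-based) learning step is commonly implemented: each individual is reflected about the current population centroid, and the better of the original and reflected candidates would be kept after evaluation. The function name `centroid_opposition` and the bound-handling choice are illustrative.

```python
import numpy as np

def centroid_opposition(pop, lb, ub):
    """Generic dynamic-centroid reverse-learning step (illustrative sketch).

    Each individual x is reflected about the population centroid,
    x' = 2 * centroid - x, and clipped back into the search bounds.
    The caller would keep the better of x and x' after evaluating fitness.
    """
    centroid = pop.mean(axis=0)          # centroid is "dynamic": recomputed every generation
    reflected = 2.0 * centroid - pop     # reflect the whole population about the centroid
    return np.clip(reflected, lb, ub)    # simple bound handling (illustrative choice)

# Usage sketch: a population of 30 candidate solutions in a 10-D box [-100, 100]^10.
pop = np.random.uniform(-100.0, 100.0, size=(30, 10))
candidates = centroid_opposition(pop, lb=-100.0, ub=100.0)
```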

1.3. Research Contribution

This study proposes the multi-strategy escape algorithm mESC. The algorithm improves the original ESC, and its performance is validated through multiple experimental metrics on a test suite against 26 competitors. In addition, mESC is applied to truss topology optimization and five engineering design optimization problems to confirm its superiority. The improved algorithm provides another effective method for solving optimization problems and noticeably improves solution accuracy.

1.4. Chapter Arrangement

Section 2 first describes the concept of ESC, then presents the improvement strategies of mESC, and finally reports numerical experiments that verify the performance of the proposed algorithm. Section 3 confirms the practicality of the proposed algorithm through truss topology optimization and five engineering optimization problems. Section 4 summarizes the paper.

3. Real-World Applications

Next, we apply the proposed mESC to five engineering optimization problems and two truss topology optimization problems, and we highlight its superiority through the results obtained by mESC and the other competitors.

3.1. mESC Optimization of Truss Topology Design Problem

Structural optimization design refers to finding a design that minimizes volume and cost and maximizes stiffness under given constraints. Truss optimization, in particular, aims to reduce weight as much as possible so that material is used efficiently. The problem is formulated as follows:
Find
$x = [A_1, A_2, \ldots, A_n]$.
Minimize
$f(x) = \sum_{i=1}^{n} B_i A_i \rho_i L_i + \sum_{j=1}^{m} b_j, \quad B_i = \begin{cases} 0, & \text{if } A_i < \text{critical area}, \\ 1, & \text{if } A_i \geq \text{critical area}. \end{cases}$
$A_i$ is the cross-sectional area, $b_j$ is the added mass value, $\rho_i$ is the mass density, and $L_i$ is the length of the rod.
Constraints:
$g_1(x): B_i \sigma_i - \sigma_i^{\ast} \le 0, \quad g_2(x): \delta_j - \delta_j^{\max} \le 0, \quad g_3(x): f_r^{\min} - f_r \le 0,$
$g_4(x): B_i\left(\sigma_i^{comp} - \sigma_i^{cr}\right) \le 0, \quad \sigma_i^{cr} = \dfrac{k_i A_i E_i}{L_i^{2}}, \quad g_5(x): A^{\min} \le A_i \le A^{\max},$
$g_6: \text{check on validity of the structure}, \quad g_7: \text{check on kinematic stability},$
In the formula, $i = 1, 2, \ldots, n$ and $j = 1, 2, \ldots, m$; $\sigma_i$ denotes the stress of member $i$ and $\sigma_i^{\ast}$ its allowable value, $g_1(x)$ is the stress constraint, and $B_i$ is a binary flag. $g_2(x)$, $g_3(x)$, and $g_4(x)$ are the displacement, natural frequency, and Euler buckling constraints, respectively; $g_5(x)$ is the cross-sectional area constraint. A truss topology without the required connections is invalid ($g_6$), and kinematic stability ($g_7$) is checked to ensure the truss can be used smoothly.
In the truss topology optimization, if the computed cross-sectional area of a member is less than the critical area, the member is assumed to be removed from the truss; otherwise, it is retained. If the loading nodes, the supporting nodes, and the non-removable nodes are not connected by any truss member, the generated topology is invalid (g6). To ensure a kinematically stable truss structure, kinematic stability (g7) is included in the design constraints and is checked with the following two criteria (a minimal check is sketched after the list):
(1) The degrees of freedom of the structure calculated using the Grubler criterion should be less than 1;
(2) The kinematic stability of the structure is checked through the positive definiteness of the stiffness matrix assembled from the member connections: the global stiffness matrix should be positive definite.
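For concreteness, the two criteria can be checked as in the following sketch; it assumes the reduced global stiffness matrix K of the candidate topology and its Grubler degree-of-freedom count have already been computed by the structural model (both inputs are placeholders here), and it uses a Cholesky factorization as the positive-definiteness test.

```python
import numpy as np

def is_kinematically_stable(dof_grubler, K):
    """Check the two stability criteria listed above (sketch with placeholder inputs).

    dof_grubler : degrees of freedom from the Grubler criterion
    K           : reduced global stiffness matrix of the candidate truss
    """
    if dof_grubler >= 1:                      # criterion (1): DOF must be less than 1
        return False
    try:
        np.linalg.cholesky(K)                 # criterion (2): succeeds iff K is positive definite
        return True
    except np.linalg.LinAlgError:
        return False

# Usage sketch with a trivially positive-definite 3x3 matrix.
print(is_kinematically_stable(0, np.eye(3)))  # True
```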
To evaluate whether a design meets the constraints, a penalty function [] is introduced, as follows:
Penalty
$f(x) = \begin{cases} 10^{9}, & \text{if } g_7 \text{ is violated}, \\ 10^{8}, & \text{if } g_6 \text{ is violated with degree of freedom}, \\ 10^{7}, & \text{if } g_6 \text{ is violated with positive definiteness}, \\ f(x) \cdot F_{penalty}, & \text{otherwise}, \end{cases}$
In the formula, $F_{penalty} = (1 + \alpha C)^{\beta}$, $C = \sum_{i=1}^{q} C_i$, $C_i = 1 - p_i / p_i^{\ast}$, where $p_i$ represents the level of violation of the $i$-th constraint, $q$ is the number of active constraints, and $\alpha$ and $\beta$ are both set to 2. The Euler buckling coefficient is set to 4.0, and the added mass is set to 5.0 kg.
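A minimal sketch of this tiered penalty, assuming the weight, the violation levels $C_i$, and the outcomes of the $g_6$/$g_7$ checks are supplied by the structural analysis (all arguments are placeholders):

```python
def penalized_weight(weight, violations, g7_violated, g6_dof_violated, g6_pd_violated,
                     alpha=2.0, beta=2.0):
    """Tiered penalty mirroring the scheme displayed above (placeholder inputs).

    violations : list of C_i = 1 - p_i / p_i* values over the active constraints
    """
    if g7_violated:                       # kinematic stability check failed
        return 1e9
    if g6_dof_violated:                   # validity check failed (degree of freedom)
        return 1e8
    if g6_pd_violated:                    # validity check failed (positive definiteness)
        return 1e7
    C = sum(violations)                   # total constraint violation
    return weight * (1.0 + alpha * C) ** beta
```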

3.1.1. Optimization of 24 Bar 2D Truss

In the experiment, we selected nine algorithms, namely NRBO, MPSO [], CPO, MFO, BWO, WOA, HHO [], BSD [], and TSA [], as competitors. The relevant structure is shown in Figure 3, and the design variables are the cross-sectional areas of the truss members. The parameters required for the experiment are described in Table 3. The experimental results of all algorithms over 20 runs are shown in Table 4, where the best weights and the best average values are marked in bold to assess the relative merits of the algorithms.
Figure 3. 24-bar truss structure.
Table 3. Conditions for 24 bar truss structure.
The data in Table 4 show that the best weight and the overall average value found by mESC are the best overall, with a minimum weight of 121.5840 and an average of 140.5701. This indicates that mESC is the most stable among the competitors in solving this problem. Figure 4 shows the convergence curves of the competitors on this problem: CPO begins to fall into local optima in the early and middle stages of the iterations, whereas mESC performs noticeably better. Overall, the comprehensive performance of mESC is validated on this optimization problem. Figure 5 shows the truss diagrams obtained by each algorithm after removing members according to the experimental results, with mESC removing the largest number of members.
Table 4. Optimization results of 24 bars.
Figure 4. Comparison of convergence of various algorithms on 24 bars.
Figure 5. Topology optimization of 24 bar truss using various algorithms.

3.1.2. Optimization of 72 Bar 3D Truss

Figure 6 shows the 72-bar structure. The truss members are divided into 16 groups, and the top four nodes carry concentrated masses. The data and parameters for the minimum-weight optimization are set in Table 5. The minimum weights and average values obtained by each algorithm are listed in Table 6. The best weight and the overall average value of mESC are the best overall, and its optimal weight of 443.7325 is the smallest among all comparison algorithms, which confirms the superiority of the proposed algorithm. Figure 7 shows the convergence curves of all algorithms, with mESC achieving the highest convergence accuracy. Figure 8 shows the truss structures optimized by the various algorithms.
Figure 6. 72 bar truss.
Table 5. Setting of 72 bar truss structure.
Table 6. Optimization of 72 bar truss structure.
Figure 7. Convergence diagram of 72 bar truss.
Figure 8. Topology optimization of 72 bar truss using various algorithms.
mESC achieves the optimal structure by removing six members, demonstrating that mESC excels at solving such problems.

3.2. Engineering Problem

We use mESC to solve five engineering optimization problems, namely minimizing the weight of the reducer, welded beam design, the step cone pulley problem, the robot clamping (gripper) problem, and rolling element bearing design. This article uses a static penalty method to handle the constraints of these engineering optimization problems, with the specific formula given below:
$\phi(r) = f(r) \pm \left[ \sum_{j=1}^{m} h_j \max\left(0, z_j(r)\right)^{\varepsilon} + \sum_{k=1}^{n} i_k \left| W_k(r) \right|^{\varphi} \right].$
In Formula (22), $\phi(r)$ is the penalized objective function, $h_j$ and $i_k$ are positive penalty constants, $z_j(r)$ and $W_k(r)$ are the constraint functions, and the exponents $\varepsilon$ and $\varphi$ are set to 1 or 2.
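A minimal sketch of this static penalty; the penalty constants are illustrative (the text fixes only the exponents), and treating $z_j(r)$ as the inequality part and $W_k(r)$ as the equality part is one common reading of the formula:

```python
def static_penalty(f_val, ineq_violations, eq_violations,
                   h=1e6, i_const=1e6, eps=2.0, phi=2.0):
    """Static penalty following Formula (22); h and i_const are illustrative constants.

    ineq_violations : values z_j(r) of the inequality constraints (feasible when <= 0)
    eq_violations   : values W_k(r) of the equality constraints (feasible when == 0)
    """
    penalty = (h * sum(max(0.0, z) ** eps for z in ineq_violations)
               + i_const * sum(abs(w) ** phi for w in eq_violations))
    return f_val + penalty
```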

3.2.1. Minimize the Weight of the Reducer

This problem concerns the design of a speed reducer for a small aircraft engine and involves seven variables: the face width ($c_1$), the module of the teeth ($c_2$), the number of teeth of the pinion ($c_3$), the length of the first shaft ($c_4$), the length of the second shaft ($c_5$), the diameter of the first shaft ($c_6$), and the diameter of the second shaft ($c_7$). The problem is formulated as follows:
Minimize
$f(\bar{c}) = 0.7854\, c_1 c_2^{2}\left(3.3333\, c_3^{2} + 14.9334\, c_3 - 43.0934\right) + 0.7854\left(c_5 c_7^{2} + c_4 c_6^{2}\right) - 1.508\, c_1\left(c_7^{2} + c_6^{2}\right) + 7.477\left(c_7^{3} + c_6^{3}\right).$
Constraints:
$g_1(\bar{c}) = -c_1 c_2^{2} c_3 + 27 \le 0, \quad g_2(\bar{c}) = -c_1 c_2^{2} c_3^{2} + 397.5 \le 0,$
$g_3(\bar{c}) = -\dfrac{c_2 c_6^{4} c_3}{c_4^{3}} + 1.93 \le 0, \quad g_4(\bar{c}) = -\dfrac{c_2 c_7^{4} c_3}{c_5^{3}} + 1.93 \le 0,$
$g_5(\bar{c}) = \dfrac{10}{c_6^{3}}\sqrt{16.91 \times 10^{6} + \left(\dfrac{745\, c_4}{c_2 c_3}\right)^{2}} - 1100 \le 0, \quad g_6(\bar{c}) = \dfrac{10}{c_7^{3}}\sqrt{157.5 \times 10^{6} + \left(\dfrac{745\, c_5}{c_2 c_3}\right)^{2}} - 850 \le 0,$
$g_7(\bar{c}) = c_2 c_3 - 40 \le 0, \quad g_8(\bar{c}) = -\dfrac{c_1}{c_2} + 5 \le 0, \quad g_9(\bar{c}) = \dfrac{c_1}{c_2} - 12 \le 0,$
$g_{10}(\bar{c}) = 1.5\, c_6 - c_4 + 1.9 \le 0, \quad g_{11}(\bar{c}) = 1.1\, c_7 - c_5 + 1.9 \le 0.$
Range:
$2.6 \le c_1 \le 3.6, \quad 0.7 \le c_2 \le 0.8, \quad 17 \le c_3 \le 28, \quad 7.3 \le c_4, c_5 \le 8.3, \quad 2.9 \le c_6 \le 3.9, \quad 5 \le c_7 \le 5.5.$
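For reference, the objective and constraints above translate directly into code; the sketch below mirrors the displayed formulas (with the variable ordering $c_1, \ldots, c_7$ defined above), so any metaheuristic, mESC included, could be evaluated on top of it:

```python
import numpy as np

def reducer_objective(c):
    """Speed reducer weight, transcribed from the objective above.
    c = [c1..c7]: face width, module, pinion teeth, shaft lengths, shaft diameters."""
    c1, c2, c3, c4, c5, c6, c7 = c
    return (0.7854 * c1 * c2**2 * (3.3333 * c3**2 + 14.9334 * c3 - 43.0934)
            + 0.7854 * (c5 * c7**2 + c4 * c6**2)
            - 1.508 * c1 * (c7**2 + c6**2)
            + 7.477 * (c7**3 + c6**3))

def reducer_constraints(c):
    """Returns the g_i(c) values above; feasibility requires all g_i <= 0."""
    c1, c2, c3, c4, c5, c6, c7 = c
    return np.array([
        -c1 * c2**2 * c3 + 27.0,
        -c1 * c2**2 * c3**2 + 397.5,
        -c2 * c6**4 * c3 / c4**3 + 1.93,
        -c2 * c7**4 * c3 / c5**3 + 1.93,
        10.0 / c6**3 * np.sqrt(16.91e6 + (745.0 * c4 / (c2 * c3))**2) - 1100.0,
        10.0 / c7**3 * np.sqrt(157.5e6 + (745.0 * c5 / (c2 * c3))**2) - 850.0,
        c2 * c3 - 40.0,
        -c1 / c2 + 5.0,
        c1 / c2 - 12.0,
        1.5 * c6 - c4 + 1.9,
        1.1 * c7 - c5 + 1.9,
    ])

# Usage sketch with an arbitrary point inside the variable ranges.
c = [3.5, 0.7, 17, 7.3, 7.8, 3.35, 5.29]
print(reducer_objective(c), reducer_constraints(c).max())
```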
To solve this problem, mESC was compared with NRBO, MPSO, CPO, BWO, WOA, HHO, TSA, AO [], and GWO. According to Table 7, mESC provides the best design, with an optimal value of 2994.506339787.
Table 7. Optimization results of reducer design.

3.2.2. Welded Beam Design

This design aims to minimize the fabrication cost of a welded beam [], involving the weld thickness h ($c_1$), the length of the clamped bar l ($c_2$), the bar height t ($c_3$), and the bar thickness b ($c_4$). The schematic diagram is shown in Figure 9, and the formulation is as follows:
Figure 9. Welded beam structure.
Minimize
$f(\bar{c}) = 1.10471\, c_1^{2} c_2 + 0.04811\, c_3 c_4 \left(14.0 + c_2\right).$
Constraints:
$g_1(\bar{c}) = c_1 - c_4 \le 0, \quad g_2(\bar{c}) = \delta(\bar{c}) - \delta_{\max} \le 0, \quad g_3(\bar{c}) = P - P_c(\bar{c}) \le 0,$
$g_4(\bar{c}) = \tau(\bar{c}) - \tau_{\max} \le 0, \quad g_5(\bar{c}) = \sigma(\bar{c}) - \sigma_{\max} \le 0.$
Range:
$0.1 \le c_2, c_3 \le 10, \quad 0.1 \le c_4 \le 2, \quad 0.125 \le c_1 \le 2, \quad \delta_{\max} = 0.25\ \text{in},$
$\tau(\bar{c}) = \sqrt{\tau'^{2} + \tau''^{2} + 2\tau'\tau''\dfrac{c_2}{2R}}, \quad \tau'' = \dfrac{MR}{J}, \quad \tau' = \dfrac{P}{\sqrt{2}\, c_1 c_2}, \quad M = P\left(L + \dfrac{c_2}{2}\right),$
$R = \sqrt{\dfrac{c_2^{2}}{4} + \left(\dfrac{c_1 + c_3}{2}\right)^{2}}, \quad \sigma(\bar{c}) = \dfrac{6PL}{c_4 c_3^{2}}, \quad \delta(\bar{c}) = \dfrac{6PL^{3}}{E c_3^{2} c_4},$
$J = 2\left\{\sqrt{2}\, c_1 c_2 \left[\dfrac{c_2^{2}}{4} + \left(\dfrac{c_1 + c_3}{2}\right)^{2}\right]\right\}, \quad P_c(\bar{c}) = \dfrac{4.013\, E\, c_3 c_4^{3}}{6 L^{2}} \left(1 - \dfrac{c_3}{2L}\sqrt{\dfrac{E}{4G}}\right),$
$L = 14\ \text{in}, \quad P = 6000\ \text{lb}, \quad E = 30 \times 10^{6}\ \text{psi}, \quad G = 12 \times 10^{6}\ \text{psi}, \quad \sigma_{\max} = 13600\ \text{psi}.$
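A sketch of the cost and constraint evaluation built from the relations above; the limit-value defaults follow the text where given ($\delta_{\max}$, $\sigma_{\max}$, $P$, $L$, $E$, $G$), while $\tau_{\max}$, which the text does not list, is exposed as a parameter (13,600 psi is a value commonly used for this benchmark):

```python
import numpy as np

def welded_beam_cost(c):
    """Fabrication cost from the objective above; c = [h, l, t, b] = [c1..c4]."""
    c1, c2, c3, c4 = c
    return 1.10471 * c1**2 * c2 + 0.04811 * c3 * c4 * (14.0 + c2)

def welded_beam_constraints(c, P=6000.0, L=14.0, E=30e6, G=12e6,
                            tau_max=13600.0, sigma_max=13600.0, delta_max=0.25):
    """g_i(c) <= 0 sketch using the auxiliary relations above (tau_max is assumed)."""
    c1, c2, c3, c4 = c
    tau_p = P / (np.sqrt(2.0) * c1 * c2)                     # primary shear stress
    M = P * (L + c2 / 2.0)
    R = np.sqrt(c2**2 / 4.0 + ((c1 + c3) / 2.0)**2)
    J = 2.0 * (np.sqrt(2.0) * c1 * c2 * (c2**2 / 4.0 + ((c1 + c3) / 2.0)**2))
    tau_pp = M * R / J                                       # torsional shear stress
    tau = np.sqrt(tau_p**2 + tau_pp**2 + 2.0 * tau_p * tau_pp * c2 / (2.0 * R))
    sigma = 6.0 * P * L / (c4 * c3**2)                       # bending stress
    delta = 6.0 * P * L**3 / (E * c3**2 * c4)                # end deflection
    P_c = (4.013 * E * c3 * c4**3 / (6.0 * L**2)
           * (1.0 - c3 / (2.0 * L) * np.sqrt(E / (4.0 * G))))  # buckling load
    return np.array([
        c1 - c4,                  # g1
        delta - delta_max,        # g2
        P - P_c,                  # g3
        tau - tau_max,            # g4
        sigma - sigma_max,        # g5
    ])
```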
To solve this problem, mESC was compared with NRBO, MPSO, CPO, BWO, WOA, HHO, TSA, AO, and GWO. As shown in Table 8, the best value obtained by mESC on this problem is 1.670306973.
Table 8. Optimization results of welded beam design.

3.2.3. Step Cone Pulley Problem

The problem is to minimize the weight of a step cone pulley [], involving five variables: the four pulley diameters and the pulley width. Figure 10 shows its design diagram:
Figure 10. Stepping cone pulley.
Minimize
$f(\bar{c}) = \rho\, \omega \left[ d_1^{2}\left(1 + \left(\tfrac{N_1}{N}\right)^{2}\right) + d_2^{2}\left(1 + \left(\tfrac{N_2}{N}\right)^{2}\right) + d_3^{2}\left(1 + \left(\tfrac{N_3}{N}\right)^{2}\right) + d_4^{2}\left(1 + \left(\tfrac{N_4}{N}\right)^{2}\right) \right].$
Constraints:
$h_1(\bar{c}) = X_1 - X_2 = 0, \quad h_2(\bar{c}) = X_1 - X_3 = 0, \quad h_3(\bar{c}) = X_1 - X_4 = 0,$
$g_{i=1,2,3,4}(\bar{c}) = -R_i + 2 \le 0, \quad g_{i=5,6,7,8}(\bar{c}) = (0.75 \times 745.6998) - P_i \le 0.$
Among them,
$X_i = \dfrac{\pi d_i}{2}\left(1 + \dfrac{N_i}{N}\right) + \dfrac{\left(\dfrac{N_i}{N} - 1\right)^{2} d_i^{2}}{4a} + 2a, \quad i = 1, 2, 3, 4,$
$R_i = \exp\left\{\mu\left[\pi - 2\sin^{-1}\left(\left(\dfrac{N_i}{N} - 1\right)\dfrac{d_i}{2a}\right)\right]\right\}, \quad i = 1, 2, 3, 4,$
$P_i = s\, t\, \omega \left(1 - R_i\right)\dfrac{\pi d_i N_i}{60}, \quad i = 1, 2, 3, 4,$
$t = 8\ \text{mm}, \quad s = 1.75\ \text{MPa}, \quad \mu = 0.35, \quad \rho = 7200\ \text{kg/m}^{3}, \quad a = 3\ \text{mm}.$
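The objective and the belt-length terms above can be sketched as follows; the input speed N is not specified in the excerpt, so the default value below (and the way the variables are packed) is only an assumption for illustration, while the density and center distance come from the constants listed above:

```python
import numpy as np

def cone_pulley_weight(d, w, N_out, N=350.0, rho=7200.0):
    """Step-cone-pulley weight from the objective above.

    d     : four pulley diameters [d1..d4] in meters
    w     : pulley width in meters
    N_out : four output speeds [N1..N4] in rpm; N is the assumed input speed (not in the text)
    """
    d = np.asarray(d, dtype=float)
    r = np.asarray(N_out, dtype=float) / N
    return rho * w * np.sum(d**2 * (1.0 + r**2))

def belt_lengths(d, N_out, N=350.0, a=3e-3):
    """Belt length X_i for each step; the equality constraints require all X_i to be equal."""
    d = np.asarray(d, dtype=float)
    r = np.asarray(N_out, dtype=float) / N
    return np.pi * d / 2.0 * (1.0 + r) + (r - 1.0)**2 * d**2 / (4.0 * a) + 2.0 * a
```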
To address this problem, comparative experiments were conducted between mESC and competitors including MPSO, CPO, AVOA [], BWO, WOA, HHO, TSA, AO, and GWO; the results are given in Table 9. mESC achieved the best performance, with an optimal value of 16.983218316.
Table 9. Optimization results of stepping cone pulley.

3.2.4. Robot Clamping Problem

This problem studies the force that a robot gripper can generate when grasping an object while ensuring that the object is not damaged and the grasp is stable []. Figure 11 shows the structural diagram, and the formulation is as follows:
Figure 11. Robot clamping structure.
Minimize
$f(\bar{c}) = \max_{z} F_k(\bar{c}, z) - \min_{z} F_k(\bar{c}, z).$
Constraints:
$g_1(\bar{c}) = -Y_{\min} + y(\bar{c}, Z_{\max}) \le 0, \quad g_2(\bar{c}) = -y(\bar{c}, Z_{\max}) \le 0, \quad g_3(\bar{c}) = Y_{\max} - y(\bar{c}, 0) \le 0,$
$g_4(\bar{c}) = y(\bar{c}, 0) - Y_G \le 0, \quad g_5(\bar{c}) = l^{2} + e^{2} - (a + b)^{2} \le 0,$
$g_6(\bar{c}) = b^{2} - (a - e)^{2} - (l - Z_{\max})^{2} \le 0, \quad g_7(\bar{c}) = Z_{\max} - l \le 0.$
Among them,
$\alpha = \cos^{-1}\left(\dfrac{a^{2} + g^{2} - b^{2}}{2ag}\right) + \phi, \quad g = \sqrt{e^{2} + (z - l)^{2}}, \quad \beta = \cos^{-1}\left(\dfrac{b^{2} + g^{2} - a^{2}}{2bg}\right) - \phi, \quad \phi = \tan^{-1}\left(\dfrac{e}{l - z}\right),$
$y(\bar{c}, z) = 2\left(f + e + x \sin(\beta + \delta)\right), \quad F_k = \dfrac{P\, b \sin(\alpha + \beta)}{2 x \cos(\alpha)},$
$Y_{\min} = 50, \quad Y_{\max} = 100, \quad Y_G = 150, \quad Z_{\max} = 100, \quad P = 100.$
Range:
$0 \le e \le 50, \quad 100 \le x \le 200, \quad 10 \le f, a, b \le 150, \quad 1 \le \delta \le 3.14, \quad 100 \le l \le 300.$
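The force relations above can be sketched as follows; the packing of the design vector and the grid approximation of the inner min/max over z are illustrative choices, not the authors' implementation (only the quantities needed for F_k are used):

```python
import numpy as np

def gripper_force(c, z, P=100.0):
    """Gripping force F_k at actuator displacement z, following the relations above.
    c = [a, b, x, e, f, l, delta]; f and delta are carried for completeness."""
    a, b, x, e, f, l, delta = c
    g = np.sqrt(e**2 + (z - l)**2)
    phi = np.arctan2(e, l - z)
    # arccos arguments are clipped so infeasible geometries do not raise domain errors
    alpha = np.arccos(np.clip((a**2 + g**2 - b**2) / (2.0 * a * g), -1.0, 1.0)) + phi
    beta = np.arccos(np.clip((b**2 + g**2 - a**2) / (2.0 * b * g), -1.0, 1.0)) - phi
    return P * b * np.sin(alpha + beta) / (2.0 * x * np.cos(alpha))

def gripper_objective(c, Z_max=100.0, n_grid=50):
    """max_z F_k - min_z F_k, approximated on a z-grid (a simple practical choice)."""
    zs = np.linspace(0.0, Z_max, n_grid)
    forces = np.array([gripper_force(c, z) for z in zs])
    return forces.max() - forces.min()

# Usage sketch with an arbitrary point inside the variable ranges.
print(gripper_objective([150.0, 150.0, 200.0, 0.0, 50.0, 150.0, 1.5]))
```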
To address this problem, mESC was compared with MPSO, CPO, AVOA, BWO, WOA, HHO, TSA, AO, and GWO. Table 10 presents the design results; mESC achieved the best performance, with an optimal value of 16.983218316.
Table 10. Optimization of robot gripping.

3.2.5. Rolling Element Bearings

This article applies mESC to optimize the design of a rolling element bearing. The schematic diagram is shown in Figure 12; there are ten optimization parameters, and the formulation is as follows:
Figure 12. Rolling element bearing structure.
Minimize
$f(\bar{c}) = \begin{cases} f_x Z^{2/3} D_b^{1.8}, & \text{if } D_b \le 25.4\ \text{mm}, \\ 3.647\, f_x Z^{2/3} D_b^{1.4}, & \text{otherwise}. \end{cases}$
Constraints:
$g_1(\bar{c}) = Z - \dfrac{\phi_0}{2\sin^{-1}\left(D_b / D_m\right)} - 1 \le 0, \quad g_2(\bar{c}) = K_{D\min}(D - d) - 2D_b \le 0, \quad g_3(\bar{c}) = 2D_b - K_{D\max}(D - d) \le 0,$
$g_4(\bar{c}) = D_b - B_w \le 0, \quad g_5(\bar{c}) = 0.5(D + d) - D_m \le 0, \quad g_6(\bar{c}) = D_m - (0.5 + e)(D + d) \le 0,$
$g_7(\bar{c}) = \varepsilon D_b - 0.5\left(D - D_m - D_b\right) \le 0, \quad g_8(\bar{c}) = 0.515 - f_i \le 0, \quad g_9(\bar{c}) = 0.515 - f_0 \le 0.$
where
$f_x = 37.91\left[1 + \left\{1.04\left(\dfrac{1 - \gamma}{1 + \gamma}\right)^{1.72}\left(\dfrac{f_i\left(2 f_0 - 1\right)}{f_0\left(2 f_i - 1\right)}\right)^{0.41}\right\}^{10/3}\right]^{-0.3}, \quad \gamma = \dfrac{D_b \cos(\alpha)}{D_m}, \quad f_i = \dfrac{r_i}{D_b}, \quad f_0 = \dfrac{r_0}{D_b},$
$\phi_0 = 2\pi - 2\cos^{-1}\left(\dfrac{\left[(D - d)/2 - 3(T/4)\right]^{2} + \left[D/2 - T/4 - D_b\right]^{2} - \left[d/2 + T/4\right]^{2}}{2\left[(D - d)/2 - 3(T/4)\right]\left[D/2 - T/4 - D_b\right]}\right),$
$T = D - d - 2D_b, \quad D = 160, \quad d = 90, \quad B_w = 30.$
Range:
$0.5(D + d) < D_m < 0.6(D + d), \quad 0.15(D - d) < D_b < 0.45(D - d), \quad 4 \le Z \le 50, \quad 0.515 \le f_0 \le 0.6,$
$0.4 \le K_{D\min} \le 0.5, \quad 0.6 \le K_{D\max} \le 0.7, \quad 0.3 \le \varepsilon \le 0.4, \quad 0.02 \le e \le 0.1, \quad 0.6 \le \zeta \le 0.85.$
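The load-capacity expression above can be sketched as follows; the packing of the ten design variables and the assumption of a zero contact angle (so that cos α = 1, as is usual for this benchmark) are illustrative:

```python
def bearing_capacity(c, D=160.0, d=90.0, Bw=30.0):
    """Dynamic load capacity from the objective above.

    c = [Dm, Db, Z, fi, f0, KDmin, KDmax, eps, e, zeta]; only the first five
    enter the objective, the rest appear in the constraints."""
    Dm, Db, Z, fi, f0 = c[0], c[1], c[2], c[3], c[4]
    gamma = Db / Dm                       # contact angle assumed zero, so cos(alpha) = 1
    fx = 37.91 * (1.0 + (1.04 * ((1.0 - gamma) / (1.0 + gamma))**1.72
                  * (fi * (2.0 * f0 - 1.0) / (f0 * (2.0 * fi - 1.0)))**0.41)**(10.0 / 3.0))**(-0.3)
    if Db <= 25.4:                        # piecewise objective as displayed above
        return fx * Z**(2.0 / 3.0) * Db**1.8
    return 3.647 * fx * Z**(2.0 / 3.0) * Db**1.4

# Usage sketch with an arbitrary point inside the variable ranges.
print(bearing_capacity([125.0, 21.0, 11, 0.515, 0.515, 0.4, 0.7, 0.3, 0.02, 0.6]))
```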
To solve this problem, mESC was compared experimentally with MPSO, CPO, AVOA, BWO, WOA, HHO, TSA, AO, and GWO. mESC achieves the best performance, and its optimal value of 16,958.202286941 is given in Table 11.
Table 11. Optimization of rolling element bearings.

4. Conclusions

This study proposes mESC, an improved version of ESC based on multi-strategy enhancement. It maintains population diversity through an adaptive perturbation factor strategy, improves global exploration through a restart mechanism, and balances the local exploitation of the algorithm through a dynamic centroid reverse learning strategy. Finally, the elite pool boundary adjustment strategy is used to accelerate population convergence. mESC was tested on the benchmark suite and on six optimization design problems to demonstrate its strong performance. In the future, we will further extend mESC, research new population update mechanisms, and apply them in areas such as feature selection, image segmentation, and information processing.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/biomimetics10040232/s1.

Author Contributions

Conceptualization, J.L. and L.C.; Methodology, J.L. and J.Y.; Software, J.L., J.Y. and L.C.; Validation, J.Y. and L.C.; Formal analysis, J.L. and L.C.; Investigation, J.L. and J.Y.; Resources, J.L., J.Y. and L.C.; Data curation, J.L. and L.C.; Writing—original draft, J.L., J.Y. and L.C.; Writing—review & editing, J.L., J.Y. and L.C.; Visualization, J.Y. and L.C.; Supervision, J.L. and L.C.; Project administration, J.L.; Funding acquisition, J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the Natural Science Research Program of the Shaanxi Education Department in China (grant No. 24JC021). This work is also supported by the 2024 Shaanxi Provincial Key R&D Program project in China (grant No. 2024GX-YBXM-529).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during the study are included in this published article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Krentel, M.W. The complexity of optimization problems. In Proceedings of the Eighteenth Annual ACM Symposium on Theory of Computing, Berkeley, CA, USA, 28–30 May 1986; pp. 69–76. [Google Scholar] [CrossRef]
  2. Zhao, Y.; vom Lehn, F.; Pitsch, H.; Pelucchi, M.; Cai, L. Mechanism optimization with a novel objective function: Surface matching with joint dependence on physical condition parameters. Proc. Combust. Inst. 2024, 40, 105240. [Google Scholar] [CrossRef]
  3. Alireza, M.N.; Mohamad, S.A.; Rana, I. Optimizing a self-healing gelatin/aldehyde-modified xanthan gum hydrogel for extrusion-based 3D printing in biomedical applications. Mater. Today Chem. 2024, 40, 102208. [Google Scholar] [CrossRef]
  4. Ehsan, B.; Peter, B. A simulation-optimization approach for integrating physical and financial flows in a supply chain under economic uncertainty. Oper. Res. Perspect. 2023, 10, 100270. [Google Scholar] [CrossRef]
  5. Xie, D.W.; Qiu, Y.Z.; Huang, J.S. Multi-objective optimization for green logistics planning and operations management: From economic to environmental perspective. Comput. Ind. Eng. 2024, 189, 109988. [Google Scholar] [CrossRef]
  6. Ibham, V.; Aslan, D.K.; Sener, A.; Martin, S.; Muhammad, I. Machine learning of weighted superposition attraction algorithm for optimization diesel engine performance and emission fueled with butanol-diesel biofuel. Ain Shams Eng. J. 2024, 12, 103126. [Google Scholar] [CrossRef]
  7. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95−International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar] [CrossRef]
  8. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  9. Trojovská, E.; Dehghani, M.; Trojovský, P. Zebra Optimization Algorithm: A New Bio-Inspired Optimization Algorithm for Solving Optimization Algorithm. IEEE Access 2022, 10, 49445–49473. [Google Scholar] [CrossRef]
  10. Seyedali, M. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  11. Abdel-Basset, M.; Mohamed, R.; Jameel, M. Spider wasp optimizer: A novel meta-heuristic optimization algorithm. Artif. Intell. Rev. 2023, 56, 11675–11738. [Google Scholar] [CrossRef]
  12. Zhao, S.; Zhang, T.; Ma, S.; Chen, M. Sea-horse optimizer: A novel nature-inspired meta-heuristic for global optimization problems. Appl. Intell. 2023, 53, 11833–11860. [Google Scholar] [CrossRef]
  13. Hao, W.G.; Wang, L.Y.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  14. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf Mongoose Optimization Algorithm: A New Bio-Inspired Metaheuristic Method for Engineering Optimization. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  15. Sang-To, T.; Hoang-Le, M.; Wahab, M.A. An efficient Planet Optimization Algorithm for solving engineering problems. Sci. Rep. 2022, 12, 8362. [Google Scholar] [CrossRef]
  16. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 4661–4667. [Google Scholar] [CrossRef]
  17. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–learning-based optimization: An optimization method for continuous non-linear large scale problems. Inf. Sci. 2012, 183, 1–15. [Google Scholar] [CrossRef]
  18. Hu, G.; Cheng, M.; Houssein, E.H.; Hussien, A.G.; Abualigah, L. SDO: A novel sled dog-inspired optimizer for solving engineering problems. Adv. Eng. Inform. 2024, 62, 102783. [Google Scholar] [CrossRef]
  19. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  20. Dehghani, M.; Trojovský, P. Osprey optimization algorithm: A new bio-inspired metaheuristic algorithm for solving engineering optimization problems. Front. Mech. Eng. 2023, 8, 1126450. [Google Scholar] [CrossRef]
  21. Xue, J.; Shen, B. Dung beetle optimizer: A new meta-heuristic algorithm for global optimization. J. Supercomput. 2023, 79, 7305–7336. [Google Scholar] [CrossRef]
  22. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  23. Hakki, M.G.; Eksin, I.; Erol, O.K. A new optimization method: Big Bang-Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  24. Hu, G.; Huang, F.Y.; Seyyedabbasi, A.; Wei, G. Enhanced multi-strategy bottlenose dolphin optimizer for UAVs path planning. Appl. Math. Model. 2024, 130, 243–271. [Google Scholar] [CrossRef]
  25. Hu, G.; Gong, C.S.; Li, X.X.; Xu, Z.Q. CGKOA: An enhanced Kepler optimization algorithm for multi-domain optimization problems. Comput. Methods Appl. Mech. Eng. 2024, 425, 116964. [Google Scholar] [CrossRef]
  26. Hu, G.; Du, B.; Chen, K.; Wei, G. Super eagle optimization algorithm based three-dimensional ball security corridor planning method for fixed-wing UAVs. Adv. Eng. Inform. 2024, 59, 102354. [Google Scholar] [CrossRef]
  27. Hu, G.; Huang, F.Y.; Chen, K.; Wei, G. MNEARO: A meta swarm intelligence optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2024, 419, 116664. [Google Scholar] [CrossRef]
  28. Ouyang, K.; Fu, S.; Chen, Y.; Cai, Q.; Heidari, A.A.; Chen, H. Escape: An Optimizer based on crowd evacuation behaviors. Artif. Intell. Rev. 2024, 58, 19. [Google Scholar] [CrossRef]
  29. Zhang, G.Z.; Fu, S.W.; Li, K.; Huang, H.S. Differential evolution with multi-strategies for UAV trajectory planning and point cloud registration. Appl. Soft Comput. 2024, 167 Pt C, 112466. [Google Scholar] [CrossRef]
  30. Lian, J.B.; Hui, G.H.; Ma, L.; Zhu, T.; Wu, X.C.; Heidari, A.A.; Chen, Y.; Chen, H.L. Parrot optimizer: Algorithm and applications to medical problems. Comput. Biol. Med. 2024, 172, 108064. [Google Scholar] [CrossRef]
  31. Rezaei, F.; Safavi, H.R.; Abd Elaziz, M. GMO: Geometric mean optimizer for solving engineering problems. Soft Comput. 2023, 27, 10571–10606. [Google Scholar] [CrossRef]
  32. Qi, A.; Zhao, D.; Heidari, A.A. FATA: An Efficient Optimization Method Based on Geophysics. Neurocomputing 2024, 607, 128289. [Google Scholar] [CrossRef]
  33. Zheng, B.; Chen, Y.; Wang, C. The moss growth optimization (MGO): Concepts and performance. J. Comput. Des. Eng. 2024, 11, 184–221. [Google Scholar] [CrossRef]
  34. Abdel-Basset, M.; Mohamed, R.; Abouhawwash, M. Crested Porcupine Optimizer: A new nature-inspired metaheuristic. Knowl.-Based Syst. 2024, 284, 111257. [Google Scholar] [CrossRef]
  35. Yuan, C.; Dong, Z.; Heidari, A.A.; Liu, L.; Chen, Y.; Chen, H.l. Polar lights optimizer: Algorithm and applications in image segmentation and feature selection. Neurocomputing 2024, 607, 128427. [Google Scholar] [CrossRef]
  36. Sowmya, R.; Premkumar, M.; Jangir, P. Newton-Raphson-based optimizer: A new population-based metaheuristic algorithm for continuous optimization problems. Eng. Appl. Artif. Intell. 2024, 128, 107532. [Google Scholar] [CrossRef]
  37. Wu, X.; Li, S.; Jiang, X. Information acquisition optimizer: A new efficient algorithm for solving numerical and constrained engineering optimization problems. J. Supercomput. 2024, 80, 25736–25791. [Google Scholar] [CrossRef]
  38. Gao, Y.; Zhang, J.; Wang, Y.; Wang, J.; Qin, L. Correction to: Love evolution algorithm: A stimulus–value–role theory-inspired evolutionary algorithm for global optimization. J. Supercomput. 2024, 80, 15097–15099. [Google Scholar] [CrossRef]
  39. Mirjalili, S.; Lewis, A.; Sadiq, A.S. Autonomous Particles Groups for Particle Swarm Optimization. Arab. J. Sci. Eng. 2014, 39, 4683–4697. [Google Scholar] [CrossRef]
  40. Rather, S.A.; Bala, P.S. Constriction coefficient based particle swarm optimization and gravitational search algorithm for multilevel image thresholding. Expert Syst. 2021, 38, 12717. [Google Scholar] [CrossRef]
  41. Vinodh, G.; Kathiravan, K.; Mahendran, G. Distributed Network Reconfiguration for Real Power Loss Reduction Using TACPSO. Int. J. Adv. Res. Electr. Electron. Instrum. Eng. 2017, 6, 7517–7525. [Google Scholar] [CrossRef]
  42. Civicioglu, P.; Besdok, E. Bernstein-Levy differential evolution algorithm for numerical function optimization. Neural Comput. Appl. 2023, 35, 6603–6621. [Google Scholar] [CrossRef]
  43. Akgungor, A.P.; Korkmaz, E. Bezier Search Differential Evolution algorithm based estimationmodels of delay parameter k for signalized intersections. Concurr. Comput. Pract. Exp. 2022, 34, e6931. [Google Scholar] [CrossRef]
  44. Emin, K.A. Detection of object boundary from point cloud by using multi-population based differential evolution algorithm. Neural Comput. Appl. 2023, 35, 5193–5206. [Google Scholar] [CrossRef]
  45. Biswas, S.; Saha, D.; De, S.; Cobb, A.D.; Jalaian, B.A. Improving Differential Evolution through Bayesian Hyperparameter Optimization. In Proceedings of the 2021 IEEE Congress on Evolutionary Computation (CEC), Krakow, Poland, 28 June–1 July 2021. [Google Scholar] [CrossRef]
  46. Awad, N.H.; Ali, M.Z.; Suganthan, P.N. Ensemble sinusoidal differential covariance matrix adaptation with Euclidean neighborhood for solving CEC2017 benchmark problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), Donostia/San Sebastian, Spain, 5–8 June 2017. [Google Scholar] [CrossRef]
  47. Mohamed, A.W.; Hadi, A.A.; Fattouh, A.M.; Jambi, K.M. L-SHADE with semi parameter adaptation approach for solving CEC 2017 benchmark problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), Donostia/San Sebastian, Spain, 5–8 June 2017; pp. 1456–1463. [Google Scholar] [CrossRef]
  48. Tanabe, R.; Fukunaga, A. Improving the search performance of SHADE using linear population size reduction. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation, Beijing, China, 6–11 July 2014; pp. 1658–1665. [Google Scholar] [CrossRef]
  49. Tejani, G.G.; Savsani, V.J.; Bureerat, S.; Patel, V.K. Topology and size optimization of trusses with static and dynamic bounds by modified symbiotic organisms search. J. Comput. Civ. Eng. 2018, 32, 04017085. [Google Scholar] [CrossRef]
  50. Bansal, A.K.; Gupta, R.A.; Kumar, R. Optimization of hybrid PV/wind energy system using Meta Particle Swarm Optimization (MPSO). In Proceedings of the India International Conference on Power Electronics, New Delhi, India, 28–30 January 2011. [Google Scholar] [CrossRef]
  51. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H.L. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  52. Civicioglu, P.; Besdok, E. Bernstain-search differential evolution algorithm for numerical function optimization. Expert Syst. Appl. 2019, 138, 112831. [Google Scholar] [CrossRef]
  53. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
  54. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-qaness, M.A.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  55. Hu, G.; Zhu, X.N.; Wei, G.; Chang, C.T. An improved marine predators algorithm for shape optimization of developable Ball surfaces. Eng. Appl. Artif. Intell. 2021, 105, 104417. [Google Scholar] [CrossRef]
  56. Siddall, J.N. Optimal Engineering Design: Principles and Applications; CRC Press: Boca Raton, FL, USA, 1982. [Google Scholar]
  57. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  58. Ekrem, Ö.; Aksoy, B. Trajectory planning for a 6-axis robotic arm with particle swarm optimization algorithm. Eng. Appl. Artif. Intell. 2023, 122, 106099. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
