Article

Multi-Degree Reduction of Said–Ball Curves and Engineering Design Using Multi-Strategy Enhanced Coati Optimization Algorithm

1 Department of Science, Jiangxi University of Science and Technology, Ganzhou 341000, China
2 Jiangxi Provincial Key Laboratory of Multidimension Intelligent Perception and Control, Jiangxi University of Science and Technology, Ganzhou 341000, China
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(7), 416; https://doi.org/10.3390/biomimetics10070416
Submission received: 21 May 2025 / Revised: 16 June 2025 / Accepted: 24 June 2025 / Published: 26 June 2025

Abstract

Within computer-aided geometric design (CAGD), Said–Ball curves are primarily adopted in domains such as 3D object skeleton modeling, vascular structure repair, and path planning, owing to their flexible geometric properties. Techniques for curve degree reduction seek to reduce computational and storage demands while striving to maintain the essential geometric attributes of the original curve. This study presents a novel degree reduction model leveraging Euclidean distance and curvature data, markedly improving the preservation of geometric features throughout the reduction process. To enhance performance further, we propose a multi-strategy enhanced coati optimization algorithm (MSECOA). This algorithm utilizes a good point set combined with opposition-based learning to refine the initial population distribution, employs a fitness–distance equilibrium approach alongside a dynamic spiral search strategy to harmonize global exploration with local exploitation, and integrates an adaptive differential evolution mechanism to boost convergence rates and robustness. Experimental results demonstrate that the MSECOA outperforms nine highly cited algorithms in terms of convergence performance, solution accuracy, and stability. The algorithm exhibits superior behavior on the IEEE CEC2017 and CEC2022 benchmark functions and demonstrates strong practical utility across four engineering optimization problems with constraints. When applied to multi-degree reduction approximation of Said–Ball curves, the algorithm’s effectiveness is substantiated through four reduction cases, highlighting its superior precision and computational efficiency, thus providing a highly effective and accurate solution for complex curve degree reduction tasks.

1. Introduction

In the fields of computer-aided geometric design (CAGD) and computer graphics, significant progress has been made in shape optimization methods for parameterized free-form curves. The Ball curve, as one of the classic curves in CAGD, is primarily applied in mechanical design, vascular structure repair, and path planning due to its excellent properties in geometric modeling [1]. Since Ball introduced the rational cubic Ball curve in 1974 [2], research on Ball curves has continued to deepen. Wang extended the cubic Ball curve to higher-order generalized Ball curves [3], while Said further proposed the generalized Ball curve, extending it to arbitrary odd degrees and revealing its numerous advantageous properties in geometric modeling [4]. Hu et al. conducted an in-depth comparative study of generalized Ball curves and Bézier curves in terms of recursive evaluation and envelope properties [5]. In 2000, Wu introduced two new generalized Ball curves, namely the Said–Bézier generalized Ball curve and the Wang–Said generalized Ball curve (WSGB) [6]. Recently, Liu et al. proposed the h–Said–Ball basis function, enhancing modeling capabilities and improving evaluation efficiency through h-calculus [7]. These new curves exhibit more efficient recursive algorithms compared to traditional Bézier curves.
In curve approximation techniques, Sederberg and Farouki first introduced interval Bézier curves into curve approximation and formally proposed interval algorithms to ensure the stability and accuracy of computational results [8]. Tuohy et al. introduced the concept of interval B-spline curves and surfaces and studied the boundary properties of interval Bézier curves [8]. Lin and Rokne proposed disk Bézier curves and spherical Bézier curves, which demonstrated superior performance in geometric computations [9]. Lu et al. developed a Bézier curve degree reduction algorithm based on L2 norm approximation error control, significantly improving the stability and accuracy of degree reduction [10]. Quan et al. proposed a preprocessing asymptotic iterative approximation method to accelerate the convergence speed of tensor product Said–Ball surfaces, markedly enhancing computational efficiency and stability [11].
The degree reduction of curves and surfaces has long been a research focus in CAGD, particularly in complex geometric design, where the need for degree reduction is more pressing. Hu et al. conducted an in-depth study on the approximate degree reduction of Said–Ball curves and surfaces, proposing a stepwise degree reduction method, though it can only achieve single-step reduction at a time [5]. Chen and Lou developed a degree reduction algorithm for interval B-spline curves [12], while Tan and Fang systematically explored the degree reduction of interval WSGB curves [13]. Hu and Wang, based on Jacobi polynomials and quadratic programming theory, proposed an optimal multi-degree reduction algorithm [14]. Wang Hongkai achieved the exact degree reduction of Bézier curves through polynomial reparameterization, significantly simplifying the process and improving computational efficiency [15]. However, traditional degree reduction methods (e.g., least squares, geometric approximation, and classical optimization algorithms) exhibit significant limitations when handling complex curves like Said–Ball curves: first, their convergence speed is slow, failing to meet the demands of efficient computation; second, their accuracy is insufficient, particularly in high-dimension spaces where they tend to fall into local optima, making it difficult to simultaneously preserve geometric features and minimize errors; and third, their robustness is poor, lacking stability for complex shapes or high-degree curves, resulting in degree reduction outcomes that deviate from expectations. Moreover, traditional methods struggle to find globally optimal solutions when dealing with curves exhibiting multimodal or nonlinear characteristics, further limiting their applicability.
In view of the aforementioned challenges, the degree reduction problem can essentially be formulated as an optimization task, making it well-suited for resolution through bio-inspired intelligent optimization algorithms. In recent years, such algorithms have demonstrated significant potential in addressing complex optimization problems, particularly excelling in global search capabilities, convergence performance, and algorithmic robustness. These methods draw inspiration from natural phenomena, animal behaviors, and evolutionary mechanisms, abstracting biological principles into mathematical models to simulate adaptive and efficient search strategies. According to the No Free Lunch (NFL) theorem, no single algorithm can achieve optimal performance across all types of optimization problems [16]. This highlights the necessity of designing specialized algorithms tailored to specific application domains. To address this need, researchers have continuously proposed and refined bio-inspired intelligent optimization algorithms to deliver more efficient and targeted solutions. For instance, Cheng et al., based on uniform experimental design theory, proposed an improved sparrow search algorithm (ISSA), inspired by the foraging behavior of sparrows. By incorporating a surrounding diversity metric, dynamic management strategies, and boundary update mechanisms, ISSA enhances population diversity and convergence efficiency and has been validated for its superior performance in UAV path planning [17]. Mahmoud et al. developed an enhanced hiking optimization algorithm (AEDHOA), drawing inspiration from human hiking behavior. The algorithm introduces four strategies to improve population diversity and convergence efficiency, effectively tackling high-dimension feature selection problems while demonstrating strong global search capabilities and robustness [18]. These studies suggest that biomimetic intelligent optimization algorithms are powerful tools for addressing curve and surface degree reduction in CAGD.
Specifically, Hu et al. applied the grey wolf optimization algorithm (GWO)—a bio-inspired swarm intelligence method—to investigate the multi-degree reduction of SG-Bézier curves [19]. Guo et al. employed a skewed normal cloud-modified whale optimization algorithm (WOA) to enhance the degree reduction performance for S-λ curves, leveraging its global search capabilities [20]. Additionally, Hu et al. proposed an enhanced golden jackal optimization algorithm (EGJO), which integrates oppositional learning, adaptive mutation, and binomial crossover strategies to improve the shape optimization of complex CSGC–Ball surfaces [21]. These studies collectively demonstrate the significant potential of biomimetic intelligent optimization algorithms in addressing complex curve and surface approximation problems, offering efficient and reliable solutions for high-order degree reduction tasks. However, existing biomimetic intelligent optimization algorithms still exhibit certain limitations when addressing degree reduction of complex curves. For instance, the grey wolf optimization algorithm (GWO) excels in global search capabilities but tends to get trapped in local optima in high-dimension problems and suffers from slow convergence [19]; the whale optimization algorithm (WOA) lacks sufficient local exploitation ability, particularly showing unstable performance in multimodal optimization problems [20]; and golden jackal optimization (GJO) struggles to maintain population diversity, often leading to premature convergence [21]. These shortcomings constrain the effectiveness of existing algorithms in complex curve degree reduction problems.
This paper addresses the limitations of the coati optimization algorithm [22], such as insufficient population diversity, limited local convergence capability, and a tendency to become trapped in local optima in complex optimization problems, by proposing a multi-strategy enhanced coati optimization algorithm (MSECOA) and applying it to the degree reduction approximation problem of Said–Ball curves. The main contributions of this paper are as follows:
  • Construction of a Said–Ball curve degree reduction model: A new degree reduction approximation model for Said–Ball curves is proposed, with the objective function designed to minimize Euclidean distance and curvature error.
  • Proposal of a multi-strategy enhanced coati optimization algorithm: The initial population distribution is optimized using a hybrid oppositional learning strategy based on a good point set, while global search and local exploitation capabilities are enhanced through a fitness–distance balance strategy and a dynamic spiral search strategy. An adaptive differential evolution mechanism is introduced to improve convergence speed and robustness. Comparative experiments on the IEEE CEC2017 and CEC2022 test function suites, as well as four engineering constrained design problems, validate the MSECOA’s significant advantages in optimization accuracy, convergence speed, and robustness.
  • Validation of the MSECOA’s experimental performance in Said–Ball curve degree reduction: Through numerical examples involving four different degree reduction levels, the MSECOA’s efficiency in multi-degree reduction approximation of Said–Ball curves is demonstrated. Experimental results show that the MSECOA effectively preserves the geometric features of curves while reducing their degree, offering an efficient solution for complex curve degree reduction problems.
The structure of this paper is organized as follows: Section 2 systematically elaborates the degree reduction approximation problem of Said–Ball curves and constructs a degree reduction optimization model. Section 3 provides a detailed introduction to the core principles and mathematical formulation of the coati optimization algorithm (COA). Section 4 presents the multi-strategy enhanced coati optimization algorithm (MSECOA), including pseudocode, flowcharts, and a theoretical analysis of its computational complexity. Section 5 comprehensively evaluates the MSECOA’s optimization performance using the IEEE CEC2017 and CEC2022 benchmark functions. Section 6 applies the MSECOA to four typical engineering constrained optimization problems to verify its practical effectiveness. Section 7 validates the MSECOA’s superior performance in Said–Ball curve degree reduction through four sets of numerical experiments with varying degree reduction levels. Finally, Section 8 summarizes the research contributions and innovations of this study and provides an outlook on future research directions.

2. Degree Reduction Approximation for Said–Ball Curves

2.1. Said–Ball Curves

Said–Ball curves, introduced by Said in 1991, represent an important class of parametric curves in computer-aided geometric design (CAGD). As a generalized form of Ball curves, they not only preserve many desirable properties of Bézier curves but also demonstrate superior flexibility and computational efficiency in specific applications. These curves are mathematically defined through control points and basis functions, enabling the precise representation of complex geometric shapes. Their applications span diverse engineering fields including mechanical design, aerospace engineering, and architectural modeling.
Definition 1. 
Given a set of control points $P_i\ (i = 0, 1, \ldots, n)$ and the parameter $t \in [0, 1]$, the Said–Ball curve is expressed as Equation (1):
$$C(t) = \sum_{i=0}^{n} P_i S_i^n(t),$$
where $S_i^n(t)$ denotes the Said–Ball basis function defined by Equation (2):
$$S_i^n(t) = \begin{cases} \dbinom{\lfloor n/2 \rfloor + i}{i}\, t^i (1-t)^{\lfloor n/2 \rfloor + 1}, & 0 \le i \le \lceil n/2 \rceil - 1, \\[4pt] \dbinom{n}{n/2}\, t^{n/2} (1-t)^{n/2}, & i = n/2 \ (n\ \text{even}), \\[4pt] S_{n-i}^n(1-t), & \lfloor n/2 \rfloor + 1 \le i \le n; \end{cases}$$
here, $\lfloor \cdot \rfloor$ and $\lceil \cdot \rceil$ denote the floor and ceiling functions, respectively.
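For readers implementing the curve, the definition above translates directly into code. The following Python sketch (an illustration of Equations (1) and (2), not code from the paper) evaluates the basis by its three cases, handling the symmetric case recursively:

```python
import math
import numpy as np

def said_ball_basis(n: int, i: int, t: float) -> float:
    """Said-Ball basis S_i^n(t) from Equation (2)."""
    lo = n // 2            # floor(n/2)
    hi = -(-n // 2)        # ceil(n/2)
    if i <= hi - 1:        # first case: 0 <= i <= ceil(n/2) - 1
        return math.comb(lo + i, i) * t**i * (1 - t)**(lo + 1)
    if n % 2 == 0 and i == n // 2:             # middle case for even n
        return math.comb(n, n // 2) * t**(n // 2) * (1 - t)**(n // 2)
    return said_ball_basis(n, n - i, 1 - t)    # symmetry case

def said_ball_curve(points, t: float):
    """C(t) = sum_i P_i * S_i^n(t) from Equation (1); points has shape (n+1, dim)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts) - 1
    w = np.array([said_ball_basis(n, i, t) for i in range(n + 1)])
    return w @ pts
```

As a quick sanity check, the basis functions sum to one for any t, so `said_ball_curve` reproduces the first and last control points at t = 0 and t = 1.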

2.2. Degree Reduction Problem of Said–Ball Curves

The degree reduction approximation problem of Said–Ball curves refers to reducing a high-degree Said–Ball curve $P_n(t)$ to a lower-degree Said–Ball curve $Q_m(t)$ (where $m < n$), while preserving the geometric features of the original curve as much as possible. The primary objective of degree reduction is to decrease the number of control points, thereby reducing computational complexity and storage requirements, while maintaining the shape accuracy of the curve.
Definition 2. 
The degree reduction problem of Said–Ball curves can be described as follows: Given an n-th degree Said–Ball curve, as shown in Equation (3),
$$P_n(t) = \sum_{i=0}^{n} P_i S_i^n(t),$$
where $P_i$ are the control points and $S_i^n(t)$ are the Said–Ball basis functions, the objective is to determine a set of real numbers $\{Q_i\}_{i=0}^{m}$ such that the corresponding m-th degree Said–Ball curve
$$Q_m(t) = \sum_{i=0}^{m} Q_i S_i^m(t)$$
satisfies the following condition over the closed interval $[0, 1]$:
$$d\left( P_n(t), Q_m(t) \right) = \max_{0 \le t \le 1} \left\| P_n(t) - Q_m(t) \right\| \to \min.$$
Here, $d\left( P_n(t), Q_m(t) \right)$ represents the maximum error between the two curves, serving as the primary optimization objective of the degree reduction problem.
Figure 1 illustrates a comparison between a sixth-degree Said–Ball curve and its reduction to a fourth-degree Said–Ball curve.
To preserve the geometric features of the original curve as much as possible during the degree reduction process, the following constraints are typically required to be satisfied:
  • End Point Constraint
    The reduced-degree curve $Q_m(t)$ must coincide with the original curve $P_n(t)$ at the end points, i.e.,
    $$P_n(0) = Q_m(0) \quad \text{and} \quad P_n(1) = Q_m(1).$$
    This constraint ensures consistency between the reduced-degree curve and the original curve at the start and end points.
  • Higher-Order Derivative Constraint
    To further maintain the smoothness of the curve, the higher-order derivatives of the reduced-degree curve $Q_m(t)$ at the end points should match those of the original curve, i.e.,
    $$\left. \frac{d^j P_n(t)}{dt^j} \right|_{t=0} = \left. \frac{d^j Q_m(t)}{dt^j} \right|_{t=0}, \quad j = 0, 1, \ldots, r, \qquad \left. \frac{d^j P_n(t)}{dt^j} \right|_{t=1} = \left. \frac{d^j Q_m(t)}{dt^j} \right|_{t=1}, \quad j = 0, 1, \ldots, s,$$
    where r and s denote the derivative orders at the start and end points, respectively, satisfying $r + s + 1 \le m$. A curve $Q_m(t)$ meeting these conditions is referred to as an end point-preserving $(r, s)$-order interpolation degree reduction approximation of $P_n(t)$.
  • Boundary Constraint
    The reduced-degree curve $Q_m(t)$ should lie within the convex hull of the original curve $P_n(t)$ to ensure the reasonableness of the geometric shape. This constraint prevents potential shape distortion issues during the degree reduction process; a sampled form of the resulting constrained objective is sketched below.
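To make the optimization model concrete, the sketch below (an illustrative encoding, not the paper's exact model, assuming the `said_ball_curve` helper from the Definition 1 sketch) approximates the objective of Equation (5) on a parameter grid and pins the end control points per the end point constraint:

```python
import numpy as np

# Assumes the said_ball_curve helper from the Definition 1 sketch is in scope.

def reduction_error(P, Q, samples=200):
    """Sampled surrogate for d(P_n, Q_m) = max_t ||P_n(t) - Q_m(t)|| in Equation (5)."""
    ts = np.linspace(0.0, 1.0, samples)
    return max(np.linalg.norm(said_ball_curve(P, t) - said_ball_curve(Q, t)) for t in ts)

def make_candidate(P, inner, m):
    """Build an m-th degree candidate satisfying the end point constraint:
    Q_0 and Q_m are pinned to P_0 and P_n, so only the m-1 inner control
    points are free decision variables for the optimizer."""
    inner = np.asarray(inner, dtype=float).reshape(m - 1, -1)
    return np.vstack([P[0], inner, P[-1]])
```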

3. Coati Optimization Algorithm

The coati optimization algorithm (COA), a novel swarm intelligence optimization method, was proposed by Dehghani et al. in 2023, drawing inspiration from the natural predatory behavior of coatis hunting iguanas and evading predators [22]. By simulating the hunting and escaping mechanisms of coatis, this algorithm exhibits strong global search capabilities and rapid convergence. However, the COA faces limitations in practical applications: uneven population distribution during the initialization phase may result in inadequate coverage of the search space, while its tendency to become trapped in local optima when addressing multimodal optimization problems can lead to a reduced convergence speed. These shortcomings constrain the algorithm’s performance in complex optimization scenarios.

3.1. Population Initialization

Coatis often operate in groups, with each coati’s position representing a candidate solution to the optimization problem. Since the positions of the coatis are random, the population must first be initialized, as shown in Equation (8):
$$X_i : x_{i,j} = lb_j + r \cdot (ub_j - lb_j), \quad i = 1, 2, \ldots, N;\ j = 1, 2, \ldots, m,$$
where $X_i$ is the position of the i-th coati in the search space, $x_{i,j}$ is the value of the j-th decision variable, N is the number of coatis, m is the number of decision variables, r is a random real number in the interval $[0, 1]$, and $lb_j$ and $ub_j$ are the lower and upper bounds of the j-th decision variable, respectively.

3.2. Hunting and Attack Strategy on Iguanas (Exploration Phase)

In the exploration phase, the COA simulates the process of coatis hunting and attacking iguanas. Specifically, half of the coatis climb trees to approach and intimidate the iguana, while the other half remain on the ground waiting. When the iguana falls from the tree, the ground-based coatis quickly kill it. This strategy enables coatis to migrate to different positions within the search space, emphasizing the COA’s global exploration capability in the problem-solving domain. Figure 2 illustrates this strategy in detail.
In COA design, the position of the individual with the best fitness in the coati population is assumed to represent the iguana’s position. Thus, the position update for tree-climbing coatis tasked with attacking iguanas is given by Equation (9):
$$X_i^{P1} : x_{i,j}^{P1} = x_{i,j} + rand(0,1) \cdot \left( G_j - I \cdot x_{i,j} \right), \quad i = 1, 2, \ldots, \lfloor N/2 \rfloor;\ j = 1, 2, \ldots, m,$$
where $X_i^{P1}$ is the new position calculated for the i-th coati, $x_{i,j}^{P1}$ is its value in the j-th dimension, $rand(0,1)$ is a random variable in $[0, 1]$, $G_j$ denotes the iguana’s position in the j-th dimension, and I is an integer randomly selected from the set $\{1, 2\}$.
When the iguana falls to the ground, its position may land at a random location within the search space, with the position update formula given by Equation (10):
$$G^g : G_j^g = lb_j + rand(0,1) \cdot \left( ub_j - lb_j \right)$$
Meanwhile, the movement of ground-based coatis in the search space follows a similar position update rule, as shown in Equation (11):
$$X_i^{P1} : x_{i,j}^{P1} = \begin{cases} x_{i,j} + rand(0,1) \cdot \left( G_j^g - I \cdot x_{i,j} \right), & F_{G^g} < F_i, \\ x_{i,j} + rand(0,1) \cdot \left( x_{i,j} - G_j^g \right), & \text{otherwise}, \end{cases} \quad i = \lfloor N/2 \rfloor + 1, \ldots, N;\ j = 1, 2, \ldots, m,$$
where $G_j^g$ represents the iguana’s position on the ground and F is the value of the objective function.
If a coati’s new position improves the objective function value, that position is accepted; otherwise, the coati remains at its original position. This update condition is applied to all coatis $i = 1, 2, \ldots, N$, as simulated by Equation (12):
$$X_i = \begin{cases} X_i^{P1}, & F_i^{P1} < F_i, \\ X_i, & \text{otherwise}. \end{cases}$$
Through these strategies, the COA achieves efficient global exploration and local exploitation in the search space, resulting in superior performance in optimization problems.
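A minimal Python sketch of one exploration step (Equations (9)–(12)) for a minimization problem may help fix ideas; the random generator, boundary clipping, and function names here are illustrative choices, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def coa_exploration(X, fit, f, lb, ub):
    """One COA exploration step (Equations (9)-(12)) for minimizing f; X has shape (N, m)."""
    N, m = X.shape
    best = X[np.argmin(fit)].copy()        # the best individual plays the iguana
    half = N // 2
    for i in range(half):                  # tree-climbing coatis, Eq. (9)
        I = rng.integers(1, 3)             # random integer from {1, 2}
        cand = np.clip(X[i] + rng.random(m) * (best - I * X[i]), lb, ub)
        fc = f(cand)
        if fc < fit[i]:                    # greedy acceptance, Eq. (12)
            X[i], fit[i] = cand, fc
    G = lb + rng.random(m) * (ub - lb)     # fallen iguana at a random spot, Eq. (10)
    fG = f(G)
    for i in range(half, N):               # ground coatis, Eq. (11)
        I = rng.integers(1, 3)
        if fG < fit[i]:
            cand = X[i] + rng.random(m) * (G - I * X[i])
        else:
            cand = X[i] + rng.random(m) * (X[i] - G)
        cand = np.clip(cand, lb, ub)
        fc = f(cand)
        if fc < fit[i]:
            X[i], fit[i] = cand, fc
    return X, fit
```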

3.3. Escape from Predators (Exploitation Phase)

As shown in Figure 3, the second phase of the COA simulates the natural behavior of coatis escaping from predators through mathematical modeling, demonstrating the algorithm’s local search capability. In this phase, coatis move toward safer nearby positions to evade potential threats.
To precisely model this behavior, a random position within each coati’s neighborhood is generated according to Equation (13), guiding the coati to migrate toward that position. This strategy not only enhances the algorithm’s fine-grained search capability within local regions but also significantly improves the solution accuracy and convergence efficiency.
$$X_i^{P2} : x_{i,j}^{P2} = x_{i,j} + (1 - 2r) \cdot \left( lb_j^{local} + r \cdot \left( ub_j^{local} - lb_j^{local} \right) \right), \quad lb_j^{local} = \frac{lb_j}{t}, \quad ub_j^{local} = \frac{ub_j}{t},$$
$$t = 1, 2, \ldots, T; \quad i = 1, 2, \ldots, N; \quad j = 1, 2, \ldots, m.$$
Here, $lb_j^{local}$ and $ub_j^{local}$ are the local lower and upper bounds of the j-th decision variable, respectively; $lb_j$ and $ub_j$ are the global lower and upper bounds of the j-th decision variable; t is the iteration counter; $X_i^{P2}$ is the new position of the i-th coati in the second phase; and $x_{i,j}^{P2}$ is its value in the j-th dimension. If the newly computed position improves the objective function value, it is accepted; otherwise, the coati remains at its original position. This condition is expressed by Equation (14):
$$X_i = \begin{cases} X_i^{P2}, & F_i^{P2} < F_i, \\ X_i, & \text{otherwise}, \end{cases}$$
where $F_i^{P2}$ is the fitness value of the new position and $F_i$ is the fitness value of the current position.
Through simulating the hunting and escape behaviors of coatis, the COA updates the coati positions in two phases, iterating continuously until the maximum number of iterations is reached. Upon completion, the algorithm outputs the best solution as the result. By integrating global exploration and local exploitation, the COA exhibits efficient optimization search capabilities, providing robust support for solving complex problems.
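The exploitation phase is equally compact. The sketch below implements Equations (13) and (14) under the same illustrative conventions as the exploration sketch (reusing its `rng` generator), with the local bounds shrinking as 1/t:

```python
# Assumes numpy as np and the rng generator from the exploration sketch.

def coa_exploitation(X, fit, f, lb, ub, t):
    """One COA exploitation step (Equations (13)-(14)): local moves within bounds
    that shrink as lb/t and ub/t, so the search tightens over the iterations."""
    N, m = X.shape
    lb_loc, ub_loc = lb / t, ub / t
    for i in range(N):
        r = rng.random(m)
        cand = np.clip(X[i] + (1 - 2 * r) * (lb_loc + r * (ub_loc - lb_loc)), lb, ub)
        fc = f(cand)
        if fc < fit[i]:                    # greedy acceptance, Eq. (14)
            X[i], fit[i] = cand, fc
    return X, fit
```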

4. Multi-Strategy Enhanced Coati Optimization Algorithm (MSECOA)

4.1. Good Point Set Hybrid Oppositional Learning Strategy

In optimization algorithms, the quality of the initial population significantly impacts algorithm performance. Traditional random initialization methods often produce unevenly distributed solutions, leading to local clustering and reduced global search capability. To address this, a population initialization method based on good point set sequences and a hybrid oppositional learning strategy is proposed in this paper. This approach combines the uniform distribution properties of good point sets with the diversity-enhancing mechanism of hybrid oppositional learning to generate an initial population that is both uniformly distributed and highly diverse, effectively resolving the issues of uneven distribution and local clustering inherent in traditional methods.

4.1.1. Good Point Set Sequence

To overcome the limitations of traditional random initialization, a good point set (GPS) is introduced as a foundational tool for population initialization in this study. The good point set is a low-discrepancy sequence generation method rooted in number theory, capable of producing uniformly distributed point sets in high-dimension spaces [23]. Unlike pseudo-random number generation methods, GPS leverages the properties of prime numbers and trigonometric functions to ensure that the generated points exhibit low discrepancy, thereby avoiding local clustering in the initial population.
Definition 3. 
Let $G_s$ denote the unit cube in the s-dimensional Euclidean space, and let $r \in G_s$. If the point set $p_n(k) = \left\{ \left( \{ k \cdot r_1^{(n)} \}, \ldots, \{ k \cdot r_s^{(n)} \} \right),\ 1 \le k \le n \right\}$ has a discrepancy satisfying $\phi(n) = C(r, \varepsilon) \cdot n^{-1+\varepsilon}$ (where $C(r, \varepsilon)$ is a constant dependent only on r and ε, and ε is any positive number), then $p_n(k)$ is called a good point set and r is termed a good point.
The specific construction method is as follows: set $\{ r_k = 2\cos(2\pi k / p),\ 1 \le k \le s \}$, where p is the smallest prime number satisfying $(p - 3)/2 \ge s$, and $\{ k \cdot r_i^{(n)} \}$ represents the fractional part of $k \cdot r_i^{(n)}$.
Specifically, initial solutions are generated using the good point set as shown in Equation (15):
$$x_{i,j} = lb_j + \left( \cos\left( \frac{2\pi \cdot i}{p_j} \right) \bmod 1 \right) \cdot \left( ub_j - lb_j \right),$$
where $p_j$ is the prime number corresponding to the j-th dimension and $[lb_j, ub_j]$ denotes the lower and upper bounds of the search space. This method ensures uniform coverage of the initial population across the search space while enhancing the algorithm’s global search capability.
Figure 4 compares the individual distributions generated by the good point set and random methods. The good point set yields a uniform distribution with broad coverage and no clustering, enabling the comprehensive traversal of the solution space, whereas the random method results in an uneven distribution prone to local optima. The stable construction of the good point set, independent of dimensionality, produces high-quality initial solution sets, significantly improving the algorithm’s global search capability and convergence efficiency.
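A sketch of Equation (15) follows; the trial-division prime generator is an illustrative helper, and taking `mod 1` of the cosine plays the role of the fractional-part operator in the construction above:

```python
import numpy as np

def good_point_set(N, dim, lb, ub):
    """Good point set initialization per Equation (15): point i, dimension j is
    lb_j + frac(cos(2*pi*i / p_j)) * (ub_j - lb_j) for a per-dimension prime p_j."""
    def first_primes(k):
        out, cand = [], 2
        while len(out) < k:                # trial division; fine at this scale
            if all(cand % p for p in out):
                out.append(cand)
            cand += 1
        return out
    p = np.array(first_primes(dim), dtype=float)
    i = np.arange(1, N + 1).reshape(-1, 1)
    frac = np.mod(np.cos(2.0 * np.pi * i / p), 1.0)   # 'mod 1' = fractional part
    return lb + frac * (ub - lb)
```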

4.1.2. Hybrid Oppositional Learning Strategy

In complex optimization problems, although the good point set generates a uniformly distributed initial population, relying solely on uniform distribution may not suffice to fully explore the entire search space. To address this, a hybrid oppositional learning strategy [24] is introduced in this study, dynamically selecting oppositional learning methods via a random switching mechanism to effectively enhance population diversity and search capability, as illustrated in Figure 5.
The hybrid oppositional learning strategy encompasses two mechanisms: lens oppositional learning and quasi-oppositional learning. Lens oppositional learning generates an opposite solution to the current solution through symmetric mapping, as expressed by
$$X_{opp} = lb + ub - X,$$
where X is the current solution, $X_{opp}$ is its opposite solution, and lb and ub are the lower and upper bounds of the search space, respectively. The advantage of lens oppositional learning lies in its ability to rapidly generate solutions symmetric to the current one, thereby expanding the search range and enhancing global exploration.
Quasi-oppositional learning introduces random perturbations between the current solution and its opposite to generate new solutions, as given by
$$X_{qopp} = X + \alpha \cdot \left( X_{opp} - X \right),$$
where α is a random perturbation factor, typically a random number in $[0, 1]$. The strength of quasi-oppositional learning is its capacity to produce solutions between the current and opposite solutions, enhancing local search capability and improving convergence efficiency.
To leverage the strengths of both methods, a hybrid strategy is further proposed. This strategy employs a random switching probability p to dynamically select between lens oppositional learning and quasi-oppositional learning for generating new solutions. Specifically, a random number r [ 0 , 1 ] is generated, and the choice of oppositional learning method is determined based on the relationship between r and p:
$$X_{new} = \begin{cases} lb + ub - X, & \text{if } r < p, \\ X + \alpha \cdot \left( lb + ub - 2X \right), & \text{otherwise}, \end{cases}$$
where $\alpha \sim U(0, 1)$ is a perturbation factor randomly drawn from a uniform distribution. When $r < p$, lens oppositional learning is used to generate the opposite solution; when $r \ge p$, quasi-oppositional learning is applied to produce a quasi-opposite solution. This probability-based switching strategy balances global exploration and local exploitation, effectively preventing the algorithm from being trapped in local optima. By dynamically adjusting the search direction, lens oppositional learning enhances global exploration, while quasi-oppositional learning boosts local exploitation efficiency, enabling the algorithm to better balance exploration and exploitation in complex multimodal optimization problems and avoid premature convergence.
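The switching rule of Equation (18) vectorizes naturally over a population, as in this sketch (the default p = 0.5 is an assumed setting, and `rng` is the generator from the COA sketches):

```python
# Assumes numpy as np and the rng generator from the COA sketches.

def hybrid_opposition(X, lb, ub, p=0.5):
    """Hybrid oppositional learning, Equation (18): per individual, pick the lens
    opposite lb + ub - X with probability p, else a quasi-opposite point."""
    N = X.shape[0]
    r = rng.random((N, 1))                 # switching variable
    alpha = rng.random((N, 1))             # perturbation factor, alpha ~ U(0, 1)
    lens = lb + ub - X
    quasi = X + alpha * (lb + ub - 2 * X)
    return np.where(r < p, lens, quasi)
```

A common way to use such an opposed population, consistent with the initialization described here, is to evaluate both X and the returned points and retain the fitter individual of each pair.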

4.2. Fitness–Distance Balance Strategy

In the COA, positions are updated through random subjects during the hunting and attacking phases, with optimization performance relying on balancing exploration and exploitation. During optimization, the algorithm must maintain global search capability to identify promising regions in the solution space while performing refined searches in the neighborhoods of high-quality solutions to enhance convergence accuracy. The fitness–distance balance (FDB) strategy is a novel selection approach designed to choose candidate solutions that contribute most to the search process [25]. Unlike traditional selection methods, FDB considers not only the fitness value of a candidate solution but also its distance from the optimal solution to compute a score. This dual criterion ensures that the candidate with the highest score is selected, efficiently guiding the population search while avoiding individuals too close to the optimal solution, thus preventing entrapment in local optima. The mathematical model of the FDB strategy is as follows:
Let the current population be $P = \{X_1, X_2, \ldots, X_N\}$, with the best individual denoted as $X_{best}$. The FDB score $S_i$ for the other individuals $X_i$ is calculated as shown in Equation (19):
$$S_i = \omega^* \cdot \frac{f(X_i) - f_{\min}}{f_{\max} - f_{\min}} + \left( 1 - \omega^* \right) \cdot \frac{D_{\max} - D(X_i, X_{best})}{D_{\max} - D_{\min}},$$
where $f(X_i)$ is the fitness value of the individual $X_i$, $f_{\min}$ and $f_{\max}$ are the minimum and maximum fitness values in the current population, $D(X_i, X_{best})$ is the Euclidean distance between $X_i$ and $X_{best}$, $D_{\min}$ and $D_{\max}$ are the minimum and maximum distances in the current population, and $\omega^*$ is a weight within $[0, 1]$ that balances fitness and distance.
By weighting these two criteria, the candidate solution with the highest score is selected to guide the algorithm’s search. In this study, the FDB strategy is applied to the exploration phase of the COA, modifying the selection strategy for the $G_j$ individuals across both phases to use FDB, thereby selecting the most representative individuals for position updates.
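Equation (19) transcribes directly into code; the small epsilon guard against a degenerate population is an implementation detail, not part of the paper's formulation:

```python
import numpy as np

def fdb_scores(X, fit, w=0.5):
    """Fitness-distance balance scores from Equation (19); the individual with the
    highest score is chosen to guide the position updates."""
    eps = 1e-12                            # guard against division by zero
    best = X[np.argmin(fit)]
    D = np.linalg.norm(X - best, axis=1)
    f_norm = (fit - fit.min()) / (fit.max() - fit.min() + eps)
    d_norm = (D.max() - D) / (D.max() - D.min() + eps)
    return w * f_norm + (1.0 - w) * d_norm

# Usage: guide = X[np.argmax(fdb_scores(X, fit))]
```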

4.3. Dynamic Spiral Search

In the exploitation phase of the COA, the search space gradually narrows as iterations progress, weakening the ability to explore unknown regions. To address this, a dynamic spiral search method inspired by the spiral search mechanism of the whale optimization algorithm (WOA) [26] is introduced. This method dynamically adjusts the spiral shape parameter based on the iteration count, enhancing global search capability and optimization accuracy. Originally derived from humpback whale predation behavior, the spiral search mechanism narrows encirclement via a spiral path to approach prey. In traditional spiral update models, the spiral shape parameter is constant, accelerating late-stage convergence but often resulting in a singular search path and local optima entrapment. By introducing an iteration parameter to dynamically adjust the spiral shape, this issue is effectively mitigated, with the update formula given by Equation (20):
$$z = e^{k \cdot \cos\left( \pi \cdot \frac{t}{T} \right)}, \qquad X(t+1) = e^{zl} \cdot \cos(2\pi l) \cdot r \cdot \left( X_{best}(t) - rand \cdot X_{best}(t) \right),$$
where k is a constant defining the logarithmic spiral shape, l is a random number in $[-1, 1]$, t is the current iteration, T is the maximum iteration count, and r is the sensitivity range of each coati.
The position update for coatis in the exploitation phase is thus defined by Equation (21):
$$X_i^{P2} : x_{i,j}^{P2} = \begin{cases} e^{zl} \cdot \cos(2\pi l) \cdot r \cdot \left( X_{best}(t) - rand \cdot X_{best}(t) \right), & \text{if } F_i^{P1} \ge \mathrm{mean}(F), \\ x_{i,j} + (1 - 2r) \cdot \left( lb_j^{local} + r \cdot \left( ub_j^{local} - lb_j^{local} \right) \right), & \text{otherwise}. \end{cases}$$
Thus, in early iterations, the COA employs a larger spiral shape for the global search to accelerate convergence; in later iterations, the spiral shape is gradually reduced as the iteration count increases, enabling refined searches to enhance optimization accuracy.
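A sketch of one spiral move (Equation (20), i.e., the first branch of Equation (21)); k = 1.0 is a placeholder for the spiral-shape constant, and `rng` is the generator from the COA sketches:

```python
# Assumes numpy as np and the rng generator from the COA sketches.

def spiral_move(x_best, t, T, k=1.0):
    """Dynamic spiral step (Equation (20)): z = exp(k*cos(pi*t/T)) is large early
    (wide spirals, global search) and decays toward exp(-k) late (fine search)."""
    m = x_best.shape[0]
    z = np.exp(k * np.cos(np.pi * t / T))
    l = rng.uniform(-1.0, 1.0)             # random number in [-1, 1]
    r = rng.random(m)                      # sensitivity range
    return np.exp(z * l) * np.cos(2.0 * np.pi * l) * r * (x_best - rng.random(m) * x_best)
```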

4.4. Integration of Self-Adaptive Differential Evolution Algorithm

The differential evolution algorithm (DE) optimizes population individuals through mutation, crossover, and selection operations, using differential vectors to generate new individuals and improve search efficiency. In the MSECOA, the differential mutation mechanism of DE is incorporated into the COA, leveraging differential vector generation to enhance solution space exploration and avoid local optima. Additionally, the self-adaptive differential evolution algorithm (SADE) [27] is integrated, introducing an adaptive scaling factor and linearly decreasing crossover probability to dynamically adjust exploration and exploitation capabilities. This enhances the global search in early stages and local precision in later stages, further improving algorithm performance. The mathematical models for each step are described below:
The mutation operation generates a step size by randomly selecting two individuals from the population and using their difference vector, as shown in Equation (22):
$$y_{i,j} = x_{i,j} + F \cdot \left( x_{R_1,j} - x_{R_2,j} \right), \quad i = 1, 2, 3, \ldots, N,$$
where $X_i$ is the i-th individual in the population, $x_{R_1}$ and $x_{R_2}$ are two distinct individuals randomly chosen from the population, j is the j-th dimension of the solution vector, and F is the scaling factor controlling the mutation step size.
The crossover operation generates a trial individual by combining information from target and mutated individuals, as expressed in Equation (23):
$$Z_i : z_{i,j} = \begin{cases} y_{i,j}, & r < CR \ \text{or} \ j = j_r, \\ x_{i,j}, & \text{otherwise}, \end{cases}$$
where CR is the crossover probability, r is a random number, and $j_r$ is a randomly selected dimension index.
The selection operation determines whether the target individual is updated by comparing the fitness values of trial and target individuals, as given by Equation (24):
$$X_i = \begin{cases} Z_i, & F(Z_i) < F(X_i), \\ X_i, & \text{otherwise}, \end{cases}$$
where $F(Z_i)$ and $F(X_i)$ are the fitness values of the trial and target individuals, respectively.
SADE further enhances performance by adaptively adjusting the scaling factor F and crossover probability C R , as implemented in Equation (25):
$$F = 0.5 \times \left( 1 + rand() \right), \qquad CR = 0.9 - (0.9 - 0.1) \times \frac{t}{T},$$
where $rand()$ generates a random number in $[0, 1]$, t is the current iteration, and T is the maximum iteration count.
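One SADE pass over the population (Equations (22)–(25)) can be sketched as follows; the boundary clipping and the greedy replacement order are illustrative choices:

```python
# Assumes numpy as np and the rng generator from the COA sketches.

def sade_step(X, fit, f, lb, ub, t, T):
    """One adaptive DE pass (Equations (22)-(25)): difference-vector mutation,
    binomial crossover with linearly decreasing CR, and greedy selection."""
    N, m = X.shape
    CR = 0.9 - (0.9 - 0.1) * t / T         # Eq. (25): exploration -> exploitation
    for i in range(N):
        F = 0.5 * (1.0 + rng.random())     # Eq. (25): adaptive scaling factor
        r1, r2 = rng.choice([k for k in range(N) if k != i], size=2, replace=False)
        y = X[i] + F * (X[r1] - X[r2])     # Eq. (22): mutation
        jr = rng.integers(m)
        mask = rng.random(m) < CR
        mask[jr] = True                    # Eq. (23): keep at least one mutated dimension
        z = np.clip(np.where(mask, y, X[i]), lb, ub)
        fz = f(z)
        if fz < fit[i]:                    # Eq. (24): greedy selection
            X[i], fit[i] = z, fz
    return X, fit
```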

4.5. Flowchart and Pseudocode of MSECOA

The pseudocode of the MSECOA is provided in Algorithm 1, while the process of the algorithm is described in Figure 6.
Algorithm 1 MSECOA: Multi-Strategy Enhanced Coati Optimization Algorithm
Require: Population size N, dimensionality of the problem dim, maximum number of iterations T.
Ensure: Best objective function value f_best, best solution position X_best, convergence curve COA_curve.
 1: Initialize the population using the good point set and opposition-based learning
 2: for i = 1 to N do
 3:     X(i,:) ← initializePopulation(), fit(i) ← fitness(X(i,:))
 4: end for
 5: for t = 1 to T do
 6:     Update the global best solution: X_best, f_best
 7:     Exploration phase: use the fitness–distance balance to guide the hunting and attacking strategy
 8:     for i = 1 to N/2 do
 9:         Calculate the new position of the i-th coati using Equation (19).
10:         Update the position of the i-th coati using Equation (12).
11:     end for
12:     for i = N/2 + 1 to N do
13:         Calculate the new position of the i-th coati using Equation (19).
14:         Update the position of the i-th coati using Equation (12).
15:     end for
16:     Exploitation phase: use the dynamic spiral search
17:     for i = 1 to N do
18:         Calculate the new position of the i-th coati using Equation (21).
19:         Update the position of the i-th coati using Equation (14).
20:     end for
21:     Adaptive differential evolution: apply mutation, crossover, and selection
22:     for i = 1 to N do
23:         Calculate the new position of the i-th coati using Equations (22)–(25).
24:         Update the position of the i-th coati using Equation (14).
25:     end for
26:     Record the best-so-far fitness: COA_curve(t) ← f_best
27: end for
28: return f_best, X_best

4.6. Time Complexity Analysis

Time complexity serves as a critical metric for evaluating algorithm performance, primarily determined by three core processes: population initialization, fitness evaluation, and position updating. The original COA exhibits a time complexity of $O(N \times D \times T)$, where N, D, and T represent the population size, problem dimensionality, and maximum iterations, respectively. The MSECOA’s complexity analysis reveals the following:
  • Good Point Set with Opposition-Based Learning: Initialization requires $O(N \times D)$ operations, while the opposition-based component adds another $O(N \times D)$ computation. This combined strategy maintains an initialization complexity of $O(N \times D)$ without substantial overhead.
  • Fitness–Distance Balance Strategy: Each iteration computes distances between individuals and the global best ( O ( N × D ) ) and performs selection operations. This single traversal operation preserves overall O ( N × D × T ) complexity.
  • Dynamic Spiral Search: Exploitation-phase spiral search operates within O ( N × D ) complexity. As it merely modifies the position update equations without additional loops, it imposes a negligible computational burden.
  • Adaptive Differential Evolution: Mutation–crossover–selection operations in each iteration require O ( N × D ) computations. Executing in parallel with the COA’s native updates, they maintain original O ( N × D × T ) complexity.
Notably, the MSECOA retains base O ( N × D × T ) complexity while achieving superior optimization performance through these strategic enhancements.

5. Experimental Validation and Result Analysis

5.1. Experimental Design and Parameter Settings

The simulation test environment for this section’s experiments is based on the Windows 10 operating system, with hardware consisting of an AMD Ryzen 7 4800H processor running at a base frequency of 2.90 GHz. All algorithm implementations and tests were conducted on the MATLAB 2022b platform.
To validate the effectiveness of the MSECOA, performance tests were conducted using a total of 41 test functions from IEEE CEC2017 [28] and CEC2022 [29]. In the experiments, the MSECOA was compared against nine algorithms: the basic COA; the improved COA (ICOA [30]); popular optimization algorithms and their variants (the whale optimization algorithm (WOA) [26], the Arithmetic Optimization Algorithm (AOA) [31], Nonlinear Chaotic Harris Hawks Optimization (NCHHO) [32], and Improved Sand Cat Swarm Optimization (ISCSO) [33]); newly proposed algorithms from 2024 (the Black-winged Kite Algorithm (BKA) [34] and the hiking optimization algorithm (HOA) [35]); and a top-performing CEC algorithm (LSHADE_cnEpSin [36]). For the experimental setup, the maximum number of iterations for each algorithm was set to 1000, the population size was uniformly fixed at 30, and each algorithm was independently run 30 times.
Furthermore, the mean (Mean) and standard deviation (Std) were selected as the evaluation metrics, and the experimental results were subjected to the Friedman test [37] and Wilcoxon rank-sum test to verify the MSECOA’s effectiveness. The Wilcoxon rank-sum test results were derived from p-values obtained under a significance level of α = 0.05. In this testing framework, “+” indicates that the MSECOA outperforms the compared algorithm in terms of performance, “=” signifies no significant difference between the MSECOA and the compared algorithm, and “−” denotes that the compared algorithm surpasses the MSECOA in performance. The parameter settings for these algorithms are detailed in Table 1.
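For reference, the “+/=/−” verdicts can be reproduced with SciPy's rank-sum test; a minimal sketch of the protocol, assuming each input is the vector of 30 independent final errors for one benchmark function:

```python
import numpy as np
from scipy.stats import ranksums

def verdict(msecoa_runs, rival_runs, alpha=0.05):
    """Wilcoxon rank-sum verdict for one benchmark function:
    '+'  MSECOA significantly better (lower error),
    '='  no significant difference at level alpha,
    '-'  rival significantly better."""
    _, p = ranksums(msecoa_runs, rival_runs)
    if p >= alpha:
        return '='
    return '+' if np.mean(msecoa_runs) < np.mean(rival_runs) else '-'
```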

5.2. Ablation Study of MSECOA

To rigorously evaluate the contribution of each enhancement strategy in the MSECOA, we conducted a systematic ablation study. The experiments were carried out on the IEEE CEC2017 benchmark suite, which comprises 29 test functions categorized into four types: unimodal (F1–F3), multimodal (F4–F10), hybrid (F11–F20), and composite functions (F21–F30). A detailed description of these functions, including their theoretical optimal values, is provided in Table 2.
To assess the individual effectiveness of each enhancement strategy, we compared the MSECOA with four variant algorithms, each incorporating only one of the proposed strategies:
  • GCOA: COA with a good point set hybrid oppositional learning strategy.
  • FCOA: COA with a fitness–distance balance strategy.
  • DCOA: COA with a dynamic spiral search strategy.
  • SCOA: COA with a self-adaptive differential evolution mechanism.
The performance comparison is illustrated in Figure 7, which includes a radar chart and an average rank chart. The results clearly demonstrate that the MSECOA achieves the best overall performance, with the lowest average rank of 1.34, significantly outperforming all individual variants. In contrast, the FCOA exhibits the highest average rank of 4.69, indicating limited optimization capability when applied in isolation. The GCOA, DCOA, and SCOA achieve average ranks of 4.41, 3.41, and 1.66, respectively, reflecting their moderate but distinct contributions to algorithmic improvement.
These findings confirm that the integration of all four enhancement strategies—the good point set hybrid oppositional learning approach, fitness–distance balance mechanism, dynamic spiral search technique, and self-adaptive differential evolution algorithm—results in a synergistic effect. This combination significantly enhances the algorithm’s convergence speed, solution accuracy, and overall robustness.

5.3. Comparative Analysis of Algorithms on CEC2017 Benchmark

5.3.1. Experimental Results and Analysis of CEC2017 Benchmark in Dimension 10

The performance of the different algorithms was evaluated using metrics such as mean and standard deviation, with detailed results presented in Table 3. To further validate the statistical significance of the performance differences between the algorithms, the Wilcoxon rank-sum test was conducted, and the results are provided in Table 4.
From Table 3, it is evident that the MSECOA performs exceptionally well on 17 benchmark test functions (F1, F3–F5, F7, F8, F11–F16, F18, F19, F23, F26, and F29), with both the mean and standard deviation of its optimal values surpassing those of the other nine compared algorithms. For functions F6, F9, F17, F20, F22, and F25, the mean optimal value of the MSECOA is second only to the best-performing algorithm; for F10, F24, F27, and F30, the MSECOA achieves the best mean value, though its standard deviation is not the smallest. According to the Friedman test results, the MSECOA ranks first, significantly outperforming the other algorithms, with LSHADE_cnEpSin and the BKA ranking second and third, respectively, while the COA ranks ninth. The experimental results demonstrate that the MSECOA exhibits strong optimization capability on the CEC2017 test suite (dimension 10), particularly excelling in unimodal, multimodal, hybrid, and composition functions. Among the 29 test functions, the MSECOA achieves the best or near-best results on most, with particularly notable performance on high-dimension complex functions (e.g., F12, F18, F30). Although it slightly underperforms LSHADE_cnEpSin on some composition functions (e.g., F27, F28, F29), its overall performance remains significantly superior, with a Friedman rank of 1.207, securing the top position.
Based on the Wilcoxon rank-sum test results in Table 4, the MSECOA demonstrates a significant performance advantage on the CEC2017 test suite (dimension 10). Across the 29 benchmark test functions, the MSECOA significantly outperforms the ICOA, the COA, the WOA, the AOA, the HOA, NCHHO, and ISCSO on all 29 functions (p < 0.05), particularly excelling in unimodal functions (e.g., F1, F3–F5), multimodal functions (e.g., F7, F8, F11–F16), and high-dimension complex functions (e.g., F12, F18, F30). Compared to LSHADE_cnEpSin, the MSECOA is significantly superior on 23 functions, performs comparably on 4 functions (p ≥ 0.05), and is slightly inferior on 2 functions (e.g., F9 and F23). Additionally, the MSECOA outperforms the BKA significantly on 22 functions, further confirming its robustness and stability. In summary, the MSECOA exhibits strong adaptability and generalization across diverse optimization problems, proving to be an efficient and stable algorithm, particularly suited for high-dimension complex optimization challenges.
As shown in Figure 8, this paper presents the iterative convergence curves of different algorithms on 15 benchmark functions from the CEC2017 test suite. The experimental results indicate that the MSECOA significantly outperforms the other nine compared algorithms in terms of global search capability, convergence speed, and convergence accuracy when solving these functions. Specifically, the MSECOA demonstrates strong global exploration ability in early iterations, rapidly identifying potential optimal solution regions, while in later iterations, its local exploitation capability is fully exhibited, enabling convergence to the global optimum with high precision.

5.3.2. Experimental Results and Analysis of CEC2017 Benchmark in Dimension 30

When the dimension is set to 30, the performance of each algorithm was evaluated based on the mean and standard deviation of its optimization results, with specific data presented in Table 5. Additionally, to further verify the statistical significance of the performance differences between the algorithms, the Wilcoxon rank-sum test results are provided in Table 6.
From the experimental results in Table 5, it is evident that the MSECOA exhibits a significant performance advantage on the CEC2017 test suite (dimension 30). Specifically, the MSECOA achieves the best mean values on 22 out of the 29 benchmark test functions, with particularly outstanding performance on high-dimension complex functions. Furthermore, the standard deviation of the MSECOA is significantly lower than that of the other algorithms on most functions, indicating high stability and robustness in its optimization outcomes. According to the Friedman test results, the MSECOA achieves a rank of 1.103, markedly outperforming the other compared algorithms, further confirming its superiority in high-dimension optimization problems. LSHADE_cnEpSin and the BKA rank second and third, respectively, showing relatively strong performance, though they fall short of the MSECOA on complex functions. In contrast, algorithms such as the ICOA, COA, WOA, and AOA perform poorly, especially on high-dimension complex functions.
Based on the Wilcoxon rank-sum test results in Table 6, the MSECOA demonstrates a significant performance advantage on the CEC2017 test suite (dimension 30). Across the 29 test functions, the MSECOA significantly outperforms the ICOA, the COA, the WOA, the AOA, the HOA, NCHHO, and ISCSO (p < 0.05), with particularly notable optimization capabilities on unimodal, multimodal, and high-dimension complex functions. Compared to LSHADE_cnEpSin, the MSECOA exhibits a significant advantage on 26 functions, performing comparably or slightly worse on a few composition functions. Additionally, the MSECOA significantly surpasses the BKA on 25 functions, further evidencing its exceptional robustness and stability. These results collectively demonstrate that the MSECOA possesses strong competitiveness across various types of optimization problems.
As shown in Figure 9, the MSECOA exhibits excellent performance in convergence speed, accuracy, and global search capability. In functions such as F1, F3, and F19, the MSECOA rapidly approaches the optimal solution, with the fitness values significantly lower than those of other algorithms. In multimodal functions (e.g., F7, F9), the MSECOA effectively avoids local optima, demonstrating strong global search capability. In contrast, algorithms such as the ICOA and COA show a clear gap in convergence speed and accuracy compared to the MSECOA.

5.4. Comparative Analysis of Algorithms on CEC2022 Benchmark

To further validate the performance of the MSECOA, the CEC2022 test suite was employed for experimental evaluation in this study. The IEEE CEC2022 test suite comprises four types of functions—unimodal (F1), basic (F2–F5), hybrid (F6–F8), and composition (F9–F12)—capable of comprehensively assessing algorithm performance across diverse optimization problems. The names and theoretical optimal values of each function are detailed in Table 7. Through experiments with this test suite, the adaptability and robustness of the MSECOA in handling varied optimization challenges can be more thoroughly evaluated.

Experimental Results and Analysis of CEC2022 Benchmark in Dimension 20

When the dimension is set to 20, metrics such as the mean and standard deviation of the experimental results are presented in Table 8, used to evaluate the performance of each algorithm on the CEC2022 test suite. Additionally, Table 9 provides the Wilcoxon rank-sum test results to further verify the statistical significance of the performance differences between algorithms, enabling a comprehensive analysis of their performance in high-dimension optimization problems.
From the experimental results in Table 8, it is evident that the MSECOA demonstrates a significant performance advantage on the CEC2022 test suite (dimension 20). Specifically, the MSECOA achieves the best mean values on multiple functions out of the 12 test functions. Furthermore, the standard deviation of the MSECOA is significantly lower than that of the other algorithms on most functions, indicating high stability and robustness in its optimization outcomes. According to the Friedman test results, the MSECOA achieves a rank of 1.500, significantly outperforming the other compared algorithms, further confirming its superiority across diverse optimization problems. LSHADE_cnEpSin and the BKA rank second and third, respectively, exhibiting relatively strong performance, though they fall short of the MSECOA on complex functions. Other algorithms perform adequately on some functions but generally lag behind the MSECOA.
The Wilcoxon test results in Table 9 show that the MSECOA performs exceptionally on the CEC2022 suite (dimension 20). Across all 12 test functions, the MSECOA significantly outperforms the ICOA, the COA, the WOA, the AOA, the HOA, NCHHO, and ISCSO (p < 0.05). Compared to the BKA, the MSECOA is significantly superior on 10 functions, with equal performance on F2 and F4. Against LSHADE_cnEpSin, the MSECOA excels on seven functions, matches on two (F2 and F9), and is slightly inferior on one (F7). Overall, the MSECOA exhibits stronger optimization capability in 20-dimension problems. Additionally, Figure 10 illustrates the convergence curves of the different algorithms on the CEC2022 benchmark functions.
The experimental results of various algorithms across different test sets are presented in Table 10.
As indicated by Table 10, the MSECOA exhibits superior stability and optimization precision across multi-dimension optimization problems. Its Friedman value and final ranking are notably better than those of other algorithms, reflecting an effective balance between search accuracy, computational efficiency, and stability. In contrast, algorithms such as the ICOA, COA, and WOA demonstrate poorer performance in high-dimension problems, revealing evident performance bottlenecks. Although the BKA and LSHADE_cnEpSin maintain consistent performance, their optimization effectiveness is limited in highly challenging test sets. With its robust performance, the MSECOA achieves a cumulative ranking of one, underscoring its efficiency and reliability in tackling complex optimization problems.

6. MSECOA for Engineering Constrained Design Problems

To evaluate the optimization performance of the MSECOA in addressing real-world engineering challenges, four classical engineering design problems—namely, three-bar truss design [38], welded beam design [39], compression spring design [40], and pressure vessel design [41]—are selected for testing. These problems vary in solution complexity and constraint conditions, providing a comprehensive assessment of the algorithm’s adaptability and robustness in intricate engineering scenarios. The experimental setup is configured with a population size of 30 and a maximum of 1000 iterations, while a penalty coefficient of $10^{10}$ is applied to handle constraints. To ensure the reliability of results, each algorithm is independently executed 30 times, with the best, worst, mean, and standard deviation of the objective function values recorded.

6.1. Three-Bar Truss Design

The objective of the three-bar truss design (TBT) problem [38] is to minimize the volume of the truss by optimizing the cross-sectional areas. A schematic of the three-bar truss design problem is illustrated in Figure 11.
The mathematical model for the TBT problem is presented in Equation (26).
$$\begin{aligned} \min f(x) &= a \cdot \left( 2\sqrt{2}\, x_1 + x_2 \right) \\ \text{s.t.} \quad g_1 &= \frac{\sqrt{2}\, x_1 + x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2} P - \sigma \le 0, \\ g_2 &= \frac{x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2} P - \sigma \le 0, \\ g_3 &= \frac{1}{\sqrt{2}\, x_2 + x_1} P - \sigma \le 0, \\ & a = 100\ \text{cm}, \quad P = 2\ \text{kN/cm}^2, \quad \sigma = 2\ \text{kN/cm}^2, \quad 0 \le x_1, x_2 \le 1. \end{aligned}$$
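Under the static penalty scheme described in the experimental setup (coefficient $10^{10}$), the TBT model of Equation (26) becomes an unconstrained objective that any of the compared metaheuristics can minimize directly; a sketch, with the constants taken from Equation (26):

```python
import numpy as np

def tbt_penalized(x, coeff=1e10):
    """Three-bar truss volume (Equation (26)) plus a static penalty for each
    violated constraint; feasible points incur no penalty."""
    x1, x2 = x
    a, P, sigma = 100.0, 2.0, 2.0
    volume = a * (2.0 * np.sqrt(2.0) * x1 + x2)
    g = (
        (np.sqrt(2.0) * x1 + x2) / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P - sigma,
        x2 / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P - sigma,
        1.0 / (np.sqrt(2.0) * x2 + x1) * P - sigma,
    )
    return volume + coeff * sum(max(0.0, gi) for gi in g)

# Reported optimum for reference: x = (0.7887, 0.4082) gives f(x) ~ 263.8958.
```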
The statistical results, including the optimal parameters and objective function values achieved by each algorithm for the three-bar truss design problem, are summarized in Table 11.
As evidenced in Table 11, the MSECOA demonstrates exceptional performance in the three-bar truss design problem. The best, mean, and standard deviation of the objective function values are recorded as 263.8958, 263.8958, and $2.98556 \times 10^{-14}$, respectively, outperforming all competing algorithms and securing the top rank. Specifically, the optimal parameter combination obtained by the MSECOA is $\{x_1 = 0.7887, x_2 = 0.4082\}$, yielding a minimum objective function value that aligns with the theoretical optimum.

6.2. Welded Beam Design

The objective of the welded beam design (WBD) problem [39] is to minimize the manufacturing cost of a welded beam by optimizing the beam length (l), height (t), thickness (b), and weld thickness (h) while satisfying constraints such as shear stress ($\tau$), bending stress ($\theta$), the buckling load of the beam ($P_c$), end deflection ($\delta$), and boundary conditions. A schematic of the WBD problem is presented in Figure 12.
The mathematical model of the WBD problem is given by Equation (27).
$$\begin{aligned} \min f(x) &= 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 \left( 14 + x_2 \right), \\ x &= [x_1, x_2, x_3, x_4] = [h, l, t, b], \\ \text{s.t.} \quad f_1(x) &= \tau(x) - \tau_{\max} \le 0, \\ f_2(x) &= \sigma(x) - \sigma_{\max} \le 0, \\ f_3(x) &= x_1 - x_4 \le 0, \\ f_4(x) &= 0.125 - x_1 \le 0, \\ f_5(x) &= 0.10472 x_1^2 + 0.04821 x_3 x_4 \left( 14 + x_2 \right) - 5 \le 0, \\ f_6(x) &= \delta(x) - \delta_{\max} \le 0, \\ f_7(x) &= P - P_c(x) \le 0, \end{aligned}$$
with
$$\tau(x) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \frac{x_2}{2R} + (\tau'')^2}, \quad \tau' = \frac{P}{\sqrt{2}\, x_1 x_2}, \quad \tau'' = \frac{M R}{J}, \quad M = P \left( L + \frac{x_2}{2} \right),$$
$$R = \sqrt{\frac{x_2^2}{4} + \left( \frac{x_1 + x_3}{2} \right)^2}, \quad J = 2 \left\{ \sqrt{2}\, x_1 x_2 \left[ \frac{x_2^2}{12} + \left( \frac{x_1 + x_3}{2} \right)^2 \right] \right\},$$
$$\sigma(x) = \frac{6 P L}{x_4 x_3^2}, \quad \delta(x) = \frac{6 P L^3}{E x_4 x_3^2}, \quad P_c(x) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2} \left( 1 - \frac{x_3}{2L} \sqrt{\frac{E}{4G}} \right),$$
$$\tau_{\max} = 13600\ \text{psi}, \quad \sigma_{\max} = 30000\ \text{psi}, \quad \delta_{\max} = 0.25\ \text{in}, \quad P = 6000\ \text{lb}, \quad E = 30 \times 10^6\ \text{psi}, \quad G = 12 \times 10^6\ \text{psi}, \quad L = 14.$$
The statistical results, including the optimal parameters and objective function values obtained by various algorithms, are summarized in Table 12.
It can be observed from Table 12 that the MSECOA significantly outperforms the other comparative algorithms in addressing the WBD problem. The best objective function value, mean, and standard deviation achieved by the MSECOA are reported as 1.6928, 1.6928, and 0.0001, respectively, surpassing the performance of all other algorithms. Specifically, the optimal solution derived by the MSECOA is $x = \{0.2057, 3.2349, 9.0366, 0.2057\}$, yielding a minimum objective function value of 1.6928, which is notably lower than the results of the other methods. In contrast, algorithms such as the ICOA, COA, and WOA exhibit inferior performance, with their best objective function values recorded as 1.9303, 1.8437, and 1.7236, respectively—substantially higher than that of the MSECOA. Additionally, although the LSHADE_cnEpSin algorithm demonstrates performance relatively close to that of the MSECOA, its best objective function value and standard deviation remain less competitive than those achieved by the MSECOA.

6.3. Tension/Compression Spring Design

The tension/compression spring design (TCSD) problem [40] is a classic case in engineering optimization, aimed at minimizing the weight of a spring by optimizing its geometric parameters—namely, the wire diameter d ( x 1 ) , mean coil diameter D ( x 2 ) , and number of active coils N ( x 3 ) —while adhering to constraints such as minimum deflection g 1 , shear stress g 2 , oscillation frequency g 3 , and outer diameter limit g 4 . A schematic of the TCSD problem is illustrated in Figure 13.
The mathematical model for the TCSD problem is expressed in Equation (28).
$$
\begin{aligned}
\min\; & f(x) = (x_3 + 2)\,x_1^2 x_2, \qquad x = [x_1, x_2, x_3] = [d, D, N] \\
\text{s.t.}\; & g_1(x) = 1 - \frac{x_2^3 x_3}{71785\,x_1^4} \le 0, \\
& g_2(x) = \frac{4x_2^2 - x_1 x_2}{12566\,(x_1^3 x_2 - x_1^4)} + \frac{1}{5108\,x_1^2} - 1 \le 0, \\
& g_3(x) = 1 - \frac{140.45\,x_1}{x_2^2 x_3} \le 0, \qquad g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0, \\
& 0.05 \le x_1 \le 2, \quad 0.25 \le x_2 \le 1.3, \quad 2 \le x_3 \le 15.
\end{aligned}
$$
The statistical results, including the optimal parameters and objective function values obtained by various algorithms—such as the best value, mean, standard deviation, and ranking—are presented in Table 13.
Table 13 lists the optimal parameters and statistical results of the objective function for each algorithm on the tension/compression spring design problem. From the table, it is evident that the MSECOA significantly outperforms the other compared algorithms: its best value, mean, and standard deviation are 0.012665, 0.012665, and 7.8035 × 10^{-8}, respectively, all superior to those of the other algorithms, demonstrating its efficiency and stability in solving complex engineering problems. Specifically, the optimal solution obtained by the MSECOA is x = {0.0517, 0.3567, 11.2892}, corresponding to a minimum objective function value of 0.012665, which is notably lower than the results of the other algorithms.
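The same kind of spot check applies here. A minimal sketch of Equation (28), with illustrative function names, evaluated at the reported optimum:

```python
def tcsd_objective(d, D, N):
    # f(x) = (N + 2) * D * d^2, per Equation (28)
    return (N + 2.0) * D * d**2

def tcsd_constraints(d, D, N):
    # A constraint g_i is satisfied when g_i <= 0
    g1 = 1.0 - (D**3 * N) / (71785.0 * d**4)
    g2 = (4.0 * D**2 - d * D) / (12566.0 * (d**3 * D - d**4)) + 1.0 / (5108.0 * d**2) - 1.0
    g3 = 1.0 - 140.45 * d / (D**2 * N)
    g4 = (d + D) / 1.5 - 1.0
    return g1, g2, g3, g4

x = (0.0517, 0.3567, 11.2892)    # optimum reported in Table 13
print(tcsd_objective(*x))        # ~0.01267, matching the tabulated best value
print(tcsd_constraints(*x))      # g1, g2 near zero (active constraints); any tiny
                                 # positive residual stems from 4-digit rounding
```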

6.4. Pressure Vessel Design

The pressure vessel design (PVD) problem [41] aims to minimize the manufacturing cost of a pressure vessel by optimizing multiple design variables: the shell thickness T_s, head thickness T_n, inner radius R, and length L, which directly affect the vessel's cost, safety, and performance. During optimization, these variables must be considered jointly to ensure that the cost is minimized while the engineering constraints are satisfied. A schematic of the PVD problem is illustrated in Figure 14.
The mathematical model of the PVD problem is represented as shown in Equation (29).
$$
\begin{aligned}
\min\; & f(x) = 0.6224\,x_1 x_3 x_4 + 1.7781\,x_2 x_3^2 + 3.1661\,x_1^2 x_4 + 19.84\,x_1^2 x_3, \\
& x = [x_1, x_2, x_3, x_4] = [T_s, T_n, R, L] \\
\text{s.t.}\; & g_1 = -x_1 + 0.0193\,x_3 \le 0, \qquad g_2 = -x_2 + 0.00954\,x_3 \le 0, \\
& g_3 = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0, \qquad g_4 = x_4 - 240 \le 0, \\
& 0 \le x_1, x_2 \le 100, \quad 0 \le x_3, x_4 \le 200.
\end{aligned}
$$
where x_1 is the shell thickness T_s, x_2 is the head thickness T_n, x_3 is the inner radius R, and x_4 is the length L.
The optimal parameters and objective function statistics—optimal value, mean, standard deviation, and ranking—for each algorithm are summarized in Table 14.
From Table 14, it can be observed that the MSECOA significantly outperforms the other compared algorithms on the pressure vessel design problem. The best value, mean, and standard deviation of its objective function are 5885.3328, 5885.4843, and 0.6666, respectively, all of which surpass those of the other algorithms, demonstrating its efficiency and stability in solving complex engineering problems. Specifically, the optimal solution obtained by the MSECOA is x = {0.7782, 0.3846, 40.3196, 200}, corresponding to a minimum objective function value of 5885.3328, which is notably lower than the results of the other algorithms. Additionally, although the LSHADE_cnEpSin algorithm performs relatively close to the MSECOA, its best value and standard deviation remain inferior to those of the MSECOA.
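The reported PVD optimum can likewise be verified against Equation (29); the sketch below uses illustrative names:

```python
import math

def pvd_objective(Ts, Tn, R, L):
    # Manufacturing cost f(x), per Equation (29)
    return (0.6224 * Ts * R * L + 1.7781 * Tn * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def pvd_constraints(Ts, Tn, R, L):
    # A constraint g_i is satisfied when g_i <= 0
    g1 = -Ts + 0.0193 * R
    g2 = -Tn + 0.00954 * R
    g3 = -math.pi * R**2 * L - (4.0 / 3.0) * math.pi * R**3 + 1296000.0
    g4 = L - 240.0
    return g1, g2, g3, g4

x = (0.7782, 0.3846, 40.3196, 200.0)   # optimum reported in Table 14
print(pvd_objective(*x))               # ~5885.3, matching the tabulated best value
print(pvd_constraints(*x))             # g1-g3 near zero (active); small positive
                                       # residuals come from rounding the solution
```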

7. MSECOA for Said–Ball Curves Degree Reduction

7.1. Algorithm Design

7.1.1. Population Initialization

The coati population consists of feasible solutions representing the control vertices of the reduced-degree Said–Ball curve. A population initialization method for Said–Ball curve degree reduction is proposed in this paper, aimed at ensuring the quality and diversity of the initial solutions. Given the n + 1 control vertices {P_k}_{k=0}^{n} of the original Said–Ball curve, their coordinate range is first determined according to Equation (30):
$$
P_{\min} = \Big[\min_k x_k,\ \min_k y_k\Big]^{T}, \qquad P_{\max} = \Big[\max_k x_k,\ \max_k y_k\Big]^{T}
$$
To ensure end point interpolation, Q_0 = P_0 and Q_m = P_n are set. Thus, for the m + 1 control vertices {Q_j}_{j=0}^{m} of the reduced-degree curve, the generation rule is given by Equation (31):
$$
Q_j = \begin{cases} P_0, & j = 0, \\ P_{\min} + \alpha_j \circ (P_{\max} - P_{\min}), & j = 1, \ldots, m-1, \\ P_n, & j = m, \end{cases}
$$
where Q_j ∈ ℝ² denotes the coordinate vector of the j-th control vertex, α_j ∼ U([0, 1]²) is a two-dimensional uniformly distributed random vector, and ∘ denotes the Hadamard product (element-wise multiplication).
To ensure population diversity, the search space is expanded, as shown in Equation (32):
$$
\Omega = \big\{\, Q_j \in \mathbb{R}^2 \;\big|\; x_{\min} - \rho\,\Delta x \le x_j \le x_{\max} + \rho\,\Delta x,\ \ y_{\min} - \rho\,\Delta y \le y_j \le y_{\max} + \rho\,\Delta y \,\big\}
$$
where Δx = x_max − x_min, Δy = y_max − y_min, and ρ ∈ (0, 1] is the space expansion factor.
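A minimal NumPy sketch of this initialization is given below. It assumes 2D control vertices and, for diversity, samples the interior vertices directly from the expanded region Ω of Equation (32); all names are illustrative:

```python
import numpy as np

def init_population(P, m, pop_size, rho=0.5, rng=None):
    """Generate pop_size candidate control polygons {Q_j}_{j=0}^{m}
    for the reduced-degree curve, per Equations (30)-(32).
    P: (n+1, 2) array of original control vertices."""
    rng = np.random.default_rng(rng)
    p_min, p_max = P.min(axis=0), P.max(axis=0)       # Equation (30)
    delta = p_max - p_min
    lo, hi = p_min - rho * delta, p_max + rho * delta  # expanded space, Equation (32)
    pop = np.empty((pop_size, m + 1, 2))
    for k in range(pop_size):
        alpha = rng.random((m - 1, 2))                # alpha_j ~ U([0, 1]^2)
        pop[k, 0] = P[0]                              # end point interpolation
        pop[k, 1:m] = lo + alpha * (hi - lo)          # Equation (31), Hadamard product
        pop[k, m] = P[-1]
    return pop
```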

7.1.2. Selection of Fitness Function

In a population, an individual’s fitness is determined by the quality of its properties, with higher fitness values increasing the likelihood of selection for breeding the next generation. To meet the approximation conditions of the Said–Ball curve P n ( t ) , the objective function is defined as minimizing the Euclidean distance term and average curvature difference between the original curve and reduced-degree curve over the parameter range. Specifically, the Euclidean distance term is used to quantify the geometric deviation between the reduced-degree curve and original curve, ensuring that the reduced-degree curve remains as close as possible to the original in terms of geometric position, thus reducing the overall approximation error. The average curvature difference is employed to assess the ability of the reduced-degree curve to preserve shape characteristics, particularly in curved regions, ensuring smooth transitions and the retention of geometric features while avoiding shape distortion or abrupt changes. By combining the Euclidean distance term with the average curvature difference, the objective function is designed to maintain the curve’s shape properties effectively while ensuring degree reduction accuracy. The mathematical expression of the objective function is given in Equation (33):
$$
\text{Minimize}\;\; F\big(P_n(t), Q_m(t)\big) = \lambda \int_0^1 d\big(P_n(t), Q_m(t)\big)^2\,dt + (1-\lambda)\int_0^1 \big(\kappa_1(t) - \kappa_2(t)\big)^2\,dt.
$$
Here, P_n(t) and Q_m(t) represent the parameterized expressions of the original and reduced-degree curves, respectively; d(P_n(t), Q_m(t)) denotes the Euclidean distance between the two curves at parameter t; κ_1(t) and κ_2(t) are the curvatures of the original and reduced-degree curves at parameter t; and λ is a weighting coefficient balancing the Euclidean distance term and the average curvature difference, set to 0.8 in this work.
By appropriately setting the weighting coefficient λ , a dynamic balance between geometric accuracy and shape preservation can be achieved during the degree reduction process, thereby providing a more effective solution for the degree reduction of complex curves.
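A discrete version of Equation (33) can be sketched as follows, assuming both curves have already been sampled at a common parameter grid t and using finite differences for the curvatures; the routine names are ours:

```python
import numpy as np

def curvature(C, t):
    # Finite-difference curvature: |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)
    dx, dy = np.gradient(C[:, 0], t), np.gradient(C[:, 1], t)
    ddx, ddy = np.gradient(dx, t), np.gradient(dy, t)
    return np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

def fitness(Pn, Qm, t, lam=0.8):
    """Discrete version of Equation (33). Pn, Qm: (len(t), 2) arrays of
    points sampled from the original and reduced-degree curves."""
    dist2 = np.sum((Pn - Qm) ** 2, axis=1)            # squared Euclidean distance
    kdiff2 = (curvature(Pn, t) - curvature(Qm, t)) ** 2
    return lam * np.trapz(dist2, t) + (1.0 - lam) * np.trapz(kdiff2, t)
```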

7.1.3. Degree Reduction Error

To evaluate the error between curves before and after degree reduction, an integral-based error calculation formula is adopted in this paper, as shown in Equation (34):
$$
\varepsilon = D\big(P_n(t), Q_m(t)\big) = \int_0^1 \big\| P_n(t) - Q_m(t) \big\|^2\,dt.
$$
Here, P_n(t) denotes the original curve, Q_m(t) the reduced-degree curve, and ε the degree reduction error, used to quantify the difference between the curves before and after reduction. By integrating the squared difference between the two curves over the parameter interval [0, 1], this formula comprehensively reflects the deviation in the overall shape of the curves before and after degree reduction. A smaller degree reduction error ε indicates a closer fit between the reduced curve and the original curve and, hence, a better degree reduction outcome.
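Under the same sampling assumption as above, Equation (34) reduces to a single numerical quadrature; the helper name is illustrative:

```python
import numpy as np

def degree_reduction_error(Pn, Qm, t):
    """Discrete version of Equation (34): integrate the squared pointwise
    distance between the sampled curves over t in [0, 1]."""
    return np.trapz(np.sum((Pn - Qm) ** 2, axis=1), t)
```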

7.2. Algorithm Steps

Based on the introduction to the MSECOA in Section 4, the COA iteratively updates positions in the search space through exploration and exploitation phases to find the optimal solution. In the degree reduction of Said–Ball curves, the curve's control vertices are therefore treated as the coatis' position information, and control vertices satisfying the degree reduction condition are continuously updated through the position update strategy to obtain the reduced-degree Said–Ball curve.
The specific implementation steps for applying the MSECOA to Said–Ball curve degree reduction are as follows (a schematic driver loop is sketched after the list):
Step 1: Initialize the control vertex sequence p_0, p_1, …, p_n of the Said–Ball curve, and set the target reduced degree to m.
Step 2: Set the population size N and maximum number of iterations T, and initialize the iteration counter t = 0 .
Step 3: Generate the initial coati population X_i (i = 1, 2, …, N) according to Equations (30)–(32).
Step 4: Calculate the fitness value f ( X i ) for each coati individual.
Step 5: Update the positions of the coati individuals based on the position update strategies of the exploration and exploitation phases.
Step 6: Update the current best solution X best and its corresponding fitness value f best .
Step 7: If the current iteration count t reaches the maximum T, proceed to Step 8; otherwise, increment t = t + 1 and return to Step 4.
Step 8: Output the optimal solution X best and its corresponding sequence of reduced-degree control vertices.
Step 9: Plot the reduced-degree Said–Ball curve and compute the degree reduction error.
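For concreteness, the steps above can be condensed into a short driver. The sketch below reuses the init_population, fitness, and degree_reduction_error routines sketched earlier, takes the curve evaluator eval_curve(Q, t) as a parameter (the Said–Ball evaluation itself is defined earlier in the paper), and substitutes a plain shrinking random perturbation for the full MSECOA position updates of Section 4; it is a schematic stand-in, not the algorithm itself:

```python
import numpy as np

def reduce_curve(P, m, eval_curve, pop_size=30, max_iter=200, seed=0):
    """Schematic driver for Steps 1-9. eval_curve(Q, t) must return the
    (len(t), 2) points of the curve with control polygon Q."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, 200)
    Pn = eval_curve(P, t)                                  # sampled original curve
    pop = init_population(P, m, pop_size)                  # Steps 1-3
    fit = np.array([fitness(Pn, eval_curve(Q, t), t) for Q in pop])  # Step 4
    for it in range(max_iter):                             # Steps 5-7
        step = 0.1 * (1.0 - it / max_iter)                 # shrinking step size
        cand = pop + step * rng.standard_normal(pop.shape)
        cand[:, 0], cand[:, -1] = P[0], P[-1]              # keep end points fixed
        cfit = np.array([fitness(Pn, eval_curve(Q, t), t) for Q in cand])
        improved = cfit < fit                              # greedy selection
        pop[improved], fit[improved] = cand[improved], cfit[improved]
    best = pop[fit.argmin()]                               # Step 8
    err = degree_reduction_error(Pn, eval_curve(best, t), t)
    return best, err                                       # Step 9
```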

7.3. Numerical Examples

To verify the performance of the MSECOA in Said–Ball curve degree reduction, this paper selects multiple cases with different degree reduction levels for testing. In the experiments, the MSECOA is compared with four algorithms: the basic COA, the improved coati optimization algorithm (ICOA) [30], the whale optimization algorithm (WOA) [26], and the hiking optimization algorithm (HOA) [35]. The population size is uniformly set to 30 and the maximum number of iterations to 200 for all algorithms to ensure a fair comparison. Each algorithm is run independently 30 times to mitigate the impact of randomness on the results, and the mean, standard deviation, and minimum of the degree reduction error are reported as performance evaluation metrics.

7.3.1. Example 1

The control points for example 1 are defined as follows:
{ P_0 = (0.1, 0), P_1 = (0, 0.1), P_2 = (0.05, 0.2), P_3 = (0.3, 0.3), P_4 = (0.55, 0.2), P_5 = (0.6, 0.1), P_6 = (0.5, 0) }
A sixth-degree Said–Ball curve was constructed using these control points and was subsequently reduced to a fourth-degree curve through various algorithms, without employing end point interpolation constraints. The resulting control point coordinates and statistical error metrics for each algorithm are presented in Table 15.
From Table 15, it can be observed that the MSECOA performs best in Case 1, with the mean, standard deviation, and minimum values of its degree reduction error ( ε ) significantly lower than those of the other algorithms. Specifically, the MSECOA achieves a mean error of 3.671 × 10 5 , a standard deviation of 2.710 × 10 6 , and a minimum error of 3.351 × 10 5 , all of which outperform the COA, ICOA, WOA, and HOA algorithms. This indicates that the MSECOA exhibits higher convergence accuracy and stability during the degree reduction process. Furthermore, the reduced-degree curve generated by the MSECOA approximates the original curve more closely, whereas the other algorithms tend to get trapped in local optima during degree reduction, resulting in larger errors. For instance, the mean errors of the COA and ICOA are 2.180 × 10 3 and 3.149 × 10 3 , respectively, notably higher than that of the MSECOA.
Figure 15 illustrates the comparative results of reducing the sixth-degree Said–Ball curve to a fourth-degree representation, including both the geometric approximation quality and the iterative convergence behavior.

7.3.2. Example 2

The control points for example 2 are defined as follows:
{ P_0 = (0.871, 0.408), P_1 = (0.692, 0.916), P_2 = (0.483, 0.597), P_3 = (0.256, 0.302), P_4 = (0.256, 0.302), P_5 = (0.302, 0.483), P_6 = (0.692, 0.916), P_7 = (0.871, 0.409) }
A seventh-degree Said–Ball curve was constructed using these control points and was subsequently reduced to a fourth-degree curve through various algorithms, without employing end point interpolation constraints. The resulting control point coordinates and statistical error metrics for each algorithm are presented in Table 16.
The statistical results presented in Table 16 demonstrate that the MSECOA achieves significantly lower degree-reduction errors ( ε ) compared to alternative methods. Specifically, the MSECOA exhibits a mean error of 5.704 × 10 4 with a standard deviation of 4.420 × 10 6 and minimum error of 5.628 × 10 4 , outperforming the COA, ICOA, WOA, and HOA. In contrast, the COA and ICOA show substantially higher mean errors of 1.295 × 10 2 and 4.988 × 10 3 , respectively. The algorithm’s superior performance in both standard deviation and minimum error metrics further confirms its enhanced stability and convergence precision during the degree reduction process.
Figure 16 illustrates comparative results of reducing a seventh-degree Said–Ball curve to a fourth-degree representation, including both the geometric approximation quality and iterative convergence behavior.

7.3.3. Example 3

The control points for example 3 are defined as follows:
{ P_0 = (0.5, 0), P_1 = (0.8, 0.3), P_2 = (0.65, 0.7), P_3 = (0.1, 1), P_4 = (0.6, 1.1), P_5 = (1.3, 1), P_6 = (1.85, 0.7), P_7 = (2, 0.3), P_8 = (1.7, 0) }
An eighth-degree Said–Ball curve was constructed using these control points and was subsequently reduced to a fourth-degree curve through various algorithms, without employing end point interpolation constraints. The resulting control point coordinates and statistical error metrics for each algorithm are presented in Table 17.
Table 17 demonstrates that the MSECOA achieves significantly lower approximation errors in degree reduction compared to alternative methods. Specifically, the MSECOA exhibits a mean error of 1.059 × 10 5 , standard deviation of 1.070 × 10 5 , and minimum error of 6.720 × 10 6 , outperforming the COA, ICOA, WOA, and HOA by two orders of magnitude. In contrast, the COA and ICOA show substantially higher mean errors of 3.368 × 10 2 and 4.699 × 10 3 , respectively. The algorithm’s superior performance in both standard deviation and minimum error further confirms its exceptional stability and precision.
Figure 17 illustrates the degree reduction comparison from the eighth-degree to the fourth-degree Said–Ball curve, displaying both the geometric approximation and the convergence behavior.

7.3.4. Example 4

The control points for example 4 are defined as follows:
{ P_0 = (0.3, 0), P_1 = (0.8, 0.3), P_2 = (0.65, 0.7), P_3 = (0.1, 1.0), P_4 = (0.6, 1.1), P_5 = (1.3, 1.0), P_6 = (1.85, 0.7), P_7 = (2.0, 0.3), P_8 = (1.5, 0) }
An eighth-degree Said–Ball curve was constructed from these control points and subsequently reduced to a third-degree curve using different algorithms without end point interpolation constraints. The resulting control point coordinates and statistical error metrics for each method are presented in Table 18.
As evidenced by the statistical results in Table 18, the MSECOA demonstrates significantly lower degree-reduction errors compared to the other methods. Specifically, the MSECOA achieves a mean error of 1.353 × 10 3 with a standard deviation of 8.900 × 10 7 , while the minimum error ( 1.350 × 10 3 ) closely aligns with the mean value. This consistency confirms the algorithm’s stability across varying initial conditions. In contrast, the COA and ICOA exhibit notably higher mean errors ( 1.844 × 10 2 and 2.567 × 10 3 , respectively). Furthermore, the MSECOA’s superior performance in both the standard deviation and minimum error metrics underscores its enhanced stability and convergence precision during degree reduction. These results collectively demonstrate that the MSECOA offers a robust solution for Said–Ball curve degree reduction, effectively minimizing approximation errors while preserving geometric characteristics. The algorithm thus presents an efficient and reliable approach for handling complex curve simplification.
Figure 18 illustrates the comparative results of reducing an eighth-degree Said–Ball curve to a third-degree curve, including both the curve geometry and iterative convergence behavior.

8. Conclusions and Future Prospects

This paper addresses the degree reduction problem of Said–Ball curves by proposing a multi-strategy enhanced coati optimization algorithm (MSECOA). The algorithm constructs a degree reduction model based on Euclidean distance and integrated curvature information, incorporating multiple strategies such as good point set-based hybrid opposition learning, fitness–distance balance, dynamic spiral searching, and adaptive differential evolution. These enhancements significantly improve the algorithm’s global search capability, local exploitation ability, and convergence speed. The experimental results demonstrate that the MSECOA outperforms the existing methods on the IEEE CEC2017 and CEC2022 test function suites and exhibits strong practical applicability in four engineering constrained design problems. Furthermore, numerical experiments with four different degree reduction levels validate the MSECOA’s notable advantages in the Said–Ball curve degree reduction problem, effectively preserving the geometric features of the curve while reducing its degree. This provides an efficient and reliable solution for complex curve degree reduction challenges.
The potential applications of the MSECOA extend beyond the degree reduction of Said–Ball curves to broader areas of geometric modeling [42,43]. In particular, the MSECOA demonstrates strong applicability in curve and surface modeling for complex real-world problems. For example, in vascular structure reconstruction, accurate curve approximation is essential for modeling intricate blood vessel geometries based on segmented medical imaging data. The algorithm’s ability to preserve shape characteristics during degree reduction makes it particularly suitable for reconstructing tubular anatomical structures, such as arteries and veins, under spatial constraints. Similarly, in pipeline design, where the precise geometric representation of pipe networks is crucial for layout optimization and structural integrity analysis, the MSECOA can be applied to generate high-fidelity free-form curves and surfaces, offering improved efficiency and accuracy.
Although the MSECOA has demonstrated notable performance advantages in current degree reduction tasks, future research will focus on further improving its convergence behavior and robustness. Promising research directions include extending the algorithm to handle curves defined over disk or ball domains, exploring its application in the degree reduction of Ball Said–Ball curves, and generalizing the framework to high-dimensional surface reduction, thereby addressing more complex geometric modeling challenges. Additionally, integrating reinforcement learning techniques to explore the MSECOA’s potential in intelligent geometric modeling represents a promising avenue for future studies.

Author Contributions

Conceptualization, F.Z. and H.Y.; Methodology, F.Z.; Software, F.Z. and X.W.; Formal analysis, W.Z.; Data curation, Q.S.; Writing - original draft, F.Z.; Writing - review and editing, H.Y.; Supervision, H.Y.; Funding acquisition, F.Z., X.W. and H.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (Grant No. 12161043), Ganzhou City Guiding Science and Technology Plan Project (GZ2024ZSF874), Jiangxi Postgraduate Innovation Special Fund Project (YC2024-S528), and Postgraduate Innovation Special Fund Project of Jiangxi University of Science and Technology (XY2024-S051).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during the study are included in this published article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Hu, G.; Li, M.; Wang, X.; Wei, G.; Chang, C.T. An enhanced manta ray foraging optimization algorithm for shape optimization of complex CCG-Ball curves. Knowl.-Based Syst. 2022, 240, 108071–108091.
2. Ball, A.A. CONSURF. Part one: Introduction of the conic lofting tile. Comput.-Aided Des. 1974, 6, 243–249.
3. Wang, G. Ball curve of high degree and its geometric properties. Appl. Math. J. Chin. Univ. 1987, 2, 126–140.
4. Said, H.B. A generalized Ball curve and its recursive algorithm. ACM Trans. Graph. 1989, 8, 360–371.
5. Hu, S.M.; Wang, G.Z.; Jin, T.G. Properties of two types of generalized Ball curves. Comput.-Aided Des. 1996, 28, 125–133.
6. Wu, H. Two new types of generalized Ball curves. Acta Math. Appl. Sin. 2000, 23, 196–205.
7. Liu, W.; Xie, B.; Han, L. h-Said-Ball bases and h-Said-Ball curves. Appl. Math. J. Chin. Univ. Ser. A 2024, 39, 273–290.
8. Sederberg, T.W.; Farouki, R.T. Approximation by interval Bézier curves. IEEE Comput. Graph. Appl. 1992, 12, 87–95.
9. Lin, Q.; Rokne, J. Disk Bézier curves. Comput. Aided Geom. Des. 1998, 15, 721–737.
10. Lu, L.; Hu, Q.; Wang, G. An iterative algorithm for degree reduction of Bézier curves. J. Comput.-Aided Des. Comput. Graph. 2009, 21, 1689–1693.
11. Quan, H.; Liu, C.; Li, J.; Yang, L.; Hu, L. Preconditioned progressive iterative approximation for tensor product Said-Ball patches. J. Zhejiang Univ. 2022, 49, 682–690.
12. Chen, F.; Lou, W. Degree reduction of interval Bézier curves. Comput.-Aided Des. 2000, 32, 571–582.
13. Tan, Q.; Fang, Z. Degree reduction of interval generalized Ball curves of Wang-Said type. J. Comput.-Aided Des. Comput. Graph. 2008, 20, 1483–1493.
14. Hu, Q.; Wang, G. Multi-degree reduction of disk Bézier curves in L2 norm. J. Inf. Comput. Sci. 2010, 7, 1045–1057.
15. Hongkai, W. Research of Reparameterization of Bézier Curves for Reducing Order in CAD. Master's Thesis, Jiangnan University, Wuxi, China, 2020.
16. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
17. Cheng, L.; Ling, G.; Liu, F.; Ge, M.F. Application of uniform experimental design theory to multi-strategy improved sparrow search algorithm for UAV path planning. Expert Syst. Appl. 2024, 255, 124849.
18. Abdel-salam, M.; Alomari, S.A.; Almomani, M.H.; Hu, G.; Lee, S.; Saleem, K.; Smerat, A.; Abualigah, L. Quadruple strategy-driven hiking optimization algorithm for low and high-dimensional feature selection and real-world skin cancer classification. Knowl.-Based Syst. 2025, 315, 113286.
19. Hu, G.; Qiao, Y.; Qin, X.; Wei, G. Approximate multi-degree reduction of SG-Bézier curves using the grey wolf optimizer algorithm. Symmetry 2019, 11, 1242.
20. Guo, W.; Liu, T.; Dai, F.; Zhao, F.; Xu, P. Skewed normal cloud modified whale optimization algorithm for degree reduction of S-λ curves. Appl. Intell. 2021, 51, 8377–8398.
21. Hu, G.; Chen, L.; Wei, G. Enhanced golden jackal optimizer-based shape optimization of complex CSGC-Ball surfaces. Artif. Intell. Rev. 2023, 56, 2407–2475.
22. Dehghani, M.; Montazeri, Z.; Trojovská, E.; Trojovský, P. Coati Optimization Algorithm: A new bio-inspired metaheuristic algorithm for solving optimization problems. Knowl.-Based Syst. 2023, 259, 110011.
23. Luo, H.; Yuan, W. Applications of Number-Theoretic Methods in Approximate Analysis; Technical Report; 1978.
24. Li, X.; Qi, Y.; Xing, Q.; Hu, Y. IMSCSO: An intensified sand cat swarm optimization with multi-strategy for solving global and engineering optimization problems. IEEE Access 2023, 11, 122315–122344.
25. Kahraman, H.T.; Aras, S.; Gedikli, E. Fitness-distance balance (FDB): A new selection method for meta-heuristic search algorithms. Knowl.-Based Syst. 2020, 190, 105169.
26. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
27. Salman, A.; Engelbrecht, A.P.; Omran, M.G. Empirical analysis of self-adaptive differential evolution. Eur. J. Oper. Res. 2007, 183, 785–804.
28. Awad, N.H.; Ali, M.Z.; Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; 2016.
29. Bujok, P.; Kolenovsky, P. Eigen crossover in cooperative model of evolutionary algorithms applied to CEC 2022 single objective numerical optimisation. In Proceedings of the 2022 IEEE Congress on Evolutionary Computation (CEC), Padua, Italy, 18–23 July 2022; pp. 1–8.
30. Jia, H.; Shi, S.; Wu, D.; Rao, H.; Zhang, J.; Abualigah, L. Improve coati optimization algorithm for solving constrained engineering optimization problems. J. Comput. Des. Eng. 2023, 10, 2223–2250.
31. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609.
32. Dehkordi, A.A.; Sadiq, A.S.; Mirjalili, S.; Ghafoor, K.Z. Nonlinear-based chaotic Harris hawks optimizer: Algorithm and internet of vehicles application. Appl. Soft Comput. 2021, 109, 107574.
33. Jia, H.; Zhang, J.; Rao, H.; Abualigah, L. Improved sandcat swarm optimization algorithm for solving global optimum problems. Artif. Intell. Rev. 2024, 58, 5.
34. Wang, J.; Wang, W.C.; Hu, X.X.; Qiu, L.; Zang, H.F. Black-winged kite algorithm: A nature-inspired meta-heuristic for solving benchmark functions and engineering problems. Artif. Intell. Rev. 2024, 57, 98.
35. Oladejo, S.O.; Ekwe, S.O.; Mirjalili, S. The hiking optimization algorithm: A novel human-based metaheuristic approach. Knowl.-Based Syst. 2024, 296, 111880.
36. Awad, N.H.; Ali, M.Z.; Suganthan, P.N. Ensemble sinusoidal differential covariance matrix adaptation with Euclidean neighborhood for solving CEC2017 benchmark problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), Donostia-San Sebastián, Spain, 5–8 June 2017; pp. 372–379.
37. Friedman, M. A comparison of alternative tests of significance for the problem of m rankings. Ann. Math. Stat. 1940, 11, 86–92.
38. Örnek, B.N.; Aydemir, S.B.; Düzenli, T.; Özak, B. A novel version of slime mould algorithm for global optimization and real-world engineering problems: Enhanced slime mould algorithm. Math. Comput. Simul. 2022, 198, 253–288.
39. Abdulsalami, A.O.; Abd Elaziz, M.; Gharehchopogh, F.S.; Salawudeen, A.T.; Xiong, S. An improved heterogeneous comprehensive learning symbiotic organism search for optimization problems. Knowl.-Based Syst. 2024, 285, 111351.
40. Olmez, Y.; Koca, G.O.; Sengur, A.; Acharya, U.R. Chaotic opposition golden sine algorithm for global optimization problems. Chaos Solitons Fractals 2024, 183, 114869.
41. Bencherqui, A.; Tahiri, M.A.; El Ghouate, N.; Karmouni, H.; Sayyouri, M.; Askar, S.S.; Abouhawwash, M. Chaos-enhanced white shark optimization algorithms (CWSO) for global optimization. Alex. Eng. J. 2025, 122, 465–483.
42. Zhang, L.; Tan, J.; Dong, Z. The dual bases for the Bézier-Said-Wang type generalized Ball polynomial bases and their applications. Appl. Math. Comput. 2010, 217, 3088–3101.
43. Majeed, A.; Mt Piah, A.R.; Gobithaasan, R.U.; Yahya, Z.R. Craniofacial reconstruction using rational cubic Ball curves. PLoS ONE 2015, 10, e0122854.
Figure 1. Comparison of 6th-degree Said–Ball curve before and after degree reduction.
Figure 2. Hunting and attack strategy on iguanas (exploration phase).
Figure 3. Process of escaping predators (exploitation phase).
Figure 4. Comparison of population distribution between the good point set and random methods.
Figure 5. Schematic of hybrid oppositional learning.
Figure 6. Flowchart of MSECOA.
Figure 7. The ablation study on CEC2017 (10-dimension). (a) The radar chart. (b) The average rank chart.
Figure 8. Convergence curves of different algorithms (CEC2017, dimension 10).
Figure 9. Convergence curves of different algorithms (CEC2017, dimension 30).
Figure 10. Convergence curves of different algorithms (CEC2022, dimension 20).
Figure 11. Schematic of the three-bar truss design problem.
Figure 12. Schematic of welded beam design problem.
Figure 13. Schematic of tension/compression spring design problem.
Figure 14. Schematic of pressure vessel design problem.
Figure 15. Degree reduction results of different algorithms (example 1).
Figure 16. Degree reduction results of different algorithms (example 2).
Figure 17. Degree reduction results of different algorithms (example 3).
Figure 18. Degree reduction results of different algorithms (example 4).
Table 1. Algorithm parameter settings.
Algorithm | Parameter Settings
MSECOA | CR ∈ [0.1, 0.9], I ∈ [1, 2]
COA | I ∈ [1, 2]
ICOA | v = 0.3, a ∈ [1, 2]
WOA | a ∈ [2, 0], b = 1, l ∈ [−1, 1]
AOA | MOP_Max = 1, MOP_Min = 0.2, A = 5, Mu = 0.499
NCHHO | c ∈ [2, 0], a = 4, Xn = 0.7
ISCSO | C = 0.01, β = 2, δ = 2, SM = 2
BKA | P = 0.9
HOA | Sweep factor f ∈ [2, 0], angle of inclination ∈ [0, 50]
LSHADE_cnEpSin | NPmin = 4.0, Pbest rate = 0.11, Arc rate = 1.4, H = 5, μF_initial = μCR_initial = 0.5
Table 2. IEEE CEC2017 benchmark functions.
Type | No. | Function Description | Range | Optimum
Unimodal | F1 | Shifted and rotated bent cigar function | [−100, 100] | 100
Unimodal | F3 | Shifted and rotated Zakharov function | [−100, 100] | 300
Multimodal | F4 | Shifted and rotated Rosenbrock's function | [−100, 100] | 400
Multimodal | F5 | Shifted and rotated Rastrigin's function | [−100, 100] | 500
Multimodal | F6 | Shifted and rotated expanded Scaffer's F6 function | [−100, 100] | 600
Multimodal | F7 | Shifted and rotated Lunacek Bi-Rastrigin function | [−100, 100] | 700
Multimodal | F8 | Shifted and rotated non-continuous Rastrigin's function | [−100, 100] | 800
Multimodal | F9 | Shifted and rotated Levy function | [−100, 100] | 900
Multimodal | F10 | Shifted and rotated Schwefel's function | [−100, 100] | 1000
Hybrid | F11 | Hybrid function 1 (N = 3) | [−100, 100] | 1100
Hybrid | F12 | Hybrid function 2 (N = 3) | [−100, 100] | 1200
Hybrid | F13 | Hybrid function 3 (N = 3) | [−100, 100] | 1300
Hybrid | F14 | Hybrid function 4 (N = 4) | [−100, 100] | 1400
Hybrid | F15 | Hybrid function 4 (N = 4) | [−100, 100] | 1500
Hybrid | F16 | Hybrid function 6 (N = 4) | [−100, 100] | 1600
Hybrid | F17 | Hybrid function 6 (N = 5) | [−100, 100] | 1700
Hybrid | F18 | Hybrid function 6 (N = 5) | [−100, 100] | 1800
Hybrid | F19 | Hybrid function 6 (N = 5) | [−100, 100] | 1900
Hybrid | F20 | Hybrid function 6 (N = 5) | [−100, 100] | 2000
Composition | F21 | Composition function 1 (N = 3) | [−100, 100] | 2100
Composition | F22 | Composition function 2 (N = 3) | [−100, 100] | 2200
Composition | F23 | Composition function 3 (N = 4) | [−100, 100] | 2300
Composition | F24 | Composition function 4 (N = 4) | [−100, 100] | 2400
Composition | F25 | Composition function 5 (N = 5) | [−100, 100] | 2500
Composition | F26 | Composition function 6 (N = 5) | [−100, 100] | 2600
Composition | F27 | Composition function 7 (N = 6) | [−100, 100] | 2700
Composition | F28 | Composition function 8 (N = 6) | [−100, 100] | 2800
Composition | F29 | Composition function 9 (N = 3) | [−100, 100] | 2900
Composition | F30 | Composition function 10 (N = 3) | [−100, 100] | 3000
Table 3. Results of CEC2017 in dimension 10.
Fun | Index | MSECOA | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1Mean 1.968 × 10 5 8.878 × 10 9 9.361 × 10 9 8.268 × 10 6 7.114 × 10 9 1.101 × 10 8 4.297 × 10 9 7.411 × 10 9 1.524 × 10 10 2.022 × 10 6
Std 6.941 × 10 5 2.403 × 10 9 2.717 × 10 9 1.742 × 10 7 2.961 × 10 9 3.517 × 10 8 2.220 × 10 9 2.861 × 10 9 6.940 × 10 9 1.129 × 10 6
F3Mean 3.019 × 10 2 8.222 × 10 3 9.933 × 10 3 5.280 × 10 3 1.050 × 10 4 1.246 × 10 3 8.776 × 10 3 9.441 × 10 3 1.777 × 10 4 3.272 × 10 3
Std 6.008 × 10 0 1.989 × 10 3 2.448 × 10 3 4.220 × 10 3 2.628 × 10 3 3.027 × 10 3 3.002 × 10 3 3.670 × 10 3 4.497 × 10 3 1.484 × 10 3
F4Mean 4.029 × 10 2 1.012 × 10 3 1.133 × 10 3 4.307 × 10 2 1.099 × 10 3 4.477 × 10 2 6.111 × 10 2 8.501 × 10 2 1.876 × 10 3 4.072 × 10 2
Std 1.485 × 10 0 1.879 × 10 2 3.040 × 10 2 3.558 × 10 1 4.659 × 10 2 1.263 × 10 2 1.297 × 10 2 3.259 × 10 2 1.088 × 10 3 7.313 × 10 1
F5Mean 5.122 × 10 2 5.941 × 10 2 5.906 × 10 2 5.594 × 10 2 5.535 × 10 2 5.368 × 10 2 5.543 × 10 2 5.814 × 10 2 6.162 × 10 2 5.257 × 10 2
Std 4.186 × 10 0 1.238 × 10 1 1.699 × 10 1 2.039 × 10 1 2.092 × 10 1 1.417 × 10 1 1.485 × 10 1 1.878 × 10 1 2.877 × 10 1 5.329 × 10 0
F6Mean 6.003 × 10 2 6.429 × 10 2 6.480 × 10 2 6.375 × 10 2 6.404 × 10 2 6.268 × 10 2 6.286 × 10 2 6.433 × 10 2 6.671 × 10 2 6.011 × 10 2
Std 6.515 × 10 1 7.558 × 10 0 9.222 × 10 0 1.431 × 10 1 8.403 × 10 0 1.072 × 10 1 9.214 × 10 0 1.323 × 10 1 1.026 × 10 1 4.119 × 10 1
F7Mean 7.313 × 10 2 7.951 × 10 2 8.038 × 10 2 7.862 × 10 2 7.974 × 10 2 7.548 × 10 2 7.638 × 10 2 8.087 × 10 2 8.407 × 10 2 7.403 × 10 2
Std 1.452 × 10 1 1.525 × 10 1 1.900 × 10 1 2.304 × 10 1 1.567 × 10 1 1.905 × 10 1 1.411 × 10 1 1.999 × 10 1 2.490 × 10 1 6.116 × 10 0
F8Mean 8.145 × 10 2 8.560 × 10 2 8.567 × 10 2 8.427 × 10 2 8.317 × 10 2 8.226 × 10 2 8.339 × 10 2 8.461 × 10 2 8.732 × 10 2 8.247 × 10 2
Std 4.886 × 10 0 5.351 × 10 0 9.471 × 10 0 1.386 × 10 1 5.884 × 10 0 7.991 × 10 0 6.346 × 10 0 8.798 × 10 0 1.063 × 10 1 5.971 × 10 0
F9Mean 9.273 × 10 2 1.478 × 10 3 1.537 × 10 3 1.407 × 10 3 1.382 × 10 3 1.188 × 10 3 1.155 × 10 3 1.596 × 10 3 2.325 × 10 3 9.034 × 10 2
Std 4.812 × 10 1 1.859 × 10 2 2.063 × 10 2 3.356 × 10 2 1.781 × 10 2 1.545 × 10 2 1.261 × 10 2 2.369 × 10 2 3.892 × 10 2 2.306 × 10 0
F10Mean 1.427 × 10 3 2.587 × 10 3 2.538 × 10 3 2.074 × 10 3 2.182 × 10 3 1.830 × 10 3 2.274 × 10 3 2.461 × 10 3 2.791 × 10 3 1.972 × 10 3
Std 2.176 × 10 2 1.650 × 10 2 2.000 × 10 2 3.589 × 10 2 2.291 × 10 2 3.566 × 10 2 2.861 × 10 2 3.264 × 10 2 4.810 × 10 2 2.056 × 10 2
F11Mean 1.113 × 10 3 2.161 × 10 3 2.717 × 10 3 1.227 × 10 3 2.561 × 10 3 1.187 × 10 3 2.268 × 10 3 2.788 × 10 3 1.503 × 10 4 1.114 × 10 3
Std 2.035 × 10 1 8.538 × 10 2 1.608 × 10 3 8.430 × 10 1 2.499 × 10 3 1.587 × 10 2 8.756 × 10 2 2.134 × 10 3 1.819 × 10 4 4.010 × 10 0
F12Mean 2.456 × 10 4 1.717 × 10 8 2.659 × 10 8 4.546 × 10 6 9.242 × 10 7 2.509 × 10 5 7.794 × 10 6 5.298 × 10 7 7.561 × 10 8 7.168 × 10 5
Std 6.057 × 10 4 1.467 × 10 8 2.178 × 10 8 5.142 × 10 6 1.334 × 10 8 1.020 × 10 6 6.991 × 10 6 1.415 × 10 8 5.882 × 10 8 4.451 × 10 5
F13Mean 1.336 × 10 3 6.829 × 10 5 8.912 × 10 5 1.981 × 10 4 1.097 × 10 4 1.951 × 10 3 5.002 × 10 5 1.619 × 10 4 8.460 × 10 7 1.450 × 10 3
Std 6.438 × 10 1 8.620 × 10 5 3.164 × 10 6 1.403 × 10 4 8.205 × 10 3 4.352 × 10 2 1.665 × 10 6 1.232 × 10 4 1.239 × 10 8 1.184 × 10 2
F14Mean 1.406 × 10 3 1.593 × 10 3 1.545 × 10 3 2.173 × 10 3 8.826 × 10 3 1.475 × 10 3 4.955 × 10 3 3.060 × 10 3 6.278 × 10 3 1.432 × 10 3
Std 6.265 × 10 0 6.825 × 10 1 4.435 × 10 1 1.118 × 10 3 7.941 × 10 3 2.862 × 10 1 3.964 × 10 3 1.520 × 10 3 1.670 × 10 4 4.408 × 10 0
F15Mean 1.506 × 10 3 7.526 × 10 3 8.058 × 10 3 8.923 × 10 3 1.648 × 10 4 1.613 × 10 3 8.692 × 10 3 1.092 × 10 4 7.153 × 10 4 1.518 × 10 3
Std 6.806 × 10 0 3.171 × 10 3 3.885 × 10 3 5.963 × 10 3 5.107 × 10 3 8.434 × 10 1 4.426 × 10 3 3.889 × 10 3 1.863 × 10 5 5.301 × 10 0
F16Mean 1.764 × 10 3 2.107 × 10 3 2.089 × 10 3 1.906 × 10 3 2.038 × 10 3 1.769 × 10 3 1.941 × 10 3 2.025 × 10 3 2.403 × 10 3 1.625 × 10 3
Std 1.237 × 10 2 1.087 × 10 2 1.230 × 10 2 1.176 × 10 2 1.373 × 10 2 1.012 × 10 2 1.169 × 10 2 1.674 × 10 2 2.331 × 10 2 1.155 × 10 1
F17Mean 1.716 × 10 3 1.786 × 10 3 1.811 × 10 3 1.819 × 10 3 1.880 × 10 3 1.770 × 10 3 1.797 × 10 3 1.802 × 10 3 1.955 × 10 3 1.737 × 10 3
Std 1.348 × 10 1 1.945 × 10 1 3.518 × 10 1 6.331 × 10 1 9.836 × 10 1 3.271 × 10 1 4.241 × 10 1 3.168 × 10 1 6.220 × 10 1 7.512 × 10 0
F18Mean 1.817 × 10 3 1.187 × 10 7 3.231 × 10 7 1.870 × 10 4 1.570 × 10 4 2.630 × 10 3 1.220 × 10 7 1.524 × 10 4 1.655 × 10 8 2.375 × 10 3
Std 2.325 × 10 1 2.509 × 10 7 8.558 × 10 7 1.222 × 10 4 1.032 × 10 4 1.988 × 10 3 2.844 × 10 7 1.259 × 10 4 3.023 × 10 8 6.398 × 10 2
F19Mean 1.902 × 10 3 1.283 × 10 4 1.647 × 10 4 2.952 × 10 4 4.212 × 10 4 1.957 × 10 3 2.768 × 10 4 3.389 × 10 4 6.720 × 10 6 1.908 × 10 3
Std 1.940 × 10 0 1.970 × 10 4 2.828 × 10 4 5.918 × 10 4 3.737 × 10 4 4.401 × 10 1 3.851 × 10 4 6.544 × 10 4 2.805 × 10 7 3.069 × 10 0
F20Mean 2.006 × 10 3 2.223 × 10 3 2.258 × 10 3 2.182 × 10 3 2.158 × 10 3 2.087 × 10 3 2.145 × 10 3 2.236 × 10 3 2.362 × 10 3 2.029 × 10 3
Std 1.215 × 10 1 5.191 × 10 1 3.929 × 10 1 6.824 × 10 1 6.381 × 10 1 5.188 × 10 1 7.704 × 10 1 7.323 × 10 1 1.205 × 10 2 6.194 × 10 0
F21Mean 2.247 × 10 3 2.346 × 10 3 2.370 × 10 3 2.312 × 10 3 2.325 × 10 3 2.278 × 10 3 2.330 × 10 3 2.339 × 10 3 2.389 × 10 3 2.261 × 10 3
Std 5.534 × 10 1 4.579 × 10 1 4.831 × 10 1 6.257 × 10 1 4.054 × 10 1 6.355 × 10 1 4.879 × 10 1 5.368 × 10 1 5.162 × 10 1 5.724 × 10 1
F22Mean 2.289 × 10 3 2.926 × 10 3 3.097 × 10 3 2.503 × 10 3 3.041 × 10 3 2.307 × 10 3 2.655 × 10 3 2.739 × 10 3 3.188 × 10 3 2.308 × 10 3
Std 3.032 × 10 1 2.329 × 10 2 3.254 × 10 2 4.874 × 10 2 3.168 × 10 2 3.512 × 10 1 3.368 × 10 2 3.443 × 10 2 4.493 × 10 2 1.458 × 10 0
F23Mean 2.621 × 10 3 2.709 × 10 3 2.707 × 10 3 2.653 × 10 3 2.724 × 10 3 2.637 × 10 3 2.729 × 10 3 2.688 × 10 3 2.772 × 10 3 2.623 × 10 3
Std 8.280 × 10 0 2.651 × 10 1 3.299 × 10 1 2.646 × 10 1 3.403 × 10 1 2.925 × 10 1 3.972 × 10 1 3.549 × 10 1 3.337 × 10 1 4.106 × 10 0
F24Mean 2.512 × 10 3 2.819 × 10 3 2.870 × 10 3 2.778 × 10 3 2.818 × 10 3 2.753 × 10 3 2.824 × 10 3 2.814 × 10 3 2.919 × 10 3 2.737 × 10 3
Std 4.957 × 10 1 6.486 × 10 1 8.071 × 10 1 4.740 × 10 1 8.994 × 10 1 7.248 × 10 1 1.048 × 10 2 7.236 × 10 1 4.810 × 10 1 5.320 × 10 1
F25Mean 2.922 × 10 3 3.363 × 10 3 3.459 × 10 3 2.958 × 10 3 3.283 × 10 3 2.933 × 10 3 3.119 × 10 3 3.182 × 10 3 3.741 × 10 3 2.928 × 10 3
Std 2.393 × 10 1 1.402 × 10 2 2.002 × 10 2 2.996 × 10 1 2.201 × 10 2 4.095 × 10 1 1.261 × 10 2 1.405 × 10 2 3.539 × 10 2 2.113 × 10 1
F26Mean 2.913 × 10 3 4.034 × 10 3 4.083 × 10 3 3.598 × 10 3 3.887 × 10 3 3.135 × 10 3 3.775 × 10 3 4.028 × 10 3 4.591 × 10 3 2.977 × 10 3
Std 6.216 × 10 1 3.566 × 10 2 4.017 × 10 2 5.705 × 10 2 3.420 × 10 2 2.868 × 10 2 3.764 × 10 2 4.031 × 10 2 4.490 × 10 2 2.340 × 10 1
F27Mean 3.095 × 10 3 3.154 × 10 3 3.196 × 10 3 3.158 × 10 3 3.249 × 10 3 3.108 × 10 3 3.242 × 10 3 3.190 × 10 3 3.322 × 10 3 3.091 × 10 3
Std 3.243 × 10 0 2.466 × 10 1 4.077 × 10 1 5.317 × 10 1 7.000 × 10 1 2.799 × 10 1 4.913 × 10 1 4.531 × 10 1 1.054 × 10 2 9.790 × 10 1
F28Mean 3.295 × 10 3 3.740 × 10 3 3.758 × 10 3 3.422 × 10 3 3.723 × 10 3 3.265 × 10 3 3.701 × 10 3 3.610 × 10 3 3.861 × 10 3 3.226 × 10 3
Std 1.435 × 10 2 1.102 × 10 2 8.844 × 10 1 1.431 × 10 2 1.467 × 10 2 1.540 × 10 2 8.808 × 10 1 1.938 × 10 2 9.509 × 10 1 4.584 × 10 1
F29Mean 3.185 × 10 3 3.424 × 10 3 3.433 × 10 3 3.338 × 10 3 3.425 × 10 3 3.256 × 10 3 3.302 × 10 3 3.402 × 10 3 3.671 × 10 3 3.175 × 10 3
Std 3.338 × 10 1 6.318 × 10 1 8.040 × 10 1 9.770 × 10 1 1.457 × 10 2 6.410 × 10 1 7.197 × 10 1 1.339 × 10 2 1.873 × 10 2 1.476 × 10 1
F30Mean 7.874 × 10 4 7.173 × 10 6 8.471 × 10 6 1.477 × 10 6 2.127 × 10 7 6.953 × 10 5 4.420 × 10 6 1.017 × 10 7 3.480 × 10 7 2.049 × 10 5
Std 1.233 × 10 5 9.826 × 10 6 8.954 × 10 6 1.608 × 10 6 1.885 × 10 7 9.735 × 10 5 6.675 × 10 6 1.103 × 10 7 2.948 × 10 7 2.855 × 10 5
Friedman Rank | 1.207 | 6.793 | 8.000 | 4.897 | 6.862 | 2.897 | 5.655 | 6.724 | 9.966 | 2.000
Final Rank | 1 | 7 | 9 | 4 | 8 | 3 | 5 | 6 | 10 | 2
Table 4. Wilcoxon rank sum test results for CEC2017 in dimension 10.
Fun | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1 3.02 × 10 11 3.02 × 10 11 1.33 × 10 10 3.02 × 10 11 4.42 × 10 06 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 7.38 × 10 10
F3 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 2.68 × 10 06 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F4 3.02 × 10 11 3.02 × 10 11 1.61 × 10 10 3.02 × 10 11 3.87 × 10 01 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.69 × 10 11
F5 3.01 × 10 11 3.01 × 10 11 3.01 × 10 11 3.01 × 10 11 4.60 × 10 10 3.32 × 10 11 3.01 × 10 11 3.01 × 10 11 1.46 × 10 10
F6 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.49 × 10 06
F7 5.49 × 10 11 4.98 × 10 11 2.15 × 10 10 6.70 × 10 11 4.74 × 10 06 5.46 × 10 09 4.08 × 10 11 3.02 × 10 11 7.20 × 10 05
F8 3.01 × 10 11 3.01 × 10 11 3.68 × 10 11 4.96 × 10 11 2.28 × 10 05 4.07 × 10 11 3.01 × 10 11 3.01 × 10 11 3.08 × 10 08
F9 3.33 × 10 11 3.02 × 10 11 1.09 × 10 10 3.02 × 10 11 2.37 × 10 10 2.61 × 10 10 3.02 × 10 11 3.02 × 10 11 7.39 × 10 01
F10 3.02 × 10 11 3.02 × 10 11 5.46 × 10 09 1.21 × 10 10 3.57 × 10 06 1.33 × 10 10 7.39 × 10 11 3.02 × 10 11 1.29 × 10 09
F11 3.02 × 10 11 3.02 × 10 11 1.96 × 10 10 3.69 × 10 11 8.20 × 10 07 4.08 × 10 11 3.34 × 10 11 3.02 × 10 11 2.84 × 10 04
F12 3.02 × 10 11 3.02 × 10 11 5.49 × 10 11 8.10 × 10 10 9.93 × 10 02 3.34 × 10 11 3.34 × 10 11 3.02 × 10 11 5.49 × 10 11
F13 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 8.15 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.35 × 10 08
F14 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.07 × 10 11
F15 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.50 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 2.20 × 10 07
F16 2.15 × 10 10 7.38 × 10 10 1.25 × 10 04 3.50 × 10 09 2.06 × 10 01 1.39 × 10 06 1.87 × 10 07 1.96 × 10 10 6.77 × 10 05
F17 3.34 × 10 11 3.02 × 10 11 3.69 × 10 11 3.02 × 10 11 1.61 × 10 10 3.34 × 10 11 3.02 × 10 11 3.02 × 10 11 8.48 × 10 09
F18 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 6.07 × 10 11 3.02 × 10 11 3.34 × 10 11 3.02 × 10 11 6.70 × 10 11
F19 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.34 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.62 × 10 10
F20 2.95 × 10 11 2.95 × 10 11 2.95 × 10 11 3.61 × 10 11 9.70 × 10 11 3.98 × 10 11 2.95 × 10 11 2.95 × 10 11 6.99 × 10 09
F21 8.84 × 10 07 8.48 × 10 09 8.20 × 10 07 5.19 × 10 07 1.99 × 10 02 1.01 × 10 08 1.73 × 10 07 9.76 × 10 10 1.24 × 10 03
F22 3.02 × 10 11 3.02 × 10 11 4.20 × 10 10 3.02 × 10 11 6.74 × 10 06 3.02 × 10 11 3.47 × 10 10 3.02 × 10 11 4.08 × 10 11
F23 3.02 × 10 11 3.02 × 10 11 2.83 × 10 08 3.02 × 10 11 5.37 × 10 02 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.11 × 10 01
F24 3.92 × 10 11 3.20 × 10 11 3.20 × 10 11 5.29 × 10 11 7.87 × 10 11 5.29 × 10 11 3.20 × 10 11 2.36 × 10 11 1.42 × 10 10
F25 3.02 × 10 11 3.02 × 10 11 2.03 × 10 07 3.02 × 10 11 3.33 × 10 01 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.44 × 10 02
F26 2.76 × 10 11 2.76 × 10 11 9.03 × 10 10 2.76 × 10 11 8.54 × 10 08 2.76 × 10 11 3.73 × 10 11 2.76 × 10 11 9.06 × 10 07
F27 3.02 × 10 11 3.02 × 10 11 3.47 × 10 10 3.02 × 10 11 2.53 × 10 04 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 7.74 × 10 06
F28 3.02 × 10 11 3.02 × 10 11 3.56 × 10 04 1.61 × 10 10 8.30 × 10 01 1.78 × 10 10 1.19 × 10 06 3.02 × 10 11 2.12 × 10 01
F29 3.02 × 10 11 3.02 × 10 11 2.23 × 10 09 2.61 × 10 10 1.39 × 10 06 3.82 × 10 10 1.86 × 10 09 3.69 × 10 11 2.77 × 10 01
F30 3.33 × 10 11 3.02 × 10 11 1.10 × 10 08 3.02 × 10 11 9.47 × 10 01 7.39 × 10 11 4.50 × 10 11 3.34 × 10 11 2.61 × 10 02
+/=/− | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0 | 22/7/0 | 29/0/0 | 29/0/0 | 29/0/0 | 23/4/2
Table 5. Results of CEC2017 in dimension 30.
Fun | Index | MSECOA | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1Mean 1.028 × 10 8 5.326 × 10 10 5.664 × 10 10 1.406 × 10 9 5.339 × 10 10 1.129 × 10 10 3.534 × 10 10 4.159 × 10 10 6.315 × 10 10 1.221 × 10 9
Std 3.275 × 10 8 5.763 × 10 9 7.837 × 10 9 6.061 × 10 8 1.112 × 10 10 1.655 × 10 10 7.570 × 10 9 5.719 × 10 9 7.372 × 10 9 3.678 × 10 8
F3Mean 5.322 × 10 4 8.419 × 10 4 8.376 × 10 4 2.598 × 10 5 8.272 × 10 4 2.647 × 10 4 6.992 × 10 4 8.506 × 10 4 6.564 × 10 5 1.106 × 10 5
Std 8.793 × 10 3 4.570 × 10 3 5.428 × 10 3 8.634 × 10 4 9.602 × 10 3 1.903 × 10 4 7.800 × 10 3 4.538 × 10 3 2.999 × 10 6 2.161 × 10 4
F4Mean 5.305 × 10 2 1.525 × 10 4 1.579 × 10 4 8.734 × 10 2 1.396 × 10 4 1.561 × 10 3 7.553 × 10 3 1.049 × 10 4 1.896 × 10 4 5.703 × 10 2
Std 3.465 × 10 1 2.210 × 10 3 2.824 × 10 3 1.268 × 10 2 4.590 × 10 3 3.152 × 10 3 1.656 × 10 3 2.086 × 10 3 3.009 × 10 3 2.952 × 10 1
F5Mean 7.024 × 10 2 9.326 × 10 2 9.197 × 10 2 8.311 × 10 2 8.718 × 10 2 7.371 × 10 2 8.136 × 10 2 8.993 × 10 2 9.757 × 10 2 7.271 × 10 2
Std 4.472 × 10 1 2.192 × 10 1 2.562 × 10 1 5.002 × 10 1 4.565 × 10 1 4.871 × 10 1 3.903 × 10 1 2.982 × 10 1 3.262 × 10 1 1.346 × 10 1
F6Mean 6.166 × 10 2 6.882 × 10 2 6.903 × 10 2 6.785 × 10 2 6.771 × 10 2 6.606 × 10 2 6.661 × 10 2 6.847 × 10 2 7.049 × 10 2 6.209 × 10 2
Std 7.433 × 10 0 7.063 × 10 0 6.806 × 10 0 1.243 × 10 1 8.986 × 10 0 9.221 × 10 0 8.131 × 10 0 9.429 × 10 0 9.173 × 10 0 4.280 × 10 0
F7Mean 9.985 × 10 2 1.385 × 10 3 1.429 × 10 3 1.281 × 10 3 1.364 × 10 3 1.228 × 10 3 1.244 × 10 3 1.389 × 10 3 1.499 × 10 3 1.028 × 10 3
Std 6.922 × 10 1 3.950 × 10 1 4.740 × 10 1 7.172 × 10 1 5.629 × 10 1 6.611 × 10 1 7.358 × 10 1 3.248 × 10 1 6.541 × 10 1 2.130 × 10 1
F8Mean 9.507 × 10 2 1.144 × 10 3 1.146 × 10 3 1.048 × 10 3 1.119 × 10 3 9.852 × 10 2 1.068 × 10 3 1.111 × 10 3 1.195 × 10 3 1.022 × 10 3
Std 2.283 × 10 1 1.595 × 10 1 2.909 × 10 1 4.816 × 10 1 3.481 × 10 1 5.531 × 10 1 3.254 × 10 1 2.540 × 10 1 3.551 × 10 1 1.651 × 10 1
F9Mean 3.073 × 10 3 1.103 × 10 4 1.102 × 10 4 1.090 × 10 4 7.250 × 10 3 5.447 × 10 3 7.080 × 10 3 9.209 × 10 3 1.518 × 10 4 3.429 × 10 3
Std 1.132 × 10 3 1.277 × 10 3 1.721 × 10 3 4.350 × 10 3 1.121 × 10 3 1.339 × 10 3 1.172 × 10 3 1.099 × 10 3 2.141 × 10 3 6.441 × 10 2
F10Mean 4.935 × 10 3 8.960 × 10 3 8.955 × 10 3 7.225 × 10 3 7.433 × 10 3 5.137 × 10 3 7.480 × 10 3 8.466 × 10 3 9.786 × 10 3 8.152 × 10 3
Std 7.199 × 10 2 3.565 × 10 2 3.334 × 10 2 8.959 × 10 2 6.233 × 10 2 7.096 × 10 2 5.834 × 10 2 5.703 × 10 2 7.368 × 10 2 3.755 × 10 2
F11Mean 1.272 × 10 3 8.813 × 10 3 8.378 × 10 3 7.656 × 10 3 9.637 × 10 3 1.722 × 10 3 6.525 × 10 3 9.266 × 10 3 2.068 × 10 4 1.860 × 10 3
Std 6.087 × 10 1 1.709 × 10 3 2.021 × 10 3 3.316 × 10 3 3.200 × 10 3 1.189 × 10 3 1.967 × 10 3 1.803 × 10 3 1.305 × 10 4 1.416 × 10 2
F12Mean 1.661 × 10 6 1.165 × 10 10 1.260 × 10 10 2.998 × 10 8 1.430 × 10 10 5.199 × 10 8 5.896 × 10 9 8.991 × 10 9 1.886 × 10 10 8.299 × 10 7
Std 1.205 × 10 6 2.786 × 10 9 3.398 × 10 9 2.415 × 10 8 3.355 × 10 9 1.805 × 10 9 1.660 × 10 9 3.376 × 10 9 4.806 × 10 9 2.379 × 10 7
F13Mean 9.409 × 10 3 6.922 × 10 9 9.887 × 10 9 2.512 × 10 6 1.174 × 10 10 1.756 × 10 5 2.913 × 10 9 3.518 × 10 9 1.788 × 10 10 1.188 × 10 7
Std 5.455 × 10 3 4.074 × 10 9 5.511 × 10 9 3.546 × 10 6 4.683 × 10 9 1.039 × 10 5 1.869 × 10 9 4.045 × 10 9 7.519 × 10 9 6.959 × 10 6
F14Mean 2.348 × 10 4 4.742 × 10 6 3.987 × 10 6 2.171 × 10 6 2.358 × 10 6 5.706 × 10 4 2.115 × 10 6 3.603 × 10 6 4.424 × 10 7 9.505 × 10 4
Std 6.089 × 10 4 3.690 × 10 6 4.156 × 10 6 2.705 × 10 6 2.234 × 10 6 1.856 × 10 5 2.241 × 10 6 2.943 × 10 6 4.890 × 10 7 5.265 × 10 4
F15Mean 3.694 × 10 3 6.582 × 10 8 7.717 × 10 8 1.170 × 10 6 2.945 × 10 6 1.344 × 10 5 9.781 × 10 7 2.291 × 10 8 3.199 × 10 9 9.038 × 10 5
Std 3.223 × 10 3 3.295 × 10 8 5.643 × 10 8 1.585 × 10 6 7.510 × 10 6 4.670 × 10 5 9.557 × 10 7 2.613 × 10 8 1.165 × 10 9 5.529 × 10 5
F16Mean 2.558 × 10 3 5.514 × 10 3 6.240 × 10 3 4.324 × 10 3 5.229 × 10 3 3.136 × 10 3 4.625 × 10 3 4.969 × 10 3 7.573 × 10 3 3.173 × 10 3
Std 3.115 × 10 2 1.024 × 10 3 1.148 × 10 3 6.969 × 10 2 1.305 × 10 3 5.254 × 10 2 7.132 × 10 2 8.859 × 10 2 2.587 × 10 3 1.943 × 10 2
F17Mean 2.107 × 10 3 3.461 × 10 3 4.465 × 10 3 2.654 × 10 3 4.110 × 10 3 2.347 × 10 3 3.001 × 10 3 3.694 × 10 3 1.710 × 10 4 2.260 × 10 3
Std 1.991 × 10 2 4.300 × 10 2 2.139 × 10 3 3.061 × 10 2 1.187 × 10 3 2.953 × 10 2 4.827 × 10 2 2.163 × 10 3 2.034 × 10 4 1.329 × 10 2
F18Mean 3.085 × 10 5 4.029 × 10 7 5.965 × 10 7 7.200 × 10 6 2.259 × 10 7 1.644 × 10 5 1.650 × 10 7 5.686 × 10 7 3.522 × 10 8 3.256 × 10 6
Std 4.078 × 10 5 3.564 × 10 7 5.202 × 10 7 7.166 × 10 6 1.549 × 10 7 2.173 × 10 5 1.648 × 10 7 6.628 × 10 7 3.358 × 10 8 1.669 × 10 6
F19Mean 7.628 × 10 3 7.142 × 10 8 5.809 × 10 8 1.015 × 10 7 1.722 × 10 7 1.067 × 10 6 5.514 × 10 7 3.404 × 10 8 2.741 × 10 9 1.301 × 10 6
Std 9.997 × 10 3 3.955 × 10 8 4.234 × 10 8 1.020 × 10 7 6.744 × 10 7 3.719 × 10 6 8.796 × 10 7 2.742 × 10 8 9.831 × 10 8 7.965 × 10 5
F20Mean 2.358 × 10 3 3.098 × 10 3 3.052 × 10 3 2.867 × 10 3 2.784 × 10 3 2.516 × 10 3 2.765 × 10 3 2.979 × 10 3 3.495 × 10 3 2.510 × 10 3
Std 1.261 × 10 2 1.590 × 10 2 2.301 × 10 2 1.962 × 10 2 1.585 × 10 2 1.620 × 10 2 1.707 × 10 2 2.066 × 10 2 2.274 × 10 2 1.162 × 10 2
F21Mean 2.441 × 10 3 2.743 × 10 3 2.745 × 10 3 2.632 × 10 3 2.679 × 10 3 2.543 × 10 3 2.608 × 10 3 2.743 × 10 3 2.836 × 10 3 2.517 × 10 3
Std 3.085 × 10 1 4.065 × 10 1 5.397 × 10 1 7.172 × 10 1 5.629 × 10 1 6.622 × 10 1 3.813 × 10 1 5.550 × 10 1 7.803 × 10 1 1.507 × 10 1
F22Mean 2.452 × 10 3 9.442 × 10 3 9.739 × 10 3 8.224 × 10 3 9.065 × 10 3 6.187 × 10 3 7.609 × 10 3 9.513 × 10 3 1.071 × 10 4 3.597 × 10 3
Std 7.149 × 10 2 6.350 × 10 2 6.191 × 10 2 7.062 × 10 2 6.224 × 10 2 1.770 × 10 3 1.309 × 10 3 6.843 × 10 2 7.697 × 10 2 3.047 × 10 2
F23Mean 2.785 × 10 3 3.446 × 10 3 3.631 × 10 3 3.130 × 10 3 3.443 × 10 3 3.124 × 10 3 3.444 × 10 3 3.414 × 10 3 3.737 × 10 3 2.865 × 10 3
Std 3.909 × 10 1 1.536 × 10 2 1.607 × 10 2 1.035 × 10 2 1.426 × 10 2 9.757 × 10 1 1.086 × 10 2 1.650 × 10 2 2.693 × 10 2 1.598 × 10 1
F24Mean 3.007 × 10 3 3.668 × 10 3 3.821 × 10 3 3.231 × 10 3 3.859 × 10 3 3.296 × 10 3 3.806 × 10 3 3.684 × 10 3 4.037 × 10 3 3.023 × 10 3
Std 5.952 × 10 1 1.356 × 10 2 1.768 × 10 2 9.318 × 10 1 2.103 × 10 2 1.295 × 10 2 1.412 × 10 2 2.137 × 10 2 3.031 × 10 2 2.099 × 10 1
F25Mean 2.936 × 10 3 4.919 × 10 3 5.115 × 10 3 3.115 × 10 3 5.656 × 10 3 3.288 × 10 3 3.845 × 10 3 4.388 × 10 3 6.321 × 10 3 2.980 × 10 3
Std 2.883 × 10 1 3.494 × 10 2 4.132 × 10 2 5.708 × 10 1 8.271 × 10 2 6.449 × 10 2 2.339 × 10 2 3.940 × 10 2 4.048 × 10 2 1.936 × 10 1
F26Mean 4.767 × 10 3 1.110 × 10 4 1.196 × 10 4 8.384 × 10 3 1.037 × 10 4 7.517 × 10 3 9.460 × 10 3 1.082 × 10 4 1.283 × 10 4 5.846 × 10 3
Std 1.621 × 10 3 7.347 × 10 2 9.633 × 10 2 9.450 × 10 2 9.525 × 10 2 1.740 × 10 3 7.934 × 10 2 8.885 × 10 2 1.258 × 10 3 1.775 × 10 2
F27Mean 3.307 × 10 3 4.345 × 10 3 4.689 × 10 3 3.473 × 10 3 4.543 × 10 3 3.408 × 10 3 4.340 × 10 3 4.237 × 10 3 4.873 × 10 3 3.227 × 10 3
Std 3.850 × 10 1 3.063 × 10 2 3.778 × 10 2 1.309 × 10 2 2.853 × 10 2 1.205 × 10 2 3.365 × 10 2 4.410 × 10 2 6.351 × 10 2 4.235 × 10 0
F28Mean 3.305 × 10 3 6.938 × 10 3 7.650 × 10 3 3.557 × 10 3 7.131 × 10 3 3.551 × 10 3 5.866 × 10 3 6.370 × 10 3 8.680 × 10 3 3.426 × 10 3
Std 3.556 × 10 1 6.543 × 10 2 7.200 × 10 2 9.222 × 10 1 9.329 × 10 2 5.798 × 10 2 5.881 × 10 2 5.108 × 10 2 4.614 × 10 2 5.654 × 10 1
F29Mean 3.906 × 10 3 7.719 × 10 3 8.635 × 10 3 5.308 × 10 3 7.236 × 10 3 4.668 × 10 3 6.254 × 10 3 6.799 × 10 3 1.337 × 10 4 4.246 × 10 3
Std 2.728 × 10 2 1.103 × 10 3 1.851 × 10 3 5.253 × 10 2 1.998 × 10 3 3.951 × 10 2 1.035 × 10 3 1.124 × 10 3 7.988 × 10 3 1.953 × 10 2
F30Mean 4.702 × 10 4 1.169 × 10 9 1.790 × 10 9 3.906 × 10 7 1.668 × 10 9 3.100 × 10 6 6.171 × 10 8 5.951 × 10 8 2.082 × 10 9 9.276 × 10 5
Std 5.467 × 10 4 5.227 × 10 8 1.002 × 10 9 2.719 × 10 7 9.835 × 10 8 8.717 × 10 6 4.662 × 10 8 3.589 × 10 8 2.015 × 10 9 5.297 × 10 5
Friedman Rank | 1.103 | 7.552 | 8.414 | 4.379 | 6.690 | 2.759 | 4.897 | 6.586 | 10.000 | 2.621
Final Rank | 1 | 8 | 9 | 4 | 7 | 3 | 5 | 6 | 10 | 2
Table 6. Wilcoxon rank sum test results for CEC2017 in dimension 30.
Fun | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1 3.02 × 10 11 3.02 × 10 11 4.20 × 10 10 3.02 × 10 11 2.61 × 10 10 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 5.57 × 10 10
F3 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.46 × 10 10 8.84 × 10 7 1.70 × 10 8 3.02 × 10 11 3.02 × 10 11 3.34 × 10 11
F4 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.61 × 10 6 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 6.77 × 10 5
F5 3.02 × 10 11 3.02 × 10 11 1.21 × 10 10 3.02 × 10 11 6.38 × 10 3 2.87 × 10 10 3.02 × 10 11 3.02 × 10 11 1.76 × 10 2
F6 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.33 × 10 2
F7 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.69 × 10 11 4.08 × 10 11 3.02 × 10 11 3.02 × 10 11 4.51 × 10 2
F8 3.02 × 10 11 3.02 × 10 11 9.92 × 10 11 3.02 × 10 11 1.17 × 10 3 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.98 × 10 11
F9 3.02 × 10 11 3.02 × 10 11 4.50 × 10 11 9.92 × 10 11 1.43 × 10 8 2.15 × 10 10 3.02 × 10 11 3.02 × 10 11 2.81 × 10 2
F10 3.02 × 10 11 3.02 × 10 11 5.49 × 10 11 3.02 × 10 11 4.38 × 10 0 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.16 × 10 5 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F12 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 2.03 × 10 9 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F13 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F14 3.69 × 10 11 3.69 × 10 11 4.62 × 10 10 8.15 × 10 11 7.62 × 10 0 3.34 × 10 11 3.02 × 10 11 3.02 × 10 11 1.20 × 10 8
F15 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.08 × 10 11 3.69 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F16 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 6.53 × 10 7 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 2.23 × 10 9
F17 3.02 × 10 11 3.02 × 10 11 2.60 × 10 8 4.50 × 10 11 1.17 × 10 3 1.61 × 10 10 8.15 × 10 11 3.02 × 10 11 5.87 × 10 4
F18 3.02 × 10 11 4.98 × 10 11 3.47 × 10 10 3.02 × 10 11 2.61 × 10 0 4.08 × 10 11 4.08 × 10 11 3.02 × 10 11 5.49 × 10 11
F19 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 6.12 × 10 10 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11
F20 3.02 × 10 11 1.09 × 10 10 6.07 × 10 11 4.62 × 10 10 5.26 × 10 4 1.96 × 10 10 4.08 × 10 11 3.02 × 10 11 2.00 × 10 5
F21 3.02 × 10 11 3.02 × 10 11 3.69 × 10 11 3.02 × 10 11 1.86 × 10 9 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.98 × 10 11
F22 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 8.99 × 10 11 4.50 × 10 11 3.02 × 10 11 3.02 × 10 11 5.57 × 10 10
F23 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.61 × 10 10
F24 3.02 × 10 11 3.02 × 10 11 1.61 × 10 10 3.02 × 10 11 8.15 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 7.98 × 10 2
F25 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.18 × 10 9 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.36 × 10 7
F26 3.02 × 10 11 3.02 × 10 11 1.09 × 10 10 3.02 × 10 11 3.52 × 10 7 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 5.75 × 10 2
F27 3.02 × 10 11 3.02 × 10 11 2.03 × 10 9 3.02 × 10 11 2.84 × 10 4 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.34 × 10 11
F28 3.02 × 10 11 3.02 × 10 11 4.50 × 10 11 3.02 × 10 11 4.74 × 10 6 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 2.37 × 10 10
F29 3.02 × 10 11 3.02 × 10 11 8.15 × 10 11 3.02 × 10 11 2.23 × 10 9 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 1.34 × 10 5
F30 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 3.34 × 10 11 3.02 × 10 11 3.02 × 10 11 3.02 × 10 11 4.08 × 10 11
+/=/−29/0/029/0/029/0/029/0/025/2/229/0/029/0/029/0/026/2/1
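The p-values above (and in Table 9 below) come from pairwise Wilcoxon rank-sum tests on the 30 independent run results of MSECOA against each competitor, judged at the 0.05 significance level; the closing +/=/− row counts wins, ties, and losses. A minimal sketch of how one such entry and its verdict can be computed (the run results below are hypothetical stand-ins, not the paper's data):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Hypothetical final errors of 30 independent runs for two algorithms
msecoa_runs = rng.normal(loc=100.0, scale=1.0, size=30)
rival_runs = rng.normal(loc=110.0, scale=5.0, size=30)

stat, p = ranksums(msecoa_runs, rival_runs)  # two-sided rank-sum test
if p >= 0.05:
    verdict = "="  # no significant difference at the 5% level
elif np.median(msecoa_runs) < np.median(rival_runs):
    verdict = "+"  # MSECOA significantly better (minimization)
else:
    verdict = "-"
print(f"p = {p:.2e}, verdict = {verdict}")
```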
Table 7. IEEE CEC2022 benchmark functions.
Type | Function No. | Function Description | Range | Optimum
Unimodal | F1 | Shifted and full Rotated Zakharov Function | [−100, 100] | 300
Basic | F2 | Shifted and full Rotated Rosenbrock’s Function | [−100, 100] | 400
Basic | F3 | Shifted and full Rotated Expanded Schaffer’s f6 Function | [−100, 100] | 600
Basic | F4 | Shifted and full Rotated Non-Continuous Rastrigin’s Function | [−100, 100] | 800
Basic | F5 | Shifted and full Rotated Levy Function | [−100, 100] | 900
Hybrid | F6 | Hybrid Function 1 (N = 3) | [−100, 100] | 1800
Hybrid | F7 | Hybrid Function 2 (N = 6) | [−100, 100] | 2000
Hybrid | F8 | Hybrid Function 3 (N = 5) | [−100, 100] | 2200
Composition | F9 | Composition Function 1 (N = 5) | [−100, 100] | 2300
Composition | F10 | Composition Function 2 (N = 4) | [−100, 100] | 2400
Composition | F11 | Composition Function 3 (N = 5) | [−100, 100] | 2600
Composition | F12 | Composition Function 4 (N = 6) | [−100, 100] | 2700
Table 8. Results of CEC2022 in dimension 20.
Fun | Index | MSECOA | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1 | Mean | 1.402 × 10^4 | 4.292 × 10^4 | 4.774 × 10^4 | 2.374 × 10^4 | 3.519 × 10^4 | 3.491 × 10^3 | 2.943 × 10^4 | 5.055 × 10^4 | 1.341 × 10^5 | 2.254 × 10^4
F1 | Std | 4.170 × 10^3 | 1.032 × 10^4 | 1.525 × 10^4 | 7.447 × 10^3 | 9.868 × 10^3 | 6.049 × 10^3 | 8.750 × 10^3 | 2.051 × 10^4 | 3.703 × 10^4 | 4.958 × 10^3
F2 | Mean | 4.704 × 10^2 | 2.585 × 10^3 | 2.967 × 10^3 | 5.756 × 10^2 | 2.133 × 10^3 | 5.908 × 10^2 | 1.426 × 10^3 | 1.893 × 10^3 | 4.089 × 10^3 | 4.581 × 10^2
F2 | Std | 2.903 × 10^1 | 4.120 × 10^2 | 6.782 × 10^2 | 5.738 × 10^1 | 6.894 × 10^2 | 4.511 × 10^2 | 3.485 × 10^2 | 5.731 × 10^2 | 1.071 × 10^3 | 3.789 × 10^0
F3 | Mean | 6.052 × 10^2 | 6.791 × 10^2 | 6.796 × 10^2 | 6.735 × 10^2 | 6.656 × 10^2 | 6.509 × 10^2 | 6.549 × 10^2 | 6.796 × 10^2 | 7.000 × 10^2 | 6.100 × 10^2
F3 | Std | 4.615 × 10^0 | 9.958 × 10^0 | 1.118 × 10^1 | 1.297 × 10^1 | 7.953 × 10^0 | 9.870 × 10^0 | 1.030 × 10^1 | 9.998 × 10^0 | 1.135 × 10^1 | 1.786 × 10^0
F4 | Mean | 8.690 × 10^2 | 9.784 × 10^2 | 9.760 × 10^2 | 9.267 × 10^2 | 9.533 × 10^2 | 8.821 × 10^2 | 9.184 × 10^2 | 9.540 × 10^2 | 1.005 × 10^3 | 9.178 × 10^2
F4 | Std | 1.142 × 10^1 | 1.052 × 10^1 | 1.745 × 10^1 | 3.776 × 10^1 | 1.684 × 10^1 | 3.032 × 10^1 | 1.789 × 10^1 | 1.829 × 10^1 | 2.039 × 10^1 | 1.559 × 10^1
F5 | Mean | 1.683 × 10^3 | 3.569 × 10^3 | 3.452 × 10^3 | 4.048 × 10^3 | 2.805 × 10^3 | 2.111 × 10^3 | 2.584 × 10^3 | 3.175 × 10^3 | 5.206 × 10^3 | 1.427 × 10^3
F5 | Std | 4.609 × 10^2 | 3.405 × 10^2 | 3.554 × 10^2 | 1.285 × 10^3 | 4.509 × 10^2 | 3.837 × 10^2 | 4.835 × 10^2 | 2.919 × 10^2 | 6.203 × 10^2 | 1.816 × 10^2
F6 | Mean | 1.155 × 10^4 | 2.357 × 10^9 | 2.543 × 10^9 | 1.323 × 10^6 | 7.190 × 10^8 | 4.579 × 10^7 | 5.043 × 10^8 | 9.572 × 10^8 | 3.792 × 10^9 | 8.300 × 10^6
F6 | Std | 3.205 × 10^4 | 7.755 × 10^8 | 1.262 × 10^9 | 2.311 × 10^6 | 7.046 × 10^8 | 2.498 × 10^8 | 5.384 × 10^8 | 7.142 × 10^8 | 1.881 × 10^9 | 5.867 × 10^6
F7 | Mean | 2.059 × 10^3 | 2.222 × 10^3 | 2.220 × 10^3 | 2.238 × 10^3 | 2.232 × 10^3 | 2.120 × 10^3 | 2.167 × 10^3 | 2.211 × 10^3 | 2.389 × 10^3 | 2.080 × 10^3
F7 | Std | 2.482 × 10^1 | 3.612 × 10^1 | 4.477 × 10^1 | 7.665 × 10^1 | 8.419 × 10^1 | 4.199 × 10^1 | 4.819 × 10^1 | 4.563 × 10^1 | 1.081 × 10^2 | 1.189 × 10^1
F8 | Mean | 2.224 × 10^3 | 2.426 × 10^3 | 2.445 × 10^3 | 2.284 × 10^3 | 2.585 × 10^3 | 2.267 × 10^3 | 2.307 × 10^3 | 2.357 × 10^3 | 2.682 × 10^3 | 2.239 × 10^3
F8 | Std | 2.591 × 10^0 | 1.411 × 10^2 | 1.768 × 10^2 | 5.336 × 10^1 | 2.292 × 10^2 | 5.155 × 10^1 | 1.006 × 10^2 | 1.177 × 10^2 | 3.593 × 10^2 | 4.837 × 10^0
F9 | Mean | 2.483 × 10^3 | 3.370 × 10^3 | 3.466 × 10^3 | 2.562 × 10^3 | 3.128 × 10^3 | 2.524 × 10^3 | 3.106 × 10^3 | 2.984 × 10^3 | 3.720 × 10^3 | 2.481 × 10^3
F9 | Std | 2.797 × 10^0 | 2.510 × 10^2 | 4.591 × 10^2 | 5.032 × 10^1 | 2.716 × 10^2 | 8.769 × 10^1 | 2.643 × 10^2 | 2.319 × 10^2 | 4.828 × 10^2 | 3.003 × 10^1
F10 | Mean | 2.557 × 10^3 | 6.406 × 10^3 | 6.867 × 10^3 | 4.624 × 10^3 | 5.569 × 10^3 | 4.134 × 10^3 | 4.887 × 10^3 | 5.906 × 10^3 | 6.784 × 10^3 | 2.513 × 10^3
F10 | Std | 1.987 × 10^2 | 8.191 × 10^2 | 5.481 × 10^2 | 1.297 × 10^3 | 1.072 × 10^3 | 9.890 × 10^2 | 1.386 × 10^3 | 1.256 × 10^3 | 1.494 × 10^3 | 5.742 × 10^0
F11 | Mean | 2.926 × 10^3 | 7.819 × 10^3 | 8.785 × 10^3 | 3.512 × 10^3 | 8.641 × 10^3 | 4.159 × 10^3 | 7.084 × 10^3 | 7.964 × 10^3 | 9.407 × 10^3 | 4.983 × 10^3
F11 | Std | 7.158 × 10^1 | 4.618 × 10^2 | 5.661 × 10^2 | 2.950 × 10^2 | 2.040 × 10^3 | 1.893 × 10^3 | 1.046 × 10^3 | 8.226 × 10^2 | 6.669 × 10^2 | 4.794 × 10^2
F12 | Mean | 2.988 × 10^3 | 3.473 × 10^3 | 3.628 × 10^3 | 3.081 × 10^3 | 3.855 × 10^3 | 3.054 × 10^3 | 3.702 × 10^3 | 3.511 × 10^3 | 3.951 × 10^3 | 2.945 × 10^3
F12 | Std | 2.870 × 10^1 | 1.578 × 10^2 | 2.595 × 10^2 | 1.287 × 10^2 | 2.568 × 10^2 | 7.158 × 10^1 | 1.459 × 10^2 | 2.696 × 10^2 | 3.468 × 10^2 | 2.368 × 10^0
Friedman Rank | 1.500 | 7.333 | 8.167 | 4.667 | 6.833 | 2.917 | 5.000 | 6.667 | 9.917 | 2.000
Final Rank | 1 | 8 | 9 | 4 | 7 | 3 | 5 | 6 | 10 | 2
Table 9. Wilcoxon rank sum test results for CEC2022 in dimension 20.
Fun | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin
F1 | 4.08 × 10^−11 | 3.02 × 10^−11 | 2.57 × 10^−7 | 2.87 × 10^−10 | 2.60 × 10^−8 | 1.29 × 10^−9 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.16 × 10^−7
F2 | 3.02 × 10^−11 | 3.02 × 10^−11 | 8.10 × 10^−10 | 3.02 × 10^−11 | 9.07 × 10^−3 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 5.40 × 10^−1
F3 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 7.22 × 10^−6
F4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.62 × 10^−10 | 3.02 × 10^−11 | 1.62 × 10^−1 | 7.39 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.98 × 10^−11
F5 | 3.02 × 10^−11 | 3.02 × 10^−11 | 8.99 × 10^−11 | 5.07 × 10^−10 | 9.03 × 10^−4 | 7.09 × 10^−8 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.64 × 10^−2
F6 | 3.02 × 10^−11 | 3.02 × 10^−11 | 8.99 × 10^−11 | 3.02 × 10^−11 | 2.05 × 10^−3 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F7 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.34 × 10^−11 | 7.09 × 10^−8 | 6.70 × 10^−11 | 3.34 × 10^−11 | 3.02 × 10^−11 | 1.17 × 10^−4
F8 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 2.87 × 10^−10 | 3.69 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F9 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 1.49 × 10^−4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 5.11 × 10^−1
F10 | 3.02 × 10^−11 | 3.02 × 10^−11 | 2.02 × 10^−8 | 4.50 × 10^−11 | 1.69 × 10^−9 | 4.20 × 10^−10 | 6.07 × 10^−11 | 5.49 × 10^−11 | 9.51 × 10^−6
F11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.08 × 10^−11 | 3.02 × 10^−11 | 7.12 × 10^−9 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
F12 | 3.02 × 10^−11 | 3.02 × 10^−11 | 8.88 × 10^−6 | 3.02 × 10^−11 | 1.75 × 10^−5 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 4.08 × 10^−11
+/=/− | 12/0/0 | 12/0/0 | 12/0/0 | 12/0/0 | 10/1/1 | 12/0/0 | 12/0/0 | 12/0/0 | 7/2/3
Table 10. Friedman test results for all algorithms on different test sets.
Index | MSECOA | ICOA | COA | WOA | AOA | BKA | HOA | NCHHO | ISCSO | LSHADE_cnEpSin | Function Set/Dimension
Friedman Rank | 1.207 | 6.793 | 8.000 | 4.897 | 6.862 | 2.897 | 5.655 | 6.724 | 9.966 | 2.000 | CEC2017-10
Final Rank | 1 | 7 | 9 | 4 | 8 | 3 | 5 | 6 | 10 | 2 | CEC2017-10
Friedman Rank | 1.103 | 7.552 | 8.414 | 4.379 | 6.690 | 2.759 | 4.897 | 6.586 | 10.000 | 2.621 | CEC2017-30
Final Rank | 1 | 8 | 9 | 4 | 7 | 3 | 5 | 6 | 10 | 2 | CEC2017-30
Friedman Rank | 1.500 | 7.333 | 8.167 | 4.667 | 6.833 | 2.917 | 5.000 | 6.667 | 9.917 | 2.000 | CEC2022-20
Final Rank | 1 | 8 | 9 | 4 | 7 | 3 | 5 | 6 | 10 | 2 | CEC2022-20
Sum Rank | 3 | 23 | 27 | 12 | 22 | 9 | 15 | 18 | 30 | 6
Total Rank | 1 | 8 | 9 | 4 | 7 | 3 | 5 | 6 | 10 | 2
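The Friedman ranks in Table 10 are mean ranks: on every benchmark function the ten algorithms are ranked by their mean result (1 = best, ties averaged), the per-function ranks are averaged over the suite, and the Final/Total rank orders those averages. A minimal sketch on a hypothetical 3-function × 4-algorithm matrix of mean errors:

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical mean errors: rows = benchmark functions, columns = algorithms
mean_errors = np.array([
    [1.2, 8.5, 3.3, 1.2],
    [0.4, 9.1, 2.8, 0.7],
    [5.0, 7.7, 6.1, 4.9],
])
per_function_ranks = np.apply_along_axis(rankdata, 1, mean_errors)
friedman_rank = per_function_ranks.mean(axis=0)  # cf. Table 10's Friedman rows
final_rank = rankdata(friedman_rank)             # ordering of the averages
print(friedman_rank, final_rank)
```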
Table 11. Results of the three-bar truss design problem.
Algorithm | x1 | x2 | Best | Mean | Std | Rank
MSECOA | 0.7887 | 0.4082 | 263.8958 | 263.8958 | 2.98556 × 10^−14 | 1
ICOA | 0.7898 | 0.4051 | 263.8988 | 263.9568 | 0.0563 | 5
COA | 0.7879 | 0.4105 | 263.8971 | 263.9549 | 0.0529 | 4
WOA | 0.7890 | 0.4072 | 263.8959 | 264.8607 | 1.4424 | 8
AOA | 0.7946 | 0.3924 | 263.9812 | 265.3566 | 3.3525 | 9
BKA | 0.7887 | 0.4082 | 263.8958 | 263.8963 | 0.0007 | 3
HOA | 0.7891 | 0.4071 | 263.8960 | 263.9713 | 0.0879 | 6
NCHHO | 0.7891 | 0.4070 | 263.8960 | 264.4827 | 0.8804 | 7
ISCSO | 0.7965 | 0.3867 | 263.9454 | 266.8552 | 2.8259 | 10
LSHADE_cnEpSin | 0.7887 | 0.4082 | 263.8958 | 263.8958 | 1.46046 × 10^−5 | 2
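In Table 11, x1 and x2 are the cross-sectional areas of the classic three-bar truss, whose weight is minimized under three stress constraints. A minimal sketch of the standard formulation (assuming the usual constants L = 100 cm and P = σ = 2 kN/cm²), checked against the MSECOA row:

```python
import math

def three_bar_truss(x1, x2, L=100.0, P=2.0, sigma=2.0):
    """Weight and constraint values of the three-bar truss design problem."""
    weight = (2.0 * math.sqrt(2.0) * x1 + x2) * L
    denom = math.sqrt(2.0) * x1**2 + 2.0 * x1 * x2
    g1 = (math.sqrt(2.0) * x1 + x2) / denom * P - sigma  # stress in bar 1
    g2 = x2 / denom * P - sigma                           # stress in bar 2
    g3 = 1.0 / (math.sqrt(2.0) * x2 + x1) * P - sigma     # stress in bar 3
    return weight, (g1, g2, g3)  # feasible when all g <= 0

w, g = three_bar_truss(0.7887, 0.4082)  # MSECOA row of Table 11
print(round(w, 4))  # ≈ 263.90 (Best column: 263.8958; Table 11 rounds x1, x2)
```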
Table 12. Results of the welded beam structural design problem.
Algorithm | h | l | t | b | Best | Mean | Std | Rank
MSECOA | 0.2057 | 3.2349 | 9.0366 | 0.2057 | 1.6928 | 1.6928 | 0.0001 | 1
ICOA | 0.1862 | 3.9999 | 8.9225 | 0.2300 | 1.9303 | 2.7179 | 0.4480 | 8
COA | 0.2104 | 3.7201 | 8.6192 | 0.2262 | 1.8437 | 2.3816 | 0.2422 | 6
WOA | 0.1913 | 3.6258 | 9.0340 | 0.2058 | 1.7236 | 2.4537 | 0.7443 | 7
AOA | 0.1965 | 3.0970 | 10.0000 | 0.2047 | 1.8156 | 2.2361 | 0.2753 | 4
BKA | 0.2057 | 3.2353 | 9.0358 | 0.2058 | 1.6930 | 1.8931 | 0.5129 | 3
HOA | 0.2934 | 2.6388 | 7.3351 | 0.3235 | 2.1507 | 3.1038 | 0.4459 | 9
NCHHO | 0.2013 | 3.3533 | 9.0795 | 0.2055 | 1.7080 | 2.3449 | 0.7374 | 5
ISCSO | 0.1979 | 4.2002 | 9.6899 | 0.2044 | 1.9157 | 9.1641 × 10^99 | 3.1053 × 10^100 | 10
LSHADE_cnEpSin | 0.2057 | 3.2361 | 9.0365 | 0.2057 | 1.6930 | 1.7112 | 0.0302 | 2
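For Table 12, h, l, t, and b are the weld thickness, weld length, bar height, and bar thickness of the welded beam, whose fabrication cost is minimized subject to shear-stress, bending-stress, buckling, and deflection constraints. A minimal sketch of the common textbook formulation (assuming P = 6000 lb, L = 14 in, E = 3 × 10^7 psi, G = 1.2 × 10^7 psi; the paper's exact constraint variant may differ slightly):

```python
import math

P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam(h, l, t, b):
    """Cost and main constraint values of the welded beam design problem."""
    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
    tau_p = P / (math.sqrt(2.0) * h * l)                   # primary shear
    M = P * (L + l / 2.0)
    R = math.sqrt(l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    J = 2.0 * math.sqrt(2.0) * h * l * (l**2 / 12.0 + ((h + t) / 2.0) ** 2)
    tau_pp = M * R / J                                     # torsional shear
    tau = math.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * l / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)                       # bending stress
    delta = 4.0 * P * L**3 / (E * t**3 * b)                # end deflection
    p_c = (4.013 * E * math.sqrt(t**2 * b**6 / 36.0) / L**2) \
          * (1.0 - t / (2.0 * L) * math.sqrt(E / (4.0 * G)))  # buckling load
    g = (tau - TAU_MAX, sigma - SIGMA_MAX, h - b, delta - DELTA_MAX, P - p_c)
    return cost, g  # feasible when all g <= 0

cost, g = welded_beam(0.2057, 3.2349, 9.0366, 0.2057)  # MSECOA row of Table 12
print(round(cost, 4))  # ≈ 1.6925 (Best column: 1.6928; variables are rounded)
```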
Table 13. Results of the tension/compression spring design problem.
Algorithm | d | D | N | Best | Mean | Std | Rank
MSECOA | 0.0517 | 0.3567 | 11.2892 | 0.012665 | 0.012665 | 7.8035 × 10^−8 | 1
ICOA | 0.0500 | 0.3161 | 14.3177 | 0.012894 | 1.6579 × 10^99 | 4.9779 × 10^99 | 9
COA | 0.0516 | 0.3544 | 11.4419 | 0.012683 | 7.6036 × 10^98 | 4.1647 × 10^99 | 7
WOA | 0.0513 | 0.3470 | 11.8798 | 0.012668 | 0.013502 | 6.9757 × 10^−4 | 3
AOA | 0.0500 | 0.3104 | 15.0000 | 0.013193 | 0.013846 | 3.1890 × 10^−3 | 4
BKA | 0.0518 | 0.3595 | 11.1294 | 0.012665 | 7.6036 × 10^98 | 4.1647 × 10^99 | 7
HOA | 0.0516 | 0.3549 | 11.3974 | 0.012668 | 0.014367 | 1.2166 × 10^−3 | 6
NCHHO | 0.0539 | 0.4126 | 8.6363 | 0.012757 | 0.014119 | 1.1151 × 10^−3 | 5
ISCSO | 0.0661 | 0.8050 | 4.2486 | 0.021963 | 6.0657 × 10^99 | 1.4491 × 10^100 | 10
LSHADE_cnEpSin | 0.0513 | 0.3482 | 11.8090 | 0.012668 | 0.012699 | 3.2278 × 10^−5 | 2
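In Table 13, d, D, and N are the wire diameter, mean coil diameter, and number of active coils of the tension/compression spring; the objective (N + 2)Dd² is the wire volume. A minimal sketch of the standard formulation, checked against the MSECOA row (digits differ slightly only because Table 13 rounds the variables):

```python
def spring(d, D, N):
    """Volume and constraints of the tension/compression spring problem."""
    volume = (N + 2.0) * D * d**2
    g1 = 1.0 - D**3 * N / (71785.0 * d**4)                      # deflection
    g2 = (4.0 * D**2 - d * D) / (12566.0 * (D * d**3 - d**4)) \
         + 1.0 / (5108.0 * d**2) - 1.0                          # shear stress
    g3 = 1.0 - 140.45 * d / (D**2 * N)                          # surge frequency
    g4 = (D + d) / 1.5 - 1.0                                    # outer diameter
    return volume, (g1, g2, g3, g4)  # feasible when all g <= 0

v, g = spring(0.0517, 0.3567, 11.2892)  # MSECOA row of Table 13
print(round(v, 6))  # ≈ 0.012671 (Best column: 0.012665)
```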
Table 14. Results of the pressure vessel design problem.
Algorithm | Ts | Th | R | L | Best | Mean | Std | Rank
MSECOA | 0.7782 | 0.3846 | 40.3196 | 200.0000 | 5885.3328 | 5885.4843 | 0.6666 | 1
ICOA | 1.2409 | 2.6352 | 61.6431 | 76.7312 | 23,715.1692 | 87,460.1428 | 45,702.0842 | 9
COA | 1.3691 | 1.1215 | 56.0524 | 56.5861 | 11,388.2108 | 73,119.0929 | 52,805.4449 | 8
WOA | 0.8175 | 0.4224 | 41.4994 | 184.2044 | 6123.4849 | 9566.4849 | 4663.8196 | 3
AOA | 1.1984 | 0.5752 | 54.1501 | 136.2680 | 10,665.3792 | 69,181.7606 | 77,481.3905 | 7
BKA | 0.7783 | 0.3847 | 40.3263 | 199.9070 | 5885.5533 | 13,798.6379 | 34,303.5253 | 5
HOA | 0.8813 | 0.6617 | 44.0702 | 158.7985 | 7193.0785 | 10,177.4336 | 4496.0008 | 4
NCHHO | 0.9757 | 0.4467 | 46.7089 | 126.8062 | 6594.3919 | 24,205.6310 | 27,288.7444 | 6
ISCSO | 2.7814 | 2.8772 | 70.5541 | 124.9528 | 54,618.2573 | 213,250.5202 | 101,961.8458 | 10
LSHADE_cnEpSin | 0.7796 | 0.3859 | 40.3938 | 198.9785 | 5889.7343 | 6057.2659 | 218.7573 | 2
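In Table 14, Ts, Th, R, and L are the shell thickness, head thickness, inner radius, and cylindrical length of the pressure vessel. A minimal sketch of the standard cost-and-constraints formulation, checked against the MSECOA row (again up to the rounding of the printed variables):

```python
import math

def pressure_vessel(ts, th, r, l):
    """Cost and constraints of the pressure vessel design problem."""
    cost = (0.6224 * ts * r * l + 1.7781 * th * r**2
            + 3.1661 * ts**2 * l + 19.84 * ts**2 * r)
    g1 = -ts + 0.0193 * r                                  # shell thickness
    g2 = -th + 0.00954 * r                                 # head thickness
    g3 = -math.pi * r**2 * l - 4.0 / 3.0 * math.pi * r**3 + 1296000.0  # volume
    g4 = l - 240.0                                         # length limit
    return cost, (g1, g2, g3, g4)  # feasible when all g <= 0

c, g = pressure_vessel(0.7782, 0.3846, 40.3196, 200.0)  # MSECOA row of Table 14
print(round(c, 4))  # ≈ 5885.4 (Best column: 5885.33)
```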
Table 15. Degree reduction error results for example 1.
Algorithm | Coordinates | Mean (ε) | Std (ε) | Min (ε)
COA | q0 = (0.1, 0), q1 = (0.0605, 0.1587), q2 = (0.2652, 0.3195), q3 = (0.6998, 0.1790), q4 = (0.5, 0) | 2.180 × 10^−3 | 2.022 × 10^−3 | 3.900 × 10^−5
MSECOA | q0 = (0.1, 0), q1 = (0.0774, 0.1742), q2 = (0.3006, 0.3142), q3 = (0.6769, 0.1738), q4 = (0.5, 0) | 3.671 × 10^−5 | 2.710 × 10^−6 | 3.351 × 10^−5
ICOA | q0 = (0.1, 0), q1 = (0.0937, 0.2500), q2 = (0.3804, 0.2500), q3 = (0.6000, 0.2500), q4 = (0.5, 0) | 3.149 × 10^−3 | 2.029 × 10^−3 | 6.557 × 10^−4
WOA | q0 = (0.1, 0), q1 = (0.0396, 0.1594), q2 = (0.2076, 0.3759), q3 = (0.7621, 0.1255), q4 = (0.5, 0) | 1.786 × 10^−3 | 1.977 × 10^−3 | 1.533 × 10^−4
HOA | q0 = (0.1, 0), q1 = (0.0869, 0.1825), q2 = (0.3115, 0.3159), q3 = (0.6773, 0.1744), q4 = (0.5, 0) | 5.121 × 10^−4 | 5.019 × 10^−4 | 5.443 × 10^−5
Table 16. Degree reduction error results for example 2.
Algorithm | Coordinates | Mean (ε) | Std (ε) | Min (ε)
COA | q0 = (0.871, 0.408), q1 = (0.6119, 0.8805), q2 = (0.2604, 0.0209), q3 = (0.3916, 1.1019), q4 = (0.871, 0.409) | 1.295 × 10^−2 | 4.820 × 10^−3 | 4.606 × 10^−3
MSECOA | q0 = (0.871, 0.408), q1 = (0.6290, 1.1316), q2 = (0.0017, 0.3108), q3 = (0.6280, 1.1331), q4 = (0.871, 0.409) | 5.704 × 10^−4 | 4.420 × 10^−6 | 5.628 × 10^−4
ICOA | q0 = (0.871, 0.408), q1 = (0.6959, 1.1275), q2 = (0.1221, 0.3068), q3 = (0.5288, 1.1379), q4 = (0.871, 0.409) | 4.988 × 10^−3 | 1.755 × 10^−3 | 7.190 × 10^−4
WOA | q0 = (0.871, 0.408), q1 = (0.4961, 1.0977), q2 = (0.2040, 0.3108), q3 = (0.6934, 1.1771), q4 = (0.871, 0.409) | 2.325 × 10^−3 | 1.096 × 10^−3 | 9.846 × 10^−4
HOA | q0 = (0.871, 0.408), q1 = (0.6451, 0.8733), q2 = (0.1713, 0.0296), q3 = (0.5055, 1.0026), q4 = (0.871, 0.409) | 7.412 × 10^−3 | 2.819 × 10^−3 | 3.153 × 10^−3
Table 17. Degree reduction error results for example 3.
Algorithm | Coordinates | Mean (ε) | Std (ε) | Min (ε)
COA | q0 = (0.5, 0), q1 = (0.7894, 0.7768), q2 = (0.0600, 1.1144), q3 = (2.6733, 0.9246), q4 = (1.7, 0) | 3.368 × 10^−2 | 2.899 × 10^−2 | 4.386 × 10^−3
MSECOA | q0 = (0.5, 0), q1 = (1.1377, 0.6125), q2 = (0.5999, 1.6111), q3 = (2.3374, 0.6119), q4 = (1.7, 0) | 1.059 × 10^−5 | 1.070 × 10^−5 | 6.720 × 10^−6
ICOA | q0 = (0.5, 0), q1 = (1.0757, 0.6892), q2 = (0.6904, 1.5395), q3 = (2.2891, 0.5899), q4 = (1.7, 0) | 4.699 × 10^−3 | 2.565 × 10^−3 | 1.075 × 10^−3
WOA | q0 = (0.5, 0), q1 = (1.0499, 0.6985), q2 = (0.5199, 1.5090), q3 = (2.3523, 0.6317), q4 = (1.7, 0) | 3.980 × 10^−3 | 2.659 × 10^−3 | 3.192 × 10^−4
HOA | q0 = (0.5, 0), q1 = (1.2617, 0.8366), q2 = (0.8275, 1.1947), q3 = (2.2081, 0.8827), q4 = (1.7, 0) | 5.643 × 10^−3 | 3.959 × 10^−3 | 1.577 × 10^−3
Table 18. Degree reduction error results for example 4.
Algorithm | Coordinates | Mean (ε) | Std (ε) | Min (ε)
COA | q0 = (0.3, 0), q1 = (1.5113, 1.1532), q2 = (2.6916, 1.1674), q3 = (1.5, 0) | 1.844 × 10^−2 | 1.867 × 10^−2 | 1.650 × 10^−3
MSECOA | q0 = (0.3, 0), q1 = (1.5695, 1.1602), q2 = (2.7704, 1.1545), q3 = (1.5, 0) | 1.353 × 10^−3 | 8.900 × 10^−7 | 1.350 × 10^−3
ICOA | q0 = (0.3, 0), q1 = (1.5965, 1.1989), q2 = (2.7993, 1.1578), q3 = (1.5, 0) | 2.567 × 10^−3 | 9.558 × 10^−4 | 2.518 × 10^−3
WOA | q0 = (0.3, 0), q1 = (1.5844, 1.1599), q2 = (2.7882, 1.1457), q3 = (1.5, 0) | 1.746 × 10^−3 | 3.518 × 10^−4 | 1.354 × 10^−3
HOA | q0 = (0.3, 0), q1 = (1.5632, 1.1817), q2 = (2.7636, 1.1395), q3 = (1.5, 0) | 6.389 × 10^−3 | 7.880 × 10^−3 | 1.382 × 10^−3
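Tables 15–18 summarize the reduction error ε over repeated optimization runs (mean, standard deviation, and best value). A minimal sketch of how a pointwise Euclidean-distance error between an original Said–Ball curve and a lower-degree approximation can be evaluated, using the standard Said–Ball basis definition; the degree-5 control points below are hypothetical, and the paper's full objective also incorporates curvature data, which is omitted here:

```python
import numpy as np
from math import comb

def said_ball_basis(i, n, t):
    """Said-Ball basis function s_i^n(t) (standard piecewise definition)."""
    m = n // 2
    if n % 2 == 0 and i == m:                    # middle term for even degree
        return comb(n, m) * t**m * (1.0 - t)**m
    if i < (n + 1) // 2:                         # lower half of the indices
        return comb(m + i, i) * t**i * (1.0 - t)**(m + 1)
    return said_ball_basis(n - i, n, 1.0 - t)    # upper half, by symmetry

def curve(ctrl, ts):
    """Evaluate a Said-Ball curve with control points ctrl at parameters ts."""
    n = len(ctrl) - 1
    B = np.array([[said_ball_basis(i, n, t) for i in range(n + 1)] for t in ts])
    return B @ np.asarray(ctrl)

ts = np.linspace(0.0, 1.0, 200)
p_hi = curve(np.array([[0.1, 0.0], [0.2, 0.3], [0.4, 0.5],
                       [0.6, 0.5], [0.8, 0.2], [0.5, 0.0]]), ts)  # degree 5
p_lo = curve(np.array([[0.1, 0.0], [0.25, 0.4],
                       [0.65, 0.45], [0.5, 0.0]]), ts)            # degree 3
eps = np.linalg.norm(p_hi - p_lo, axis=1).mean()  # one run's error value
print(eps)
# Mean(ε), Std(ε), and Min(ε) in Tables 15-18 aggregate such values over runs
```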
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
