Article

PSO-Incorporated Hybrid Artificial Hummingbird Algorithm with Elite Opposition-Based Learning and Cauchy Mutation: A Case Study of Shape Optimization for CSGC–Ball Curves

1
Unmanned System Research Institute, Northwestern Polytechnical University, Xi’an 710072, China
2
Xi’an Jingkai No. 1 Primary School, Xi’an 710018, China
3
Department of Applied Mathematics, Xi’an University of Technology, Xi’an 710054, China
4
School of Computer Science and Engineering, Xi’an University of Technology, Xi’an 710048, China
*
Author to whom correspondence should be addressed.
Biomimetics 2023, 8(4), 377; https://doi.org/10.3390/biomimetics8040377
Submission received: 26 June 2023 / Revised: 8 August 2023 / Accepted: 9 August 2023 / Published: 18 August 2023

Abstract:
With the rapid development of the geometric modeling industry and computer technology, the design and shape optimization of complex curve shapes have become a very important research topic in CAGD. In this paper, the Hybrid Artificial Hummingbird Algorithm (HAHA) is used to optimize complex composite shape-adjustable generalized cubic Ball (CSGC–Ball, for short) curves. Firstly, the Artificial Hummingbird Algorithm (AHA), as a newly proposed meta-heuristic algorithm, has the advantages of a simple structure and easy implementation and can quickly find the global optimal solution. However, it still has limitations, such as low convergence accuracy and a tendency to fall into local optima. Therefore, this paper proposes HAHA, which combines the original AHA with the elite opposition-based learning strategy, PSO, and Cauchy mutation to increase the population diversity of the original algorithm, avoid premature convergence to local optima, and thus improve the convergence accuracy and speed of the original AHA. Twenty-five benchmark test functions and the CEC 2022 test suite are used to evaluate the overall performance of HAHA, and the experimental results are statistically analyzed using the Friedman and Wilcoxon rank-sum tests. The experimental results show that, compared with other advanced algorithms, HAHA is highly competitive and practical. Secondly, in order to better model complex curves in engineering, CSGC–Ball curves with global and local shape parameters are constructed based on SGC–Ball basis functions. By changing the shape parameters, the whole or local shape of the curves can be adjusted more flexibly. Finally, in order to give the constructed curves a more ideal shape, a CSGC–Ball curve-shape optimization model is established based on the minimum curve energy value, and the proposed HAHA is used to solve the established shape optimization model.
Two representative numerical examples comprehensively verify the effectiveness and superiority of HAHA in solving CSGC–Ball curve-shape optimization problems.

1. Introduction

Geometric modeling mainly focuses on the representation, approximation, analysis, and synthesis of curve and surface information in computer image system environments [1]. It has been widely used in various fields such as aviation, shipbuilding, surveying and mapping, mechanical design, computer vision, bioengineering, animation, and military combat simulation [2]. The study of Ball curves and surfaces is a very important research topic in geometric modeling, mainly focusing on the geometric design of various products [3]. In 1974, Ball [4] first constructed rational cubic parametric curves and used them as the mathematical basis of the CONSURF fuselage surface modeling system at the Warton division of the British Aircraft Corporation; these curves were later called Ball curves. Later, in order to extend the cubic Ball curves to higher-order generalized Ball curves, Wang–Ball curves were proposed by Wang [5] in 1987, which provided a powerful method for evaluating higher-degree curves. In 1989, Said [6] constructed Said–Ball curves, but the degree of the basis functions of these curves could only be odd because the curves were derived by Hermite interpolation. In order to eliminate this limitation, Hu [7] extended the degree of the basis functions from odd to even in 1996 and defined Said–Ball curves of arbitrary degree. Subsequently, Othman [8], Xi [9], and Ding [10] independently discussed the dual basis functions and their applications to generalized Said–Ball curves. In 2004, Jiang [11] studied the dual basis functions of Wang–Ball curves and their applications, which further improved the theory of Wang–Ball curves.
Compared with the Bézier curves, generalized Ball curves have the same basic properties as the Bézier curves, but at the same time, they also have faster calculation speeds and better calculation efficiency [12]. Therefore, generalized Ball curves have increasingly become an important research hotspot in the field of curve shape design. However, the control points and the corresponding basis functions of the curves are the only basis for determining the shape of generalized Ball curves. If changes are to be made to the shape of the curves, the position of the control points must be adjusted, which makes the adjustment of the curve shape extremely inflexible. In order to improve the flexibility of curve shape adjustment, many scholars have constructed generalized Ball curves with a single shape parameter that can control the shape of the curves by adjusting the value of the shape parameter.
In 2000, Wu [13] proposed two new types of generalized Ball curves, namely n-th degree Said–Bézier–Ball (SBGB) curves and Wang–Said–Ball (WSGB) curves. The SBGB curves lie between Said–Ball curves and Bézier curves, and the WSGB curves lie between Wang–Ball curves and Said–Ball curves. Both of these new curves adjust the shape of the curves by introducing position parameters. On the basis of the cubic Ball curves, Wang [14] further proposed cubic Ball curves with a shape control parameter λ in 2008. In 2009, Wang [15] raised the degree of the cubic Ball curves and proposed quartic Wang–Ball curves with one shape control parameter λ, whose shape can be modified by controlling the value of λ. In the same year, Hu [16] constructed 2m + 2 degree generalized Said–Ball curves with a single shape parameter λ, where the ideal shape can be obtained by changing the value of λ. In 2011, Yan [17] constructed two new types of quintic generalized Ball curves with a shape parameter. In 2013, Xiong et al. [18] proposed n-th degree Wang–Ball curves with a shape parameter λ, which can control the shape of the curves by changing λ; in particular, when λ = 2, these curves degenerate into the traditional n-th degree Wang–Ball curves.
The above generalized Ball curves with a single parameter alleviate, to some extent, the limitation that the shape of traditional Ball curves cannot be adjusted. However, because they carry only a single shape parameter, the curves can only swing as a whole, and the adjustment of the curve shape is still inflexible, making it difficult to meet practical expectations and requirements for curve flexibility.
In order to give generalized Ball curve shape adjustment higher flexibility, in 2011, Liu [19] constructed quadratic Q-Ball curves with two parameters, λ and μ, which improved the flexibility of curve shape adjustment. Subsequently, in 2012, Huang et al. [20] proposed quartic Wang–Ball curves with two shape parameters, α and β, which better control the shape of the curves through the coordination of α and β. Compared with the generalized Ball curves with a single shape parameter, the flexibility of the generalized Ball curves with two shape parameters is significantly improved, but there are still limitations in adjusting the local shape of the curves. Therefore, Hu [21] constructed new generalized cubic Ball basis functions in 2021 and, on this basis, proposed SGC–Ball curves with adjustable shape parameters, which adjust the shape of the curves through the joint adjustment of multiple parameters. Meanwhile, in 2022, Hu et al. [22] proposed CG–Ball curves with multiple shape control parameters, which can flexibly control the shape of the curves through the coordination of those parameters. In curve modeling, the SGC–Ball curves can achieve satisfactory results, but curves encountered in practice are usually very complicated, and it is difficult to build the ideal curves with a single SGC–Ball curve. Therefore, in order to meet the design requirements of more complex geometric product modeling, this paper studies the G1 and G2 smooth splicing continuity conditions of two adjacent SGC–Ball curves. Secondly, complex combined SGC–Ball curves (CSGC–Ball curves for short) are constructed on the basis of the SGC–Ball curves. It is worth noting that these curves have a global shape parameter and several local shape parameters. By adjusting the global or local shape parameters, the overall or local shape of the curves can be adjusted more flexibly.
The CSGC–Ball curves can be used to construct complex curves more flexibly in engineering.
In addition, the shape optimization problem of curves has also become a very important research hotspot in CAGD. In order to construct CSGC–Ball curves with ideal shapes, the CSGC–Ball curve-shape optimization problem under overall G1 and G2 continuity conditions is also studied in this paper. The energy method is a classical method for establishing curve-shape optimization models. In 1987, Terzopoulos et al. [23] first combined computer graphics with physics-based energy models, and the energy method was subsequently widely used to study different types of curve and surface optimization problems, as can be seen in references [24,25,26,27]. Therefore, in this paper, the minimum curve bending energy value is taken as the objective function to establish CSGC–Ball curve-shape optimization models with G1 and G2 continuity, respectively. By solving the shape optimization models and obtaining the optimal parameter values, the most ideal curve shape can be obtained. However, because the objective function of the established shape optimization models is highly nonlinear, involves multiple shape parameters, and has high computational complexity, solving the CSGC–Ball curve optimization models with traditional optimization methods is very complex and challenging. The Meta-Heuristic Algorithm (MA) has become a popular method for solving complex optimization problems in different fields due to its simplicity and flexibility [28]. Therefore, this paper considers using MA to solve the CSGC–Ball curve-shape optimization models.
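As a concrete illustration of the energy objective described above, the bending energy of a sampled parametric curve can be approximated numerically. The sketch below is our own minimal illustration (not the paper's exact model): it integrates ||C''(t)||² with finite differences and the trapezoidal rule, assuming uniform parameter spacing.

```python
import numpy as np

def bending_energy(points, ts):
    """Approximate the bending energy E = integral of ||C''(t)||^2 dt of a
    sampled parametric curve via finite differences and the trapezoidal rule.
    `points` is an (m, 2) array of samples C(ts[k]) at uniform parameters ts."""
    dt = ts[1] - ts[0]
    # Second derivative of each coordinate along the parameter direction.
    d2 = np.gradient(np.gradient(points, dt, axis=0), dt, axis=0)
    integrand = np.sum(d2 ** 2, axis=1)
    # Trapezoidal rule over the parameter interval.
    return dt * (0.5 * integrand[0] + integrand[1:-1].sum() + 0.5 * integrand[-1])

ts = np.linspace(0.0, 1.0, 201)
line = np.stack([ts, 2.0 * ts], axis=1)    # straight line: zero curvature
parab = np.stack([ts, ts ** 2], axis=1)    # parabola: C'' = (0, 2), energy = 4
```

A straight segment has (near-)zero bending energy, while a parabola's energy approaches the analytic value 4; minimizing such an objective over shape parameters is exactly the kind of model HAHA is later applied to.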
MA is an optimization method established by simulating different natural phenomena in nature. It has the advantages of flexibility, simplicity, and wide application [29]. According to No-Free-Lunch (NFL) [30], no single MA can solve all optimization problems, so there are many meta-heuristics. According to the different sources of design inspiration, MA is mainly divided into the following four categories: Evolutionary Algorithm (EA), Physics-based Algorithm (PA), Human-based Algorithm (HA), and Swarm Intelligence (SI) [31], as shown in Table 1.
EA is a kind of algorithm used to simulate the evolutionary behavior of organisms in nature. The most classic EA is the Genetic Algorithm (GA) [32], inspired by Darwin’s evolutionary theory, which seeks the optimal solution for the population according to the law of survival of the fittest. In addition, popular EA also includes Differential Evolution (DE) [33], Genetic Programming (GP) [34], the Species Co-evolutionary Algorithm (SCEA) [35], etc. PA is designed according to physical laws or chemical reaction principles, and the Simulated Annealing (SA) [36] algorithm is the most typical and widely used example. Others include the Gravity Search Algorithm (GSA) [37] based on Newton’s law of universal gravitation, the Sine Cosine Algorithm (SCA) [38] that simulates the sine cosine function to find the optimal solution, the Archimedes Optimization Algorithm (AOA) [39] designed based on Archimedes’ principle, the Crystal Structure Algorithm (CryStAl) [40], and Smell Agent Optimization (SAO) [41] that simulates the interactions between a smell agent and an object evaporating a smell molecule. HA is related to human social behavior. The most classic algorithm is Teaching–Learning-Based Optimization (TLBO) [42], based on the improvement of class levels. Other algorithms include Social Group Optimization (SGO) [43], Student Psychology Based Optimization (SPBO) [44], Bus Transportation Algorithm (BTA) [45], Alpine Skiing Optimization (ASO) [46], etc.
SI is the most popular branch of MA used to simulate the collective behavior of social animals in nature. Particle Swarm Optimization (PSO) [47] is the most classic SI algorithm, inspired by the social behavior of birds and often used to solve various global optimization problems. Famous SI algorithms also include, but are not limited to, Ant Colony Optimization (ACO) [48], based on the collective behavior of ant colonies, Moth–Flame Optimization (MFO) [49,50,51], the Grey Wolf Optimizer (GWO) [52] simulating the cooperative hunting behavior of gray wolves, the Whale Optimization Algorithm (WOA) [53], Harris Hawk Optimization (HHO) [54], the Black Widow Algorithm [55,56], the Seagull Optimization Algorithm (SOA) [57], the Salp Swarm Algorithm (SSA) [58,59], the African Vultures Optimization Algorithm (AVOA) [60], the Dwarf Mongoose Optimization Algorithm (DMOA) [61], the Pelican Optimization Algorithm (POA) [62], Golden Jackal Optimization (GJO) [63], the Artificial Hummingbird Algorithm (AHA) [64], etc. Among them, AHA is a recently proposed bionic MA that is inspired by the intelligent foraging behaviors, special flight skills, and amazing memory function of hummingbirds. Hummingbirds have three unique flight skills: axial, diagonal, and omnidirectional flight. These skills are flexibly and alternately used in their three foraging behaviors. The migration foraging strategy provides the algorithm with powerful exploration capabilities, territorial foraging improves population diversity and avoids the possibility of the algorithm falling into local optima, and guided foraging creates an intelligent balance between exploration and exploitation. In addition, a visit table is established to simulate the powerful memory abilities of hummingbirds.
The performance of AHA is competitive with other well-known algorithms [64]. In 2022, Ramadan [65] made improvements on the basis of the original AHA and proposed an adaptive opposition artificial hummingbird algorithm, referred to as AOAHA, which improved the performance of the AHA and applied it to solve an accurate photovoltaic model of the solar cell system. In the same year, Mohamed [66] proposed the Artificial Hummingbird Optimization Technique (AHOT) to solve the parameter identification problem of lithium-ion batteries for electric vehicles. Meanwhile, in [67], Sadoun et al. used a machine learning method based on AHA to predict the tribological behavior of in situ-synthesized Cu-Al2O3 nanocomposites. In 2022, AHA was used in [68] to solve the planning optimization problem of multiple renewable energy integrated distribution systems with uncertainty, obtaining better optimization results.
Compared with other advanced meta-heuristic algorithms, AHA can quickly and accurately find the global optimal solution and has certain applicability and competitiveness in terms of computational accuracy and time. However, because the standard AHA was designed to be as simple as possible, it still has certain limitations when solving complex optimization problems, such as slow iteration speed, low diversity, and a tendency to converge prematurely. In order to make the original AHA more competitive, another goal of this paper is to propose a Hybrid Artificial Hummingbird Algorithm (HAHA) based on the standard AHA, which combines the elite opposition-based learning strategy [69], the PSO strategy [47], and the Cauchy mutation strategy [70] with the original AHA. The three strategies work together to increase the optimization ability and overall performance of AHA. The proposed HAHA is tested on 25 benchmark functions and the CEC 2022 test suite, and it is verified that the proposed HAHA shows good competitiveness in solving global optimization problems. Therefore, the proposed HAHA is used to solve the established CSGC–Ball curve-shape optimization models. The main contributions of this paper are as follows:
(1)
The G1 and G2 smooth splicing continuity conditions of adjacent SGC–Ball curves are derived, and combined SGC–Ball curves with global and local shape parameters, called CSGC–Ball curves, are constructed; it is verified that the CSGC–Ball curves have better shape adjustability.
(2)
Based on the original AHA, an enhanced AHA (HAHA) is proposed by combining three strategies to effectively solve complex optimization problems. To demonstrate the superiority of HAHA, numerical experiments are compared with other advanced algorithms on the 25 benchmark functions and the CEC 2022 test set. The superiority and practicality of the proposed HAHA have been comprehensively verified.
(3)
According to the minimum energy, the CSGC–Ball curve optimization model is established. The proposed HAHA is used to solve the established model, and the results are compared with those of other algorithms. The results demonstrate that the proposed HAHA is effective in solving the CSGC–Ball curve-shape optimization model.
Table 1. Classification of some advanced MAs.

| Type | Algorithm | Year | Reference |
|---|---|---|---|
| Evolutionary Algorithm (EA) | Genetic Algorithm (GA) | 1992 | [32] |
| | Differential Evolution (DE) | 1997 | [33] |
| | Genetic Programming (GP) | 1992 | [34] |
| Physics-based Algorithm (PA) | Simulated Annealing (SA) | 1983 | [36] |
| | Gravity Search Algorithm (GSA) | 2009 | [37] |
| | Sine Cosine Algorithm (SCA) | 2016 | [38] |
| | Archimedes Optimization Algorithm (AOA) | 2020 | [39] |
| | Crystal Structure Algorithm (CryStAl) | 2021 | [40] |
| | Smell Agent Optimization (SAO) | 2021 | [41] |
| Human-based Algorithm (HA) | Teaching–Learning-Based Optimization (TLBO) | 2012 | [42] |
| | Student Psychology Based Optimization (SPBO) | 2020 | [44] |
| | Bus Transportation Algorithm (BTA) | 2019 | [45] |
| | Alpine Skiing Optimization (ASO) | 2020 | [46] |
| Swarm Intelligence (SI) | Ant Colony Optimization (ACO) | 1995 | [48] |
| | Grey Wolf Optimizer (GWO) | 2014 | [52] |
| | Whale Optimization Algorithm (WOA) | 2016 | [53] |
| | Harris Hawk Optimization (HHO) | 2019 | [54] |
| | African Vultures Optimization Algorithm (AVOA) | 2021 | [60] |
| | Dwarf Mongoose Optimization Algorithm (DMOA) | 2022 | [61] |
| | Pelican Optimization Algorithm (POA) | 2022 | [62] |
| | Golden Jackal Optimization (GJO) | 2022 | [63] |
| | Artificial Hummingbird Algorithm (AHA) | 2022 | [64] |
The rest of the paper is structured as follows:
Section 2 introduces the proposed HAHA in detail. Numerical experiments to evaluate the performance of the proposed HAHA are given in Section 3. Section 4 introduces the constructed combined SGC–Ball curves and studies the G1 and G2 continuous splicing conditions for the SGC–Ball curves. In Section 5, the CSGC–Ball curve-shape optimization models are established based on minimum energy, and the detailed process of solving the shape optimization model using the proposed HAHA is given. Section 6 summarizes the paper and provides future research directions.

2. Hybrid Artificial Hummingbird Algorithm

2.1. Basic Artificial Hummingbird Algorithm

The Artificial Hummingbird Algorithm (AHA) [64] is a novel bionic MA proposed in 2021, inspired by the unique flight skills, intelligent foraging strategies, and strong memory capacity of hummingbirds. Hummingbirds are the smallest but most intelligent birds in the world. They have three special flight skills and three intelligently adjusted foraging strategies. Three foraging behaviors of hummingbirds are shown in Figure 1. Meanwhile, most notably, they have a strong memory, so AHA constructed the visit table to simulate the unique memory ability of hummingbirds for food sources.

2.1.1. Initialization

AHA uses the random initialization method to generate hummingbird population X, and randomly places n hummingbirds on n food sources, as described by Equation (1):
$x_i = Lb + r \cdot (Ub - Lb), \quad i = 1, \dots, n,$
where X = {x1,…,xn} is the hummingbird population, n represents the population size, xi is the location of the i-th food source, r is a d-dimensional random vector in [0,1], and Ub = {ub1,…,ubd} and Lb = {lb1,…,lbd} are upper bounds and lower bounds, respectively. The visit table is initialized by Equation (2):
$VT_{i,j} = \begin{cases} \text{null}, & \text{if } i = j \\ 0, & \text{if } i \neq j \end{cases}, \quad i = 1, \dots, n;\ j = 1, \dots, n,$
where VT<sub>i,j</sub> is the visit level, indicating how long the i-th hummingbird has not visited the j-th food source; the null entries on the diagonal indicate that a hummingbird takes food at its own food source.
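The initialization step of Eqs. (1) and (2) can be sketched in NumPy as follows. This is a minimal reading of the equations, not the authors' implementation; NaN stands in for the "null" diagonal entries of the visit table.

```python
import numpy as np

def initialize_population(n, lb, ub, rng):
    """Random initialization (Eq. 1): scatter n hummingbirds uniformly in
    the box [lb, ub], and build the visit table of Eq. (2)."""
    d = lb.size
    X = lb + rng.random((n, d)) * (ub - lb)
    # Visit table: 0 off the diagonal ("not yet visited"); the diagonal is
    # "null" (a bird feeds at its own source), represented here by NaN.
    VT = np.zeros((n, n))
    np.fill_diagonal(VT, np.nan)
    return X, VT

rng = np.random.default_rng(42)
X, VT = initialize_population(5, np.array([-10.0, -10.0]), np.array([10.0, 10.0]), rng)
```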

2.1.2. Guided Foraging

In the process of foraging, hummingbirds have three special flight skills: axial, diagonal, and omnidirectional flight. The direction switch vector D determines which flight skill a hummingbird uses. Figure 2 describes the three flight behaviors in three-dimensional space. Figure 2a shows axial flight, in which the hummingbird can fly along an arbitrary coordinate axis; Figure 2b reflects diagonal flight, in which the hummingbird flies from one corner of the coordinate frame to its diagonal position; and Figure 2c demonstrates omnidirectional flight, in which the hummingbird can fly in any direction.
In d-dimensional space, the expressions for simulating the axial, diagonal, and omnidirectional flight of hummingbirds are expressed by Equations (3)–(5), respectively.
$D(i) = \begin{cases} 1, & \text{if } i = \mathrm{randi}([1, d]) \\ 0, & \text{else} \end{cases}, \quad i = 1, \dots, d,$
$D(i) = \begin{cases} 1, & \text{if } i = P(j),\ j \in [1, q],\ P = \mathrm{randperm}(q) \\ 0, & \text{else} \end{cases},$
$D(i) = 1, \quad i = 1, \dots, d,$
where randi([1, d]) is a randomly generated integer in [1, d], $q \in [1, \mathrm{rand} \cdot (d - 2) + 1]$, and randperm(q) generates a random permutation of the integers from 1 to q.
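The three flight skills of Eqs. (3)–(5) amount to choosing a 0/1 mask over the coordinate axes. The sketch below is our illustrative reading of those equations (the exact sampling range of q is an assumption on our part):

```python
import numpy as np

def direction_vector(d, flight, rng):
    """Flight-skill switch vector D of Eqs. (3)-(5): a 0/1 mask selecting
    which coordinate axes take part in the move."""
    D = np.zeros(d)
    if flight == "axial":            # Eq. (3): exactly one random axis
        D[rng.integers(d)] = 1.0
    elif flight == "diagonal":       # Eq. (4): a random proper subset of axes
        q = int(rng.integers(1, d))  # q in [1, d-1]; an assumption on its range
        D[rng.permutation(d)[:q]] = 1.0
    else:                            # Eq. (5): omnidirectional, all axes
        D[:] = 1.0
    return D

rng = np.random.default_rng(7)
d_ax = direction_vector(5, "axial", rng)
d_di = direction_vector(5, "diagonal", rng)
d_om = direction_vector(5, "omni", rng)
```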
Hummingbirds will rely on the alternation of three flight skills to reach the target food source and use Equation (6) to simulate guided foraging to obtain the position of candidate food source vi.
$v_i(t+1) = x_{i,aim}(t) + A \cdot D \cdot (x_i(t) - x_{i,aim}(t)),$
where vi(t + 1) is the position of the candidate solution in iteration t + 1, and xi(t) is the i-th food source in iteration t. In addition, xi,aim(t) is the location of the target food source where the i-th hummingbird will be located. A~N(0,1) is the guiding parameter that obeys the normal distribution.
The position of the i-th food source of the hummingbird is updated by Equation (7).
$x_i(t+1) = \begin{cases} x_i(t), & \text{if } f(x_i(t)) \le f(v_i(t+1)) \\ v_i(t+1), & \text{if } f(x_i(t)) > f(v_i(t+1)) \end{cases},$
where f(xi(t)) and f(vi(t + 1)) represent the nectar replenishment rates of hummingbird food sources and candidate food sources, respectively; that is, the fitness value of the function.
The visit table simulates the unique memory ability of hummingbirds and is used to store important time information for accessing food sources. Each hummingbird can find the food source they are going to visit based on the information on the visit table. They prefer the food source with the highest visit level, but if multiple food sources have the same visit level, the food source with the highest supplement rate will be selected. In each iteration, after the hummingbird selects the target food source through guided foraging by Equation (6), the visit table will be updated accordingly; for the update details of the visit table, refer to reference [64].
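Guided foraging (Eq. 6) followed by the greedy selection of Eq. (7) can be sketched as follows; this is a minimal illustration assuming a minimization objective (here a hypothetical sphere function), not the authors' code:

```python
import numpy as np

def guided_foraging(x_i, x_aim, D, rng):
    """Guided foraging (Eq. 6): move relative to the target food source
    x_aim along the active axes D, with guide factor a ~ N(0, 1)."""
    a = rng.standard_normal()
    return x_aim + a * D * (x_i - x_aim)

def greedy_update(x_old, v_new, f):
    """Greedy selection (Eq. 7): keep whichever position has the lower
    fitness (nectar-refill) value."""
    return v_new if f(v_new) < f(x_old) else x_old

sphere = lambda x: float(np.sum(x ** 2))   # hypothetical test objective
rng = np.random.default_rng(0)
x_old = np.array([3.0, 4.0])
v_new = np.array([1.0, 1.0])
chosen = greedy_update(x_old, v_new, sphere)
# With D = 0, Eq. (6) collapses to the target food source itself.
stay = guided_foraging(x_old, v_new, np.zeros(2), rng)
```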

2.1.3. Territorial Foraging

When hummingbirds visit target candidate solutions, they will move into adjacent territories in search of new food sources that may be better candidates than existing ones. The mathematical expression for simulating the territorial foraging strategy of hummingbirds is:
$v_i(t+1) = x_i(t) + B \cdot D \cdot x_i(t),$
where vi(t + 1) is the position of the candidate food source obtained by hummingbird i through territorial foraging in t + 1 iterations, and B~N(0,1) represents the territorial parameter obeying the normal distribution. Hummingbirds update the visit table after performing territorial foraging.
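Eq. (8) is a purely local move around the current source; a minimal sketch (our illustration, using the same direction-mask convention as above):

```python
import numpy as np

def territorial_foraging(x_i, D, rng):
    """Territorial foraging (Eq. 8): search near the current food source,
    v = x + b * D * x, with territorial factor b ~ N(0, 1)."""
    b = rng.standard_normal()
    return x_i + b * D * x_i

rng = np.random.default_rng(1)
x = np.array([2.0, -3.0])
stay = territorial_foraging(x, np.zeros(2), rng)   # D = 0: no axis moves
move = territorial_foraging(x, np.ones(2), rng)    # omnidirectional move
```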

2.1.4. Migration Foraging

Hummingbirds tend to migrate further afield to feed when there is a shortage of food in the areas they visit. A migration coefficient, M, determines whether a hummingbird migrates. If the number of iterations exceeds M, the hummingbird with the worst fitness value randomly migrates to a randomly generated food source in the search space. The migration foraging behavior of hummingbirds from the food source with the worst nectar replenishment rate to the randomly generated food source can be expressed as
$x_{wor}(t+1) = Low + r \cdot (Up - Low),$
where $x_{wor}(t+1)$ is the food source with the worst nectar replenishment rate in the hummingbird population, and r is a random vector in [0,1]. Hummingbirds update the visit table after migration foraging. Here, the migration coefficient is M = 2n. The pseudo-code of the original AHA can be found in reference [64].
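Eq. (9) re-places only the worst individual; a minimal sketch (our illustration, assuming a minimization objective so "worst" means largest fitness value):

```python
import numpy as np

def migration_foraging(X, fitness, lb, ub, rng):
    """Migration foraging (Eq. 9): the individual with the worst nectar
    replenishment rate is re-placed uniformly at random in [lb, ub]."""
    worst = int(np.argmax(fitness))   # worst = largest value under minimization
    X[worst] = lb + rng.random(lb.size) * (ub - lb)
    return worst

rng = np.random.default_rng(3)
X = np.array([[0.0, 0.0], [5.0, 5.0], [1.0, 1.0]])
fit = np.array([0.0, 50.0, 2.0])
idx = migration_foraging(X, fit, np.array([-1.0, -1.0]), np.array([1.0, 1.0]), rng)
```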

2.2. Hybrid Artificial Hummingbird Algorithm

Compared with other commonly used MAs, AHA can quickly find the global optimal solution and has certain applicability and potential for solving global optimization problems. However, the original AHA still has some limitations in solving some complex optimization problems, such as low algorithm accuracy and a tendency to fall into local optima. In order to make the original AHA more competitive, a new Hybrid Artificial Hummingbird Algorithm (HAHA) is proposed in this study, which makes the following three improvements based on the original AHA. Firstly, the introduction of the elite opposition-based learning strategy in the guided foraging process helps to improve the hummingbirds' search ability, which can effectively improve the exploration ability of the standard AHA. Secondly, the introduction of the PSO strategy in the exploitation stage of AHA helps hummingbirds learn from individuals with good fitness values in the population, accelerates the convergence speed, and improves the accuracy of the algorithm. Lastly, the Cauchy mutation strategy is introduced into the migration foraging of hummingbirds to expand the range of mutation, which helps the algorithm escape stagnation and improves the search efficiency of the original AHA.

2.2.1. Elite Opposition-Based Learning

AHA communicates information within the population by the visit table, which largely limits the search range of hummingbirds and easily makes the population fall into the local optimum, thereby affecting the accuracy of the solution. In order to improve the possibility of individuals approaching the optimal value in the exploration stage, the elite opposition-based learning (EOL) strategy [69] is introduced on the basis of the original AHA to improve the exploration ability of the algorithm. EOL is an innovative search method in intelligent computing. The main idea is as follows: first, the hummingbird individual with the best fitness value is regarded as the elite individual $e(t) = \{e^1(t), e^2(t), \dots, e^d(t)\}$; the elite individual is used to generate the opposition solution to the current solution, and the better solution is selected instead of the original solution. Then, the elite opposition-based solution $x_{i,elite}(t) = \{x_{i,elite}^1(t), x_{i,elite}^2(t), \dots, x_{i,elite}^d(t)\}$ of the hummingbird individual $x_i(t) = \{x_i^1(t), x_i^2(t), \dots, x_i^d(t)\}$ can be defined by Equation (10):
$x_{i,elite}^{j}(t+1) = \mathrm{rand}(0,1) \cdot (e_a(t) + e_b(t)) - x_i^j(t), \quad i = 1, \dots, n;\ j = 1, \dots, d,$
where $e_a(t) = \min(e^j(t))$ and $e_b(t) = \max(e^j(t))$, $j = 1, \dots, d$.
In guiding the foraging stage, the EOL strategy can better enable hummingbirds to forage for food sources with the highest nectar replenishment rates, improve their exploration abilities, enhance population diversity, and reduce the probability of falling into the local optimum, thereby improving the global search ability of hummingbird populations.
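Eq. (10) reflects each individual through the interval spanned by the elite's coordinates. A minimal sketch of this, under our reading of the equation (not the authors' code):

```python
import numpy as np

def elite_opposition(X, fitness, rng):
    """Elite opposition-based learning (Eq. 10): reflect each individual
    through the interval [e_a, e_b] spanned by the coordinates of the
    elite (best-fitness) individual."""
    elite = X[np.argmin(fitness)]
    e_a, e_b = elite.min(), elite.max()   # e_a, e_b of Eq. (10)
    k = rng.random(X.shape)               # rand(0, 1), fresh per entry
    return k * (e_a + e_b) - X

rng = np.random.default_rng(5)
X = np.array([[0.0, 1.0], [2.0, 3.0]])
X_opp = elite_opposition(X, np.array([1.0, 13.0]), rng)
```

Here the elite is the first row, so e_a = 0 and e_b = 1, and each opposed coordinate is k·1 minus the original coordinate.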

2.2.2. PSO Strategy

In the exploitation stage, hummingbirds need to search for novel food sources and then select the food source with the highest nectar supplement rate as the object to be visited, according to the visit table. However, the original AHA does not consider learning from hummingbirds with good fitness values in the population, which is still a limitation. PSO [47] is an optimization algorithm proposed by Eberhart and Kennedy in 1995 that has the advantages of fast convergence and easy implementation. The velocity update equation is shown in Equation (11):
$v_{i,p}(t+1) = w \cdot x_i(t) + c_1 \cdot \mathrm{rand} \cdot (x_{i,pbest}(t) - x_i(t)) + c_2 \cdot \mathrm{rand} \cdot (x_{gbest} - x_i(t)) + b \cdot D \cdot x_i(t),$
where c1 and c2 are learning factors with a value of 2, xi,pbest is the local optimal solution, and xgbest represents the global optimal solution.
$w = (w_{ini} - w_{end}) \cdot \dfrac{T - t}{T} + w_{end},$
in Equation (12), w is the inertia weight, and $w_{ini} = 0.9$ and $w_{end} = 0.4$ are the initial and final inertia weights, respectively. As the number of iterations increases, w decreases linearly from $w_{ini}$ to $w_{end}$.
The velocity update formula of PSO is introduced into the exploitation stage of the standard AHA so that hummingbirds learn from individuals with good fitness values in the population, which increases the convergence speed and solution accuracy of the original AHA.
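Eqs. (11) and (12) can be sketched as follows. This is our illustrative reading: the values w_ini = 0.9 and w_end = 0.4 are the conventional PSO setting consistent with a decreasing inertia weight, and the b·D·x term is the territorial factor retained from the original AHA.

```python
import numpy as np

def inertia_weight(t, T, w_ini=0.9, w_end=0.4):
    """Linearly decreasing inertia weight (Eq. 12)."""
    return (w_ini - w_end) * (T - t) / T + w_end

def hybrid_velocity(x_i, p_best, g_best, D, t, T, rng, c1=2.0, c2=2.0):
    """Hybrid update (Eq. 11): inertia, cognitive, and social PSO terms
    plus the b * D * x territorial term of the original AHA."""
    w = inertia_weight(t, T)
    b = rng.standard_normal()
    return (w * x_i
            + c1 * rng.random() * (p_best - x_i)
            + c2 * rng.random() * (g_best - x_i)
            + b * D * x_i)

rng = np.random.default_rng(9)
x = np.array([1.0, 2.0])
# With p_best = g_best = x and D = 0, only the inertia term survives.
v = hybrid_velocity(x, x, x, np.zeros(2), 0, 100, rng)
```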

2.2.3. Cauchy Mutation Strategy

In AHA, the main purpose of hummingbirds choosing migration foraging is to enhance the exploration skills of the algorithm. When the number of iterations exceeds the migration coefficient M, the hummingbirds with the worst nectar replenishment rate will migrate to randomly generated food sources, which realizes the global exploration of the algorithm. However, in the experiments, it was found that it is still easy for the standard AHA to fall into local optimization.
In this paper, the Cauchy mutation strategy [70] is introduced to generate larger disturbances near randomly generated hummingbird individuals to improve the mutation ability of the hummingbird population, thereby improving the global search ability of the algorithm, increasing the mutation range, and preventing the algorithm from prematurely falling into a local optimum. The Cauchy distribution is a classical heavy-tailed continuous probability distribution; the one-dimensional Cauchy density function is given by Equation (13):
$f(x; \delta, \mu) = \dfrac{1}{\pi} \cdot \dfrac{\delta}{\delta^2 + (x - \mu)^2}, \quad x \in (-\infty, +\infty),$
when $\delta = 1$ and $\mu = 0$, the standard Cauchy density function is obtained, as defined by Equation (14):
$f(x; 1, 0) = \dfrac{1}{\pi (x^2 + 1)}, \quad x \in (-\infty, +\infty),$
and a standard Cauchy random variable can be generated by Equation (15):
$\mathrm{cauchy}(0, 1) = \tan\left(\pi \left(\xi - \tfrac{1}{2}\right)\right), \quad \xi \sim U[0, 1].$
Equation (16) is used to perform Cauchy mutation processing on randomly generated food sources in migration foraging:
$x_{cauchy}(t+1) = x_{wor}(t+1) + r \times \mathrm{cauchy}(0, 1),$
where cauchy(0,1) is the Cauchy mutation operator.
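The sampling formula in Equation (15) and the perturbation in Equation (16) can be sketched as follows (Python; the scaling factor r is treated here as a plain number, since its exact definition belongs to the paper's Equation (16) context):

```python
import math
import random

def cauchy01(xi):
    # inverse-CDF sampling of the standard Cauchy distribution, Eq. (15)
    return math.tan(math.pi * (xi - 0.5))

def cauchy_mutation(x_wor, r=1.0, xi=None):
    # Eq. (16): perturb the migrated food source with a Cauchy-distributed
    # step in every dimension (xi can be fixed for reproducibility)
    return [xw + r * cauchy01(random.random() if xi is None else xi)
            for xw in x_wor]
```

Because the Cauchy density has heavy tails, occasional very large steps are produced, which is exactly what lets the mutated individual escape a local basin.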
The Cauchy mutation strategy is introduced in the exploration stage of the original AHA so that hummingbird individuals learn from other random individuals in the population; this expands the search range of the hummingbird population and increases its diversity, thereby effectively improving the convergence accuracy and speed of the AHA.

2.2.4. Detailed Steps for the Proposed HAHA

This part details the specific steps of the proposed HAHA, which improves the original AHA by combining it with the EOL strategy, the PSO strategy, and the Cauchy mutation. Figure 3 summarizes the specific implementation steps and the flow chart of the proposed HAHA.

2.3. Computational Complexity of the Proposed HAHA

Computational complexity is one of the significant indicators used to evaluate the efficiency of an algorithm and includes space complexity and time complexity. The computational complexity of the proposed HAHA is related to the algorithm initialization (Init), the evaluation of individual fitness values (FV) in each iteration, D, n, and T. In this paper, “Oh” denotes the computational complexity of the algorithm. Initialization assigns a value to each dimension of every hummingbird, so its complexity is Oh(nD). HAHA evaluates the fitness of every individual in each iteration, which costs Oh(T‧FV‧n). HAHA introduces EOL in guided foraging (gui fora), which increases the computational complexity of AHA; the complexity of this stage is Oh(TnD/2 + TnD/2). The PSO strategy is introduced in territorial foraging (ter fora), whose complexity is likewise Oh(TnD/2 + TnD/2). The Cauchy mutation strategy is introduced in migration foraging (mig fora), whose complexity is Oh((TnD + TnD)/(2n)). Therefore, the overall computational complexity of the proposed HAHA can be expressed by Equation (17):
$O_h(HAHA)=O_h(\text{problem definition})+O_h(Init)+O_h(t(FV))+O_h(t(\text{gui fora}))+O_h(t(\text{ter fora}))+O_h(t(\text{mig fora}))=O_h\!\left(1+nD+T\,FV\,n+\dfrac{TnD+TnD}{2}+\dfrac{TnD+TnD}{2}+\dfrac{TnD+TnD}{2n}\right)=O_h\!\left(1+nD+T\,FV\,n+2TnD+TD\right)$

3. Numerical Experiments and Analysis

In this section, the performance of the proposed HAHA is simulated on 25 benchmark functions and the CEC 2022 benchmark functions and compared with other optimization algorithms and other improved AHAs. The optimization ability, convergence, and statistical test results of HAHA are evaluated, which further verifies the superiority of HAHA across a series of evaluation indicators. In addition, to guarantee the reliability and persuasiveness of the experimental results, the compilation environment for all experiments is the same: they are compiled and run in MATLAB R2017b on Windows 11 with an AMD Ryzen 7 4800H with Radeon Graphics CPU at 2.90 GHz and 16 GB RAM, and the experimental results are obtained by running each test function 30 times independently.

3.1. Benchmark Functions

As part of the study, 25 benchmark test functions and the challenging CEC 2022 test suite were used to evaluate the performance of the proposed HAHA. The details of the 25 benchmark functions are shown in Appendix A, Table A1. These functions comprise uni-modal, multi-modal, hybrid, composition, and fixed-dimensional functions, which together are well suited to evaluating algorithm performance. Among them, F1 is a uni-modal function with a single extreme value, suitable for testing the local exploitation ability of an algorithm. F2–F5 are multi-modal functions with multiple local minima, usually used to evaluate the ability of an algorithm to explore and jump out of local optima. F6–F10 are hybrid functions composed of multi-modal or uni-modal functions, which test the balance between exploration and exploitation. The composition functions F11–F15 are composed of basic and hybrid functions, making the problems more complicated. F16–F25 are fixed-dimensional functions taken from the CEC 2019 test functions [71]; the complexity of their search spaces is significantly higher and more challenging, and they are used to evaluate the comprehensive ability of an algorithm.

3.2. Algorithm Parameter Settings

The proposed HAHA is compared with other algorithms, namely the original AHA [64], PSO [47], WOA [53], SCA [38], HHO [54], Seagull Optimization Algorithm (SOA) [57], SSA [58], AVOA [60], CryStAl [40], DMOA [62], Sand Cat Swarm Optimization (SCSO) [72], GJO [63], and AOAHA [65], where AOAHA is an improved variant of AHA. Table 2 shows the parameter settings of some algorithms; the parameters of the remaining comparison algorithms are the same as in the corresponding references. Each algorithm is run independently 30 times on each benchmark function, and all reported results are based on the average performance of the algorithms.

3.3. Results and Analyses for 25 Benchmark Functions

In this experiment, in order to objectively and fairly evaluate the proposed HAHA, the average value (Avg) and standard deviation (Std) of the best solution obtained by independent running of each test function are compared as evaluation indicators, and the calculation formula is as follows [73]:
$Avg=\dfrac{1}{runs}\sum_{i=1}^{runs}f_i^{*},\qquad Std=\sqrt{\dfrac{1}{runs-1}\sum_{i=1}^{runs}\left(f_i^{*}-Avg\right)^{2}},$
where f i * is the best solution obtained in the i-th independent run, and runs represents the number of independent runs of the function.
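These two statistics can be computed as follows (a small Python helper mirroring the formulas; the runs − 1 denominator gives the sample standard deviation):

```python
import math

def avg_std(best_solutions):
    # mean and sample standard deviation of the best objective values
    # collected from independent runs, as in the Avg/Std formulas above
    runs = len(best_solutions)
    avg = sum(best_solutions) / runs
    var = sum((f - avg) ** 2 for f in best_solutions) / (runs - 1)
    return avg, math.sqrt(var)
```

A small Std relative to Avg indicates that the algorithm's results are stable across runs.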
Table 3 shows the statistical results of the 14 algorithms run independently 30 times on the 25 test functions, including Avg, Std, the Wilcoxon rank-sum test p-value, and the average rank; the best results are marked in bold. The uni-modal function F1 has only a global optimal value and is used to test the local exploitation ability of an algorithm. The average value of the optimal solution obtained by HAHA on F1 is the smallest, which indicates that the proposed HAHA has a very effective exploitation ability. F2–F5 in Table 3 are the evaluation results for the multi-modal functions. The experimental results of HAHA are generally better than those of the competing algorithms on such functions, especially on the F3 and F4 optimization problems, which proves that the proposed HAHA has good exploration ability and effectively avoids falling into local optima.
Hybrid and composition functions are mostly used to evaluate the balance between exploitation and exploration. As shown in Table 3, the performance of the proposed HAHA is more competitive on functions F6, F7, F10, F12, and F15: compared with the other algorithms, HAHA keeps a good balance between exploitation and exploration. F16–F25 are the more challenging fixed-dimensional functions selected from CEC 2019. HAHA reaches the optimal value of 1 on F16 and has obvious advantages on F19, F21, F22, and F25, which further indicates that HAHA can better explore the optimal solutions of complex problems. The standard deviations show that the performance of the proposed HAHA is stable. At the end of Table 3, the final ranking of each algorithm over the 25 test functions is given, with HAHA ranking first on average.
In addition to the statistical analysis via means and standard deviations, the Wilcoxon rank-sum test was used to assess statistical differences between the proposed HAHA and the other competing algorithms [74]. The p-value determines whether a given algorithm differs significantly from the algorithm under test: a p-value < 0.05 indicates a statistically significant difference from HAHA. Table 3 presents the p-values of HAHA versus the other comparison algorithms on the 25 benchmark functions; in most cases they are below 0.05, indicating significant differences between HAHA and the other algorithms. The last line of Table 3 gives the counts of significant differences, expressed as (+/=/−), where “+” indicates that the compared algorithm performs better than the proposed HAHA, “=” indicates that HAHA and the compared algorithm perform similarly, and “−” indicates that the compared algorithm is worse than HAHA. Compared with the original AHA, HAHA differs significantly on 19 test functions and performs better. HAHA also differs significantly from PSO, WOA, SCA, HHO, SOA, SSA, AVOA, CryStAl, DMOA, SCSO, and GJO. Therefore, HAHA performs well, and its advantage is statistically significant.
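For readers who want to reproduce the significance test, a pure-Python version of the two-sided rank-sum test using the normal approximation is sketched below (the paper's actual implementation, e.g. MATLAB's ranksum, may use exact small-sample tables; the tie-variance correction is also omitted here for brevity):

```python
import math

def rank_sum_p(a, b):
    # two-sided Wilcoxon rank-sum p-value, normal approximation,
    # with average ranks assigned to tied values
    values = sorted(a + b)
    rank = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        rank[values[i]] = (i + 1 + j) / 2.0   # average of ranks i+1 .. j
        i = j
    w = sum(rank[v] for v in a)               # rank sum of the first sample
    n, m = len(a), len(b)
    mean = n * (n + m + 1) / 2.0
    sd = math.sqrt(n * m * (n + m + 1) / 12.0)
    z = (w - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2.0)) # two-sided normal tail
```

With 30 runs per algorithm, the normal approximation is adequate, and p < 0.05 is read as a significant difference, exactly as in Table 3.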
Meanwhile, the Friedman test [75], a popular non-parametric method, is also used; Table 4 shows the Friedman test results of each algorithm on the 25 test functions. The proposed HAHA has the best Friedman result for most test functions, and the average performance ranking is also given in Table 4. Compared with the other algorithms, HAHA has the best average Friedman result of 2.4 over the 25 benchmark functions.
The convergence of the proposed algorithm is verified by comparing the convergence curves of the proposed HAHA and the competing algorithms. Figure 4 shows the convergence curves on the 25 test functions, obtained by averaging the best solutions over 1000 iterations. As can be seen from Figure 4, the proposed HAHA is more competitive than the comparison algorithms: in the initial stage of iteration, HAHA converges faster, especially on F3, F7, F16, F19, F21, and F22. As the number of iterations increases, the algorithm quickly converges to the optimal solution with high accuracy. Throughout the iteration process, the proposed HAHA maintains a balance between exploration and exploitation, effectively reducing the possibility of premature convergence.
Figure 5 shows box plots of the optimal-solution distribution for each function. For most functions, the boxes of HAHA lie lower, indicating that the proposed HAHA has better performance and stronger robustness. Figure 6 shows radar charts comparing HAHA with the other competitive algorithms: the larger the enclosed area, the worse the algorithm's ranking, while the smallest area indicates the best overall performance. Figure 7 shows that the average rank of HAHA is the smallest, i.e., it ranks first among all the compared algorithms, which again proves the superiority of the proposed HAHA.
The computational cost is also an important criterion for evaluating the performance of an algorithm. Table 5 shows the average runtime, in seconds, of HAHA and the other competing algorithms on the test set. Compared with the original AHA, the computational cost of HAHA inevitably increases, which is consistent with the preceding complexity analysis and remains an issue to be addressed in subsequent research.
To sum up, compared with other intelligent algorithms, HAHA effectively improves the exploration and exploitation capabilities of the algorithm, avoids falling into local optima, and shows good competitiveness.

3.4. Results and Analyses on CEC 2022 Benchmark Functions

In this section, the latest CEC 2022 test functions are selected to further evaluate the performance of the proposed HAHA, comparing it with other advanced intelligent algorithms and improved variants of AHA, including PSO [47], WOA [53], SCA [38], HHO [54], SOA [57], SSA [58], SAO [41], POA [62], Kepler Optimization Algorithm (KOA) [76], SCSO [72], GJO [63], AOAHA [65], and AHA [64]. The CEC 2022 test functions simulate highly complex global optimization problems and are very challenging. In order to ensure the fairness and persuasiveness of the experimental results, all functions were tested in 10-dimensional space, the algorithm parameter settings were the same as in Table 2, and the experimental results were averaged over 30 independent runs.
Table 6 presents the experimental results of 30 independent runs of HAHA and the other competitive algorithms on the CEC 2022 test set, including the mean, standard deviation, ranking, and p-value. The table shows that HAHA achieves the best performance on 9 of the test functions, and its average ranking of 1.250 is the best overall. The Wilcoxon rank-sum p-values test whether there is a significant difference between HAHA and the other algorithms; as Table 6 shows, most algorithms have p-values less than 0.05, indicating that the advantage of HAHA is statistically significant. From the convergence curves in Figure 8, it can be seen that HAHA can effectively jump out of local optima and quickly approach the global optimal solution. These results indicate that HAHA is strongly competitive and can serve as a powerful tool for solving global optimization problems.

4. Construction of CSGC–Ball Curves

In this section, first, the CSGC–Ball curves are defined, which are composed of N-segmented SGC–Ball curves and can construct more flexible and controllable complex curves. Secondly, in order to make the constructed curves smooth and continuous, the continuity conditions satisfying the G1 and G2 smooth splicing of the CSGC–Ball curves are studied, respectively. Finally, an example of CSGC–Ball curves is given.
Definition 1. 
The shape-adjustable generalized cubic Ball (SGC–Ball, for short) curves can be defined as [21]
$P(t;\Omega)=\sum_{i=0}^{3}P_{i}\,b_{i,3}(t),\quad t\in[0,1],$
where $P_{i}\in\mathbb{R}^{u}\ (u=2,3;\ i=0,1,2,3)$ are the control points of the curves; $\Omega=\{\omega,\lambda_{1},\lambda_{2},\lambda_{3}\}$ are the shape parameters; $\omega\in[0,1]$ is the global shape parameter; $\lambda_{1},\lambda_{3}\in[-3,3],\ \lambda_{2}\in[0,4]$ are the local shape parameters; and $b_{i,3}(t)\ (i=0,1,2,3)$ are the SGC–Ball basis functions defined as
$\begin{aligned} b_{0,3}(t)&=\bigl[1+(\lambda_{1}-1)\,\omega t(1-t)\bigr](1-t)^{2},\\ b_{1,3}(t)&=\bigl[2+\omega(1-\lambda_{1})+\omega(\lambda_{1}+\lambda_{2}-3)t\bigr]\,t(1-t)^{2},\\ b_{2,3}(t)&=\bigl[2+\omega(2-\lambda_{2})-\omega(\lambda_{3}-\lambda_{2}+1)t\bigr]\,t^{2}(1-t),\\ b_{3,3}(t)&=\bigl[1+(\lambda_{3}-1)\,\omega t(1-t)\bigr]\,t^{2}. \end{aligned}$
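The basis functions of Equation (20), as transcribed here, can be evaluated directly; a quick check also confirms that they sum to one and interpolate the end control points (Python sketch, with our own function names):

```python
def sgc_ball_basis(t, w, l1, l2, l3):
    # SGC-Ball basis of Eq. (20): w is the global shape parameter,
    # l1..l3 the local ones
    b0 = (1 + (l1 - 1) * w * t * (1 - t)) * (1 - t) ** 2
    b1 = (2 + w * (1 - l1) + w * (l1 + l2 - 3) * t) * t * (1 - t) ** 2
    b2 = (2 + w * (2 - l2) - w * (l3 - l2 + 1) * t) * t ** 2 * (1 - t)
    b3 = (1 + (l3 - 1) * w * t * (1 - t)) * t ** 2
    return b0, b1, b2, b3

def sgc_ball_point(t, ctrl, w, l1, l2, l3):
    # evaluate P(t) = sum_i P_i * b_{i,3}(t) for planar control points
    basis = sgc_ball_basis(t, w, l1, l2, l3)
    return tuple(sum(b * p[d] for b, p in zip(basis, ctrl)) for d in range(2))
```

The partition-of-unity property holds for every admissible choice of the shape parameters, which is what makes the curve shape-adjustable without losing affine invariance.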
Compared with the traditional generalized Ball curves, the SGC–Ball curves achieve a satisfactory effect when constructing simple curve shapes; however, for the complex geometric curves encountered in real life, a single SGC–Ball curve can hardly meet the requirements of curve construction and has certain limitations. Therefore, it is of great significance to construct complex combinations of SGC–Ball curves, which are defined as follows.
Definition 2. 
Given N + 1 nodes $u_{0}<u_{1}<u_{2}<\cdots<u_{i}<u_{i+1}<\cdots<u_{N-1}<u_{N}$, the composite shape-adjustable generalized cubic Ball (abbreviated CSGC–Ball) curves are defined as
$\tilde{P}(u,\Omega):=P_{j}\!\left(\dfrac{u-u_{j-1}}{h_{j}},\Omega_{j}\right),\quad u\in[u_{j-1},u_{j}],\ j=1,2,\ldots,N,$
where $h_{j}=u_{j}-u_{j-1}$; $\Omega_{j}=(\omega,\lambda_{1,j},\lambda_{2,j},\lambda_{3,j}),\ j\in(1,2,\ldots,N)$ are the shape parameters of the j-th SGC–Ball curve; $\omega\in[0,1]$ is the global shape parameter; and $\lambda_{1,j},\lambda_{3,j}\in[-3,3],\ \lambda_{2,j}\in[0,4],\ j\in(1,2,\ldots,N)$ are the local shape parameters of the CSGC–Ball curves. Meanwhile, the CSGC–Ball curves can be written as:
$\tilde{\Pi}:\ P_{j}\!\left(\dfrac{u-u_{j-1}}{h_{j}},\Omega_{j}\right)=\sum_{i=0}^{3}P_{i,j}\,b_{i,3}\!\left(\dfrac{u-u_{j-1}}{h_{j}}\right),$
where $P_{i,j}\ (i=0,1,2,3;\ j=1,2,\ldots,N)$ represents the i-th control vertex of the j-th SGC–Ball curve.
The CSGC–Ball curves require adjacent curve segments to join smoothly and continuously. In order to make the constructed CSGC–Ball curves meaningful, the G1 and G2 smooth splicing conditions between the j-th and (j + 1)-th SGC–Ball curves are discussed below.
Theorem 1. 
If the control vertices and shape parameters of the j-th and (j + 1)-th segment SGC–Ball curves at node uj satisfy
$P_{0,j+1}=P_{3,j},\qquad P_{1,j+1}=\dfrac{k\,h_{j+1}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{h_{j}\bigl(2-\omega(\lambda_{1,j+1}-1)\bigr)}\,(P_{3,j}-P_{2,j})+P_{0,j+1},$
then the CSGC–Ball curves are said to be G1 continuous at node uj, where k > 0. If the CSGC–Ball curves satisfy Equation (23) at each node uj (j = 1,2,…,N), then the CSGC–Ball curves are G1 continuous as a whole. In particular, when k = 1, Equation (23) is a necessary and sufficient condition for the CSGC–Ball curves to be C1 continuous at the nodes uj (j = 1,2,…,N). The proof of Theorem 1 is given in Appendix B.1.
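In practice, the G1 condition fixes the first two control vertices of segment j + 1 from the last two of segment j; a sketch in Python (function and argument names are ours, not the paper's):

```python
def g1_inner_points(P2j, P3j, h_j, h_next, w, l3_j, l1_next, k=1.0):
    # Theorem 1: P0 of the next segment coincides with P3 of the current
    # one, and P1 of the next segment lies along the direction P3 - P2
    P0_next = P3j
    scale = (k * h_next * (2 - w * (l3_j - 1))
             / (h_j * (2 - w * (l1_next - 1))))
    P1_next = tuple(p0 + scale * (p3 - p2)
                    for p0, p3, p2 in zip(P0_next, P3j, P2j))
    return P0_next, P1_next
```

Because scale > 0 whenever k > 0 and the parameters stay in their admissible ranges, the outgoing tangent keeps the direction of the incoming one, which is exactly what G1 continuity requires.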
Theorem 2. 
If the control vertices and shape parameters of the j-th and (j + 1)-th segment SGC–Ball curves at node uj satisfy
$\left\{\begin{aligned} P_{0,j+1}&=P_{3,j},\\ P_{1,j+1}&=\dfrac{k\,h_{j+1}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{h_{j}\bigl(2-\omega(\lambda_{1,j+1}-1)\bigr)}\,(P_{3,j}-P_{2,j})+P_{0,j+1},\\ P_{2,j+1}&=\dfrac{2k^{2}h_{j+1}^{2}\bigl(3\omega(\lambda_{3,j}-1)-1\bigr)-\beta h_{j}h_{j+1}^{2}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{2h_{j}^{2}\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{3,j}\\ &\quad+\dfrac{-2k^{2}h_{j+1}^{2}\bigl(\omega(3\lambda_{3,j}-\lambda_{2,j}-1)-4\bigr)+\beta h_{j}h_{j+1}^{2}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{2h_{j}^{2}\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{2,j}\\ &\quad-\dfrac{k^{2}h_{j+1}^{2}\bigl(\omega(\lambda_{2,j}-2)+2\bigr)}{h_{j}^{2}\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{1,j}-\dfrac{k^{2}h_{j+1}^{2}}{h_{j}^{2}\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{0,j}\\ &\quad-\dfrac{3\omega(\lambda_{1,j+1}-1)-1}{\omega(\lambda_{2,j+1}-2)-2}\,P_{0,j+1}+\dfrac{\omega(3\lambda_{1,j+1}+\lambda_{2,j+1}-5)-4}{\omega(\lambda_{2,j+1}-2)-2}\,P_{1,j+1}+\dfrac{1}{\omega(\lambda_{2,j+1}-2)-2}\,P_{3,j+1}, \end{aligned}\right.$
then the CSGC–Ball curves are said to be G2 continuous at the connection node uj, where k > 0 and β is an arbitrary constant. If the CSGC–Ball curves satisfy the G2 continuity condition at each node uj (j = 1,2,…,N), then the overall CSGC–Ball curves are G2 continuous. The proof of Theorem 2 is given in Appendix B.2.
According to the definition of the CSGC–Ball curves and the G1 smooth splicing condition of Theorem 1, Figure 9 gives examples of CSGC–Ball curves that satisfy the overall G1 smooth splicing condition when N = 5. Different colors represent the individual SGC–Ball curves to be spliced. $\Omega_{j}=(\omega,\lambda_{1,j},\lambda_{2,j},\lambda_{3,j}),\ j\in(1,2,\ldots,5)$ are the shape parameters of the j-th SGC–Ball curve, where ω is the global shape parameter of the CSGC–Ball curves and $\lambda_{1,j},\lambda_{2,j},\lambda_{3,j}\ (j=1,2,\ldots,5)$ are the local shape parameters. The figure involves 16 variables, including 1 global shape parameter and 15 local shape parameters. Figure 9a–c describes CSGC–Ball curves with overall G1 smooth splicing, with parameter values $\Omega_{j}=(1,1,1,1)$, $\Omega_{j}=(0.5,1,1,1)$, and $\Omega_{j}=(0,1,1,1)$, $j=1,2,\ldots,5$, respectively; the local shape parameters are the same while the global shape parameter differs, and it can be seen that ω controls the overall shape of the graphs. Figure 9d–f compares curves with the same control polygon but different local shape parameters: the solid line “-” represents the curves with the given local shape parameter set to 1, the dashed line “--” represents the value 0, and the dash-dotted line “-.” represents the value 2. From Figure 9, it can be found that the local shape parameters control the local shape changes of the CSGC–Ball curves, while the global shape parameter controls their overall shape. As the shape parameters change, the curves approach the corresponding control points to different degrees.
According to the G2 smooth splicing condition of the CSGC–Ball curves given by Theorem 2, Figure 10 shows examples of a spatial curve designed with CSGC–Ball curves under overall G2 smooth splicing when N = 3. This CSGC–Ball curve involves 10 variables, including 1 global shape parameter and 9 local shape parameters. Figure 10a–c shows the overall G2 smooth CSGC–Ball curves with shape parameters $\Omega_{j}=(1,1,1,1)$, $\Omega_{j}=(0.5,1,1,1)$, and $\Omega_{j}=(0,1,1,1)$, $j=1,2,3$, respectively. Figure 10d–f displays comparison curves on the same graph for different local shape parameter values. When the shape parameters differ, some control points of the CSGC–Ball curves change accordingly so that the overall G2 smooth splicing condition remains satisfied.

5. Application of HAHA in CSGC–Ball Curve-Shape Optimization

5.1. CSGC–Ball Curve-Shape Optimization Model

The bending energy of a curve approximately reflects its smoothness, and the two are negatively correlated: the smaller the bending energy, the smoother the curve, and vice versa. Therefore, the G1 and G2 continuous shape optimization models of the CSGC–Ball curves can be established according to the minimization of the curve bending energy.
Assuming that $E_{j}(\Omega_{j})$ represents the bending energy of the j-th SGC–Ball curve, its energy is calculated as
$E_{j}(\Omega_{j})=\int_{0}^{1}\left\|P_{j}''\!\left(\dfrac{u-u_{j-1}}{h_{j}};\Omega_{j}\right)\right\|^{2}dt,\quad u\in[u_{j-1},u_{j}],$
where $h_{j}=u_{j}-u_{j-1}$; $\Omega_{j}=(\omega,\lambda_{1,j},\lambda_{2,j},\lambda_{3,j}),\ j\in(1,2,\ldots,N)$ are the shape parameters of the j-th SGC–Ball curve segment; $\omega\in[0,1]$ is the global shape parameter; and $\lambda_{1,j},\lambda_{3,j}\in[-3,3],\ \lambda_{2,j}\in[0,4],\ j\in(1,2,\ldots,N)$ are the local shape parameters.
Therefore, the bending energy E of the combined CSGC–Ball curves can be expressed by Equation (26):
$E=\sum_{j=1}^{N}E_{j}(\Omega_{j})=\sum_{j=1}^{N}\int_{0}^{1}\left\|P_{j}''\!\left(\dfrac{u-u_{j-1}}{h_{j}};\Omega_{j}\right)\right\|^{2}dt,$
where $\Omega_{j}=\{\omega,\lambda_{1,j},\lambda_{2,j},\lambda_{3,j}\}$ are the shape optimization parameters of the CSGC–Ball curves.
Through the energy function of CSGC–Ball curves, the energy minimum shape optimization model of CSGC–Ball curves is expressed by Equation (27):
$\mathop{\arg\min}\limits_{\Omega_{j}}\ E=\sum_{j=1}^{N}E_{j}(\Omega_{j})=\sum_{j=1}^{N}\int_{0}^{1}\left\|P_{j}''\!\left(\dfrac{u-u_{j-1}}{h_{j}};\Omega_{j}\right)\right\|^{2}dt.$
Substituting Equation (22) into Equation (27), the bending energy of the j-th segment of CSGC–Ball curves can be obtained through simple calculation, and its expression is shown in Equation (28):
$\int_{0}^{1}\left\|P_{j}''\!\left(\dfrac{u-u_{j-1}}{h_{j}};\Omega_{j}\right)\right\|^{2}dt=\int_{0}^{1}\left\|\sum_{i=0}^{3}P_{i,j}\,b_{i,3}''\!\left(\dfrac{u-u_{j-1}}{h_{j}}\right)\right\|^{2}dt=a_{0,j}\|P_{0,j}\|^{2}+a_{1,j}\|P_{1,j}\|^{2}+a_{2,j}\|P_{2,j}\|^{2}+a_{3,j}\|P_{3,j}\|^{2}+2a_{4,j}\,P_{0,j}\cdot P_{1,j}+2a_{5,j}\,P_{0,j}\cdot P_{2,j}+2a_{6,j}\,P_{0,j}\cdot P_{3,j}+2a_{7,j}\,P_{1,j}\cdot P_{2,j}+2a_{8,j}\,P_{1,j}\cdot P_{3,j}+2a_{9,j}\,P_{2,j}\cdot P_{3,j},$
where
$a_{0,j}=\bigl(6\omega(\lambda_{1,j}-1)-2\bigr)^{2}+\tfrac{144}{5}\omega^{2}(\lambda_{1,j}-1)^{2}-10\,\omega(\lambda_{1,j}-1)\bigl(6\omega(\lambda_{1,j}-1)-2\bigr);$
$a_{1,j}=\tfrac{24}{5}\lambda_{1,j}^{2}\omega^{2}+\tfrac{8}{5}\lambda_{1,j}\lambda_{2,j}\omega^{2}-\tfrac{64}{5}\lambda_{1,j}\omega^{2}-16\lambda_{1,j}\omega+\tfrac{4}{5}\lambda_{2,j}^{2}\omega^{2}-\tfrac{24}{5}\lambda_{2,j}\omega^{2}+\tfrac{56}{5}\omega^{2}+16\omega+16;$
$a_{2,j}=\tfrac{4}{5}\lambda_{2,j}^{2}\omega^{2}-\tfrac{8}{5}\lambda_{2,j}\lambda_{3,j}\omega^{2}-\tfrac{8}{5}\lambda_{2,j}\omega^{2}+\tfrac{24}{5}\lambda_{3,j}^{2}\omega^{2}-\tfrac{32}{5}\lambda_{3,j}\omega^{2}-16\lambda_{3,j}\omega+\tfrac{24}{5}\omega^{2}+16\omega+16;$
$a_{3,j}=\tfrac{24}{5}\omega^{2}(\lambda_{3,j}-1)^{2}-4\omega(\lambda_{3,j}-1)+4;$
$a_{4,j}=10\lambda_{1,j}\omega-10\omega+\tfrac{56}{5}\lambda_{1,j}\omega^{2}+\tfrac{4}{5}\lambda_{2,j}\omega^{2}-\tfrac{32}{5}\omega^{2}-\tfrac{24}{5}\lambda_{1,j}^{2}\omega^{2}-\tfrac{4}{5}\lambda_{1,j}\lambda_{2,j}\omega^{2}-4;$
$a_{5,j}=\bigl(2-4\lambda_{1,j}+2\lambda_{3,j}\bigr)\omega-\bigl(\tfrac{14}{5}\lambda_{1,j}+\tfrac{4}{5}\lambda_{2,j}+\tfrac{6}{5}\lambda_{3,j}-\tfrac{14}{5}-\tfrac{4}{5}\lambda_{1,j}\lambda_{2,j}-\tfrac{6}{5}\lambda_{1,j}\lambda_{3,j}\bigr)\omega^{2}-4;$
$a_{6,j}=\omega(\lambda_{3,j}-1)\bigl(6\omega(\lambda_{1,j}-1)-2\bigr)-2\omega(\lambda_{1,j}-1)-\tfrac{36}{5}\omega^{2}(\lambda_{1,j}-1)(\lambda_{3,j}-1)+4;$
$a_{7,j}=\bigl(4\lambda_{1,j}-8+4\lambda_{3,j}\bigr)\omega+\bigl(\tfrac{14}{5}\lambda_{1,j}+\tfrac{16}{5}\lambda_{2,j}-\tfrac{2}{5}\lambda_{3,j}-\tfrac{22}{5}-\tfrac{4}{5}\lambda_{2,j}^{2}-\tfrac{4}{5}\lambda_{1,j}\lambda_{2,j}-\tfrac{6}{5}\lambda_{1,j}\lambda_{3,j}+\tfrac{4}{5}\lambda_{2,j}\lambda_{3,j}\bigr)\omega^{2}-8;$
$a_{8,j}=\bigl(2+2\lambda_{1,j}-4\lambda_{3,j}\bigr)\omega-\bigl(\tfrac{6}{5}\lambda_{1,j}-\tfrac{4}{5}\lambda_{2,j}-\tfrac{2}{5}\lambda_{3,j}+\tfrac{2}{5}-\tfrac{6}{5}\lambda_{1,j}\lambda_{3,j}+\tfrac{4}{5}\lambda_{2,j}\lambda_{3,j}\bigr)\omega^{2}-4;$
$a_{9,j}=10\lambda_{3,j}\omega-10\omega-\tfrac{4}{5}\lambda_{2,j}\omega^{2}+8\lambda_{3,j}\omega^{2}-\tfrac{16}{5}\omega^{2}-\tfrac{24}{5}\lambda_{3,j}^{2}\omega^{2}+\tfrac{4}{5}\lambda_{2,j}\lambda_{3,j}\omega^{2}-4.$
Combined with Equations (27) and (28), the shape optimization model of CSGC–Ball curves can be written as
$\mathop{\arg\min}\limits_{\Omega_{j}}\ E=\sum_{j=1}^{N}\Bigl(a_{0,j}\|P_{0,j}\|^{2}+a_{1,j}\|P_{1,j}\|^{2}+a_{2,j}\|P_{2,j}\|^{2}+a_{3,j}\|P_{3,j}\|^{2}+2a_{4,j}\,P_{0,j}\cdot P_{1,j}+2a_{5,j}\,P_{0,j}\cdot P_{2,j}+2a_{6,j}\,P_{0,j}\cdot P_{3,j}+2a_{7,j}\,P_{1,j}\cdot P_{2,j}+2a_{8,j}\,P_{1,j}\cdot P_{3,j}+2a_{9,j}\,P_{2,j}\cdot P_{3,j}\Bigr),$
and the constraint conditions of the whole G1 and G2 continuous are Equation (23) and Equation (24), respectively.
Due to the high nonlinearity of the objective function, it is not an easy task to solve the established optimization model using traditional optimization methods. Therefore, the objective function of the CSGC–Ball curve-shape optimization models is regarded as the fitness function, and the proposed HAHA algorithm can be used to obtain the energy optimal solution of the established optimization models.
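As a cross-check of the optimization model, the bending-energy fitness can also be approximated numerically; the sketch below (Python, with the basis of Equation (20) as transcribed here) integrates ||P″(t)||² by central second differences and serves as a stand-in for the closed form of Equation (28):

```python
def sgc_ball_basis(t, w, l1, l2, l3):
    # SGC-Ball basis of Eq. (20)
    return ((1 + (l1 - 1) * w * t * (1 - t)) * (1 - t) ** 2,
            (2 + w * (1 - l1) + w * (l1 + l2 - 3) * t) * t * (1 - t) ** 2,
            (2 + w * (2 - l2) - w * (l3 - l2 + 1) * t) * t ** 2 * (1 - t),
            (1 + (l3 - 1) * w * t * (1 - t)) * t ** 2)

def bending_energy(ctrl, w, l1, l2, l3, n=2000):
    # approximate E_j = int_0^1 ||P''(t)||^2 dt with central second
    # differences on a uniform grid (one planar curve segment)
    h = 1.0 / n

    def point(t):
        basis = sgc_ball_basis(t, w, l1, l2, l3)
        return [sum(b * p[d] for b, p in zip(basis, ctrl)) for d in range(2)]

    total = 0.0
    for i in range(1, n):
        t = i * h
        pm, p0, pp = point(t - h), point(t), point(t + h)
        dd = [(pm[d] - 2.0 * p0[d] + pp[d]) / h ** 2 for d in range(2)]
        total += (dd[0] ** 2 + dd[1] ** 2) * h
    return total
```

For ω = 0 the basis reduces to the classical cubic Ball basis, and for the collinear control points (0,0), (1,0), (2,0), (3,0) the exact energy is ∫(6 − 12t)² dt = 12, which the quadrature reproduces up to its small boundary error; a degenerate segment with coincident control points has zero energy.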

5.2. Steps for HAHA to Solve the CSGC–Ball Curve-Shape Optimization Model

This subsection will introduce the detailed steps to solve the established CSGC–Ball curve-shape optimization model with the proposed HAHA, which are described as follows:
Step 1: Set relevant parameters, for example, n, T, Ub, Lb, and the CSGC–Ball curve control points;
Step 2: Initialization. Randomly initialize the hummingbird population by Equation (1) when t = 1, obtain the positions of n hummingbirds, take the bending energy value E of the CSGC–Ball curves as the fitness function, calculate E of each individual, record the best fitness value as the problem optimal solution Ebest, and initialize the visit table;
Step 3: If rand > 0.5, perform Step 5 and Step 6; otherwise, use Equation (6) to obtain the candidate solution vi(t + 1) of guided foraging, and obtain the elite opposition-based solution xi,elite(t + 1) by Equation (10); if E(xi,elite(t + 1)) < E(vi(t + 1)), then vi(t + 1) = xi,elite(t + 1);
Step 4: If E (vi (t + 1)) < E (xi (t)), then xi (t + 1) = vi (t + 1), and update the visit table;
Step 5: Use Equation (8) to execute the territorial foraging strategy of hummingbirds to obtain candidate solutions vi(t + 1), and obtain solutions vi,p(t + 1) by Equation (11); if E(vi,p(t + 1)) < E(vi(t + 1)), then vi(t + 1) = vi,p(t + 1);
Step 6: If E (vi (t + 1)) < E (xi (t)), then xi (t + 1) = vi (t + 1), and update the visit table;
Step 7: If mod(t, 2n) == 0, then the solution with the largest energy value performs migration foraging by Equation (9) to obtain the random solution xwor(t + 1), and Equation (16) applies Cauchy mutation to it to obtain the mutated solution xcauchy(t + 1); if E(xcauchy(t + 1)) < E(xwor(t + 1)), then xwor(t + 1) = xcauchy(t + 1), and the visit table is updated. Otherwise, proceed to Step 8;
Step 8: t = t + 1, if t < T, then return to Step 3, otherwise execute Step 9;
Step 9: Output the energy best value Ebest of the established CSGC–Ball curves and the corresponding shape parameter values.

5.3. Numerical Examples

In order to demonstrate the effectiveness and superiority of the proposed HAHA in solving the established CSGC–Ball curve-shape optimization model, this section gives some representative numerical examples in which the established optimization model is solved by HAHA and other advanced algorithms, and the results are compared. In all numerical examples, the algorithm parameters are shown in Table 2, the population size is 50, and the maximum number of iterations is 1000.
Example 5.1 This numerical example presents the “letter W” graph designed with complex CSGC–Ball curves that satisfy the overall G2 smooth splicing condition. The shape is composed of eight SGC–Ball curve segments spliced with G2 smoothness; different colors represent different SGC–Ball curves, and the black lines are auxiliary lines. The convergence curves obtained when the objective function of the optimization model converges to the optimal value are also provided. In this example, for the CSGC–Ball curves with overall G2 smooth splicing, it is only necessary to give the coordinates of the control points $P_{0,0},P_{0,1},P_{0,2},P_{0,3},P_{1,3},P_{2,3},P_{3,3}$ and $P_{4,0},P_{4,1},P_{4,2},P_{4,3},P_{5,3},P_{6,3},P_{7,3}$; the remaining control vertices of the curves to be spliced can be calculated from the G2 smooth splicing condition and the coordinates of the known control vertices.
In this example, a total of 25 shape parameters need to be optimized, including 1 global shape parameter and 24 local shape parameters. Figure 11 shows the CSGC–Ball curves and the energy convergence diagrams obtained by solving the established shape optimization model with HAHA and other optimization algorithms. Figure 11a,b shows the “letter W”-shaped CSGC–Ball curves with overall G2 smooth splicing for freely given shape parameter values. Figure 11c–h, respectively, describe the minimum-energy CSGC–Ball curves with overall G2 smooth splicing obtained after optimization by PSO, WOA, SCA, HHO, GWO, and HAHA. Figure 11i shows the energy convergence diagrams of each algorithm when solving the G2 smooth splicing shape optimization model; the proposed HAHA solves the model with the highest convergence accuracy.
Appendix C, Table A2 shows the optimal shape parameters and minimum energy values obtained by the corresponding intelligent algorithms when solving the overall G2 smooth splicing shape optimization model. This proves that the proposed HAHA is more competitive than the other optimization algorithms in solving the optimization model of the CSGC–Ball curves satisfying the G2 smooth splicing condition: the final result attains the minimum energy value of 41.7970 and hence the smoothest graphics.
Example 5.2 This example gives, in graphical form, the “snail on grass” diagram designed with complex CSGC–Ball curves under hybrid G0, G1, and G2 smooth splicing, together with the convergence curves of the optimization model. Different colors represent different curves; the graph is composed of 29 SGC–Ball curve segments involving 88 shape optimization parameters, including 1 global shape parameter and 87 local shape parameters. Using PSO, WOA, SCA, HHO, SMA (Slime Mould Algorithm) [77], and HAHA to solve the shape optimization model, the ideal optimal shape of the CSGC–Ball curves satisfying the mixed G0, G1, and G2 smooth splicing can be obtained.
Figure 12 shows the minimum-energy CSGC–Ball curves with hybrid G0, G1, and G2 smooth splicing and the corresponding energy convergence diagrams obtained by solving the established curve-shape optimization model with HAHA and the other advanced optimization algorithms. Figure 12a,b shows the graphs constructed from the CSGC–Ball curves with mixed G0, G1, and G2 smooth splicing for freely given shape parameter values. Figure 12c–h shows the minimum-energy CSGC–Ball curves with mixed G0, G1, and G2 smooth splicing obtained by the different optimization algorithms. Figure 12i shows the energy convergence diagram of each algorithm when solving the hybrid G0, G1, and G2 smooth splicing shape optimization model: when the number of iterations reaches 200, the energy value of the model solved by HAHA becomes stable, and compared with the other algorithms, HAHA has the highest convergence accuracy.
Appendix D, Table A3 shows the optimal shape parameter values and minimum energy values of the graphs designed with the mixed G0, G1, and G2 smoothly spliced CSGC–Ball curves obtained by each algorithm. Among all the algorithms, the CSGC–Ball curves with mixed G0, G1, and G2 smooth splicing obtained by the proposed HAHA are the smoothest, with an energy value of 182.437, which fully demonstrates the effectiveness of HAHA in solving the CSGC–Ball curve-shape optimization model.

6. Conclusions and Future Research

In this paper, complex CSGC–Ball curves with global and local shape parameters are constructed based on the SGC–Ball basis functions, and the geometric conditions for G1 and G2 continuous splicing between adjacent SGC–Ball curves are derived. The constructed CSGC–Ball curves can not only represent more complex geometric product shapes in practice but also allow the overall or local shape of the curves to be adjusted flexibly by changing the corresponding shape parameters, giving the curves higher shape adjustability.
In addition, a novel improved algorithm, HAHA, is proposed, which combines elite opposition-based learning (EOL), PSO, and Cauchy mutation with the original AHA. The EOL strategy better balances exploration and exploitation and strengthens the search ability of the algorithm. In the exploitation stage, the PSO strategy accelerates convergence and further improves optimization accuracy. The Cauchy mutation increases population diversity and improves the ability of the algorithm to escape local optima. To evaluate the overall performance of HAHA, it is compared with other advanced intelligent algorithms on 25 benchmark functions and the CEC 2022 test suite; the experimental results verify that the proposed HAHA is superior and competitive in solving global optimization problems.
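To make the two diversity-oriented strategies above concrete, the following sketch shows one common formulation of elite opposition-based learning and a Cauchy mutation step. The helper names, the elite-set size, and the exact update details are illustrative assumptions; the precise equations used inside HAHA may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def elite_opposition(pop, fitness, lb, ub, n_elite=5):
    """Elite opposition-based learning (one common formulation).

    Reflect each elite solution through the dynamic bounds [a, b] of the
    current elite set: x* = a + b - x, then clip to the search range.
    """
    elite = pop[np.argsort(fitness)[:n_elite]]
    a = elite.min(axis=0)                 # dynamic lower bound per dimension
    b = elite.max(axis=0)                 # dynamic upper bound per dimension
    opposite = a + b - elite              # opposition point of each elite
    return np.clip(opposite, lb, ub)

def cauchy_mutation(x, scale=1.0):
    """Perturb a solution with heavy-tailed Cauchy noise.

    The long tails occasionally produce large jumps, which helps the
    population escape local optima.
    """
    return x + scale * rng.standard_cauchy(x.shape)
```

In an improved-AHA loop, the opposition candidates would be evaluated and kept only if they improve on the originals (greedy selection), and the Cauchy mutation would typically be applied to the current best solution.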
Finally, based on the minimum bending energy of the curves, the CSGC–Ball curve-shape optimization models are established, and the specific steps for applying HAHA to solve them are given. Two representative numerical examples verify the effectiveness of HAHA in solving the CSGC–Ball curve-shape optimization models. It is worth noting, however, that the proposed HAHA is advantageous and competitive for optimization problems with continuous variables but has certain limitations in non-continuous decision spaces. In future research, the proposed HAHA can be applied to optimization problems in feature selection, image segmentation, and machine learning. In addition, we will consider extending the research technique of combined SGC–Ball interpolation curves to the CQGS–Ball surfaces in [78] and utilizing the intelligent algorithms in [79,80,81] to investigate the shape optimization problem of the surfaces.

Author Contributions

Conceptualization, K.C. and G.H.; Methodology, K.C., L.C. and G.H.; Software, K.C. and L.C.; Validation, K.C.; Formal analysis, L.C. and G.H.; Investigation, K.C., L.C. and G.H.; Resources, G.H.; Data curation, K.C. and L.C.; Writing—Original draft, K.C., L.C. and G.H.; Writing—Review and editing, K.C., L.C. and G.H.; Visualization, K.C. and L.C.; Supervision, L.C.; Project administration, K.C. and G.H.; Funding acquisition, K.C. and G.H. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Natural Science Foundation of China (Grant No. 51875454).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data generated or analyzed during the study are included in this published article.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Twenty-Five Benchmark Functions

Table A1. The 25 benchmark functions.
| Function Type | Function Name | Dim | Search Range | Optimal Value |
| --- | --- | --- | --- | --- |
| Uni-modal functions | F1: Shifted and Rotated Bent Cigar Function (CEC 2017 F1) | 30 | [−100, 100] | 100 |
| Multi-modal functions | F2: Shifted and Rotated Rastrigin’s Function (CEC 2014 F9) | 30 | [−100, 100] | 900 |
| | F3: Shifted and Rotated Expanded Scaffer’s F6 Function (CEC 2014 F16) | 30 | [−100, 100] | 1600 |
| | F4: Shifted and Rotated Rastrigin’s Function (CEC 2017 F5) | 30 | [−100, 100] | 500 |
| | F5: Shifted and Rotated Non-Continuous Rastrigin’s Function (CEC 2017 F8) | 30 | [−100, 100] | 800 |
| Hybrid functions | F6: Hybrid Function 1 (N = 3) (CEC 2017 F11) | 30 | [−100, 100] | 1100 |
| | F7: Hybrid Function 2 (N = 4) (CEC 2017 F14) | 30 | [−100, 100] | 1400 |
| | F8: Hybrid Function 3 (N = 4) (CEC 2014 F20) | 30 | [−100, 100] | 2000 |
| | F9: Hybrid Function 4 (N = 5) (CEC 2017 F18) | 30 | [−100, 100] | 1800 |
| | F10: Hybrid Function 5 (N = 5) (CEC 2014 F21) | 30 | [−100, 100] | 2100 |
| Composition functions | F11: Composition Function 1 (N = 3) (CEC 2017 F21) | 30 | [−100, 100] | 2100 |
| | F12: Composition Function 2 (N = 4) (CEC 2017 F23) | 30 | [−100, 100] | 2300 |
| | F13: Composition Function 3 (N = 5) (CEC 2017 F25) | 30 | [−100, 100] | 2500 |
| | F14: Composition Function 4 (N = 6) (CEC 2017 F27) | 30 | [−100, 100] | 2700 |
| | F15: Composition Function 5 (N = 3) (CEC 2017 F29) | 30 | [−100, 100] | 2900 |
| Fixed-dimension functions | F16: Storn’s Chebyshev Polynomial Fitting Problem (CEC 2019 F1) | 9 | [−8192, 8192] | 1 |
| | F17: Inverse Hilbert Matrix Problem (CEC 2019 F2) | 16 | [−16384, 16384] | 1 |
| | F18: Lennard-Jones Minimum Energy Cluster (CEC 2019 F3) | 18 | [−4, 4] | 1 |
| | F19: Rastrigin’s Function (CEC 2019 F4) | 10 | [−100, 100] | 1 |
| | F20: Griewangk’s Function (CEC 2019 F5) | 10 | [−100, 100] | 1 |
| | F21: Weierstrass Function (CEC 2019 F6) | 10 | [−100, 100] | 1 |
| | F22: Modified Schwefel’s Function (CEC 2019 F7) | 10 | [−100, 100] | 1 |
| | F23: Expanded Schaffer’s F6 Function (CEC 2019 F8) | 10 | [−100, 100] | 1 |
| | F24: Happy Cat Function (CEC 2019 F9) | 10 | [−100, 100] | 1 |
| | F25: Ackley Function (CEC 2019 F10) | 10 | [−100, 100] | 1 |

Appendix B. Proof of Theorems in Section 4

Appendix B.1. Proof of Theorem 1

Proof. 
If the j-th segment and the (j + 1)-th segment of the CSGC–Ball curves meet the G1 continuity condition at the connection point u_j, then G0 continuity must hold first, that is,

$$P_{3,j} = P_{0,j+1}. \tag{A1}$$

Furthermore, the two curves should have the same unit tangent vector at the node u_j, that is,

$$k\,P'(u_j^-) = P'(u_j^+), \quad k > 0. \tag{A2}$$

From the endpoint properties of the SGC–Ball curves, it is known that

$$P'(u_j^-) = \frac{1}{h_j}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)\,(P_{3,j}-P_{2,j}), \qquad P'(u_j^+) = \frac{1}{h_{j+1}}\bigl(2-\omega(\lambda_{1,j+1}-1)\bigr)\,(P_{1,j+1}-P_{0,j+1}). \tag{A3}$$

Substituting Equation (A3) into Equation (A2) and rearranging, we obtain

$$P_{1,j+1} = \frac{k\,h_{j+1}\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{h_j\bigl(2-\omega(\lambda_{1,j+1}-1)\bigr)}\,(P_{3,j}-P_{2,j}) + P_{0,j+1}, \tag{A4}$$

where k > 0 is an arbitrary constant. Theorem 1 is proved. □
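Theorem 1 translates directly into a constructive rule: given the last two control points of segment j, the first inner control point of segment j + 1 is determined up to the free constant k > 0. A small numerical sketch follows; the function name and the planar (2D) setting are illustrative.

```python
import numpy as np

def g1_inner_point(P3j, P2j, omega, lam3_j, lam1_next, hj, hnext, k=1.0):
    """Compute P_{1,j+1} from Theorem 1 (Equation (A4)).

    Segment j+1 then joins segment j with G1 continuity at the node;
    P_{0,j+1} = P_{3,j} by the G0 condition (Equation (A1)).
    """
    P0_next = np.asarray(P3j, dtype=float)            # G0 condition (A1)
    num = k * hnext * (2.0 - omega * (lam3_j - 1.0))
    den = hj * (2.0 - omega * (lam1_next - 1.0))
    return P0_next + (num / den) * (P0_next - np.asarray(P2j, dtype=float))
```

By construction, the resulting tangent P_{1,j+1} − P_{0,j+1} is a positive multiple of P_{3,j} − P_{2,j}, so the two segments share a tangent direction at the joint.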

Appendix B.2. Proof of Theorem 2

Proof. 
If the j-th segment and the (j + 1)-th segment of the CSGC–Ball curves meet the G2 continuity condition at the node u_j, then the G1 continuity condition, which is given by Theorem 1, must first be satisfied.
Let D_1 and D_2 be the subnormal vectors of the j-th and the (j + 1)-th SGC–Ball curve segments at the node u_j, respectively. Then

$$D_1 = P'(u_j^-)\times P''(u_j^-), \qquad D_2 = P'(u_j^+)\times P''(u_j^+). \tag{A5}$$

To satisfy G2 continuity, D_1 and D_2 must have the same direction at the node u_j. Since P'(u_j^-) and P'(u_j^+) are collinear by Equation (A2), D_1 and D_2 have the same direction at u_j provided that

$$P''(u_j^+) = \alpha\,P''(u_j^-) + \beta\,P'(u_j^-), \tag{A6}$$

where α and β are arbitrary constants with α > 0.
Let k(u_j^-) and k(u_j^+) be the curvatures of the j-th and the (j + 1)-th segments at the junction u_j, respectively. According to the curvature formula, we have

$$k(u_j^-) = \frac{\bigl|P'(u_j^-)\times P''(u_j^-)\bigr|}{\bigl|P'(u_j^-)\bigr|^3}, \qquad k(u_j^+) = \frac{\bigl|P'(u_j^+)\times P''(u_j^+)\bigr|}{\bigl|P'(u_j^+)\bigr|^3}, \tag{A7}$$

which must be equal if the G2 continuity condition is satisfied. Substituting Equations (A2) and (A6) into Equation (A7) gives

$$k(u_j^+) = \frac{\bigl|P'(u_j^+)\times P''(u_j^+)\bigr|}{\bigl|P'(u_j^+)\bigr|^3} = \frac{\bigl|k\,P'(u_j^-)\times\bigl(\alpha P''(u_j^-)+\beta P'(u_j^-)\bigr)\bigr|}{\bigl|k\,P'(u_j^-)\bigr|^3} = \frac{\alpha\bigl|P'(u_j^-)\times P''(u_j^-)\bigr| + \beta\bigl|P'(u_j^-)\times P'(u_j^-)\bigr|}{k^2\bigl|P'(u_j^-)\bigr|^3} = \frac{\alpha\bigl|P'(u_j^-)\times P''(u_j^-)\bigr|}{k^2\bigl|P'(u_j^-)\bigr|^3}. \tag{A8}$$

From Equation (A8), when α = k², we have k(u_j^-) = k(u_j^+). Equation (A6) can then be written as

$$P''(u_j^+) = k^2\,P''(u_j^-) + \beta\,P'(u_j^-). \tag{A9}$$

From the endpoint properties of the SGC–Ball curves, the second derivatives of the j-th and the (j + 1)-th segments at the junction u_j can be written as

$$\begin{aligned} P''(u_j^-) &= \frac{2}{h_j^2}P_{0,j} + \frac{2}{h_j^2}\bigl(\omega(\lambda_{2,j}-2)+2\bigr)P_{1,j} + \frac{2}{h_j^2}\bigl(\omega(3\lambda_{3,j}-\lambda_{2,j}-1)-4\bigr)P_{2,j} - \frac{2}{h_j^2}\bigl(3\omega(\lambda_{3,j}-1)-1\bigr)P_{3,j},\\ P''(u_j^+) &= -\frac{2}{h_{j+1}^2}\bigl(3\omega(\lambda_{1,j+1}-1)-1\bigr)P_{0,j+1} + \frac{2}{h_{j+1}^2}\bigl(\omega(3\lambda_{1,j+1}+\lambda_{2,j+1}-5)-4\bigr)P_{1,j+1} - \frac{2}{h_{j+1}^2}\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)P_{2,j+1} + \frac{2}{h_{j+1}^2}P_{3,j+1}. \end{aligned} \tag{A10}$$

By substituting Equations (A3) and (A10) into Equation (A9) and solving for P_{2,j+1}, the following formula is obtained:

$$\begin{aligned} P_{2,j+1} ={}& \frac{2k^2 h_{j+1}^2\bigl(3\omega(\lambda_{3,j}-1)-1\bigr) - \beta h_j h_{j+1}^2\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{2h_j^2\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{3,j}\\ &+ \frac{-2k^2 h_{j+1}^2\bigl(\omega(3\lambda_{3,j}-\lambda_{2,j}-1)-4\bigr) + \beta h_j h_{j+1}^2\bigl(2-\omega(\lambda_{3,j}-1)\bigr)}{2h_j^2\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{2,j}\\ &- \frac{k^2 h_{j+1}^2\bigl(\omega(\lambda_{2,j}-2)+2\bigr)}{h_j^2\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{1,j} - \frac{k^2 h_{j+1}^2}{h_j^2\bigl(\omega(\lambda_{2,j+1}-2)-2\bigr)}\,P_{0,j}\\ &- \frac{3\omega(\lambda_{1,j+1}-1)-1}{\omega(\lambda_{2,j+1}-2)-2}\,P_{0,j+1} + \frac{\omega(3\lambda_{1,j+1}+\lambda_{2,j+1}-5)-4}{\omega(\lambda_{2,j+1}-2)-2}\,P_{1,j+1} + \frac{1}{\omega(\lambda_{2,j+1}-2)-2}\,P_{3,j+1}, \end{aligned} \tag{A11}$$

where k > 0 and β is an arbitrary constant. If Equations (A1), (A4), and (A11) are satisfied, then the j-th and the (j + 1)-th SGC–Ball curve segments are G2 continuous at u_j. Theorem 2 is proved. □
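The key step of the proof, that choosing α = k² in Equation (A6) equalizes the curvatures on both sides of the joint, can be checked numerically for planar curves. The derivative values below are arbitrary illustrative data, not values from the paper's examples.

```python
import numpy as np

def curvature_2d(d1, d2):
    """Planar curvature |P' x P''| / |P'|^3 from first and second derivatives."""
    cross = abs(d1[0] * d2[1] - d1[1] * d2[0])
    return cross / np.linalg.norm(d1) ** 3

# Derivatives of segment j at the joint (illustrative values).
d1_minus = np.array([1.0, 0.5])
d2_minus = np.array([-0.3, 2.0])
k, beta = 1.7, 0.4                                  # k > 0, beta arbitrary

d1_plus = k * d1_minus                              # Equation (A2)
d2_plus = k**2 * d2_minus + beta * d1_minus         # Equation (A9), alpha = k^2
```

Because |kP′ × (k²P″ + βP′)| = k³|P′ × P″| and |kP′|³ = k³|P′|³, the two curvatures agree exactly, independently of β.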

Appendix C

Table A2. Optimal shape parameters and energy values of the CSGC–Ball curves of the overall G2 splicing.
Each block below lists, for one algorithm, the optimal global shape parameter ω, the minimum energy E, and the optimal local shape parameters λ1,j, λ2,j, λ3,j for j = 1, …, 8.

PSO: ω = 0.64887919, E = 66.2849

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 1.15185 | 1.37390 | 1.32772 | 1.55349 | 1.11589 | 1.29393 | 1.46232 | 1.54878 |
| λ2,j | 2.06512 | 1.96168 | 2.05905 | 1.80837 | 1.98417 | 1.64995 | 2.36528 | 1.96000 |
| λ3,j | 1.32390 | 1.28802 | 1.47045 | 1.45026 | 1.32576 | 1.06542 | 1.46548 | 1.16265 |

WOA: ω = 0.61246533, E = 42.3460

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 1.86538 | 2.53608 | 2.80315 | 2.74539 | 1.97669 | 2.50878 | 2.73656 | 2.61410 |
| λ2,j | 2.82767 | 2.10635 | 2.18380 | 2.26413 | 2.65508 | 2.20090 | 2.70274 | 2.71869 |
| λ3,j | 2.47729 | 2.63008 | 2.90391 | 2.23474 | 2.48893 | 2.59616 | 2.97236 | 2.24699 |

SCA: ω = 0.44943302, E = 60.6286

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 1.53812 | 1.21378 | 2.25093 | 1.89868 | 1.72244 | 1.40985 | 2.50022 | 2.24194 |
| λ2,j | 0.28967 | 0.22890 | 0.34591 | 0.12716 | 0.19957 | 0.37379 | 0.60330 | 0.37674 |
| λ3,j | 2.09452 | 2.25425 | 2.32298 | 2.21089 | 1.27993 | 2.21361 | 2.43706 | 1.85464 |

HHO: ω = 0.58330802, E = 42.8444

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.26112 | 2.66245 | 2.77293 | 2.75090 | 2.27722 | 2.72139 | 2.79630 | 2.64388 |
| λ2,j | 2.10014 | 2.04252 | 2.10340 | 2.23652 | 2.31288 | 2.04330 | 2.46973 | 2.39952 |
| λ3,j | 2.59681 | 2.71575 | 2.89981 | 2.36353 | 2.69098 | 2.67674 | 2.94551 | 2.39238 |

GWO: ω = 0.59802310, E = 42.9182

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 1.89364 | 2.73443 | 2.87954 | 2.83924 | 1.94839 | 2.80668 | 2.77928 | 2.63251 |
| λ2,j | 0.78809 | 0.87901 | 1.71278 | 1.69333 | 0.90586 | 0.45850 | 2.82386 | 0.49710 |
| λ3,j | 2.45587 | 2.61126 | 2.90566 | 2.20632 | 2.42842 | 2.42752 | 2.99789 | 2.28221 |

HAHA: ω = 0.77204510, E = 41.7970

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 1.72569 | 2.47537 | 2.45510 | 2.42984 | 1.76746 | 2.45557 | 2.36967 | 2.30659 |
| λ2,j | 1.96822 | 1.67667 | 2.11353 | 1.88233 | 2.03359 | 1.31535 | 2.71346 | 1.80763 |
| λ3,j | 2.23051 | 2.21689 | 2.53882 | 1.94375 | 2.27044 | 2.11102 | 2.64168 | 1.98074 |

Appendix D

Table A3. Optimal shape parameters and energy values of the CSGC–Ball curves of mixed G0, G1, and G2 splicing.
Each block below lists, for one algorithm, the optimal global shape parameter ω, the minimum energy E, and the optimal local shape parameters λ1,j, λ2,j, λ3,j for j = 1, …, 29.

PSO: ω = 8.165946 × 10^−17, E = 379.882

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 0.01199 | 0.68281 | 0.37140 | −0.1024 | −0.3511 | 0.35437 | −0.3069 | −0.0002 |
| λ2,j | 1.90894 | 2.26708 | 2.15170 | 1.94534 | 1.88223 | 2.14267 | 1.77331 | 2.11611 |
| λ3,j | −0.1358 | 0.12068 | −0.1542 | 0.30099 | 0.33231 | 0.35239 | 0.12931 | −0.1755 |

| | j = 9 | j = 10 | j = 11 | j = 12 | j = 13 | j = 14 | j = 15 | j = 16 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | −0.2375 | −0.1540 | −0.0842 | −0.0511 | 0.46105 | 0.25110 | 0.01922 | −0.1765 |
| λ2,j | 2.21089 | 1.87741 | 1.84059 | 1.92628 | 1.63868 | 2.20693 | 2.11329 | 1.87746 |
| λ3,j | 0.10939 | −0.1249 | −0.0502 | 0.24929 | −0.1074 | −0.0390 | 0.08777 | −0.1168 |

| | j = 17 | j = 18 | j = 19 | j = 20 | j = 21 | j = 22 | j = 23 | j = 24 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 0.04473 | −0.1061 | −0.0101 | 0.22035 | −0.2502 | 0.15433 | 0.17569 | −0.0559 |
| λ2,j | 1.99198 | 2.12004 | 2.12426 | 1.88229 | 1.90081 | 1.89130 | 1.87136 | 1.81734 |
| λ3,j | −0.0062 | 0.14957 | 0.17624 | 0.03901 | 0.01131 | −0.1524 | −0.1873 | 0.07681 |

| | j = 25 | j = 26 | j = 27 | j = 28 | j = 29 |
| --- | --- | --- | --- | --- | --- |
| λ1,j | 0.22829 | −0.2374 | 0.02877 | 0.18938 | 0.08976 |
| λ2,j | 1.96420 | 2.33591 | 2.08569 | 1.93850 | 1.98105 |
| λ3,j | 0.04001 | −0.1362 | −0.0880 | −0.1776 | 0.22704 |

WOA: ω = 0.56143072, E = 184.656

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.85218 | 2.87138 | 2.82369 | 2.89281 | 2.98078 | 2.83488 | 2.88864 | 2.86913 |
| λ2,j | 2.35992 | 2.05760 | 2.36563 | 1.84918 | 2.34436 | 2.21655 | 2.30250 | 2.16827 |
| λ3,j | 2.89427 | 2.90698 | 2.93313 | 2.81194 | 2.91938 | 2.87946 | 2.87397 | 2.86274 |

| | j = 9 | j = 10 | j = 11 | j = 12 | j = 13 | j = 14 | j = 15 | j = 16 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.87848 | 2.87460 | 2.84069 | 2.87111 | 2.86491 | 2.86395 | 2.86615 | 2.87630 |
| λ2,j | 2.16549 | 2.21958 | 2.33277 | 2.16387 | 1.82327 | 2.08353 | 2.14077 | 1.87726 |
| λ3,j | 2.85025 | 2.85099 | 2.89884 | 2.81336 | 2.85745 | 2.89370 | 2.89365 | 2.89385 |

| | j = 17 | j = 18 | j = 19 | j = 20 | j = 21 | j = 22 | j = 23 | j = 24 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.84521 | 2.85292 | 2.78854 | 2.84806 | 2.85620 | 2.82256 | 2.82369 | 2.81984 |
| λ2,j | 2.20087 | 2.39431 | 2.24645 | 2.64762 | 2.45415 | 2.27135 | 2.58449 | 2.39753 |
| λ3,j | 2.87404 | 2.88912 | 2.77618 | 2.85626 | 2.86230 | 2.89564 | 2.87897 | 2.87074 |

| | j = 25 | j = 26 | j = 27 | j = 28 | j = 29 |
| --- | --- | --- | --- | --- | --- |
| λ1,j | 2.78441 | 2.86354 | 2.86487 | 2.84957 | 2.85947 |
| λ2,j | 2.59021 | 2.35955 | 2.00985 | 2.34287 | 2.20680 |
| λ3,j | 2.84430 | 2.91575 | 2.87652 | 2.92195 | 2.86763 |

SCA: ω = 0.01240168, E = 378.441

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | −1.0570 | −0.3385 | 0.23026 | 0.08677 | 0.60017 | −0.1256 | −0.5899 | −0.0320 |
| λ2,j | 2.15431 | 1.46926 | 1.64490 | 1.50380 | 2.03020 | 1.77393 | 1.23516 | 1.80551 |
| λ3,j | 0.52234 | 0.12062 | −0.4202 | −0.2678 | −0.4550 | −0.0387 | −0.2334 | −0.0960 |

| | j = 9 | j = 10 | j = 11 | j = 12 | j = 13 | j = 14 | j = 15 | j = 16 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 0.45369 | 0.01261 | −0.4439 | −0.1557 | −0.0764 | −0.0760 | −0.3289 | 0.17971 |
| λ2,j | 1.50170 | 1.45567 | 1.64068 | 1.91496 | 1.93461 | 2.28278 | 1.55732 | 1.72778 |
| λ3,j | −0.2907 | −0.2423 | 0.01349 | −0.3109 | −0.1890 | −0.2911 | −0.0978 | 0.77774 |

| | j = 17 | j = 18 | j = 19 | j = 20 | j = 21 | j = 22 | j = 23 | j = 24 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 0.49273 | 0.25800 | −0.1537 | −0.1686 | 0.12123 | −0.4168 | 0.51629 | 0.29765 |
| λ2,j | 1.49014 | 2.09171 | 1.38876 | 1.95450 | 1.95912 | 1.58009 | 1.86773 | 1.87182 |
| λ3,j | −0.0568 | 0.03448 | −0.1341 | 0.47338 | −0.8309 | −0.2569 | 0.12387 | −0.7595 |

| | j = 25 | j = 26 | j = 27 | j = 28 | j = 29 |
| --- | --- | --- | --- | --- | --- |
| λ1,j | 0.06529 | 0.53273 | 0.11868 | 0.01192 | −0.1921 |
| λ2,j | 1.53690 | 2.10757 | 1.98951 | 1.84692 | 2.61408 |
| λ3,j | −0.2140 | −0.2751 | 0.63215 | −0.3338 | −0.5503 |

HHO: ω = 0.53141245, E = 184.025

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.98076 | 2.98442 | 2.98010 | 2.97835 | 2.99602 | 2.98160 | 2.98318 | 2.97598 |
| λ2,j | 1.98222 | 2.00455 | 2.07102 | 2.00812 | 2.03681 | 1.99617 | 2.02540 | 1.98957 |
| λ3,j | 2.98414 | 2.99028 | 2.98943 | 2.98353 | 2.97914 | 2.98232 | 2.98244 | 2.98264 |

| | j = 9 | j = 10 | j = 11 | j = 12 | j = 13 | j = 14 | j = 15 | j = 16 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.98423 | 2.97553 | 2.97932 | 2.98584 | 2.98094 | 2.98193 | 2.98142 | 2.98279 |
| λ2,j | 1.92067 | 1.96532 | 2.05272 | 1.94200 | 1.98074 | 2.02657 | 2.03130 | 1.92533 |
| λ3,j | 2.97530 | 2.97999 | 2.98822 | 2.96541 | 2.97992 | 2.98762 | 2.98329 | 2.98282 |

| | j = 17 | j = 18 | j = 19 | j = 20 | j = 21 | j = 22 | j = 23 | j = 24 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.97561 | 2.98808 | 2.97989 | 2.98418 | 2.97869 | 2.98229 | 2.97976 | 2.98119 |
| λ2,j | 2.05583 | 2.06593 | 2.02464 | 1.99876 | 1.95430 | 1.96503 | 1.98384 | 1.95259 |
| λ3,j | 2.98265 | 2.98353 | 2.98228 | 2.98389 | 2.98242 | 2.98294 | 2.98386 | 2.98094 |

| | j = 25 | j = 26 | j = 27 | j = 28 | j = 29 |
| --- | --- | --- | --- | --- | --- |
| λ1,j | 2.98291 | 2.98321 | 2.98492 | 2.98132 | 2.97698 |
| λ2,j | 1.99819 | 1.99420 | 2.01433 | 1.95868 | 2.06611 |
| λ3,j | 2.97534 | 2.98722 | 2.97132 | 2.98569 | 2.98160 |

SMA: ω = 0.85577335, E = 208.323; the optimized parameters are identical across all segments, with λ1,j = λ3,j = 2.13464 and λ2,j = 3.42309 for all j = 1, …, 29.

HAHA: ω = 0.75838681, E = 182.437

| | j = 1 | j = 2 | j = 3 | j = 4 | j = 5 | j = 6 | j = 7 | j = 8 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.32285 | 2.39393 | 2.24157 | 2.38538 | 2.74613 | 2.39668 | 2.35229 | 2.37876 |
| λ2,j | 2.07657 | 1.87513 | 2.00551 | 1.06112 | 1.47205 | 1.90181 | 2.12946 | 1.97447 |
| λ3,j | 2.44915 | 2.49225 | 2.59108 | 1.58380 | 1.73668 | 2.31265 | 2.42312 | 2.34776 |

| | j = 9 | j = 10 | j = 11 | j = 12 | j = 13 | j = 14 | j = 15 | j = 16 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.38142 | 2.33024 | 2.36332 | 2.38664 | 2.41047 | 2.37189 | 2.35745 | 2.41375 |
| λ2,j | 1.96577 | 2.14084 | 2.09773 | 1.88414 | 1.75302 | 1.95039 | 2.08764 | 1.71225 |
| λ3,j | 2.31657 | 2.29552 | 2.38640 | 2.25331 | 2.33960 | 2.41308 | 2.41769 | 2.36731 |

| | j = 17 | j = 18 | j = 19 | j = 20 | j = 21 | j = 22 | j = 23 | j = 24 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| λ1,j | 2.35154 | 2.21571 | 2.36552 | 2.40253 | 2.31975 | 2.32859 | 2.27686 | 2.39248 |
| λ2,j | 2.13212 | 2.97383 | 1.95326 | 1.86995 | 2.30571 | 2.34025 | 2.44898 | 1.85148 |
| λ3,j | 2.36369 | 2.82544 | 2.29492 | 2.40542 | 2.37088 | 2.68900 | 2.42857 | 2.40000 |

| | j = 25 | j = 26 | j = 27 | j = 28 | j = 29 |
| --- | --- | --- | --- | --- | --- |
| λ1,j | 2.27982 | 2.28423 | 2.40584 | 2.33562 | 2.29302 |
| λ2,j | 2.45719 | 2.76197 | 1.53374 | 2.27599 | 2.09232 |
| λ3,j | 2.34878 | 2.60824 | 2.26335 | 2.67179 | 2.43301 |

References

  1. Shi, F.Z. Computer Aided Geometric Design and Nonuniform Rational B-Splines: CAGD & NURBS; Beijing University of Aeronautics and Astronautics Press: Beijing, China, 2001. [Google Scholar]
  2. Wang, G.J.; Wang, G.Z.; Zheng, J.M. Computer Aided Geometric Design; Higher Education Press: Beijing, China, 2001. [Google Scholar]
  3. Hu, G.; Dou, W.T.; Wang, X.F.; Abbas, M. An enhanced chimp optimization algorithm for optimal degree reduction of Said–Ball curves. Math. Comput. Simul. 2022, 197, 207–252. [Google Scholar] [CrossRef]
  4. Ball, A.A. CONSURF. Part one: Introduction of the conic lofting tile. Comput.-Aided Des. 1974, 6, 243–249. [Google Scholar]
  5. Wang, G.J. Ball curve of high degree and its geometric properties. Appl. Math. A J. Chin. Univ. 1987, 2, 126–140. [Google Scholar]
  6. Said, H.B. A generalized ball curve and its recursive algorithm. ACM Trans. Graph. (TOG) 1989, 8, 360–371. [Google Scholar] [CrossRef]
  7. Hu, S.M.; Wang, G.Z.; Jin, T.G. Properties of two types of generalized Ball curves. Comput.-Aided Des. 1996, 28, 125–133. [Google Scholar] [CrossRef]
  8. Othlnan, W.; Goldman, R.N. The dual basis functions for the generalized ball basis of odd degree. Comput. Aided Geom. Des. 1997, 14, 571–582. [Google Scholar]
  9. Xi, M.C. Dual basis of Ball basis function and its application. Comput. Math. 1997, 19, 7. [Google Scholar]
  10. Ding, D.Y.; Li, M. Properties and applications of generalized Ball curves. Chin. J. Appl. Math. 2000, 23, 123–131. [Google Scholar]
  11. Jiang, P.; Wu, H. Dual basis of Wang-Ball basis function and its application. J. Comput. Aided Des. Graph. 2004, 16, 454–458. [Google Scholar]
  12. Hu, S.M.; Jin, T.G. Degree reductive approximation of Bézier curves. In Proceedings of the Eighth Annual Symposium on Computational Geometry, Berlin, Germany, 10–12 June 1992; pp. 110–126. [Google Scholar]
  13. Wu, H.Y. Two new kinds of generalized Ball curves. J. Appl. Math. 2000, 23, 196–205. [Google Scholar]
  14. Wang, C.W. Extension of cubic Ball curve. J. Eng. Graph. 2008, 29, 77–81. [Google Scholar]
  15. Wang, C.W. The extension of the quartic Wang-Ball curve. J. Eng. Graph. 2009, 30, 80–84. [Google Scholar]
  16. Yan, L.L.; Zhang, W.; Wen, R.S. Two types of shape-adjustable fifth-order generalized Ball curves. J. Eng. Graph. 2011, 32, 16–20. [Google Scholar]
  17. Hu, G.S.; Wang, D.; Yu, A.M. Construction and application of 2m+2 degree Ball curve with shape parameters. J. Eng. Graph. 2009, 30, 69–79. [Google Scholar]
  18. Xiong, J.; Guo, Q.W. Generalized Wang-Ball curve. Numer. Comput. Comput. Appl. 2013, 34, 187–195. [Google Scholar]
  19. Liu, H.Y.; Li, L.; Zhang, D.M. Quadratic Ball curve with shape parameters. J. Shandong Univ. 2011, 41, 23–28. [Google Scholar]
  20. Huang, C.L.; Huang, Y.D. Quartic Wang-Ball curve and surface with two parameters. J. Hefei Univ. Technol. 2012, 35, 1436–1440. [Google Scholar]
  21. Hu, G.; Zhu, X.N.; Wei, G.; Chang, C.T. An improved marine predators algorithm for shape optimization of developable Ball surfaces. Eng. Appl. Artif. Intell 2021, 105, 104417. [Google Scholar] [CrossRef]
  22. Hu, G.; Li, M.; Wang, X.; Wei, G.; Chang, C.T. An enhanced manta ray foraging optimization algorithm for shape optimization of complex CCG-Ball curves. Knowl. Based Syst. 2022, 240, 108071. [Google Scholar] [CrossRef]
  23. Gurunathan, B.; Dhande, S. Algorithms for development of certain classes of ruled surfaces. Comput. Graph 1987, 11, 105–112. [Google Scholar] [CrossRef]
  24. Jaklič, G.; Žagar, E. Curvature variation minimizing cubic Hermite interpolants. Appl. Math. Comput. 2011, 218, 3918–3924. [Google Scholar] [CrossRef]
  25. Lu, L.Z. A note on curvature variation minimizing cubic Hermite interpolants. Appl. Math. Comput. 2015, 259, 596–599. [Google Scholar] [CrossRef]
  26. Zheng, J.Y.; Hu, G.; Ji, X.M.; Qin, X.Q. Quintic generalized Hermite interpolation curves: Construction and shape optimization using an improved GWO algorithm. Comput. Appl. Math 2022, 41, 115. [Google Scholar] [CrossRef]
  27. Hu, G.; Wu, J.L.; Li, H.N.; Hu, X.Z. Shape optimization of generalized developable H-Bézier surfaces using adaptive cuckoo search algorithm. Adv. Eng. Softw. 2020, 149, 102889. [Google Scholar] [CrossRef]
  28. Ahmadianfar, I.; Heidari, A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method. Expert Syst. Appl. 2021, 181, 115079. [Google Scholar] [CrossRef]
  29. Hu, G.; Du, B.; Wang, X.F.; Wei, G. An enhanced black widow optimization algorithm for feature selection. Knowl. Based Syst. 2022, 35, 107638. [Google Scholar] [CrossRef]
  30. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  31. Nematollahi, A.F.; Rahiminejad, A.; Vahidi, B. A novel meta-heuristic optimization method based on golden ratio in nature. Soft Comput. 2020, 24, 1117–1151. [Google Scholar] [CrossRef]
  32. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–72. [Google Scholar] [CrossRef]
  33. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  34. Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992. [Google Scholar]
  35. Li, W.Z.; Wang, L.; Cai, X.; Hu, J.; Guo, W. Species co-evolutionary algorithm: A novel evolutionary algorithm based on the ecology and environments for optimization. Neural. Comput. Appl. 2019, 31, 2015–2024. [Google Scholar] [CrossRef]
  36. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef] [PubMed]
  37. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  38. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  39. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mai, S.M.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2020, 51, 1531–1551. [Google Scholar] [CrossRef]
  40. Talatahari, S.; Azizi, M.; Tolouei, M.; Talatahari, B.; Sareh, P. Crystal structure algorithm (CryStAl): A metaheuristic optimization method. IEEE Access 2021, 9, 71244–71261. [Google Scholar] [CrossRef]
  41. Salawudeen, A.T.; Mu’Azu, M.B.; Sha’Aban, Y.A.; Adedokun, A.E. A novel smell agent optimization (sao): An extensive cec study and engineering application. Knowl. Based Syst. 2021, 232, 107486. [Google Scholar] [CrossRef]
  42. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching-Learning-Based Optimization: An optimization method for continuous non-linear large scale problems. Inf. Sci. 2012, 183, 1–15. [Google Scholar] [CrossRef]
  43. Satapathy, S.; Naik, A. Social group optimization (SGO): A new population evolutionary optimization technique. Complex Intell. Syst. 2016, 2, 173–203. [Google Scholar] [CrossRef]
  44. Bikash, D.; Mukherjee, V.; Debapriya, D. Student psychology based optimization algorithm: A new population based optimization algorithm for solving optimization problems. Adv. Eng. Softw. 2020, 146, 102804. [Google Scholar]
  45. Bodaghi, M.; Samieefar, K. Meta-heuristic bus transportation algorithm. Iran J. Comput. Sci. 2019, 2, 23–32. [Google Scholar] [CrossRef]
  46. Yuan, Y.L.; Ren, J.J.; Wang, S.; Wang, Z.X.; Mu, X.K.; Zhao, W. Alpine skiing optimization: A new bio-inspired optimization algorithm. Adv. Eng. Softw. 2022, 170, 103158. [Google Scholar] [CrossRef]
  47. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on NeuralNetworks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  48. Dorigo, M.; Blum, C. Ant colony optimization theory: A survey. Theor. Comput. Sci. 2005, 344, 243–278. [Google Scholar] [CrossRef]
  49. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  50. Ma, L.; Wang, C.; Xie, N.G.; Shi, M.; Wang, L. Moth-flame optimization algorithm based on diversity and mutation strategy. Appl. Intell. 2021, 51, 5836–5872. [Google Scholar] [CrossRef]
  51. Wang, C.; Ma, L.L.; Ma, L.; Lai, J.; Zhao, J.; Wang, L.; Cheong, K.H. Identification of influential users with cost minimization via an improved moth flame optimization. J. Comput. Sci. 2023, 67, 101955. [Google Scholar] [CrossRef]
  52. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  53. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  54. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H.L. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  55. Hayyolalam, V.; Kazem, A.A.P. Black Widow Optimization Algorithm: A novel meta-heuristic approach for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2020, 87, 103249. [Google Scholar] [CrossRef]
  56. Huang, Q.H.; Wang, C.; Yılmaz, Y.; Wang, L.; Xie, N.G. Recognition of EEG based on Improved Black Widow Algorithm optimized SVM. Biomed. Signal Process Control 2023, 81, 104454. [Google Scholar] [CrossRef]
  57. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl. Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  58. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  59. Wang, C.; Xu, R.Q.; Ma, L.; Zhao, J.; Wang, L.; Xie, N.G. An efficient salp swarm algorithm based on scale-free informed followers with self-adaption weight. Appl. Intell. 2023, 53, 1759–1791. [Google Scholar] [CrossRef]
  60. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-Inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  61. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf mongoose optimization algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  62. Trojovský, P.; Dehghani, M. Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef]
  63. Nitish, C.; Muhammad, M.A. Golden jackal optimization: A novel nature-inspired optimizer for engineering applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar]
  64. Zhao, W.; Wang, L.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  65. Ramadan, A.; Kamel, S.; Hassan, M.H.; Ahmed, E.M.; Hasanien, H.M. Accurate photovoltaic models based on an adaptive opposition artificial hummingbird algorithm. Electronics 2022, 11, 318. [Google Scholar] [CrossRef]
  66. Mohamed, H.; Ragab, E.S.; Ahmed, G.; Ehab, E.; Abdullah, S. Parameter identification and state of charge estimation of Li-Ion batteries used in electric vehicles using artificial hummingbird optimizer. J. Energy Storage 2022, 51, 104535. [Google Scholar]
  67. Sadoun, A.M.; Najjar, I.R.; Alsoruji, G.S.; Abd-Elwahed, M.S.; Elaziz, M.A.; Fathy, A. Utilization of improved machine learning method based on artificial hummingbird algorithm to predict the tribological Behavior of Cu-Al2O3 nanocomposites synthesized by In Situ method. Mathematics 2022, 10, 1266. [Google Scholar] [CrossRef]
  68. Abid, M.S.; Apon, H.J.; Morshed, K.A.; Ahmed, A. Optimal planning of multiple renewable energy-integrated distribution system with uncertainties using artificial hummingbird algorithm. IEEE Access 2022, 10, 40716–40730. [Google Scholar] [CrossRef]
  69. Yildiz, B.S.; Pholdee, N.; Bureerat, S.; Yildiz, A.; Sait, S. Enhanced grasshopper optimization algorithm using elite opposition-based learning for solving real-world engineering problems. Eng. Comput. 2021, 38, 4207–4219. [Google Scholar] [CrossRef]
  70. Wang, Y.J.; Su, T.T.; Liu, L. Multi-strategy cooperative evolutionary PSO based on Cauchy mutation strategy. J. Syst. Simul. 2018, 30, 2875–2883. [Google Scholar]
  71. Hu, G.; Chen, L.X.; Wang, X.P.; Guo, W. Differential Evolution-Boosted Sine Cosine Golden Eagle Optimizer with Lévy Flight. J. Bionic. Eng. 2022, 19, 1850–1885. [Google Scholar] [CrossRef]
  72. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2022, 39, 2627–2651. [Google Scholar] [CrossRef]
  73. Hu, G.; Zhong, J.Y.; Du, B.; Guo, W. An enhanced hybrid arithmetic optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 394, 114901. [Google Scholar] [CrossRef]
  74. Wilcoxon, F. Individual Comparisons by Ranking Methods. In Breakthroughs in Statistics; Springer: New York, NY, USA, 1992. [Google Scholar]
  75. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  76. Mohamed, A.; Reda, M.; Shaimaa, A.A.A.; Mohammed, J.; Mohamed, A. Kepler optimization algorithm: A new metaheuristic algorithm inspired by Kepler’s laws of planetary motion. Knowl. Based Syst. 2023, 268, 110454. [Google Scholar]
  77. Li, S.M.; Chen, H.L.; Wang, M.J.; Heidari, A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  78. Zheng, J.; Ji, X.; Ma, Z.; Hu, G. Construction of Local-Shape-Controlled Quartic Generalized Said-Ball Model. Mathematics 2023, 11, 2369. [Google Scholar] [CrossRef]
  79. Hu, G.; Wang, J.; Li, M.; Hussien, A.G.; Abbas, M. EJS: Multi-strategy enhanced jellyfish search algorithm for engineering applications. Mathematics 2023, 11, 851. [Google Scholar] [CrossRef]
  80. Hu, G.; Zhong, J.; Wei, G.; Chang, C.-T. DTCSMO: An efficient hybrid starling murmuration optimizer for engineering applications, Comput. Methods Appl. Mech. Eng. 2023, 405, 115878. [Google Scholar] [CrossRef]
  81. Hu, G.; Yang, R.; Qin, X.Q.; Wei, G. MCSA: Multi-strategy boosted chameleon-inspired optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 403, 115676. [Google Scholar] [CrossRef]
Figure 1. Three foraging behaviors of hummingbirds.
Figure 2. Three special flight abilities of hummingbirds: (a) axial flight behavior; (b) diagonal flight behavior; (c) omnidirectional flight behavior.
Figure 3. Flowchart of the proposed HAHA.
Figure 4. Convergence curves of the HAHA with other algorithms for 25 benchmark functions.
Figure 5. The box plots of the HAHA with other algorithms for 25 benchmark functions. "+" marks outliers; "−" marks the median of the data.
Figure 6. Radar charts of the proposed HAHA and other algorithms for 25 benchmark functions.
Figure 7. The average ranking of algorithms based on 25 benchmark functions.
Figure 8. Convergence curves of the HAHA with other algorithms for CEC 2022 test functions.
Figure 9. G1 splicing of the CSGC–Ball curves.
Figure 10. G2 splicing of the CSGC–Ball curves.
Figure 11. Shape optimization of the CSGC–Ball curves of overall G2 splicing.
Figure 12. Shape optimization of CSGC–Ball curves for mixed G0, G1, and G2 splicing.
Table 2. Parameter settings of each algorithm.

| Algorithm | Parameters | Setting Value |
|---|---|---|
| All algorithms | Population size (n) | 100 |
|  | Max iterations (T) | 1000 |
|  | Number of runs | 30 |
| AHA | Migration coefficient (M) | 2n |
| HAHA | Learning factors (c1, c2) | c1 = c2 = 2 |
|  | Migration coefficient (M) | 2n |
| PSO | Neighboring ratio | 0.25 |
|  | Inertia weight (ω) | 0.9 |
|  | Cognitive and social factors | c1 = c2 = 1.5 |
| WOA | Parameter (a) | decreases from 2 to 0 |
| SCA | Constant (a) | 2 |
| HHO | Energy (E1) | decreases from 2 to 0 |
| SOA | Control factor (fc) | 2 |
| AOA | Constants (C3, C4) | C3 = 1, C4 = 2 |
| GJO | Decreasing energy of the prey (E1) | decreases from 1.5 to 0 |
|  | Constant values (β, c1) | β = 1.5, c1 = 1.5 |
| POA | R | 0.2 |
| SCSO | Sensitivity range (rG) | decreases from 2 to 0 |
|  | R | from −2rG to 2rG |
| KOA | T̄, μ0, γ | 3, 0.1, 15 |
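The PSO rows in Table 2 (inertia weight ω = 0.9, cognitive and social factors c1 = c2 = 1.5) parameterize the standard particle velocity/position update. A minimal NumPy sketch of one update step, assuming these settings; the function name and the search bounds are illustrative, not taken from the paper:

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.9, c1=1.5, c2=1.5, bounds=(-100.0, 100.0)):
    """One standard PSO update with the Table 2 settings.

    x, v      : current positions and velocities, shape (n_particles, dim)
    pbest     : each particle's personal best position
    gbest     : the swarm's global best position (broadcast over particles)
    """
    r1 = np.random.rand(*x.shape)
    r2 = np.random.rand(*x.shape)
    # inertia term + cognitive pull toward pbest + social pull toward gbest
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    # move and clamp particles to the search space
    x = np.clip(x + v, *bounds)
    return x, v
```

In the hybrid HAHA, an update of this form is one of the strategies combined with the hummingbird foraging operators, so the same c1, c2 values appear in both the PSO and HAHA rows of Table 2.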
Table 3. The comparison results of HAHA and other algorithms for 25 benchmark functions. For each function F1–F25, the table reports the Avg., Std., Wilcoxon rank-sum p-value (vs. HAHA), and rank of each algorithm; its summary rows are:

| Algorithm | +/=/− | Avg. Rank | Final Rank |
|---|---|---|---|
| PSO | 2/2/21 | 5.60 | 4 |
| WOA | 0/0/25 | 12.32 | 13 |
| SCA | 0/0/25 | 12.52 | 14 |
| HHO | 0/1/24 | 9.24 | 9 |
| SOA | 0/0/25 | 9.36 | 11 |
| SSA | 1/2/22 | 6.12 | 5 |
| AVOA | 0/3/22 | 7.32 | 7 |
| CryStAl | 1/1/23 | 7.00 | 6 |
| DMOA | 1/1/23 | 10.48 | 12 |
| SCSO | 0/0/25 | 9.24 | 9 |
| GJO | 0/0/25 | 8.28 | 8 |
| AHA | 2/4/19 | 3.00 | 3 |
| AOAHA | 2/5/18 | 2.52 | 2 |
| HAHA | \ | 1.40 | 1 |
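The p-values and the +/=/− tallies in Table 3 come from pairwise Wilcoxon rank-sum tests at the 0.05 significance level. A sketch of one such comparison, assuming SciPy is available; the 30-run samples below are synthetic stand-ins, not the paper's data, and the sign convention (+ = competitor significantly better on a minimization problem) is one common choice:

```python
import numpy as np
from scipy.stats import ranksums

def compare_runs(haha_runs, other_runs, alpha=0.05):
    """Wilcoxon rank-sum test between two sets of independent run results.

    Returns ('+', p) if the competitor is significantly better (lower mean),
    ('-', p) if significantly worse, or ('=', p) if not significant at alpha.
    """
    _, p = ranksums(other_runs, haha_runs)
    if p >= alpha:
        return "=", p
    return ("+", p) if np.mean(other_runs) < np.mean(haha_runs) else ("-", p)

rng = np.random.default_rng(0)
haha = rng.normal(1.6e3, 1.9e3, 30)    # synthetic stand-in for 30 HAHA runs
other = rng.normal(6.8e7, 1.0e6, 30)   # synthetic stand-in for a competitor
symbol, p = compare_runs(haha, other)  # competitor far worse -> '-', tiny p
```

Counting these symbols over all 25 functions yields the +/=/− row of Table 3.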
Table 4. The Friedman test results of HAHA and other algorithms for 25 benchmark functions (mean Friedman rank per function F1–F25, averaged into the Avg. Rank column):

| Algorithm | Avg. Rank | Overall Rank |
|---|---|---|
| PSO | 5.74 | 4 |
| WOA | 11.30 | 13 |
| SCA | 12.35 | 14 |
| HHO | 9.00 | 10 |
| SOA | 9.48 | 11 |
| SSA | 6.28 | 5 |
| AVOA | 7.27 | 6 |
| CryStAl | 7.44 | 7 |
| DMOA | 10.46 | 12 |
| SCSO | 8.68 | 9 |
| GJO | 8.45 | 8 |
| AHA | 3.14 | 3 |
| AOAHA | 2.98 | 2 |
| HAHA | 2.40 | 1 |
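The Friedman procedure behind Table 4 ranks the algorithms on each benchmark function separately and then averages those per-function ranks. A small sketch of that computation, assuming SciPy; the 4-function × 3-algorithm matrix below is a toy example, not the paper's data:

```python
import numpy as np
from scipy.stats import rankdata

# Rows: benchmark functions; columns: algorithms (lower objective = better).
results = np.array([
    [2.29e3, 6.75e7, 1.64e3],
    [1.02e3, 1.13e3, 1.01e3],
    [1611.99, 1612.79, 1609.72],
    [653.62, 773.03, 574.09],
])

# Rank the algorithms within each function (1 = best on that function).
per_fun_ranks = np.vstack([rankdata(row) for row in results])

avg_rank = per_fun_ranks.mean(axis=0)  # the "Avg. Rank" column
overall = rankdata(avg_rank)           # the "Overall Rank" column
```

In this toy matrix the third algorithm wins every function, so it gets average rank 1.0 and overall rank 1, mirroring how HAHA's Avg. Rank of 2.40 yields Overall Rank 1 in Table 4.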
Table 5. The computational time of HAHA and other algorithms for 25 benchmark functions.

| Fun | PSO | WOA | SCA | HHO | SOA | SSA | AVOA | CryStAl | DMOA | SCSO | GJO | AHA | AOAHA | HAHA |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | 0.78 | 0.97 | 1.09 | 2.69 | 1.11 | 1.62 | 1.59 | 8.02 | 7.47 | 14.94 | 1.62 | 1.77 | 2.85 | 2.86 |
| F2 | 0.61 | 0.73 | 0.99 | 2.18 | 1.23 | 1.72 | 1.62 | 6.72 | 6.66 | 14.14 | 1.68 | 1.54 | 2.79 | 2.96 |
| F3 | 0.68 | 0.80 | 1.00 | 2.55 | 1.03 | 1.44 | 1.36 | 7.11 | 6.75 | 13.06 | 1.44 | 1.68 | 2.60 | 2.38 |
| F4 | 0.89 | 1.00 | 1.20 | 2.89 | 1.20 | 1.71 | 1.64 | 8.19 | 8.18 | 23.90 | 1.66 | 1.81 | 3.13 | 3.13 |
| F5 | 0.87 | 0.94 | 1.20 | 3.07 | 1.22 | 1.70 | 1.64 | 7.85 | 7.25 | 19.91 | 1.64 | 1.90 | 3.11 | 2.84 |
| F6 | 0.81 | 0.88 | 1.16 | 2.83 | 1.42 | 1.91 | 1.83 | 8.88 | 7.86 | 14.26 | 1.87 | 2.07 | 3.22 | 3.32 |
| F7 | 1.06 | 1.15 | 1.40 | 3.39 | 1.42 | 1.98 | 1.87 | 9.09 | 7.80 | 18.78 | 1.81 | 1.94 | 3.34 | 3.33 |
| F8 | 0.63 | 0.73 | 0.96 | 2.32 | 1.01 | 1.48 | 1.46 | 7.47 | 6.97 | 13.08 | 1.56 | 1.49 | 2.52 | 2.65 |
| F9 | 0.88 | 1.02 | 1.25 | 3.21 | 1.23 | 1.75 | 1.71 | 8.41 | 7.81 | 15.10 | 1.78 | 2.16 | 3.30 | 3.06 |
| F10 | 0.76 | 0.88 | 1.05 | 2.40 | 1.06 | 1.47 | 1.53 | 7.13 | 6.83 | 12.63 | 1.47 | 1.77 | 2.60 | 2.55 |
| F11 | 1.77 | 1.94 | 2.11 | 5.10 | 2.13 | 2.66 | 2.60 | 12.09 | 10.09 | 23.46 | 2.75 | 2.74 | 5.15 | 5.05 |
| F12 | 1.97 | 2.13 | 2.36 | 6.26 | 2.85 | 3.31 | 3.33 | 14.09 | 11.74 | 35.84 | 2.85 | 2.93 | 5.15 | 5.44 |
| F13 | 1.86 | 2.03 | 2.26 | 5.38 | 2.23 | 2.79 | 3.18 | 13.56 | 11.46 | 29.12 | 2.77 | 3.00 | 5.33 | 5.12 |
| F14 | 2.81 | 2.96 | 3.31 | 7.28 | 3.06 | 3.47 | 3.47 | 15.94 | 11.01 | 16.47 | 3.74 | 3.75 | 6.99 | 7.07 |
| F15 | 1.87 | 1.98 | 2.16 | 5.35 | 2.21 | 2.69 | 2.68 | 12.26 | 10.18 | 24.60 | 2.74 | 2.75 | 5.05 | 5.01 |
| F16 | 1.09 | 0.98 | 1.32 | 3.59 | 1.31 | 1.68 | 1.54 | 8.37 | 10.58 | 5.11 | 1.96 | 2.03 | 3.39 | 2.79 |
| F17 | 0.58 | 0.59 | 0.72 | 2.22 | 0.99 | 1.15 | 1.26 | 6.50 | 6.86 | 7.44 | 1.16 | 1.35 | 2.29 | 2.10 |
| F18 | 0.49 | 0.57 | 0.71 | 1.96 | 0.73 | 1.17 | 1.26 | 6.54 | 6.82 | 8.36 | 1.15 | 1.52 | 2.17 | 2.17 |
| F19 | 0.48 | 0.59 | 0.64 | 1.86 | 0.67 | 1.02 | 1.20 | 6.28 | 7.09 | 4.99 | 1.16 | 1.32 | 2.19 | 2.03 |
| F20 | 0.47 | 0.58 | 0.67 | 2.01 | 0.67 | 1.00 | 1.16 | 6.18 | 6.99 | 4.78 | 1.05 | 1.31 | 2.26 | 2.09 |
| F21 | 4.98 | 5.27 | 5.33 | 12.16 | 4.78 | 5.13 | 5.28 | 22.54 | 14.82 | 9.70 | 5.48 | 5.51 | 10.87 | 11.01 |
| F22 | 0.48 | 0.62 | 0.69 | 1.94 | 0.69 | 1.08 | 1.16 | 6.24 | 6.35 | 4.52 | 1.16 | 1.21 | 2.17 | 1.97 |
| F23 | 0.47 | 0.58 | 0.63 | 1.92 | 0.66 | 1.04 | 1.15 | 6.21 | 6.64 | 4.73 | 1.17 | 1.36 | 2.24 | 2.12 |
| F24 | 0.48 | 0.60 | 0.64 | 1.93 | 0.83 | 1.27 | 1.35 | 6.79 | 7.36 | 5.03 | 1.13 | 1.20 | 2.13 | 1.96 |
| F25 | 0.49 | 0.61 | 0.68 | 2.08 | 0.71 | 1.09 | 1.26 | 6.76 | 7.08 | 4.94 | 1.13 | 1.29 | 2.33 | 2.12 |
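The run times in Table 5 are wall-clock measurements of complete optimization runs. A minimal sketch of how such timings are typically collected; the wrapper and the dummy optimizer below are hypothetical stand-ins, not the paper's implementation:

```python
import time

def timed(optimizer, *args, **kwargs):
    """Run an optimizer once and return (result, elapsed wall-clock seconds)."""
    t0 = time.perf_counter()  # monotonic high-resolution timer
    result = optimizer(*args, **kwargs)
    return result, time.perf_counter() - t0

def dummy_optimizer(n_evals=10_000):
    """Stand-in for one metaheuristic run on one benchmark function."""
    best = float("inf")
    for _ in range(n_evals):
        best = min(best, sum((0.5 - 0.1 * i) ** 2 for i in range(10)))
    return best

best, seconds = timed(dummy_optimizer)
```

Averaging `seconds` over the 30 independent runs of Table 2 gives one cell of Table 5.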
Table 6. The comparison results of HAHA and other algorithms for CEC 2022 test functions. For each function F1–F12, the table reports the Avg., Std., Wilcoxon rank-sum p-value (vs. HAHA), and rank of each algorithm; its summary rows are:

| Algorithm | Avg. Rank | Final Rank |
|---|---|---|
| PSO | 5.250 | 5 |
| WOA | 10.750 | 12 |
| SCA | 9.500 | 11 |
| HHO | 9.000 | 9 |
| SOA | 8.083 | 7 |
| SSA | 5.250 | 5 |
| SAO | 13.333 | 13 |
| POA | 5.083 | 4 |
| KOA | 13.333 | 13 |
| SCSO | 9.083 | 10 |
| GJO | 8.583 | 8 |
| AHA | 3.167 | 3 |
| AOAHA | 2.583 | 2 |
| HAHA | 1.250 | 1 |
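The Final Rank row of Table 6 follows from its Avg. Rank row by ranking with ties sharing the lower rank (so PSO and SSA tie at 5, and SAO and KOA tie at 13). A quick check, assuming SciPy is available:

```python
from scipy.stats import rankdata

# Avg. ranks from Table 6, in algorithm order:
# PSO, WOA, SCA, HHO, SOA, SSA, SAO, POA, KOA, SCSO, GJO, AHA, AOAHA, HAHA
avg = [5.250, 10.750, 9.500, 9.000, 8.083, 5.250, 13.333, 5.083,
       13.333, 9.083, 8.583, 3.167, 2.583, 1.250]

# method="min": tied averages receive the smallest rank of the tied group
final = rankdata(avg, method="min").astype(int)
```

This reproduces the published Final Rank row, with HAHA first and AOAHA second.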
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Chen, K.; Chen, L.; Hu, G. PSO-Incorporated Hybrid Artificial Hummingbird Algorithm with Elite Opposition-Based Learning and Cauchy Mutation: A Case Study of Shape Optimization for CSGC–Ball Curves. Biomimetics 2023, 8, 377. https://doi.org/10.3390/biomimetics8040377

