PSO-Incorporated Hybrid Artificial Hummingbird Algorithm with Elite Opposition-Based Learning and Cauchy Mutation: A Case Study of Shape Optimization for CSGC–Ball Curves

With the rapid development of the geometric modeling industry and computer technology, the design and shape optimization of complex curves have become an important research topic in CAGD. In this paper, a Hybrid Artificial Hummingbird Algorithm (HAHA) is used to optimize complex composite shape-adjustable generalized cubic Ball (CSGC–Ball, for short) curves. Firstly, the Artificial Hummingbird Algorithm (AHA), a recently proposed meta-heuristic algorithm, has the advantages of a simple structure and easy implementation and can quickly find the global optimal solution. However, it still has limitations, such as low convergence accuracy and a tendency to fall into local optima. Therefore, this paper proposes HAHA, which combines the original AHA with an elite opposition-based learning strategy, PSO, and Cauchy mutation to increase population diversity, avoid local optima, and thus improve the convergence accuracy and speed of the original AHA. Twenty-five benchmark test functions and the CEC 2022 test suite are used to evaluate the overall performance of HAHA, and the experimental results are statistically analyzed using the Friedman and Wilcoxon rank-sum tests. The experimental results show that, compared with other advanced algorithms, HAHA is competitive and practical. Secondly, in order to better model complex curves in engineering, CSGC–Ball curves with global and local shape parameters are constructed based on SGC–Ball basis functions. By changing the shape parameters, the whole or local shape of the curves can be adjusted flexibly. Finally, in order to give the constructed curves a more ideal shape, a CSGC–Ball curve-shape optimization model is established based on the minimum curve energy value, and the proposed HAHA is used to solve the established shape optimization model.
Two representative numerical examples comprehensively verify the effectiveness and superiority of HAHA in solving CSGC–Ball curve-shape optimization problems.


Introduction
Geometric modeling mainly focuses on the representation, approximation, analysis, and synthesis of curve and surface information in computer image system environments [1]. It has been widely used in various fields such as aviation, shipbuilding, surveying and mapping, mechanical design, computer vision, bioengineering, animation, and military combat simulation [2]. The study of Ball curves and surfaces is a very important research topic in geometric modeling, mainly focusing on the geometric research of various products [3]. In 1974, Ball [4] first constructed the rational cubic parametric curves and used them as the mathematical basis of the CONSURF fuselage modeling system at Warton (the former British Aircraft Corporation).

Swarm intelligence (SI) is the most popular branch of meta-heuristic algorithms (MAs), simulating the collective behavior of social animals in nature. Particle Swarm Optimization (PSO) [47] is the most classic SI algorithm, inspired by the social behavior of birds and often used to solve various global optimization problems. Famous SI algorithms also include, but are not limited to, Ant Colony Optimization (ACO) [48], based on the collective behavior of ant colonies; Moth-Flame Optimization (MFO) [49][50][51]; the Grey Wolf Optimizer (GWO) [52], simulating the cooperative hunting behavior of gray wolves; the Whale Optimization Algorithm (WOA) [53]; Harris Hawk Optimization (HHO) [54]; the Black Widow Algorithm [55,56]; the Seagull Optimization Algorithm (SOA) [57]; the Salp Swarm Algorithm (SSA) [58,59]; the African Vultures Optimization Algorithm (AVOA) [60]; the Dwarf Mongoose Optimization Algorithm (DMOA) [61]; the Pelican Optimization Algorithm (POA) [62]; Golden Jackal Optimization (GJO) [63]; and the Artificial Hummingbird Algorithm (AHA) [64]. Among them, AHA is a recently proposed bionic MA inspired by the intelligent foraging behaviors, special flight skills, and amazing memory function of hummingbirds. Hummingbirds have three unique flight skills: axial, diagonal, and omnidirectional flight.
These skills are flexibly and alternately used in its three foraging behaviors. The migration foraging strategy provides the algorithm with powerful exploration capabilities, territorial foraging improves population diversity and avoids the possibility of the algorithm falling into local optima, and guided foraging creates an intelligent balance between exploration and exploitation. In addition, the visit table was established to simulate the powerful memory abilities of hummingbirds.
The performance of AHA is competitive with other well-known algorithms [64]. In 2022, Ramadan [65] made improvements on the basis of the original AHA and proposed an adaptive opposition artificial hummingbird algorithm, referred to as AOAHA, which improved the performance of the AHA and applied it to solve an accurate photovoltaic model of the solar cell system. In the same year, Mohamed [66] proposed the Artificial Hummingbird Optimization Technology (AHOT) to solve the parameter identification problem of lithium-ion batteries for electric vehicles. Meanwhile, in [67], Sadoun et al. used a machine learning method based on AHA to predict the effect of the tribological behavior of in situ-synthesized Cu-Al2O3 nanocomposites. In 2022, AHA was used in [68] to solve the planning optimization problem of multiple renewable energy integrated distribution systems with uncertainty, and the optimization results were better.
Compared with other advanced meta-heuristic algorithms, AHA can quickly and accurately find the global optimal solution and has certain applicability and competitiveness in terms of computational accuracy and time. However, because the standard AHA was designed to be as simple as possible, it still has certain limitations when solving complex optimization problems, such as slow iteration speed, low diversity, and a tendency to converge prematurely. In order to make the original AHA more competitive, another goal of this paper is to propose a hybrid artificial hummingbird algorithm (HAHA) based on the standard AHA, which combines the elite opposition-based learning strategy [69], the PSO strategy [47], and the Cauchy mutation strategy [70] with the original AHA. The three strategies work together to increase the optimization ability and overall performance of AHA. The proposed HAHA is tested on 25 benchmark functions and the CEC 2022 test suite, and it is verified that HAHA shows good competitiveness in solving global optimization problems. Therefore, the proposed HAHA is used to solve the established CSGC-Ball curve-shape optimization models. The main contributions of this paper are as follows: (1) The G1 and G2 smooth splicing continuity conditions of adjacent SGC-Ball curves are derived, and combined SGC-Ball curves with global and local shape parameters, called CSGC-Ball curves, are constructed; it is verified that the CSGC-Ball curves have better shape adjustability.

The rest of the paper is structured as follows: Section 2 introduces the proposed HAHA in detail. Numerical experiments to evaluate the performance of the proposed HAHA are given in Section 3. Section 4 introduces the constructed combined SGC-Ball curves and studies the G1 and G2 continuous splicing conditions for the SGC-Ball curves.
In Section 5, the CSGC-Ball curve-shape optimization models are established based on minimum energy, and the detailed process of solving the shape optimization model using the proposed HAHA is given. Section 6 summarizes the paper and provides future research directions.

Basic Artificial Hummingbird Algorithm
The Artificial Hummingbird Algorithm (AHA) [64] is a novel bionic MA proposed in 2021, inspired by the unique flight skills, intelligent foraging strategies, and strong memory capacity of hummingbirds. Hummingbirds are the smallest but most intelligent birds in the world. They have three special flight skills and three intelligently adjusted foraging strategies. Three foraging behaviors of hummingbirds are shown in Figure 1. Meanwhile, most notably, they have a strong memory, so AHA constructed the visit table to simulate the unique memory ability of hummingbirds for food sources.

Initialization
AHA uses the random initialization method to generate the hummingbird population X, randomly placing n hummingbirds on n food sources, as described by Equation (1):

x_i = Lb + r · (Ub − Lb), i = 1, . . . , n, (1)

where X = {x_1, . . . , x_n} is the hummingbird population, n represents the population size, x_i is the location of the i-th food source, r is a d-dimensional random vector in [0,1], and Ub = {ub_1, . . . , ub_d} and Lb = {lb_1, . . . , lb_d} are the upper and lower bounds of the search space, respectively. The visit table is initialized by Equation (2):

VT_{i,j} = 0 if i ≠ j, and null if i = j, (2)

where VT_{i,j} is the visit level, indicating the time period during which the i-th hummingbird has not visited the j-th food source, and null indicates that the hummingbird is at its own food source.
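As a minimal sketch (not the paper's code), the initialization step can be written as follows; all names are illustrative, and the visit table here marks a hummingbird's own source with -inf in place of the paper's "null":

```python
# Illustrative sketch of AHA initialization (Eqs. (1)-(2)); names are my own.
import numpy as np

def initialize(n, d, lb, ub, rng=None):
    """Randomly place n hummingbirds on n food sources and build the visit table."""
    rng = np.random.default_rng(rng)
    # Equation (1): x_i = Lb + r * (Ub - Lb), with r ~ U(0, 1)^d
    X = lb + rng.random((n, d)) * (ub - lb)
    # Equation (2): visit level 0 for other food sources; the diagonal plays the
    # role of "null" (a hummingbird is at its own source), encoded here as -inf.
    visit_table = np.zeros((n, n))
    np.fill_diagonal(visit_table, -np.inf)
    return X, visit_table
```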

Guided Foraging
In the process of foraging, hummingbirds have three special flight skills: axial, diagonal, and omnidirectional flight. The direction switch vector D determines which flight skill a hummingbird chooses. Figure 2 describes the three flight behaviors in three-dimensional space. Figure 2a shows axial flight, in which the hummingbird can fly along an arbitrary coordinate axis; Figure 2b reflects diagonal flight, in which the hummingbird can fly from any corner of the coordinate axes to the diagonally opposite position; and Figure 2c demonstrates omnidirectional flight, in which the hummingbird can fly in any direction.
In d-dimensional space, the axial, diagonal, and omnidirectional flight skills of hummingbirds are simulated by Equations (3)-(5), respectively:

D^(i) = 1 if i = randi([1, d]), and 0 otherwise, i = 1, . . . , d, (3)

D^(i) = 1 if i ∈ P(j), j ∈ [1, q], P = randperm(q), and 0 otherwise, i = 1, . . . , d, (4)

D^(i) = 1, i = 1, . . . , d. (5)
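The three direction switch vectors can be sketched as below. This is an illustrative reading of the AHA flight patterns, with the diagonal subset size q drawn from [2, d−1] as an assumption; names are not from the paper:

```python
# Sketch of the three flight-pattern direction vectors D; names are illustrative.
import numpy as np

def direction_vector(d, flight, rng=None):
    rng = np.random.default_rng(rng)
    D = np.zeros(d, dtype=int)
    if flight == "axial":          # one randomly chosen axis is active
        D[rng.integers(d)] = 1
    elif flight == "diagonal":     # a random subset of axes is active
        q = rng.integers(2, d) if d > 2 else 2  # assumed range [2, d-1]
        D[rng.permutation(d)[:q]] = 1
    else:                          # omnidirectional: every axis is active
        D[:] = 1
    return D
```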

In Equations (3)-(5), randi([1, d]) is a randomly generated integer in [1, d], q ∈ [1, rand · (d − 2) + 1], and randperm(q) generates a random permutation of the integers from 1 to q. Hummingbirds rely on the alternation of the three flight skills to reach the target food source and use Equation (6) to simulate guided foraging, obtaining the position of the candidate food source v_i:

v_i(t + 1) = x_i,aim(t) + a · D · (x_i(t) − x_i,aim(t)), a ∼ N(0, 1), (6)

where v_i(t + 1) is the position of the candidate solution in iteration t + 1, and x_i(t) is the i-th food source in iteration t. In addition, x_i,aim(t) is the location of the target food source that the i-th hummingbird intends to visit, and a is the guiding factor, which obeys the standard normal distribution. The position of the i-th food source is updated by Equation (7):

x_i(t + 1) = x_i(t) if f(x_i(t)) ≤ f(v_i(t + 1)), and x_i(t + 1) = v_i(t + 1) otherwise, (7)
where f(x_i(t)) and f(v_i(t + 1)) represent the nectar-refilling rates of the hummingbird's food source and the candidate food source, respectively; that is, the fitness values of the objective function. The visit table simulates the unique memory ability of hummingbirds and is used to store important time information about visits to food sources. Each hummingbird can find the food source it is going to visit based on the information in the visit table. Hummingbirds prefer the food source with the highest visit level, but if multiple food sources have the same visit level, the one with the highest refilling rate is selected. In each iteration, after a hummingbird selects the target food source through guided foraging by Equation (6), the visit table is updated accordingly; for the update details of the visit table, refer to reference [64].
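Guided foraging with the greedy acceptance of Equation (7) can be sketched as follows (axial flight only, for brevity; the function and variable names are illustrative, not the paper's):

```python
# Minimal sketch of guided foraging (Eq. (6)) with the greedy update of Eq. (7).
import numpy as np

def guided_foraging(x_i, x_aim, f, rng=None):
    rng = np.random.default_rng(rng)
    a = rng.standard_normal()                   # guiding factor a ~ N(0, 1)
    D = np.zeros(x_i.size)
    D[rng.integers(x_i.size)] = 1               # axial flight, for brevity
    v = x_aim + a * D * (x_i - x_aim)           # Eq. (6): candidate food source
    return v if f(v) < f(x_i) else x_i          # Eq. (7): keep the better source
```

Because of the greedy acceptance, the fitness of the returned source never worsens.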

Territorial Foraging
When hummingbirds have visited their target food sources, they move to adjacent territories in search of new food sources that may be better candidates than the existing ones. The mathematical expression simulating the territorial foraging strategy of hummingbirds is Equation (8):

v_i(t + 1) = x_i(t) + b · D · x_i(t), b ∼ N(0, 1), (8)

where v_i(t + 1) is the position of the candidate food source obtained by hummingbird i through territorial foraging in iteration t + 1, and b represents the territorial factor, which obeys the standard normal distribution. Hummingbirds update the visit table after performing territorial foraging.
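The territorial move can be sketched in the same style as guided foraging, again with greedy acceptance and axial flight only for brevity (names are illustrative):

```python
# Sketch of territorial foraging (Eq. (8)): a local move around the current
# source with territorial factor b ~ N(0, 1); names are illustrative.
import numpy as np

def territorial_foraging(x_i, f, rng=None):
    rng = np.random.default_rng(rng)
    b = rng.standard_normal()                   # territorial factor b ~ N(0, 1)
    D = np.zeros(x_i.size)
    D[rng.integers(x_i.size)] = 1               # axial flight, for brevity
    v = x_i + b * D * x_i                       # Eq. (8): neighboring candidate
    return v if f(v) < f(x_i) else x_i          # greedy acceptance, as in Eq. (7)
```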

Migration Foraging
Hummingbirds tend to migrate to more distant areas to feed when food is scarce in the areas they visit. The migration coefficient M determines whether a hummingbird migrates. If the number of iterations exceeds M, the hummingbird with the worst fitness value randomly migrates to a randomly generated food source in the search space. The migration foraging behavior of the hummingbird from the food source with the worst nectar-refilling rate to a randomly generated food source can be expressed as Equation (9):

x_wor(t + 1) = Lb + r · (Ub − Lb), (9)

where x_wor(t + 1) is the food source with the worst nectar-refilling rate in the hummingbird population, and r is a random vector in [0, 1]. Hummingbirds update the visit table after migration foraging. Here, the migration coefficient is M = 2n. The pseudo-code of the original AHA can be found in reference [64].
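Migration foraging amounts to re-initializing the worst individual, as in the following sketch (minimization is assumed, so the worst source has the largest fitness value; names are illustrative):

```python
# Sketch of migration foraging (Eq. (9)): the worst hummingbird jumps to a
# uniformly random food source; names are illustrative, minimization assumed.
import numpy as np

def migration_foraging(X, fitness, lb, ub, rng=None):
    rng = np.random.default_rng(rng)
    worst = int(np.argmax(fitness))             # worst nectar-refilling rate
    X = X.copy()
    X[worst] = lb + rng.random(X.shape[1]) * (ub - lb)   # Eq. (9)
    return X
```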

Hybrid Artificial Hummingbird Algorithm
Compared with other commonly used MAs, AHA can quickly find the global optimal solution and has certain applicability and potential for solving global optimization problems. However, the original AHA still has some limitations in solving complex optimization problems, such as low accuracy and a tendency to fall into local optima. In order to make the original AHA more competitive, a new hybrid artificial hummingbird algorithm (HAHA) is proposed in this study, which makes the following three improvements to the original AHA. Firstly, an elite opposition-based learning strategy is introduced into the guided foraging process, which improves the hummingbirds' search ability and effectively strengthens the exploration ability of the standard AHA. Secondly, the PSO strategy is introduced into the exploitation stage of AHA, which helps hummingbirds learn from individuals with good fitness values in the population, accelerates convergence, and improves the accuracy of the algorithm. Lastly, the Cauchy mutation strategy is introduced into the migration foraging of hummingbirds to expand the mutation range, which helps the algorithm escape stagnation and improves the search efficiency of the original AHA.

Elite Opposition-Based Learning
AHA communicates information within the population through the visit table, which largely limits the search range of hummingbirds and easily causes the population to fall into a local optimum, thereby affecting the accuracy of the solution. In order to increase the possibility of individuals approaching the optimal value in the exploration stage, the elite opposition-based learning (EOL) strategy [69] is introduced on the basis of the original AHA to improve the exploration ability of the algorithm. EOL is an innovative search method in intelligent computing. The main idea is as follows: first, the hummingbird individual with the best fitness value is regarded as the elite individual e(t) = (e_1(t), e_2(t), . . . , e_d(t)); the elite individual is then used to generate an opposition solution to the current solution, and the better of the two replaces the original solution. The elite opposition-based solution can be defined by Equation (10):

x'_{i,j}(t) = k · (ea(t) + eb(t)) − x_{i,j}(t), (10)

where k is a random number in [0, 1], ea(t) = min(e_j(t)), eb(t) = max(e_j(t)), j = 1, . . . , d.
In guiding the foraging stage, the EOL strategy can better enable hummingbirds to forage for food sources with the highest nectar replenishment rates, improve their exploration abilities, enhance population diversity, and reduce the probability of falling into the local optimum, thereby improving the global search ability of hummingbird populations.
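A hedged sketch of the elite opposition step described above: the opposition solution is formed from the elite individual's componentwise bounds with a random coefficient k ~ U(0, 1); the exact form used in the paper may differ in detail, and all names here are illustrative:

```python
# Sketch of elite opposition-based learning: reflect the current solutions
# through the interval [ea, eb] spanned by the elite individual's components.
import numpy as np

def elite_opposition(X, elite, rng=None):
    rng = np.random.default_rng(rng)
    ea, eb = elite.min(), elite.max()   # ea(t) = min e_j(t), eb(t) = max e_j(t)
    k = rng.random()                    # k ~ U(0, 1)
    return k * (ea + eb) - X            # opposition of every current solution
```

In HAHA, the opposition solution would then be kept only if its fitness is better than the original's.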

PSO Strategy
In the exploitation stage, hummingbirds need to search for new food sources and then select the food source with the highest nectar-refilling rate as the object to be visited, according to the visit table. However, this does not consider learning from hummingbirds with good fitness values in the population, which is still a limitation. PSO [47] is an optimization algorithm proposed by Eberhart and Kennedy in 1995 that has the advantages of fast convergence and easy implementation. The velocity update equation is shown in Equation (11):

v_i(t + 1) = w · v_i(t) + c_1 r_1 (x_i,pbest − x_i(t)) + c_2 r_2 (x_gbest − x_i(t)), (11)

where c_1 and c_2 are learning factors with a value of 2, r_1 and r_2 are random numbers in [0, 1], x_i,pbest is the local optimal solution, and x_gbest represents the global optimal solution.
In Equation (12), w is the inertia weight, and w_ini = 0.4 and w_end = 0.9 are the initial and final inertia weights, respectively. As the number of iterations increases, w shows a decreasing trend. The velocity update formula of PSO is introduced into the exploitation stage of the standard AHA so that hummingbirds learn from individuals with good fitness values in the population, which increases the convergence speed and solution accuracy of the original AHA.
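The PSO velocity update can be sketched as below. The linear inertia-weight schedule is an assumption on my part, written so that w decreases from w_end = 0.9 to w_ini = 0.4 over the iterations, matching the decreasing trend described above; names are illustrative:

```python
# Sketch of the PSO velocity update (Eq. (11)) with an assumed linear
# inertia-weight schedule for Eq. (12); names are illustrative.
import numpy as np

def pso_velocity(v, x, pbest, gbest, t, T, w_ini=0.4, w_end=0.9,
                 c1=2.0, c2=2.0, rng=None):
    rng = np.random.default_rng(rng)
    w = w_end - (w_end - w_ini) * t / T      # decreases from 0.9 to 0.4 (assumed form)
    r1, r2 = rng.random(x.size), rng.random(x.size)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # Eq. (11)
```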

Cauchy Mutation Strategy
In AHA, the main purpose of migration foraging is to enhance the exploration ability of the algorithm. When the number of iterations exceeds the migration coefficient M, the hummingbird with the worst nectar-refilling rate migrates to a randomly generated food source, which realizes the global exploration of the algorithm. However, the experiments show that the standard AHA still easily falls into local optima.
In this paper, the Cauchy mutation strategy [70] is introduced to generate large disturbances near randomly generated hummingbird individuals, improving the mutation ability of the population, enlarging the mutation range, and thus preventing the algorithm from falling into a local optimum prematurely. The Cauchy distribution is a continuous probability distribution; the one-dimensional Cauchy density function is shown in Equation (13):

f(x; δ, µ) = (1/π) · µ / ((x − δ)^2 + µ^2), −∞ < x < +∞. (13)

When δ = 0 and µ = 1, the Cauchy density function reduces to the standard form of Equation (14):

f(x; 0, 1) = 1 / (π(1 + x^2)), (14)

and the standard Cauchy distribution function is described by Equation (15):

F(x) = 1/2 + (1/π) · arctan(x). (15)

Equation (16) is used to perform Cauchy mutation on the randomly generated food source in migration foraging:

x_cauchy(t + 1) = x_wor(t + 1) + r × cauchy(0, 1), (16)

where cauchy(0, 1) is the standard Cauchy mutation operator. The Cauchy mutation strategy introduced in the exploration stage of the original AHA lets hummingbird individuals learn from other random individuals in the population, which expands the search range and increases the diversity of the population, thereby effectively improving the accuracy and convergence speed of AHA.
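Equation (16) can be sketched by inverse-transform sampling of the standard Cauchy distribution (Equation (15)): cauchy(0, 1) = tan(π(u − 1/2)) for u ~ U(0, 1). Names are illustrative:

```python
# Sketch of the Cauchy mutation of Eq. (16), drawing cauchy(0, 1) samples by
# inverting the standard Cauchy distribution function; names are illustrative.
import numpy as np

def cauchy_mutation(x_wor, rng=None):
    rng = np.random.default_rng(rng)
    r = rng.random(x_wor.size)
    cauchy = np.tan(np.pi * (rng.random(x_wor.size) - 0.5))   # cauchy(0, 1) samples
    return x_wor + r * cauchy                                 # Eq. (16)
```

The heavy tails of the Cauchy distribution occasionally produce very large jumps, which is exactly the wide-range disturbance the strategy relies on.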

Detailed Steps for the Proposed HAHA
This part details the specific steps of the proposed HAHA, which combines the EOL strategy, the PSO strategy, and the Cauchy mutation with the original AHA to improve its performance. Figure 3 summarizes the specific implementation steps and flow chart of the proposed HAHA.

Computational Complexity of the Proposed HAHA
Computational complexity is one of the significant indicators used to evaluate the efficiency of an algorithm, including space complexity and time complexity. The computational complexity of the proposed HAHA is related to the algorithm initialization (Init), the individual fitness-value evaluation (FV) in each iteration, the problem dimension D, the population size n, and the maximum number of iterations T. In this paper, "Oh" denotes the computational complexity of the algorithm. Initialization assigns values to each dimension of the hummingbirds, so its computational complexity is Oh(nD). HAHA calculates the individual fitness values in each iteration, so this computational complexity is Oh(T·FV·n). HAHA introduces EOL in guided foraging (gui fora), which increases the computational complexity of AHA; the complexity of this stage is Oh(TnD/2 + TnD/2). The PSO strategy is introduced in territorial foraging (ter fora), and the complexity of this stage is Oh(TnD/2 + TnD/2). The Cauchy mutation strategy is introduced in migration foraging (mig fora), so its complexity is Oh((TnD + TnD)/(2n)). Therefore, the overall computational complexity of the proposed HAHA can be expressed by Equation (17):

Oh(HAHA) = Oh(problem definition) + Oh(Init) + Oh(T(FV)) + Oh(T(gui fora)) + Oh(T(ter fora)) + Oh(T(mig fora))
         = Oh(1 + nD + T·FV·n + (TnD + TnD)/2 + (TnD + TnD)/2 + (TnD + TnD)/(2n))
         = Oh(1 + nD + T·FV·n + 2TnD + TD). (17)

Numerical Experiments and Analysis
In this section, the performance of the proposed HAHA is simulated on 25 benchmark functions and the CEC 2022 benchmark functions and compared with other optimization algorithms and other improved AHAs. The optimization ability, convergence, and statistical tests of HAHA are evaluated, which further verifies the superiority of HAHA across a series of evaluation indicators. In addition, in order to guarantee the reliability and persuasiveness of the experimental results, the compilation environment for all experiments is the same: all algorithms are compiled and run in MATLAB R2017b on Windows 11 with an AMD Ryzen 7 4800H with Radeon Graphics @ 2.90 GHz and 16 GB RAM, and the experimental results are obtained by running each test function 30 times independently.

Benchmark Functions
As part of the study, 25 benchmark test functions and the challenging CEC 2022 test suite were used to evaluate the performance of the proposed HAHA. The details of the 25 benchmark functions are shown in Appendix A, Table A1. These functions contain uni-modal, multi-modal, hybrid, composition, and fixed-dimensional functions, which are well suited to evaluating algorithm performance. Among the 25 benchmark functions, F1 is a uni-modal function with a single extreme value, suitable for testing the exploitation and local search abilities of the algorithm. F2-F5 are multi-modal functions with multiple local minima, usually used to evaluate the ability of algorithms to explore and jump out of local optima. F6-F10 are hybrid functions composed of multi-modal or uni-modal functions, used to test the balance between exploration and exploitation. Composition functions F11-F15 are composed of basic and hybrid functions and pose more complicated problems. F16-F20 are fixed-dimensional functions taken from the CEC 2019 test functions [71]; the complexity of their search spaces is significantly higher and more challenging, and they are used to evaluate the comprehensive ability of the algorithm.

Algorithm Parameter Settings
The proposed HAHA is compared with other algorithms, including the original AHA [64], PSO [47], WOA [53], SCA [38], HHO [54], the Seagull Optimization Algorithm (SOA) [57], SSA [58], AVOA [60], CryStAl [40], DMOA [61], Sand Cat Swarm Optimization (SCSO) [72], GJO [63], and AOAHA [65], where AOAHA is an improved version of AHA. Table 2 shows the parameter settings of some algorithms; the parameters of the remaining comparison algorithms are the same as in the corresponding references. Each algorithm is run independently 30 times on each benchmark function, and the calculation results of all algorithms are based on their average performance.

Results and Analyses for 25 Benchmark Functions
In this experiment, in order to objectively and fairly evaluate the proposed HAHA, the average value (Avg) and standard deviation (Std) of the best solutions obtained over the independent runs of each test function are used as evaluation indicators, calculated as follows [73]:

Avg = (1/runs) · Σ_{i=1}^{runs} f*_i, Std = sqrt((1/(runs − 1)) · Σ_{i=1}^{runs} (f*_i − Avg)^2),

where f*_i is the best solution obtained in the i-th independent run, and runs represents the number of independent runs of the function. Table 3 shows the statistical results of the 14 algorithms run independently 30 times on the 25 test functions, including Avg, Std, the Wilcoxon rank-sum test p-value, and the average rank. The best results are marked in bold. The uni-modal function F1 has only a global optimal value and is used to test the local exploitation ability of the algorithm. The average value of the optimal solution obtained by HAHA on F1 is the smallest, which indicates that the proposed HAHA has very effective exploitation abilities. F2-F5 in Table 3 are the evaluation results for the multi-modal functions. It can be seen that the experimental results of HAHA are generally better than those of the competing algorithms on such functions, especially on F3 and F4, which proves that the proposed HAHA has good exploration ability and effectively avoids falling into local optima. Hybrid and composition functions are mostly used to evaluate the balance between exploitation and exploration. As shown in Table 3, the proposed HAHA is often more competitive on functions F6, F7, F10, F12, and F15; compared with the other algorithms, HAHA strikes a good balance between exploitation and exploration. F16-F20 are the fixed-dimensional functions selected from CEC 2019 and are more challenging. HAHA reaches the optimal value of 1 on F16 and has obvious advantages on F19, F21, F22, and F25, which further indicates that HAHA can better explore the optimal solutions of complex problems.
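The Avg and Std indicators above can be computed with a small sketch; `f_star` holds the best solutions from the independent runs, and the sample (runs − 1) form of the standard deviation is assumed:

```python
# Sketch of the Avg/Std evaluation indicators over independent runs.
import numpy as np

def avg_std(f_star):
    runs = len(f_star)
    avg = np.sum(f_star) / runs                               # Avg = (1/runs) * sum f*_i
    std = np.sqrt(np.sum((f_star - avg) ** 2) / (runs - 1))   # sample Std
    return avg, std
```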
From the standard deviation, it can be seen that the performance of the proposed HAHA is stable. At the end of Table 3, the final ranking of each algorithm on 25 test sets is given, with HAHA ranking first on average.
In addition to the statistical analysis of the data by means and standard deviations, the Wilcoxon rank-sum test is also used to assess statistical differences between the proposed HAHA and the other competing algorithms [74]. The p-value is used to determine whether a given algorithm is significantly different from the algorithm under test; when the p-value < 0.05, the algorithm is significantly different from HAHA, and the difference is statistically significant. Table 3 presents the p-values for HAHA and the other comparison algorithms on the 25 benchmark functions. In most cases, the p-values are less than 0.05, indicating a significant difference between HAHA and the other algorithms. The last line of Table 3 gives the number of significant differences, expressed as (+/=/−), where "+" indicates that the compared algorithm performs better than the proposed HAHA, "=" indicates that HAHA and the compared algorithm have similar performance, and "−" indicates that the compared algorithm is not as good as HAHA. Compared with the original AHA, HAHA differs significantly on 19 test functions, and HAHA's performance is better than that of AHA. HAHA also shows significant differences compared with PSO, WOA, SCA, HHO, SOA, SSA, AVOA, CryStAl, DMOA, SCSO, and GJO. Therefore, HAHA has good performance, and the differences are statistically significant.
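As an illustrative sketch of the rank-sum comparison (normal approximation, no tie correction; in practice a library routine such as scipy.stats.ranksums would be used):

```python
# Sketch of the Wilcoxon rank-sum test: rank-sum statistic with a normal
# approximation for the two-sided p-value; no tie correction is applied.
import numpy as np
from math import erf, sqrt

def rank_sum_p(a, b):
    n1, n2 = len(a), len(b)
    # ranks 1..n1+n2 of the pooled sample (double argsort gives the rank order)
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1
    W = ranks[:n1].sum()                        # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2                 # mean of W under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # std of W under H0
    z = (W - mu) / sigma
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value
```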
Meanwhile, the Friedman test [75] is also commonly used as a popular method for nonparametric testing, and Table 4 shows the results of the Friedman test for each algorithm on 25 test functions. The proposed HAHA has the best Friedman test results for most test functions, and the average ranking of algorithm performance is shown in Table 4. Compared with other comparison algorithms, HAHA has the best average Friedman test result of 2.4 for 25 benchmark functions.
The convergence of the proposed algorithm is verified by comparing the convergence curves of the proposed HAHA and the competing algorithms. Figure 4 shows the convergence curves on the 25 test functions, obtained by averaging the best solutions of each algorithm over 1000 iterations. As can be seen from Figure 4, the proposed HAHA is more competitive than the competing algorithms. In the initial stage of iteration, HAHA converges faster, especially on F3, F7, F16, F19, F21, and F22. As the number of iterations increases, the algorithm quickly converges to the optimal solution with high convergence accuracy. Throughout the iteration process, the proposed HAHA maintains an intelligent balance between exploration and exploitation, effectively reducing the possibility of premature convergence. Figure 5 shows box plots of the optimal-solution distributions for each function. For most functions, the boxes for HAHA are lower, indicating that the proposed HAHA has better performance and stronger robustness. Figure 6 shows radar charts comparing HAHA with the other competitive algorithms; the larger the enclosed area, the worse the algorithm's ranking, and conversely, the smallest area indicates the best overall performance. Figure 7 shows that the average rank of the HAHA algorithm is the smallest, indicating that it ranks first among the competing algorithms. This result again proves the superiority of the proposed HAHA.
During the entire iteration process, the proposed HAHA maintains a balance between exploration and exploitation, effectively reducing the possibility of premature convergence and allowing the algorithm to converge quickly to the optimal solution with high accuracy. Figure 5 shows box plots of the optimal-solution distribution for each function. For most functions, the box of HAHA sits lower, indicating that the proposed HAHA has better performance and stronger robustness. Figure 6 shows radar charts comparing HAHA with the other competitive algorithms: the larger the enclosed area, the lower the algorithm's ranking, so the minimum area means the best overall performance. Figure 7 shows that HAHA has the smallest average rank, placing it first among the compared algorithms and again confirming its superiority.
The computational cost is also an important criterion for evaluating the performance of an algorithm. Table 5 reports the average runtime, in seconds, of HAHA and the competing algorithms on the test set. Compared with the original AHA, the computational cost of HAHA is inevitably higher, which is consistent with the earlier complexity analysis and is an issue to address in subsequent research.
To sum up, compared with other intelligent algorithms, HAHA effectively improves the exploration and exploitation capabilities of the algorithm, avoids falling into local optima, and shows good competitiveness.
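The significance testing described above can be reproduced with SciPy's Wilcoxon rank-sum test. The sketch below uses made-up run results (the arrays are illustrative, not the paper's data) to show how a p-value below 0.05 indicates a statistically significant difference between two algorithms over 30 independent runs.

```python
# Sketch of the per-function significance test, assuming two samples of best
# fitness values from 30 independent runs (hypothetical data, not the paper's).
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
haha_runs = rng.normal(loc=0.01, scale=0.005, size=30)   # hypothetical HAHA results
other_runs = rng.normal(loc=0.05, scale=0.010, size=30)  # hypothetical competitor results

stat, p_value = ranksums(haha_runs, other_runs)
# p < 0.05 means the difference between the two algorithms is significant
print(f"Wilcoxon rank-sum statistic = {stat:.3f}, p-value = {p_value:.3e}")
print("significant at the 5% level" if p_value < 0.05 else "not significant")
```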

Results and Analyses on CEC 2022 Benchmark Functions
In this section, the latest CEC 2022 test functions are selected to further evaluate the performance of the proposed HAHA against other advanced intelligent algorithms and improved variants of AHA, including PSO [47], WOA [53], SCA [38], HHO [54], SOA [71], SSA [58], SAO [41], POA [62], the Kepler Optimization Algorithm (KOA) [76], SCSO [72], GJO [63], AOAHA [65], and AHA [64]. The CEC 2022 test functions simulate highly complex global optimization problems and are very challenging. To ensure the fairness and persuasiveness of the experimental results, all functions were tested in 10-dimensional space, the algorithm parameters were set as in Table 2, and the results were averaged over 30 independent runs. Table 6 presents the experimental results of the 30 independent runs of HAHA and the competing algorithms on the CEC 2022 test set, including the mean, standard deviation, ranking, and p-value. HAHA performs best on 9 of the test functions, and its average ranking of 1.250 is the best overall. The Wilcoxon rank-sum p-values in Table 6 are below 0.05 for most algorithms, indicating that the differences between HAHA and its competitors are statistically significant. The convergence curves in Figure 8 show that HAHA can effectively jump out of local optima and quickly approach the global optimal solution. These results indicate that HAHA is strongly competitive and can serve as a powerful tool for solving global optimization problems.
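The "average ranking" reported in Table 6 can be computed by ranking each algorithm on every test function by its mean error (1 = best) and averaging the per-function ranks. The sketch below uses made-up mean errors for four hypothetical columns; it is not the paper's data.

```python
# Average-rank computation across test functions (illustrative data only).
import numpy as np

algorithms = ["HAHA", "AHA", "PSO", "WOA"]
# rows: test functions, columns: algorithms (hypothetical mean errors)
mean_errors = np.array([
    [1e-8, 1e-5, 3e-2, 5e-3],
    [2e-6, 4e-6, 1e-1, 2e-2],
    [5e-9, 7e-7, 8e-3, 9e-4],
])

# argsort applied twice yields each entry's rank within its row (0-based); +1 makes it 1-based
ranks = mean_errors.argsort(axis=1).argsort(axis=1) + 1
avg_rank = ranks.mean(axis=0)
for name, r in zip(algorithms, avg_rank):
    print(f"{name}: average rank {r:.3f}")
```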

Construction of CSGC-Ball Curves
In this section, the CSGC-Ball curves, composed of N spliced SGC-Ball curve segments, are first defined; they allow more flexible and controllable complex curves to be constructed. Secondly, in order to make the constructed curves smooth and continuous, the continuity conditions for G1 and G2 smooth splicing of the CSGC-Ball curves are studied. Finally, an example of CSGC-Ball curves is given.
Definition 1. The shape-adjustable generalized cubic Ball (SGC-Ball, for short) curves can be defined as [21]
p(t; Ω) = Σ_{i=0}^{3} b_{i,3}(t) P_i, t ∈ [0, 1],
where P_i ∈ R^u (u = 2, 3; i = 0, 1, 2, 3) are the control points of the curves; Ω = {ω, λ1, λ2, λ3} are the shape parameters, with ω ∈ [0, 1] the global shape parameter and λ1, λ2, λ3 the local shape parameters; and b_{i,3}(t) (i = 0, 1, 2, 3) are the SGC-Ball basis functions defined in [21].
Compared with the traditional generalized Ball curves, the SGC-Ball curves are satisfactory for constructing simple curve shapes, but a single SGC-Ball curve can hardly meet the requirements for constructing the complex geometric curves encountered in real life and has certain limitations. It is therefore of great significance to construct complex combinations of SGC-Ball curves, which are defined as follows.
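The segment definition above has the standard form p(t) = Σ b_{i,3}(t) P_i. The sketch below illustrates evaluating one such segment. Since the SGC-Ball basis functions with shape parameters (ω, λ1, λ2, λ3) are given in [21] and not reproduced here, the cubic Bernstein basis is used as a stand-in; it shares the same structure (four blending functions that sum to one and interpolate the endpoints).

```python
# Evaluating a single cubic segment p(t) = sum_i b_i(t) * P_i.
# NOTE: bernstein_cubic is a stand-in; the real SGC-Ball basis b_{i,3}(t; Omega)
# from [21] would be substituted here.
import numpy as np

def bernstein_cubic(t):
    """Stand-in cubic basis; replace with the SGC-Ball basis to reproduce the paper."""
    return np.array([(1 - t)**3, 3*t*(1 - t)**2, 3*t**2*(1 - t), t**3])

def eval_curve(control_points, t):
    """p(t) as the basis-weighted combination of the four control points."""
    b = bernstein_cubic(t)
    return b @ np.asarray(control_points, dtype=float)

P = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(eval_curve(P, 0.0))   # curve starts at P0
print(eval_curve(P, 1.0))   # curve ends at P3
```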
The CSGC-Ball curves require two adjacent curve segments to be smoothly and continuously joined. In order for the constructed CSGC-Ball curves to be meaningful, the G1 and G2 smooth-splicing continuity conditions between the j-th and (j+1)-th SGC-Ball curves are discussed below. Theorem 1. If the control vertices and shape parameters of the j-th and (j+1)-th SGC-Ball curve segments satisfy Equation (23) at node u_j, with in particular P_{0,j+1} = P_{3,j}, then the CSGC-Ball curves are said to be G1 continuous at node u_j; here k > 0 is an arbitrary constant. If the CSGC-Ball curves satisfy Equation (23) at every node u_j (j = 1, ..., N), then the CSGC-Ball curves are G1 continuous as a whole. In particular, when k = 1, Equation (23) is a necessary and sufficient condition for the CSGC-Ball curves to be C1 continuous at the nodes u_j (j = 1, 2, ..., N). The proof of Theorem 1 is given in Appendix B.1.
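The core of the G1 condition can be sketched as follows. This is a simplification of Equation (23): it keeps only the positional constraint P_{0,j+1} = P_{3,j} and a tangent direction scaled by k > 0, and assumes (as for Bernstein-like bases) that the end tangent is proportional to P_{3,j} − P_{2,j}; the shape-parameter factors of the actual SGC-Ball condition are omitted.

```python
# Simplified G1 splicing: construct the first two control points of segment j+1
# from the last two control points of segment j (shape-parameter factors omitted).
import numpy as np

def g1_start_points(P2_j, P3_j, k=1.0):
    """Return (P0_{j+1}, P1_{j+1}) so the joint at u_j stays G1-continuous."""
    P0_next = np.asarray(P3_j, dtype=float)                      # P0,j+1 = P3,j
    P1_next = P0_next + k * (P0_next - np.asarray(P2_j, float))  # tangent direction preserved, scaled by k
    return P0_next, P1_next

P0n, P1n = g1_start_points(P2_j=(3.0, 2.0), P3_j=(4.0, 0.0), k=1.0)
print(P0n, P1n)   # with k = 1 the tangent magnitude is preserved as well (C1-like)
```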

Theorem 2.
If the control vertices and shape parameters of the j-th and (j+1)-th SGC-Ball curve segments satisfy the G2 splicing conditions at node u_j, then the CSGC-Ball curves are said to be G2 continuous at the connection node u_j; here k > 0 and β is an arbitrary constant. If the CSGC-Ball curves satisfy the G2 continuity condition at every node u_j (j = 1, 2, ..., N), then the CSGC-Ball curves are G2 continuous as a whole. The proof of Theorem 2 is given in Appendix B.2.
According to the definition of the CSGC-Ball curves and the G1 smooth-splicing continuity condition of Theorem 1, Figure 9 gives examples of CSGC-Ball curves satisfying the overall G1 smooth-splicing condition for N = 5. Different colors represent the individual SGC-Ball curves to be spliced. Ω_j = (ω, λ_{1,j}, λ_{2,j}, λ_{3,j}), j = 1, 2, ..., 5, are the shape parameters of the j-th SGC-Ball segment, where ω is the global shape parameter of the CSGC-Ball curves and λ_{1,j}, λ_{2,j}, λ_{3,j} (j = 1, 2, ..., 5) are the local shape parameters. The figure involves 16 variables: 1 global shape parameter and 15 local shape parameters. Figure 9a-c depicts the overall G1 smoothly spliced CSGC-Ball curves with parameter values Ω_j = (1, 1, 1, 1), Ω_j = (0.5, 1, 1, 1), and Ω_j = (0, 1, 1, 1) (j = 1, 2, ..., 5), respectively. The local shape parameters are the same while the global shape parameter differs, showing that ω controls the overall shape change of the curves. Figure 9d-f compares curves of the same shape with different local shape parameters: the solid line "-" represents the curves with local shape parameter value 1, the dashed line "--" the curves with value 0, and the dash-dot line "-." the curves with value 2. From Figure 9 it can be seen that the local shape parameters control local shape changes of the CSGC-Ball curves, while the global shape parameter controls the overall shape. When the shape parameters change, the control points of the curves change accordingly, and the curves stay close to their corresponding control points.
According to the G2 smooth-splicing continuity condition of the CSGC-Ball curves given by Theorem 2, Figure 10 shows examples of a spatial curve designed with overall G2 smoothly spliced CSGC-Ball curves for N = 3. This CSGC-Ball curve involves 10 variables: 1 global shape parameter and 9 local shape parameters. Figure 10a-c shows the overall G2 smooth CSGC-Ball curves with shape parameters Ω_j = (1, 1, 1, 1), Ω_j = (0.5, 1, 1, 1), and Ω_j = (0, 1, 1, 1) (j = 1, 2, 3), respectively. Figure 10d-f displays comparison curves on the same graph for different local shape parameter values. When the shape parameters differ, some control points of the CSGC-Ball curves are adjusted appropriately so that the overall G2 smooth-splicing continuity condition is still met.

CSGC-Ball Curve-Shape Optimization Model
The bending energy of a curve approximately reflects its smoothness, and the two are negatively correlated: the smaller the bending energy, the smoother the curve, and vice versa. Therefore, the G1 and G2 continuous shape optimization models of the CSGC-Ball curves can be established, respectively, by minimizing the curve bending energy.
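A common form of the bending-energy objective is E = ∫₀¹ ||p″(t)||² dt. The sketch below approximates it numerically with second-order finite differences for an arbitrary parametric curve; the CSGC-Ball curve (as a function of its shape parameters) would be substituted for the example curve.

```python
# Numerical approximation of the bending energy E = \int_0^1 ||p''(t)||^2 dt
# via central second differences on a uniform grid.
import numpy as np

def bending_energy(curve, n=1000):
    t = np.linspace(0.0, 1.0, n + 1)
    pts = np.array([curve(ti) for ti in t])               # sampled curve, shape (n+1, dim)
    h = t[1] - t[0]
    d2 = (pts[2:] - 2 * pts[1:-1] + pts[:-2]) / h**2      # second-derivative estimate
    return float(np.sum(d2**2) * h)                       # rectangle-rule integral

# Sanity check: a straight line has zero curvature, hence (near-)zero bending energy.
line = lambda t: np.array([t, 2.0 * t])
print(f"straight-line energy = {bending_energy(line):.2e}")
```

A parabola p(t) = (t, t²) has p″ = (0, 2) and therefore exact energy 4, which the finite-difference estimate reproduces closely; this makes a convenient correctness check before plugging in the real curve.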
Due to the high nonlinearity of the objective function, it is not an easy task to solve the established optimization model using traditional optimization methods. Therefore, the objective function of the CSGC-Ball curve-shape optimization models is regarded as the fitness function, and the proposed HAHA algorithm can be used to obtain the energy optimal solution of the established optimization models.

Steps for HAHA to Solve the CSGC-Ball Curve-Shape Optimization Model
This subsection introduces the detailed steps for solving the established CSGC-Ball curve-shape optimization model with the proposed HAHA:
Step 1: Set the relevant parameters, for example, n, T, Ub, Lb, and the CSGC-Ball curve control points.
Step 2: Initialization. When t = 1, randomly initialize the hummingbird population by Equation (1) to obtain the positions of the n hummingbirds; take the bending energy E of the CSGC-Ball curves as the fitness function; calculate E for each individual; record the best fitness value as the optimal solution E_best; and initialize the visit table.
Step 3: If rand > 0.5, perform Step 5 and Step 6; otherwise, use Equation (6) to obtain the guided-foraging candidate solution v_i(t + 1) and Equation (10) to obtain the elite opposition-based solution x_{i,elite}(t + 1); if E(x_{i,elite}(t + 1)) < E(v_i(t + 1)), then v_i(t + 1) = x_{i,elite}(t + 1).
Step 4: If E(v_i(t + 1)) < E(x_i(t)), then x_i(t + 1) = v_i(t + 1), and update the visit table.
Step 5: Use Equation (8) to execute the territorial-foraging strategy of the hummingbirds and obtain the candidate solution v_i(t + 1), then obtain v_{i,p}(t + 1) by Equation (11); if E(v_{i,p}(t + 1)) < E(v_i(t + 1)), then v_i(t + 1) = v_{i,p}(t + 1).
Step 6: If E(v_i(t + 1)) < E(x_i(t)), then x_i(t + 1) = v_i(t + 1), and update the visit table.
Step 7: If mod(t, 2n) == 0, the solution with the largest energy value performs migration foraging by Equation (9) to obtain the random solution x_wor(t + 1), on which Cauchy mutation is performed by Equation (16) to obtain the mutated solution x_cauchy(t + 1); if E(x_cauchy(t + 1)) < E(x_wor(t + 1)), then x_wor(t + 1) = x_cauchy(t + 1), and update the visit table. Otherwise, go to Step 8.
Step 8: Set t = t + 1; if t < T, return to Step 3; otherwise execute Step 9.
Step 9: Output the best energy value E_best of the established CSGC-Ball curves and the corresponding shape parameter values.
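The steps above can be sketched as a compact optimization loop. This is a schematic only: the fitness function and the operators of Equations (1), (6), (8)-(11), and (16) are replaced by simple stand-ins (a sphere function, Gaussian steps, a mirrored-point opposition, a pull toward the best solution, and a Cauchy step), and the visit table is abstracted away.

```python
# Schematic of Steps 1-9 with stand-in operators (not the paper's exact equations).
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):                                   # stand-in for the bending energy E
    return float(np.sum(x**2))

def haha_minimize(dim=5, n=20, T=200, lb=-1.0, ub=1.0):
    X = rng.uniform(lb, ub, size=(n, dim))        # Step 2: random initialization
    F = np.array([fitness(x) for x in X])
    best = X[F.argmin()].copy()
    for t in range(1, T + 1):                     # Steps 3-8: main loop
        for i in range(n):
            if rng.random() > 0.5:                # territorial foraging branch (Steps 5-6)
                v = X[i] + rng.normal(size=dim) * 0.1
                v_pso = v + rng.random() * (best - v)   # PSO-style pull (stand-in for Eq. (11))
                if fitness(v_pso) < fitness(v):
                    v = v_pso
            else:                                 # guided foraging branch (Steps 3-4)
                j = rng.integers(n)
                v = X[i] + rng.normal() * (X[j] - X[i])
                v_eol = lb + ub - v               # simplified opposition (stand-in for Eq. (10))
                if fitness(v_eol) < fitness(v):
                    v = v_eol
            v = np.clip(v, lb, ub)
            if fitness(v) < F[i]:                 # greedy selection (Steps 4/6)
                X[i], F[i] = v, fitness(v)
        if t % (2 * n) == 0:                      # Step 7: migration + Cauchy mutation
            w = F.argmax()                        # worst (largest-energy) individual migrates
            x_cauchy = np.clip(X[w] + rng.standard_cauchy(dim) * 0.1, lb, ub)
            if fitness(x_cauchy) < F[w]:
                X[w], F[w] = x_cauchy, fitness(x_cauchy)
        if F.min() < fitness(best):
            best = X[F.argmin()].copy()
    return best, fitness(best)                    # Step 9: best solution and E_best

best, E_best = haha_minimize()
print(f"best energy found = {E_best:.4e}")
```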

Numerical Examples
In order to demonstrate the effectiveness and excellence of the proposed HAHA in solving the established CSGC-Ball curve-shape optimization model, this section gives some representative numerical examples in which the model is solved by HAHA and other advanced algorithms, and the results are compared. In all numerical examples, the algorithm parameters are as shown in Table 2, the population size is 50, and the maximum number of iterations is 1000.
Example 5.1. This numerical example presents the "letter W" graph designed with complex CSGC-Ball curves that satisfy the overall G2 smooth-splicing continuity condition. The shape is composed of eight G2 smoothly spliced SGC-Ball curve segments; different colors represent different SGC-Ball curves, and the black lines are auxiliary lines. The convergence curves of the objective function of the optimization model are also provided. For the CSGC-Ball curves with overall G2 smooth splicing, it is only necessary to give the coordinates of the control points P_{0,0}, P_{0,1}, P_{0,2}, P_{0,3}, P_{1,3}, P_{2,3}, P_{3,3} and P_{4,0}, P_{4,1}, P_{4,2}, P_{4,3}, P_{5,3}, P_{6,3}, P_{7,3}; the remaining control vertices of the curves to be spliced can be calculated from the G2 smooth-splicing continuity condition and the known control vertices.
In this example, a total of 25 shape parameters need to be optimized: 1 global shape parameter and 24 local shape parameters. Figure 11 shows the CSGC-Ball curves and the energy convergence diagrams obtained by solving the established shape optimization model with HAHA and the other optimization algorithms. Figure 11a,b shows the "letter W" CSGC-Ball curves with overall G2 smooth splicing for freely given shape parameter values. Figure 11c-h describes the minimum-energy, overall G2 smoothly spliced CSGC-Ball curves obtained after optimization by PSO, WOA, SCA, HHO, GWO, and HAHA, respectively. Figure 11i shows the energy convergence diagram of each algorithm on the G2 smooth-splicing shape optimization model; the proposed HAHA solves the model with the highest convergence accuracy.
Appendix C Table A2 shows the optimal shape parameters and minimum energy values obtained by each algorithm for the overall G2 smooth-splicing shape optimization model. The results show that the proposed HAHA is more competitive than the other optimization algorithms in solving the optimization model of the CSGC-Ball curves satisfying the G2 smooth-splicing continuity condition: it attains the minimum energy value of 41.7970 and thus the smoothest graphics.
Example 5.2. This example gives the "snail on grass" diagram designed with complex CSGC-Ball curves under hybrid G0, G1, and G2 smooth splicing, together with the convergence curves of the optimization model. Different colors represent different curves. The graph is composed of 29 SGC-Ball curve segments and involves 88 shape parameters to be optimized: 1 global shape parameter and 87 local shape parameters.
Using PSO, WOA, SCA, HHO, SMA (Slime Mould Algorithm) [77], and HAHA to solve the shape optimization model, the ideal optimal shape of the CSGC-Ball curves satisfying the hybrid G0, G1, and G2 smooth splicing can be obtained. Figure 12 shows the minimum-energy CSGC-Ball curves with hybrid G0, G1, and G2 smooth splicing, and the corresponding energy convergence diagrams, obtained by solving the curve-shape optimization model with HAHA and the other advanced optimization algorithms. Figure 12a,b shows the graphs constructed from the hybrid G0, G1, and G2 smoothly spliced CSGC-Ball curves with freely given shape parameter values. Figure 12c-h shows the minimum-energy CSGC-Ball curves with hybrid G0, G1, and G2 smooth splicing obtained by the different optimization algorithms, respectively. Figure 12i shows the energy convergence diagram of each algorithm on the hybrid G0, G1, and G2 smooth-splicing shape optimization model. When the number of iterations reaches 200, the energy value of the model solved by HAHA becomes stable, and HAHA has the highest convergence accuracy among the compared algorithms.
Appendix D Table A3 shows the optimal shape parameter values and the minimum energy values of the graphs designed by the mixed G 0 , G 1 , and G 2 smoothly spliced CSCG-Ball curves obtained by each algorithm. Among all the algorithms, the CSGC-Ball curve obtained by the proposed HAHA with the smooth splicing of mixed G 0 , G 1 , and G 2 is the smoothest, and the obtained energy value is 182.437. The effectiveness of HAHA in solving the CSGC-Ball curve-shape optimization model is fully demonstrated.


Conclusions and Future Research
In this paper, complex CSGC-Ball curves with global and local shape parameters are constructed based on the SGC-Ball basis functions, and the geometric conditions for G1 and G2 continuity splicing between adjacent SGC-Ball curves are studied. The constructed CSGC-Ball curves can not only represent more complex geometric product shapes encountered in practice but also allow the overall or local shape of the curves to be adjusted more flexibly by changing the corresponding shape parameters, giving the curves higher shape adjustability.
In addition, a novel improved HAHA is proposed, which combines elite opposition-based learning (EOL), PSO, and Cauchy mutation with AHA. The introduction of the EOL strategy better balances the exploration and exploitation of the algorithm and increases its optimization ability. In the exploitation stage, the PSO strategy accelerates convergence and further improves the optimization ability of the algorithm. Cauchy mutation increases the diversity of the population and improves the algorithm's ability to jump out of local optima. To evaluate the overall performance of HAHA, it was compared with other advanced intelligent algorithms on 25 benchmark functions and the CEC 2022 test set; the experimental results verify that the proposed HAHA has clear superiority and competitiveness in solving global optimization problems.
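The intuition behind the Cauchy-mutation component can be demonstrated empirically: the Cauchy distribution has far heavier tails than the Gaussian, so mutated individuals occasionally take large jumps, which helps escape local optima. The sketch below compares the two tail frequencies.

```python
# Empirical comparison of Gaussian vs. Cauchy step sizes (why Cauchy mutation
# produces the occasional large jump that aids escape from local optima).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
gauss = rng.normal(size=n)
cauchy = rng.standard_cauchy(size=n)

big_gauss = np.mean(np.abs(gauss) > 5)    # essentially never for a unit Gaussian
big_cauchy = np.mean(np.abs(cauchy) > 5)  # roughly 12-13% for a standard Cauchy
print(f"P(|step| > 5): Gaussian ~ {big_gauss:.5f}, Cauchy ~ {big_cauchy:.5f}")
```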
Finally, according to the minimum bending energy of the curves, the CSGC-Ball curve-shape optimization models are established, and the specific steps for HAHA to solve them are given. Two representative numerical examples verify the effectiveness of HAHA in solving the CSGC-Ball curve-shape optimization models. It is worth noting, however, that while the proposed HAHA exhibits advantages and competitiveness in solving optimization problems with continuous variables, it has certain limitations in non-continuous decision spaces. In future research, the proposed HAHA can be applied to optimization problems in feature selection, image segmentation, and machine learning. In addition, we will consider extending the research technique of combined SGC-Ball interpolation curves to the CQGS-Ball surfaces in [78] and utilizing the intelligent algorithms in [79-81] to investigate the shape optimization problem of the surfaces.

Institutional Review Board Statement: Not applicable.

Informed Consent Statement: Not applicable.
Data Availability Statement: All data generated or analyzed during the study are included in this published article.

Conflicts of Interest:
The authors declare no conflict of interest.

Appendix A. Twenty-Five Benchmark Functions

Appendix B.1. Proof of Theorem 1
Proof. If the j-th and (j+1)-th segments of the CSGC-Ball curves meet the G1 continuity condition at the connection point u_j, then G0 continuity should hold first, that is, P_{0,j+1} = P_{3,j}. Furthermore, the two curves should have the same unit tangent vector at node u_j. From the endpoint properties of the SGC-Ball curves, substituting into Equation (A2) and rearranging yields the stated conditions, where k > 0 is an arbitrary constant. Theorem 1 is proved.

Appendix B.2. Proof of Theorem 2
Proof. If the j-th and (j+1)-th segments of the CSGC-Ball curves meet the G2 continuity condition at node u_j, then G1 continuity should be satisfied first, as given by Theorem 1.