Ameliorated Snake Optimizer-Based Approximate Merging of Disk Wang–Ball Curves

A method for the approximate merging of disk Wang–Ball (DWB) curves based on the multi-strategy enhanced snake optimizer (BEESO) is proposed in this paper to address the difficulty of merging DWB curves. By extending the approximate merging problem from traditional curves to disk curves and viewing it as an optimization problem, an approximate merging model is established to minimize the merging error through an error formulation. Considering the complexity of the model, a BEESO with better convergence accuracy and convergence speed is introduced, which combines the snake optimizer (SO) with three strategies: bi-directional search, evolutionary population dynamics, and elite opposition-based learning. The merging results and merging errors of numerical examples demonstrate that BEESO is effective in solving approximate merging models, and it provides a new method for the compression and transfer of product shape data in Computer-Aided Geometric Design.


Introduction
Computer-Aided Geometric Design (CAGD for short) [1] takes the representation, drawing, display, analysis, and processing of product geometric shape information as its core research content, and it occupies an important position in manufacturing, medical diagnosis, artificial intelligence, computer vision, and other fields. Product geometry design is the focus of CAGD research, and free-form curves and surfaces are an important tool for describing product geometry. From the Ferguson method [2] to Bézier [3,4], B-spline [5,6], NURBS [7][8][9], and other methods, the representation of free-form curves and surfaces has gone through different stages of development driven by industrial software technology. The Ball method is also one of the currently common representations. It was proposed in 1974 as the mathematical basis for the former British Airways CONSURF fuselage surface modeling system [10][11][12]. Subsequent research by several scholars has led to the emergence of a variety of generalized forms such as the Said-Ball curve [13][14][15], the Wang-Ball curve [16], the generalized Ball curves of the Wang-Said type, and the generalized Ball curves of the Said-Bézier type [17]. The Wang-Ball curve not only has good properties such as stability, symmetry, endpoint interpolation, and geometric invariance but also significantly outperforms the Said-Ball and Bézier curves in terms of degree elevation, degree reduction, and recursive evaluation [15].
With the rapid development of the geometric modeling industry, the requirements for accuracy standards, surface quality, and overall smoothness of free-form curves and surfaces are becoming increasingly stringent. The inaccuracies and limited precision caused by the floating-point environment in the modeling of curves and surfaces are major causes of the lack of robustness in solid modeling. Interval analysis [18], a tool to enhance the stability of algorithms by dealing with errors, has been gradually introduced into the fields of geometric modeling, computer graphics, and Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM) since the 1980s [19][20][21]. Subsequently, the concepts of interval Bézier curves, interval Ball curves, and interval B-spline curves were proposed. Unlike conventional curves, interval curves are constructed with rectangles instead of control vertices represented by real numbers. However, the interval method also has the disadvantage of an expanding error domain under rotational transformation. To this end, Lin et al. [22] put forward disk Bézier curves in combination with the disk arithmetic. A disk curve uses disks to represent the control vertices. Compared with interval curves, disk curves have the following advantages: (1) the shape of disk curves remains unchanged under affine transformation; (2) in terms of data storage volume, the interval method requires eight data records in contrast to only two for the disk, which reduces the amount of data. Since then, research related to disk curves has developed rapidly. Chen [23] studied the degree reduction problem of disk Bézier curves using both linear programming and optimization methods. In 2018, Ao et al. [24] proposed a high-precision intersection algorithm for disk B-spline curves and made this curve flexible for stroke representation. Seah et al. [25] applied disk B-spline curves to artistic brush strokes and 2D animation, and Hu et al. [26] constructed the disk Wang-Ball (DWB) curve and investigated its degree reduction problem.
The rapid development of the graphics industry and the manufacturing sector is accompanied by the constant updating of geometric modeling systems, which has led to an increase in the exchange, integration, and sharing of geometric description data between different systems. Approximate merging [27] is an approximate transformation technique proposed by Hoschek in 1987, which involves approximating a curve consisting of multiple lower-degree curve segments with a single higher-degree curve. Approximate merging reduces the amount of data transferred during product design and development, thus enabling efficient data transfer and exchange between different systems. In 2001, Hu et al. [28] presented a methodology to merge two adjacent Bézier curves using control vertex perturbations and least squares and showed that the merging error could be reduced if the original curves were first raised to higher degrees before approximate merging. This method is simple, intuitive, and practical, so Tai et al. [29] used a similar method to solve the approximate merging of B-spline curves and suggested a knot adjustment technique for tuning the end knots of the kth-order curve without varying the shape of the curve. Subsequently, Cheng et al. [30] gave a uniform matrix representation for approximate merging by minimizing the curve distance. Zhu and Wang [31] used the L2 norm to measure the approximation error of the curves before and after merging and solved the optimal merging problem for Bézier curves with G2 continuity. Lu et al. [32] also minimized the L2 distance and obtained the merged curve for two Bézier curves with G3 continuity in an explicit manner.
It is clear from the above research that most approximate merging problems are confined to conventional curves and do not involve interval and disk curves. Therefore, the questions of how to extend the approximate merging method to interval as well as disk curves, how to estimate the merging error of interval and disk curves, and whether it is possible to directly extend the approximate merging method of traditional curves to interval curves all need to be addressed. The main objective of this paper is to investigate an approximate method to merge disk Wang-Ball curves with improved robustness and accuracy. The merging error is one of the criteria for judging the effectiveness of curve merging, so we take the objective of minimizing the merging error to establish an approximate merging model for disk curves. For the problem of how to obtain the optimal control disks of the merged curves, we invoke meta-heuristic optimization algorithms.
Meta-heuristic algorithms have the advantages of fast convergence, high search power, and high solution accuracy, and they play an increasingly important role in optimization problems. Particle swarm optimization (PSO) [33] has appeared in various improved versions since its introduction in 1995, such as multi-objective particle swarm optimization [34] and adaptive particle swarm optimization algorithms [35], and it has been successfully applied in many fields such as image classification [36], path planning [37], and biomedicine [38]. The marine predators algorithm (MPA) [39] is inspired by ocean predator and prey movements, and its combination with problems such as shape optimization [40], image segmentation [41], wind power prediction [42], and the 0-1 knapsack problem [43] has achieved high-quality optimization results. There are also a variety of algorithms and enhanced variants developed based on different inspirations, such as the chimp optimization algorithm [44], the nutcracker optimizer [45], enhanced black widow optimization (QIWBWO) [46], the snow ablation optimizer (SAO) [47], the multi-strategy enhanced chameleon swarm algorithm (MCSA) [48], and the multi-objective artificial hummingbird algorithm (MOAHA) [49].
The snake optimizer (SO) [50] is inspired by the unique lifestyle of snakes and has been successfully applied to Hammerstein adaptive filters [51] and power-aware task schedulers for wearable biomedical systems [52]. As SO is prone to falling into local optima and exhibits inadequate optimization capability when faced with different optimization problems, enhanced versions of it have been proposed successively to achieve better results. A multi-strategy fused snake optimizer (MFISO) was developed by Fu et al. for deep-learning prediction models of gas prominence in underground mines [53]. Rawa investigated a hybrid form of SO and the sine cosine algorithm (SCA), called SO-SCA, using both parallel and tandem mechanisms and used it to solve transmission expansion planning models [54]. Khurma et al. not only generated a binary version called BSO based on the S-shaped transformation function but also integrated a new evolutionary greedy crossover operator with SO to propose a BSO-CV algorithm for medical classification problems [55]. The multi-strategy enhanced snake optimizer (BEESO) [56] is an improved algorithm proposed by Hu et al. that introduces bi-directional search, modified evolutionary population dynamics, and an elite opposition-based learning strategy into SO. The experimental results in the literature demonstrate that it possesses highly competitive search capability and convergence speed compared to a variety of sophisticated algorithms. Therefore, BEESO will be used to solve the approximate merging problem for DWB curves, and the main contributions are summarized as follows: (1) We discuss the approximate merging problem of DWB curves and establish an approximate merging optimization model with the merging error as the objective; (2) We propose an approximate merging method for DWB curves based on BEESO and demonstrate the optimization capability of BEESO with numerical examples.
The remainder of this text is structured as follows: the definition of DWB curves is presented in Section 2, along with a discussion of the approximate merging of adjacent DWB curves and a specific optimization model; Section 3 introduces BEESO, proposes an approximate merging method for DWB curves based on BEESO, and gives three numerical examples; Section 4 concludes the paper.

Definition of the DWB Curves
Definition 1. In R^2, given n + 1 control disks (D_i) with centers p_i and radii r_i, the DWB curve of degree n is defined as

(D)(t) = Σ_{i=0}^{n} W_{i,n}(t) (D_i), t ∈ [0, 1], (1)

where p_i = (x_i, y_i) and r_i represent the center coordinates and radius of the control disk, and {W_{i,n}(t)}_{i=0}^{n} denotes the Wang-Ball basis functions defined in Equation (2). According to Equation (1), the DWB curve can be written in the form (D)(t) = (C(t), R(t)), where the center curve and radius function are C(t) = Σ_{i=0}^{n} W_{i,n}(t) p_i and R(t) = Σ_{i=0}^{n} W_{i,n}(t) r_i, respectively.
where λ is the subdivision parameter, and W_{i,n_1}(t/λ) and W_{j,n_2}((t − λ)/(1 − λ)) are the Wang-Ball basis functions of degree n_1 and n_2 defined by Equation (2). The distance measure d(·, ·) between two disk curves can be chosen as required.
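To make Definition 1 concrete, the sketch below evaluates a DWB curve numerically. It assumes the standard Wang-Ball basis form, W_{i,n}(t) = (2t)^i (1 − t)^{i+2} on the lower half of the indices with a modified middle term and the symmetry W_{i,n}(t) = W_{n−i,n}(1 − t); the function names are our own, not the paper's.

```python
def wang_ball_basis(i, n, t):
    """Wang-Ball basis function W_{i,n}(t) (standard form)."""
    if 2 * i > n:                          # upper half via symmetry
        return wang_ball_basis(n - i, n, 1 - t)
    if 2 * i == n:                         # middle term, n even
        return (2 * t) ** i * (1 - t) ** i
    if n % 2 == 1 and i == (n - 1) // 2:   # middle pair, n odd
        return (2 * t) ** i * (1 - t) ** (i + 1)
    return (2 * t) ** i * (1 - t) ** (i + 2)

def dwb_point(centers, radii, t):
    """Evaluate a DWB curve: return the center C(t) and radius R(t)."""
    n = len(centers) - 1
    w = [wang_ball_basis(i, n, t) for i in range(n + 1)]
    cx = sum(wi * p[0] for wi, p in zip(w, centers))
    cy = sum(wi * p[1] for wi, p in zip(w, centers))
    r = sum(wi * ri for wi, ri in zip(w, radii))
    return (cx, cy), r
```

The basis functions form a partition of unity and interpolate the first and last control disks at t = 0 and t = 1, matching the endpoint interpolation property mentioned in Section 1.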

1. Approximate merging without endpoint preservation

According to Equation (3), the two adjacent DWB curves (W)_1(t) and (W)_2(t) are denoted as (C_1(t), R_1(t)) and (C_2(t), R_2(t)), and the DWB curve (D)(t) in Equation (4) is expressed as (C(t), R(t)); the nth-degree DWB curve (D)(t) to be found is referred to as the third curve. Take the subdivision parameter λ to be any constant in the open interval (0, 1), and divide (D)(t) into two DWB curves of degree n_1 on the left and n_2 on the right. For the three curves above, we measure their distance in terms of two components: the distance between the center curves, and the corresponding distance between the radius functions. Based on the above description of curve merging, the merged curve should be close to the curves to be merged in both components. Since the merging error is a common criterion for judging the optimized effect, this paper uses the merging error between the merged curve (D)(t) and the curves to be merged, (W)_1(t) and (W)_2(t), as the objective function to build the optimization model.
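The objective above can be prototyped with a sampled integral. The trapezoid-sampled L2-type error below is an illustrative stand-in for the paper's distance measure d(·, ·), not its exact formula; each curve is passed as a callable t ↦ ((x, y), r) returning its center point and radius, and the function names are ours.

```python
def merging_error(merged, left, right, lam, samples=200):
    """Sampled L2-type merging error between the merged DWB curve,
    subdivided at lam, and the two original curves (illustrative)."""
    err = 0.0
    h = 1.0 / samples
    for j in range(samples + 1):
        u = j * h
        w = h if 0 < j < samples else h / 2      # trapezoid weights
        # left piece of the merged curve vs. the first original curve
        (cx, cy), r = merged(lam * u)
        (lx, ly), lr = left(u)
        err += w * ((cx - lx) ** 2 + (cy - ly) ** 2 + (r - lr) ** 2)
        # right piece of the merged curve vs. the second original curve
        (cx, cy), r = merged(lam + (1 - lam) * u)
        (rx, ry), rr = right(u)
        err += w * ((cx - rx) ** 2 + (cy - ry) ** 2 + (r - rr) ** 2)
    return err
```

When the subdivided merged curve coincides with the two original curves, the error vanishes, which is the behavior the optimization model drives toward.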

2. Approximate merging with endpoint preservation
The merged curve obtained above does not guarantee that (D)(t) interpolates the left endpoint of the curve (W)_1(t) and the right endpoint of the curve (W)_2(t), so this part builds an approximate merging optimization model with endpoint-preserving interpolation between the merged curve and the curves to be merged. Endpoint-preserving interpolation means that the third curve (D)(t) to be found must not only approximately merge the two given DWB curves but also interpolate the left endpoint of (W)_1(t) as well as the right endpoint of (W)_2(t).
In accordance with the above definition of endpoint-preserving interpolation approximate merging and the endpoint properties of DWB curves, the endpoint-preserving interpolation approximate merging optimization model is expressed as follows, subject to the constraints below:
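Because a DWB curve interpolates its first and last control disks, the endpoint constraints can be enforced in an optimizer simply by excluding the two outer disks from the decision vector. The hypothetical decoding helper below illustrates this; the flat-vector layout is our own assumption, not the paper's.

```python
def decode_candidate(x, first_disk, last_disk):
    """Decode a flat optimizer vector x = [x1, y1, r1, x2, y2, r2, ...]
    into the control disks of the merged curve, holding the outer
    endpoint disks fixed (the endpoint-preserving constraints)."""
    inner = [((x[k], x[k + 1]), x[k + 2]) for k in range(0, len(x), 3)]
    return [first_disk] + inner + [last_disk]
```

With this decoding, any candidate the optimizer produces automatically satisfies the interpolation constraints, and it also reduces the problem dimension, which matches the paper's later observation that the endpoint-preserving case has lower dimensionality.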

BEESO
BEESO [56] is an improved algorithm proposed to address the shortcomings of SO; it integrates three strategies into SO to improve its optimization capability. The update phase of BEESO includes exploration, exploitation, and mutation operations. The initial population is calculated using Equation (13), where r is a random number in [0, 1], LB stands for the lower bound, and UB stands for the upper bound.
The population is composed of 50% females and 50% males and is randomly split into two sub-groups before the start of the iteration, where N_m and N_f represent the male and female subgroups.
The food quantity P and temperature Temp that control the algorithmic process are calculated with the following formulas, where k and K, respectively, represent the current and maximum number of iterations, and s_1 = 0.5.
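In the original SO formulation, which we assume BEESO inherits here, these control quantities are Temp = exp(−k/K) and P = s_1 · exp((k − K)/K); a minimal sketch:

```python
import math

def temperature(k, K):
    """Temp = exp(-k/K): decays from 1 toward exp(-1) over the run."""
    return math.exp(-k / K)

def food_quantity(k, K, s1=0.5):
    """P = s1 * exp((k - K)/K): grows toward s1 as iterations proceed."""
    return s1 * math.exp((k - K) / K)
```

Early in the run P is small (triggering exploration) and Temp is high; later, P rises above the threshold and the algorithm shifts to exploitation.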

Exploration Phase
If the food quantity P < Threshold, the food is of low quality and the population needs to search for better food in the feasible range, which means BEESO enters the exploration phase. The exploration phase consists of a random search and a bi-directional search; BEESO uses these two search methods to update the positions of individuals and selects the better ones for the next stage by comparing their fitness values. For the random search, the random update of each individual is formulated as follows, where y_{r,m} and y_{r,f} represent male and female individuals, respectively, AM and AF are the snakes' abilities to find food, r is a random number, and s_2 = 0.5.
where Fit(·) represents the fitness value. The bi-directional search makes use of the best and worst individuals to guide BEESO toward the optimal value while enlarging the search area, which effectively mitigates the disadvantages of the random search, such as high uncertainty and a narrow search range. The mathematical approach is formulated as follows, where y_best and y_worst are the best and worst individuals, respectively, and r_1 and r_2 are uniformly generated random numbers.
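The bi-directional idea can be sketched as below. The exact update rule is given in [56]; the simple form here, a pull toward the best individual combined with a push away from the worst, is our own illustrative stand-in, paired with the greedy fitness-based selection described above.

```python
import random

def bidirectional_step(y, y_best, y_worst):
    """Illustrative bi-directional move: attracted to the best individual
    and repelled from the worst (stand-in for the rule in the BEESO paper)."""
    r1, r2 = random.random(), random.random()
    return [yi + r1 * (b - yi) + r2 * (yi - w)
            for yi, b, w in zip(y, y_best, y_worst)]

def greedy_select(cand_a, cand_b, fitness):
    """Keep whichever candidate has the better (lower) fitness value."""
    return cand_a if fitness(cand_a) <= fitness(cand_b) else cand_b
```

In BEESO, each individual would be updated by both the random search and this bi-directional move, with `greedy_select` deciding which result survives into the next phase.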

Exploitation Phase
A higher food quality means BEESO enters the exploitation phase. When the temperature Temp > 0.6, the environment is hot and the snake moves toward the food, as described by Equation (23). Mating behavior occurs when the temperature is right, in either a fighting mode or a mating mode. In fighting mode, each female engages in mating with the best male, and the males fight to obtain the best females, as in Equations (24)-(27), where FM and FF are the combat abilities. In mating mode, mating occurs between each pair of individuals, as modeled by Equations (28)-(31), where MM and MF stand for the mating abilities. There is a potential for egg production after mating. If the snake eggs hatch, modified evolutionary population dynamics (MEPD) is applied to the current parents to improve population quality by eliminating the poorer individuals and mutating the better ones. New offspring are first generated for the bottom 50% of individuals using Equations (32) and (33), where i = 1, 2, ..., N/2. The mutation operation is then applied to the top 50% of individuals, where p_1, p_2, p_3 are distinct random integers in [1, N] with p_1 ≠ p_2 ≠ p_3 ≠ i, F is the scaling factor, and freq is the frequency of vibration of the sine function.
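The MEPD step can be sketched as follows. The differential-style mutation y' = y_{p1} + F · (y_{p2} − y_{p3}) on the better half follows the description above; the sine-modulated scaling factor F and the way the worse half is regenerated are illustrative assumptions on our part, not the exact Equations (32)-(36).

```python
import math
import random

def mepd_mutate(pop, fitness, freq=0.5, k=1, K=100):
    """Sketch of modified evolutionary population dynamics (MEPD):
    the worse half is regenerated around better individuals, and a
    DE-style mutation is applied to the better half."""
    N = len(pop)
    order = sorted(range(N), key=lambda i: fitness(pop[i]))  # best first
    F = abs(math.sin(2 * math.pi * freq * k / K))            # hypothetical schedule
    new_pop = [list(p) for p in pop]
    for rank, i in enumerate(order):
        p1, p2, p3 = random.sample([j for j in range(N) if j != i], 3)
        if rank < N // 2:   # better half: y' = y_p1 + F * (y_p2 - y_p3)
            new_pop[i] = [pop[p1][d] + F * (pop[p2][d] - pop[p3][d])
                          for d in range(len(pop[i]))]
        else:               # worse half: regenerate near the best individual
            a, b = pop[order[0]], pop[p1]
            new_pop[i] = [(ai + bi) / 2 + random.uniform(-0.1, 0.1)
                          for ai, bi in zip(a, b)]
    return new_pop
```

The intent matches the text: poorer individuals are replaced rather than kept, while better individuals are perturbed to keep exploring near promising regions.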

Elite Opposition-Based Learning Strategy
The tendency to fall into local optima is a common problem of optimization algorithms. The elite opposition-based learning strategy optimizes the better-performing individuals in the population so that the algorithm approaches the global optimum with a higher probability.
The elite individuals are those ranking in the top EN of the population. The elite opposite solution of the current individual is calculated as follows, where EN = 0.1 × N, and EA(k) and EB(k) are the minimum and maximum values of the elite individuals.
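A sketch of this strategy is shown below. It reflects an individual through the bounding box [EA, EB] of the elite group, using the common elite opposition form v = r · (EA + EB) − y; the boundary-reset rule is one common choice and is our assumption rather than the paper's exact formula.

```python
import random

def elite_opposition(y, elites, lb, ub):
    """Elite opposition-based learning: reflect an individual through the
    per-dimension bounding box of the elite group, resetting to the
    search bounds if the opposite point escapes them."""
    r = random.random()
    out = []
    for d in range(len(y)):
        ea = min(e[d] for e in elites)   # EA(k): elite minimum in dimension d
        eb = max(e[d] for e in elites)   # EB(k): elite maximum in dimension d
        v = r * (ea + eb) - y[d]         # elite opposite solution
        if not (lb[d] <= v <= ub[d]):    # boundary handling (one common choice)
            v = lb[d] + random.random() * (ub[d] - lb[d])
        out.append(v)
    return out
```

Evaluating both the original individual and its elite opposite, then keeping the fitter one, gives the algorithm a second chance to escape a local optimum at little extra cost.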

Steps for Solving the Approximate Merging Models by BEESO
In Section 2, two optimization models, endpoint-preserving merging and non-endpoint-preserving merging, are established with the merging error of the curves before and after merging as the objective function. In this section, BEESO is used to solve the approximate merging models for DWB curves, and the implementation steps are as follows:

Step one: Set the parameters of the algorithm.
Step two: Enter the coordinates and radii of the DWB curves to be merged, and calculate the subdivision parameter.
Step three: Initialization. Calculate the initial population according to Equation (13), and divide it into female and male populations. Use the approximate merging model of Equation (9) (or Equation (10)) as the objective function.
Step four: Judge the food quality. If P < 0.25, execute the random search of Equations (17)-(20) and the bi-directional search of Equations (21) and (22) of the exploration phase to generate new individuals; calculate the fitness values of the individuals, and select the better individuals to proceed to the next phase. Otherwise, proceed to the exploitation phase.
Step five: The exploitation phase starts by calculating the temperature Temp. If Temp > 0.6, the individual moves closer to the food as in Equation (23). Otherwise, it enters the fight mode or the mating mode: when r > 0.6, the individual is in fight mode and performs Equations (24)-(27); otherwise, it operates in mating mode.
Step six: The mating mode performs Equations (28)-(31) on individuals, followed by consideration of whether the eggs hatch after mating. If the eggs hatch, MEPD (Equations (32)-(36)) is used to produce the individuals that move on to the next stage.
Step seven: Find the elite individuals, and update the population again by Equations (37)-(39).
Step eight: Calculate the fitness values, and determine the optimal individual.
Step nine: Check the termination condition of the algorithm. Let k = k + 1; if k < K, return to step four; otherwise, output the minimum merging error and the coordinates of the control disks of the merged curve.
A detailed flowchart of BEESO solving the approximate merging model is shown in Figure 1.
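The nine steps above can be condensed into a runnable skeleton. The sketch below is a deliberately simplified stand-in: the phase updates are reduced to generic moves rather than the exact Equations (17)-(39), and it is intended only to show the control flow of steps one through nine on a toy objective.

```python
import math
import random

def beeso_sketch(objective, lb, ub, N=30, K=200, seed=0):
    """Greatly simplified sketch of the BEESO solving loop (steps 1-9);
    see [56] for the full update rules."""
    rng = random.Random(seed)
    dim = len(lb)
    # step three: initialize the population within [LB, UB]
    pop = [[lb[d] + rng.random() * (ub[d] - lb[d]) for d in range(dim)]
           for _ in range(N)]
    fit = [objective(y) for y in pop]
    best = min(range(N), key=lambda i: fit[i])
    for k in range(1, K + 1):
        P = 0.5 * math.exp((k - K) / K)             # food quantity
        temp = math.exp(-k / K)                     # temperature
        for i in range(N):
            y = pop[i]
            if P < 0.25:                            # step four: exploration
                cand = [lb[d] + rng.random() * (ub[d] - lb[d]) for d in range(dim)]
            elif temp > 0.6:                        # step five: move toward food
                cand = [yd + rng.uniform(-1, 1) * temp * (pop[best][d] - yd)
                        for d, yd in enumerate(y)]
            else:                                   # fight/mating stand-in
                j = rng.randrange(N)
                cand = [yd + rng.uniform(-1, 1) * (pop[j][d] - yd)
                        for d, yd in enumerate(y)]
            cand = [min(max(c, lb[d]), ub[d]) for d, c in enumerate(cand)]
            f = objective(cand)
            if f < fit[i]:                          # greedy selection
                pop[i], fit[i] = cand, f
        best = min(range(N), key=lambda i: fit[i])  # step eight
    return pop[best], fit[best]                     # step nine: output
```

For the merging problem, `objective` would be the merging error of Equation (9) or (10) evaluated on a decoded candidate vector of control disks.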


Optimization Examples
To verify the feasibility of the algorithm, three specific numerical examples are given in this section. Each numerical example is optimized in terms of both endpoint-preserving interpolation and non-endpoint-preserving interpolation. The population size and number of iterations in all experiments are 50 and 500, respectively. In addition to BEESO, several advanced algorithms are selected for comparison, including CSA [48], SCA [57], the grey wolf optimizer (GWO) [58], the white shark optimizer (WSO) [59], the Coot optimization algorithm (COOT) [60], the sooty tern optimization algorithm (STOA) [61], and Harris hawks optimization (HHO) [62].
Example 1. Two adjacent DWB curves with their control vertex coordinates and control radii are given in Table 1, which constructs the shape of the DWB curves shown in Figure 2. In this example, these two curves of degrees 3 and 4 are combined into a single 6-degree DWB curve according to the model built in Section 2.2.2. Tables 2 and 3 show the optimal results obtained by the eight intelligent algorithms, including BEESO and SCA, for the two cases of no endpoint preservation and endpoint preservation, respectively, which include the control disk coordinates of the merged curves and the merging errors. The comparative results of the curves before and after merging are shown in Figures 3 and 4, where the merged DWB curve is shown in green. The change in the shape of the curves before and after merging can be observed in the figures; for example, the shape of the left segment of the curve obtained by CSA shows a large error. In addition, the convergence of the optimization process is shown in Figures 3i and 4i.

The overall results show that the optimized "C" curve fits the original curve better. For the non-endpoint-preserving case, SCA and STOA obtain curves that differ significantly in width from the original curve, and their errors are accordingly two of the largest among all the algorithms. In the endpoint-preserving case, STOA also shows a larger difference in results and a slower convergence rate during the optimization process. Although the remaining algorithms have similar results, BEESO performs better when considering the fit and width of the two curves before and after merging. BEESO achieves the best results among the eight algorithms, in terms of both error results and convergence speed.
Example 2. The purpose of this example is to approximately merge two adjacent 4-degree DWB curves into one 5-degree curve. The control disks of the two curves before merging are shown in Table 4, and the shape of the curves obtained by visualizing them is shown in Figure 5.

Tables 5 and 6, respectively, give the optimization results of the eight algorithms for the two cases of no endpoint preservation and endpoint preservation, which include the optimal control vertices and the merging errors. Figures 6 and 7 show the optimization results for the two adjacent 4-degree DWB curves merged into one 5-degree DWB curve under the different constraints.

In the case where endpoints are not preserved, there is a large difference in the optimization results among the eight algorithms. For example, the visual difference between the merged curves obtained by HHO, GWO, and CSA and the original curve is significant, which is also evident from the error of each algorithm in Table 5. For the endpoint-preserving case, BEESO, CSA, SO, WSO, and COOT obtain similar effect plots.

For the case of non-endpoint preservation, the approximate merging results of the three algorithms BEESO, COOT, and SO in Figure 9 are relatively good, with BEESO having the smallest merging error of all the algorithms. The other algorithms all have much room for improvement in their overall results, especially GWO, HHO, and STOA, which show large errors in the position and width of the curve relative to the curves before merging. Due to the low dimensionality of the optimization variables in the endpoint-preserving case, the results of the seven algorithms apart from STOA are close and show only a small gap in the reported merging errors. For BEESO, SO, and WSO, there is no significant gap when the errors are kept to three decimal places; retaining more significant digits gives errors of 0.14669880, 0.14669887, and 0.14669885 for BEESO, SO, and WSO, respectively, so the error of BEESO is the smallest of the three algorithms.

Conclusions
Based on the basic theory of the disk Wang-Ball curve, an approximate merging method based on a meta-heuristic algorithm is proposed for the problem that this curve is difficult to merge. This paper first discusses the approximate merging of DWB curves and establishes two optimization models, from the perspectives of non-endpoint-preserving merging and endpoint-preserving merging, with the merging error as the objective function. In addition, BEESO is introduced to solve the constructed optimization models, and the steps of the algorithm for solving the approximate merging problem of DWB curves are specified. The method can directly obtain the control disks of the merged curve while calculating the merging error, and its calculation is simple and practical. Finally, some advanced algorithms are selected for comparison in Section 3.3, and the effectiveness of the method in curve design is demonstrated by the fact that BEESO achieves DWB curves with good merging results in all three numerical examples.
The approximate merging model developed in this paper achieves a good merging effect and a small error, but the integral form leads to a more complex model, and the algorithm runs slowly during the solution process. How to simplify the mathematical model, or build it in a simpler way, so that it can be applied to the approximate merging of disk curves is a problem to be solved in the future. In addition, BEESO can be used to solve optimization problems in areas such as cryptosystem design [63], path planning, geometry optimization [40,64], engineering design [65,66], and feature selection [67].

Figure 1 .
Figure 1. Flowchart of BEESO solving the approximate merging model.


Figure 3 .
Figure 3. Graphs and convergence curves for merging into 6-degree curves without endpoint preservation in Example 1.

Figure 4 .
Figure 4. Graphs and convergence curves for merging into 6-degree curves in the endpoint-preserving case.

Figure 6 .
Figure 6. Graphs and convergence curves for merging into 5-degree curves without endpoint preservation in Example 2.

Figure 7 .
Figure 7. Graphs and convergence curves for merging into 5-degree curves in the endpoint-preserving case.


Table 1 .
Coordinates of the two adjacent DWB curves in Example 1.



Table 2 .
Optimization results of merging into 6-degree DWB curves without endpoint preservation.

Table 3 .
Optimization results of merging into 6-degree DWB curves in the endpoint-preserving case.

Table 4 .
Coordinates of the two adjacent DWB curves in Example 2.


Table 5 .
Optimization results of merging into 5-degree DWB curves without endpoint preservation.

Table 6 .
Optimization results of merging into 5-degree DWB curves in the endpoint-preserving case.

Table 8 .
Optimization results of merging into 4-degree DWB curves without endpoint preservation.

Table 9 .
Optimization results of merging into 4-degree DWB curves in the endpoint-preserving case.