Article

Improved Chimpanzee Search Algorithm with Multi-Strategy Fusion and Its Application

School of Mechanical Engineering and Automation, Dalian Polytechnic University, Dalian 116034, China
* Author to whom correspondence should be addressed.
Machines 2023, 11(2), 250; https://doi.org/10.3390/machines11020250
Submission received: 14 December 2022 / Revised: 30 January 2023 / Accepted: 6 February 2023 / Published: 8 February 2023
(This article belongs to the Topic Intelligent Systems and Robotics)

Abstract
An improved chimpanzee optimization algorithm incorporating multiple strategies (IMSChoA) is proposed to address the problems of the chimpanzee search algorithm: boundary aggregation in the initialized population, slow convergence, low precision, and proneness to falling into local optima. Firstly, an improved sine chaotic mapping is used to initialize the population and resolve the boundary aggregation problem. Secondly, a linear weighting factor and an adaptive acceleration factor are added to incorporate the particle swarm idea and, together with an improved nonlinear convergence factor, balance the global search ability of the algorithm, accelerate convergence, and improve convergence accuracy. Finally, a sparrow elite mutation improved by an adaptively changing water wave factor and a Bernoulli chaotic mapping strategy are added to improve the ability of individuals to jump out of local optima. Comparative optimization experiments on benchmark functions and Wilcoxon rank sum statistical tests of the results show that the IMSChoA optimization algorithm has stronger robustness and applicability. Further, the IMSChoA optimization algorithm is applied to two engineering examples to verify its superiority in mechanical structure optimization design problems.

1. Introduction

Meta-heuristic algorithms are widely used in path planning [1], image detection [2], system control [3], and shop floor scheduling [4] due to their excellent flexibility, practicality, and robustness. Common meta-heuristic algorithms include the genetic algorithm (GA) [5,6], the particle swarm optimization algorithm (PSO) [7,8], the gray wolf optimization algorithm (GWO) [9,10], the chicken swarm optimization algorithm (CSO) [11], the sparrow search algorithm (SSA) [12], the whale optimization algorithm (WOA) [13], etc.
Different intelligent optimization algorithms use different search approaches, but most aim to balance population diversity and search ability, avoiding premature convergence while ensuring convergence accuracy and speed [14]. Following this idea, numerous scholars have proposed improvements to the algorithms they study. For example, Zhi-jun Teng et al. [15] introduced the idea of PSO into the gray wolf optimization algorithm, preserving the individual optimum while improving the ability of the algorithm to jump out of local optima; Hussien A. G. et al. [16] proposed two transfer functions (S-shaped and V-shaped) to map the continuous search space to a binary space, improving the search accuracy and speed of the whale optimization algorithm; Wang et al. [17] introduced a fuzzy system into the chicken swarm optimization algorithm that adaptively adjusts the number of individuals and the random factors to balance the local exploitation performance and global search ability of the algorithm; Tian et al. [18] used logistic chaotic mapping to improve the initial population quality of the particle swarm algorithm while applying an auxiliary speed mechanism to the global optimal particle, effectively improving the convergence of the algorithm; Li et al. [19] integrated two strategies, Levy flight and dimension-by-dimension evaluation, into the moth-flame optimization algorithm to improve its global search capability and effectiveness.
The chimpanzee optimization algorithm (ChoA) is a heuristic optimization algorithm based on the social behavior of chimpanzee populations, proposed by Khishe et al. [20] in 2020. Compared with traditional algorithms, ChoA has the advantages of fewer parameters, ease of understanding, and high stability. However, it also suffers from boundary aggregation in the initialized population, slow convergence, low accuracy, and proneness to falling into local optima. To address these problems, many researchers have proposed different improvement methods. Du et al. [21] introduced a somersault foraging strategy into ChoA to keep the population from easily falling into local optima and to improve the diversity of the early population; however, this relatively narrow improvement makes its effect less pronounced. Kumari et al. [22] combined the SHO algorithm with ChoA, which improved the convergence accuracy of ChoA and enhanced its local exploitation ability for high-dimensional problems. Houssein et al. [23] applied opposition-based learning (OBL) in the initialization phase of ChoA to extend the population diversity in the search space.
In summary, there are numerous improvements to the chimpanzee optimization algorithm, and the improved algorithms suit the optimization of some individual problems while revealing shortcomings on others. Therefore, to improve the optimization performance of ChoA, an improved chimpanzee optimization algorithm incorporating multiple strategies (IMSChoA) is proposed in this paper. Firstly, an improved sine chaotic mapping is used to initialize the population and eliminate boundary aggregation in the population distribution. Secondly, a linear weight factor and an adaptive acceleration factor are introduced to incorporate the particle swarm idea and, together with the improved nonlinear convergence factor, balance the search ability of the algorithm, accelerate its convergence, and improve its convergence accuracy. Finally, the sparrow elite mutation improved by an adaptively changing water wave factor and the Bernoulli chaos mapping strategy are introduced to improve the ability of individuals to jump out of local optima. Optimization tests on 21 standard test functions, together with the Wilcoxon rank sum statistical test of the results, verify the robustness and applicability of the improved algorithm. Finally, the IMSChoA optimization algorithm is applied to two engineering examples to further verify its superiority in mechanical structure optimization design problems.
The other sections of the article are organized as follows: in Section 2, the mathematical model of the traditional ChoA algorithm is presented. Section 3 presents the specific improvement strategies incorporated on top of the ChoA algorithm. Section 4 shows the comparison and analysis of the results of IMSChoA with the other four optimization algorithms after 21 standard test function search tests. Section 5 applies the IMSChoA algorithm to two engineering examples and analyzes their optimization results accordingly. Finally, the full text is summarized in Section 6 for discussion.

2. Basic Chimpanzee Algorithm

The ChoA algorithm is an intelligent algorithm that simulates the prey-hunting behavior of chimpanzee groups. According to the abilities shown in the hunting process, individual chimpanzees are classified as drivers, barriers, chasers, and attackers. The group hunt consists of an exploration phase, i.e., driving, blocking, and chasing the prey, and an exploitation phase, attacking the prey. Each type of chimpanzee thinks independently and searches for the location of the prey in its own way, while chimpanzees are also influenced by social motivation, which makes their individual hunting behavior appear chaotic in the final stage. It is assumed that the driver, barrier, chaser, and attacker are able to predict the location of the prey, and the other chimpanzees update their positions according to the chimpanzee closest to the prey. The model for driving and chasing the prey is given in Equations (1) and (2).
$$d(t) = \left| c \cdot X_P(t) - m \cdot X_E(t) \right| \tag{1}$$
$$X_E(t+1) = X_P(t) - a \cdot d(t) \tag{2}$$
where XP is the position of the prey, XE is the position of the chimpanzee, m is the chaotic vector, t is the number of iterations, d(t) is the distance of the chimpanzee from the prey, and a and c are the coefficient vectors. a and c are calculated by Equations (3) and (4), respectively. When |a| < 1, the chimpanzee individual tends to the prey, and when |a| > 1, it means the chimpanzee has deviated from the prey position and expanded the search range.
$$a = 2 f r_1 - f \tag{3}$$
$$c = 2 r_2 \tag{4}$$
where r1 and r2 are random numbers in [0, 1], a is a random variable in [−2f, 2f], and f is a linear convergence factor calculated by Equation (5).
$$f = 2 - \frac{2t}{t_{\max}} \tag{5}$$
During the iterations, f decays linearly from 2 to 0, and tmax is the maximum number of iterations.
The position of chimpanzees in the population is co-determined by the position of the driver, barrier, chaser, and attacker. The mathematical model of chimpanzee attack on prey is shown in Equations (6)–(8).
$$\begin{aligned} D_A &= |c \cdot X_A - m \cdot X| \\ D_B &= |c \cdot X_B - m \cdot X| \\ D_C &= |c \cdot X_C - m \cdot X| \\ D_D &= |c \cdot X_D - m \cdot X| \end{aligned} \tag{6}$$
$$\begin{aligned} X_1 &= X_A - a \cdot D_A \\ X_2 &= X_B - a \cdot D_B \\ X_3 &= X_C - a \cdot D_C \\ X_4 &= X_D - a \cdot D_D \end{aligned} \tag{7}$$
$$X(t+1) = \frac{1}{4}\left( X_1 + X_2 + X_3 + X_4 \right) \tag{8}$$
From Equations (6)–(8), the position of the prey is estimated from the position of the driver, barrier, chaser, and attacker. Other chimpanzees update their position in the direction of the prey.
In the final stage of a population's predation, once individuals have obtained enough food, chimpanzees release their natural instincts and scramble chaotically for food. This chaotic behavior in the final stage helps to further alleviate two problems, local optimum traps and slow convergence, when the problem is high-dimensional. To simulate it, each chimpanzee is assumed to choose with 50% probability between the normal position update mechanism and the chaotic model, as formulated in Equation (9) [24].
$$X_E(t+1) = \begin{cases} X_P(t) - a \cdot d, & \mu < 0.5 \\ \text{Chaotic}, & \mu \geq 0.5 \end{cases} \tag{9}$$
where μ is a random number in [0, 1], and Chaotic denotes the chaotic mapping used to update the position.
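As a concrete illustration, the following minimal Python sketch assembles Equations (1)–(9) into one position update. The simplified chaotic vector m and the chaotic relocation placeholder are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def choa_update(x, x_a, x_b, x_c, x_d, f, rng):
    """One ChoA position update for individual x (Equations (1)-(9))."""
    def candidate(leader):
        r1 = rng.random(x.size)
        r2 = rng.random(x.size)
        a = 2.0 * f * r1 - f                    # Equation (3)
        c = 2.0 * r2                            # Equation (4)
        m = rng.random(x.size)                  # chaotic vector m, simplified here
        d = np.abs(c * leader - m * x)          # Equation (6)
        return leader - a * d                   # Equation (7)

    x1, x2, x3, x4 = [candidate(l) for l in (x_a, x_b, x_c, x_d)]
    if rng.random() < 0.5:                      # Equation (9): normal update branch
        return (x1 + x2 + x3 + x4) / 4.0        # Equation (8)
    return rng.random(x.size)                   # chaotic relocation (placeholder)
```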

3. Improving the Chimpanzee Algorithm

Firstly, ChoA initializes its population by random distribution. This leads to poor population diversity and uniformity and easy boundary aggregation, and individual searches for the optimum are largely blind. Secondly, the linearly decaying convergence factor that balances local and global search does not match the nonlinear optimum-seeking characteristics of the algorithm. Finally, the chaotic perturbation that lets the algorithm jump out of local optima is triggered with low probability, which makes it very unstable.
In summary, the corresponding improvement strategies are introduced for the problems of the ChoA algorithm, as follows.

3.1. Improved Sine Chaotic Mapping for Initializing Populations

The size of each dimension of an individual chimpanzee is randomly generated in the initialization stage, which leads to poor population diversity, serious boundary aggregation, and low individual variability. Chaotic searches are based on non-repetition and ergodicity, unlike stochastic search methods, which are based on probabilities [25]. Common chaotic mappings include the circle, tent, iterative, logistic, and sine chaotic mappings. Among them, the sine mapping has good stability and high coverage, but it still shows uneven distribution and boundary aggregation.
The expression of the original sine chaos mapping is:
$$X_t = \mu \sin(\pi X_{t-1}) \tag{10}$$
Therefore, to address the above problem, the sine mapping is improved by introducing the Chebyshev mapping. At the same time, a high-dimensional chaotic mapping is built on top of the original one so that it better exhibits the chaotic property.
The expression of the improved sine chaos mapping is:
$$P_t = \mu \sin(\pi X_{t-1}) + \lambda \cos\left( t \cos^{-1}(X_{t-1}) \right) \tag{11}$$
$$\begin{aligned} W_t &= \mu \sin(\pi W_{t-1}) + \lambda \cos\left( t \cos^{-1}(W_{t-1}) \right) \\ O_t &= \mu \sin(\pi O_{t-1}) + \lambda \cos\left( t \cos^{-1}(O_{t-1}) \right) \\ E_t &= \mu \sin(\pi E_{t-1}) + \lambda \cos\left( t \cos^{-1}(E_{t-1}) \right) \\ H_t &= \mu \sin(\pi H_{t-1}) + \lambda \cos\left( t \cos^{-1}(H_{t-1}) \right) \\ X_t &= \operatorname{mod}\left( W_t + O_t + E_t + H_t,\ 1 \right) \end{aligned} \tag{12}$$
where λ and μ are random numbers in [0, 1] satisfying λ + μ = 1.
The dimensional distribution maps and histograms of the initial solutions before and after the improvement are shown in Figure 1, where Figure 1a,c shows the original sine mapping and Figure 1b,d the improved sine mapping. Comparing Figure 1a,b and Figure 1c,d, the value distribution of the improved sine chaotic mapping is more uniform, and the boundary aggregation problem is effectively solved.
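A minimal sketch of the initialization of Equations (11) and (12) follows; the seeding of the four coupled sine–Chebyshev sequences and the scaling into [lb, ub] are assumptions of this sketch.

```python
import numpy as np

def improved_sine_init(n_pop, dim, lb, ub, seed=0):
    """Initialize a population with the improved sine map (Equations (11)-(12))."""
    rng = np.random.default_rng(seed)
    lam = rng.random()
    mu = 1.0 - lam                                # lambda + mu = 1 per the text
    # four independent chaotic state vectors in (0, 1)
    w, o, e, h = (rng.random(dim) for _ in range(4))
    pop = np.empty((n_pop, dim))
    for t in range(1, n_pop + 1):
        cheb = lambda s: np.cos(t * np.arccos(np.clip(s, -1.0, 1.0)))
        w = mu * np.sin(np.pi * w) + lam * cheb(w)
        o = mu * np.sin(np.pi * o) + lam * cheb(o)
        e = mu * np.sin(np.pi * e) + lam * cheb(e)
        h = mu * np.sin(np.pi * h) + lam * cheb(h)
        x = np.mod(w + o + e + h, 1.0)            # Equation (12)
        pop[t - 1] = lb + (ub - lb) * x           # scale into the search space
    return pop
```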

3.2. PSO Idea and Nonlinear Convergence Factor

3.2.1. PSO Idea

In order to regulate the balance of global and local search ability in the early and late stages of the ChoA algorithm, the particle swarm idea is introduced to improve the position updating method of the ChoA algorithm. The best position information experienced by the particle itself and the best position information of the population are used to update the current position of the particle and realize the information exchange between individual chimpanzees and the population [26]. The position update formula is Equation (13).
$$X(t+1) = C \cdot \mathrm{rand} \cdot w \cdot \left( X_1 + X_2 + X_3 + X_4 \right) + \mathrm{rand} \cdot \left( X_1 - X(t) \right) \tag{13}$$
where w is the inertia weight coefficient and C is the acceleration factor. The values of w and C relate to how much the past motion state of a particle influences its present motion: when w and C are large, the particle's search space expands; when they are small, the direction of particle motion changes frequently and the search space is relatively small, which can lead the algorithm into a local optimum. Adjusting w and C regulates the balance between the local and global search abilities of the algorithm and accelerates its convergence [27].
To improve the global search ability of the algorithm in the early stage while strengthening the local optimization ability of the particle swarm in the later stage, this paper introduces an adaptive linear acceleration factor. At the beginning of the iteration, a larger value of C enables a wide-ranging search, which helps the algorithm quickly locate the globally optimal position; as the number of iterations increases, the algorithm gradually converges and individuals search locally for the optimal solution, so a smaller C value enables accurate exploration of the optimal position in small steps, improving the convergence accuracy of the algorithm. At the same time, to prevent the algorithm from falling into local optima during the iteration, a cosine function is introduced to correct the acceleration factor and keep it fluctuating at all times. The adaptive acceleration factor is given in Equation (14), and the variation of C with the number of iterations is shown in Figure 2.
$$C = g \cdot e^{-0.1 t / t_{\max}} \cdot \cos(0.1 t + 24.5) + 1.1 \tag{14}$$
where g is the adjustment trade-off factor, and tmax is the maximum number of iterations.
At the same time, to accelerate convergence and improve convergence accuracy, a linear inertia weight model is introduced. At the beginning of the iteration, a large weight makes the population search the solution space extensively in large steps and quickly approach the globally optimal position. As the number of iterations increases and the algorithm gradually converges, the inertia weight coefficient becomes smaller, facilitating a fine search of the optimal position in small steps and improving the convergence accuracy of the algorithm. The linear inertia weight model is given in Equation (15), and the variation of w with the number of iterations is shown in Figure 3.
$$w = 0.5 \left( w_{\max} - \frac{t \left( w_{\max} - w_{\min} \right)}{t_{\max}} \right) \cos^2(0.005 t + 10) \tag{15}$$
where wmax and wmin are the maximum weight coefficient and minimum weight coefficient, respectively, t is the number of iterations, and tmax is the maximum number of iterations.
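A minimal sketch of the two schedules, using the reconstructed forms of Equations (14) and (15) and the parameter values listed for IMSChoA in Table 1, might look as follows; the exact forms are assumptions tied to that reconstruction.

```python
import numpy as np

def acceleration_factor(t, t_max, g=1000.0):
    # Equation (14): decaying, always-fluctuating acceleration factor C
    return g * np.exp(-0.1 * t / t_max) * np.cos(0.1 * t + 24.5) + 1.1

def inertia_weight(t, t_max, w_max=2.5, w_min=0.05):
    # Equation (15): linearly shrinking weight modulated by cos^2
    return 0.5 * (w_max - t * (w_max - w_min) / t_max) * np.cos(0.005 * t + 10) ** 2
```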

3.2.2. Nonlinear Decay Convergence Factor

One of the important factors in evaluating the performance of a heuristic algorithm is its ability to balance global and local search. From the analysis of the chimpanzee algorithm, when |a| < 1 the individual converges toward the prey, and when |a| > 1 it deviates from the prey position and expands the search range. The change in the convergence factor therefore determines the global and local search ability of the algorithm. Accordingly, this paper introduces a nonlinear decay model that cooperates with the adaptive acceleration factor of the particle swarm idea to jointly balance the global and local search ability of the algorithm. Meanwhile, a control factor δ is introduced to control the decay amplitude. The nonlinear decay convergence factor is described in Equation (16).
$$f = f_g \left( 1 - \frac{e^{t / t_{\max}} - 1}{e - 1} \right)^{1/\delta} \tag{16}$$
where t is the number of iterations, tmax is the maximum number of iterations, and fg is the initial convergence factor. δ ∈ [1, 10], and the larger δ is, the slower the decay rate, as shown in Figure 4.
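The reconstructed Equation (16) can be sketched as below; as reconstructed, it decays from fg to 0, and larger δ slows the decay, in line with the text.

```python
import numpy as np

def convergence_factor(t, t_max, f_g=2.5, delta=5.0):
    # Equation (16) as reconstructed: decays from f_g to 0;
    # larger delta -> exponent 1/delta smaller -> slower decay
    return f_g * (1.0 - (np.exp(t / t_max) - 1.0) / (np.e - 1.0)) ** (1.0 / delta)
```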

3.3. Improved Sparrow Elite Variation and Bernoulli Chaos Mapping

In the ChoA algorithm, the individual update is affected by the last optimal individual in each iteration, so the ChoA algorithm easily converges to a local optimum during the iterative process. To address this, an optimization strategy combining a sparrow elite mutation improved by an adaptive water wave factor with Bernoulli chaotic mapping is proposed.

3.3.1. Improved Sparrow Elite Variation

The sparrow search algorithm is an efficient swarm intelligence optimization algorithm that divides the search population into three parts, explorers, followers, and early warners, which divide the work among themselves to find the optimal value [28]. Sparrow elite mutation is used to assign the capabilities of individuals with higher search performance to the current optimal individual. At each ChoA iteration, the individuals ranking in the top 40% by current fitness value are given a stronger optimization ability, and an adaptive water wave factor is added to the mutant individual update formula [29] to further improve the optimization ability of mutant individuals. The sparrow elite mutation is described in Equation (17).
$$X(t+1)_{0.4} = \begin{cases} X(t)_{0.4} \cdot v \cdot \exp\left( \dfrac{-t}{\alpha \, t_{\max}} \right), & R < ST \\[4pt] X(t)_{0.4} + v \cdot Q \cdot L, & R \geq ST \end{cases} \tag{17}$$
where X(t)0.4 denotes the individuals in the top 40% by current fitness value, R is a random alarm value in [0, 1], Q is a random number obeying the standard normal distribution, L is a 1 × d matrix with all elements equal to 1, ST is the warning value, taken as 0.6, and v is the water wave factor, which varies adaptively with the number of iterations. The adaptive water wave factor is given in Equation (18).
$$v = 1 - \sin\left( \frac{\pi t}{2 t_{\max}} + 2\pi \right) \tag{18}$$
As the iterations increase, the uncertainty of the iterative process and the sharp changes in the water wave factor enhance the ability of individuals to jump out of local optima. The variation of the water wave factor is shown in Figure 5.
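A sketch of Equations (17) and (18) follows; the value of α and the application to the elite block as a whole are assumptions of this sketch.

```python
import numpy as np

def water_wave_factor(t, t_max):
    # Equation (18): adaptive water wave factor
    return 1.0 - np.sin(np.pi * t / (2.0 * t_max) + 2.0 * np.pi)

def elite_mutation(elite, t, t_max, rng, alpha=0.8, st=0.6):
    """Equation (17) applied to the top-40% block `elite` of shape (k, d)."""
    v = water_wave_factor(t, t_max)
    if rng.random() < st:                        # R < ST: contract the search
        return elite * v * np.exp(-t / (alpha * t_max))
    q = rng.standard_normal()                    # Q ~ N(0, 1)
    L = np.ones(elite.shape[1])                  # 1 x d matrix of ones
    return elite + v * q * L
```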

3.3.2. Bernoulli Chaotic Mappings

Bernoulli chaotic mapping is a classical representative of chaotic mapping and is widely used [30]. Its mathematical expression is shown in Equation (19).
$$Z_{t+1} = \begin{cases} \dfrac{Z_t}{1 - \lambda}, & Z_t \in (0,\ 1 - \lambda] \\[4pt] \dfrac{Z_t - 1 + \lambda}{\lambda}, & Z_t \in (1 - \lambda,\ 1] \end{cases} \tag{19}$$
where t is the number of chaotic iterations and λ is the conditioning factor, generally taken as 0.4. The resulting chaotic sequence is then mapped into the search space of the solution as follows.
$$X_{td} = X_L + \left( X_U - X_L \right) \cdot Z_{td} \tag{20}$$
where Xtd is the position of the tth element in d dimensions, XU and XL are the upper and lower bounds of the search space, and Ztd is the chaotic value generated by Equation (19).
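Equations (19) and (20) can be sketched as follows; the seed z0 is an assumption of this sketch.

```python
import numpy as np

def bernoulli_next(z, lam=0.4):
    # Equation (19): piecewise-linear Bernoulli map
    return z / (1.0 - lam) if z <= 1.0 - lam else (z - 1.0 + lam) / lam

def bernoulli_perturb(x_l, x_u, n, z0=0.37, lam=0.4):
    # Equation (20): map the chaotic sequence into the search space
    xs, z = [], z0
    for _ in range(n):
        z = bernoulli_next(z, lam)
        xs.append(x_l + (x_u - x_l) * z)
    return np.array(xs)
```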

3.4. IMSChoA Algorithm Flow

The specific implementation steps of the IMSChoA algorithm are as follows.
  • Step 1: Initialize the population using the improved sine chaotic mapping, including the number of population individuals N, the maximum number of iterations tmax, the dimension d, the search boundary ub, and lb, the maximum and minimum weight factors, and the adjustment trade-off factor g, and set the relevant parameters.
  • Step 2: Update the acceleration factor, inertia weight, convergence factor, and water wave factor.
  • Step 3: Calculate the position of each chimpanzee.
  • Step 4: Update the positions of repellers, blockers, pursuers, and attackers.
  • Step 5: Calculate the fitness values and the average fitness to find the global optimum and the individual optimum.
  • Step 6: Compare the individual fitness value f with the average fitness favg. If f < favg, perform a Bernoulli perturbation and check whether the perturbed individual is better than the original; update if it is, otherwise keep the original individual unchanged. If f > favg, perform the sparrow elite mutation and replace the individual if the mutant is better, otherwise keep it.
  • Step 7: Update the global optimal value of the population and the individual optimal value.
  • Step 8: Determine whether the condition is satisfied, and output the result if satisfied, otherwise return to step 2 for execution.
The flow chart is shown in Figure 6.
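Putting the pieces together, a schematic sketch of the Steps 1–8 loop might look as follows; it reuses the helper functions sketched in the previous subsections, simplifies the blend of Equation (13) and the greedy replacement of Step 6, and is an illustration rather than the authors' reference implementation.

```python
import numpy as np

def imschoa(obj, lb, ub, dim=30, n_pop=30, t_max=500, seed=0):
    rng = np.random.default_rng(seed)
    pop = improved_sine_init(n_pop, dim, lb, ub, seed)            # Step 1
    fit = np.apply_along_axis(obj, 1, pop)
    for t in range(1, t_max + 1):
        C = acceleration_factor(t, t_max)                         # Step 2
        w = inertia_weight(t, t_max)
        f = convergence_factor(t, t_max)
        leaders = pop[np.argsort(fit)[:4]]                        # Steps 3-4
        for i in range(n_pop):
            cand = choa_update(pop[i], *leaders, f, rng)
            # simplified blend of Equation (13)
            pop[i] = np.clip(C * rng.random() * w * cand
                             + rng.random() * (leaders[0] - pop[i]), lb, ub)
        fit = np.apply_along_axis(obj, 1, pop)                    # Step 5
        f_avg = fit.mean()
        for i in range(n_pop):                                    # Step 6
            if fit[i] < f_avg:
                trial = bernoulli_perturb(lb, ub, dim)            # Bernoulli branch
            else:
                trial = elite_mutation(pop[i:i + 1], t, t_max, rng)[0]
            f_trial = obj(trial)
            if f_trial < fit[i]:                                  # greedy replacement
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))                                    # Steps 7-8
    return pop[best], fit[best]
```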

3.5. Time Complexity Analysis

Time complexity is an important index reflecting the performance of the algorithm [31]. Assuming that the chimpanzee population size is N, the search space dimension is n, the initialization time is t1, the update time of individual chimpanzee positions is t2, and the time to solve for the value of the target fitness function is f(n), the time complexity of the ChoA algorithm is:
$$T_1(n) = O\left( t_1 + N \left( n t_2 + f(n) \right) \right) = O\left( n + f(n) \right) \tag{21}$$
In the IMSChoA algorithm, the time required to initialize the parameters is the same as in standard ChoA. The time used to initialize the population with the improved sine mapping is t3. In the loop phase, assume that the time required to introduce the particle swarm idea and the nonlinear convergence factor is t4, the improved sparrow elite variation t5, and the Bernoulli chaos mapping t6. Then, the time complexity of IMSChoA is:
$$T_2(n) = O\left( t_1 + t_3 + N \left( n t_2 + t_4 + t_5 + n t_6 + f(n) \right) \right) = O\left( n + f(n) \right) \tag{22}$$
By Equations (21) and (22), the time complexities of IMSChoA and ChoA are the same, as shown in Equation (23).
$$T_1(n) = T_2(n) = O\left( n + f(n) \right) \tag{23}$$
In summary, the improvement strategies proposed in this paper for the defects of ChoA do not increase the time complexity.

4. Algorithm Performance Testing

4.1. Experimental Parameter Settings

In this paper, the PSO algorithm, GWO algorithm, IMSChoA algorithm, ChoA algorithm, and MFO algorithm are selected for the optimization search comparison. The basic parameters are uniformly set as follows: population size N = 30 and maximum number of iterations tmax = 500; the internal parameters of each algorithm are shown in Table 1.

4.2. Benchmark Test Functions

To verify the effectiveness of the improved chimpanzee algorithm (IMSChoA), 21 benchmark test functions used in the literature [32] were selected for the optimization test, as shown in Table 2. F1~F7 are continuous unimodal test functions, F8~F12 are continuous multimodal test functions, and F13~F21 are fixed multimodal test functions. Figure 7, Figure 8 and Figure 9 show the value distributions of several of the continuous unimodal, continuous multimodal, and fixed multimodal test functions, respectively.

4.3. Comparison of the IMSChoA Algorithm with Other Algorithms

Twenty-one basic test functions are selected to run the optimization search test for the algorithms listed in Section 4.1; the iteration curves are shown in Figure 10. Among them, F1~F7 are continuous unimodal test functions with only one global optimum and no local optimum, used to test the global search ability and convergence speed of the algorithm. From Figure 10a–g, the IMSChoA optimization algorithm converges quickly and has strong global search capability, outperforming the other swarm intelligence optimization algorithms in iteration speed and early global search. F8~F12 are continuous multimodal test functions with multiple local optima, used to test the ability of the algorithm to jump out of local optima. From Figure 10h–l, the IMSChoA optimization algorithm falls into local optima when computing the F8, F9, F11, and F12 functions but quickly jumps out of the current state and continues the search, which shows that IMSChoA has a strong ability to escape local optima, further improved over the ChoA optimization algorithm. F13~F21 are fixed multimodal test functions, used to test the balanced development capability and stability of the algorithm. As can be seen from Figure 10m–u, the IMSChoA optimization algorithm balances the iterative abilities of the algorithm well and completes the optimization search quickly, with clear improvements in convergence accuracy and stability. However, Figure 10t,u shows that some functions still trap the algorithm in local optima during iteration; although it eventually jumps out and finds the optimal solution, the iteration time is longer. From the overall iteration curves, we conclude that the algorithm is better suited to high-dimensional, large-range optimum searches, while for low-dimensional, small-range searches it still has some disadvantages: compared with the other optimization algorithms, there remain inadequacies in iteration speed and in escaping local optima, leaving room for improvement.
To keep the test environment fair, each algorithm is run 50 times independently on the twenty-one basic test functions; the results are shown in Table 3. The optimal value, mean value, and standard deviation reflect the convergence accuracy, convergence speed, and optimization stability of the algorithms, respectively. Compared with the other algorithms, the IMSChoA algorithm finds a fixed optimal value on each function. Its computational performance exceeds the PSO algorithm on all functions except F20 and F21, where it is slightly slower in search speed. Compared with the MFO algorithm, IMSChoA performs better on all functions except F18, where it is slightly less stable. Compared with the ChoA algorithm and the GWO algorithm, IMSChoA outperforms both in all respects. This demonstrates that the IMSChoA optimization algorithm has clear advantages in convergence accuracy, convergence speed, and search stability.

4.4. Wilcoxon Rank Sum Test

To reflect the effectiveness of the improved algorithm, the literature suggests that its performance should be evaluated with a statistical test. In this paper, the Wilcoxon rank sum test is used at the 5% significance level to determine whether the results of IMSChoA differ significantly from those of PSO, GWO, ChoA, and MFO. The Wilcoxon rank sum test is a nonparametric statistical test that can detect more complex data distributions; an analysis based only on the mean and standard deviation of the current data, without comparison across multiple runs of the algorithms, is not scientifically sound. To demonstrate the superiority of the IMSChoA optimization algorithm, the results of 12 runs on the test functions were selected and compared with the runs of the PSO, GWO, ChoA, and MFO algorithms using the Wilcoxon rank sum test; when the computed p-value satisfies p < 5%, the null hypothesis can be strongly rejected [33]. The results of the Wilcoxon rank sum test are shown in Table 4. The symbols “+”, “−”, and “=” indicate that IMSChoA outperforms, underperforms, or cannot be significantly distinguished from the other algorithms, respectively. From the results in Table 4, the p-values for IMSChoA are essentially all below 5%, indicating that, statistically, IMSChoA has a significant advantage in the basic function searches, which further reflects its robustness.
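As an illustration, the rank sum comparison of two algorithms' run results can be performed with SciPy as below; the two samples are synthetic stand-ins, not the paper's data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
imschoa_runs = rng.normal(1e-3, 1e-4, 12)    # placeholder: 12 IMSChoA results
other_runs = rng.normal(5e-3, 1e-3, 12)      # placeholder: 12 results of a rival

stat, p = ranksums(imschoa_runs, other_runs)
mark = "+" if p < 0.05 else "="              # significance marker as in Table 4
print(f"p = {p:.3e} -> {mark}")
```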

5. Application Analysis of IMSChoA Algorithm Engineering Calculations

5.1. Spring Optimization Design Case Study

The optimization goal of the tension/compression spring design problem is to reduce the weight of the spring. The spring mechanism is shown in Figure 11. The constraints of this design are shear stress, vibration frequency, and minimum deflection. The variables y1, y2, and y3 represent the wire diameter d, the coil diameter D, and the number of coils N, respectively, and f(x) is the spring weight to be minimized. The mathematical model is described as follows.
Objective function:
$$\min f(y) = y_1^2 \, y_2 \left( 2 + y_3 \right) \tag{24}$$
Constraints:
$$\begin{aligned} s_1(y) &= 1 - \frac{y_2^3 y_3}{71785 \, y_1^4} \leq 0 \\ s_2(y) &= \frac{4 y_2^2 - y_1 y_2}{12566 \left( y_2 y_1^3 - y_1^4 \right)} + \frac{1}{5108 \, y_1^2} - 1 \leq 0 \\ s_3(y) &= 1 - \frac{140.45 \, y_1}{y_2^2 y_3} \leq 0 \\ s_4(y) &= \frac{y_1 + y_2}{1.5} - 1 \leq 0 \end{aligned} \tag{25}$$
Among them: 0.05 ≤ y1 ≤ 2, 0.05 ≤ y2 ≤ 2, and 0.05 ≤ y3 ≤ 2.
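For use as a fitness function in a swarm optimizer, Equations (24) and (25) can be combined with a static penalty, as in the following sketch; the penalty weight is an assumption.

```python
import numpy as np

def spring_fitness(y, penalty=1e6):
    """Penalized objective for the spring design problem (Equations (24)-(25))."""
    y1, y2, y3 = y                               # wire dia., coil dia., coil count
    f = y1 ** 2 * y2 * (2.0 + y3)                # Equation (24)
    g = np.array([                               # Equation (25), each s_i(y) <= 0
        1.0 - (y2 ** 3 * y3) / (71785.0 * y1 ** 4),
        (4.0 * y2 ** 2 - y1 * y2) / (12566.0 * (y2 * y1 ** 3 - y1 ** 4))
        + 1.0 / (5108.0 * y1 ** 2) - 1.0,
        1.0 - 140.45 * y1 / (y2 ** 2 * y3),
        (y1 + y2) / 1.5 - 1.0,
    ])
    return f + penalty * np.sum(np.maximum(g, 0.0) ** 2)
```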
The PSO algorithm, GWO algorithm, the IMSChoA algorithm proposed in this paper, the ChoA algorithm, and the MFO algorithm were compared experimentally, with the data of the compared algorithms obtained from the literature [24,34]. The experiments used a population size of 50 and a maximum of 500 iterations, and each algorithm was run 100 times independently and the average taken. The optimization results are shown in Table 5.
As shown in Table 5, the IMSChoA algorithm obtains the optimal solution [y1, y2, y3] = [0.0615, 0.7215, 5.5122] with f(x) = 0.0124. IMSChoA performs well on the tension/compression spring design problem, and its results for the wire diameter, coil diameter, and number of coils are better than those of the other algorithms. This shows that IMSChoA obtains the best solution for reducing the weight of the spring.

5.2. Optimization Experiments of the Fully Automatic Piston Manometer Control System

The ultimate goal of optimizing the manometer control system is to have the check piston quickly and stably reach the equilibrium position and achieve pressure measurement. A PID controller tuned by a swarm intelligence algorithm is generally used in engineering to achieve fast, stable, and accurate control. The PID control law is given in Equation (26), where kp is the proportional coefficient, ki is the integral coefficient, and kd is the differential coefficient. The structure of the manometer control system is shown in Figure 12.
$$u(k) = k_p e(k) + k_i \sum_{n=0}^{k} e(n) + k_d \left( e(k) - e(k-1) \right) \tag{26}$$
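A discrete implementation of Equation (26) might look as follows; the gains shown are the IMSChoA-tuned values from Table 6.

```python
class PID:
    """Discrete PID law of Equation (26)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.acc, self.prev = 0.0, 0.0           # running sum of e(n) and e(k-1)

    def step(self, e):
        self.acc += e                            # sum of e(n), n = 0..k
        u = self.kp * e + self.ki * self.acc + self.kd * (e - self.prev)
        self.prev = e
        return u

controller = PID(kp=0.2615, ki=0.0215, kd=0.4115)  # IMSChoA-tuned gains (Table 6)
```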
The experiment uses a 500 MPa fully automatic piston manometer, as shown in Figure 13. The device uses an STM32F429IGT6 as the control core, equipped with a series of control circuits. The pneumatic solenoid valve is controlled to configure the weights, and the servo motor applies pressure to make the piston float to the balance position and complete the pressure check. The range of piston movement is set from −2 mm to 2 mm, with 0 mm as the equilibrium position.
The PSO algorithm, GWO algorithm, IMSChoA algorithm, ChoA algorithm, and MFO algorithm were each used to tune the parameters of the PID controller, and the results of the manometer operation were compared. The initial conditions of each algorithm are the same. Each algorithm is run 50 times independently and the average taken. The tuned PID parameters are given in Table 6, and the displacement and velocity curves of the check piston are shown in Figure 14 and Figure 15.
As shown in Figure 14 and Figure 15, the PID controller optimized by the IMSChoA algorithm has the best control effect on the manometer system, and the check piston can reach the balance position quickly and stably to complete the pressure detection. This further proves the feasibility of IMSChoA in practical engineering applications for the optimal design of mechanical structures.

6. Conclusions

In this paper, we propose an improved chimpanzee search algorithm with multi-strategy fusion, IMSChoA, to address the low convergence accuracy of the ChoA optimization algorithm and its proneness to falling into local optima. Firstly, we use an improved sine chaotic mapping to initialize the population and resolve the boundary aggregation distribution of the population. Secondly, the particle swarm idea is added and cooperates with the improved nonlinear convergence factor to balance the search ability of the algorithm, accelerate its convergence, and improve its convergence accuracy. Finally, the sparrow elite mutation improved by the adaptive water wave factor and the Bernoulli chaos mapping strategy are added to improve the ability of individuals to jump out of local optima. Optimization tests on 21 standard test functions, analyzed with the help of the Wilcoxon rank sum statistical test, verify the robustness and applicability of the algorithm. Finally, the IMSChoA optimization algorithm is applied to the spring design case study and the optimization of the fully automatic piston manometer control system; the experimental results show that it is also well suited to mechanical structure optimization design problems, although its comprehensive performance on low-dimensional, small-range, high-precision searches remains inadequate. The next step will therefore be to combine the IMSChoA algorithm with deep learning to overcome its limitations on high-precision and complex problems, and to apply it to more practical engineering problems.

Author Contributions

T.G. designed the project and coordinated the work. H.W. checked and discussed the results and the whole manuscript. F.Z. contributed to the discussion of this study. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Scientific Research Fund Project of the Education Department of Liaoning Province, grant number LJKZ0510.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tharwat, A.; Elhoseny, M.; Hassanien, A.E.; Gabel, T.; Kumar, A. Intelligent Bézier curve-based path planning model using Chaotic Particle Swarm Optimization algorithm. Cluster Comput. 2019, 22 (Suppl. S2), 4745–4766.
  2. Cinsdikici, M.G.; Aydın, D. Detection of blood vessels in ophthalmoscope images using MF/ant (matched filter/ant colony) algorithm. Comput. Methods Programs Biomed. 2009, 96, 85–95.
  3. Ghanamijaber, M. A hybrid fuzzy-PID controller based on gray wolf optimization algorithm in power system. Evol. Syst. 2019, 10, 273–284.
  4. Maharana, D.; Kotecha, P. Optimization of Job Shop Scheduling Problem with Grey Wolf Optimizer and JAYA Algorithm. In Smart Innovations in Communication and Computational Sciences; Springer: Singapore, 2019; pp. 47–58.
  5. Ebrahimi, B.; Rahmani, M.; Ghodsypour, S.H. A new simulation-based genetic algorithm to efficiency measure in IDEA with weight restrictions. Measurement 2017, 108, 26–33.
  6. Bu, S.J.; Kang, H.B.; Cho, S.B. Ensemble of Deep Convolutional Learning Classifier System Based on Genetic Algorithm for Database Intrusion Detection. Electronics 2022, 11, 745.
  7. Afzal, A.; Ramis, M.K. Multi-objective optimization of thermal performance in battery system using genetic and particle swarm algorithm combined with fuzzy logics. J. Energy Storage 2020, 32, 101815.
  8. Xin-gang, Z.; Ji, L.; Jin, M.; Ying, Z. An improved quantum particle swarm optimization algorithm for environmental economic dispatch. Expert Syst. Appl. 2020, 152, 113370.
  9. Sun, X.; Hu, C.; Lei, G.; Guo, Y.; Zhu, J. State Feedback Control for a PM Hub Motor Based on Gray Wolf Optimization Algorithm. IEEE Trans. Power Electron. 2020, 35, 1136–1146.
  10. Meidani, K.; Hemmasian, A.; Mirjalili, S.; Farimani, A.B. Adaptive grey wolf optimizer. Neural Comput. Appl. 2022, 34, 7711–7731.
  11. Meng, X.; Liu, Y.; Gao, X.; Zhang, H. A new bio-inspired algorithm: Chicken swarm optimization. In International Conference in Swarm Intelligence; Springer: Cham, Switzerland, 2014; pp. 86–94.
  12. Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control. Eng. 2020, 8, 22–34.
  13. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  14. Lu, C.; Gao, L.; Yi, J. Grey wolf optimizer with cellular topological structure. Expert Syst. Appl. 2018, 107, 89–114.
  15. Teng, Z.; Lv, J.; Guo, L. An improved hybrid grey wolf optimization algorithm. Soft Comput. 2019, 23, 6617–6631.
  16. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Amin, M.; Azar, A.T. New binary whale optimization algorithm for discrete optimization problems. Eng. Optim. 2020, 52, 945–959.
  17. Wang, Z.; Qin, C.; Wan, B.; Song, W.W.; Yang, G. An Adaptive Fuzzy Chicken Swarm Optimization Algorithm. Math. Probl. Eng. 2021, 2021, 8896794.
  18. Tian, D.; Shi, Z. MPSO: Modified particle swarm optimization and its applications. Swarm Evol. Comput. 2018, 41, 49–68.
  19. Li, Y.; Zhu, X.; Liu, J. An improved moth-flame optimization algorithm for engineering problems. Symmetry 2020, 12, 1234.
  20. Khishe, M.; Mosavi, M.R. Chimp optimization algorithm. Expert Syst. Appl. 2020, 149, 113338.
  21. Du, N.; Zhou, Y.; Deng, W.; Luo, Q. Improved chimp optimization algorithm for three-dimensional path planning problem. Multimed. Tools Appl. 2022, 81, 27397–27422.
  22. Kumari, C.L.; Kamboj, V.K.; Bath, S.K.; Tripathi, S.L.; Khatri, M.; Sehgal, S. A boosted chimp optimizer for numerical and engineering design optimization challenges. Eng. Comput. 2022, 1–52.
  23. Houssein, E.H.; Emam, M.M.; Ali, A.A. An efficient multilevel thresholding segmentation method for thermography breast cancer imaging based on improved chimp optimization algorithm. Expert Syst. Appl. 2021, 185, 115651.
  24. Liu, C.; He, Q. Golden sine chimpanzee optimization algorithm integrating multiple strategies. J. Autom. 2022, 47, 1–14.
  25. Hekmatmanesh, A.; Wu, H.; Handroos, H. Largest Lyapunov Exponent Optimization for Control of a Bionic-Hand: A Brain Computer Interface Study. Front. Rehabil. Sci. 2022, 2, 802070.
  26. Li, Y.; Han, M.; Guo, Q. Modified Whale Optimization Algorithm Based on Tent Chaotic Mapping and Its Application in Structural Optimization. KSCE J. Civ. Eng. 2020, 24, 3703–3713.
  27. Xiong, X.; Wan, Z. The simulation of double inverted pendulum control based on particle swarm optimization LQR algorithm. In Proceedings of the 2010 IEEE International Conference on Software Engineering and Service Sciences, Beijing, China, 16–18 July 2009; IEEE: Piscataway, NJ, USA, 2010; pp. 253–256.
  28. Liu, X.; Bai, Y.; Yu, C.; Yang, H.; Gao, H.; Wang, J.; Chang, Q.; Wen, X. Multi-Strategy Improved Sparrow Search Algorithm and Application. Math. Comput. Appl. 2022, 27, 96.
  29. Liu, Z.; Li, M.; Pang, G.; Song, H.; Yu, Q.; Zhang, H. A Multi-Strategy Improved Arithmetic Optimization Algorithm. Symmetry 2022, 14, 1011.
  30. Wang, P.; Zhang, Y.; Yang, H. Research on economic optimization of microgrid cluster based on chaos sparrow search algorithm. Comput. Intell. Neurosci. 2021, 2021, 5556780.
  31. Mareli, M.; Twala, B. An adaptive Cuckoo search algorithm for optimisation. Appl. Comput. Inform. 2018, 14, 107–115.
  32. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18.
  33. Xinming, Z.; Xia, W.; Qiang, K. Improved grey wolf optimizer and its application to high-dimensional function and FCM optimization. Control. Decis. 2019, 34, 2073–2084.
  34. He, Q.; Luo, S.H.H. Hybrid improvement strategy of chimpanzee optimization algorithm and its mechanical application. Control. Decis. Mak. 2022, 1–11.
Figure 1. (a) Dimensional distribution map of the solutions generated by the original sine mapping; the circle marks the focused observation area, showing the boundary aggregation phenomenon; (b) dimensional distribution map of the solutions generated by the improved sine mapping; (c) dimensional distribution histogram of the solutions generated by the original sine mapping; (d) dimensional distribution histogram of the solutions generated by the improved sine mapping.
Figure 2. Change curve of acceleration factor.
Figure 3. Curve of inertia weight change.
Figure 4. Comparison curve of convergence factors.
Figure 5. Adaptive water wave factor distribution with 500 iterations.
Figure 6. Flow chart of IMSChoA algorithm optimization.
Figure 7. Distribution diagram of different continuous unimodal test function values. (a) F1 function; (b) F4 function; (c) F5 function.
Figure 8. Distribution diagram of different continuous multimodal test function values. (a) F8 function; (b) F9 function; (c) F12 function.
Figure 9. Distribution diagram of different fixed multimodal test function values. (a) F14 function; (b) F16 function; (c) F19 function.
Figure 10. Convergence curves of different functions. (a) F1 function; (b) F2 function; (c) F3 function; (d) F4 function; (e) F5 function; (f) F6 function; (g) F7 function; (h) F8 function; (i) F9 function; (j) F10 function; (k) F11 function; (l) F12 function; (m) F13 function; (n) F14 function; (o) F15 function; (p) F16 function; (q) F17 function; (r) F18 function; (s) F19 function; (t) F20 function; (u) F21 function.
Figure 11. Schematic diagram of spring extension/compression structure.
Figure 12. Structure of PID control system.
Figure 13. 500 MPa automatic piston manometer.
Figure 14. Displacement curve.
Figure 15. Speed curve.
Table 1. Parameter table of the algorithms.

| Algorithm | Parameters |
|---|---|
| PSO | C1 = 1.445; C2 = 1.445; wmax = 2.0; wmin = 0.5 |
| GWO | a decreases linearly from 1.5 to 0; r1, r2 ∈ [0, 1] |
| IMSChoA | wmax = 2.5; wmin = 0.05; fg = 2.5; λ = 0.4; g = 1000 |
| ChoA | m = chaos(3,1,1) |
| MFO | t ∈ [k, 1]; k varies linearly between −1 and −2; b = 1 |
Table 2. Benchmark functions.

| No. | Function Name | Definition Domain | Dimensionality | Optimal Value | Absolute Accuracy Error |
|---|---|---|---|---|---|
| F1 | Sphere | [−100, 100] | 30 | 0 | 1.00 × 10^−3 |
| F2 | Schwefel's Problem 2.22 | [−10, 10] | 30 | 0 | 1.00 × 10^−3 |
| F3 | Schwefel's Problem 1.2 | [−100, 100] | 30 | 0 | 1.00 × 10^−3 |
| F4 | Schwefel's Problem 2.21 | [−100, 100] | 30 | 0 | 1.00 × 10^−3 |
| F5 | Generalized Rosenbrock's Function | [−30, 30] | 30 | 0 | 1.00 × 10^−2 |
| F6 | Step Function | [−100, 100] | 30 | 0 | 1.00 × 10^−2 |
| F7 | Quartic Function | [−1.28, 1.28] | 30 | 0 | 1.00 × 10^−2 |
| F8 | Generalized Schwefel's Problem | [−500, 500] | 30 | −12,569.5 | 1.00 × 10^2 |
| F9 | Generalized Rastrigin's Function | [−5.12, 5.12] | 30 | 0 | 1.00 × 10^−2 |
| F10 | Ackley's Function | [−32, 32] | 30 | 0 | 1.00 × 10^−2 |
| F11 | Generalized Griewank Function | [−600, 600] | 30 | 0 | 1.00 × 10^−2 |
| F12 | Generalized Penalized Function | [−50, 50] | 30 | 0 | 1.00 × 10^−2 |
| F13 | Branin Function | [−5, 5] | 2 | 0.398 | 1.00 × 10^−2 |
| F14 | Shekel's Foxholes Function | [−65, 65] | 2 | 1 | 1.00 × 10^−2 |
| F15 | Kowalik's Function | [−5, 5] | 4 | 0.0003 | 1.00 × 10^−2 |
| F16 | Six-Hump Camel-Back Function | [−5, 5] | 2 | −1.03 | 1.00 × 10^−2 |
| F17 | Goldstein–Price Function | [−2, 2] | 2 | 3 | 1.00 × 10^−2 |
| F18 | Hartman's Function 1 | [0, 1] | 3 | −3.86 | 1.00 × 10^−2 |
| F19 | Hartman's Function 2 | [0, 1] | 6 | −3.32 | 1.00 × 10^−2 |
| F20 | Shekel's Family 1 | [0, 10] | 4 | −10 | 1.00 × 10^−2 |
| F21 | Shekel's Family 2 | [0, 10] | 4 | −10 | 1.00 × 10^−2 |
Table 3. Experimental results of function test (30 dimensions).

| Function | Algorithm | Optimum Value | Average Value | Standard Deviation |
|---|---|---|---|---|
| F1 | PSO | 1.7739 × 10^0 | 4.2242 × 10^3 | 1.3399 × 10^4 |
| | GWO | 3.8341 × 10^−6 | 3.8284 × 10^4 | 4.0132 × 10^4 |
| | IMSChoA | 2.3526 × 10^−30 | 9.8425 × 10^2 | 7.5512 × 10^3 |
| | ChoA | 6.2740 × 10^−11 | 4.1370 × 10^4 | 3.9220 × 10^4 |
| | MFO | 7.8368 × 10^1 | 1.0170 × 10^4 | 1.8759 × 10^4 |
| F2 | PSO | 5.8589 × 10^0 | 9.9897 × 10^8 | 2.2020 × 10^10 |
| | GWO | 0.0008 × 10^0 | 5.5000 × 10^16 | 7.3352 × 10^16 |
| | IMSChoA | 3.3714 × 10^−18 | 8.1379 × 10^3 | 1.8164 × 10^8 |
| | ChoA | 9.2287 × 10^−6 | 1.4540 × 10^13 | 43.4356 × 10^13 |
| | MFO | 5.0048 × 10^1 | 2.4481 × 10^16 | 3.4463 × 10^17 |
| F3 | PSO | 0.6379 × 10^0 | 5.2282 × 10^3 | 1.3308 × 10^4 |
| | GWO | 5.7064 × 10^−6 | 3.2449 × 10^4 | 3.1888 × 10^4 |
| | IMSChoA | 1.2559 × 10^−30 | 7.7739 × 10^2 | 6.0601 × 10^3 |
| | ChoA | 9.8877 × 10^−8 | 4.5275 × 10^4 | 4.3573 × 10^4 |
| | MFO | 2.0501 × 10^0 | 1.4254 × 10^4 | 2.5940 × 10^4 |
| F4 | PSO | 2.8173 × 10^0 | 1.2901 × 10^1 | 1.6815 × 10^1 |
| | GWO | 4.9114 × 10^−4 | 3.6630 × 10^1 | 3.0889 × 10^1 |
| | IMSChoA | 2.4606 × 10^−4 | 1.2484 × 10^1 | 2.8149 × 10^1 |
| | ChoA | 1.8419 × 10^−2 | 6.0327 × 10^1 | 4.3718 × 10^1 |
| | MFO | 9.2480 × 10^1 | 9.2986 × 10^1 | 1.3286 × 10^2 |
| F5 | PSO | 6.7185 × 10^2 | 2.4067 × 10^6 | 1.6751 × 10^7 |
| | GWO | 2.9001 × 10^1 | 1.4677 × 10^8 | 2.1058 × 10^8 |
| | IMSChoA | 2.6094 × 10^1 | 3.8713 × 10^6 | 3.9511 × 10^7 |
| | ChoA | 2.8961 × 10^1 | 3.2706 × 10^7 | 3.2469 × 10^6 |
| | MFO | 9.0475 × 10^4 | 4.8562 × 10^7 | 1.0463 × 10^8 |
| F6 | PSO | 1.2471 × 10^0 | 4.1923 × 10^3 | 1.3160 × 10^4 |
| | GWO | 7.5015 × 10^0 | 4.0337 × 10^4 | 4.3493 × 10^4 |
| | IMSChoA | 0.4743 × 10^0 | 9.2983 × 10^2 | 8.0821 × 10^3 |
| | ChoA | 4.0870 × 10^0 | 4.5433 × 10^4 | 4.1956 × 10^4 |
| | MFO | 7.7287 × 10^0 | 2.6025 × 10^4 | 3.1180 × 10^4 |
| F7 | PSO | 3.2543 × 10^0 | 8.8942 × 10^6 | 5.3271 × 10^7 |
| | GWO | 1.6692 × 10^0 | 2.8689 × 10^8 | 4.6735 × 10^8 |
| | IMSChoA | 0.7521 × 10^−1 | 5.5112 × 10^6 | 5.1999 × 10^7 |
| | ChoA | 0.3327 × 10^0 | 8.0941 × 10^8 | 2.5548 × 10^8 |
| | MFO | 7.918 × 10^0 | 7.7442 × 10^7 | 1.7117 × 10^8 |
| F8 | PSO | 1.2435 × 10^1 | 2.1457 × 10^2 | 3.2457 × 10^3 |
| | GWO | 4.5876 × 10^−3 | 2.1475 × 10^0 | 2.4785 × 10^1 |
| | IMSChoA | 8.7812 × 10^−5 | 7.7865 × 10^−1 | 1.5782 × 10^1 |
| | ChoA | 4.7852 × 10^−4 | 1.4231 × 10^2 | 2.4785 × 10^3 |
| | MFO | 7.2145 × 10^0 | 2.1452 × 10^1 | 7.8452 × 10^3 |
| F9 | PSO | 4.2127 × 10^0 | 1.0856 × 10^7 | 5.8853 × 10^7 |
| | GWO | 1.6693 × 10^0 | 1.9264 × 10^8 | 2.9353 × 10^8 |
| | IMSChoA | 0.4157 × 10^−1 | 7.6690 × 10^6 | 7.0082 × 10^7 |
| | ChoA | 0.4032 × 10^0 | 3.1662 × 10^8 | 2.8858 × 10^8 |
| | MFO | 3.1148 × 10^1 | 1.8874 × 10^8 | 3.0155 × 10^8 |
| F10 | PSO | 2.3462 × 10^2 | 1.1810 × 10^7 | 6.7967 × 10^7 |
| | GWO | 1.6697 × 10^0 | 1.7130 × 10^8 | 2.5750 × 10^8 |
| | IMSChoA | 0.1330 × 10^−1 | 6.1348 × 10^6 | 6.4309 × 10^7 |
| | ChoA | 0.4086 × 10^0 | 6.1276 × 10^8 | 6.0678 × 10^8 |
| | MFO | 5.2633 × 10^0 | 5.8286 × 10^7 | 1.4566 × 10^8 |
| F11 | PSO | 3.9789 × 10^−1 | 3.9946 × 10^−1 | 9.5817 × 10^−3 |
| | GWO | 5.1327 × 10^0 | 5.2427 × 10^0 | 2.1338 × 10^−14 |
| | IMSChoA | 3.9789 × 10^−1 | 4.1277 × 10^−2 | 1.4810 × 10^−1 |
| | ChoA | 3.9814 × 10^−1 | 4.3905 × 10^−1 | 1.8777 × 10^−1 |
| | MFO | 7.8961 × 10^−1 | 4.0922 × 10^−1 | 1.0551 × 10^−1 |
| F12 | PSO | 9.5104 × 10^−1 | 4.9024 × 10^3 | 1.2576 × 10^4 |
| | GWO | 7.5020 × 10^0 | 4.0944 × 10^4 | 4.4157 × 10^4 |
| | IMSChoA | 4.9153 × 10^−1 | 9.6161 × 10^2 | 7.6567 × 10^3 |
| | ChoA | 3.7205 × 10^0 | 5.0133 × 10^4 | 4.4403 × 10^4 |
| | MFO | 1.0101 × 10^4 | 2.3015 × 10^4 | 2.2183 × 10^4 |
| F13 | PSO | 3.9789 × 10^−1 | 4.1037 × 10^−1 | 1.1297 × 10^−1 |
| | GWO | 4.0002 × 10^−1 | 4.6941 × 10^−1 | 2.4898 × 10^−1 |
| | IMSChoA | 3.6787 × 10^−1 | 3.1971 × 10^−1 | 1.3684 × 10^−1 |
| | ChoA | 3.9810 × 10^−1 | 4.2046 × 10^−1 | 5.9657 × 10^−1 |
| | MFO | 3.9787 × 10^−1 | 5.0134 × 10^−1 | 9.5415 × 10^−1 |
| F14 | PSO | 1.9920 × 10^0 | 2.2788 × 10^0 | 1.2463 × 10^2 |
| | GWO | 9.9954 × 10^−1 | 2.3876 × 10^1 | 9.0365 × 10^1 |
| | IMSChoA | 9.9800 × 10^−1 | 2.0056 × 10^0 | 1.4190 × 10^1 |
| | ChoA | 9.9961 × 10^−1 | 2.1719 × 10^1 | 9.3934 × 10^1 |
| | MFO | 3.9683 × 10^0 | 1.1003 × 10^1 | 5.5468 × 10^1 |
| F15 | PSO | 9.1133 × 10^−4 | 6.6071 × 10^−3 | 1.0972 × 10^−2 |
| | GWO | 1.8780 × 10^−3 | 4.1942 × 10^−2 | 7.6213 × 10^−1 |
| | IMSChoA | 3.9751 × 10^−4 | 2.2051 × 10^−3 | 1.0951 × 10^−2 |
| | ChoA | 1.3171 × 10^−3 | 1.6130 × 10^−3 | 7.6465 × 10^−4 |
| | MFO | 1.4888 × 10^−3 | 1.6207 × 10^−3 | 9.3492 × 10^−4 |
| F16 | PSO | 3.3912 × 10^0 | 1.9896 × 10^7 | 1.2035 × 10^8 |
| | GWO | 3.0040 × 10^0 | 6.8273 × 10^8 | 1.0925 × 10^9 |
| | IMSChoA | 4.0537 × 10^−1 | 1.5265 × 10^7 | 1.6062 × 10^7 |
| | ChoA | 2.8445 × 10^0 | 1.2423 × 10^9 | 1.2502 × 10^9 |
| | MFO | 3.4422 × 10^1 | 1.3698 × 10^8 | 3.1584 × 10^8 |
| F17 | PSO | 3.0287 × 10^0 | 3.6754 × 10^0 | 1.4510 × 10^0 |
| | GWO | 3.0159 × 10^0 | 3.5667 × 10^0 | 3.8514 × 10^−1 |
| | IMSChoA | 3.0000 × 10^0 | 3.3619 × 10^0 | 9.2576 × 10^−2 |
| | ChoA | 3.1006 × 10^0 | 4.0996 × 10^0 | 4.3993 × 10^0 |
| | MFO | 3.0505 × 10^0 | 6.6287 × 10^0 | 3.8731 × 10^1 |
| F18 | PSO | −3.8538 × 10^0 | −3.0897 × 10^0 | 4.8218 × 10^−1 |
| | GWO | −3.7439 × 10^0 | −3.7220 × 10^0 | 1.4091 × 10^−1 |
| | IMSChoA | −3.8628 × 10^0 | −3.8573 × 10^0 | 3.4272 × 10^−2 |
| | ChoA | −3.8549 × 10^0 | −3.8158 × 10^0 | 2.0926 × 10^−1 |
| | MFO | −3.7436 × 10^0 | −3.5870 × 10^0 | 3.2694 × 10^−2 |
| F19 | PSO | −2.8067 × 10^0 | −1.7476 × 10^0 | 8.7338 × 10^−1 |
| | GWO | −2.0159 × 10^−1 | −2.0159 × 10^−1 | 1.1669 × 10^−1 |
| | IMSChoA | −3.3220 × 10^0 | −3.2524 × 10^0 | 2.6905 × 10^−2 |
| | ChoA | −3.0124 × 10^0 | −2.8667 × 10^0 | 4.4520 × 10^−1 |
| | MFO | −3.3220 × 10^0 | −3.0003 × 10^0 | 3.7082 × 10^−1 |
| F20 | PSO | −1.0138 × 10^1 | −9.7699 × 10^0 | 1.2788 × 10^−1 |
| | GWO | −2.7312 × 10^−1 | −2.7312 × 10^−1 | 1.6670 × 10^0 |
| | IMSChoA | −1.0152 × 10^1 | −7.2606 × 10^0 | 2.7155 × 10^−1 |
| | ChoA | −4.9481 × 10^−1 | −4.2491 × 10^0 | 1.1362 × 10^0 |
| | MFO | −5.0552 × 10^0 | −4.8248 × 10^0 | 8.9123 × 10^−1 |
| F21 | PSO | −1.0518 × 10^1 | −1.0113 × 10^1 | 1.3731 × 10^0 |
| | GWO | −1.6697 × 10^0 | −1.5444 × 10^0 | 2.8352 × 10^−1 |
| | IMSChoA | −1.0536 × 10^1 | −7.3240 × 10^0 | 2.6989 × 10^−1 |
| | ChoA | −5.0690 × 10^0 | −4.3478 × 10^−1 | 1.1266 × 10^0 |
| | MFO | −5.1285 × 10^0 | −4.9702 × 10^0 | 7.2547 × 10^0 |
Table 4. Wilcoxon rank-sum test results.

| No. | PSO | GWO | ChoA | MFO |
|---|---|---|---|---|
| F1 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 |
| F2 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 |
| F3 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 |
| F4 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 | 3.04 × 10^−20 |
| F5 | 1.29 × 10^−17 | 2.69 × 10^−17 | 2.33 × 10^−10 | 1.06 × 10^−10 |
| F6 | 7.24 × 10^−18 | 7.11 × 10^−18 | 1.39 × 10^−17 | 1.43 × 10^−10 |
| F7 | 4.28 × 10^−17 | 7.09 × 10^−18 | 1.38 × 10^−17 | 6.77 × 10^−14 |
| F8 | 7.09 × 10^−18 | 7.09 × 10^−18 | 7.24 × 10^−18 | 2.24 × 10^−10 |
| F9 | 3.45 × 10^−20 | 3.33 × 10^−20 | 1.23 × 10^−19 | NaN |
| F10 | 3.45 × 10^−20 | 3.33 × 10^−20 | 2.98 × 10^−20 | 2.42 × 10^−16 |
| F11 | 3.49 × 10^−20 | 3.31 × 10^−20 | 2.69 × 10^−4 | 3.65 × 10^−14 |
| F12 | 7.07 × 10^−20 | 9.55 × 10^−11 | 8.01 × 10^−10 | 1.85 × 10^−17 |
| +/=/− | 12/0/0 | 12/0/0 | 12/0/0 | 11/0/1 |
Table 5. The optimal solutions of each algorithm in the stretching/compression spring design problem.

| Algorithm | Kp | Ki | Kd | Adaptability Value |
|---|---|---|---|---|
| PSO | 0.5232 | 0.0603 | 0.5821 | 64.7664 |
| GWO | 0.5114 | 0.3415 | 0.0734 | 55.3211 |
| IMSChoA | 0.2615 | 0.0215 | 0.4115 | 43.1124 |
| ChoA | 0.4562 | 0.4754 | 0.4214 | 53.1451 |
| MFO | 0.6533 | 0.3147 | 0.4315 | 57.5521 |
Table 6. Tuning parameters of PID controller optimized by different algorithms.

| Algorithm | Kp | Ki | Kd | Adaptability Value |
|---|---|---|---|---|
| PSO | 0.5232 | 0.0603 | 0.5821 | 64.7664 |
| GWO | 0.5114 | 0.3415 | 0.0734 | 55.3211 |
| IMSChoA | 0.2615 | 0.0215 | 0.4115 | 43.1124 |
| ChoA | 0.4562 | 0.4754 | 0.4214 | 53.1451 |
| MFO | 0.6533 | 0.3147 | 0.4315 | 57.5521 |