Article

Modified Sparrow Search Algorithm by Incorporating Multi-Strategy for Solving Mathematical Optimization Problems

1 School of Information Engineering, Tianjin University of Commerce, Beichen, Tianjin 300134, China
2 College of Science, Tianjin University of Commerce, Beichen, Tianjin 300134, China
* Author to whom correspondence should be addressed.
Biomimetics 2025, 10(5), 299; https://doi.org/10.3390/biomimetics10050299
Submission received: 21 January 2025 / Revised: 25 April 2025 / Accepted: 5 May 2025 / Published: 8 May 2025

Abstract

The Sparrow Search Algorithm (SSA), proposed by Jiankai Xue in 2020, is a swarm intelligence optimization algorithm that has received extensive attention due to its powerful optimization-seeking ability and rapid convergence. However, similar to other swarm intelligence algorithms, the SSA is prone to falling into local optimal solutions during the optimization process, which limits its application effectiveness. To overcome this limitation, this paper proposes a Modified Sparrow Search Algorithm (MSSA), which enhances the algorithm's performance by integrating three optimization strategies. Specifically, the Latin Hypercube Sampling (LHS) method is employed to achieve a uniform distribution of the initial population, laying a solid foundation for global search. An adaptive weighting mechanism is introduced in the producer update phase to dynamically adjust the search step size, effectively reducing the risk of the algorithm falling into local optima in later iterations. Meanwhile, the cat mapping perturbation and Cauchy mutation operations are integrated to further enhance the algorithm's global exploration ability and local exploitation efficiency, accelerating the convergence process and improving the quality of the solutions. This study systematically validates the performance of the MSSA through multi-dimensional experiments. The MSSA demonstrates excellent optimization performance on 23 benchmark test functions and the CEC2019 standard test function set. Its application to three practical engineering problems, namely the design of welded beams, speed reducers, and cantilever beams, successfully verifies the effectiveness of the algorithm in real-world scenarios. By comparing it with deterministic algorithms such as DIRECT and BIRMIN on the five-dimensional test functions generated by the GKLS generator, the global optimization ability of the MSSA is thoroughly evaluated. In addition, the successful application of the MSSA to the problem of robot path planning further highlights its application advantages in complex practical scenarios. Experimental results show that, compared with the original SSA, the MSSA achieves significant improvements in terms of convergence speed, optimization accuracy, and robustness, providing new ideas and methods for the research and practical application of swarm intelligence optimization algorithms.

1. Introduction

With the continuous advancement of science and technology, traditional optimization algorithms, such as Newton’s method [1] and gradient descent [2], are increasingly revealing their limitations when facing large-scale, high-dimensional, and nonlinear problems. These methods typically rely on specific assumptions about the problem, making them less adaptable to dynamic optimization scenarios. In contrast, swarm intelligence optimization algorithms [3], due to their simple structure, ease of implementation, and outstanding efficiency and adaptability in solving complex problems, have gradually become essential tools in both research and application. These algorithms simulate the collective behavior of biological populations in nature, relying on cooperation and competition among individuals to search for optimal solutions. They enable effective global search in dynamic and complex environments. Compared to traditional optimization methods, swarm intelligence optimization algorithms offer greater robustness, avoiding the problem of local optima more effectively, thereby enhancing both global search capability and precision.
In the research process of optimization algorithms, many scholars have proposed a series of distinctive optimization methods for different problems. To address the nonlinear optimization challenge in passive positioning using time-frequency differences of moving dual stations, Zhang et al. [4] proposed a hybrid positioning algorithm combining the Cuckoo Search (CS) algorithm and the Newton method. This approach uses the global search results of the Cuckoo Search algorithm as the initial values for the Newton method and iteratively solves for the target position through the Newton method. It effectively overcomes the drawbacks of the slow convergence of the Cuckoo Search algorithm and the sensitivity of the Newton method to the selection of initial values. In the field of optimization algorithms, the projected reflected gradient method is a core approach for solving variational inequalities. The inertial extrapolated projected reflected gradient method proposed in reference [5] optimizes the iterative process by introducing an inertial mechanism, effectively adjusting the search strategy. Experiments show that this method outperforms traditional algorithms in both convergence speed and solution accuracy, providing a new solution for complex optimization problems. Sakovich et al. [6] proposed the new MAMGD gradient optimization method, which utilizes exponential decay, adaptive learning rates, and the discrete second-order derivative of the gradient. Its effectiveness has been verified in scenarios such as the minimization of multivariate real-valued functions and the function approximation of multi-layer neural networks. Experiments indicate that this method features a fast convergence speed, strong stability against fluctuations, and excellent gradient accumulation effects. Yan et al. [7] proposed an optimized Moving Target Detection (MTD) algorithm for dynamic target detection based on gradient descent with sampling point weights. The experimental results show that, compared with traditional methods, this algorithm can more accurately detect the speed of dynamic targets even with a small number of sampling points, significantly improving the accuracy of MTD detection. Ye et al. [8] proposed a multi-objective fuzzy optimization scheduling method for regional power grids based on the distributed Newton method. By iteratively finding the minimum value of the objective function within a given region and transforming multiple objective functions into a single objective function, the computational load of multi-objective scheduling of power grids is greatly reduced. Comparative experiments show that this method significantly improves scheduling efficiency and can effectively meet the economic requirements of power grid operation.
The family of swarm intelligence algorithms is rich and diverse. Common swarm intelligence algorithms include Particle Swarm Optimization (PSO, 1995 [9]), Shuffled Frog Leaping Algorithm (SFLA, 2003 [10]), Artificial Bee Colony Algorithm (ABC, 2005 [11]), Grey Wolf Optimization (GWO, 2014 [12]), Sine Cosine Algorithm (SCA, 2016 [13]), Whale Optimization Algorithm (WOA, 2016 [14]), Harris Hawks Optimization (HHO, 2019 [15]), Chimp Optimization Algorithm (Chimp, 2020 [16]), Sparrow Search Algorithm (SSA, 2020 [17]), Dung Beetle Optimizer (DBO, 2022 [18]), Snow Ablation Optimizer (SAO, 2023 [19]), Chinese Pangolin Optimizer (CPO, 2025 [20]), and Mirage Search Optimization (MSO, 2025 [21]). These algorithms are inspired by the collective behaviors of various natural organisms, such as the foraging of birds, the food-seeking behavior of bees, and the hunting strategies of whales. They employ a hybrid search strategy that combines local exploration and global exploitation. These algorithms have found widespread applications in diverse fields.
In successive iterations, population diversity in population-based optimization algorithms often decreases, leading to premature convergence to local optima. To address this, various advanced learning strategies have been proposed. Elsisi et al. [22] introduced an Improved Grey Wolf Optimization (IGWO) algorithm that enhances the exploration–exploitation balance by incorporating a novel learning mechanism and a Fitness–Distance Balancing (FDB) technique, improving global search and reducing the risk of local optima without additional parameters. Chen et al. [23] enhanced the Whale Optimization Algorithm (WOA) by adding a nonlinear convergence factor to better balance exploration and exploitation, and also introduced an adaptive weighting strategy and a stochastic differential change strategy to speed up convergence and prevent premature trapping in local optima, with successful application to signal denoising. Liu et al. [24] improved the Chimpanzee Optimization Algorithm (ChOA) with adaptive inertia weights and a flight strategy, and later developed the IChOA-KFC algorithm for RGB-D image segmentation, demonstrating significant performance gains. Zhang et al. [26] proposed an Improved Sine Cosine Algorithm (ISCA) to enhance population diversity and balance exploration and exploitation through uniform initialization and nonlinear strategies, successfully optimizing BiLSTM model parameters to improve prediction accuracy. Javaheri et al. [25] developed a discrete and dyadic-based version of the Harris Hawks Optimization (HHO) algorithm, incorporating eight dyadic learning strategies to improve convergence rates, and applied it effectively to job scheduling for computational providers, showing notable improvements in optimization efficiency and effectiveness.
The Sparrow Search Algorithm (SSA) was proposed by Xue and Shen from Donghua University, inspired by the natural patterns of sparrow foraging and defense behaviors. The algorithm exhibits fast convergence speed, high precision, and strong robustness, demonstrating excellent performance in solving complex optimization problems. However, when handling certain complex problems, SSA may still fall into local optima, and its performance is highly sensitive to parameter selection. The overly rapid convergence speed can sometimes lead to a trade-off between accuracy and efficiency.
Recent advancements in the Sparrow Search Algorithm (SSA) have led to its widespread application across various fields, including UAV trajectory planning, multi-threshold image segmentation, wireless sensor network coverage, job shop scheduling, and price trend prediction [27,28,29]. However, SSA faces challenges in the later stages of iteration due to its direct jump-based update mechanism. This can result in a decline in population diversity and reduced search efficiency. As a result, the algorithm tends to converge slowly and often gets trapped in local optima. To address these issues, researchers worldwide have proposed several enhanced versions of SSA. For instance, Zhou et al. [30] proposed a multi-strategy improved SSA (RSSA), which integrates coarse-to-fine data reasoning and introduces low-discrepancy sequences for population initialization, thereby enhancing the global search capability of the algorithm. Another study [31] introduced an improved SSA based on adaptive t-distribution and golden sine functions. This method uses an adaptive t-distribution mutation to perturb individual positions, helping the algorithm escape from local optima. A further enhancement, the Adaptive Spiral Flight Improved Sparrow Search Algorithm (ASFSSA) [32], incorporates self-adaptive spiral flight patterns to boost the algorithm's search performance. Additionally, an SSA algorithm based on Sobol sequences was proposed [33], combining longitudinal and lateral crossover strategies to improve search efficiency. Another variant, the Chaos Sparrow Search Algorithm (CSSA) [34], was applied to construct an adaptive network model (CSSA-SCN) for addressing challenges in large-scale data regression and classification. Furthermore, an Adaptive Sparrow Search Algorithm (ASSA) [35] was employed for optimal model parameter identification in proton exchange membrane fuel cells (PEMFCs), demonstrating its effectiveness in PEMFC stack optimization in several case studies. Finally, the improved SSA (CASSA) [36] was designed specifically for 3D path planning of UAVs, with a focus on optimizing the path while minimizing collision risks. Enhanced SSA algorithms have also been integrated into frameworks for short-term power load forecasting [37], where they help optimize the parameters of gated recurrent unit neural networks to improve forecasting accuracy.
According to the “No Free Lunch” (NFL) theorem [38], the most effective optimization strategy is problem-dependent, and no single algorithm can consistently outperform others across all problem landscapes. Therefore, optimization strategies must be adapted and improved according to the specific characteristics of the problem at hand. While several modified versions of the Sparrow Search Algorithm (SSA) have been proposed—such as those that adjust the algorithm’s structure or incorporate new operations to enhance global search capability—these approaches still exhibit certain limitations when faced with more complex or dynamic optimization problems. Consequently, there is an urgent need to further optimize the SSA to overcome the shortcomings of existing modifications and improve its performance across a wider range of complex problems. This paper proposes a modified version of the Sparrow Search Algorithm (MSSA), featuring three critical improvements. First, Latin Hypercube Sampling (LHS) [39] is utilized to augment population diversity, thereby enhancing the global search capability and accelerating convergence toward the global optimum. Second, an adaptive weighting mechanism is incorporated in the position update process for the producers, ensuring balanced exploration in the early iterations and reducing the likelihood of premature convergence in later stages. Finally, the algorithm is further strengthened by integrating Cauchy mutation and Cat perturbation techniques, which effectively disrupt the search process and enable the algorithm to escape local optima.
The primary contributions of this paper are briefly summarized as follows:
(1) A new optimization algorithm, MSSA, is proposed. It enhances population diversity with Latin Hypercube Sampling (LHS) during initialization, improves search efficiency through an adaptive weighting mechanism in the producer (discoverer) update phase, and strengthens global search with Cauchy mutation and Cat perturbation strategies.
(2) Beyond the tests conducted on 23 benchmark functions, the CEC2019 test functions, and three engineering optimization problems, the MSSA was also compared with the deterministic algorithms DIRECT and BIRMIN on 100 five-dimensional GKLS test functions to verify its global optimization ability.
(3) The algorithm's effectiveness is verified through statistical analysis of the mean and standard deviation. Wilcoxon's rank-sum test at the 0.05 significance level shows a significant difference between MSSA and the comparison algorithms.
(4) The modified MSSA is applied to a 20 × 20 robot path planning problem, validating its performance in dynamic obstacle avoidance and path optimization and providing strong algorithmic support for practical applications.
The structure of this paper is arranged as follows: Section 2 reviews the principles and development history of the basic Sparrow Search Algorithm; Section 3 elaborates on the implementation process of the Modified Sparrow Search Algorithm (MSSA) in detail. In Section 4, the optimization performance of the MSSA is compared with that of other swarm intelligence algorithms on 23 benchmark test functions and CEC2019 test functions. Its global optimization ability, convergence speed, and algorithm stability are evaluated, and the statistical significance is assessed through the Wilcoxon test. Meanwhile, the MSSA is compared with deterministic algorithms on 100 five-dimensional GKLS test functions to verify its global optimization ability. Section 5 validates the effectiveness of the MSSA in handling complex constraints through three engineering design problems, highlighting its advantages. Section 6 demonstrates the effectiveness of the MSSA in industrial applications by taking the robot path planning problem as an example. Finally, Section 7 summarizes the main research content and proposes potential directions for future research.

2. Sparrow Search Algorithm (SSA)

In the SSA algorithm, the sparrow population is divided into three types: producers, scroungers, and scouters. Producers, with higher energy, lead the population in exploring food sources; scroungers follow producers and compete for resources; scouters stay alert and guide the population to safety upon detecting threats. The SSA generates an initial population of N sparrow individuals in a D-dimensional space through random initialization.

2.1. Producer Position Updates Phase

During each iteration, the position update formula for the producers is described by Equation (1):
$$x_{i,j}^{t+1}=\begin{cases}x_{i,j}^{t}\cdot\exp\left(-\dfrac{i}{\alpha T}\right), & R_2<ST\\[2mm] x_{i,j}^{t}+Q\cdot L, & R_2\ge ST\end{cases}\tag{1}$$
Here, $t$ represents the current iteration number and $j=1,2,3,\ldots,D$. $T$ is a constant representing the maximum number of iterations. $x_{i,j}^{t+1}$ denotes the position of the $i$-th sparrow in the $j$-th dimension, and $\alpha\in(0,1]$ is a random number. $R_2$ ($R_2\in[0,1]$) and $ST$ ($ST\in[0.5,1]$) denote the warning value and the safety threshold, respectively. $Q$ is a random number following a normal distribution, and $L$ is a $1\times D$ matrix with all elements equal to 1.
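For concreteness, a minimal NumPy sketch of this producer update follows; the vectorized form, the function and parameter names, and the per-iteration redraw of $R_2$ are our own assumptions rather than the authors' reference implementation.

import numpy as np

def producer_update(X, T, ST=0.8, rng=np.random.default_rng(0)):
    # X: (N, D) producer positions; T: maximum iteration count (Eq. (1))
    N, D = X.shape
    X_new = X.copy()
    R2 = rng.random()                      # warning value in [0, 1]
    for i in range(N):
        if R2 < ST:                        # safe: contract the position, wide search
            alpha = rng.uniform(1e-12, 1)  # random alpha in (0, 1]
            X_new[i] = X[i] * np.exp(-(i + 1) / (alpha * T))
        else:                              # danger detected: jump with a normal step
            Q = rng.standard_normal()
            X_new[i] = X[i] + Q * np.ones(D)   # L is a 1 x D all-ones vector
    return X_new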

2.2. Scrounger Position Updates Phase

The position update formula for the scroungers is described by Equation (2):
$$x_{i,j}^{t+1}=\begin{cases}Q\cdot\exp\left(\dfrac{x_{worst}^{t}-x_{i,j}^{t}}{i^{2}}\right), & i>n/2\\[2mm] x_{p}^{t+1}+\left|x_{i,j}^{t}-x_{p}^{t+1}\right|\cdot A^{+}\cdot L, & \text{otherwise}\end{cases}\tag{2}$$
where $x_{p}^{t+1}$ is the position with the best food resource in the $(t+1)$-th iteration, $x_{worst}^{t}$ is the position with the worst resource, $A$ is a $1\times D$ matrix whose elements are randomly assigned the value 1 or −1, and $A^{+}=A^{T}(AA^{T})^{-1}$.
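A corresponding sketch of the scrounger update is given below; note that since $A$ is $1\times D$, $AA^{T}=D$ and $A^{+}$ reduces to $A^{T}/D$. The assumption that sparrows are indexed in ascending fitness order is ours.

import numpy as np

def scrounger_update(X, x_p, x_worst, rng=np.random.default_rng(0)):
    # X: (N, D) positions sorted by fitness (worst last);
    # x_p: best producer position; x_worst: worst position (Eq. (2))
    N, D = X.shape
    X_new = X.copy()
    for i in range(N):
        if i + 1 > N / 2:                  # the hungrier half flies off to forage elsewhere
            Q = rng.standard_normal()
            X_new[i] = Q * np.exp((x_worst - X[i]) / (i + 1) ** 2)
        else:                              # compete for food around the best producer
            A = rng.choice([-1.0, 1.0], size=D)
            A_plus = A / (A @ A)           # A^+ = A^T (A A^T)^{-1}; here A A^T = D
            step = np.abs(X[i] - x_p) @ A_plus   # scalar step toward the producer
            X_new[i] = x_p + step * np.ones(D)
    return X_new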

2.3. Scouter Position Updates Phase

The position update formula for the scouters is described by Equation (3):
$$x_{i,j}^{t+1}=\begin{cases}x_{best}^{t}+\beta\cdot\left|x_{i,j}^{t}-x_{best}^{t}\right|, & f_i>f_g\\[2mm] x_{i,j}^{t}+K\cdot\left(\dfrac{\left|x_{i,j}^{t}-x_{worst}^{t}\right|}{(f_i-f_w)+\varepsilon}\right), & \text{otherwise}\end{cases}\tag{3}$$
where $x_{best}^{t}$ is the current global best position, $\beta$ is a step-size control parameter obeying the standard normal distribution, and $K$ ($K\in[-1,1]$) is a random number. $f_i$ is the fitness value of the current sparrow individual, $f_g$ and $f_w$ are the best and worst fitness values in the current iteration, respectively, and $\varepsilon$ is a small constant that avoids division by zero.
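A minimal sketch of the scouter update for a single sparrow follows; the default value of $\varepsilon$ and the function name are our assumptions.

import numpy as np

def scouter_update(x, x_best, x_worst, f_i, f_g, f_w, eps=1e-50,
                   rng=np.random.default_rng(0)):
    # One scouter position per Eq. (3); eps guards the denominator
    if f_i > f_g:                          # at the edge of the flock: move toward the best
        beta = rng.standard_normal()
        return x_best + beta * np.abs(x - x_best)
    # otherwise: already near the best spot, wander away from the worst to evade danger
    K = rng.uniform(-1.0, 1.0)
    return x + K * (np.abs(x - x_worst) / ((f_i - f_w) + eps))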

3. A Modified Sparrow Search Algorithm

Although the SSA offers advantages such as ease of implementation, fast convergence speed, and strong robustness, it still faces challenges such as a tendency to get trapped in local optima and insufficient convergence accuracy. In order to overcome these shortcomings, this paper proposes a modified sparrow search algorithm (MSSA). The improvement strategy of the MSSA focuses on three main areas. Firstly, LHS is employed to enhance population diversity; this approach helps prevent premature convergence to local optima and improves the algorithm's ability to escape local traps, thereby strengthening its global search performance. Secondly, an adaptive weighting mechanism is incorporated into the position update process of the producers. As the algorithm progresses, the exploration weight is gradually reduced, shifting the focus towards a more refined exploitation of the information gathered. This adaptive strategy mitigates the risk of premature convergence, enabling the algorithm to effectively exploit promising regions of the search space in the later stages of the optimization process. Finally, the algorithm is further enhanced by integrating Cauchy mutation and Cat perturbation techniques. Cauchy mutation introduces heavy-tailed perturbations to the search process, allowing the algorithm to escape local optima by exploring more diverse regions of the search space. Additionally, the Cat perturbation simulates random walk behaviors, enabling large-scale search disruptions that help avoid stagnation in suboptimal solutions. The detailed improvement strategy is summarized as follows.

3.1. Latin Hypercube Sampling

Latin Hypercube Sampling (LHS), proposed by McKay et al., is a multi-dimensional stratified sampling technique that efficiently samples within the distribution intervals of variables. The basic principle involves dividing the interval [0,1] into N equal subintervals and performing independent random sampling with equal probability within each subinterval, ensuring that the samples are evenly distributed across the entire interval.
The initial population is generated in a D -dimensional space with a population size of N . By incorporating Latin Hypercube Sampling (LHS), the population initialization strategy for MSSA is developed. The specific process is as follows:
(1) First, determine the population size N and the dimensionality D of the population.
(2) Define the range of each variable x as [lb, ub], where lb and ub are the lower and upper bounds, respectively.
(3) Divide the range [lb, ub] of each variable x into N equal intervals. The width of each sub-interval is Δx = (ub − lb)/N.
(4) Randomly select a point from each interval in every dimension; a random number generator on [0, 1] can be used within each sub-interval.
(5) Combine the points selected in all dimensions to form the initial population. After sampling one point in each sub-interval of all D dimensions, an individual in the population is formed. Repeat this process N times to obtain the initial population of the MSSA algorithm (a minimal code sketch is given below).
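The following NumPy sketch implements steps (1)–(5); shuffling the strata independently per dimension is the standard LHS construction, and all function and variable names here are our own.

import numpy as np

def lhs_init(N, D, lb, ub, seed=0):
    # Latin hypercube initialization following steps (1)-(5) above
    rng = np.random.default_rng(seed)
    pop = np.empty((N, D))
    for d in range(D):
        # one uniform point inside each of the N equal strata of [0, 1]
        pts = (np.arange(N) + rng.random(N)) / N
        rng.shuffle(pts)                   # decouple stratum pairings across dimensions
        pop[:, d] = lb[d] + pts * (ub[d] - lb[d])
    return pop

pop = lhs_init(40, 5, lb=np.zeros(5), ub=10 * np.ones(5))
print(pop.shape)                           # (40, 5): one sample per stratum per axis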
Table 1 summarizes four common sampling methods along with their characteristics, advantages, and disadvantages, and compares LHS with these methods. By drawing 40 and 100 points in the interval [0, 1], the scatter plots in Figure 1 and Figure 2 clearly demonstrate the significant advantages of the LHS sampling method, which effectively distributes the samples uniformly across the entire interval.

3.2. Adaptive Weighting Mechanism

The adaptive mechanism dynamically adjusts the algorithm's position based on the current search state, improving its adaptability and robustness. It guides the algorithm towards the optimal solution, speeding up convergence and helping it escape local optima. In the producer position update stage of the sparrow population, when $R_2 < ST$, the original update rule causes population diversity to decrease sharply as the number of iterations increases. To address this issue, an adaptive mechanism is introduced to ensure the algorithm continues broad searching in later iterations. The update strategy is as follows:
$$X_{i,d}^{t+1}=X_{i,d}^{t}\cdot w_1\tag{4}$$
$$w_1=l_1\exp\left(-\frac{r\,i}{\alpha T}\right)+\frac{2l_2}{\pi}\left|\arctan\left(1+\frac{r\,i}{2\alpha T}\right)\right|\tag{5}$$
where $l_1\in(0,1)$ with $l_1+l_2=1$, $r$ is a constant greater than 1, $\alpha$ is a random number within (0, 1), and $w_1$ is the adaptive weight. Here, $l_1=0.4$ and $r=2$.
The weight parameter $w_1$ is designed to dynamically balance exploration and exploitation through two components: (1) an exponential decay term $l_1\exp(-ri/(\alpha T))$, whose decay reduces reliance on global exploration as the iterations $i$ progress, and (2) a compensatory arctangent term $\frac{2l_2}{\pi}\left|\arctan\left(1+\frac{ri}{2\alpha T}\right)\right|$, scaled to the interval $(0,l_2)$, which grows monotonically with $i$ to sustain late-stage exploration while permitting localized exploitation. This dual-term structure ensures progressive focus refinement without premature convergence, which is critical for navigating complex search spaces.
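As a quick numerical check of this behavior, the sketch below evaluates both weights; fixing $\alpha=0.5$ is our simplification for reproducibility, whereas the algorithm redraws $\alpha$ randomly at each update.

import numpy as np

def w1(i, T, l1=0.4, r=2.0, alpha=0.5):
    # Adaptive weight of Eq. (5); l2 = 1 - l1; alpha fixed here for a smooth curve
    l2 = 1.0 - l1
    decay = l1 * np.exp(-r * i / (alpha * T))
    sustain = (2.0 * l2 / np.pi) * abs(np.arctan(1.0 + r * i / (2.0 * alpha * T)))
    return decay + sustain

T = 1000
for i in (1, T // 2, T):
    w_old = np.exp(-i / (0.5 * T))         # original SSA weight exp(-i/(alpha*T))
    print(f"iter {i:4d}: original w = {w_old:.4f}, adaptive w1 = {w1(i, T):.4f}")

Unlike the original weight, which collapses toward zero, $w_1$ retains a substantial value in late iterations, preserving search breadth.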
The maximum number of iterations is set to 1000 in order to verify the effectiveness of the proposed adaptive weight $w_1$. As shown in Figure 3, in the case of $R_2 < ST$, the raw SSA weight $w=\exp(-i/(\alpha T))$ decreases rapidly to about 0.3 as the number of iterations increases. This indicates a sharp decline in the diversity of the original SSA population in later iterations, hindering the balance between global exploration and local search. The improved $w_1$ resolves this problem well.

3.3. Cauchy Mutation and Cat Disturbance Strategy

To solve the problem that the sparrow search algorithm easily falls into local optima, this paper adopts the Cauchy mutation [40] and the Cat perturbation strategy [41] to increase population diversity and thereby improve the global search ability of the algorithm. The Cauchy mutation exploits the long-tailed property of the Cauchy distribution to introduce larger perturbations around the currently mutated sparrow individual, which helps it jump out of locally optimal solutions. The Cat map is a two-dimensional invertible chaotic map with good traversal uniformity; the chaotic sequences it generates are uniformly distributed in [0, 1], which effectively improves population diversity and global search ability.
After each iteration, the average fitness $f_m$ of the sparrow population is calculated by Equation (6). If $f_i < f_m$, the Cauchy mutation is used to introduce the perturbation, see Equation (7); otherwise, the Cat perturbation is used, see Equation (8). In the Cauchy mutation strategy, new solutions are generated by applying the Cauchy mutation operator at the location of the optimal solution, while the Cat perturbation exploits its chaotic properties to obtain new population solutions. This design controls the triggering conditions of the perturbation strategies more effectively and flexibly adjusts the perturbation according to the performance of the population, thus improving the global search capability of the algorithm.
$$f_m=\frac{1}{N}\sum_{i=1}^{N}f_i\tag{6}$$
$$x_{newbest}=x_{best}+x_{best}\times Cauchy(0,1)\tag{7}$$
$$\begin{pmatrix}x_{N+1}\\ y_{N+1}\end{pmatrix}=\begin{pmatrix}1 & 1\\ 1 & 2\end{pmatrix}\begin{pmatrix}x_{N}\\ y_{N}\end{pmatrix}\bmod 1=C\begin{pmatrix}x_{N}\\ y_{N}\end{pmatrix}\bmod 1\tag{8}$$
$Cauchy(0,1)$ in Equation (7) is the standard Cauchy distribution. Equation (8) is the two-dimensional cat map, where $x\bmod 1=x-\lfloor x\rfloor$ and $C=\begin{pmatrix}1 & 1\\ 1 & 2\end{pmatrix}$.
For the standard Cauchy distribution $Cauchy(0,1)$, the probability density function is $f(x)=1/\left(\pi(1+x^{2})\right)$. When generating a new solution using Equation (7), the long-tail property of the Cauchy distribution means a Cauchy random variable can take values far from zero with non-negligible probability. For example, if $x_{best}=5$ and a Cauchy random variable $z=3$ is sampled from $Cauchy(0,1)$, then $x_{newbest}=5+5\times 3=20$. Such a large perturbation can move the solution to a new region of the search space, enabling the algorithm to escape from local optimal solutions.
Regarding the Cat perturbation, assume that it starts from the initial point $(x_0,y_0)^{T}=(0.2,0.3)^{T}$. Using Equation (8), $(x_1,y_1)^{T}=(0.2+0.3,\;0.2+2\times 0.3)^{T}=(0.5,0.8)^{T}$, which the mod 1 operation leaves unchanged. Iterating once more, $(x_2,y_2)^{T}=(0.5+0.8,\;0.5+2\times 0.8)^{T}=(1.3,2.1)^{T}$, and after the mod 1 operation $(x_2,y_2)^{T}=(0.3,0.1)^{T}$. This process generates a chaotic sequence of points, which can be used to perturb the population and increase its diversity.
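Both operators take only a few lines of NumPy; the sketch below reproduces the worked example above, with function names of our own choosing.

import numpy as np

def cauchy_mutation(x_best, rng=np.random.default_rng(1)):
    # Eq. (7): perturb the best solution with a standard Cauchy variate
    z = rng.standard_cauchy()
    return x_best + x_best * z

def cat_map(point, steps=1):
    # Eq. (8): iterate the 2-D cat map from point = (x, y) in [0, 1)^2
    C = np.array([[1.0, 1.0], [1.0, 2.0]])
    p = np.asarray(point, dtype=float)
    for _ in range(steps):
        p = (C @ p) % 1.0
    return p

print(cauchy_mutation(np.array([5.0, -2.0])))
print(cat_map([0.2, 0.3], steps=2))        # -> [0.3, 0.1], matching the example above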
The pseudo-code for the MSSA is given in Algorithm 1, and the overall flowchart of the algorithm is given in Figure 4.
Algorithm 1: Pseudo-code of MSSA
Input: N, D, T, PD, SD, R2, ST
Initialize the population individuals x_i (i = 1, 2, …, N), generated by Latin hypercube sampling (LHS) within the D-dimensional problem space.
Output: x_best, f_g
1: while t < T do
2:   Rank the fitness values, and identify the current best individual f_g and worst individual f_w
3:   for i = 1 : PD
4:     Use Equation (1) to update the sparrow producers' positions; when R2 < ST, replace the first branch of Equation (1) with Equations (4) and (5)
5:   end for
6:   for i = (PD + 1) : N
7:     Use Equation (2) to update the sparrow scroungers' positions
8:   end for
9:   for i = 1 : SD
10:    Use Equation (3) to update the sparrow scouters' positions
11:  end for
12:  Calculate the average fitness f_m of the population by Equation (6)
13:  if f_i < f_m
14:    Perturb the sparrow population with the Cauchy mutation of Equation (7), applied where the fitness of an individual sparrow falls below the average fitness
15:  else
16:    Perturb the sparrow population with the Cat perturbation strategy of Equation (8)
17:  end if
18:  Perform boundary checks and adjustments
19:  Obtain the current new location; if it is superior to the previous one, update it
20:  t = t + 1
21: end while
22: return x_best, f_g

3.4. Complexity Analysis

In this subsection, the selection of time complexity and space complexity as evaluation metrics offers a comprehensive assessment of an algorithm’s efficiency and feasibility. Time complexity influences the execution speed, particularly when handling large-scale datasets, while space complexity quantifies memory usage, which affects performance in memory-constrained environments. Evaluating these two dimensions ensures the operability and scalability of MSSA under varying conditions.
The space complexity of the algorithm is expressed in Big O notation. For the standard SSA, the space complexity is $O(N\times D)$, where $N$ is the population size and $D$ is the dimension. According to the flowchart and pseudo-code of MSSA, the introduction of an adaptive inertia coefficient adds only a constant scalar, which does not significantly affect the space complexity and can be considered negligible. Additionally, MSSA requires the storage of each individual's historical best solution, which results in a space requirement of $O(N\times D)$. Furthermore, MSSA must store the global best solution, contributing a space complexity of $O(D)$. Overall, the total space complexity of MSSA remains $O(N\times D)$, the same as that of the SSA algorithm. Thus, although MSSA incorporates performance improvements, its space overhead remains equivalent to that of SSA, maintaining the same level of space efficiency.
The time complexity of the algorithm is likewise expressed in Big O notation. In SSA, the initialization phase generates a set of prospective solutions intended for subsequent exploration and optimization. This process encompasses the generation of initial solutions, the determination of parameter settings, and other essential operations, with a computational complexity of $O(N)$. Fitness evaluation is then required to thoroughly appraise the efficacy and quality of candidate solutions. Overall, the time complexity of the SSA algorithm can be represented as $O(T\times N)+O(T\times N\times D)$, where $T$ is the maximum number of iterations. The cost of the update step is determined by the neighborhood search cost and the employed update strategy. Consequently, the runtime complexity of the SSA is $O(N\times(T+T\times D+1))$. In MSSA, the runtime complexity remains unaltered at $O(N\times(T+T\times D+1))$.

4. Performance Testing of Functions

To evaluate the effectiveness of the proposed Modified Sparrow Search Algorithm (MSSA), a performance test was conducted using 23 benchmark functions [42] from multiple dimensions, including global search ability, local exploitation ability, and optimization-seeking capability. Specifically, unimodal functions F1–F7, each with a single global extremum, were employed to examine the local exploitation ability of MSSA. Multimodal functions F8–F13, which possess multiple local extrema, were utilized to test the algorithm’s robustness in handling multiple optimal solutions. Additionally, fixed-dimensional multimodal functions F14–F23 were applied to assess the performance of MSSA in low-dimensional spaces. Furthermore, 10 CEC2019 benchmark functions [43] were introduced to further validate the algorithm’s performance.
The experiments were carried out on a computer equipped with a Windows 10 64-bit operating system, an Intel Core i5 processor (2.60 GHz), and 8 GB of RAM, using MATLAB R2022b for simulation. The population size was set to 40, and the maximum number of iterations was configured as 500. The parameter settings of MSSA were consistent with those of the original Sparrow Search Algorithm (SSA), while the other compared algorithms adopted their default parameter configurations. The parameter settings are shown in Table 2. The convergence accuracy, stability, convergence speed, and overall performance advantages of the algorithms were comprehensively evaluated by calculating the mean and standard deviation based on 20 independent runs. To objectively evaluate the performance differences between the MSSA and other comparative algorithms, we conducted a statistical analysis based on the rank-sum test at a significance level of α = 0.05 . For each test task, we computed the rank sums of MSSA and each competing algorithm. At the 0.05 significance level, we compared the calculated rank sums against standard critical values. If the obtained rank sum fell outside the critical range, we rejected the null hypothesis, concluding that the performance difference between MSSA and the competing algorithm was statistically significant. Otherwise, we failed to reject the null hypothesis, indicating no significant performance difference. As a non-parametric test, the rank-sum test does not assume any specific data distribution, making it robust for analyzing diverse performance metrics in complex testing scenarios.
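As an illustration of this testing procedure, a minimal SciPy sketch follows; the two samples are synthetic stand-ins for 20-run fitness results, not the paper's data.

import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Hypothetical best-fitness values from 20 independent runs of two algorithms
mssa_runs = rng.normal(1e-8, 1e-9, size=20)
ssa_runs = rng.normal(1e-3, 5e-4, size=20)

stat, p = ranksums(mssa_runs, ssa_runs)    # two-sided Wilcoxon rank-sum test
if p < 0.05:
    print(f"p = {p:.3g}: reject the null hypothesis, the difference is significant")
else:
    print(f"p = {p:.3g}: fail to reject the null hypothesis")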

4.1. Performance Testing on 23 Benchmark Functions

In this subsection, the performance of MSSA and other algorithms is evaluated using 23 benchmark functions, each with 30 dimensions (though they can also be configured with 2, 10, 50, or 100 dimensions). The comparison focuses on their exploitation ability, exploration ability, and convergence capability. For each test function, the MSSA and other algorithms were independently run 20 times to identify the global optimal solution. The mean and standard deviation of the results from these 20 trials were then computed, with the mean representing the algorithm’s convergence accuracy and the standard deviation reflecting its stability. The experimental results are shown in Table 3. The comparison of the convergence curves of MSSA and other algorithms across 23 benchmark functions is shown in Figure 5.
The experimental results presented in Table 3 clearly demonstrate that the improved MSSA algorithm performs excellently on the benchmark functions F1 through F7. Not only does it achieve optimal average values, but it also exhibits remarkable stability in terms of variance, indicating its strong robustness. For the functions in the range of F8 to F13, although MSSA performs slightly worse than the HHO and WOA algorithms on F8, its optimization results on the other functions remain superior to those of the comparison algorithms, highlighting its outstanding performance in most cases. In the more complex function set from F14 to F23, MSSA continues to exhibit strong competitiveness, with optimization results for most benchmark functions being highly satisfactory. However, it falls short of reaching the optimal solution for functions F15, F19, and F20, showing a slight performance degradation in these specific cases. Additionally, the Wilcoxon rank-sum test further verifies the statistical significance of the MSSA's advantage in optimization performance. In the Wilcoxon rank-sum test, a significance level of 0.05 is conventionally selected as it represents a 5% probability of erroneously rejecting the null hypothesis (i.e., a Type I error) when the null hypothesis is actually true. This threshold provides a balance between the test's sensitivity and specificity, thereby reducing the risk of overlooking meaningful differences due to an excessively stringent significance level. When the p-value is close to 0.05, the results should be interpreted with caution, as they lie on the margin of statistical significance. In such cases, it is advisable to further validate the findings through studies with larger sample sizes to enhance the robustness and reliability of the conclusions. The results indicate significant differences between the MSSA and the comparison algorithms, with MSSA outperforming them in terms of statistical significance. As shown in Figure 5, the MSSA possesses a significant convergence speed advantage, rapidly approaching the global optimum, demonstrating both its efficiency and practicality. While there are areas that could be further refined, it remains a viable and efficient optimization algorithm overall.
To further evaluate the performance of the improved MSSA, we selected the F1–F13 benchmark functions and conducted experiments in higher-dimensional spaces ( D = 50 and D = 100 ). The MSSA was compared with the original SSA and the ASFSSA, with the parameter settings consistent with those described earlier. As shown in Table 4 and Figure 6, MSSA achieves the optimal solution for the majority of test functions. Moreover, it exhibits a significantly faster convergence rate compared to both the original SSA and ASFSSA, which effectively validates the effectiveness and superiority of the proposed improvements in MSSA.
To further assess the performance of MSSA, its discoverer proportion was adjusted from 0.5 to 0.8. The adjusted MSSA was then compared with the Chinese Pangolin Optimizer (CPO) and Mirage Search Optimization (MSO) algorithms. Using F1–F13 benchmark functions in a 30-dimensional space, this study aimed to verify the performance enhancement of MSSA under alternative parameter configurations and its competitiveness against recent algorithms. As presented in Table 5 and Figure 7, MSSA outperforms MSO and CPO across various settings, validating its stability and effectiveness.

4.2. Performance Testing on CEC2019

The CEC2019 test set consists of 10 test functions, denoted F1 to F10, as detailed in Table 6. Due to the stochastic nature of the algorithms, all conclusions are derived from the results of 20 independent runs, which include the average values, standard deviations, and Friedman test statistics for each iteration. MSSA was compared with five other algorithms on the CEC2019 test set to further validate the effectiveness of its improvements. The parameter settings are provided in Table 2. The experimental results are provided in Table 6 (for the CEC2019 test results, the evaluation is based on values rounded to four decimal places). The comparison of convergence curves of MSSA and the other algorithms is provided in Figure 8.
From Table 6 and Figure 8, it is evident that MSSA exhibits a faster convergence rate. Except for its slightly lower performance on F3 compared to SAO, MSSA outperforms all other algorithms on the remaining functions. The average rank results from the Friedman test further confirm that MSSA achieves the best overall performance. These findings highlight the superior effectiveness of MSSA on the CEC2019 test functions.

4.3. Performance Testing on GKLS Functions

When evaluating the global optimization ability of algorithms, deterministic algorithms and heuristic algorithms belong to different technical categories [44]. Deterministic algorithms are based on rigorous mathematical theories and rule systems. Algorithms such as DIRECT [45] and BIRMIN [46] employ systematic search strategies, and their operation processes follow strict logic, exhibiting a high degree of predictability and certainty. Taking the DIRECT algorithm as an example, it demonstrates excellent exploration capabilities in low-dimensional spaces, but its performance significantly deteriorates in high-dimensional space scenarios. In contrast, heuristic algorithms often simulate natural phenomena or draw on human experience and adopt a random exploration approach to search for approximate optimal solutions. To comprehensively and systematically study the global optimization performance of the Modified Sparrow Search Algorithm (MSSA), this study constructs 100 five-dimensional simple test functions with the aid of the GKLS generator [47]. By constructing an operation area and connecting deterministic algorithms and heuristic algorithms through visualization means [48], an effort is made to break down the barriers between theory and practice, and thus conduct an in-depth analysis of the characteristics and effectiveness of these two types of algorithms.
In this study, the objective of the algorithm is to find a solution within the given number of iterations, where the error between this solution and the global optimal value is less than the preset tolerance ( ε = 10 4 ). Once this condition is met, the corresponding problem is considered to have been successfully solved.
In order to intuitively present the operation characteristics of the DIRECT and BIRMIN algorithms for simple and difficult five-dimensional problems, this paper plots a line chart showing the relationship between the number of function evaluations and the number of successfully solved problems. Among them, the horizontal axis represents the number of function evaluations, which is displayed at intervals of 1000 function evaluations. If there is a case of a successful solution, the range of the horizontal axis is from the minimum value to the maximum value of the number of function evaluations when the problem is successfully solved. If there is no record of a successful solution, the range of the horizontal axis is from 1000 function evaluations to the product of the maximum number of function evaluations for each problem and the number of problems (reaching this range is regarded as an unsuccessful solution). The vertical axis represents the cumulative number of successfully solved problems. For each record of successful solution, according to the corresponding number of function evaluations, the number of successfully solved problems is accumulated at the corresponding position on the horizontal axis. Finally, the cumulative number of successfully solved problems corresponding to each point is determined through cumulative summation.
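The cumulative curve described here is straightforward to compute; below is a small sketch using hypothetical per-problem evaluation counts (with np.inf marking unsolved problems), since the actual records appear in Figure 9.

import numpy as np

# Hypothetical evaluation counts at first success for 8 problems; inf = unsolved
evals_at_success = np.array([1200, 3400, np.inf, 800, 2600, 5100, np.inf, 1900])

budgets = np.arange(1000, 6001, 1000)      # x-axis: evaluation budget, step 1000
solved = [(evals_at_success <= b).sum() for b in budgets]
for b, s in zip(budgets, solved):
    print(f"{b:5d} evaluations -> {s} problems solved")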
For these 100 GKLS test functions, 20 representative test functions are selected. The MSSA algorithm sets the population size to 30 and runs independently 10 times to construct an operational region. During each run, the iteration curve, which reflects the changes in the algorithm's best solution across iteration steps, is recorded. After each run, the error between the final solution and the global optimal value is checked; if the error is less than the tolerance, the corresponding problem is deemed successfully solved in that run. In addition, during each iteration, the number of problems successfully solved by all functions at the current iteration count is also tallied. Figure 9 shows the results of the different algorithms on the GKLS test functions.
As shown in Figure 9, when the two deterministic algorithms, DIRECT and BIRMIN, solve the 100 GKLS functions, the number of successfully solved problems gradually increases as the number of function evaluations rises, with DIRECT being slightly more efficient. When the SSA solves 20 GKLS functions, the number of solved problems increases only gently. The MSSA, solving the same 20 GKLS functions, can quickly solve a larger number of problems with fewer evaluations. Evidently, the MSSA performs outstandingly in global optimization: although it is a heuristic algorithm, it performs comparably to deterministic algorithms based on rigorous theories, and even has an edge in terms of search efficiency.

5. Performance on Engineering Optimization Problems

Addressing real-world engineering problems is a primary goal in the research of swarm intelligence algorithms [46]. Although the results from benchmark test functions provide valuable insights into the performance of an algorithm, they do not fully capture its effectiveness in practical applications. To assess the performance of the improved Sparrow Search Algorithm (MSSA) in engineering tasks, this study investigates three representative engineering problems: the welded beam design problem, the gear reducer design problem, and the cantilever beam design problem. The brief descriptions and graphical references of the three engineering design problems mentioned in the article can be found in reference [47].
In order to evaluate the effectiveness and feasibility of the enhanced MSSA for solving these engineering challenges, five algorithms were selected for comparison: SSA, Chimp, HHO, DBO, and WOA. Each algorithm was independently executed 30 times. The optimal results in the table are displayed in bold italics. The evaluation metrics include the mean, variance, and convergence curves of the experimental results, aiming to derive optimal solutions for each of the engineering problems.

5.1. Welded Beam Design Problem

The objective of the welded beam design problem is to minimize manufacturing costs while ensuring safety performance. The problem involves optimizing four variables: weld seam height (h), connection beam length (L), connection beam height (t), and connection beam thickness (b). The image of the welded beam design model and the iteration convergence curve are shown in Figure 10.
From Figure 10 and Table 7, it can be observed that MSSA outperforms the other five algorithms in terms of both mean value and standard deviation, and exhibits faster convergence speed. This observation suggests that MSSA has a competitive advantage in addressing the welded beam design problem.

5.2. Speed Reducer Design Problem

Speed reducers are indispensable components in mechanical systems, serving as core elements in gearboxes and finding widespread applications across various fields. In this optimization problem, seven variables are to be considered and the objective is to minimize the weight of the gearbox while satisfying 11 constraints. The image of the speed reduction design model and the iteration convergence curve are shown in Figure 11.
From Figure 11 and Table 8, it can be observed that MSSA outperforms the other five algorithms in terms of both mean value and standard deviation, and exhibits faster convergence speed. This suggests that MSSA can provide valuable guidance for determining optimal settings of the seven variables to minimize the weight of the speed reducer.

5.3. Cantilever Beam Design Problem

The cantilever beam design problem pertains to a structural engineering scenario wherein the primary objective involves the reduction or minimization of the weight associated with the cantilever arm. The cantilever beam arm depicted in the figure is rigidly supported at one end, with vertical forces applied to the free nodes of the cantilever. The beam comprises five hollow cells, each featuring a uniform hollow cross-section with variable height (or width) dimensions, and a constant thickness (here 2/3). The image of the cantilever design model and the iteration convergence curve are shown in Figure 12.
As shown in Figure 12 and Table 9, although the MSSA's mean value and standard deviation are slightly inferior to those of DBO and SAO, it outperforms SSA and Chimp, while also demonstrating a faster convergence rate. This demonstrates the high effectiveness of utilizing MSSA in addressing the cantilever beam design problem.

6. Robot Path Planning Based on MSSA

To evaluate the practicality and feasibility of the modified algorithm, this paper selects a classic robot path planning case for in-depth study [48]. Each sparrow individual is treated as a potential feasible path, assuming there are N possible paths. The dimension of the path is determined by the number of connections between the start and end points. The environment modeling uses a 1 × 1 grid method, transforming the work environment into a plane where obstacles are represented by grid values: a grid value of 0 indicates the feasible domain, while a grid value of 1 indicates the obstacle domain. The robot therefore plans its path within the feasible domain. The dimension represents the number of columns in the grid map, and the path cost function gives the path cost of the $i$-th sparrow individual, as shown below:
$$f(x_i)=\sum_{j=1}^{D-1}\sqrt{(x_{j+1}-x_j)^{2}+(y_{j+1}-y_j)^{2}}\tag{9}$$
In Equation (9), $j$ indexes the $j$-th dimension of a sparrow individual.
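A direct NumPy transcription of Equation (9) follows; the example waypoints are hypothetical.

import numpy as np

def path_cost(xs, ys):
    # Eq. (9): Euclidean length of the polyline through the D waypoints
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    return np.sum(np.hypot(np.diff(xs), np.diff(ys)))

# Hypothetical 5-waypoint route on the grid map: three diagonal and one straight step
print(path_cost([0, 1, 2, 3, 4], [0, 1, 1, 2, 3]))   # 1 + 3*sqrt(2) ~= 5.243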

6.1. Experimental Environment Settings

To validate the practicality of the improved algorithm, it is applied to find a route on a 20 × 20 grid map and compared with GA, SSA, ASFSSA, and GWO. The population size is set to 40 and the number of iterations to 100, with the other parameters kept consistent.

6.2. Simulation Results and Analysis

The optimal route of each algorithm in the 20 × 20 model graphs is shown in Figure 13. To eliminate the influence of randomness, each algorithm is tested 10 times, and the best, average, and worst routes are recorded. Three indicators are used to evaluate the stability and feasibility of each algorithm. The optimization statistical table is shown in Table 10.
From Table 10 and Figure 13, it can be seen that the average shortest path of MSSA is second only to ASFSSA and outperforms GA, SSA, and GWO. Compared to the original SSA, MSSA demonstrates stronger search capability and stability, effectively preventing it from getting trapped in local optima, significantly improving both the algorithm’s performance and stability.

7. Conclusions

This paper presents an algorithm named MSSA (Modified Sparrow Search Algorithm), which aims to address the issues of reduced population diversity, weakened search ability, and a tendency to be trapped in local optima during the later iterations of the Sparrow Search Algorithm (SSA). In the initialization phase, the MSSA algorithm incorporates Latin Hypercube Sampling (LHS) to enhance population diversity. It adopts an adaptive weighting mechanism in the producer update phase to boost search efficiency and employs Cauchy mutation and Cat map disturbance strategies to improve the global search capability. Experimental results demonstrate that the MSSA performs outstandingly in terms of optimization performance and problem solving, showing great potential for practical applications. To evaluate the performance of MSSA comprehensively, tests were carried out on 23 benchmark functions, 10 CEC2019 test functions, and 100 five-dimensional GKLS test functions. Additionally, MSSA was applied to solve three real-world engineering problems and a 20 × 20 robot path-planning problem.
To provide a clearer overview of the contributions of the MSSA algorithm discussed above, please refer to Table 11.
Statistical test indicators (such as mean and standard deviation) show that, compared to seven other algorithms, MSSA demonstrates higher convergence accuracy and better stability on most test functions. Simulation curves further illustrate that the algorithm converges faster and achieves higher precision. The application of MSSA to real-world engineering problems also further validates the improvements made by the algorithm. The results of the non-parametric Wilcoxon rank-sum test and the Friedman test show that MSSA exhibits a significant difference from the other algorithms at the 0.05 significance level.
Although the parameters of the MSSA algorithm were set in this study, these parameters may not be optimal in different scenarios. When dealing with problems of different scales and complexity levels, they may need to be readjusted and optimized. Meanwhile, there is a gap between the test functions used to verify the performance of the MSSA algorithm and real-world problems. Therefore, the effectiveness and adaptability of the algorithm in practical scenarios still need to be further verified. In addition, this study assumes that optimization problems have optimal solutions and that the MSSA algorithm can approach the optimal solutions within a reasonable time. However, some complex problems may not have global optimal solutions. This deviation between the assumption and reality may limit the application of the algorithm in extremely complex problems.
Future research will focus on validating the performance of MSSA on additional benchmark functions, such as the CEC2017 and CEC2022 test functions. In addition, the MSSA algorithm will be enhanced by incorporating techniques like elite reverse strategy, fast non-dominated sorting, mutation factor selection, and archive gene pools. These improvements will enable the evolution of MSSA into a multi-objective sparrow search algorithm, making it suitable for solving real-world optimization problems. For instance, in intelligent transportation systems, MSSA can be used to optimize traffic flow management and path planning; in energy management, it can help improve the thermal efficiency of coal-fired boiler units while reducing pollutant emissions. Furthermore, MSSA shows great potential across various fields, including environmental protection, robot path planning, and aerospace. As the demand for efficient optimization algorithms continues to grow in various industries, the multi-objective optimization capability of MSSA will provide more accurate and effective solutions in these domains.

Author Contributions

Conceptualization, Y.M. and W.M.; methodology, W.M. and X.W.; software, X.W. and W.M.; data curation, W.M. and P.G.; writing—original draft preparation, Y.M.; writing—review and editing, W.M. and X.Z.; visualization, W.M.; supervision, Y.M. and X.Z.; project administration, Y.M.; funding acquisition, Y.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 62203332), the Natural Science Foundation of Tianjin (Grant No. 20JCQNJC00430), the Tianjin Research Innovation Project for Postgraduate Students (Grant No. 2022SKYZ310), and the College Students' Innovative Entrepreneurial Training Plan Program (Grant No. 202410069005).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

This manuscript does not report data generation or analysis.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Yun, L. MATLAB implementation of Newton’s iteration method. Inf. Commun. 2011, 24, 20–22. [Google Scholar]
  2. Ruder, S. An overview of gradient descent optimization algorithms. arXiv 2016, arXiv:1609.04747. [Google Scholar]
  3. Cao, L.; Cai, Y.; Yue, Y. Swarm Intelligence-Based Performance Optimization for Mobile Wireless Sensor Networks: Survey, Challenges, and Future Directions. IEEE Access 2019, 7, 161524–161553. [Google Scholar] [CrossRef]
  4. Zhang, J.J.; Li, J.D.; Xu, X.M. A Passive Positioning Algorithm Using Time-Frequency Differences Based on the Cuckoo Search Algorithm and the Newton Method. Electron. Des. Eng. 2023, 31, 78–82. [Google Scholar]
  5. Izuchukwu, C.; Shehu, Y. A new inertial projected reflected gradient method with application to optimal control problems. Optim. Methods Softw. 2023, 39, 197–226. [Google Scholar] [CrossRef]
  6. Sakovich, N.; Aksenov, D.; Pleshakova, E.; Gataullin, S. MAMGD: Gradient-Based Optimization Method Using Exponential Decay. Technologies 2024, 12, 154. [Google Scholar] [CrossRef]
  7. Yan, F.; Xu, Y. An Optimized MTD Moving Target Detection Algorithm Based on Gradient Descent with Sampling Point Weights. In Proceedings of the 22nd Academic Annual Conference on Vacuum Electronics, Online, 27–30 April 2021; pp. 509–513. [Google Scholar]
  8. Ye, R.Z.; Du, F.Z. A Multi-Objective Fuzzy Optimization Scheduling Method for Regional Power Grids Based on the Distributed Newton Method. Electr. Technol. Econ. 2024, 299–302. [Google Scholar]
  9. Kennedy, J. Particle Swarm Optimization. In Proceedings of the 1995 IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  10. Han, Y.; Cai, J.; Zhou, G.; Li, Y.; Lin, H.; Tang, J. Research Progress of the Shuffled Frog Leaping Algorithm. Comput. Sci. 2010. [Google Scholar]
  11. Qin, Q.; Cheng, S.; Li, L.; Shi, Y. Artificial Bee Colony Algorithm: A Survey. Appl. Math. Comput. 2014, 249, 126–141. [Google Scholar]
  12. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  13. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  14. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  15. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris Hawks Optimization: Algorithm and Applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  16. Khishe, M.; Mosavi, M.R. Chimp Optimization Algorithm. Expert Syst. Appl. 2020, 149, 113338. [Google Scholar] [CrossRef]
  17. Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control. Eng. 2020, 8, 22–34. [Google Scholar] [CrossRef]
  18. Xue, J.; Shen, B. Dung beetle optimizer: A new meta-heuristic algorithm for global optimization. J. Supercomput. 2022, 79, 1–32. [Google Scholar] [CrossRef]
  19. Deng, L.; Liu, S. Snow Ablation Optimizer: A Novel Metaheuristic Technique for Numerical Optimization and Engineering Design. Expert. Syst. Appl. 2023, 225, 120069. [Google Scholar] [CrossRef]
  20. Guo, Z.; Liu, G.; Jiang, F. Chinese Pangolin Optimizer: A novel bio-inspired metaheuristic for solving optimization problems. J. Supercomput. 2025, 81, 517. [Google Scholar] [CrossRef]
  21. He, J.; Zhao, S.; Ding, J.; Wang, Y. Mirage search optimization: Application to path planning and engineering design problems. Adv. Eng. Softw. 2025, 203, 103883. [Google Scholar] [CrossRef]
  22. Elsisi, M. Optimal Design of Adaptive Model Predictive Control Based on Improved GWO for Autonomous Vehicle Considering System Vision Uncertainty. Appl. Soft Comput. 2024, 158, 111581. [Google Scholar] [CrossRef]
  23. Chen, M.; Cheng, Q.; Feng, X.; Zhao, K.; Zhou, Y.; Xing, B.; Tang, S.; Wang, R.; Duan, J.; Wang, J.; et al. Optimized variational mode decomposition algorithm based on adaptive thresholding method and improved whale optimization algorithm for denoising magnetocardiography signal. Biomed. Signal Process. Control. 2024, 88, 105681. [Google Scholar] [CrossRef]
  24. Liu, H.; Fan, J.; Guo, P. Improved gorilla optimization algorithm for kernel fuzzy clustering segmentation of RGB-D images. Microelectron. Comput. 2024, 1, 1–12. [Google Scholar]
  25. Javaheri, D.; Gorgin, S.; Lee, J.A.; Masdari, M. An improved discrete Harris hawk optimization algorithm for efficient workflow scheduling in multi-fog computing. Expert. Syst. Appl. 2021, 166, 113917. [Google Scholar] [CrossRef]
  26. Zhang, C.; Ma, H.; Hua, L.; Sun, W.; Nazir, M.S.; Peng, T. An evolutionary deep learning model based on TVFEMD, improved sine cosine algorithm, CNN and BiLSTM for wind speed prediction. Renew. Energy 2022, 187, 1107–1118. [Google Scholar] [CrossRef]
  27. Wu, D.; Yuan, C. Correction to: Threshold image segmentation based on improved sparrow search algorithm. Multimed. Tools Appl. 2022, 81, 33513–33546. [Google Scholar] [CrossRef] [PubMed]
  28. Panimalar, K.; Kanmani, S. Energy Efficient Cluster Head Selection Using Improved Sparrow Search Algorithm in Wireless Sensor Networks. J. King Saud Univ. Comput. Inf. Sci. 2022, 34, 8564–8575. [Google Scholar]
  29. Fei, L.; Li, R.; Liu, S.Q.; Tang, B.; Li, S.; Masoud, M. An Improved Sparrow Search Algorithm for Solving the Energy-Saving Flexible Job Shop Scheduling Problem. Machines 2022, 10, 847. [Google Scholar] [CrossRef]
  30. Zhou, N.; Zhang, S.; Zhang, C. Multi Strategy Improved Sparrow Search Algorithm Based on Rough Data Reasoning. J. Univ. Electron. Sci. Technol. China 2022, 51, 743–753. [Google Scholar]
  31. Zhang, W.; Liu, S. Adaptive t-Distribution and Improved Golden Sine Sparrow Search Algorithm and Its Applications. Microelectron. Comput. 2022, 39, 17–24. [Google Scholar]
  32. Ouyang, C.; Qiu, Y.; Zhu, D. Adaptive Spiral Flying Sparrow Search Algorithm. Sci. Prog. 2021, 2021, 1–16. [Google Scholar] [CrossRef]
  33. Duan, Y.; Liu, C. Sparrow Search Algorithm Based on Sobol Sequence and Crisscross Strategy. J. Comput. Appl. 2022, 42, 36–43. [Google Scholar]
  34. Zhang, C.; Ding, S. A Stochastic Configuration Network Based on Chaotic Sparrow Search Algorithm. Knowl. Based Syst. 2021, 220, 106924. [Google Scholar] [CrossRef]
  35. Zhu, Y.; Yousefi, N. Optimal Parameter Identification of PEMFC Stacks Using Adaptive Sparrow Search Algorithm. Int. J. Hydrogen Energy 2021, 46, 9541–9552. [Google Scholar] [CrossRef]
  36. Liu, G.; Shu, C.; Liang, Z.; Peng, B.; Cheng, L. A Modified Sparrow Search Algorithm with Application in 3D Route Planning for UAV. Sensors 2021, 21, 1224. [Google Scholar] [CrossRef] [PubMed]
  37. Song, X.; Wu, Q.; Cai, Y. Short-Term Power Load Forecasting Based on GRU Neural Network Optimized by an Improved Sparrow Search Algorithm. In Proceedings of the Eighth International Symposium on Advances in Electrical, Electronics, and Computer Engineering (ISAEECE 2023), Hangzhou, China, 17–19 February 2023; p. 1270431. [Google Scholar]
  38. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  39. Stein, M. Large Sample Properties of Simulations Using Latin Hypercube Sampling. Technometrics 1987, 29, 143–151. [Google Scholar] [CrossRef]
  40. Lü, L.; Ji, W. A Particle Swarm Optimization Algorithm Combining Centroid Concept and Cauchy Mutation Strategy. Comput. Appl. 2017, 37, 1369–1375. [Google Scholar]
  41. Han, R.; Zhang, X.F. Pseudo-random sequence generation method based on high-dimensional cat mapping. Comput. Eng. Appl. 2016, 52, 91–99. [Google Scholar]
  42. Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.-P.; Auger, A.; Tiwari, S. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization; Technical Report; Nanyang Technological University: Singapore, 2005. [Google Scholar]
  43. Price, K.V.; Awad, N.H.; Ali, M.Z.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the 100-Digit Challenge Special Session and Competition on Single Objective Numerical Optimization; Technical Report; Nanyang Technological University: Singapore, 2018. [Google Scholar]
  44. Gaviano, M.; Kvasov, D.E.; Lera, D.; Sergeyev, Y.D. Algorithm 829: Software for generation of classes of test functions with known local and global minima for global optimization. ACM Trans. Math. Softw. 2003, 29, 469–480. [Google Scholar] [CrossRef]
  45. Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 2018, 8, 453. [Google Scholar] [CrossRef] [PubMed]
  46. Rather, S.A.; Bala, P.S. Swarm-Based Chaotic Gravitational Search Algorithm for Solving Mechanical Engineering Design Problems. World J. Eng. 2020, 17, 97–114. [Google Scholar] [CrossRef]
  47. Chen, P.; Zhou, S.; Zhang, Q.; Kasabov, N. A Meta-Inspired Termite Queen Algorithm for Global Optimization and Engineering Design Problems. Eng. Appl. Artif. Intell. 2022, 111, 104805. [Google Scholar] [CrossRef]
  48. Zhao, Z.H.; Ma, J.D.; Zhang, Y.R. Research on Robot Path Planning Based on an Improved Particle Swarm Dung Beetle Algorithm. China New Technol. Prod. 2024, 46. [Google Scholar]
Figure 1. Comparison of sampling distributions (N = 40).
Figure 2. Comparison of sampling distributions (N = 100).
Figure 3. Trajectory diagram of w and w1.
Figure 4. Flowchart of MSSA.
Figure 5. Comparison of convergence curves of MSSA and other algorithms on 23 benchmark functions.
Figure 6. Comparison of convergence curves of MSSA and other algorithms on F1–F13.
Figure 7. Comparison of convergence curves of MSSA and other algorithms on F1–F13 (D = 30).
Figure 8. Comparison of convergence curves of MSSA and other algorithms on CEC2019.
Figure 9. Performance comparison on GKLS functions.
Figure 10. Welded beam design problem’s image and its convergence.
Figure 11. Speed reducer design problem’s image and its convergence.
Figure 12. Cantilever beam design problem’s image and its convergence.
Figure 13. A 20 × 20 shortest path planning diagram.
Table 1. Several Sampling Methods.

Sampling Method | Characteristics | Advantages | Disadvantages
Random Sampling | Samples are randomly distributed within the interval. | Easy to implement, suitable for large-scale sampling. | Samples may cluster in some areas, leading to uneven coverage.
Latin Hypercube Sampling (LHS) | Multidimensional stratified sampling (see the sketch below). | Can effectively cover the entire range even with few samples. | More complex to compute than simple random sampling.
Uniform Distribution Sampling | Each point in the interval has the same probability of being selected. | Simple and easy to implement. | Samples may cluster in certain areas, resulting in uneven coverage.
Gaussian Distribution Sampling | Samples are distributed around the mean, with fewer samples far from the mean. | Suitable for normally distributed data, easy to generate. | Samples are concentrated near the mean, with sparse coverage in the tails.
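To make the LHS row of Table 1 concrete, the following Python sketch draws a Latin hypercube sample for population initialization: each dimension is split into n equal strata, one point is drawn inside every stratum, and the strata are paired at random across dimensions. The interface (n, dim, lb, ub) is an illustrative assumption, not the paper's code.

```python
import numpy as np

def lhs_init(n, dim, lb, ub, seed=None):
    """Minimal Latin Hypercube Sampling sketch for population initialization."""
    rng = np.random.default_rng(seed)
    # One stratum index per agent in every dimension, shuffled independently
    # per dimension so that strata are paired randomly across dimensions.
    strata = rng.permuted(np.tile(np.arange(n), (dim, 1)), axis=1).T
    u = (strata + rng.random((n, dim))) / n   # one point inside each stratum
    return lb + u * (ub - lb)                 # scale to the search box

# Example: 40 agents in 2D on [-100, 100]^2; every axis-aligned stratum of
# width 200/40 then contains exactly one agent.
pop = lhs_init(40, 2, np.array([-100.0, -100.0]), np.array([100.0, 100.0]))
```

Unlike plain random sampling, no stratum is left empty, which is what yields the more uniform coverage seen in Figures 1 and 2.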
Table 2. The complexity and parameters of each algorithm.

Algorithm | Parameters | Population
SCA | α = 2 | Population = 40
WOA | b = 1, α = 2 → 0 (linearly decreasing) |
HHO | E0 ∈ (−1, 1) |
Chimp | m = chaos(3, 1, 1) |
SSA | ST = 0.7, PD = 0.5, SD = 0.3 |
ASFSSA | ST = 0.7, PD = 0.5, SD = 0.3 |
DBO | α = 1 or −1, b = 0.3, k = 0.1, S = 0.5, P_percent = 0.2 |
SAO | Na = Nb = N/2 |
GA | pm = 0.8, pc = 0.05 |
CPO | r1, r2 ∈ (0, 1) |
MSO | α ∈ (0, 1) |
Table 3. Experimental results of MSSA and other algorithms on 23 benchmark functions (D = 30).

Function | Metric | SSA | DBO | SAO | SCA | Chimp | HHO | WOA | MSSA
Uni-modal functions
F1 | Mean | 4.8403 × 10−244 | 1.6633 × 10−97 | 3.9791 × 10−5 | 3.026 | 8.0807 × 10−7 | 1.2148 × 10−101 | 4.5632 × 10−75 | 0
F1 | SD | 0 | 7.4385 × 10−97 | 6.0809 × 10−5 | 5.2634 | 2.0318 × 10−6 | 4.7343 × 10−101 | 2.0406 × 10−74 | 0
F1 | p-values | 0.125 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F2 | Mean | 1.1962 × 10−7 | 2.8036 × 10−65 | 8.8091 × 10−4 | 1.5654 × 10−2 | 7.0217 × 10−6 | 5.5887 × 10−52 | 1.9709 × 10−53 | 1.2942 × 10−273
F2 | SD | 2.6373 × 10−7 | 1.2459 × 10−64 | 1.3044 × 10−3 | 3.2183 × 10−2 | 8.563 × 10−6 | 2.2355 × 10−51 | 5.9608 × 10−53 | 0
F2 | p-values | 3.5153 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F3 | Mean | 9.0416 × 10−268 | 2.3775 × 10−83 | 1066.3701 | 5856.0238 | 154.2924 | 1.917 × 10−83 | 33,479.2865 | 0
F3 | SD | 0 | 1.0602 × 10−82 | 1446.7758 | 3755.2895 | 435.3517 | 8.1173 × 10−83 | 13,578.1384 | 0
F3 | p-values | 3.125 × 10−2 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F4 | Mean | 3.3225 × 10−107 | 4.4094 × 10−52 | 1.4233 | 0.1469 | 0.11623 | 1.0448 × 10−50 | 47.6442 | 3.2137 × 10−267
F4 | SD | 1.4859 × 10−106 | 1.8761 × 10−51 | 0.570811 | 0.4015 | 8.5933 × 10−2 | 2.8154 × 10−50 | 31.7755 | 0
F4 | p-values | 4.3778 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F5 | Mean | 2.0593 × 10−5 | 25.4035 | 40.4379 | 43,449.6197 | 28.8927 | 4.184 × 10−3 | 27.8302 | 7.7089 × 10−6
F5 | SD | 5.3199 × 10−5 | 0.22367 | 27.6976 | 120,561.791 | 9.5663 × 10−2 | 6.0131 × 10−3 | 0.47304 | 1.8361 × 10−5
F5 | p-values | 0.16718 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1.0335 × 10−4 | 8.8575 × 10−5 | NA
F6 | Mean | 1.1962 × 10−7 | 4.0449 × 10−7 | 3.3332 × 10−5 | 9.1999 | 3.1781 | 8.4771 × 10−5 | 0.16175 | 5.3953 × 10−8
F6 | SD | 2.6373 × 10−7 | 5.2474 × 10−7 | 3.0624 × 10−5 | 7.6902 | 0.44439 | 1.6742 × 10−4 | 0.1323 | 1.1408 × 10−7
F6 | p-values | 0.64416 | 6.8061 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F7 | Mean | 2.741 × 10−4 | 1.8578 × 10−3 | 0.4284 | 7.5726 × 10−2 | 2.0733 × 10−3 | 1.3783 × 10−4 | 1.9485 × 10−3 | 1.1603 × 10−4
F7 | SD | 2.493 × 10−4 | 1.2634 × 10−3 | 1.7508 × 10−2 | 7.8847 × 10−2 | 1.8934 × 10−3 | 1.6957 × 10−4 | 1.6768 × 10−3 | 8.5419 × 10−5
F7 | p-values | 4.7858 × 10−2 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 0.88129 | 1.6286 × 10−4 | NA
Multi-modal functions
F8 | Mean | −9502.1132 | −8870.9105 | −9120.8181 | −3768.5139 | −5730.4026 | −12,569.1605 | −10,973.321 | −10,879.7721
F8 | SD | 2738.0434 | 1446.0067 | 72.1644 | 243.0804 | 62.194 | 0.54721 | 2094.2708 | 1794.4138
F8 | p-values | 0.21796 | 5.734 × 10−3 | 2.495 × 10−3 | 8.8575 × 10−5 | 1.0335 × 10−4 | 8.8575 × 10−5 | 0.68132 | NA
F9 | Mean | 0 | 2.4898 | 38.1862 | 36.0641 | 9.075 | 0 | 0 | 0
F9 | SD | 0 | 6.8064 | 15.5384 | 38.0386 | 7.8769 | 0 | 0 | 0
F9 | p-values | 1 | 0.125 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1 | 1 | NA
F10 | Mean | 4.4409 × 10−16 | 4.4409 × 10−16 | 1.4674 × 10−3 | 16.5063 | 19.9622 | 4.4409 × 10−16 | 2.7534 × 10−15 | 4.4409 × 10−16
F10 | SD | 0 | 0 | 1.459 × 10−3 | 7.6507 | 1.3489 × 10−3 | 0 | 1.7386 × 10−15 | 4.4409 × 10−16
F10 | p-values | 1 | 1 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1 | 2.4414 × 10−4 | NA
F11 | Mean | 0 | 0 | 5.2997 × 10−2 | 0.7005 | 2.5468 × 10−2 | 0 | 4.2266 × 10−3 | 0
F11 | SD | 0 | 0 | 2.0365 × 10−1 | 0.31924 | 4.3681 × 10−2 | 0 | 1.8902 × 10−2 | 0
F11 | p-values | 1 | 1 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1 | 1 | NA
F12 | Mean | 3.6802 × 10−8 | 2.6807 × 10−4 | 1.0371 × 10−2 | 45,201.469 | 0.3697 | 2.4008 × 10−6 | 1.3899 × 10−2 | 1.2762 × 10−8
F12 | SD | 6.6565 × 10−8 | 1.1283 × 10−3 | 3.1908 × 10−2 | 202,001.79 | 0.14812 | 2.5829 × 10−6 | 1.0749 × 10−2 | 1.5159 × 10−8
F12 | p-values | 0.29588 | 5.6915 × 10−2 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.857 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F13 | Mean | 4.1787 × 10−7 | 6.527 × 10−2 | 4.4476 × 10−3 | 60,807.8872 | 2.7933 | 8.5529 × 10−5 | 0.32628 | 2.1011 × 10−7
F13 | SD | 6.2505 × 10−7 | 0.11389 | 0.10309 | 164,292.136 | 0.12083 | 6.9127 × 10−5 | 0.1877 | 4.1398 × 10−7
F13 | p-values | 3.6561 × 10−2 | 8.8575 × 10−5 | 1.4013 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1.0335 × 10−4 | 8.8575 × 10−5 | NA
Fixed-dimensional multi-modal functions
F14 | Mean | 7.1949 | 1.1964 | 3.0155 | 1.891 | 0.99812 | 1.4931 | 2.6614 | 0.998
F14 | SD | 5.0208 | 0.61069 | 3.0155 | 1.0126 | 4.3042 × 10−4 | 1.1321 | 3.5292 | 2.3447 × 10−9
F14 | p-values | 1.1529 × 10−4 | 0.72656 | 4.8828 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 6.8061 × 10−4 | 5.9342 × 10−4 | NA
F15 | Mean | 3.1787 × 10−4 | 8.648 × 10−4 | 2.5917 × 10−3 | 1.0903 × 10−3 | 1.2823 × 10−3 | 3.762 × 10−4 | 5.7833 × 10−4 | 3.5493 × 10−4
F15 | SD | 2.0142 × 10−5 | 3.2621 × 10−4 | 6.0923 × 10−3 | 4.0239 × 10−4 | 3.7148 × 10−5 | 2.072 × 10−4 | 2.8029 × 10−4 | 2.0437 × 10−4
F15 | p-values | 0.2988 | 3.3385 × 10−4 | 2.2039 × 10−3 | 8.8575 × 10−5 | 8.8575 × 10−5 | 2.2821 × 10−3 | 1.3245 × 10−3 | NA
F16 | Mean | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316
F16 | SD | 1.4408 × 10−16 | 2.0376 × 10−16 | 2.2781 × 10−16 | 2.9831 × 10−5 | 9.7388 × 10−6 | 2.0397 × 10−10 | 1.3452 × 10−10 | 1.3478 × 10−16
F16 | p-values | 1 | 1 | 1 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1.1964 × 10−4 | 8.8575 × 10−5 | NA
F17 | Mean | 0.39789 | 0.39789 | 0.39789 | 0.39958 | 0.39898 | 0.39789 | 0.39789 | 0.39789
F17 | SD | 0 | 0 | 0 | 1.1511 × 10−3 | 1.4041 × 10−3 | 9.9056 × 10−6 | 1.4366 × 10−5 | 1.1917 × 10−15
F17 | p-values | 1 | 1 | 1 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1.9644 × 10−4 | 8.8575 × 10−5 | NA
F18 | Mean | 3 | 3 | 3 | 3 | 3.0001 | 3 | 3 | 3
F18 | SD | 2.849 × 10−15 | 1.5214 × 10−15 | 4.5563 × 10−16 | 1.5395 × 10−5 | 1.3295 × 10−4 | 1.9892 × 10−7 | 6.0152 × 10−5 | 1.1246 × 10−13
F18 | p-values | 1.1985 × 10−4 | 1.2257 × 10−4 | 7.9305 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 1.0178 × 10−3 | 8.8575 × 10−5 | NA
F19 | Mean | −3.8628 | −3.8616 | −3.8628 | −3.8553 | −3.8552 | −3.8611 | −3.8579 | −3.8628
F19 | SD | 3.3348 × 10−14 | 2.8874 × 10−3 | 2.2781 × 10−15 | 3.3086 × 10−3 | 2.1766 × 10−3 | 3.9838 × 10−3 | 3.9586 × 10−3 | 1.278 × 10−9
F19 | p-values | 8.8575 × 10−5 | 7.3138 × 10−3 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F20 | Mean | −3.2739 | −3.2076 | −3.2685 | −2.9212 | −2.5851 | −3.1395 | −3.1855 | −3.2654
F20 | SD | 6.7875 × 10−2 | 0.11195 | 6.0685 × 10−2 | 0.20153 | 0.4862 | 9.1708 × 10−2 | 0.20373 | 8.0715 × 10−2
F20 | p-values | 0.70891 | 0.21796 | 0.37026 | 1.4013 × 10−4 | 8.8575 × 10−5 | 1.1713 × 10−3 | 0.20433 | NA
F21 | Mean | −10.1532 | −7.0682 | −5.4464 | −3.3795 | −3.5228 | −5.0536 | −7.8775 | −10.1532
F21 | SD | 8.5674 × 10−8 | 2.9472 | 1.6944 | 1.8885 | 2.0319 | 1.5366 × 10−3 | 3.2513 | 2.3441 × 10−13
F21 | p-values | 0.17212 | 7.7877 × 10−4 | 1.4599 × 10−4 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F22 | Mean | −10.4029 | −7.9039 | −6.4516 | −1.6658 | −4.1684 | −5.6137 | −8.5163 | −10.4029
F22 | SD | 2.0709 × 10−7 | 2.8795 | 2.76 | 1.5221 | 1.6727 | 1.6262 | 2.8964 | 1.2333 × 10−11
F22 | p-values | 0.70467 | 1.2264 × 10−2 | 1.2673 × 10−3 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
F23 | Mean | −10.266 | −8.4347 | −6.9566 | −4.3889 | −4.824 | −5.3937 | | −10.5364
F23 | SD | 1.2092 | 2.9169 | 2.7101 | 1.8821 | 0.91442 | 1.1914 | 3.3698 | 1.1691 × 10−10
F23 | p-values | 0.55658 | 2.1682 × 10−2 | 4.0324 × 10−3 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | 8.8575 × 10−5 | NA
Note: NA in the table indicates invalid data.
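The p-values reported in Table 3 (and in the later comparison tables) are obtained with the Wilcoxon signed-rank test at the 0.05 significance level, as summarized in Table 11. A minimal SciPy sketch of such a test is given below; the paired per-run arrays are invented toy numbers, not the paper's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Toy example: paired best-fitness values of two algorithms over 10 runs.
rng = np.random.default_rng(1)
mssa_runs = rng.normal(1.0e-8, 1.0e-9, size=10)   # invented results
ssa_runs = rng.normal(1.0e-7, 2.0e-8, size=10)    # invented results

stat, p = wilcoxon(mssa_runs, ssa_runs)           # paired, non-parametric test
print(f"p = {p:.4g}; significant at 0.05: {p < 0.05}")
```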
Table 4. Experimental results of MSSA and other algorithms on F1–F13 functions.

Function | Algorithm | Mean (D = 50) | SD (D = 50) | Mean (D = 100) | SD (D = 100)
F1 | MSSA | 0 | 0 | 0 | 0
F1 | SSA | 1.3087 × 10−235 | 0 | 9.1969 × 10−289 | 0
F1 | ASFSSA | 0 | 0 | 0 | 0
F2 | MSSA | 0 | 0 | 1.6393 × 10−282 | 0
F2 | SSA | 2.0532 × 10−6 | 9.1280 × 10−86 | 4.8594 × 10−139 | 2.1732 × 10−138
F2 | ASFSSA | 3.2924 × 10−311 | 0 | 3.1895 × 10−280 | 0
F3 | MSSA | 0 | 0 | 0 | 0
F3 | SSA | 1.4521 × 10−210 | 0 | 4.8765 × 10−130 | 2.1808 × 10−129
F3 | ASFSSA | 0 | 0 | 0 | 0
F4 | MSSA | 5.990 × 10−249 | 0 | 0 | 0
F4 | SSA | 1.4296 × 10−127 | 6.3935 × 10−127 | 2.2598 × 10−126 | 1.0106 × 10−125
F4 | ASFSSA | 2.7552 × 10−304 | 0 | 3.5802 × 10−255 | 0
F5 | MSSA | 5.2803 × 10−5 | 1.0789 × 10−4 | 2.0216 × 10−4 | 3.0459 × 10−4
F5 | SSA | 7.8868 × 10−5 | 1.1923 × 10−4 | 4.1506 × 10−4 | 9.9227 × 10−4
F5 | ASFSSA | 4.9156 × 10−3 | 8.6149 × 10−3 | 1.6051 × 10−2 | 3.459 × 10−2
F6 | MSSA | 2.3840 × 10−7 | 3.5863 × 10−7 | 6.6802 × 10−7 | 8.8676 × 10−7
F6 | SSA | 4.3318 × 10−7 | 5.8853 × 10−7 | 1.5188 × 10−6 | 2.3585 × 10−6
F6 | ASFSSA | 1.2079 × 10−5 | 2.5184 × 10−5 | 9.1211 × 10−5 | 1.9119 × 10−4
F7 | MSSA | 1.3568 × 10−4 | 1.2107 × 10−4 | 1.2734 × 10−4 | 1.0386 × 10−4
F7 | SSA | 2.5163 × 10−4 | 1.2289 × 10−4 | 2.4166 × 10−4 | 1.62621 × 10−4
F7 | ASFSSA | 9.8284 × 10−5 | 1.2425 × 10−4 | 1.0428 × 10−4 | 1.0206 × 10−4
F8 | MSSA | −17,708.5754 | 2721.5471 | −36,630.6833 | 971.7559
F8 | SSA | −17,278.5363 | 586.9176 | −37,485.9043 | 4113.869
F8 | ASFSSA | −15,546.3062 | 1071.4226 | −22,423.7605 | 2050.4667
F9 | MSSA | 0 | 0 | 0 | 0
F9 | SSA | 0 | 0 | 0 | 0
F9 | ASFSSA | 0 | 0 | 0 | 0
F10 | MSSA | 4.4409 × 10−16 | 0 | 4.4409 × 10−16 | 0
F10 | SSA | 4.4409 × 10−16 | 0 | 4.4409 × 10−16 | 0
F10 | ASFSSA | 4.4409 × 10−16 | 0 | 4.4409 × 10−16 | 0
F11 | MSSA | 0 | 0 | 0 | 0
F11 | SSA | 0 | 0 | 0 | 0
F11 | ASFSSA | 0 | 0 | 0 | 0
F12 | MSSA | 1.8017 × 10−8 | 4.0916 × 10−8 | 2.4191 × 10−8 | 7.6385 × 10−8
F12 | SSA | 3.9135 × 10−9 | 7.3722 × 10−9 | 2.9194 × 10−8 | 6.2248 × 10−8
F12 | ASFSSA | 1.9148 × 10−7 | 2.8972 × 10−7 | 2.3519 × 10−7 | 3.7433 × 10−7
F13 | MSSA | 4.3383 × 10−7 | 5.6604 × 10−7 | 1.0375 × 10−6 | 2.2243 × 10−6
F13 | SSA | 9.1585 × 10−7 | 2.6195 × 10−6 | 2.1674 × 10−6 | 3.3805 × 10−6
F13 | ASFSSA | 5.0759 × 10−6 | 7.0915 × 10−6 | 1.9273 × 10−5 | 2.9267 × 10−5
Table 5. Experimental results of MSSA and other algorithms on F1–F13 functions (D = 30).

Function | Metric | MSSA | CPO | CSSA | MSO
F1 | Mean | 0 | 1.3957 × 10−148 | 0 | 1.032 × 10−3
F1 | SD | 0 | 3.797 × 10−148 | 0 | 8.1508 × 10−4
F1 | p-values | NA | 1.9531 × 10−3 | 1 | 1.9531 × 10−3
F2 | Mean | 0 | 2.8512 × 10−66 | 2.9768 × 10−200 | 5.3912 × 10−3
F2 | SD | 0 | 9.0161 × 10−66 | 0 | 4.4073 × 10−3
F2 | p-values | NA | 1.9531 × 10−3 | 0.5 | 1.9531 × 10−3
F3 | Mean | 0 | 3.4518 × 10−121 | 0 | 1952.2359
F3 | SD | 0 | 1.0915 × 10−120 | 0 | 755.6469
F3 | p-values | NA | 1.9531 × 10−3 | 1 | 1.9531 × 10−3
F4 | Mean | 1.538 × 10−320 | 2.0157 × 10−63 | 5.164 × 10−63 | 5.164 × 10−225
F4 | SD | 0 | 4.7631 × 10−63 | 0 | 5.5849
F4 | p-values | NA | 1.9531 × 10−3 | 0.5 | 1.9531 × 10−3
F5 | Mean | 5.5967 × 10−6 | 24.9892 | 1.5787 × 10−4 | 214.0628
F5 | SD | 8.888 × 10−6 | 8.7853 | 2.0579 × 10−4 | 301.0648
F5 | p-values | NA | 1.9531 × 10−3 | 9.7656 × 10−3 | 1.9531 × 10−3
F6 | Mean | 2.5788 × 10−8 | 6.1026 × 10−2 | 1.1519 × 10−6 | 9.5927 × 10−4
F6 | SD | 4.4412 × 10−8 | 1.1005 × 10−1 | 1.3663 × 10−6 | 4.1921 × 10−4
F6 | p-values | NA | 1.9531 × 10−3 | 3.9062 × 10−3 | 1.9531 × 10−3
F7 | Mean | 1.252 × 10−4 | 1.115 × 10−4 | 1.5085 × 10−4 | 9.1615 × 10−2
F7 | SD | 1.4052 × 10−4 | 9.9963 × 10−5 | 1.0329 × 10−4 | 3.3619 × 10−2
F7 | p-values | NA | 0.76953 | 0.49219 | 1.9531 × 10−3
F8 | Mean | −10,511.7565 | −4353.8839 | −12,200.0677 | −9583.8378
F8 | SD | 1722.7594 | 1933.8196 | 563.9687 | 328.9952
F8 | p-values | NA | 1.9531 × 10−3 | 3.9062 × 10−3 | 1.6016 × 10−1
F9 | Mean | 0 | 0 | 0 | 36.6183
F9 | SD | 0 | 0 | 0 | 11.571
F9 | p-values | NA | 1 | 1 | 1.9531 × 10−3
F10 | Mean | 4.4409 × 10−16 | 4.4409 × 10−16 | 4.4409 × 10−16 | 4.6258 × 10−1
F10 | SD | 0 | 0 | 0 | 6.1442 × 10−1
F10 | p-values | NA | 1 | 1 | 1.9531 × 10−3
F11 | Mean | 0 | 0 | 0 | 2.1175 × 10−2
F11 | SD | 0 | 0 | 0 | 1.8055 × 10−2
F11 | p-values | NA | 1 | 1 | 1.9531 × 10−3
F12 | Mean | 1.2615 × 10−8 | 3.6508 × 10−7 | 5.2542 × 10−8 | 3.2297 × 10−1
F12 | SD | 2.0312 × 10−8 | 3.5871 × 10−7 | 7.4516 × 10−8 | 4.3165 × 10−1
F12 | p-values | NA | 1.9531 × 10−3 | 2.3242 × 10−3 | 1.9532 × 10−3
F13 | Mean | 1.9609 × 10−7 | 3.9974 × 10−6 | 1.5651 × 10−6 | 2.5870 × 10−1
F13 | SD | 2.7869 × 10−7 | 4.2964 × 10−6 | 2.2656 × 10−6 | 7.9157 × 10−1
F13 | p-values | NA | 9.7656 × 10−3 | 2.3242 × 10−1 | 1.9531 × 10−3
Note: NA in the table indicates invalid data.
Table 6. Experimental results of MSSA and other algorithms on CEC2019.

Function | Metric | MSSA | SSA | Chimp | HHO | DBO | SAO
F1 | Mean | 1.0000 | 1.0000 | 1,781,542.16 | 1.0000 | 821,233.12 | 17,947.94
F1 | SD | 0.0000 | 0.0000 | 3,096,970.03 | 0.0000 | 3,264,743.61 | 23,363.7653
F2 | Mean | 4.9748 | 5.0000 | 2307.6197 | 5.0000 | 457.5043 | 166.2708
F2 | SD | 0.1380 | 0.0000 | 1300.2847 | 0.0000 | 1201.2186 | 101.6676
F3 | Mean | 4.4510 | 6.4640 | 5.6428 | 4.5391 | 4.7095 | 2.7336
F3 | SD | 2.3754 | 2.7549 | 1.1436 | 0.9620 | 2.3807 | 2.0580
F4 | Mean | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000
F4 | SD | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
F5 | Mean | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000
F5 | SD | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
F6 | Mean | 4.1031 | 4.2008 | 4.6667 | 4.2685 | 4.4228 | 4.1928
F6 | SD | 0.3198 | 0.4407 | 0.3638 | 0.3985 | 0.3279 | 0.3166
F7 | Mean | 1.0000 | 1.0000 | 1.0028 | 1.0000 | 1.0000 | 1.0042
F7 | SD | 0.0000 | 0.0000 | 0.0107 | 0.0000 | 0.0000 | 0.0158
F8 | Mean | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000
F8 | SD | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
F9 | Mean | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000
F9 | SD | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0002 | 0.0000
F10 | Mean | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000 | 1.0000
F10 | SD | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
Friedman | Score | 3.0049 | 3.4633 | 4.5883 | 3.2800 | 3.5500 | 3.5480
Friedman | Rank | 1 | 3 | 6 | 2 | 5 | 4
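The Friedman scores in Table 6 are mean ranks: on each test function the algorithms are ranked by their result, and these ranks are averaged over all functions. A small sketch of this computation is shown below, assuming a matrix with one row per function and one column per algorithm; the numbers are invented toy data, not the paper's results.

```python
import numpy as np
from scipy.stats import rankdata

# Rows: test functions; columns: algorithms. Lower value = better (rank 1).
results = np.array([
    [1.0, 1.2, 3.5],   # invented toy means, not the paper's results
    [0.9, 1.5, 1.5],
    [2.0, 1.0, 4.0],
])
ranks = np.apply_along_axis(rankdata, 1, results)  # rank within each function
friedman_scores = ranks.mean(axis=0)               # mean rank per algorithm
final_rank = friedman_scores.argsort().argsort() + 1
print(friedman_scores, final_rank)
```

When the full per-run result matrix is available, scipy.stats.friedmanchisquare can additionally supply the associated test statistic and p-value.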
Table 7. Comparison of optimization designs for welded beam design using different algorithms.

Algorithm | Mean | SD
MSSA | 1.7187 | 0.0586
SSA | 1.9794 | 0.3861
Chimp | 1.7969 | 0.0268
HHO | 2.0326 | 0.3202
DBO | 1.7329 | 0.0843
SAO | 1.7154 | 0.0692
Table 8. Comparison of optimization designs of the speed reducer using different algorithms.

Algorithm | Mean | SD
MSSA | 2727.7338 | 22.0685
SSA | 3087.6111 | 337.1181
Chimp | 3130.4727 | 42.3819
HHO | 3051.3012 | 62.0078
DBO | 3027.4794 | 48.4128
SAO | 2994.4711 | 0.0000
Table 9. Comparison of the optimization designs for the cantilever beam using different algorithms.

Algorithm | Mean | SD
MSSA | 1.3409 | 0.0006
SSA | 1.3423 | 0.0018
Chimp | 1.3628 | 0.0091
HHO | 1.3431 | 0.0021
DBO | 1.3400 | 0.0000
SAO | 1.3400 | 0.0000
Table 10. Statistics of the routes found by each algorithm.

Metric | GA | SSA | MSSA | ASFSSA | GWO
Best | 28.0192 | 28.5777 | 28.5777 | 28.4193 | 28.4193
Mean | 29.3141 | 29.9852 | 28.8836 | 28.5875 | 29.3287
Worst | 30.4869 | 31.1395 | 29.7765 | 28.6315 | 30.8721
Table 11. Key contributions and performance validation of the MSSA algorithm.

Contribution | Description
Enhancement of Population Diversity | Introduced Latin Hypercube Sampling (LHS) during the initialization phase to enhance population diversity and avoid premature convergence.
Adaptive Weighting Mechanism | Applied an adaptive weighting mechanism to improve search efficiency, ensuring optimal performance at different stages of the search process.
Enhanced Global Search Capability | Utilized Cauchy mutation and cat mapping perturbation strategies during the discovery phase to strengthen global search ability and prevent premature convergence to local optima (see the sketch after this table).
Optimization Performance Validation | To verify the optimization performance and global optimization ability of the Modified Sparrow Search Algorithm (MSSA), tests were conducted on 23 benchmark functions, 10 CEC2019 test functions, and 100 five-dimensional GKLS test functions.
Stability and Precision | Experimental results indicate that MSSA outperforms other algorithms in terms of convergence precision and stability on most test functions.
Application to Real-World Problems | Demonstrated the effectiveness of MSSA by applying it to three real-world engineering problems and a 20 × 20 robot path-planning problem, further validating the improvements made.
Statistical Tests | The Wilcoxon signed-rank test showed significant differences between MSSA and other algorithms at the 0.05 significance level.
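For readers who wish to experiment with the two perturbation operators named above, the sketch below gives one possible Python realization of Cauchy mutation and a cat-map-based perturbation with greedy acceptance. The step scale, the pairing of the chaotic sequence with the search box, and the acceptance rule are assumptions made here for illustration; the paper's update equations define the actual operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy_mutate(x, scale=1.0):
    # Heavy-tailed multiplicative step around the current solution; the long
    # tail of the Cauchy distribution occasionally produces large escape jumps.
    return x * (1.0 + scale * rng.standard_cauchy(x.shape))

def cat_sequence(n, y=0.7, z=0.3):
    # Chaotic sequence in [0, 1) from the 2D Arnold cat map:
    # y' = (y + z) mod 1, z' = (y + 2z) mod 1.
    seq = np.empty(n)
    for i in range(n):
        y, z = (y + z) % 1.0, (y + 2.0 * z) % 1.0
        seq[i] = y
    return seq

def cat_map_perturb(x, lb, ub, step=0.1):
    # Pull each coordinate toward a chaotic point in the search box;
    # step = 0.1 is an assumed scale, not a value from the paper.
    c = cat_sequence(x.size, y=float(rng.random()), z=float(rng.random()))
    return x + step * (lb + c * (ub - lb) - x)

def perturb_best(x_best, f, lb, ub):
    # Greedy acceptance: keep a perturbed best only if it improves f.
    for cand in (cauchy_mutate(x_best), cat_map_perturb(x_best, lb, ub)):
        cand = np.clip(cand, lb, ub)
        if f(cand) < f(x_best):
            x_best = cand
    return x_best

# Toy usage on the 5-dimensional sphere function.
def sphere(v):
    return float(np.sum(v * v))

lb, ub = np.full(5, -100.0), np.full(5, 100.0)
x = rng.uniform(lb, ub)
for _ in range(100):
    x = perturb_best(x, sphere, lb, ub)
```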
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
