Article

Sailfish Optimization Algorithm Integrated with the Osprey Optimization Algorithm and Cauchy Mutation and Its Engineering Applications

School of Electronic and Electrical Engineering, Wenzhou University of Technology, Wenzhou 325035, China
* Author to whom correspondence should be addressed.
Symmetry 2025, 17(6), 938; https://doi.org/10.3390/sym17060938
Submission received: 4 May 2025 / Revised: 5 June 2025 / Accepted: 11 June 2025 / Published: 12 June 2025
(This article belongs to the Special Issue Symmetry in Intelligent Algorithms)

Abstract

From collective intelligence to evolutionary computation and machine learning, symmetry can be leveraged to enhance algorithm performance, streamline computational procedures, and elevate solution quality. Grasping and leveraging symmetry can give rise to more resilient, scalable, and understandable algorithms. In view of the flaws of the original Sailfish Optimization Algorithm (SFO), such as low convergence precision and a propensity to get stuck in local optima, this paper puts forward an Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm (OCSFO). The enhancements are mainly carried out in three aspects: (1) Using the Logistic map to initialize the sailfish and sardine populations. (2) In the first stage of the local development phase of sailfish individual position update, adopting the global exploration strategy of the Osprey Optimization Algorithm to boost the algorithm’s global search capability. (3) Introducing Cauchy mutation to activate the sailfish and sardine populations during the prey capture stage. Through the comparative analysis of OCSFO and seven other swarm intelligence optimization algorithms in the optimization of 23 classic benchmark test functions, as well as the Wilcoxon rank-sum test, it is evident that the optimization speed and convergence precision of OCSFO have been notably improved. To confirm the practicality and viability of the OCSFO algorithm, it is applied to solve the optimization problems of piston rods, three-bar trusses, cantilever beams, and topology. Through experimental analysis, it can be concluded that the OCSFO algorithm has certain advantages in solving practical optimization problems.

1. Introduction

With the ongoing progress and evolution of society, the attributes of engineering optimization problems have undergone significant transformations. These issues have developed from small-scale and simple structures to large-scale and highly intricate systems, from single-peak solution landscapes to multi-peak setups, and from linear to strongly non-linear features [1,2]. Such changes present considerable challenges to traditional optimization algorithms, which usually depend on accurate mathematical models and gradient-based techniques. When dealing with large-scale, multi-peak, and non-linear problems, these conventional algorithms struggle to efficiently explore the solution space [3]. As a result, they often fail to find the global optimal solution or get stuck in local optima, making them insufficient for dealing with the increasing complexity of modern engineering optimization tasks [4,5]. In the complex realm of computational algorithms, symmetry appears as a powerful yet underappreciated concept with broad implications. For instance, in swarm intelligence, the behavioral patterns of social insects like ants and bees inherently show symmetrical properties. Ant colonies display symmetrical foraging patterns around their nests when searching for food. By imitating this symmetry in artificial swarm intelligence algorithms, such as Ant Colony Optimization (ACO), more effective search strategies can be created. ACO algorithms make use of the symmetrical distribution of pheromone trails left by ants to guide the search process, ensuring a balanced exploration of all potential paths. This symmetry-driven method not only speeds up the convergence towards optimal solutions but also lowers the chance of being trapped in local optima.
Evolutionary computation, inspired by biological evolution, also gains a lot from symmetry. Genetic algorithms, a typical case, work based on genetic operators such as crossover and mutation. By incorporating symmetry into the encoding of genetic information, more efficient and effective search spaces can be built. For example, symmetrical chromosome representations can ensure that similar genetic configurations are grouped together, facilitating a more systematic exploration of the solution space. This allows for faster evolution of superior solutions, as the algorithm can more easily recognize and spread beneficial genetic traits. Swarm intelligence optimization algorithms have become a promising alternative. Based on the study of group behavior in nature, these algorithms take inspiration from the collective activities of biological groups like ants, birds, and fish [6]. They imitate behaviors such as ant foraging, bird flocking, or fish schooling. By emulating these natural behaviors, swarm intelligence algorithms use information exchange and cooperation mechanisms among group members [7,8], enabling a more comprehensive exploration of the solution space and a gradual convergence towards the optimal solution for a specific problem. Their unique ability to handle complex and uncertain situations has led to their wide application in solving various complex optimization problems [9]. For example, in path planning for autonomous vehicles or drones, these algorithms can efficiently find the shortest and safest routes while avoiding obstacles. In workshop scheduling, they can optimize resource allocation (such as machines and labor) to minimize production time and costs [10]. These applications show the effectiveness and versatility of swarm intelligence optimization algorithms in addressing real-world complex optimization challenges.
The Sailfish Optimization Algorithm (SFO), put forward by scholar Shadravan in 2019, is a notable addition to the family of swarm intelligence optimization algorithms [11]. This algorithm is cleverly designed based on the hunting dynamics between sailfish (predators) and sardines (prey) in marine ecosystems. In the SFO framework, sailfish are regarded as candidate solutions for the optimization problem, with their positions in the solution space corresponding to potential solutions that are updated iteratively to approach the optimal solution. Sardines, on the other hand, play a crucial role in enhancing the algorithm’s randomness by introducing unpredictability, which helps the algorithm avoid local optima and promotes a more thorough exploration of the solution space [12]. During the optimization process, the SFO algorithm simulates three different behavioral strategies that occur when sailfish groups hunt sardine schools. The first is the population elite strategy, in which the best-performing sailfish (elite solutions) guide the movement of other members of the population. By sharing their information and positions, elite sailfish help the entire group converge more quickly towards promising areas of the solution space. The second is the alternating attack strategy [13], which imitates the way sailfish take turns attacking sardine schools from different directions. Through alternating attack patterns, the algorithm can explore various parts of the solution space, increasing the probability of finding the global optimal solution. The third is the prey capture strategy, which focuses on the actual process of sailfish catching sardines. In an algorithmic context, this represents the exploitation phase, where the algorithm refines identified solutions and seeks a more precise convergence towards the optimal solution. Through these three strategies, the SFO algorithm systematically searches for feasible solutions within the solution space, gradually getting closer to the optimal solution.
The SFO algorithm has been widely applied and studied in many fields. In engineering design, it has been used to optimize the parameters of mechanical systems, electrical circuits, and structural designs, enabling performance improvements while reducing costs. In data mining and machine learning, it has been used for tasks such as feature selection, classifier parameter tuning, and clustering, improving the efficiency and accuracy of these algorithms. These applications have, to a certain extent, verified the flexibility and reliability of the SFO algorithm, showing its potential to become a powerful tool in the optimization field.
The following are this paper’s primary contributions:
  • Propose an OCSFO hybrid algorithm that enhances global exploration capability through OOA and avoids local optima by combining Cauchy mutation;
  • Design a multi-stage adaptive parameter strategy to dynamically balance search accuracy and speed;
  • Validate algorithm performance on 23 benchmark functions and 4 types of engineering problems, outperforming 7 comparison algorithms.

2. Related Work

Analogous to other swarm intelligence optimization algorithms, the SFO algorithm also faces drawbacks, including a tendency to converge to local optima and a sluggish convergence rate. To boost SFO’s optimization capacity and expedite its convergence, many scholars have put forward various enhancements. Reference [14] presented a new DESFO algorithm that combines the Differential Evolution (DE) algorithm with SFO. When utilized for feature selection tasks, this algorithm can effectively pinpoint the most discriminative features and showcases remarkable advantages in both convergence speed and accuracy. Reference [15] proposed an improved ISFO algorithm by integrating an adaptive non-linear iteration factor, Lévy flight strategy, and differential mutation strategy. Experimental results indicate that the ISFO algorithm displays superior robustness and precision on standard test functions. Reference [16] enhanced the SFO algorithm through strategies such as inertia weight adjustment, modification of the global search formula, and adoption of the Lévy flight strategy. This improved variant was successfully applied to select optimal controller nodes from a group of sensor nodes, yielding favorable results. Reference [17] developed an advanced SFO algorithm with enhanced performance by introducing two search strategies: population switching and random mutation. When tested on the CEC2017 benchmark suite and classical engineering problems, this improved algorithm demonstrated notably higher convergence accuracy.
While the above-mentioned improved algorithms have enhanced the optimization performance of the standard SFO to varying degrees, they still encounter challenges such as premature convergence and vulnerability to local optima when tackling complex multi-peak problems. To further enhance SFO’s performance, this paper proposes an Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm (OCSFO). The improvements are carried out in three key aspects:
(1) Initializing the sailfish and sardine populations using the Logistic map to enhance population diversity;
(2) Employing the global exploration strategy of the Osprey Optimization Algorithm during the first stage of the local development phase in sailfish individual position updates, thereby strengthening the algorithm’s global search capability;
(3) Introducing Cauchy mutation to stimulate the activity of sailfish and sardine populations during the prey capture stage, enhancing local exploitation and escaping from local optima.
Subsequently, OCSFO is tested on 23 classical benchmark test functions and four engineering design optimization problems to verify its effectiveness in solving complex optimization tasks.

3. Standard Sailfish Optimization Algorithm

The conventional Sailfish Optimization Algorithm (SFO) comprises two populations. One is the sailfish population, which conducts searches around the current optimal solution's location to enhance the local search capability [18,19]. The other is the sardine population. As the prey of sailfish, the sardine population explores positions within the search space to improve the global search comprehensiveness. Sailfish are capable of searching in multi-dimensional spaces [20]. Thus, sailfish are regarded as candidate solutions, and the number of variables in the problem to be solved corresponds to the dimension of the search space, which is represented as the sailfish's position in the algorithm. The standard Sailfish Optimization Algorithm includes four steps [21]:
(1) Population Initialization
As a population-based meta-heuristic algorithm, the Sailfish Optimization Algorithm assumes that sailfish stand for candidate solutions and that the problem variables correspond to the sailfish's positions in the search space. Accordingly, the population in the solution space is generated randomly. Sailfish can perform searches in one-dimensional, two-dimensional, three-dimensional, or even higher-dimensional spaces, and their position vectors are changeable [22]. In a d-dimensional search space, the current position $SF_i^k \in \mathbb{R}^d$ $(i = 1, 2, \ldots, m)$ of the i-th member in the k-th search is

$$SF_{position} = \begin{bmatrix} SF_{1,1} & SF_{1,2} & \cdots & SF_{1,d} \\ SF_{2,1} & SF_{2,2} & \cdots & SF_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ SF_{m,1} & SF_{m,2} & \cdots & SF_{m,d} \end{bmatrix} \tag{1}$$
where m is the number of sailfish, d is the number of variables, and $SF_{i,j}$ is the value of the i-th sailfish in the j-th dimension. In addition, the fitness of each sailfish is calculated through the fitness function as follows:

$$\text{fitness value of sailfish} = f(\text{sailfish}) = f(SF_1, SF_2, \ldots, SF_m) \tag{2}$$
To evaluate each sailfish, the following matrix shows the fitness values of all solutions:
$$SF_{Fitness} = \begin{bmatrix} f(SF_{1,1}, SF_{1,2}, \ldots, SF_{1,d}) \\ f(SF_{2,1}, SF_{2,2}, \ldots, SF_{2,d}) \\ \vdots \\ f(SF_{m,1}, SF_{m,2}, \ldots, SF_{m,d}) \end{bmatrix} = \begin{bmatrix} F_{SF_1} \\ F_{SF_2} \\ \vdots \\ F_{SF_m} \end{bmatrix} \tag{3}$$
where m is the number of sailfish, $SF_{i,j}$ is the value of the i-th sailfish in the j-th dimension, f is the fitness function, and $SF_{Fitness}$ stores the fitness (objective function) value returned for each sailfish [23]. Each row of the matrix $SF_{position}$ is fed into the fitness function, and its output is the fitness value of the corresponding sailfish in the $SF_{Fitness}$ matrix.
The sardine population is another important component of the Sailfish Optimization Algorithm. The algorithm assumes that the sardine school also swims in the search space. Therefore, the positions of the sardines and their fitness values are utilized as follows:
$$S_{position} = \begin{bmatrix} S_{1,1} & S_{1,2} & \cdots & S_{1,d} \\ S_{2,1} & S_{2,2} & \cdots & S_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ S_{n,1} & S_{n,2} & \cdots & S_{n,d} \end{bmatrix} \tag{4}$$
The parameter n is the number of sardines, Sij is the value of the i-th sardine in the j-th dimension, and the Sposition matrix represents the positions of all sardines [24].
$$S_{Fitness} = \begin{bmatrix} f(S_{1,1}, S_{1,2}, \ldots, S_{1,d}) \\ f(S_{2,1}, S_{2,2}, \ldots, S_{2,d}) \\ \vdots \\ f(S_{n,1}, S_{n,2}, \ldots, S_{n,d}) \end{bmatrix} = \begin{bmatrix} F_{S_1} \\ F_{S_2} \\ \vdots \\ F_{S_n} \end{bmatrix} \tag{5}$$
The parameter n is the number of sardines, $S_{i,j}$ is the value of the i-th sardine in the j-th dimension, f is the objective function, and $S_{Fitness}$ stores the fitness value of each sardine. It is worth noting that the sailfish and sardines are complementary factors in finding solutions. In this algorithm, the sailfish are the main factor, scattered in the search space, while the sardines cooperate to locate the best position in the area. In fact, while searching the space, a sardine may be preyed upon by a sailfish; if a sailfish thereby discovers a solution better than the best obtained so far, it updates its position [25].
(2) Population Elite Strategy
To enable the algorithm to retain the optimal solution of each iteration, a population elite strategy is introduced. In each iteration, the individual with the best fitness value in the sailfish population is selected as the elite sailfish $X_{elite\_SF}^{i}$, and the individual with the best fitness value in the sardine population is selected as the injured sardine $X_{injured\_S}^{i}$, i.e., the sardine most likely to be captured after being attacked by the sailfish [26].
(3) Alternating Attack Strategy
During the process of preying on sardines, sailfish adopt an alternating attack method, which can minimize the energy consumption of sailfish and make it easier to capture sardines. The SFO algorithm simulates this process, and the position update formula of the sailfish is as follows:
$$X_{new\_SF}^{i} = X_{elite\_SF}^{i} - \lambda_i \times \left( rand(0,1) \times \frac{X_{elite\_SF}^{i} + X_{injured\_S}^{i}}{2} - X_{old\_SF}^{i} \right) \tag{6}$$
where $X_{elite\_SF}^{i}$ is the position of the best (elite) sailfish found so far, $X_{injured\_S}^{i}$ is the best position of the injured sardine found so far, $X_{old\_SF}^{i}$ is the current position of the sailfish, rand(0,1) is a random number between 0 and 1, and $\lambda_i$ is a coefficient generated in the i-th iteration as follows [27]:
$$\lambda_i = 2 \times rand(0,1) \times PD - PD \tag{7}$$
where PD is a factor set considering the change in prey density caused by the predation of sailfish during the hunting period. The generation formula is as follows:
$$PD = 1 - \frac{N_{SF}}{N_{SF} + N_{S}} \tag{8}$$
where $N_{SF}$ and $N_{S}$ are the numbers of sailfish and sardines in the current iteration, respectively.
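To make the update concrete, the alternating attack of Formulas (6)-(8) can be expressed as a short vectorized routine. The following Python sketch is illustrative only (not the authors' implementation); the array names sailfish, elite_sf, and injured_s are placeholders introduced here.

```python
import numpy as np

def alternating_attack(sailfish, elite_sf, injured_s, n_sardines):
    """Sketch of the sailfish update in Formulas (6)-(8).

    sailfish  : (m, d) array of current sailfish positions
    elite_sf  : (d,) position of the elite sailfish
    injured_s : (d,) position of the injured sardine
    """
    m, _ = sailfish.shape
    pd = 1.0 - m / (m + n_sardines)                  # prey density, Formula (8)
    lam = 2.0 * np.random.rand(m, 1) * pd - pd       # lambda_i, Formula (7)
    midpoint = (elite_sf + injured_s) / 2.0
    # Formula (6): move each sailfish relative to the elite/injured-sardine midpoint
    return elite_sf - lam * (np.random.rand(m, 1) * midpoint - sailfish)
```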
(4) Prey Capture Strategy
In the SFO algorithm, the random update of the sardine population’s position helps to find the optimal solution in the search space, thereby simulating the process of sailfish capturing sardines. The sardines update their positions in each iteration based on the current attack power of the sailfish and the current best position. The position update formula is as follows [28]:
$$X_{new\_S}^{i} = r \times \left( X_{elite\_SF}^{i} - X_{old\_S}^{i} + AP \right) \tag{9}$$
where $X_{old\_S}^{i}$ is the position of the sardine before the update, $X_{new\_S}^{i}$ is the position after the update, r is a random number between 0 and 1, and AP is the attack power of the sailfish in each iteration, whose value gradually decreases as the number of iterations increases. It is generated as follows:
$$AP = A \times \left( 1 - (2 \times t \times \varepsilon) \right) \tag{10}$$
The parameters A and $\varepsilon$ are coefficients that make the attack power AP decrease from A to 0. The AP parameter helps to balance the exploration and exploitation of the search space. When the attack power of the sailfish is high (i.e., AP ≥ 0.5), the positions of all sardines are updated. When the attack power of the sailfish is low (i.e., AP < 0.5), only α sardines update their positions in β dimensions. The calculation formulas for α and β are as follows:
$$\alpha = N_{S} \times AP \tag{11}$$

$$\beta = d_i \times AP \tag{12}$$

where $d_i$ is the dimension of the sardines and $N_{S}$ is the number of sardines.
When the fitness value of the updated sardine is better than that of the sailfish, the process of the sailfish capturing the sardine is simulated. The position of the sailfish at the current position is replaced with the position of the current sardine, and the position of the replaced sardine is randomized. The formula is as follows:
$$X_{SF}^{i} = X_{S}^{i} \quad \text{if} \ f(S_i) < f(SF_i) \tag{13}$$
where $X_{SF}^{i}$ is the position of the sailfish and $X_{S}^{i}$ is the position of the sardine in the i-th iteration.
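The prey capture stage of Formulas (9)-(13) can be sketched in the same style. The snippet below is a hedged illustration rather than the authors' code; fitness is assumed to be a vectorized objective mapping an (n, d) array of positions to n fitness values, and A and eps follow the parameter values listed in Step 1 below.

```python
import numpy as np

def prey_capture(sardines, sailfish, elite_sf, t, fitness, A=4.0, eps=0.01):
    """Sketch of the sardine update and capture stage, Formulas (9)-(13)."""
    n, d = sardines.shape
    ap = A * (1.0 - 2.0 * t * eps)                        # attack power, Formula (10)
    if ap >= 0.5:
        # strong attack: update every sardine in every dimension, Formula (9)
        sardines = np.random.rand(n, d) * (elite_sf - sardines + ap)
    else:
        alpha = max(1, int(n * ap))                       # sardines to update, Formula (11)
        beta = max(1, int(d * ap))                        # dimensions to update, Formula (12)
        rows = np.random.choice(n, alpha, replace=False)
        cols = np.random.choice(d, beta, replace=False)
        block = sardines[np.ix_(rows, cols)]
        sardines[np.ix_(rows, cols)] = np.random.rand(alpha, beta) * (elite_sf[cols] - block + ap)
    # Formula (13): a sailfish takes the place of a sardine with better (lower) fitness
    k = min(n, len(sailfish))
    better = fitness(sardines[:k]) < fitness(sailfish[:k])
    sailfish[:k][better] = sardines[:k][better]
    return sardines, sailfish
```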
Summarizing the above operating mechanism of the Sailfish Optimization Algorithm, the steps of the SFO algorithm can be simplified as follows:
Step 1: Parameter Setting. Set the sailfish population size M, the sardine population size N, the search space dimension D, the maximum number of iterations Maxiter, and the non-linear factor parameters A = 4 and ε = 0.01 in the algorithm.
Step 2: Population Initialization. Initialize the positions of the sailfish and sardine populations in the search space.
Step 3: Elite Selection. Calculate the fitness values of each sailfish and sardine, record and find the positions and fitness values of the optimal individuals among them. Select the sailfish with the best fitness value X e l i t e _ S F i as the elite sailfish, and select the sardine with the best fitness value X i n j u r e d _ S i as the injured sardine.
Step 4: Start Iteration. Update the positions of the sailfish according to Formula (6) and the positions of the sardines according to Formula (9). Consider the attack power factor. If the attack power AP < 0.5, calculate α and β according to Formulas (11) and (12), and update the positions of some sardines. Otherwise, update the positions of all sardines.
Step 5: Fitness Evaluation. Evaluate the fitness values of the sailfish and sardines and determine whether to perform position replacement according to Formula (13).
Step 6: Global Best Update. Calculate the fitness values of the sailfish and sardines, then record and update the global optimal value and position.
Step 7: Check Iteration Termination Condition. If the current iteration meets the termination condition, stop the iteration and output the global optimal solution. Otherwise, jump to Step 3 to continue the iterative operation.

4. Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm

4.1. Chaotic Map Initialization

The population initialization method has a certain impact on the optimization effect and robustness of swarm intelligence algorithms [29]. Compared with completely random population initialization, introducing an appropriate chaotic sequence for initialization can generate a more uniformly distributed initial population in the search space, making the selected elite sailfish and injured sardines more representative and accurate [30]. This paper optimizes the population initialization using the Logistic map. Its iteration formula is as follows:
$$L_{i+1} = r L_i \left( 1 - L_i \right) \tag{14}$$

where $L_i \in (0,1)$ and r is the control parameter; r = 0.3 in this paper.
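A minimal sketch of this chaotic initialization is given below; it is an illustrative snippet, not the authors' code. The control parameter r is kept as an argument (r = 0.3 as stated above; note that the classical chaotic regime of the Logistic map corresponds to values of r close to 4).

```python
import numpy as np

def logistic_init(pop_size, dim, lb, ub, r=0.3):
    """Sketch of population initialization with the Logistic map, Formula (14).

    lb, ub : scalars or (dim,) arrays giving the lower/upper bounds of the search space.
    r      : Logistic control parameter (r = 0.3 in this paper; the classical
             chaotic regime of the map uses r close to 4).
    """
    chaos = np.empty((pop_size, dim))
    l = np.random.rand(dim)                  # random seeds in (0, 1)
    for i in range(pop_size):
        l = r * l * (1.0 - l)                # Logistic iteration, Formula (14)
        chaos[i] = l
    return lb + chaos * (ub - lb)            # map chaotic sequence into the search range

# Example: 50 sailfish in a 30-dimensional space on [-100, 100]
sailfish0 = logistic_init(50, 30, -100.0, 100.0)
```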

4.2. Osprey Optimization Algorithm

The Osprey Optimization Algorithm (OOA), proposed by Dehghani et al. in 2023 [31], is a swarm intelligence optimization algorithm inspired by the strategy of ospreys catching fish on water [31,32]. That is, after detecting the position of prey, ospreys hunt and kill it and then take the prey to a suitable place to eat. The OOA has the advantages of few parameters and high optimization efficiency. During the local development phase of sailfish individual position update, the global exploration strategy of the OOA in the first stage is adopted to enhance the global search ability of the algorithm [33].
The first-stage global exploration formula of the OOA is used to replace the position update formula of the explorers in the original Sailfish Optimization Algorithm [34]. This compensates for the SFO position update rule, which relies too heavily on the positions of the previous generation of sailfish: each osprey randomly detects the position of one fish and attacks it, and the explorer position update of the SFO algorithm is modified according to this simulated movement of the osprey towards the fish [35]. The first-stage global exploration formula of the OOA is as follows:
$$X_{SF_{i+1, j+1}} = X_{SF_{i,j}} + r_{i,j} \times \left( SF_{i,j} - I_{i,j} \times X_{i,j} \right) \tag{15}$$
where $X_{SF_{i,j}}$ represents the position of the i-th osprey in the j-th dimension, $SF_{i,j}$ represents the fish selected by the i-th osprey, $r_{i,j}$ is a random number in [0, 1], and $I_{i,j}$ is a random number from the set {1, 2} [36,37].
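In code, the first-stage OOA step of Formula (15) is a random move of a sailfish (treated as an osprey) towards a randomly selected "fish", for example the injured sardine or another better individual. The sketch below is illustrative only.

```python
import numpy as np

def ooa_exploration(position, fish):
    """Sketch of the first-stage OOA global exploration step, Formula (15).

    position : (d,) current position of one sailfish (osprey)
    fish     : (d,) position of the selected fish (e.g., a better individual)
    """
    d = position.size
    r = np.random.rand(d)                         # r_{i,j} in [0, 1]
    I = np.random.randint(1, 3, size=d)           # I_{i,j} drawn from {1, 2}
    return position + r * (fish - I * position)   # Formula (15)
```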

4.3. Cauchy Mutation

To address the problem that the Sailfish Optimization Algorithm is prone to getting trapped in local optima, Cauchy mutation is introduced to improve the globally optimal individual, avoiding the algorithm from falling into local optima and enhancing its global optimization ability [38].
The Cauchy distribution has a relatively low peak at the origin and long tails at both ends, so Cauchy mutation can generate large perturbations during the mutation operation [39]. The Cauchy mutation formulas are as follows:
$$X_{SF}^{i+1} = X_{SF}^{i} + X_{SF}^{i} \times Cauchy(0,1) \tag{16}$$

$$X_{S}^{i+1} = X_{S}^{i} + X_{S}^{i} \times Cauchy(0,1) \tag{17}$$

$$Cauchy(0,1) = \tan\left( (rand - 0.5) \times \pi \right) \tag{18}$$
where Cauchy(0,1) is a random number drawn from the standard Cauchy distribution, $X_{SF}^{i}$ and $X_{S}^{i}$ are the positions of the sailfish and sardine before mutation, $X_{SF}^{i+1}$ and $X_{S}^{i+1}$ are the positions of the sailfish and sardine after mutation, and rand is a random number in [0, 1].
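Because the inverse transform in Formula (18) turns a uniform random number into a standard Cauchy sample, the mutation of Formulas (16) and (17) reduces to a couple of lines, as in this illustrative sketch.

```python
import numpy as np

def cauchy_mutation(position):
    """Sketch of the Cauchy mutation, Formulas (16)-(18)."""
    rand = np.random.rand(*position.shape)
    cauchy = np.tan((rand - 0.5) * np.pi)      # standard Cauchy sample, Formula (18)
    return position + position * cauchy        # Formulas (16) and (17)
```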

4.4. Pseudo-Code of the Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm

In summary, the Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm (OCSFO) proposed in this paper can increase the swarm diversity to a certain extent, balance the local and global search abilities of the algorithm, and improve its optimization ability. The pseudo-code of the Algorithm 1 is as follows:
Algorithm 1. Pseudo-code of the OCSFO Algorithm
Start
  • Set algorithm parameters, where m is the number of sailfish and n is the number of sardines.
  • Initialize the sailfish and sardine populations through the Logistic chaotic map in Formula (14).
  • Calculate the fitness values of the sailfish and sardines to obtain the elite sailfish $X_{elite\_SF}^{i}$ and the injured sardine $X_{injured\_S}^{i}$.
  • While (t ≤ Tmax)
  •       for i = 1 to m
  •             Update the position of each sailfish according to Formula (6);
  •       end for
  •             Calculate the attack power AP of the sailfish according to Formula (10).
  •       if AP < 0.5
  •             Calculate α and β according to Formulas (11) and (12), respectively.
  •               Update the positions of the selected sardines according to Formula (9).
  •        else
  •               Update the positions of all sardines according to Formula (9).
  •        end if
  •        Calculate the fitness values of all sardines.
  •        if there is a better sardine position
  •              Update the position of the sailfish with the position of the injured sardine according to Formula (15).
  •              Remove the injured sardine.
  •              Update the positions of the sailfish and sardines according to Formulas (16) and (17).
  •                Calculate the fitness values of the sailfish and sardines, and record and update the global optimal value and position.
  •        end if
  •        t = t + 1;
  • end while
  • Output the fitness array, the global best sailfish position, and the optimal fitness value.
End
The flowchart of the Osprey and Cauchy Mutation integrated sailfish optimization algorithm is shown in Figure 1.
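For readers who prefer code to pseudo-code, the sketches from the previous subsections can be assembled into a compact driver loop. The skeleton below is a hedged reading of Algorithm 1 (using the illustrative helpers logistic_init, alternating_attack, prey_capture, ooa_exploration, and cauchy_mutation defined earlier), not the authors' implementation; the conditional injured-sardine replacement and sardine-removal steps are simplified, and boundary handling is reduced to clipping.

```python
import numpy as np

def ocsfo(fitness, dim, lb, ub, m=50, n=150, max_iter=1000):
    """Skeleton of the OCSFO loop in Algorithm 1, built from the earlier sketches."""
    sailfish = logistic_init(m, dim, lb, ub)               # chaotic initialization, Formula (14)
    sardines = logistic_init(n, dim, lb, ub)
    for t in range(max_iter):
        elite_sf = sailfish[np.argmin(fitness(sailfish))].copy()   # elite sailfish
        injured_s = sardines[np.argmin(fitness(sardines))].copy()  # injured sardine
        sailfish = alternating_attack(sailfish, elite_sf, injured_s, n)          # Formula (6)
        sardines, sailfish = prey_capture(sardines, sailfish, elite_sf, t, fitness)  # (9)-(13)
        # OOA first-stage exploration towards the injured sardine, Formula (15)
        for i in range(m):
            sailfish[i] = ooa_exploration(sailfish[i], injured_s)
        # Cauchy mutation to activate both populations, Formulas (16)-(18)
        sailfish = np.clip(cauchy_mutation(sailfish), lb, ub)
        sardines = np.clip(cauchy_mutation(sardines), lb, ub)
    best = sailfish[np.argmin(fitness(sailfish))]
    return best, float(fitness(best[None, :])[0])

# Example: minimize the sphere function F1 in 30 dimensions
best_x, best_f = ocsfo(lambda x: np.sum(x**2, axis=-1), dim=30, lb=-100.0, ub=100.0)
```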

5. Simulation Experiments and Analysis

5.1. Experimental Environment Setup

In the experiments to verify the performance of the OCSFO algorithm, the experimental environment is set up as follows: The operating system is Windows 10 (64-bit), the processor is Intel(R) Core(TM) i7-10875H with a main frequency of 2.3 GHz, the computer memory is 16 GB, and the simulation software is MATLAB R2020b. To verify the effectiveness and advantages of the OCSFO algorithm, the OCSFO algorithm is compared with the standard SFO algorithm and six other well-known intelligent algorithms: PSO [40], GWO [41], HHO [42], DBO [43], POA [44], and PO [45]. To ensure the fairness of the algorithm comparison, the population size of all algorithms is set to 50, the number of iterations is set to 1000, and the dimension is set to 30. The fitness values obtained from 30 independent runs of each algorithm, including the optimal value (Min), standard deviation (Std), average value (Avg), median value (Median), and worst value (Worse), are used as evaluation indicators. The optimal data in the statistical results are shown in bold.

5.2. Test Functions

This section describes the 23 classic benchmark test functions and the benchmark functions test set used in this paper. Table 1 presents the information of the 23 classic benchmark test functions, including the function expressions, dimensions, search ranges, and optimal values. Among them, F1–F7 are unimodal functions, F8–F13 are multimodal functions, and F14–F23 are fixed-dimension multimodal functions.

5.3. Test Results and Analysis of Classic Benchmark Functions

(1) Test Results and Analysis
Under the above-mentioned experimental environment and parameter settings, each algorithm is used to independently run 30 times on the 23 classic benchmark test functions, and the test results shown in Table 2 are obtained. The optimal values of each index in the table are marked in bold.
The data in the table reveals that OCSFO has delivered robust optimization outcomes across the majority of the 23 classical benchmark test functions. Specifically, in nine test functions—F5, F9, F10, F11, F12, F13, F15, F16, and F21—OCSFO consistently outperformed other algorithms in all evaluated metrics. For most functions, OCSFO successfully identified the theoretical optimal values. Even when other algorithms matched this achievement, OCSFO exhibited notably smaller standard deviations, underscoring its superior convergence accuracy and stability. These results highlight OCSFO’s enhanced capability to balance exploration and exploitation, positioning it as an effective solution for complex optimization tasks.
(2) Wilcoxon Rank-Sum Test
To test the significance of the optimization effect of OCSFO, based on the best values obtained from 30 independent runs of each algorithm, OCSFO is compared with the other seven algorithms through the Wilcoxon rank-sum test. The Wilcoxon rank-sum test is a non-parametric statistical test mainly used to determine whether the medians of two independent samples differ significantly. The null hypothesis is that there is no significant difference between the two algorithms. When the p-value of the Wilcoxon rank-sum test is less than 0.05, the null hypothesis is rejected, that is, there is a significant difference between the two algorithms. When the p-value is greater than 0.05, the null hypothesis cannot be rejected, that is, there is no significant difference between the two algorithms. When the result is NaN, the two algorithms produce equivalent results and cannot be distinguished. The results of the Wilcoxon rank-sum test are shown in Table 3.
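For reproducibility, the test can be carried out with standard statistical software. The following Python snippet, given only as an illustration, uses scipy.stats.ranksums on two synthetic arrays standing in for 30 best values per algorithm (the numbers are placeholders, not the data behind Table 3).

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
ocsfo_best = rng.normal(loc=1e-6, scale=1e-7, size=30)   # placeholder: 30 runs of OCSFO
other_best = rng.normal(loc=1e-3, scale=1e-4, size=30)   # placeholder: 30 runs of a rival

stat, p_value = ranksums(ocsfo_best, other_best)
if np.isnan(p_value):
    print("The two algorithms give equivalent results (NaN).")
elif p_value < 0.05:
    print(f"p = {p_value:.3g}: significant difference between the two algorithms.")
else:
    print(f"p = {p_value:.3g}: no significant difference.")
```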
It can be seen from Table 3 that most of the Wilcoxon rank-sum test results are less than 0.05, indicating that the optimization effect of OCSFO is significantly improved compared with the other algorithms. Overall, the OCSFO algorithm performs considerably better than the other seven algorithms.
(3) Convergence Performance Analysis
To verify the convergence performance of the OCSFO algorithm, this experiment lists the comparison charts of the iterative convergence curves of the OCSFO algorithm and other intelligent optimization algorithms on 23 classic benchmark test functions. The name of the benchmark function is shown at the top of each comparison chart. The convergence curves of 8 comparison algorithms on the classic benchmark test functions are shown in Figure 2.
Comparison of 23 classical function radar charts for 8 algorithms is shown in Figure 3.
As illustrated in Figure 2 and Figure 3, OCSFO demonstrates superior convergence behavior by attaining better solutions in F1, F2, and F5, while closely approaching the theoretical optimal value in F6. For the unimodal functions F3 and F4, although OCSFO does not fully converge to the theoretical optimum, it still achieves substantial improvements in solution accuracy and convergence velocity when juxtaposed with the classical SFO algorithm. This highlights the effectiveness of its hybrid exploration-exploitation mechanism in smooth, unimodal optimization landscapes.
Notably, OCSFO excels in multimodal function optimization. In F9, F10, F11, F12, and F13—characterized by complex, multi-peak structures—it not only converges to the theoretical optimal values but also does so with significantly faster convergence compared to other algorithms. This proficiency underscores its enhanced ability to navigate rugged fitness landscapes and avoid premature stagnation in local optima. Further validation is observed in fixed-dimension multimodal functions (F15, F17, F19, F20, F21, F22, F23), where OCSFO consistently exhibits stronger global search capabilities. The convergence curve of F16 is not shown because of a plotting error. OCSFO's performance across these diverse problem classes demonstrates remarkable adaptability to highly non-linear, multi-peak search spaces, a critical advantage for real-world engineering applications.
Collectively, these results establish that OCSFO surpasses the classical SFO algorithm in both convergence speed and accuracy. When benchmarked against other state-of-the-art comparison algorithms, it demonstrates distinct dominance in handling multimodal and fixed-dimension multimodal challenges, showcasing robust escape mechanisms from local optima, superior convergence precision, and overall optimal performance. The findings solidify OCSFO's utility as a versatile and efficient optimization framework for complex, high-dimensional problems in engineering and machine learning. In response to the limited search ability, slow convergence, and vulnerability to local optima of the baseline SFO algorithm, this paper proposes a sailfish optimization algorithm (OCSFO) that integrates the osprey exploration strategy and Cauchy mutation. The first-stage exploration strategy of the Osprey Optimization Algorithm is used to improve the sailfish position update formula, increasing the ability of SFO to identify promising regions and to escape local optima, while the Cauchy mutation strategy perturbs the sailfish and sardine positions to dynamically update the target position. The comparative experiments show that the algorithm achieves significant improvements in convergence speed, convergence accuracy, the probability of escaping local optima, and global search capability.

5.4. Engineering Problem Optimization Experiments

Solving practical engineering problems is one of the main applications of swarm intelligence algorithms. However, the results in test functions cannot fully reflect the application effect of the algorithm in practical problems. Engineering constraint optimization problems are widely used in various fields. Such problems usually revolve around aspects such as time, quality, and resource consumption. They optimize one or more objectives under a given engineering background while satisfying engineering constraint conditions, maximizing efficiency, minimizing costs, and balancing various indicators. This paper evaluates the practicality of OCSFO by solving four engineering problems: piston rod, three-bar truss, cantilever beam, and topology optimization.
(1) Piston Rod Optimization
The main goal of the piston rod problem is to determine the piston components H(=x1), B(=x2), D(=x3), and X(=x4) that minimize the oil volume, as shown in Figure 4 [46].
The mathematical model of the piston rod design problem is expressed as follows:
(1) Objective Function:
$$f(x) = \frac{1}{4}\pi x_3^2 (L_2 - L_1)$$
(2) Constraint Conditions:
$$g_1(x) = QL\cos\theta - R \times F \le 0$$

$$g_2(x) = Q(L - x_4) - M_{max} \le 0$$

$$g_3(x) = 1.2(L_2 - L_1) - L_1 \le 0$$

$$g_4(x) = \frac{x_3}{2} - x_2 \le 0$$

where

$$R = \frac{\left| -x_4 (x_4 \sin\theta + x_1) + x_1 (x_2 - x_4 \cos\theta) \right|}{\sqrt{(x_4 - x_2)^2 + x_1^2}}$$

$$F = \frac{\pi P x_3^2}{4}, \quad L_1 = \sqrt{(x_4 - x_2)^2 + x_1^2}, \quad L_2 = \sqrt{(x_4 \sin\theta + x_1)^2 + (x_2 - x_4 \cos\theta)^2}$$
(3) Variable Range:
$$0.05 \le x_1, x_2, x_4 \le 500, \qquad 0.05 \le x_3 \le 120$$
(4) Parameter Setting:
$$\theta = 45^{\circ}, \quad Q = 10000\ \text{lbs}, \quad M_{max} = 1.8 \times 10^{6}\ \text{lbs·in}, \quad P = 1500\ \text{psi}$$
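To show how such a constrained model is handed to a metaheuristic, the piston rod problem can be wrapped in a single penalized objective. The sketch below is an illustrative Python formulation under the parameters stated above; the quadratic penalty weight is an assumption, and the lever length L, which is not listed in the parameter setting, is taken as the value 240 commonly used for this problem in the literature.

```python
import numpy as np

THETA, Q, M_MAX, P = np.radians(45.0), 10000.0, 1.8e6, 1500.0

def piston_rod(x, L=240.0, penalty=1e6):
    """Penalized piston rod objective; x = [x1, x2, x3, x4] = [H, B, D, X]."""
    x1, x2, x3, x4 = x
    R = abs(-x4 * (x4 * np.sin(THETA) + x1) + x1 * (x2 - x4 * np.cos(THETA))) \
        / np.sqrt((x4 - x2) ** 2 + x1 ** 2)
    F = np.pi * P * x3 ** 2 / 4.0
    L1 = np.sqrt((x4 - x2) ** 2 + x1 ** 2)
    L2 = np.sqrt((x4 * np.sin(THETA) + x1) ** 2 + (x2 - x4 * np.cos(THETA)) ** 2)
    volume = 0.25 * np.pi * x3 ** 2 * (L2 - L1)          # oil volume to minimize
    g = np.array([
        Q * L * np.cos(THETA) - R * F,                   # g1 <= 0
        Q * (L - x4) - M_MAX,                            # g2 <= 0
        1.2 * (L2 - L1) - L1,                            # g3 <= 0
        x3 / 2.0 - x2,                                   # g4 <= 0
    ])
    return volume + penalty * np.sum(np.maximum(0.0, g) ** 2)   # quadratic penalty (assumed)
```

Any of the compared algorithms can then minimize piston_rod directly over the variable ranges given above; at the solution reported for OCSFO in Table 4, the volume term evaluates to roughly 1.06.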
The optimized results are shown in Table 4.
It can be seen from Table 4, Figure 5 and Figure 6 that compared with the other seven algorithms, OCSFO can find the minimum optimization result of the piston rod problem. The optimal value obtained by OCSFO for the piston rod problem is 1.0574, and the optimal solution is x = [0.05, 1.0081, 2.0162, 500].
(2) Three-Bar Truss Optimization
As shown in Figure 7, the three-bar truss design problem (TBTD) is a problem involving optimization and constraint satisfaction. In the process of solving this problem, it is necessary to adjust two key parameters S1 and S2 to achieve the minimum weight of the truss while satisfying 3 specific constraint conditions [47].
The mathematical model of the three-bar truss design problem is expressed as follows:
(1) Objective Function:
$$f(x) = \left( 2\sqrt{2}\,S_1 + S_2 \right) \times L$$
(2) Constraint Conditions:
$$g_1(x) = \frac{\sqrt{2}\,S_1 + S_2}{\sqrt{2}\,S_1^2 + 2 S_1 S_2} P - \sigma \le 0$$

$$g_2(x) = \frac{S_2}{\sqrt{2}\,S_1^2 + 2 S_1 S_2} P - \sigma \le 0$$

$$g_3(x) = \frac{1}{\sqrt{2}\,S_2 + S_1} P - \sigma \le 0$$
(3) Variable Range:
$$0 \le S_1, S_2 \le 1$$
(4) Parameter Setting:
$$L = 100\ \text{cm}, \quad P = 2\ \text{kN/cm}^2, \quad \sigma = 2\ \text{kN/cm}^2$$
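The same penalty-function pattern applies to the three-bar truss; the snippet below is an illustrative formulation (the penalty weight is again an assumption).

```python
import numpy as np

L_TRUSS, P_LOAD, SIGMA = 100.0, 2.0, 2.0     # cm, kN/cm^2, kN/cm^2

def three_bar_truss(s, penalty=1e6):
    """Penalized three-bar truss objective; s = [S1, S2]."""
    s1, s2 = s
    weight = (2.0 * np.sqrt(2.0) * s1 + s2) * L_TRUSS
    denom = np.sqrt(2.0) * s1 ** 2 + 2.0 * s1 * s2
    g = np.array([
        (np.sqrt(2.0) * s1 + s2) / denom * P_LOAD - SIGMA,    # g1 <= 0
        s2 / denom * P_LOAD - SIGMA,                          # g2 <= 0
        P_LOAD / (np.sqrt(2.0) * s2 + s1) - SIGMA,            # g3 <= 0
    ])
    return weight + penalty * np.sum(np.maximum(0.0, g) ** 2)

# The reported OCSFO solution s = [0.7887, 0.4082] yields a weight of about 263.9
print(three_bar_truss(np.array([0.7887, 0.4082])))
```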
The optimized results are shown in Table 5.
It can be seen from Table 5, Figure 8 and Figure 9 that compared with the other seven algorithms, OCSFO can find a result close to the minimum optimization of the three-bar truss design problem. The optimal value obtained by OCSFO for the three-bar truss design problem is 263.8958, and the optimal solution is s = [0.7887, 0.4082].
(3) Cantilever Beam Optimization
The cantilever beam problem is a structural engineering design example involving the weight optimization of a rectangular-cross-section cantilever beam. One end of the beam is rigidly supported, and the free node of the cantilever is subjected to a vertical force, as shown in Figure 10. The beam consists of 5 hollow squares with a constant thickness. The height (or width) of the square is the decision variable, and the thickness remains fixed [48].
The mathematical model of the cantilever beam design problem is expressed as follows:
(1) Objective Function:
$$f(x) = 0.0624\,(x_1 + x_2 + x_3 + x_4 + x_5)$$
(2) Constraint Conditions:
$$g(x) = \frac{61}{x_1^3} + \frac{37}{x_2^3} + \frac{19}{x_3^3} + \frac{7}{x_4^3} + \frac{1}{x_5^3} - 1 \le 0$$
(3) Variable Range:
$$0.01 \le x_i \le 100, \quad i = 1, \ldots, 5$$
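The cantilever beam model reduces to an even simpler penalized objective, sketched below under the same assumptions.

```python
import numpy as np

def cantilever_beam(x, penalty=1e6):
    """Penalized cantilever beam objective; x = [x1, ..., x5] are the section heights."""
    x = np.asarray(x, dtype=float)
    weight = 0.0624 * np.sum(x)
    g = (61.0 / x[0] ** 3 + 37.0 / x[1] ** 3 + 19.0 / x[2] ** 3
         + 7.0 / x[3] ** 3 + 1.0 / x[4] ** 3 - 1.0)           # g <= 0
    return weight + penalty * max(0.0, g) ** 2

# The reported OCSFO solution gives a weight of about 1.34
print(cantilever_beam([6.0082, 5.3148, 4.49383, 3.5022, 2.1548]))
```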
The optimized results are shown in Table 6.
It can be seen from Table 6, Figure 11 and Figure 12 that compared with the other seven algorithms, OCSFO can find a result close to the minimum optimization of the cantilever beam design problem. The optimal value obtained by OCSFO for the cantilever beam design problem is 1.33996, and the optimal solution is x = [6.0082, 5.3148, 4.49383, 3.5022, 2.1548].
(4) Topology Optimization
The main purpose of the topology optimization problem is to optimize the material layout within a given design search space for a provided set of loads, under constraint conditions related to system performance. This problem is based on the power-law approach, and its mathematical model is expressed as follows [49]:
(1) Objective Function:
$$f(\bar{x}) = U^{T} K U = \sum_{e=1}^{N} (x_e)^{p}\, u_e^{T} k_0 u_e$$
(2) Constraint Conditions:
$$h_1(\bar{x}) = \frac{V(\bar{x})}{V_0} - f = 0$$

$$h_2(\bar{x}) = K U - F = 0$$
(3) Variable Range:
$$0 < \bar{x}_{min} \le x \le 1$$
The optimized results are shown in Table 7.
The average convergence curve of topology optimization problem is shown in Figure 13. The box plot of topology optimization problem is shown in Figure 14.
It can be seen from Table 7, Figure 13 and Figure 14 that compared with the other seven algorithms, OCSFO can find a result close to the minimum optimization of the topology optimization problem. The optimal value obtained by OCSFO for the topology optimization problem is 2.6393.
From the comparison on the four engineering applications, the optimal value of the OCSFO algorithm ranks first among the seven comparison algorithms, indicating that OCSFO has good optimization ability in solving the design problems of piston rods, three-bar trusses, cantilever beams, and topology optimization. Compared with the SFO algorithm, the proposed OCSFO has an enhanced ability to jump out of local optima. With the integration of the osprey strategy and Cauchy mutation, its optimization accuracy is high, its optimization speed is fast, and its stability is strong.

6. Conclusions

The Osprey and Cauchy Mutation Integrated Sailfish Optimization Algorithm (OCSFO) proposed in this paper effectively improves and enhances the optimization performance of the Sailfish Optimization Algorithm (SFO). (1) The sailfish and sardine populations are initialized using the Logistic map. (2) The global exploration strategy of the Osprey Optimization Algorithm in the first stage is adopted during the local development phase of sailfish individual position update to enhance the global search ability of the algorithm. (3) Cauchy mutation is introduced to stimulate the activity of the sailfish and sardine populations during the prey capture stage. Through experimental verification, in the process of solving 23 classic benchmark test functions, the convergence speed, optimization accuracy, and robustness of OCSFO have been significantly improved. At the same time, the results of OCSFO in solving 4 complex engineering design optimization problems show that it has good applicability and superior solution effects in dealing with different types of practical complex optimization problems. In future research, further studies can be carried out on applying the OCSFO algorithm to multi-objective optimization problems.

Author Contributions

Conceptualization, L.C.; validation, L.C.; data curation, C.C.; writing—original draft preparation, Y.Y.; writing—review and editing, Y.Y.; methodology, B.C.; software, Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded in part by the Natural Science Foundation of Zhejiang Province under Grant LY23F010002 and in part by the School-level Scientific Research Project of Wenzhou University of Technology under Grant ky202403.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon request. There are no restrictions on data availability.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Tian, A.Q.; Liu, F.F.; Lv, H.X. Snow Geese Algorithm: A novel migration-inspired meta-heuristic algorithm for constrained engineering optimization problems. Appl. Math. Model. 2024, 126, 327–347. [Google Scholar] [CrossRef]
  2. Wang, G.; Tao, J. New Trends in Symmetry in Optimization Theory, Algorithms and Applications. Symmetry 2024, 16, 284. [Google Scholar] [CrossRef]
  3. Zhong, R.; Yu, J.; Zhang, C.; Munetomo, M. SRIME: A strengthened RIME with Latin hypercube sampling and embedded distance-based selection for engineering optimization problems. Neural Comput. Appl. 2024, 36, 6721–6740. [Google Scholar] [CrossRef]
  4. Wei, F.; Zhang, Y.; Li, J. Multi-strategy-based adaptive sine cosine algorithm for engineering optimization problems. Expert Syst. Appl. 2024, 248, 123444. [Google Scholar] [CrossRef]
  5. Yue, Y.; Cao, L.; Lu, D.; Hu, Z.; Xu, M.; Wang, S.; Li, B.; Ding, H. Review and empirical analysis of sparrow search algorithm. Artif. Intell. Rev. 2023, 56, 10867–10919. [Google Scholar] [CrossRef]
  6. Wang, S.; Yue, Y.; Cai, S.; Li, X.; Chen, C.; Zhao, H.; Li, T. A comprehensive survey of the application of swarm intelligent optimization algorithm in photovoltaic energy storage systems. Sci. Rep. 2024, 14, 17958. [Google Scholar] [CrossRef]
  7. Hu, G.; Huang, F.; Chen, K.; Wei, G. MNEARO: A meta swarm intelligence optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2024, 419, 116664. [Google Scholar] [CrossRef]
  8. Chen, B.; Cao, L.; Chen, C.; Chen, Y.; Yue, Y. A comprehensive survey on the chicken swarm optimization algorithm and its applications: State-of-the-art and research challenges. Artif. Intell. Rev. 2024, 57, 170. [Google Scholar] [CrossRef]
  9. Yue, Y.; Cao, L.; Chen, H.; Chen, Y.; Su, Z. Towards an Optimal KELM Using the PSO-BOA Optimization Strategy with Applications in Data Classification. Biomimetics 2023, 8, 306. [Google Scholar] [CrossRef]
  10. Wang, Q.; Ma, T.; Yang, S.; Yan, F.; Zhao, J. Intelligent rockburst level prediction model based on swarm intelligence optimization and multi-strategy learner soft voting hybrid ensemble. Geomech. Geophys. Geo-Energy Geo-Resour. 2025, 11, 12. [Google Scholar] [CrossRef]
  11. Shadravan, S.; Naji, H.R.; Bardsiri, V.K. The Sailfish Optimizer: A novel nature-inspired metaheuristic algorithm for solving constrained engineering optimization problems. Eng. Appl. Artif. Intell. 2019, 80, 20–34. [Google Scholar] [CrossRef]
  12. Rajamoorthy, R.; Arunachalam, G.; Kasinathan, P.; Devendiran, R.; Ahmadi, P.; Pandiyan, S.; Muthusamy, S.; Panchal, H.; Kazem, H.A.; Sharma, P. A novel intelligent transport system charging scheduling for electric vehicles using Grey Wolf Optimizer and Sail Fish Optimization algorithms. Energy Sources Part A Recovery Util. Environ. Eff. 2022, 44, 3555–3575. [Google Scholar] [CrossRef]
  13. Chen, C.; Cao, L.; Chen, Y.; Yue, Y. A comprehensive survey of convergence analysis of beetle antennae search algorithm and its applications. Artif. Intell. Rev. 2024, 57, 141. [Google Scholar] [CrossRef]
  14. Azzam, S.M.; Emam, O.E.; Abolaban, A.S. An improved Differential evolution with Sailfish optimizer (DESFO) for handling feature selection problem. Sci. Rep. 2024, 14, 13517. [Google Scholar] [CrossRef]
  15. Li, L.; Liu, J.; Tseng, M.L.; Lim, M.K. Accuracy of IGBT junction temperature prediction: An improved sailfish algorithm to optimize support vector machine. IEEE Trans. Power Electron. 2024, 39, 6864–6876. [Google Scholar] [CrossRef]
  16. Rajoriya, M.K.; Gupta, C.P. ISFO-CS: An Improved Sailfish Optimization Algorithm for Controller Selection in SDWSN. Wirel. Pers. Commun. 2025, 140, 299–331. [Google Scholar] [CrossRef]
  17. Peng, F.; Zhong, R.; Fan, Q.; Zhang, C.; Yu, J. Improving Sailfish Optimizer with Population Switching Strategy and Random Mutation Strategy. In Proceedings of the 2023 7th Asian Conference on Artificial Intelligence Technology (ACAIT), Piscataway, NJ, USA, 10–12 November 2023. [Google Scholar]
  18. Kumar, B.S.; Santhi, S.G.; Narayana, S. Sailfish optimizer algorithm (SFO) for optimized clustering in wireless sensor network (WSN). J. Eng. Des. Technol. 2022, 20, 1449–1467. [Google Scholar] [CrossRef]
  19. Ikram, R.M.A.; Dehrashid, A.A.; Zhang, B.; Chen, Z.; Le, B.N.; Moayedi, H. A novel swarm intelligence: Cuckoo optimization algorithm (COA) and SailFish optimizer (SFO) in landslide susceptibility assessment. Stoch. Environ. Res. Risk Assess. 2023, 37, 1717–1743. [Google Scholar] [CrossRef]
  20. Mohammadi, R.; Akleylek, S.; Ghaffari, A. SDN-IoT: SDN-based efficient clustering scheme for IoT using improved Sailfish optimization algorithm. PeerJ Comput. Sci. 2023, 9, e1424. [Google Scholar] [CrossRef]
  21. Nassef, M.G.A.; Hussein, T.M.; Mokhiamar, O. An adaptive variational mode decomposition based on sailfish optimization algorithm and Gini index for fault identification in rolling bearings. Measurement 2021, 173, 108514. [Google Scholar] [CrossRef]
  22. Geetha, P.; Nanda, S.J.; Yadav, R.P. A parallel chaotic sailfish optimization algorithm for estimation of DOA in wireless sensor array. Phys. Commun. 2022, 51, 101536. [Google Scholar]
  23. Amin, S.A.; Alqudah, M.K.S.; Almutairi, S.A.; Almajed, R.; Al Nasar, M.R.; Alkhazaleh, H.A. Optimal extreme learning machine for diagnosing brain tumor based on modified sailfish optimizer. Heliyon 2024, 10, e34050. [Google Scholar] [CrossRef] [PubMed]
  24. Ghosh, K.K.; Ahmed, S.; Singh, P.K.; Geem, Z.W.; Sarkar, R. Improved binary sailfish optimizer based on adaptive β-hill climbing for feature selection. IEEE Access 2020, 8, 83548–83560. [Google Scholar] [CrossRef]
  25. Naji, H.R.; Shadravan, S.; Jafarabadi, H.M.; Momeni, H. Accelerating sailfish optimization applied to unconstrained optimization problems on graphical processing unit. Eng. Sci. Technol. Int. J. 2022, 32, 101077. [Google Scholar]
  26. Kumar, M.; Suman, S. Scheduling in iaas cloud computing environment using sailfish optimization algorithm. Trends Sci. 2022, 19, 4204. [Google Scholar] [CrossRef]
  27. Gutte, V.S.; Parasar, D. Sailfish invasive weed optimization algorithm for multiple image sharing in cloud computing. Int. J. Intell. Syst. 2022, 37, 4190–4213. [Google Scholar] [CrossRef]
  28. Al Duhayyim, M.; Malibari, A.; Dhahbi, S.; Nour, M.K.; Al-Turaiki, I.; Obayya, M.I.; Mohamed, A. Sailfish Optimization with Deep Learning Based Oral Cancer Classification Model. Comput. Syst. Sci. Eng. 2023, 45, 753–767. [Google Scholar] [CrossRef]
  29. Huang, H.; Yao, Z.; Wei, X.; Zhou, Y. Twin support vector machines based on chaotic mapping dung beetle optimization algorithm. J. Comput. Des. Eng. 2024, 11, 101–110. [Google Scholar] [CrossRef]
  30. Jawed, M.S.; Sajid, M. COBLAH: A chaotic OBL initialized hybrid algebraic-heuristic algorithm for optimal S-box construction. Comput. Stand. Interfaces 2025, 91, 103890. [Google Scholar] [CrossRef]
  31. Dehghani, M.; Trojovský, P. Osprey optimization algorithm: A new bio-inspired metaheuristic algorithm for solving engineering optimization problems. Front. Mech. Eng. 2023, 8, 1126450. [Google Scholar] [CrossRef]
  32. Ismaeel, A.A.K.; Houssein, E.H.; Khafaga, D.S.; Abdullah Aldakheel, E.; AbdElrazek, A.S.; Said, M. Performance of osprey optimization algorithm for solving economic load dispatch problem. Mathematics 2023, 11, 4107. [Google Scholar] [CrossRef]
  33. Zhang, Y.; Liu, P. Research on reactive power optimization based on hybrid osprey optimization algorithm. Energies 2023, 16, 7101. [Google Scholar] [CrossRef]
  34. Yuan, Y.; Yang, Q.; Ren, J.; Mu, X.; Wang, Z.; Shen, Q.; Zhao, W. Attack-defense strategy assisted osprey optimization algorithm for PEMFC parameters identification. Renew. Energy 2024, 225, 120211. [Google Scholar] [CrossRef]
  35. Alqahtani, A.H.; Fahmy, H.M.; Hasanien, H.M.; Tostado-Véliz, M.; Alkuhayli, A.; Jurado, F. Parameters estimation and sensitivity analysis of lithium-ion battery model uncertainty based on osprey optimization algorithm. Energy 2024, 304, 132204. [Google Scholar] [CrossRef]
  36. Somula, R.; Cho, Y.; Mohanta, B.K. SWARAM: Osprey optimization algorithm-based energy-efficient cluster head selection for wireless sensor network-based internet of things. Sensors 2024, 24, 521. [Google Scholar] [CrossRef]
  37. Zhou, L.; Liu, X.; Tian, R.; Wang, W.; Jin, G. A modified osprey optimization algorithm for solving global optimization and engineering optimization design problems. Symmetry 2024, 16, 1173. [Google Scholar] [CrossRef]
  38. Chen, T.; Sun, Y.; Chen, H.; Deng, W. Enhanced wild horse optimizer with cauchy mutation and dynamic random search for hyperspectral image band selection. Electronics 2024, 13, 1930. [Google Scholar] [CrossRef]
  39. Wu, L.; Wu, J.; Wang, T. The improved grasshopper optimization algorithm with Cauchy mutation strategy and random weight operator for solving optimization problems. Evol. Intell. 2024, 17, 1751–1781. [Google Scholar] [CrossRef]
  40. Zaini, F.A.; Sulaima, M.F.; Razak, I.A.W.A.; Zulkafli, N.I.; Mokhlis, H. A review on the applications of PSO-based algorithm in demand side management: Challenges and opportunities. IEEE Access 2023, 11, 53373–53400. [Google Scholar] [CrossRef]
  41. Hatta, N.M.; Zain, A.M.; Sallehuddin, R.; Shayfull, Z.; Yusoff, Y. Recent studies on optimisation method of Grey Wolf Optimiser (GWO): A review (2014–2017). Artif. Intell. Rev. 2019, 52, 2651–2683. [Google Scholar] [CrossRef]
  42. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  43. Wang, X.; Wei, Y.; Guo, Z.; Wang, J.; Yu, H.; Hu, B. A sinh–cosh-enhanced DBO Algorithm applied to global optimization problems. Biomimetics 2024, 9, 271. [Google Scholar] [CrossRef] [PubMed]
  44. Rubavathy, A.H.; Sundar, S. Optimizing relay node selection in cooperative wireless body area networks with modified POA. Results Eng. 2024, 24, 103215. [Google Scholar] [CrossRef]
  45. Abdollahzadeh, B.; Khodadadi, N.; Barshandeh, S.; Trojovský, P.; Gharehchopogh, F.S.; El-kenawy, E.S.M.; Abualigah, L.; Mirjalili, S. Puma optimizer (PO): A novel metaheuristic optimization algorithm and its application in machine learning. Clust. Comput. 2024, 27, 5235–5283. [Google Scholar] [CrossRef]
  46. Yang, D.; Zhang, H.; Wang, X.; Wang, F.; Gao, G. Multi-objective prediction and optimization of working condition parameters for piston rod cap seal in Stirling engine. J. Mech. Sci. Technol. 2024, 38, 6817–6839. [Google Scholar] [CrossRef]
  47. Cai, Q.; Ma, J.; Xie, Y.M.; San, B.; Zhou, Y. Topology optimization and diverse truss designs considering nodal stability and bar buckling. J. Constr. Steel Res. 2025, 224, 109128. [Google Scholar] [CrossRef]
  48. Xu, Q.; Gao, A.; Li, Y.; Jin, Y. Design and optimization of piezoelectric cantilever beam vibration energy harvester. Micromachines 2022, 13, 675. [Google Scholar] [CrossRef]
  49. Prathyusha, A.L.R.; Babu, G.R. A review on additive manufacturing and topology optimization process for weight reduction studies in various industrial applications. Mater. Today Proc. 2022, 62, 109–117. [Google Scholar] [CrossRef]
Figure 1. The flowchart of the Osprey and Cauchy Mutation integrated sailfish optimization algorithm.
Figure 2. Convergence curves of 8 comparison algorithms on classic benchmark test functions.
Figure 3. Comparison of 23 classical function radar charts for 8 algorithms.
Figure 4. Piston rod design problem.
Figure 5. Average convergence curve of piston rod design problem.
Figure 6. Box plot of piston rod design problem.
Figure 7. Three-bar truss design problem.
Figure 8. Average convergence curve of three-bar truss design problem.
Figure 9. Box plot of three-bar truss design problem.
Figure 10. Cantilever Beam Design Problem.
Figure 11. Average Convergence Curve of Cantilever Beam Design Problem.
Figure 12. Box Plot of Cantilever Beam Optimization.
Figure 13. Average convergence curve of topology optimization problem.
Figure 14. Box plot of topology optimization problem.
Table 1. Classic benchmark test functions.
Benchmark Test FunctionDScopeOptimal Value
F 1 x = i = 1 n x i 2 30[−100,100]0
F 2 x = i = 1 n x i + i = 1 n x i 30[−10,10]0
F 3 x = i = 1 n j - 1 i x j 2 30[−100,100]0
F 4 x = max i x i , 1 i n 30[−100,100]0
F 5 x = i = 1 n 1 100 x i + 1 x i 2 2 + x i 1 2 30[−30,30]0
F 6 x = i = 1 n x i + 0.5 2 30[−100,100]0
F 7 x = i = 1 n i x i 4 + r a n d o m 0 , 1 30[−1.28,1.28]0
F 8 x = i = 1 n x i sin x i 30[−500,500]−418.9829
× 30
F 9 x = i = 1 n x 1 2 10 cos 2 π x i + 10 30[−5.12,5.12]0
F 10 x = 20 exp 0.2 1 n i = 1 n x i 2                         exp 1 n i = 1 n cos 2 π x i + 20 + e 30[−32,32]0
F 11 x = 1 4000 i = 1 n x i 2 i = 1 n cos x i i + 1 30[−600,600]0
F 12 x = π n 10 sin π y 1 + i = 1 n 1 y i 1 2 1 + 10 sin 2 π y i + 1 + y n 1 2 + i = 1 n u x i , 10 , 100 , 4
y i = 1 + x i + 1 4 u x i , a , k , m = k x i a m x i > a 0 a < x i < a k x i a m x i < a
30[−50,50]0
F 13 x = 0.1 sin 2 3 π x 1 + i = 1 n x i 1 2 1 + sin 2 3 π x i + 1       + x n 1 2 1 + sin 2 2 π x n + i = 1 n u x i , 5 , 100 , 4 30[−50,50]0
F 14 x = 1 500 + j = 1 25 1 j + i = 1 2 x i a i 6 1 2[−65,65]0.998
F 15 x = i = 1 11 a i x 1 b i 2 + b i x 2 b i 2 + b i x 3 + x 4 2 4[−5,5]0.0003
F 16 x = 4 x 1 2 2.1 x 1 4 + 1 3 x 1 6 + x 1 x 2 4 x 2 2 + 4 x 2 4 2[−5,5]−1.0316
F 17 x = x 2 5.1 4 π 2 x 1 2 + 5 π x 1 6 2 + 10 1 1 8 π cos x 1 + 10 2[−5,5]0.398
F 18 x = 1 + x 1 + x 2 + 1 2 19 14 x 1 + 3 x 1 2 14 x 2 + 6 x 1 x 2 + 3 x 2 2                             30 + 2 x 1 3 x 2 2 18 32 x 1 + 12 x 1 2 + 48 x 2 36 x 1 x 2 + 27 x 2 2 2[−2,2]3
F 19 x = i = 1 4 c i exp j = 1 3 a i j x j p i j 2 3[1,3]−3.86
F 20 x = i = 1 4 c i exp j = 1 6 a i j x j p i j 2 6[0,1]−3.32
F 21 x = i = 1 5 X a i X a i T + c i 1 4[0,10]−10.1532
F 22 x = i = 1 7 X a i X a i T + c i 1 4[0,10]−10.4028
F 23 x = i = 1 10 X a i X a i T + c i 1 4[0,10]−10.5363
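For readers who want to reproduce the test setup, the sketch below implements three representative benchmarks from Table 1 (the sphere function F1, Rastrigin F9, and Ackley F10) in Python/NumPy. It is a minimal illustration written from the formulas above, not the authors' code.

```python
# Minimal NumPy sketch of three of the classic benchmarks listed in Table 1.
# Illustration only -- not the authors' implementation.
import numpy as np

def f1_sphere(x):
    # F1: sum of squares; global minimum 0 at x = 0; search range [-100, 100]^30.
    return np.sum(x ** 2)

def f9_rastrigin(x):
    # F9: sum(x_i^2 - 10*cos(2*pi*x_i) + 10); minimum 0 at x = 0; range [-5.12, 5.12]^30.
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def f10_ackley(x):
    # F10: -20*exp(-0.2*sqrt(mean(x^2))) - exp(mean(cos(2*pi*x))) + 20 + e; minimum 0 at x = 0.
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)

if __name__ == "__main__":
    x = np.zeros(30)
    print(f1_sphere(x), f9_rastrigin(x), f10_ackley(x))  # all approximately 0 at the optimum
```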
Table 2. Test Results of Classic Benchmark Test Functions.

Function | Index | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
F1min3.3441 × 10−2801.99155 × 10−222.99388 × 10−871.0018 × 10−1221.6081 × 10−21104.3239 × 10−2418.54219 × 10−88
std2.94539 × 10−963.47061 × 10−192.53038 × 10−821.5237 × 10−1160007.38652 × 10−44
avg5.37753 × 10−971.99967 × 10−198.48845 × 10−835.0714 × 10−1172.0104 × 10−18302.0386 × 10−2171.89194 × 10−44
median3.4111 × 10−1275.21126 × 10−204.84329 × 10−843.5011 × 10−1203.5713 × 10−19702.0016 × 10−2246.9278 × 10−62
worse1.61325 × 10−951.24725 × 10−181.03419 × 10−817.3762 × 10−1165.8315 × 10−18205.9937 × 10−2163.92273 × 10−43
F2min05.87459 × 10−124.24156 × 10−402.7678 × 10−701.4811 × 10−10709.4741 × 10−1260
std1.60456 × 10−529.70142 × 10−100.0002164291.04789 × 10−662.31382 × 10−9603.9694 × 10−1081.0801 × 10−18
avg3.17018 × 10−537.12066 × 10−104.58084 × 10−55.3563 × 10−674.24134 × 10−9707.3245 × 10−1091.97873 × 10−19
median4.45645 × 10−743.60236 × 10−102.90932 × 10−186.24664 × 10−681.077 × 10−10201.3658 × 10−1146.27336 × 10−30
worse8.7847 × 10−524.17213 × 10−90.0011819624.0563 × 10−661.2675 × 10−9502.1748 × 10−1075.91658 × 10−18
F3min2.6835 × 10−1821.15836 × 10−212.96586 × 10−416.38864 × 10−612.8768 × 10−19608.5608 × 10−2385.892 × 10−104
std8.91365 × 10−631.11343 × 10−170.0001315199.66776 × 10−522.0277 × 10−151003.02985 × 10−33
avg2.76715 × 10−636.73265 × 10−183.2458 × 10−52.04556 × 10−523.7021 × 10−15201.8083 × 10−2155.53173 × 10−34
median2.42639 × 10−882.65884 × 10−187.47732 × 10−226.3233 × 10−583.8781 × 10−17605.8082 × 10−2276.46344 × 10−56
worse3.67323 × 10−624.7822 × 10−170.0006591915.2896 × 10−511.1106 × 10−15005.1266 × 10−2141.65952 × 10−32
F4min8.3655 × 10−1794.4523 × 10−125.38933 × 10−323.11965 × 10−401.3125 × 10−10603.1697 × 10−1183.8063 × 10−70
std4.3419 × 10−331.03299 × 10−100.0002206739.94689 × 10−374.92904 × 10−9201.2789 × 10−1056.76258 × 10−21
avg7.9272 × 10−341.29055 × 10−104.029 × 10−54.06407 × 10−379.05483 × 10−9303.2961 × 10−1061.2422 × 10−21
median8.94018 × 10−691.31971 × 10−102.8531 × 10−293.91058 × 10−387.22553 × 10−9802.4877 × 10−1125.70081 × 10−37
worse2.37816 × 10−323.56752 × 10−100.0012086753.98829 × 10−362.70026 × 10−9105.8955 × 10−1053.70471 × 10−20
F5min1.47705 × 10−138.8942656350.4091011025.2273526457.1447 × 10−77.1048338364.680459196.98333 × 10−7
std3.56888 × 10−60.00400181916.451585640.6617733320.0017652350.3722812010.6406156250.000118
avg1.09915 × 10−68.9059477959.5959347926.5140735340.0010168327.9566945565.8457111959.25272 × 10−5
median1.8748 × 10−88.9078624845.2930447476.2529267560.0001768758.0733007416.0117078223.43532 × 10−5
worse1.48447 × 10−58.90911383575.295689938.0509319320.0078728918.7409996347.2155966070.00047854
F6min00.0003456305.41723 × 10−81.99213 × 10−80.0069280339.0545 × 10−82.66931 × 10−9
std9.16978 × 10−330.46577639700.04566081.73622 × 10−50.1191947310.1401069554.98526 × 10−7
avg6.77927 × 10−330.2895515600.008337271.19878 × 10−50.1695670050.0994549983.92906 × 10−7
median00.03938483508.16252 × 10−73.74499 × 10−60.2005997857.40171 × 10−71.7476 × 10−7
worse3.69779 × 10−321.9877278800.2500952896.57409 × 10−50.4747257640.495960642.07102 × 10−6
F7min1.67325 × 10−58.71432 × 10−70.0003000083.19885 × 10−53.12241 × 10−65.0988 × 10−53.79165 × 10−66.10529 × 10−7
std0.0003332897.55217 × 10−50.0049126330.0001766245.51818 × 10−50.0001988365.25525 × 10−51.20955 × 10−5
avg0.0003433997.53558 × 10−50.0069406860.0002614775.22296 × 10−50.000255687.42767 × 10−51.20325 × 10−5
median0.0002671226.36912 × 10−50.0060137680.000221114.09228 × 10−50.0001979276.49739 × 10−58.33384 × 10−6
worse0.0017057050.0003546940.0218213280.0006723680.0002893540.0007666060.0002221764.41103 × 10−5
F8min−1834.513869−2817.437803−3043.369545−3419.928162−4189.828873−4189.731539−3497.347079−3498.406743
std204.9628195317.4633967291.7458352300.5643807213.3000764571.6147412225.1114115330.8162325
avg−1738.133111−2003.274612−2551.758582−2828.250359−4134.372302−3016.963852−3100.172917−2873.832192
median−1861.839447−2014.463201−2511.926621−2844.405696−4189.815049−2929.286086−3151.740096−2882.155091
worse−1242.326831−1463.507948−1899.841891−2272.153418−3237.730163−2156.787094−2545.751369−2231.009906
F9min004.97479528500000
std004.455206700000
avg0011.4088537800000
median0010.4470625400000
worse0020.8941200500000
F10min4.44089 × 10−167.39719 × 10−123.9968 × 10−153.9968 × 10−154.44089 × 10−164.44089 × 10−164.44089 × 10−164.44089 × 10−16
std02.89928 × 10−100.5806938531.34665 × 10−15001.52832 × 10−150
avg4.44089 × 10−163.37035 × 10−100.2801372144.58892 × 10−154.44089 × 10−164.44089 × 10−163.16784 × 10−154.44089 × 10−16
median4.44089 × 10−162.80285 × 10−107.54952 × 10−153.9968 × 10−154.44089 × 10−164.44089 × 10−163.9968 × 10−154.44089 × 10−16
worse4.44089 × 10−161.24863 × 10−91.6462236337.54952 × 10−154.44089 × 10−164.44089 × 10−163.9968 × 10−154.44089 × 10−16
F11min000.01231607300000
std000.0899395360.0211862210000
avg000.108914890.0171029280000
median000.0860971980.0165031350000
worse000.4232610980.1063746940000
F12min4.71163 × 10−323.13749 × 10−64.71163 × 10−321.05488 × 10−78.00945 × 10−80.0128011747.96347 × 10−82.05765 × 10−9
std1.05513 × 10−330.2997930650.0789042010.0084786831.34636 × 10−50.0302231650.0147827143.70291 × 10−7
avg4.76972 × 10−320.1178372470.0207368620.0045988491.06278 × 10−50.0588507390.0103738194.14776 × 10−7
median4.71163 × 10−320.0027410644.71163 × 10−323.91629 × 10−75.12101 × 10−60.0592945975.54316 × 10−73.43779 × 10−7
worse5.09887 × 10−321.2708641950.3110071340.0201195494.82959 × 10−50.1471772160.0407099051.65055 × 10−6
F13min1.34978 × 10−322.98252 × 10−81.34978 × 10−324.12695 × 10−74.03673 × 10−81.19069 × 10−57.91169 × 10−74.48046 × 10−9
std6.83973 × 10−330.0169636960.0033525590.0465911952.70513 × 10−50.2695387040.1956475091.16384 × 10−6
avg1.76476 × 10−320.0084271150.0010987370.0185299312.2722 × 10−50.2501341970.2520290148.41251 × 10−7
median1.34978 × 10−326.77393 × 10−51.34978 × 10−321.5808 × 10−61.1186 × 10−50.0793913190.2090970773.13258 × 10−7
worse4.30801 × 10−320.0538794120.0109873660.2040020368.62787 × 10−50.7339133470.6335668594.08816 × 10−6
F14min0.9980038380.9980038390.9980038380.9980038380.9980038380.9980038380.9980038380.998003838
std5.6292247864.6305866143.3017475764.3234641340.9399394090.8662693103.842302718
avg8.0702009388.1546773073.7170800624.6194990221.2949021561.8263676510.9980038382.689301822
median12.6705058110.283586951.99203092.9821051570.9980038381.9920309020.9980038380.998003838
worse12.6705058112.6705058112.6705058112.670505815.9288451252.9821052220.99800383812.67050581
F15min0.0003074860.0003076040.0003074860.0003074870.0003083550.0003075390.0003074860.000307491
std2.77335 × 10−120.0169501280.0036485910.0069031351.8873 × 10−50.0002831320.0036593540.000279077
avg0.0003074860.0104656050.0012032860.003073470.0003344860.0005582960.000991510.000400032
median0.0003074860.000725610.0003074860.0003075280.000336750.0004775620.0003074860.000307956
worse0.0003074860.0633374580.0203633390.0203633450.0003701940.0013681340.0203633390.001223178
F16min−1.031628453−1.031628453−1.031628453−1.031628453−1.031628453−1.031628413−1.031628453−1.031628453
std6.1849 × 10−164.25893 × 10−76.77522 × 10−166.17499 × 10−94.78468 × 10−113.83802 × 10−65.60825 × 10−162.06062 × 10−11
avg−1.031628453−1.031628321−1.031628453−1.031628448−1.031628453−1.031626805−1.031628453−1.031628453
median−1.031628453−1.031628449−1.031628453−1.03162845−1.031628453−1.031627653−1.031628453−1.031628453
worse−1.031628453−1.031626165−1.031628453−1.031628432−1.031628453−1.031607003−1.031628453−1.031628453
F17min0.3978873580.3978873580.3978873580.3978873640.3978873580.3978874120.3978873580.397887358
std04.1586 × 10−608.80442 × 10−51.06161 × 10−62.16695 × 10-603.87801 × 10−10
avg0.3978873580.3978891760.3978873580.3979040460.397887790.3978893840.3978873580.397887358
median0.3978873580.3978875210.3978873580.3978875010.3978873620.3978886750.3978873580.397887358
worse0.3978873580.3979090120.3978873580.3983701770.3978916370.3978957790.3978873580.39788736
F18min33.00000000233.0000000433.00000092433
std6.85011955615.986527029.32988 × 10−161.30292 × 10−59.91192 × 10−80.0017943411.38973 × 10−152.75566 × 10−9
avg4.87.50000194333.0000117263.0000000353.00145348233.000000001
median33.00000035933.0000055533.00092688933
worse3084.0000092833.0000518143.0000004623.00791376933.000000014
F19min−3.862782148−3.862781197−3.862782148−3.862781731−3.862778497−3.862728529−3.862782148−3.862782144
std2.55376 × 10−150.1418619320.1411331290.0028463850.0013052750.0018735582.35424 × 10−156.02214 × 10−6
avg−3.862782148−3.82557338−3.837014882−3.861242952−3.861881771−3.861755304−3.862782148−3.86277744
median−3.862782148−3.862286892−3.862782148−3.862755524−3.862523531−3.862530211−3.862782148−3.862780186
worse−3.862782148−3.088297089−3.089764163−3.854900391−3.858167242−3.854899807−3.862782148−3.862758912
F20min−3.321995172−3.284905494−3.321995172−3.321994474−3.318005559−3.321991188−3.321995053−3.321995116
std0.0570048880.1784031190.0670122770.0779366210.0778296180.1033185740.0217100660.073466686
avg−3.282364131−3.024451777−3.265993126−3.269285605−3.20538674−3.146998922−3.318029695−3.268011872
median−3.321995172−3.086538211−3.321995172−3.321990654−3.203937452−3.130050979−3.321993579−3.321987974
worse−3.20310205−2.64588878−3.137641726−3.017025651−3.015952667−2.919349668−3.203082465−3.131624713
F21min−10.15319968−10.09031571−10.15319968−10.15311951−10.06137001−10.15283752−10.15319968−10.15319914
std1.293404552.4540177413.6474313962.3356847991.2686048591.2932463971.5555425852.2792714
avg−9.813332882−6.979552287−5.82171248−9.021653477−5.388062568−5.394999561−9.643388668−6.406301399
median−10.15319968−5.05513378−2.682860396−10.15281231−5.054975227−5.055179726−10.15319235−5.055197726
worse−5.055197729−2.611030696−2.630471668−2.906079144−5.053119821−5.05513708−5.055196199−5.055197604
F22min−10.40294057−10.23503117−10.40294057−10.40289971−10.25868268−10.40271314−10.40294057−10.40293969
std1.8377323872.7089972723.6335898540.0002724220.9441867832.4765593280.9704265722.05904534
avg−9.694238049−6.679881053−6.417287907−10.40247051−5.259543486−6.681668563−10.22574147−6.087458056
median−10.40294057−5.087543731−5.108247311−10.4025076−5.087244675−5.087652068−10.40293798−5.087671811
worse−5.087671825−2.477525884−1.837592971−10.40180096−5.085316142−5.08752684−5.087671082−5.087671601
F23min−10.53640982−10.48580348−10.53640982−10.53640047−10.32254704−10.53586194−10.53640982−10.53640957
std1.8697693092.7691609413.797560710.0003740650.9484615631.8691614110.9873427152.326397136
avg−9.815352613−6.233852205−6.031367232−10.53588067−5.300776167−5.849277973−10.35611626−6.390329567
median−10.53640982−5.128399321−3.353284754−10.53596295−5.128100662−5.128461403−10.53639721−5.128480778
worse−5.128480787−1.818691603−2.421734027−10.53462323−5.12455458−5.128372756−5.128480787−5.128480533
Table 3. Results of Wilcoxon Rank-Sum Test for Classic Benchmark Test Functions.

Function | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
F12.36874 × 10−123.0198 × 10−113.01986 × 10−110.0555456935.96731 × 10−91.21178 × 10−125.57265 × 10−103.01986 × 10−11
F21.54899 × 10−123.0198 × 10−113.01986 × 10−110.0050842220.001596884.57359 × 10−121.52917 × 10−55.31383 × 10−10
F32.35461 × 10−133.0198 × 10−113.01986 × 10−113.01986 × 10−111.85673 × 10−91.21178 × 10−123.01986 × 10−112.57212 × 10−7
F45.36541 × 10−133.0198 × 10−113.01986 × 10−112.22727 × 10−90.00052641.21178 × 10−121.74791 × 10−59.06321 × 10−8
F52.31548 × 10−113.0198 × 10−113.01986 × 10−113.01986 × 10−111.20567 × 10−103.01986 × 10−113.01986 × 10−119.91863 × 10−11
F61.05671 × 10−111.6524 × 10−116.25727 × 10−51.65249 × 10−111.65249 × 10−111.65249 × 10−111.65249 × 10−111.65249 × 10−11
F71.36579 × 10−73.8052 × 10−71.32885 × 10−100.4552969071.3111 × 10−80.2643262134.11271 × 10−74.50432 × 10−11
F83.68712 × 10−124.9721 × 10−112.0125 × 10−80.0006911331.32807 × 10−100.1453135370.739395330.003033542
F9111.19898 × 10−1211111
F101.03547 × 10−131.2117 × 10−127.18202 × 10−131.17724 × 10−13111.47323 × 10−91
F11111.21079 × 10−123.45263 × 10−71111
F121.02478 × 10−111.7522 × 10−110.1082351361.75228 × 10−111.75228 × 10−111.75228 × 10−111.75228 × 10−111.75228 × 10−11
F131.15794 × 10−111.8502 × 10−110.0075013131.85028 × 10−111.85028 × 10−111.85028 × 10−111.85028 × 10−111.85028 × 10−11
F140.0365781130.0728287350.0103781060.1927086470.0281044860.0502730173.83001 × 10−80.219304033
F151.36579 × 10−123.0198 × 10−110.0003986623.01986 × 10−113.01986 × 10−113.01986 × 10−115.26501 × 10−53.01986 × 10−11
F161.0009 × 10−111.0149 × 10−110.0013054641.0149 × 10−111.57547 × 10−111.0149 × 10−110.0395759661.0149 × 10−11
F173.65712 × 10−131.2117 × 10−1211.21178 × 10−124.5664 × 10−121.21178 × 10−1211.21178 × 10−12
F181.26985 × 10-93.74455 × 10−90.0239455726.41102 × 10−96.41102 × 10−96.41102 × 10−90.0075081346.41102 × 10−9
F192.36488 × 10−121.2454 × 10−110.0076589351.24547 × 10−111.24547 × 10−111.24547 × 10−110.0076112031.24547 × 10−11
F202.64479 × 10−107.2761 × 10−100.3243880529.47854 × 10−56.62878 × 10−64.65352 × 10−90.0164263549.47854 × 10−5
F211.64121 × 10−122.4905 × 10−100.0293014814.11914 × 10−92.80853 × 10−112.80853 × 10−111.22401 × 10−89.35754 × 10−11
F226.69745 × 10−131.30473 × 10−90.0565695818.23335 × 10−72.64186 × 10−116.15132 × 10−104.47168 × 10−61.93971 × 10−10
F236.35784 × 10−115.9550 × 10−100.0110593838.06341 × 10−72.5446 × 10−118.52183 × 10−116.63732 × 10−72.76514 × 10−10
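Table 3 reports p-values from pairwise Wilcoxon rank-sum tests over the results of repeated independent runs. As a hedged illustration of how such p-values can be obtained, the sketch below compares two synthetic 30-run samples with scipy.stats.ranksums; the sample size, the random stand-in data, and the 0.05 threshold are assumptions for demonstration, not values taken from the paper.

```python
# Hedged sketch: a pairwise Wilcoxon rank-sum test between two algorithms' run results.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Stand-in results: final best fitness of 30 independent runs per algorithm (assumed data).
ocsfo_runs = rng.normal(loc=1e-6, scale=1e-7, size=30)
sfo_runs = rng.normal(loc=1e-3, scale=1e-4, size=30)

stat, p_value = ranksums(ocsfo_runs, sfo_runs)
print(f"rank-sum statistic = {stat:.3f}, p-value = {p_value:.3e}")
# p < 0.05 indicates a statistically significant difference between the two samples.
```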
Table 4. Comparison of Results of Piston Rod Design Problem.

Index | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
min1.0573938883.7053599711.0573938881.0582527432.3062904011.058061471.0573938881.057803626
std71.58905973235.659790867.7040995457.59179142203.82527360.009948544.10374 × 10−120.047900342
avg39.88763899254.364803534.3404611223.26917275212.05472931.0662583721.0573938881.077079603
median1.057393888218.65066971.0573938881.059041966241.81981021.0635582151.0573938881.061981985
worse167.4727301916.011294167.4727301167.6960173941.3318821.1042113541.0573938881.252325321
Table 5. Comparison of Results of Three-Bar Truss Design Problem.

Index | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
min263.8958439263.8964172263.8958446263.8961538263.8958438263.8989862263.8958434263.8960476
std0.0031600760.9785142670.0125795090.0029635360.0829948990.1430381841.73446 × 10−130.064466374
avg263.8974988264.2162079263.8994882263.8995972263.9630128263.9922193263.8958434263.9546643
median263.8963985263.9013246263.8963453263.8985967263.9235141263.9434366263.8958434263.9328776
worse263.912221268.2329229263.9649984263.9104325264.2055554264.5910234263.8958434264.1468713
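Assuming the paper uses the standard three-bar truss formulation from the structural-optimization literature (minimize the truss weight (2√2·x1 + x2)·l with l = 100 under three stress constraints with P = σ = 2), a penalized objective that any of the compared algorithms could minimize can be sketched as below. The penalty coefficient and constraint handling are illustrative assumptions; the formulation is consistent with the ≈263.8958 best values reported in Table 5.

```python
# Hedged sketch of the standard three-bar truss weight-minimization problem.
# The static penalty below is an assumption; the paper's exact constraint treatment is not reproduced.
import numpy as np

P, SIGMA, L = 2.0, 2.0, 100.0  # load, allowable stress, bar length (standard values)

def truss_objective(x):
    x1, x2 = x
    weight = (2.0 * np.sqrt(2.0) * x1 + x2) * L
    g = [
        (np.sqrt(2.0) * x1 + x2) / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P - SIGMA,
        x2 / (np.sqrt(2.0) * x1**2 + 2.0 * x1 * x2) * P - SIGMA,
        1.0 / (np.sqrt(2.0) * x2 + x1) * P - SIGMA,
    ]
    penalty = sum(max(0.0, gi) ** 2 for gi in g) * 1e6  # static penalty (assumed)
    return weight + penalty

# Commonly cited optimum x* = (0.78867513, 0.40824828) gives ~263.8958, matching Table 5.
print(truss_objective(np.array([0.78867513, 0.40824828])))
```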
Table 6. Comparison of Results of Cantilever Beam Design Problem.

Index | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
min1.3399586631.3923752981.3399641061.3399674941.3403240971.3399886981.3399571581.339974185
std4.94849 × 10−50.0735035498.33206 × 10−55.36225 × 10−50.0017826120.0001879782.06533 × 10−50.001696178
avg1.3400133751.5623502161.3400284671.3400297921.3430134011.3402248391.3399805811.342146539
median1.3399987651.5696038891.3399996591.3400111611.3428090251.3401881561.3399756461.341582181
worse1.3401815081.7155420681.3403592481.3402013231.3463520021.3407116771.3400437441.346670449
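Similarly, assuming the usual cantilever beam benchmark (five hollow-section design variables, weight 0.0624·Σxi, and a single deflection constraint), a penalized objective can be sketched as below; the penalty factor is an assumption. Evaluating it at the commonly cited optimum reproduces the ≈1.3400 values reported in Table 6.

```python
# Hedged sketch of the cantilever beam design objective in its usual literature form.
import numpy as np

def cantilever_objective(x):
    # Beam weight, proportional to the sum of the five section variables.
    weight = 0.0624 * np.sum(x)
    # Single deflection constraint of the standard formulation.
    g = (61.0 / x[0]**3 + 37.0 / x[1]**3 + 19.0 / x[2]**3
         + 7.0 / x[3]**3 + 1.0 / x[4]**3 - 1.0)
    return weight + 1e6 * max(0.0, g) ** 2  # static penalty (assumed)

x_star = np.array([6.0160, 5.3092, 4.4943, 3.5015, 2.1527])  # commonly cited optimum
print(cantilever_objective(x_star))  # ~1.3400, in line with the best values in Table 6
```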
Table 7. Comparison of Results of Topology Optimization Problem.

Index | OCSFO | SFO | PSO | GWO | HHO | DBO | POA | PO
min2.6393464972.6393464972.650605162.7499135112.6393464972.6393464972.6393464972.639346497
std000.1204022430.106492184000.4230091320
avg2.6393464972.6393464972.7981783062.8853648182.6393464972.6393464973.058991452.639346497
median2.6393464972.6393464972.7556424852.8472827672.6393464972.6393464972.949082242.639346497
worse2.6393464972.6393464973.11437883.1734237542.6393464972.6393464974.3661276152.639346497