Abstract
The reptile search algorithm is an effective optimization method based on the behavioral laws of the biological world. By modeling and simulating the hunting process of reptiles, good optimization results can be achieved. However, constrained by these natural behaviors, the algorithm easily falls into local optima during the exploration phase. Inspired by the different search fields of organisms flying at different heights, this paper proposes a reptile search algorithm that considers different flight heights. In the exploration phase, the different flight-altitude abilities of two animals, the northern goshawk and the African vulture, are introduced to give the reptiles better search horizons, improve their global search ability, and reduce the probability of falling into local optima during the exploration phase. A novel dynamic factor (DF) is proposed in the exploitation phase to improve the algorithm’s convergence speed and optimization accuracy. To verify the effectiveness of the proposed algorithm, the test results were compared with those of ten state-of-the-art (SOTA) algorithms on thirty-three well-known test functions. The experimental results show that the proposed algorithm performs well. In addition, the proposed algorithm and the ten SOTA algorithms were applied to three practical mechanical engineering problems, and the experimental results show that the proposed algorithm has good problem-solving ability.
1. Introduction
As humans explore the laws of nature more deeply, more and more practical problems have emerged in fields such as control [,], manufacturing [,], economics [,], and physics []. Most of these problems are characterized by large scale, multiple constraints, and discontinuity []. Traditional algorithms usually optimize the objective function through its gradient, a deterministic search strategy, which makes it difficult to use existing traditional methods to solve such problems.
The defining characteristic of most heuristic algorithms is random search, through which a higher probability of reaching the global optimum is obtained []. Because they do not rely on function gradients, heuristic algorithms do not require the objective function to be continuously differentiable, making it possible to optimize objective functions that cannot be handled by gradient descent. Heuristic algorithms can be roughly divided into three categories according to what they imitate: biological habits [,], cognitive thinking [,], and physical phenomena [,]. Among these, owing to the abundance of natural organisms, heuristic algorithms that simulate biological habits are the most widely used, such as the genetic algorithm (GA) [], particle swarm optimization (PSO) [], ant colony optimization (ACO) [], and the grey wolf optimizer (GWO) []. However, according to the no-free-lunch theorem, no single algorithm is suitable for solving all optimization problems []. In recent years, in pursuit of more effective heuristic algorithms, many improved algorithms have emerged, mainly based on strategy improvements and algorithm combinations. Our research team has been committed to obtaining better-performing heuristic algorithms through such combinations, for example, the beetle antenna strategy based on grey wolf optimization [], grey wolf optimization based on the Aquila exploration method (AGWO) [], a hybrid of golden jackal optimization and the golden sine algorithm [], and enhanced snake optimization [].
The reptile search algorithm (RSA) is a novel intelligent optimization algorithm based on crocodile hunting behavior, proposed by Abualigah et al. in 2022 []. The RSA requires few parameter adjustments, has strong optimization stability, and is easy to implement, achieving excellent results in optimization problems. Ervural and Hakli proposed a binary RSA to extend the RSA to binary optimization problems []. Emam et al. proposed an enhanced reptile search algorithm for global optimization and used it to select optimal thresholding values for multilevel image segmentation []. Xiong et al. proposed a dual-scale deep learning model based on ELM-BiLSTM and an improved reptile search algorithm for wind power prediction []. Elkholy et al. proposed an AI-embedded, FPGA-based real-time intelligent energy management system using a multi-objective reptile search algorithm and a gorilla troops optimizer [].
However, every animal has physiological limitations, so algorithms that simulate biological habits have corresponding drawbacks. As a result, the RSA, like other algorithms that simulate biological behavior, suffers from a slow convergence speed and low optimization accuracy and is prone to falling into local optima. This article aims to address these problems by studying the behavioral patterns of organisms in nature. Crocodiles have good hunting ability as land animals, but their low vantage point limits their field of observation. Consequently, their search performance is limited (consistent with the RSA’s slow convergence, low optimization accuracy, and tendency to fall quickly into local optima). Inspired by the different flight heights and search horizons of natural organisms, this article introduces the African vulture optimization algorithm (AVOA) [] and northern goshawk optimization (NGO) [], exploiting the high-altitude advantages of birds for exploration. Considering the large spatial range, the northern goshawk mechanism is used for the high-altitude field and African vulture optimization for the mid- to low-altitude range, while the hunting advantages of crocodiles are retained in the exploitation phase. On this basis, a reptile search algorithm considering different flight heights (FRSA) is proposed.
To verify the effectiveness of the FRSA, a comparison was made with ten SOTA algorithms on two function sets (thirty-three functions) and three engineering design optimization problems, demonstrating significant improvements in both the algorithm’s performance and its practical problem-solving capabilities. The highlights and contributions of this paper are summarized as follows: (1) The reptile search algorithm considering different flight heights is proposed. (2) Wilcoxon rank sum and Friedman tests are used to analyze the statistical data. (3) The FRSA is applied to solve three constrained optimization problems in mechanical fields and compared with ten SOTA algorithms.
The rest of this article is arranged as follows: Section 2 reviews the RSA, and Section 3 provides a detailed introduction to the FRSA, including all the processes of exploration and exploitation. Section 4 describes and analyzes the results of the FRSA and other comparative algorithms on the two sets of functions. Section 5 presents the FRSA’s performance on three practical engineering design problems. Finally, Section 6 provides a summary and outlook.
2. RSA
The RSA is a novel, naturally inspired meta-heuristic optimizer that simulates the hunting behavior of crocodiles. This behavior is divided into two phases: encircling prey (exploration) and hunting (exploitation). Encirclement is achieved through high walking or belly walking, and hunting is achieved through hunting coordination or hunting cooperation.
In each optimization process, the first step is to generate an initial population. In the RSA, the initial population of crocodiles is randomly generated, as described in Equation (1), and the rules for randomly generating populations are shown in Equation (2).
where X denotes the set of randomly generated initial solutions and x_(i,j) represents the position of the i-th solution in the j-th dimension; m denotes the number of candidate solutions, and n denotes the dimension of the given problem.
where rand denotes a random value between 0 and 1, and LB and UB denote the lower and upper bounds of the given problem, respectively.
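As a concrete illustration, the random initialization of Equations (1) and (2) can be sketched in Python as follows. This is a minimal sketch; the function and variable names are illustrative and not taken from the authors’ code.

```python
import numpy as np

def initialize_population(m, n, lb, ub, rng=None):
    """Randomly generate m candidate solutions of dimension n within [lb, ub]."""
    rng = np.random.default_rng() if rng is None else rng
    lb = np.asarray(lb, dtype=float)   # lower bounds (scalar or length-n vector)
    ub = np.asarray(ub, dtype=float)   # upper bounds (scalar or length-n vector)
    # Equation (2): x_(i,j) = rand * (UB_j - LB_j) + LB_j
    return lb + rng.random((m, n)) * (ub - lb)

# Example: 50 crocodiles in a 30-dimensional search space bounded by [-100, 100]
X = initialize_population(50, 30, -100, 100)
```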
The RSA transitions between encirclement (exploration) and hunting (exploitation), and each phase is further divided into two states according to the situation. Therefore, the RSA can be divided into four parts.
During the exploration phase, there are two states: high walking and belly walking. When t ≤ T/4, the crocodile population enters the high walking state, and when T/4 < t ≤ 2T/4, the crocodile population enters the belly walking state. The different states during the exploration phase enable the population to conduct broader searches and find better solutions. The position update rules of the population during the exploration phase are shown in Equation (3).
where Best_j(t) denotes the position of the optimal solution at iteration t in the j-th dimension, T is the maximum number of iterations per experiment, and rand denotes a random value between 0 and 1. η_(i,j) denotes the hunting operator for the j-th dimension of the i-th candidate solution, which can be calculated by Equation (4). R_(i,j) is a scaling function used to reduce the search area, which can be calculated by Equation (5). r1 is a random integer between 1 and m, and ES(t) is an evolutionary factor that takes randomly decreasing values between 2 and −2, which can be calculated by Equation (6).
where ϵ is a small value close to zero, used to prevent the denominator from being zero, r3 is a random integer between −1 and 1, and P_(i,j) represents the percentage difference between the best solution and the current solution in the j-th dimension, which can be calculated by Equation (7).
In the exploitation phase, there are two states based on the hunting behavior of crocodiles: hunting coordination and hunting cooperation. Hunting coordination and cooperation enable the crocodiles to approach the target prey easily, because their intensification effect differs from the encircling mechanism. Therefore, the exploitation search may discover near-optimal solutions after several attempts. When 2T/4 < t ≤ 3T/4, the crocodile population enters the hunting coordination state, and when 3T/4 < t ≤ T, the crocodile population enters the hunting cooperation state. The different states during the exploitation phase help to avoid falling into local optima and to determine the optimal solution. The position update rules of the population during the exploitation phase are shown in Equation (8).
where Best_j(t) denotes the position of the optimal solution at iteration t in the j-th dimension, and rand denotes a random value between 0 and 1. R_(i,j) is a scaling function used to reduce the search area, which can be calculated by Equation (5). ϵ is a small value.
The pseudo-code of the RSA is shown in Algorithm 1.
| Algorithm 1. Pseudo-code of RSA | |
| 1. | Define Dim, UB, LB, Max_Iter(T), Curr_Iter(t), α, β, etc |
| 2. | Initialize the population randomly |
| 3. | while (t < T) do |
| 4. | Evaluate the fitness of each candidate solution |
| 5. | Find the best solution |
| 6. | Update the ES using Equation (6). |
| 7. | for (i = 1 to m) do |
| 8. | for (j = 1 to n) do |
| 9. | Update the η, R, and P values using Equations (4), (5) and (7), respectively. |
| 10. | If (t ≤ T/4) then |
| 11. | Calculate x_(i,j)(t+1) using Equation (3) (high walking) |
| 12. | else if (t ≤ 2T/4 and t > T/4) then |
| 13. | Calculate x_(i,j)(t+1) using Equation (3) (belly walking) |
| 14. | else if (t ≤ 3T/4 and t > 2T/4) then |
| 15. | Calculate x_(i,j)(t+1) using Equation (8) (hunting coordination) |
| 16. | else |
| 17. | Calculate x_(i,j)(t+1) using Equation (8) (hunting cooperation) |
| 18. | end if |
| 19. | end for |
| 20. | end for |
| 21. | t = t + 1 |
| 22. | end while |
| 23. | Return the best solution. |
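To make the four-branch structure of Algorithm 1 more concrete, the sketch below implements one iteration of the RSA position update in Python. The update formulas follow the standard RSA equations as reported in the original paper, so this should be read as an illustrative sketch rather than the authors’ implementation; `alpha` and `beta` are the usual RSA sensitivity parameters, and all names are assumptions.

```python
import numpy as np

def rsa_update(X, best, t, T, lb, ub, alpha=0.1, beta=0.01, rng=None):
    """One iteration of the standard RSA position update (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = X.shape
    eps = 1e-10
    ES = 2.0 * rng.integers(-1, 2) * (1.0 - t / T)      # evolutionary factor, Eq. (6)
    X_new = X.copy()
    for i in range(m):
        for j in range(n):
            P = alpha + (X[i, j] - X[i].mean()) / (best[j] * (ub - lb) + eps)  # Eq. (7)
            eta = best[j] * P                           # hunting operator, Eq. (4)
            r1 = rng.integers(0, m)
            R = (best[j] - X[r1, j]) / (best[j] + eps)  # reduce function, Eq. (5)
            if t <= T / 4:                              # high walking
                X_new[i, j] = best[j] * (-eta) * beta - R * rng.random()
            elif t <= T / 2:                            # belly walking
                X_new[i, j] = best[j] * X[r1, j] * ES * rng.random()
            elif t <= 3 * T / 4:                        # hunting coordination
                X_new[i, j] = best[j] * P * rng.random()
            else:                                       # hunting cooperation
                X_new[i, j] = best[j] - eta * eps - R * rng.random()
    return np.clip(X_new, lb, ub)
```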
3. Proposed FRSA
As a heuristic algorithm, the RSA has achieved good results in solving optimization problems thanks to its novel imitation approach. However, because of the limitations of natural biological behavior, the algorithm still has some drawbacks. Individuals may encounter multiple complex situations during optimization, and the steady decrease of the evolutionary factor does not match the nonlinear optimization behavior required for complex problems. The team collaboration, search scope, and hunting mechanism of the crocodile population are all updated around the current optimal value, and the iterative updating of individuals lacks a mutation mechanism. If the current optimal individual falls into a local optimum, the population quickly aggregates around it, and the algorithm cannot break free from the local extremum.
In this section, to address these shortcomings of the RSA, the FRSA is proposed by introducing different search mechanisms (based on exploration altitude) in the exploration phase of the algorithm and introducing a fluctuation (dynamic) factor in the exploitation phase.
3.1. High-Altitude Search Mechanism (Northern Goshawk Exploration)
The northern goshawk randomly selects prey during the prey identification stage of hunting and quickly attacks it. Due to the random selection of targets in the search space, this stage increases the exploration capability of the NGO algorithm. This stage conducts a global search of the search space to determine the optimal region. At this stage, the behavior of northern goshawks in prey selection and attack is described using Equations (9) and (10).
where P_i is the prey position of the i-th northern goshawk, F_Pi is the objective function value of that prey position, X_i is the position of the i-th northern goshawk, x_(i,j)(t) is its position in the j-th dimension at iteration t, F_i^new is the updated objective function value of the i-th northern goshawk, and I is a random integer equal to 1 or 2.
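A minimal sketch of this prey-selection stage (Equations (9) and (10)) is given below, using the update rule commonly reported for the NGO; the function name, the greedy-selection step, and the use of a user-supplied `objective` callable are illustrative assumptions, not the authors’ code.

```python
import numpy as np

def ngo_prey_selection(X, fitness, objective, lb, ub, rng=None):
    """Phase 1 of the NGO: each hawk attacks a randomly selected prey (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = X.shape
    X_new, fit_new = X.copy(), fitness.copy()
    for i in range(m):
        k = rng.integers(0, m)                    # prey chosen at random from the population
        prey, f_prey = X[k], fitness[k]
        I = rng.integers(1, 3)                    # random integer, 1 or 2
        r = rng.random(n)
        if f_prey < fitness[i]:                   # move towards a better prey
            cand = X[i] + r * (prey - I * X[i])
        else:                                     # move away from a worse prey
            cand = X[i] + r * (X[i] - prey)
        cand = np.clip(cand, lb, ub)
        f_cand = objective(cand)
        if f_cand < fitness[i]:                   # greedy selection, Equation (10)
            X_new[i], fit_new[i] = cand, f_cand
    return X_new, fit_new
```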
3.2. Low-Altitude Search Mechanism (African Vulture Exploration)
Inspired by the rate at which vultures become satiated or starve, mathematical modeling is performed using Equation (11), which can be used to switch between the exploration and exploitation phases. The satiation rate shows a decreasing trend, and this behavior is simulated using Equation (12).
where F represents the satiation (hunger) level of the vultures, t is the current number of iterations, T is the maximum number of iterations, z denotes a random value between −1 and 1, and h denotes a random value between −2 and 2. When |F| ≥ 1, the vultures are in the exploration phase. Based on the living habits of vultures, there are two different search methods in the exploration phase of the African vulture optimization algorithm, as shown in Equation (13).
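The sketch below illustrates the satiation rate and the two exploration moves, written according to the AVOA formulas commonly reported in the literature; the exponent `w`, the selection probability `P1`, and the exact form of Equations (11)–(13) used in this paper may differ, so treat this as an assumption-laden illustration.

```python
import numpy as np

def vulture_satiation(t, T, w=2.5, rng=None):
    """Satiation rate F (Equations (11)-(12)); |F| >= 1 triggers exploration."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.uniform(-1, 1)                       # random value in [-1, 1]
    h = rng.uniform(-2, 2)                       # random value in [-2, 2]
    s = h * (np.sin(np.pi / 2 * t / T) ** w + np.cos(np.pi / 2 * t / T) - 1)
    return (2 * rng.random() + 1) * z * (1 - t / T) + s

def vulture_explore(x, best, F, lb, ub, P1=0.6, rng=None):
    """Two exploration moves of Equation (13), selected with probability P1."""
    rng = np.random.default_rng() if rng is None else rng
    if rng.random() < P1:
        # search around a randomly scaled copy of the best vulture
        return best - np.abs(2 * rng.random() * best - x) * F
    # otherwise relocate randomly within the bounds
    return best - F + rng.random() * ((ub - lb) * rng.random() + lb)
```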
3.3. Novel Dynamic Factor
In the exploitation phase of the RSA, the algorithm lacks random-walk capability, so its convergence speed is slow and its optimization accuracy at this stage is low. Therefore, this paper proposes a new DF that adds a disturbance term on the original basis to improve the random-walk capability of the algorithm in the exploitation stage, enabling the population to explore local regions in small steps, reducing the probability of individuals falling into local extrema under the influence of the fluctuations, and improving the optimization accuracy of the algorithm. The new DF is calculated by Equation (14). The DF curve for 500 iterations is shown in Figure 1.
where t is the current number of iterations, T is the maximum number of iterations, and rand denotes a random value between 0 and 1.
Figure 1.
The dynamic factor graph for 500 iterations.
After adding the disturbance factor, the position update rules of the FRSA during the exploitation phase are shown in Equation (15).
Using the proposed strategies to improve the RSA effectively improves its optimization ability and efficiency. The cooperative hunting mode of the FRSA is shown in Figure 2, the pseudocode of the FRSA is shown in Algorithm 2, and the flowchart of the FRSA is shown in Figure 3.
| Algorithm 2. Pseudo-code of FRSA | |
| 1. | Define Dim, UB, LB, Max_Iter(T), Curr_Iter(t), α, β, etc |
| 2. | Initialize the population randomly |
| 3. | while (t < T) do |
| 4. | Evaluate the fitness of each candidate solution |
| 5. | Find the best solution |
| 6. | Update the ES using Equation (6). |
| 7. | for (i = 1 to m) do |
| 8. | for (j = 1 to n) do |
| 9. | Update the η, R, and P values using Equations (4), (5) and (7), respectively. |
| 10. | if (t ≤ 3T/10) then |
| 11. | Calculate x_(i,j)(t+1) using Equation (10) (northern goshawk exploration) |
| 12. | else if (t ≤ 6T/10 and t > 3T/10) then |
| 13. | Calculate x_(i,j)(t+1) using Equation (13) (African vulture exploration) |
| 14. | else if (t ≤ 8T/10 and t > 6T/10) then |
| 15. | Calculate x_(i,j)(t+1) using Equation (15) (hunting coordination with DF) |
| 16. | else |
| 17. | Calculate x_(i,j)(t+1) using Equation (15) (hunting cooperation with DF) |
| 18. | end if |
| 19. | end for |
| 20. | end for |
| 21. | t = t + 1 |
| 22. | end while |
| 23. | Return best solution. |
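The iteration schedule of Algorithm 2 (northern goshawk exploration for the first 30% of iterations, African vulture exploration up to 60%, and the DF-modified crocodile hunting afterwards) can be summarized by the small sketch below; the function name and phase labels are illustrative, and the per-phase update rules stand in for Equations (10), (13), and (15).

```python
def frsa_phase(t, T):
    """Phase selection of Algorithm 2 at iteration t (illustrative sketch)."""
    if t <= 0.3 * T:
        return "northern goshawk exploration"      # Equation (10): high-altitude search
    elif t <= 0.6 * T:
        return "african vulture exploration"       # Equation (13): lower-altitude search
    elif t <= 0.8 * T:
        return "hunting coordination with DF"      # Equation (15)
    else:
        return "hunting cooperation with DF"       # Equation (15)

# Example: the phase applied at each of 500 iterations
schedule = [frsa_phase(t, 500) for t in range(1, 501)]
```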
Figure 2.
Cooperative hunting mode of FRSA.
Figure 3.
Flowchart of FRSA.
3.4. Computational Time Complexity of the FRSA
In optimizing practical problems, time is an essential consideration in addition to accuracy []. The time complexity of an algorithm is an important indicator of its performance. Therefore, it is necessary to analyze the time complexity of the improved algorithm relative to the original algorithm. The time complexity is mainly determined by the algorithm’s initialization, fitness evaluation, and solution update.
When there are N solutions of dimension n and T iterations, the time complexity of the initialization phase is O(N), and the time complexity of the update phase is O(N × n × T). Therefore, the overall complexity of the RSA is O(N × (n × T + 1)). Compared to the RSA, the FRSA only adds the computation of the dynamic factor, whose cost per iteration is constant, so the time complexity of the FRSA remains O(N × (n × T + 1)). Thus, the FRSA proposed in this article does not increase the time complexity.
4. Analysis of Experiments and Results
4.1. Benchmark Function Sets and Compared Algorithms
This section uses the classic function set and the CEC 2019 set as the benchmark test functions. There are 33 functions in total, including 7 unimodal, 6 multimodal, and 20 fixed-dimensional multimodal functions. Unimodal functions, which have only one extremum, were used to test the exploitation ability of the optimization algorithms, while multimodal functions, which have multiple extrema, were used to test their exploration ability. Finally, the fixed-dimensional functions were used to evaluate the algorithms’ overall capacity for exploration and exploitation. The details of the classic function set are shown in Table 1, and the details of the CEC 2019 set are shown in Table 2.
Table 1.
The classic function set.
Table 2.
The CEC 2019 set.
To better compare the results with other algorithms, this study used ten well-known algorithms as benchmark algorithms, including the GA [], PSO [], ACO [], GWO [], GJO [], SO [], TACPSO [], AGWO [], EGWO [], and the RSA []. These benchmark algorithms have achieved excellent results in function optimization and are often used as benchmark comparison algorithms. The details of the parameter settings for the algorithms are shown in Table 3. To be fair, the setting information for these parameters was taken from the original literature that proposed these algorithms.
Table 3.
Parameter settings for algorithms.
To fairly compare the results of the benchmark algorithms, all algorithms adopted the following unified settings: each algorithm was run independently 30 times, the population size was 50, and the number of iterations was 500; the comparison indicators included the mean, the standard deviation, the p-value of the Wilcoxon rank sum test, and the Friedman test [,]. The best results are displayed in bold. The simulations were carried out on a computer with the following features: Intel(R) Core (TM) i5-9400F CPU @ 2.90 GHz and 16 GB RAM, Windows 10, 64-bit operating system.
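For reproducibility, the statistical protocol described above (30 independent runs, mean and standard deviation, and a pairwise Wilcoxon rank sum test at the 0.05 level) can be carried out as in the following sketch. The `run_algorithm` callable is a hypothetical placeholder returning the best fitness of a single run; only the SciPy `ranksums` test is a real library call.

```python
import numpy as np
from scipy.stats import ranksums

def evaluate(run_algorithm, n_runs=30, pop_size=50, max_iter=500, seed=0):
    """Collect best-fitness values over independent runs and summarize them."""
    rng = np.random.default_rng(seed)
    results = np.array([run_algorithm(pop_size, max_iter, rng) for _ in range(n_runs)])
    return results, results.mean(), results.std()

def compare(results_a, results_b, alpha=0.05):
    """Wilcoxon rank sum test between two algorithms on the same function."""
    stat, p_value = ranksums(results_a, results_b)
    if p_value >= alpha:
        return "="                                # no significant difference
    return "+" if results_a.mean() < results_b.mean() else "-"
```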
4.2. Results Comparison and Analysis
To fully validate the robustness and effectiveness of the algorithm for different dimensional problems, this study adopted three dimensions (30, 100, 500) for the non-fixed dimensional functions (unimodal and multimodal functions).
Table 4 shows the results of the non-fixed dimensional functions in 30 dimensions, including the mean (Mean), standard deviation (Std), and Friedman test of the 11 algorithms. Figure 4 shows the convergence curves of these 11 algorithms on the 13 non-fixed dimensional functions, and Figure 5 is a boxplot of the corresponding results. The boxplots were analyzed from five perspectives: the minimum, lower quartile, median, upper quartile, and maximum. Together, the convergence curves and boxplots characterize the algorithms’ behavior more intuitively and comprehensively. Out of the 13 non-fixed dimensional functions, the FRSA achieved ten optimal values, the highest number among all 11 algorithms. The Friedman value summarizes the overall results obtained by each algorithm across the 13 functions. The FRSA achieved a Friedman value of 2.2115, ranking first, which indicates that the FRSA achieved better results than the other algorithms in 30 dimensions.
Table 4.
Results and comparison of 11 algorithms on 13 classic functions with Dim = 30.
Figure 4.
The convergence curves of the 11 algorithms with Dim = 30.
Figure 5.
Boxplot analysis of classic functions (F1−F13) with Dim = 30.
Table 5 shows the results of the non-fixed dimensional functions in 100 dimensions, including the Mean, Std, and Friedman test of the 11 algorithms. Figure 6 shows the convergence curves of these 11 algorithms on the 13 non-fixed dimensional functions, and Figure 7 is a boxplot of the corresponding results, analyzed from the same five perspectives as before. Out of the 13 non-fixed dimensional functions, the FRSA achieved 11 optimal values, the highest number among all 11 algorithms. For the Friedman test, the FRSA achieved a value of 2.0192, ranking first, which indicates that the FRSA achieved better results than the other algorithms in 100 dimensions.
Table 5.
Results and comparison of 11 algorithms on 13 classic functions with Dim = 100.
Figure 6.
The convergence curves of the 11 algorithms with Dim = 100.
Figure 7.
Boxplot analysis of classic functions (F1−F13) with Dim = 100.
Table 6 shows the results of the non-fixed dimensional functions in 500 dimensions, including the Mean, Std, and Friedman test of the 11 algorithms. Figure 8 shows the convergence curves of these 11 algorithms on the 13 non-fixed dimensional functions, and Figure 9 is a boxplot of the corresponding results, analyzed from the same five perspectives as before. Out of the 13 non-fixed dimensional functions, the FRSA achieved 11 optimal values, the highest number among all 11 algorithms. For the Friedman test, the FRSA achieved a value of 1.9615, ranking first, which indicates that the FRSA achieved better results than the other algorithms in 500 dimensions.
Table 6.
Results and comparison of 11 algorithms on 13 classic functions with Dim = 500.
Figure 8.
The convergence curves of the 11 algorithms with Dim = 500.
Figure 9.
Boxplot analysis of classic functions (F1−F13) with Dim = 500.
Table 7 shows the results of the fixed-dimensional functions, including the Mean, Std, and Friedman test of the 11 algorithms. Figure 10 shows the convergence curves of these 11 algorithms on the 10 fixed-dimensional functions, and Figure 11 is a boxplot of the corresponding results, analyzed from the same five perspectives as before. The FRSA achieved 8 optimal values out of the 10 fixed-dimensional functions, the highest number among all 11 algorithms. For the Friedman test, the FRSA achieved a value of 1.9615, ranking first, which indicates that the FRSA achieved better results than the other algorithms on the fixed-dimensional functions.
Table 7.
Results and comparison of 11 algorithms on 10 classic functions with fixed dimensions.
Figure 10.
The convergence curves of the 11 algorithms with fixed dimensions.
Figure 11.
Boxplot analysis of classic functions (F14−F23) with fixed dimensions.
To compare the results of the FRSA with 10 benchmark algorithms more comprehensively, this article introduces another statistical analysis method, the Wilcoxon rank sum test.
As a non-parametric rank sum hypothesis test, the Wilcoxon rank sum test is frequently used in statistical practice for the comparison of measures of location when the underlying distributions are far from normal or not known in advance []. The purpose of the Wilcoxon rank sum test is to test whether there is a significant difference between two populations that are identical except for the population mean. In view of this, this article uses the Wilcoxon rank sum test to compare the differences among the results of various algorithms.
For the Wilcoxon rank sum test, the significance level was set to 0.05, and the symbols “+”, “=”, and “-” indicate that the performance of the FRSA was superior, similar, and inferior to that of the corresponding algorithm, respectively. In Table 8, results corresponding to “+” have no underline, while “=” and “-” are marked with two different styles of underline. In this way, the adopted algorithms can be evaluated from multiple perspectives. Table 8 shows the rank sum test results between the FRSA and the ten benchmark algorithms.
Table 8.
Statistical analysis results of Wilcoxon rank sum test of classic functions.
In order to better compare the RSA and the FRSA, this study also includes a convergence analysis of the two algorithms, as shown in Figure 12. Figure 12 has five columns, which show the three-dimensional plots of the benchmark functions, the convergence curves of the RSA and FRSA, and the search histories, average fitness values, and trajectories. According to Figure 12, compared to the RSA, the FRSA proposed in this article has better exploration and exploitation capabilities and achieves higher optimization accuracy.
Figure 12.
Convergence analysis between RSA and FRSA.
All functions in the CEC 2019 set have fixed dimensions. This function set is more complex in design and can be used to demonstrate the robustness and universality of the proposed FRSA. Table 9 shows the results of the FRSA and the benchmark algorithms on the CEC 2019 set, including the Mean, Std, and Friedman test of the 11 algorithms, and Table 10 shows the Wilcoxon rank sum test results between the FRSA and the ten benchmark algorithms. According to Table 9, the FRSA achieved optimal values for 4 of the CEC 2019 functions, the highest number among all 11 algorithms. In the Wilcoxon rank sum test, the comparison of the FRSA with the other algorithms yielded a result of 58/18/24. The Friedman value summarizes the overall results of each algorithm across the 10 functions; the FRSA achieved a Friedman value of 3.5500, ranking first. Both statistical methods show that the FRSA achieved better results than the other algorithms on the CEC 2019 functions. Figure 13 shows the convergence curves of the 11 algorithms on the CEC 2019 functions, and Figure 14 presents the results in boxplot form for a more comprehensive view.
Table 9.
Results and comparison of CEC 2019 benchmark functions.
Table 10.
Statistical analysis results of Wilcoxon rank sum test of CEC 2019 functions.
Figure 13.
The convergence curves of the 11 algorithms on CEC 2019 functions.
Figure 14.
Boxplot analysis of CEC2019 benchmark functions.
In this section, the FRSA was compared with ten advanced algorithms on the non-fixed dimensional and fixed-dimensional functions of two different function sets to verify its performance. The results show that the improvement strategies proposed in this article effectively enhance the performance of the original RSA and yield better solutions. The proposed FRSA has strong exploration ability and efficient search capability and can effectively solve optimization problems of different dimensions.
5. Real-World Engineering Design Problems
In this section, the FRSA is used to solve three engineering design problems: pressure vessel design [,], corrugated bulkhead design [,], and welded beam design []. These problems involve multiple variables and multiple constraints; they are significant practical problems, are often used to verify the performance of heuristic algorithms, and have become a vital aspect of the practical application of meta-heuristic algorithms. To verify the performance of the FRSA fairly, this section used the same ten advanced algorithms (GA, PSO, ACO, GWO, GJO, SO, TACPSO, AGWO, EGWO, and RSA) as the function testing section.
5.1. Pressure Vessel Design
A pressure vessel is a closed container that can withstand pressure. Pressure vessels are used pervasively and play an important role in many sectors, such as industry, civil applications, the military industry, and many fields of scientific research. In pressure vessel design, under four constraint conditions, the vessel is required to meet production needs at the lowest total cost. The problem has four variables: the thickness of the shell (Ts), the thickness of the head (Th), the inner radius (R), and the length of the cylindrical section of the vessel, not including the head (L). The mathematical model of the pressure vessel design is as follows:
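The model itself is not reproduced here; the sketch below encodes the widely used standard formulation of the pressure vessel problem (objective coefficients and constraints as commonly reported in the benchmark literature), with a simple static penalty for constraint handling. It is an assumption-based illustration and may differ in detail from the exact model used in this paper.

```python
import numpy as np

def pressure_vessel_cost(x, penalty=1e6):
    """Standard pressure vessel design: x = [Ts, Th, R, L] (illustrative formulation)."""
    Ts, Th, R, L = x
    cost = (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)
    g = [
        -Ts + 0.0193 * R,                                            # shell thickness
        -Th + 0.00954 * R,                                           # head thickness
        -np.pi * R**2 * L - (4.0 / 3.0) * np.pi * R**3 + 1296000.0,  # working volume
        L - 240.0,                                                   # maximum length
    ]
    return cost + penalty * sum(max(0.0, gi) ** 2 for gi in g)

# Typical bounds used in the literature: Ts, Th in [0, 99]; R, L in [10, 200]
```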
The FRSA and the ten other advanced algorithms were used to solve the pressure vessel design problem. The minimum production costs obtained by the 11 algorithms are shown in Table 11. According to Table 11, the result obtained by the FRSA is the best among all 11 algorithms. To better illustrate the optimization process, Figure 15 shows the convergence curves of the 11 algorithms, including the FRSA, together with the corresponding change of each variable, reflecting the differences among the parameters during multi-parameter design. To verify the robustness of the algorithm on this problem, statistical analysis was also conducted, and the results are shown in Table 12. The unit of time is seconds per experiment, i.e., the average running time of each algorithm over a single experiment. In the Wilcoxon rank sum test comparing the FRSA with the other algorithms, the FRSA achieved a result of 9/1/0. The convergence curves and statistical analysis show that the FRSA converged faster, achieved higher accuracy, and had obvious advantages over the other algorithms.
Table 11.
Comparison results of pressure vessel design problem.
Figure 15.
The convergence curves of 11 algorithms for the pressure vessel design problem.
Table 12.
Statistical analysis of pressure vessel design problem.
5.2. Corrugated Bulkhead Design
A corrugated bulkhead is made of a pressed steel plate that is bent so as to replace the function of stiffeners. In the corrugated bulkhead design problem, the minimum weight is sought under six constraint conditions. The problem has four variables: the width, depth, length, and plate thickness. The mathematical model of the corrugated bulkhead design is as follows:
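The six-constraint model is not reproduced here. As a placeholder, the sketch below only shows the generic static-penalty wrapper under which such a constrained weight-minimization model can be passed to the FRSA or any of the benchmark algorithms; `bulkhead_weight` and `bulkhead_constraints` are hypothetical callables standing in for the model of this subsection.

```python
def penalized_objective(weight, constraints, penalty=1e6):
    """Wrap a weight function and inequality constraints g_i(x) <= 0 into one objective."""
    def objective(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return weight(x) + penalty * violation
    return objective

# Usage (hypothetical): objective = penalized_objective(bulkhead_weight, bulkhead_constraints)
```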
The FRSA and the ten other advanced algorithms were used to solve the corrugated bulkhead design problem. The design values obtained by the 11 algorithms are shown in Table 13. According to Table 13, the FRSA achieved the best result among all 11 algorithms. To better illustrate the optimization process, Figure 16 shows the convergence curves of the 11 algorithms, including the FRSA, together with the corresponding change of each variable, reflecting the differences among the parameters during multi-parameter design. To verify the robustness of the algorithm on this problem, statistical analysis was also conducted, and the results are shown in Table 14. In the Wilcoxon rank sum test comparing the FRSA with the other algorithms, the FRSA achieved a result of 9/0/1. The convergence curves and statistical analysis show that the FRSA converged faster, achieved higher accuracy, and had obvious advantages over the other algorithms.
Table 13.
Comparison of the results for the corrugated bulkhead design problem.
Figure 16.
The convergence curves of 11 algorithms for the corrugated bulkhead design problem.
Table 14.
Statistical analysis of corrugated bulkhead design problem.
5.3. Welded Beam Design
A welded beam is a simplified model used for convenient calculation and analysis in material mechanics: a beam is welded onto a rigid support, with one end fixed and the other end free and loaded. This structural engineering design problem seeks the design with the minimum fabrication cost subject to constraints on quantities such as stress, deflection, and buckling. The mathematical description of the welded beam design problem is as follows:
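The detailed model is not reproduced here; the sketch below encodes the welded beam formulation most commonly cited in the benchmark literature (e.g., Coello, 2000), with a static penalty for the seven inequality constraints. Constants, variable names, and constraint forms follow that common formulation and are assumptions relative to the exact model used in this paper.

```python
import numpy as np

def welded_beam_cost(x, penalty=1e6):
    """Commonly used welded beam formulation: x = [h, l, t, b] (illustrative sketch)."""
    h, l, t, b = x
    P, L, E, G = 6000.0, 14.0, 30e6, 12e6            # load, length, elastic and shear moduli
    tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25
    cost = 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)
    tau_p = P / (np.sqrt(2.0) * h * l)               # primary shear stress
    M = P * (L + l / 2.0)
    R = np.sqrt(l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    J = 2.0 * (np.sqrt(2.0) * h * l * (l**2 / 12.0 + ((h + t) / 2.0) ** 2))
    tau_pp = M * R / J                               # secondary shear stress
    tau = np.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * l / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)                 # bending stress
    delta = 4.0 * P * L**3 / (E * t**3 * b)          # end deflection
    Pc = (4.013 * E * np.sqrt(t**2 * b**6 / 36.0) / L**2
          * (1.0 - (t / (2.0 * L)) * np.sqrt(E / (4.0 * G))))  # buckling load
    g = [tau - tau_max, sigma - sigma_max, h - b,
         0.10471 * h**2 + 0.04811 * t * b * (14.0 + l) - 5.0,
         0.125 - h, delta - delta_max, P - Pc]
    return cost + penalty * sum(max(0.0, gi) ** 2 for gi in g)
```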
The FRSA and the ten other advanced algorithms were used to solve the welded beam design problem. The design values obtained by the 11 algorithms are shown in Table 15. According to Table 15, the FRSA achieved the best result among all 11 algorithms. To better illustrate the optimization process, Figure 17 shows the convergence curves of the 11 algorithms, including the FRSA, together with the corresponding change of each variable, reflecting the differences among the parameters during multi-parameter design. To verify the robustness of the algorithm on this problem, statistical analysis was also conducted, and the results are shown in Table 16. In the Wilcoxon rank sum test comparing the FRSA with the other algorithms, the FRSA achieved a result of 9/1/0. The convergence curves and statistical analysis show that the FRSA converged faster, achieved higher accuracy, and had obvious advantages over the other algorithms.
Table 15.
Comparison of the results for the welded beam design problem.
Figure 17.
Convergence curves for the welded beam design problem.
Table 16.
Statistical analysis of welded beam design problem.
6. Conclusions and Future Work
To improve the global optimization ability of the RSA, and inspired by the different search horizons of natural creatures flying at different heights, this paper proposes a reptile search algorithm considering different flight heights based on the original RSA. In the exploration phase, the different flight-altitude abilities of two animals, the northern goshawk and the African vulture, are introduced to give the reptiles better search horizons, improve their global search ability, and reduce the probability of falling into local optima during the exploration phase. In the exploitation phase, a new DF is proposed to improve the algorithm’s convergence speed and optimization accuracy. To evaluate the effectiveness of the proposed FRSA, 33 benchmark functions were used for testing, including 13 non-fixed dimensional functions and 20 fixed-dimensional functions; three different dimensions (30, 100, and 500) were used for the non-fixed dimensional functions. The experimental and statistical results indicate that the FRSA has excellent performance and clear advantages in accuracy, convergence speed, and stability compared to the ten state-of-the-art algorithms. Furthermore, the FRSA was applied to solve three engineering optimization problems, and the results and comparisons prove the algorithm’s effectiveness in solving practical problems.
In summary, the FRSA proposed in this article has good convergence accuracy, fast convergence speed, and good optimization performance. Through the testing of fixed and non-fixed dimensional functions and the validation on practical optimization problems, it has been shown that the proposed method can adapt to a wide range of optimization problems, and its robustness has been verified. Future research will focus on extending the proposed algorithm towards multi-objective optimization in fields such as path planning and workshop scheduling, so that it can deliver greater value in practical applications.
Author Contributions
Conceptualization, L.Y., T.Z. and D.T.; methodology, P.Y. and L.Y.; software, P.Y. and L.Y.; writing—original draft, L.Y.; writing—review & editing, L.Y., T.Z. and D.T.; data curation, G.L. and J.Y.; visualization G.L. and J.Y.; supervision, T.Z. and D.T.; funding acquisition, T.Z. All authors have read and agreed to the published version of the manuscript.
Funding
Guizhou Provincial Science and Technology Projects (Grant No. Qiankehejichu-ZK [2022] General 320), the Growth Project for Young Scientific and Technological Talents in General Colleges and Universities of Guizhou Province (Grant No. Qianjiaohe KY [2022]167), the National Natural Science Foundation (Grant No. 52242703, 72061006), and the Academic New Seedling Foundation Project of Guizhou Normal University (Grant No. Qianshixinmiao-[2021]A30).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Barkhoda, W.; Sheikhi, H. Immigrant imperialist competitive algorithm to solve the multi-constraint node placement problem in target-based wireless sensor networks. Ad Hoc Netw. 2020, 106, 102183. [Google Scholar] [CrossRef]
- Fu, Z.; Wu, Y.; Liu, X. A tensor-based deep LSTM forecasting model capturing the intrinsic connection in multivariate time series. Appl. Intell. 2022, 53, 15873–15888. [Google Scholar] [CrossRef]
- Liao, C.; Shi, K.; Zhao, X. Predicting the extreme loads in power production of large wind turbines using an improved PSO algorithm. Appl. Sci. 2019, 9, 521. [Google Scholar] [CrossRef]
- Wei, J.; Huang, H.; Yao, L.; Hu, Y.; Fan, Q.; Huang, D. New imbalanced bearing fault diagnosis method based on Sample-characteristic Oversampling TechniquE (SCOTE) and multi-class LS-SVM. Appl. Soft Comput. 2021, 101, 107043. [Google Scholar] [CrossRef]
- Shi, J.; Zhang, G.; Sha, J. Jointly pricing and ordering for a multi-product multi-constraint newsvendor problem with supplier quantity discounts. Appl. Math. Model. 2011, 35, 3001–3011. [Google Scholar] [CrossRef]
- Wu, Y.; Fu, Z.; Liu, X.; Bing, Y. A hybrid stock market prediction model based on GNG and reinforcement learning. Expert Syst. Appl. 2023, 228, 120474. [Google Scholar] [CrossRef]
- Sadollah, A.; Choi, Y.; Kim, J.H. Metaheuristic optimization algorithms for approximate solutions to ordinary differential equations. In Proceedings of the 2015 IEEE Congress on Evolutionary Computation (CEC), Sendai, Japan, 25–28 May 2015; pp. 792–798. [Google Scholar]
- Mahdavi, S.; Shiri, M.E.; Rahnamayan, S. Metaheuristics in large-scale global continues optimization: A survey. Inf. Sci. 2015, 295, 407–428. [Google Scholar] [CrossRef]
- Yang, X.-S. Nature-inspired optimization algorithms: Challenges and open problems. J. Comput. Sci. 2020, 46, 101104. [Google Scholar] [CrossRef]
- Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
- Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
- Chou, J.-S.; Nguyen, N.-M. FBI inspired meta-optimization. Appl. Soft Comput. 2020, 93, 106339. [Google Scholar] [CrossRef]
- Askari, Q.; Younas, I.; Saeed, M. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowl.-Based Syst. 2020, 195, 105709. [Google Scholar] [CrossRef]
- Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [Google Scholar] [CrossRef]
- Abedinpourshotorban, H.; Mariyam Shamsuddin, S.; Beheshti, Z.; Jawawi, D.N.A. Electromagnetic field optimization: A physics-inspired metaheuristic optimization algorithm. Swarm Evol. Comput. 2016, 26, 8–22. [Google Scholar] [CrossRef]
- Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the IEEE ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
- Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
- Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
- Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
- Fan, Q.; Huang, H.; Li, Y.; Han, Z.; Hu, Y.; Huang, D. Beetle antenna strategy based grey wolf optimization. Expert Syst. Appl. 2021, 165, 113882. [Google Scholar] [CrossRef]
- Ma, C.; Huang, H.; Fan, Q.; Wei, J.; Du, Y.; Gao, W. Grey wolf optimizer based on Aquila exploration method. Expert Syst. Appl. 2022, 205, 117629. [Google Scholar] [CrossRef]
- Yuan, P.; Zhang, T.; Yao, L.; Lu, Y.; Zhuang, W. A Hybrid Golden Jackal Optimization and Golden Sine Algorithm with Dynamic Lens-Imaging Learning for Global Optimization Problems. Appl. Sci. 2022, 12, 9709. [Google Scholar] [CrossRef]
- Yao, L.; Yuan, P.; Tsai, C.-Y.; Zhang, T.; Lu, Y.; Ding, S. ESO: An enhanced snake optimizer for real-world engineering problems. Expert Syst. Appl. 2023, 230, 120594. [Google Scholar] [CrossRef]
- Abualigah, L.; Elaziz, M.A.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
- Ervural, B.; Hakli, H. A binary reptile search algorithm based on transfer functions with a new stochastic repair method for 0–1 knapsack problems. Comput. Ind. Eng. 2023, 178, 109080. [Google Scholar] [CrossRef]
- Emam, M.M.; Houssein, E.H.; Ghoniem, R.M. A modified reptile search algorithm for global optimization and image segmentation: Case study brain MRI images. Comput. Biol. Med. 2023, 152, 106404. [Google Scholar] [CrossRef] [PubMed]
- Xiong, J.; Peng, T.; Tao, Z.; Zhang, C.; Song, S.; Nazir, M.S. A dual-scale deep learning model based on ELM-BiLSTM and improved reptile search algorithm for wind power prediction. Energy 2023, 266, 126419. [Google Scholar] [CrossRef]
- Elkholy, M.; Elymany, M.; Yona, A.; Senjyu, T.; Takahashi, H.; Lotfy, M.E. Experimental validation of an AI-embedded FPGA-based Real-Time smart energy management system using Multi-Objective Reptile search algorithm and gorilla troops optimizer. Energy Convers. 2023, 282, 116860. [Google Scholar] [CrossRef]
- Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
- Dehghani, M.; Hubálovský, Š.; Trojovský, P. Northern goshawk optimization: A new swarm-based algorithm for solving optimization problems. IEEE Access 2021, 9, 162059–162080. [Google Scholar] [CrossRef]
- Bansal, S. Performance comparison of five metaheuristic nature-inspired algorithms to find near-OGRs for WDM systems. Artif. Intell. Rev. 2020, 53, 5589–5635. [Google Scholar] [CrossRef]
- Chopra, N.; Ansari, M.M. Golden jackal optimization: A novel nature-inspired optimizer for engineering applications. Expert Syst. Appl. 2022, 198, 116924. [Google Scholar] [CrossRef]
- Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 2022, 242, 108320. [Google Scholar] [CrossRef]
- Ziyu, T.; Dingxue, Z. A modified particle swarm optimization with an adaptive acceleration coefficients. In Proceedings of the 2009 Asia-Pacific Conference on Information Processing, Shenzhen, China, 18–19 July 2009; pp. 330–332. [Google Scholar]
- Komathi, C.; Umamaheswari, M. Design of gray wolf optimizer algorithm-based fractional order PI controller for power factor correction in SMPS applications. IEEE Trans. Power Electron. 2019, 35, 2100–2118. [Google Scholar] [CrossRef]
- Fan, Q.; Huang, H.; Chen, Q.; Yao, L.; Yang, K.; Huang, D. A modified self-adaptive marine predators algorithm: Framework and engineering applications. Eng. Comput. 2021, 38, 3269–3294. [Google Scholar] [CrossRef]
- Abualigah, L.; Almotairi, K.H.; Al-qaness, M.A.A.; Ewees, A.A.; Yousri, D.; Elaziz, M.A.; Nadimi-Shahraki, M.H. Efficient text document clustering approach using multi-search Arithmetic Optimization Algorithm. Knowl.-Based Syst. 2022, 248, 108833. [Google Scholar] [CrossRef]
- Rosner, B.; Glynn, R.J.; Ting Lee, M.L. Incorporation of clustering effects for the Wilcoxon rank sum test: A large-sample approach. Biometrics 2003, 59, 1089–1098. [Google Scholar] [CrossRef]
- Yang, X.-S.; Huyck, C.R.; Karamanoğlu, M.; Khan, N. True global optimality of the pressure vessel design problem: A benchmark for bio-inspired optimisation algorithms. Int. J. Bio-Inspired Comput. 2014, 5, 329–335. [Google Scholar] [CrossRef]
- Braik, M.S. Chameleon Swarm Algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Syst. Appl. 2021, 174, 114685. [Google Scholar] [CrossRef]
- Ravindran, A.R.; Ragsdell, K.M.; Reklaitis, G.V. Engineering Optimization: Methods and Applications; Wiley: Hoboken, NJ, USA, 1983. [Google Scholar]
- Bayzidi, H.; Talatahari, S.; Saraee, M.; Lamarche, C.-P. Social Network Search for Solving Engineering Optimization Problems. Comput. Intell. Neurosci. 2021, 2021, 8548639. [Google Scholar] [CrossRef]
- Coello Coello, C.A. Use of a self-adaptive penalty approach for engineering optimization problems. Comput. Ind. 2000, 41, 113–127. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).