Article

Enhanced Remora Optimization Algorithm for Solving Constrained Engineering Optimization Problems

1 School of Information Engineering, Sanming University, Sanming 365004, China
2 Department of Computer and Information Science, Linköping University, SE-581 83 Linköping, Sweden
3 Faculty of Science, Fayoum University, Fayoum 63514, Egypt
4 Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
* Authors to whom correspondence should be addressed.
Mathematics 2022, 10(10), 1696; https://doi.org/10.3390/math10101696
Submission received: 21 April 2022 / Revised: 6 May 2022 / Accepted: 11 May 2022 / Published: 16 May 2022
(This article belongs to the Special Issue Optimisation Algorithms and Their Applications)

Abstract

Remora Optimization Algorithm (ROA) is a recent population-based algorithm that mimics the intelligent traveler behavior of the Remora. However, the performance of ROA is barely satisfactory; it may become stuck in local optima or converge slowly, especially on high-dimensional, complicated problems. To overcome these limitations, this paper develops an improved version of ROA called Enhanced ROA (EROA) using three different techniques: adaptive dynamic probability, SFO with Levy flight, and a restart strategy. The performance of EROA is tested using two different benchmarks and seven real-world engineering problems. The statistical analysis and experimental results show the efficiency of EROA.

1. Introduction

The process of determining the best values of design variables to minimize/maximize a fitness function while fulfilling the requirements of the whole system is known as optimization [1,2]. Optimization problems appear in almost every field, such as engineering, business, and science. Optimization methods can be classified into two large categories: (1) exact algorithms and (2) heuristic and metaheuristic algorithms [3,4,5]. The former category can be considered less applicable and practical, as it requires complicated calculations that take considerable time. In contrast, the latter class (metaheuristic algorithms) shows randomized/stochastic behavior and performs an “educated search decision” toward “wise regions” [6,7].
These days, the optimization field has gained huge interest from many scholars and has become one of the hot topics in computer science, since it appears in many domains such as cloud computing task scheduling [8], face detection [9], power systems [10,11], and engineering problems [12]. The literature contains an enormous number of optimization algorithms, since no single algorithm is able to find the optimal solution to all problems, as stated by the No Free Lunch (NFL) theorem [13]. In other words, an algorithm that finds the optimal solution for one type of problem will fail on other types. This theorem encourages researchers to introduce novel algorithms and to enhance existing ones.
Metaheuristics algorithms can be classified based on their source of inspiration into three sub-classes: (1) Swarm-based, (2) Physics-based, and (3) Evolution-based [14].
Swarm-inspired metaheuristics comprise algorithms that mimic social and biological traits of species, such as mating, division of labor, foraging, navigation, or self-organization [15,16]. Examples of such algorithms include Particle Swarm Optimization (PSO) [17], Ant Colony Optimization [18], Grey Wolf Optimization (GWO) [19], Ant Lion Optimizer (ALO) [20], Whale Optimization Algorithm (WOA) [21], Marine Predator Algorithm (MPA) [22], Salp Swarm Algorithm (SSA) [23], Remora Optimization Algorithm (ROA) [24], Harris Hawks Optimization (HHO) [25], COOT bird [26], Moth-Flame Optimization (MFO) [27], Social Spider Optimization [28], Snake Optimizer [29], Crow Search Algorithm [30], Emperor Penguin Optimizer [31], Virus Colony Search [32], Spotted Hyena Optimizer [33], Aquila Optimizer (AO) [34], and Tunicate Swarm Algorithm [35].
Physics-inspired metaheuristics comprise algorithms inspired by physical laws or phenomena. Examples include Simulated Annealing [36], Big-Bang Big-Crunch [37], Gravitational Search Algorithm (GSA) [38], Lightning Search Algorithm [39], Black Hole Algorithm [40], Sine Cosine Algorithm (SCA) [41], Ray Optimization Algorithm [42], Artificial Electric Field Algorithm [43], Arithmetic Optimization Algorithm (AOA) [44], Multi-Verse Optimizer (MVO) [45], and Henry Gas Solubility Optimization [46].
Evolution-based algorithms are derived from ideas of biological evolution. Examples of such algorithms are Genetic Algorithm (GA) [47], Evolutionary Programming [48], Biogeography Based Optimization [49], Memetic Algorithm [50], Bacterial Foraging Optimization [51], Artificial Algae Algorithm [52], and Monkey King Evolutionary [53].
ROA [24] is a recent metaheuristic population-based algorithm inspired by the Remora’s foraging and parasitic behavior in the oceans. ROA employs several position-updating rules based on different hosts. Zheng et al. [54] developed an improved version of ROA called IROA by using an autonomous foraging mechanism (AFM). Furthermore, Liu et al. [55] developed a modified version of ROA based on Brownian motion and lens opposition-based learning. Vinayaki et al. [56] applied ROA to multilevel image segmentation for detecting retinopathy in fundus images. In this study, an enhanced ROA version called EROA is proposed to improve the performance of the original ROA using three techniques: (1) adaptive dynamic probability, (2) the Sailfish Optimizer (SFO) strategy with Levy flight, and (3) a restart strategy (RS).
The main contribution of this paper can be summarized as follows:
  • An enhanced version of ROA is proposed based on three strategies: adaptive dynamic probability, SFO with Levy flight, and a restart strategy (RS);
  • EROA is tested on 23 functions from CEC2005, 29 functions from CEC2017, and 7 real-world engineering problems;
  • EROA is tested using three dimension sizes (D = 30, 100, and 500);
  • EROA is compared with the original algorithm and six other algorithms.
This paper is organized as follows: Section 2 gives a brief description of the ROA, whereas Section 3 illustrates the operators used (adaptive dynamic probability, Levy flight, and restart strategy) and gives the framework of the proposed algorithm. Section 4 and Section 5 show the experimental results of the proposed algorithm in solving benchmark and constrained engineering problems, whereas Section 6 concludes the paper.

2. Remora Optimization Algorithm (ROA)

ROA is a new metaheuristic optimization algorithm inspired by the Remora, the “intelligent traveler” of the ocean (Figure 1). It mimics the Remora’s parasitism and random host replacement. The Remora can attach to whales and swordfish to learn the effective characteristics of its hosts, so ROA borrows two strategies from WOA and SFO [57]. ROA contains “Free travel” and “Eat thoughtfully” phases, corresponding to the exploration and exploitation stages. The algorithm switches between the two phases through a “one small step try”.
ROA has many advantages such as:
  • Easy-to-implement;
  • Few number of parameters;
  • Good balance between exploration and exploitation.
However, like all other metaheuristic algorithms, it may become stuck in local optima or exhibit slow convergence. A brief description of ROA’s mathematical model is given below.

2.1. Free Travel (Exploration)

2.1.1. Sailfish Optimization (SFO) Strategy

When the Remora attaches to the swordfish, its position can be considered the swordfish’s position. ROA improves the location-update formula based on the elite idea, obtaining the following formula:
$$X(t+1) = X_{Best}(t) - \left( rand \cdot \frac{X_{Best}(t) + X_{rand}(t)}{2} - X_{rand}(t) \right) \tag{1}$$
where t is the current iteration number; XBest(t) represents the best solution obtained so far and Xrand(t) indicates a random position; and rand is a uniformly distributed random number between 0 and 1.
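As an illustration, Equation (1) can be sketched in a few lines of Python; the function name `sfo_update` and the NumPy-based vector form are our own choices for this sketch, not part of the original ROA code:

```python
import numpy as np

def sfo_update(x_best, x_rand, rng=None):
    """SFO-based elite update (Eq. 1): step relative to the midpoint of
    the best solution and a randomly chosen remora's position."""
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(x_best.shape)  # uniform rand in [0, 1) per dimension
    return x_best - (r * (x_best + x_rand) / 2.0 - x_rand)
```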

2.1.2. Experience Attempt

At the same time, Remora continuously takes a small step around the host to accumulate experience to determine whether the host needs to be replaced. The mathematical formula is as follows.
$$X_{att}(t+1) = X(t) + (X(t) - X_{pre}(t)) \cdot randn \tag{2}$$
where Xatt(t + 1) represents a tentative step; Xpre(t) is the position of the previous generation and X(t) indicates the current position; and randn is a random number drawn from the standard normal distribution.
After this “small global” movement, the Remora compares the fitness values of the SFO Strategy f(X) and the experience attempt f(Xatt) to decide whether to change hosts. The position with the smaller fitness value is retained.
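Equation (2) can be sketched as follows (again an illustrative helper, with `experience_attempt` as an assumed name):

```python
import numpy as np

def experience_attempt(x, x_prev, rng=None):
    """Tentative small step (Eq. 2): perturb along the direction of the
    last displacement, scaled by a standard-normal random number."""
    rng = np.random.default_rng() if rng is None else rng
    return x + (x - x_prev) * rng.standard_normal(x.shape)
```

The caller then keeps whichever of f(X) and f(Xatt) is smaller, as described in the text.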

2.2. Eat Thoughtfully (Exploitation)

2.2.1. Whale Optimization Algorithm (WOA) Strategy

When Remora attaches to the whale, the position update formula of remora is described as follows.
$$X(t+1) = Dist \cdot e^{\alpha} \cdot \cos(2 \pi \alpha) + X(t) \tag{3}$$
$$\alpha = rand \cdot (a - 1) + 1 \tag{4}$$
$$a = -\left( 1 + \frac{t}{T} \right) \tag{5}$$
$$Dist = \left| X_{Best}(t) - X(t) \right| \tag{6}$$
where T is the maximum number of iterations, Dist indicates the distance between the best position and the current position, α is a random number in [−1, 1], and a linearly decreases from −1 to −2.
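Equations (3)–(6) can be sketched together as one update step (a minimal illustration; the function name and vectorized form are our own):

```python
import numpy as np

def woa_update(x, x_best, t, T, rng=None):
    """WOA-style spiral update (Eqs. 3-6)."""
    rng = np.random.default_rng() if rng is None else rng
    a = -(1.0 + t / T)                      # Eq. (5): decreases linearly from -1 to -2
    alpha = rng.random() * (a - 1.0) + 1.0  # Eq. (4)
    dist = np.abs(x_best - x)               # Eq. (6): distance to the best position
    return dist * np.exp(alpha) * np.cos(2.0 * np.pi * alpha) + x  # Eq. (3)
```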

2.2.2. Host Feeding

“Host feeding” is a small step in the exploitation process, which creates a solution space that converges gradually around the host, refining and enhancing the ability of local optimization. This stage can be mathematically modeled as:
$$X(t+1) = X(t) + A \tag{7}$$
$$A = B \cdot (X(t) - C \cdot X_{Best}(t)) \tag{8}$$
$$B = 2 \cdot V \cdot rand - V \tag{9}$$
$$V = 2 \left( 1 - \frac{t}{T} \right) \tag{10}$$
where A denotes a small step movement related to the volume space of the host and the Remora, and the factor C is a constant equal to 0.1, used to narrow the Remora’s search range.
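Equations (7)–(10) can be sketched as a single local move (an illustrative helper; note that the step scale V shrinks linearly to zero over the iterations):

```python
import numpy as np

def host_feeding(x, x_best, t, T, C=0.1, rng=None):
    """Host-feeding local move (Eqs. 7-10)."""
    rng = np.random.default_rng() if rng is None else rng
    V = 2.0 * (1.0 - t / T)                  # Eq. (10): volume factor
    B = 2.0 * V * rng.random(x.shape) - V    # Eq. (9): B in [-V, V]
    A = B * (x - C * x_best)                 # Eq. (8): small step
    return x + A                             # Eq. (7)
```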
It is worth noting that a random integer argument H (0 or 1) is used to decide whether to choose the WOA Strategy or the SFO Strategy. The pseudo-code of ROA is shown in Algorithm 1.
Algorithm 1 Pseudo-code of ROA
1:   Set initial values of the population size N and the maximum number of iterations T
2:   Initialize positions of the population Xi (i = 1, 2, 3, ..., N)
3:   Initialize the best solution Xbest and corresponding best fitness f(Xbest)
4:   While t < T do
5:     Calculate the fitness value of each Remora
6:     Check if any search agent goes beyond the search space and amend it
7:     Update a, α, V, and H
8:     For each Remora indexed by i do
9:       If H(i) = 0 then
10:        Update the position using Equation (3)
11:      Elseif H(i) = 1 then      
12:        Update the position using Equation (1)
13:      Endif
14:      Make a one-step prediction by Equation (2)
15:      Compare fitness values to judge whether host replacement is necessary
16:      If the host is not replaced, Equation (7) is used as the host feeding mode for Remora
17:    End for
18:  End while
19:  Return Xbest

3. The Proposed Approach

As a newly proposed algorithm, ROA has achieved good results on some test functions. However, experiment results show that it still has the defects of insufficient global exploration and local optimum stagnation. The global exploration is implemented by the SFO Strategy and the “small global” movement experience attempt. The lack of global exploration capacity can be attributed to the deficient SFO Strategy. Thus, adaptive dynamic probability and Levy flight are utilized to improve the global search ability in this work. Meanwhile, a restart strategy is added to help the algorithm escape from local optima.
To the best of our knowledge, this is the first time the following three operators have been combined with ROA.

3.1. Adaptive Dynamic Probability

As mentioned above, H is used to decide whether to choose the WOA Strategy or the SFO Strategy; that is, H determines whether to explore or exploit the search space. However, H is a random integer number, which means that the probability of exploration and exploitation is the same, whether during early or late iterations. This is not in line with our desire to focus on exploration in the early stage and on exploitation in the later stage for the optimization algorithm. Thus, the adaptive dynamic probability of H is designed as follows:
$$\begin{cases} p(H = 0) = \dfrac{t}{T} \\[4pt] p(H = 1) = 1 - \dfrac{t}{T} \end{cases} \tag{11}$$
where p denotes the probability that H takes the value 0 or 1. Clearly, as the number of iterations increases, the probability of H taking 0 increases while the probability of it taking 1 decreases; thus, the chance that an individual exploits grows while the chance that it explores shrinks.
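Sampling H according to Equation (11) is a one-liner (a minimal sketch; `sample_H` is an assumed name):

```python
import numpy as np

def sample_H(t, T, rng=None):
    """Adaptive dynamic probability (Eq. 11): P(H = 0) = t/T, so the
    exploitation branch (H = 0) is chosen more often as iterations grow."""
    rng = np.random.default_rng() if rng is None else rng
    return 0 if rng.random() < t / T else 1
```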

3.2. Sailfish Optimization (SFO) Strategy with Levy Flight

Levy flight is a stochastic strategy widely used in optimization algorithms. It has a relatively high probability of large strides in random walking, which can effectively improve the randomness of the algorithm. To further enhance the exploration ability of the method, Levy flight is integrated into the formula of the SFO Strategy, which is described as follows:
$$X(t+1) = X_{Best}(t) - \left( rand \cdot \frac{X_{Best}(t) + X_{rand}(t)}{2} - X_{rand}(t) \right) \cdot Levy(D) \tag{12}$$
$$Levy(D) = 0.01 \times \frac{u \times \sigma}{\left| \upsilon \right|^{\frac{1}{\beta}}} \tag{13}$$
$$\sigma = \left( \frac{\Gamma(1 + \beta) \times \sin\left( \frac{\pi \beta}{2} \right)}{\Gamma\left( \frac{1 + \beta}{2} \right) \times \beta \times 2^{\left( \frac{\beta - 1}{2} \right)}} \right)^{\frac{1}{\beta}} \tag{14}$$
where Levy represents the Levy flight function, and D is the dimension size of the problem. u and v are random values between 0 and 1, and β is a constant number equal to 1.5.
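Equations (13)–(14) can be sketched as follows (u and v are drawn uniformly from [0, 1) as the text states; the name `levy` is our own):

```python
import math
import numpy as np

def levy(dim, beta=1.5, rng=None):
    """Levy flight step vector (Eqs. 13-14) with scaling factor 0.01."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = ((math.gamma(1 + beta) * math.sin(math.pi * beta / 2))
             / (math.gamma((1 + beta) / 2) * beta
                * 2 ** ((beta - 1) / 2))) ** (1 / beta)   # Eq. (14)
    u, v = rng.random(dim), rng.random(dim)               # random values in [0, 1)
    return 0.01 * u * sigma / np.abs(v) ** (1.0 / beta)   # Eq. (13)
```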

3.3. Restart Strategy (RS)

Restart schemes can help worse individuals jump out of local optima, so they are used to prevent the population from stagnating. Zhang et al. [58] proposed an RS with a trial vector recording the number of times the position of an individual has not been improved. If the position of the ith individual is not improved in the current search, the trial value of this individual is increased by 1; otherwise, the trial value is reset to zero. If the trial value is not less than the predefined Limit, the position is replaced by the location with the better fitness value from Equations (15) and (16).
$$X(t+1) = lb + rand \cdot (ub - lb) \tag{15}$$
$$X(t+1) = rand \cdot (ub + lb) - X(t) \tag{16}$$
where lb and ub are the lower and upper bounds of the problem, respectively. In this paper, we replace Equation (16) with Equation (17), taken from the random opposition-based learning (ROL) strategy [59], to obtain an opposite position. The better solution generated from Equations (15) and (17) is adopted if the trial value is not less than the Limit.
$$X(t+1) = (ub + lb) - rand \cdot X(t) \tag{17}$$
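The restart step can be sketched as follows (an illustrative helper; the names are our own, and `fitness` is any objective to be minimized):

```python
import numpy as np

def restart(x, fitness, lb, ub, rng=None):
    """Restart strategy: generate a uniform re-initialization (Eq. 15) and a
    random-opposition position (Eq. 17), then keep the fitter candidate."""
    rng = np.random.default_rng() if rng is None else rng
    cand1 = lb + rng.random(x.shape) * (ub - lb)   # Eq. (15)
    cand2 = (ub + lb) - rng.random(x.shape) * x    # Eq. (17), ROL
    return cand1 if fitness(cand1) < fitness(cand2) else cand2
```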

3.4. The Proposed EROA

EROA is proposed to combine the above three strategies. The overall process of the EROA is similar to ROA, except that the update method of H is replaced by adaptive dynamic probability, the SFO Strategy integrates Levy flight, and the RS is added at the end. The pseudo-code of EROA is given in Algorithm 2, and the summarized flowchart is illustrated in Figure 2.
Algorithm 2 Pseudo-code of EROA
1:   Set initial values of the population size N and the maximum number of iterations T
2:   Initialize positions of the population Xi (i = 1, 2, 3, ..., N)
3:   Initialize the best solution Xbest and corresponding best fitness f(Xbest)
4:   While t < T do
5:     Calculate the fitness value of each Remora
6:     Check if any search agent goes beyond the search space and amend it
7:     Update a, α, and V
8:     Update H based on Equation (11)
9:     For each Remora indexed by i do
10:      If H(i) = 0 then
11:        Update the position using Equation (3)
12:      Elseif H(i) = 1 then      
13:        Update the position using Equation (12)
14:      End if
15:      Make a one-step prediction by Equation (2)
16:      Compare fitness values to judge whether host replacement is necessary
17:      If the host is not replaced, Equation (7) is used as the host feeding mode for Remora
18:      Update trial(i) for remora
19:      If trial(i) >= Limit
20:        Generate positions using Equations (15) and (17), respectively
21:        Compare fitness values to choose the position with better fitness value
22:      End if
23:    End for
24:  End while
25:  Return Xbest

4. Numerical Experiment Results

In this section, two different types of benchmark functions are used to evaluate the performance of the EROA. First, experiments on 23 standard benchmark functions are carried out to evaluate the performance of EROA in solving simple numerical optimization problems. Then, the CEC2017 test suite, including 29 benchmark functions, is utilized to evaluate the performance of EROA in solving complex numerical problems. The EROA is compared with seven well-known metaheuristic methods, including ROA, AO, AOA, HHO, WOA, SCA, and STOA [60]. We set the population size N = 30, dimension size D = 30/100/500, the maximum number of iterations T = 500, and run 30 times independently for all algorithms. The parameter settings of each algorithm are shown in Table 1. All experiments are carried out in MATLAB R2016a on a PC with Intel (R) Core (TM) i7-9700 CPU @ 3.00 GHz and RAM 8 GB memory on OS Windows 10.

4.1. Experiments on Standard Benchmark Functions

Here, the EROA performance is tested using 23 mathematical benchmark functions: seven unimodal, six multimodal, and ten fixed-dimension multimodal functions. The mathematical description of each type is given in Table 2, Table 3 and Table 4, where Fun refers to the mathematical function, D to the number of dimensions, Range to the interval of the search space, and fmin to the optimal value the corresponding function can achieve.
Table 5 shows the results of the introduced algorithm and its competitors; the parameter settings of each algorithm are illustrated in Table 1. From Table 5, it can be seen that EROA ranked first on 19 of the 23 functions. On the unimodal functions it ranked first on 5 of 7, whereas on the multimodal ones it achieved the best results on 4 of 6. On the fixed-dimension multimodal functions, it achieved the best results on all functions. Figure 3 shows the convergence curves for all functions; from this figure, it can be noticed that EROA converges faster than its competitors.
To test the scalability of the proposed algorithm, we repeat the experiments on F1–F13 using two additional dimensions, D = 100 and D = 500, as shown in Table 5. The convergence curves for these dimensions are shown in Figure 4 and Figure 5.
Furthermore, a non-parametric test, the Wilcoxon rank-sum test, is used at a 5% significance level to make a fair comparison between EROA and the other algorithms over the independent runs. Table 6 shows the results of this test; it can be seen that the p-values for almost all functions are less than 0.05.
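As a concrete illustration of how such a comparison can be computed, the sketch below runs the rank-sum test on two hypothetical samples of 30 best-fitness values (the sample data and variable names are invented for illustration; `scipy.stats.ranksums` implements the Wilcoxon rank-sum test):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Hypothetical best-fitness values from 30 independent runs of two algorithms
eroa_runs = rng.normal(loc=0.001, scale=0.0005, size=30)
roa_runs = rng.normal(loc=0.010, scale=0.0050, size=30)

stat, p = ranksums(eroa_runs, roa_runs)
significant = p < 0.05  # 5% significance level, as in the paper
```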

4.2. Experiments on CEC2017 Test Suite

In this subsection, the performance of the suggested EROA on the CEC2017 test suite is discussed. The results of EROA and the other algorithms are given in Table 7. This table shows that EROA achieved the best results on 9 of the 29 functions, whereas STOA achieved the best results on only eight. Moreover, EROA achieved the second-best results on two functions (F20 and F21) and the third-best on several functions (F3, F5, F18, F19, F24, F26, and F30). To make a fair comparison of algorithm rankings, we perform a Friedman test, as shown in Table 8; from this table, it can be noticed that EROA ranks first overall on CEC2017. Furthermore, Figure 6 shows some convergence curves of the introduced algorithm compared with the classical ROA and the six other algorithms, where EROA achieves fast convergence.

5. Constrained Engineering Design Problems

The previous experiments show EROA’s ability to solve numerical optimization problems. Here, to demonstrate the power of EROA in solving real constrained engineering problems, seven different problems are used; namely, pressure vessel design, speed reducer design, tension/compression spring design, three-bar truss design, welded beam design, tubular column design, and gear train design.

5.1. Pressure Vessel Design Problem

Pressure vessel design is a minimization problem with four variables, as shown in Figure 7. The mathematical formulation of this problem is shown below.
Consider
$$x = [x_1\ x_2\ x_3\ x_4] = [T_s\ T_h\ R\ L],$$
Minimize
$$f(x) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3,$$
Subject to
$$g_1(x) = -x_1 + 0.0193 x_3 \le 0,\quad g_2(x) = -x_2 + 0.00954 x_3 \le 0,$$
$$g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3} \pi x_3^3 + 1296000 \le 0,\quad g_4(x) = x_4 - 240 \le 0,$$
Variable Range
$$0 \le x_1 \le 99,\quad 0 \le x_2 \le 99,\quad 10 \le x_3 \le 200,\quad 10 \le x_4 \le 200,$$
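To show how such a constrained formulation is typically handed to a metaheuristic, the sketch below evaluates the objective with a static quadratic penalty; the penalty coefficient and the quadratic penalty form are illustrative choices, not values from the paper:

```python
import numpy as np

def pressure_vessel_cost(x):
    """Objective f(x) of the pressure vessel design problem."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)

def penalized(x, penalty=1e6):
    """Static penalty: add penalty * violation^2 for each constraint g_i > 0."""
    x1, x2, x3, x4 = x
    g = (-x1 + 0.0193 * x3,
         -x2 + 0.00954 * x3,
         -np.pi * x3 ** 2 * x4 - (4.0 / 3.0) * np.pi * x3 ** 3 + 1296000.0,
         x4 - 240.0)
    return pressure_vessel_cost(x) + penalty * sum(max(0.0, gi) ** 2 for gi in g)
```

For a feasible point, the penalized value equals the raw cost; any violated constraint inflates the value, steering the search back into the feasible region.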
The results of EROA in solving the pressure vessel design problem are compared with Aquila Optimizer (AO), Harris Hawks Optimization (HHO), Whale Optimization Algorithm (WOA), Slime Mould Algorithm (SMA) [61], Grey Wolf Optimizer (GWO), Multi-Verse Optimizer (MVO), Evolution Strategy (ES) [62], Gravitational Search Algorithm (GSA), Genetic Algorithm (GA), and Co-evolutionary Particle Swarm Optimization (CPSO) [63], as shown in Table 9. It can be seen from this table that EROA has the smallest cost, 5935.7301, with X = (0.8434295, 0.4007618, 44.786, 145.9578).

5.2. Speed Reducer Design Problem

The second engineering problem is the speed reducer design problem, which minimizes the weight of the reducer. It has seven variables, as shown in Figure 8. The mathematical formulation is given below.
Minimize
$$f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2),$$
Subject to
$$g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0,\quad g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0,$$
$$g_3(x) = \frac{1.93 x_4^3}{x_2 x_3 x_6^4} - 1 \le 0,\quad g_4(x) = \frac{1.93 x_5^3}{x_2 x_3 x_7^4} - 1 \le 0,$$
$$g_5(x) = \frac{\sqrt{\left( \frac{745 x_4}{x_2 x_3} \right)^2 + 16.9 \times 10^6}}{110.0 x_6^3} - 1 \le 0,\quad g_6(x) = \frac{\sqrt{\left( \frac{745 x_5}{x_2 x_3} \right)^2 + 157.5 \times 10^6}}{85.0 x_7^3} - 1 \le 0,$$
$$g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0,\quad g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0,\quad g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0,$$
$$g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0,\quad g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0,$$
Variable Range
$$2.6 \le x_1 \le 3.6,\quad 0.7 \le x_2 \le 0.8,\quad 17 \le x_3 \le 28,\quad 7.3 \le x_4 \le 8.3,\quad 7.8 \le x_5 \le 8.3,\quad 2.9 \le x_6 \le 3.9,\quad 5.0 \le x_7 \le 5.5,$$
Table 10 shows the results of EROA compared with Aquila Optimizer (AO), Arithmetic Optimization Algorithm (AOA), Sine Cosine Algorithm (SCA), Particle Swarm Optimization (PSO), Moth-Flame Optimization (MFO), Harmony Search (HS) [64], and the Decomposition Algorithm (MDA) [65]. It can be noticed that EROA ranked first with 2998.9886, where X = (3.49692, 0.7, 17, 7.66313, 7.8, 3.3505, 5.28582).

5.3. Tension/Compression Spring Design Problem

The tension/compression spring design problem aims to determine the minimum cost of spring fabrication. It has three variables, as shown in Figure 9. The mathematical model is given below.
Consider
$$x = [x_1\ x_2\ x_3] = [d\ D\ N],$$
Minimize
$$f(x) = (x_3 + 2) x_2 x_1^2,$$
Subject to
$$g_1(x) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0,\quad g_2(x) = \frac{4 x_2^2 - x_1 x_2}{12566 (x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0,$$
$$g_3(x) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0,\quad g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0,$$
Variable Range
$$0.05 \le x_1 \le 2.00,\quad 0.25 \le x_2 \le 1.30,\quad 2.00 \le x_3 \le 15.00,$$
The results for the tension/compression spring design problem are given in Table 11, where EROA is compared with Aquila Optimizer (AO), Harris Hawks Optimization (HHO), Whale Optimization Algorithm (WOA), Salp Swarm Algorithm (SSA), Grey Wolf Optimizer (GWO), Multi-Verse Optimizer (MVO), Particle Swarm Optimization (PSO), Improved Teaching-Learning-Based Optimization algorithm (RLTLBO) [66], Genetic Algorithm (GA), and Harmony Search (HS). From this table, we can see that EROA achieved the best results.

5.4. Three-Bar Truss Design Problem

The three-bar truss design problem aims to minimize the structure’s weight. It has two variables, as shown in Figure 10. The mathematical model is given below.
Consider
$$x = [x_1\ x_2] = [A_1\ A_2],$$
Minimize
$$f(x) = \left( 2\sqrt{2}\, x_1 + x_2 \right) \cdot l,$$
Subject to
$$g_1(x) = \frac{\sqrt{2}\, x_1 + x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2} P - \sigma \le 0,\quad g_2(x) = \frac{x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2} P - \sigma \le 0,$$
$$g_3(x) = \frac{1}{\sqrt{2}\, x_2 + x_1} P - \sigma \le 0,$$
Variable Range
$$0 \le x_1, x_2 \le 1,$$
where $l = 100\ \text{cm}$, $P = 2\ \text{kN/cm}^2$, and $\sigma = 2\ \text{kN/cm}^2$.
The results for the three-bar truss problem are given in Table 12, in which EROA is compared with Sailfish Optimizer (SFO), Aquila Optimizer (AO), Arithmetic Optimization Algorithm (AOA), Harris Hawks Optimization (HHO), Salp Swarm Algorithm (SSA), Ant Lion Optimizer (ALO), Multi-Verse Optimizer (MVO), Moth-Flame Optimization (MFO), Grasshopper Optimization Algorithm (GOA) [67], and an Improved Hybrid Aquila Optimizer and Harris Hawks Optimization (IHAOHHO) [68]. It is clear that EROA ranked first with a fitness value of 263.8552.

5.5. Welded Beam Design Problem

The fifth engineering problem is the welded beam design problem, which seeks the minimum manufacturing cost of a welded beam. It has four variables, as shown in Figure 11. The mathematical model of the welded beam design is given below.
Consider
$$x = [x_1,\ x_2,\ x_3,\ x_4] = [h,\ l,\ t,\ b]$$
Minimize
$$f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$$
Subject to
$$g_1(x) = \tau(x) - \tau_{max} \le 0,\quad g_2(x) = \sigma(x) - \sigma_{max} \le 0,\quad g_3(x) = \delta(x) - \delta_{max} \le 0,$$
$$g_4(x) = x_1 - x_4 \le 0,\quad g_5(x) = P - P_C(x) \le 0,\quad g_6(x) = 0.125 - x_1 \le 0,$$
$$g_7(x) = 1.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5 \le 0,$$
where
$$\tau(x) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \frac{x_2}{2R} + (\tau'')^2},\quad \tau' = \frac{P}{\sqrt{2}\, x_1 x_2},\quad \tau'' = \frac{M R}{J},\quad M = P \left( L + \frac{x_2}{2} \right),$$
$$R = \sqrt{\frac{x_2^2}{4} + \left( \frac{x_1 + x_3}{2} \right)^2},\quad J = 2 \left\{ \sqrt{2}\, x_1 x_2 \left[ \frac{x_2^2}{4} + \left( \frac{x_1 + x_3}{2} \right)^2 \right] \right\},$$
$$\sigma(x) = \frac{6 P L}{x_3^2 x_4},\quad \delta(x) = \frac{6 P L^3}{E x_3^3 x_4},\quad P_C(x) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2} \left( 1 - \frac{x_3}{2L} \sqrt{\frac{E}{4G}} \right),$$
with the constants
$$P = 6000\ \text{lb},\quad L = 14\ \text{in},\quad E = 30 \times 10^6\ \text{psi},\quad G = 12 \times 10^6\ \text{psi},$$
$$\tau_{max} = 13600\ \text{psi},\quad \sigma_{max} = 30000\ \text{psi},\quad \delta_{max} = 0.25\ \text{in},$$
The results for the welded beam design problem are given in Table 13, where EROA is compared with the original ROA, Whale Optimization Algorithm (WOA), Grey Wolf Optimizer (GWO), Moth-Flame Optimization (MFO), Multi-Verse Optimizer (MVO), Co-evolutionary Particle Swarm Optimization (CPSO), Ray Optimization (RO), Improved Wild Horse Optimization (IWHO) [69], Improved HHO (IHHO), and Harmony Search (HS). From this table, we can conclude that EROA ranked first among the compared algorithms.

5.6. Tubular Column Design Problem

The sixth engineering problem is the tubular column design problem [65], which aims to find the lowest-cost design of a tubular column. It has two variables; the mathematical description of this problem is shown below.
Minimize
$$f(d,\ t) = 9.8 d t + 2 d,$$
Subject to
$$g_1 = \frac{P}{\pi d t \sigma_y} - 1 \le 0,\quad g_2 = \frac{8 P L^2}{\pi^3 E d t (d^2 + t^2)} - 1 \le 0,\quad g_3 = \frac{2.0}{d} - 1 \le 0,$$
$$g_4 = \frac{d}{14} - 1 \le 0,\quad g_5 = \frac{0.2}{t} - 1 \le 0,\quad g_6 = \frac{t}{0.8} - 1 \le 0,$$
Variable Range
$$0.01 \le d,\ t \le 100,$$
The results for the tubular column problem are given in Table 14, where EROA is compared with the classical ROA, Aquila Optimizer (AO), Harris Hawks Optimization (HHO), Whale Optimization Algorithm (WOA), Marine Predators Algorithm (MPA), Cuckoo Search (CS) [70], Modified Ant Lion Optimizer (MALO) [71], and Random Opposition-based Learning Grey Wolf Optimizer (ROLGWO). EROA found the value closest to the optimum, 26.5316.

5.7. Gear Train Design Problem

The last engineering problem discussed here is the gear train design problem [14], which aims to minimize the deviation of the gear ratio from a target value. It consists of four variables, and the mathematical formulation is given below.
Consider
$$x = [x_1\ x_2\ x_3\ x_4] = [n_A\ n_B\ n_C\ n_D],$$
Minimize
$$f(x) = \left( \frac{1}{6.931} - \frac{x_3 x_2}{x_1 x_4} \right)^2,$$
Variable Range
$$12 \le x_1, x_2, x_3, x_4 \le 60,$$
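Because the decision variables are numbers of gear teeth, a common evaluation trick is to round them to integers before computing the objective. The sketch below is illustrative (the function name and the rounding choice are ours, not the paper's code):

```python
def gear_ratio_error(x):
    """Gear train objective: squared deviation of the realized gear ratio
    from the target 1/6.931; teeth counts are rounded to integers."""
    nA, nB, nC, nD = (int(round(v)) for v in x)
    return (1.0 / 6.931 - (nB * nC) / (nA * nD)) ** 2
```

For instance, the widely reported solution (49, 16, 19, 43) yields an error on the order of 1e-12.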
The results for the gear train design problem, comparing EROA with Multi-Verse Optimizer (MVO), Moth-Flame Optimization (MFO), Ant Lion Optimizer (ALO), Genetic Algorithm (GA), Cuckoo Search (CS), Artificial Bee Colony (ABC) [72], and Mine Blast Algorithm (MBA) [73], are given in Table 15.

6. Conclusions

ROA is a recent swarm-based algorithm that simulates the intelligent traveler behavior of the Remora. In this study, a novel version of ROA, called Enhanced ROA (EROA), is proposed based on three strategies: (1) adaptive dynamic probability, (2) SFO with Levy flight, and (3) a restart strategy. Two different benchmark sets (CEC2005 and CEC2017), comprising 52 functions in total, and seven different constrained real-world engineering problems have been used to evaluate the performance of the suggested algorithm. Moreover, the Wilcoxon rank-sum and Friedman tests have been used to demonstrate the power of EROA. The statistical analysis and experimental results show the efficiency of EROA. In future work, we plan to propose a binary version and apply it to feature selection, classification, or unit commitment problems, and to develop a multi-objective version to solve more complex multi-objective problems.

Author Contributions

Conceptualization, S.W.; methodology, S.W. and H.J.; software, S.W.; validation, S.W., A.G.H. and R.Z.; formal analysis, L.A.; investigation, A.G.H.; resources, S.W.; data curation, S.W.; writing—original draft preparation, S.W. and A.G.H.; writing—review and editing, L.A. and A.G.H.; visualization, S.W.; supervision, H.J.; project administration, S.W.; funding acquisition, S.W. and H.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Sanming University Introduces High-level Talents to Start Scientific Research Funding Support Project, grant number 20YG01, 20YG14; the Guiding Science and Technology Projects in Sanming City, grant number 2020-S-39; the Educational Research Projects of Young and Middle-aged Teachers in Fujian Province, grant number JAT200638; the Scientific Research and Development Fund of Sanming University, grant number B202029; School level education and teaching reform project of Sanming University, grant number J2010306; Higher education research project of Sanming University, grant number SHE2102; 2021 project of the 14th Five-year Plan of Education science in Fujian Province, grant number FJJKBK21-138; Fujian Natural Science Foundation Project, grant number 2021J011128; and Sanming University National Natural Science Foundation Breeding Project, grant number PYT2105.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the support of Fujian Key Lab of Agriculture IOT Application and IOT Application Engineering Research Center of Fujian Province Colleges and Universities, as well as the anonymous reviewers to help us improve the quality of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Yang, X.S. Engineering Optimization: An Introduction with Metaheuristic Applications; John Wiley & Sons: Hoboken, NJ, USA, 2010.
2. Abualigah, L.; Gandomi, A.H.; Elaziz, M.A.; Hussien, A.G.; Khasawneh, A.M.; Alshinwan, M.; Houssein, E.H. Nature-inspired optimization algorithms for text document clustering—A comprehensive analysis. Algorithms 2020, 13, 345.
3. Hassanien, A.E.; Emary, E. Swarm Intelligence: Principles, Advances, and Applications; CRC Press: Boca Raton, FL, USA, 2018.
4. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Bhattacharyya, S.; Amin, M. S-shaped binary whale optimization algorithm for feature selection. Recent Trends Signal Image Process. 2019, 727, 79–87.
5. Fathi, H.; AlSalman, H.; Gumaei, A.; Manhrawy, I.I.; Hussien, A.G.; El-Kafrawy, P. An efficient cancer classification model using microarray and high-dimensional data. Comput. Intell. Neurosci. 2021, 2021, 7231126.
6. Hussien, A.G.; Houssein, E.H.; Hassanien, A.E. A binary whale optimization algorithm with hyperbolic tangent fitness function for feature selection. In Proceedings of the 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 166–172.
7. Hussien, A.G.; Amin, M. A self-adaptive Harris hawks optimization algorithm with opposition-based learning and chaotic local search strategy for global optimization and feature selection. Int. J. Mach. Learn. Cybern. 2022, 13, 309–336.
8. Abdullahi, M.; Ngadi, M.A.; Dishing, S.I.; Ahmad, B.I. An efficient symbiotic organisms search algorithm with chaotic optimization strategy for multi-objective task scheduling problems in cloud computing environment. J. Netw. Comput. Appl. 2019, 133, 60–74.
9. Besnassi, M.; Neggaz, N.; Benyettou, A. Face detection based on evolutionary Haar filter. Pattern Anal. Appl. 2020, 23, 309–330.
10. Neshat, M.; Mirjalili, S.; Sergiienko, N.Y.; Esmaeilzadeh, S.; Amini, E.; Heydari, A.; Garcia, D.A. Layout optimisation of offshore wave energy converters using a novel multi-swarm cooperative algorithm with backtracking strategy: A case study from coasts of Australia. Energy 2022, 239, 122463.
11. Eslami, M.; Neshat, M.; Khalid, S.A. A Novel Hybrid Sine Cosine Algorithm and Pattern Search for Optimal Coordination of Power System Damping Controllers. Sustainability 2022, 14, 541.
12. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Zamani, H.; Bahreininejad, A. GGWO: Gaze cues learning-based grey wolf optimizer and its applications for solving engineering problems. J. Comput. Sci. 2022, 61, 101636.
13. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
14. Hussien, A.G.; Oliva, D.; Houssein, E.H.; Juan, A.A.; Yu, X. Binary whale optimization algorithm for dimensionality reduction. Mathematics 2020, 8, 1821.
15. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H. Swarming behaviour of salps algorithm for predicting chemical compound activities. In Proceedings of the 2017 Eighth International Conference on Intelligent Computing and Information Systems (ICICIS), Cairo, Egypt, 5–7 December 2017; pp. 315–320.
16. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Amin, M.; Azar, A.T. New binary whale optimization algorithm for discrete optimization problems. Eng. Optim. 2020, 52, 945–959.
17. Fearn, T. Particle swarm optimisation. NIR News 2014, 25, 27.
18. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1996, 26, 29–41.
19. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
20. Assiri, A.S.; Hussien, A.G.; Amin, M. Ant lion optimization: Variants, hybrids, and applications. IEEE Access 2020, 8, 77746–77764.
21. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
22. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine predators algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377.
23. Hussien, A.G. An enhanced opposition-based salp swarm algorithm for global optimization and engineering problems. J. Ambient. Intell. Humaniz. Comput. 2022, 13, 129–150.
24. Jia, H.; Peng, X.; Lang, C. Remora optimization algorithm. Expert Syst. Appl. 2021, 185, 115665.
25. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
26. Mostafa, R.R.; Hussien, A.G.; Khan, M.A.; Kadry, S.; Hashim, F. Enhanced coot optimization algorithm for dimensionality reduction. In Proceedings of the 2022 Fifth International Conference of Women in Data Science at Prince Sultan University (WiDS PSU), Riyadh, Saudi Arabia, 28–29 March 2022.
27. Hussien, A.G.; Amin, M.; Abd El Aziz, M. A comprehensive review of moth-flame optimisation: Variants, hybrids, and applications. J. Exp. Theor. Artif. Intell. 2020, 32, 705–725.
28. Cuevas, E.; Cienfuegos, M.; Zaldívar, D.; Perez-Cisneros, M. A swarm optimization algorithm inspired in the behavior of the social-spider. Expert Syst. Appl. 2013, 40, 6374–6384.
29. Hashim, F.A.; Hussien, A.G. Snake optimizer: A novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 2022, 242, 108320.
30. Hussien, A.G.; Amin, M.; Wang, M.; Liang, G.; Alsanad, A.; Gumaei, A.; Chen, H. Crow search algorithm: Theory, recent advances, and applications. IEEE Access 2020, 8, 173548–173565.
31. Dhiman, G.; Kumar, V. Emperor penguin optimizer: A bio-inspired algorithm for engineering problems. Knowl.-Based Syst. 2018, 159, 20–50.
32. Hussien, A.G.; Heidari, A.A.; Ye, X.; Liang, G.; Chen, H.; Pan, Z. Boosting whale optimization with evolution strategy and Gaussian random walks: An image segmentation method. Eng. Comput. 2022, 1–45.
33. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70.
34. Abualigah, L.; Yousri, D.; Elaziz, M.A.; Ewees, A.A.; Al-qaness, M.A.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250.
35. Kaur, S.; Awasthi, L.K.; Sangal, A.; Dhiman, G. Tunicate swarm algorithm: A new bioinspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541.
36. Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
37. Genç, H.M.; Eksin, I.; Erol, O.K. Big bang-big crunch optimization algorithm hybridized with local directional moves and application to target motion analysis problem. In Proceedings of the 2010 IEEE International Conference on Systems, Man and Cybernetics, Istanbul, Turkey, 10–13 October 2010; pp. 881–887.
38. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
39. Abualigah, L.; Elaziz, M.A.; Hussien, A.G.; Alsalibi, B.; Jalali, S.M.J.; Gandomi, A.H. Lightning search algorithm: A comprehensive survey. Appl. Intell. 2021, 51, 2353–2376.
40. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184.
41. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl.-Based Syst. 2016, 96, 120–133.
42. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray optimization. Comput. Struct. 2012, 112, 283–294.
43. Anita; Yadav, A. AEFA: Artificial electric field algorithm for global optimization. Swarm Evol. Comput. 2019, 48, 93–108.
44. Abualigah, L.; Diabat, A.; Mirjalili, S.; Elaziz, M.A.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609.
45. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2015, 27, 495–513.
46. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Future Gener. Comput. Syst. 2019, 101, 646–667.
47. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73.
48. Sinha, N.; Chakrabarti, R.; Chattopadhyay, P. Evolutionary programming techniques for economic load dispatch. IEEE Trans. Evol. Comput. 2003, 7, 83–94.
49. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713.
50. Moscato, P.; Cotta, C.; Mendes, A. Memetic algorithms. In New Optimization Techniques in Engineering; Springer: Berlin/Heidelberg, Germany, 2004; pp. 53–85.
51. Passino, K.M. Bacterial foraging optimization. Int. J. Swarm Intell. Res. (IJSIR) 2010, 1, 1–16.
52. Uymaz, S.A.; Tezel, G.; Yel, E. Artificial algae algorithm (AAA) for nonlinear global optimization. Appl. Soft Comput. 2015, 31, 153–171.
53. Meng, Z.; Pan, J.S. Monkey king evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization. Knowl.-Based Syst. 2016, 97, 144–157.
54. Zheng, R.; Jia, H.; Abualigah, L.; Wang, S.; Wu, D. An improved remora optimization algorithm with autonomous foraging mechanism for global optimization problems. Math. Biosci. Eng. 2022, 19, 3994–4037.
55. Liu, Q.; Li, N.; Jia, H.; Qi, Q.; Abualigah, L. Modified remora optimization algorithm for global optimization and multilevel thresholding image segmentation. Mathematics 2022, 10, 1014.
56. Vinayaki, V.D.; Kalaiselvi, R. Multithreshold image segmentation technique using remora optimization algorithm for diabetic retinopathy detection from fundus images. Neural Process. Lett. 2022, 1–22.
57. Shadravan, S.; Naji, H.R.; Bardsiri, V.K. The Sailfish Optimizer: A novel nature-inspired metaheuristic algorithm for solving constrained engineering optimization problems. Eng. Appl. Artif. Intell. 2019, 80, 20–34.
58. Zhang, H.; Wang, Z.; Chen, W.; Heidari, A.A.; Wang, M.; Zhao, X.; Liang, G.; Chen, H.; Zhang, X. Ensemble mutation-driven salp swarm algorithm with restart mechanism: Framework and fundamental analysis. Expert Syst. Appl. 2021, 165, 113897.
59. Long, W.; Jiao, J.; Liang, X.; Cai, S.; Xu, M. A Random Opposition-Based Learning Grey Wolf Optimizer. IEEE Access 2019, 7, 113810–113825.
60. Dhiman, G.; Kaur, A. STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Eng. Appl. Artif. Intell. 2019, 82, 148–174.
61. Li, S.M.; Chen, H.L.; Wang, M.J.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323.
62. Rechenberg, I. Evolutionsstrategien. In Simulationsmethoden in der Medizin und Biologie; Springer: Berlin/Heidelberg, Germany, 1978; Volume 8, pp. 83–114.
63. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99.
64. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
65. Lu, S.; Kim, H.M. A regularized inexact penalty decomposition algorithm for multidisciplinary design optimization problems with complementarity constraints. J. Mech. Des. 2010, 132, 041005.
66. Wu, D.; Wang, S.; Liu, Q.; Abualigah, L.; Jia, H. An Improved Teaching-Learning-Based Optimization Algorithm with Reinforcement Learning Strategy for Solving Optimization Problems. Comput. Intell. Neurosci. 2022, 2022, 1535957.
67. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47.
68. Wang, S.; Jia, H.; Abualigah, L.; Liu, Q.; Zheng, R. An Improved Hybrid Aquila Optimizer and Harris Hawks Algorithm for Solving Industrial Engineering Optimization Problems. Processes 2021, 9, 1551.
69. Zheng, R.; Hussien, A.G.; Jia, H.; Abualigah, L.; Wang, S.; Wu, D. An Improved Wild Horse Optimizer for Solving Optimization Problems. Mathematics 2022, 10, 1311.
70. Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35.
71. Wang, S.; Sun, K.; Zhang, W.; Jia, H. Multilevel thresholding using a modified ant lion optimizer with opposition-based learning for color image segmentation. Math. Biosci. Eng. 2021, 18, 3092–3143.
72. Sharma, T.K.; Pant, M.; Singh, V. Improved local search in artificial bee colony using golden section search. arXiv 2012, arXiv:1210.6128.
73. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612.
Figure 1. The different states of ROA.
Figure 2. The flowchart of the EROA.
Figure 3. Convergence curves for the optimization algorithms for standard benchmark functions.
Figure 4. Convergence curves for the optimization algorithms for standard benchmark functions with D = 100.
Figure 5. Convergence curves for the optimization algorithms for standard benchmark functions with D = 500.
Figure 6. Convergence curves for the optimization algorithms for some test functions on CEC 2017.
Figure 7. The pressure vessel design problem: 3D model diagram (left) and structural parameters (right).
Figure 8. The speed reducer design problem: 3D model diagram (left) and structural parameters (right).
Figure 9. The tension/compression spring design problem: 3D model diagram (left) and structural parameters (right).
Figure 10. The three-bar truss design problem: 3D model diagram (left) and structural parameters (right).
Figure 11. The welded beam design problem: 3D model diagram (left) and structural parameters (right).
Table 1. Parameter settings for the comparative algorithms.
Algorithm | Parameters
EROA | C = 0.1; Limit = log(t)
ROA | C = 0.1
AO | U = 0.00565; r1 = 10; ω = 0.005; α = 0.1; δ = 0.1; G1 ∈ [−1, 1]; G2 = [2, 0]
AOA | α = 5; μ = 0.5
HHO | q ∈ [0, 1]; r ∈ [0, 1]; E0 ∈ [−1, 1]; E1 = [2, 0]; E ∈ [−2, 2]
WOA | a1 = [2, 0]; a2 = [−1, −2]; b = 1
STOA | Cf = 2; u = 1; v = 1
Table 2. Unimodal benchmark functions.
Fun. | D | Range | fmin
$F_1(x)=\sum_{i=1}^{n} x_i^2$ | 30/100/500 | [−100, 100] | 0
$F_2(x)=\sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 30/100/500 | [−10, 10] | 0
$F_3(x)=\sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | 30/100/500 | [−100, 100] | 0
$F_4(x)=\max_i \{ |x_i|, 1 \le i \le n \}$ | 30/100/500 | [−100, 100] | 0
$F_5(x)=\sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 30/100/500 | [−30, 30] | 0
$F_6(x)=\sum_{i=1}^{n} \left( \lfloor x_i + 0.5 \rfloor \right)^2$ | 30/100/500 | [−100, 100] | 0
$F_7(x)=\sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1)$ | 30/100/500 | [−1.28, 1.28] | 0
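The unimodal functions above are standard closed-form benchmarks and are straightforward to evaluate directly. As a minimal Python sketch (not code from the paper), two of them can be written as:

```python
def f1_sphere(x):
    """F1 (sphere): sum of squared coordinates; global minimum 0 at the origin."""
    return sum(xi ** 2 for xi in x)


def f5_rosenbrock(x):
    """F5 (Rosenbrock): narrow curved valley; global minimum 0 at x = (1, ..., 1)."""
    return sum(
        100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
        for i in range(len(x) - 1)
    )
```

Evaluating `f1_sphere([0.0] * 30)` or `f5_rosenbrock([1.0] * 30)` returns the tabulated minimum of 0, which is a useful sanity check before running any optimizer on them.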
Table 3. Multimodal benchmark functions.
Fun. | D | Range | fmin
$F_8(x)=\sum_{i=1}^{n} -x_i \sin\left(\sqrt{|x_i|}\right)$ | 30/100/500 | [−500, 500] | −418.9829 × D
$F_9(x)=\sum_{i=1}^{n} \left[ x_i^2 - 10\cos(2\pi x_i) + 10 \right]$ | 30/100/500 | [−5.12, 5.12] | 0
$F_{10}(x)=-20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$ | 30/100/500 | [−32, 32] | 0
$F_{11}(x)=\frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$ | 30/100/500 | [−600, 600] | 0
$F_{12}(x)=\frac{\pi}{n}\left\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2\left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_n - 1)^2\right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$, $u(x_i, a, k, m)=\begin{cases} k(x_i - a)^m & x_i > a \\ 0 & -a < x_i < a \\ k(-x_i - a)^m & x_i < -a \end{cases}$ | 30/100/500 | [−50, 50] | 0
$F_{13}(x)=0.1\left(\sin^2(3\pi x_1) + \sum_{i=1}^{n}(x_i - 1)^2\left[1 + \sin^2(3\pi x_i + 1)\right] + (x_n - 1)^2\left[1 + \sin^2(2\pi x_n)\right]\right) + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | 30/100/500 | [−50, 50] | 0
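The multimodal functions are equally direct to implement; the only subtlety is keeping the normalizing factors inside the exponentials. A minimal Python sketch (not code from the paper) of two of them:

```python
import math


def f9_rastrigin(x):
    """F9 (Rastrigin): many regularly spaced local minima; global minimum 0 at 0."""
    return sum(xi ** 2 - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)


def f10_ackley(x):
    """F10 (Ackley): nearly flat outer region, deep hole at the origin; minimum 0."""
    n = len(x)
    mean_sq = sum(xi ** 2 for xi in x) / n
    mean_cos = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return (-20.0 * math.exp(-0.2 * math.sqrt(mean_sq))
            - math.exp(mean_cos) + 20.0 + math.e)
```

At the origin both return (up to floating-point rounding) the tabulated minimum of 0, because the `+ 20 + e` terms exactly cancel the two exponentials.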
Table 4. Fixed-dimension multimodal benchmark functions.
Fun. | D | Range | fmin
$F_{14}(x)=\left(\frac{1}{500} + \sum_{j=1}^{25}\frac{1}{j + \sum_{i=1}^{2}(x_i - a_{ij})^6}\right)^{-1}$ | 2 | [−65, 65] | 1
$F_{15}(x)=\sum_{i=1}^{11}\left[a_i - \frac{x_1(b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4}\right]^2$ | 4 | [−5, 5] | 0.00030
$F_{16}(x)=4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$ | 2 | [−5, 5] | −1.0316
$F_{17}(x)=\left(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos x_1 + 10$ | 2 | [−5, 5] | 0.398
$F_{18}(x)=\left[1 + (x_1 + x_2 + 1)^2\left(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2\right)\right] \times \left[30 + (2x_1 - 3x_2)^2\left(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2\right)\right]$ | 2 | [−2, 2] | 3
$F_{19}(x)=-\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{3} a_{ij}(x_j - p_{ij})^2\right)$ | 3 | [−1, 2] | −3.86
$F_{20}(x)=-\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{6} a_{ij}(x_j - p_{ij})^2\right)$ | 6 | [0, 1] | −3.32
$F_{21}(x)=-\sum_{i=1}^{5}\left[(X - a_i)(X - a_i)^T + c_i\right]^{-1}$ | 4 | [0, 10] | −10.1532
$F_{22}(x)=-\sum_{i=1}^{7}\left[(X - a_i)(X - a_i)^T + c_i\right]^{-1}$ | 4 | [0, 10] | −10.4028
$F_{23}(x)=-\sum_{i=1}^{10}\left[(X - a_i)(X - a_i)^T + c_i\right]^{-1}$ | 4 | [0, 10] | −10.5363
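The fixed-dimension functions can be checked against their tabulated minima the same way. A minimal Python sketch (not code from the paper) of F16 and F17, evaluated at their standard minimizers (approximately (0.0898, −0.7126) for the six-hump camel and (π, 2.275) for Branin):

```python
import math


def f16_six_hump_camel(x1, x2):
    """F16 (six-hump camel back): global minimum approximately -1.0316."""
    return (4.0 * x1 ** 2 - 2.1 * x1 ** 4 + x1 ** 6 / 3.0
            + x1 * x2 - 4.0 * x2 ** 2 + 4.0 * x2 ** 4)


def f17_branin(x1, x2):
    """F17 (Branin): three global minima, all with value approximately 0.398."""
    return ((x2 - 5.1 / (4.0 * math.pi ** 2) * x1 ** 2
             + 5.0 / math.pi * x1 - 6.0) ** 2
            + 10.0 * (1.0 - 1.0 / (8.0 * math.pi)) * math.cos(x1) + 10.0)
```

Both evaluations agree with the fmin column to three decimal places, which makes these functions convenient regression tests for an optimizer implementation.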
Table 5. Results of algorithms on 23 benchmark functions.
FD EROAROAAOAOAHHOWOASCASTOA
F130Avg07.33 × 10−3142.73 × 101024.83 × 1062.34 × 10954.15 × 10729.655.81 × 107
Std001.49 × 101011.81 × 1061.26 × 10942.25 × 10711.71 × 1011.24 × 106
100Avg001.47 × 10989.93 × 1045.55 × 10939.06 × 10721.08 × 1049.12 × 103
Std008.02 × 10982.80 × 1042.05 × 10922.16 × 10717.78 × 1031.22 × 102
500Avg007.28 × 101025.24 × 1019.52 × 10974.70 × 10702.17 × 1059.52
Std003.86 × 101013.36 × 1023.92 × 10961.35 × 10697.89 × 1049.27
F230Avg01.16 × 101654.63 × 10641.50 × 1039.36 × 10516.83 × 10501.69 × 1021.01 × 105
Std002.32 × 10631.92 × 1033.73 × 10503.45 × 10492.25 × 1021.06 × 105
100Avg07.40 × 101625.90 × 10551.85 × 1022.07 × 10501.24 × 10491.27 × 1012.65 × 103
Std04.04 × 101612.61 × 10542.34 × 1038.48 × 10505.08 × 10491.03 × 1012.05 × 103
500Avg09.24 × 101601.46 × 10575.24 × 1011.17 × 10481.22 × 10471.11 × 1029.34 × 102
Std03.95 × 101598.01 × 10579.93 × 1025.95 × 10486.19 × 10477.32 × 1016.76 × 102
F330Avg01.68 × 102899.44 × 101119.51 × 1042.69 × 10674.09 × 1041.11 × 1047.91 × 102
Std003.78 × 101107.68 × 1041.47 × 10661.39 × 1047.68 × 1039.41 × 102
100Avg03.32 × 102765.31 × 101001.30 × 1013.89 × 10621.09 × 1062.49 × 1052.12 × 103
Std002.91 × 10993.20 × 1022.11 × 10613.07 × 1054.85 × 1043.24 × 103
500Avg01.09 × 102533.30 × 101046.914.61 × 10393.38 × 1077.15 × 1065.89 × 105
Std001.02 × 101031.242.53 × 10381.05 × 1071.79 × 1062.47 × 105
F430Avg03.71 × 101566.47 × 10531.67 × 1023.18 × 10485.54 × 1013.20 × 1015.18 × 102
Std02.03 × 101552.57 × 10521.24 × 1021.71 × 10472.37 × 1011.33 × 1015.14 × 102
100Avg05.19 × 101561.20 × 10555.57 × 1035.67 × 10487.59 × 1018.97 × 1017.04 × 101
Std02.83 × 101556.59 × 10555.81 × 1032.84 × 10472.24 × 1013.281.61 × 101
500Avg03.95 × 101525.72 × 10541.21 × 1014.10 × 10498.22 × 1019.91 × 1019.87 × 101
Std02.16 × 101512.85 × 10539.18 × 1031.93 × 10482.13 × 1013.52 × 1015.42 × 101
F530Avg6.52 × 1022.71 × 1019.40 × 1032.79 × 1011.28 × 1022.79 × 1017.71 × 1042.84 × 101
Std1.58 × 1024.46 × 1012.72 × 1022.37 × 1011.39 × 1024.94 × 1012.34 × 1055.01 × 101
100Avg2.73 × 1019.76 × 1012.29 × 1029.82 × 1013.42 × 1029.81 × 1011.08 × 1081.06 × 102
Std5.56 × 1014.54 × 1013.13 × 1025.63 × 1024.13 × 1022.42 × 1013.96 × 1076.54
500Avg8.84 × 1014.95 × 1021.00 × 1014.99 × 1022.39 × 1014.96 × 1022.06 × 1091.23 × 104
Std1.973.06 × 1011.26 × 1011.37 × 1014.50 × 1014.25 × 1014.56 × 1081.07 × 104
F630Avg4.53 × 1041.05 × 1011.52 × 1043.029.07 × 1054.33 × 1012.06 × 1012.59
Std2.66 × 1049.54 × 1024.51 × 1042.46 × 1011.74 × 1042.21 × 1013.95 × 1015.09 × 101
100Avg1.00 × 1021.849.23 × 1041.59 × 1014.92 × 1044.331.37 × 1041.77 × 101
Std1.50 × 1025.34 × 1012.94 × 1037.25 × 1015.53 × 1041.218.42 × 1037.95 × 101
500Avg9.47 × 1021.56 × 1016.07 × 1041.12 × 1021.82 × 1033.31 × 1012.11 × 1051.24 × 102
Std1.28 × 1014.438.88 × 1041.642.10 × 1039.986.42 × 1046.66
F730Avg8.37 × 1051.36 × 1041.01 × 1049.06 × 1051.57 × 1044.18 × 1039.69 × 1025.85 × 103
Std5.99 × 1051.20 × 1046.70 × 1058.11 × 1051.32 × 1044.43 × 1031.13 × 1012.83 × 103
100Avg1.11 × 1041.57 × 1041.22 × 1047.75 × 1051.57 × 1044.78 × 1031.66 × 1022.58 × 102
Std1.02 × 1042.34 × 1041.30 × 1047.26 × 1051.61 × 1044.32 × 1039.84 × 1011.10 × 102
500Avg7.39 × 1051.47 × 1046.99 × 1056.12 × 1051.77 × 1044.38 × 1031.44 × 1044.94 × 101
Std7.05 × 1051.24 × 1045.04 × 1056.66 × 1051.94 × 1045.40 × 1034.06 × 1032.40 × 101
F830Avg−1.26 × 104−1.23 × 104−7.48 × 103−5.34 × 103−1.26 × 104−1.03 × 104−3.91 × 103−5.12 × 103
Std2.20 × 1027.21 × 1023.74 × 1033.69 × 1029.54 × 1011.74 × 1033.78 × 1024.44 × 102
100Avg−4.19 × 104−4.14 × 104−1.09 × 104−1.40 × 104−4.19 × 104−3.52 × 104−6.83 × 103−1.10 × 104
Std1.79 × 1011.45 × 1036.20 × 1037.36 × 1023.356.14 × 1035.65 × 1021.51 × 103
500Avg−2.09 × 105−2.07 × 105−3.90 × 104−3.82 × 104−2.09 × 105−1.66 × 105−1.55 × 104−2.50 × 104
Std3.465.11 × 1031.04 × 1041.54 × 1032.02 × 1032.86 × 1041.28 × 1033.26 × 103
F930Avg0001.20 × 10601.89 × 10154.33 × 1011.26 × 101
Std0001.08 × 10601.04 × 10143.79 × 1011.67 × 101
100Avg0001.85 × 104003.23 × 1021.26 × 101
Std0003.93 × 105001.08 × 1028.97
500Avg003.03 × 10141.11 × 102001.44 × 1032.76 × 101
Std001.66 × 10137.83 × 104005.64 × 1021.88 × 101
F1030Avg8.88 × 10168.88 × 10168.88 × 10164.15 × 1048.88 × 10164.91 × 10151.21 × 1011.99 × 101
Std0001.78 × 10402.23 × 10159.251.49 × 103
100Avg8.88 × 10168.88 × 10168.88 × 10163.42 × 1038.88 × 10164.91 × 10151.81 × 1012.00 × 101
Std0003.76 × 10403.06 × 10154.953.43 × 104
500Avg8.88 × 10168.88 × 10168.88 × 10162.67 × 1028.88 × 10163.26 × 10151.93 × 1012.00 × 101
Std0009.11 × 10402.35 × 10153.326.17 × 105
F1130Avg0001.09 × 10302.49 × 1028.85 × 1011.81 × 102
Std0004.17 × 10307.26 × 1023.10 × 1012.16 × 102
100Avg0001.42 × 101008.53 × 1014.58 × 102
Std0001.61 × 101006.15 × 1015.97 × 102
500Avg0001.35 × 103001.85 × 1036.45 × 101
Std0005.15 × 102007.02 × 1022.99 × 101
F1230Avg1.35 × 1059.35 × 1036.63 × 1067.42 × 1011.55 × 1051.66 × 1011.32 × 1052.62 × 101
Std1.23 × 1056.03 × 1031.26 × 1052.08 × 1021.93 × 1057.90 × 1015.71 × 1051.71 × 101
100Avg6.55 × 1062.20 × 1021.07 × 1069.10 × 1012.57 × 1055.52 × 1023.21 × 1087.79 × 101
Std9.69 × 1061.10 × 1022.81 × 1066.55 × 1023.17 × 1052.34 × 1021.69 × 1081.20 × 101
500Avg2.18 × 1054.56 × 1029.43 × 1079.34 × 1012.24 × 1069.73 × 1026.35 × 1094.73
Std2.79 × 1052.84 × 1021.11 × 1062.56 × 1023.24 × 1064.54 × 1021.21 × 1092.46
F1330Avg2.76 × 1041.85 × 1014.18 × 1052.969.08 × 1055.31 × 1013.24 × 1051.93
Std2.19 × 1041.13 × 1019.53 × 1051.77 × 1021.09 × 1043.47 × 1011.51 × 1052.57 × 101
100Avg1.79 × 1031.328.10 × 1059.921.33 × 1042.735.67 × 1081.03 × 101
Std3.51 × 1036.70 × 1019.74 × 1058.17 × 1031.95 × 1048.22 × 1012.41 × 1086.93 × 101
500Avg6.08 × 1037.513.43 × 1044.93 × 1015.18 × 1042.009.35 × 1091.62 × 102
Std1.09 × 1023.928.03 × 1042.68 × 1016.87 × 1047.641.98 × 1095.41 × 101
F142Avg9.98 × 1014.913.479.151.163.411.592.18
Std1.47 × 10114.574.094.423.77 × 1013.829.22 × 1012.49
F154Avg3.13 × 1045.56 × 1044.90 × 1044.97 × 1033.43 × 1046.84 × 1049.41 × 1043.61 × 103
Std1.57 × 1053.12 × 1042.87 × 1049.79 × 1032.89 × 1053.35 × 1043.13 × 1046.69 × 103
F162Avg−1.03−1.03−1.03−1.03−1.03−1.03−1.03−1.03
Std7.94 × 1094.88 × 1087.48 × 1041.68 × 10112.81 × 1093.75 × 1095.08 × 1052.19 × 106
F172Avg3.98 × 1013.98 × 1013.98 × 1013.99 × 1013.98 × 1013.98 × 1013.99 × 1013.98 × 101
Std2.46 × 1079.09 × 1061.65 × 1044.72 × 1031.36 × 1055.86 × 1062.04 × 1031.01 × 104
F182Avg3.003.003.041.74 × 1013.003.003.003.00
Std1.75 × 1051.03 × 1044.20 × 1022.53 × 1014.71 × 1076.68 × 1051.90 × 1041.37 × 104
F193Avg−3.86−3.86−3.86−3.77−3.86−3.86−3.85−3.86
Std3.06 × 1061.66 × 1036.05 × 1035.22 × 1012.90 × 1036.04 × 1031.04 × 1024.95 × 103
F206Avg−3.26−3.25−3.13−3.27−3.12−3.20−2.82−2.89
Std7.76 × 1028.86 × 1021.05 × 1015.93 × 1028.94 × 1021.19 × 1014.77 × 1015.65 × 101
F214Avg−1.02 × 101−1.01 × 101−1.01 × 101−8.05−5.37−8.28−2.29−3.82
Std1.85 × 1042.19 × 1024.20 × 1022.661.242.731.874.31
F224Avg−1.04 × 101−1.04 × 101−1.04 × 101−6.83−5.25−7.79−3.07−5.96
Std1.59 × 1041.74 × 1021.62 × 1023.729.14 × 1013.071.604.34
F234Avg−1.05 × 101−1.05 × 101−1.05 × 101−8.13−5.62−7.35−3.37−6.96
Std1.76 × 1041.95 × 1022.68 × 1023.301.523.081.873.94
Table 6. p-Values from the Wilcoxon rank-sum test for the results in Table 5.
F | D | EROA vs. ROA | EROA vs. AO | EROA vs. AOA | EROA vs. HHO | EROA vs. WOA | EROA vs. SCA | EROA vs. STOA
F1 | 30 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | N/A | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | N/A | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F2 | 30 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F3 | 30 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F4 | 30 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F5 | 30 | 6.1035 × 10−5 | 5.5359 × 10−2 | 6.1035 × 10−5 | 2.5238 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 2.1545 × 10−2 | 6.1035 × 10−5 | 4.1260 × 10−2 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 2.7686 × 10−1 | 6.1035 × 10−5 | 5.9949 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F6 | 30 | 6.1035 × 10−5 | 8.5449 × 10−4 | 6.1035 × 10−5 | 6.7139 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 8.5449 × 10−4 | 6.1035 × 10−5 | 1.1597 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 8.5449 × 10−4 | 6.1035 × 10−5 | 6.7139 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F7 | 30 | 2.5238 × 10−2 | 9.7797 × 10−2 | 9.4606 × 10−3 | 9.7797 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.3721 × 10−2 | 9.3408 × 10−1 | 4.2725 × 10−3 | 7.1973 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 1.8762 × 10−1 | 8.0396 × 10−1 | 8.0396 × 10−1 | 5.5359 × 10−2 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F8 | 30 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 1.1597 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 2.6245 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 4.2725 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 1.2207 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 8.5449 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F9 | 30 | N/A | N/A | 1.2207 × 10−4 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | N/A | N/A | 6.1035 × 10−5 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | N/A | N/A | 6.1035 × 10−5 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
F10 | 30 | N/A | N/A | 6.1035 × 10−5 | N/A | 3.9063 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | N/A | N/A | 6.1035 × 10−5 | N/A | 4.8828 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | N/A | N/A | 6.1035 × 10−5 | N/A | 1.9531 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
F11 | 30 | N/A | N/A | 6.1035 × 10−5 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | N/A | N/A | 6.1035 × 10−5 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | N/A | N/A | 6.1035 × 10−5 | N/A | N/A | 6.1035 × 10−5 | 6.1035 × 10−5
F12 | 30 | 6.1035 × 10−5 | 3.3569 × 10−3 | 6.1035 × 10−5 | 1.3538 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 8.5449 × 10−4 | 6.1035 × 10−5 | 8.3252 × 10−2 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 3.3569 × 10−3 | 6.1035 × 10−5 | 8.3618 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F13 | 30 | 6.1035 × 10−5 | 6.1035 × 10−4 | 6.1035 × 10−5 | 1.8066 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 100 | 6.1035 × 10−5 | 1.5259 × 10−3 | 6.1035 × 10−5 | 4.7913 × 10−2 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
 | 500 | 6.1035 × 10−5 | 4.2120 × 10−1 | 6.1035 × 10−5 | 4.5428 × 10−1 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F14 | 2 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 2.6245 × 10−3 | 1.2207 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5
F15 | 4 | 5.5359 × 10−2 | 8.3618 × 10−3 | 5.3711 × 10−3 | 3.8940 × 10−2 | 2.0142 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
F16 | 2 | 4.2725 × 10−3 | 6.1035 × 10−5 | 1.2207 × 10−4 | 3.3569 × 10−3 | 1.5259 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
F17 | 2 | 1.1597 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.3721 × 10−2 | 4.2725 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
F18 | 2 | 1.8066 × 10−2 | 8.3618 × 10−3 | 6.1035 × 10−5 | 2.1545 × 10−2 | 1.0254 × 10−2 | 8.3618 × 10−3 | 8.3618 × 10−3
F19 | 3 | 1.2207 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 3.3569 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F20 | 6 | 1.5143 × 10−1 | 6.7139 × 10−3 | 5.9949 × 10−1 | 6.1035 × 10−4 | 5.3711 × 10−3 | 6.1035 × 10−5 | 6.1035 × 10−5
F21 | 4 | 6.1035 × 10−5 | 6.1035 × 10−4 | 1.1597 × 10−3 | 6.1035 × 10−5 | 1.2207 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5
F22 | 4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
F23 | 4 | 6.1035 × 10−5 | 8.5449 × 10−4 | 1.2207 × 10−4 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5 | 6.1035 × 10−5
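The p-values in Table 6 come from the Wilcoxon rank-sum test between the runs of EROA and each competitor. As a minimal pure-Python sketch of that test using the normal approximation (the paper's exact procedure is not reproduced here, and an exact small-sample computation or a library routine such as scipy.stats.ranksums would be preferable in practice):

```python
import math


def wilcoxon_rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.

    Sketch only: average ranks are used for ties, but no tie correction
    is applied to the variance.
    """
    n1, n2 = len(a), len(b)
    values = list(a) + list(b)
    order = sorted(range(n1 + n2), key=values.__getitem__)

    # Assign average ranks (1-based) to tied blocks of the pooled sample.
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1

    w = sum(ranks[:n1])  # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability
```

With identical samples the statistic sits at its mean and the p-value is 1 (no detectable difference), while fully separated samples give a p-value far below 0.05, matching how the table's N/A entries (identical results) and tiny p-values are read.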
Table 7. Results of algorithms on the CEC2017 test suite.
F EROAROAAOAOAHHOWOASCASTOA
CEC_01Avg1.21 × 1081.11 × 1091.26 × 1071.48 × 10101.12 × 1069.98 × 1071.13 × 1092.44 × 108
Std1.66 × 1081.18 × 1091.20 × 1074.69 × 1091.10 × 1062.92 × 1083.89 × 1081.98 × 108
CEC_03Avg2.39 × 1034.84 × 1032.56 × 1031.55 × 1047.29 × 1029.65 × 1033.36 × 1032.23 × 103
Std1.66 × 1033.37 × 1039.78 × 1021.61 × 1033.31 × 1028.33 × 1032.20 × 1031.78 × 103
CEC_04Avg4.26 × 1024.93 × 1024.30 × 1022.06 × 1034.31 × 1024.52 × 1024.64 × 1024.40 × 102
Std2.90 × 1011.07 × 1022.73 × 1019.69 × 1023.95 × 1015.08 × 1013.13 × 1012.61 × 101
CEC_05Avg5.48 × 1025.58 × 1025.36 × 1025.78 × 1025.53 × 1025.63 × 1025.55 × 1025.30 × 102
Std1.26 × 1012.02 × 1011.11 × 1012.42 × 1011.78 × 1012.42 × 1018.278.79
CEC_06Avg6.31 × 1026.33 × 1026.21 × 1026.44 × 1026.39 × 1026.38 × 1026.24 × 1026.15 × 102
Std1.25 × 1011.49 × 1018.327.209.631.49 × 1015.375.60
CEC_07Avg7.86 × 1027.96 × 1027.58 × 1028.01 × 1027.94 × 1027.96 × 1027.82 × 1027.63 × 102
Std1.97 × 1012.58 × 1011.43 × 1018.881.76 × 1012.43 × 1011.08 × 1011.82 × 101
CEC_08Avg8.33 × 1028.37 × 1028.26 × 1028.48 × 1028.32 × 1028.42 × 1028.46 × 1028.29 × 102
Std7.588.51 × 1018.309.887.711.59 × 1019.261.15 × 101
CEC_09Avg1.50 × 1031.31 × 1031.05 × 1031.53 × 1031.50 × 1031.48 × 1031.06 × 1031.08 × 103
Std2.62 × 1022.14 × 1027.39 × 1011.74 × 1022.29 × 1025.67 × 1028.04 × 1011.42 × 102
CEC_10Avg1.97 × 1032.12 × 1032.06 × 1032.25 × 1032.14 × 1032.32 × 1032.49 × 1032.00 × 103
Std2.02 × 1023.28 × 1023.32 × 1022.52 × 1023.12 × 1023.27 × 1021.83 × 1022.81 × 102
CEC_11Avg1.18 × 1031.22 × 1031.29 × 1034.80 × 1031.18 × 1031.25 × 1031.25 × 1031.23 × 103
Std5.32 × 1018.74 × 1011.12 × 1022.35 × 1036.22 × 1019.25 × 1018.26 × 1011.12 × 102
CEC_12Avg2.56 × 1064.25 × 1063.40 × 1061.05 × 1094.95 × 1065.48 × 1062.15 × 1073.21 × 106
Std3.11 × 1065.18 × 1063.54 × 1066.67 × 1085.20 × 1066.12 × 1062.24 × 1073.96 × 106
CEC_13Avg8.62 × 1031.37 × 1041.63 × 1041.14 × 1061.78 × 1042.29 × 1048.54 × 1042.26 × 104
Std8.03 × 1039.87 × 1031.03 × 1043.60 × 1061.36 × 1041.66 × 1046.46 × 1041.73 × 104
CEC_14Avg1.59 × 1032.78 × 1032.71 × 1031.05 × 1042.38 × 1033.23 × 1032.03 × 1033.77 × 103
Std6.91 × 1021.65 × 1031.64 × 1038.12 × 1031.43 × 1031.85 × 1037.89 × 1022.62 × 103
CEC_15Avg8.36 × 1037.53 × 1036.43 × 1032.09 × 1047.84 × 1038.99 × 1034.75 × 1036.69 × 103
Std3.50 × 1035.57 × 1033.06 × 1034.97 × 1033.40 × 1038.37 × 1033.96 × 1034.81 × 103
CEC_16Avg1.90 × 1031.93 × 1031.89 × 1032.11 × 1031.92 × 1031.94 × 1031.80 × 1031.76 × 103
Std1.62 × 1022.01 × 1021.49 × 1021.85 × 1021.29 × 1021.50 × 1029.24 × 1011.11 × 102
CEC_17Avg1.78 × 1031.80 × 1031.78 × 1031.89 × 1031.79 × 1031.82 × 1031.80 × 1031.80 × 103
Std3.95 × 1013.57 × 1012.98 × 1011.10 × 1025.54 × 1017.80 × 1012.80 × 1015.07 × 101
CEC_18Avg1.75 × 1042.12 × 1043.67 × 1042.16 × 1081.66 × 1041.53 × 1042.59 × 1054.90 × 104
Std1.05 × 1041.61 × 1042.92 × 1044.03 × 1081.02 × 1041.13 × 1041.88 × 1052.40 × 104
CEC_19Avg1.36 × 1042.24 × 1041.91 × 1041.31 × 1052.01 × 1048.22 × 1048.37 × 1031.31 × 104
Std1.13 × 1045.15 × 1042.21 × 1046.74 × 1042.31 × 1041.28 × 1057.13 × 1031.11 × 104
CEC_20Avg2.18 × 1032.18 × 1032.13 × 1032.15 × 1032.20 × 1032.22 × 1032.12 × 1032.15 × 103
Std1.01 × 1026.85 × 1016.48 × 1015.06 × 1016.95 × 1019.70 × 1014.59 × 1016.34 × 101
CEC_21Avg2.30 × 1032.28 × 1032.31 × 1032.37 × 1032.33 × 1032.33 × 1032.27 × 1032.21 × 103
Std7.17 × 1015.77 × 1014.48 × 1012.96 × 1016.10 × 1015.87 × 1016.76 × 1012.27 × 101
CEC_22Avg2.32 × 1032.41 × 1032.31 × 1033.44 × 1032.37 × 1032.40 × 1032.40 × 1032.93 × 103
Std2.36 × 1011.07 × 1027.902.64 × 1022.93 × 1023.15 × 1022.93 × 1016.84 × 102
CEC_23Avg2.68 × 1032.65 × 1032.65 × 1032.79 × 1032.69 × 1032.66 × 1032.66 × 1032.64 × 103
Std3.07 × 1012.79 × 1011.70 × 1015.39 × 1013.72 × 1013.04 × 1017.94 × 1011.05 × 101
CEC_24Avg2.78 × 1032.78 × 1032.76 × 1032.90 × 1032.83 × 1032.80 × 1032.79 × 1032.76 × 103
Std9.37 × 1016.31 × 1015.05 × 1019.77 × 1017.61 × 1014.91 × 1011.25 × 1011.45 × 101
CEC_25Avg2.93 × 1033.00 × 1032.94 × 1033.57 × 1032.93 × 1032.96 × 1032.98 × 1032.94 × 103
Std3.17 × 1019.84 × 1012.22 × 1012.99 × 1022.27 × 1014.02 × 1012.33 × 1012.28 × 101
CEC_26Avg3.65 × 1033.37 × 1033.05 × 1034.42 × 1033.63 × 1033.64 × 1033.20 × 1033.28 × 103
Std5.27 × 1023.13 × 1022.23 × 1023.38 × 1022.94 × 1025.69 × 1023.04 × 1024.47 × 102
CEC_27Avg3.13 × 1033.13 × 1033.11 × 1033.30 × 1033.16 × 1033.15 × 1033.11 × 1033.10 × 103
Std4.05 × 1014.49 × 1018.581.05 × 1024.84 × 1013.85 × 1012.912.60
CEC_28Avg3.32 × 1033.35 × 1033.44 × 1033.95 × 1033.42 × 1033.46 × 1033.34 × 1033.36 × 103
Std1.05 × 1021.29 × 1029.77 × 1011.89 × 1021.69 × 1021.57 × 1028.82 × 1011.18 × 102
CEC_29Avg3.35 × 1033.29 × 1033.25 × 1033.53 × 1033.36 × 1033.39 × 1033.26 × 1033.23 × 103
Std1.07 × 1027.85 × 1015.58 × 1012.03 × 1021.02 × 1028.82 × 1015.57 × 1015.42 × 101
CEC_30Avg1.34 × 1061.65 × 1061.26 × 1067.29 × 1073.15 × 1061.58 × 1062.03 × 1064.65 × 105
Std1.54 × 1061.75 × 1061.33 × 1066.56 × 1073.72 × 1061.82 × 1061.25 × 1062.99 × 105
Table 8. Friedman rank test results on CEC2017 test suite.

| F | EROA | ROA | AO | AOA | HHO | WOA | SCA | STOA |
|---|---|---|---|---|---|---|---|---|
| CEC_01 | 4 | 6 | 2 | 8 | 1 | 3 | 7 | 4 |
| CEC_03 | 3 | 6 | 4 | 8 | 1 | 7 | 5 | 3 |
| CEC_04 | 1 | 7 | 2 | 8 | 3 | 5 | 6 | 1 |
| CEC_05 | 3 | 6 | 2 | 8 | 4 | 7 | 5 | 3 |
| CEC_06 | 4 | 5 | 2 | 8 | 7 | 6 | 3 | 4 |
| CEC_07 | 4 | 6 | 1 | 8 | 5 | 6 | 3 | 4 |
| CEC_08 | 4 | 5 | 1 | 8 | 3 | 6 | 7 | 4 |
| CEC_09 | 6 | 4 | 1 | 8 | 6 | 5 | 2 | 6 |
| CEC_10 | 1 | 4 | 3 | 6 | 5 | 7 | 8 | 1 |
| CEC_11 | 1 | 3 | 7 | 8 | 1 | 5 | 5 | 1 |
| CEC_12 | 1 | 4 | 3 | 8 | 5 | 6 | 7 | 1 |
| CEC_13 | 1 | 2 | 3 | 8 | 4 | 6 | 7 | 1 |
| CEC_14 | 1 | 5 | 4 | 8 | 3 | 6 | 2 | 1 |
| CEC_15 | 2 | 4 | 6 | 8 | 5 | 7 | 1 | 2 |
| CEC_16 | 4 | 6 | 3 | 8 | 5 | 7 | 2 | 4 |
| CEC_17 | 1 | 4 | 1 | 8 | 3 | 7 | 4 | 1 |
| CEC_18 | 3 | 4 | 5 | 8 | 2 | 1 | 7 | 3 |
| CEC_19 | 3 | 6 | 4 | 8 | 5 | 7 | 1 | 3 |
| CEC_20 | 2 | 5 | 5 | 3 | 7 | 8 | 1 | 2 |
| CEC_21 | 4 | 3 | 5 | 8 | 6 | 6 | 2 | 4 |
| CEC_22 | 2 | 6 | 1 | 8 | 3 | 4 | 4 | 2 |
| CEC_23 | 2 | 2 | 6 | 8 | 7 | 4 | 4 | 2 |
| CEC_24 | 3 | 3 | 1 | 8 | 7 | 6 | 5 | 3 |
| CEC_25 | 1 | 7 | 3 | 8 | 1 | 5 | 6 | 1 |
| CEC_26 | 7 | 4 | 1 | 8 | 5 | 6 | 2 | 7 |
| CEC_27 | 4 | 4 | 2 | 8 | 7 | 6 | 2 | 4 |
| CEC_28 | 1 | 3 | 6 | 8 | 5 | 7 | 2 | 1 |
| CEC_29 | 5 | 4 | 2 | 8 | 6 | 7 | 3 | 5 |
| CEC_30 | 3 | 5 | 2 | 8 | 7 | 4 | 6 | 3 |
| Avg Rank | 2.7931 | 4.5862 | 3.0344 | 7.7586 | 4.4482 | 5.7586 | 4.1034 | 2.8275 |
| Final Rank | 1 | 6 | 3 | 8 | 5 | 7 | 4 | 2 |
Table 9. Optimization results for the pressure vessel design problem.

| Algorithm | Ts | Th | R | L | Best Cost |
|---|---|---|---|---|---|
| EROA | 0.8434295 | 0.4007618 | 44.786 | 145.9578 | 5935.7301 |
| AO [34] | 1.0540 | 0.182806 | 59.6219 | 38.8050 | 5949.2258 |
| HHO [25] | 0.81758383 | 0.4072927 | 42.09174576 | 176.7196352 | 6000.46259 |
| WOA [21] | 0.8125 | 0.4375 | 42.0982699 | 176.638998 | 6059.7410 |
| SMA [61] | 0.7931 | 0.3932 | 40.6711 | 196.2178 | 5994.1857 |
| GWO [19] | 0.8125 | 0.4345 | 42.0892 | 176.7587 | 6051.5639 |
| MVO [45] | 0.8125 | 0.4375 | 42.090738 | 176.73869 | 6060.8066 |
| ES [62] | 0.8125 | 0.4375 | 42.098087 | 176.640518 | 6059.74560 |
| GSA [38] | 1.125000 | 0.625000 | 55.9886598 | 84.4542025 | 8538.8359 |
| GA [47] | 0.8125 | 0.4375 | 42.097398 | 176.65405 | 6059.94634 |
| CPSO [63] | 0.8125 | 0.4375 | 42.091266 | 176.7465 | 6061.0777 |
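The objective in Table 9 is the standard pressure-vessel cost function from the constrained-optimization literature, in terms of shell thickness Ts, head thickness Th, inner radius R and length L. Assuming that standard formulation, evaluating it at the WOA row reproduces that row's reported cost of about 6059.74:

```python
def pressure_vessel_cost(ts, th, r, l):
    """Standard pressure vessel cost: shell and head material,
    forming, and welding terms."""
    return (0.6224 * ts * r * l
            + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l
            + 19.84 * ts ** 2 * r)

# WOA row of Table 9
print(round(pressure_vessel_cost(0.8125, 0.4375, 42.0982699, 176.638998), 4))
```

The same function can be used to spot-check any row of the table against its reported cost.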
Table 10. Optimization results for the speed reducer design problem.

| Algorithm | x1 | x2 | x3 | x4 | x5 | x6 | x7 | Best Weight |
|---|---|---|---|---|---|---|---|---|
| EROA | 3.49692 | 0.7 | 17 | 7.66313 | 7.8 | 3.3505 | 5.28582 | 2998.9886 |
| AO [34] | 3.5021 | 0.7 | 17 | 7.3099 | 7.7476 | 3.3641 | 5.2994 | 3007.7328 |
| AOA [44] | 3.50384 | 0.7 | 17 | 7.3 | 7.72933 | 3.35649 | 5.2867 | 2997.9157 |
| SCA [41] | 3.50875 | 0.7 | 17 | 7.3 | 7.8 | 3.46102 | 5.28921 | 3030.563 |
| PSO [17] | 3.5001 | 0.7 | 17.0002 | 7.5177 | 7.7832 | 3.3508 | 5.2867 | 3145.922 |
| MFO [27] | 3.49745 | 0.7 | 17 | 7.82775 | 7.71245 | 3.35178 | 5.28635 | 2998.9408 |
| GA [47] | 3.51025 | 0.7 | 17 | 8.35 | 7.8 | 3.36220 | 5.28772 | 3067.561 |
| HS [64] | 3.52012 | 0.7 | 17 | 8.37 | 7.8 | 3.36697 | 5.28871 | 3029.002 |
| MDA [65] | 3.5 | 0.7 | 17 | 7.3 | 7.67039 | 3.54242 | 5.24581 | 3019.58336 |
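The speed-reducer objective is the standard gear-box weight function of the seven design variables used throughout this literature. Assuming that formulation, the AOA row of Table 10 evaluates to its reported weight of about 2997.92:

```python
def speed_reducer_weight(x1, x2, x3, x4, x5, x6, x7):
    """Standard speed reducer weight: face width x1, tooth module x2,
    number of pinion teeth x3, shaft lengths x4/x5, shaft diameters x6/x7."""
    return (0.7854 * x1 * x2 ** 2 * (3.3333 * x3 ** 2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6 ** 2 + x7 ** 2)
            + 7.4777 * (x6 ** 3 + x7 ** 3)
            + 0.7854 * (x4 * x6 ** 2 + x5 * x7 ** 2))

# AOA row of Table 10
print(round(speed_reducer_weight(3.50384, 0.7, 17, 7.3, 7.72933, 3.35649, 5.2867), 2))
```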
Table 11. Optimization results for the tension/compression design problem.

| Algorithm | d | D | N | Best Weight |
|---|---|---|---|---|
| EROA | 0.053799 | 0.46951 | 5.811 | 0.010614 |
| AO [34] | 0.0502439 | 0.35262 | 10.5425 | 0.011165 |
| HHO [25] | 0.051796393 | 0.359305355 | 11.138859 | 0.012665443 |
| WOA [21] | 0.051207 | 0.345215 | 12.004032 | 0.0126763 |
| SSA [23] | 0.051207 | 0.345215 | 12.004032 | 0.0126763 |
| GWO [19] | 0.05169 | 0.356737 | 11.28885 | 0.012666 |
| MVO [45] | 0.05251 | 0.37602 | 10.33513 | 0.012790 |
| PSO [17] | 0.051728 | 0.357644 | 11.244543 | 0.0126747 |
| RLTLBO [60] | 0.055118 | 0.5059 | 5.1167 | 0.010938 |
| GA [47] | 0.051480 | 0.351661 | 11.632201 | 0.01270478 |
| HS [64] | 0.051154 | 0.349871 | 12.076432 | 0.0126706 |
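The tension/compression spring objective is the wire volume (N + 2) · D · d², with wire diameter d, mean coil diameter D and number of active coils N. Under that standard formulation, the EROA row above gives ≈ 0.010614 and the WOA row ≈ 0.012676, matching Table 11:

```python
def spring_weight(d, D, N):
    """Tension/compression spring weight: (N + 2) * D * d**2."""
    return (N + 2) * D * d ** 2

# EROA and WOA rows of Table 11
print(spring_weight(0.053799, 0.46951, 5.811))
print(spring_weight(0.051207, 0.345215, 12.004032))
```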
Table 12. Optimization results for the three-bar truss design problem.

| Algorithm | x1 | x2 | Best Weight |
|---|---|---|---|
| EROA | 0.78645 | 0.41369 | 263.8552 |
| SFO [57] | 0.7884562 | 0.40886831 | 263.8959212 |
| AO [34] | 0.7926 | 0.3966 | 263.8684 |
| AOA [44] | 0.79369 | 0.39426 | 263.9154 |
| HHO [25] | 0.788662816 | 0.408283133832900 | 263.8958434 |
| SSA [23] | 0.78866541 | 0.408275784 | 263.89584 |
| ALO [20] | 0.7886618 | 0.4082831 | 263.8958434 |
| MVO [45] | 0.78860276 | 0.408453070000000 | 263.8958499 |
| MFO [27] | 0.788244771 | 0.409466905784741 | 263.8959797 |
| GOA [67] | 0.788897555578973 | 0.407619570115153 | 263.895881496069 |
| IHAOHHO [68] | 0.79002 | 0.40324 | 263.8622 |
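The three-bar truss objective is the structure volume (2√2 · x1 + x2) · l, where x1 and x2 are the cross-sectional areas and the bar length is conventionally l = 100 cm. Assuming that standard formulation, the SSA row reproduces its reported weight:

```python
import math

def truss_weight(x1, x2, l=100.0):
    """Three-bar truss volume/weight: (2*sqrt(2)*x1 + x2) * l."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l

# SSA row of Table 12
print(round(truss_weight(0.78866541, 0.408275784), 5))  # ≈ 263.89584
```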
Table 13. Optimization results for the welded beam design problem.

| Algorithm | h | l | t | b | Best Weight |
|---|---|---|---|---|---|
| EROA | 0.20352 | 3.3013 | 9.0091 | 0.20735 | 1.7059 |
| ROA [24] | 0.200077 | 3.365754 | 9.011182 | 0.206893 | 1.706447 |
| WOA [21] | 0.205396 | 3.484293 | 9.037426 | 0.206276 | 1.730499 |
| GWO [19] | 0.205676 | 3.478377 | 9.03681 | 0.205778 | 1.72624 |
| MFO [27] | 0.2057 | 3.4703 | 9.0364 | 0.2057 | 1.72452 |
| MVO [45] | 0.205463 | 3.473193 | 9.044502 | 0.205695 | 1.72645 |
| CPSO [63] | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.73148 |
| RO [42] | 0.203687 | 3.528467 | 9.004233 | 0.207241 | 1.735344 |
| GA [47] | 0.1829 | 4.0483 | 9.3666 | 0.2059 | 1.82420 |
| IWHO [69] | 0.2057 | 3.2530 | 9.0366 | 0.2057 | 1.6952 |
| IHHO [7] | 0.20533 | 3.47226 | 9.0364 | 0.2010 | 1.7238 |
| HS [64] | 0.2442 | 6.2231 | 8.2915 | 0.2443 | 2.3807 |
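The welded-beam objective is the standard fabrication cost 1.10471 · h² · l + 0.04811 · t · b · (14 + l), with weld thickness h, weld length l, bar height t and bar thickness b. Assuming that formulation, the GWO row checks out against its reported value:

```python
def welded_beam_cost(h, l, t, b):
    """Welded beam fabrication cost: weld material + bar material."""
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# GWO row of Table 13
print(round(welded_beam_cost(0.205676, 3.478377, 9.03681, 0.205778), 5))  # ≈ 1.72624
```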
Table 14. Optimization results for the tubular column design problem.

| Algorithm | d | t | Best Cost |
|---|---|---|---|
| EROA | 5.4511 | 0.29198 | 26.5316 |
| ROA [24] | 5.433671 | 0.294813 | 26.598146 |
| AO [34] | 5.46300 | 0.29656 | 26.83540 |
| HHO [25] | 5.44380 | 0.29313 | 26.55820 |
| WOA [21] | 5.437032 | 0.294228 | 26.583393 |
| MPA [22] | 5.451389 | 0.291951 | 26.531737 |
| CS [70] | 5.45139 | 0.29196 | 26.53217 |
| MALO [71] | 5.451140 | 0.291967 | 26.531342 |
| ROLGWO [59] | 5.452650 | 0.291894 | 26.534764 |
Table 15. Optimization results for the gear train design problem.

| Algorithm | nA | nB | nC | nD | Best Gear Ratio |
|---|---|---|---|---|---|
| EROA | 49 | 19 | 16 | 43 | 2.7009 × 10^−12 |
| MVO [45] | 43 | 19 | 16 | 49 | 2.7009 × 10^−12 |
| MFO [27] | 43 | 19 | 16 | 49 | 2.7009 × 10^−12 |
| ALO [20] | 49 | 19 | 16 | 43 | 2.7009 × 10^−12 |
| GA [47] | 49 | 19 | 16 | 43 | 2.7009 × 10^−12 |
| CS [70] | 43 | 19 | 16 | 49 | 2.7009 × 10^−12 |
| ABC [72] | 49 | 19 | 16 | 43 | 2.7009 × 10^−12 |
| MBA [73] | 43 | 19 | 16 | 49 | 2.7009 × 10^−12 |
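The gear-train objective is the squared error between the required ratio 1/6.931 and the obtained ratio (nB · nC)/(nA · nD) of the four tooth counts. Both tooth-count patterns reported in Table 15 produce the same product 304/2107 and hence the same error of about 2.7009 × 10^−12:

```python
def gear_train_error(na, nb, nc, nd):
    """Squared deviation of the obtained gear ratio from the target 1/6.931."""
    return (1.0 / 6.931 - (nb * nc) / (na * nd)) ** 2

# Both tooth-count patterns reported in Table 15
print(gear_train_error(49, 19, 16, 43))
print(gear_train_error(43, 19, 16, 49))
```

Since only nA and nD are swapped between the two patterns, the denominator nA · nD = 2107 is unchanged, which is why every algorithm in the table reports an identical objective value.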
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

MDPI and ACS Style

Wang, S.; Hussien, A.G.; Jia, H.; Abualigah, L.; Zheng, R. Enhanced Remora Optimization Algorithm for Solving Constrained Engineering Optimization Problems. Mathematics 2022, 10, 1696. https://doi.org/10.3390/math10101696
