
# A New Hybrid Whale Optimizer Algorithm with Mean Strategy of Grey Wolf Optimizer for Global Optimization

by ¹,* and ¹
¹ Department of Mathematics, Punjabi University, Patiala 147002, Punjab, India
² BOSS Team, GS laboratory, ENSA, Ibn Tofail University, Kenitra 14000, Morocco
* Author to whom correspondence should be addressed.
Math. Comput. Appl. 2018, 23(1), 14; https://doi.org/10.3390/mca23010014
Received: 16 February 2018 / Revised: 6 March 2018 / Accepted: 8 March 2018 / Published: 12 March 2018

## Abstract

The quest for an efficient nature-inspired optimization technique has continued over the last few decades. In this paper, a hybrid nature-inspired optimization technique is proposed. The hybrid algorithm has been constructed from the Mean Grey Wolf Optimizer (MGWO) and the Whale Optimizer Algorithm (WOA). We utilize the spiral equation of the Whale Optimizer Algorithm in two procedures in the Hybrid Approach GWO (HAGWO) algorithm: (i) firstly, we use the spiral equation in the Grey Wolf Optimizer algorithm to balance the exploitation and exploration processes in the new hybrid approach; and (ii) secondly, we also apply this equation to the whole population in order to refrain from premature convergence and trapping in local minima. The feasibility and effectiveness of the hybrid algorithm have been tested by solving some standard benchmarks, the XOR, Balloon, Iris and Breast Cancer datasets, and the Welded Beam Design and Pressure Vessel Design problems, and by comparing the results with those obtained through other metaheuristics. The solutions show that the new hybrid variant has stronger stability, a faster convergence rate and higher computational accuracy than other nature-inspired metaheuristics on the majority of problems and can successfully solve real-life constrained nonlinear optimization problems.

## 1. Introduction

Recently, many metaheuristic algorithms have been developed by researchers and scientists in different fields. These include Differential Evolution (DE), Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Ant Colony Optimization (ACO), Bat Algorithm (BA), Biogeography-Based Optimization (BBO), Firefly Algorithm (FA), Sine Cosine Algorithm (SCA), Robust Optimization (RO), Grey Wolf Optimizer (GWO), Whale Optimizer Algorithm (WOA), Mean Grey Wolf Optimizer (MGWO) and many others. The common goal of these algorithms is to improve the quality of solutions, stability and convergence performance. In order to do this, nature-inspired techniques should be equipped with both exploration and exploitation.
Exploitation is the ability to converge towards the best solution in the neighborhood of a good candidate solution, and exploration is the capability of an algorithm to explore all parts of the problem search space. The goal of all metaheuristics is to balance exploration and exploitation in order to find the global optimal solution in the search space. The process continues over a number of generations (an iterative process) until the solutions found are the most suitable for the environment.
GWO is a recently developed metaheuristic inspired by the hunting mechanism and leadership hierarchy of grey wolves in nature, and it has been successfully applied to optimizing key values in cryptography algorithms [1], feature subset selection [2], time series forecasting [3], the optimal power flow problem [4], economic dispatch problems [5], the flow shop scheduling problem [6] and the optimal design of double-layer grids [7]. Several algorithms have also been developed to improve the convergence performance of GWO, including parallelized GWO [8,9], a hybrid version of GWO with PSO [10] and binary GWO [11].
The article is organized as follows. Section 1 and Section 2 present the introduction and related work on established and recent meta-heuristics. Section 3 and Section 4 describe the mathematical models of WOA and MGWO. The new hybrid approach is fully described in Section 5. The details of the parameter settings and tested benchmark functions are presented in Section 6 and Section 7. The performance of the new hybrid variant is verified in Section 8. Section 9 presents the analysis of the meta-heuristics. To demonstrate the performance of the developed variant, twenty-three standard benchmarks, four bio-medical science problems, and the Welded Beam Design and Pressure Vessel Design problems are studied in Section 10, Section 11, Section 12 and Section 13. Finally, conclusions are drawn in Section 14.

## 2. Related Works

Metaheuristic global optimization techniques are stochastic variants that have become the most popular tools for solving real-life applications and global optimization functions in the last few decades, owing to their strong robustness, flexibility and simplicity. Some of the most famous of these algorithms are BA [12], GA [13], Harmony Search (HS) [14], ACO [15], Cuckoo Search (CS) [16], Bacterial Foraging Optimization (BFO) [17], PSO [18], Artificial Bee Colony (ABC) [19], Black Hole (BH) [20], One Half Personal Best Position Particle Swarm Optimizations (OHGBPPSO) [21], Half Mean Particle Swarm Optimization algorithm (HMPSO) [22], Personal Best Position Particle Swarm Optimization (PBPPSO) [23], Hybrid Particle Swarm Optimization (HPSO) [24], Hybrid MGBPSO-GSA [25] and MGWO [26], Gravitational Search Algorithm (GSA) [27], Artificial Neural Network (ANN) [28], SCA [29], Adaptive Group Search Optimization (AGSO) [30], Ant Lion Optimizer (ALO) [31], Biogeography Based Optimization (BBO) [32], Moth Flame Optimizer (MFO) [33], Krill Herd Algorithm (KHA) [34], Grasshopper Optimization Algorithm (GOA) [35], Multi-Verse Optimizer (MVO) [36], Black-Hole-Based Optimization (BHBO) [37], Dragonfly Algorithm (DA) [38], HPSOGWO [39], MOSCA [40] and so forth.
Bentouati et al. [41] presented a new power system planning strategy by combining the Pattern Search algorithm (PS) with WOA. The proposed variant was tested on the IEEE 30-bus test system considering several objective functions, such as voltage profile improvement, fuel cost of generation, emission reduction and minimization of total power losses. The obtained numerical and statistical solutions were verified against recently published population-based metaheuristic variants. The simulation solutions clearly reveal the speed and effectiveness of the presented approach for solving the OPF problem.
A new hybrid approach called Hybrid GWOSCA has been developed by [42]; it combines GWO, used for the exploitation phase, with SCA, used for the exploration phase, in an uncertain environment. The position and convergence performance of the grey wolf (alpha) is improved using the position update equations of SCA. Experimental solutions obtained with the proposed approach were verified against other metaheuristic approaches. On the basis of the numerical and statistical experimental results, the proposed hybrid algorithm can be highly effective in solving standard test functions and recent real-life problems with or without constraints and in unknown search spaces.
Tawhid and Ali [43] presented a new hybrid approach between the GWO and the GA variant in order to minimize a simplified model of the energy function of the molecule. This research used three different procedures: (i) they used the GWO variant to balance between the exploitation and the exploration process in the proposed variant; (ii) they used the dimensionality reduction and the population partitioning processes by dividing the population into sub-populations and using the arithmetical crossover operator in each sub-population in order to increase the diversity of the search in the algorithm; (iii) they used the GA operator in the whole population in order to refrain from premature convergence and trapping in local minima. The performance of the new hybrid algorithm has been tested on several standard test functions and performance of the algorithm has been compared with different metaheuristics. Experimental solutions prove that the hybrid approach is promising and competent for searching the near global optimal minimum value of the molecular energy function faster than other meta-heuristics.
Emary et al. [44] used three modern techniques, namely Antlion Optimizer, MFO and GWO in domain of machine learning for feature selection. Solutions on a set of standard machine learning data using a set of assessment indicators proved advances in optimization approach performance when using variational repeated periods of declined exploration rates over using systematically decreased exploration rates.
Emary et al. [45] proposed a variant of the recently introduced WOA based on adaptive switching of the random walk per individual search agent. The basic approach stochastically switches between the two random walks at each iteration regardless of the search member's performance and of the fitness of the terrain around it, whereas in the newly proposed approach, called the Adaptive Whale Optimization Algorithm (AWOA), the switching between the two random walks is based on the agent's performance. The proposed AWOA was benchmarked on 29 standard test functions comprising unimodal, multi-modal and composite functions; its convergence performance over these functions demonstrates the capability of the proposed variant to outperform the basic Whale Optimizer Algorithm.
In this paper, we propose a new hybrid of the whale optimizer algorithm and the mean grey wolf optimizer in order to solve the standard benchmark, XOR, Balloon, Iris, Breast Cancer, Welded Beam Design and Pressure Vessel Design functions. We call the proposed algorithm the Hybrid Approach GWO (HAGWO). The proposed HAGWO algorithm is based on two procedures. In the first procedure, we use the spiral equation in the GWO algorithm to balance the exploitation and exploration processes in the new hybrid approach. In the second procedure, we also apply this equation to the whole population in order to refrain from premature convergence and trapping in local minima. The combination of these two procedures improves the diversity of the search, accelerates it, and helps the algorithm to reach the optimal or near optimal solution in reasonable time.

## 3. Whale Optimizer Algorithm (WOA)

Mirjalili and Lewis [46] proposed a new nature-inspired technique, namely, Whale Optimizer Algorithm (WOA), which mimics the social behavior of humpback whales. The algorithm is inspired by the bubble-net hunting strategy.
The mathematical model for WOA is given as follows:
Encircling prey: A whale encircles the small fishes (prey) and then modifies its position towards the best solution obtained so far over the course of generations, from the start to a maximum number of generations.
$\vec{d} = | \vec{c} \cdot \vec{x}^*(t) - \vec{x}(t) |$
$\vec{x}(t+1) = \vec{x}^*(t) - \vec{a} \cdot \vec{d}$
The coefficient vectors $\vec{a}$ and $\vec{c}$ are calculated as follows:
$\vec{a} = 2a \cdot \vec{r} - a$
$\vec{c} = 2 \cdot \vec{r}$
where $t$ is the current iteration, $\vec{a}$ and $\vec{c}$ are coefficient vectors, $\vec{x}^*$ is the position vector of the best solution obtained so far, $\vec{x}$ is the position vector, $\vec{r}$ is a random vector in [0, 1], $|\cdot|$ denotes the absolute value and "$\cdot$" denotes element-by-element multiplication.
Bubble-net attacking method: In order to reach a mathematical equation for the bubble net behavior of whales, two separate methods are as follows:
• Shrinking encircling mechanism: this behavior is achieved by linearly decreasing the value of $a$ from 2 to 0 over the course of iterations; $\vec{a}$ is then a random vector in $[-a, a]$.
• Spiral updating position: position update amid whale and small fishes (prey) that showed a helix-shaped movement is given as follows:
$\vec{x}(t+1) = \vec{d}' \cdot e^{bl} \cos(2 \pi l) + \vec{x}^*(t)$
where $\vec{d}' = | \vec{x}^* - \vec{x} |$ indicates the distance of the $i$th whale to the prey, $b$ is a constant defining the shape of the logarithmic spiral, and $l$ is a random number in [−1, 1].
In addition, the whales switch between the shrinking encircling mechanism and the spiral model with a probability of 50%, so the positions of the whales are calculated as follows:
$\vec{x}(t+1) = \begin{cases} \vec{x}^*(t) - \vec{a} \cdot \vec{d}, & p < 0.5 \\ \vec{d}' \cdot e^{bl} \cos(2 \pi l) + \vec{x}^*(t), & p \ge 0.5 \end{cases}$
where $p$ is a random number in [0, 1]. In addition to the bubble-net method, the humpback whales search for prey randomly.
In this mechanism, the whales search for small fishes (prey) randomly and change their positions according to the positions of other search agents. In order to force a whale to move away from the reference whale, we use $|\vec{a}| > 1$.
It is mathematically calculated as follows:
$\vec{d} = | \vec{c} \cdot \vec{x}_{rand} - \vec{x}(t) |$
$\vec{x}(t+1) = \vec{x}_{rand} - \vec{a} \cdot \vec{d}$
where $x → r a n d$ is a random position vector (a random whale) chosen from the current population.
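Taken together, the encircling, spiral and random-search rules above can be sketched as a single WOA position-update step. This is a minimal illustrative sketch, not the authors' code; the function name `woa_step`, the parameter layout and the choice $b = 1$ are assumptions:

```python
import numpy as np

def woa_step(positions, x_star, a_scalar, rng):
    """One WOA generation: encircle, spiral, or random search per agent.

    positions : (n, dim) array of current whale positions
    x_star    : (dim,) best solution found so far
    a_scalar  : the scalar a, linearly decreased from 2 to 0 over the run
    """
    n, dim = positions.shape
    b = 1.0  # logarithmic-spiral shape constant (assumed value)
    new_positions = np.empty_like(positions)
    for i in range(n):
        p = rng.random()
        if p < 0.5:
            a_vec = 2 * a_scalar * rng.random(dim) - a_scalar  # in [-a, a]
            c_vec = 2 * rng.random(dim)
            if np.all(np.abs(a_vec) < 1):
                # shrinking encircling around the best solution so far
                d = np.abs(c_vec * x_star - positions[i])
                new_positions[i] = x_star - a_vec * d
            else:
                # exploration: move relative to a randomly chosen whale
                x_rand = positions[rng.integers(n)]
                d = np.abs(c_vec * x_rand - positions[i])
                new_positions[i] = x_rand - a_vec * d
        else:
            # spiral update toward the best solution
            l = rng.uniform(-1, 1)
            d_prime = np.abs(x_star - positions[i])
            new_positions[i] = d_prime * np.exp(b * l) * np.cos(2 * np.pi * l) + x_star
    return new_positions
```

In a full run, this step is repeated while `a_scalar` is decreased linearly and `x_star` is refreshed whenever a better agent is found.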

## 4. Mean Grey Wolf Optimizer (MGWO)

A modified variant of the Grey Wolf Optimization algorithm, namely the Mean Grey Wolf Optimization algorithm, has been developed by Singh and Singh [26] by modifying the position update (encircling behavior) equations of the Grey Wolf Optimization algorithm. This variant was developed for the purpose of improving the exploration and exploitation performance of the basic GWO algorithm. It is likewise inspired by the hunting mechanism and leadership hierarchy of grey wolves in nature.
The Mean Grey Wolf Optimization (MGWO) approach is outlined as:
The encircling behavior of each member of the pack is calculated by the following mathematical equations:
$\vec{d} = | \vec{c} \cdot \vec{x}_p(t) - \mu \times \vec{x}(t) |$
$\vec{x}(t+1) = \vec{x}_p(t) - \vec{a} \cdot \vec{d}$
The coefficient vectors $\vec{a}$ and $\vec{c}$ are formulated as below:
$\vec{a} = 2l \cdot \vec{r}_1$
$\vec{c} = 2 \cdot \vec{r}_2$
where $t$ indicates the current iteration, $a →$ and $c →$ are coefficient vectors, $r 1 , r 2$ are random vectors in [0, 1], $x → p$ is the vector of the prey, and $x →$ indicates the position vector of a grey wolf.
Hunting: In order to mathematically simulate hunting behavior, we suppose that the alpha, beta and delta have better knowledge about the potential location of the prey. The following equations are developed in this regard.
$\vec{d}_\alpha = | \vec{c}_1 \cdot \vec{x}_\alpha - \mu \times \vec{x} |, \quad \vec{d}_\beta = | \vec{c}_2 \cdot \vec{x}_\beta - \mu \times \vec{x} |, \quad \vec{d}_\delta = | \vec{c}_3 \cdot \vec{x}_\delta - \mu \times \vec{x} |$
$\vec{x}_1 = \vec{x}_\alpha - \vec{a}_1 \cdot \vec{d}_\alpha, \quad \vec{x}_2 = \vec{x}_\beta - \vec{a}_2 \cdot \vec{d}_\beta, \quad \vec{x}_3 = \vec{x}_\delta - \vec{a}_3 \cdot \vec{d}_\delta$
$\vec{x}(t+1) = \dfrac{\vec{x}_1 + \vec{x}_2 + \vec{x}_3}{3}$
Search for prey and attacking prey: $\vec{a}$ is a random value in the interval $[-2a, 2a]$. When $|\vec{a}| < 1$, the wolves are forced to attack the prey. Searching for prey is the exploration ability and attacking the prey is the exploitation ability. Arbitrary values of $\vec{a}$ are utilized to force the search to move away from the prey.
When $| a → | > 1$, the members of the population are enforced to diverge from the prey.
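The hunting equations above can be sketched per wolf as follows. This is a minimal illustrative sketch: the paper denotes $\mu$ simply as "a mean" without specifying how it is computed, so the sketch takes it as a scalar supplied by the caller, and the function name `mgwo_hunt` and the coefficient formulation (following the common GWO construction) are assumptions:

```python
import numpy as np

def mgwo_hunt(x, alpha, beta, delta, a_scalar, mu, rng):
    """MGWO hunting update for one wolf x, guided by the alpha, beta and
    delta leaders; mu is the mean factor of the MGWO encircling equations."""
    dim = x.shape[0]
    candidates = []
    for leader in (alpha, beta, delta):
        a_vec = 2 * a_scalar * rng.random(dim) - a_scalar  # coefficient vector a
        c_vec = 2 * rng.random(dim)                        # coefficient vector c
        d = np.abs(c_vec * leader - mu * x)                # distance to this leader
        candidates.append(leader - a_vec * d)              # leader-guided move
    # the new position is the average of the three leader-guided moves
    return sum(candidates) / 3.0
```

Applying this update to every wolf while `a_scalar` decreases over the generations reproduces the exploration-to-exploitation transition described above.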

## 5. Hybrid Algorithm

Several researchers have developed hybrid nature-inspired approaches for improving the exploration and exploitation performance of existing algorithms. According to Talbi [47], two variants can be hybridized in a low-level or high-level fashion, with relay or coevolutionary techniques, as heterogeneous or homogeneous hybrids.
In this text, we hybridize the Whale Optimizer Algorithm with the Mean Grey Wolf Optimizer using a low-level coevolutionary mixed hybrid. The hybrid is low-level because we merge the functionality of both approaches. It is coevolutionary because we do not use the two variants one after another; in other words, they run in parallel. It is mixed because two different approaches are involved in generating the final optimal solution of the test benchmark and real-life problems. Through this modification, we combine the exploitation capability of the Mean Grey Wolf Optimizer with the exploration capability of the Whale Optimizer Algorithm to exploit the strengths of both approaches.
In this research, the Whale Optimizer Algorithm is used for the exploration phase as it uses a logarithmic spiral function, so it covers broader areas in uncertain search spaces. Since both variants are randomized approaches, the search space is treated as unknown during the computation, from the first iteration to the maximum iteration limit. The exploration phase refers to the ability of the variant to try out large numbers of feasible solutions. The position of the grey wolf that is responsible for finding the global optimum solution of the problem is replaced with the position of the whale, which is equivalent to the position of the grey wolf but highly efficient at moving a solution towards an optimal one. The Whale Optimizer Algorithm directs the wolves towards an optimal value and reduces computational time. We know that the Grey Wolf Optimizer is a recognized approach that exploits the best possible solution from its unknown search space. Therefore, a mixture of the best characteristics (exploitation from the Mean Grey Wolf Optimizer and exploration from the Whale Optimizer Algorithm) helps to obtain the best possible global optimal solution of real-life and standard problems while avoiding local stagnation or local optima problems. The hybrid WOA-MGWO merges the strengths of the Mean Grey Wolf Optimizer in exploitation and the Whale Optimizer Algorithm in exploration towards the targeted optimum solution.
Mathematical model for HAGWO is given as follows:
In the HAGWO variant, the positions of alpha, beta and delta are updated using the spiral updating equation of the whale optimizer algorithm for the purpose of improving the convergence performance of the MGWO algorithm. The rest of the operations of the Mean GWO and WOA algorithms are the same. The following spiral and hunting position update equations are developed in this regard.
$M = \mu \times \vec{d}' \cdot e^{bl} \cos(2 \pi l) + \vec{x}^*(t)$
$\vec{d}_\alpha = | \vec{c}_1 \cdot \vec{x}_\alpha - \mu \times M |, \quad \vec{d}_\beta = | \vec{c}_2 \cdot \vec{x}_\beta - \mu \times M |, \quad \vec{d}_\delta = | \vec{c}_3 \cdot \vec{x}_\delta - \mu \times M |$
where $\vec{d}' = | \vec{x}^* - \vec{x} |$ indicates the distance of the $i$th whale to the prey, $b$ is a constant defining the shape of the logarithmic spiral, $\vec{d}_\alpha$, $\vec{d}_\beta$ and $\vec{d}_\delta$ are the distances to the three best search agents, $\mu$ is a mean and $l$ is a random number in [−1, 1].
Pseudo Code of HAGWO:

```
Initialize the population
Find the fitness of each search member
x* is the best search member
while (t < max. number of generations)
    for every search member
        Update a, a⃗, c⃗, l and p
        if (p < 0.5)
            if (|a⃗| < 1)
                Update the position of the current search member by Equation (1)
            else if (|a⃗| ≥ 1)
                Select a random search member (x⃗_rand)
                Update the position of the current search member by Equation (8)
            end if
        else if (p ≥ 0.5)
            Update the position of the current search member by Equations (16) and (17)
        end if
    end for
    Find the fitness of all search members
    Update x*, d⃗_α, d⃗_β and d⃗_δ
    t = t + 1
end while
return x*
```
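The pseudocode above can be sketched as a compact Python loop. This is a minimal illustrative sketch on the sphere test function, not the authors' MATLAB code: the function names `hagwo` and `sphere`, the bounds, the constant choices $b = 1$ and $\mu = 1$, and the clipping step are assumptions:

```python
import numpy as np

def sphere(x):
    """Simple test objective: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def hagwo(fitness, dim=5, n_agents=20, max_gen=200, lb=-10.0, ub=10.0, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lb, ub, (n_agents, dim))
    fit = np.array([fitness(p) for p in pos])
    best_i = int(np.argmin(fit))
    x_star, best_val = pos[best_i].copy(), fit[best_i]
    b, mu = 1.0, 1.0  # spiral constant and mean factor (assumed values)
    for t in range(max_gen):
        a = 2.0 - 2.0 * t / max_gen  # linearly decreased from 2 to 0
        order = np.argsort(fit)
        alpha, beta, delta = (pos[k].copy() for k in order[:3])
        for i in range(n_agents):
            p = rng.random()
            a_vec = 2 * a * rng.random(dim) - a
            c_vec = 2 * rng.random(dim)
            if p < 0.5:
                if np.all(np.abs(a_vec) < 1):
                    # encircle the best solution found so far (Equation (1))
                    d = np.abs(c_vec * x_star - pos[i])
                    pos[i] = x_star - a_vec * d
                else:
                    # explore relative to a randomly chosen agent (Equation (8))
                    x_rand = pos[rng.integers(n_agents)].copy()
                    d = np.abs(c_vec * x_rand - pos[i])
                    pos[i] = x_rand - a_vec * d
            else:
                # spiral move, then mean-guided hunt by the leaders (Equations (16)-(17))
                l = rng.uniform(-1, 1)
                m = mu * np.abs(x_star - pos[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + x_star
                moves = []
                for leader in (alpha, beta, delta):
                    a_k = 2 * a * rng.random(dim) - a
                    c_k = 2 * rng.random(dim)
                    moves.append(leader - a_k * np.abs(c_k * leader - mu * m))
                pos[i] = sum(moves) / 3.0
            pos[i] = np.clip(pos[i], lb, ub)
            fit[i] = fitness(pos[i])
            if fit[i] < best_val:
                x_star, best_val = pos[i].copy(), fit[i]
    return x_star, best_val
```

Calling `hagwo(sphere)` returns the best point found and its objective value; the best solution only ever improves, since `x_star` is replaced only when a strictly better agent appears.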

## 6. Parameter Setting

Computational experiments were performed to fine-tune the values of the various parameters for best performance. For that purpose, the number of search agents was set to 20 and the number of generations was varied over [5, 5000].

## 7. Test Problems

A newly developed approach is often evaluated only on standard benchmark functions. In this article, however, we consider a test set of Unimodal, Multimodal and Fixed-dimension multimodal functions with varying difficulty levels and problem sizes. The capability of the new hybrid variant, Particle Swarm Optimization, Grey Wolf Optimizer, Whale Optimizer Algorithm and Mean Grey Wolf Optimizer has been verified on these three types of function sets. The exact details of these test problems are given in Table 1, Table 2 and Table 3.

## 8. The Performance of the HAGWO Algorithm

The performance of several population-based metaheuristics has been compared with that of the new variant in order to test their stability, convergence rate and computational accuracy over the number of iterations in Figure 1. We used the same parameter constants (Section 6) for all the variants to make a valid comparison. We illustrate the results in Figure 1 by plotting the optimal objective values against the number of iterations for problem sizes from 20 to 100 dimensions.
The figure shows that, as the number of generations increases, the standard test function values decrease more quickly for the new variant than for the other metaheuristics. In Figure 1, the PSO, GWO, WOA and MGWO variants suffer from slow convergence and become stuck in local minima; invoking the Mean Grey Wolf strategy in the new hybrid algorithm avoids trapping in local minima and accelerates the search.

## 9. Analysis

The capability of the improved metaheuristic has been tested on 29 benchmark functions. We chose these benchmark functions so as to be able to compare our numerical and statistical results with those of recent nature-inspired techniques. These test functions are shown in Table 1, Table 2 and Table 3, where dim represents the dimension of the objective function, Range is the boundary of the objective function's search space and $f_{min}$ is the optimum.
The HAGWO variant was run 20 times on each standard function. The numerical and statistical solutions (standard deviation and average) are reported in Table 1, Table 2, Table 3, Table 4, Table 5, Table 6, Table 7, Table 8 and Table 9. For verifying the solutions, the HAGWO variant is compared with the PSO, GWO, WOA and MGWO algorithms. In WOA, MGWO and the newly developed HAGWO approach, the balance between the local and global exploration abilities is mainly controlled by the mean. Numerical and statistical experiments have been performed to illustrate this. With the best parameter settings, it was found that the best global optimal solutions lie within a reasonable number of generations. A number of criteria have been used to evaluate the capability of GWO, PSO, WOA and MGWO against HAGWO. The mean and standard deviation statistics are used to evaluate reliability. The average computational time of the successful runs and the average number of function evaluations of successful runs are used to estimate the cost of each standard problem.
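The reliability statistics used throughout the tables (minimum, maximum, mean and standard deviation over 20 independent runs) can be computed as below; the run values here are hypothetical placeholders, not results from the paper:

```python
import statistics

# Hypothetical best-objective values from 20 independent runs of one algorithm
run_results = [0.012, 0.009, 0.015, 0.011, 0.010, 0.013, 0.008, 0.014,
               0.012, 0.010, 0.011, 0.009, 0.013, 0.012, 0.010, 0.011,
               0.015, 0.009, 0.010, 0.012]

best, worst = min(run_results), max(run_results)   # solution quality bounds
mean = statistics.mean(run_results)                # average performance
std = statistics.stdev(run_results)                # sample standard deviation (reliability)
print(f"best={best}, worst={worst}, mean={mean:.4f}, std={std:.4f}")
```

A low mean and standard deviation across runs indicate that an algorithm reliably reaches solutions of similar quality, which is the criterion used in the comparisons above.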
For Unimodal benchmark functions, the quality of the global optimal solution obtained is considered by the maximum, minimum, standard deviation and mean of the objective function values out of twenty runs. This is shown in Table 4 and Table 5 and convergence performance of PSO, GWO, WOA, MGWO and HAGWO algorithms are shown in Figure 2.
Furthermore, the performance of the algorithms in solving multimodal benchmark functions is shown in Table 6, Table 7 and the convergence curve is shown in Figure 3, respectively.
Furthermore, the statistical and numerical global optimal solutions using PSO, GWO, WOA, MGWO and HAGWO variants on fixed dimensional multimodal benchmark functions are given in Table 8 and Table 9. The performance of algorithms is shown in Figure 4.
The performance of the newly existing variant has been tested on the standard, bio-medical and engineering real life functions in terms of minimum objective function values, maximum objective function values, mean and standard deviations (Table 4, Table 5, Table 6, Table 7, Table 8 and Table 9).
Here, the maximum and minimum values represent the worst and best costs of the functions over the runs. On the other hand, the mean and standard deviation statistics are used to evaluate reliability. Furthermore, the convergence graphs of the functions represent the convergence performance of the algorithms.
Summing up, Table 4, Table 6 and Table 8 show that the new hybrid approach provides the best optimal values of the functions, in terms of minimum and maximum function values, compared to the other meta-heuristics, and Table 5, Table 7 and Table 9 illustrate that the hybrid approach also gives superior standard deviation and mean values, outperforming the others. Finally, the convergence graphs (Figure 2, Figure 3 and Figure 4) show that the proposed approach finds the best optimal values of the standard functions in the fewest iterations compared to the others.

## 10. Experiments and Discussion on the Results

Performance of the proposed variant was tested on a set of 23 standard functions (Unimodal, Multimodal and Fixed-dimension multimodal). These functions were chosen as the test functions. Computer programs for solving the numerical problems using the PSO, WOA, GWO, MGWO and HAGWO pseudo codes were written in MATLAB R2013a and run on a machine with an Intel Core i5-430M processor, Intel HD Graphics, 3 GB memory and a 320 GB HDD. The maximum number of generations was 5000; this parameter setting was used to test the ability of the meta-heuristics.
As per the numerical results of Table 4, the improved hybrid variant is capable of giving very competitive global optimal solutions. This variant outperforms all other metaheuristics on all unimodal functions. It may be noted that these test problems are suitable for benchmarking exploitation. These numerical and statistical global optimal solutions indicate that the improved hybrid variant is more reliable in giving superior quality solutions in terms of exploiting the global optimum.
While observing Table 6, the superiority of the result obtained is measured by the minimum and maximum objective function value, average and standard deviation and the objective function values out of 20 runs. It can be seen that HAGWO gives a better quality of results as compared to other metaheuristics. Thus, for the multimodal benchmark functions, HAGWO outperforms PSO, GWO, WOA, MGWO with respect to efficiency, reliability, cost and robustness.
Fixed-dimension multimodal functions have many local optima with the number growing exponentially with dimension. This makes them fitting for benchmarking the exploration capacity of a variant. As per the results shown in Table 8, the HAGWO variant is competent to provide very competitive solutions to these problems as well. This variant outperforms PSO, WOA, GWO and MGWO on the majority of these test functions. Hence, HAGWO variant has merit in terms of exploration.
A number of criteria have been applied to find out the performance of PSO, GWO, WOA, MGWO and the new hybrid approach of GWO variants. The mean and standard deviation statistical values are used to evaluate the reliability in Table 5, Table 7 and Table 9. The average computational time of the successful runs and the average number of function evaluations of successful runs, are applied to estimate the cost of the standard function.
In Figure 2, Figure 3 and Figure 4, the convergence performance of PSO, GWO, WOA, MGWO and HAGWO variants in solving unimodal benchmark functions is compared; obtained convergence solutions prove that the HAGWO variant is more able to find the best optimal solution in minimum number of iterations. Hence, the HAGWO variant avoids premature convergence of the search process to local optimal points and provides superior exploration of the search course.
To sum up, all simulation results assert that the new hybrid approach is very helpful in improving the efficiency of the Whale Optimizer Algorithm and the Mean Grey Wolf Optimizer in terms of result quality as well as computational effort.

## 11. Bio-Medical Science Real Life Applications

In this section, four dataset problems are employed: (i) Iris, (ii) XOR, (iii) Balloon and (iv) Breast Cancer (Mirjalili, S. [48]). These real-life problems have been solved using the new hybrid variant and compared with the PSO, WOA, GWO and MGWO meta-heuristics. Different parameter settings have been used for running the code of the meta-heuristics, and these parameter settings are described in Appendix Table A1. The capability of the variants has been compared in terms of minimum objective function value, maximum objective function value, average, standard deviation, classification rate and convergence rate in Table 10. All these real-life applications are discussed step by step in this section:
The performance of the metaheuristics has been tested under different parameter settings, as shown in Appendix Table A2. The experimental numerical and statistical results of HAGWO, PSO, WOA, GWO and MGWO on these datasets are given in Table 10, and the convergence performance of the algorithms is shown in Figure 5. Table 10 shows that the HAGWO algorithm gives superior numerical and statistical solutions in comparison to the other meta-heuristics. The results indicate that HAGWO has the highest capability to avoid local optima and is considerably superior to algorithms such as PSO, WOA, GWO and MGWO.
Secondly, the performance of the meta-heuristics has been compared in terms of average, standard deviation, classification rate (Table 10) and convergence rate (Figure 5). A low average and standard deviation indicate superior local optima avoidance. On the basis of the obtained solutions, we conclude that the new hybrid algorithm gives highly competitive results compared to the other metaheuristics, and the convergence graphs show that HAGWO gives better solutions than the PSO, WOA, GWO and MGWO variants.

## 12. Welded Beam Design

This function is designed for minimum fabrication cost subject to constraints on shear stress ($\tau$), bending stress in the beam ($\sigma$), buckling load on the bar ($p_c$), end deflection of the beam ($\delta$) and side constraints. There are four design variables: $h(x_1), l(x_2), t(x_3)$ and $b(x_4)$. The WBD function can be mathematically formulated as below [49]:
$\min f(Y) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$
subject to
$g_1(Y) = \tau(Y) - \tau_{\max} \le 0,$
$g_2(Y) = \sigma(Y) - \sigma_{\max} \le 0,$
$g_3(Y) = x_1 - x_4 \le 0,$
$g_4(Y) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0,$
$g_5(Y) = 0.125 - x_1 \le 0,$
$g_6(Y) = \delta(Y) - \delta_{\max} \le 0,$
$g_7(Y) = p - p_c(Y) \le 0,$
where
$\tau(Y) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \frac{x_2}{2R} + (\tau'')^2}, \quad \tau' = \frac{p}{\sqrt{2} x_1 x_2}, \quad \tau'' = \frac{M R}{J}, \quad M = p \left( L + \frac{x_2}{2} \right),$
$R = \sqrt{x_2^2 / 4 + ((x_1 + x_3)/2)^2}, \quad J = 2 \left\{ \sqrt{2} x_1 x_2 \left[ x_2^2 / 12 + ((x_1 + x_3)/2)^2 \right] \right\},$
$\sigma(Y) = \frac{6 p L}{x_4 x_3^2}, \quad \delta(Y) = \frac{4 p L^3}{E x_3^3 x_4},$
$p_c(Y) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2} \left( 1 - \frac{x_3}{2L} \sqrt{\frac{E}{4G}} \right).$
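The objective and constraint values can be evaluated as below. This sketch uses the numerical constants commonly adopted for this problem in the literature, assumed here since the text does not list them: $P = 6000$ lb, $L = 14$ in, $E = 30 \times 10^6$ psi, $G = 12 \times 10^6$ psi, $\tau_{\max} = 13600$ psi, $\sigma_{\max} = 30000$ psi and $\delta_{\max} = 0.25$ in:

```python
import math

def welded_beam(x):
    """Cost and constraint values g1..g7 (feasible when all g <= 0) for the
    Welded Beam Design problem, with the usual literature constants assumed.
    x = (h, l, t, b) = (x1, x2, x3, x4)."""
    x1, x2, x3, x4 = x
    P, L, E, G = 6000.0, 14.0, 30e6, 12e6
    tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25

    # fabrication cost (objective)
    cost = 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

    # intermediate stress quantities
    M = P * (L + x2 / 2.0)
    R = math.sqrt(x2**2 / 4.0 + ((x1 + x3) / 2.0)**2)
    J = 2.0 * (math.sqrt(2) * x1 * x2 * (x2**2 / 12.0 + ((x1 + x3) / 2.0)**2))
    tau_p = P / (math.sqrt(2) * x1 * x2)
    tau_pp = M * R / J
    tau = math.sqrt(tau_p**2 + 2 * tau_p * tau_pp * x2 / (2 * R) + tau_pp**2)
    sigma = 6 * P * L / (x4 * x3**2)
    delta = 4 * P * L**3 / (E * x3**3 * x4)
    p_c = (4.013 * E * math.sqrt(x3**2 * x4**6 / 36.0) / L**2) \
        * (1 - (x3 / (2 * L)) * math.sqrt(E / (4 * G)))

    g = [tau - tau_max,
         sigma - sigma_max,
         x1 - x4,
         0.10471 * x1**2 + 0.04811 * x3 * x4 * (14.0 + x2) - 5.0,
         0.125 - x1,
         delta - delta_max,
         P - p_c]
    return cost, g
```

Evaluating the well-known near-optimal design (0.2057, 3.4705, 9.0366, 0.2057) gives a cost of about 1.7249, which can serve as a sanity check when plugging in any optimizer.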
During last few years, many scientists and researchers have used several types of nature-inspired metaheuristics to locate the best optimal results of the Welded Beam Design (WBD) problem in the literature, such as Genetic Algorithm (GA) [50,51,52], Unified Particle Swarm Optimization (UPSO) [53], Artificial Bee Colony algorithm (ABC) [54], Co-evolutionary Differential Evolution (CDE) [55], Co-evolutionary Particle Swarm Optimization (CPSO) [56], Harmony Search algorithm (IHS) [57], Moth-Flame Optimization algorithm (MFO) [33], Adaptive Firefly Algorithm (AFA) [58], Charged System Search (CSS) [59] and Lightning Search Algorithm-Simplex Method (LSA-SM) [49].
In Table 11 and Figure 6, we compare the optimal solutions of the new hybrid approach (HAGWO) and other metaheuristics found in the literature; the proposed variant achieves solutions better than several recent metaheuristics, with a minimum cost of 1.661258 for the welded beam design problem.

## 13. Pressure Vessel Design

This problem involves a cylindrical vessel whose ends are capped by hemispherical heads, as shown in Figure 7. The main objective is to minimize the total cost. The pressure vessel design problem can be mathematically formulated as below [49]:
$\min f(Y) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$
subject to
$g_1(Y) = -x_1 + 0.0193 x_3 \le 0,$
$g_2(Y) = -x_2 + 0.00954 x_3 \le 0,$
$g_3(Y) = -\pi x_3^2 x_4 - \frac{4}{3} \pi x_3^3 + 1296000 \le 0,$
$g_4(Y) = x_4 - 240 \le 0,$
where $x_1$ is the thickness of the shell, $x_2$ is the thickness of the head, $x_3$ is the inner radius and $x_4$ is the length of the cylindrical section of the vessel [57].
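The objective and constraints can be evaluated as below. The objective coefficients (0.6224, 1.7781, 3.1661, 19.84) are the ones commonly used for this problem in the literature, assumed here, and the function name is illustrative:

```python
import math

def pressure_vessel(x):
    """Cost and constraint values g1..g4 (feasible when all g <= 0) for the
    Pressure Vessel Design problem.
    x = (x1, x2, x3, x4) = (shell thickness, head thickness, inner radius, length)."""
    x1, x2, x3, x4 = x
    # total cost of material, forming and welding (objective)
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [-x1 + 0.0193 * x3,                                          # shell thickness
         -x2 + 0.00954 * x3,                                         # head thickness
         -math.pi * x3**2 * x4 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,  # volume
         x4 - 240.0]                                                 # length limit
    return cost, g
```

For reference, the classic design (0.8125, 0.4375, 42.0984, 176.6366) widely cited in the literature evaluates to a cost of about 6059.7, which is useful for checking an optimizer against this formulation.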
During the last few decades, several researchers have used different types of metaheuristics to find the best possible optimal solutions of the Pressure Vessel Design Problem in the literature such as Genetic Algorithm (GA) [50,51,52], Artificial Bee Colony algorithm (ABC) [54], Co-evolutionary Differential Evolution (CDE) [55], Co-evolutionary Particle Swarm Optimization (CPSO) [56], Improved Harmony Search algorithm (IHS) [57], Moth-Flame Optimization algorithm (MFO) [33], Adaptive Firefly Algorithm (AFA) [58], Bat Algorithm (BA) [60], Cuckoo Search algorithm (CS) [61], Evolution Strategies (ES) [62], Ant Colony Optimization (ACO) [63], Teaching-Learning-Based Optimization (TLBO) [64] and Lightning Search Algorithm-Simplex Method (LSA-SM) [49].
The experimental results of the different metaheuristics on the pressure vessel design problem are reported in Table 12. It can be seen that the best optimal value obtained by HAGWO for this problem is 5924.2536. Hence, the HAGWO algorithm provides solutions of superior quality in comparison to the others.

## 14. Conclusions and Future Work

In the current study, we have developed an improved hybrid algorithm that combines the strengths of the Whale Optimizer Algorithm and the Grey Wolf Optimizer algorithm. The hybrid WOA–MGWO improves the quality of the global optimal solutions on benchmark functions because it inherits desirable characteristics of both WOA and MGWO. The Whale Optimizer Algorithm (WOA) is used for the exploration phase: its spiral function covers a broader area of an uncertain search space, so it directs the search agents more rapidly towards the global optimum and reduces computational time. Twenty-three benchmark functions were used to compare the quality of the hybrid variant against PSO, WOA, GWO and MGWO. The numerical and statistical results show that the hybrid strategy delivers superior solutions within a minimum number of iterations; the HAGWO variant thus avoids premature convergence of the search process to local optima and provides stronger exploration.
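The hybridization described above can be sketched as follows. This is an illustrative reconstruction, not the authors' exact implementation: the 50/50 switch between the GWO mean update and the WOA spiral, the spiral constant b = 1, and the loop parameters are assumptions (only the linear decrease of $a$ from 2 to 0 is taken from Table A2).

```python
import numpy as np

def hagwo_sketch(f, dim=5, n_agents=20, iters=100, seed=0):
    """Illustrative HAGWO-style loop minimizing f on [-100, 100]^dim."""
    rng = np.random.default_rng(seed)
    lb, ub = -100.0, 100.0
    X = rng.uniform(lb, ub, (n_agents, dim))
    fit = np.apply_along_axis(f, 1, X)
    for t in range(iters):
        order = np.argsort(fit)
        # copy() so the leaders stay fixed while agents are updated in place
        alpha, beta, delta = (X[order[k]].copy() for k in range(3))
        a = 2.0 - 2.0 * t / iters  # linearly decreased from 2 to 0
        for i in range(n_agents):
            if rng.random() < 0.5:
                # GWO-style move towards the mean of the three leaders
                cand = []
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random(dim) - a
                    C = 2 * rng.random(dim)
                    cand.append(leader - A * np.abs(C * leader - X[i]))
                X[i] = np.mean(cand, axis=0)
            else:
                # WOA logarithmic spiral around the best agent (b = 1 assumed)
                l = rng.uniform(-1.0, 1.0)
                D = np.abs(alpha - X[i])
                X[i] = D * np.exp(l) * np.cos(2.0 * np.pi * l) + alpha
            X[i] = np.clip(X[i], lb, ub)
        fit = np.apply_along_axis(f, 1, X)
    return float(fit.min())
```

For example, `hagwo_sketch(lambda x: float(np.sum(x**2)))` runs the loop on the sphere function F1 and returns a value far below the initial fitness of the random population.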
This article also addresses biomedical classification datasets (XOR, Baloon, Iris, and Breast Cancer) and engineering design (Welded Beam Design and Pressure Vessel Design) problems. The solutions of these problems indicate that the proposed approach is applicable to challenging problems with unknown search spaces.
Future work will focus on two directions: (i) applying the approach to structural damage detection, composite functions, aircraft wings, feature selection, the gear train design problem, the bionic car problem, the cantilever beam and other mechanical engineering functions; and (ii) developing new variants based on nature-inspired algorithms for these tasks. Finally, we hope that this work will encourage young researchers working on recent evolutionary metaheuristic concepts.

## Author Contributions

Narinder Singh designed the numerical experiments, developed the code and prepared the manuscript. Both authors revised and finalized the final draft of the manuscript.

## Conflicts of Interest

The authors declare no conflict of interest.

## Appendix A

Table A1. Classification datasets (Mirjalili [48]).

| Classification Datasets | Number of Attributes | Number of Training Samples | Number of Test Samples | Number of Classes |
|---|---|---|---|---|
| 3-bits XOR | 3 | 8 | 8 as training samples | 2 |
| Baloon | 4 | 16 | 16 as training samples | 2 |
| Iris | 4 | 150 | 150 as training samples | 3 |
| Breast Cancer | 9 | 599 | 100 | 2 |
Table A2. The parameter settings of algorithms.

| Parameter | Value |
|---|---|
| $a →$ | Linearly decreased from 2 to 0 |
| Search Agents | 200 |
| Maximum number of iterations | 100–200 |

## References

1. Shankar, K.; Eswaran, P.A. Secure visual secret share (VSS) creation scheme in visual cryptography using elliptic curve cryptography with optimization technique. Aust. J. Basic Appl. Sci. 2015, 9, 150–163. [Google Scholar]
2. Emary, E.; Zawbaa, H.M.; Grosan, C.; Hassenian, A.E. Feature subset selection approach by gray-wolf optimization. In Proceedings of the First International Afro-European Conference for Industrial Advancement, Addis Ababa, Ethiopia, 17–19 November 2014. [Google Scholar]
3. Yusof, Y.; Mustaffa, Z. Time series forecasting of energy commodity using grey wolf optimizer. In Proceedings of the International Multi Conference of Engineers and Computer Scientists (IMECS ’15), Hong Kong, China, 1 March 2015. [Google Scholar]
4. El-Fergany, A.A.; Hasanien, H.M. Single and multi-objective optimal power flow using grey wolf optimizer and differential evolution algorithms. Electr. Power Compon. Syst. 2015, 43, 1548–1559. [Google Scholar] [CrossRef]
5. Kamboj, V.K.; Bath, S.K.; Dhillon, J.S. Solution of non-convex economic load dispatch problem using Grey Wolf Optimizer. Neural Comput. Appl. 2016, 27, 1301–1316. [Google Scholar] [CrossRef]
6. Komaki, G.M.; Kayvanfar, V. Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time. J. Comput. Sci. 2015, 8, 109–120. [Google Scholar] [CrossRef]
7. Gholizadeh, S. Optimal design of double layer grids considering nonlinear behaviour by sequential grey wolf algorithm. J. Optim. Civ. Eng. 2015, 5, 511–523. [Google Scholar]
8. Pan, T.S.; Dao, T.K.; Nguyen, T.T.; Chu, S.C. A communication strategy for paralleling grey wolf optimizer. Adv. Intell. Syst. Comput. 2015, 388, 253–262. [Google Scholar]
9. Jayapriya, J.; Arock, M. A parallel GWO technique for aligning multiple molecular sequences. In Proceedings of the International Conference on Advances in Computing, Communications and Informatics, Kochi, India, 10–13 August 2015; pp. 210–215. [Google Scholar]
10. Kamboj, V.K. A novel hybrid PSOGWO approach for unit commitment problem. Neural Comput. Appl. 2016, 27, 1643–1655. [Google Scholar] [CrossRef]
11. Emary, E.; Zawbaa, H.M.; Hassanien, A.E. Binary grey wolf optimization approaches for feature selection. Neurocomputing 2016, 172, 371–381. [Google Scholar] [CrossRef]
12. Yang, X.-S. A new metaheuristic bat-inspired algorithm. Available online: https://link.springer.com/chapter/10.1007/978-3-642-12538-6_6 (accessed on 9 March 2018).
13. Holland, J. Adaptation in Natural and Artificial Systems; University of Michigan Press: Ann Arbor, MI, USA, 1975. [Google Scholar]
14. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
15. Dorigo, M. Optimization, Learning and Natural Algorithms. Ph.D. Thesis, Politecnico di Milano, Milan, Italy, 1992. [Google Scholar]
16. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the World Congress on Nature & Biologically Inspired Computing, Coimbatore, India, 9–11 December 2009; pp. 210–214. [Google Scholar]
17. Passino, K.M. Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Syst. Mag. 2002, 22, 52–67. [Google Scholar] [CrossRef]
18. Kennedy, J.; Eberhart, R.C. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
19. Karaboga, D.; Basturk, B. On the performance of artificial bee colony (ABC) algorithm. Appl. Soft Comput. J. 2008, 8, 687–697. [Google Scholar] [CrossRef]
20. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. Int. J. 2013, 222, 175–184. [Google Scholar] [CrossRef]
21. Singh, N.; Singh, S.B. One Half Global Best Position Particle Swarm Optimization Algorithm. Int. J. Sci. Eng. Res. 2011, 2, 1–10. [Google Scholar]
22. Singh, N.; Singh, S.; Singh, S.B. Half Mean Particle Swarm Optimization Algorithm. Int. J. Sci. Eng. Res. 2012, 3, 1–9. [Google Scholar]
23. Singh, N.; Singh, S.B. Personal Best Position Particle Swarm Optimization. J. Appl. Comput. Sci. Math. 2012, 12, 69–76. [Google Scholar]
24. Singh, N.; Singh, S.; Singh, S.B. HPSO: A New Version of Particle Swarm Optimization Algorithm. J. Artif. Intell. 2012, 3, 123–134. [Google Scholar]
25. Singh, N.; Singh, S.; Singh, S.B. A New Hybrid MGBPSO-GSA Variant for Improving Function Optimization Solution in Search Space. Evol. Bioinform. 2017, 13, 1–13. [Google Scholar] [CrossRef] [PubMed]
26. Singh, N.; Singh, S.B. A Modified Mean Grey Wolf Optimization Approach for Benchmark and Biomedical Problems. Evol. Bioinform. 2017, 13, 1–28. [Google Scholar] [CrossRef] [PubMed]
27. Duman, S.; Güvenç, U.; Sönmez, Y.; Yörükeren, N. Optimal power flow using gravitational search algorithm. Energy Convers. Manag. 2012, 59, 86–95. [Google Scholar] [CrossRef]
28. Chowdhury, B.H. Towards the concept of integrated security: Optimal dispatch under static and dynamic security constraints. Electr. Power Syst. Res. 1992, 25, 213–225. [Google Scholar] [CrossRef]
29. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 1–14. [Google Scholar] [CrossRef]
30. Daryani, N.; Hagh, M.T.; Teimourzadeh, S. Adaptive group search optimization algorithm for multi-objective optimal power flow problem. Appl. Soft Comput. 2016, 38, 1012–1024. [Google Scholar] [CrossRef]
31. Mirjalili, S. The Ant Lion Optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
32. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [Google Scholar] [CrossRef]
33. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
34. Mukherjee, A.; Mukherjee, V. Solution of optimal power flow using chaotic krill herd algorithm. Chaos Solitons Fractals 2015, 78, 10–21. [Google Scholar] [CrossRef]
35. Mirjalili, S. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2016, 105, 30–47. [Google Scholar]
36. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-Verse Optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 2, 495–513. [Google Scholar] [CrossRef]
37. Bouchekara, H.R.E.H. Optimal power flow using black-hole-based optimization approach. Appl. Soft Comput. 2014, 24, 879–888. [Google Scholar] [CrossRef]
38. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 4, 1053–1073. [Google Scholar] [CrossRef]
39. Singh, N.; Singh, S.B. Hybrid Algorithm of Particle Swarm Optimization and Grey Wolf Optimizer for Improving Convergence Performance. J. Appl. Math. 2017, 2017, 2030489. [Google Scholar] [CrossRef]
40. Allah, R.M.R. Hybridizing sine cosine algorithm with multi-orthogonal search strategy for engineering design problems. J. Comput. Des. Eng. 2017, in press. [Google Scholar]
41. Bentouati, B.; Chaib, L.; Saliha, C. A hybrid whale algorithm and pattern search technique for optimal power flow problem. In Proceedings of the 8th International Conference on Modelling, Identification and Control (ICMIC-2016), Algiers, Algeria, 15–17 November 2016. [Google Scholar]
42. Singh, N.; Singh, S.B. A novel hybrid GWO-SCA approach for optimization problems. Eng. Sci. Technol. Int. J. 2017. [Google Scholar] [CrossRef]
43. Tawhid, M.A.; Ali, A.F. A Hybrid grey wolf optimizer and genetic algorithm for minimizing potential energy function. Memet. Comput. 2017, 9, 347–359. [Google Scholar] [CrossRef]
44. Emary, E.; Zawbaa, H.M. Impact of chaos functions on modern swarm optimizers. PLoS ONE 2016, 11, e0158738. [Google Scholar] [CrossRef] [PubMed]
45. Emary, E.; Zawbaa, H.M.; Salam, M.A. A Proposed Whale Search Algorithm with Adaptive Random Walk. In Proceedings of the IEEE 13th International Conference on Intelligent Computer Communication and Processing, Cluj-Napoca, Romania, 7–9 September 2017. [Google Scholar] [CrossRef]
46. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
47. Talbi, E.G. A Taxonomy of Hybrid Metaheuristic. J. Heuristics 2002, 8, 541–546. [Google Scholar] [CrossRef]
48. Mirjalili, S. How effective is the Grey Wolf Optimizer in training multi-layer perceptrons. Appl. Intell. 2015, 43, 150–161. [Google Scholar] [CrossRef]
49. Lu, Y.; Zhou, Y.; Wu, X. A Hybrid Lightning Search Algorithm-Simplex Method for Global Optimization. Discret. Dyn. Nat. Soc. 2017, 2017, 8342694. [Google Scholar] [CrossRef]
50. Coello, C.A.C. Use of a self-adaptive penalty approach for engineering optimization problems. Comput. Ind. 2000, 41, 113–127. [Google Scholar] [CrossRef]
51. Coello, C.A.C.; Montes, E.M. Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv. Eng. Inform. 2002, 16, 193–203. [Google Scholar] [CrossRef]
52. Deb, K. Optimal design of a welded beam via genetic algorithms. AIAA J. 1991, 29, 2013–2015. [Google Scholar]
53. Parsopoulos, K.E.; Vrahatis, M.N. Unified particle swarm optimization for solving constrained engineering optimization problems. In Lecture Notes in Computer Science, Proceedings of the Advances in Natural Computation, Changsha, China, 27–29 August 2005; Springer: Berlin, Germany, 2005; Volume 3612, pp. 582–591. [Google Scholar]
54. Akay, B.; Karaboga, D. Artificial bee colony algorithm for large-scale problems and engineering design optimization. J. Intell. Manuf. 2012, 23, 1001–1014. [Google Scholar] [CrossRef]
55. Huang, F.Z.; Wang, L.; He, Q. An effective co-evolutionary differential evolution for constrained optimization. Appl. Math. Comput. 2007, 186, 340–356. [Google Scholar] [CrossRef]
56. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99. [Google Scholar] [CrossRef]
57. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579. [Google Scholar] [CrossRef]
58. Baykasoğlu, A.; Ozsoydan, F.B. Adaptive firefly algorithm with chaos for mechanical design optimization problems. Appl. Soft Comput. 2015, 36, 152–164. [Google Scholar] [CrossRef]
59. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
60. Gandomi, A.H.; Yang, X.S.; Alavi, A.H.; Talatahari, S. Bat algorithm for constrained optimization tasks. Neural Comput. Appl. 2013, 22, 1239–1255. [Google Scholar] [CrossRef]
61. Gandomi, A.H.; Yang, X.S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
62. Mezura-Montes, E.; Coello, C.A.C. An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int. J. Gen. Syst. 2008, 37, 443–473. [Google Scholar] [CrossRef]
63. Kaveh, A.; Talatahari, S. An improved ant colony optimization for constrained engineering design problems. Eng. Comput. 2010, 27, 155–182. [Google Scholar] [CrossRef]
64. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
Figure 1. Convergence graph of metaheuristics. (a) Dim 20; (b) Dim 40; (c) Dim 60; (d) Dim 80; (e) Dim 100.
Figure 2. Convergence Curve of PSO, GWO, WOA, MGWO and HAGWO variants on Unimodal benchmark functions.
Figure 3. Convergence Curve of PSO, GWO, WOA, MGWO and HAGWO variants on Multimodal benchmark functions.
Figure 4. Convergence Curve of PSO, GWO, WOA, MGWO and HAGWO variants on Fixed-dimension multimodal benchmark functions.
Figure 5. (a) Convergence graph of Iris dataset problem; (b) Convergence graph of XOR dataset problem; (c) Convergence graph of Baloon dataset problem; (d) Convergence graph of Breast cancer dataset problem.
Figure 6. Comparison of best optimal value of the metaheuristics on welded beam design problem.
Figure 7. The Pressure Vessel Design Problem.
Table 1. Unimodal benchmark functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $F_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0 |
| $F_2(x) = \sum_{i=1}^{n} \lvert x_i \rvert + \prod_{i=1}^{n} \lvert x_i \rvert$ | 30 | [−10, 10] | 0 |
| $F_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | 30 | [−100, 100] | 0 |
| $F_4(x) = \max_i \{ \lvert x_i \rvert, 1 \le i \le n \}$ | 30 | [−100, 100] | 0 |
| $F_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 30 | [−30, 30] | 0 |
| $F_6(x) = \sum_{i=1}^{n} ([x_i + 0.5])^2$ | 30 | [−100, 100] | 0 |
| $F_7(x) = \sum_{i=1}^{n} i x_i^4 + rand[0, 1)$ | 30 | [−1.28, 1.28] | 0 |
Table 2. Multimodal benchmark functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $F_8(x) = \sum_{i=1}^{n} -x_i \sin\left( \sqrt{\lvert x_i \rvert} \right)$ | 30 | [−500, 500] | −418.9829 × 5 |
| $F_9(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos(2 \pi x_i) + 10 \right]$ | 30 | [−5.12, 5.12] | 0 |
| $F_{10}(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n} \sum_{i=1}^{n} \cos(2 \pi x_i) \right) + 20 + e$ | 30 | [−32, 32] | 0 |
| $F_{11}(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1$ | 30 | [−600, 600] | 0 |
| $F_{12}(x) = \frac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$, $u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a < x_i < a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$ | 30 | [−50, 50] | 0 |
| $F_{13}(x) = 0.1 \left\{ \sin^2(3 \pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 \left[ 1 + \sin^2(3 \pi x_i + 1) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2 \pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | 30 | [−50, 50] | 0 |
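The listed global minima can be sanity-checked directly. Below is a minimal sketch of three of the multimodal functions above (Rastrigin $F_9$, Ackley $F_{10}$ and Griewank $F_{11}$), each of which attains $f_{min} = 0$ at the origin:

```python
import numpy as np

def F9(x):   # Rastrigin
    return float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10))

def F10(x):  # Ackley
    n = len(x)
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def F11(x):  # Griewank
    i = np.arange(1, len(x) + 1)
    return float(np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)
```

At `x = np.zeros(30)` all three functions evaluate to 0 (up to floating-point error), matching the $f_{min}$ column.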
Table 3. Fixed-dimension multimodal benchmark functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $F_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right)^{-1}$ | 2 | [−65, 65] | 1 |
| $F_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | 4 | [−5, 5] | 0.00030 |
| $F_{16}(x) = 4 x_1^2 - 2.1 x_1^4 + \frac{1}{3} x_1^6 + x_1 x_2 - 4 x_2^2 + 4 x_2^4$ | 2 | [−5, 5] | −1.0316 |
| $F_{17}(x) = \left( x_2 - \frac{5.1}{4 \pi^2} x_1^2 + \frac{5}{\pi} x_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8 \pi} \right) \cos x_1 + 10$ | 2 | [−5, 5] | 0.398 |
| $F_{18}(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$ | 2 | [−2, 2] | 3 |
| $F_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \right)$ | 3 | [1, 3] | −3.86 |
| $F_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \right)$ | 6 | [0, 1] | −3.32 |
| $F_{21}(x) = -\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.1532 |
| $F_{22}(x) = -\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.4028 |
| $F_{23}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.5363 |
Table 4. Numerical results of unimodal benchmark functions.

| Problem | PSO $f_{min}$ | PSO $f_{max}$ | GWO $f_{min}$ | GWO $f_{max}$ | WOA $f_{min}$ | WOA $f_{max}$ | MGWO $f_{min}$ | MGWO $f_{max}$ | HAGWO $f_{min}$ | HAGWO $f_{max}$ |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 7.8901 × 10^−28 | 6.2609 × 10^4 | 1.2292 × 10^−293 | 5.7592 × 10^4 | 0 | 7.2834 × 10^4 | 7.4502 × 10^−267 | 7.2336 × 10^4 | 0 | 7.5361 × 10^4 |
| 2 | 5.9781 × 10^−12 | 1.1398 × 10^12 | 1.2084 × 10^−153 | 8.2863 × 10^11 | 0 | 7.6425 × 10^13 | 9.3431 × 10^−154 | 6.0482 × 10^13 | 0 | 8.8066 × 10^13 |
| 3 | 27.2552 | 1.4283 × 10^5 | 8.2741 × 10^−12 | 1.9326 × 10^5 | 6.6018 × 10^4 | 1.3360 × 10^5 | 6.0565 × 10^−13 | 1.6858 × 10^5 | 5.9829 × 10^−17 | 2.4735 × 10^5 |
| 4 | 1.9922 | 89.3164 | 3.3355 × 10^−64 | 84.5135 | 13.6245 | 89.1133 | 1.8414 × 10^−65 | 82.5356 | 1.5274 × 10^−86 | 89.9799 |
| 5 | 130.8650 | 3.0054 × 10^8 | 27.9593 | 3.5339 × 10^8 | 28.7815 | 3.3241 × 10^8 | 27.1707 | 2.9514 × 10^8 | 27.1630 | 3.7213 × 10^8 |
| 6 | 7.6619 × 10^−5 | 6.4409 × 10^4 | 1.9983 | 6.7418 × 10^4 | 1.0113 | 7.5845 × 10^4 | 1.7551 | 7.3870 × 10^4 | 1.7255 | 7.5449 × 10^4 |
| 7 | 0.0691 | 82.0814 | 0.0012 | 81.8383 | 4.2362 × 10^−4 | 137.7861 | 3.1935 × 10^−4 | 140.1725 | 0.0012 | 141.0199 |
Table 5. Statistical results of unimodal benchmark functions.

| Problem | PSO μ | PSO σ | GWO μ | GWO σ | WOA μ | WOA σ | MGWO μ | MGWO σ | HAGWO μ | HAGWO σ |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 99.1220 | 1.7580 × 10^3 | 90.2939 | 1.8259 × 10^3 | 71.4189 | 1.7312 × 10^3 | 102.3734 | 2.0195 × 10^3 | 77.6527 | 1.9623 × 10^3 |
| 2 | 2.2796 × 10^10 | 1.6119 × 10^10 | 1.9355 × 10^8 | 1.1882 × 10^10 | 1.5336 × 10^10 | 1.0808 × 10^12 | 1.2096 × 10^10 | 8.5535 × 10^11 | 1.7613 × 10^10 | 1.2454 × 10^12 |
| 3 | 2.3823 × 10^3 | 1.3443 × 10^4 | 1.9391 × 10^3 | 1.2191 × 10^4 | 5.3442 × 10^4 | 2.9219 × 10^4 | 1.9383 × 10^3 | 1.1327 × 10^4 | 1.8372 × 10^3 | 1.4026 × 10^4 |
| 4 | 1.7351 | 3.6600 | 0.5260 | 5.2238 | 22.8982 | 11.5619 | 0.4520 | 5.0290 | 1.9694 | 12.1502 |
| 5 | 1.2183 × 10^6 | 1.5789 × 10^7 | 2.7240 × 10^6 | 2.5709 × 10^7 | 3.5593 × 10^6 | 2.6816 × 10^7 | 2.4950 × 10^6 | 2.1397 × 10^7 | 2.0059 × 10^6 | 2.0732 × 10^7 |
| 6 | 1.0927 × 10^3 | 6.6662 × 10^3 | 796.3752 | 5.1671 × 10^3 | 887.1449 | 6.1512 × 10^3 | 1.0961 × 10^3 | 6.8357 × 10^3 | 691.9697 | 5.6904 × 10^3 |
| 7 | 23.5823 | 32.2722 | 0.1980 | 2.9834 | 0.3679 | 5.8363 | 0.3163 | 5.5002 | 0.2879 | 5.5672 |
Table 6. Numerical results of multimodal benchmark functions.

| Problem | PSO $f_{min}$ | PSO $f_{max}$ | GWO $f_{min}$ | GWO $f_{max}$ | WOA $f_{min}$ | WOA $f_{max}$ | MGWO $f_{min}$ | MGWO $f_{max}$ | HAGWO $f_{min}$ | HAGWO $f_{max}$ |
|---|---|---|---|---|---|---|---|---|---|---|
| 8 | −6.5676 × 10^3 | −2.5481 × 10^3 | −6.1169 × 10^3 | −864.3118 | −8.7219 × 10^3 | −3.0712 × 10^3 | −5.0428 × 10^3 | −1.5535 × 10^3 | −9.0083 × 10^3 | −1.4592 × 10^3 |
| 9 | 46.7630 | 432.1257 | 0 | 457.9386 | 0 | 466.5042 | 0 | 425.4633 | 0 | 477.4335 |
| 10 | 5.5924 × 10^−6 | 20.9199 | 7.9936 × 10^−15 | 20.8427 | 4.4409 × 10^−15 | 20.9134 | 7.9936 × 10^−15 | 20.8774 | 4.4409 × 10^−15 | 20.9385 |
| 11 | 0.0123 | 690.0217 | 0 | 666.3022 | 0 | 623.4277 | 0.0083 | 705.5196 | 0 | 710.0255 |
| 12 | 2.7804 | 5.7234 × 10^8 | 4.3888 | 6.7091 × 10^8 | 0.5737 | 5.4619 × 10^8 | 5.5598 | 6.1162 × 10^8 | 0.1551 | 7.1939 × 10^8 |
| 13 | 3.0253 | 1.1558 × 10^9 | 2.0670 | 9.2722 × 10^8 | 1.5688 | 1.6628 × 10^9 | 3.0729 | 1.1356 × 10^9 | 1.2865 | 9.3192 × 10^8 |
Table 7. Statistical results of multimodal benchmark functions.

| Problem | PSO μ | PSO σ | GWO μ | GWO σ | WOA μ | WOA σ | MGWO μ | MGWO σ | HAGWO μ | HAGWO σ |
|---|---|---|---|---|---|---|---|---|---|---|
| 8 | −6.2997 × 10^3 | 857.6904 | −4.1179 × 10^3 | 1.0767 × 10^3 | −8.4186 × 10^3 | 378.1677 | −3.6282 × 10^3 | 761.7072 | −8.4892 × 10^3 | 622.2040 |
| 9 | 151.9952 | 129.9151 | 12.6542 | 45.1218 | 5.1226 | 37.8648 | 6.0697 | 34.7850 | 2.9663 | 29.5347 |
| 10 | 2.5215 | 3.2684 | 0.2513 | 1.8198 | 0.2607 | 1.7464 | 0.2415 | 1.7742 | 0.1296 | 1.3559 |
| 11 | 12.6332 | 70.7797 | 1.9879 | 25.2536 | 1.9234 | 27.8438 | 1.6198 | 23.8614 | 1.6759 | 26.8415 |
| 12 | 2.5010 × 10^7 | 1.0165 × 10^8 | 2.5134 × 10^7 | 1.0722 × 10^8 | 4.2413 × 10^7 | 1.1477 × 10^8 | 5.7916 × 10^7 | 1.5932 × 10^8 | 4.6413 × 10^7 | 1.3041 × 10^8 |
| 13 | 2.6298 × 10^7 | 1.4156 × 10^8 | 3.2950 × 10^7 | 1.5177 × 10^8 | 8.8731 × 10^7 | 2.9685 × 10^8 | 3.8630 × 10^7 | 1.6992 × 10^8 | 3.1642 × 10^7 | 1.3828 × 10^8 |
Table 8. Numerical results of fixed-dimension multimodal benchmark functions.

| Problem | PSO $f_{min}$ | PSO $f_{max}$ | GWO $f_{min}$ | GWO $f_{max}$ | WOA $f_{min}$ | WOA $f_{max}$ | MGWO $f_{min}$ | MGWO $f_{max}$ | HAGWO $f_{min}$ | HAGWO $f_{max}$ |
|---|---|---|---|---|---|---|---|---|---|---|
| 14 | 7.8740 | 412.9323 | 13.6186 | 22.0408 | 2.9822 | 210.6945 | 9.6391 | 18.7819 | 2.9821 | 493.6227 |
| 15 | 8.6440 × 10^−4 | 0.4044 | 4.6612 × 10^−4 | 0.0760 | 4.2691 × 10^−4 | 0.1871 | 4.4429 × 10^−4 | 0.0702 | 3.1020 × 10^−4 | 0.4770 |
| 16 | −1.0316 | 0.5530 | −1.0316 | −0.4506 | −1.0316 | 0.2993 | −1.0316 | 0.4158 | −1.0316 | 3.1477 |
| 17 | 0.3982 | 1.1732 | 0.4004 | 1.1432 | 0.4145 | 1.4559 | 0.3985 | 4.0587 | 1.4624 | 0.4635 |
| 18 | 3.0009 | 110.4515 | 3.0046 | 187.0153 | 3.0011 | 241.6666 | 3.0009 | 207.6478 | 3.0009 | 415.2426 |
| 19 | −3.8624 | −2.6790 | −3.8617 | −2.6833 | −3.7304 | −3.3467 | −3.8624 | −2.7356 | −3.8624 | −2.5019 |
| 20 | −3.2031 | −2.2384 | −3.3025 | −0.7864 | −2.9526 | −2.9526 | −2.8402 | −1.6544 | −3.3201 | −0.7698 |
| 21 | −2.6305 | −0.6444 | −10.1517 | −0.3554 | −5.0551 | −0.3800 | −10.1521 | −0.6119 | −10.1532 | −0.2680 |
| 22 | −10.4016 | −1.1161 | −10.4028 | −1.0080 | −5.0876 | −0.8341 | −10.4022 | −0.8415 | −10.4028 | −0.8049 |
| 23 | −5.1756 | −1.0843 | −10.5361 | −0.8808 | −9.9864 | −0.5751 | −10.5353 | −0.5825 | −10.5361 | −0.4436 |
Table 9. Statistical results of fixed-dimension multimodal benchmark functions.

| Problem | PSO μ | PSO σ | GWO μ | GWO σ | WOA μ | WOA σ | MGWO μ | MGWO σ | HAGWO μ | HAGWO σ |
|---|---|---|---|---|---|---|---|---|---|---|
| 14 | 20.1530 | 56.9403 | 13.9616 | 1.6660 | 7.8693 | 29.3410 | 9.4639 | 2.5297 | 13.4478 | 69.3887 |
| 15 | 0.0035 | 0.0373 | 0.0013 | 0.0054 | 0.0030 | 0.0142 | 0.0012 | 0.0047 | 0.0015 | 0.0214 |
| 16 | −0.9955 | 0.2106 | −1.0194 | 0.0817 | −1.0038 | 0.1513 | −0.9955 | 0.2106 | −0.9679 | 0.4297 |
| 17 | 0.5563 | 0.2268 | 0.5161 | 0.1831 | 0.7692 | 0.9017 | 0.6865 | 0.9173 | 0.4635 | 0.2174 |
| 18 | 30.5241 | 37.9585 | 7.6988 | 26.3454 | 8.8125 | 34.1104 | 7.8123 | 28.9425 | 12.1049 | 58.4134 |
| 19 | −3.7394 | 0.2649 | −3.8041 | 0.1681 | −3.6656 | 0.1125 | −3.7793 | 0.1772 | −3.7514 | 0.2717 |
| 20 | −2.9378 | 0.3013 | −3.0727 | 0.3977 | −2.8403 | 0.2221 | −2.7525 | 0.1956 | −2.8669 | 0.4184 |
| 21 | −2.5199 | 0.2835 | −7.0626 | 2.6720 | −5.0296 | 0.2538 | −7.6430 | 1.6995 | −8.5355 | 2.5679 |
| 22 | −9.5802 | 2.0466 | −7.8806 | 2.1526 | −5.0617 | 0.2709 | −9.0605 | 1.5746 | −9.9991 | 0.9721 |
| 23 | −4.8679 | 0.7136 | −7.6961 | 2.4874 | −7.7094 | 2.5030 | −8.7497 | 2.7833 | −8.7497 | 2.7833 |
Table 10. Numerical and Statistical solutions of Bio-Medical problems.

(i) Iris Dataset Problem

| Algorithm | Best Min Value | Best Max Value | Average | S.D. | Classification Rate |
|---|---|---|---|---|---|
| PSO | 0.6667 | 0.8418 | 0.6895 | 0.0336 | 37.22% |
| WOA | 0.7029 | 0.8572 | 0.7263 | 0.0374 | 89.31% |
| GWO | 0.6667 | 0.8756 | 0.6714 | 0.0249 | 91.333% |
| MGWO | 0.6667 | 0.8133 | 0.6789 | 0.0154 | 91.334% |
| HAGWO | 0.6668 | 0.8807 | 0.6728 | 0.0281 | 93.00% |

(ii) XOR Dataset Problem

| Algorithm | Best Min Value | Best Max Value | Average | S.D. | Classification Rate |
|---|---|---|---|---|---|
| PSO | 2.0621 × 10^−23 | 0.1771 | 0.0162 | 0.0481 | 37.50% |
| WOA | 0.0705 | 0.1523 | 0.0943 | 6.9504 | 98% |
| GWO | 8.2721 × 10^−6 | 0.1327 | 0.0156 | 0.0375 | 100% |
| MGWO | 5.0578 × 10^−5 | 0.2159 | 0.0348 | 0.0634 | 100% |
| HAGWO | 0.0427 | 0.2300 | 0.0029 | 0.0469 | 100% |

(iii) Baloon Dataset Problem

| Algorithm | Best Min Value | Best Max Value | Average | S.D. | Classification Rate |
|---|---|---|---|---|---|
| PSO | 5.6029 × 10^−28 | 0.1596 | 0.0161 | 0.0409 | 100% |
| WOA | 7.7005 × 10^−4 | 0.0313 | 0.0076 | 0.0080 | 100% |
| GWO | 3.5126 × 10^−17 | 0.1168 | 0.0064 | 0.0261 | 100% |
| MGWO | 2.2483 × 10^−15 | 0.0556 | 0.0071 | 0.0173 | 100% |
| HAGWO | 1.6372 × 10^−5 | 0.1798 | 0.0143 | 0.0438 | 100% |

(iv) Breast Cancer Dataset Problem

| Algorithm | Best Min Value | Best Max Value | Average | S.D. | Classification Rate |
|---|---|---|---|---|---|
| PSO | 0.0054 | 0.0441 | 0.0130 | 0.0070 | 14.00% |
| WOA | 0.0018 | 0.0416 | 0.0033 | 0.0043 | 97.21% |
| GWO | 0.0014 | 0.0464 | 0.0065 | 0.0093 | 99.00% |
| MGWO | 0.0017 | 0.0387 | 0.0066 | 0.0096 | 99.11% |
| HAGWO | 0.0013 | 0.0464 | 0.0026 | 0.0042 | 100% |
Table 11. Best optimal solutions of the welded beam design by metaheuristics.

| Algorithm | $h$ | $l$ | $t$ | $b$ | $f(Y)$ |
|---|---|---|---|---|---|
| GA | 0.208800 | 3.420500 | 8.997500 | 0.210000 | 1.748309 |
| GA | 0.205986 | 3.471328 | 9.020224 | 0.206480 | 1.728226 |
| GA | 0.2489 | 6.1730 | 8.1789 | 0.2533 | 2.4328 |
| UPSO | 0.2407 | 6.4851 | 8.2399 | 0.2497 | 2.4426 |
| ABC | 0.205730 | 3.470489 | 9.036624 | 0.205730 | 1.7248852 |
| CDE | 0.203137 | 3.542998 | 9.033498 | 0.206179 | 1.733462 |
| CPSO | 0.202369 | 3.544214 | 9.048210 | 0.205723 | 1.728024 |
| IHS | 0.20573 | 3.47049 | 9.03662 | 0.20573 | 1.7248 |
| MFO | 0.2057 | 3.4703 | 9.0364 | 0.2057 | 1.72452 |
| AFA | 0.205730 | 3.40489 | 9.036624 | 0.205730 | 1.724852 |
| CSS | 0.2058 | 3.4681 | 9.0380 | 0.2057 | 1.7249 |
| LSA-SM | 0.2057296 | 3.253120 | 9.036624 | 0.2057296 | 1.695247 |
| HAGWO | 0.2055235 | 3.201258 | 9.033258 | 0.2052125 | 1.661258 |
Table 12. Best optimal solutions of the pressure vessel design by metaheuristics.

| Algorithm | $x_1$ | $x_2$ | $x_3$ | $x_4$ | $f(Y)$ |
|---|---|---|---|---|---|
| GA | 0.812500 | 0.437500 | 40.323900 | 200.000000 | 6288.7445 |
| CPSO | 0.812500 | 0.437500 | 42.091266 | 176.746500 | 6061.0777 |
| GA | 0.812500 | 0.437500 | 42.097398 | 176.654050 | 6059.9463 |
| IHS | 0.75 | 0.375 | 38.86010 | 221.36553 | 5849.76169 |
| CDE | 0.812500 | 0.437500 | 42.098411 | 176.746500 | 6061.0777 |
| BA | 0.8125 | 0.4375 | 42.0984456 | 176.6365958 | 6059.7143348 |
| ABC | 0.812500 | 0.437500 | 42.098446 | 176.636596 | 6059.714339 |
| AFA | 0.8125 | 0.4375 | 42.09844611 | 176.6365894 | 6059.7142719 |
| CS | 0.8125 | 0.4375 | 42.0984456 | 176.6365958 | 6059.7143348 |
| ES | 0.8125 | 0.4375 | 42.098087 | 176.640518 | 6059.7456 |
| ACO | 0.8125 | 0.4375 | 42.103624 | 176.572656 | 6059.0888 |
| TLBO | NA | NA | NA | NA | 6059.714335 |
| MFO | 0.8125 | 0.4375 | 42.098445 | 176.636596 | 6059.7143 |
| LSA-SM | 0.8103764 | 0.4005695 | 41.98842 | 178.0048 | 5942.6966 |
| HAGWO | 0.8102456 | 0.4003526 | 41.78451 | 178.0012 | 5924.2536 |

## Share and Cite

MDPI and ACS Style

Singh, N.; Hachimi, H. A New Hybrid Whale Optimizer Algorithm with Mean Strategy of Grey Wolf Optimizer for Global Optimization. Math. Comput. Appl. 2018, 23, 14. https://doi.org/10.3390/mca23010014

AMA Style

Singh N, Hachimi H. A New Hybrid Whale Optimizer Algorithm with Mean Strategy of Grey Wolf Optimizer for Global Optimization. Mathematical and Computational Applications. 2018; 23(1):14. https://doi.org/10.3390/mca23010014

Chicago/Turabian Style

Singh, Narinder, and Hanaa Hachimi. 2018. "A New Hybrid Whale Optimizer Algorithm with Mean Strategy of Grey Wolf Optimizer for Global Optimization" Mathematical and Computational Applications 23, no. 1: 14. https://doi.org/10.3390/mca23010014