Recent Advances in Harris Hawks Optimization: A Comparative Study and Applications

Abstract: The Harris hawks optimizer is a recent population-based metaheuristic algorithm that simulates the hunting behavior of hawks. This swarm-based optimizer performs the optimization procedure using a novel way of exploration and exploitation and multiple phases of search. In this review, we focus on the applications and developments of the recent, well-established, and robust Harris hawks optimizer (HHO), one of the most popular swarm-based techniques of 2020. Moreover, several experiments were carried out to prove the power and effectiveness of HHO compared with nine other state-of-the-art algorithms on the Congress on Evolutionary Computation (CEC2005) and CEC2017 benchmarks. The review also provides deep insight into possible future directions and ideas worth investigating regarding new variants of the HHO algorithm and its widespread applications.


Introduction
The optimization area has witnessed a wide range of applications in understanding the solutions of many new problems due to the fast progress of industrial technologies and artificial intelligence [1][2][3][4][5]. Such rapid progress requires the development of new technologies for tackling hard and challenging problems in a reasonable time. Many proposals and research attempts have been introduced by experienced researchers, based on two classes: deterministic methods and stochastic methods [6][7][8][9]. The first class needs gradient information and details of the search space. The latter class does not need such information and can handle black-box optimization without knowing the mathematical details of the objective function, relying instead on sensing and probing the surface of the problem. One popular class of these stochastic optimizers comprises swarm-based and evolutionary methods [10][11][12][13]. Table 1 shows a list of metaheuristic algorithms belonging to this class and their algorithmic behavior.
Table 1. A list of some optimization algorithms based on their components.

Evolutionary
Breeding-based evolution: Genetic Algorithm (GA) [14], 1992
Breeding-based evolution: Genetic Programming (GP) [15], 1992
Influenced by representative solutions: Differential Evolution (DE) [16], 1997
Breeding-based evolution: Evolution Strategies (ES) [17], 2002
Mathematical
Arithmetic operators: Arithmetic Optimization Algorithm (AOA) [18], 2021

These population-based techniques start from a set of random initial guesses, and then they evolve and improve the initial solutions until convergence to a high-quality suboptimal or optimal solution [37][38][39][40][41]. The core operators of evolutionary techniques are crossover and mutation, which generate new genes from parent solutions based on ideas inspired by the evolution of nature. Swarm-based solvers may also use such operations, but they organize their initial swarms around the interaction of two fundamental cores: scattering the agents across the space and then intensifying them into specific high-potential areas [42][43][44]. Swarm-based approaches are developed and utilized in many areas of machine learning and artificial intelligence, drawing on the cooperative intelligence of self-organized and reorganized coordination, e.g., artificial clusters of randomly generated agents. This way of problem solving is well known in the swarm-intelligence community; yet, because of the delicate balance required between the exploration and exploitation steps, it remains a hard target to achieve [45][46][47].
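The crossover and mutation operators described above can be sketched in a few lines of Python; the real-valued encoding, single-point crossover, and Gaussian perturbation below are illustrative choices, not tied to any specific algorithm in Table 1:

```python
import random

def crossover(parent_a, parent_b):
    """Single-point crossover: children swap gene tails at a random cut point."""
    point = random.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

def mutate(genes, rate=0.1, scale=0.5):
    """Gaussian mutation: perturb each gene with probability `rate`."""
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genes]

child_a, child_b = crossover([1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0])
```

Together, the two children always carry exactly the parents' genes, while mutation injects the random variation that drives exploration.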
There are many ways to classify these methods; generally, researchers who benefit from a source of inspiration classify them by that source of inspiration [48][49][50]. A more scientific classification has been conducted based on the algorithmic behavior of these methods by [51][52][53]. However, this field suffers from some persistent problems. First, many population-based methods show neither high performance nor novelty in their mathematical rationale but only a new source of inspiration; these are widely questioned as metaphor-based methods. For instance, the well-known grey wolf optimizer (GWO) has a structural defect and shows uncertain performance on problems whose optimal solutions are not zero but near-zero points such as epsilon, as discovered by [54,55]. Moreover, the core equations of many new optimizers can be constructed from the particle swarm optimizer (PSO), differential evolution (DE), or other popular swarm-based methods [56]. This means that inspiration- and metaphor-based language has made it easy to develop unoriginal or "pseudonovel" solvers that show "pseudoefficient" efficacy. As the wisdom of new researchers grows, they increasingly reach the conclusion that "performance matters", not the source of inspiration. The Harris hawks optimizer (HHO) was an attempt to reach not only better performance but also low-cost and efficient operators within a new stochastic optimizer.
The Harris hawks optimization is a recently developed algorithm that simulates the Harris hawks' special hunting behavior known as the "seven kills". The HHO algorithm [33] has some unique features compared with other popular swarm-based optimization methods. The first is that this optimizer utilizes a time-varying rule that evolves over the iterations of the method during exploration and exploitation. This way of shifting between exploratory and exploitative propensities keeps the optimizer flexible when it faces an undesirable difficulty in the feature space. Another advantage is that the algorithm has a progressive trend during the convergence process when shifting from the initial diversification/exploration phase to the intensification/exploitation core. The quality of the results of HHO is relatively high compared with other popular methods, and this feature has also supported the widespread application of this solver. Moreover, the exploitation phase of this method is compelling: it greedily keeps the best solutions explored so far and ignores the low-quality solutions obtained up to that iteration. We review different applications of this optimizer in the next parts of this survey to see how its features make it a fitting choice for real-world cases.
Like all metaheuristic algorithms, HHO has many advantages and a smaller number of disadvantages. Its advantages can be listed as follows:
• Good convergence speed.
• Powerful neighborhood search characteristics.
• Good balance between exploration and exploitation.
• Suitable for many kinds of problems.
• Easy to implement.
• Adaptability, scalability, flexibility, and robustness.
The disadvantages of HHO, as with all other algorithms, are that it may get stuck in local optima and that it lacks a theoretical convergence analysis framework.
This paper is organized as follows. Section 2 presents the original procedure of the studied HHO. Section 3 presents the variants of the HHO. Section 4 presents the applications of the HHO. Section 5 presents a brief discussion of the HHO and its advantages. Finally, the conclusion and future works are given in Section 6.

Review Methodology
The aim of this study is to present a comprehensive review of all HHO aspects and of how researchers and scholars have been encouraged and motivated to use it in various disciplines. Since this algorithm was proposed, it has received huge attention from scholars all over the world. According to Google Scholar (https://scholar.google.com/scholar?cites=16912266037349375725&as_sdt=2005&sciodt=0,5&hl=en (accessed on 31 December 2021)), it has been cited more than 1231 times. Moreover, the original study has been selected as a hot paper and one of the most highly ranked studies in both Scopus (https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85063421586&origin=recordpage (accessed on 31 December 2021)) and Web of Science (https://www.webofscience.com/wos/woscc/full-record/WOS:000469154500064 (accessed on 31 December 2021)). Firstly, a preliminary study was conducted; after that, we searched for studies that referred to the original HHO paper in order to build a list of keywords for the subsequent search. Secondly, skimming and scanning methods were used to select relevant studies. Thirdly, a checking and screening process was performed to extract data. Finally, we sorted the data and classified the ideas (papers).

Harris Hawks Optimization (HHO)
The Harris hawks optimization is a recently introduced population-based optimization method suggested by Heidari et al. [33], in which the authors tried to simulate the Harris hawks' cooperative behavior in nature. The hawks surprise their prey using several techniques (tracing, encircling, approaching, and attacking). The HHO pseudocode can be found in Algorithm 1. The authors simulated the hawks' behavior in three steps.
The logical diagram of the HHO is given in Figure 1.
(i) The first step (exploration phase) can be formulated as follows:

X(t + 1) = X_rand(t) − r_1 |X_rand(t) − 2 r_2 X(t)|,                q ≥ 0.5
X(t + 1) = (X_rabbit(t) − X_m(t)) − r_3 (LB + r_4 (UB − LB)),       q < 0.5        (1)

where X(t) and X(t + 1) refer to the hawk's location in the current and next iterations, respectively; r_1, r_2, r_3, r_4, and q are stochastic numbers in the interval [0, 1]; LB and UB are the lower and upper bounds of the variables; X_rabbit(t) and X_rand(t) refer to the rabbit's location (best position) and a randomly selected hawk's location, respectively; and X_m(t) is the mean position, calculated from Equation (2):

X_m(t) = (1/N) Σ_{i=1}^{N} X_i(t)        (2)

where X_i(t) refers to the position of each hawk and N is the number of hawks.
(ii) Transition from exploration to exploitation. The escaping energy of the prey is modeled as

E = 2 E_0 (1 − t/T)        (3)

where E_0 ∈ (−1, 1) is the initial energy, E is the escaping energy, and T is the maximum number of iterations. If |E| ≥ 1, exploration occurs; otherwise, exploitation happens.
(iii) The exploitation step can be identified in four scenarios, selected by |E| and a random number r ∈ [0, 1]:
Soft besiege: occurs if r ≥ 0.5 and |E| ≥ 0.5 and is obtained from the following equations:

X(t + 1) = ΔX(t) − E |J X_rabbit(t) − X(t)|        (4)
ΔX(t) = X_rabbit(t) − X(t)        (5)

where ΔX(t) refers to the difference between the location of the prey and the current location of the hawk, and J = 2(1 − r_5) is the random jump strength of the rabbit, with r_5 a random number in [0, 1].
Hard besiege: occurs if r ≥ 0.5 and |E| < 0.5. This scenario is formulated as:

X(t + 1) = X_rabbit(t) − E |ΔX(t)|        (6)

Soft besiege with progressive rapid dives: occurs if r < 0.5 and |E| ≥ 0.5. The hawks first evaluate a soft-besiege move Y and, if it does not improve, a Levy-flight-based dive Z:

Y = X_rabbit(t) − E |J X_rabbit(t) − X(t)|        (7)
Z = Y + S × LF(D)        (8)

where S is a random vector of dimension D, D refers to the dimension, and LF is the Levy flight, which can be obtained from the following equation:

LF(x) = 0.01 × (u × σ) / |v|^(1/β),   σ = [Γ(1 + β) sin(πβ/2) / (Γ((1 + β)/2) β 2^((β−1)/2))]^(1/β)        (9)

where β is fixed and equal to 1.5, and u and v refer to random numbers ∈ (0, 1). The final position update is given as:

X(t + 1) = Y if F(Y) < F(X(t)), otherwise Z if F(Z) < F(X(t))        (10)

Hard besiege with progressive rapid dives: occurs if r < 0.5 and |E| < 0.5. This behavior follows the same greedy rule as Equation (10), with Y and Z calculated from the following equations:

Y = X_rabbit(t) − E |J X_rabbit(t) − X_m(t)|        (11)
Z = Y + S × LF(D)        (12)
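Putting the phases above together, the algorithm can be sketched compactly in Python. This is a simplified illustration rather than the authors' reference implementation: the sphere objective used at the bottom, the boundary clamping, and the use of fresh random numbers per dimension are our own assumptions:

```python
import math
import random

def levy(dim, beta=1.5):
    """Levy flight step (Eq. (9)): heavy-tailed random step lengths."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return [0.01 * random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)
            for _ in range(dim)]

def hho(f, lb, ub, dim, n_hawks=30, max_iter=500):
    """Compact HHO sketch following the exploration/exploitation phases above."""
    X = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n_hawks)]
    rabbit, rabbit_f = None, float("inf")
    for t in range(max_iter):
        for x in X:                                   # track the rabbit (best so far)
            fx = f(x)
            if fx < rabbit_f:
                rabbit, rabbit_f = x[:], fx
        xm = [sum(x[j] for x in X) / n_hawks for j in range(dim)]   # mean position
        for i, x in enumerate(X):
            E = 2 * random.uniform(-1, 1) * (1 - t / max_iter)      # escaping energy
            if abs(E) >= 1:                                         # exploration
                if random.random() >= 0.5:
                    xr = random.choice(X)
                    x_new = [xr[j] - random.random() * abs(xr[j] - 2 * random.random() * x[j])
                             for j in range(dim)]
                else:
                    x_new = [(rabbit[j] - xm[j]) - random.random() * (lb + random.random() * (ub - lb))
                             for j in range(dim)]
            else:                                                   # exploitation
                r, J = random.random(), 2 * (1 - random.random())
                if r >= 0.5 and abs(E) >= 0.5:                      # soft besiege
                    x_new = [(rabbit[j] - x[j]) - E * abs(J * rabbit[j] - x[j])
                             for j in range(dim)]
                elif r >= 0.5:                                      # hard besiege
                    x_new = [rabbit[j] - E * abs(rabbit[j] - x[j]) for j in range(dim)]
                else:                                               # progressive rapid dives
                    base = x if abs(E) >= 0.5 else xm
                    Y = [rabbit[j] - E * abs(J * rabbit[j] - base[j]) for j in range(dim)]
                    if f(Y) < f(x):
                        x_new = Y
                    else:
                        step = levy(dim)
                        Z = [Y[j] + random.random() * step[j] for j in range(dim)]
                        x_new = Z if f(Z) < f(x) else x
            X[i] = [min(max(v, lb), ub) for v in x_new]             # keep inside bounds
    return rabbit, rabbit_f

best, best_f = hho(lambda v: sum(c * c for c in v), -10.0, 10.0, dim=5)
```

Note how the greedy acceptance in the rapid-dive branches implements the "pick the best solution explored so far" behavior highlighted in the introduction.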

Parameter Setting
In all these experiments, we used the same parameter settings, which are given in Table 2: the number of dimensions, the number of individuals, and the maximum number of iterations. The experiments were carried out using Matlab 2021b on an Intel(R) Core(TM) i5-5200 machine with 6 GB of RAM. The settings of the compared algorithms for both CEC2005 and CEC2017 can be found in Table 3. CEC 2005 is a classical benchmark suite that contains many types of mathematical functions (unimodal, multimodal, hybrid, and composite). Here, we compared HHO with the Genetic Algorithm (GA) [104], Covariance Matrix Adaptation Evolution Strategy (CMAES) [105], Linear Population Size Reduction Success-History Adaptation for Differential Evolution (L-SHADE) [106], Ensemble Sinusoidal incorporated with L-SHADE (LSHADE-EpSin) [107], Sine Cosine Algorithm (SCA) [108], Grasshopper Optimization Algorithm (GOA) [73], Whale Optimization Algorithm (WOA) [48], Thermal Exchange Optimization (TEO) [88], and Artificial Ecosystem Optimization (AEO) [109].
Table 4 shows the results of these algorithms. It is easy to notice that the HHO has good performance compared with the others. Moreover, Figures 2 and 3 show the convergence curves of these algorithms, whereas Figures 4 and 5 show their boxplots. The Wilcoxon rank-sum (WRS) test was carried out at the 5% significance level between the HHO and the other algorithms to prove its superiority. The results of the Wilcoxon test can be found in Table 5.
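The Wilcoxon rank-sum comparison used above can be reproduced with a short self-contained routine; the two samples below are synthetic placeholders standing in for per-run final fitness values, not the actual experimental data:

```python
import math

def rank_sum_test(a, b):
    """Wilcoxon rank-sum test via the normal approximation (no tie handling).
    Returns the z statistic and the two-sided p-value."""
    pooled = sorted(a + b)
    ranks_a = [pooled.index(v) + 1 for v in a]          # ranks of sample a
    n1, n2 = len(a), len(b)
    w = sum(ranks_a)                                    # rank-sum statistic
    mean = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Synthetic final-fitness samples over 10 runs (placeholders, not real data):
hho_runs   = [0.008, 0.009, 0.010, 0.011, 0.012, 0.013, 0.014, 0.015, 0.016, 0.017]
other_runs = [0.210, 0.180, 0.240, 0.190, 0.220, 0.205, 0.230, 0.195, 0.215, 0.225]
z, p = rank_sum_test(hho_runs, other_runs)
```

A p-value below 0.05 indicates that the difference between the two result samples is statistically significant at the 5% level.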
The results of this comparison are shown in Table 6 in terms of the average and standard deviation of the fitness values. From this table, it is clear that the HHO has good performance. Moreover, Figures 6-8 show convergence curves of the HHO compared with the abovementioned algorithms, from which we conclude that the HHO has good convergence speed. Moreover, Figures 9-11 show the boxplots of the HHO compared with the other algorithms.

Enhancements to HHO
In the literature, many studies have enhanced the HHO using various mathematical operators. These studies are summarized in Table 7. The authors of [121] tried to speed up HHO convergence by introducing an enhanced HHO version using two strategies: (1) enhancing HHO exploration by using opposition-based learning and a logarithmic spiral and (2) fusing the modified Rosenbrock method (RM) with the HHO to improve convergence accuracy and enhance its local search capability. The authors tested their algorithm, called RLHHO, using 30 IEEE CEC 2014 functions and 23 traditional benchmark functions. They compared their results with eight standard metaheuristic algorithms and six advanced ones. In [122], Ariui et al. developed an enhanced version of the HHO by hybridizing the JOS operator with the HHO. JOS consists of two operators: Dynamic Opposite (DO) and Selective Leading Opposition (SLO). The authors assumed that JOS increases the HHO's exploration capabilities.

Binary HHO
Many binary variants of the HHO have been introduced. For example, Too et al. [110] introduced a binary version of the HHO called BHHO using two transfer functions: S-shaped and V-shaped. Furthermore, they proposed another version called Quadratic Binary HHO (QBHHO). They compared their versions with several other metaheuristics, namely binary differential evolution, the binary flower pollination algorithm, the genetic algorithm, and the binary salp swarm algorithm. Similar work was conducted by Thaher et al. [123], in which the authors applied their algorithm to feature selection. Moreover, Thaher and Arman [124] developed another version of the HHO called enhanced binary HHO (EBHHO) using a multiswarm technique, in which the population is divided into three groups. Each group has a leader, and the fittest leader is able to guide more agents. Furthermore, they used three classifiers: K-nearest neighbors (KNN), Linear Discriminant Analysis (LDA), and Decision Tree (DT). Likewise, in [125], Chellal and Benmessahed developed a binary version of the HHO for accurate detection of protein complexes. Dokeroglu et al. [126] developed a binary multiobjective HHO to solve classification problems. They introduced novel discrete besiege (exploitation) and perching (exploration) operators. They used four machine learning techniques, Support Vector Machines (SVM), Decision Trees, Logistic Regression, and Extreme Learning Machines, on a COVID-19 dataset. Moreover, Chantar et al. [127] developed a binary HHO version using a time-varying scheme. The new version, called BHHO-TVS, is used to solve classification problems. They used 18 different datasets to demonstrate the significant performance of BHHO-TVS.
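A binary HHO keeps the continuous update rules but maps each hawk's position to a bit string through a transfer function. The sigmoid and tanh forms below are the standard S-shaped and V-shaped families; the exact variants used in [110] may differ:

```python
import math
import random

def s_shaped(x):
    """S-shaped (sigmoid) transfer: probability that the bit becomes 1."""
    return 1.0 / (1.0 + math.exp(-x))

def v_shaped(x):
    """V-shaped transfer: probability that the current bit is flipped."""
    return abs(math.tanh(x))

def binarize_s(position):
    """Map a continuous hawk position to a bit string (S-shaped rule)."""
    return [1 if random.random() < s_shaped(v) else 0 for v in position]

def binarize_v(position, bits):
    """Flip the hawk's existing bits with V-shaped probability."""
    return [1 - b if random.random() < v_shaped(v) else b
            for v, b in zip(position, bits)]
```

The S-shaped rule resamples bits from scratch, while the V-shaped rule preserves the current bit string and only flips positions where the continuous update is large.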

Opposite HHO
Hans et al. [111] proposed a new version of the HHO called OHHO, based on the opposition-based learning (OBL) strategy. They applied OHHO to feature selection for breast cancer classification. Fan et al. [113] proposed a novel HHO version called NCOHHO, which improves the HHO through two mechanisms: the neighborhood centroid and opposition-based learning. In NCOHHO, the neighborhood centroid is taken as a reference point when generating the opposite particle. Likewise, Gupta et al. [128] applied four strategies to the HHO, namely a new nonlinear parameter for the prey energy, greedy selection, different rapid dives, and OBL. For other studies that used optimization methods to solve parameter extraction problems, refer to [129][130][131].
Jiao et al. [112] introduced a novel HHO variant called EHHO, which employs OBL and Orthogonal Learning (OL).
Likewise, another improved HHO approach, called IHHO, was proposed by Song et al. [114], in which two techniques were employed: (1) quasi-oppositional learning and (2) chaos theory.
Amer et al. [132] developed another version of the HHO called Elite Learning HHO (ELHHO). They used elite opposition-based learning to improve the quality of the exploration phase.
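The common core of all these variants is the opposition operator itself, which mirrors a candidate across the center of the search box so that each agent and its opposite can be compared. A minimal sketch, with a sphere objective as an illustrative assumption, is:

```python
import random

def opposite(x, lb, ub):
    """Opposition-based learning: mirror a solution within [lb, ub]."""
    return [lb + ub - v for v in x]

def obl_init(f, lb, ub, dim, n):
    """Generate n random agents and keep the better of each agent and its
    opposite, which tends to improve the quality of the starting population."""
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    return [min(x, opposite(x, lb, ub), key=f) for x in pop]

sphere = lambda v: sum(c * c for c in v)
pop = obl_init(sphere, -5.0, 5.0, dim=4, n=20)
```

By construction, every kept agent is at least as good as its mirror image, which is the property the OBL-based HHO variants exploit.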

Modified HHO
Akdag et al. [133] developed a modified HHO version using seven different random distribution functions in order to show how stochastic search affects HHO performance. Moreover, Zhang et al. [134] tried to balance the exploration and exploitation phases by focusing on the prey's escaping energy (E). They introduced six different strategies to update E.
Yousri et al. [135] proposed an improved version of the HHO by modifying the exploration strategy to use Levy flight instead of depending on the prey's position. Moreover, they updated the hawks' positions based on three randomly chosen hawks instead of the random update technique. Likewise, Zhao et al. [136] developed another modified version of the HHO using a chaotic method called the Singer mechanism. They also used the Levy flight mechanism to enhance HHO convergence.
Hussain et al. [115] developed a long-term memory HHO algorithm (LMHHO), which shares information about multiple promising regions. With such information, the HHO gains more exploration capability. The authors used CEC 2017 and classical functions to validate their algorithm.
Liu et al. [137] proposed an improved HHO algorithm that combines the Nelder-Mead simplex algorithm with the crisscross optimization crossover technique, known as the horizontal and vertical crossover mechanism. The authors applied their algorithm, called CCNMHHO, to photovoltaic parameter estimation.
Rizk-Allah and Hassanien [138] developed a hybrid HHO algorithm which combines the Nelder-Mead method with the Harris algorithm. They validated their algorithm using six differential equations and four engineering differential equations.
Youssri et al. [139] developed a modified version of the HHO, called Fractional-Order Modified HHO (FMHHO), which uses the fractional calculus (FOC) memory concept. They tested FMHHO using 23 benchmark functions in addition to the IEEE CEC 2017 ones and applied it to proton exchange membrane fuel cell modeling. Moreover, Irfan et al. [140] proposed a modified HHO (MHHO) using crowding distance and roulette wheel selection. They tested it using the IEEE 8- and 15-bus systems.
Ge et al. [141] suggested another improved HHO version using the predator-rabbit distributance method.
Singh et al. [142] used opposition-based learning (OBL) to enhance the HHO. They applied the novel algorithm, called OHHO, to data clustering and tested it using 10 benchmark datasets.

Improved HHO
Kardani et al. [143] introduced another improved version of the HHO combined with an extreme learning machine (ELM). Their novel algorithm, called ELM-IHHO, tries to overcome the limitations of the HHO by using a mutation mechanism. The authors applied ELM-IHHO to the prediction of light carbon permeability. They compared their results with ELM-based algorithms such as PSO, GA, and SMA. Moreover, Guo et al. [144] used the random unscented sigma mutation strategy to improve the HHO. Their novel HHO version uses quasi-reflection learning and quasi-opposite learning strategies to enhance generation diversity. They also implemented a logarithmic nonlinear convergence factor to achieve a good balance between local and global searches.
Liu [145] developed an improved version of the HHO (IHHO), in which a new search process is added to improve candidate solution quality. They applied it to the job-shop scheduling problem. Duan and Liu [146] introduced the golden sine strategy to improve the HHO. The novel algorithm can enhance population diversity and improve performance.
Hu et al. [147] developed another variant of the HHO, also called IHHO, using two techniques: (1) adding velocity to the HHO, borrowed from particle swarm optimization, and (2) using the crossover scheme of the Artificial Tree algorithm [148]. They used 23 functions to test the IHHO and compared the results with 11 metaheuristic algorithms. They applied it to stock market prediction. Moreover, Selim et al. [149] tried to enhance the HHO by returning hawks to the rabbit's position instead of returning them to the variables' maximum and minimum limits. They also developed a multiobjective version. Moreover, a novel search mechanism was proposed by Sihwail et al. [150] to improve HHO performance using mutation neighborhood search and rollback techniques.
Another enhanced HHO version was proposed in [116], in which the authors enhanced the HHO using three techniques: (1) chaos theory, (2) a multipopulation topological structure, and (3) the differential evolution operators of mutation and crossover. The authors applied their algorithm, known as CMDHHO, to image applications.
In order to strike the right balance between exploitation and exploration in the HHO, Song et al. [151] developed an enhanced HHO version called GCHHO, in which two techniques were employed, namely Gaussian mutation and the Cuckoo Search dimension decision technique. To test GCHHO, CEC 2017 functions were used in addition to three engineering problems. The authors compared GCHHO with classical HHO, WOA, MFO, BA, SCA, FA, and PSO.
Moreover, Yin et al. [152] tried to prevent the HHO from falling into local optima by developing an improved version of the HHO called NOL-HHO, which uses a nonlinear control parameter and a random opposition-based learning strategy.
Ridha et al. [153] developed a boosted HHO (BHHO) algorithm which employs the random exploration strategy of the Flower Pollination Algorithm (FPA) and the mutation strategy of differential evolution. Wei et al. [120] developed another improved HHO approach that uses Gaussian bare-bone (GB) sampling. They tested their algorithm, called GBHHO, using the CEC 2014 problems. Zhang et al. [154] used an adaptive cooperative foraging technique and a dispersed foraging strategy to improve the HHO. Their algorithm, known as ADHHO, was tested using the CEC 2014 benchmark functions.
A vibrational HHO (VHHO) was proposed by Shao et al. [119] to prevent the HHO particles from converging around local optima by embedding SVM into the HHO and using frequent mutation. VHHO was compared with SCA, PSO, and classical HHO.

Chaotic HHO
Many chaotic HHO variants have been proposed. Menesy et al. [155] proposed a chaotic HHO algorithm (CHHO) using ten chaotic functions. They compared their algorithm with conventional HHO, GWO, CS-EO, and SSO. The authors claimed that the experimental results show the superiority of CHHO over the other algorithms. Likewise, Chen et al. [156] developed a new version termed EHHO, in which a chaotic local search method is used in addition to OBL techniques. Statistical results show that EHHO achieved better results than the other competitors. The authors applied EHHO to identifying photovoltaic cell parameters. Moreover, Gao et al. [157] used the tent map with the HHO.
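As an illustration of how these variants replace uniform random draws, the tent map used in [157] can be iterated as follows (the seed value is arbitrary):

```python
def tent_map(x, mu=2.0):
    """Tent map: a piecewise-linear chaotic map on [0, 1]."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def chaotic_sequence(seed=0.37, n=20):
    """Iterate the tent map to produce a chaotic number sequence,
    usable in place of uniform random draws for initialization."""
    seq, x = [], seed
    for _ in range(n):
        x = tent_map(x)
        seq.append(x)
    return seq
```

The resulting sequence stays in [0, 1] but is deterministic and covers the interval more evenly than short pseudo-random runs, which is the usual motivation for chaotic initialization.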
Basha et al. [159] developed a chaotic HHO variant using quasi-reflection learning. They applied it to enhance CNN design for classifying different brain tumor grades in magnetic resonance imaging. The authors tested their model using 10 benchmark functions and then used two datasets. Likewise, Hussien and Amin [160] developed a novel HHO version based on chaotic local search, opposition-based learning, and a self-adaptation mechanism. They evaluated their model using IEEE CEC 2017 and applied the novel algorithm, called m-HHO, to feature selection.
Dehkordi et al. [161] introduced a nonlinear chaotic HHO algorithm. The new algorithm, known as NCHHO, was applied to solve Internet of Vehicles (IoV) optimization problems.

Dynamic HHO with Mutation Mechanism
Jia et al. [118] developed a dynamic HHO algorithm with a mutation strategy. Instead of decreasing the escaping-energy parameter E from 2 to 0, they introduced a dynamic control parameter to avoid local optima. The authors claimed that this dynamic control prevents solutions from getting stuck in local optima, since in the original algorithm E cannot take values greater than 1 in the second half of the iterations.
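The limitation that motivates this dynamic control is easy to verify numerically from the original schedule E = 2E_0(1 − t/T) with E_0 ∈ (−1, 1):

```python
def escaping_energy(t, T, E0):
    """Original HHO escaping-energy schedule: E = 2*E0*(1 - t/T)."""
    return 2 * E0 * (1 - t / T)

# In the second half of the run (t > T/2), |E| is always below 1, so the
# original algorithm can no longer switch back to the exploration phase:
T = 100
late_values = [abs(escaping_energy(t, T, E0))
               for t in range(T // 2 + 1, T + 1)
               for E0 in (-0.999, -0.5, 0.5, 0.999)]
```

Since |E_0| < 1 and the decay factor drops below 0.5 after the midpoint, |E| = 2|E_0|(1 − t/T) < 1 for every hawk, which is exactly the behavior the dynamic variant of [118] is designed to avoid.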

Other HHO Variants
An adaptive HHO technique, called AHHO, has been proposed by Wunnava et al. [117]. The authors used a mutation strategy to force the escaping energy into the interval [0, 2]. The AHHO was tested using 23 classical functions and 30 functions obtained from CEC 2014.
Jiao et al. [162] proposed a multistrategy search HHO using the Least Squares Support Vector Machine (LSSVM). They used the Gauss chaotic method for initialization. They also employed a neighborhood perturbation mechanism, a variable spiral search strategy, and adaptive weights. They used 23 functions in addition to the CEC 2017 suite of test functions to prove the effectiveness of their algorithm.
Zhong and Li [163] developed a hybrid algorithm called the Comprehensive Learning Harris Hawks Equilibrium Optimizer (CLHHEO). The authors used three operators: the equilibrium optimizer operator, comprehensive learning, and a terminal replacement mechanism. They used EO operators to enhance the HHO's exploration capacity. Comprehensive learning was employed in the CLHHEO to share agent knowledge in order to enhance convergence capacity. They used the terminal replacement technique to prevent the CLHHEO from falling into local stagnation. The authors compared the CLHHEO with CLPSO, PSO, GWO, BBBC, WOA, DA, SSA, AOA, SOA, and classical HHO using 15 benchmark functions and 10 constrained problems.
Abd Elaziz et al. [164] developed a multileader HHO based on Differential Evolution (MLHHDE). They introduced a memory structure that lets hawks learn from the global best positions and the best historical ones. DE is employed to strengthen the exploration phase. They used the CEC 2017 benchmark functions.
Bujok [165] developed an advanced HHO algorithm that archives old solutions. He compared his algorithm using 22 real-world problems from CEC 2011.
Al-Batar et al. [166] enhanced the exploration phase of the original HHO by introducing the survival-of-the-fittest evolutionary principle, which helps smooth the transition from exploration to exploitation. The authors used three strategies, namely the proportional, linear rank-based, and tournament selection methods. They tested their algorithms using 23 mathematical functions and 3 constrained problems.
Qu et al. [167] tried to improve the HHO by employing Variable Neighborhood Learning (VNL), which is used to balance exploration and exploitation. They also used the F-score to narrow down the selection range and used mutation to increase diversity. Nandi and Kamboj [168] combined the grey wolf optimizer (inspired by canis lupus) with the HHO. The novel algorithm, called hHHO-GWO, was tested on CEC 2005, CEC-BC-2017, and 11 different engineering problems.
Gölcük and Ozsoydan [169] developed a hybrid algorithm which combines Teaching-Learning-Based Optimization (TLBO) with the HHO. The new algorithm (ITL-HHO) is designed to strike a proper balance between exploitation and exploration. The authors tested it using 33 benchmark functions (CEC 2006 and CEC 2009) and 10 multidisciplinary engineering problems.
Yu et al. [170] proposed a modified HHO called compact HHO (cHHO), in which only one hawk is used to search for the optimal solution instead of many hawks. ElSayed and Elattar [187] developed a hybrid algorithm which combines the HHO with Sequential Quadratic Programming (HHO-SQP) in order to obtain optimal coordination of overcurrent relays in networks incorporating distributed generation.
Kaveh et al. [188] developed a hybrid algorithm called Imperialist Competitive HHO (ICHHO), which uses the Imperialist Competitive Algorithm (ICA) [189] to improve the HHO's exploration performance. They tested their model using 23 mathematical functions and many common engineering problems.
Sihwail et al. [190] developed a hybrid algorithm called the Newton-Harris Hawks Optimization (NHHO), in which Newton's second-order technique is used to refine solutions when solving systems of nonlinear equations.
Another hybrid algorithm combines Differential Evolution (DE) with the HHO and the Gaining-Sharing Knowledge algorithm (GSK) [191]. The new algorithm, abbreviated DEGH, was developed using the "rand/1" DE operator and the two-phase strategy of GSK. The authors also used a self-adaptive crossover probability to strengthen the relationship between selection, mutation, and crossover. They used 32 benchmark functions and compared DEGH with 8 state-of-the-art algorithms.
Another hybrid version of DE and the HHO was proposed by Abualigah et al. [192], where DE is used to enhance the HHO's exploitation. The novel hybrid algorithm, known as H-HHO, was used to obtain the optimal number of clusters for each dataset.
Azar et al. [193] developed a prediction model using the Least Squares Support Vector Machine (LS-SVM) and the Adaptive Neuro-Fuzzy Inference System (ANFIS). ANFIS was optimized by the HHO (ANFIS-HHO). The authors argued that the HHO increases the ANFIS prediction performance.
Firouzi et al. [194] developed a hybrid algorithm which combines the HHO and Nelder-Mead optimization. They used it to determine the depth and location of a microsystem crack. They compared their algorithm with classical HHO, GOA, WOA, and DA. Lie et al. [195] developed a hybrid algorithm that combines the Fireworks Algorithm (FWA) [98] with the HHO based on a dynamic competition mechanism. The authors used CEC 2005 to verify the power of their algorithm, called DCFW-HHO. They compared it with MPA, WOA, WCA, LSA, FWA, and HHO.
Ahmed [197] developed a hybrid algorithm combining the HHO and the homotopy analysis method (HAM) to solve partial differential equations.
Yuan et al. [198] developed a hybrid HHO algorithm that combines Harris hawks with the instinctive reaction strategy (IRS). The novel algorithm (IRSHHO) was tested using five mathematical functions.
Setiawan et al. [199] developed a hybrid algorithm called HHO-SVR, in which SVR is optimized using the HHO.

Multiobjective HHO
Hossain et al. [200] introduced a multiobjective version of the HHO based on a 2-hop routing mechanism. The novel algorithm was applied to cognitive radio vehicular ad hoc networks. Dabba et al. [201] introduced another multiobjective binary algorithm based on the HHO. The novel algorithm, MOBHHO, was applied to gene selection from microarray data using two fitness functions: SVM and KNN. Another binary multiobjective version was proposed in [126].
Du et al. [203] proposed a multiobjective HHO called MOHHO to tune extreme learning machine (ELM) parameters and applied it to air pollution prediction and forecasting. The authors evaluated their model, MOHHO-ELM, using 12 air pollutant concentration series recorded in 3 cities. Moreover, Islam et al. [204] tried to solve the multiobjective optimal power flow problem. Likewise, an improved multiobjective version of the HHO, termed MOIHHO, was introduced by Selim et al. [149]. The authors applied it to find the optimal DG size and location.
Fu and Lu [205] developed an improved version of the HHO called HMOHHO, in which hybrid techniques are integrated with the HHO, including Latin hypercube sampling initialization, a mutation technique, and a modified differential evolution operator. The authors used the UF and ZDT test suites to validate their algorithm.
Piri and Mohapatra [206] presented a multiobjective quadratic binary HHO (MOQBHHO) approach using KNN. The authors used the crowding distance (CD) to pick the best solutions from the nondominated ones. They compared MOQBHHO with MOBHHO-S, MOGA, MOALO, and NSGA-II.
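The crowding distance mentioned here follows the standard NSGA-II formulation: for each objective, the solutions on a front are sorted and each interior solution is credited with the normalized gap between its two neighbors, while boundary solutions get infinite distance. As a hedged illustration (not the authors' exact code), it can be sketched as:

```python
def crowding_distance(front):
    """NSGA-II-style crowding distance for a list of objective vectors."""
    n = len(front)
    m = len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        # sort solution indices by this objective's value
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        # boundary solutions are always kept (infinite distance)
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for k in range(1, n - 1):
            # normalized gap between the two neighbors along this objective
            dist[order[k]] += (front[order[k + 1]][obj]
                               - front[order[k - 1]][obj]) / (hi - lo)
    return dist
```

Solutions with larger crowding distance lie in sparser regions of the front and are preferred as "best" picks among nondominated candidates.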

HHO Applications
The HHO has been successfully applied to many applications, as shown in Figure 12 and Table 9. Recently, demand for electrical energy has been increasing rapidly [215,216]. To face this challenge, interconnected networks combining different power systems have emerged [217].
Optimal Power Flow (OPF) can be considered an optimization problem. Hussain et al. [115] applied their algorithm, called LMHHO, to solve the OPF problem and claimed that it achieved better results than the classical HHO. Another attempt to solve OPF was made by Islam et al. [218], in which the HHO was compared with ALO, WOA, SSA, MFO, and the Glowworm algorithm. The same authors solved the same problem in [204] with consideration of environmental emissions. Likewise, a modified HHO introduced in [133] has been applied to solve OPF.
Akdag et al. [133] developed a modified HHO version using seven random distribution functions, namely the normal, F, Rayleigh, chi-square, exponential, Student's t, and lognormal distributions. The authors applied their algorithm to the OPF problem on the IEEE 30-bus test system.
Paital et al. [219] tuned an interval type-2 fuzzy lead-lag controller using the HHO. The novel approach (Dual-IT2FL) was applied to enhance stability in unified power flow controllers (UPFC).
Shekarappa et al. [207] used a hybrid algorithm combining the HHO and PSO, called HHOPSO, to solve a reactive power planning problem; HHOPSO was tested on the IEEE 57-bus system. Mohanty and Panda [220] adapted the HHO using the Sine Cosine Algorithm. Their developed algorithm, called ScaHHO, was utilized to tune an adaptive fuzzy proportional-integral-derivative (AFPID) controller for hybrid power system frequency control.

Distributed Generation
Abdel Aleem et al. [221] addressed the optimal design of a C-type resonance-free harmonic filter in a distribution system using the HHO. The authors claimed that the HHO obtained better results than the other compared algorithms.
The authors of [222] used the HHO to obtain the optimal reconfiguration of the network in distribution systems. Abdelsalam et al. [223] presented a smart unit model for multisource operation and cost management based on the HHO algorithm. Mohandas and Devanathan [208] used crossover and mutation with the HHO. The novel algorithm, called CMBHHO, was applied to configure the network by distributed generator (DG) size and optimal location; they compared it with GSA, ALO, LSA, HSA, GA, FWA, RGA, TLBO, and HHO. Mossa et al. [224] developed a model to estimate proton exchange membrane fuel cell (PEMFC) parameters based on the HHO.
Chakraborty et al. [225] used the HHO to select the optimum capacity, site, and number of solar DG units. They used two benchmark systems: the IEEE 33-bus and IEEE 69-bus radial networks.

Photovoltaic Models
Due to the continuous and huge increase in energy demand, solar photovoltaic (PV) systems, which are based on solar cells, have gained huge momentum. The HHO has been successfully applied to PV problems. Liu et al. [137] used their algorithm, called CCNMHHO, to find the optimal PV model parameters. The authors state that CCNMHHO achieves competitive results compared with other well-known and state-of-the-art algorithms. Likewise, Jiao et al. [112] used their developed algorithm (EHHO) to find PV parameters and to construct the corresponding model with high precision. Moreover, in [135], the authors used their novel approach to find the optimal PV array reconfiguration to alleviate the influence of partial shading. Likewise, Qais et al. [226] extracted the parameters of the three-diode PV (TDPV) model.
Chen et al. [156] identified PV cell parameters with an enhanced HHO version termed EHHO. They compared their algorithm with CLPSO, IJAYA, and GOTLBO. Similar work was conducted by Sahoo and Panda [227], in which solar PV frequency is controlled.

Wind Applications
In [228], Fang et al. developed a multiobjective HHO with a mutation operator (HMOHHO), which is used to acquire Volterra kernel identification parameters. They applied it to forecast wind speed. Roy et al. [229] applied the HHO to interconnected wind turbines and compared it with PSO, FPA, GWO, MVO, WOA, MFO, and BOA.

Economic Load Dispatch Problem
The economic load dispatch (ELD) problem can be considered one of the most common and important problems in power systems [230,231].
Pham et al. [232] employed a multirestart strategy and opposition-based learning (OBL) to enhance the HHO. The novel algorithm was applied to solve ELD with nonsmooth cost functions, and the authors argued that its results are superior to those of previous studies.
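ELD is typically posed as minimizing the total fuel cost, often quadratic per generating unit, subject to a power-balance constraint; metaheuristics such as the HHO usually handle the constraint with a penalty term. The following is a generic sketch of such a penalized fitness (not Pham et al.'s exact formulation; the penalty weight is an assumption):

```python
def eld_fitness(p, cost_coeffs, demand, penalty=1e6):
    """Penalized ELD fitness for a candidate dispatch p (MW per unit).

    cost_coeffs: list of (a, b, c) quadratic fuel-cost coefficients per unit,
    so unit i costs a + b*P_i + c*P_i**2. The power-balance constraint
    sum(P_i) == demand is enforced softly via a penalty term.
    """
    cost = sum(a + b * pi + c * pi ** 2
               for (a, b, c), pi in zip(cost_coeffs, p))
    return cost + penalty * abs(sum(p) - demand)
```

A candidate dispatch that meets demand exactly incurs only its fuel cost; infeasible dispatches are pushed away by the penalty.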

Unit Commitment Problem
Nandi and Kamboj [233] hybridized the Sine Cosine Algorithm and a memetic algorithm with the HHO. They applied it to solve a unit commitment problem with photovoltaic applications; the SCA is used in power provision, whereas ELD is performed by the HHO.

Computer Science

Artificial Neural Networks
Sammen et al. [234] used the HHO to enhance Artificial Neural Network (ANN) performance and proposed a hybrid model called ANN-HHO. The authors compared their novel model with ANN-GA, ANN-PSO, and the classical ANN and argued that ANN-HHO outperformed the compared algorithms. Similar work was conducted by Essa et al. [235], in which their model was compared against the Support Vector Machine (SVM) and the traditional ANN; they applied the novel model to improve active solar productivity prediction.
Fan et al. [113] used their novel algorithm, NCOHHO, to train a multilayer feed-forward neural network on five different datasets. Similar work combining the HHO with an ANN was conducted by Moayedi et al. [236] and applied to predict the compression coefficient of soil. The authors argued that the HHO is better than GOA in training ANNs.
Moreover, in [237], Kolli and Tatavarth developed the Harris Water Optimization (HWO) based on a deep recurrent neural network (RNN) to detect fraud in bank transactions.
Artificial Neural Networks (ANNs) are among the most famous and popular learning methods that simulate the biological nervous system, and the HHO has been used by many authors to train them. Bacanin et al. [238] adapted the HHO algorithm to train ANNs and used two popular classification datasets to test their proposed approach. Moreover, Atta et al. [239] applied their enhanced version of the HHO to train a feed-forward neural network, comparing their method with eight metaheuristic algorithms on five classification datasets. A hybrid algorithm between the HHO and the Whale Optimization Algorithm was developed to enhance ANNs by Agarwal et al. [240].
Bac et al. [241] developed a hybrid model based on the HHO and the Multilayer Perceptron (MLP). They used their developed system, called HHO-MLP, to estimate the efficiency of heavy metal absorption using nanotube-type halloysite.
Alamir [242] proposed an enhanced ANN using the HHO to predict food liking in the presence of different levels and types of masking background noise. Simsek and Alagoz [243] used an ANN learning model and the HHO to develop analysis schemes for optimal engine behavior. Likewise, Zhang et al. [244] estimated clay's friction angle using a deep NN and the HHO to evaluate slope stability. Murugadoss used a Deep Convolution ANN (DCANN) with the HHO for early diabetes prediction.

Image Processing
In [210], Bao et al. applied their hybrid algorithm, which combines the HHO with differential evolution (HHO-DE), to multilevel thresholding segmentation of color images using two techniques: Otsu's method and Kapur's entropy. They compared their results with seven other algorithms on ten images and argued that HHO-DE outperforms all of them in terms of the structural similarity index, peak signal-to-noise ratio, and feature similarity index. Similar work has been conducted by Wunnava et al. [245]; in addition to using DE, they modified the exploration phase by limiting the escape energy to the interval [2, 3].
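Otsu's criterion, which these thresholding studies maximize, scores a set of thresholds by the between-class variance of the histogram classes they induce; the metaheuristic searches for the thresholds with the highest score. A minimal NumPy sketch (illustrative, not the authors' implementation):

```python
import numpy as np

def otsu_between_class_variance(hist, thresholds):
    """Between-class variance for multilevel thresholding (to be maximized).

    hist: 256-bin grayscale histogram (raw counts are fine, it is
    normalized internally); thresholds: sorted gray-level cut points.
    """
    p = hist / hist.sum()                  # probability of each gray level
    levels = np.arange(len(p))
    mu_total = (p * levels).sum()          # global mean gray level
    edges = [0] + list(thresholds) + [len(p)]
    var = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                 # class probability
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w  # class mean
            var += w * (mu - mu_total) ** 2
    return var
```

In an HHO-based segmenter, each hawk encodes a candidate threshold vector and this function (or Kapur's entropy) serves as the fitness.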
Moreover, Golilarz et al. [246] utilized the HHO to obtain the optimal thresholding function parameters for satellite images. Jia et al. [118] employed their algorithm, called dynamic HHO with mutation (DHHO/M), to segment satellite images using three criteria: Kapur's entropy, Tsallis entropy, and Otsu's method. Similar work has been conducted by Shahid et al. [247], in which image denoising was performed in the wavelet domain.
In [248], an efficient HHO variant was introduced by Esparza et al., in which minimum cross-entropy is used as the fitness function for image segmentation. To validate their method, they compared it with K-means and fuzzy IterAg.
Naik et al. [249] proposed a leader HHO (LHHO) to enhance the exploration of the algorithm. They applied it to 2-D Masi entropy multilevel image thresholding, using segmentation metrics such as PSNR, FSIM, and SSIM [250].

Scheduling Problem
Machine scheduling can be considered as a decision-making optimization process that has a vital role in transport, manufacturing, etc.
Jouhari et al. [174] used SSA with the HHO to solve a machine scheduling problem. Moreover, Attiya et al. [175] used a modified version of the HHO called HHOSA to solve job scheduling in cloud environments.
Utama and Widodo [211] developed a hybrid algorithm based on the HHO to solve a flow shop scheduling problem (FSSP).

Feature Selection
Feature Selection (FS) is one of the most important preprocessing techniques; it aims to reduce the number of features that may negatively influence machine learning performance [251][252][253]. The HHO has been used to solve FS problems. For example, Thaher et al. [123] used a binary version of the HHO to solve the FS problem on nine high-dimensional datasets.
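Wrapper-based binary HHO variants in this literature typically minimize a fitness that trades the classifier's error rate against the size of the selected subset. A common form is the weighted sum below (a generic sketch; the weight alpha varies by paper, 0.99 here is a typical but assumed value):

```python
def fs_fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Common wrapper fitness in binary metaheuristic feature selection.

    Weighted sum (to be minimized) of the classifier error rate and the
    fraction of features kept, so accuracy dominates but smaller subsets
    break ties.
    """
    return alpha * error_rate + (1 - alpha) * n_selected / n_total
```

With this fitness, two subsets with equal classification error are ranked by how few features they use.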
Moreover, the authors in [254] used the HHO with Simulated Annealing and bitwise operators (AND and OR) on 19 datasets of various sizes. They claimed that HHOBSA obtains good results compared with other methods.
Similar work was conducted by Thaher and Arman [124], who used five different datasets together with the ADASYN technique. Moreover, Sihwail et al. [150] used IHHO to solve the FS problem on 20 datasets with different feature dimensionality levels (low, moderate, and high), comparing IHHO with seven other algorithms. Thaher et al. [255] detected false information in Arabic tweets using a hybrid HHO algorithm based on machine learning models and feature selection; their algorithm, called Binary HHO with Logistic Regression (LR), achieved better results than previous work on the same dataset. In [254], Abdel-Basset et al. hybridized the HHO with Simulated Annealing based on bitwise operators and applied the novel algorithm, called HHOBSA, to feature selection using 24 standard datasets and 19 artificial ones.
Moreover, in [256], Turabieh et al. proposed an enhanced version of the HHO and applied it to a feature selection problem using K-Nearest Neighbor (KNN) in order to predict the performance of students. They evaluated their prediction system using several machine learning classifiers, such as kNN, Naïve Bayes, a layered recurrent neural network (LRNN), and an Artificial Neural Network.
Al-Wajih et al. [257] introduced a hybrid algorithm combining the HHO with the Grey Wolf Optimizer. The new algorithm, called HBGWOHHO, uses a sigmoid function to transfer from the continuous to the binary domain. They compared it with Binary Particle Swarm Optimization (BPSO), the Binary Genetic Algorithm (BGA), the Binary Grey Wolf Optimizer (BGWO), the Binary Harris Hawks Optimizer (BHHO), and the binary hybrid BWOPSO, and reported that their algorithm achieved better accuracy with a smaller number of selected features.
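The sigmoid transfer function mentioned above maps each continuous dimension of a hawk's position to the probability of selecting the corresponding feature, then samples a bit per dimension. A minimal sketch (illustrative; the injectable `rng` argument is our addition for testability, not part of any published variant):

```python
import math
import random

def binarize(x, rng=random.random):
    """Map a continuous position vector to a binary feature mask via the
    S-shaped (sigmoid) transfer function: bit_i = 1 with probability
    sigmoid(x_i)."""
    return [1 if rng() < 1.0 / (1.0 + math.exp(-xi)) else 0 for xi in x]
```

Larger positive components thus make a feature more likely to be kept, while negative components push it toward exclusion.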
Khurma et al. [258] developed two binary HHO versions based on a filter approach for feature selection. The first applies mutual information between each pair of features with the HHO; the second applies the HHO with each feature group's entropy. The authors found that the first approach selects smaller feature subsets, whereas the second achieves higher classification accuracy.
Too et al. [212] enhanced the HHO algorithm with a memory-saving mechanism and a learning strategy. The novel algorithms, called MEHHO1 and MEHHO2, were employed to solve feature selection problems and were evaluated using thirteen low-dimensional and eight high-dimensional benchmark datasets.

Traveling Salesman Problem
Yaser and Ku-Mahamud [259] applied their hybrid algorithm, called the Harris Hawk Optimizer Ant Colony System (HHO-ACS), to the Traveling Salesman Problem (TSP). They used many symmetric TSP instances, such as bayg29, att48, berlin52, bays29, eil51, st70, eil76, and eil101, and compared the novel algorithm with the Black Hole algorithm [260], PSO, DA, GA, and ACO, arguing that HHO-ACS performs well against them.
Ismael et al. [261] developed a new version of the HHO to solve an FS problem using V-support regression.

Wireless Sensor Network
Wireless sensor networks (WSNs) have recently been used in many fields, such as smart homes, health care, and environment detection, owing to features such as self-organization and environmental friendliness [262,263]. Srinivas and Amgoth [264] used the HHO with SSA to propose an energy-efficient WSN.
Likewise, Bhat and Venkata [265] used the HHO to classify node coverage ranges into incoming and outgoing neighbors, a technique based on area minimization. The authors tested HHO-MA on 2D square, 2D C-shape, 3D G-shape, 3D cube, and 3D mountain scenarios. In [266], Singh and Prakash used the HHO to find the optimal placement of multiple optical networks. Xu et al. [267] used the HHO for intelligent reflecting surfaces, maximizing signal power by optimizing access point beamforming.
Sharma and Prakash [268] developed a model called HHO-LPWSN which used the HHO to localize sensors nodes in a wireless sensor network.

Medical Applications
A pulse-coupled neural network (PCNN) based on the HHO, called HHO-PCNN, was developed by Jia et al. [269]. They applied it to image segmentation using several performance indicators (UM, CM, Recall, Dice, and Precision). The results were compared with WOA-PCNN, SCA-PCNN, PSO-PCNN, SSA-PCNN, MVO-PCNN, and GWO-PCNN, and the authors claimed that their method achieves the best results.
Moreover, in [270], Rammurthy and Mahesh used their hybrid Whale HHO (WHHO) algorithm with a deep learning classifier to detect brain tumors in MRI images from two datasets: BRATS and SimBRATS.
Moreover, Abd Elaziz et al. [177] employed their method, called competitive chain HHO, for multilevel image thresholding using eleven natural grayscale images. An adaptive HHO proposed by Wunnava et al. [117] has been applied to 2-D gray gradient multilevel image thresholding; they showed that their algorithm using I2DGG outperforms all other methods. Moreover, Golilarz et al. [116] used their algorithm (CMDHHO), which is based on multipopulation differential evolution, to denoise satellite images. Suresh et al. [186] used their chaotic hybrid algorithm, based on a deep kernel machine learning classifier (CMVHHO-DKMLC), to classify medical diagnoses.
In [271], Kaur et al. employed Dimension Learning-based Hunting (DLH) with the original HHO. The novel algorithm, known as DLHO, was developed for biomedical datasets; the authors applied it to detect breast cancer.
Likewise, Chacko and Chacko [272] used the HHO to embed watermarks at various strengths. The watermark bits were merged with a Deep Learning Convolutional NN (DLCNN), which used the HHO to identify the watermarks.
Bandyopadhyay et al. [273] used the altruism concept and chaotic initialization to improve the HHO. Their novel algorithm was applied to segment brain Magnetic Resonance Images (MRI) using 18 benchmark images taken from the BrainWeb and WBE databases.
Iswisi et al. [274] used the HHO to select the optimal cluster centers in fuzzy C-means (FCM) segmentation. They tested their model using many brain MRIs.
Balamurugan et al. [275] classified heart disease using an adaptive HHO and an enhanced deep GA: features were clustered using the adaptive HHO, whereas the enhanced deep GA performed the classification. They used a UCI dataset to test their novel algorithm.
Qu et al. [167] used their algorithm, called VNLHHO, to enhance gene expression classification by improving feature selection performance, with a Bayesian classifier as the fitness function. They tested their model using gene expression profile data of eight different tumor types.

Chemical Engineering and Drug Discovery
Cheminformatics is a field concerned with discovering, analyzing, and predicting the properties of molecules by combining methods from mathematics and information science [276].
Houssein et al. [277] used the HHO with two classification techniques, SVM and kNN, on two datasets: MonoAmine Oxidase and QSAR Biodegradation. They showed that HHO-SVM achieved better results than HHO-KNN. Moreover, in [178], the same authors hybridized the HHO with Cuckoo Search (CS) and chaotic maps, using SVM as the objective function, and showed that CHHO-CS outperforms PSO, MFO, GWO, SSA, and SCA.
Abd Elaziz and Yousri [213] used Henry Gas Solubility Optimization (HGSO) to enhance the HHO. They applied their algorithm (DHGHHD) to drug design and discovery prediction using two real-world datasets.
Houssein et al. [278] used genetic algorithm operators (crossover and mutation), in addition to OBL and random OBL strategies, in the classical HHO to select chemical descriptors/features and compound activities.

Electronic and Control Engineering
Proportional-Integral-Derivative (PID) controllers are used to improve Automatic Voltage Regulator (AVR) performance. Ekinci et al. [279] used the HHO to tune the PID parameters in AVR systems. Moreover, the same authors used the same algorithm [280] to tune a DC motor PID controller by minimizing the integral of time-multiplied absolute error (ITAE), comparing their approach with the Grey Wolf Optimizer, the Atom Search Algorithm, and the Sine Cosine Algorithm. They also [281] applied the HHO to tune a Fractional-Order PID (FOPID) controller for a DC-DC buck converter, comparing their algorithm, called HHO-FOPID, with WOA-PID and GA-PID.
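The ITAE criterion minimized in such tuning studies is the integral of time multiplied by the absolute error, ∫ t·|e(t)| dt, evaluated over the system's step response; weighting by time penalizes errors that persist. A discrete trapezoidal approximation (a generic sketch, not the exact implementation of any cited paper):

```python
def itae(t, e):
    """Trapezoidal-rule approximation of the ITAE criterion
    over sampled times t and tracking errors e."""
    total = 0.0
    for k in range(1, len(t)):
        f0 = t[k - 1] * abs(e[k - 1])   # integrand t*|e| at left sample
        f1 = t[k] * abs(e[k])           # integrand at right sample
        total += 0.5 * (f0 + f1) * (t[k] - t[k - 1])
    return total
```

In an HHO-based tuner, each hawk encodes candidate controller gains, the closed-loop step response is simulated, and this value serves as the fitness to minimize.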
Likewise, Fu and Lu [205] developed a multiobjective PID controller based on the HHO with hybrid strategies (HMOHHO) for Hydraulic Turbine Governing Systems (HTGS).
Yousri et al. [282] used the HHO to tune PI controller parameters for load frequency control in multi-interconnected systems, using two test systems: (1) two interconnected thermal areas comprising PV and (2) four PV plants, two thermal plants, and a wind turbine. They compared the HHO against SCA, MVO, ALO, and GWO.
Barakat et al. [283] developed load frequency control (LFC) for an interconnected power system using PD-PI cascade control.
Munagala and Jatoth [284] tried to design a Fractional-order PID (FOPID) using the HHO for speed control.

Geological Engineering
Landslides can be described as gravity-triggered downward mass movements. Bui et al. [285] applied the HHO to landslide susceptibility analysis; the authors used 208 historical landslides to build a predictive tool based on an ANN.
Moreover, Moayedi et al. applied the HHO to train a multilayer perceptron (MLP) and used it to assess the bearing capacity of footings over two-layer foundation soils.
Murlidhar et al. [286] introduced a novel version of the HHO based on the multilayer perceptron neural network to predict flyrock distance induced by mine blasting.
Yu et al. [287] developed an ELM model based on the HHO in order to forecast mine blasting peak particle velocity.
Payani et al. [288] used the HHO and the Bat Algorithm with machine learning tools (ANFIS/SVR) to improve spatial landslide modeling.

Building and Construction or Civil Engineering
Golafshani et al. [289] used a Radial Basis Function neural network (RBFNN) and a multilayer neural network (MLNN) with the HHO to measure concrete Compressive Strength (CS). Parsa and Naderpour [290] estimated the shear strength of reinforced concrete walls using the HHO with support vector regression.

Coronavirus COVID-19
The new coronavirus disease, known as COVID-19, was classified as an infectious disease by the World Health Organization (WHO). The virus was first identified in Wuhan, China, and has affected billions of people's lives [291,292]. Computer science researchers have used the HHO to analyze and detect this virus.
Houssein et al. [293] classified COVID-19 genes using SVM, testing their model on a large gene expression cancer (RNA-Seq) dataset with 20,531 features. Hu et al. [294] employed the HHO with the Extreme Learning Machine (ELM) to detect COVID-19 severity using blood gas analysis; they used Specular Reflection Learning and named the new HHO algorithm HHOSRL.
Ye et al. [295] developed a fuzzy KNN HHO method to predict and diagnose COVID-19. They compared their algorithm, called HHO-FKNN, with several machine learning algorithms and argued that it achieves higher classification accuracy and better stability. Moreover, Bandyopadhyay et al. [296] used their hybrid algorithm, which combines a chaotic HHO with SA, to screen COVID-19 CT scans.

Other Applications

Microchannel Heat Sink Design
Thermal management of electronic devices has become very important in product designs that require more efficiency and power. Abbasi et al. [297] used the HHO in microchannel heat sink design and compared their results with the Newton-Raphson method, PSO, GOA, WOA, DA, and the Bees Optimization Algorithm.

Chart Patterns Recognition
Golilarz et al. [298] proposed a novel automatic approach based on the HHO and deep learning (ConvNet) for recognizing nine control chart patterns (CCP). In this CCP recognition method, the unprocessed data are passed through more than one hidden layer in order to extract all representative features.

Water Distribution
Khalifeh et al. [299] developed a model based on the HHO to optimize the water distribution network of Homashahr, a city in Iran, from September 2018 to October 2019. The authors stated that the HHO proved efficient in finding the optimal water network design.

Internet of Things
The Internet of Things (IoT) gives different entities the ability to access the environment, monitor it, and communicate with other entities [300]. Seyfollahi and Ghaffari [301] developed a scheme for handling Reliable Data Dissemination in IoT (RDDI) based on the HHO. The authors evaluated their scheme against three comparative approaches using five different metrics: reliability, energy consumption, end-to-end delay, packet forwarding distance, and computational overhead. Moreover, Saravanan et al. [302] proposed a PI controller based on the HHO and tuned BLDC motor parameters in an IoT setting.

Short-Term Load Forecasting
Tayab et al. [303] proposed a novel hybrid algorithm called HHO-FNN, which trains a feed-forward neural network (FNN) using the HHO. They applied it to predicting load demand in the Queensland electricity market and compared HHO-FNN with PSO, ANN, a PSO-based support vector machine, and a back-propagation neural network.

Cardiomyopathy
Ding et al. [304] developed a fuzzy HHO algorithm (FHHO) to monitor cardiomyopathy patients using wearable devices and sensors. They introduced Wearable Sensing Data Optimization (WSDO) to obtain accurate cardiomyopathy data.

QoS-Aware Service Composition
Li et al. [305] addressed QoS-aware Web Service Composition (QWSC) problems by developing a method to construct fuzzy neighborhood relations and combining the HHO with a logistic chaotic function.

PEMFC Parameter Estimation
Mossa et al. [224] employed the HHO to estimate the unknown parameters of Proton Exchange Membrane Fuel Cells (PEMFC). They tested it using three PEMFC stacks: 500-W SR-12 PEM, BCS 500-W PEM, and 250-W. They claimed that the HHO-based approach surpassed other algorithms in convergence speed and accuracy.

DVR Control System
ElKady et al. [306] employed the HHO to develop an optimized, enhanced, and less complex Dynamic Voltage Restorer (DVR). The results were compared with PSO and WOA. The authors used MATLAB/Simulink to simulate the system, with validation via a Typhoon HIL402 real-time emulator.

A Brief Discussion
In this paper, we reviewed the most contemporary works and developments in the improvement and verification of the HHO. As the collected works show, there are several points to consider for both the conventional HHO and its enhanced variants. The first is performance. The different studies verified that one reason for interest in the HHO over other competitors is its performance: due to features such as dynamic components, greedy selection, and multiphase searching rules, the HHO can deliver high-quality results under the same conditions in which other methods are applied. This is one of the main reasons the HHO is utilized in the tracked papers. The second is the need for performance optimization. In most of the variants, the authors mention that the stability of the optimizer needs to be improved to reach higher convergence levels and fewer stagnation problems. This requirement applies to all population-based methods in order to strike a more stable balance between local and global search inclinations.
Second, most of the studies have enhanced the balance between the exploratory and exploitative tendencies of the HHO. The accuracy of results and convergence speed are the features most frequently enhanced in the literature so far. We also observed that the authors applied the HHO and its variants to many new problems and datasets. In line with the no free lunch (NFL) theorem [307], authors increasingly understood, compared with before 2019, how to adapt algorithm features to new problems and how their enhanced operators contribute to the efficacy of the final results. Hence, they could investigate different performance aspects of the HHO in dealing with many real-world problems. Last but not least, the quality of HHO results can be further enhanced with deeper evolutionary bases, such as coevolutionary techniques, multipopulation approaches, memetic methods, and parallel computing; such bases can further assist in harmonizing global and local search trends, resulting in better HHO variants. Another suggestion is that if future studies also attach the source code of their enhanced variants, research on the HHO will be more productive, since new variants can be compared with previous ideas directly.

Conclusions and Future Work
In this survey paper, we reviewed the recent applications and variants of the recently well-established robust optimizer, the Harris hawk optimizer (HHO), one of the most popular swarm-based techniques of recent years. The original HHO maintains a set of random solutions, as a population-based method, and can perform two phases of global search and four phases of local search. The HHO shows high flexibility in the transition between phases and has several dynamic components that assist it in more efficient exploratory and exploitative behavior. The literature review also contained an in-depth look at enhanced versions, the way they were enhanced, and their application domains.
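The phase transitions described above are driven by the escape energy of the prey, E = 2·E0·(1 − t/T) with E0 drawn uniformly from (−1, 1) each iteration, as defined in the original HHO paper: |E| ≥ 1 triggers exploration, while |E| < 1 together with a random number r selects among the four exploitation (besiege) strategies. A compact sketch of this dispatch logic:

```python
import random

def hho_phase(t, T, rng=random.random):
    """Return which HHO search phase fires at iteration t of T,
    following the escape-energy rule of the original algorithm."""
    E0 = 2 * rng() - 1           # initial escape energy in (-1, 1)
    E = 2 * E0 * (1 - t / T)     # magnitude decays over iterations
    if abs(E) >= 1:
        return "exploration"
    r = rng()                    # chance the prey escapes successfully
    if r >= 0.5:
        return "soft besiege" if abs(E) >= 0.5 else "hard besiege"
    return ("soft besiege with progressive rapid dives"
            if abs(E) >= 0.5
            else "hard besiege with progressive rapid dives")
```

Because |E| shrinks linearly with t, early iterations favor exploration and later ones the harder, more local besiege strategies, which is the source of the flexible phase transitions noted above.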
There are several possible future directions and ideas worth investigating regarding new variants of the HHO algorithm and its widespread applications. First, the HHO is still a young algorithm, and several problems in the machine learning domain, mainly feature selection, remain unresolved. Moreover, authors are expected to provide more balance measures in their analyses and to report the computational complexity of variants after modification. Although the HHO is a relatively fast method, such analysis can make the literature more transparent regarding computation times.

Figure 4 .
Figure 4. Boxplot of some functions from F1-F12 for all algorithms.

Figure 9 .
Figure 9. Boxplot of some functions from F1-F10 for all algorithms.

Figure 10 .
Figure 10. Boxplot of some functions from F11-F20 for all algorithms.

Figure 11 .
Figure 11. Boxplot of some functions from F21-F30 for all algorithms.

Figure 12 .
Figure 12. Distribution of HHO-related papers in many applications, as reported by Scopus.

Author Contributions: A.G.H.: conceptualization, supervision, methodology, formal analysis, resources, data curation, and writing-original draft preparation. L.A. and K.H.A.: conceptualization, supervision, writing-review and editing, project administration, and funding acquisition. R.A.Z., F.A.H., M.A. and A.S.: conceptualization, writing-review and editing, and supervision. A.H.G.: conceptualization and writing-review and editing. All authors have read and agreed to the published version of the manuscript.

Funding: This research received no external funding.

Table 6 .
The comparison results of all algorithms over 30 functions.

Table 7 .
Summary of literature review on variants and modified HHO algorithms.

Table 8 .
Summary of the literature review on hybrid HHO algorithms.

Table 9 .
The applications of HHO algorithm.