A Hybrid Whale Optimization Algorithm for Global Optimization

This paper proposes a hybrid whale optimization algorithm (WOA), the genetic and thermal exchange optimization-based whale optimization algorithm (GWOA-TEO), to enhance global optimization capability. First, a high-quality initial population is generated to improve the performance of GWOA-TEO. Then, thermal exchange optimization (TEO) is applied to improve exploitation performance. Next, a memory is introduced that stores historical best-so-far solutions, achieving higher performance without adding computational costs. Finally, a crossover operator based on the memory and a memory-based position update mechanism for the leading solution are proposed to improve exploration performance. The GWOA-TEO algorithm is compared with five state-of-the-art optimization algorithms on the CEC 2017 benchmark test functions and 8 UCI repository datasets. The statistical results on the CEC 2017 benchmark test functions show that the GWOA-TEO algorithm has good accuracy for global optimization, and the classification results on the 8 UCI repository datasets show that it achieves competitive recognition rates compared with the comparison algorithms. Thus, the proposed algorithm demonstrates excellent performance in solving optimization problems.


Introduction
With the rapid development of technology and higher requirements for product quality, a large number of practical optimization problems have arisen in various fields. However, most optimization problems are NP-hard and cannot be solved exactly in polynomial time [1]. Therefore, researchers have proposed many meta-heuristics, which can find a good enough solution in a reasonable time and can be effectively applied to optimization problems. Meta-heuristic algorithms are population-based heuristic search strategies with the advantages of easy implementation, simple conception, and wide applicability to problems in different fields. They are inspired by biological or physical phenomena in nature and solve optimization problems by mimicking those behaviors. In recent years, numerous meta-heuristic algorithms have been applied to solve complex optimization problems in various fields; these algorithms can be divided into three categories: evolutionary algorithms, physical algorithms, and swarm intelligence algorithms. Commonly used evolutionary algorithms are differential evolution (DE) [2], evolutionary programming (EP) [3], the genetic algorithm (GA) [4], the evolutionary strategy (ES) [5], and the state transition algorithm (STA) [6]. Commonly used physical algorithms are simulated annealing (SA) [7], the gravitational search algorithm (GSA) [8], charged system search (CSS) [9], water evaporation optimization (WEO) [10], and thermal exchange optimization (TEO) [11]. Commonly used swarm intelligence algorithms are particle swarm optimization (PSO) [12], the bat algorithm (BA) [13], the ant lion optimizer (ALO) [14], chicken swarm optimization (CSO) [15], the seagull optimization algorithm (SOA) [16], and the whale optimization algorithm (WOA) [17]. Section 5 discusses the advantages and limitations of the proposed algorithm, and Section 6 presents the conclusions.

Whale Optimization Algorithm
The WOA algorithm mimics the hunting behavior of humpback whales. The algorithm has two phases to mimic these behaviors: the exploitation phase, which comprises encircling prey and the bubble-net attacking method, and the exploration phase, which is the search for prey.

Exploitation Phases
Equations (1) and (2) mimic the behavior of humpback whales encircling prey. In the algorithm, the leading solution is taken as the target prey, and the other solutions move toward it:

$$\vec{D} = \left|\vec{\beta} \cdot \vec{X}_b(t) - \vec{X}(t)\right| \quad (1)$$

$$\vec{X}(t+1) = \vec{X}_b(t) - \vec{\alpha} \cdot \vec{D} \quad (2)$$

where $t$ is the current iteration, $\vec{X}_b$ is the best-so-far solution, and $\vec{\alpha}$ and $\vec{\beta}$ are coefficient vectors defined as follows:

$$\vec{\alpha} = 2\vec{a} \cdot \vec{r} - \vec{a} \quad (3)$$

$$\vec{\beta} = 2\vec{r} \quad (4)$$

where $\vec{a}$ is a coefficient that decreases linearly from 2 to 0 over the iterations and $\vec{r}$ is a random vector in $[0, 1]$. Based on Equation (2), the algorithm assumes that the best-so-far solution is the prey and updates the current position of the humpback whale to a position near the prey, simulating the encircling behavior.
To mimic the bubble-net attacking of humpback whales, two mathematical models are proposed:

1. Shrinking encircling mechanism: this model is implemented by linearly decreasing the value of the vector $\vec{a}$. Because $\vec{\alpha}$ depends on $\vec{a}$ and the random vector $\vec{r}$, the fluctuation range of the coefficient vector $\vec{\alpha}$ shrinks as $\vec{a}$ decreases from 2 to 0 over the iterations.

2. Spiral updating position: this model first calculates the distance between the humpback whale and the prey, and then the whale circles the prey along a logarithmic spiral:

$$\vec{X}(t+1) = \vec{D}' \cdot e^{\gamma\psi} \cdot \cos(2\pi\psi) + \vec{X}_b(t) \quad (5)$$

where $\vec{D}' = |\vec{X}_b(t) - \vec{X}(t)|$ represents the distance between the prey and the humpback whale, $\gamma$ is a constant defining the shape of the logarithmic spiral, and $\psi$ is a random value in $(-1, 1)$.
In the exploitation phase, when the position of the prey is determined, the humpback whale will dive deep, and then begin to form spiral bubbles around the prey and move upstream toward the surface. The humpback whale gradually shrinks into the circle while simultaneously hunting prey along a spiral path. The hunting behavior assumes that the shrinking circle and the spiral-shaped path have the same implementation probability to update the position of the humpback whale in the iteration process.
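As a minimal sketch of these two exploitation moves (in Python with NumPy, since the paper's MATLAB implementation is not reproduced here; function and variable names are illustrative):

```python
import numpy as np

def shrink_encircle(x, best, a, rng):
    """Encircling prey: X(t+1) = X_b - alpha * |beta * X_b - X|."""
    r = rng.random(x.shape)
    alpha = 2.0 * a * r - a           # coefficient vector alpha
    beta = 2.0 * rng.random(x.shape)  # coefficient vector beta
    return best - alpha * np.abs(beta * best - x)

def spiral_update(x, best, gamma, rng):
    """Spiral updating position: approach X_b along a logarithmic spiral."""
    d = np.abs(best - x)                   # distance to the prey
    psi = rng.uniform(-1.0, 1.0, x.shape)  # random value in (-1, 1)
    return d * np.exp(gamma * psi) * np.cos(2.0 * np.pi * psi) + best
```

Note that as `a` approaches 0 late in the run, `alpha` vanishes and `shrink_encircle` collapses onto the best-so-far position, which is exactly the shrinking-circle behavior described above.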

Exploration Phase
In the exploration phase, the algorithm forces a solution away from the current best solution and randomly explores the search space. To do so, the WOA algorithm uses values of $\vec{\alpha}$ greater than +1 or less than −1 and selects the reference solution at random:

$$\vec{D} = \left|\vec{\beta} \cdot \vec{X}_{rand} - \vec{X}\right| \quad (6)$$

$$\vec{X}(t+1) = \vec{X}_{rand} - \vec{\alpha} \cdot \vec{D} \quad (7)$$

where $\vec{X}_{rand}$ is the position vector of a solution randomly chosen from the current population. This random selection, together with $|\vec{\alpha}| > 1$, allows the algorithm to perform global exploration.
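The exploration move differs from encircling only in the reference whale; a sketch under the same illustrative naming as above:

```python
import numpy as np

def explore(x, population, a, rng):
    """Search for prey: follow a randomly chosen whale rather than the
    best one, pushing the solution away from the current leader."""
    x_rand = population[rng.integers(len(population))]
    r = rng.random(x.shape)
    alpha = 2.0 * a * r - a           # for a > 1, |alpha| can exceed 1
    beta = 2.0 * rng.random(x.shape)
    return x_rand - alpha * np.abs(beta * x_rand - x)
```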

Overview of WOA
In the WOA algorithm, the adaptive change of the coefficient vector → α allows the WOA algorithm to smoothly transition between exploration and exploitation: exploration is performed when | → α | > 1, and exploitation is performed when | → α | < 1. During the exploitation phase, WOA can switch between shrinking circle and spiral-shaped path. The WOA has only two main internal parameters, → α and → β , that need to be adjusted.

Thermal Exchange Optimization
The TEO is inspired by Newton's law of cooling, i.e., the rate of heat loss of an object is proportional to the difference in temperatures between the object and its surroundings.
In the TEO algorithm, some solutions are defined as cooling objects, and the others represent the environment. The environmental temperature, modified by its previous value, is defined as follows:

$$T_i^{env} = \left(1 - \left(c_1 + c_2\left(1 - \frac{Iter}{MaxIter}\right)\right) \cdot rand\right) \cdot T_i^{env,old}$$

where $c_1$ and $c_2$ are the controlling variables, $T_i^{env,old}$ is the previous environmental temperature of the solution, $Iter$ is the current iteration, and $MaxIter$ is the maximum number of iterations.

According to $T_i^{env}$, the temperature update equation of each solution is defined as follows:

$$T_i^{new} = T_i^{env} + \left(T_i^{old} - T_i^{env}\right) \cdot e^{-\varepsilon \, Iter / MaxIter}$$

where $\varepsilon$ controls the rate of heat exchange; when an object has a lower $\varepsilon$, it changes its temperature only slightly.
To improve the capability of escaping from local optimum traps, the parameter $Pro$ is introduced, which specifies whether a component of each cooling object must be changed. If $Ran(i) < Pro$ $(i = 1, 2, \ldots, n)$, one dimension $j$ of the $i$th object is selected randomly and its value is regenerated as follows:

$$T_{i,j} = T_{j,min} + rand \cdot \left(T_{j,max} - T_{j,min}\right)$$

where $T_{i,j}$ is the $j$th variable of the $i$th solution, and $T_{j,max}$ and $T_{j,min}$ are the upper and lower bounds of the $j$th variable, respectively.
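The temperature update above can be sketched as a single step (Python; the normalized-iteration exponent follows the standard TEO formulation and is an assumption here):

```python
import numpy as np

def teo_step(t_old, t_env_old, eps, c1, c2, it, max_iter, rng):
    """One TEO temperature update for a single cooling object."""
    # Environmental temperature modified by its previous value.
    t_env = (1.0 - (c1 + c2 * (1.0 - it / max_iter)) * rng.random()) * t_env_old
    # Newton's law of cooling: a lower eps changes the temperature only slightly.
    return t_env + (t_old - t_env) * np.exp(-eps * it / max_iter)
```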

Crossover Operator
GA is a well-known evolutionary algorithm used to solve optimization problems. The GA algorithm applies the following genetic operators: selection, crossover, mutation, and replacement. First, two chromosomes are selected based on their fitness values, and then the crossover operator combines the two parents to create offspring chromosomes. The mutation operator creates localized changes in the offspring. Finally, the replacement operator eliminates chromosomes with insufficient fitness values from the population. Therefore, the fittest chromosomes create more offspring, and the population converges towards the global optimum.

Genetic Whale Optimization Algorithm-Thermal Exchange Optimization for Global Optimization Problem
This paper proposes five mechanisms to enhance the WOA algorithm. Based on GA, solutions with high potential are selected and recombined through the crossover operation to generate more effective solutions, improving the global exploration capability of the algorithm. Based on TEO, the position update equation of the TEO algorithm replaces the original equation, and the TEO algorithm's thermal memory (TM) is combined with the proposed algorithm to save some historical best-so-far solutions. These two TEO-based mechanisms improve the local exploitation capability of the algorithm.

Generate a High-Quality Initial Population
In the global optimization problem, if some solutions lie near the region that contains the global optimum, the performance of global optimization algorithms can be improved. This paper proposes a mechanism that combines a local-search algorithm to generate high-quality initial populations. First, a larger initial population is generated to cover the search space completely, and then the best part is evaluated and selected as the population of the algorithm. This step can find one or more potential areas that contain the global optimum. Second, the position update equation of the TEO algorithm is used as a local-search algorithm to explore the region near the population. This step increases the probability of the population approaching the global optimum. Figures 1 and 2 show 100 solutions for solving the Shifted and Rotated Expanded Scaffer's F6 function in the CEC 2017 benchmark test functions [38], generated by a random initial population and a high-quality initial population, respectively.
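The oversample-then-select step can be sketched as follows (Python; the oversampling factor is an assumption, and the subsequent TEO local-search refinement described above is omitted):

```python
import numpy as np

def init_population(obj, lb, ub, pop_size, factor, rng):
    """Draw factor*pop_size random solutions covering the search space,
    then keep the pop_size solutions with the best (lowest) objective value."""
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    big = rng.uniform(lb, ub, size=(factor * pop_size, len(lb)))
    fitness = np.array([obj(s) for s in big])
    return big[np.argsort(fitness)[:pop_size]]
```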

Improvement of the Exploitation Phase
Basic WOA proposes two mathematical formulas in the local exploitation phase: the encircling prey formula and the spiral updating position. However, in the encircling prey formula, every solution follows only the best solution in the population. When the best solution falls into a local optimum, the other solutions will follow it there. Therefore, it is necessary to further improve the algorithm's exploitation capabilities. The proposed algorithm combines a local search algorithm (TEO) with powerful exploitation capabilities; in this article, the position update equation of the TEO algorithm replaces the encircling prey formula.

The Thermal Memory of the TEO Algorithm
Inspired by TEO, a memory that saves some historical best-so-far solutions can improve the algorithm's performance without adding additional computational costs. In this paper, the TM is adopted and combined with the proposed algorithm. During the iterations, each new position is compared with the TM. When the new position matches an entry in the TM, this may indicate that the overall evolution is sluggish, and a position update mechanism is triggered to search new areas of the search space.
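The memory check can be sketched as follows (Python; the matching tolerance is an assumption, since the paper does not state how equality of positions is tested):

```python
def tm_match(candidate, thermal_memory, tol=1e-9):
    """Return True if the candidate position matches any stored
    historical best-so-far solution, signalling sluggish evolution."""
    return any(
        max(abs(c - m) for c, m in zip(candidate, mem)) <= tol
        for mem in thermal_memory
    )
```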

The Crossover Operator Mechanism
In the GA, parents with better fitness values are selected to generate offspring through evolutionary operators, including crossover, mutation, and elitism. The crossover has a powerful global search capability, which can explore new areas of the search space. This study uses a crossover operator with three random cutting points to create a new solution. First, a pair of solutions is randomly selected from the historical best solutions stored in the TM. Next, the crossover operator creates two new solutions by cutting the pair of solutions at three random points and swapping segments. Then, one of the new solutions is randomly selected as the current solution. Figure 3 illustrates the crossover operator with three cutting points.
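A sketch of the three-cutting-point crossover (Python; the alternating segment-swapping convention is an assumption, since the paper's Figure 3 defines the exact scheme):

```python
import random

def three_point_crossover(p1, p2, rng):
    """Cut both parents at three random interior points and swap every
    other segment, producing two offspring."""
    n = len(p1)
    cuts = sorted(rng.sample(range(1, n), 3))  # three distinct cut points
    c1, c2 = list(p1), list(p2)
    swap, prev = False, 0
    for cut in cuts + [n]:
        if swap:
            c1[prev:cut], c2[prev:cut] = c2[prev:cut], c1[prev:cut]
        swap, prev = not swap, cut
    return c1, c2
```

Here `rng` is a `random.Random` instance, and the solution length must be at least 4 so that three distinct interior cut points exist.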

Update Leading Solution Based on Thermal Memory
The solutions in the basic WOA algorithm always follow the leading solution when updating their current positions. However, the best-so-far solution only represents the best of all current solutions; it does not guarantee that the overall evolution is on the right track. In a problem with multiple local optima, if the leading solution is a local optimum, the other solutions that follow it will also fall into the local optimum. To address this, TM is used in this paper: the proposed algorithm randomly selects one solution from TM instead of always following the current best solution.

The Proposed Algorithm GWOA-TEO
The proposed algorithm combines TM with four improvement mechanisms: the high-quality initial population, the improvement of the exploitation phase, the crossover operation, and the update of the best-so-far solution. The pseudocode and procedure of the proposed algorithm are shown in Algorithm 1 and Figure 4. The implementation time IT is used to prevent falling into an infinite loop and incurring large computational costs. When solving low-dimensional problems, population diversity is low, so implementing the TM mechanism can take a long time. The size of the thermal memory TM also affects the computational cost because, when implementing the mechanism for escaping the local optimum, the algorithm checks whether the solution matches the memory entries one by one. Therefore, appropriate values of IT and the TM size must be selected to control the computational cost. In this article, IT = 20 and a thermal memory size of TM = 10 are used for all case studies. The core position update logic in Algorithm 1 can be summarized as follows: update α and β using Equations (3) and (4); draw p, the probability of implementing shrinking encircling or spiral updating; if p < 0.5 and |α| < 1, update the current solution using Equation (11); if p < 0.5 and |α| ≥ 1, update it using Equation (8).

Experimental Results
In this section, two case studies are discussed. The experiments were run on an Intel Core i7-3930K 3.2 GHz CPU with 24 GB RAM using MATLAB 2017a. In this experimental study, the CEC 2017 benchmark functions [40], comprising 30 test functions, are applied to verify the optimization capability of GWOA-TEO; the second function has been excluded due to its unstable behavior. The descriptions of the 29 benchmark functions are shown in Table 1. Note: UF denotes a unimodal function, SMF a simple multimodal function, HF a hybrid function, and CF a composition function.

Parameter Setting
The performance of GWOA-TEO is compared with other optimization algorithms such as the heap-based optimizer (HBO) [41], complex path-perceptual disturbance WOA (CP-PDWOA) [42], and the basic whale optimization algorithm (WOA). The codes of all the stated algorithms are published by their original authors. The parameter settings of the above algorithms are shown in Table 2, where the maximum number of iterations is considered as the convergence criteria.

Statistical Results
In this subsection, three comparative algorithms, HBO, CP-PDWOA, and WOA, are adopted to evaluate the performance of GWOA-TEO. All functions are tested in 10 dimensions. Each algorithm evaluates all benchmark functions, and the average fitness value (avg) and standard deviation (std) over 30 independent runs are reported in Table 3. The results show that GWOA-TEO performs well, with results better than those of the other algorithms. Although GWOA-TEO does not outperform HBO on every function, it achieves comparable results to HBO on many benchmark functions. For instance, GWOA-TEO achieved the best results for F3, F4, F10, F14, F15, F21, F22, F23, F24, F27, F29, and F30. In particular, for the composition functions, GWOA-TEO achieved the best results on 7 of the 11 test functions. Moreover, for F4, F10, F14, F15, F21, F22, F24, F27, and F30, the performance of GWOA-TEO is better than that of HBO, and for F5, F6, F8, F9, F11, F17, F20, F25, and F26, the results of GWOA-TEO do not show a significant difference from those of HBO. Eight standard datasets from the UCI data repository are used in the second case study to evaluate the optimization performance of the proposed algorithm [43]. The details of the selected datasets, including the number of features, instances, and classes in each dataset, are shown in Table 4.

Parameter Setting
In this case study, GWOA-TEO is considered as a wrapper approach based on the k-nearest neighbors (k-NN) classifier. The k-NN classifier with the number of nearest neighbors k = 1 and 10-fold cross-validation is used in this experiment. The proposed algorithm is compared with optimization methods including WOA, GA, and TEO. The following fitness function is used to evaluate each solution:

$$fitness = \sigma \cdot \gamma_R(D) + \omega \cdot \frac{|R|}{|N|}$$

where $\gamma_R(D)$ is the classification error rate of the used classifier, $|R|$ is the number of selected features of each solution, $|N|$ is the total number of features, and $\sigma$ and $\omega$ are two parameters that balance the classification accuracy against the number of selected features. The parameter settings of the above algorithms are shown in Table 5, where the maximum number of iterations is considered as the convergence criterion. Each algorithm evaluates all standard datasets, reporting the average classification accuracy, the average number of selected features, and the average computational time over 30 independent runs.
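The wrapper fitness is a weighted sum of the error rate and the selected-feature ratio; a minimal sketch (Python; the default σ and ω weights are illustrative, not the paper's settings):

```python
def fitness(error_rate, n_selected, n_total, sigma=0.99, omega=0.01):
    """Wrapper feature-selection fitness: weighted sum of the k-NN
    classification error rate and the ratio of selected features."""
    return sigma * error_rate + omega * (n_selected / n_total)
```

Lower values are better: a solution is rewarded both for classifying accurately and for using few features.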

Comparison with GWOA-TEO, WOA, TEO, and GA
The performances of WOA, GA, TEO, and GWOA-TEO in terms of average classification accuracy, average number of selected features, and average computational time over 30 independent runs are compared in this subsection, together with the convergence curves of the best-so-far solution for each dataset. The convergence curves for the eight repository datasets are shown in Figure 5. They show that GWOA-TEO has a better initial population and achieves higher classification accuracy, especially for WDBC, Ionosphere, krvskp, and Sonar. For Vehicle, Ionosphere, krvskp, and Sonar, GWOA-TEO shows strong exploitation capability, so the best-so-far solution value keeps decreasing over the iterations.
From Table 6, the results show that GWOA-TEO is effective in the feature selection field. GWOA-TEO has the best average classification accuracy on all eight adopted datasets. The WOA has better average classification accuracy than the TEO on six of the eight datasets, including Vowel, Vehicle, WDBC, Ionosphere, krvskp, and Sonar. However, on datasets with fewer features, such as Vowel, Wine, and CongressEW, the average classification accuracies of the TEO are better than or equal to those of the WOA; these results indicate that TEO has better local search capability than WOA. On datasets with more features, such as Vehicle, Ionosphere, and Sonar, the GA has better average classification accuracy than the WOA, indicating that GA has better global search capability than WOA. The proposed algorithm GWOA-TEO combines the advantages of TEO and GA and achieves the best average classification accuracy in this study. As shown by the average computational time in Figure 6, GA has the lowest computational cost, while the average computational time of GWOA-TEO is very close to that of WOA and TEO. GWOA-TEO thus maintains a comparable computational cost while achieving better classification accuracy than the other algorithms.

Comparison with Published Algorithms
In this subsection, to further evaluate the GWOA-TEO algorithm's performance, a comparison is performed with other state-of-the-art feature selection algorithms, briefly described as follows: the hybrid of the seagull optimization algorithm (SOA) and thermal exchange optimization (TEO) with improved local search ability, named SOA-TEO3 [44]; the heap-based optimizer based on the corporate rank hierarchy (CRH) principle, named HBO [41]; the binary version of the ant lion optimizer, which uses crossover and mutation operations to improve local search ability, named BALO-1 [45]; and the high-level relay hybrid (HRH) of the whale optimization algorithm and simulated annealing (SA), which uses tournament selection to maintain population diversity, named WOASAT-2 [46]. Table 7 compares the results of these state-of-the-art feature selection algorithms on the eight standard datasets. SOA-TEO3 achieves the best performance on the number of selected features because the algorithm effectively strengthens its exploitation capability through TEO; however, too much exploitation may lead to a local optimum. WOASAT-2 performs well on all datasets, achieving the best classification accuracy on the CongressEW dataset and the best performance on the Sonar dataset. This is related to the fact that the algorithm uses a tournament selection mechanism to reduce the probability of falling into a local optimum, extensively explores regions of the space, and uses the local search algorithm (SA) to refine those regions. Based on the classification accuracy criterion, GWOA-TEO performs best among the comparison algorithms. On the Vowel, Wine, Vehicle, and WDBC datasets, GWOA-TEO has better classification accuracy than the other published algorithms, and it ranks second in classification accuracy on the Ionosphere, krvskp, and Sonar datasets.
The proposed algorithm strengthens exploration through GA, strengthens exploitation through TEO, and combines a thermal memory mechanism to further improve local optimal avoidance. Finally, based on the classification accuracy criterion, GWOA-TEO has better capabilities in the field of feature selection.

Discussion
According to the above experimental results, the advantages of the proposed algorithm can be summarized as follows.
(1) Improvement in the initial population: traditional meta-heuristic algorithms generally generate the initial population randomly. However, this randomness may lead to a non-uniform, low-quality population distribution and slow convergence. Therefore, an initial population generation mechanism combined with a local exploitation algorithm is proposed in this paper. This mechanism increases the probability of the population approaching the global optimum. Figure 5 shows the convergence curves of GWOA-TEO and the three comparison algorithms. The experimental results show that on seven of the eight repository datasets, including Vowel, Wine, CongressEW, WDBC, Ionosphere, krvskp, and Sonar, the improved initial population has a better fitness value than the randomly generated initial population.

(2) Improvement in the exploration and exploitation phases: the proposed algorithm combines GA to enhance exploration capabilities and TEO to enhance exploitation capabilities. In the convergence curve of the WDBC dataset in Figure 5, GWOA-TEO shows the advantage of fast convergence. In the convergence curves of the Ionosphere and Sonar datasets, GWOA-TEO shows strong exploration and exploitation capabilities: the original WOA converges at the 58th iteration, while GWOA-TEO converges at the 70th and 76th iterations, respectively, with higher accuracy.

(3) Improvement in escaping from local optima: in this paper, a memory that saves some historical best-so-far solutions is proposed to further improve the avoidance of local optima. Composition functions are challenging problems because they contain a randomly located global optimum and several randomly located deep local optima. Table 3 shows the statistical results of the 11 composition functions in the CEC 2017 benchmark functions. The proposed algorithm is better than CP-PDWOA and has competitive performance with HBO: HBO achieved the best fitness value in six composition functions (F20, F23, F25, F26, F28, and F29), and GWOA-TEO achieved the best fitness value in seven composition functions (F21, F22, F23, F24, F27, F29, and F30).
Note that, despite these advantages, the proposed algorithm still has some limitations.
(1) Robustness: in this paper, the standard deviation is used to evaluate the robustness of the model; the smaller the standard deviation, the more robust the model. In Table 3, although the algorithm achieves a higher accuracy rate, it only ranks second in standard deviation, better than the WOA variant (CP-PDWOA) and WOA but not as good as HBO. The proposed algorithm achieved the best standard deviation only for F14, F15, F21, and F28 among the CEC 2017 benchmark functions.

(2) Computational time and complexity: in Figure 6, the computational time of the proposed algorithm is lower than that of GA but higher than those of WOA and TEO. Although GA has powerful exploration capabilities, the implementation of the crossover operator mechanism in the proposed algorithm is the main reason for the increased computational time. Therefore, further study of the global exploration mechanism is needed to reduce the complexity.

(3) Generalization capacity: the proposed algorithm is tested only on the CEC 2017 benchmark functions and eight UCI repository datasets; its performance on real-world optimization problems is not examined in this study. Therefore, the effectiveness of the GWOA-TEO algorithm still needs further study.

Conclusions
This paper proposes an effective global optimization algorithm to solve complex optimization problems. The proposed algorithm combines the advantages of each component algorithm to further improve the exploration and exploitation capabilities of the WOA algorithm and to escape from local optima. The main contributions of this paper are as follows. First, an initial population generation mechanism based on a local search algorithm is proposed, which improves the convergence accuracy of the algorithm. Second, combining GA (strong exploration capability) and TEO (strong exploitation capability) addresses the shortcomings of WOA, namely its slow convergence speed and limited exploitation and exploration capabilities. Finally, the concept of memory is introduced, and a memory check is performed in each iteration: if a new position is the same as a historical best solution, a mechanism is implemented to escape from the local optimum, further improving local-optimum avoidance. The experimental results on the CEC 2017 benchmark functions show that, compared with the three optimization algorithms HBO, CP-PDWOA, and WOA, the GWOA-TEO algorithm has good global optimization capabilities. In particular, on the 11 composition functions of this challenging optimization task, the proposed algorithm shows almost the same optimization results as HBO. Experimental results on eight UCI repository datasets show that, compared with three traditional optimization algorithms, GWOA-TEO achieves the best classification accuracy while maintaining a reasonable computational time. Compared with several state-of-the-art feature selection algorithms (including SOA-TEO3, HBO, BALO-1, and WOASAT-2), the proposed algorithm exhibits competitive results in classification accuracy and achieves the best classification accuracy on four datasets.
However, the proposed algorithm still has some limitations, as noted in the Discussion section: (1) the robustness, computational time, and complexity of the proposed algorithm still need to be further improved; and (2) the generalization ability of the proposed algorithm has not been tested on real-world problems. Therefore, further study of the global exploration mechanism and verification of its effectiveness are necessary.