Hybrid Reptile Search Algorithm and Remora Optimization Algorithm for Optimization Tasks and Data Clustering

Abstract: Data clustering is a complex data-mining problem that groups a massive number of data objects into a predefined number of clusters; in other words, it finds symmetric and asymmetric objects. Various optimization methods have been used to solve different machine learning problems, but they usually suffer from local optima and an imbalance between the search mechanisms. This paper proposes a novel hybrid optimization method, called HRSA, which combines the original Reptile Search Algorithm (RSA) and the Remora Optimization Algorithm (ROA) and manages their search processes with a novel transition method. The proposed HRSA aims to avoid the main weaknesses of the original methods and to find better solutions. HRSA is tested on various complicated optimization problems: twenty-three benchmark test functions and eight data clustering problems. The obtained results illustrate that HRSA performs significantly better than the original and comparative state-of-the-art methods. It outperformed all the comparative methods on the mathematical problems and obtained promising results on the clustering problems. Thus, HRSA has remarkable efficacy when employed for various clustering problems.


Introduction
Unsupervised learning methods are instrumental in machine learning because they can explore data without any prior knowledge of them, i.e., there are no labels linked with the data [1]. These algorithms try to represent the data's underlying mechanism or pattern, which may be helpful for tasks such as decision making and forecasting future inputs. Clustering and feature-selection methods are classic examples of unsupervised algorithms [2,3].
Clustering is an important unsupervised method that discovers homogeneous groupings among a number of data objects or items [4]. Clustering is used to split data items into groups so that objects in the same cluster are similar to one another and different from objects in other clusters. Clustering algorithms have been utilized in a wide range of applications. In biology, they are used to extract interesting patterns from gene expression data [5,6]. Additionally, they are used to divide wireless sensor networks into groups and, in information retrieval, for grouping text content and generating thematic hierarchies. For a product search task, clustering can be utilized to group people or objects [7].
Data and the available information are the root of today's fast-development model [8] and are extensively employed in various information technology applications such as manufacturing, marketing, and commerce. Because there are so many data, data mining is one of the most critical methods for extracting meaningful information. Data mining is a revolutionary idea utilized to tailor knowledge across numerous industries, such as medical record examination, client transaction analysis, and market surveying programs [9]. Clustering and classification algorithms are the two most essential techniques for extracting information in data mining. As a result, data clustering has become a valuable and demanding process for generating clusters in which similar data are grouped together. Data clustering is achieved using two clustering mechanisms: hierarchical and partitional clustering [10,11]. In hierarchical clustering, the data items are grouped hierarchically, either in the shape of a tree or a cluster analysis. The partitional clustering technique, on the other hand, produces non-overlapping groups. This technique has to cope with noise and outliers, manageability, integration, usability, and other issues in real-world applications. K-means clustering, the Fuzzy c-means method, density-based methods, and so on are some of the most frequently used clustering techniques [12,13].
Several related works worth mentioning have used optimization techniques to solve clustering problems [14,15]. They motivated us to propose a new, efficient method to deal with various data clustering issues.
A novel Class Topper Optimization (CTO) method is presented in [16]. The optimization method is inspired by the cognitive intelligence of pupils in a class. A population-based search technique is used, in which the solutions narrow towards the optimal solution; this may lead to a globally optimal solution. A clustering challenge is explored to validate the algorithm's performance, and five common data sets are examined for real-time testing. A comparison of the proposed approach to many well-known existing optimization techniques reveals that it outperforms them all.
A novel clustering approach based on the evolution particle swarm optimization (EPSO) algorithm is presented in [17]. The suggested approach is based on the evolution of swarm generations. At first, the particles are equally dispersed in the input data set, and a new swarm population emerges after a defined number of iterations. The paper covers the novel technique and its initial implementation and testing using real clustering benchmark data. The findings demonstrate that the technique creates compact clusters and is effective.
A modernized firefly technique combined with the well-known PSO approach is used to handle automated data clustering difficulties [18]. The proposed fusion technique is benchmarked against four standard metaheuristic algorithms from the literature using twelve common datasets and the two-moons dataset. The thorough computational tests and results analysis reveal that the suggested approach outperforms the standard published methods.
The ABCWOA method, which uses Random Memory (RM) and Elite Memory (EM) to overcome exploration problems and late convergence in the Artificial Bee Colony (ABC) algorithm, is presented in [19]. In the ABCWOA method, RM employs the prey search phase of the Whale Optimization Algorithm (WOA), and EM is used to boost convergence. The ABCWOA algorithm outperforms other metaheuristic algorithms, according to the results.
The study in [20] suggests a novel heuristic strategy based on the big bang-big crunch algorithm for clustering issues. In comparison to previous heuristic strategies, the suggested method not only utilizes its heuristic nature to improve standard clustering algorithms such as k-means, but it also benefits from a memory-based scheme. Furthermore, the proposed algorithm is assessed using various benchmark functions, including well-known datasets. The experimental findings reveal that the suggested technique substantially outperforms similar algorithms.
The study in [21] proposes K-NM-PSO, a hybrid approach that combines the K-means algorithm, Nelder-Mead simplex search, and PSO. The K-NM-PSO method, like the K-means algorithm, looks for cluster centers in any data set; however, it can also discover global optima effectively and rapidly. The results suggest that K-NM-PSO is reliable and appropriate for data clustering. Other optimization methods can also be used to solve data clustering problems, and many optimization methods have been used to solve various problems [22]. It is also worth mentioning that optimization methods have proved their ability in solving many other problems, such as complex engineering optimization problems [23], reactive power planning problems [24], parameter estimation problems [25], voltage-constrained power problems [26], and others.
The Reptile Search Algorithm (RSA) is a recent optimizer proposed by Abualigah et al. in [27]. This method mimics the behavior of crocodiles in two principal activities, encircling and hunting, each of which is performed with two main strategies: (1) encircling: high walking or belly walking; (2) hunting: hunting coordination or hunting cooperation. The Remora Optimization Algorithm (ROA) is a recent bionics-based optimizer proposed by Jia et al. in [28]. The main inspiration of the ROA is the behavior of the remora fish.
Various optimization methods have been employed in the literature to solve different machine learning problems, especially clustering problems. The primary motivation of the proposed method is that recent studies that used optimization methods for solving similar problems yielded performances that were not good enough, and a new, improved method can find new optimal solutions. Optimization methods usually suffer from local optima and an imbalance between the search mechanisms (i.e., exploration and exploitation). Therefore, we selected two of the most reputable methods and combined them into a new hybrid approach to yield new and better results. This paper proposes a novel hybrid optimization search method, called HRSA, for solving challenging optimization problems. HRSA combines the original Reptile Search Algorithm (RSA) and the Remora Optimization Algorithm (ROA) and manages their search processes with a novel transition method. The proposed HRSA method addresses the main weaknesses of the original methods and finds better solutions. HRSA is tested on two classes of optimization problems: twenty-three benchmark test functions and eight real-world data clustering problems. The results illustrate that HRSA performs significantly better than the original and comparative state-of-the-art methods. It outperformed all the comparative methods on the mathematical problems and obtained promising results on the clustering problems. Thus, HRSA has a remarkable ability to be employed for various clustering problems.
The major contributions of this paper are covered in the following points.
• A novel hybrid optimization method is proposed using the original Reptile Search Algorithm (RSA) and the Remora Optimization Algorithm (ROA).
• A new transition method is proposed to handle the mechanisms' search processes and help the proposed method enable the suitable search operator during the optimization process.
• The performance of the proposed HRSA method is tested on several benchmark functions to show its main improvements.
• A real-world problem, data clustering, is used to further prove the proposed HRSA's ability to deal with complicated problems, finding symmetric and asymmetric objects.
• The results showed the superiority of the proposed method in solving the given problems compared to various other state-of-the-art methods.
In the rest of the paper, Section 2 presents the background of the used methods and the procedure of the proposed method. Section 3 shows the experiments, results, and discussion. Section 4 gives the research conclusions and potential future work.

Background
This section presents the original methods used in the proposed method. Additionally, the main procedure of the proposed hybrid of the Reptile Search Algorithm (RSA) and the Remora Optimization Algorithm (ROA) with a novel transition mechanism (HRSA) is presented.

Reptile Search Algorithm (RSA)
This section presents the original Reptile Search Algorithm (RSA) and its procedure. The basic RSA is described by its exploration (global search) and exploitation (local search) stages, which were inspired by the encircling mechanics, hunting processes, and social behavior of crocodiles in real life [27].

Encircling Phase (Exploration)
The exploratory behavior (encircling) of the RSA is introduced in this section. Crocodiles engage in two movements when encircling: high walking and belly walking [29].
The RSA switches between the exploration and exploitation search phases based on four conditions obtained by dividing the total number of iterations into four parts. The RSA exploration mechanisms investigate the search regions and approach a better solution based on two major search techniques.
One condition must be met throughout this phase of the search. The high walking search method is performed when t ≤ T/4, and the belly walking search method is performed when T/4 < t ≤ 2T/4. The position-updating process is presented in Equation (1).
where Best_j(t) is the best-obtained solution, rand is a random number, t is the current iteration, and T is the maximum number of iterations. η(i,j) is the hunting parameter determined by Equation (2). β is a parameter fixed to 0.1. The reduce function R(i,j) is determined by Equation (3). r1−r4 are random numbers, x(r1,j) is a random position, and N is the number of used solutions. Evolutionary Sense (ES(t)) is a probability parameter determined by Equation (4).
where ε is a small value. P(i,j) is a difference parameter determined by Equation (5).
where M(x_i) represents the average position determined by Equation (6). UB(j) and LB(j) are the upper and lower boundaries, and α is a parameter fixed to 0.1.
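Equations (1)-(6) themselves did not survive extraction. The following reconstruction is consistent with the symbol definitions above and with the original RSA formulation [27]; treat it as our reading rather than a verbatim copy:

```latex
x_{(i,j)}(t+1)=
\begin{cases}
Best_j(t)\times\bigl(-\eta_{(i,j)}(t)\bigr)\times\beta - R_{(i,j)}(t)\times rand, & t \le \frac{T}{4}\\[2pt]
Best_j(t)\times x_{(r_1,j)}\times ES(t)\times rand, & \frac{T}{4} < t \le \frac{2T}{4}
\end{cases}
\tag{1}

\eta_{(i,j)} = Best_j(t)\times P_{(i,j)} \tag{2}

R_{(i,j)} = \frac{Best_j(t)-x_{(r_2,j)}}{Best_j(t)+\epsilon} \tag{3}

ES(t) = 2\,r_3\left(1-\frac{t}{T}\right) \tag{4}

P_{(i,j)} = \alpha + \frac{x_{(i,j)} - M(x_i)}{Best_j(t)\times\bigl(UB_{(j)}-LB_{(j)}\bigr)+\epsilon} \tag{5}

M(x_i) = \frac{1}{n}\sum_{j=1}^{n} x_{(i,j)} \tag{6}
```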

Hunting Phase (Exploitation)
The exploitative behavior of RSA is discussed in this section. Crocodiles use two hunting techniques: hunting coordination and hunting cooperation.
The search in this phase (hunting coordination) is executed when 2T/4 < t ≤ 3T/4; otherwise, hunting cooperation is executed when 3T/4 < t ≤ T. The position-updating processes are presented in Equation (7), where Best_j(t) is the best obtained solution, η(i,j) is the hunting parameter determined by Equation (2), P(i,j) is a difference parameter determined by Equation (5), and R(i,j) is determined by Equation (3).
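Equation (7) is likewise missing from the extracted text; a reconstruction consistent with the definitions above and with [27] is:

```latex
x_{(i,j)}(t+1)=
\begin{cases}
Best_j(t)\times P_{(i,j)}(t)\times rand, & \frac{2T}{4} < t \le \frac{3T}{4}\\[2pt]
Best_j(t) - \eta_{(i,j)}(t)\times\epsilon - R_{(i,j)}(t)\times rand, & \frac{3T}{4} < t \le T
\end{cases}
\tag{7}
```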

Remora Optimization Algorithm (ROA)
This section presents the original Remora Optimization Algorithm (ROA) and its procedure, which is as follows [28].

Free Travel (Exploration) SFO Strategy
The formulation for this algorithm's location update was modeled based on the elite notion of the Sailfish Optimizer (SFO) algorithm, as yielded by Equation (8).
where R t rand is a random location.

Experience Attack
To evaluate whether or not it is required to replace the host, the remora must regularly take modest steps around the host, analogous to the accumulation of experience. The formula modeling the principles mentioned above is as follows: where R_pre is the position of the previous iteration, and R_att is a tentative step. The decision of this step is described by comparing the fitness function value of the present solution f(R_i^t) with that of the attempted solution f(R_att). For example, when addressing a minimization problem, if the fitness function value produced by the attempted solution is less than that of the present solution, the remora chooses a different mechanism to escape local optima, as shown in the following section. It returns to host selection if the fitness function value of the attempted solution is greater than that of the existing solution.
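Equation (8) and the experience-attack formula are missing from the extracted text; a reconstruction consistent with the ROA paper [28] (the tag (9) is our own numbering) is:

```latex
R_i^{t+1} = R_{Best}^{t} - \left(rand\times\frac{R_{Best}^{t}+R_{rand}^{t}}{2} - R_{rand}^{t}\right) \tag{8}

R_{att} = R_i^{t} + \bigl(R_i^{t} - R_{pre}\bigr)\times randn \tag{9}
```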

Eat Thoughtfully (Exploitation) WOA Strategy
The location update formula of a remora attached to a whale was derived from the original WOA technique, as seen in the equations below. Their locations might be considered the same when a remora is on a whale in the broader solution space. D is the distance between the hunter and the prey, α is a random value in the range [−1, 1], and a is a number that decreases between [−2, −1].
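The WOA-based update equations did not survive extraction; a reconstruction consistent with the description above and with [28] is:

```latex
R_i^{t+1} = D\times e^{\alpha}\times\cos(2\pi\alpha) + R_i^{t},\qquad
\alpha = rand\times(a-1)+1,\qquad
a = -\left(1+\frac{t}{T}\right),\qquad
D = \left|R_{Best}^{t} - R_i^{t}\right|
```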

Host Feeding
The exploitation procedure is further subdivided into host feeding. At this point, the solution space can be condensed to the host's location area. Small steps can be thought of as moving on or around the host, which can be mathematically characterized as follows: A represents a small movement related to the size space of the host and the remora, and a remora factor C is utilized to limit the position of the remora and to distinguish between the host and the remora. If the size of the host is 1, then the volume of the remora is about a small percentage of the host's volume.
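The host-feeding equations are also missing; a reconstruction consistent with the description above and with [28] is:

```latex
R_i^{t+1} = R_i^{t} + A,\qquad
A = B\times\bigl(R_i^{t} - C\times R_{Best}\bigr),\qquad
B = 2\,V\times rand - V,\qquad
V = 2\left(1-\frac{t}{T}\right)
```

where C is the remora factor, a small constant (e.g., 0.1 in [28]).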

The Proposed HRSA Method
This section presents the main procedure of the proposed hybrid Reptile Search Algorithm (RSA) and Remora Optimization Algorithm (ROA) with a novel transition mechanism (HRSA).
The suggested HRSA solves various problems using two major search strategies and a new mean transition mechanism. The traditional RSA is the principal search technique; it has strong global search capabilities but occasionally suffers from local search problems, premature convergence, and disequilibrium between the global and local search methods. As a result, the second search strategy, the ROA, avoids the local search issue and premature convergence. This strategy can improve the RSA's searchability by generating new local solutions based on the best solutions currently available. Furthermore, a new mean transition mechanism is provided to govern the execution of the search methods (i.e., RSA and ROA) in the proposed HRSA and to tackle the disequilibrium between the global and local search techniques. As a result, the search space may be efficiently expanded by incorporating new solutions from other regions. These solutions inspire the proposed HRSA to use more robust solutions to obtain better results.

Initialization Phase
The optimization process in RSA begins with a set of candidate solutions (X) created stochastically, as indicated in Equation (20). The best obtained solution is deemed nearly optimal in each iteration.
where X is a collection of the solutions created by using Equation (21), x_{i,j} is the jth position of the ith solution, N is the number of solutions, and n is the dimension size.
where rand is a random value, and LB and UB denote the lower and upper bounds, respectively.
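The initialization in Equation (21), x_{i,j} = rand × (UB − LB) + LB, can be sketched in a few lines of NumPy; the function name below is illustrative, not from the paper:

```python
import numpy as np

def initialize_population(N, n, LB, UB, rng=None):
    """Create N candidate solutions of dimension n uniformly inside [LB, UB],
    following x_{i,j} = rand * (UB - LB) + LB (Equation (21))."""
    rng = np.random.default_rng(rng)
    return LB + rng.random((N, n)) * (UB - LB)

# Example: 50 solutions in 10 dimensions over [-100, 100], as in the experiments.
X = initialize_population(N=50, n=10, LB=-100.0, UB=100.0, rng=0)
```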

The Proposed Mean Transition Mechanism (MTM)
The mean transition mechanism (MTM) devised in this work is given in Algorithm 1. This technique is used to control the search process and the transition between the RSA and the ROA. The transition from one search process to the next is quite delicate; it necessitates an effective way to switch the update procedures between the various approaches. The suggested MTM's fundamental concept is to switch the search techniques when the fitness function does not improve after a number of iterations (I), fixed here to five. The number of iterations I can be adjusted by experimentation when no gains are being made.

Algorithm 1
The proposed Mean Transition Mechanism (MTM). TM is a binary variable utilized to switch the search process between the RSA and the ROA, sumFF is a variable used to calculate the mean fitness function value, currentFF is the current fitness function value, C is a counter, I is the number of iterations after which to switch when there are no improvements (it can be adjusted by experiment), and flip is a function that flips the TM value from 0 to 1 or vice versa, as described in Algorithm 1.
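Algorithm 1 itself was garbled in extraction; the following is a small sketch of the MTM logic as described above. The variable names follow the text, but the exact update rule is our reading, not the paper's listing:

```python
class MeanTransitionMechanism:
    """Flip TM (0 = RSA, 1 = ROA) when the current fitness shows no
    improvement over the mean fitness of the last I iterations."""

    def __init__(self, I=5):
        self.I = I          # iterations to wait with no improvement
        self.TM = 0         # 0 -> use RSA, 1 -> use ROA
        self.sumFF = 0.0    # accumulator for the mean fitness value
        self.C = 0          # counter of accumulated iterations

    def update(self, currentFF):
        self.sumFF += currentFF
        self.C += 1
        if self.C == self.I:
            meanFF = self.sumFF / self.I
            if currentFF >= meanFF:      # no improvement over the window mean
                self.TM = 1 - self.TM    # flip(TM)
            self.sumFF, self.C = 0.0, 0
        return self.TM
```

With a stagnating fitness sequence the mechanism flips to the ROA after I iterations, while a steadily improving sequence keeps the RSA active.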

The Detailed Process of the HRSA
This section goes through the proposed technique in detail. The main purpose of the proposed strategy is to obtain better results than those yielded by the existing approaches. We also want to avoid the defects of the original methods, such as local search issues, convergence constraints, and search balance concerns.
To summarize, the proposed HRSA starts with the creation of a random set of solutions. During the renewal process, the HRSA's search criteria explore possible placements of the current best solution, and each solution progresses to the next stage of the procedure. As shown in Figure 1, the suggested HRSA uses the Reptile Search Algorithm (RSA) and the Remora Optimization Algorithm (ROA). In each iteration, the potential solutions are updated and improved using one of these search methodologies.
The suggested HRSA's search techniques are divided into two categories, as shown in Figure 1: RSA and ROA. The RSA's search techniques are further divided into global and local search methods, with two search strategies each: (1) at the global level, high walking and belly walking; and (2) at the local level, hunting coordination and hunting cooperation. Candidate solutions explore the search space when t ≤ T/2 and seek to discover the near-optimal solution when t > T/2. First, if TM == 0, the search process of the RSA is performed; otherwise, the search process of the ROA is conducted. In the exploration step of the RSA, the first global search strategy is performed when t ≤ T/4, and the second global search strategy is accomplished when T/4 < t ≤ 2T/4. In the exploitation step of the conventional RSA, the first local search approach is carried out when 2T/4 < t ≤ 3T/4; otherwise, the second local search approach is executed, when 3T/4 < t ≤ T. Finally, the HRSA terminates when it reaches the end criterion. The complete rule of the recommended HRSA is shown in Figure 1.
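The iteration-quarter switching described above can be summarized as a small selector. This is a sketch under our reading of Figure 1; the function name and strategy labels are illustrative:

```python
def select_strategy(t, T, TM):
    """Return which HRSA search operator is active at iteration t (1-based)."""
    if TM != 0:
        # ROA branch: explore in the first half, exploit in the second
        return "ROA exploration" if t <= T / 2 else "ROA exploitation"
    # RSA branch: the four iteration quarters map to the four crocodile strategies
    if t <= T / 4:
        return "RSA high walking"          # global search, strategy 1
    if t <= 2 * T / 4:
        return "RSA belly walking"         # global search, strategy 2
    if t <= 3 * T / 4:
        return "RSA hunting coordination"  # local search, strategy 1
    return "RSA hunting cooperation"       # local search, strategy 2
```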

Results and Discussion
In this section, the proposed HRSA method is evaluated using twenty-three common benchmark functions.

Experimental Settings
The results of the proposed HRSA method are compared with other state-of-the-art methods, including the Aquila Optimizer (AO), Dwarf Mongoose Optimization Algorithm (DMOA), Whale Optimization Algorithm (WOA), Sine Cosine Algorithm (SCA), Dragonfly Algorithm (DA), Grey Wolf Optimizer (GWO), Particle Swarm Optimizer (PSO), Ant Lion Optimizer (ALO), Reptile Search Algorithm (RSA), Remora Optimization Algorithm (ROA), and Arithmetic Optimization Algorithm (AOA). All the tested methods are tuned according to their original papers and parameters, as shown in Table 1. Each algorithm is executed 30 times using 50 solutions and 1000 iterations. Twenty-three benchmark problems are used with different characterizations, as shown in Table 2. These test functions are categorized into three main categories: unimodal, multimodal, and multimodal with fixed dimensions. They are widely used in the domain of machine learning and optimization algorithms [30,31].
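The Friedman ranking used throughout the results can be reproduced by averaging per-problem ranks across the benchmark functions. A minimal pure-Python sketch follows; the example values are made-up illustrations, not the paper's data:

```python
def mean_ranks(scores_per_problem):
    """scores_per_problem: list of rows, one per benchmark function, each row
    holding one fitness value per algorithm (lower is better).
    Returns the Friedman mean rank of each algorithm (ties get average rank)."""
    n_algos = len(scores_per_problem[0])
    totals = [0.0] * n_algos
    for row in scores_per_problem:
        ordered = sorted(range(n_algos), key=lambda i: row[i])
        ranks = [0.0] * n_algos
        i = 0
        while i < n_algos:
            j = i
            while j + 1 < n_algos and row[ordered[j + 1]] == row[ordered[i]]:
                j += 1                     # extend the tie group
            avg = (i + j) / 2 + 1          # average rank for the tie group
            for k in range(i, j + 1):
                ranks[ordered[k]] = avg
            i = j + 1
        for a in range(n_algos):
            totals[a] += ranks[a]
    return [t / len(scores_per_problem) for t in totals]

# Illustration: 3 algorithms over 2 problems; algorithm 0 wins both.
ranks = mean_ranks([[0.1, 0.3, 0.2], [0.0, 0.5, 0.5]])
```

The lowest mean rank identifies the best-performing method, matching how the tables in this section order the algorithms.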

Benchmark Function: Experiments
In this section, the results of the comparative methods are presented in several formats to validate their performances. Figure 2 shows the qualitative analysis of the first thirteen benchmark functions (F1-F13) in terms of function topology, trajectory curves of the first dimension values, the average fitness function values, and the convergence curves of the original methods (i.e., RSA and ROA) and the proposed HRSA method.
It is clear from Figure 2 that the proposed method obtained promising results and can be an excellent alternative in this domain. The proposed method reaches near-optimal solutions for all the tested functions according to the function topologies. Additionally, the trajectory curves of the first dimension values show that the proposed method changes the values acceptably, which reflects the diversity of the position-updating mechanisms. The proposed method obtained the minimal average fitness function values, and the convergence is always near the optimal area. Finally, the convergence curves show that the proposed method outperformed the original methods in all the tested problems. Table 3 shows the performance of the proposed HRSA method in solving thirteen benchmark functions with different population sizes. This experiment was conducted to find the optimal number of solutions to proceed with the following experiments. It is clear from Table 3 that the proposed HRSA obtained the best results when the population size was equal to 50, followed by a population of 45. According to the Friedman ranking test, when the population size is equal to 50, the proposed HRSA is ranked as the best, followed by population sizes of 45, 35, 40, 30, 25, 20, 15, 10, and 5.
Thus, the proposed method obtains better results when the number of solutions is larger. Table 4 shows the results of the comparative methods and the proposed HRSA for solving the thirteen benchmark function problems (F1-F13) when the dimension size is equal to 10. It is clear from this table that the proposed method obtained more promising results than the other comparative methods. This outcome proves the ability of the modified method to avoid the main problems of the original methods. According to the Wilcoxon signed-rank test, the proposed HRSA significantly outperformed almost all the comparative methods in all the given problems. For example, the proposed HRSA significantly outperformed DMOA and DA in F1. Additionally, it achieved a significant improvement compared to ROA, PSO, GWO, DA, DMOA, and WOA in F4. According to the Friedman ranking test, the proposed HRSA is ranked as the first-best method, followed by the AO method as the second-best, the RSA as the third-best, the GWO as the fourth-best, the AOA as the fifth-best, the WOA as the sixth-best, the PSO as the seventh-best, the ROA as the eighth-best, the DMOA as the ninth-best, the SCA as the tenth-best, the ALO as the eleventh-best, and finally, the DA as the twelfth-best method. We conclude that the proposed method can find better solutions than the other methods.
Table 5 shows the results of the comparative methods and the proposed HRSA for solving the thirteen benchmark function problems (F1-F13) when the dimension size is equal to 100. It is clear from this table that the proposed method achieved better results than the other comparative methods in these cases. This result demonstrates the improved method's capacity to overcome the original methods' major flaws. According to the Wilcoxon signed-rank test, the proposed HRSA significantly outperformed almost all the comparative methods in all the given cases. For example, the proposed HRSA significantly outperformed ROA, PSO, GWO, SCA, and DMOA in F1. Additionally, it led to significant improvements compared to AOA, ROA, RSA, ALO, PSO, GWO, SCA, DMOA, and WOA in F10. According to the Friedman ranking test, the proposed HRSA is ranked as the first-best method, followed by the AO method as the second-best, the RSA as the third-best, the WOA as the fourth-best, the GWO as the fifth-best, the AOA as the sixth-best, the PSO as the seventh-best, the DMOA and ROA as the eighth-best, the ALO as the tenth-best, the DA as the eleventh-best, and finally, the SCA as the twelfth-best method. We conclude that the proposed HRSA can find better solutions than previous methods when the dimensions are greater (i.e., Dim = 100). Table 6 shows the results of the comparative methods and the proposed HRSA for the fixed-dimensional functions (F14-F23). As shown in the table, the suggested technique outperformed the existing comparison methods in these circumstances. This result reveals the enhanced approach's ability to overcome the critical weaknesses of the old methods. According to the Wilcoxon signed-rank test, the proposed HRSA significantly outperformed almost all the comparative methods in all the given cases. For example, the proposed HRSA significantly outperformed DMOA, SCA, DA, ALO, RSA, and ROA in
F14. Additionally, it gave a significant improvement compared to AO, DMOA, and SCA in F23. According to the Friedman ranking test, the proposed HRSA ranked as the first-best method, followed by the RSA as the second-best, the ROA and GWO as the third-best, the ALO as the fifth-best, the DMOA as the sixth-best, the PSO as the seventh-best, the AO as the eighth-best, the DA as the ninth-best, the WOA as the tenth-best, the SCA as the eleventh-best, and finally, the AOA as the twelfth-best method. We conclude that the proposed HRSA can find better solutions than previous methods when the dimensions are fixed.
The convergence behaviors of the comparative optimization methods on the twenty-three benchmark functions are shown in Figure 3. The optimization history and trajectory during the evolution processes are depicted in these diagrams. It is evident that the proposed HRSA obtains optimal solutions in all of the studied cases and converges to the optimal solution smoothly compared with the other approaches. Furthermore, the observed convergence tendency is appealing and remedies the original methods' primary flaws; thus, the primary issues with the original approaches are addressed, as seen in the figures. Figure 4 shows the distribution of the position values during the optimization process. It is clear that the proposed method generates higher distribution values during the improvement (optimization) process. This evidences that modifying the original method achieved new results that satisfied the proposed idea.

Data Clustering: Experiments
In this section, the proposed method is tested further for solving the data clustering problems.
Eight problems are used with different characterizations, shown in Table 7: the number of features, objects, and clusters. These datasets were taken from the UCI repository and are widely used in the domain of machine learning and clustering algorithms. The results of the proposed method are compared with other methods: the Aquila Optimizer (AO) [32], Particle Swarm Optimizer (PSO) [33], Grey Wolf Optimizer (GWO) [34], African Vultures Optimization Algorithm (AVOA) [35], Whale Optimization Algorithm (WOA) [36], Reptile Search Algorithm (RSA) [27], Remora Optimization Algorithm (ROA) [28], and Arithmetic Optimization Algorithm (AOA) [37]. All the tested methods are tuned according to their original papers and parameters. Each method was executed 30 times using 50 solutions and 1000 iterations. Table 8 shows the obtained results of the proposed HRSA method and the other comparative clustering methods using the eight datasets. This table presents the results of nine comparative algorithms in terms of the worst, best, and average fitness function values. Additionally, the standard deviation values are presented to show the variance of the presented results. The Wilcoxon signed-rank test was used to show the degree of improvement achieved by the proposed method compared to the other comparative methods. Then, the Friedman ranking test was used to rank the tested methods over all the used datasets.
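In this kind of metaheuristic clustering, each candidate solution typically encodes the k cluster centers as one flat vector, and the fitness is the total within-cluster distance. The paper does not spell out its exact objective, so the sketch below is an assumption based on this common formulation:

```python
import numpy as np

def clustering_fitness(solution, data, k):
    """Decode a flat solution vector into k cluster centers and return the
    sum of squared Euclidean distances of each object to its nearest center."""
    n_features = data.shape[1]
    centers = np.asarray(solution, dtype=float).reshape(k, n_features)
    # (n_objects, k) matrix of squared distances to every center
    d2 = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return float(d2.min(axis=1).sum())

# Toy example: two tight groups, with one center placed near each group.
data = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0]])
fit = clustering_fitness([0.0, 0.5, 10.0, 10.0], data, k=2)
```

Under this encoding, each optimizer in Table 8 searches over vectors of length k × n_features, and lower fitness corresponds to more compact clusters.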
In Table 8, the proposed HRSA achieved better results than the comparative methods in almost all the tested problems, reaching the best solutions in almost all the tested datasets and proving its performance on the given problems. According to the Wilcoxon signed-rank test, the proposed HRSA significantly outperformed almost all the comparative methods. For example, it significantly outperformed ROA, GWO, PSO, and AO on the Cancer dataset, and ROA, WOA, GWO, PSO, and AO on the Class dataset. According to the Friedman ranking test, the proposed HRSA is ranked the first-best method, followed by the PSO method as the second-best, the GWO as the third-best, the AO as the fourth-best, the ROA as the fifth-best, the AOA and RSA as the sixth-best, the WOA as the eighth-best, and finally, the AVOA as the ninth-best method. The final results are shown in Figure 5.
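The Wilcoxon signed-rank comparisons reported above pair the 30 runs of two methods on the same dataset. A minimal implementation using the normal approximation (reasonable for n = 30 paired runs) might look like the following; it is an illustrative sketch, not the paper's exact statistical pipeline.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank statistic for paired samples a and b.

    Returns (W, z): W is the smaller of the positive/negative rank sums,
    z its approximate standard score (large |z| => significant difference).
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]  # drop zero differences
    n = len(diffs)
    ranked = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                                     # assign average ranks to ties
        j = i
        while j + 1 < n and abs(diffs[ranked[j + 1]]) == abs(diffs[ranked[i]]):
            j += 1
        r = (i + j) / 2 + 1                          # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[ranked[k]] = r
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    mu = n * (n + 1) / 4                             # mean of W under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mu) / sigma if sigma else 0.0
    return w, z
```

In practice, `scipy.stats.wilcoxon` provides the same test with exact p-values and is preferable when SciPy is available.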
Figure 6 shows the convergence behavior of the comparative optimization algorithms on the eight tested data clustering problems. These figures depict the optimization history and trajectory during the evolution process. It is clear that the proposed HRSA reaches the best solutions in all the tested problems and converges smoothly compared to the other comparative methods. Moreover, the observed convergence behavior is attractive and remedies the main weaknesses of the original methods; thus, the main problems of the original methods are solved, as shown in the figures. Figure 7 shows the clustering plots for the eight tested datasets, displaying the clustered objects and their distribution over the feature space. The results clearly show that similar objects are grouped in the same region and that each region represents a coherent cluster; the achieved clustering accuracy confirms the performance of the proposed HRSA method. The execution time ranking is given in Figure 8, which shows that the proposed method has a competitive execution time compared to the other methods.
We conclude that the proposed method obtained promising results in almost all the tested problems and proved superior to the comparative methods. As mentioned above, the proposed HRSA was tested on the most common and standard benchmark functions, which are typically used to evaluate the performance of optimization methods, and the reported results clearly prove its ability to deal with different benchmark functions. Moreover, the data clustering problems were used to evaluate the proposed method on a real-world application; the results showed that the proposed method obtained better results in almost all the tested problems and outperformed the comparative methods.

Conclusions
Various optimization methods have been employed in the literature to solve different machine learning problems, especially clustering problems. Data clustering is a complex data mining problem that groups a massive number of data objects into a predefined number of clusters. Optimization methods usually suffer from local optima and an imbalance between the search mechanisms (i.e., exploration and exploitation). This paper proposed a novel hybrid optimization method, called HRSA, which combines the original Reptile Search Algorithm (RSA) and Remora Optimization Algorithm (ROA) and manages their search processes by a novel transition method. The proposed HRSA method addresses the main weaknesses of the original methods and finds better solutions. HRSA was tested on various complicated optimization problems: twenty-three benchmark test functions and eight data clustering problems. The results illustrated that the proposed HRSA method performs significantly better than the original and comparative state-of-the-art methods: it outperformed all the comparative methods on the mathematical problems and obtained promising results on the clustering problems. Thus, HRSA shows remarkable efficacy when employed for various clustering problems.
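Although the transition method's details are not reproduced here, the general shape of such a hybrid can be sketched as a population loop in which a transition rule decides, at each iteration, which algorithm's operator updates the population. Everything below (the two placeholder operators, the random-threshold transition, and all parameter names) is a hypothetical illustration of the hybrid pattern, not the actual RSA/ROA update equations.

```python
import random

def hybrid_search(fitness, dim, bounds, iters=200, pop=20, trans=0.5, seed=0):
    """Generic two-operator hybrid loop with greedy replacement.

    A random threshold `trans` stands in for the transition method that
    chooses which operator (RSA-like or ROA-like placeholder) acts this
    iteration. Returns (best_fitness, best_solution) for minimization.
    """
    rng = random.Random(seed)
    lo, hi = bounds

    def clip(x):
        return [min(hi, max(lo, v)) for v in x]

    def op_a(x, best, t):   # placeholder: shrinking move around the best
        return clip([b + rng.uniform(-1, 1) * (b - v) * (1 - t)
                     for v, b in zip(x, best)])

    def op_b(x, best, t):   # placeholder: step along the direction to the best
        return clip([v + rng.uniform(-1, 1) * (b - v)
                     for v, b in zip(x, best)])

    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    F = [fitness(x) for x in X]
    for t in range(iters):
        best = X[F.index(min(F))]
        op = op_a if rng.random() < trans else op_b   # hypothetical transition
        for i in range(pop):
            cand = op(X[i], best, t / iters)
            fc = fitness(cand)
            if fc < F[i]:                             # greedy replacement
                X[i], F[i] = cand, fc
    return min(F), X[F.index(min(F))]
```

On a simple sphere function in two dimensions this loop converges toward the origin, which is enough to demonstrate the switching structure without claiming the performance of the actual HRSA.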
In future work, the proposed method can be studied further to discover weaknesses that can be improved, and it can be hybridized with other metaheuristic components. Moreover, the proposed HRSA can be investigated for other problems such as text clustering, text classification, image enhancement, image segmentation, task scheduling in computing, parameter estimation, forecasting, advanced mathematical problems, prediction, industrial and engineering problems, constrained mechanical design, home energy management, wastewater quality parameters, and other real-world problems [38][39][40]. A limitation of this work is that the clustering experiments used benchmark datasets; real-world data can be used in the future. Additionally, in some cases, the proposed method needs more execution time.

Figure 3 .
Figure 3. Convergence curves of HRSA and other methods for F1-F23.

Figure 4 .
Figure 4. The distribution values of the given positions during the optimization process.

Figure 5 .
Figure 5. Execution time ranking for the benchmark functions.

Figure 6 .
Figure 6. Convergence behaviour of the comparative algorithms using the tested data clustering problems.

Figure 7 .
Figure 7. The clustering plot images using the eight tested datasets.

Figure 8 .
Figure 8. Execution time ranking for the data clustering problems.

Table 1 .
Parameter values of the tested algorithms.

Table 2 .
Details of the tested benchmark functions.

Table 3 .
The results of several population sizes on F1-F13.

Table 4 .
The results of the comparative methods and the proposed HRSA methods for F1-F13, Dim = 10.

Table 5 .
The results of the comparative methods and the proposed HRSA methods for F1-F13, Dim = 100.

Table 6 .
The results of the comparative methods and the proposed HRSA methods for fixed dimensional functions (F14-F23).

Table 7 .
The characteristics of the eight tested datasets (number of features, objects, and clusters).

Table 8 .
The comparative results of HRSA and other clustering methods using eight datasets.