OOBO: A New Metaheuristic Algorithm for Solving Optimization Problems

This study proposes the One-to-One-Based Optimizer (OOBO), a new optimization technique for solving optimization problems in various scientific areas. The key idea in designing the suggested OOBO is to effectively use the knowledge of all members in the process of updating the algorithm population while preventing the algorithm from relying on specific members of the population. We use a one-to-one correspondence between the two sets of population members and the members selected as guides to increase the involvement of all population members in the update process. Each population member is chosen just once as a guide and is only utilized to update another member of the population in this one-to-one interaction. The proposed OOBO’s performance in optimization is evaluated with fifty-two objective functions, encompassing unimodal, high-dimensional multimodal, and fixed-dimensional multimodal types, and the CEC 2017 test suite. The optimization results highlight the remarkable capacity of OOBO to strike a balance between exploration and exploitation within the problem-solving space during the search process. The quality of the optimization results achieved using the proposed OOBO is evaluated by comparing them to eight well-known algorithms. The simulation findings show that OOBO outperforms the other algorithms in addressing optimization problems and can give more acceptable quasi-optimal solutions. Also, the implementation of OOBO in six engineering problems shows the effectiveness of the proposed approach in solving real-world optimization applications.


Introduction
The term "optimization" refers to obtaining the optimal solution out of all available solutions to a problem [1]. Optimization appears widely in real-world problems. For example, engineers aim to design a product with the best performance, traders seek to maximize profits from their transactions, and investors try to minimize investment risk [2]. These types of problems must be modeled mathematically and then optimized using an appropriate method. Each optimization problem is composed of three parts: (a) decision variables, (b) constraints, and (c) objective functions, which can be modeled using Equations (1)-(4).
Minimize/Maximize: f(x), x = [x_1, x_2, ..., x_m],   (1)

subject to:

g_i(x) < 0, i = 1, 2, ..., p,   (2)

h_k(x) = 0, k = 1, 2, ..., q,   (3)

lb_j ≤ x_j ≤ ub_j, j = 1, 2, ..., m,   (4)

where m is the number of problem variables, x = (x_1, x_2, ..., x_m) is the vector of problem variables, f(x) is the value of the objective function for the problem variables, g_i is the ith inequality constraint, p is the total number of inequality constraints, h_k is the kth equality constraint, q is the total number of equality constraints, and lb_j and ub_j are the lower and upper bounds of the jth problem variable x_j, respectively. Problem-solving techniques in the study of optimization problems fall into two categories. The first category consists of "exact algorithms", which find optimal solutions to these problems and guarantee the optimality of these solutions. The second category consists of "approximate algorithms", which are usually designed to solve optimization problems that exact methods are unable to solve [3]. In contrast to exact algorithms, approximate algorithms are able to generate solutions of appropriate quality for many optimization problems in a reasonable period of time. However, the important issue with approximate algorithms is that there is no assurance that the problem's global optimal solution will be found [4]. As a result, solutions derived from approximation approaches are referred to as quasi-optimal [5]. A quasi-optimal solution should be as near to the global optimum as feasible.
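As a concrete, hypothetical illustration of this formulation, a two-variable problem with one inequality constraint and box bounds can be encoded as follows; the function and variable names are our own and not part of the paper:

```python
# Hypothetical instance of the general model: minimize f(x) = x1^2 + x2^2
# subject to g(x) = 1 - x1 - x2 < 0 and bounds 0 <= xj <= 5.

def f(x):
    # objective function to be minimized
    return x[0] ** 2 + x[1] ** 2

def g(x):
    # inequality constraint; a point is feasible when g(x) < 0
    return 1.0 - x[0] - x[1]

lb, ub = [0.0, 0.0], [5.0, 5.0]  # lower and upper bounds for each variable

def is_feasible(x):
    # a candidate must satisfy the constraint and stay inside the bounds
    return g(x) < 0 and all(l <= v <= u for v, l, u in zip(x, lb, ub))
```

An equality constraint h_k(x) = 0 would be handled analogously; here only the inequality and bound parts of the model are shown.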
Random-based optimization algorithms are among the most extensively utilized approximate algorithms for solving optimization problems. These algorithms can provide acceptable quasi-optimal solutions for objective functions by employing random operators and randomly scanning the optimization problem's search space [6]. The key criterion of one optimization algorithm's superiority over another is the proximity of its quasi-optimal solution to the global optimum. Scholars have therefore created numerous optimization techniques with the goal of finding quasi-optimal solutions that are closer to the global optimum. These random-based optimization algorithms are also used to solve combinatorial optimization problems.
The main question that arises is whether there is still a need to design new optimizers, given that numerous optimization algorithms have already been produced. According to the No Free Lunch (NFL) theorem [7], even if an optimization method is very good at solving a certain set of optimization problems, there is no guarantee that it will be an effective optimizer for other optimization problems. As a result, it is impossible to declare that a specific method is the best optimizer for all optimization challenges. The NFL theorem has prompted researchers to design new optimizers to handle optimization issues in a variety of fields [8]. This motivated the authors of this study to develop a novel optimization approach, one that is both effective and gradient-free, for optimizing real-world engineering problems.
The novelty and innovation of this paper lie in developing a novel population-based optimization method called the One-to-One-Based Optimizer (OOBO) to handle diverse optimization problems. The main contributions of this paper are as follows:
• The key idea behind the suggested OOBO algorithm is the effective use of different members of the population, without relying on specific members, during the population updating process.
• The suggested OOBO algorithm's theory is discussed, and its mathematical model for applications in solving optimization problems is offered.
• OOBO's ability to provide appropriate solutions is evaluated with fifty-two distinct objective functions.
• The effectiveness of OOBO in solving real-world applications is tested on four engineering design problems.
• The performance of OOBO is compared with eight well-known algorithms to assess its quality and ability.
The proposed OOBO approach has advantages such as simple concepts, simple equations, and convenient implementation. The main advantage of OOBO is that it has no control parameters, so the proposed approach requires no parameter tuning (except, of course, for the population size N and the maximum number of iterations T, which are present in all population-based metaheuristic algorithms by their very nature). In addition, the optimization process in the proposed OOBO ensures that each population member is employed to guide exactly one other member in each iteration of the algorithm. Therefore, all members participate in guiding the OOBO population.
The rest of the paper is organized as follows: A literature review is presented in Section 2. Section 3 introduces the suggested OOBO algorithm. Section 4 contains simulation studies and results. The evaluation of OOBO for optimizing four real-life problems is presented in Section 5. Finally, conclusions and several recommendations for further research are stated in Section 6.

Literature Review
Optimization algorithms are classified into five types, based on their primary design concepts: (a) swarm-based, (b) physics-based, (c) evolutionary-based, (d) human-based, and (e) game-based approaches.
Physics-based optimization algorithms are produced by drawing inspiration from numerous physical phenomena and their rules. Simulated annealing (SA) is one of the methods in this group; it originates from the process of annealing molten metals, in which a very high-temperature molten metal is gradually cooled [38]. The gravitational search algorithm (GSA) is designed by modeling the force of gravity and Newton's laws of motion in an artificial system in which masses apply force to each other at different distances and move according to these laws [39]. Some of the other physics-based optimization algorithms are the galaxy-based search algorithm (GbSA) [40], the small world optimization algorithm (SWOA) [41], Henry gas solubility optimization (HGSO) [42], central force optimization (CFO) [43], ray optimization (RO) [44], the flow regime algorithm (FRA) [45], curved space optimization (CSO) [46], the billiards-inspired optimization algorithm (BOA) [47], and nuclear reaction optimization (NRO) [48].
Evolutionary-based optimization algorithms are based on the simulation of biological evolution and the theory of natural selection. This category includes the genetic algorithm (GA), one of the earliest approximate optimizers. The GA was developed by modeling the reproductive process according to Darwin's theory of evolution using three operators: (a) selection, (b) crossover, and (c) mutation [49]. Some of the other evolutionary-based optimization algorithms are the biogeography-based optimizer (BBO) [50], the memetic algorithm (MA) [51], evolutionary programming (EP) [52], the drawer algorithm (DA) [53], evolution strategy (ES) [54], differential evolution (DE) [55], and genetic programming (GP) [56].
Human-based optimization algorithms are developed based on modeling human behavior. Teaching-learning-based optimization (TLBO) is among the most employed human-based algorithms; it models the educational process between teachers and students in the classroom. In TLBO, the educational process is implemented in two phases: (a) a teaching phase, in which the teacher shares knowledge with the students, and (b) a learner phase, in which the students share knowledge with each other [57]. Some of the other human-based optimization algorithms are the mother optimization algorithm (MOA) [58], the exchange market algorithm (EMA) [59], the group counseling optimizer (GCO) [60], the teamwork optimization algorithm (TOA) [6], dual-population social group optimization (DPSGO) [61], and the election-based optimization algorithm (EBOA) [6].
Game-based optimization algorithms originate from the rules of various group or individual games. The volleyball premier league (VPL) algorithm is based on modeling the interaction and competition among volleyball teams during a season and the coaching process during a match [62]. Some of the other game-based optimization algorithms are football game-based optimization (FGBO) [63], ring toss game-based optimization (RTGBO) [64], the golf optimization algorithm (GOA) [65], and shell game optimization (SGO) [66].

One-to-One Based Optimizer
In this section, the proposed OOBO algorithm is described and its mathematical model is presented. OOBO is a population-based metaheuristic algorithm that can provide effective solutions to optimization problems in an iteration-based process by using the search power of a population in the problem-solving space.

Basis of the Algorithm
The basis of OOBO is that, first, several feasible solutions are generated based on the constraints of the problem. Then, in each iteration, the positions of these solutions in the search space are updated by employing the algorithm's main idea. Excessive reliance on specific population members in the update process prevents an accurate scan of the problem's search space and can lead the algorithm to converge towards locally optimal areas. The main idea in designing the proposed OOBO algorithm is the effective use of the information of all population members in the process of updating the algorithm population, while preventing the algorithm from relying too much on specific members, such as the best, worst, and mean members. Therefore, the following items are considered in the updating process: (a) the population update does not rely on specific members; (b) all members are involved in the updating process; and (c) each population member is employed in a one-to-one correspondence to guide another member in the search space.

Algorithm Initialization
In the OOBO algorithm, each population member is a proposed solution to the given problem, providing values for the decision variables according to its location in the search space. As a result, in OOBO, each population member is mathematically represented by a vector with the same number of elements as the number of decision variables:

X_i = (x_{i,1}, ..., x_{i,d}, ..., x_{i,m}), i = 1, 2, ..., N.   (5)

To generate the initial population of OOBO, the population members are randomly positioned in the search space using

x_{i,d} = lb_d + rand() · (ub_d − lb_d), d = 1, 2, ..., m,   (6)

where X_i is the ith population member (that is, the ith proposed solution), x_{i,d} is its dth dimension (that is, the proposed value for the dth variable), rand() is a function generating a uniform random number from the interval [0, 1], and N is the size of the population.
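The initialization step can be sketched in a few lines (a minimal illustration under our own naming, not the authors' released MATLAB code):

```python
import random

def initialize_population(N, lb, ub):
    """Scatter N members uniformly at random inside the box [lb, ub]^m,
    i.e., x_{i,d} = lb_d + rand() * (ub_d - lb_d)."""
    m = len(lb)
    return [[lb[d] + random.random() * (ub[d] - lb[d]) for d in range(m)]
            for _ in range(N)]
```

Each row of the returned list is one candidate solution X_i; evaluating the objective function on every row then yields the objective function vector.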
In OOBO, the algorithm population is represented by an N × m matrix whose ith row is X_i:

X = [X_1; ...; X_i; ...; X_N] = [x_{1,1} ... x_{1,m}; ...; x_{i,1} ... x_{i,m}; ...; x_{N,1} ... x_{N,m}].   (7)

The optimization problem's objective function can be assessed for each population member, since each member is a proposed solution. Thus, in each iteration, as many objective function values are obtained as there are population members, which can be described mathematically by

F = (f_1, ..., f_i, ..., f_N)^T, with f_i = f(X_i),   (8)

where F is the objective function vector and f_i is the objective function value for the ith proposed solution.

Mathematical Modeling of OOBO
At this stage of the mathematical modeling of the OOBO algorithm, the population members' positions must be updated in the search space. The main difference between metaheuristic algorithms lies in how they update the positions of the population members. In many metaheuristic algorithms, the population update process depends strongly on the best member. This may reduce the algorithm's exploration ability to perform a global search in the problem-solving space and cause it to get stuck in a local optimum. In fact, moving the population towards the best member can cause convergence to inappropriate local solutions, especially in complex optimization problems. In the design of OOBO, by contrast, the dependence of the population update process on the best member is avoided. Hence, by moving the population of the algorithm to different areas of the search space, the exploration power of OOBO is increased to provide a global search. The main idea of OOBO is that all members of the population should participate in the population update. Therefore, each population member is selected only once, and randomly, to guide a different member of the population in the search space. We can describe this idea mathematically using an N-tuple with the following properties: (a) each member is randomly selected from the positive integers from 1 to N; (b) there are no duplicates among its members; and (c) no member has a value equal to its position in the N-tuple.
To model the one-to-one correspondence, the member position numbers in the population matrix are used. The random process of forming the vector K, as "the set of the positions of guiding members", is modeled by

K = (k_1, ..., k_i, ..., k_N) ∈ P_N, with k_i ≠ i for all i,   (9)

where N = {1, ..., N}, P_N is the set of all permutations of the set N, and k_i is the ith element of the vector K. In OOBO, to guide the ith member (X_i), the member of the population with position number k_i (X_{k_i}) in the population matrix is selected. Based on the objective function values of these two members, if the status of member X_{k_i} in the search space is better than that of member X_i, member X_i moves towards member X_{k_i}; otherwise, it moves away from member X_{k_i}. Based on the above concepts, the process of calculating the new status of the population members in the search space is modeled by

x_{i,d}^{new} = x_{i,d} + rand() · (x_{k_i,d} − I · x_{i,d}), if f_{k_i} < f_i,   (10)

x_{i,d}^{new} = x_{i,d} + rand() · (x_{i,d} − x_{k_i,d}), otherwise,   (11)

where x_{i,d}^{new} is the new suggested status of the ith member in the dth dimension, x_{k_i,d} is the dth dimension of the member selected to guide the ith member, f_{k_i} is the objective function value obtained for X_{k_i}, and the variable I takes values from the set {1, 2}.
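The N-tuple described above is a random permutation without fixed points (a derangement). One simple way to sample it, assuming N ≥ 2, is rejection sampling over ordinary shuffles; this is our own sketch of the idea, not code from the paper:

```python
import random

def guide_positions(N):
    """Sample a permutation of 0..N-1 with no fixed points, so each member
    guides exactly one other member and no member guides itself (N >= 2)."""
    while True:
        K = list(range(N))
        random.shuffle(K)
        if all(K[i] != i for i in range(N)):
            return K
```

A uniformly shuffled permutation is a derangement with probability close to 1/e ≈ 0.37, so on average only about three shuffles are needed, regardless of N.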
The updating process of the population members in the proposed algorithm is such that the suggested new status of a member is accepted only if it improves the value of the objective function; otherwise, the suggested new status is rejected, and the member stays in its previous position. This step of OOBO is formulated as

X_i = X_i^{new}, if f_i^{new} < f_i; otherwise, X_i is kept,   (12)

where X_i^{new} is the new suggested status in the search space for the ith population member and f_i^{new} is its objective function value.
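Combining the movement rule and this greedy acceptance step, the per-member update can be sketched as follows (our own illustrative code; the function and variable names are assumptions):

```python
import random

def update_member(X_i, f_i, X_k, f_k, f):
    """Move X_i toward its guide X_k if the guide is better, otherwise away
    from it, then keep the new position only if the objective improves."""
    I = random.choice([1, 2])  # the factor I takes values in {1, 2}
    X_new = []
    for d in range(len(X_i)):
        if f_k < f_i:
            # guide is better: move toward it
            X_new.append(X_i[d] + random.random() * (X_k[d] - I * X_i[d]))
        else:
            # guide is worse: move away from it
            X_new.append(X_i[d] + random.random() * (X_i[d] - X_k[d]))
    f_new = f(X_new)
    # greedy selection: the member never gets worse
    return (X_new, f_new) if f_new < f_i else (X_i, f_i)
```

By construction, the returned objective value is never worse than the incoming one, which is exactly the property the acceptance rule guarantees.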

Repetition Process, Pseudocode, and Flowchart of OOBO
At this stage of OOBO, after the positions of all members of the population in the search space have been updated, the algorithm completes one iteration and enters the next iteration based on the population members' new statuses. The procedure of updating the population members is repeated using Equations (9)-(12) until the algorithm reaches the stopping rule. After being fully run on the given problem, OOBO provides the best-found solution as a quasi-optimal solution. The implementation steps of OOBO are presented as pseudocode in Algorithm 1. The complete set of codes is available at the following repository: https://www.mathworks.com/matlabcentral/fileexchange/135807-one-to-one-based-optimizer-oobo (accessed on 22 September 2023).

Algorithm 1. Pseudocode of OOBO. [Only the closing steps of the pseudocode are legible here: 12. Save the best solution found so far. 13. end for 14. Output the best quasi-optimal solution. End OOBO.]
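Putting the pieces together, the iteration loop can be rendered as a compact end-to-end sketch (our own Python rendering of the steps described above, with bound clipping added for safety; the released MATLAB code in the repository is the authoritative version):

```python
import random

def sphere(x):
    # simple test objective: f(x) = sum of squares, minimum 0 at the origin
    return sum(v * v for v in x)

def oobo(f, lb, ub, N=20, T=200, seed=0):
    """Minimal OOBO sketch: initialize, then for T iterations draw a
    fixed-point-free random permutation of guides and greedily update."""
    rng = random.Random(seed)
    m = len(lb)
    X = [[lb[d] + rng.random() * (ub[d] - lb[d]) for d in range(m)]
         for _ in range(N)]
    F = [f(x) for x in X]
    for _ in range(T):
        # one-to-one guide assignment: permutation with no fixed points (N >= 2)
        while True:
            K = list(range(N))
            rng.shuffle(K)
            if all(K[i] != i for i in range(N)):
                break
        for i in range(N):
            k = K[i]
            I = rng.choice([1, 2])
            if F[k] < F[i]:
                Xn = [X[i][d] + rng.random() * (X[k][d] - I * X[i][d])
                      for d in range(m)]
            else:
                Xn = [X[i][d] + rng.random() * (X[i][d] - X[k][d])
                      for d in range(m)]
            # keep the candidate inside the box
            Xn = [min(max(v, lb[d]), ub[d]) for d, v in enumerate(Xn)]
            fn = f(Xn)
            if fn < F[i]:  # greedy acceptance
                X[i], F[i] = Xn, fn
    best = min(range(N), key=lambda i: F[i])
    return X[best], F[best]
```

For example, `oobo(sphere, [-10.0] * 2, [10.0] * 2, N=20, T=500)` drives the sphere function close to its minimum at the origin.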

Computational Complexity of OOBO
Next, the computational complexity of the OOBO algorithm, including the time complexity and space complexity, is studied.
The time complexity of OOBO is affected by the initialization process, the calculation of the objective function, and population updating as follows:

• The algorithm initialization process requires O(Nm) time, where, as mentioned, N is the number of population members and m is the number of decision variables.
• In each iteration, the objective function is calculated for each population member. Therefore, calculating the objective function requires O(NT) time, where T is the number of iterations of the algorithm.
• The updating of population members requires O(NTm) time.
Therefore, O(N(T(1 + m) + m)) is the total time complexity of the OOBO algorithm, which can be simplified to O(NTm). Competitor algorithms such as GA, PSO, GSA, GWO, WOA, TSA, and MPA have a time complexity of O(NTm), while TLBO has a time complexity of O(N(2T(1 + m) + m)). Of course, since time complexity is usually expressed without constants and slower-growing terms, this expression also simplifies to O(NTm). Thus, the proposed OOBO approach has a time complexity similar to that of the seven competitor algorithms mentioned above. Compared to TLBO, the OOBO approach has lower time complexity and is thus better positioned from this perspective.
The space complexity of the OOBO algorithm is O(Nm), which is the maximum amount of space required during its initialization process. Similarly, the competitor algorithms also have a space complexity of O(Nm). In this respect, there is no difference between OOBO and the competitor algorithms.

Simulation Studies and Results
In this section, OOBO's ability to solve optimization problems and provide quasi-optimal solutions is evaluated. For this purpose, OOBO was tested on 52 objective functions, categorized into (a) seven unimodal functions, F1 to F7, (b) six high-dimensional multimodal functions, F8 to F13, and (c) ten fixed-dimensional multimodal test functions, F14 to F23, as well as twenty-nine functions from the CEC 2017 test suite (C17-F1 and C17-F3 to C17-F30). Detailed information and a complete description of the benchmark functions F1 to F23 are provided in [76], and of the CEC 2017 test suite in [77]. In addition, the performance of OOBO was evaluated on four real-world optimization problems.

Intuitive Analysis in Two-Dimensional Search Space
Next, to visually observe the optimization process of the OOBO approach, OOBO was implemented on ten objective functions, F1 to F10, in two dimensions. In this experiment, the number of OOBO population members was set equal to five. To show the mechanism of the OOBO algorithm in solving the problems related to F1 to F10, convergence curves, search history curves, and trajectory curves are presented in Figure 1. The horizontal axis of the convergence and trajectory curves represents the number of iterations of the algorithm. These curves display OOBO's behavior in scanning the problem search space, finding solutions, the convergence process, and how it achieves better solutions based on the update processes after each iteration, as well as the decrease in the objective function values. The conclusion from this experiment is that the OOBO approach, by improving the initial candidate solutions as the algorithm iterations progress, can converge towards the optimal solution, providing acceptable quasi-optimal solutions for the given problem.


Experimental Setup
To further analyze the quality of OOBO, the results obtained with this algorithm were compared with those of eight well-known optimization algorithms: PSO, TLBO, GWO, WOA, MPA, TSA, GSA, and GA. The reasons for choosing these competitors are as follows: GA and PSO are among the most famous and widely used optimization algorithms and have been employed in many applications; GSA, TLBO, and GWO are highly cited algorithms, which shows that they have always been trusted and used by researchers; and WOA, MPA, and TSA are recently published methods that, because of their acceptable performance, have been favored by many researchers in the short period since their publication. Therefore, the eight competitor algorithms in this study were selected based on the following three criteria:
(i) The most widely used algorithms: GA and PSO.
(ii) Highly cited algorithms: GSA, TLBO, and GWO.
(iii) Recently published algorithms with broad acceptance: WOA, MPA, and TSA.
The values used for the control parameters of these competitors are specified in Table 1. To provide a fair comparison, the standard versions of the metaheuristic algorithms were used. The experiments were implemented in MATLAB R2022a on a 64-bit Core i7 processor with 3.20 GHz and 16 GB of main memory.

Performance Comparison
The ability of OOBO was compared with that of the eight competitor algorithms on different objective functions of unimodal and multimodal types. Five indicators (mean, best, worst, standard deviation, and median) of the best-found solutions were used to report the performance of the algorithms. To optimize each objective function, OOBO was run in 20 independent runs, each containing 1000 iterations. The convergence curves for each benchmark function were drawn based on the average performance of the metaheuristic algorithms over the 20 independent runs. Random optimization algorithms are stochastic approaches that provide a solution to the problem in an iterative process, so an essential point in their implementation is determining the stopping rule for the algorithm iterations. There are various stopping rules (criteria) for optimization algorithms, including the total number of iterations, the total number of function evaluations, no change in the value of the objective function after a certain number of iterations, and a prescribed error level between the objective function values in several consecutive iterations. Among them, the total number of iterations has been the main focus of researchers, who employ this criterion as the stopping rule. Hence, the present investigation considered the total number of iterations (T) as the stopping rule for the optimization algorithms when solving the functions F1 to F23, and the number of function evaluations (FEs) when solving the CEC 2017 test suite.
Seven unimodal functions were included in the first group of objective functions analyzed to assess the competence of OOBO. Table 2 reports the implementation results of OOBO and the eight competitors. What is clear from the analysis of the simulation results is that OOBO is the first-best optimizer for the functions F1, F2, F3, F4, F5, F6, and F7 compared to the competitor algorithms. The comparison of the simulation results demonstrates that the proposed OOBO has a great capacity to solve unimodal problems and is far more competitive than the other eight algorithms.
The second set of objective functions chosen to assess the efficacy of the optimization algorithms consisted of six high-dimensional multimodal objective functions, F8 to F13. Table 3 presents the outcomes of optimizing these objective functions using the proposed OOBO and the eight competitor techniques. Based on the simulation results, OOBO provides the optimal solution for F9 and F11 and is also the first-best optimizer for F8, F10, F12, and F13. Similarly, it was determined that OOBO has a more efficient ability than the competitor algorithms to provide suitable solutions for F8 to F13.
Ten fixed-dimensional multimodal functions were considered as the third group of objective functions to test the performance of the optimization techniques. Table 4 provides the outcomes of implementing the proposed OOBO and the eight competitor algorithms on F14 to F23. The simulation results reveal that OOBO outperforms the competitor algorithms on F14, F15, F20, F21, F22, and F23. In optimizing the functions F16, F17, F18, and F19, although several algorithms perform the same from the "mean" perspective, OOBO has a better "standard deviation" and provides adequate solutions. The simulation results demonstrate that OOBO is more efficient than the competitor algorithms at solving this sort of objective function.
Figure 2 depicts boxplots of the performance of the optimization algorithms in solving the objective functions F1 to F23. In addition, the convergence curves of the OOBO approach and all competitor algorithms on the benchmark functions F1 to F23 are presented in Figure 3. The best score in the convergence curves refers to the best value obtained for the objective function up to each iteration; this index is updated in each iteration based on a comparison with its value in the previous iteration. The analysis of the convergence curves indicates that, when solving unimodal problems with the objective functions F1 to F7, the proposed OOBO converges to much better solutions than its eight competitor algorithms and has superior performance. When solving the high-dimensional multimodal problems F8 to F13, OOBO has greater convergence strength than its eight competitors. When solving the fixed-dimensional multimodal problems F14 to F23, the proposed OOBO approach has a faster convergence speed and greater convergence strength than the eight competitor algorithms.

Sensitivity Analysis
The proposed OOBO employs two parameters in its implementation for solving optimization problems: the number of population members (N) and the maximum number of iterations (T). The sensitivity of OOBO to these two parameters was therefore assessed next. To investigate the sensitivity of the proposed method to the population size, OOBO was run independently for N = 10, 20, 30, and 100 on F1 to F23. The simulation results of this part of the study are reported in Table 5, and the behavior of the convergence curves under changes in the population size is displayed in Figure 4. The simulation results show that the values of all objective functions decline as the population size increases. To investigate the proposed algorithm's sensitivity to T, OOBO was run independently with values of this parameter equal to T = 200, 500, 800, and 1000 when optimizing the functions F1 to F23. Table 6 and Figure 5 show the results of the sensitivity analysis of OOBO with respect to T. The inference from this sensitivity analysis is that the algorithm converges to better optimal solutions when run for a larger number of iterations.

Scalability Analysis
Next, a scalability study is presented to analyze the performance of OOBO in optimizing objective functions under changes in the problem dimensions. For this purpose, OOBO was employed in different dimensions (30, 50, 80, 100, 250, and 500) when optimizing F1 to F13. The OOBO convergence curves for the various dimensions are presented in Figure 6. The simulation results obtained from the scalability study are reported in Table 7. From the analysis of the results in this table, we can deduce that the efficiency of OOBO does not degrade too much as the dimensions of the given problem increase. OOBO's good performance under changes in the problem dimensions is due to its ability to achieve a proper balance between exploration and exploitation.

Evaluation of the CEC 2017 Test Suite
Next, the performance of the proposed OOBO approach in handling the CEC 2017 test suite was evaluated. The CEC 2017 test suite has thirty benchmark functions consisting of three unimodal functions, C17-F1 to C17-F3; seven multimodal functions, C17-F4 to C17-F10; ten hybrid functions, C17-F11 to C17-F20; and ten composition functions, C17-F21 to C17-F30. The function C17-F2 was removed from this test suite due to its unstable behavior (as in similar papers): especially for higher dimensions, it shows significant performance variations for the same algorithm implemented in MATLAB. Complete information on the CEC 2017 test suite is provided in [77]. The implementation results of OOBO and the competitor algorithms on the CEC 2017 test suite are reported in Table 8. The boxplot diagrams obtained from the performance of the metaheuristic algorithms on the CEC 2017 test suite are drawn in Figure 7.
Based on the simulation results, OOBO is the best optimizer for the functions C17-F1, C17-F4 to C17-F6, C17-F8, C17-F10 to C17-F24, and C17-F26 to C17-F30. This analysis shows that the proposed OOBO approach provides superior performance on the CEC 2017 test suite, yielding better results than the competitor algorithms on most of the benchmark functions.

Statistical Analysis
Next, the Wilcoxon rank sum test [78] was utilized to evaluate the performance of the optimization algorithms, in addition to the statistical analysis of the average and standard deviation. The Wilcoxon rank sum test determines whether there is a statistically significant difference between two sets of data; its p-value reveals whether the difference between OOBO and each competitor algorithm is statistically significant. Table 9 reports the results of this analysis. The proposed OOBO has a p-value less than 0.05 against each competitor algorithm on each of the three types of objective functions, indicating that OOBO is significantly different in statistical terms from the eight compared algorithms.
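The test itself is easy to reproduce. The sketch below implements the two-sided Wilcoxon rank sum test with the usual normal approximation (adequate for the run counts typical of such comparisons); the two sample arrays are hypothetical best-run values, not data from the paper.

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation."""
    combined = sorted((v, i) for i, v in enumerate(a + b))
    n1, n2 = len(a), len(b)
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for tied values (1-based)
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg
        i = j + 1
    r1 = sum(ranks[:n1])                      # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2               # mean of r1 under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (r1 - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value

# Hypothetical best-objective values over 10 runs for two algorithms
oobo = [0.1, 0.2, 0.15, 0.12, 0.18, 0.11, 0.14, 0.13, 0.16, 0.17]
rival = [0.9, 1.1, 0.8, 1.0, 0.95, 1.2, 0.85, 1.05, 0.88, 0.92]
print(rank_sum_p(oobo, rival))
```

A p-value below 0.05, as in this example, is the criterion used in Table 9 to declare a statistically significant difference.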


OOBO for Real-World Applications
In this section, the proposed OOBO and eight competitor algorithms are applied to four science/engineering design problems to evaluate their capacity to solve real-world problems. These design problems are the pressure vessel, speed reducer, welded beam, and tension/compression spring designs.

Pressure Vessel Design Problem
The mathematical model of the pressure vessel design problem was adapted from [79]. The main goal of this problem is to minimize the design cost. A schematic view of the pressure vessel design problem is shown in Figure 8.

To formulate the model, consider that X = [x1, x2, x3, x4] = [Ts, Th, R, L]; the goal is to minimize the design cost

f(x) = 0.6224 x1 x3 x4 + 1.7781 x2 x3^2 + 3.1661 x1^2 x4 + 19.84 x1^2 x3,

subject to the standard constraints on shell thickness, head thickness, vessel volume, and length [79], with 0 ≤ x1, x2 ≤ 100 and 10 ≤ x3, x4 ≤ 200. The solutions obtained using OOBO and the eight competitor algorithms are presented in Table 10. Based on these results, OOBO found the optimal solution at (0.7781, 0.3832, 40.3150, 200), with an objective function value of 5870.8460, showing that the proposed OOBO solves this problem at a low cost. The statistical results of the optimization algorithms on this problem are presented in Table 11; OOBO provides better values for the best, mean, and median indices than the other eight compared algorithms. The convergence curve of OOBO while achieving the optimal solution is presented in Figure 9.
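As a cross-check of the reported optimum, the canonical pressure vessel cost function from the literature can be evaluated directly. In the sketch below the constraint set follows the standard benchmark formulation, and the static-penalty coefficient `rho` is an illustrative choice, not a setting from the paper.

```python
import math

def pressure_vessel_cost(x):
    """Canonical pressure vessel design cost; x = [Ts, Th, R, L]."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4
            + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4
            + 19.84 * x1 ** 2 * x3)

def penalized(x, rho=1e6):
    """Static-penalty wrapper: cost plus rho times the total violation."""
    x1, x2, x3, x4 = x
    g = [
        -x1 + 0.0193 * x3,                                                # shell thickness
        -x2 + 0.00954 * x3,                                               # head thickness
        -math.pi * x3 ** 2 * x4 - (4 / 3) * math.pi * x3 ** 3 + 1296000,  # volume
        x4 - 240,                                                         # length limit
    ]
    return pressure_vessel_cost(x) + rho * sum(max(0.0, gi) for gi in g)

# Cost at the solution reported in Table 10
print(pressure_vessel_cost([0.7781, 0.3832, 40.3150, 200]))
```

Any metaheuristic can then minimize `penalized` directly, since the penalty turns the constrained design problem into an unconstrained one.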

Speed Reducer Design Problem
The mathematical model of the speed reducer design problem was first formulated in [80], but we used an adapted formulation from [81]. The main goal of this problem is to minimize the weight of the speed reducer. A schematic view of the speed reducer design problem is shown in Figure 10.

To formulate the model, consider that X = [x1, x2, x3, x4, x5, x6, x7] = [b, m, p, l1, l2, d1, d2]; the goal is to minimize the weight of the speed reducer

f(x) = 0.7854 x1 x2^2 (3.3333 x3^2 + 14.9334 x3 - 43.0934) - 1.508 x1 (x6^2 + x7^2) + 7.4777 (x6^3 + x7^3) + 0.7854 (x4 x6^2 + x5 x7^2),

subject to the eleven constraints detailed in [81]. The results of the proposed OOBO and the eight compared algorithms on this problem are presented in Table 12. OOBO found the optimal solution at (3.5012, 0.7, 17, 7.3, 7.8, 3.33412, 5.26531), with an objective function value of 2989.8520. Table 13 presents the statistical results of the proposed OOBO and the eight competitor algorithms; based on these results, OOBO outperforms the eight compared algorithms on the speed reducer design problem. The convergence curve of the proposed OOBO is presented in Figure 11.
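The canonical weight function for this benchmark is straightforward to evaluate; the sketch below encodes it so the reported optimum can be sanity-checked (the eleven side constraints from [81] are omitted here for brevity).

```python
def speed_reducer_weight(x):
    """Canonical speed reducer weight; x = [b, m, p, l1, l2, d1, d2]."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2 ** 2 * (3.3333 * x3 ** 2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6 ** 2 + x7 ** 2)
            + 7.4777 * (x6 ** 3 + x7 ** 3)
            + 0.7854 * (x4 * x6 ** 2 + x5 * x7 ** 2))

# Weight at the solution reported in Table 12
print(speed_reducer_weight([3.5012, 0.7, 17, 7.3, 7.8, 3.33412, 5.26531]))
```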

Welded Beam Design
The mathematical model of a welded beam design was adapted from [22]. The main goal for solving this design problem is to minimize the fabrication cost of the welded beam. A schematic view of the welded beam design problem is shown in Figure 12.

To formulate the model, consider that X = [x1, x2, x3, x4] = [h, l, t, b]; the goal is to minimize the fabrication cost

f(x) = 1.10471 x1^2 x2 + 0.04811 x3 x4 (14.0 + x2),

subject to constraints on shear stress, bending stress, buckling load, and the end deflection δ(x) = 65856000/(30 × 10^6 x4 x3^3), as detailed in [22]. The results of the proposed OOBO and the compared algorithms on this problem are presented in Table 14. The simulation results show that the proposed algorithm found the optimal solution at (0.20328, 3.47115, 9.03500, 0.20116), with an objective function value of 1.72099. The statistical results of the implemented algorithms are analyzed in Table 15; based on this analysis, the proposed OOBO is superior to the compared algorithms in the best, mean, and median indices. The convergence curve of OOBO on the welded beam design problem is shown in Figure 13.
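The cost function and deflection term of this benchmark, as given in the standard literature formulation, can be sketched as follows; the remaining stress and buckling constraints from [22] are omitted for brevity, so this is a partial check only.

```python
def welded_beam_cost(x):
    """Canonical welded beam fabrication cost; x = [h, l, t, b]."""
    x1, x2, x3, x4 = x
    return 1.10471 * x1 ** 2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

def deflection(x):
    """End deflection: delta(x) = 65856000 / (30e6 * x4 * x3^3)."""
    _, _, x3, x4 = x
    return 65856000.0 / (30e6 * x4 * x3 ** 3)

sol = [0.20328, 3.47115, 9.03500, 0.20116]
print(welded_beam_cost(sol), deflection(sol))
```

The deflection at the reported solution stays well under the usual 0.25 in limit of this benchmark.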

Tension/Compression Spring Design
The mathematical model of this problem was adapted from [22]. The main goal of this design problem is to minimize the weight of the tension/compression spring. A schematic view of the tension/compression spring design problem is shown in Figure 14.

To formulate the model, consider that X = [x1, x2, x3] = [d, D, P]; the goal is to minimize the spring weight

f(x) = (x3 + 2) x2 x1^2,

subject to the standard constraints on deflection, shear stress, surge frequency, and outer diameter [22]. The performance of all optimization algorithms in achieving the objective values and design variable values is presented in Table 16. The optimization results show that the proposed OOBO found the optimal solution at (0.05107, 0.34288, 12.08809), with an objective function value of 0.01266, which is superior to the results of the other eight algorithms. A comparison of the statistical results of the proposed OOBO against the eight competitor algorithms is provided in Table 17; OOBO offers more competitive best, mean, and median indices. The convergence curve of the proposed OOBO in achieving the obtained optimal solution is shown in Figure 15.
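The spring benchmark is small enough to encode completely. The sketch below uses the objective f(x) = (x3 + 2) x2 x1^2 and the standard four-constraint set from the literature; the constraint expressions are the usual benchmark formulation, assumed to match the adapted model of [22].

```python
def spring_weight(x):
    """Spring weight f(x) = (x3 + 2) * x2 * x1^2; x = [d, D, P]."""
    x1, x2, x3 = x
    return (x3 + 2) * x2 * x1 ** 2

def constraints(x):
    """Standard constraint set, each g_i(x) <= 0 when satisfied."""
    x1, x2, x3 = x
    return [
        1 - x2 ** 3 * x3 / (71785 * x1 ** 4),                  # deflection
        (4 * x2 ** 2 - x1 * x2) / (12566 * (x2 ** 3 * x1 - x1 ** 4))
            + 1 / (5108 * x1 ** 2) - 1,                        # shear stress
        1 - 140.45 * x1 / (x2 ** 2 * x3),                      # surge frequency
        (x1 + x2) / 1.5 - 1,                                   # outer diameter
    ]

sol = [0.05107, 0.34288, 12.08809]
print(spring_weight(sol), max(constraints(sol)))
```

At the reported solution the weight matches Table 16 to three decimals and the worst constraint value is near zero, i.e., the solution sits on the active constraint boundary.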


Conclusions and Future Works
A new optimization technique called the one-to-one-based optimizer (OOBO) was proposed in this study. The main idea in designing OOBO was the participation of all population members in the algorithm's updating process, based on a one-to-one correspondence between the set of population members and the set of members selected as guides. Thus, each population member was selected exactly once as a guide for another member and was used to update the position of only that member. The performance of the proposed OOBO was tested on 52 objective functions of the unimodal, high-dimensional multimodal, and fixed-dimensional multimodal types, as well as on the CEC 2017 test suite. The findings indicated OOBO's strong exploitation ability (based on the results for the unimodal functions), its strong exploration ability (based on the results for the high-dimensional multimodal functions), and its acceptable balance of exploitation and exploration (based on the results for the fixed-dimensional multimodal, hybrid, and composition functions).
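The one-to-one correspondence described above can be sketched as a random permutation with no fixed points: each member is chosen exactly once as a guide and never guides itself. In the sketch below, the permutation sampling is the essential part; the attraction/repulsion move is only an illustrative stand-in, not OOBO's exact update equations.

```python
import random

def one_to_one_guides(n, rng):
    """Sample a permutation with no fixed points, so every member is
    chosen exactly once as a guide and never guides itself."""
    while True:
        perm = list(range(n))
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n)):
            return perm

def update_population(pop, fitness, rng):
    """Illustrative one-to-one update: move toward a better guide,
    away from a worse one (minimization)."""
    n, dim = len(pop), len(pop[0])
    perm = one_to_one_guides(n, rng)
    new_pop = []
    for i in range(n):
        g = perm[i]
        r = rng.random()
        if fitness[g] < fitness[i]:   # guide is better: attract
            step = [r * (pop[g][d] - pop[i][d]) for d in range(dim)]
        else:                         # guide is worse: repel
            step = [r * (pop[i][d] - pop[g][d]) for d in range(dim)]
        new_pop.append([pop[i][d] + step[d] for d in range(dim)])
    return new_pop

rng = random.Random(0)
pop = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(6)]
fit = [sum(v * v for v in x) for x in pop]
print(update_population(pop, fit, rng)[0])
```

Because the guides form a permutation rather than a single best member, no individual dominates the update, which is the mechanism the paper credits for OOBO's exploration/exploitation balance.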
In addition, the performance of the proposed approach was compared with eight well-known algorithms. The simulation results showed that the proposed algorithm provided quasi-optimal solutions with better convergence than the compared algorithms. Furthermore, the ability of the proposed approach to provide suitable solutions for real-world applications was tested on four science/engineering design problems, whose optimization results make it clear that OOBO is applicable to real-world optimization problems. In response to the main research question about introducing a new optimization algorithm, the simulation findings showed that the proposed OOBO approach performed better than its competitors on most of the benchmark functions; this successful performance justifies the introduction and design of the proposed approach.
Alongside advantages such as a strong ability to balance exploration and exploitation and its effectiveness in handling real-world applications, the proposed OOBO approach has limitations and disadvantages. The first limitation, shared by all optimization algorithms, is that, based on the NFL theorem, there is always a possibility that newer algorithms will be designed that perform better than OOBO. A second limitation is that, due to the nature of random search, there is no guarantee of reaching the global optimum using OOBO.

Figure 1 .
Figure 1. Search history curves, trajectory curves, and convergence curves for optimization of different objective functions using OOBO.


Figure 2 .
Figure 2. Boxplots of the performance of OOBO and competitor algorithms based on F1 to F23.


Figure 3 .
Figure 3. Convergence curves of OOBO and competitor algorithms in optimizing F1 to F23.


Figure 4 .
Figure 4. Convergence curves of sensitivity analysis of OOBO in relation to N.

Figure 5 .
Figure 5. Convergence curves of sensitivity analysis of OOBO in relation to T.

Figure 6 .
Figure 6. Convergence curves of OOBO in solving F1 to F13 for the different problem dimensions considered in the scalability study.

Figure 7 .
Figure 7. Boxplot diagram of OOBO and competitor algorithms using the CEC 2017 test suite.


Figure 8 .
Figure 8. Schematic of the pressure vessel design.

Figure 9 .
Figure 9. OOBO's performance convergence curve on the pressure vessel design.


Figure 10 .
Figure 10. Schematic of the speed reducer design.

Figure 11 .
Figure 11. OOBO's performance convergence curve on the speed reducer design.


Figure 12 .
Figure 12. Schematic of the welded beam design.

Figure 13 .
Figure 13. OOBO's performance convergence curve on the welded beam design.



Figure 14 .
Figure 14. Schematic of the tension/compression spring design. To formulate the model, consider that X = [x1, x2, x3] = [d, D, P]; the mathematical program minimizes f(x) = (x3 + 2) x2 x1^2.

Figure 15 .
Figure 15. OOBO's performance convergence curve on the tension/compression spring.


Table 3 .
Optimization results for the indicated algorithm and unimodal functions.

Table 4 .
Optimization results for the indicated algorithm and high-dimensional multimodal functions.

Table 7 .
Scalability study results of OOBO.

Table 8 .
Optimization results of the indicated algorithm on the CEC 2017 test suite functions.

Table 9 .
Wilcoxon rank sum test results for the indicated algorithm and function.


Table 10 .
Performance of the indicated algorithm on the pressure vessel design problem.

Table 11 .
Statistical results of the indicated algorithm on the pressure vessel design problem.


Table 12 .
Performance of the indicated algorithm on the speed reducer design problem.

Table 13 .
Statistical results of the indicated algorithm on the speed reducer design problem.


Table 14 .
Performance of the indicated algorithm on the welded beam design problem.

Table 15 .
Statistical results of the indicated algorithm on the welded beam design problem.


Table 16 .
Performance of the indicated algorithm on the tension spring design problem.

Table 17 .
Statistical results of the indicated algorithm on the tension spring design problem.
