Selecting Some Variables to Update-Based Algorithm for Solving Optimization Problems

With the advancement of science and technology, new complex optimization problems have emerged, and achieving their optimal solutions has become increasingly important. Many of these problems have difficulties such as non-convexity, nonlinearity, discrete search spaces, and non-differentiable objective functions, which make achieving the optimal solution a major challenge. To address this challenge and provide a way to deal with the complexities and difficulties of optimization applications, a new stochastic-based optimization algorithm is proposed in this study. Stochastic optimization algorithms address optimization issues by randomly scanning the search space to produce quasi-optimal answers. The Selecting Some Variables to Update-Based Algorithm (SSVUBA) is a new optimization algorithm developed in this study to handle optimization issues in various fields. The suggested algorithm's key principles are to make better use of the information provided by different members of the population and to adjust the number of variables used to update the algorithm population during its iterations. The theory of the proposed SSVUBA is described, and then its mathematical model is offered for use in solving optimization issues. Fifty-three objective functions, including unimodal, multimodal, and CEC 2017 test functions, are utilized to assess the ability and usefulness of the proposed SSVUBA in addressing optimization issues. SSVUBA's performance in optimizing real-world applications is evaluated on four engineering design problems. Furthermore, the performance of SSVUBA in optimization was compared to the performance of eight well-known algorithms to further evaluate its quality.
The simulation results reveal that the proposed SSVUBA has a significant ability to handle various optimization issues and that it outperforms other competitor algorithms by giving appropriate quasi-optimal solutions that are closer to the global optima.


Introduction
The act of obtaining the optimal solution from multiple solutions under a given situation is known as optimization [1]. In design problems in different sciences, objectives such as cost minimization, profit maximization, shortest length, maximum endurance, best structure, etc., are often raised, which require mathematical modeling of the problem in the form of an optimization problem and solving it with appropriate methods.
Mathematical methods of optimization are categorized according to the type of problem modeling, such as linear or nonlinear programming, constrained or unconstrained, and continuous or discrete. Despite their good performance, these methods also have obstacles and disadvantages. They generally find a local optimum, especially if the initial guess is close to one. In addition, each of these methods makes assumptions about the problem, such as differentiability, convexity, and continuity, which may not hold. Beyond these disadvantages, the computation time of these methods on a class of optimization problems called nondeterministic polynomial-time hard (NP-hard) increases exponentially as the dimensions of the problem increase [2].
To overcome these challenges, a special class of optimization methods called stochastic-based optimization algorithms was developed. Because these algorithms rely on probabilistic decisions and random search principles in many steps of the search for the optimal solution, they are called stochastic methods [3].
To find the best answer, optimization algorithms rely on a similar technique. The search procedure in most of these algorithms begins by generating a number of random answers within the allowable range of decision variables. This set of solutions in each of the algorithms has names such as population, colony, group, and so on. Moreover, each solution is assigned names such as chromosomes, ants, particles, and so on. The existing answers are then enhanced in various ways in an iterative process, and this action proceeds until the stop condition is achieved [4].
The global optimum is the fundamental answer to an optimization issue. However, optimization algorithms, as stochastic methods, are not necessarily able to supply the globally optimal answer. Hence, the solution obtained from an optimization algorithm for an optimization problem is called quasi-optimal [5]. The goodness of a quasi-optimal solution depends on how close it is to the global optimum. As a result, when comparing the effectiveness of several optimization algorithms on a problem, the method that produces a quasi-optimal solution closer to the global optimum is preferable. This issue, as well as the goal of attaining better quasi-optimal solutions, has prompted extensive efforts and research by academics to develop a variety of optimization algorithms that provide solutions closer to the global optimum. Stochastic-based optimization algorithms have wide applications in optimization challenges in various sciences such as sensor networks [6], image processing [7], data mining [8], feature selection [9], clustering [10], engineering [11], the Internet of Things [12], and so on.
Is there still a need to develop new optimization algorithms despite the optimization algorithms that have been established so far? This is a key question that emerges in the research of optimization algorithms. The notion of the No Free Lunch (NFL) theorem has the answer to this question [13]. According to the NFL theorem, an optimization method that is effective in optimizing a group of optimization issues does not ensure that it will be useful in solving other optimization problems. As a result, it is impossible to say that one method is the best optimizer for all optimization problems. The NFL theorem motivates academics to create novel optimization algorithms to tackle optimization issues more efficiently.
The authors of this paper have developed several optimization algorithms in their previous works, such as the Pelican Optimization Algorithm (POA) [14] and the Teamwork Optimization Algorithm (TOA) [15]. The common denominator of all optimization algorithms (both in the works of the authors of this article and the works of other researchers) can be considered the use of a random scan of the problem search space, random operators, no need for derivative information, easy implementation, simple concepts, and practicality in optimization challenges. The optimization process in population-based optimization algorithms starts with a random initial population. Then, in an iteration-based process, according to the algorithm steps, the position of the algorithm population in the search space is updated until the implementation is completed. The most important difference between optimization algorithms lies precisely in this process of updating the members of the algorithm population from one iteration to the next. In POA, the population update process is based on simulating the strategies of pelicans while hunting. In TOA, the main idea in updating the population is modeling the activities and interactions of individuals in a group performing teamwork to achieve a team goal.
The novelty of this paper is in the development and design of a new optimization method named Selecting Some Variables to Update-Based Algorithm (SSVUBA) to address optimization problems. The main contributions of this study are as follows:
1. A new stochastic-based approach called Selecting Some Variables to Update-Based Algorithm (SSVUBA) for optimization issues is introduced.
2. The fundamental idea behind the proposed method is to change the number of variables selected to update the algorithm population throughout the iterations, as well as to use more information from diverse members of the population to prevent the algorithm from relying on one or several specific members.
3. SSVUBA theory and steps are described, and its mathematical model is presented.
4. SSVUBA's capacity to optimize is examined on a set of fifty-three standard objective functions of unimodal, multimodal, and CEC 2017 types.
5. The proposed algorithm is implemented on four engineering design problems to analyze SSVUBA's ability to solve real-world applications.
6. SSVUBA's performance is compared to the performance of eight well-known algorithms to better understand its potential to optimize.
The following is the rest of the paper: A study of optimization methods is provided in Section 2. The proposed SSVUBA is introduced in Section 3. Simulation investigations are presented in Section 4. A discussion is provided in Section 5. The performance of SSVUBA in optimizing real-world applications is evaluated in Section 6. Section 7 contains the conclusions and recommendations for future research.

Background
Optimization algorithms are usually developed based on the simulation of various ideas in nature, physics, genetics and evolution, games, and any type of process that can be modeled as an optimizer.
One of the first and most prominent meta-heuristic algorithms is the Genetic Algorithm (GA), which is based on the theory of evolution. The main operator of this algorithm is crossover, which combines different members of the population. The mutation operator is also useful for preventing premature convergence and falling into local optima. The smart part of this method is the selection stage, which at each step transmits better solutions to the next generation [16]. Ant Colony Optimization (ACO) is designed based on the group behavior of ants during food discovery. Ants release pheromones along the way to food, and the presence of more pheromone on a path indicates a rich food source near that path. ACO is built by modeling the processes of pheromone release, pheromone tracking, and pheromone evaporation [17]. Particle Swarm Optimization (PSO) is one of the most established swarm-based algorithms, inspired by the social behavior of biological species that live in groups, such as birds and fish. This algorithm mimics the interaction between members to share information. Every particle is influenced by its own best position and the best position of the whole swarm, while also moving randomly [18]. The Simulated Annealing (SA) algorithm is a physics-based stochastic search method for optimization that relies on simulating the gradual heating and cooling process of metals called annealing. The purpose of annealing metals is to achieve a minimum-energy, well-ordered crystalline structure; in SA, this idea has been applied to optimization and search [19]. The Firefly Algorithm (FA) is based on the natural behavior of fireflies that live together in large clusters. FA simulates the activity of a group of fireflies by assigning a value to each firefly's position as a model of its brightness and then updating the fireflies' locations in subsequent iterations.
The two main stages of FA in each iteration are the brightness update phase and the motion phase. Fireflies move toward brighter fireflies in their neighborhood. In this way, during successive iterations, the proposed solutions tend towards better solutions [20]. The Teaching-Learning-Based Optimization (TLBO) method is based on simulating a teacher's impact on the output of students in a classroom. TLBO is built on two fundamental phases of teaching and learning: (1) the teacher phase, in which knowledge is exchanged between the teacher and the learners, and (2) the learner phase, in which the learners exchange knowledge with each other.

Selecting Some Variables to Update-Based Algorithm (SSVUBA)
In this section, the theory and all stages of the Selecting Some Variables to Update-Based Algorithm (SSVUBA) are described, and then its mathematical model is presented for application in tackling optimization issues.

Mathematical Model of SSVUBA
SSVUBA is a population-based stochastic algorithm. Each optimization issue has a search space with the same number of axes as the problem's variables. According to its position in the search space, each member of the population assigns values to these axes. As a result, each member of the population in the SSVUBA is a proposed solution to the optimization issue. Each member of the population can be mathematically described as a vector, each component of which represents the value of one of the problem variables. As a result, the population members of the proposed SSVUBA can be modeled using a matrix termed the population matrix, as shown in Equation (1).
where X is the SSVUBA's population matrix, X i is the ith member, x i,d is the value of the dth problem variable generated by the ith member, N is the number of population members, and m is the number of problem variables. The objective function of the problem can be assessed using the theory that each member of the population provides values for the problem variables. As a result, the values derived for the objective function based on the evaluation of different members of the population can be described employing a vector according to Equation (2).
where F denotes the objective function vector and F i represents the objective function value obtained from the ith population member's evaluation. The process of updating population members in the proposed SSVUBA adheres to two principles.
The first principle is that some members of the population may be in a situation where, instead of changing all of the variables, changing the values of only some variables would put them in a better position. Therefore, in the proposed SSVUBA, the number of variables selected for the update process is set in each iteration: in the initial iterations, the number is set near the maximum, and it decreases to the minimum number of variables by the final iterations. This principle is mathematically simulated using an index based on Equation (3).
where I_v denotes the number of variables selected for the update process, T is the maximum number of iterations, and t is the iteration counter. The second principle is to prevent the population update process from relying on specific members. Making the update process depend on specific members of the population might cause the algorithm to converge towards a local optimum and prevent accurate scanning of the search space to attain the global optimum. The process of updating population members has been modeled using Equations (4)-(6) according to the two principles expressed. To update each member of the population, another member of the population is randomly selected. If the selected member has a better value of the objective function, the first formula in Equation (4) is used; otherwise, the second formula is used.
where X_i^new, i = 1, 2, ..., N, is the new status of the ith member; x_{i,k_j}^new, j = 1, 2, ..., I_v, is the k_j-th dimension of the ith member, where k_j is a random element of the set {1, 2, ..., m}; F_i^new is the objective function value of the ith population member in its new status; r is a random number in the interval [0, 1]; x_{s,k_j} is the k_j-th dimension of the member selected to guide the ith member; and F_s is its objective function value.
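As a concrete illustration, the update of a single member under Equations (4)-(6) can be sketched in Python. This is a hedged reconstruction, not the authors' reference implementation: the coefficient I is assumed to be a random integer in {1, 2}, a common convention that the text does not specify, and the guide member is drawn once per member rather than once per dimension for simplicity.

```python
import numpy as np

def update_member(X, F, i, I_v, objective, rng):
    """Sketch of one SSVUBA member update (Equations (4)-(6)).
    X: N x m population matrix, F: objective vector, i: member index,
    I_v: number of variables selected for updating (Equation (3))."""
    N, m = X.shape
    s = rng.choice([p for p in range(N) if p != i])   # random guide, s != i
    x_new = X[i].copy()
    for k in rng.choice(m, size=I_v, replace=False):  # I_v random dimensions
        r = rng.random()                              # r in [0, 1]
        I = rng.integers(1, 3)                        # assumed: I in {1, 2}
        if F[s] < F[i]:   # guide is better: first case of Equation (4)
            x_new[k] = X[i, k] + r * (X[s, k] - I * X[i, k])
        else:             # guide is worse: second case of Equation (4)
            x_new[k] = X[i, k] + r * (X[i, k] - I * X[s, k])
    f_new = objective(x_new)
    if f_new < F[i]:      # Equation (6): keep the new status only if better
        X[i], F[i] = x_new, f_new

rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x ** 2))
X = rng.uniform(-10, 10, size=(5, 3))
F = np.array([sphere(x) for x in X])
before = F.copy()
for i in range(5):
    update_member(X, F, i, I_v=2, objective=sphere, rng=rng)
```

Because the replacement in Equation (6) is greedy, no member's objective value can worsen from one iteration to the next.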

Repetition Process of SSVUBA
After all members of the population have been updated, the SSVUBA algorithm goes on to the next iteration. In the new iteration, the index I_v is adjusted using Equation (3), and then the population members are updated based on Equations (4)-(6). This process repeats until the algorithm is completed. The best quasi-optimal solution found by the algorithm during execution is offered as the answer to the problem after the complete implementation of SSVUBA for the specified optimization problem. Figure 1 depicts the flowchart of the SSVUBA's various steps, while Algorithm 1 presents its pseudocode.

Computational Complexity of SSVUBA
In this subsection, the computational complexity of SSVUBA is presented. In this regard, time and space complexities are discussed.

Time Complexity
SSVUBA preparation and initialization require O(N·m) time, where N is the number of SSVUBA population members and m is the number of problem variables. In each iteration of the algorithm, the population members are updated, which requires O(T·N·I_v) time in total, where T is the maximum number of iterations and I_v is the number of variables selected for the update process. Accordingly, the total time complexity of SSVUBA is O(N·(m + T·I_v)).
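The operation counts behind this bound can be illustrated with a quick calculation. The average I_v below assumes the number of selected variables decreases roughly linearly from m to 1 over the iterations, which is an assumption; the paper's exact schedule in Equation (3) may differ.

```python
# Rough operation counts behind the O(N*(m + T*I_v)) bound
N, m, T = 30, 10, 500
init_ops = N * m               # generating the initial N x m population

# Assuming I_v shrinks roughly linearly from m down to 1,
# its average over the run is about (m + 1) / 2
avg_Iv = (m + 1) / 2
update_ops = T * N * avg_Iv    # dimension updates over all iterations

print(init_ops, update_ops)    # 300 82500.0
```

As expected, the update phase dominates the overall cost for any nontrivial T.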

Space Complexity
The space complexity of SSVUBA is O(N·m), which corresponds to the maximum amount of memory required, reached during its initialization procedure.
Algorithm 1. Pseudocode of SSVUBA.
Start SSVUBA.
1. Input the optimization problem information: decision variables, constraints, and objective function.
2. Set the T and N parameters.
3. Generate the initial population matrix at random and evaluate the objective function.
4. For t = 1:T
5. Adjust the number of selected variables to update (I_v) using Equation (3).
6. For i = 1:N
7. For j = 1:I_v
8. Select a population member at random to guide the ith population member: X_S ← X(S,:), S randomly selected from {1, 2, ..., N} and S ≠ i.
9. Select one of the variables at random to update: x_{i,k_j}, k_j randomly selected from {1, 2, ..., m}.
10. Calculate the new status of the k_j-th dimension using Equation (4).
11. end
12. Calculate the objective function based on X_i^new.
13. Update the ith population member using Equation (6).
14. end
15. Save the best solution so far.
16. end
17. Output the best obtained solution.
End SSVUBA.
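The repetition process can also be sketched as a runnable Python function. This is a hedged reconstruction, not the authors' reference implementation: Equation (3) is assumed to shrink I_v linearly from m down to 1, the coefficient I is assumed to be a random integer in {1, 2}, and the guide member is drawn once per member rather than once per dimension for simplicity.

```python
import numpy as np

def ssvuba(objective, m, lo, hi, N=30, T=100, seed=0):
    """Minimal end-to-end sketch of the SSVUBA loop (assumptions noted above)."""
    rng = np.random.default_rng(seed)
    X = lo + rng.random((N, m)) * (hi - lo)         # initial population
    F = np.array([objective(x) for x in X])
    best_x, best_f = X[F.argmin()].copy(), F.min()
    for t in range(1, T + 1):
        I_v = max(1, round(m * (1 - t / T)))        # assumed form of Eq. (3)
        for i in range(N):
            s = rng.choice([p for p in range(N) if p != i])
            x_new = X[i].copy()
            for k in rng.choice(m, size=I_v, replace=False):
                r, I = rng.random(), rng.integers(1, 3)
                if F[s] < F[i]:                     # Eq. (4), first case
                    x_new[k] = X[i, k] + r * (X[s, k] - I * X[i, k])
                else:                               # Eq. (4), second case
                    x_new[k] = X[i, k] + r * (X[i, k] - I * X[s, k])
            x_new = np.clip(x_new, lo, hi)          # keep within bounds
            f_new = objective(x_new)
            if f_new < F[i]:                        # Eq. (6), greedy update
                X[i], F[i] = x_new, f_new
        if F.min() < best_f:                        # save the best so far
            best_f, best_x = F.min(), X[F.argmin()].copy()
    return best_x, best_f

best_x, best_f = ssvuba(lambda x: float(np.sum(x ** 2)), m=5, lo=-10.0, hi=10.0)
print(best_f)  # best objective value found (small for the Sphere function)
```

The greedy acceptance in Equation (6) makes each member's objective value non-increasing over iterations, so the best-so-far bookkeeping can only improve.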

Visualization of the Movement of Population Members towards the Solution
In the SSVUBA approach, population members converge to the optimal area of the search space through the exchange of information between each other and the algorithm steps. In this subsection, to visualize the members' movement in the search space, the process of SSVUBA members' approach to the solution is shown intuitively. This visualization is presented in a two-dimensional space, with a population size of 30 and 30 iterations, in optimizing an objective function called the Sphere function, whose mathematical model is as follows: F(x_1, x_2) = x_1^2 + x_2^2, subject to: −10 ≤ x_1, x_2 ≤ 10. Figure 2 shows the process of SSVUBA's progress towards the solution when optimizing this objective function. In this figure, the convergence of the population members towards the optimal values of the variables (i.e., x_1 = x_2 = 0) and the optimal value of the objective function (i.e., F(x_1, x_2) = 0) is clearly evident.

Simulation Studies and Results
In this section, simulation studies are presented to evaluate the performance of SSVUBA in optimization and its ability to provide appropriate solutions for optimization problems. For this purpose, SSVUBA is applied to twenty-three standard objective functions of unimodal, high-dimensional multimodal, and fixed-dimensional multimodal types [37] (see their definitions in Appendix A). In addition to these twenty-three objective functions, SSVUBA's performance has been tested in optimizing the CEC 2017 test functions [38] (see their definitions in Appendix A). Furthermore, to further assess the proposed approach, the optimization results achieved for the above objective functions using SSVUBA are compared to the performance of twelve optimization methods: PSO, TLBO, GWO, WOA, MPA, TSA, GSA, GA, RFO, RSA, AHA, and HBA. Numerous optimization algorithms have been developed so far, and comparing an algorithm with all existing algorithms, although possible, would produce an unwieldy volume of results. Therefore, twelve optimization algorithms have been used for comparison. The reasons for choosing these algorithms are as follows: (i) popular and widely used algorithms: GA and PSO; (ii) algorithms that have been widely cited and employed in a variety of applications: GSA, TLBO, GWO, WOA; (iii) algorithms that have been published recently and have received a lot of attention: RFO, TSA, MPA, RSA, AHA, HBA. The average of the best obtained solutions (avg), the standard deviation of the best obtained solutions (std), the best obtained candidate solution (bsf), and the median of the obtained solutions (med) are used to present the optimization outcomes for the objective functions. Table 1 shows the values utilized for the control parameters of the compared optimization techniques.


Assessment of F1 to F7 Unimodal Functions
Unimodal functions are the first category of objective functions considered for analyzing the performance of optimization methods. The optimization results for the unimodal objective functions F1 to F7 using SSVUBA and the eight compared algorithms are reported in Table 2. SSVUBA has been able to find the global optimum for the F6 function. Further, SSVUBA ranks first among the optimizers for the F1 to F5 and F7 functions. Comparing the performance of the optimization algorithms against the results of the proposed approach indicates that SSVUBA is able to provide quasi-optimal solutions closer to the global optimum and thus has a higher capability in optimizing unimodal functions than the compared algorithms.

Assessment of F8 to F13 High-Dimensional Multimodal Functions
High-dimensional multimodal functions are the second type of objective function employed to assess the performance of optimization techniques. Table 3 reveals the results of the implementation of SSVUBA and the eight compared algorithms for functions F8 to F13. For the F9 and F11 functions, SSVUBA was able to deliver the global optimum. Furthermore, for the F8, F10, F12, and F13 functions, SSVUBA was the superior optimizer. According to the simulation findings, SSVUBA outperformed the other algorithms in solving high-dimensional multimodal issues by offering effective solutions for the F8 to F13 functions.

Assessment of F14 to F23 Fixed-Dimensional Multimodal Functions
Fixed-dimensional multimodal functions are the third type of objective function used to evaluate the efficiency of optimization techniques. Table 4 shows the optimization results for the F14 to F23 functions utilizing SSVUBA and the eight compared techniques. SSVUBA was able to deliver the global optimum for the F14 function. SSVUBA also ranked first among the optimizers for the F15, F16, F21, and F22 functions. In optimizing functions F17, F18, F19, F20, and F23, SSVUBA was able to converge to quasi-optimal solutions with smaller standard deviation values. Comparing the performance of the optimization algorithms in solving the F14 to F23 functions makes it clear that SSVUBA provides superior and competitive results versus the compared algorithms. Figure 3 shows the performance of SSVUBA as well as the eight competitor algorithms in the form of boxplots.

Statistical Analysis
Using the average of the obtained solutions, the standard deviation, the best candidate solution, and the median of the obtained solutions to analyze and compare the performance of optimization algorithms offers significant information about the quality and capabilities of those algorithms. However, it is possible, even if with low probability, that the superiority of one algorithm over several others in solving optimization problems is random. Therefore, in this subsection, the Wilcoxon rank sum test [39] is used to statistically analyze the superiority of SSVUBA. The Wilcoxon rank sum test is a nonparametric test for assessing whether the distributions of results obtained by two separate methods differ systematically from one another.
The Wilcoxon rank sum test was applied to the optimization results obtained from the optimization algorithms; the results of this analysis are presented in Table 5. In the Wilcoxon rank sum test, the p-value indicates whether the superiority of one algorithm over another is significant. Therefore, in cases where the p-value is less than 0.05, the proposed SSVUBA has a statistically significant superiority over the compared algorithm.
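As an illustration of the test reported in Table 5, the snippet below applies SciPy's rank sum test to two hypothetical samples of best objective values from ten runs each; the numbers are made up for illustration and are not taken from Table 5.

```python
from scipy.stats import ranksums

# Hypothetical best objective values over 10 independent runs
ssvuba_runs = [0.012, 0.009, 0.015, 0.011, 0.008, 0.013, 0.010, 0.009, 0.014, 0.012]
rival_runs  = [0.210, 0.180, 0.250, 0.190, 0.230, 0.220, 0.205, 0.240, 0.195, 0.215]

# Wilcoxon rank sum test: are the two result distributions systematically different?
stat, p_value = ranksums(ssvuba_runs, rival_runs)
print(p_value < 0.05)  # True: the difference is statistically significant
```

Because every value in the first sample is smaller than every value in the second, the ranks separate completely and the p-value falls well below the 0.05 threshold used in the paper.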

Sensitivity Analysis
The proposed SSVUBA is a population-based algorithm that is able to solve optimization problems in an iteration-based procedure. Therefore, the two parameters N and T affect the performance of SSVUBA in achieving the solution. As a result, the sensitivity analysis of the proposed SSVUBA to these two parameters is described in this subsection.
SSVUBA was applied to the F1 to F23 functions in independent runs with populations of 20, 30, 50, and 80 members to investigate the sensitivity of the proposed SSVUBA's performance to the N parameter. Table 6 reveals the findings of SSVUBA's sensitivity analysis to N. In addition, the convergence curves of the proposed SSVUBA for the different populations are plotted in Figure 4. The sensitivity analysis of SSVUBA to the number of population members shows that increasing the number of search agents leads to more accurate scanning of the search space and more appropriate optimal solutions.
The proposed approach was then run independently with 100, 500, 800, and 1000 iterations on the objective functions F1 to F23, with the aim of investigating the sensitivity of SSVUBA's performance to the parameter T. Table 7 shows the results of this sensitivity study, and Figure 5 shows the corresponding convergence curves. The results illustrate that increasing the number of iterations gives the algorithm more opportunity to converge towards optimal solutions: as the maximum number of iterations increases, the values of the objective functions decrease.
In addition to the sensitivity of SSVUBA to the N and T parameters, each of the relationships used in Equation (4) also affects its performance. Therefore, the effectiveness of the cases in Equation (4) is examined at this stage. In this regard, the proposed SSVUBA was implemented in three different modes for the objective functions F1 to F23. In the first case (mode 1), only the first case of Equation (4), i.e., x_{i,k_j} + r·(x_{s,k_j} − I·x_{i,k_j}), is used. In the second case (mode 2), only the second case of Equation (4), i.e., x_{i,k_j} + r·(x_{i,k_j} − I·x_{s,k_j}), is used. In the third case (mode 3), both cases of Equation (4) are used simultaneously. The results of this analysis are shown in Table 8, and Figure 6 shows the corresponding SSVUBA convergence curves for functions F1 to F23. What can be deduced from the simulation results is that applying both relationships in Equation (4) simultaneously leads to better and more efficient optimization results for the objective functions F1 to F23 than using either relationship separately.

Population Diversity Analysis
Population diversity has a significant impact on the success of the optimization process carried out by optimization algorithms. Population diversity can improve the algorithm's ability to search globally in the problem-solving space, thus preventing it from falling into the trap of locally optimal solutions. In this regard, in this subsection, a population diversity analysis of SSVUBA performance is presented. To show the population diversity of SSVUBA during the iterations of the algorithm, the I_C index is used, which is calculated using Equations (7) and (8) [40].
Here, I_C is the spreading of the population members from their centroid, and c_j is the centroid of the population in the j-th dimension.
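Equations (7) and (8) are not reproduced here; the sketch below implements the common centroid-based form of such a diversity index, which may differ in normalization from the exact definition in [40]. The function name and the use of the mean Euclidean distance are illustrative assumptions.

```python
import numpy as np

def diversity_index(population):
    """Centroid-based population diversity (a common form of the I_C index).

    c_j is the centroid of the population in dimension j (in the spirit of
    Equation (8)), and the index averages each member's Euclidean distance
    from that centroid (in the spirit of Equation (7)). The exact
    normalization used in [40] may differ from this sketch.
    """
    pop = np.asarray(population, dtype=float)
    c = pop.mean(axis=0)  # centroid c_j per dimension
    return np.sqrt(((pop - c) ** 2).sum(axis=1)).mean()
```

A population collapsed onto a single point has zero diversity under this measure; a widely spread population has a large value, so plotting the index over iterations shows how exploration decays as the algorithm converges.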
The impact of population diversity on the optimization process given by SSVUBA in optimizing functions F1 to F23 is shown in Figure 7. In this figure, population diversity and SSVUBA convergence curves are presented for each of the objective functions. As can be seen from the simulation results, SSVUBA maintains high population diversity while optimizing most of the objective functions. In optimizing functions F1, F2, F3, F4, F7, F8, F9, F12, F13, F15, F16, F17, F18, F19, F20, F21, F22, and F23, both the convergence process and the population diversity continue until the final iterations. In handling the F5 function, the convergence process continues until the final iteration. In the optimization of function F6, SSVUBA reached the global optimum with high search power, after which the population diversity decreased. In the optimization of function F10, the population diversity decreased once the algorithm achieved an acceptable solution. In solving function F11, the population diversity decreased as SSVUBA converged to the best solution, the global optimum. In optimizing function F14, the population diversity decreased after the algorithm converged to the optimal solution. Therefore, the results of the population diversity analysis indicate the high ability of SSVUBA to maintain population diversity, which has led to its effective performance in providing appropriate solutions for the objective functions.

Evaluation of the CEC 2017 Test Functions
In this subsection, the performance of SSVUBA in addressing the CEC 2017 benchmark is examined. The CEC 2017 set includes three unimodal functions (C1 to C3), seven simple multimodal functions (C4 to C10), ten hybrid functions (C11 to C20), and ten composition functions (C21 to C30). The results obtained from the implementation of SSVUBA and competitor algorithms for these functions are shown in Table 9. What can be deduced from the simulation results is that SSVUBA performed better than competitor algorithms in handling the C1, C2, C4, C5, C11, C12, C13, C14, C15, C16, C17, C18, C19, C20, C21, C24, C26, C27, C29, and C30 functions.


Discussion
Two essential factors that influence the performance of optimization algorithms are the exploitation and exploration capabilities. To give an acceptable solution to an optimization issue, each optimization algorithm must strike a reasonable balance between these two requirements.
In the study of optimization algorithms, exploitation refers to the algorithm's capacity for local search. After reaching the optimal area of the search space, an optimization algorithm should be able to converge as closely as possible to the global optimum. Consequently, when comparing several algorithms on an optimization problem, the algorithm that provides a solution closer to the global optimum has the better exploitation capability. Exploitation ability is essential, especially when solving problems that have only one main solution. The objective functions F1 to F7, which are unimodal, lack locally optimal solutions and have only one main solution. As a result, functions F1 to F7 are good candidates for testing the exploitation ability of optimization techniques. The optimization results for the unimodal objective functions reported in Table 2 show that the proposed SSVUBA has a higher local search capability than the compared algorithms and, with its high exploitation power, is able to deliver solutions very close to the global optimum.
In the study of optimization algorithms, exploration refers to the algorithm's capacity for global search. To find the optimal area, an optimization algorithm should be able to correctly scan diverse portions of the search space. Exploration power enables the algorithm to pass through all locally optimal areas and avoid becoming trapped in a local optimum. Consequently, when comparing optimization algorithms on a problem, the algorithm that can scan the problem search space thoroughly enough to move away from all locally optimal solutions and towards the global optimum has the higher exploration ability. Exploration ability is of particular importance when solving problems that have several locally optimal solutions in addition to the main solution. The objective functions F8 to F23, which are multimodal functions, have this feature. As a result, these functions are good candidates for testing the exploration ability of optimization algorithms. The examination of the optimization results for the multimodal functions, provided in Tables 3 and 4, shows that SSVUBA has a superior ability in global search and is capable of passing through locally optimal areas due to its high exploration power.
Although exploitation and exploration each affect the performance of optimization algorithms, neither alone is enough for an algorithm to succeed in optimization; a balance between the two indicators is required for an algorithm to handle optimization problems. The simulation results show that SSVUBA strikes an effective balance between exploration and exploitation. The superiority of SSVUBA in the management of optimization applications is evident from the statistical criteria and rankings compared to the competitor algorithms, and the statistical analysis with the Wilcoxon rank-sum test shows that this superiority is also statistically significant.
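As an illustration of how such a Wilcoxon rank-sum comparison can be carried out, the sketch below uses SciPy on hypothetical per-run results; the sample values are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical best-objective values over 20 independent runs of two
# algorithms on the same test function (illustrative numbers only).
rng = np.random.default_rng(0)
ssvuba_runs = rng.normal(loc=0.001, scale=0.0005, size=20)
rival_runs = rng.normal(loc=0.010, scale=0.0050, size=20)

# Two-sided Wilcoxon rank-sum test: a p-value below 0.05 indicates that
# the difference between the two result samples is statistically
# significant at the 5% level.
stat, p_value = ranksums(ssvuba_runs, rival_runs)
```

The same test, applied per benchmark function, is the standard way to check that an observed ranking difference between two stochastic optimizers is not an artifact of run-to-run noise.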
SSVUBA sensitivity analysis to parameters N and T shows that the performance of the proposed algorithm varies under changes in these two parameters. This is because the algorithm needs both sufficient search agents (population members, i.e., N) to scan the search space and a sufficient number of iterations (i.e., T) to identify the optimal area and converge towards the global optimum. Thus, as expected, increasing the T and N values improved SSVUBA performance and decreased the objective function values.
To further analyze the performance of SSVUBA in optimization applications, this proposed method, along with competitor algorithms, was implemented on the CEC 2017 test suite. The simulation results in this type of optimization challenge indicate the successful performance of SSVUBA in addressing this type of optimization problem. Comparing SSVUBA with competing algorithms, it was found that SSVUBA ranked first in most cases and was more efficient than the compared algorithms.

SSVUBA for Engineering Design Applications
In order to analyze the efficiency of SSVUBA in real-world applications, this optimizer has been employed to address four engineering problems: pressure vessel design, speed reducer design, welded beam design, and tension/compression spring design.

Pressure Vessel Design Problem
Pressure vessel design is an engineering challenge in which the design purpose is minimizing the total cost (material, forming, and welding) of the cylindrical pressure vessel [41]. The schematic of this issue is shown in Figure 8. The implementation results of SSVUBA and the eight competitor algorithms in achieving the optimal design for the pressure vessel are reported in Table 10. SSVUBA presents the optimal solution with the values of the variables equal to (0.7789938, 0.3850896, 40.3607, 199.3274) and the value of the objective function equal to 5884.8824. The statistical results of SSVUBA performance against the eight competitor algorithms in optimizing the pressure vessel problem are presented in Table 11. What can be seen from the statistical results is that SSVUBA has superior performance over the compared algorithms by providing better values in the statistical indicators. The behavior of the SSVUBA convergence curve while achieving the optimal solution for the pressure vessel design is presented in Figure 9.
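For reference, the sketch below evaluates the pressure vessel formulation as it is commonly stated in the literature; it is assumed, rather than confirmed by the text above, that the paper uses this exact form, and the function names are illustrative.

```python
import math

def pressure_vessel(x):
    """Cost objective of the standard pressure vessel benchmark [41].

    x = (Ts, Th, R, L): shell thickness, head thickness, inner radius,
    and cylinder length, in the standard literature formulation.
    """
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4
            + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4
            + 19.84 * x1 ** 2 * x3)

def pressure_vessel_constraints(x):
    """Inequality constraints g(x) <= 0 of the standard formulation."""
    x1, x2, x3, x4 = x
    return [
        -x1 + 0.0193 * x3,    # minimum shell thickness
        -x2 + 0.00954 * x3,   # minimum head thickness
        -math.pi * x3 ** 2 * x4
            - (4.0 / 3.0) * math.pi * x3 ** 3 + 1296000.0,  # minimum volume
        x4 - 240.0,           # maximum cylinder length
    ]
```

Evaluating the objective at the reported SSVUBA solution (0.7789938, 0.3850896, 40.3607, 199.3274) reproduces the reported cost of about 5884.88, which supports the assumption that this is the formulation used.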

Speed Reducer Design Problem
Speed reducer design is a minimization challenge whose main goal in optimal design is to reduce the weight of the speed reducer, which is depicted schematically in Figure 10 [42,43].

The results obtained from SSVUBA and the eight competing algorithms in optimizing the speed reducer design are presented in Table 12. Based on the simulation results, it is obvious that SSVUBA has provided the optimal design of this problem for the values of the variables equal to (3.50003, 0.700007, 17, 7.3, 7.8, 3.35021, 5.28668) and the value of the objective function equal to 2996.3904. The statistical results of the SSVUBA performance as well as the competitor algorithms in optimizing the speed reducer problem are reported in Table 13. The statistical results show the superiority of SSVUBA over the competitor algorithms. The SSVUBA convergence curve when solving the speed reducer design is shown in Figure 11.
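For reference, the sketch below evaluates the speed reducer weight objective in its standard literature form; it is assumed, rather than confirmed by the text above, that the paper uses this exact formulation, and the constraint set of [42,43] is omitted for brevity.

```python
def speed_reducer(x):
    """Weight objective of the standard speed reducer benchmark [42,43].

    x = (b, m, z, l1, l2, d1, d2): gear face width, tooth module, number
    of pinion teeth, lengths of the two shafts between bearings, and the
    two shaft diameters, in the standard literature formulation.
    """
    x1, x2, x3, x4, x5, x6, x7 = x
    return (0.7854 * x1 * x2 ** 2
                * (3.3333 * x3 ** 2 + 14.9334 * x3 - 43.0934)
            - 1.508 * x1 * (x6 ** 2 + x7 ** 2)
            + 7.4777 * (x6 ** 3 + x7 ** 3)
            + 0.7854 * (x4 * x6 ** 2 + x5 * x7 ** 2))
```

Evaluating this objective at the reported SSVUBA solution (3.50003, 0.700007, 17, 7.3, 7.8, 3.35021, 5.28668) reproduces the reported weight of about 2996.39, which supports the assumption.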

Welded Beam Design
Welded beam design is an engineering topic with the main goal of minimizing the fabrication cost of the welded beam, a schematic of which is shown in Figure 12 [26].

The optimization results for the welded beam design are reported in Table 14. Analysis of the simulation results shows that SSVUBA has provided the optimal design for the welded beam with the values of the variables equal to (0.205730, 3.4705162, 9.0366314, 0.2057314) and the value of the objective function equal to 1.724852. The statistical results obtained from the implementation of SSVUBA and the eight competitor algorithms on this design are presented in Table 15. Analysis of the results in this table shows that SSVUBA, with better values in the statistical indicators, provides superior performance in solving the welded beam design against the competitor algorithms. The SSVUBA convergence curve for the optimal solution of the welded beam design problem is shown in Figure 13.
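For reference, the sketch below evaluates the welded beam cost objective in its standard literature form; only the objective is shown, with the shear stress, bending stress, buckling, and deflection constraints of the full model in [26] omitted for brevity, and the variable interpretation is the common one assumed here rather than confirmed by the text.

```python
def welded_beam_cost(x):
    """Fabrication cost objective of the standard welded beam benchmark [26].

    x = (h, l, t, b): weld thickness, weld length, bar height, and bar
    thickness, in the standard literature formulation. The first term is
    the weld cost and the second is the bar material cost.
    """
    h, l, t, b = x
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)
```

Evaluating this objective at the reported SSVUBA solution (0.205730, 3.4705162, 9.0366314, 0.2057314) reproduces the reported cost of about 1.7249, which supports the assumption.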

Tension/Compression Spring Design Problem
Tension/compression spring design is an engineering challenge aimed at reducing the weight of the tension/compression spring, a schematic of which is shown in Figure 14 [26].

This problem's mathematical model minimizes f(x) = (x3 + 2)·x2·x1^2, subject to the constraints given in [26]. The results for the tension/compression spring design variables using SSVUBA and the compared methods are provided in Table 16. The simulation results reveal that SSVUBA provides the optimal solution with the values of the variables equal to (0.051704, 0.357077, 11.26939) and the value of the objective function equal to 0.012665. The statistical results of the implementation of SSVUBA and the compared algorithms for the tension/compression spring problem are presented in Table 17. The observations indicate the superiority of SSVUBA performance due to the provision of better values of the statistical indicators compared to the competitor algorithms. The SSVUBA convergence curve when achieving the optimal solution to the tension/compression spring problem is shown in Figure 15.
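The spring weight objective stated in the text can be evaluated directly; the sketch below uses the common variable interpretation from the literature, which is an assumption, and omits the deflection, shear stress, surge frequency, and geometry constraints of [26] for brevity.

```python
def spring_weight(x):
    """Weight objective of the tension/compression spring benchmark [26].

    x = (d, D, N): wire diameter, mean coil diameter, and number of
    active coils. Implements f(x) = (x3 + 2) * x2 * x1**2 as stated in
    the text.
    """
    x1, x2, x3 = x
    return (x3 + 2.0) * x2 * x1 ** 2
```

Evaluating this objective at the reported SSVUBA solution (0.051704, 0.357077, 11.26939) gives approximately 0.01267, matching the reported value of 0.012665 up to rounding of the printed variables.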

The SSVUBA's Applicability in Sensor Networks and Image Processing
Many complex problems in the field of image processing are the focus of extensive research to find efficient methods. In this area, local search approaches are commonly utilized for solving difficult problems. However, many problems in image processing are combinatorial and NP-hard. As optimization algorithms are population-based stochastic approaches, they are generally better suited to solving these complicated challenges. As a result, optimization algorithms such as the proposed SSVUBA can avoid becoming stuck in local optima and can frequently locate the globally optimal solution. Recent advancements have resulted in an increased use of artificial intelligence approaches for image processing. Today, wireless sensor networks are among the most popular wireless networks due to their various applications. These networks consist of a set of automated sensors that monitor physical or environmental conditions such as heat, sound, vibration, pressure, motion, or pollution. As a result, sensor networks are faced with a huge amount of valuable information. In this type of application, data analysis using classical methods is not very efficient or appropriate. Because of this, artificial intelligence approaches, such as the employment of the proposed SSVUBA for various applications in image processing and sensor networks, have become significant. The proposed SSVUBA approach is suitable for topics such as energy optimization in sensor networks, sensor network placement, network coding (NC) in wireless sensor networks, sensor network coverage optimization, clustering in sensor networks, medical image processing, pattern recognition, video processing, and so on.

Conclusions and Future Works
Numerous optimization issues have been defined in various disciplines of science that must be addressed by employing proper approaches. One of the most successful and extensively used approaches for tackling such issues is optimization algorithms, which belong to the category of random methods. To handle different optimization challenges, a novel optimization technique named "Selecting Some Variables to Update-Based Algorithm" (SSVUBA) was developed in this study. Making more use of the information of different members of the population, and adjusting the number of selected variables used to update the population during successive iterations, were the main ideas in the design of the proposed SSVUBA. The ability of SSVUBA to solve optimization problems was tested on fifty-three different objective functions. The results of the optimization of the unimodal functions indicated the strong ability of the proposed algorithm in the exploitation index and its delivery of solutions very close to the global optimum. The optimization results of the multimodal functions showed that SSVUBA, with its high capability in the exploration index, is able to accurately scan the search space of the problem and converge to the global optimum by passing through locally optimal areas. Further, in order to analyze the optimization results obtained from SSVUBA, these results were compared with the performance of well-known algorithms: PSO, TLBO, GWO, WOA, MPA, TSA, GSA, GA, RFO, RSA, AHA, and HBA. What is clear from the analysis of the simulation results is that SSVUBA has a strong ability to solve optimization problems by providing appropriate quasi-optimal solutions, and its performance is superior to and more competitive than that of similar algorithms. In order to further analyze SSVUBA in optimization, the proposed algorithm was employed to optimize four engineering design challenges.
The optimization results indicated the effective performance of SSVUBA in real-world applications and the provision of optimal values for design variables.
The authors provide various recommendations for future research, including the development of multi-objective and binary SSVUBA versions. Other proposals for future investigations of this work include using the proposed SSVUBA to solve optimization issues in many fields as well as real-world applications. The proposed SSVUBA approach opens up a wide range of future studies, including the employment of SSVUBA in wireless sensor networks, image processing, machine learning, signal denoising, artificial intelligence, engineering, feature selection, big data, data mining, and other optimization challenges.
As with all stochastic approaches to optimization problems, a limitation of the proposed SSVUBA is that it offers no guarantee that the solutions it provides are globally optimal. Another limitation of any random approach, including SSVUBA, is that it is always possible for researchers to develop new algorithms that provide more effective solutions to optimization issues. Moreover, according to the NFL theorem, a further limitation of SSVUBA is that its strong performance in solving one group of optimization applications gives no guarantee of the same performance in all other optimization applications.

Conflicts of Interest:
The authors declare no conflict of interest.

Appendix A
The information of the objective functions utilized in the simulation section is shown in Tables A1-A3.