An Improved Controlled Random Search Method

A modified version of a widely used global optimization method named controlled random search is presented here. This method is designed to estimate the global minimum of multidimensional symmetric and asymmetric objective functions. The new method modifies the original algorithm by incorporating a new sampling scheme, a new termination rule and the periodic application of a local search optimization algorithm to the sampled points. The new version is compared against the original on a set of benchmark functions from the relevant literature.


Introduction
Global optimization [1] is considered a problem of high complexity with many applications. The problem is defined as the location of the global minimum of a multidimensional function f(x):

x* = arg min_{x ∈ S} f(x)    (1)

where S ⊂ R^n is the search domain of the function. In global optimization, many problems that need to be solved can have symmetric solutions (minima), although this is not the rule. The location of the global optimum finds application in many areas such as physics [2,3], chemistry [4,5], medicine [6,7], economics [8], etc.

In modern theory there are two different categories of global optimization methods: the stochastic methods and the deterministic methods. The first category contains the vast majority of methods, such as simulated annealing [9-11], genetic algorithms [12-14], tabu search [15], particle swarm optimization [16-18], etc. A common method that also belongs to the stochastic family is the controlled random search (CRS) method [19], a procedure that maintains a population of trial solutions. This method initially creates a set of randomly selected points and repeatedly replaces the worst point in that set with a randomly generated one. This process continues until some termination criterion is satisfied. The CRS method has been used intensively in many problems such as geophysics problems [20,21], optimal shape design problems [22], the animal diet problem [23], the heat transfer problem [24], etc.
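The CRS loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, population size, iteration budget and the quadratic test objective are all assumed choices.

```python
import random

def crs_minimize(f, bounds, pop_size=25, max_iters=2000):
    """Basic controlled random search sketch: keep a population of
    trial points and repeatedly try to replace the worst one with a
    point reflected through the centroid of randomly chosen points."""
    n = len(bounds)
    pop = [[random.uniform(a, b) for a, b in bounds] for _ in range(pop_size)]
    for _ in range(max_iters):
        pop.sort(key=f)                       # best first, worst last
        # centroid G of n randomly selected points (excluding the worst)
        sample = random.sample(pop[:-1], n)
        G = [sum(p[i] for p in sample) / n for i in range(n)]
        # classic CRS trial point: reflect a random point through G
        x = random.choice(pop[:-1])
        z = [2 * G[i] - x[i] for i in range(n)]
        # reject out-of-box points; otherwise replace the worst point
        inside = all(a <= z[i] <= b for i, (a, b) in enumerate(bounds))
        if inside and f(z) < f(pop[-1]):
            pop[-1] = z
    return min(pop, key=f)

# usage: minimize a 2-D sphere function on [-5, 5]^2
best = crs_minimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)
```

The rejection of out-of-box trial points in this loop is exactly the rejection rate measured later in the experiments.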
This CRS method has been thoroughly analyzed by many researchers in the field, such as the work of Ali and Storey, where two new variants of the CRS method were proposed [25]. These variants introduced alternative techniques for the selection of the initial sample set and the usage of local search methods. Additionally, Pillo et al. [26] suggested a hybrid CRS method in which the base algorithm is combined with a Newton-type unconstrained minimization algorithm [27] to enhance the efficiency of the method on various test problems. Another work is that of Kaelo and Ali, who suggested [28] some modifications to the method, especially in the new-point generation step. Additionally, Filho and Albuquerque suggested [29] the usage of a distribution strategy to accelerate the controlled random search method. Tsoulos and Lagaris [30] suggested a new line search method based on genetic algorithms to improve the original CRS method.

The current work proposes three major modifications to the CRS method: a new point replacement strategy, a stochastic termination rule and the periodic application of a local search method. The first modification is used to better explore the domain range of the function. The second modification aims to terminate the method without wasting valuable computational time. The third modification speeds up the method by applying a small number of steps of a local search method. The new method introduces a new way to create trial points that was not present in the previous work [30], and it also replaces the expensive line search with a few calls to a local search optimization method.
The rest of this article is organized as follows: in Section 2, the major steps of the CRS method as well as the proposed modifications are presented; in Section 3, the results from the application of the proposed method on a series of benchmark functions are listed; and finally, in Section 4, some conclusions and guidelines for future research are presented.

Method Description
The controlled random search has a series of steps that are described in Algorithm 1. The changes proposed by the new method focus on three points:

1. The creation of a trial point (New_Point step) is performed using a new procedure described in Section 2.1.

2. In the Min_Max step, the stochastic termination rule described in Section 2.2 is used. The aim of this rule is to terminate the method when, with some certainty, no lower minima are to be found.

3. A few steps of a local search procedure are applied to the point z after the New_Point step. This procedure is used to bring the trial points closer to the corresponding minima. It speeds up the process of searching for new minima, although it obviously leads to an increase in function calls.
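The third point above can be illustrated with a short helper. The finite-difference steepest-descent steps below are a simple stand-in for the BFGS-type local search actually used in the paper; the step count, learning rate and difference width are arbitrary assumptions.

```python
def local_search_steps(f, z, steps=5, lr=0.01, h=1e-6):
    """Apply a few descent steps to pull the trial point z toward a
    nearby local minimum (a stand-in for the BFGS-type local search)."""
    z = list(z)
    for _ in range(steps):
        # central-difference estimate of the gradient of f at z
        grad = []
        for i in range(len(z)):
            zp, zm = list(z), list(z)
            zp[i] += h
            zm[i] -= h
            grad.append((f(zp) - f(zm)) / (2 * h))
        # one steepest-descent step
        z = [z[i] - lr * grad[i] for i in range(len(z))]
    return z
```

Because only a few steps are taken, each trial point costs a small, bounded number of extra function calls, which is the trade-off noted in point 3.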

A New Method for Trial Points
The proposed technique to compute the trial point z is shown in Algorithm 2. In this technique, the calculation of the trial point z does not contain a product with large values, as in the basic algorithm, so that the trial point does not land too far from the centroid. This avoids large jumps away from the centroid, which carries significant weight in the calculation of the starting point for local optimization. The method also takes into account the current minimum point, and not only a random point as in the original technique. With this modification, knowledge already gathered during the search is used to create a new point, in such a way that it lies close to the region of attraction of a local minimum.

A New Stopping Rule
It is quite common in optimization techniques to use a predefined number of maximum iterations as the stopping rule of the method. Even though this termination rule is easy to implement, it can require an excessive number of function calls before termination; therefore, a more sophisticated termination rule is needed. The termination rule proposed here is inspired by [31]. At every iteration k, the variance σ^(k) of the quantity f_min is calculated. If the optimization technique has not managed to find a new estimate of the global minimum for some iterations, then the global minimum has probably been discovered and the algorithm should terminate. The termination rule is defined as follows; terminate when:

σ^(k) ≤ (1/2) σ^(k_last)

where k_last represents the last iteration in which a new global minimum was located.
The final outcome of the algorithm is the discovered global minimum z * .
The quantity σ^(k) decreases continuously over time, since either the method finds a lower estimate for the global minimum or the global minimum has already been found. In addition, this quantity is by construction always non-negative and is therefore a good candidate for use in a termination criterion. If the global minimum has already been found, or the method is no longer able to find a new estimate for it, then this quantity tends to zero, and therefore the execution of the algorithm can be interrupted when it falls below a threshold. This threshold may be a fraction of the value of σ^(k) at the last iteration where a new estimate for the global minimum was found. If we want to allow the algorithm to continue for several generations, this fraction can be small, e.g., 0.25. If we want it to stop sooner, a good estimate for the fraction is 0.75. A good compromise between these values is 0.5, which is chosen here.
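The stopping rule of this section can be sketched as a small bookkeeping class. The incremental structure and the attribute names are illustrative assumptions; the variance is taken over the recorded f_min values and the fraction is fixed at the value 0.5 chosen above.

```python
class StochasticStopRule:
    """Track the variance of the best value f_min over iterations and
    stop when it drops to half of its value at the iteration k_last
    where a new global-minimum estimate was last found."""
    def __init__(self, fraction=0.5):
        self.fraction = fraction
        self.history = []        # f_min recorded at every iteration k
        self.sigma_last = None   # sigma at iteration k_last

    def _variance(self):
        m = sum(self.history) / len(self.history)
        return sum((v - m) ** 2 for v in self.history) / len(self.history)

    def update(self, f_min, improved):
        """Record f_min for the current iteration; return True when
        the termination criterion sigma(k) <= fraction * sigma(k_last)
        is satisfied."""
        self.history.append(f_min)
        sigma = self._variance()
        if improved or self.sigma_last is None:
            self.sigma_last = sigma   # remember sigma at k_last
            return False
        return sigma <= self.fraction * self.sigma_last
```

As long as new minima keep being found, sigma_last is refreshed and the rule stays silent; once improvement stalls, the shrinking variance eventually crosses the threshold and fires.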

Test Functions
The modified version of the CRS was tested against the traditional CRS on a series of benchmark functions from the relevant literature [32,33]. The following functions were used:

Algorithm 2: The steps of the new proposed method to create more efficient trial points for the controlled random search method.

1. Calculate the centroid G of the selected points.

2. Compute a trial point z̄ = G − (1/n) z_{n+1}.
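Assuming the centroid in Algorithm 2 is taken over n randomly selected population points together with the current best point, and reading the trial-point formula as z = G − (1/n)·z_{n+1}, the construction can be sketched as follows. Both assumptions, and the helper's name, should be checked against the paper's Algorithm 2.

```python
import random

def new_trial_point(population, f, bounds):
    """Sketch of the proposed trial-point rule: build a centroid that
    includes the current best point, then step away from one extra
    random point by a factor 1/n instead of the full reflection
    2G - x used by classic CRS."""
    n = len(bounds)
    best = min(population, key=f)
    sample = random.sample(population, n)
    # centroid over the n sampled points and the current best point
    G = [(sum(p[i] for p in sample) + best[i]) / (n + 1)
         for i in range(n)]
    extra = random.choice(population)
    # small step z = G - (1/n) * extra keeps z close to the centroid
    z = [G[i] - extra[i] / n for i in range(n)]
    # the caller rejects z if it falls outside the domain, as in CRS
    return z
```

Because the offset is scaled by 1/n rather than fully reflected, the trial point stays near the centroid, which is what drives the near-zero rejection rates reported below.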

Results
In the experiments, two different quantities were measured: the rejection rate in the New_Point step and the average number of function calls required. In the first case, we measured the percentage of points rejected during the New_Point step, i.e., created points that lie outside the domain range of the function. All experiments were conducted 30 times, with a different seed for the random number generator each time. The local search method used in the experiments, denoted as localsearch(x), was a BFGS variant due to Powell [35]. The experiments were conducted on an Intel i7-10700T CPU (Intel, Mountain View, CA, USA) at 2.00 GHz equipped with 16 GB of RAM. The operating system was Debian Linux and all the code was compiled using an ANSI C++ compiler.
The experimental results are listed in Table 1. The column FUNCTION stands for the name of the objective function. The column CRS-R stands for the rejection rate for the CRS method, while the column NEWCRS-R displays the same measure for the current method. Similarly, the column CRS-C represents the average function calls for the CRS method and the column NEWCRS-C stands for the average function calls of the proposed method. Additionally, a statistical comparison between the CRS and the proposed method is shown in Figure 1.
The proposed method almost eliminates the rejection rate in every test function. This is evidence that the new point-creation mechanism proposed here is more accurate than the traditional one. Additionally, the proposed method requires a lower number of function calls than the CRS method, as one can deduce from the relevant columns and the statistical comparison. The same information is presented graphically in Figure 2, where the percentage comparison between the two methods across the test problems is outlined. Moreover, on the most difficult problems the proposed method appears even more clearly superior to the original one in number of calls, as the combination of the termination rule with the improved point generation technique terminates the method much faster and more reliably than the original method.
Additionally, the execution time for every test function was measured, and this information is outlined in Table 2. The column CRS-TIME stands for the average execution time of the original CRS method, the column NEWCRS-TIME represents the average execution time for the proposed method and the column DIFF is the calculated percentage difference between the previously mentioned columns. It is evident that the proposed method requires shorter execution times than the original one, and in addition, the difference between the two methods is more pronounced on large problems. This phenomenon is also reflected in Figure 3, where a graphical representation of the average execution times of the two methods is shown for the EXP problem over different numbers of dimensions.

Conclusions
Three important modifications were proposed in the current work for the CRS method. The first modification has to do with the new test point generation process, which seems to be more accurate than the original one. The new method creates points that are within the domain range of the function almost every time. The second change adds a new termination rule based on stochastic observations. The third proposed modification applies a few steps of a local search procedure to every trial point created by the algorithm. Judging by the results, it seems that the proposed changes have two important effects. The first is that the success of the algorithm in creating valid test points is significantly improved. The second is the large reduction in the number of function calls required to locate the global minimum.
Future research may include the exploration of the usage of additional stopping rules and the parallelization of different aspects of the method in order to speed up the optimization procedure as well as to take advantage of multicore programming environments.