A New Approach for Dynamic Stochastic Fractal Search with Fuzzy Logic for Parameter Adaptation

Metaheuristic algorithms are widely used as optimization methods due to their global exploration and exploitation characteristics, which yield better results than simple heuristics. Stochastic Fractal Search (SFS) is a recent method inspired by the process of stochastic growth in nature and by the mathematical concept of fractals. Building on its chaotic-stochastic diffusion property, an improved Dynamic Stochastic Fractal Search (DSFS) optimization algorithm is presented. The DSFS algorithm was tested on benchmark functions, including unimodal, multimodal, hybrid and composite functions, to evaluate its performance with dynamic parameter adaptation based on type-1 and type-2 fuzzy inference models. The main contribution of the article is the use of fuzzy logic for the dynamic adaptation of the diffusion parameter. This parameter governs the creation of new fractal particles, and diversity and iteration are the inputs used by the fuzzy system to control the diffusion values.


Introduction
Metaheuristic algorithms are applied to optimization problems due to characteristics that help in searching for the global optimum, whereas simple heuristics are mostly capable of finding local optima and are therefore not very effective as optimization solutions.
Stochastic methods have the peculiarity of being characterized by stochastic random variables. In the current literature several of the most popular stochastic algorithms for optimization can be found, like the Genetic Algorithm (GA) [1], [2], which is inspired by biological evolution with random genetic combinations and mutations in a chromosome, and is based on the selection, crossover, mutation and replacement operators. Particle Swarm Optimization (PSO) was inspired by the natural behavior of fish and birds moving in swarms, where each particle moves randomly to find the global optimum, updating its speed and position until the best global solution is found [3]. Tabu Search (TS) [4], [5] is an iterative method that builds meta-strategies to construct a neighborhood and avoid getting trapped in local optima. These algorithms are among the most widely used; for example, in [6] a hybrid particle swarm optimization algorithm incorporating chaos is proposed, where the adaptive inertia weight factor (AIWF) is used to enhance PSO in efficiently balancing the diversification and intensification abilities of the algorithm. In this case, the hybridization of PSO with AIWF and chaos is used to build a chaotic PSO (CPSO), prudently combining the population-based evolutionary search ability of PSO with the behavior of chaotic search. Furthermore, [7] proposes a new PSO algorithm that relies on chaotic equation maps for parameter adaptation; this is done by using chaotic number generators every time the classical PSO algorithm needs a random number [8]. On the other hand, [9] develops an improved particle swarm optimization (IPSO) algorithm, which uses a dynamic inertia weight to enhance the performance of the traditional PSO. In [10] the authors present an invasive weed optimization (IWO) metaheuristic, a weed-colony-based population optimization approach that is also based on chaos theory.
In addition to improvements to stochastic optimization methods, there are also combinations or hybridizations with fuzzy logic to improve performance or to reach a specific solution; for example, in [11] an adaptive fuzzy control approach for controlling chaotic unified systems was proposed. In [12]-[15] a multi-metaheuristic model is developed for the optimal design of fuzzy controllers, and an enhancement of the Bat algorithm with dynamic parameter adaptation is carried out in [16]. There are also different applications using metaheuristics and fuzzy logic for optimization [17]-[19]. In this article we focus on testing the efficiency of the Dynamic Stochastic Fractal Search (DSFS) method in optimizing unimodal, multimodal, hybrid and composite functions. First, the Stochastic Fractal Search (SFS) algorithm is analyzed with respect to its chaotic-stochastic characteristics in the diffusion process, in which each particle is generated and moved in a random stochastic way. Second, it is observed that the stochastic movement of each particle may not be optimal, because the fractal being formed may fail to explore and exploit the entire search space. Therefore, diversity is introduced at each iteration to gradually approach the global optimum by seeking the particles with the best fitness, and a fuzzy inference system is adapted for this adjustment. To assess the efficiency of the improved method, 30 functions with different dimensions were evaluated, generating satisfactory results compared with other algorithms. The main contribution of this article is the improvement of the SFS method: the original algorithm has a disadvantage in the diffusion parameter because it uses only the Gaussian distribution as its randomness mechanism, so the particles might not be able to cover the whole search space. To compensate, a dynamic adjustment of this parameter was added to achieve a better movement of each newly generated fractal particle. This improvement was implemented using type-1 and type-2 fuzzy systems that make dynamic changes through a controller with diversity as input 1 and iteration as input 2. Diversification is in charge of spreading the particles throughout the search area at each iteration; as a result, a dynamically adapted method is obtained that does not stagnate as quickly in local optima, thus reaching the global optimum and improving the effectiveness of the Dynamic Stochastic Fractal Search method.

Materials and Methods
Stochastic Fractal Search (SFS)
The term fractal was first used by Benoit Mandelbrot [20], who described in his theory of fractals the geometric patterns generated in nature. There are several methods to generate fractals, such as iterated function systems [21], strange attractors [22], L-systems [23], finite subdivision rules [24] and random fractals [25]. Among the generators of random fractals we find Diffusion Limited Aggregation (DLA), which builds fractals starting from an initial particle, called the seed, usually situated at the origin [26], [27]. Other particles are then randomly generated near the origin, causing diffusion. This diffusion process is carried out as a random walk in which the diffusing particle adheres to the initial particle; the process is repeated iteratively and stops only when a group of particles has formed. While the group is forming, the probability that a particle gets stuck at the boundary increases with respect to the probability of it reaching the interior, forming a cluster with a branch-like structure. These branches can form chaotic-stochastic patterns, such as the formation of lightning in nature [28].
In Stochastic Fractal Search (SFS) [29] two important processes occur: the diffusion and the update, respectively. In the first one, the particles diffuse near their position to fulfill the intensification property (exploitation), with this the possibility of finding the global minima is increased, and at the same time avoiding getting stuck at a local minima.
In the second one, a simulation of how one point is updating its position using the positions of other points in the group is made. In this process, the best particle produced from diffusion is the only one that is taken into account, and the remaining particles are eliminated. The equations used in each of the aforementioned processes are explained below.
The particle population P is randomly produced considering the problem constraints after setting the lower (LB) and the upper (UB) limits:

$$P_i = LB + \varepsilon \cdot (UB - LB)$$

where $\varepsilon$ is a number randomly produced in the range [0, 1].
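Under the stated initialization rule, a population can be generated as follows (a minimal sketch; the function and variable names are our own):

```python
import numpy as np

def init_population(n, dim, lb, ub, rng=None):
    """Randomly initialize n particles inside the box [lb, ub]:
    P_i = lb + eps * (ub - lb), with eps drawn uniformly from [0, 1]."""
    rng = rng if rng is not None else np.random.default_rng(0)
    eps = rng.random((n, dim))
    return lb + eps * (ub - lb)

P = init_population(20, 5, lb=-100.0, ub=100.0)
```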
The process of diffusion (exploitation in fractal search) is expressed as follows:

$$GW_1 = Gaussian(\mu_{BP}, \sigma) + (\varepsilon \cdot BP - \varepsilon' \cdot P_i)$$

$$GW_2 = Gaussian(\mu_{P}, \sigma)$$

$$\sigma = \left| \frac{\log(g)}{g} \cdot (P_i - BP) \right|$$

where $\varepsilon$, $\varepsilon'$ are numbers randomly generated in the range [0, 1], $BP$ represents the best position of the points, $P_i$ indicates the $i$-th point, and $Gaussian$ represents a normal distribution that randomly generates numbers with a given mean and standard deviation, where $\log(g)/g$ tends to zero as the generation counter $g$ increases.
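The first Gaussian walk of the diffusion step can be sketched as follows. This is a sketch of the standard SFS formulation, not the authors' exact implementation; the shrinking factor |log(g)/g| keeps later walks closer to the current positions:

```python
import numpy as np

def diffuse(point, best, g, rng):
    """One Gaussian diffusion walk around the best point (SFS sketch).

    sigma = |log(g)/g * (point - best)| shrinks as the generation
    counter g grows, narrowing the walk over time.
    """
    eps, eps_p = rng.random(), rng.random()
    sigma = np.abs(np.log(g) / g * (point - best))
    # walk centred on the best point, plus a stochastic pull term
    return rng.normal(best, sigma) + (eps * best - eps_p * point)

rng = np.random.default_rng(1)
new_point = diffuse(np.array([0.5, -0.5]), np.array([0.1, 0.2]), g=2, rng=rng)
```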
The update process (representing exploration in fractal search) is based on the probability

$$Pa_i = \frac{rank(P_i)}{N}$$

where $N$ represents the number of particles and $Pa_i$ is the estimated particle probability, whose rank is given by the "rank" function. The particles are ranked according to their fitness value, and a probability $Pa_i$ is then assigned to each particle $P_i$.
If the assigned probability is lower than a random $\varepsilon$, the $j$-th component of the point is updated as

$$P'_i(j) = P_r(j) - \varepsilon \cdot (P_t(j) - P_i(j))$$

where the updated component is given by $P'_i(j)$, and $P_r$, $P_t$ are different points selected from the group in a random fashion. $P'_i$ replaces $P_i$ if it achieves a better fitness value.
Once the first updating stage is finished, the second one begins with a ranking of all points based on Eqs. (7) and (8). As previously mentioned, if the probability is lower than a random $\varepsilon$, the current point $P'_i$ is changed using

$$P''_i = P'_i - \hat{\varepsilon} \cdot (P'_t - BP), \quad \varepsilon' \le 0.5$$

$$P''_i = P'_i + \hat{\varepsilon} \cdot (P'_t - P'_r), \quad \varepsilon' > 0.5$$

in which the $t$ and $r$ indices must be different. Of course, the new $P''_i$ replaces $P'_i$ if it has a better fitness value.
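The two-stage update strategy can be sketched as follows. This is a simplified single-loop sketch: the rank-based gating and the two movement rules follow the standard SFS description, while the helper names and the rank convention (best fitness receives the highest probability of surviving unchanged) are assumptions:

```python
import numpy as np

def update_positions(P, fitness, best, rng):
    """Sketch of the SFS update strategy: a component-wise first stage
    and a whole-point second stage, both gated by Pa_i = rank(P_i)/N."""
    N, dim = P.shape
    # best (lowest) fitness gets the highest rank, hence Pa close to 1
    ranks = np.argsort(np.argsort(-fitness))
    Pa = (ranks + 1) / N
    P_new = P.copy()
    for i in range(N):
        # first stage: component-wise recombination with random particles
        for j in range(dim):
            if Pa[i] < rng.random():
                r, t = rng.choice(N, size=2, replace=False)
                P_new[i, j] = P[r, j] - rng.random() * (P[t, j] - P[i, j])
        # second stage: move the whole point relative to other points
        if Pa[i] < rng.random():
            t, r = rng.choice(N, size=2, replace=False)
            eps_hat = rng.random()
            if rng.random() <= 0.5:
                P_new[i] = P_new[i] - eps_hat * (P[t] - best)
            else:
                P_new[i] = P_new[i] + eps_hat * (P[t] - P[r])
    return P_new

rng = np.random.default_rng(2)
P = rng.uniform(-5.0, 5.0, size=(10, 3))
fitness = (P ** 2).sum(axis=1)          # toy sphere fitness, minimized
best = P[np.argmin(fitness)]
P_next = update_positions(P, fitness, best, rng)
```

In a full run, each modified point would replace its predecessor only when its fitness improves; that greedy acceptance step is omitted here for brevity.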

Proposed Dynamic Stochastic Fractal Search (DSFS)
As mentioned above, the SFS [30]-[32] has two important processes: the diffusion method and the updating strategy. In the diffusion method a particle called the seed originates, and others are generated and adhere to it through a diffusion that forms a chaotic-stochastic fractal branch. Taking this fact into account, the Stochastic Fractal Search (SFS) method was improved by adding diversity to the diffusion process of the particles at each iteration; in this way, the Gaussian random walk is assisted and the particles have more possibilities to exploit the search space, so they do not stagnate at a local optimum. To control the diffusion parameter, a fuzzy inference system was introduced that dynamically adjusts the diffusion of the particles within the range [0, 1]. This fuzzy control system has two inputs (iteration and diversity) and one output (diffusion), as illustrated in Figure 1. In this case, Eqs. (11) and (12) represent diversity and iteration, respectively.
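Since Eqs. (11) and (12) are not reproduced here, the two fuzzy inputs can be sketched under assumed forms: diversity as the mean distance of the particles to the population centroid, and iteration as the normalized iteration counter:

```python
import numpy as np

def fuzzy_inputs(P, t, t_max):
    """Sketch of the two fuzzy-system inputs (assumed forms).

    Diversity: mean Euclidean distance of the particles from the
    population centroid. Iteration: the normalized counter t / t_max.
    Both are assumed to be scaled into [0, 1] before fuzzification.
    """
    centroid = P.mean(axis=0)
    diversity = np.linalg.norm(P - centroid, axis=1).mean()
    iteration = t / t_max
    return diversity, iteration

corners = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
div, it = fuzzy_inputs(corners, t=25, t_max=100)
```

With this definition, a collapsed population (all particles identical) yields zero diversity, which is exactly the situation in which the rule base below would raise diffusion to re-spread the particles.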

If (Iteration is Low) and (Diversity is Low) then (Diffusion is High)
If (Iteration is Low) and (Diversity is Medium) then (Diffusion is Medium)

If (Iteration is Medium) and (Diversity is Medium) then (Diffusion is Medium)
If (Iteration is Medium) and (Diversity is High) then (Diffusion is Medium)

If (Iteration is High) and (Diversity is High) then (Diffusion is Low)
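The listed rules can be evaluated with a minimal fuzzy sketch. The membership-function parameters and the output values below are assumptions for illustration only, since the paper's actual partitions are defined by Eqs. (9) and (10) and Figure 1:

```python
def tri(x, a, b, c):
    """Triangular membership function with vertices a < b < c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# assumed Low/Medium/High partitions over the normalized domain [0, 1]
MFS = {"Low": (-0.5, 0.0, 0.5), "Medium": (0.0, 0.5, 1.0), "High": (0.5, 1.0, 1.5)}
# assumed output levels for the diffusion consequents
OUT = {"Low": 0.1, "Medium": 0.5, "High": 0.9}
# (iteration set, diversity set, diffusion set), as listed in the text
RULES = [("Low", "Low", "High"), ("Low", "Medium", "Medium"),
         ("Medium", "Medium", "Medium"), ("Medium", "High", "Medium"),
         ("High", "High", "Low")]

def diffusion(iteration, diversity):
    """Weighted-average evaluation of the rule base; AND is min."""
    num = den = 0.0
    for it_set, div_set, out_set in RULES:
        w = min(tri(iteration, *MFS[it_set]), tri(diversity, *MFS[div_set]))
        num += w * OUT[out_set]
        den += w
    return num / den if den > 0 else 0.5
```

Under these assumed partitions, early iterations with low diversity drive diffusion high (more spreading), while late iterations with high diversity drive it low, matching the intent of the rules above.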
Eqs. (9) and (10) mathematically define the geometrical shape of the triangular membership functions for type-1 and type-2 fuzzy logic, respectively [33]-[37]. The type-1 triangular function with parameters $a < b < c$ is

$$\mu(x) = \max\left(\min\left(\frac{x - a}{b - a}, \frac{c - x}{c - b}\right), 0\right)$$

and the interval type-2 version is built from a lower and an upper membership function whose parameters satisfy $a_1 < a_2$, $b_1 < b_2$, $c_1 < c_2$, $d_1 < d_2$, forming the footprint of uncertainty. Diversity contributes to the stochastic movement of the particles, giving them more possibility of exploiting the entire search space; at each iteration the diffusion is dynamically adjusted to move closer to the global optimum, thus improving the diffusion process and therefore the efficiency of the method.
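A minimal sketch of the triangular memberships for both fuzzy-logic types follows. Modeling the type-2 footprint of uncertainty as a lower/upper triangle pair is an assumption consistent with common interval type-2 practice, not the paper's exact parameterization:

```python
def tri(x, a, b, c):
    """Type-1 triangular membership function with vertices a < b < c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def tri_it2(x, lower, upper):
    """Interval type-2 membership as a pair of type-1 triangles.

    `lower` and `upper` are (a, b, c) vertex triples; the region between
    the two memberships is the footprint of uncertainty (FOU).
    """
    return tri(x, *lower), tri(x, *upper)
```

For any point inside both supports, the lower membership never exceeds the upper one, which is what makes the pair a valid interval type-2 set.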

Experimental results
To assess the effectiveness of the Dynamic Stochastic Fractal Search (DSFS) method, the 30 functions of the CEC'2017 competition [38], summarized in Table 1, were evaluated. Table 1 contains several types of functions: unimodal, multimodal, hybrid and composite. To compare the optimization performance of the proposal with respect to other methods, different numbers of dimensions (10, 30, 50 and 100) and different parameter-adaptation strategies (using type-1 and type-2 fuzzy systems) were used. The CEC'2017 benchmark functions, which have been widely used in the literature [39], are considered in the tests. Table 2 presents the results for 10 dimensions. As can be seen, the results using type-1 and type-2 fuzzy systems did not show a significant difference with 10 dimensions; even so, the results obtained were good on average, because most runs reached the global optimum of the functions. Table 3 has the same structure as the previous one; the difference is that these results were obtained with 30 dimensions, where the algorithm has a wider search space. In addition, the functions being optimized are complex; despite this, the method approached the optimum of each function, and the values obtained with both types of fuzzy logic were also very close. Multimodal functions are more difficult to optimize than unimodal ones because the algorithm must escape or avoid local optima to arrive at the global optimal solution. In this study, not only unimodal and multimodal functions are optimized, but also hybrid and composite functions with different numbers of dimensions, as can be seen in Tables 4 and 5 with 50 and 100 dimensions, respectively.
In addition, the values obtained with the Type-1 and Type-2 fuzzy variants for parameter adaptation show that the method had some degree of difficulty in reaching the global optimum; even so, they provide good approximation values, showing that the improved method is efficient in optimization tasks.

Discussion of Results
In the literature, we can find the hybrid firefly and particle swarm optimization algorithm for solving computationally expensive problems [40]. The combination of the Firefly Algorithm and Particle Swarm Optimization (HFPSO) generated the best results in the comparison made in the reference article with respect to the other five optimization algorithms. For this reason, HFPSO is used as the reference in Tables 8 and 9, where it is represented by the blue line. It should be remembered that the dimensions here are high, and the performance of the methods is therefore not the same as with lower dimensions, where the efficiency of the algorithms is much better for each of the functions.
To determine which of HFPSO or DSFS provided the result closest to the optimum for each function, a statistical comparison was performed using the parametric z-test. The Z statistic is expressed mathematically as

$$Z = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{\dfrac{\sigma_1^2}{n_1} + \dfrac{\sigma_2^2}{n_2}}}$$

where $\bar{x}_1$, $\bar{x}_2$ are the sample means, $\sigma_1$, $\sigma_2$ the standard deviations, and $n_1$, $n_2$ the sample sizes of the two methods. As can be seen in Table 10, column 7, the values obtained by the z-test provide statistical evidence that the DSFS method using type-1 fuzzy logic is significantly better than HFPSO for functions with 10 dimensions (bold indicates best values).
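The z statistic above can be computed directly from the sample means, standard deviations and sample sizes of the two methods (a minimal sketch):

```python
import math

def z_test(mean1, mean2, std1, std2, n1, n2):
    """Two-sample z statistic:
    Z = (x1_bar - x2_bar) / sqrt(s1^2/n1 + s2^2/n2)."""
    return (mean1 - mean2) / math.sqrt(std1 ** 2 / n1 + std2 ** 2 / n2)

# example: two samples of 50 runs each, unit standard deviation
z = z_test(1.0, 0.0, 1.0, 1.0, 50, 50)
```

For a one-sided test at the 0.05 significance level, the null hypothesis of equal means would be rejected when the statistic exceeds the critical value 1.645.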

Conclusions
In conclusion, we can describe stochastic methods as metaheuristics that include basic algorithms for global stochastic optimization, such as random search, which helps the dispersion of the particles to reach the global optimum and avoid local stagnation. In this study, an improvement was made to a recent method that has been used in several optimization problems. This algorithm has two important processes for its operation, diffusion and updating, which make up the most important part of Stochastic Fractal Search. After that, a comparison with the hybrid firefly algorithm and particle swarm optimization (HFPSO) was also made, showing a significant statistical advantage for the DSFS method proposed in this paper. In addition, since HFPSO was better than FA, PSO, HPSOFF and FFPSO [40], the proposed DSFS is also better than these metaheuristic optimization algorithms.
It was shown that the improvement applied to the method was satisfactory in terms of efficiency and optimization performance. This improvement was made by adding the iteration and diversity equations to help the chaotic-stochastic movement of the particles, and the diffusion was adjusted with a fuzzy inference controller. Finally, it can be concluded that the combination of stochastic metaheuristics with fuzzy logic can generate good results for the efficiency and improvement of optimization algorithms, as developed in this study. As future work, we plan to apply the proposed DSFS to real-world problems in different areas of application.