# Surfing on Fitness Landscapes: A Boost on Optimization by Fourier Surrogate Modeling


## Abstract


## 1. Introduction

## 2. Materials and Methods

#### 2.1. Fuzzy Self-Tuning PSO (FST-PSO)

Algorithm 1: Pseudocode of the FST-PSO algorithm.

#### 2.2. Fitness Landscape Surrogate Modeling with Fourier Filtering (surF)

- a set $A=\{{\overrightarrow{x}}_{1},\cdots ,{\overrightarrow{x}}_{\sigma}\}$ of $\sigma $ points, with $\sigma \ll {\rho}^{D}$, is defined by sampling $f$ uniformly in ${[\ell ,u]}^{D}$;
- a surrogate $\widehat{f}$ of $f$ is defined in the following way, for each $\overrightarrow{x}\in {[\ell ,u]}^{D}$:
  - (a) if $\overrightarrow{x}$ is inside the convex hull of the points in A, then a triangulation of the points in A is constructed and the value of $\widehat{f}(\overrightarrow{x})$ is obtained by linear interpolation. For example, in two dimensions, $\overrightarrow{x}$ will be contained in a triangle defined by three points ${\overrightarrow{x}}_{i},{\overrightarrow{x}}_{j},{\overrightarrow{x}}_{k}\in A$, and $\widehat{f}(\overrightarrow{x})$ will be a linear combination of $f({\overrightarrow{x}}_{i})$, $f({\overrightarrow{x}}_{j})$, and $f({\overrightarrow{x}}_{k})$;
  - (b) if $\overrightarrow{x}$ is outside the convex hull of the points in A, then $\widehat{f}(\overrightarrow{x})=f({\overrightarrow{x}}^{\prime})$, where ${\overrightarrow{x}}^{\prime}\in A$ is the point in A that is nearest to $\overrightarrow{x}$.
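The interpolation rules (a) and (b) can be sketched with SciPy, whose `LinearNDInterpolator` builds a Delaunay triangulation of the sample points and returns NaN outside their convex hull, where a nearest-neighbor fallback then applies. This is a minimal illustration, not the published surF code; the function name `build_surrogate` is ours:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator, NearestNDInterpolator

def build_surrogate(points, values):
    """Rules (a)/(b): piecewise-linear interpolation over a triangulation
    of the sampled points A inside their convex hull; nearest-neighbor
    fallback outside of it."""
    linear = LinearNDInterpolator(points, values)    # triangulates A
    nearest = NearestNDInterpolator(points, values)
    def f_hat(x):
        x = np.atleast_2d(x)
        y = linear(x)
        # LinearNDInterpolator yields NaN outside the convex hull of A
        return np.where(np.isnan(y), nearest(x), y)
    return f_hat
```

For instance, sampling $f(x,y)=x+y$ at the corners of the unit square reproduces $f$ exactly inside the square (rule (a)) and returns the value of the nearest corner outside of it (rule (b)).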

Algorithm 2: Pseudocode of the surF algorithm.

#### 2.3. The Search on the Smoothed Landscape: Coupling surF with FST-PSO (F3ST-PSO)

- the surrogate model represents a smoothed version of the original fitness landscape, whose “smoothness” can be tuned by means of the $\gamma $ hyperparameter;
- the evaluation of a candidate solution on the surrogate model requires little computational effort. This cost can be far smaller than that of evaluating the original fitness function, especially in the case of real-world engineering or scientific problems (e.g., parameter estimation of biochemical systems [29], integrated circuits optimization [14], vehicle design [15]);
- although an optimization performed on the surrogate model (e.g., using FST-PSO) does not require any evaluation of the original fitness function, it can provide useful information about the fitness landscape and the likely position of optimal solutions;
- the information about the optimal solutions found on the surrogate model can be used for a new optimization, leveraging the original fitness function.
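The smoothing controlled by $\gamma$ can be made concrete with a small NumPy sketch: the sampled fitness values are placed on a regular grid of $\rho$ points per dimension, the DFT is computed, all but the $\gamma$ lowest-frequency coefficients per axis are discarded, and the inverse DFT yields the smoothed landscape. The function name `fourier_smooth` and the exact masking rule are our assumptions; the published surF implementation may mask coefficients differently:

```python
import numpy as np

def fourier_smooth(grid, gamma):
    """Low-pass filter a regularly sampled fitness landscape: keep only
    the gamma lowest-frequency DFT coefficients along each axis."""
    F = np.fft.fftn(grid)
    for axis, n in enumerate(grid.shape):
        # integer frequency index of each coefficient along this axis
        k = np.abs(np.fft.fftfreq(n, d=1.0 / n))
        shape = [1] * grid.ndim
        shape[axis] = n
        F = F * (k < gamma).reshape(shape)
    # the imaginary residue is numerical noise for a real-valued input
    return np.real(np.fft.ifftn(F))
```

With a small $\gamma$ only the broad trends of the landscape survive; increasing $\gamma$ restores progressively finer structure (cf. Figure 3).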

- a part of the fitness evaluations budget is reserved for surF to randomly sample the search space and create the surrogate model;
- a preliminary optimization on the surrogate model is performed with FST-PSO, to identify an optimal solution ${\overrightarrow{g}}^{\simeq}$;
- a new FST-PSO instance is created, and ${\overrightarrow{g}}^{\simeq}$ is added to the initial random population;
- a new optimization is performed, exploiting the original fitness function and using the remaining budget of fitness evaluations;
- a new optimal solution ${\overrightarrow{g}}^{\mathtt{real}}$ is determined and returned as a result of the whole optimization.
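The budget split in the steps above can be sketched as follows. To keep the example self-contained, a plain random search stands in for FST-PSO and a crude nearest-sample lookup stands in for surF's Fourier-filtered surrogate; all names here are ours, not the authors' API:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_search(f, lo, hi, dim, evals, seed_points=()):
    """Stand-in optimizer used in place of FST-PSO for this sketch."""
    pop = list(seed_points)
    pop += [rng.uniform(lo, hi, dim) for _ in range(evals - len(pop))]
    return min(pop, key=f)

def f3stpso_sketch(f, lo, hi, dim, budget, sigma):
    # step 1: reserve sigma real evaluations to sample the landscape
    samples = rng.uniform(lo, hi, (sigma, dim))
    fitnesses = np.array([f(x) for x in samples])
    # crude surrogate standing in for surF's smoothed model
    def f_hat(x):
        return fitnesses[np.argmin(np.linalg.norm(samples - x, axis=1))]
    # step 2: preliminary optimization on the surrogate, which costs
    # no evaluations of the real fitness function
    g_sur = random_search(f_hat, lo, hi, dim, evals=budget)
    # steps 3-5: seed g_sur into a fresh optimization on the real f,
    # spending only the remaining budget of real evaluations
    return random_search(f, lo, hi, dim, evals=budget - sigma,
                         seed_points=[g_sur])
```

The key accounting point is that the surrogate-phase search is "free": only the initial $\sigma$ samples and the final phase consume the real evaluation budget.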

#### 2.4. Frequency of the Optimum Conjecture

## 3. Results and Discussion

#### 3.1. Generation of Surrogate Models by surF

#### 3.2. Optimization of Benchmark Functions by F3ST-PSO

#### 3.3. Optimization of the CEC 2005 Test Suite by F3ST-PSO

## 4. Conclusions

The implementation is distributed as a Python package and can be installed with `pip install surfer`.

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning |
---|---|
DFT | Discrete Fourier Transform |
EC | Evolutionary Computation |
FRBS | Fuzzy Rule Based System |
FST-PSO | Fuzzy Self-Tuning Particle Swarm Optimization |
F3ST-PSO | Fourier Filtering Fuzzy Self-Tuning Particle Swarm Optimization |
PSO | Particle Swarm Optimization |
surF | Surrogate modeling with Fourier filtering |

## References

1. Bhosekar, A.; Ierapetritou, M. Advances in surrogate based modeling, feasibility analysis, and optimization: A review. Comput. Chem. Eng. **2018**, 108, 250–267.
2. Box, G.E.; Draper, N.R. Empirical Model-Building and Response Surfaces; John Wiley & Sons: Chichester, UK, 1987.
3. Sacks, J.; Welch, W.J.; Mitchell, T.J.; Wynn, H.P. Design and analysis of computer experiments. Stat. Sci. **1989**, 4, 409–423.
4. Smola, A.J.; Schölkopf, B. A tutorial on support vector regression. Stat. Comput. **2004**, 14, 199–222.
5. Wang, Z.; Ierapetritou, M. A novel feasibility analysis method for black-box processes using a radial basis function adaptive sampling approach. AIChE J. **2017**, 63, 532–550.
6. Eason, J.; Cremaschi, S. Adaptive sequential sampling for surrogate model generation with artificial neural networks. Comput. Chem. Eng. **2014**, 68, 220–232.
7. Lew, T.; Spencer, A.; Scarpa, F.; Worden, K.; Rutherford, A.; Hemez, F. Identification of response surface models using genetic programming. Mech. Syst. Signal Process. **2006**, 20, 1819–1831.
8. Samad, A.; Kim, K.Y.; Goel, T.; Haftka, R.T.; Shyy, W. Multiple surrogate modeling for axial compressor blade shape optimization. J. Propuls. Power **2008**, 24, 301–310.
9. Forrester, A.I.; Sóbester, A.; Keane, A.J. Multi-fidelity optimization via surrogate modelling. Proc. R. Soc. A Math. Phys. Eng. Sci. **2007**, 463, 3251–3269.
10. Viana, F.A.; Haftka, R.T.; Watson, L.T. Efficient global optimization algorithm assisted by multiple surrogate techniques. J. Glob. Optim. **2013**, 56, 669–689.
11. Zhou, Z.; Ong, Y.S.; Nair, P.B.; Keane, A.J.; Lum, K.Y. Combining global and local surrogate models to accelerate evolutionary optimization. IEEE Trans. Syst. Man Cybern. Part C **2006**, 37, 66–76.
12. Forrester, A.I.; Keane, A.J. Recent advances in surrogate-based optimization. Prog. Aerosp. Sci. **2009**, 45, 50–79.
13. Queipo, N.V.; Haftka, R.T.; Shyy, W.; Goel, T.; Vaidyanathan, R.; Tucker, P.K. Surrogate-based analysis and optimization. Prog. Aerosp. Sci. **2005**, 41, 1–28.
14. Liu, B.; Zhang, Q.; Gielen, G.G. A Gaussian process surrogate model assisted evolutionary algorithm for medium scale expensive optimization problems. IEEE Trans. Evol. Comput. **2013**, 18, 180–192.
15. Yang, Y.; Zeng, W.; Qiu, W.S.; Wang, T. Optimization of the suspension parameters of a rail vehicle based on a virtual prototype Kriging surrogate model. Proc. Inst. Mech. Eng. Part F J. Rail Rapid Transit **2016**, 230, 1890–1898.
16. Jin, Y. Surrogate-assisted evolutionary computation: Recent advances and future challenges. Swarm Evol. Comput. **2011**, 1, 61–70.
17. Sun, C.; Jin, Y.; Cheng, R.; Ding, J.; Zeng, J. Surrogate-assisted cooperative swarm optimization of high-dimensional expensive problems. IEEE Trans. Evol. Comput. **2017**, 21, 644–660.
18. Tang, Y.; Chen, J.; Wei, J. A surrogate-based particle swarm optimization algorithm for solving optimization problems with expensive black box functions. Eng. Optim. **2013**, 45, 557–576.
19. Branke, J. Creating robust solutions by means of evolutionary algorithms. In International Conference on Parallel Problem Solving from Nature; Springer: Berlin/Heidelberg, Germany, 1998; pp. 119–128.
20. Yu, X.; Jin, Y.; Tang, K.; Yao, X. Robust optimization over time—A new perspective on dynamic optimization problems. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010; pp. 1–6.
21. Bhattacharya, M. Reduced computation for evolutionary optimization in noisy environment. In Proceedings of the 10th Annual Conference Companion on Genetic and Evolutionary Computation, New York, NY, USA, 21–24 July 2008; pp. 2117–2122.
22. Yang, D.; Flockton, S.J. Evolutionary algorithms with a coarse-to-fine function smoothing. In Proceedings of the 1995 IEEE International Conference on Evolutionary Computation, Perth, WA, Australia, 29 November–1 December 1995; pp. 657–662.
23. Nobile, M.S.; Cazzaniga, P.; Besozzi, D.; Colombo, R.; Mauri, G.; Pasi, G. Fuzzy Self-Tuning PSO: A settings-free algorithm for global optimization. Swarm Evol. Comput. **2018**, 39, 70–85.
24. Poli, R.; Kennedy, J.; Blackwell, T. Particle swarm optimization. Swarm Intell. **2007**, 1, 33–57.
25. Tangherloni, A.; Spolaor, S.; Cazzaniga, P.; Besozzi, D.; Rundo, L.; Mauri, G.; Nobile, M.S. Biochemical parameter estimation vs. benchmark functions: A comparative study of optimization performance and representation design. Appl. Soft Comput. **2019**, 81, 105494.
26. SoltaniMoghadam, S.; Tatar, M.; Komeazi, A. An improved 1-D crustal velocity model for the Central Alborz (Iran) using Particle Swarm Optimization algorithm. Phys. Earth Planet. Inter. **2019**, 292, 87–99.
27. Fuchs, C.; Spolaor, S.; Nobile, M.S.; Kaymak, U. A Swarm Intelligence approach to avoid local optima in fuzzy C-Means clustering. In Proceedings of the 2019 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), New Orleans, LA, USA, 23–26 June 2019; pp. 1–6.
28. Cooley, J.W.; Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comput. **1965**, 19, 297–301.
29. Nobile, M.S.; Tangherloni, A.; Besozzi, D.; Cazzaniga, P. GPU-powered and settings-free parameter estimation of biochemical systems. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016; pp. 32–39.
30. Oliphant, T.E. A Guide to NumPy; Trelgol Publishing: Spanish Fork, UT, USA, 2006.
31. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. arXiv **2019**, arXiv:1907.10121.
32. Matsumoto, M.; Nishimura, T. Mersenne twister: A 623-dimensionally equidistributed uniform pseudo-random number generator. ACM Trans. Model. Comput. Simul. **1998**, 8, 3–30.
33. Gibbs, J.W. Fourier’s series. Nature **1899**, 59, 606.
34. Schwefel, H.P. Numerical Optimization of Computer Models; John Wiley & Sons: Chichester, UK, 1981.
35. Suganthan, P.N.; Hansen, N.; Liang, J.J.; Deb, K.; Chen, Y.P.; Auger, A.; Tiwari, S. Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. KanGAL Rep. **2005**, 2005005.
36. Nobile, M.S.; Cazzaniga, P.; Ashlock, D.A. Dilation functions in global optimization. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; pp. 2300–2307.
37. Nobile, M.S.; Besozzi, D.; Cazzaniga, P.; Mauri, G.; Pescini, D. A GPU-based multi-swarm PSO method for parameter estimation in stochastic biological systems exploiting discrete-time target series. In European Conference on Evolutionary Computation, Machine Learning and Data Mining in Bioinformatics; Springer: Berlin/Heidelberg, Germany, 2012; pp. 74–85.
38. Sobol’, I.M. On the distribution of points in a cube and the approximate evaluation of integrals. Zhurnal Vychislitel’noi Mat. Mat. Fiz. **1967**, 7, 784–802.
39. Manzoni, L.; Mariot, L. Cellular Automata pseudo-random number generators and their resistance to asynchrony. In International Conference on Cellular Automata; Springer: Berlin/Heidelberg, Germany, 2018; pp. 428–437.
40. Ye, K.Q. Orthogonal column Latin hypercubes and their application in computer experiments. J. Am. Stat. Assoc. **1998**, 93, 1430–1439.

**Figure 1.** F3ST-PSO phases. Step 1: surF randomly samples the fitness landscape within a chosen search space (red dots) and uses that information to create a surrogate, smoother model. Step 2: a population of random candidate solutions is generated and placed on the surrogate model of the fitness landscape (orange dots). Step 3: FST-PSO is exploited to perform an optimization on the surrogate model. Step 4: the best individual (black star) found by FST-PSO is placed on the original fitness landscape, together with a new population of random candidate solutions (orange dots). Step 5: a final optimization with FST-PSO is performed on the original fitness landscape.

**Figure 2.** Detailed scheme of the functioning of F3ST-PSO. In the first phase (red boxes), the algorithm creates the surrogate model of the fitness landscape by exploiting random sampling and Fourier filtering. The second phase (yellow boxes) consists of an optimization by means of FST-PSO over the surrogate model. In the third phase (green boxes), the best solution found is fed to a new FST-PSO optimization step over the real fitness function.

**Figure 3.** Examples of 2D surrogate models of the benchmark functions defined in Table 1, created by surF. The first column shows the original fitness landscape. The second column represents a random sampling of the fitness landscape, which is used to create the interpolation grid for the Fourier transform. The interpolation is shown as background color: dark/blue colors correspond to good fitness values, while bright/yellow colors correspond to bad fitness values. The third, fourth, and fifth columns represent the surrogate models, obtained by applying the inverse Fourier transform using $\gamma =3$, $\gamma =5$, and $\gamma =15$ coefficients, respectively.

**Figure 4.** Boxplots of the distributions of the fitness values of the best individuals ${\overrightarrow{g}}^{\simeq}$ found by FST-PSO at the last iteration on the surrogate models, exploiting $\gamma =3$, $\gamma =5$, and $\gamma =15$ coefficients (x-axis). The orange line and the green triangle denote the median and the mean, respectively.

**Figure 5.** Convergence plots showing the performance of FST-PSO (black dashed line) against F3ST-PSO (blue, orange, and green solid lines correspond to the use of $\gamma =3$, 5, and 15 coefficients in surF, respectively). The plots show that FST-PSO can perform 20 additional iterations compared to F3ST-PSO, since the construction of the surrogate model “consumes” 500 fitness evaluations during the initial random sampling.

**Figure 6.** Convergence plots showing the performance of FST-PSO (black dashed line) against F3ST-PSO (blue, orange, and green solid lines correspond to the use of $\gamma =3$, 5, and 15 coefficients in surF, respectively). The plots correspond, from left to right, to the results on the benchmark functions F4, F5, and F9 of the CEC 2005 suite.

**Figure 7.** Surrogate models of function F4 using $\gamma =3$, $\gamma =5$, and $\gamma =15$ coefficients (plots (**a**), (**b**), and (**c**), respectively). By removing the higher components of the Fourier transform, the random noise is reduced, so that the fitness landscape becomes smoother and easier to explore (orange surface), while retaining the general characteristics of the original problem (blue surface).

**Table 1.** Benchmark functions used in this work, with their search spaces and global minima.

Function | Equation | Search Space | Value in Global Minimum |
---|---|---|---|

Ackley | ${f}_{\mathrm{Ack}}(\overrightarrow{x})=20+e-20exp(-0.2\sqrt{\frac{1}{D}{\sum}_{d=1}^{D}{x}_{d}^{2}})-$$exp(\frac{1}{D}{\sum}_{d=1}^{D}cos(2\pi {x}_{d}))$ | ${[-30,30]}^{D}$ | ${f}_{\mathrm{Ack}}(\overrightarrow{0})=0$ |

Alpine | ${f}_{\mathrm{Alp}}(\overrightarrow{x})={\sum}_{d=1}^{D}|{x}_{d}sin({x}_{d})+0.1{x}_{d}|$ | ${[-10,10]}^{D}$ | ${f}_{\mathrm{Alp}}(\overrightarrow{0})=0$ |

Griewank | ${f}_{\mathrm{Gri}}(\overrightarrow{x})=\frac{1}{4000}{\sum}_{d=1}^{D}{x}_{d}^{2}-{\prod}_{d=1}^{D}cos(\frac{{x}_{d}}{\sqrt{d}})+1$ | ${[-600,600]}^{D}$ | ${f}_{\mathrm{Gri}}(\overrightarrow{0})=0$ |

Michalewicz | ${f}_{\mathrm{Mic}}(\overrightarrow{x})=-{\sum}_{d=1}^{D}sin({x}_{d}){sin}^{2k}(\frac{d{x}_{d}^{2}}{\pi})$, $k=10$ in this work | ${[0,\pi ]}^{D}$ | $min\,{f}_{\mathrm{Mic}}=-1.801$ for $D=2$; $-4.687$ for $D=5$ |

Rastrigin | ${f}_{\mathrm{Ras}}(\overrightarrow{x})=10D+{\sum}_{d=1}^{D}({x}_{d}^{2}-10cos(2\pi {x}_{d}))$ | ${[-5.12,5.12]}^{D}$ | ${f}_{\mathrm{Ras}}(\overrightarrow{0})=0$ |

Rosenbrock | ${f}_{\mathrm{Ros}}(\overrightarrow{x})={\sum}_{d=1}^{D-1}[100{({x}_{d}^{2}-{x}_{d+1})}^{2}+{({x}_{d}-1)}^{2}]$ | ${[-5,10]}^{D}$ | ${f}_{\mathrm{Ros}}(\overrightarrow{1})=0$ |

Schwefel | ${f}_{\mathrm{Sch}}(\overrightarrow{x})=418.9829D-{\sum}_{d=1}^{D}{x}_{d}sin(\sqrt{|{x}_{d}|})$ | ${[-500,500]}^{D}$ | ${f}_{\mathrm{Sch}}(\overrightarrow{420.9687})=0$ |

Shubert | ${f}_{\mathrm{Shu}}(\overrightarrow{x})={\prod}_{d=1}^{D}({\sum}_{i=1}^{5}i\,cos[(i+1){x}_{d}+i])$ | ${[-10,10]}^{D}$ | Many global minima, whose values depend on D |

Vincent | ${f}_{\mathrm{Vin}}(\overrightarrow{x})=-{\sum}_{d=1}^{D}sin(10log({x}_{d}))$ | ${[0.25,10]}^{D}$ | ${f}_{\mathrm{Vin}}(\overrightarrow{7.706281})=-D$ |

Xin-She Yang n.2 | ${f}_{\mathrm{Xin}}(\overrightarrow{x})=({\sum}_{d=1}^{D}\left|{x}_{d}\right|)\,exp(-{\sum}_{d=1}^{D}sin({x}_{d}^{2}))$ | ${[-2\pi ,2\pi ]}^{D}$ | ${f}_{\mathrm{Xin}}(\overrightarrow{0})=0$ |
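For reference, two of the benchmarks in Table 1 translate directly into NumPy. The sketch below follows the formulas given in the table for a vector input of any dimension $D$; both functions attain their global minimum of 0 at the origin:

```python
import numpy as np

def rastrigin(x):
    """f_Ras from Table 1: 10*D + sum(x_d^2 - 10*cos(2*pi*x_d))."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def ackley(x):
    """f_Ack from Table 1; global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    d = x.size
    return (20.0 + np.e
            - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d))
```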

**Table 2.** Settings used for the comparison of performances between F3ST-PSO and FST-PSO, considering the benchmark functions with $D=5$.

Setting | Value |
---|---|

Fitness evaluations budget | 13,000 |

$\sigma $ | 500 |

$\rho $ | 40 |

$\gamma $ values tested | 3, 5, and 15 |

Swarm size F3ST-PSO | 25 |

Iterations F3ST-PSO | 500 |

Swarm size of FST-PSO | 25 |

Iterations FST-PSO | 520 |

**Table 3.** Settings used for the comparison of performances between F3ST-PSO and FST-PSO, considering the functions F4, F5, and F9 of the CEC 2005 suite, with $D=5$.

Setting | Value |
---|---|

Fitness evaluations budget | 25,500 |

$\sigma $ | 500 |

$\rho $ | 40 |

$\gamma $ values tested | 3, 5, and 15 |

Swarm size F3ST-PSO | 25 |

Iterations F3ST-PSO | 1000 |

Swarm size of FST-PSO | 25 |

Iterations FST-PSO | 1020 |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Manzoni, L.; Papetti, D.M.; Cazzaniga, P.; Spolaor, S.; Mauri, G.; Besozzi, D.; Nobile, M.S. Surfing on Fitness Landscapes: A Boost on Optimization by Fourier Surrogate Modeling. *Entropy* **2020**, *22*, 285. https://doi.org/10.3390/e22030285