Open Access Article
Entropy 2018, 20(3), 201; https://doi.org/10.3390/e20030201

Global Optimization Employing Gaussian Process-Based Bayesian Surrogates

Max-Planck-Institute for Plasma Physics, EURATOM Association, 85748 Garching, Germany
This paper is an extended version of our paper published in the Proceedings of the 37th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Jarinu/SP, Brazil, 9–14 July 2017.
* Author to whom correspondence should be addressed.
Received: 22 December 2017 / Revised: 8 March 2018 / Accepted: 13 March 2018 / Published: 16 March 2018

Abstract

The simulation of complex physics models may require enormous computing time. Since the simulations are expensive, the computational budget must be exploited in the best possible manner. If output data have already been acquired for a few input parameter settings, one may wish to use these data to locate an extremum and to choose the input parameter set for the next computer simulation, a task that belongs to the realm of global optimization. Within the Bayesian framework, we use Gaussian processes to build a surrogate model function whose hyperparameters are adjusted self-consistently to represent the data. Although the probability distribution of the hyperparameters may be spread widely over phase space, we assume that using their expectation values alone is sufficient. This shortcut yields a quickly accessible surrogate and is justified by the fact that we are not interested in a full representation of the model by the surrogate, but only in revealing its maximum. To accomplish this, the surrogate is fed into a utility function whose extremum determines the parameter set at which the next data point is obtained. Moreover, we propose to alternate between two utility functions, expected improvement and maximum variance, in order to avoid the drawbacks of each. Subsequent data points are drawn from the model function until the procedure either remains at the points already found or the surrogate model no longer changes with further iterations. The procedure is applied to mock data in one and two dimensions to demonstrate a proof of principle of the proposed approach.
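For illustration, the following minimal Python sketch (not taken from the paper; the kernel, its fixed length scale, the test function model(), and the stopping tolerance are illustrative assumptions) shows the alternating-acquisition loop described above: a Gaussian process surrogate is refit after every evaluation, and new points are proposed alternately by expected improvement and by maximum predictive variance.

import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.3, amp=1.0):
    # Squared-exponential covariance between two sets of 1D inputs.
    d = a[:, None] - b[None, :]
    return amp * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression: posterior mean and variance at the query points.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    Kss = rbf_kernel(x_query, x_query)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.clip(np.diag(cov), 0.0, None)

def expected_improvement(mean, var, y_best, xi=0.01):
    # Expected improvement over the current best observation (maximization).
    sigma = np.sqrt(var) + 1e-12
    z = (mean - y_best - xi) / sigma
    return (mean - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def model(x):
    # Hypothetical expensive model; stands in for the actual simulation code.
    return np.sin(3.0 * x) * np.exp(-x) + 0.3 * x

rng = np.random.default_rng(0)
x_grid = np.linspace(0.0, 2.0, 400)          # candidate input settings
x_obs = rng.uniform(0.0, 2.0, 4)             # initial design points
y_obs = model(x_obs)

for it in range(20):
    mean, var = gp_posterior(x_obs, y_obs, x_grid)
    if it % 2 == 0:
        # Exploitation-leaning step: expected improvement.
        score = expected_improvement(mean, var, y_obs.max())
    else:
        # Exploration step: query where the surrogate is most uncertain.
        score = var
    x_new = x_grid[np.argmax(score)]
    if np.min(np.abs(x_obs - x_new)) < 1e-3:
        break                                # proposal repeats an existing point: stop
    x_obs = np.append(x_obs, x_new)
    y_obs = np.append(y_obs, model(x_new))

print("estimated maximizer:", x_obs[np.argmax(y_obs)], "value:", y_obs.max())

In practice the kernel hyperparameters would also be re-estimated at each iteration, e.g., by setting them to their posterior expectation values as the abstract describes; the fixed values above only keep the sketch short.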
Keywords: parametric studies; global optimization; Gaussian process

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Preuss, R.; von Toussaint, U. Global Optimization Employing Gaussian Process-Based Bayesian Surrogates. Entropy 2018, 20, 201.
