Open Access Article

Gaussian Processes and Polynomial Chaos Expansion for Regression Problem: Linkage via the RKHS and Comparison via the KL Divergence

College of Liberal Arts and Sciences, National University of Defense Technology, Changsha 410073, China
* Author to whom correspondence should be addressed.
Entropy 2018, 20(3), 191; https://doi.org/10.3390/e20030191
Received: 21 January 2018 / Revised: 6 March 2018 / Accepted: 12 March 2018 / Published: 12 March 2018
(This article belongs to the Section Information Theory, Probability and Statistics)
In this paper, we examine two widely used approaches to building surrogate models: polynomial chaos expansion (PCE) and Gaussian process (GP) regression. The theoretical differences between the PCE and GP approximations are discussed. A state-of-the-art PCE approach is constructed on high-precision quadrature points; however, the necessary truncation of the expansion may cause a loss of precision. The GP approach performs well on small datasets and allows a fine, precise trade-off between fitting the data and smoothing, but its overall performance depends largely on the training dataset. The reproducing kernel Hilbert space (RKHS) and Mercer’s theorem are introduced to establish a link between the two methods: we prove that the two surrogates can be embedded in two isomorphic RKHSs. Based on this result, we propose a novel method, Gaussian process on polynomial chaos basis (GPCB), which incorporates the PCE into the GP. A theoretical comparison between the PCE and the GPCB is carried out with the help of the Kullback–Leibler divergence, and we show that the GPCB is as stable and accurate as the PCE. Furthermore, the GPCB is a one-step Bayesian method that selects the best subset of the RKHS in which the true function should lie, whereas the PCE requires an adaptive procedure. Simulations on 1D and 2D benchmark functions show that the GPCB outperforms both the PCE and the classical GP. To address high-dimensional problems, a random sampling scheme based on a constructive design (i.e., a tensor product of quadrature points) is proposed to generate a valid training dataset for the GPCB. This scheme exploits the high numerical accuracy of the quadrature points while remaining computationally feasible. Finally, experimental results show that our sampling strategy is more accurate than classical experimental designs and is suitable for high-dimensional problems.
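The Mercer-type construction linking the two surrogates can be made concrete in code. The following is a minimal sketch, assuming a 1D input, probabilists' Hermite polynomials as the chaos basis, and unit Mercer weights; these are hypothetical illustration choices, not the authors' GPCB implementation. It builds a finite-rank kernel k(x, y) = Σᵢ Heᵢ(x) Heᵢ(y) and uses it for GP regression, so the posterior mean lies in the span of the chaos basis:

```python
# A minimal sketch (not the paper's implementation): build a Mercer-type
# kernel from a polynomial chaos basis -- here probabilists' Hermite
# polynomials He_0..He_order with hypothetical unit weights -- and use it
# for GP regression.
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def pc_kernel(X, Y, order=5):
    """k(x, y) = sum_{i <= order} He_i(x) * He_i(y)."""
    K = np.zeros((len(X), len(Y)))
    for i in range(order + 1):
        c = np.zeros(i + 1)
        c[i] = 1.0  # coefficient vector selecting the single basis term He_i
        K += np.outer(hermeval(X, c), hermeval(Y, c))
    return K

def gp_posterior_mean(X_train, y_train, X_test, order=5, jitter=1e-8):
    # Standard GP posterior mean with a small jitter for numerical stability.
    K = pc_kernel(X_train, X_train, order) + jitter * np.eye(len(X_train))
    K_star = pc_kernel(X_test, X_train, order)
    return K_star @ np.linalg.solve(K, y_train)

# Usage: a cubic lies in the span of He_0..He_5, so the GP posterior mean
# recovers it almost exactly from a few noiseless samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 12)
y = X**3 - X
X_test = np.linspace(-1.0, 1.0, 50)
pred = gp_posterior_mean(X, y, X_test)
```

Because the kernel is degenerate (finite rank), the GP posterior is confined to the polynomial span, which is the sense in which the surrogate is embedded in the RKHS generated by the chaos basis.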
Keywords: Gaussian process; polynomial chaos expansion; reproducing kernel Hilbert space; Kullback–Leibler divergence; experimental design
MDPI and ACS Style

Yan, L.; Duan, X.; Liu, B.; Xu, J. Gaussian Processes and Polynomial Chaos Expansion for Regression Problem: Linkage via the RKHS and Comparison via the KL Divergence. Entropy 2018, 20, 191.

