Open Access Article

The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design

by Sergey Oladyshkin *,† and Wolfgang Nowak *,†
Department of Stochastic Simulation and Safety Research for Hydrosystems, Institute for Modelling Hydraulic and Environmental Systems/SC SimTech, University of Stuttgart, Pfaffenwaldring 5a, 70569 Stuttgart, Germany
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2019, 21(11), 1081; https://doi.org/10.3390/e21111081
Received: 24 September 2019 / Revised: 30 October 2019 / Accepted: 31 October 2019 / Published: 4 November 2019
(This article belongs to the Section Information Theory, Probability and Statistics)
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can benefit from information theory to estimate BME values via posterior-based techniques; to this end, we employ various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can benefit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior-based and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. Among the posterior-based techniques, the multivariate Gaussian posterior estimate requires the fewest assumptions and shows the best performance for BME estimation, information entropy and experiment utility.
Keywords: model evidence; entropy; model selection; information entropy; Bayesian experimental design; Kullback–Leibler divergence; Markov chain Monte Carlo; Monte Carlo
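
The abstract refers to prior-based (Monte Carlo) estimation of BME and to computing relative entropy without multidimensional integration. The following minimal Python sketch illustrates the general idea on a toy one-dimensional conjugate Gaussian problem; it is not taken from the paper, and the variables (prior_mean, noise_std, y_obs) and the analytic posterior stand in, for illustration only, for a real model and an MCMC chain.

```python
import numpy as np
from scipy import stats

# Toy setup (illustrative assumption, not from the paper): scalar parameter theta
# with Gaussian prior and one noisy observation y of theta.
rng = np.random.default_rng(0)
prior_mean, prior_std = 0.0, 1.0   # prior:      theta ~ N(0, 1)
noise_std = 0.5                    # likelihood: y | theta ~ N(theta, 0.5^2)
y_obs = 0.8                        # observed datum

n = 200_000
theta_prior = rng.normal(prior_mean, prior_std, n)            # prior sample
log_lik_prior = stats.norm.logpdf(y_obs, loc=theta_prior, scale=noise_std)

# Prior-based (brute-force Monte Carlo) BME estimate:
#   BME = E_prior[ p(y | theta) ] ~= average likelihood over prior samples.
log_bme = np.log(np.mean(np.exp(log_lik_prior)))

# Posterior sample: known analytically here (conjugate Gaussian),
# standing in for a posterior-based (MCMC) sample.
post_var = 1.0 / (1.0 / prior_std**2 + 1.0 / noise_std**2)
post_mean = post_var * (y_obs / noise_std**2 + prior_mean / prior_std**2)
theta_post = rng.normal(post_mean, np.sqrt(post_var), n)

# Relative entropy (information gain) without integrating over theta, via
#   D_KL(posterior || prior) = E_posterior[ log p(y | theta) ] - log BME.
log_lik_post = stats.norm.logpdf(y_obs, loc=theta_post, scale=noise_std)
d_kl = np.mean(log_lik_post) - log_bme

print(f"log BME (prior-based MC):            {log_bme:.4f}")
print(f"relative entropy D_KL(post || prior): {d_kl:.4f}")
```

In this sketch, the Kullback–Leibler divergence follows from sample averages of the log-likelihood and the BME estimate alone, which mirrors the abstract's point that relative entropy can be obtained from prior- and posterior-based sampling without explicit multidimensional integration.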

MDPI and ACS Style

Oladyshkin, S.; Nowak, W. The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design. Entropy 2019, 21, 1081.

