Open Access Article

Principles of Bayesian Inference Using General Divergence Criteria

Department of Statistics, University of Warwick, Coventry CV4 7AL, UK
Department of Statistics, University of Oxford, Oxford OX1 3LB, UK
Author to whom correspondence should be addressed.
Entropy 2018, 20(6), 442;
Received: 2 February 2018 / Revised: 25 May 2018 / Accepted: 28 May 2018 / Published: 6 June 2018
(This article belongs to the Special Issue Foundations of Statistics)
When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL) divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper, we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some alternative divergence measure to the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than have previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high-dimensional models.
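To make the idea concrete: the general Bayesian update of Bissiri, Holmes & Walker replaces the log-likelihood in Bayes' rule with the negative of an additive loss, so that the posterior is proportional to the prior times exp(−loss). Choosing the log score recovers standard (KL-targeting) Bayes; choosing a divergence-based loss, such as the density power (β-) divergence of Basu et al. used by Ghosh & Basu, yields a posterior robust to outlying observations. The following is a minimal illustrative sketch, not code from the paper; the N(0, 1) prior, the known-variance Gaussian model, the grid approximation, and the toy contaminated data are all assumptions made here for illustration.

```python
import numpy as np

def log_score_loss(x, mu, sigma=1.0):
    # Negative log-likelihood of N(mu, sigma^2): recovers the standard Bayes update.
    return 0.5 * np.log(2 * np.pi * sigma**2) + (x - mu)**2 / (2 * sigma**2)

def beta_div_loss(x, mu, sigma=1.0, beta=0.5):
    # Density power (beta-) divergence loss for a N(mu, sigma^2) model:
    #   -(1/beta) f(x)^beta + (1/(1+beta)) * integral of f^(1+beta),
    # where the integral has a closed form for the Gaussian.
    f = np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    integral = (2 * np.pi * sigma**2)**(-beta / 2) / np.sqrt(1 + beta)
    return -f**beta / beta + integral / (1 + beta)

def general_posterior(data, loss, mu_grid, w=1.0):
    # General Bayesian update: posterior ∝ prior * exp(-w * cumulative loss),
    # evaluated on a grid and normalised. Prior is N(0, 1) (an assumption here).
    log_post = np.array([-w * sum(loss(x, mu) for x in data) for mu in mu_grid])
    log_post += -0.5 * mu_grid**2
    log_post -= log_post.max()          # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

# Toy data: 50 well-behaved points centred at 0, plus one gross outlier at 10.
data = np.concatenate([np.linspace(-1.0, 1.0, 50), [10.0]])
grid = np.linspace(-2.0, 4.0, 601)
post_kl = general_posterior(data, log_score_loss, grid)    # KL-targeting posterior
post_beta = general_posterior(data, beta_div_loss, grid)   # beta-divergence posterior
```

The KL-targeting posterior mean is dragged towards the outlier (the conjugate posterior mean is sum(x)/(n + 1) ≈ 0.19 here), while the β-divergence posterior, whose influence function is bounded, stays centred near 0. This is the tail-robustness trade-off the abstract describes.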
Keywords: Kullback–Leibler divergence; robustness; Bayesian updating; minimum divergence estimation; M-open inference
Figure 1

MDPI and ACS Style

Jewson, J.; Smith, J.Q.; Holmes, C. Principles of Bayesian Inference Using General Divergence Criteria. Entropy 2018, 20, 442.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
