Open Access Article
Entropy 2018, 20(6), 442; https://doi.org/10.3390/e20060442

Principles of Bayesian Inference Using General Divergence Criteria

1 Department of Statistics, University of Warwick, Coventry CV4 7AL, UK
2 Department of Statistics, University of Oxford, Oxford OX1 3LB, UK
* Author to whom correspondence should be addressed.
Received: 2 February 2018 / Revised: 25 May 2018 / Accepted: 28 May 2018 / Published: 6 June 2018
(This article belongs to the Special Issue Foundations of Statistics)

Abstract

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL) divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of a divergence measure other than the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than have previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high-dimensional models.
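The following is a minimal, self-contained sketch (not the authors' code) of the general Bayesian update of Bissiri, Holmes & Walker (2016), in which the usual log-likelihood loss, whose minimiser targets the KL-divergence, is swapped for a density power divergence (beta-divergence) loss of the kind considered in the Bayesian minimum divergence literature cited in the abstract. The Gaussian location model, the contaminated data, the grid approximation, the unit loss weight, and the value beta = 0.5 are illustrative assumptions, not taken from the paper; they serve only to show how the choice of divergence changes the robustness of the resulting posterior to outliers.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Data from a contaminated Gaussian: most mass near N(0, 1), a few outliers at 8.
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])

sigma = 1.0                        # known scale; inference is on the location mu
beta = 0.5                         # divergence tuning parameter (assumed value)
mu_grid = np.linspace(-2.0, 6.0, 2001)
log_prior = norm.logpdf(mu_grid, loc=0.0, scale=10.0)   # vague Gaussian prior

def dpd_loss(mu):
    """Density power divergence loss for N(mu, sigma^2), summed over the data."""
    f = norm.pdf(x, loc=mu, scale=sigma)
    # Closed form of the integral of f^(1+beta) for a Gaussian model.
    integral = (2.0 * np.pi * sigma**2) ** (-beta / 2.0) / np.sqrt(1.0 + beta)
    return np.sum(-(1.0 / beta) * f**beta + integral / (1.0 + beta))

def neg_log_lik(mu):
    """Standard negative log-likelihood loss (whose minimiser targets the KL-divergence)."""
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

def grid_posterior(loss):
    # General Bayesian update: posterior ~ prior * exp(-loss), loss weight taken as 1.
    log_post = log_prior - np.array([loss(m) for m in mu_grid])
    log_post -= log_post.max()                 # stabilise before exponentiating
    post = np.exp(log_post)
    return post / np.trapz(post, mu_grid)

post_kl = grid_posterior(neg_log_lik)
post_dpd = grid_posterior(dpd_loss)

print("posterior mean targeting KL :", np.trapz(mu_grid * post_kl, mu_grid))
print("posterior mean targeting DPD:", np.trapz(mu_grid * post_dpd, mu_grid))

In this toy setting the KL-targeting posterior is pulled towards the outlying observations, while the posterior built from the density-power-divergence loss concentrates near the bulk of the data, illustrating the robustness trade-off the abstract describes.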
Keywords: Kullback–Leibler divergence; robustness; Bayesian updating; minimum divergence estimation; M-open inference
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article (MDPI and ACS Style)

Jewson, J.; Smith, J.Q.; Holmes, C. Principles of Bayesian Inference Using General Divergence Criteria. Entropy 2018, 20, 442.
