Open Access Article

The Prior Can Often Only Be Understood in the Context of the Likelihood

Affiliations:
1 Department of Statistics, Columbia University, New York, NY 10027, USA
2 Department of Political Science, Columbia University, New York, NY 10027, USA
3 Department of Statistical Sciences, University of Toronto, Toronto, ON M5S, Canada
4 Institute for Social and Economic Research and Policy, Columbia University, New York, NY 10027, USA
* Author to whom correspondence should be addressed.
Entropy 2017, 19(10), 555; https://doi.org/10.3390/e19100555
Received: 26 August 2017 / Revised: 30 September 2017 / Accepted: 14 October 2017 / Published: 19 October 2017
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)
Abstract: A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.
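One way to make the abstract's central point concrete is through the prior predictive distribution: the same prior can imply very different beliefs about observable quantities depending on the likelihood it is paired with. The sketch below is not taken from the paper; the function name prior_predictive_prob, the standard-normal covariates, and the independent normal(0, 1) coefficient priors are illustrative assumptions. It simulates the prior predictive success probabilities of a logistic regression and checks how often they fall near 0 or 1 as the number of predictors grows.

import numpy as np

rng = np.random.default_rng(0)

def prior_predictive_prob(n_predictors, n_draws=10_000):
    # Hypothetical setup: each coefficient gets an independent normal(0, 1)
    # prior and each covariate is drawn as a standard normal (standardized data).
    beta = rng.normal(0.0, 1.0, size=(n_draws, n_predictors))
    x = rng.normal(0.0, 1.0, size=(n_draws, n_predictors))
    logits = (beta * x).sum(axis=1)        # prior draws of the linear predictor
    return 1.0 / (1.0 + np.exp(-logits))   # implied prior predictive probabilities

for p in (1, 10, 100):
    probs = prior_predictive_prob(p)
    extreme = np.mean((probs < 0.01) | (probs > 0.99))
    print(f"{p:3d} predictors: share of prior-predictive probabilities "
          f"outside (0.01, 0.99) = {extreme:.2f}")

In a run of this sketch, the share of extreme prior-predictive probabilities climbs from near zero with one predictor to well over half with one hundred: a normal(0, 1) prior that is weakly informative in a small regression becomes, through the likelihood, a strong statement that outcomes are nearly deterministic in a large one. This is one concrete sense in which the prior can only be understood in the context of the likelihood.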
Keywords: Bayesian inference; default priors; prior distribution

Gelman, A.; Simpson, D.; Betancourt, M. The Prior Can Often Only Be Understood in the Context of the Likelihood. Entropy 2017, 19, 555.
