Open Access Article
Entropy 2017, 19(10), 555; https://doi.org/10.3390/e19100555

The Prior Can Often Only Be Understood in the Context of the Likelihood

1. Department of Statistics, Columbia University, New York, NY 10027, USA
2. Department of Political Science, Columbia University, New York, NY 10027, USA
3. Department of Statistical Sciences, University of Toronto, Toronto, ON M5S, Canada
4. Institute for Social and Economic Research and Policy, Columbia University, New York, NY 10027, USA
* Author to whom correspondence should be addressed.
Received: 26 August 2017 / Revised: 30 September 2017 / Accepted: 14 October 2017 / Published: 19 October 2017
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)

Abstract

A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.
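The abstract's central claim, that a prior's informativeness can only be judged relative to the likelihood it is paired with, can be illustrated with a toy conjugate normal model. This is a minimal sketch, not code from the paper; the model, parameter values, and function name are illustrative assumptions. The same Normal(0, 1) prior on a mean is nearly irrelevant when the measurement is precise, yet dominates the posterior when the measurement is noisy.

```python
# Toy normal-normal conjugate model (illustrative; not from the paper).
# One fixed prior, Normal(prior_mu, prior_sd), is combined with
# likelihoods of differing precision to show that "weakly informative"
# is a statement about the prior-likelihood pair, not the prior alone.

def posterior_mean(y, sigma, prior_mu=0.0, prior_sd=1.0):
    """Posterior mean of theta for y_i ~ Normal(theta, sigma) with
    theta ~ Normal(prior_mu, prior_sd): a precision-weighted average."""
    prec_prior = 1.0 / prior_sd ** 2    # prior precision
    prec_like = len(y) / sigma ** 2     # likelihood precision
    ybar = sum(y) / len(y)
    return (prec_prior * prior_mu + prec_like * ybar) / (prec_prior + prec_like)

# Precise measurement: the likelihood dominates, so the Normal(0, 1)
# prior is weakly informative and the posterior sits near the data.
print(posterior_mean([1.0], sigma=0.1))   # ~0.990, close to y = 1.0

# Noisy measurement: the same prior now dominates, pulling the
# posterior almost all the way back to the prior mean of 0.
print(posterior_mean([1.0], sigma=10.0))  # ~0.010, close to the prior mean
```

The design point is that the posterior mean weights prior and data by their precisions, so changing the measurement model alone changes how strongly the unchanged prior speaks.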
Keywords: Bayesian inference; default priors; prior distribution
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Gelman, A.; Simpson, D.; Betancourt, M. The Prior Can Often Only Be Understood in the Context of the Likelihood. Entropy 2017, 19, 555.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.