Entropy 2013, 15(10), 4416-4431; doi:10.3390/e15104416
Article

Bayesian Testing of a Point Null Hypothesis Based on the Latent Information Prior

Fumiyasu Komaki
Received: 9 August 2013 / Revised: 16 September 2013 / Accepted: 10 October 2013 / Published: 17 October 2013

Abstract

Bayesian testing of a point null hypothesis is considered. The null hypothesis is that an observation, x, is distributed according to the normal distribution with a mean of zero and known variance σ². The alternative hypothesis is that x is distributed according to a normal distribution with an unknown nonzero mean, μ, and variance σ². The testing problem is formulated as a prediction problem. Bayesian testing based on priors constructed by using conditional mutual information is investigated.
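
As context for the abstract, the sketch below computes the conventional Bayes factor for this point-null problem when the unknown mean is given an assumed conjugate N(0, τ²) prior under the alternative. The paper itself constructs the prior from conditional mutual information (the latent information prior), so the prior choice, the function name bayes_factor_01, and the numerical values here are illustrative assumptions only, not the paper's method.

import math

def bayes_factor_01(x, sigma2, tau2):
    """Bayes factor BF01 for H0: mu = 0 against H1: mu != 0 with an
    assumed N(0, tau2) prior on mu, given one observation x ~ N(mu, sigma2)."""
    # Marginal density of x under H0: N(0, sigma2).
    m0 = math.exp(-x ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
    # Marginal density of x under H1: integrating mu out gives N(0, sigma2 + tau2).
    v1 = sigma2 + tau2
    m1 = math.exp(-x ** 2 / (2 * v1)) / math.sqrt(2 * math.pi * v1)
    return m0 / m1

# As the prior variance tau2 grows, BF01 increasingly favours the point null
# even for a moderately surprising observation -- the Jeffreys-Lindley paradox
# listed in the keywords.
for tau2 in (1.0, 100.0, 10000.0):
    print(tau2, round(bayes_factor_01(1.5, 1.0, tau2), 3))
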
Keywords: conditional mutual information; discrete prior; Kullback-Leibler divergence; prediction; reference prior; Jeffreys-Lindley paradox
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Komaki, F. Bayesian Testing of a Point Null Hypothesis Based on the Latent Information Prior. Entropy 2013, 15, 4416-4431.
