Open Access Article
Entropy 2015, 17(3), 1441-1451;

Approximated Information Analysis in Bayesian Inference

Department of Statistics, Yeungnam University, Gyeongsan 712-749, Korea
Department of Statistics, Kyungpook National University, Daegu 702-701, Korea
Author to whom correspondence should be addressed.
Academic Editor: Kevin H. Knuth
Received: 30 December 2014 / Revised: 15 March 2015 / Accepted: 19 March 2015 / Published: 20 March 2015
(This article belongs to the Section Information Theory)


In models with nuisance parameters, Bayesian procedures based on Markov Chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures involve computationally burdensome MCMC runs, approximation and convergence are important issues. In this paper, we explore the sensitivity of the Gibbs sampler when an alternative to the full conditional distribution of the nuisance parameter is used. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
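The setting the abstract describes — a Gibbs sampler alternating between a parameter of interest and a nuisance parameter, with Kullback–Leibler divergence quantifying how the marginal posterior of interest shifts when the nuisance parameter's full conditional is replaced by an alternative — can be sketched on a toy conjugate normal model. The model, priors, and function names below are illustrative assumptions for exposition only; they are not taken from the paper.

```python
import math
import random

def gibbs(y, n_iter=5000, seed=0, fixed_tau=None):
    """Two-block Gibbs sampler for y_i ~ N(mu, 1/tau).

    mu is the parameter of interest (flat prior); tau is the
    nuisance precision (Gamma(1, 1) prior).  If fixed_tau is given,
    the full conditional of tau is replaced by a degenerate
    alternative (tau held constant) -- a crude stand-in for the
    kind of approximate conditional the paper studies.
    """
    rng = random.Random(seed)
    n = len(y)
    ybar = sum(y) / n
    mu, tau = ybar, 1.0
    mus = []
    for _ in range(n_iter):
        # mu | tau, y  ~  N(ybar, 1 / (n * tau))  (flat prior on mu)
        t = fixed_tau if fixed_tau is not None else tau
        mu = rng.gauss(ybar, 1.0 / math.sqrt(n * t))
        if fixed_tau is None:
            # tau | mu, y ~ Gamma(1 + n/2, rate = 1 + 0.5 * sum((y_i - mu)^2))
            shape = 1.0 + n / 2.0
            rate = 1.0 + 0.5 * sum((yi - mu) ** 2 for yi in y)
            tau = rng.gammavariate(shape, 1.0 / rate)  # scale = 1/rate
        mus.append(mu)
    return mus

def kl_normal(m0, s0, m1, s1):
    """KL( N(m0, s0^2) || N(m1, s1^2) ), the closed form for Gaussians."""
    return math.log(s1 / s0) + (s0 ** 2 + (m0 - m1) ** 2) / (2.0 * s1 ** 2) - 0.5

def fit_normal(xs):
    """Moment-match a Gaussian to a sample of draws."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, math.sqrt(v)

# Sensitivity check: KL divergence between the marginal posterior of mu
# under the exact sampler and under the alternative (fixed-tau) sampler.
y = [0.8, 1.2, 1.0, 0.9, 1.1]
exact = fit_normal(gibbs(y, n_iter=4000, seed=1))
approx = fit_normal(gibbs(y, n_iter=4000, seed=1, fixed_tau=2.0))
sensitivity = kl_normal(approx[0], approx[1], exact[0], exact[1])
```

A small `sensitivity` value indicates the posterior of interest is robust to replacing the nuisance conditional; the paper's contribution is to study this kind of divergence analytically rather than by brute-force comparison of sampler runs.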
Keywords: Bayesian sensitivity; Gibbs sampler; Kullback–Leibler divergence; Laplace approximation
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Seo, J.I.; Kim, Y. Approximated Information Analysis in Bayesian Inference. Entropy 2015, 17, 1441-1451.


Entropy (EISSN 1099-4300) is published by MDPI AG, Basel, Switzerland.