Approximated Information Analysis in Bayesian Inference
Abstract
In models with nuisance parameters, Bayesian procedures based on Markov chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures require computationally burdensome MCMC runs, approximation quality and convergence are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of an information measure, including the Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
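The sensitivity idea in the abstract can be illustrated with a minimal sketch: when an approximate full conditional perturbs the posterior, the discrepancy can be quantified with the Kullback–Leibler divergence. The sketch below uses the closed form for two univariate normals as a stand-in for a "true" versus an "approximated" posterior; the specific parameter values are illustrative assumptions, not taken from the paper.

```python
import math

def kl_normal(mu1, s1, mu2, s2):
    """Closed-form KL divergence KL( N(mu1, s1^2) || N(mu2, s2^2) )."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Hypothetical example: a "true" posterior N(0, 1) versus a slightly
# perturbed approximation N(0.1, 1.1^2). A small KL value indicates the
# approximate conditional has little effect on the posterior of interest.
divergence = kl_normal(0.0, 1.0, 0.1, 1.1)
print(f"KL divergence: {divergence:.6f}")  # small positive value
```

Identical distributions give a divergence of exactly zero, so the measure directly reads off how far the approximation moves the posterior.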
Seo, J.I.; Kim, Y. Approximated Information Analysis in Bayesian Inference. Entropy 2015, 17, 1441-1451.