Open Access Article

Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different

by Keisuke Yano 1,* and Fumiyasu Komaki 1,2
1 Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
2 RIKEN Brain Science Institute, 2-1 Hirosawa, Wako City, Saitama 351-0198, Japan
* Author to whom correspondence should be addressed.
Entropy 2014, 16(6), 3026-3048; https://doi.org/10.3390/e16063026
Received: 28 March 2014 / Revised: 9 May 2014 / Accepted: 22 May 2014 / Published: 28 May 2014
(This article belongs to the Special Issue Information Geometry)
We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of the data and the target variables are different but share a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximum point with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior that depends on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on the binary regression model.
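For orientation, the asymptotic risk expression described in the abstract can be written schematically as follows. This is a sketch of the standard leading-order form in this literature, not the paper's exact statement: the notation I_X, I_Y for the Fisher information matrices of the data and the target variables, and the 1/(2n) leading constant, are assumptions to be checked against the paper.

\[
R_n(\theta)
= \mathbb{E}_\theta\!\left[ D_{\mathrm{KL}}\!\bigl( p(\,\cdot \mid \theta) \,\big\|\, \hat{p}_n(\,\cdot \mid X_1,\dots,X_n) \bigr) \right]
= \frac{1}{2n}\,\operatorname{tr}\!\bigl( I_X(\theta)^{-1} I_Y(\theta) \bigr) + o(n^{-1}),
\]

where $\hat{p}_n$ is the Bayesian predictive density and $\operatorname{tr}\bigl(I_X(\theta)^{-1} I_Y(\theta)\bigr)$ is the trace the abstract refers to. The assumption that this trace has a unique maximum in $\theta$ is what allows a sample-size-dependent prior to flatten the leading risk term, yielding an asymptotically constant-risk predictive density.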
Keywords: Bayesian prediction; Fisher information; Kullback–Leibler divergence; minimax; predictive metric; subminimax estimator
MDPI and ACS Style

Yano, K.; Komaki, F. Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different. Entropy 2014, 16, 3026-3048.

