Entropy 2014, 16(6), 3026-3048; doi:10.3390/e16063026
Article

Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different

Keisuke Yano 1,* and Fumiyasu Komaki 1,2
Received: 28 March 2014; in revised form: 9 May 2014 / Accepted: 22 May 2014 / Published: 28 May 2014
(This article belongs to the Special Issue Information Geometry)
Abstract: We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of the data and the target variables are different but share a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximum point with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior depending on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on a binary regression model.
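The trace quantity in the abstract can be made concrete with a toy model that is not from the paper: suppose each data observation is Bernoulli(θ) while each target observation is N(θ, 1), sharing the unknown θ. For this scalar parameter the trace reduces to a ratio of Fisher informations, θ(1 − θ), which has the unique interior maximum the abstract's assumption refers to. A minimal sketch:

```python
import numpy as np

# Hypothetical illustration (not the paper's model): data ~ Bernoulli(theta),
# target ~ N(theta, 1), with a common unknown parameter theta in (0, 1).
theta = np.linspace(0.01, 0.99, 99)

I_D = 1.0 / (theta * (1.0 - theta))  # Fisher information of Bernoulli(theta)
I_T = np.ones_like(theta)            # Fisher information of N(theta, 1)

# For a scalar parameter, tr(I_D^{-1} I_T) reduces to I_T / I_D = theta(1 - theta).
trace = I_T / I_D

# The trace attains a unique maximum at theta = 1/2, the situation covered by
# the abstract's assumption.
theta_max = theta[np.argmax(trace)]
print(theta_max)  # approximately 0.5
```

In this toy case the Bernoulli data are least informative exactly where the trace peaks, so the asymptotic Kullback–Leibler risk is largest near θ = 1/2; the paper's construction chooses a sample-size-dependent prior to flatten such risk as a function of the parameter.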
Keywords: Bayesian prediction; Fisher information; Kullback–Leibler divergence; minimax; predictive metric; subminimax estimator
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


MDPI and ACS Style

Yano, K.; Komaki, F. Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different. Entropy 2014, 16, 3026-3048.



Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.