Open Access Article
Entropy 2014, 16(6), 3026-3048; doi:10.3390/e16063026

Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different

1 Department of Mathematical Informatics, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
2 RIKEN Brain Science Institute, 2-1 Hirosawa, Wako City, Saitama 351-0198, Japan
* Author to whom correspondence should be addressed.
Received: 28 March 2014 / Revised: 9 May 2014 / Accepted: 22 May 2014 / Published: 28 May 2014
(This article belongs to the Special Issue Information Geometry)

Abstract

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback–Leibler risk when the distributions of the data and the target variables are different and share a common unknown parameter. It is known that the Kullback–Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximum point with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior that depends on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on the binary regression model.
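For orientation, the asymptotic expansion referred to above is commonly written in the following form; the notation is ours rather than the article's, with I_X(θ) the per-observation Fisher information matrix of the data, Ĩ_Y(θ) the Fisher information matrix of the target variable, p̂_π the Bayesian predictive density under a prior π, and n the sample size, under standard regularity conditions:

% Sketch of the leading-order Kullback–Leibler risk described in the abstract
% (our notation; see the article for the precise statement and conditions).
\[
  \mathbb{E}_{\theta}\!\left[ D_{\mathrm{KL}}\!\left( p(y \mid \theta) \,\middle\|\, \hat{p}_{\pi}(y \mid x^{n}) \right) \right]
  = \frac{1}{2n}\,\operatorname{tr}\!\left( I_X(\theta)^{-1}\, \tilde{I}_Y(\theta) \right) + o\!\left(n^{-1}\right).
\]

The leading coefficient tr(I_X(θ)^{-1} Ĩ_Y(θ)) generally depends on θ when the data and target distributions differ, which is why the article assumes it has a unique maximum and constructs sample-size-dependent priors to make the risk asymptotically constant.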
Keywords: Bayesian prediction; Fisher information; Kullback–Leibler divergence; minimax; predictive metric; subminimax estimator
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Yano, K.; Komaki, F. Asymptotically Constant-Risk Predictive Densities When the Distributions of Data and Target Variables Are Different. Entropy 2014, 16, 3026-3048.
