Entropy 2012, 14(12), 2427-2438; doi:10.3390/e14122427
Article

Implications of the Cressie-Read Family of Additive Divergences for Information Recovery

George G. Judge 1,* and Ron C. Mittelhammer 2
1 Giannini Hall, University of California, Berkeley, Berkeley, CA 94720, USA (Member of the Giannini Foundation)
2 School of Economic Sciences, Washington State University, Pullman, WA 99164, USA
* Author to whom correspondence should be addressed.
Received: 15 October 2012; in revised form: 20 November 2012 / Accepted: 27 November 2012 / Published: 3 December 2012
Abstract: To address the unknown nature of probability-sampling models, in this paper we use information theoretic concepts and the Cressie-Read (CR) family of information divergence measures to produce a flexible family of probability distributions, likelihood functions, estimators, and inference procedures. The usual case in statistical modeling is that the noisy indirect data are observed and known, while the sampling model (the error distribution and probability space) consistent with the data is unknown. To address the unknown sampling process underlying the data, we consider a convex combination of two or more estimators derived from members of the flexible CR family of divergence measures and optimize that combination to select an estimator that minimizes expected quadratic loss. Sampling experiments are used to illustrate the finite sample properties of the resulting estimator and the nature of the recovered sampling distribution.
Keywords: conditional moment equations; Cressie-Read divergence; information theoretic methods; minimum power divergence; information functionals
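
For readers unfamiliar with the divergence family named in the title, the display below gives the standard form of the Cressie-Read (CR) power divergence together with a schematic version of the convex-combination estimator described in the abstract. The notation (p, q, gamma, alpha, beta-hat) is the conventional one for the CR family and is assumed here for illustration; it is not taken from the body of the paper.

% Standard Cressie-Read power-divergence family between probability
% vectors p and q, indexed by the power parameter gamma; the limits
% gamma -> 0 and gamma -> -1 recover Kullback-Leibler- and
% empirical-likelihood-type criteria, respectively.
\[
  I(\mathbf{p},\mathbf{q};\gamma)
    = \frac{1}{\gamma(\gamma + 1)}
      \sum_{i=1}^{n} p_i \left[ \left( \frac{p_i}{q_i} \right)^{\gamma} - 1 \right],
  \qquad \gamma \in \mathbb{R}.
\]

% Schematic convex combination of two CR-based estimators (indexed by
% gamma_1 and gamma_2), with the weight alpha chosen to minimize
% expected quadratic loss, as described in the abstract (assumed notation).
\[
  \hat{\beta}(\alpha) = \alpha \, \hat{\beta}_{\gamma_1} + (1 - \alpha) \, \hat{\beta}_{\gamma_2},
  \qquad
  \hat{\alpha} = \arg\min_{\alpha \in [0,1]}
    \mathrm{E}\left[ \lVert \hat{\beta}(\alpha) - \beta \rVert^{2} \right].
\]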

Cite This Article

MDPI and ACS Style

Judge, G.G.; Mittelhammer, R.C. Implications of the Cressie-Read Family of Additive Divergences for Information Recovery. Entropy 2012, 14, 2427-2438.

AMA Style

Judge GG, Mittelhammer RC. Implications of the Cressie-Read Family of Additive Divergences for Information Recovery. Entropy. 2012; 14(12):2427-2438.

Chicago/Turabian Style

Judge, George G.; Mittelhammer, Ron C. 2012. "Implications of the Cressie-Read Family of Additive Divergences for Information Recovery." Entropy 14, no. 12: 2427-2438.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.