Open Access: this article is freely available.
An Entropic Estimator for Linear Inverse Problems
Department of Economics, Info-Metrics Institute, American University, 4400 Massachusetts Ave., Washington, DC 20016, USA
Centro de Finanzas, IESA, Caracas 1010, Venezuela
* Author to whom correspondence should be addressed.
Received: 29 February 2012; in revised form: 2 April 2012 / Accepted: 17 April 2012 / Published: 10 May 2012
Abstract: In this paper we examine an information-theoretic method for solving noisy linear inverse estimation problems, one that encompasses a whole class of estimation methods under a single framework. Within this framework, prior information about the unknown parameters (when such information exists) and constraints on the parameters can be incorporated into the statement of the problem. The method builds on the maximum entropy principle and consists of transforming the original problem into the estimation of a probability density on an appropriate space naturally associated with the statement of the problem. This estimation method is generic in the sense that it provides a framework for analyzing non-normal models; it is easy to implement and is suitable for all types of inverse problems, including small, ill-conditioned, or noisy-data problems. First-order approximations, large-sample properties, and convergence in distribution are developed as well. Analytical examples and statistics for model comparison and evaluation, inherent to this method, are discussed and complemented with explicit examples.
Keywords: maximum entropy method; generalized entropy estimator; information-theoretic methods; parameter estimation; inverse problems
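The maximum entropy principle that the abstract builds on can be illustrated with a minimal, self-contained sketch (not the authors' estimator): given a discrete support for an unknown quantity and a moment constraint, the entropy-maximizing probabilities take an exponential-family form, with the Lagrange multiplier chosen to satisfy the constraint. The helper name `maxent_probs` and the support/mean values below are illustrative assumptions.

```python
import numpy as np

def maxent_probs(z, m, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy probabilities on support z whose mean equals m.

    The solution has exponential-family form p_k ∝ exp(-lam * z_k);
    lam is found by bisection on the mean constraint. Illustrative
    sketch of the maximum entropy principle, not the paper's method.
    """
    z = np.asarray(z, dtype=float)

    def mean_of(lam):
        # Centering the support avoids overflow in the exponential.
        w = np.exp(-lam * (z - z.mean()))
        p = w / w.sum()
        return p @ z

    # mean_of is decreasing in lam, so bisect until the mean matches m.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > m:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    w = np.exp(-lam * (z - z.mean()))
    return w / w.sum()

# Example: three-point support with the mean constrained to 0.3.
p = maxent_probs([-1.0, 0.0, 1.0], 0.3)
```

In the estimation framework the abstract describes, an analogous reparameterization is applied to each unknown parameter and noise term, turning the original inverse problem into the recovery of probability densities.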
Cite This Article
MDPI and ACS Style
Golan, A.; Gzyl, H. An Entropic Estimator for Linear Inverse Problems. Entropy 2012, 14, 892-923.
Golan A, Gzyl H. An Entropic Estimator for Linear Inverse Problems. Entropy. 2012; 14(5):892-923.
Golan, Amos; Gzyl, Henryk. 2012. "An Entropic Estimator for Linear Inverse Problems." Entropy 14, no. 5: 892-923.