This article is freely available.
Some Further Results on the Minimum Error Entropy Estimation
Badong Chen and Jose C. Principe
Department of Precision Instruments and Mechanology, Tsinghua University, Beijing, 100084, China
Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA
* Author to whom correspondence should be addressed.
Received: 1 April 2012; in revised form: 2 May 2012 / Accepted: 10 May 2012 / Published: 21 May 2012
Abstract: The minimum error entropy (MEE) criterion has been receiving increasing attention owing to its promising prospects for applications in signal processing and machine learning. In the context of Bayesian estimation, the MEE criterion is concerned with estimating a certain random variable based on another random variable so that the entropy of the estimation error is minimized. Several theoretical results on this topic have been reported. In this work, we present some further results on MEE estimation. The contributions are twofold: (1) we extend a recent result on the minimum entropy of a mixture of unimodal and symmetric distributions to a more general case, and prove that if the conditional distributions are generalized uniformly dominated (GUD), the dominant alignment will be the MEE estimator; (2) we show by examples that the MEE estimator (not limited to singular cases) may be non-unique even if the error distribution is restricted to be zero-mean (unbiased).
Keywords: entropy; estimation; minimum error entropy estimation
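To make the MEE criterion described in the abstract concrete, the following is a minimal, hedged sketch (not the paper's own algorithm) of how an MEE estimator can be computed in practice: the error's Renyi quadratic entropy is estimated nonparametrically via the Parzen-window "information potential", and a simple linear model is fitted by maximizing that potential (equivalently, minimizing the entropy). The function names `information_potential` and `mee_linear_fit`, the linear model, the Gaussian kernel width `sigma`, and the grid search are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def information_potential(e, sigma=0.5):
    # Parzen estimate of the quadratic information potential
    # V(e) = (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(e_i - e_j).
    # Renyi's quadratic entropy is H2 = -log V, so maximizing V
    # is the same as minimizing the error entropy.
    d = e[:, None] - e[None, :]
    s2 = 2.0 * sigma ** 2  # variance of the kernel of pairwise differences
    return np.mean(np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2))

def mee_linear_fit(x, y, sigma=0.5, grid=np.linspace(-3.0, 3.0, 121)):
    # Illustrative grid search over the slope w of y_hat = w * x.
    # Entropy is shift-invariant, so the bias b is fixed afterwards by
    # centering the error -- the zero-mean (unbiased) alignment the
    # abstract refers to.
    best_w, best_v = grid[0], -np.inf
    for w in grid:
        v = information_potential(y - w * x, sigma)
        if v > best_v:
            best_w, best_v = w, v
    b = np.mean(y - best_w * x)  # enforce zero-mean error
    return best_w, b

rng = np.random.default_rng(0)
x = rng.normal(size=400)
y = 1.5 * x + 0.3 * rng.standard_t(df=3, size=400)  # heavy-tailed noise
w, b = mee_linear_fit(x, y)
```

Under heavy-tailed noise such as the Student-t disturbance above, entropy-based fitting of this kind is often preferred to least squares, which is one motivation for the MEE criterion.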
Cite This Article
MDPI and ACS Style
Chen, B.; Principe, J.C. Some Further Results on the Minimum Error Entropy Estimation. Entropy 2012, 14, 966-977.