Some Further Results on the Minimum Error Entropy Estimation
Abstract

The minimum error entropy (MEE) criterion has been receiving increasing attention due to its promising perspectives for applications in signal processing and machine learning. In the context of Bayesian estimation, the MEE criterion is concerned with the estimation of a certain random variable based on another random variable, such that the entropy of the estimation error is minimized. Several theoretical results on this topic have been reported. In this work, we present some further results on MEE estimation. The contributions are twofold: (1) we extend a recent result on the minimum entropy of a mixture of unimodal and symmetric distributions to a more general case, and prove that if the conditional distributions are generalized uniformly dominated (GUD), the dominant alignment will be the MEE estimator; (2) we show by examples that the MEE estimator (not limited to singular cases) may be non-unique even if the error distribution is restricted to be zero-mean (unbiased).
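To make the criterion concrete: in practice the error entropy is not available in closed form, so it is commonly replaced by a Parzen-window estimate of Rényi's quadratic entropy computed from error samples. The sketch below is illustrative only and is not the procedure of this paper; the linear estimator, the Gaussian kernel bandwidth `sigma`, and the grid search are all assumptions chosen for simplicity.

```python
import numpy as np

def quadratic_renyi_entropy(e, sigma=0.5):
    """Parzen-window estimate of Renyi's quadratic entropy H2(e) = -log V(e),
    where V(e) is the information potential: the mean pairwise Gaussian kernel
    over error samples (kernel bandwidth sqrt(2)*sigma after convolution)."""
    d = e[:, None] - e[None, :]                  # all pairwise differences
    v = np.mean(np.exp(-d**2 / (4.0 * sigma**2))) / np.sqrt(4.0 * np.pi * sigma**2)
    return -np.log(v)

# Synthetic estimation problem: estimate x from y with a linear model a*y.
rng = np.random.default_rng(0)
y = rng.normal(size=500)
x = 2.0 * y + rng.normal(scale=0.3, size=500)

# MEE in its simplest form: choose the coefficient a that minimizes the
# (estimated) entropy of the error e = x - a*y, here by a coarse grid search.
grid = np.linspace(0.0, 4.0, 81)
best_a = min(grid, key=lambda a: quadratic_renyi_entropy(x - a * y))
```

Note that entropy is invariant to a shift of the error, so MEE alone fixes the estimator only up to an additive constant; in practice the estimate is recentered (e.g., to make the error zero-mean), which connects to the unbiasedness restriction discussed in contribution (2).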
Cite This Article
Chen, B.; Principe, J.C. Some Further Results on the Minimum Error Entropy Estimation. Entropy 2012, 14, 966-977.