Open Access Article
Entropy 2015, 17(5), 3419-3437; doi:10.3390/e17053419

Minimum Error Entropy Algorithms with Sparsity Penalty Constraints

1. School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510640, China
2. School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an 710049, China
3. Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611, USA
* Author to whom correspondence should be addressed.
Academic Editor: Kevin H. Knuth
Received: 30 January 2015 / Revised: 28 April 2015 / Accepted: 5 May 2015 / Published: 18 May 2015

Abstract

Recently, sparse adaptive learning algorithms have been developed to exploit system sparsity and to mitigate various noise disturbances in many applications. In particular, in sparse channel estimation, a parameter vector with a sparsity characteristic can be well estimated from noisy measurements by a sparse adaptive filter. Most previous works use a mean square error (MSE) based cost to develop sparse filters, which is rational under the assumption of Gaussian distributions. However, the Gaussian assumption does not always hold in real-world environments. To address this issue, we incorporate an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better than MSE-based methods, especially in heavy-tailed non-Gaussian situations, since the error entropy can capture higher-order statistics of the errors. In addition, a new approximator of the l0-norm, based on the correntropy induced metric (CIM), is used as a sparsity penalty term (SPT). We analyze the mean square convergence of the proposed sparse adaptive filters: an energy conservation relation is derived, and a sufficient condition that ensures mean square convergence is obtained. Simulation results confirm the superior performance of the new algorithms.
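The abstract combines two ingredients that can be illustrated concretely: a sliding-window minimum error entropy (MEE) gradient step and a sparsity penalty on the filter weights. Below is a minimal NumPy sketch under common textbook definitions (a Gaussian kernel and the pairwise information-potential estimator of the error entropy); the function names, default step sizes, and kernel widths are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel used by the information-potential (error entropy) estimator."""
    return np.exp(-u ** 2 / (2.0 * sigma ** 2))

def mee_l1_update(w, X, d, eta=0.1, sigma=1.0, rho=1e-4):
    """One stochastic-gradient step of an MEE adaptive filter with an
    l1-norm (zero-attracting) sparsity penalty.

    w    : (M,) current filter weights
    X, d : (L, M), (L,) sliding window of inputs and desired outputs
    eta  : step size; sigma : kernel width; rho : l1 penalty strength
    """
    e = d - X @ w                            # window of a priori errors
    de = e[:, None] - e[None, :]             # pairwise error differences
    k = gaussian_kernel(de, sigma)
    dX = X[:, None, :] - X[None, :, :]       # pairwise input differences
    # gradient of the information potential V = (1/L^2) sum_ij kappa(e_i - e_j)
    grad = ((k * de)[:, :, None] * dX).sum(axis=(0, 1)) / (sigma ** 2 * len(e) ** 2)
    # ascend V (i.e., reduce error entropy) and shrink weights toward zero
    return w + eta * grad - rho * np.sign(w)

def cim_penalty(w, sigma=0.05):
    """CIM-based l0-norm approximator, in one common form:
    CIM^2(w, 0) = (kappa(0)/M) * sum_i (1 - exp(-w_i^2 / (2 sigma^2))).
    As sigma -> 0, the sum tends to the number of nonzero entries of w."""
    kappa0 = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    return (kappa0 / len(w)) * np.sum(1.0 - np.exp(-w ** 2 / (2.0 * sigma ** 2)))
```

Replacing the `np.sign(w)` term with the gradient of `cim_penalty` would give a CIM-penalized variant, while a reweighted l1 penalty would instead scale `np.sign(w)` entrywise by roughly `1 / (|w| + delta)`.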
Keywords: sparse estimation; minimum error entropy; correntropy induced metric; mean square convergence; impulsive noise
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Wu, Z.; Peng, S.; Ma, W.; Chen, B.; Principe, J.C. Minimum Error Entropy Algorithms with Sparsity Penalty Constraints. Entropy 2015, 17, 3419-3437.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.