Entropy 2010, 12(3), 338-374; doi:10.3390/e12030338
Article

Estimation of an Entropy-based Functional

Received: 30 December 2009; in revised form: 8 February 2010 / Accepted: 24 February 2010 / Published: 3 March 2010
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: Given a function f from [0, 1] to the real line, we consider the (nonlinear) functional h obtained by evaluating the continuous entropy of the “density function” of f. Motivated by an application in signal processing, we wish to estimate h(f). Our main tool is a decomposition of h into two terms, each of which has favorable scaling properties. We show that, if functions f and g satisfy a regularity condition, then the smallness of ∥f − g∥ and ∥f′ − g′∥, along with some basic control on the derivatives of f and g, is sufficient to imply that h(f) and h(g) are close.
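One plausible reading of the abstract (an assumption here; the paper itself fixes the precise definition) is that the “density function” of f is the probability density of f(U) for U uniform on [0, 1], so that h(f) = −∫ p_f(y) log p_f(y) dy is the differential entropy of that distribution. Under that reading, a minimal histogram plug-in sketch of estimating h(f) looks like this; the function name and parameters are illustrative, not from the paper:

```python
import numpy as np

def entropy_of_function(f, n_grid=100_000, n_bins=100):
    """Estimate h(f): the differential entropy of the distribution of f(U),
    U ~ Uniform[0, 1], via a histogram plug-in estimator.

    Assumption: "density function of f" means the density of f(U); the
    paper's exact definition and estimator may differ.
    """
    # Deterministic midpoint grid on [0, 1] standing in for samples of U.
    u = (np.arange(n_grid) + 0.5) / n_grid
    y = f(u)
    counts, edges = np.histogram(y, bins=n_bins)
    p = counts / counts.sum()          # empirical bin probabilities
    widths = np.diff(edges)            # bin widths (all equal here)
    nz = p > 0                         # skip empty bins: 0 * log 0 = 0
    # Plug-in differential entropy: -sum_i p_i * log(p_i / width_i)
    return float(-np.sum(p[nz] * np.log(p[nz] / widths[nz])))
```

As a sanity check, f(x) = x makes f(U) uniform on [0, 1] (entropy 0), and f(x) = 2x makes it uniform on [0, 2] (entropy log 2), and the estimator recovers both to within histogram discretization error.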
Keywords: entropy; differential entropy; Shannon entropy; entropy estimation; nonlinear functional; signal processing


MDPI and ACS Style

Maurizi, B.N. Estimation of an Entropy-based Functional. Entropy 2010, 12, 338-374.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.