Entropy 2010, 12(3), 338-374; doi:10.3390/e12030338

Estimation of an Entropy-based Functional

7576 Dale Ave., Saint Louis, Missouri 63117, USA
Received: 30 December 2009; in revised form: 8 February 2010 / Accepted: 24 February 2010 / Published: 3 March 2010
Abstract: Given a function f from [0, 1] to the real line, we consider the (nonlinear) functional h obtained by evaluating the continuous entropy of the “density function” of f. Motivated by an application in signal processing, we wish to estimate h(f). Our main tool is a decomposition of h into two terms, each of which has favorable scaling properties. We show that, if functions f and g satisfy a regularity condition, then the smallness of ∥f − g∥ and ∥f′ − g′∥, together with some basic control on the derivatives of f and g, is sufficient to imply that h(f) and h(g) are close.
Keywords: entropy; differential entropy; Shannon entropy; entropy estimation; nonlinear functional; signal processing
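The abstract describes h(f) as the continuous (differential) entropy of the “density function” of f, i.e., the distribution of f(X) when X is uniform on [0, 1]. As a rough illustration only — the paper's actual two-term decomposition is not reproduced here — the following sketch shows a naive plug-in estimate of such a functional via sampling and a histogram; the function name, sample counts, and bin counts are all illustrative choices, not from the paper.

```python
import numpy as np

def entropy_of_function(f, n_samples=100_000, n_bins=200, seed=0):
    """Naive plug-in estimate of the differential entropy of f(X), X ~ Uniform[0, 1].

    This is an illustrative sketch, not the estimator analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, n_samples)
    y = f(x)  # samples from the distribution whose entropy defines h(f)

    counts, edges = np.histogram(y, bins=n_bins)
    widths = np.diff(edges)
    p = counts / counts.sum()  # empirical bin probabilities
    mask = p > 0

    # Differential entropy of the histogram density: -sum p * log(p / width)
    return float(-(p[mask] * np.log(p[mask] / widths[mask])).sum())

# Sanity check: f(x) = x gives the uniform density on [0, 1],
# whose differential entropy is 0, so the estimate should be near 0.
h_est = entropy_of_function(lambda x: x)
```

For smooth monotone f, the density of f(X) is 1/|f′(f⁻¹(y))|, which is why the abstract's hypotheses involve control on f′; closeness of f, g and of f′, g′ then constrains the two densities and hence the entropies.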


Cite This Article

MDPI and ACS Style

Maurizi, B.N. Estimation of an Entropy-based Functional. Entropy 2010, 12, 338-374.

AMA Style

Maurizi BN. Estimation of an Entropy-based Functional. Entropy. 2010; 12(3):338-374.

Chicago/Turabian Style

Maurizi, Brian N. 2010. "Estimation of an Entropy-based Functional." Entropy 12, no. 3: 338-374.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.