Open Access: This article is freely available.
Estimation of an Entropy-based Functional
Brian N. Maurizi
7576 Dale Ave., Saint Louis, Missouri 63117, USA
Received: 30 December 2009 / Revised: 8 February 2010 / Accepted: 24 February 2010 / Published: 3 March 2010
Abstract: Given a function f from [0, 1] to the real line, we consider the (nonlinear) functional h obtained by evaluating the continuous entropy of the “density function” of f. Motivated by an application in signal processing, we wish to estimate h(f). Our main tool is a decomposition of h into two terms, each of which has favorable scaling properties. We show that, if functions f and g satisfy a regularity condition, then the smallness of ∥f − g∥∞ and ∥f′ − g′∥∞, along with some basic control on the derivatives of f and g, is sufficient to imply that h(f) and h(g) are close.
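To make the functional concrete: the “density function” of f is the distribution of the values f(U) for U uniform on [0, 1], and h(f) is the differential entropy of that distribution. The sketch below is an illustrative Monte Carlo/histogram estimator of this quantity, not the estimator developed in the paper; the function name and parameters are hypothetical.

```python
import numpy as np

def entropy_functional(f, n_samples=100_000, n_bins=200, seed=0):
    """Illustrative estimate of h(f): the differential entropy of the
    distribution of values f(U), with U uniform on [0, 1].

    This is a naive histogram plug-in estimator, offered only to fix
    ideas; it is not the method analyzed in the article.
    """
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 1.0, n_samples)
    values = f(u)

    # Histogram estimate of the density of f(U); density=True normalizes
    # the counts so the histogram integrates to 1.
    p, edges = np.histogram(values, bins=n_bins, density=True)
    widths = np.diff(edges)

    # Plug-in differential entropy: -sum over bins of p * log(p) * width,
    # skipping empty bins where p = 0.
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])
```

For example, with f(x) = x the values are uniform on [0, 1], whose differential entropy is 0, so the estimate should be near zero; with f(x) = 2x (uniform on [0, 2]) it should be near log 2.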
Keywords: entropy; differential entropy; Shannon entropy; entropy estimation; nonlinear functional; signal processing
Cite This Article
MDPI and ACS Style
Maurizi, B.N. Estimation of an Entropy-based Functional. Entropy 2010, 12, 338-374.