Article

An Integral Representation of the Relative Entropy

Miku Hirata, Aya Nemoto and Hiroaki Yoshida
1 Department of Mathematics, Ochanomizu University, 2-1-1, Otsuka, Bunkyo-ku, Tokyo 112-8610, Japan
2 Department of Information Sciences, Ochanomizu University, 2-1-1, Otsuka, Bunkyo-ku, Tokyo 112-8610, Japan
* Author to whom correspondence should be addressed.
Entropy 2012, 14(8), 1469-1477; https://doi.org/10.3390/e14081469
Submission received: 15 June 2012 / Revised: 28 July 2012 / Accepted: 2 August 2012 / Published: 8 August 2012

Abstract:
Recently, an identity of de Bruijn type between the relative entropy and the relative Fisher information, in which the reference measure is also perturbed, was unveiled by Verdú via MMSE in estimation theory. In this paper, we give another, more direct proof of this identity, in which the derivative is calculated by applying integration by parts together with the heat equation. We also derive an integral representation of the relative entropy; as one of its applications, the logarithmic Sobolev inequality for centered Gaussian measures is obtained.

1. Introduction

Probability measures on $\mathbb{R}^n$ treated in this paper are absolutely continuous with respect to the standard Lebesgue measure, and we shall identify them with their densities.
For a probability measure $f$, the entropy $H(f)$ and the Fisher information $J(f)$ can be introduced, which play important roles in information theory, probability, and statistics. For more details on these subjects, see the well-known book [1].
Hereafter, for a function $\phi(x) = \phi(x_1, x_2, \ldots, x_n)$ of $n$ variables on $\mathbb{R}^n$, the integral of $\phi$ over the whole of $\mathbb{R}^n$ with respect to the standard Lebesgue measure $dx = dx_1\, dx_2 \cdots dx_n$ is abbreviated as
$$\int_{\mathbb{R}^n} \phi\, dx = \int_{\mathbb{R}^n} \phi(x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n,$$
that is, we shall omit $(x_1, x_2, \ldots, x_n)$ from the integrand in order to simplify the expressions.
Definition 1.1. Let $f$ be a probability measure on $\mathbb{R}^n$. Then the (differential) entropy of $f$ is defined by
$$H(f) = -\int_{\mathbb{R}^n} f \log f\, dx.$$
For a random variable $X$ on $\mathbb{R}^n$ with the density $f$, we write the entropy of $X$ as $H(X) = H(f)$.
The Fisher information of a differentiable density $f$ is defined by
$$J(f) = \int_{\mathbb{R}^n} \frac{\|\nabla f\|^2}{f}\, dx = \int_{\mathbb{R}^n} f\, \big\|\nabla \log f\big\|^2\, dx.$$
When the random variable $X$ on $\mathbb{R}^n$ has the differentiable density $f$, we also write $J(X) = J(f)$.
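As a numerical illustration of these two functionals (a minimal sketch, not taken from the paper), the following Python code evaluates $H(f)$ and $J(f)$ by quadrature on a grid for a univariate Gaussian density and compares them with the closed forms $H = \frac{1}{2}\log(2\pi e \sigma^2)$ and $J = 1/\sigma^2$; the grid and the value of $\sigma$ are arbitrary choices.

```python
import numpy as np

sigma = 1.5                                   # illustrative choice of standard deviation

def gaussian_pdf(x, mu=0.0, s=sigma):
    return np.exp(-(x - mu) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

# A grid wide and fine enough that the Gaussian tails are negligible.
x = np.linspace(-20.0, 20.0, 200001)
dx = x[1] - x[0]
f = gaussian_pdf(x)

# Differential entropy H(f) = -∫ f log f dx.
H = -np.sum(f * np.log(f)) * dx

# Fisher information J(f) = ∫ (f')^2 / f dx, with f' by central differences.
df = np.gradient(f, dx)
J = np.sum(df ** 2 / f) * dx

print(H, 0.5 * np.log(2 * np.pi * np.e * sigma ** 2))   # should agree closely
print(J, 1.0 / sigma ** 2)
```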
An important result on the behavior of the Fisher information under convolution (the sum of independent random variables) is the Stam inequality, which was first stated by Stam in [2] and subsequently proved by Blachman [3],
$$\frac{1}{J(f * g)} \ge \frac{1}{J(f)} + \frac{1}{J(g)}, \tag{1}$$
where equality holds if and only if $f$ and $g$ are Gaussian.
The importance of the Stam inequality can be seen in its applications, for instance, the entropy power inequality [2], the logarithmic Sobolev inequality [4], the Cercignani conjecture [5], and the Shannon conjecture on entropy and the central limit theorem [6,7].
For $t \ge 0$, we denote by $P_t f$ the convolution of $f$ with the $n$-dimensional Gaussian density with mean vector $\mathbf{0}$ and covariance matrix $t I_n$, where $I_n$ is the identity matrix. Namely, $(P_t)_{t \ge 0}$ is the heat semigroup acting on $f$, and it satisfies the partial differential equation
$$\frac{\partial}{\partial t} P_t f = \frac{1}{2}\, \Delta P_t f, \tag{2}$$
which is called the heat equation. In this paper, we simply denote $P_t f$ by $f_t$ and call it the Gaussian perturbation of $f$. Namely, letting $X$ be the random variable on $\mathbb{R}^n$ with the density $f$ and $Z$ be an $n$-dimensional Gaussian random variable independent of $X$ with mean vector $\mathbf{0}$ and covariance matrix $I_n$, the Gaussian perturbation $f_t$ stands for the density function $f(x, t)$ of the independent sum $X + \sqrt{t}\, Z$.
The remarkable relation between the entropy and the Fisher information can be established by a Gaussian perturbation (see, for instance, [1], [2], or [8]):
$$\frac{d}{dt} H(f_t) = \frac{1}{2}\, J(f_t) \quad \text{for } t > 0, \tag{3}$$
which is known as the de Bruijn identity.
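The de Bruijn identity (3) can be checked numerically. The sketch below (an illustration, not part of the argument) uses a two-component Gaussian mixture, for which the Gaussian perturbation $f_t$ is again a Gaussian mixture with each component variance increased by $t$, and compares a finite-difference approximation of $\frac{d}{dt} H(f_t)$ with $\frac{1}{2} J(f_t)$; all weights, means, and variances are arbitrary.

```python
import numpy as np

x = np.linspace(-30.0, 30.0, 300001)
dx = x[1] - x[0]

def f_t(t):
    """Gaussian perturbation of a two-component Gaussian mixture:
    convolution with N(0, t) adds t to each component variance."""
    def comp(m, v):
        return np.exp(-(x - m) ** 2 / (2 * (v + t))) / np.sqrt(2 * np.pi * (v + t))
    return 0.3 * comp(-2.0, 1.0) + 0.7 * comp(1.5, 0.5)

def entropy(t):
    f = f_t(t)
    return -np.sum(f * np.log(f)) * dx

def fisher(t):
    f = f_t(t)
    df = np.gradient(f, dx)
    return np.sum(df ** 2 / f) * dx

t, h = 1.0, 1e-4
lhs = (entropy(t + h) - entropy(t - h)) / (2 * h)   # d/dt H(f_t) by central differences
rhs = 0.5 * fisher(t)                               # (1/2) J(f_t)
print(lhs, rhs)                                     # the two values should nearly coincide
```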
Let $f$ and $g$ be probability measures on $\mathbb{R}^n$ such that $f \ll g$ ($f$ is absolutely continuous with respect to $g$). Setting the probability measure $g$ as a reference, the relative entropy and the relative Fisher information can be introduced as follows:
Definition 1.2. The relative entropy of $f$ with respect to $g$, $D(f \,\|\, g)$, is defined by
$$D(f \,\|\, g) = \int_{\mathbb{R}^n} f \log \frac{f}{g}\, dx = \int_{\mathbb{R}^n} f \log f\, dx - \int_{\mathbb{R}^n} f \log g\, dx,$$
which is always non-negative.
We also define the relative Fisher information of $f$ with respect to $g$ by
$$J(f \,\|\, g) = \int_{\mathbb{R}^n} f\, \Big\|\nabla \log \frac{f}{g}\Big\|^2\, dx = \int_{\mathbb{R}^n} f\, \big\|\nabla \log f - \nabla \log g\big\|^2\, dx,$$
which is also non-negative. When random variables $X$ and $Y$ have the densities $f$ and $g$, respectively, the relative entropy and the relative Fisher information of $X$ with respect to $Y$ are defined by $D(X \,\|\, Y) = D(f \,\|\, g)$ and $J(X \,\|\, Y) = J(f \,\|\, g)$, respectively.
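For two univariate Gaussian densities both quantities are easy to evaluate, which gives a concrete illustration of Definition 1.2. The following sketch (with arbitrary parameter values) compares the quadrature value of $D(f\,\|\,g)$ with the standard Gaussian closed form and evaluates $J(f\,\|\,g)$ using the fact that Gaussian scores are linear.

```python
import numpy as np

m1, s1, m2, s2 = 0.5, 1.0, 0.0, 2.0       # f = N(m1, s1^2), g = N(m2, s2^2)

def pdf(x, m, s):
    return np.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

x = np.linspace(-30.0, 30.0, 300001)
dx = x[1] - x[0]
f, g = pdf(x, m1, s1), pdf(x, m2, s2)

# Relative entropy D(f||g) by quadrature, against the Gaussian closed form.
D_num = np.sum(f * np.log(f / g)) * dx
D_exact = np.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5
print(D_num, D_exact)

# Relative Fisher information J(f||g); for Gaussians the scores are linear in x.
score_diff = -(x - m1) / s1 ** 2 + (x - m2) / s2 ** 2
J_num = np.sum(f * score_diff ** 2) * dx
print(J_num)                               # non-negative, and zero only when f = g
```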
In view of the de Bruijn identity, one might expect that there is a similar connection between the relative entropy and the relative Fisher information. Indeed, gradient formulas for relative entropy functionals were obtained in [9,10,11], although in those cases the reference measure is kept fixed.
Recently, however, Verdú [12] investigated the derivative in $t$ of $D(f_t \,\|\, g_t)$ for two Gaussian perturbations $f_t$ and $g_t$. Here we should note that in this case the reference measure moves with the same time parameter. The following identity of de Bruijn type,
$$\frac{d}{dt} D(f_t \,\|\, g_t) = -\frac{1}{2}\, J(f_t \,\|\, g_t),$$
has been derived via MMSE in estimation theory (see also [13] for general perturbations).
The main aim of this paper is to give an alternative proof of this identity by a direct calculation using integration by parts, a method similar to the ones in [11,14]. Moreover, it is easily seen that the above identity yields an integral representation of the relative entropy. As an application of this integral representation, we shall also give a simple proof of the logarithmic Sobolev inequality for centered Gaussian measures in the univariate ($n = 1$) case.

2. An Integral Representation of the Relative Entropy

We shall make the Gaussian perturbations $f_t$ and $g_t$ of $f$ and $g$, respectively, and consider the relative entropy $D(f_t \,\|\, g_t)$, where the absolute continuity $f_t \ll g_t$ remains true for $t > 0$.
Here, we regard $D(f_t \,\|\, g_t)$ as a function of $t$ and calculate the derivative
$$\frac{d}{dt} D(f_t \,\|\, g_t) = \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log \frac{f_t}{g_t}\, dx = \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log f_t\, dx - \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log g_t\, dx \tag{4}$$
by integration by parts with the help of the heat equation.
Proposition 2.1. Let $f \ll g$ be probability measures on $\mathbb{R}^n$ with finite Fisher informations $J(f) < \infty$ and $J(g) < \infty$, and finite relative entropy $D(f \,\|\, g) < \infty$. Then we obtain
$$\frac{d}{dt} D(f_t \,\|\, g_t) = -\frac{1}{2}\, J(f_t \,\|\, g_t) \quad \text{for } t > 0.$$
Proof. First we should notice that the Fisher informations $J(f_t)$ and $J(g_t)$ are finite for any $t > 0$. Indeed, if, for instance, an $n$-dimensional random variable $X$ has the density $f$ and $Z$ is an $n$-dimensional Gaussian random variable independent of $X$ with mean vector $\mathbf{0}$ and covariance matrix $I_n$, then applying the Stam inequality (1) to the independent random variables $X$ and $\sqrt{t}\, Z$ gives
$$J(f_t) = J\big(X + \sqrt{t}\, Z\big) \le \left( \frac{1}{J(X)} + \frac{1}{J(\sqrt{t}\, Z)} \right)^{-1} = \frac{J(X)}{1 + \frac{t}{n} J(X)} \le J(f) < \infty, \tag{5}$$
where $J(Z) = n$ follows by a simple calculation. We shall also notice that the function $D(f_t \,\|\, g_t)$ is non-increasing in $t$, that is, for $t > 0$,
$$0 \le D(f_t \,\|\, g_t) \le D(f \,\|\, g) < \infty,$$
which can be found in [15] (p. 101). Therefore, $D(f_t \,\|\, g_t)$ is finite for $t > 0$. Moreover, by a nonlinear approximation argument in [11], we may impose, without loss of generality, the stronger assumption that
$$\text{the relative density } \frac{f_t}{g_t} \text{ is bounded away from } 0 \text{ and } \infty \text{ on } \mathbb{R}^n. \tag{6}$$
Concerning the first term on the right-most side of (4), it follows immediately from the de Bruijn identity (3) that
$$\frac{d}{dt} \int_{\mathbb{R}^n} f_t \log f_t\, dx = -\frac{1}{2} \int_{\mathbb{R}^n} \frac{\|\nabla f_t\|^2}{f_t}\, dx; \tag{7}$$
hence, we shall concentrate our attention on the second term.
Since the densities $f_t$ and $g_t$ satisfy the heat equation (2), the second term can be reformulated as follows:
$$\begin{aligned} \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log g_t\, dx &= \int_{\mathbb{R}^n} f_t\, \frac{\partial}{\partial t}\big(\log g_t\big)\, dx + \int_{\mathbb{R}^n} \log g_t\, \frac{\partial}{\partial t} f_t\, dx \\ &= \int_{\mathbb{R}^n} f_t\, \frac{1}{g_t}\, \frac{\partial}{\partial t} g_t\, dx + \int_{\mathbb{R}^n} \log g_t \left( \frac{1}{2}\, \Delta f_t \right) dx \\ &= \int_{\mathbb{R}^n} \frac{f_t}{g_t} \left( \frac{1}{2}\, \Delta g_t \right) dx + \int_{\mathbb{R}^n} \log g_t \left( \frac{1}{2}\, \Delta f_t \right) dx. \end{aligned} \tag{8}$$
In this reformulation, we have interchanged integration and differentiation at the first equality, which is justified by a routine argument with the bounded convergence theorem (see, for instance, [16]).
Applying integration by parts to the first term in the last expression of (8), it becomes
$$\int_{\mathbb{R}^n} \frac{f_t}{g_t} \left( \frac{1}{2}\, \Delta g_t \right) dx = -\frac{1}{2} \int_{\mathbb{R}^n} \nabla\!\left( \frac{f_t}{g_t} \right) \cdot \nabla g_t\, dx, \tag{9}$$
which can be asserted by the following observation. Since $g_t$ has finite Fisher information $J(g_t) < \infty$, $\nabla g_t / \sqrt{g_t}$ has finite norm in $L^2(\mathbb{R}^n)$ and must be bounded at infinity. Furthermore, by our technical assumption (6), $f_t / g_t$ is also bounded. Hence, if we factorize as
$$\frac{f_t}{g_t}\, \nabla g_t = \sqrt{f_t}\; \sqrt{\frac{f_t}{g_t}}\; \frac{\nabla g_t}{\sqrt{g_t}},$$
then it can be seen that $\dfrac{f_t}{g_t}\, \nabla g_t$ vanishes at infinity.
Applying integration by parts to the second term in the last expression of (8), it becomes
$$\int_{\mathbb{R}^n} \log g_t \left( \frac{1}{2}\, \Delta f_t \right) dx = -\frac{1}{2} \int_{\mathbb{R}^n} \frac{\nabla g_t}{g_t} \cdot \nabla f_t\, dx. \tag{10}$$
Here it should be noted that $\log g_t\, \nabla f_t$ vanishes at infinity by the following observation. Similarly, we factorize it as
$$\log g_t\, \nabla f_t = 2 \sqrt{g_t}\, \log \sqrt{g_t}\; \sqrt{\frac{f_t}{g_t}}\; \frac{\nabla f_t}{\sqrt{f_t}}.$$
Then the boundedness of $\nabla f_t / \sqrt{f_t}$ comes from the fact that $J(f_t) < \infty$, and that of $\sqrt{f_t / g_t}$ follows from the assumption (6) as before. Furthermore, the limit formula $\lim_{\xi \to 0^+} \xi \log \xi = 0$ ensures that $\sqrt{g_t}\, \log \sqrt{g_t}$ vanishes at infinity.
Substituting Equations (9) and (10) into (8), it follows that
$$\begin{aligned} \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log g_t\, dx &= -\frac{1}{2} \int_{\mathbb{R}^n} \nabla\!\left( \frac{f_t}{g_t} \right) \cdot \nabla g_t\, dx - \frac{1}{2} \int_{\mathbb{R}^n} \frac{\nabla g_t}{g_t} \cdot \nabla f_t\, dx \\ &= -\frac{1}{2} \int_{\mathbb{R}^n} \left( \frac{\nabla f_t}{g_t} - \frac{f_t\, \nabla g_t}{g_t^{\,2}} \right) \cdot \nabla g_t\, dx - \frac{1}{2} \int_{\mathbb{R}^n} f_t\, \frac{\nabla g_t}{g_t} \cdot \frac{\nabla f_t}{f_t}\, dx \\ &= -\int_{\mathbb{R}^n} f_t\, \frac{\nabla g_t}{g_t} \cdot \frac{\nabla f_t}{f_t}\, dx + \frac{1}{2} \int_{\mathbb{R}^n} f_t\, \frac{\nabla g_t}{g_t} \cdot \frac{\nabla g_t}{g_t}\, dx. \end{aligned} \tag{11}$$
Combining Equations (7) and (11), we have that
$$\begin{aligned} \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log f_t\, dx - \frac{d}{dt} \int_{\mathbb{R}^n} f_t \log g_t\, dx &= -\frac{1}{2} \int_{\mathbb{R}^n} f_t\, \frac{\nabla f_t}{f_t} \cdot \frac{\nabla f_t}{f_t}\, dx + \int_{\mathbb{R}^n} f_t\, \frac{\nabla g_t}{g_t} \cdot \frac{\nabla f_t}{f_t}\, dx - \frac{1}{2} \int_{\mathbb{R}^n} f_t\, \frac{\nabla g_t}{g_t} \cdot \frac{\nabla g_t}{g_t}\, dx \\ &= -\frac{1}{2} \int_{\mathbb{R}^n} f_t\, \Big\| \frac{\nabla f_t}{f_t} - \frac{\nabla g_t}{g_t} \Big\|^2\, dx, \end{aligned}$$
which means
$$\frac{d}{dt} D(f_t \,\|\, g_t) = -\frac{1}{2} \int_{\mathbb{R}^n} f_t\, \big\| \nabla \log f_t - \nabla \log g_t \big\|^2\, dx = -\frac{1}{2}\, J(f_t \,\|\, g_t). \qquad \square$$
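As an illustration of Proposition 2.1 (again a sketch with arbitrary parameters, not part of the proof), take Gaussian $f$ and $g$ on $\mathbb{R}$; then $f_t$ and $g_t$ stay Gaussian with variances increased by $t$, $D(f_t\,\|\,g_t)$ has a closed form, and $J(f_t\,\|\,g_t)$ can be evaluated by quadrature, so the two sides of the identity can be compared:

```python
import numpy as np

m1, v1, m2, v2 = 1.0, 1.0, 0.0, 3.0       # f = N(m1, v1), g = N(m2, v2)

def rel_entropy(t):
    """Closed form of D(f_t || g_t): both perturbations remain Gaussian,
    with variances v1 + t and v2 + t."""
    a, b = v1 + t, v2 + t
    return 0.5 * (np.log(b / a) + (a + (m1 - m2) ** 2) / b - 1.0)

def rel_fisher(t):
    """J(f_t || g_t) by quadrature of ∫ f_t (d/dx log f_t - d/dx log g_t)^2 dx."""
    a, b = v1 + t, v2 + t
    x = np.linspace(-40.0, 40.0, 400001)
    dx = x[1] - x[0]
    f = np.exp(-(x - m1) ** 2 / (2 * a)) / np.sqrt(2 * np.pi * a)
    score_diff = -(x - m1) / a + (x - m2) / b
    return np.sum(f * score_diff ** 2) * dx

t, h = 0.5, 1e-5
lhs = (rel_entropy(t + h) - rel_entropy(t - h)) / (2 * h)   # d/dt D(f_t || g_t)
rhs = -0.5 * rel_fisher(t)                                  # -(1/2) J(f_t || g_t)
print(lhs, rhs)                                             # both negative and nearly equal
```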
Let $X$ and $Y$ be $n$-dimensional random variables with the densities $f$ and $g$, respectively, and let $Z$ be an $n$-dimensional Gaussian random variable independent of $X$ and $Y$ with mean vector $\mathbf{0}$ and covariance matrix $I_n$.
Since the relative entropy is scale invariant, it follows that
$$D\big( X + \sqrt{t}\, Z \,\big\|\, Y + \sqrt{t}\, Z \big) = D\Big( \tfrac{1}{\sqrt{t}} X + Z \,\Big\|\, \tfrac{1}{\sqrt{t}} Y + Z \Big).$$
Both $\frac{1}{\sqrt{t}} X + Z$ and $\frac{1}{\sqrt{t}} Y + Z$ converge to $Z$ in distribution as $t \to \infty$. Thus, we have
$$\lim_{t \to \infty} D(f_t \,\|\, g_t) = 0,$$
and the following integral representation of the relative entropy can be obtained:
Theorem 2.2. Let $f \ll g$ be probability measures with finite Fisher informations and finite relative entropy $D(f \,\|\, g)$. Then we have the integral representation
$$D(f \,\|\, g) = \frac{1}{2} \int_0^{\infty} J(f_t \,\|\, g_t)\, dt.$$
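Theorem 2.2 can also be verified numerically for two Gaussian measures, for which $J(f_t\,\|\,g_t)$ has an explicit form (the score difference is linear in $x$, so the Gaussian expectation is elementary). The sketch below compares the closed-form $D(f\,\|\,g)$ with $\frac{1}{2}\int_0^\infty J(f_t\,\|\,g_t)\,dt$ computed by numerical integration; parameter values are arbitrary.

```python
import numpy as np
from scipy.integrate import quad

m1, v1, m2, v2 = 1.0, 1.0, 0.0, 3.0            # f = N(m1, v1), g = N(m2, v2)

def rel_fisher(t):
    """Explicit J(f_t || g_t) for Gaussian f and g: with a = variance of f_t and
    b = variance of g_t, the score difference is linear in x and
    J(f_t || g_t) = (1/b - 1/a)^2 * a + (m1 - m2)^2 / b^2."""
    a, b = v1 + t, v2 + t
    return (1.0 / b - 1.0 / a) ** 2 * a + (m1 - m2) ** 2 / b ** 2

rhs, _ = quad(rel_fisher, 0.0, np.inf)          # ∫_0^∞ J(f_t || g_t) dt
rhs *= 0.5

lhs = 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)   # D(f || g)
print(lhs, rhs)                                 # the two values should agree
```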

3. An Application to the Logarithmic Sobolev Inequality

In this section, we shall give a proof of the logarithmic Sobolev inequality for a centered Gaussian measure in the case $n = 1$. Although several proofs of the logarithmic Sobolev inequality have already been given in the literature (see, for instance, [10,17]), we give one here as an application of the integral representation in Theorem 2.2.
Theorem 3.1. Let $g$ be the centered Gaussian measure of variance $\sigma^2$. Then for any probability measure $f$ on $\mathbb{R}$ with finite second moment and finite Fisher information $J(f) < \infty$, the following inequality holds:
$$D(f \,\|\, g) \le \frac{\sigma^2}{2}\, J(f \,\|\, g).$$
Proof. It is clear that the perturbed measure $g_t$ is the centered Gaussian of variance $\sigma^2 + t$, whose score is given by
$$\frac{d}{dx} \log g_t = -\frac{x}{\sigma^2 + t}.$$
Then, using the Stein relation (see, for instance, [15]), the relative Fisher information $J(f_t \,\|\, g_t)$ can be expanded as follows:
$$\begin{aligned} J(f_t \,\|\, g_t) &= \int_{\mathbb{R}} \left( \frac{d}{dx} \log f_t - \frac{d}{dx} \log g_t \right)^2 f_t\, dx \\ &= J(f_t) + \frac{2}{\sigma^2 + t} \int_{\mathbb{R}} x \left( \frac{d}{dx} \log f_t \right) f_t\, dx + \int_{\mathbb{R}} \left( \frac{x}{\sigma^2 + t} \right)^2 f_t\, dx \\ &= J(f_t) - \frac{2}{\sigma^2 + t} \int_{\mathbb{R}} f_t\, dx + \frac{1}{(\sigma^2 + t)^2} \int_{\mathbb{R}} x^2 f_t\, dx. \end{aligned} \tag{12}$$
As was seen in (5), by the Stam inequality we have that
$$J(f_t) \le \left( \frac{1}{J(f)} + t \right)^{-1} = \frac{1}{(1/\alpha) + t}, \tag{13}$$
where we put $\alpha = J(f) < \infty$.
Since $f$ has a finite second moment, if we write the second moment of $f$ as $\beta = m_2(f) < \infty$, then it is easy to see that the second moment of $f_t$ is given by
$$m_2(f_t) = \int_{\mathbb{R}} x^2 f_t\, dx = \beta + t. \tag{14}$$
Substituting (13) and (14) into (12), we obtain
$$J(f_t \,\|\, g_t) \le \frac{1}{(1/\alpha) + t} - \frac{2}{\sigma^2 + t} + \frac{\beta + t}{(\sigma^2 + t)^2} = \frac{1}{(1/\alpha) + t} - \frac{1}{\sigma^2 + t} + \frac{\beta - \sigma^2}{(\sigma^2 + t)^2}.$$
Integrating over $t \ge 0$, we have
$$\frac{1}{2} \int_0^{\infty} J(f_t \,\|\, g_t)\, dt \le \frac{1}{2} \int_0^{\infty} \left( \frac{1}{(1/\alpha) + t} - \frac{1}{\sigma^2 + t} + \frac{\beta - \sigma^2}{(\sigma^2 + t)^2} \right) dt = \frac{1}{2} \left[ \log \frac{(1/\alpha) + t}{\sigma^2 + t} - \frac{\beta - \sigma^2}{\sigma^2 + t} \right]_0^{\infty} = \frac{1}{2} \left( \log (\sigma^2 \alpha) + \frac{\beta}{\sigma^2} - 1 \right).$$
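Purely as a numerical sanity check of this definite integral (not needed for the proof), one may evaluate both sides for arbitrary positive sample values of $\alpha$, $\sigma^2$ and $\beta$; the values below are illustrative only.

```python
import numpy as np
from scipy.integrate import quad

alpha, sigma2, beta = 2.0, 1.5, 0.8     # arbitrary positive sample values

def integrand(t):
    return (1.0 / (1.0 / alpha + t)
            - 1.0 / (sigma2 + t)
            + (beta - sigma2) / (sigma2 + t) ** 2)

value, _ = quad(integrand, 0.0, np.inf)
closed_form = np.log(sigma2 * alpha) + beta / sigma2 - 1.0
print(value, closed_form)               # should coincide up to quadrature tolerance
```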
Since $\log y$ is dominated as $\log y \le y - 1$ for $y > 0$, it follows that
$$\frac{1}{2} \int_0^{\infty} J(f_t \,\|\, g_t)\, dt \le \frac{1}{2} \left( \sigma^2 \alpha - 2 + \frac{\beta}{\sigma^2} \right). \tag{15}$$
On the other hand, the relative Fisher information $J(f \,\|\, g)$ can be written as
$$\begin{aligned} J(f \,\|\, g) &= \int_{\mathbb{R}} \left( \frac{d}{dx} \log f + \frac{x}{\sigma^2} \right)^2 f\, dx \\ &= \int_{\mathbb{R}} \left( \frac{d}{dx} \log f \right)^2 f\, dx - \frac{2}{\sigma^2} \int_{\mathbb{R}} f\, dx + \frac{1}{(\sigma^2)^2} \int_{\mathbb{R}} x^2 f\, dx \\ &= J(f) - \frac{2}{\sigma^2} + \frac{m_2(f)}{(\sigma^2)^2} = \alpha - \frac{2}{\sigma^2} + \frac{\beta}{(\sigma^2)^2}. \end{aligned} \tag{16}$$
Combining (15) and (16), we have
$$\frac{1}{2} \int_0^{\infty} J(f_t \,\|\, g_t)\, dt \le \frac{\sigma^2}{2}\, J(f \,\|\, g),$$
which, by Theorem 2.2, gives our desired inequality. $\square$
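As a numerical illustration of Theorem 3.1 (a sketch with an arbitrarily chosen non-Gaussian $f$, not part of the proof), the inequality $D(f\,\|\,g) \le \frac{\sigma^2}{2} J(f\,\|\,g)$ can be checked by quadrature for a two-component Gaussian mixture $f$ and the centered Gaussian $g$ of variance $\sigma^2$:

```python
import numpy as np

sigma2 = 2.0                                           # variance of the reference Gaussian g
x = np.linspace(-30.0, 30.0, 300001)
dx = x[1] - x[0]

def normal(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

f = 0.4 * normal(x, -1.0, 0.7) + 0.6 * normal(x, 2.0, 1.2)   # a non-Gaussian test density
g = normal(x, 0.0, sigma2)                                   # centered Gaussian N(0, sigma2)

D = np.sum(f * np.log(f / g)) * dx                     # relative entropy D(f || g)

df = np.gradient(f, dx)
score_diff = df / f + x / sigma2                       # d/dx log f - d/dx log g
J = np.sum(f * score_diff ** 2) * dx                   # relative Fisher information J(f || g)

print(D, 0.5 * sigma2 * J, D <= 0.5 * sigma2 * J)      # expect True
```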
Remark 3.2. An approach similar to the proof of Theorem 3.1 can be found in the paper by Stam [2], although not in the relative setting. Namely, based on convolution inequalities and the de Bruijn identity, the isoperimetric inequality on entropy for a standardized random variable $X$ on $\mathbb{R}$,
$$(2 \pi e)\, e^{-2 H(X)} \le J(X), \tag{17}$$
was shown. This inequality is essentially the same as the logarithmic Sobolev inequality for the standard Gaussian measure, and the left-hand side of (17) is the reciprocal of the entropy power.
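The inequality (17) can likewise be checked numerically; the sketch below uses a standardized symmetric Gaussian mixture (zero mean, unit variance) as an arbitrary non-Gaussian example, for which the left-hand side should be strictly smaller than $J(X)$:

```python
import numpy as np

# Standardized symmetric Gaussian mixture: mean 0 and variance a^2 + v = 1.
a, v = 0.8, 0.36

x = np.linspace(-15.0, 15.0, 150001)
dx = x[1] - x[0]

def normal(x, m, var):
    return np.exp(-(x - m) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

f = 0.5 * normal(x, -a, v) + 0.5 * normal(x, a, v)

H = -np.sum(f * np.log(f)) * dx                  # differential entropy H(X)
df = np.gradient(f, dx)
J = np.sum(df ** 2 / f) * dx                     # Fisher information J(X)

lhs = 2 * np.pi * np.e * np.exp(-2 * H)          # reciprocal of the entropy power
print(lhs, J, lhs <= J)                          # equality would require X to be Gaussian
```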

Acknowledgments

The authors are grateful to the anonymous reviewers for correcting inaccuracies and for their useful suggestions and valuable comments. In particular, the extension of Proposition 2.1 to the $\mathbb{R}^n$-version is based on the reviewers' comments.

References

  1. Cover, T.; Thomas, J. Elements of Information Theory, 2nd ed.; Wiley-Interscience: Hoboken, NJ, USA, 2006. [Google Scholar]
  2. Stam, A.J. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Contr. 1959, 2, 101–112. [Google Scholar] [CrossRef]
  3. Blachman, N.M. The convolution inequality for entropy powers. IEEE Trans. Inform. Theor. 1965, 2, 267–271. [Google Scholar] [CrossRef]
  4. Carlen, E. Superadditivity of Fisher’s information and logarithmic Sobolev inequalities. J. Funct. Anal. 1991, 101, 194–211. [Google Scholar] [CrossRef]
  5. Villani, C. Cercignani’s conjecture is sometimes true and always almost true. Commun. Math. Phys. 2003, 234, 455–490. [Google Scholar] [CrossRef]
  6. Madiman, M.; Barron, A. Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theor. 2007, 53, 2317–2329. [Google Scholar] [CrossRef]
  7. Johnson, O.; Barron, A. Fisher information inequalities and the central limit theorem. Probab. Theor. Relat. Field. 2004, 129, 391–409. [Google Scholar] [CrossRef]
  8. Dembo, A.; Thomas, J.; Cover, T. Information theoretic inequalities. IEEE Trans. Inform. Theor. 1991, 37, 1501–1518. [Google Scholar] [CrossRef]
  9. Arnold, A.; Markowich, P.; Toscani, G.; Unterreiter, A. On convex Sobolev inequalities and the rate of convergence to equilibrium for Fokker–Planck type equations. Comm Part. Differ. Equat. 2001, 26, 43–100. [Google Scholar] [CrossRef]
  10. Bakry, D.; Émery, M. Diffusions hypercontractives. In Séminaire de Probabilités XIX, 1983/84, Lecture Notes in Math. 1123; Springer-Verlag: Berlin, Germany, 1985; pp. 177–206. [Google Scholar]
  11. Otto, F.; Villani, C. Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality. J. Funct. Anal. 2000, 173, 361–400. [Google Scholar] [CrossRef]
  12. Verdú, S. Mismatched estimation and relative entropy. IEEE Trans. Inform. Theor. 2010, 56, 3712–3719. [Google Scholar] [CrossRef]
  13. Guo, D.; Shamai, S.; Verdú, S. Mutual information and minimum mean-square error in Gaussian channels. IEEE Trans. Inform. Theor. 2005, 51, 1261–1283. [Google Scholar] [CrossRef]
  14. Villani, C. A short proof of the concavity of entropy power. IEEE Trans. Inform. Theor. 2000, 46, 1695–1696. [Google Scholar] [CrossRef]
  15. Johnson, O. Information Theory and the Central Limit Theorem; Imperial College Press: London, UK, 2004. [Google Scholar]
  16. Barron, A. Entropy and the central limit theorem. Ann. Probab. 1986, 14, 336–342. [Google Scholar] [CrossRef]
  17. Gross, L. Logarithmic Sobolev inequalities. Amer. J. Math. 1975, 97, 1061–1083. [Google Scholar] [CrossRef]
