Entropy
  • Correction
  • Open Access

3 November 2017

Correction: Kolchinsky, A. and Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361

A. Kolchinsky 1,* and B. D. Tracey 1,2
1 Santa Fe Institute, Santa Fe, NM 87501, USA
2 Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
* Author to whom correspondence should be addressed.
Following the publication of our paper [1], we uncovered a mistake in the derivation of two formulas in the manuscript. This error does not affect any of the empirical results or conclusions of the article.
The following incorrect text on Page 9 should be replaced:
  • These bounds have particularly simple forms when all of the mixture components have equal covariance matrices, i.e., $\Sigma_i = \Sigma$ for all $i$. In this case, the lower bound of Equation (10) can be written as
    $$\hat{H}_{C_\alpha} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j \, p_j(\mu_i)^{\alpha(1-\alpha)} .$$
    This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 0.5$, corresponding to the Bhattacharyya distance,
    $$\hat{H}_{\mathrm{BD}} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j \, p_j(\mu_i)^{1/4} .$$
    (This is derived above in Section 3.2.)
The replacement text should read:
  • These bounds have simple forms when all of the mixture components have equal covariance matrices; i.e., $\Sigma_i = \Sigma$ for all $i$. First, define a transformation in which each Gaussian component $p_j$ is mapped to a different Gaussian $\tilde{p}_{j,\alpha}$, which has the same mean but where the covariance matrix is rescaled by $\frac{1}{\alpha(1-\alpha)}$,
    $$p_j := \mathcal{N}(\mu_j, \Sigma) \quad\mapsto\quad \tilde{p}_{j,\alpha} := \mathcal{N}\!\left(\mu_j, \tfrac{1}{\alpha(1-\alpha)}\,\Sigma\right) .$$
    Then, the lower bound of Equation (10) can be written as
    $$\hat{H}_{C_\alpha} = \frac{d}{2} + \frac{d}{2}\ln \alpha(1-\alpha) - \sum_i c_i \ln \sum_j c_j \, \tilde{p}_{j,\alpha}(\mu_i) .$$
    This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 0.5$, corresponding to the Bhattacharyya distance,
    $$\hat{H}_{\mathrm{BD}} = \frac{d}{2} + \frac{d}{2}\ln \tfrac{1}{4} - \sum_i c_i \ln \sum_j c_j \, \tilde{p}_{j,0.5}(\mu_i) .$$
    (This is derived above in Section 3.2.)
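The corrected bound can be checked numerically. Below is a minimal sketch in Python that evaluates the corrected homoscedastic formula via the rescaled components $\tilde{p}_{j,\alpha}$; the function name and variable names are illustrative, not from the paper, and the general Chernoff-divergence form of Equation (10) is used only as a cross-check.

```python
import numpy as np
from scipy.stats import multivariate_normal

def chernoff_lower_bound(c, mus, Sigma, alpha=0.5):
    """Corrected lower bound H_{C_alpha} for a homoscedastic Gaussian mixture.

    c     : (m,) mixture weights summing to 1
    mus   : (m, d) component means
    Sigma : (d, d) shared covariance matrix
    """
    c = np.asarray(c, dtype=float)
    mus = np.asarray(mus, dtype=float)
    d = mus.shape[1]
    # Rescaled components p~_{j,alpha} = N(mu_j, Sigma / (alpha * (1 - alpha)))
    scaled_cov = np.asarray(Sigma, dtype=float) / (alpha * (1.0 - alpha))
    # log sum_j c_j p~_{j,alpha}(mu_i), evaluated at each component mean mu_i
    log_mix = np.array([
        np.log(sum(cj * multivariate_normal.pdf(mu_i, mean=mu_j, cov=scaled_cov)
                   for cj, mu_j in zip(c, mus)))
        for mu_i in mus
    ])
    return d / 2.0 + (d / 2.0) * np.log(alpha * (1.0 - alpha)) - np.dot(c, log_mix)
```

For numerical robustness with well-separated components, the inner sum would normally be computed with `scipy.special.logsumexp` rather than a plain `log` of a sum; the direct form is kept here for readability.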

Reference

  1. Kolchinsky, A.; Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361.
