Following the publication of our paper [1], we uncovered a mistake in the derivation of two formulas in the manuscript. This error does not affect any of the empirical results or conclusions of the article.
The following incorrect text on Page 9 should be replaced:
- These bounds have particularly simple forms when all of the mixture components have equal covariance matrices, i.e., $\Sigma_i = \Sigma$ for all $i$. In this case, the lower bound of Equation (10) can be written as
$$\hat{H}_{C_\alpha} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j p_j(\mu_i)$$
This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 1/2$, corresponding to the Bhattacharyya distance,
$$\hat{H}_{BD} = \frac{d}{2} - \sum_i c_i \ln \sum_j c_j p_j(\mu_i)$$
(This is derived above in Section 3.2.)
The replacement text should read:
- These bounds have simple forms when all of the mixture components have equal covariance matrices; i.e., $\Sigma_i = \Sigma$ for all $i$. First, define a transformation in which each Gaussian component $p_i = \mathcal{N}(\mu_i, \Sigma)$ is mapped to a different Gaussian $\tilde{p}_i$, which has the same mean but where the covariance matrix is rescaled by $\frac{1}{\alpha(1-\alpha)}$,
$$\tilde{p}_i := \mathcal{N}\!\left(\mu_i, \tfrac{1}{\alpha(1-\alpha)}\Sigma\right)$$
Then, the lower bound of Equation (10) can be written as
$$\hat{H}_{C_\alpha} = \frac{d}{2}\left(1 + \ln\alpha(1-\alpha)\right) - \sum_i c_i \ln \sum_j c_j \tilde{p}_j(\mu_i)$$
This is derived by combining the expressions for $C_\alpha$, Equation (14), the entropy of a Gaussian, Equation (13), and the Gaussian density function. For a homoscedastic mixture, the tightest lower bound among the Chernoff $\alpha$-divergences is given by $\alpha = 1/2$, corresponding to the Bhattacharyya distance,
$$\hat{H}_{BD} = \frac{d}{2}\left(1 - \ln 4\right) - \sum_i c_i \ln \sum_j c_j \tilde{p}_j(\mu_i)$$
where now $\tilde{p}_j = \mathcal{N}(\mu_j, 4\Sigma)$. (This is derived above in Section 3.2.)
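As a numerical sanity check on the corrected closed form, the following short Python sketch compares the general pairwise lower bound of Equation (10), evaluated directly from Chernoff $\alpha$-divergences, against the rescaled-density expression above for a homoscedastic Gaussian mixture. It is not part of the correction or the authors' code; the variable names (`mus`, `Sigma`, `chernoff`, `p_tilde`) are illustrative choices, and NumPy/SciPy are assumed.

```python
# Check: Eq. (10) computed from Chernoff divergences should match the
# corrected closed form built from densities with covariance Sigma/(alpha(1-alpha)).
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
d, k, alpha = 3, 4, 0.3                      # dimension, #components, Chernoff alpha
c = rng.dirichlet(np.ones(k))                # mixture weights c_i
mus = rng.normal(size=(k, d))                # component means mu_i
A = rng.normal(size=(d, d))
Sigma = A @ A.T + d * np.eye(d)              # shared (homoscedastic) covariance
Sinv = np.linalg.inv(Sigma)

# Chernoff alpha-divergence between equal-covariance Gaussians:
# C_alpha(p_i || p_j) = alpha(1-alpha)/2 * (mu_i - mu_j)^T Sigma^{-1} (mu_i - mu_j)
def chernoff(mi, mj):
    delta = mi - mj
    return 0.5 * alpha * (1 - alpha) * delta @ Sinv @ delta

# General lower bound, Equation (10):
# sum_i c_i H(p_i) - sum_i c_i ln sum_j c_j exp(-C_alpha(p_i || p_j))
H_comp = 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * Sigma))  # Gaussian entropy, Eq. (13)
eq10 = sum(
    c[i] * (H_comp - np.log(sum(c[j] * np.exp(-chernoff(mus[i], mus[j]))
                                for j in range(k))))
    for i in range(k)
)

# Corrected closed form: densities with covariance rescaled by 1/(alpha(1-alpha))
p_tilde = multivariate_normal(mean=np.zeros(d), cov=Sigma / (alpha * (1 - alpha)))
closed = 0.5 * d * (1 + np.log(alpha * (1 - alpha))) - sum(
    c[i] * np.log(sum(c[j] * p_tilde.pdf(mus[i] - mus[j]) for j in range(k)))
    for i in range(k)
)

print(eq10, closed)   # the two values agree to numerical precision
```

Setting `alpha = 0.5` rescales the covariance by exactly 4, recovering the Bhattacharyya form $\hat{H}_{BD}$ given above.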
Reference
- Kolchinsky, A.; Tracey, B.D. Estimating Mixture Entropy with Pairwise Distances. Entropy 2017, 19, 361.
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).