Information Distances versus Entropy Metric
Abstract
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity, while the latter is based on Shannon entropy. However, for any computable probability distribution, the expected value of Kolmogorov complexity equals the Shannon entropy up to an additive constant. We study the analogous relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
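The entropy–complexity relationship invoked in the abstract is the classical result relating expected prefix Kolmogorov complexity to Shannon entropy. A sketch in standard notation (the symbols K, H, and P are assumptions, since the abstract fixes no notation):

```latex
% For a computable probability mass function P over finite strings,
% with Shannon entropy H(P) = -\sum_x P(x) \log P(x)
% and prefix Kolmogorov complexity K(x), one has
0 \;\le\; \sum_x P(x)\,K(x) \;-\; H(P) \;\le\; K(P) + O(1),
% i.e., the expected complexity E_P[K] equals H(P) up to an
% additive constant that depends only on (the description of) P.
```

The upper bound follows because a prefix code of expected length close to H(P) can be decoded by a program that has access to a description of P, of length K(P); the lower bound is the Kraft-inequality argument that no prefix-free description can beat entropy on average.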
Share & Cite This Article
Hu, B.; Bi, L.; Dai, S. Information Distances versus Entropy Metric. Entropy 2017, 19, 260.