Entropy Measures vs. Kolmogorov Complexity
Abstract: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
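For background, the following minimal LaTeX sketch states the standard textbook definitions of the quantities named in the abstract; the formulas below are the usual ones and are given here as assumed context, not as statements quoted from the paper.

% Minimal sketch of the standard definitions (assumed background,
% not taken verbatim from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Shannon entropy of a probability distribution P over strings x
\[ H(P) = -\sum_x P(x)\,\log P(x) \]

% R\'enyi entropy of order $\alpha$ ($\alpha > 0$, $\alpha \neq 1$);
% it recovers $H(P)$ in the limit $\alpha \to 1$
\[ H_\alpha(P) = \frac{1}{1-\alpha}\,\log \sum_x P(x)^\alpha \]

% Tsallis entropy of order $\alpha$; it also recovers $H(P)$ as $\alpha \to 1$
\[ T_\alpha(P) = \frac{1}{\alpha-1}\Bigl(1 - \sum_x P(x)^\alpha\Bigr) \]

% Standard Kolmogorov--Shannon relation for a recursive distribution P:
% the expected prefix complexity equals the Shannon entropy up to an
% additive constant depending only on P
\[ 0 \;\le\; \sum_x P(x)\,K(x) - H(P) \;\le\; K(P) + O(1) \]

\end{document}

The abstract's question is whether the last inequality has an analogue when H(P) is replaced by H_α(P) or T_α(P); the paper's answer is that it does only for α = 1, i.e., the Shannon case.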
Teixeira, A.; Matos, A.; Souto, A.; Antunes, L. Entropy Measures vs. Kolmogorov Complexity. Entropy 2011, 13, 595-611.