Entropy Measures vs. Kolmogorov Complexity
1 Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021/1055, 4169-007 Porto, Portugal
2 Instituto de Telecomunicações, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal
3 Laboratório de Inteligência Artificial e Ciência de Computadores, Rua Campo Alegre 1021/1055, 4169-007 Porto, Portugal
* Author to whom correspondence should be addressed.
Entropy 2011, 13(3), 595-611; https://doi.org/10.3390/e13030595
Received: 14 January 2011 / Revised: 25 February 2011 / Accepted: 26 February 2011 / Published: 3 March 2011
(This article belongs to the Special Issue Kolmogorov Complexity)
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the time-bounded universal distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α > 1. We also establish the uniform continuity of these entropies.
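For convenience, here is a brief sketch of the standard definitions behind the abstract, following the usual conventions (base-2 logarithms); the paper's own notation may differ slightly.

```latex
% Shannon entropy of a recursive distribution P:
H(P) = -\sum_x P(x) \log P(x)

% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1):
H_\alpha(P) = \frac{1}{1-\alpha} \log \sum_x P(x)^\alpha

% Tsallis entropy of order \alpha (\alpha > 0, \alpha \neq 1):
T_\alpha(P) = \frac{1}{\alpha - 1} \left( 1 - \sum_x P(x)^\alpha \right)

% Both recover Shannon entropy in the limit:
\lim_{\alpha \to 1} H_\alpha(P) = \lim_{\alpha \to 1} T_\alpha(P) = H(P)

% The expected-complexity/entropy relation referenced in the abstract,
% for any recursive P, where K denotes prefix-free Kolmogorov complexity
% (see Li and Vitányi, An Introduction to Kolmogorov Complexity):
0 \le \sum_x P(x) K(x) - H(P) \le K(P) + O(1)
```

The α → 1 limit makes the paper's first result plausible: only at α = 1 do the Rényi and Tsallis entropies coincide with Shannon entropy, which is the quantity tied to expected Kolmogorov complexity.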
Keywords:
Kolmogorov complexity; Shannon entropy; Rényi entropy; Tsallis entropy
This is an open access article distributed under the Creative Commons Attribution License
Teixeira, A.; Matos, A.; Souto, A.; Antunes, L. Entropy Measures vs. Kolmogorov Complexity. Entropy 2011, 13, 595-611.