Open Access: This article is freely available.
Entropy Measures vs. Kolmogorov Complexity
Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021/1055, 4169-007 Porto, Portugal
Instituto de Telecomunicações, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal
Laboratório de Inteligência Artificial e Ciência de Computadores, Rua Campo Alegre 1021/1055, 4169-007 Porto, Portugal
* Author to whom correspondence should be addressed.
Received: 14 January 2011; in revised form: 25 February 2011 / Accepted: 26 February 2011 / Published: 3 March 2011
Abstract: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α > 1. We also establish the uniform continuity of these entropies.
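For reference, the quantities the abstract compares have the following standard definitions (a sketch in common notation; the paper's own notation may differ slightly). For a probability distribution P over binary strings and α > 0, α ≠ 1:

\begin{align*}
H(P) &= -\sum_x P(x)\log P(x) && \text{(Shannon entropy)}\\
H_\alpha(P) &= \frac{1}{1-\alpha}\log \sum_x P(x)^\alpha && \text{(Rényi entropy of order } \alpha\text{)}\\
T_\alpha(P) &= \frac{1}{\alpha-1}\left(1-\sum_x P(x)^\alpha\right) && \text{(Tsallis entropy of order } \alpha\text{)}
\end{align*}

Both H_α and T_α tend to H as α → 1, and the classical relationship alluded to in the abstract states that, for any recursive P with Kolmogorov complexity K,

\[
0 \;\le\; \sum_x P(x)\,K(x) \;-\; H(P) \;\le\; K(P) + O(1).
\]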
Keywords: Kolmogorov complexity; Shannon entropy; Rényi entropy; Tsallis entropy
Cite This Article
Teixeira, A.; Matos, A.; Souto, A.; Antunes, L. Entropy Measures vs. Kolmogorov Complexity. Entropy 2011, 13(3), 595-611.