Kolmogorov Complexity

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (31 October 2011) | Viewed by 49681

Special Issue Editor


Prof. Dr. Paul M. B. Vitányi
Guest Editor
1. The National Research Center for Mathematics and Computer Science in the Netherlands (CWI), 1098XG Amsterdam, The Netherlands
2. Department of Computer Science, University of Amsterdam, 1012 WX Amsterdam, The Netherlands
Interests: cellular automata; computational complexity; distributed and parallel computing; machine learning and prediction; physics of computation; Kolmogorov complexity; information theory; quantum computing

Special Issue Information

Dear Colleagues,

This Special Issue deals with Kolmogorov complexity in its many facets:

Pointwise Randomness:

Kolmogorov complexity is a modern notion of randomness that deals with the quantity of information in individual objects, that is, with pointwise randomness rather than the average randomness produced by a random source. It was proposed by A.N. Kolmogorov in 1965 to quantify the information and randomness of individual objects in an objective and absolute manner. This is impossible within classical probability theory (a branch of measure theory satisfying the Kolmogorov axioms formulated in 1933). Kolmogorov complexity is known variously as "algorithmic information", "algorithmic entropy", "Kolmogorov-Chaitin complexity", "descriptional complexity", "shortest program length", and "algorithmic randomness", among other names.
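For reference, one standard way to state the definition (here in the plain-complexity form; the prefix variant differs only in the class of admissible programs) is

\[
K_U(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\},
\]

where U is a fixed universal Turing machine, p ranges over binary programs, and |p| is the length of p in bits. By the invariance theorem, switching to another universal machine changes K_U(x) by at most an additive constant independent of x, so the subscript is usually dropped.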

Incompressibility Method:

In the course of the research interests listed above, a new mathematical proof technique was developed, now known as the `incompressibility method'. The incompressibility method is a basic general technique in the same family as the pigeonhole argument, the counting method, and the probabilistic method, but it is based on Kolmogorov complexity.
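The counting fact that drives the method fits in one line: there are 2^n binary strings of length n but only

\[
\sum_{i=0}^{n-1} 2^i \;=\; 2^n - 1 \;<\; 2^n
\]

descriptions (programs) of length less than n, so for every n at least one string x of length n satisfies K(x) \ge n. A typical incompressibility proof fixes such an incompressible object, assumes the property to be disproved, and shows that this assumption would yield a description of the object shorter than K(x), a contradiction.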

Absolute Information:

The Kolmogorov complexity of an object is a form of absolute information about that individual object. No such measure is available in C.E. Shannon's information theory, which, unlike Kolmogorov complexity, is concerned only with the average information of a random source.
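The precise sense in which the two measures agree on average is the following standard relation: for every computable (recursive) probability mass function P with Shannon entropy H(P) = -\sum_x P(x)\log P(x), the expected prefix Kolmogorov complexity satisfies

\[
0 \;\le\; \sum_x P(x)\,K(x) \;-\; H(P) \;\le\; K(P) + O(1),
\]

so on average, and up to an additive constant depending only on (a shortest program for) P, the pointwise measure and Shannon's average measure coincide.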

Universal Induction and Data Compression:

Traditional wisdom has it that the better a theory compresses the data concerning some phenomenon under investigation, the better we learn and generalize, and the better the theory predicts unknown data. This belief is vindicated in practice, but before the advent of Kolmogorov complexity it had not been rigorously proved in a general setting. Making these ideas rigorous involves the length of the shortest effective description of an individual object: its Kolmogorov complexity. Ray Solomonoff invented the notion of universal prediction using the universal distribution based on Kolmogorov complexity. Universal prediction is related to optimal effective compression, which is almost always the best strategy in hypothesis identification (an idealized form of the minimum description length (MDL) principle introduced into statistics by J.J. Rissanen). Although the single best hypothesis does not necessarily give the best prediction, it can be demonstrated that compression is nonetheless almost always the best strategy in prediction methods in the style of R. Solomonoff.
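In symbols, Solomonoff's universal prior gives a finite binary string x the total weight of all programs that make a universal (monotone) machine U output a continuation of x,

\[
\mathbf{M}(x) \;=\; \sum_{p\,:\,U(p)=x\ast} 2^{-|p|},
\]

and predicts the next symbol by comparing the conditionals \mathbf{M}(x1)/\mathbf{M}(x) and \mathbf{M}(x0)/\mathbf{M}(x). Since -\log \mathbf{M}(x) coincides with the (monotone) Kolmogorov complexity of x up to lower-order terms, assigning high prior probability is essentially the same as compressing well; this is the formal link between Solomonoff prediction, MDL-style hypothesis selection, and data compression.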

Prof. Dr. Paul M. B. Vitányi
Guest Editor

Keywords

  • Kolmogorov complexity
  • algorithmic information theory
  • randomness of individual object
  • information in individual object
  • incompressibility method
  • universal induction and data compression
  • similarity by compression

Published Papers (4 papers)


Research

Article
A Philosophical Treatise of Universal Induction
by Samuel Rathmanner and Marcus Hutter
Entropy 2011, 13(6), 1076-1136; https://doi.org/10.3390/e13061076 - 03 Jun 2011
Cited by 53 | Viewed by 20946
Abstract
Understanding inductive reasoning is a problem that has engaged mankind for thousands of years. This problem is relevant to a wide range of fields and is integral to the philosophy of science. It has been tackled by many great minds ranging from philosophers to scientists to mathematicians, and more recently computer scientists. In this article we argue the case for Solomonoff Induction, a formal inductive framework which combines algorithmic information theory with the Bayesian framework. Although it achieves excellent theoretical results and is based on solid philosophical foundations, the requisite technical knowledge necessary for understanding this framework has caused it to remain largely unknown and unappreciated in the wider scientific community. The main contribution of this article is to convey Solomonoff induction and its related concepts in a generally accessible form with the aim of bridging this current technical gap. In the process we examine the major historical contributions that have led to the formulation of Solomonoff Induction as well as criticisms of Solomonoff and induction in general. In particular we examine how Solomonoff induction addresses many issues that have plagued other inductive systems, such as the black ravens paradox and the confirmation problem, and compare this approach with other recent approaches.
(This article belongs to the Special Issue Kolmogorov Complexity)

Article
Algorithmic Relative Complexity
by Daniele Cerra and Mihai Datcu
Entropy 2011, 13(4), 902-914; https://doi.org/10.3390/e13040902 - 19 Apr 2011
Cited by 20 | Viewed by 8868
Abstract
Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object x with respect to another object y as the amount of computational resources needed to specify x in terms of y, and the complexity of x related to y as the compression power which is lost when adopting such a description for x, compared to the shortest representation of x. Properties of analogous quantities in classical information theory hold for these new concepts. As these notions are incomputable, a suitable approximation based upon data compression is derived to enable the application to real data, yielding a divergence measure applicable to any pair of strings. Example applications are outlined, involving authorship attribution and satellite image classification, as well as a comparison to similar established techniques.
(This article belongs to the Special Issue Kolmogorov Complexity)
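The divergence in this paper is derived from the algorithmic cross-complexity the authors define; a closely related and well-established compression-based dissimilarity, among the techniques such work is usually compared against, is the normalized compression distance (NCD) of Cilibrasi and Vitányi. A minimal sketch, assuming Python's standard zlib as the real-world compressor standing in for an ideal one (an illustration of the general idea, not the paper's own measure):

import zlib

def clen(data: bytes) -> int:
    # Compressed length in bytes: a practical stand-in for Kolmogorov complexity.
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: near 0 for very similar inputs, about 1 for unrelated ones.
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy usage: the first pair shares almost all of its structure, the second pair does not.
a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"the quick brown fox jumps over the lazy cat " * 20
c = bytes(range(256)) * 4
print(round(ncd(a, b), 3), round(ncd(a, c), 3))

Any real compressor only approximates the incomputable ideal quantities, so such scores are heuristic; the abstract above describes how the authors derive their own compression-based approximation for the new cross-complexity and relative-complexity notions.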

Article
Quantum Kolmogorov Complexity and Information-Disturbance Theorem
by Takayuki Miyadera
Entropy 2011, 13(4), 778-789; https://doi.org/10.3390/e13040778 - 29 Mar 2011
Cited by 3 | Viewed by 6562
Abstract
In this paper, a representation of the information-disturbance theorem based on the quantum Kolmogorov complexity that was defined by P. Vitányi has been examined. In the quantum information theory, the information-disturbance relationship, which treats the trade-off relationship between information gain and its caused disturbance, is a fundamental result that is related to Heisenberg’s uncertainty principle. The problem was formulated in a cryptographic setting and the quantitative relationships between complexities have been derived.
(This article belongs to the Special Issue Kolmogorov Complexity)
Article
Entropy Measures vs. Kolmogorov Complexity
by Andreia Teixeira, Armando Matos, André Souto and Luís Antunes
Entropy 2011, 13(3), 595-611; https://doi.org/10.3390/e13030595 - 03 Mar 2011
Cited by 39 | Viewed by 11777
Abstract
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
(This article belongs to the Special Issue Kolmogorov Complexity)
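For orientation, the standard definitions of the two entropy families compared here with Kolmogorov complexity are

\[
H_\alpha(P) \;=\; \frac{1}{1-\alpha}\,\log \sum_x P(x)^\alpha
\qquad\text{and}\qquad
S_\alpha(P) \;=\; \frac{1}{\alpha-1}\Bigl(1 - \sum_x P(x)^\alpha\Bigr),
\]

the Rényi and Tsallis entropies of order \alpha; both reduce to the Shannon entropy as \alpha \to 1, the single value of \alpha at which, by the result above, they also agree with the expected Kolmogorov complexity up to a constant.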