Special Issue "Kolmogorov Complexity"


A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 October 2011)

Special Issue Editor

Guest Editor
Prof. Dr. Paul M. B. Vitányi
1 CWI (Centrum Wiskunde & Informatica), Science Park 123, 1098 XG, Amsterdam, The Netherlands
2 University of Amsterdam, 1012 WX Amsterdam, The Netherlands
Website: http://www.cwi.nl/~paulv/
E-Mail: paul.vitanyi@cwi.nl
Phone: +31 20 5924124
Fax: +31 20 5924199
Interests: cellular automata; computational complexity; distributed and parallel computing; machine learning and prediction; physics of computation; Kolmogorov complexity; information theory; quantum computing

Special Issue Information

Dear Colleagues,

This Special Issue deals with Kolmogorov complexity in its many facets:

Pointwise Randomness:

Kolmogorov complexity is a modern notion of randomness that deals with the quantity of information in individual objects; that is, pointwise randomness rather than the average randomness produced by a random source. It was proposed by A.N. Kolmogorov in 1965 to quantify the information and randomness of individual objects in an objective and absolute manner. This is impossible in classical probability theory (a branch of measure theory satisfying the so-called Kolmogorov axioms, formulated in 1933). Kolmogorov complexity is known variously as "algorithmic information", "algorithmic entropy", "Kolmogorov-Chaitin complexity", "descriptional complexity", "shortest program length", "algorithmic randomness", and other names.
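Although Kolmogorov complexity itself is uncomputable, any real compressor yields a computable upper bound on it, which suffices to illustrate the pointwise idea: a regular string contains little information, while a random-looking string of the same length contains nearly as many bits as its length. A minimal sketch, assuming zlib as the stand-in compressor (an illustrative choice, not part of the theory):

```python
import random
import zlib

def compressed_length(s: bytes) -> int:
    """Length of a zlib-compressed encoding: a computable upper bound
    on the (uncomputable) Kolmogorov complexity of the string s."""
    return len(zlib.compress(s, 9))

regular = b"ab" * 500                       # highly regular 1000-byte string
random.seed(0)                              # fixed seed: a pseudo-random stand-in
noisy = bytes(random.randrange(256) for _ in range(1000))

# The regular string compresses to a handful of bytes;
# the pseudo-random one hardly compresses at all.
print(compressed_length(regular), compressed_length(noisy))
```

The gap between the two compressed lengths is exactly the pointwise distinction: both strings are equally probable under a uniform source, yet one is individually simple and the other individually random.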

Incompressibility Method:

Within several of the research areas listed above, a new mathematical proof technique was developed, now known as the `incompressibility method'. The incompressibility method is a basic general technique, comparable to the "pigeonhole" argument, the counting method, or the probabilistic method. The new method is based on Kolmogorov complexity.
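The method rests on a simple counting fact: there are 2^n binary strings of length n, but only 2^(n-c) - 1 descriptions shorter than n - c bits, so fewer than a 2^(-c) fraction of all n-bit strings can be compressed by c or more bits. A small arithmetic check of this bound (the choice n = 20 is illustrative):

```python
def count_short_descriptions(n: int, c: int) -> int:
    """Number of binary descriptions of length < n - c:
    the sum of 2**i over i < n - c, which equals 2**(n - c) - 1."""
    return sum(2 ** i for i in range(n - c)) if n > c else 0

n = 20
for c in (1, 5, 10):
    # fraction of n-bit strings that could have such a short description
    fraction = count_short_descriptions(n, c) / 2 ** n
    assert fraction < 2 ** (-c)
    print(c, fraction)
```

An incompressibility proof then fixes one string that has no short description (most strings qualify) and derives the desired property from that single incompressible witness.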

Absolute Information:

The Kolmogorov complexity of an object is a form of absolute information about that individual object. No such notion exists in C.E. Shannon's information theory, which, unlike Kolmogorov complexity, is concerned only with the average information of a random source.

Universal Induction and Data Compression:

Traditional wisdom has it that the better a theory compresses the data concerning some phenomenon under investigation, the better we learn and generalize, and the better the theory predicts unknown data. This belief is vindicated in practice, but before the advent of Kolmogorov complexity it had not been rigorously proved in a general setting. Making these ideas rigorous involves the length of the shortest effective description of an individual object: its Kolmogorov complexity. Ray Solomonoff invented the notion of universal prediction using the universal distribution based on Kolmogorov complexity. Universal prediction is related to optimal effective compression, which is almost always a best strategy in hypothesis identification (an ideal form of the minimum description length (MDL) principle that J.J. Rissanen introduced into practical statistics). Whereas the single best hypothesis does not necessarily give the best prediction, it can be demonstrated that compression is nonetheless almost always the best strategy in prediction methods in the style of R. Solomonoff.
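The MDL idea can be made concrete with a two-part code: the total description length is the cost of stating a hypothesis plus the cost of encoding the data given that hypothesis, and the hypothesis minimizing the sum is preferred. The sketch below, assuming a Bernoulli model and the standard (1/2) log n bits for a fitted parameter (textbook conventions adopted here for illustration), shows a biased sequence favoring the fitted model over the fair-coin hypothesis:

```python
import math

def bernoulli_codelength(bits, p):
    """Ideal code length, in bits, of a binary sequence under a Bernoulli(p) source."""
    ones = sum(bits)
    zeros = len(bits) - ones
    return -(ones * math.log2(p) + zeros * math.log2(1 - p))

def two_part_mdl(bits):
    """Two-part code: roughly (1/2) log2 n bits to state the fitted parameter,
    plus the data encoded under that parameter."""
    n = len(bits)
    p_hat = max(min(sum(bits) / n, 1 - 1 / n), 1 / n)  # clamp to avoid log(0)
    return 0.5 * math.log2(n) + bernoulli_codelength(bits, p_hat)

biased = [1] * 90 + [0] * 10                  # empirical frequency 0.9
fair_cost = bernoulli_codelength(biased, 0.5)  # exactly 100 bits
mdl_cost = two_part_mdl(biased)                # about 50 bits: shorter, so preferred
print(fair_cost, mdl_cost)
```

The shorter two-part code is the compression-based vindication of the fitted hypothesis, in miniature, that the paragraph above describes in general.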

Prof. Dr. Paul M. B. Vitányi
Guest Editor


  • Kolmogorov complexity
  • algorithmic information theory
  • randomness of individual objects
  • information in individual objects
  • incompressibility method
  • universal induction and data compression
  • similarity by compression
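The keyword "similarity by compression" refers to the normalized compression distance (NCD), in which a standard compressor approximates the uncomputable Kolmogorov complexity in the distance formula. A minimal sketch, assuming zlib and illustrative test strings:

```python
import zlib

def clen(s: bytes) -> int:
    """Compressed length: a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for very similar strings,
    approaching 1 for unrelated ones."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

dog = b"the quick brown fox jumps over the lazy dog " * 20
cat = b"the quick brown fox jumps over the lazy cat " * 20
junk = bytes(range(256)) * 4

print(round(ncd(dog, cat), 3))   # small: the two texts are near-identical
print(round(ncd(dog, junk), 3))  # larger: unrelated content
```

Intuitively, if y is similar to x, then compressing x followed by y costs little more than compressing x alone, so the numerator stays small.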

Published Papers (4 papers)

Entropy 2011, 13(3), 595-611; doi:10.3390/e13030595
Received: 14 January 2011; in revised form: 25 February 2011 / Accepted: 26 February 2011 / Published: 3 March 2011

Entropy 2011, 13(4), 778-789; doi:10.3390/e13040778
Received: 17 January 2011; in revised form: 9 March 2011 / Accepted: 24 March 2011 / Published: 29 March 2011

Entropy 2011, 13(4), 902-914; doi:10.3390/e13040902
Received: 3 March 2011; in revised form: 31 March 2011 / Accepted: 1 April 2011 / Published: 19 April 2011

Entropy 2011, 13(6), 1076-1136; doi:10.3390/e13061076
Received: 20 April 2011; in revised form: 24 May 2011 / Accepted: 27 May 2011 / Published: 3 June 2011

Last update: 25 February 2014

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.