Special Issue "Kolmogorov Complexity"
Deadline for manuscript submissions: closed (31 October 2011)
Prof. Dr. Paul M. B. Vitányi
1 CWI (Centrum Wiskunde & Informatica), Science Park 123, 1098 XG, Amsterdam, The Netherlands
2 University of Amsterdam, 1012 WX Amsterdam, The Netherlands
Fax: + 31 20 5924199
Interests: cellular automata; computational complexity; distributed and parallel computing; machine learning and prediction; physics of computation; Kolmogorov complexity; information theory; quantum computing
This Special Issue deals with Kolmogorov complexity in its many facets:
Pointwise Randomness:
Kolmogorov complexity is a modern notion of randomness dealing with the quantity of information in individual objects; that is, pointwise randomness rather than the average randomness produced by a random source. It was proposed by A.N. Kolmogorov in 1965 to quantify the information and randomness of individual objects in an objective and absolute manner. This is impossible in classical probability theory (a branch of measure theory satisfying the so-called Kolmogorov axioms, formulated in 1933). Kolmogorov complexity is known variously as "algorithmic information", "algorithmic entropy", "Kolmogorov-Chaitin complexity", "descriptional complexity", "shortest program length", "algorithmic randomness", and so on.
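Kolmogorov complexity itself is uncomputable, but any real compressor yields an upper bound on it (up to an additive constant for the decompressor). A minimal sketch using Python's standard zlib, with an illustrative function name of our choosing:

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length in bytes of a zlib-compressed description of `data`.

    True Kolmogorov complexity K(x) is uncomputable; a practical
    compressor only gives an upper bound on it.
    """
    return len(zlib.compress(data, 9))

# A highly regular string compresses far below its raw length ...
regular = b"ab" * 5000            # 10,000 bytes of pattern
# ... while typical (algorithmically random-looking) data does not.
random_like = os.urandom(10000)   # 10,000 bytes of OS randomness

assert complexity_upper_bound(regular) < 100
assert complexity_upper_bound(random_like) > 9000
```

The two assertions illustrate the pointwise character of the notion: both inputs are 10,000 bytes long, yet only the individual regular object admits a short description.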
In several of the research areas listed among Paul Vitányi's interests above, a new mathematical proof technique was developed, now known as the `incompressibility method'. The incompressibility method is a basic general technique, comparable to the "pigeon hole" argument, the "counting method", or the "probabilistic method". It is based on Kolmogorov complexity.
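The counting fact at the heart of the incompressibility method can be checked directly: there are too few short descriptions to go around, so most strings are incompressible. A sketch, with a hypothetical helper name:

```python
def max_compressible_fraction(n: int, c: int) -> float:
    """Upper bound on the fraction of n-bit strings that can be
    described in at most n - c bits.

    There are sum_{i=0}^{n-c} 2**i = 2**(n-c+1) - 1 binary strings of
    length at most n - c, so at most that many of the 2**n strings of
    length n can have such a short description.
    """
    short_descriptions = 2 ** (n - c + 1) - 1
    return short_descriptions / 2 ** n

# Fewer than 1 in 512 strings of length 30 can be compressed
# by even 10 bits:
assert max_compressible_fraction(30, 10) < 2 ** -9
```

An incompressibility-method proof typically fixes such an incompressible string and derives a contradiction from the assumption that the structure under study would allow it to be described more briefly.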
The Kolmogorov complexity of an object measures the absolute information content of that individual object. This cannot be expressed in C.E. Shannon's information theory which, unlike Kolmogorov complexity, is concerned only with the average information of a random source.
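The contrast can be made concrete in a small sketch (the function name is ours, and the compressor again only approximates description length): a uniform source over bytes carries 8 bits per symbol on average, yet one particular outcome of that source, the all-zero string, is individually almost free of information.

```python
import math
import zlib

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over all 256 byte values carries 8 bits per byte
# on average -- no matter which bytes actually come out.
per_byte = shannon_entropy_bits([1 / 256] * 256)
assert abs(per_byte - 8.0) < 1e-9

# Shannon's theory assigns a megabyte drawn from this source
# 8,000,000 bits of information on average.  Yet the particular
# all-zero outcome is individually highly compressible: a real
# compressor describes it in far fewer bits.
all_zero = bytes(10 ** 6)
individual_bits = 8 * len(zlib.compress(all_zero, 9))
assert individual_bits < per_byte * len(all_zero) / 100
```

Shannon's theory quantifies the source; Kolmogorov complexity quantifies the string.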
Universal Induction and Data Compression.
Traditional wisdom has it that the better a theory compresses the learning data concerning some phenomenon under investigation, the better we learn and generalize, and the better the theory predicts unknown data. This belief is vindicated in practice, but before the advent of Kolmogorov complexity it had not been rigorously proved in a general setting. Making these ideas rigorous involves the length of the shortest effective description of an individual object: its Kolmogorov complexity. Ray Solomonoff invented the notion of universal prediction using the universal distribution based on Kolmogorov complexity. Universal prediction is related to optimal effective compression. The latter is almost always a best strategy in hypothesis identification (an ideal form of the minimum description length (MDL) principle, invented for practical statistics by J.J. Rissanen). Although the single best-compressing hypothesis does not necessarily give the best prediction, it can be demonstrated that compression is nonetheless almost always the best strategy in prediction methods in the style of R. Solomonoff.
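The flavor of MDL-style hypothesis identification can be sketched in a toy setting: among candidate "repeating unit" hypotheses for a byte string, pick the one minimizing a two-part code length (bits for the model plus bits for the data given the model). The function name and the 8-bits-per-byte cost model are illustrative assumptions, not Rissanen's actual formulation:

```python
import math

def mdl_best_period(data: bytes) -> int:
    """Pick the repeating-unit length minimizing a two-part code:
    8 bits per unit byte (the model) plus the bits needed to state
    the repetition count (the data given the model), with a literal
    encoding of the whole string as the fallback hypothesis."""
    n = len(data)
    best_p, best_cost = n, 8 * n          # fallback: encode literally
    for p in range(1, n):
        if n % p == 0 and data[:p] * (n // p) == data:
            cost = 8 * p + math.ceil(math.log2(n // p + 1))
            if cost < best_cost:
                best_cost, best_p = cost, p
    return best_p

# The shortest total description of b"abc" * 100 is the 3-byte unit
# plus a repetition count -- the hypothesis that compresses best:
assert mdl_best_period(b"abc" * 100) == 3
# Non-periodic data is best left literal:
assert mdl_best_period(b"hello!") == 6
```

In the idealized theory, the code lengths above are replaced by Kolmogorov complexities, and the best-compressing hypothesis is (almost always) the one to trust.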
Keywords:
- Kolmogorov complexity
- algorithmic information theory
- randomness of individual objects
- information in individual objects
- incompressibility method
- universal induction and data compression
- similarity by compression