Special Issue "Kolmogorov Complexity"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (31 October 2011)
Prof. Dr. Paul M. B. Vitányi
1 CWI (Centrum Wiskunde & Informatica), Science Park 123, 1098 XG, Amsterdam, The Netherlands
2 University of Amsterdam, 1012 WX Amsterdam, The Netherlands
Phone: +31 20 5924124
Fax: +31 20 5924199
Interests: cellular automata; computational complexity; distributed and parallel computing; machine learning and prediction; physics of computation; Kolmogorov complexity; information theory; quantum computing
This Special Issue deals with Kolmogorov complexity in its many facets:

Pointwise Randomness:
Kolmogorov complexity is a modern notion of randomness dealing with the quantity of information in individual objects; that is, pointwise randomness rather than the average randomness produced by a random source. It was proposed by A.N. Kolmogorov in 1965 to quantify the information and randomness of individual objects in an objective and absolute manner. This is impossible within classical probability theory (a branch of measure theory satisfying the so-called Kolmogorov axioms, formulated in 1933). Kolmogorov complexity is known variously as "algorithmic information", "algorithmic entropy", "Kolmogorov-Chaitin complexity", "descriptional complexity", "shortest program length", and "algorithmic randomness", among other names.
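Although Kolmogorov complexity itself is uncomputable, the length of an object under any real-world compressor gives a computable upper bound on it (up to an additive constant). A minimal Python sketch using the standard zlib module (the function name and sample data are illustrative, not from this call for papers):

```python
import os
import zlib

def compressed_length(data: bytes) -> int:
    """Length in bytes of the zlib-compressed data: a crude,
    computable upper bound (up to a constant) on the Kolmogorov
    complexity of `data`."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500         # highly regular: a short description exists
randomish = os.urandom(1000)  # typically incompressible

print(compressed_length(regular))    # far below the raw 1000 bytes
print(compressed_length(randomish))  # close to (or above) 1000 bytes
```

The compressed length of the regular string is tiny, while random bytes barely shrink at all: the pointwise distinction between compressible and incompressible individual objects that Kolmogorov complexity makes precise.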
In several of the research areas mentioned on Paul Vitányi's home page, a new mathematical proof technique was developed, now known as the `incompressibility method'. The incompressibility method is a basic general technique, on a par with the "pigeonhole" argument, the counting method, and the probabilistic method; it is based on Kolmogorov complexity.
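The counting argument underlying the incompressibility method can be checked mechanically. The following Python lines (an illustrative sketch, not part of the issue text) verify that there are fewer binary descriptions of length less than n than binary strings of length n, so some length-n string is incompressible:

```python
# There are 2**n binary strings of length n, but only
# 2**0 + 2**1 + ... + 2**(n-1) = 2**n - 1 binary descriptions of
# length < n. Hence at least one length-n string has no description
# shorter than itself: it is incompressible.
n = 20
strings_of_length_n = 2 ** n
shorter_descriptions = sum(2 ** i for i in range(n))  # lengths 0 .. n-1

assert shorter_descriptions == strings_of_length_n - 1

# More generally, fewer than a 2**-(c-1) fraction of length-n strings
# can be compressed by c or more bits:
c = 10
assert sum(2 ** i for i in range(n - c + 1)) / strings_of_length_n < 2 ** -(c - 1)
```

Because "most" strings are incompressible in this sense, one may argue about a typical object by assuming it is incompressible and deriving a contradiction — the core move of the method.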
The Kolmogorov complexity of an object is a form of absolute information content of that individual object. Such a notion cannot be supplied by C.E. Shannon's information theory, which, unlike Kolmogorov complexity, is concerned only with the average information of a random source.
Universal Induction and Data Compression.
Traditional wisdom has it that the better a theory compresses the data concerning some phenomenon under investigation, the better we learn and generalize, and the better the theory predicts unknown data. This belief is vindicated in practice, but before the advent of Kolmogorov complexity it had not been rigorously proved in a general setting. Making these ideas rigorous involves the length of the shortest effective description of an individual object: its Kolmogorov complexity. Ray Solomonoff invented the notion of universal prediction using the universal distribution based on Kolmogorov complexity. Universal prediction is related to optimal effective compression. The latter is almost always the best strategy in hypothesis identification (an ideal form of the minimum description length (MDL) principle, introduced in practical statistics by J.J. Rissanen). Although the single best hypothesis does not necessarily give the best prediction, it can be demonstrated that compression is nonetheless almost always the best strategy in prediction methods in the style of R. Solomonoff.
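As a concrete illustration of the two-part MDL idea (choose the hypothesis minimizing the length of the hypothesis plus the length of the data encoded with its help), here is a toy Python sketch that picks the period of a repeating string by minimizing a crude two-part cost. The cost weights are assumptions for illustration only; ideal MDL uses Kolmogorov complexities, which are uncomputable:

```python
def mdl_period(s: str) -> int:
    """Two-part MDL toy: the hypothesis is a repeating block of
    length p (costing p symbols to state); the data given the
    hypothesis costs one exception record (assumed to be worth
    8 'bits' here) per position where s deviates from the block."""
    def cost(p: int) -> int:
        block = s[:p]
        mismatches = sum(1 for i, ch in enumerate(s) if ch != block[i % p])
        return p + 8 * mismatches
    return min(range(1, len(s) + 1), key=cost)

print(mdl_period("abcabcabcabcabcabc"))  # the period-3 hypothesis wins
```

The shortest total description (small hypothesis, few exceptions) identifies the regularity "period 3", mirroring how compression of the data selects the best explanatory hypothesis.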
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.
- Kolmogorov complexity
- algorithmic information theory
- randomness of individual objects
- information in individual objects
- incompressibility method
- universal induction and data compression
- similarity by compression
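The last keyword, similarity by compression, refers to the normalized compression distance (NCD) of Cilibrasi and Vitányi. A minimal sketch follows, with Python's zlib standing in for the ideal (uncomputable) compressor; the sample data are illustrative:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    with C(.) approximated by the zlib-compressed length."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"the quick brown fox jumps over the lazy cat " * 20
c = bytes(range(256)) * 4

print(ncd(a, b))  # small: near-duplicates compress well together
print(ncd(a, c))  # larger: unrelated data shares little structure
```

Similar objects help compress each other, so their NCD is near 0; unrelated objects share no exploitable structure, pushing the NCD toward 1. Any off-the-shelf compressor can be substituted for zlib.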
Entropy 2011, 13(3), 595-611; doi:10.3390/e13030595
Received: 14 January 2011; in revised form: 25 February 2011 / Accepted: 26 February 2011 / Published: 3 March 2011
Entropy 2011, 13(4), 778-789; doi:10.3390/e13040778
Received: 17 January 2011; in revised form: 9 March 2011 / Accepted: 24 March 2011 / Published: 29 March 2011
Article: Algorithmic Relative Complexity
Entropy 2011, 13(4), 902-914; doi:10.3390/e13040902
Received: 3 March 2011; in revised form: 31 March 2011 / Accepted: 1 April 2011 / Published: 19 April 2011
Entropy 2011, 13(6), 1076-1136; doi:10.3390/e13061076
Received: 20 April 2011; in revised form: 24 May 2011 / Accepted: 27 May 2011 / Published: 3 June 2011
Last update: 27 May 2011