Special Issue "The Information Bottleneck Method"


A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (1 February 2013)

Special Issue Editors

Guest Editor
Prof. Dr. Naftali Tishby
1 School of Engineering and Computer Science
2 Director, Interdisciplinary Center for Neural Computation
3 Flinkman Professor of Brain Research, Edmond and Lilly Safra Center for Brain Sciences, The Hebrew University, Jerusalem 91904, Israel
Website: http://www.cs.huji.ac.il/~tishby/
E-Mail: tishby@cs.huji.ac.il
Interests: information theory in machine learning; dynamical systems and control; statistical physics of neural systems; computational neuroscience

Guest Editor
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Website: http://homepages.feis.herts.ac.uk/~comqdp1/
E-Mail: d.polani@herts.ac.uk
Interests: artificial intelligence; artificial life; information theory for intelligent information processing; sensor evolution; collective and multiagent systems

Special Issue Information

Dear Colleagues,

The Information Bottleneck Method is a simple optimization principle for model-free extraction of the relevant part of one random variable with respect to another. Its formulation is closely related to classical problems in information theory, such as Rate-Distortion Theory and source coding with side information (the Wyner-Ahlswede-Körner, or WAK, problem), but it has turned out to be connected with many other interesting domains. It generalizes the notion of minimal sufficient statistics in classical estimation theory; reduces to Canonical Correlation Analysis (CCA) when applied to multivariate Gaussian variables; provides an optimal solution to the Kelly gambling problem; and serves as a basic building block for an information theory of perception and action. It also provides elegant extensions of optimal control theory to information-gathering problems and has numerous applications in machine learning and computational neuroscience. This special issue aims to provide a thorough discussion of the statistical, algorithmic, control-theoretic, and biological aspects of this suggestive principle.
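To make the principle above concrete, the following sketch (function and variable names are our own, not from any paper in this issue) iterates the standard IB self-consistent equations in a Blahut-Arimoto fashion: alternately update the encoder p(t|x) ∝ p(t) exp(−β D_KL[p(y|x) ‖ p(y|t)]), the marginal p(t), and the decoder p(y|t). It assumes only NumPy and a discrete joint distribution.

```python
import numpy as np

def information_bottleneck(p_xy, n_t, beta, n_iter=200, seed=0):
    """Illustrative iterative IB solver (names hypothetical).

    p_xy : joint distribution of X and Y, shape (n_x, n_y), sums to 1
    n_t  : cardinality of the bottleneck variable T
    beta : trade-off between compression I(X;T) and relevance I(T;Y)
    """
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                       # marginal p(x)
    p_y_x = p_xy / p_x[:, None]                  # conditional p(y|x)

    # random soft encoder p(t|x), rows normalized
    q_t_x = rng.random((len(p_x), n_t))
    q_t_x /= q_t_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        q_t = q_t_x.T @ p_x                      # p(t) = sum_x p(x) p(t|x)
        q_y_t = (q_t_x * p_x[:, None]).T @ p_y_x # decoder p(y|t) via Bayes' rule
        q_y_t /= q_t[:, None]
        # D_KL[p(y|x) || p(y|t)] for every (x, t) pair; clip to avoid log(0)
        log_ratio = np.log(np.clip(p_y_x[:, None, :], 1e-12, None)
                           / np.clip(q_y_t[None, :, :], 1e-12, None))
        kl = np.einsum('xy,xty->xt', p_y_x, log_ratio)
        # self-consistent encoder update, then renormalize rows
        q_t_x = q_t[None, :] * np.exp(-beta * kl)
        q_t_x /= q_t_x.sum(axis=1, keepdims=True)
    return q_t_x

# toy joint: x=0 predicts y=0, x=1 predicts y=1, x=2 is uninformative
p_xy = np.array([[0.4, 0.0],
                 [0.0, 0.4],
                 [0.1, 0.1]])
encoder = information_bottleneck(p_xy, n_t=2, beta=10.0)
```

At large β the encoder approaches a hard clustering of X by its predictive distribution over Y (here, x=0 and x=1 land in different clusters while x=2 stays split); at small β it collapses toward a single cluster, tracing out the compression-relevance trade-off.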

Prof. Dr. Naftali Tishby
Dr. Daniel Polani
Guest Editors


Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1200 CHF (Swiss Francs).

Keywords
  • information divergences
  • sufficient statistics
  • predictive information
  • information and control
  • side information
  • cognitive information processing

Published Papers (6 papers)

Entropy 2014, 16(2), 968-989; doi:10.3390/e16020968
Received: 3 June 2013; Accepted: 18 June 2013 / Published: 17 February 2014

Entropy 2013, 15(9), 3714-3733; doi:10.3390/e15093714
Received: 21 June 2013; in revised form: 29 August 2013 / Accepted: 30 August 2013 / Published: 6 September 2013

Entropy 2013, 15(5), 1887-1915; doi:10.3390/e15051887
Received: 10 March 2013; in revised form: 23 April 2013 / Accepted: 9 May 2013 / Published: 21 May 2013

Entropy 2013, 15(5), 1587-1608; doi:10.3390/e15051587
Received: 26 February 2013; in revised form: 27 March 2013 / Accepted: 22 April 2013 / Published: 6 May 2013

Entropy 2012, 14(11), 2100-2121; doi:10.3390/e14112100
Received: 17 August 2012; in revised form: 1 October 2012 / Accepted: 25 October 2012 / Published: 31 October 2012

Entropy 2012, 14(3), 456-479; doi:10.3390/e14030456
Received: 2 December 2011; in revised form: 7 February 2012 / Accepted: 24 February 2012 / Published: 1 March 2012

Last update: 5 June 2012

Entropy EISSN 1099-4300 Published by MDPI AG, Basel, Switzerland RSS E-Mail Table of Contents Alert