The Information Bottleneck Method

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (1 February 2013) | Viewed by 58027

Special Issue Editors

Prof. Dr. Naftali Tishby
The School of Computer Science and Engineering and the Interdisciplinary Center for Neural Computation, The Hebrew University of Jerusalem, Jerusalem 9190401, Israel
Interests: information theory in machine learning; dynamical systems and control; statistical physics of neural systems; computational neuroscience
Dr. Daniel Polani
Department of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: artificial life; artificial intelligence; information theory; minimally cognitive agents; embodiment

Special Issue Information

Dear Colleagues,

The Information Bottleneck Method is a simple optimization principle for the model-free extraction of the relevant part of one random variable with respect to another. Its formulation is closely related to classical problems in information theory, such as Rate-Distortion Theory and source coding with side information (the Wyner-Ahlswede-Körner, or WAK, problem), but it has turned out to be connected with many other interesting domains. It generalizes the notion of minimal sufficient statistics in classical estimation theory; generalizes Canonical Correlation Analysis (CCA) when applied to multivariate Gaussian variables; provides an optimal solution to the Kelly gambling problem; and serves as a basic building block for an information theory of perception and action. It also provides elegant extensions of optimal control theory to information-gathering problems, and it has numerous applications in machine learning and computational neuroscience. This Special Issue aims to provide a thorough discussion of the statistical, algorithmic, control-theoretic, and biological aspects of this suggestive principle.
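Concretely, given a joint distribution p(x, y), the method seeks a soft compressed representation T of X that minimizes I(X;T) − βI(T;Y). The sketch below iterates the self-consistent update equations of the original formulation (Tishby, Pereira and Bialek, 1999); it is a minimal illustration, with function and argument names of our choosing, assuming discrete variables and p(x) > 0 for every x:

```python
import numpy as np

def information_bottleneck(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Iterative Information Bottleneck (after Tishby, Pereira & Bialek, 1999).

    Given a joint distribution p(x, y) over discrete X and Y, find a soft
    encoder p(t|x) that minimizes I(X;T) - beta * I(T;Y) by iterating the
    self-consistent update equations. Assumes every x has p(x) > 0.
    """
    rng = np.random.default_rng(seed)
    n_x, _ = p_xy.shape
    p_x = p_xy.sum(axis=1)                       # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]            # conditional p(y|x)
    eps = 1e-12                                  # numerical floor for logs

    # Random soft initialization of the encoder p(t|x).
    p_t_given_x = rng.random((n_x, n_clusters))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        p_t = p_x @ p_t_given_x                  # marginal p(t)
        # Decoder p(y|t) = sum_x p(t|x) p(x) p(y|x) / p(t).
        p_y_given_t = (p_t_given_x * p_x[:, None]).T @ p_y_given_x
        p_y_given_t /= p_t[:, None] + eps
        # KL divergence D[p(y|x) || p(y|t)] for every pair (x, t).
        log_ratio = (np.log(p_y_given_x[:, None, :] + eps)
                     - np.log(p_y_given_t[None, :, :] + eps))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # Self-consistent encoder update: p(t|x) ∝ p(t) exp(-beta * KL).
        p_t_given_x = p_t[None, :] * np.exp(-beta * kl)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    return p_t_given_x
```

The trade-off parameter β controls compression: small β yields a nearly uninformative encoder, while large β preserves almost all of the information X carries about Y.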

Prof. Dr. Naftali Tishby
Dr. Daniel Polani
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information divergences
  • sufficient statistics
  • predictive information
  • information and control
  • side information
  • cognitive information processing

Published Papers (6 papers)


Research

Article
Information Bottleneck Approach to Predictive Inference
by Susanne Still
Entropy 2014, 16(2), 968-989; https://doi.org/10.3390/e16020968 - 17 Feb 2014
Cited by 38 | Viewed by 9777
Abstract
This paper synthesizes a recent line of work on automated predictive model making inspired by Rate-Distortion theory, in particular by the Information Bottleneck method. Predictive inference is interpreted as a strategy for efficient communication. The relationship to thermodynamic efficiency is discussed. The overall aim of this paper is to explain how this information theoretic approach provides an intuitive, overarching framework for predictive inference.
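For orientation, the past-future bottleneck objective underlying this line of work can be written as follows; the notation is ours and may differ from the paper's:

```latex
% Compress the past X_past into a model state T that remains
% maximally informative about the future X_future:
\min_{p(t \mid x_{\mathrm{past}})}\;
  I(X_{\mathrm{past}}; T) \;-\; \beta\, I(T; X_{\mathrm{future}})
```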
Article
Consideration on Singularities in Learning Theory and the Learning Coefficient
by Miki Aoyagi
Entropy 2013, 15(9), 3714-3733; https://doi.org/10.3390/e15093714 - 6 Sep 2013
Cited by 4 | Viewed by 4587
Abstract
We consider the learning coefficients in learning theory and give two new methods for obtaining these coefficients in a homogeneous case: a method for finding a deepest singular point and a method for adding variables. In application to Vandermonde matrix-type singularities, we show that these methods are effective. The learning coefficient of the generalization error in Bayesian estimation serves to measure the learning efficiency in singular learning models. Mathematically, the learning coefficient corresponds to a real log canonical threshold of singularities for the Kullback functions (relative entropy) in learning theory.
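For context, in singular learning theory the learning coefficient λ (the real log canonical threshold) governs the leading asymptotics of the Bayes free energy and generalization error. In standard notation (ours, summarizing the general theory rather than this paper's statements), with multiplicity m and empirical entropy S_n:

```latex
% Leading asymptotics governed by the learning coefficient \lambda:
F_n = n S_n + \lambda \log n - (m - 1) \log \log n + O_p(1),
\qquad
\mathbb{E}[G_n] = \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right)
```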

Article
Quantifying Morphological Computation
by Keyan Zahedi and Nihat Ay
Entropy 2013, 15(5), 1887-1915; https://doi.org/10.3390/e15051887 - 21 May 2013
Cited by 40 | Viewed by 7157
Abstract
The field of embodied intelligence emphasises the importance of the morphology and environment with respect to the behaviour of a cognitive system. The contribution of the morphology to the behaviour, commonly known as morphological computation, is well recognised in this community. We believe that the field would benefit from a formalisation of this concept, as we would like to ask how much the morphology and the environment contribute to an embodied agent's behaviour, and how an embodied agent can maximise the exploitation of its morphology within its environment. In this work, we derive two concepts for measuring morphological computation and discuss their relation to the Information Bottleneck Method. The first concept asks how much the world contributes to the overall behaviour, and the second asks how much the agent's action contributes to a behaviour. Various measures are derived from these concepts and validated in two experiments that highlight their strengths and weaknesses.
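Measures of this kind are typically built from conditional informations over the sensorimotor loop, for example the conditional mutual information I(W'; W | A) between consecutive world states given the action. Below is a minimal plug-in estimator for such a quantity from discrete samples; the variable names and the specific choice of quantity are our illustration, not the paper's definitions:

```python
import numpy as np
from collections import Counter

def conditional_mutual_information(w_next, w, a):
    """Plug-in estimate of I(W'; W | A) in bits from paired samples of
    discrete variables: next world state, current world state, action."""
    n = len(a)
    c_wwa = Counter(zip(w_next, w, a))   # counts of (w', w, a)
    c_wna = Counter(zip(w_next, a))      # counts of (w', a)
    c_wa = Counter(zip(w, a))            # counts of (w, a)
    c_a = Counter(a)                     # counts of a
    cmi = 0.0
    for (wn, wc, ac), c in c_wwa.items():
        p = c / n
        # I = sum p(w',w,a) * log[ p(w',w,a) p(a) / (p(w',a) p(w,a)) ]
        cmi += p * np.log2((c * c_a[ac]) / (c_wna[(wn, ac)] * c_wa[(wc, ac)]))
    return cmi

# Example: a world whose next state depends on both its current state
# and the agent's action, plus a little noise.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=50_000)
w = rng.integers(0, 2, size=50_000)
w_next = (w + a + (rng.random(50_000) < 0.1)) % 2
print(conditional_mutual_information(w_next, w, a))
```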

Article
Function Identification in Neuron Populations via Information Bottleneck
by S. Kartik Buddha, Kelvin So, Jose M. Carmena and Michael C. Gastpar
Entropy 2013, 15(5), 1587-1608; https://doi.org/10.3390/e15051587 - 6 May 2013
Cited by 10 | Viewed by 6445
Abstract
It is plausible to hypothesize that the spiking responses of certain neurons represent functions of the spiking signals of other neurons. A natural ensuing question concerns how to use experimental data to infer what kind of function is being computed. Model-based approaches typically require assumptions on how information is represented. By contrast, information measures are sensitive only to relative behavior: information is unchanged by applying arbitrary invertible transformations to the involved random variables. This paper develops an approach based on the information bottleneck method that attempts to find such functional relationships in a neuron population. Specifically, the information bottleneck method is used to provide appropriate compact representations which can then be parsed to infer functional relationships. In the present paper, the parsing step is specialized to the case of remapped-linear functions. The approach is validated on artificial data and then applied to recordings from the motor cortex of a macaque monkey performing an arm-reaching task. Functional relationships are identified and shown to exhibit some degree of persistence across multiple trials of the same experiment.
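As a toy illustration of the idea (our sketch, not the authors' pipeline), one can reuse the information_bottleneck function from the sketch earlier on this page on synthetic data in which one "output neuron" computes a linear function of two "input neurons": the compact representation should cluster inputs by the value of that linear combination, which a parsing step could then identify.

```python
import numpy as np

# Toy data: output spike count Y is the sum of two input counts X1, X2.
# Any invertible remapping of Y would leave the information quantities,
# and hence the recovered clustering, unchanged.
rng = np.random.default_rng(1)
x1 = rng.integers(0, 3, size=10_000)   # input neuron 1: counts in {0,1,2}
x2 = rng.integers(0, 3, size=10_000)   # input neuron 2: counts in {0,1,2}
y = x1 + x2                            # output neuron: linear readout

x = 3 * x1 + x2                        # joint input alphabet, 9 symbols
p_xy = np.zeros((9, 5))
np.add.at(p_xy, (x, y), 1.0)
p_xy /= p_xy.sum()

# Reuses information_bottleneck() from the sketch above.
encoder = information_bottleneck(p_xy, n_clusters=5, beta=10.0)
print(np.round(encoder, 2))            # rows with equal x1 + x2 share a cluster
```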

Article
A Free Energy Principle for Biological Systems
by Karl Friston
Entropy 2012, 14(11), 2100-2121; https://doi.org/10.3390/e14112100 - 31 Oct 2012
Cited by 215 | Viewed by 21463
Abstract
This paper describes a free energy principle that tries to explain the ability of biological systems to resist a natural tendency to disorder. It appeals to circular causality of the sort found in synergetic formulations of self-organization (e.g., the slaving principle) and models of coupled dynamical systems, using nonlinear Fokker-Planck equations. Here, circular causality is induced by separating the states of a random dynamical system into external and internal states, where external states are subject to random fluctuations and internal states are not. This reduces the problem to finding some (deterministic) dynamics of the internal states that ensure the system visits a limited number of external states; in other words, that the measure of its (random) attracting set, or equivalently the Shannon entropy of the external states, is small. We motivate a solution using a principle of least action based on variational free energy (from statistical physics) and establish the conditions under which it is formally equivalent to the information bottleneck method. This approach has proved useful in understanding the functional architecture of the brain. The generality of variational free energy minimisation and corresponding information theoretic formulations may speak to interesting applications beyond the neurosciences; e.g., in molecular or evolutionary biology.
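For reference, variational free energy in one common notation (ours, not necessarily the paper's): with observations o, hidden (external) states s, a generative model p(o, s), and a variational density q(s) encoded by the internal states,

```latex
% Variational free energy and its standard decomposition: minimising F
% drives q(s) toward the posterior p(s|o) and bounds the surprise -log p(o).
F \;=\; \mathbb{E}_{q(s)}\big[\log q(s) - \log p(o, s)\big]
  \;=\; D_{\mathrm{KL}}\big[q(s)\,\big\|\,p(s \mid o)\big] \;-\; \log p(o)
  \;\ge\; -\log p(o)
```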

Article
The Mathematical Structure of Information Bottleneck Methods
by Tomáš Gedeon, Albert E. Parker and Alexander G. Dimitrov
Entropy 2012, 14(3), 456-479; https://doi.org/10.3390/e14030456 - 1 Mar 2012
Cited by 17 | Viewed by 6969
Abstract
Information Bottleneck-based methods use mutual information as a distortion function in order to extract relevant details about the structure of a complex system by compression. One approach to generating optimal compressed representations is to anneal a parameter. In this manuscript, we present a common framework for the study of annealing in information distortion problems. We identify features that should be common to any annealing optimization problem. The main mathematical tools that we use come from the analysis of dynamical systems in the presence of symmetry (equivariant bifurcation theory). Through the compression problem, we make connections to the world of combinatorial optimization and pattern recognition. The two approaches use very different vocabularies and consider different problems to be "interesting". We provide an initial link, through the Normalized Cut Problem, where the two disciplines can exchange tools and ideas.
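A minimal sketch of the annealing idea (our illustration, not the paper's algorithm): sweep β upward along a schedule, warm-starting each solve from the previous encoder with a small perturbation so that solutions can break symmetry at bifurcation points along the way. It repeats the self-consistent updates from the information bottleneck sketch earlier on this page but is self-contained:

```python
import numpy as np

def anneal_ib(p_xy, n_clusters, betas, n_iter=100, seed=0):
    """Annealing for the Information Bottleneck: track solutions along an
    increasing schedule of beta values, warm-starting each run from the
    previous encoder plus a tiny perturbation so the solution can branch
    (break symmetry) at bifurcation points."""
    rng = np.random.default_rng(seed)
    n_x, _ = p_xy.shape
    p_x = p_xy.sum(axis=1)
    p_y_given_x = p_xy / p_x[:, None]
    eps = 1e-12
    enc = np.full((n_x, n_clusters), 1.0 / n_clusters)  # uninformative start
    solutions = []
    for beta in betas:
        enc = enc + 1e-3 * rng.random(enc.shape)  # perturb to allow branching
        enc /= enc.sum(axis=1, keepdims=True)
        for _ in range(n_iter):                   # self-consistent updates
            p_t = p_x @ enc
            p_y_given_t = (enc * p_x[:, None]).T @ p_y_given_x
            p_y_given_t /= p_t[:, None] + eps
            kl = (p_y_given_x[:, None, :]
                  * (np.log(p_y_given_x[:, None, :] + eps)
                     - np.log(p_y_given_t[None, :, :] + eps))).sum(axis=2)
            enc = p_t[None, :] * np.exp(-beta * kl)
            enc /= enc.sum(axis=1, keepdims=True)
        solutions.append(enc.copy())
    return solutions
```

Plotting I(X;T) against I(T;Y) for the returned solutions traces the information curve, and qualitative changes between consecutive encoders mark the bifurcations the paper analyzes.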
