Open Access Article (Feature Paper)
Entropy 2018, 20(3), 169; https://doi.org/10.3390/e20030169

The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy

Daniel Chicharro 1,2,*, Giuseppe Pica 2 and Stefano Panzeri 2
1 Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
2 Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy
* Author to whom correspondence should be addressed.
Received: 13 November 2017 / Revised: 26 February 2018 / Accepted: 28 February 2018 / Published: 5 March 2018
(This article belongs to the Special Issue Information Theory in Neuroscience)

Abstract

Understanding how different information sources together transmit information is crucial in many domains. For example, understanding the neural code requires characterizing how different neurons contribute unique, redundant, or synergistic pieces of information about sensory or behavioral variables. Williams and Beer (2010) proposed a partial information decomposition (PID) that separates the mutual information that a set of sources contains about a set of targets into nonnegative terms interpretable as these pieces. Quantifying redundancy requires assigning an identity to different information pieces, to assess when information is common across sources. Harder et al. (2013) proposed an identity axiom that imposes necessary conditions to quantify qualitatively common information. However, Bertschinger et al. (2012) showed that, in a counterexample with deterministic target-source dependencies, the identity axiom is incompatible with ensuring PID nonnegativity. Here, we systematically study the consequences of information identity criteria that assign identity based on associations between target and source variables resulting from deterministic dependencies. We show how these criteria are related to the identity axiom and to previously proposed redundancy measures, and we characterize how they lead to negative PID terms. This constitutes a further step toward more explicitly addressing the role of information identity in the quantification of redundancy. The implications for studying neural coding are discussed.
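
As a minimal illustration of the quantities discussed in the abstract, the sketch below (not from the paper; all function names are illustrative) computes the two-source PID of Williams and Beer (2010) for the canonical XOR system T = X1 XOR X2, using the I_min redundancy defined as the expected minimum specific information across sources. For XOR, both single-source informations vanish, so all 1 bit of the joint information is assigned to synergy.

```python
import itertools
import math

# Joint distribution p(x1, x2, t) for T = X1 XOR X2 with uniform binary inputs.
p = {}
for x1, x2 in itertools.product((0, 1), repeat=2):
    p[(x1, x2, x1 ^ x2)] = 0.25

def marginal(dist, axes):
    """Marginalize the joint distribution onto the given tuple of axes."""
    out = {}
    for xs, pr in dist.items():
        key = tuple(xs[a] for a in axes)
        out[key] = out.get(key, 0.0) + pr
    return out

def mutual_info(dist, a_axes, b_axes):
    """Mutual information (in bits) between the variables on a_axes and b_axes."""
    pa, pb = marginal(dist, a_axes), marginal(dist, b_axes)
    pab = marginal(dist, a_axes + b_axes)
    mi = 0.0
    for key, pr in pab.items():
        ka, kb = key[:len(a_axes)], key[len(a_axes):]
        mi += pr * math.log2(pr / (pa[ka] * pb[kb]))
    return mi

def i_min(dist, src_axes_list, t_axis=2):
    """Williams-Beer redundancy: expectation over t of the minimum,
    across sources, of the specific information I(T = t; S)."""
    pt = marginal(dist, (t_axis,))
    red = 0.0
    for (t,), pt_val in pt.items():
        specs = []
        for axes in src_axes_list:
            ps = marginal(dist, axes)
            pst = marginal(dist, axes + (t_axis,))
            spec = 0.0
            for key, pr in pst.items():
                if key[-1] != t:
                    continue
                s = key[:-1]
                p_s_given_t = pr / pt_val
                # p(s|t) log[p(s|t)/p(s)] equals p(s|t) log[p(t|s)/p(t)] by Bayes.
                spec += p_s_given_t * math.log2(p_s_given_t / ps[s])
            specs.append(spec)
        red += pt_val * min(specs)
    return red

i1 = mutual_info(p, (0,), (2,))     # I(X1; T) = 0 for XOR
i2 = mutual_info(p, (1,), (2,))     # I(X2; T) = 0 for XOR
i12 = mutual_info(p, (0, 1), (2,))  # I(X1, X2; T) = 1 bit
red = i_min(p, [(0,), (1,)])        # redundancy = 0
print(f"unique1={i1 - red:.3f} unique2={i2 - red:.3f} "
      f"redundancy={red:.3f} synergy={i12 - i1 - i2 + red:.3f}")
```

The script reports unique1 = unique2 = redundancy = 0 and synergy = 1 bit. Replacing the XOR mapping with a copying target T = (X1, X2) produces the kind of deterministic target-source dependency at issue in the counterexample of Bertschinger et al. (2012), where the assignment of identity to information pieces becomes critical.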
Keywords: information theory; mutual information decomposition; synergy; redundancy
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Chicharro, D.; Pica, G.; Panzeri, S. The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy. Entropy 2018, 20, 169.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
