Open Access Article
Entropy 2017, 19(9), 451; doi:10.3390/e19090451

Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables

1 Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068 Rovereto (TN), Italy
2 Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
* Authors to whom correspondence should be addressed.
Received: 27 June 2017 / Revised: 21 August 2017 / Accepted: 25 August 2017 / Published: 28 August 2017

Abstract

In a system of three stochastic variables, the Partial Information Decomposition (PID) of Williams and Beer dissects the information that two variables (sources) carry about a third variable (target) into nonnegative information atoms that describe redundant, unique, and synergistic modes of dependencies among the variables. However, the classification of the three variables into two sources and one target limits the dependency modes that can be quantitatively resolved, and does not naturally suit all systems. Here, we extend the PID to describe trivariate modes of dependencies in full generality, without introducing additional decomposition axioms or making assumptions about the target/source nature of the variables. By comparing different PID lattices of the same system, we unveil a finer PID structure made of seven nonnegative information subatoms that are invariant to different target/source classifications and that are sufficient to describe the relationships among all PID lattices. This finer structure naturally splits redundant information into two nonnegative components: the source redundancy, which arises from the pairwise correlations between the source variables, and the non-source redundancy, which does not, and relates to the synergistic information the sources carry about the target. The invariant structure is also sufficient to construct the system’s entropy, and hence completely characterizes all the interdependencies in the system.
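The Williams–Beer PID that the abstract builds on decomposes I(S; Y1, Y2) into redundancy (measured by I_min, the expected minimum specific information), two unique atoms, and synergy. The sketch below is a minimal illustration of that baseline decomposition for discrete distributions, not an implementation of the paper's extended invariant subatoms; the function name and distribution encoding are our own choices for the example.

```python
import math

def pid_williams_beer(p):
    """Williams-Beer PID of I(S; Y1, Y2) into redundancy, unique and
    synergistic atoms, for a joint distribution p[(y1, y2, s)] -> prob."""
    def marg(idxs):
        # marginal distribution over the given coordinate indices
        out = {}
        for k, v in p.items():
            key = tuple(k[i] for i in idxs)
            out[key] = out.get(key, 0.0) + v
        return out

    ps = marg((2,))  # target marginal p(s)

    def specific_info(s, i):
        # specific information I(S=s; Yi) = sum_y p(y|s) log2(p(s|y)/p(s))
        pyi, pyis = marg((i,)), marg((i, 2))
        total = 0.0
        for (y,), py in pyi.items():
            pys = pyis.get((y, s), 0.0)
            if pys > 0.0:
                total += (pys / ps[(s,)]) * math.log2((pys / py) / ps[(s,)])
        return total

    # I_min redundancy: expectation over s of the minimum specific information
    red = sum(ps[(s,)] * min(specific_info(s, 0), specific_info(s, 1))
              for (s,) in ps)

    def mi(idxs):
        # mutual information I(S; Y_idxs)
        pj, py = marg(idxs + (2,)), marg(idxs)
        return sum(v * math.log2(v / (py[k[:-1]] * ps[(k[-1],)]))
                   for k, v in pj.items() if v > 0.0)

    i1, i2, i12 = mi((0,)), mi((1,)), mi((0, 1))
    return {"redundancy": red,
            "unique1": i1 - red, "unique2": i2 - red,
            "synergy": i12 - i1 - i2 + red}

# XOR target: each source alone carries nothing about S = Y1 xor Y2,
# but together they determine it, so all information is synergistic.
xor = {(y1, y2, y1 ^ y2): 0.25 for y1 in (0, 1) for y2 in (0, 1)}
print(pid_williams_beer(xor))
```

For the uniform XOR distribution this yields one bit of synergy and zero redundancy and unique information, the textbook example of purely synergistic dependence that any PID must capture.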
Keywords: information theory; information decomposition; redundancy; synergy; multivariate dependencies
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Pica, G.; Piasini, E.; Chicharro, D.; Panzeri, S. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables. Entropy 2017, 19, 451.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.