Open Access Article

Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables

1 Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068 Rovereto (TN), Italy
2 Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
* Authors to whom correspondence should be addressed.
Entropy 2017, 19(9), 451; https://doi.org/10.3390/e19090451
Received: 27 June 2017 / Revised: 21 August 2017 / Accepted: 25 August 2017 / Published: 28 August 2017
In a system of three stochastic variables, the Partial Information Decomposition (PID) of Williams and Beer dissects the information that two variables (sources) carry about a third variable (target) into nonnegative information atoms that describe redundant, unique, and synergistic modes of dependencies among the variables. However, the classification of the three variables into two sources and one target limits the dependency modes that can be quantitatively resolved, and does not naturally suit all systems. Here, we extend the PID to describe trivariate modes of dependencies in full generality, without introducing additional decomposition axioms or making assumptions about the target/source nature of the variables. By comparing different PID lattices of the same system, we unveil a finer PID structure made of seven nonnegative information subatoms that are invariant to different target/source classifications and that are sufficient to describe the relationships among all PID lattices. This finer structure naturally splits redundant information into two nonnegative components: the source redundancy, which arises from the pairwise correlations between the source variables, and the non-source redundancy, which does not, and which relates to the synergistic information the sources carry about the target. The invariant structure is also sufficient to construct the system’s entropy, and hence it completely characterizes all the interdependencies in the system.
Keywords: information theory; information decomposition; redundancy; synergy; multivariate dependencies
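For readers who want a concrete handle on the decomposition the abstract refers to, the following is a minimal sketch (not the authors' code) of the classical Williams-Beer PID for discrete variables, using their original I_min redundancy measure; the function names and the XOR test system are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch (not the authors' code) of the Williams-Beer PID for a
# discrete trivariate system, using the original I_min redundancy measure.
# The function names and the XOR example below are illustrative choices.
from collections import defaultdict
from math import log2

def pid_atoms(p):
    """p: dict mapping (s1, s2, t) -> probability.
    Returns the four Williams-Beer atoms, in bits."""
    # Marginal distributions of the target, the sources, and the relevant pairs.
    p_t, p_s1, p_s2 = defaultdict(float), defaultdict(float), defaultdict(float)
    p_s1t, p_s2t, p_s1s2 = defaultdict(float), defaultdict(float), defaultdict(float)
    for (s1, s2, t), pr in p.items():
        p_t[t] += pr; p_s1[s1] += pr; p_s2[s2] += pr
        p_s1t[(s1, t)] += pr; p_s2t[(s2, t)] += pr; p_s1s2[(s1, s2)] += pr

    def mi_joint():  # I(T; S1,S2)
        return sum(pr * log2(pr / (p_s1s2[(s1, s2)] * p_t[t]))
                   for (s1, s2, t), pr in p.items() if pr > 0)

    def mi_single(p_st, p_s):  # I(T; S_i)
        return sum(pr * log2(pr / (p_s[s] * p_t[t]))
                   for (s, t), pr in p_st.items() if pr > 0)

    def specific_info(t, p_st, p_s):  # I(T = t; S_i), the quantity minimized in I_min
        return sum((p_st[(s, t)] / p_t[t]) * log2(p_st[(s, t)] / (p_s[s] * p_t[t]))
                   for s in list(p_s) if p_st[(s, t)] > 0)

    # I_min: for each target state, take the smaller specific information of the
    # two sources, then average over the target.
    redundancy = sum(p_t[t] * min(specific_info(t, p_s1t, p_s1),
                                  specific_info(t, p_s2t, p_s2))
                     for t in list(p_t))
    i1, i2, i12 = mi_single(p_s1t, p_s1), mi_single(p_s2t, p_s2), mi_joint()
    return {"redundancy": redundancy,
            "unique_1": i1 - redundancy,
            "unique_2": i2 - redundancy,
            "synergy": i12 - i1 - i2 + redundancy}

# XOR system: S1, S2 independent fair bits, T = S1 xor S2.
p_xor = {(s1, s2, s1 ^ s2): 0.25 for s1 in (0, 1) for s2 in (0, 1)}
print(pid_atoms(p_xor))  # expect 0 redundancy, 0 unique, 1 bit of synergy
```

Run on the XOR system, the sketch reports zero redundant and unique information and one bit of synergy, the standard example of a purely synergistic trivariate dependency.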
MDPI and ACS Style

Pica, G.; Piasini, E.; Chicharro, D.; Panzeri, S. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables. Entropy 2017, 19, 451.

