Open Access Article
Entropy 2017, 19(2), 71; doi:10.3390/e19020071

Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss

1 Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA
2 Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy
* Author to whom correspondence should be addressed.
Academic Editor: Mikhail Prokopenko
Received: 30 December 2016 / Revised: 12 February 2017 / Accepted: 13 February 2017 / Published: 16 February 2017
(This article belongs to the Special Issue Complexity, Criticality and Computation (C³))

Abstract

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of information gain lattices, which allows separating the information that a set of variables contains about another variable into components interpretable as the unique information of one variable, or as redundancy and synergy components. In this work, we extend this framework, focusing on the lattices that underpin the decomposition. We generalize the type of constructible lattices and examine the relations between different lattices, for example, relating bivariate and trivariate decompositions. We point out that, in information gain lattices, redundancy components are invariant across decompositions, but unique and synergy components are decomposition-dependent. Exploiting the connection between different lattices, we propose a procedure to construct, in the general multivariate case, information gain decompositions from measures of synergy or unique information. We then introduce an alternative type of lattices, information loss lattices, with the role and invariance properties of redundancy and synergy components reversed with respect to gain lattices, and which provide an alternative procedure to build multivariate decompositions. We finally show how information gain and information loss dual lattices lead to a self-consistent unique decomposition, which allows a deeper understanding of the origin and meaning of synergy and redundancy.
Keywords: information theory; mutual information decomposition; synergy; redundancy
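
The bivariate case of the Williams-Beer framework described in the abstract can be made concrete with a small numerical sketch. The snippet below is illustrative only and is not code from the article: it computes the Williams-Beer redundancy measure I_min for a discrete joint distribution p(x1, x2, y), then derives the unique and synergy components of the bivariate information gain decomposition from it. The variable names and the XOR example are assumptions chosen for illustration; XOR recovers the expected result of one bit of pure synergy.

# Minimal sketch, assuming NumPy; names and the XOR example are illustrative
# and do not come from the article.
import numpy as np

def mutual_information(pxy):
    """I(X; Y) in bits for a joint probability table p(x, y)."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask])))

def i_min(p):
    """Williams-Beer redundancy I_min(Y; {X1}{X2}) for a table p[x1, x2, y]."""
    py = p.sum(axis=(0, 1))
    red = 0.0
    for y in range(p.shape[2]):
        if py[y] == 0:
            continue
        specific = []
        for drop_axis in (1, 0):               # marginalize out the other source
            pxiy = p.sum(axis=drop_axis)       # p(x_i, y)
            pxi = pxiy.sum(axis=1)             # p(x_i)
            px_given_y = pxiy[:, y] / py[y]    # p(x_i | y)
            nz = px_given_y > 0
            # specific information I(Y = y; X_i)
            specific.append(np.sum(px_given_y[nz] *
                                   np.log2(pxiy[nz, y] / (pxi[nz] * py[y]))))
        red += py[y] * min(specific)
    return float(red)

# XOR example: Y = X1 xor X2 with X1, X2 independent and uniform.
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25

i1 = mutual_information(p.sum(axis=1))     # I(X1; Y)
i2 = mutual_information(p.sum(axis=0))     # I(X2; Y)
i12 = mutual_information(p.reshape(4, 2))  # I((X1, X2); Y)
red = i_min(p)
print("redundancy:", red)                  # 0.0
print("unique X1 :", i1 - red)             # 0.0
print("unique X2 :", i2 - red)             # 0.0
print("synergy   :", i12 - i1 - i2 + red)  # 1.0

With I_min fixed as the redundancy measure, the remaining components follow from the lattice structure: unique information of each source is its mutual information minus the redundancy, and synergy is the joint mutual information minus both unique terms and the redundancy.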
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Chicharro, D.; Panzeri, S. Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss. Entropy 2017, 19, 71.


