Open Access Article
Entropy 2016, 18(2), 38;

Understanding Interdependency Through Complex Information Sharing

Departement Elektrotechniek, KU Leuven, Leuven 3001, Belgium
Department of Electrical Engineering & Computer Sciences, UC Berkeley, Berkeley, CA 94720, USA
Center for Complexity and Collective Computation, University of Wisconsin-Madison, Madison, WI 53706, USA
Author to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Received: 14 September 2015 / Revised: 18 December 2015 / Accepted: 22 December 2015 / Published: 26 January 2016
(This article belongs to the Section Information Theory, Probability and Statistics)


The interactions between three or more random variables are often nontrivial, poorly understood, and yet paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature, which focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
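The synergy the abstract refers to can be made concrete with the standard XOR example (a textbook illustration, not a construction taken from this paper): when Z = X XOR Y with X, Y independent uniform bits, neither X nor Y alone carries any information about Z, yet together they determine it completely. A minimal sketch, where the helper functions `entropy` and `mutual_info` are illustrative names introduced here:

```python
from itertools import product
from math import log2
from collections import Counter

# Joint distribution of (x, y, z) with x, y independent uniform bits and z = x XOR y.
outcomes = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]
p = {o: 1 / len(outcomes) for o in outcomes}

def entropy(var_idx):
    """Shannon entropy (bits) of the marginal over the given coordinate indices."""
    marg = Counter()
    for o, pr in p.items():
        marg[tuple(o[i] for i in var_idx)] += pr
    return -sum(pr * log2(pr) for pr in marg.values() if pr > 0)

def mutual_info(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B); arguments are tuples of coordinate indices."""
    return entropy(a) + entropy(b) - entropy(a + b)

print(mutual_info((0,), (2,)))    # I(X;Z) = 0.0: X alone tells us nothing about Z
print(mutual_info((1,), (2,)))    # I(Y;Z) = 0.0
print(mutual_info((0, 1), (2,)))  # I(X,Y;Z) = 1.0 bit: the information is purely synergistic
```

The one bit that (X, Y) jointly shares with Z appears in the whole but in neither part, which is exactly the kind of sharing mode the proposed entropy decomposition is designed to separate from redundant sharing.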
Keywords: Shannon information; multivariate dependencies; mutual information; synergy; information decomposition; shared information

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Rosas, F.; Ntranos, V.; Ellison, C.J.; Pollin, S.; Verhelst, M. Understanding Interdependency Through Complex Information Sharing. Entropy 2016, 18, 38.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland