Open Access Article
Entropy 2016, 18(2), 38; doi:10.3390/e18020038

Understanding Interdependency Through Complex Information Sharing

1. Departement Elektrotechniek, KU Leuven, Leuven 3001, Belgium
2. Department of Electrical Engineering & Computer Sciences, UC Berkeley, Berkeley, CA 94720, USA
3. Center for Complexity and Collective Computation, University of Wisconsin-Madison, Madison, WI 53706, USA
* Author to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Received: 14 September 2015 / Revised: 18 December 2015 / Accepted: 22 December 2015 / Published: 26 January 2016
(This article belongs to the Section Information Theory)

Abstract

The interactions between three or more random variables are often nontrivial and poorly understood, yet they are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Toward this end, and in contrast to most of the literature, which focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
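The contrast between redundant and synergistic sharing that the abstract describes can be made concrete with the textbook XOR example, where one bit of information exists in the whole but in none of the parts. The Python sketch below is our own illustration of this standard notion, not a reproduction of the paper's decomposition formulas; the distribution `p` and the helper `entropy` are assumptions made for the example.

```python
# Illustrative sketch (not the paper's formulas): the XOR gate is the
# canonical example of purely synergistic information sharing among
# three binary variables. Here Z = X XOR Y, with X and Y fair coin flips.
from itertools import product
from math import log2

# Joint distribution p(x, y, z) for Z = X XOR Y: four equiprobable outcomes.
p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def entropy(joint, axes):
    """Shannon entropy (in bits) of the marginal over the given axes."""
    marginal = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in axes)
        marginal[key] = marginal.get(key, 0.0) + prob
    return -sum(q * log2(q) for q in marginal.values() if q > 0)

# Pairwise mutual information I(X;Z) = H(X) + H(Z) - H(X,Z): zero bits,
# since neither input alone reveals anything about the XOR output.
i_xz = entropy(p, (0,)) + entropy(p, (2,)) - entropy(p, (0, 2))

# Joint mutual information I(X,Y;Z) = H(X,Y) + H(Z) - H(X,Y,Z): one bit,
# carried only by the pair (X, Y) acting together -- pure synergy.
i_xy_z = entropy(p, (0, 1)) + entropy(p, (2,)) - entropy(p, (0, 1, 2))

print(f"I(X;Z)   = {i_xz:.3f} bits")    # 0.000
print(f"I(X,Y;Z) = {i_xy_z:.3f} bits")  # 1.000
```

Every pairwise mutual information with the output vanishes while the joint mutual information is one bit, so the bit is held synergistically by the whole system rather than redundantly by its parts; this is the kind of sharing structure the paper's entropy decomposition is designed to separate out.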
Keywords: Shannon information; multivariate dependencies; mutual information; synergy; information decomposition; shared information
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Rosas, F.; Ntranos, V.; Ellison, C.J.; Pollin, S.; Verhelst, M. Understanding Interdependency Through Complex Information Sharing. Entropy 2016, 18, 38.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
