Understanding Interdependency Through Complex Information Sharing
Abstract

The interactions between three or more random variables are often nontrivial, poorly understood and, yet, are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
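The distinction between redundant and synergistic sharing can be made concrete with the canonical XOR example (a standard illustration in this literature, not a formula taken from the paper itself): two independent uniform bits X and Y with Z = X XOR Y exhibit purely synergistic sharing, since neither input alone carries any information about Z, yet together they determine it completely. A minimal sketch, assuming this XOR setup and computing mutual informations from empirical joint entropies:

```python
import math
from collections import Counter
from itertools import product

def entropy_bits(counts):
    """Shannon entropy (in bits) of an empirical distribution given by counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Canonical synergy example: X, Y independent uniform bits, Z = X XOR Y,
# with all four (x, y) input combinations equally likely.
samples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def H(indices):
    """Joint entropy (bits) of the coordinates selected by `indices`."""
    marginal = Counter(tuple(s[i] for i in indices) for s in samples)
    return entropy_bits(marginal.values())

# Pairwise mutual informations vanish: no single input shares
# any information with the output Z on its own...
I_xz = H([0]) + H([2]) - H([0, 2])          # I(X;Z) = 0 bits
I_yz = H([1]) + H([2]) - H([1, 2])          # I(Y;Z) = 0 bits

# ...yet jointly the inputs determine Z completely, so the sharing
# structure exists in the whole but not between the parts.
I_xy_z = H([0, 1]) + H([2]) - H([0, 1, 2])  # I(X,Y;Z) = 1 bit
```

Here I(X;Z) = I(Y;Z) = 0 while I(X,Y;Z) = 1 bit, which is exactly the "whole but not the parts" behavior the abstract describes as synergistic interdependency.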
Citation
Rosas, F.; Ntranos, V.; Ellison, C.J.; Pollin, S.; Verhelst, M. Understanding Interdependency Through Complex Information Sharing. Entropy 2016, 18, 38.