Open Access Article
Entropy 2017, 19(10), 531; https://doi.org/10.3390/e19100531

Multivariate Dependence beyond Shannon Information

Complexity Sciences Center, Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616, USA
* Author to whom correspondence should be addressed.
Received: 20 June 2017 / Revised: 18 September 2017 / Accepted: 24 September 2017 / Published: 7 October 2017

Abstract

Accurately determining dependency structure is critical to understanding a complex system's organization. We recently showed that the transfer entropy fails in a key aspect of this (measuring information flow) due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that Shannon information measures (entropy and mutual information, in their conditional and multivariate forms) can fail to accurately ascertain multivariate dependencies due to their conflation of qualitatively different relations among variables. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful relationships among variables within joint probability distributions. We close by demonstrating that such distributions exist across an arbitrary set of variables.
Keywords: stochastic process; transfer entropy; causation entropy; partial information decomposition; network science
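To make the abstract's central claim concrete, here is a minimal Python sketch. It is an illustration modeled on the kind of three-variable "dyadic" versus "triadic" distributions the paper analyzes, not code from the paper, and the particular bit constructions below are assumptions. One distribution is built from purely pairwise shared bits; the other is built from a three-way XOR plus one shared bit. The standard Shannon quantities computed here come out identical for both, even though their dependency structures differ qualitatively.

from itertools import product
from math import log2

def dyadic_dist():
    # Three independent bits a, b, c; each variable shares one bit with each
    # neighbor (X-Y share b, Y-Z share c, Z-X share a): purely pairwise relations.
    dist = {}
    for a, b, c in product((0, 1), repeat=3):
        outcome = (2 * a + b, 2 * b + c, 2 * c + a)
        dist[outcome] = dist.get(outcome, 0.0) + 1 / 8
    return dist

def triadic_dist():
    # One bit s shared in every high position, plus low bits tied together by a
    # three-way XOR: an irreducibly polyadic relation.
    dist = {}
    for s, a, b in product((0, 1), repeat=3):
        outcome = (2 * s + a, 2 * s + b, 2 * s + (a ^ b))
        dist[outcome] = dist.get(outcome, 0.0) + 1 / 8
    return dist

def H(dist, *idx):
    # Shannon entropy (in bits) of the marginal over coordinate indices idx.
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

for name, d in (("dyadic", dyadic_dist()), ("triadic", triadic_dist())):
    mi = H(d, 0) + H(d, 1) - H(d, 0, 1)                       # I(X;Y)
    cmi = H(d, 0, 2) + H(d, 1, 2) - H(d, 0, 1, 2) - H(d, 2)   # I(X;Y|Z)
    print(f"{name}: H(X)={H(d, 0):.1f} bits, I(X;Y)={mi:.1f}, I(X;Y|Z)={cmi:.1f}")

Running this prints H(X) = 2.0 bits, I(X;Y) = 1.0, and I(X;Y|Z) = 1.0 for both distributions; since every multivariate Shannon measure is a signed sum of such marginal entropies, none of them can distinguish the two structures.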
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article (MDPI and ACS Style)

James, R.G.; Crutchfield, J.P. Multivariate Dependence beyond Shannon Information. Entropy 2017, 19, 531.
