Open Access Article

Multivariate Dependence beyond Shannon Information

Complexity Sciences Center, Physics Department, University of California at Davis, One Shields Avenue, Davis, CA 95616, USA
* Author to whom correspondence should be addressed.
Entropy 2017, 19(10), 531; https://doi.org/10.3390/e19100531
Received: 20 June 2017 / Revised: 18 September 2017 / Accepted: 24 September 2017 / Published: 7 October 2017
Abstract: Accurately determining dependency structure is critical to understanding a complex system’s organization. We recently showed that the transfer entropy fails in a key aspect of this task, namely measuring information flow, due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that Shannon information measures (entropy and mutual information, in their conditional and multivariate forms) can fail to accurately ascertain multivariate dependencies, due to their conflation of qualitatively different relations among variables. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful relationships among variables within joint probability distributions. We close by demonstrating that such distributions exist across an arbitrary set of variables.
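To make the abstract’s central claim concrete, the following is a minimal sketch (not code from the article) of the kind of example it analyzes: two joint distributions over three variables, one built purely from pairwise (dyadic) relations and one built around a three-way XOR (polyadic) relation, whose standard Shannon quantities nonetheless coincide. The function and variable names below are illustrative assumptions, not identifiers from the paper.

```python
# Sketch (assumed construction, not the article's code): two joint distributions
# over (X, Y, Z) with different dependency structures but matching marginal
# entropies, pairwise mutual informations, and joint entropy.
from itertools import product
from math import log2
from collections import defaultdict

def dyadic():
    """Three uniform bits a, b, c; each variable pairs two of them (pairwise relations only)."""
    pmf = defaultdict(float)
    for a, b, c in product((0, 1), repeat=3):
        x, y, z = (a, b), (b, c), (c, a)
        pmf[(x, y, z)] += 1 / 8
    return dict(pmf)

def triadic():
    """Bits a, b with c = a XOR b (a genuinely three-way constraint), plus a shared bit d."""
    pmf = defaultdict(float)
    for a, b, d in product((0, 1), repeat=3):
        c = a ^ b
        x, y, z = (a, d), (b, d), (c, d)
        pmf[(x, y, z)] += 1 / 8
    return dict(pmf)

def entropy(pmf, idx):
    """Shannon entropy (in bits) of the marginal over the variable indices in idx."""
    marg = defaultdict(float)
    for outcome, p in pmf.items():
        marg[tuple(outcome[i] for i in idx)] += p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def mutual_information(pmf, i, j):
    """I(V_i; V_j) = H(V_i) + H(V_j) - H(V_i, V_j)."""
    return entropy(pmf, (i,)) + entropy(pmf, (j,)) - entropy(pmf, (i, j))

for name, dist in (("dyadic", dyadic()), ("triadic", triadic())):
    Hs = [round(entropy(dist, (i,)), 3) for i in range(3)]
    Is = [round(mutual_information(dist, i, j), 3) for i, j in ((0, 1), (0, 2), (1, 2))]
    print(name, "H:", Hs, "pairwise I:", Is, "H(X,Y,Z):", round(entropy(dist, (0, 1, 2)), 3))
# Both distributions print H: [2.0, 2.0, 2.0], pairwise I: [1.0, 1.0, 1.0], H(X,Y,Z): 3.0
```

Because these Shannon quantities cannot tell the two structures apart, distinguishing them calls for tools beyond the standard measures, such as the partial information decomposition listed in the keywords.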
Keywords: stochastic process; transfer entropy; causation entropy; partial information decomposition; network science
MDPI and ACS Style

James, R.G.; Crutchfield, J.P. Multivariate Dependence beyond Shannon Information. Entropy 2017, 19, 531.

