Open Access Article
Entropy 2015, 17(5), 3501-3517; doi:10.3390/e17053501

Information Decomposition and Synergy

1 Max Planck Institute for Mathematics in the Sciences, Inselstraße 22, 04103 Leipzig, Germany
2 Frankfurt Institute for Advanced Studies, Ruth-Moufang-Straße 1, 60438 Frankfurt am Main, Germany
3 Institute of Algebraic Geometry, Leibniz Universität Hannover, Welfengarten 1, 30167 Hannover, Germany
* Author to whom correspondence should be addressed.
Academic Editor: Rick Quax
Received: 26 March 2015 / Revised: 12 May 2015 / Accepted: 19 May 2015 / Published: 22 May 2015
(This article belongs to the Special Issue Information Processing in Complex Systems)

Abstract

Recently, a series of papers addressed the problem of decomposing the information that two random variables carry about a target into shared information, unique information and synergistic information. Several measures have been proposed, although no consensus has yet been reached. Here, we compare these proposals with an older approach that defines synergistic information via projections onto exponential families containing only interactions up to k-th order. We show that these measures are not compatible with a decomposition into unique, shared and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.
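As a quick illustration of the decomposition discussed above (a minimal sketch of ours, not code from the paper), the following Python snippet computes the relevant mutual information terms for the binary XOR example, S = X1 XOR X2 with independent uniform inputs. Since I(S; X1) = I(S; X2) = 0 while I(S; X1, X2) = 1 bit, any decomposition satisfying local positivity must assign the entire bit to synergy. The variable names and helper functions are ours, introduced only for this sketch.

```python
# Minimal sketch (not from the paper): mutual information terms for the
# binary XOR example, where all of I(S; X1, X2) is synergistic.
from itertools import product
from math import log2
from collections import defaultdict

# Joint distribution of (x1, x2, s) with X1, X2 independent uniform bits
# and S = X1 XOR X2.
p = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(dist, idx):
    """Marginal distribution over the coordinates listed in idx."""
    out = defaultdict(float)
    for outcome, prob in dist.items():
        out[tuple(outcome[i] for i in idx)] += prob
    return out

def mutual_information(dist, idx_a, idx_b):
    """I(A; B) in bits, where A and B are coordinate groups of the joint."""
    pa, pb = marginal(dist, idx_a), marginal(dist, idx_b)
    pab = marginal(dist, idx_a + idx_b)
    return sum(prob * log2(prob / (pa[ab[:len(idx_a)]] * pb[ab[len(idx_a):]]))
               for ab, prob in pab.items() if prob > 0)

# Coordinates: 0 -> X1, 1 -> X2, 2 -> S
print(mutual_information(p, (2,), (0,)))    # I(S; X1)     = 0.0
print(mutual_information(p, (2,), (1,)))    # I(S; X2)     = 0.0
print(mutual_information(p, (2,), (0, 1)))  # I(S; X1, X2) = 1.0
# With I(S; Xi) = SI + UI_i and I(S; X1, X2) = SI + UI_1 + UI_2 + CI,
# local positivity forces SI = UI_1 = UI_2 = 0, so CI = 1 bit of synergy.
```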
Keywords: Shannon information; mutual information; information decomposition; shared information; synergy
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Olbrich, E.; Bertschinger, N.; Rauh, J. Information Decomposition and Synergy. Entropy 2015, 17, 3501-3517.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland