Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss
Abstract

Williams and Beer (2010) proposed a nonnegative decomposition of mutual information, based on the construction of information gain lattices, which separates the information that a set of variables carries about a target variable into components interpretable as the unique information of single variables, or as redundant and synergistic contributions. In this work, we extend this framework, focusing on the lattices that underpin the decomposition. We generalize the type of constructible lattices and examine the relations between different lattices, for example relating bivariate and trivariate decompositions. We point out that, in information gain lattices, redundancy components are invariant across decompositions, whereas unique and synergy components are decomposition-dependent. Exploiting the connection between different lattices, we propose a procedure to construct, in the general multivariate case, information gain decompositions from measures of synergy or unique information. We then introduce an alternative type of lattice, the information loss lattice, in which the roles and invariance properties of the redundancy and synergy components are reversed with respect to gain lattices, and which provides an alternative procedure for building multivariate decompositions. Finally, we show how dual information gain and information loss lattices lead to a self-consistent unique decomposition, allowing a deeper understanding of the origin and meaning of synergy and redundancy.
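The bivariate gain decomposition underlying this framework can be made concrete with a small numerical sketch. The code below (an illustration for this abstract, not code from the paper; the helper names `i_min` and `mutual_information` are this sketch's own) implements the Williams-Beer redundancy measure I_min and the resulting four components (redundancy, two unique terms, synergy) for the XOR distribution, where the joint mutual information is purely synergistic.

```python
import math
from collections import defaultdict

def marginal(p, idx):
    """Marginalize a joint distribution {outcome_tuple: prob} onto the given variable indices."""
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[i] for i in idx)] += prob
    return dict(m)

def mutual_information(p, s_idx, a_idx):
    """I(S; A) in bits, where S and A are the variable groups s_idx and a_idx of p."""
    ps, pa = marginal(p, s_idx), marginal(p, a_idx)
    psa = marginal(p, s_idx + a_idx)
    return sum(prob * math.log2(prob / (ps[sa[:len(s_idx)]] * pa[sa[len(s_idx):]]))
               for sa, prob in psa.items() if prob > 0)

def i_min(p, s_idx, sources):
    """Williams-Beer redundancy I_min: expectation over s of the minimum specific information."""
    ps = marginal(p, s_idx)
    total = 0.0
    for s, prob_s in ps.items():
        specific = []
        for a_idx in sources:
            pa = marginal(p, a_idx)
            psa = marginal(p, s_idx + a_idx)
            # Specific information I(S=s; A) = sum_a p(a|s) log2( p(a|s) / p(a) )
            si = sum((psa.get(s + a, 0.0) / prob_s)
                     * math.log2((psa.get(s + a, 0.0) / prob_s) / pa[a])
                     for a in pa if psa.get(s + a, 0.0) > 0)
            specific.append(si)
        total += prob_s * min(specific)
    return total

# XOR target: S = Y1 xor Y2, with Y1, Y2 independent uniform bits.
p = {(y1 ^ y2, y1, y2): 0.25 for y1 in (0, 1) for y2 in (0, 1)}

red = i_min(p, (0,), [(1,), (2,)])                # redundant component
unq1 = mutual_information(p, (0,), (1,)) - red    # unique to Y1
unq2 = mutual_information(p, (0,), (2,)) - red    # unique to Y2
syn = mutual_information(p, (0,), (1, 2)) - red - unq1 - unq2
print(red, unq1, unq2, syn)  # XOR is purely synergistic: 0, 0, 0, 1 bit
```

The four terms sum to the joint mutual information I(S; Y1, Y2) by construction, matching the nonnegative lattice decomposition described in the abstract.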
Cite This Article
Chicharro, D.; Panzeri, S. Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss. Entropy 2017, 19, 71.