# Information Decomposition of Target Effects from Multi-Source Interactions: Perspectives on Previous, Current and Future Work


## Abstract


## 1. Background to Information Decomposition

- the information held by one source about the target, $I(S_1; T)$,
- the information held by the other source about the target, $I(S_2; T)$, and
- the information jointly held by those sources together about the target, $I(\{S_1, S_2\}; T)$.

- how much redundant or shared information $R(S_1, S_2 \to T)$ the two source variables hold about the target,
- how much unique information $U(S_1 \setminus S_2 \to T)$ source variable $S_1$ holds about $T$ that $S_2$ does not,
- how much unique information $U(S_2 \setminus S_1 \to T)$ source variable $S_2$ holds about $T$ that $S_1$ does not, and
- how much complementary or synergistic information $C(S_1, S_2 \to T)$ can only be discerned by examining the two sources together.
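These four quantities are tied to the mutual information terms by the consistency equations $R + U(S_1 \setminus S_2 \to T) + U(S_2 \setminus S_1 \to T) + C = I(\{S_1, S_2\}; T)$ and $R + U(S_i \setminus S_j \to T) = I(S_i; T)$, so fixing any one redundancy measure determines the rest. A minimal sketch in Python illustrates this, using the original Williams–Beer $I_{\min}$ redundancy (the expected minimum specific information over sources) on the canonical XOR example; the function names and distribution encoding here are illustrative, not taken from any PID library:

```python
from collections import defaultdict
from math import log2

# Joint distribution as {(s1, s2, t): prob}; here T = S1 XOR S2 with uniform bits.
p = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def marginal(p, idx):
    """Marginalize the joint distribution onto the variable positions in idx."""
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[i] for i in idx)] += prob
    return m

def mutual_info(p, src_idx, tgt_idx=(2,)):
    """Mutual information I(S_src; T) in bits."""
    ps, pt = marginal(p, src_idx), marginal(p, tgt_idx)
    pst = marginal(p, src_idx + tgt_idx)
    return sum(prob * log2(prob / (ps[k[:len(src_idx)]] * pt[k[len(src_idx):]]))
               for k, prob in pst.items() if prob > 0)

def i_min(p, sources=((0,), (1,)), tgt=(2,)):
    """Williams-Beer redundancy: E_t [ min over sources of specific info I(T=t; S_i) ]."""
    pt = marginal(p, tgt)
    total = 0.0
    for t, p_t in pt.items():
        specs = []
        for src in sources:
            ps, pst = marginal(p, src), marginal(p, src + tgt)
            spec = sum((pst[s + t] / p_t) * log2(pst[s + t] / (ps[s] * p_t))
                       for s in ps if pst.get(s + t, 0) > 0)
            specs.append(spec)
        total += p_t * min(specs)
    return total

i1, i2, i12 = mutual_info(p, (0,)), mutual_info(p, (1,)), mutual_info(p, (0, 1))
R = i_min(p)                    # redundancy
U1, U2 = i1 - R, i2 - R         # unique informations via the consistency equations
C = i12 - R - U1 - U2           # synergy as the remainder
print(R, U1, U2, C)             # → 0.0 0.0 0.0 1.0
```

For XOR, neither source alone carries any information about the target ($I(S_1;T) = I(S_2;T) = 0$), so redundancy and the unique terms all vanish and the full joint bit is attributed to synergy; a "copy" target $T = S_1 = S_2$ would instead yield one bit of pure redundancy.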

## 2. Contents of the Special Issue

### 2.1. New Measures of Redundancy

### 2.2. Theoretical Investigations

### 2.3. Applications of Information Decomposition

## 3. Outlook

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 1.** Partial information diagram for two sources to a target, showing the relationship of the partial information quantities to the fundamental mutual information terms.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Lizier, J.T.; Bertschinger, N.; Jost, J.; Wibral, M. Information Decomposition of Target Effects from Multi-Source Interactions: Perspectives on Previous, Current and Future Work. *Entropy* **2018**, *20*, 307.
https://doi.org/10.3390/e20040307
