# Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition


## Abstract


## 1. Introduction

## 2. Methods

#### 2.1. Definition and Estimation of Unique, Shared and Synergistic Mutual Information

#### 2.2. Mapping of Neural Recordings to Input and Output Variables for PID, and Definition of Information Modification

- The unique mutual information from the output spike train’s own past, ${\tilde{I}}_{unq}({X}_{A}^{+}:{\mathbf{X}}_{\mathbf{A}}^{-}\backslash {\mathbf{X}}_{\mathbf{B}}^{-})$—this can be interpreted as information uniquely stored in the past output of the spike train that reappears at the present sample.
- The unique information from the other spike train, ${\tilde{I}}_{unq}({X}_{A}^{+}:{\mathbf{X}}_{\mathbf{B}}^{-}\backslash {\mathbf{X}}_{\mathbf{A}}^{-})$—this is the information transferred unmodified from the input to the output of the receiving spike train (also known as the state-independent transfer entropy [20]).
- The shared mutual information about the output of spike train A that can be obtained from the past states of both spike train A and spike train B, ${\tilde{I}}_{shd}({X}_{A}^{+}:{\mathbf{X}}_{\mathbf{B}}^{-};{\mathbf{X}}_{\mathbf{A}}^{-})$—this is information that is redundantly stored in the pasts of both spike trains and that reappears at the present sample.
- The synergistic mutual information, ${\tilde{I}}_{syn}({X}_{A}^{+}:{\mathbf{X}}_{\mathbf{B}}^{-};{\mathbf{X}}_{\mathbf{A}}^{-})$, i.e., the information in the output of spike train A, ${X}_{A}^{+}$, that can only be obtained with knowledge of both the past state of the external input, ${\mathbf{X}}_{\mathbf{B}}^{-}$, and the past state of the receiving spike train, ${\mathbf{X}}_{\mathbf{A}}^{-}$ (also known as the state-dependent transfer entropy [20]).
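The four components above can be made concrete with a small numerical sketch. Note that this sketch uses the closed-form Williams–Beer $I_{min}$ redundancy measure [10] rather than the optimization-based measure of Bertschinger et al. [12] that the analysis itself relies on; the function name and input layout are hypothetical, chosen only for illustration:

```python
from collections import defaultdict
from math import log2

def pid_williams_beer(p):
    """Decompose I(Y : X1, X2) into shared, unique, and synergistic
    terms using the Williams-Beer I_min redundancy measure.

    `p` maps (x1, x2, y) tuples to probabilities (summing to 1).
    Returns a dict with the four PID terms in bits."""
    py, px1, px2, px12 = (defaultdict(float) for _ in range(4))
    px1y, px2y = defaultdict(float), defaultdict(float)
    for (x1, x2, y), pr in p.items():
        py[y] += pr
        px1[x1] += pr
        px2[x2] += pr
        px12[(x1, x2)] += pr
        px1y[(x1, y)] += pr
        px2y[(x2, y)] += pr

    def specific_info(y, pxy, px):
        # Specific information I(Y=y : X) = sum_x p(x|y) log2[p(x|y)/p(x)]
        total = 0.0
        for x, pxv in px.items():
            joint = pxy.get((x, y), 0.0)
            if joint > 0:
                p_x_given_y = joint / py[y]
                total += p_x_given_y * log2(p_x_given_y / pxv)
        return total

    # I_min redundancy: expectation over y of the minimum specific information
    shared = sum(py[y] * min(specific_info(y, px1y, px1),
                             specific_info(y, px2y, px2)) for y in py)

    def mi(pxy, px):
        # Mutual information I(X : Y) from a joint distribution over (x, y)
        return sum(pr * log2(pr / (px[x] * py[y]))
                   for (x, y), pr in pxy.items() if pr > 0)

    i1, i2 = mi(px1y, px1), mi(px2y, px2)
    i_joint = sum(pr * log2(pr / (px12[(x1, x2)] * py[y]))
                  for (x1, x2, y), pr in p.items() if pr > 0)

    # Remaining terms follow from the PID consistency equations
    return {"shared": shared,
            "unique_1": i1 - shared,
            "unique_2": i2 - shared,
            "synergy": i_joint - i1 - i2 + shared}
```

For an XOR mechanism the decomposition comes out purely synergistic, and for a redundant copy purely shared, matching the intuitions behind the shared and synergistic components listed above.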

#### 2.3. PID Estimation

#### 2.4. Statistical Testing

#### 2.5. Electrophysiological Data—Acquisition and Preprocessing

## 3. Results

#### 3.1. PID of Information Processing in Neural Cultures

## 4. Discussion

#### 4.1. Which Definition of Synergistic Mutual Information to Use?

#### 4.2. Previous Studies of Information Modification in Neural Data

#### 4.3. Information Represented by Multi and Single Unit Data

#### 4.4. Measured Information Modification versus the Capacity of a Mechanism for Modifying Information

#### 4.5. On the Distinction between Information Modification and Noise

#### 4.6. On the Relation between Transfer Entropy and Information Modification

#### 4.7. New Research Perspectives in Neuroscience Based on PID and Information Modification

## 5. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

- Wibral, M.; Lizier, J.T.; Priesemann, V. Bits from Brains for Biologically Inspired Computing. Front. Robot. AI **2015**, 2, 5.
- Schreiber, T. Measuring information transfer. Phys. Rev. Lett. **2000**, 85, 461–464.
- Vicente, R.; Wibral, M.; Lindner, M.; Pipa, G. Transfer entropy—A model-free measure of effective connectivity for the neurosciences. J. Comput. Neurosci. **2011**, 30, 45–67.
- Wibral, M.; Pampu, N.; Priesemann, V.; Siebenhühner, F.; Seiwert, H.; Lindner, M.; Lizier, J.T.; Vicente, R. Measuring information-transfer delays. PLoS ONE **2013**, 8, e55809.
- Wollstadt, P.; Martínez-Zarzuela, M.; Vicente, R.; Díaz-Pernas, F.J.; Wibral, M. Efficient transfer entropy analysis of non-stationary neural time series. PLoS ONE **2014**, 9, e102833.
- Lizier, J.T.; Prokopenko, M.; Zomaya, A.Y. Local information transfer as a spatiotemporal filter for complex systems. Phys. Rev. E **2008**, 77, 026110.
- Lizier, J.T.; Prokopenko, M.; Zomaya, A.Y. Local measures of information storage in complex distributed computation. Inf. Sci. **2012**, 208, 39–54.
- Wibral, M.; Lizier, J.T.; Vögler, S.; Priesemann, V.; Galuske, R. Local active information storage as a tool to understand distributed neural information processing. Front. Neuroinf. **2014**, 8.
- Gomez, C.; Lizier, J.T.; Schaum, M.; Wollstadt, P.; Grützner, C.; Uhlhaas, P.; Freitag, C.M.; Schlitt, S.; Bölte, S.; Hornero, R.; et al. Reduced Predictable Information in Brain Signals in Autism Spectrum Disorder. Front. Neuroinf. **2014**, 8, 9.
- Williams, P.L.; Beer, R.D. Nonnegative Decomposition of Multivariate Information. arXiv **2010**, arXiv:1004.2515.
- Lizier, J.T.; Flecker, B.; Williams, P.L. Towards a synergy-based approach to measuring information modification. In Proceedings of the 2013 IEEE Symposium on Artificial Life (ALIFE), Singapore, 16–19 April 2013; pp. 43–51.
- Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J.; Ay, N. Quantifying Unique Information. Entropy **2014**, 16, 2161–2183.
- Griffith, V.; Koch, C. Quantifying Synergistic Mutual Information. In Guided Self-Organization: Inception; Prokopenko, M., Ed.; Springer: Berlin, Germany, 2014; Volume 9, pp. 159–190.
- Harder, M.; Salge, C.; Polani, D. Bivariate measure of redundant information. Phys. Rev. E **2013**, 87, 012130.
- Fano, R. Transmission of Information; The MIT Press: Cambridge, MA, USA, 1961.
- Cover, T.M.; Thomas, J.A. Elements of Information Theory; Wiley-Interscience: New York, NY, USA, 1991.
- Wibral, M.; Priesemann, V.; Kay, J.W.; Lizier, J.T.; Phillips, W.A. Partial information decomposition as a unified approach to the specification of neural goal functions. Brain Cognit. **2017**, 112, 25–38.
- Langton, C.G. Computation at the edge of chaos: Phase transitions and emergent computation. Phys. D Nonlinear Phenom. **1990**, 42, 12–37.
- Mitchell, M. Computation in Cellular Automata: A Selected Review. In Non-Standard Computation; Gramß, T., Bornholdt, S., Groß, M., Mitchell, M., Pellizzari, T., Eds.; Wiley-VCH Verlag GmbH & Co. KGaA: Weinheim, Germany, 1998; pp. 95–140.
- Williams, P.L.; Beer, R.D. Generalized Measures of Information Transfer. arXiv **2011**, arXiv:1102.1507.
- Timme, N.M.; Ito, S.; Myroshnychenko, M.; Nigam, S.; Shimono, M.; Yeh, F.C.; Hottowy, P.; Litke, A.M.; Beggs, J.M. High-Degree Neurons Feed Cortical Computations. PLoS Comput. Biol. **2016**, 12, 1–31.
- Wollstadt, P.; Lizier, J.T.; Finn, C.; Martinz-Zarzuela, M.; Vicente, R.; Lindner, M.; Martinez-Mediano, P.; Wibral, M. The Information Dynamics Toolkit, IDTxl. Available online: https://github.com/pwollstadt/IDTxl (accessed on 25 August 2017).
- Makkeh, A.; Theis, D.O.; Vicente Zafra, R. Bivariate Partial Information Decomposition: The Optimization Perspective. **2017**, under review.
- Wagenaar, D.A.; Pine, J.; Potter, S.M. An extremely rich repertoire of bursting patterns during the development of cortical cultures. BMC Neurosci. **2006**, 7, 11.
- Wagenaar, D.A. Network Activity of Developing Cortical Cultures In Vitro. Available online: http://neurodatasharing.bme.gatech.edu/development-data/html/index.html (accessed on 25 August 2017).
- Wagenaar, D.; DeMarse, T.B.; Potter, S.M. MeaBench: A toolset for multi-electrode data acquisition and on-line analysis. In Proceedings of the 2nd International IEEE EMBS Conference on Neural Engineering, Arlington, VA, USA, 16–20 March 2005; pp. 518–521.
- Timme, N.; Shinya, I.; Maxym, M.; Fang-Chin, Y.; Emma, H.; Pawel, H.; Beggs, J.M. Multiplex Networks of Cortical and Hippocampal Neurons Revealed at Different Timescales. PLoS ONE **2014**, 9, 1–43.
- Faes, L.; Marinazzo, D.; Montalto, A.; Nollo, G. Lag-specific transfer entropy as a tool to assess cardiovascular and cardiorespiratory information transfer. IEEE Trans. Biomed. Eng. **2014**, 61, 2556–2568.
- Lizier, J.T.; Rubinov, M. Inferring effective computational connectivity using incrementally conditioned multivariate transfer entropy. BMC Neurosci. **2013**, 14, P337.
- Montalto, A.; Faes, L.; Marinazzo, D. MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy. PLoS ONE **2014**, 9, e109462.
- Lindner, M.; Vicente, R.; Priesemann, V.; Wibral, M. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. BMC Neurosci. **2011**, 12, 1–22.
- Wollstadt, P.; Meyer, U.; Wibral, M. A Graph Algorithmic Approach to Separate Direct from Indirect Neural Interactions. PLoS ONE **2015**, 10, e0140530.
- Levina, A.; Priesemann, V. Subsampling scaling. Nat. Commun. **2017**, 8, 15140.
- Levina, A.; Priesemann, V. Subsampling Scaling: A Theory about Inference from Partly Observed Systems. arXiv **2017**, arXiv:1701.04277.
- Priesemann, V.; Wibral, M.; Valderrama, M.; Pröpper, R.; le van Quyen, M.; Geisel, T.; Triesch, J.; Nikolić, D.; Munk, M.H.J. Spike avalanches in vivo suggest a driven, slightly subcritical brain state. Front. Syst. Neurosci. **2014**, 8, 108.
- Priesemann, V.; Lizier, J.; Wibral, M.; Bullmore, E.; Paulsen, O.; Charlesworth, P.; Schröter, M. Self-organization of information processing in developing neuronal networks. BMC Neurosci. **2015**, 16, P221.
- Timme, N.; Alford, W.; Flecker, B.; Beggs, J.M. Synergy, redundancy, and multivariate information measures: An experimentalist’s perspective. J. Comput. Neurosci. **2014**, 36, 119–140.
- Phillips, W.A. Cognitive functions of intracellular mechanisms for contextual amplification. Brain Cognit. **2017**, 112, 39–53.
- Larkum, M. A cellular mechanism for cortical associations: An organizing principle for the cerebral cortex. Trends Neurosci. **2013**, 36, 141–151.
- Finn, C.; Prokopenko, M.; Lizier, J.T. Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices. **2017**, under review.
- Bertschinger, N.; Rauh, J.; Olbrich, E.; Jost, J. Shared Information—New Insights and Problems in Decomposing Information in Complex Systems. In Proceedings of the European Conference on Complex Systems 2012; Gilbert, T., Kirkilionis, M., Nicolis, G., Eds.; Springer International Publishing: Cham, Switzerland, 2013; pp. 251–269.
- Ince, R.A.A. Measuring Multivariate Redundant Information with Pointwise Common Change in Surprisal. Entropy **2017**, 19, 318.
- Rauh, J.; Bertschinger, N.; Olbrich, E.; Jost, J. Reconsidering unique information: Towards a multivariate information decomposition. In Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA, 29 June–4 July 2014; pp. 2232–2236.
- Pica, G.; Piasini, E.; Chicharro, D.; Panzeri, S. Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables. Entropy **2017**, 19, 451.
- Olbrich, E.; Bertschinger, N.; Rauh, J. Information Decomposition and Synergy. Entropy **2015**, 17, 3501–3517.
- Schneidman, E.; Puchalla, J.L.; Segev, R.; Harris, R.A.; Bialek, W.; Berry, M.J. Synergy from Silence in a Combinatorial Neural Code. J. Neurosci. **2011**, 31, 15732–15741.
- Stramaglia, S.; Wu, G.R.; Pellicoro, M.; Marinazzo, D. Expanding the transfer entropy to identify information circuits in complex systems. Phys. Rev. E **2012**, 86, 066211.
- Stramaglia, S.; Cortes, J.M.; Marinazzo, D. Synergy and redundancy in the Granger causal analysis of dynamical networks. New J. Phys. **2014**, 16, 105003.
- Stramaglia, S.; Angelini, L.; Wu, G.; Cortes, J.M.; Faes, L.; Marinazzo, D. Synergetic and redundant information flow detected by unnormalized Granger causality: Application to resting state fMRI. IEEE Trans. Biomed. Eng. **2016**, 63, 2518–2524.
- Rauh, J.; Banerjee, P.; Olbrich, E.; Jost, J.; Bertschinger, N. On Extractable Shared Information. Entropy **2017**, 19, 328.
- Linsker, R. Self-organisation in a perceptual network. IEEE Comput. **1988**, 21, 105–117.
- Kay, J.W.; Phillips, W.A. Coherent Infomax as a computational goal for neural systems. Bull. Math. Biol. **2011**, 73, 344–372.
- Marr, D. Vision: A Computational Investigation into the Human Representation and Processing of Visual Information; Henry Holt and Co., Inc.: New York, NY, USA, 1982.

**Figure 1.** Decomposition of the joint mutual information between two input variables ${X}_{1}$, ${X}_{2}$ and the output variable Y. Modified from [17], CC-BY license.

**Figure 2.** Mapping between the decomposition into storage and transfer (**A**) and individual or joint mutual information terms, and PID components (**B**). Numbers in (**B**) refer to the enumeration of components given in Section 2.2. Number “4” is the modified information.

**Figure 3.** **Left:** development of the joint mutual information with network maturation. Grey symbols and lines—joint mutual information (MI) from individual pairs of spike time series; red symbols—median over all pairs. Horizontal black lines connect significantly different pairs of median values ($p<0.05$, permutation test, Bonferroni corrected for multiple comparisons). **Right:** magnification of the joint mutual information estimates in the first two recording weeks. Note that the three large outliers from week 2 have been omitted from the plot. These tiny, but non-zero, values form the basis for the normalized non-zero PID terms presented in Figure 4—also leading to considerable variance there.

**Figure 4.** Development of normalized PID contributions (i.e., PID terms normalized by the joint mutual information) with network maturation. Grey symbols and lines—PID values from individual pairs of spike time series; red symbols—median over all pairs. Horizontal black lines connect significantly different pairs of median values ($p<0.05$, permutation test, Bonferroni corrected for multiple comparisons). On the lower right, note the sudden increase in normalized shared mutual information from week 2 to 3 that coincides with the onset of system-spanning neural avalanches (see text).
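The figure captions refer to permutation tests on differences of median values with Bonferroni correction for multiple comparisons. The paper's exact test procedure is not reproduced here; the following is a minimal sketch of that kind of test, with illustrative function names and defaults:

```python
import random

def _median(xs):
    """Sample median of a sequence of numbers."""
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def perm_test_median_diff(a, b, n_perm=10000, rng=None):
    """Two-sample permutation test on the absolute difference of medians.

    Under the null hypothesis that both samples come from the same
    distribution, group labels are exchangeable, so we repeatedly shuffle
    the pooled data and count how often the permuted median difference is
    at least as extreme as the observed one. Returns the (bias-corrected)
    p-value (count + 1) / (n_perm + 1)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    observed = abs(_median(a) - _median(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        if abs(_median(perm_a) - _median(perm_b)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

def bonferroni(p_values, alpha=0.05):
    """Flag which of m p-values stay significant after Bonferroni
    correction, i.e., compare each against alpha / m."""
    m = len(p_values)
    return [p <= alpha / m for p in p_values]
```

Shuffling the pooled sample rather than assuming a parametric null matches the nonparametric character of the PID estimates being compared; the Bonferroni step simply divides the significance level by the number of pairwise comparisons.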

**Figure 5.** Development of raw PID contributions with network maturation. Grey symbols and lines—PID values from individual pairs of spike time series; blue symbols—median over all pairs. Note that we do not provide statistical tests here, as the visible differences are heavily influenced by the corresponding differences in the joint mutual information (see Figure 3).

**Figure 6.** PID diagram for three input variables—two of them external inputs (${X}_{1}$, ${X}_{2}$), and one representing the past state of the receiving system ($M={\mathbf{Y}}^{-}$). The parts of the diagram highlighted in green would be considered information modification. These parts represent the information in the receiver that can only be explained by two or more input variables considered jointly.

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Wibral, M.; Finn, C.; Wollstadt, P.; Lizier, J.T.; Priesemann, V.
Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition. *Entropy* **2017**, *19*, 494.
https://doi.org/10.3390/e19090494
