# Information Decomposition in Multivariate Systems: Definitions, Implementation and Application to Cardiovascular Networks


## Abstract


## 1. Introduction

## 2. Information Decomposition in Multivariate Processes

#### 2.1. Information Measures for Random Variables

#### 2.1.1. Variance-Based and Entropy-Based Measures of Information

Let us consider a scalar random variable X with probability density function ${f}_{X}(x)$, $x\in {D}_{X}$, where ${D}_{X}$ is the domain of X. As we are interested in the variability of their outcomes, all random variables considered in this study are supposed to have zero mean: $\mathrm{E}[X]=0$. The information content of X can be intuitively related to the uncertainty of X, or equivalently, the unpredictability of its outcomes $x\in {D}_{X}$: if X takes on many different values inside ${D}_{X}$, its outcomes are uncertain and the information content is assumed to be high; if, on the contrary, only a small number of values are taken by X with high probability, the outcomes are more predictable and the information content is low. This concept can be formulated with reference to the degree of variability of the variable, thus quantifying information in terms of variance:

$${H}_{V}(X)=\mathrm{E}[{X}^{2}],\qquad(1)$$

or with reference to the probability of the outcomes of the variable, thus quantifying information in terms of entropy:

$${H}_{E}(X)=\mathrm{E}[-\mathrm{log}\,{f}_{X}(X)]=-{\int }_{{D}_{X}}{f}_{X}(x)\,\mathrm{log}\,{f}_{X}(x)\,dx.\qquad(2)$$

In the following, we will refer either to the variance-based definition ${H}_{V}(X)$ of Equation (1) or to the entropy-based definition ${H}_{E}(X)$ of Equation (2) when necessary.
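As a minimal numerical illustration (ours, not part of the original analysis), the two definitions can be computed from data; for a Gaussian variable the differential entropy has the closed form $\frac{1}{2}\mathrm{log}(2\pi e\,\sigma^2)$, so the two measures are monotonically related:

```python
import numpy as np

def h_v(x):
    # Variance-based information, Equation (1): H_V(X) = E[X^2] for zero-mean X
    x = np.asarray(x, dtype=float)
    return float(np.mean(x ** 2))

def h_e_gauss(variance):
    # Entropy-based information of a Gaussian variable (closed form of Equation (2))
    return 0.5 * np.log(2 * np.pi * np.e * variance)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=200_000)  # zero-mean Gaussian, sigma = 2
print(h_v(x))                           # close to 4 (= sigma^2)
print(h_e_gauss(h_v(x)))
```

Both measures increase with the spread of the variable, which is the intuition the text formalizes.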

Next, let us quantify how the information content of X is modified in the presence of a second, k-dimensional random variable $\mathit{Z}={[{Z}_{1}\cdots {Z}_{k}]}^{\mathrm{T}}$ with probability density ${f}_{\mathit{Z}}(\mathit{z})$. To this end, we introduce the concept of conditional information, i.e., the information remaining in X when **Z** is assigned, denoted as H(X|**Z**). This concept is linked to the resolution of uncertainty about X, or equivalently, the decrement of unpredictability of its outcomes $x\in {D}_{X}$, brought by the knowledge of the outcomes $\mathit{z}\in {D}_{\mathit{Z}}$ of the variable **Z**: if the values of X are perfectly predicted by the knowledge of **Z**, no uncertainty is left about X when **Z** is known and thus H(X|**Z**) = 0; if, on the contrary, knowing **Z** does not alter the uncertainty about the outcomes of X, the residual uncertainty will be maximum, H(X|**Z**) = H(X). To formulate this concept we may reason again in terms of variance, considering the prediction of X on **Z** and the corresponding prediction error variable $U=X-\mathrm{E}[X|\mathit{Z}]$, and defining the conditional variance of X given **Z** as:

$${H}_{V}(X|\mathit{Z})=\mathrm{E}[{U}^{2}],\qquad(3)$$

or in terms of probability, considering the conditional probability density of X given **Z**, ${f}_{X|\mathit{Z}}(x|\mathit{z})={f}_{X,\mathit{Z}}(x,\mathit{z})/{f}_{\mathit{Z}}(\mathit{z})$, and defining the conditional entropy of X given **Z** as:

$${H}_{E}(X|\mathit{Z})=\mathrm{E}[-\mathrm{log}\,{f}_{X|\mathit{Z}}(X|\mathit{Z})].\qquad(4)$$

#### 2.1.2. Variance-Based and Entropy-Based Measures of Information for Gaussian Variables

It is well known that, for Gaussian variables, the entropy-based and variance-based measures of information are strictly related to each other. Specifically, if X has a Gaussian distribution, its entropy and its variance are related by the expression [35]:

$${H}_{E}(X)=\frac{1}{2}\mathrm{log}(2\pi e\,{H}_{V}(X)),\qquad(5)$$

and, similarly, the conditional entropy and the conditional variance of X given **Z** are related by:

$${H}_{E}(X|\mathit{Z})=\frac{1}{2}\mathrm{log}(2\pi e\,{H}_{V}(X|\mathit{Z})).\qquad(6)$$

Moreover, if X and **Z** have a joint Gaussian distribution their interactions are fully described by a linear relation of the form $X=\mathit{A}\mathit{Z}+U$, where **A** is a k-dimensional row vector of coefficients such that $\mathrm{E}[X|\mathit{Z}]=\mathit{A}\mathit{Z}$ [36]. This leads to computing the variance of the prediction error as $\mathrm{E}[{U}^{2}]=\mathrm{E}[{X}^{2}]-\mathit{A}\Sigma (\mathit{Z}){\mathit{A}}^{\mathrm{T}}$, where $\Sigma (\mathit{Z})=\mathrm{E}[\mathit{Z}{\mathit{Z}}^{\mathrm{T}}]$ is the covariance matrix of **Z**; additionally, the uncorrelation between the regressor **Z** and the error U, $\Sigma (\mathit{Z};U)=0$, allows the coefficients to be expressed as $\mathit{A}=\Sigma (X;\mathit{Z})\Sigma {(\mathit{Z})}^{-1}$, which yields:

$${H}_{V}(X|\mathit{Z})={H}_{V}(X)-\Sigma (X;\mathit{Z})\Sigma {(\mathit{Z})}^{-1}\Sigma {(X;\mathit{Z})}^{\mathrm{T}}.\qquad(7)$$

Thus, for jointly Gaussian variables the conditional variance of X given **Z** can be computed starting from the variance of X, ${H}_{V}(X)$, the covariance of **Z**, $\Sigma (\mathit{Z})$, and their cross covariance, $\Sigma (X;\mathit{Z})$.
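A short sketch of Equation (7) as an estimator (our illustration, with expectations replaced by sample averages):

```python
import numpy as np

def h_v_cond(x, Z):
    """Equation (7): H_V(X|Z) = H_V(X) - Sigma(X;Z) Sigma(Z)^{-1} Sigma(X;Z)^T.
    x: zero-mean samples, shape (n,); Z: zero-mean regressors, shape (n, k)."""
    x, Z = np.asarray(x, float), np.asarray(Z, float)
    n = len(x)
    Sxz = x @ Z / n                    # cross covariance Sigma(X;Z), shape (k,)
    Szz = Z.T @ Z / n                  # covariance Sigma(Z), shape (k, k)
    return float(x @ x / n - Sxz @ np.linalg.solve(Szz, Sxz))

# X built from Z plus an error U with variance 0.25: H_V(X|Z) approaches E[U^2]
rng = np.random.default_rng(1)
Z = rng.normal(size=(100_000, 2))
x = 1.0 * Z[:, 0] - 0.5 * Z[:, 1] + rng.normal(scale=0.5, size=100_000)
print(h_v_cond(x, Z))                  # close to 0.25
```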

#### 2.1.3. Measures Derived from Information and Conditional Information

The basic measures defined above can be combined to quantify the information shared between variables. First, we define the mutual information between X and **Z** as:

$$I(X;\mathit{Z})=H(X)-H(X|\mathit{Z}),\qquad(8)$$

which quantifies the reduction in uncertainty about the outcomes of X obtained when the outcomes of **Z** are known. Moreover, the conditional mutual information between X and **Z** given a third variable **U**, I(X;**Z**|**U**), quantifies the information shared between X and **Z** which is not shared with **U**, intended as the reduction in uncertainty about the outcomes of X provided by the knowledge of the outcomes of **Z** that is not explained by the outcomes of **U**:

$$I(X;\mathit{Z}|\mathit{U})=H(X|\mathit{U})-H(X|\mathit{Z},\mathit{U}).\qquad(9)$$

Finally, the interaction information between X, **Z** and **U** quantifies the information shared between X and the two variables **Z** and **U** when they are taken individually but not when they are taken together:

$$I(X;\mathit{Z};\mathit{U})=I(X;\mathit{Z})+I(X;\mathit{U})-I(X;\mathit{Z},\mathit{U}).\qquad(10)$$

The interaction information quantifies the information contained in the whole set of variables {X, **Z**, **U**} beyond that which is present in the individual subsets {X, **Z**} and {X, **U**}. Contrary to all other information measures, which are never negative, the interaction information defined in Equation (10) can take on both positive and negative values, with positive values indicating redundancy (i.e., I(X;**Z**,**U**) < I(X;**Z**) + I(X;**U**)) and negative values indicating synergy (i.e., I(X;**Z**,**U**) > I(X;**Z**) + I(X;**U**)) between the two sources **Z** and **U** that share information with the target X. Note that all the measures defined in this Section can be computed as sums of information and conditional information terms. As such, the generic notations I(∙;∙), I(∙;∙|∙), and I(∙;∙;∙) used to indicate mutual information, conditional mutual information and interaction information will be particularized to ${I}_{V}$(∙;∙), ${I}_{V}$(∙;∙|∙), ${I}_{V}$(∙;∙;∙), or to ${I}_{E}$(∙;∙), ${I}_{E}$(∙;∙|∙), ${I}_{E}$(∙;∙;∙), to clarify when their computation is based on variance measures or entropy measures, respectively. Note that, contrary to the entropy-based measure ${I}_{E}$(∙;∙), the variance-based measure ${I}_{V}$(∙;∙) is not symmetric and thus fails to satisfy a basic property of “mutual information” measures. However, this disadvantage is not crucial for the formulations proposed in this study which, being based on exploiting the flow of time that sets asymmetric relations between the analyzed variables, do not exploit the symmetry property of mutual information (see Section 2.2).

The measures defined above are connected by useful decompositions. Given the entropy decomposition H(X) = I(X;**Z**,**U**) + H(X|**Z**,**U**), the chain rule for mutual information decomposes the information shared between the target X and the two sources **Z** and **U** as I(X;**Z**,**U**) = I(X;**Z**) + I(X;**U**|**Z**) = I(X;**U**) + I(X;**Z**|**U**), and the interaction information between X, **Z** and **U** results as I(X;**Z**;**U**) = I(X;**Z**) − I(X;**Z**|**U**) = I(X;**U**) − I(X;**U**|**Z**).
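These identities are easy to verify numerically. The sketch below (our illustration, using the variance-based measures on jointly Gaussian data) checks the chain rule and obtains a positive interaction information for two redundant sources driven by a common latent variable:

```python
import numpy as np

def h_v_cond(x, Z):
    # sample version of Equation (7); x zero-mean (n,), Z zero-mean (n, k)
    n = len(x)
    Sxz, Szz = x @ Z / n, Z.T @ Z / n
    return float(x @ x / n - Sxz @ np.linalg.solve(Szz, Sxz))

def i_v(x, Z):
    return float(np.mean(x ** 2)) - h_v_cond(x, Z)           # Equation (8)

def i_v_cond(x, Z, U):
    return h_v_cond(x, U) - h_v_cond(x, np.hstack([Z, U]))   # Equation (9)

rng = np.random.default_rng(2)
n = 50_000
latent = rng.normal(size=(n, 1))
Z = latent + 0.3 * rng.normal(size=(n, 1))    # two redundant sources:
U = latent + 0.3 * rng.normal(size=(n, 1))    # noisy copies of the same driver
x = latent[:, 0] + rng.normal(size=n)

ZU = np.hstack([Z, U])
chain_lhs = i_v(x, ZU)                        # I(X; Z,U)
chain_rhs = i_v(x, Z) + i_v_cond(x, U, Z)     # I(X;Z) + I(X;U|Z)
interaction = i_v(x, Z) + i_v(x, U) - i_v(x, ZU)  # Equation (10), > 0: redundancy
```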

#### 2.2. Information Measures for Networks of Dynamic Processes

Let us now consider a network of M interacting dynamic processes composing the multivariate stationary process **S**. We consider the problem of dissecting the information carried by an assigned “target” process Y into contributions resulting either from its own dynamics or from the dynamics of the other processes **X** = **S**\Y, that are considered as “sources”. We further suppose that two separate (groups of) sources, identified by the two disjoint sets **V** = {V_1,...,V_P} and **W** = {W_1,...,W_Q} (Q + P = M − 1), have effects on the dynamics of the target, such that the whole observed process is **S** = {**X**,Y} = {**V**,**W**,Y}. Moreover, setting a temporal reference frame in which n represents the present time, we denote as ${Y}_{n}$ the random variable describing the present of Y, and as ${Y}_{n}^{-}=[{Y}_{n-1},{Y}_{n-2},\dots ]$ the infinite-dimensional variable describing the past of Y. The same notation applies for each source component V_i ∈ **V** and W_j ∈ **W**, and extends to ${\mathit{X}}_{n}^{-}=[{\mathit{X}}_{n-1},{\mathit{X}}_{n-2},\dots ]$ and ${\mathit{S}}_{n}^{-}=[{\mathit{X}}_{n}^{-},{Y}_{n}^{-}]$ to denote the past of the source process **X** and of the full network process **S**. This simple operation of separating the present from the past allows us to consider the flow of time and to study the causal interactions within and between processes by looking at the statistical dependencies among these variables [1]. An exemplary diagram of the process interactions is depicted in Figure 2a.

In the following, the measures of information dynamics quantifying these dependencies can be formulated equivalently in terms of variance (i.e., using ${H}_{V}$ and ${I}_{V}$) or in terms of entropy (i.e., using ${H}_{E}$ and ${I}_{E}$).

#### 2.2.1. New Information and Predictive Information

The information contained in the target process Y is quantified by the information of the variable ${Y}_{n}$, i.e., the target information ${H}_{Y}=H({Y}_{n})$. Then, exploiting the chain rule for information [34], we decompose the target information as:

$${H}_{Y}={P}_{Y}+{N}_{Y},\qquad(11)$$

where ${P}_{Y}=I({Y}_{n};{\mathit{S}}_{n}^{-})$ is the predictive information, quantifying the part of the target information that can be predicted from the past of the whole network, and ${N}_{Y}=H({Y}_{n}|{\mathit{S}}_{n}^{-})$ is the new information, quantifying the part of the target information that cannot be predicted from the past.

#### 2.2.2. Predictive Information Decomposition (PID)

The predictive information of the target process can be decomposed as:

$${P}_{Y}={S}_{Y}+{T}_{\mathit{X}\to Y},\qquad(12)$$

where ${S}_{Y}=I({Y}_{n};{Y}_{n}^{-})$ is the information storage, quantifying the amount of information contained in the past of the target that can be used to predict its present, and ${T}_{\mathit{X}\to Y}=I({Y}_{n};{\mathit{X}}_{n}^{-}|{Y}_{n}^{-})$ is the joint information transfer from the sources **X** to the target Y, quantified as the amount of information contained in the past of the sources ${\mathit{X}}_{n}^{-}$ that can be used to predict the present of the target ${Y}_{n}$ above and beyond the information contained in the past of the target ${Y}_{n}^{-}$.

#### 2.2.3. Information Storage Decomposition (ISD)

The information storage can itself be decomposed into contributions reflecting the internal dynamics of the target and its interaction with the sources:

$${S}_{Y}={S}_{Y|\mathit{X}}+{I}_{Y;\mathit{X}}^{Y},\qquad(13)$$

where ${S}_{Y|\mathit{X}}=I({Y}_{n};{Y}_{n}^{-}|{\mathit{X}}_{n}^{-})$ is the internal information of the target process, and ${I}_{Y;\mathit{X}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{X}}_{n}^{-})$ is the interaction information storage of Y in the context of the network process {**X**,Y}, quantified as the interaction information of the present of the target ${Y}_{n}$, its past ${Y}_{n}^{-}$, and the past of the sources ${\mathit{X}}_{n}^{-}$. Then, considering that the source process is composed of the two sources **X** = {**V**,**W**}, the interaction information storage can be expanded as:

$${I}_{Y;\mathit{X}}^{Y}={I}_{Y;\mathit{V}}^{Y}+{I}_{Y;\mathit{W}|\mathit{V}}^{Y}={I}_{Y;\mathit{W}}^{Y}+{I}_{Y;\mathit{V}|\mathit{W}}^{Y}={I}_{Y;\mathit{V}}^{Y}+{I}_{Y;\mathit{W}}^{Y}-{I}_{Y;\mathit{V};\mathit{W}}^{Y},\qquad(14)$$

where ${I}_{Y;\mathit{V}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{V}}_{n}^{-})$ and ${I}_{Y;\mathit{W}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{W}}_{n}^{-})$ quantify the interaction information storage of Y in the context of the bivariate processes {**V**,Y} and {**W**,Y}, ${I}_{Y;\mathit{V}|\mathit{W}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{V}}_{n}^{-}|{\mathit{W}}_{n}^{-})$ and ${I}_{Y;\mathit{W}|\mathit{V}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{W}}_{n}^{-}|{\mathit{V}}_{n}^{-})$ quantify the conditional interaction information storage of Y in the context of the whole network process {**V**,**W**,Y}, and ${I}_{Y;\mathit{V};\mathit{W}}^{Y}=I({Y}_{n};{Y}_{n}^{-};{\mathit{V}}_{n}^{-};{\mathit{W}}_{n}^{-})$ is the multivariate interaction information of the target Y in the context of the network, itemized evidencing the two sources **V** and **W**. This last term quantifies the interaction information between the present of the target ${Y}_{n}$, its past ${Y}_{n}^{-}$, the past of one source ${\mathit{V}}_{n}^{-}$, and the past of the other source ${\mathit{W}}_{n}^{-}$.

#### 2.2.4. Information Transfer Decomposition (ITD)

The joint information transfer from the two sources **V** and **W** to the target Y can be further expanded to evidence how the past of the sources interact with each other in determining the information transferred to the target. To do this, we decompose the joint information transfer from **X** = (**V**,**W**) to Y as:

$${T}_{\mathit{X}\to Y}={T}_{\mathit{V}\to Y}+{T}_{\mathit{W}\to Y|\mathit{V}}={T}_{\mathit{W}\to Y}+{T}_{\mathit{V}\to Y|\mathit{W}}={T}_{\mathit{V}\to Y}+{T}_{\mathit{W}\to Y}-{I}_{\mathit{V};\mathit{W}|Y}^{Y},\qquad(15)$$

where ${T}_{\mathit{V}\to Y}=I({Y}_{n};{\mathit{V}}_{n}^{-}|{Y}_{n}^{-})$ and ${T}_{\mathit{W}\to Y}=I({Y}_{n};{\mathit{W}}_{n}^{-}|{Y}_{n}^{-})$ quantify the information transfer from each individual source to the target in the context of the bivariate processes {**V**,Y} and {**W**,Y}, ${T}_{\mathit{V}\to Y|\mathit{W}}=I({Y}_{n};{\mathit{V}}_{n}^{-}|{Y}_{n}^{-},{\mathit{W}}_{n}^{-})$ and ${T}_{\mathit{W}\to Y|\mathit{V}}=I({Y}_{n};{\mathit{W}}_{n}^{-}|{Y}_{n}^{-},{\mathit{V}}_{n}^{-})$ quantify the conditional information transfer from one source to the target conditioned to the other source in the context of the whole network process {**V**,**W**,Y}, and ${I}_{\mathit{V};\mathit{W}|Y}^{Y}=I({Y}_{n};{\mathit{V}}_{n}^{-};{\mathit{W}}_{n}^{-}|{Y}_{n}^{-})=I({Y}_{n};{\mathit{V}}_{n}^{-}|{Y}_{n}^{-})-I({Y}_{n};{\mathit{V}}_{n}^{-}|{Y}_{n}^{-},{\mathit{W}}_{n}^{-})$ is the interaction information transfer between **V** and **W** to Y in the context of the network process {**V**,**W**,Y}, quantified as the interaction information of the present of the target ${Y}_{n}$ and the past of the two sources ${\mathit{V}}_{n}^{-}$ and ${\mathit{W}}_{n}^{-}$, conditioned to the past of the target ${Y}_{n}^{-}$.
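As a sanity check (our sketch: the infinite past is truncated to a few lags and the variance-based measures are used), the terms of the transfer decomposition can be estimated on a toy system where only V drives Y and W is an independent source, so that the conditional transfer from W and the interaction transfer should both be close to zero:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 60_000, 3                        # samples; truncation lag for the "past"

# simulate: V -> Y coupling at lag 1; W is an independent AR(1) process
v = rng.normal(size=n)
w, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    w[t] = 0.5 * w[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * v[t - 1] + rng.normal()

def past(x):
    # matrix of p lagged values approximating the past of process x
    return np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])

def h_v_cond(x, Z):
    m = len(x)
    Sxz, Szz = x @ Z / m, Z.T @ Z / m
    return float(x @ x / m - Sxz @ np.linalg.solve(Szz, Sxz))

yn = y[p:n]
Yp, Vp, Wp = past(y), past(v), past(w)
full = np.hstack([Yp, Vp, Wp])

T_joint     = h_v_cond(yn, Yp) - h_v_cond(yn, full)
T_v_given_w = h_v_cond(yn, np.hstack([Yp, Wp])) - h_v_cond(yn, full)
T_w_given_v = h_v_cond(yn, np.hstack([Yp, Vp])) - h_v_cond(yn, full)
I_int = T_joint - T_v_given_w - T_w_given_v   # interaction transfer, Equation (15)
```

With uncorrelated sources the variance-based interaction transfer stays near zero, consistent with the property exploited later when comparing variance- and entropy-based measures.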

#### 2.2.5. Summary of Information Decomposition

#### 2.3. Computation for Multivariate Gaussian Processes

The information measures defined above can be computed in closed form when the observed process **S** = {**X**,Y} = {**V**,**W**,Y} is composed of Gaussian processes [12]. Specifically, we assume that the overall vector process **S** has a joint Gaussian distribution, which means that any vector variable extracted by sampling the constituent processes at present and past times takes values from a multivariate Gaussian distribution. In such a case, the information of the present state of the target process, $H({Y}_{n})$, and the conditional information of the present of the target given any vector **Z** formed by past variables of the network processes, $H({Y}_{n}|\mathit{Z})$, can be computed using Equations (5) and (6), where the conditional variance is given by Equation (7). Then, any of the measures of information storage, transfer and modification appearing in Equations (12)–(16) can be obtained from the information $H({Y}_{n})$ and the conditional information $H({Y}_{n}|\mathit{Z})$, where **Z** can be any combination of ${Y}_{n}^{-},{\mathit{V}}_{n}^{-}$ and ${\mathit{W}}_{n}^{-}$.

In practice, the computation requires the covariances among the variables sampling the present and the past of the processes **V**, **W**, and Y, which in turn appear as elements of the M × M autocovariance of the whole observed M-dimensional process **S**, defined at each lag k ≥ 0 as ${\mathbf{\Gamma}}_{k}=\mathrm{E}[{\mathit{S}}_{n}{\mathit{S}}_{n-k}^{\mathrm{T}}]$. Now we show how this autocovariance matrix can be computed from the parameters of the vector autoregressive (VAR) formulation of the process **S**:

$${\mathit{S}}_{n}=\sum_{l=1}^{m}{\mathbf{A}}_{l}{\mathit{S}}_{n-l}+{\mathit{U}}_{n},\qquad(17)$$

where m is the model order, ${\mathbf{A}}_{l}$ are M × M coefficient matrices, and ${\mathit{U}}_{n}$ is a zero-mean Gaussian white noise process with diagonal covariance matrix **Λ**. The autocovariance of the process (17) is related to the VAR parameters via the well-known Yule–Walker equations:

$${\mathbf{\Gamma}}_{k}=\sum_{l=1}^{m}{\mathbf{A}}_{l}{\mathbf{\Gamma}}_{k-l}+{\delta}_{k0}\mathbf{\Lambda},\qquad(18)$$

where ${\delta}_{k0}$ is the Kronecker delta. In order to solve Equation (18) for ${\mathbf{\Gamma}}_{k}$, with k = 0, 1, ..., m − 1, we first express Equation (17) in a compact form as ${\mathit{\phi}}_{n}=\mathbf{A}{\mathit{\phi}}_{n-1}+{\mathit{E}}_{n}$, where:

$${\mathit{\phi}}_{n}=\left[\begin{array}{c}{\mathit{S}}_{n}\\ \vdots \\ {\mathit{S}}_{n-m+1}\end{array}\right],\quad \mathbf{A}=\left[\begin{array}{cccc}{\mathbf{A}}_{1}&\cdots &{\mathbf{A}}_{m-1}&{\mathbf{A}}_{m}\\ {\mathbf{I}}_{M}&\cdots &\mathbf{0}&\mathbf{0}\\ \vdots &\ddots &\vdots &\vdots \\ \mathbf{0}&\cdots &{\mathbf{I}}_{M}&\mathbf{0}\end{array}\right],\quad {\mathit{E}}_{n}=\left[\begin{array}{c}{\mathit{U}}_{n}\\ \mathbf{0}\\ \vdots \\ \mathbf{0}\end{array}\right].\qquad(19)$$

Taking the covariance of both sides of the compact form, the matrix $\mathbf{\Psi}=\mathrm{E}[{\mathit{\phi}}_{n}{\mathit{\phi}}_{n}^{\mathrm{T}}]$ satisfies:

$$\mathbf{\Psi}=\mathbf{A}\mathbf{\Psi}{\mathbf{A}}^{\mathrm{T}}+\mathbf{\Xi},\qquad(20)$$

where $\mathbf{\Xi}=\mathrm{E}[{\mathit{E}}_{n}{\mathit{E}}_{n}^{\mathrm{T}}]$ is the covariance of ${\mathit{E}}_{n}$. This last equation is a discrete-time Lyapunov equation, which can be solved for $\mathbf{\Psi}$ yielding the autocovariance matrices ${\mathbf{\Gamma}}_{0}$, ..., ${\mathbf{\Gamma}}_{m-1}$. Finally, the autocovariance can be calculated recursively for any lag k ≥ m by repeatedly applying Equation (18). This shows how the autocovariance sequence can be computed up to arbitrarily high lags starting from the parameters of the VAR representation of the observed Gaussian process.
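The procedure of Equations (18)–(20) can be sketched for the simplest case m = 1, where the companion form coincides with the process itself (our illustration; the Lyapunov equation is vectorized with the Kronecker product):

```python
import numpy as np

def var1_autocov(A, Lam, kmax):
    """Autocovariance Gamma_0..Gamma_kmax of a stable VAR(1) S_n = A S_{n-1} + U_n.
    Gamma_0 solves the discrete Lyapunov equation Gamma_0 = A Gamma_0 A^T + Lam
    (Equation (20) with m = 1), vectorized via the Kronecker product; higher lags
    follow from the Yule-Walker recursion Gamma_k = A Gamma_{k-1} (Equation (18))."""
    A, Lam = np.asarray(A, float), np.asarray(Lam, float)
    M = A.shape[0]
    g0 = np.linalg.solve(np.eye(M * M) - np.kron(A, A), Lam.reshape(-1))
    G = [g0.reshape(M, M)]
    for _ in range(kmax):
        G.append(A @ G[-1])
    return G

# scalar check: for S_n = a S_{n-1} + U_n, Gamma_0 = lambda / (1 - a^2)
G = var1_autocov([[0.5]], [[1.0]], kmax=2)
print(G[0][0, 0])   # close to 4/3
```

For m > 1 the same solve is applied to the companion matrix of Equation (19), and the blocks of $\mathbf{\Psi}$ give ${\mathbf{\Gamma}}_{0},\dots,{\mathbf{\Gamma}}_{m-1}$.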

## 3. Simulation Study

#### 3.1. Simulated VAR Processes

We study a simulated trivariate VAR process **S** = {**X**,Y} = {V,W,Y} with temporal dynamical structure defined by Equation (21), where the innovations are zero-mean white Gaussian noises with identity covariance matrix (**Λ** = **I**). The parameter design in Equation (21) is chosen to allow autonomous oscillations in the three processes, obtained placing complex-conjugate poles with amplitude ${\rho}_{v},{\rho}_{w},{\rho}_{y}$ and frequency ${f}_{v},{f}_{w},{f}_{y}$ in the complex plane representation of the transfer function of the vector process, as well as causal interactions between the processes at fixed time lags of 1 or 2 samples and with strength modulated by the parameters a, b, c, d [37]. Here we consider two parameter configurations, describing respectively basic dynamics and more realistic dynamics resembling rhythms and interactions typical of cardiovascular and cardiorespiratory signals.

#### 3.2. Information Decomposition

The internal information ${S}_{Y|\mathit{X}}$ is not affected by c, documenting the insensitivity to causal interactions of this measure, which is designed to reflect exclusively variations in the internal dynamics of the target process [12]. We note that the interaction information storage between the target Y and the source W conditioned to the other source V, ${I}_{Y;W|V}^{Y}$, is also constant in all simulated conditions, reflecting the fact that the direct interaction between Y and W is not affected by c. Therefore, the ISD makes it evident that in our simulations variations in the information storage are related to how the target Y interacts with a specific source (in this case, V); such an interaction is documented by the trends of the interaction information measures ${I}_{Y;V|W}^{Y}$ and ${I}_{Y;V;W}^{Y}$. In the type-I simulation, the increasing coupling between V and Y determines a monotonic increase of the interaction storage ${I}_{Y;V|W}^{Y}$ and a monotonic decrease of the multivariate interaction ${I}_{Y;V;W}^{Y}$ (Figure 4e); in particular, ${I}_{Y;V|W}^{Y}$ is zero and ${I}_{Y;V;W}^{Y}$ is maximum when c = 0, and the opposite occurs when c = 1, reflecting respectively the conditions of absence of direct coupling V→Y and of exclusive direct coupling V→Y. In the type-II simulation, the concordant variations set for the couplings V→Y and V→W lead to a similar but smoother response of the interaction storage (${I}_{Y;V|W}^{Y}$ slightly increases with c) and to an opposite response of the multivariate interaction information (${I}_{Y;V;W}^{Y}$ increases with c) (Figure 4g). These trends of the interaction measures ${I}_{Y;V|W}^{Y}$ and ${I}_{Y;V;W}^{Y}$ are apparent when information is measured in terms of variance, but become difficult to interpret when information is measured as entropy: in such a case, the variations with c of ${I}_{Y;V|W}^{Y}$ and ${I}_{Y;V;W}^{Y}$ are non-monotonic (Figure 4f) or even opposite to those observed before (Figure 4h).

#### 3.3. Interpretation of Interaction Information

## 4. Application to Physiological Networks

#### 4.1. Experimental Protocol and Data Analysis

Realizations of the network of physiological processes **S** = {R,S,H} were obtained by normalizing the measured multivariate time series, i.e., subtracting the mean from each series and dividing the result by the standard deviation. The resulting time series {R_n, S_n, H_n} was fitted with a VAR model in the form of Equation (17), where model identification was performed using the standard vector least squares method and the model order was optimized according to the Bayesian Information Criterion [50]. The estimated model coefficients were exploited to derive the covariance matrix of the vector process; the covariances between the present and the past of the processes were computed as in Equations (18)–(20) and used as in Equation (7) to estimate all the partial variances needed to compute the measures of information dynamics. In all computations, the vectors representing the past of the normalized respiratory and vascular processes were augmented with the present variables in order to take into account fast vagal reflexes capable of modifying HP in response to within-beat changes of RA and SP (effects R_n→H_n, S_n→H_n) and fast effects capable of modifying SP in response to within-beat changes of RA (effect R_n→S_n).
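The identification step can be sketched as follows (our illustration: ordinary least squares for the VAR of Equation (17), with the BIC of [50] computed from the residual covariance; function and variable names are ours):

```python
import numpy as np

def var_fit(S, m):
    """Least-squares fit of a VAR(m) to data S, shape (n, M). Returns the stacked
    coefficient matrix and the residual covariance (estimate of Lambda, Eq. (17))."""
    n, M = S.shape
    Y = S[m:]
    X = np.hstack([S[m - k:n - k] for k in range(1, m + 1)])  # lagged regressors
    A, *_ = np.linalg.lstsq(X, Y, rcond=None)
    E = Y - X @ A
    return A, E.T @ E / len(Y)

def bic(S, m):
    # Bayesian Information Criterion for VAR order selection
    n, M = S.shape
    _, Sigma = var_fit(S, m)
    N = n - m
    return N * np.log(np.linalg.det(Sigma)) + np.log(N) * m * M * M

# scalar AR(2) example: BIC should favor the true order 2 over orders 1 and 4
rng = np.random.default_rng(5)
n = 5_000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 1.2 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()
S = x[:, None]
```

In practice the order minimizing the BIC over a candidate range is retained before computing the information measures.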

#### 4.2. Results and Discussion

#### 4.2.1. Information Decomposition of Heart Period Variability during Head-Up Tilt

#### 4.2.2. Information Decomposition of Heart Period Variability during Mental Arithmetics

#### 4.2.3. Information Decomposition of Systolic Arterial Pressure Variability during Head-Up Tilt

#### 4.2.4. Information Decomposition of Systolic Arterial Pressure Variability during Mental Arithmetics

#### 4.2.5. Different Profiles of Variance-Based and Entropy-Based Information Measures

## 5. Summary of Main Findings

- Information decomposition methods are recommended for the analysis of multivariate processes to dissect the general concepts of predictive information, information storage and information transfer into basic elements of computation that are sensitive to changes in specific network properties;
- The combined evaluation of several information measures is recommended to characterize unambiguously changes of the network across conditions;
- Entropy-based measures are appropriate for the analysis of information transfer thanks to the intrinsic normalization to the complexity of the target dynamics, but are exposed to the detection of net synergy in the analysis of information modification;
- Variance-based measures are recommended for the analysis of information modification since they yield zero synergy/redundancy for uncorrelated sources, but can return estimates of information transfer biased by modifications of the complexity of the target dynamics.

- The physiological stress induced by head-up tilt brings about a decrease of the complexity of the short-term variability of heart period, reflected by higher information storage and internal information, lower cardiorespiratory and higher cardiovascular information transfer, physiologically associated with sympathetic activation and vagal withdrawal;
- Head-up tilt does not alter the information stored in and transferred to systolic arterial pressure variability, but information decompositions reveal an enhancement during tilt of respiratory effects on systolic pressure independent of heart period dynamics;
- The mental stress induced by the arithmetic task does not alter the complexity of heart period variability, but leads to a decrease of the cardiorespiratory information transfer physiologically associated with vagal withdrawal;
- Mental arithmetics increases the complexity of systolic arterial pressure variability, likely associated with the action of physiological mechanisms unrelated to respiration and heart period variability.

## 6. Conclusions

## Supplementary Materials

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

- Faes, L.; Porta, A. Conditional entropy-based evaluation of information dynamics in physiological systems. In Directed Information Measures in Neuroscience; Vicente, R., Wibral, M., Lizier, J.T., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 61–86.
- Lizier, J.T. The Local Information Dynamics of Distributed Computation in Complex Systems; Springer: Berlin/Heidelberg, Germany, 2013.
- Wibral, M.; Lizier, J.T.; Priesemann, V. Bits from biology for biologically-inspired computing. Front. Robot. AI **2015**, 2.
- Chicharro, D.; Ledberg, A. Framework to study dynamic dependencies in networks of interacting processes. Phys. Rev. E **2012**, 86, 041901.
- Faes, L.; Kugiumtzis, D.; Nollo, G.; Jurysta, F.; Marinazzo, D. Estimating the decomposition of predictive information in multivariate systems. Phys. Rev. E **2015**, 91, 032904.
- Lizier, J.T.; Prokopenko, M.; Zomaya, A.Y. Local measures of information storage in complex distributed computation. Inf. Sci. **2012**, 208, 39–54.
- Wibral, M.; Lizier, J.T.; Vogler, S.; Priesemann, V.; Galuske, R. Local Active Information Storage as a Tool to Understand Distributed Neural Information Processing; Frontiers Media SA: Lausanne, Switzerland, 2015.
- Schreiber, T. Measuring information transfer. Phys. Rev. Lett. **2000**, 85, 461.
- Wibral, M.; Vicente, R.; Lindner, M. Transfer entropy in neuroscience. In Directed Information Measures in Neuroscience; Vicente, R., Wibral, M., Lizier, J.T., Eds.; Springer: Berlin/Heidelberg, Germany, 2014.
- Lizier, J.T.; Prokopenko, M.; Zomaya, A.Y. Information modification and particle collisions in distributed computation. Chaos **2010**, 20, 037109.
- Faes, L.; Marinazzo, D.; Stramaglia, S.; Jurysta, F.; Porta, A.; Nollo, G. Predictability decomposition detects the impairment of brain-heart dynamical networks during sleep disorders and their recovery with treatment. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. **2016**, 374.
- Faes, L.; Porta, A.; Nollo, G. Information Decomposition in Bivariate Systems: Theory and Application to Cardiorespiratory Dynamics. Entropy **2015**, 17, 277–303.
- Porta, A.; Faes, L.; Nollo, G.; Bari, V.; Marchi, A.; De Maria, B.; Takahashi, A.C.M.; Catai, A.M. Conditional Self-Entropy and Conditional Joint Transfer Entropy in Heart Period Variability during Graded Postural Challenge. PLoS ONE **2015**, 10, e0132851.
- Porta, A.; Faes, L. Wiener-Granger Causality in Network Physiology with Applications to Cardiovascular Control and Neuroscience. Proc. IEEE **2016**, 104, 282–309.
- Stramaglia, S.; Wu, G.R.; Pellicoro, M.; Marinazzo, D. Expanding the transfer entropy to identify information circuits in complex systems. Phys. Rev. E **2012**, 86, 066211.
- Barrett, A.B. Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems. Phys. Rev. E **2015**, 91, 052802.
- Williams, P.L. Nonnegative decomposition of multivariate information. arXiv **2010**.
- Barnett, L.; Lizier, J.T.; Harre, M.; Seth, A.K.; Bossomaier, T. Information flow in a kinetic Ising model peaks in the disordered phase. Phys. Rev. Lett. **2013**, 111, 177203.
- Dimpfl, T.; Peter, F.J. Using transfer entropy to measure information flows between financial markets. Stud. Nonlinear Dyn. Econom. **2013**, 17, 85–102.
- Faes, L.; Nollo, G.; Jurysta, F.; Marinazzo, D. Information dynamics of brain-heart physiological networks during sleep. New J. Phys. **2014**, 16, 105005.
- Faes, L.; Porta, A.; Rossato, G.; Adami, A.; Tonon, D.; Corica, A.; Nollo, G. Investigating the mechanisms of cardiovascular and cerebrovascular regulation in orthostatic syncope through an information decomposition strategy. Auton. Neurosci. **2013**, 178, 76–82.
- Gomez, C.; Lizier, J.T.; Schaum, M.; Wollstadt, P.; Grutzner, C.; Uhlhaas, P.; Freitag, C.M.; Schlitt, S.; Bolte, S.; Hornero, R.; et al. Reduced Predictable Information in Brain Signals in Autism Spectrum Disorder; Frontiers Media: Lausanne, Switzerland, 2015.
- Lizier, J.T.; Pritam, S.; Prokopenko, M. Information Dynamics in Small-World Boolean Networks. Artif. Life **2011**, 17, 293–314.
- Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M. Application of information theory methods to food web reconstruction. Ecol. Model. **2007**, 208, 145–158.
- Pahle, J.; Green, A.K.; Dixon, C.J.; Kummer, U. Information transfer in signaling pathways: A study using coupled simulated and experimental data. BMC Bioinform. **2008**, 9, 139.
- Runge, J.; Heitzig, J.; Marwan, N.; Kurths, J. Quantifying causal coupling strength: A lag-specific measure for multivariate time series related to transfer entropy. Phys. Rev. E **2012**, 86, 061121.
- Stramaglia, S.; Cortes, J.M.; Marinazzo, D. Synergy and redundancy in the Granger causal analysis of dynamical networks. New J. Phys. **2014**, 16, 105003.
- Wibral, M.; Rahm, B.; Rieder, M.; Lindner, M.; Vicente, R.; Kaiser, J. Transfer entropy in magnetoencephalographic data: Quantifying information flow in cortical and cerebellar networks. Prog. Biophys. Mol. Biol. **2011**, 105, 80–97.
- Porta, A.; Faes, L.; Marchi, A.; Bari, V.; De Maria, B.; Guzzetti, S.; Colombo, R.; Raimondi, F. Disentangling cardiovascular control mechanisms during head-down tilt via joint transfer entropy and self-entropy decompositions. Front. Physiol. **2015**, 6, 00301.
- Porta, A.; Bari, V.; Marchi, A.; De Maria, B.; Takahashi, A.C.M.; Guzzetti, S.; Colombo, R.; Catai, A.M.; Raimondi, F. Effect of variations of the complexity of the target variable on the assessment of Wiener-Granger causality in cardiovascular control studies. Physiol. Meas. **2016**, 37, 276–290.
- Faes, L.; Marinazzo, D.; Jurysta, F.; Nollo, G. Linear and non-linear brain-heart and brain-brain interactions during sleep. Physiol. Meas. **2015**, 36, 683–698.
- Porta, A.; De Maria, B.; Bari, V.; Marchi, A.; Faes, L. Are nonlinear model-free approaches for the assessment of the entropy-based complexity of the cardiac control superior to a linear model-based one? IEEE Trans. Biomed. Eng. **2016**.
- Javorka, M.; Czippelova, B.; Turianikova, Z.; Lazarova, Z.; Tonhajzerova, I.; Faes, L. Causal analysis of short-term cardiovascular variability: state-dependent contribution of feedback and feedforward mechanisms. Med. Biol. Eng. Comput. **2016**.
- Cover, T.M.; Thomas, J.A. Elements of Information Theory; Wiley: New York, NY, USA, 2006.
- Barnett, L.; Barrett, A.B.; Seth, A.K. Granger causality and transfer entropy are equivalent for Gaussian variables. Phys. Rev. Lett. **2009**, 103, 238701.
- Barrett, A.B.; Barnett, L.; Seth, A.K. Multivariate Granger causality and generalized variance. Phys. Rev. E **2010**, 81, 041907.
- Faes, L.; Marinazzo, D.; Montalto, A.; Nollo, G. Lag-Specific Transfer Entropy as a Tool to Assess Cardiovascular and Cardiorespiratory Information Transfer. IEEE Trans. Biomed. Eng. **2014**, 61, 2556–2568.
- Malliani, A.; Pagani, M.; Lombardi, F.; Cerutti, S. Cardiovascular neural regulation explored in the frequency domain. Circulation **1991**, 84, 482–492.
- Heart rate variability. Standards of measurement, physiological interpretation, and clinical use. Eur. Heart J. **1996**, 17, 354–381.
- Cooke, W.H.; Hoag, J.B.; Crossman, A.A.; Kuusela, T.A.; Tahvanainen, K.U.O.; Eckberg, D.L. Human response to upright tilt: A window on central autonomic integration. J. Physiol. **1999**, 517, 617–628.
- Kuipers, N.T.; Sauder, C.L.; Carter, J.R.; Ray, C.A. Neurovascular responses to mental stress in the supine and upright postures. J. Appl. Physiol. **2008**, 104, 1129–1136.
- Baselli, G.; Cerutti, S.; Badilini, F.; Biancardi, L.; Porta, A.; Pagani, M.; Lombardi, F.; Rimoldi, O.; Furlan, R.; Malliani, A. Model for the assessment of heart period and arterial pressure variability interactions and of respiration influences. Med. Biol. Eng. Comput. **1994**, 32, 143–152.
- Cohen, M.A.; Taylor, J.A. Short-term cardiovascular oscillations in man: measuring and modelling the physiologies. J. Physiol. **2002**, 542, 669–683.
- Faes, L.; Erla, S.; Nollo, G. Measuring connectivity in linear multivariate processes: Definitions, interpretation, and practical analysis. Comput. Math. Methods Med. **2012**, 2012, 140513.
- Patton, D.J.; Triedman, J.K.; Perrott, M.H.; Vidian, A.A.; Saul, J.P. Baroreflex gain: characterization using autoregressive moving average analysis. Am. J. Physiol. **1996**, 270, H1240–H1249.
- Triedman, J.K.; Perrott, M.H.; Cohen, R.J.; Saul, J.P. Respiratory Sinus Arrhythmia—Time-Domain Characterization Using Autoregressive Moving Average Analysis. Am. J. Physiol. Heart Circ. Physiol. **1995**, 268, H2232–H2238.
- Xiao, X.; Mullen, T.J.; Mukkamala, R. System identification: a multi-signal approach for probing neural cardiovascular regulation. Physiol. Meas. **2005**, 26, R41–R71.
- Nollo, G.; Faes, L.; Porta, A.; Pellegrini, B.; Antolini, R. Synchronization index for quantifying nonlinear causal coupling between RR interval and systolic arterial pressure after myocardial infarction. Comput. Cardiol. **2000**, 27, 143–146.
- Tukey, J.W. Exploratory Data Analysis; Pearson: London, UK, 1977.
- Schwarz, G. Estimating the dimension of a model. Ann. Stat. **1978**, 6, 461–464.
- Montano, N.; Gnecchi Ruscone, T.; Porta, A.; Lombardi, F.; Pagani, M.; Malliani, A. Power spectrum analysis of heart rate variability to assess the change in sympathovagal balance during graded orthostatic tilt. Circulation **1994**, 90, 1826–1831.
- Porta, A.; Tobaldini, E.; Guzzetti, S.; Furlan, R.; Montano, N.; Gnecchi-Ruscone, T. Assessment of cardiac autonomic modulation during graded head-up tilt by symbolic analysis of heart rate variability. Am. J. Physiol. Heart Circ. Physiol. **2007**, 293, H702–H708.
- Dick, T.E.; Baekey, D.M.; Paton, J.F.R.; Lindsey, B.G.; Morris, K.F. Cardio-respiratory coupling depends on the pons. Respir. Physiol. Neurobiol. **2009**, 168, 76–85.
- Miyakawa, K.; Koepchen, H.P.; Polosa, C. Mechanism of Blood Pressure Waves; Japan Science Society Press: Tokyo, Japan, 1984.
- Faes, L.; Nollo, G.; Porta, A. Information domain approach to the investigation of cardio-vascular, cardio-pulmonary, and vasculo-pulmonary causal couplings. Front. Physiol. **2011**, 2, 1–13.
- Faes, L.; Nollo, G.; Porta, A. Non-uniform multivariate embedding to assess the information transfer in cardiovascular and cardiorespiratory variability series. Comput. Biol. Med. **2012**, 42, 290–297.
- Visnovcova, Z.; Mestanik, M.; Javorka, M.; Mokra, D.; Gala, M.; Jurko, A.; Calkovska, A.; Tonhajzerova, I. Complexity and time asymmetry of heart rate variability are altered in acute mental stress. Physiol. Meas. **2014**, 35, 1319–1334.
- Widjaja, D.; Montalto, A.; Vlemincx, E.; Marinazzo, D.; Van Huffel, S.; Faes, L. Cardiorespiratory Information Dynamics during Mental Arithmetic and Sustained Attention. PLoS ONE **2015**, 10, e0129112.
- Bernardi, L.; Wdowczyk-Szulc, J.; Valenti, C.; Castoldi, S.; Passino, C.; Spadacini, G.; Sleight, P. Effects of controlled breathing, mental activity and mental stress with or without verbalization on heart rate variability. J. Am. Coll. Cardiol. **2000**, 35, 1462–1469.
- Houtveen, J.H.; Rietveld, S.; de Geus, E.J. Contribution of tonic vagal modulation of heart rate, central respiratory drive, respiratory depth, and respiratory frequency to respiratory sinus arrhythmia during mental stress and physical exercise. Psychophysiology **2002**, 39, 427–436.
- Sloan, R.P.; Shapiro, P.A.; Bagiella, E.; Boni, S.M.; Paik, M.; Bigger, J.T., Jr.; Steinman, R.C.; Gorman, J.M. Effect of mental stress throughout the day on cardiac autonomic control. Biol. Psychol. **1994**, 37, 89–99.
- Widjaja, D.; Orini, M.; Vlemincx, E.; Van Huffel, S. Cardiorespiratory dynamic response to mental stress: a multivariate time-frequency analysis. Comput. Math. Methods Med. **2013**, 2013, 451857.
- Porta, A.; Baselli, G.; Guzzetti, S.; Pagani, M.; Malliani, A.; Cerutti, S. Prediction of short cardiovascular variability signals based on conditional distribution. IEEE Trans. Biomed. Eng. **2000**, 47, 1555–1564.
- Porta, A.; Catai, A.M.; Takahashi, A.C.; Magagnin, V.; Bassani, T.; Tobaldini, E.; van de, B.P.; Montano, N. Causal relationships between heart period and systolic arterial pressure during graded head-up tilt. Am. J. Physiol. Regul. Integr. Comp. Physiol. **2011**, 300, R378–R386.
- Elstad, M.; Toska, K.; Chon, K.H.; Raeder, E.A.; Cohen, R.J. Respiratory sinus arrhythmia: opposite effects on systolic and mean arterial pressure in supine humans. J. Physiol.
**2001**, 536, 251–259. [Google Scholar] [CrossRef] [PubMed] - Lackner, H.K.; Papousek, I.; Batzel, J.J.; Roessler, A.; Scharfetter, H.; Hinghofer-Szalkay, H. Phase synchronization of hemodynamic variables and respiration during mental challenge. Int. J. Psychophysiol.
**2011**, 79, 401–409. [Google Scholar] [CrossRef] [PubMed]

**Figure 1.** Information diagram (**a**) and mutual information diagrams (**b**,**c**) depicting the relations between the basic information-theoretic measures defined for three random variables X, **Z**, **U**: the information H(∙), the conditional information H(∙|∙), the mutual information I(∙;∙), the conditional mutual information I(∙;∙|∙), and the interaction information I(∙;∙;∙). Note that the interaction information I(X;**Z**;**U**) = I(X;**Z**) − I(X;**Z**|**U**) can take both positive and negative values. In this study, all interaction information terms are depicted with gray shaded areas, and all diagrams are intended for positive values of these terms. Accordingly, the case of positive interaction information is depicted in (**b**), and that of negative interaction information in (**c**).
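For jointly Gaussian variables, every measure in the diagram follows in closed form from the covariance matrix, since a k-dimensional Gaussian has entropy ½ ln((2πe)^k |Σ|). The following minimal sketch (an illustrative implementation, not the authors' code; the covariance values are hypothetical, describing X = Z + U + noise with independent unit-variance components) computes the interaction information I(X;Z;U) = I(X;Z) − I(X;Z|U) and recovers a negative value, i.e., synergy:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance cov."""
    cov = np.atleast_2d(cov)
    k = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(cov))

def mutual_info(cov, ix, iz):
    """I(X;Z) = H(X) + H(Z) - H(X,Z), from the joint covariance matrix."""
    sub = lambda idx: cov[np.ix_(idx, idx)]
    return (gaussian_entropy(sub(ix)) + gaussian_entropy(sub(iz))
            - gaussian_entropy(sub(ix + iz)))

def interaction_info(cov, ix, iz, iu):
    """I(X;Z;U) = I(X;Z) - I(X;Z|U); positive = redundancy, negative = synergy."""
    sub = lambda idx: cov[np.ix_(idx, idx)]
    # I(X;Z|U) = H(X,U) + H(Z,U) - H(X,Z,U) - H(U)
    cmi = (gaussian_entropy(sub(ix + iu)) + gaussian_entropy(sub(iz + iu))
           - gaussian_entropy(sub(ix + iz + iu)) - gaussian_entropy(sub(iu)))
    return mutual_info(cov, ix, iz) - cmi

# Covariance of (X, Z, U) for X = Z + U + e, with Z, U, e independent, unit variance
cov = np.array([[3.0, 1.0, 1.0],
                [1.0, 1.0, 0.0],
                [1.0, 0.0, 1.0]])
print(interaction_info(cov, [0], [1], [2]))  # negative -> synergy
```

Here I(X;Z) = ½ ln(3/2) while I(X;Z|U) = ½ ln 2, so the interaction information equals ½ ln(3/4) < 0: each source alone is weakly informative about X, but together they explain it almost completely.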

**Figure 2.** Graphical representation of the information-theoretic quantities resulting from the decomposition of the information carried by the target Y of a network of interacting stationary processes **S** = {**X**,Y} = {**V**,**W**,Y}. (**a**) Exemplary realizations of a six-dimensional process **S** composed of the target process Y and the source processes **V** = {V_{1},V_{2}} and **W** = {W_{1},W_{2},W_{3}}, with representation of the variables used for information-domain analysis: the present of the target, ${Y}_{n}$, the past of the target, ${Y}_{n}^{-}$, and the past of the sources, ${\mathit{V}}_{n}^{-}$ and ${\mathit{W}}_{n}^{-}$. (**b**) Venn diagram showing that the information of the target process H_{Y} is the sum of the new information (N_{Y}, yellow-shaded area) and the predictive information (P_{Y}, all other shaded areas with labels); the latter is expanded according to the predictive information decomposition (PID) as the sum of the information storage (${S}_{Y}={S}_{Y|\mathit{X}}+{I}_{Y;\mathit{V}|\mathit{W}}^{Y}+{I}_{Y;\mathit{W}|\mathit{V}}^{Y}+{I}_{Y;\mathit{V};\mathit{W}}^{Y}$) and the information transfer (${T}_{\mathit{X}\to Y}={T}_{\mathit{V}\to Y|\mathit{W}}+{T}_{\mathit{W}\to Y|\mathit{V}}+{I}_{\mathit{V};\mathit{W}|Y}^{Y}$). The information storage decomposition dissects S_{Y} as the sum of the internal information (${S}_{Y|\mathit{X}}$), conditional interaction terms (${I}_{Y;\mathit{V}|\mathit{W}}^{Y}$ and ${I}_{Y;\mathit{W}|\mathit{V}}^{Y}$) and the multivariate interaction (${I}_{Y;\mathit{V};\mathit{W}}^{Y}$). The information transfer decomposition dissects ${T}_{\mathit{X}\to Y}$ as the sum of conditional information transfer terms (${T}_{\mathit{V}\to Y|\mathit{W}}$ and ${T}_{\mathit{W}\to Y|\mathit{V}}$) and the interaction information transfer (${I}_{\mathit{V};\mathit{W}|Y}^{Y}$).
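In the variance formulation, the PID identity ${H}_{Y}={N}_{Y}+{S}_{Y}+{T}_{\mathit{X}\to Y}$ can be checked numerically, since each term is a difference of residual variances of linear regressions on (truncated) pasts. The sketch below is a toy illustration under assumed conditions: the VAR coefficients and the model order p are chosen arbitrarily for the example and are not those of the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy trivariate VAR(1): V and W drive Y, which is also self-dependent
# (illustrative coefficients only)
n = 20000
V = np.zeros(n); W = np.zeros(n); Y = np.zeros(n)
for t in range(1, n):
    V[t] = 0.5 * V[t-1] + rng.standard_normal()
    W[t] = -0.4 * W[t-1] + rng.standard_normal()
    Y[t] = 0.6 * Y[t-1] + 0.4 * V[t-1] + 0.3 * W[t-1] + rng.standard_normal()

def resid_var(y, X):
    """Variance of the residuals of the linear regression of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

p = 2  # truncated past (model order)
yn = Y[p:]
past = lambda s: np.column_stack([s[p-k:-k] for k in range(1, p + 1)])
Ypast, Vpast, Wpast = past(Y), past(V), past(W)

H = np.var(yn)                                        # information H_V(Y)
N = resid_var(yn, np.hstack([Ypast, Vpast, Wpast]))   # new information N_Y
S = H - resid_var(yn, Ypast)                          # storage S_Y = I(Y_n; Y_n^-)
T = resid_var(yn, Ypast) - N                          # joint transfer T_{X->Y}
print(H, N + S + T)  # the two values coincide: H_Y = N_Y + S_Y + T_{X->Y}
```

The identity holds here by construction, since storage and transfer telescope between the unconditional variance and the fully conditioned residual variance; what estimation affects is how the predictive information splits between S and T.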

**Figure 3.** Graphical representation of the trivariate VAR process of Equation (21), with parameters set according to the first configuration reproducing basic dynamics and interactions (**a**) and to the second configuration reproducing realistic cardiovascular and cardiorespiratory dynamics and interactions (**b**). The theoretical power spectral densities of the three processes V, W and Y corresponding to the parameter setting with c = 1 are depicted in panel (**c**) (see text for details).

**Figure 4.** Information decomposition for the stationary Gaussian VAR process composed of the target Y and the sources **X** = {V,W}, generated according to Equation (21). The Venn diagrams of the predictive information decomposition (PID), information storage decomposition (ISD) and information transfer decomposition (ITD) are depicted on the left; the interaction structure of the VAR process, set according to the two types of simulation, is depicted on the top. The information measures relevant to the (**a**–**d**) PID (${H}_{Y}={N}_{Y}+{S}_{Y}+{T}_{\mathit{X}\to Y}$), (**e**–**h**) ISD (${S}_{Y}={S}_{Y|\mathit{X}}+{I}_{Y;\mathit{V}|\mathit{W}}^{Y}+{I}_{Y;\mathit{W}|\mathit{V}}^{Y}+{I}_{Y;\mathit{V};\mathit{W}}^{Y}$) and (**i**–**l**) ITD (${T}_{\mathit{X}\to Y}={T}_{\mathit{V}\to Y|\mathit{W}}+{T}_{\mathit{W}\to Y|\mathit{V}}+{I}_{\mathit{V};\mathit{W}|Y}^{Y}$), expressed in their variance and entropy formulations, are computed as a function of the parameter c for the two simulations.

**Figure 5.** Examples of computation of the interaction information transfer ${I}_{V;W|Y}^{Y}$ for exemplary cases of jointly Gaussian processes V, W (sources) and Y (target): (**a**–**c**) uncorrelated sources; (**d**–**f**) positively correlated sources. Panels show the logarithmic dependence between variance and entropy measures of conditional information (**a**,**d**) and Venn diagrams of the information measures based on variance computation (**b**,**e**) and entropy computation (**c**,**f**). In (**a**–**c**), the variance-based interaction transfer is zero, suggesting no source interaction, while the entropy-based transfer is negative, denoting synergy. In (**d**–**f**), the variance-based interaction transfer is positive, suggesting redundancy, while the entropy-based transfer is negative, denoting synergy.
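The discrepancy in the uncorrelated-source case can be reproduced in a few lines. For the static Gaussian analogue Y = V + W + e with uncorrelated unit-variance sources and noise (hypothetical values, chosen only to mimic the configuration of panels (**a**–**c**)), the interaction transfer computed as ${T}_{\mathit{X}\to Y}-{T}_{\mathit{V}\to Y|\mathit{W}}-{T}_{\mathit{W}\to Y|\mathit{V}}$ vanishes in the variance formulation but is negative in the entropy formulation:

```python
import numpy as np

# Partial (residual) variances for the static example Y = V + W + e,
# with uncorrelated unit-variance sources V, W and unit-variance noise e
var_y    = 3.0   # var(Y)
var_y_v  = 2.0   # var(Y | V)
var_y_w  = 2.0   # var(Y | W)
var_y_vw = 1.0   # var(Y | V, W)

# Interaction transfer = T_{X->Y} - T_{V->Y|W} - T_{W->Y|V}, in both formulations
I_var = (var_y - var_y_vw) - (var_y_w - var_y_vw) - (var_y_v - var_y_vw)
I_ent = 0.5 * (np.log(var_y) - np.log(var_y_w)
               - np.log(var_y_v) + np.log(var_y_vw))

print(I_var)  # 0.0 -> variance-based: no source interaction
print(I_ent)  # 0.5*ln(3/4) < 0 -> entropy-based: synergy
```

The sign difference is a direct consequence of measuring interaction as a difference of variances versus a difference of log-variances: the two formulations agree only when all the involved variance ratios coincide.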

**Figure 6.** Information decomposition of the heart period (process H) measured as the target of the physiological network that also includes respiration (process R) and systolic pressure (process S) as source processes. Plots depict the values of the (**a**,**d**) predictive information decomposition (PID), (**b**,**e**) information storage decomposition (ISD) and (**c**,**f**) information transfer decomposition (ITD), computed using entropy measures (**a**–**c**) and prediction measures (**d**–**f**) and expressed as mean + standard deviation over 61 subjects in the resting baseline condition (B), during head-up tilt (T), during recovery in the supine position (R), and during mental arithmetic (M). Statistically significant differences between pairs of distributions are marked with * (T vs. B, M vs. B), with # (T vs. R, M vs. R), and with § (T vs. M).

**Figure 7.** Information decomposition of systolic pressure (process S) measured as the target of the physiological network that also includes respiration (process R) and heart period (process H) as source processes. Plots depict the values of the (**a**,**d**) predictive information decomposition (PID), (**b**,**e**) information storage decomposition (ISD) and (**c**,**f**) information transfer decomposition (ITD), computed using entropy measures (**a**–**c**) and prediction measures (**d**–**f**) and expressed as mean + standard deviation over 61 subjects in the resting baseline condition (B), during head-up tilt (T), during recovery in the supine position (R), and during mental arithmetic (M). Statistically significant differences between pairs of distributions are marked with * (T vs. B, M vs. B), with # (T vs. R, M vs. R), and with § (T vs. M).

**Figure 8.** Graphical representation of the variance-based (red) and entropy-based (blue) measures of information content (${H}_{H}$), storage (${S}_{H}$), transfer (${T}_{S\to H|R}$, ${T}_{R\to H|S}$) and new information (${N}_{H}$) relevant to the information decomposition of heart period variability during baseline (dark colors) and during tilt (light colors), according to the results of Figure 6. The logarithmic relation between the two formulations explains why variance-based and entropy-based measures can show opposite variations when moving from baseline to tilt.
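The mechanism behind such opposite variations can be sketched with hypothetical numbers (the residual variances below are illustrative, not the measured values of Figure 6): entropy-based measures compare variances through log-ratios, while variance-based measures compare them through differences, and the two orderings need not agree across conditions.

```python
import numpy as np

def h_entropy(var):
    """Entropy-based information of a scalar Gaussian: H_E = 0.5*ln(2*pi*e*var)."""
    return 0.5 * np.log(2 * np.pi * np.e * var)

# Hypothetical variances of the target before/after conditioning on its past,
# in two conditions; values chosen only to illustrate the effect
base_full, base_cond = 1.0, 0.50   # baseline: var(Y), var(Y | past)
tilt_full, tilt_cond = 2.0, 1.20   # tilt: both variances scaled up, unevenly

# Variance-based storage is a difference; entropy-based storage is a log-ratio
S_var_base = base_full - base_cond                        # 0.50
S_var_tilt = tilt_full - tilt_cond                        # 0.80 -> increases
S_ent_base = h_entropy(base_full) - h_entropy(base_cond)  # 0.5*ln(2)
S_ent_tilt = h_entropy(tilt_full) - h_entropy(tilt_cond)  # 0.5*ln(5/3) -> decreases
print(S_var_tilt > S_var_base, S_ent_tilt < S_ent_base)  # True True
```

Here the absolute variance reduction grows from baseline to tilt, so the variance-based storage increases, while the relative reduction shrinks, so the entropy-based storage decreases.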

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Faes, L.; Porta, A.; Nollo, G.; Javorka, M.
Information Decomposition in Multivariate Systems: Definitions, Implementation and Application to Cardiovascular Networks. *Entropy* **2017**, *19*, 5.
https://doi.org/10.3390/e19010005
