# Assessing Coupling Dynamics from an Ensemble of Time Series


## Abstract


## 1. Introduction

## 2. Entropy Combinations

Consider a stationary process X observed through the delay vectors $\mathbf{x}(n) \equiv (x(n), x(n-1), \ldots, x(n - d_x + 1))$ for n = 1, …, N, where n is a discrete time index and $d_x$ is the corresponding Markov order. Similarly, we construct $\mathbf{y}(n)$ and $\mathbf{z}(n)$ for processes Y and Z, respectively. Let $V = (V_1, \ldots, V_m)$ denote a random m-dimensional vector and H(V) its Shannon entropy. Then, an entropy combination is defined by:

$$C(V_{\mathcal{L}_1}, \ldots, V_{\mathcal{L}_p}) = \left( \sum_{i=1}^{p} s_i H(V_{\mathcal{L}_i}) \right) - H(V), \tag{1}$$

where $\mathcal{L}_i \subset [1, m]$ and $s_i \in \{-1, 1\}$, such that $\sum_{i=1}^{p} s_i \chi_{\mathcal{L}_i} = \chi_{[1,m]}$, where $\chi_{\mathcal{S}}$ is the indicator function of a set $\mathcal{S}$ (having the value one for elements in the set $\mathcal{S}$ and zero for elements not in $\mathcal{S}$).

For instance, the transfer entropy from Y to X corresponds to the combination $T_{X \leftarrow Y} = H_{WX} + H_{XY} - H_{X} - H_{WXY}$, with $W \equiv X^{+}$ and $x^{+} \equiv x(n + 1)$, so that $H_{WX}$ is the differential entropy of $p(x(n+1), \mathbf{x}(n))$. The latter denotes the joint probability of finding X at states x(n + 1), x(n), …, x(n − d_x + 1) during time instants n + 1, n, n − 1, …, n − d_x + 1. Notice that, due to stationarity, $p(x(n+1), \mathbf{x}(n))$ is invariant under variations of the time index n.
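To make the entropy-combination form of transfer entropy concrete, the following sketch (not taken from the paper; the toy distribution and names are illustrative) evaluates $H_{WX} + H_{XY} - H_{X} - H_{WXY}$ for a discrete joint pmf over $(W, X, Y)$, where the subsets and signs satisfy the indicator condition $\chi_{\{1,2\}} + \chi_{\{2,3\}} - \chi_{\{2\}} = \chi_{[1,3]}$:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (nats) of a pmf stored as an ndarray."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def transfer_entropy(p_wxy):
    """Entropy combination H(W,X) + H(X,Y) - H(X) - H(W,X,Y) for a
    joint pmf over (W, X, Y); equals I(W; Y | X) and is nonnegative."""
    h_wx = shannon_entropy(p_wxy.sum(axis=2))      # marginalize out Y
    h_xy = shannon_entropy(p_wxy.sum(axis=0))      # marginalize out W
    h_x = shannon_entropy(p_wxy.sum(axis=(0, 2)))  # marginal of X
    h_wxy = shannon_entropy(p_wxy)
    return h_wx + h_xy - h_x - h_wxy

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()                # normalize to a valid joint pmf
te = transfer_entropy(p)
```

If the pmf factorizes as $p(w, x)\,p(y)$, i.e., Y carries no information about the pair, the combination evaluates to zero, which is a quick consistency check.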

## 3. Ensemble Estimators for Entropy Combinations

The Kozachenko–Leonenko (KL) estimator of the differential entropy of a d-dimensional random vector from N realizations {x[i]} reads:

$$\hat{H} = -\psi(k) + \psi(N) + \log v_d + \frac{d}{N} \sum_{i=1}^{N} \log \epsilon(i), \tag{2}$$

where ψ is the digamma function, $v_d$ is the volume of the d-dimensional unit ball and ϵ(i) is the distance from x[i] to its k-th nearest neighbor in the set {x[j]}_{∀j≠i}. The KL estimator is based on the assumption that the density of the distribution of random vectors is constant within an ϵ-ball. The bias of the final entropy estimate depends on the validity of this assumption and, thus, on the values of ϵ(n). Since the size of the ϵ-balls depends directly on the dimensionality of the random vector, the biases of estimates for the differential entropies in Equation (1) will, in general, not cancel, leading to a poor estimator of the entropy combination. This problem can be partially overcome by noticing that Equation (2) holds for any value of k, so that we do not need a fixed k. Therefore, we can vary the value of k at each data point, so that the radii of the corresponding ϵ-balls are approximately the same for the joint and the marginal spaces. This idea was originally proposed in [23] for estimating mutual information and was used in [16] to estimate PMI, and we generalize it here to the following estimator of entropy combinations:

$$\hat{C}(V_{\mathcal{L}_1}, \ldots, V_{\mathcal{L}_p}) = \psi(k) + \left( \sum_{i=1}^{p} s_i - 1 \right) \psi(N) - \frac{1}{N} \sum_{n=1}^{N} \sum_{i=1}^{p} s_i\, \psi(k_i(n)), \tag{3}$$

where $k_i(n)$ accounts for the number of neighbors of the n-th realization of the marginal vector $V_{\mathcal{L}_i}$ located at a distance strictly less than ϵ(n), where ϵ(n) denotes the radius of the ϵ-ball in the joint space. Note that the point itself is included when counting neighbors in the marginal spaces ($k_i(n)$), but not when selecting ϵ(n) from the k-th nearest neighbor in the full joint space. Furthermore, note that the estimator in Equation (3) corresponds to extending "Algorithm 1" of [23] to entropy combinations. Extensions to conditional mutual information and conditional transfer entropy using "Algorithm 2" of [23] have been discussed recently [12].

Assume now that the experiment is repeated to obtain an ensemble of r′ trials, and denote by ${\{v^{(r)}[n]\}}_{r}$ the measured dynamics for those trials (r = 1, 2, …, r′). Similarly, we denote by ${\{{v}_{i}^{(r)}[n]\}}_{r}$ the measured dynamics for the marginal vector ${V}_{{\mathcal{L}}_{i}}$. A straightforward approach to integrating the information from different trials is to average together the estimates obtained from individual trials:

$$\hat{C}^{\mathrm{avg}} = \frac{1}{r'} \sum_{r=1}^{r'} \hat{C}^{(r)}.$$
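The across-trial alternative illustrated in Figure 1 searches for neighbors among all trials within a small temporal window around each instant n. A minimal sketch of that pooling step (the window size, array layout, and names are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def pool_across_trials(data, n, half_window):
    """Collect the state vectors of every trial inside a small time
    window around instant n, as in an across-trial neighbor search.
    data: array of shape (R_trials, N_samples, dim).
    Returns an array of shape (R_trials * window_length, dim)."""
    lo = max(0, n - half_window)
    hi = min(data.shape[1], n + half_window + 1)
    return data[:, lo:hi, :].reshape(-1, data.shape[-1])

R, N, dim = 40, 500, 3
data = np.random.default_rng(2).standard_normal((R, N, dim))
pool = pool_across_trials(data, n=250, half_window=5)  # 11 time points x 40 trials
```

The pooled set can then be fed to a nearest-neighbor entropy estimator at each time point, yielding a time-resolved estimate even when each single trial is too short.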

## 4. Tests on Simulated and Experimental Data

To demonstrate that the ensemble estimator $T^{\mathrm{en}}$ can be used to characterize dynamic coupling patterns, we apply the ensemble estimator of PTE to multivariate time series from coupled processes.

where $\eta_x$, $\eta_y$ and $\eta_z$ represent normally-distributed noise processes, which are mutually independent across trials and time instants. The coupling delays amount to $\tau_{yx} = 10$ and $\tau_{zy} = 15$, while the strength of the coupling follows a sinusoidal variation in time.

To compute $T_{a \leftarrow b|c}(n)$, the time series b and c were delayed so that they shared maximum information with the time series a, as suggested in [16]. For a rigorous and formal way to investigate the lag in the information flow between systems, we refer to [26,27].
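A rough way to generate such a trial ensemble is sketched below. This is not the paper's exact system (its equations are not reproduced in this excerpt): the AR coefficients, noise scales, and the modulation period are hypothetical; only the delays $\tau_{yx} = 10$ and $\tau_{zy} = 15$ and the sinusoidal coupling come from the text.

```python
import numpy as np

def simulate_ensemble(r_trials=50, n_samples=600, tau_yx=10, tau_zy=15,
                      period=100, seed=0):
    """Ensemble of trials for a delayed chain X -> Y -> Z whose coupling
    strength follows a sinusoid. Coefficients, noise scales, and the
    modulation period are illustrative choices."""
    rng = np.random.default_rng(seed)
    # Coupling strength oscillates between 0 and 1 with the given period.
    c = 0.5 * (1.0 + np.sin(2.0 * np.pi * np.arange(n_samples) / period))
    x = rng.standard_normal((r_trials, n_samples))
    y = np.zeros((r_trials, n_samples))
    z = np.zeros((r_trials, n_samples))
    for n in range(1, n_samples):
        drive_y = c[n] * x[:, n - tau_yx] if n >= tau_yx else 0.0
        drive_z = c[n] * y[:, n - tau_zy] if n >= tau_zy else 0.0
        # Noise is drawn independently for every trial and time instant.
        y[:, n] = 0.5 * y[:, n - 1] + drive_y + 0.1 * rng.standard_normal(r_trials)
        z[:, n] = 0.5 * z[:, n - 1] + drive_z + 0.1 * rng.standard_normal(r_trials)
    return x, y, z

x, y, z = simulate_ensemble()
```

By construction, information flows only along X → Y → Z, with strength that waxes and wanes over each trial, which is the pattern a time-resolved PTE estimate should recover.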

where $x_{\delta}$ represents the value of the variable x at time t − δ; γ, β and n are positive numbers; and η represents noise sources. The feedback loop of the first circuit and the time-varying coupling between the two circuits are represented by the first terms of each equation, respectively. We note that this set of equations was not used to sample data; instead, time series were obtained directly from the voltage variables of the electronic circuits. The equations above serve only to illustrate, in mathematical terms, the type of dynamics expected from the electronic circuits.

The information flow in the reverse direction remained low ($T_{1 \leftarrow 2} < 0.0795$ nats $\forall (n, \tau)$) and only reached significance (p < 0.01) for about 1% of the tuples (n, τ) (Figure 4). Both the period of the coupling dynamics (100 samples) and the coupling delay (20 samples) can be accurately recovered from Figure 3.

The bias and variance of the estimator are reported as $-20 \times \log_{10}(\mathrm{bias})$ and $-20 \times \log_{10}(\mathrm{var})$, so large values of these quantities correspond to small bias and variance, respectively. In particular, Figure 5 shows the bias and variance of the estimator as a function of the number of samples and the cross-correlation coefficient. As observed in the plot, the smaller the value of the underlying TE (smaller cross-correlation), the better its estimation (smaller bias and variance). For a given value of TE, the estimation improves as more samples are included, as expected. Regarding the number of neighbors (Figure 6), we find that, beyond a minimum number of samples, accuracy increases with either the sample size or the number of neighbors.

## 5. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

1. Gray, C.; Konig, P.; Engel, A.; Singer, W. Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties. *Nature* **1989**, *338*, 334.
2. Bjornstad, O.; Grenfell, B. Noisy clockwork: Time series analysis of population fluctuations in animals. *Science* **2001**, *293*, 638.
3. Granger, C.; Hatanaka, M. *Spectral Analysis of Economic Time Series*; Princeton University Press: Princeton, NJ, USA, 1964.
4. Valdes-Sosa, P.A.; Roebroeck, A.; Daunizeau, J.; Friston, K. Effective connectivity: Influence, causality and biophysical modeling. *Neuroimage* **2011**, *58*, 339.
5. Granger, C. Investigating causal relations by econometric models and cross-spectral methods. *Econometrica* **1969**, *37*, 424.
6. Pereda, E.; Quian Quiroga, R.; Bhattacharya, J. Nonlinear multivariate analysis of neurophysiological signals. *Prog. Neurobiol.* **2005**, *77*, 1.
7. Cover, T.; Thomas, J. *Elements of Information Theory*; Wiley: Hoboken, NJ, USA, 2006.
8. Wiener, N. The theory of prediction. In *Modern Mathematics for Engineers*; McGraw-Hill: New York, NY, USA, 1956.
9. Schreiber, T. Measuring information transfer. *Phys. Rev. Lett.* **2000**, *85*, 461.
10. Chicharro, D.; Ledberg, A. When two become one: The limits of causality analysis of brain dynamics. *PLoS ONE* **2012**, *7*, e32466.
11. Wibral, M.; Vicente, R.; Lizier, J.T. *Directed Information Measures in Neuroscience*; Springer: Berlin, Germany, 2014.
12. Wibral, M.; Vicente, R.; Lindner, M. Transfer entropy in neuroscience. In *Directed Information Measures in Neuroscience*; Wibral, M., Vicente, R., Lizier, J.T., Eds.; Springer: Berlin, Germany, 2014.
13. Lizier, J.T. *The Local Information Dynamics of Distributed Computation in Complex Systems*; Springer: Berlin, Germany, 2013.
14. Kantz, H.; Schreiber, T. *Nonlinear Time Series Analysis*, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004.
15. Wyner, A.D. A definition of conditional mutual information for arbitrary ensembles. *Inf. Control* **1978**, *38*, 51.
16. Frenzel, S.; Pompe, B. Partial mutual information for coupling analysis of multivariate time series. *Phys. Rev. Lett.* **2007**, *99*, 204101.
17. Verdes, P.F. Assessing causality from multivariate time series. *Phys. Rev. E* **2005**, *72*, 026222.
18. Gómez-Herrero, G. Ph.D. Thesis, Department of Signal Processing, Tampere University of Technology, Tampere, Finland, 2010.
19. Ragwitz, M.; Kantz, H. Markov models from data by simple nonlinear time series predictors in delay embedding spaces. *Phys. Rev. E* **2002**, *65*, 056201.
20. Victor, J.D. Binless strategies for estimation of information from neural data. *Phys. Rev. E* **2002**, *66*, 051903.
21. Vicente, R.; Wibral, M. Efficient estimation of information transfer. In *Directed Information Measures in Neuroscience*; Wibral, M., Vicente, R., Lizier, J.T., Eds.; Springer: Berlin, Germany, 2014.
22. Kozachenko, L.; Leonenko, N. Sample estimate of the entropy of a random vector. *Problemy Peredachi Informatsii* **1987**, *23*, 9.
23. Kraskov, A.; Stögbauer, H.; Grassberger, P. Estimating mutual information. *Phys. Rev. E* **2004**, *69*, 066138.
24. Kramer, M.A.; Edwards, E.; Soltani, M.; Berger, M.S.; Knight, R.T.; Szeri, A.J. Synchronization measures of bursting data: Application to the electrocorticogram of an auditory event-related experiment. *Phys. Rev. E* **2004**, *70*, 011914.
25. Cao, L. Practical method for determining the minimum embedding dimension of a scalar time series. *Physica D* **1997**, *110*, 43.
26. Wibral, M.; Pampu, N.; Priesemann, V.; Siebenhühner, F.; Seiwert, H.; Lindner, M.; Lizier, J.T.; Vicente, R. Measuring information-transfer delays. *PLoS ONE* **2013**, *8*, e55809.
27. Wollstadt, P.; Martinez-Zarzuela, M.; Vicente, R.; Diaz-Pernas, F.J.; Wibral, M. Efficient transfer entropy analysis of non-stationary neural time series. *PLoS ONE* **2014**, *9*, e102833.
28. Pesarin, F. *Multivariate Permutation Tests*; John Wiley and Sons: Hoboken, NJ, USA, 2001.
29. Kaiser, A.; Schreiber, T. Information transfer in continuous processes. *Physica D* **2002**, *166*, 43.
30. Kantz, H.; Ragwitz, M. Phase space reconstruction and nonlinear predictions for stationary and nonstationary Markovian processes. *Int. J. Bifurc. Chaos* **2004**, *14*, 1935.
31. Rutanen, K. TIM 1.2.0. Available online: http://www.tut.fi/tim (accessed on 2 April 2015).
32. Lindner, M.; Vicente, R.; Priesemann, V.; Wibral, M. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. *BMC Neurosci.* **2011**, *12*, 119.

**Figure 1.** Nearest neighbor statistics across trials. (**a**): For each time instant $n = n^*$ and trial $r = r^*$, we compute the (maximum norm) distance ${\epsilon}^{(r^*)}(n^*)$ from ${v}^{(r^*)}(n^*)$ to its k-th nearest neighbor among all trials. Here, the procedure is illustrated for k = 5. (**b**): ${k}_{i}^{(r^*)}[n^*]$ counts how many neighbors of ${v}_{i}^{(r^*)}[n^*]$ are within a radius ${\epsilon}^{(r^*)}(n^*)$. The point itself (i.e., ${v}_{i}^{(r^*)}[n^*]$) is also included in this count. These neighbor counts are obtained for all i = 1, …, p marginal trajectories.

**Figure 2.** Partial transfer entropy between three non-linearly coupled Gaussian processes. The upper panel displays the partial transfer entropy (PTE) in directions compatible with the structural coupling of the Gaussian processes (X to Y to Z). The lower panel displays the PTE values in directions non-compatible with the structural coupling. The solid lines represent PTE values, while the color-matched dashed lines denote the corresponding p = 0.05 significance levels. The number of neighbors was k = 20, and the time window for the search of neighbors was 2σ = 10. The temporal variance of the PTE estimates was reduced with a post-processing moving average filter of order 20.

**Figure 3.** Transfer entropy from the first electronic circuit towards the second. The upper panel shows the time-varying TE versus the lag introduced in the temporal activation of the first circuit. The lower panel shows the temporal pattern of information flow for τ = 20, i.e., $T_{2 \leftarrow 1}(n, \tau = 20)$, which resembles a sinusoid with a period of roughly 100 data samples.

**Figure 4.** Transfer entropy from the second electronic circuit towards the first. The upper panel shows the time-varying TE versus the lag introduced in the temporal activation of the first circuit. The lower panel shows the temporal pattern of information flow for τ = 20, i.e., $T_{1 \leftarrow 2}(n, \tau = 20)$.

**Figure 5.** (**a**): $-20 \times \log_{10}(\mathrm{bias})$ of the ensemble estimator of TE(Y → X) as a function of the number of samples and the cross-correlation coefficient for X (which controls the nominal TE value for Y → X). (**b**): $-20 \times \log_{10}(\mathrm{variance})$ as a function of the number of samples and the cross-correlation coefficient for X.

**Figure 6.** (**a**): $-20 \times \log_{10}(\mathrm{bias})$ of the ensemble estimator of TE(Y → X) as a function of the number of samples and the number of nearest neighbors used in the estimator. (**b**): $-20 \times \log_{10}(\mathrm{variance})$ as a function of the number of samples and the number of nearest neighbors used in the estimator.

© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

**MDPI and ACS Style**

Gómez-Herrero, G.; Wu, W.; Rutanen, K.; Soriano, M.C.; Pipa, G.; Vicente, R. Assessing Coupling Dynamics from an Ensemble of Time Series. *Entropy* **2015**, *17*, 1958-1970.
https://doi.org/10.3390/e17041958
