Special Issue "Transfer Entropy"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: 30 November 2014
Dr. Deniz Gencaga
Alcoa Technology, Alcoa Technical Center, 100 Technical Drive, Alcoa Center, PA 15069, USA
Interests: Bayesian data analysis; statistical signal processing; machine learning (for big data); information theory; source separation; computational mathematics and statistics; autonomous and intelligent systems; data mining and knowledge discovery; remote sensing; climatology; astronomy; systems biology; smart grid
Contribution: Special Issue "Transfer Entropy"
In many research fields, we need to analyze the causal interactions among the variables of a complex system to better understand its physical behavior. While linear techniques, such as correlation, are widely used to identify and characterize these relationships, more advanced information-based techniques, such as mutual information and transfer entropy, have proven superior. Transfer entropy has been used to analyze the causal relationships between subsystem variables from data. Because it is non-symmetric, it enables one to infer the direction of information flow. Granger causality, which generally relies on autoregression to assess interactions, has been shown to be equivalent to transfer entropy in the case of Gaussian variables.
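For readers unfamiliar with the quantity, the lag-1 transfer entropy from X to Y is the conditional mutual information I(Y_{t+1}; X_t | Y_t). The following is a minimal illustrative sketch (not taken from any paper in this issue; the simulated pair of binary series and all variable names are ours) of a plug-in estimate for discrete-valued series, which makes the directional asymmetry concrete:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of lag-1 transfer entropy from x to y,
    i.e. I(Y_{t+1}; X_t | Y_t), for discrete-valued series."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y_now, x_now)
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y_next, y_now)
    singles = Counter(y[:-1])                      # y_now
    te = 0.0
    for (yn, yc, xc), c in triples.items():
        p_cond_full = c / pairs_yx[(yc, xc)]            # p(y_next | y_now, x_now)
        p_cond_self = pairs_yy[(yn, yc)] / singles[yc]  # p(y_next | y_now)
        te += (c / n) * np.log(p_cond_full / p_cond_self)
    return te / np.log(base)

# x is i.i.d. binary noise; y copies the previous x with 90% fidelity,
# so information flows from x to y but not back.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.empty_like(x)
y[0] = 0
y[1:] = np.where(rng.random(len(x) - 1) < 0.9, x[:-1], 1 - x[:-1])

print(transfer_entropy(x, y))  # large: x drives y
print(transfer_entropy(y, x))  # near zero: no feedback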
In this special issue, we would like to collect papers focusing on both the theory and applications of Transfer Entropy. The application areas are diverse and include neuroscience, systems biology, bioinformatics, environmental sciences, climatology, engineering, finance, astronomy, Earth and space sciences, and astronomy. Of special interest are theoretical papers elucidating the state of the art of data-based transfer entropy estimation techniques.
Dr. Deniz Gencaga
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.
- transfer entropy
- causal relationships
- entropy estimation
- statistical dependency
- nonlinear interactions
- interacting subsystems
- Granger causality
- mutual information
- machine learning
- data mining
Entropy 2013, 15(1), 113-143; doi:10.3390/e15010113
Received: 14 November 2012; in revised form: 19 December 2012 / Accepted: 19 December 2012 / Published: 28 December 2012| Download PDF Full-text (579 KB)
Article: Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
Entropy 2013, 15(1), 177-197; doi:10.3390/e15010177
Received: 16 November 2012; Accepted: 31 December 2012 / Published: 10 January 2013| Download PDF Full-text (493 KB)
Article: Compensated Transfer Entropy as a Tool for Reliably Estimating Information Transfer in Physiological Time Series
Entropy 2013, 15(1), 198-219; doi:10.3390/e15010198
Received: 29 October 2012; in revised form: 21 December 2012 / Accepted: 5 January 2013 / Published: 11 January 2013| Download PDF Full-text (314 KB)
Entropy 2013, 15(1), 327-360; doi:10.3390/e15010327
Received: 17 October 2012; in revised form: 22 November 2012 / Accepted: 28 December 2012 / Published: 18 January 2013| Download PDF Full-text (996 KB)
Entropy 2013, 15(2), 524-543; doi:10.3390/e15020524
Received: 16 November 2012; in revised form: 16 January 2013 / Accepted: 28 January 2013 / Published: 1 February 2013| Download PDF Full-text (324 KB)
Entropy 2013, 15(3), 767-788; doi:10.3390/e15030767
Received: 26 January 2013; in revised form: 13 February 2013 / Accepted: 19 February 2013 / Published: 25 February 2013| Download PDF Full-text (1163 KB)
Entropy 2013, 15(7), 2635-2661; doi:10.3390/e15072635
Received: 28 March 2013; in revised form: 5 June 2013 / Accepted: 27 June 2013 / Published: 4 July 2013| Download PDF Full-text (3052 KB)
Entropy 2013, 15(8), 3186-3204; doi:10.3390/e15083276
Received: 22 May 2013; in revised form: 5 July 2013 / Accepted: 18 July 2013 / Published: 7 August 2013| Download PDF Full-text (397 KB)
Entropy 2013, 15(12), 5549-5564; doi:10.3390/e15125549
Received: 7 September 2013; in revised form: 19 October 2013 / Accepted: 9 December 2013 / Published: 16 December 2013| Download PDF Full-text (391 KB)
Entropy 2014, 16(3), 1272-1286; doi:10.3390/e16031272
Received: 29 September 2013; in revised form: 12 February 2014 / Accepted: 19 February 2014 / Published: 27 February 2014| Download PDF Full-text (283 KB)
The below list represents only planned manuscripts. Some of these manuscripts have not been received by the Editorial Office yet. Papers submitted to MDPI journals are subject to peer-review.
Type of Paper: Review
Title: Assessment of Transfer Entropy and Related Measures
Author: Dimitris Kugiumtzis
Affiliation: Department of Mathematical, Physical and Computational Sciences, Faculty of Engineering, Aristotle University of Thessaloniki, Thessaloniki 54124, Greece; E-Mail: email@example.com
Abstract: In the analysis of multivariate time series, the primary interest is in investigating inter-dependencies of the observed variables, which are often assumed to be representing possibly coupled systems. The measures of directed coupling or inter-dependence are based on the concept of Granger causality (X Granger causes Y if the prediction of Y from its own past improves by including also the past of X). The transfer entropy (TE) quantifies Granger causality in terms of information, and it thus has the advantages of being model-free and able to identify nonlinear effects. Besides the recent popularity of TE in applications, its estimation is not straightforward and depends heavily on free parameters related to the state space reconstruction and the estimation of entropies. These points are discussed and different estimation schemes are assessed, suggesting the use of k-nearest neighbor estimate. Further, extensions of transfer entropy will be discussed, such as the use of rank vectors instead of sample vectors, termed as symbolic transfer entropy (STE), and the mutual information on mixed embedding (MIME), as well as the modification of TE to take into account the presence of confounding variables, termed partial transfer entropy (PTE). These measures will be assessed as to their statistical significance and power, and examples on simulated and real data will be given.
Type of Paper: Article
Title: State-Dependent Functional Interactions in Neuronal Cultures Revealed By Transfer Entropy
Authors: Olav Stetter 1,2, Javier Orlandi 3, Jordi Soriano 3 and Demian Battaglia 1,2
Affiliation: 1 Max Planck Institute for Dynamics and Self-organization, Göttingen, Germany; E-Mail: firstname.lastname@example.org
2 Bernstein Center for Computational Neuroscience, Göttingen, Germany
3 Universitat de Barcelona, Facultat de Fisica, Barcelona, Spain
Abstract: Structural connectivity, describing actual synaptic connections, contribute to shape spontaneous or induced neural activity generated by neural circuits, both in vitro and in vivo. However, the resulting dynamics is not fully constrained by structural connectivity and different activity patterns can be generated by a same network, depending on its dynamical state. Beyond structural connectivity, actual influences between neurons in a circuit are described by directed functional connectivity, assessed by means of causal analysis tools like Granger Causality or, more recently, Transfer Entropy. Thus, structural networks with a rich repertoire of possible dynamics give rise to a multiplicity of functional networks. We focus here, on a specific example of state-dependent functional connectivity, resorting to simulations of large networks of spiking neurons and to analyses of real data. First, we consider a model of a culture of dissociated neurons in vitro, undergoing spontaneous switching between bursting and non-bursting states. These dynamical regimes are associated to functional digraphs with qualitatively different topologies, only partially dictated by the underlying unchanging structural topology. Second, we identify similar state-dependency patterns in living neuronal cultures, analyzing actual high-resolution calcium imaging recordings. Interestingly, we show that, while Transfer Entropy (and even time-delayed Mutual Information) analysis provide a description of directed functional connectivity in agreement with an intuition based on dynamical systems theory, this does not hold for linear measures like Granger Causality, which are strongly biased toward the inference of exceedingly clustered functional topologies.
Type of Paper: Research Article
Title: Nonuniform Embedding Corrected Transfer Entropy as a Tool for Reliably Estimating Information Transfer in Physiological Time Series
Authors: Luca Faes 1, Giandomenico Nollo 1 and Alberto Porta 2
Affiliation: 1 Dept. of Physics and BIOtech, University of Trento, Trento, Italy
2 Dept. of Biomedical Sciences for Health, Galeazzi Orthopaedic Institute, University of Milan, Milano, Italy; E-Mail: email@example.com
Abstract: Transfer entropy (TE) has recently emerged as a powerful nonlinear and model-free tool to detect the direction and quantify the strength of the information flow between interacting processes. However, TE estimation in short and noisy physiological time series is complicated by a number of practical issues. In the present study we develop a multivariate TE estimation framework based on conditional entropy (CE) estimation, non-uniform embedding, and consideration of instantaneous causality. The framework is based on recognizing that TE can be interpreted as CE difference, and builds on an efficient CE estimator that compensates for the bias occurring for high dimensional conditioning vectors and follows a sequential embedding procedure whereby the conditioning vectors are formed progressively according to a criterion for CE minimization. The issue of instantaneous causality is faced allowing for the possibility of zero-lag effects in TE computation, according to two alternative procedures: if instantaneous effects are deemed as physiologically meaningful, the zero-lag term is assimilated to the lagged terms to make it causally relevant; if not, the zero-lag term is incorporated in both CE computations to obtain a compensation of its confounding effects. The resulting TE estimator, denoted as corrected TE (cTE), is first tested on simulations of linear stochastic and nonlinear deterministic systems. In particular, we consider realistic simulations of cardiovascular time series interacting according to both instantaneous and time lagged effects, and of brain source signals instantaneously mixed to reproduce volume conduction effects. Simulation results demonstrate the flexibility of the proposed framework and make clear the necessity of accounting for instantaneous causality through cTE computation. 
Then, representative examples of heart rate, arterial pressure and respiration variability series measured during a paced breathing protocol, and of magnetoencephalography multi-trial signals measured during a visuo-tactile cognitive experiment, are provided to illustrate the practical applicability of the framework. Results indicate that the information flow detected by means of the cTE is better interpretable, according to the known cardiovascular physiology or the expected mechanisms of multisensory integration, when instantaneous causality effects are properly taken into account.
Last update: 21 October 2013