
Transfer Entropy

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 December 2014) | Viewed by 139880

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Information

Dear Colleagues,

In many research fields, we need to analyze the causal interactions among the variables of a complex system in order to better understand its physical behavior. While linear techniques, such as correlation, are widely used to identify and characterize these relationships, more advanced information-based techniques, such as mutual information and transfer entropy, have proven to be superior. Transfer entropy has been used to analyze the causal relationships between subsystem variables from data. The fact that it is non-symmetric enables one to infer the direction of information flow. Granger causality, which generally relies on autoregression to assess interactions, has been shown to be equivalent to transfer entropy in the case of Gaussian variables.
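
For readers new to the measure, it is worth writing out Schreiber's original definition, which underlies the papers collected here. For a source Y and a target X with history lengths l and k, the transfer entropy is

    T_{Y \to X} \;=\; \sum p\bigl(x_{n+1},\, x_n^{(k)},\, y_n^{(l)}\bigr)\,
        \log \frac{p\bigl(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\bigr)}
                  {p\bigl(x_{n+1} \mid x_n^{(k)}\bigr)},

i.e., the extra information that the past of Y carries about the next value of X beyond what the past of X already carries. For jointly Gaussian variables, the Granger causality statistic equals exactly twice this quantity (in nats), which is the equivalence referred to above.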

In this Special Issue, we would like to collect papers focusing on both the theory and applications of transfer entropy. The application areas are diverse and include neuroscience, systems biology, bioinformatics, environmental sciences, climatology, engineering, finance, astronomy, and Earth and space sciences. Of special interest are theoretical papers elucidating the state of the art of data-based transfer entropy estimation techniques.

Dr. Deniz Gencaga
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • transfer entropy
  • causality
  • causal relationships
  • entropy estimation
  • statistical dependency
  • nonlinear interactions
  • interacting subsystems
  • information theory
  • Granger causality
  • mutual information
  • machine learning
  • data mining

Published Papers (17 papers)

Editorial

Editorial
Transfer Entropy
by Deniz Gençağa
Entropy 2018, 20(4), 288; https://doi.org/10.3390/e20040288 - 16 Apr 2018
Cited by 13 | Viewed by 4188
Abstract
Statistical relationships among the variables of a complex system reveal a lot about its physical behavior [...] Full article
(This article belongs to the Special Issue Transfer Entropy)

Research

Article
Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach
by Jie Zhu, Jean-Jacques Bellanger, Huazhong Shu and Régine Le Bouquin Jeannès
Entropy 2015, 17(6), 4173-4201; https://doi.org/10.3390/e17064173 - 16 Jun 2015
Cited by 22 | Viewed by 7277
Abstract
This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy involving a rectangular neighboring region, as suggested in the existing literature, and develop two kinds of entropy estimators. Then, applying the widely-used error cancellation approach to these entropy estimators, we propose two novel transfer entropy estimators, implying no extra computational cost compared to existing similar k-NN algorithms. Experimental simulations compare the new estimators with the transfer entropy estimators available in free toolboxes, which correspond to two different extensions of the Kraskov–Stögbauer–Grassberger (KSG) mutual information estimator to transfer entropy estimation, and demonstrate the effectiveness of the new estimators. Full article
(This article belongs to the Special Issue Transfer Entropy)
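To make the general k-NN strategy concrete (though not the bias-compensated estimators this paper develops), here is a minimal Python sketch that assembles TE(Y -> X) from four Kozachenko–Leonenko entropy estimates with one-step histories; the function names and embedding choices are illustrative, not the authors' code:

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma

    def kl_entropy(x, k=4):
        # Kozachenko-Leonenko differential entropy estimate (in nats),
        # based on max-norm distances to the k-th nearest neighbour.
        x = np.asarray(x, dtype=float).reshape(len(x), -1)
        n, d = x.shape
        eps = cKDTree(x).query(x, k=k + 1, p=np.inf)[0][:, -1]
        # eps must be positive; duplicated points need jittering in practice
        return digamma(n) - digamma(k) + d * np.log(2.0) + d * np.mean(np.log(eps))

    def transfer_entropy(x, y, k=4):
        # Naive TE(Y -> X) as an entropy combination over lag-1 embeddings:
        # TE = H(Xf,Xp) + H(Xp,Yp) - H(Xf,Xp,Yp) - H(Xp)
        xf, xp, yp = x[1:], x[:-1], y[:-1]
        c = np.column_stack
        return (kl_entropy(c([xf, xp]), k) + kl_entropy(c([xp, yp]), k)
                - kl_entropy(c([xf, xp, yp]), k) - kl_entropy(xp, k))

On two unidirectionally coupled AR(1) series, this returns a clearly larger value in the driving direction. The four estimates above are computed independently, so their individual biases add up; the error-cancellation idea discussed in the paper couples the estimates so that the biases largely cancel instead.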

Article
Assessing Coupling Dynamics from an Ensemble of Time Series
by Germán Gómez-Herrero, Wei Wu, Kalle Rutanen, Miguel C. Soriano, Gordon Pipa and Raul Vicente
Entropy 2015, 17(4), 1958-1970; https://doi.org/10.3390/e17041958 - 02 Apr 2015
Cited by 45 | Viewed by 8497
Abstract
Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems. Full article
(This article belongs to the Special Issue Transfer Entropy)
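The central trick of this approach, pooling samples across repeated trials at a fixed time index rather than along time, is easy to sketch on top of the kl_entropy helper from the previous sketch (again a simplified illustration, not the authors' estimator):

    # Reuses numpy (np) and kl_entropy from the previous sketch.
    def time_resolved_entropy(trials, k=4):
        # trials: array of shape (n_trials, n_times). Each column is treated
        # as an ensemble of i.i.d. samples at that time index, yielding a
        # time-resolved estimate H(t) even when the signal is non-stationary.
        n_trials, n_times = trials.shape
        return np.array([kl_entropy(trials[:, t], k) for t in range(n_times)])

The same ensemble logic extends to the entropy combinations named in the abstract (mutual information, transfer entropy and their conditional counterparts) by stacking the appropriate embedding vectors column-wise at each time index.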

Article
A Recipe for the Estimation of Information Flow in a Dynamical System
by Deniz Gencaga, Kevin H. Knuth and William B. Rossow
Entropy 2015, 17(1), 438-470; https://doi.org/10.3390/e17010438 - 19 Jan 2015
Cited by 42 | Viewed by 9954
Abstract
Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instance in time may provide information about other variables at later instances in time. This is often viewed as a flow of information, and tracking such a flow can reveal relationships among the system variables. Since MI is a symmetric quantity, an asymmetric quantity, called Transfer Entropy (TE), has been proposed to estimate the directionality of the coupling. However, accurate estimation of entropy-based measures is notoriously difficult. Every method has its own free tuning parameter(s) and there is no consensus on an optimal way of estimating the TE from a dataset. We propose a new methodology to estimate TE and apply a set of methods together as an accuracy cross-check to provide a reliable mathematical tool for any given dataset. We demonstrate both the variability in TE estimation across techniques as well as the benefits of the proposed methodology to reliably estimate the directionality of coupling among variables. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Inferring a Drive-Response Network from Time Series of Topological Measures in Complex Networks with Transfer Entropy
by Xinbo Ai
Entropy 2014, 16(11), 5753-5776; https://doi.org/10.3390/e16115753 - 03 Nov 2014
Cited by 10 | Viewed by 6156
Abstract
Topological measures are crucial to describe, classify and understand complex networks. Many measures have been proposed to characterize specific features of specific networks, but the relationships among these measures remain unclear. Taking into account that pulling networks from different domains together for statistical analysis might lead to incorrect conclusions, we conduct our investigation with data observed from the same network in the form of simultaneously measured time series. We synthesize a transfer entropy-based framework to quantify the relationships among topological measures, and then provide a holistic scenario of these measures by inferring a drive-response network. Techniques from Symbolic Transfer Entropy, Effective Transfer Entropy, and Partial Transfer Entropy are synthesized to deal with challenges such as non-stationary time series, finite sample effects and indirect effects. We resort to kernel density estimation to assess the significance of the results based on surrogate data. The framework is applied to study 20 measures across 2779 records in the Technology Exchange Network, and the results are consistent with some existing knowledge. With the drive-response network, we evaluate the influence of each measure by calculating its strength, and cluster the measures into three classes, i.e., driving measures, responding measures and standalone measures, according to the network communities. Full article
(This article belongs to the Special Issue Transfer Entropy)
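Of the ingredients combined in this framework, the symbolic encoding is the simplest to illustrate. The Python sketch below is a bare-bones version of symbolic transfer entropy only, without the effective/partial corrections and surrogate testing that the paper layers on top; all names are illustrative:

    import numpy as np
    from collections import Counter
    from math import log

    def symbolize(x, m=3):
        # Replace each length-m window by its ordinal pattern (a permutation).
        return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

    def symbolic_te(xs, ys):
        # Plug-in TE(Y -> X) over symbol sequences with lag-1 histories:
        # sum of p(xf, xp, yp) * log2[ p(xf | xp, yp) / p(xf | xp) ]
        trip = Counter(zip(xs[1:], xs[:-1], ys[:-1]))
        pair_xy = Counter(zip(xs[:-1], ys[:-1]))
        pair_xx = Counter(zip(xs[1:], xs[:-1]))
        single = Counter(xs[:-1])
        n = len(xs) - 1
        return sum((c / n) * log(c * single[xp] / (pair_xx[xf, xp] * pair_xy[xp, yp]), 2)
                   for (xf, xp, yp), c in trip.items())

    # usage: symbolic_te(symbolize(x), symbolize(y)) for equal-length series x, y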

Article
Structure of a Global Network of Financial Companies Based on Transfer Entropy
by Leonidas Sandoval, Jr.
Entropy 2014, 16(8), 4443-4482; https://doi.org/10.3390/e16084443 - 07 Aug 2014
Cited by 110 | Viewed by 9880
Abstract
This work uses the stocks of the 197 largest companies in the world, in terms of market capitalization, in the financial area, from 2003 to 2012. We study the causal relationships between them using Transfer Entropy, which is calculated using the stocks of those companies and their counterparts lagged by one day. With this, we can assess which companies influence others according to sub-areas of the financial sector, which are banks, diversified financial services, savings and loans, insurance, private equity funds, real estate investment companies, and real estate trust funds. We also analyze the exchange of information between those stocks as seen by Transfer Entropy and the network formed by them based on this measure, verifying that they cluster mainly according to countries of origin, and then by industry and sub-industry. Then we use data on the stocks of companies in the financial sector of some countries that are suffering the most from the current credit crisis, namely Greece, Cyprus, Ireland, Spain, Portugal, and Italy, and assess, also using Transfer Entropy, which companies from the largest 197 are most affected by the stocks of these countries in crisis. The aim is to map a network of influences that may be used in the study of possible contagions originating in those countries in financial crisis. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Transfer Entropy Expressions for a Class of Non-Gaussian Distributions
by Mehrdad Jafari-Mamaghani and Joanna Tyrcha
Entropy 2014, 16(3), 1743-1755; https://doi.org/10.3390/e16031743 - 24 Mar 2014
Cited by 8 | Viewed by 5939
Abstract
Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential, logistic, Pareto (type I–IV) and Burr distributions. The latter two fall into the class of fat-tailed distributions with power law properties, used frequently in biological, physical and actuarial sciences. We discover that the transfer entropy expressions for all four distributions are identical and depend merely on the multivariate distribution parameter and the number of distribution dimensions. Moreover, we find that in all four cases the transfer entropies are given by the same decreasing function of distribution dimensionality. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence
by Massimo Materassi, Giuseppe Consolini, Nathan Smith and Rossana De Marco
Entropy 2014, 16(3), 1272-1286; https://doi.org/10.3390/e16031272 - 27 Feb 2014
Cited by 19 | Viewed by 6724
Abstract
The use of transfer entropy has proven to be helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitani–Yamada shell model. Indeed, this well-known model is able to reproduce fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theoretic analysis to its output tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, avoiding the wrong conclusions to which the "traditional" transfer entropy would lead. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Risk Contagion in Chinese Banking Industry: A Transfer Entropy-Based Analysis
by Jianping Li, Changzhi Liang, Xiaoqian Zhu, Xiaolei Sun and Dengsheng Wu
Entropy 2013, 15(12), 5549-5564; https://doi.org/10.3390/e15125549 - 16 Dec 2013
Cited by 36 | Viewed by 7510
Abstract
What is the impact of a bank failure on the whole banking industry? To resolve this issue, the paper develops a transfer entropy-based method to determine the interbank exposure matrix between banks. This method constructs the interbank market structure by calculating the transfer entropy matrix using bank stock price sequences. This paper also evaluates the stability of the Chinese banking system by simulating the risk contagion process. This paper contributes to the literature on interbank contagion mainly in two ways: first, it establishes a convincing connection between the interbank market and transfer entropy, and exploits market information (stock prices) rather than presumptions to determine the interbank exposure matrix; second, the empirical analysis provides an in-depth understanding of the stability of the current Chinese banking system. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Linearized Transfer Entropy for Continuous Second Order Systems
by Jonathan M. Nichols, Frank Bucholtz and Joe V. Michalowicz
Entropy 2013, 15(8), 3186-3204; https://doi.org/10.3390/e15083186 - 07 Aug 2013
Cited by 12 | Viewed by 3888
Abstract
The transfer entropy has proven a useful measure of coupling among components of a dynamical system. This measure effectively captures the influence of one system component on the transition probabilities (dynamics) of another. The original motivation for the measure was to quantify such relationships among signals collected from a nonlinear system. However, we have found the transfer entropy to also be a useful concept in describing linear coupling among system components. In this work we derive the analytical transfer entropy for the response of coupled, second order linear systems driven with a Gaussian random process. The resulting expression is a function of the auto- and cross-correlation functions associated with the system response for different degrees-of-freedom. We show clearly that the interpretation of the transfer entropy as a measure of "information flow" is not always valid. In fact, in certain instances the "flow" can appear to switch directions simply by altering the degree of linear coupling. A safer way to view the transfer entropy is as a measure of the ability of a given system component to predict the dynamics of another. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Simulation Study of Direct Causality Measures in Multivariate Time Series
by Angeliki Papana, Catherine Kyrtsou, Dimitris Kugiumtzis and Cees Diks
Entropy 2013, 15(7), 2635-2661; https://doi.org/10.3390/e15072635 - 04 Jul 2013
Cited by 70 | Viewed by 9049
Abstract
Measures of the direction and strength of the interdependence among time series from multivariate systems are evaluated based on their statistical significance and discrimination ability. The best-known measures estimating direct causal effects, both linear and nonlinear, are considered, i.e., conditional Granger causality index (CGCI), partial Granger causality index (PGCI), partial directed coherence (PDC), partial transfer entropy (PTE), partial symbolic transfer entropy (PSTE) and partial mutual information on mixed embedding (PMIME). The performance of the multivariate coupling measures is assessed on stochastic and chaotic simulated uncoupled and coupled dynamical systems for different settings of embedding dimension and time series length. The CGCI, PGCI and PDC seem to outperform the other causality measures in the case of the linearly coupled systems, while the PGCI is the most effective one when latent and exogenous variables are present. The PMIME outweighs all others in the case of nonlinear simulation systems. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Transfer Entropy for Coupled Autoregressive Processes
by Daniel W. Hahs and Shawn D. Pethel
Entropy 2013, 15(3), 767-788; https://doi.org/10.3390/e15030767 - 25 Feb 2013
Cited by 25 | Viewed by 8243
Abstract
A method is shown for computing transfer entropy over multiple time lags for coupled autoregressive processes using formulas for the differential entropy of multivariate Gaussian processes. Two examples are provided: (1) a first-order filtered noise process whose state is measured with additive noise, and (2) two first-order coupled processes each of which is driven by white process noise. We found that, for the first example, increasing the first-order AR coefficient while keeping the correlation coefficient between the filtered and measured processes fixed increases the transfer entropy, since the entropy of the measured process is itself increased. For the second example, the minimum correlation coefficient occurs when the process noise variances match. It was seen that matching these variances results in minimum information flow, expressed as the sum of the transfer entropies in both directions. Without a match, the transfer entropy is larger in the direction away from the process having the larger process noise. Fixing the process noise variances, the transfer entropies in both directions increase with the coupling strength. Finally, we note that the method can be employed generally to compute other information-theoretic quantities as well. Full article
(This article belongs to the Special Issue Transfer Entropy)
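For jointly Gaussian processes, every term in such a computation reduces to the log-determinant of a covariance matrix, so TE can be evaluated in closed form from sample covariances. A minimal sketch restricted to a single lag (the paper treats multiple lags; names are illustrative):

    import numpy as np

    def gaussian_entropy(cov):
        # Differential entropy (in nats) of a Gaussian with covariance cov.
        cov = np.atleast_2d(cov)
        d = cov.shape[0]
        return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

    def gaussian_te(x, y):
        # TE(Y -> X) at lag 1 for jointly Gaussian, stationary series:
        # TE = H(Xf,Xp) + H(Xp,Yp) - H(Xf,Xp,Yp) - H(Xp)
        C = np.cov(np.vstack([x[1:], x[:-1], y[:-1]]))
        sub = lambda i: C[np.ix_(i, i)]
        return (gaussian_entropy(sub([0, 1])) + gaussian_entropy(sub([1, 2]))
                - gaussian_entropy(C) - gaussian_entropy(sub([1])))

Equivalently, this is half the log-ratio of the residual variance of X_{n+1} predicted from X_n alone to that predicted from (X_n, Y_n), i.e., half the Granger causality statistic for Gaussian variables.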

Article
On Thermodynamic Interpretation of Transfer Entropy
by Mikhail Prokopenko, Joseph T. Lizier and Don C. Price
Entropy 2013, 15(2), 524-543; https://doi.org/10.3390/e15020524 - 01 Feb 2013
Cited by 59 | Viewed by 9016
Abstract
We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialised Boltzmann's principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This in turn characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrate that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect. Full article
(This article belongs to the Special Issue Transfer Entropy)

Article
Compensated Transfer Entropy as a Tool for Reliably Estimating Information Transfer in Physiological Time Series
by Luca Faes, Giandomenico Nollo and Alberto Porta
Entropy 2013, 15(1), 198-219; https://doi.org/10.3390/e15010198 - 11 Jan 2013
Cited by 80 | Viewed by 10231
Abstract
We present a framework for the estimation of transfer entropy (TE) under the conditions typical of physiological system analysis, featuring short multivariate time series and the presence of instantaneous causality (IC). The framework is based on recognizing that TE can be interpreted as the difference between two conditional entropy (CE) terms, and builds on an efficient CE estimator that compensates for the bias occurring for high dimensional conditioning vectors and follows a sequential embedding procedure whereby the conditioning vectors are formed progressively according to a criterion for CE minimization. The issue of IC is faced by accounting for zero-lag interactions according to two alternative empirical strategies: if IC is deemed physiologically meaningful, zero-lag effects are assimilated to lagged effects to make them causally relevant; if not, zero-lag effects are incorporated in both CE terms to obtain a compensation. The resulting compensated TE (cTE) estimator is tested on simulated time series, showing that its utilization improves sensitivity (from 61% to 96%) and specificity (from 5/6 to 0/6 false positives) in the detection of information transfer, respectively, when instantaneous effects are causally meaningful and non-meaningful. The estimator is then evaluated on examples of cardiovascular and neurological time series, supporting the feasibility of the proposed framework for the investigation of physiological mechanisms. Full article
(This article belongs to the Special Issue Transfer Entropy)
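The identity on which this framework rests, TE as a difference of two conditional entropy (CE) terms, reads, in the notation used earlier:

    T_{Y \to X} \;=\; H\bigl(x_{n+1} \mid x_n^{(k)}\bigr) \;-\; H\bigl(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\bigr),

which is why a bias-compensated CE estimator directly yields a compensated TE estimator; the zero-lag (instantaneous) effects are then handled inside the two CE terms as the abstract describes.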

Article
Moving Frames of Reference, Relativity and Invariance in Transfer Entropy and Information Dynamics
by Joseph T. Lizier and John R. Mahoney
Entropy 2013, 15(1), 177-197; https://doi.org/10.3390/e15010177 - 10 Jan 2013
Cited by 12 | Viewed by 6915
Abstract
We present a new interpretation of a local framework for information dynamics, including the transfer entropy, by defining a moving frame of reference for the observer of dynamics in lattice systems. This formulation is inspired by the idea of investigating "relativistic" effects on observing the dynamics of information - in particular, we investigate a Galilean transformation of the lattice system data. In applying this interpretation to elementary cellular automata, we demonstrate that using a moving frame of reference certainly alters the observed spatiotemporal measurements of information dynamics, yet still returns meaningful results in this context. We find that, as expected, an observer will report coherent spatiotemporal structures that are moving in their frame as information transfer, and structures that are stationary in their frame as information storage. Crucially, the extent to which the shifted frame of reference alters the results depends on whether the shift of frame retains, adds or removes relevant information regarding the source-destination interaction. Full article
(This article belongs to the Special Issue Transfer Entropy)

Review

Review
The Liang-Kleeman Information Flow: Theory and Applications
by X. San Liang
Entropy 2013, 15(1), 327-360; https://doi.org/10.3390/e15010327 - 18 Jan 2013
Cited by 54 | Viewed by 12787
Abstract
Information flow, or information transfer as it may be referred to, is a fundamental notion in general physics which has wide applications in scientific disciplines. Recently, a rigorous formalism has been established with respect to both deterministic and stochastic systems, with flow measures explicitly obtained. These measures possess some important properties, among which is flow or transfer asymmetry. The formalism has been validated and put to application with a variety of benchmark systems, such as the baker transformation, Hénon map, truncated Burgers-Hopf system, Langevin equation, etc. In the chaotic Burgers-Hopf system, all the transfers, save for one, are essentially zero, indicating that the processes underlying a dynamical phenomenon, albeit complex, could be simple. (Truth is simple.) In the Langevin equation case, it is found that there could be no information flowing from one certain time series to another series, though the two are highly correlated. Information flow/transfer provides a potential measure of the cause–effect relation between dynamical events, a relation usually hidden behind the correlation in a traditional sense. Full article
(This article belongs to the Special Issue Transfer Entropy)

Review
The Relation between Granger Causality and Directed Information Theory: A Review
by Pierre-Olivier Amblard and Olivier J. J. Michel
Entropy 2013, 15(1), 113-143; https://doi.org/10.3390/e15010113 - 28 Dec 2012
Cited by 90 | Viewed by 11864
Abstract
This report reviews the conceptual and theoretical links between Granger causality and directed information theory. We begin with a short historical tour of Granger causality, concentrating on its closeness to information theory. The definitions of Granger causality based on prediction are recalled, and the importance of the observation set is discussed. We present the definitions based on conditional independence. The notion of instantaneous coupling is included in the definitions. The concept of Granger causality graphs is discussed. We present directed information theory from the perspective of studies of causal influences between stochastic processes. Causal conditioning appears to be the cornerstone for the relation between information theory and Granger causality. In the bivariate case, the fundamental measure is the directed information, which decomposes as the sum of the transfer entropies and a term quantifying instantaneous coupling. We show the decomposition of the mutual information into the sums of the transfer entropies and the instantaneous coupling measure, a relation known for the linear Gaussian case. We study the multivariate case, showing that the useful decomposition is blurred by instantaneous coupling. The links are further developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks as hypothesis testing. Full article
(This article belongs to the Special Issue Transfer Entropy)
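The bivariate measure at the heart of this review, Massey's directed information, is compact enough to quote here (standard definition; the notation may differ slightly from the paper's):

    I(X^N \to Y^N) \;=\; \sum_{n=1}^{N} I\bigl(X^{n};\, Y_n \mid Y^{n-1}\bigr),

where X^n denotes the block (X_1, ..., X_n). Unlike the mutual information I(X^N; Y^N), each summand conditions only on the past of Y, which makes the quantity directional; as the abstract states, it decomposes as the sum of the transfer entropies and a term quantifying instantaneous coupling.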
