Special Issue "Information Transfer, Entropy Production, Irreversibility and Time Series Analysis"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: 16 February 2020.

Special Issue Editor

Dr. Milan Paluš
Guest Editor
Institute of Computer Science, Czech Academy of Sciences, Pod vodarenskou vezi 2, 182 07 Praha 8, Czech Republic
Interests: complex systems; nonlinear dynamics; theory of deterministic chaos; synchronization; causality

Special Issue Information

Dear Colleagues,

Time series record the time evolution of systems or processes in nature or society and are sources of important information about system states, transitions between them, interactions among system components, and the physical mechanisms underlying observed phenomena. Time series analysis is an established scientific discipline, traditionally rooted in the theory of stochastic processes. New avenues in time series analysis have also opened up, inspired by nonlinear dynamical systems, the theory of deterministic chaos, and statistical physics. These different approaches, however, are becoming unified in their use of the notions of information, entropy, entropy rate, entropy production, and information transfer. Ideas and tools from information theory have become an important part of time series analysis and related interdisciplinary research in a growing number of scientific disciplines.

The focus of this Special Issue is theoretical development as well as interesting applications of methods for estimating information transfer, causality, entropy production, and irreversibility from time series recorded in complex systems. We anticipate theoretical developments accompanied by explanatory examples using either simulated or experimental data, as well as interdisciplinary applications that uncover new phenomena or shed new light on known events in different scientific fields.

Dr. Milan Paluš
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • time series
  • complex systems
  • entropy rate
  • entropy production
  • irreversibility
  • information transfer
  • causality
  • nonlinear dynamics
  • multiscale processes

Published Papers (3 papers)


Research

Open Access Article
Residual Predictive Information Flow in the Tight Coupling Limit: Analytic Insights from a Minimalistic Model
Entropy 2019, 21(10), 1010; https://doi.org/10.3390/e21101010 - 17 Oct 2019
Abstract
In a coupled system, predictive information flows from the causing to the caused variable. The amount of transferred predictive information can be quantified through transfer entropy or, for Gaussian variables, equivalently via Granger causality. It is natural to expect, and has been repeatedly observed, that a tight coupling does not permit reconstruction of a causal connection between the causing and caused variables. Here, we show that for a model of interacting social groups, carried from the master equation to the Fokker–Planck level, a residual predictive information flow can remain for a pair of uni-directionally coupled variables even in the limit of infinite coupling strength. We trace this phenomenon back to the question of how the synchronizing force and the noise strength scale with the coupling strength. A simplified model description allows us to derive analytic expressions that fully elucidate the interplay between the deterministic and stochastic parts of the model.
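The Gaussian equivalence mentioned in this abstract can be illustrated numerically. The following minimal sketch (not the paper's social-group model; the coupled AR(1) pair and all parameters are illustrative) estimates Granger causality from residual variances and converts it to a transfer entropy via the standard Gaussian identity:

```python
import numpy as np

# Illustrative sketch: for jointly Gaussian processes, transfer entropy
# equals half the Granger causality (Barnett, Barrett & Seth, 2009).
rng = np.random.default_rng(0)
n = 200_000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()  # x drives y

# Granger causality x -> y: log ratio of residual variances of y_t
# regressed on its own past, without vs. with x's past included.
Y, Yp, Xp = y[1:], y[:-1], x[:-1]

def resid_var(target, *regressors):
    A = np.column_stack(regressors)
    beta, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ beta)

gc = np.log(resid_var(Y, Yp) / resid_var(Y, Yp, Xp))
te = 0.5 * gc  # Gaussian transfer entropy, in nats
print(f"Granger causality: {gc:.3f}, transfer entropy: {te:.3f}")
```

Since x genuinely drives y here, the restricted regression has a larger residual variance and both quantities come out positive; for the reverse direction they would vanish up to estimation noise.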

Open Access Article
Correlation Dimension Detects Causal Links in Coupled Dynamical Systems
Entropy 2019, 21(9), 818; https://doi.org/10.3390/e21090818 - 21 Aug 2019
Abstract
It is becoming increasingly clear that causal analysis of dynamical systems requires different approaches than, for example, causal analysis of interconnected autoregressive processes. In this study, a correlation dimension estimated in reconstructed state spaces is used to detect causality. If deterministic dynamics plays a dominant role in the data, then the method based on the correlation dimension can serve as a fast and reliable way to reveal causal relationships between and within systems. This study demonstrates that the method, unlike most other causal approaches, detects causality well even for very weak links. It can also identify cases of uncoupled systems that are causally affected by a hidden common driver.
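The quantity at the heart of this paper, the correlation dimension, can be sketched in a few lines. The following is an illustrative Grassberger–Procaccia-style estimate on a chaotic logistic-map series, not the paper's causality method or implementation; embedding parameters and radii are arbitrary choices for the example:

```python
import numpy as np

# Grassberger-Procaccia correlation sum: C(r) is the fraction of
# embedded point pairs closer than r; the slope of log C(r) vs. log r
# estimates the correlation dimension.
def pairwise_distances(series, dim, delay):
    n = len(series) - (dim - 1) * delay
    emb = np.column_stack([series[i * delay : i * delay + n]
                           for i in range(dim)])  # delay embedding
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return d[np.triu_indices(n, k=1)]  # distinct pairs only

# Logistic map in the chaotic regime as a test signal; its attractor
# is (close to) one-dimensional.
x = np.empty(2000)
x[0] = 0.4
for t in range(1, len(x)):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

dists = pairwise_distances(x, dim=2, delay=1)
radii = np.logspace(-2.5, -1.0, 8)
c = np.array([np.mean(dists < r) for r in radii])

slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
print(f"estimated correlation dimension ~ {slope:.2f}")
```

For real data one would check convergence of the slope over increasing embedding dimensions and restrict the fit to a proper scaling region; the paper's contribution is to turn such state-space estimates into a causality test.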

Open Access Article
Information Thermodynamics for Time Series of Signal-Response Models
Entropy 2019, 21(2), 177; https://doi.org/10.3390/e21020177 - 14 Feb 2019
Cited by 3
Abstract
The entropy production in stochastic dynamical systems is linked to the structure of their causal representation in terms of Bayesian networks. Such a connection was formalized for bipartite (or multipartite) systems with an integral fluctuation theorem in [Phys. Rev. Lett. 111, 180603 (2013)]. Here we introduce information thermodynamics for time series, which are non-bipartite in general, and we show that the link between irreversibility and information can only result from an incomplete causal representation. In particular, we consider a backward transfer entropy lower bound on the conditional time-series irreversibility that is induced by the absence of feedback in signal-response models. We study this relation in a linear signal-response model, which admits analytical solutions, and in a nonlinear biological model of receptor-ligand systems, where the time-series irreversibility measures the signaling efficiency.
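As a loose, self-contained illustration of time-series irreversibility (not the paper's backward-transfer-entropy bound or its signal-response models), one can compare pair statistics of a coarse-grained series with those of its time reversal; the symbolization, pseudocounts, and test processes below are all choices made for this example:

```python
import numpy as np

def irreversibility(series):
    """KL divergence between (s_t, s_{t+1}) pair frequencies of a
    3-symbol coarse-graining and those of the time-reversed series.
    Vanishes (up to sampling noise) for a reversible process."""
    # Coarse-grain into three roughly equiprobable symbols via terciles.
    s = np.digitize(series, np.quantile(series, [1 / 3, 2 / 3]))
    counts = np.full((3, 3), 0.5)          # pseudocounts avoid log(0)
    for a, b in zip(s[:-1], s[1:]):
        counts[a, b] += 1
    fwd = counts / counts.sum()
    bwd = fwd.T                            # time reversal swaps pair order
    return np.sum(fwd * np.log(fwd / bwd))

rng = np.random.default_rng(1)

# Stationary Gaussian AR(1): statistically time-reversible.
ar = np.zeros(50_000)
for t in range(1, len(ar)):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()

# Slow upward ramp with wrap-around: strongly time-irreversible.
saw = np.cumsum(rng.uniform(0.05, 0.25, 50_000)) % 1.0

print(f"AR(1): {irreversibility(ar):.4f}, ramp: {irreversibility(saw):.4f}")
```

The reversible AR(1) series yields a value near zero, while the ramp, whose wrap-around jumps only ever occur in one direction, yields a clearly positive one.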
