Causal Discovery of Time Series

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (15 November 2021) | Viewed by 11048

Special Issue Editors


Guest Editor
1. Department of Computational Sciences, Wigner Research Institute for Physics, Konkoly-Thege M. út 29-33, H-1121 Budapest, Hungary
2. Department of Computer Science and Information Theory, Budapest University of Technology and Economics, Budapest, Hungary
3. Department of Quantitative Methods, University of Pannonia, Veszprém, Hungary
Interests: random walks; computational neuroscience; machine learning; artificial intelligence; causality analysis

Guest Editor
Clarkson Center for Complex Systems Science, and Department of Electrical and Computer Engineering, Clarkson University, Potsdam, NY 13699, USA
Interests: dynamical systems; data-driven science and engineering; information theory; ergodic theory; network science; machine learning

Guest Editor
Department of Computational Sciences, Wigner Research Institute for Physics, Konkoly-Thege M. út 29-33, H-1121 Budapest, Hungary
Interests: computational neuroscience; causality analysis; anomaly detection; neurophysiological data analysis

Special Issue Information

Dear Colleagues,

From ancient philosophers to modern scientists across many fields, including mathematics, physics, economics, biology, neuroscience, and the Earth sciences, we are engaged in the fundamental task of revealing causal relationships and interactions. Causal inference must decide whether a relationship is unidirectional or bidirectional, or only apparent, implied solely by a confounder. Learning how a system functions demands methods that distinguish the causal relationships among its components. Some definitions of causality are stated in terms of interventions and observations of the response, from which experimental design and algorithmic inference follow. Other respected notions are free of explicit interventions, including Granger's Nobel prize-winning work, which associates information flow with the discovery of variables that confidently but minimally improve future forecasts of a freely running process. The latter approach is common in the natural sciences, where experimental designs generally cannot include interventions. Recently, causality research has exploded thanks to the big data afforded by abundant sensors, advances in algorithmic and computational methods including machine learning, and the ever-increasing capacity of modern hardware. We are witnessing an era of enormous theoretical advances, methodological innovations, and ever more exciting applications from virtually all scientific fields.
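
As a toy illustration of the Granger-style notion described above (past values of x qualify x as a candidate cause of y if they confidently but minimally improve forecasts of y), the following sketch compares forecast residual variances on a synthetic pair. The lag-1 setup, the function names, and the coefficients are all illustrative assumptions, not any particular paper's method:

```python
import numpy as np

def residual_var(y, X):
    """Least-squares fit of y on the columns of X; return residual variance."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

def granger_ratio(x, y):
    """Granger-style variance ratio for 'x causes y' at lag 1: residual
    variance of an AR(1) forecast of y divided by that of the same model
    augmented with lagged x. Ratios well above 1 indicate that past x
    helps forecast y."""
    Y, Ylag, Xlag = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(Y)
    v_restricted = residual_var(Y, np.column_stack([ones, Ylag]))
    v_full = residual_var(Y, np.column_stack([ones, Ylag, Xlag]))
    return v_restricted / v_full

# toy pair: x drives y with a one-step delay; y does not feed back into x
rng = np.random.default_rng(0)
n = 2_000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_ratio(x, y))   # far above 1: past x forecasts y
print(granger_ratio(y, x))   # close to 1: past y does not forecast x
```

In practice one would turn the variance ratio into a formal F-test and choose the lag order by a criterion such as AIC; the ratio alone conveys the idea.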

Prof. András Telcs
Prof. Erik M. Bollt
Dr. Zoltán Somogyvári
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • causal discovery
  • transfer entropy
  • dynamic systems
  • time delay embedding
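
Among the keywords above, "time delay embedding" refers to Takens-style state-space reconstruction, which underlies many cross-mapping causality methods for dynamical systems. A minimal sketch, where the function name and index convention are illustrative choices:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay embedding of a scalar series x: returns a matrix
    whose row t is the delay vector [x[t], x[t+tau], ..., x[t+(dim-1)*tau]].
    The number of rows is len(x) - (dim - 1) * tau."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau : k * tau + n] for k in range(dim)])

# example: embed 0..9 with embedding dimension 3 and delay 2
E = delay_embed(np.arange(10.0), dim=3, tau=2)
print(E.shape)   # (6, 3)
print(E[0])      # [0. 2. 4.]
```

The embedding dimension and delay are free parameters; standard heuristics (false nearest neighbors for dim, the first minimum of auto mutual information for tau) are used to pick them in applications.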

Published Papers (3 papers)


Research

10 pages, 543 KiB  
Article
Relaxation of Some Confusions about Confounders
by Ádám Zlatniczki, Marcell Stippinger, Zsigmond Benkő, Zoltán Somogyvári and András Telcs
Entropy 2021, 23(11), 1450; https://doi.org/10.3390/e23111450 - 31 Oct 2021
Viewed by 1345
Abstract
This work is about observational causal discovery for deterministic and stochastic dynamic systems. We explore what additional knowledge can be gained from standard conditional independence tests and from knowing whether the interacting systems are located in a geodesic space. Full article
(This article belongs to the Special Issue Causal Discovery of Time Series)

14 pages, 355 KiB  
Article
Normalized Multivariate Time Series Causality Analysis and Causal Graph Reconstruction
by X. San Liang
Entropy 2021, 23(6), 679; https://doi.org/10.3390/e23060679 - 28 May 2021
Cited by 39 | Viewed by 4456
Abstract
Causality analysis is an important problem lying at the heart of science, and is of particular importance in data science and machine learning. An endeavor during the past 16 years to view causality as a real physical notion and formulate it from first principles, however, seems to have gone unnoticed. This study introduces this line of work to the community, with a long-due generalization of the information-flow-based bivariate time series causal inference to multivariate series, based on recent advances in theoretical development. The resulting formula is transparent and can be implemented as a computationally very efficient algorithm for applications. It can be normalized and tested for statistical significance. Unlike previous work along this line, where only information flows are estimated, here an algorithm is also implemented to quantify the influence of a unit on itself. While this poses a challenge in some causal inferences, here it comes naturally, and hence the identification of self-loops in a causal graph is fulfilled automatically as the causalities along edges are inferred. To demonstrate the power of the approach, two applications in extreme situations are presented. The first is a network of multivariate processes buried in heavy noise (with the noise-to-signal ratio exceeding 100), and the second is a network of nearly synchronized chaotic oscillators. Confounding processes exist in both graphs. While reconstructing these causal graphs from the given series seems challenging, a straightforward application of the algorithm immediately reveals the desideratum. In particular, the confounding processes are accurately differentiated. Considering the surge of interest in the community, this study is very timely. Full article
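
The paper's multivariate formula is beyond this page, but the bivariate Liang (2014) maximum-likelihood information flow estimator that it generalizes can be sketched in a few lines. This is an illustrative implementation under stated assumptions (Euler-forward derivative, unit time step, toy linear system), not the paper's own code:

```python
import numpy as np

def liang_T21(x1, x2, dt=1.0):
    """Bivariate Liang (2014) maximum-likelihood information flow T_{2->1},
    the rate at which series x2 transfers information to series x1:
    T_{2->1} = (C11*C12*C2d1 - C12^2*C1d1) / (C11^2*C22 - C11*C12^2),
    with Cij sample covariances and Cid1 the covariance of xi with the
    Euler-forward finite-difference derivative of x1."""
    dx1 = (x1[1:] - x1[:-1]) / dt
    a, b = x1[:-1], x2[:-1]
    C11, C12 = np.cov(a, b)[0]
    C22 = np.cov(a, b)[1, 1]
    C1d1 = np.cov(a, dx1)[0, 1]
    C2d1 = np.cov(b, dx1)[0, 1]
    return (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11**2 * C22 - C11 * C12**2)

# toy system: x2 is an autonomous AR(1); x1 is driven by x2 with a one-step lag
rng = np.random.default_rng(1)
n = 50_000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x2[t] = 0.9 * x2[t - 1] + rng.standard_normal()
    x1[t] = 0.6 * x1[t - 1] + 0.4 * x2[t - 1] + 0.1 * rng.standard_normal()

print(liang_T21(x1, x2))   # clearly nonzero: information flows x2 -> x1
print(liang_T21(x2, x1))   # near zero: no flow x1 -> x2
```

The asymmetry of the two estimates on this toy system is the point: the estimator vanishes (in expectation) in the direction with no coupling, which is what makes normalization and significance testing, as described in the abstract, meaningful.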
(This article belongs to the Special Issue Causal Discovery of Time Series)

22 pages, 3022 KiB  
Article
Information Theoretic Causality Detection between Financial and Sentiment Data
by Roberta Scaramozzino, Paola Cerchiello and Tomaso Aste
Entropy 2021, 23(5), 621; https://doi.org/10.3390/e23050621 - 16 May 2021
Cited by 8 | Viewed by 4401
Abstract
The interaction between the flow of sentiment expressed in blogs and media and the dynamics of stock market prices is analyzed through an information-theoretic measure, the transfer entropy, to quantify causality relations. We analyzed daily stock prices and daily social media sentiment for the top 50 companies in the Standard & Poor's (S&P) index during the period from November 2018 to November 2020. We also analyzed news mentioning these companies during the same period. We found that there is a causal flux of information that links these companies. The largest fraction of significant causal links is between prices and between sentiments, but there is also significant causal information flowing both ways, from sentiment to prices and from prices to sentiment. We observe that the strongest causal signal between sentiment and prices is associated with the Tech sector. Full article
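
Transfer entropy, the measure used in this paper, can be illustrated with a simple plug-in estimator on binned data. The history length of 1, the quantile binning, and the toy driver-response pair below are assumptions for illustration, not the authors' estimator:

```python
import numpy as np

def transfer_entropy(src, tgt, bins=3):
    """Plug-in (binned) estimate of transfer entropy TE_{src->tgt} in nats
    with history length 1, i.e. TE = I(tgt_{t+1}; src_t | tgt_t)."""
    def disc(z):  # quantile binning: an illustrative discretization choice
        edges = np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(z, edges)
    s, g = disc(src), disc(tgt)
    t1, t0, s0 = g[1:], g[:-1], s[:-1]
    joint = np.zeros((bins, bins, bins))      # p(tgt_{t+1}, tgt_t, src_t)
    np.add.at(joint, (t1, t0, s0), 1.0)
    joint /= joint.sum()
    p_t0s0 = joint.sum(axis=0)                # p(tgt_t, src_t)
    p_t1t0 = joint.sum(axis=2)                # p(tgt_{t+1}, tgt_t)
    p_t0 = joint.sum(axis=(0, 2))             # p(tgt_t)
    nz = joint > 0
    a, b, c = np.nonzero(nz)
    return float(np.sum(joint[nz] * np.log(
        joint[nz] * p_t0[b] / (p_t0s0[b, c] * p_t1t0[a, b]))))

# toy pair: src drives tgt with a one-step lag; src itself is white noise
rng = np.random.default_rng(2)
n = 20_000
src = rng.standard_normal(n)
tgt = np.empty(n)
tgt[0] = 0.0
tgt[1:] = src[:-1] + 0.1 * rng.standard_normal(n - 1)

print(transfer_entropy(src, tgt))   # large: src -> tgt
print(transfer_entropy(tgt, src))   # near zero: no tgt -> src
```

On real financial and sentiment series, as in the paper, one would additionally test each estimate for significance (e.g. against surrogates obtained by shuffling the source series), since the plug-in estimator has a small positive bias.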
(This article belongs to the Special Issue Causal Discovery of Time Series)
