Special Issue "Advance Methods for the Quantification of Correlations and Causal Relations between Processes"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 31 July 2020.

Special Issue Editors

Prof. Dr. Andrea Murari

Guest Editor
Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), 35127 Padova, Italy
Interests: nuclear fusion; entropy; information theory; machine learning; evolutionary computation; tomography; image processing
Dr. Teddy Craciunescu

Guest Editor
National Institute for Laser, Plasma and Radiation Physics, RO-077125 Magurele-Bucharest, Romania
Interests: computed tomography; image processing; time series analysis; complex networks; data mining; Monte Carlo simulations
Dr. Michela Gelfusa

Guest Editor
Department of Industrial Engineering, University of Rome Tor Vergata, 00133 Rome, Italy
Interests: plasma diagnostics; inverse problems; data mining; time series analysis; genetic programming

Special Issue Information

Dear Colleagues,

Two of the most relevant characteristics of modern societies are their complexity and the huge amounts of data that they produce. Unfortunately, in the investigation of complex systems, large datasets can become a liability instead of an asset if they are not analysed with adequate tools. One of the first steps in the formulation of scientific models and theories is certainly the assessment of the correlations between the quantities potentially involved. A more ambitious goal is determining their actual causal relations and relative strengths. In various domains, performing experiments and interventions to establish direct causal relationships could be unethical, extremely expensive, or even impossible. In the last few years, many efforts have been made to improve the techniques and methodologies for identifying and quantifying the correlations and the causal influences between processes on the basis of time-series and cross-sectional data; they range from causal networks to phase-space reconstructions and information-theoretic tools. For practical applications, the limited number of observations and the noise that inherently accompanies the measurements pose additional challenges.

This Special Issue aims to collect papers that describe new solutions for the above-mentioned problems. The contributions can be based on (but not limited to) the following fields:

  • Information Theory;
  • Network Theory;
  • Statistical Inference;
  • Machine Learning;
  • Neural Computation;
  • Genetic Programming.

Theoretical approaches as well as practical applications are welcome.

Best regards,

Prof. Dr. Andrea Murari
Dr. Teddy Craciunescu
Dr. Michela Gelfusa
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

Open Access Article
On the Potential of Time Delay Neural Networks to Detect Indirect Coupling between Time Series
Entropy 2020, 22(5), 584; https://doi.org/10.3390/e22050584 - 21 May 2020
Abstract
Determining the coupling between systems remains a topic of active research in the field of complex science. Identifying the proper causal influences in time series can already be very challenging in the trivariate case, particularly when the interactions are non-linear. In this paper, the coupling between three Lorenz systems is investigated with the help of specifically designed artificial neural networks, called time delay neural networks (TDNNs). TDNNs can learn from their previous inputs and are therefore well suited to extracting the causal relationships between time series. The performances of the tested TDNNs have always been very positive, showing an excellent capability to identify the correct causal relationships in the absence of significant noise. The first tests on the time localization of the mutual influences and on the effects of Gaussian noise have also provided very encouraging results. Even if further assessments are necessary, networks of the proposed architecture have the potential to be a good complement to the other techniques available for the investigation of mutual influences between time series.
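The intuition behind using lagged inputs to detect directional coupling can be sketched in a few lines. The example below is a deliberately minimal illustration, not the paper's method: it replaces the Lorenz systems with two coupled autoregressive series and the TDNN with a plain linear regression on lagged inputs, so only the Granger-style logic (does the past of X improve the prediction of Y?) is shown. All variable names, coupling strengths, and sample sizes are invented for the sketch.

```python
import numpy as np

# Two coupled AR(1) series stand in for the driver/responder pair:
# X evolves autonomously, while Y is driven by the past of X.
rng = np.random.default_rng(0)
n, lags = 4000, 3
X = np.zeros(n); Y = np.zeros(n)
for t in range(1, n):
    X[t] = 0.8 * X[t - 1] + rng.normal()
    Y[t] = 0.6 * Y[t - 1] + 0.7 * X[t - 1] + rng.normal()

def lagged(v, lags):
    # Matrix whose columns are v delayed by 1..lags steps, aligned with v[lags:].
    return np.column_stack([v[lags - k:len(v) - k] for k in range(1, lags + 1)])

def prediction_gain(source, target, lags=3):
    # MSE of predicting `target` from its own past, divided by the MSE when
    # the past of `source` is added; a ratio well above 1 suggests coupling.
    t = target[lags:]
    own = np.column_stack([lagged(target, lags), np.ones(len(t))])
    both = np.column_stack([own, lagged(source, lags)])
    def mse(A):
        coef, *_ = np.linalg.lstsq(A, t, rcond=None)
        return np.mean((t - A @ coef) ** 2)
    return mse(own) / mse(both)

gain_xy = prediction_gain(X, Y)  # X -> Y: clearly above 1
gain_yx = prediction_gain(Y, X)  # Y -> X: close to 1
```

A TDNN plays the role of the regression here, learning a non-linear map from the same lagged inputs; the asymmetry between the two gains is what identifies the direction of the coupling.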

Open Access Article
Upgrading Model Selection Criteria with Goodness of Fit Tests for Practical Applications
Entropy 2020, 22(4), 447; https://doi.org/10.3390/e22040447 - 15 Apr 2020
Abstract
The Bayesian information criterion (BIC), the Akaike information criterion (AIC), and some other indicators derived from them are widely used for model selection. In their original form, they contain the likelihood of the data given the models. Unfortunately, in many applications, it is practically impossible to calculate the likelihood, and, therefore, the criteria have been reformulated in terms of descriptive statistics of the residual distribution: the variance and the mean-squared error of the residuals. These alternative versions are strictly valid only in the presence of additive noise of Gaussian distribution, an assumption that is not completely satisfactory in many applications in science and engineering. Moreover, the variance and the mean-squared error are quite crude statistics of the residual distribution. More sophisticated statistical indicators, capable of better quantifying how close the residual distribution is to that of the noise, can be profitably used. In particular, specific goodness-of-fit tests have been included in the expressions of the traditional criteria and have proved very effective in improving their discriminating capability. These improved performances have been demonstrated with a systematic series of simulations using synthetic data for various classes of functions and different noise statistics.
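The residual-based reformulation mentioned in the abstract can be sketched as follows, under the additive-Gaussian-noise assumption that makes n·log(MSE) a stand-in for −2·log-likelihood up to a constant. The polynomial test problem, sample size, and noise level are invented for the illustration; the Kolmogorov–Smirnov distance is included only as one possible goodness-of-fit ingredient in the spirit of the paper, not as its exact formulation.

```python
import numpy as np
from math import erf

def aic_bic(residuals, k):
    # Residual-based surrogates: n*log(MSE) replaces -2*log-likelihood,
    # which is exact up to an additive constant for Gaussian noise.
    n = len(residuals)
    mse = np.mean(residuals ** 2)
    return n * np.log(mse) + 2 * k, n * np.log(mse) + k * np.log(n)

def ks_gaussian(residuals):
    # Kolmogorov-Smirnov distance between the standardized residuals and
    # N(0, 1): small values mean the residuals look like pure noise.
    z = np.sort((residuals - residuals.mean()) / residuals.std())
    cdf = np.array([0.5 * (1 + erf(v / np.sqrt(2))) for v in z])
    i = np.arange(1, len(z) + 1)
    return max(np.max(i / len(z) - cdf), np.max(cdf - (i - 1) / len(z)))

# Synthetic example: the true model is a degree-2 polynomial plus noise.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)
y = 1.0 + 2.0 * x - 3.0 * x ** 2 + rng.normal(0, 0.2, x.size)

bic_scores = {}
for deg in range(1, 6):
    res = y - np.polyval(np.polyfit(x, y, deg), x)
    _, bic = aic_bic(res, deg + 1)  # k = number of polynomial coefficients
    bic_scores[deg] = bic
best = min(bic_scores, key=bic_scores.get)
```

The paper's proposal then goes one step further: a statistic such as `ks_gaussian` is folded into the criterion itself, so that models whose residuals deviate from the assumed noise distribution are penalized beyond what the MSE alone reveals.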

Open Access Article
Association Factor for Identifying Linear and Nonlinear Correlations in Noisy Conditions
Entropy 2020, 22(4), 440; https://doi.org/10.3390/e22040440 - 13 Apr 2020
Abstract
Background: In data analysis and machine learning, we often need to identify and quantify the correlation between variables. Although Pearson’s correlation coefficient has been widely used, its value is reliable only for linear relationships, and distance correlation was introduced to address this shortcoming. Methods: Distance correlation can identify linear and nonlinear correlations. However, its performance drops in noisy conditions. In this paper, we introduce the Association Factor (AF) as a robust method for the identification and quantification of linear and nonlinear associations in noisy conditions. Results: To test the performance of the proposed Association Factor, we modeled several simulations of linear and nonlinear relationships under different noise conditions and computed Pearson’s correlation, distance correlation, and the proposed Association Factor. Conclusion: Our results show that the proposed method is robust in two ways. First, it can identify both linear and nonlinear associations. Second, the proposed Association Factor is reliable in both noiseless and noisy conditions.
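Distance correlation, the baseline that the Association Factor is compared against, can be computed in a few lines. The sketch below implements the standard Székely–Rizzo biased sample estimator via double-centred distance matrices; the test relation y = x² is an invented example, chosen so that Pearson’s coefficient is near zero while the dependence is in fact deterministic.

```python
import numpy as np

def distance_correlation(x, y):
    # Biased sample estimator of distance correlation (Székely & Rizzo):
    # double-centre the pairwise distance matrices, then correlate them.
    x = np.asarray(x, float); y = np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()          # squared distance covariance (>= 0)
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = x ** 2                          # purely nonlinear, symmetric relation
pearson = np.corrcoef(x, y)[0, 1]   # near zero: misses the dependence
dcor = distance_correlation(x, y)   # clearly positive: detects it
```

Adding noise to y is what degrades this estimator in practice, which is the gap the proposed Association Factor aims to close.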
