Open Access Article

Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques

by Andrea Murari, Riccardo Rossi, Michele Lungaroni, Pasquale Gaudio and Michela Gelfusa

1 Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Corso Stati Uniti 4, 35127 Padova, Italy
2 Associazione EURATOM-ENEA, University of Rome “Tor Vergata”, 00133 Rome, Italy
* Author to whom correspondence should be addressed.
† These authors contributed equally to the work.
Entropy 2020, 22(2), 141; https://doi.org/10.3390/e22020141
Received: 4 December 2019 / Revised: 22 January 2020 / Accepted: 23 January 2020 / Published: 24 January 2020
The increasingly sophisticated investigation of complex systems requires more robust estimates of the correlations between the measured quantities. The traditional Pearson correlation coefficient is easy to calculate but is sensitive only to linear correlations. The total influence between quantities is therefore often expressed in terms of the mutual information, which also takes nonlinear effects into account but is not normalized. To compare data from different experiments, the information quality ratio is therefore, in many cases, easier to interpret. On the other hand, both the mutual information and the information quality ratio are always positive and therefore cannot provide information about the sign of the influence between quantities. Moreover, they require an accurate determination of the probability distribution functions of the variables involved. Since the quality and amount of data available are not always sufficient to guarantee an accurate estimation of the probability distribution functions, it has been investigated whether neural computational tools can help and complement the aforementioned indicators. Specific encoders and autoencoders have been developed for the task of determining the total correlation between quantities related by a functional dependence, including information about the sign of their mutual influence. Both their accuracy and computational efficiency have been assessed in detail, with extensive numerical tests using synthetic data. A careful analysis of the robustness against noise has also been performed. The neural computational tools typically outperform the traditional indicators in practically every respect.
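To make the classical indicators named in the abstract concrete, the sketch below (not taken from the paper; a minimal histogram-based illustration in Python, where the bin count, sample size and the quadratic test function are arbitrary choices) estimates the Pearson coefficient, the mutual information and the information quality ratio, the latter taken as the mutual information normalised by the joint entropy, IQR(X,Y) = I(X;Y)/H(X,Y).

```python
import numpy as np

def hist_probs(x, y, bins=32):
    """Joint and marginal probabilities from a 2-D histogram estimate."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)   # marginal of X (sum over Y bins)
    py = pxy.sum(axis=0)   # marginal of Y (sum over X bins)
    return pxy, px, py

def entropy(p):
    """Shannon entropy in bits, ignoring empty histogram cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=32):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) from the histogram estimate."""
    pxy, px, py = hist_probs(x, y, bins)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

def information_quality_ratio(x, y, bins=32):
    """IQR = I(X;Y) / H(X,Y): mutual information normalised by the joint entropy."""
    pxy, px, py = hist_probs(x, y, bins)
    h_xy = entropy(pxy.ravel())
    mi = entropy(px) + entropy(py) - h_xy
    return mi / h_xy

# Synthetic example: a noisy quadratic dependence that Pearson cannot see.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20_000)
y = x**2 + 0.05 * rng.normal(size=x.size)

print("Pearson r :", np.corrcoef(x, y)[0, 1])          # close to 0
print("MI (bits) :", mutual_information(x, y))          # clearly > 0
print("IQR       :", information_quality_ratio(x, y))   # bounded in [0, 1]
```

For this symmetric, zero-mean dependence the Pearson coefficient is close to zero, while the mutual information and the information quality ratio are clearly nonzero; neither of the latter, however, carries the sign of the influence, which is exactly the limitation the encoders and autoencoders proposed in the paper are meant to address.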
Keywords: machine learning tools; information theory; information quality ratio; total correlations; encoders; autoencoders

MDPI and ACS Style

Murari, A.; Rossi, R.; Lungaroni, M.; Gaudio, P.; Gelfusa, M. Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques. Entropy 2020, 22, 141. https://doi.org/10.3390/e22020141

AMA Style

Murari A, Rossi R, Lungaroni M, Gaudio P, Gelfusa M. Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques. Entropy. 2020; 22(2):141. https://doi.org/10.3390/e22020141

Chicago/Turabian Style

Murari, Andrea, Riccardo Rossi, Michele Lungaroni, Pasquale Gaudio, and Michela Gelfusa. 2020. "Quantifying Total Influence between Variables with Information Theoretic and Machine Learning Techniques" Entropy 22, no. 2: 141. https://doi.org/10.3390/e22020141

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
