
Discovering Higher-Order Interactions Through Neural Information Decomposition

Information Sciences Institute, University of Southern California, Los Angeles, CA 90292, USA
Author to whom correspondence should be addressed.
Entropy 2021, 23(1), 79;
Received: 3 November 2020 / Revised: 21 December 2020 / Accepted: 25 December 2020 / Published: 7 January 2021
(This article belongs to the Special Issue Deep Artificial Neural Networks Meet Information Theory)
If regularity in data takes the form of higher-order functions among groups of variables, models that are biased towards lower-order functions may easily mistake the data for noise. To distinguish whether this is the case, one must be able to quantify the contribution of different orders of dependence to the total information. Recent work in information theory attempts to do this through measures of multivariate mutual information (MMI) and information decomposition (ID). Despite substantial theoretical progress, practical issues related to the tractability and learnability of higher-order functions remain largely unaddressed. In this work, we introduce a new approach to information decomposition—termed Neural Information Decomposition (NID)—which is both theoretically grounded and can be efficiently estimated in practice using neural networks. We show on synthetic data that NID can learn to distinguish higher-order functions from noise, while many unsupervised probability models cannot. Additionally, we demonstrate the usefulness of this framework as a tool for exploring biological and artificial neural networks.
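The abstract's central claim—that lower-order statistics can mistake higher-order structure for noise—can be illustrated with a small discrete example. The sketch below is not the paper's NID estimator; it is a hypothetical exact computation on the classic XOR distribution, where every pairwise mutual information is zero even though the three variables are fully dependent, so only a higher-order measure (here, total correlation) detects the interaction.

```python
# Illustrative sketch (not the paper's method): for X1, X2 ~ Bernoulli(1/2)
# and X3 = X1 XOR X2, every pairwise mutual information is 0 bits, yet the
# joint distribution carries 1 bit of purely third-order dependence.
from collections import Counter
from itertools import product
import math

def entropy(samples):
    """Shannon entropy (bits) of an empirical distribution over samples."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

# Enumerate the XOR distribution exactly: all four outcomes equally likely.
triples = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

def mi(i, j):
    """Pairwise mutual information I(Xi; Xj) = H(Xi) + H(Xj) - H(Xi, Xj)."""
    return (entropy([t[i] for t in triples])
            + entropy([t[j] for t in triples])
            - entropy([(t[i], t[j]) for t in triples]))

# Total correlation TC = sum_i H(Xi) - H(X1, X2, X3): total dependence
# of any order. Here it is entirely attributable to the triple interaction.
tc = sum(entropy([t[i] for t in triples]) for i in range(3)) - entropy(triples)

print(mi(0, 2), mi(1, 2))  # 0.0 0.0 — pairwise views see pure noise
print(tc)                  # 1.0 — one bit of third-order dependence
```

Any model restricted to pairwise statistics would report this data as independent noise, which is exactly the failure mode the decomposition approach is meant to expose.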
Keywords: information theory; information decomposition; neural coding
Figure 1
MDPI and ACS Style

Reing, K.; Ver Steeg, G.; Galstyan, A. Discovering Higher-Order Interactions Through Neural Information Decomposition. Entropy 2021, 23, 79.

