Open Access Article

The Conditional Entropy Bottleneck

Ian Fischer
Google Research, Mountain View, CA 94043, USA
Entropy 2020, 22(9), 999; https://doi.org/10.3390/e22090999
Received: 30 July 2020 / Revised: 25 August 2020 / Accepted: 28 August 2020 / Published: 8 September 2020
(This article belongs to the Special Issue Information Bottleneck: Theory and Applications in Deep Learning)
Much of the field of Machine Learning exhibits a prominent set of failure modes, including vulnerability to adversarial examples, poor out-of-distribution (OoD) detection, miscalibration, and willingness to memorize random labelings of datasets. We characterize these as failures of robust generalization, which extends the traditional measure of generalization as accuracy or related metrics on a held-out set. We hypothesize that these failures to robustly generalize are due to the learning systems retaining too much information about the training data. To test this hypothesis, we propose the Minimum Necessary Information (MNI) criterion for evaluating the quality of a model. In order to train models that perform well with respect to the MNI criterion, we present a new objective function, the Conditional Entropy Bottleneck (CEB), which is closely related to the Information Bottleneck (IB). We experimentally test our hypothesis by comparing the performance of CEB models with deterministic models and Variational Information Bottleneck (VIB) models on a variety of different datasets and robustness challenges. We find strong empirical evidence supporting our hypothesis that MNI models improve on these problems of robust generalization.
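For context, a minimal sketch of the two objectives named above, assuming the standard information-theoretic formulations (the abstract itself does not state them): the MNI criterion asks that a learned representation Z capture exactly the information shared between the input X and the target Y,

\[ I(X;Y) \;=\; I(X;Z) \;=\; I(Y;Z), \]

and the CEB objective drives a model toward that point by minimizing the residual information Z retains about X beyond what is needed to predict Y,

\[ \mathrm{CEB} \;\equiv\; \min_{Z}\; I(X;Z \mid Y) \;-\; I(Y;Z), \]

in contrast to the IB objective, which penalizes the unconditional term I(X;Z).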
Keywords: information theory; information bottleneck; machine learning

MDPI and ACS Style

Fischer, I. The Conditional Entropy Bottleneck. Entropy 2020, 22, 999.

