The Conditional Entropy Bottleneck
Ian Fischer
Google Research, Mountain View, CA 94043, USA
Entropy 2020, 22(9), 999; https://doi.org/10.3390/e22090999
Received: 30 July 2020 / Revised: 25 August 2020 / Accepted: 28 August 2020 / Published: 8 September 2020
(This article belongs to the Special Issue Information Bottleneck: Theory and Applications in Deep Learning)
Much of the field of Machine Learning exhibits a prominent set of failure modes, including vulnerability to adversarial examples, poor out-of-distribution (OoD) detection, miscalibration, and willingness to memorize random labelings of datasets. We characterize these as failures of robust generalization, which extends the traditional measure of generalization as accuracy or related metrics on a held-out set. We hypothesize that these failures to robustly generalize are due to the learning systems retaining too much information about the training data. To test this hypothesis, we propose the Minimum Necessary Information (MNI) criterion for evaluating the quality of a model. In order to train models that perform well with respect to the MNI criterion, we present a new objective function, the Conditional Entropy Bottleneck (CEB), which is closely related to the Information Bottleneck (IB). We experimentally test our hypothesis by comparing the performance of CEB models with deterministic models and Variational Information Bottleneck (VIB) models on a variety of different datasets and robustness challenges. We find strong empirical evidence supporting our hypothesis that MNI models improve on these problems of robust generalization.
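To make the abstract's terms concrete, here is a brief sketch of the quantities it names, paraphrasing the paper's definitions rather than quoting them. For a representation Z learned from input X to predict target Y, under the Markov chain Z <- X -> Y, the Minimum Necessary Information criterion asks the model to capture exactly the information that X and Y share:

I(X;Y) = I(X;Z) = I(Y;Z).

CEB targets this point by penalizing the residual information the representation keeps about the input beyond what the target explains:

CEB = min_Z [ I(X;Z|Y) - I(Y;Z) ],

where the residual term I(X;Z|Y) vanishes exactly at the MNI point. Under the Markov chain above, I(X;Z|Y) = I(X;Z) - I(Y;Z), so this objective can be read as the Information Bottleneck form min I(X;Z) - beta I(Y;Z) evaluated at beta = 2; the paper also studies weighted variants of this trade-off.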
Keywords: information theory; information bottleneck; machine learning
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Fischer, I. The Conditional Entropy Bottleneck. Entropy 2020, 22, 999. https://doi.org/10.3390/e22090999
AMA Style
Fischer I. The Conditional Entropy Bottleneck. Entropy. 2020; 22(9):999. https://doi.org/10.3390/e22090999
Chicago/Turabian Style
Fischer, Ian. 2020. "The Conditional Entropy Bottleneck" Entropy 22, no. 9: 999. https://doi.org/10.3390/e22090999