Open Access Article

A Comparison of Variational Bounds for the Information Bottleneck Functional

by Bernhard C. Geiger 1,* and Ian S. Fischer 2
1 Know-Center GmbH, Inffeldgasse 13/6, 8010 Graz, Austria
2 Google Research, Mountain View, CA 94043, USA
* Author to whom correspondence should be addressed.
Entropy 2020, 22(11), 1229; https://doi.org/10.3390/e22111229
Received: 24 September 2020 / Revised: 19 October 2020 / Accepted: 20 October 2020 / Published: 29 October 2020
(This article belongs to the Special Issue Information Bottleneck: Theory and Applications in Deep Learning)
Abstract: In this short note, we relate the variational bounds proposed in Alemi et al. (2017) and Fischer (2020) for the information bottleneck (IB) functional and the conditional entropy bottleneck (CEB) functional, respectively. Although the two functionals were shown to be equivalent, it has been observed empirically that optimizing bounds on the CEB functional achieves better generalization performance and adversarial robustness than optimizing bounds on the IB functional. This work sheds light on this issue by showing that, in the most general setting, no ordering can be established between these variational bounds, whereas such an ordering can be enforced by restricting the feasible sets over which the optimizations take place. The absence of such an ordering in the general setup suggests that the variational bound on the CEB functional is either more amenable to optimization or a relevant cost function in its own right, i.e., without justification from the IB or CEB functionals.
Keywords: information bottleneck; deep learning; neural networks
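
For readers unfamiliar with the two objectives named in the abstract, the following is a brief sketch, in conventional notation, of the IB and CEB functionals and the standard variational bounds that are typically optimized in their place. This summary is not quoted from the paper: the Markov chain Y - X - Z, the trade-off parameters beta and gamma, and the variational distributions q(z), q(y|z), and b(z|y) are notational assumptions made here for illustration.

% IB functional (Tishby et al.), minimized over stochastic encoders p(z|x):
\[
  \mathcal{L}_{\mathrm{IB}} \;=\; I(X;Z) \;-\; \beta\, I(Y;Z).
\]
% CEB functional (Fischer, 2020); under the Markov chain Y - X - Z one has
% I(X;Z|Y) = I(X;Z) - I(Y;Z), which gives the equivalence mentioned in the abstract:
\[
  \mathcal{L}_{\mathrm{CEB}} \;=\; I(X;Z\mid Y) \;-\; \gamma\, I(Y;Z)
  \;=\; I(X;Z) \;-\; (1+\gamma)\, I(Y;Z).
\]
% Standard variational bounds on the individual terms, valid for any variational
% marginal q(z), decoder q(y|z), and backward encoder b(z|y):
\[
  I(X;Z) \;\le\; \mathbb{E}_{p(x)}\!\left[ D_{\mathrm{KL}}\!\big(p(z\mid x)\,\|\,q(z)\big) \right],
\]
\[
  I(X;Z\mid Y) \;\le\; \mathbb{E}_{p(x,y)}\!\left[ D_{\mathrm{KL}}\!\big(p(z\mid x)\,\|\,b(z\mid y)\big) \right],
\]
\[
  I(Y;Z) \;\ge\; H(Y) \;+\; \mathbb{E}_{p(x,y)\,p(z\mid x)}\!\left[ \log q(y\mid z) \right].
\]
% Combining the upper bound on the compression term with the lower bound on the
% prediction term yields the variational IB and variational CEB training
% objectives whose relationship the paper analyzes.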