Open Access Article

Variational Information Bottleneck for Semi-Supervised Classification

1 Department of Computer Science, University of Geneva, 1227 Carouge, Switzerland
2 DeepMind, London N1C 4AG, UK
* Author to whom correspondence should be addressed.
Entropy 2020, 22(9), 943; https://doi.org/10.3390/e22090943
Received: 22 July 2020 / Revised: 24 August 2020 / Accepted: 24 August 2020 / Published: 27 August 2020
(This article belongs to the Special Issue Information Bottleneck: Theory and Applications in Deep Learning)
In this paper, we consider an information bottleneck (IB) framework for semi-supervised classification with several families of priors on the latent space representation. We apply a variational decomposition of the mutual information terms of the IB. Using this decomposition, we analyze several regularizers and demonstrate in practice the impact of the different components of the variational model on classification accuracy. We propose a new formulation of semi-supervised IB with hand-crafted and learnable priors and link it to previous methods such as the semi-supervised versions of the VAE (M1 + M2), the AAE, and CatGAN. We show that the resulting model allows a better understanding of the role of various previously proposed regularizers in the semi-supervised classification task in the light of the IB framework. The proposed semi-supervised IB model with hand-crafted and learnable priors is experimentally validated on MNIST under different amounts of labeled data.
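The variational decomposition referenced in the abstract can be sketched along the lines of the standard variational IB bound (in the spirit of Alemi et al.'s VIB); the paper's exact decomposition, priors, and notation may differ:

```latex
% IB Lagrangian: trade off prediction I(Z;Y) against compression I(Z;X)
\mathcal{L}_{\mathrm{IB}} = I(Z;Y) - \beta\, I(Z;X)

% Variational lower bound on I(Z;Y), with encoder q_\phi(z|x)
% and decoder q_\theta(y|z):
I(Z;Y) \ge \mathbb{E}_{p(x,y)}\,
  \mathbb{E}_{q_\phi(z|x)}\!\big[\log q_\theta(y|z)\big] + H(Y)

% Variational upper bound on I(Z;X), with a prior p(z)
% (hand-crafted, e.g. standard Gaussian, or learnable):
I(Z;X) \le \mathbb{E}_{p(x)}\,
  D_{\mathrm{KL}}\!\big(q_\phi(z|x)\,\|\,p(z)\big)
```

On unlabeled samples the label term is unavailable, so only the KL regularizer toward the prior acts on them; this is one natural way such a bound extends to the semi-supervised setting.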
Keywords: information bottleneck principle; deep networks; semi-supervised classification; latent space representation; hand-crafted priors; learnable priors; regularization
Figure 1
MDPI and ACS Style

Voloshynovskiy, S.; Taran, O.; Kondah, M.; Holotyak, T.; Rezende, D. Variational Information Bottleneck for Semi-Supervised Classification. Entropy 2020, 22, 943.

AMA Style

Voloshynovskiy S, Taran O, Kondah M, Holotyak T, Rezende D. Variational Information Bottleneck for Semi-Supervised Classification. Entropy. 2020; 22(9):943.

Chicago/Turabian Style

Voloshynovskiy, Slava, Olga Taran, Mouad Kondah, Taras Holotyak, and Danilo Rezende. 2020. "Variational Information Bottleneck for Semi-Supervised Classification" Entropy 22, no. 9: 943.
