Open Access Article

Utilizing Information Bottleneck to Evaluate the Capability of Deep Neural Networks for Image Classification

1 Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences, Shanghai 200050, China
2 University of Chinese Academy of Sciences, Beijing 100049, China
3 School of Information Science and Technology, ShanghaiTech University, Shanghai 201210, China
4 State Key Laboratory of ISN, Xidian University, Xi’an 710071, China
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in the 15th European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018.
Entropy 2019, 21(5), 456; https://doi.org/10.3390/e21050456
Received: 10 February 2019 / Revised: 12 April 2019 / Accepted: 28 April 2019 / Published: 1 May 2019

Abstract

Inspired by the pioneering work on the information bottleneck (IB) principle for the analysis of Deep Neural Networks (DNNs), we thoroughly study the relationship among the model accuracy, I(X;T), and I(T;Y), where I(X;T) and I(T;Y) are the mutual information of the DNN’s output T with the input X and the label Y, respectively. We then design an information plane-based framework to evaluate the capability of DNNs (including CNNs) for image classification. Instead of each hidden layer’s output, our framework focuses on the model output T. We successfully apply our framework to many application scenarios arising in deep learning and image classification, such as image classification with unbalanced data distributions, model selection, and transfer learning. The experimental results verify the effectiveness of the information plane-based framework: it may facilitate quick model selection and determine the number of samples needed for each class in the unbalanced classification problem. Furthermore, the framework explains the efficiency of transfer learning in the deep learning area.
Keywords: information bottleneck; mutual information; neural networks; image classification
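
To make the quantities in the abstract concrete, the following is a minimal sketch of how a single information-plane point (I(X;T), I(T;Y)) for a classifier's output T could be estimated. It assumes a simple discretization (binning) estimator of mutual information, as is common in information-plane analyses; the paper's own estimator may differ, and all names (n_bins, information_plane_point, etc.) are illustrative rather than taken from the paper.

```python
# Sketch of an information-plane measurement for a classifier's output T.
# Assumption (not from the paper): mutual information is estimated by
# discretizing the softmax output into bins.
import numpy as np

def discrete_mutual_information(a, b):
    """I(A;B) in nats for two 1-D sequences of discrete symbols."""
    n = len(a)
    joint, pa, pb = {}, {}, {}
    for x, y in zip(a, b):
        joint[(x, y)] = joint.get((x, y), 0) + 1
        pa[x] = pa.get(x, 0) + 1
        pb[y] = pb.get(y, 0) + 1
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts converted to probabilities
        mi += (c / n) * np.log(c * n / (pa[x] * pb[y]))
    return mi

def information_plane_point(outputs, labels, n_bins=30):
    """Return (I(X;T), I(T;Y)) with the output T discretized by binning."""
    # Bin each softmax output vector into a discrete symbol.
    edges = np.linspace(0.0, 1.0, n_bins)
    t_ids = [hash(tuple(np.digitize(row, edges))) for row in outputs]
    # Treat each distinct input as its own symbol, so for a deterministic
    # network I(X;T) reduces to the entropy H(T) of the binned output.
    x_ids = np.arange(len(outputs))
    return (discrete_mutual_information(x_ids, t_ids),
            discrete_mutual_information(t_ids, labels))
```

For a trained classifier, calling information_plane_point(model.predict(X_test), y_test) would give one point on the information plane; comparing such points across models, training-set sizes, or fine-tuning settings is the kind of comparison an information plane-based evaluation performs.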
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Cheng, H.; Lian, D.; Gao, S.; Geng, Y. Utilizing Information Bottleneck to Evaluate the Capability of Deep Neural Networks for Image Classification. Entropy 2019, 21, 456.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
