Open Access Article
Entropy 2016, 18(7), 251; doi:10.3390/e18070251

Maximum Entropy Learning with Deep Belief Networks

Research Center for Information Technology Innovation, Academia Sinica, Taipei 11529, Taiwan
* Author to whom correspondence should be addressed.
Academic Editors: Badong Chen and Jose C. Principe
Received: 30 April 2016 / Revised: 21 June 2016 / Accepted: 30 June 2016 / Published: 8 July 2016
(This article belongs to the Special Issue Information Theoretic Learning)

Abstract

Conventionally, the maximum likelihood (ML) criterion is applied to train a deep belief network (DBN). We present a maximum entropy (ME) learning algorithm for DBNs, designed specifically to handle limited training data. Compared with ML learning, maximizing the entropy of the parameters in the DBN yields better generalization, less bias towards the training data distribution, and greater robustness to over-fitting. Results on text classification and object recognition tasks demonstrate that the ME-trained DBN outperforms the ML-trained DBN when training data is limited.
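To make the idea concrete, the sketch below shows one possible way an entropy term on the parameters could be combined with standard DBN pre-training. It is an illustrative interpretation only, not the paper's algorithm: it trains a single restricted Boltzmann machine layer with CD-1 and adds the gradient of the entropy of a softmax over the flattened weights as a regularizer. The RBM class, the entropy_grad formulation, and the ent_weight hyperparameter are assumptions introduced here for illustration.

# Illustrative sketch (assumptions noted above): RBM layer trained with
# CD-1 plus an entropy-ascent term on the weights. The entropy of a
# softmax over the weights is one hypothetical choice of "parameter
# entropy"; the paper's exact objective may differ.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias

    def entropy_grad(self):
        # Entropy H of a softmax distribution p over the flattened weights.
        # dH/dw_j = -p_j * (log p_j + H); ascending this gradient pushes
        # the weights toward a flatter, higher-entropy configuration.
        w = self.W.ravel()
        p = np.exp(w - w.max())
        p /= p.sum()
        H = -np.sum(p * np.log(p + 1e-12))
        grad = -p * (np.log(p + 1e-12) + H)
        return grad.reshape(self.W.shape)

    def cd1_update(self, v0, lr=0.05, ent_weight=0.01):
        # One CD-1 step: positive phase on the data, one Gibbs step for
        # the negative phase, then add the entropy gradient to the
        # weight update (ent_weight is a hypothetical trade-off knob).
        h0_prob = sigmoid(v0 @ self.W + self.c)
        h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
        v1_prob = sigmoid(h0 @ self.W.T + self.b)
        h1_prob = sigmoid(v1_prob @ self.W + self.c)

        dW = v0.T @ h0_prob - v1_prob.T @ h1_prob
        self.W += lr * (dW / len(v0) + ent_weight * self.entropy_grad())
        self.b += lr * (v0 - v1_prob).mean(axis=0)
        self.c += lr * (h0_prob - h1_prob).mean(axis=0)

# Toy usage: fit the RBM to random binary data.
X = (rng.random((64, 20)) < 0.3).astype(float)
rbm = RBM(n_visible=20, n_hidden=10)
for epoch in range(10):
    rbm.cd1_update(X)

Setting ent_weight to zero recovers plain CD-1 (an ML-style update), which is the natural baseline for comparing against the entropy-regularized variant.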
Keywords: maximum entropy; machine learning; deep learning; deep belief networks; restricted Boltzmann machine; deep neural networks; low-resource tasks

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Lin, P.; Fu, S.-W.; Wang, S.-S.; Lai, Y.-H.; Tsao, Y. Maximum Entropy Learning with Deep Belief Networks. Entropy 2016, 18, 251.


