Open Access Article
Entropy 2016, 18(8), 295; doi:10.3390/e18080295

Information Theoretical Measures for Achieving Robust Learning Machines

1   Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Mons. Álvaro del Portillo 12455, Las Condes, Santiago 7620001, Chile
2   College of Optical Sciences, University of Arizona, 1630 E. University Blvd., Tucson, AZ 85721, USA
3   Independent, Santiago 7620001, Chile
*   Author to whom correspondence should be addressed.
Academic Editors: Badong Chen and Jose C. Principe
Received: 13 May 2016 / Revised: 6 August 2016 / Accepted: 8 August 2016 / Published: 12 August 2016
(This article belongs to the Special Issue Information Theoretic Learning)

Abstract

Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine toward a solution that is robust to perturbations in its parameters. Full analytic derivations are given and tested with computational examples, which show that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. That this balance takes the form of an analytical relation is also surprising, given the purely numerical operations of the learning machine.
Keywords: information theoretical learning; Shannon entropy; Kullback–Leibler divergence; relative entropy; cross-entropy; Fisher information; relative information
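The balance the abstract points to, between Shannon differential entropy and Fisher information, can be made concrete with the textbook Gaussian case, where both quantities have closed forms and are tied together by the entropy-power (Stam) relation. The sketch below illustrates only that standard relation, not the objective function derived in the paper; the function names and the use of NumPy are assumptions made for the example.

# Minimal sketch (illustrative only): for a Gaussian N(mu, sigma^2) the
# Shannon differential entropy is h = 0.5*ln(2*pi*e*sigma^2) nats and the
# Fisher information of the density about its location is J = 1/sigma^2.
# Stam's inequality states J(X) * N(X) >= 1, where the entropy power is
# N(X) = exp(2*h(X)) / (2*pi*e), with equality exactly for Gaussians;
# the loop below verifies that equality numerically.
import numpy as np

def gaussian_entropy(sigma):
    """Closed-form differential entropy of N(mu, sigma^2), in nats."""
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

def gaussian_fisher(sigma):
    """Fisher information of N(mu, sigma^2) with respect to its mean."""
    return 1.0 / sigma**2

for sigma in (0.5, 1.0, 2.0):
    h = gaussian_entropy(sigma)
    J = gaussian_fisher(sigma)
    entropy_power = np.exp(2.0 * h) / (2.0 * np.pi * np.e)
    print(f"sigma={sigma:.1f}  h={h:.3f} nats  J={J:.3f}  J*N={J * entropy_power:.3f}")

As sigma grows, the entropy rises while the Fisher information falls, which is the kind of trade-off an entropy/Fisher-balanced objective must negotiate.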

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Zegers, P.; Frieden, B.R.; Alarcón, C.; Fuentes, A. Information Theoretical Measures for Achieving Robust Learning Machines. Entropy 2016, 18, 295.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
