Open Access Article

Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion

Wang, B. 1 and Hu, T. 2,*
1 School of Mathematics and Statistics, South-Central University for Nationalities, Wuhan 430074, China
2 School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China
* Author to whom correspondence should be addressed.
Entropy 2019, 21(7), 644; https://doi.org/10.3390/e21070644
Received: 15 May 2019 / Revised: 14 June 2019 / Accepted: 24 June 2019 / Published: 29 June 2019
(This article belongs to the Special Issue Entropy Based Inference and Optimization in Machine Learning)
PDF [309 KB, uploaded 2 July 2019]

Abstract

In the framework of statistical learning, we study the online gradient descent algorithm generated by correntropy-induced losses in reproducing kernel Hilbert spaces (RKHS). As a generalized correlation measure, correntropy has been widely applied in practice owing to its robustness. Although online gradient descent is an efficient way to pursue the maximum correntropy criterion (MCC) in non-parametric estimation, it has so far lacked a consistency analysis and rigorous error bounds. We provide a theoretical understanding of the online algorithm for MCC and show that, with a suitably chosen scaling parameter, its convergence rate in regression analysis can be minimax optimal up to a logarithmic factor. Our results show that the scaling parameter plays an essential role in both robustness and consistency.
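The algorithm the abstract describes can be illustrated concretely. The correntropy-induced loss with scaling parameter σ is ℓ_σ(r) = σ²(1 − exp(−r²/σ²)) for residual r = y − f(x); for small residuals it behaves like the squared loss, while large (outlier) residuals are downweighted, which is the source of the robustness discussed above. A minimal sketch (not the authors' implementation; class and parameter names are illustrative) of online gradient descent for this loss in an RKHS with a Gaussian kernel:

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel K(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

class OnlineMCC:
    """Online gradient descent for the maximum correntropy criterion in an RKHS.

    The predictor f_t = sum_i alpha_i K(x_i, .) lives in the span of the
    observed inputs; each new sample (x_t, y_t) appends one kernel term.
    """
    def __init__(self, sigma=1.0, gamma=1.0, eta=0.1):
        self.sigma = sigma   # scaling parameter of the correntropy loss
        self.gamma = gamma   # kernel bandwidth parameter
        self.eta = eta       # base step size
        self.points, self.alphas = [], []

    def predict(self, x):
        return sum(a * gaussian_kernel(xi, x, self.gamma)
                   for a, xi in zip(self.alphas, self.points))

    def update(self, x, y, t):
        r = y - self.predict(x)  # residual under the current predictor f_t
        # Derivative (w.r.t. f(x)) of sigma^2 * (1 - exp(-r^2 / sigma^2)):
        grad = -2.0 * r * np.exp(-(r ** 2) / self.sigma ** 2)
        eta_t = self.eta / np.sqrt(t + 1)  # decaying step size
        # Gradient step f_{t+1} = f_t - eta_t * grad * K(x_t, .)
        self.points.append(np.asarray(x, dtype=float))
        self.alphas.append(-eta_t * grad)
```

Note how the scaling parameter σ enters the gradient: as σ → ∞ the update approaches the ordinary least-squares step −2r, while small σ suppresses updates from samples with large residuals, trading efficiency for robustness, consistent with the abstract's claim that σ governs both properties.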
Keywords: correntropy; maximum correntropy criterion; online algorithm; robustness; reproducing kernel Hilbert spaces
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

Wang, B.; Hu, T. Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion. Entropy 2019, 21, 644.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.