Open Access Article
Entropy 2017, 19(2), 83; doi:10.3390/e19020083

Breakdown Point of Robust Support Vector Machines

1. Department of Computer Science and Mathematical Informatics, Nagoya University, Nagoya 464-8601, Japan
2. TOPGATE Co. Ltd., Bunkyo-ku, Tokyo 113-0033, Japan
3. Institute of Statistical Mathematics, Tokyo 190-8562, Japan
4. RIKEN Center for Advanced Intelligence Project, Tokyo 103-0027, Japan
* Author to whom correspondence should be addressed.
Received: 15 January 2017 / Accepted: 16 February 2017 / Published: 21 February 2017

Abstract

The support vector machine (SVM) is one of the most successful learning methods for solving classification problems. Despite its popularity, the SVM has a serious drawback: it is sensitive to outliers in the training samples. The penalty on misclassification is defined by a convex loss called the hinge loss, and the unboundedness of this convex loss causes the sensitivity to outliers. To deal with outliers, robust SVMs have been proposed in which the convex loss is replaced by a non-convex, bounded loss called the ramp loss. In this paper, we study the breakdown point of robust SVMs. The breakdown point is a robustness measure defined as the largest fraction of contamination under which the estimated classifier still conveys information about the non-contaminated data. The main contribution of this paper is an exact evaluation of the breakdown point of robust SVMs. For learning parameters such as the regularization parameter, we derive a simple formula that guarantees the robustness of the classifier. When the learning parameters are determined by a grid search with cross-validation, our formula serves to reduce the number of candidate search points. Furthermore, the theoretical findings are confirmed in numerical experiments. We show that the statistical properties of robust SVMs are well explained by the theoretical analysis of the breakdown point.
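The contrast between the unbounded hinge loss and the bounded ramp loss described in the abstract can be illustrated with a minimal numerical sketch. This is not the authors' code; it is an illustrative comparison assuming the standard definitions, with the ramp loss written as the hinge loss truncated at level 1 − s (here s = −1, a common choice):

```python
import numpy as np

def hinge_loss(margin):
    # Convex and unbounded: a single outlier with a very negative
    # margin can dominate the training objective.
    return np.maximum(0.0, 1.0 - margin)

def ramp_loss(margin, s=-1.0):
    # Non-convex, bounded truncation of the hinge loss at 1 - s,
    # which caps the influence of any single outlier.
    return np.minimum(hinge_loss(margin), 1.0 - s)

# Margins y * f(x) for three points; the last acts like an outlier.
margins = np.array([2.0, 0.5, -10.0])
print(hinge_loss(margins))  # outlier contributes 11.0
print(ramp_loss(margins))   # outlier contribution capped at 2.0
```

The capped contribution is what bounds the influence of any contaminated sample and makes a non-trivial breakdown point possible; with the hinge loss, a single sufficiently extreme outlier can move the estimate arbitrarily.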
Keywords: support vector machine; breakdown point; outlier; kernel function

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Kanamori, T.; Fujiwara, S.; Takeda, A. Breakdown Point of Robust Support Vector Machines. Entropy 2017, 19, 83.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.