
Open Access Article
Entropy 2017, 19(7), 336;

Rate-Distortion Bounds for Kernel-Based Distortion Measures

Department of Computer Science and Engineering, Toyohashi University of Technology, 1-1 Hibarigaoka, Tempaku-cho, Toyohashi, Aichi 441-8580, Japan
This paper is an extended version of my papers published in the Eighth Workshop on Information Theoretic Methods in Science and Engineering, Copenhagen, Denmark, 24–26 June 2015 and the IEEE International Symposium on Information Theory, Aachen, Germany, 25–30 June 2017.
Received: 9 May 2017 / Revised: 16 June 2017 / Accepted: 2 July 2017 / Published: 5 July 2017
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)


Kernel methods have been used to turn linear learning algorithms into nonlinear ones. These nonlinear algorithms measure distances between data points by the distance in the kernel-induced feature space. In lossy data compression, the optimal tradeoff between the number of quantized points and the incurred distortion is characterized by the rate-distortion function. However, the rate-distortion functions associated with distortion measures involving kernel feature mapping have yet to be analyzed. We consider two reconstruction schemes, reconstruction in input space and reconstruction in feature space, and provide bounds on the rate-distortion functions for these schemes. Comparison of the derived bounds to the quantizer performance obtained by the kernel K-means method suggests that the rate-distortion bounds for input space and feature space reconstructions are informative at low and high distortion levels, respectively.
Keywords: kernel methods; rate-distortion function; kernel K-means; preimaging
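As background for the kernel-induced distance the abstract refers to, the sketch below shows how kernel K-means clusters points using only the Gram matrix: the squared feature-space distance to a cluster centroid expands entirely in terms of kernel evaluations, so the feature map never needs to be computed explicitly. This is a minimal illustration, not the paper's implementation; the function names and the farthest-first seeding are assumptions made for this example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_kmeans(K, n_clusters, n_iter=100):
    """Cluster points given only their kernel Gram matrix K.

    Squared feature-space distance from point i to the centroid of cluster C:
        K[i,i] - (2/|C|) * sum_{j in C} K[i,j] + (1/|C|^2) * sum_{j,l in C} K[j,l]
    """
    n = K.shape[0]
    diag = np.diag(K)
    # Deterministic farthest-first seeding (an assumption of this sketch):
    # start from point 0 and repeatedly add the point farthest from the seeds.
    seeds = [0]
    while len(seeds) < n_clusters:
        d = np.min([diag - 2 * K[:, s] + K[s, s] for s in seeds], axis=0)
        seeds.append(int(np.argmax(d)))
    labels = np.argmin([diag - 2 * K[:, s] + K[s, s] for s in seeds], axis=0)
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            m = mask.sum()
            if m == 0:
                continue  # empty cluster: leave its distances at infinity
            dist[:, c] = (diag
                          - 2.0 * K[:, mask].sum(axis=1) / m
                          + K[np.ix_(mask, mask)].sum() / m ** 2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
    return labels
```

The quantizer in the paper's experiments plays a similar role: each cluster centroid acts as a reconstruction point in feature space, and the within-cluster distances give the incurred distortion.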

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Watanabe, K. Rate-Distortion Bounds for Kernel-Based Distortion Measures. Entropy 2017, 19, 336.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland