Entropy 2015, 17(5), 3400-3418; doi:10.3390/e17053400
Entropy Approximation in Lossy Source Coding Problem
Department of Mathematics and Computer Science, Jagiellonian University, Łojasiewicza 6, 30-348 Kraków, Poland
* Author to whom correspondence should be addressed.
Academic Editor: Raúl Alcaraz Martínez
Received: 26 March 2015 / Revised: 11 May 2015 / Accepted: 12 May 2015 / Published: 18 May 2015
(This article belongs to the Section Information Theory)
Abstract
In this paper, we investigate a lossy source coding problem in which an upper limit on the permitted distortion is defined for every dataset element. This can be seen as an alternative to rate distortion theory, where a bound on the allowed average error is specified instead. To find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by discarding coding partitions that are irrelevant to the entropy calculation. In our main result, we present a fast, easily implemented greedy algorithm that approximates the entropy within an additive error term of log2 e. The proof is based on the minimum entropy set cover problem, for which a similar bound was obtained.
Keywords: Shannon entropy; entropy approximation; minimum entropy set cover; lossy compression; source coding
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
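The abstract's greedy algorithm builds on the classical greedy heuristic for minimum entropy set cover: repeatedly assign as many still-uncovered elements as possible to a single covering set, then measure the Shannon entropy of the resulting partition. The following minimal sketch illustrates that heuristic; the function names and sample data are our own for illustration, not taken from the paper:

```python
import math
from collections import Counter

def greedy_partition(universe, sets):
    """Greedily assign each element of `universe` to one of the covering `sets`.

    sets: dict mapping a set label to a set of elements. At each step the
    label covering the most uncovered elements is chosen (the standard
    greedy heuristic for minimum entropy set cover).
    """
    uncovered = set(universe)
    assignment = {}
    while uncovered:
        # Pick the set that covers the largest number of uncovered elements.
        best = max(sets, key=lambda label: len(sets[label] & uncovered))
        newly_covered = sets[best] & uncovered
        for element in newly_covered:
            assignment[element] = best
        uncovered -= newly_covered
    return assignment

def partition_entropy(assignment):
    """Shannon entropy (in bits) of the partition induced by `assignment`."""
    n = len(assignment)
    counts = Counter(assignment.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

For example, with universe {1, ..., 6} and covering sets A = {1, 2, 3, 4}, B = {3, 4, 5}, C = {5, 6}, the greedy rule picks A first and then C, giving a two-block partition of sizes 4 and 2 with entropy H(2/3, 1/3) ≈ 0.918 bits. The abstract's result guarantees that the entropy of such a greedy partition exceeds the optimum by at most an additive log2 e ≈ 1.443 bits.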