Entropy Approximation in Lossy Source Coding Problem
Abstract: In this paper, we investigate a lossy source coding problem in which an upper limit on the permitted distortion is defined for every element of the dataset. This can be seen as an alternative to classical rate-distortion theory, where a bound is instead placed on the allowed average error. To find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by discarding coding partitions that are irrelevant to the entropy calculation. In our main result, we present a fast, easily implementable greedy algorithm that approximates the entropy within an additive error of log₂ e. The proof is based on the minimum entropy set cover problem, for which a similar bound is known.
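The greedy strategy described in the abstract follows the same pattern as the standard greedy heuristic for minimum entropy set cover: repeatedly assign the remaining probability mass to the admissible set that covers the most of it, and measure the entropy of the resulting partition. The sketch below is an illustrative reconstruction of that pattern, not the authors' exact algorithm; the function name `greedy_entropy_cover` and its interface (a probability map and a family of admissible sets, standing in for distortion-compatible coding partitions) are assumptions for the example.

```python
import math

def greedy_entropy_cover(prob, sets):
    """Greedy heuristic in the style of minimum entropy set cover.

    prob : dict mapping each element to its probability (summing to 1).
    sets : iterable of frozensets; every element must lie in some set
           (these play the role of distortion-admissible code cells).

    Returns the entropy (in bits) of the partition built greedily; for
    minimum entropy set cover this is known to be within an additive
    log2(e) ~ 1.443 bits of the optimum.
    """
    uncovered = set(prob)
    masses = []
    while uncovered:
        # Pick the set covering the largest remaining probability mass.
        best = max(sets, key=lambda s: sum(prob[x] for x in s & uncovered))
        covered = best & uncovered
        mass = sum(prob[x] for x in covered)
        if mass == 0:
            raise ValueError("some elements are not covered by any set")
        masses.append(mass)
        uncovered -= covered
    # Entropy of the induced partition of the probability mass.
    return -sum(m * math.log2(m) for m in masses)
```

For example, with a uniform source over four symbols and admissible cells {a,b}, {c,d}, {a,c}, the greedy procedure yields a two-cell partition of entropy exactly 1 bit.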
Śmieja, M.; Tabor, J. Entropy Approximation in Lossy Source Coding Problem. Entropy 2015, 17, 3400-3418.