Open Access Article
Entropy 2019, 21(2), 158; https://doi.org/10.3390/e21020158

Universality of Logarithmic Loss in Successive Refinement

Department of Electronic and Electrical Engineering, Hongik University, Seoul 04066, Korea
This paper is an extended version of our paper published in the 2015 IEEE International Symposium on Information Theory (ISIT), Hong Kong, China, 14–19 June 2015.
Received: 19 December 2018 / Revised: 1 February 2019 / Accepted: 7 February 2019 / Published: 8 February 2019
(This article belongs to the Special Issue Multiuser Information Theory II)

Abstract

We establish a universal property of logarithmic loss in the successive refinement problem: if the first decoder operates under logarithmic loss, then any discrete memoryless source is successively refinable under an arbitrary distortion criterion for the second decoder. Based on this result, we propose a low-complexity lossy compression algorithm for any discrete memoryless source.
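
For background, logarithmic loss is the distortion measure in which the decoder outputs a probability distribution over the source alphabet rather than a single reconstruction symbol. A standard formulation (the notation below is supplied here for illustration, not quoted from the paper) is

\[
d\bigl(x, \hat{q}\bigr) \;=\; \log \frac{1}{\hat{q}(x)},
\]

where \(\hat{q}\) is a probability mass function on the source alphabet \(\mathcal{X}\). A well-known consequence of this choice is that the rate-distortion function of a discrete memoryless source \(X\) takes the simple form \(R(D) = \max\{H(X) - D,\, 0\}\), which is part of what makes logarithmic loss analytically tractable for the first-stage decoder in the successive refinement setting.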
Keywords: logarithmic loss; rate-distortion; successive refinability
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

MDPI and ACS Style

No, A. Universality of Logarithmic Loss in Successive Refinement. Entropy 2019, 21, 158.
