Open Access Article

Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information

Department of Engineering, University of Cambridge, Trumpington Street, Cambridge CB2 1PZ, UK
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2020, 22(6), 705; https://doi.org/10.3390/e22060705
Received: 22 May 2020 / Revised: 19 June 2020 / Accepted: 22 June 2020 / Published: 25 June 2020
(This article belongs to the Section Information Theory, Probability and Statistics)
The problem of determining the best achievable performance of arbitrary lossless compression algorithms is examined, when correlated side information is available at both the encoder and decoder. For arbitrary source-side information pairs, the conditional information density is shown to provide a sharp asymptotic lower bound for the description lengths achieved by an arbitrary sequence of compressors. This implies that for ergodic source-side information pairs, the conditional entropy rate is the best achievable asymptotic lower bound to the rate, not just in expectation but with probability one. Under appropriate mixing conditions, a central limit theorem and a law of the iterated logarithm are proved, describing the inevitable fluctuations of the second-order asymptotically best possible rate. An idealised version of Lempel-Ziv coding with side information is shown to be universally first- and second-order asymptotically optimal, under the same conditions. These results are in part based on a new almost-sure invariance principle for the conditional information density, which may be of independent interest.
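For reference, the following is a brief sketch of the central quantities named in the abstract, written in standard notation as they are typically defined in this line of work; the paper's own conventions may differ slightly.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Conditional information density of a source block x_1^n given side information y_1^n:
\[
  \imath(x_1^n \mid y_1^n) \;=\; -\log P_{X_1^n \mid Y_1^n}(x_1^n \mid y_1^n).
\]

% Conditional entropy rate: the first-order, pointwise-best achievable rate
% for ergodic source--side-information pairs:
\[
  H(X \mid Y) \;=\; \lim_{n\to\infty} \tfrac{1}{n}\, H(X_1^n \mid Y_1^n).
\]

% Conditional varentropy rate, which governs the second-order fluctuations:
\[
  \sigma^2 \;=\; \lim_{n\to\infty} \tfrac{1}{n}\,
    \operatorname{Var}\!\bigl( \imath(X_1^n \mid Y_1^n) \bigr).
\]

% Under suitable mixing conditions, the description lengths L_n of
% second-order optimal compressors then satisfy a central limit theorem of the form
\[
  \frac{L_n(X_1^n \mid Y_1^n) - n\,H(X \mid Y)}{\sqrt{n}}
  \;\xrightarrow{\;d\;}\; N(0,\sigma^2).
\]

\end{document}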
Keywords: entropy; lossless data compression; side information; conditional entropy; central limit theorem; law of the iterated logarithm; conditional varentropy
MDPI and ACS Style

Gavalakis, L.; Kontoyiannis, I. Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information. Entropy 2020, 22, 705.

