Entropy 2014, 16(7), 4168–4184; doi:10.3390/e16074168
Article
Characterizing the Asymptotic Per-Symbol Redundancy of Memoryless Sources over Countable Alphabets in Terms of Single-Letter Marginals
Received: 27 May 2014 / Revised: 24 June 2014 / Accepted: 7 July 2014 / Published: 23 July 2014
(This article belongs to the Section Information Theory)
Abstract: The minimum expected number of bits needed to describe a random variable is its entropy, assuming the distribution of the random variable is known. Universal compression, in contrast, describes data when the underlying distribution is unknown but belongs to a known collection P of distributions. Because a universal description is not matched exactly to the underlying distribution, it uses more bits on average; the excess over the entropy is the redundancy. In this paper, we study the redundancy incurred by universal descriptions of strings of positive integers (Z+), where the strings are generated independently and identically distributed (i.i.d.) according to an unknown distribution over Z+ in a known collection P. We first show that if describing a single symbol incurs finite redundancy, then P is tight, but that the converse does not always hold. If a single symbol can be described with finite worst-case regret (a more stringent criterion than the redundancy above), it is known that describing length-n i.i.d. strings incurs only vanishing (to zero) redundancy per symbol as n increases. In contrast, we show that it is possible for the description of a single symbol from an unknown distribution in P to incur finite redundancy, while the description of length-n i.i.d. strings incurs a constant (> 0) redundancy per symbol encoded. We then give a sufficient condition on the single-letter marginals under which length-n i.i.d. samples incur vanishing redundancy per symbol encoded.
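The central quantity of the abstract can be made concrete in miniature. The sketch below (not taken from the paper; the class, alphabet, and uniform mixture weights are illustrative assumptions) computes the redundancy of a universal code for a tiny collection P of two distributions over a finite alphabet. The universal description is the ideal codelength of a uniform mixture of P, and its per-symbol redundancy against each source is the KL divergence from that source to the mixture, which stays below log2 |P| bits, a finite-class instance of the redundancy-capacity picture:

```python
import math

# Hypothetical example: a class P of two i.i.d. sources over the
# alphabet {0, 1, 2}. The paper's setting (countably infinite
# alphabets) is richer; this only illustrates the definitions.
P = [
    [0.9, 0.05, 0.05],   # distribution p1
    [0.2, 0.4, 0.4],     # distribution p2
]

# Universal code: ideal codelengths -log2 m(x) of the uniform mixture.
mixture = [sum(p[x] for p in P) / len(P) for x in range(3)]

def entropy(p):
    """Entropy in bits: the minimum expected description length."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def redundancy(p, q):
    """Expected excess bits per symbol when source p is coded with the
    ideal codelengths of q, i.e. the KL divergence D(p || q)."""
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

for p in P:
    # Each redundancy is finite and below log2 |P| = 1 bit, since the
    # mixture assigns every symbol at least p(x) / |P| probability.
    print(f"H = {entropy(p):.4f} bits, redundancy = "
          f"{redundancy(p, mixture):.4f} bits")
```

For countable alphabets, as the paper shows, the analogous single-letter redundancy being finite no longer guarantees vanishing per-symbol redundancy for length-n strings.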
Keywords:
universal compression; redundancy; large alphabets; tightness; redundancy-capacity theorem
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Hosseini, M.; Santhanam, N. Characterizing the Asymptotic Per-Symbol Redundancy of Memoryless Sources over Countable Alphabets in Terms of Single-Letter Marginals. Entropy 2014, 16, 4168–4184.
AMA Style
Hosseini M, Santhanam N. Characterizing the Asymptotic Per-Symbol Redundancy of Memoryless Sources over Countable Alphabets in Terms of Single-Letter Marginals. Entropy. 2014; 16(7):4168–4184.
Chicago/Turabian Style
Hosseini, Maryam, and Narayana Santhanam. 2014. "Characterizing the Asymptotic Per-Symbol Redundancy of Memoryless Sources over Countable Alphabets in Terms of Single-Letter Marginals." Entropy 16, no. 7: 4168–4184.