
Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression

Department of Electrical Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel
Entropy 2018, 20(12), 896; https://doi.org/10.3390/e20120896
Received: 16 October 2018 / Revised: 5 November 2018 / Accepted: 20 November 2018 / Published: 22 November 2018
This paper provides tight bounds on the Rényi entropy of a function of a discrete random variable with a finite number of possible values, where the considered function is not one-to-one. To that end, a tight lower bound on the Rényi entropy of a discrete random variable with finite support is derived as a function of the size of the support and the ratio of the maximal to minimal probability masses. This work was inspired by the recently published paper by Cicalese et al., which focuses on the Shannon entropy, and it strengthens and generalizes the results of that paper to Rényi entropies of arbitrary positive orders. In view of these generalized bounds and the works by Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and lossless data compression of discrete memoryless sources.
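To make the central quantity concrete: the Rényi entropy of order α > 0 of a probability vector p is H_α(p) = (1/(1−α)) log Σ_i p_i^α, which converges to the Shannon entropy as α → 1. The sketch below (illustrative only, not code from the paper) computes it and shows the extreme case relevant to the bounds: when the ratio of the maximal to minimal probability masses equals 1 (a uniform distribution over n values), H_α = log n for every order α.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a probability vector p.

    The case alpha == 1 is handled as the limit, i.e., the Shannon entropy.
    """
    if alpha <= 0:
        raise ValueError("order alpha must be positive")
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1.0 - alpha)

# Uniform distribution over 4 values: H_alpha = log(4) for every order,
# the case where the ratio of maximal to minimal masses is 1.
uniform = [0.25] * 4

# A non-uniform (dyadic) distribution, where H_alpha varies with alpha.
dyadic = [0.5, 0.25, 0.125, 0.125]
```

Since H_α is nonincreasing in α, the collision entropy H_2 of the dyadic example lower-bounds its Shannon entropy H_1.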
Keywords: majorization; Rényi entropy; Rényi divergence; cumulant generating functions; guessing moments; lossless source coding; fixed-to-variable source codes; Huffman algorithm; Tunstall codes
MDPI and ACS Style

Sason, I. Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression. Entropy 2018, 20, 896.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
