Open Access Feature Paper Article

On Data-Processing and Majorization Inequalities for f-Divergences with Applications

Department of Electrical Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel
Entropy 2019, 21(10), 1022; https://doi.org/10.3390/e21101022
Received: 17 July 2019 / Revised: 12 October 2019 / Accepted: 17 October 2019 / Published: 21 October 2019
(This article belongs to the Special Issue Information Measures with Applications)
This paper focuses on the derivation of data-processing and majorization inequalities for f-divergences, and on their applications in information theory and statistics. To make the material accessible, the main results are first stated without proofs, followed by exemplifications of the theorems together with further related analytical results, interpretations, and information-theoretic applications. One application concerns the performance analysis of list decoding with either fixed or variable list sizes; some earlier bounds on the list decoding error probability are reproduced in a unified way, and new bounds are obtained and exemplified numerically. Another application studies the quality of approximating a probability mass function, induced by the leaves of a Tunstall tree, by an equiprobable distribution. The compression rates of finite-length Tunstall codes are further analyzed to assert their closeness to the Shannon entropy of a memoryless and stationary discrete source. Almost all of the analysis is relegated to the appendices, which form the major part of this manuscript.
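The central objects of the paper, f-divergences and their data-processing inequality, can be illustrated with a small numerical sketch. The code below uses the standard definition D_f(P||Q) = Σ q(x) f(p(x)/q(x)) for a convex f with f(1) = 0, and checks that passing both distributions through a stochastic channel cannot increase the divergence. The particular distributions and channel matrix are illustrative assumptions, not taken from the paper.

```python
import math

def f_divergence(p, q, f):
    # D_f(P||Q) = sum_x q(x) * f(p(x) / q(x)), assuming q(x) > 0 for all x
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

# f(t) = t log t gives the Kullback-Leibler divergence as a special case
kl_f = lambda t: t * math.log(t) if t > 0 else 0.0

# two illustrative distributions on a 3-symbol alphabet
P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]

# an illustrative stochastic channel W (rows: inputs, columns: outputs)
W = [[0.9, 0.1],
     [0.5, 0.5],
     [0.2, 0.8]]

def push_through(p, W):
    # output distribution of the channel when the input is drawn from p
    return [sum(p[i] * W[i][j] for i in range(len(p)))
            for j in range(len(W[0]))]

d_in = f_divergence(P, Q, kl_f)
d_out = f_divergence(push_through(P, W), push_through(Q, W), kl_f)

# data-processing inequality: D_f(PW || QW) <= D_f(P || Q)
assert d_out <= d_in
```

Other choices of f (e.g., f(t) = (t - 1)^2 for the chi-squared divergence, or f(t) = |t - 1| for total variation, up to normalization) satisfy the same inequality; the data-processing property holds for every convex f with f(1) = 0.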
Keywords: contraction coefficient; data-processing inequalities; f-divergences; hypothesis testing; list decoding; majorization theory; Rényi information measures; Tsallis entropy; Tunstall trees
Figure 1

Sason, I. On Data-Processing and Majorization Inequalities for f-Divergences with Applications. Entropy 2019, 21, 1022.

