Open Access Article

Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information

Changxiao Cai 1 and Sergio Verdú 2
1 Department of Electrical Engineering, Princeton University, C307 Engineering Quadrangle, Princeton, NJ 08540, USA
2 Independent Researcher, Princeton, NJ 08540, USA
* Author to whom correspondence should be addressed.
Entropy 2019, 21(10), 969; https://doi.org/10.3390/e21100969
Received: 8 August 2019 / Revised: 20 September 2019 / Accepted: 25 September 2019 / Published: 4 October 2019
Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entropy in 1961 are the “right” ones, several candidates have been put forth as possible mutual informations of order α. In this paper we lend further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 (closely related to Gallager's E0 function) is the most natural generalization, lending itself to explicit computation and maximization, as well as closed-form formulas. This paper considers general (not necessarily discrete) alphabets and extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence. Several examples illustrate the main application of these results, namely, the maximization of α-mutual information with and without constraints.
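To make the maximized quantity concrete: for finite alphabets, Sibson's proposal admits the closed form I_α(P_X, P_{Y|X}) = (α/(α−1)) log Σ_y (Σ_x P_X(x) P_{Y|X}(y|x)^α)^{1/α}, which recovers Shannon's mutual information as α → 1 and relates to Gallager's function via E0(ρ, P_X) = ρ · I_{1/(1+ρ)}. The following minimal sketch (illustrative only, not the authors' code; the function names are assumptions) evaluates both quantities for a binary symmetric channel:

```python
# Minimal numerical sketch, assuming the standard discrete-alphabet form of
# Sibson's alpha-mutual information:
#   I_alpha(P_X, P_{Y|X}) = alpha/(alpha-1) * log sum_y ( sum_x P(x) W(y|x)^alpha )^(1/alpha)
# Function names here are illustrative, not from any library or from the paper.
import numpy as np

def sibson_alpha_mi(p_x, W, alpha):
    """Sibson's alpha-mutual information in nats (alpha > 0, alpha != 1).

    p_x : input distribution, shape (K,)
    W   : channel matrix, W[x, y] = P_{Y|X}(y|x), rows sum to 1
    """
    inner = p_x @ (W ** alpha)          # one term per output y: E[W(y|X)^alpha]
    return alpha / (alpha - 1.0) * np.log(np.sum(inner ** (1.0 / alpha)))

def shannon_mi(p_x, W):
    """Shannon mutual information in nats (the alpha -> 1 limit)."""
    p_y = p_x @ W
    mask = W > 0                        # skip zero-probability transitions
    terms = np.where(mask, W * np.log(np.where(mask, W, 1.0) / p_y), 0.0)
    return np.sum(p_x[:, None] * terms)

# Binary symmetric channel with crossover 0.1, equiprobable input
p_x = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
for a in (0.5, 0.999, 1.001, 2.0):
    print(f"I_{a}(X;Y) = {sibson_alpha_mi(p_x, W, a):.4f} nats")
print(f"I(X;Y)     = {shannon_mi(p_x, W):.4f} nats")  # alpha -> 1 limit
```

As α → 1 the printed values approach the Shannon mutual information log 2 − h(0.1) ≈ 0.368 nats, and they grow with α, reflecting that I_α is nondecreasing in the order.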
Keywords: information measures; relative entropy; conditional relative entropy; mutual information; Rényi divergence; α-mutual information; channel capacity; minimax redundancy
MDPI and ACS Style

Cai, C.; Verdú, S. Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information. Entropy 2019, 21, 969.
