

Open Access Article

Two Measures of Dependence

Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland
* Author to whom correspondence should be addressed.
Entropy 2019, 21(8), 778; https://doi.org/10.3390/e21080778
Received: 5 July 2019 / Revised: 2 August 2019 / Accepted: 5 August 2019 / Published: 8 August 2019
(This article belongs to the Special Issue Information Measures with Applications)
PDF [955 KB, uploaded 8 August 2019]

Abstract

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
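For context, a minimal sketch of the standard definitions the abstract builds on (generic notation, not necessarily the paper’s own): the Rényi divergence of order α, its α → 1 limit, and Shannon’s mutual information written as a relative entropy between the joint distribution and the product of its marginals.

```latex
% Rényi divergence of order \alpha (\alpha > 0, \alpha \neq 1)
% between pmfs P and Q on a common finite alphabet:
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}
  \log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha} .

% As \alpha \to 1, it recovers the Kullback--Leibler divergence:
\lim_{\alpha \to 1} D_\alpha(P \,\|\, Q)
  \;=\; D(P \,\|\, Q)
  \;=\; \sum_x P(x) \log \frac{P(x)}{Q(x)} .

% Shannon's mutual information is the KL divergence between the
% joint distribution and the product of the marginals:
I(X;Y) \;=\; D\bigl(P_{XY} \,\|\, P_X P_Y\bigr) .
```

A dependence measure obtained by applying D_α to the joint distribution against product distributions therefore reduces to I(X;Y) at α = 1, consistent with the abstract’s claim; the precise form of the paper’s two measures (in particular, how the reference product distribution is chosen) is given in the article itself.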
Keywords: data processing; dependence measure; relative α-entropy; Rényi divergence; Rényi entropy

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Lapidoth, A.; Pfister, C. Two Measures of Dependence. Entropy 2019, 21, 778.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.