Special Issue "Information Measures with Applications"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 15 November 2019.

Special Issue Editor

Guest Editor
Prof. Dr. Amos Lapidoth
Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland
Interests: Information Theory and Digital Communications

Special Issue Information

Dear Colleagues,

Classical information measures such as entropy, relative entropy (Kullback–Leibler divergence), and mutual information have found numerous applications in storage, compression, transmission, cryptography, statistics, large deviations, gambling, and physics. However, over the years, arguably starting with the pioneering work of Alfréd Rényi (1921–1970), other information measures were introduced and studied. These include Rényi entropy, Rényi divergence, f-divergence, Arimoto's mutual information, Sibson's information radius, and others. These measures typically generalize the classical ones and, in some applications, yield finer results. In recent years they have also found new applications in guessing, hypothesis testing, error exponents, task encoding, large deviations, and more.
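As background for the measures named above (these definitions are standard and are not part of the original call), the Rényi entropy and Rényi divergence of order α, for α > 0 and α ≠ 1, are given by

H_\alpha(P) = \frac{1}{1-\alpha}\,\log \sum_x P(x)^{\alpha},
\qquad
D_\alpha(P \| Q) = \frac{1}{\alpha-1}\,\log \sum_x P(x)^{\alpha}\, Q(x)^{1-\alpha},

and both recover their Shannon counterparts, H_\alpha(P) \to H(P) and D_\alpha(P\|Q) \to D(P\|Q), as \alpha \to 1.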

For this Special Issue, we solicit original papers presenting new applications of known information measures, as well as new measures with interesting applications.

Prof. Dr. Amos Lapidoth
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • f-divergence
  • Rényi divergence
  • Rényi entropy
  • Arimoto's mutual information
  • information measures
  • information radius

Published Papers (2 papers)

Research

Open Access Article
Two Measures of Dependence
Entropy 2019, 21(8), 778; https://doi.org/10.3390/e21080778 - 08 Aug 2019
Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
(This article belongs to the Special Issue Information Measures with Applications)
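The two families in this paper are built on the order-α Rényi divergence. The minimal sketch below is not taken from the paper, and the function names are illustrative: it only computes that underlying divergence for discrete distributions and checks numerically that it approaches the Kullback–Leibler divergence as α → 1, the limit in which, per the abstract, the measures reduce to Shannon’s mutual information.

```python
# Illustrative sketch (not from the paper): the order-alpha Renyi divergence
# D_alpha(P||Q) for discrete distributions, and its alpha -> 1 limit, which is
# the Kullback-Leibler divergence D(P||Q).
import numpy as np

def renyi_divergence(p, q, alpha):
    """D_alpha(P || Q) for discrete pmfs p, q (alpha > 0, alpha != 1)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """D(P || Q), the alpha -> 1 limit of the Renyi divergence."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
for alpha in (0.5, 0.9, 0.99, 1.01, 2.0):
    print(alpha, renyi_divergence(p, q, alpha))
print("KL:", kl_divergence(p, q))  # values for alpha near 1 approach this number
```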

Open Access Article
Empirical Estimation of Information Measures: A Literature Guide
Entropy 2019, 21(8), 720; https://doi.org/10.3390/e21080720 - 24 Jul 2019
Abstract
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
(This article belongs to the Special Issue Information Measures with Applications)
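As a point of reference for the kind of estimators such a survey covers, the sketch below implements the simple plug-in (empirical-frequency) estimator of Shannon entropy for discrete data. It is an illustrative example only and is not drawn from the paper.

```python
# Illustrative sketch (not from the survey): the "plug-in" entropy estimator,
# which substitutes empirical frequencies into the entropy formula. It is
# consistent, but biased downward when the sample size is small relative to
# the alphabet size.
from collections import Counter
import math

def plugin_entropy(samples):
    """Estimate H(X) in nats from i.i.d. discrete samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

data = ["a", "b", "a", "c", "a", "b", "b", "a"]
print(plugin_entropy(data))  # entropy estimate in nats
```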
