Measures of Information III

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 5136

Special Issue Editors


Prof. Dr. Maria Longobardi
Guest Editor
Dipartimento di Biologia, Università di Napoli Federico II, 80126 Naples, NA, Italy
Interests: stochastic orders; reliability theory; measures of discrimination (in particular entropy, extropies, inaccuracy, Kullback–Leibler divergence); coherent systems; inference

Dr. Francesco Buono
Guest Editor
Institute of Statistics, RWTH Aachen University, 52056 Aachen, Germany
Interests: information measures; stochastic orders; probability theory; reliability theory; aging properties; coherent systems; hazard rate function

Special Issue Information

Dear Colleagues,

How important is uncertainty in the life of a human being? Certainly, an existence in which everything is deterministic is not worth living.

In 1948, Claude Shannon introduced the general concept of entropy, a “measure of uncertainty” that became a fundamental cornerstone of information theory, arising from the idea of quantifying how much information a message carries. In his paper “A Mathematical Theory of Communication”, written while he was working at Bell Telephone Laboratories, he set out to quantify mathematically the statistical nature of “lost information” in phone-line signals.

Entropy in information theory is directly analogous to entropy in statistical thermodynamics.

In information theory, the entropy of a random variable is the average level of “information”, “uncertainty” or “surprise” inherent in the variable’s possible outcomes.
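For a discrete random variable X taking values x with probabilities p(x), this average is Shannon’s entropy; the standard formula is recalled here for concreteness (it is not spelled out in the original call):

$$H(X) = -\sum_{x} p(x)\,\log p(x),$$

with the logarithm conventionally taken to base 2, so that entropy is measured in bits.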

Entropy was originally part of his theory of communication, in which a data communication system is composed of three elements: a source of data, a communication channel, and a receiver. In Shannon’s theory, the “fundamental problem of communication” is for the receiver to identify which data were generated by the source, based on the signal it receives through the channel. The basic idea, then, is that the “informational value” of a communicated message depends on the degree to which its content is surprising.

Entropy is also relevant to other areas of mathematics. Its definition can be derived from a set of axioms establishing that entropy should measure how “surprising” the average outcome of a variable is. For a continuous random variable, the analogous quantity is the differential entropy.
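For completeness, for a continuous random variable X with density f this differential entropy is usually written as (again, a standard formula recalled here for reference):

$$h(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,\mathrm{d}x,$$

and, unlike its discrete counterpart, it can be negative.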

If an event is very probable, it is uninteresting when it happens as expected; hence, transmission of such a message carries very little new information. However, if an event is unlikely to occur, it is much more informative to learn that it has happened or will happen.
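As a minimal numerical illustration of this point (a sketch based only on the standard definitions above, not on any of the papers in this issue), the surprisal −log₂ p of an outcome grows as its probability shrinks, and entropy is the expected surprisal:

```python
import math

def surprisal(p: float) -> float:
    """Information content, in bits, of observing an outcome of probability p."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Shannon entropy, in bits: the expected surprisal over a distribution."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A very probable outcome carries little information; a rare one carries much more.
print(surprisal(0.99))        # ~0.014 bits
print(surprisal(0.01))        # ~6.644 bits

# A fair coin is maximally uncertain; a heavily biased coin is almost predictable.
print(entropy([0.5, 0.5]))    # 1.0 bit
print(entropy([0.99, 0.01]))  # ~0.081 bits
```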

In recent decades, several new measures of information and of discrimination have been defined and studied, and it is clear that many more, with applications in different fields, will be introduced. This Special Issue aims to enrich the body of notions related to measures of information and discrimination.

Prof. Dr. Maria Longobardi
Dr. Francesco Buono
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Published Papers (5 papers)


Research

9 pages, 265 KiB  
Article
Quantum Purity as an Information Measure and Nernst Law
by F. Pennini and A. Plastino
Entropy 2023, 25(8), 1113; https://doi.org/10.3390/e25081113 - 26 Jul 2023
Viewed by 961
Abstract
We propose to re-express Nernst law in terms of a suitable information measure (IM) parameter. This is achieved by dwelling on the idea of adapting the notion of purity in the case of a thermal Gibbs environment, yielding what we might call the “purity” indicator (which we denote by the symbol D in the text). We find it interesting to define an extension of this DIM indicator in a classical context. This generalization turns out to have useful conceptual consequences when used in conjunction with the classical Shannon entropy S. Implications for the Nernst law are discussed.
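For readers who want the underlying notion: the purity of a quantum state with density operator ρ on a d-dimensional Hilbert space is the standard quantity

$$P(\rho) = \operatorname{Tr}(\rho^{2}), \qquad \tfrac{1}{d} \le P(\rho) \le 1,$$

equal to 1 exactly for pure states; the paper’s indicator D adapts this notion to a thermal Gibbs state ρ ∝ e^{−βH} (the precise definition of D is given in the paper itself).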
12 pages, 312 KiB  
Article
Cumulative Residual Entropy of the Residual Lifetime of a Mixed System at the System Level
by Mohamed Kayid and Mashael A. Alshehri
Entropy 2023, 25(7), 1033; https://doi.org/10.3390/e25071033 - 09 Jul 2023
Viewed by 719
Abstract
Recently, there has been growing interest in alternative measures of uncertainty, including cumulative residual entropy. In this paper, we consider a mixed system consisting of n components, assuming that all components are operational at time t. By utilizing the system signature, we are able to compute the cumulative residual entropy of a mixed system’s remaining lifetime. This metric serves as a valuable tool for evaluating the predictability of a system’s lifetime. We study several results related to the cumulative residual entropy of mixed systems, including expressions, limits, and order properties. These results shed light on the behavior of the measure and provide insights into the predictability of mixed systems. In addition, we propose a criterion for selecting a preferred system based on the relative residual cumulative entropy. This criterion is closely related to the parallel system and provides a practical way to choose the best system configuration. Overall, the present study of cumulative residual entropy and the proposed selection criterion provide valuable insights into the predictability of mixed systems and can be applied in various fields.
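For reference, the cumulative residual entropy of a nonnegative lifetime X with survival function F̄(x) = P(X > x) is commonly defined as

$$\mathcal{E}(X) = -\int_{0}^{\infty} \bar{F}(x)\,\log \bar{F}(x)\,\mathrm{d}x;$$

the system-level quantity studied in the paper applies this idea to the residual lifetime of the mixed system through its signature, as described in the abstract.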

15 pages, 427 KiB  
Article
Orders between Channels and Implications for Partial Information Decomposition
by André F. C. Gomes and Mário A. T. Figueiredo
Entropy 2023, 25(7), 975; https://doi.org/10.3390/e25070975 - 25 Jun 2023
Cited by 1 | Viewed by 944
Abstract
The partial information decomposition (PID) framework is concerned with decomposing the information that a set of random variables has with respect to a target variable into three types of components: redundant, synergistic, and unique. Classical information theory alone does not provide a unique way to decompose information in this manner, and additional assumptions have to be made. Recently, Kolchinsky proposed a new general axiomatic approach to obtain measures of redundant information based on choosing an order relation between information sources (equivalently, order between communication channels). In this paper, we exploit this approach to introduce three new measures of redundant information (and the resulting decompositions) based on well-known preorders between channels, contributing to the enrichment of the PID landscape. We relate the new decompositions to existing ones, study several of their properties, and provide examples illustrating their novelty. As a side result, we prove that any preorder that satisfies Kolchinsky’s axioms yields a decomposition that meets the axioms originally introduced by Williams and Beer when they first proposed PID.
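As background, in the bivariate case the standard PID of Williams and Beer splits the mutual information between two sources X₁, X₂ and a target Y as

$$I(Y; X_1, X_2) = R + U_1 + U_2 + S,$$

where R is the redundant, U₁ and U₂ the unique, and S the synergistic component; fixing a measure of redundancy R (as Kolchinsky’s axiomatic approach does) determines the whole decomposition.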

13 pages, 332 KiB  
Article
On the Uncertainty Properties of the Conditional Distribution of the Past Life Time
by Mohamed Kayid and Mansour Shrahili
Entropy 2023, 25(6), 895; https://doi.org/10.3390/e25060895 - 02 Jun 2023
Cited by 2 | Viewed by 679
Abstract
For a given system observed at time t, the past entropy serves as an uncertainty measure about the past life-time of the distribution. We consider a coherent system in which there are n components that have all failed at time t. To assess the predictability of the life-time of such a system, we use the signature vector to determine the entropy of its past life-time. We explore various analytical results, including expressions, bounds, and order properties, for this measure. Our results provide valuable insight into the predictability of the coherent system’s life-time, which may be useful in a number of practical applications.
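For reference, for a single lifetime X with density f and distribution function F, the past entropy at time t is usually defined as

$$\bar{H}(X;t) = -\int_{0}^{t} \frac{f(x)}{F(t)}\,\log\frac{f(x)}{F(t)}\,\mathrm{d}x,$$

i.e., the Shannon entropy of the conditional distribution [X | X ≤ t]; the paper studies the coherent-system analogue obtained via the signature vector, as described in the abstract.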

14 pages, 559 KiB  
Article
Jensen–Inaccuracy Information Measure
by Omid Kharazmi, Faezeh Shirazinia, Francesco Buono and Maria Longobardi
Entropy 2023, 25(3), 483; https://doi.org/10.3390/e25030483 - 10 Mar 2023
Cited by 2 | Viewed by 1070
Abstract
The purpose of the paper is to introduce the Jensen–inaccuracy measure and examine its properties. Furthermore, some results on the connections between the inaccuracy and Jensen–inaccuracy measures and some other well-known information measures are provided. Moreover, in three different optimization problems, the arithmetic mixture distribution provides optimal information based on the inaccuracy information measure. Finally, two real examples from image processing are studied and some numerical results in terms of the inaccuracy and Jensen–inaccuracy information measures are obtained.
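For context, the (Kerridge) inaccuracy measure between a density f and a reference density g, on which the Jensen–inaccuracy measure introduced in the paper is built, is commonly defined as

$$K(f,g) = -\int f(x)\,\log g(x)\,\mathrm{d}x,$$

which reduces to the (differential) Shannon entropy of f when g = f; the Jensen-type construction itself is defined in the paper.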