Search Results (2)

Search Parameters:
Keywords = Shannon and Rényi axioms

23 pages, 730 KiB  
Article
On the α-q-Mutual Information and the α-q-Capacities
by Velimir M. Ilić and Ivan B. Djordjević
Entropy 2021, 23(6), 702; https://doi.org/10.3390/e23060702 - 1 Jun 2021
Cited by 2 | Viewed by 3541
Abstract
The measures of information transfer corresponding to non-additive entropies have been intensively studied in recent decades. Most of this work concerns measures belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of these considerations follow the same approach, mimicking one of the various, mutually equivalent definitions of the Shannon information measures: the information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is treated as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties that a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even for very simple communication channels. This paper fills the gap by proposing two new two-parameter measures: the α-q-mutual information and the α-q-capacity. In addition to the standard Shannon measures, special cases include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, the Landsberg–Vedral and the Gaussian entropies are recovered by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy a set of properties, stated as axioms, by which they reduce to zero for totally destructive channels and to the (maximal) input Sharma–Mittal entropy for perfect transmission, consistently with the maximum likelihood detection error. In addition, they are non-negative and, in general, bounded above by the input and the output Sharma–Mittal entropies.
Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not exhibit nonphysical behaviors such as sub-capacitance or super-capacitance, which qualifies them as appropriate measures of Sharma–Mittal information transfer.
(This article belongs to the Special Issue The Statistical Foundations of Entropy)
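Not part of the article itself: a minimal numerical sketch of the Sharma–Mittal entropy family the abstract refers to, using its standard two-parameter form H_{α,q}(P) = ((Σ_i p_i^α)^((1−q)/(1−α)) − 1)/(1−q) for α ≠ 1. The function name and test distribution are illustrative, and the q → 1 branch is handled as a numerical limit.

```python
import math

def sharma_mittal(p, alpha, q):
    """Sharma-Mittal entropy of a probability vector p (assumes alpha != 1)."""
    A = sum(pi ** alpha for pi in p)   # the sum  sum_i p_i^alpha
    if abs(q - 1.0) < 1e-12:           # q -> 1 limit: Renyi entropy of order alpha
        return math.log(A) / (1.0 - alpha)
    return (A ** ((1.0 - q) / (1.0 - alpha)) - 1.0) / (1.0 - q)

P = [0.5, 0.3, 0.2]
renyi_2 = sharma_mittal(P, alpha=2.0, q=1.0)    # q -> 1 recovers Renyi
tsallis_2 = sharma_mittal(P, alpha=2.0, q=2.0)  # q = alpha recovers Tsallis
```

Setting q = α recovers the Tsallis entropy and the q → 1 limit recovers the Rényi entropy, matching the special cases the abstract lists within the Sharma–Mittal class.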

26 pages, 385 KiB  
Article
Equivalence of Partition Functions Leads to Classification of Entropies and Means
by Michel S. Elnaggar and Achim Kempf
Entropy 2012, 14(8), 1317-1342; https://doi.org/10.3390/e14081317 - 27 Jul 2012
Cited by 1 | Viewed by 6407
Abstract
We derive a two-parameter family of generalized entropies, S_pq, and means m_pq. To this end, assume that we want to calculate an entropy and a mean for n non-negative real numbers {x1, …, xn}. For comparison, we consider {m1, …, mk}, where mi = m for all i = 1, …, k and where m and k are chosen such that the l_p and l_q norms of {x1, …, xn} and {m1, …, mk} coincide. We formally allow k to be real. Then, we define k, log k, and m to be a generalized cardinality k_pq, a generalized entropy S_pq, and a generalized mean m_pq, respectively. We show that this family of entropies includes the Shannon and Rényi entropies and that the family of generalized means includes the power means (such as arithmetic, harmonic, geometric, root-mean-square, maximum, and minimum) as well as novel means of Shannon-like and Rényi-like forms. A thermodynamic interpretation arises from the fact that the l_p norm is closely related to the partition function at inverse temperature β = p. Namely, two systems possess the same generalized entropy and generalized mean energy if and only if their partition functions agree at two temperatures, which is also equivalent to the condition that their Helmholtz free energies agree at these two temperatures.
(This article belongs to the Special Issue Advances in Applied Thermodynamics)
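Not from the paper itself: a minimal sketch of the construction the abstract describes. For k copies of m, the l_p norm condition reads k·m^p = Σ_i x_i^p and the l_q condition k·m^q = Σ_i x_i^q, which solve to m = (A_p/A_q)^(1/(p−q)) and k = A_p/m^p, with S_pq = log k. Function and variable names are illustrative; p ≠ q is assumed.

```python
import math

def spq(x, p, q):
    """Generalized cardinality k_pq, entropy S_pq = log k_pq, and mean m_pq
    obtained by matching the l_p and l_q norms of x with k copies of m."""
    Ap = sum(xi ** p for xi in x)      # ||x||_p ** p
    Aq = sum(xi ** q for xi in x)      # ||x||_q ** q
    m = (Ap / Aq) ** (1.0 / (p - q))   # from k*m**p = Ap and k*m**q = Aq
    k = Ap / m ** p                    # generalized (real-valued) cardinality
    return math.log(k), m, k

# Sanity check of the abstract's claim that the family contains the Renyi
# entropies: for a probability vector with p = 1, S_1q is Renyi of order q.
P = [0.5, 0.3, 0.2]
S, m, k = spq(P, p=1.0, q=2.0)
```

For the probability vector above, S agrees with the order-2 Rényi entropy −log Σ_i P_i², consistent with the special cases claimed in the abstract.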
