Article

Entropy Optimization, Generalized Logarithms, and Duality Relations

1 CeBio y Departamento de Ciencias Básicas, Universidad Nacional del Noroeste de la Provincia de Buenos Aires, UNNOBA, CONICET, Roque Saenz Peña 456, Junin B6000, Argentina
2 Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rua Xavier Sigaud 150, Rio de Janeiro 22290-180, RJ, Brazil
3 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
4 Complexity Science Hub Vienna, Josefstädter Straße 39, 1080 Vienna, Austria
5 Instituto de Matemática e Estatística, Universidade do Estado do Rio de Janeiro, Rua São Francisco Xavier 524, Rio de Janeiro 20550-900, RJ, Brazil
6 Office for Outer Space Affairs, United Nations, Vienna International Center, 1400 Vienna, Austria
* Author to whom correspondence should be addressed.
Entropy 2022, 24(12), 1723; https://doi.org/10.3390/e24121723
Submission received: 7 November 2022 / Revised: 20 November 2022 / Accepted: 21 November 2022 / Published: 25 November 2022
(This article belongs to the Special Issue Non-additive Entropy Formulas: Motivation and Derivations)

Abstract

Several generalizations or extensions of the Boltzmann–Gibbs thermostatistics, based on non-standard entropies, have been the focus of considerable research activity in recent years. Among these, the power-law, non-additive entropies $S_q \equiv k \frac{1 - \sum_i p_i^q}{q-1}$ ($q \in \mathbb{R}$; $S_1 = S_{BG} \equiv -k \sum_i p_i \ln p_i$) have harvested the largest number of successful applications. The specific structural features of the $S_q$ thermostatistics, therefore, are worthy of close scrutiny. In the present work, we analyze one of these features, according to which the $q$-logarithm function $\ln_q x \equiv \frac{x^{1-q}-1}{1-q}$ ($\ln_1 x = \ln x$) associated with the $S_q$ entropy is linked, via a duality relation, to the $q$-exponential function characterizing the maximum-entropy probability distributions. We enquire into which entropic functionals lead to this or similar structures, and investigate the corresponding duality relations.

1. Introduction

Extensions of the maximum entropy principle based on non-standard entropic functionals [1,2,3,4] have proven to be useful for the study of diverse problems in physics and elsewhere, particularly in connection with complex systems [5,6]. These lines of enquiry were greatly stimulated by research into a generalized thermostatistics advanced in the late 80s, in which the canonical probability distributions optimize the S q power-law, non-additive entropies [7]. The S q thermostatistics was successfully applied to the analysis of a wide range of systems and processes in physics, astronomy, biology, economics, and other fields [8,9,10,11]. Motivated by the work on the S q entropies, researchers also explored the properties and possible applications of several other entropic measures, such as those introduced by Borges and Roditi [12], by Anteneodo and Plastino [13], by Kaniadakis [14], and by Obregón [15]. Recent reviews on these and other entropic forms can be found in [16,17]. These developments, in turn, led to the investigation of general properties of entropic variational principles, in order to elucidate which features are shared by large families of entropic forms, or are even universal, and, on the contrary, which features characterize specific entropies, such as the S q ones. Several aspects of general entropic variational principles have been studied along those lines, including, for instance, the Legendre transform structure [18,19,20], the maximum entropy–minimum norm approach to inverse problems [21], the implementation of dynamical thermostatting schemes [22,23], the interpretation of superstatistics in terms of entropic variational prescriptions [24], and the derivation of generalized maximum-entropy phase-space densities from Liouville dynamics [25].
Of all the thermostatistics associated with generalized entropic forms, the thermostatistics derived from the $S_q$ entropies has been the most intensively studied and fruitfully applied one. The $S_q$-thermostatistics exhibits some intriguing structural similarities with the standard Boltzmann–Gibbs one. The aim of the present contribution is to explore one of these similarities, within the context of thermostatistical formalisms based on general entropic functionals. As is well known, the Boltzmann–Gibbs entropy $S_{BG}$ of a normalized probability distribution can be expressed as minus the mean value of the logarithms of the probabilities or, alternatively, as the mean value of the logarithms of the inverse probabilities. On the other hand, the probability distribution that optimizes $S_{BG}$ under the constraints imposed by normalization and by the energy mean value has an exponential form, where the exponential is the inverse function of the above-mentioned logarithm function. In a nutshell: the entropy is the mean value of a logarithm (evaluated on the inverse probabilities), while the maximum-entropy probabilities are given by an exponential function, which is the inverse function of the logarithm. This structure turns out to be nontrivial, and, up to a duality condition, is shared by the $S_q$-thermostatistics. Indeed, it is possible to define a $q$-logarithm function, and its inverse function, a $q$-exponential, both parameterized by the parameter $q$, such that the $S_q$ entropy is the mean value of a $q$-logarithm (evaluated on the inverse probabilities), while the probability distribution optimizing $S_q$ has a $q$-exponential form. The alluded duality condition, however, imposes that the value of the $q$-parameter associated with the aforementioned $q$-logarithm should not be the same as the value of the parameter associated with the $q$-exponential. The two $q$-values are connected via the duality relation $q \leftrightarrow 2-q$, which is ubiquitous in the $S_q$-thermostatistics.
In the present work, we shall explore which entropic measures generate similar structures, linking the entropic functional, regarded as the mean value of a generalized logarithm, with the form of the maximum-entropy distributions.
This paper is organized in the following way. In Section 2, we provide a brief review of the $S_q$-thermostatistical formalism, focusing on the $q$-logarithm duality relation. In Section 3, we explore which entropic functionals give rise to structures, and duality relations, similar to those characterizing the $S_q$-thermostatistics, and also consider more general scenarios. Finally, some conclusions are drawn in Section 4.

2. S q Entropies, q-Logarithms, and q-Exponential Maximum-Entropy Probability Distributions

The S q -thermostatistics is constructed on the basis of the non-additive, power-law entropy S q [5] defined as
S_q = \frac{k}{1-q} \sum_{i=1}^{W} \left( p_i^q - p_i \right), \qquad (1)
where $q \in \mathbb{R}$ is a parameter characterizing the degree of non-additivity exhibited by the entropy, $k$ is a constant chosen once and for all, determining the dimensions and the units in which the entropy is measured, and $\{p_i,\ i = 1, \ldots, W\}$ is an appropriately normalized probability distribution for a system admitting $W$ microstates. In what follows, we shall assume that $k = 1$. The limit $q \to 1$ corresponds to the standard Boltzmann–Gibbs (BG) entropy, that is, $S_1 = S_{BG} = -k \sum_{i=1}^{W} p_i \ln p_i$. The power-law entropy $S_q$ constitutes a distinguished and founding member of the club of generalized entropies, which is nowadays the focus of intensive research activity [3,4,16].
The q-logarithm function, given by
\ln_q(x) = \frac{x^{1-q} - 1}{1-q} \qquad (x > 0;\ \ln_1 x = \ln x), \qquad (2)
and its inverse function, the q-exponential
\exp_q(x) = \begin{cases} \left[ 1 + (1-q)x \right]^{\frac{1}{1-q}}, & \text{if } 1 + (1-q)x > 0, \\ 0, & \text{if } 1 + (1-q)x \le 0, \end{cases} \qquad (3)
constitute essential ingredients of the $S_q$ thermostatistical formalism. For the sake of completeness, it is worth mentioning that an alternative notation for the $q$-exponential is sometimes used, namely $\exp_q(x) = [1 + (1-q)x]_{+}^{\frac{1}{1-q}}$, where $[y]_+ \equiv \max\{y, 0\}$. The $q$-logarithm and the $q$-exponential functions arise naturally when one considers the constrained optimization of the entropy $S_q$ [5,9]. Moreover, it is central to the $q$-thermostatistical theory that the $S_q$ entropy itself can be expressed in terms of $q$-logarithms,
S_q = k \sum_{i=1}^{W} p_i \ln_q \frac{1}{p_i} = k \left\langle \ln_q \frac{1}{p_i} \right\rangle. \qquad (4)
Note that, for $q \to 1$, the above equation reduces to the well-known one, $S_{BG} = k \sum_{i=1}^{W} p_i \ln \frac{1}{p_i}$.
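As a concrete illustration (a minimal numerical sketch, not part of the original formalism, with arbitrarily chosen test values), the q-logarithm, its inverse the q-exponential, and the entropy expression (4) can be implemented as follows:

```python
import numpy as np

def ln_q(x, q):
    """q-logarithm: ln_q(x) = (x**(1 - q) - 1) / (1 - q); ln_1 = ln."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def exp_q(x, q):
    """q-exponential, the inverse function of ln_q; it is cut off
    (set to zero) wherever 1 + (1 - q) x <= 0."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

def S_q(p, q, k=1.0):
    """Entropy (4): S_q = k * sum_i p_i ln_q(1/p_i)."""
    p = np.asarray(p, dtype=float)
    return k * np.sum(p * ln_q(1.0 / p, q))

# Round trip: exp_q inverts ln_q on its domain.
x = 2.5
for q in (0.5, 1.0, 2.0):
    assert abs(float(exp_q(ln_q(x, q), q)) - x) < 1e-10
```

For a uniform distribution over $W$ microstates this reproduces $S_q = \ln_q W$, which reduces to $\ln W$ in the $q \to 1$ limit.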
The gist of the $S_q$ thermostatistics is centered on the optimization of $S_q$ under suitable constraints. The $S_q$ entropic variational problem can be formulated using standard linear constraints or nonlinear constraints based on escort probability distributions [26,27]. When working with more general entropic functionals, it is not well understood which escort mean values are appropriate, and few or no concrete applications of escort mean values to particular problems have been developed. Consequently, in order to investigate and clarify the distinguishing features of the $S_q$ formalism within the context of more general entropic formalisms, it is convenient to restrict our considerations to the optimization of the $S_q$ entropy under linear constraints. The main instance of the $S_q$ variational problem is the one yielding the generalized canonical probability distribution, which corresponds to the optimization of $S_q$ under the constraints corresponding to normalization,
\sum_{i=1}^{W} p_i = 1, \qquad (5)
and to mean energy. We assume that the ith microstate of the system under consideration, which has probability p i , has energy ϵ i . The mean energy is then
E = \sum_{i=1}^{W} p_i \epsilon_i. \qquad (6)
Introducing the Lagrange multipliers α and β , corresponding to the constraints of normalization (5) and the mean energy (6), the optimization of S q leads to the variational problem
\delta \left[ S_q - \alpha \sum_{i=1}^{W} p_i - \beta E \right] = 0, \qquad (7)
yielding
p_i^{\,q-1} = \frac{1}{q} \left[ 1 - (q-1)(\alpha + \beta \epsilon_i) \right]. \qquad (8)
For later comparison with thermostatistical formalisms based on general entropic forms, it will prove convenient to recast the above equation as
p_i^{\,q-1} = 1 + (q-1) \left[ A - B (\alpha + \beta \epsilon_i) \right], \qquad (9)
with $A = -1/q$ and $B = 1/q$. At first glance, it might seem cumbersome to introduce the parameters $A$ and $B$, since, within the context of the $S_q$-thermostatistics, they are simple functions of the entropic parameter $q$. The new parameters, however, will prove essential when exploring the duality properties exhibited by thermostatistical formalisms based on other generalized entropies, and when comparing those properties with the ones corresponding to the $S_q$ entropy. In those scenarios, the parameters $A$ and $B$ take other values, depending on the parameterized form of the relevant entropic functionals. Using the $A$ and $B$ parameters, the maximum $S_q$ entropy probability distribution can be expressed in terms of a $q$-exponential, as follows:
p_i = \exp_{\tilde{q}} \left[ A - B(\alpha + \beta \epsilon_i) \right] = \ln_{\tilde{q}}^{(-1)} \left[ A - B(\alpha + \beta \epsilon_i) \right], \qquad (10)
where
\tilde{q} = 2 - q. \qquad (11)
Comparing now the Equation (4) for the entropy, with the Equation (10) for the probabilities optimizing the entropy, we see that the S q entropy can be expressed in terms of a q-logarithm function, while the optimal probabilities are given by an inverse q-logarithm function (that is, by a q-exponential function). However, the value of the q-parameter that appears in the first q-logarithm, associated with the entropy, does not coincide with the one, denoted by q ˜ , that appears in the inverse q-logarithm defining the optimal probabilities. This pair of q-values satisfy the duality relation (11). It is important to emphasize that the duality relation (11) has the property
\tilde{\tilde{q}} = q. \qquad (12)
In other words, the dual of the dual of q is equal to q itself. Note also that, in the Boltzmann–Gibbs limit, q 1 , the duality relation reduces to q ˜ = q = 1 . The Boltzmann–Gibbs thermostatistics, regarded as a particular member of the S q -thermostatistical family, is self-dual. The duality relation (12) between the values of the q-parameters characterizing two q-logarithm functions, can be reformulated as a duality relation between the q-logarithms themselves. Indeed, one has that
\ln_{\tilde{q}}(x) = -\ln_q \frac{1}{x}. \qquad (13)
For $q \to 1$, the self-dual condition $q = \tilde{q} = 1$ is obtained, and the relation (13) reduces to the well-known property of the standard logarithm, $\ln(x) = -\ln(1/x)$.
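The duality relation (13) is easy to verify numerically; the following sketch (an illustrative check with arbitrarily chosen values of $q$ and $x$) confirms that $\ln_{2-q}(x) = -\ln_q(1/x)$, including the self-dual Boltzmann–Gibbs case:

```python
import numpy as np

def ln_q(x, q):
    """q-logarithm, with the q -> 1 limit giving the ordinary logarithm."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

# Duality relation (13): ln_{q~}(x) = -ln_q(1/x), with q~ = 2 - q.
q = 1.7
q_dual = 2.0 - q
for x in (0.3, 1.0, 4.2):
    assert np.isclose(ln_q(x, q_dual), -ln_q(1.0 / x, q))

# Self-dual Boltzmann-Gibbs case (q = q~ = 1): ln(x) = -ln(1/x).
assert np.isclose(ln_q(4.2, 1.0), -ln_q(1.0 / 4.2, 1.0))
```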

3. Generalized Entropies and Logarithms

Now, we shall consider a generic trace-form entropy S G . It can always be written in the form
S_G = \sum_{i=1}^{W} p_i \ln_G \frac{1}{p_i}, \qquad (14)
expressed in terms of an appropriate generalized logarithm function $\ln_G(x)$. The specific form of the generalized logarithmic function $\ln_G(x)$ depends on which particular thermostatistical formalism one is considering. For example, in the case of the $S_q$-based thermostatistics, $\ln_G(x)$ is given by the generalized logarithm $\ln_q(x)$. Note that the subindex “G” stands for “generalized”, and does not represent a numerical parameter. In order to lead to a sensible entropy, the function $\ln_G(x)$ has to be continuous and twice differentiable, has to comply with $x \ln_G(1/x) > 0$ for $0 < x < 1$ and $\lim_{x \to 0} x \ln_G(1/x) = \lim_{x \to 1} x \ln_G(1/x) = 0$, and has to satisfy the concavity requirement $\frac{d^2}{dx^2} \left[ x \ln_G \frac{1}{x} \right] < 0$.
One can optimize the entropic measure (14) under the constraints imposed by normalization (5) and by the energy mean value (6). The corresponding variational problem reads
\delta \left[ S_G - \alpha \sum_{i=1}^{W} p_i - \beta E \right] = 0, \qquad (15)
where α and β are the Lagrange multipliers corresponding to the normalization and the mean energy constraints. The solution to the variational problem is given by a probability distribution complying with the equations
\frac{1}{p_i} \ln_G' \! \left( \frac{1}{p_i} \right) - \ln_G \! \left( \frac{1}{p_i} \right) = -(\alpha + \beta \epsilon_i), \qquad (i = 1, \ldots, W), \qquad (16)
where $\ln_G'(x) = \frac{d}{dx} \ln_G(x)$.
Equation (16) arises from a generic entropy optimization problem. Basically, the optimization of any trace form entropy leads to an equation of the form (16). Here, we want to consider a particular family of entropies, leading to maximum entropy distributions satisfying a special symmetry requirement. We want the maximum entropy distribution p i to be of the form
p_i = \ln_{\tilde{G}}^{(-1)}(\xi_i), \qquad (17)
where $\xi_i = A - B(\alpha + \beta \epsilon_i)$, with $A$ and $B$ appropriate constants ($B > 0$), and $\ln_{\tilde{G}}^{(-1)}$ is the inverse of a generalized logarithmic function $\ln_{\tilde{G}}(x)$, related to $\ln_G(x)$ through a duality relationship. A few clarifying remarks are now in order. First, $\xi_i$ is, up to the additive and multiplicative constants $A$ and $B$, equal to the right-hand side of (16). Second, the constants $A$ and $B$ depend only on the form of the entropy (14), and not on any details of the system under consideration, such as the number of microstates $W$, the values of the microstates' energies $\epsilon_i$, or the values of the Lagrange multipliers $\alpha$ and $\beta$. Last, the duality relation connecting the functions $\ln_G(x)$ and $\ln_{\tilde{G}}(x)$ is such that the dual of the dual is equal to the original function, that is,
\ln_{\tilde{\tilde{G}}}(x) = \ln_G(x). \qquad (18)
Combining Equations (16) and (17), one obtains
\frac{1}{p_i} \ln_G' \! \left( \frac{1}{p_i} \right) - \ln_G \! \left( \frac{1}{p_i} \right) = \frac{1}{B} \left[ \ln_{\tilde{G}}(p_i) - A \right]. \qquad (19)
Redefining the constants according to $A \to -A/B$ and $B \to 1/B$, the above equation can be cast in the more convenient form
\frac{1}{p_i} \ln_G' \! \left( \frac{1}{p_i} \right) - \ln_G \! \left( \frac{1}{p_i} \right) = A + B \ln_{\tilde{G}}(p_i). \qquad (20)
For a given duality relation ln G ( x ) ln G ˜ ( x ) , and given values of the parameters A and B, Equation (20) can be regarded as a differential equation that has to be obeyed by the generalized logarithmic function ln G ( x ) . For solving the differential equation, one needs an initial condition, given by the value ln G ( x 0 ) adopted by the generalized logarithm at some particular point x 0 . We shall assume, as an initial condition, that ln G ( 1 ) = 0 .
Different forms of the duality relation ln G ( x ) ln G ˜ ( x ) are compatible with different forms of the generalized logarithm, and with different forms of the generalized entropy. In what follows, we shall explore some instances of duality relations, in order to determine which entropic forms are compatible with them.

3.1. The Duality Condition Satisfied by the S q Thermostatistics

Motivated by the S q -based thermostatistics, we shall first adopt the duality condition
\ln_{\tilde{G}}(x) = -\ln_G(1/x), \qquad (21)
which is precisely the relation (13) satisfied by the S q -thermostatistics. Equation (20) then becomes
\frac{1}{p_i} \ln_G' \! \left( \frac{1}{p_i} \right) - \ln_G \! \left( \frac{1}{p_i} \right) = A - B \ln_G \! \left( \frac{1}{p_i} \right). \qquad (22)
Therefore, in order to find the form of ln G ( x ) , we have to solve the differential equation
\ln_G'(x) = \frac{1}{x} \left[ A + (1 - B) \ln_G(x) \right], \qquad (23)
with the initial condition ln G ( 1 ) = 0 . The (unique) solution of Equation (23) is then
\ln_G(x) = A \, \frac{x^{1-B} - 1}{1-B}. \qquad (24)
We see that, up to the multiplicative constant A, the only generalized logarithmic function leading to an entropy optimization scheme compatible with the duality condition (21) is the q-logarithm
\ln_q(x) = \frac{x^{1-q} - 1}{1-q}. \qquad (25)
The parameter B appearing in (22) coincides with the parameter q of the S q -thermostatistics.
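One can check numerically that the solution above indeed obeys Equation (23); the sketch below (with illustrative, arbitrarily chosen values of $A$ and $B$) compares a finite-difference derivative of $\ln_G$ against the right-hand side of (23):

```python
def ln_G(x, A=0.8, B=1.5):
    """Candidate solution: ln_G(x) = A * (x**(1 - B) - 1) / (1 - B)."""
    return A * (x ** (1.0 - B) - 1.0) / (1.0 - B)

def residual(x, A=0.8, B=1.5, h=1e-6):
    """Residual of Eq. (23): ln_G'(x) - (1/x) [A + (1 - B) ln_G(x)],
    with the derivative estimated by a central finite difference."""
    deriv = (ln_G(x + h, A, B) - ln_G(x - h, A, B)) / (2.0 * h)
    return deriv - (A + (1.0 - B) * ln_G(x, A, B)) / x

# The residual vanishes (to finite-difference accuracy) for all x > 0.
for x in (0.5, 1.0, 2.0, 7.0):
    assert abs(residual(x)) < 1e-8
```

The same check passes for any choice of $A$ and $B$, reflecting the fact that (24) is the unique solution with $\ln_G(1) = 0$.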

3.2. The Simplest Duality Relation

We shall now consider the simplest possible duality relation, which is
\ln_{\tilde{G}}(x) = \ln_G(x). \qquad (26)
In spite of its simplicity, this duality relation is worthy of consideration, because it includes the standard logarithm (and the corresponding Boltzmann–Gibbs scenario) as a particular case. It is interesting, therefore, to explore which entropic forms are compatible with the simplest conceivable condition (26), even if this exploration is not a priori motivated by a generalized entropy of known physical relevance.
Combining the general Equation (20) with the duality relation (26), one obtains
\frac{1}{p_i} \ln_G' \! \left( \frac{1}{p_i} \right) - \ln_G \! \left( \frac{1}{p_i} \right) = A + B \ln_G(p_i). \qquad (27)
Then, we have to solve the ordinary differential equation
\frac{1}{x} \ln_G' \! \left( \frac{1}{x} \right) - \ln_G \! \left( \frac{1}{x} \right) = A + B \ln_G(x), \qquad (28)
or, equivalently,
\frac{d \ln_G}{dx} = \frac{1}{x} \left[ \ln_G(x) + A + B \ln_G \! \left( \frac{1}{x} \right) \right], \qquad (29)
with the condition ln G ( 1 ) = 0 . At first sight, Equation (29) may look like a standard ordinary differential equation. It has, however, the peculiarity that in the right-hand side of (29), the unknown function ln G is evaluated at two different values of its argument: x and 1 / x . This situation is similar to the one that occurs, for instance, with differential equations describing dynamical systems with delay. In the case of (29), this difficulty can be removed by recasting the equation as a pair of coupled ordinary differential equations. Let us introduce the functions
F(x) = \ln_G(x), \qquad G(x) = \ln_G(1/x). \qquad (30)
The differential Equation (28) can be reformulated as the two coupled differential equations
\frac{dF}{dx} = \frac{1}{x} \left[ F(x) + B \, G(x) + A \right], \qquad \frac{dG}{dx} = -\frac{1}{x} \left[ G(x) + B \, F(x) + A \right], \qquad (31)
with the conditions F ( 1 ) = G ( 1 ) = 0 . To find a solution for (31), we propose the ansatz
F(x) = c_1 x^{\gamma_1} + c_2 x^{\gamma_2} + c_3, \qquad G(x) = c_1 x^{-\gamma_1} + c_2 x^{-\gamma_2} + c_3. \qquad (32)
If one inserts the ansatz (32) into the differential Equations (31), one can verify that (32) constitutes a solution, provided that
\gamma_1 = -\gamma_2 \ne 0, \qquad c_1 / c_2 = -\frac{1}{B} \left( 1 + \sqrt{1 - B^2} \right), \qquad 0 \le B^2 \le 1, \qquad c_3 = -\frac{A}{1+B}, \qquad (33)
and
\gamma = \sqrt{1 - B^2}, \qquad (34)
where γ = γ 1 = γ 2 . It follows from (33) and (34) that 0 γ 1 , and that
c_2 = -\sqrt{\frac{1-\gamma}{1+\gamma}} \; c_1. \qquad (35)
The relations (33)–(35), together with the initial conditions F ( 1 ) = G ( 1 ) = 0 , lead to
F(x) = \frac{A}{1+B} \left[ \frac{\sqrt{1+\gamma}\, x^{\gamma} - \sqrt{1-\gamma}\, x^{-\gamma}}{\sqrt{1+\gamma} - \sqrt{1-\gamma}} - 1 \right], \qquad (36)
and
G(x) = \frac{A}{1+B} \left[ \frac{\sqrt{1+\gamma}\, x^{-\gamma} - \sqrt{1-\gamma}\, x^{\gamma}}{\sqrt{1+\gamma} - \sqrt{1-\gamma}} - 1 \right]. \qquad (37)
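The pair (36) and (37) can be verified numerically against the coupled system (31); in the sketch below (illustrative values $A = 1$, $B = 0.6$, hence $\gamma = 0.8$), both equations and the conditions $F(1) = G(1) = 0$ are checked:

```python
import math

A, B = 1.0, 0.6
gamma = math.sqrt(1.0 - B * B)        # gamma = sqrt(1 - B^2) = 0.8
sp, sm = math.sqrt(1.0 + gamma), math.sqrt(1.0 - gamma)

def F(x):
    """Solution (36)."""
    return (A / (1.0 + B)) * ((sp * x**gamma - sm * x**(-gamma)) / (sp - sm) - 1.0)

def G(x):
    """Solution (37); note that G(x) = F(1/x)."""
    return (A / (1.0 + B)) * ((sp * x**(-gamma) - sm * x**gamma) / (sp - sm) - 1.0)

def d(f, x, h=1e-6):
    """Central finite-difference derivative."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

assert abs(F(1.0)) < 1e-12 and abs(G(1.0)) < 1e-12
for x in (0.5, 1.3, 3.0):
    assert abs(d(F, x) - (F(x) + B * G(x) + A) / x) < 1e-7   # first eq. of (31)
    assert abs(d(G, x) + (G(x) + B * F(x) + A) / x) < 1e-7   # second eq. of (31)
    assert abs(G(x) - F(1.0 / x)) < 1e-12                    # G is the dual of F
```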
The solution to the system of differential Equations (31) is completely determined by the conditions $F(1) = G(1) = 0$. Therefore, given these conditions, and for $0 \le B \le 1$, the solution (36) and (37) is unique. Now, the entropy $S_{\gamma}$ compatible with the duality relation (26) is $S_{\gamma} = \sum_i p_i \ln_G(1/p_i)$, with $\ln_G(x) = F(x)$. Therefore, for $0 \le B \le 1$, one has
S_{\gamma} = \frac{A}{1+B} \sum_i \left[ \frac{\sqrt{1+\gamma}\, p_i^{1-\gamma} - \sqrt{1-\gamma}\, p_i^{1+\gamma}}{\sqrt{1+\gamma} - \sqrt{1-\gamma}} - p_i \right], \qquad (38)
which, after some algebra, can be recast in the more convenient form
S_{\gamma} = \frac{A}{2} \, \frac{\sqrt{1+\gamma} + \sqrt{1-\gamma}}{1 + \sqrt{1-\gamma^2}} \sum_i \left[ \sqrt{1+\gamma} \, \frac{p_i^{1-\gamma} - p_i}{\gamma} + \sqrt{1-\gamma} \, \frac{p_i - p_i^{1+\gamma}}{\gamma} \right]. \qquad (39)
Introducing now the parameters $q = 1 - \gamma$ ($0 \le q \le 1$) and $q^* = 1 + \gamma = 2 - q$ ($1 \le q^* \le 2$), the entropy (39) can be expressed as a linear combination of two $S_q$ entropies,
S_{\gamma} = K \left[ \sqrt{q^*} \, S_q + \sqrt{q} \, S_{q^*} \right], \qquad (40)
where
K = \frac{A}{2} \, \frac{\sqrt{q} + \sqrt{q^*}}{1 + \sqrt{q \, q^*}}. \qquad (41)
In the limit $B \to 1$, which corresponds to $\gamma \to 0$, $q \to 1$, and $q^* \to 1$, the generalized entropy (40) is, up to a multiplicative constant, equal to the Boltzmann–Gibbs entropy $S_{BG}$.

3.3. More General Duality Relations

It is possible to consider duality relations more general than the ones discussed previously. One can consider scenarios where the relation between a generalized logarithm and its dual is defined in terms of a pair of functions h 1 , 2 ( x ) , as
\ln_{\tilde{G}}(x) = h_1 \! \left( \ln_G(h_2(x)) \right), \qquad (42)
where the functions h 1 , 2 ( x ) satisfy
h_1(h_1(x)) = x, \qquad \text{and} \qquad h_2(h_2(x)) = x. \qquad (43)
For example, the duality relation associated with the $S_q$ entropy corresponds to $h_1(x) = -x$ and $h_2(x) = 1/x$, while the duality relation associated with the entropy $S_{\gamma}$ corresponds to $h_1(x) = h_2(x) = x$.
Other duality relations can be constructed, for instance, in terms of the Moebius transformations
M(x) = \frac{m_1 x + m_2}{m_3 x + m_4}, \qquad (44)
with $m_1 m_4 - m_2 m_3 \ne 0$. The inverse of (44) is
M^{(-1)}(x) = \frac{m_4 x - m_2}{-m_3 x + m_1}. \qquad (45)
Moebius transformations that are self-inverse (that is, transformations coinciding with their own inverse: M ( x ) = M ( 1 ) ( x ) ) are candidates for the functions h 1 , 2 ( x ) from which possible duality relations for generalized logarithmic functions can be constructed. Examples of self-inverse Moebius transformations are those of the form
M(x) = \frac{m_1 x + m_2}{m_3 x - m_1}, \qquad (46)
which have $m_4 = -m_1$. Notice that, for $m_1 \ne 0$, the above form of $M(x)$ depends on only two parameters, as follows: $M(x) = \frac{x + (m_2/m_1)}{(m_3/m_1)x - 1}$. Another self-inverse Moebius transformation, not included in the family (46), is the identity function, $M(x) = x$, corresponding to $m_1 = m_4 \ne 0$ and $m_2 = m_3 = 0$ (see also [28]). The duality relations corresponding to the entropic measures $S_q$ and $S_{\gamma}$ are both constructed in terms of particular instances of Moebius transformations. The duality relation associated with the entropy $S_q$ is constructed with $h_1(x) = -x$ and $h_2(x) = 1/x$, which are the self-inverse Moebius transformations corresponding, respectively, to $m_1 = -1$, $m_4 = 1$, $m_2 = m_3 = 0$, and to $m_1 = m_4 = 0$, $m_2 = m_3 = 1$. The duality relation for the entropy $S_{\gamma}$ is constructed with $h_1(x) = h_2(x) = x$, which corresponds to $m_1 = m_4 = 1$ and $m_2 = m_3 = 0$.
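A quick numerical sketch (with arbitrarily chosen coefficients obeying $m_4 = -m_1$) illustrates the self-inverse property $M(M(x)) = x$ of the transformations (46), as well as the two building blocks of the $S_q$ duality:

```python
def moebius(m1, m2, m3, m4):
    """Moebius transformation M(x) = (m1 x + m2) / (m3 x + m4), Eq. (44)."""
    return lambda x: (m1 * x + m2) / (m3 * x + m4)

# Any member of the family (46), i.e., with m4 = -m1, is self-inverse.
M = moebius(2.0, 3.0, 5.0, -2.0)
for x in (0.7, 1.3, -4.0):
    assert abs(M(M(x)) - x) < 1e-12

# The S_q duality uses h1(x) = -x and h2(x) = 1/x, both self-inverse.
h1 = moebius(-1.0, 0.0, 0.0, 1.0)
h2 = moebius(0.0, 1.0, 1.0, 0.0)
assert h1(h1(0.37)) == 0.37
assert abs(h2(h2(0.37)) - 0.37) < 1e-15
```

(The points $x$ are chosen away from the pole of $M$, where the transformation is not defined.)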
A generalized logarithmic function ln G ( x ) defining a trace-form entropy (14), for which the associated entropic optimization principle leads to the duality relation (42), must satisfy the differential equation
\frac{1}{x} \ln_G' \! \left( \frac{1}{x} \right) - \ln_G \! \left( \frac{1}{x} \right) = A + B \ln_{\tilde{G}}(x) = A + B \, h_1 \! \left( \ln_G(h_2(x)) \right), \qquad (47)
with the condition ln G ( 1 ) = 0 . For expression (14) to represent a sensible (i.e., concave) entropy, the generalized logarithm satisfying (47) has to comply with the requirement
\frac{d^2}{dx^2} \left[ x \ln_G \frac{1}{x} \right] = -B \, \frac{d}{dx} \, h_1 \! \left( \ln_G(h_2(x)) \right) < 0. \qquad (48)
For duality relations more general than the two already analyzed in detail (corresponding to the entropies $S_q$ and $S_{\gamma}$), the associated differential Equation (47) presumably has to be treated numerically.

3.4. Duality Relations: The Inverse Problem

One can also consider the following inverse problem. Given a parameterized family of non-negative, monotonically increasing functions $J(x; \lambda)$, depending on one or more parameters (that we collectively denote by $\lambda$), determine whether the inverse function $J^{(-1)}(x; \lambda)$ is related to a generalized logarithmic function defining a sensible entropy (14) and satisfying a duality relation (42) defined in terms of appropriate functions $h_{1,2}(x)$. That is, for the inverse function $J^{(-1)}(x; \lambda)$, determine whether suitable functions $h_{1,2}(x)$ exist, and identify them. We assume that the integral
I = \int_0^1 J^{(-1)}(x; \lambda) \, dx \qquad (49)
converges.
In order to formulate this inverse problem, we consider a thermostatistical formalism, based on a generalized entropy, which yields optimizing-entropy canonical probability distributions of the form
p_i = J(\xi_i; \lambda), \qquad (50)
where $\xi_i = A - B(\alpha + \beta \epsilon_i)$. In the latter expression, $\alpha$ and $\beta$ are, as usual, the Lagrange multipliers associated with normalization and the mean energy, and $A$ and $B$ are constants, possibly depending on the parameters $\lambda$ characterizing the function $J(x; \lambda)$.
The associated entropy S J can be expressed as
S_J = \sum_i C(p_i), \qquad (51)
where the function C ( x ) is defined as the integral
C(x) = \int_x^1 \left[ J^{(-1)}(x'; \lambda) - I \right] dx'. \qquad (52)
The function C ( x ) satisfies the following properties,
C(x) > 0 \ \ \text{for} \ 0 < x < 1, \qquad C(0) = C(1) = 0, \qquad \frac{dC}{dx} = I - J^{(-1)}(x; \lambda), \qquad \frac{d^2 C}{dx^2} = -\frac{d J^{(-1)}(x; \lambda)}{dx} < 0, \qquad (53)
which guarantee that $S_J$, defined by (51), is a sensible entropy. For $J(x) = \exp(x)$, one has $J^{(-1)}(x) = \ln(x)$, $I = -1$, $C(x) = -x \ln(x)$, and $S_J$ coincides with the Boltzmann–Gibbs entropy. If we compare the expression (51) for $S_J$ with the expression (14) for a generalized entropy in terms of a generalized logarithm, we find that the generalized logarithm associated with $S_J$ is
\ln_G^{(J)} \! \left( \frac{1}{x} \right) = \frac{1}{x} \int_x^1 \left[ J^{(-1)}(x'; \lambda) - I \right] dx', \qquad (54)
or, equivalently,
\ln_G^{(J)}(x) = x \int_{1/x}^1 \left[ J^{(-1)}(x'; \lambda) - I \right] dx'. \qquad (55)
On the other hand, if we compare the form (17) for a generalized canonical distribution, with the form (50) corresponding to the function S J , we obtain
\ln_{\tilde{G}}^{(J)}(x) = J^{(-1)}(x; \lambda). \qquad (56)
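These formulas can be checked numerically in the Boltzmann–Gibbs case $J(x) = \exp(x)$; the sketch below (simple trapezoid-rule quadrature) recovers $C(x) = -x \ln x$ and, via (52), the generalized logarithm $\ln_G^{(J)}(1/x) = C(x)/x = \ln(1/x)$:

```python
import math

def C(x, J_inv=math.log, I=-1.0, n=20000):
    """C(x) = integral from x to 1 of [J^{-1}(t) - I] dt, Eq. (52),
    evaluated with the composite trapezoid rule."""
    h = (1.0 - x) / n
    ts = [x + i * h for i in range(n + 1)]
    vals = [J_inv(t) - I for t in ts]
    return h * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])

# Boltzmann-Gibbs case: J = exp, J^{-1} = ln, I = integral_0^1 ln t dt = -1.
x = 0.5
assert abs(C(x) - (-x * math.log(x))) < 1e-8
# The associated generalized logarithm is C(x)/x, which here equals ln(1/x).
assert abs(C(x) / x - math.log(1.0 / x)) < 1e-8
```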
The present inverse problem consists of determining what type of duality relation, if any, exists between the functions (55) and (56). This appears to be a difficult problem, which has to be tackled on a case-by-case basis. As an intriguing example of this inverse problem, we can consider the one posed by probability distributions related to the Mittag-Leffler function $E_{a,b}(x)$ (see [29] and references therein). The Mittag-Leffler function is given, for a general complex argument $z$, by the power series expansion
E_{a,b}(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(b + ak)}, \qquad a, b \in \mathbb{C}, \ \operatorname{Re}(a) > 0, \ \operatorname{Re}(b) > 0, \ z \in \mathbb{C}, \qquad (57)
with $E_a(z) \equiv E_{a,1}(z)$. Notice that, in the literature [29], the two parameters $a$ and $b$ characterizing the Mittag-Leffler function are sometimes referred to as $\alpha$ and $\beta$.
The Mittag-Leffler function has several applications in physics and other fields. In particular, it plays a distinguished role in the study of non-standard diffusion processes involving fractional calculus operators [29]. In the present context, we consider only real values of the parameters ( a , b ) and real arguments. A few examples of the Mittag-Leffler function, and of its inverses, are respectively depicted in Figure 1 and Figure 2, for b = 1 and different values of the parameter a.
In the context of a Mittag-Leffler-based thermostatistical formalism, some possible choices for the function J ( x ; λ ) would be
J(x; \lambda) = E_{a,b}(x), \qquad \text{or} \qquad J(x; \lambda) = E_{a,b}(x^2), \qquad (58)
where $\lambda = (a, b)$ is the set of parameters characterizing the Mittag-Leffler function. For each of these choices, provided that the values of the parameters $\lambda$ are such that the appropriate conditions are fulfilled, it is possible to explore the existence of functions $h_{1,2}$ for which the Mittag-Leffler-related generalized logarithms (55) and (56) satisfy a differential equation of the form (47). For $J(x; \lambda) = E_{a,1}(x) = E_a(x)$, the corresponding generalized entropy (51) is defined in terms of the function $C(x)$ given by (52). A few examples of $C(x)$, which we obtained by numerically evaluating the integrals (49) and (52) for particular values of the parameter $a$, are plotted in Figure 3.
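The numerical ingredients just described can be sketched as follows (an illustrative implementation, not the authors' code): the Mittag-Leffler series is summed in log space to avoid overflow, and its inverse, needed in the integrals (49) and (52), is obtained by bisection, exploiting the fact that for $0 < a \le 1$ the function $E_a$ is monotonically increasing on the negative real axis:

```python
import math

def mittag_leffler(z, a, b=1.0, tol=1e-16, max_terms=400):
    """E_{a,b}(z) = sum_k z^k / Gamma(b + a k), summed until the term
    magnitude drops below tol; term magnitudes are built via lgamma
    to avoid overflow in Gamma for large k."""
    total = math.exp(-math.lgamma(b))        # k = 0 term
    if z == 0.0:
        return total
    logz = math.log(abs(z))
    for k in range(1, max_terms):
        mag = math.exp(k * logz - math.lgamma(b + a * k))
        total += -mag if (z < 0.0 and k % 2 == 1) else mag
        if mag < tol:
            break
    return total

def ml_inverse(p, a, lo=-10.0, hi=0.0, iters=100):
    """Solve E_a(y) = p for y in [lo, 0] by bisection (0 < p <= 1)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mittag_leffler(mid, a) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity checks against known special cases:
assert abs(mittag_leffler(0.5, 1.0) - math.exp(0.5)) < 1e-12   # E_1 = exp
assert abs(mittag_leffler(1.0, 2.0) - math.cosh(1.0)) < 1e-12  # E_2(x) = cosh(sqrt(x))
assert abs(ml_inverse(0.3, 1.0) - math.log(0.3)) < 1e-8        # inverse of exp is ln
```

With `ml_inverse` in hand, the integrals (49) and (52) can be evaluated by standard quadrature, reproducing curves such as those of Figure 3.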

4. Conclusions

Several generalizations or extensions of the notion of entropy have been advanced and enthusiastically investigated in recent years. The associated entropic optimization problems seem to provide valuable tools for the study of diverse problems in physics and other fields, particularly when applied to the analysis of complex systems. Among the growing number of entropic forms that have been advanced, the non-additive, power-law $S_q$ entropies exhibit the largest number of successful applications. It is clear by now, however, that the $S_q$ entropies are not universal: some systems or processes seem to be described by entropic forms not belonging to the $S_q$ family. Given this state of affairs, it is imperative to investigate in detail the properties of the various entropies, and of the associated thermostatistics, in order to elucidate and clarify the deep reasons that make them suitable for treating specific problems. In particular, the structural features of the $S_q$ thermostatistics are certainly worthy of close scrutiny. In the present work, we investigated one of these features, according to which the $q$-exponential function describing the maximum-entropy probability distributions is linked, via a duality relation, with the $q$-logarithm function in terms of which the $S_q$ entropy itself can be defined. We investigated which entropic functionals lead to this kind of structure and explored the corresponding duality relations.
The main take-home message of the present work is that there is a close connection between the aforementioned duality relations and the forms of the entropic measures. The $S_q$ thermostatistics exhibits a particular duality connection, which, in the limit of the Boltzmann–Gibbs thermostatistics, reduces to a self-duality. We proved that there is no other entropic functional satisfying the duality relation associated with $S_q$, namely, Equation (21). This constitutes what may be regarded as a new uniqueness theorem leading to $S_q$, in addition to those already existing, such as the Enciso–Tempesta theorem [30] and those indicated therein. Assuming other types of duality relation, it is possible to formulate differential equations that lead to new entropic measures complying with the assumed duality. We studied in detail a duality relation leading to a differential equation that admits closed analytical solutions, and corresponds to a new generalized entropy, which we denoted by $S_{\gamma}$. The duality relations characterizing the entropies $S_q$ and $S_{\gamma}$ seem to be exceptional, in that the concomitant differential equations can be solved analytically. In many other cases, the differential equations resulting from duality relations have to be treated numerically. The investigation of these equations, associated with thermostatistical scenarios different from, or more general than, those based on the entropies $S_q$ and $S_{\gamma}$, would certainly be worthwhile. It would also be valuable to identify new duality relations admitting an analytical treatment. The exploration of the ensuing thermostatistical scenarios may suggest interesting new applications of generalized entropies. Another promising direction for future research is to extend the present study to scenarios involving non-trace-form entropies [31], or involving escort mean values [27,32]. We would be delighted to see further advances along these or related lines.

Author Contributions

A.R.P., C.T., R.S.W., H.J.H. contributed equally to this paper. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support from the Brazilian funding agencies Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro (FAPERJ), and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior–Brasil (CAPES) is gratefully acknowledged.

Acknowledgments

We acknowledge the financial support. One of us (CT) also acknowledges interesting related discussions with E.M.F. Curado.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Tsallis, C. Entropy. Encyclopedia 2022, 2, 264–300.
2. Jizba, P.; Korbel, J. Maximum entropy principle in statistical inference: Case for non-Shannonian entropies. Phys. Rev. Lett. 2019, 122, 120601.
3. Naudts, J. Generalised Thermostatistics; Springer: London, UK, 2011.
4. Beck, C. Generalised information and entropy measures in physics. Contemp. Phys. 2009, 50, 495–510.
5. Tsallis, C. Introduction to Nonextensive Statistical Mechanics—Approaching a Complex World; Springer: New York, NY, USA, 2009.
6. Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL 2011, 93, 20006.
7. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
8. Gell-Mann, M.; Tsallis, C. Nonextensive Entropy: Interdisciplinary Applications; Oxford University Press: Oxford, UK, 2004.
9. Tsallis, C. The nonadditive entropy Sq and its applications in physics and elsewhere: Some remarks. Entropy 2011, 13, 1765–1804.
10. Tsallis, C. Beyond Boltzmann-Gibbs-Shannon in physics and elsewhere. Entropy 2019, 21, 696.
11. Sánchez Almeida, J. The principle of maximum entropy and the distribution of mass in galaxies. Universe 2022, 8, 214.
12. Borges, E.P.; Roditi, I. A family of nonextensive entropies. Phys. Lett. A 1998, 246, 399–402.
13. Anteneodo, C.; Plastino, A.R. Maximum entropy approach to stretched exponential probability distributions. J. Phys. A Math. Gen. 1999, 32, 1089–1097.
14. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
15. Obregón, O. Superstatistics and gravitation. Entropy 2010, 12, 2067–2076.
16. Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy 2018, 20, 813.
17. Ilic, V.M.; Korbel, J.; Gupta, S.; Scarfone, A.M. An overview of generalized entropic forms. EPL 2021, 133, 50005.
18. Plastino, A.; Plastino, A.R. On the universality of thermodynamics’ Legendre transform structure. Phys. Lett. A 1997, 226, 257–263.
19. Mendes, R.S. Some general relations in arbitrary thermostatistics. Physica A 1997, 242, 299–308.
20. Curado, E.M.F. General aspects of the thermodynamical formalism. Braz. J. Phys. 1999, 29, 36–45.
21. Plastino, A.R.; Miller, H.G.; Plastino, A.; Yen, G.D. The role of information measures in the determination of the maximum entropy-minimum norm solution of the generalized inverse problem. J. Math. Phys. 1997, 38, 6675–6682.
22. Roston, G.B.; Plastino, A.R.; Casas, M.; Plastino, A.; Da Silva, L.R. Dynamical thermostatting and statistical ensembles. Eur. Phys. J. B 2005, 48, 87–93.
23. Plastino, A.R.; Anteneodo, C. A dynamical thermostatting approach to nonextensive canonical ensembles. Ann. Phys. 1997, 255, 250–269.
24. Tsallis, C.; Souza, A.M.C. Constructing a statistical mechanics for Beck-Cohen superstatistics. Phys. Rev. E 2003, 67, 026106.
25. Saadatmand, S.N.; Gould, T.; Cavalcanti, E.G.; Vaccaro, J.A. Thermodynamics from first principles: Correlations and nonextensivity. Phys. Rev. E 2020, 101, 060101.
26. Curado, E.M.F.; Tsallis, C. Generalized statistical mechanics: Connection with thermodynamics. J. Phys. A 1991, 24, L69.
27. Tsallis, C.; Mendes, R.S.; Plastino, A.R. The role of constraints within generalized nonextensive statistics. Phys. A 1998, 261, 534–554.
28. Gazeau, J.P.; Tsallis, C. Moebius transforms, cycles and q-triplets in statistical mechanics. Entropy 2019, 21, 1155.
29. Haubold, H.J.; Mathai, A.M.; Saxena, R.K. Mittag-Leffler Functions and Their Applications. J. Appl. Math. 2011, 2011, 298628.
30. Enciso, A.; Tempesta, P. Uniqueness and characterization theorems for generalized entropies. J. Stat. Mech. 2017, 2017, 123101.
31. Tempesta, P.; Jensen, H.J. Universality classes and information-theoretic measures of complexity via group entropies. Sci. Rep. 2020, 10, 1–11.
32. Furuichi, S. On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics. J. Math. Phys. 2009, 50, 013303.
Figure 1. Plot of the Mittag-Leffler function E a , b ( x ) , for b = 1 and illustrative values of the parameter a; E 1 , 1 ( x ) = e x .
Figure 2. Plot of the inverse Mittag-Leffler function, E a , b ( 1 ) ( x ) , for b = 1 and specific values of the parameter a; E 1 , 1 ( 1 ) ( x ) = ln x .
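The functions shown in Figures 1 and 2 can be approximated, for moderate real arguments, directly from the defining series E a , b ( x ) = ∑ k ≥ 0 x k / Γ ( a k + b ) ; for b = 1 and 0 < a ≤ 1 the function is monotonically increasing on the real axis, so the inverse of Figure 2 can be obtained by bisection. The following Python sketch is our own illustration (a truncated series, adequate only for moderate arguments, not a production-grade evaluator):

```python
import math

def mittag_leffler(x, a, b=1.0, terms=80):
    """Truncated series E_{a,b}(x) = sum_k x**k / Gamma(a*k + b).
    Reliable only for moderate |x|; E_{1,1}(x) = exp(x)."""
    return sum(x**k / math.gamma(a * k + b) for k in range(terms))

def inverse_ml(y, a, lo=-50.0, hi=50.0, tol=1e-10):
    """Invert E_{a,1} by bisection, using its monotonicity for 0 < a <= 1."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mittag_leffler(mid, a) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Sanity checks against the classical limit a = 1:
assert abs(mittag_leffler(1.0, 1.0) - math.e) < 1e-9   # E_{1,1}(1) = e
assert abs(inverse_ml(math.e, 1.0) - 1.0) < 1e-6       # inverse reduces to ln
```

The bisection inverse is a simple stand-in for the inverse Mittag-Leffler function plotted in Figure 2; dedicated algorithms exist for large or complex arguments.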
Figure 3. Plot of the function C ( x ) corresponding to J ( x ) = E a ( x ) , for different values of the parameter a. The function C ( x ) appears in the definition of a trace-form entropic measure (51), and is given by Equation (52). For a = 1 , one has E 1 ( x ) = exp ( x ) and C ( x ) = x ln x .
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Plastino, A.R.; Tsallis, C.; Wedemann, R.S.; Haubold, H.J. Entropy Optimization, Generalized Logarithms, and Duality Relations. Entropy 2022, 24, 1723. https://doi.org/10.3390/e24121723
