Nonextensive Statistical Mechanics: Equivalence Between Dual Entropy and Dual Probabilities

The concept of duality of probability distributions constitutes a fundamental "brick" in the solid framework of nonextensive statistical mechanics, the generalization of Boltzmann-Gibbs statistical mechanics based on the q-entropy. The probability duality resolves long-standing issues of the theory; e.g., it ascertains the additivity of the internal energy given the additivity of the energy of microstates. However, it is a rather complex part of the theory, and it certainly cannot be trivially explained along Gibbs' path of entropy maximization. Recently, it was shown that an alternative picture exists, considering a dual entropy instead of a dual probability. In particular, the framework of nonextensive statistical mechanics can be equivalently developed using the q- and 1/q-entropies. The canonical probability distribution coincides again with the known q-exponential distribution, but without the necessity of the duality of ordinary/escort probabilities. Furthermore, it is shown that the dual entropies, the q-entropy and the 1/q-entropy, as well as the 1-entropy, are involved in an identity that is useful in theoretical developments and applications.


Introduction
Non-extensive statistical mechanics generalizes the classical statistical framework of Boltzmann-Gibbs (BG). The generalization is based on two fundamental considerations: (i) the q-entropy, a monoparametric generalization of the BG entropy [1], and (ii) the escort probability distribution [2,3], a metastable distribution at which the ordinary distribution that maximizes the entropy is stabilized. The metastable distribution coincides with the empirical model of distributions frequently observed in nature, and especially in plasmas, called either the q-exponential or the kappa distribution.
The connection of kappa distributions with non-extensive statistical mechanics, as well as the equivalence between the q-exponential and kappa distributions, has been examined by several authors (e.g., [37][38][39][40][41]). The empirical kappa distribution and the Tsallis-like Maxwellian distribution of velocities are, accidentally, of the same form, under a transformation between the q-index and the kappa parameter that labels and governs the kappa distributions. (For details on this topic, see the review [41], the special issue introduction [42], and the book on kappa distributions [32].) Understanding the statistical origin of these distributions has been a cornerstone of theoretical developments and a plethora of applications in space plasma physics and complexity science.
It is now well understood that the previously mentioned examples of observed distributions can be described by the q-exponential or kappa distributions, that is, the type of distribution maximizing the q-entropy in the canonical ensemble.
It was recently shown that the statistical framework of non-extensive statistical mechanics can be deduced as is, without the consideration of the dual formalism of ordinary/escort distributions. This concept can significantly simplify the usage of the theory, and make it accessible to the new generation of researchers who are struggling to understand and apply it to exotic particle systems out of thermal equilibrium, such as space plasmas. However, the cost of this simplification is the necessity of having two types of entropic functions. This dual formulation preserves the basic fundamental thermodynamic formulae, which is necessary for the consistent connection of statistical mechanics with thermodynamics. In Section 2, the paper presents the standard nonextensive statistical mechanics, which is based on the maximization of the q-entropy using the ordinary/escort duality formalism. In Section 3, we show how the framework of nonextensive statistical mechanics can be developed considering the entropy duality instead. The paper deals with the duality between the q- and 1/q-entropies; the maximization of the latter under the canonical ensemble; and the derivation of an identity formula involving the dual entropies, the q-entropy and the 1/q-entropy, as well as the 1-entropy, which can be useful in theoretical developments and applications. As an example, in Section 4, we focus on the continuous description of the energy distribution. Finally, Section 5 summarizes the conclusions.

q-Entropy and Ordinary/Escort Duality Formalism
Consider the discrete energy spectrum $\{\varepsilon_k\}_{k=1}^W$ associated with a discrete probability distribution $\{p_k\}_{k=1}^W$. Non-extensive statistical mechanics is based on the q-entropy and the dual formalism of ordinary/escort probabilities. The non-extensive entropy is given by [1]:

$$S_q = \frac{1 - \sum_{k=1}^W p_k^q}{q-1} , \qquad (1)$$

leading to the BG formulation for $q \to 1$, $S_1 = -\sum_{k=1}^W p_k \ln p_k$ (note: the entropic formulations are given in units of Boltzmann's constant $k_B$).
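As a quick numerical illustration (not part of the original derivation), the q-entropy and its BG limit can be sketched in a few lines of Python; the probability vector used here is arbitrary:

```python
import numpy as np

def S_q(p, q):
    """Tsallis q-entropy in units of k_B: S_q = (1 - sum_k p_k^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))          # BG limit S_1
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.3, 0.2]
print(S_q(p, 2.0))           # 0.62
print(S_q(p, 1.0))           # BG entropy, ~1.0297
print(S_q(p, 1.0 + 1e-8))    # approaches the BG value as q -> 1
```

The explicit `q -> 1` branch avoids the 0/0 form at `q = 1` and makes the BG limit visible directly.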
On the other hand, the escort probability distribution $\{P_k\}_{k=1}^W$ is constructed from the ordinary probability distribution $\{p_k\}_{k=1}^W$ as follows [2]:

$$P_k = \frac{p_k^q}{\sum_{j=1}^W p_j^q} . \qquad (2)$$
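The escort construction of Equation (2) is a one-liner; a minimal sketch (with an arbitrary probability vector) also shows that q = 1 returns the ordinary distribution unchanged:

```python
import numpy as np

def escort(p, q):
    """Escort distribution of Equation (2): P_k = p_k^q / sum_j p_j^q."""
    p = np.asarray(p, dtype=float)
    w = p ** q
    return w / w.sum()

p = np.array([0.5, 0.3, 0.2])
P = escort(p, 2.0)
print(P.sum())                          # normalized: 1.0
print(np.allclose(escort(p, 1.0), p))   # True: q = 1 leaves p unchanged
```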

Maximization of q-Entropy
The maximization of entropy is derived from the condition $\delta S_q = 0$. However, the probabilities $\{p_k\}_{k=1}^W$ do not constitute independent variables, because of the constraints of (i) probability normalization, and (ii) fixed internal energy; the two constraints can be expressed in terms of either the ordinary or the escort probabilities, i.e.,

$$\sum_{k=1}^W p_k = 1 , \qquad U = \sum_{k=1}^W P_k\,\varepsilon_k .$$

The Lagrange method (as used by Gibbs [128]) involves maximizing an alternative functional G, instead of the entropy directly, where G is written in terms of the Lagrange multipliers $\lambda_1$ and $\lambda_2$ as

$$G = S_q + \lambda_1 \sum_{k=1}^W p_k + \lambda_2 \sum_{k=1}^W P_k\,\varepsilon_k . \qquad (6)$$

The maximization of this functional leads to the ordinary probability distribution

$$p_k = Z_q^{-1}\,\Big[1 + (q-1)\,\frac{\varepsilon_k}{k_B T_L}\Big]^{-\frac{1}{q-1}} . \qquad (8)$$

The multiplier $\lambda_1$ is connected to the partition function, $Z_q \equiv [\lambda_1 \cdot (q-1)/q]^{1/(q-1)}$. The other multiplier, $\lambda_2$, is connected to the temperature, $k_B T \propto -\lambda_2^{-1}$. In particular, the negative inverse of the second Lagrange multiplier defines the so-called Lagrangian temperature $T_L$ [114], i.e., $k_B T_L \equiv -\lambda_2^{-1}$. Substituting Equation (8) into Equation (2) leads to the escort distribution:

$$P_k \propto \Big[1 + (q-1)\,\frac{\varepsilon_k}{k_B T_L}\Big]^{-\frac{q}{q-1}} .$$

The escort probability distribution describes a metastable distribution at which the ordinary distribution that maximizes the entropy is stabilized. The metastable distribution coincides with the kappa distribution, under the transformation of the kappa and q indices [37]:

$$\kappa = \frac{1}{q-1} , \qquad (10)$$

leading to the kappa distribution, in the discrete description,

$$P_k \propto \Big(1 + \frac{\varepsilon_k}{\kappa\,k_B T_L}\Big)^{-\kappa-1} ,$$

or, in the continuous description,

$$P(\varepsilon) \propto \Big(1 + \frac{\varepsilon}{\kappa\,k_B T}\Big)^{-\kappa-1} .$$
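The chain above (ordinary q-exponential distribution, then its escort, then the kappa form via κ = 1/(q−1)) can be checked numerically. The energy grid, q-index, and common temperature scale below are arbitrary illustrative choices:

```python
import numpy as np

q, kT = 1.5, 1.0                    # illustrative q-index and temperature scale
kappa = 1.0 / (q - 1.0)             # q <-> kappa transformation, Equation (10)
eps = np.linspace(0.0, 5.0, 6)      # toy discrete energy spectrum

# ordinary distribution from the maximization: p_k ~ [1 + (q-1) eps/kT]^(-1/(q-1))
p = (1.0 + (q - 1.0) * eps / kT) ** (-1.0 / (q - 1.0))
p /= p.sum()

# its escort, P_k ~ p_k^q, reproduces the kappa form [1 + eps/(kappa kT)]^(-kappa-1)
P = p ** q / np.sum(p ** q)
P_kappa = (1.0 + eps / (kappa * kT)) ** (-(kappa + 1.0))
P_kappa /= P_kappa.sum()
print(np.allclose(P, P_kappa))      # True
```

The agreement follows from the exponent identity q/(q−1) = κ + 1, which the code verifies distribution-wide.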

Entropy Duality Formalism
Alternatively, the exact same framework of non-extensive statistical mechanics can be developed by considering a dual entropy, instead of a dual probability distribution (as was done in Section 2).
The duality is given by the standard q-entropy and the one with inverse q-index, i.e., the (1/q)-entropy. In order to show this, we accept that there is only one type of distribution; this must coincide with the metastable distribution of the case where the duality in probabilities was considered (Section 2), that is, the escort probability distribution. Then, the 1/q-entropy is expressed by [129]:

$$S_{1/q} = \frac{1 - \sum_{k=1}^W P_k^{1/q}}{1/q - 1}$$

(notice the usage of the escort instead of the ordinary probability distribution). Using the entropy $S_q$ in Equation (1) and the following identity, which can be easily derived from Equation (2):

$$\sum_{k=1}^W P_k^{1/q} = \Big(\sum_{k=1}^W p_k^q\Big)^{-1/q} ,$$

we have

$$\sum_{k=1}^W p_k^q = 1 - (q-1)\,S_q ,$$

leading to

$$S_{1/q} = \frac{q}{q-1}\Big\{\big[1-(q-1)\,S_q\big]^{-1/q} - 1\Big\} . \qquad (16)$$

We observe that there are always two different indices, $Q_1 = q$ and $Q_2 = 1/q$ (except for the case of q = 1, when the two indices coincide), for which the partition function remains invariant; however, one has to recall that $S_{Q_1} \neq S_{Q_2}$ for any $Q_1 \neq 1$.
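The identity connecting the escort moment to the ordinary one, sum_k P_k^(1/q) = (sum_k p_k^q)^(−1/q), and the resulting relation (16) between the two entropies, are easy to verify numerically (arbitrary p and q):

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = 2.0
P = p ** q / np.sum(p ** q)                # escort distribution, Equation (2)

lhs = np.sum(P ** (1.0 / q))
rhs = np.sum(p ** q) ** (-1.0 / q)
print(np.isclose(lhs, rhs))                # True: the escort-moment identity

# (1/q)-entropy of the escort distribution, and its relation (16) to S_q
S_q = (1.0 - np.sum(p ** q)) / (q - 1.0)
S_inv = (1.0 - lhs) / (1.0 / q - 1.0)
relation = q / (q - 1.0) * ((1.0 - (q - 1.0) * S_q) ** (-1.0 / q) - 1.0)
print(np.isclose(S_inv, relation))         # True
```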

q-Deformed Exponential/Logarithm Functions
The Q-deformed exponential function [4,5] and the Q-deformed logarithm function are defined by

$$\exp_Q(x) = \big[1 + (1-Q)\,x\big]_+^{\frac{1}{1-Q}} , \qquad \ln_Q(x) = \frac{x^{1-Q}-1}{1-Q} ,$$

where the subscript "+" denotes the cut-off condition, i.e., $\exp_Q(x)$ becomes zero if its base is non-positive. These are inverse functions for any Q (similarly to the case of Q = 1):

$$\ln_Q\!\big[\exp_Q(x)\big] = x .$$

Hence, the dual ordinary/escort distributions are written as:

$$p_k = Z_q^{-1}\exp_q\!\Big(-\frac{\varepsilon_k}{k_B T_L}\Big) , \qquad P_k \propto \exp_q\!\Big(-\frac{\varepsilon_k}{k_B T_L}\Big)^{q} ,$$

while the dual q-/(1/q)-entropies in Equation (16) are written as:

$$S_q = \sum_{k=1}^W p_k \ln_q(1/p_k) , \qquad S_{1/q} = \sum_{k=1}^W P_k \ln_{1/q}(1/P_k) .$$
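A minimal scalar implementation of the two deformed functions, including the "+" cut-off, with a check of the inverse-pair property (the sample points x = 0.7 and x = 3.0, with Q = 1.5, are arbitrary):

```python
import math

def exp_Q(x, Q):
    """Q-deformed exponential: [1 + (1-Q) x]^(1/(1-Q)), zero if the base is non-positive."""
    if abs(Q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - Q) * x
    return base ** (1.0 / (1.0 - Q)) if base > 0.0 else 0.0   # "+" cut-off

def ln_Q(x, Q):
    """Q-deformed logarithm: (x^(1-Q) - 1) / (1-Q), the inverse of exp_Q."""
    if abs(Q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - Q) - 1.0) / (1.0 - Q)

print(ln_Q(exp_Q(0.7, 1.5), 1.5))   # 0.7 (inverse pair for any Q)
print(exp_Q(3.0, 1.5))              # 0.0 (cut-off: base 1 - 0.5*3 < 0)
```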

Maximization of 1/q-Entropy
The maximization of the 1/q-entropy leads directly to the escort probability distribution. Indeed, by maximizing the functional G, as in Equation (6), we find [129][130][131]:

$$P_k \propto \Big[1 + (q-1)\,\frac{\varepsilon_k}{k_B T_L}\Big]^{-\frac{q}{q-1}} , \qquad (23)$$

or, in terms of the kappa index (via Equation (10)):

$$P_k \propto \Big(1 + \frac{\varepsilon_k}{\kappa\,k_B T_L}\Big)^{-\kappa-1} .$$

It has to be stressed that (i) the correct canonical distribution is derived without the duality of ordinary/escort distributions, since only one distribution is considered; however, (ii) the canonical distribution in Equation (23) is not derived by maximizing the system's q-entropy, S_q; instead, it is deduced by maximizing the (1/q)-entropy, S_{1/q}, the dual of the system's q-entropy.
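One can check numerically that the kappa-form distribution is indeed the constrained maximum of S_{1/q}: perturb it in directions that preserve normalization and mean energy, and the entropy never increases. The toy spectrum, q-index, and temperature scale are arbitrary; NumPy's SVD supplies the null space of the constraint matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
q = 1.5
kappa = 1.0 / (q - 1.0)
eps = np.array([0.0, 1.0, 2.0, 3.0])           # toy energy spectrum
kT = 1.0                                       # illustrative temperature scale

# candidate maximizer: the kappa (q-exponential) form of Equation (23)
P_star = (1.0 + eps / (kappa * kT)) ** (-(kappa + 1.0))
P_star /= P_star.sum()

def S_inv_q(P):
    """(1/q)-entropy: (1 - sum_k P_k^(1/q)) / (1/q - 1)."""
    return (1.0 - np.sum(P ** (1.0 / q))) / (1.0 / q - 1.0)

# perturbation directions preserving both constraints (normalization, mean energy)
A = np.vstack([np.ones_like(eps), eps])        # constraint matrix
null = np.linalg.svd(A)[2][2:]                 # its null space (2 directions)

ok = True
for _ in range(200):
    P = P_star + 1e-3 * (rng.standard_normal(2) @ null)
    if (P > 0.0).all():
        ok &= bool(S_inv_q(P) <= S_inv_q(P_star) + 1e-15)
print(ok)                                      # True: P_star maximizes S_1/q
```

Since S_{1/q} is concave in P for q > 1, the stationary point found by the Lagrange method is the unique constrained maximum, which is what the perturbation test confirms.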

q-Independent Information Measure
In [131], we examined the thermodynamic origin of the q-entropy and its associated q-exponential or kappa distributions. As was shown, the classical concept of thermal equilibrium and the thermodynamic definition of temperature, given by

$$\frac{1}{T} = k_B\,\frac{\partial S_1}{\partial U} ,$$

can be naturally generalized to

$$\frac{1}{T} = k_B\,\frac{\partial \ln[\exp_q(S_q)]}{\partial U} . \qquad (25)$$

The classical BG entropy is denoted by $S_1$, that is, the q-entropy $S_q$ for q = 1. When q = 1, the expression $\ln[\exp_q(S_q)]$ on the right-hand side of Equation (25) becomes simply $S_1$. On the other hand, the internal energy does not depend on the q- or kappa indices; for example, in the continuous case we have $U = \frac{1}{2} f\,k_B T$. This is because the kappa index is irrelevant to the energy transitions among particles, and accounts only for the correlations among particle energies. In addition, the temperature and the kappa index are found to be two independent thermodynamic variables. Therefore, if the temperature and the internal energy are quantities independent of the q-index, then the quantity $\ln[\exp_q(S_q)]$ should be the sum of a q-dependent function, $\ln g(q)$, and a U-dependent function, $f(U)$, i.e., $\ln[\exp_q(S_q)] = \ln g(q) + f(U) = \ln g(q) + S_1$, where the value of $\ln g(1)$ is absorbed into $f(U)$, so that we may redefine the function g such that $\ln g(1) = 0$, i.e., $S_1 = f(U)$. Hence,

$$\exp_q(S_q) = g(q)\cdot\exp(S_1) . \qquad (26)$$
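The q-independence of the measure ln[exp_q(S_q)] can be illustrated in the simplest case of a uniform distribution over W states, where exp_q(S_q) = W = exp(S_1) for every q (so g(q) = 1 in this special case); the values of W and q below are arbitrary:

```python
import math

W, q = 10, 1.5
S_q = (1.0 - W ** (1.0 - q)) / (q - 1.0)   # q-entropy of the uniform distribution
S_1 = math.log(W)                          # BG entropy of the same distribution

# exp_q(S_q) = [1 + (1-q) S_q]^(1/(1-q)) collapses to W, just like exp(S_1)
exp_q_Sq = (1.0 + (1.0 - q) * S_q) ** (1.0 / (1.0 - q))
print(exp_q_Sq, math.exp(S_1))             # both ~= W = 10
```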

Application in the Continuous Description
Extensivity requires that the entropy of the whole system is proportional to the size of the system, or the number of independent particles of the system. Additivity means that the entropy of the whole system sums up the entropies of all the statistically independent subsystems. Apparently, additivity leads to extensivity, but non-additivity does not imply non-extensivity. Macroscopically, all physically meaningful entropies end up being extensive (as the number of particles tends to infinity). This can be understood as follows. Depending on the range of interactions, short or long, there is always a scale, say λ_C, within which particles are characterized by local correlations. Particles within this scale are correlated with each other, but particles from different groups are uncorrelated, i.e., independent. Let N_C be the number of particles within the scale λ_C. If N is the total number of particles, then there are about M ~ N/N_C uncorrelated groups of correlated particles of length λ_C. Since there is no correlation among the M groups, the total entropy of the system behaves as in BG statistical mechanics, that is, S = M·S_C, where S_C(N_C) is the entropy characterizing the scale in which particles are correlated; this is rewritten as S = N·S_q, where S_q(N_C) is the per-particle entropy that characterizes the scale λ_C, while it depends in a nonlinear way on the number N_C of particles; therefore, the entropy of the whole system, S, is macroscopically proportional to the number of its particles, N, independently of the number N_C. For instance, Boltzmann-Gibbs (BG) statistics considers no correlations among particles [128], that is, N_C = 1. On the other hand, nonextensive statistics for plasmas considers local correlations among particles, with a typically large number N_C, given by the number of particles within a Debye sphere, N_D. (For more details, e.g., see: [32].) As an example, we show the continuous description of kappa distributions.
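The bookkeeping of the argument above can be sketched in a few lines; the group size N_C and the group entropy S_C are hypothetical numbers chosen only to show the scaling:

```python
def total_entropy(N, N_C, S_C):
    """S = M * S_C, with M = N / N_C independent groups of correlated particles."""
    M = N // N_C                    # number of uncorrelated groups
    return M * S_C

N_C, S_C = 100, 7.0                 # hypothetical: 100 correlated particles, group entropy 7 k_B
for N in (10_000, 20_000, 40_000):
    print(N, total_entropy(N, N_C, S_C) / N)   # per-particle entropy stays 0.07
```

Doubling N doubles S while S/N stays constant, which is exactly the extensivity statement, independent of N_C.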
The kappa index depends on the total number of correlated degrees of freedom f.
The physical meaning of the kappa index is the reciprocal correlation coefficient of the energies of any two correlated kinetic degrees of freedom. In particular, the correlation coefficient is given by ρ = (3/2)/κ [112]. The kappa index κ depends on the correlated degrees of freedom f, and can be related to an invariant kappa index κ_0 by κ(f) = κ_0 + (1/2)f. For a number of N_C correlated particles with d degrees of freedom per particle, we have f = d·N_C, and the dependent kappa index is κ(N_C) = κ_0 + (d/2)N_C. Note that κ_0 is the actual kappa index that characterizes a stationary state, and it is independent of the number of particles and degrees of freedom of the system [112].
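The relations κ(f) = κ_0 + f/2 and ρ = (3/2)/κ can be tabulated for a few group sizes; the values d = 3 and κ_0 = 2 are purely illustrative:

```python
def kappa_of_f(kappa0, f):
    """Dependent kappa index: kappa(f) = kappa0 + f/2 (kappa0 is invariant)."""
    return kappa0 + 0.5 * f

d, kappa0 = 3, 2.0                   # degrees of freedom per particle; invariant kappa index
for N_C in (1, 10, 100):
    f = d * N_C                      # total correlated degrees of freedom, f = d * N_C
    kappa = kappa_of_f(kappa0, f)
    rho = 1.5 / kappa                # correlation coefficient rho = (3/2)/kappa
    print(N_C, kappa, rho)           # kappa grows with N_C; kappa0 stays fixed
```

As N_C grows, κ grows linearly and ρ decays toward zero, while κ_0 remains the state label.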
The corresponding partition function Z_q was found to be [114]:

where N_C is the number of correlated particles included in the correlation length λ_C; the involved dimensionless scale parameter σ is expressed in terms of the thermal speed θ = (2k_BT/m)^{1/2} for particles of mass m, and the d-dimensional spherical volume of radius equal to the correlation length, i.e.,

or, considering an ion-electron plasma, with masses m_i and m_e and temperatures T_i and T_e,

Therefore, the partition function becomes:

We observe that Equation (33) has exactly the form of Equation (26)! Namely, with

and ln g(1) = 0, while S_1 equals the Sackur-Tetrode entropic formula [72,114],

Solving in terms of entropies, Equation (28) gives

or, in terms of the kappa index,

Substituting Z_q from Equations (34,35) into Equations (38,39), we end up with

As noted by [129], the entropy S_q has a fine property that S_{1/q} lacks: at conditions near the classical thermal equilibrium, i.e., at large values of κ_0 and S_1, the slope of the entropy must be positive, so that the closer to the classical equilibrium (κ_0 → ∞), the higher the entropy. We have:

In Figure 1, we plot the entropies S_q and S_{1/q} as a function of (a) Z_q, (b,c) S_1, and (d) κ_0.

Figure 1. (a) Entropies S_q and S_{1/q} plotted as a function of Z_q, as shown in Equation (37), for various kappa indices; the case of κ→∞, corresponding to log(Z_q), is shown. Entropies (b) S_q and (c) S_{1/q}, plotted as a function of S_1, as shown in Equations (38,39), for various kappa indices. (d) Entropies S_q and S_{1/q} plotted as a function of the invariant kappa index κ_0, as shown in Equations (40,41), for various values of the BG entropy, S_1; we observe that (i) the two entropies tend to S_1 as κ_0 → ∞; (ii) for large values of S_1, the entropy S_q has a positive slope, while the entropy S_{1/q} has a negative slope (a nonrealistic property).
These results verify that the entropy of the particle system is given by S_q, and not by S_{1/q}, even though it is the entropic function S_{1/q} that is maximized to lead to the canonical distribution P(ε). Therefore, the standard description of nonextensive statistical mechanics considers the entropy S_q, which is maximized to provide p(ε) and, indirectly, the dual escort distribution P(ε); i.e., the actual distribution P(ε) is dual to the auxiliary distribution p(ε) that comes from the maximization of the entropy.
In the present picture of nonextensive statistical mechanics, it is the entropy that has the duality property, and not the distribution; i.e., the actual S_q is dual to the auxiliary entropy S_{1/q} that needs to be maximized.

Conclusions
The paper is a theoretical analysis of the dualities that characterize nonextensive statistical mechanics. The concept of duality of probability distributions is of fundamental importance within the framework of nonextensive statistical mechanics, the generalization of Boltzmann-Gibbs statistical mechanics based on the q-entropy. While the probability duality resolves long-standing issues of the theory (e.g., it ascertains the additivity of the internal energy given the additivity of the energy of microstates), it is a rather complex part of the theory, and it certainly cannot be trivially explained along Gibbs' path of entropy maximization.
Recently, it was shown that an alternative picture exists, considering a dual entropy instead of a dual probability. In particular, the framework of nonextensive statistical mechanics can be equivalently developed using the q- and 1/q-entropies (denoted by S_q and S_{1/q}). The canonical probability distribution coincides again with the known q-exponential distribution, but without the necessity of the duality of ordinary/escort probabilities. The paper deals with this duality between the q- and 1/q-entropies; the maximization of the 1/q-entropy under the canonical ensemble; and the derivation of an identity formula involving the dual entropies, the q-entropy and the 1/q-entropy, as well as the 1-entropy, which can be useful in theoretical developments and applications.
It was shown that the entropy of the particle system is given by S_q and not by S_{1/q}, even though it is the entropic function S_{1/q} that should be maximized to lead directly to the canonical distribution P(ε). Therefore, the standard description of nonextensive statistical mechanics considers the entropy S_q, which is maximized to provide p(ε) and, indirectly, the dual escort distribution P(ε); i.e., the actual distribution P(ε) is dual to the auxiliary distribution p(ε) that comes from the maximization of the entropy. In the present picture of nonextensive statistical mechanics, it is the entropy that has the duality property, and not the distribution; i.e., the actual S_q is dual to the auxiliary entropy S_{1/q} that is maximized to provide the canonical ensemble.
The paper focused on the continuous description of the energy distribution. We showed that the actual entropy of the system, S_q, can be expressed as a function of the kappa index, the number of particles, and the BG entropy, S_1. At conditions near the classical thermal equilibrium, i.e., at large values of κ_0 and S_1, the slope of the entropy is positive, verifying that the closer to the classical equilibrium (κ_0 → ∞), the higher the value of the entropy, S_q.