On the Simplification of Statistical Mechanics for Space Plasmas

Space plasmas are frequently described by kappa distributions. Non-extensive statistical mechanics involves the maximization of the Tsallis entropic form under the constraints of the canonical ensemble, and also considers a dyadic formalism between the ordinary and escort probability distributions. This paper addresses the statistical origin of kappa distributions, and shows that they can be connected with non-extensive statistical mechanics without considering the dyadic formalism of ordinary/escort distributions. While this concept does significantly simplify the usage of the theory, its cost is the definition of a dyadic entropic formulation, required in order to preserve the consistency between statistical mechanics and thermodynamics. Therefore, simplifying the theory by avoiding the dyadic formalism altogether is impossible within the framework of non-extensive statistical mechanics.

A breakthrough in the field came with the connection of kappa distributions with statistical mechanics, and specifically with the concept of non-extensive statistical mechanics. This topic has been examined by several authors [67,92-96]. The empirical kappa distribution and the theoretical q-exponential distribution (which results from non-extensive statistical mechanics) are of the same form under the transformation of indices q = 1 + 1/κ. Livadiotis & McComas [94] showed that the consistent connection of the theory and formalism of kappa distributions with non-extensive statistical mechanics is based on four fundamental physical notions and concepts: (i) the q-deformed exponential and logarithm functions [97,98]; (ii) the escort probability distribution [99]; (iii) the Tsallis entropy [100]; and (iv) the physical temperature [101-103] (that is, the actual temperature of the system). Understanding the statistical origin of kappa distributions led to further theoretical developments and applications, including (a) the foundation theory; (b) the plasma formalism; and (c) space plasma applications (for more details, see [96]). Non-extensive statistical mechanics was introduced by Tsallis [100] as a generalization of the classical Boltzmann-Gibbs (BG) statistical mechanics. The well-known q-exponential function was deduced as a result of the maximization of the Tsallis entropy under the constraints of the canonical ensemble. However, this result faced some fundamental problems: in particular, the canonical distribution of energy ε was not invariant under an arbitrary selection of the ground level of energy, and the internal energy (mean energy), U = ⟨ε⟩, was not extensive, as it should be for uncorrelated distributions. These inconsistencies, as well as other problems, were corrected by the later work of Tsallis et al. [104], which considers the escort expectation values. The modern theory of non-extensive statistics is based on the generalization of two elements fundamental to statistical mechanics: (i) the classical formulation of the Boltzmann-Gibbs entropy, and (ii) the notion of the canonical distribution via the formalism of ordinary/escort probabilities (see [99,105,106]; see also [94-96]).
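The index transformation can be illustrated with a short numerical check (a sketch, not part of the original derivations; the convention exp_q(x) = [1 + (1 − q)x]_+^{1/(1−q)} for the q-exponential is assumed): raising the q-exponential canonical form to the power q, as the escort prescription dictates, reproduces the kappa-distribution exponent −κ − 1 under q = 1 + 1/κ.

```python
def q_exp(x, q):
    """q-deformed exponential: [1 + (1-q)x]^(1/(1-q)), with cut-off at 0."""
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

kappa = 4.0
q = 1.0 + 1.0 / kappa        # index transformation q = 1 + 1/kappa
for x in [0.1, 1.0, 3.0]:
    escort_form = q_exp(-x, q) ** q                     # escort weight ~ p^q
    kappa_form = (1.0 + x / kappa) ** (-kappa - 1.0)    # kappa-form weight
    assert abs(escort_form - kappa_form) < 1e-12
```

The escort power q converts the exponent −κ of the ordinary q-exponential into the familiar −κ − 1 of the kappa distribution.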
The purpose of this paper is to show that the statistical framework of non-extensive statistical mechanics can be deduced as it is without considering the dyadic formalism of ordinary/escort distributions. This concept can significantly simplify the usage of the theory, and make it accessible to the new generation of researchers who struggle to understand and apply it to exotic particle systems out of thermal equilibrium, such as space plasmas. However, the cost of this simplification is the necessity of having two types of entropic functions. This dyadic entropic formulation preserves the basic fundamental thermodynamic formulae, which is necessary for the consistent connection of statistical mechanics with thermodynamics.
Next, we first expose the existing formalism of non-extensive statistical mechanics: in Section 2, we briefly present the above two elements (i) and (ii), and in Section 3, we present the maximization of the entropy under the constraints of the canonical ensemble. Then, in Section 4, we present an alternative formalism that skips the consideration of the ordinary/escort formalism. In Section 5, we show the consistency of the introduced formalism with thermodynamics, for which a dyadic formalism of entropy is needed. Finally, in Section 6, we briefly summarize the conclusions.

2. Tsallis Entropy and Ordinary/Escort Formalism
Consider a system described by a discrete energy spectrum {ε_k}_{k=1}^W, which is associated with a discrete probability distribution {p_k}_{k=1}^W. Non-extensive statistical mechanics generalizes the classical formulation of the BG entropy and uses the dyadic formalism of ordinary/escort probabilities. For the discrete energy spectrum {ε_k}_{k=1}^W and the associated distribution {p_k}_{k=1}^W, the non-extensive (Tsallis) entropy is given by:

S_q = (1 − Σ_{k=1}^{W} p_k^q)/(q − 1),

or, in terms of the kappa index (q = 1 + 1/κ):

S_κ = κ (1 − Σ_{k=1}^{W} p_k^{1+1/κ}),

leading to the BG formulation for κ→∞ (or q→1):

S = −Σ_{k=1}^{W} p_k ln p_k.

Note that all the entropic formulations are given in units of the Boltzmann constant, that is, setting k_B = 1 for simplicity; this will be restored further below. On the other hand, the formalism of non-extensive statistical mechanics is built together with the concept of escort probabilities [99]. The escort probability distribution {P_k}_{k=1}^W is constructed from the ordinary one through

P_k = p_k^q / Σ_{j=1}^{W} p_j^q,

reducing to P_k = p_k for q → 1. In fact, there is a duality between the ordinary {p_k}_{k=1}^W and escort probabilities {P_k}_{k=1}^W, such that, for k = 1, . . ., W:

p_k = P_k^{1/q} / Σ_{j=1}^{W} P_j^{1/q}.

According to the Tsallis statistical interpretation of the internal energy U, this is given by the escort expectation value of the energy, ⟨ε⟩_esc, that is:

U = ⟨ε⟩_esc = Σ_{k=1}^{W} P_k ε_k = Σ_{k=1}^{W} p_k^q ε_k / Σ_{j=1}^{W} p_j^q,

where the subscript "esc" denotes the escort expectation value. (Hereafter, the subscript "esc" will be omitted, since expectation values will be associated with the escort distribution by default, unless stated otherwise.) It is noted that the ordinary and escort distributions differ in their meaning and usage. The ordinary distribution is rather an auxiliary mathematical tool related to the information arrangement and its dynamics, while the escort distribution is related to the observed stationary states and measurements (statistical moments).
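The definitions above can be made concrete with a brief numerical sketch (the probability values are illustrative, not from the original text), showing the BG limit of the Tsallis entropy for q → 1 and the ordinary/escort duality, in which the ordinary distribution is recovered from the escort one by applying the escort construction with the inverse index 1/q:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_k p_k^q) / (q - 1), in units of k_B."""
    return (1.0 - sum(pk ** q for pk in p)) / (q - 1.0)

def escort(p, q):
    """Escort distribution P_k = p_k^q / sum_j p_j^q."""
    norm = sum(pk ** q for pk in p)
    return [pk ** q / norm for pk in p]

p = [0.5, 0.3, 0.2]                      # illustrative ordinary distribution
S_BG = -sum(pk * math.log(pk) for pk in p)
assert abs(tsallis_entropy(p, 1.0 + 1e-6) - S_BG) < 1e-4   # q -> 1 recovers BG

P = escort(p, 1.25)                      # escort distribution for q = 1.25
assert abs(sum(P) - 1.0) < 1e-12
back = escort(P, 1.0 / 1.25)             # duality: inverse index restores p
assert all(abs(a - b) < 1e-12 for a, b in zip(back, p))
```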

3. The Entropy Maximization in the Framework of Non-Extensive Statistical Mechanics
In order to obtain the probability distribution {p_k}_{k=1}^W associated with the conservative physical system of energy spectrum {ε_k}_{k=1}^W, we follow the famous Gibbs path [107], where the entropy S = S({p_k}_{k=1}^W) is maximized under the constraints of the canonical ensemble. The "variables" {p_k}_{k=1}^W are not independent, because of the two constraints: (i) the normalization of the probability distribution, Σ_{k=1}^{W} p_k = 1; and (ii) the fixed value of the internal energy U. Then, the Lagrange method applies, which involves maximizing a functional form where the unknown Lagrange multipliers λ_1, λ_2 are linearly expressed in terms of the constraints, and the maximization follows by setting (∂/∂p_j) G({p_k}_{k=1}^W) = 0, ∀ j = 1, . . ., W. In the case of non-extensive statistical mechanics, the internal energy U is interpreted as the escort expectation value; thus, we maximize the functional form:

G({p_k}) = S_q + λ_1 (Σ_{k=1}^{W} p_k − 1) + λ_2 (Σ_{k=1}^{W} p_k^q ε_k / Σ_{j=1}^{W} p_j^q − U).

Entropy 2017, 19, 285

The maximization of this functional leads to the canonical (escort) distribution, written in terms of the auxiliary partition function Z (defined via the first Lagrange multiplier λ_1) and the negative of the second Lagrange multiplier, β_L ≡ −λ_2. Note that T (also known as the physical temperature, e.g., [102]) represents the actual temperature of the system; thus, it does not depend on the q-index. On the other hand, the Lagrangian "temperature" T_L is a thermal parameter that depends on the actual temperature T and the q-index, T_L = T_L(T; q). At the classical limit of q→1, T_L coincides with the system's temperature [96].
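At the classical limit, this Lagrange machinery reduces to the familiar BG canonical distribution; the following minimal numerical sketch (with a hypothetical three-level spectrum and target internal energy) finds, by bisection, the inverse temperature β whose canonical distribution reproduces a prescribed U:

```python
import math

eps = [0.0, 1.0, 2.0]    # hypothetical three-level energy spectrum
U_target = 0.8           # hypothetical prescribed internal energy

def boltzmann(beta):
    """Canonical distribution p_j = exp(-beta*eps_j) / Z."""
    w = [math.exp(-beta * e) for e in eps]
    Z = sum(w)
    return [wi / Z for wi in w]

def mean_energy(beta):
    return sum(pj * ej for pj, ej in zip(boltzmann(beta), eps))

# U(beta) decreases monotonically with beta, so bisect for the matching beta.
lo, hi = -20.0, 20.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > U_target:
        lo = mid
    else:
        hi = mid
beta = 0.5 * (lo + hi)
p = boltzmann(beta)
assert abs(sum(p) - 1.0) < 1e-12
assert abs(mean_energy(beta) - U_target) < 1e-9
```

Since U_target lies below the infinite-temperature mean (here 1.0), the solver returns a positive β, i.e., a distribution favoring the low-energy levels.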
The system is described by the escort distribution, defined by P_j(ε_j) ∝ p_j(ε_j)^q. We can rewrite this using the Q-deformed exponential function [97,98],

exp_Q(x) ≡ [1 + (1 − Q) x]_+^{1/(1−Q)},

and its inverse, the Q-logarithm function, defined by:

ln_Q(x) ≡ (x^{1−Q} − 1)/(1 − Q).

We also use the q-deformed "unity function" [94]. The subscript "+" denotes the cut-off condition, where exp_Q(x) becomes zero if its base is non-positive. Setting Q = q, the ordinary/escort distributions are written as:

p_j(ε_j) = Z^{−1} exp_q[−β(ε_j − U)],  P_j(ε_j) ∝ exp_q[−β(ε_j − U)]^q.

In the above, the escort distribution P_j(ε_j) is expressed by the q-exponential function, in terms of the q-index, but it can be transformed to its equivalent, the kappa distribution, under the transformation of the kappa and q indices:

κ = 1/(q − 1),  or  q = 1 + 1/κ.

Namely, we derive the discrete kappa distribution:

P_j(ε_j) ∝ [1 + (1/κ)·(ε_j − U)/(k_B T)]^{−κ−1}.

In the continuous description, the kappa distribution attains its well-known form (e.g., see [96], Chapter 1, and references therein).

4. Avoiding Ordinary/Escort Formalism
First, we start with the classical case of the BG entropy,

S = −Σ_{k=1}^{W} p_k ln p_k.

The maximization of the entropy under the constraints of the canonical ensemble requires (∂/∂p_j) G(p_1, p_2, . . ., p_W) = 0, with:

G = S + λ_1 (Σ_{k=1}^{W} p_k − 1) + λ_2 (Σ_{k=1}^{W} p_k ε_k − U).

We consider the "mean-less" energies, ε_j − U, obtaining:

p_j = Z^{−1} exp[−β(ε_j − U)],  Z = Σ_{j=1}^{W} exp[−β(ε_j − U)],

where we used the auxiliary partition function Z. The inverse temperature β = (k_B T)^{−1} is now given by the Lagrangian one, β = β_L, defined via the second Lagrange multiplier (with opposite sign), β_L ≡ −λ_2. Next, we proceed with the Tsallis entropy:

S_q = (1 − Σ_{k=1}^{W} p_k^q)/(q − 1).

Again, the maximization of the entropy under the constraints of the canonical ensemble involves maximizing the functional:

G = S_q + λ_1 (Σ_{k=1}^{W} p_k − 1) + λ_2 (Σ_{k=1}^{W} p_k ε_k − U),

where now the internal energy is the ordinary (not the escort) expectation value. Hence, (∂/∂p_j) G(p_1, p_2, . . ., p_W) = 0 gives the stationarity condition:

p_j^{q−1} = [(q − 1)/q]·(λ_1 + λ_2 ε_j),

which can be rewritten using the q-deformed unity and exponential functions for Q = q^{−1}.
We consider again the mean-less energies, ε_j − U. Multiplying the stationarity condition by p_j and summing over j eliminates λ_1, yielding:

p_j^{q−1} = φ_q [1 − ((q − 1)/q) β_L φ_q^{−1} (ε_j − U)],

which may be rewritten using the Q-deformed exponential function by setting Q = q^{−1}. Also, we use again the notions of the auxiliary partition function Z and the inverse temperature β:

p_j = Z^{−1} [1 − ((q − 1)/q) β (ε_j − U)]^{1/(q−1)},  Z ≡ φ_q^{−1/(q−1)},

where now the actual (inverse) temperature β does not coincide with the Lagrangian one, β_L:

β_L = β φ_q.

It can be easily shown that, for the argument φ_q ≡ Σ_{j=1}^{W} p_j^q, we have φ_q = Z^{1−q} (next section). Finally, the distributions can be expressed in terms of the kappa index:

p_j = Z^{−1} [1 + (1/κ) β (ε_j − U)]^{−κ−1},

where we set:

κ ≡ (Q − 1)^{−1} = q/(1 − q),  Q ≡ q^{−1}.

It must be mentioned that, in the previous developments, the entropy and the deduced q-exponential distribution (in the framework of the canonical ensemble) are not characterized by the same q-index; rather, the two indices are inverse to each other, Q = q^{−1}.
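The structure of this result can be checked numerically: with the convention exp_Q(x) = [1 + (1 − Q)x]_+^{1/(1−Q)} and κ = q/(1 − q), the kappa-form weights raised to the power q − 1 are affine in the energy, which is exactly the stationarity condition of the Tsallis functional above (the spectrum, β, and U values are illustrative):

```python
q = 0.8                        # entropic index q < 1 (illustrative)
kappa = q / (1.0 - q)          # kappa = (Q - 1)^(-1) with Q = 1/q; here kappa = 4
beta, U = 1.0, 1.0             # illustrative inverse temperature and internal energy
eps = [0.0, 0.5, 1.0, 2.0]     # hypothetical energy spectrum

# Unnormalized kappa-form weights [1 + (1/kappa)*beta*(eps_j - U)]^(-kappa-1):
w = [(1.0 + beta * (e - U) / kappa) ** (-kappa - 1.0) for e in eps]

# Stationarity of the Tsallis functional (ordinary constraints) demands that
# p_j^(q-1) be an affine function of eps_j; normalization only rescales it.
vals = [wj ** (q - 1.0) for wj in w]
slope = (vals[1] - vals[0]) / (eps[1] - eps[0])
assert all(abs(v - (vals[0] + slope * (e - eps[0]))) < 1e-12
           for v, e in zip(vals, eps))
```

The check works because (−κ − 1)(q − 1) = 1 when κ = q/(1 − q), so the weight raised to the power q − 1 reduces exactly to the affine bracket.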

5. Connection to Thermodynamics
First, we show the auxiliary relations needed below. Summing the q-th power of the canonical distribution and using the constraints of normalization and internal energy leads to the identity φ_q = Z^{1−q}, together with the companion relation connecting the two inverse temperatures, β_L = β φ_q. We also define the "actual" partition function Z̃, as follows:

ln_q(Z̃) ≡ ln_q(Z) − β_L U.

As we will see, these relations, together with the definition of Z̃, lead to the relations required for the connection of statistical mechanics with thermodynamics. First, the entropy is related to the auxiliary partition function Z:

S_q = (1 − φ_q)/(q − 1) = (Z^{1−q} − 1)/(1 − q) = ln_q(Z).

Then, the definition of Z̃ can be rewritten as U − ln_q(Z)/β_L = −ln_q(Z̃)/β_L, and, taking into account S_q = ln_q(Z), we arrive at:

U − S_q/β_L = −ln_q(Z̃)/β_L,

and the Helmholtz free energy:

F = U − S_q/β_L = −ln_q(Z̃)/β_L.

(These constitute the basic formulae connecting statistical mechanics with thermodynamics; e.g., see other similar cases in [108,109].) Next, we define an alternative entropic formula. Let the standard Tsallis entropy involved above be denoted by S_q. An associated auxiliary formulation of entropy, S_{q^{−1}}, is defined by:

exp_{q^{−1}}(S_{q^{−1}}) ≡ exp_q(S_q) = Z,  or  S_{q^{−1}} = ln_{q^{−1}}(Z)

(where we have again set k_B = 1). The actual temperature of the system, T, is given by the physical temperature, and can be related to the entropy via the thermodynamic definition ∂S/∂U = 1/(k_B T). Both entropic formulations, S_q and S_{q^{−1}}, are associated with the same physical temperature T, which means that both quantities are equivalently related to the actual temperature of the system. Then, the entropy must increase when the system transits through the stationary states toward thermal equilibrium (κ_0 → ∞); in other words, (∂S/∂κ_0)_{κ_0→∞} > 0. As we will see below, in the continuous description, the entropy S_q does not follow this condition, i.e., (∂S_q/∂κ_0)_{κ_0→∞} < 0. However, the dyadic entropy does, i.e., (∂S_{q^{−1}}/∂κ_0)_{κ_0→∞} > 0.
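The dyadic entropic definition can be checked numerically; a minimal sketch assuming the conventions ln_Q(x) = (x^{1−Q} − 1)/(1 − Q) and exp_Q(x) = [1 + (1 − Q)x]_+^{1/(1−Q)} (the values of q and Z are arbitrary): both S_q = ln_q(Z) and S_{q^{−1}} = ln_{q^{−1}}(Z) map back to the same partition function under their respective deformed exponentials.

```python
def q_log(x, q):
    """q-deformed logarithm: (x^(1-q) - 1) / (1 - q)."""
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """q-deformed exponential, the inverse of q_log (with cut-off at 0)."""
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

q, Z = 0.8, 2.5                  # arbitrary entropic index (q < 1) and partition function
S_q = q_log(Z, q)                # standard entropy,  S_q     = ln_q(Z)
S_qinv = q_log(Z, 1.0 / q)       # dyadic entropy,    S_{1/q} = ln_{1/q}(Z)
# Both formulations encode the same partition function Z:
assert abs(q_exp(S_q, q) - Z) < 1e-12
assert abs(q_exp(S_qinv, 1.0 / q) - Z) < 1e-12
```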
In the continuous description, the kappa distribution of the particle velocity u is written as:

p(u) = (π κ_0 θ^2)^{−f/2} · [Γ(κ_0 + 1 + f/2)/Γ(κ_0 + 1)] · [1 + (u − u_b)^2/(κ_0 θ^2)]^{−κ_0−1−f/2},

for f kinetic degrees of freedom, where u_b denotes the bulk velocity and θ the thermal speed. The argument φ_q, which is related to the entropy as shown in the previous section, is computed by integrating the q-th power of the (dimensionless) distribution over the velocity space, where the speed scale σ_u is used in order to derive the number of possible states that the system can attain in the velocity space, i.e., du_1 du_2 . . . du_f / σ_u^f (see [24,53,96], Chapter 2).
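As a consistency check of the continuous form (a numerical sketch; the values of κ_0 and θ and the integration grid are illustrative, and the invariant-kappa normalization with exponent −κ_0 − 1 − f/2 is assumed), the f = 3 kappa velocity distribution integrates to unity over velocity space:

```python
import math

kappa0, theta, f = 3.0, 1.0, 3   # illustrative invariant kappa index, thermal speed, dof

# Normalization constant of the f = 3 kappa distribution (bulk velocity set to 0).
norm = ((math.pi * kappa0 * theta ** 2) ** (-f / 2)
        * math.gamma(kappa0 + 1 + f / 2) / math.gamma(kappa0 + 1))

def p(u):
    """Isotropic kappa distribution density at speed u."""
    return norm * (1.0 + u * u / (kappa0 * theta ** 2)) ** (-kappa0 - 1 - f / 2)

# Midpoint-rule radial integral over velocity space: 4*pi * int p(u) u^2 du = 1.
du, total, u = 1e-3, 0.0, 0.0
while u < 60.0:
    um = u + 0.5 * du
    total += 4.0 * math.pi * p(um) * um * um * du
    u += du
assert abs(total - 1.0) < 1e-3
```

The integrand falls off as u^{−9} for these parameters, so truncating the radial grid at 60 thermal speeds leaves a negligible tail.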
We can define the dimensionless probability distribution density, P(u) ≡ σ_u^f · p(u), and the corresponding dimensionless partition function Z_σ. Hence, φ_q = Z_σ^{1−q}, which is the continuous version of the identity φ_q = Z^{1−q}. The calculation of φ_q for the kappa distribution introduces the ratio σ ≡ σ_u/θ; then, setting κ = κ_0 + (1/2)·f and f = d·N_C, where N_C is the number of particles and d denotes the per-particle kinetic and potential degrees of freedom, the result is expressed through a modified speed scale parameter (the quantity σ used in Figure 1). The entropy is associated with Z_σ similarly to the discrete case, according to S_q = ln_q(Z_σ); it can also be written in terms of the kappa index. Figure 1 plots both entropic formulations, S_q and S_{q^{−1}}, as a function of the (invariant) kappa index κ_0. We observe that the entropy S_q decreases monotonically as the kappa index increases, while S_{q^{−1}} increases with the kappa index for stationary states near thermal equilibrium (κ_0 → ∞).
For large values of the kappa index tending to infinity, both entropic formulations tend to be equal to each other, recovering the BG entropy. The two functional behaviors are summarized as: (∂S_q/∂κ_0) < 0 for all κ_0, while (∂S_{q^{−1}}/∂κ_0) > 0 for stationary states near thermal equilibrium (κ_0 → ∞).

Figure 1. The entropic formulations S_q (red) and S_{q^{−1}} (blue) are plotted as a function of the (invariant) kappa index, for N_C correlated particles and d = 3 kinetic degrees of freedom per particle. We set the number of correlated particles to N_C = 10 (left) and N_C = 100 (right), and the modified speed scale parameter to σ = 0.9 (top) and σ = 0.5 (bottom) (the Boltzmann constant k_B is set to 1). For large kappa indices, both entropic formulations tend to be equal to each other, recovering the Boltzmann-Gibbs (BG) entropy.

6. Conclusions
This paper shows that the maximization of the Tsallis entropy can directly lead to the canonical probability distribution, expressed by a q-exponential or kappa distribution, without using the dyadic formalism of ordinary/escort probability distributions. In particular, it was shown (using the discrete description for simplicity) that the derived distribution is:

p_j = Z^{−1} [1 + (1/κ) β (ε_j − U)]^{−κ−1}.

The above distribution depends on the Q-index; it leads to the kappa distribution by the standard transformation between the kappa and Q indices, κ ≡ (Q − 1)^{−1}. The Q-index, however, is not exactly the same as the entropic index q, but is its inverse, Q = q^{−1}. The derived kappa distribution is applicable for positive kappa indices, i.e., κ > 0 or Q > 1, that is, for entropic index q < 1. However, the standard entropic formulation associated with the kappa distribution is characterized by q > 1.
This inconsistency has an unphysical result: the entropy does not increase with increasing kappa index for stationary states near thermal equilibrium (large kappa indices). As was shown for the continuous description, for example, the entropy monotonically decreases with the kappa index. In order to avoid this undesired result, a dyadic entropic formulation, with index Q = q^{−1} instead of q, must be defined:

exp_{q^{−1}}(S_{q^{−1}}) ≡ exp_q(S_q) = Z.

In summary, the escort expectation approach is the proper way to formulate non-extensive statistical mechanics. If we erroneously attempt to avoid the use of the escort probability, then we run into a physically inconsistent situation involving the connection of statistical mechanics with thermodynamics. In such a case, a double entropic formulation must be defined to resolve the inconsistency. More precisely, it is shown that if the known dyadic distribution formalism is used, then the entropic formulation is unique; however, if a unique distribution formulation is used, then a dyadic entropic formulation is necessary, according to the scheme in Figure 2. Therefore, we conclude that simplification by means of avoiding dyadic formalism (either of distributions or of entropic formulations) is impossible within the framework of non-extensive statistical mechanics (in order to be consistent with thermodynamics).
