Comments on Black Hole Entropy: A Closer Look

In a recent paper [Entropy 2020, 22(1), 17], C. Tsallis states that entropy -- as in Shannon's or Kullback-Leibler's definitions -- is inadequate to interpret black hole entropy, and suggests a new non-additive functional that should take the role of entropy. Here we argue that Tsallis' motivations are unsound, stemming from a misunderstanding of the concepts of additivity and extensivity. We also point out other problems in his analysis of black hole entropy.


Introduction
In his paper, Tsallis [1] presents some questionable statements on black hole entropy. He affirms that, since Bekenstein-Hawking (BH) entropy is not proportional to the black hole volume, Boltzmann-Gibbs (BG) entropy would be inappropriate to describe black holes. He then proposes a different non-additive functional meant to replace entropy.
This article is organized to rebut this idea. In Section 2, we present the foundations of entropy and show that the additive entropy functional does not necessarily lead to extensivity. In Section 3, we revisit the laws of black hole thermodynamics and show how they can be accurately accounted for on the basis of additive entropy.

Entropic Foundations
The work of Jaynes [2,3] solidified the ideas of Boltzmann and Gibbs for the foundations of statistical physics by designing the method of maximum entropy. Probability distributions p(x) should be selected by maximizing the entropy

S[p] = -\sum_x p(x) \ln\frac{p(x)}{q(x)},    (1)

under constraints meant to represent the relevant information for the problem at hand. Here q(x) is a prior distribution, generally taken to be uniform. In the context of density matrices in quantum mechanics, it takes the form of the Umegaki entropy,

S[\rho] = -\mathrm{Tr}\left[\rho\left(\ln\rho - \ln\varrho\right)\right],    (2)

where \varrho is the prior density matrix. The form of entropy is deeply attached to fundamental concepts of statistical inference. Since the work of Shannon [4], there has been deep foundational research [5][6][7][8] on the criteria that an entropy functional should obey. A modern understanding [8] is that entropy should abide by two design criteria (DC) that can be roughly stated as: DC1, subdomain independence -- local information should have only local effects -- and DC2, subsystem independence -- a priori independent subsystems should remain independent, unless the constraints explicitly require otherwise. The unique functional that fits these criteria is (1), also called the Kullback-Leibler (KL) entropy. Tsallis calls (1), and several particular cases of it, BG entropies.
A consequence of DC2 that can be directly seen from (1) is that entropy is additive. That means, if a state x can be written as a composition of two substates, x = (x_1, x_2) \in X = X_1 \times X_2, and the subsystems are statistically independent, q(x) = q(x_1)q(x_2) and p(x) = p(x_1)p(x_2), the entropy for the composite system is the sum of the entropies for each subsystem,

S[p] = S[p_1] + S[p_2].    (3)

The maximization of (1) after the imposition of constraints in the form of expected values for a set of functions a_i(x) -- referred to as sufficient statistics -- and respective constants A_i,

\sum_x p(x)\, a_i(x) = A_i,    (4)

is used to infer probabilities in statistical physics. For example, internal energy, total volume, and magnetization are identified as constants A_i in (4). Since maximization of the entropy under the set of values A = (A_1, A_2, . . . , A_k) and the prior q(x) determines a unique posterior, we are led to write the chosen distribution as p(x|A), and the maximum value of entropy as S(A) = S[p(x|A)]: a function of A that is identified as the thermodynamical entropy, the one that appears in the formulation of the laws of thermodynamics [9,10].
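As an illustration, the additivity property (3) can be checked numerically for discrete distributions. A minimal sketch (the distributions and priors below are arbitrary choices for illustration only):

```python
import numpy as np

def kl_entropy(p, q):
    # relative (KL) entropy S[p|q] = -sum_x p(x) ln(p(x)/q(x)), as in (1)
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return -np.sum(p[mask] * np.log(p[mask] / q[mask]))

# two subsystems with arbitrary distributions and uniform priors
p1, q1 = np.array([0.7, 0.3]), np.array([0.5, 0.5])
p2, q2 = np.array([0.2, 0.5, 0.3]), np.array([1/3, 1/3, 1/3])

# statistically independent composite: p(x) = p(x1) p(x2), same for q
p12, q12 = np.outer(p1, p2).ravel(), np.outer(q1, q2).ravel()

# additivity (3): the composite entropy is the sum of subsystem entropies
assert np.isclose(kl_entropy(p12, q12),
                  kl_entropy(p1, q1) + kl_entropy(p2, q2))
```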
In thermodynamics, one is typically interested in properties of the system only in the thermodynamic limit. If the system in equilibrium is homogeneous, one can divide it into N ≫ 1 identical parts which are small compared to the full system, but large enough so that the thermodynamic limit is still descriptive of the individual parts. In well studied applications of statistical physics (e.g. the van der Waals gas and the Ising model), the interactions between constituents are short-ranged, meaning that if the N subsystems are large enough, each sufficient statistic can be approximated as a sum of functions calculated separately for the subsystems.
In the appendix, we show how this condition leads to independent probabilities for each subsystem, and also how the quantities A_i and the thermodynamical entropy S(A), seen as functions of N, are homogeneous of degree 1, therefore extensive. In this description, extensivity, as a property of the thermodynamical entropy of some systems, is neither fundamental nor a direct consequence of the additivity of entropy. Rather, it is a property that emerges from approximations in the sufficient statistics as the method of maximum entropy is applied.
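This mechanism can be illustrated numerically. For N identical, non-interacting two-level subsystems (a toy model we assume here, with a uniform prior, whose contribution is only an additive, itself extensive, constant), the joint maximum entropy distribution factorizes and the thermodynamical entropy comes out exactly extensive:

```python
import numpy as np

def gibbs_entropy(energies, beta):
    # entropy of the Gibbs (maximum entropy) distribution; Shannon form,
    # since a uniform prior contributes only an additive constant
    w = np.exp(-beta * np.asarray(energies, float))
    p = w / w.sum()
    return -np.sum(p * np.log(p))

E = np.array([0.0, 1.0])  # single two-level subsystem (toy model)
beta, N = 0.7, 5

# non-interacting (short-ranged) case: joint energy is a sum over subsystems
joint_E = sum(np.meshgrid(*([E] * N), indexing="ij")).ravel()

# extensivity: the composite entropy equals N times the subsystem entropy
assert np.isclose(gibbs_entropy(joint_E, beta), N * gibbs_entropy(E, beta))
```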
Tsallis claims: "We frequently verify perplexity by various authors that the entropy of a black hole appears to be proportional to its [black hole's] area whereas it was expected to be proportional to its volume. Such perplexity is tacitly or explicitly based on a sort of belief (i) that a black hole is a three-dimensional system, and (ii) that its thermodynamic entropy is to be understood as its Boltzmann-Gibbs (BG) one." It is correct that BH entropy is not proportional to the volume of the black hole [11], but this comes as no surprise, for two independent reasons. First, for a black hole, mass and volume are not linear in each other. It is much more usual in statistical physics to constrain the expected value of the total energy than of the volume, so, even if the entropy were proportional to the mass of a black hole, it could still fail to be proportional to its "volume". Second, a system bound by self-interaction is a clear example of violation of the criteria for statistical independence explained before. Such systems require a long-range attractive interaction between their parts, so the energy of the total system is necessarily considerably smaller than the sum of the energies of its parts.
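The scaling mismatch is easy to make concrete. For a Schwarzschild black hole in natural units (k_B = ħ = c = G = 1), the horizon radius is r = 2M and the horizon area is A = 16πM², so the BH entropy A/4 grows as M², while a naive Euclidean "volume" built from the horizon radius grows as M³. A numerical sketch (the "volume" here is only the naive flat-space expression, used for illustration):

```python
import numpy as np

# Schwarzschild, natural units: r = 2M, A = 16 pi M^2, S_BH = A/4 = 4 pi M^2
M = np.array([1.0, 2.0, 4.0])
S = 4 * np.pi * M**2              # BH entropy ~ area ~ M^2
V = (4 / 3) * np.pi * (2 * M)**3  # naive flat-space "volume" ~ M^3

# doubling the mass quadruples the entropy but multiplies the "volume" by 8,
# so the entropy is proportional neither to this "volume" nor to M itself
assert np.isclose(S[1] / S[0], 4.0)
assert np.isclose(V[1] / V[0], 8.0)
```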
So, when Tsallis claims, "It is our standpoint that, whenever the additive entropic functional S BG is thermodynamically nonextensive, the entropic functional to be used for all thermodynamical issues is not given by Equations (1)-(4), but by a nonadditive one instead", his argument is flawed: the fact that the entropy of a system happens not to be extensive does not mean the entropy functional needs to be changed to a non-additive one. The thermodynamical entropy calculated from (1) gives the correct thermodynamical description regardless of extensivity [9], which, incidentally, is not expected for a black hole.
To conclude our discussion of the foundations of entropy, we want to refer to authors who have reported that (i) substituting entropy by a non-additive functional leads to inconsistent statistics [12][13][14][15], as expected from the violation of the DC, and (ii) that distributions once thought to arise only from these non-additive functionals can, indeed, be found by maximizing (1) [16][17][18][19]. This confirms, from different perspectives, that (1) is the appropriate form for entropy.

Black Hole Specifics
In his article, Tsallis presents questionable statements about the entropy of black holes. Plenty of references are provided for the (correct) claim that, in semiclassical, standard general relativity, the entropy of a black hole is given by the BH formula, which, in units such that k_B = ħ = c = G = 1, is written as

S_{BH} = \frac{A}{4},    (5)

where A is the area of a section of the event horizon. However, Tsallis claims: "In all such papers it is tacitly assumed that S BG [BH formula] is the thermodynamic entropy of the black hole. But, as argued in [24,26], there is no reason at all for this being a correct assumption." Contrary to this claim, black holes were found to satisfy the laws of thermodynamics when assigned an entropy given by (5), without appeal to arguments coming from probabilities or statistical mechanics. At present, we are not aware of any direct, explicit calculation in the literature starting from (1) and leading to expression (5).
Specifically, black holes are assigned an entropy thanks to the combination of two key findings. First, the mathematical analogy between the laws of black hole mechanics and the laws of thermodynamics [20]: namely, the constancy of the surface gravity κ across the horizon of a stationary black hole is analogous to the zeroth law of thermodynamics. The analogue of the first law is the constraint obeyed by variations δ in the parameters of stationary black holes,

(\alpha\kappa)\,\delta\!\left(\frac{A}{8\pi\alpha}\right) = \delta M,

where M is the mass of the black hole, which, for simplicity, is assumed to be static and charge-free, and α is an arbitrary constant. And the second law is analogous to the area theorem, a general result implying that, under a general class of assumptions, the area of sections of the event horizon cannot decrease as one moves to the future.
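For the Schwarzschild case, this first-law constraint can be verified directly: with the standard expressions A = 16πM² and κ = 1/(4M) in the same units, (ακ)δ[A/(8πα)] = (κ/8π)δA reduces to δM for any α. A finite-difference sketch:

```python
import numpy as np

def area(M):
    # Schwarzschild horizon area, natural units: A = 16 pi M^2
    return 16 * np.pi * M**2

def kappa(M):
    # Schwarzschild surface gravity: kappa = 1/(4M)
    return 1.0 / (4.0 * M)

M, dM = 1.0, 1e-6
dA = area(M + dM) - area(M)

# first law of black hole mechanics: (kappa / 8 pi) dA = dM
assert np.isclose(kappa(M) / (8 * np.pi) * dA, dM, rtol=1e-4)
```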
The second ingredient is the discovery that the analogue of temperature in these laws, ακ, is precisely the physical temperature of the Hawking radiation when α = 1/(2π) [21]. This radiation is a consequence of effects of quantum mechanical origin. Since M also possesses a physical meaning as the energy of the black hole, the laws of black hole mechanics were elevated to thermodynamic significance [22]. But at this point, no derivation came from first principles of statistical mechanics, and no calculation of entropy from (1) leading to (5) was necessary for the conclusion to hold.
For similar reasons, the claim "In what concerns thermodynamics, the spatial dimensionality of a (3+1) black hole depends on whether its bulk (inside its event horizon or boundary) has or not non negligible amount of matter oranalogous (sic) physical information." cannot be sustained. Neither the classical laws of black hole mechanics [20,23] nor the derivation of the Hawking effect [21,25,26] assumes a particular form (2- or 3-dimensional) for the distribution of classical matter inside the black hole. More than that, it is justifiable to assign a non-zero entropy to an eternal, Kruskal-Szekeres black hole [24,26], which has zero classical matter everywhere. This strongly suggests that any attempt to frame black hole entropy on common grounds with the entropy of ordinary matter should not rely on assigning an entropy density to the matter. In fact, as recently shown by one of us [27], general properties of the entropy functional as in (2) account for the origin of the first and second laws of black hole thermodynamics. Hence, per Section 2, there are common grounds on which one can understand the origin of both ordinary and black hole thermodynamics. On these grounds, entropy is associated with the information one has about the system. Thus, we do not see the need for moving away from this definition and/or interpretation of entropy.
For black holes, the existence of an event horizon prevents an external observer from knowing the initial data for quantum fields. Roughly speaking, variations of the entropy (2) associated with the reduced state of free quantum fields obtained by taking the partial trace over these inaccessible degrees of freedom generate all the variations of entropy appearing in the laws of black hole thermodynamics [27].

Conclusion
We were able to invalidate many arguments of Tsallis' paper. From statistical physics considerations, the fact that BH entropy is not proportional to the black hole "volume" is not indicative of an issue. Substituting KL entropy by a non-additive functional violates principles of statistical inference and is not necessary to explain non-extensive thermodynamical entropies. There are no direct calculations leading from BG entropy to the BH formula. Lastly, there is solid foundation for interpreting the laws of black hole thermodynamics in terms of additive entropy.

A Extensivity and subsystem independence
In this appendix, we derive the extensivity properties of expected values and entropy in the thermodynamical limit from the hypotheses discussed in the main text.
The distribution that maximizes (1) under constraints given by (4) is a Gibbs distribution,

p(x|A) = \frac{q(x)}{Z} \exp\left(-\sum_i \lambda_i a_i(x)\right),    (6)

where each λ_i is the Lagrange multiplier associated with the expected value constraint for A_i. For a system divided into N subsystems, X = X_1 \times X_2 \times . . . \times X_N, whose prior is independent in this division, q(x) = \prod_{j=1}^N q(x_j), we can examine the case in which each sufficient statistic can be (approximately) written as

a_i(x) = \sum_{j=1}^N \tilde{a}_i(x_j),    (7)

where the \tilde{a}_i(x_j) are functions of the substates x_j \in X_j only. Then (6) can be written as a product of probabilities for the substates,

p(x|A) = \prod_{j=1}^N \frac{q(x_j)}{Z_j} \exp\left(-\sum_i \lambda_i \tilde{a}_i(x_j)\right).    (8)

If we, instead, search for the maximum entropy distribution separately for each subsystem j under a similar constraint, \langle \tilde{a}_i(x_j) \rangle = \tilde{A}_i^j, each factor in the product (8) coincides with the maximum entropy distribution for the subsystem j, allowing (8) to be written as

p(x|A) = \prod_{j=1}^N p(x_j|\tilde{A}^j).    (9)

Therefore (7) implies that the maximum entropy distribution is one in which the subsystems are statistically independent, and A_i = \sum_{j=1}^N \tilde{A}_i^j. If we compute the thermodynamical entropy for each subsystem we have, from (3), S(A) = \sum_{j=1}^N S(\tilde{A}^j). In the case where the subsystems are identical, it follows directly that S(A) = N S(A/N).
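The factorization of the joint Gibbs distribution into subsystem maximum entropy distributions can be checked numerically for a small toy system (three three-level subsystems with a single additive sufficient statistic and a uniform prior; all numbers below are arbitrary choices for illustration):

```python
import numpy as np

E = np.array([0.0, 0.5, 1.0])  # one subsystem: a(x_j) takes these values
beta, N = 1.3, 3               # one Lagrange multiplier, three subsystems

# additive sufficient statistic: a(x) = sum_j a(x_j)
joint_E = sum(np.meshgrid(*([E] * N), indexing="ij"))

# joint Gibbs distribution (uniform prior absorbed into normalization)
p_joint = np.exp(-beta * joint_E)
p_joint /= p_joint.sum()

# subsystem Gibbs distribution with the same multiplier
p_sub = np.exp(-beta * E)
p_sub /= p_sub.sum()

# the joint distribution factorizes: p(x|A) = prod_j p(x_j | A_j)
p_prod = p_sub[:, None, None] * p_sub[None, :, None] * p_sub[None, None, :]
assert np.allclose(p_joint, p_prod)
```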