
# Rényi Entropy and Free Energy

Department of Mathematics, University of California, Riverside, CA 92507, USA
Entropy 2022, 24(5), 706; https://doi.org/10.3390/e24050706
Received: 28 March 2022 / Revised: 11 May 2022 / Accepted: 15 May 2022 / Published: 16 May 2022
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)

## Abstract

The Rényi entropy is a generalization of the usual concept of entropy which depends on a parameter q. In fact, Rényi entropy is closely related to free energy. Suppose we start with a system in thermal equilibrium and then suddenly divide the temperature by q. Then the maximum amount of work the system can perform as it moves to equilibrium at the new temperature, divided by the change in temperature, equals the system's Rényi entropy in its original state. This result applies to both classical and quantum systems. Mathematically, we can express this result as follows: the Rényi entropy of a system in thermal equilibrium is minus the '$q^{-1}$-derivative' of its free energy with respect to the temperature. This shows that Rényi entropy is a q-deformation of the usual concept of entropy.

## 1. Introduction

In 1960, Rényi [1] defined a generalization of Shannon entropy which depends on a parameter. If p is a probability distribution on a finite set, its Rényi entropy of order q is defined to be
$S_q = \frac{1}{1 - q} \ln \sum_i p_i^q$
where $0 < q < \infty$. Of course, we need $q \ne 1$ to avoid dividing by zero, but L'Hôpital's rule shows that the Rényi entropy approaches the Shannon entropy as q approaches one:
$\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .$
Thus, it is customary to define $S_1$ to be the Shannon entropy.
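As a quick numerical check, here is a small sketch in Python (the example distribution is an arbitrary choice) computing the Rényi entropy and confirming that it approaches the Shannon entropy as q approaches one:

```python
import math

def renyi_entropy(p, q):
    """Renyi entropy S_q of a probability distribution p (in units where k = 1)."""
    if q == 1.0:
        # Shannon limit, obtained via L'Hopital's rule
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi**q for pi in p)) / (1.0 - q)

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.9, 0.999, 1.0, 2.0):
    # S_q decreases in q, and values near q = 1 approach the Shannon entropy
    print(q, renyi_entropy(p, q))
```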
While Shannon entropy has a deep relation to thermodynamics, Rényi entropy has not been completely integrated into this subject, or at least not in a well-recognized way. While many researchers have tried to modify statistical mechanics by changing the usual formula for entropy, so far, the most convincing uses of Rényi entropy in physics seem to involve the limiting cases $S_0 = \lim_{q \to 0} S_q$ and $S_\infty = \lim_{q \to +\infty} S_q$. These are known as the 'max-entropy' and 'min-entropy', respectively, since $S_q$ is a decreasing function of q. They show up in studies on the work value of information [2] and the thermodynamic meaning of negative entropy [3]. For other interpretations of Rényi entropy, see the works of Harremöes [4], König et al. [5], and Uffink [6].
In fact, it is not necessary to modify statistical mechanics to find a natural role for Rényi entropy in physics. Rényi entropy is closely related to the familiar concept of free energy, with the parameter q appearing as a ratio of temperatures.
The trick is to think of the probability distribution as a Gibbs state, or the state of thermal equilibrium for some Hamiltonian at some chosen temperature, say $T_0$. Suppose that all the probabilities $p_i$ are nonzero. Then, working in units where Boltzmann's constant equals one, we can write
$p_i = e^{-E_i/T_0}$
for some nonnegative real numbers $E_i$. If we think of these numbers as the energies of microstates of some physical system, the Gibbs state of this system at temperature T is the probability distribution
$\frac{e^{-E_i/T}}{Z(T)}$
where Z is the partition function:
$Z(T) = \sum_{i \in X} e^{-E_i/T}$
Since $Z(T_0) = 1$, the Gibbs state reduces to our original probability distribution p at this temperature.
Starting from these assumptions, the free energy
$F(T) = -T \ln Z(T)$
is related to the Rényi entropy as follows:
$F(T) = -(T - T_0) \, S_{T_0/T}$
The proof is an easy calculation:
$S_{T_0/T} = \frac{1}{1 - T_0/T} \ln \sum_i p_i^{T_0/T} = \frac{T}{T - T_0} \ln \sum_i e^{-E_i/T} = -\frac{F(T)}{T - T_0}.$
This works for $T \ne T_0$, but we can use L'Hôpital's rule to show that in the limit $T \to T_0$, both sides converge to the Shannon entropy $S_1$ of the original probability distribution p.
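The identity $F(T) = -(T - T_0)\,S_{T_0/T}$ is easy to verify numerically. A sketch in Python (the distribution and temperatures are arbitrary choices), recovering the energies via $E_i = -T_0 \ln p_i$:

```python
import math

T0 = 1.5                 # chosen temperature where p is the Gibbs state
p = [0.5, 0.3, 0.2]      # arbitrary distribution with nonzero probabilities
E = [-T0 * math.log(pi) for pi in p]   # energies, so that p_i = exp(-E_i / T0)

def Z(T):
    """Partition function Z(T) = sum_i exp(-E_i / T); note Z(T0) = 1."""
    return sum(math.exp(-Ei / T) for Ei in E)

def F(T):
    """Free energy F(T) = -T ln Z(T)."""
    return -T * math.log(Z(T))

def renyi(q):
    """Renyi entropy of order q of the distribution p."""
    return math.log(sum(pi**q for pi in p)) / (1.0 - q)

T = 2.5                  # any temperature different from T0
lhs = F(T)
rhs = -(T - T0) * renyi(T0 / T)
print(lhs, rhs)          # the two sides agree
```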
After the author noticed this result in the special case $T_0 = 1$ [7], Stacey commented that this case was already mentioned in Beck and Schlögl's 1995 text on the thermodynamics of chaotic systems [8]. However, most people using Rényi entropy were unaware of its connection to free energy, perhaps because they work on statistical inference rather than physics [9]. Thus, the author put a version of this note on the arXiv in 2011 [10]. It has subsequently been cited 77 times, which suggests that it was indeed useful. We have therefore decided to publish it.
Shortly after the first draft of this note was released, Polettini gave a nice physical interpretation of Rényi entropy [11]. Downes then made a further generalization [12]. We explain those ideas here. The above argument concerns a system with Gibbs state $p_i = \exp(-E_i/T_0)$ at a chosen temperature $T_0$. Such a system automatically has zero free energy at this chosen temperature. Downes generalized the relation between Rényi entropy and free energy to systems whose free energy is not constrained in this way. Polettini's physical interpretation of Rényi entropy can be extended to these more general systems, and we describe this interpretation in what follows. We also explain how the Rényi entropy is a 'q-deformation' of the ordinary notion of entropy. This complements the work of Abe on another generalization of entropy: the Tsallis entropy [13].
In what follows, we work in a quantum rather than classical context, using a density matrix instead of a probability distribution. However, we can diagonalize any density matrix, and then its diagonal entries define a probability distribution. Thus, all our results apply to classical as well as quantum systems. The quantum generalization of Shannon entropy is, of course, well-known: it is the von Neumann entropy. The quantum generalization of Rényi entropy is also already known [14].

## 2. Rényi Entropy as a $q$-Derivative of Free Energy

Let H be a self-adjoint complex matrix. Thinking of H as the Hamiltonian of a quantum system, and no longer assuming that Boltzmann's constant k equals 1, we may define the Gibbs state of this system at temperature T to be the density matrix
$\rho_T = \frac{1}{Z(T)} e^{-H/kT}$
where the partition function
$Z(T) = \mathrm{tr}(e^{-H/kT})$
ensures that $\mathrm{tr}(\rho_T) = 1$. The Helmholtz free energy at temperature T is defined by
$F(T) = -kT \ln Z(T).$
On the other hand, for any density matrix $\rho$, the quantum generalization of Rényi entropy is defined by
$S_q(\rho) = k \, \frac{\ln \mathrm{tr}(\rho^q)}{1 - q}$
since this formula reduces to the usual definition of Rényi entropy, Equation (1), when the probabilities $p_i$ are the eigenvalues of $\rho$ and we set $k = 1$. This formula makes sense when $0 < q < \infty$ and $q \ne 1$, but we can define the quantum Rényi entropy as a limit in the special cases $q = 0, 1, +\infty$. For $q = 1$, this gives the usual von Neumann entropy:
$S_1(\rho) := \lim_{q \to 1} S_q(\rho) = -k \, \mathrm{tr}(\rho \ln \rho).$
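For concreteness, the quantum Rényi entropy can be computed from the eigenvalues of the density matrix. A sketch in Python using NumPy (the sample density matrix below is an arbitrary illustration):

```python
import numpy as np

def quantum_renyi(rho, q, k=1.0):
    """Quantum Renyi entropy S_q(rho) = k ln tr(rho^q) / (1 - q),
    computed from the eigenvalues of the density matrix rho."""
    eigs = np.linalg.eigvalsh(rho)   # rho is self-adjoint, so eigenvalues are real
    eigs = eigs[eigs > 1e-12]        # discard (numerically) zero eigenvalues
    if q == 1.0:
        # von Neumann limit: -k tr(rho ln rho)
        return -k * float(np.sum(eigs * np.log(eigs)))
    return k * float(np.log(np.sum(eigs**q))) / (1.0 - q)

# An arbitrary density matrix: self-adjoint, positive, trace one.
rho = np.array([[0.7, 0.1],
                [0.1, 0.3]])
print(quantum_renyi(rho, 2.0), quantum_renyi(rho, 1.0))
```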
Returning to our system with Gibbs state $\rho_T$ at temperature T, let us write $S_q(T)$ for $S_q(\rho_T)$. By computing this Rényi entropy at some temperature $T_0$, we find
$S_q(T_0) = k \, \frac{\ln \mathrm{tr}(\rho_{T_0}^q)}{1 - q} = \frac{k}{1 - q} \ln \frac{\mathrm{tr}\, e^{-qH/kT_0}}{Z(T_0)^q} = \frac{k}{1 - q} \left( \ln Z(T_0/q) - q \ln Z(T_0) \right)$
If we define a new temperature T with
$q = T_0 / T ,$
we obtain
$S_q(T_0) = k \, \frac{\ln Z(T) - q \ln Z(T_0)}{1 - q} = k \, \frac{T \ln Z(T) - T_0 \ln Z(T_0)}{T - T_0}$
or, in short, the following:
$S_{T_0/T}(T_0) = -\frac{F(T) - F(T_0)}{T - T_0}.$
This equation is the clearest way to express the relation between Rényi entropy and free energy. In the special case where the free energy vanishes at temperature $T_0$, it reduces to Equation (2). In the limit $T \to T_0$, it reduces to
$S_1(T_0) = -\left. \frac{dF(T)}{dT} \right|_{T = T_0}.$
Of course, it is already well-known that the von Neumann entropy is the derivative of $− F$ with respect to the temperature. What we see now is that the Rényi entropy is the difference quotient approximating this derivative. Instead of the slope of the tangent line, it is the slope of the secant line.
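This secant-line relation is easy to check numerically for a Gibbs state whose free energy does not vanish at $T_0$. A sketch in Python (the energies and temperatures are arbitrary choices, with k = 1):

```python
import math

E = [0.0, 1.0, 3.0]      # arbitrary energies; here Z(T0) need not equal 1
T0, T = 2.0, 5.0
q = T0 / T

def Z(T):
    """Partition function Z(T) = sum_i exp(-E_i / T)."""
    return sum(math.exp(-Ei / T) for Ei in E)

def F(T):
    """Free energy F(T) = -T ln Z(T)."""
    return -T * math.log(Z(T))

# Gibbs state at temperature T0:
p = [math.exp(-Ei / T0) / Z(T0) for Ei in E]

S_q = math.log(sum(pi**q for pi in p)) / (1.0 - q)   # Renyi entropy of the Gibbs state
secant = -(F(T) - F(T0)) / (T - T0)                  # slope of the secant line of -F
print(S_q, secant)       # the Renyi entropy equals the secant slope
```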
In fact, we can say a bit more: the Rényi entropy is the '$q^{-1}$-derivative' of the negative free energy. For $q \ne 1$, the q-derivative of a function f is defined by
$\left( \frac{df}{dx} \right)_q = \frac{f(qx) - f(x)}{qx - x}.$
This reduces to the ordinary derivative in the limit $q \to 1$. The $q^{-1}$-derivative is defined the same way but with $q^{-1}$ replacing q. Equation (9) can be rewritten more tersely using this concept:
$S_q = -\left( \frac{dF}{dT} \right)_{q^{-1}}$
Here we have made a change of variables, writing T for the variable called $T_0$ in Equation (9).
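A minimal sketch of the q-derivative in Python (the test function $f(x) = x^2$ is an arbitrary illustration; its q-derivative is $(q+1)x$, which tends to the ordinary derivative $2x$ as $q \to 1$):

```python
def q_derivative(f, x, q):
    """The q-derivative (df/dx)_q = (f(qx) - f(x)) / (qx - x), for q != 1."""
    return (f(q * x) - f(x)) / (q * x - x)

f = lambda x: x**2
print(q_derivative(f, 3.0, 2.0))    # (q + 1) x = 3 * 3 = 9
print(q_derivative(f, 3.0, 1.001))  # close to the ordinary derivative 2 x = 6
```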
The concept of the q-derivative shows up in mathematics whenever we 'q-deform' familiar structures, obtaining new ones such as quantum groups. For an introduction, see the text by Cheung and Kac [15]. In some cases, q-deformation should be thought of as quantization, with q playing the role of $e^{\hbar}$. That is definitely not the case here: the parameter q in our formulas is unrelated to Planck's constant. Indeed, Equation (11) holds in classical as well as quantum mechanics.
What, then, is the thermodynamic meaning of Rényi entropy? This was pointed out by Polettini [11]. Start with a physical system in thermal equilibrium at some temperature. Then 'quench' it, suddenly dividing the temperature by q. The maximum amount of work the system can perform as it moves to thermal equilibrium at the new temperature, divided by the change in temperature, equals the system's Rényi entropy of order q in its original state. Note that this formulation even accounts for the minus sign in Equation (9), because it speaks of the work the system performs rather than the work performed on it.

## Funding

This research received no external funding.

## Acknowledgments

I thank all the members of the Entropy Club at the Centre for Quantum Technologies and especially Oscar Dahlsten for the exciting discussions in which these ideas emerged. I thank David Corfield and Tom Leinster for the useful discussion of Rényi entropy and Blake Stacey for pointing out references relating it to free energy. I especially thank Matteo Polettini and Eric Downes for suggestions that vastly improved this paper.

## Conflicts of Interest

The author declares no conflict of interest.

## References

1. Rényi, A. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1960; Neyman, J., Ed.; UC Press: Berkeley, CA, USA, 1961; Volume 1, pp. 547–561.
2. Dahlsten, O.; Renner, R.; Rieper, E.; Vedral, V. On the work value of information. New J. Phys. 2011, 13, 053015. Available online: http://arxiv.org/abs/0908.0424 (accessed on 15 May 2022).
3. del Rio, L.; Aberg, J.; Renner, R.; Dahlsten, O.; Vedral, V. The thermodynamic meaning of negative entropy. Nature 2011, 474, 61–63. Available online: http://arxiv.org/abs/1009.1630 (accessed on 15 May 2022).
4. Harremöes, P. Interpretations of Rényi entropies and divergences. Phys. A Stat. Mech. Appl. 2006, 365, 57–62. Available online: http://arxiv.org/abs/math-ph/0510002 (accessed on 15 May 2022).
5. König, R.; Renner, R.; Schaffner, C. The operational meaning of min- and max-entropy. IEEE Trans. Inf. Theory 2009, 55, 4337–4347. Available online: http://arxiv.org/abs/0807.1338 (accessed on 15 May 2022).
6. Uffink, J. Can the maximum entropy principle be explained as a consistency requirement? Sec. 6: Justification by consistency: Shore and Johnson. Stud. Hist. Phil. Sci. 1995, B26, 223–261.
7. Baez, J.C. Rényi Entropy and Free Energy. Azimuth. Available online: http://johncarlosbaez.wordpress.com/2011/02/10/rnyi-entropy-and-free-energy/ (accessed on 10 February 2011).
8. Beck, C.; Schlögl, F. Thermodynamics of Chaotic Systems; Cambridge University Press: Cambridge, UK, 1995.
9. Erdogmus, D.; Xu, D. Rényi's entropy, divergence and their nonparametric estimators. In Information Theoretic Learning: Rényi's Entropy and Kernel Perspectives; Principe, J., Ed.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 47–102.
10. Baez, J.C. Rényi Entropy and Free Energy. 2011. Available online: https://arxiv.org/abs/1102.2098v3 (accessed on 15 March 2022).
11. Polettini, M. Rényi Entropy and Free Energy. Matteoeo, 10 February 2011. Available online: https://web.archive.org/web/20120124091413/http://tomate.blogsome.com/2011/02/10/renyi-entropy-and-free-energy/ (accessed on 15 March 2022).
12. Downes, E. Comment on Rényi Entropy and Free Energy. Azimuth. Available online: http://johncarlosbaez.wordpress.com/2011/02/10/rnyi-entropy-and-free-energy/#comment-4065 (accessed on 15 May 2022).
13. Abe, S. A note on the q-deformation-theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A 1997, 224, 326–330.
14. van Dam, W.; Hayden, P. Rényi-Entropic Bounds on Quantum Communication, Sec. 4.1: Rényi Entropy. Available online: http://arxiv.org/abs/quant-ph/0204093 (accessed on 15 May 2022).
15. Cheung, P.; Kac, V. Quantum Calculus; Springer: Berlin/Heidelberg, Germany, 2002.

## Share and Cite

Baez, J.C. Rényi Entropy and Free Energy. Entropy 2022, 24, 706. https://doi.org/10.3390/e24050706