# Entropy, Carnot Cycle, and Information Theory

## Abstract


## 1. Carnot Cycle and Thermodynamics

where $p_i$ is the probability of finding the system in the $i$-th state. The "Helmholtz potential", or free energy $F$, is by definition
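As an illustration of these definitions, the following sketch (assuming the standard canonical-ensemble forms $p_i = e^{-\varepsilon_i/kT}/Z$ and $F = -kT\ln Z$, with Boltzmann's constant folded into $kT$; the energy values are hypothetical) computes the distribution and verifies the identity $F = \langle U\rangle - TS$:

```python
import math

def canonical_ensemble(energies, kT):
    """Boltzmann probabilities, partition function Z, and free energy F = -kT ln Z."""
    weights = [math.exp(-e / kT) for e in energies]
    Z = sum(weights)
    p = [w / Z for w in weights]
    F = -kT * math.log(Z)
    return p, Z, F

# A hypothetical three-level system at kT = 1 (arbitrary units)
energies = [0.0, 1.0, 2.0]
p, Z, F = canonical_ensemble(energies, kT=1.0)

# Consistency check: F = <U> - T*S, with S = -sum(p_i ln p_i) in units of k
U = sum(pi * e for pi, e in zip(p, energies))
S = -sum(pi * math.log(pi) for pi in p)
assert abs(F - (U - 1.0 * S)) < 1e-9
```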

Assuming a constant specific heat $c_v$, Equation (7) becomes

The cycle operates between the two temperatures $T_2$ and $T_1$. In fact, although the first and third terms in Equation (8) cancel each other, the second term prevails over the fourth, and a net work is hence generated over the cycle.
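The cancellation of the two adiabatic contributions and the resulting net work can be illustrated with a small ideal-gas sketch (a hypothetical numerical example, not the paper's Equation (8) itself):

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def carnot_work(n, T_hot, T_cold, V_a, V_b):
    """Net work of an ideal-gas Carnot cycle between T_hot and T_cold.

    The two adiabatic works cancel; the isothermal expansion at T_hot
    outweighs the isothermal compression at T_cold by (T_hot - T_cold) * dS.
    """
    dS = n * R * math.log(V_b / V_a)   # entropy taken from the hot reservoir
    Q_hot = T_hot * dS                 # heat absorbed in the isothermal expansion
    W_net = (T_hot - T_cold) * dS      # net work after the four strokes
    efficiency = W_net / Q_hot         # equals 1 - T_cold / T_hot
    return W_net, efficiency

W, eta = carnot_work(n=1.0, T_hot=400.0, T_cold=300.0, V_a=1.0, V_b=2.0)
assert abs(eta - (1 - 300.0 / 400.0)) < 1e-12
```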

## 2. Carnot Cycle and Information Theory

Suppose now a perturbation of the energy levels $\varepsilon_i$. This perturbation affects the final partition function, which now becomes

The system thus passes from an initial state described by an "a priori" distribution ($p_i$) to a final state described by an "a posteriori" distribution ($q_i$). According to this point of view, they describe Equation (22) as "…the free-energy difference consists of two terms: the average free-energy of the individual compartments (note: in [14] the enumeration of the states characterized by different energy labels is called "compartments") and a cost term that measures the information-theoretic distance between the initial and final information state, which is then converted into units of energy". Moreover, they note that "While the expression of the free-energy instantiates a trade-off between the internal energy $\langle U\rangle$ and the entropic cost $S$…we generalize these previous models of bounded rationality based on the duality between information and utility: instead of considering the absolute free-energy $F$ we consider the difference in free-energy $\Delta F$ between an initial state and a final state corresponding to the situation before and after the deliberation associated with the decision-making process". Following this interpretation, the "variational free-energy principle" can be applied to different contexts: perceptual decision making, sensorimotor control, and so on [14].
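A minimal sketch of the cost term in this decomposition, assuming it takes the usual Kullback–Leibler form $kT\,D(q\|p)$ between the "a priori" distribution $p$ and the "a posteriori" distribution $q$ [13,14] (the distributions below are hypothetical):

```python
import math

def kl_divergence(q, p):
    """D(q || p) = sum_i q_i ln(q_i / p_i), in nats."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def information_cost(q, p, kT):
    """Energetic cost of moving from prior p to posterior q: kT * D(q || p)."""
    return kT * kl_divergence(q, p)

# Uniform prior over four states, sharpened posterior after "deliberation"
p = [0.25, 0.25, 0.25, 0.25]
q = [0.7, 0.1, 0.1, 0.1]
cost = information_cost(q, p, kT=1.0)
assert cost > 0  # any change of distribution carries a non-negative KL cost
```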

"…with $q_i$ as final distribution" (Cover [6]). The change of entropy of the isothermal expansion hence becomes

The temperature change from $T_2$ to $T_1$ that occurs during the adiabatic phase involves a change of the dimension of the ensemble on which the partition function is evaluated (in fact, the upper limit of the partition function turns out to be a function of the temperature).

Assuming a constant $c_v$, we re-obtain Equation (20). For the small ensembles typical of information theory, however, the second term after the equality in Equation (32) cannot be neglected. In particular, since the upper limit of the sum is the dimension $A$ (previously defined), if the maximum entropy ${\lambda}_{max}$ increases, $dN$ increases. This means that less work is available for this phase of the cycle. Hence, the adiabatic phase realizes a sort of code translation (re-arranging the dimension of the words as a function of the new alphabet) that introduces a "cost" into the energy conversion (from internal energy to work). This result seems reasonable, because any re-coding process will have a "cost"; its value, however, depends on the details of the process itself and must be left to future investigation.
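The idea that re-coding to a new alphabet re-arranges word lengths can be illustrated with Shannon's noiseless-coding lower bound, $L \ge H(p)/\log A$ symbols for an alphabet of size $A$ (an illustrative sketch with a hypothetical source; the energetic cost of Equation (32) is not computed here):

```python
import math

def entropy_nats(p):
    """Shannon entropy of a distribution, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def min_symbols_per_source_word(p, alphabet_size):
    """Shannon lower bound on expected code length, in symbols of an
    alphabet of the given size: H(p) / ln(alphabet_size)."""
    return entropy_nats(p) / math.log(alphabet_size)

# The same source re-coded from a binary to a ternary alphabet
source = [0.5, 0.25, 0.125, 0.125]
L2 = min_symbols_per_source_word(source, alphabet_size=2)  # bits per source word
L3 = min_symbols_per_source_word(source, alphabet_size=3)
assert L3 < L2  # a larger alphabet packs the same entropy into fewer symbols
```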

## 3. Comments and Conclusions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

1. Martinelli, M. Photons, Bits and Entropy: From Planck to Shannon at the Roots of the Information Age. *Entropy* **2017**, *19*, 341.
2. Shannon, C.E. A Mathematical Theory of Communication. *Bell Syst. Tech. J.* **1948**, *27*, 379–423.
3. Jaynes, E.T. Information Theory and Statistical Mechanics. *Phys. Rev.* **1957**, *106*, 620.
4. Jaynes, E.T. Note on Unique Decipherability. *IRE Trans. Inf. Theory* **1959**, *5*, 98–102.
5. Callen, H.B. *Thermodynamics and an Introduction to Thermostatistics*, 2nd ed.; Wiley: Hoboken, NJ, USA, 1985.
6. Cover, T.M.; Thomas, J.A. *Elements of Information Theory*, 2nd ed.; Wiley: Hoboken, NJ, USA, 2006.
7. Schrödinger, E. *Statistical Thermodynamics*, 1st ed.; Cambridge University Press: Cambridge, UK; Dover: Mineola, NY, USA, 1948.
8. Merhav, N. Physics of the Shannon Limits. *IEEE Trans. Inf. Theory* **2010**, *56*, 4274–4285.
9. Cardoso Dias, P.M. William Thomson and the Heritage of Caloric. *Ann. Sci.* **1996**, *53*, 511–520.
10. Reiss, H. Thermodynamic-like Transformations in Information Theory. *J. Stat. Phys.* **1969**, *1*, 107–131.
11. Planck, M. *Treatise on Thermodynamics*; Dover Publications: Mineola, NY, USA, 1905.
12. Tribus, M. Information Theory as the Basis for Thermostatics and Thermodynamics. *J. Appl. Mech.* **1961**, *28*, 1–8.
13. Kullback, S.; Leibler, R.A. On Information and Sufficiency. *Ann. Math. Stat.* **1951**, *22*, 79–86.
14. Ortega, P.A.; Braun, D.A. Thermodynamics as a Theory of Decision-Making with Information-Processing Costs. *Proc. R. Soc. A* **2013**, *469*, 20120683.
15. Sagawa, T.; Ueda, M. Generalized Jarzynski Equality under Nonequilibrium Feedback Control. *Phys. Rev. Lett.* **2010**, *104*, 090602.
16. Toyabe, S.; Sagawa, T.; Ueda, M.; Muneyuki, E.; Sano, M. Experimental Demonstration of Information-to-Energy Conversion and Validation of the Generalized Jarzynski Equality. *Nat. Phys.* **2010**, *6*, 988–992.
17. Parrondo, J.M.R.; Horowitz, J.M.; Sagawa, T. Thermodynamics of Information. *Nat. Phys.* **2015**, *11*, 131–139.
18. Reiss, H. *Methods of Thermodynamics*; Dover Publications: Mineola, NY, USA, 1965.
19. Csiszár, I. Two Remarks to Noiseless Coding. *Inf. Control* **1967**, *11*, 317.

**Figure 1.** The Carnot cycle described by two variables, temperature and volume (elaborated from Reference [10]).

**Figure 3.** The Carnot cycle in the presence of an intervention on the partition function. In the isothermal phase, a Kullback–Leibler divergence appears; in the adiabatic phase, an alphabet change occurs.

© 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Martinelli, M.
Entropy, Carnot Cycle, and Information Theory. *Entropy* **2019**, *21*, 3.
https://doi.org/10.3390/e21010003
