Entropy, Carnot Cycle, and Information Theory
Abstract
The fundamental intuition that Carnot had in analyzing the operation of steam engines is that something remains constant during the reversible thermodynamic cycle. This invariant quantity was later named "entropy" by Clausius. Jaynes proposed a unified view of thermodynamics and information theory based on statistical thermodynamics. This unified view allows us to analyze the Carnot cycle and to study what happens when the entropy between the beginning and end of the isothermal expansion of the cycle is considered. It is shown that, in connection with a non-zero Kullback–Leibler distance, less free energy is available from the cycle. Moreover, the analysis of the adiabatic part of the cycle shows that the internal conversion between energy and work is perturbed by the cost introduced by the code conversion. In summary, information-theoretical tools could help to better understand some details of the cycle and the origin of possible asymmetries.
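As an informal illustration of the abstract's central quantity, the sketch below computes the Kullback–Leibler distance between two discrete distributions; a non-zero value corresponds, up to a factor of k_B·T, to free energy that the cycle cannot deliver. The function name and the example distributions are illustrative assumptions, not taken from the paper.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p || q) in nats for discrete distributions.
    Terms with p_i = 0 contribute nothing, by the usual convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: an "actual" state distribution versus a uniform reference.
p = [0.5, 0.3, 0.2]
q = [1 / 3, 1 / 3, 1 / 3]

d = kl_divergence(p, q)
# d > 0 whenever p differs from q; the associated free-energy penalty
# scales as k_B * T * D(p || q).
```

When p and q coincide the distance vanishes and no free energy is lost, which matches the reversible (ideal) limit of the cycle.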
Share & Cite This Article
MDPI and ACS Style: Martinelli, M. Entropy, Carnot Cycle, and Information Theory. Entropy 2019, 21, 3.
AMA Style: Martinelli M. Entropy, Carnot Cycle, and Information Theory. Entropy. 2019; 21(1):3.
Chicago/Turabian Style: Martinelli, Mario. 2019. "Entropy, Carnot Cycle, and Information Theory." Entropy 21, no. 1: 3.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.