Open Access Article

Entropy, Carnot Cycle, and Information Theory

Dipartimento di Elettronica Informazione e Bioingegneria, Politecnico di Milano, 20133 Milano, Italy
Entropy 2019, 21(1), 3;
Received: 14 November 2018 / Revised: 3 December 2018 / Accepted: 18 December 2018 / Published: 20 December 2018
(This article belongs to the Special Issue Applications of Statistical Thermodynamics)


The fundamental intuition Carnot had in analyzing the operation of steam engines is that something remains constant during the reversible thermodynamic cycle. This invariant quantity was later named “entropy” by Clausius. Jaynes proposed a unified view of thermodynamics and information theory based on statistical thermodynamics. This unified view makes it possible to analyze the Carnot cycle and to study what happens when the entropy changes between the beginning and the end of the isothermal expansion. It is shown that a non-zero Kullback–Leibler divergence reduces the free energy available from the cycle. Moreover, analysis of the adiabatic part of the cycle shows that the internal conversion between energy and work is perturbed by the cost introduced by the code conversion. In summary, information-theoretic tools can help to clarify some details of the cycle and the origin of possible asymmetries.
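The link the abstract draws between a non-zero Kullback–Leibler divergence and reduced available free energy can be illustrated with a small sketch. The relation used below, ΔF = k_B · T · D(q‖p), is the standard statistical-thermodynamics bound on the free energy made unavailable when the actual state distribution q differs from the equilibrium distribution p; it is an assumption for illustration, not the paper's exact derivation, and the two-state distributions are invented.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def kl_divergence(q, p):
    """Kullback-Leibler divergence D(q||p) in nats for discrete distributions."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def free_energy_penalty(q_actual, p_equilibrium, temperature):
    """Free energy made unavailable (in joules) when the working substance is
    described by q_actual instead of the equilibrium distribution p_equilibrium:
    Delta F = k_B * T * D(q||p)."""
    return K_B * temperature * kl_divergence(q_actual, p_equilibrium)

# Hypothetical two-state example at T = 300 K
p_eq = [0.5, 0.5]      # equilibrium distribution
q = [0.6, 0.4]         # slightly perturbed distribution after expansion
print(kl_divergence(q, p_eq))             # ~0.0201 nats
print(free_energy_penalty(q, p_eq, 300))  # ~8.3e-23 J lost per cycle
```

When q equals p the divergence vanishes and the full Carnot free energy is recovered; any mismatch gives a strictly positive penalty, matching the asymmetry the abstract describes.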
Keywords: entropy; Carnot cycle; information theory; Kullback–Leibler divergence

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

Martinelli, M. Entropy, Carnot Cycle, and Information Theory. Entropy 2019, 21, 3.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.