Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem
Abstract
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept. Entropy, on the other hand, is defined on a very special set of distributions. Next we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not an entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
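For concreteness, a minimal sketch of the two quantities the abstract contrasts; the notation (p for a discrete distribution, f for Boltzmann's velocity distribution) is ours, and the choice of logarithm base only fixes the units:

\mathrm{SMI}(p) = -\sum_{i=1}^{n} p_i \log_2 p_i, \qquad H(t) = \int f(\mathbf{v}, t)\,\ln f(\mathbf{v}, t)\, d^3v.

Up to sign (and the passage to a continuous distribution), H has the form of the SMI of the velocity distribution alone; it involves neither the locations of the particles nor the two corrections mentioned above, which is the sense in which it is an SMI but not an entropy.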
Cite This Article
MDPI and ACS Style:
Ben-Naim, A. Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem. Entropy 2017, 19, 48.

AMA Style:
Ben-Naim A. Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem. Entropy. 2017; 19(2):48.

Chicago/Turabian Style:
Ben-Naim, Arieh. 2017. "Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem." Entropy 19, no. 2: 48.