
Open Access Article
Entropy 2017, 19(2), 48; doi:10.3390/e19020048

Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem

Department of Physical Chemistry, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
Academic Editors: Geert Verdoolaege and Kevin H. Knuth
Received: 23 November 2016 / Revised: 17 January 2017 / Accepted: 21 January 2017 / Published: 24 January 2017
(This article belongs to the Special Issue Selected Papers from MaxEnt 2016)

Abstract

We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next, we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. The entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not entropy. Therefore, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
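To make the abstract’s distinction concrete: the SMI is defined for any discrete probability distribution as H(p) = −Σ pᵢ log pᵢ. A minimal sketch (the function name `smi` is illustrative, not from the paper):

```python
import math

def smi(p, base=2):
    """Shannon's Measure of Information (SMI) of a discrete
    probability distribution p: H(p) = -sum(p_i * log(p_i)).
    Defined on ANY distribution -- unlike thermodynamic entropy,
    which (per the paper) applies only to a special set of
    equilibrium distributions."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# The uniform distribution maximizes SMI for a fixed number of outcomes:
print(smi([0.25, 0.25, 0.25, 0.25]))        # 2.0 bits
print(smi([0.5, 0.25, 0.125, 0.125]))       # 1.75 bits
```

With base 2 the result is in bits; the thermodynamic interpretation discussed in the paper uses the natural logarithm and a multiplicative constant (Boltzmann’s k).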
Keywords: entropy; Shannon’s measure of information; Second Law of Thermodynamics; H-theorem
Figures: Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Ben-Naim, A. Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem. Entropy 2017, 19, 48.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.