Entropy 2005, 7(1), 68-96; doi:10.3390/e7010068
Article

The meanings of entropy

Received: 19 November 2004 / Accepted: 14 February 2005 / Published: 14 February 2005
Abstract: Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has been successively assimilated to different concepts, such as disorder and information. In this paper we revisit these conceptions and establish the following three results: entropy measures lack of information, but it also measures information, and these two conceptions are complementary; entropy measures freedom, which allows a coherent interpretation of entropy formulas and of experimental facts; associating entropy with disorder amounts to defining order as absence of freedom. Disorder, or agitation, is shown to be more appropriately linked with temperature.
Keywords: entropy; freedom; information; disorder
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
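
As a minimal numerical sketch of the abstract's first result (not taken from the paper itself), Shannon's formula can be read both ways: it is the observer's missing information before an outcome is drawn, and the average information delivered when the outcome is revealed. The function names and the example distribution below are illustrative only.

```python
import math

# Shannon entropy in bits: the observer's missing information
# about the outcome before it is drawn.
def shannon_entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Average surprisal in bits: the information gained, on average,
# when the outcome is revealed. Term by term it is the same sum
# as above, which is the complementarity the abstract points to.
def expected_surprisal(p):
    return sum(pi * -math.log2(pi) for pi in p if pi > 0)

dist = [0.5, 0.25, 0.25]          # hypothetical three-outcome distribution
print(shannon_entropy(dist))      # 1.5 bits missing before observation
print(expected_surprisal(dist))   # 1.5 bits gained upon observation
```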

MDPI and ACS Style

Brissaud, J.-B. The meanings of entropy. Entropy 2005, 7, 68-96.

AMA Style

Brissaud J-B. The meanings of entropy. Entropy. 2005; 7(1):68-96.

Chicago/Turabian Style

Brissaud, Jean-Bernard. 2005. "The meanings of entropy." Entropy 7, no. 1: 68-96.

