The meanings of entropy
Abstract

Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has successively been assimilated to different concepts such as disorder and information. In this paper we revisit these conceptions and establish the following three results: entropy measures lack of information, but it also measures information, and these two conceptions are complementary; entropy measures freedom, which allows a coherent interpretation of entropy formulas and of experimental facts; associating entropy with disorder amounts to defining order as the absence of freedom. Disorder, or agitation, is shown to be more appropriately linked with temperature.
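The twin readings of entropy as "lack of information" and as "information" can be illustrated with the Shannon entropy formula, H = -Σ p·log₂(p). The sketch below is not from the paper; the function name and example distributions are illustrative assumptions. A high-entropy distribution leaves us maximally uncertain before observation (lack of information), and observing its outcome therefore yields the most information.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Terms with p == 0 contribute nothing, by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution: maximal uncertainty before observing,
# hence maximal information gained when the outcome is revealed.
uniform = [0.25, 0.25, 0.25, 0.25]

# Peaked distribution: the outcome is nearly certain in advance,
# so observing it yields little information.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(peaked))   # well below 2.0 bits
```

The uniform case can also be read through the "freedom" interpretation the paper advances: the system is least constrained when all outcomes remain equally available.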
Brissaud, J.-B. The meanings of entropy. Entropy 2005, 7, 68-96.