Entropy? Exercices de Style
Abstract
"A symbol is a reality that is greater than itself."
— Pavel Florensky [1]
1. Introduction
2. The Dawn of the Concept: Entropy in Thermodynamics Style
2.1. Clausius’ Coinage: Giving Names to Things
(i) Entropy is a function of the state of a physical system at equilibrium with all its parts, whose variation is given by the ratio of the heat traversing the boundary of the system and the absolute temperature at which the infinitesimal energy transfer is carried out.

Certainly informative, such a definition seemingly just renames the ratio between two primitive and measurable thermodynamic quantities (heat and temperature). It is only through some subtle considerations that one can reach a second definition, which reveals the implicit consequences of the former and goes as:

(ii) Entropy [production] is the measure of the irreversibility of spontaneous physical transformations in an isolated system.

Since this alternative statement betrays the motivations behind the introduction of the concept of entropy and its core peculiarity, it is illustrative to historically retrace the pathway connecting the two.
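For reference, the two definitions admit a compact modern rendering (standard textbook notation, added here for convenience rather than quoted from Clausius). For a reversible exchange of an infinitesimal heat $\delta Q$ at absolute temperature $T$, definition (i) reads

$$\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T},$$

while definition (ii) amounts to the statement that, for any spontaneous transformation of an isolated system,

$$\Delta S \geq 0,$$

with the equality holding only in the reversible limit.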
If for the entire universe we conceive the same magnitude to be determined [..] which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanics of heat.
The energy of the universe is constant.
The entropy of the universe tends to a maximum.
2.2. Entropy in the Cosmogonic Context
3. Entropy in Statistics Style
3.1. Boltzmann’s Versions
Any individual uniform distribution, which might arise after a certain time from some particular initial state, is just as improbable as an individual non-uniform distribution; just as in the game of Lotto, any individual set of five numbers is as improbable as the set 1,2,3,4,5. It is only because there are many more uniform distributions than non-uniform ones that the distribution of states will become uniform in the course of time [4].
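Boltzmann’s lotto argument can be made quantitative with a few lines of combinatorics. The sketch below (an illustration added here, with an arbitrary choice of N = 20 particles, not an excerpt from Boltzmann) counts the microstates compatible with each "distribution of states" for particles in a box divided into two halves:

```python
from math import comb

N = 20  # illustrative number of gas particles in a two-halves box

# Each individual microstate (a specific left/right assignment of all N
# particles) has the same probability 2**-N -- just as any single set of
# Lotto numbers is as improbable as 1,2,3,4,5.
for k in (0, 5, 10):  # k = number of particles in the left half
    W = comb(N, k)    # multiplicity of the macrostate "k on the left"
    print(f"k = {k:2d}: W = {W:6d} microstates, P = {W / 2**N:.6f}")

# The uniform macrostate (k = N/2) dominates only because it comprises the
# most microstates -- the counting later condensed into S = k log W.
```

Already for $N = 20$ the uniform macrostate is $\binom{20}{10} = 184{,}756$ times more probable than the fully segregated one, and the disproportion grows exponentially with $N$.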
3.2. Gibbs’ Version
4. Entropy in Informational Style
4.1. Resurrecting Maxwell’s Demon
- The result of the detection has to be physically instantiated in the position of the lever: the measurement process can occur only together with some kind of temporary memorisation.
- The entropy gained, as a consequence of the detection, in the isothermal expansion from $V/2$ to $V$, i.e., $k \ln 2$, has to equal the entropy cost of storing the binary information, $k \ln 2$ (see the bookkeeping sketched below).
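As a reminder of the algebra behind these figures (the standard single-molecule Szilard analysis, added here for convenience): the molecule, known to be in one half of the container, pushes the partition and expands isothermally at temperature $T$ from $V/2$ to $V$, delivering the work

$$W = \int_{V/2}^{V} \frac{kT}{V'}\,\mathrm{d}V' = kT \ln 2, \qquad \Delta S_{\text{gas}} = k \ln\frac{V}{V/2} = k \ln 2,$$

so that each cycle would lower the entropy of the heat bath by $k \ln 2$ unless the one-bit record of the measurement carried an equal and opposite entropy cost.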
4.2. Literal Interpretation of Szilard’s Entropy Cost of Information
[By letting the gas expand in the Szilard experiment,] we have exchanged our knowledge for the entropy decrease of $k \ln 2$. That is: in volume $V$ the entropy is the same as that in volume $V/2$ under the assumption one knows in which half of the container the molecule is located.
4.3. Entropy According to Shannon
- $H$ takes its maximum value when the states $i$ are equiprobable (a maximum that grows monotonically with the number of states) and, conversely, is minimised by a probability distribution of the type $p_j = 1$, $p_{i \neq j} = 0$, where a single choice is available.
- Given two events $x$ and $y$ with some degree of mutual correlation, the function $H$ of the joint event satisfies $H(x, y) \leq H(x) + H(y)$, where the equality holds only if the events are uncorrelated.
- Any averaging operation on the $p_i$, i.e., an operation whose result equalises the probabilities in any degree, corresponds to an increase of $H$ (the first and third properties are illustrated numerically after this list).
- The probability distribution which maximises the functional $H$, on the only additional condition that a standard deviation for $x$ is fixed, is a Gaussian distribution.
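A few of these properties can be checked directly. The snippet below (an illustration added here, not part of Shannon’s paper) evaluates $H = -\sum_i p_i \log_2 p_i$ for a handful of four-state distributions:

```python
import numpy as np

def H(p):
    """Shannon entropy -sum_i p_i log2 p_i, with 0 log 0 taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability states (0 log 0 -> 0)
    return -np.sum(p * np.log2(p))

uniform = np.array([0.25, 0.25, 0.25, 0.25])
skewed  = np.array([0.70, 0.15, 0.10, 0.05])
single  = np.array([1.00, 0.00, 0.00, 0.00])

print(H(uniform))  # 2.0 bits: the maximum, log2(4), for four states
print(H(skewed))   # ~1.32 bits
print(H(single))   # -0.0, i.e., 0 bits: a single available choice

# Averaging the skewed distribution toward uniformity increases H:
mixed = 0.5 * skewed + 0.5 * uniform
assert H(mixed) > H(skewed)  # ~1.83 bits > ~1.32 bits
```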
[When searching for a name for H] I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. [...] von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more importantly, nobody knows what entropy really is, so in a debate you will always have the advantage.”
4.4. Information-Theoretic Interpretation of Physical Entropy
The mere fact that the same mathematical expression occurs both in statistical mechanics and in information theory does not in itself establish a connection between the two fields. This can be done by finding new viewpoints from which [the two] entropies appear as the same concept.
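The bridge alluded to here is the identification, up to a multiplicative constant, of the Gibbs and Shannon expressions (a standard observation, stated here for completeness):

$$S_{\text{Gibbs}} = -k \sum_i p_i \ln p_i = (k \ln 2)\, H, \qquad H = -\sum_i p_i \log_2 p_i,$$

so that one bit of Shannon uncertainty about the microstate corresponds to $k \ln 2$ of thermodynamic entropy.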
4.5. The Concept of Relative Entropy
5. Other Fruits of the Entropy Tree
6. Summary and Conclusions
Funding
Acknowledgments
Conflicts of Interest
References
1. Florensky, P. Il Valore Magico Della Parola; Edizioni Medusa: Milan, Italy, 2003; originally published posthumously as U Vodarasdelov Mysli (At the Watershed of Thought); Pravda: Moscow, Russia, 1990.
2. Clausius, R.; Hirst, T. The Mechanical Theory of Heat: With Its Applications to the Steam-Engine and to the Physical Properties of Bodies; J. Van Voorst: London, UK, 1867.
3. Kragh, H.; Hamm, E.; Brain, R. Entropic Creation: Religious Contexts of Thermodynamics and Cosmology; Science, Technology and Culture, 1700–1945; Ashgate: Farnham, UK, 2013.
4. Brush, S. The Kinetic Theory of Gases: An Anthology of Classic Papers with Historical Commentary; History of Modern Physical Sciences; World Scientific: Singapore, 2003.
5. Maxwell, J.C. V. Illustrations of the dynamical theory of gases. Part I. On the motions and collisions of perfectly elastic spheres. Lond. Edinb. Dublin Philos. Mag. J. Sci. 1860, 19, 19–32.
6. Maxwell, J.C. II. Illustrations of the dynamical theory of gases. Lond. Edinb. Dublin Philos. Mag. J. Sci. 1860, 20, 21–37.
7. Maxwell, J.; Harman, P. The Scientific Letters and Papers of James Clerk Maxwell, Volume 3: 1874–1879; Cambridge University Press: Cambridge, UK, 2002.
8. Maxwell, J. Theory of Heat; Text-Books of Science; Longmans: London, UK, 1871.
9. Klein, M.J. The development of Boltzmann’s statistical ideas. In The Boltzmann Equation; Springer: Berlin/Heidelberg, Germany, 1973; pp. 53–106.
10. Ter Haar, D. Foundations of statistical mechanics. Rev. Mod. Phys. 1955, 27, 289.
11. Brown, H.R.; Myrvold, W.; Uffink, J. Boltzmann’s H-theorem, its discontents, and the birth of statistical mechanics. Stud. Hist. Philos. Sci. Part B Stud. Hist. Philos. Mod. Phys. 2009, 40, 174–191.
12. Uffink, J. Boltzmann’s Work in Statistical Physics. In The Stanford Encyclopedia of Philosophy, Spring 2017 ed.; Zalta, E.N., Ed.; Metaphysics Research Lab, Stanford University: Stanford, CA, USA, 2017.
13. Boltzmann, L. Wissenschaftliche Abhandlungen, Volume II; AMS Chelsea Publishing: Providence, RI, USA, 2001.
14. Badino, M. The odd couple: Boltzmann, Planck and the application of statistics to physics (1900–1913). Ann. Phys. 2009, 18, 81–101.
15. Klein, M.J. Max Planck and the beginnings of the quantum theory. Arch. Hist. Exact Sci. 1961, 1, 459–479.
16. Kuhn, T. Black-Body Theory and the Quantum Discontinuity, 1894–1912; Oxford University Press: Oxford, UK, 1978.
17. Uffink, J. Bluff your way in the second law of thermodynamics. Stud. Hist. Philos. Sci. Part B Stud. Hist. Philos. Mod. Phys. 2001, 32, 305–394.
18. Boltzmann, L. On certain questions of the theory of gases. Nature 1895, 51, 413.
19. Jaynes, E.T. The Gibbs paradox. In Maximum Entropy and Bayesian Methods; Springer: Berlin/Heidelberg, Germany, 1992; pp. 1–21.
20. Gibbs, J. Elementary Principles in Statistical Mechanics; Dover Books on Physics; Dover Publications: Mineola, NY, USA, 2014.
21. Leff, H.; Rex, A. Maxwell’s Demon 2: Entropy, Classical and Quantum Information, Computing; CRC Press: Boca Raton, FL, USA, 2002.
22. Leff, H.; Rex, A. Maxwell’s Demon: Entropy, Information, Computing; Princeton Series in Physics; Princeton University Press: Princeton, NJ, USA, 2014.
23. Szilard, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Z. Phys. 1929, 53, 840–856.
24. Szilard, L. On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. Behav. Sci. 1964, 9, 301–310.
25. Toyabe, S.; Sagawa, T.; Ueda, M.; Muneyuki, E.; Sano, M. Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality. Nat. Phys. 2010, 6, 988.
26. Koski, J.V.; Maisi, V.F.; Pekola, J.P.; Averin, D.V. Experimental realization of a Szilard engine with a single electron. Proc. Natl. Acad. Sci. USA 2014, 111, 13786–13789.
27. Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 2012, 483, 187–189.
28. Gaudenzi, R.; Burzurí, E.; Maegawa, S.; van der Zant, H.; Luis, F. Quantum Landauer erasure with a molecular nanomagnet. Nat. Phys. 2018, 14, 565.
29. Brillouin, L. Maxwell’s demon cannot operate: Information and entropy. I. J. Appl. Phys. 1951, 22, 334–337.
30. Brillouin, L. Physical entropy and information. II. J. Appl. Phys. 1951, 22, 338–343.
31. Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
32. Bennett, C.H. Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon. Stud. Hist. Philos. Sci. Part B Stud. Hist. Philos. Mod. Phys. 2003, 34, 501–510.
33. Köhler, E. Why von Neumann rejected Carnap’s dualism of information concepts. In John von Neumann and the Foundations of Quantum Physics; Springer: Berlin/Heidelberg, Germany, 2001; pp. 97–134.
34. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
35. Tribus, M.; McIrvine, E.C. Energy and information. Sci. Am. 1971, 225, 179–188.
36. Denbigh, K. How subjective is entropy? In Maxwell’s Demon: Entropy, Information, Computing; Princeton Legacy Library; Princeton University Press: Princeton, NJ, USA, 1990; pp. 109–115.
37. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620.
38. Kapur, J. Maximum-Entropy Models in Science and Engineering; Wiley: Hoboken, NJ, USA, 1989.
39. Caticha, A. Consistency, amplitudes, and probabilities in quantum theory. Phys. Rev. A 1998, 57.
40. Jaynes, E.T. Gibbs vs. Boltzmann entropies. Am. J. Phys. 1965, 33, 391–398.
41. Toffoli, T. Entropy? Honest! Entropy 2016, 18, 247.
42. Kullback, S.; Leibler, R.A. On information and sufficiency. Ann. Math. Stat. 1951, 22, 79–86.
43. Kullback, S. Information Theory and Statistics; Wiley Publication in Mathematical Statistics; Wiley: Hoboken, NJ, USA, 1959.
44. Fisher, R.A. Theory of statistical estimation. Math. Proc. Camb. Philos. Soc. 1925, 22, 700–725.
45. Shore, J.; Johnson, R. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 1980, 26, 26–37.
46. Kapur, J.N.; Kesavan, H.K. Entropy optimization principles and their applications. In Entropy and Energy Dissipation in Water Resources; Springer: Berlin/Heidelberg, Germany, 1992; pp. 3–20.
47. Palm, G. Evidence, information, and surprise. Biol. Cybern. 1981, 42, 57–68.
48. Bekenstein, J.D. Black holes and entropy. Phys. Rev. D 1973, 7, 2333.
49. Hawking, S.W. Gravitational radiation from colliding black holes. Phys. Rev. Lett. 1971, 26, 1344.
50. Theil, H. The information approach to demand analysis. Econometrica 1965, 33, 67–87.
51. Theil, H.; Raj, B.; Koerts, J. Henri Theil’s Contributions to Economics and Econometrics, Vol. I: Econometric Theory and Methodology; Advanced Studies in Theoretical and Applied Econometrics; Springer: Berlin/Heidelberg, Germany, 1992.
52. Georgescu-Roegen, N. The Entropy Law and the Economic Process; Harvard University Press: Cambridge, MA, USA, 1971.
53. Queneau, R. Exercices de Style; Collection Folio; Gallimard: Paris, France, 1982.
© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).