Algorithmic Entropy and Landauer’s Principle Link Microscopic System Behaviour to the Thermodynamic Entropy
Abstract
1. Introduction
- The configuration of a natural system can be represented by a binary string in its state space, i.e., a string that specifies the instantaneous position, momentum, and stored energy states of all the species in the system. As natural laws can be understood as computations on a real-world universal Turing machine (UTM), the algorithmic entropy of a natural-world configuration can be defined as the fewest computational bits that generate this string. As is discussed later, because a laboratory UTM can simulate the real-world UTM, the number of bits in the appropriately coded laboratory programme that halts after generating the configuration specifies its algorithmic entropy to within a machine-dependent constant. As only entropy differences are of significance, the constant is irrelevant. The laboratory UTM that measures algorithmic entropy differences in the real world is, in effect, a measuring device that captures the algorithmic entropy of the natural system.
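The idea above can be illustrated with a minimal sketch. True algorithmic entropy (Kolmogorov complexity) is uncomputable, so the sketch below substitutes a real compressor for the "laboratory UTM": the compressed length is only an upper bound, differing from the true value by a machine-dependent constant that, as the text notes, cancels when only entropy *differences* are compared. The function name `entropy_bits` and the example strings are illustrative choices, not anything from the paper.

```python
import random
import zlib

def entropy_bits(state: bytes) -> int:
    """Upper bound on the algorithmic entropy (in bits) of a configuration,
    using a compressor as a stand-in for the shortest halting programme.
    The compressor's fixed overhead plays the role of the machine-dependent
    constant, which cancels in entropy differences."""
    return 8 * len(zlib.compress(state, 9))

# An ordered configuration admits a much shorter description than a
# random-looking one, so its bound on the algorithmic entropy is far lower.
ordered = b"01" * 4096
random.seed(0)
disordered = bytes(random.getrandbits(8) for _ in range(8192))

assert entropy_bits(ordered) < entropy_bits(disordered)
```

The difference `entropy_bits(disordered) - entropy_bits(ordered)` is meaningful even though neither absolute value is, mirroring the claim that only entropy differences matter.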
- The algorithmic entropy has always been recognised as a measure closely related to the Shannon entropy and the entropy of statistical mechanics (see [24,25]). However, once the specific requirements associated with real-world reversible computations are properly accounted for, it is shown here that, allowing for units, the algorithmic entropy of an equilibrium microstate, or a typical microstate, in an isolated system is identical to the thermodynamic entropy of the macrostate containing the set of allowable microstates. As shown in Section 3, because the algorithmic entropy of a typical microstate is the same as the thermodynamic entropy, the algorithmic entropy can be used to quantify thermodynamic processes.
- The thermodynamic entropy of a thermodynamic macrostate in an isolated system is primarily determined by the most probable set of states. In contrast, the algorithmic entropy is conceptually different, as it is the number of bits needed to specify an instantaneous microstate of the system. However, this is no different from recognising that the energy of a thermodynamic macrostate at an instant is actually the energy stored in the instantaneous microstate. Furthermore, in an isolated system, all microstates have the same algorithmic entropy, as they are connected by reversible processes. In that case, an instantaneous configuration that would conventionally be termed a "fluctuation from equilibrium" in an isolated system is here seen as a fluctuation from a typical, or equilibrium, microstate belonging to the most probable set of states.
- It is shown that, allowing for units, the number of bits specifying a microstate in the most probable set of states corresponds to the thermodynamic entropy of the isolated macrostate (see Section 3). For this reason, because the algorithmic entropy corresponds to the thermodynamic entropy when most of the bits specify momentum states, the algorithmic entropy can be termed the “realised entropy”. On the other hand, the bits specifying stored energy states are not usually seen as contributing to the thermodynamic entropy of the macrostate. As it is argued that bits, like energy, are conserved, the bits that specify a fluctuation from equilibrium mainly specify the stored energy or potential energy states. These bits can be termed “potential thermodynamic entropy” and only become realised when the energy associated with these bits diffuses through the system as heat, in which case bits specifying stored energy states become bits specifying momentum states.
- The distance from equilibrium of a fluctuation from a typical equilibrium state is the number of bits that shift from the stored energy states to the momentum states as the system trends to the most probable set of states. However, such a fluctuation from the most probable set of states in an isolated system is distinct from a system where the initial state is not just a fluctuation, but instead is a far-from-equilibrium configuration. In the latter case, when such a system trends to the most probable set of equilibrium states, the Boltzmann entropy increases as more states become available. In this different case, the distance from equilibrium is the number of bits that must enter the system for it to settle in an equilibrium configuration.
- The thermodynamic cost of maintaining a system distant from equilibrium can be understood in terms of compensating for the bit flows that attempt to drive the system to the most probable set of states. In the natural world, bits are computational instructions, embodied in the physical laws that specify the interactions between different real world states. As is discussed later in Section 6, Landauer’s principle [26] can be used to establish the thermodynamic cost of transferring bits out of a system and within a system [22,23,27] for a real-world reversible process. For a system to be stable in a homeostatic far-from-equilibrium set of configurations, both the energy flows into and out of the system, as well as the bit flows in and out, must balance. The paper explores the conceptual framework behind this principle to provide confidence in the manner in which the principle should be applied.
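The thermodynamic cost mentioned above is fixed by Landauer's principle [26]: transferring a bit out of a system at temperature $T$ dissipates at least $k_B T \ln 2$ of heat. A minimal sketch of that bound follows; the function name is this sketch's own, and the 300 K figure is just an illustrative room-temperature value.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def landauer_cost(n_bits: float, temperature: float) -> float:
    """Minimum heat (J) dissipated when n_bits are transferred out of a
    system at the given temperature (K), per Landauer's principle:
    E >= n_bits * k_B * T * ln 2."""
    return n_bits * K_B * temperature * math.log(2)

# Erasing a single bit at room temperature costs about 3 zeptojoules.
print(landauer_cost(1, 300.0))  # ≈ 2.87e-21 J
```

For a homeostatic far-from-equilibrium system, this bound prices the continual bit outflow that the compensating inflows of energy and bits must pay for.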
- By focusing at the level of the microstate, the algorithmic entropy provides a deterministic understanding, in terms of the computational processes involved, of how the thermodynamic entropy increases as a system trends to equilibrium. When ordering occurs, such as when magnetic spins align in a magnetic system, bits previously specifying random magnetic states become bits specifying momentum states, raising the temperature. If the phase change in the spin system is to be locked in, bits must exit via the momentum states to lower the temperature. Bits can be tracked entering and leaving the system, but in an isolated system bits are conserved.
2. Formal Outline of the Algorithmic Entropy
2.1. Specifying the Shortest Algorithm
2.2. The Provisional Entropy
3. The Algorithmic Entropy of a Real-World Thermodynamic Macrostate
4. Perspectives of Real-World Computations
- Instructions underpinning real-world computations are self-delimiting and are ongoing, only stopping when an external event impacts on the system. Ongoing computations in the natural world are parallel computations, where what might be deemed a subroutine by the observer continuously feeds its varying outputs into what would be termed a register. This register is regularly interrogated by the other routines until an appropriate input becomes available. As a consequence, real-world subroutines do not need to halt. It is the observer that requires subroutines to halt, so that the number of bits that characterise a microstate at an instant can be tracked. Section 4.2 argues that, for a reversible system, provided the bits are already in the system and the net flow of bits in and out is tracked, the algorithmic entropy obtained by tracking bits to the halt instant is the same as that obtained from the halting algorithm. This understanding provides insights into the computational requirements of maintaining a system far from equilibrium.
- A real-world computation enacts physical laws captured in the behaviour of atoms and molecules. These species act as reversible gates from a computational perspective, and the instructions embodied in the gates determine the computational trajectory. The behaviour of the gates is simulated on the laboratory reference computer by a programme. The programme is usually considered to be distinct from the string representing the bit settings of actual states. A difficulty arises, as reversibility, a critical characteristic of a real-world computation, is not usually built into a laboratory programme. However, as is discussed later, Bennett [39] points out that reversibility can be simulated on a laboratory computer if the computational history is kept to regenerate the initial state. In this case, the total number of bits in the system, including programme and history bits, is conserved, as discussed in detail in Section 4.2. This allows Landauer's principle to identify the conditions under which the thermodynamic entropy aligns with the algorithmic entropy.
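Bennett's history-keeping device can be shown with a toy sketch. The map below discards one bit per step and so looks irreversible, but saving each discarded bit as "history" makes the computation logically reversible: replaying the history backwards regenerates the initial state exactly, and the combined bit count of state plus history is conserved. The function names and the toy map are this sketch's own, standing in for the far richer real-world dynamics the text describes.

```python
def forward(state: int, steps: int):
    """Run a toy lossy map, saving the history bits (Bennett [39]) that
    would otherwise be erased, so the computation stays reversible."""
    history = []
    for _ in range(steps):
        history.append(state % 2)  # the bit the lossy step would destroy
        state //= 2                # irreversible on its own...
    return state, history          # ...but reversible given the history

def backward(state: int, history: list) -> int:
    """Consume the saved history in reverse to regenerate the initial state."""
    for bit in reversed(history):
        state = 2 * state + bit
    return state

final, hist = forward(0b101101, 4)
assert backward(final, hist) == 0b101101  # initial state fully recovered
```

Without the history, information (and hence algorithmic entropy bookkeeping) would be lost at every step; with it, bits merely move between "state" and "history", mirroring the conservation argument of Section 4.2.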
- As there is only one reversible forward path to a particular microstate in the natural world, provided the full details of the microstate are specified, there can be no shorter algorithmic description than that provided by the reversible path. Reversibility also implies that there is maximum mutual information between the initial state and the final state of a computation, as the computational path must pass through the initial state; as a consequence, Equation (1) holds with equality.
4.1. Why Bits Are Conserved
4.2. Net Entropy Flows and the Algorithmic Entropy
5. Application to the Second Law of Thermodynamics
5.1. Non-Equilibrium States and Fluctuations within Equilibrium
5.2. The Trajectory Approach
6. Landauer’s Principle
6.1. Tracking Bit Flows between Stored Energy Degrees of Freedom and Momentum Degrees of Freedom
6.2. The Principle
An Additional Perspective of the Algorithmic Approach
6.3. Landauer’s Principle and Resource Use of Natural Systems
6.4. Interpreting Bit Transfers in a Far-From-Equilibrium System
7. Conclusions
Funding
Conflicts of Interest
Abbreviations
AIT | algorithmic information theory
UTM | universal Turing machine
AMSS | algorithmic minimum sufficient statistic
References
- Martin-Löf, P. The definition of random sequences. Inf. Control 1966, 9, 602–619.
- Gács, P. Exact expressions for some randomness tests. Zeitschr. f. Math. Logik und Grundlagen d. Math. 1980, 26, 385–394.
- Chaitin, G. Information-theoretic computational complexity. IEEE Trans. Inf. Theory 1974, 20, 10–15.
- Vitányi, P.M.B.; Li, M. Minimum description length induction, Bayesianism, and Kolmogorov complexity. IEEE Trans. Inf. Theory 2000, 46, 446–464.
- Li, M.; Chen, X.; Li, X.; Ma, B.; Vitányi, P. The similarity metric. IEEE Trans. Inf. Theory 2004, 50, 3250–3264.
- Ferragina, P.; Giancarlo, R.; Greco, V.; Manzini, G.; Valiente, G. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: Experimental assessment. BMC Bioinform. 2007, 8, 252.
- Hutter, M. On universal prediction and Bayesian confirmation. Theor. Comput. Sci. 2007, 384, 33–48.
- Chaitin, G. On the intelligibility of the universe and the notions of simplicity, complexity and irreducibility. In Grenzen und Grenzüberschreitungen, XIX. Deutscher Kongress für Philosophie, Bonn, September 2002; Hogrebe, W., Bromand, J., Eds.; Akademie Verlag: Berlin, Germany, 2004; pp. 517–534.
- Calude, C.S.; Meyerstein, F.W. Is the universe lawful? Chaos Solitons Fractals 1999, 106, 1075–1084.
- Hutter, M. A complete theory of everything (will be subjective). Algorithms 2010, 3, 329–350.
- Davies, P.C.W. The Fifth Miracle: The Search for the Origin of Life; Penguin Books Ltd.: London, UK, 2003.
- Li, M.; Vitányi, P.M.B. An Introduction to Kolmogorov Complexity and Its Applications, 3rd ed.; Springer: New York, NY, USA, 2008.
- Zenil, H.; Badillo, L.; Hernández-Orozco, S.; Hernández-Quiroz, F. Coding-theorem like behaviour and emergence of the universal distribution from resource-bounded algorithmic probability. Int. J. Parallel Emerg. Distrib. Syst. 2018, 1–20.
- Zenil, H.; Marshall, J.A.R.; Tegnér, J. Approximations of algorithmic and structural complexity validate cognitive-behavioural experimental results. arXiv 2015, arXiv:1509.06338.
- Gauvrit, N.; Zenil, H.; Soler-Toscano, F.; Delahaye, J.P.; Brugger, P. Human behavioral complexity peaks at age 25. PLoS Comput. Biol. 2017, 13, e1005408.
- Zenil, H.; Kiani, N.; Tegnér, J. An algorithmic refinement of maxent induces a thermodynamic-like behaviour in the reprogrammability of generative mechanisms. arXiv 2018, arXiv:1805.07166.
- Zenil, H.; Kiani, N.A.; Marabita, F.; Deng, Y.; Elias, S.; Schmidt, A.; Ball, G.; Tegnér, J. An algorithmic information calculus for causal discovery and reprogramming systems. bioRxiv 2018.
- Zenil, H.; Gershenson, C.; Marshall, J.A.R.; Rosenblueth, D.A. Life as thermodynamic evidence of algorithmic structure in natural environments. Entropy 2012, 14, 810–812.
- Hernández-Orozco, S.; Kiani, N.A.; Zenil, H. Algorithmically probable mutations reproduce aspects of evolution, such as convergence rate, genetic memory and modularity. R. Soc. Open Sci. 2018, 5, 180399.
- Devine, S. An algorithmic information theory challenge to intelligent design. Zygon 2014, 49, 42–65.
- Dembski, W.A. Intelligent Design as a Theory of Information. 2002. Available online: http://arn.org/docs/dembski/wd_idtheory.htm (accessed on 17 October 2018).
- Devine, S.D. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory. Biosystems 2016, 140, 8–22.
- Devine, S. An economy viewed as a far-from-equilibrium system from the perspective of algorithmic information theory. Entropy 2018, 20, 228.
- Bennett, C.H. Thermodynamics of computation—A review. Int. J. Theor. Phys. 1982, 21, 905–940.
- Vereshchagin, N.K.; Vitányi, P.M.B. Kolmogorov's structure functions and model selection. IEEE Trans. Inf. Theory 2004, 50, 3265–3290.
- Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
- Devine, S. The information requirements of complex biological and economic systems with algorithmic information theory. Int. J. Des. Nat. Ecodyn. 2017, 12, 367–376.
- Solomonoff, R.J. A formal theory of inductive inference, Parts 1 and 2. Inf. Control 1964, 7, 1–22.
- Kolmogorov, A.N. Three approaches to the quantitative definition of information. Problems Inform. Transmission 1965, 1, 1–7.
- Chaitin, G. On the length of programs for computing finite binary sequences. J. ACM 1966, 13, 547–569.
- Chaitin, G. A theory of program size formally identical to information theory. J. ACM 1975, 22, 329–340.
- Zvonkin, A.; Levin, L. The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms. Russ. Math. Surv. 1970, 25, 83–124.
- Gács, P. On the symmetry of algorithmic information. Sov. Math. Dokl. 1974, 15, 1477–1480.
- Devine, S.D. The insights of algorithmic entropy. Entropy 2009, 11, 85–110.
- Zurek, W.H. Algorithmic randomness and physical entropy. Phys. Rev. A 1989, 40, 4731–4751.
- Bennett, C.H. Logical depth and physical complexity. In The Universal Turing Machine—A Half-Century Survey; Herken, R., Ed.; Oxford University Press: Oxford, UK, 1988; pp. 227–257.
- Gács, P. The Boltzmann Entropy and Random Tests. 2004. Available online: http://www.cs.bu.edu/faculty/gacs/papers/ent-paper.pdf (accessed on 17 October 2018).
- Jaynes, E.T. Gibbs vs Boltzmann entropies. Am. J. Phys. 1965, 33, 391–398.
- Bennett, C.H. Logical reversibility of computation. IBM J. Res. Dev. 1973, 17, 525–532.
- Zurek, W.H. Thermodynamics of computation, algorithmic complexity and the information metric. Nature 1989, 341, 119–124.
- Schneider, E.D.; Kay, J.J. Life as a manifestation of the second law of thermodynamics. Math. Comput. Model. 1994, 16, 25–48.
- Esposito, M.; Van den Broeck, C. Second law and Landauer principle far from equilibrium. Europhys. Lett. 2011, 95, 40004.
- Parrondo, J.M.R.; Horowitz, J.M.; Sagawa, T. Thermodynamics of information. Nat. Phys. 2015, 11, 131–139.
- Szilard, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Zeitschrift für Physik 1929, 53, 840–856.
- Brillouin, L. Maxwell's demon cannot operate: Information and entropy. I. J. Appl. Phys. 1951, 22, 334–337.
- Lloyd, S. Ultimate physical limits to computation. Nature 2000, 406, 1047–1055.
- Rex, A. Maxwell's demon—A historical review. Entropy 2017, 19, 240.
- Kish, L.B.; Khatri, S.P.; Granqvist, C.G.; Smulko, J.M. Critical remarks on Landauer's principle of erasure-dissipation: Including notes on Maxwell demons and Szilard engines. In Proceedings of the 2015 International Conference on Noise and Fluctuations (ICNF), Xi'an, China, 2–6 June 2015.
- del Rio, L.; Åberg, J.; Renner, R.; Dahlsten, O.; Vedral, V. The thermodynamic meaning of negative entropy. Nature 2011, 474, 61–63.
- Ladyman, J.; Robertson, K. Going round in circles: Landauer vs. Norton on the thermodynamics of computation. Entropy 2014, 16, 2278–2290.
- Bérut, A.; Petrosyan, A.; Ciliberto, S. Information and thermodynamics: Experimental verification of Landauer's erasure principle. J. Stat. Mech. Theory Exp. 2015, 2015, P06015.
- Jun, Y.; Gavrilov, M.; Bechhoefer, J. High-precision test of Landauer's principle in a feedback trap. Phys. Rev. Lett. 2014, 113, 190601.
- Hong, J.; Lambson, B.; Dhuey, S.; Bokor, J. Experimental test of Landauer's principle in single-bit operations on nanomagnetic memory bits. Sci. Adv. 2016, 2, e1501492.
- Yan, L.L.; Xiong, T.P.; Rehan, K.; Zhou, F.; Liang, D.F.; Chen, L.; Zhang, J.Q.; Yang, W.L.; Ma, Z.H.; Feng, M. Single-atom demonstration of the quantum Landauer principle. Phys. Rev. Lett. 2018, 120, 210601.
© 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Devine, S. Algorithmic Entropy and Landauer’s Principle Link Microscopic System Behaviour to the Thermodynamic Entropy. Entropy 2018, 20, 798. https://doi.org/10.3390/e20100798