# Information Theory and Computational Thermodynamics: Lessons for Biology from Physics

## Abstract


## 1. Unifying Information and Energy through Computation

Landauer's principle [3] establishes that the erasure of a single bit of information dissipates a minimum of (ln 2)kT of energy (where k is Boltzmann's constant, and T the temperature of the system).

**Figure 1.** The logical AND gate is not reversible, as can be seen in its truth table: given 0 as output, one cannot tell which of the three input combinations generated it.

A machine must therefore dissipate at least M(ln 2)kT every time M bits are erased. We have a good sense of how to connect these concepts using algorithmic information theory. If a string is algorithmically random, for example, there is no way the machine can be set up to make it produce usable work: the more predictable the string (the lower its algorithmic—Kolmogorov—complexity [13,14]), the more work can be extracted from it. This is consistent with the second law of thermodynamics—computational thermodynamics basically says that one cannot extract work from an (algorithmically) random string. The machine would either be unable to fully predict the incoming input, or it would require more energy to predict its bits and prepare itself to take advantage of the incoming input than the work it could produce. In the words of Feynman [8], a random tape has zero fuel value.
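This "fuel value" intuition can be made concrete in a few lines. The sketch below uses zlib compression as a crude, computable stand-in for the (uncomputable) Kolmogorov complexity of a tape; the names `compressed_fraction` and `fuel_value` are ours, for illustration only, and the result is an upper-bound estimate, not a physical measurement:

```python
import math
import random
import zlib

def compressed_fraction(tape: bytes) -> float:
    """Compressed size over raw size: a crude computable upper bound
    on the tape's algorithmic (Kolmogorov) complexity per bit."""
    return min(1.0, len(zlib.compress(tape, 9)) / len(tape))

def fuel_value(tape: bytes, kT: float = 1.0) -> float:
    """Estimated work extractable from a tape, in units of kT.
    Only the redundant (predictable) part of the tape is 'fuel';
    a random tape has zero fuel value (Bennett/Feynman)."""
    n_bits = 8 * len(tape)
    return n_bits * (1.0 - compressed_fraction(tape)) * kT * math.log(2)

predictable_tape = bytes(1000)   # all zeros: maximally predictable
rng = random.Random(0)
random_tape = bytes(rng.getrandbits(8) for _ in range(1000))
```

On this estimate the all-zeros tape yields close to the maximum n·kT·ln 2 of work, while the pseudo-random tape, which zlib cannot compress, yields essentially none.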

**Figure 2.** A bit used to do work, regarded as a particle pushing a piston when it is known whether the bit will be 1 or 0 (interpreted as coming from one direction or the other). Every bit in the sequence determines whether the piston will expand one way or the other, but in order to extract work the piston has to be in the right position.

#### 1.1. Thermodynamics, Computation and Computability

## 2. The Role of Information in Physics

It is tempting to take the limit where the surface area goes to infinity, and the surface is locally approximately flat. Our variables on the surface then apparently determine all physical events at one side (the black hole side) of the surface. But since the entropy of a black hole also refers to all physical fields outside the horizon the same degrees of freedom determine what happens at this side. Apparently one must conclude that a two-dimensional surface drawn in a three-space can contain all information concerning the entire three-[dimensional] space. ... This suggests that physical degrees of freedom in three-space are not independent but, if considered at Planckian scale, they must be infinitely correlated.

The Planck scale (∼10^−33 cm and ∼10^−43 s) is the scale at which general relativity breaks down and should be replaced by laws of “quantum gravity” (Wheeler is also credited with having coined the terms Planck time and Planck length). Wheeler [24] thought that quantum mechanics would eventually be rooted in the “language of bits”. According to [25], Wheeler’s last blackboard contained the following, among several other ideas: “We will first understand how simple the universe is when we recognize how strange it is.” Wheeler himself provides examples of the trend from physics to information in his “it from bit” programme, suggesting that all of reality derives its existence from information. He asserted that any formula with units involving Planck’s constant ħ would be indisputable evidence of the discrete nature of the world. It was perhaps not by chance that the same person who coined the term “black hole” for the strange solutions that general relativity produced, leading to singularities, proposed the “it from bit” dictum, suggesting that everything could be written in, and ultimately consisted of, bits of information.
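As a back-of-the-envelope illustration of these scales, the sketch below derives the Planck length and time from standard constants and evaluates the holographic bound of one bit per 4 ln 2 Planck areas of boundary surface; the function name `holographic_bits` is ours, chosen for illustration:

```python
import math

# Standard physical constants (SI units, rounded CODATA values).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_planck = math.sqrt(hbar * G / c**5)   # Planck time,   ~5.4e-44 s

def holographic_bits(radius_m: float) -> float:
    """Holographic bound on the information content of a spherical
    region: one bit per 4*ln(2) Planck areas of *boundary* surface."""
    area = 4 * math.pi * radius_m**2
    return area / (4 * l_planck**2 * math.log(2))
```

For a sphere of radius one metre the bound comes out at roughly 10^70 bits, which conveys how far the Planckian "pixel size" of the world lies below everyday scales.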

## 3. Information and Biology

**Figure 3.** A step in a configuration of Conway’s Game of Life [41] (each cell looks to its neighbours to decide whether it stays alive or dies—stays black or white). Surprisingly, simple rule systems like this, today called cellular automata, can capture fundamental aspects of life such as self-assembly, robustness and self-replication, and are capable of the most important feature of computation: Turing universality. von Neumann [40] sought to model one of the most basic life processes—reproduction—by designing lattice-based rules of this kind, in which the whole space is updated in discrete steps, in order to study self-replication.
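The update rule of the Game of Life is compact enough to state in a few lines. A minimal sketch, using the common implementation choice (not taken from the paper) of storing live cells as a set of coordinates on an unbounded grid:

```python
from collections import Counter

def life_step(alive: set) -> set:
    """One synchronous update of Conway's Game of Life, with live
    cells stored as (x, y) pairs on an unbounded grid."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbours,
    # or has 2 live neighbours and is currently alive.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# The "blinker" pattern oscillates between a row and a column,
# returning to its initial state every 2 steps.
blinker = {(0, 0), (1, 0), (2, 0)}
```

Despite the rule's simplicity, configurations of this system can simulate any Turing machine, which is the sense in which it is computationally universal.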

#### 3.1. Computational Thermodynamics and Biology

Neurons, for example, dissipate about 10^11 kT per discharge [4]. Computers (mainly because of their volatile memory devices—RAM) also dissipate at least 1kT per bit [3,4], which is also why computers heat up and require an internal fan.
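To put these figures in perspective, a short numeric sketch (assuming a physiological temperature of about 310 K, an assumption of ours) compares a neuron's dissipation per discharge with the Landauer limit for erasing one bit:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact by SI definition)
T   = 310.0          # approximate physiological temperature, K

# Landauer limit: minimum heat dissipated to erase one bit.
landauer_limit_J = k_B * T * math.log(2)

# A neuron dissipates on the order of 10^11 kT per discharge [4].
neuron_discharge_J = 1e11 * k_B * T

# How far above the one-bit erasure limit a single discharge sits.
ratio = neuron_discharge_J / landauer_limit_J
```

The Landauer limit at this temperature is a few zeptojoules, so a single neural discharge exceeds the minimum cost of erasing one bit by roughly eleven orders of magnitude.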

## 4. Concluding Remarks

## References and Notes

- Cooper, S.B. “The Mathematician’s Bias”, and the Return to Embodied Computation. In A Computable Universe: Understanding Computation & Exploring Nature as Computation; Zenil, H., Ed.; World Scientific Publishing Company: Singapore, 2012. [Google Scholar]
- Szilárd, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen (On the reduction of entropy in a thermodynamic system by the interference of intelligent beings). Z. Physik
**1929**, 53, 840–856. [Google Scholar] [CrossRef] - Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev.
**1961**, 5, 183–191. [Google Scholar] [CrossRef] - Bennett, C.H. The thermodynamics of computation—A review. Int. J. Theor. Phys.
**1982**, 21, 905–940. [Google Scholar] [CrossRef] - Bennett, C.H. Demons, Engines and the Second Law. Sci. Am.
**1987**, 257, 108–116. [Google Scholar] [CrossRef] - Fredkin, E.; Toffoli, T. Conservative logic. Int. J. Theor. Phys.
**1982**, 21, 219–253. [Google Scholar] [CrossRef] - Maxwell, J.C. Theory of Heat, 9th ed.; Pesic, P., Ed.; Dover: Mineola, NY, USA, 2001. [Google Scholar]
- Feynman, R.P. Feynman Lectures on Computation; Hey, A.J.G., Allen, R.W., Eds.; Addison-Wesley: Boston, MA, USA, 1996; p. 147. [Google Scholar]
- Sethna, J. Statistical Mechanics: Entropy, Order Parameters and Complexity; Oxford University Press: Oxford, UK, 2006. [Google Scholar]
- Toffoli, T. Reversible Computing; Technical memo MIT/LCS/TM-151; MIT Lab for Computer Science: Cambridge, MA, USA, 1980. [Google Scholar]
- Bennett, C.H. Logical reversibility of computation. IBM J. Res. Dev.
**1973**, 17, 525. [Google Scholar] [CrossRef] - Cerny, V. Energy, Entropy, Information, and Intelligence. Available online: http://arxiv.org/pdf/1210.7065.pdf (accessed on 19 November 2012).
- Kolmogorov, A.N. Three approaches to the quantitative definition of information. Probl. Inform. Transm.
**1965**, 1, 1–7. [Google Scholar] - Chaitin, G.J. A Theory of Program Size Formally Identical to Information Theory. J. Assoc. Comput. Mach.
**1975**, 22, 329–340. [Google Scholar] [CrossRef] - Solomonoff, R.J. A formal theory of inductive inference: Parts 1 and 2. Inform. Control
**1964**, 7, 1–22, 224–254. [Google Scholar] - Levin, L. Laws of information conservation (non-growth) and aspects of the foundation of probability theory. Probl. Inform. Transm.
**1974**, 10, 206–210. [Google Scholar] - Preskill, J. Do Black Holes Destroy Information? Available online: http://arxiv.org/abs/hep-th/9209058 (accessed on 19 November 2012).
- Giddings, S.B. Comments on information loss and remnants. Phys. Rev. D.
**1994**, 49, 4078–4088. [Google Scholar] [CrossRef] - Read in a presentation at the Philosophical Society of Zurich on April 24, 1865.
- Bekenstein, J.D. Information in the Holographic Universe. Sci. Am.
**2003**, 289, 61. [Google Scholar] - Susskind, L. The World as a Hologram. J. Math. Phys.
**1995**, 36, 6377–6396. [Google Scholar] [CrossRef] - Bousso, R. The holographic principle. Rev. Mod. Phys.
**2002**, 74, 825–874. [Google Scholar] [CrossRef] - 't Hooft, G. Dimensional Reduction in Quantum Gravity. Available online: http://arxiv.org/abs/gr-qc/9310026 (accessed on 19 November 2012).
- Wheeler, J.A. Information, physics, quantum: The search for links. In Complexity, Entropy, and the Physics of Information; Zurek, W.H., Ed.; Addison-Wesley: Boston, MA, USA, 1990. [Google Scholar]
- Misner, C.W.; Thorne, K.S.; Zurek, W.H. John Wheeler, relativity, and quantum information. Phys. Today
**2009**, 62, 40. [Google Scholar] - Deutsch, D. Quantum Theory, the Church-Turing Principle, and the Universal Quantum Computer. Proc. R. Soc. Lond.
**1985**, A400, 97–117. [Google Scholar] - Wolfram, S. A New Kind of Science; Wolfram Media: Champaign, IL, USA, 2002. [Google Scholar]
- Dennett, D.C. Darwin’s Dangerous Idea: Evolution and the Meanings of Life; Simon & Schuster: New York, NY, USA, 1996. [Google Scholar]
- Zenil, H.; Delahaye, J.P. On the Algorithmic Nature of the World. In Information and Computation; Dodig-Crnkovic, G., Burgin, M., Eds.; World Scientific: Singapore, 2010. [Google Scholar]
- Grafen, A. The simplest formal argument for fitness optimization. J. Genet.
**2008**, 87, 1243–1254. [Google Scholar] - Grafen, A. The formal Darwinism project: A mid-term report. J. Evolution. Biol.
**2007**, 20, 1243–1254. [Google Scholar] - Hunt, P. The Function of Hox Genes. In Developmental Biology; Bittar, E.E., Ed.; Elsevier: Amsterdam, The Netherlands, 1998. [Google Scholar]
- Shetty, R.P.; Endy, D.; Knight, T.F., Jr. Engineering BioBrick vectors from BioBrick parts. J. Biol. Eng.
**2008**, 2. [Google Scholar] [CrossRef] - Adleman, L.M. Toward a Mathematical Theory of Self-Assembly; USC Technical Report: Los Angeles, CA, USA, 2000. [Google Scholar]
- Rothemund, P.W.K.; Papadakis, N.; Winfree, E. Algorithmic Self-Assembly of DNA Sierpinski Triangles. PLoS Biol.
**2004**, 2. [Google Scholar] [CrossRef][Green Version] - Adleman, L.M.; Cheng, Q.; Goel, A.; Huang, M.-D.A. Running time and program size for self-assembled squares. In ACM Symposium on Theory of Computing, Crete, Greece; 2001; pp. 740–748. [Google Scholar]
- Rothemund, P.W.K.; Winfree, E. The program-size complexity of self-assembled squares (extended abstract). In ACM Symposium on Theory of Computing, Portland, OR, USA; 2000; pp. 459–468. [Google Scholar]
- Aggarwal, G.; Goldwasser, M.; Kao, M.; Schweller, R.T. Complexities for generalized models of self-assembly. In Symposium on Discrete Algorithms, New Orleans, LA, USA; 2004. [Google Scholar]
- Winfree, E. Algorithmic Self-Assembly of DNA. Ph.D. Thesis, California Institute of Technology, Pasadena, CA, USA, 1998. [Google Scholar]
- von Neumann, J. The Theory of Self-reproducing Automata; Burks, A., Ed.; University of Illinois Press: Urbana, IL, USA, 1966. [Google Scholar]
- Gardner, M. Mathematical Games—The fantastic combinations of John Conway’s new solitaire game “life”. Sci. Am.
**1970**, 223, 120–123. [Google Scholar] [CrossRef] - Langton, C.G. Studying artificial life with cellular automata. Physica D
**1986**, 22, 120–149. [Google Scholar] [CrossRef] - Winfree, E. Simulations of Computing by Self-Assembly; Technical Report CS-TR:1998.22; Caltech: Pasadena, CA, USA, 1998. [Google Scholar]
- Barricelli, N.A. Numerical testing of evolution theories Part I Theoretical introduction and basic tests. Acta Biotheor.
**1961**, 16, 69–98. [Google Scholar] [CrossRef] - Reed, J.; Toombs, R.; Barricelli, N.A. Simulation of biological evolution and machine learning. I. Selection of self-reproducing numeric patterns by data processing machines, effects of hereditary control, mutation type and crossing. J. Theor. Biol.
**1967**, 17, 319–342. [Google Scholar] - Dyson, G. Darwin Among the Machines; Penguin Books Ltd.: London, UK, 1999. [Google Scholar]
- Cook, M. Universality in Elementary Cellular Automata. Complex Syst.
**2004**, 15, 1–40. [Google Scholar] - Neary, T.; Woods, D. Small weakly universal Turing machines. In 17th International Symposium on Fundamentals of Computation Theory (FCT 2009); Springer: Warsaw, Poland, 2009; Volume 5699 of LNCS, pp. 262–273. [Google Scholar]
- Holland, J.H. Adaptation in Natural and Artificial Systems; University of Michigan Press: Ann Arbor, MI, USA, 1975. [Google Scholar]
- Koza, J.R. Genetic Programming: On the Programming of Computers by Means of Natural Selection; MIT Press: Cambridge, MA, USA, 1992. [Google Scholar]
- Martínez, G.J.; Adamatzky, A.; Stephens, C.R. Cellular automaton supercolliders. Int. J. Mod. Phys. C
**2011**, 22, 419–439. [Google Scholar] [CrossRef] - Livnat, A.; Pippenger, N. An optimal brain can be composed of conflicting agents. PNAS
**2006**, 103, 9. [Google Scholar] - Chaitin, G.J. Metaphysics, Metamathematics and Metabiology. In Randomness Through Computation; Zenil, H., Ed.; World Scientific: Singapore, 2011; pp. 93–103. [Google Scholar]
- McNamara, J.M.; Dall, S.R.X. Information is a fitness enhancing resource. Oikos
**2010**, 119, 231–236. [Google Scholar] [CrossRef] - Zenil, H.; Marshall, J.A.R. Some Computational Aspects of Essential Properties of Evolution and Life. ACM Ubiquity
**2012**, 12, 11. [Google Scholar] - Zenil, H.; Gershenson, C.; Marshall, J.A.R.; Rosenblueth, D. Life as Thermodynamic Evidence of Algorithmic Structure in Natural Environments. Entropy
**2012**, 14, 2173–2191. [Google Scholar] [CrossRef] - Zenil, H.; Hernandez-Quiroz, F. On the Possible Computational Power of the Human Mind. In Worldviews, Science and Us: Philosophy and Complexity; Gershenson, C., Aerts, D., Edmonds, B., Eds.; World Scientfic: Singapore, 2007. [Google Scholar]
- Paz Flanagan, T.; Letendre, K.; Burnside, W.; Fricke, G.M.; Moses, M. How ants turn information into food. IEEE Artificial Life (ALIFE)
**2011**, Paris, France. [Google Scholar] - Catania, K.C. Born knowing: Tentacled snakes innately predict future prey behaviour. PLoS ONE
**2010**, 5, 6. [Google Scholar] - de Vladar, H.P.; Barton, N.H. The contribution of statistical physics to evolutionary biology. Trends Ecol. Evol.
**2011**, 26, 424–432. [Google Scholar] [CrossRef] - Shannon, C. A Mathematical Theory of Communication. Bell Syst. Tech. J.
**1948**, 27, 379–423, 623–656. [Google Scholar]

© 2012 by the authors; licensee MDPI, Basel, Switzerland. This article is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

**MDPI and ACS Style**

Zenil, H.
Information Theory and Computational Thermodynamics: Lessons for Biology from Physics. *Information* **2012**, *3*, 739-750.
https://doi.org/10.3390/info3040739

**AMA Style**

Zenil H.
Information Theory and Computational Thermodynamics: Lessons for Biology from Physics. *Information*. 2012; 3(4):739-750.
https://doi.org/10.3390/info3040739

**Chicago/Turabian Style**

Zenil, Hector.
2012. "Information Theory and Computational Thermodynamics: Lessons for Biology from Physics" *Information* 3, no. 4: 739-750.
https://doi.org/10.3390/info3040739