Communication

Comments on "A Law of Information Growth"

Richard L. Coren
Emeritus Professor, Electrical and Computer Engineering Department, Drexel University, Philadelphia, PA 19104, USA
Entropy 2002, 4(1), 32-34; https://doi.org/10.3390/e4010032
Submission received: 11 January 2001 / Accepted: 11 January 2002 / Published: 30 January 2002

Abstract: Notes added in proof to an earlier paper on an empirical “law” of information growth in evolution, and in response to questions raised therein.

A recent paper [1] in this journal presented evidence for a systematic growth of information in the history of the cosmos, of life on Earth, and of the technology developed there. This evidence is empirical, guided by a cybernetic model of evolution that has been successfully applied to several other physical and sociological phenomena [2]. Because the scale of the data offered is so great, and because the relation between information and entropy is so close [3], the question was raised of whether this growth is an unrecognized aspect of the second law of thermodynamics. In particular it was asked “whether a theoretical foundation can be found for this phenomenology. . . .”
  That paper also points out that:
  “From the Second Law of Thermodynamics, we know that the increase of entropy is a consequence of the dynamics of system change, not their source, although we often use it as such. However, ‘Processes that generate order are in no sense driven by the growth of entropy [4].’ The relation between entropy and information means that this is true of the latter as well as the former.” This implies that indirect mechanisms must be involved in the information development described.
  In this regard, Schneider and Kay [5] discuss the behavior of systems in dynamic equilibrium under a set of external fluxes and forces. When these influences change, the systems react in ways that resist being moved from their equilibrium. If a threshold is crossed they may develop a new state that opposes further movement from equilibrium. A common example is the Bénard cell, a container of fluid heated from below. Heat rises through the liquid by conduction, i.e., molecular collisions. If the temperature gradient and heat flux increase too much, the cell suddenly displays an emergent, coherent organization. This involves a switch to heat flow by convection, in which columns of liquid move up, carrying the heat, and then, cooled, return to the base. It is a more ordered state, with lower configurational entropy, but the greater heat flow entails an entropy increase that more than compensates.
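  To make the threshold and the compensation quantitative, one may borrow two standard textbook results; the notation below is illustrative and not the paper's. Convection sets in when the dimensionless Rayleigh number, which compares buoyant driving to viscous and thermal damping, exceeds a critical value,

\[
  \mathrm{Ra} \;=\; \frac{g\,\alpha\,\Delta T\,d^{3}}{\nu\,\kappa} \;\gtrsim\; \mathrm{Ra}_{c} \approx 1708
  \qquad \text{(rigid upper and lower plates),}
\]

with g the gravitational acceleration, α the thermal expansion coefficient, ΔT the imposed temperature difference, d the layer depth, ν the kinematic viscosity, and κ the thermal diffusivity. A cell passing heat flux Q from a hot base at temperature T_h to a cool top at T_c produces entropy at the rate

\[
  \dot{S}_{\mathrm{prod}} \;=\; Q\left(\frac{1}{T_{c}} - \frac{1}{T_{h}}\right) \;>\; 0,
\]

so the more ordered convective state, by raising Q, raises the very entropy production that more than compensates for its lower configurational entropy.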
  Consider an even simpler system, a hot gas of hydrogen (H) and oxygen (O) atoms that can be completely described by stating that there are twice as many hydrogen atoms as oxygen atoms and that each atom occupies an equivalent volume. This is the greatest entropy state of the system, since its components are uniformly distributed. If the gas is cooled, these atoms will combine to form water molecules (H-O-H). A new description must include information about the chemical bonds, e.g., their angles, how the molecules rotate about those bonds, and how the molecules twist and vibrate when they interact. Further cooling produces ice, in which the molecules become more firmly interconnected. Now the behavior is different and more complex. Its description must include the crystalline structure with its fluctuations and imperfections, the crystalline and molecular electronic states, the phonon spectrum, etc.
  Each change resulted in a more ordered state with lower entropy than its predecessor. Each change also liberated heat, of reaction in the first case and of fusion in the second, so that the net entropy of the local universe increased. In addition, each successive condition involved more information to formulate its state of being, as its internal entropy was reduced. This information is the difference between the maximum entropy and that of the ordered state [3]. The same conclusion can be drawn for the ordered states of formation of the chemical elements after the Big Bang, of the stars and galaxies, of the Bénard cell, and of the living cell. We expect, therefore, that the totality of changes of ordered systems under the influence of changing environmental forces is toward greater order and information.
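  As a minimal formal sketch, in notation of our choosing rather than the paper's: if S_max is the entropy of the fully disordered state and S that of the ordered one, the information embodied in the order is

\[
  I \;=\; S_{\max} - S,
\]

and the second-law budget for forming that order while releasing heat Q to surroundings at temperature T is

\[
  \Delta S_{\mathrm{total}} \;=\; \Delta S_{\mathrm{system}} + \frac{Q}{T} \;\geq\; 0,
\]

so a local entropy decrease, and the corresponding gain in I, is paid for by the heat exported to the surroundings.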
  Schneider and Kay extend these considerations to speciation: “When a new living system is generated after the demise of an earlier one, it would make the self-organization process more efficient if it were constrained to variations which have a high probability of success. Genes play this role in constraining the self-organization process to those options which have a high probability of success. . . . This is the role of the gene and, at a larger scale, biodiversity: to act as information databases of self-organization strategies that work.”
  Once genes are involved, the nature of order/information change can take different forms, leading to vast complexification of structure and, ultimately, of a neural net and brain, as illustrated by those events on the information trajectory of reference [1]. For example, there seems to have been an impetus for the evolutionary development of information communication: first through the appearance of coherent speech, even at the cost of some physical disadvantage [6], and then through various inventive technologies.
  We are the conscious originators of the last of these, the technological informational inventions on the information trajectory of reference [1]. We can therefore appreciate the motivation they represent for information expansion, even as we inquire about its connection to the law of entropy.

References

  1. Coren, R.L. Empirical Evidence for a Law of Information Growth. Entropy 2001, 3, 259–273. http://www.mdpi.org/entropy/papers/e3040259.pdf
  2. Coren, R.L. The Evolutionary Trajectory: The Growth of Information in the History and Future of Earth; Gordon and Breach Publishers, 1998.
  3. Machta, J. Entropy, Information, and Computation. Am. J. Phys. 1999, 67, 1074–1077.
  4. Layzer, D. In Entropy, Information, and Evolution; Weber, B.H., Depew, D.J., Eds.; MIT Press, 1990; p. 23.
  5. Schneider, E.D.; Kay, J.J. In What is Life? The Next Fifty Years; Murphy, M.P., O’Neill, L., Eds.; Cambridge Univ. Press, 1995; p. 161.
  6. Lieberman, P. The Biology and Evolution of Language; Harvard Univ. Press, 1984. Lieberman, P. On Human Speech, Syntax, and Language. Human Evolution 1988, 3, 3–18.
