# Entropy and Information Theory: Uses and Misuses

## Abstract


## 1. Introduction

“It is the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.”

“Two things are infinite: the universe and human stupidity; and I’m not sure about the universe.”

“I was warned that if I persisted [criticizing relativity], I was likely to spoil my career prospects.”

“Students are told that the theory must be accepted although they cannot expect to understand it. They are encouraged right at the beginning of their careers to forsake science in favor of dogma.”

This leads us to a deeper question: Why do we have to die at all? Why couldn’t God or Nature endow us with immortality? The answer has to do with… the Second Law of Thermodynamics, or the fact that there’s an arrow of time in our universe that leads to entropy and the running down of everything.

#### 1.1. The Unique Nature of Entropy

- (1) There is no other concept which was so gravely misused, misapplied and, in some instances, even abused.
- (2) There is no other concept to which so many powers were ascribed, none of which was ever justified.
- (3) There is no other concept which features explicitly on so many books’ covers and titles. Ironically, some of these books, though mentioning entropy, are in fact totally irrelevant to entropy.
- (4) There is no other concept which was once misinterpreted, and then the same misinterpretation was blindly and uncritically propagated in the literature by so many scientists.
- (5) There is no other concept on which people wrote whole books full of “whatever came to their mind,” knowing or not knowing that they would be immune from being proven wrong.
- (6)
- (7) There is no other concept in which the roles of “cause” and “effect” have been interchanged.
- (8) Finally, I would like to add my own “no-other-concept”: no other concept has contributed more to confusing, distorting, and misleading human minds than entropy and the Second Law.

“…no other scientific law has contributed more to the liberation of the human spirit than the Second Law of thermodynamics.”

No other scientific law has liberated the spirit of some scientists to say whatever comes to their minds on the Second Law of thermodynamics!

#### 1.2. Outline of the Article

## 2. Three Different but Equivalent Definitions of Entropy

#### 2.1. Clausius’s “Definition” of Entropy

#### 2.2. Boltzmann’s Definition Based on Total Number of Micro-States

Boltzmann defined the entropy of an isolated system as $S=k\log W$ (2), where k is Boltzmann’s constant ($k\approx 1.38\times {10}^{-23}$ J/K), and W is the number of accessible micro-states of the system. Here, log is the natural logarithm. At first glance, Boltzmann’s entropy seems to be completely different from Clausius’ entropy. Nevertheless, in all cases for which one can calculate changes of entropy, one obtains agreement between the values calculated by the two methods. Boltzmann’s entropy, as defined in Equation (2), has raised considerable confusion regarding the question of whether entropy is, or is not, a subjective quantity [4].
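As a minimal numerical illustration of Equation (2) (my sketch, not the article’s): for N independent two-state particles with all micro-states accessible, $W={2}^{N}$, so the Boltzmann entropy is exactly $Nk\log 2$, i.e., $k\log 2$ per particle.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k log W, Equation (2); log is the natural logarithm."""
    return k_B * math.log(W)

# Illustration: N particles, each in one of two states, all micro-states accessible.
N = 100
W = 2 ** N                      # number of accessible micro-states
S = boltzmann_entropy(W)
# S equals N * k_B * log(2): entropy is extensive, k log 2 per two-state particle.
```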

#### 2.3. ABN’s Definition of Entropy Based on Shannon’s Measure of Information

“The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning.”

Suppose we have a set of possible events whose probabilities of occurrence are ${p}_{1},{p}_{2},\cdots ,{p}_{n}$. These probabilities are known but that is all we know concerning which event will occur. Can we find a measure of how much “choice” is involved in the selection of the event or how uncertain we are of the outcome?

- (1) H should be continuous in the ${p}_{i}$.
- (2) If all the ${p}_{i}$ are equal, ${p}_{i}=\frac{1}{n}$, then H should be a monotonic increasing function of n. With equally likely events there is more choice, or uncertainty, when there are more possible events.
- (3) If a choice be broken down into two successive choices, the original H should be the weighted sum of the individual values of H.
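Shannon proved that, up to a multiplicative constant, the only function satisfying these three requirements is $H=-\sum {p}_{i}\log {p}_{i}$. The grouping requirement (3) can be checked numerically with Shannon’s own example, where the choice among probabilities (1/2, 1/3, 1/6) is broken into a choice between two halves followed by a (2/3, 1/3) split of the second half (the Python below is my illustration, not part of the article):

```python
import math

def H(probs):
    """Shannon's measure of information, H = -sum p_i log p_i (natural log).
    Terms with p_i = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Requirement (2): for uniform distributions H = log n, increasing with n.
assert H([1/2, 1/2]) < H([1/3, 1/3, 1/3]) < H([1/4, 1/4, 1/4, 1/4])

# Requirement (3): grouping. Choose among (1/2, 1/3, 1/6) directly, or first
# pick a half (1/2, 1/2) and then split the second half into (2/3, 1/3).
direct  = H([1/2, 1/3, 1/6])
grouped = H([1/2, 1/2]) + 0.5 * H([2/3, 1/3])
assert abs(direct - grouped) < 1e-12
```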

#### 2.3.1. First Step: The Locational SMI of a Particle in a 1D Box of Length L

#### 2.3.2. Second Step: The Velocity SMI of a Particle in a 1D “Box” of Length L

#### 2.3.3. Third Step: Combining the SMI for the Location and Momentum of One Particle; Introducing the Uncertainty Principle

#### 2.3.4. The SMI of a Particle in a Box of Volume V

#### 2.3.5. Step Four: The SMI of Locations and Momenta of N Independent and Indistinguishable Particles in a Box of Volume V

## 3. Various Formulations of the Second Law

- (1)
- (2) We never observe the final state of any of the processes in Figure 4 going back to the initial state and staying in that state.

The entropy of the universe always increases

#### 3.1. Entropy-Formulations of the Second Law

The entropy of an unconstrained isolated system $\left(E,V,N\right)$, at equilibrium is larger than the entropy of any possible constrained equilibrium states of the same system.

Removing any constraint from a constrained equilibrium state of an isolated system will result in an increase (or no change) in the entropy.
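A concrete instance of this formulation is the expansion in Figure 5: for an ideal gas of N particles, removing the partition that confines the gas to a volume ${V}_{i}$ and letting it fill ${V}_{f}>{V}_{i}$ raises the entropy by $\Delta S=Nk\log ({V}_{f}/{V}_{i})\ge 0$, a standard textbook result (the sketch below is my illustration, not the article’s):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def delta_S_expansion(N, V_initial, V_final):
    """Entropy change of an ideal gas of N particles expanding from
    V_initial to V_final at constant energy: dS = N k log(Vf/Vi)."""
    return N * k_B * math.log(V_final / V_initial)

# Removing the partition doubles the accessible volume for one mole of gas:
N_A = 6.022e23
dS = delta_S_expansion(N_A, 1.0, 2.0)
# dS = N_A * k_B * log 2, about 5.76 J/K, and strictly positive:
assert dS > 0
```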

#### 3.1.1. Probability Formulations of the Second Law for Isolated Systems

“… the system…when left to itself, it rapidly proceeds to the disordered, most probable state.”
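The claim that a system left to itself “rapidly proceeds to the most probable state” can be made vivid with a toy Monte Carlo model in the spirit of the Ehrenfest urn model (my illustration, not from the article): start all N particles in the left half of a box and let one randomly chosen particle switch sides at each step; the occupation of the left half drifts quickly to the most probable value, N/2, and then merely fluctuates around it.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
N = 1000
left = N        # constrained initial state: all particles on the left

# At each step, one particle switches sides; a left-hand particle is
# picked with probability left/N, otherwise a right-hand one.
for step in range(20 * N):
    if random.random() < left / N:
        left -= 1   # a left-hand particle jumps to the right half
    else:
        left += 1   # a right-hand particle jumps to the left half

# After many steps, `left` fluctuates near the most probable value N/2.
```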

#### 3.1.2. The Probability Formulation of the Second Law

#### 3.1.3. Some Concluding Remarks

## 4. Interpretations and Misinterpretations of Entropy

#### 4.1. The Order-Disorder Interpretation

“… the initial state of the system…must be distinguished by a special property (ordered or improbable) …”

“…this system takes in the course of time states…which one calls disordered.”

“Since by far most of the states of the system are disordered, one calls the latter the probable states.”

“… the system…when left to itself, it rapidly proceeds to the disordered, most probable state.”

“…disorder (measured by the parameter called entropy) increases with time”

“Nobody knows for sure why the world is going from order to disorder, but this is so”

#### 4.2. The Association of Entropy with Spreading/Dispersion/Sharing

“To the question what in one word does entropy really mean, the author would have no hesitation in replying ‘Accessibility’ or ‘Spread.’ When this picture of entropy is adopted, all mystery concerning the increasing property of entropy vanishes.”

#### 4.3. Entropy as Information; Known and Unknown, Visible and Invisible and Ignorance

“Gain in entropy always means loss of information and nothing more.”

“I have deliberately omitted reference to the relation between information theory and entropy. There is the danger, it seems to me, of giving the impression that entropy requires the existence of some cognizant entity capable of possessing ‘information’ or of being to some degree ‘ignorant.’ It is then only a small step to the presumption that entropy is all in the mind, and hence is an aspect of the observer.”

#### 4.4. Entropy as a Measure of Probability

**Entropy is not probability!**

Entropy is shown to be related to probability. A closed isolated system may have been created artificially with very improbable structure…

The probability has a natural tendency to increase, and so does entropy. The exact relation is given by the famous Boltzmann–Planck formula: $S=k\log W$.
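Taken literally, the Boltzmann–Planck formula $S=k\log W$ turns entropy differences into probability ratios: ${W}_{2}/{W}_{1}=\mathrm{exp}[({S}_{2}-{S}_{1})/k]$. A quick numerical check (my illustration; the numbers are not from the article):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def probability_ratio(delta_S):
    """Ratio W2/W1 of micro-state counts for two macro-states of an
    isolated system, from S = k log W: W2/W1 = exp(delta_S / k)."""
    return math.exp(delta_S / k_B)

# Example: the chance that N ideal-gas particles are all found in the left
# half of their box is (1/2)**N, corresponding to delta_S = -N k log 2.
N = 100
dS = -N * k_B * math.log(2)
assert abs(probability_ratio(dS) - 0.5 ** N) < 1e-40
```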

#### 4.5. Entropy as a Measure of Irreversibility

Clausius introduces a quantity that measures this irreversible progress of heat in only one direction and, since he was a cultivated German, he gives it a name taken from ancient Greek–entropy.

“Entropy: A measure of the irreversibility of the thermodynamic evolution of an isolated system.”

“…the entropy difference between two states of an isolated system quantifies the irreversibility of a process connecting those two states”

#### 4.6. Entropy as Equilibriumness

“The equilibrium of the universe increases, despite your best effort.”

“What if you don’t use an engine? What if you can turn a crank by hand? Well, in actuality your muscles are acting as an engine, too. They are exploiting the chemical energy stored in molecules in your bloodstream, breaking them apart, and releasing the energy into the environment in the form of work. This increases the “equilibriumness” of the universe just as severely as a heat engine does.”

## 5. Misuses and Misapplications of Entropy and the Second Law

#### 5.1. The Association of Entropy with Time

“The entropy of the universe always increases.”

#### 5.2. Boltzmann’s H-Theorem and the Seeds of an Enormous Misconception about Entropy

#### 5.3. Can Entropy Be Defined For, and the Second Law Applied to Living Systems?

“It has been explained in Chapter 1 that the laws of physics, as we know them, are statistical laws. They have a lot to do with the natural tendency of things to go over into disorder.”

“How does the living organism avoid decay? The obvious answer is: By eating, drinking, breathing and (in the case of plants) assimilating. The technical term is metabolism.”

“What then is that precious something contained in our food which keeps us from death? That is easily answered. Every process, event, happening–call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus, a living organism continually increases its entropy–or, as you may say, produces positive entropy–and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e., alive, by continually drawing from its environment negative entropy–which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy.”

“In Chapter 8 we also saw how the Second Law accounts for the emergence of the intricately ordered forms characteristic of life.”

“If a living organism needs food, it is only for the negentropy it can get from it, and which is needed to make up for the losses due to mechanical work done, or simple degradation processes in living systems. Energy contained in food does not really matter: Since energy is conserved and never gets lost, but negentropy is the important factor.”

IT IS ENTROPY, NOT ENERGY THAT DRIVES THE WORLD.

#### 5.4. Entropy and Evolution

“The entropy of the earth’s biosphere is indeed decreasing by a tiny amount due to evolution, and the entropy of the cosmic microwave background is increasing by an even greater amount to compensate for that decrease.”

#### 5.5. Application of Entropy and the Second Law to the Entire Universe

“The entropy of the universe always increases.”

Entropy always increases.

Entropy of the universe always increases.

“So what is its entropy?”

“This number comes from simply treating the contents of the universe as a conventional gas in thermal equilibrium.”

“The conclusion is perfectly clear: The state of the early universe was not chosen randomly among all possible states. Everyone in the world who has thought about the problem agrees with that.”

“Why don’t we live in empty space?”

When it comes to the past, however, we have at our disposal both our knowledge of the current macroscopic state of the universe, plus the fact that the early universe began in a low-entropy state. That one extra bit of information, known simply as the “Past Hypothesis,” gives us enormous leverage when it comes to reconstructing the past from the present.

## 6. Conclusions

## Funding

## Conflicts of Interest

## References

- Shermer, M. Heavens on Earth: The Scientific Search for the Afterlife, Immortality, and Utopia; Henry Holt and Co.: New York, NY, USA, 2018.
- Atkins, P. The Second Law; Scientific American Books, W.H. Freeman and Co.: New York, NY, USA, 1984.
- Atkins, P. Four Laws That Drive the Universe; Oxford University Press: Oxford, UK, 2007.
- Ben-Naim, A. The Greatest Blunder Ever in the History of Science, Involving: Entropy, Time, Life and the Universe; 2020; in preparation.
- Ben-Naim, A. Time’s Arrow (?) The Timeless Nature of Entropy and the Second Law of Thermodynamics; Lulu Publishing Services: Morrisville, NC, USA, 2018.
- Ben-Naim, A.; Casadei, D. Modern Thermodynamics; World Scientific Publishing: Singapore, 2017.
- Ben-Naim, A. Information Theory, Part I: An Introduction to the Fundamental Concepts; World Scientific Publishing: Singapore, 2017.
- Ben-Naim, A. Entropy: The Truth, the Whole Truth and Nothing but the Truth; World Scientific Publishing: Singapore, 2016.
- Ben-Naim, A. The Four Laws That Do Not Drive the Universe; World Scientific Publishing: Singapore, 2017.
- Sackur, O. Annalen der Physik **1911**, 36, 958.
- Tetrode, H. Annalen der Physik **1912**, 38, 434.
- Shannon, C.E. A Mathematical Theory of Communication. Bell System Tech. J. **1948**, 27, 379–423.
- Callen, H.B. Thermodynamics and an Introduction to Thermostatistics, 2nd ed.; Wiley: New York, NY, USA, 1985.
- Brush, S.G. The Kind of Motion We Call Heat: A History of the Kinetic Theory of Gases in the 19th Century, Book 2: Statistical Physics and Irreversible Processes; North-Holland Publishing Company: Amsterdam, The Netherlands, 1976.
- Brush, S.G. Statistical Physics and the Atomic Theory of Matter, from Boyle and Newton to Landau and Onsager; Princeton University Press: Princeton, NJ, USA, 1983.
- Aharoni, R. Circularity: A Common Secret to Paradoxes, Scientific Revolutions and Humor; World Scientific Publishing: Singapore, 2016.
- Guggenheim, E.A. Statistical Basis of Thermodynamics. Research **1949**, 2, 450.
- Lewis, G.N. The Symmetry of Time in Physics. Science **1930**, 71, 569.
- Brillouin, L. Science and Information Theory; Academic Press: New York, NY, USA, 1962.
- Rovelli, C. The Order of Time; Riverhead Books: New York, NY, USA, 2018.
- Lemons, D.S. A Student’s Guide to Entropy; Cambridge University Press: Cambridge, UK, 2013.
- Seife, C. Decoding the Universe: How the Science of Information Is Explaining Everything in the Cosmos, from Our Brains to Black Holes; Penguin Books: East Rutherford, NJ, USA, 2006.
- Eddington, A. The Nature of the Physical World; Cambridge University Press: Cambridge, UK, 1928.
- Schrödinger, E. What Is Life? The Physical Aspect of the Living Cell; Cambridge University Press: Cambridge, UK, 1944.
- Styer, D.F. Entropy and Evolution. Am. J. Phys. **2008**, 76, 1031.
- Carroll, S. From Eternity to Here: The Quest for the Ultimate Theory of Time; Plume: New York, NY, USA, 2010.

**Figure 1.** Ten particles in a box of volume V. Each particle i has a location vector and a velocity vector.

**Figure 2.**The uniform distribution for a particle in a 1D box of length L. The probability of finding a particle in a small interval dx, is dx/L.

**Figure 3.** (**a**) The velocity distribution of particles in one dimension at different temperatures; (**b**) the speed (or absolute velocity) distribution of particles in 3D at different temperatures.

**Figure 4.** Three typical spontaneous irreversible processes occurring in isolated systems. (**a**) Expansion of an ideal gas; (**b**) mixing of two ideal gases; (**c**) heat transfer from a hot to a cold body.

**Figure 5.** An expansion of an ideal gas. (**a**) The initial state; (**b**) the system at the moment after removal of the partition; and (**c**) the final equilibrium state.

© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ben-Naim, A.
Entropy and Information Theory: Uses and Misuses. *Entropy* **2019**, *21*, 1170.
https://doi.org/10.3390/e21121170
