
A Possible Ethical Imperative Based on the Entropy Law

Department of Biomedical Engineering, Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA 15213-3890, USA
Entropy 2016, 18(11), 389;
Submission received: 10 August 2016 / Revised: 27 October 2016 / Accepted: 28 October 2016 / Published: 3 November 2016


Lindsay, in an article titled “Entropy consumption and values in physical science” (Am. Sci. 1959, 47, 678–696), proposed a Thermodynamic Imperative similar to Kant’s Ethical Categorical Imperative. In this paper, after describing the concept of an ethical imperative as elaborated by Kant, we briefly discuss the role of science, its relationship to classical thermodynamics, and the physical implications of the first and second laws of thermodynamics. We finally attempt to extend and supplement Lindsay’s Thermodynamic Imperative (TI) with another imperative suggesting simplicity, conservation, and harmony.

1. Introduction

“Thermodynamics is a funny subject. The first time you go through it, you don’t understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don’t understand it, but by that time you are so used to it, it doesn’t bother you anymore.”
—Sommerfeld (see Angrist and Loren [1], p. 215)
This paper is an attempt to establish a connection between ethics and the second law of thermodynamics (entropy) (see Remark B1 in Appendix B). Specifically, we ask: how do scientists make their ethical decisions? Do they make them based on their religious, spiritual, or philosophical outlook, or do they base them on scientific laws? Einstein ([2], p. 115) seems to provide an answer:
“It is the privilege of man’s moral genius, impersonated by inspired individuals, to advance ethical axioms which are so comprehensive and so well founded that men will accept them as grounded in the vast mass of their individual emotional experience. Ethical axioms are found and tested not very differently from the axioms of science. Truth is what stands the test of experience.”
What Einstein seems to be suggesting or implying, I think, is that ethical axioms, or at least some of them, can be founded upon science. In fact, Lindsay [3], in an article titled “Entropy consumption and values in physical science,” proposed an Ethical Imperative similar to Kant’s Categorical Imperative:
“We may call it the thermodynamic imperative since it is suggested by the principles of thermodynamics. In brief, and in simple language, it urges all men to fight always as vigorously as possible to increase the degree of order in their environment so as to combat the natural tendency for order in the universe to be transformed into disorder, the so-called second law or principle of thermodynamics.”
A moral imperative is a statement which can be expressed as: “I ought …,” implying that “an action of this kind to be imposed or necessitated by an objective principle valid for any rational agent as such.” (Kant, translated by Paton, [4], p. 26). Thus an imperative is directly related to rationality. Kant’s Categorical Imperative deals with maxims, universal law, will, etc. A maxim is a purely personal principle upon which we will to act; according to Kant, this is a subjective principle. An objective principle, on the other hand, is one on which every rational agent would necessarily act, provided that this rational agent had complete (full) control over his or her actions. Kant’s categorical imperative is first given as a negative statement: “I ought never to act except in such a way that I can also will that my maxim should become a universal law” ([4], p. 70). Later in his book, Groundwork of the Metaphysics of Morals, Kant expresses this imperative as a positive statement: “Act only on that maxim through which you can at the same time will that it should become a universal law” ([4], p. 88). Lindsay [3] used these guidelines to develop his ideas, culminating in a thermodynamic imperative. In this paper, we attempt to extend and supplement Lindsay’s thermodynamic imperative with another imperative suggesting simplicity, conservation, and harmony.
We start by asking the following question: If science, formerly and more accurately known as Natural Philosophy, is a part of philosophy, as much as ethics is a part of philosophy, then are there ethical laws stemming or emanating from the body of knowledge we now call science (Remark B2)? In other words, is it possible to use science and arrive at some rules of ethics based on science (Remark B3)? This is the primary concern of this essay. In the same spirit, Newton ([5], p. 1) said: “Natural philosophy consists in discovering the frame and operation of nature, and reducing them, as far as may be, to general rules or laws—establishing these rules by observations and experiments, and thence deducing the causes and effects of things…” There are those who say that ethics belongs to the domain of religion, while there are others who say ethics is a branch of philosophy and one can be ethical without belonging to a religious tradition or establishment. Is it possible that scientists can use science and devise a code of ethics? Sarton ([6], p. 171), one of the greatest historians of science, says:
“Almost every man of science, whether he be historically minded or not, is obliged to do a certain amount of retrospection, because his own investigations bring him face to face with the work of some predecessor, or because of academic conventions…The fundamental questions ‘When did that happen? Where?’ are easy to answer. The questions ‘Why?’ and ‘how?’ are more difficult of course, yet they are still comparatively easy for later periods.”
It was perhaps this sort of reflection that caused many of the greatest scientists of the 20th Century to realize the potential disasters ensuing from a nuclear war (see Masters and Way [7]).
In fact, as pointed out by Deltete [8], Hertz in 1894 had summarized one of the goals of classical physics: “All physicists agree that the problem of physics consists in tracing the phenomena of nature back to the simple laws of mechanics.” This, in a sense, is a summary of the “mechanistic” view of nature. With the introduction of the concepts of “energetics” and “energeticism” into thermodynamics, mainly by Ostwald and others (see for example Ostwald [9], and Deltete [10,11]), some physicists began to look at things from a different perspective. Deltete [8] provides a succinct summary of this view: “The energeticists believed that scientists should abandon their efforts to understand the world in mechanical terms, and that they should give up atomism as well, in favor of a new world view based entirely on the transfers and transformations of energy.” In fact, Ostwald [9] went as far as saying that “…energetics coincides with that movement which has originated on philosophical ground and which pursues very similar ends under the name of ‘pragmatism’ or ‘humanism’.” This seems to be one of the earliest attempts by a scientist to connect energy (or thermodynamics) with philosophy. However, it needs to be mentioned that the concept of “energetics” as an all-encompassing theory has not been widely accepted by the science community. According to Holt [12]: “The two major scientific critics of Energism, Max Planck and Mach, dismissed Energism as useless ‘metaphysics.’” For additional criticism and discussion of these views, we refer the reader to the papers of Carus [13] and Hakfoort [14]. Of course, in the current paper I am not advocating an energeticist view of ethics or philosophy.
The objective of this essay is to show that the application of the Entropy Law to the environment should include not only Lindsay’s thermodynamic imperative, which was based on a modern interpretation of the Entropy Law relating it to order, organization, and sorting, but also a supplementary, and indeed preceding, thermodynamic imperative, based on a classical interpretation of the Entropy Law relating it to conservation, simplicity, and harmony.
In Section 2 of this paper, we discuss very briefly certain aspects of science. In Section 3, a discussion of classical thermodynamics, including concepts such as systems, energy, and entropy, is provided. In Section 4, a few remarks relating entropy to ethics are offered and a new Thermodynamic Imperative (complementary to Lindsay’s) is suggested.

2. The Role of Science

“Scientific progress continually reminds us that dogmas are for doubting, orthodox opinions are for factual rebuttals, established facts are for disconfirming, wild conjectures are (sometimes) for entertaining, beautiful thoughts are often (alas) for unthinking, and world authorities are for deflating.”
—Ziman [15]
Science has been blamed for many disasters, especially warfare and the destruction of nature, and it has been praised for many of the advances made in health, transportation, agriculture, etc. With issues such as “Global Warming” and other (natural) catastrophes such as floods, fires, and tornadoes, scientists are very often asked to interpret, make predictions, and come up with solutions and explanations (Remark B4). Bertrand Russell ([16], p. 78) says: “Science used to be valued as a means of getting to know the world; now, owing to the triumph of technique, it is conceived as showing how to change the world.” Amongst the important contributions of the 20th Century are perhaps the Gaia Hypothesis (see Lovelock [17]) and the Connectivity Hypothesis (see Laszlo [18]), where the organic and living nature of the environment and the connections amongst the various parts and members of the biotic community are explored. To these we can also add Systems Theory (Bertalanffy [19], Laszlo [20], Macy [21]), in some ways related to thermodynamics, which attempts to look at the Whole rather than the parts and also considers the interactions among the various sub-systems.
While there are many definitions of science (Remark B5) and of who a scientist is (Remark B6), below we give a few sample definitions offered by scientists and philosophers. For example, Lindsay [3] says: “Science is a method for the description, creation and understanding of experience. It is, of course, not the only method for doing this, but no one can deny its success in the field to which it has generally confined itself.” Stephen Toulmin ([22], p. 99) says: “Science is not an intellectual computing-machine: it is a slice of life. We set out on our enquiry into the aims of science, hoping to do two things: first, to define in a life-like way the common intellectual tasks on which scientists are engaged, and the types of explanation their theories are intended to provide; and secondly, to pose the problem, how we are to tell good theories from bad, and better ideas, hypotheses, or explanations from worse ones.” Bateson ([23], p. 29) says: “Science is a way of perceiving and making what we may call ‘sense’ of our precepts. But perception operates only upon difference. All receipt of information is necessarily the receipt of news of difference, and all perception of difference is limited by threshold. Differences that are too slight or too slowly presented are not perceivable. They are not food for perception.” Russell [24] says: “Science is the attempt to discover, by means of observation, and reasoning based upon it, first, particular facts about the world, and then laws connecting facts with one another and (in fortunate cases) making it possible to predict future occurrences.” And finally, Ziman ([25], p. 28) defines physics, which is what I mean by science in this paper, as: “…the science devoted to discovering, developing and refining those aspects of reality that are amenable to mathematical analysis.”
There are two distinct and related concepts here which need to be considered: observation and reasoning based upon it. If something cannot be observed, either via our sensory organs or via instruments, then does science take a (strong) position about its non-existence? In other words, can a science based on reasoning arrive at something that no observation could substantiate? In general, when we talk about scientific laws, we are indirectly assuming that they are based on observation and reasoning based upon it. However, as Boulding [26] remarks:
“Most of the great laws, however, have a strong element of a priori about them. They are the way we know things must be rather than the way we have observed them. There is a certain myth among scientists that the great laws of science are derived by induction from observation and tested by the confirmation of predictions. This may be true of the small laws, but it is not really true of the great ones, which are derived on the whole from identities that we discover in our imaginations.”
If we define physics as that branch of science which studies the physical world, then its laws are restricted to that realm of existence (see also Goswami [27]). For example, Newton’s Third Law, which says “action equals reaction” [28], operates in the physical realm, where action means a physical action. There are some who think that many of the laws of physics have complementary forms in non-physical realms (Remark B7). For example, one can also talk about a verbal action or a mental action (Remark B8). Each has consequences that Newton’s Third Law, in its classical form, cannot account for (Remark B9). That is, if we consider not just physical actions, but also verbal (vocal) and mental (imaginative) actions, then we can question whether or not the laws of physics can or should be applied to these other two realms.
If we consider the poetic or the mystical realm as belonging to the realm of imaginative activity, then it is understandable that many scientists may not consider themselves mystical. But there are many great scientists who, even if they were not great poets or great mystics, had poetic and mystical tendencies. By “scientist”, in its purest form, we mean one who seeks knowledge in general; knowledge of the physical world is but a limited aspect of this search (Remark B10). Many cultural values, such as religious, educational, social, and family ones, influence or help to create our code of ethics. A scientifically oriented ethics allows a person to be a scientist, yet believe in a transcendental realm. Such an orientation would imply that we would study our own code of ethics and try to improve or modify it using observation (i.e., the effects of our conduct on the outer world as well as the inner world). Davis [29], in his essay discussing “Engineering Ethics”, distinguishes at least four different ways of studying ethics (Remark B11):
  • “…a system for ordinary morality, those standards of conduct that apply to everyone as moral agent.”
  • “...the art of living well.”
  • “…special morally-permissible standards of conduct that apply to the members of a group simply because they are members of that group.”
  • “…a field of philosophy (the attempt to understand ethics, in one or more of its other senses, as a rational undertaking).”
In exploring the relationship between creativity (a way of bringing meaning into one’s life) and science, Sir Peter Medawar ([30], p. 85) offers the following guidelines:
  • “We must study particulars and not abstractions: after all biologists do not study the nature of life: they study living things and likewise physicians study not the concept of illness but sick people…
  • A scientist will shun an explanation which while it has outwardly the form of an explanation, does no more in fact than interpret one unknown in terms of another…
  • A scientist will take evidence from all quarters likely to be informative, not excluding introspection, for no good would come of self-righteously abjuring such an important source of evidence.
  • A scientist must be resolutely critical, seeking reasons to disbelieve hypotheses, perhaps especially those which he has thought of himself and thinks rather brilliant.”
From this list we can identify several important concepts: studying specific problems, avoiding useless explanations, collecting evidence from various sources, and remaining critical.
Having discussed, albeit very briefly and selectively, some aspects of science and ethics, in the next section we discuss some of the ideas of classical thermodynamics, especially the second law (the entropy law), and explore its possible connections to ethics.

3. Classical Thermodynamics: The First and the Second Laws

“The task of the theorist is to bring order into the chaos of the phenomena of nature, to invent a language by which a class of these phenomena can be described efficiently and simply. Here is the place for ‘intuition’ and here the old preconceptions, common among natural philosophers that nature is simple and elegant, has led to many successes. Of course physical theory must be based in experience, but experiment comes after, rather than before, theory. Without theoretical concepts one would neither know what experiments to perform nor be able to interpret their outcomes.”
—Truesdell ([31], p. 52)
According to Callen ([32], p. 11), Leibniz in 1693 was the first person to state a conservation principle, for the sum of the kinetic energy ($\tfrac{1}{2}mv^2$) and the potential energy ($mgh$) of a simple mechanical mass point. As more complicated systems were studied, initially it appeared that the established form of this conservation principle had failed, but later it was shown that with the addition of new mathematical terms, i.e., new kinds of energy (which were later shown physically to exist), the conservation principle held its place. Thus, in a sense, the first law of thermodynamics has been modified but never violated.
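Leibniz’s conserved quantity can be checked with a short numerical sketch (ours, not from the source): integrating a mass point in free fall and watching the sum of kinetic and potential energy stay essentially constant (the small residual drift is an artifact of the finite time step).

```python
# Leibniz's conserved quantity for a falling mass point:
# E = (1/2) m v^2 + m g h stays constant along the motion.
m, g = 2.0, 9.81          # mass (kg), gravitational acceleration (m/s^2)
h, v = 10.0, 0.0          # initial height (m) and velocity (m/s)
dt = 1e-5                 # small time step for the explicit integration

E0 = 0.5 * m * v**2 + m * g * h   # initial total energy

for _ in range(50_000):          # integrate 0.5 s of free fall
    v -= g * dt                  # dv/dt = -g
    h += v * dt                  # dh/dt = v

E1 = 0.5 * m * v**2 + m * g * h  # energy after the fall
print(abs(E1 - E0) / E0)         # relative drift, tiny for small dt
```

Kinetic energy grows exactly as potential energy is lost; refining `dt` shrinks the residual drift further.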
Before we look at the (four) laws of thermodynamics, we will discuss concepts such as systems, states, processes (paths), stability, etc. According to classical thermodynamics (Remark B12), there are three (or four) types of systems (see Massoudi [33]).
  • An isolated system is a system which does not exchange energy or matter with the outside environment. An adiabatically isolated system is one in which the transfer of heat and matter is excluded, but other forms of energy can be transferred across the boundaries (Denbigh [34]).
  • A closed system is one in which only energy is exchanged with its environment. Earth is an example of such a system.
  • An open system is one in which both matter and energy can be exchanged with the environment. (Remark B13)
Within a closed system, for example the earth, it is possible to have isolated sub-systems, closed sub-systems, and open sub-systems. Most naturally occurring systems, such as human beings and other living creatures, and “unnatural” systems, such as refrigerators, are open systems, and in these cases (the standard form of) the entropy law does not apply. Within any system, it is possible to have many sub-systems (also called elements, constituents, …) which move (or can be transformed) from one state to another. The process (or path) by which this transformation takes place can be reversible (which implies frictionless processes where there is no dissipation) or irreversible (Remark B14). A state may be in equilibrium (Remark B15) or non-equilibrium, stable or unstable. In general, all these concepts have to be defined before we can discuss the laws of thermodynamics in a meaningful way.
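The three system types can be summarized in a small illustrative sketch (the class and example names are ours, not Massoudi’s [33]): a system’s type is fixed by which quantities may cross its boundary.

```python
from dataclasses import dataclass

# Toy classification: a system's thermodynamic type follows from
# which quantities (energy, matter) may cross its boundary.
@dataclass(frozen=True)
class System:
    name: str
    exchanges_energy: bool
    exchanges_matter: bool

    def kind(self) -> str:
        if self.exchanges_matter:
            return "open"
        return "closed" if self.exchanges_energy else "isolated"

thermos  = System("ideal thermos", exchanges_energy=False, exchanges_matter=False)
earth    = System("Earth",         exchanges_energy=True,  exchanges_matter=False)
organism = System("living cell",   exchanges_energy=True,  exchanges_matter=True)

for s in (thermos, earth, organism):
    print(s.name, "->", s.kind())
```

As in the text, the Earth exchanges (radiative) energy but essentially no matter, so it classifies as closed, while living organisms exchange both and are open.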
Callen ([32], pp. 13–32) has provided an outstanding and very concise summary of classical thermodynamics; he has presented the four postulates of thermodynamics as:
Postulate I.
“There exist particular states (called equilibrium states) of simple systems that, macroscopically, are characterized completely by the internal energy $U$, the volume $V$, and the mole numbers $N_1, N_2, \ldots, N_r$ of the chemical components.” ([32], p. 13)
We can think of equilibrium states as states which are time independent; it is intrinsically assumed in classical thermodynamics that “in all systems there is a tendency to evolve toward states in which the properties are determined by intrinsic factors and not by previously applied external influences,” (Callen, [32], p. 13). Furthermore, we often talk about quasi-static processes. These generally refer to the cases where things are happening very slowly or in a technical sense where processes are occurring with “infinite slowness.”
Postulate II.
“There exists a function (called the entropy $S$) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property: The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.” ([32], p. 27)
At this stage of the development of classical thermodynamics, it is important to recognize that there are no references to this entropy function being related to order, disorder, or chaos in the system. As mentioned by Callen ([32], p. 26): “The single, all-encompassing problem of thermodynamics is the determination of the equilibrium state that eventually results after the removal of internal constraints in a closed, composite system.” For our purpose here, it is not important to discuss composite systems. Postulate II is at the core of classical thermodynamics, for as mentioned by Callen ([32], p. 28), if the fundamental relation of a particular system, i.e., the relation which specifies the entropy as a function of the extensive parameters, is known, then all “conceivable” thermodynamical information about that system can be ascertained. This fundamental relationship can have the following form:
$S = S(U, V, N_1, \ldots, N_r)$ (1)
Postulate III.
“The entropy of a composite system is additive over the constituent subsystems. The entropy is continuous and differentiable and is a monotonically increasing function of the energy.”([32], p. 28)
And according to this Postulate, we can define a non-negative quantity called the temperature such that:
$(\partial S / \partial U)_{V, N_1, \ldots, N_r} > 0$ (2)
Since Equation (1) is a single-valued continuous and differentiable function, it can be inverted and solved for U such that:
$U = U(S, V, N_1, \ldots, N_r)$ (3)
And thus, we can talk about minimizing the internal energy $U$ instead of maximizing the entropy $S$. In simple language, if the total entropy of a system is known as a function of the various extensive parameters of the subsystems, then by differentiating it, and on the basis of the second derivative, we can see whether the extrema can be classified as minima, maxima, or horizontal inflections. Thus, we are now in a position to talk about stable or unstable equilibrium states (Callen [32], p. 31).
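The entropy-maximum reasoning above can be sketched numerically (our construction, not Callen’s text): two ideal-gas subsystems share a fixed total energy, the equilibrium split maximizes the total entropy, and the negative second derivative confirms that the extremum is a maximum.

```python
import math

k = 1.0                       # work in units where Boltzmann's constant = 1

def S(U, N):                  # monatomic ideal-gas entropy, volume fixed
    return 1.5 * N * k * math.log(U / N)

N1, N2, U_tot = 1.0, 3.0, 8.0

def S_total(U1):              # entropy of the composite system
    return S(U1, N1) + S(U_tot - U1, N2)

# scan the constrained states and pick the entropy-maximizing split
grid = [u / 1000 * U_tot for u in range(1, 1000)]
best_U1 = max(grid, key=S_total)

# analytic maximum: equal temperatures  =>  U1/N1 == U2/N2
exact_U1 = U_tot * N1 / (N1 + N2)

# second derivative is negative at the extremum: it is a maximum
h = 1e-4
d2 = (S_total(best_U1 + h) - 2 * S_total(best_U1) + S_total(best_U1 - h)) / h**2

print(best_U1, exact_U1, d2)
```

The scan lands on the analytic split, where both subsystems have the same energy per particle, i.e., the same temperature.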
Postulate IV.
“The entropy of any system vanishes in the state for which $(\partial U / \partial S)_{V, N_1, \ldots, N_r} = 0$ (that is, at zero temperature).” ([32], p. 30)
The vanishing of this derivative indicates the vanishing of the temperature, which means that zero temperature implies zero entropy. This is an extension, due to Planck, of the so-called third law of thermodynamics.
As Callen ([32], p. 27) points out, based on previous experience with many physical theories it is expected that a suitable equilibrium criterion would be expressed in terms of an extremum principle where the values of the extensive parameters in the final equilibrium state are simply those that maximize some physically meaningful function, which in our case is the entropy S . In general, within the context of classical thermodynamics, we can talk about mechanical, thermal, or chemical equilibrium. As stated by Callen ([32], p. 203):
“The basic extremum principle of thermodynamics implies that $dS = 0$ and that $d^2S < 0$, the first of these conditions stating that the entropy is an extremum and the second stating that the extremum is, in particular, a maximum.”
We can recall from classical mechanics that, similarly, the stable equilibrium of a rigid pendulum is at the position where the potential energy is minimum. One of the interesting stability (Remark B16) criteria in thermodynamics is Le Chatelier’s Principle, whereby (Callen [32], p. 211) “any inhomogeneity that somehow develops in a system should induce a process that tends to eradicate the inhomogeneity.” That is, given enough time, there seems to be a tendency for fluctuations and disturbances to die out and for the system to return to its equilibrium state. For example, unless externally enforced and sustained, turbulence tends to damp out within a fluid flow, and inhomogeneities in concentration diffuse to uniformity.
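The diffusive return to uniformity mentioned above can be illustrated with a minimal finite-difference sketch (the parameters are ours and purely illustrative): an initial concentration spike on a ring of cells decays toward a uniform state while the total amount of matter is conserved.

```python
# An initial concentration spike in a 1-D diffusive system decays toward
# uniformity (explicit finite-difference diffusion on a periodic ring).
n, D, dt, dx = 50, 1.0, 0.1, 1.0      # cells, diffusivity, time/space steps
c = [0.0] * n
c[n // 2] = 1.0                        # inhomogeneity: all matter in one cell

def spread(c):                         # one explicit diffusion step
    return [c[i] + D * dt / dx**2 *
            (c[(i - 1) % n] - 2 * c[i] + c[(i + 1) % n])
            for i in range(n)]

for _ in range(5000):
    c = spread(c)

print(max(c) - min(c))                 # nearly zero: uniform concentration
print(sum(c))                          # total amount is conserved (~1.0)
```

The step obeys the explicit-scheme stability condition ($D\,\Delta t/\Delta x^2 \le 1/2$), so the inhomogeneity decays monotonically rather than oscillating.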
The science of thermodynamics is primarily based on experiment, and much of classical thermodynamics rests on the four fundamental laws of thermodynamics (see Kestin [35]). By the time the now well-known two laws of thermodynamics were formulated, it was recognized that another fundamental law had to precede them; this has become known as the Zeroth law, formulated by Fowler in 1931, which states (Kestin [35]): “Two systems in thermal equilibrium with a third system are in thermal equilibrium with each other.” Also, according to Kestin and Dorfman ([36], p. 15): “The Third law asserts that the entropy difference in an isothermal process tends to zero as the thermodynamic temperature tends to zero.” There are other ways of expressing this law, as for example done by Nernst, Planck, etc. The first law of thermodynamics, also known as the law of conservation of energy, states that “all matter and energy in the universe is constant, that it cannot be created or destroyed.” (Remark B17) Feynman et al. ([28], Chapter 4) say:
“There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no exception to this law—that is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in the manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens...It is important to realize that in physics today, we have no knowledge of what energy is...It is an abstract thing in that it does not tell us the mechanism or the reasons for the various formulas.”
Interestingly, as Asimov [37] says:
“No one knows why energy is conserved, and no one can be completely sure it is truly conserved everywhere in the universe and under all conditions. All that anyone can say is that in over a century and a quarter of careful measurement, scientists have never been able to point to a definite violation of energy conservation, either in the familiar everyday surroundings about us, or in heavens above or in the atoms within.”
The first law (Remark B18) of thermodynamics does not say anything about the direction of this change, this transformation from one state to another state; it is the second law of thermodynamics which provides this information (Remark B19). There does not seem to be any controversy or disagreement with regard to the form or the interpretation of the first law of thermodynamics: it is accepted that energy is conserved. In most textbooks on thermodynamics, energy is usually defined as the capacity of a system to perform work. In physics, the known forms of energy are chemical, potential, kinetic, nuclear, etc., which in some ways can be measured or quantified. Physics does not directly specify or discuss whether there is any energy due to one’s thoughts and whether the energy caused by one’s thoughts and one’s words is conserved (Remark B20).
The entropy law has been known for more than 150 years. While there have been many attempts to challenge its status as a law of physics, such as the many claims of perpetual motion machines of the first or second kind, it has never been disproved. There are different ways of looking at the Second Law. One could look at it from the perspective of classical thermodynamics or of statistical thermodynamics. It could also be related to irreversibility and disorder (Remark B21). The important thing about the entropy law (Remark B22) is “that it is neither a theorem deducible from the principles of classical mechanics nor a reflection of some man’s imperfections or illusions. On the contrary, it is as independent a law as, for example, the law of universal attraction, and just as inexorable.” (Georgescu-Roegen, [38], p. 9). Entropy in a classical thermodynamical process is related to the free energy of that system. As Lindsay [3] explains: “If the essence of the first principle in everyday life is that we cannot get something for nothing, the second principle emphasizes that every time we do get something we reduce by a measurable amount the opportunity to get that something in the future, until ultimately the time will come when there will be no more getting.”
According to Seifert [39] and others, Clausius in 1850 invented the concept of entropy (Remark B23) “to describe and measure the loss in available energy which occurs in most natural processes.” Feynman et al. [28], on the other hand, say: “The so-called second law of thermodynamics was thus discovered by Carnot before the first law!” There are two well-known statements of the second law in classical thermodynamics (see Cengal [40]). In one case the emphasis is on the efficiency of the conversion of heat into work (Remark B24), and in the other case on the irreversibility (Remark B25) of nature. According to Adkins [41], these statements are:
The Kelvin Statement (Remark B26):
No process is possible whose sole result is the complete conversion of heat into work.
The Clausius Statement:
No process is possible whose sole result is the transfer of heat from a colder to a hotter body.
The first statement indicates that it is impossible to achieve 100 percent efficiency in the conversion of heat into work, and the second statement indicates that the natural tendency of heat is to flow from hotter to colder bodies, unless there is an external agent (such as extra work) to reverse this direction (refrigeration being an example of such a process (Remark B27)). These statements explain the impossibility of certain forms of perpetual motion. It is because of the first law that we cannot have a perpetual motion machine of the first kind, a machine operating continuously by creating its own energy; and it is because of the second law that we cannot have a perpetual motion machine of the second kind, a machine “which runs continuously by using the internal energy of a single heat reservoir” (Adkins, [41], p. 54). As indicated by Denbigh [34], properly speaking, the second law of thermodynamics applies only to isolated and adiabatically isolated systems (as defined earlier), whereas for closed or open systems the entropy may actually decrease “due to an outward passage of heat and/or matter across the boundary surface.”
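The Kelvin statement has a well-known quantitative counterpart, the Carnot bound $\eta = 1 - T_{cold}/T_{hot}$, which caps the efficiency of any heat engine strictly below 100% whenever the cold reservoir is above absolute zero. A minimal sketch (the reservoir temperatures are illustrative):

```python
# Carnot bound: no heat engine operating between two reservoirs can beat
# eta = 1 - T_cold/T_hot, which is < 1 for any T_cold > 0.
def carnot_efficiency(t_hot_K, t_cold_K):
    if not (t_hot_K > t_cold_K > 0):
        raise ValueError("need T_hot > T_cold > 0 (kelvins)")
    return 1.0 - t_cold_K / t_hot_K

# a steam-turbine-like pair of reservoirs (illustrative numbers)
eta = carnot_efficiency(t_hot_K=800.0, t_cold_K=300.0)
print(eta)   # even the ideal reversible engine rejects some heat
```

For these numbers the bound is 0.625: even an ideal reversible engine must reject 37.5% of the input heat to the cold reservoir, in line with the Kelvin statement.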
Up to now, I have attempted to provide a brief summary of the relevant issues from the perspective of classical thermodynamics, and I have also tried to discuss important ideas such as systems, stability, equilibrium, etc. In the next and final section of this paper, I take a “leap of faith”: I will first look at how the concepts of (dis)order and (dis)organization, through the introduction of statistical and irreversible thermodynamics, have been used for proposing ethical imperatives. I will then show that classical thermodynamics can also be used and, with a similar leap of faith, can provide another thermodynamic imperative. Section 4 will have more of a philosophical twist, with some conjectures about the relationship between entropy (mainly from a classical thermodynamics perspective) and ethics.

4. Brief Remarks on Possible Ethical Aspects of the Entropy Law

“Even when the individual believes that science contributes to the human ends which he has at heart, his belief needs a continual scanning and re-evaluation which is only partly possible... And if we adhere simply to the creed of the scientist, that an incomplete knowledge of the world and of ourselves is better than no knowledge, we can still by no means always justify the naive assumption that the faster we rush ahead to employ the new powers for action which are opened up to us, the better it will be.”
—Norbert Wiener [42]
Most of what we discussed in the previous section pertains to the concept of entropy as developed and used in classical thermodynamics. However, according to Prigogine ([43], p. 9), Boltzmann was “the first to note that entropy is a measure of molecular disorder,” and he concluded “that the law of entropy increase is simply a law of increasing disorganization.” This is perhaps the beginning of the introduction of new ideas, such as order and disorder (Remark B28), into thermodynamics. Lindsay [3], like many other physicists, looked at entropy as a measure of the orderliness of a system (Remark B29): “Increase in entropy means a transition from a more orderly state to a less orderly state.” He furthermore indicated that:
“In any naturally occurring process, the tendency is for all systems to proceed from order to disorder. The maximum entropy of Clausius is the state of complete disorder or thorough randomness, out of which no return to order is practically possible because it applies to the universe as a whole; nothing short of an inexpressibly improbable revolution could reverse the process and decrease the entropy. From this point of view, the trend from order to disorder with production of entropy is inexorable. The second law always wins in the end. A gloomy outlook indeed! But, there is perhaps a silver lining in the cloud.”
It is through living organisms, the conscious and aware ones, that the hope perhaps lies. A complex system (Remark B30) may or may not be a disorderly one. For example, to build a house one has to organize various elements such as bricks, pipes, wires, etc., in an orderly fashion. According to Denbigh [44]: “A system is said to be organized when it has a function, or a set of functions; that is to say when the system in question is able to do certain things, or can be made to do them.” Therefore, while a house is a complex system (Remark B31) (in a relative sense), it is also an orderly system. However, if the house is destroyed, due to an earthquake for example, while the level of complexity may not have changed, the disorder has certainly increased. Clearly, organization bears an important relationship to the notion of complexity, which Denbigh [44] defines as: “…a measure of the number of distinct components (e.g., atoms, molecules, cells, etc.,) that a particular system can contain under the prevailing physical conditions.” Not all living organisms make a system more orderly; additional and conscious efforts of sorting or ordering are needed. As Orrell ([45], p. 11) explains: “A defining property of complex systems is that they exhibit what is known as emergent behavior (see also [46]): properties which emerge from the system but cannot be predicted using the knowledge of the system’s components alone.” Living organisms are complex systems, and in living organisms we talk not only about the production but also the consumption of entropy (Remark B32).
An area where physics can provide some ethical guidelines, and in fact some warning signs, is the application of the entropy law, not only in the scientific and environmental realms, but perhaps also on a personal level, in relation to our livelihood. While there are categories of ethics dealing with different aspects of the environment, including animal rights and the rights of nature (see [47,48,49,50,51,52,53]), the entropy law, in my opinion, can also be used to suggest certain ethical guidelines. The (ethical) consequences of the entropy law are generally stated in the form of Consequentialist Ethics or a form of Deontological Ethics. Lindsay [3], following the deontological approach, suggests:
“...while we do live we ought always to act in all things in such a way as to produce as much order in our environment as possible, in other words to maximize the consumption of entropy. This is the thermodynamic imperative, a normative principle which may serve as the basis for a persuasive ethic in the spirit of the Golden Rule and Kant’s categorical imperative.”
Thus, in a sense, Lindsay seems to be associating the maximization of the consumption of entropy with the creation of some type of order in the environment. That is, by intentionally consuming entropy, i.e., by organizing, sorting and putting things in order, we create a more orderly state of living. There is conscious effort involved here. This statement can be put in the following form, as a Thermodynamical Imperative (TI):
One ought to do things, in so far as possible, in such ways that the consumption of entropy is maximized (Lindsay [3]); this imperative is referred to below as TI.2.
This imperative, however, does not address the issue of how the disorder came into existence in the first place and whether it was caused naturally. For example, a tornado will cause much destruction and create a lot of disorder in its path. Undoubtedly, we as humans, by cleaning up the mess caused by a tornado, i.e., by consuming the existing entropy, can help to create a more orderly, organized and cleaner environment. However, not all disorders are caused by natural phenomena; many are man-made, and because of this, I think the imperative suggested by Lindsay needs to be supplemented by another one which should precede it. That is, we should try to do things, in so far as possible, in such ways that we minimize the production of entropy. Stating this in the form of a Thermodynamical Imperative, we have
One ought to do things, in so far as possible, in such ways that the production of entropy is minimized; this imperative is referred to below as TI.1 (Remark B33).
Thus we can say that while this statement is concerned with conservation, simplicity, and harmony, the imperative proposed by Lindsay [3] involves ordering, sorting, and organizing.
While most modern technical inventions such as organizers, computers, cell phones, etc., can order and organize things for us (the mess which we have created through our entropy production), there are few or no (modern) inventions which help us live simply and in greater harmony and peace with our environment. These things are more difficult to do; they depend qualitatively on the individual and are of a more personal nature. How to live in harmony and more simply comes from within each individual and is related to our worldview, to our perception of how we fit in the world, and to what our responsibilities are to the world. For example, TI.1 suggests or implies that we should abstain from using things such as minerals and other resources which are not renewable [54]. If we do need to use them, then we should take measures to protect against excessive use. It is only after enforcing this imperative that we would apply TI.2. For example, recycling is primarily a form of reorganizing the waste or the by-product; similarly, every time we re-order or re-organize a system, for whatever reason, we can use TI.2. However, these activities at the same time involve the use of more available energy (see [55]). That is, activities such as emission control, improving efficiency, etc., on a large scale, are actions we can and ought to do in order to organize or minimize the waste or pollution. This is indeed necessary and useful. However, these types of activities do not address the more fundamental and underlying issue, namely: how or why was the waste created in the first place? This type of questioning requires the implementation of TI.1. For example, if we walk into a room and see clothes, shoes, books, etc., all over the room, we can immediately recognize it as a disorganized environment. By intentionally consuming the entropy, i.e., the disorder, through our efforts, we can make the room more organized.
This, however, does not address how or why the room came to be in such a disordered state. As Ben-Naim ([56], p. 188) says: “The Second Law of Thermodynamics states that in any spontaneous process in an isolated system, the entropy increases. The Second Law does not state anything as to why the entropy increases, nor does it address the question as to why a spontaneous process occurs at all.”
Therefore, the issue is not whether we should maximize the consumption of entropy (TI.2) (this we should do), but how we can minimize the production of entropy (TI.1). In my opinion, this is a question that has been addressed by the World’s Wisdom traditions (see [57,58]), where we are encouraged to preserve and protect nature and all beings in it, to the extent possible, and to conserve and to let things be. Yet, we see a world rapidly decaying, fast becoming a trash-bin where we discard and dump things so easily, not realizing or not caring that there is no place for the waste to go; it will have to come back to us. I think in some ways we can relate TI.1 to a measure of simplicity in our life (Remark B34). Whether or not and how much each of us can simplify our life remains an individual choice and decision (Remark B35). For example, as Thoreau ([59], p. 323) said:
“I learned this, at least, by my experiment; that if one advances confidently in the direction of his dreams, and endeavors to live the life which he has imagined, he will meet with a success unexpected in common hours. He will put some things behind, will pass an invisible boundary; new, universal, and more liberal laws will begin to establish themselves around and within him; or the old laws be expanded, and interpreted in his favor in a more liberal sense, and he will live with the license of a higher order of beings. In proportion as he simplifies his life, the laws of the universe will appear less complex, and solitude will not be solitude, nor poverty poverty, nor weakness weakness. If you have built castles in the air, your work need not be lost; that is where they should be. Now put the foundations under them.”
A dead organism (Remark B36) perhaps has the lowest production of entropy, since it is only through external means such as soil, insects, wind, moisture, heat, etc., that it is able to exchange mass with its surroundings. For a living being, perhaps the state of lowest production of entropy is in some sense similar to that of a monk or a nun or any person capable of sitting quietly in meditation, in perfect Samadhi [attention, concentration], and in full control of his or her senses, simply breathing in and breathing out, not engaged in any physical, vocal, or even mental activities. On the surface, any individual who can sit still for a while with a closed mouth would also fit this category; however, the difference between this individual and the one whose mind is at peace is that the latter does not produce or create any thoughts of anger, jealousy, hatred, or greed (all related to high-entropy activities). Instead, the person in the state of Samadhi can actually guide his or her peaceful thoughts towards all beings by producing or generating thoughts of harmony, love, and compassion. Praying and contemplative meditations are also forms of this type of activity (Remark B37). While most of us are not able to sit still in meditation for long periods on any given day, we can find the right balance in our life.
In so far as our external activities, i.e., our physical or verbal actions, are concerned, we can try to follow TI.2 as closely as possible (Remark B38). For our mental activities, we can try to follow TI.1 by keeping our mind at peace and acting in a compassionate manner, with loving-kindness towards all beings.
The application of TI.1 is not limited to mental actions; this imperative also has practical implications for our physical actions, for example, how we (ought to) live: issues related to our livelihood, our source of income, how we spend our money, and whether we walk or bike to work instead of driving... Ultimately these are decisions made by an individual; they can be guided by the application of the entropy law. As mentioned by Brillouin ([60], p. 558), the second law of thermodynamics “is a death sentence, but it contains no time limit… The principle states that in a closed system S will increase and high-grade energy must decrease; but it does not say how fast… The second principle is an arrow pointing to a direction along a one-way road, with no upper or lower speed limit.” It is necessary to make a distinction between different forms of energy. Georgescu-Roegen ([61], Chapter 3) says: “The free energy to which man can have access comes from two distinct sources. The first source is a stock, the stock of free energy of the mineral deposits in the bowels of the earth. The second source is a flow, the flow of solar radiation intercepted by the earth.” The stock of free energy in the form of mineral deposits is similar to one’s money in a savings account, whereas the flow of solar radiation is similar to one’s income deposited regularly in a checking account. Cloud [62] indicates: “Here is where entropy becomes critical. All of these minerals, without exception, are produced, beneficiated, and transformed into low-entropy products at a cost in available energy. All of the energy utilized is concentrated and brought to focus on useful work at a cost in materials. All of this energy is also irreversibly transformed into unavailable, high-entropy states. All of the low-entropy metals and other mineral products utilized become disordered high-entropy waste with time. They can only be restored to states useful to man, recycled that is, with further inputs of energy. Entropy inexorably increases.” When natural deposits are taken out, processed, and used, they either can never be replaced, or it will take thousands or millions of years before the stocks reach the point where they can again be used as a source of energy. The use of solar energy, advocated by the energy conservation movement, on the other hand, presents for all practical purposes an unlimited source of energy. In summary, we can rephrase these two imperatives, in the following order, as the two thermodynamic imperatives (TI):
TI.1: We ought to do things, in so far as possible, in such ways that the production of entropy is minimized.
TI.2: We ought to do things, in so far as possible, in such ways that the consumption of entropy is maximized (Lindsay [3]).
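Georgescu-Roegen's stock/flow distinction quoted above lends itself to a toy calculation (my own sketch; all quantities are invented for illustration, not empirical data):

```python
# Sketch of Georgescu-Roegen's two sources of free energy: a finite stock
# (mineral deposits -- a "savings account") versus a steady flow (solar
# radiation -- a "checking account"). All numbers are illustrative only.

def years_until_depletion(stock: float, annual_draw: float) -> float:
    """How long a non-renewable stock lasts at a constant rate of use."""
    return stock / annual_draw

# Living off the stock alone: exhausted in finite time, however large.
print(years_until_depletion(1000.0, 50.0))  # 20.0

# Living off the flow: any demand up to the inflow rate can be met
# indefinitely, since the flow is intercepted, not used up.
solar_inflow_per_year = 50.0
demand_per_year = 40.0
print(demand_per_year <= solar_inflow_per_year)  # True
```

The point is only qualitative: a stock, however large, is exhausted in finite time, while a flow constrains only the rate of use, which is why TI.1 bears on the stock far more than on the flow.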
Norbert Wiener, in his article titled “Some moral and technical consequences of automation” [42], warned us about the dangers of blindly following scientific adventures: “It is my thesis that machines can and do transcend some of the limitations of their designers, and that in doing so they may be both effective and dangerous.” He compared machines with slaves: “Complete subservience and complete intelligence do not go together.” We cannot and should not expect that science or technology will solve all our problems. They cannot. And as these two imperatives indicate, we as individuals, especially those familiar with the laws of physics in general, and specifically with the entropy law, have a responsibility to implement these ethical imperatives in our daily lives, as much as we can. Tagore ([63], p. 101) says:
“I ask once again, let us, the dreamers of the East and the West, keep our faith firm in the Life that creates and not in the Machine that constructs—in the power that hides its force and blossoms in beauty, and not in the power that bares its arms and chuckles at its capacity to make itself obnoxious. Let us know that the Machine is good when it helps, but not so when it exploits life; that Science is great when it destroys evil, but not when the two enter into unholy alliance.”

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

As mentioned earlier, what is usually referred to as the entropy law or the second law of thermodynamics is really only a postulate, albeit one which has been supported by a wealth of empirical evidence. This postulate has certain properties, one being additivity, which implies that if a system has many parts, the total entropy is equal to the sum of the entropies of the parts. Now, if we designate the total entropy of a system as S, then its change dS is divided into two parts (see Gal-Or [64]):
$dS = d_e S + d_i S$ (A1)
where $d_e S$ is the change in the entropy due to interactions with the external world, and $d_i S$ is the entropy generation due to irreversible processes occurring inside the system. Now the entropy law is simply stated as:
$d_i S \geq 0$ (A2)
According to the general theory of non-equilibrium thermodynamics (see de Groot and Mazur [65]), $d_i S$ can be related to the irreversible processes occurring in the system (see Curran [66], p. xx), if the system is not “too far” from equilibrium, where:
$T \dfrac{d_i S}{dt} = \Phi = \sum_i J_i X_i \geq 0$ (A3)
where $J_i$ is the rate of the i-th irreversible process, $X_i$ is the “force” causing it, and $\Phi$ is the dissipation function, the total free-energy dissipation by the system. For living organisms, the second law requires that the sum in the above equation be greater than zero. That is, it is possible that one or more of the $J_i X_i$ terms may be negative, as long as $\Phi \geq 0$. In other words, the thermodynamic equilibrium of the system can be defined as the state when all irreversible processes stop and $d_i S = 0$. Furthermore, if the system is an isolated one, then by definition there is no exchange of mass or energy with the external world and thus $d_e S = 0$, and as a result the above two equations reduce to:
$dS \geq 0$ (for an isolated system) (A4)
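The bookkeeping expressed by the entropy balance above can be made concrete with a toy calculation (my own sketch; the numbers are arbitrary, chosen only to show the signs at work):

```python
# Sketch of the entropy balance dS = deS + diS with diS >= 0.
# deS (exchange with the surroundings) may have either sign, so an open
# system can show dS < 0 by exporting entropy; an isolated system has
# deS = 0, leaving dS = diS >= 0. Values are illustrative only.

def total_entropy_change(de_s: float, di_s: float) -> float:
    """Combine the exchange and production terms; the second law
    constrains only the internal production term."""
    if di_s < 0:
        raise ValueError("internal entropy production diS must be >= 0")
    return de_s + di_s

# Open system exporting entropy faster than it produces it: dS < 0.
print(total_entropy_change(de_s=-3.0, di_s=1.0))  # -2.0

# Isolated system (deS = 0): entropy cannot decrease.
print(total_entropy_change(de_s=0.0, di_s=1.0))   # 1.0

# Dissipation function: Phi = sum of J_i * X_i. A coupled term may be
# negative (a process driven "uphill") as long as the total is >= 0.
fluxes = [2.0, -0.5]  # J_i (illustrative)
forces = [1.0, 1.0]   # X_i (illustrative)
phi = sum(j * x for j, x in zip(fluxes, forces))
print(phi)  # 1.5
```

This is the sense in which, as noted above for Denbigh's Systems 2 and 3, the entropy of an open system may decrease without violating the second law: the decrease is paid for by an outward flow of entropy.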
It is also possible to relate the change of entropy content of a system in an irreversible process ($dS_{irrev}$) to that in a reversible process ($dS_{rev}$), such that (see Aoki [67], p. 168):
$dS_{irrev} > dS_{rev}$ (A5)
where $dS_{rev} = 0$, as stated in textbooks. However, as mentioned by Aoki ([67], p. 168), this equation does not apply to open systems, such as biological systems, which are not isolated (Remark B39). In that case, it is helpful to think of the entropy content as having two parts: the entropy flow $d_e S$ and the entropy production $d_i S$, as shown by (A1). For open systems, we use the entropy production as the criterion, such that the entropy production in an irreversible process, $d_i S_{irrev}$, is always greater than $d_i S_{rev}$, and the latter is zero (see Nicolis and Prigogine [68]), that is:
$d_i S_{irrev} > d_i S_{rev} = 0$ (A6)
Nicolis and Prigogine [68] state that in open systems which are near equilibrium, $d_i S_{irrev}$ always decreases with time and approaches a minimum value. This is called the minimum entropy production principle, where:
$\dfrac{d}{dt}\left[ d_i S_{irrev} \right] < 0$ (A7)
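As an illustrative sketch of the minimum entropy production principle (the exponential relaxation below is an assumed toy model of my own, not taken from Nicolis and Prigogine):

```python
# Toy illustration of the minimum entropy production principle: near
# equilibrium, the entropy production rate decays monotonically toward a
# minimum as the system approaches a steady state. The exponential form
# and all parameter values are assumed for illustration only.
import math

def entropy_production(t: float, p_min: float = 0.2,
                       p0: float = 1.0, tau: float = 5.0) -> float:
    """Illustrative relaxation of the entropy production rate toward p_min."""
    return p_min + (p0 - p_min) * math.exp(-t / tau)

rates = [entropy_production(t) for t in range(0, 30, 5)]
# Strictly decreasing: the time derivative of the production rate is < 0.
assert all(a > b for a, b in zip(rates, rates[1:]))
print([round(r, 3) for r in rates])
```

The monotone decrease of the sampled values is the discrete counterpart of the inequality above; the production never falls below its minimum steady-state value.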
We should mention that, in general, for open and living systems, there does not seem to be a universally accepted form of the entropy principle (see Aoki [67], p. 1680).

Appendix B

Remark B1.
Historically, it seems that Clausius coined the term “entropy” in 1865, from the Greek word τροπή (see Kondepudi, [69], p. 112), meaning transformation. The root of this word is the Greek “tropy”, meaning “turn”, and the prefix “en” means “in”. Thus, entropy directly implies “in turn” or “turn in”. According to Lemons ([70], p. 8), it was the intention of Clausius to give a proper name to a concept that shows how physical systems “change, evolve or proceed”.
Remark B2.
In fact, as pointed out by Ostwald [9]: “Great discoveries in natural sciences always bring in their train a far-reaching reform of general philosophical conceptions and modes of thought.” To the enquiry “Should scientists be ethical?” there can only be an affirmative answer. No scientist would intentionally say otherwise, although, perhaps most scientists may say that ethics belongs to a different realm of enquiry. The question then changes to “Whose ethics?”.
Remark B3.
As Bronowski ([71], p. 63) says: “Science is not a mechanism but a human progress, and not a set of findings but the search for them. Those who think that science is ethically neutral confuse the findings of science, which are, with the activity of science, which is not.”
Remark B4.
Peirce ([72], p. 350) says: “That which constitutes science, then, is not so much correct conclusions, as it is a correct method. But the method of science is itself a scientific result. It did not spring out of the brain of a beginner: it was a historic attainment and a scientific achievement. So that not even this method ought to be regarded as essential to the beginnings of science. That which is essential, however, is the scientific spirit, which is determined not to rest satisfied with existing opinions, but to press on to the real truth of nature. To science once enthroned in this sense, among many people, science in every other sense is heir apparent.”
Remark B5.
Science is considered by many, especially the “social” scientists, to be an instrument of policy, where interests of governments and industries are served. However, there are non-instrumental aspects of science (see Ziman [15]) where “critical scenarios” are created, “rational attitudes” are stimulated, and a class of “enlightened practitioners” and “independent experts” are produced.
Remark B6.
A layman’s perception of who a scientist is and what a scientist does is oftentimes very different from the perception of a scientist. Medawar ([73], p. 2) discusses this point: “The layman’s interpretation of scientific practice contains two elements which seem to be unrelated and all but impossible to reconcile. In one conception the scientist is a discoverer, an innovator, an adventurer into the domain of what is not yet known or not yet understood. Such a man must be speculative, surely at least in the sense of being able to envisage what might happen or what could be true. In the other conception the scientist is a critical man, a skeptic, hard to satisfy; a questioner of received beliefs. Scientists (in this second view) are men of facts and not of fancies, and science is antithetical to, perhaps even an antidote to, imaginative activity in all its forms.”
Remark B7.
Sacred science is based on the view that everything that we see or observe in the physical world (through our senses or instruments that we have devised) is only a manifestation of the Unseen, i.e., the spiritual world (see Nasr [74,75]). That is, the physical reality is but a reflection of the spiritual reality, and the physical laws are reflections of the higher laws operating in the spiritual realms. Science, especially as it has become known in the last few hundred years, operates in the physical realm, i.e., the lower and the more specific. The traditional science or sacred science looks at the physical reality as a manifestation, a limitation, or a projection of the spiritual world (see Wilber [76]).
Remark B8.
For an excellent discussion of the structure of action from a philosophical point of view, see Searle ([77], pp. 57–70) who proposes eight principles, the first of which is what I am using here.
Remark B9.
For example, it is very difficult to quantify what a mental force is or devise an instrument to measure it. It is difficult to deny the existence of such a force when we come into the presence of it, or into its field of action. Perhaps we can add a new field to the already existing ones and call it the Mental Field. In many of the spiritual traditions a law similar to Newton’s third law exists, which simply indicates that our actions, be they physical, verbal, or mental, have consequences. There are scientists who do not accept the possibility that our speech or our thoughts may have (physical) consequences. It is understandable if many scientists do not believe in other realms of possibilities, i.e., non-material realms. The following story perhaps can shed some light on this issue (Goldstein and Kornfield, [78], p. 9): “Once a master was called to heal a sick child with a few words of prayer. A skeptic in the crowd observed it all and expressed doubts about such a superficial way of healing. The master turned to him and said: ‘You know nothing of these matters; you are an ignorant fool!’ The skeptic became very upset. He turned red and shook with anger. Before he could gather himself to reply, however, the master spoke again, asking, ‘When one word has the power to make you hot and angry, why should not another word have the power to heal?’” Another example: one of the by-products of this spiritual force, which can be developed through years of practice (concentration and meditation), is the power to levitate. For example, it is known and observed, at least in India, that a Sadhu (an advanced practitioner of yoga) in deep Samadhi (a state of deep concentration) can levitate. There are no apparent forces acting on his body. Then how is it that gravity is not pulling him down? It appears that Newton’s law is violated. This is mentioned by the physicist, Raychaudhri ([79], p. 5): “We could say that there are supernatural phenomena where Newton’s laws do not apply. But I think we would prefer a different approach. We would bring a new force—the Yogic force (say)—into our physics. We would say that the Sadhu has been able to balance gravity with the help of this Yogic force. That saves Newton’s laws again. We would then proceed to find the Source of this new force, determine the laws, assuming hopefully that such laws do indeed exist, and finally proceed to find out the field equations. Perhaps the enterprising theoretical physicist would attempt a quantization of the field!”
Remark B10.
Sir Karl Popper (The Logic of Scientific Discovery, [80], p. 278) said: “Science is not a system of certain, or well-established, statements; nor is it a system which steadily advances towards a state of finality. Our science is not knowledge (episteme); it can never claim to have attained truth, or even a substitute for it, such as probability. Yet science has more than mere biological survival value. It is not only a useful instrument. Although it can attain neither truth nor probability, the striving for knowledge and the search for truth are still the strongest motives of scientific discovery.”
Remark B11.
It is only when we, as scientists, question our personal beliefs that we can see how scientific we are, i.e., how much observation and reasoning can influence our views. Our world-view is based on our experiences and our interpretations of these experiences, among other things. As Huston Smith ([81], p. 16) says: “Values, life meanings, purposes, and qualities slip through science like sea slips through the nets of fisherman. Yet man swims in this sea, so he cannot exclude it from his purview. This is what was meant when we noted earlier that a scientific world view is in principle impossible. Taken in its entirety, the world is not as science says it is; it is as science, philosophy, religion, the arts, and everyday speech say it is. Not science but the sum of man’s symbol systems, of which science is but one, is the measure of things.” I think bringing ethics into the domain of science and engineering is a challenge, especially for scientists and engineers (see Massoudi [30], Massoudi [31]). This encounter may require a change, or at least a shift, in our perception, where we become aware of different orders or structures within our field of study, giving rise to new orders and structures. As Bohm [32] says: “The whole process tends to form harmonious and unified totalities, felt to be beautiful, as well as capable of moving those who understand them in a profoundly stirring way.” There is room for emotions in science. Another difficulty that many scientists encounter, on a professional as well as a personal level, is whether there is any meaning, in a philosophical sense, in doing science. That is, can science give meaning to one’s life, or is this beyond its domain? Laszlo ([82], p. 12) said: “Meaningfulness in science is an important dimension, even if it is an often neglected one. Science is not only a collection of formulas, abstract and dry, but also a source of insight into the way things are in the world. It is more than just observation, measurement, and computation; it is also a search for meaning and truth. Scientists are concerned with not only the how of the world—the way things work—but also what the things of this world are and why they are the way we find them.” This may require a great deal of extrapolation and generalization, which might be difficult for many scientists to accept, as many scientists are more inclined toward specialization than generalization. To do so, in its bare elements, requires model-building and making many assumptions (Massoudi [83]). The assumptions are clearer if certain definitions are accepted, sometimes on an a priori basis.
Remark B12.
See for example, Adkins [41].
Remark B13.
For an interesting discussion on the statistical physics approach to open systems, see Annila [84].
Remark B14.
Atkins ([85], p. 35) points out that “A reversible process in thermodynamics is one that is reversed by an infinitesimal modification of the conditions in the surroundings.” Sommerfeld ([86], p. 19) says: “Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes, processes during which disturbed equilibria are being equalized. Instead of using the term ‘reversible process’ we can also speak of infinitely slow, quasi-static processes during which the system’s capacity for performing work is fully utilized and no energy is dissipated.”
Remark B15.
The equilibrium state is the state when entropy has reached a maximum and there is no longer any free energy available to perform additional work. Moran and Shapiro ([87], p. 165) indicate that: “When left to themselves, systems tend to undergo spontaneous changes until a condition of equilibrium is achieved, both internally and with their surroundings. In some cases equilibrium is reached quickly, in others it is achieved slowly.”
Remark B16.
For the concept of stability in thermodynamics of both closed and open systems and its relation to biocommunity, see Svirezhev [88]. Another type of stability is called asymptotic stability. As Nicolis and Prigogine ([89], p. 10) mention: “When a system is in a state such that the perturbations acting on it die out in time more or less quickly, we say that the state is asymptotically stable.”
Remark B17.
See for example, Georgescu-Roegen [38]; Rifkin [90], Kirwan [91].
Remark B18.
Truesdell ([92], p. 69) says: “I hesitate to use the terms ‘first law’ and ‘second law’, because there are almost as many ‘first and second laws’ as there are thermodynamicists, and I have been told by these people for so many years that I break their laws as to make me now exult in my criminal states and give non-condemning names to the concrete mathematical axioms I wish to use in my outlawed studies of heat and temperature. The term ‘entropy’ seems superfluous, also, since it suggests nothing to ordinary persons and only intense headaches to those who have studied thermodynamics but have not given in and joined the professional.”
Remark B19.
In summary, as Kestin and Dorfman ([36], p. 16) say: “The Zeroth and First laws and the first part of the Second law assure us of the existence of three important thermodynamic properties for the equilibrium states of all systems. These are: the energy, E, or the internal energy, U; the thermodynamic temperature, T; and the entropy, S. The Third law determines the behavior of entropy as T→0 and the second part of the Second law gives an indication of the direction of natural processes.”
Remark B20.
Bentov ([93], p. 100) explains: “A thought is energy that causes the neurons in the brain to fire in a certain pattern. That naturally produces tiny currents along definite paths in the brain cortex that can be picked up with sensitive instruments through electrodes on the surface of the skull. In other words, a thought that starts out as a tiny stir eventually develops into a full-fledged thought producing at least a 70 millivolt potential somewhere in the cortex. It fires the first neuron, which in turn causes others to fire in a certain sequence. However, in this universe no energy is lost. If we can pick up the current produced by the thought outside the head, it means that the energy of the thought was broadcast in the form of electromagnetic waves, and at the velocity of light into the environment and, finally, into the cosmos.”
Remark B21.
This law has been applied in various fields, even religion; Hiebert [94] says: “The first and second laws of thermodynamics have been used, affirmed, rejected, manipulated, exploited, and criticized in order both to further and to censure religion.” If by religion we mean blind faith in a system of thought, then not too many scientists can be considered religious. Morley (cf. Hiebert, [94]) says: “Where it is a duty to worship the sun it is pretty sure to be a crime to examine the laws of heat.”
Remark B22.
Eddington (The Philosophical Scientists, Foster, D. [95]) said: “The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations...But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.”
Remark B23.
We can perhaps make a distinction between the Second Law and the Entropy Law. While the former can be applied to the state of order in any system, the latter is applied to the “motion of atoms in gases under the influence of heat.”
Remark B24.
In classical thermodynamics, for practical purposes, as suggested by Laidler ([96], p. 34) “entropy is a property that helps to provide a numerical measure of the extent to which the heat in a system is unavailable for conversion into mechanical work.”
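Laidler’s description corresponds to two standard textbook relations, stated here as a sketch rather than as formulas taken from Laidler [96]: the Clausius definition of entropy, and the “lost work” associated with entropy generation.

```latex
% Clausius definition of entropy (reversible heat exchange at temperature T):
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}.
% Gouy--Stodola relation: energy made unavailable for mechanical work by an
% irreversible process, with T_0 the temperature of the surroundings:
W_{\mathrm{lost}} = T_{0}\,\Delta S_{\mathrm{total}}.
```

The second relation makes Laidler’s point quantitative: each unit of entropy generated renders T<sub>0</sub> units of energy unavailable for conversion into mechanical work.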
Remark B25.
For irreversible processes, entropy production occurs anywhere and at any time. In general, one can say that the higher the degree of irreversibility in a given process, the more entropy is produced; thus entropy production can be thought of as a measure of the irreversibility of a process.
Remark B26.
Interestingly, Lord Kelvin referred to entropy as an indication of “degradation of energy”. That is, mechanical power, considered to be a high-grade form of energy, has a tendency, under proper conditions, to be transformed into heat, considered a low-grade form of energy. Some scientists (see, for example, Brillouin [97]) refer to the grade of energy as the negative entropy (negentropy), a quantity that according to the second law of thermodynamics must always decrease in a closed system. That is, as Brillouin ([60], p. 557) says: “The second principle is often referred to as a principle of energy degradation. The increase in entropy means a decrease in quality for the total energy stored in the isolated system.”
Remark B27.
Seifert [39] in an article titled: “Can we decrease our entropy?” says: “Every time you light a cigarette, availability of energy goes down and, correspondingly, entropy goes up. True, you can reverse this process locally, and make heat flow from a cold body to a hot one, as is done every day in the electric refrigerator; but looking at the refrigerator and its power source as a single over-all system, there is more increase in entropy (loss in available energy) outside the cold box than decrease in entropy (gain in available energy) inside it. As Clausius put it, the energy of the world stays constant; the entropy of the world increases without limit.” It is not uncommon to read in popular science writings statements such as “The entropy of the universe is increasing.” However, as mentioned by Brillouin ([60], p. 561), this statement is really “beyond the limits of human knowledge.” Instead, we can focus on the earth; even though it is not a closed system, since it is “constantly receiving energy and negative entropy from outside—radiant heat from the sun…”, we can, to some extent, apply the techniques of thermodynamics to it.
Remark B28.
It is common to say, hear, or think that entropy is a measure of disorder in a system, as taught in many textbooks. This is, however, neither helpful nor accurate, since disorder is not easily defined or understood, and it may mean different things to different people. However, some have tried, within the context of Information Theory, to define and give a more accurate meaning to (Shannon’s) disorder (see Ben-Naim [56], p. 251), which is actually a measure of information and not disorder. For one of the earliest discussions of thermodynamical ideas and information theory, see Brillouin [98].
Remark B29.
Interestingly, many scientists (see Denbigh [34]) think that entropy should not be regarded as a measure of disorder or disorganization.
Remark B30.
As indicated by Nicolis and Prigogine ([89], p. 36): “…one of the essential features of complex behavior is the ability to perform transitions between different states. Stated differently, complexity is concerned with systems in which evolution, and hence, history, plays or has played an important role in the observed behavior.”
Remark B31.
As Carl Rogers ([99], p. 131) says: “…the more complex the structure-whether a chemical or a human—the more energy it expends to maintain that complexity. For example, the human brain, with only 2 percent of body weight, uses 20 percent of the available oxygen! Such a system is unstable, has fluctuations or ‘perturbations,’ as Prigogine calls them. As these fluctuations increase, they are amplified by the system’s many connections, and thus drive the system-whether chemical compound or human individual-into a new, altered state, more ordered and coherent than before. This new state has still greater complexity, and hence, even more potential for creating change.”
Remark B32.
I think this is perhaps a source of confusion in the application of the Entropy Law in daily life; I hope the distinction becomes clearer in this section.
Remark B33.
I am grateful to one of the reviewers for indicating that one could also argue that: “One ought to do things, so that free energy is consumed in least time.” This is an interesting concept. Since I do not talk about the concept of time and entropy in an explicit way in this paper, I tried to avoid discussion of (least) time.
Remark B34.
As the Chinese Poet, Li Po said (see Hamil [100]): “The birds have vanished down the sky. Now the last cloud drains away. We sit together, the mountain and me, until only the mountain remains.”
Remark B35.
See Elgin [101], or Naess [102].
Remark B36.
Brillouin ([60], p. 563) says: “Accordingly, a living organism is a chemical system in unstable equilibrium maintained by some strange ‘power of life,’ which manifests itself as a sort of negative catalyst. So long as life goes on, the organism maintains its unstable structure and escapes disintegration.”
Remark B37.
One could raise the objection that a person who is asleep also does not produce much entropy. Indeed, this is true if one is in the state of deep sleep, as it is called in Advaita Vedanta (see Sharma [103]), which itself is a state of bliss.
Remark B38.
For example, the concept of Right Livelihood or Right Action or Right Speech as described in the teachings of the Buddha can provide a framework (See Rahula [104]). Though these three elements constitute the Ethics (Sila) portion of the Eightfold Path of Buddha, other spiritual (wisdom) traditions also have something similar to this code of ethics.
Remark B39.
In general, thermodynamical variables can be classified into two classes: those related to the state of the matter and those related to the processes involved. As mentioned by Aoki ([67], p. 168): “With regard to the entropy concept, the state variable is entropy content and the process variables are entropy flow and entropy production.”
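Aoki’s distinction is usually written, following Prigogine’s notation, as a split of the entropy change into a flow term and a production term; the following is a standard sketch, not a formula from Aoki [67].

```latex
% Entropy balance for an open system:
\mathrm{d}S = \mathrm{d}_{e}S + \mathrm{d}_{i}S,
\qquad \mathrm{d}_{i}S \ge 0,
% where d_e S is the entropy exchanged with the surroundings (flow),
% and d_i S is the entropy produced inside the system (production).
```

Here S is the state variable (entropy content), while the rates of the two terms on the right are the process variables (entropy flow and entropy production); only the production term is sign-constrained by the second law.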


  1. Angrist, S.W.; Hepler, L.G. Order and Chaos: Laws of Energy and Entropy; Basic Books: New York, NY, USA, 1967. [Google Scholar]
  2. Einstein, A. Out of My Later Years, revised reprint ed.; Carol Publishing Group: New York, NY, USA, 1995. [Google Scholar]
  3. Lindsay, R.B. Entropy consumption and values in physical science. Am. Sci. 1959, 47, 376–385. [Google Scholar]
  4. Kant, I. Groundwork for the Metaphysics of Morals; Paton, H.J., Ed.; Harper & Row Publishers: New York, NY, USA, 1956. [Google Scholar]
  5. Newton, I. Selections from His Writings; Dover Publications, Inc.: New York, NY, USA, 2005. [Google Scholar]
  6. Sarton, G. A Guide to the History of Science; Chronica Botanica Company: Waltham, MA, USA, 1952. [Google Scholar]
  7. Masters, D.; Way, K. One World or None: A Report to the Public on the Full Meaning of the Atomic Bomb; McGraw-Hill: New York, NY, USA, 1946. [Google Scholar]
  8. Deltete, R.J. Wilhelm Ostwald’s energetics 3: Energetic theory and applications, part II. Found. Chem. 2008, 10, 187–221. [Google Scholar] [CrossRef]
  9. Ostwald, W. The modern theory of energetics. Monist 1907, 17, 481–515. [Google Scholar] [CrossRef]
  10. Deltete, R.J. Wilhelm Ostwald’s energetics 2: Energetic theory and applications, part I. Found. Chem. 2007, 9, 265–316. [Google Scholar] [CrossRef]
  11. Deltete, R.J. Wilhelm Ostwald’s energetics 1: Origins and motivations. Found. Chem. 2007, 9, 3–56. [Google Scholar] [CrossRef]
  12. Holt, N.R. A note on Wilhelm Ostwald’s energism. Isis 1970, 61, 386–389. [Google Scholar] [CrossRef]
  13. Carus, P. Professor Ostwald’s Philosophy: An Appreciation and a Criticism. Monist 1907, 17, 516–540. [Google Scholar] [CrossRef]
  14. Hakfoort, C. Science deified: Wilhelm Ostwald’s energeticist world-view and the history of scientism. Ann. Sci. 1992, 49, 525–544. [Google Scholar] [CrossRef]
  15. Ziman, J. Non-instrumental roles of science. Sci. Eng. Ethics 2003, 9, 17–27. [Google Scholar] [CrossRef] [PubMed]
  16. Russell, B. The Impact of Science on Society; Simon and Schuster, Inc.: New York, NY, USA, 1953. [Google Scholar]
  17. Lovelock, J. The Revenge of Gaia: Earth’s Climate Crisis & the Fate of Humanity; Basic Books: New York, NY, USA, 2006. [Google Scholar]
  18. Laszlo, E. The Connectivity Hypothesis: Foundations of an Integral Science of Quantum, Cosmos, Life, and Consciousness; SUNY Press: Albany, NY, USA, 2003. [Google Scholar]
  19. Von Bertalanffy, L. General Systems Theory, revised ed.; George Braziller, Inc.: New York, NY, USA, 1969. [Google Scholar]
  20. Laszlo, E. The Systems View of the World; George Braziller: New York, NY, USA, 1972. [Google Scholar]
  21. Macy, J. Mutual Causality in Buddhism and General Systems Theory: The Dharma of Natural Systems; SUNY Press: Albany, NY, USA, 1991. [Google Scholar]
  22. Toulmin, S.E. Foresight and Understanding; Harper & Row, Publishers, Inc.: New York, NY, USA, 1963. [Google Scholar]
  23. Bateson, G. Mind and Nature; Bantam Books: New York, NY, USA, 1988. [Google Scholar]
  24. Russell, B. Religion and Science; Oxford University Press: New York, NY, USA, 1961. [Google Scholar]
  25. Ziman, J.M. Reliable Knowledge: An Exploration of the Grounds for Belief in Science; Cambridge University Press: Cambridge, UK, 1978. [Google Scholar]
  26. Boulding, K.E. The Great Laws of Change. In Evolution, Welfare, and Time in Economics; Tang, A.M., Westfield, F.M., Worley, J.S., Eds.; Lexington Books: Lanham, MD, USA, 1976. [Google Scholar]
  27. Goswami, A. The Physicists’ View of Nature; Kluwer Academic: New York, NY, USA, 2000. [Google Scholar]
  28. Feynman, R.; Leighton, R.; Sands, M. The Feynman Lectures on Physics; Basic Books: New York, NY, USA, 1963; Volume 1, Chapter 4. [Google Scholar]
  29. Davis, M. What’s philosophically interesting about engineering ethics? Sci. Eng. Ethics 2003, 9, 353–361. [Google Scholar] [CrossRef] [PubMed]
  30. Medawar, P.B. The Threat and the Glory; HarperCollins: New York, NY, USA, 1990. [Google Scholar]
  31. Truesdell, C. An Idiot’s Fugitive Essays on Science; Springer: New York, NY, USA, 1984. [Google Scholar]
  32. Callen, H.B. Thermodynamics and an Introduction to Thermostatistics; John Wiley & Sons: New York, NY, USA, 1985. [Google Scholar]
  33. Massoudi, M. A system theory approach to interfaith dialogue. Intercult. Educ. 2006, 17, 421–437. [Google Scholar] [CrossRef]
  34. Denbigh, K.G. Note on entropy, disorder and disorganization. Br. J. Philos. Sci. 1989, 40, 323–332. [Google Scholar] [CrossRef]
  35. Kestin, J. A Course in Thermodynamics; Revised Printing; Hemisphere Publishing Company: Washington, DC, USA, 1979; Volume 1. [Google Scholar]
  36. Kestin, J.; Dorfman, J.R. A Course in Statistical Thermodynamics; Academic Press Inc.: New York, NY, USA, 1971. [Google Scholar]
  37. Asimov, I. In the game of energy and thermodynamics you can’t even break even. Smithsonian 1970, 1, 4–11. [Google Scholar]
  38. Georgescu-Roegen, N. The Entropy Law and the Economic Process; Harvard University Press: Cambridge, MA, USA, 1971. [Google Scholar]
  39. Seifert, H.S. Can we decrease our entropy? Am. Sci. 1961, 49, 124A, 128A, 130A, 134A. [Google Scholar]
  40. Cengel, Y.A. Introduction to Thermodynamics and Heat Transfer+ EES Software; McGraw Hill Higher Education Press: New York, NY, USA, 2008. [Google Scholar]
  41. Adkins, C.J. Equilibrium Thermodynamics, 3rd ed.; Cambridge University Press: New York, NY, USA, 1985. [Google Scholar]
  42. Wiener, N. Some Moral and Technical Consequences of Automation. Science (N.Y.) 1960, 131, 1355–1358. [Google Scholar] [CrossRef] [PubMed]
  43. Prigogine, I. From Being to Becoming; W.H. Freeman and Company: New York, NY, USA, 1980. [Google Scholar]
  44. Denbigh, K.G. Order and organisation. Interdiscip. Sci. Rev. 1999, 24, 167–170. [Google Scholar] [CrossRef]
  45. Orrell, D. Truth or Beauty: Science and the Quest for Order; Yale University Press: New Haven, CT, USA, 2012. [Google Scholar]
  46. Pernu, T.K.; Annila, A. Natural emergence. Complexity 2012, 17, 44–47. [Google Scholar] [CrossRef]
  47. Singer, P. Practical Ethics; Cambridge Press: New York, NY, USA, 1979. [Google Scholar]
  48. Singer, P. How Are We to Live?: Ethics in an Age of Self-Interest; Prometheus Books: Amherst, NY, USA, 1995. [Google Scholar]
  49. Regan, T. All that Dwell Therein; University of California Press: Berkeley, CA, USA, 1982. [Google Scholar]
  50. Regan, T. The Thee Generation; Temple University Press: Philadelphia, PA, USA, 1988. [Google Scholar]
  51. Rolston, H. Environmental Ethics; Temple University Press: Philadelphia, PA, USA, 1988. [Google Scholar]
  52. Taylor, P.W. Respect for Nature; Princeton University Press: Princeton, NJ, USA, 1986. [Google Scholar]
  53. Johnson, L.E. A Morally Deep World; Johnson, L.E., Ed.; Cambridge University Press: Cambridge, UK, January 1993; p. 311. [Google Scholar]
  54. Schumacher, E.F. Small is Beautiful; Harper & Row: New York, NY, USA, 2010. [Google Scholar]
  55. Kestin, J. Availability: The concept and associated terminology. Energy 1980, 5, 679–692. [Google Scholar] [CrossRef]
  56. Ben-Naim, A. Entropy and the Second Law; World Scientific Publishers: Singapore, 2012. [Google Scholar]
  57. Smith, H. The World’s Religions; Harper San Francisco: San Francisco, CA, USA, 1991. [Google Scholar]
  58. Bushrui, S.B.; Massoudi, M. The Spiritual Heritage of the Human Race: An Introduction to the World’s Religions; OneWorld Publications: Oxford, UK, 2009. [Google Scholar]
  59. Thoreau, H.D. Walden; Shanley, J.L., Ed.; Princeton University Press: Princeton, NJ, USA, 1973. [Google Scholar]
  60. Brillouin, L. Life, thermodynamics, and cybernetics. Am. Sci. 1949, 37, 554–568. [Google Scholar] [PubMed]
  61. Georgescu-Roegen, N. Energy and economic myths. South. Econ. J. 1976, 41, 347–381. [Google Scholar] [CrossRef]
  62. Cloud, P. Entropy, materials, and posterity. Geol. Rundsch. 1977, 66, 678–696. [Google Scholar] [CrossRef]
  63. Tagore, R. The Religion of Man; Unwin Paperbacks: London, UK, 1988. [Google Scholar]
  64. Gal-Or, B. Cosmology, physics and philosophy. In Cosmology, Physics and Philosophy; Springer: Berlin/Heidelberg, Germany, 1983; pp. 277–307. [Google Scholar]
  65. De Groot, S.R.; Mazur, P. Non-Equilibrium Thermodynamics; Dover Publications: New York, NY, USA, 1984. [Google Scholar]
  66. Curran, P.F. Thermodynamic of Living Systems: The Contributions of Aharon Katzir-Katcholsky. In Modern Developments in Thermodynamics; Gal-Or, B., Ed.; John Wiley & Sons: New York, NY, USA, 1974. [Google Scholar]
  67. Aoki, I. Entropy and exergy principles in living systems. In Thermodynamics and Ecological Modelling; Lewis Publishers/CRC Press: Boca Raton, FL, USA, 2001; pp. 167–190. [Google Scholar]
  68. Nicolis, G.; Prigogine, I. Self-Organization in Nonequilibrium Systems; Wiley: New York, NY, USA, 1977. [Google Scholar]
  69. Kondepudi, D.K. Introduction to Modern Thermodynamics; Wiley: Chichester, UK, 2008. [Google Scholar]
  70. Lemons, D.S. A Student’s Guide to Entropy; Cambridge University Press: Cambridge, UK, 2013. [Google Scholar]
  71. Bronowski, J. Science and Human Values; Harper & Row Publishers: New York, NY, USA, 1965. [Google Scholar]
  72. Peirce, C.S. Selected Writings; Wiener, P.P., Ed.; Dover Publications, Inc.: New York, NY, USA, 1966. [Google Scholar]
  73. Medawar, P.B. Induction and Intuition in Scientific Thought; American Philosophical Society: Philadelphia, PA, USA, 1969. [Google Scholar]
  74. Nasr, S.H. Knowledge and the Sacred; SUNY Press: Albany, NY, USA, 1989. [Google Scholar]
  75. Nasr, S.H. The Need for a Sacred Science; SUNY Press: Albany, NY, USA, 1993. [Google Scholar]
  76. Wilber, K. The Spectrum of Consciousness; Quest Books: Wheaton, IL, USA, 1977. [Google Scholar]
  77. Searle, J.R. Minds, Brains, and Science; Harvard University Press: Cambridge, MA, USA, 1984. [Google Scholar]
  78. Goldstein, J.; Kornfield, J. Seeking the Heart of Wisdom; Shambhala Press: Boston, MA, USA, 2001. [Google Scholar]
  79. Raychaudhuri, A.K. Classical Mechanics: A Course of Lectures; Oxford University Press: Oxford, UK, 1983. [Google Scholar]
  80. Popper, K.R. The Logic of Scientific Discovery; Routledge: London, UK, 1992. [Google Scholar]
  81. Smith, H. Forgotten Truth; Harper San Francisco: San Francisco, CA, USA, 1992. [Google Scholar]
  82. Laszlo, E. Science and the Akashic Field: An Integral Theory of Everything; Inner Traditions/Bear & Co.: Rochester, VT, USA, 2004. [Google Scholar]
  83. Massoudi, M. An enquiry into the role and importance of ethics in scientific research. Interchange 2008, 39, 443–468. [Google Scholar] [CrossRef]
  84. Annila, A. Natural thermodynamics. Phys. A Stat. Mech. Appl. 2016, 444, 843–852. [Google Scholar] [CrossRef]
  85. Atkins, P. Four Laws that Drive the Universe; Oxford University Press: Oxford, UK, 2007. [Google Scholar]
  86. Sommerfeld, A. Thermodynamics and Statistical Mechanics; Bopp, F., Meixner, J., Eds.; Academic Press, Inc.: New York, NY, USA, 1956. [Google Scholar]
  87. Moran, M.J.; Shapiro, H.N. Fundamentals of Engineering Thermodynamics; John Wiley & Sons, Inc.: New York, NY, USA, 1988. [Google Scholar]
  88. Svirezhev, Y. Thermodynamic orientors: How to use thermodynamic concepts in ecology. In Eco Targets, Goal Functions, and Orientors; Springer: Berlin/Heidelberg, Germany, 1998; pp. 102–122. [Google Scholar]
  89. Prigogine, I.; Nicolis, G. Exploring Complexity: An Introduction; W.H. Freeman: London, UK, 1989. [Google Scholar]
  90. Rifkin, J. Entropy: Into the Greenhouse World, revised ed.; Bantam Books: New York, NY, USA, 1989. [Google Scholar]
  91. Kirwan, A.D., Jr. Mother Nature’s Two Laws: Ringmasters for Circus Earth: Lessons on Entropy, Energy, Critical Thinking, and the Practice of Science; World Scientific: Singapore, 2000. [Google Scholar]
  92. Truesdell, C. Rational Thermodynamics, 2nd ed.; Springer: New York, NY, USA, 1984. [Google Scholar]
  93. Bentov, I. Stalking the Wild Pendulum; Destiny Books: Rochester, VT, USA, 1988. [Google Scholar]
  94. Hiebert, E.N. The uses and abuses of thermodynamics in religion. Daedalus 1966, 1046–1080. [Google Scholar]
  95. Foster, D. The Philosophical Scientists; Dorset Press: New York, NY, USA, 1985. [Google Scholar]
  96. Laidler, K.J. Energy and the Unexpected; Oxford University Press: New York, NY, USA, 2002. [Google Scholar]
  97. Brillouin, L. Entropy and growth of an organism. Ann. N. Y. Acad. Sci. 1955, 63, 454–455. [Google Scholar] [CrossRef]
  98. Brillouin, L. Thermodynamics and information theory. Am. Sci. 1950, 38, 594–599. [Google Scholar]
  99. Rogers, C. A Way of Being; Houghton Mifflin Company: Boston, MA, USA, 1980. [Google Scholar]
  100. Hamil, S. Crossing the Yellow River: Three Hundred Poems from the Chinese; Tiger Bark Press: Rochester, NY, USA, 2013. [Google Scholar]
  101. Elgin, D. Voluntary Simplicity: Toward a Way of Life that is Outwardly Simple, Inwardly Rich; Quill New York: New York, NY, USA, 1993; Volume 25. [Google Scholar]
  102. Naess, A. Ecology, Community and Lifestyle; Cambridge University Press: New York, NY, USA, 1990. [Google Scholar]
  103. Sharma, A. The Experiential Dimension of Advaita Vedanta; Motilal Banarsidass Publishers: New Delhi, India, 1993. [Google Scholar]
  104. Rahula, W. What the Buddha Taught; Grove Press, Inc.: New York, NY, USA, 1974. [Google Scholar]

Massoudi, M. A Possible Ethical Imperative Based on the Entropy Law. Entropy 2016, 18, 389.
