Miscellania about Entropy, Energy, and Available Free Energy

While the main concepts of thermodynamics are universal, their application to specific systems is not. Thus, the universal concepts combined with specific constitutive relations permit the derivation of important results in fields as diverse as physics, chemistry, physical chemistry, chemical engineering and rheology. In all of these fields equilibrium is characterized either by a maximum of entropy or by a minimum of an available free energy, depending on the boundary data. In the latter case there is a compromise between the entropic tendency to grow and the energetic tendency to decrease. After some historical considerations the situation is illustrated for several specific cases - planetary atmospheres, osmosis and elastic rubber molecules - pertaining to physics, chemistry and rheology, respectively. Afterwards, in the later parts of the article, thermodynamic considerations are extrapolated to remote fields, to wit evolutionary genetics and sociology.


Introduction
One may distinguish three branches of classical - non-extended - thermodynamics, which differ in intent and result:
• Thermodynamics of irreversible processes, with the objective to determine the fields of mass density, velocity and temperature of a body for given initial and boundary data. More often than not the actual determination of the fields requires complex numerical schemes, and it is rarely done without severe simplifying assumptions.

• Stability analysis, i.e. the identification of available free energies which assume extrema at the end of possibly strongly irreversible processes under special boundary conditions - adiabatic ones, or isothermal, or isobaric, or non-moving boundaries.
• Thermostatics, which describes quasi-static or reversible processes that are so slow that in a reasonable approximation they may be considered sequences of equilibria. This branch is largely adequate for the treatment of even fast-moving engines; it is the subject of most of engineering thermodynamics.
If the truth were known, entropy plays a minor role in the first and last branches, but it is absolutely essential in the second branch. That branch reveals the teleological aspects of thermodynamics: growth of entropy or decay of available free energy. Its basic reasoning and some applications are the subject of this exposition.
Ever since Boltzmann, entropy and its growth have been related to the randomisation of bodies with many elements, and often that phenomenon tends to push the body into one shape, while energetic interaction of the elements - among themselves or with the gravitational field - will attempt to pull the body into some other, energetically preferred, shape. Thus quite often there is a competition between entropy and energy, and the actual final equilibrium shape may represent a compromise. In most of these competitions the elements are atoms, molecules, or ions of fluids, solids, or solutions. But they may also be links in a polymer chain. Indeed, in physics, chemistry and rheology the elements may be all of these.
However, randomisation and, on the other hand, a deterministic trend toward preferred states may be observed in other fields of knowledge as well, fields that are remote from thermodynamics, e.g. genetics, sociology, economics, etc. Indeed, in the later chapters of this work I propose extrapolations of thermodynamics to such remote areas. This is a daring undertaking, prone to attract criticism; I invite such criticism.
Oddly enough, in such efforts of extrapolation the concept of entropy -considered mysterious by many -is most easily extrapolated away from physics.What is more difficult, and possibly objectionable, is the extrapolation of temperature and energy.And it is there that I expect criticism and -perhaps -helpful suggestions from readers who are not opposed outright to unconventional applications of ideas motivated by thermodynamics.

First and Second Laws
The first law of thermodynamics is the energy equation; it states that the rate of change of energy of a body - internal, potential, and kinetic energy - is equal to the heating $\dot{Q}$ and the working $\dot{W}$ on the surface of the body:

$$\frac{d}{dt}\left(U + E_{pot} + E_{kin}\right) = \dot{Q} + \dot{W} \qquad (1)$$

Thus the energy is constant when there is neither heating nor working on the surface. The second law of thermodynamics is an inequality; it states that the rate of change of entropy of a body is greater than or equal to the heating on the surface divided by the uniform temperature $T_0$ on the surface:

$$\frac{dS}{dt} \ge \frac{\dot{Q}}{T_0} \qquad (2)$$

The equality holds when the process in the body is quasi-static.
The second law was introduced by Rudolf E. Clausius (1822-1888) [1]. Noting that the energy is constant and the entropy is non-decreasing in an adiabatic body, whose surface is not subject to working, Clausius was moved to express that consequence of the thermodynamic laws by saying: "The energy of the universe is constant, and the entropy of the universe tends to a maximum." Evidently he thought that he knew that the universe is free of heating and working on its surface. Clausius was much impressed by the inequality. It led him to the doctrine of the heat death. Says he: "The more closely the maximum (of entropy) is approached, the less cause for change exists. And when the maximum is reached, no further changes can occur; the world is then in a dead stagnant state." The heat death was much discussed in the nineteenth century. Physicists, philosophers and historians grappled with it, and not everybody was happy with the bleak prospect. Josef Loschmidt (1821-1895) [2] expressed his misgivings most poignantly when he deplored "the terroristic nimbus of the second law […] which lets it appear as a destructive principle of all life in the universe." Nowadays the discussion of the heat death as the eventual end of the world has quieted down. Physicists have come to realize that there is so much about the beginning and the end of the universe which they do not know that most of them prefer not to speak about it.

Available Free Energies
We may ask whether, if a body is not adiabatic, there is another quantity which - under different boundary conditions - moves toward an extremum as equilibrium is established. This is indeed the case. However, the boundary conditions must include a constant - time-independent as well as uniform - boundary temperature $T_0$; thermodynamicists love to say that the body is immersed in a heat bath of temperature $T_0$. Indeed, in that case the heating $\dot{Q}$ may be eliminated between the first and second laws and we obtain:

$$\frac{d}{dt}\left(U + E_{pot} + E_{kin} - T_0 S\right) \le \dot{W} \qquad (3)$$

which is half-way to the desired result. The working $\dot{W}$ is defined by the integral over the working of the stress $t_{ij}$ on the surface element dA moving with the velocity $v_i$:

$$\dot{W} = \oint_{\partial V} t_{ij}\, v_i\, n_j \, dA \qquad (4)$$

where $\partial V$ is the surface of the volume occupied by the body and $n_j$ is the outer unit normal of the element dA.
There are three commonly considered cases:
i) The surface is at rest, i.e. the volume is constant.
ii) Only part of the surface is at rest. On the remaining part the stress is isotropic, i.e. $t_{ij} = -p_0 \delta_{ij}$, with a constant and uniform surface pressure $p_0$ (the best-known visualization of this circumstance is a vertical cylinder closed off at the top by a horizontal piston whose weight determines $p_0$).
iii) A part of the surface is fixed, another part is free of stresses, and a third part - horizontal (say), i.e. normal to the 3-direction - moves with the uniform vertical velocity $dL/dt$ (the common visualization is a vertical elastic rod of length L, fixed at the bottom and loaded by $P_0$ on the upper surface).
In those three cases the two laws of thermodynamics (3) and (4) imply:

$$\frac{d}{dt}\left(U + E_{pot} + E_{kin} - T_0 S\right) \le 0 \qquad (5)_1$$
$$\frac{d}{dt}\left(U + E_{pot} + E_{kin} + p_0 V - T_0 S\right) \le 0 \qquad (5)_2$$
$$\frac{d}{dt}\left(U + E_{pot} + E_{kin} - P_0 L - T_0 S\right) \le 0 \qquad (5)_3$$

Thus there are several different quantities which tend to a minimum as the body tends to an equilibrium. Generically we call them available free energies, or availabilities, and we denote them by A.
In order to anticipate a misunderstanding I emphasize that the availabilities identified in $(5)_1$ and $(5)_{2,3}$ are not the free energy and the free enthalpies, respectively, of equilibrium thermodynamics (also known as the Helmholtz and Gibbs free energies, respectively). Indeed, $T_0$, $p_0$, and $P_0$ are temperature, pressure, or load on the surface of the body. Inside the body the temperature and the stress may be strongly non-uniform and time-dependent fields. This point is conceptually important - particularly with respect to temperature - but in the sequel of this article it plays no role, since we shall consider bodies with constant and uniform temperature.
The terms $p_0 V$ or $-P_0 L$ in $(5)_{2,3}$ - or the corresponding terms for different boundary conditions - may be interpreted as the energies $E_L$ of the loading device. We join them to $U + E_{pot} + E_{kin}$ and define the energy $E = U + E_{pot} + E_{kin} + E_L$ of body and loading device, and write:

$$A = E - T_0 S, \qquad \frac{dA}{dt} \le 0 \qquad (6)$$

This line may be expressed by saying that a minimum of the energy E is conducive to equilibrium and so is a maximum of S.
If T 0 is so small that the entropic term in (6) can be neglected, the availability tends to a minimum, because the energy does.On the other hand, if T 0 is big, so that the energetic term in (6) may be neglected in comparison with the entropic one, the availability tends to a minimum, because the entropy tends to a maximum.
For intermediate values of T 0 it is neither the energy that becomes minimal in equilibrium nor the entropy which becomes maximal; the two tendencies compete and often they find a compromise and minimize A, the availability.
Among the three quantities involved in the availability, namely $T_0$, E, and S, the surface temperature is best understood: it determines the mean kinetic energy of the molecules of the body at the surface and may be considered a measure for the intensity of their thermal motion. If the truth were known, the energy is the most mysterious of the three quantities, but we have gotten used to it, particularly to the kinetic energy and the gravitational potential energy. Entropy still needs a definition in molecular terms and will then be crystal-clear.
Maybe it is appropriate to say here that Clausius did not know what entropy was. With the second law he discovered a property of entropy, but the second law cannot be considered a definition, or an interpretation in terms of atoms and molecules. Such an interpretation was found by Ludwig E. Boltzmann [3], who made the entropy and its growth a plausible concept.
In order to summarize and interpret the argument of this section we may say - in an anthropomorphic manner - that the energy wishes to approach a minimum, while the entropy wishes to become maximal. The main reason for the decay of energy is the conversion of potential energy into kinetic energy and the conversion of kinetic energy into heat, which - under isothermal conditions - leaves the body. The reason for the growth of entropy will be discussed next.

S = k ln W
Boltzmann's definition of entropy is nowadays summarized in the formula S = k ln W, where W is the number of possibilities to realize a distribution of the atoms over the available points in the volume V; k is now known as the Boltzmann constant. An example: let there be P occupiable positions in V (we take the positions as discrete and comment on this assumption a little later). A distribution is then given by $\{N_1, N_2, \dots, N_P\}$, where $N_i$ (i = 1,2,…,P) is the number of atoms on position i. By the rules of combinatorics the number of realizations of that distribution is:

$$W = \frac{N!}{N_1!\, N_2! \cdots N_P!}, \qquad S = k \ln W, \qquad N = \sum_{i=1}^{P} N_i \qquad (7)$$

In order to motivate this definition of entropy we must make sure that the S in (7) has the properties of Clausius's entropy. In particular, does it have the growth property? Indeed it does! And in order to see that we must make two observations and one reasonable assumption.
The first observation concerns thermal motion. The thermal motion of the atoms makes a realization of their distribution change very quickly - once every $10^{-12}$ s by order of magnitude at normal temperatures. And the assumption is that each and every realization occurs just as frequently as any other one. This is known as the a priori assumption of equal probability, and it seems to be the only reasonable unbiased assumption given the randomness of thermal motion. It follows that a distribution with few realizations occurs less often than a distribution with more realizations, and most often is the equi-distribution $N_i = N/P$, because - rather obviously, by $(7)_1$ - that is the distribution with most realizations. For example, let us start with a distribution in which all atoms are stacked - nice and orderly - on point 3 (say). In this case the number of realizations is equal to 1 and the entropy is zero by (7). But the thermal motion will quickly mess up this orderly stack and lead to a distribution with more realizations and - eventually, as equilibrium is approached - to the equi-distribution with most realizations. Such is the nature of entropic growth. By (7) we obtain the equilibrium entropy as (here and frequently in the sequel we make use of the Stirling formula to replace factorials by the logarithmic function):

$$S_{eq} = k N \ln P \qquad (8)$$

Of course, this growth is entirely probabilistic. During the growth process, and even in equilibrium, it is entirely possible that the entropy briefly decreases in what we call a thermal fluctuation. J. Willard Gibbs (1839-1903) was one of the first to understand that, and he says [4]: "… the impossibility of … a decrease of entropy seems to be reduced to an improbability." There remains the question how the number of occupiable points P in the volume V depends on V and on the temperature T. We observe that, whatever the value of P is, it ought to be proportional to V, because surely doubling V is tantamount to doubling P.
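As a minimal numerical check of (7) and of the claim that the equi-distribution has the most realizations, one can count W directly for a few distributions; the numbers N = 12 atoms on P = 4 positions are my own illustrative choices:

```python
from math import factorial

def realizations(dist):
    """W = N!/(N_1! * N_2! * ... * N_P!): the number of ways to realize
    the distribution dist = [N_1, ..., N_P] of N = sum(dist) atoms."""
    w = factorial(sum(dist))
    for ni in dist:
        w //= factorial(ni)
    return w

stacked = [12, 0, 0, 0]       # all atoms stacked on one position
uneven = [6, 3, 2, 1]
equi = [3, 3, 3, 3]           # the equi-distribution N_i = N/P

print(realizations(stacked))  # 1 -> S = k ln 1 = 0
print(realizations(uneven))   # 55440
print(realizations(equi))     # 369600: the most realizations, maximal S
```

Thermal motion shuffles the system among such distributions, so it spends most of its time near the one with the largest W.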
We introduce a factor of proportionality α and write:

$$P = \alpha V \qquad (9)$$

It follows that, when the equi-distribution has been established, the entropy can still grow by making V bigger. Therefore we may say that the entropy, in its tendency to grow, strives for the atoms to be distributed uniformly through as large a volume as possible. Examples follow in the next section. So, what about the proportionality factor α? Obviously, by (9), $\frac{1}{\alpha}$ is the smallest volume that can accommodate a single occupiable point. We might say that $\frac{1}{\alpha}$ quantizes the volume, and again it was Boltzmann who introduced this notion without, however, taking it seriously. In fact Boltzmann [3] considered "… it needless to emphasize that (in writing P = αV) we are not concerned here with a real physical problem, … the assumption is nothing more than an auxiliary tool." How wrong he was! Quantization became acceptable by the work of Max Planck (1858-1947), and Louis de Broglie (1892-1987) introduced the notion that atoms are waves which - roughly - need a volume proportional to the cube of the wavelength $\lambda = \frac{h}{\mu v}$, the de Broglie wavelength (h is the Planck constant, μ the atomic mass and v the speed of the atom - de Broglie's discovery was later broadened and subsumed in the uncertainty principle). The mean speed of an atom in a gas of temperature T is of the order $\sqrt{kT/\mu}$, so that the atom - in the mean - needs a cubic box of dimensions:

$$\frac{1}{\alpha} \approx \left(\frac{h}{\sqrt{\mu k T}}\right)^{3}$$

Thus, by (9), we have for the equilibrium entropy of a gas of atoms (since we are now speaking about equilibrium, the temperature is uniform and there is no distinction between T and $T_0$, the boundary temperature):

$$S_{eq} = k N \ln\!\left[\frac{(\mu k T)^{3/2}}{h^{3}}\, V\right] \qquad (10)$$

This formula coincides with the expression for the entropy of a monatomic gas that can be derived from the first and second laws applied to a gas in equilibrium. Thus k ln W provides the correct form of the equilibrium entropy of a gas and a plausible understanding for the growth of entropy.

Boltzmann's Controversies
Boltzmann's work on the molecular interpretation of entropy created long-lasting controversies. Planck disagreed, and his assistant Ernst Zermelo (1871-1953) involved Boltzmann in an acrimonious public debate about the recurrence objection. This objection is based on a result of analytical mechanics by which a body - in the course of time - must return to its initial state, or at least close to it. Obviously, if that happened, it would violate the monotonic growth of entropy toward an equilibrium. Boltzmann answered by estimating the time of recurrence as being so long - many dozen billion years for even small systems, and many, many more for large systems - that it did not matter. The argument did not convince Zermelo, but to this day it helps to quiet the misgivings that some physicists feel about recurrence.
The other important objection was the reversibility argument expressed by Loschmidt, a friend of Boltzmann's, who coined the phrase of the terroristic nimbus of entropy that was already mentioned. Loschmidt argued as follows: since the atoms of a body follow time-reversible paths, the entropy should be able to decay in one process and grow in another one, depending only on the initial conditions. Boltzmann could not answer that objection. At first he begged the question by arguing that there are infinitely many more initial conditions for growth than for decay, but that convinced nobody, not even himself. So in the end he came up with a remarkable argument which is either science fiction or far ahead of his - and our (!) - time: Boltzmann knew that equilibrium is not a dead stagnant state, as Clausius had said, because there are fluctuations. He considered the universe to be in equilibrium with, however, small parts of it - the size of our Big-Bang world - that fluctuate away from equilibrium, and other parts that are in regression from a fluctuation back to equilibrium. And then he suggests that "… a person who lives in either world will denote the direction of time toward less probable states as the past, the opposite direction toward more probable states as the future." So, rather obviously, if this were true, we should always experience more initial conditions that - in the "future" - lead to equilibrium than those that lead away from it. That is a mind-boggling idea and - as with quantization - Boltzmann admonishes his auditorium not to take the argument seriously, except that "… maybe it is not to be rejected out of hand."
Nowadays physicists do not often feel that there is a need for an argument. They have become used to the idea of a priori equally probable realizations of a distribution. And equal probability is considered a trivial consequence of symmetry - just as in throwing dice. It is taken for granted that the most probable distributions will be approached in the end. Loschmidt's objection is brushed aside. And, as frequently happens in such circumstances, there are some who declare the whole problem to be a semantic one.

Planetary Atmospheres
The energy of the molecules of a planetary atmosphere is minimal when the molecules are all lying on the surface of the planet. And the entropy is maximal when the molecules are spread evenly throughout space. So, the decay of energy and the growth of entropy represent opposing tendencies, and we may ask which one prevails.
The answer can be had most easily by assuming that the atmosphere is isothermal and finds itself in a constrained equilibrium between the surface and a sphere of height H. In that manner energy and entropy are both functions of the single variable H; the equilibrium is constrained, because the prescribed H may not be the minimizer of $E(H) - T_0 S(H)$. If we do the calculation for a particular case (the detailed calculation has been presented in [5]), we obtain E(H) and S(H) as shown in Figure 1: the energy is minimal at H = 0 - as expected - and grows monotonically, and the entropy tends to infinity as H grows. However, E(H) grows less steeply than $T_0 S(H)$, so that the available free energy $E(H) - T_0 S(H)$ has its minimum for H → ∞, cf. the fat curve in Figure 1. In other words, the entropy prevails and a planet with a stable atmosphere is not a valid proposition. The essential parameter governing the rate of growth of energy and entropy with H is the dimensionless quantity:

$$\beta = \frac{\gamma M \mu}{k T_0 R}$$

where M and R are the mass and radius of the planet, μ is the molecular mass, and γ is the gravitational constant. Figure 1 shows the sets of graphs for β = 1 and β = 8. For the larger β the availability drops less for a given increase of H - at least for large values of H - and we may conclude that it takes more time to strip the planet of its atmosphere.
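The character of this result can be reproduced in a few lines of code. The functional forms below are illustrative stand-ins, not the paper's exact E(H) and S(H): a bounded energy gain (the gravitational potential flattens out with height) against an entropy that grows like the logarithm of the available volume. The availability then decreases monotonically, so its minimum lies at H → ∞, as stated above.

```python
import numpy as np

# Illustrative sketch: isothermal atmosphere confined below height H
# above a planet of radius R (all quantities dimensionless here).
def energy(H, R=1.0):
    # bounded: lifting molecules against -gamma*M*mu/r gains at most a
    # finite amount of energy as H -> infinity
    return 1.0 - R / (R + H)

def entropy(H, R=1.0):
    # ~ logarithm of the available volume, unbounded in H
    return 3.0 * np.log((R + H) / R)

T0 = 0.5
H = np.linspace(0.0, 1.0e4, 100001)
A = energy(H) - T0 * entropy(H)      # available free energy

# A decreases monotonically: its minimum lies at H -> infinity, so the
# entropy prevails and the atmosphere is ultimately unstable.
print(bool(np.all(np.diff(A) < 0)))  # True
```

With a smaller T0 (or a steeper energy) the balance shifts, which is the role the parameter β plays in the text.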
We must not be surprised by the unstable character of the atmosphere because, after all, in the course of the thermal motion every once in a while a molecule reaches the escape velocity and, if it has the right position and direction when that happens, it will be gone forever. Also, we do know that the hot and light planet Mercury has no atmosphere left, and that nearly all the fast-moving light molecules of hydrogen and helium have escaped from the Earth. For the time being our Earth hangs on to the heavier and slower molecules of oxygen and nitrogen and provides us with a thin but breathable layer of atmosphere.

Osmosis. Pfeffer Tube
We consider a long tube closed off at one end by a water-permeable membrane. We stick that end into a water reservoir. The water will then adjust itself to the same level inside the tube and outside, cf. Figure 2. When we dissolve some salt in the water of the tube, the salt ions will not be able to leave the water, and we might expect that they will increase their entropy by reaching an equi-distribution inside the little bit of water in the tube. However, nature is cleverer than that: the salt, being unable to pass the membrane, pulls water into the tube through the membrane and thus increases its volume and its entropy. Or else, the water pushes (osmosis is the Greek word for push) its way through the membrane to provide the salt with a bigger volume and thus with the possibility to increase its entropy. Therefore the level of the solution in the tube rises, and so does the potential energy of the solution and of the whole system of water and solution. It is interesting to note that the salt profits from osmosis during the process; the profit lies in increased entropy. The water, on the other hand, pays a price in increased energy. However, the combined system gains by reaching a distribution with the maximum number of realizations under the constraint of a rising potential energy, i.e. by reaching a minimum of available free energy.
So, here we have another competition between entropy and energy. Conceivably the osmotic process could continue until all the water has been sucked into the tube; the entropy would then be maximal. Or else, the system could reach a minimum of energy by - essentially - not experiencing any rise of the solution in the tube at all. And we may ask which tendency will prevail.
The calculation of energy and entropy as functions of H, cf. Figure 3, shows that the potential energy grows parabolically with H, while the entropy grows logarithmically. For reasonable data - 2 L of water, 1 g of salt, 1 cm² as cross-section of the tube, and 20 °C - the graph $A(H) = E(H) - T_0 S(H)$ implies that now neither energy nor entropy prevails. Both tendencies - the decay of energy and the growth of entropy - reach a compromise where A(H) has a minimum for $H_E$ a little less than 10 m. Thus a little less than half of the water is sucked into the tube (the calculation is again part of [5]). Of course, with more salt the tube may suck in all the water, so that there is no compromise and the entropy prevails.
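An order-of-magnitude check of this compromise is easy, assuming the sketch forms E(H) = ρga H²/2 (parabolic potential energy of the solution column) and S(H) = nR ln H + const (ideal-solution entropy of the salt in the volume aH). The data are those quoted in the text; I additionally assume the salt is NaCl, fully dissociated into ions:

```python
from math import sqrt

rho, g = 1000.0, 9.81      # density of water [kg/m^3], gravity [m/s^2]
a = 1.0e-4                 # tube cross-section: 1 cm^2 in [m^2]
n = 2 * 1.0 / 58.44        # mol of ions from 1 g of NaCl (assumption)
Rgas, T0 = 8.314, 293.15   # gas constant [J/(mol K)], 20 C in kelvin

# dA/dH = rho*g*a*H - T0*n*Rgas/H = 0 gives the equilibrium height
H_E = sqrt(T0 * n * Rgas / (rho * g * a))
print(round(H_E, 1))       # ~9.2 m: "a little less than 10 m", as in the text
```

The minimum exists at finite H precisely because the parabola eventually outgrows the logarithm; with more salt (larger n) the predicted H_E exceeds what 2 L of water can fill, and the entropy prevails.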

Entropy and Gravitational Potential Energy of a Rubber Molecule
A polymer molecule, e.g. a molecule of rubber - a polyisoprene - may be modelled as a long chain of N links, each of length b, whose orientations are independent, cf. Figure 4 (left). Thus, by Boltzmann's interpretation, the entropy of the chain can be written as:

$$S = k \ln \frac{N!}{N_1!\, N_2! \cdots N_P!} \qquad (12)$$

where $N_i$ (i = 1,2,…,P) is the number of links pointing into direction i. P is the total number of directions and it is considered finite. As was explained in Section 2.3, this entropy - or its counterpart (7) for a gas of atoms - makes good sense if there is thermal motion of the links and if all realizations of a distribution $\{N_i\}$ are equally probable.
We consider the links of the molecule as weightless, but a mass M is attached to the last one and that mass is subject to a gravitational force. Its potential energy is given by:

$$E_{pot} = M g b \sum_{i=1}^{P} N_i \cos\vartheta_i$$

where $\vartheta_i$ is the polar angle of direction i with the vertical axis.
Obviously, the assumption of a finite number P of directions is tantamount to a quantization of the unit sphere.I do not follow up this aspect, because I wish to simplify the model even further.

A One-Dimensional Model
We allow only one alternative in the direction of the links: they can either point straight upwards or downwards, cf. Figure 4 (right). The corresponding numbers are $N_+$ and $N_-$. In that case we obtain for the entropy, with $q = N_-/N$:

$$S = -k N \left[(1-q)\ln(1-q) + q \ln q\right] \qquad (16)$$

and the potential energy reads:

$$E_{pot} = M g b N (1 - 2q)$$

Since the potential energy of the mass is the only contribution to energy that depends on q, the availability is:

$$A = M g b N (1 - 2q) + k T N \left[(1-q)\ln(1-q) + q \ln q\right] \qquad (17)$$

Its minimum characterizes the equilibrium and it occurs for:

$$\frac{q_E}{1-q_E} = \exp\!\left(\frac{2 M g b}{k T}\right)$$

Thus in equilibrium the end-to-end distance of the chain is equal to:

$$r_E = N b \,(2 q_E - 1) = N b \tanh\frac{M g b}{k T} \qquad (18)$$

Once again: what we are seeing here is a compromise between the randomizing effect of thermal motion and the straightening effort of the gravitational field acting on M. For large T the entropy prevails and $r_E$ is close to zero, while energy prevails for small T and the chain becomes straight with $r_E$ close to Nb.
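Minimizing A = E − TS for this one-dimensional chain yields, by a short calculation, the end-to-end distance r_E = Nb·tanh(Mgb/kT). The two limits can be sketched numerically; the chain parameters below are illustrative, not taken from the text:

```python
from math import tanh

k = 1.380649e-23         # Boltzmann constant [J/K]
N, b = 1000, 5e-10       # illustrative chain: 1000 links of 0.5 nm
M, g = 1e-15, 9.81       # illustrative attached mass [kg], gravity [m/s^2]

def r_E(T):
    """Equilibrium end-to-end distance: minimizing A(q) = E - T*S over
    q = N_minus/N gives q_E/(1-q_E) = exp(2*M*g*b/(k*T)), hence
    r_E = N*b*(2*q_E - 1) = N*b*tanh(M*g*b/(k*T))."""
    return N * b * tanh(M * g * b / (k * T))

print(r_E(1e4) < 0.01 * N * b)   # True: entropy prevails, chain crumples
print(r_E(1e-2) > 0.99 * N * b)  # True: energy prevails, chain straightens
```

The tanh saturates at Nb, so the chain can never be stretched beyond its contour length, however heavy the mass.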
The entropy (16), appropriate to only one alternative in the orientation of the links, lends itself most easily to an extrapolation of thermodynamic concepts to remote fields like genetics and sociology. That is why we have introduced and discussed it here. It is true that this represents an extremely simple case, and yet the case is of considerable heuristic value, as we shall presently see.
In order to anticipate criticism I wish to say that even the exploitation of the entropy (12) and the energy (33) - with many orientational alternatives - leads to a formula for $r_E$ that is essentially equivalent to (18), i.e. the M- and T-dependence of $r_E$ is quite similar. For more information about rubber - and the chain model - I refer the reader to the book [6].

A Population of Cells and Its Entropy
We consider a population of 2N haploid cells with 2N chromosomes which differ in only one locus, where the allele may be either A or a (in my paper [7] I have introduced the model treated here; in that article the treatment was not restricted to haploid cells: diploids were considered as well as haploids; it was then appropriate to consider the number of cells as even, hence the factor 2, i.e. 2N). Both alleles are supposed to have the same molecular energy so that there is no physical bias for either of them. Mutations A ↔ a occur, either spontaneously or triggered by the application of radiation.
The population is characterized by the distribution $\{N_A, N_a\}$ - with $N_A + N_a = 2N$ - and we assume that every one of the realizations of a distribution is equally probable in the course of the mutational process. That assumption is tantamount to saying that a distribution with many realizations is more probable than one with fewer realizations. In analogy to thermodynamics we may thus define an entropy of the population by [the factor k in (7) is now dropped, because it would be futile here]:

$$S = \ln \frac{(2N)!}{N_A!\, N_a!} \qquad (19)$$

This is a function of the single variable $N_a$ (say). The entropy is expected to grow in its typical random manner, cf. Section 2.3, until it reaches a maximum for the equi-distribution $N_A = N_a = N$, cf. Figure 5a.
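The growth of this population entropy toward the equi-distribution can be checked numerically; the population size is an illustrative choice:

```python
from math import lgamma, log

def S(Na, twoN):
    """Entropy ln W of the allele distribution {N_A, N_a} with
    W = (2N)!/(N_A! * N_a!), computed via log-gamma to avoid
    overflowing factorials."""
    NA = twoN - Na
    return lgamma(twoN + 1) - lgamma(NA + 1) - lgamma(Na + 1)

twoN = 1000
values = [S(Na, twoN) for Na in range(twoN + 1)]
best = max(range(twoN + 1), key=lambda Na: values[Na])
print(best)                   # 500: maximum at N_A = N_a = N

# the Stirling estimate of the maximum is 2N*ln(2), good to within ~1%
print(abs(values[best] - twoN * log(2)) / values[best] < 0.01)  # True
```

Random mutations shuffle the population among these distributions, so, absent any bias, it drifts toward the peak at N_A = N_a.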
The random mutations in genetics correspond to the random thermal motion in thermodynamics of a gas.

Selective "Energy"
We consider an environmental situation in which an a-cell enjoys a bonus Δ over an A-cell. The cells are still not physically distinguished - in terms of molecular energy - but the a-cells fit into the environment somewhat better. For instance, they may find it easier to obtain nourishment and may consequently have a more numerous progeny than the A-cells. In the long run - over several, or many, generations - the a-cells will then dominate the population and the A-cells will recede. Mathematically we make the simplest possible ansatz for this selective advantage. We write:

$$E = (2N - N_a)\,\Delta \qquad (20)$$

so that E decreases linearly with a growing number $N_a$. We represent E by a graph between $N_a = 0$ and $N_a = 2N$ of the form shown in Figure 5b. We call E the selective energy. Its values may be visualized as lying in a potential well whose deepest point, i.e. the selectively most preferred state, occurs at $N_a = 2N$, and whose highest point, i.e. the least preferred state, lies at $N_a = 0$. The shortfall of E from 2NΔ measures the selective advantage of the population, cf. Figure 5b.

Selective Free Energy
Thus we are able to identify two conflicting tendencies in the population: the entropic tendency to grow toward its value 2Nln2 and the tendency of the selective energy to decrease toward the value zero.
We may say that the entropic growth is counteracted by the energetic decay: e.g. if the entropy grows away from its value zero at $N_a = 2N$, the selective energy must be dragged up against its tendency to decay. Under those circumstances we expect neither the entropy to become maximal nor the selective energy to become minimal. The entropy will grow as far as it can under the constraint of an increasing selective energy.
The best way to maximize a function under a constraint is the use of a Lagrange multiplier, here β. Thus for finding the equilibrium we need to maximize the constrained entropy (the index C on S denotes the constrained equilibrium; $W_C = \exp[S_C]$ is the maximum number of realizations to which W can rise under the constraint):

$$S_C = S - \beta E \qquad (21)$$

The significance of the Lagrange multiplier and its influence on establishing equilibrium may be understood by the following consideration: if β is small, so that S dominates the right-hand side of (21), $S_C$ becomes maximal because the entropy S does, as if there were no selection. On the other hand, if β is large, $S_C$ becomes maximal because E is minimal, as if there were no mutation. Therefore 1/β may be considered a measure for the intensity of mutation.
In analogy to thermodynamics we introduce T = 1/β as characterizing the mutational intensity and, instead of maximizing $S_C$, we minimize the selective free energy:

$$F = E - T S \qquad (22)$$

which is equivalent. The graph of F as a function of T and $N_a$ is shown in Figure 6.
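A numerical sketch of this minimization, using the Stirling form of the population entropy; the values of N and Δ are illustrative, not taken from the paper:

```python
import numpy as np

# Minimize the selective free energy F = E - T*S with
# E(N_a) = (2N - N_a)*Delta and S in its Stirling form.
N, Delta = 500, 1.0
Na = np.arange(1, 2 * N)        # exclude endpoints so the logs are finite
z = Na / (2 * N)
S = -2 * N * (z * np.log(z) + (1 - z) * np.log(1 - z))
E = (2 * N - Na) * Delta

def equilibrium(T):
    """N_a minimizing the selective free energy at mutational intensity T."""
    return int(Na[np.argmin(E - T * S)])

# strong mutation (large T): entropy wins and N_a settles near N;
# weak mutation (small T): selection wins and N_a approaches 2N
print(equilibrium(100.0))       # close to N = 500
print(equilibrium(0.01))        # close to 2N = 1000
```

The stationarity condition dF/dN_a = 0 gives N_a/(2N − N_a) = exp(Δ/T), so the equilibrium interpolates smoothly between the equi-distribution and the pure a-population as T is lowered.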
Thermodynamicists will see at a single glance that the mutational intensity corresponds to the absolute temperature of thermodynamics, a measure for the intensity of the thermal motion. Thus the correspondence between thermodynamics and genetics is complete, with an entropy that attempts to grow, an energy that attempts to decay, and a temperature that weighs the relative influence of these opposing tendencies. Comparison of these considerations on evolutionary genetics with the treatment of the one-dimensional model for a rubber molecule shows complete formal agreement as far as the mathematics is concerned. The interpretation is different, but the formulae are the same, apart from superficial changes of notation. Encouraged by this observation we shall now proceed with another extrapolation of thermodynamics.

A Population of Hawks and Doves. Entropy
Let there be two species in a population: $N_H$ hawks and $N_D$ doves, with $N_H + N_D = N$, and therefore the entropy of the population is:

$$S = \ln \frac{N!}{N_H!\, N_D!} = -N\left[z_H \ln z_H + (1 - z_H)\ln(1 - z_H)\right] \qquad (23)$$

where $z_H = N_H/N$ is the fraction of hawks among the birds. Figure 7a shows the graph of the function $S(z_H)$, which has a maximum at $z_H = 1/2$.
Of course, the birds are metaphorical and it is only their character - to be defined later - that marks them as hawks or doves. In particular, a bird may give birth to either hawk or dove with equal probability, irrespective of whether it is hawk or dove itself. That means that in the course of one generation the distribution $\{N_H, N_D\}$ changes stochastically to one with more realizations until in the end, when all birds have given birth, the equi-distribution would statistically prevail, i.e. entropy would be maximal - unless there are "energetic" features.
Note that in this new case there is no external agent - like temperature upholding thermal motion, or radiation controlling mutational intensity - to adjust the rate of growth of entropy. That rate is governed by the rate of reproduction.

Contest. "Social Energy"
However, we may conceive of an opposing "energetic" trend. This is furnished by a contest of the birds over a resource which is needed by both species. In order to formulate a model of this contest we assume a conflict strategy adapted from game theory [8][9][10].
If two hawks meet at a resource, they fight over it until one is injured. The winner gains the value 1, while the loser, being injured, needs time for healing his wounds. Let that time be such that the losing hawk must buy 1 resource to feed himself during convalescence. Two doves do not fight; they merely engage in a symbolic conflict, posturing and threatening but not actually fighting. One of them will eventually win the resource - always with the value 1 - but both lose time, such that after a dove-dove encounter they need to catch up by buying part of a resource, worth 0.2. When a hawk meets a dove, the dove still does not fight but it will try to grab the resource and run for cover. Let it be successful 4 out of 10 times. However, successful or not, it risks injury from the enraged hawk and may need a period of convalescence at the cost of 3 resources.
Assuming that winning and losing the fights or the posturing games is equally probable, we conclude that the expectation values for gain per encounter are given by the arithmetic mean values of the gains in winning and losing, i.e.:

e_HH = ½(1) + ½(−1) = 0,  e_HD = 0.6 · 1 = 0.6,  e_DH = 0.4 · 1 − 3 = −2.6,  e_DD = ½(0.8) + ½(−0.2) = 0.3   (24)

for the four encounters HH, HD, DH, DD. It follows that the expectation value for the gain of a hawk or a dove per encounter reads:

e_H = z_H e_HH + (1 − z_H) e_HD,  e_D = z_H e_DH + (1 − z_H) e_DD.   (25)

And the gain expectation of a bird per encounter - irrespective of whether the bird is hawk or dove - is given by:

g = z_H e_H + (1 − z_H) e_D.   (26)

Insertion of (24) into (26) provides the specific result:

g(z_H) = 0.3 − 2.6 z_H + 2.3 z_H².   (27)

In the permissible range 0 ≤ z_H ≤ 1 this function is given by a convex parabolic graph with maximal values of gain at z_H = 0 and z_H = 1, cf. Figure 7b (dashed). We assume that - within the time of a generation - the contest strategy will force the population toward those maxima of gain, because higher gain may mean a higher chance of survival. For more plausibility - at least to physicists - we turn the graph of g(z_H) upside down and obtain a "double-well potential" E(z_H) = −N g(z_H), cf. Figure 7b (solid), with two minima at z_H = 0 and z_H = 1.
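The arithmetic above can be tabulated in a few lines. This is a sketch under one concrete reading of the contest rules - the assumption is that the dove pays the convalescence cost of 3 in every dove-hawk encounter, which yields e_HH = 0, e_HD = 0.6, e_DH = -2.6, e_DD = 0.3:

```python
# Elementary gain expectations per encounter (one reading of the contest rules).
e_hh = 0.5 * 1 + 0.5 * (-1)      # hawks fight: winner +1, loser buys 1 resource
e_hd = 0.6 * 1                   # the dove escapes with the resource 4 out of 10 times
e_dh = 0.4 * 1 - 3               # 4/10 success, minus the convalescence cost of 3
e_dd = 0.5 * 0.8 + 0.5 * (-0.2)  # posturing: winner gains 1 - 0.2, loser pays 0.2

def gain(z):
    """Expected gain per bird per encounter, g(z_H), for hawk fraction z."""
    e_hawk = z * e_hh + (1 - z) * e_hd   # a hawk's expectation
    e_dove = z * e_dh + (1 - z) * e_dd   # a dove's expectation
    return z * e_hawk + (1 - z) * e_dove

# A convex parabola with its maxima at the pure populations:
print(gain(0.0), gain(1.0), gain(0.5))
```

With these numbers g(z_H) = 0.3 − 2.6 z_H + 2.3 z_H², so the pure-dove population has the larger gain, i.e. the deeper well of the social energy E = −N g.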
We call E the social energy of the population and we may say that, in the long run, the contest favours pure populations, i.e. either hawks or doves, because their social energies are minimal. Actually we may suppose that the population splits apart into spatially separate colonies of pure hawks or pure doves, so that no interspecies conflict occurs. That should be the final state, if the entropic trend for mixing hawks and doves were ignored.

Entropic Growth and Energetic Decay Combined
When the birds are in contest as described in Section 5.2, the distribution {N_H, N_D} cannot rise to the entropic maximum, because that trend competes with the trend toward an energetic minimum. For example, consider a population with N_H smaller than, but nearly equal to, N: its entropy increase is at first strong enough to pull the population out of the energetic minimum at N_H = N, since, after all, the slope of the entropy is infinite there while the slope of the energy is finite. So there is an initial growth of entropy and energy, the latter being forced against its natural tendency. Therefore, as has been explained in Section 4.3, the entropy grows only as far as it can, given the constraint of the concomitant growth of energy (the present case is richer than the one in Section 4, since the energy is not a linear function of z_H, nor even a monotonic one). And, once again, the maximization under a constraint is done by use of a Lagrange multiplier. As in (21) we maximize the constrained entropy:

S_C = S − βE.   (28)

When the Lagrange multiplier β is large, the maximum of S_C is largely due to a minimum of energy. And if β is small, the maximum of S_C is due to a maximum of entropy. We may say that β measures the relative importance of energy and entropy in the approach to constrained equilibrium. Or, from what has been said, 1/β may be taken as characterizing the rate of reproduction. If that rate is large, the hawk fraction is expected to be close to ½, where the entropy is maximal. And if the rate of reproduction is small, the hawk fraction is determined by the social energy. The latter case is more interesting, as we shall presently see.

Constrained Equilibria for Different Resource Values
We insert the specific expressions (23) and (27) into (28) and obtain the constrained entropy S_C in the form:

S_C(z_H) = −N [z_H ln z_H + (1 − z_H) ln(1 − z_H)] + βN (0.3 − 2.6 z_H + 2.3 z_H²).   (29)

For β = 0.5 and β = 1.5 the graphs of this function are shown in Figure 8. The interpretation of the graph in Figure 8a is easy: we recall that the maximum is reached in equilibrium and that the graphs illustrate the result of two trends, the entropic growth and the energetic decay. For β = 0.5 the entropic growth dominates and is only slightly influenced by the energetic decay toward the deeper energetic minimum at z_H = 0. Therefore the maximum of S_C lies a little distance to the left of the entropic maximum at z_H = ½. For any given z_H other than the maximizer, the constrained entropy S_C does not have its equilibrium value, so that there will be a slow drift toward the maximum in time.
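The two regimes can be reproduced with a direct grid search. The sketch below assumes the specific gain g(z_H) = 0.3 − 2.6 z_H + 2.3 z_H² (one reading of the contest rules of Section 5.2) and works per bird with k = 1:

```python
import math

def s_c(z, beta):
    """Constrained entropy per bird: mixing entropy plus beta times the gain."""
    h = 0.0 if z in (0.0, 1.0) else -(z * math.log(z) + (1 - z) * math.log(1 - z))
    return h + beta * (0.3 - 2.6 * z + 2.3 * z * z)

def interior_maxima(beta, n=2000):
    """Hawk fractions where s_c has a strict local maximum on an interior grid."""
    zs = [i / n for i in range(1, n)]
    v = [s_c(z, beta) for z in zs]
    return [zs[i] for i in range(1, len(v) - 1) if v[i - 1] < v[i] > v[i + 1]]

print(interior_maxima(0.5))  # one maximum, slightly left of z_H = 1/2
print(interior_maxima(1.5))  # two maxima: a dove-rich and a hawk-rich one
```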
More interesting is the graph of the constrained entropy shown in Figure 8b - and repeated in Figure 9 - which corresponds to β = 1.5: in this case there are two maxima of S_C and they correspond to dove-rich and hawk-rich populations. The entropy has given up its decisive role and the social energy dominates. With this we may envisage a situation in which the population is split into dove-rich and hawk-rich colonies, and where the population as a whole has a constrained entropy lying on the concavifying straight line which is represented by the common tangent of the concave parts of S_C(z_H), cf. Figure 9. In that case the value of S_C is higher than the value for the homogeneous population. We have:

S_C = (N′/N) S′_C + (N″/N) S″_C,

where S′_C, S″_C and N′, N″ are the constrained entropies and the numbers of birds in the colonies ′ and ″ respectively, as indicated in Figure 9, and N′/N is the fraction of birds in colony ′. The separation of the birds into colonies is a plausible strategy on their part. Indeed, when the energy dominates, and particularly the penalty of the dove-hawk competition, it is plausible that the birds avoid such an interspecies conflict. In that manner - by separating into colonies - doves and hawks rarely meet each other.
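That the split population sits higher than the homogeneous one is again easy to verify numerically. The colony compositions below (z′ = 0.02, z″ = 0.92) are illustrative values near the two peaks, not the exact tangent points, and the gain function is the same assumed g(z_H) = 0.3 − 2.6 z_H + 2.3 z_H²:

```python
import math

def s_c(z, beta=1.5):
    """Constrained entropy per bird for the assumed gain function."""
    h = 0.0 if z in (0.0, 1.0) else -(z * math.log(z) + (1 - z) * math.log(1 - z))
    return h + beta * (0.3 - 2.6 * z + 2.3 * z * z)

# Overall hawk fraction z, split into a dove-rich and a hawk-rich colony.
z, z1, z2 = 0.5, 0.02, 0.92
x = (z2 - z) / (z2 - z1)   # lever rule: fraction of birds in the dove-rich colony
s_split = x * s_c(z1) + (1 - x) * s_c(z2)
print(s_split, s_c(z))     # the split population has the higher constrained entropy
```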
Of course, as long as the left peak in Figures 8 and 9 is higher than the right one - as it will be in our case for any value of β - the points on the concavification do not correspond to equilibria. In time there is a drift to the higher maximum.
The whole situation is akin to a two-phase system in physics, a liquid-vapor system (say). In the present case, under the influence of their in-bred contest strategy, the birds separate into "phases" - here hawk-rich and dove-rich colonies - because that policy guarantees a higher gain for all birds than homogeneity.

Figure 1. Energy, entropy and available free energy of a planetary atmosphere in a spherical shell of thickness H (M_A: mass of the atmosphere). Continuous graphs refer to β = 1, dashed ones to β = 8.

Figure 2. The Pfeffer tube, before and after the addition of salt.

Figure 3. Energy, entropy and available free energy as functions of H in the Pfeffer tube (data given in the text).

Figure 4. Models of a polymer molecule.

Figure 5. a. Entropy as a function of q = 2N_a − N; b. "Potential well" for the population.

Figure 6. Selective free energy as a function of T and N_a.

Figure 7. a. Entropy as a function of the hawk fraction z_H; b. Expected gain and "social" energy E.

Figure 8. Constrained entropy S_C for β = 0.5 and β = 1.5.

Figure 9. Separation of the population into colonies ′ and ″, with constrained entropies S′_C, S″_C and bird numbers N′, N″.