Article

Entropy vs. Majorization: What Determines Complexity?

1
Department of Marine Sciences, Texas A&M University at Galveston, P.O. Box 1675, Galveston, TX 77553, USA
2
School of Marine Science and Policy, University of Delaware, Newark, DE 19716, USA
*
Author to whom correspondence should be addressed.
Entropy 2014, 16(7), 3793-3807; https://doi.org/10.3390/e16073793
Submission received: 18 April 2014 / Revised: 10 June 2014 / Accepted: 1 July 2014 / Published: 9 July 2014

Abstract:
The evolution of a microcanonical statistical ensemble of states of isolated systems from order to disorder, as determined by increasing entropy, is compared to an alternative evolution that is determined by mixing character. We note that the partitions of an integer N are in one-to-one correspondence with macrostates for N distinguishable objects. Orders for integer partitions are given, including the original order by Young and the Boltzmann order by entropy. Mixing character (represented by Young diagrams) is seen to be a partially ordered quality rather than a quantity (see Ruch, 1975). The majorization partial order is reviewed, as is its Hasse diagram representation, known as the Young Diagram Lattice (YDL). Two lattices that show allowed transitions between macrostates are obtained from the YDL: we term these the mixing lattice and the diversity lattice. We study the dynamics (time evolution) on the two lattices, namely the sequence of steps on the lattices (i.e., the path or trajectory) that leads from low-entropy, less mixed states to high-entropy, highly mixed states. These paths are sequences of macrostates with monotonically increasing entropy. The distributions of path lengths on the two lattices are obtained via Monte Carlo methods, and surprisingly both distributions appear Gaussian. However, the width of the path length distribution for diversity scales as the square root of the width for mixing, suggesting a qualitative difference in their temporal evolution. Another surprising result is that some macrostates occur in many paths while others occur in almost none. The evolution at low entropy and at high entropy is quite simple, but at intermediate entropies the number of possible evolutionary paths is extremely large (due to the extensive branching of the lattices). A quantitative complexity measure associated with incomparability of macrostates in the mixing partial order is proposed, complementing Kolmogorov complexity and Shannon entropy.

We’d been talking about storytelling, how there could be many versions of the same story, many ways of telling, and how each version was a kind of manifestation, as if the story itself was a living, evolving entity, a god capable of many guises.
Vaddey Ratner, In the Shadow of the Banyan [1]

1. Introduction

The evolution from order to disorder involves at least three readily recognizable stages. System complexions near complete order are simple, but as the system ages the simple complexions evolve into intricate structures. This process ceases at equilibrium when the disorder is maximized. For example, the collections of cells making up individual organisms evolve from rudimentary forms to mature individuals, but then age until death. Physical systems such as gases and electro-mechanical devices as well as social structures and ecosystems apparently follow a similar evolutionary template. Appeal to the second law of thermodynamics, namely that the entropy of a system must increase until the system reaches equilibrium, is often invoked as a qualitative explanation of this universal phenomenon. Here we move the dialogue beyond the entropy argument alone, with a simple model that quantifies the evolution of generic systems to equilibrium. Our approach relies on elementary ideas founded in statistical physics and thermodynamics.
Central to the approach is the concept of macrostates and microstates of complex systems. A macrostate is described by a few, preferably observable, variables. Microstates, on the other hand, deal with the specific details making up the system. For example the same temperature and pressure of 22.4 L of gas can be produced by an enormous number of combinations of positions and momenta of the approximately 6 × 1023 molecules in the container. As a second example, a computer screen will look exactly the same if a character is entered in an application and then deleted. However, the internal configuration of the computer has changed. The internal clock has advanced, background housekeeping programs run nearly continuously, the mailer may be refreshed, etc. These details, however, do not impact the macrostate depicted on the screen.
In order to connect statistical physics characterizations of entropy with the evolution of a system, it is customary to use the Boltzmann entropy. It was originally introduced in statistical mechanics to describe the macrostate of a gas by its characterization in a six dimensional position—momentum phase space for each of the molecules. Since then it has become the default disorder measure in disciplines as diverse as information theory (Shannon entropy), economics, and ecology. Arguably, it is the most widely used formula in science and technology.
The Boltzmann entropy is given by [2,3]
S/k = log Ω = log( N! / ∏_{i=1}^{N} λ_i! )    (1)
where k is the Boltzmann constant, N is the total number of distinguishable objects in the system, and λ_i is the number of objects in cell i. The term in parentheses on the right hand side of Equation (1) is simply the number of ways (i.e., permutations) the N distinguishable objects can be distributed among cells with occupancies λ_i, and is equal to the number of microstates, Ω, capable of producing the macrostate with entropy S. Inspection of Equation (1) shows that if all the objects are in one cell then the entropy is 0, and if they are equally distributed so that each cell has only one object then the system entropy achieves its maximum value of log N!. We contend that both of these states are conceptually simple. The Boltzmann entropy defines irreversibility in isolated systems.
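As an illustration, Equation (1) can be evaluated directly for small macrostates; a minimal Python sketch (the function name is ours, and log-gamma is used in place of factorials for numerical convenience):

```python
import math

def boltzmann_entropy(partition):
    """S/k = log(N!) - sum_i log(lambda_i!) for a macrostate [l1, l2, ...]."""
    n = sum(partition)
    return math.lgamma(n + 1) - sum(math.lgamma(l + 1) for l in partition)

# The two conceptually simple extremes for N = 10:
print(boltzmann_entropy([10]))       # all objects in one cell -> 0.0
print(boltzmann_entropy([1] * 10))   # one object per cell -> log(10!) ~ 15.1
```

Any intermediate partition, e.g. [4, 3, 1, 1, 1], yields a value strictly between these two extremes.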
For non-isolated systems, entropy is still a well-established thermodynamic quantity, though its change alone does not determine irreversibility. It is generally not directly measured experimentally. Nevertheless, statistical interpretations of entropy prove useful in a qualitative sense because increasing entropy is one of two factors determining spontaneity of chemical reactions at constant temperature and pressure (the other factor being enthalpy, which when combined with entropy gives the Gibbs free energy). Thus, for non-isolated systems, chemists often invoke heuristic arguments based on statistical disorder. However, for isolated systems near equilibrium, the second law states that entropy quantitatively determines whether or not a chemical or physical process can occur spontaneously. Following Ruch [4], we show that if an isolated system is far from equilibrium, there are many sequences of macrostates that lead to equilibrium, all of which are consistent with a monotonic increase in their Boltzmann entropy. In other words, while the entropy of an isolated system must increase as it evolves toward equilibrium, this can be accomplished in many ways. In this paper we will explore two types of sequences of macrostates that lead to equilibrium, both based on fundamental criteria for the simplest transition from one (more ordered) state to the next (less ordered) state. While these approaches to equilibrium do not in any way violate the second law of thermodynamics, they do not require that entropy uniquely determines the transition from a lower to a higher entropy macrostate. Indeed we find that every individual sequence involves a comparatively small number of macrostates—though, of course, an infinite ensemble of all sequences contains all macrostates.
The paper begins with two sections defining Young diagrams (which uniquely characterize the macrostates) and defining a partial order relation for these diagrams. A unique Boltzmann entropy for each macrostate is easily defined by Equation (1). In Section 4 we give results for the distribution of sequence lengths for ensembles of paths from low to high entropy via Monte Carlo simulation. Two evolutions for the macrostates (diversity and mixing) are studied and both exhibit monotonically increasing Boltzmann entropy. Our arguments will be couched in terms of the lattice of Young diagrams (well known as the Young Diagram Lattice) that describes the partial order of integer partitions according to dominance or majorization [5].
The dynamics on the lattices is studied in Section 5, and using the usual master equation arguments, may be related to the time required to reach equilibrium depending upon the assumed criterion for transitions between macrostates. In Section 6, we discuss some curious properties of the macrostates and their probabilities of occurrence. Finally, Section 7 discusses the results from the perspective of irreversibility and proposes a general relationship between entropy, complexity, and the incomparability of macrostates.

2. Mixing: The Young Diagram and Partitions of Integers

In 1900, Cambridge University mathematician Alfred Young introduced the Young Diagram (YD) [6]. It consists of a finite collection of boxes arranged in left-justified rows such that, proceeding from the top to the bottom of the diagram, successive rows have equal or fewer boxes. Figure 1 shows a YD with 10 boxes (where we have introduced the notation [4,3,1,1,1] for the diagram). (Note on notation: (a) the partition [5] represents the partition of 5 into [5,0,0,0,0]; (b) the partitions [3,1,1] and [1,1,1,1,1] can be written as [3, 1^2] and [1^5].) By construction, the Young diagrams are in one-to-one correspondence with the partitions of integers. The number of partitions, P(N), grows rapidly with N. For example, there are 9,253,082,936,723,602 partitions for N = 300 and more than 10^31 for N = 1000. Hardy and Ramanujan [7] obtained an asymptotic formula for the number of partitions of the integer N:
P(N) ~ 1/(4N√3) exp( π √(2N/3) )    (2)
While there is an entropy associated with every partition (obtained from Equation (1)), the Young diagram itself gives a more complete description of the macrostate.
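The correspondence between Young diagrams and integer partitions, and the rapid growth of P(N), can be checked directly; a small sketch (the recursive generator is ours, not from the paper, and is practical only for modest N):

```python
import math

def partitions(n, max_part=None):
    """Yield the partitions of n as non-increasing tuples (Young diagrams)."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def hardy_ramanujan(n):
    """Asymptotic estimate P(N) ~ exp(pi sqrt(2N/3)) / (4 N sqrt(3))."""
    return math.exp(math.pi * math.sqrt(2 * n / 3)) / (4 * n * math.sqrt(3))

print(sum(1 for _ in partitions(30)))    # exact count: P(30) = 5604
print(round(hardy_ramanujan(30)))        # the asymptotic form overshoots at small N
```

The asymptotic estimate is an overestimate at small N but captures the exponential-in-√N growth.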

3. Orders and Partial Orders for Young Diagrams

For a given N, the set of partitions can be ordered by various means, but partitions also can be partially ordered. Here we briefly consider two total orders and a partial order.
Young’s original order is as follows [6]. Reading from the top down, the first horizontal line in one diagram that is longer than the corresponding line in the second diagram makes the first diagram the larger. While this provides a unique order to the set of Young diagrams (and partitions), there seems to have been little use for this order in thermodynamics or statistical mechanics.
Recognizing that the partitions of integers are in one-to-one correspondence with macrostates in the microcanonical ensemble, a second example of an ordering of Young diagrams is by entropy. Let [λ_1, λ_2, λ_3, ..., λ_N] (λ_i defined as in Equation (1)) be a partition; then for an isolated system the Boltzmann entropy is given by Equation (1). The Boltzmann entropy provides an order for YDs, i.e., the diagrams are ordered according to increasing entropy, with partition [N] having entropy 0 and [1^N] having entropy log N!.
To define the partial order, we introduce the concept of majorization or dominance. Let λ = [λ_1, λ_2, ..., λ_N], where λ_i is the number of objects of type i (or class i), be a partition of N.
In general we can take λ_i ≥ λ_j if i < j. The criterion of majorization (or dominance) is that a partition λ majorizes another partition μ if
∑_{i=1}^{m} λ_i ≥ ∑_{i=1}^{m} μ_i,   m = 1, 2, ..., N    (3)
and we say that λ majorizes μ, symbolized by λ ≽ μ.
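Inequality (3) translates directly into code; a sketch (the function name is ours; the shorter partition is zero-padded so the partial sums align):

```python
from itertools import accumulate

def majorizes(lam, mu):
    """True if partition lam majorizes mu: every partial sum of lam
    dominates the corresponding partial sum of mu (Inequality (3))."""
    n = max(len(lam), len(mu))
    a = list(lam) + [0] * (n - len(lam))
    b = list(mu) + [0] * (n - len(mu))
    return all(x >= y for x, y in zip(accumulate(a), accumulate(b)))

print(majorizes((4, 1, 1), (3, 2, 1)))   # True: partial sums 4,5,6 dominate 3,5,6
print(majorizes((4, 1, 1), (3, 3)))      # False in both directions:
print(majorizes((3, 3), (4, 1, 1)))      # this pair is incomparable
```

The pair [4, 1, 1] and [3, 3] illustrates incomparability: neither sequence of partial sums dominates the other.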
The majorization partial order can be represented by its Hasse Diagram, in which the nodes are labeled by YDs representing the mixing character of the macrostates. A macrostate Φ is connected to macrostate Ψ if and only if the Young diagram for Φ can be obtained from that for Ψ by moving one box to the nearest shorter row. The Hasse diagram for N = 6 is shown by the solid lines in Figure 2. We will refer to this as the mixing lattice.
Now we define a second lattice, which we term the diversity lattice. In the diversity lattice, macrostate Φ can immediately succeed macrostate Ψ if the Young diagram for Φ can be obtained by moving one box in Ψ’s diagram to any shorter row (provided the result is still a valid Young diagram). It is easily seen that both lattices only connect macrostates with increasing entropy. The diversity lattice includes both the dotted and solid lines in Figure 2. Note that there are many more node connections in the diversity lattice than in the mixing lattice.
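One reading of the two transition rules can be sketched as follows (the encoding and function name are ours; for the mixing rule we take “nearest shorter row” to mean the nearest row on which the box can legally land, which reproduces the N = 6 connections of Figure 2):

```python
def successors(p, diversity=False):
    """Macrostates reachable from partition p by moving one box to a
    shorter row.  Mixing: only the nearest legal landing row;
    diversity: every legal landing row."""
    rows = list(p) + [0]                    # sentinel: a new row may open
    out = set()
    for i in range(len(rows) - 1):
        if rows[i] <= rows[i + 1]:
            continue                        # box removable only from the
                                            # last row of a given length
        landings = [j for j in range(i + 1, len(rows))
                    if rows[j] <= rows[i] - 2                   # lands on a strictly
                    and (j == i + 1 or rows[j] < rows[j - 1])]  # shorter row; result
        for j in (landings if diversity else landings[:1]):     # stays non-increasing
            q = rows[:]
            q[i] -= 1
            q[j] += 1
            out.add(tuple(x for x in q if x > 0))
    return sorted(out, reverse=True)

print(successors((3, 2, 1)))                    # mixing: (3,1,1,1) and (2,2,2)
print(successors((3, 2, 1), diversity=True))    # diversity adds (2,2,1,1)
```

For [3, 2, 1] this reproduces the connections quoted later in the text: the mixing lattice links it to [3, 1, 1, 1] and [2, 2, 2], and the diversity lattice adds [2, 2, 1, 1].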
The majorization partial order has a long history in number theory. In 1974, Ruch [4] proved that the majorization partial order describes the general process of mixing. Recent attention has been drawn to mixing with regard to quantum entanglement [8] and to molecular genetics as well [9]. The fundamental insight in Ruch’s paper is that mixing character has no numerical value; rather, it can only be represented by a diagram that represents the macrostate (i.e., the Young diagram). Even more generally, Ruch suggested that evolution in general proceeds according to increasing mixing character (as determined by the majorization partial order). This is a stronger requirement than increase of entropy alone, though entirely consistent with thermodynamic principles. As Ruch pointed out, evolution toward equilibrium is not necessarily determined by a single function such as entropy, but rather may be determined by a “quality”—the mixing character—which is partially ordered by majorization.

4. Paths to Equilibrium

We define a path (or chain) as a sequence of Young diagrams that proceeds from the most ordered partition [N] to the least ordered [1^N] in accord with the ordering rule assumed for the diagrams. For any unambiguous total order (not a partial order), there is only a single path, and it passes through every diagram in sequence. The length of this path is thus simply the total number of diagrams, given asymptotically by Equation (2).
For partially ordered macrostates, however, it is important to note that, while the macrostates still proceed from lower to higher entropy, they must also proceed in accordance with the rule for allowed transitions between macrostates, depicted in the mixing and diversity lattices. In both cases, the paths occur on a lattice and can be studied via Monte Carlo techniques similar to random growth models on tessellations or directed random walks on lattices [10]. For an isolated system, the macrostate [1^N] represents equilibrium. Thus, paths on the lattices are related to evolution from [N] to [1^N].
In this section we explore paths on both the diversity and mixing lattices. For low N (N ≤ 10), the paths can be found by inspection. However, since the number of nodes in the lattice increases exponentially with N for large N, one must resort to statistical sampling. Assuming an equal transition probability between any two macrostates (nodes) connected by a single edge in the lattice, an algorithm for evolution on the lattice may be obtained as follows. Assume that partition i = [i_1, i_2, i_3, ..., i_N] majorizes several partitions j_1, j_2, j_3, ..., j_m and is connected to each by a single edge; then one simply selects one of the j partitions at random (choosing a random number 1 ≤ r ≤ m) and then proceeds until reaching equilibrium. For example, consider the lattices in Figure 2. On the mixing lattice, partition [3, 2, 1] is connected to and majorizes [3, 1, 1, 1] and [2, 2, 2]. In this case m = 2 and the Monte Carlo method chooses r = 1 or 2 to select the next partition. On the diversity lattice [3, 2, 1] is also connected to [2, 2, 1, 1] (see the dotted line in the figure) and also majorizes it. Thus in this case, m = 3 and the Monte Carlo method chooses r = 1, 2, or 3 to select which partition occurs next in the path.
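The sampling procedure just described can be sketched end-to-end in a self-contained toy (the successor rule repeats our reading of the lattice definitions in Section 3; N = 20 and 500 samples are illustrative only, whereas the paper uses N = 200 and 10^7 paths):

```python
import random
import statistics

def successors(p, diversity=False):
    """One box moves to a shorter row; mixing keeps only the nearest
    legal landing row, diversity keeps them all."""
    rows = list(p) + [0]
    out = set()
    for i in range(len(rows) - 1):
        if rows[i] <= rows[i + 1]:
            continue
        landings = [j for j in range(i + 1, len(rows))
                    if rows[j] <= rows[i] - 2
                    and (j == i + 1 or rows[j] < rows[j - 1])]
        for j in (landings if diversity else landings[:1]):
            q = rows[:]
            q[i] -= 1
            q[j] += 1
            out.add(tuple(x for x in q if x > 0))
    return sorted(out, reverse=True)

def path_length(n, diversity=False):
    """Steps of one random walk from [n] down to [1^n] (equilibrium)."""
    state, steps = (n,), 0
    while len(state) < n:                   # only [1^n] has n rows
        state = random.choice(successors(state, diversity))
        steps += 1
    return steps

random.seed(1)
mixing = [path_length(20) for _ in range(500)]
divers = [path_length(20, diversity=True) for _ in range(500)]
print(statistics.mean(mixing), statistics.mean(divers))  # mixing paths are longer
```

Even at this toy size, the average path length on the mixing lattice exceeds that on the diversity lattice, consistent with the scalings reported below.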
We have obtained the distribution of path lengths on the diversity and mixing lattices. The distributions are shown in Figure 3 for N = 200 with a sample size of 10 million paths. Both have a Gaussian shape
f(x) = a e^{−(x−b)²/(2c²)}    (4)
For N = 200 there are over 3.9×10^12 partitions, so the sample size is “small”. Nevertheless, we found that the distribution appears well converged after 10 million paths are sampled.
If we assume that the average path length l varies as N^δ, Figure 4 shows a log-log plot yielding δ_diversity ∼ 1 and δ_mixing ∼ 1.375.
In summary, the path lengths vary with N as follows:
l ~ e^√N      (entropy order)
  ~ N^1.375   (mixing)
  ~ N         (diversity)    (5)
Next we consider the distribution widths. While both the diversity and mixing path distributions appear Gaussian, they have different average path lengths and different distribution widths. If we again assume a power law for the increase of the widths with N, i.e., full width at half maximum (FWHM) ∼ N^γ, then a plot of ln FWHM vs. ln N of the distributions should be linear with slope γ. The plots for both distributions are given in Figure 5.
From the figure we see that γ_mixing ∼ 1.18 and γ_diversity ∼ 0.61, or
γ_diversity ~ (1/2) γ_mixing    (6)
It is evident from Figure 2 that the number of allowed transitions between partitions is much larger for the diversity lattice than the mixing lattice, hence path lengths are shorter, and the distribution is narrower for the diversity lattice. This occurs because the added paths available on the diversity lattice are generally much shorter, leading to an overall decrease in average path length. The effect of this may be profound because system properties are determined by ensemble averages, which in turn may be averages over the properties of the macrostates as weighted by their probability of occurrence. Thus, there are far more macrostates that contribute to ensemble averages in the mixing lattice as compared with the diversity case. For large N, this will be particularly significant.
We note that the average path length on the mixing lattice is a power law very close to 4/3. Four-thirds power laws have been identified in empirical studies in a variety of disciplines. These include allometric relations in biology and ecosystems [11–17] and the growth of Koch curves. It seems then that the mixing lattice paradigm might provide a theoretical basis for this behavior. We will explore this idea further in future work.
In the next section we discuss the evolution from order to disorder in the context of Ruch’s General Thermodynamic Principle.

5. Paths and Time

We now consider various means by which an isolated system far from equilibrium might evolve toward equilibrium. Since the states are entropically ordered, entropy alone cannot rule out processes that pass through all or most of the partitions (since each step could correspond to a minimal increase of entropy). Thus, in principle, paths whose lengths increase exponentially with N are possible.
The increase in entropy is often argued to provide an arrow of time. Now, as we have already noted, paths on both the diversity and mixing lattices also pass through states of monotonically increasing entropy. However, for partial orders with “incomparable” diagrams (in our case those incomparable under majorization), a single path on the lattice clearly can go through only one of the incomparable states/partitions/diagrams. Nevertheless, every path on both the diversity and mixing lattices still corresponds to a sequence of states of monotonically increasing entropy. This led Ruch to propose his Principle of Increasing Mixing Character:
“The time development of a statistical (Gibbs-) ensemble of isolated systems (microcanonical ensemble) proceeds in such a way that the mixing character increases.”
We explore the consequences of this principle by considering the path length distributions in Section 4 with the assumption that each transition between states in the lattice connected by a single edge occurs with unit time [10]. Thus, for both lattices, ensembles of systems evolve from low to high entropy with a Gaussian distribution of steps to equilibrium. However, the average “relaxation time” from complete order to equilibrium is much longer for the mixing lattice than for the diversity lattice. From the results in Section 4
[relaxation(mixing)] ~ [relaxation(diversity)]^1.375    (7)
It is also evident from Equation (7) that for large N, the time to reach equilibrium assuming entropic ordering (where every macrostate must be visited and the length is simply the partition number) is exponentially longer than the evolution time for either diversity or mixing.
The distribution’s full width at half maximum provides an estimate of the “band width” of relaxation times so that
[band width(diversity)] ~ [band width(mixing)]^(1/2)    (8)
In other words, the spread of relaxation times for systems obeying mixing lattice evolution is approximately the square of that for systems obeying diversity lattice evolution. Attard [18,19] proposed the concept of second entropy to introduce time specifically. His work developed a variational principle, based on maximizing the so-called transition entropy, which is “the number of molecular configurations associated with a transition between macrostates in a specified time”. While different from (and more general than) our assumption of unit time per transition on the lattice, his work recognizes the need to explicitly consider the number of molecular configurations associated with macrostate transitions.

6. Properties of the Nodes and Paths

The most striking difference between an order and a partial order is that the latter has nodes that are not comparable. In the Hasse diagram for a partial order, incomparable nodes are those that cannot occur in the same path. It seems natural, then, to define an incomparability number for a node as the total number of nodes with which it cannot be compared according to the partial ordering rule. Both the diversity and mixing lattices satisfy the majorization criteria; hence the incomparability numbers for the nodes are the same in each. Thus if two nodes are incomparable with regard to mixing, they are also incomparable with regard to diversity. This property could be responsible for the common Gaussian shapes of the path distributions. Seitz proposed that incomparability is related to the complexity of the mixing character [20]. This will be discussed further in Section 7.
A second property is conjectured for nodes—namely, if the entropies of two nodes have the same value, those nodes are incomparable. For example, the partitions of N = 10, [4, 3, 1, 1, 1] and [3, 3, 2, 2], have the same entropy in Equation (1). We have verified the conjecture for N ≤ 10. Note, however, that iso-entropic nodes can have different incomparability numbers. We have not investigated the implications of this fact.
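Both the incomparability number and the iso-entropy conjecture can be checked by brute force for small N; a sketch (helper names are ours; for fixed N, equal entropy in Equation (1) is equivalent to equal integer ∏λ_i!, which avoids floating-point comparisons):

```python
import math
from collections import defaultdict
from itertools import accumulate

def partitions(n, max_part=None):
    """Yield the partitions of n as non-increasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def majorizes(lam, mu):
    n = max(len(lam), len(mu))
    a = list(lam) + [0] * (n - len(lam))
    b = list(mu) + [0] * (n - len(mu))
    return all(x >= y for x, y in zip(accumulate(a), accumulate(b)))

def incomparability(p):
    """Number of partitions of the same N incomparable with p."""
    return sum(1 for q in partitions(sum(p))
               if q != tuple(p) and not majorizes(p, q) and not majorizes(q, p))

# The iso-entropic pair from the text (both have prod = 144) is incomparable:
print(incomparability((4, 3, 1, 1, 1)), incomparability((3, 3, 2, 2)))

# Conjecture: nodes with equal entropy are pairwise incomparable, N <= 10.
for n in range(2, 11):
    groups = defaultdict(list)
    for p in partitions(n):
        groups[math.prod(math.factorial(k) for k in p)].append(p)
    for ps in groups.values():
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                assert not majorizes(p, q) and not majorizes(q, p)
print("conjecture verified for N <= 10")
```

The extreme nodes [N] and [1^N] are comparable with every partition, so their incomparability number is zero, consistent with incomparability being small at low and high entropy.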
Next, we briefly consider the frequency with which nodes are visited in the Monte Carlo procedure. Table 1 shows the visit frequency for selected nodes for N = 10 and N = 36. First for N = 36, one sees that practically no paths go through macrostates with partitions that begin with [6, 6, 6, 6, ...] for both mixing and diversity lattices. However, there is a significant difference between paths through macrostates corresponding to partitions beginning with [8, 7, 7, 5...]; 7.75% visit one of those partitions on the mixing lattice but they are almost never visited on the diversity lattice. In the diversity lattice, there are far more short paths than long paths, which for larger N becomes the dominant feature determining the path length distribution. In fact, since every path in the mixing lattice is also a path in the diversity lattice, a careful examination of Figure 3 shows that the mixing paths, though possible, essentially never occur in the diversity distribution for large N.
The results in Table 1 for N = 10 further illustrate the complex relationship between diversity and mixing paths to equilibrium—in mixing evolution 32.5% of the paths pass through [4, 4, 1, 1] and [5, 2, 1, 1, 1], while for the diversity evolution only 3.4% go through [4, 4, 1, 1], and half go through [5, 2, 1, 1, 1]. The paths along the right hand side of the lattice (for the N = 10 mixing lattice see Ruch 1974) then provide an express route to equilibrium on the diversity lattice, many steps of which are prohibited on the mixing lattice.

7. Discussion: Irreversibility, Entropy, Mixing and Complexity

Several alternative paths leading from completely ordered macrostates with zero entropy to the equilibrium macrostate with maximum entropy have been presented and were shown to predict three different dynamics for the evolution of isolated thermodynamic systems. Two of the evolutions studied occurred on lattices where the macrostates are partially ordered according to majorization or dominance, though both still satisfied the requirement that every spontaneous transition must lead to an increase in entropy. As a function of particle number N, the evolution based on entropy alone allowed a number of steps to equilibrium growing exponentially, the evolution based on mixing showed an increase in steps to equilibrium scaling as N^1.375, and a third evolution (which allows for a more rapid increase in diversity) showed a linear increase in steps to equilibrium. Next we discuss these results in the context of previous observations regarding irreversibility, with an emphasis on the mixing partial order for macrostates.
In his paper, The Many Faces of Irreversibility, Denbigh [21] observed that “What appears to be the common feature of irreversibility is the fanning out of trajectories, new entities or new states, in the temporal direction towards the future”. Our Monte Carlo simulations illustrated this feature on the two lattices obtained by application of the diversity and mixing rules for transitions between nodes in the YDL. The majorization partial order depicted in the YDL, in turn, was proved by Ruch [4] to represent the general property of “mixedness” of macrostates in the microcanonical ensemble. Our work focused primarily on the distribution of lengths of the trajectories leading temporally from complete order to disorder, and couched much of the discussion in the context of Boltzmann entropy. Indeed, we found that all of the allowed trajectories followed paths of monotonically increasing entropy, and hence the assumption that irreversible trajectories are defined by increase in mixing character is fully consistent with the usual thermodynamic view of increasing entropy. While it is true that there are other functions that increase monotonically along the trajectories (see Ruch’s definition of column partitions and corresponding column entropy [4]), such functions were not discussed. Grad [22] in his seminal paper “The Many Faces of Entropy” discussed this general idea in some mathematical detail.
Denbigh [21] also suggested that there are three distinct forms for the “divergent quality” of the trajectories: (a) a branching towards a greater number of distinct kinds of entities; (b) a divergence from each other of particle trajectories or of sections of wave fronts; and (c) a spreading over an increased number of states of the same entities.
To connect this to our work here, we consider two complementary, yet very different, definitions for entities along a path. The first defines the entity of a macrostate as the set of macrostates that occur along the path that preceded it, i.e., the set of macrostates on the path leading up to the macrostate (its history). Thus, a given macrostate will have many possible histories depending upon the path taken to reach it. When viewed in this manner, Denbigh’s first form for “divergent quality” describes the diversity of trajectories that occur on the lattice as one proceeds from low to high mixing or entropy, and the second form is related to the fact that the odds that any two paths will be the same are vanishingly small for large N. We note that this divergence seems to occur generally in nature. For example, there is enormous diversity among human beings that is based on an individual’s detailed history and genetic diversity.
To understand Denbigh’s third form for divergent quality we describe a different type of entity associated with a macrostate Ψ. This entity is the set of macrostates to which Ψ is incomparable. Now, proceeding from complete order along a path, the number of states incomparable with the macrostate Ψ generally increases, until one reaches the middle of the lattice. This behavior illustrates Denbigh’s third form of divergent quality. By definition of incomparability, the members of the set of macrostates incomparable with macrostate Ψ are those that could not occur on any trajectory containing Ψ. We term the order of this set the incomparability of the macrostate and denote it I_Ψ. In some sense, these sets represent “roads not taken”, and they complement the history. Next, evolution from the middle (most complex) portion of the diagram to equilibrium continues through macrostates, but this time the incomparability generally decreases as one approaches equilibrium. Hence a new property of the macrostates has emerged, namely the extent of their incomparability with other macrostates. The incomparability is small for low and high entropies, but quite high at intermediate entropy/mixing character. We conjecture that this property, measured by I_Ψ, may be viewed as a complexity function for macrostates. We further conjecture that this complexity will approximate a convex function of the entropy, S_Ψ, of Ψ. (Note, if macrostates Φ and Ψ are incomparable, it does not follow that I_Φ = I_Ψ. This is because Ψ being incomparable with states Φ and Θ does not necessarily imply that states Φ and Θ are also incomparable. In mathematical order theory, sets of macrostates that are mutually incomparable to one another are antichains. A subtle point is that the order of a maximal antichain is less than or equal to the incomparability number.) A similar relationship between incomparability and complexity has been suggested previously [20]. Our complexity measure here is also consistent with others (see, for example, Γ_11 in Shiner et al. [23], Huberman & Hogg [24], and the complexity discussion on the Scholarpedia website).
Examples of system evolution from simple ordered states to complex ones abound. Organizations, cities, organisms, ecosystems, companies and civilizations all typically begin in a simple, fairly uniform environment. Generally, there is energy input, so that entropy increase and energy combine constructively to increase size and diversity, leading to, for example, political parties, New York City, human beings, rain forests, Exxon and Rome. Yet, in each case, there appears to come a point where further growth and complexity become increasingly hard to come by, despite the fact that energy input has not ceased. It seems that, absent an ever increasing energy input, decay inevitably sets in, growth ceases, and systems return to simple, but this time disordered, states. All of this, of course, is why entropy is invoked as the arrow of time. In our model system, these observations correspond to the evolution of incomparability of the macrostates that are present.
From Norway to Italy and from France to Greece, one finds (in the European context) popular prints in which a tiered bridge represents the Scale of Life or Stages of Life. In Figure 6, we see a simple baby, not very different from other babies, beginning life. As children, differences become more apparent and there is an increase of incomparability among them, which continues to increase until middle age. At that stage of life, people are often labeled as lawyers, doctors, mechanics, mothers, computer programmers, bowlers, football players, etc. It is then that they are most incomparable with one another. However, as people age and retire, even though their histories continue to diverge, their incomparability with one another seems to decrease, until at extremely old age and death, we again find a high degree of comparability. We regard incomparability as a fundamental characteristic of macrostates, which emerges only when one considers these states as being partially ordered [25]. More generally, attention to partial orders that are present in complex systems has enormous potential for future advances in many fields.
Finally, we list several challenges that arise from the view taken here that mixing character is the fundamental property of macrostates in the microcanonical ensemble that determines spontaneity in isolated systems. First, we found in Section 6 that the frequencies with which macrostates occur in an ensemble of trajectories differ drastically: some macrostates occur in a vanishingly small fraction of trajectories, while others occur with much higher frequency. It is therefore important to characterize the relationship between a macrostate's mixing character and its occurrence frequency. Second, it is intriguing that the union of the sets of macrostates incomparable with those visited on any single trajectory must contain every macrostate; this is a topic of ongoing research. Third, our assumption of equal transition probabilities between macrostates on the lattices, while consistent with common practice, is ad hoc. Introducing transition probabilities based on the difference of the Boltzmann entropies of neighboring macrostates (or on other criteria) might yield intermediate cases between mixing and diversity (i.e., the transitions may be weighted according to the entropy change during the transition or other factors). Finally, the complexity measure proposed here should be studied further, to clarify the conjectures above and to explore in more detail its relationship to other complexity measures.
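The Monte Carlo sampling of mixing-lattice trajectories discussed above can be sketched as follows. This is an illustrative sketch, not the authors' code: it assumes the mixing-lattice transitions are the covering moves of the dominance (majorization) order, generated here via Brylawski's characterization (a single box moves from row i to row j, where either j = i + 1, or the rows differ by exactly two boxes with all intervening rows one box shorter than row i). The names `covers` and `path_length` are hypothetical.

```python
import random

def covers(p):
    """Macrostates immediately below partition p in the dominance order,
    using Brylawski's characterization of covering moves."""
    lam = list(p) + [0]                     # trailing 0 permits starting a new row
    out = []
    for i in range(len(lam) - 1):
        # case (a): move one box from row i to row i + 1
        if lam[i] - lam[i + 1] >= 2:
            q = lam[:]; q[i] -= 1; q[i + 1] += 1
            out.append(tuple(x for x in q if x))
        # case (b): move one box from row i to a lower row j, where
        # lam[i] - lam[j] == 2 and every row strictly between equals lam[i] - 1
        for j in range(i + 2, len(lam)):
            if lam[i] - lam[j] == 2 and all(lam[k] == lam[i] - 1
                                            for k in range(i + 1, j)):
                q = lam[:]; q[i] -= 1; q[j] += 1
                out.append(tuple(x for x in q if x))
    return out

def path_length(N, rng=random):
    """Number of steps in one random walk from [N] (least mixed) down to
    [1]*N (most mixed), each allowed transition taken with equal probability."""
    p, steps = (N,), 0
    while len(p) < N:                       # stop at the all-ones partition
        p = rng.choice(covers(p))
        steps += 1
    return steps

# Sample a path-length distribution, e.g. for N = 20:
lengths = [path_length(20) for _ in range(1000)]
mean = sum(lengths) / len(lengths)
```

For N ≤ 5 the Young Diagram Lattice is a chain, so every walk is forced (e.g., every path from [5] to [1, 1, 1, 1, 1] has length 6); branching, and with it the spread of path lengths, appears only for larger N. Entropy-weighted transitions could be introduced by replacing `rng.choice` with a draw proportional to the Boltzmann-entropy change of each candidate move.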

Author Contributions

William Seitz conceived the idea of calculating the walks on the Young Diagram Lattice for the mixing relation and performed all calculations. A. D. Kirwan Jr. suggested the diversity lattice and worked with Seitz to clarify the notion of complexity as related to the majorization partial order in particular, and to incomparability in general. Both authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ratner, V. In the Shadow of the Banyan: A Novel; Turtleback Books: St. Louis, MO, USA, 2012.
  2. Boltzmann, L. Bemerkungen über einige Probleme der mechanischen Wärmetheorie. Wien. Berichte 1877, 75, 62–100. (In German)
  3. Boltzmann, L. Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht. Wien. Berichte 1877, 76, 373–435. (In German)
  4. Ruch, E. The Diagram Lattice as Structural Principle. Theor. Chim. Acta (Berl.) 1975, 38, 167–183.
  5. Muirhead, R. Some methods applicable to identities and inequalities of symmetric algebraic functions of n letters. Proc. Edinb. Math. Soc. 1903, 21, 144–157.
  6. Young, A. On Quantitative Substitutional Analysis. Proc. Lond. Math. Soc. 1900, 33, 97–146.
  7. Hardy, G.; Ramanujan, S. Asymptotic Formulae in Combinatory Analysis. Proc. Lond. Math. Soc. 1918, 17, 75–115.
  8. Nielsen, M.A. Characterizing mixing and measurement in quantum mechanics. Phys. Rev. A 2000, 61, 064301.
  9. Wan, H.; Wootton, J. A global compositional complexity measure for biological sequences: AT-rich and GC-rich genomes encode less complex proteins. Comput. Chem. 2000, 24, 71–94.
  10. Richardson, D. Random growth in a tessellation. Math. Proc. Camb. Philos. Soc. 1973, 74, 515–528.
  11. West, G.B.; Brown, J.H.; Enquist, B.J. A General Model for the Origin of Allometric Scaling Laws in Biology. Science 1997, 276, 122–126.
  12. Augier, P.; Galtier, S.; Billant, P. Kolmogorov laws for stratified turbulence. J. Fluid Mech. 2012, 709, 659–670.
  13. Kirkwood, D.H.; Ward, P.J. Comment on the power law in rheological equations. Mater. Lett. 2008, 62, 3981–3983.
  14. West, G.B.; Woodruff, W.H.; Brown, J.H. Allometric scaling of metabolic rate from molecules and mitochondria to cells and mammals. Proc. Natl. Acad. Sci. USA 2002, 99, 2473–2478.
  15. Richardson, L.F.; Stommel, H. Note on Eddy Diffusion in the Sea. J. Meteorol. 1948, 5, 238–240.
  16. Kirwan, A.D., Jr. Quantum and Ecosystem Entropies. Entropy 2008, 10, 58–70.
  17. Ollitrault, M.; Gabillet, C.; Verdiere, A.C.D. Open ocean regimes of relative dispersion. J. Fluid Mech. 2005, 533, 381–407.
  18. Attard, P. The Second Entropy: A Variational Principle for Time-dependent Systems. Entropy 2008, 10, 380–390.
  19. Attard, P. The second entropy: A general theory for non-equilibrium thermodynamics and statistical mechanics. Annu. Rep. Prog. Chem., Sect. C: Phys. Chem. 2009, 105, 63–173.
  20. Seitz, W.A. Partial Orders and Complexity: The Young Diagram Lattice. Partial Orders Environ. Sci. Chem. 2006, 1, 366–384.
  21. Denbigh, K.G. The Many Faces of Irreversibility. Br. J. Philos. Sci. 1989, 40, 501–518.
  22. Grad, H. The Many Faces of Entropy. Commun. Pure Appl. Math. 1961, 14, 323–354.
  23. Shiner, J.S.; Davison, M.; Landsberg, P.T. Simple measure for complexity. Phys. Rev. E 1999, 59, 1459–1464.
  24. Huberman, B.; Hogg, T. Complexity and Adaptation. Physica D 1986, 22, 376–384.
  25. Klein, D.J. Similarity and dissimilarity in posets. J. Math. Chem. 1995, 18, 321–348.
Figure 1. Partition [4, 3, 1, 1, 1] of N =10.
Figure 2. Mixing and Diversity Lattice Diagrams: The solid lines are allowed transitions in mixing, while dotted lines indicate additional allowed transitions in the diversity lattice.
Figure 3. Distributions of path lengths for N = 200. The top distribution is for paths on the diversity lattice; the bottom distribution is for paths on the mixing lattice. Note significantly different widths and average path lengths.
Figure 4. Log-log plots of the average path length vs. N. The top curve is for the mixing lattice. Note slope is 1 for diversity, but fractional for mixing.
Figure 5. Log-log plots of the full-width half-maximum vs. N. The top curve is for the mixing lattice.
Figure 6. The complexity of individuals is greatest at middle age, while the complexity of groups near birth and death is less. The figure is Talamantes' famous "Steps of Life - Female".
Table 1. Visit Frequency to Partitions (20 million paths sampled).
N  | Partition            | Mixing Visits | Diversity Visits
---|----------------------|---------------|-----------------
36 | [8, 7, 7, 5, all 1s] | 1,547,988     | 57
36 | [6, 6, 6, 6, all 1s] | 415           | 0
10 | [4, 3, 3]            | 2,499,401     | 312,338
10 | [4, 4, 1, 1]         | 7,500,069     | 682,480
10 | [5, 2, 1, 1, 1]      | 7,502,520     | 10,074,654
10 | [4, 2, 2, 2]         | 4,169,246     | 880,079
10 | [3, 3, 3, 1]         | 4,163,456     | 2,285,122
10 | [4, 3, 2, 1]         | 12,497,480    | 2,063,059
