Article

Landauer’s Principle as a Special Case of Galois Connection

by Radosław A. Kycia 1,2
1 Department of Mathematics and Statistics, Masaryk University, Kotlářská 267/2, 611 37 Brno, Czech Republic
2 Faculty of Physics, Mathematics and Computer Science, Cracow University of Technology, Warszawska 24, 31-155 Kraków, Poland
Entropy 2018, 20(12), 971; https://doi.org/10.3390/e20120971
Submission received: 14 October 2018 / Revised: 3 December 2018 / Accepted: 12 December 2018 / Published: 14 December 2018

Abstract

It is demonstrated how to construct a Galois connection between two related systems with entropy. The construction, called the Landauer’s connection, describes the coupling between two systems with entropy. It is straightforward and transfers changes in one system to the other, preserving the ordering structure induced by entropy. The Landauer’s connection simplifies the description of the classical Landauer’s principle for computational systems. Categorification and generalization of the Landauer’s principle open the area of modeling various systems in the presence of entropy in abstract terms.

1. Introduction

There are various kinds of entropy describing different systems, e.g., in computation, physics, and dynamical systems. In continuous thermodynamic systems, e.g., ideal gases, the entropy has the precise meaning of a function that provides a foliation of the thermodynamic space of states [1,2], which is the statement of the Carathéodory formulation of the second law of thermodynamics. This approach requires a continuous structure on the space of states of the system [3,4]. Entropy can also be used as a comparison measure between states [3,4], which is the viewpoint used in this paper. There is also a point of view that the entropy in a theory can be traced to inaccurate (as it always is) measurement [5], and that the only crucial quantity is the difference in entropy, not the entropy itself. In the theory of dynamical systems, topological entropy is used to measure the level of dynamical complexity of a system [6]. In information theory, the (Shannon) entropy measures how information is produced by its (stochastic) source [7]. This discussion could be extended considerably; however, it is not the aim of this paper to survey the vast literature on the subject. These various interpretations show that the notion of entropy is not yet well understood.
Apart from these different approaches, better insight is possible when systems with entropy are “connected” in the following sense. In 1961, Rolf Landauer introduced a principle (Landauer’s principle) for irreversible computing, in which he postulated that every act of erasing information results in expelling at least k_B T ln(2) [Joules per bit] of heat to the environment (here, k_B is the Boltzmann constant and T is the temperature of the system) [8], i.e., it increases thermodynamic entropy. This principle has profound implications in explaining the old thermodynamic paradox of Maxwell’s demon [9,10]. There was a dispute about the validity of this principle; however, a careful derivation [11] and experimental results, e.g., the experimental setup close to the one proposed by Landauer presented in [12] and the recent verification in quantum systems reported in [13], prove that the principle is correct. The principle is based on the assumption that every computational system (in particular, computer memory) is implemented with the help of a physical system, and this is the link between the information and physical realms.
This paper is an attempt to generalize this principle to every system that contains entropy. This generalized connection between systems is the minimal one that preserves the entropy-induced ordering. The kinds of structures from category theory [14,15] that are involved in the Landauer’s principle are described in what follows. The category theory approach has been used in studying entropy before, e.g., in [16,17]. In this view, the present work is an extension of [11], where this categorical viewpoint is not taken. It is shown that Table 1 of [11] (see Table 1 in this paper) is an indication of a Galois connection. The mappings (functors) between the states of one system (e.g., a logical system) and the second one (a thermodynamic realization of the logical system) preserve entropy properties. These features, when properly defined, are exactly the properties required for the existence of a Galois connection between these two systems seen as ordered sets.
A few remarks are in order before we provide details. Category theory is not an alternative approach to proving the Landauer’s principle, in the same way as it is not a tool for proving basic properties of objects in mathematics. Instead, it offers a layer of abstraction (sometimes called “abstract nonsense”) that allows promoting specific features (e.g., the Landauer’s principle proven by thermodynamic methods) to “universal properties” that can be observed in any other system sharing specific common features. To use this link between a specific phenomenon/object and category theory, these universal properties have to be proven using domain-specific methods. Then, the language of category theory can be used to prove and understand even more at the abstract level. The link, or “bridge”, between the original Landauer’s principle and the Galois connection for systems with entropy is formulated and proven in Theorem 4.
This approach proceeds from the bottom up, and recently there has been a trend in applied science to use this kind of abstract approach for concrete models [18], especially adjoint functors in physics [19], of which the Galois connection is a special and distinguished case. In the words of S. Mac Lane, “Adjoint functors arise everywhere.”
This paper is organized as follows: In Section 2, a brief overview of the mathematical notion of entropy in thermodynamics and of the Galois connection is presented for the reader’s convenience. Section 3 contains the definition of the Landauer’s connection, which relates systems with entropy. Then, various examples of the Landauer’s connection, including a description of the classical Landauer’s principle in terms of the Landauer’s connection, are outlined. The paper concludes with a discussion of possible implications. The first part of the paper is strict and precise; however, Section 4 varies in its level of precision, since describing a complicated system at a high level of generality is in principle impossible or depends on too many details to include here. Therefore, many conclusions in that section should not be taken too strictly; they are in fact hypotheses or general features rather than strict claims. We believe, however, that it is worth including them here, as they illustrate the wide range of disciplines in which the generalized Landauer’s principle can possibly be applied.

2. Related Work

This section summarizes important facts from entropy theory of thermodynamic systems (based on [3,4]), and the definition and properties of the Galois connection (mainly following [15]).

2.1. Entropy and Ordering in Thermodynamics

This subsection presents the poset (partially ordered set) structure constructed on the state-space of a thermodynamic system; some parts are also valid for other types of systems with entropy. We closely follow [3,4].
A state of a thermodynamic system is associated with a point (equilibrium state) X in the state-space Γ. This space can be topologized, and coordinates describing physical quantities such as energy, volume, etc., can be introduced; however, we do not need this for what follows (for details, see [4]). An equilibrium state is attained when the system is left to itself.
A crucial operation that can be introduced on a thermodynamic system is the scaling Γ^λ, or for a state, λX, which is an action of the abelian multiplicative group of λ ∈ (ℝ₊ ∖ {0}, ·) on the state-space. It fulfills the composition law (Γ^λ)^μ = Γ^{λμ} and μ(λX) = (μλ)X, with the obvious unit Γ^1 = Γ and 1X = X. This scaling multiplies the extensive properties (coordinates) of the system and does not alter the intensive ones. It allows us to build a bigger system from small similar pieces. A similar construction is the composition of two systems X ∈ Γ₁ and Y ∈ Γ₂, that is, (X, Y) ∈ Γ₁ × Γ₂.
An adiabatic transition/process is a change of state that occurs without influence from outside the system. It can occur abruptly or slowly. This allows us, following Lieb and Yngvason [3], to introduce a partial ordering as follows: if Y can be reached by an adiabatic process from the state X (is adiabatically accessible from X), then we denote this as
X ≼ Y.
We can define adiabatic equivalence, saying that X is adiabatically equivalent to Y, in symbols X ∼ Y, if X ≼ Y and Y ≼ X (this plays the role of antisymmetry for ≼). The classes of equivalence are called adiabats. Obviously, ≼ is reflexive, since the identity process (the system staying in equilibrium) is also an adiabatic process.
If there is no such symmetry between X and Y, we say that X ≺ Y if X ≼ Y and Y ⋠ X. In this case, the (isolated) transition between these states is called an irreversible adiabatic process.
Two states are comparable if the relation ≼ holds between them in at least one direction. Not every pair of states is comparable, since, e.g., they may have different chemical compositions. In addition, the relation ≼ is transitive, i.e., if X ≼ Y and Y ≼ Z, then X ≼ Z. Reflexivity, antisymmetry (up to adiabatic equivalence) and transitivity show that the relation ≼ is a partial order on Γ.
We want to transfer this ordering to the ordering of the real line. This can be done using the entropy function:
Definition 1.
Per [3,4], this order structure induces the entropy function, which is a mapping S : Γ → ℝ that fulfills:
  • Monotonicity: For two comparable states X and Y,
    X ≼ Y ⟺ S(X) ≤ S(Y);
  • Additivity: For two states X and Y, the entropy of the compound state (X, Y) is
    S(X, Y) = S(X) + S(Y);
  • Extensivity: For λ > 0 and a state X,
    S(λX) = λS(X).
The properties of entropy allow us to transfer the relation ≼ to the relation ≤ on the real line. Therefore, the above definitions of processes can be rewritten as:
  • Reversible adiabatic process:
    X ∼ Y ⟺ S(X) = S(Y);
  • Irreversible adiabatic process:
    X ≺ Y ⟺ S(X) < S(Y).
If, in addition, all states in the state-space Γ are comparable (the Comparison Hypothesis [3,4]), then there exists an entropy, unique up to an affine transformation, that fulfills the above properties for ≼, supplemented with some additional conditions (see Axioms A1–A6 of [3]). This property says that ≼ is a total order on Γ and that the function S transfers the total order on Γ to the total order on ℝ. We assume the Comparison Hypothesis below.
These ingredients are minimal for our purposes. The reader interested in a full axiomatization of entropy is referred to [3,4] and references therein.
Most of the properties presented here hold for different types of entropy, not necessarily thermodynamic ones.
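As a concrete and entirely illustrative sketch, the three properties of Definition 1 can be checked numerically for a toy entropy of ideal-gas type; the functional form S(U, V, N) = N(c ln(U/N) + ln(V/N)), the constant c, and the sample states below are assumptions made only for this example and are not taken from the paper.

```python
import math

# Toy extensive entropy of ideal-gas type (illustrative assumption).
def S(state, c=1.5):
    U, V, N = state                       # energy, volume, particle number
    return N * (c * math.log(U / N) + math.log(V / N))

def scale(lam, state):
    return tuple(lam * x for x in state)  # the scaling action lambda * X

X = (10.0, 2.0, 1.0)
Y = (12.0, 2.5, 1.0)

# Monotonicity: for comparable states, X <= Y is decided by comparing entropies.
print("X <= Y (Y adiabatically accessible from X):", S(X) <= S(Y))

# Additivity: the entropy of the compound state (X, Y) is the sum.
print("S(X,Y) =", S(X) + S(Y))

# Extensivity: S(lambda * X) == lambda * S(X), up to floating-point rounding.
lam = 3.0
print("extensive:", abs(S(scale(lam, X)) - lam * S(X)) < 1e-9)
```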

2.2. Galois Connection

Let us also recall the definition of the Galois connection between two posets following [15], and a few of its properties from the vast literature on this subject, e.g., [14,15,20,21,22,23].
First, three definitions of functors that preserve ordering have to be given [15]. Let C = (C, ≼) and D = (D, ⊑) be two posets (≼ and ⊑ are partial order relations); then the mapping (functor) F : C → D is
  • monotone if for any x, y ∈ C, if x ≼ y, then Fx ⊑ Fy;
  • an order-embedding if for all x, y ∈ C, x ≼ y ⟺ Fx ⊑ Fy; and
  • an order-isomorphism iff F is a surjective order-embedding.
Next, following [15], Definition 116, the Galois connection is given as
Definition 2
([15]). Suppose that C = (C, ≼) and D = (D, ⊑) are two posets, and let F : C → D and G : D → C be a pair of functors such that, for all c ∈ C, d ∈ D,
Fc ⊑ d ⟺ c ≼ Gd.
Then F and G form a Galois connection between C and D. When this holds, we write F ⊣ G, and F is said to be the left adjoint of G, and G the right adjoint of F.
There are alternative conditions to Equation (7), which are given by Theorem 144 of [15].
Theorem 1
([15]). Under the assumptions of Definition 2, F ⊣ G if and only if
1. F and G are both monotone;
2. for all c ∈ C, d ∈ D, c ≼ GFc and FGd ⊑ d; and
3. FGF = F and GFG = G.
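As an aside, Definition 2 and Theorem 1 can be checked by brute force on a small example; the adjoint pair F(n) = 2n and G(m) = ⌊m/2⌋ on the integer ranges below is an illustrative assumption, not an example taken from the paper.

```python
# Brute-force check of Definition 2 and the three conditions of Theorem 1
# for F(n) = 2n (left adjoint) and G(m) = m // 2 (right adjoint) on finite
# ranges of integers ordered by <=.
C = range(-20, 21)
D = range(-40, 41)

F = lambda c: 2 * c          # F : C -> D
G = lambda d: d // 2         # G : D -> C  (floor division)

# Definition 2:  F c <= d  <=>  c <= G d.
assert all((F(c) <= d) == (c <= G(d)) for c in C for d in D)

# Theorem 1, condition 1: both maps are monotone.
assert all(F(c1) <= F(c2) for c1 in C for c2 in C if c1 <= c2)
assert all(G(d1) <= G(d2) for d1 in D for d2 in D if d1 <= d2)

# Theorem 1, condition 2: c <= G(F(c)) and F(G(d)) <= d.
assert all(c <= G(F(c)) for c in C)
assert all(F(G(d)) <= d for d in D)

# Theorem 1, condition 3: FGF = F and GFG = G.
assert all(F(G(F(c))) == F(c) for c in C)
assert all(G(F(G(d))) == G(d) for d in D)
print("F(n) = 2n is left adjoint to G(m) = floor(m/2) on these ranges")
```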

3. Main Results

In this section, we use the properties of entropy [3,4] and of the Galois connection [15] to construct a connection between systems with entropy. We consider a general entropy, not necessarily the thermodynamic one.
The plan of deriving the Galois connection from entropy consists of the following steps:
  • state-space (G-Set) + entropy → total ordering;
  • total ordering → poset (G-poset) structure; and
  • two posets → Galois (Landauer’s) connection between them.
Step 1. The main point is to introduce the state-space set Γ. In the case of thermodynamic systems, the scaling of the system is modeled as an action of the multiplicative group (ℝ₊, ·, 1) on the set Γ, which preserves ordering (as described in [3,4] and Section 2.1). Such an element {Γ, (ℝ₊, ·, 1)} is an object of a G-Set category [24], namely the ℝ₊-Set category. However, in non-thermodynamic systems (e.g., in information theory), there is usually no group action, and we have the following options: restrict the description of the state-space to the category Set, select the category of G-Sets with the trivial group (1, ·, 1), or model the state-space in the ℝ₊-Set category with a trivial group action. We select the last possibility, since it gives a more uniform approach.
Definition 3.
The system space is an object of the G-Set category, i.e., {Γ, (ℝ₊, ·, 1)}, where the multiplicative group acts on the state-space Γ.
The second ingredient is an entropy function S : Γ → ℝ. For example, for thermodynamic systems the entropy must fulfill the properties of Definition 1.
It is assumed that every point of Γ is in the domain of the entropy function. In thermodynamic systems, this assumption is called the Comparison Hypothesis and is not always true, as described in [3,4]. However, we assume that it holds (e.g., for thermodynamic systems, no chemical reactions are allowed).
The existence of entropy allows us to define
Definition 4.
The total ordering ≼ on Γ is defined in the following way:
X ≼ Y ⟺ S(X) ≤ S(Y),
for X, Y ∈ Γ. Likewise,
X = Y ⟺ S(X) = S(Y).
The above construction from entropy to ordering is, for thermodynamic systems, the reverse of the argument from [3,4]; it is also sketched in Section 2.1.
Step 2. The existence of a total ordering ≼ on Γ allows us to define a poset structure. However, to account for the additional group structure of scaling, the more general approach is to use the G-Pos category [25], i.e., posets with a group action. The group action is needed only when scaling is present in the system (in particular, in thermodynamics). In other systems, without scaling, the group action is trivial. Therefore, we omit the group action/scaling part when it is not important in the context and provide the modifications for the G-Pos structure of group action/scaling later.
Definition 5.
The entropy system is an object of the G-Pos category, i.e., G = (Γ, ≼) with an order-preserving action of the group (ℝ₊, ·, 1) (if X ≼ Y for X, Y ∈ Γ, then λX ≼ λY for λ ∈ ℝ₊), where the (partial or) total order is given by the entropy function S : Γ → ℝ.
Step 3. The third key ingredient needed to formulate the Landauer’s connection is the Galois connection from category theory. A short overview of its definition and basic properties is collected in Section 2.2 for the reader’s convenience.
The definition of the Galois connection suggests that it can relate two thermodynamic or, more generally, entropy systems with state-spaces Γ₁ and Γ₂. In such a case, the existence of entropy imposes a poset structure (we omit the group action structure for clarity for the moment). This gives our main observation: the following definition is a reformulation of the Galois connection (Definition 2) in terms of the entropy, and for historical reasons we call it the Landauer’s connection (or the generalized Landauer’s principle). In the definition below, every poset is treated as a category in its own right.
Definition 6.
The Landauer’s connection and the Landauer’s functors are defined as follows.
The entropy system G₁ = (Γ₁, S₁) is implemented/realized/simulated in the entropy system G₂ = (Γ₂, S₂) when there is a Galois connection between them, namely, there is a functor F : G₁ → G₂ and a functor G : G₂ → G₁ such that F ⊣ G.
In terms of the entropy, the condition in Equation (7) is given as
S₂(Fc) ≤ S₂(d) ⟺ S₁(c) ≤ S₁(Gd).
We name the functors F and G the Landauer’s functors.
If the group action on the state-space is nontrivial, i.e., when scaling of states is present, then we operate on G-posets, and in such a situation every functor above, e.g., F̃ = (F, φ), consists of two parts that act as F̃(λX) = φ(λ)FX:
  • a set part of the functor, F : Γ₁ → Γ₂; and
  • a function φ that is a surjective group endomorphism of (ℝ₊, ·, 1).
We explain intuitively why the functorial properties hold. In the case when there is no scaling (φ is trivial), if there are mappings f, g : Γ₁ → Γ₁, then they induce mappings Ff, Fg : F[Γ₁] → F[Γ₁] on the connected system, and the composition g ∘ f is mapped into F(g ∘ f) = Fg ∘ Ff. In addition, if there is no transition changing the entropy in Γ₁, this corresponds to the identity mapping on Γ₁, which in turn corresponds to the identity mapping on F[Γ₁]. The situation is similar for the functor G.
Definition 6 is reasonable: by the first condition of Theorem 1 in Section 2.2, the realization of G₁ on G₂ preserves ordering. From the second condition, we get that the mappings GF and FG do not give lower and higher entropy states, respectively, than the original states, and the third condition shows that FG and GF preserve the images of F and G, respectively. For thermodynamic systems, the Landauer’s functors preserve the entropy properties given in Definition 1. Therefore, for such a simulation, the Landauer’s connection is needed, as it is the minimal order-preserving connection between two posets, and therefore between entropy systems.
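To make condition (10) concrete, the sketch below checks it by brute force for a small pair of toy systems; the state sets, entropy assignments, and the particular maps F and G are illustrative assumptions introduced only for this example, not constructions from the paper.

```python
# Brute-force check of condition (10):
#   S2(F c) <= S2(d)  <=>  S1(c) <= S1(G d)   for all c, d.
def is_landauer_connection(states1, S1, states2, S2, F, G):
    return all(
        (S2(F(c)) <= S2(d)) == (S1(c) <= S1(G(d)))
        for c in states1
        for d in states2
    )

# Toy "logical" system: states ordered by a Shannon-like entropy S1.
states1 = ["00", "01", "10", "mixed"]
S1 = lambda c: {"00": 0, "01": 1, "10": 1, "mixed": 2}[c]

# Toy "physical" register: microstates 0..7 ordered by S2(d) = d.
states2 = list(range(8))
S2 = lambda d: d

F = lambda c: 2 * S1(c)                                   # realization map
G = lambda d: {0: "00", 1: "00", 2: "01", 3: "01"}.get(d, "mixed")

print(is_landauer_connection(states1, S1, states2, S2, F, G))   # True
```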
Surjectivity of φ is only a technical assumption that simplifies what follows. It is added merely to remove an additional degree of freedom, since we can always choose the group action to compensate.
It is obvious that, when one system has scaling and the other does not (trivial action of the group), the mapping ϕ is the trivial map. However, for nontrivial group action, we have the following
Corollary 1.
For Landauer-connected functors in the presence of scaling, if F̃ ⊣ G̃, then F̃ = (F, φ), G̃ = (G, φ⁻¹) and F ⊣ G. Therefore, φ is an isomorphism of groups.
Proof. 
Assume for the moment that G̃ = (G, ψ). In the relations S₂(FGFc) = S₂(Fc) and S₁(GFGd) = S₁(Gd), substitute c → λc and d → γd, where λ, γ ∈ ℝ₊. Then one gets φψφ(λ) = φ(λ) and likewise ψφψ(γ) = ψ(γ). Since ψ and φ are surjective group homomorphisms, it follows that φ⁻¹ = ψ. □
The words “realization” and “simulation” indicate that we are usually interested in simulating one, possibly abstract, system, whose state-space carries some kind of entropy that introduces a poset structure (e.g., binary computations), by means of its physical implementation in terms of an electronic system, a spin system, or any other computing realization, where the thermodynamic entropy is given. We can also consider the simulation of a physical system by another physical system. If the Landauer’s connection is present, the simulated system will behave as its connected counterpart, following the second law of thermodynamics. If a physical system is one of the Galois-connected categories, then the connection transfers the second law of thermodynamics to the other category, which does not have to be connected with the physical world (e.g., from electronic circuits to computation described by the Shannon entropy). This issue is described in detail in the next section.
Reformulation of the properties of Landauer’s functors is given by Corollary 2.
Corollary 2.
The equivalent conditions to Equation (10) are as follows
1. For c₁, c₂ ∈ C, if c₁ ≼ c₂, then S₂(Fc₁) ≤ S₂(Fc₂); analogously for the functor G. In other words, F and G are monotone functors.
2. For all c ∈ C, d ∈ D, S₁(c) ≤ S₁(GFc) and S₂(FGd) ≤ S₂(d).
3. For all c ∈ C, d ∈ D, S₂(FGFc) = S₂(Fc) and S₁(GFGd) = S₁(Gd).
The proof is a repetition of the proof of Theorem 1.
From Theorem 145 of [15], we immediately have transitivity of the Galois connection.
Theorem 2.
The Landauer’s connection is transitive: if F : G₁ → G₂, G : G₂ → G₁, H : G₂ → G₃ and K : G₃ → G₂, with F ⊣ G and H ⊣ K, then also HF ⊣ GK.
We also have some kind of uniqueness (see Theorem 144 of [15]).
Theorem 3.
The Landauer’s connection is “unique” in the following sense: if F ⊣ G and F ⊣ G′, then G = G′, and similarly for the other direction.
From two Landauer-connected entropy systems, we can isolate those parts that are order-isomorphic (see Section 2.2 for the definition). We can define an order isomorphism in the following way (see Definition 118 and Theorem 150 of [15]):
Construction 1.
Take the images of the functors, Γ₁′ = G[Γ₂] and Γ₂′ = F[Γ₁], and define new subcategories of posets, G₁′ = (Γ₁′, S₁) and G₂′ = (Γ₂′, S₂). Then these last categories are order-isomorphic by F and G.
The order-isomorphism allows us to introduce classes of equivalence between entropy systems or their subcategories, and therefore to introduce, in the category of entropy systems (the set of all entropy systems without any arrows apart from the identity arrows), the quotient category whose objects are the equivalence classes of order-isomorphism. In addition, the order-isomorphism allows us to construct a subsystem of Landauer-connected systems that can be used to implement the other entropy system faithfully.
We can also close the poset G₁ (see Theorem 151 of [15]) using the functor K = GF for F ⊣ G, where F : G₁ → G₂. The closure of G₁ (Definition 119 of [15]) is an endofunctor K : G₁ → G₁ such that for all c, c′ ∈ G₁ we have: (1) c ≼ Kc; (2) K is monotone, i.e., c ≼ c′ implies Kc ≼ Kc′; and (3) K is idempotent, i.e., KKc = Kc. This closure gives the biggest subcategory of G₁ that can be used to simulate/realize F[G₁].
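A minimal sketch of such a closure, using the ceiling/tripling adjoint pair that reappears in Section 4.1 (the sampling grid is an illustrative assumption):

```python
import math

F = lambda x: math.ceil(x / 3)   # F : nonnegative reals -> nonnegative integers
G = lambda y: 3 * y              # G : nonnegative integers -> nonnegative reals
K = lambda x: G(F(x))            # closure functor K = G o F

xs = [i / 10 for i in range(0, 200)]   # sample of the domain [0, 20)

assert all(x <= K(x) for x in xs)                               # (1) c <= K c
assert all(K(a) <= K(b) for a in xs for b in xs if a <= b)      # (2) monotone
assert all(K(K(x)) == K(x) for x in xs)                         # (3) idempotent

print(K(4.2))   # 6: the smallest multiple of 3 dominating 4.2
```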
The above construction of the Landauer’s connection for entropy systems shows that it is the “weakest” relation between them, in the sense that it only requires preserving the entropy ordering, since the Galois connection is the “weakest” connection between posets.
The final part of this section is devoted to explaining how Table 1 is related to the Landauer’s connection and to providing practical tools for detecting this connection.
Corollary 3.
Conditions equivalent to the first two conditions of Corollary 2 for F ⊣ G are:
1. If states p, p′ are ordered as p ≼ p′, then p ≼ p′ ≼ GF(p′) and p ≼ GF(p) ≼ GF(p′); and
2. If states q, q′ are ordered as q ⊑ q′, then FG(q) ⊑ q ⊑ q′ and FG(q) ⊑ FG(q′) ⊑ q′.
The Corollary says that the Galois connection is in general not a symmetric operation: the functor GF maps a state to a “higher or equal” state, while FG maps a state to a “lower or equal” state.
Order-embedding and order-isomorphism conditions are given by the following standard results from the theory of Galois connections [14,15].
Lemma 1.
The following conditions are equivalent for F ⊣ G:
1. GF(p) = p for all p ∈ Γ₁;
2. F is injective;
3. G is surjective.
In general, T = G F is a closure functor, however
Corollary 4.
If FG = Id_Γ₂ and GF = Id_Γ₁, then Γ₁ and Γ₂ are order-isomorphic.
Finally, we need the definition of reversibility, which mimics thermodynamic reversibility of a process, namely
Definition 7.
An entropy system map, that is, a poset map f : Γ → Γ, is reversible at p ∈ Γ if p = f(p) in the sense of Definition 4, that is, S(p) = S(f(p)), i.e., f preserves entropy at p. Otherwise, f is irreversible at p.
Note that reversibility is connected with the map and the state on which it acts. Such maps are called reversible processes in thermodynamics and reversible operations in logic/computing.
We are ready to formulate the main theorem, which connects the classical Landauer’s principle, given by Table 1, with the Landauer’s connection.
Theorem 4.
For two entropy systems G₁ = (Γ₁, ≼) and G₂ = (Γ₂, ⊑), and functors F : G₁ → G₂ and G : G₂ → G₁, we have the following possibilities for Landauer–Galois connections:
(Table of possibilities relating reversible and irreversible processes in G₁ and G₂; cf. Table 1 and Cases 1–3 of Section 4.1.)
Proof. 
For the first claim, let us take a state p ∈ Γ₁ and a reversible map f : Γ₁ → Γ₁ that gives p′ = f(p) such that p′ = p (that is, S₁(p′) = S₁(p), where S₁ is the entropy in G₁). Using the first claim of Corollary 3, we have p′ = f(p) ≼ GF(f(p)) and p ≼ GF(p) ≼ GF(f(p)), and F, G are monotone functors. Therefore, F can map p = p′ into F(p) = F(p′), or into F(p) ⊑ F(p′) with F(p) ≠ F(p′); i.e., Ff is a reversible or an irreversible process. An irreversible process in G₁ is always mapped (the functors are monotone) into an irreversible process (see the third row of the table).
A similar argument in the opposite direction holds for the second case.
For the third case, the functors F and G must map reversible maps to reversible maps and irreversible maps to irreversible ones. Therefore, they preserve ordering, so they are order-embeddings onto their images (see Lemma 1). If, in addition, F and G are surjective functors, then it follows from Corollary 4 that they are order-isomorphisms. □
The above theorem is a simple tool that helps to detect the presence of the Landauer’s connection and its direction.
In the next section, examples of the Landauer’s connection, including the original Landauer’s principle, are given.

4. Examples

This section presents a few examples, at various levels of detail, of the interplay between entropy systems and the Galois connection that we call the Landauer’s connection. We start with a simple yet illustrative toy example.

4.1. Toy Example

This example is motivated by a simple example (Example 1.80) of the Galois connection from [18].
Consider two entropy systems Γ₁ = (ℝ₊ ∪ {0}, S) and Γ₂ = (ℕ₀, S), where the entropy in both cases is given by the identity S(x) = x (different functions can be selected, e.g., the floor function S(x) = ⌊x⌋; in that case, a different ordering on the real numbers is used, but the derivations are similar). This choice of entropy agrees with the standard ordering on the natural and real numbers.
One can see that the system Γ₁ has a higher cardinality of states than Γ₂. We consider the Galois connection in both directions, as in [18]. In these considerations, the posets are treated as categories in their own right, and therefore the functors are simply monotone functions.
Case 1. Consider F : Γ₁ → Γ₂ defined as F(z) = ⌈z/3⌉ and G : Γ₂ → Γ₁ given by G(z) = 3z. We have F ⊣ G, since Equation (7) is fulfilled, i.e.,
⌈x/3⌉ ≤ y ⟺ x ≤ 3y.
Let us consider a few processes on Γ 1 and related processes in Γ 2 induced by the Galois connection:
  • Consider the map f : Γ₁ → Γ₁ given by the simple shift f(z) = z + 0.2. Take x = 1 ∈ Γ₁, for which S(x) = 1. Then x̄ = f(x) = 1.2 and S(f(x)) = 1.2, and therefore the process x → x̄ is irreversible (entropy increases). We have y = F(x) = 1 with S(y) = 1, and ȳ = F(x̄) = Ff(x) = 1 with S(ȳ) = 1; therefore, the irreversible process in Γ₁ is mapped by F to a reversible process on the level of Γ₂.
  • Take the same map f(x) = x + 0.2 with the initial point x = 2.9. It gives x̄ = f(x) = 3.1, and therefore S(x) = 2.9 and S(x̄) = 3.1: an irreversible process in Γ₁. Using the functor F, we get y = F(x) = 1 and ȳ = F(x̄) = 2. Therefore, this irreversible process in Γ₁ is mapped to an irreversible process in Γ₂.
  • If we take f(x) = x, then the reversible (trivial) process in Γ₁ is mapped to a reversible process in Γ₂.
  • No irreversible process in Γ 2 can be realized by a reversible process in Γ 1 .
Summing up, the proposed Galois connection gives Case 1 from Theorem 4.
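A minimal numerical check of Case 1, using the pair F(z) = ⌈z/3⌉ and G(z) = 3z with the identity entropy (the sampling grid is an illustrative assumption):

```python
import math

F = lambda z: math.ceil(z / 3)   # Gamma_1 (nonnegative reals) -> Gamma_2 (N_0)
G = lambda z: 3 * z              # Gamma_2 -> Gamma_1

# Adjunction F -| G:  F(x) <= y  <=>  x <= G(y), checked on a grid.
assert all((F(x / 10) <= y) == (x / 10 <= G(y))
           for x in range(0, 100) for y in range(0, 12))

f = lambda z: z + 0.2            # the irreversible shift used above
print(F(1.0), F(f(1.0)))         # 1 1 : mapped to a reversible process
print(F(2.9), F(f(2.9)))         # 1 2 : mapped to an irreversible process
```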
Case 2. We now take F : Γ₂ → Γ₁ defined as F(x) = 3x and G : Γ₁ → Γ₂ given by G(x) = ⌊x/3⌋. This also defines a Galois connection F ⊣ G, as is easily checked. We have the following examples of processes:
  • The shift f(z) = z + 3 on Γ₁, which irreversibly maps x = 6 to x̄ = 9, gives on the level of Γ₂ the map from y = G(x) = 2 into ȳ = G(x̄) = 3, which is also irreversible.
  • For the irreversible shift f(z) = z + 0.1 on Γ₁ that maps x = 2 to x̄ = 2.1, we have the process in Γ₂ that maps y = G(x) = 0 to ȳ = G(x̄) = 0, which is obviously reversible (it is the identity on this state).
  • The identity (reversible) process in Γ₁ is trivially mapped into a reversible process in Γ₂.
  • There is no mapping of an irreversible process in Γ₂ to a reversible process in Γ₁.
In summary, Case 2 of Theorem 4 is recovered.
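The corresponding check for Case 2, with F(x) = 3x and G(x) = ⌊x/3⌋ (again with an illustrative grid):

```python
import math

F = lambda x: 3 * x              # Gamma_2 (N_0) -> Gamma_1 (nonnegative reals)
G = lambda x: math.floor(x / 3)  # Gamma_1 -> Gamma_2

# Adjunction F -| G:  F(c) <= d  <=>  c <= G(d), checked on a grid.
assert all((F(c) <= d / 10) == (c <= G(d / 10))
           for c in range(0, 12) for d in range(0, 100))

print(G(6), G(9))       # 2 3 : the irreversible shift stays irreversible
print(G(2), G(2.1))     # 0 0 : the irreversible shift becomes reversible
```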
Case 3. For an example of Case 3 of Theorem 4, consider the identity mapping F = Id_Γ₁ = G between two copies of Γ₁.
The first two examples show how a system with a “bigger multiplicity of states” is Galois-connected with a system with a “smaller number of states” and fulfills the claims of Theorem 4 relating reversible and irreversible processes between these two systems. A similar principle can be used in the description of a memory chip, where two logical states are realized by some complicated sets of physical states whose internal transitions realize binary operations. This idea is used in the next subsection.

4.2. Landauer’s Functors and Maxwell’s Demon

In this subsection, we describe how the above abstract language can be applied to the description of the original Landauer’s principle of irreversible computation. Then, the well-known Maxwell’s demon paradox is presented using the Landauer’s connection. This is the use of a new and more powerful language for the known solution described in detail in [11].
We stress again that this is not a new solution of the problem, which has already been solved. Instead, it is a reformulation of the problem in the new abstract language of the Landauer’s connection, which in our opinion gives a clearer and more uniform description. The thermodynamic details are hidden in the details of the Galois connection, and they manifest themselves in the heat emitted during irreversible computation.
Let us first explain the classical Landauer’s principle in terms of the Landauer’s connection introduced in the previous section. Consider first a computer memory M based on binary logic and its implementation using some physical system D. Both are entropy systems (see Figure 1). We can therefore build posets using entropy as the ordering, namely construct (M, S) and (D, S̄), where S and S̄ are the corresponding entropies.
Since the relation between the logical part and the physical realization of the memory is described by Table 1, by Theorem 4 there is a Landauer’s connection F ⊣ G between the functors F : M → D and G : D → M. The details of the functors depend on the implementation; however, they are maps between logical states and the corresponding physical states in the memory implementation, see, e.g., [12,13]. If there is an irreversible operation on the memory M given by a function f : M → M, then it induces an irreversible operation on the device D given by Ff : F[M] → F[M], and this, by the second law of thermodynamics, generates heat that is expelled to the environment. The amount of emitted heat depends on the realization (i.e., on the properties of F and G); however, Landauer showed the lower bound for it, namely k_B T ln 2.
In the next part of our considerations, this memory M is used as the Demon’s memory in the Maxwell’s demon “paradox”. In the experiment, there is a thermodynamic system, i.e., a box with an ideal gas and a partition that can be selectively opened: the part E. It is connected with the memory M, which stores information on the separation of gas particles according to their kinetic energy, e.g., high kinetic energy particles are collected in the left chamber and low energy particles in the right chamber. The functors K : E → M and H : M → E are the relations between the information on the localization of the particle in E and its logical description in M (not in D). For details, consider Figure 2, where the simplified Maxwell’s demon experiment (due to Leó Szilárd [26]) with one gas particle is presented. We describe every step in the cycle. A two-bit description of the logical state of the system was selected for better understanding: 1 means that the particle is localized in the left (10) or the right (01) part of the box.
  • In this state, there is no information on the localization of the particle, and therefore the information entropy is Sᵢ = 2, as the state is a mixture of the two states 01 and 10. This state is associated with the 00 bit description (reset) and transferred to the memory.
  • The particle is localized (for example) in the left chamber (10), so the information entropy is now Sᵢ = 1. This state is transferred into the memory, where a reversible operation (e.g., NOT ⊗ Id) is performed that changes 00 into 10.
  • The partition starts to move freely. There is an adiabatic decompression of the single-particle gas. The state of the memory is the same as in the previous step.
  • The partition is pushed maximally to the right. The work done by the particle is W = k_B T ∫_{V/2}^{V} dV′/V′ = k_B T ln 2, where V is the volume of the box, k_B is the Boltzmann constant, and T is the temperature. Since no heat flow was present, the thermodynamic entropy is still constant and the internal energy of the gas has decreased. A new cycle will start.
  • This transition is the restart of the cycle. The partition is placed in the middle of the box, and therefore the information on the localization of the particle is lost. The information entropy is now Sᵢ = 2. The state becomes 00 and is correlated with the state of the memory: an irreversible operation (e.g., f(x) = 00 AND x) is performed on the memory, which results in expelling, via Landauer’s principle, at least k_B T ln 2 of heat from its physical part D (see the numerical cross-check after this list). The cycle repeats.
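A minimal numerical cross-check of the bookkeeping in this cycle (the room-temperature value is an illustrative assumption): the work extracted in one cycle equals the minimal Landauer heat expelled when the one-bit record is erased, so no net work is gained from the heat bath over a full cycle.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)
T = 300.0            # assumed temperature in K

W_cycle = k_B * T * math.log(2)   # work extracted in the expansion V/2 -> V
Q_erase = k_B * T * math.log(2)   # minimal heat of erasing one bit (Landauer)

print(f"W = {W_cycle:.3e} J, Q >= {Q_erase:.3e} J")   # both approx. 2.87e-21 J
```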
One can note that K and H are in fact isomorphisms, and they connect the state of knowledge about the particle position with the state of the memory.
An irreversible operation on the memory, f : M → M, is transferred via the Landauer’s functor F into the irreversible operation Ff on its physical implementation. That results in expelling heat and preserves the second law of thermodynamics for the whole system in every cycle.
If a bigger memory were used to store the information on the particle localization over several cycles, then erasing it would expel at least the corresponding multiple of k_B T ln 2 of heat from the D part and preserve the second law of thermodynamics after these cycles. In this case, the end of the cycle is marked by the erasure of the memory and not by the thermodynamic cycle in the E part of the system.
It is interesting to note that the irreversible process f can be transferred to E as Kf; however, it produces heat only in the D part of the system (although the corresponding changes of entropy appear in E, M and D). This explains how the second law of thermodynamics and the entropy changes can be transferred from D to E, i.e., the irreversible operation f corresponds to the irreversible operation KGFf on E.

4.3. DNA Computation

In this and the next subsection, sketches of applications of the Galois connection to biochemical and biological systems are presented. Due to the large complexity of such systems, the description is not detailed, and many statements should be treated as research hypotheses rather than firmly stated claims. We believe, however, that promoting the Galois–Landauer principle to a general principle justifies its formulation in the abstract language of category theory, which makes it possible to trace it also in biochemical systems and living organisms.
Every living cell contains “a computer” that operates on complicated chemical principles using DNA (deoxyribonucleic acid) and controls every aspect of the cell. In recent years, such principles have been used to implement efficiently some computationally difficult algorithms, thanks to the enormous parallelization achievable by this approach. For an excellent overview of this subject, see [27] and references therein.
In short, DNA as a storage of genetic information consists of four bases: A (adenine), G (guanine), C (cytosine) and T (thymine). A single DNA strand can be considered as a list composed of these four letters. The second strand can be connected using the following complementary pairing rules: A–T and G–C. Therefore, a single strand contains the same amount of information as a double one. In addition, the direction (polarity) of the strand is marked by the chemical ends denoted 3′ and 5′. An example of a short fragment of a DNA strand (called an oligonucleotide) is 3′-ACTGTA-5′.
Operations on such a structure are controlled by changing the physical properties of the environment (e.g., the temperature, which decides whether the DNA double helix decouples into individual strands (melting) or single strands combine into a double list (annealing)) or its chemical properties (especially by adding specially designed enzymes that perform various operations on short pieces of DNA [27]). In computations, some external methods not present in living organisms, such as gel electrophoresis, are also used [27].
Following [28], the most effective representation of the bases that is unchanged after changing polarity and respects complementarity is the three-bit association A-000, C-010, G-101, and T-111. One can then represent the basic operations in algebraic form [28].
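The two properties quoted from [28] can be spot-checked directly; in the sketch below, reading “unchanged after changing polarity” as each code word being a bit-palindrome is our interpretation, and the strand example is the oligonucleotide from the text above.

```python
CODE = {"A": "000", "C": "010", "G": "101", "T": "111"}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

negate = lambda bits: "".join("1" if b == "0" else "0" for b in bits)

# Complementary bases (A-T, G-C) have bitwise-negated code words.
assert all(negate(CODE[b]) == CODE[COMPLEMENT[b]] for b in CODE)

# Each code word is a palindrome, so reversing the read direction of a base
# leaves its code unchanged.
assert all(CODE[b] == CODE[b][::-1] for b in CODE)

strand = "ACTGTA"
print("".join(COMPLEMENT[b] for b in strand))   # TGACAT, the complementary strand
```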
DNA computing and DNA processing in a living cell is a complicated sequence of chemical reactions in an environment usually without sharp borders, and therefore we present only a rough idea of how it can be connected with the Galois connection.
On the logical level (see Figure 3), as described above and in [27,28], there is a well-formalized set of operations on the logical representation of the DNA state. In this system, various flavors/notions of entropy that capture different levels of computation can be defined, from the Shannon entropy [29], through block entropy [30], to topological entropy [31], among others.
On the chemical (realization) level (see Figure 3), due to the enormous complexity, the reactions [32] in the system with DNA cannot be decoupled from the enclosing environment in which the system is embedded (e.g., a living cell or a biochemical reactor). The boundary of the environment depends on how complex the computation is; e.g., for a “small” computation, it can be the nucleus of the cell or its membrane (or the test-tube containing the compounds for in vitro computations). However, for long and complex computations, the environment in which the cell lives, including other cells, would probably be a better choice. Some hints for selecting the boundary of the environment come from thermodynamics: the boundary should be such a (natural or artificial) barrier/closed surface that the total entropy inside it always increases or remains constant during the whole process. In other words, it should be the minimal boundary of a volume in which the second law of thermodynamics is fulfilled for the whole duration of the process.
This whole composed system and environment have to obey the second law of thermodynamics. The interaction between the System and the Environment drives the System with DNA to perform specific reactions by changing the physical and chemical properties of the System. This leads to a change in the Gibbs free energy (Figure 3) that determines the direction of the reactions in the DNA system.
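For reference, the sign convention behind this criterion is the standard one of equilibrium thermodynamics at constant temperature and pressure (not specific to this paper). Writing ΔS_Env = −ΔH/T for the entropy change of the Environment due to the heat dispersed by the System, one has

ΔS_T = ΔS_Sys + ΔS_Env = ΔS_Sys − ΔH/T = −ΔG/T,

so the second law ΔS_T ≥ 0 is equivalent to ΔG = ΔH − TΔS_Sys ≤ 0, which is the criterion quoted in the caption of Figure 3.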
The similarity with the model of physical memory described above suggests that the relation between the information stored in DNA together with various logical operations on the logical side, and their chemical realization by the System and the Environment on the other, qualifies to be described in terms of the Galois–Landauer’s connection. If this hypothesis is true, then it can shed some light on the basic principles of life at the elementary level. It is a reasonable hypothesis, since such complicated chemical reactions (however optimized by the evolutionary process) should always increase the total entropy (nontrivial reversible and irreversible computations would be realized by irreversible chemical processes that increase the total entropy). In addition, every irreversible computation on the logical level would, via the hypothetical connection, release the Landauer’s heat to the environment.
The model of DNA as a memory described by the Galois connection is also relevant in view of a recent experiment showing how to encode a large amount of data (a short movie) in the DNA of a living organism [33].
In this example, we are motivated by the fact that the Galois connection can model operations on single genetic bases and their chemical realization. In the next example, we suggest that a similar structure should exist on the level of genes (conglomerates of genetic information that encode protein structures) and animal species in the tree of life.

4.4. Is 42 the Meaning of Life?

The provocative title of this subsection refers to the novel The Hitchhiker’s Guide to the Galaxy by Douglas Adams. He describes an advanced civilization that designed a planet-size computer (the Earth) with a “biological component” that should answer the “Ultimate Question of Life, the Universe, and Everything”. This fictional idea surprisingly well resembles the following construction.
The hypothesis on the existence of the Galois connection in the previous example, if true, shows that, at the basic level, life is a computational process on chemical components. On larger scales, the Galois connection can also conjugate the expression of the gene pool and the interconnections of animal species. This vague idea is described in [18], Example 1.84, and it is worth citing here to complete the picture. Due to the complexity of biological examples, only a rough idea is presented. Some general ideas at this level of complexity are presented in [32] and references therein.
Following [18], let (P, ≤) be the poset describing possible animal populations, with the inclusion p ≤ q if the animal species p is also the animal species q in the sense of specificity on the tree of life (see Example 1.51 in [18]). Moreover, the inclusion of species induces some kind of entropy that can be used to measure various changes in the tree of life.
The other poset (G, ≤) describes gene pools, and the ordering has the following meaning: a ≤ b when the gene pool b can be generated by the gene pool a. This ordering can also be used to define some kind of entropy that measures ordering or information loss on the level of gene pools.
The Galois connection is defined by two monotone functions (functors, when the posets are treated as categories in their own right) [18]. The first one, i : P → G, sends each population to the gene pool that defines it. The second functor, cl : G → P, sends each gene pool to the set of animals that can be obtained by recombination of the given gene pool. Then i ⊣ cl in this very broad sense.
This example involves the structure of the whole population of organisms on Earth and their genetic information as “a database” for processes that describe computations (evolution). Since the connected systems are not of thermodynamic origin, no heat is expelled during an irreversible process (it is even difficult to define such a quantity without a definition of temperature in the model). However, at a not-too-strict level, the process of evolution can be decomposed, using the hypothesis from the previous subsection, into chemical reactions. Since evolution is a long-term process, the environment for this computation would involve the whole Earth and all organisms in which biochemical reactions take place. This is an unrealistic model, and therefore focusing on a small piece of the system and its interaction with the environment is usually a more reasonable approach [16,17]. However, this abstract level motivates the introduction of “the heat of evolution”: the Landauer’s heat of all chemically realized bio-computations that were (and still are) performed in the process of evolution.
This idea can serve as a model of life on Earth; however, stating it precisely requires mathematical biology, biochemistry and taxonomy to be developed to a level of detail and precision not currently available. Only then can an exact definition of the Galois functors be provided. Moreover, this slightly modified example of the Galois connection from [18] gives hints on how to develop entropy measures that are consistent (i.e., preserved by the connection) between the biochemical, genetic and taxonomic levels, provided that the details of the Galois connections are known.
The situation is somewhat simpler in evolutionary robotics [34], where a virtual environment resembling some features of the physical environment in which a robot operates is used to simulate artificial evolution, that is, the optimization of robot designs. The genetic code in this case is a set of optimized parameters of the robot (e.g., the length of its legs or its neural network design), and the environment is represented by some multidimensional function called the fitness function [35]. Virtual evolution is in fact a multidimensional optimization process aimed at finding the minimum of the fitness function (which can be globally shifted to have a value equal to 42). This minimum (which can be non-unique) describes the optimal fitness of the robot construction in a given environment with respect to the optimized parameters. Due to the larger “rigidity” of the environment (usually the fitness function does not change, or changes very slowly) compared to the biological situation (where the environment changes abruptly and contains other interacting specimens), the construction of the Galois connection between the robots’ gene pool and their construction features, and of the related entropies (in analogy to the biological example presented above), should be easier. The possibility and details of such a construction deserve another paper.
Summing up the abstract discussion of the last two subsections: on every level of life, there is a pattern that resembles the Galois connection. This pattern can be a hint of emergent phenomena in biology.

5. Discussion

The Galois connection was originally invented to model the relationship between the semantics and the syntax [36] of mathematical theories: the relation between a set of axioms and the classes of models that implement those axioms. It is a surprising coincidence that the same structure exists in systems governed by entropy, describing the realization/simulation of one system by another. It can be understood in the sense that the simulated system provides a set of axioms that have to be met, and the simulating system is a model that is used to map the behavior of the simulated system.
There is also an even more important interpretation in view of entropy as the loss of information when we formulate a physical model from uncertain measurement data [5]. In this context, this information loss can be propagated via the Landauer’s connection to other related systems.
Likewise, from the Maxwell’s demon example, we observe that the second law of thermodynamics propagates along the Landauer–Galois connection and for a proper thermodynamic description of the system, all connected parts must be considered together.
The Landauer’s connection and the information interpretation of entropy bear some relationship to the MUH (Mathematical Universe Hypothesis) articulated by Max Tegmark [37]. The observations in this paper may be a reminiscence of the computational/information principles on which our universe is based. The Landauer’s connection can be a relation between the encoding of physical laws and its “equivalence class of descriptions” [37].
One of the basic questions that arises from the Landauer’s connection is whether there are (ultimate) computations that are not connected with physical realizations or, more generally, whether every entropy system is connected with some other entropy system, i.e., can we group entropy systems in pairs without any left out?
From the above connection, some insight into black hole entropy and the information loss after crossing the event horizon could be anticipated when the realization of such information via the Landauer’s connection is taken into account. This deserves another paper.

6. Conclusions

In this paper, it is shown that the well-known and experimentally confirmed Landauer’s principle is a reflection of a general connection between entropy systems, namely the Galois connection. This connection was cast onto systems where entropy exists naturally. The entropy-induced ordering was used to provide a poset structure on such systems (defined as entropy systems), and this gives the link between the Galois connection and Landauer’s principle.
The original Landauer’s principle was restricted to connected information–physical systems; however, the general form of the Landauer’s connection presented in this paper can be applied to every pair of systems with some kind of mapping (realization/implementation) that connects the states of both systems. The Landauer’s connection then preserves the entropy structure, and therefore the poset structure on the state-spaces of the systems that is induced by entropy.
In the discussion, it is argued that this connection can be a remnant of a more fundamental principle on which our universe is based.
The Landauer’s connection presented in this paper can be used as the backbone of a category theory model that describes the relationship between two entropy systems.

Funding

This research was supported by the GACR grant 17-19437S, and the grant MUNI/A/1138/2017 of Masaryk University.

Acknowledgments

I would like to thank those who made me interested in category theory and abstract approach to science—those who I met in person or know only by their magnificent books and papers.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Boyling, J.B. An axiomatic approach to classical thermodynamics. Proc. R. Soc. Lond. A 1972, 329, 35–70.
  2. Frankel, T. Geometry of Physics; Cambridge UP: Cambridge, UK, 2011.
  3. Lieb, E.H.; Yngvason, J. A Guide to Entropy and the second law of thermodynamics. Not. AMS 1998, 45, 571–581.
  4. Lieb, E.H.; Yngvason, J. The Physics and Mathematics of the second law of thermodynamics. Phys. Rep. 1999, 310, 1–96.
  5. Lychagin, V. Lecture Notes on Entropy from the Wisła 2018 Summer School. In Lecture Notes in Mathematics; Springer: Berlin, Germany, 2019; in press.
  6. Katok, A.; Hasselblatt, B. Introduction to the Modern Theory of Dynamical Systems, Revised ed.; Cambridge University Press: Cambridge, UK, 1996.
  7. Reza, F.M. An Introduction to Information Theory, Revised ed.; Dover Publications: Mineola, NY, USA, 1994.
  8. Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
  9. Bennett, C.H. The logical reversibility of computation. IBM J. Res. Dev. 1973, 17, 525–532.
  10. Bennett, C.H. Demons, Engines and the Second Law. Sci. Am. 1987, 257, 108–116.
  11. Ladyman, J.; Presnell, S.; Short, A.J.; Groisman, B. The connection between logical and thermodynamic irreversibility. Stud. Hist. Philos. Sci. Part B Stud. Hist. Philos. Mod. Phys. 2007, 38, 58–79.
  12. Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 2012, 483, 187–189.
  13. Yan, L.L.; Xiong, T.P.; Rehan, K.; Zhou, F.; Liang, D.F.; Chen, L.; Zhang, J.Q.; Yang, W.L.; Ma, Z.H.; Feng, M. Single-Atom Demonstration of the Quantum Landauer Principle. Phys. Rev. Lett. 2018, 120, 210601.
  14. Mac Lane, S. Categories for the Working Mathematician, 2nd ed.; Springer: Berlin, Germany, 1978.
  15. Smith, P. Category Theory: A Gentle Introduction. Logic Matters. 2018. Available online: https://www.logicmatters.net/categories/ (accessed on 12 December 2018).
  16. Baez, J.C.; Fritz, T. A Bayesian Characterization of Relative Entropy. Theory Appl. Categ. 2014, 29, 421–456.
  17. Baez, J.C.; Pollard, B.S. Relative Entropy in Biological Systems. Entropy 2016, 18, 46.
  18. Fong, B.; Spivak, D.I. Seven Sketches in Compositionality: An Invitation to Applied Category Theory. Unpublished lecture notes, Department of Mathematics, Massachusetts Institute of Technology: Cambridge, MA, USA, 2018. Available online: http://math.mit.edu/~dspivak/teaching/sp18/ (accessed on 12 December 2018).
  19. Mallios, A.; Zafiris, E. Differential Sheaves and Connections: A Natural Approach to Physical Geometry; WSPC: Singapore, 2015.
  20. Birkhoff, G. Lattice Theory; American Mathematical Society: Providence, RI, USA, 1940; Volume 25.
  21. Davey, B.A.; Priestley, H.A. Introduction to Lattices and Order; Cambridge University Press: Cambridge, UK, 2002.
  22. Gierz, G.; Hofmann, K.H.; Keimel, K.; Lawson, J.D.; Mislove, M.W.; Scott, D.S. Continuous Lattices and Domains; Cambridge University Press: Cambridge, UK, 2003.
  23. Ore, O. Galois Connexions. Trans. Am. Math. Soc. 1944, 55, 493–513.
  24. Dieck, T. Transformation Groups and Representation Theory. In Lecture Notes in Mathematics; Springer: Berlin, Germany, 1979; Volume 766.
  25. Babson, E.; Kozlov, D.N. Group Actions on Posets. J. Algebra 2005, 285, 439–450.
  26. Szilard, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen (On the reduction of entropy in a thermodynamic system by the intervention of intelligent beings). Zeitschrift für Physik 1929, 53, 840–856. (English translation: NASA Document TTF-16723).
  27. Kari, L. DNA computing: Arrival of biological mathematics. The most highly parallelized computers. Math. Intell. 1997, 19, 9–22.
  28. Li, Z. Algebraic properties of DNA operations. Biosystems 1999, 52, 55–61.
  29. Arias-Gonzalez, J.R. Entropy Involved in Fidelity of DNA Replication. PLoS ONE 2012, 7, e42272.
  30. Schmitt, A.O.; Herzel, H. Estimating the Entropy of DNA Sequences. J. Theor. Biol. 1997, 188, 369–377.
  31. Jin, S.; Tan, R.; Jiang, Q.; Xu, L.; Peng, J.; Wang, Y.; Wang, Y. A Generalized Topological Entropy for Analyzing the Complexity of DNA Sequences. PLoS ONE 2014, 9, e88519.
  32. Lane, N. The Vital Question: Energy, Evolution, and the Origins of Complex Life, 1st ed.; W. W. Norton & Company: New York, NY, USA, 2016.
  33. Shipman, S.L.; Nivala, J.; Macklis, J.D.; Church, G.M. CRISPR–Cas encoding of a digital movie into the genomes of a population of living bacteria. Nature 2017, 547, 345–349.
  34. Nolfi, S.; Floreano, D.; Arkin, R.C. Evolutionary Robotics: The Biology, Intelligence, and Technology of Self-Organizing Machines; A Bradford Book: Cambridge, MA, USA, 2004.
  35. Nelson, A.L.; Barlow, G.J.; Doitsidis, L. Fitness functions in evolutionary robotics: A survey and analysis. Robot. Auton. Syst. 2009, 57, 345–370.
  36. Lawvere, F.W. Adjointness in Foundations. Dialectica 1969, 23, 281–296.
  37. Tegmark, M. The Mathematical Universe. Found. Phys. 2008, 38, 101–150.
Figure 1. The Landauer’s connection between box with ideal gas E, memory M of the Maxwell’s demon and its physical realization D in the Maxwell’s demon experiment.
Figure 2. Maxwell’s demon experiment with a single particle and movable partition. On the left, there is a box with movable partition and, on the right, a corresponding memory state.
Figure 3. Schematic structure of computation in DNA. On the level of the computation realm (information theory), there is information encoded in the DNA strand. It is Galois connected with the biochemical system which realizes computations by means of chemical reactions. This system contains an Environment (Env.) and, embedded inside it, the System (Sys.) with DNA, chemical elements and enzymes, where the actual computation takes place. The Environment interacts with the System to drive the specific chemical reactions that realize the logical operations. The System and the Environment overall fulfill the second law of thermodynamics and therefore the total entropy can remain constant or increase, i.e., ΔS_Env + ΔS_Sys = ΔS_T ≥ 0. Defining the enthalpy change (the heat dispersed by the System) as ΔH = −TΔS_Env and the Gibbs free energy change as ΔG = −TΔS_T, one gets the famous equation ΔG = ΔH − TΔS_Sys. All reactions in the system are spontaneous if ΔS_T > 0, that is, ΔG < 0. This is the principle of interaction between the System and the Environment.
Table 1. Table 1 from [11] defining connection between memory operations and their realizations in thermodynamic system. For definition of (ir)reversibility, see below.

Possibilities            | Thermodynamically Reversible | Thermodynamically Irreversible
Logically reversible     | YES                          | YES
Logically irreversible   | NO                           | YES
