Review

Landauer Bound and Continuous Phase Transitions

by
Maria Cristina Diamantini
NiPS Laboratory, INFN and Dipartimento di Fisica e Geologia, University of Perugia, Via A. Pascoli, I-06100 Perugia, Italy
Entropy 2023, 25(7), 984; https://doi.org/10.3390/e25070984
Submission received: 30 May 2023 / Revised: 21 June 2023 / Accepted: 21 June 2023 / Published: 28 June 2023

Abstract
In this review, we establish a relation between information erasure and continuous phase transitions. The order parameter, which characterizes these transitions, measures the degree of order of the system. It varies between 0, when the system is completely disordered, and 1, when the system is completely ordered. This ordering process can be seen as information erasure by resetting a certain number of bits to a standard value. The thermodynamic entropy in the partially ordered phase is given by the information-theoretic expression for the generalized Landauer bound in terms of the error probability. We demonstrate this for the Hopfield neural network model of associative memory, where the Landauer bound sets a lower limit for the work associated with ‘remembering’ rather than ‘forgetting’. Using the relation between the Landauer bound and continuous phase transitions, we then extend the bound to analog computing systems. In the case of the erasure of an analog variable, the entropy production per degree of freedom is given by the logarithm of the configurational volume measured in units of its minimal quantum.

1. Introduction

Landauer’s principle [1,2,3] tells us that forgetting is costly: erasing one bit of information, namely resetting it to a given memory state independently of its previous state, has a minimum energy cost of kT ln(2) (where T is the temperature and k the Boltzmann constant). This is the content of the famous statement that “information is physical”, first realized by Szilard [4] and later by Landauer: information can only be processed by physical systems, computers, and it is thus subject to the laws of thermodynamics. The minimum energy expenditure of kT ln(2) resolves the apparent violation of the second law by Maxwell’s demon [5]: the second law is not violated once one takes into account the cost of erasing the demon’s memory. The paradox of Maxwell’s demon was also addressed in a related but slightly different way by Brillouin [6,7], using the idea of negentropy, namely the reverse of entropy, which describes a system becoming “more ordered”, and its relation with information. A bit of information is obtained by the demon at the price of some negentropy lost by the environment, which allows the demon to make choices that decrease the entropy in the environment. The relation between negentropy and the Landauer limit was analyzed in [8,9] for a system of magnetic skyrmions. There, it was shown that the Landauer bound can be seen as a variation of the negentropy of the skyrmion. Landauer’s principle has recently been verified experimentally [10,11,12,13].
Since its formulation, many discussions have been devoted to the validity and usefulness of Landauer’s principle [14,15,16], and many attempts have been made to beat Landauer’s limit, since it sets a minimum energy expenditure for computation. More sophisticated formulations have been proposed [17,18], which take into account the role of the conditional entropy relating the Shannon and Gibbs entropies and which lower Landauer’s limit. For a review of recent developments in the thermodynamics of information, see [19]. Another possibility to beat this limit is to admit errors during the erasure procedure, in which case the original Landauer limit and the associated minimum cost of erasure can be lowered. In [20], it was shown that, admitting errors, the minimum work necessary to stochastically erase one bit becomes, in units of kT,
$\frac{\Delta S}{k} = \ln 2 + p \ln(p) + (1 - p) \ln(1 - p)\,,$  (1)
where p is the error probability. This error probability, which can be interpreted in terms of mutual or conditional entropy [21,22], becomes relevant [20] for future nanoscale implementations of switches, which must necessarily take their thermal fluctuations into account.
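As a minimal numerical illustration (added for this review and not taken from the original references; the function name and the use of the SciPy helper xlogy are choices made here), the generalized bound of Equation (1) can be evaluated as a function of the error probability p:

    import numpy as np
    from scipy.special import xlogy   # xlogy(x, y) = x*log(y), with the convention 0 at x = 0

    def landauer_bound_per_bit(p):
        # Equation (1): ln 2 + p ln p + (1 - p) ln(1 - p), in units of k
        return np.log(2.0) + xlogy(p, p) + xlogy(1.0 - p, 1.0 - p)

    print(landauer_bound_per_bit(0.0))   # ln 2 ~ 0.693: error-free reset
    print(landauer_bound_per_bit(0.5))   # ~ 0.0: resetting to an unknown state costs nothing

The bound interpolates smoothly between the error-free reset and the reset to a completely unknown state.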
Information is physical; this statement, as shown in [23], implies that physical systems that contain order can encode bits of information. Continuous phase transitions are a paradigmatic example of such physical systems. Continuous phase transitions are characterized by symmetry breaking: the order parameter describes the symmetry-broken phase and is zero in the unbroken phase. These transitions, which in the classical case are generally driven by temperature, are described by the phenomenological Landau theory, expressed in terms of the temperature and of the order parameter [24]. As we lower the temperature below the critical temperature T_C, at which the phase transition takes place, the Landau function [24], representing the effective potential, goes from a single minimum to a manifold of minima, e.g., it bifurcates into two minima in the case of Z_2 symmetry breaking, creating a new order and a new configuration space for the system. The work performed on the system to lower the temperature is completely used to lower the entropy and to change the state of the system, making it “more ordered”. We call this procedure efficient. To better understand the relation between Landauer erasure and continuous phase transitions, let us consider one classical bit of information stored in a bistable potential, exemplified by a particle in a double potential well. The first step of the erasure of the memory corresponds to lowering the barrier; note that in this step the available phase space doubles. This corresponds to the disordered phase with m = 0. Then, applying a tilting force and raising the barrier again forces the particle into one of the wells, depending on the tilt, thus resetting the bit either to zero or to one. This is a non-equilibrium state, valid for a time smaller than the relaxation time in the well. In this last step, we have a phase-space reduction; we ‘compress’ two states into one, and this is what causes the heat dissipation. In spontaneous symmetry breaking, something similar happens. Above the critical temperature, all possible degenerate ground states are available, while below the critical temperature, the system ‘chooses’ one state. In the case of symmetry breaking, an external perturbation, which is generally then set to zero, is what makes the system choose a particular state. In Landauer erasure, this is the role of the tilt in the double-well potential model for a single switch [10].
A particularly interesting example is given by neural networks [25], which are composed of a large number of interacting stochastic bits. Neural networks are the basic elements of associative memories. Contrary to address-oriented memories, the recovery of information is based on the similarity between the stored memory pattern and the presented pattern. The Hopfield model [26] is the most widely used example of a neural network [25]. It undergoes a phase transition, characterized by an order parameter m that goes from zero in the disordered phase to one in the ordered phase. The transition is driven by a fictitious temperature: below T_C, the system becomes ordered. As shown in [23] (and derived in Section 2), this phase transition is akin to an erasure with errors for the N stochastic neurons. The entropy difference between the disordered phase and the partially ordered phase can be written exactly in the form of Equation (1) [10,20,21], where, in this case, the error probability is related to the order parameter. Note that, while in the Landauer case the erasure corresponds to forgetting, in the case of neural networks it corresponds to remembering.
A phase transition is clearly a collective phenomenon; a single spin cannot have a phase transition, while one can erase a single bit or flip a single spin. However, in the example we chose of the Hopfield model, which, as we will show in the next section, can be mapped onto a long-range Ising model [25], the bits which compose the associative memory represent the information-bearing degrees of freedom. The order parameter m, which characterizes the continuous phase transition, plays the role of the error in an erasure with errors. When the parameter m is equal to one, we have the completely ordered phase, corresponding to erasure without errors. In [23], we showed that the entropy difference between the disordered phase and the completely ordered phase is approximately k ln 2 times the number of spins of the network, which coincides with the number of information-bearing degrees of freedom.
The Landauer principle [1] was originally formulated to compute the minimal energy required to erase a bit of information, and it applies, thus, to systems in which information is represented by discrete units. What happens for analog computing systems? In [27], the relation between erasure and continuous phase transitions allowed us to extend Landauer’s principle to systems where information is a continuous variable.
When we erase discrete information, assuming that the conditional entropy is zero [18], the difference in Shannon entropy between the final state, to which we reset the memory (e.g., to one), and the initial state, in which the system can be in any one of the possible states s_i with probability p_i, is given by
$\Delta S_S = -\sum_{i}^{M} p_i \ln p_i\,,$  (2)
where M is the finite number of possible logic states. The continuous generalization of the Shannon entropy is defined as [28,29]
$S_S^{\rm cont} = -\int_{x \in M} p(x) \ln p(x)\, dx\,,$  (3)
where p(x) is the probability distribution of the relevant degree(s) of freedom. The information-theoretic continuous Shannon entropy, however, requires an appropriate regularization, which adapts the dimensional character of the relevant degrees of freedom to the dimensionless quantity considered in the probability density p(x). This is because the continuous extension of the Shannon entropy, contrary to the discrete entropy, which is an absolute quantity, is not invariant under a change of coordinates [28]. To cure this problem, Jaynes [30,31,32] proposed to modify Equation (3) by introducing an invariant factor p_0(M), which represents the density of the discrete distribution that gives p(x) in the continuum limit:
$S_S^{\rm cont} = -\int_{x \in M} p(x) \ln \frac{p(x)}{p_0(M)}\, dx\,.$  (4)
The factor p_0(M), introduced here as a regularization, arises naturally when we consider the continuous Landauer reset. It needs to be introduced to cure the problem of the classical continuous entropy, which can be negative and divergent [33,34], and it is given by the minimum quantum of configuration volume of the physical system.
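To make the role of the invariant factor concrete, here is a minimal numerical sketch (an illustration added for this review, not taken from the references; the Gaussian example and the bin width delta are arbitrary choices): the discrete Shannon entropy of a finely binned continuous variable equals its differential entropy minus ln(delta), i.e., the Jaynes-regularized entropy of Equation (4) with a uniform reference density p_0 = 1/delta.

    import numpy as np

    sigma, delta = 1.0, 1e-3                    # delta plays the role of the minimal cell, p_0 = 1/delta
    x = np.arange(-10.0, 10.0, delta)
    p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    P = p * delta                               # probability per bin
    P = P[P > 0]
    S_discrete = -np.sum(P * np.log(P))         # discrete Shannon entropy of the binned variable
    S_diff = 0.5 * np.log(2 * np.pi * np.e * sigma**2)   # analytic differential entropy of the Gaussian
    print(S_discrete, S_diff - np.log(delta))   # the two agree up to O(delta) corrections

Choosing a smaller delta increases the discrete entropy logarithmically; this is the divergence cured by the factor p_0(M).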
In Section 2 of this review, we analyze the relation between continuous phase transitions, characterized by an order parameter, and the Landauer bound [23]. Using the example of the Hopfield model [26], we show that the information-theoretic expression for the entropy production during the erasure process, expressed in terms of the error probability, has the same expression as the thermodynamic entropy in the partially ordered phase. For the Hopfield model, however, the completely ordered state corresponds to perfect remembering rather than forgetting, so the Landauer bound sets a lower limit for the cost of ‘remembering’ [35].
In Section 3, using the relation between Landauer’s limit and continuous phase transitions, we extend the results of Section 2 to analog computing systems [27]. In this case, the entropy production per degree of freedom during the erasure of an analog variable is given by the logarithm of the configurational volume measured in units of its minimal quantum. Also in this case, we have a “discretization” of the information-bearing degrees of freedom, and an infinite amount of energy would be required to perform a computation with infinite precision.

2. Thermodynamic Entropy in Continuous Phase Transitions and Landauer Bound

Neural networks, using the definition given in [25], “are algorithms for cognitive tasks, such as learning and optimization, which are in a loose sense based on concepts derived from research into the nature of the brain”. One important task that neural networks perform is pattern recognition: the retrieval of information is performed, contrary to address-oriented memories, by looking at the “similarity” between a presented pattern and the stored patterns. Associative memories have the advantage of being able to retrieve information even in the case of incomplete or noisy inputs, which is not possible in traditional computers. The Hopfield model [26,36] is the paradigmatic example of a neural network designed to perform the task of associative pattern retrieval and is widely used as an associative memory.
In associative memories, when a new pattern is presented, the network evolves from a totally unknown state to a state which corresponds to the stored pattern. As shown in [25], this is gauge equivalent to a state with all neurons equal, e.g., to +1. The transition between the unknown state and the final state corresponding to the stored pattern is, by definition, the process of remembering rather than forgetting, and the Landauer limit corresponds to the minimum energy necessary for remembering. Noise affects the remembering process: when it is not too large, remembering is still possible at the minimum energy cost; when errors become too important, there is a phase transition to a state in which remembering becomes impossible.
The Hopfield model [26] is a network of N binary neurons s_i, i = 1 … N, with s_i = ±1, fully connected by symmetric synapses with coupling strengths w_ij = w_ji (w_ii = 0), which can be excitatory (>0) or inhibitory (<0). The state s_i = +1 indicates the firing state of the neuron, while s_i = −1 indicates the resting state. The network is characterized by an energy function
$E = -\frac{J}{2} \sum_{i \neq j} w_{ij}\, s_i s_j\,, \qquad s_i = \pm 1\,, \quad i,j = 1 \dots N\,,$  (5)
where J represents the (positive) coupling constant. The dynamical evolution of the network state is defined by the random sequential updating (in time t) of the neurons according to the rule
$s_i(t+1) = {\rm sign}\left(h_i(t)\right)\,,$  (6)
$h_i(t) = J \sum_{j \neq i} w_{ij}\, s_j(t)\,,$  (7)
where h_i is the local field. As is standard for neural networks and, thus, for the Hopfield model [25], the temporal evolution proceeds in finite steps, which correspond to the updating of the neurons according to the rule of Equations (6) and (7). At time (t + 1), the neurons are firing or resting depending on the activation function. This process is intrinsically discrete in time.
$w_{ij} = \frac{1}{N} \sum_{\mu=1}^{p} \sigma_i^\mu \sigma_j^\mu\,,$  (8)
where σ_i^μ, μ = 1 … p, are the p binary patterns to be memorized. The synaptic strengths contain all the information of the memory, which is encoded in the interactions between the spins σ_i^μ.
The dynamical evolution of the network allows the system, prepared in an initial state s_i^0 (the presented pattern), to retrieve the stored pattern σ_i^λ which most closely “resembles” the presented pattern, namely the one that minimizes the Hamming distance, i.e., the total number of bits that differ in the two patterns.
With the synaptic strengths chosen according to the Hebb rule, the dynamical evolution of Equations (6) and (7) minimizes the energy of Equation (5): the stored patterns are “attractors” of this dynamics, namely, they are local minima of the energy functional, which is bounded from below. This implies that, when an initial pattern is presented, it will evolve until it overlaps with the closest stored pattern and then stops evolving. The possibility of remembering depends, however, crucially on the loading factor α = p/N, given by the ratio between the number of stored memories and the number of available bits [25]: above a critical value, the network has a phase transition into a spin glass [36], and remembering becomes impossible. A minimal simulation of this retrieval dynamics is sketched below.
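The following sketch (added for illustration; the network size, the noise level, and all variable names are arbitrary choices, and the coupling constant J is set to one) implements the Hebb rule of Equation (8) and the deterministic update of Equations (6) and (7) for a few stored patterns:

    import numpy as np

    rng = np.random.default_rng(0)
    N, p = 100, 3
    patterns = rng.choice([-1, 1], size=(p, N))              # p binary patterns to store

    # Hebb rule, Equation (8), with zero self-couplings
    w = (patterns.T @ patterns).astype(float) / N
    np.fill_diagonal(w, 0.0)

    # Present a corrupted version of pattern 0 and update sequentially, Equations (6) and (7)
    s = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)   # flip about 10% of the bits
    for _ in range(10):                                       # a few sweeps suffice at low loading
        for i in rng.permutation(N):
            h = w[i] @ s                                      # local field h_i
            s[i] = 1 if h >= 0 else -1

    print(np.mean(s * patterns[0]))                           # overlap 1.0 means perfect retrieval

For a small loading factor α = p/N such as the one above, the presented pattern flows to the stored attractor; increasing α beyond the critical value destroys retrieval, as discussed in the text.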
In what follows, we consider the case of a single stored pattern σ_i. As shown in [25], using the gauge transformation
$s_i \to \sigma_i\, s_i\,,$  (9)
the energy functional Equation (5) becomes
$E = -\frac{J}{2N} \sum_{i \neq j} s_i s_j\,,$  (10)
the Hopfield model thus reduces to the long-range Ising model, and the stored pattern becomes σ_i = +1 for all i. Remembering, for the network, is in this case equivalent to resetting the N-bit register to this value. Note that in the Hopfield model, since the synapses are quadratic in the spins, there is always a symmetry between the memory and its NOT for one stored pattern, e.g., s_i = +1 → s_i = −1 for all i if the stored pattern has all spins up, as we chose in the present case. Both are minima of the dynamics. However, when a pattern is presented, the system recovers the attractor that is closest in Hamming distance to the presented pattern.
The deterministic update law Equation (6) can be made probabilistic by introducing a fictitious temperature T = 1/(kβ) and, thus, thermal noise:
${\rm Prob}\left[s_i(t+1) = +1\right] = f\left(h_i(t)\right)\,,$  (11)
where the activation function f is the Fermi function
$f(h) = \frac{1}{1 + \exp(-2\beta h)}\,.$  (12)
The deterministic behavior is recovered in the limit β → ∞. The main difference with respect to deterministic neurons, which are always active or dormant according to the sign of h, is that stochastic neuron activities fluctuate due to thermal noise, and we can define a mean activity for a single neuron:
$\langle s_i \rangle = (+1)\, f(h_i) + (-1)\, f(-h_i)\,,$  (13)
where ⟨⋯⟩ denotes the thermal average. We now note that, for the long-range Ising model, the mean field approximation ⟨f(h_i)⟩ ≃ f(⟨h_i⟩) is exact [37], and we thus obtain the deterministic equation:
$\langle s_i \rangle = \tanh\left(\frac{\beta J}{N} \sum_{j \neq i} \langle s_j \rangle\right)\,.$  (14)
Defining the mean magnetization as m ≡ (1/N) Σ_i ⟨s_i⟩, we can rewrite Equation (14) as
$m = \tanh(\beta J m)\,,$  (15)
where we considered the thermodynamic limit N → ∞. We can now apply the known results for the mean field Ising model. The self-consistency Equation (15) has only one solution for βJ < 1, which corresponds to zero magnetization, m = 0. When βJ > 1, Equation (15) admits three solutions, m = 0 and m = ±m_0(β), but only the latter two are stable against small fluctuations; we thus have a magnetization m_0. The condition βJ = 1 gives the critical temperature T_c = J/k: for T > T_c, the network is disordered and remembering is not possible, while for T < T_c, the network exhibits a partial magnetization m_0, which goes to m → 1 for T → 0. Partial erasure is, thus, possible.
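As a simple numerical check (illustrative only; the fixed-point iteration and the starting value are choices made here, with J = k = 1), the self-consistency condition of Equation (15) can be solved by iteration, reproducing the behavior of the order parameter across T_c:

    import numpy as np

    def mean_field_m(T, J=1.0, k=1.0, n_iter=2000):
        # Iterate m -> tanh(beta*J*m), Equation (15), starting from a magnetized guess
        beta = 1.0 / (k * T)
        m = 0.9
        for _ in range(n_iter):
            m = np.tanh(beta * J * m)
        return m

    for T in [0.2, 0.5, 0.9, 1.1, 1.5]:    # T_c = J/k = 1
        print(T, round(mean_field_m(T), 4))
    # Above T_c the iteration collapses to m = 0; below T_c it converges to m_0(T) > 0.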
Remembering for stochastic neurons is equivalent to a reset operation with errors. For T ≥ T_c, individual neurons fluctuate freely, and we are in the disordered phase. When T goes below T_c, the neurons become partially frozen in the stored-pattern configuration, and m(T) tells us the average rate of errors in the reset process at this temperature. In this procedure, all the work performed by lowering the temperature goes into lowering the entropy of the system. In fact, as T goes infinitesimally below T_C, the Landau function [24] bifurcates into two minima, creating a new order and a new configuration space for the network. Note, however, that if the erasure process is performed in a finite amount of time, the system will dissipate a finite amount of heat [38,39].
Following the standard treatment of the mean field Ising model (which in the present case is an exact solution), we expand the spin variables s_i around their mean value m as s_i = m + δs_i, with δs_i ≡ (s_i − m). At the lowest order, the energy functional becomes
$E = \frac{J N m^2}{2} - J m \sum_i s_i\,,$  (16)
where we omitted an irrelevant constant. At this order in δs_i, the partition function is
$Z = \sum_{\rm conf.} e^{-\beta E} = e^{-\beta J N m^2/2}\, \left[2\cosh(\beta J m)\right]^N\,.$  (17)
We thus obtain for the entropy the expression
$S = \frac{\partial}{\partial T}\left(k T \ln Z\right) = k N \left[\ln\left(2\cosh(\beta m J)\right) - \beta m J \tanh(\beta m J)\right]\,.$  (18)
At T = T_c, m = 0, the system is disordered, and the entropy takes its maximum value S = kN ln 2, while at T = 0, m = 1 and S = 0: the system is ordered and remembering is perfect. The entropy variation between the disordered state and a state with partial remembering, 0 < m(T) < 1, is
$\frac{\Delta S}{k N} = \frac{1}{k N}\left[S_{T_c} - S_{T}\right] = \ln 2 - \ln\left[2\cosh\left(m \frac{T_c}{T}\right)\right] + m \frac{T_c}{T} \tanh\left(m \frac{T_c}{T}\right)\,.$  (19)
Equation (19) represents the heat dissipated per bit (in units of kT) during the simulated-annealing erasure procedure and, thus, the Landauer bound for stochastic neurons described by the Hopfield model at temperature T. Perfect remembering, T = 0 and m = 1, gives back the original bound ln(2). A higher temperature corresponds to erasure with errors, in our case due to thermal fluctuations in the fictitious temperature; when T reaches T_C and m becomes 0, the system has a phase transition to a disordered state, and remembering is no longer possible. In Landauer erasure, this corresponds to resetting to an unknown state, i.e., setting the error probability p = 1/2 in Equation (1).
The previous analysis tells us that the error probability p in the Landauer erasure is determined by the stochastic updating rule for the Hopfield network, Equation (11). According to Equation (11), the probability that a neuron flips due to thermal noise is
${\rm Prob}\left[s_i(t+1) = -s_i(t)\right] = \frac{\exp\left[-\beta h_i(t)\, s_i(t)\right]}{2\cosh\left[\beta h_i(t)\, s_i(t)\right]}\,,$  (20)
so the probability that it flips from the desired value +1 (since we are resetting to a memory register with all bits equal to +1) to the wrong value −1 is
$p = {\rm Prob}\left[+1 \to -1\right] = \frac{1}{2}\,(1 - m) = \frac{\exp(-\beta J m)}{2\cosh(\beta J m)}\,.$  (21)
The maximum error probability p = 1 / 2 corresponds to m = 0 , the maximally disordered state of the network reached at T = T C , while p = 0 corresponds to the perfect order for the network with order parameter m = 1 at T = 0 . Inserting Equation (21) into Equation (19), we obtain for the entropy difference, and thus for the dissipated heat, exactly the information-theoretic expression Equation (1):
$\frac{\Delta S}{k N} = \ln 2 + p \ln(p) + (1 - p)\ln(1 - p)\,.$  (22)
When p = 0, the Landauer bound is saturated, and the entropy difference between the completely disordered state of the model and its perfectly ordered T = 0 state reaches the exact value
$\Delta S = k N \ln(2)\,.$  (23)
Once we reach the value m = 0 for the order parameter, which describes the broken symmetry phase, we reach the maximum entropy for the network and we cannot keep disordering the system without violating the Landauer bound and, thus, the second law of thermodynamics. The phase transition, which takes place at T C , thus corresponds to the saturation of the Landauer limit. The generalized Landauer theorem states, thus, that the sum of the entropy loss per bit and the one-bit error entropy cannot be lower than the bound k ln ( 2 ) , and it is exactly equal to this bound when the procedure is efficient. When this bound is saturated by the error entropy, resetting (remembering here) is no longer possible, and a phase transition occurs.
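The identity between the thermodynamic expression of Equation (19) and the information-theoretic expression of Equation (22) can be checked numerically; the short sketch below (added for illustration, with J = k = 1 and an arbitrary temperature) solves Equation (15) for m, evaluates both expressions, and confirms that they coincide:

    import numpy as np
    from scipy.special import xlogy    # xlogy(x, y) = x*log(y), with the convention 0 at x = 0

    J = k = 1.0
    T = 0.6                            # any temperature 0 < T < T_c = J/k
    beta = 1.0 / (k * T)
    m = 0.9
    for _ in range(2000):              # self-consistent magnetization, Equation (15)
        m = np.tanh(beta * J * m)

    x = beta * J * m                   # equals m*T_c/T
    dS_thermo = np.log(2) - np.log(2 * np.cosh(x)) + x * np.tanh(x)   # Equation (19)
    p = np.exp(-x) / (2 * np.cosh(x))                                  # Equation (21)
    dS_info = np.log(2) + xlogy(p, p) + xlogy(1 - p, 1 - p)            # Equation (22)
    print(dS_thermo, dS_info)          # identical up to rounding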
The Hopfield model has a discrete Z_2 symmetry, corresponding to a spin 1/2. The generalization to higher spins with a classical Z(2n+1) symmetry, with n = 1/2, 1, 3/2, …, is, however, straightforward. In the case of a Z(2n+1) symmetry, Equation (23) becomes
$\Delta S = S(T_C) - S(0) = k N \ln(2n + 1)\,.$  (24)
In the more general case of a system of N elementary components with D degrees of freedom each, which undergoes a continuous phase transition to a partially ordered phase below a critical temperature, only d degrees of freedom survive in the partially ordered phase, while the others are frozen. The phase transition is characterized by a complex vector of order parameters whose norm η rises from 0 in the disordered phase to 1 at zero temperature. The ratio between the original number of degrees of freedom D and the number d in the partially ordered phase can be written as
$D/d = q^n\,,$  (25)
with n an integer larger than one if D/d is a prime power and q a prime number; if D/d is not a prime power, we have n = 1 and q = D/d. If we take, for example, the simple case q = 2, the phase transition can be seen as the formal “resetting” of dN bits to their standard value, with error probability p(T) = (1 − η(T))/2. The entropy change during a generic phase transition is, thus, again given by Equation (22) with N → dN; otherwise, the Landauer bound would be violated in the ordering process. For q = 3, we have trits instead of bits, and the generalization to other values of q is straightforward.

3. Analog Computing Systems

In analog computing systems, information is encoded in a continuous variable. To compute the entropy change during the erasure of information encoded in a continuous variable, we will use the relation between the Landauer principle and entropy change during continuous phase transitions [23]. We will again assume that the erasure is efficient.
We study the three-dimensional ferromagnetic classical Heisenberg model, which undergoes a phase transition with spontaneous symmetry breaking O(3) → O(2) [24] and is described by the Hamiltonian
$H = -\frac{J}{2} \sum_{\langle i,j \rangle} \mathbf{s}_i \cdot \mathbf{s}_j - \mathbf{H} \cdot \sum_i \mathbf{s}_i\,,$  (26)
where ⟨i,j⟩ denotes the sum over nearest-neighbor spins, with i running from 1 to the number N of spins, and |s_i|² = 1. In this case, the spin orientation, which encodes the analog information, can take all values on a sphere of unit radius, and we have, thus, a continuum of possible values; for Ising spins, the orientation is binary, up or down. Since the model is ferromagnetic, we have J > 0. H is a constant external magnetic field in the ẑ direction. The ferromagnetic Heisenberg model undergoes a continuous phase transition [24], characterized by an order parameter m, the mean magnetization. For T ≥ T_C, the system is disordered, while for T < T_C, the phase becomes partially ordered and m reaches the value 1 at T = 0. As in the case of discrete symmetry, lowering the temperature is akin to an erasure process, and m determines the error probability of the reset operation [20].
Following what we did in the previous section, we identify the Shannon entropy of the erasure process in the analog computing system with the entropy variation during the transition from T = T_C to T = 0. The entropy variation
$\frac{\Delta S}{k N} = \frac{S(T_C) - S(0)}{k N}$  (27)
gives, thus, the Landauer bound for an analog computing system [23].
We use again the mean field approximation; the mean field Hamiltonian for the Heisenberg model is [24]
$H_{\rm mf} = -\sum_{i} \mathbf{s}_i \cdot \mathbf{H}_{\rm eff} + \frac{J N m^2}{2}\,, \qquad \mathbf{H}_{\rm eff} = (J m + H)\,\hat{z}\,,$  (28)
where m is the mean magnetization, m ≡ (1/N) Σ_i ⟨s_i⟩. The effective magnetic field is the sum of the average field generated by all the other spins plus the external magnetic field H. As usual, we will take the limit H → 0, obtaining, thus, the partition function
$Z = \exp\left(-\beta J N \frac{3 m^2}{2}\right) \int \prod_{i=1}^{N} d^3 s_i\, \delta(s_i^2 - 1)\, \exp\left(\beta J m \sum_{i=1}^{N} \cos\theta_i\right)\,,$  (29)
where θ_i is the angle between the spin and the ẑ direction and β = (kT)^{−1}. From Equation (29), we derive the free energy
$\frac{F}{N} = \frac{3 J m^2}{2} - \frac{1}{\beta} \ln\left[4\pi\, \frac{\sinh(\beta m J)}{\beta m J}\right]\,,$  (30)
and from the free energy, the entropy
$\frac{S}{k N} = \ln\left[4\pi\, \frac{\sinh(\beta m J)}{\beta m J}\right] - \beta m J\, L(\beta m J)\,,$  (31)
where L ( x ) = coth x 1 / x is the Langevin function.
Let us now consider the limits T = T_C and T = 0 of Equation (31). When T → T_C, m → 0, and the entropy reaches its maximal value, the logarithm of the volume of the configuration space, namely, the area of a sphere of unit radius:
$\frac{S(T_C)}{k N} = \ln(4\pi)\,.$  (32)
When T → 0 and m → 1, which corresponds to the perfect reset, the entropy becomes negative and divergent, in contradiction with the third law of thermodynamics:
$\frac{S(T \to 0)}{k N} \to -\infty\,.$  (33)
This problem is common for various classical systems. One textbook example is the classical harmonic oscillator [40], and, in general, the way to cure this problem is to consider the classical system as the limit of its quantum counterpart.
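The two limits above can be made explicit with a few lines of code (an illustration added here; the sample values of x are arbitrary), by evaluating Equation (31) as a function of x = βmJ:

    import numpy as np

    def S_classical(x):
        # Equation (31) as a function of x = beta*m*J: classical entropy per spin, in units of k
        L = 1.0 / np.tanh(x) - 1.0 / x           # Langevin function
        return np.log(4 * np.pi * np.sinh(x) / x) - x * L

    for x in [1e-3, 1.0, 10.0, 100.0]:
        print(x, S_classical(x))
    # x -> 0 (T -> T_C, m -> 0) gives ln(4*pi) ~ 2.53, Equation (32);
    # as x grows (T -> 0, m -> 1) the entropy decreases without bound, Equation (33).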
In the mean field approximation, the quantum ferromagnetic Heisenberg model describes a system of non-interacting quantum spins s_i with (2s+1) components in an external magnetic field H_eff. The mean field Hamiltonian for the quantum Heisenberg model is [41]:
$H = -\mathbf{H}_{\rm eff} \cdot \sum_{i=1}^{N} \mathbf{s}_i\,.$  (34)
It describes non-interacting (2s+1)-component spins in an external magnetic field H_eff, which, again, is the sum of the average field generated by all the other spins plus the external magnetic field H. The partition function is
$Z = \left[\sum_{n=-s}^{s} \exp\left(\beta H_{\rm eff}\, n\right)\right]^N\,,$  (35)
while the entropy, in the limit in which the external magnetic field H → 0, is
$\frac{S}{k N} = \ln\frac{\sinh\left[\left(1 + \frac{1}{2s}\right)\beta m J s\right]}{\sinh\left[\frac{\beta m J s}{2s}\right]} - \beta m J s\, B_s(\beta m J s)\,,$  (36)
where B_s is the Brillouin function, defined as $B_n(x) = \frac{2n+1}{2n}\coth\left(\frac{2n+1}{2n}\,x\right) - \frac{1}{2n}\coth\left(\frac{x}{2n}\right)$. In the limit T → 0, the entropy of Equation (36) goes to zero, S(T = 0)/(kN) = 0, in agreement with the third law of thermodynamics.
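A short numerical sketch (added for illustration; the spin value and the arguments are arbitrary) confirms the two limits of Equation (36): at high temperature the entropy per spin approaches ln(2s+1), the logarithm of the number of available states, while at T → 0 it vanishes:

    import numpy as np

    def brillouin(n, x):
        a = (2 * n + 1) / (2 * n)
        b = 1 / (2 * n)
        return a / np.tanh(a * x) - b / np.tanh(b * x)

    def S_quantum(s, x):
        # Equation (36) with x = beta*m*J*s: entropy per spin of a quantum spin s, in units of k
        return np.log(np.sinh((1 + 1/(2*s)) * x) / np.sinh(x / (2*s))) - x * brillouin(s, x)

    print(S_quantum(5, 1e-6))   # ~ ln(2s+1) = ln(11): all 2s+1 states available at high T
    print(S_quantum(5, 50.0))   # ~ 0: the third law is respected at T -> 0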
The classical limit of the quantum Heisenberg model [42,43,44] is obtained by properly distributing the (2s+1) values of the quantum spin on the classical sphere of area 4πr², where r has the dimensions of an action and is equal to 1 in our case. If we call s_max the highest weight of the representation in the quantum case, we can define the minimum state volume (the inverse of the spin density) as [42]
$\Delta^{-1} \equiv \frac{2 s_{\max} + 1}{4\pi}\,, \qquad \Delta \to 0 \quad {\rm for} \quad s_{\max} \to \infty\,.$  (37)
To ensure the existence of the infinite-spin limit [42], we need, however, to rescale the spin as s → s/s_max. The minimum state volume Δ represents the minimum area on the unit sphere occupied by a spin. This minimum volume is given by the Heisenberg principle:
$\Delta = \frac{\hbar}{2}\,; \qquad \hbar\, s_{\max} \to s_{\rm clas} \quad {\rm for} \quad s_{\max} \to \infty\,, \ \hbar \to 0\,,$  (38)
with s class , which has the dimensions of an action, and s class = 1 in our case. Correspondingly, we define the regularized entropy as
$\frac{S}{k N} = \ln\left[\frac{4\pi}{2 s_{\max} + 1}\, \frac{\sinh\left[\left(1 + \frac{1}{2 s_{\max}}\right)\beta m J\right]}{\sinh\left[\frac{\beta m J}{2 s_{\max}}\right]}\right] - \beta m J\, B_{s_{\max}}(\beta m J)\,.$  (39)
The classical limit corresponds to s_max → ∞. When T = T_C, the entropy is, as before,
$\frac{S(T = T_C)}{k N} = \ln(4\pi)\,,$  (40)
while at T = 0 , we obtain
$\frac{S(T \to 0)}{k N} = \ln\left(\frac{4\pi}{2 s_{\max} + 1}\right) = \ln\left(\frac{\hbar}{2}\right)\,,$  (41)
where we used Equations (37) and (38) (note that, in the last term of Equation (41), ℏ is divided by a constant that has the dimensions of an action and that is equal to one). This result tells us that, if we want to avoid the entropy divergence, we cannot actually send ℏ → 0. In fact, the limit ℏ → 0 corresponds to a classical distribution concentrated in regions smaller than the minimum area allowed by the Heisenberg principle. Δ plays the role of the factor p_0(M) in Equation (4), and its presence is due to the fact that the continuous Shannon entropy must be regularized in order to make it invariant under a change of coordinates.
Using Equations (40) and (41), we obtain for the entropy variation
$\frac{\Delta S}{k N} = \frac{S(T_C) - S(T=0)}{k N} = \ln\left(4\pi s_{\rm clas}^2\right) - \ln\left(\frac{\hbar\, s_{\rm clas}}{2}\right) = \ln\left(\frac{8\pi s_{\rm clas}}{\hbar}\right)\,.$  (42)
For the Heisenberg model, we have s_clas = 1, and we thus obtain
$\frac{\Delta S}{k N} = \frac{S(T_C) - S(T=0)}{k N} = \ln(8\pi)$  (43)
(note that the quantity inside the logarithm in Equation (42) is dimensionless, since s_clas, like ℏ, has the dimensions of an action, and s_clas = ℏ = 1 in our units). The entropy variation of Equation (43) represents the analog generalization of the Landauer bound: the entropy change during the erasure process performed by resetting a continuous variable, the spin s, to a standard value is given by the logarithm of the available configuration volume (the area 4π in this case) measured in units of the minimum quantum of configuration volume Δ. This implies that, both for digital and analog information, physical systems can encode only a finite, countable amount of information [28,45,46], and that information can be manipulated only with finite precision: infinite precision, namely the realization of a truly analog computing system, is forbidden by the laws of physics.
The maximum number of possible logic states that we can associate with the Heisenberg model is
$N_l = {\rm int}\left(8\pi\right)\,,$  (44)
(int(a) denotes the integer part of the number a), while for a generic angular momentum L, it is
$N_l = {\rm int}\left(\frac{8\pi L}{\hbar}\right)\,,$  (45)
to which we can associate a finite number of bits:
$n = \log_2(N_l)\,.$  (46)
For the case of a cube of 5 × 5 × 5 = 125 atoms [47], with an angular momentum of the order of ℏ per atom, for which the interactions between the momenta are such that they behave like a single classical momentum, we have N_l ≈ 3140. From Equation (46), we obtain a number n ≈ 11.6 of bits that can be stored. Under the same assumptions, in a system of magnetic nano-dots with a 20 nm side, containing approximately 200 million atoms, we can store up to n = 27.6 bits. If we want to perform a perfect Landauer reset with this system, the amount of heat to be dissipated is readily provided by Equation (42): Q ≈ 19.11 kT, approximately 30 times what we would have for the reset of a binary system.
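The arithmetic in this paragraph can be reproduced directly (an illustrative sketch; the identification L ≈ 125 ℏ for the atom cube and the quoted 27.6 bits for the nano-dot are taken from the text above, and small differences in the last digit are due to rounding):

    import numpy as np

    # Equations (45) and (46) for the 5x5x5 cube of atoms, with total L ~ 125 hbar
    L_over_hbar = 125
    N_l = int(8 * np.pi * L_over_hbar)     # ~3.1e3 logic states
    n = np.log2(N_l)                       # ~11.6 bits
    print(N_l, n)

    # Heat dissipated by a perfect Landauer reset of the 20 nm nano-dot, in units of kT
    n_dot = 27.6                           # bits quoted in the text
    print(n_dot * np.log(2))               # ~19.1, i.e. roughly 30 times the single-bit cost ln 2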
We now consider the more general case of the symmetry-breaking pattern O(n) → O(n−1). Within the mean field approximation, this generalization is straightforward. From Equation (29), substituting d³s → dⁿs, we have
$Z = \exp\left(-\beta J N \frac{3 m^2}{2}\right) \int \prod_{i=1}^{N} d^n s_i\, \delta(s_i^2 - 1)\, \exp\left(\beta J m \sum_{i=1}^{N} \cos\theta_i\right)\,,$  (47)
and for the entropy
$\frac{S}{k N} = \ln\left[\frac{(2\pi)^{\frac{n}{2}}\, I_{\frac{n}{2}-1}(\beta m J)}{(\beta m J)^{\frac{n}{2}-1}}\right] - \beta m J\, \frac{I_{\frac{n}{2}}(\beta m J)}{I_{\frac{n}{2}-1}(\beta m J)}\,,$  (48)
where I_ν(z) are the modified Bessel functions of the first kind.
The entropy difference between the perfectly ordered state at T = 0 and the completely disordered one at T = T_C, m = 0, which gives the Landauer bound for the erasure of an O(n) spin s, is
$\frac{\Delta S}{k N} = \frac{S(T_C) - S(0)}{k N}\,,$  (49)
with
$\frac{S(T_C)}{k N} = \ln S_{n-1}\,, \qquad S_{n-1} = \frac{2\pi^{\frac{n}{2}}}{\Gamma\left(\frac{n}{2}\right)}\,,$  (50)
where S_{n−1} is the area of the (n−1)-sphere of unit radius and Γ(x) is the Euler gamma function. As for the Heisenberg model, the limit T → 0 is singular, and the entropy is negative and logarithmically divergent: S(T → 0)/(kN) → −∞.
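The high-temperature limit of Equation (48) can be checked against Equation (50) numerically (an illustrative sketch using SciPy's modified Bessel and gamma functions; the values of n and the small argument are arbitrary):

    import numpy as np
    from scipy.special import iv, gamma

    def S_On(n, x):
        # Equation (48): entropy per spin (units of k) for the O(n) model, x = beta*m*J
        log_Z1 = (n/2) * np.log(2*np.pi) + np.log(iv(n/2 - 1, x)) - (n/2 - 1) * np.log(x)
        return log_Z1 - x * iv(n/2, x) / iv(n/2 - 1, x)

    for n in [2, 3, 4]:
        sphere_area = 2 * np.pi**(n/2) / gamma(n/2)      # Equation (50)
        print(n, S_On(n, 1e-4), np.log(sphere_area))     # the m -> 0 limit reproduces ln S_{n-1}

For n = 3, the same expression reproduces the classical Heisenberg result of Equation (31).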
The regularization of the entropy is, however, more complicated in this case since, contrary to the O(3) Heisenberg model, analytical results are not known [48] for the O(n)-symmetric quantum Heisenberg model, not even in the mean field approximation. Additionally, the definition of the classical limit is not clear. Extending to this case the results obtained for the O(3) case, given by Equation (50), we conjecture that, for the O(n)-symmetric case, the entropy change during the erasure process will be given by the available configuration volume, the area of the (n−1)-sphere, measured in units of the minimum quantum of the configuration volume, which in this case will be of order ℏ^{n−2}. In the case of the SU(n)-symmetric (restricted to symmetric representations) Heisenberg model, the possibility of having a positive classical entropy was proposed by Lieb and Solovej [49] using a coherent-states approach.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
  2. Landauer, R. Information Is Physical. Phys. Today 1991, 44, 23–29.
  3. Landauer, R. The Physical Nature of Information. Phys. Lett. A 1996, 217, 188–193.
  4. Szilard, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Z. Phys. 1929, 53, 840–856.
  5. Bennett, C.H. The thermodynamics of computation—A review. Int. J. Theor. Phys. 1982, 21, 905–940.
  6. Brillouin, L. Maxwell’s Demon Cannot Operate: Information and entropy. I. J. Appl. Phys. 1951, 22, 334–337.
  7. Brillouin, L. Physical Entropy and Information. II. J. Appl. Phys. 1951, 22, 338–343.
  8. Zivieri, R. Magnetic Skyrmions as Information Entropy Carriers. IEEE Trans. Magn. 2022, 58, 1500105.
  9. Zivieri, R. From Thermodynamics to Information: Landauer’s Limit and Negentropy Principle Applied to Magnetic Skyrmions. Front. Phys. 2022, 10, 769904.
  10. Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 2012, 483, 187–189.
  11. Roldán, É.; Martínez, I.A.; Parrondo, J.M.R.; Petrov, D. Universal features in the energetics of symmetry breaking. Nat. Phys. 2014, 10, 457.
  12. Jun, Y.; Gavrilov, M.; Bechhoefer, J. High-Precision Test of Landauer’s Principle in a Feedback Trap. Phys. Rev. Lett. 2014, 113, 190601.
  13. Hong, J.H.; Lambson, B.; Dhuey, S.; Bokor, J. Experimental test of Landauer’s principle in single-bit operations on nanomagnetic memory bits. Sci. Adv. 2016, 2, e1501492.
  14. Maroney, O.J.E. Generalising Landauer’s principle. Phys. Rev. E 2009, 79, 031105.
  15. Kish, L.B.; Granqvist, C.G. Energy requirement of control: Comments on Szilard’s engine and Maxwell’s demon. Europhys. Lett. 2012, 98, 68001.
  16. Norton, J.D. All shook up: Fluctuations, Maxwell’s demon and the thermodynamics of computation. Entropy 2013, 15, 4432–4483.
  17. Sagawa, T. Thermodynamic and logical reversibilities revisited. J. Stat. Mech. 2014, 2014, 03025.
  18. Chiuchiu, D.; Diamantini, M.C.; Gammaitoni, L. Conditional entropy and Landauer principle. Europhys. Lett. 2015, 111, 40004.
  19. Parrondo, J.M.R.; Horowitz, J.M.; Sagawa, T. Thermodynamics of information. Nat. Phys. 2015, 11, 131–139.
  20. Gammaitoni, L. Beating the Landauer’s limit by trading energy with uncertainty. arXiv 2011, arXiv:1111.2937v1.
  21. Sagawa, T.; Ueda, M. Nonequilibrium thermodynamics of feedback control. Phys. Rev. E 2012, 85, 021104.
  22. Sagawa, T.; Ueda, M. Minimal energy cost for thermodynamic information processing: Measurement and information erasure. Phys. Rev. Lett. 2009, 102, 250602.
  23. Diamantini, M.C.; Trugenberger, C.A. Generalized Landauer bound as a universal thermodynamic entropy in continuous phase transitions. Phys. Rev. E 2014, 89, 052138.
  24. Negele, J.W.; Orland, H. Quantum Many-Particle Systems; Addison-Wesley: Boston, MA, USA, 1998.
  25. Müller, B.; Reinhardt, J. Neural Networks; Springer: Berlin, Germany, 1990.
  26. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Nat. Acad. Sci. USA 1982, 79, 2554.
  27. Diamantini, M.C.; Gammaitoni, L.; Trugenberger, C.A. Landauer bound for analog computing systems. Phys. Rev. E 2016, 94, 012139.
  28. Ihara, S. Information Theory for Continuous Systems; World Scientific: Singapore, 1993.
  29. Shannon, C.E. The Mathematical Theory of Communication; University of Illinois Press: Champaign, IL, USA, 1949.
  30. Jaynes, E.T. Information Theory and Statistical Mechanics; Brandeis University Summer Institute Lectures in Theoretical Physics; Brandeis University: Waltham, MA, USA, 1963.
  31. Jaynes, E.T. Information Theory and Statistical Mechanics I. Phys. Rev. 1957, 106, 620–630.
  32. Jaynes, E.T. Information Theory and Statistical Mechanics II. Phys. Rev. 1957, 108, 171–190.
  33. Ash, R. Information Theory; Interscience Publishers: New York, NY, USA, 1965.
  34. Wehrl, A. On the relation between classical and quantum-mechanical entropy. Rep. Math. Phys. 1979, 16, 353–358.
  35. Chiuchiu, D.; Lopez-Suarez, M.; Neri, I.; Diamantini, M.C.; Gammaitoni, L. Cost of remembering a bit of information. Phys. Rev. A 2018, 97, 052108.
  36. Mezard, M.; Parisi, G.; Virasoro, M.A. Spin Glass Theory and Beyond; World Scientific: Singapore, 1987.
  37. Zinn-Justin, J. Quantum Field Theory and Critical Phenomena; Oxford University Press: Oxford, UK, 1989.
  38. Proesmans, K.; Ehrich, J.; Bechhoefer, J. Finite-time Landauer principle. Phys. Rev. Lett. 2020, 125, 100602.
  39. Van Vu, T.; Saito, K. Finite-time quantum Landauer principle and quantum coherence. Phys. Rev. Lett. 2022, 128, 010602.
  40. Hänggi, P.; Ingold, G.L. Quantum Brownian motion and the Third Law of thermodynamics. Acta Phys. Pol. B 2006, 37, 1537.
  41. Pathria, R.K. Statistical Mechanics; Pergamon Press: Oxford, UK, 1972.
  42. Fisher, M.E. Magnetism in one-dimensional systems—The Heisenberg model for infinite spin. Am. J. Phys. 1964, 32, 343–346.
  43. Millard, K.; Leff, H.S. Infinite-Spin Limit of the Quantum Heisenberg Model. J. Math. Phys. 1971, 12, 1000.
  44. Lieb, E.H. The classical limit of quantum spin systems. Commun. Math. Phys. 1973, 31, 327–340.
  45. Bekenstein, J.D. Entropy content and information flow in systems with limited energy. Phys. Rev. D 1984, 30, 1669.
  46. Lloyd, S. Ultimate physical limits to computation. Nature 2000, 406, 1047.
  47. Feynman, R.P. There’s plenty of room at the bottom [data storage]. J. Microelectromech. Syst. 1992, 1, 60–66.
  48. Berdnikov, B.A. Quantum Magnets with an SO(n) Symmetry; MIT: Cambridge, MA, USA, 1998; and references therein.
  49. Lieb, E.H.; Solovej, J.P. Proof of the Wehrl-type Entropy Conjecture for Symmetric SU(N) Coherent States. Commun. Math. Phys. 2016, 348, 567–578.