Review

Entropy Production in Quantum is Different

by Mohammad H. Ansari 1,*, Alwin van Steensel 1 and Yuli V. Nazarov 2

1 Jülich-Aachen Research Alliance Institute (JARA) and Peter Grünberg Institute (PGI-2), Forschungszentrum Jülich, D-52425 Jülich, Germany
2 Department of Quantum Nanoscience, Kavli Institute of Nanoscience, TU Delft, Lorentzweg 1, 2628CJ Delft, The Netherlands
* Author to whom correspondence should be addressed.
Entropy 2019, 21(9), 854; https://doi.org/10.3390/e21090854
Submission received: 30 June 2019 / Revised: 6 August 2019 / Accepted: 28 August 2019 / Published: 31 August 2019
(This article belongs to the Special Issue Quantum Transport in Mesoscopic Systems)

Abstract:
Currently, ‘time’ does not play any essential role in quantum information theory. In this sense, quantum information theory is underdeveloped similarly to how quantum physics was underdeveloped before Erwin Schrödinger introduced his famous equation for the evolution of a quantum wave function. In this review article, we cope with the problem of time for one of the central quantities in quantum information theory: entropy. Recently, a replica trick formalism, the so-called ‘multiple parallel worlds’ formalism, has been proposed that revolutionizes entropy evaluation for quantum systems. This formalism is one of the first attempts to introduce ‘time’ into quantum information theory. With the total entropy being conserved in a closed system, entropy can flow internally between subsystems; however, we show that this flow is not limited to physical correlations, as the literature suggests. The nonlinear dependence of entropy on the density matrix introduces new types of correlations with no analogue in physical quantities. Evolving a number of replicas simultaneously makes it possible for particles to be exchanged between different replicas. We summarize some of the recent results on entropy in example quantum devices. Moreover, we take a quick look at a recently proposed correspondence that provides an interesting link between quantum information theory and quantum physics. The mere existence of such a correspondence allows for exploring new physical phenomena as the result of controlling entanglement in a quantum device.

1. Introduction

Entropy is one of the central quantities in thermodynamics and, without its precise evaluation, one cannot predict what new phenomena are to be expected in the thermodynamics of a device. In quantum theory, entropy is defined as a nonlinear function of the density matrix, i.e., $S = -\mathrm{Tr}\,\hat{\rho}\ln\hat{\rho}$, in units of the Boltzmann constant $k_B$. The mere nonlinearity indicates that entropy is not physically observable because, by definition, observables are linear in the density matrix. Let us further describe this statement. Here, we do not assume that the density matrix is a physical quantity. The reason is that evaluating all components of a many-body density matrix requires many repetitions of the same experiment with the same initial state. Not only is this difficult, but the fact that measurement changes quantum states also prevents exact evaluation. A physical quantity, such as energy or charge, can be measured in the lab in real time and can be defined in quantum theory to depend linearly on the density matrix. This is not true for entropy, and therefore we cannot assume it is a physical quantity directly measurable in the lab.
In fact, the precise time evolution of entropy is still an open problem and has not been properly addressed in the literature [1,2,3]. A consistent theory of quantum thermodynamics can only be achieved after finding nontrivial relations between quantum information and physics. In recent years, exquisite mesoscopic-scale control over quantum states has led technology into the quantum realm. This has motivated exploring new phenomena such as exponential speed-up in computation as well as power extraction from quantum coherence [4,5,6,7,8]. Recently, there have been attempts to implement quantum versions of heat engines using superconducting qubits [9]. However, recent developments in realizing quantum heat engines, such as in References [10,11,12], rely on semiclassical stochastic entropy production after discretizing energy. A long-standing question is how the superposition of states transfers heat and how much entropy is produced as the result of such a transfer.
A quantum heat engine (QHE) is a system with several discrete quantum states that, similar to a common heat engine, is connected to several environments kept at different temperatures. In fact, a number of large heat baths in these engines share some degrees of freedom quantum mechanically. Such a system is supposed to transfer heat according to the laws of quantum mechanics. The motivation for research on QHEs originates from the differences they may controllably make in efficiency and output power. Let us consider the example of two heat baths A and B, both coupled through a quantum system q that contains discrete energies and allows for the superposition of states with long coherence time. Let us clarify that, in this paper, we study the flow of thermodynamic Renyi and von Neumann entropies between the heat baths and the quantum system q. Therefore, other entropies are beyond the scope of this paper. This quantum system coupled to the two large heat baths is in fact a physical quantum system that is energetically coupled to the reservoirs and allows for a stationary flow of heat as well as a flow of thermodynamic entropy from one reservoir to another. We will see in the next section that, similar to physical quantities such as energy and charge, the total entropy of a closed system is a conserved quantity and does not change in time. However, internally, entropy can flow from one subsystem to another. Therefore, sub-entropies may change in time, and this change may indicate a change in the energy transfer. An important question one may ask is: does quantum superposition change entropy? This is one of the questions that we will address in this almost pedagogical review paper, and we will furthermore describe how the information content in entropy can be meaningful in physics.
In a typical engine made of reservoirs A, B and an intermediate quantum system q with discrete energy levels, the change of entropy in one of the reservoirs, say B, between the times 0 and t is $S_B(t) - S_B(0) = -\mathrm{Tr}\left[\rho(t)\ln\rho_B(t)\right] + \mathrm{Tr}_B\left[\rho_B^{\mathrm{eq}}\ln\rho_B^{\mathrm{eq}}\right]$, where in the first term we have safely replaced one of the two partial density matrices with the total density matrix, and accordingly replaced the partial trace with the total one. The conservation of entropy tells us that the total entropy maintains its initial value at the separable compound state $\rho(0) = \rho_q(0)\otimes\rho_A^{\mathrm{eq}}\otimes\rho_B^{\mathrm{eq}}$, i.e., $\mathrm{Tr}\left[\rho(t)\ln\rho(t)\right] = \mathrm{Tr}_q\left[\rho_q(0)\ln\rho_q(0)\right] + \sum_{i=A,B}\mathrm{Tr}_i\left[\rho_i^{\mathrm{eq}}\ln\rho_i^{\mathrm{eq}}\right]$. After a few lines of algebra, one can find that the change of entropy at the reservoir is $S_B(t) - S_B(0) = S\!\left(\rho(t)\,||\,\rho_A^{\mathrm{eq}}\otimes\rho_B(t)\otimes\rho_q(0)\right) + \sum_{i=q,A}\mathrm{Tr}_i\left[\left(\rho_i(t)-\rho_i(0)\right)\ln\rho_i(0)\right]$, with $S(\rho\,||\,\rho') \equiv \mathrm{Tr}\,\rho\ln\rho - \mathrm{Tr}\,\rho\ln\rho'$ being the relative entropy. Since relative entropy is a positive number [13] and equals zero only for identical density matrices $\rho = \rho'$, the first part of the entropy flow is positive and irreversible. This satisfies the classical laws of thermodynamics. We will show that, in contrast to what has been so far presented in the literature [14], the second term in the entropy flow is not the heat transfer, i.e., the average change of energy between the two times, $Q_B \equiv \langle H_B\rangle_0 - \langle H_B\rangle_t$. Instead, it is the difference of incoherent and coherent heat transfers [15], i.e., $\left[Q_{B,\mathrm{incoh}}(t) - Q_{B,\mathrm{coh}}(t)\right] - \left[Q_{B,\mathrm{incoh}}(0) - Q_{B,\mathrm{coh}}(0)\right]$. This is the new result that heavily modifies the flow of entropy in some quantum heat engines and leads to some recent new physics [16,17,18,19].
In this review paper, we look at some of the simplest and most important quantum heat engines. Depending on the external drive or internal degeneracy, the exact evaluation of entropy is indeed very different from what has been presented in the literature so far. We will describe how to precisely evaluate entropy and its flow by using a replica trick that properly accounts for the mathematically involved nonlinearity. We introduce a new class of correlations that allow information transfer and are different from physical correlations. For equilibrium systems, these informational correlations satisfy a generalized form of the Kubo–Martin–Schwinger (KMS) relation [20,21]. This part of the analysis will be presented in a self-contained fashion after reviewing some of the classical and quantum definitions of entropy and introducing our replica trick for evaluating the time evolution on generalized Keldysh contours. We describe a short protocol for evaluating Keldysh diagrams and, in some examples, perform the evaluation of a number of diagrams. We present results for example quantum devices such as a two-level quantum heat engine, a photocell, as well as a resonator, each one mediating heat transfer between two large heat baths. Finally, we briefly report on the new correspondence that makes entropy flow directly measurable in the lab by monitoring physical quantities, i.e., the statistics of energy transfer.

2. Classical Systems

2.1. Classical Entropy

Many systems in classical physics carry entropy. Some of the most studied systems are: charge transport at a point contact [22,23], energy transport in heat engines [24], and a gravitational hypersurface falling into a black hole [25,26,27,28]. For simplicity of the discussion, let us review classical entropy by means of the example of charge transport through a point contact. Consider for this purpose two large conductor plates connected at a point, the so-called ‘point contact system’. This classical point contact either transmits a charged particle with probability $p$ or blocks the transmission with probability $1-p$. Let us consider that $N$ attempts take place. For $N \gg 1$ it is most likely that, in $pN$ out of $N$ times, the particles are successfully transferred and, in $(1-p)N$ out of $N$ times, they are not. For unmarked particles, the order of events does not matter; therefore, the number of possibilities with $pN$ transfers out of $N$ attempts is

$$\mathcal{N} = \binom{N}{pN} \approx \frac{N^N}{(pN)^{pN}\left[(1-p)N\right]^{(1-p)N}} = \left(\frac{1}{p}\right)^{pN}\left(\frac{1}{1-p}\right)^{(1-p)N}.$$

This number rapidly grows with $N$. In order to keep the number small, we take its logarithm. This defines the so-called Shannon entropy, i.e., $S_{\mathrm{Shannon}} = \log_2\mathcal{N} = -N\left[p\log_2 p + (1-p)\log_2(1-p)\right]$.
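This counting can be checked numerically. Below is a minimal sketch (our own illustration; the values of `N` and `p` are arbitrary choices) comparing the exact logarithm of the binomial count with the Stirling-approximation result above:

```python
import math

def shannon_bits(p: float) -> float:
    """Binary Shannon entropy -p*log2(p) - (1-p)*log2(1-p), in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def log2_microstates(N: int, p: float) -> float:
    """Exact log2 of the binomial count C(N, pN), via log-gamma (no overflow)."""
    k = round(p * N)
    return (math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)) / math.log(2)

N, p = 100_000, 0.3
exact = log2_microstates(N, p)   # log2 of the number of orderings
stirling = N * shannon_bits(p)   # the Stirling-approximation result
print(exact, stirling)           # agree up to sub-extensive O(log N) corrections
```

The two numbers differ only by terms of order $\log N$, which is negligible next to the extensive $N$-linear part.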
The linear dependence of the Shannon entropy on the number of attempts $N$ indicates its additivity. The definition of entropy can be generalized to account for extended geometries, such as a $(k+1)$-terminal junction that connects any reservoir to $k$ others. In this case, $k$ probabilities contribute to the possibility of transmission from a reservoir to any one of the other $k$ reservoirs; thus, the entropy is generalized to $S_{\mathrm{Shannon}} = -N\sum_{n=1}^{k} p_n\log_2 p_n$. This entropy may vary in time. One possible reason for such variation could be time-dependent probabilities $p_n(t)$. Another possibility for time evolution of entropy could be the presence of some bias in controlling the system. For example, consider that, after one successful transfer, the transmission is reduced or closed for a rather long time before it opens again for another transfer attempt. The entropy of such a system depends on whether or not a successful transfer has taken place in the past.
In fact, in this paper, what we call entropy production refers to the time variation of the partial entropy associated with a part of a closed system. Moreover, as stated in the Introduction, in this paper we are only interested in the time variation in thermodynamic systems such as heat baths; therefore, our focus is only on thermodynamic entropies and their time evolution, namely ‘entropy production’. Although we discuss the Shannon entropy $S_{\mathrm{Shannon}}$ in this section, we have to distinguish between the Shannon entropy, which can be measured as a number of bits, and the rest of the paper, in which we study the von Neumann thermodynamic entropy measured in units of joules per kelvin. The Shannon entropy and the thermodynamic entropy are related by the Boltzmann constant $k_B$, i.e., $S_{\mathrm{Thermodynamic}} = k_B\,S_{\mathrm{Shannon}}$. Without loss of generality, we use the convention $k_B = 1$, although the reader should keep in mind that, in this paper, we are interested in finding changes in thermodynamic entropy flow as the result of energy exchange processes.

2.2. Renyi Entropy

Alfred Renyi introduced a generalization of the Shannon entropy that maintains the additivity property [29]. For a finite set of $k$ probabilities $p_i$ with $i = 1, \ldots, k$, the Renyi entropy of degree $M$ is defined as

$$\mathbb{S}_M = \frac{1}{1-M}\log\sum_i p_i^M,$$

with positive entropy order $M > 0$. The symbol $\mathbb{S}_M$ indicates that this is the original definition of the Renyi entropy, to make it distinct from the simplified definition $S_M$ we use in this paper. The prefactor $1/(1-M)$ in Equation (2) has certain advantages. One of the advantages is that it helps to compactify the definition of some other entropies using Equation (2); i.e., the analytical continuation of the Renyi entropy in the limit of $M$ approaching 1 (∞) defines the Shannon (min) entropy. Another advantage of the prefactor is that it allows for an interpretation of the quantity as a number of bits (thanks to one of the referees for pointing out these remarks).
Here, we present a simplified version of the definition. The logic behind this simplification is that the calculation in these limits requires L'Hôpital's rule; e.g., $S_{\mathrm{Shannon}} = \lim_{M\to1}\mathbb{S}_M = -\lim_{M\to1}\, d\big(\log\sum_i p_i^M\big)/dM$. We define a rescaled Renyi entropy, which differs from the original definition by a prefactor $1/(M-1)$:

$$S_M = -\log\sum_i p_i^M.$$
The reason to define the simplified formula is that evaluating entropy itself is beyond the scope of this paper. Instead, we need to find the time derivative of the entropy (i.e., the entropy flow). Due to the presence of a logarithm in Equation (3), any constant prefactor in the definition of entropy cancels between the numerator and denominator of the entropy flow. The only caveat is that we must keep in mind that the Shannon entropy is reproduced by taking $dS_M/dM$ in the limit $M\to1$. In fact, given that $dx^M/dM = d\,e^{M\ln x}/dM = x^M\ln x$, one can write

$$\lim_{M\to1}\frac{dS_M}{dM} = -\lim_{M\to1}\frac{\sum_i p_i^M\ln p_i}{\sum_i p_i^M} = -\sum_i p_i\ln p_i = S_{\mathrm{Shannon}}.$$
In the rest of the paper, we use the simplified definition. However, given that the difference between the two definitions is marginal, only a constant factor, the reader may decide to use either definition, subject to the discussion above.
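The limit above can be checked numerically with a finite-difference derivative; a quick illustrative sketch (the probability list is an arbitrary choice):

```python
import math

def renyi_simplified(probs, M):
    """Rescaled Renyi entropy S_M = -log(sum_i p_i^M), in nats."""
    return -math.log(sum(p ** M for p in probs))

def shannon_nats(probs):
    return -sum(p * math.log(p) for p in probs)

probs = [0.5, 0.3, 0.2]
h = 1e-6  # numerical derivative step around M = 1
dSdM = (renyi_simplified(probs, 1 + h) - renyi_simplified(probs, 1 - h)) / (2 * h)
print(dSdM, shannon_nats(probs))  # the two agree: dS_M/dM at M=1 is the Shannon entropy
```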
In a point contact, given that the Renyi entropy is additive for independent attempts, the total Renyi entropy after $N$ uncorrelated attempts will be $S_M = -N\log\left[p^M + (1-p)^M\right]$. In a classical heat reservoir, the Renyi entropy is closely related to the free energy. Consider a bath at temperature $T$ with a large number of energy states $\epsilon_i$. The corresponding Gibbs probabilities are $p_i = \exp(-\epsilon_i/T)/Z(T)$, with $Z(T) \equiv \sum_i \exp(-\epsilon_i/T)$ being the corresponding partition function. The Renyi entropy of the heat bath is $S_M = -\ln\sum_i\exp(-M\epsilon_i/T) + M\ln Z(T)$. The free energy is $F(T) = -T\ln Z(T)$, which is related to the Renyi entropy as $S_M = (M/T)\left[F(T/M) - F(T)\right]$, i.e., proportional to the free energy difference at temperatures $T/M$ and $T$.
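The free-energy relation can be verified numerically. The sketch below uses a made-up bath spectrum (all values are arbitrary illustrative choices, with $k_B = 1$):

```python
import math

energies = [0.0, 0.4, 1.1, 1.9, 3.2, 4.5]  # made-up bath spectrum, k_B = 1

def logZ(T):
    """Log of the partition function Z(T) = sum_i exp(-e_i / T)."""
    return math.log(sum(math.exp(-e / T) for e in energies))

def free_energy(T):
    return -T * logZ(T)

def renyi_bath(T, M):
    """S_M = -log sum_i p_i^M for Gibbs weights p_i = exp(-e_i/T)/Z(T)."""
    return -math.log(sum(math.exp(-M * (e / T + logZ(T))) for e in energies))

T, M = 2.0, 3.0
lhs = renyi_bath(T, M)
rhs = (M / T) * (free_energy(T / M) - free_energy(T))
print(lhs, rhs)  # equal: the bath Renyi entropy is a free-energy difference
```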

3. Quantum

3.1. Von Neumann and Renyi Entropy

Let us now consider that a large system A with many degrees of freedom interacts with a small quantum system q. This can be thought of as the two sharing some degrees of freedom, via which they exchange some energy. Quantumness indicates that q carries a discrete energy spectrum and can be found in a superposition of energy levels. Let $\rho$ be the density matrix of the compound system. The partial density matrix of A is defined by tracing out the system q from $\rho$, i.e., $\rho_A = \mathrm{Tr}_q\,\rho$. The von Neumann entropy for system A, in units of the Boltzmann constant, is defined as
$$S_A = -\mathrm{Tr}_A\left[\rho_A\ln\rho_A\right]$$
and the generalization of entropy in quantum theory will naturally give rise to defining the following quantum Renyi entropy for system A:
$$S_M^A = -\ln\mathrm{Tr}_A\,\rho_A^M.$$
The density matrix of the isolated compound system evolves between the times $t'$ and $t > t'$ via a unitary transformation that depends on the time difference, $U(t-t')$. Therefore, one can evaluate $\mathrm{Tr}_A\,\rho_A^M$ using the unitary transformation to trace it back to the time $t'$; i.e.,

$$\mathrm{Tr}\,\rho(t)^M = \mathrm{Tr}\left[U(t-t')\,\rho(t')\,U^\dagger(t-t')\right]^M = \mathrm{Tr}\left[U(t-t')\,\rho(t')^M\,U^\dagger(t-t')\right] = \mathrm{Tr}\,\rho(t')^M.$$
After taking the logarithm of both sides, one finds that the Renyi entropy remains unchanged between the two times $t$ and $t'$. In other words, in a closed system, similar to energy and charge, the Renyi entropy is a conserved quantity:

$$\frac{dS_M}{dt} = 0.$$
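This conservation law can be illustrated numerically: conjugating an arbitrary mixed state by any unitary leaves $-\ln\mathrm{Tr}\,\rho^M$ unchanged. A minimal sketch (the dimension, order $M$, and random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_density_matrix(d):
    """A random mixed state rho = A A^dag / Tr(A A^dag)."""
    A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = A @ A.conj().T
    return rho / np.trace(rho).real

def random_unitary(d):
    """A random unitary from the QR decomposition of a Gaussian matrix."""
    Q, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
    return Q

def renyi(rho, M):
    """Rescaled Renyi entropy S_M = -ln Tr rho^M."""
    return -np.log(np.trace(np.linalg.matrix_power(rho, M)).real)

d, M = 6, 3
rho0 = random_density_matrix(d)
U = random_unitary(d)              # stands in for exp(-i H (t - t'))
rho_t = U @ rho0 @ U.conj().T
print(renyi(rho0, M), renyi(rho_t, M))  # identical: dS_M/dt = 0 for a closed system
```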
Let us consider for now that there is no interaction between A and q. One can expect naturally that partial entropies are conserved as the result of no interaction because each subsystem can evolve with an independent unitary operator:
$$\frac{dS_M^A}{dt} = \frac{dS_M^q}{dt} = 0.$$
Interesting physical systems interact. Therefore, let us now consider that A and q interact. Consider that the total Hamiltonian is $H = H_A + H_q + H_{Aq}$. For interacting systems, there is an important difference between conserved physical and information quantities. For physical quantities, the conservation holds in the whole system as well as in each subsystem. As far as Renyi entropies are concerned, there is a conservation law for the total Renyi entropy $S_M(A+q)$; however, this quantity is only approximately equal to the sum $S_M(A) + S_M(q)$, up to terms proportional to the volume of the system. Therefore, no exact conservation law can be expected for the extensive sum $S_M(A) + S_M(q)$ [30]. The reason is that, although the evolution of the entire system is governed by a unitary operator, each subsystem evolves non-unitarily. In the limit of weak coupling, $\|H_{Aq}\|/\|H_A + H_q\| \ll 1$, the entropy of the entire system can only be approximated by the sum of the two partial entropies; thus, the sum of partial entropies can only approximately satisfy a conservation law, i.e., $dS_M^A/dt + dS_M^q/dt \approx 0$. Outside the validity of the weak coupling approximation, we must expect that, although the total entropy is conserved, the interacting parts have entropy flows different from each other:

$$\frac{dS_M^A}{dt} \neq -\frac{dS_M^q}{dt}.$$
This makes the conservation of Renyi entropy different from the conservation of physical quantities. The root for the difference is in fact in the nonlinear dependence on the density matrix, namely ‘non-observability’ of entropy [31].
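To make the inequality of partial entropy flows concrete, here is a small numerical sketch with a toy model of our own (two qubits standing in for A and q, with an assumed transverse coupling; all parameters are arbitrary). The total Renyi entropy stays constant under the joint unitary evolution, while the entropy gained by one subsystem is not the negative of that lost by the other:

```python
import numpy as np

# Toy Hamiltonian: qubit A + qubit q with a transverse coupling (assumed model)
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
H = np.kron(sz, I2) + 0.7 * np.kron(I2, sz) + 0.8 * np.kron(sx, sx)

def evolve(rho, H, t):
    """Unitary evolution exp(-iHt) rho exp(+iHt), hbar = 1."""
    lam, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * lam * t)) @ V.conj().T
    return U @ rho.astype(complex) @ U.conj().T

def partial_trace(rho, keep):
    """Reduced density matrix of subsystem 0 (A) or 1 (q) for two qubits."""
    r = rho.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

def renyi(rho, M):
    return -np.log(np.trace(np.linalg.matrix_power(rho, M)).real)

rho_A, rho_q = np.diag([0.8, 0.2]), np.diag([0.6, 0.4])  # separable mixed start
rho0 = np.kron(rho_A, rho_q)
M, t = 2, 1.0
rho_t = evolve(rho0, H, t)
dS_tot = renyi(rho_t, M) - renyi(rho0, M)
dS_A = renyi(partial_trace(rho_t, 0), M) - renyi(rho_A, M)
dS_q = renyi(partial_trace(rho_t, 1), M) - renyi(rho_q, M)
print(dS_tot)       # ~0: total Renyi entropy is conserved
print(dS_A, -dS_q)  # unequal: entropy does not flow like a conserved charge
```

Here the interaction generates correlations between the subsystems, so both partial entropies can change without adding up to zero.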

3.2. Replica Trick

Calculating the full reduced density matrix for a general system is the subject of active research. Here, we use a different method that is reminiscent of the ‘replica trick’ in disordered systems. The trick was introduced in the context of quantum field theory by Wilczek [32] and Cardy [33] and later in the context of quantum transport by Nazarov [31]. The key point is that, if we can evaluate $\mathrm{Tr}\,\rho^M$ for any $M$ near 1, we are able to evaluate the von Neumann entropy using the following relation:
$$S_A = \lim_{M\to1}\frac{d}{dM}S_M^A = -\lim_{M\to1}\frac{d}{dM}\mathrm{Tr}_A\,\rho_A^M.$$
One can see that there is no need to take the logarithm of $\mathrm{Tr}_A\,\rho_A^M$. This is only a mathematical simplification in the vicinity of $M\to1$, i.e., when we want to reproduce the von Neumann entropy by analytically continuing the derivative of the Renyi entropy. Otherwise, the presence of the logarithm is essential for the definition of the Renyi entropy. It might be useful to further comment that the Renyi entropy without the logarithm has many names, such as the Tsallis entropy or power entropy, etc. However, the presence of the logarithm is necessary for what we call the Renyi entropy. Otherwise, we would have $\lim_{M\to1}\mathrm{Tr}\,\rho^M = 1$, which, in this important limit, cannot be a true measure of information.
However, calculating $\mathrm{Tr}_A\,\rho_A^M$ directly for a real or complex number $M$ is a hopeless task. The ‘replica trick’ does the following: compute $\mathrm{Tr}_A\,\rho_A^M$ only for integer $M$, and then analytically continue it to a general real or even complex number.
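In the spirit of the replica trick, the following toy sketch (our own illustration, not the formalism of [31]) shows that integer-$M$ data alone can fix the von Neumann entropy: for a $d$-level system, the power traces $\mathrm{Tr}\,\rho^M$ at $M = 1, \ldots, d$ determine the spectrum through Newton's identities, and hence $S = -\mathrm{Tr}\,\rho\ln\rho$:

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
rho = Q @ np.diag([0.5, 0.3, 0.2]) @ Q.conj().T   # spectrum hidden in a random basis

# "Replica data": the power traces Tr rho^M at integer M = 1, 2, 3 only
s = [np.trace(np.linalg.matrix_power(rho, M)).real for M in (1, 2, 3)]

# Newton's identities convert the power sums into the characteristic
# polynomial x^3 - e1 x^2 + e2 x - e3, whose roots are the eigenvalues
e1 = s[0]
e2 = (e1 * s[0] - s[1]) / 2
e3 = (e2 * s[0] - e1 * s[1] + s[2]) / 3
lam = np.roots([1.0, -e1, e2, -e3]).real

S_replica = -sum(l * np.log(l) for l in lam)
S_direct = -sum(l * np.log(l) for l in np.linalg.eigvalsh(rho))
print(S_replica, S_direct)   # equal: integer-M data fixes the continuation
```

For finite-dimensional systems this reconstruction is exact; the analytic continuation of the formalism plays the analogous role when the spectrum is not finite.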

3.3. Time Evolution of Entropy

Let us mention that we limit our analysis here to weak coupling. In this regime, the dynamics of a quantum system are reversible and can be formulated in terms of the density matrix evolution. This time evolution depends on the time-dependent Hamiltonian $H(t) = H_A + H_B + H_{AB}$ as follows:

$$\frac{d\rho}{dt} = -\frac{i}{\hbar}\left[H(t), \rho(t)\right].$$
We transform the basis to the interaction frame by defining a unitary operator from the non-interacting part of the Hamiltonian, $U(t) = \exp\left[-i(H_A + H_B)t/\hbar\right]$. The density matrix transforms as $R(t) = U^\dagger(t)\,\rho(t)\,U(t)$, thereby not changing its entropy, neither in parts nor in total. In the new basis, Equation (11) becomes

$$\frac{dR}{dt} = -\frac{i}{\hbar}\left[U^\dagger(t)\,H_{AB}(t)\,U(t),\; R(t)\right].$$

Let us refer to the interaction Hamiltonian $H_{AB}$ in the new basis as $H_I$, i.e., $H_I(t) \equiv U^\dagger(t)\,H_{AB}(t)\,U(t)$. The solution to the time evolution Equation (12) can be written as
$$R(t) = R_0 + R_1(t) + O(2)$$

with

$$R_0 \equiv R(0) \quad \text{(noninteracting)},$$
$$R_1(t) \equiv -\frac{i}{\hbar}\int_0^t ds\left[H_I(s), R_0\right] \quad \text{(1st order)}.$$
This solution can (repeatedly) be inserted back into the right-hand side of Equation (12), iterating the interaction to the desired order:

$$\frac{dR(t)}{dt} = \Delta_1 + \Delta_2 + O(3)$$

with

$$\Delta_1 \equiv -\frac{i}{\hbar}\left[H_I(t), R_0\right] \quad \text{(1st order)},$$
$$\Delta_2 \equiv -\frac{1}{\hbar^2}\int_0^t ds\left[H_I(t), \left[H_I(s), R_0\right]\right] \quad \text{(2nd order)}.$$
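As a sanity check of this expansion, here is a small numerical sketch (our own illustration; for simplicity it assumes an interaction Hamiltonian that is constant in time, so the $s$-integral in $\Delta_2$ reduces to a factor of $t$, and sets $\hbar = 1$). Halving the coupling strength reduces the residual between the exact $dR/dt$ and $\Delta_1 + \Delta_2$ by a factor of about $2^3 = 8$, confirming that the neglected terms are third order in the interaction:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 4
K = rng.normal(size=(d, d)); K = (K + K.T) / 2
K /= np.linalg.norm(K, 2)                  # Hermitian 'shape' of H_I, unit norm
A = rng.normal(size=(d, d)); R0 = A @ A.T; R0 /= np.trace(R0)

def comm(X, Y):
    return X @ Y - Y @ X

def dR_exact(g, t):
    """Exact dR/dt = -i[H_I, R(t)] for a constant H_I = g K (hbar = 1)."""
    lam, V = np.linalg.eigh(g * K)
    U = V @ np.diag(np.exp(-1j * lam * t)) @ V.conj().T
    Rt = U @ R0.astype(complex) @ U.conj().T
    return -1j * comm(g * K, Rt)

def dR_perturbative(g, t):
    """Delta_1 + Delta_2 for a time-independent H_I (the s-integral gives t)."""
    HI = g * K
    return -1j * comm(HI, R0) - t * comm(HI, comm(HI, R0))

t = 0.7
errs = [np.linalg.norm(dR_exact(g, t) - dR_perturbative(g, t)) for g in (0.1, 0.05)]
print(errs[0] / errs[1])   # ~8: the neglected O(3) terms scale as g^3
```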
In order to find the time evolution of the Renyi and von Neumann entropies, we first notice that the unitary transformation $U(t)$, defining the basis change, also transforms any power of the density matrix, i.e.,

$$R(t)^M = U^\dagger(t)\,\rho(t)^M\,U(t).$$
Now, all we need to do is to generalize the evolution of the density matrix to the powers of the density matrix $R(t)^M$. We follow the terminology of Nazarov in [31] and name each copy of the replica $R(t)$ in the matrix $(R(t))^M$ a ‘world’; thus, $R(t)^M$ is the generalized density matrix of $M$ worlds:
$$\frac{d}{dt}R(t)^M = \frac{dR(t)}{dt}\,R(t)^{M-1} + R(t)\,\frac{dR(t)}{dt}\,R(t)^{M-2} + \cdots + R(t)^{M-2}\,\frac{dR(t)}{dt}\,R(t) + R(t)^{M-1}\,\frac{dR(t)}{dt}.$$
By substituting the solutions of Equations (13) to (18), and limiting the result to second order, we find the following time evolution of the M-world density matrix:

$$\frac{d}{dt}R(t)^M = \left[\Delta_2\,R_0^{M-1} + R_0\,\Delta_2\,R_0^{M-2} + \cdots + R_0^{M-1}\,\Delta_2\right]$$
$$+\;\Delta_1\left[R_1\,R_0^{M-2} + R_0\,R_1\,R_0^{M-3} + \cdots + R_0^{M-2}\,R_1\right]$$
$$+\;R_0\,\Delta_1\left[R_1\,R_0^{M-3} + R_0\,R_1\,R_0^{M-4} + \cdots + R_0^{M-3}\,R_1\right]$$
$$+\;R_0^2\,\Delta_1\left[R_1\,R_0^{M-4} + R_0\,R_1\,R_0^{M-5} + \cdots + R_0^{M-4}\,R_1\right]$$
$$+\;\cdots\;+\;\left[R_1\,R_0^{M-2} + R_0\,R_1\,R_0^{M-3} + \cdots + R_0^{M-2}\,R_1\right]\Delta_1.$$
This is how the M-world density matrix evolves in time. The first line in Equation (20) denotes the case where the 2nd order perturbation takes place in one world while the $M-1$ remaining worlds are left non-interacting. The remaining lines have in common that they do not contain a 2nd order term occurring in a single replica. Instead, these terms contain two 1st order interactions, each acting in a single replica, which together combine to give a 2nd order perturbation term. These new terms have recently been found [34].
If one considers higher perturbative orders, say up to the $k$-th order with $k \leq M$, there will be terms like $R_0^{M-1}\,\Delta^{(k)}$ in the expansions, which have $k$ interactions taking place in one replica, leaving $M-1$ replicas noninteracting, as well as terms in which $k$ first-order contributions, distributed over different replicas, combine to give a $k$-th order interaction term. In the case $k > M$, some of the lowest-order interactions will obviously be excluded from the summations.
Let us show the time evolution pictorially using the following diagrams, in which the evolution of $R(t)^M$ is shown by $M$ parallel lines, each one denoting the time evolution of one world, starting in the past at the bottom and arriving at the present time at the top. In the following diagrams, we show five time-slices by horizontal dashed lines. Blue dots denote the interaction $H_I(t)$, and our diagrams are limited to the 2nd order only. Curly photon-like lines connect two interactions and represent the correlation function.
The first line of Equation (20) contains all terms that have two interactions in a single world. These two interactions within the same world are called ‘self-replica interactions’. They can be illustrated pictorially by the following diagrams in Figure 1 from left to right:
The following diagram in Figure 2 illustrates the typical term $R_0^2\,\Delta_1\,R_0\,R_1\,R_0^{M-5}$ from Equation (20) and pictorially shows the contribution of two first order interactions in two different worlds that together evolve the generalized density matrix of $M$ worlds in the second order.
A typical higher order diagram limited to two-correlation interactions can diagrammatically be shown as below in Figure 3.

3.4. Extended Keldysh Diagrams

In all the above diagrams, quantum states have been represented as labels on the contours. By definition, we know that the density matrix contains both ket and bra states. The second order interactions can, in fact, only take place either between two kets, two bras, or between a ket and a bra. This internal degree of freedom makes it necessary to add more details to our diagrams and represent each replica with the well-known Keldysh contour diagrams [35]. The Keldysh technique permits a natural formulation of the density matrix dynamics in terms of path integrals, which is a generalization of the Feynman–Vernon formalism.
Considering that the time evolution of a quantum system is generated by the Hamiltonian $H$, kets evolve as $|\psi(t)\rangle = \exp(-iHt/\hbar)\,|\psi(0)\rangle$ and bras evolve with the opposite phase: $\langle\psi(t)| = \langle\psi(0)|\exp(iHt/\hbar)$. Based on this simple observation, bras (kets) evolve in the opposite (same) direction of time along the Keldysh contour.
The evolution of the density matrix R from the initial time to the present time can diagrammatically be represented in the following way: one can start at a bra at the present time, move down along the contour to the initial time, pass there through the initial density matrix thereby changing from a bra to a ket, and finally move upwards to end with a ket at the present time. Taking a trace from the density matrix can be shown diagrammatically by closing the contours at the present time: i.e., we connect the present ket to the present bra. It is of course awkward to do this for the total density matrix, as this will simply yield one at any time; however, taking a trace is meaningful for multiple interacting subsystems.
The two subsystems A and B each require a contour, resulting in a double contour. We assume separability of A and B at the initial time: $R(0) = R_A(0)\otimes R_B(0)$. Interaction results in energy exchange, which we represent by a cross between the two contours, somewhere between the initial and present times, i.e., $0 < t' < t$. If we are interested in the evolution of one of the subsystems, say B, the partial trace over A should be taken, which in the diagram can be done by connecting the present bra and ket of system A; see the right diagram in Figure 4. Further details about this Keldysh representation of quantum dynamics can be found in [16].
In order to evaluate the time evolution of the von Neumann and Renyi entropies, we need extended Keldysh contours in multiple parallel worlds (replicas). For this purpose, we consider multiple copies of the Keldysh diagram, one for each world, and add the initial state of the density matrix in each world along the contour at the initial time. The overall trace will get the contours of different worlds connected.
In the second order, one can find:
$$\frac{d}{dt}S_M^B = -\frac{1}{\mathrm{Tr}_B\,(R_B^0)^M}\,\mathrm{Tr}_B\!\left[\Delta_2^B\,(R_B^0)^{M-1} + R_B^0\,\Delta_2^B\,(R_B^0)^{M-2} + \cdots + (R_B^0)^{M-1}\,\Delta_2^B\right]$$
$$-\frac{1}{\mathrm{Tr}_B\,(R_B^0)^M}\,\mathrm{Tr}_B\!\Big[\Delta_1^B\!\left(R_B^1\,(R_B^0)^{M-2} + \cdots + (R_B^0)^{M-2}\,R_B^1\right) + R_B^0\,\Delta_1^B\!\left(R_B^1\,(R_B^0)^{M-3} + \cdots + (R_B^0)^{M-3}\,R_B^1\right) + \cdots + \left(R_B^1\,(R_B^0)^{M-2} + \cdots + (R_B^0)^{M-2}\,R_B^1\right)\Delta_1^B\Big].$$
The first line contains terms with second-order interactions taking place in only one world. A typical such diagram for M = 3 has been shown in Figure 5.
The remaining lines in Equation (21) contain at most one first-order interaction per replica. The diagram in Figure 6 shows a typical such term.

3.5. Calculating the Diagrams

The main reason why the time evolution of entropy in Equation (21) has been diagrammatically represented is that, due to the multiplicity in time ordering interactions, these extended Keldysh diagrams can help to correctly determine all possible symmetries that may simplify the problem. We need to express all ‘single-world’ interactions that carry the highest order perturbation as well as all ‘cross-world’ terms with lower orders of perturbation.
We assume the interaction Hamiltonian does not implicitly depend on time through its parameters; instead, the time dependence is globally assigned in the rotating frame and state evolutions. The explicit formulation of quantum dynamics and keeping track of symmetries between different diagrams have resulted in the following rules for the evaluations of the diagrams:
  • With each system having its own contours in each world, label each separate segment of these contours according to the state of the associated bra or ket of that segment. The states of the bras and kets change after an interaction, at the initial time, and at the final time.
  • Starting from the present time in any of the worlds, say the leftmost world, and encompassing the contours, the following operators or changes must be added along the contour:
    (a)
    Every interaction on a ket contour contributes a factor $-\frac{i}{\hbar}H_I(t)$, and a factor $+\frac{i}{\hbar}H_I(t)$ on a bra contour.
    (b)
    After passing an interaction, the states must change. The new states remain the same until a new interaction is encountered, or if the initial time or the final time is reached.
    (c)
    A contour arriving at the initial time captures the initial density matrix in the interaction picture, $R(0)$.
  • In general, the result should be integrated over the individual interaction times, i.e., $\int_0^t dt_1\int_0^t dt_2$, subject to the time ordering between them. This can be simplified for a small quantum system coupled to a large reservoir kept at a fixed temperature. The reason is that the correlation function of absorption and decay of particles only depends on the time difference between the two interactions [36]. In this case, the double integral over $dt_1$ and $dt_2$ can be simplified to contain only a single integral over the time difference between the two interactions, i.e., $\int_0^\infty d\tau$.

3.6. Quantum Entropy Production

Let us consider two large heat reservoirs A and B, each containing many degrees of freedom and each kept at its own temperature, coupled to one another via only a few shared degrees of freedom. The Hamiltonian can be written as $H = H_A + H_B + H_{AB}$, with $H_{AB}$ representing the coupled degrees of freedom.
In order to compute the flow of a quantity between A and B, that quantity should be conserved in the combined system $A+B$. As we discussed in the first section of this paper, Renyi entropy is a conserved quantity in a closed system; therefore, $dS_M(A+B)/dt = 0$. However, one should notice that there is a difference between the conservation of physical quantities, such as energy, and the conservation of entropy. Because a physical quantity depends linearly on the density matrix, when it is conserved for a closed system it can internally flow from one subsystem to another such that its production in one subsystem is exactly equal to its removal from the other subsystem. Entropy is not like this. In fact, due to the nonlinear dependence of entropy on the density matrix, when it is conserved for a bipartite closed system, it is not equally added to and subtracted from the subsystems, as expressed by the non-equality in Equation (9).
Below, we present some example systems with rather general Hamiltonians and, using the diagram rules, evaluate all entropy production diagrams.

3.6.1. Example 1: Entropy in a Two-Level Quantum Heat Engine

In Ref. [15], we used the extended Keldysh technique and evaluated the entropy flow for the simplest quantum heat engine, in which a two-level system couples two heat baths kept at different temperatures, see Figure 7. After taking all physical and informational correlations into account, we found that the exact evaluation in the second order differs markedly from what physical correlations alone predict. Here, we reproduce the exact result through a pedagogical application of the diagram evaluation described above.
Let us consider two heat baths, kept at different temperatures, that interact weakly by exchanging the quantum of energy $\omega_0$. Such a quantum system can be thought of as a two-level system that couples the two heat baths through shared excitations and de-excitations. The Hilbert space of the two-level system contains the states $|0\rangle$ and $|1\rangle$. The free Hamiltonian contains the heat bath energy levels $E^A_\alpha$ and $E^B_\beta$ and the quantum system energies $E_n$ with $n = 0,1$, i.e., $H_0 = \sum_\alpha E^A_\alpha |\alpha\rangle\langle\alpha| + \sum_\beta E^B_\beta |\beta\rangle\langle\beta| + \sum_{n=0,1} E_n |n\rangle\langle n|$.
We assume a so-called ‘transversal’ interaction between A/B and the two-level system q. This means that they interact via exchanging the quantum of energy $\omega_0$. Of course, we could generalize the discussion to longitudinal interactions, in which no energy is exchanged; however, since such interactions are not of immediate interest for heat transfer in quantum heat devices, we ignore them.
We assume the interaction with the heat baths has the following general form: $H_{\rm int} = \sum_{n,m=0,1}^{(n\neq m)} |n\rangle\langle m|\left[\hat X^{(A)}_{nm}(\omega_0) + \hat X^{(B)}_{nm}(\omega_0)\right]$, subject to $E_m \neq E_n$ and with $\hat X_{nm}$ representing energy absorption/decay in the heat baths. The summation in $H_{\rm int}$ can be generalized to an arbitrary number of heat baths interacting through shared degrees of freedom.
Moreover, the entire system including the two-level system is externally driven. The classical heat baths are effectively not influenced by the driving field; however, the driving can pump energy into and out of the two-level system via the following Hamiltonian: $H_{dr} = \Omega\cos(\omega_{dr} t)\left(|0\rangle\langle 1| + |1\rangle\langle 0|\right)$.
For simplicity, we transform the Hamiltonian into the rotating frame associated with the driving frequency $\omega_{dr}$. In this frame, the excited and ground states are transformed as follows: $|1\rangle_R = e^{i\omega_{dr}t}|1\rangle$ and $|0\rangle_R = |0\rangle$. This corresponds to the unitary transformation $U_R = \exp\left(i\omega_{dr} t\, |1\rangle\langle 1|\right)$ acting on the Hamiltonian, i.e., $H_R = U_R H U_R^\dagger + i\,(\partial U_R/\partial t)\, U_R^\dagger$. A few lines of algebra result in the following Hamiltonian in the rotating frame:
$$
\begin{aligned}
H_R &\equiv H_0 + V_{qA} + V_{qB} + V_{AB} + V_{dr},\\
H_0 &= E_0\,|0\rangle\langle 0| + \left(E_1 - \omega_{dr}\right)|1\rangle\langle 1| + \textstyle\sum_\alpha E^A_\alpha\,|\alpha\rangle\langle\alpha| + \sum_\beta E^B_\beta\,|\beta\rangle\langle\beta|,\\
V_{qA} &= |0\rangle\langle 1|\,\hat X^{(A)}_{01}(t)\,e^{i\omega_{dr}t} + |1\rangle\langle 0|\,\hat X^{(A)}_{10}(t)\,e^{-i\omega_{dr}t} \equiv \textstyle\sum_{n,m=0,1}^{(n\neq m)} |n\rangle\langle m|\,\hat X^{(A)}_{nm}(t)\,e^{i\omega_{dr}\eta_{nm}t},\\
V_{qB} &= |0\rangle\langle 1|\,\hat X^{(B)}_{01}(t)\,e^{i\omega_{dr}t} + |1\rangle\langle 0|\,\hat X^{(B)}_{10}(t)\,e^{-i\omega_{dr}t} \equiv \textstyle\sum_{n,m=0,1}^{(n\neq m)} |n\rangle\langle m|\,\hat X^{(B)}_{nm}(t)\,e^{i\omega_{dr}\eta_{nm}t},\\
V_{AB} &= 0,\qquad V_{dr} = \frac{\Omega}{2}\left(|0\rangle\langle 1| + |1\rangle\langle 0|\right),
\end{aligned}
$$
with $\eta_{01} = -\eta_{10} = 1$ and $\eta_{00} = \eta_{11} = 0$. Given that there is no direct exchange of energy between A and B, the density matrix in the interaction picture can be represented as $R = R_{qA} R_B + R_A R_{qB}$; thus, the entropy flow in the heat bath B is determined by the quantum system and the heat bath B, although the heat bath A influences the quantum system indirectly. In general, $dR_B^M/dt = \mathrm{Tr}_q\left[dR_{qB}^M/dt\right]$. Let us recall that this quantity determines the flow of von Neumann entropy; using Equation (10), it can be simplified to $dS^{(B)}/dt = -\lim_{M\to 1}\frac{d}{dM}\,\mathrm{Tr}_B\mathrm{Tr}_q\!\left[(dR_{qB}/dt)\,R_{qB}^{M-1} + \cdots + R_{qB}^{M-1}\,(dR_{qB}/dt)\right]$. Each term in the sum is evaluated in the interaction picture using $dR/dt = -i[V,R]$. One can show that the external driving causes the density matrix to evolve as $dR_{nm}/dt|_{dr} = (i\Omega/2)\left[R_{n0}\delta_{m1} + R_{n1}\delta_{m0} - \delta_{n0}R_{1m} - \delta_{n1}R_{0m}\right]$.
The interaction Hamiltonian evolves the quantum states; below, we evaluate the entropy flow in the $M=3$ example to second order in perturbation theory. As discussed above, there are in general two types of diagrams in the second order: (1) ‘self-interacting’ diagrams, with the second-order interaction taking place in a single replica, and (2) cross-world-interacting terms, in which two different replicas each host one first-order interaction. The self-interacting diagrams for the two-level system are listed in Figure 8.
These diagrams correspond to the following flows, respectively:
$$
\begin{aligned}
(a):\;& -\int_0^\infty\! d\tau\, \mathrm{Tr}_B\Big[\textstyle\sum_{m,k=0,1}^{(m\neq k)} \hat X^{(B)}_{mk}(t)\,\hat X^{(B)}_{km}(t-\tau)\,\hat R_B\,\hat R_{mm}\, e^{i\omega_{dr}\eta_{km}\tau}\, e^{i\omega_{dr}(\eta_{mk}+\eta_{km})t}\, \hat R_B^{\,2}\Big]\Big/\mathrm{Tr}_B\,\hat R_B^{\,3},\\
(b):\;& +\int_0^\infty\! d\tau\, \mathrm{Tr}_B\Big[\textstyle\sum_{m,k=0,1}^{(m\neq k)} \hat X^{(B)}_{mk}(t-\tau)\,\hat R_B\,\hat R_{kk}\,\hat X^{(B)}_{km}(t)\, e^{i\omega_{dr}\eta_{mk}\tau}\, e^{i\omega_{dr}(\eta_{mk}+\eta_{km})t}\, \hat R_B^{\,2}\Big]\Big/\mathrm{Tr}_B\,\hat R_B^{\,3},\\
(c):\;& +\int_0^\infty\! d\tau\, \mathrm{Tr}_B\Big[\textstyle\sum_{m,k=0,1}^{(m\neq k)} \hat X^{(B)}_{mk}(t)\,\hat R_B\,\hat R_{kk}\,\hat X^{(B)}_{km}(t-\tau)\, e^{i\omega_{dr}\eta_{km}\tau}\, e^{i\omega_{dr}(\eta_{mk}+\eta_{km})t}\, \hat R_B^{\,2}\Big]\Big/\mathrm{Tr}_B\,\hat R_B^{\,3},\\
(d):\;& -\int_0^\infty\! d\tau\, \mathrm{Tr}_B\Big[\textstyle\sum_{m,k=0,1}^{(m\neq k)} \hat R_B\,\hat R_{mm}\,\hat X^{(B)}_{mk}(t-\tau)\,\hat X^{(B)}_{km}(t)\, e^{i\omega_{dr}\eta_{mk}\tau}\, e^{i\omega_{dr}(\eta_{mk}+\eta_{km})t}\, \hat R_B^{\,2}\Big]\Big/\mathrm{Tr}_B\,\hat R_B^{\,3}.
\end{aligned}
$$
In all these terms, the time-dependent factor $e^{i\omega_{dr}(\eta_{mk}+\eta_{km})t}$ is identically 1, because the relation $\eta_{mk} = -\eta_{km}$ always holds. We assume that the heat baths are large and at equilibrium; therefore, the correlation function is the same at all times $t$ and depends only on the time difference $\tau$ between the creation and annihilation of a photon. In the heat bath B, the equilibrium correlation is defined as $S^B_{mn,pq}(\tau) \equiv \mathrm{Tr}_B\!\left[\hat X^B_{mn}(0)\,\hat X^B_{pq}(\tau)\,R_B\right]$. Its Fourier transformation defines the frequency-dependent correlation $S^B_{mn,pq}(\omega) = \int d\tau\, \mathrm{Tr}_B\!\left[\hat X^B_{mn}(0)\,\hat X^B_{pq}(\tau)\,R_B\right] \exp(i\omega\tau)$. Therefore, in the case of $M=1$ (i.e., in the absence of the last factor $\hat R_B^2$), the diagrams (a)-(d) can be rewritten in terms of $S^B_{mn,pq}(\omega)$. For example, diagram (a) for $M=1$ simplifies to $-\sum_{m,k=0,1}^{(m\neq k)} \hat R_{mm} \int_0^\infty d\tau\, \mathrm{Tr}_B\!\left[\hat X^{(B)}_{mk}(0)\,\hat X^{(B)}_{km}(\tau)\,\hat R_B\right] e^{i\omega_{dr}\eta_{km}\tau}$, in which the integral runs over half of the domain of the Fourier transformation; it can therefore be shown to reduce to $-\sum_{m,k=0,1}^{(m\neq k)} \hat R_{mm}\left[\tfrac12 S^B_{mk,km}(\omega_{dr}\eta_{mk}) + i\,\Pi_{mk,km}(\omega_{dr}\eta_{mk})\right]$, with $\Pi_{mn,pq}(\omega) \equiv (1/2\pi)\,\mathrm{P}\!\!\int d\nu\, S^B_{mn,pq}(\nu)/(\omega-\nu)$. What remains to be determined is the frequency-dependent correlation function $S^B_{mn,pq}(\omega)$, which turns out to be completely characterized by the set of reduced frequency-dependent susceptibilities $\tilde\chi^B_{mn,pq}(\omega) \equiv \left[\chi^B_{mn,pq}(\omega) - \chi^B_{pq,mn}(\omega)\right]/i$, with the dynamical susceptibility of the environment being $\chi^B_{mn,pq}(\omega) \equiv i\int_0^\infty d\tau\,\mathrm{Tr}_B\!\left[\big[\hat X^B_{mn}(\tau), \hat X^B_{pq}(0)\big]\, R_B\right]\exp(i\omega\tau)$. The fluctuation-dissipation theorem provides a link between the equilibrium correlation and the reduced dynamical susceptibility in the classical thermal bath B at temperature $T_B$.
This relation is usually called the Kubo-Martin-Schwinger (KMS) relation: $S^B_{mn,pq}(\omega) = \bar n_B(\omega/T_B)\,\tilde\chi^B_{mn,pq}(\omega)$, with $\bar n_B(\omega/T_B) = 1/\left[\exp(\omega/k_B T_B) - 1\right]$ being the Bose distribution and $k_B$ the Boltzmann constant.
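At the heart of the KMS relation is detailed balance between emission and absorption. A quick numerical check (an illustrative sketch of ours, with $k_B = 1$) of the underlying Bose-function identity $\bar n_B + 1 = e^{\omega/T}\,\bar n_B$, which fixes the ratio of the two processes:

```python
import math

def n_bose(w, T):
    """Bose distribution n_B = 1/(exp(w/T) - 1), with k_B = 1."""
    return 1.0 / math.expm1(w / T)

w, T = 1.3, 0.7
lhs = n_bose(w, T) + 1                 # emission weight
rhs = math.exp(w / T) * n_bose(w, T)   # absorption weight times Boltzmann factor
print(abs(lhs - rhs) < 1e-12)
```

`math.expm1` is used instead of `exp(x) - 1` to keep the small-`w/T` limit numerically accurate.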
Generalized KMS
In the presence of replicas, generalized correlations are defined similarly. For the case in which there are M replicas in total and N replicas, with $0 \leq N \leq M$, are inserted between the creation and annihilation operators, the generalized correlation function is defined as
$$
S^{N,M\,B}_{mn,pq}(\tau) \equiv \frac{\mathrm{Tr}_B\left[\hat X^B_{mn}(0)\,\hat R_B^{\,N}\,\hat X^B_{pq}(\tau)\,\hat R_B^{\,M-N}\right]}{\mathrm{Tr}_B\,\hat R_B^{\,M}}.
$$
Similarly, one can show that
$$
\int_0^\infty d\tau\,\frac{\mathrm{Tr}_B\left[\hat X^{(B)}_{mn}(0)\,\hat R_B^{\,N}\,\hat X^{(B)}_{pq}(\tau)\,\hat R_B^{\,M-N}\right] e^{i\omega\tau}}{\mathrm{Tr}_B\,\hat R_B^{\,M}} = \frac{S^{N,M\,B}_{mn,pq}(\omega)}{2} + i\,\Pi^{N,M\,B}_{mn,pq}(\omega),
$$
with the definition $\Pi^{N,M\,B}_{mn,pq}(\omega) \equiv (1/2\pi)\,\mathrm{P}\!\!\int d\nu\, S^{N,M\,B}_{mn,pq}(\nu)/(\omega-\nu)$. One can also check from the definitions that, for any heat bath, the following identities hold: $S^{N,M}_{mn,pq}(\omega) = S^{M-N,M}_{pq,mn}(-\omega)$, $\Pi^{N,M}_{mn,pq}(\omega) = -\Pi^{M-N,M}_{pq,mn}(-\omega)$, and $\tilde\chi_{mn,pq}(\omega) = -\tilde\chi_{pq,mn}(-\omega)$.
The Fourier transformation of this generalized correlation defines the frequency-dependent generalized correlation and, following the same mathematics as above, one can show that, for an equilibrium thermal bath at temperature $T_B$, all correlation functions are determined through a generalized KMS relation:
$$
S^{N,M\,B}_{mn,pq}(\omega) = \bar n_B\!\left(\frac{\omega}{T_B}\right)\tilde\chi^B_{mn,pq}(\omega)\, e^{N\omega/k_B T_B}.
$$
Further details can be found in [34].
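The replica exponential $e^{N\omega/k_B T_B}$ can be checked directly in a minimal model. The sketch below (ours, not from the original; a single bosonic bath mode with $\hat X = \hat a$, a thermal state truncated to D levels, $k_B = 1$) verifies that inserting N copies of the density matrix between the two bath operators multiplies the equal-time correlator by exactly $e^{N\omega/T}$; the remaining ordinary correlator is evaluated with $\hat R_B^M$, i.e., at the reduced temperature $T/M$.

```python
import numpy as np

D = 40           # truncation of the bath-mode Hilbert space
w, T = 1.0, 0.8  # mode frequency and bath temperature (k_B = 1)
M, N = 3, 2      # total replicas M; replicas N between the operators

n = np.arange(D)
p = np.exp(-w * n / T)
p /= p.sum()                          # thermal occupations of R_B
a = np.diag(np.sqrt(n[1:]), k=1)      # annihilation operator, a|n> = sqrt(n)|n-1>

def rho_pow(k):
    """R_B^k is diagonal in the number basis, so powers act elementwise."""
    return np.diag(p ** k)

Z_M = np.trace(rho_pow(M))
# Generalized equal-time correlator: Tr[a^dag R^N a R^(M-N)] / Tr[R^M]
S_NM = np.trace(a.T @ rho_pow(N) @ a @ rho_pow(M - N)) / Z_M
# Ordinary (N = 0) correlator at the same M: Tr[a^dag a R^M] / Tr[R^M]
S_0M = np.trace(a.T @ a @ rho_pow(M)) / Z_M

print(np.isclose(S_NM, np.exp(N * w / T) * S_0M))  # replica factor e^{N w/T}
```

Note also that `S_0M` equals the Bose occupation at temperature $T/M$, in line with the rescaled temperature appearing in the correspondence of Section 4.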
Using these definitions as well as Equation (25), the sum of diagrams (a)–(d) in Figure 8 can be further simplified to
m , k = 0 , 1 ( m k ) R ^ m m 1 2 S k m , m k 3 , 3 B ω d r η m k + i Π k m , m k 3 , 3 B ω d r η m k 1 2 S m k , k m 0 , 3 B ω d r η k m + i Π m k , k m 0 , 3 B ω d r η k m , m , k = 0 , 1 ( m k ) R ^ k k + 1 2 S m k , k m 1 , 3 B ω d r η k m + i Π m k , k m 1 , 3 B ω d r η k m + 1 2 S k m , m k 2 , 3 B ω d r η m k + i Π k m , m k 2 , 3 B ω d r η m k , = m , k = 0 , 1 ( m k ) S m k , k m 0 , 3 B ω d r η k m R ^ m m + S m k , k m 1 , 3 B ω d r η m k R ^ k k .
In total, there are M terms similar to the last line in Equation (26), associated with similar diagrams in the M worlds. It is important to notice that these self-replica correlated terms are in fact determined only by physical correlations, and they reproduce the already known results for the flow of von Neumann entropy in the heat bath [37]. To see this in more detail, one can expand the summation and use the KMS relation and its generalized version in Equation (25). After generalizing the result to M replicas, taking the derivative with respect to M and analytically continuing the result to $M \to 1$, the incoherent part of the flow of von Neumann entropy is
$$
\left.\frac{dS^{(B)}}{dt}\right|_{\rm incoherent} = \frac{1}{T_B}\left[\Gamma^{(B)}_\uparrow\, p_0 - \Gamma^{(B)}_\downarrow\, p_1\right],
$$
with $\Gamma^{(B)}_\downarrow \equiv \tilde\chi\left[\bar n_B(\omega_{dr}/T_B) + 1\right]$ and $\Gamma^{(B)}_\uparrow \equiv \tilde\chi\,\bar n_B(\omega_{dr}/T_B)$, $\tilde\chi \equiv \tilde\chi_{10,01}$, and $p_n \equiv R_{nn}$. These terms involve only self-interacting replicas and are incomplete, as they ignore the following diagrams.
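The incoherent rates obey standard detailed balance. A quick numerical sanity check (a sketch of ours; the rate names and the ordering of the balance condition are our convention) that $\Gamma_\uparrow/\Gamma_\downarrow = e^{-\omega/T_B}$ and that the net incoherent exchange vanishes when the two-level populations are thermal at the bath temperature:

```python
import math

w, T, chi = 1.0, 0.5, 0.3      # illustrative numbers; chi plays the role of the susceptibility
nb = 1.0 / math.expm1(w / T)   # Bose occupation (k_B = 1)
G_down = chi * (nb + 1)        # emission into the bath
G_up = chi * nb                # absorption from the bath

# Detailed balance between the two rates
print(abs(G_up / G_down - math.exp(-w / T)) < 1e-12)

# Thermal populations of the two-level system at the bath temperature
Z = 1 + math.exp(-w / T)
p0, p1 = 1 / Z, math.exp(-w / T) / Z
# Net incoherent exchange vanishes in mutual equilibrium
print(abs(G_down * p1 - G_up * p0) < 1e-12)
```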
The new diagrams are the cross-world interactions. As discussed previously, cross-world diagrams cannot transfer physical quantities; they rely on the fact that entropy depends nonlinearly on the density matrix and is therefore not a physical observable. Some diagrams of this type are shown in Figure 9, for the case in which one interaction takes place in the leftmost replica and the second in the middle replica, leaving the third replica intact:
( e ) : 0 d τ T r B m , n , k , l X ^ m k ( B ) ( t ) R ^ B R ^ m k X ^ n l ( B ) t τ R ^ B R ^ n l e i ω d r η n l τ δ E n l , E k m R ^ B / T r B R ^ B 3 , ( f ) : 0 d τ T r B m , n , k , l X ^ m k ( B ) ( t τ ) R ^ B R ^ m k X ^ n l ( B ) t R ^ B R ^ n l e i ω d r η m k τ δ E m k , E l n R ^ B / T r B R ^ B 3 , ( g ) : 0 d τ T r B m , n , k , l X ^ m k ( B ) ( t ) R ^ B R ^ m k R ^ B R ^ l n X ^ l n ( B ) t τ e i ω d r η l n τ δ E l n , E k m R ^ B / T r B R ^ B 3 , ( h ) : 0 d τ T r B m , n , k , l X ^ m k ( B ) ( t τ ) R ^ B R ^ m k R ^ B R ^ l n X ^ l n ( B ) t e i ω d r η m k τ δ E m k , E n l R ^ B / T r B R ^ B 3 , ( i ) : 0 d τ T r B m , n , k , l R ^ B R ^ k m X ^ k m ( B ) ( t ) X ^ n l ( B ) t τ R ^ B R ^ n l e i ω d r η n l τ δ E n l , E m k R ^ B / T r B R ^ B 3 , ( j ) : 0 d τ T r B m , n , k , l R ^ B R ^ k m X ^ k m ( B ) ( t τ ) X ^ n l ( B ) t R ^ B R ^ n l e i ω d r η k m τ δ E k m , E l n R ^ B / T r B R ^ B 3 , ( k ) : 0 d τ T r B m , n , k , l R ^ B R ^ k m X ^ k m ( B ) ( t ) R ^ B R ^ l n X ^ n l ( B ) t τ e i ω d r η l n τ δ E l n , E m k R ^ B / T r B R ^ B 3 , ( l ) : 0 d τ T r B m , n , k , l R ^ B R ^ k m X ^ k m ( B ) ( t τ ) R ^ B R ^ l n X ^ l n ( B ) t e i ω d r η k m τ δ E k m , E n l R ^ B / T r B R ^ B 3 ,
where we used the identity $e^{i\omega_{dr}(\eta_{mn}+\eta_{pq})t} = \delta_{E_{mn},E_{qp}}$.
One can evaluate all diagrams for a general number of replicas following the above example. After carefully analyzing all diagrams and proper simplifications (see [34]), the flow of Renyi entropy $dS_M/dt$ in the heat bath B can be found, and consequently the so-called coherent part of the entanglement (von Neumann) entropy flow follows:
$$
\left.\frac{dS^{(B)}}{dt}\right|_{\rm coherent} = -\frac{\Gamma^{(B)}_\downarrow - \Gamma^{(B)}_\uparrow}{T_B}\,\left|R_{01}\right|^2.
$$
This is the new part of the entropy flow, originating from the generalized KMS correlations. We call it the coherent part because it is nonzero for degenerate states or, equivalently, for a two-level system driven at its detuning frequency.
Therefore, the entanglement entropy flow naturally separates into two parts, and the total flow is the sum of the two:
$$
\frac{dS^{(B)}}{dt} = \left.\frac{dS^{(B)}}{dt}\right|_{\rm incoherent} + \left.\frac{dS^{(B)}}{dt}\right|_{\rm coherent} = \frac{1}{T_B}\left[\Gamma_\uparrow\, p_0 - \Gamma_\downarrow\, p_1\right] - \frac{\Gamma_\downarrow - \Gamma_\uparrow}{T_B}\left|R_{01}\right|^2,
$$
in which the first term on the second line is what textbooks have so far mistakenly taken to be the total entropy flow.
As we can see, the coherent contribution in Equation (29) is not directly related to energy flow (energy flow corresponds to the incoherent part); instead, it is a finite flow that depends on the quantum coherence $|R_{01}|^2$.
Consider that the two-level system with energy difference $\omega_0$ is driven at the same frequency, i.e., $H_{dr} = \Omega\cos(\omega_0 t)\left(|0\rangle\langle 1| + |1\rangle\langle 0|\right)$, and is weakly coupled to two heat reservoirs at temperatures $T_A$ and $T_B$. From Equation (1) of Ref. [34], one finds the following time-evolution equations for the density matrix; setting them to zero determines the stationary solutions:
$$
\frac{dR_{11}}{dt} = -\frac{i\Omega}{2}\left(R_{01} - R_{10}\right) - \Gamma_\downarrow R_{11} + \Gamma_\uparrow R_{00} = 0,\qquad
\frac{dR_{01}}{dt} = -\frac{i\Omega}{2}\left(R_{11} - R_{00}\right) - \frac{1}{2}\left(\Gamma_\downarrow + \Gamma_\uparrow\right)R_{01} = 0,\qquad
R_{00} + R_{11} = 1,
$$
which yields the stationary ground-state population $R_{00} = \left[\Gamma_\downarrow(\Gamma_\downarrow+\Gamma_\uparrow)+\Omega^2\right]/\left[(\Gamma_\downarrow+\Gamma_\uparrow)^2+2\Omega^2\right]$ and the stationary off-diagonal density matrix element $R_{10} = i\Omega(1-2R_{00})/(\Gamma_\downarrow+\Gamma_\uparrow)$, with $\Gamma_\downarrow \equiv \Gamma^{(A)}_\downarrow+\Gamma^{(B)}_\downarrow$ and $\Gamma_\uparrow \equiv \Gamma^{(A)}_\uparrow+\Gamma^{(B)}_\uparrow$. Taking B to be a probe environment at zero temperature and substituting all solutions into Equation (28), the incoherent and coherent parts of the entropy flow into the probe environment are plotted in Figure 10 for different driving amplitudes and values of $\omega_0/T_A$.
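The quoted stationary solution can be verified directly. The sketch below (ours; a common sign convention for the Bloch equations is assumed) plugs the closed-form $R_{00}$ and $R_{10}$ back into the right-hand sides and checks that they vanish:

```python
import numpy as np

# Rates and drive amplitude (arbitrary units); G_dn and G_up stand for
# the downward/upward rates written as Gamma_down and Gamma_up in the text
G_dn, G_up, Om = 1.0, 0.4, 0.7
G_s = G_dn + G_up

# Closed-form stationary solution quoted in the text
R00 = (G_dn * G_s + Om**2) / (G_s**2 + 2 * Om**2)
R11 = 1.0 - R00
R10 = 1j * Om * (1 - 2 * R00) / G_s
R01 = np.conj(R10)

# Right-hand sides of the Bloch equations (assumed sign convention)
dR11 = -1j * Om / 2 * (R01 - R10) - G_dn * R11 + G_up * R00
dR01 = -1j * Om / 2 * (R11 - R00) - 0.5 * G_s * R01

print(abs(dR11) < 1e-12, abs(dR01) < 1e-12)
```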

3.6.2. Example 2: Entropy in a Four-Level Quantum Photovoltaic Cell

Scovil and Schulz-DuBois first introduced a model of a quantum heat engine (the SSDB heat engine) in which a single three-level atom, consisting of a ground state and two excited states, is in contact with two heat baths [38,39]. A large enough difference between the heat bath temperatures can create a population inversion between the two excited states and a coherent light output: one hot photon is absorbed and one cold photon is emitted, so a laser photon is produced. The SSDB heat engine model gives a clear demonstration of quantum thermodynamics; however, some detailed properties of this lasing heat engine, e.g., the threshold behavior and the statistics of the output light, are still not well studied. Since then, the model has been modified to describe other systems such as light-harvesting biocells, photovoltaic cells, etc.
Recently, in Ref. [40], one of us studied the entropy flow using the replica trick for a four-level photovoltaic cell with two degenerate ground states and two excited states, see Figure 11. This heat engine was first proposed by Scully in [11] and has recently been studied in further detail by Scully and others [17,41].
After finding all extended Keldysh diagrams for an arbitrary Renyi degree M, evaluating all self-interacting and cross-interacting diagrams and simplifying the results, the von Neumann entropy flow in heat bath A becomes [40]:
d S d t A = 1 T A { γ p 4 ω A χ ˜ 42 n ¯ ω A T A p 2 ω A χ ˜ 41 n ¯ ω A T A p 1 χ ˜ 14 , 42 ω A n ¯ ω A T A + ω A n ¯ ω A T A Re R 12 1 2 i = 1 , 2 ω A χ ˜ 14 , 42 | R 12 | 2 } .
The first two lines can be obtained using physical correlations. The last line, however, which plays an essential role in the entropy evaluation, can be obtained only through informational correlations. Here, the state probabilities are $p_x \equiv R_{xx}$ with $x = 1,2,3,4$; they depend on the characteristics of all heat baths. The dynamical response function is $\tilde\chi_{\alpha i} \equiv \tilde\chi_{i\alpha,\alpha i}(\omega_{i\alpha})$ with $i = 1,2$ and $\alpha = 3,4$, and $\tilde\chi_{1\alpha,\alpha 2} = \tilde\chi_{\alpha 1}\tilde\chi_{\alpha 2}$. Moreover, $\gamma \equiv \sum_{i=1,2}\left[\bar n(\omega_A/T_A)+1\right]\omega_A\,\tilde\chi_{3i}$.
In order to evaluate the stationary value of the entropy flow in this heat bath, we must solve the quantum master equation for the time evolution of the density matrix; this can be found in Ref. [40]. The solution shows that the coupling between the environment and the quantum system introduces decoherence in the quantum states. Energy exchange between the heat bath and the quantum system limits the lifetime of the quantum state probabilities to the relaxation time $\tau_1$. The phase of a quantum state can also fluctuate and, depending on the environmental noise, coherence is limited by the dephasing time $\tau_2$. These two times affect all elements of the density matrix. From solving the quantum Bloch equations, one can see that the only nonvanishing stationary off-diagonal component is the imaginary part of $R_{12}$, which approaches its stationary value through an exponential dephasing transient, $\mathrm{Im}\,R_{12} \sim \exp(-t/\tau_2)$.
One can substitute the stationary solution of the density matrix into Equation (30); the flow of entropy in the heat bath then changes depending on the dephasing time, see Figure 2a,b in [40]. In fact, increasing the dephasing time increases the contribution of the coherent part of the entropy flow, i.e., the information correlations. This reduces the total entropy flow into the heat bath, which equivalently increases the output power of this photovoltaic cell.

3.6.3. Example 3: Entropy in a Quantum Resonator/Cavity Heat Engine

Using a rather different technique, namely the correspondence between entropy and the statistics of energy transfer that we discuss in the next section, in [16,45] we calculated the entropy production for a resonator/cavity coupled to two different environments kept at two different temperatures, see Figure 12. One of the two baths is a probe environment at zero temperature, for which we calculate the flow of entropy.
Knowing how entropy flows as a result of interactions between the resonator, the cavity and other parts of the circuit can provide important information about possible leakage or dephasing in the system and ultimately guide modifications of quantum circuits [4]. A good understanding of cavities/resonators is also beneficial in the search for the nature of non-equilibrium quasiparticles in quantum circuits [42,43], and can help with detecting particles like muons, whose tunnelling in a quantum circuit can signal a sudden jump in the entropy flow [44,45,46]. Given that entropy flow can be measured through the full counting statistics of energy transfer (see the next section), it is important to keep track of the entropy flow in a resonator.
Again, we use the standard technique described above. Let us consider a single harmonic oscillator of frequency $\omega_0$ with Hamiltonian $\hat H = \omega_0(\hat a^\dagger \hat a + 1/2)$, coupled to a number of environments at different temperatures with different coupling strengths. We concentrate on a probe environment that is weakly coupled to the oscillator. In addition, the oscillator is driven by an external force at frequency $\Omega$. We calculate the Renyi flow and consequently the von Neumann entropy flow into the probe environment. The coupling Hamiltonian between the harmonic oscillator and the probe reservoir is $\hat H(t) = \hat X(t)\,\hat a(t) + \mathrm{h.c.}$, with $\hat X$ being the probe-reservoir operator. The Fourier transform of the correlator is $S_{mn}(\omega) = \int dt\, e^{i\omega t}\, S_{mn}(t)$. Due to the conservation of energy, the energy exchange occurs either in quanta of $\Omega$ or in quanta of $\omega_0$.
We note that the average of two operators at different times can be written as $\langle \hat a^\dagger(t)\,\hat a(t')\rangle = \langle \hat a^\dagger \hat a\rangle\, e^{i\omega_0(t-t')} + \langle \hat a^\dagger(t)\rangle\langle \hat a(t')\rangle$, where the time dependence of $\langle a(t)\rangle$ is due to the driving force and therefore oscillates at frequency $\Omega$: $\langle a(t)\rangle = a_+ e^{i\Omega t} + a_- e^{-i\Omega t}$. This reflects the fact that the oscillator can oscillate both at its own frequency and at the frequency of the external force.
Obtaining the entropy flows from the extended Keldysh correlators is straightforward. The generalized KMS relation in Equation (25) helps to describe the correlators in the thermal bath B in terms of their dynamical susceptibility. The result can be summarized as follows:
d S M ( B ) d t = M n ¯ M ω 0 / T B χ ˜ n ¯ ( ( M 1 ) ω 0 / T B ) n ¯ ω 0 / T B a a e ω 0 T B a a ,
where we defined $T_{\rm resonator}$ as the effective temperature of the harmonic oscillator via $\langle \hat a\,\hat a^\dagger\rangle = \bar n(\omega_0/T_{\rm resonator}) + 1$ and $\langle \hat a^\dagger \hat a\rangle = \bar n(\omega_0/T_{\rm resonator})$. Taking the derivative with respect to M and analytically continuing the result to the limit $M \to 1$ determines the thermodynamic entropy flow:
$$
\frac{dS^{(B)}}{dt} = \frac{1}{T_B}\left[\bar n\!\left(\omega_0/T_{\rm resonator}\right) - \bar n\!\left(\omega_0/T_B\right)\right].
$$
The entropy flow changes sign at $T_{\rm resonator} = T_B$. Moreover, the exact evaluation of the incoherent part of the entropy flow contains terms proportional to $a_+$ and $a_-$. These terms oscillate with the external drive and are nonzero. However, they are all cancelled by the coherent part of the entropy flow, so that the overall flow depends only on the temperatures and not on the driving force. The entropy flow is therefore robust in the sense that it depends only on the temperatures of the probe and the harmonic oscillator and is completely insensitive to the external driving force.
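The sign change at $T_{\rm resonator} = T_B$ is easy to illustrate. The sketch below (ours; written up to a positive overall prefactor, which does not affect the sign) evaluates the difference of Bose occupations entering the entropy flow:

```python
import math

def nbar(x):
    """Bose occupation at dimensionless frequency x = omega/T (k_B = 1)."""
    return 1.0 / math.expm1(x)

def entropy_flow(w0, T_res, T_B):
    """dS/dt into the probe bath B, up to a positive prefactor (sketch)."""
    return (nbar(w0 / T_res) - nbar(w0 / T_B)) / T_B

w0 = 1.0
print(entropy_flow(w0, 2.0, 1.0) > 0)    # hotter resonator: entropy flows into B
print(entropy_flow(w0, 0.5, 1.0) < 0)    # colder resonator: the flow reverses
print(entropy_flow(w0, 1.0, 1.0) == 0)   # sign change exactly at T_res = T_B
```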
The insensitivity of the entropy flow to the external driving force is interesting and a direct result of including the coherent flow of entropy, which is absent in semi-classical analysis. This difference makes the coherent entropy flow accessible to experimental verification.
In the absence of cross-replica correlators, the thermodynamic entropy of a probe environment coupled to a thermal bath via a resonator would depend dramatically on the amplitude of the external driving. If no such dependence on the driving amplitude is found, this indicates that these driving-dependent contributions are absent; they are in fact eliminated by quantum coherence!

4. Linking Information to Physics: A New Correspondence

As discussed above, the Renyi entropies are considered unphysical, i.e., non-observable, quantities in quantum physics due to their nonlinear dependence on the density matrix. Such quantities cannot be determined from immediate measurements; instead, their quantification is equivalent to determining the density matrix, which requires reinitialization of the density matrix between many successive measurements. The Renyi entropy flows between systems are therefore conserved measures of nonphysical quantities. An interesting and nontrivial question is: is there any relation between the Renyi entropy flows and physical flows?
An idea of such a relation was first put forward by Levitov and Klich in [23], where they proposed that entanglement entropy flow in electronic transport can be quantified from the measurement of the full counting statistics (FCS) of charge transfers [22,47,48,49]. The validity of this relation is restricted to zero temperature and obviously to the systems where interaction occurs by means of charge transfer. Recently, we presented a relation that is similar in spirit [15]. We derived a correspondence for coherent and incoherent second-order diagrams in a general time-dependent situation.
This relation gives an exact correspondence between the informational measure of Renyi entropy flows and physical observables, namely, the full counting statistics of energy transfers [47,50].
We consider a reservoir B and a quantum system q. We assume that B is infinitely large and kept in thermal equilibrium at temperature $T_B$. System q is arbitrary: it may carry several, or even infinitely many, degrees of freedom. It does not have to be in thermal equilibrium and is in general subject to time-dependent forces. It is convenient to assume that these forces are periodic with period $\tau$; however, the period does not explicitly enter the formulation of our result, which is also valid for aperiodic forces. The only requirement is that the flows of physical quantities have stationary limits, determined by averaging instant flows over a period and, for aperiodic forces, over a sufficiently long time interval. In the case of energetic interactions, the energy transfer is statistical, and the statistics can be described by the generating functions of the full counting statistics (FCS), the so-called ‘FCS Keldysh actions’.
Recently, in Ref. [15], we proved that the flow of thermodynamic entropy, as well as the flow of Renyi entropy, between two heat baths via a quantum system is exactly equivalent to the difference between two FCS Keldysh actions describing incoherent and coherent energy transfers. In the limit of long $\tau$ and for a typical reservoir B at temperature $T_B$, the incoherent and coherent FCS Keldysh actions are $f_i(\xi, T_B)$ and $f_c(\xi, T_B)$, with $\xi$ being the counting field of energy transfer. These generating functions can be determined using Keldysh diagrams, see [16]. After their evaluation, one finds the m-th statistical cumulant $C_m$ by taking derivatives of the generating function in the limit of vanishing counting field, i.e., $C_m = \lim_{\xi\to 0}\partial^m f/\partial\xi^m$.
In fact, any physical quantity depends only on the cumulants and consequently on the generating function near zero counting field. Informational measures, however, are exceptional. Detailed analysis shows that the flow of Renyi entropy of degree M in the reservoir B at equilibrium temperature $T_B$ is exactly, and unexpectedly, the following: $dS_M(T_B)/dt = M\left[f_i(\xi_*, T_B/M) - f_c(\xi_*, T_B/M)\right]$ with $\xi_* \equiv i(M-1)/T_B$. Notice that in this correspondence the temperature on the left side is $T_B$, while on the right side it is $T_B/M$. It is also important to notice that the entropy is evaluated using the generating function of full counting statistics at the nonzero counting field $\xi_*$. This relation is valid in the weak-coupling limit, where the interaction between the systems can be treated perturbatively.
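The cumulant extraction $C_m = \lim_{\xi\to 0}\partial^m f/\partial\xi^m$ can be illustrated with a toy generating function. The sketch below (ours; a Poissonian FCS of energy quanta with a real counting field, not the specific Keldysh actions $f_i$, $f_c$ of the text) recovers the first two cumulants by finite differences:

```python
import math

# Toy FCS generating function: Poissonian transfer of energy quanta eps
# at rate G, f(xi) = G (exp(xi * eps) - 1); cumulants are C_m = G eps^m
G, eps = 2.0, 0.5

def f(xi):
    return G * (math.exp(xi * eps) - 1.0)

h = 1e-4
C1 = (f(h) - f(-h)) / (2 * h)            # mean energy flow
C2 = (f(h) - 2 * f(0) + f(-h)) / h**2    # energy-flow noise
print(abs(C1 - G * eps) < 1e-6, abs(C2 - G * eps**2) < 1e-4)
```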

5. Discussion

Currently, ‘time’ does not play any essential role in quantum information theory. In this sense, quantum information theory is underdeveloped, similarly to how quantum physics was underdeveloped before Schrödinger introduced his wave equation. In this review article, we discussed a fascinating extension of the Keldysh formalism that consistently copes with the problem of time for one of the central quantities of quantum information theory: entropy. We characterized the flows of conserved entropies (both Renyi and von Neumann entropies) and illustrated them diagrammatically, introducing new correlators that have so far been absent from the literature.
Given that entropy is not an observable, since it is a nonlinear function of the density matrix, one can use a probe environment to measure it indirectly in light of the new correspondence between entropy and the full counting statistics of energy transfer. This can be done equally well for imaginary and real values of the characteristic parameter. The measurement procedures may be complex, yet they are feasible and physical. The correspondence can have many other advantages: for instance, a complete understanding of entropy flows may help to identify the sources of fidelity loss in quantum communication and to develop methods to control or even prevent them.

Author Contributions

Y.V.N. contributed the conceptualization and idea; A.v.S. contributed editing; and M.H.A. wrote the original draft.

Funding

The authors declare no funding support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Polkovnikov, A.; Sengupta, K.; Silva, A.; Vengalattore, M. Colloquium: Nonequilibrium dynamics of closed interacting quantum systems. Rev. Mod. Phys. 2011, 83, 863. [Google Scholar] [CrossRef]
  2. Santos, L.F.; Polkovnikov, A.; Rigol, M. Entropy of Isolated Quantum Systems after a Quench. Phys. Rev. Lett. 2011, 107, 040601. [Google Scholar] [CrossRef] [PubMed]
  3. Jaeger, G. Quantum Information; Springer: Berlin, Germany, 2007. [Google Scholar]
  4. Ansari, M.H. Superconducting qubits beyond the dispersive regime. Phys. Rev. B 2019, 100, 024509. [Google Scholar] [CrossRef] [Green Version]
  5. Lagoudakis, K.G.; McMahon, P.L.; Fischer, K.A.; Puri, S.; Müller, K.; Dalacu, D.; Poole, P.J.; Reimer, M.E.; Zwiller, V.; Yamamoto, Y.; et al. Initialization of a spin qubit in a site-controlled nanowire quantum dot. New J. Phys. 2014, 16, 023019. [Google Scholar] [CrossRef]
  6. Ansari, M.H. Exact quantization of superconducting circuits. Phys. Rev. B 2019, 100, 024509. [Google Scholar] [CrossRef]
  7. Paik, H.; Mezzacapo, A.; Sandberg, M.; McClure, D.T.; Abdo, B.; Córcoles, A.D.; Dial, O.; Bogorin, D.F.; Plourde, B.L.T.; Steffen, M.; et al. Experimental demonstration of a resonator-induced phase gate in a multiqubit circuit-qed system. Phys. Rev. Lett. 2016, 117, 250502. [Google Scholar] [CrossRef] [PubMed]
  8. Ansari, M.H.; Wilhelm, F.K. Noise and microresonance of critical current in Josephson junction induced by Kondo trap states. Phys. Rev. B 2011, 84, 235102. [Google Scholar] [CrossRef] [Green Version]
  9. Pekola, J.P.; Khaymovich, I.M. Thermodynamics in single-electron circuits and superconducting qubits. Annu. Rev. Condens. Matter Phys. 2019, 10, 193. [Google Scholar] [CrossRef]
  10. Uzdin, R.; Levy, A.; Kosloff, R. Equivalence of Quantum Heat Machines, and Quantum-Thermodynamic Signatures. Phys. Rev. X 2015, 5, 031044. [Google Scholar] [CrossRef]
  11. Scully, M.O.; Chapin, K.; Dorfman, K.; Kim, M.; Svidzinsky, A. Quantum heat engine power can be increased by noise-induced coherence. Proc. Natl. Acad. Sci. USA 2011, 108, 15097–15100. [Google Scholar] [CrossRef] [Green Version]
  12. Linden, N.; Popescu, S.; Skrzypczyk, P. How small can thermal machines be? The smallest possible refrigerator. Phys. Rev. Lett. 2010, 105, 13. [Google Scholar] [CrossRef] [PubMed]
  13. Frank, R.L.; Lieb, E.H. Monotonicity of a relative Rényi entropy. J. Math. Phys. 2013, 54, 122201. [Google Scholar] [CrossRef]
  14. Esposito, M.; Lindenberg, K.; Van den Broeck, C. Entropy production as correlation between system and reservoir. New J. Phys. 2010, 12, 013013. [Google Scholar] [CrossRef]
  15. Ansari, M.H.; Nazarov, Y.V. Exact correspondence between Renyi entropy flows and physical flows. Phys. Rev. B 2015, 91, 174307. [Google Scholar] [CrossRef] [Green Version]
  16. Ansari, M.H.; Nazarov, Y.V. Keldysh formalism for multiple parallel worlds. J. Exp. Theor. Phys. 2016, 122, 389–401. [Google Scholar] [CrossRef] [Green Version]
  17. Li, S.W.; Kim, M.B.; Agarwal, G.S.; Scully, M.O. Quantum statistics of a single-atom Scovil–Schulz-DuBois heat engine. Phys. Rev. A 2017, 96, 063806. [Google Scholar] [CrossRef]
  18. Utsumi, Y. Optimum capacity and full counting statistics of information content and heat quantity in the steady state. Phys. Rev. B 2019, 99, 115310. [Google Scholar] [CrossRef] [Green Version]
  19. Utsumi, Y. Full counting statistics of information content. Eur. Phys. J. Spec. Top. 2019, 227, 1911. [Google Scholar] [CrossRef]
  20. Kubo, R. Statistical-mechanical theory of irreversible process. I. General theory and simple applications to magnetic and conduction problems. J. Phys. Soc. Jpn. 1957, 12, 570–586. [Google Scholar] [CrossRef]
  21. Martin, P.; Schwinger, J. Theory of many-particle systems. I. Phys. Rev. 1959, 115, 1342. [Google Scholar] [CrossRef]
  22. Levitov, L.S.; Lee, H.W.; Lesovik, G.B. Electron counting statistics and coherent states of electric current. J. Math. Phys. 1996, 37, 4845–4866. [Google Scholar] [CrossRef] [Green Version]
  23. Klich, I.; Levitov, L.S. Quantum noise as an entanglement meter. Phys. Rev. Lett. 2009, 102, 100502. [Google Scholar] [CrossRef] [PubMed]
  24. Nazarov, Y.V.; Kindermann, M. Full counting statistics of a general quantum mechanical variable. Eur. Phys. J. B 2003, 35, 413–420. [Google Scholar] [CrossRef]
  25. Ansari, M.H. The Statistical Fingerprints of Quantum Gravity. Ph.D. Thesis, University of Waterloo, Waterloo, ON, Canada, 2008. [Google Scholar]
26. Ansari, M.H. Spectroscopy of a canonically quantized horizon. Nucl. Phys. B 2007, 783, 179–212. [Google Scholar] [CrossRef]
  27. Ansari, M.H. Generic degeneracy and entropy in loop quantum gravity. Nucl. Phys. B 2008, 795, 635–644. [Google Scholar] [CrossRef] [Green Version]
  28. Ansari, M.H. Quantum amplification effect in a horizon fluctuation. Phys. Rev. D 2010, 81, 104041. [Google Scholar] [CrossRef]
29. Rényi, A. On Measures of Entropy and Information. In Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1960; pp. 547–561. [Google Scholar]
30. Deutsch, D.; Hayden, P. Information flow in entangled quantum systems. Proc. R. Soc. Lond. A 2000, 456, 1759–1774. [Google Scholar] [CrossRef] [Green Version]
  31. Nazarov, Y.V. Flows of Rényi entropies. Phys. Rev. B 2011, 84, 205437. [Google Scholar] [CrossRef]
  32. Holzhey, C.; Larsen, F.; Wilczek, F. Geometric and renormalized entropy in conformal field theory. Nucl. Phys. B 1994, 424, 443. [Google Scholar] [CrossRef]
  33. Calabrese, P.; Cardy, J. Entanglement entropy and conformal field theory. J. Phys. A 2009, 42, 504005. [Google Scholar] [CrossRef]
  34. Ansari, M.H.; Nazarov, Y.V. Rényi entropy flows from quantum heat engines. Phys. Rev. B 2015, 91, 104303. [Google Scholar] [CrossRef]
  35. Keldysh, L.V. Diagram technique for nonequilibrium processes. Zh. Eksp. Teor. Fiz. 1964, 47, 1515–1527. [Google Scholar]
  36. Nazarov, Y.V.; Blanter, Y.M. Quantum Transport: Introduction to Nanoscience; Cambridge University Press: Cambridge, UK, 2009. [Google Scholar]
  37. Alicki, R.; Kosloff, R. Introduction to Quantum Thermodynamics: History and Prospects. arXiv 2018, arXiv:1801.08314. [Google Scholar]
  38. Scovil, H.E.D.; Schulz–DuBois, E.O. Three-level masers as heat engines. Phys. Rev. Lett. 1959, 2, 262. [Google Scholar] [CrossRef]
39. Geusic, J.E.; Schulz-DuBois, E.O.; Scovil, H.E.D. Quantum equivalent of the Carnot cycle. Phys. Rev. 1967, 156, 343. [Google Scholar] [CrossRef]
  40. Ansari, M.H. Entropy production in a photovoltaic cell. Phys. Rev. B 2017, 95, 174302. [Google Scholar] [CrossRef] [Green Version]
  41. Mitchison, M.T. Quantum thermal absorption machines: Refrigerators, engines and clocks. arXiv 2019, arXiv:1902.02672. [Google Scholar] [CrossRef]
  42. Houzet, M.; Serniak, K.; Catelani, G.; Devoret, M.H.; Glazman, L.I. Photon-assisted charge-parity jumps in a superconducting qubit. arXiv 2019, arXiv:1904.06290. [Google Scholar]
  43. Ansari, M.H. Rate of tunneling nonequilibrium quasiparticles in superconducting qubits. Supercond. Sci. Technol. 2015, 28, 045005. [Google Scholar] [CrossRef] [Green Version]
  44. Bal, M.; Ansari, M.H.; Orgiazzi, J.L.; Lutchyn, R.M.; Lupascu, A. Dynamics of parametric fluctuations induced by quasiparticle tunneling in superconducting flux qubits. Phys. Rev. B 2015, 91, 195434. [Google Scholar] [CrossRef]
  45. Ansari, M.H.; Wilhelm, F.K.; Sinha, U.; Sinha, A. The effect of environmental coupling on tunneling of quasiparticles in Josephson junctions. Supercond. Sci. Technol. 2013, 26, 035209. [Google Scholar] [CrossRef]
  46. Jafari-Salim, A.; Eftekharian, A.; Majedi, A.H.; Ansari, M.H. Stimulated quantum phase slips from weak electromagnetic radiations in superconducting nanowires. AIP Advances 2016, 6, 125013. [Google Scholar] [CrossRef]
  47. Kindermann, M.; Pilgram, S. Statistics of heat transfer in mesoscopic circuits. Phys. Rev. B 2004, 69, 155334. [Google Scholar] [CrossRef] [Green Version]
48. Büttiker, M.; Imry, Y.; Landauer, R.; Pinhas, S. Generalized many-channel conductance formula with application to small rings. Phys. Rev. B 1985, 31, 6207. [Google Scholar] [CrossRef] [PubMed]
49. Büttiker, M. Four-terminal phase-coherent conductance. Phys. Rev. Lett. 1986, 57, 1761. [Google Scholar] [CrossRef] [PubMed]
  50. Heikkilä, T.T.; Nazarov, Y.V. Statistics of temperature fluctuations in an electron system out of equilibrium. Phys. Rev. Lett. 2009, 102, 130605. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Diagrammatic representation of terms in the first line in Equation (20).
Figure 2. A typical diagram with two first-order interactions acting on two different worlds.
Figure 3. A typical higher order diagram.
Figure 4. The Keldysh diagram for the time evolution of: (left) one world made of one subsystem, (right) a world made of two interacting subsystems. Each contour represents a subsystem and the crosses denote interactions.
Figure 5. A diagram with two energy exchanges in one replica and no interaction in others.
Figure 6. A diagram in which two replicas undergo first-order interactions while the others remain intact.
Figure 7. A two-level system coupled to quantum heat baths.
Figure 8. Self-interacting diagrams for interaction between a quantum system and a heat bath.
Figure 9. Cross-replica interacting diagrams for a quantum system and a heat bath.
Figure 10. Entropy production in a probe bath that is kept at zero temperature and coupled to the two-level system depicted in Figure 7. The entropy is the sum of two parts, incoherent and coherent. (a) The incoherent part is nothing new and can be determined from standard correlations. It is positive, consistent with the convention that entropy enters from the higher-temperature bath (via the two-level system). (b) The coherent part was previously unknown, as it arises from the informational correlations between different replicas. It depends quadratically on the off-diagonal elements of the density matrix. Quite nontrivially, this part is negative; summing it with the incoherent part still yields a positive entropy flow, but of much smaller magnitude at small driving amplitudes.
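The decomposition in the Figure 10 caption can be mimicked with a toy model: an incoherent flow set by the level occupation, plus a coherent flow that is negative and quadratic in the off-diagonal density-matrix element. The coefficients `a` and `b` below are hypothetical placeholders, not the paper's formulas; the sketch only illustrates how a quadratic negative coherent part reduces, without reversing, the total flow when the off-diagonal element is bounded by the occupations.

```python
# Toy sketch (hypothetical coefficients, not the authors' expressions):
# split an entropy flow into an incoherent part, set by the excited-state
# occupation, and a coherent part, quadratic in |rho_01| and negative.

def entropy_flow(p_excited, rho_01, a=1.0, b=0.5):
    """Return (incoherent, coherent, total) flows for the toy model."""
    incoherent = a * p_excited              # positive: standard correlations
    coherent = -b * abs(rho_01) ** 2        # negative: cross-replica term
    return incoherent, coherent, incoherent + coherent

# For a valid density matrix |rho_01|^2 <= p_excited * (1 - p_excited),
# so with a > b the total stays positive, just smaller than the
# incoherent part alone -- the behavior described in the caption.
inc, coh, tot = entropy_flow(p_excited=0.1, rho_01=0.1)
```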
Figure 11. A four-level doubly degenerate photovoltaic cell.
Figure 12. A quantum cavity heat engine.
Ansari, M.H.; van Steensel, A.; Nazarov, Y.V. Entropy Production in Quantum is Different. Entropy 2019, 21, 854. https://doi.org/10.3390/e21090854