Article

Quantum Statistical Complexity Measure as a Signaling of Correlation Transitions

by André T. Cesário 1,*, Diego L. B. Ferreira 1,†, Tiago Debarba 2,†, Fernando Iemini 3,†, Thiago O. Maciel 4,† and Reinaldo O. Vianna 1,†
1 Departamento de Física, ICEx, Universidade Federal de Minas Gerais (UFMG), Av. Pres. Antônio Carlos 6627, Belo Horizonte 31270-901, Brazil
2 Departamento Acadêmico de Ciências da Natureza, Universidade Tecnológica Federal do Paraná (UTFPR), Campus Cornélio Procópio, Avenida Alberto Carazzai 1640, Cornélio Procópio 86300-000, Brazil
3 Instituto de Física, Universidade Federal Fluminense (UFF), Niterói 24210-346, Brazil
4 Instituto de Física, Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro 21941-972, Brazil
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2022, 24(8), 1161; https://doi.org/10.3390/e24081161
Submission received: 20 June 2022 / Revised: 20 July 2022 / Accepted: 23 July 2022 / Published: 19 August 2022
(This article belongs to the Special Issue Quantum Information Entropy in Physics)

Abstract:
We introduce a quantum version of the statistical complexity measure, in the context of quantum information theory, and use it as a signaling function of quantum order–disorder transitions. We discuss the possibility for such transitions to characterize interesting physical phenomena, such as quantum phase transitions, or abrupt variations in correlation distributions. We apply our measure to two exactly solvable Hamiltonian models: the 1D Quantum Ising Model (in the single-particle reduced state) and the Heisenberg XXZ spin-1/2 chain (in the two-particle reduced state). We analyze its behavior across quantum phase transitions for finite system sizes, as well as in the thermodynamic limit, by using the Bethe Ansatz technique.

1. Introduction

Let us consider a physical, chemical, or biological process such as, for example, the change in the temperature of water, the mixing of two solutions, or the formation of a neural network. It is intuitive to believe that if one could classify all the possible configurations of such systems, described by their ordering and disordering patterns, it would be possible to characterize and control them. For example, during the process of changing the temperature of water, knowing the pattern of ordering and disordering of its molecular structure would make it possible to characterize and completely control its phase transitions.
In information theory, the ability to identify certain patterns of order and disorder in probability distributions enables us to control the creation, transmission, and measurement of information. In this way, the characterization and quantification of the complexity contained in physical systems and their constituent parts is a crucial goal for information theory [1]. One point of consensus in the literature about complexity is that no formal definition of this term exists. Intuition suggests that systems which can be described as “not complex” are readily comprehended: they can be described concisely by means of few parameters or variables, and their information content is low. Complexity quantifiers must satisfy some properties: (i.) assign a minimum value (possibly zero) to the opposite extremes of order and disorder; (ii.) be sensitive to transitions between order–disorder patterns; (iii.) be computable. There are many ways to define measures of the degree of complexity of physical systems. Among such definitions, we can mention measures based on data compression algorithms for finite-size sequences [2,3,4], Kolmogorov or Chaitin measures based on the size of the smallest algorithm that can reproduce a particular type of pattern [5,6], and measures based on classical information theory [7,8,9,10,11,12,13]. Based on recent progress in defining a general canonical divergence within Information Geometry, a canonical divergence measure was presented with the objective of quantifying complexity for both classical and quantum systems. In the classical realm, it was proven that this divergence coincides with the classical Kullback–Leibler divergence, and in the quantum domain it reduces to the quantum relative entropy [14,15].
The statistical complexity relates the simplicity of a probability distribution to the amount of resources needed to store information [16,17]. Similarly, in the quantum realm, the complexity of a given density matrix can be translated as the resources needed to create, operate on, or measure the quantum state of the system [18,19,20]. On the other hand, the quantum-information meaning of complexity could play an important role in the quantification of transitions between order and disorder patterns, which may indicate some quantum physical phenomenon, such as a quantum phase transition.
Regarding the complexity contained in systems, two of the simplest models in physics are the ideal gas and the perfect crystal. In an ideal gas, the system can be found with the same probability in any of the available micro-states, so each state contributes the same amount of information. On the other hand, in a perfect crystal, the symmetry rules restrict the accessible states of the system to a single, very symmetric state. These simple models are extreme cases of minimum complexity on a scale of order and disorder; therefore, there might exist some intermediate state which contains the maximum complexity value on that scale [21].
The main goal of this work is to introduce a quantum version of the statistical complexity measure, based on the physical meaning of the characterization and quantification of transitions between order–disorder patterns of quantum systems. As a by-product, this measure can be applied to the study of quantum phase transitions. The physical properties of a system across a quantum phase transition are dramatically altered, and it is therefore interesting to understand how the complexity of a system behaves under such transitions. In our analysis, we studied the single-particle reduced state of the parametric 1D Ising Model and the two-particle reduced state of the Heisenberg XXZ spin-1/2 model.
The manuscript is organized as follows: In Section 2, we introduce the Statistical Measure of Complexity defined by López-Ruiz et al. in [21], and in Section 3, we introduce a quantum counterpart of this measure: the Quantum Statistical Complexity Measure. We present some properties of this measure (Section 4), and also exhibit a closed expression of this measure for one qubit, written in the Bloch basis (Section 4.1). We also discuss two interesting examples and applications: the 1D Quantum Ising Model (Section 4.2), in which we compute the Quantum Statistical Complexity Measure for the one-qubit state reduced from N spins, in the thermodynamic limit, with the objective of determining the quantum phase transition point. We further determine the first-order quantum transition point and the continuous quantum phase transition of the Heisenberg XXZ spin-1/2 model with h = 0 (Section 4.3), by means of the Quantum Statistical Complexity Measure of the two-qubit reduced state for nearest neighbors, in the thermodynamic limit. Finally, we give concluding remarks in Section 5.

2. Classical Statistical Complexity Measure—CSCM

Consider a system possessing N accessible states {x_1, x_2, ..., x_N} when observed on a particular scale, with each state having an intrinsic probability given by p = {p_i}_{i=1}^N. As discussed before, a candidate function to quantify the complexity of a probability distribution associated with a physical system must assign zero to systems with the maximum degree of order, that is, to pure distributions, p = {p_i = 1, p_{j≠i} = 0}, and also assign zero to disordered systems, which are characterized by an independent and identically distributed vector (i.i.d.), I = {I_i = 1/N}, for all i = 1, ..., N. Let us address the cases of ordered (Section 2.1) and disordered (Section 2.2) systems separately.

2.1. Degree of Order

A physical system possessing the maximum degree of order can be regarded as a system in which all of its elements share the same symmetry. The probability distribution that describes such a system is a pure vector, meaning that the system has only one possible configuration. Physically, this is the case for a gas at zero Kelvin temperature, or a perfect crystal where the symmetry rules restrict the accessible state to a very symmetric one. In order to quantify the degree of order of a given system, the function must assign a maximum value to pure probability distributions, and zero to equiprobable configurations. A function capable of quantifying such a degree of order is the l1-distance between the probability distribution and the independent and identically distributed vector (i.i.d.):
$$ D(p, \mathbb{I}) = \frac{1}{2}\, \lVert p - \mathbb{I} \rVert_1 = \frac{1}{2} \sum_i \left| p_i - \frac{1}{N} \right|, $$
where I is the (i.i.d.) vector, i.e., a vector with elements I_i = 1/N, for all i = 1, ..., N. This function plays the role of the disequilibrium function and quantifies the degree of order of a probability vector. It consists of the sum of the absolute values of the elements of the vector p − I. The disequilibrium measure D will have zero value for maximally disordered systems and maximum value for maximally ordered systems.

2.2. Degree of Disorder

In contrast with an ordered system, a system possessing the maximum degree of disorder is described by an equiprobable distribution. This means that any of its configurations occurs with equal probability, as in a fair dice game, or in the partition function of an isolated ideal gas. From a statistical point of view, the probability vector that describes this feature is the independent and identically distributed vector (i.i.d.) I, as defined above. One can define the degree of disorder of a system as a function which assigns zero to pure probability distributions (associated with maximally ordered systems) and a maximum value to the i.i.d. distribution. A well-known function capable of quantifying the degree of disorder of a probability vector is the Shannon entropy:
$$ H(p) = -\sum_{i=1}^{N} p_i \log(p_i). $$
In this way, the Shannon entropy H(p) will assign zero to maximally ordered systems, and a maximal value, equal to log N, to the i.i.d. vector. The log function is usually taken in base 2, in order to quantify the amount of disorder in bits.

2.3. Quantifying Classical Complexity

Using the maximally ordered and maximally disordered states and Equations (1) and (2), respectively, López-Ruiz et al. in [21] defined a classical statistical measure of complexity constructed as a product of such order–disorder quantifiers. There are intermediate states of order–disorder that may exhibit some interesting physical properties and which can be associated with complex behavior. Therefore, in this sense, this measure should deal with these intermediate states by measuring the amount of complexity of a physical system [21].
Definition 1 
(Classical Statistical Complexity Measure—(CSCM) [21]). Let us consider a probability vector given by p = { p i } i = 1 N , with dim ( p ) = N , associated with a random variable X, representing all possible states of a system. The function C ( p ) is a measure of the system’s complexity and can be defined as:
$$ C(p) = \frac{1}{\log N}\, H(p)\, D(p, \mathbb{I}). $$
The function C ( p ) will vanish for “simple systems”, such as the ideal gas model or a crystal, and it should reach a maximum value for some state.
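As a simple numerical illustration (a sketch added here, not part of the original formulation; the helper names and the NumPy dependency are our own choices), Equation (3) can be evaluated directly, confirming that both extremes of order and disorder yield zero complexity:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; 0*log(0) is taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def disequilibrium(p):
    """Half the l1 distance between p and the uniform (i.i.d.) vector, Eq. (1)."""
    p = np.asarray(p, dtype=float)
    return 0.5 * np.sum(np.abs(p - 1.0 / p.size))

def classical_complexity(p):
    """CSCM of Eq. (3): normalized Shannon entropy times the disequilibrium."""
    p = np.asarray(p, dtype=float)
    return shannon_entropy(p) / np.log2(p.size) * disequilibrium(p)

print(classical_complexity([1.0, 0.0, 0.0, 0.0]))     # ordered limit    -> 0
print(classical_complexity([0.25, 0.25, 0.25, 0.25])) # disordered limit -> 0
print(classical_complexity([0.7, 0.1, 0.1, 0.1]))     # intermediate     -> > 0
```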
Definition 2 
(Classical Statistical Complexity Measure of Joint Probability Distributions and of Marginal Probability Distributions). Given a known joint distribution of two discrete random variables X and Y, given by: p ( x i , y j ) , of dimension N, one can define the CSCM of the joint distribution C X Y ( p X Y ( x i , y j ) ) , by Equation (4):
$$ C_{XY}(p_{XY}(x_i, y_j)) = \frac{1}{\log N}\, H(p_{XY}(x_i, y_j))\, D(p_{XY}(x_i, y_j), \mathbb{I}), $$
and in the same way, we define the CSCM of the two marginal distributions, C X ( p X ( x i ) ) , by Equation (5) and C Y ( p Y ( y j ) ) , by Equation (6):
$$ C_{X}(p_{X}(x_i)) = \frac{1}{\log N}\, H(p_{X}(x_i))\, D(p_{X}(x_i), \mathbb{I}), $$
$$ C_{Y}(p_{Y}(y_j)) = \frac{1}{\log N}\, H(p_{Y}(y_j))\, D(p_{Y}(y_j), \mathbb{I}), $$
where $p_X$ and $p_Y$ are the marginal probability distributions, given by $p_X(x_i) = \sum_{j=1}^{N} p_{XY}(x_i, y_j)$ and $p_Y(y_j) = \sum_{i=1}^{N} p_{XY}(x_i, y_j)$.
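For concreteness, the short sketch below (again an illustration added here, under the same assumptions as the previous sketch) contrasts the CSCM of a correlated joint distribution with that of its uniform marginals:

```python
import numpy as np

def cscm(p):
    """CSCM of Eqs. (3)-(6): normalized Shannon entropy times the disequilibrium."""
    p = np.asarray(p, dtype=float).ravel()
    nz = p[p > 0]
    h = -np.sum(nz * np.log2(nz))
    d = 0.5 * np.sum(np.abs(p - 1.0 / p.size))
    return h / np.log2(p.size) * d

# A joint distribution of two binary variables and its marginals.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)
print(cscm(p_xy), cscm(p_x), cscm(p_y))  # joint complexity > 0, marginals -> 0
```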
Generalizations of these measures defined above to continuous variables are straightforward (cf. Refs. [22,23]). In the same way, we can also generalize Definition 2 in order to deal with conditional probability densities like p ( x | y ) or p ( y | x ) , which may be useful in some other more general contexts.
The Classical Statistical Complexity Measure depends on the nature of the description associated with a system and on the scale of observation [24]. This function, generalized as a functional of a probability distribution, has a relation with time series generated by classical dynamical systems [24]. Two ingredients are fundamental in order to define such a quantity: the first one is an entropy function which quantifies the information contained in a system, and could also be Tsallis' entropy [25], the escort-Tsallis entropy [26], or the Rényi entropy [27]. The other ingredient is a distance function on the space of probability distributions, which indicates the disequilibrium relative to a fixed distribution (in this case, the distance to the i.i.d. vector). For this purpose we can use a Euclidean distance (or some other p-norm), the Bhattacharyya distance [28], or Wootters' distance [29]. We can also apply a statistical measure of divergence, for example the classical relative entropy [30], the Hellinger distance, or the Jensen–Shannon divergence [31]. Several other generalized versions of complexity measures have been proposed in recent years, and these functions have proven to be useful in some branches of classical information theory [7,32,33,34,35,36,37,38,39,40].

3. Quantum Statistical Complexity Measure—QSCM

3.1. Quantifying Quantum Complexity

The quantum version of the statistical complexity measure quantifies the amount of complexity contained in a quantum system, on an order–disorder scale. For quantum systems, the probability distribution is replaced by a density matrix (positive semi-definite and with trace one). As in the classical case, the extreme cases of order and disorder are, respectively, the pure quantum states |ψ⟩⟨ψ| and the maximally mixed state 𝕀 := I/N, where N is the dimension of the Hilbert space. Additionally, in analogy with the description for classical probability distributions, the quantifier of quantum statistical complexity must be zero for the maximum degrees of order and disorder. One can define the Quantum Statistical Complexity Measure (QSCM) as the product of an order and a disorder quantifier: a quantum entropy, which measures the amount of disorder of a quantum system, and a pairwise distinguishability measure of quantum states, which plays the role of a disequilibrium function. One of the functions that measures the amount of disorder of a quantum system is the von Neumann entropy, given by:
$$ S(\rho) = -\mathrm{Tr}[\rho \log(\rho)], $$
where ρ is the density matrix of the system. The trace distance between ρ and the maximally mixed state quantifies the degree of order:
$$ D(\rho, \mathbb{I}) \equiv \frac{1}{2}\, \lVert \rho - \mathbb{I} \rVert_1 = \frac{1}{2}\, \mathrm{Tr}\sqrt{(\rho - \mathbb{I})^2}. $$
For our purposes here, we define the Quantum Statistical Complexity Measure (QSCM) in Definition 4 by means of the trace distance between the reduced quantum state ρ and 𝕀 acting as the reference state. The trace distance was chosen in both Definition 3 and Definition 4 because it quantifies the maximal distinguishability of states in Hilbert space, and it is also monotonic under stochastic operations.
Definition 3 
(Quantum Statistical Complexity Measure—(QSCM)). Let ρ ∈ D(H_N) be a quantum state over an N-dimensional Hilbert space. Then we can define the Quantum Statistical Complexity Measure as the following functional of ρ:
$$ C(\rho) = \frac{1}{\log N}\, S(\rho) \cdot D(\rho, \mathbb{I}), $$
where S ( ρ ) is the von Neumann entropy, and D ( ρ , I ) is a distinguishability quantity between the state ρ and the normalized maximally mixed state I , defined in the suitable space.
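A minimal numerical sketch of Definition 3 (added for illustration; the spectral implementation via the eigenvalues of ρ and the helper names below are choices of this sketch, not part of the original text) is:

```python
import numpy as np

def von_neumann_entropy(rho):
    """von Neumann entropy in bits, computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    return -np.sum(nz * np.log2(nz))

def trace_distance(rho, sigma):
    """D(rho, sigma) = (1/2) * sum of |eigenvalues| of (rho - sigma)."""
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

def qscm(rho):
    """Quantum Statistical Complexity Measure of Definition 3."""
    n = rho.shape[0]
    mixed = np.eye(n) / n
    return von_neumann_entropy(rho) / np.log2(n) * trace_distance(rho, mixed)

# A pure state and the maximally mixed state both have zero complexity.
psi = np.array([1.0, 0.0])
print(qscm(np.outer(psi, psi.conj())))   # -> 0
print(qscm(np.eye(2) / 2))               # -> 0
print(qscm(np.diag([0.9, 0.1])))         # intermediate mixedness -> > 0
```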
Definition 4 
(Quantum Statistical Complexity Measure of the Reduced Density Matrix). Let ρ_SE ∈ D(H_N ⊗ H_M) be a global system of dimension NM, or any compound state of a system and its environment, and let ρ = ρ_S = Tr_E[ρ_SE] ∈ D(H_N), of dimension N, be the reduced state of this compound state (where Tr_E is the partial trace over the environment). Then we can define the Quantum Statistical Complexity Measure of the Reduced Density Matrix, C(ρ_S), as:
$$ C(\rho_S) = C(\mathrm{Tr}_E[\rho_{SE}]) = \frac{1}{\log N}\, S(\mathrm{Tr}_E[\rho_{SE}]) \cdot D(\mathrm{Tr}_E[\rho_{SE}], \mathbb{I}) = \frac{1}{\log N}\, S(\rho_S) \cdot D(\rho_S, \mathbb{I}), $$
where S ( ρ S ) is the von Neumann entropy of the quantum reduced state: ρ S = Tr E [ ρ S E ] , and D ( ρ S , I ) is a distinguishability quantity between the quantum reduced state ( ρ S ), and the normalized maximally mixed state I , defined in the suitable space.
In this work, we will use Definition 4 (Quantum Statistical Complexity Measure of the Reduced Density Matrix) as the Quantum Statistical Complexity Measure (QSCM). The reason for this choice is that we will study quantum phase transitions: we will apply the QSCM to the one-qubit state (reduced from N parts) in the thermodynamic limit, in the case of the 1D Ising Model (Section 4.2), and, in the case of the XXZ-1/2 Heisenberg Model (Section 4.3), we will apply the QSCM to the two-qubit state (reduced from N parts), also in the thermodynamic limit. Definition 4 is the quantum analogue of Definition 2, i.e., it is the quantum correspondent of the CSCM defined by means of “marginal probability distributions”. Extensions of the measures defined above to continuous variables are straightforward.
In analogy with the classical counterpart, in the definition of the quantum statistical complexity measure there is carte blanche in choosing the quantum entropy function, such as the quantum Rényi entropy [41] or the quantum Tsallis entropy [42], among many other functions. Similarly, we can choose other disequilibrium functions as measures of distinguishability of quantum states: some Schatten p-norm [43], a quantum Rényi relative entropy [44], the quantum skew divergence [45], or a quantum relative entropy [46]. Another feature that might generalize the quantities defined in Definition 3 and Definition 4 is to adopt a more general quantum state ρ* as the reference state (rather than the normalized maximally mixed state 𝕀) in the disequilibrium function. This choice must be guided by some physical symmetry or interest. Some obvious candidates are the thermal mixed quantum state and the canonical thermal pure quantum state [47].

3.2. Some Properties of the QSCM

To complete our introduction of the quantifier of quantum statistical complexity, we should require some properties to guarantee a bona fide information quantifier. The amount of order–disorder, as measured by the QSCM, must be invariant under local unitary operations because it is related to the purity of the quantum reduced states.
Proposition 1 
(Local Unitary Invariance). The Quantum Statistical Complexity Measure is invariant under local unitary transformations applied to the quantum reduced state of the system-environment: let ρ = ρ_S ∈ D(H_S) be the quantum reduced state of dimension N, and let ρ_SE ∈ D(H_SE) be the compound system-environment state, of dimension NM, such that ρ_S = Tr_E[ρ_SE]; then:
$$ C(U_S\, \rho\, U_S^{\dagger}) = C(\rho), $$
where ρ = ρ_S ∈ D(H_S), U_S is a local unitary transformation acting on D(H_S), and Tr_E is the partial trace over the environment E. The extension of this property to the global state ρ_SE is trivial.
This statement comes directly from the invariance under local unitary transformation of von Neumann entropy and trace distance applied on the quantum reduced state ρ S .
Another important property regards the case of inserting copies of the system in some experimental contexts. Let us consider an experiment in which the experimentalist must quantify the QSCM of a given state ρ by means of a certain number n of copies, ρ^{⊗n}; the QSCM of the copies should then be bounded by n times the quantity for a single copy.
Proposition 2 
(Sub-additivity over copies). Given a product state ρ^{⊗n}, with dim(ρ) = N, the QSCM is a sub-additive function over copies:
$$ C(\rho^{\otimes n}) \leq n\, C(\rho). $$
Indeed this is an expected property for a measure of information, since the regularized number of bits of information gained from a given system cannot increase just by considering more copies of the same system. The proof of Proposition 2 is in Appendix A, and it comes from the additivity of von Neumann entropy and sub-additivity of trace distance.
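Proposition 2 can also be checked numerically; the sketch below (illustrative only, with an arbitrarily chosen qutrit state) compares C(ρ^{⊗n}) with n C(ρ) for a few values of n:

```python
import numpy as np

def qscm(rho):
    """QSCM of Definition 3: normalized von Neumann entropy times trace distance to I/N."""
    n = rho.shape[0]
    ev = np.linalg.eigvalsh(rho)
    nz = ev[ev > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    d = 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - np.eye(n) / n)))
    return s / np.log2(n) * d

rho = np.diag([0.6, 0.3, 0.1])             # an arbitrary qutrit state
for n in range(1, 4):
    copies = rho
    for _ in range(n - 1):
        copies = np.kron(copies, rho)      # builds rho^{(x) n}
    print(n, qscm(copies), n * qscm(rho))  # C(rho^{(x) n}) <= n C(rho)
```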
It is important to notice that the quantum statistical complexity is not sub-additive over general extensions with quantum states. For example: i. Extension with a maximally mixed state: let us consider a given state ρ extended with one maximally mixed state 𝕀 = I/N, with dim(ρ) = dim(𝕀) = N:
$$ C(\rho \otimes \mathbb{I}) = \frac{S(\rho) + \log N}{2 \log N}\, D(\rho, \mathbb{I}) = \frac{C(\rho)}{2} + \frac{D(\rho, \mathbb{I})}{2} \leq \frac{C(\rho)}{2} + \frac{1}{2}. $$
Equation (15) presents an upper bound to the QSCM for this extended state. This feature demonstrates that the measure of the compound state is bounded in terms of the quantity for one copy. ii. In Equation (18) we present the QSCM for a more general extension, given by ρ^{⊗n} ⊗ 𝕀^{⊗n}. This shows that the measure of the compound state is also bounded in terms of the quantity for one copy.
$$ C(\rho^{\otimes n} \otimes \mathbb{I}^{\otimes n}) \leq n\, \frac{S(\rho) + \log N}{2 \log N}\, D(\rho, \mathbb{I}) = n\left[ \frac{C(\rho)}{2} + \frac{D(\rho, \mathbb{I})}{2} \right] \leq n\left[ \frac{C(\rho)}{2} + \frac{1}{2} \right]. $$
iii. As a last example of non-extensivity over general compound states, let us consider the extension with a pure state |ψ⟩⟨ψ|, with dim(ρ) = dim(|ψ⟩⟨ψ|):
$$ C(\rho \otimes |\psi\rangle\langle\psi|) = \frac{S(\rho)}{2 \log N}\, D(\rho \otimes |\psi\rangle\langle\psi|,\, \mathbb{I} \otimes \mathbb{I}) \leq \frac{S(\rho)}{2 \log N} \left[ D(\rho, \mathbb{I}) + \frac{N - 1}{N} \right]. $$
As discussed above, the QSCM is a measure that intends to detect changes in properties, such as changes in patterns of order and disorder. Therefore, the measure must be a continuous function of the parameters of the states responsible for its transitional characteristics. Naturally, the quantum complexity is a continuous function, since it is the product of two continuous functions. Due to continuity, it is possible to define the derivative of the quantum statistical complexity measure:
Definition 5 
(Derivative). Let us consider a physical system described by the one-parameter set of states ρ(α) ∈ D(H_N), for α ∈ ℝ. We can define the derivative with respect to α as:
$$ \frac{dC}{d\alpha} := \lim_{\varepsilon \to 0} \frac{C(\rho(\alpha + \varepsilon)) - C(\rho(\alpha))}{\varepsilon}. $$
In the same way as defined in Definition 5, it is possible to obtain higher order derivatives.
Definition 6 
(Correlation Transition). In many-particle systems, a transition of correlations occurs when a system changes from a state that has a certain order, pattern, or correlation to another state possessing a different order, pattern, or correlation.
At low temperatures, physical systems are typically ordered; as the temperature increases, they can undergo phase transitions or order–disorder transitions into less ordered states: solids lose their crystalline form in a solid–liquid transition; iron loses magnetic order if heated above the Curie point in a ferromagnetic–paramagnetic transition, etc. The description of physical systems depends on measurable quantities such as temperature, interaction strength, interaction range, orientation of an external field, etc. These quantities can be described by parameters in a suitable space. For example, let us consider a parameter α describing some physical quantity, and a one-parameter set of states ρ(α).
Phase transitions are characterized by a sharp change in the complexity of the physical system exhibiting such emergent phenomena when a suitable control parameter α exceeds a certain critical value. The study of transitions between correlations, with the objective of inferring physical properties of a system, is therefore of great interest. We know that at the phase transition point the reduced state of N particles undergoes an abrupt transition, passing from states with a certain purity to mixed reduced states. These transitions can indicate a certain type of correlation transition that can be detected. For many-particle and composite systems, a rapid change in the local degree of order–disorder can be associated with a transition in the correlation pattern. In this way, a detectable change in these parameters may indicate an alteration in the system configuration, which is considered here as a change in the pattern of order–disorder.
Quantum phase transitions are a fundamental phenomenon in condensed matter physics, and are tightly related to quantum correlations. Quantum critical points of Hamiltonian models with external magnetic fields at finite temperatures have been studied extensively. In the quantum information scenario, the quantum correlation functions used in these studies of quantum phase transitions have concerned almost exclusively concurrence and quantum discord. The behavior of quantum correlations for the Heisenberg XXZ spin-1/2 chain via negativity, information deficit, trace distance discord, and local quantum uncertainty was investigated in [48]. However, other measures of quantum correlations have also been proposed in order to detect quantum phase transitions, such as: local quantum uncertainty [49], entanglement of formation [50,51], quantum discord, and classical correlations [52]. The authors of Ref. [53] revealed a quantum phase transition in an infinite 1D XXZ chain by using concurrence and Bell inequalities. The behaviors of the quantum discord, quantum coherence, and Wigner–Yanase skew information, and the relations between the phase transitions and symmetry points in Heisenberg XXZ spin-1 chains, have been broadly investigated in Ref. [54]. The ground state properties of the one-dimensional extended Hubbard model at half filling, from the perspective of its particle reduced density matrix, were studied in [55], where the authors focused on the reduced density matrix of two fermions and performed an analysis of its quantum correlations and coherence along the different phases of the model.
In an abstract manner, a quantum state following a path through the normalized identity matrix is an example of such transitions, which may have physical meaning, as we will observe later in some examples. Let us suppose that a certain subspace of a quantum system can be interpreted as having a certain order (i.e., a degree of purity of the compound state of N particles), and that there exists a path in which this subspace passes through the identity. This path can be analyzed as containing an order–disorder transition. In order to illustrate the formalism of quantum statistical complexity in this context of order–disorder transitions, in Section 4 we apply it to two well-known quantum systems that exhibit quantum phase transitions: the 1D Quantum Ising Model (Section 4.2) and the Heisenberg XXZ spin-1/2 model (Section 4.3).

4. Examples and Applications

In this section we calculate an analytic expression for the Quantum Statistical Complexity Measure (QSCM) of one qubit, written in the Bloch basis (Section 4.1), and present the application of the QSCM to evince quantum phase transitions and correlation-ordering transitions for the 1D Quantum Ising Model (Section 4.2) and for the Heisenberg XXZ spin-1/2 model (Section 4.3).

4.1. QSCM of One-Qubit

Let us suppose we have a one-qubit state ρ, written in the Bloch basis as $\rho = \frac{1}{2}(I + \vec{r}\cdot\vec{\sigma})$. In Equation (22), we exhibit analytically the Quantum Statistical Complexity Measure C(ρ) of one qubit:
$$ C(r) = -\frac{r^2}{2}\, \mathrm{arctanh}(r) - \frac{r}{4} \log\!\left( \frac{1 - r^2}{4} \right), $$
where $\vec{r} = (x, y, z)$, $r = |\vec{r}| = \sqrt{x^2 + y^2 + z^2}$, with $0 \leq r \leq 1$, and $\vec{\sigma}$ is the vector of Pauli matrices. Normalization constants, such as log(2), are omitted in Equation (22) just for aesthetic reasons.
It is interesting to notice that the Quantum Statistical Complexity Measure of one qubit, written in the Bloch basis, is a function of r only, that is, C(r). This expression will be useful in the study of quantum phase transitions, for example in the 1D Ising Model, discussed in Section 4.2, where an analytical expression for the state of one qubit reduced from N spins, in the thermodynamic limit, will be obtained. Other useful expressions can also be obtained: for example, the trace distance between the state and the normalized identity for one qubit is also a function of r in the Bloch basis, D(r) = r/2, and therefore the entropy function can easily be written as S(r) = 2C(r)/r by using Equation (22). In addition, we exhibit analytic expressions for the first (Equation (23)) and second (Equation (24)) derivatives of the QSCM for one qubit, written in the Bloch basis. One can observe that these functions also depend only on r:
$$ \frac{dC(r)}{dr} = -r\, \mathrm{arctanh}(r) - \frac{1}{4} \log\!\left( \frac{1 - r^2}{4} \right), $$
$$ \frac{d^2 C(r)}{dr^2} = -\frac{r}{2(1 - r^2)} - \mathrm{arctanh}(r). $$
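As a consistency check (an illustrative sketch, not part of the original derivation), the closed form of Equation (22), divided by ln 2 to restore the omitted normalization, can be compared with the general definition evaluated on a one-qubit state with |r| = 0.7:

```python
import numpy as np

def qscm_general(rho):
    """QSCM from Definition 3, with entropy in bits and the 1/log N factor."""
    n = rho.shape[0]
    ev = np.linalg.eigvalsh(rho)
    nz = ev[ev > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    d = 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - np.eye(n) / n)))
    return s / np.log2(n) * d

def qscm_qubit_closed(r):
    """Closed form of Eq. (22), written with natural logarithms; dividing by
    ln(2) restores the 1/log N = 1/log 2 normalization omitted in the text."""
    c = -(r**2 / 2) * np.arctanh(r) - (r / 4) * np.log((1 - r**2) / 4)
    return c / np.log(2)

rng = np.random.default_rng(0)
v = rng.normal(size=3)
v *= 0.7 / np.linalg.norm(v)                       # Bloch vector with |r| = 0.7
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
rho = 0.5 * (np.eye(2) + v[0] * sx + v[1] * sy + v[2] * sz)
print(qscm_general(rho), qscm_qubit_closed(0.7))   # the two values agree
```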

4.2. 1 D Quantum Ising Model

The 1D Quantum Ising Model presents a quantum phase transition and, despite its simplicity, still generates a lot of interest from the research community. One of the motivations lies in the fact that spin chains are of great importance in modelling quantum computers. The Hamiltonian of the 1D Quantum Ising Model is given by:
$$ H = -J \sum_{j=1}^{N} \left( \sigma_j^x \sigma_{j+1}^x + g\, \sigma_j^z \right), $$
where { σ x , σ y , σ z } are the Pauli matrices, J is an exchange constant that sets the interaction strength between the pairs of first neighbors { j , j + 1 } , and g is a parameter that represents an external transverse field. Without loss of generality, we can set J = 1 , since it simply defines an energy scale for the Hamiltonian. The Ising model ground state can be obtained analytically by a diagonalization consisting of three steps:
  • A Jordan–Wigner transformation:
    $$ \sigma_j^z \mapsto 1 - 2\, c_j^{\dagger} c_j; $$
    where $c_j$ and $c_j^{\dagger}$ are the annihilation and creation operators, respecting the anti-commutation relations $\{c_j, c_k^{\dagger}\} = \delta_{jk}\mathbb{1}$ and $\{c_j, c_k\} = 0$;
  • A Discrete Fourier Transform (DFT):
    $$ c_j \mapsto \frac{1}{\sqrt{N}} \sum_{k=0}^{N-1} c_k\, e^{2\pi i\, k j / N}; $$
  • A Bogoliubov transformation:
    $$ c_k \mapsto \cos(\theta_k/2)\, \gamma_k - \sin(\theta_k/2)\, \gamma_{-k}^{\dagger}; $$
    where $\theta_k$ represents the basis rotation from the mode $c_k$ to the new mode representation $\gamma_k$. The angles $\theta_k$ are chosen such that the ground state of the Hamiltonian in Equation (25) is the vacuum state in the $\gamma_k$ mode representation, and they are given by $\theta_k = \arctan\!\left[ \frac{\sin(k)}{g - \cos(k)} \right]$ [56].
We can calculate the reduced density matrix of one spin by using the Bloch representation, in which all coefficients are obtained via expectation values of Pauli operators. The one-qubit state in the site j, given by ρ j ( 1 ) can be written as:
$$ \rho_j^{(1)} = \frac{I}{2} + \frac{\vec{r}_j \cdot \vec{\sigma}_j}{2}, $$
where $r_j^a = \langle \sigma_j^a \rangle$ are expectation values in the vacuum state at site j, with a = x, y, z. Note that $\langle \sigma_j^x \rangle = \langle \sigma_j^y \rangle = 0$ for all j, because these operators combine an odd number of fermionic operators. Therefore, the Bloch vector possesses only the z-component. Let us define $\beta_k = \theta_k/2$ and $\beta_{k'} = \theta_{k'}/2$. Thus, the z-component will be given by:
$$ \langle \sigma_j^z \rangle = 1 - 2 \langle c_j^{\dagger} c_j \rangle = 1 - \frac{2}{N} \sum_{k, k'} e^{i(k - k')j} \langle c_k^{\dagger} c_{k'} \rangle = 1 - \frac{2}{N} \sum_{k, k'} e^{i(k - k')j} \left\langle \left[ \cos(\beta_k)\gamma_k^{\dagger} - \sin(\beta_k)\gamma_{-k} \right] \left[ \cos(\beta_{k'})\gamma_{k'} - \sin(\beta_{k'})\gamma_{-k'}^{\dagger} \right] \right\rangle $$
$$ = 1 - \frac{2}{N} \sum_{k, k'} e^{i(k - k')j} \left[ \cos(\beta_k)\cos(\beta_{k'}) \langle \gamma_k^{\dagger}\gamma_{k'} \rangle - \cos(\beta_k)\sin(\beta_{k'}) \langle \gamma_k^{\dagger}\gamma_{-k'}^{\dagger} \rangle - \sin(\beta_k)\cos(\beta_{k'}) \langle \gamma_{-k}\gamma_{k'} \rangle + \sin(\beta_k)\sin(\beta_{k'}) \langle \gamma_{-k}\gamma_{-k'}^{\dagger} \rangle \right]. $$
In the vacuum of the $\gamma$ modes, the only non-vanishing expectation value is $\langle \gamma_{-k}\gamma_{-k'}^{\dagger} \rangle = \delta_{k,k'}$, and therefore:
$$ \langle \sigma_j^z \rangle = 1 - \frac{2}{N} \sum_{k, k'} e^{i(k - k')j} \sin(\beta_k)\sin(\beta_{k'})\, \delta_{k,k'} = 1 - \frac{2}{N} \sum_{k} \sin^2(\theta_k/2). $$
In Equation (28) we exhibit the one-qubit reduced density matrix in the Bloch basis (ρ_1):
$$ \rho_1 = \frac{I}{2} + \frac{1}{2} \left[ 1 - \frac{2}{N} \sum_{k \in K} \sin^2\!\left( \frac{\theta_k}{2} \right) \right] \sigma^z, $$
where the angle $\theta_k$ is the Bogoliubov rotation angle and the summation index runs over $k \in K$, with $K = \{ \pm\frac{\pi}{N}, \pm\frac{3\pi}{N}, \ldots, \pm(\pi - \frac{\pi}{N}) \}$. This result is independent of the spin index, as expected for translationally invariant systems.
We can now calculate the QSCM for the reduced density matrix analytically. From Equation (22), we simply identify the Bloch vector of the reduced density matrix as having only the z-component, as written in Equation (27). In the thermodynamic limit this quantity can be obtained by taking the limit of the Riemann sums, $\langle \sigma^z(g) \rangle = \lim_{N \to \infty} \left[ 1 - \frac{2}{N} \sum_{k=1}^{N} \sin^2(\theta(l)/2) \right]$, with $\theta(l) = \arctan\!\left[ \frac{\sin(l)}{g - \cos(l)} \right]$ and $l = \frac{(2k - 1)\pi}{N} - \pi$. Thus, the z-component of the Bloch vector given in Equation (27) goes to the following integral:
$$ \langle \sigma^z(g) \rangle = 1 - \frac{2}{\pi} \int_{-\pi}^{0} \sin^2\!\left( \frac{1}{2} \arctan\!\left[ \frac{\sin(\xi)}{g - \cos(\xi)} \right] \right) d\xi. $$
This integral can be solved analytically in the thermodynamic limit for some values of the transverse field parameter. For g = 0, we can easily obtain ⟨σᶻ⟩ = 0, which corresponds to a one-qubit maximally mixed reduced state ρ_1 = I/2. At g = 1, i.e., at the critical point, the integral given in Equation (29) can also be solved, and we obtain ⟨σᶻ⟩ = 2/π in the thermodynamic limit. The eigenvalues of the one-qubit reduced state ρ_1 at g = 1 can be obtained analytically as {1/2 ± 1/π}. For other values of g, the integral in Equation (29) can be written in terms of elliptic integrals of the first and second kinds [57]. By using the result of Equation (29) in Equation (22), we can thus obtain the quantum statistical complexity measure for the one-qubit reduced density matrix in the thermodynamic limit as a function of the transverse field parameter g.
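The integral of Equation (29) can also be evaluated numerically; the sketch below (illustrative, and assuming the two-argument arctangent as the branch choice for the Bogoliubov angle) recovers ⟨σᶻ⟩ = 0 at g = 0 and ⟨σᶻ⟩ = 2/π at g = 1, and feeds the result into the one-qubit closed form of Equation (22):

```python
import numpy as np
from scipy.integrate import quad

def sigma_z(g):
    """<sigma^z>(g) in the thermodynamic limit, Eq. (29); arctan2 keeps the
    Bogoliubov angle on the correct branch for g < 1 (an assumption of this sketch)."""
    integrand = lambda xi: np.sin(0.5 * np.arctan2(np.sin(xi), g - np.cos(xi)))**2
    val, _ = quad(integrand, -np.pi, 0.0)
    return 1.0 - (2.0 / np.pi) * val

def qscm_qubit(r):
    """One-qubit QSCM of Eq. (22), normalized by log 2 (base-2 convention)."""
    if r == 0.0:
        return 0.0
    return (-(r**2 / 2) * np.arctanh(r) - (r / 4) * np.log((1 - r**2) / 4)) / np.log(2)

for g in (0.0, 0.5, 1.0, 2.0):
    r = abs(sigma_z(g))          # the Bloch vector of the reduced state points along z
    print(g, r, qscm_qubit(r))   # at g = 1 the magnitude is 2/pi ~ 0.6366
```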
In Figure 1 we present the second derivative of the QSCM with respect to the transverse field parameter g, for different finite system sizes N = 4, 8, 16, 1000. We also calculated this derivative in the thermodynamic limit by using Equations (24) and (29).
It is well known that at g = 1 there is a second-order quantum phase transition [58]. By observing Figure 1, we can directly recognize a sharp behavior of the measure at the transition point.

4.3. XXZ-½ Model

Quantum spin models such as the XXZ-1/2 model can be simulated experimentally by using Rydberg-excited atomic ensembles in magnetic microtrap arrays [59], and also by low-temperature scanning tunneling microscopy [60], among many other quantum simulation experimental arrangements. Let us consider the Heisenberg XXZ spin-1/2 model defined by the following Hamiltonian:
$$ H = J \sum_{j=1}^{N} \left( S_j^x S_{j+1}^x + S_j^y S_{j+1}^y + \Delta\, S_j^z S_{j+1}^z \right) - 2h \sum_{j=1}^{N} S_j^z, $$
with periodic boundary conditions, $S_{j+N}^{\alpha} = S_j^{\alpha}$, and $S_j^{\alpha} = \frac{1}{2}\sigma_j^{\alpha}$, where $\sigma_j^{\alpha}$ are Pauli matrices, and Δ is the uni-axial parameter strength, which sets the ratio between the S^z interaction and the S^x and S^y interactions. This model interpolates continuously between the classical Ising, quantum XXX, and quantum XY models. At Δ = 0, it reduces to the quantum XY (or XX0) model, which corresponds to free fermions on a lattice. For Δ = 1 (Δ = −1), the anisotropic XXZ Hamiltonian reduces to the isotropic anti-ferromagnetic (ferromagnetic) XXX Hamiltonian. For Δ → ±∞, the model goes to an anti-ferromagnetic (ferromagnetic) Ising model.
The parameter J defines an energy scale and only its sign is important: we observe a ferromagnetic ordering along the xy plane for positive values of J and, for negative ones, the anti-ferromagnetic alignment. The uni-axial parameter strength Δ distinguishes a planar (xy) regime (when |Δ| < 1) from the axial alignment (for |Δ| > 1), cf. [61]. Thereby, it is useful to define two regimes, the Ising-like regime for |Δ| > 1 and the XY-like regime for |Δ| < 1, in order to model materials possessing, respectively, easy-axis and easy-plane magnetic anisotropies [62].
Here we are interested in quantum correlations between the nearest and next-to-nearest neighbor spins in the XXZ spin-1/2 chain with J = 1, at a temperature of 0 K, and zero external field (h = 0). The matrix elements of ϱ_{i+r} are written in terms of expectation values, namely the correlation functions for nearest neighbors, r = 1 (for ϱ_{i+1}), and for next-to-nearest neighbors, r = 2 (for ϱ_{i+2}); they are given by a set of integral equations which can be found in Appendix B or in [48,53,63,64,65]. These two-point correlation functions for the XXZ model at zero temperature and in the thermodynamic limit can be derived by using the Bethe Ansatz technique. Due to the symmetries of the Hamiltonian, the two-qubit reduced density matrix of sites i and i + r, for r = 1, 2, in the thermodynamic limit, written in the basis |1⟩ = |↑↑⟩, |2⟩ = |↑↓⟩, |3⟩ = |↓↑⟩ and |4⟩ = |↓↓⟩, where |↑⟩ and |↓⟩ are the eigenstates of the Pauli z-operator, is given in Equation (30) [48]:
$$ \varrho_{i+r} = \begin{pmatrix} \varrho_{11} & 0 & 0 & 0 \\ 0 & \varrho_{22} & \varrho_{23} & 0 \\ 0 & \varrho_{32} & \varrho_{33} & 0 \\ 0 & 0 & 0 & \varrho_{44} \end{pmatrix}, $$
where $\varrho_{11} = \frac{1 + \langle \sigma_i^z \sigma_{i+r}^z \rangle}{4}$, $\varrho_{23} = \frac{\langle \sigma_i^x \sigma_{i+r}^x \rangle}{2}$, and $\varrho_{22} = \frac{1 - \langle \sigma_i^z \sigma_{i+r}^z \rangle}{4}$, with $\varrho_{11} = \varrho_{44}$, $\varrho_{23} = \varrho_{32}$ and $\varrho_{22} = \varrho_{33}$.
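Given numerical values of the two correlators, the QSCM of this reduced state is straightforward to evaluate; the sketch below (an illustration added here, using the isotropic-point values quoted in Appendix B) assembles ϱ and applies Definition 4:

```python
import numpy as np

def rho_two_qubit(xx, zz):
    """Two-site reduced state of Eq. (30), parametrized by the correlators
    <sigma^x sigma^x> = xx and <sigma^z sigma^z> = zz (with <yy> = <xx>, h = 0)."""
    r11 = (1 + zz) / 4
    r22 = (1 - zz) / 4
    r23 = xx / 2
    return np.array([[r11, 0,   0,   0],
                     [0,   r22, r23, 0],
                     [0,   r23, r22, 0],
                     [0,   0,   0,   r11]])

def qscm(rho):
    """QSCM (Definition 4) of a reduced density matrix."""
    n = rho.shape[0]
    ev = np.linalg.eigvalsh(rho)
    nz = ev[ev > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    d = 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - np.eye(n) / n)))
    return s / np.log2(n) * d

# Isotropic point Delta = 1: both correlators equal (1 - 4 ln 2)/3 (Appendix B).
c = (1 - 4 * np.log(2)) / 3
print(qscm(rho_two_qubit(c, c)))
```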
In Figure 2 we show the QSCM, C(ϱ_{i+1}), for nearest neighbors, contrasted with the von Neumann entropy S(ϱ_{i+1}) and the trace distance D(ϱ_{i+1}, 𝕀) between ϱ_{i+1} and the normalized identity matrix, all as functions of the uni-axial parameter strength Δ. The XXZ model possesses two critical points: a first-order transition occurs at Δ = −1, and a continuous phase transition shows up at Δ = 1 [66]. An interesting feature of the QSCM is the fact that it evinces points of correlation transitions, related to order–disorder transitions, which are not necessarily connected with phase transitions. In this respect, we take note of the cusp point in Figure 2, at Δ ≃ 2.178.
In order to investigate the cusp point of C(ϱ_{i+1}) at Δ = 2.178, let us consider what happens with the state ϱ_{i+1}, given by Equation (30), as Δ varies. The state ϱ_{i+1} given in Equation (30) can be easily diagonalized; thus, let us study the matrix ϱ_{i+1} − I/4, which plays an important role in the quantum statistical complexity measure, as already discussed. This matrix has the following eigenvalues: $\left\{ \frac{1}{4}\left(2\langle\sigma_i^x\sigma_{i+1}^x\rangle - \langle\sigma_i^z\sigma_{i+1}^z\rangle\right),\; -\frac{1}{4}\left(2\langle\sigma_i^x\sigma_{i+1}^x\rangle + \langle\sigma_i^z\sigma_{i+1}^z\rangle\right),\; \frac{1}{4}\langle\sigma_i^z\sigma_{i+1}^z\rangle,\; \frac{1}{4}\langle\sigma_i^z\sigma_{i+1}^z\rangle \right\}$. As the value of Δ increases in the interval [1, 3], the correlation values in the x direction also increase while the correlations in z decrease, reaching the local minimum observed in Figure 2. In this interval, the eigenvalue $\frac{1}{4}\left(2\langle\sigma_i^x\sigma_{i+1}^x\rangle - \langle\sigma_i^z\sigma_{i+1}^z\rangle\right)$ goes through zero, and this should cause the correlation transition. The correlation transition is due to the fact that this eigenvalue vanishes for some Δ in this interval, which should imply a change of orientation of the spin correlations.
By following this reasoning, in order to determine the points at which such changes of orientation of the spin correlations occur, it is necessary to solve numerically the integral equations given by the eigenvalues of Equation (30), which are functions of the expectation values given in Refs. [48,53,63,64,65]. The objective of this procedure is to determine for which values of Δ the integral equation $2\langle\sigma_i^x\sigma_{i+1}^x\rangle - \langle\sigma_i^z\sigma_{i+1}^z\rangle = 0$ holds. Since $\langle\sigma_i^y\sigma_{i+1}^y\rangle = \langle\sigma_i^x\sigma_{i+1}^x\rangle$ for this Hamiltonian, the solution of this equation indicates the point where the planar xy-correlation decreases while the z-correlation increases, although we are already in the ferromagnetic phase. For Δ → ∞, the system moves towards a configuration that exhibits correlation only in the z-direction. Proceeding in the same way, by solving the other integral equation given by the eigenvalue set, $2\langle\sigma_i^x\sigma_{i+1}^x\rangle + \langle\sigma_i^z\sigma_{i+1}^z\rangle = 0$, we obtain a divergent solution at Δ = −1.
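Operationally, this amounts to a one-dimensional root search; a minimal sketch (assuming user-supplied callables sxx(Δ) and szz(Δ) that implement the Bethe Ansatz correlators of Appendix B, which are not reproduced here) is:

```python
from scipy.optimize import brentq

def correlation_transition_point(sxx, szz, lo=1.0, hi=3.0):
    """Root of the eigenvalue condition 2<sx sx> - <sz sz> = 0 in [lo, hi].

    sxx(delta) and szz(delta) are assumed to be callables implementing the
    nearest-neighbour Bethe Ansatz correlators of Appendix B (not provided here)."""
    return brentq(lambda d: 2.0 * sxx(d) - szz(d), lo, hi)

# Example (with hypothetical correlator implementations):
# delta_c = correlation_transition_point(sxx, szz)   # expected near 2.178
```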
In Figure 3, we call attention to a contour map of C(ϱ_{i+1}) as a function of ⟨σᵢˣσᵢ₊₁ˣ⟩ and ⟨σᵢᶻσᵢ₊₁ᶻ⟩. The triangular region represents the convex hull of positive semi-definite density matrices. The vertices of this triangle are given by (⟨σᵢˣσᵢ₊₁ˣ⟩, ⟨σᵢᶻσᵢ₊₁ᶻ⟩) = {(−1, −1); (0, 1); (1, −1)}. Along with the contour map of the QSCM as a function of the correlation functions in the x and z directions, the integral equations obtained when the two eigenvalues of ϱ_{i+1} − I/4 go to zero are also represented in Figure 3, as the two inclined straight lines (the dashed and the dash-dotted ones). The dashed inclined straight line describes the integral equation whose solution is Δ = 2.178. The dash-dotted straight line represents the curve containing the Δ = −1 solution, for which there exists a divergence point (the phase transition point).
As previously mentioned, the QSCM showed itself to be sensitive to correlation transitions. In Figure 3, the thick colored curve inside the contour map shows the path taken by C(ϱ_{i+1}) as the values of the correlations in x and z vary while Δ increases monotonically in the interval [−1, 8]. This same path was also presented in Figure 2, in the blue curve. The blue part of the thick curve represents values of the correlations for which we have a paramagnetic state, and the red part of the thick colored curve indicates the values for the ferromagnetic arrangement. Additionally, we have highlighted some points of interest on this curve: Δ = −1, ( ); Δ = 0, (×); Δ = 1, (+); Δ = 2.178, (☐); and Δ = 8, (△).
Figure 4 shows the QSCM for nearest neighbors, C(ϱ_{i+1}) (blue), and for next-to-nearest neighbors, C(ϱ_{i+2}) (orange), both in the thermodynamic limit, as functions of the uni-axial parameter strength Δ.
The asymptotic limit of both measures (r = 1, 2) is also presented in the sub-figure. As Δ → ∞, the correlation functions behave as ⟨σᵢˣσᵢ₊₁ˣ⟩ → 0 and ⟨σᵢᶻσᵢ₊₁ᶻ⟩ → −1, and the nearest-neighbor density matrix can be written as ϱ_{i+1} → diag{0, 1/2, 1/2, 0}; in this limit S(ϱ_{i+r})/log N → 1/2 and D(ϱ_{i+r}, 𝕀) → 1/2 for both cases, which makes C(ϱ_{i+r}) → 1/4 for r = 1 and for r = 2. It is interesting to notice that C(ϱ_{i+1}) = C(ϱ_{i+2}) exactly at Δ = 2.178. The QSCM for nearest neighbors is greater than that for next-to-nearest neighbors, i.e., C(ϱ_{i+1}) > C(ϱ_{i+2}), for −1 ≤ Δ < 2.178. For Δ > 2.178, C(ϱ_{i+1}) < C(ϱ_{i+2}), until both go to 1/4 for large values of Δ. This behavior can be an indication of a probable increase in complexity as the order–disorder transition occurs. Therefore, it should be expected that this measure could act as a complexity pointer. Considering that more correlations are possible if we take into account the action of second neighbors, it is understandable that an increase in complexity should occur at Δ = 2.178.

5. Conclusions

We introduced a quantum version of the statistical complexity measure, the Quantum Statistical Complexity Measure (QSCM), and displayed some of its properties. The measure has proven to be useful and physically meaningful. It possesses several of the expected properties of a bona fide complexity measure, and it may also prove useful in other areas of quantum information theory.
We presented two applications of the QSCM, investigating the physics of two exactly solvable quantum Hamiltonian models, namely the 1D Quantum Ising Model and the Heisenberg XXZ spin-1/2 chain, both in the thermodynamic limit. Firstly, we calculated the QSCM for one qubit in the Bloch basis, and determined this measure as a function of the magnitude r of the Bloch vector. We computed it in the thermodynamic limit, first by analytically calculating the measure for the one-qubit state reduced from N spins and then, in order to study the quantum phase transition of the 1D Quantum Ising Model, by taking the limit N → ∞. For the 1D Quantum Ising Model, we obtained the quantum phase transition point at g = 1. In this way, we have found that the QSCM can be used as a signaling of quantum phase transitions for this model.
Secondly, we studied the Heisenberg XXZ spin-1/2 chain, and by means of the QSCM we evinced a point at which a correlation transition occurs for this model. Physically, at Δ = 2.178 the planar xy-correlation decreases while the z-correlation increases, reaching a minimum point, although we are already in the ferromagnetic arrangement. The competition between these two different alignments of the correlations indicates an order–disorder transition to which the measure was shown to be sensitive.
We have studied the derivatives of the QSCM, and they proved to be sensitive to the quantum transition points. As a summary of this study for the Heisenberg XXZ spin-1/2 chain, we can list: (i) the Quantum Statistical Complexity Measure characterizes the first-order quantum phase transition at Δ = −1; (ii) it evinces the continuous quantum phase transition at Δ = 1; and (iii) it witnesses an order–disorder transition at Δ = 2.178, related to the alignment of the spin correlations.

Author Contributions

Conceptualization, A.T.C.; methodology, A.T.C., D.L.B.F.; software, A.T.C. and D.L.B.F.; formal analysis, A.T.C. and D.L.B.F.; validation, A.T.C., D.L.B.F., T.O.M., T.D., F.I. and R.O.V.; funding acquisition R.O.V.; Writing—original draft, A.T.C.; Writing—review/editing, A.T.C., T.D., F.I. and R.O.V.; Project administration, R.O.V.; supervision, R.O.V. and T.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Brazilian agency FAPEMIG through Reinaldo O. Vianna's project PPM-00711-18.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This work was partially supported by the Brazilian agencies Fapemig, Capes, CNPq and INCT-IQ through the project (465469/2014-0). R.O.V. also acknowledges FAPEMIG project PPM-00711-18. T.D. also acknowledges the support from the Austrian Science Fund (FWF) through the project P 31339-N27. F.I. acknowledges the financial support of the Brazilian funding agencies CNPq (Grant No. 308205/2019-7) and FAPERJ (Grants No. E-26/211.318/2019 and No. E-26/201.365/2022). T.O.M. acknowledges financial support from the Serrapilheira Institute (grant number Serra-1709-17173), and the Brazilian agencies CNPq (PQ grant No. 305420/2018-6) and FAPERJ (JCN E-26/202.701/2018).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CSCM   Classical Statistical Complexity Measure
QSCM   Quantum Statistical Complexity Measure

Appendix A. Sub-Additivity over Copies

Proposition A1. 
Given a product state ρ^{⊗n} ∈ D(H_N^{⊗n}), the QSCM is a sub-additive function:
$$ C(\rho^{\otimes n}) \leq n\, C(\rho). $$
Equality holds if we choose the disequilibrium function to be the quantum relative entropy, S(ρ‖𝕀) = log N − S(ρ), which is additive under tensor products. In this case, the measure is additive: C(ρ^{⊗n}) = n C(ρ).
Proof. 
The von Neumann entropy is additive for product states: S(ρ^{⊗n}) = n S(ρ). This means that the information contained in an uncorrelated system ρ^{⊗n} is equal to the sum of that of its constituent parts. This equality also holds for the Shannon entropy. However, the trace distance is sub-additive with respect to the tensor product: D(ρ ⊗ ρ′, 𝕀 ⊗ 𝕀) ≤ D(ρ, 𝕀) + D(ρ′, 𝕀), for all ρ, ρ′ [67]. We will prove this proposition by induction, and we will only consider states with the same dimension, i.e., dim(ρ^{⊗n}) = dim(𝕀^{⊗n}), for all n. It is easy to observe that the proposition is true for n = 1. Let us suppose now, as an induction step, that for some arbitrary n = k > 1, k ∈ ℕ, C(ρ^{⊗k}) ≤ k C(ρ). Then:
$$ C(\rho^{\otimes(k+1)}) = \frac{S(\rho^{\otimes k} \otimes \rho)}{\log(N^{k+1})}\, D(\rho^{\otimes k} \otimes \rho,\, \mathbb{I}^{\otimes(k+1)}) = \frac{(k+1)\, S(\rho)}{\log(N^{k+1})}\, D(\rho^{\otimes k} \otimes \rho,\, \mathbb{I}^{\otimes(k+1)}) = \frac{S(\rho)}{\log(N)}\, D(\rho^{\otimes k} \otimes \rho,\, \mathbb{I}^{\otimes(k+1)}). $$
Using the sub-additivity property of the trace distance, D(ρ^{⊗k}, 𝕀^{⊗k}) ≤ k D(ρ, 𝕀), and also using S(ρ^{⊗k}) = k S(ρ), we have, for an arbitrary n = k + 1:
$$ C(\rho^{\otimes(k+1)}) \leq (k+1)\, \frac{S(\rho)}{\log(N)}\, D(\rho, \mathbb{I}) = (k+1)\, C(\rho). $$
Therefore $C(\rho^{\otimes n}) \leq n\, C(\rho)$, for all n.  □

Appendix B. Correlation Functions for Nearest Neighbors and Next-to-Nearest Neighbors

The correlation functions for nearest-neighbor (r = 1) spins of the XXZ-1/2 model. The two-point correlation functions of this model at zero temperature and in the thermodynamic limit can be derived by using the Bethe Ansatz technique [66]. The spin–spin correlation functions between nearest-neighbor spin sites for Δ > 1 are given by Takahashi et al. in [63]:
$$ \langle \sigma_i^z \sigma_{i+1}^z \rangle = 1 + 2 \int_{-\infty + i/2}^{+\infty + i/2} \frac{dx}{\sinh(\pi x)} \left[ \cot(\nu x)\, \coth(\nu) - \frac{x}{\sin^2(\nu x)} \right], \qquad \langle \sigma_i^x \sigma_{i+1}^x \rangle = \int_{-\infty + i/2}^{+\infty + i/2} \frac{dx}{\sinh(\pi x)} \left[ \frac{x \cosh\nu}{\sin^2(\nu x)} - \frac{\cot(\nu x)}{\sinh\nu} \right], $$
with $\nu = \cosh^{-1}\Delta$. For Δ = 1, we have $\langle \sigma_i^x \sigma_{i+1}^x \rangle = \langle \sigma_i^z \sigma_{i+1}^z \rangle = \frac{1}{3}(1 - 4\ln 2)$, and for Δ ≤ −1, $\langle \sigma_i^z \sigma_{i+1}^z \rangle = 1$ and $\langle \sigma_i^x \sigma_{i+1}^x \rangle = 0$ (see [53]). For −1 < Δ < 1, the correlation functions are given by Kato et al. in Ref. [64]:
$$ \langle \sigma_i^z \sigma_{i+1}^z \rangle = 1 - \frac{2}{\pi^2} \int_{-\infty}^{\infty} dx\, \frac{x \cosh x}{\sinh x\, \cosh^2(\Phi x)} + \frac{2\cot(\pi\Phi)}{\pi} \int_{-\infty}^{\infty} dx\, \frac{\sinh((1 - \Phi)x)}{\sinh x\, \cosh(\Phi x)}, \qquad \langle \sigma_i^x \sigma_{i+1}^x \rangle = \frac{\cos(\pi\Phi)}{\pi^2} \int_{-\infty}^{\infty} dx\, \frac{x \cosh x}{\sinh x\, \cosh^2(\Phi x)} - \frac{1}{\pi \sin(\pi\Phi)} \int_{-\infty}^{\infty} dx\, \frac{\sinh((1 - \Phi)x)}{\sinh x\, \cosh(\Phi x)}, $$
where $\Phi = \frac{1}{\pi} \cos^{-1}\Delta$.
The correlation functions for next-to-nearest-neighbor (r = 2) spins of the XXZ-1/2 model. For the next-to-nearest-neighbor spins, in the region Δ > 1, the correlation functions (see [63]) are:
σ i x σ i + 2 x = + i / 2 + i / 2 d x sinh ( π x ) 1 2 x sin 2 ( ν x ) 3 sinh 2 ν sin 2 ( ν x ) + 1 3 cosh 2 ν + + cot ( ν x ) 3 cosh ( 2 ν ) tanh ( ν ) sin 2 ( ν x ) 4 sinh ( 2 ν ) , σ i z σ i + 2 z = 1 + + i / 2 + i / 2 d x sinh ( π x ) x sin 2 ( ν x ) 3 sinh 2 ν sin 2 ( ν x ) 1 cosh ( 2 ν ) cot ( ν x ) 3 tanh ν sin 2 ( ν x ) 4 coth ( 2 ν ) ,
where $\nu = \cosh^{-1}\Delta$; for Δ ≤ −1, $\langle \sigma_i^z \sigma_{i+1}^z \rangle = 1$ and $\langle \sigma_i^x \sigma_{i+1}^x \rangle = 0$. For −1 < Δ < 1, we have (see [65,68]):
σ i x σ i + 2 x = d x sinh x sinh ( 1 Φ ) x cosh ( Φ x ) 2 π sin ( 2 π Φ ) + 3 cos 2 π Φ tan π Φ π 3 x 2 + + cosh x ( cosh Φ x ) 2 cos 2 π Φ π 2 x + ( sin π Φ ) 2 π 4 x 3 ,
and, also:
σ i z σ i + 2 z = 1 + 4 d x sinh x sinh ( 1 Φ ) x cosh Φ x cot ( 2 π Φ ) π + 3 tan π Φ 2 π 3 x 2 4 d x sinh x cosh x ( cosh Φ x ) 2 x 2 π 2 + ( sin π Φ ) 2 2 π 4 x 3 .

References

  1. Badii, R.; Politi, A. Complexity: Hierarchical Structures and Scaling in Physics; Cambridge University Press: Cambridge, UK, 1997. [Google Scholar]
  2. Lempel, A.; Ziv, J. On the Complexity of Finite Sequences. IEEE Trans. Inf. Theory 1976, 22, 75–81. [Google Scholar] [CrossRef]
  3. Jiménez-Montaño, M.A.; Ebeling, W.; Pohl, T.; Rapp, P.E. Entropy and complexity of finite sequences as fluctuating quantities. Biosystems 2002, 64, 23–32. [Google Scholar] [CrossRef]
  4. Szczepanski, J. On the distribution function of the complexity of finite sequences. Inf. Sci. 2009, 179, 1217–1220. [Google Scholar] [CrossRef] [Green Version]
  5. Kolmogorov, A.N. Three approaches to the quantitative definition of information. Probl. Inf. Transm. 1965, 1, 1–7. [Google Scholar] [CrossRef]
  6. Chaitin, G.J. On the Length of Programs for Computing Finite Binary Sequences. J. ACM 1966, 13, 547–569. [Google Scholar] [CrossRef]
  7. Martin, M.T.; Plastino, A.; Rosso, O.A. Statistical complexity and disequilibrium. Phys. Lett. A 2003, 311, 126–132. [Google Scholar] [CrossRef]
  8. Lamberti, P.W.; Martin, M.T.; Plastino, A.; Rosso, O.A. Intensive entropic non-triviality measure. Phys. A 2004, 334, 119–131. [Google Scholar] [CrossRef]
  9. Binder, P.-M. Complexity and Fisher information. Phys. Rev. E 2000, 61, R3303–R3305. [Google Scholar] [CrossRef] [PubMed]
  10. Shiner, J.S.; Davison, M.; Landsberg, P.T. Simple measure for complexity. Phys. Rev. E 1999, 59, 1459–1464. [Google Scholar] [CrossRef]
  11. Toranzo, I.V.; Dehesa, J.S. Entropy and complexity properties of the d-dimensional blackbody radiation. Eur. Phys. J. D 2014, 68, 316. [Google Scholar] [CrossRef] [Green Version]
  12. Wackerbauer, R.; Witt, A.; Atmanspacher, H.; Kurths, J.; Scheingraber, H. A comparative classification of complexity measures. Chaos Solitons Fractals 1994, 4, 133–173. [Google Scholar] [CrossRef]
  13. Zurek, W.H. Complexity, Entropy, and the Physics of Information; Addison-Wesley Pub. Co.: Redwood City, CA, USA, 1990. [Google Scholar]
  14. Domenico, F.; Stefano, M.; Nihat, A. Canonical Divergence for Measuring Classical and Quantum Complexity. Entropy 2019, 21, 435. [Google Scholar]
  15. Felice, D.; Cafaro, C.; Mancini, S. Information geometric methods for complexity. Chaos 2018, 28, 032101. [Google Scholar] [CrossRef] [PubMed]
  16. Crutchfield, J.P.; Ellison, C.J.; Mahoney, J.R. Time’s Barbed Arrow: Irreversibility, Crypticity, and Stored Information. Phys. Rev. Lett. 2009, 103, 94101. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Riechers, P.M.; Mahoney, J.R.; Aghamohammadi, C.; Crutchfield, J.P. Minimized state complexity of quantum-encoded cryptic processes. Phys. Rev. A 2016, 93, 052317. [Google Scholar] [CrossRef] [Green Version]
  18. Gu, M.; Wiesner, K.; Rieper, E.; Vedral, V. Quantum mechanics can reduce the complexity of classical models. Nat. Commun. 2012, 3, 762–765. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  19. Yang, C.; Binder, F.C.; Narasimhachar, V.; Gu, M. Matrix Product States for Quantum Stochastic Modeling. Phys. Rev. Lett. 2018, 121, 260602. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  20. Thompson, J.; Garner, A.J.P.; Mahoney, J.R.; Crutchfield, J.P.; Vedral, V.; Gu, M. Causal Asymmetry in a Quantum World. Phys. Rev. X 2018, 8, 031013. [Google Scholar] [CrossRef] [Green Version]
  21. López-Ruiz, R.; Mancini, H.L.; Calbet, X. A statistical measure of complexity. Phys. Lett. A 1995, 209, 321–326. [Google Scholar] [CrossRef] [Green Version]
  22. Anteneodo, C.; Plastino, A.R. Some features of the López-Ruiz-Mancini-Calbet (LMC) statistical measure of complexity. Phys. Lett. A 1996, 223, 348–354. [Google Scholar] [CrossRef]
  23. Catalán, R.G.; Garay, J.; López-Ruiz, R. Features of the extension of a statistical measure of complexity to continuous systems. Phys. Rev. E 2002, 66, 011102. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  24. Rosso, O.A.; Martin, M.T.; Larrondo, H.A.; Kowalski, A.M.; Plastino, A. Generalized Statistical Complexity—A New Tool for Dynamical Systems; Bentham Science Publisher: Sharjah, United Arab Emirates, 2013. [Google Scholar]
  25. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  26. Gell-Mann, M.; Tsallis, C. Nonextensive Entropy-Interdisciplinary Applications; Oxford University Press: Oxford, UK, 2004. [Google Scholar]
  27. Rényi, A. On measures of Entropy and Information; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561. [Google Scholar]
  28. Bhattacharyya, A. On a Measure of Divergence between Two Statistical Populations Defined by Their Probability Distributions. Bull. Calcutta Math. Soc. 1943, 35, 99–109. [Google Scholar]
  29. Majtey, A.; Lamberti, P.W.; Martin, M.T.; Plastino, A. Wootters’ distance revisited: A new distinguishability criterium. Eur. Phys. J. D 2005, 32, 413–419. [Google Scholar] [CrossRef]
  30. Kullback, S.; Leibler, R.A. On Information and Sufficiency. Ann. Math. Stat. 1951, 22, 79–86. [Google Scholar] [CrossRef]
  31. Majtey, A.P.; Lamberti, P.W.; Prato, D.P. Jensen-Shannon divergence as a measure of distinguishability between mixed quantum states. Phys. Rev. A 2005, 72, 052310. [Google Scholar] [CrossRef] [Green Version]
  32. López-Ruiz, R.; Nagy, Á.; Romera, E.; Sañudo, J. A generalized statistical complexity measure: Applications to quantum systems. J. Math. Phys. 2009, 50, 123528. [Google Scholar] [CrossRef] [Green Version]
  33. Sañudo, J.; López-Ruiz, R. Statistical complexity and Fisher-Shannon information in the H-atom. Phys. Lett. A 2008, 372, 5283–5286. [Google Scholar] [CrossRef] [Green Version]
  34. Montgomery, H.E.; Sen, K.D. Statistical complexity and Fisher–Shannon information measure of H2+. Phys. Lett. A 2008, 372, 2271–2273. [Google Scholar] [CrossRef]
  35. Sen, K.D. Statistical Complexity-Applications in Electronic Structure; Springer: Dordrecht, The Netherlands, 2011. [Google Scholar]
  36. Sañudo, J.; López-Ruiz, R. Alternative evaluation of statistical indicators in atoms: The non-relativistic and relativistic cases. Phys. Lett. A 2009, 373, 2549–2551. [Google Scholar] [CrossRef] [Green Version]
  37. Moustakidis, C.C.; Chatzisavvas, K.C.; Nikolaidis, N.S.; Panos, C.P. Statistical measure of complexity of hard-sphere gas: Applications to nuclear matter. Int. J. Appl. Math. Stat. 2012, 26, 2. [Google Scholar]
38. Sánchez-Moreno, P.; Angulo, J.C.; Dehesa, J.S. A generalized complexity measure based on Rényi entropy. Eur. Phys. J. D 2014, 68, 212. [Google Scholar] [CrossRef]
  39. Calbet, X.; López-Ruiz, R. Tendency towards maximum complexity in a nonequilibrium isolated system. Phys. Rev. E 2001, 63, 066116. [Google Scholar] [CrossRef] [PubMed] [Green Version]
40. López-Ruiz, R.; Sañudo, J.; Romera, E.; Calbet, X. Statistical Complexity and Fisher-Shannon Information: Applications. In Statistical Complexity; Sen, K., Ed.; Springer: Dordrecht, The Netherlands, 2011. [Google Scholar]
  41. Müller-Lennert, M.; Dupuis, F.; Szehr, O.; Fehr, S.; Tomamichel, M. On quantum Rényi entropies: A new generalization and some properties. J. Math. Phys. 2013, 54, 122203. [Google Scholar] [CrossRef] [Green Version]
  42. Petz, D.; Virosztek, D. Some inequalities for quantum Tsallis entropy related to the strong subadditivity. Math. Inequalities Appl. 2015, 18, 555–568. [Google Scholar] [CrossRef] [Green Version]
  43. Bhatia, R. Matrix Analysis; Graduate Texts in Mathematics; Springer: New York, NY, USA, 2013. [Google Scholar]
  44. Misra, A.; Singh, U.; Bera, M.N.; Rajagopal, A.K. Quantum Rényi relative entropies affirm universality of thermodynamics. Phys. Rev. E 2015, 92, 042161. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Audenaert, K.M.R. Quantum skew divergence. J. Math. Phys. 2014, 55, 112202. [Google Scholar] [CrossRef] [Green Version]
  46. Schumacher, B.; Westmoreland, M.D. Relative entropy in quantum information theory. arXiv 2000, arXiv:quant-ph/0004045. [Google Scholar]
  47. Sugiura, S.; Shimizu, A. Canonical Thermal Pure Quantum State. Phys. Rev. Lett. 2013, 111, 010401. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Ye, B.-L.; Li, B.; Li-Jost, X.; Fei, S.-M. Quantum correlations in critical XXZ system and LMG model. Int. J. Quantum Inf. 2018, 16, 1850029. [Google Scholar] [CrossRef] [Green Version]
  49. Girolami, D.; Tufarelli, T.; Adesso, G. Characterizing Nonclassical Correlations via Local Quantum Uncertainty. Phys. Rev. Lett. 2013, 110, 240402. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  50. Werlang, T.; Trippe, C.; Ribeiro, G.A.P.; Rigolin, G. Quantum Correlations in Spin Chains at Finite Temperatures and Quantum Phase Transitions. Phys. Rev. Lett. 2010, 105, 095702. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Werlang, T.; Ribeiro, G.A.P.; Rigolin, G. Spotlighting quantum critical points via quantum correlations at finite temperatures. Phys. Rev. A 2011, 83, 062334. [Google Scholar] [CrossRef] [Green Version]
  52. Li, Y.-C.; Lin, H.-Q. Thermal quantum and classical correlations and entanglement in the XY spin model with three-spin interaction. Phys. Rev. A 2011, 83, 052323. [Google Scholar] [CrossRef]
  53. Justino, L.; de Oliveira, T.R. Bell inequalities and entanglement at quantum phase transitions in the XXZ model. Phys. Rev. A 2012, 85, 052128. [Google Scholar] [CrossRef] [Green Version]
  54. Malvezzi, A.L.; Karpat, G.; Çakmak, B.; Fanchini, F.F.; Debarba, T.; Vianna, R.O. Quantum correlations and coherence in spin-1 Heisenberg chains. Phys. Rev. B 2016, 93, 184428. [Google Scholar] [CrossRef] [Green Version]
  55. Ferreira, D.L.B.; Maciel, T.O.; Vianna, R.O.; Iemini, F. Quantum correlations, entanglement spectrum, and coherence of the two-particle reduced density matrix in the extended Hubbard model. Phys. Rev. B 2022, 105, 115145. [Google Scholar] [CrossRef]
  56. Osborne, T.J.; Nielsen, M.A. Entanglement in a simple quantum phase transition. Phys. Rev. A 2002, 66, 032110. [Google Scholar] [CrossRef] [Green Version]
  57. Pfeuty, P. The one-dimensional ising model with a transverse field. Ann. Phys. 1970, 57, 79–90. [Google Scholar] [CrossRef]
58. Damski, B.; Rams, M.M. Exact results for fidelity susceptibility of the Quantum Ising Model: The interplay between parity, system size, and magnetic field. J. Phys. A Math. Theor. 2014, 47, 025303. [Google Scholar] [CrossRef]
59. Whitlock, S.; Glaetzle, A.W.; Hannaford, P. Simulating quantum spin models using Rydberg-excited atomic ensembles in magnetic microtrap arrays. J. Phys. B At. Mol. Opt. Phys. 2017, 50, 074001. [Google Scholar] [CrossRef]
  60. Toskovic, R.; van den Berg, R.; Spinelli, A.; Eliens, I.S.; van den Toorn, B.; Bryant, B.; Caux, J.-S.; Otte, A.F. Atomic spin-chain realization of a model for quantum criticality. Nat. Phys. 2016, 12, 656–660. [Google Scholar] [CrossRef]
  61. Franchini, F. Notes on Bethe Ansatz Techniques. Available online: https://people.sissa.it/~ffranchi/BAnotes.pdf (accessed on 26 July 2022).
  62. Sarıyer, O.S. Two-dimensional quantum-spin-1/2 XXZ magnet in zero magnetic field: Global thermodynamics from renormalization group theory. Philos. Mag. 2019, 99, 1787–1824. [Google Scholar] [CrossRef] [Green Version]
  63. Takahashi, M.; Kato, G.; Shiroishi, M. Next Nearest-Neighbor Correlation Functions of the Spin-1/2 XXZ Chain at Massive Region. J. Phys. Soc. Jpn. 2004, 73, 245–253. [Google Scholar] [CrossRef] [Green Version]
64. Kato, G.; Shiroishi, M.; Takahashi, M.; Sakai, K. Third-neighbour and other four-point correlation functions of spin-1/2 XXZ chain. J. Phys. A Math. Gen. 2004, 37, 5097. [Google Scholar] [CrossRef] [Green Version]
65. Kato, G.; Shiroishi, M.; Takahashi, M.; Sakai, K. Next-nearest-neighbour correlation functions of the spin-1/2 XXZ chain at the critical region. J. Phys. A Math. Gen. 2003, 36, L337. [Google Scholar] [CrossRef] [Green Version]
  66. Shiroishi, M.; Takahashi, M. Exact Calculation of Correlation Functions for Spin-1/2 Heisenberg Chain. J. Phys. Soc. Jpn. 2005, 74, 47–52. [Google Scholar] [CrossRef]
  67. Wilde, M.M. Quantum Information Theory, 2nd ed.; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
  68. Takahashi, M. Thermodynamics of One-Dimensional Solvable Models; Cambridge University Press: Cambridge, UK, 1999. [Google Scholar]
Figure 1. Second derivative of QSCM as a function of g. The second derivative of the QSCM with respect to the transverse field parameter g, for different finite system sizes, N = 4, 6, 16, 1000, and in the thermodynamic limit N → ∞ (continuous line), for g ∈ [0, 2].
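Given tabulated QSCM values on a uniform grid of transverse fields, the curves in Figure 1 can be reproduced with a central finite difference. The sketch below is a minimal illustration only; the array c_of_g is a placeholder standing in for the actual QSCM values, not the paper's data.

import numpy as np

def second_derivative(c_of_g, h):
    # Central finite-difference estimate of d^2C/dg^2 on a uniform grid of spacing h.
    return (c_of_g[2:] - 2 * c_of_g[1:-1] + c_of_g[:-2]) / h**2

g = np.linspace(0.0, 2.0, 2001)
h = g[1] - g[0]
c_of_g = np.sin(g)                     # placeholder for the QSCM curve C(g)
d2c = second_derivative(c_of_g, h)     # aligned with the interior points g[1:-1]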
Figure 2. Comparison between measures. Quantum Statistical Complexity Measure C(ϱ_{i+1}) (blue), von Neumann entropy S(ϱ_{i+1}) (orange), and the disequilibrium function given by the trace distance D(ϱ_{i+1}, I) (yellow), as functions of Δ. All measures were calculated for the two-qubit reduced density matrix of sites i and i + 1, ϱ_{i+1}, for Δ ∈ [−1, 8], in the thermodynamic limit.
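For orientation, the three quantities compared in Figure 2 can be evaluated numerically for any two-qubit density matrix. The sketch below assumes an LMC-type product form for the complexity, namely the normalized von Neumann entropy times the trace-distance disequilibrium with respect to the maximally mixed state; the exact definition and normalization of the QSCM are given in the main text and may differ from this assumption.

import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr[rho log2 rho], computed from the eigenvalues.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]                  # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def trace_distance(rho, sigma):
    # D(rho, sigma) = (1/2) Tr|rho - sigma|.
    return float(0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def qscm(rho):
    # Assumed LMC-type form: normalized entropy times disequilibrium.
    d = rho.shape[0]
    return von_neumann_entropy(rho) / np.log2(d) * trace_distance(rho, np.eye(d) / d)

# Example: a Werner-like two-qubit state p|Psi-><Psi-| + (1 - p) I/4.
psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
p = 0.7
rho = p * np.outer(psi_minus, psi_minus) + (1 - p) * np.eye(4) / 4
print(von_neumann_entropy(rho), trace_distance(rho, np.eye(4) / 4), qscm(rho))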
Figure 3. Contour map. The contour map of the QSCM, C(ϱ_{i+1}), as a function of the correlation functions ⟨σ_i^x σ_{i+1}^x⟩ and ⟨σ_i^z σ_{i+1}^z⟩. The dashed inclined straight line represents the integral equation whose solution is Δ = 2.178, and the dash-dotted straight line represents the curve for Δ = 1, for which there is a divergence point. The indicated path inside the contour map shows the curve traced by the variation of C(ϱ_{i+1}), inside the positive semi-definite density matrix space, for Δ ∈ [−1, 8]. The highlighted points are: Δ = −1 (○); Δ = 0 (×); Δ = 1 (+); Δ = 2.178 (☐); and Δ = 8 (△). Additionally, the ferromagnetic region (red) and the paramagnetic region (blue) are also indicated along this path.
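The two-site state underlying this map can be parametrized directly by the two correlators on the axes. Assuming a translation-invariant, zero-field state with vanishing magnetization and ⟨σ^x σ^x⟩ = ⟨σ^y σ^y⟩ (as expected for the XXZ ground state in the region shown), a minimal reconstruction sketch is given below; the numerical value in the example is the well-known nearest-neighbor correlator at the isotropic point Δ = 1.

import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
id2 = np.eye(2)

def two_site_rdm(cxx, czz):
    # Two-site reduced density matrix built from cxx = <sx sx> = <sy sy> and
    # czz = <sz sz>, assuming zero magnetization and U(1) symmetry about z.
    return (np.kron(id2, id2)
            + cxx * (np.kron(sx, sx) + np.kron(sy, sy))
            + czz * np.kron(sz, sz)) / 4

# At the isotropic point Delta = 1: <sx sx> = <sy sy> = <sz sz> = (1 - 4 ln 2)/3.
c = (1 - 4 * np.log(2)) / 3
rho = two_site_rdm(c, c)
print(np.trace(rho).real, np.linalg.eigvalsh(rho))   # trace 1 and a non-negative spectrum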
Figure 4. QSCM for nearest neighbors (r = 1) and for next-to-nearest neighbors (r = 2). C(ϱ_{i+r}) for the two-qubit reduced density matrix of sites i and i + r, for r = 1 (blue) and r = 2 (orange), both in the thermodynamic limit, as a function of the uniaxial anisotropy parameter Δ. (a) Sub-figure: asymptotic behavior. The sub-figure shows the asymptotic behavior for large Δ in both cases (in fact, both C(ϱ_{i+1}) and C(ϱ_{i+2}) tend to 1/4 as Δ → ∞).
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.