Quantum Statistical Complexity Measure as a Signaling of Correlation Transitions

We introduce a quantum version of the statistical complexity measure, in the context of quantum information theory, and use it as a signaling function of quantum order–disorder transitions. We discuss the possibility for such transitions to characterize interesting physical phenomena, such as quantum phase transitions or abrupt variations in correlation distributions. We apply our measure to two exactly solvable Hamiltonian models: the 1D-Quantum Ising Model (in the single-particle reduced state) and the Heisenberg XXZ spin-1/2 chain (in the two-particle reduced state). We analyze its behavior across quantum phase transitions for finite system sizes, as well as in the thermodynamic limit, by using the Bethe Ansatz technique.


Introduction
Let us consider a physical, chemical, or biological process such as, for example, a change in the temperature of water, the mixing of two solutions, or the formation of a neural network. It is intuitive to believe that if one could classify all the possible configurations of such systems, described by their ordering and disordering patterns, it would be possible to characterize and control them. For example, during the process of changing the temperature of water, knowing the pattern of ordering and disordering of its molecular structure would make it possible to characterize and completely control its phase transitions.
In information theory, the ability to identify certain patterns of order and disorder in probability distributions enables us to control the creation, transmission, and measurement of information. In this way, the characterization and quantification of the complexity contained in physical systems and their constituent parts is a crucial goal for information theory [1]. One point of consensus in the literature about complexity is that no formal definition of this term exists. Intuition suggests that systems which can be described as "not complex" are readily comprehended: they can be described concisely by means of a few parameters or variables, and their information content is low. Complexity quantifiers must satisfy some properties: (i.) assign a minimum value (possibly zero) to the opposite extremes of order and disorder; (ii.) be sensitive to transitions between order-disorder patterns; (iii.) be computable. There are many ways to define measures of the degree of complexity of physical systems. Among such definitions, we can mention measures based on data compression algorithms for finite-size sequences [2][3][4], Kolmogorov or Chaitin measures based on the size of the smallest algorithm that can reproduce a particular type of pattern [5,6], and measures from classical information theory [7][8][9][10][11][12][13]. Based on recent progress in defining a general canonical divergence within Information Geometry, a canonical divergence measure was presented with the objective of quantifying complexity for both classical and quantum systems. In the classical realm, it was proven that this divergence coincides with the classical Kullback-Leibler divergence, and in the quantum domain it reduces to the quantum relative entropy [14,15].
The statistical complexity relates the simplicity of a probability distribution to the amount of resources needed to store its information [16,17]. Similarly, in the quantum realm, the complexity of a given density matrix can be translated as the resources needed to create, operate on, or measure the quantum state of the system [18][19][20]. On the other hand, the quantum information meaning of complexity could play an important role in the quantification of transitions between order and disorder patterns, which may indicate some quantum physical phenomenon, such as a quantum phase transition.
Regarding the complexity contained in systems, some of the simplest models in physics are the ideal gas and the perfect crystal. In an ideal gas, the system can be found with the same probability in any of the available micro-states; therefore, each state contributes equally to the information. On the other hand, in a perfect crystal, the symmetry rules restrict the accessible states of the system to only one very symmetric state. These simple models are extreme cases of minimum complexity on a scale of order and disorder; therefore, there might exist some intermediate state which attains the maximum complexity value on that scale [21].
The main goal of this work is to introduce a quantum version of the statistical complexity measure, based on the physical meaning of the characterization and quantification of transitions between order-disorder patterns of quantum systems. As a by-product, this measure could be applied in the study of quantum phase transitions. Physical properties of systems across a quantum phase transition are dramatically altered, and in this way, it is interesting to understand how the complexity of a system would behave under such transitions. In our analysis, we studied the single particle reduced state of the parametric 1D-Ising Model and the two-particle reduced state of Heisenberg XXZ spin-1/2 Model.
The manuscript is organized as follows: In Section 2, we introduce the Statistical Measure of Complexity defined by López-Ruiz et al. in [21], and in Section 3, we introduce a quantum counterpart of this measure: the Quantum Statistical Complexity Measure. We present some properties of this measure (Section 4), and also exhibit a closed expression of this measure for one qubit, written in the Bloch basis (Section 4.1). We then discuss two interesting examples and applications: the 1D-Quantum Ising Model (Section 4.2), for which we compute the Quantum Statistical Complexity Measure of the one-qubit state reduced from N spins, in the thermodynamic limit, with the objective of determining the quantum phase transition point; and the Heisenberg XXZ spin-1/2 model with h = 0 (Section 4.3), for which we determine the first-order quantum transition point and the continuous quantum phase transition by means of the Quantum Statistical Complexity Measure of the two-qubit reduced state of nearest neighbors, also in the thermodynamic limit. Finally, we give concluding remarks in Section 5.

Classical Statistical Complexity Measure-CSCM
Consider a system possessing $N$ accessible states $\{x_1, x_2, \cdots, x_N\}$, when observed on a particular scale, with each state having an intrinsic probability, collected in the vector $\vec{p} = \{p_i\}_{i=1}^{N}$. As discussed before, a candidate function to quantify the complexity of a probability distribution associated with a physical system must attribute the value zero to systems with the maximum degree of order, that is, to pure distributions, $\vec{p} = \{p_i = 1,\; p_{j \neq i} = 0\}$, and must also assign zero to disordered systems, which are characterized by the independent and identically distributed (i.i.d.) vector $\vec{I} = \{I_i = 1/N\}$, for all $i = 1, \ldots, N$. Let us address the cases of ordered (Section 2.1) and disordered (Section 2.2) systems separately.

Degree of Order
A physical system possessing the maximum degree of order can be regarded as a system with a symmetry of all of its elements. The probability distribution that describes such a system is a pure vector, which means the system has only one possible configuration. Physically, this is the case for a gas at zero Kelvin, or for a perfect crystal, where the symmetry rules restrict the accessible state to a very symmetric one. In order to quantify the degree of order of a given system, the function must assign a maximum value to pure probability distributions and zero to equiprobable configurations. A function capable of quantifying such a degree of order is the $l_1$-distance between the probability distribution and the independent and identically distributed (i.i.d.) vector:

$$D(\vec{p}) = \|\vec{p} - \vec{I}\|_1 = \sum_{i=1}^{N} |p_i - I_i|, \qquad (1)$$

where $\vec{I}$ is the (i.i.d.) vector, i.e., the vector with elements $I_i = 1/N$ for all $i$. This function plays the role of the disequilibrium function and quantifies the degree of order of a probability vector: it is the sum of the absolute values of the elements of the vector $\vec{p} - \vec{I}$. The disequilibrium measure $D$ is zero for maximally disordered systems and maximal for maximally ordered systems.

Degree of Disorder
In contrast with an ordered system, a system possessing the maximum degree of disorder is described by an equiprobable distribution. This means an equal probability of occurrence for any of its configurations, as in a fair dice game or in the partition function of an isolated ideal gas. From a statistical point of view, the probability vector that describes this feature is the independent and identically distributed (i.i.d.) vector $\vec{I}$, as defined above. One can define the degree of disorder of a system as a function which assigns the value zero to pure probability distributions (associated with maximally ordered systems), and a maximum value to the i.i.d. distribution. A well-known function capable of quantifying the degree of disorder of a probability vector is the Shannon entropy:

$$H(\vec{p}) = -\sum_{i=1}^{N} p_i \log p_i. \qquad (2)$$

In this way, the Shannon entropy $H(\vec{p})$ assigns zero to maximally ordered systems and a maximal value, equal to $\log N$, to i.i.d. vectors. The log function is usually taken in base 2, in order to quantify the amount of disorder in bits.

Quantifying Classical Complexity
Using the maximally ordered and maximally disordered states and Equations (1) and (2), respectively, López-Ruiz et al. in [21] defined a classical statistical measure of complexity constructed as the product of these order-disorder quantifiers. There are intermediate states of order-disorder that may exhibit interesting physical properties and can be associated with complex behavior; in this sense, the measure quantifies the amount of complexity of a physical system through these intermediate states [21].

Definition 1 (Classical Statistical Complexity Measure-(CSCM) [21]). Let us consider a probability vector $\vec{p} = \{p_i\}_{i=1}^{N}$, with $\dim(\vec{p}) = N$, associated with a random variable $X$, representing all possible states of a system. The function $C(\vec{p})$ is a measure of the system's complexity and can be defined as:

$$C(\vec{p}) = H(\vec{p})\, D(\vec{p}). \qquad (3)$$

The function $C(\vec{p})$ vanishes for "simple systems", such as the ideal gas model or a crystal, and it should reach a maximum value for some intermediate state.
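For concreteness, Definition 1 can be sketched numerically. This is a minimal illustration, with entropy taken in base 2; the function names (`shannon_entropy`, `disequilibrium`, `cscm`) are ours, not from the text:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits; terms with p_i = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def disequilibrium(p):
    """l1-distance D(p) between p and the uniform (i.i.d.) vector I_i = 1/N."""
    p = np.asarray(p, dtype=float)
    return float(np.sum(np.abs(p - 1.0 / p.size)))

def cscm(p):
    """Classical statistical complexity C(p) = H(p) * D(p)."""
    return shannon_entropy(p) * disequilibrium(p)
```

As required, C vanishes at both extremes: for a pure distribution H = 0, and for the uniform distribution D = 0; an intermediate distribution such as (1/2, 1/2, 0, 0) yields a strictly positive value.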
Definition 2 (Classical Statistical Complexity Measure of Joint and Marginal Probability Distributions). Given a known joint distribution of two discrete random variables $X$ and $Y$, $p_{XY}(x_i, y_j)$, of dimension $N$, one can define the CSCM of the joint distribution, $C_{XY}(\vec{p}_{XY})$, by Equation (4):

$$C_{XY}(\vec{p}_{XY}) = H(\vec{p}_{XY})\, D(\vec{p}_{XY}), \qquad (4)$$

and, in the same way, we define the CSCM of the two marginal distributions, $C_X(\vec{p}_X)$, by Equation (5), and $C_Y(\vec{p}_Y)$, by Equation (6):

$$C_X(\vec{p}_X) = H(\vec{p}_X)\, D(\vec{p}_X), \qquad (5)$$

$$C_Y(\vec{p}_Y) = H(\vec{p}_Y)\, D(\vec{p}_Y), \qquad (6)$$

where $\vec{p}_X$ and $\vec{p}_Y$ are the marginal probability distributions, given by:

$$p_X(x_i) = \sum_j p_{XY}(x_i, y_j), \qquad p_Y(y_j) = \sum_i p_{XY}(x_i, y_j). \qquad (7)$$

Generalizations of the measures defined above to continuous variables are straightforward (cf. Refs. [22,23]). In the same way, we can also generalize Definition 2 to deal with conditional probability densities such as $p(x|y)$ or $p(y|x)$, which may be useful in other, more general contexts.
The Classical Statistical Complexity Measure depends on the nature of the description associated with a system and on the scale of observation [24]. This function, generalized as a functional of a probability distribution, is related to time series generated by classical dynamical systems [24]. Two ingredients are fundamental to defining such a quantity. The first is an entropy function, which quantifies the information contained in a system; it could also be Tsallis' entropy [25], escort-Tsallis entropy [26], or Rényi entropy [27]. The second is a distance function on the space of probabilities, which indicates the disequilibrium relative to a fixed distribution (in this case, the distance to the i.i.d. vector). For this purpose, we can use the Euclidean distance (or some other p-norm), the Bhattacharyya distance [28], or Wootters' distance [29]. We can also apply a statistical measure of divergence, for example the classical relative entropy [30], the Hellinger distance, or the Jensen-Shannon divergence [31]. We note that other generalized versions of complexity measures have appeared in recent years, and these functions have proven useful in several branches of classical information theory [7,[32][33][34][35][36][37][38][39][40].

Quantifying Quantum Complexity
The quantum version of the statistical complexity measure quantifies the amount of complexity contained in a quantum system on an order-disorder scale. For quantum systems, the probability distribution is replaced by a density matrix (positive semi-definite and of unit trace). As in the classical case, the extreme cases of order and disorder are, respectively, the pure quantum states $|\psi\rangle\langle\psi|$ and the maximally mixed state $\mathbb{I} := I/N$, where $N$ is the dimension of the Hilbert space. Additionally, in analogy with the description of classical probability distributions, the quantifier of quantum statistical complexity must be zero for the maximum degrees of order and disorder. One can define the Quantum Statistical Complexity Measure (QSCM) as the product of an order quantifier and a disorder quantifier: a quantum entropy, which measures the amount of disorder of a quantum system, and a pairwise distinguishability measure of quantum states, which plays the role of the disequilibrium function. A natural function to measure the amount of disorder of a quantum system is the von Neumann entropy,

$$S(\rho) = -\operatorname{Tr}[\rho \log \rho],$$

where $\rho$ is the density matrix of the system. The trace distance between $\rho$ and the maximally mixed state quantifies the degree of order:

$$D(\rho, \mathbb{I}) = \frac{1}{2}\operatorname{Tr}|\rho - \mathbb{I}|, \qquad |A| = \sqrt{A^{\dagger}A}.$$

For our purposes here, we define the Quantum Statistical Complexity Measure (QSCM) in Definition 4 by means of the trace distance between the reduced quantum state $\rho$ and $\mathbb{I}$ acting as the reference state. The trace distance was chosen in both Definition 3 and Definition 4 because it quantifies the maximal distinguishability of quantum states and is monotonic under stochastic quantum operations.

Definition 3 (Quantum Statistical Complexity Measure-(QSCM)). Let $\rho \in D(\mathcal{H}_N)$ be a quantum state over an $N$-dimensional Hilbert space. Then we can define the Quantum Statistical Complexity Measure as the following functional of $\rho$:

$$C(\rho) = S(\rho)\, D(\rho, \mathbb{I}),$$

where $S(\rho)$ is the von Neumann entropy, and $D(\rho, \mathbb{I})$ is a distinguishability quantity between the state $\rho$ and the normalized maximally mixed state $\mathbb{I}$, defined in the suitable space.
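The QSCM of Definition 3 is easy to evaluate from the spectrum of $\rho$, since both $S$ and the trace distance to $\mathbb{I}$ depend only on the eigenvalues. A minimal numerical sketch (our own function names, base-2 logarithms):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    return float(-np.sum(nz * np.log2(nz)))

def trace_distance_to_mixed(rho):
    """D(rho, I/N) = (1/2) Tr|rho - I/N|; I/N commutes with rho, so the
    eigenvalues of the difference are just lambda_i - 1/N."""
    n = rho.shape[0]
    evals = np.linalg.eigvalsh(rho)
    return float(0.5 * np.sum(np.abs(evals - 1.0 / n)))

def qscm(rho):
    """Quantum statistical complexity C(rho) = S(rho) * D(rho, I/N)."""
    return von_neumann_entropy(rho) * trace_distance_to_mixed(rho)
```

As required, the measure vanishes for pure states (S = 0) and for the maximally mixed state (D = 0), and is strictly positive in between.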

Definition 4 (Quantum Statistical Complexity Measure of the Reduced Density Matrix).
Let $\rho_{SE} \in D(\mathcal{H}_{NM})$ be a global state of dimension $NM$, i.e., a compound state of a system and its environment, and let $\rho = \rho_S = \operatorname{Tr}_E[\rho_{SE}] \in D(\mathcal{H}_N)$ (of dimension $N$) be the reduced state of this compound state, where $\operatorname{Tr}_E$ is the partial trace over the environment. Then we can define the Quantum Statistical Complexity Measure of the Reduced Density Matrix $C(\rho_S)$ as:

$$C(\rho_S) = S(\rho_S)\, D(\rho_S, \mathbb{I}),$$

where $S(\rho_S)$ is the von Neumann entropy of the quantum reduced state $\rho_S = \operatorname{Tr}_E[\rho_{SE}]$, and $D(\rho_S, \mathbb{I})$ is a distinguishability quantity between the quantum reduced state $\rho_S$ and the normalized maximally mixed state $\mathbb{I}$, defined in the suitable space.
In this work, we will use Definition 4 (Quantum Statistical Complexity Measure of the Reduced Density Matrix) as the Quantum Statistical Complexity Measure (QSCM). The reason for this choice is that we will study quantum phase transitions: for the 1D-Ising Model (Section 4.2), we will apply the QSCM to the one-qubit state reduced from N spins, in the thermodynamic limit; and for the Heisenberg XXZ spin-1/2 Model (Section 4.3), we will apply the QSCM to the two-qubit state reduced from N spins, also in the thermodynamic limit. Definition 4 is the quantum analogue of Definition 2, i.e., it is the quantum counterpart of the CSCM defined by means of marginal probability distributions. Extensions of these measures to continuous variables are straightforward.
In analogy with the classical counterpart, in the definition of the quantum statistical complexity measure there is a carte blanche in choosing the quantum entropy function, such as the quantum Rényi entropy [41] or the quantum Tsallis entropy [42], among many other functions. Similarly, we can choose other disequilibrium functions as measures of distinguishability of quantum states: a Schatten p-norm [43], a quantum Rényi relative entropy [44], the quantum skew divergence [45], or a quantum relative entropy [46]. Another feature that might generalize the quantities defined in Definitions 3 and 4 is to take a more general quantum state ρ* as the reference state (rather than the normalized maximally mixed state 𝕀) in the disequilibrium function. This choice must be guided by some physical symmetry or interest. Some obvious candidates are the thermal mixed quantum state and the canonical thermal pure quantum state [47].

Some Properties of the QSCM
To complete our introduction of the quantifier of quantum statistical complexity, we require some properties that guarantee a bona fide information quantifier. The amount of order-disorder, as measured by the QSCM, must be invariant under local unitary operations, because it is related to the purity of the quantum reduced states.
Proposition 1 (Local Unitary Invariance). The Quantum Statistical Complexity Measure is invariant under local unitary transformations applied to the quantum reduced state of the system-environment compound. Let $\rho = \rho_S \in D(\mathcal{H}_S)$ be the quantum reduced state of dimension $N$, and let $\rho_{SE} \in D(\mathcal{H}_{SE})$ be the compound system-environment state, of dimension $NM$. Then:

$$C\!\left(\operatorname{Tr}_E\!\left[(U_S \otimes I_E)\, \rho_{SE}\, (U_S \otimes I_E)^{\dagger}\right]\right) = C(\rho_S),$$

where $\rho_S = \operatorname{Tr}_E[\rho_{SE}]$, $U_S$ is a local unitary transformation acting on $D(\mathcal{H}_S)$, and $\operatorname{Tr}_E$ is the partial trace over the environment $E$. The extension of this property to the global state $\rho_{SE}$ is trivial.
This statement comes directly from the invariance under local unitary transformation of von Neumann entropy and trace distance applied on the quantum reduced state ρ S .
Another important property regards the use of copies of the system in some experimental contexts. Let us consider an experiment in which the experimentalist must quantify the QSCM of a given state ρ by means of a certain number n of copies, ρ⊗n; the QSCM of the copies should then be bounded in terms of the single-copy quantity.
Proposition 2 (Sub-additivity over copies). Given a product state $\rho^{\otimes n}$, with $\dim(\rho) = N$, the QSCM is a sub-additive function over copies:

$$C\!\left(\rho^{\otimes n}\right) \leq n^2\, C(\rho).$$

Indeed, this is an expected property for a measure of information, since the regularized number of bits of information gained from a given system cannot increase just by considering more copies of the same system. The proof of Proposition 2 is in Appendix A; it follows from the additivity of the von Neumann entropy and the sub-additivity of the trace distance.
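A quick numerical sanity check of the copy bound is shown below. This is a sketch under our reading of the proposition: since the von Neumann entropy is additive and the trace distance is sub-additive over tensor products, $C(\rho^{\otimes n})$ is bounded by $n^2 C(\rho)$; we verify this for $n = 2$ on a mixed qubit:

```python
import numpy as np

def qscm(rho):
    """C(rho) = S(rho) * D(rho, I/N), base-2 entropy and trace distance."""
    n = rho.shape[0]
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    d = 0.5 * np.sum(np.abs(evals - 1.0 / n))
    return float(s * d)

# A mixed one-qubit state and its n = 2 copies.
rho = np.diag([0.75, 0.25])
rho2 = np.kron(rho, rho)

# Bound from additivity of S and sub-additivity of the trace distance:
# C(rho ⊗ rho) <= 2^2 * C(rho).
assert qscm(rho2) <= 4 * qscm(rho)
```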
It is important to notice that the quantum statistical complexity is not sub-additive over general extensions with quantum states. For example:
i. Extensions with maximally mixed states: if a given state ρ is extended with a maximally mixed state 𝕀 = I/N, Equation (15) presents an upper bound to the QSCM of this extended state. This feature shows that the measure of the compound state is bounded by the single-copy quantity.
ii. In Equation (18) we present the QSCM for a more general extension, given by ρ⊗n ⊗ 𝕀⊗n. This shows that the measure of the compound state is also bounded by the single-copy quantity.
iii. As a last example of non-extensivity over general compound states, let us consider the extension with a pure state |ψ⟩⟨ψ|, with dim(ρ) = dim(|ψ⟩⟨ψ|).
As discussed above, the QSCM is a measure that intends to detect changes in properties, for example changes in patterns of order and disorder. Therefore, the measure must be a continuous function of the parameters of the states responsible for its transitional characteristics. The quantum complexity is naturally continuous, since it is the product of two continuous functions. Due to continuity, it is possible to define the derivative of the quantum statistical complexity measure:

Definition 5 (Derivative). Let us consider a physical system described by the one-parameter family of states $\rho(\alpha) \in D(\mathcal{H}_N)$, for $\alpha \in \mathbb{R}$. We define the derivative with respect to $\alpha$ as:

$$\frac{\partial C(\rho(\alpha))}{\partial \alpha} = \frac{\partial S(\rho(\alpha))}{\partial \alpha}\, D(\rho(\alpha), \mathbb{I}) + S(\rho(\alpha))\, \frac{\partial D(\rho(\alpha), \mathbb{I})}{\partial \alpha}.$$

In the same way, it is possible to obtain higher-order derivatives.
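In practice, the derivative of Definition 5 can be approximated by finite differences. A sketch for an illustrative one-parameter qubit family $\rho(\alpha) = \tfrac{1}{2}(I + \alpha\,\sigma_z)$ (both the family and the step size are our own choices, not from the text):

```python
import numpy as np

def qscm(rho):
    """C(rho) = S(rho) * D(rho, I/N), base-2 entropy and trace distance."""
    n = rho.shape[0]
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    return float(s * 0.5 * np.sum(np.abs(evals - 1.0 / n)))

def rho_family(alpha):
    """One-parameter qubit family rho(alpha) = (I + alpha * sigma_z) / 2."""
    return np.diag([(1 + alpha) / 2, (1 - alpha) / 2])

def dC_dalpha(alpha, h=1e-5):
    """Central finite-difference approximation of the derivative in Definition 5."""
    return (qscm(rho_family(alpha + h)) - qscm(rho_family(alpha - h))) / (2 * h)
```

For this diagonal family, α coincides with the Bloch radius r, so the finite difference can be checked against the analytic derivative of C(r) = (r/2) S(r).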
Definition 6 (Correlation Transition). In many-particle systems, a transition of correlations occurs when a system changes from a state that has a certain order, pattern, or correlation to another state possessing a different order, pattern, or correlation.
At low temperatures, physical systems are typically ordered; as the temperature increases, they can undergo phase transitions, or order-disorder transitions, into less ordered states: solids lose their crystalline form in a solid-liquid transition; iron loses magnetic order if heated above the Curie point in a ferromagnetic-paramagnetic transition; etc. The description of physical systems depends on measurable quantities such as temperature, interaction strength, interaction range, the orientation of an external field, etc. These quantities can be described by parameters in a suitable space. For example, let us consider a parameter α describing some physical quantity, and a one-parameter family of states ρ(α).
Phase transitions are characterized by a sharp change in the complexity of the physical system that exhibits such emergent phenomena when a suitable control parameter α exceeds a certain critical value. The study of transitions between correlations, with the objective of inferring physical properties of a system, is therefore of great interest. We know that at the phase transition point the reduced state of the N-particle system changes abruptly, passing from states with a certain purity to mixed reduced states. These transitions can indicate a certain type of correlation transition that can be detected. For many-particle and composite systems, a quick change in the local order-disorder degree can be associated with a transition in the correlation pattern. In this way, a detectable change in these parameters may indicate an alteration in the system's configuration, which is considered here as a change in the pattern of order-disorder.
Quantum phase transitions are a fundamental phenomenon in condensed matter physics, tightly related to quantum correlations. Quantum critical points for Hamiltonian models with external magnetic fields at finite temperatures have been studied extensively. In the quantum information scenario, the quantum correlation functions used in these studies of quantum phase transitions were almost exclusively concurrence and quantum discord. The behavior of quantum correlations for the Heisenberg XXZ spin-1/2 chain via negativity, information deficit, trace distance discord, and local quantum uncertainty was investigated in [48]. However, other measures of quantum correlations have also been proposed in order to detect quantum phase transitions, such as: local quantum uncertainty [49], entanglement of formation [50,51], quantum discord, and classical correlations [52]. The authors of Ref. [53] revealed a quantum phase transition in an infinite 1D-XXZ chain by using concurrence and Bell inequalities. The behaviors of the quantum discord, quantum coherence, and Wigner-Yanase skew information, and the relations between the phase transitions and symmetry points in Heisenberg XXZ spin-1 chains, have been broadly investigated in Ref. [54]. The ground state properties of the one-dimensional extended Hubbard model at half filling were studied from the perspective of its particle reduced density matrix in [55], where the authors focused on the reduced density matrix of two fermions and analyzed its quantum correlations and coherence along the different phases of the model.
In an abstract manner, a quantum state undergoing a path through the maximally mixed state is an example of such transitions, which may have physical meaning, as we will observe later in some examples. Let us suppose that a certain subspace of a quantum system can be interpreted as having a certain order (i.e., a degree of purity of the compound state of N particles), and that there exists a path in which this subspace passes through the identity. This path can be analyzed as exhibiting an order-disorder transition. In order to illustrate the formalism of quantum statistical complexity in this context of order-disorder transitions, in Section 4 we apply it to two well-known quantum systems that exhibit quantum phase transitions: the 1D-Quantum Ising Model (Section 4.2) and the Heisenberg XXZ spin-1/2 model (Section 4.3).

Examples and Applications
In this section, we calculate an analytic expression for the Quantum Statistical Complexity Measure (QSCM) of one qubit, written in the Bloch basis (Section 4.1), and apply the QSCM to evince quantum phase transitions and correlation-ordering transitions for the 1D-Quantum Ising Model (Section 4.2) and for the Heisenberg XXZ spin-1/2 model (Section 4.3).

QSCM of One-Qubit
Let us suppose we have a one-qubit state $\rho$ written in the Bloch basis, $\rho = \frac{1}{2}(I + \vec{r}\cdot\vec{\sigma})$, where $\vec{r} = (x, y, z)$, $r = |\vec{r}| = \sqrt{x^2 + y^2 + z^2}$, with $0 \leq r \leq 1$, and $\vec{\sigma}$ is the vector of Pauli matrices. In Equation (22), we exhibit the Quantum Statistical Complexity Measure $C(\rho)$ of one qubit analytically:

$$C(\rho) = \frac{r}{2}\left[-\frac{1+r}{2}\log\!\left(\frac{1+r}{2}\right) - \frac{1-r}{2}\log\!\left(\frac{1-r}{2}\right)\right]. \qquad (22)$$

Normalization constants, such as $\log(2)$, are omitted in Equation (22) for aesthetic reasons.
It is interesting to notice that the Quantum Statistical Complexity Measure of one qubit, written in the Bloch basis, $C(\rho)$, is a function of $r$ alone, that is, $C(r)$. This expression will be useful in the study of quantum phase transitions, for example in the 1D-Ising Model, discussed in Section 4.2, where an analytical expression for the one-qubit state reduced from N spins, in the thermodynamic limit, will be obtained.
Other useful expressions can be obtained. For example, the trace distance between the state and the normalized identity for one qubit is also a function of $r$ in the Bloch basis, $D(r) = r/2$, and therefore the entropy function can be easily written as $S(r) = 2C(r)/r$ by using Equation (22). In addition, we exhibit analytic expressions for the first (Equation (23)) and the second (Equation (24)) derivatives of the QSCM for one qubit, written in the Bloch basis. One can observe that these functions also depend only on $r$:

$$\frac{dC(r)}{dr} = \frac{1}{2}\,S(r) + \frac{r}{2}\,\frac{dS(r)}{dr}, \qquad (23)$$

$$\frac{d^2C(r)}{dr^2} = \frac{dS(r)}{dr} + \frac{r}{2}\,\frac{d^2S(r)}{dr^2}. \qquad (24)$$
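The closed expression for one qubit can be checked against a direct matrix computation. A sketch (the Bloch vector below is an arbitrary illustrative choice; logarithms are base 2):

```python
import numpy as np

# Pauli matrices
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_state(x, y, z):
    """One-qubit state rho = (I + r . sigma) / 2 in the Bloch representation."""
    return 0.5 * (np.eye(2, dtype=complex) + x * SX + y * SY + z * SZ)

def qscm_matrix(rho):
    """Generic C(rho) = S(rho) * D(rho, I/2) from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    nz = evals[evals > 1e-12]
    s = -np.sum(nz * np.log2(nz))
    return float(s * 0.5 * np.sum(np.abs(evals - 0.5)))

def qscm_closed(r):
    """Closed form C(r) = (r/2) * S(r), depending only on the Bloch radius r."""
    if r == 0:
        return 0.0
    p, q = (1 + r) / 2, (1 - r) / 2
    s = -(p * np.log2(p) + (q * np.log2(q) if q > 0 else 0.0))
    return (r / 2) * s
```

Both routes agree for any direction of the Bloch vector, confirming that C depends only on r.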

1D Quantum Ising Model
The 1D-Quantum Ising Model presents a quantum phase transition and, despite its simplicity, still generates a lot of interest from the research community. One of the motivations lies in the fact that spin chains are of great importance in modelling quantum computers. The Hamiltonian of the 1D-Quantum Ising Model is given by:

$$H = -J\sum_{j=1}^{N}\left(\sigma^x_j\,\sigma^x_{j+1} + g\,\sigma^z_j\right), \qquad (25)$$

where $\{\sigma^x, \sigma^y, \sigma^z\}$ are the Pauli matrices, $J$ is an exchange constant that sets the interaction strength between the pairs of first neighbors $\{j, j+1\}$, and $g$ is a parameter that represents an external transverse field. Without loss of generality, we can set $J = 1$, since it simply defines an energy scale for the Hamiltonian. The Ising model ground state can be obtained analytically by a diagonalization consisting of three steps:
• A Jordan-Wigner transformation, mapping spins to spinless fermions, where $c_j$ and $c^{\dagger}_j$ are the annihilation-creation operators, respecting the anti-commutation relations $\{c_j, c^{\dagger}_k\} = \delta_{jk} I$ and $\{c_j, c_k\} = 0$;
• A Discrete Fourier Transform (DFT): $c_j = \frac{1}{\sqrt{N}}\sum_{k} c_k\, e^{2\pi i (kj)/N}$;
• A Bogoliubov transformation, where $\theta_k$ represents the basis rotation from the mode $c_k$ to the new mode representation $\gamma_k$. The angles $\theta_k$ are chosen such that the ground state of the Hamiltonian in Equation (25) is the vacuum state in the $\gamma_k$ mode representation, and they are given by $\theta_k = \arctan\!\left[\frac{\sin(k)}{g - \cos(k)}\right]$ [56].
We can calculate the reduced density matrix of one spin by using the Bloch representation, in which all coefficients are obtained via expectation values of Pauli operators.
The one-qubit state at site $j$, $\rho^{(1)}_j$, can be written as

$$\rho^{(1)}_j = \frac{1}{2}\left(I + \sum_{a} r^a_j\, \sigma^a\right),$$

where $r^a_j = \langle\sigma^a_j\rangle$ are expectation values in the ground (vacuum) state at site $j$, and $a = x, y, z$. Note that $\langle\sigma^x_j\rangle = \langle\sigma^y_j\rangle = 0$, $\forall j$, because these operators combine an odd number of fermions. Therefore, the Bloch vector possesses only the z-component, which is given by:

$$\langle\sigma^z_j\rangle = 1 - \frac{2}{N}\sum_{k\in K}\sin^2\!\left(\frac{\theta_k}{2}\right). \qquad (27)$$

In Equation (28) we exhibit the one-qubit reduced density matrix in the Bloch basis:

$$\rho_1 = \frac{1}{2}\left(I + \langle\sigma^z\rangle\,\sigma^z\right), \qquad (28)$$

where $\theta_k$ is the Bogoliubov rotation angle and the summation index runs over $K = \left\{\pm\frac{\pi}{N}, \pm\frac{3\pi}{N}, \cdots, \pm\left(\pi - \frac{\pi}{N}\right)\right\}$. This result is independent of the spin index, as expected for translationally invariant systems.
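The finite-N sum in Equation (27) is straightforward to evaluate numerically. A minimal sketch (our own function name; `arctan2` is used so the Bogoliubov angle stays on the correct branch when g − cos k changes sign):

```python
import numpy as np

def sz_finite(g, N):
    """<sigma^z> for an N-site chain: 1 - (2/N) * sum_{k in K} sin^2(theta_k / 2).

    K = {±pi/N, ±3pi/N, ...}; the ±k modes contribute equally, so we sum
    over the positive half and double it.
    """
    m = np.arange(1, N // 2 + 1)
    k = (2 * m - 1) * np.pi / N
    theta = np.arctan2(np.sin(k), g - np.cos(k))
    return float(1 - (2.0 / N) * 2 * np.sum(np.sin(theta / 2) ** 2))
```

At the critical field g = 1 the sum approaches 2/π as N grows, and at g = 0 it vanishes, reproducing the maximally mixed reduced state.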
We can now calculate the QSCM for the reduced density matrix analytically. From Equation (22), we simply identify the Bloch vector of the reduced density matrix as having only the z-component, as written in Equation (27). In the thermodynamic limit, this quantity can be obtained by taking the limit of the Riemann sum: the z-component of the Bloch vector given in Equation (27) goes to the following integral,

$$\langle\sigma^z\rangle(g) = 1 - \frac{1}{\pi}\int_{-\pi}^{\pi}\sin^2\!\left(\frac{\theta(l)}{2}\right) dl, \qquad \theta(l) = \arctan\!\left[\frac{\sin(l)}{g - \cos(l)}\right]. \qquad (29)$$

This integral can be solved analytically in the thermodynamic limit for some values of the transverse field parameter. For $g = 0$, we easily obtain $\langle\sigma^z\rangle = 0$, which corresponds to a maximally mixed one-qubit reduced state, $\rho_1 = I/2$. At $g = 1$, i.e., at the critical point, the integral in Equation (29) can also be solved, and we obtain $\langle\sigma^z\rangle = 2/\pi$ in the thermodynamic limit. The eigenvalues of the one-qubit reduced state $\rho_1$ at $g = 1$ can then be obtained analytically as $\{1/2 \pm 1/\pi\}$. For other values of $g$, the integral in Equation (29) can be written in terms of elliptic integrals of the first and second kinds [57]. By using the result of Equation (29) in Equation (22), we obtain the quantum statistical complexity measure of the one-qubit reduced density matrix in the thermodynamic limit as a function of the transverse field parameter $g$.
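The special values quoted above can be checked by numerical quadrature of Equation (29). A sketch (trapezoidal rule; cos θ is computed directly so that the branch of θ(l) is handled correctly across g − cos l = 0, and a tiny ε guards the 0/0 point at g = 1, l = 0):

```python
import numpy as np

def sz_thermo(g, npts=200001):
    """<sigma^z>(g) = 1 - (1/pi) * Integral_{-pi}^{pi} sin^2(theta(l)/2) dl."""
    l = np.linspace(-np.pi, np.pi, npts)
    eps = 1e-300  # guards the 0/0 point at g = 1, l = 0
    ct = (g - np.cos(l)) / np.sqrt(np.sin(l) ** 2 + (g - np.cos(l)) ** 2 + eps)
    integrand = (1 - ct) / 2          # sin^2(theta/2) = (1 - cos(theta)) / 2
    dl = l[1] - l[0]
    integral = np.sum((integrand[:-1] + integrand[1:]) / 2) * dl
    return float(1 - integral / np.pi)
```

This reproduces ⟨σ^z⟩ = 0 at g = 0 and ⟨σ^z⟩ = 2/π at the critical point g = 1; feeding the resulting Bloch radius into the one-qubit closed form of Equation (22) then gives C(g) in the thermodynamic limit.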
In Figure 1 we present the second derivative of the QSCM with respect to the transverse field parameter g, for different finite system sizes N = 4, 8, 16, 1000. We also calculated this derivative in the thermodynamic limit by using Equations (24) and (29).
It is well known that at g = 1 there is a second-order quantum phase transition [58]. By observing Figure 1, we can directly recognize a sharp behavior of the measure at the transition point.

XXZ-1/2 Model
Quantum spin models such as the XXZ-1/2 can be simulated experimentally by using Rydberg-excited atomic ensembles in magnetic microtrap arrays [59], and also by low-temperature scanning tunneling microscopy [60], among many other quantum simulation experimental arrangements. Let us consider a Heisenberg XXZ spin-1/2 model defined by the following Hamiltonian:

$$H = J\sum_{j=1}^{N}\left(S^x_j S^x_{j+1} + S^y_j S^y_{j+1} + \Delta\, S^z_j S^z_{j+1}\right) - h\sum_{j=1}^{N} S^z_j,$$

with periodic boundary conditions, $S^{\alpha}_{j+N} = S^{\alpha}_j$, and $S^{\alpha}_j = \frac{1}{2}\sigma^{\alpha}_j$, where $\sigma^{\alpha}_j$ are Pauli matrices and $\Delta$ is the uni-axial anisotropy strength, i.e., the ratio between the $S^z$ interaction and the $S^x$ and $S^y$ interactions. This model interpolates continuously between the classical Ising, quantum XXX, and quantum XY models. At $\Delta = 0$, it turns into the quantum XY (or XX0) model, which corresponds to free fermions on a lattice. For $\Delta = 1$ ($\Delta = -1$), the anisotropic XXZ Hamiltonian reduces to the isotropic anti-ferromagnetic (ferromagnetic) XXX Hamiltonian. For $\Delta \to +\infty$ ($\Delta \to -\infty$), the model goes to an anti-ferromagnetic (ferromagnetic) Ising Model.
The parameter J defines an energy scale and only its sign matters: for positive values of J we observe ferromagnetic ordering in the x–y plane, and for negative ones we observe anti-ferromagnetic alignment. The uni-axial anisotropy strength Δ distinguishes a planar x–y regime (|Δ| < 1) from an axial one (|Δ| > 1), cf. [61]. It is therefore useful to define two regimes, the Ising-like regime for |Δ| > 1 and the XY-like regime for |Δ| < 1, in order to model materials possessing easy-axis and easy-plane magnetic anisotropies, respectively [62].
Here we are interested in quantum correlations between nearest and next-to-nearest neighbor spins in the XXZ spin-1/2 chain with J = 1, at zero temperature, and zero external field (h = 0). The matrix elements of ρ_{i,i+r} are written in terms of expectation values, namely the correlation functions for nearest neighbors, r = 1 (for ρ_{i,i+1}), and for next-to-nearest neighbors, r = 2 (for ρ_{i,i+2}); they are given by a set of integral equations which can be found in Appendix B or in [48,53,63–65]. These two-point correlation functions for the XXZ model at zero temperature and in the thermodynamic limit can be derived by using the Bethe Ansatz technique. Due to the symmetries of the Hamiltonian, the two-qubit reduced density matrix of sites i and i + r, for r = 1, 2, in the thermodynamic limit, written in the basis |1⟩ = |↑↑⟩, |2⟩ = |↑↓⟩, |3⟩ = |↓↑⟩ and |4⟩ = |↓↓⟩, where |↑⟩ and |↓⟩ are the eigenstates of the Pauli z-operator, takes the form [48]:

ρ_{i,i+r} = (1/4) [[1 + ⟨σ^z_i σ^z_{i+r}⟩, 0, 0, 0], [0, 1 − ⟨σ^z_i σ^z_{i+r}⟩, 2⟨σ^x_i σ^x_{i+r}⟩, 0], [0, 2⟨σ^x_i σ^x_{i+r}⟩, 1 − ⟨σ^z_i σ^z_{i+r}⟩, 0], [0, 0, 0, 1 + ⟨σ^z_i σ^z_{i+r}⟩]].   (30)

In Figure 2 we show the QSCM, C(ρ_{i,i+1}), for nearest neighbors, in contrast with the von Neumann entropy S(ρ_{i,i+1}) and the trace distance D(ρ_{i,i+1}, I/4) between ρ_{i,i+1} and the normalized identity matrix, all as functions of the uni-axial anisotropy strength Δ. The XXZ model possesses two critical points: a first-order transition at Δ = −1, and a continuous phase transition at Δ = 1 [66]. An interesting feature of the QSCM is that it evinces points of correlation transitions, related to order-disorder transitions, which are not necessarily connected with phase transitions. In this respect, we take note of the cusp point in Figure 2, at Δ ≈ 2.178.
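The structure of the reduced state in Equation (30) can be illustrated directly. The sketch below builds the two-site state from the two correlators and evaluates the QSCM under the same assumed product form C = S̃·D used earlier; the correlator values in the usage check are placeholders, not Bethe-ansatz results.

```python
import numpy as np

def rho_two_site(xx, zz):
    """Two-qubit reduced state in the basis {|uu>, |ud>, |du>, |dd>},
    parametrized by xx = <sigma^x_i sigma^x_{i+r}> and zz = <sigma^z_i sigma^z_{i+r}>
    (zero magnetization, <sigma^y sigma^y> = <sigma^x sigma^x>)."""
    return 0.25 * np.array([
        [1 + zz, 0,      0,      0     ],
        [0,      1 - zz, 2 * xx, 0     ],
        [0,      2 * xx, 1 - zz, 0     ],
        [0,      0,      0,      1 + zz],
    ])

def qscm_two_site(xx, zz):
    """QSCM, assuming C = (entropy normalized by log d) * (trace distance to I/d)."""
    lam = np.linalg.eigvalsh(rho_two_site(xx, zz))
    lam_nz = lam[lam > 1e-12]
    S = -(lam_nz * np.log2(lam_nz)).sum() / 2   # normalize by log2(4) = 2
    D = 0.5 * np.abs(lam - 0.25).sum()          # trace distance to I/4
    return S * D
```

The eigenvalues of rho_two_site(xx, zz) are (1 + zz)/4 (twice) and (1 − zz ± 2 xx)/4, so the eigenvalues of ρ − I/4 are zz/4 (twice) and (−zz ± 2 xx)/4, which is exactly the set discussed below for the cusp analysis.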
In order to investigate the cusp point of C(ρ_{i,i+1}) at Δ = 2.178, let us consider what happens with the state ρ_{i,i+1}, given by Equation (30), as Δ varies. The state ρ_{i,i+1} can be easily diagonalized, so let us study the matrix ρ_{i,i+1} − I/4, which plays an important role in the quantum statistical complexity measure, as already discussed. This matrix has the following eigenvalues:

{ ⟨σ^z_i σ^z_{i+1}⟩/4, ⟨σ^z_i σ^z_{i+1}⟩/4, (2⟨σ^x_i σ^x_{i+1}⟩ − ⟨σ^z_i σ^z_{i+1}⟩)/4, (−2⟨σ^x_i σ^x_{i+1}⟩ − ⟨σ^z_i σ^z_{i+1}⟩)/4 }.

As Δ increases in the interval [1, 3], the correlation values in the x direction also increase while the correlations in z decrease, reaching the local minimum observed in Figure 2. In this interval, the eigenvalue (2⟨σ^x_i σ^x_{i+1}⟩ − ⟨σ^z_i σ^z_{i+1}⟩)/4 passes through zero, and it is this vanishing that causes the correlation transition, implying a change of orientation of the spin correlations.
Following this reasoning, in order to determine the points at which such changes of orientation of spin correlations occur, it is necessary to solve numerically some integral equations, given by the eigenvalues of Equation (30), which are functions of the expectation values given in Refs. [48,53,63–65]. The objective of this procedure is to determine for which values of Δ the equation 2⟨σ^x_i σ^x_{i+1}⟩ − ⟨σ^z_i σ^z_{i+1}⟩ = 0 holds. Since ⟨σ^y_i σ^y_{i+1}⟩ = ⟨σ^x_i σ^x_{i+1}⟩ for this Hamiltonian, the solution of this equation indicates the point where the planar xy-correlation decreases while the z-correlation increases, although we are already in the ferromagnetic phase. For Δ → ∞, the system moves towards a configuration that exhibits correlations only in the z-direction. Proceeding in the same way, by solving the other integral equation given in the eigenvalue set, 2⟨σ^x_i σ^x_{i+1}⟩ + ⟨σ^z_i σ^z_{i+1}⟩ = 0, we obtain the solution Δ = −1, a divergence point. In Figure 3 we show a contour map of C(ρ_{i,i+1}) as a function of ⟨σ^x_i σ^x_{i+1}⟩ and ⟨σ^z_i σ^z_{i+1}⟩. The triangular region represents the convex hull of positive semi-definite density matrices, with vertices at (⟨σ^x_i σ^x_{i+1}⟩, ⟨σ^z_i σ^z_{i+1}⟩) = (−1, −1), (0, 1), and (1, −1). Along with the contour map of the QSCM as a function of the correlation functions in the x and z directions, the curves along which the two eigenvalues of ρ_{i,i+1} − I/4 vanish are also represented in Figure 3, as the two inclined straight lines (dashed and dash-dotted). The dashed line describes the equation whose solution is Δ = 2.178, while the dash-dotted line represents the curve for the Δ = −1 solution, at which there is a divergence (the phase transition point).
As previously mentioned, the QSCM showed itself to be sensitive to correlation transitions. In Figure 3, the thick colored curve inside the contour map shows the path taken by C(ρ_{i,i+1}) as the correlations in x and z vary while Δ increases monotonically in the interval [−1, 8]. The same path was also presented in Figure 2, on the blue curve. The blue part of the thick curve represents values of the correlations for which we have a paramagnetic state, and the red part indicates the values for the ferromagnetic arrangement. Additionally, we have highlighted some interesting points along this curve with distinct markers, at Δ = −1, Δ = 0, Δ = 1, Δ = 2.178, and Δ = 8. Figure 4 shows the QSCM for nearest neighbors, C(ρ_{i,i+1}) (blue), and for next-to-nearest neighbors, C(ρ_{i,i+2}) (orange), both in the thermodynamic limit, as functions of the uni-axial anisotropy strength Δ.
The asymptotic limit for both measures (r = 1, 2) is also presented in the sub-figure. As Δ → ∞, the two correlation functions behave as ⟨σ^x_i σ^x_{i+r}⟩ → 0 and ⟨σ^z_i σ^z_{i+r}⟩ → −1, for both cases. In this limit, the density matrix of the system can be written as ρ_{i,i+r} → diag{0, 1/2, 1/2, 0} for both cases, and thus S(ρ_{i,i+r}) → 1/2. Additionally, D(ρ_{i,i+r}, I/4) → 1/2, which makes C(ρ_{i,i+r}) → 1/4, for r = 1 and for r = 2. It is interesting to notice that C(ρ_{i,i+1}) = C(ρ_{i,i+2}) exactly at Δ = 2.178. The QSCM for nearest neighbors is greater than that for next-to-nearest neighbors, i.e., C(ρ_{i,i+1}) > C(ρ_{i,i+2}), for −1 ≤ Δ < 2.178. For Δ > 2.178, C(ρ_{i,i+1}) < C(ρ_{i,i+2}), until both go to 1/4 for large values of Δ. This behavior can be an indication of a probable increase in complexity as the order-disorder transition occurs; therefore, it should be expected that this measure could act as a complexity pointer. Considering that more correlations become possible once second neighbors are taken into account, it is understandable that an increase in complexity should occur at Δ = 2.178.

Figure 4. QSCM for nearest neighbors (r = 1) and for next-to-nearest neighbors (r = 2): C(ρ_{i,i+r}) for the two-qubit reduced density matrix of sites i and i + r, for r = 1 (blue) and r = 2 (orange), both in the thermodynamic limit as a function of the uni-axial anisotropy strength Δ. (a) Sub-figure: asymptotic behavior for large Δ in both cases (both C(ρ_{i,i+1}) and C(ρ_{i,i+2}) → 1/4 as Δ → ∞).
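The asymptotic values quoted above can be verified arithmetically. A minimal worked check, under the same assumed product form C = S̃·D (entropy normalized by log d, trace distance to I/d):

```python
import numpy as np

# Asymptotic two-site state for Delta -> infinity (both r = 1 and r = 2)
lam = np.array([0.0, 0.5, 0.5, 0.0])      # eigenvalues of diag{0, 1/2, 1/2, 0}

p = lam[lam > 0]
S = -(p * np.log2(p)).sum() / np.log2(4)  # entropy normalized by log d, d = 4
D = 0.5 * np.abs(lam - 0.25).sum()        # trace distance to I/4
C = S * D                                 # S = 1/2 and D = 1/2, hence C = 1/4
```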

Conclusions
We introduced a quantum version of the statistical complexity measure, the Quantum Statistical Complexity Measure (QSCM), and displayed some of its properties. The measure proved to be useful and physically meaningful: it possesses several of the expected properties of a bona fide complexity measure, and it shows potential usefulness in other areas of quantum information theory.
We presented two applications of the QSCM, investigating the physics of two exactly solvable quantum Hamiltonian models, namely the 1D Quantum Ising Model and the Heisenberg XXZ spin-1/2 chain, both in the thermodynamic limit. First, we calculated the QSCM for one qubit, in the Bloch basis, and determined this measure as a function of the magnitude of the Bloch vector r. We computed this magnitude in the thermodynamic limit, first by analytically calculating the measure for the one-qubit reduced density matrix of a chain of N spins, and then, in order to study the quantum phase transition of the 1D Quantum Ising Model, by performing the limit N → ∞. For this model, we recovered the quantum phase transition point at g = 1. In this way, we have found that the QSCM can be used to signal quantum phase transitions for this model.
Second, we studied the Heisenberg XXZ spin-1/2 chain, and by means of the QSCM we evinced a point at which a correlation transition occurs for this model. Physically, at Δ = 2.178 the planar xy-correlation decreases while the z-correlation increases, reaching a minimum point, although we are already in the ferromagnetic arrangement. The competition between these two different alignments of correlations indicates an order-disorder transition, to which the measure was shown to be sensitive.
We have also studied the derivatives of the QSCM, which proved to be sensitive to the quantum transition points. As a summary of this study for the Heisenberg XXZ spin-1/2 chain, the Quantum Statistical Complexity Measure: (i) characterizes the first-order quantum phase transition at Δ = −1; (ii) evinces the continuous quantum phase transition at Δ = 1; and (iii) witnesses the order-disorder transition at Δ = 2.178, related to the alignment of the spin correlations.