Article

Local Invariance of Divergence-Based Quantum Information Measures

by Christopher Popp *, Tobias C. Sutter and Beatrix C. Hiesmayr
Faculty of Physics, University of Vienna, Währingerstraße 17, 1090 Vienna, Austria
* Author to whom correspondence should be addressed.
Entropy 2025, 27(10), 1051; https://doi.org/10.3390/e27101051
Submission received: 10 September 2025 / Revised: 6 October 2025 / Accepted: 7 October 2025 / Published: 10 October 2025
(This article belongs to the Section Quantum Information)

Abstract

Quantum information quantities, such as mutual information and entropies, are essential for characterizing quantum systems and protocols in quantum information science. In this contribution, we identify types of information measures based on generalized divergences and prove their invariance under local isometric or unitary transformations. Leveraging the reversal channel for local isometries together with the data-processing inequality, we establish invariance for information quantities used in both the asymptotic and the one-shot regime without relying on the specific functional form of the underlying divergence. These invariances can be applied to improve the computation of such information quantities or to optimize protocols and their output states whose performance is determined by some invariant measure. Our results improve the capability to characterize and compute many operationally relevant information measures with applications across the field of quantum information processing.

1. Introduction

A central goal of quantum information theory is the precise quantification of correlations, uncertainties, and distinguishability within quantum systems. Various information-theoretic quantities have been defined to capture the fundamental limits of quantum information processing tasks like communication, computation, or entanglement manipulation.
Traditionally, the fundamental quantity is the von Neumann entropy [1], from which many quantities relevant in classical information science, like the mutual information or the conditional entropy, can be generalized to the quantum regime (cf. [2]). These measures are relevant due to their operational meaning in quantifying optimal rates for achieving various information processing tasks in the asymptotic regime of infinitely many independent uses of a resource (e.g., a quantum state). A notable example is the quantum relative entropy [3], which can be related to quantum hypothesis testing [4]. Several other information quantities can be expressed in terms of the relative entropy, such as the private information for the rate of secret-key distillation and the coherent information for the rate of entanglement distillation [5]. In the so-called one-shot setting (cf. Ref. [6]), only a finite amount of resources is considered. In this approach, the goal is to find optimal rates for transforming the limited resources, such as quantum states, into target states while allowing for fixed error bounds. Moreover, the one-shot setting is crucial for security analyses in quantum cryptography without any assumptions on the actions of a malicious third party [7]. As in the asymptotic setting, several information quantities with operational meaning have been identified, such as the $\varepsilon$-hypothesis testing mutual information [8] and the smooth max-mutual information [9], which can be related to secret-key distillation in the one-shot setting.
As quantum technologies advance, the landscape of information quantities and related processing tasks becomes increasingly diverse (see Ref. [10] for a comprehensive overview of information measures and related processing tasks), and a unified framework capable of accommodating both asymptotic and one-shot scenarios is essential. Generalized divergences offer such a unifying language. By definition, each divergence satisfies monotonicity under completely positive trace-preserving (CPTP) maps, i.e., a data-processing inequality under the action of quantum channels, which guarantees operational meaning and allows one to derive an entire spectrum of information measures. Examples include the Petz–Rényi relative entropy [11,12,13], the sandwiched Rényi relative entropy [14,15], the geometric Rényi relative entropy [16,17] and the (smooth) max- and min-relative entropies [7]. Note that the relative entropy also satisfies a data-processing inequality, and thus many information quantities in the asymptotic setting can be expressed in terms of this specific divergence. Replacing the relative entropy by other divergences in similar expressions allows one to define generalized quantities that are applicable in other regimes, e.g., the one-shot setting.
Despite their broad applicability, evaluating these divergence-based measures poses significant computational challenges for high-dimensional quantum states or when their definition involves difficult or infeasible optimizations. In this work, we define several types of divergence-based information measures and prove their invariance under local isometric or unitary transformations.
In Section 2, we introduce the setting and notation and define the types of generalized divergence-based information quantities. In Section 3, we present and prove the main result of this contribution, namely the invariance of information quantities of the defined types under local isometric or unitary transformations. Finally, we summarize our results in Section 4 and provide an outlook on potential applications and future research directions.

2. Methods: Generalized Divergences and Information Quantities

In this section, we briefly introduce the notation and necessary objects to define several types of generalized information quantities and prove their invariance under local isometric or unitary transformations.

2.1. Notation and Setting

We consider two parties A and B with corresponding Hilbert spaces $\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_B$. $\mathcal{L}_+(\mathcal{H})$ and $\mathcal{D}(\mathcal{H})$ denote the spaces of positive-semidefinite operators and density operators (i.e., quantum states) acting on $\mathcal{H}$, respectively. Isometries are denoted by a capital $V$. Corresponding channels, i.e., completely positive and trace-preserving maps, are written as $\mathcal{V}(\cdot) = V(\cdot)V^\dagger$. Unitary operators are referred to with a capital $U$. The identity map is written as $\mathrm{id}$, and $\circ$ denotes the composition of maps. States or operators are labeled with the systems they act on. For any multipartite state or operator, e.g., the state $\rho_{AB}$, the marginal state or operator is given by $\rho_A := \mathrm{Tr}_B[\rho_{AB}]$.

2.2. Generalized Divergence-Based Types of Information Quantities

The information quantities we analyze in this contribution are defined via so-called generalized divergences D [18,19].
Definition 1
(Generalized divergence, data-processing inequality). For quantum states $\rho \in \mathcal{D}(\mathcal{H})$ and positive-semidefinite operators $\zeta \in \mathcal{L}_+(\mathcal{H})$, a function $D : \mathcal{D}(\mathcal{H}) \times \mathcal{L}_+(\mathcal{H}) \to \mathbb{R}$ that satisfies the data-processing inequality under any channel $\mathcal{N}$,
$D(\rho \| \zeta) \geq D(\mathcal{N}(\rho) \| \mathcal{N}(\zeta)),$
is called a (generalized) divergence.
For quantum states, the data-processing inequality can be interpreted as the property that no physical transformation can make two states more distinguishable. It is known that any divergence is invariant under the application of isometric transformations $\mathcal{V}$ [10]:
$D(\mathcal{V}(\rho) \| \mathcal{V}(\zeta)) = D(\rho \| \zeta).$   (1)
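To make Definition 1 and the invariance (1) concrete, the following minimal Python sketch checks both properties numerically for one specific divergence, the Umegaki relative entropy $D(\rho\|\zeta) = \mathrm{Tr}[\rho(\log\rho - \log\zeta)]$ [3]; the random test states, the depolarizing channel and the unitary used as the simplest isometric channel are illustrative choices and not taken from this paper.

```python
import numpy as np
from scipy.linalg import logm
from scipy.stats import unitary_group

def rel_entropy(rho, zeta):
    """Umegaki relative entropy Tr[rho (log rho - log zeta)] (natural logarithm)."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(zeta)))))

def random_state(d, rng):
    """Random full-rank density matrix of dimension d."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T + 1e-9 * np.eye(d)
    return m / np.trace(m).real

rng = np.random.default_rng(0)
d = 3
rho, zeta = random_state(d, rng), random_state(d, rng)

# Data-processing inequality: a depolarizing channel never increases the divergence.
p = 0.3
depolarize = lambda x: (1 - p) * x + p * np.trace(x).real * np.eye(d) / d
assert rel_entropy(rho, zeta) >= rel_entropy(depolarize(rho), depolarize(zeta)) - 1e-9

# Invariance (1), checked for a unitary channel (the simplest isometric channel).
U = unitary_group.rvs(d, random_state=1)
rotate = lambda x: U @ x @ U.conj().T
assert abs(rel_entropy(rho, zeta) - rel_entropy(rotate(rho), rotate(zeta))) < 1e-8
```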
Generalized divergences are used to define generalized information quantities for bipartite quantum states ρ A B . There exist several ways to define information quantities based on divergences with operational meaning. In this work, we consider quantities that relate a bipartite state and its local marginals. Using specific instances for the divergences, these quantities can be related to various information-processing tasks (cf. [10] for a comprehensive overview). The method to prove local invariance of the quantities we present in this work, however, does not depend on the specific form of the divergence, but only on how it is applied to the state. We therefore define several types of information quantities independent of the specific form of the divergence. Some types involve a so-called smoothing, i.e., an optimization over an environment of the quantum state. For this, the sine distance [20] is often used as a distance measure that is closely related to the fidelity [21] as defined below.
In the following, let $\varepsilon \in [0, 1]$ and let ρ and σ be states.
Definition 2
(Fidelity, sine distance). The fidelity of two quantum states ρ and σ is defined as follows:
$F(\rho, \sigma) := \left( \mathrm{Tr}\!\left[ \sqrt{\sqrt{\sigma}\, \rho\, \sqrt{\sigma}} \right] \right)^2.$
The sine distance is defined as:
$P(\rho, \sigma) := \sqrt{1 - F(\rho, \sigma)}.$
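As a concrete illustration of Definition 2, the following sketch evaluates the fidelity and the sine distance for two example single-qubit states; the particular states are arbitrary choices for demonstration only.

```python
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho, sigma):
    """F(rho, sigma) = (Tr sqrt( sqrt(sigma) rho sqrt(sigma) ))^2."""
    s = sqrtm(sigma)
    return float(np.real(np.trace(sqrtm(s @ rho @ s))) ** 2)

def sine_distance(rho, sigma):
    """P(rho, sigma) = sqrt(1 - F(rho, sigma))."""
    return float(np.sqrt(max(0.0, 1.0 - fidelity(rho, sigma))))

# Example qubit states: a noisy |0><0| and a noisy |+><+| (kept full rank for sqrtm).
rho = 0.9 * np.diag([1.0, 0.0]) + 0.1 * np.eye(2) / 2
sigma = 0.9 * np.full((2, 2), 0.5) + 0.1 * np.eye(2) / 2

assert abs(fidelity(rho, rho) - 1.0) < 1e-9   # F(rho, rho) = 1, hence P(rho, rho) = 0
print(fidelity(rho, sigma), sine_distance(rho, sigma))
```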
However, other distinguishability functions, potentially with different properties, can be used as well. In this work, we only require that the function obeys the data-processing inequality and invariance under isometric transformations. As both properties are satisfied by any divergence (cf. Definition 1 and (1)), including the sine distance [10], the smoothing environment $B^\varepsilon(\rho)$ around a state ρ is defined as follows:
Definition 3
(Smoothing environment). Let D be any divergence.
$B^\varepsilon(\rho) := \left\{ \hat{\rho} : D(\rho \| \hat{\rho}) \leq \varepsilon \right\}.$
For the sine distance, it specifically reads:
$B_P^\varepsilon(\rho) := \left\{ \hat{\rho} : P(\rho, \hat{\rho}) \leq \varepsilon \right\} = \left\{ \hat{\rho} : F(\rho, \hat{\rho}) \geq 1 - \varepsilon^2 \right\}.$
In this work, we define and analyze the following types of information quantities:
Definition 4
(Types of generalized mutual information).
$I_1(\rho_{AB}) := D(\rho_{AB} \| \rho_A \otimes \rho_B)$
$I_2(\rho_{AB}) := \inf_{\sigma_B} D(\rho_{AB} \| \rho_A \otimes \sigma_B)$
$I_3^\varepsilon(\rho_{AB}) := \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB}),\, \sigma_B} D(\hat{\rho}_{AB} \| \rho_A \otimes \sigma_B)$
$I_4^\varepsilon(\rho_{AB}) := \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB})} D(\hat{\rho}_{AB} \| \rho_A \otimes \hat{\rho}_B)$
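For the Umegaki relative entropy [3], the type $I_1$ of Definition 4 reduces to the familiar quantum mutual information, which also equals $S(\rho_A) + S(\rho_B) - S(\rho_{AB})$ in terms of von Neumann entropies. The sketch below, a minimal numerical illustration, evaluates $I_1$ both ways for a noisy Bell state; the state and dimensions are arbitrary choices.

```python
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    """Umegaki relative entropy Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

def vn_entropy(rho):
    """von Neumann entropy -Tr[rho log rho]."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log(ev)))

def partial_trace(rho_ab, d_a, d_b, keep):
    """Marginal of a state on a (d_a x d_b)-dimensional bipartite system."""
    t = rho_ab.reshape(d_a, d_b, d_a, d_b)
    return np.trace(t, axis1=1, axis2=3) if keep == "A" else np.trace(t, axis1=0, axis2=2)

# Example: Bell state mixed with white noise (full rank, so logm is well defined).
bell = np.zeros((4, 4))
bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5
rho_ab = 0.8 * bell + 0.2 * np.eye(4) / 4

rho_a = partial_trace(rho_ab, 2, 2, "A")
rho_b = partial_trace(rho_ab, 2, 2, "B")

i1_divergence = rel_entropy(rho_ab, np.kron(rho_a, rho_b))                  # I_1 via D
i1_entropies = vn_entropy(rho_a) + vn_entropy(rho_b) - vn_entropy(rho_ab)   # entropy form
assert abs(i1_divergence - i1_entropies) < 1e-8
```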
Definition 5
(Types of generalized conditional entropies).
$H_1(\rho_{AB}) := D(\rho_{AB} \| \mathbb{1}_A \otimes \rho_B)$
$H_2(\rho_{AB}) := \inf_{\sigma_B} D(\rho_{AB} \| \mathbb{1}_A \otimes \sigma_B)$
$H_3^\varepsilon(\rho_{AB}) := \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB}),\, \sigma_B} D(\hat{\rho}_{AB} \| \mathbb{1}_A \otimes \sigma_B)$
The defined types include information quantities in both the asymptotic and the one-shot setting. Examples are the (generalized) quantum mutual information ($I_1$, $I_2$), the coherent information ($H_2$), the Petz–Rényi mutual information, the sandwiched Rényi mutual information, the geometric Rényi mutual information, the smooth min-mutual information (see Ref. [22] for these quantities), the $\varepsilon$-hypothesis testing mutual information ($I_3^\varepsilon$), the smooth max- and min-conditional entropies ($H_3^\varepsilon$), and the smooth max-mutual information ($I_4^\varepsilon$).

3. Results: Local Invariance of Information Quantities

We show that the types $I_1$, $I_2$ and $I_3^\varepsilon$ are invariant under any local isometric transformation, and that the types $I_4^\varepsilon$, $H_1$, $H_2$ and $H_3^\varepsilon$ are invariant if the first subsystem is transformed unitarily, while the second subsystem can be transformed by any isometry. More precisely, we prove below:
Proposition 1.
Any information quantity of type $I_1$, $I_2$ and $I_3^\varepsilon$ as in Definition 4 is invariant under local isometric transformations of the form $\mathcal{V}_{AB}(\cdot) = V_A \otimes V_B\, (\cdot)\, V_A^\dagger \otimes V_B^\dagger$.
Proposition 2.
Any information quantity of type $I_4^\varepsilon$, $H_1$, $H_2$ and $H_3^\varepsilon$ as in Definitions 4 and 5 is invariant under local unitary transformations in the first system and isometric transformations in the second system, of the form $\mathcal{V}_{AB}(\cdot) = U_A \otimes V_B\, (\cdot)\, U_A^\dagger \otimes V_B^\dagger$.
The proofs of Propositions 1 and 2 use the technical Lemmas 1–4, which concern the isometric transformations and the corresponding so-called reversal channel [2], defined in the following. Since $V V^\dagger \neq \mathbb{1}$ in general, the adjoint map $\mathcal{V}^\dagger(\cdot) = V^\dagger(\cdot)V$ is not necessarily trace-preserving, although it is completely positive. The reversal channel $\mathcal{R}_V$ corresponding to an isometric channel $\mathcal{V}$ is a completely positive and trace-preserving map satisfying $\mathcal{R}_V \circ \mathcal{V} = \mathrm{id}$, hence reversing the action of the isometric channel.
Definition 6
(Reversal channel). Let $\mathcal{V} : \mathcal{L}_+(\mathcal{H}) \to \mathcal{L}_+(\tilde{\mathcal{H}})$ be an isometric channel and $\omega \in \mathcal{D}(\mathcal{H})$. The reversal channel corresponding to $\mathcal{V}$ and ω is defined as
$\mathcal{R}_V(\sigma) := V^\dagger \sigma V + \mathrm{Tr}\!\left[ (\mathbb{1} - V V^\dagger)\, \sigma \right] \omega,$
for any $\sigma \in \mathcal{L}_+(\tilde{\mathcal{H}})$.
Note that this channel is not unique and that $\mathcal{V}(\mathcal{R}_V(\rho)) \neq \rho$ in general. Also note that while $\sigma \in \mathcal{L}_+(\tilde{\mathcal{H}})$, $\omega \in \mathcal{D}(\mathcal{H})$ needs to be a quantum state such that the map is trace-preserving. Given an isometric channel $\mathcal{V}$, we define the following sets that relate a smoothing environment to its image and pre-image under the isometric channel and its reversal channel:
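The following sketch, a minimal numerical illustration of Definition 6, implements the reversal channel for a simple embedding isometry $V : \mathbb{C}^2 \to \mathbb{C}^3$ and verifies $\mathcal{R}_V \circ \mathcal{V} = \mathrm{id}$, as well as the fact that $\mathcal{V}(\mathcal{R}_V(\sigma)) \neq \sigma$ in general; the particular isometry, the reference state ω, and the test states are arbitrary choices.

```python
import numpy as np

d_in, d_out = 2, 3
V = np.eye(d_out)[:, :d_in]                    # embedding isometry, V†V = 1 on C^2
omega = np.diag([0.7, 0.3]).astype(complex)    # arbitrary reference state on the input space

def iso_channel(rho):
    """Isometric channel V(rho) = V rho V†."""
    return V @ rho @ V.conj().T

def reversal(sigma):
    """R_V(sigma) = V† sigma V + Tr[(1 - V V†) sigma] * omega  (Definition 6)."""
    proj_out = np.eye(d_out) - V @ V.conj().T
    return V.conj().T @ sigma @ V + np.trace(proj_out @ sigma) * omega

rho = np.array([[0.6, 0.2], [0.2, 0.4]], dtype=complex)      # example input state

# Reversal property: R_V(V(rho)) = rho for every input state.
assert np.allclose(reversal(iso_channel(rho)), rho)

# But V(R_V(sigma)) != sigma in general, e.g. for a state outside the image of V:
sigma = np.diag([0.0, 0.0, 1.0]).astype(complex)
print(np.allclose(iso_channel(reversal(sigma)), sigma))      # prints False
```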
Definition 7.
For a state ρ with ε-environment $B^\varepsilon(\rho)$ as in Definition 3, an isometric channel $\mathcal{V}$, and the reversal channel $\mathcal{R}_V$, we define:
$B_{\mathcal{V}}^\varepsilon(\rho) := \{ \mathcal{V}(\hat{\rho}) \mid \hat{\rho} \in B^\varepsilon(\rho) \},$
$B_{\mathcal{R}_V}^\varepsilon(\rho) := \{ \tilde{\rho} \mid \mathcal{R}_V(\tilde{\rho}) \in B^\varepsilon(\rho) \}.$
We now show that these sets can be ordered (see Figure 1), implying inequalities for smoothed information quantities as used in the proofs of Propositions 1 and 2.
Lemma 1.
Regarding the sets from Definitions 3 and 7, the following set relations hold:
$B_{\mathcal{V}}^\varepsilon(\rho) \subseteq B^\varepsilon(\mathcal{V}(\rho)) \subseteq B_{\mathcal{R}_V}^\varepsilon(\rho).$
Proof. 
For the first relation, let $\tilde{\rho} \in B_{\mathcal{V}}^\varepsilon(\rho)$, i.e., $\tilde{\rho} = \mathcal{V}(\hat{\rho})$ with $\hat{\rho} \in B^\varepsilon(\rho)$. Using the invariance under isometric transformations (1), we have $D(\mathcal{V}(\rho) \| \tilde{\rho}) = D(\mathcal{V}(\rho) \| \mathcal{V}(\hat{\rho})) = D(\rho \| \hat{\rho}) \leq \varepsilon$, hence $\tilde{\rho} \in B^\varepsilon(\mathcal{V}(\rho))$. For the second relation, let $\tilde{\rho} \in B^\varepsilon(\mathcal{V}(\rho))$, i.e., $D(\mathcal{V}(\rho) \| \tilde{\rho}) \leq \varepsilon$. Using the data-processing inequality for any quantum channel, we have $D(\mathcal{R}_V(\mathcal{V}(\rho)) \| \mathcal{R}_V(\tilde{\rho})) \leq \varepsilon$, i.e., $D(\rho \| \mathcal{R}_V(\tilde{\rho})) \leq \varepsilon$, and thus $\tilde{\rho} \in B_{\mathcal{R}_V}^\varepsilon(\rho)$. □
A second technical property of the reversal channels is used for proving invariance under local transformations. There exist reversal channels for local isometric transformations that preserve the reversal property $\mathcal{R}_V \circ \mathcal{V} = \mathrm{id}$ locally, in the sense $\mathcal{R}_V(\mathcal{V}_A(\cdot) \otimes (\cdot)) = \mathrm{id}_A(\cdot) \otimes (\,\cdot\,)$, i.e., the action of $\mathcal{V}_A$ on the first subsystem is undone.
For mutual information quantities of types I using quantum states as arguments of the divergence, the following fact is used for proving Proposition 1.
Lemma 2.
Let $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$ be a local isometry. For any state $\rho_A$ and any local quantum states $\sigma_A, \sigma_B$, there exists a reversal channel $\mathcal{R}_V$ satisfying
$\mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \sigma_B) = \rho_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\mathcal{V}_A(\sigma_A) \otimes \sigma_B) \right].$
Proof. 
Define the reversal channel with $\omega = \rho_A \otimes \omega_B$ for any local quantum state $\omega_B$. Using the definition of $\mathcal{R}_V$ and $V_A^\dagger V_A = \mathbb{1}$, one can directly calculate:
$\mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \sigma_B) = \rho_A \otimes \left( V_B^\dagger \sigma_B V_B + \left(1 - \mathrm{Tr}_B[V_B V_B^\dagger \sigma_B]\right) \omega_B \right) = \rho_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\mathcal{V}_A(\sigma_A) \otimes \sigma_B) \right]. \; \square$
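To see Lemma 2 at work numerically, the sketch below builds a local embedding isometry $V = V_A \otimes V_B$ for small dimensions, constructs the reversal channel with $\omega = \rho_A \otimes \omega_B$ exactly as in the proof, and checks the stated identity; all states and dimensions are arbitrary illustrative choices.

```python
import numpy as np

def rand_state(d, rng):
    """Random full-rank density matrix of dimension d."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

rng = np.random.default_rng(3)
dA, dB, dAt, dBt = 2, 2, 3, 4                        # input dims and larger output dims
VA, VB = np.eye(dAt)[:, :dA], np.eye(dBt)[:, :dB]    # embedding isometries V_A, V_B
V = np.kron(VA, VB)

rho_A, sigma_A = rand_state(dA, rng), rand_state(dA, rng)   # states on the input A space
sigma_B = rand_state(dBt, rng)                              # state on the output B space
omega = np.kron(rho_A, rand_state(dB, rng))                 # omega = rho_A x omega_B, as in the proof

def reversal(sigma):
    """R_V(sigma) = V† sigma V + Tr[(1 - V V†) sigma] * omega  (Definition 6)."""
    proj_out = np.eye(dAt * dBt) - V @ V.conj().T
    return V.conj().T @ sigma @ V + np.trace(proj_out @ sigma) * omega

def tr_A(x):
    """Partial trace over the A factor of an operator on the dA x dB input space."""
    return np.trace(x.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

chan_A = lambda x: VA @ x @ VA.conj().T              # channel V_A(.) = V_A (.) V_A†

lhs = reversal(np.kron(chan_A(rho_A), sigma_B))
rhs = np.kron(rho_A, tr_A(reversal(np.kron(chan_A(sigma_A), sigma_B))))
assert np.allclose(lhs, rhs)                         # the identity stated in Lemma 2
```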
For conditional entropies of types H using both quantum states and positive-semidefinite operators as arguments of the divergence, the following Lemma is used for proving invariance in the proof of Proposition 2.
Lemma 3.
Let $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$ with unitary $\mathcal{V}_A(\cdot) = U_A(\cdot)U_A^\dagger$ and isometric $\mathcal{V}_B(\cdot) = V_B(\cdot)V_B^\dagger$. For any local quantum state $\sigma_B$, there exists a reversal channel $\mathcal{R}_V$ satisfying
$\mathcal{R}_V(\mathbb{1}_A \otimes \sigma_B) = \mathbb{1}_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\pi_A \otimes \sigma_B) \right],$
where $\pi_A = \frac{1}{d_A} \mathbb{1}_A$ is the maximally mixed state of A with dimension $d_A$.
Proof. 
Define the reversal channel with $\omega = \pi_A \otimes \omega_B$ for any local quantum state $\omega_B$. Considering $U_A^\dagger U_A = U_A U_A^\dagger = \mathbb{1}_A$, using the definition of $\mathcal{R}_V$ and calculating as in the proof of Lemma 2, one finds:
$\mathcal{R}_V(\mathbb{1}_A \otimes \sigma_B) = \mathbb{1}_A \otimes \left( V_B^\dagger \sigma_B V_B + \left(1 - \mathrm{Tr}_B[V_B V_B^\dagger \sigma_B]\right) \omega_B \right) = \mathbb{1}_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\pi_A \otimes \sigma_B) \right]. \; \square$
In the special case of smoothed states present in both arguments of the divergence, as in $I_4^\varepsilon$, the following property is used for proving Proposition 2.
Lemma 4.
Let $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$ with unitary $\mathcal{V}_A(\cdot) = U_A(\cdot)U_A^\dagger$ and isometric $\mathcal{V}_B(\cdot) = V_B(\cdot)V_B^\dagger$. For any state $\rho_A$ and any quantum state $\sigma_{AB}$ with marginal $\sigma_B = \mathrm{Tr}_A[\sigma_{AB}]$, there exists a reversal channel $\mathcal{R}_V$ satisfying
$\mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \sigma_B) = \rho_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\sigma_{AB}) \right].$
Proof. 
Set $\omega = \rho_A \otimes \omega_B$ for any quantum state $\omega_B$. Using the definition of $\mathcal{R}_V$ and $U_A^\dagger U_A = \mathbb{1}$, we then have for any local isometry $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$ of this form:
$\mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \sigma_B) = \rho_A \otimes V_B^\dagger \sigma_B V_B + \mathrm{Tr}\!\left[ (\mathbb{1}_{AB} - \mathbb{1}_A \otimes V_B V_B^\dagger)\, \mathcal{V}_A(\rho_A) \otimes \sigma_B \right] \rho_A \otimes \omega_B$
$= \rho_A \otimes \left( V_B^\dagger \sigma_B V_B + \left(1 - \mathrm{Tr}_B[V_B V_B^\dagger \sigma_B]\right) \omega_B \right).$
On the other hand, using $U_A U_A^\dagger = \mathbb{1}_A$, one finds
$\rho_A \otimes \mathrm{Tr}_A\!\left[ \mathcal{R}_V(\sigma_{AB}) \right] = \rho_A \otimes \mathrm{Tr}_A\!\left[ (U_A^\dagger \otimes V_B^\dagger)\, \sigma_{AB}\, (U_A \otimes V_B) + \mathrm{Tr}\!\left[ (\mathbb{1}_{AB} - U_A U_A^\dagger \otimes V_B V_B^\dagger)\, \sigma_{AB} \right] \rho_A \otimes \omega_B \right]$
$= \rho_A \otimes \mathrm{Tr}_A\!\left[ (\mathbb{1}_A \otimes V_B^\dagger)\, \sigma_{AB}\, (\mathbb{1}_A \otimes V_B) + \left(1 - \mathrm{Tr}\!\left[ (\mathbb{1}_A \otimes V_B V_B^\dagger)\, \sigma_{AB} \right]\right) \rho_A \otimes \omega_B \right]$
$= \rho_A \otimes \left( V_B^\dagger \sigma_B V_B + \left(1 - \mathrm{Tr}_B[V_B V_B^\dagger \sigma_B]\right) \omega_B \right),$
showing the claimed equality. □
Using these results, we now prove Propositions 1 and 2.
Proof of Proposition 1.
Let $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$, where $\mathcal{V}_{A/B} : \mathcal{D}(\mathcal{H}_{A/B}) \to \mathcal{D}(\tilde{\mathcal{H}}_{A/B})$. Let $\mathrm{im}(\mathcal{V}_{A/B}) \subseteq \mathcal{D}(\tilde{\mathcal{H}}_{A/B})$ denote the image of $\mathcal{V}_{A/B}$.
First, consider type $I_1$. Using $\mathrm{Tr}_{A/B}[\mathcal{V}_A \otimes \mathcal{V}_B(\rho_{AB})] = \mathcal{V}_{B/A}(\rho_{B/A})$ and the invariance of any divergence under isometric evolution (1), one has:
$I_1(\mathcal{V}(\rho_{AB})) = D(\mathcal{V}(\rho_{AB}) \| \mathrm{Tr}_B[\mathcal{V}(\rho_{AB})] \otimes \mathrm{Tr}_A[\mathcal{V}(\rho_{AB})])$
$= D(\mathcal{V}(\rho_{AB}) \| \mathcal{V}(\rho_A \otimes \rho_B))$
$= D(\rho_{AB} \| \rho_A \otimes \rho_B)$
$= I_1(\rho_{AB}).$
Next, consider type $I_2$. Again, using the invariance under isometric transformations, one has:
$I_2(\rho_{AB}) = \inf_{\sigma_B \in \mathcal{D}(\mathcal{H}_B)} D(\rho_{AB} \| \rho_A \otimes \sigma_B)$
$= \inf_{\sigma_B \in \mathcal{D}(\mathcal{H}_B)} D(\mathcal{V}(\rho_{AB}) \| \mathcal{V}_A(\rho_A) \otimes \mathcal{V}_B(\sigma_B))$
$= \inf_{\tilde{\sigma}_B \in \mathrm{im}(\mathcal{V}_B)} D(\mathcal{V}(\rho_{AB}) \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$\geq \inf_{\tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\mathcal{V}(\rho_{AB}) \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$= I_2(\mathcal{V}(\rho_{AB})).$
Conversely, using the data-processing inequality for $D$ with the reversal channel $\mathcal{R}_V$, one has:
$I_2(\mathcal{V}(\rho_{AB})) = \inf_{\tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\mathcal{V}(\rho_{AB}) \| \mathrm{Tr}_B[\mathcal{V}(\rho_{AB})] \otimes \tilde{\sigma}_B)$
$= \inf_{\tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\mathcal{V}(\rho_{AB}) \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$\geq \inf_{\tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\rho_{AB} \| \mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)).$
Using the reversal channel as in Lemma 2, with arbitrary $\hat{\sigma}_A \in \mathcal{D}(\mathcal{H}_A)$, therefore yields:
$I_2(\mathcal{V}(\rho_{AB})) \geq \inf_{\tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\rho_{AB} \| \rho_A \otimes \mathrm{Tr}_A[\mathcal{R}_V(\mathcal{V}_A(\hat{\sigma}_A) \otimes \tilde{\sigma}_B)])$
$\geq \inf_{\sigma_B \in \mathcal{D}(\mathcal{H}_B)} D(\rho_{AB} \| \rho_A \otimes \sigma_B)$
$= I_2(\rho_{AB}).$
Note that the second inequality holds because $\mathrm{Tr}_A[\mathcal{R}_V(\mathcal{V}_A(\hat{\sigma}_A) \otimes \tilde{\sigma}_B)] \in \mathcal{D}(\mathcal{H}_B)$. The two chains of inequalities together imply the claimed equality.
Finally, consider type $I_3^\varepsilon$. Using similar arguments and Lemma 1, one finds:
$I_3^\varepsilon(\rho_{AB}) = \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB}),\, \sigma_B \in \mathcal{D}(\mathcal{H}_B)} D(\hat{\rho}_{AB} \| \rho_A \otimes \sigma_B)$
$= \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{V}}^\varepsilon(\rho_{AB}),\, \tilde{\sigma}_B \in \mathrm{im}(\mathcal{V}_B)} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B^\varepsilon(\mathcal{V}(\rho_{AB})),\, \tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$= I_3^\varepsilon(\mathcal{V}(\rho_{AB})).$
Conversely, using the reversal channel again as in Lemma 2 with arbitrary $\hat{\sigma}_A$, together with Lemma 1 and noting the set equality $\mathcal{R}_V[B_{\mathcal{R}_V}^\varepsilon(\rho_{AB})] = B^\varepsilon(\rho_{AB})$, one concludes:
$I_3^\varepsilon(\mathcal{V}(\rho_{AB})) = \inf_{\tilde{\rho}_{AB} \in B^\varepsilon(\mathcal{V}(\rho_{AB})),\, \tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{R}_V}^\varepsilon(\rho_{AB}),\, \tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{R}_V}^\varepsilon(\rho_{AB}),\, \tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\mathcal{R}_V(\tilde{\rho}_{AB}) \| \mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \tilde{\sigma}_B))$
$= \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB}),\, \tilde{\sigma}_B \in \mathcal{D}(\tilde{\mathcal{H}}_B)} D(\hat{\rho}_{AB} \| \rho_A \otimes \mathrm{Tr}_A[\mathcal{R}_V(\mathcal{V}_A(\hat{\sigma}_A) \otimes \tilde{\sigma}_B)])$
$\geq \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB}),\, \sigma_B \in \mathcal{D}(\mathcal{H}_B)} D(\hat{\rho}_{AB} \| \rho_A \otimes \sigma_B)$
$= I_3^\varepsilon(\rho_{AB}). \; \square$
Proof of Proposition 2.
Let $\mathcal{V} = \mathcal{V}_A \otimes \mathcal{V}_B$ with $\mathcal{V}_A(\cdot) = U_A(\cdot)U_A^\dagger$, $\mathcal{V}_A : \mathcal{D}(\mathcal{H}_A) \to \mathcal{D}(\mathcal{H}_A)$ for a unitary $U_A$, and $\mathcal{V}_B : \mathcal{D}(\mathcal{H}_B) \to \mathcal{D}(\tilde{\mathcal{H}}_B)$ for an isometry $V_B$. Let $\mathrm{im}(\mathcal{V}_B) \subseteq \mathcal{D}(\tilde{\mathcal{H}}_B)$ denote the image of $\mathcal{V}_B$.
First, consider the type $I_4^\varepsilon$. Using the invariance of $D$ under isometric transformations and Lemma 1, one has:
$I_4^\varepsilon(\rho_{AB}) = \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB})} D(\hat{\rho}_{AB} \| \rho_A \otimes \hat{\rho}_B)$
$= \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB})} D(\mathcal{V}(\hat{\rho}_{AB}) \| \mathcal{V}(\rho_A \otimes \hat{\rho}_B))$
$= \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{V}}^\varepsilon(\rho_{AB})} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\rho}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B^\varepsilon(\mathcal{V}(\rho_{AB}))} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\rho}_B)$
$= I_4^\varepsilon(\mathcal{V}(\rho_{AB})).$
Conversely, using Lemma 1 again and applying the data-processing inequality for the reversal channel $\mathcal{R}_V$ as in Lemma 4, one finds:
$I_4^\varepsilon(\mathcal{V}(\rho_{AB})) = \inf_{\tilde{\rho}_{AB} \in B^\varepsilon(\mathcal{V}(\rho_{AB}))} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\rho}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{R}_V}^\varepsilon(\rho_{AB})} D(\tilde{\rho}_{AB} \| \mathcal{V}_A(\rho_A) \otimes \tilde{\rho}_B)$
$\geq \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{R}_V}^\varepsilon(\rho_{AB})} D(\mathcal{R}_V(\tilde{\rho}_{AB}) \| \mathcal{R}_V(\mathcal{V}_A(\rho_A) \otimes \tilde{\rho}_B)).$
Now, since $U_A$ is unitary, Lemma 4 together with the set equality $\mathcal{R}_V[B_{\mathcal{R}_V}^\varepsilon(\rho_{AB})] = B^\varepsilon(\rho_{AB})$ implies
$I_4^\varepsilon(\mathcal{V}(\rho_{AB})) \geq \inf_{\tilde{\rho}_{AB} \in B_{\mathcal{R}_V}^\varepsilon(\rho_{AB})} D(\mathcal{R}_V(\tilde{\rho}_{AB}) \| \rho_A \otimes \mathrm{Tr}_A[\mathcal{R}_V(\tilde{\rho}_{AB})])$
$= \inf_{\hat{\rho}_{AB} \in B^\varepsilon(\rho_{AB})} D(\hat{\rho}_{AB} \| \rho_A \otimes \hat{\rho}_B)$
$= I_4^\varepsilon(\rho_{AB}),$
proving the claimed property.
Finally, consider the conditional entropy types $H_1$, $H_2$ and $H_3^\varepsilon$. Noting $\mathcal{V}_A(\mathbb{1}_A) = \mathbb{1}_A$ due to the fact that $\mathcal{V}_A$ is unitary, the proofs of local invariance are equivalent to those for $I_1$, $I_2$ and $I_3^\varepsilon$ given in the proof of Proposition 1, respectively, if $\rho_A$ is replaced by $\mathbb{1}_A$ and Lemma 3 is used instead of Lemma 2. □
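As a quick numerical sanity check of Propositions 1 and 2 in the special case of local unitaries (isometries with equal input and output dimension), the sketch below verifies that $I_1$ and $H_1$, instantiated with the Umegaki relative entropy, are unchanged under $U_A \otimes U_B$; testing a genuine embedding isometry would additionally require a support-projected relative entropy, which is omitted here. The helper functions and the random test state are illustrative assumptions rather than code from the paper.

```python
import numpy as np
from scipy.linalg import logm
from scipy.stats import unitary_group

def rel_entropy(rho, sigma):
    """Umegaki relative entropy Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

def partial_trace(rho_ab, d_a, d_b, keep):
    """Marginal of an operator on a (d_a x d_b)-dimensional bipartite system."""
    t = rho_ab.reshape(d_a, d_b, d_a, d_b)
    return np.trace(t, axis1=1, axis2=3) if keep == "A" else np.trace(t, axis1=0, axis2=2)

def i1(rho_ab, d_a, d_b):
    """Type I_1 with the Umegaki relative entropy: D(rho_AB || rho_A x rho_B)."""
    ra, rb = partial_trace(rho_ab, d_a, d_b, "A"), partial_trace(rho_ab, d_a, d_b, "B")
    return rel_entropy(rho_ab, np.kron(ra, rb))

def h1(rho_ab, d_a, d_b):
    """Type H_1 with the Umegaki relative entropy: D(rho_AB || 1_A x rho_B)."""
    rb = partial_trace(rho_ab, d_a, d_b, "B")
    return rel_entropy(rho_ab, np.kron(np.eye(d_a), rb))

rng = np.random.default_rng(7)
d_a = d_b = 2
g = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho_ab = g @ g.conj().T
rho_ab = rho_ab / np.trace(rho_ab).real               # random full-rank two-qubit state

U = np.kron(unitary_group.rvs(d_a, random_state=1), unitary_group.rvs(d_b, random_state=2))
rho_loc = U @ rho_ab @ U.conj().T                     # locally rotated state

assert abs(i1(rho_ab, d_a, d_b) - i1(rho_loc, d_a, d_b)) < 1e-8   # Proposition 1 (unitary case)
assert abs(h1(rho_ab, d_a, d_b) - h1(rho_loc, d_a, d_b)) < 1e-8   # Proposition 2 (unitary case)
```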

4. Summary and Conclusions

In this work, we defined types of quantum information quantities based on generalized divergences and analyzed their behavior under local isometric and unitary transformations. Leveraging properties of the reversal channel associated with the local transformation, we proved that several types of information quantities are invariant under any local isometric transformation, while others are invariant if one of the subsystems is transformed by a unitary. Two main technical results are utilized to prove the local invariance. First, the smoothing environment required by some types of information quantities is related to its image and its pre-image under the isometric channel and a corresponding reversal channel, implying the set relation of Lemma 1. Second, the action of the reversal channel for a local isometric channel is characterized in Lemmas 2–4. Combining these insights with the data-processing inequality of any generalized divergence allows us to derive the main results regarding invariance of several types of information quantities under local isometric or unitary transformations, as presented in Propositions 1 and 2.
Since the methods for proving invariance do not depend on the specific form of the generalized divergence used to define the information quantity, the invariances hold for numerous key quantities with operational relevance in quantum information processing. These invariances can enable more efficient computation of quantum information quantities by reducing complex states to simpler, equivalent forms with symmetries or smaller dimensions. For instance, consider the setting of quantum cryptography including a third party holding the purifying system of a bipartite state. If the output of a tripartite quantum channel is a pure state, it is equivalent to the purified output of a corresponding bipartite channel up to a local isometry in the purifying system. The dimension of the purification system of the latter state, however, may be smaller than the dimension of the former purification system. Consequently, the evaluation of relevant information quantities like the private information may be significantly improved by calculating it for the smaller but equivalent output state. Such invariances reduce computational overhead and unlock flexibility in protocol design. Provided that the protocol performance is measured by an invariant information quantity, local unitary or isometric operations like encoding or correction operations can be added, modified or removed while preserving the performance metrics.
Note that the general proofs of local invariance depend only on monotonicity under the data-processing inequality and on the reversal properties of the isometric channel as given in Lemmas 2–4. Therefore, local invariance may be shown similarly for other channels and corresponding reversal maps, such as the Petz recovery map (cf. [10]).
The computation of generalized information quantities remains a significant challenge in general. While some quantities, like the hypothesis testing mutual information, can be calculated efficiently, e.g., by semidefinite programs [23,24], others can only be approximated (see Ref. [25] regarding the smooth max-relative entropy) or, like the smooth max-mutual information, lack a general and efficient solution. Our results may enable new methods to bound or approximate these quantities by leveraging state equivalences. Finally, we note that invariance under local isometries, notably local Clifford and unitary operations, plays a pivotal role in the classification and optimization of graph states as used in error correction [26,27] or measurement-based quantum computation [28,29], allowing for the optimization of codes or resource states with respect to an invariant information measure. In conclusion, the results of this contribution allow us to derive the behavior of general types of information quantities under local isometric or unitary transformations. Therefore, they improve the capability to characterize and compute information quantities relevant throughout the vast field of quantum information processing.

Author Contributions

Conceptualization, C.P.; validation, C.P., T.C.S. and B.C.H.; formal analysis, C.P.; writing—original draft preparation, C.P.; writing—review and editing, C.P., T.C.S. and B.C.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in whole, or in part, by the Austrian Science Fund (FWF) [10.55776/P36102]. For the purpose of open access, the author has applied a CC BY public copyright licence to any Author Accepted Manuscript version arising from this submission.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Neumann, J.v. Thermodynamik quantenmechanischer Gesamtheiten. Nachrichten von der Ges. der Wiss. zu Göttingen, Math.-Phys. Kl. 1927, 1927, 273–291.
2. Wilde, M.M. Quantum Information Theory; Cambridge University Press: Cambridge, UK, 2013.
3. Umegaki, H. Conditional expectation in an operator algebra. IV. Entropy and information. Kodai Math. Semin. Rep. 1962, 14, 59–85.
4. Hiai, F.; Petz, D. The proper formula for relative entropy and its asymptotics in quantum probability. Commun. Math. Phys. 1991, 143, 99–114.
5. Devetak, I.; Winter, A. Distillation of secret key and entanglement from quantum states. Proc. R. Soc. A Math. Phys. Eng. Sci. 2005, 461, 207–235.
6. Tomamichel, M. Quantum Information Processing with Finite Resources—Mathematical Foundations; Springer: Cham, Switzerland, 2016; Volume 5.
7. Renner, R. Security of Quantum Key Distribution. arXiv 2006, arXiv:quant-ph/0512258.
8. Wang, L.; Renner, R. One-Shot Classical-Quantum Capacity and Hypothesis Testing. Phys. Rev. Lett. 2012, 108, 200501.
9. Buscemi, F.; Datta, N. The Quantum Capacity of Channels With Arbitrarily Correlated Noise. IEEE Trans. Inf. Theory 2010, 56, 1447–1460.
10. Khatri, S.; Wilde, M.M. Principles of Quantum Communication Theory: A Modern Approach. arXiv 2024, arXiv:2011.04672.
11. Petz, D. Quasi-entropies for States of a von Neumann Algebra. Publ. Res. Inst. Math. Sci. 1985, 21, 787–800.
12. Petz, D. Quasi-entropies for finite quantum systems. Rep. Math. Phys. 1986, 23, 57–65.
13. Tomamichel, M.; Colbeck, R.; Renner, R. A Fully Quantum Asymptotic Equipartition Property. IEEE Trans. Inf. Theory 2009, 55, 5840–5847.
14. Müller-Lennert, M.; Dupuis, F.; Szehr, O.; Fehr, S.; Tomamichel, M. On quantum Rényi entropies: A new generalization and some properties. J. Math. Phys. 2013, 54, 122203.
15. Wilde, M.M.; Winter, A.; Yang, D. Strong Converse for the Classical Capacity of Entanglement-Breaking and Hadamard Channels via a Sandwiched Rényi Relative Entropy. Commun. Math. Phys. 2014, 331, 593–622.
16. Petz, D.; Ruskai, M.B. Contraction of Generalized Relative Entropy Under Stochastic Mappings on Matrices. Infin. Dimens. Anal. Quantum Probab. Relat. Top. 1998, 1, 83–89.
17. Matsumoto, K. A New Quantum Version of f-Divergence. In Reality and Measurement in Algebraic Quantum Theory; Ozawa, M., Butterfield, J., Halvorson, H., Rédei, M., Kitajima, Y., Buscemi, F., Eds.; Springer: Singapore, 2018; pp. 229–273.
18. Polyanskiy, Y.; Verdú, S. Arimoto channel coding converse and Rényi divergence. In Proceedings of the 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton), Monticello, IL, USA, 29 September–1 October 2010; pp. 1327–1333.
19. Sharma, N.; Warsi, N.A. Fundamental Bound on the Reliability of Quantum Information Transmission. Phys. Rev. Lett. 2013, 110, 080501.
20. Rastegin, A.E. Sine distance for quantum states. arXiv 2006, arXiv:quant-ph/0602112.
21. Uhlmann, A. The “transition probability” in the state space of a *-algebra. Rep. Math. Phys. 1976, 9, 273–279.
22. Datta, N. Min- and Max-Relative Entropies and a New Entanglement Monotone. IEEE Trans. Inf. Theory 2009, 55, 2816–2826.
23. Dupuis, F.; Krämer, L.; Faist, P.; Renes, J.M.; Renner, R. Generalized entropies. In XVIIth International Congress on Mathematical Physics; World Scientific: Singapore, 2012; pp. 134–153.
24. Datta, N.; Tomamichel, M.; Wilde, M.M. On the second-order asymptotics for entanglement-assisted communication. Quantum Inf. Process. 2016, 15, 2569–2591.
25. Nuradha, T.; Wilde, M.M. Fidelity-Based Smooth Min-Relative Entropy: Properties and Applications. IEEE Trans. Inf. Theory 2024, 70, 4170–4196.
26. Bény, C.; Oreshkov, O. General Conditions for Approximate Quantum Error Correction and Near-Optimal Recovery Channels. Phys. Rev. Lett. 2010, 104, 120501.
27. Sarkar, R.; Yoder, T.J. A graph-based formalism for surface codes and twists. Quantum 2024, 8, 1416.
28. Hein, M.; Eisert, J.; Briegel, H.J. Multiparty entanglement in graph states. Phys. Rev. A 2004, 69, 062311.
29. Briegel, H.J.; Browne, D.E.; Dür, W.; Raussendorf, R.; Van den Nest, M. Measurement-based quantum computation. Nat. Phys. 2009, 5, 19–26.
Figure 1. Schematic visualization of the smoothing environment $B^\varepsilon$ and its image $B_{\mathcal{V}}^\varepsilon$ and pre-image $B_{\mathcal{R}_V}^\varepsilon$ under the isometric channel $\mathcal{V}$ and its reversal channel $\mathcal{R}_V$.