
Generalized relative entropy

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Anirudh215 (talk | contribs) at 20:16, 13 December 2013 (Definitions). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

The generalized relative entropy ($\epsilon$-relative entropy) is a measure of the difference between two quantum states. As the name suggests, it is a generalization of the quantum relative entropy, with which it shares many properties: for instance, it is non-negative and obeys strong subadditivity.

In the study of quantum information theory, one typically assumes that an experiment is repeated independently many times; measures of information processing are defined in this asymptotic limit. The quintessential entropy measure is the von Neumann entropy. In contrast, the study of one-shot quantum information theory is concerned with information processing when the experiment is conducted only once. New entropic measures are needed in this scenario, as the von Neumann entropy ceases to give a correct characterization of operational quantities. Generalizations of the von Neumann entropy have been developed for this purpose. One particularly interesting such measure is the $\epsilon$-relative entropy.

In the asymptotic scenario, the relative entropy acts as a parent quantity for other measures besides being an important measure itself. Similarly, the $\epsilon$-relative entropy functions as a parent quantity for other measures in the one-shot scenario.

Definitions

To motivate the definition of the $\epsilon$-relative entropy $D^{\epsilon}_H(\rho\|\sigma)$, we consider the information processing protocol of hypothesis testing. In hypothesis testing, we have to devise a strategy to distinguish between two density operators $\rho$ and $\sigma$. A strategy is a POVM with elements $Q$ and $I - Q$. The probability that the strategy produces a correct guess on input $\rho$ is given by $\operatorname{Tr}(\rho Q)$ and the probability that it produces a wrong guess is given by $1 - \operatorname{Tr}(\rho Q)$. The $\epsilon$-relative entropy is the negative logarithm of the minimum probability of failure when the state is $\sigma$, given that the success probability for $\rho$ is at least $\epsilon$.

Definition: For $\epsilon \in (0,1)$, the $\epsilon$-relative entropy between two density operators $\rho$ and $\sigma$ is defined as

$$D^{\epsilon}_H(\rho\|\sigma) := -\log\frac{1}{\epsilon}\,\min\bigl\{\operatorname{Tr}(Q\sigma) \;\big|\; 0\le Q\le I \ \text{and} \ \operatorname{Tr}(Q\rho)\ge\epsilon\bigr\}\,.$$

From the definition, it is clear that $D^{\epsilon}_H(\rho\|\sigma) \ge 0$. This inequality is saturated if and only if $\rho = \sigma$ (shown below).
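When $\rho$ and $\sigma$ commute they are simultaneously diagonal, and the semidefinite program in the definition reduces to a fractional-knapsack linear program over the diagonal of $Q$. The following sketch (function names are illustrative, not from the cited papers; base-2 logarithms assumed) computes $D^{\epsilon}_H$ for such states and checks that it is non-negative and vanishes when the two states coincide.

```python
import math

def d_eps_h(p, q, eps):
    """D_H^eps(rho||sigma) for commuting states, given as probability
    vectors p (rho) and q (sigma).

    The program  min Tr(Q sigma)  s.t.  Tr(Q rho) >= eps, 0 <= Q <= I
    reduces, for diagonal states, to a fractional knapsack:
    acquire rho-mass where the sigma-cost per unit of rho is cheapest.
    """
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    budget, cost = eps, 0.0
    for i in order:
        take = min(1.0, budget / p[i])   # diagonal entry Q_ii in [0, 1]
        cost += take * q[i]
        budget -= take * p[i]
        if budget <= 1e-15:
            break
    return -math.log2(cost / eps)

rho = [0.7, 0.2, 0.1]
sigma = [0.2, 0.3, 0.5]
d = d_eps_h(rho, sigma, 0.5)       # strictly positive: the states differ
d_self = d_eps_h(rho, rho, 0.5)    # zero: D_H^eps(rho||rho) = 0
```

The greedy order (smallest $q_i/p_i$ first) is the standard optimal strategy for this fractional knapsack, which is why no general-purpose SDP solver is needed in the commuting case.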

Two related quantities, the min- and max-relative entropies, are defined as

$$D_{\min}(\rho\|\sigma) := -\log F(\rho,\sigma)^2\,, \qquad D_{\max}(\rho\|\sigma) := \log\min\{\lambda : \rho \le \lambda\sigma\}\,,$$

where $F(\rho,\sigma) = \|\sqrt{\rho}\,\sqrt{\sigma}\|_1$ denotes the fidelity between $\rho$ and $\sigma$. Intuitively, the quantity $D_{\max}(\rho\|\sigma)$ can be interpreted as follows: for any fixed measurement that we perform on $\rho$ and $\sigma$, the probability of detecting $\rho$ is at most $2^{D_{\max}(\rho\|\sigma)}$ times the probability of detecting $\sigma$.

The corresponding smooth min- and max-relative entropies are defined as

$$D^{\epsilon}_{\min}(\rho\|\sigma) := \max_{\bar\rho\in B^{\epsilon}(\rho)} D_{\min}(\bar\rho\|\sigma)\,, \qquad D^{\epsilon}_{\max}(\rho\|\sigma) := \min_{\bar\rho\in B^{\epsilon}(\rho)} D_{\max}(\bar\rho\|\sigma)\,,$$

where $\bar\rho$ ranges over the ball $B^{\epsilon}(\rho)$ of positive semi-definite operators that have trace at most 1 and are $\epsilon$-close to $\rho$. These quantities are discussed in detail by Datta.[1]
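For commuting states the fidelity reduces to $F = \sum_i \sqrt{p_i q_i}$, and the smallest $\lambda$ with $\rho \le \lambda\sigma$ is $\max_i p_i/q_i$, so the non-smoothed min- and max-relative entropies can be evaluated directly. A minimal sketch under these assumptions (illustrative function names; conventions $D_{\min} = -\log F^2$ and $D_{\max} = \log\min\{\lambda : \rho\le\lambda\sigma\}$, base-2 logarithms):

```python
import math

def d_min(p, q):
    """D_min = -log F(rho, sigma)^2 with F = sum_i sqrt(p_i * q_i)
    for commuting states (diagonal density operators)."""
    f = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    return -2.0 * math.log2(f)

def d_max(p, q):
    """D_max = log min{lambda : rho <= lambda * sigma}; for commuting
    states the smallest such lambda is max_i p_i / q_i."""
    return math.log2(max(pi / qi for pi, qi in zip(p, q)))

rho = [0.7, 0.2, 0.1]
sigma = [0.2, 0.3, 0.5]
lo, hi = d_min(rho, sigma), d_max(rho, sigma)   # lo <= hi here
```

Since $F \le 1$, the value `lo` is non-negative, and on this example it is strictly below `hi`, consistent with the min/max naming.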

Relationship to the trace distance

Suppose the trace distance between two density operators $\rho$ and $\sigma$ is

$$\delta(\rho,\sigma) = \frac{1}{2}\,\|\rho-\sigma\|_1\,.$$

For $\epsilon \in (0,1)$, it holds that

a) $\displaystyle \log\frac{1}{1-\frac{1-\epsilon}{\epsilon}\,\delta(\rho,\sigma)} \;\le\; D^{\epsilon}_H(\rho\|\sigma) \;\le\; \log\frac{1}{1-\frac{1}{\epsilon}\,\delta(\rho,\sigma)}\,,$ where the upper bound is nontrivial when $\delta(\rho,\sigma) < \epsilon$.

In particular, this implies the following analogue of the Pinsker inequality[2]

b) $\displaystyle D^{\epsilon}_H(\rho\|\sigma) \;\ge\; \frac{1-\epsilon}{\epsilon}\,\delta(\rho,\sigma)\,.$

Furthermore, the proposition implies that for any $\epsilon \in (0,1)$, $D^{\epsilon}_H(\rho\|\sigma) = 0$ if and only if $\rho = \sigma$, inheriting this property from the trace distance. This result and its proof can be found in Dupuis et al.[3]
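These trace-distance bounds can be spot-checked numerically for commuting states, where the minimization in the definition reduces to a fractional knapsack. A sketch (illustrative names; base-2 logarithms and the $1/\epsilon$-normalized definition assumed), with the two sides of bound (a) and the Pinsker-like bound (b) coded explicitly:

```python
import math

def d_eps_h(p, q, eps):
    """D_H^eps for commuting states given as probability vectors."""
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    budget, cost = eps, 0.0
    for i in order:
        take = min(1.0, budget / p[i])
        cost += take * q[i]
        budget -= take * p[i]
        if budget <= 1e-15:
            break
    return -math.log2(cost / eps)

rho = [0.7, 0.2, 0.1]
sigma = [0.2, 0.3, 0.5]
eps = 0.75
delta = 0.5 * sum(abs(a - b) for a, b in zip(rho, sigma))   # trace distance

d = d_eps_h(rho, sigma, eps)
lower = math.log2(1.0 / (1.0 - (1.0 - eps) / eps * delta))  # bound (a), left
upper = math.log2(1.0 / (1.0 - delta / eps))                # bound (a), right
pinsker = (1.0 - eps) / eps * delta                         # bound (b)
```

Here $\delta = 0.5 < \epsilon = 0.75$, so the upper bound is finite and all three bounds bracket the computed value.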

Proof of the inequality

Proof of inequality a):

Upper bound: The trace distance can be written as

$$\delta(\rho,\sigma) = \max_{0\le Q\le I} \operatorname{Tr}\bigl(Q(\rho-\sigma)\bigr)\,.$$

This maximum is achieved when $Q$ is the projector onto the positive part of $\rho-\sigma$. For any $Q$ such that $\operatorname{Tr}(Q\rho)\ge\epsilon$ we have

$$\operatorname{Tr}(Q\sigma) \;\ge\; \operatorname{Tr}(Q\rho) - \delta(\rho,\sigma) \;\ge\; \epsilon - \delta(\rho,\sigma)\,.$$

From the definition of the $\epsilon$-relative entropy, we get

$$D^{\epsilon}_H(\rho\|\sigma) \;\le\; -\log\frac{\epsilon-\delta(\rho,\sigma)}{\epsilon} \;=\; \log\frac{1}{1-\frac{1}{\epsilon}\,\delta(\rho,\sigma)}\,.$$

Lower bound: Let $\Pi$ be the projector onto the positive part of $\rho-\sigma$, so that $\operatorname{Tr}\bigl(\Pi(\rho-\sigma)\bigr) = \delta(\rho,\sigma)$. Choose $Q$ such that it is the following convex combination of $\Pi$ and the identity:

$$Q = \mu\Pi + (1-\mu)I\,,$$

where

$$\mu = \frac{1-\epsilon}{1-\operatorname{Tr}(\Pi\rho)}\,.$$

This means $\operatorname{Tr}(Q\rho) = 1 - \mu\bigl(1-\operatorname{Tr}(\Pi\rho)\bigr) = \epsilon$ and thus $Q$ is a feasible candidate in the definition of the $\epsilon$-relative entropy.

Moreover,

$$\operatorname{Tr}(Q\sigma) = 1 - \mu\bigl(1-\operatorname{Tr}(\Pi\sigma)\bigr)\,.$$

We can substitute the above expression for $\mu$, together with $\operatorname{Tr}(\Pi\sigma) = \operatorname{Tr}(\Pi\rho) - \delta(\rho,\sigma)$, to re-write this as

$$\operatorname{Tr}(Q\sigma) = \epsilon - \frac{(1-\epsilon)\,\delta(\rho,\sigma)}{1-\operatorname{Tr}(\Pi\rho)} \;\le\; \epsilon - (1-\epsilon)\,\delta(\rho,\sigma)\,,$$

using $1-\operatorname{Tr}(\Pi\rho)\le 1$ in the last step.

Hence

$$D^{\epsilon}_H(\rho\|\sigma) \;\ge\; -\log\frac{\epsilon-(1-\epsilon)\,\delta(\rho,\sigma)}{\epsilon} \;=\; \log\frac{1}{1-\frac{1-\epsilon}{\epsilon}\,\delta(\rho,\sigma)}\,.$$
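For commuting states, the projector $\Pi$ onto the positive part of $\rho-\sigma$ is simply the indicator of the entries where $\rho$ exceeds $\sigma$, so the feasibility and value of the convex-combination operator used in the lower bound can be verified directly. A sketch (assumes $\operatorname{Tr}(\Pi\rho)\le\epsilon$, so that $\mu\le 1$ and $0\le Q\le I$):

```python
rho = [0.7, 0.2, 0.1]
sigma = [0.2, 0.3, 0.5]
eps = 0.75

# Projector onto the positive part of rho - sigma (diagonal case).
proj = [1.0 if p > q else 0.0 for p, q in zip(rho, sigma)]
tr_proj_rho = sum(c * p for c, p in zip(proj, rho))      # Tr(Pi rho)
tr_proj_sigma = sum(c * q for c, q in zip(proj, sigma))  # Tr(Pi sigma)
delta = tr_proj_rho - tr_proj_sigma                      # = trace distance

mu = (1.0 - eps) / (1.0 - tr_proj_rho)   # needs Tr(Pi rho) <= eps here
Q = [mu * c + (1.0 - mu) for c in proj]  # Q = mu*Pi + (1 - mu)*I

tr_q_rho = sum(x * p for x, p in zip(Q, rho))      # feasibility: equals eps
tr_q_sigma = sum(x * q for x, q in zip(Q, sigma))  # equals eps - mu*delta
```

The assertions below confirm the two identities used in the proof and the final estimate $\operatorname{Tr}(Q\sigma)\le\epsilon-(1-\epsilon)\,\delta$.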

Proof of inequality (b):

Pinsker-like inequality: Observe that

$$D^{\epsilon}_H(\rho\|\sigma) \;\ge\; \log\frac{1}{1-\frac{1-\epsilon}{\epsilon}\,\delta(\rho,\sigma)} \;\ge\; \frac{1-\epsilon}{\epsilon}\,\delta(\rho,\sigma)\,,$$

where the last step uses the fact that $-\log(1-x)\ge x$ for $x\in[0,1)$.

Data processing inequality

One of the properties that the $\epsilon$-relative entropy and the von Neumann entropy share is strong subadditivity. Intuitively, this inequality states that the information content of a system cannot increase when one performs only local operations on that system. This inequality is also known as the data processing inequality.

For the case of the von Neumann entropy, strong subadditivity can be expressed as follows: let $\rho_{ABC}$ be a density operator on the tensor product Hilbert space $\mathcal{H}_A\otimes\mathcal{H}_B\otimes\mathcal{H}_C$. Then

$$S(\rho_{AB}) + S(\rho_{BC}) \;\ge\; S(\rho_{ABC}) + S(\rho_{B})\,,$$

where $\rho_{AB}$, $\rho_{BC}$ and $\rho_{B}$ refer to the marginals of $\rho_{ABC}$. Using the relative entropy, this inequality can equivalently be expressed as[4]

$$D(\rho\|\sigma) \;\ge\; D\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)$$

for every CPTP map $\mathcal{E}$. In the literature, this inequality is referred to as the monotonicity of the relative entropy under CPTP maps.

The $\epsilon$-relative entropy also obeys strong subadditivity. This can be expressed as

$$D^{\epsilon}_H(\rho\|\sigma) \;\ge\; D^{\epsilon}_H\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)\,.$$

Not only is this interesting in itself, but it also allows us to prove the strong subadditivity of the von Neumann entropy in yet another way.

Monotonicity of the $\epsilon$-relative entropy

Wang and Renner showed[5] that the $\epsilon$-relative entropy is monotonic under CPTP maps such as the partial trace. The proof goes as follows.

Let $Q$ be the measurement operator achieving the minimum in the definition of $D^{\epsilon}_H\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)$, i.e., the optimal POVM $\{Q,\, I-Q\}$ for distinguishing $\mathcal{E}(\rho)$ and $\mathcal{E}(\sigma)$. We can then construct a POVM $\{\mathcal{E}^{\dagger}(Q),\, I-\mathcal{E}^{\dagger}(Q)\}$ to distinguish between $\rho$ and $\sigma$ by preceding the given POVM with the CPTP map $\mathcal{E}$. This candidate is feasible, since $\operatorname{Tr}\bigl(\mathcal{E}^{\dagger}(Q)\rho\bigr) = \operatorname{Tr}\bigl(Q\,\mathcal{E}(\rho)\bigr) \ge \epsilon$, and it attains $\operatorname{Tr}\bigl(\mathcal{E}^{\dagger}(Q)\sigma\bigr) = \operatorname{Tr}\bigl(Q\,\mathcal{E}(\sigma)\bigr)$. Because the minimum in the definition of $D^{\epsilon}_H(\rho\|\sigma)$ is taken over all feasible measurement operators, including $\mathcal{E}^{\dagger}(Q)$, it follows that

$$D^{\epsilon}_H(\rho\|\sigma) \;\ge\; D^{\epsilon}_H\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)\,.$$
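Restricted to commuting states, a CPTP map acts as a column-stochastic matrix on the diagonal entries, so the monotonicity can be spot-checked numerically. An illustrative sketch (the stochastic map below is an arbitrary example, not taken from the cited papers):

```python
import math

def d_eps_h(p, q, eps):
    """D_H^eps for commuting states given as probability vectors."""
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    budget, cost = eps, 0.0
    for i in order:
        take = min(1.0, budget / p[i])
        cost += take * q[i]
        budget -= take * p[i]
        if budget <= 1e-15:
            break
    return -math.log2(cost / eps)

def apply_channel(T, p):
    """Apply a column-stochastic matrix T (the classical analogue of a
    CPTP map on diagonal states) to a probability vector p."""
    return [sum(T[i][j] * p[j] for j in range(len(p))) for i in range(len(T))]

rho = [0.7, 0.2, 0.1]
sigma = [0.2, 0.3, 0.5]
T = [[0.9, 0.3, 0.5],   # columns sum to 1: a valid stochastic map
     [0.1, 0.7, 0.5]]

d_in = d_eps_h(rho, sigma, 0.5)
d_out = d_eps_h(apply_channel(T, rho), apply_channel(T, sigma), 0.5)
```

The processed pair is harder to distinguish, so `d_out` does not exceed `d_in`, exactly as the data processing inequality demands.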

Proof of the data processing inequality for the von Neumann entropy

This proof can be found in Dupuis et al.[6] From the quantum Stein's lemma,[7] it follows that

$$D(\rho\|\sigma) = \lim_{n\to\infty}\frac{1}{n}\Bigl(-\log\min\operatorname{Tr}\bigl(Q\,\sigma^{\otimes n}\bigr)\Bigr)\,,$$

where the minimum is taken over all $0\le Q\le I$ such that $\operatorname{Tr}\bigl(Q\,\rho^{\otimes n}\bigr)\ge\epsilon$. Up to the additive constant $\log\frac{1}{\epsilon}$, which vanishes after dividing by $n$, this minimum is exactly the one appearing in the definition of the $\epsilon$-relative entropy, so

$$D(\rho\|\sigma) = \lim_{n\to\infty}\frac{1}{n}\,D^{\epsilon}_H\bigl(\rho^{\otimes n}\,\big\|\,\sigma^{\otimes n}\bigr)\,.$$

Applying the data processing inequality to the states $\rho^{\otimes n}$ and $\sigma^{\otimes n}$ with the CPTP map $\mathcal{E}^{\otimes n}$, we get

$$D^{\epsilon}_H\bigl(\rho^{\otimes n}\,\big\|\,\sigma^{\otimes n}\bigr) \;\ge\; D^{\epsilon}_H\bigl(\mathcal{E}(\rho)^{\otimes n}\,\big\|\,\mathcal{E}(\sigma)^{\otimes n}\bigr)\,.$$

Dividing both sides by $n$ and taking the limit $n\to\infty$, we get the desired result, $D(\rho\|\sigma) \ge D\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)$.
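The limit from Stein's lemma can be illustrated numerically for commuting qubit states: the regularized one-shot quantity approaches the relative entropy as $n$ grows. A sketch with illustrative names (convergence is slow, on the order of $1/\sqrt{n}$, so only a loose tolerance is checked):

```python
import math

def d_eps_h(p, q, eps):
    """D_H^eps for commuting states given as probability vectors."""
    order = sorted(range(len(p)), key=lambda i: q[i] / p[i])
    budget, cost = eps, 0.0
    for i in order:
        take = min(1.0, budget / p[i])
        cost += take * q[i]
        budget -= take * p[i]
        if budget <= 1e-15:
            break
    return -math.log2(cost / eps)

def tensor_power(p, n):
    """Probability vector of the n-fold i.i.d. product distribution."""
    out = [1.0]
    for _ in range(n):
        out = [a * b for a in out for b in p]
    return out

rho, sigma = [0.75, 0.25], [0.25, 0.75]
d_rel = sum(p * math.log2(p / q) for p, q in zip(rho, sigma))  # D(rho||sigma)

# Gap between (1/n) * D_H^eps(rho^n || sigma^n) and D(rho||sigma) at two n.
errors = [abs(d_eps_h(tensor_power(rho, n), tensor_power(sigma, n), 0.5) / n
              - d_rel)
          for n in (2, 12)]
```

The gap shrinks as $n$ increases, consistent with the regularized one-shot quantity converging to $D(\rho\|\sigma)$.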

See also

Quantum relative entropy

Strong subadditivity

Classical information theory

Min entropy

References

  1. ^ Datta, Nilanjana. "Min- and max-relative entropies and a new entanglement monotone." IEEE Transactions on Information Theory 55.6 (2009): 2816-2826.
  2. ^ Watrous, Theory of Quantum Information, Fall 2013. Ch. 5, page 194 https://cs.uwaterloo.ca/~watrous/CS766/DraftChapters/5.QuantumEntropy.pdf
  3. ^ Dupuis, F., et al. "Generalized entropies." arXiv preprint arXiv:1211.3141 (2012).
  4. ^ Ruskai, Mary Beth. "Inequalities for quantum entropy: A review with conditions for equality." Journal of Mathematical Physics 43 (2002): 4358. arXiv: quant-ph/0205064
  5. ^ Wang, Ligong, and Renato Renner. "One-shot classical-quantum capacity and hypothesis testing." Physical Review Letters 108.20 (2012): 200501. arXiv:1007.5456v3
  6. ^ Dupuis, F., et al. "Generalized entropies." arXiv preprint arXiv:1211.3141 (2012).
  7. ^ Petz, Dénes. Quantum information theory and quantum statistics. Springer, 2008. Chapter 8