
Generalized relative entropy


In the study of quantum information theory, one typically assumes that information-processing tasks are repeated independently many times, and the measures of information processing are defined in this asymptotic limit. The quintessential entropy measure is the von Neumann entropy. In contrast, the study of one-shot quantum information theory is concerned with information processing when the experiment is conducted only once. New measures are needed in this scenario, as the von Neumann entropy ceases to give a correct characterization of operational quantities. Generalizations of the von Neumann entropy have been developed over the past few years. One particularly interesting measure is the ε-relative entropy.

In the asymptotic scenario, the relative entropy acts as a parent quantity for other measures besides being an important measure in itself. Similarly, the ε-relative entropy functions as a parent quantity for other measures in the one-shot scenario.

Definitions

From an operational perspective, the ε-relative entropy is defined in the context of hypothesis testing. Here, we have to devise a strategy to distinguish between two density operators ρ and σ. A strategy is defined by a POVM with elements Q and 1 − Q, where 0 ≤ Q ≤ 1. The probability that the strategy produces a correct guess on input ρ is given by Tr(Qρ), and the probability that it produces a wrong guess on input σ is given by Tr(Qσ). The ε-relative entropy represents the negative logarithm of the (normalized) probability of failure when the state is σ, given that the success probability for ρ is at least ε.

Definition: For ε ∈ (0, 1), the ε-relative entropy between two density operators ρ and σ is

D^{\epsilon}(\rho\|\sigma) \;=\; -\log\frac{1}{\epsilon}\,\min\bigl\{\operatorname{Tr}(Q\sigma) \;:\; 0 \le Q \le 1 \ \text{and}\ \operatorname{Tr}(Q\rho) \ge \epsilon\bigr\}.

From the definition it is clear that D^ε(ρ‖σ) ≥ 0, since Q = ε·1 is always a feasible test. This inequality is saturated if and only if ρ = σ (shown below).
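As a small worked example (ours, for illustration; not taken from the cited literature), the minimization can be carried out explicitly for a pair of commuting states:

    % Take rho = |0><0| and sigma = (1-delta)|0><0| + delta|1><1| with 0 < delta < 1.
    % A test Q = q_0|0><0| + q_1|1><1| is feasible iff q_0 = Tr(Q rho) >= eps, so
    % Tr(Q sigma) = (1-delta) q_0 + delta q_1 >= eps (1-delta),
    % with equality at q_0 = eps, q_1 = 0. Hence
    D^{\epsilon}(\rho\|\sigma) \;=\; -\log\frac{\epsilon(1-\delta)}{\epsilon} \;=\; \log\frac{1}{1-\delta}.

Note that the optimal test puts all of its weight on the support of ρ.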

Two further one-shot quantities, the relative min- and max-entropies, are defined as

D_{\min}(\rho\|\sigma) \;=\; -\log\operatorname{Tr}\bigl(\Pi_{\rho}\,\sigma\bigr),

where Π_ρ is the projector onto the support of ρ, and

D_{\max}(\rho\|\sigma) \;=\; \log\min\{\lambda \;:\; \rho \le \lambda\,\sigma\}.

The corresponding smooth max-relative entropy is defined as

D^{\epsilon}_{\max}(\rho\|\sigma) \;=\; \min_{\bar\rho\in\mathcal{B}^{\epsilon}(\rho)} D_{\max}(\bar\rho\|\sigma),

where ρ̄ ranges over the ε-ball 𝓑^ε(ρ) around ρ, the set of all positive semi-definite operators with trace less than or equal to 1 that are within distance ε of ρ.
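For commuting states the minimization in D_max can be carried out explicitly; the states below are arbitrary illustrative data:

    % For rho = diag(p_1,...,p_n) and sigma = diag(q_1,...,q_n) with full support,
    % rho <= lambda*sigma holds iff lambda >= p_i/q_i for every i, hence
    D_{\max}(\rho\|\sigma) \;=\; \log\max_{i}\frac{p_i}{q_i}.
    % Example: rho = diag(1/2, 1/2), sigma = diag(1/4, 3/4) gives
    % D_max = log max(2, 2/3) = log 2 = 1 bit.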

Relationship to the trace distance

This result and its proof can be found in Dupuis et al.[1] Suppose the trace distance between two density operators ρ and σ is

\delta \;=\; \tfrac{1}{2}\,\|\rho-\sigma\|_{1}.

For 0 < ε < 1, it holds that

\log\frac{1}{1-(1-\epsilon)\delta} \;\le\; D^{\epsilon}(\rho\|\sigma) \;\le\; \log\frac{\epsilon}{\epsilon-\delta},

where the upper bound requires δ < ε (for δ ≥ ε it is vacuous). In particular, this implies the following analogue of the Pinsker inequality[2]

D^{\epsilon}(\rho\|\sigma) \;\ge\; (1-\epsilon)\,\delta\,\log e .

Furthermore, the proposition implies that for 0 < ε < 1, D^ε(ρ‖σ) = 0 if and only if ρ = σ, so the ε-relative entropy inherits this property from the trace distance.
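For commuting (diagonal) states, the minimization in the definition of D^ε is a linear program, so the bounds above can be checked numerically. The Python sketch below is ours and purely illustrative: eps_relative_entropy is a hypothetical helper (not a standard library routine), and p, q, eps are arbitrary test data.

    import numpy as np
    from scipy.optimize import linprog

    def eps_relative_entropy(p, q, eps):
        # D^eps(p||q) = -log2((1/eps) * min q.x)  subject to  p.x >= eps, 0 <= x <= 1,
        # specialized to commuting states given as probability vectors p and q.
        res = linprog(c=q, A_ub=[-p], b_ub=[-eps], bounds=[(0, 1)] * len(p))
        return -np.log2(res.fun / eps)

    p = np.array([0.5, 0.5])
    q = np.array([0.1, 0.9])
    eps = 0.9
    delta = 0.5 * np.abs(p - q).sum()            # trace distance
    d = eps_relative_entropy(p, q, eps)
    assert -np.log2(1 - (1 - eps) * delta) <= d <= np.log2(eps / (eps - delta))

For these inputs the three quantities evaluate to approximately 0.06, 0.13 and 0.85 bits, respectively.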

Proof of the inequality

Upper bound: The trace distance can be written as

\delta \;=\; \max_{0\le Q\le 1}\operatorname{Tr}\bigl(Q(\rho-\sigma)\bigr).

This maximum is achieved when Q is the orthogonal projector onto the positive part of ρ − σ. For any test operator Q that is feasible in the definition of D^ε(ρ‖σ), i.e. 0 ≤ Q ≤ 1 and Tr(Qρ) ≥ ε, it follows that

\operatorname{Tr}(Q\sigma) \;\ge\; \operatorname{Tr}(Q\rho) - \delta \;\ge\; \epsilon - \delta .

This directly implies that

D^{\epsilon}(\rho\|\sigma) \;\le\; -\log\frac{\epsilon-\delta}{\epsilon} \;=\; \log\frac{\epsilon}{\epsilon-\delta} .

Lower bound: Let P be the projector onto the positive part of ρ − σ, and choose the test operator Q to be the following combination of P and the identity:

Q \;=\; \epsilon(1-\epsilon)\,P \;+\; \epsilon\bigl(1-(1-\epsilon)p\bigr)\,\mathbb{1},

where

p \;=\; \operatorname{Tr}(P\rho), \qquad \operatorname{Tr}(P\sigma) \;=\; p-\delta .

This means 0 ≤ Q ≤ 1 and Tr(Qρ) = ε, and thus Q is feasible (see the check below). Moreover,

\operatorname{Tr}(Q\sigma) \;=\; \epsilon(1-\epsilon)(p-\delta) \;+\; \epsilon\bigl(1-(1-\epsilon)p\bigr) \;=\; \epsilon\bigl(1-(1-\epsilon)\delta\bigr).

Hence

D^{\epsilon}(\rho\|\sigma) \;\ge\; \log\frac{1}{1-(1-\epsilon)\delta} .
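The feasibility of Q can be spelled out explicitly; the following check assumes the particular choice of Q made above.

    % Q has eigenvalue eps(1-eps) + eps(1-(1-eps)p) = eps(1 + (1-eps)(1-p)) on the
    % support of P, and eigenvalue eps(1-(1-eps)p) on its complement; both are
    % nonnegative, and since (1-eps)(1-p) <= 1-eps,
    \epsilon\bigl(1+(1-\epsilon)(1-p)\bigr) \;\le\; \epsilon(2-\epsilon) \;=\; 1-(1-\epsilon)^{2} \;\le\; 1,
    % so 0 <= Q <= 1.  Feasibility: Tr(Q rho) = eps(1-eps)p + eps(1-(1-eps)p) = eps.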

Pinsker-like inequality: Observe that

-\log(1-x) \;\ge\; x\log e \qquad \text{for } 0 \le x < 1,

and apply this with x = (1 − ε)δ to the lower bound just proved.

Data processing inequality

One of the properties that the ε-relative entropy and the von Neumann entropy share is strong subadditivity. Intuitively, this inequality states that the information content in a system cannot increase when one performs only local operations on that system. This inequality is also known as the data processing inequality.

For the case of the von Neumann entropy, strong subadditivity can be expressed as follows: for a tripartite state ρ_ABC,

S(\rho_{ABC}) + S(\rho_{B}) \;\le\; S(\rho_{AB}) + S(\rho_{BC}) .

Using the relative entropy, this inequality is equivalent to[3]

D(\rho\|\sigma) \;\ge\; D\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)

for every CPTP map 𝓔. In the literature, this inequality is referred to as the monotonicity of the relative entropy under CPTP maps.
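To see how strong subadditivity follows from monotonicity, apply the inequality with 𝓔 = Tr_C to the states ρ_ABC and ρ_A ⊗ ρ_BC; this is a standard argument, sketched here for concreteness.

    % Monotonicity under the partial trace over C:
    D(\rho_{ABC}\,\|\,\rho_{A}\otimes\rho_{BC}) \;\ge\; D(\rho_{AB}\,\|\,\rho_{A}\otimes\rho_{B}).
    % Expanding both sides using D(rho||sigma) = -S(rho) - Tr(rho log sigma):
    -S(\rho_{ABC}) + S(\rho_{A}) + S(\rho_{BC}) \;\ge\; -S(\rho_{AB}) + S(\rho_{A}) + S(\rho_{B}),
    % which rearranges to strong subadditivity:
    S(\rho_{ABC}) + S(\rho_{B}) \;\le\; S(\rho_{AB}) + S(\rho_{BC}).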

One of the interesting features of the ε-relative entropy is that it too obeys strong subadditivity. This can be expressed as

D^{\epsilon}(\rho\|\sigma) \;\ge\; D^{\epsilon}\bigl(\mathcal{E}(\rho)\,\|\,\mathcal{E}(\sigma)\bigr)

for every CPTP map 𝓔.

Not only is this interesting in itself, but it also allows us to prove the strong subadditivity of the von Neumann entropy in yet another way.

Monotonicity of the ε-relative entropy

Wang and Renner showed[4] that the ε-relative entropy is monotonic under CPTP maps, and in particular under the partial trace. The proof goes as follows.

Suppose we had some POVM {Q, 1 − Q} to distinguish between 𝓔(ρ) and 𝓔(σ), with Tr(Q𝓔(ρ)) ≥ ε. We can then construct a new POVM {𝓔*(Q), 1 − 𝓔*(Q)} to distinguish between ρ and σ by preceding the given POVM with the CPTP map 𝓔, where 𝓔* denotes the adjoint of 𝓔; the new test is feasible because Tr(𝓔*(Q)ρ) = Tr(Q𝓔(ρ)) ≥ ε. However,

\operatorname{Tr}\bigl(Q\,\mathcal{E}(\sigma)\bigr) \;=\; \operatorname{Tr}\bigl(\mathcal{E}^{*}(Q)\,\sigma\bigr) \;\ge\; \epsilon\,2^{-D^{\epsilon}(\rho\|\sigma)},

because, by definition, the test operator in D^ε(ρ‖σ) was chosen optimally over all feasible tests, of which 𝓔*(Q) is one. Minimizing the left-hand side over all feasible Q yields D^ε(𝓔(ρ)‖𝓔(σ)) ≤ D^ε(ρ‖σ).
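For commuting states, this monotonicity can be observed numerically with any classical channel (a column-stochastic matrix); the sketch below reuses the hypothetical eps_relative_entropy helper and the test data p, q, eps from above, and T is an arbitrary channel.

    T = np.array([[0.8, 0.3],
                  [0.2, 0.7]])   # columns sum to 1: a classical CPTP map
    assert eps_relative_entropy(T @ p, T @ q, eps) <= eps_relative_entropy(p, q, eps)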

Proof of the data processing inequality for the von Neumann entropy

This proof can be found in Dupuis et al.[5] From the quantum Stein's lemma[6], it follows that

D(\rho\|\sigma) \;=\; \lim_{n\to\infty}\frac{1}{n}\,D^{\epsilon}\bigl(\rho^{\otimes n}\,\|\,\sigma^{\otimes n}\bigr) \;=\; \lim_{n\to\infty}\frac{1}{n}\Bigl(-\log\frac{1}{\epsilon}\min\operatorname{Tr}\bigl(Q\,\sigma^{\otimes n}\bigr)\Bigr),

where the minimum is taken over all 0 ≤ Q ≤ 1 such that Tr(Qρ^{⊗n}) ≥ ε.

Applying the data processing inequality for the ε-relative entropy to the states ρ^{⊗n} and σ^{⊗n} with the CPTP map 𝓔^{⊗n}, we get

D^{\epsilon}\bigl(\rho^{\otimes n}\,\|\,\sigma^{\otimes n}\bigr) \;\ge\; D^{\epsilon}\bigl(\mathcal{E}(\rho)^{\otimes n}\,\|\,\mathcal{E}(\sigma)^{\otimes n}\bigr).

Dividing both sides by n and taking the limit n → ∞, we get the desired result, D(ρ‖σ) ≥ D(𝓔(ρ)‖𝓔(σ)).
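In the commuting case the limit can also be watched numerically; the loop below (again reusing the hypothetical helper and test data from above) prints the normalized one-shot quantity alongside the relative entropy it approaches.

    kl = float(np.sum(p * np.log2(p / q)))       # relative entropy D(p||q)
    pn, qn = p, q
    for n in range(1, 9):
        print(n, eps_relative_entropy(pn, qn, eps) / n, kl)
        pn, qn = np.kron(pn, p), np.kron(qn, q)  # build the (n+1)-fold products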

See also

Strong subadditivity

Classical information theory

Min-entropy

References

  1. ^ Dupuis, F., et al. "Generalized entropies." arXiv preprint arXiv:1211.3141 (2012).
  2. ^ Watrous, Theory of Quantum Information, Fall 2013. Ch. 5, page 194 https://cs.uwaterloo.ca/~watrous/CS766/DraftChapters/5.QuantumEntropy.pdf
  3. ^ Ruskai, Mary Beth. "Inequalities for quantum entropy: A review with conditions for equality." Journal of Mathematical Physics 43 (2002): 4358. arXiv: quant-ph/0205064
  4. ^ Wang, Ligong, and Renato Renner. "One-shot classical-quantum capacity and hypothesis testing." Physical Review Letters 108.20 (2012): 200501. arXiv:1007.5456
  5. ^ Dupuis, F., et al. "Generalized entropies." arXiv preprint arXiv:1211.3141 (2012).
  6. ^ Petz, Dénes. Quantum information theory and quantum statistics. Springer, 2008. Chapter 8