Generalized relative entropy


In the study of quantum information theory, one typically assumes that our experiments are independently repeated multiple times. Our measures of information processing are defined in this asymptotic limit. The quintessential entropy measure is the von Neumann entropy. In contrast, the study of one-shot quantum information theory is concerned with information processing when our experiment is only going to be conducted once. There is a need for new measures in this scenario as the von Neumann entropy ceases to give a correct characterization of operational quantities. Generalizations of the von Neumann entropy have been developed over the past few years. One particularly interesting measure is the ε-relative entropy.

In the asymptotic scenario, the relative entropy acts as a parent quantity for other measures besides being an important measure in itself. Similarly, the ε-relative entropy functions as a parent quantity for other measures in the one-shot scenario.

Definitions

ε-relative entropy

From an operational perspective, the ε-relative entropy is defined in the context of hypothesis testing. Here, we have to devise a strategy to distinguish between two density operators $\rho$ and $\sigma$. A strategy is defined by a POVM with elements $Q$ and $I - Q$, where $0 \le Q \le I$. The probability that the strategy produces a correct guess on input $\rho$ is given by $\operatorname{Tr}(Q\rho)$, and the probability that it produces a wrong guess (that is, guesses $\rho$ when the input was $\sigma$) is given by $\operatorname{Tr}(Q\sigma)$. The ε-relative entropy represents the negative logarithm of the probability of failure when the state is $\sigma$, given that the success probability for $\rho$ is at least ε.

Definition: The ε-relative entropy between two density operators $\rho$ and $\sigma$ is

$$D^{\varepsilon}_{H}(\rho\|\sigma) := -\log \min_{\substack{0 \le Q \le I \\ \operatorname{Tr}(Q\rho) \ge \varepsilon}} \operatorname{Tr}(Q\sigma).$$
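The minimization in this definition is a semidefinite program, so small instances can be evaluated numerically. The following is a minimal sketch, assuming NumPy and the CVXPY modelling library; the function name d_h and the example states are illustrative choices, not taken from the references.

import numpy as np
import cvxpy as cp

def d_h(rho, sigma, eps):
    """epsilon-relative entropy D_H^eps(rho || sigma), in bits.

    Solves the SDP: minimize Tr(Q sigma) subject to 0 <= Q <= I
    and Tr(Q rho) >= eps, then returns -log2 of the optimum.
    """
    d = rho.shape[0]
    Q = cp.Variable((d, d), hermitian=True)
    constraints = [Q >> 0,                              # Q >= 0
                   np.eye(d) - Q >> 0,                  # Q <= I
                   cp.real(cp.trace(Q @ rho)) >= eps]   # success probability on rho
    problem = cp.Problem(cp.Minimize(cp.real(cp.trace(Q @ sigma))), constraints)
    problem.solve()
    return -np.log2(problem.value)

# |+><+| versus the maximally mixed qubit state
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
mixed = np.eye(2) / 2
print(d_h(plus, mixed, eps=0.9))   # approx. -log2(0.45) = 1.152

For this instance the optimal test is $Q = 0.9\,|+\rangle\langle+|$, which fails on the mixed state with probability 0.45, matching the printed value.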

Two other quantities, the relative min- and max-entropies, are defined as

$$D_{\min}(\rho\|\sigma) := -\log \operatorname{Tr}(\Pi_{\rho}\,\sigma), \qquad D_{\max}(\rho\|\sigma) := \log \min\{\lambda : \rho \le \lambda\sigma\},$$

where $\Pi_{\rho}$ denotes the projector onto the support of $\rho$. The corresponding smooth relative entropies are defined as

$$D^{\varepsilon}_{\min}(\rho\|\sigma) := \max_{\bar\rho \in B^{\varepsilon}(\rho)} D_{\min}(\bar\rho\|\sigma), \qquad D^{\varepsilon}_{\max}(\rho\|\sigma) := \min_{\bar\rho \in B^{\varepsilon}(\rho)} D_{\max}(\bar\rho\|\sigma),$$

where the ball $B^{\varepsilon}(\rho)$ consists of the operators that are ε-close to $\rho$, with $\bar\rho$ ranging over the set of all positive semi-definite operators with trace less than or equal to 1.
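Unlike the ε-relative entropy, the unsmoothed min- and max-relative entropies have closed-form expressions, so they are straightforward to evaluate. A minimal sketch, assuming NumPy and a full-rank $\sigma$; the function names are illustrative:

import numpy as np

def d_min(rho, sigma, tol=1e-12):
    """D_min(rho || sigma) = -log2 Tr(P sigma), P = projector onto supp(rho)."""
    w, v = np.linalg.eigh(rho)
    support = v[:, w > tol]                 # eigenvectors spanning supp(rho)
    proj = support @ support.conj().T
    return -np.log2(np.real(np.trace(proj @ sigma)))

def d_max(rho, sigma):
    """D_max(rho || sigma) = log2 min{lambda : rho <= lambda * sigma}.

    For full-rank sigma this is the largest eigenvalue of
    sigma^(-1/2) rho sigma^(-1/2).
    """
    w, v = np.linalg.eigh(sigma)
    inv_half = v @ np.diag(w ** -0.5) @ v.conj().T
    return np.log2(np.linalg.eigvalsh(inv_half @ rho @ inv_half).max())

plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|
mixed = np.eye(2) / 2
print(d_min(plus, mixed), d_max(plus, mixed))   # both equal 1.0 here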

Data processing inequality

One of the properties that the ε-relative entropy shares with the von Neumann entropy is strong subadditivity. Intuitively, this inequality states that the information content of a system cannot increase when one performs only local operations on that system. In its channel form, the inequality is known as the data processing inequality.

For the case of the von Neumann entropy, strong subadditivity can be expressed as follows: for a tripartite state $\rho_{ABC}$,

$$S(\rho_{AB}) + S(\rho_{BC}) \ge S(\rho_{ABC}) + S(\rho_{B}).$$

Using the relative entropy, this inequality is equivalent to[1]

$$D(\rho\|\sigma) \ge D(\mathcal{E}(\rho)\|\mathcal{E}(\sigma))$$

for every CPTP map $\mathcal{E}$.
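To see one direction of this equivalence, apply the data processing inequality with the channel $\mathcal{E} = \operatorname{Tr}_C$ to the states $\rho_{ABC}$ and $\pi_A \otimes \rho_{BC}$, where $\pi_A = I_A/d_A$ is the maximally mixed state on $A$; this is a standard argument, sketched here for completeness:

\begin{align*}
D(\rho_{ABC}\,\|\,\pi_A \otimes \rho_{BC}) &= -S(\rho_{ABC}) + S(\rho_{BC}) + \log d_A,\\
D(\rho_{AB}\,\|\,\pi_A \otimes \rho_{B})   &= -S(\rho_{AB})  + S(\rho_{B})  + \log d_A.
\end{align*}

The data processing inequality with $\mathcal{E} = \operatorname{Tr}_C$ states that the first quantity is at least the second; the $\log d_A$ terms cancel, leaving exactly the strong subadditivity inequality above.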

One of the interesting features of the ε-relative entropy is that it, too, obeys strong subadditivity. This can be expressed as

$$D^{\varepsilon}_{H}(\rho\|\sigma) \ge D^{\varepsilon}_{H}(\mathcal{E}(\rho)\|\mathcal{E}(\sigma)).$$
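This inequality can be checked numerically in the special case of commuting (classical) states, where the optimal test in the definition is the Neyman–Pearson likelihood-ratio test and the minimization reduces to a greedy procedure. A sketch assuming NumPy; the distributions and the stochastic map T are arbitrary illustrative choices:

import numpy as np

def d_h_classical(p, q, eps):
    """D_H^eps(p || q) for probability vectors (q > 0 entrywise), in bits.

    Neyman-Pearson: accept outcomes in decreasing order of the likelihood
    ratio p/q until the success probability reaches eps, taking the last
    outcome fractionally.
    """
    order = np.argsort(-(p / q))
    beta, mass = 0.0, 0.0
    for i in order:
        if mass + p[i] >= eps:
            beta += (eps - mass) / p[i] * q[i]
            break
        mass += p[i]
        beta += q[i]
    return -np.log2(beta)

p = np.array([0.8, 0.1, 0.1])
q = np.array([0.2, 0.3, 0.5])
T = np.array([[0.9, 0.1, 0.0],     # a column-stochastic map: a special
              [0.1, 0.8, 0.3],     # case of a CPTP map acting on
              [0.0, 0.1, 0.7]])    # commuting (diagonal) states
eps = 0.75
print(d_h_classical(p, q, eps) >= d_h_classical(T @ p, T @ q, eps))  # True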

Not only is this interesting in itself, but it also allows us to prove the strong subadditivity of the von Neumann entropy in yet another way.

Monotonicity of the ε-relative entropy under partial trace

Wang and Renner showed[2] that the ε-relative entropy is monotonic under partial trace, that is,

$$D^{\varepsilon}_{H}(\rho_{AB}\|\sigma_{AB}) \ge D^{\varepsilon}_{H}(\rho_{A}\|\sigma_{A}).$$

The proof goes as follows.

Suppose we had some POVM $\{Q, I - Q\}$ to distinguish between $\rho_{A}$ and $\sigma_{A}$, with $\operatorname{Tr}(Q\rho_{A}) \ge \varepsilon$. We can then construct a new POVM $\{Q \otimes I_B,\; I - Q \otimes I_B\}$ to distinguish between $\rho_{AB}$ and $\sigma_{AB}$ by preceding the given POVM with the CPTP map $\operatorname{Tr}_B$. The new POVM succeeds on $\rho_{AB}$ with the same probability, since $\operatorname{Tr}((Q \otimes I_B)\rho_{AB}) = \operatorname{Tr}(Q\rho_{A}) \ge \varepsilon$, so it is a valid candidate in the minimization defining $D^{\varepsilon}_{H}(\rho_{AB}\|\sigma_{AB})$. However,

$$-\log \operatorname{Tr}((Q \otimes I_B)\,\sigma_{AB}) = -\log \operatorname{Tr}(Q\sigma_{A}) \le D^{\varepsilon}_{H}(\rho_{AB}\|\sigma_{AB}),$$

because, by definition, the operator in $D^{\varepsilon}_{H}(\rho_{AB}\|\sigma_{AB})$ was chosen optimally. Maximizing the left-hand side over all admissible $Q$ yields $D^{\varepsilon}_{H}(\rho_{A}\|\sigma_{A}) \le D^{\varepsilon}_{H}(\rho_{AB}\|\sigma_{AB})$, as claimed.
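The step that makes this work is the identity $\operatorname{Tr}((Q \otimes I_B)X_{AB}) = \operatorname{Tr}(Q \operatorname{Tr}_B(X_{AB}))$: measuring $Q \otimes I_B$ on the joint state is the same as first discarding $B$ and then measuring $Q$. A quick numerical sanity check, assuming NumPy; all names and the random test operators are illustrative:

import numpy as np

rng = np.random.default_rng(0)
dA, dB = 2, 3

def random_density_matrix(d):
    """A random density matrix: G G^dagger, normalized to unit trace."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m)

def partial_trace_B(x, dA, dB):
    """Trace out the second tensor factor of a (dA*dB)-dimensional operator."""
    return np.trace(x.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# a random POVM element 0 <= Q <= I on system A
h = rng.normal(size=(dA, dA)) + 1j * rng.normal(size=(dA, dA))
w, v = np.linalg.eigh(h + h.conj().T)
Q = v @ np.diag(np.clip(w, 0.0, 1.0)) @ v.conj().T

sigma_AB = random_density_matrix(dA * dB)
lhs = np.trace(np.kron(Q, np.eye(dB)) @ sigma_AB)
rhs = np.trace(Q @ partial_trace_B(sigma_AB, dA, dB))
print(np.isclose(lhs, rhs))  # True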

Proof of the data processing inequality for the von Neumann entropy

This proof can be found in [3]. From quantum Stein's lemma,[4] it follows that

$$D(\rho\|\sigma) = \lim_{n\to\infty} \frac{1}{n}\, D^{\varepsilon}_{H}(\rho^{\otimes n}\|\sigma^{\otimes n}) = \lim_{n\to\infty} \left(-\frac{1}{n} \log \min \operatorname{Tr}(Q\sigma^{\otimes n})\right),$$

where the minimum is taken over all $0 \le Q \le I$ such that $\operatorname{Tr}(Q\rho^{\otimes n}) \ge \varepsilon$.

Applying the data processing inequality to the states $\rho^{\otimes n}$ and $\sigma^{\otimes n}$ with the CPTP map $\mathcal{E}^{\otimes n}$, we get

$$D^{\varepsilon}_{H}(\rho^{\otimes n}\|\sigma^{\otimes n}) \ge D^{\varepsilon}_{H}\!\left(\mathcal{E}(\rho)^{\otimes n}\,\|\,\mathcal{E}(\sigma)^{\otimes n}\right).$$

Dividing by $n$ on either side and taking the limit as $n \to \infty$, we get the desired result, $D(\rho\|\sigma) \ge D(\mathcal{E}(\rho)\|\mathcal{E}(\sigma))$.
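Stein's lemma itself can be illustrated numerically in the commuting case, reusing the Neyman–Pearson form of the ε-relative entropy from the earlier sketch: the regularized quantity $\frac{1}{n} D^{\varepsilon}_{H}(p^{\otimes n}\|q^{\otimes n})$ drifts toward the relative entropy $D(p\|q)$ as $n$ grows, independently of ε. Again a sketch assuming NumPy, with illustrative distributions:

import numpy as np

def d_h_classical(p, q, eps):
    """Neyman-Pearson evaluation of D_H^eps for probability vectors (bits)."""
    order = np.argsort(-(p / q))
    beta, mass = 0.0, 0.0
    for i in order:
        if mass + p[i] >= eps:
            beta += (eps - mass) / p[i] * q[i]
            break
        mass += p[i]
        beta += q[i]
    return -np.log2(beta)

p = np.array([0.7, 0.3])
q = np.array([0.4, 0.6])
d = np.sum(p * np.log2(p / q))       # relative entropy D(p||q), approx. 0.265

pn, qn = p, q                        # n-fold product distributions
for n in range(1, 13):
    print(n, d_h_classical(pn, qn, eps=0.5) / n, "->", d)
    pn, qn = np.kron(pn, p), np.kron(qn, q)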

See also

Strong subadditivity

Classical information theory

Min-entropy

References

  1. ^ Ruskai, Mary Beth. "Inequalities for quantum entropy: A review with conditions for equality." Journal of Mathematical Physics 43 (2002): 4358. arXiv:quant-ph/0205064
  2. ^ Wang, Ligong, and Renato Renner. "One-shot classical-quantum capacity and hypothesis testing." Physical Review Letters 108.20 (2012): 200501. arXiv:1007.5456
  3. ^ Dupuis, F., et al. "Generalized entropies." arXiv preprint arXiv:1211.3141 (2012).
  4. ^ Petz, Dénes. Quantum information theory and quantum statistics. Springer, 2008. Chapter 8.