Unlike its classical counterpart, the quantum conditional entropy can be negative.
A basic property of the conditional entropy is the chain rule: H(Y|X) = H(X,Y) − H(X).
It is a generalization of the conditional entropy of classical information theory.
In many applications, one wants to maximize mutual information (thus increasing dependencies), which is often equivalent to minimizing conditional entropy.
These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
The conditional entropy of Y given X, also called the equivocation of X about Y, is then given by: H(Y|X) = −Σ_{x,y} p(x,y) log₂ p(y|x).
Given a particular value x of a random variable X, the conditional entropy of Y given X = x is defined as: H(Y|X=x) = −Σ_y p(y|x) log₂ p(y|x).
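As a minimal sketch, the averaging definition above and the chain-rule form H(Y|X) = H(X,Y) − H(X) can be checked numerically; the joint distribution used here is purely illustrative:

```python
import numpy as np

# Illustrative joint distribution p(x, y): rows index X in {0, 1},
# columns index Y in {0, 1, 2}.
p_xy = np.array([[0.25, 0.25, 0.0],
                 [0.0,  0.25, 0.25]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = np.asarray(list(p))
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(Y|X) = sum_x p(x) H(Y|X=x): average the per-value conditional
# entropies H(Y|X=x), weighted by the marginal p(x).
p_x = p_xy.sum(axis=1)
h_y_given_x = sum(p_x[i] * entropy(p_xy[i] / p_x[i])
                  for i in range(len(p_x)) if p_x[i] > 0)

# Equivalent chain-rule form: H(Y|X) = H(X,Y) - H(X)
h_chain = entropy(p_xy.ravel()) - entropy(p_x)

print(h_y_given_x)  # 1.0
print(abs(h_y_given_x - h_chain) < 1e-12)  # True
```

Both routes give the same value, which is why the two forms can be used interchangeably in derivations.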
In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy.
Conditional entropy (equivocation)
By analogy with the classical conditional entropy, one defines the conditional quantum entropy as S(A|B) = S(AB) − S(B).
The differential analogues of entropy, joint entropy, conditional entropy, and mutual information are defined as follows:
This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
The conditional entropy measures the amount of entropy remaining in one random variable when we know the value of a second random variable.
For a bipartite state ρ^{AB}, the conditional entropy is written S(A|B) or H(A|B), depending on the notation being used for the von Neumann entropy.
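A minimal numpy sketch of the formula S(A|B) = S(AB) − S(B) for a maximally entangled Bell state, illustrating the negativity mentioned above (for this state the value is −1):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho) in bits, via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop (numerically) zero eigenvalues
    return -np.sum(eigvals * np.log2(eigvals))

# Maximally entangled Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi.conj())

# Reduced state rho_B: partial trace over subsystem A.
# Reshape to indices (a, b, a', b') and trace over a = a'.
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

# S(A|B) = S(AB) - S(B): negative for an entangled pure state,
# since S(AB) = 0 while the reduced state is maximally mixed.
s_cond = von_neumann_entropy(rho_ab) - von_neumann_entropy(rho_b)
print(s_cond)  # -1.0
```

For a classical (separable, perfectly correlated) state the same computation would give a nonnegative value, which is the contrast the surrounding sentences describe.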
A positive conditional entropy of a state thus means that the state cannot reach even the classical limit, while a negative conditional entropy provides additional information.
Namely, the joint entropy, conditional entropy, and mutual information can be considered as measures of a set union, set difference, and set intersection, respectively (Reza pp.
The negative conditional entropy is also known as the coherent information, and gives the additional number of bits above the classical limit that can be transmitted in a quantum dense coding protocol.
Conditional Entropy asks the question: "Given that I know a set of tags, how much uncertainty regarding the document set that I was referencing with those tags remains?"
In a follow-up study published in IEEE Computer, Rao et al. present data which strengthen their original conditional entropy result, which involved analysis of pairs of symbols.
In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known.
An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information.
Mathematically, this is expressed as H(M) = H(M|C), where H(M) is the entropy of the plaintext M and H(M|C) is the conditional entropy of the plaintext given the ciphertext C. Perfect secrecy is a strong notion of cryptanalytic difficulty.
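The one-time pad attains this condition. A minimal sketch, using an illustrative non-uniform one-bit plaintext distribution, verifying numerically that the conditional entropy of the plaintext given the ciphertext, H(M|C), equals the plaintext entropy H(M):

```python
import numpy as np
from itertools import product

# One-bit one-time pad: C = M XOR K, with key K uniform and independent of M.
# p_m is an illustrative (non-uniform) plaintext distribution.
p_m = {0: 0.7, 1: 0.3}
p_k = {0: 0.5, 1: 0.5}

# Joint distribution p(m, c) induced by the cipher
p_mc = {}
for m, k in product([0, 1], repeat=2):
    c = m ^ k
    p_mc[(m, c)] = p_mc.get((m, c), 0.0) + p_m[m] * p_k[k]

def H(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * np.log2(p) for p in probs if p > 0)

# Marginal distribution of the ciphertext
p_c = {c: sum(p for (m, cc), p in p_mc.items() if cc == c) for c in [0, 1]}

# H(M|C) = H(M,C) - H(C); perfect secrecy requires H(M|C) = H(M)
h_m = H(p_m.values())
h_m_given_c = H(p_mc.values()) - H(p_c.values())
print(abs(h_m - h_m_given_c) < 1e-12)  # True
```

Intuitively, the uniform key makes the ciphertext independent of the plaintext, so observing C removes no uncertainty about M.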
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use.
Because the entropy, joint entropy, conditional entropy, and bivariate mutual information of discrete random variables are all nonnegative, many basic inequalities in information theory (among no more than two random variables) can be derived from this formulation by considering the measure μ to be nonnegative.
An equivalent (and more intuitive) operational definition of the quantum conditional entropy (as a measure of the quantum communication cost or surplus when performing quantum state merging) was given by Michał Horodecki, Jonathan Oppenheim, and Andreas Winter in their paper "Quantum Information can be negative" [1].