The conditional entropy quantifies the remaining uncertainty in a random variable $Y$ given that the value of another random variable $X$ is known. Given a discrete random variable $X$ with support $\mathcal{X}$ and $Y$ with support $\mathcal{Y}$, the conditional entropy of $Y$ given $X$ is defined as:

$$H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log_2 \frac{p(x, y)}{p(x)}$$

More precisely, if $H(Y \mid X = x)$ is the entropy of the variable $Y$ conditional on the variable $X$ taking the value $x$, then $H(Y \mid X)$ is the result of averaging $H(Y \mid X = x)$ over all possible values $x$ that $X$ may take.

Note: the supports of $X$ and $Y$ can be replaced by their domains if it is understood that $0 \log_2 0$ should be treated as being equal to zero.

Chain rule

From this definition and the definition of conditional probability, the chain rule for conditional entropy is

$$H(Y \mid X) = H(X, Y) - H(X)$$

Intuitively, the combined system contains $H(X, Y)$ bits of information: we need $H(X, Y)$ bits to reconstruct its exact state. If we learn the value of $X$, we have gained $H(X)$ bits of information, and the system has $H(Y \mid X)$ bits of uncertainty remaining.

$H(Y \mid X) = 0$ if and only if the value of $Y$ is completely determined by the value of $X$. Conversely, $H(Y \mid X) = H(Y)$ if and only if $X$ and $Y$ are independent random variables. Equivalently, $H(Y \mid X) = H(Y) - I(X; Y)$, where $I(X; Y)$ is the mutual information between $X$ and $Y$.

In quantum information theory, the conditional entropy is generalized to the conditional quantum entropy. The latter can take negative values, unlike its classical counterpart.
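To make the definitions concrete, here is a minimal sketch (not part of the original post) that computes $H(Y \mid X)$ directly from a joint distribution and checks it against the chain rule and the mutual information identity above. The example distribution and all function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the original post): conditional
# entropy of a discrete joint distribution p[x, y], with checks of the
# chain rule H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(Y) - H(Y|X).
import numpy as np

def entropy(p):
    """Shannon entropy in bits; 0 * log2(0) is treated as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                              # drop zero-probability outcomes
    return -np.sum(p * np.log2(p))

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) * log2(p(x,y) / p(x))."""
    joint = np.asarray(joint, dtype=float)
    p_x = np.broadcast_to(joint.sum(axis=1, keepdims=True), joint.shape)
    mask = joint > 0                          # skip terms with p(x,y) = 0
    return -np.sum(joint[mask] * np.log2(joint[mask] / p_x[mask]))

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
joint = np.array([[0.25, 0.25],
                  [0.00, 0.50]])

h_y_given_x = conditional_entropy(joint)       # direct definition
h_xy = entropy(joint)                          # joint entropy H(X, Y)
h_x  = entropy(joint.sum(axis=1))              # marginal entropy H(X)
h_y  = entropy(joint.sum(axis=0))              # marginal entropy H(Y)

print(f"H(Y|X)        = {h_y_given_x:.4f} bits")
print(f"H(X,Y) - H(X) = {h_xy - h_x:.4f} bits (chain rule)")
print(f"H(Y) - H(Y|X) = {h_y - h_y_given_x:.4f} bits = I(X;Y)")
```

For this example distribution the script prints $H(Y \mid X) = 0.5$ bits both from the definition and from the chain rule; since the result is strictly between $0$ and $H(Y)$, $Y$ is neither fully determined by $X$ nor independent of it.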