Joint entropy

A misleading[1] Venn diagram showing additive and subtractive relationships among the information measures associated with correlated variables X and Y. The area contained by both circles is the joint entropy H(X,Y). The circle on the left (red and violet) is the individual entropy H(X), with the red being the conditional entropy H(X|Y). The circle on the right (blue and violet) is H(Y), with the blue being H(Y|X). The violet is the mutual information I(X;Y).
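
The additive and subtractive relationships depicted in the diagram are the standard identities relating these measures, restated here in the caption's own symbols:

\begin{align}
  H(X,Y) &= H(X) + H(Y \mid X) = H(Y) + H(X \mid Y), \\
  I(X;Y) &= H(X) + H(Y) - H(X,Y).
\end{align}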

In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.[2]
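
For two discrete random variables X and Y with joint probability mass function P(x,y), the joint Shannon entropy is defined (in bits) as

\[
  H(X,Y) = -\sum_{x}\sum_{y} P(x,y) \log_2 P(x,y),
\]

with the convention that terms where P(x,y) = 0 contribute zero.

As a minimal illustrative sketch (not part of the article), the definition can be evaluated directly from a joint pmf stored as a NumPy array; the helper name joint_entropy is hypothetical and introduced here only for illustration:

import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X,Y) in bits of a joint pmf given as an array."""
    p = np.asarray(p_xy, dtype=float).ravel()
    p = p[p > 0]  # convention: terms with P(x,y) = 0 contribute nothing
    return -np.sum(p * np.log2(p))

# Example: two independent fair coins -> H(X,Y) = 1 + 1 = 2 bits
print(joint_entropy([[0.25, 0.25], [0.25, 0.25]]))  # 2.0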

  1. ^ MacKay, David J.C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press. Bibcode:2003itil.book.....M. p. 141.
  2. ^ Korn, Granino A.; Korn, Theresa M. (January 2000). Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review. New York: Dover Publications. ISBN 0-486-41147-8.
