Saturday, January 11, 2014


Mutual Information

For random variables X and Y, the mutual information between them is defined as:
  \[ I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = E \log \frac{p(X,Y)}{p(X)\,p(Y)} \]
which is symmetric in X and Y. Alternatively,
  \[ I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x|y)}{p(x)} = E \log \frac{p(X|Y)}{p(X)} \]
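To make the definition concrete, here is a minimal numerical sketch in Python. The joint distribution and the helper name mutual_information are made up for illustration; the code evaluates the sum directly and confirms the symmetry I(X;Y) = I(Y;X):

  import numpy as np

  # A made-up joint distribution p(x, y) over two binary variables.
  p_xy = np.array([[0.30, 0.10],
                   [0.20, 0.40]])

  def mutual_information(p_xy):
      """I(X;Y) = sum_{x,y} p(x,y) log p(x,y)/(p(x) p(y)), in nats."""
      p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x), as a column
      p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(y), as a row
      mask = p_xy > 0                        # convention: 0 log 0 = 0
      return np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask]))

  print(mutual_information(p_xy))    # I(X;Y)
  print(mutual_information(p_xy.T))  # I(Y;X) -- the same value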
Interestingly, the mutual information of a random variable with itself, called the self-information, is the entropy of the variable, i.e. I(X;X) = H(X), since
  \[ I(X;X) = E \log \frac{p(X,X)}{p(X)\,p(X)} = E \log \frac{p(X)}{p(X)\,p(X)} = E \log \frac{1}{p(X)} = H(X) \]
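This identity can be checked with the same routine by viewing (X, X) as a joint variable whose probability mass sits on the diagonal, so that p(x, x) = p(x). A sketch reusing mutual_information from above, with an arbitrary distribution for X:

  p_x = np.array([0.2, 0.5, 0.3])    # an arbitrary distribution for X
  p_xx = np.diag(p_x)                # joint of (X, X): p(x, x) = p(x)

  print(mutual_information(p_xx))    # I(X;X)
  print(-np.sum(p_x * np.log(p_x)))  # H(X) -- the same value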
By Proposition 2.19, provided the entropies and conditional entropies involved are finite:
  \[ I(X;Y) = H(X) - H(X|Y) \]
and
  \[ I(X;Y) = H(X) + H(Y) - H(X,Y) \]
Both equations can be easily verified. For example,
  \[ H(X) - H(X|Y) = -E \log p(X) + E \log p(X|Y) = -E \log p(X) + E \log \frac{p(X,Y)}{p(Y)} = E \log \frac{p(X,Y)}{p(X)\,p(Y)} = I(X;Y) \]
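Both identities can also be confirmed numerically on the made-up joint distribution from the first sketch, using the chain rule H(X|Y) = H(X,Y) - H(Y) to obtain the conditional entropy:

  def entropy(p):
      """H = -sum p log p, in nats, with 0 log 0 = 0."""
      p = p[p > 0]
      return -np.sum(p * np.log(p))

  H_X  = entropy(p_xy.sum(axis=1))  # H(X)
  H_Y  = entropy(p_xy.sum(axis=0))  # H(Y)
  H_XY = entropy(p_xy.ravel())      # H(X,Y)

  print(mutual_information(p_xy))   # I(X;Y)
  print(H_X - (H_XY - H_Y))         # H(X) - H(X|Y) -- the same value
  print(H_X + H_Y - H_XY)           # H(X) + H(Y) - H(X,Y) -- the same value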

Furthermore, for random variables X, Y and Z, the mutual information between X and Y conditioned on Z is defined as:
  \[ I(X;Y|Z) = \sum_{x,y,z} p(x,y,z) \log \frac{p(x,y|z)}{p(x|z)\,p(y|z)} = E \log \frac{p(X,Y|Z)}{p(X|Z)\,p(Y|Z)} \]
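Since the conditional version is an average of per-slice mutual informations, I(X;Y|Z) = \(\sum_z p(z)\, I(X;Y|Z=z)\), it can be sketched by applying the earlier routine to each conditional slice p(x,y|z). The three-way distribution below is again made up:

  # A made-up joint distribution p(x, y, z); the last axis indexes z.
  p_xyz = np.array([[[0.10, 0.05], [0.05, 0.10]],
                    [[0.15, 0.10], [0.10, 0.35]]])

  def conditional_mutual_information(p_xyz):
      """I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z), in nats."""
      p_z = p_xyz.sum(axis=(0, 1))  # marginal p(z)
      return sum(p_z[k] * mutual_information(p_xyz[:, :, k] / p_z[k])
                 for k in range(len(p_z)) if p_z[k] > 0)

  print(conditional_mutual_information(p_xyz))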

Source: Information Theory.

