Document Details

Uploaded by StylishSpessartine

University of Science and Technology (جامعة العلوم والتقانة)

Tags

information theory, conditional entropy, entropy, data science

Full Transcript

Information Theory: Conditional Entropy

Conditional Entropy

The conditional entropy H(X|Y) is the uncertainty about X that remains once another random variable Y is known. The conditional entropy of X given Y is defined as:

$$
\begin{aligned}
H(X \mid Y) &= \sum_{y \in Y} p(y)\, H(X \mid Y = y) \\
            &= \sum_{y \in Y} p(y) \sum_{x \in X} p(x \mid y) \log \frac{1}{p(x \mid y)} \\
            &= \sum_{y \in Y} \sum_{x \in X} p(y)\, p(x \mid y) \log \frac{1}{p(x \mid y)} \\
            &= \sum_{x \in X} \sum_{y \in Y} p(x, y) \log \frac{1}{p(x \mid y)}
\end{aligned}
$$

Conditional Entropy cont'd

The conditional entropy of X given Y = y_i is defined by:

$$
H(X \mid Y = y_i) = \sum_{x_j \in X} p(x_j \mid y_i) \log \frac{1}{p(x_j \mid y_i)}
$$

Chain Rule

The chain rule for joint entropy states that the total uncertainty about the values of X and Y, H(X,Y), equals the uncertainty about X, H(X), plus the uncertainty about Y once you know X, H(Y|X):

$$
H(X, Y) = H(X) + H(Y \mid X)
$$

Proof.

$$
\begin{aligned}
H(X, Y) &= -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log p(x, y) \\
        &= -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log \big( p(x)\, p(y \mid x) \big) \\
        &= -\sum_{x \in X} \sum_{y \in Y} p(x, y) \log p(x) \;-\; \sum_{x \in X} \sum_{y \in Y} p(x, y) \log p(y \mid x) \\
        &= -\sum_{x \in X} p(x) \log p(x) \;-\; \sum_{x \in X} \sum_{y \in Y} p(x)\, p(y \mid x) \log p(y \mid x) \\
        &= -\sum_{x \in X} p(x) \log p(x) \;+\; \sum_{x \in X} p(x)\, H(Y \mid X = x) \\
        &= H(X) + H(Y \mid X)
\end{aligned}
$$

Chain Rule cont'd

The decomposition works in either order:

$$
H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)
$$

Example

Let X represent whether it is sunny or rainy in a particular town on a given day, and let Y represent whether the temperature is above or below 70 degrees. Compute the conditional entropy of X given Y, and vice versa, for the joint distribution:

            X = rainy   X = sunny
  Y > 70       1/4         1/2
  Y < 70        0          1/4

Sol. First compute the marginal probabilities of X and Y:

  P(X) = {3/4, 1/4}  (sunny, rainy)
  P(Y) = {3/4, 1/4}  (above 70, below 70)

$$
\begin{aligned}
H(X \mid Y) &= \sum_{y \in Y} p(y)\, H(X \mid y) \\
  &= \frac{3}{4} \left\{ \frac{2}{3} \log_{10} \frac{3}{2} + \frac{1}{3} \log_{10} 3 \right\}
   + \frac{1}{4} \left\{ 1 \cdot \log_{10} 1 + 0 \right\} \\
  &\approx 0.207 \text{ hartleys}
\end{aligned}
$$

(Logarithms are base 10 throughout, so entropies are measured in hartleys.)

Example

Let X and Y have the following joint probability mass function:

            X = 1    X = 2    X = 3    P(Y)
  Y = 1      1/8      1/8       0      1/4
  Y = 2      1/8      1/16     1/4     7/16
  Y = 3      1/16     1/4       0      5/16
  P(X)       5/16     7/16     1/4      1

Calculate H(X|Y) and H(Y|X).

Sol.

$$
\begin{aligned}
H(X \mid Y) &= \sum_{y \in Y} p(y)\, H(X \mid y) \\
  &= \frac{1}{4} \left\{ \left( \frac{1}{2} \log 2 \right) \times 2 \right\}
   + \frac{7}{16} \left\{ \frac{2}{7} \log \frac{7}{2} + \frac{1}{7} \log 7 + \frac{4}{7} \log \frac{7}{4} \right\}
   + \frac{5}{16} \left\{ \frac{1}{5} \log 5 + \frac{4}{5} \log \frac{5}{4} \right\} \\
  &\approx 0.075 + 0.182 + 0.068 = 0.325 \text{ hartleys}
\end{aligned}
$$

Sol. cont'd

$$
\begin{aligned}
H(Y \mid X) &= \sum_{x \in X} p(x)\, H(Y \mid x) \\
  &= \frac{5}{16} \left\{ \left( \frac{2}{5} \log \frac{5}{2} \right) \times 2 + \frac{1}{5} \log 5 \right\}
   + \frac{7}{16} \left\{ \frac{2}{7} \log \frac{7}{2} + \frac{1}{7} \log 7 + \frac{4}{7} \log \frac{7}{4} \right\}
   + \frac{1}{4} \left\{ 1 \cdot \log 1 \right\} \\
  &\approx 0.143 + 0.182 + 0 = 0.325 \text{ hartleys}
\end{aligned}
$$

The two answers coincide because the marginals P(X) = {5/16, 7/16, 1/4} and P(Y) = {1/4, 7/16, 5/16} contain the same probabilities, so H(X) = H(Y) and, by the chain rule, H(X|Y) = H(Y|X).

Assignment

From the previous example, compute H(X), H(Y), and H(X,Y) in hartleys.
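The per-table computations above are easy to verify mechanically. Below is a minimal Python sketch, not part of the original slides, that evaluates H(X|Y) = Σ_y p(y) H(X|Y=y) directly from a joint probability table, using base-10 logarithms so the results come out in hartleys. The array names `weather` and `joint` and the helper names are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in hartleys (log base 10); 0 * log 0 is taken as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log10(p)))

def cond_entropy(joint):
    """H(column variable | row variable): sum over rows y of p(y) * H(X | Y=y)."""
    h = 0.0
    for row in joint:
        p_y = row.sum()
        if p_y > 0:
            h += p_y * entropy(row / p_y)
    return h

# Weather example: rows are Y (>70, <70), columns are X (rainy, sunny).
weather = np.array([[1/4, 1/2],
                    [0.0, 1/4]])
print(cond_entropy(weather))    # H(X|Y) ~ 0.207 hartleys

# Second example: rows are Y = 1, 2, 3; columns are X = 1, 2, 3.
joint = np.array([[1/8,  1/8,  0.0],
                  [1/8,  1/16, 1/4],
                  [1/16, 1/4,  0.0]])
print(cond_entropy(joint))      # H(X|Y) ~ 0.325 hartleys
print(cond_entropy(joint.T))    # H(Y|X) ~ 0.325 hartleys
```

Transposing the joint table swaps the roles of X and Y, which is why the same helper also gives H(Y|X).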
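The chain-rule proof can also be checked numerically: since H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y), both conditional entropies fall out as differences of ordinary entropies. A short sketch under the same assumptions (base-10 logs, the weather table from the first example):

```python
import numpy as np

def entropy(p):
    """Entropy in hartleys (log base 10), ignoring zero-probability cells."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log10(p)))

# Weather example: rows are Y (>70, <70), columns are X (rainy, sunny).
P = np.array([[1/4, 1/2],
              [0.0, 1/4]])

h_xy = entropy(P)               # joint entropy H(X,Y)
h_x  = entropy(P.sum(axis=0))   # H(X) from the column marginal
h_y  = entropy(P.sum(axis=1))   # H(Y) from the row marginal

h_x_given_y = h_xy - h_y        # H(X|Y) = H(X,Y) - H(Y)
h_y_given_x = h_xy - h_x        # H(Y|X) = H(X,Y) - H(X)

print(round(h_x_given_y, 3))    # 0.207, matching the worked solution
assert np.isclose(h_xy, h_y + h_x_given_y)
assert np.isclose(h_xy, h_x + h_y_given_x)
```

Computing a conditional entropy as H(X,Y) minus a marginal entropy avoids dividing by row probabilities, so rows of probability zero need no special casing.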
