Lecture 6: Joint Entropy
Information Theory: Joint and Marginal Probability

Joint and marginal probability
If Ω = X × Y, then a probability measure P on Ω is called a joint probability (where X = {x1, x2, ..., xm} and Y = {y1, y2, ..., yn}). P is given by P_ij = p(x_i, y_j). The functions

P(x) = \sum_{y \in Y} p(x, y)   for x \in X

and

P(y) = \sum_{x \in X} p(x, y)   for y \in Y

are called the marginal probability functions.

Joint and marginal probability, cont'd
The quantity

p(x_i | y_j) = \frac{p(x_i, y_j)}{p(y_j)}

is called the conditional probability of x_i given y_j. We say that x_i and y_j are independent if

p(x_i, y_j) = p(x_i) p(y_j).

Example
X and Y are discrete random variables which are jointly distributed with the following probability function:

          Y=1     Y=2     Y=3
  X=1     1/18    1/9     1/6
  X=2     1/9     0       1/9
  X=3     1/6     1/6     1/9

Calculate the marginal probability of X, P(X), and the marginal probability of Y, P(Y). Then calculate P(X=1, Y=2).

Sol.
P(x) = \sum_{y \in Y} p(x, y):
  P(X=1) = 1/18 + 1/9 + 1/6 = 6/18 = 1/3
  P(X=2) = 1/9 + 0 + 1/9 = 2/9
  P(X=3) = 1/6 + 1/6 + 1/9 = 4/9
Then P(X) = {1/3, 2/9, 4/9}.

  P(Y=1) = 1/18 + 1/9 + 1/6 = 6/18 = 1/3
  P(Y=2) = 1/9 + 0 + 1/6 = 5/18
  P(Y=3) = 1/6 + 1/9 + 1/9 = 7/18
Then P(Y) = {1/3, 5/18, 7/18}.

P(X=1, Y=2) = 1/9.

Information Theory: Joint Entropy

Joint entropy
The joint entropy of a pair of discrete random variables (X, Y) with joint probability mass function p(x, y) is defined by:

H(X, Y) = \sum_{x \in X} \sum_{y \in Y} p(x, y) \log \frac{1}{p(x, y)} = - \sum_{x \in X} \sum_{y \in Y} p(x, y) \log p(x, y)

Joint entropy, cont'd
Joint entropy is no different from regular entropy: we simply compute the entropy over all possible pairs of values of the two random variables.

Example
Let X represent whether it is sunny or rainy in a particular town on a given day, and let Y represent whether the temperature is above or below 70 degrees. Compute the entropy of the joint distribution P(X, Y) given by:

              X=sunny   X=rainy
  Y=above 70  1/2       1/4
  Y=below 70  1/4       0

Sol.
H(X, Y) = \sum_{x \in X} \sum_{y \in Y} p(x, y) \log \frac{1}{p(x, y)}
        = (1/2) \log_2 2 + (1/4) \log_2 4 + (1/4) \log_2 4
        = 3/2
        = 1.5 bits
(The zero entry contributes nothing, using the convention 0 \log(1/0) = 0.)

Example
X and Y are discrete random variables which are jointly distributed with the same probability function as in the earlier example:

          Y=1     Y=2     Y=3
  X=1     1/18    1/9     1/6
  X=2     1/9     0       1/9
  X=3     1/6     1/6     1/9

Calculate the joint entropy of X and Y, H(X, Y).

Sol.
H(X, Y) = \sum_{x \in X} \sum_{y \in Y} p(x, y) \log \frac{1}{p(x, y)}
        = (1/18) \log_{10} 18 + 4 × (1/9) \log_{10} 9 + 3 × (1/6) \log_{10} 6
        = 0.0697 + 0.424 + 0.389
        = 0.883 hartleys
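As a quick check of the marginal-probability example above, here is a minimal Python sketch. It is not part of the original lecture: the joint table is the one from the example, while the names joint, marginal_x, marginal_y, and conditional are my own.

```python
from fractions import Fraction as F

# Joint distribution p(x, y) from the worked example above
# (rows: X = 1, 2, 3; columns: Y = 1, 2, 3).
joint = {
    (1, 1): F(1, 18), (1, 2): F(1, 9), (1, 3): F(1, 6),
    (2, 1): F(1, 9),  (2, 2): F(0),    (2, 3): F(1, 9),
    (3, 1): F(1, 6),  (3, 2): F(1, 6), (3, 3): F(1, 9),
}

def marginal_x(p):
    """P(X = x) = sum over y of p(x, y)."""
    out = {}
    for (x, _y), v in p.items():
        out[x] = out.get(x, F(0)) + v
    return out

def marginal_y(p):
    """P(Y = y) = sum over x of p(x, y)."""
    out = {}
    for (_x, y), v in p.items():
        out[y] = out.get(y, F(0)) + v
    return out

def conditional(p, x, y):
    """p(x | y) = p(x, y) / p(y)."""
    return p[(x, y)] / marginal_y(p)[y]

print({x: str(v) for x, v in marginal_x(joint).items()})  # {1: '1/3', 2: '2/9', 3: '4/9'}
print({y: str(v) for y, v in marginal_y(joint).items()})  # {1: '1/3', 2: '5/18', 3: '7/18'}
print(joint[(1, 2)])                                      # 1/9  -> P(X=1, Y=2)
print(conditional(joint, 1, 2))                           # 2/5  -> p(X=1 | Y=2)
```

Using exact fractions avoids rounding error; the printed marginals match P(X) = {1/3, 2/9, 4/9} and P(Y) = {1/3, 5/18, 7/18} from the solution above.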
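The two joint-entropy calculations can be verified the same way. The sketch below assumes nothing beyond the tables given above; joint_entropy is a helper name I introduce here, and the base argument selects the unit (base 2 for bits, base 10 for hartleys).

```python
import math

def joint_entropy(pmf, base=2):
    """H(X, Y) = -sum over (x, y) of p(x, y) * log_base p(x, y).

    Zero-probability entries are skipped, using the convention 0 * log(1/0) = 0.
    """
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# Weather example: probabilities {1/2, 1/4, 1/4, 0}, entropy in bits (base 2).
weather = [1/2, 1/4, 1/4, 0]
print(joint_entropy(weather, base=2))            # 1.5

# 3x3 example: the joint table used earlier, entropy in hartleys (base 10).
table = [1/18, 1/9, 1/6,
         1/9,  0,   1/9,
         1/6,  1/6, 1/9]
print(round(joint_entropy(table, base=10), 3))   # 0.883
```

The first value matches the 1.5 bits computed by hand and the second matches 0.883 hartleys; changing the logarithm base only changes the unit in which the entropy is expressed.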