Lecture 4: Measurement of Information
Uploaded by StylishSpessartine (University of Science and Technology)
Information Theory: Measurement of Information

Self-Information (Uncertainty)
Information content of an event:
– Shannon defined the amount of information I(E) of an event E as a function of its probability P(E).

Axioms
– I(E) must be a decreasing function of P(E).
– I(E) = 0 if P(E) = 1, since if we are certain that E will occur, its outcome gives us no information.
– I(E ∩ F) = I(E) + I(F) if E and F are independent.

Amount of Information
The only function satisfying these axioms is the logarithm. Does I(E) = log P(E) satisfy the axioms? No: log P(E) is increasing in P(E), so we take its negative:
I(E) = log(1/P(E)) = -log P(E).
It is expressed in different units according to the chosen base of the logarithm.

Measurement Units
Unit       Logarithm base
bits       2
nats       e
hartleys   10

Example:
P = 1   → I = log(1/1) = 0
P = 0.5 → I = log2(1/0.5) = 1 bit

Example
A pack of 32 playing cards, one of which is drawn at random. Calculate the amount of information of the event E = [the card drawn is the king of hearts].
Solution: As each card has the same probability of being chosen, P(E) = 1/32, so
I(E) = log2 32 = log2 2^5 = 5 bits.

Conversion of Measures
log2(1/P(x)) = y bits
1/P(x) = 2^y
log10(1/P(x)) = log10 2^y = y log10 2
y = log10(1/P(x)) / log10 2 bits

1 hartley = 1/log10 2 bits
1 nat = 1/loge 2 bits
1 bit = 1/log2 e nats

Conversion of Measures (cont'd)
1 hartley = 1/log10 e nats
1 nat = 1/loge 10 hartleys
1 bit = 1/log2 10 hartleys
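The definition and the worked examples above can be sketched in code. This is a minimal illustration, not part of the lecture: the helper name `self_information` is my own, and it simply evaluates I(E) = log_base(1/P(E)).

```python
import math

def self_information(p, base=2):
    """Self-information I(E) = log_base(1/p) of an event with probability p."""
    return math.log(1 / p, base)

# Axiom check: a certain event carries no information.
print(self_information(1.0))    # 0.0

# A fair coin flip (P = 0.5) carries exactly 1 bit.
print(self_information(0.5))    # 1.0

# Drawing one specific card out of 32 equally likely cards:
# I(E) = log2(32) = 5 bits, matching the slide's example.
print(self_information(1 / 32))
```

For independent events E and F, P(E ∩ F) = P(E)·P(F), so the logarithm turns the product into a sum, which is exactly the additivity axiom.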
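The unit-conversion identities can likewise be checked numerically with the change-of-base rule log_b(x) = log_a(x) / log_a(b). A small sketch (the `convert` helper and the `LOG_BASE` table are my own names, assumed only for illustration):

```python
import math

# Base of the logarithm associated with each unit of information.
LOG_BASE = {"bits": 2.0, "nats": math.e, "hartleys": 10.0}

def convert(amount, from_unit, to_unit):
    """Re-express an amount of information in another unit:
    amount in from_unit = amount * log_{to_base}(from_base) in to_unit."""
    return amount * math.log(LOG_BASE[from_unit], LOG_BASE[to_unit])

# 1 hartley = 1/log10(2) = log2(10) ≈ 3.32 bits
print(convert(1, "hartleys", "bits"))

# 1 nat = 1/loge(2) = log2(e) ≈ 1.44 bits
print(convert(1, "nats", "bits"))

# 1 bit = 1/log2(e) = loge(2) ≈ 0.693 nats
print(convert(1, "bits", "nats"))
```

Each printed value matches the corresponding line of the conversion table, since 1/log_a(b) = log_b(a).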