Questions and Answers
How is the information content of an event E defined?
If the probability P(E) of an event is 1, what is the information content I(E)?
What type of function does the information content I(E) satisfy when E and F are independent events?
Using the base-2 logarithm, what is the information content of drawing the king of hearts from a pack of 32 cards?
What is 1 nat expressed in bits?
Study Notes
Information Theory: Measurement of Information
- Following Shannon, the information content of an event E, denoted I(E), is defined as a function of its probability P(E).
- I(E) is a decreasing function of P(E): higher-probability events yield less information.
- When P(E) = 1 (a certain event), I(E) = 0, since no new information is gained.
- For independent events E and F, the combined information is additive: I(E ∩ F) = I(E) + I(F), as illustrated in the sketch below.
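
A minimal sketch (Python assumed, not part of the source notes) illustrating the additivity axiom; it anticipates the logarithmic form I(E) = -log(P(E)) derived in the next subsection:

```python
import math

def self_information_bits(p):
    """Self-information I(E) = -log2(P(E)), measured in bits."""
    return -math.log2(p)

# Two independent events, e.g. two fair coin flips each landing heads.
p_e, p_f = 0.5, 0.5
p_joint = p_e * p_f                 # independence: P(E ∩ F) = P(E) * P(F)

print(self_information_bits(p_e) + self_information_bits(p_f))  # 2.0 bits
print(self_information_bits(p_joint))                           # 2.0 bits: additive
```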
Logarithmic Function
- The only function satisfying the axioms of information content is the logarithmic function.
- I(E) can be expressed as I(E) = log(1/P(E)) = -log(P(E)).
- Information can be measured in different units, depending on the logarithm base used (see the sketch after this list):
- Bits: base 2
- Nats: base e
- Hartleys: base 10
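
A short sketch (Python assumed; the helper name is illustrative, not from the source) that evaluates I(E) = -log(P(E)) in each of the three units:

```python
import math

def information_content(p):
    """Return I(E) = -log(P(E)) in bits, nats, and hartleys."""
    return {
        "bits":     -math.log2(p),    # base-2 logarithm
        "nats":     -math.log(p),     # natural (base-e) logarithm
        "hartleys": -math.log10(p),   # base-10 logarithm
    }

print(information_content(1 / 8))
# {'bits': 3.0, 'nats': 2.079..., 'hartleys': 0.903...}
```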
Example Calculation
- With a pack of 32 playing cards, the probability of drawing the king of hearts (event E) is P(E) = 1/32.
- Calculation of information: I(E) = log2(32) = log2(2^5) = 5 bits.
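
A quick numeric check of this example (Python assumed, not part of the source notes):

```python
import math

p_king_of_hearts = 1 / 32              # one specific card out of 32
print(-math.log2(p_king_of_hearts))    # 5.0 bits, matching log2(2^5) = 5
```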
Conversion of Measures
- Logarithmic relationships for converting measures of information:
- For bits: log2(1/P(x)) = y bits, meaning 1/P(x) = 2^y.
- For hartleys: log10(1/P(x)) = log10(2^y) = y · log10(2) hartleys.
- Conversion formulas summarize the relationships between bits, nats, and hartleys (see the sketch after this list):
- 1 Hartley = 1/log10(2) bits
- 1 Nat = 1/log_e(2) bits
- 1 Bit = 1/log2(e) nats.
Additional Conversions
- More relationships between the units (a round-trip check follows this list):
- 1 Hartley = 1/log10(e) nats.
- 1 Nat = 1/log_e(10) hartleys.
- 1 Bit = 1/log2(10) hartleys.
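
A round-trip sketch (Python assumed) using these factors to check that the conversions are mutually consistent:

```python
import math

bits = 5.0                                     # the card-drawing example above
nats = bits * (1 / math.log2(math.e))          # 1 bit = 1/log2(e) nats
hartleys = nats * (1 / math.log(10))           # 1 nat = 1/log_e(10) hartleys
back_to_bits = hartleys * (1 / math.log10(2))  # 1 hartley = 1/log10(2) bits

print(back_to_bits)                            # 5.0, up to floating-point error
```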
Description
Explore the foundational concepts of information theory, including the measurement of information and self-information. Discover Shannon's definition of information and how probability affects the information content of events. This quiz delves into uncertainty and the axioms of information measurement.