Information Theory: Conditional Entropy
5 Questions

Questions and Answers

What is the marginal probability of X being rainy?

  • 0
  • 1/2
  • 1/4
  • 3/4 (correct)

What is the value of H(X | Y) computed in hartleys?

  • 0.324
  • 0.075
  • 0.207 (correct)
  • 1.284

What is the method used to compute conditional entropy H(X | Y)?

  • Calculating the difference between H(X) and H(Y).
  • Using joint probabilities only.
  • Only considering the probabilities of X.
  • Taking the sum of conditional entropies multiplied by marginal probabilities. (correct)

What is H(Y | X) computed in hartleys?

  • 0.324 (correct)

Which values are part of the joint probability mass function for X and Y?

  • 1/4, 1/8, 1/16 (correct)

    Study Notes

    Conditional Entropy

• Conditional entropy H(X|Y) quantifies the remaining uncertainty about random variable X once another variable Y is known.
• Mathematically defined as (a code sketch of this computation follows the list):
  • H(X|Y) = Σ p(yi) H(X|Y=yi), summed over all outcomes yi of Y.
  • Equivalently, H(X|Y) = -Σ Σ p(x, y) log p(x|y), summed over all x in X and all y in Y.
• For a particular outcome yi of Y:
  • H(X|Y=yi) = -Σ p(xj|yi) log p(xj|yi), summed over all outcomes xj of X.
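
The weighted-sum definition above maps directly onto a short computation. The sketch below is not from the lecture; the function name conditional_entropy, the dictionary-of-pairs representation of the joint pmf, and the default base-10 logarithm (so that results come out in hartleys) are illustrative assumptions.

```python
import math
from collections import defaultdict

def conditional_entropy(joint, base=10):
    """H(X|Y) computed from a joint pmf.

    `joint` maps (x, y) pairs to probabilities p(x, y).
    base=10 gives hartleys; base=2 would give bits.
    """
    # Marginal p(y), obtained by summing the joint pmf over x.
    p_y = defaultdict(float)
    for (_, y), p in joint.items():
        p_y[y] += p

    # H(X|Y) = -sum over (x, y) of p(x, y) * log p(x|y), with p(x|y) = p(x, y) / p(y).
    # This is algebraically the same as sum over y of p(y) * H(X | Y = y).
    h = 0.0
    for (_, y), p in joint.items():
        if p > 0:
            h -= p * math.log(p / p_y[y], base)
    return h
```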

    Chain Rule

    • The chain rule for joint entropy illustrates the relationship between individual entropies and conditional uncertainties:
      • H(X, Y) = H(X) + H(Y|X).
    • A symmetrical expression also exists:
      • H(X, Y) = H(Y) + H(X|Y).
• This rule shows that joint uncertainty can be decomposed into the uncertainty of one variable plus the uncertainty that remains about the other once the first is known (a short derivation sketch follows below).
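
A one-step derivation sketch makes the decomposition concrete: substituting p(x, y) = p(x) p(y|x) into the definition of joint entropy and splitting the logarithm gives

```latex
\begin{aligned}
H(X,Y) &= -\sum_{x,y} p(x,y)\,\log p(x,y)
        = -\sum_{x,y} p(x,y)\,\bigl[\log p(x) + \log p(y \mid x)\bigr] \\
       &= -\sum_{x} p(x)\,\log p(x) \;-\; \sum_{x,y} p(x,y)\,\log p(y \mid x)
        = H(X) + H(Y \mid X).
\end{aligned}
```

The first term collapses to H(X) because summing p(x, y) over y returns the marginal p(x); swapping the roles of X and Y gives the symmetric form H(X, Y) = H(Y) + H(X|Y).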

    Example Calculations

• Two variables X (weather: sunny or rainy) and Y (temperature: above or below 70 degrees) are used to compute conditional entropies.
    • Marginal probabilities derived from the example are:
      • P(X): {3/4 rainy, 1/4 sunny}
      • P(Y): {3/4 above 70, 1/4 below 70}
• The conditional entropy H(X|Y) works out to approximately 0.207 hartleys (a worked check is sketched below).
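
As a check on that figure, the sketch below recomputes H(X|Y). The joint table used here is an assumption (the notes list only the marginals), chosen to be consistent with P(X) = {3/4 rainy, 1/4 sunny} and P(Y) = {3/4 above 70, 1/4 below 70}; with it, the weighted-sum formula reproduces the quoted 0.207 hartleys.

```python
import math

def h10(probs):
    """Entropy of a distribution in hartleys (base-10 logarithms)."""
    return -sum(p * math.log10(p) for p in probs if p > 0)

# Assumed joint pmf p(x, y), consistent with the stated marginals:
#   p(rainy, above 70) = 1/2    p(sunny, above 70) = 1/4
#   p(rainy, below 70) = 1/4    p(sunny, below 70) = 0

# Conditional distributions of X for each outcome of Y:
h_above = h10([2/3, 1/3])  # p(x | above 70) = {2/3 rainy, 1/3 sunny}
h_below = h10([1.0])       # p(x | below 70) = {1 rainy}: no remaining uncertainty

# H(X|Y) = sum over y of p(y) * H(X | Y = y)
h_x_given_y = (3/4) * h_above + (1/4) * h_below
print(round(h_x_given_y, 3))   # -> 0.207
```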

    Joint Probability Mass Function

    • A second example presents a joint probability mass function for variables X and Y with several discrete outcomes.
    • The joint distribution is defined with probabilities summing to 1 over all possible X and Y combinations.
    • Calculated entropies:
  • H(X|Y) yields a result of approximately 1.284 hartleys.
  • H(Y|X) results in approximately 0.324 hartleys.

    Assignments

• Tasks include calculating H(X), H(Y), and H(X,Y) in hartleys from the joint distributions and marginal probabilities given in the previous sections (a sketch of the computation pattern follows below).
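
As a pattern for these calculations, here is a minimal sketch that computes H(X), H(Y), and H(X,Y) in hartleys from a joint table and recovers H(X|Y) via the chain rule. It reuses the assumed weather joint from the earlier sketch; the assignment's own tables come from the previous sections and are not reproduced here.

```python
import math
from collections import defaultdict

def entropy(probs, base=10):
    """Entropy in hartleys by default (base-10 logarithms)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Assumed joint pmf p(x, y) for the weather/temperature example.
joint = {
    ("rainy", "above 70"): 1/2,
    ("sunny", "above 70"): 1/4,
    ("rainy", "below 70"): 1/4,
    ("sunny", "below 70"): 0,
}

# Marginals p(x) and p(y), obtained by summing the joint pmf.
p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    p_x[x] += p
    p_y[y] += p

h_x = entropy(p_x.values())      # H(X)    ~ 0.244 hartleys
h_y = entropy(p_y.values())      # H(Y)    ~ 0.244 hartleys
h_xy = entropy(joint.values())   # H(X,Y)  ~ 0.452 hartleys

# Chain rule: H(X,Y) = H(Y) + H(X|Y), so H(X|Y) = H(X,Y) - H(Y) ~ 0.207.
print(round(h_x, 3), round(h_y, 3), round(h_xy, 3), round(h_xy - h_y, 3))
```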

    Related Documents

    lec(7)conditional Entropy.pptx

    Description

This quiz covers the concept of conditional entropy, H(X|Y), and its mathematical definitions. It explores how the uncertainty in one random variable changes given knowledge of another, and also tests your understanding of the chain rule for joint entropy.
