Information Theory: Variable Length Codes

Questions and Answers

Under what condition are Variable Length Codes most effectively used?

  • When all message probabilities are equal.
  • When message probabilities are exponentially increasing.
  • When message probabilities are not equal. (correct)
  • When transmitting data over a noiseless channel.

What is the primary goal of minimum redundancy codes?

  • To simplify the encoding process.
  • To reduce the average codeword length. (correct)
  • To maximize the transmission rate.
  • To increase the error detection capabilities.

What is the first step in applying Shannon Coding?

  • Arranging messages in decreasing order of probability. (correct)
  • Calculating codeword lengths.
  • Assigning codewords based on probability.
  • Defining the weighting factor for each message.

In Shannon coding, how is the codeword length ($l_i$) determined if the probability $p(x_i)$ is not equal to $(1/2)^r$?

  • $l_i = Int[-log_2 p(x_i)] + 1$ (correct)

What is the role of the weighting factor ($w_i$) in Shannon coding?

  • To resolve the binary equivalent into $l_i$ bits. (correct)

In Shannon coding, what operation is performed iteratively on $w_i$ to convert it into a binary codeword?

  • Multiplication by 2. (correct)

What condition regarding codeword decodability is essential for Shannon coding?

  • The codewords must be uniquely decodable. (correct)

What is the formula for calculating coding efficiency (η) in the context of source coding?

  • η = H(X) / Lc (correct)

In Shannon-Fano coding, what is the key criterion for selecting the point at which to divide messages?

  • To make the sum of probabilities in each group as nearly equal as possible. (correct)

After arranging messages in decreasing order of probability in Shannon-Fano coding, what is the next step?

  • Dividing the messages into two groups with nearly equal probability sums. (correct)

In Shannon-Fano coding, after dividing the messages into groups, how are the binary codes assigned?

  • Assign '0' to the entire upper group and '1' to the entire lower group. (correct)

What is the role of repeating steps in Shannon-Fano coding?

  • To separate all the messages completely into uniquely decodable codes. (correct)

In Huffman coding, what is the initial step after arranging messages in decreasing order of probability?

  • Joining the two lowest-probability messages. (correct)

In Huffman coding, what values are assigned to the two lowest probability messages when they are joined?

  • 0 and 1 (correct)

What is the criterion for reading the codeword in Huffman coding?

  • Left to right following arrows marked '0' and '1'. (correct)

What is a primary characteristic of the code produced by Huffman coding?

  • It is uniquely decodable. (correct)

Under what circumstances is Source Extension typically employed?

  • When the source produces symbols with very extreme probabilities. (correct)

What does Source Extension involve?

  • Grouping sequences of source symbols into 'super-symbols'. (correct)

What assumption is typically made about the symbols when using Source Extension?

  • They are statistically independent. (correct)

What calculation is performed to determine the probability of a group of two symbols when using Source Extension with statistically independent symbols?

  • The product of their individual probabilities. (correct)

How does increasing the number of symbols grouped in Source Extension typically affect coding efficiency?

  • Efficiency tends to improve, approaching theoretical limits. (correct)

When is the coding efficiency 100%?

  • When p(xi) = (1/D)^r (correct)

Which of the following is NOT a type of minimum redundancy code?

  • The shorter the code, the higher the probability (correct)

Which is the correct method for computing the codeword length?

  • $l_i = -\log_2 p(x_i)$ (correct)

What is the advantage of the Huffman code?

  • It meets the conditions of a uniquely decodable code. (correct)

Which is the correct order of steps in the binary Fano code procedure?

  • Arranging, finding, assigning, separating (correct)

What is the total amount of information if the source produces 10000 messages?

  • It is 26.1 k bits (correct)

Regarding coding efficiency, what is accurate?

  • It is the ratio of the entropy H(X) to the average codeword length Lc (correct)

The Shannon code requires fulfilling which two considerations?

  • Smaller lᵢ for higher P(xᵢ), and the codewords must be uniquely decodable (correct)

Flashcards

Variable Length Codes

Codes used when message probabilities are not equal; also referred to as minimum redundancy codes.

Shannon Code

A type of variable length code where the codeword length is determined based on the probability of the message.

Steps for Shannon Code

Arrange messages in decreasing order of probability. Find the codeword length from the message probability: lᵢ = -log₂ p(xᵢ), rounded up to the next integer when p(xᵢ) is not an exact power of ½. The codeword is the binary equivalent of wᵢ resolved into lᵢ bits.

Shannon-Fano Coding steps

Arrange messages by probability. Find point where probabilities are nearly equal when split. Assign '0' to upward, '1' to downward. Repeat until separated.


Steps in Huffman Coding

Arrange messages by probability. Join the two lowest probabilities, assigning '0' and '1'. Rewrite and repeat until a total probability of 1 is reached.


Source Extension

A method that groups symbols into larger blocks to improve coding efficiency, especially with extreme probability distributions.


Study Notes

  • The presentation is about Information Theory and Coding, presented by Dr. Wassan Saad for 4th Electrical Engineering students.

Recall Topics

  • Channel capacity of Gaussian channels
  • Source coding of discrete sources
  • Coding efficiency and redundancy
  • Fixed Length Codes

Outline

  • Variable Length Codes
  • Shannon code
  • Fano Code
  • Huffman Code
  • Source Extension

Variable Length Codes

  • Variable Length Codes are used when message probabilities are unequal
  • These codes are minimum redundancy codes
  • Types of Variable Length Codes:
    • Shannon Code
    • Shannon-Fano Code
    • Huffman Code

Shannon Code

  • For messages x1, x2, x3, ..., xn with probabilities P(x1), P(x2), P(x3), ... P(xn), suitable for binary coding (D=2)
  • Step 1: Arrange messages in decreasing order of probabilities
  • Step 2: Find the codeword length such that:
    • lᵢ = -log₂ P(xᵢ) if P(xᵢ) = (½)^r = ½, ¼, ⅛,...
    • lᵢ = Int [-log₂ P(xᵢ)] + 1 if P(xᵢ) ≠ (½)^r
  • Step 3: Define wᵢ = P(x₁) + P(x₂) + ... + P(xᵢ₋₁), the cumulative probability of all preceding messages, with w₁ = 0
    • The codeword of xᵢ is the binary equivalent of wᵢ resolved into lᵢ bits
  • Step 4: Find the codeword of xᵢ by converting wᵢ to binary: multiply wᵢ by 2, take the integer part as the next bit, and repeat on the fractional part lᵢ times
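The four steps can be sketched in Python (an illustrative implementation written for these notes, not taken from the lesson; it assumes the probabilities are already sorted in decreasing order):

```python
import math

def shannon_code(probs):
    """Shannon coding sketch; probs must be in decreasing order."""
    # Step 2: codeword lengths
    lengths = []
    for p in probs:
        l = -math.log2(p)
        # exact length if p is a power of 1/2, otherwise Int[-log2 p] + 1
        lengths.append(int(l) if l == int(l) else int(l) + 1)
    # Steps 3 and 4: cumulative probability w_i, resolved into l_i bits
    codes, w = [], 0.0  # w_1 = 0
    for p, l in zip(probs, lengths):
        bits, frac = "", w
        for _ in range(l):
            frac *= 2                 # multiply by 2 ...
            bits += str(int(frac))    # ... take the integer part as the next bit
            frac -= int(frac)         # ... and continue with the fractional part
        codes.append(bits)
        w += p                        # w_{i+1} = w_i + P(x_i)
    return codes
```

For the probabilities of Example 1 below, [0.4, 0.25, 0.15, 0.1, 0.07, 0.03], this yields the codewords 00, 01, 101, 1100, 1110, 111110 with average length 2.61 bits/message.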

Example 1: Shannon Code Development

  • Develop Shannon code for messages x1, x2, x3, x4, x5, x6 with probabilities [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
  • Find coding efficiency and redundancy
  • Determine p(0) and p(1) at the encoder output
  • Calculate the amount of information produced at the encoder output if the source produces 10000 messages

Shannon Code: Key Considerations

  • Smaller lᵢ for higher P(xᵢ)
  • The codewords must be uniquely decodable

Code Efficiency

  • H(x) = −Σ P(xᵢ) log₂ P(xᵢ) = 2.1918 bits/message
  • L_c = Σ lᵢ P(xᵢ) = 2.61 bits/message
  • η = H(x) / L_c = 2.1918 / 2.61 = 83.98%

Probabilities of 0 and 1

  • p(0) = Σ n₀ᵢ P(xᵢ) / L_c = 0.574, where n₀ᵢ is the number of 0s in the codeword of xᵢ
  • p(1) = Σ n₁ᵢ P(xᵢ) / L_c = 1 − p(0) = 0.426

Amount of Information

  • 10000 x L_c = 10000 x 2.61 = 26.1 k bits
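The results above can be checked in Python. The codewords used are the ones the Shannon procedure produces for Example 1 (derived for these notes, so treat them as illustrative):

```python
import math

probs = [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
codes = ["00", "01", "101", "1100", "1110", "111110"]  # Shannon codewords for Example 1

H = -sum(p * math.log2(p) for p in probs)           # entropy ≈ 2.1918 bits/message
Lc = sum(len(c) * p for c, p in zip(codes, probs))  # average codeword length = 2.61
eta = H / Lc                                        # coding efficiency ≈ 0.84

# p(0): probability-weighted count of 0s per codeword, normalized by Lc
p0 = sum(c.count("0") * p for c, p in zip(codes, probs)) / Lc  # ≈ 0.5747
p1 = 1 - p0                                                    # ≈ 0.4253

info = 10000 * Lc  # 26100 bits at the encoder output for 10000 messages
```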

Shannon-Fano Code Procedure

  • Step 1: Arrange the messages in a decreasing order of probability
  • Step 2: Find a point in the decreasing order such that the sum of probabilities upward is almost equal to the sum of probabilities downward
  • Step 3: Assign all messages upward as "0" and all messages downward as "1"
  • Step 4: Repeat steps 2 and 3 many times on the upward and downward parts until all the messages are separated
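A minimal Python sketch of this procedure (written for these notes; `fano_code` is an illustrative name, and the probabilities are assumed to be in decreasing order already):

```python
def fano_code(probs):
    """Shannon-Fano sketch; returns one codeword string per message."""
    codes = [""] * len(probs)

    def split(indices):
        if len(indices) <= 1:
            return
        total = sum(probs[i] for i in indices)
        # Step 2: find the split point where the upward and downward sums
        # of probabilities are as nearly equal as possible
        run, best_k, best_diff = 0.0, 0, float("inf")
        for k in range(len(indices) - 1):
            run += probs[indices[k]]
            diff = abs(run - (total - run))
            if diff < best_diff:
                best_diff, best_k = diff, k
        upper, lower = indices[:best_k + 1], indices[best_k + 1:]
        # Step 3: '0' for the upward group, '1' for the downward group
        for i in upper:
            codes[i] += "0"
        for i in lower:
            codes[i] += "1"
        # Step 4: repeat on each part until all messages are separated
        split(upper)
        split(lower)

    split(list(range(len(probs))))
    return codes
```

For Example 2 below it splits between x2 and x3 first (sums 0.45 and 0.55) and returns the codewords 00, 01, 100, 101, 110, 111.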

Example 2: Fano Code

  • Develop Fano code for messages x1, x2, x3, x4, x5, x6 with probabilities [0.25, 0.2, 0.18, 0.15, 0.12, 0.1]
  • Determine coding efficiency

Fano Code Solution

  • Messages are already in decreasing order of probability
  • The point between x2 and x3 is the best choice
  • Sum of probabilities upward is 0.45
  • Sum of probabilities downward is 0.55
  • Assign upward by "0" and downward by "1"

Example 3: Fano Code

  • Develop Fano code for the following set of messages: x1, x2, x3, x4, x5, x6, x7, x8 with probabilities = [0.4, 0.2, 0.12, 0.08, 0.04, 0.04]
  • Find coding efficiency
  • H(x) = 2.5 bits/message
  • Lc = 2.56 bits/message
  • η = H(x) / Lc = 2.5 / 2.56 = 97.66%

Huffman Code

  • Step 1: Arrange messages in a decreasing order of probability
  • Step 2: The two lowest probability messages are joined, assigned "0" for one and "1" for the other
  • Step 3: Rewrite messages once again in a decreasing order by replacing the sum of probabilities of step 2 as a probability of one message, reducing the number of messages by one
  • Step 4: Repeat steps 2 and 3 many times until you end up with a total probability of unity 1
  • Step 5: The codeword for each message is read from marked "0" and "1" following the arrows from left to right, writing codeword bits from right to left to meet the condition of a uniquely decodable code
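These steps can be sketched with a priority queue (an illustrative implementation written for these notes; prepending the bit at each merge reproduces the right-to-left write-out of Step 5):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Huffman coding sketch; returns one codeword per input probability."""
    tie = count()  # unique tiebreaker so the heap never compares the lists
    heap = [(p, next(tie), [(i, "")]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Step 2: join the two lowest-probability entries, assigning '0' and '1'
        pa, _, ga = heapq.heappop(heap)
        pb, _, gb = heapq.heappop(heap)
        merged = [(i, "0" + s) for i, s in ga] + [(i, "1" + s) for i, s in gb]
        # Step 3: the joined pair re-enters as one message with summed probability
        heapq.heappush(heap, (pa + pb, next(tie), merged))
    codes = [""] * len(probs)
    for i, s in heap[0][2]:
        codes[i] = s
    return codes
```

For the probabilities [0.4, 0.25, 0.15, 0.1, 0.07, 0.03] this gives codeword lengths 1, 2, 3, 4, 5, 5 and Lc = 2.25 bits/message, so η = 2.1918 / 2.25 ≈ 97.4%.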

Example 4: Huffman Code Development

  • Develop Huffman code for messages x1, x2, x3, x4, x5, x6, x7, x8 with probabilities [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
  • Determine coding efficiency

Source Extension

  • When a source produces symbols with extreme probabilities, like p(x) = [0.9 0.1] or p(x) = [0.9 0.08 0.02], the per-symbol variable-length coding methods above give low efficiency
  • The source is p(x) = [0.9 0.1]
  • The Huffman procedure then assigns '0' to x1 (0.9) and '1' to x2 (0.1), resulting in Lc = 1 bit/symbol
  • H(x) = 0.469 bits/symbol and efficiency η = H(x) / Lc = 0.469 / 1 = 46.9%
  • Now, group two symbols and regard them as one message (source extension), assuming statistically independent symbols
  • The probability of a group of two symbols becomes the joint probability
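A worked sketch for the second extension of the p(x) = [0.9 0.1] source (computed for these notes, not quoted from the lesson; the Huffman lengths [1, 2, 3, 3] are one valid assignment for the pair probabilities [0.81, 0.09, 0.09, 0.01]):

```python
import math

p = [0.9, 0.1]
H = -sum(q * math.log2(q) for q in p)  # ≈ 0.469 bits/symbol

# Joint probability of each ordered pair is the product of the individual
# probabilities (symbols assumed statistically independent)
pairs = [a * b for a in p for b in p]  # [0.81, 0.09, 0.09, 0.01]

lengths = [1, 2, 3, 3]                            # one valid Huffman assignment
Lc = sum(l * q for l, q in zip(lengths, pairs))   # 1.29 bits per 2-symbol block
eta = H / (Lc / 2)                                # efficiency per source symbol
```

Grouping pairs already lifts the efficiency from 46.9% to about 72.7%; larger extensions improve it further.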

Source Extension Efficiency Improvement

  • By grouping symbols into messages (source extension), coding efficiency can be improved

Example of Huffman with Source Extension

  • Third extension: group symbols in blocks of three
  • Lc = 1.626 bits per 3-symbol block, i.e. 1.626 / 3 = 0.542 bits/symbol
  • Efficiency η = 0.469 / 0.542 = 86.5%
