Questions and Answers
Under what condition are Variable Length Codes most effectively used?
- When all message probabilities are equal.
- When message probabilities are exponentially increasing.
- When message probabilities are not equal. (correct)
- When transmitting data over a noiseless channel.
What is the primary goal of minimum redundancy codes?
- To simplify the encoding process.
- To reduce the average codeword length. (correct)
- To maximize the transmission rate.
- To increase the error detection capabilities.
What is the first step in applying Shannon Coding?
- Arranging messages in decreasing order of probability. (correct)
- Calculating codeword lengths.
- Assigning codewords based on probability.
- Defining the weighting factor for each message.
In Shannon coding, how is the codeword length ($l_i$) determined if the probability $p(x_i)$ is not equal to $(1/2)^r$?
What is the role of the weighting factor ($w_i$) in Shannon coding?
In Shannon coding, what operation is performed iteratively on $w_i$ to convert it into a binary codeword?
What condition regarding codeword decodability is essential for Shannon coding?
What is the formula for calculating coding efficiency (η) in the context of source coding?
In Shannon-Fano coding, what is the key criterion for selecting the point at which to divide messages?
After arranging messages in decreasing order of probability in Shannon-Fano coding, what is the next step?
In Shannon-Fano coding, after dividing the messages into groups, how are the binary codes assigned?
What is the role of repeating steps in Shannon-Fano coding?
In Huffman coding, what is the initial step after arranging messages in decreasing order of probability?
In Huffman coding, what values are assigned to the two lowest probability messages when they are joined?
What is the criterion for reading the codeword in Huffman coding?
What is a primary characteristic of the code produced by Huffman coding?
Under what circumstances is Source Extension typically employed?
What does Source Extension involve?
What assumption is typically made about the symbols when using Source Extension?
What calculation is performed to determine the probability of a group of two symbols when using Source Extension with statistically independent symbols?
How does increasing the number of symbols grouped in Source Extension typically affect coding efficiency?
When is the coding efficiency 100%?
Which of the following is NOT a type of minimum redundancy code?
Which is the correct method for computing the codeword length?
What is the advantage of the Huffman code?
Which is the correct order of steps in the binary Fano code procedure?
What happens to the total amount of information if the source produces 10000 messages?
Regarding coding efficiency, what is accurate?
Which two considerations must the Shannon code fulfill?
Flashcards
Variable Length Codes
Codes used when message probabilities are not equal; also referred to as minimum redundancy codes.
Shannon Code
A type of variable length code where the codeword length is determined based on the probability of the message.
Steps for Shannon Code
Arrange messages in decreasing order of probability. Set the codeword length lᵢ = −log₂ p(xᵢ) when p(xᵢ) is a power of ½, otherwise lᵢ = Int[−log₂ p(xᵢ)] + 1. The codeword is the binary equivalent of wᵢ (the cumulative probability of the preceding messages) resolved into lᵢ bits.
Shannon-Fano Coding steps
Arrange messages in decreasing order of probability; split the list at the point where the upper and lower probability sums are most nearly equal; assign "0" to the upper group and "1" to the lower group; repeat on each group until every message is separated.
Steps in Huffman Coding
Arrange messages in decreasing order of probability; join the two lowest-probability messages, assigning "0" to one and "1" to the other; replace the pair with a single message carrying their combined probability; repeat until the total probability reaches 1, then read each codeword back from the marked bits.
Source Extension
Grouping successive source symbols and treating each group as one message; used when symbol probabilities are extreme, since coding the extended source improves efficiency. For statistically independent symbols, the group probability is the product of the individual symbol probabilities.
Study Notes
- The presentation covers Information Theory and Coding, presented by Dr. Wassan Saad for 4th-year Electrical Engineering students.
Recall Topics
- Channel capacity of Gaussian channels
- Source coding of discrete sources
- Coding efficiency and redundancy
- Fixed Length Codes
Outline
- Variable Length Codes
- Shannon code
- Fano Code
- Huffman Code
- Source Extension
Variable Length Codes
- Variable Length Codes are used when message probabilities are unequal
- These codes are minimum redundancy codes
- Types of Variable Length Codes:
- Shannon Code
- Shannon-Fano Code
- Huffman Code
Shannon Code
- For messages x1, x2, x3, ..., xn with probabilities P(x1), P(x2), P(x3), ... P(xn), suitable for binary coding (D=2)
- Step 1: Arrange messages in decreasing order of probabilities
- Step 2: Find the codeword length such that:
- lᵢ = −log₂ P(xᵢ) if P(xᵢ) = (½)^r = ½, ¼, ⅛,...
- lᵢ = Int[−log₂ P(xᵢ)] + 1 if P(xᵢ) ≠ (½)^r
- Step 3: Define wᵢ = Σ P(xₖ) for k = 1 to i−1, with w₁ = 0
- The codeword of xᵢ is the binary equivalent of wᵢ resolved into lᵢ bits
- Step 4: Find the codeword of xᵢ by converting wᵢ to binary: multiply by 2, take the integer part as the next bit, and repeat lᵢ times
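The four steps can be sketched as a short Python routine (a minimal illustration, not part of the original slides; it assumes the probabilities are already sorted in decreasing order):

```python
import math

def shannon_code(probs):
    """Shannon codewords for probabilities sorted in decreasing order."""
    # Step 2: l_i = ceil(-log2 p_i); exact when p_i is a power of 1/2,
    # otherwise equal to Int[-log2 p_i] + 1
    lengths = [math.ceil(-math.log2(p)) for p in probs]
    codewords = []
    w = 0.0  # Step 3: w_i = cumulative probability of preceding messages, w_1 = 0
    for p, l in zip(probs, lengths):
        # Step 4: expand w into l binary digits by repeated doubling
        frac, bits = w, []
        for _ in range(l):
            frac *= 2
            bit = int(frac)  # the integer part is the next code bit
            bits.append(str(bit))
            frac -= bit
        codewords.append("".join(bits))
        w += p
    return codewords

# Example 1's probabilities
print(shannon_code([0.4, 0.25, 0.15, 0.1, 0.07, 0.03]))
# → ['00', '01', '101', '1100', '1110', '111110']
```

Applied to Example 1, this reproduces codeword lengths 2, 2, 3, 4, 4, 6, which give the average length used in the efficiency calculation.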
Example 1: Shannon Code Development
- Develop Shannon code for messages x1, x2, x3, x4, x5, x6 with probabilities [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
- Find coding efficiency and redundancy
- Determine p(0) and p(1) at the encoder output
- Calculate the amount of information produced at the encoder output if the source produces 10000 messages
Shannon Code: Key Considerations
- Less lᵢ for higher P(xᵢ)
- The codewords must be uniquely decodable
Code Efficiency
- H(x) = −Σ P(xᵢ) log₂ P(xᵢ) = 2.1918 bits/message
- L_c = Σ lᵢ P(xᵢ) = 2.61
- η% = H(x) / L_c = 2.1918 / 2.61 = 83.98%
Probabilities of 0 and 1
- p(0) = Σ n₀ᵢ P(xᵢ) / L_c = 0.574, where n₀ᵢ is the number of 0s in the codeword of xᵢ
- p(1) = Σ n₁ᵢ P(xᵢ) / L_c = 1 − p(0) = 0.426
Amount of Information
- 10000 × L_c = 10000 × 2.61 = 26100 bits = 26.1 kbit
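The worked figures above can be checked numerically. The sketch below assumes the six Shannon codewords produced by the procedure for Example 1 (the slides do not list them explicitly, so treat them as a working assumption):

```python
import math

probs = [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
# Shannon codewords obtained by applying the procedure above (assumed)
codes = ["00", "01", "101", "1100", "1110", "111110"]

H = -sum(p * math.log2(p) for p in probs)           # entropy, bits/message
Lc = sum(p * len(c) for p, c in zip(probs, codes))  # average codeword length
eta = H / Lc                                        # coding efficiency

# p(0): probability-weighted count of 0-bits per codeword, normalised by Lc
p0 = sum(p * c.count("0") for p, c in zip(probs, codes)) / Lc
p1 = 1 - p0

total_bits = 10000 * Lc  # information carried by 10000 encoded messages

print(round(H, 4), round(Lc, 2), f"{eta:.1%}")
print(round(p0, 3), round(p1, 3), round(total_bits / 1000, 1), "kbit")
```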
Shannon-Fano Code Procedure
- Step 1: Arrange the messages in a decreasing order of probability
- Step 2: Find a point in the decreasing order such that the sum of probabilities upward is almost equal to the sum of probabilities downward
- Step 3: Assign all messages upward as "0" and all messages downward as "1"
- Step 4: Repeat steps 2 and 3 many times on the upward and downward parts until all the messages are separated
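The procedure above can be written as a short recursive sketch (illustrative only; ties in the split choice are resolved toward the earlier point):

```python
def fano(items):
    """Recursive Shannon-Fano coding.
    items: list of (message, probability) in decreasing probability order."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    # Step 2: find the split where upper and lower sums are most nearly equal
    best_k, best_diff, upper_sum = 1, float("inf"), 0.0
    for k in range(1, len(items)):
        upper_sum += items[k - 1][1]
        diff = abs(2 * upper_sum - total)  # |upper_sum - lower_sum|
        if diff < best_diff:
            best_k, best_diff = k, diff
    # Steps 3-4: prefix "0" upward, "1" downward, then recurse on each part
    codes = {m: "0" + c for m, c in fano(items[:best_k]).items()}
    codes.update({m: "1" + c for m, c in fano(items[best_k:]).items()})
    return codes

# Example 2's probabilities
msgs = [("x1", 0.25), ("x2", 0.2), ("x3", 0.18),
        ("x4", 0.15), ("x5", 0.12), ("x6", 0.1)]
print(fano(msgs))
# → {'x1': '00', 'x2': '01', 'x3': '100', 'x4': '101', 'x5': '110', 'x6': '111'}
```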
Example 2: Fano Code
- Develop Fano code for messages x1, x2, x3, x4, x5, x6 with probabilities [0.25, 0.2, 0.18, 0.15, 0.12, 0.1]
- Determine coding efficiency
Fano Code Solution
- Messages are already in decreasing order of probability
- The point between x2 and x3 is the best choice
- Sum of probabilities upward is 0.45
- Sum of probabilities downward is 0.55
- Assign upward by "0" and downward by "1"
Example 3: Fano Code
- Develop Fano code for the following set of messages: x1, x2, x3, x4, x5, x6, x7, x8 with probabilities = [0.4, 0.2, 0.12, 0.08, 0.08, 0.04, 0.04, 0.04]
- Find coding efficiency
- H(x) = 2.5 bits/message
- Lc = 2.56 bits/message
- η% = H(x) / Lc = 2.5 / 2.56 = 97.66%
Huffman Code
- Step 1: Arrange messages in a decreasing order of probability
- Step 2: The two lowest probability messages are joined, assigned "0" for one and "1" for the other
- Step 3: Rewrite messages once again in a decreasing order by replacing the sum of probabilities of step 2 as a probability of one message, reducing the number of messages by one
- Step 4: Repeat steps 2 and 3 many times until you end up with a total probability of unity 1
- Step 5: The codeword for each message is read from marked "0" and "1" following the arrows from left to right, writing codeword bits from right to left to meet the condition of a uniquely decodable code
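The steps above can be sketched with a priority queue (a hypothetical implementation, not from the slides; the heap replaces the manual re-sorting of Step 3, and prefixing bits as groups merge is equivalent to reading the marks right-to-left in Step 5):

```python
import heapq
from itertools import count

def huffman(probs):
    """probs: dict message -> probability; returns dict message -> codeword."""
    ticket = count()  # tie-breaker so dicts are never compared in the heap
    heap = [(p, next(ticket), {m: ""}) for m, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Step 2: join the two lowest-probability entries,
        # prefixing "0" to one group and "1" to the other
        pa, _, ca = heapq.heappop(heap)
        pb, _, cb = heapq.heappop(heap)
        merged = {m: "0" + c for m, c in ca.items()}
        merged.update({m: "1" + c for m, c in cb.items()})
        heapq.heappush(heap, (pa + pb, next(ticket), merged))
    return heap[0][2]

# The six-message source from the Shannon example (a working assumption)
probs = {"x1": 0.4, "x2": 0.25, "x3": 0.15, "x4": 0.1, "x5": 0.07, "x6": 0.03}
codes = huffman(probs)
Lc = sum(p * len(codes[m]) for m, p in probs.items())
print(codes, round(Lc, 2))  # average length 2.25 bits/message
```

The resulting code is prefix-free, so it is uniquely decodable without separators.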
Example 4: Huffman Code Development
- Develop Huffman code for messages x1, x2, x3, x4, x5, x6 with probabilities [0.4, 0.25, 0.15, 0.1, 0.07, 0.03]
- Determine coding efficiency
Source Extension
- When a source produces symbols with extreme probabilities, such as p(x) = [0.9 0.1] or p(x) = [0.9 0.08 0.02], the previous variable-length coding methods give low efficiency
- The source is p(x) = [0.9 0.1]
- The Huffman procedure then assigns '0' to x1 (0.9) and '1' to x2 (0.1), resulting in Lc = 1 bit/symbol
- H(x) = 0.469 bits/symbol and efficiency η = H(x) / Lc = 0.469 / 1 = 46.9%
- Now, group two symbols and regard them as one message (source extension), assuming statistically independent symbols
- The probability of a group of two symbols is the joint probability, i.e. the product of the two symbol probabilities
Source Extension Efficiency Improvement
- By grouping symbols into messages (source extension), coding efficiency can be improved
Example of Huffman with Source Extension
- Third extension: three symbols per group
- Lc = 1.626 bits per 3-symbol group, i.e. 0.542 bits/symbol
- Efficiency η = 0.469 / 0.542 = 86.5%
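The efficiency gain can be illustrated with a small script (a sketch, not from the slides; it uses the identity that the average Huffman codeword length equals the sum of all merged-node probabilities, and the [0.9, 0.1] source from the notes):

```python
import heapq
import math
from itertools import product

def huffman_avg_length(probs):
    """Average Huffman codeword length: equal to the sum of the
    probabilities of every merged (internal) node in the Huffman tree."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

p = [0.9, 0.1]
H = -sum(q * math.log2(q) for q in p)  # ≈ 0.469 bits/symbol
for n in (1, 2):
    # n-th extension: joint probabilities of n independent symbols
    ext = [math.prod(g) for g in product(p, repeat=n)]
    Lc = huffman_avg_length(ext) / n   # average bits per original symbol
    print(f"extension {n}: Lc = {Lc:.3f} bits/symbol, efficiency = {H / Lc:.1%}")
```

Under these assumptions the second extension already lifts efficiency from 46.9% to about 72.7%, consistent with the improving trend the notes describe for larger groups.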