Error Detection and Correction Codes

Questions and Answers

Indicate(s) an error in a received combination.

  • Data bits
  • Error syndrome (correct)
  • None of the given
  • Parity bits

... is a measure of uncertainty

  • Redundancy
  • Entropy (correct)
  • Encoding
  • Information

A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 001?

  • 001
  • None
  • 101 (correct)
  • 010

A codeword of the Hamming code consists of ____ and ____ bits.

data; parity (B)

A Huffman code is a = 1, b = 000, c = 001, d = 01. Probabilities are p(a) = 0.4, p(b) = 0.1, p(c) = 0.2, p(d) = 0.3. The average length of codewords q is

1.9 bits (0.4·1 + 0.1·3 + 0.2·3 + 0.3·2 = 1.9)

A redundancy of a code S = ...

S = 1 - I_avr/I_max (B)

An average length of codewords qavr = ...

q_avr = Σ(pi * qi) (B)

An efficiency of a code E = ...

E = I_avr/I_max (B)

ASCII code is a

Fixed-length code (D)

By the Bayes' rule for conditional entropy H(Y|X) = ...

H(X,Y) - H(X) (C)

By the Bayes' theorem ...

P(B|A) = P(A and B)/P(A) (A)

By the Chain rule H(X,Y) = H(Y|X) + ...

H(X) (C)

By the Hartley's formula the amount of information I = ...

I = n*log m (D)

By the Hartley's formula the entropy H = ...

H = log m (the special case of Shannon's formula for m equally likely symbols)

By the property of joint entropy H(X,Y) <= ...

H(X) + H(Y) (B)

By the Shannon's formula the amount of information I = ...

I = -n * Σ(pi*log pi) (B)

By the Shannon's formula the entropy H = ...

H = -Σ(pi*log pi) (D)

Calculate the code rate for Hamming (15,11) code

0.733 (C)

Calculate the efficiency of the language if it has 32 letters and its I average is 1 bit.

0.2 (A)

Calculate the redundancy of the language if it has 32 letters and its I average is 1 bit.

0.8 (C)

Choose an example of block code

Hamming code (D)

Choose conditions of an optimal coding (p – probability, I – length of a code word)

if pi > pj, then li <= lj (B)

Choose the formula to create the Hamming code

(n, k) = (2^r - 1, 2^r - 1 - r) (D)

Choose the formula to determine the number N of possible messages with length n if the message source alphabet consists of m characters, each of which can be an element of the message.

N = m^n (B)

Code has dmin = 1. How many errors can be corrected by this code?

0 (correcting s errors requires dmin >= 2s + 1, so dmin = 1 corrects none)

Flashcards

What is a parity bit?

Parity bits are extra bits added to a data block to detect errors during transmission. They ensure that the total number of '1' bits in the block is even or odd, depending on the parity scheme.
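As an illustration (not part of the lesson itself), even parity can be computed in a couple of lines of Python; the function name and sample data here are made up for the example:

```python
def even_parity_bit(bits):
    """Return the extra bit that makes the total number of 1s even."""
    return sum(bits) % 2

data = [1, 0, 1, 1]                       # three 1s, so the parity bit is 1
codeword = data + [even_parity_bit(data)]
assert sum(codeword) % 2 == 0             # the codeword now has even parity
```

A single flipped bit makes the parity check fail, which is how one error is detected; two flipped bits cancel out, which is why a single parity bit cannot detect all double errors.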

What is entropy?

Entropy is a measure of uncertainty or randomness associated with a random variable. It quantifies the average amount of information needed to describe the outcome of a random event.
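A quick sketch of the calculation (illustrative code, not from the lesson):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: 1.0 bit, maximum uncertainty
print(entropy([0.9, 0.1]))  # a biased coin: about 0.47 bits
```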

What is an error syndrome?

Error syndrome is a binary code derived from the received code word that indicates the position of the error(s), if any, within the transmitted code word.

How many errors can be detected by code with dmin = 1?

A code with a minimum Hamming distance of 1 cannot detect any errors: detecting r errors requires dmin >= r + 1, so dmin = 1 gives r = 0.

How many errors can be detected by code with dmin = 10 ?

A code with a minimum Hamming distance of 10 can detect up to 9 errors (r = dmin - 1).

Given the allowable combinations 101 and 010, what is the allowable combination for the error combination 001?

The allowable combination for the error combination 001 is 101, as it is the closest allowable combination to it.

Given the allowable combinations 101 and 010, what is the allowable combination for the error combination 100?

The allowable combination for the error combination 100 is 101: it lies at Hamming distance 1 from 101 but at distance 2 from 010.

Given the allowable combinations 101 and 010, what is the allowable combination for the error combination 011?

The allowable combination for the error combination 011 is 010, as it is the closest allowable combination.

What does a code word consist of?

A code word consists of data bits and parity bits.

How is the average length of codewords calculated?

The average length of codewords (q) is calculated by summing the product of the probability of each symbol (pi) and the length of its corresponding codeword (qi).
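Applying this to the Huffman code from the quiz above (a = 1, b = 000, c = 001, d = 01 with probabilities 0.4, 0.1, 0.2, 0.3), a short sketch:

```python
code  = {'a': '1', 'b': '000', 'c': '001', 'd': '01'}
probs = {'a': 0.4, 'b': 0.1, 'c': 0.2, 'd': 0.3}

# q_avr = sum of p_i * q_i, where q_i is the codeword length of symbol i
q_avr = sum(probs[s] * len(code[s]) for s in code)
print(round(q_avr, 2))  # 1.9
```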

How is the redundancy of a code calculated?

The redundancy of a code measures the proportion of redundant bits added to the original data. It is calculated as S = 1 - I_avr/I_max.

How is the efficiency of a code (E) calculated?

The efficiency of a code is calculated by dividing the average information by the maximum possible information.

What type of code is ASCII?

ASCII code is a fixed-length code, meaning that every character is represented by the same number of bits.

How is conditional entropy H(Y|X) calculated?

Conditional entropy H(Y|X) represents the uncertainty about Y given that X is known. It is calculated as H(Y|X) = H(X,Y) - H(X).

What is Bayes' theorem?

Bayes' theorem provides a way to calculate the conditional probability of an event B given another event A: P(B|A) = P(A and B)/P(A).

What is the Chain rule for entropy?

The Chain rule states that the joint entropy of X and Y is equal to the conditional entropy of Y given X plus the entropy of X.
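The rule can be checked numerically on a small joint distribution (the numbers below are illustrative):

```python
import math

def H(ps):
    """Shannon entropy of a list of probabilities, in bits."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Illustrative joint distribution p(x, y) for two binary variables.
pxy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.4, (1, 1): 0.1}

px = {x: pxy[(x, 0)] + pxy[(x, 1)] for x in (0, 1)}        # marginal p(x)
h_cond = sum(px[x] * H([pxy[(x, y)] / px[x] for y in (0, 1)])
             for x in (0, 1))                               # H(Y|X)

# Chain rule: H(X, Y) = H(Y|X) + H(X)
assert abs(H(pxy.values()) - (h_cond + H(px.values()))) < 1e-9
```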

What is Hartley's formula?

Hartley's formula calculates the amount of information I in a message of length n drawn from an alphabet of m symbols: I = n*log m.

What is Hartley's formula for entropy?

Hartley's formula gives the entropy H of a source with an alphabet of m equally likely symbols: H = log m.

What is Shannon's formula?

Shannon's formula calculates the amount of information I for a message source where events have different probabilities: I = -n*Σ(pi*log pi).

What is Shannon's formula for entropy?

Shannon's formula gives the entropy H for a source whose symbols have different probabilities: H = -Σ(pi*log pi).

What is the formula for creating Hamming code?

The Hamming code is a specific type of linear block code commonly used for error detection and correction. It is constructed as (n, k) = (2^r - 1, 2^r - 1 - r), where n is the codeword length, k is the number of data bits, and r is the number of parity bits.

What is code rate (R) in error-correcting codes?

Code rate (R) is the ratio of the number of information bits (k) to the total number of bits (n) in a code word.

What is Hamming distance?

The Hamming distance between two code words is the number of bit positions in which the two code words differ.

How to calculate the Hamming distance?

The Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different.
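In code this is a one-liner (a sketch, using the two codewords from this lesson):

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance('101', '010'))  # 3 -- every position differs
print(hamming_distance('101', '001'))  # 1 -- only the first bit differs
```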

What is the first step in the Shannon-Fano algorithm?

The first step of Shannon-Fano algorithm is to arrange the characters of the alphabet in descending order based on their probabilities.

What is the condition for a code to detect r errors?

For a code to be able to detect r errors, the minimum Hamming distance dmin of the code should be at least r+1.

What is the condition for a code to correct up to 's' errors?

For a code to be able to correct s errors, the minimum Hamming distance dmin of the code should be at least 2s+1.
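Both conditions can be folded into one small helper (an illustrative sketch):

```python
def capability(d_min):
    """Errors detectable and correctable for a given minimum distance:
    detecting r errors needs d_min >= r + 1; correcting s needs d_min >= 2s + 1."""
    return d_min - 1, (d_min - 1) // 2

for d in (1, 3, 10):
    r, s = capability(d)
    print(f"dmin={d}: detects {r}, corrects {s}")
# dmin=1: detects 0, corrects 0
# dmin=3: detects 2, corrects 1
# dmin=10: detects 9, corrects 4
```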

Study Notes

Error Detection and Correction Codes

  • Error syndrome: Indicates an error in a received combination.
  • Parity bits: Redundant bits added to the data so that errors can be detected.
  • Data bits: Represent the actual data being transmitted.
  • Entropy: A measure of uncertainty.
  • Encoding: The process of converting data into a specific format.
  • Information: The content or data being transmitted.
  • Redundancy: Extra data added for error detection and correction.

Allowable Combinations

  • A code with allowable combinations 101 and 010.

    • Error combination 001: Allowable combination is 101
    • Error combination 100: Allowable combination is 101
    • Error combination 011: Allowable combination is 010
    • Error combination 110: Allowable combination is 010
    • Error combination 000: Allowable combination is 010
    • Error combination 111: Allowable combination is 101
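Each received word maps to whichever of 101 and 010 is closer; since the two codewords are at distance 3, no received word is tied between them. A short sketch:

```python
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

codewords = ['101', '010']

def decode(received):
    """Map a received word to the closest allowable combination."""
    return min(codewords, key=lambda c: hamming_distance(received, c))

for r in ['001', '100', '011', '110', '000', '111']:
    print(r, '->', decode(r))
```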

Hamming Code

  • A codeword in the Hamming code consists of data and parity bits.
  • ASCII code is a fixed-length code.
  • Code rate is a measure of efficiency, usually given as k/n where k is the number of data bits and n is the total number of bits.
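For the standard Hamming family (n, k) = (2^r - 1, 2^r - 1 - r), the rate follows directly (a sketch):

```python
def hamming_rate(r):
    """Code rate k/n of the Hamming (2^r - 1, 2^r - 1 - r) code."""
    n = 2**r - 1
    return (n - r) / n

for r in (3, 4, 5):
    n = 2**r - 1
    print(f"Hamming ({n},{n - r}): R = {hamming_rate(r):.3f}")
# Hamming (7,4): R = 0.571
# Hamming (15,11): R = 0.733
# Hamming (31,26): R = 0.839
```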

Additional Concepts

  • Code Rate: Ratio of information bits to total bits.
  • Redundancy: Extra bits in a code to detect/correct errors.
  • Efficiency: Ratio of the average information per symbol to the maximum possible, E = I_avr/I_max.
  • Hamming Distance: The number of bit positions in which two codewords differ; the minimum distance dmin over all pairs of codewords determines the code's capability.
  • How many errors a code can correct/detect depends on dmin.
  • Average Length of Codewords (qavr): Calculated using Σ(pi * qi) for specific probabilities.
  • Entropy (H): Computed using H = -Σ(pi * log2 pi)

Calculation of Errors

  • Determining the number of errors a code can correct or detect given the minimum Hamming distance (dmin).
  • The number of errors that can be detected by a code equals dmin - 1.
  • The number of errors that can be corrected by a code depends on its minimum Hamming distance and equals ⌊(dmin - 1)/2⌋.
