Error Detection and Correction Codes
25 Questions

Questions and Answers

____ indicate(s) an error in a received combination.

  • Data bits
  • Error syndrome (correct)
  • None of the given
  • Parity bits
... is a measure of uncertainty.

  • Redundancy
  • Entropy (correct)
  • Encoding
  • Information
A code has two allowable combinations, 101 and 010. What is the allowable combination for the error combination 001?

  • 001
  • None
  • 101 (correct)
  • 010
A codeword of the Hamming code consists of ____ and ____ bits.

    <p>data; parity</p>

    A Huffman code is a = 1, b = 000, c = 001, d = 01. Probabilities are p(a) = 0.4, p(b) = 0.1, p(c) = 0.2, p(d) = 0.3. The average length of codewords q is

    <p>1.9 bit</p>
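As a quick check, the average length qavr = Σ(pi*qi) can be computed directly from the code and probabilities in the question (a minimal Python sketch, not part of the quiz):

```python
# Huffman code and symbol probabilities from the question above.
code = {"a": "1", "b": "000", "c": "001", "d": "01"}
probs = {"a": 0.4, "b": 0.1, "c": 0.2, "d": 0.3}

# q_avr = sum over symbols of p_i * (codeword length)
q_avr = sum(probs[s] * len(code[s]) for s in code)
print(q_avr)  # ≈ 1.9 bits per symbol
```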

    The redundancy of a code S = ...

    <p>1 - Iavr/Imax</p>

    The average length of codewords qavr = ...

    <p>Σ(pi*qi)</p>

    The efficiency of a code E = ...

    <p>Iavr/Imax</p>

    ASCII code is a

    <p>Fixed-length code</p>

    By Bayes' rule for conditional entropy H(Y|X) = ...

    <p>H(X,Y) - H(X)</p>

    By Bayes' theorem ...

    <p>P(B|A) = P(A and B)/P(A)</p>

    By the Chain rule H(X,Y) = H(Y|X) + ...

    <p>H(X)</p>

    By Hartley's formula the amount of information I = ...

    <p>I = n*log m</p>

    By Hartley's formula the entropy H = ...

    <p>H = log m</p>

    By the property of joint entropy H(X,Y) <= ...

    <p>H(X) + H(Y)</p>

    By Shannon's formula the amount of information I = ...

    <p>I = - n * Σ(pi*log pi)</p>

    By Shannon's formula the entropy H = ...

    <p>H = - Σ(pi*log pi)</p>
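The two Shannon formulas above can be sketched in Python (an illustration, not part of the lesson):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy per symbol.
H_coin = shannon_entropy([0.5, 0.5])

# Amount of information in a message of n symbols: I = n * H.
I_msg = 100 * H_coin  # 100 bits for 100 fair-coin symbols
```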

    Calculate the code rate for Hamming (15,11) code

    <p>0.733</p>

    Calculate the efficiency of the language if it has 32 letters and its I average is 1 bit.

    <p>0.2</p>

    Calculate the redundancy of the language if it has 32 letters and its I average is 1 bit.

    <p>0.8</p>
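Both language answers follow from E = Iavr/Imax and S = 1 - E with Imax = log2(32) = 5 bits; a small Python check (illustrative only):

```python
import math

m = 32                # letters in the language's alphabet
I_avr = 1.0           # average information per letter, in bits
I_max = math.log2(m)  # maximum possible: log2(32) = 5 bits

E = I_avr / I_max     # efficiency: 1/5 = 0.2
S = 1 - E             # redundancy: 0.8
```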

    Choose an example of block code

    <p>Hamming code</p>

    Choose conditions of an optimal coding (p – probability, l – length of a code word)

    <p>pi > pj and li <= lj</p>

    Choose the formula to create the Hamming code

    <p>(n, k) = (2^r - 1, 2^r - 1 - r)</p>
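This formula, with r parity bits, also reproduces the (15, 11) code from the code-rate question above (a minimal Python sketch):

```python
def hamming_params(r):
    """Hamming code parameters (n, k) = (2^r - 1, 2^r - 1 - r)."""
    n = 2**r - 1
    return n, n - r

n, k = hamming_params(4)  # r = 4 gives (15, 11)
rate = k / n              # code rate k/n = 11/15 ≈ 0.733
```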

    Choose the formula to determine the number N of possible messages with length n if the message source alphabet consists of m characters, each of which can be an element of the message.

    <p>N = m^n</p>

    Code has dmin = 1. How many errors can be corrected by this code?

    <p>0</p>

    Study Notes

    Error Detection and Correction Codes

    • Error syndrome: Indicates an error in a received combination of data bits.
    • Parity bits: Redundant check bits added to the data so that errors can be detected.
    • Data bits: Represent the actual data being transmitted.
    • Entropy: A measure of uncertainty.
    • Encoding: The process of converting data into a specific format.
    • Information: The content or data being transmitted.
    • Redundancy: Extra data added for error detection and correction.

    Allowable Combinations

    • A code with allowable combinations 101 and 010.

      • Error combination 001: Allowable combination is 101
      • Error combination 100: Allowable combination is 101
      • Error combination 011: Allowable combination is 010
      • Error combination 110: Allowable combination is 010
      • Error combination 000: Allowable combination is 010
      • Error combination 111: Allowable combination is 101
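Nearest-neighbour decoding for this two-codeword code can be sketched in Python (illustrative; a tie in distance would mean the error is not correctable):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def decode(received, codewords):
    """Return the unique nearest codeword, or None on a distance tie."""
    dists = sorted((hamming_distance(received, c), c) for c in codewords)
    if len(dists) > 1 and dists[0][0] == dists[1][0]:
        return None
    return dists[0][1]

code = ["101", "010"]
# decode("001", code) gives "101": distance 1 to 101 vs distance 2 to 010.
```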

    Hamming Code

    • A codeword in the Hamming code consists of data and parity bits.
    • ASCII code is a fixed-length code.
    • Code rate is a measure of efficiency, usually given as k/n where k is the number of data bits and n is the total number of bits.

    Additional Concepts

    • Code Rate: Ratio of information bits to total bits, k/n.
    • Redundancy: Extra bits in a code to detect/correct errors.
    • Efficiency: Ratio of the average information per symbol to the maximum possible, E = Iavr/Imax.
    • Hamming Distance: Number of bit positions in which two codewords differ; the minimum distance dmin is the smallest distance between any pair of codewords.
    • How many errors a code can correct/detect depends on dmin.
    • Average Length of Codewords (qavr): Calculated using Σ(pi * qi) for specific probabilities.
    • Entropy (H): Computed using −Σ(pi * log2(pi))

    Calculation of Errors

    • Determining the number of errors a code can correct or detect given the minimum Hamming distance (dmin).
    • The number of errors that can be detected by a code equals dmin − 1.
    • The number of errors that can be corrected by a code depends on its minimum Hamming distance and is equal to ⌊(dmin − 1)/2⌋.
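These two rules can be written down directly (a small Python sketch, not part of the notes):

```python
def detectable_errors(d_min):
    """A code with minimum distance d_min detects up to d_min - 1 errors."""
    return d_min - 1

def correctable_errors(d_min):
    """...and corrects up to floor((d_min - 1) / 2) errors."""
    return (d_min - 1) // 2

# d_min = 1 (quiz question above): detects 0 and corrects 0 errors.
# d_min = 3 (e.g. the {101, 010} code): detects 2 and corrects 1 error.
```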


    Description

    This quiz covers essential concepts related to error detection and correction codes, including syndromes, parity bits, and Hamming codes. It introduces the mechanics of data transmission and discusses allowable combinations for error scenarios. Test your understanding of these vital information theory concepts.
