Final Information Theory Base PDF
Summary
This document contains a large number of questions on information theory, error correction codes, and coding theory.
Full Transcript
1.______ indicate(s) an error in a received combination. A) Parity bits B) Error syndrome C) Data bits D) None of the given 2.... is a measure of uncertainty A) Encoding B) Entropy C) Information D) Redundancy 3. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 001? A) 101 B) 010 C) 001 D) None 4. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 100? A) 101 B) 010 C) 100 D) None 5. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 000? A) 010 B) 101 C) 000 D) None 6. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 111? A) 101 B) 010 C) 111 D) None 7. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 011? A) 010 B) 101 C) 011 D) None 8. A code has two allowable combinations 101 and 010. What is the allowable combination for the error combination 110? A) 010 B) 101 C) 110 D) None 9. A codeword of the Hamming code consists of ________ and ________ bits. A) data; parity B) with errors; without errors C) allowable; not allowable D) none of the given 10. A Huffman code is a = 1, b = 000, c = 001, d = 01. Probabilities are p(a) = 0.4, p(b) = 0.1, p(c) = 0.2, p(d) = 0.3. The average length of codewords q is A) 2.1 bit B) 1.9 bit C) 2.0 bit D) 8.0 bit 11. A redundancy of a code S =... A) 1 - Iavr/Imax B) Iavr/Imax C) 1 + Iavr/Imax D) Imax/Iavr 12. An average length of codewords qavr =... A) ∑ (pi*qi) B) ∑ (pi/qi) C) ∑pi / n D) ∑qi / n 13. An efficiency of a code E =... A) Iavr/Imax B) Imax/Iavr C) Iavr/100 D) Imax - Iavr 14. ASCII code is a A) Variable length code B) Fixed length code C) Error-correction code D) None of the given 15. By the Bayes' rule for conditional entropy H(Y|X) =... A) H(X|Y) - H(X) + H(Y) B) [P(A)] /P(B) C) H(X|Y) - H(X) D) H(X|Y)+ H(Y) 16. 
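Several of the questions above (2, 10, 12, 13) reduce to two formulas: the average codeword length qavr = ∑(pi*qi) and the Shannon entropy H = -∑(pi*log2 pi). A minimal Python sketch using the code and probabilities from question 10 (the values come from the question; the variable and function names are mine):

```python
import math

# Average codeword length q_avr = sum(p_i * q_i) (question 12),
# using the Huffman code and probabilities from question 10.
code = {"a": "1", "b": "000", "c": "001", "d": "01"}
probs = {"a": 0.4, "b": 0.1, "c": 0.2, "d": 0.3}

q_avr = sum(probs[s] * len(code[s]) for s in code)
print(round(q_avr, 6))  # 1.9 bits -> answer B of question 10

# Shannon entropy H = -sum(p_i * log2 p_i), the "measure of
# uncertainty" of question 2; it peaks for equiprobable outcomes.
def entropy(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit, the maximum for two outcomes
```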
By the Bayes' theorem... A) P(B|A) = P(A and B)/P(A) B) P(A|B) = [P(B|A)][P(A)] /P(B) C) P(B|A) = P(A and B)*P(A) D) P(A|B) = [P(B|A)][P(A)] * P(B) 17. By the Chain rule H(X,Y) = H(Y|X) +... A) H(X) B) H(Y) C) H(Y|X) D) H(X|Y) 18. By the Hartley's formula the amount of information I =... A) I = n*log m B) I = m*n C) I = log (m/n) D) I = log (m*n) 19. By the Hartley's formula the entropy H =... A) H = - ∑(pi*log pi) B) H = - ∑ (log pi) C) H = log m D) H = - ∑ (pi/log pi) 20. By the property of joint entropy A) H(X,Y) <= H(X) and H(X,Y) <= H(Y) B) H(X,Y) >= H(X) and H(X,Y) <= H(Y) C) H(X,Y) >= H(X) and H(X,Y) >= H(Y) D) H(X,Y) >= H(X) + H(Y) 22. By the Shannon's formula the amount of information I =... A) I = - n * ∑(pi*log pi) B) I = - n * ∑ (log pi) C) I = - n * ∑ pi D) I = - n * ∑ (pi/log pi) 23. By the Shannon's formula the entropy H =... A) H = - ∑(pi*log pi) B) H = - ∑ (log pi) C) H = - ∑ pi D) H = - ∑ (pi/log pi) 24. Calculate the code rate for Hamming (15,11) code A) 1 B) 0,733 C) 0,571 D) 0,839 25. Calculate the code rate for Hamming (31,26) code A) 1 B) 0,839 C) 0,733 D) 0,571 26. Calculate the code rate for Hamming (7,4) code A) 1 B) 0,571 C) 0,733 D) 0,839 27. Calculate the efficiency of the language if it has 32 letters and its I average is 1 bit. A) 0,8 B) 0,2 C) 5 D) 1 28. Calculate the redundancy of the language if it has 32 letters and its I average is 1 bit. A) 0,8 B) 0,2 C) 5 D) 1 29. Choose an example of block code A) Shannon-Fano code B) Huffman code C) Hamming code D) None of the given 30. Choose conditions of an optimal coding (p – probability, l – length of a code word) A) pi < pj and li <= lj B) pi > pj and li <= lj C) pi > pj and li >= lj D) none of the given 31. Choose the formula to create the Hamming code A) (n, k) = (2^r - 1, 2^r - 1 - r) B) (n, k) = (2^r, 2^r - 1 - r) C) (n, k) = (2^r - 1, 2^r - r) D) (n, k) = (2^r - 1, 2^r - 1 + r) 32. Choose the formula to determine the number N of possible messages with length n if the message source alphabet consists of m characters, each of which can be an element of the message.
A) N = m^n B) N = n^m C) N = m*n D) N = log m. Correct answer: N = m^n. 33. Code has dmin = 1. How many errors can be corrected by this code? A) 2 B) 3 C) 0 D) 1 34. Code has dmin = 1. How many errors can be detected by this code? A) 2 B) 3 C) 0 D) 1 35. Code has dmin = 10. How many errors can be detected by this code? A) 4 B) 8 C) 9 D) 10 36. Code has dmin = 11. How many errors can be corrected by this code? A) 11 B) 7 C) 5 D) 10 37. Code has dmin = 11. How many errors can be detected by this code? A) 5 B) 9 C) 10 D) 11 38. Code has dmin = 12. How many errors can be detected by this code? A) 5 B) 10 C) 11 D) 12 39. Code has dmin = 2. How many errors can be corrected by this code? A) 2 B) 3 C) 0 D) 1 40. Code has dmin = 2. How many errors can be detected by this code? A) 2 B) 3 C) 1 D) 0 41. Code has dmin = 3. How many errors can be corrected by this code? A) 2 B) 3 C) 1 D) 4 42. Code has dmin = 3. How many errors can be detected by this code? A) 1 B) 3 C) 2 D) 4 43. Code has dmin = 4. How many errors can be detected by this code? A) 5 B) 1 C) 3 D) 4 44. Code has dmin = 5. How many errors can be corrected by this code? A) 5 B) 3 C) 2 D) 4 45. Code has dmin = 5. How many errors can be detected by this code? A) 6 B) 2 C) 4 D) 5 46. Code has dmin = 6. How many errors can be detected by this code? A) 6 B) 2 C) 5 D) 4 47. Code has dmin = 7. How many errors can be corrected by this code? A) 5 B) 6 C) 3 D) 4 48. Code has dmin = 7. How many errors can be detected by this code? A) 7 B) 3 C) 6 D) 5 49. Code has dmin = 8. How many errors can be detected by this code? A) 8 B) 6 C) 7 D) 3 50. Code has dmin = 9. How many errors can be corrected by this code? A) 5 B) 7 C) 4 D) 8 51. Code has dmin = 9. How many errors can be detected by this code? A) 7 B) 9 C) 8 D) 4 52. Code rate R (k information bits and n total bits) is defined as A) k = n/R B) R = k * n C) R = k/n D) n = R * k 53.
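Questions 24-26 and 33-52 all follow from two facts: a code with minimum distance dmin detects up to dmin - 1 errors and corrects up to floor((dmin - 1)/2) errors, and the code rate is R = k/n. A quick Python check (the helper names are mine):

```python
# Detect up to d_min - 1 errors (from d_min >= r + 1); correct up to
# floor((d_min - 1) / 2) errors (from d_min >= 2s + 1).
def detectable(d_min):
    return d_min - 1

def correctable(d_min):
    return (d_min - 1) // 2

print(detectable(5), correctable(5))    # 4 2   (questions 45 and 44)
print(detectable(11), correctable(11))  # 10 5  (questions 37 and 36)

# Code rate R = k/n (question 52) for the three Hamming codes above:
for n, k in [(7, 4), (15, 11), (31, 26)]:
    print(f"Hamming ({n},{k}): R = {k / n:.3f}")
# 0.571, 0.733 and 0.839 -> the (31,26) code has the highest rate
```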
Conditional entropy H(Y|X) lies between A) - H(Y) and 0 B) 0 and H(Y) C) - H(Y) and H(Y) D) 0 and 1 54. Converting the message into a signal suitable for transmission over the channel of communication is referred to as … A) Encoding B) Decoding C) Entropy D) Redundancy 55. Determine the Hamming distance for a code that can detect 3 errors and correct 2 errors. A) 6 B) 5 C) 7 D) 9 56. Determine the Hamming distance for a code that can detect 3 errors and correct 1 error. A) 5 B) 4 C) 6 D) 8 57. Determine the Hamming distance for a code that can detect 5 errors and correct 3 errors. A) 9 B) 8 C) 10 D) 14 58. Encode a string "0000" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0000001 B) 0000111 C) 0000000 D) 0000101 59. Encode a string "0001" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0001010 B) 0001001 C) 0001011 D) 0001111 60. Encode a string "0010" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0010010 B) 0010111 C) 0010110 D) 0010100 61. Encode a string "0011" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0011100 B) 0011001 C) 0011101 D) 0011111 62. Encode a string "0100" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0100011 B) 0100110 C) 0100111 D) 0100101 63. Encode a string "0101" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0101101 B) 0101000 C) 0101100 D) 0101110 64. Encode a string "0110" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0110101 B) 0110011 C) 0110001 D) 0110000 65. Encode a string "0111" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 0111110 B) 0111000 C) 0111010 D) 0111011 66. Encode a string "1000" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1000111 B) 1000100 C) 1000101 D) 1000001 67.
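For questions 55-57, a code that must simultaneously detect r errors and correct s errors (with r >= s) needs dmin >= s + r + 1, which is the formula of question 107. A one-line check (the function name is mine):

```python
# Minimum Hamming distance needed to detect r errors AND correct
# s errors simultaneously (r >= s): d_min >= s + r + 1.
def dmin_needed(detect_r, correct_s):
    return correct_s + detect_r + 1

print(dmin_needed(3, 2))  # 6 (question 55)
print(dmin_needed(3, 1))  # 5 (question 56)
print(dmin_needed(5, 3))  # 9 (question 57)
```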
Encode a string "1001" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1001111 B) 1001010 C) 1001110 D) 1001100 68. Encode a string "1010" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1010111 B) 1010001 C) 1010011 D) 1010010 69. Encode a string "1011" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1011100 B) 1011010 C) 1011000 D) 1011001 70. Encode a string "1100" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1100110 B) 1100000 C) 1100010 D) 1100011 71. Encode a string "1101" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1101101 B) 1101011 C) 1101001 D) 1101000 72. Encode a string "1110" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1110000 B) 1110101 C) 1110100 D) 1110110 73. Encode a string "1111" with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) 1111110 B) 1111011 C) 1111111 D) 1111101 74. The length of the message is 16 symbols and the message’s alphabet consists of 4 symbols. Find the amount of information in this message. A) 16 B) 80 C) 64 D) 32 75. The efficiency of the language is 0,5 and its I average is equal to 1 bit. Calculate the number of letters in this language’s alphabet? A) 64 B) 32 C) 4 D) 16 76. Elements of alphabets X and Y are statistically related. It is known that H(X)=4 bits and H(Y)=10 bits. What is the range of variation of the conditional entropy H(Y|X) as H(X|Y) changes from its min to max? A) (from 7 to 11) B) (from 4 to 12) C) (from 6 to 10) D) (from 6 to 11) 77. For Hamming distance dmin and r errors in the received word, the condition to be able to detect the errors is A) dmin>= r+1 B) dmin>= 2r+1 C) dmin>= 2r+2 D) dmin>= r+2 78.
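The encoding questions (58-73) all use the layout (i1, i2, i3, i4, r1, r2, r3), but the source never spells out the parity-check equations. The sketch below assumes r1 = i1⊕i2⊕i3, r2 = i2⊕i3⊕i4, r3 = i1⊕i2⊕i4, a common (7,4) convention that is consistent with the answer options listed above:

```python
# Hamming (7,4) encoder for the codeword layout (i1, i2, i3, i4, r1, r2, r3).
# Assumed parity-check convention (not stated in the source, but consistent
# with the answer options): r1 = i1^i2^i3, r2 = i2^i3^i4, r3 = i1^i2^i4.
def hamming74_encode(data):
    i1, i2, i3, i4 = (int(b) for b in data)
    r1 = i1 ^ i2 ^ i3
    r2 = i2 ^ i3 ^ i4
    r3 = i1 ^ i2 ^ i4
    return f"{i1}{i2}{i3}{i4}{r1}{r2}{r3}"

print(hamming74_encode("0001"))  # 0001011 (question 59)
print(hamming74_encode("0110"))  # 0110001 (question 64)
print(hamming74_encode("1111"))  # 1111111 (question 73)
```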
For Hamming distance dmin and s errors in the received word, the condition to be able to correct the errors is A) dmin>= s+1 B) dmin>= 2s+1 C) dmin>= 2s+2 D) dmin>= s+2 79. Hamming (7,4) code can correct ___ error(s) A) 2 B) 3 C) 1 D) 0 80. Hamming distance can easily be found with... A) XNOR operation B) XOR operation C) OR operation D) AND operation 81. How does noise affect the data? A) change only the 0 to 1 B) change only the 1 to 0 C) change the 0 to 1 and the 1 to 0 D) None of the above 82. How many data bits are in the (15, 11) Hamming code? A) 11 B) 4 C) 15 D) 5 83. How many data bits are in the (31, 26) Hamming code? A) 26 B) 31 C) 5 D) 4 84. How many data bits are in the (7, 4) Hamming code? A) 4 B) 3 C) 7 D) 10 85. How many parity bits are in the (15, 11) Hamming code? A) 4 B) 15 C) 11 D) 5 86. How many parity bits are in the (31, 26) Hamming code? A) 26 B) 31 C) 5 D) 4 87. How many parity bits are in the (7, 4) Hamming code? A) 3 B) 4 C) 7 D) 11 88. A Huffman code is a = 0, b = 10, c = 110, d = 1110, e = 1111. Probabilities are p(a) = 0.50, p(b) = 0.30, p(c) = 0.15, p(d) = 0.03, p(e) = 0.02. The average length of codewords is A) 1.75 bit B) 2.0 bit C) 1.3 bit D) 1.7 bit 89. Which letter will get the shortest codeword after Huffman coding of the word "bbaacccabaac"? A) a B) b C) c D) none 90. An alphabet consists of the letters a, b, c, d, e and f. The probability of occurrence is p(a) = 0.06, p(b) = 0.15, p(c) = 0.4 and p(d) = 0.18, p(e)=0.17, p(f)=0.04. The Huffman code is A) c=1,d=000,e=001,b=010,a=0110,f=0111 B) c=0,d=111,e=110,b=101,a=1001,f=1000 C) c=1,d=01,e=001,b=0000,a=00010,f=00011 D) c=1,d=01,e=001,b=000,a=0010,f=00011 E) c=0,d=101,e=110,b=101,a=1000,f=1001 91. If k - number of bits before Hamming encoding and n - number of bits after Hamming encoding then A) k > n B) k < n C) k = n D) k = 1/2 n 92. In a digital communication system, the smaller the code rate, the ... the redundant bits are. A) less B) equal C) more D) unpredictable 93.
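Questions 88-90, 108-109 and 144 build Huffman codes. A compact sketch with Python's heapq (the function name is mine; tie-breaking, and therefore the exact bit patterns, are implementation choices, so only the codeword lengths are forced):

```python
import heapq

# Build a Huffman code with a min-heap: repeatedly merge the two least
# probable subtrees, prefixing one side with "0" and the other with "1".
# More probable symbols end up with shorter codewords.
def huffman_code(probs):
    heap = [[p, i, {s: ""}] for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)  # unique tie-breaker so dicts are never compared
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, [p0 + p1, counter, merged])
        counter += 1
    return heap[0][2]

# Question 108: the dyadic probabilities force lengths 1, 2, 3, 4, 5, 5.
code = huffman_code({"a": 1/2, "b": 1/4, "c": 1/8,
                     "d": 1/16, "e": 1/32, "f": 1/32})
print({s: len(w) for s, w in code.items()})
```

For question 108 this gives lengths 1, 2, 3, 4, 5, 5 (matching option B), and the resulting average length ∑(pi*li) is 1.9375 ≈ 1,9 bits, the closest option in question 109.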
Main idea of error control codes is A) To add some redundancy B) To delete some redundancy C) To double all bits D) None of the given 94. Noise affects... A) information source B) receiver C) channel D) transmitter 95. Shannon-Fano and Huffman codes are an encoding algorithms used for A) lossy data compression B) lossless data compression C) error correction D) error detection 96. Specify the case when entropy is maximum A) p1=0,5 and p2=0,5 B) p1=1 and p2=0 C) p1=0 and p2=1 D) p1=0,9 and p2=0,1 97. Specify the error position in the string "0001110", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) i4 B) i1 C) i2 D) r2 98. Specify the error position in the string "1000110", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) i1 B) i4 C) i2 D) i3 99. Specify the error position in the string "1001010", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) r2 B) r1 C) r3 D) i3 100. Specify the error position in the string "1001100", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) r3 B) r2 C) r1 D) no error 101. Specify the error position in the string "1001110", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) r1 B) no error C) r2 D) i4 102. Specify the error position in the string "1001111", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) r1 B) r3 C) r2 D) i4 103. Specify the error position in the string "1011110", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) i1 B) i3 C) i2 D) i4 104. 
Specify the error position in the string "1101110", if the initial string was encoded with Hamming (7,4) code using the following structure (i1, i2, i3, i4, r1, r2, r3) A) i1 B) i2 C) i3 D) i4 105. Specify the formula to find the amount of information if events have different probabilities. A) Hartley's formula B) Shannon's formula C) Fano's formula D) Bayes' formula 106. Specify the formula to find the amount of information if events have the same probabilities. A) Shannon's formula B) Hartley's formula C) Fano's formula D) Bayes' formula 107. Specify the right formula if dmin is Hamming distance, s - number of correctable errors and r - number of detectable errors. A) dmin>= s+r+1 B) dmin>= 2s+r+1 C) dmin>= s+2r+1 D) dmin>= s+r+2 108. Suppose the letters a, b, c, d, e, f have probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32 respectively. Which of the following is the Huffman code for the letters a, b, c, d, e, f? A) 11, 10, 011, 010, 001, 000 B) 0, 10, 110, 1110, 11110, 11111 C) 11, 10, 01, 001, 0001, 0000 D) 110, 100, 010, 000, 001, 111 109. Suppose the letters a, b, c, d, e, f have probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32 respectively. What is the average length q of the Huffman code? A) 3,0 B) 1,9 C) 2,7 D) 4,3 110. The amount of information in the message is 120 bits. Calculate the length of this message, which is written with characters of a 16-character alphabet. A) 30 B) 480 C) 120 D) 130 111. The amount of information in the message is 60 bits. Calculate the length of this message, which is written with characters of a 4-character alphabet. A) 30 B) 60 C) 15 D) 510 112. The basic idea behind Shannon-Fano coding is to A) compress data by using more bits to encode more frequently occurring characters B) compress data by using fewer bits to encode more frequently occurring characters C) compress data by using fewer bits to encode less frequently occurring characters D) expand data by using fewer bits to encode more frequently occurring characters 113.
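Questions 97-104 and 127-134 ask for the error position in a received (7,4) word. The source never states the check equations, so the sketch below assumes the same parity convention as the encoding questions (r1 = i1⊕i2⊕i3, r2 = i2⊕i3⊕i4, r3 = i1⊕i2⊕i4), which is consistent with the listed answer options. Each recomputed check that fails sets one syndrome bit, and the pattern of failing checks points at a unique bit:

```python
# Syndrome decoding for the (7,4) layout (i1, i2, i3, i4, r1, r2, r3),
# assuming the checks r1 = i1^i2^i3, r2 = i2^i3^i4, r3 = i1^i2^i4
# (not stated in the source, but consistent with the answer options).
# Syndrome bit sk = 1 iff check k fails; the failure pattern identifies
# the single flipped bit.
SYNDROME_TO_POSITION = {
    "000": "no error",
    "100": "r1", "010": "r2", "001": "r3",
    "101": "i1", "111": "i2", "110": "i3", "011": "i4",
}

def hamming74_syndrome(word):
    i1, i2, i3, i4, r1, r2, r3 = (int(b) for b in word)
    s1 = r1 ^ i1 ^ i2 ^ i3
    s2 = r2 ^ i2 ^ i3 ^ i4
    s3 = r3 ^ i1 ^ i2 ^ i4
    return f"{s1}{s2}{s3}"

for word in ("1001110", "1001111"):
    s = hamming74_syndrome(word)
    print(word, s, SYNDROME_TO_POSITION[s])
# 1001110 -> 000, no error   (question 101)
# 1001111 -> 001, error in r3 (question 102)
```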
The efficiency of the language is 0,25 and its I average is 1 bit. Calculate the number of letters in this language's alphabet? A) 32 B) 16 C) 8 D) 64 114. The first code combination is 0000 and the Hamming distance of this code equals 4. Choose the second combination. A) 1111 B) 1011 C) 0011 D) 0000 115. The Hamming code is a method of _______. A) Error control coding B) Optimal coding C) None of the above 116. The Hamming distance between "client" and "server" is A) 0 B) 1 C) 6 D) impossible to detect 117. The Hamming distance between "make" and "made" is A) 4 B) 3 C) 1 D) impossible to detect 118. The Hamming distance between "push" and "pull" is A) 0 B) 4 C) 2 D) impossible to detect 119. The Hamming distance between "starting" and "finishing" is A) 4 B) 3 C) impossible to detect D) 5 120. The Hamming distance between 001111 and 010011 is A) 1 B) 2 C) 3 D) 4 121. The Hamming distance between 010111 and 010011 is A) 2 B) 3 C) 1 D) 4 122. The Hamming distance between 011111 and 010011 is A) 1 B) 3 C) 2 D) 4 123. The Hamming distance between 101001 and 010011 is A) 1 B) 2 C) 4 D) 3 124. The length of the message is 16 symbols and the message's alphabet consists of 32 symbols. Find the amount of information in this message. A) 80 B) 16 C) 64 D) 32 125. The length of the message is 6 symbols and the message's alphabet consists of 32 symbols. Find the amount of information in this message. A) 30 B) 6 C) 32 D) 24 126. The redundancy of the language is 0,75 and its I average is 1 bit. Calculate the number of letters in this language's alphabet? A) 32 B) 16 C) 8 D) 64 127. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After a channel the error syndrome is 000. Specify the position of the error. A) i1 B) r1 C) no error D) r3 128. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After a channel the error syndrome is 001. Specify the position of the error. 
A) i1 B) r1 C) r3 D) no error 129. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 010. Specify the position of the error. A) r3 B) r1 C) r2 D) i2 130. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 011. Specify the position of the error. A) r4 B) i1 C) i4 D) r1 131. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 100. Specify the position of the error. A) r3 B) i1 C) r1 D) r2 132. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 101. Specify the position of the error. A) no error B) r1 C) i1 D) i2 133. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 110. Specify the position of the error. A) i4 B) r3 C) i3 D) i1 134. The string was encoded with Hamming (7,4) code using the structure (i1, i2, i3, i4, r1, r2, r3). After the channel the error syndrome is 111. Specify the position of the error. A) i4 B) r2 C) i2 D) no error 135. This is a method of data processing for reducing errors during transmission over a channel with noise. A) Error Correction code B) Uniform code C) Non-uniform code D) Optimal code 136. We can divide coding schemes into two broad categories: ________ and ______ coding. A) block; linear B) linear; nonlinear C) block; convolution D) none of the given 137. What is the first step of the Shannon-Fano algorithm? A) Characters of the original alphabet are set in descending order of probability. B) Letters are divided into two subsets so that the overall probabilities of these subsets are about equal. C) For all characters (letters) of the top subset assign the code element 1, and for characters of the lower subset the code element 0.
D) For all characters (letters) of the top subset assign the code element 0, and for characters of the lower subset the code element 1. 138. What is the Hamming distance between two strings of equal length? A) the number of positions at which the corresponding symbols are different B) the number of positions at which the corresponding symbols are equal C) the number of identical symbols in the first string D) the number of identical symbols in the second string 139. What is the meaning of the number "2" in the formula I = n*log2 m? A) Binary number system B) The message length equals 2 C) It has no meaning D) Information is measured in nits 140. What is the sample space of one die roll? A) {1,2,3,4,5,6} B) {1,3,5} C) {2,4,6} D) {1,2,3,4,5,6,7,8,9,10,11,12} 141. When the base of the logarithm is 10, then the unit of measure of information is A) bytes B) dits C) nits D) bits 142. When the base of the logarithm is 2, then the unit of measure of information is A) bytes B) bits C) nits D) dits 143. When the base of the logarithm is e, then the unit of measure of information is A) bytes B) nits C) dits D) bits 144. Which letter will get the shortest codeword after Huffman coding of the word "abracadabra"? A) c B) r C) d D) a 145. Which of the following codes can be the Huffman code for the letters a,b,c,d,e? A) 10,011,11,001,010 B) 0,10,110,1110,1111 C) 10,01,0001,100,1010 D) 100,110,001,000,010 146. Which of the following codes has the highest code rate? A) code rate is constant for all of the Hamming codes B) Hamming (31,26) C) Hamming (15,11) D) Hamming (7,4) 147. Which of the following codes has the highest redundancy? A) redundancy is constant for all of the Hamming codes B) Hamming (7,4) C) Hamming (15,11) D) Hamming (31,26) 148. Which of the following codes is prefix? A) 0, 111, 11 B) 0, 111, 10 C) 0, 101, 10 D) 00, 10, 101 149. Which of the following codes is prefix? A) 0, 01, 11 B) 0, 10, 11 C) 0, 10, 1 D) 0, 01, 001 150. Which of the following codes is uniform?
A) ASCII B) Shannon-Fano C) Huffman D) None of the given 151. Which of the following codes is uniform? A) 10,011,11,001,010 B) 0,10,110,1110,1111 C) 10,01,0001,100,1010 D) 100,110,001,000,010 152. Which of the following symbols will get the shortest codeword after Shannon-Fano coding if probabilities are p(a) = 0.05, p(b) = 0.6, p(c) = 0.2 and p(d) = 0.15? A) c B) a C) d D) b
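The remaining question types — Hamming distance between strings (80, 116-123, 138), the prefix property (148-149) and Shannon-Fano coding (112, 137, 152) — can be checked with the short helpers below. All names are mine, and for Shannon-Fano the split rule and which half receives 0 or 1 are conventions, so only the codeword lengths should be compared:

```python
# Hamming distance: the number of positions at which two equal-length
# strings differ (question 138); for binary words this equals the
# popcount of their XOR (question 80).
def hamming_distance(a, b):
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

# Prefix (prefix-free) code: no codeword is a prefix of another.
def is_prefix_code(words):
    return not any(a != b and b.startswith(a) for a in words for b in words)

# Shannon-Fano sketch (question 137): sort by falling probability, split
# where the two halves' total probabilities are closest, give one bit to
# each half, recurse.
def shannon_fano(items):
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    acc, best, split = 0.0, float("inf"), 1
    for k in range(1, len(items)):
        acc += items[k - 1][1]
        if abs(total - 2 * acc) < best:
            best, split = abs(total - 2 * acc), k
    code = {s: "0" + w for s, w in shannon_fano(items[:split]).items()}
    code.update({s: "1" + w for s, w in shannon_fano(items[split:]).items()})
    return code

print(hamming_distance("make", "made"))      # 1 (question 117)
print(hamming_distance("101001", "010011"))  # 4 (question 123)
print(is_prefix_code(["0", "111", "10"]))    # True  (question 148, option B)
print(is_prefix_code(["0", "111", "11"]))    # False ("11" is a prefix of "111")

# Question 152: the most probable symbol, b with p(b)=0.6, gets the
# shortest codeword.
sf = shannon_fano(sorted({"a": 0.05, "b": 0.6, "c": 0.2, "d": 0.15}.items(),
                         key=lambda kv: -kv[1]))
print(min(sf, key=lambda s: len(sf[s])))     # b
```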