Information Theory and Coding Lecture Notes PDF

Document Details

Uploaded by TimeHonoredNovaculite429

National Institute of Technology Calicut

Dr. Waquar Ahmad

Tags

information theory coding theory mathematics lectures

Summary

These lecture notes cover Information Theory and Coding, focusing on topics like Jensen's inequality, convex and concave functions, and information inequality. The notes are suitable for undergraduate-level students.

Full Transcript

Information Theory and Coding
Dr. Waquar Ahmad
National Institute of Technology Calicut
[email protected]

JENSEN'S INEQUALITY

A function f(x) is said to be convex over an interval (a, b) if for every x1, x2 ∈ (a, b) and 0 ≤ λ ≤ 1,

    f(λx1 + (1 − λ)x2) ≤ λf(x1) + (1 − λ)f(x2)

A function f is said to be strictly convex if equality holds only when λ = 0 or λ = 1 (for x1 ≠ x2).

Definition: A function f is concave if −f is convex.

A convex function always lies below any chord; a concave function always lies above any chord.

Examples of convex functions include x², |x|, and exp(x). Examples of concave functions include log x and √x for x ≥ 0.

CONVEX AND CONCAVE FUNCTIONS

If the function f has a second derivative that is non-negative (positive) over an interval, the function is convex (strictly convex) over that interval.

Jensen's inequality

If f is a convex function and X is a random variable, then

    E[f(X)] ≥ f(E[X])

Moreover, if f is strictly convex, equality implies that X = E[X] with probability 1 (i.e., X is a constant).

Information inequality

Let p(x), q(x), x ∈ X, be two probability mass functions. Then

    D(p||q) ≥ 0

with equality if and only if p(x) = q(x) for all x.

Proof sketch: write −D(p||q) = Σ_x p(x) log(q(x)/p(x)); applying Jensen's inequality to the concave function log gives −D(p||q) ≤ log Σ_x q(x) ≤ log 1 = 0.

Upper bound on entropy

    H(X) ≤ log|X|

where |X| denotes the number of elements in the range of X, with equality if and only if X has a uniform distribution over X.

Proof sketch: let u(x) = 1/|X| be the uniform pmf over X; then D(p||u) = log|X| − H(X), which is non-negative by the information inequality.

Conditioning reduces entropy

    H(X|Y) ≤ H(X)

with equality if and only if X and Y are independent.

Independence bound on entropy

Let X1, X2, ..., Xn be drawn according to p(x1, x2, ..., xn). Then

    H(X1, X2, ..., Xn) ≤ Σ_{i=1}^{n} H(Xi)

with equality if and only if the Xi are independent.
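As a quick numerical sanity check of Jensen's inequality (not part of the notes), the sketch below samples a random variable and compares E[f(X)] with f(E[X]) for the convex function f(x) = x². The Gaussian distribution and sample size are illustrative choices, not from the lecture.

```python
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # illustrative X

def f(x):
    return x * x  # convex on all of R

Ef = sum(f(x) for x in samples) / len(samples)  # empirical E[f(X)]
fE = f(sum(samples) / len(samples))             # f(empirical E[X])

# Jensen's inequality predicts E[f(X)] >= f(E[X]).
print(Ef, fE)
```

For a strictly convex f and a non-constant X, the gap E[f(X)] − f(E[X]) is strictly positive; here it is essentially Var(X).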
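The information inequality D(p||q) ≥ 0 can also be checked directly. The following sketch (not from the notes; the example pmfs are made up) computes relative entropy in bits, assuming q(x) > 0 wherever p(x) > 0.

```python
from math import log2

def kl_divergence(p, q):
    """D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(px * log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.25, 0.25]   # illustrative pmf
q = [1/3, 1/3, 1/3]     # uniform pmf on the same alphabet

print(kl_divergence(p, q))  # non-negative
print(kl_divergence(p, p))  # zero: equality holds iff p = q
```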
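The bound H(X) ≤ log|X| can be illustrated numerically as well. This sketch (example pmfs are illustrative, not from the notes) computes entropy in bits for a skewed and a uniform distribution on a three-symbol alphabet.

```python
from math import log2

def entropy(p):
    """H(X) = -sum p(x) log2 p(x), in bits."""
    return -sum(px * log2(px) for px in p if px > 0)

skewed = [0.7, 0.2, 0.1]        # illustrative non-uniform pmf
uniform = [1/3, 1/3, 1/3]       # uniform pmf, |X| = 3

print(entropy(skewed), log2(3))   # H(X) stays below log|X|
print(entropy(uniform), log2(3))  # equality at the uniform distribution
```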
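The last two results can be checked together on a small joint distribution. The joint pmf below is a hypothetical example with dependent X and Y; H(X|Y) is obtained via the chain rule H(X, Y) = H(Y) + H(X|Y), which the notes' results are consistent with.

```python
from math import log2

def entropy(p):
    """Entropy in bits of a pmf given as a list of probabilities."""
    return -sum(v * log2(v) for v in p if v > 0)

# Hypothetical joint pmf p(x, y) on {0,1} x {0,1} with dependent X, Y.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = [sum(v for (x, y), v in joint.items() if x == x0) for x0 in (0, 1)]
py = [sum(v for (x, y), v in joint.items() if y == y0) for y0 in (0, 1)]

H_joint = entropy(list(joint.values()))
H_X, H_Y = entropy(px), entropy(py)
H_X_given_Y = H_joint - H_Y  # chain rule: H(X,Y) = H(Y) + H(X|Y)

print(H_X_given_Y, H_X)      # conditioning reduces entropy
print(H_joint, H_X + H_Y)    # independence bound on entropy
```

Both gaps close exactly when the joint pmf factors as p(x)p(y), matching the equality conditions stated above.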
