Random Variable Lecture Notes

Summary

These lecture notes provide a comprehensive introduction to random variables, covering topics such as probability, random signals, and random processes. They are suitable for undergraduate-level study in mathematics, statistics, or engineering.

Full Transcript


Introduction to Random Variables

Introduction
Systems are deterministic, not random; signals are random.

Random signal: a random signal is one whose variation with respect to the independent variable cannot be predicted (equivalently, the amplitude of the signal varies arbitrarily with respect to the independent variable). Example: coloured noise. The amplitude of a random signal cannot be predicted, and for random signals it is not of primary interest either. Alternative ways of characterizing random signals are:
- average power
- frequency spectrum
- statistical measures such as probability

Random signal modelling
Two types of model are used: the probabilistic model and the statistical model. The probabilistic model is more precise, but in general probability alone is not used in system analysis. Usually one moves from the probability model to a statistical model of the signal in terms of the average (mean), variance, correlation function, spectral density, etc.

Introduction to Probability
Experiment: a collection of events. Event: the outcome of one step of the experiment. Probability is linked with the frequency of occurrence of an event; if the outcome of an experiment is uncertain, it is a random event.

Example: if a coin is tossed a sufficiently large number of times, the probability of Head is almost 1/2, and likewise for Tail, i.e. P(H) = 1/2 = P(T). So in N trials, Head occurs about N/2 times and Tail about N/2 times.

More generally,

P(A) = N_A / N,

where N_A is the number of times that event A is likely to occur and N is the number of experiments. The actual number of times that A occurs, N_A', may not equal N_A. If the sample size is sufficiently large, the probability estimated from the number of times the event actually occurs, N_A'/N, becomes more accurate.

Probability is a number between 0 and 1. If an event occurs every time, its probability of occurrence is 1:
P(A) = 1 …… certain event
P(A) = 0 …… impossible event
i.e. 0 ≤ P(A) ≤ 1.

Discrete Probability
For tossing a coin, P(H) = P(T) = 1/2. For a die, the probability of occurrence of any single face is P(1) = 1/6, as there are six possibilities, and P(1) = P(2) = …… = P(6) = 1/6. There may be N experiments and M possible outcomes, with N ≥ M; otherwise all outcomes may not be visible.

Suppose outcome 1 occurs N1 times, outcome 2 occurs N2 times, and so on. Then

N1 + N2 + …… + NM = N,
i.e. (N1 + N2 + …… + NM)/N = 1,
i.e. N1/N + N2/N + …… + NM/N = 1,
i.e. P(1) + P(2) + …… + P(M) = 1.

For a complete experiment, the probabilities must add up to unity.

If two coins are tossed simultaneously, the outcome can be of four types: HH, TT, HT, TH. The last two outcomes are indistinguishable unless the coins are deliberately marked, so these are joint probabilities:
P(H,H) = 1/4, P(T,T) = 1/4, P(T,H) = 1/2.

Example
A box contains 100 balls of 4 distinct colours: Red 10, Black 20, White 40, Green 30. Someone reaches into the box and picks up one ball. How many possible events can occur? Four outcomes are possible, i.e. M = 4 (and N should be ≥ M).
P(R) = 10 out of 100 = 0.1, P(B) = 0.2, P(W) = 0.4, P(G) = 0.3.

Joint Probability
Suppose the balls also have weights, distributed as follows:

Weight \ Colour     R     B     W     G   Total
1 g                 5    10    25     0      40
2 g                 5     5     0    10      20
3 g                 0     5    15    20      40
Total              10    20    40    30     100

What is the probability of picking up a green ball that weighs 3 gram? P(G,3) = 20/100 = 0.2. The total number of joint probabilities in this experiment is 12.

In general, a joint probability (JP) is written P(A,B). JP is calculated in terms of two quantities: conditional probability and marginal probability, discussed next. (A numerical check of the table above is sketched below.)
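The joint and marginal probabilities in the ball example can be checked with a short script. The sketch below is illustrative only; the counts are taken from the table above, and the variable names are my own.

```python
import numpy as np

# Counts from the weight/colour table above (rows: 1 g, 2 g, 3 g; columns: R, B, W, G)
counts = np.array([
    [5, 10, 25,  0],   # 1 g
    [5,  5,  0, 10],   # 2 g
    [0,  5, 15, 20],   # 3 g
])
N = counts.sum()                     # 100 balls in total

joint = counts / N                   # joint probabilities P(weight, colour)
p_weight = joint.sum(axis=1)         # marginals P(1), P(2), P(3)
p_colour = joint.sum(axis=0)         # marginals P(R), P(B), P(W), P(G)

print(joint[2, 3])                   # P(G, 3) = 0.2
print(p_colour)                      # [0.1, 0.2, 0.4, 0.3]
print(p_weight)                      # [0.4, 0.2, 0.4]
print(joint.sum())                   # all 12 joint probabilities add up to 1.0
```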
Conditional Probability
P(A|B) is defined as the probability of the event A given that B has occurred. P(G|3) means: a 3 g ball is picked up; what is the probability that it is green? P(G|3) = 20 (green balls of 3 g weight) out of 40 (all 3 g balls) = 20/40 = 0.5.

Marginal Probability
The probability of a single-characteristic event A is called the marginal probability P(A), e.g. P(G), P(3), etc. The marginal probability of a 1 g ball is P(1) = 0.4. Marginal probability implies that each event has more than one characteristic, but we isolate events as belonging to a single characteristic.

The joint probability of two characteristics A and B is given by

P(A,B) = P(A|B) P(B) = P(B|A) P(A)   (Bayes' theorem).

Check: P(G,3) = 0.2, and again P(G,3) = P(G|3) P(3) = 0.5 × 0.4 = 0.2.
P(A,B) = P(A) P(B) when A and B are independent events.
Conditional probability can also be visualized by forming an event tree (figure in the original slides).

Discrete Probability to Continuous Probability
Sometimes we need to define the probability of occurrence of a range of possibilities. Example: the probability of picking a resistance between 995 Ω and 1005 Ω from a box containing an infinite number of resistances in the range 99 Ω to 1100 Ω, i.e. P(995 ≤ R ≤ 1005), is a nonzero quantity. Thus, to go from discrete to continuous probability, a range has to be assigned.

Procedure to specify the range: let x be a particular value of the variable that varies randomly (the resistance in this case). x is a continuum and can range from a to b (995 to 1005 in the resistance example). The usual notation is

P(x) = Prob(X ≤ x),

i.e. the probability that the random variable X (the resistance), which assumes various values within a range, takes a value no greater than the particular value x.

Random Process
A family of random time functions that share some common characteristic is called a random process. Thus, if x(t) is a random time function, a random process is a collection of x(t)'s. If, in a random process, the probability of each particular time function is also known, then the collection of these random time functions is called an ensemble. So every ensemble is a random process, but not every random process is an ensemble.

Random Process and Ensemble
Each member of a random process or an ensemble is called a sample function. Imagine all members of the ensemble drawn in parallel frames and consider any time t = t1; let X1 be the amplitude at t = t1. If the amplitudes at t = t1 of all members are measured, the X1's will differ from member to member. So X1 is a variable, not over time but over the ensemble, even though each x(t) is a function of time; here the independent variable is the designation of the member within the ensemble. X1 cannot be predicted, and its variation over the total ensemble cannot be predicted either. So X1 is a random variable.

Continuous RV: a continuous random variable is one that takes an infinite number of possible values. Over the large number of members in the ensemble (e.g. analog voltage waveforms), X1 can vary continuously, so the range of X1 is a continuum with non-deterministic nature.

Discrete RV: a discrete random variable is one that may take on only a countable number of distinct values. For a digital signal, the value of X1 depends on the number of bits allowed in the digital system, i.e. it can take only discrete values from a countable set of specified values. This is called a discrete RV.

Continuous RV
As discussed, continuous probability is defined as P(x) = Prob(X ≤ x), where x is a particular value of X. The range of x can be infinite or finite.
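A quick Monte Carlo sketch of the resistor example makes P(x) = Prob(X ≤ x) concrete. Note that the uniform distribution over the stated box range is my own assumption; the notes do not say how the resistances are distributed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: resistances are uniformly distributed over the box range 99-1100 ohms.
R = rng.uniform(99.0, 1100.0, size=1_000_000)

def P(x):
    """Empirical distribution function P(x) = Prob(X <= x)."""
    return np.mean(R <= x)

# Prob(995 <= R <= 1005) = P(1005) - P(995): small, but nonzero.
prob_range = P(1005.0) - P(995.0)
print(prob_range)    # ~ 10/1001 = 0.00999 under the uniform assumption
```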
A few important properties:
- If −∞ ≤ x ≤ ∞, then 0 ≤ P(x) ≤ 1.
- As x increases, P(x) is a monotonically non-decreasing function of x, i.e. P'(x) ≥ 0.
- As x varies over −∞ ≤ x ≤ ∞, P(−∞) = 0 and P(+∞) = 1.
- Prob(x1 ≤ x ≤ x2) = P(x2) − P(x1).

The values of x form a continuum, and the plot of P(x) versus x is a continuous curve called the Probability Distribution Function (PDF). PDFs come in various shapes, but the slope is always non-negative. (Typical PDFs are sketched in the original slides; the slope is non-negative in all three cases shown.)

P'(x) = dP/dx = p(x) is called the probability density function (pdf). In practice the pdf is more important than the PDF. The pdfs of the three sketched cases show clearly whether the original variable is a continuous or a discrete random variable; a discrete probability density function consists only of impulses.

Interpretation of the pdf:

p(x) = P'(x) = lim_{dx→0} [P(x + dx) − P(x)] / dx,
or p(x) dx = P(x + dx) − P(x),
or p(x) dx = Prob(x ≤ X ≤ x + dx).

p(x) dx is the probability element: the probability that the random variable lies between x and x + dx.

Properties of the pdf:
- p(x) ≥ 0 for −∞ ≤ x ≤ ∞; its lower limit is restricted to zero, but its upper limit is not restricted (unlike P(x), a pdf can exceed 1).
- ∫_{−∞}^{∞} p(x) dx = P(∞) − P(−∞) = 1 − 0 = 1.
- The PDF is P(x) = ∫_{−∞}^{x} p(u) du.
- ∫_{x1}^{x2} p(u) du = P(x2) − P(x1) = Prob(x1 ≤ X ≤ x2), the probability that the random variable lies between x1 and x2.

Examples of commonly encountered probability density functions

Gaussian: p(x) = (1/(σ√(2π))) e^{−(x − x̄)²/(2σ²)},  −∞ < x < ∞.
Observation: the exponent measures the deviation of x from a fixed point x̄, where x̄ is the ensemble average and σ is the standard deviation. The multiplying factor 1/(σ√(2π)) makes the integral over −∞ ≤ x ≤ ∞ equal to 1.

Problem: the time T (in seconds) for the completion of two sequential events is a continuous RV with a given pdf. (The pdf and the worked solution appear as figures in the original slides.)

Some discrete RVs

Geometric RV: consider a sequence of binary digits sent over some communication channel, and let k be the number of bits observed when the first error occurs, so that k − 1 bits are error-free. The probability of such an event is given by the geometric RV: for bit-error probability p, P(k) = (1 − p)^{k−1} p.

Poisson RV: the number of events k (for example, the number of occurrences in a fixed interval) is said to be a Poisson random variable; for an average number of events λ, P(k) = λ^k e^{−λ} / k!, k = 0, 1, 2, ……

Mean value and moments
In an RP the time average over any member of the family is zero; in fact, it does not make sense to take a time average at all, since the function cannot be predicted. Instead, the ensemble average, i.e. the average value of the RV x over the complete ensemble, can be calculated as

x̄ = ∫_{−∞}^{∞} x p(x) dx / ∫_{−∞}^{∞} p(x) dx = ∫_{−∞}^{∞} x p(x) dx.

It is the mean or average value of the RV, and also the average value of the RP, and is used as a parameter to characterize the RP. Hence, for any function f(x) of the RV, the ensemble average is

E[f(x)] = ∫_{−∞}^{∞} f(x) p(x) dx   (as the denominator ∫ p(x) dx = 1),

also called the expected value. Let f(x) = x^n; then

E[x^n] = ∫_{−∞}^{∞} x^n p(x) dx.

If n = 1 we get the mean value x̄; if n = 2 we get the mean squared value E[x²].

Central moments:

(CM)_n = E[(x − x̄)^n] = ∫_{−∞}^{∞} (x − x̄)^n p(x) dx.

If n = 1, (CM)_1 = 0.
If n = 2,
(CM)_2 = E[(x − x̄)²] = E[x² − 2x x̄ + x̄²] = E[x²] − 2x̄ E[x] + x̄²   (as x̄ is a constant)
       = E[x²] − 2x̄² + x̄² = E[x²] − x̄² = σ²,
the variance.

Properties of variance:
- Var[c] = 0, where c is a constant
- Var[x + c] = Var[x]
- Var[cx] = c² Var[x]

Exercise: let Var[x] = 1 and y = 2x + 3. Find the variance of y. (A numerical check is sketched below.)
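As a sanity check on the moment definitions and the variance properties, here is a small sketch that evaluates the moment integrals by a simple Riemann sum for a Gaussian pdf and then verifies Var[2x + 3] = 4 Var[x] on samples. The choice x̄ = 0, σ = 1 is my own, purely for illustration.

```python
import numpy as np

# Gaussian pdf with (assumed) mean 0 and standard deviation 1, on a fine grid.
x = np.linspace(-10.0, 10.0, 200_001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

mean     = np.sum(x * p) * dx        # E[x],   n = 1 moment  -> ~0
mean_sq  = np.sum(x**2 * p) * dx     # E[x^2], n = 2 moment  -> ~1
variance = mean_sq - mean**2         # (CM)_2 = E[x^2] - xbar^2
print(mean, mean_sq, variance)

# Variance properties checked on samples: Var[x] = 1 and y = 2x + 3 gives Var[y] = 4.
rng = np.random.default_rng(1)
xs = rng.normal(0.0, 1.0, size=1_000_000)
ys = 2 * xs + 3
print(np.var(xs), np.var(ys))        # ~1.0 and ~4.0
```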
Recollect: E[f(x)] = ∫_{−∞}^{∞} f(x) p(x) dx.
- If f(x) = x, E[f(x)] = x̄, the ensemble average.
- If f(x) = x², E[f(x)] = E[x²], the mean squared value.
- If f(x) = (x − x̄)², E[f(x)] = σ² = E[x²] − x̄², the variance.

None of the above makes sense unless the RP is an ensemble, i.e. unless p(x) and P(x) are known.

Example: uniform pdf
p(x) = A for a ≤ x ≤ b, and p(x) = 0 otherwise. Since the area ∫ p(x) dx = 1, the constant height is A = 1/(b − a).

Find the ensemble average, mean squared value and variance for this uniform p(x):

x̄ = ∫_a^b A x dx = (A/2)(b² − a²) = (a + b)/2

E[x²] = ∫_a^b A x² dx = (A/3)(b³ − a³) = (b² + ab + a²)/3

σ² = E[x²] − x̄² = (b² + ab + a²)/3 − ((a + b)/2)²
   = (b² + ab + a²)/3 − (b² + 2ab + a²)/4
   = (b − a)²/12.

Note: σ = (b − a)/√12 is the standard deviation. This result is very useful in DSP and communication applications.

Some higher-order moments
Some normalized quantities related to the third- and fourth-order moments are sometimes used in the analysis of non-Gaussian RVs:

Skewness = E[(x − x̄)³] / σ³
Skewness measures the asymmetry of a distribution; pdfs that are symmetric about their mean have zero skewness.

Kurtosis = E[(x − x̄)⁴] / σ⁴
Kurtosis measures the deviation of a distribution from the Gaussian pdf. For a Gaussian RV the skewness is zero and the kurtosis as defined above equals 3; distributions closer to Gaussian therefore have an excess kurtosis (kurtosis − 3) closer to zero.

More on the Gaussian pdf
The most popular pdf is the Gaussian, as many physical phenomena can be modeled by the Gaussian function:

p(x) = (1/(σ√(2π))) e^{−(x − x̄)²/(2σ²)},  −∞ < x < ∞,
P(x) = ∫_{−∞}^{x} p(u) du.

(In the sketch in the original slides, p(x) peaks at x = x̄ with value 1/(σ√(2π)) and falls to 0.607 of that peak at x = x̄ ± σ.)

Features of the Gaussian pdf
- Many physical processes can be modeled by a Gaussian pdf.
- It is easy to handle analytically.
- If two RVs x and y each have a known (Gaussian) pdf with known parameters and are statistically independent of each other, then the joint pdf is also Gaussian:

p(x, y) = (1/(2π σ_x σ_y)) e^{−[(x − x̄)²/(2σ_x²) + (y − ȳ)²/(2σ_y²)]}.

This is not true for other density functions, and the property holds for any number of statistically independent RVs.
- A Gaussian RP (ensemble) can be described completely by its first and second moments alone (x̄ and E[x²]).

For a Gaussian pdf, which is very important in many applications, the probability of an interval is

P[a ≤ X ≤ b] = ∫_a^b (1/(σ√(2π))) e^{−(x − x̄)²/(2σ²)} dx,

but this integral cannot be evaluated analytically. In engineering it is more common to use the Q function, defined as

Q(x) = (1/√(2π)) ∫_x^∞ e^{−z²/2} dz.

It corresponds to a Gaussian RV with parameters x̄ = 0 and σ² = 1, and it maps directly onto communication-system calculations. For a Gaussian RV with parameters x̄ and σ², we can write

P[X > x] = Q((x − x̄)/σ),

the probability of the event X > x. When x is to the right of the mean, the region X > x is called the right 'tail' of the distribution. If x is chosen to the left of the mean, the Q function represents the probability of the unshaded region depicted in the figure (original slides); in such cases the argument of the Q function is negative, and the following relation can be used:

Q(−x) = 1 − Q(x).

The Q function can be used for other computations as well. For example, the left tail of the distribution, the probability of the shaded region, is

P[X ≤ x] = 1 − P[X > x] = 1 − Q((x − x̄)/σ).

The Q function can also be used to compute the probability of a more general region for a Gaussian variable:

P[a ≤ X ≤ b] = Q((a − x̄)/σ) − Q((b − x̄)/σ).

(Typical Q values and a closing problem are tabulated in the original slides.)
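The Q function is not built into most standard math libraries, but it can be expressed through the complementary error function as Q(x) = ½ erfc(x/√2). The sketch below is illustrative; the parameter values (x̄ = 2, σ = 3, a = 1, b = 5) are arbitrary choices of mine.

```python
import numpy as np
from scipy.special import erfc
from scipy.stats import norm

def Q(x):
    """Gaussian tail probability Q(x) = (1/sqrt(2*pi)) * integral_x^inf exp(-z^2/2) dz."""
    return 0.5 * erfc(x / np.sqrt(2.0))

# Check the reflection relation Q(-x) = 1 - Q(x).
print(Q(-1.5), 1.0 - Q(1.5))

# For a Gaussian RV with mean xbar and standard deviation sigma:
xbar, sigma = 2.0, 3.0
a, b = 1.0, 5.0

p_interval = Q((a - xbar) / sigma) - Q((b - xbar) / sigma)    # P[a <= X <= b] via Q
p_check    = norm.cdf(b, loc=xbar, scale=sigma) - norm.cdf(a, loc=xbar, scale=sigma)
print(p_interval, p_check)                                    # the two values agree
```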
