Stochastic Processes Lecture Notes PDF

Summary

These lecture notes from Aston University cover stochastic processes: the different types of stochastic process, the statistics of stochastic processes, strict-sense and wide-sense stationarity (with worked examples), and the increments of a stochastic process.

Full Transcript

Intended Learning Outcomes

- Define different types of stochastic processes.
- Apply the statistics of stochastic processes to problems.
- Define what is meant by strict-sense stationary (SSS).
- Define what is meant by wide-sense stationary (WSS) and determine if a stochastic process is WSS.
- Define the increments of a stochastic process and be able to determine if these increments are stationary.

2 Introduction to stochastic processes

Last lecture, we described a stochastic process informally as a collection of random variables that we can use to model a random process in the real world, such as queuing or the spread of a disease. Let's formalise this.

Definition. A stochastic process $\{X(t) \mid t \in T\}$ is a set of finitely or infinitely many random variables, where the index set $T$ is called the parameter space, and the set $S$ of all possible values that the random variables $X(t)$ can take (i.e. the union of the sample spaces of the random variables $X(t)$) is called the state space.

NOTE:
(i) To indicate the random variables in a stochastic process, we may sometimes write $X_t$ instead of $X(t)$.
(ii) For each $t \in T$, $X(t)$ is a different random variable: $X(t_1)$ is a different random variable to $X(t_2)$, and so on.

2.1 Types of SPs

We describe a stochastic process with parameter space $T$ and state space $S$ as being:

- Discrete-time when $T$ is countable.
- Continuous-time when $T$ is uncountable.
- Discrete-state when $S$ is countable.
- Continuous-state when $S$ is uncountable.

Therefore there are four types of stochastic processes:

- Discrete-time, discrete-state.
- Discrete-time, continuous-state.
- Continuous-time, discrete-state.
- Continuous-time, continuous-state.

Example - Types of stochastic processes

Suppose we want to model the number of customers in a shop over the course of a day. How can we define an appropriate stochastic process to model this? There are two natural options (a simulation sketch of the first follows the example).

Option 1: let $X(t)$ be the number of customers in the shop at time $t$ (in hours). Then $T = [0, 24)$ is uncountably infinite, and $S = \{0, 1, 2, \dots\}$, the set of possible outcomes, is countably infinite, so $\{X(t) \mid t \in T\}$ is a continuous-time, discrete-state stochastic process.

Option 2: let $X(t)$ be the number of customers in the shop after the $t$-th customer leaves. Then $T = \{1, 2, 3, \dots\}$ is countably infinite and $S = \{0, 1, 2, 3, \dots\}$ is countably infinite, so this is a discrete-time, discrete-state stochastic process.
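To make Option 1 concrete, here is a minimal simulation sketch. It is my own illustration rather than part of the notes: the notes only fix $T = [0, 24)$ and $S = \{0, 1, 2, \dots\}$, so the Markovian birth-death dynamics and the rates below are assumptions chosen purely to have something to simulate.

```python
import random

# Minimal sketch of Option 1 (assumed dynamics, not from the notes):
# customers in a shop as a continuous-time, discrete-state process.
ARRIVAL_RATE = 5.0    # customers arriving per hour (assumed)
DEPARTURE_RATE = 0.8  # departure rate per customer per hour (assumed)

def simulate_shop(t_end=24.0, seed=0):
    """Simulate one sample path, returning a list of (event time, X(t)) pairs."""
    rng = random.Random(seed)
    t, count, path = 0.0, 0, [(0.0, 0)]
    while True:
        total_rate = ARRIVAL_RATE + DEPARTURE_RATE * count
        t += rng.expovariate(total_rate)  # waiting time to the next event
        if t >= t_end:
            return path
        if rng.random() < ARRIVAL_RATE / total_rate:
            count += 1  # an arrival
        else:
            count -= 1  # a departure (only possible when count > 0)
        path.append((t, count))

print(simulate_shop()[:5])  # the first few (time, customers) events
```

Note that each sample path is piecewise constant: although time is continuous, the state only changes at the countably many arrival and departure events, which is exactly the continuous-time, discrete-state picture.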
2.2 Statistics of SPs

There are three main points of interest we have when studying stochastic processes:

1. The dependencies that the sequences of values generated by the process have.
2. Long-term averages of the generated sequence of values.
3. Characterisation of the likelihood and frequency of the occurrence of certain boundary events.

To explore these, we need to define some statistics. Throughout, $\{X(t) \mid t \in T\}$ will be a stochastic process with parameter space $T$ and state space $S$.

2.2.1 Density functions

Let $t$ be some fixed element of $T$. The random variable $X(t)$ has cumulative distribution function
$$F(x, t) = P(X(t) \le x) \quad \text{for every } x \in \mathbb{R},$$
and probability density function/probability mass function given by
$$f(x, t) = \begin{cases} \dfrac{\partial F(x, t)}{\partial x} & \text{if the process is continuous-state} \\[1ex] P(X(t) = x) & \text{if the process is discrete-state.} \end{cases}$$

For values $t_1, t_2 \in T$ with $t_1 \ne t_2$, the joint cdf of the random variables $X(t_1)$ and $X(t_2)$ is given by
$$F(x_1, x_2; t_1, t_2) = P(X(t_1) \le x_1 \wedge X(t_2) \le x_2)$$
and the joint pdf/pmf is given by
$$f(x_1, x_2; t_1, t_2) = \begin{cases} \dfrac{\partial^2 F(x_1, x_2; t_1, t_2)}{\partial x_1 \, \partial x_2} & \text{if } S \text{ continuous} \\[1ex] P(X(t_1) = x_1 \wedge X(t_2) = x_2) & \text{if } S \text{ discrete.} \end{cases}$$

These generalise to $n$ random variables $X(t_1), X(t_2), \dots, X(t_n)$ in the obvious way.

Note: for a fixed $t$, $X(t)$ is just a random variable, so these are the same definitions as last lecture. To completely know a full stochastic process, $f(x_1, \dots, x_n; t_1, \dots, t_n)$ must be known for all $n \in \mathbb{N}$ and all parameters $t_1, \dots, t_n \in T$, which is usually impossible.

2.2.2 Moments and correlations

For fixed $t \in T$:

The mean or expectation of $X(t)$ is given by
$$\mu(t) = E(X(t)) = \begin{cases} \displaystyle\int_{-\infty}^{\infty} x f(x, t)\, dx & \text{if } S \text{ continuous} \\[1ex] \displaystyle\sum_{x = -\infty}^{\infty} x f(x, t) & \text{if } S \text{ discrete.} \end{cases}$$

The variance of $X(t)$ is given by
$$\sigma^2(t) = \mathrm{Var}(X(t)) = E\big((X(t) - \mu(t))^2\big) = \begin{cases} \displaystyle\int_{-\infty}^{\infty} (x - \mu(t))^2 f(x, t)\, dx & \text{if } S \text{ continuous} \\[1ex] \displaystyle\sum_{x = -\infty}^{\infty} (x - \mu(t))^2 f(x, t) & \text{if } S \text{ discrete.} \end{cases}$$

For fixed $t_1, t_2 \in T$:

The autocorrelation of $X(t_1)$ and $X(t_2)$, a measure of the relationship between the values of the process at two different times, is
$$R(t_1, t_2) = E\big(X(t_1)\overline{X(t_2)}\big) = \begin{cases} \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 \overline{x_2}\, f(x_1, x_2; t_1, t_2)\, dx_1\, dx_2 & \text{if continuous-state} \\[1ex] \displaystyle\sum_{x_1 = -\infty}^{\infty}\sum_{x_2 = -\infty}^{\infty} x_1 \overline{x_2}\, f(x_1, x_2; t_1, t_2) & \text{if discrete-state,} \end{cases}$$
where $\overline{x}$ is the complex conjugate of $x$, although of course we are usually thinking about real numbers in this module. We can also calculate the autocorrelation of two different processes $X(t)$ and $Y(t)$ using
$$R_{XY}(t_1, t_2) = E(X(t_1)Y(t_2)).$$

The autocovariance of $X(t_1)$ and $X(t_2)$ is
$$C(t_1, t_2) = E\big((X(t_1) - \mu(t_1))(X(t_2) - \mu(t_2))\big) = R(t_1, t_2) - \mu(t_1)\mu(t_2)$$
for discrete-state and continuous-state processes.

Proof (continuous-state case; the discrete-state case uses the same arguments with sums in place of integrals). Writing $f$ for $f(x_1, x_2; t_1, t_2)$,
$$\begin{aligned}
C(t_1, t_2) &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 - \mu(t_1))(x_2 - \mu(t_2))\, f\, dx_1\, dx_2 \\
&= \iint x_1 x_2 f\, dx_1\, dx_2 - \mu(t_1)\iint x_2 f\, dx_1\, dx_2 - \mu(t_2)\iint x_1 f\, dx_1\, dx_2 + \mu(t_1)\mu(t_2)\iint f\, dx_1\, dx_2 \\
&= R(t_1, t_2) - \mu(t_1)\mu(t_2) - \mu(t_2)\mu(t_1) + \mu(t_1)\mu(t_2) \\
&= R(t_1, t_2) - \mu(t_1)\mu(t_2),
\end{aligned}$$
where we used that integrating the joint pdf over one variable gives the marginal pdf of the other (so, for example, $\iint x_1 f\, dx_1\, dx_2 = \int x_1 f(x_1, t_1)\, dx_1 = \mu(t_1)$), and that the joint pdf integrates to $1$.
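These definitions translate directly into Monte Carlo estimates: average over many independent sample paths. The sketch below is my own addition, using the coin-toss process of the example that follows with an assumed $p = 0.3$, and estimates $\mu(t_1)$ and $R(t_1, t_2)$.

```python
import random

# Monte Carlo sketch (not from the notes): estimate mu(t) and R(t1, t2)
# by averaging over many independent sample paths of the coin-toss process.
P_HEADS = 0.3  # assumed value of p for illustration

def sample_path(n_tosses, rng):
    """One realisation of X(1), ..., X(n_tosses): 1 for heads, 0 for tails."""
    return [1 if rng.random() < P_HEADS else 0 for _ in range(n_tosses)]

rng = random.Random(1)
paths = [sample_path(10, rng) for _ in range(100_000)]

t1, t2 = 2, 7  # elements of T = {1, 2, ...}; the list index is t - 1
mu_t1 = sum(path[t1 - 1] for path in paths) / len(paths)
R_t1t2 = sum(path[t1 - 1] * path[t2 - 1] for path in paths) / len(paths)

print(f"mu(t1)    estimate: {mu_t1:.4f}  (exact: p   = {P_HEADS})")
print(f"R(t1, t2) estimate: {R_t1t2:.4f}  (exact: p^2 = {P_HEADS ** 2})")
```

The exact values $p$ and $p^2$ quoted in the comments are derived analytically in the example below.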
Example - Tossing a coin

Suppose we toss a coin repeatedly with the probability of the coin landing on heads being $p$. Let's also assume that our coin tosses are independent. Then we can model this process via the stochastic process $\{X(t) \mid t \in T\}$ with $T = \{1, 2, 3, \dots\}$ and $S = \{0, 1\}$ (heads $\mapsto 1$, tails $\mapsto 0$), where
$$X(t) = \begin{cases} 1 & \text{if toss } t \text{ lands on heads} \\ 0 & \text{otherwise.} \end{cases}$$
Calculate the cdf, pmf, joint pmf, expectation, variance and autocorrelation.

Solution. This is a discrete-time, discrete-state process with $P(X(t) = 1) = p$ and $P(X(t) = 0) = 1 - p$.

pmf:
$$f(x, t) = \begin{cases} p & x = 1 \\ 1 - p & x = 0 \\ 0 & \text{otherwise.} \end{cases}$$

cdf:
$$F(x, t) = P(X(t) \le x) = \begin{cases} 0 & x < 0 \\ 1 - p & 0 \le x < 1 \\ 1 & x \ge 1. \end{cases}$$

Joint pmf: as the coin tosses are independent, for $t_1 \ne t_2$ we have $f(x_1, x_2; t_1, t_2) = P(X(t_1) = x_1)\, P(X(t_2) = x_2)$, so
$$f(x_1, x_2; t_1, t_2) = \begin{cases} p^2 & x_1 = 1,\ x_2 = 1 \\ p(1 - p) & x_1 \ne x_2,\ x_1, x_2 \in \{0, 1\} \\ (1 - p)^2 & x_1 = 0,\ x_2 = 0 \\ 0 & \text{otherwise.} \end{cases}$$

Expectation: $E(X(t)) = 1 \cdot p + 0 \cdot (1 - p) = p$.

Variance: $\mathrm{Var}(X(t)) = E(X(t)^2) - E(X(t))^2 = \big(1^2 \cdot p + 0^2 \cdot (1 - p)\big) - p^2 = p - p^2 = p(1 - p)$.

Autocorrelation (for $t_1 \ne t_2$):
$$R(t_1, t_2) = E(X(t_1) X(t_2)) = \sum_{x_1} \sum_{x_2} x_1 x_2 f(x_1, x_2; t_1, t_2) = 1 \cdot 1 \cdot p^2 = p^2.$$

Autocovariance: $C(t_1, t_2) = R(t_1, t_2) - \mu(t_1)\mu(t_2) = p^2 - p^2 = 0$, so $X(t_1)$ and $X(t_2)$ are uncorrelated.

Note also that $X(t)$ follows the same distribution as $X(t + s)$ (first order), and $(X(t_1), X(t_2))$ follows the same distribution as $(X(t_1 + s), X(t_2 + s))$ (second order); this anticipates the notion of stationarity.

2.3 At the station

A stochastic process $\{X(t) \mid t \in T\}$ is $n$-th order strict-sense stationary (SSS) if, for a given $n \in \mathbb{N}$, for all $t_1, t_2, \dots, t_n \in T$ and all $s \in \mathbb{R}$ for which $t_i + s \in T$ for every $i = 1, 2, \dots, n$, the joint distribution of $X(t_1), X(t_2), \dots, X(t_n)$ is the same as the joint distribution of $X(t_1 + s), X(t_2 + s), \dots, X(t_n + s)$. If the process is $n$-th order strict-sense stationary for every $n \in \mathbb{N}$, we say it is strict-sense stationary.

2.3.1 Consequences of stationary SPs

1. If $f(x, t)$ is independent of time, then $E(X(t)) = \int_{-\infty}^{\infty} x f(x)\, dx = \mu$ is constant.

2. If a process is 2nd-order SSS, we have $f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_1 + s, t_2 + s)$ for all $s$, and in particular, taking $s = -t_1$, we get $f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_2 - t_1)$: the joint density becomes a function of the difference $t_2 - t_1$. Then $\mu(t_1) = E(X(t_1))$ and $\mu(t_2)$ depend only on $t_2 - t_1$. As a consequence, $R(t_1, t_2)$ only depends on the difference $t_2 - t_1$ and not on $t_1$ itself. This implies that $C(t, t + s) = R(t, t + s) - \mu(t)\mu(t + s)$ only depends on the difference $s$ between $t$ and $t + s$.

Most of the time it is very difficult to prove a stochastic process is stationary. The above consequences give rise to a weaker notion of stationarity that is easier to handle: a process is WSS if the expectation $E(X(t))$ and the autocorrelation $R(t, t + s)$ have no $t$-dependence.

Definition. A stochastic process $\{X(t) \mid t \in T\}$ is wide-sense stationary (WSS) if for all $t \in T$ and all $s \in \mathbb{R}$ such that $t + s \in T$, $E(X(t))$ and $R(t, t + s)$ do NOT depend on $t$.

Two processes $\{X(t) \mid t \in T_X\}$ and $\{Y(t) \mid t \in T_Y\}$ are jointly wide-sense stationary if both processes are WSS and $R_{XY}(t, t + s)$ does not depend on $t$.
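As a quick numerical illustration of this definition (my own addition, not part of the notes), the sketch below estimates $E(X(t))$ and $R(t, t + s)$ for the coin-toss process at several values of $t$, with an assumed $p = 0.3$; for a WSS process, neither estimate should drift with $t$.

```python
import random

# Empirical WSS check (assumed setup): for a WSS process, the sample mean
# E(X(t)) and sample autocorrelation R(t, t+s) should not vary with t,
# up to Monte Carlo noise. Demonstrated on the coin-toss process.
P_HEADS = 0.3
N_PATHS, N_TOSSES, LAG = 100_000, 20, 3

rng = random.Random(42)
paths = [[1 if rng.random() < P_HEADS else 0 for _ in range(N_TOSSES)]
         for _ in range(N_PATHS)]

for t in (0, 5, 10):  # three different start times, same lag s = LAG
    mean_t = sum(p[t] for p in paths) / N_PATHS
    R_t = sum(p[t] * p[t + LAG] for p in paths) / N_PATHS
    print(f"t={t:2d}: E(X(t)) est {mean_t:.4f}, R(t, t+{LAG}) est {R_t:.4f}")
# Both columns stay flat as t varies, consistent with WSS: here E(X(t)) = p
# and, for s != 0, R(t, t+s) = p^2, neither of which depends on t.
```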
Example - Wide-sense stationary

Let $X(t) = r \cos(at + \Theta)$ define a random process $\{X(t) \mid t \ge 0\}$, where $r$ and $a$ are constants and $\Theta$ is a uniformly distributed random variable with pdf
$$f_\Theta(x) = \begin{cases} \dfrac{1}{2\pi} & \text{if } -\pi \le x \le \pi \\[1ex] 0 & \text{otherwise.} \end{cases}$$
Let us investigate whether or not this process is WSS.

Solution. First the expectation. Expanding $\cos(at + \Theta) = \cos(at)\cos\Theta - \sin(at)\sin\Theta$ and using
$$E(\cos\Theta) = \int_{-\pi}^{\pi} \frac{\cos x}{2\pi}\, dx = 0, \qquad E(\sin\Theta) = \int_{-\pi}^{\pi} \frac{\sin x}{2\pi}\, dx = 0,$$
we get $E(X(t)) = r\big(\cos(at)\, E(\cos\Theta) - \sin(at)\, E(\sin\Theta)\big) = 0$, a constant independent of $t$.

Next the autocorrelation. Using the same trigonometric identity on both factors,
$$R(t, t + s) = E(X(t) X(t + s)) = r^2 E\big(\cos(at + \Theta)\cos(a(t + s) + \Theta)\big)$$
expands into terms in $\cos^2\Theta$, $\sin^2\Theta$ and $\sin\Theta\cos\Theta$. Since
$$E(\cos^2\Theta) = \int_{-\pi}^{\pi} \frac{\cos^2 x}{2\pi}\, dx = \frac{1}{2}, \qquad E(\sin^2\Theta) = \frac{1}{2}, \qquad E(\sin\Theta\cos\Theta) = \int_{-\pi}^{\pi} \frac{\sin 2x}{4\pi}\, dx = 0,$$
we obtain
$$R(t, t + s) = \frac{r^2}{2}\big(\cos(at)\cos(a(t + s)) + \sin(at)\sin(a(t + s))\big) = \frac{r^2}{2}\cos(as),$$
which is constant in $t$, as there is no $t$-dependence. As $E(X(t))$ and $R(t, t + s)$ are not $t$-dependent, the process is WSS.

2.4 Increments of an SP

2.4.1 Independent increments

An increment of a random process $\{X(t) \mid t \in T\}$ is $X(t_2) - X(t_1)$ for some $t_1, t_2 \in T$ where $t_2 > t_1$.

A stochastic process $\{X(t) \mid t \in T\}$ has independent increments if for every $n \in \mathbb{N}$ and $t_0 < t_1 < \dots < t_n \in T$, the random variables $X(t_0), X(t_1) - X(t_0), \dots, X(t_n) - X(t_{n-1})$ are jointly independent, i.e. the differences between values of the random process across non-overlapping time intervals are independent.

2.4.2 Stationary ones

A stochastic process $\{X(t) \mid t = 1, 2, \dots\}$ has stationary increments if, for every fixed $s > 0$, the increment $X(t + s) - X(t)$ has the same distribution for all $t \in T$, i.e. the distribution of an increment only depends on the length $s$ of the time period it spans.

Example - Stationary increments

Consider the stochastic process defined by flipping a coin with probability $p$ of landing on heads, which we described earlier. Show that the process counting the total number of heads has stationary increments.

Solution. Let $Y(t)$ be the total number of heads seen in the first $t$ tosses of the coin, with $T = \{1, 2, 3, \dots\}$, $S = \{0, 1, 2, 3, \dots\}$ and $Y(0) = 0$ (no heads, as no coin has been tossed). Then
$$Y(t) = \sum_{i=1}^{t} X(i),$$
a sum of independent Bernoulli trials. An increment of $Y$ is
$$Y(t_2) - Y(t_1) = \sum_{i = t_1 + 1}^{t_2} X(i),$$
and for $t_0 < t_1 < \dots < t_n$ the increments are sums over disjoint sets of the independent random variables $X(i)$, so the process also has independent increments.

For stationarity, we need to consider
$$Y(t + s) - Y(t) = \sum_{i = t + 1}^{t + s} X(i),$$
a sum of $s$ independent Bernoulli($p$) trials. Its distribution therefore depends only on the value of $s$ and is independent of $t$, i.e. $Y(t + s) - Y(t)$ has the same distribution for all $t$, so we have stationary increments.

Example - Random walk

Let $X_n = X_{n-1} + Z_n = \sum_{i=1}^{n} Z_i$, where the $Z_i$ are independent with $E(Z_i) = 0$ and $\mathrm{Var}(Z_i) = E(Z_i^2) = \sigma^2$. Then
$$E(X_n) = \sum_{i=1}^{n} E(Z_i) = 0,$$
i.e. a constant, and for $s \ge 0$,
$$R(t, t + s) = E(X_t X_{t+s}) = E\Big(\sum_{i=1}^{t} Z_i \sum_{j=1}^{t+s} Z_j\Big) = \sum_{i=1}^{t} \sum_{j=1}^{t+s} E(Z_i Z_j) = \sum_{i=1}^{t} E(Z_i^2) = t\sigma^2,$$
since $E(Z_i Z_j) = 0$ for $i \ne j$: all the terms in the summation are zero except when $i = j$. As $R(t, t + s)$ depends on $t$, the random walk is not WSS, despite having constant mean.
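The random-walk calculation can also be checked numerically. The sketch below is my own addition: it takes steps $Z_i = \pm 1$ with equal probability (so $\sigma^2 = 1$), under which the calculation above predicts $E(X_t) = 0$ and $R(t, t + s) = t\sigma^2$.

```python
import random

# Numerical check of the random-walk example (assumed steps Z_i = ±1, so
# sigma^2 = 1). Predicted: E(X_t) = 0 and R(t, t+s) = t * sigma^2, which
# grows with t, so the walk is not WSS.
rng = random.Random(7)
N_PATHS, N_STEPS, S = 50_000, 30, 5

def walk(n_steps):
    """One sample path X_1, ..., X_n of the random walk."""
    x, path = 0, []
    for _ in range(n_steps):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

paths = [walk(N_STEPS) for _ in range(N_PATHS)]
for t in (5, 10, 20):  # X_t sits at list index t - 1
    mean_t = sum(p[t - 1] for p in paths) / N_PATHS
    R_t = sum(p[t - 1] * p[t + S - 1] for p in paths) / N_PATHS
    print(f"t={t:2d}: E(X_t) est {mean_t:+.3f}, "
          f"R(t, t+{S}) est {R_t:7.3f} (predicted {t})")
# R(t, t+s) tracks t, confirming the t-dependence that rules out WSS.
```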