Stochastic Processes: Definition and Types

Questions and Answers

Consider a stochastic process modeling coin tosses, where $X(t) = 1$ if toss t is heads and $0$ otherwise. If the probability of heads is p, what is the expected value, $E[X(t)]$, for any given toss t?

  • $p^2$
  • $1-p$
  • $2p(1-p)$
  • $p$ (correct)

A stochastic process {X(t)} models a sequence of independent coin tosses. $X(t) = 1$ represents heads, and $X(t) = 0$ represents tails. Assuming the probability of heads is p, what is the variance, $Var[X(t)]$, for any single toss t?

  • $p^2$
  • $p$
  • $p(1-p)$ (correct)
  • $(1-p)^2$

Given a stochastic process ${X(t)}$ representing a series of independent coin flips, where $X(t) = 1$ denotes heads and $X(t) = 0$ denotes tails, what is the autocorrelation $R(t_1, t_2) = E[X(t_1)X(t_2)]$ for $t_1 \neq t_2$, assuming the probability of heads is p?

  • $p$
  • $p(1-p)$
  • $p^2$ (correct)
  • $0$

In a sequence of independent coin tosses modeled by a stochastic process ${X(t)}$, where $X(t) = 1$ for heads and $X(t) = 0$ for tails, what is the autocovariance $C(t_1, t_2)$ for $t_1 \neq t_2$, assuming the probability of heads is p?

  • $0$ (correct)
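The four coin-toss answers above can be sanity-checked with a quick Monte Carlo sketch (NumPy assumed available; p = 0.3 and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                # hypothetical probability of heads
n_paths = 200_000

# Two tosses at distinct times t1 != t2, sampled across many paths.
x1 = rng.binomial(1, p, n_paths)
x2 = rng.binomial(1, p, n_paths)

mean = x1.mean()                       # E[X(t)]   ~ p
var = x1.var()                         # Var[X(t)] ~ p(1 - p)
autocorr = (x1 * x2).mean()            # R(t1, t2) ~ p**2
autocov = autocorr - mean * x2.mean()  # C(t1, t2) ~ 0
```

Each estimate converges to the corresponding quiz answer as the number of sample paths grows.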

Given a stochastic process $\{X(t) \mid t \in T\}$, what condition must be met for it to be considered n-th order strict-sense stationary (SSS)?

  • The joint distribution of $X(t_1), X(t_2), ..., X(t_n)$ is the same as the joint distribution of $X(t_1 + s), X(t_2 + s), ..., X(t_n + s)$ for all choices of $t_i$ and all shifts $s$. (correct)

Which of the following is a necessary condition for a stochastic process to have stationary increments?

  • The distribution of the increments $X(t+h) - X(t)$ depends only on $h$, not on $t$. (correct)

Consider a stochastic process ${X(t)}$ with stationary increments. If $E[X(t)] = 0$ for all t, what can be said about the expected value of the increment $X(t+h) - X(t)$?

  • It is always equal to $0$. (correct)

Suppose you have a stochastic process where the increments $X(t+h) - X(t)$ are independent and identically distributed (i.i.d.) with a mean of 0 and a variance of $\sigma^2h$. What can you infer about the increments of this process?

  • The increments are stationary. (correct)

Which of the following is a characteristic of a stochastic process with independent increments?

  • The increments over non-overlapping intervals are statistically independent. (correct)

Given a stochastic process with independent increments, how does knowing the value of $X(t_1)$ affect the distribution of $X(t_2)$ if $t_1 < t_2$?

  • The distribution of the increment $X(t_2) - X(t_1)$ is independent of $X(t_1)$. (correct)

What does the expression $E[X(t)]$ represent in the context of stochastic processes?

  • The mean or expected value of the stochastic process at time t. (correct)

Which of the following describes the autocorrelation function $R(t_1, t_2)$ of a stochastic process?

  • A measure of the correlation between $X(t_1)$ and $X(t_2)$. (correct)

How is the variance of a stochastic process $X(t)$ at a fixed time t calculated?

  • $E[(X(t) - E[X(t)])^2]$ (correct)

What is required to fully define a stochastic process?

  • The joint density function for all possible combinations of time points and corresponding random variables. (correct)

Which formula correctly represents the mean $\mu(t)$ of a stochastic process $X(t)$ when $X(t)$ is a continuous random variable?

  • $\int x f(x, t) dx$ (correct)

Given a stochastic process X(t) and its mean function μ(t), which of the following expressions defines the variance of X(t) if X(t) is discrete?

  • $\sum (x - \mu(t))^2 f(x, t)$ (correct)

What is the key difference in calculating the autocorrelation function $R(t_1, t_2)$ for continuous-state versus discrete-state stochastic processes?

  • Continuous-state processes use integration, while discrete-state processes use summation. (correct)

In the context of stochastic processes, what does it mean for a process to have stationary increments?

  • The probability distribution of the increments depends only on the time difference and not on the absolute time. (correct)

If $X(t)$ is a stochastic process and $R(t_1, t_2) = E[X(t_1)X(t_2)]$, how would you interpret a high value of $R(t_1, t_2)$?

  • $X(t_1)$ and $X(t_2)$ tend to have values with the same sign and large magnitudes. (correct)

Suppose you are given a stochastic process and told that its autocorrelation function $R(t_1, t_2)$ depends only on $|t_1 - t_2|$. What can you conclude about the process?

  • The process is wide-sense stationary, provided its mean is also constant. (correct)

Consider a stochastic process $X(t)$. If $E[X(t)] = c$ (a constant) for all $t$, and $Cov(X(t_1), X(t_2)) = f(t_2 - t_1)$ for some function $f$, what type of stationarity does this process exhibit?

  • Wide-sense stationarity (correct)

Which of the following statements is NOT true about the mean function $\mu(t)$ of a stochastic process $X(t)$?

  • $\mu(t)$ completely defines the stochastic process. (correct)

For a stochastic process $X(t)$ with mean function $\mu(t)$, how would you calculate the autocovariance function $C(t_1, t_2)$?

  • $C(t_1, t_2) = E[(X(t_1) - \mu(t_1))(X(t_2) - \mu(t_2))] = R(t_1, t_2) - \mu(t_1)\mu(t_2)$ (correct)

Given two stochastic processes, $X(t)$ and $Y(t)$, how is their cross-correlation function $R_{XY}(t_1, t_2)$ defined?

  • $R_{XY}(t_1, t_2) = E[X(t_1)Y(t_2)]$; for complex-valued processes, $E[X(t_1)Y^*(t_2)]$ is also used. (correct)

A stochastic process is described as having 'independent increments'. What does this imply about the relationship between increments at different times?

  • The increments are uncorrelated. (correct)

Which of the following is a necessary condition for a stochastic process $\{X(t) \mid t \in T\}$ to have independent increments?

  • For every $n \in \mathbb{N}$ and $t_0 < t_1 < ... < t_n \in T$, the random variables $X(t_0), X(t_1) - X(t_0), ..., X(t_n) - X(t_{n-1})$ are jointly independent. (correct)

A stochastic process $\{X(t) \mid t = 1, 2, ...\}$ has stationary increments if:

  • For every fixed $s > 0$, the increment $X(t + s) - X(t)$ has the same distribution for all $t$. (correct)

Consider a stochastic process where $Y_t$ represents the total number of heads seen in the first $t$ tosses of a coin. What condition must be met for this process to have stationary increments?

  • The probability $p$ of landing on heads must be the same for each toss. (correct)

If a stochastic process has stationary increments, what can be said about the distribution of $X(t + s) - X(t)$?

  • It depends only on $s$. (correct)

Let $X_t$ be a stochastic process. If $E[X_t]$ is not constant in $t$, which of the following is true?

  • The process is not wide-sense stationary, since wide-sense stationarity requires a constant mean. (correct)

Assume a stochastic process $X(t)$ exhibits stationary increments. Which of the following statements is necessarily true?

  • The distribution of $X(t + s) - X(t)$ is identical for all $t$ given a fixed $s > 0$. (correct)

Let $X_n = \sum_{i=1}^{2n} Z_i$, where $Z_i$ are independent random variables. If $E[X_n] = 0$, what does this imply about $E[Z_i]$?

  • The sum of all $E[Z_i]$ must be 0. (correct)

Given $X_n = \sum_{i=1}^{2n} Z_i$, where $Z_i$ are independent random variables with $E[Z_i] = 0$. What can be said about $E[X_n X_{n+s}]$?

  • $E[X_n X_{n+s}] = \sum_{i=1}^{2n} Var(Z_i)$, which depends on $n$, so the process is not wide-sense stationary. (correct)

Let $X_n = \sum_{i=1}^{2n} Z_i$, where the $Z_i$ are independent random variables. What is the implication if $E[X_n] = 0$ for all $n$?

  • The sum of the expected values of $Z_i$ up to $2n$ is zero for every $n$. (correct)

Consider $X_n = \sum_{i=1}^{2n} Z_i$, with $Z_i$ being independent random variables. If $E[Z_i]=0$ and $Var(Z_i)=1$, what is the variance of $X_n$?

  • $2n$ (correct)
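This variance result can be illustrated with a quick simulation (the choice of normal $Z_i$ and all sample sizes are arbitrary; any i.i.d. distribution with mean 0 and variance 1 behaves the same):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5                                  # X_n sums the first 2n = 10 terms
n_paths = 100_000

# Z_i ~ N(0, 1): independent with mean 0 and variance 1.
z = rng.standard_normal((n_paths, 2 * n))
x_n = z.sum(axis=1)

# By independence, variances add: Var(X_n) = 2n = 10.
var_est = x_n.var()
```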

Given a stochastic process $X(t)$, what condition regarding its autocovariance function $R(t, s)$ must be satisfied for the process to be wide-sense stationary (WSS)?

  • $R(t, s)$ must be a function of $t - s$ only. (correct)

In the context of stochastic processes, what does it mean for a process to have 'independent increments'?

  • The changes in the process over non-overlapping time intervals are independent. (correct)

What is a key characteristic of a stochastic process with 'stationary increments'?

  • The increments of the process have the same distribution, regardless of the time interval's location. (correct)

If $R(t, s)$ is the autocovariance function of a wide-sense stationary process, how is $R(t, s)$ related to $R(0, t-s)$?

  • $R(t, s) = R(0, s - t)$; for a real-valued process this also equals $R(0, t - s)$ by symmetry. (correct)

What is an implication of a stochastic process ${X(t)}$ having independent increments on the covariance between $X(t_1) - X(t_0)$ and $X(t_3) - X(t_2)$ given that the intervals $[t_0, t_1]$ and $[t_2, t_3]$ do not overlap?

  • The covariance is zero. (correct)

Flashcards

Stochastic Process

A collection of random variables indexed by time.

Index Set (T)

The set of all possible times in a stochastic process (e.g., {1, 2, 3,...}).

State Space (S)

The set of all possible values that the random variable can take.

Discrete-Time Process

A stochastic process where time is discrete (e.g., t = 1, 2, 3...).

Discrete-Space Process

A stochastic process where the state space is discrete (e.g., S = {0, 1}).

Probability Mass Function (PMF)

Probability of a discrete random variable equaling some value.

Cumulative Distribution Function (CDF)

The probability that a random variable is less than or equal to a certain value.

Joint PMF

The probability of multiple events occurring together.

Expectation

The average value of a random variable.

Variance

A measure of the spread of a random variable around its mean.

Independent Increments

A stochastic process where differences between values at non-overlapping time intervals are independent.

Stationary Increments

A stochastic process where increments X(t+s) - X(t) have the same distribution for any fixed 's'.

Y(t) in Coin Tosses

Y(t) represents the total number of heads seen in the first 't' coin tosses.

Independent Increments (Example)

Increments are independent random variables when non overlapping.

Stationary Increment Dependence

The increment Y(t + s) - Y(t) depends only on 's', not 't'.

Bernoulli Trials

A random experiment with exactly two outcomes, success (heads) or failure (tails), with a fixed success probability.

Independent Bernoulli Trials

Each toss is unaffected by the outcomes of the other tosses.

Distribution function F(x)

F(x) = P(X ≤ x): the probability that the random variable takes a value no greater than x.

Constant Expected Values

For a weakly (wide-sense) stationary process, the expected value E(X(t)) is constant over time.

Stationary Increments (Distribution)

The distribution of an increment only depends on the length of the increment.

Xn Time Series

Xn represents a series of values at different points in time.

Expected Value

The expected value is the average value.

R(t,s) - Autocovariance

Autocovariance measures how a process co-varies with its own values at other times.

Implication of Weak Stationarity

Weak stationarity implies the expected value is constant and the autocovariance depends only on the time lag.

Geometric progressions

A sequence in which each term is a fixed multiple of the previous one.

Mean of X(t)

The mean or average value of the stochastic process at a specific time t.

f(x, t)

A function describing the probability distribution of X at a particular time t, denoted as f(x, t).

Variance of X(t)

Measures the spread or dispersion of X(t) around its mean at a specific time t.

Autocorrelation

Measures the relationship between the values of a stochastic process at two different times, t1 and t2.

R(t1, t2)

The expected product of the process at two different times: E[X(t1)X(t2)].

Value of Autocorrelation

Helps to determine how the process at one point in time is related to its values at another point in time.

Continuous State

A state space where the stochastic process can take on any value within a given range.

Discrete State

A state space where the stochastic process can only take on specific, separate values.

Full Stochastic Process Knowledge

Complete probabilistic description of the process's behavior across all times.

Moments and Correlations

A concise summary of moments and correlations at different time points.

f(x1, x2; t1, t2)

The joint probability distribution of X at two specific times t1 and t2. Denoted f(x1, x2; t1, t2).

Complex Conjugate

A complex number with the sign of its imaginary part flipped; a real number equals its own conjugate.

Dependency Between Time Points

Characterizes the degree of dependency between values of the process at two different time points.

Study Notes

  • A stochastic process is a collection of random variables used to model random processes.
  • Represented as {X(t) | t ∈ T}, where T is the parameter space and S is the state space.
  • X(t) denotes the random variables in the process.
  • The subscript notation Xₜ can be used instead of X(t) for the random variables in a stochastic process.
  • For each t ∈ T, X(t) represents a different random variable.

Types of Stochastic Processes

  • Stochastic processes are described by their parameter space T and state space S.
  • Discrete-time: T is countable.
  • Continuous-time: T is uncountable.
  • Discrete-state: S is countable.
  • Continuous-state: S is uncountable.
  • Four types of stochastic processes exist, based on combinations of discrete/continuous time and state.

Example: Modeling Customers in a Shop

  • Two options for modeling the number of customers in a shop during the day.
  • Option 1: X(t) = Number of customers in the shop at time t.
  • T = {t ∈ ℝ : 0 ≤ t ≤ 24} is uncountably infinite.
  • S = {0, 1, 2, ...} is countably infinite.
  • This option represents a continuous-time, discrete-state process.
  • Option 2: Xₙ = Number of customers in the shop after the nth customer leaves.
  • T = {1, 2, 3, ...} is countably infinite.
  • S = {0, 1, 2, 3, ...} is countably infinite.
  • This option represents a discrete-time, discrete-state process.

Statistics of Stochastic Processes

  • Three main points of interest when studying stochastic processes:
    • Dependencies that the sequences of values generated by the process exhibit.
    • Long-term averages of the generated sequence of values.
    • Characterization of the likelihood and frequency of certain boundary events.
  • {X(t) | t ∈ T} represents a stochastic process with parameter space T and state space S.

Density Functions

  • For a fixed element t of T:
    • The random variable X(t) has a cumulative distribution function F(x, t) = P(X(t) ≤ x) for every x ∈ ℝ.
    • Probability density function/probability mass function is given by:
      • Continuous-state: f(x, t) = ∂F(x, t) / ∂x
      • Discrete-state: f(x, t) = P(X(t) = x)
    • For values t₁, t₂ ∈ T with t₁ ≠ t₂, the joint cdf of random variables X(t₁) and X(t₂) is: F(x₁, x₂, t₁, t₂) = P(X(t₁) ≤ x₁ ∩ X(t₂) ≤ x₂).
    • Joint pdf/pmf:
      • S continuous: f(x₁, x₂, t₁, t₂) = ∂²F(x₁, x₂, t₁, t₂) / ∂x₁∂x₂
      • S discrete: f(x₁, x₂, t₁, t₂) = P(X(t₁) = x₁ ∩ X(t₂) = x₂)
    • Generalization to n random variables X(t₁), X(t₂), ..., X(tₙ) follows.
  • For a fixed t, X(t) is an ordinary random variable, so these definitions coincide with the usual ones for random variables.
  • Fully knowing a stochastic process requires knowing f(x₁, x₂,... xₙ; t₁, t₂,... tₙ) for all n ∈ ℕ and parameters t₁, t₂, ... tₙ ∈ T, which is usually impossible.

Moments and Correlations

  • For a fixed t ∈ T:

    • The mean or expectation of X(t) is given by:
      • S continuous: μ(t) = E(X(t)) = ∫₋∞ to ∞ x f(x, t) dx
      • S discrete: μ(t) = E(X(t)) = ∑ x f(x, t)
    • The variance of X(t) is given by:
      • S continuous: σ²(t) = Var(X(t)) = E((X(t) − μ(t))²) = ∫₋∞ to ∞ (x − μ(t))² f(x, t) dx
      • S discrete: σ²(t) = Var(X(t)) = E((X(t) − μ(t))²) = ∑ (x − μ(t))² f(x, t)
  • For fixed t₁, t₂ ∈ T:

    • The autocorrelation of X(t₁) and X(t₂) is: R(t₁, t₂) = E(X(t₁)X(t₂)).
      • Measures the relationship between the values of the process at times t₁ and t₂.
      • S continuous: R(t₁, t₂) = ∫₋∞ to ∞ ∫₋∞ to ∞ x₁x₂ f(x₁, x₂; t₁, t₂) dx₁ dx₂
      • S discrete: R(t₁, t₂) = ∑ ∑ x₁x₂ f(x₁, x₂; t₁, t₂)
    • The cross-correlation of two different processes X(t) and Y(t) is: Rxy(t₁, t₂) = E(X(t₁)Y(t₂)).
      • Measures how the two processes relate to each other across time.
    • The autocovariance of X(t₁) and X(t₂) is: C(t₁, t₂) = E((X(t₁) – μ(t₁))(X(t₂) – μ(t₂))) = R(t₁, t₂) – μ(t₁)μ(t₂).
      • This applies for discrete-state and continuous-state processes.
      • Measures the joint variability of the two random variables.

Example: Tossing a Coin

  • X(t) represents the outcome of the t-th toss: 1 if heads, 0 otherwise.
  • T = {1, 2, 3, ...}
  • S = {0, 1}
  • The joint pmf is calculated using independence of the coin tosses.
  • p is the probability of getting heads.
  • If the tosses were not independent, the joint pmf would not factor as P(X(t₁) = x₁) · P(X(t₂) = x₂).
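The discrete-state formulas from the Moments and Correlations section apply directly to this coin process; a minimal sketch (p = 0.4 is an arbitrary choice):

```python
from itertools import product

p = 0.4  # hypothetical probability of heads

def f(x, p=p):
    # Marginal pmf of one toss: f(1, t) = p, f(0, t) = 1 - p for every t.
    return p if x == 1 else 1 - p

def joint_pmf(x1, x2, p=p):
    # Independence lets the joint pmf factor into a product of marginals.
    return f(x1, p) * f(x2, p)

states = [0, 1]
mu = sum(x * f(x) for x in states)                  # mean: mu(t) = p
R = sum(x1 * x2 * joint_pmf(x1, x2)                 # autocorrelation: p**2
        for x1, x2 in product(states, repeat=2))
C = R - mu * mu                                     # autocovariance: 0
```

The same summations reproduce the quiz answers: μ(t) = p, R(t₁, t₂) = p², and C(t₁, t₂) = 0 for t₁ ≠ t₂.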

Strict-Sense Stationarity

  • A stochastic process is nth-order strict-sense stationary (SSS) if the joint distribution of the random variables at times t1, t2, ..., tn is the same as that at times t1+s, t2+s, ..., tn+s.
  • This must hold for all t1, t2, ..., tn ∈ T and all s ∈ ℝ such that ti + s ∈ T for i = 1, 2, ..., n.
  • If the process is nth-order SSS for every n ∈ ℕ, it is strict-sense stationary.

Consequences of Stationary SPs

Under Stationary SPs:

  • If f(x, t) does not depend on t (first-order SSS), then E(X(t)) = μ is a constant.
  • With a 2nd-order SSS process, f(x₁, x₂; t₁, t₂) = f(x₁, x₂; t₁ + s, t₂ + s) for all s.
    • Choosing s = -t₁ gives f(x₁, x₂; t₁, t₂) = f(x₁, x₂; 0, t₂ - t₁), a function of the difference t₂ - t₁ only.
    • In particular, the marginal distributions are shift-invariant, so μ(t₁) = μ(t₂) = μ is constant.
    • R(t₁, t₂) only depends on the difference t₂ - t₁ and not on t₁ itself.
    • C(t, t + s) = R(t, t + s) – μ(t)μ(t + s) only depends on the difference between t and t + s.
  • Because verifying strict-sense stationarity directly is difficult, a weaker definition is commonly used.

Definition of Wide-Sense Stationary

  • A stochastic process {X(t) | t ∈ T} is wide-sense stationary (WSS) if, for all t ∈ T and s ∈ ℝ such that t + s ∈ T, neither E(X(t)) nor R(t, t + s) depends on t.
  • Two processes {X(t) | t ∈ Tx} and {Y(t) | t ∈ Ty} are jointly wide-sense stationary if both are WSS and the cross-correlation Rxy(t, t + s) does not depend on t.

Example: Wide-Sense Stationary

  • X(t) defined as r cos(at + φ).
  • r and a are constants.
  • φ is a random variable, uniformly distributed on [0, 2π).
  • E[X(t)] = 0 for every t, a constant independent of time.
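This example can be checked numerically, assuming φ uniform on [0, 2π); r, a, s, and the sample size below are arbitrary choices. The autocorrelation works out to R(t, t + s) = (r²/2)·cos(as), a function of s only:

```python
import numpy as np

rng = np.random.default_rng(2)
r, a = 2.0, 1.5                           # arbitrary constants
phi = rng.uniform(0, 2 * np.pi, 500_000)  # assumed uniform on [0, 2*pi)

def X(t):
    # One realization of the process per sampled phase phi.
    return r * np.cos(a * t + phi)

# Sample mean is ~0 at every t, and E[X(t) X(t+s)] matches (r**2/2)cos(a*s)
# regardless of t, so the process is wide-sense stationary.
s = 0.8
est = (X(1.0) * X(1.0 + s)).mean()
theory = r**2 / 2 * np.cos(a * s)
```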

Increments of a Stochastic Process

  • An increment of a random process {X(t) | t ∈ T} is X(t₂) - X(t₁) for some t₁, t₂ ∈ T with t₂ > t₁.
  • A stochastic process {X(t) | t ∈ T} has independent increments if, for every n ∈ ℕ and t₀ < t₁ < ... < tₙ ∈ T, the random variables X(t₀), X(t₁) - X(t₀), ..., X(tₙ) - X(tₙ₋₁) are jointly independent.
    • I.e., the differences between values of the process over non-overlapping time intervals are independent.
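Non-overlapping increments of a counting process built from independent coin tosses illustrate this definition; a small sketch (p, the intervals, and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
p = 0.5
steps = rng.binomial(1, p, (100_000, 20))
X = steps.cumsum(axis=1)              # counting process built from coin tosses

# Increments over the non-overlapping intervals (0, 5] and (10, 15]:
inc1 = X[:, 4]                        # X(5) - X(0), taking X(0) = 0
inc2 = X[:, 14] - X[:, 9]             # X(15) - X(10)

# Independent increments imply zero covariance between the two.
cov = np.cov(inc1, inc2)[0, 1]
```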

Stationary Increments

  • A stochastic process {X(t) | t = 1, 2, ...} has stationary increments if, for every fixed s > 0, the increment X(t + s) - X(t) has the same distribution for all t ∈ T.
    • The distribution of an increment depends only on the length of the time period it spans, i.e. s.

Example: Stationary Increments

  • Consider the stochastic process of repeatedly flipping a coin with probability p of landing on heads.
  • Y(t) represents the total number of heads in the first t tosses.
  • Y has stationary increments: Y(t + s) - Y(t) counts the heads in s tosses, which is Binomial(s, p) regardless of t.
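A simulation sketch of this example (p, s, the window start times, and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.5
n_paths, n_tosses = 100_000, 30

tosses = rng.binomial(1, p, (n_paths, n_tosses))
Y = tosses.cumsum(axis=1)            # Y[:, t-1] = heads in the first t tosses

s = 5
# Increments over windows of equal length s starting at different times t:
inc_a = Y[:, 9] - Y[:, 4]            # Y(10) - Y(5), i.e. t = 5
inc_b = Y[:, 24] - Y[:, 19]          # Y(25) - Y(20), i.e. t = 20

# Both increments are Binomial(s, p): their moments match regardless of t.
```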


Description

Explore stochastic processes, which model random phenomena using random variables. Learn about parameter and state spaces and their influence on process types. Understand discrete and continuous-time stochastic processes. See examples in real-world applications.
