Advanced Probability Theory
48 Questions

Questions and Answers

Let $T$ be a random variable. Which of the following conditions is both necessary and sufficient for $T$ to be a stopping time with respect to a filtration $(\mathcal{F}_n)_{n \geq 0}$?

  • $\forall n \in \mathbb{N}, \{T = n\} \in \mathcal{F}_n$ (correct)
  • $\forall n \in \mathbb{N}, \{T \leq n\} \in \mathcal{F}_{n-1}$
  • $\forall n \in \mathbb{N}, \{T = n\} \in \mathcal{F}_{n+1}$
  • $\forall n \in \mathbb{N}, \{T \geq n\} \in \mathcal{F}_n$

Consider a sequence of random variables $(X_n)_{n \geq 0}$ and a measurable set $B \in \mathcal{B}(\mathbb{R})$. Let $T = \inf\{n : X_n \in B\}$ and $S = \sup\{n : X_n \in B\}$. Which of the following statements is generally true?

  • Neither $T$ nor $S$ can be stopping times.
  • $S$ is a stopping time, but $T$ is not necessarily a stopping time.
  • Both $T$ and $S$ are always stopping times.
  • $T$ is a stopping time, but $S$ is not necessarily a stopping time. (correct)
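
The distinction in this question can be seen concretely: a first-entrance time is decidable from the path observed so far, while a last-exit time needs the whole future. A minimal Python sketch (the random walk and the target set $B = \{3\}$ are illustrative choices, not from the source):

```python
import random

# T = inf{n : X_n in B} is decidable at time n from X_0, ..., X_n alone,
# hence a stopping time; S = sup{n : X_n in B} needs the whole future.
# The walk and the target set B = {3} are illustrative choices.
random.seed(0)

def first_entrance_time(path, B):
    for n, x in enumerate(path):
        if x in B:
            return n            # decided without looking past index n
    return float("inf")         # convention: inf of the empty set

path = [0]
for _ in range(100):
    path.append(path[-1] + random.choice([-1, 1]))

T = first_entrance_time(path, {3})
```

The loop never inspects the path beyond index $n$ before deciding $\{T = n\}$, which is exactly the stopping-time property; no analogous causal computation exists for the supremum.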

Let $T$ be a stopping time with respect to a filtration $(\mathcal{F}_n)_{n \geq 0}$, and $X_n$ a sequence of random variables. If $X_T(\omega) = X_{T(\omega)}(\omega)$ on $\{T < \infty\}$ and 0 otherwise, how is the $\sigma$-algebra $\mathcal{F}_T$ defined, encapsulating information up to the random time $T$?

  • $\mathcal{F}_T = \{A \in \mathcal{F}_\infty : A \cap \{T > n\} \in \mathcal{F}_n \}$
  • $\mathcal{F}_T = \{A \in \mathcal{F}_\infty : A \cap \{T < n\} \in \mathcal{F}_n \}$
  • $\mathcal{F}_T = \{A \in \mathcal{F}_\infty : A \cap \{T \leq n\} \in \mathcal{F}_n \}$ (correct)
  • $\mathcal{F}_T = \{A \in \mathcal{F}_\infty : A \cap \{T = n\} \in \mathcal{F}_n \}$

Consider stopping times $S$ and $T$. Which of the following statements regarding the relationship between $\mathcal{F}_S$ and $\mathcal{F}_T$ is correct if $S \leq T$ almost surely?

Answer: $\mathcal{F}_S \subseteq \mathcal{F}_T$ (A)

Let $(X_n)_{n \geq 0}$ be a sequence of random variables and $T$ a stopping time. Which of the following expressions defines the stopped process $(X_{T \wedge n})_{n \geq 0}$ correctly, emphasizing the path-wise behavior of the process up to the stopping time?

Answer: $(X_{T(\omega) \wedge n}(\omega))_{n \geq 0}$ (D)

Given stopping times $T_n$ for $n \geq 0$, which of the following operations on the sequence $(T_n)_{n \geq 0}$ does not necessarily result in another stopping time?

Answer: $\frac{1}{N} \sum_{n=1}^{N} T_n$, where $N$ is a fixed positive integer. (A)

Consider a random variable $X_n^* = \max_{k \leq n} |X_k|$ representing the running maximum of a sequence of random variables $(X_n)_{n \geq 0}$. For $p > 1$ and $k > 0$, and defining $X_n^* \wedge k = \min(X_n^*, k)$, which inequality relating the $L^p$ norm of $X_n^* \wedge k$ and $X_n$ is most accurate, reflecting a maximal inequality concept?

Answer: $||X_n^* \wedge k||_p^p \leq \frac{p}{p-1} E[|X_n|^p]$ (D)
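
The maximal inequality behind this question is Doob's $L^p$ inequality, $\|X_n^*\|_p \leq \frac{p}{p-1}\|X_n\|_p$. A Monte Carlo sanity sketch for $p = 2$ on a simple random walk, which is a martingale (the parameters are arbitrary illustrative choices):

```python
import random

# Monte Carlo check of Doob's L^p maximal inequality for p = 2 on a simple
# random walk S_k (a martingale): E[(max_k |S_k|)^2] <= 4 * E[S_n^2].
# n and the number of trials are arbitrary illustrative choices.
random.seed(1)
n, trials = 50, 2000
max_sq = end_sq = 0.0
for _ in range(trials):
    s, m = 0, 0
    for _ in range(n):
        s += random.choice([-1, 1])
        m = max(m, abs(s))
    max_sq += m * m
    end_sq += s * s
max_sq /= trials
end_sq /= trials          # should be close to Var(S_n) = n = 50
ratio = max_sq / end_sq   # Doob bounds this by (p/(p-1))^p = 4 for p = 2
```

The observed ratio sits well below the Doob constant $4$, as the inequality predicts.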

Let $X$ be a stochastic process and $T$ a stopping time with respect to the filtration $(\mathcal{F}_t)$. Suppose that for every bounded martingale $M$, the process $M_t^{T} = M_{t \wedge T}$ is also a martingale. Which of the following provides the most precise description of what can be concluded about the process $X$ and the stopping time $T$?

Answer: The stopped process $X^T$ is adapted to the filtration $(\mathcal{F}_t)$, and $E[X_T | \mathcal{F}_0] = X_0$. (B)

Consider a stochastic process $(X_t)_{t \geq 0}$. Which of the following statements, concerning the relationship between finite-dimensional distributions and the law of $X$, is most accurate under general conditions?

Answer: Knowledge of all finite-dimensional distributions is sufficient to uniquely determine the law of $X$ because the cylinder sets form a $\pi$-system generating the $\sigma$-algebra. (A)

Suppose $(X_t)_{t \geq 0}$ is a stochastic process. Which of the following statements most accurately describes the relationship between the process being cadlag and its sample paths?

Answer: The stochastic process $(X_t)_{t \geq 0}$ is cadlag if and only if almost surely, the map $t \mapsto X_t(\omega)$ is continuous from the right and has limits from the left. (D)

Let $X: [0, \infty) \rightarrow \mathbb{R}$ be a function. Which of the following is the most accurate interpretation of the statement that $X$ is a cadlag function?

Answer: For all $t \in [0, \infty)$, $\lim_{s \downarrow t} X_s = X_t$ and $\lim_{s \uparrow t} X_s$ exists, regardless of whether they are equal. (A)

In the context of Kolmogorov's criterion, given random variables $(\rho_t)_{t \in I}$ where $I \subseteq [0, 1]$ is dense, and assuming $\|\rho_t - \rho_s\|_p \leq C|t - s|^{\beta}$ for some $p > 1$ and $\beta > \frac{1}{p}$, what is the most precise interpretation of the role of the condition $\beta > \frac{1}{p}$?

Answer: It provides a sufficient condition for the existence of a continuous modification $(X_t)_{t \in [0, 1]}$ and ensures that the Hölder exponent $\alpha$ can be chosen such that the sample paths are almost surely $\alpha$-Hölder continuous for $\alpha \in [0, \beta - \frac{1}{p})$. (A)
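
For Brownian motion, $\|B_t - B_s\|_p = c_p |t - s|^{1/2}$, so the criterion applies with $\beta = 1/2$ whenever $p > 2$, giving Hölder exponents up to $\frac{1}{2} - \frac{1}{p}$. A quick numerical sketch of this scaling (sample sizes and lags are arbitrary choices):

```python
import math
import random

random.seed(2)

def p_norm_of_increment(h, p=4, samples=20000):
    """Monte Carlo L^p norm of B_{t+h} - B_t ~ N(0, h)."""
    m = sum(abs(random.gauss(0.0, math.sqrt(h))) ** p for _ in range(samples)) / samples
    return m ** (1.0 / p)

# ||B_t - B_s||_p = c_p |t - s|^(1/2), i.e. beta = 1/2 in Kolmogorov's
# criterion; shrinking the time lag by a factor of 4 should halve the norm.
ratio = p_norm_of_increment(0.4) / p_norm_of_increment(0.1)
```

The estimated ratio is close to $2 = (0.4/0.1)^{1/2}$, consistent with $\beta = 1/2$.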

Consider the space $D([0, \infty), \mathbb{R})$ of cadlag functions. Which of the following statements is most accurate concerning the $\sigma$-algebra typically used on this space?

Answer: The $\sigma$-algebra on $D([0, \infty), \mathbb{R})$ is typically generated by the coordinate functions $X \mapsto X_s$ for all $s \geq 0$, and it coincides with the Borel $\sigma$-algebra induced by the Skorokhod metric. (B)

Suppose we have a collection of random variables $(\rho_t)_{t \in I}$, where $I \subseteq [0, 1]$ is a dense set. Under the conditions of Kolmogorov's criterion, specifically assuming that for some $p > 1$ and $\beta > \frac{1}{p}$, we have $E[|\rho_t - \rho_s|^p] \leq C|t - s|^{\beta}$ for all $t, s \in I$, which of the following additional conditions is absolutely necessary to ensure the uniqueness (in the almost sure sense) of the continuous modification $(X_t)_{t \in [0,1]}$ such that $X_t = \rho_t$ almost surely for all $t \in I$?

Answer: No additional conditions are needed; Kolmogorov's criterion inherently provides uniqueness. (C)

Consider a measurable space $(\Omega, \mathcal{F})$ and two probability measures $P$ and $Q$ on it. If $Q$ is absolutely continuous with respect to $P$, which of the following statements must hold regarding the Radon-Nikodym derivative $\frac{dQ}{dP}$?

Answer: $\frac{dQ}{dP}$ is measurable with respect to the completion of $\mathcal{F}$ under $P$, accommodating scenarios where $\mathcal{F}$ is not complete but $Q$ remains absolutely continuous with respect to $P$. (C)

Let $P$ and $Q$ be two probability measures on a measurable space $(\Omega, \mathcal{F})$. Suppose that $Q$ is absolutely continuous with respect to $P$. Which of the following conditions is sufficient to ensure the existence of a bounded Radon-Nikodym derivative $\frac{dQ}{dP}$?

Answer: There exists a constant $M > 0$ such that for all $A \in \mathcal{F}$, $Q(A) \leq M \cdot P(A)$. (C)
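
On a finite sample space the Radon-Nikodym derivative is just a ratio of point masses, which makes the bounded-density condition easy to inspect. A toy sketch (the two measures below are made-up illustrative numbers):

```python
# Radon-Nikodym on a finite sample space: dQ/dP(w) = Q({w}) / P({w}) wherever
# P({w}) > 0, and Q(A) = E_P[1_A * dQ/dP].  The two measures are made up;
# here Q(A) <= 1.5 * P(A) for every A, so the density is bounded by M = 1.5.
P = {"a": 0.5, "b": 0.3, "c": 0.2}
Q = {"a": 0.25, "b": 0.45, "c": 0.3}
density = {w: Q[w] / P[w] for w in P}   # dQ/dP pointwise

def Q_of(A):
    return sum(P[w] * density[w] for w in A)   # E_P[1_A * dQ/dP]

check = Q_of({"a", "c"})   # should equal Q({a}) + Q({c}) = 0.55
```

The condition $Q(A) \leq M \cdot P(A)$ translates exactly into the pointwise bound $\frac{dQ}{dP} \leq M$, which here holds with $M = 1.5$.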

Given a stochastic process $(X_t)_{t \geq 0}$, which of the following statements best differentiates between the concepts of continuity and cadlag properties in the context of stochastic processes?

Answer: A stochastic process is continuous if its sample paths are continuous, while it is cadlag if its sample paths are right-continuous with left limits, allowing for jumps but maintaining a certain regularity in the discontinuities. (D)

Consider a probability space $(\Omega, \mathcal{F}, P)$ and a sub-sigma algebra $\mathcal{G} \subseteq \mathcal{F}$. Let $X$ be a random variable such that $E[|X|] < \infty$. If the conditional expectation $E[X | \mathcal{G}]$ is pathwise unique (i.e., any version of the conditional expectation is equal $P$-almost surely), what deeper property does this imply about the structure of $\mathcal{G}$ and its relationship to $X$?

Answer: The sigma-algebra $\mathcal{G}$ is atomic up to $P$-null sets, where each atom $A \in \mathcal{G}$ either almost surely determines the value of $X$ or is irrelevant. (C)

Let $X_t$ be a stochastic process indexed by $t \in [0, \infty)$. Suppose $X_t$ satisfies Kolmogorov's criterion on a dense subset $I \subseteq [0, 1]$. Which of the following statements regarding the Hölder continuity of the resulting continuous modification is the most precise?

Answer: For any $\alpha < \beta - \frac{1}{p}$, there exists a random variable $K_\alpha \in L^p$ such that $|X_s - X_t| \leq K_\alpha |s - t|^\alpha$ for all $s, t \in [0, 1]$, but there's no guarantee that such a $K_\alpha$ exists for $\alpha = \beta - \frac{1}{p}$. (B)

Let $(\Omega, \mathcal{F}, P)$ be a probability space. Suppose $(X_n)_{n \geq 0}$ is a martingale with respect to a filtration $(\mathcal{F}_n)_{n \geq 0}$. Under what conditions does the martingale converge both almost surely and in $L^1$?

Answer: The martingale $(X_n)_{n \geq 0}$ is bounded in $L^p$ for some $p > 1$, ensuring control over the tail behavior of the martingale. (B)

Consider a measurable space $(\Omega, \mathcal{F})$ and a $\sigma$-finite measure $\mu$ on it. If a positive function $f$ satisfies $\int_A f d\mu = 0$ for all $A \in \mathcal{F}$, what can be definitively concluded about the function $f$?

Answer: The function $f$ is zero $\mu$-almost everywhere on $\Omega$, allowing for a set of $\mu$-measure zero where $f$ is non-zero. (C)

Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let $(X_n)_{n \geq 1}$ be a sequence of i.i.d. random variables with $E[X_1] = 0$ and $E[X_1^2] = 1$. Define $S_n = \sum_{i=1}^n X_i$. Which of the following statements accurately describes the asymptotic behavior of $\frac{S_n}{\sqrt{n}}$?

Answer: $\frac{S_n}{\sqrt{n}}$ converges in distribution to a standard normal distribution $N(0, 1)$ as $n \to \infty$, as predicted by the central limit theorem. (B)
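
A Monte Carlo illustration of the central limit theorem for $\pm 1$ steps (the sample sizes are arbitrary): the normalized sums $S_n/\sqrt{n}$ should have mean near $0$ and variance near $1$:

```python
import random
import statistics

# CLT illustration: X_i uniform on {-1, +1} has mean 0 and variance 1,
# so S_n / sqrt(n) is approximately N(0, 1) for large n.
# n and the number of trials are arbitrary illustrative choices.
random.seed(3)
n, trials = 400, 3000
samples = []
for _ in range(trials):
    s = sum(random.choice([-1, 1]) for _ in range(n))
    samples.append(s / n ** 0.5)
m = statistics.fmean(samples)      # should be near 0
v = statistics.pvariance(samples)  # should be near 1
```

A histogram of `samples` would show the familiar bell shape; here only the first two moments are checked.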

Suppose $(\Omega, \mathcal{F}, P)$ is a probability space and $X$ is an integrable random variable. Let $(\mathcal{F}_n)_{n \geq 0}$ be a filtration. If $X_n = E[X | \mathcal{F}_n]$, under what condition is $(X_n)_{n \geq 0}$ guaranteed to be a uniformly integrable martingale?

Answer: The random variable $X$ is integrable, belonging to $L^1(P)$, and the conditional expectations are well-defined. (D)

Let $(\Omega, \mathcal{F}, P)$ be a probability space. Consider a sequence of random variables $(X_n)_{n \geq 1}$ converging in probability to a random variable $X$. Under what additional condition does this convergence imply convergence in $L^1$ (i.e., $E[|X_n - X|] \to 0$)?

Answer: The sequence $(X_n)_{n \geq 1}$ is stochastically dominated by an integrable random variable $Y$, i.e., $P(|X_n| > t) \leq P(Y > t)$ for all $t > 0$ and all $n$, and $E[Y] < \infty$. (B)

Given a standard Brownian motion $(B_t)_{t \geq 0}$ and a stopping time $T$, and defining a reflected process $\tilde{B}_t$ as $\tilde{B}_t = B_t \cdot \mathbf{1}_{\{t < T\}} + (2B_T - B_t) \cdot \mathbf{1}_{\{t \geq T\}}$, what is the most precise characterization of the relationship between the increments of $B^*$ and $\tilde{B}$ as $n \to \infty$, considering $T_n$ as an approximation of $T$?

Answer: The increments of $B^*$ converge almost surely to the increments of $\tilde{B}$, given the continuity of $B$ and the convergence of $T_n$ to $T$ almost surely. (C)
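
For the simple random walk the reflection principle gives the exact identity $P(\max_{k \leq n} S_k \geq a) = P(S_n > a) + P(S_n \geq a)$, the discrete analogue of $P(\sup_{t \leq 1} B_t \geq a) = 2P(B_1 \geq a)$. A Monte Carlo sketch (the parameters are arbitrary illustrative choices):

```python
import random

# Reflection principle for the simple random walk:
# P(max_{k<=n} S_k >= a) = P(S_n > a) + P(S_n >= a)   (exact identity).
# With n = 400 and a = 20 = sqrt(n), this approximates
# P(sup_{t<=1} B_t >= 1) = 2 P(B_1 >= 1), roughly 0.317.
random.seed(4)
n, a, trials = 400, 20, 4000
hit_max = hit_end_strict = hit_end_weak = 0
for _ in range(trials):
    s, m = 0, 0
    for _ in range(n):
        s += random.choice([-1, 1])
        m = max(m, s)
    hit_max += m >= a
    hit_end_strict += s > a
    hit_end_weak += s >= a
lhs = hit_max / trials
rhs = (hit_end_strict + hit_end_weak) / trials  # should match lhs closely
```

The agreement of `lhs` and `rhs` reflects exactly the path-flipping argument behind the reflected process in the question.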

Considering a bounded domain $D \subset \mathbb{R}^d$ and a function $u \in C^2(D)$ satisfying Laplace's equation, which of the following statements regarding the mean value property is most accurate in the context of characterizing harmonic functions?

Answer: The mean value property is both a necessary and sufficient condition for $u$ to be harmonic in $D$, provided $u$ is continuous and the integrals exist. (C)

Let $(B_t)_{t \geq 0}$ be a standard Brownian motion in $\mathbb{R}^d$, and let $u: \mathbb{R}^d \rightarrow \mathbb{R}$ be a harmonic function such that $E[|u(x + B_t)|] < \infty$ for any $x \in \mathbb{R}^d$ and $t \geq 0$. Which of the following modifications to Itô's lemma would be most pertinent in demonstrating that $(u(B_t))_{t \geq 0}$ is a martingale with respect to $(\mathcal{F}_t)_{t \geq 0}$?

Answer: Itô's lemma must be adapted to account for the stochastic integral term vanishing due to the harmonicity of $u$, resulting in a simplification of the stochastic process. (D)

Consider a stochastic process $X_t = f(B_t)$, where $B_t$ is a standard Brownian motion and $f$ is a continuously differentiable function. Under what condition does $X_t$ qualify as a local martingale but not necessarily a true martingale?

Answer: The stochastic integral $\int_0^t f'(B_s) dB_s$ must satisfy the integrability condition for local martingales but $E[|X_t|] = \infty$ for some $t$, thus preventing it from being a true martingale. (B)

Let $B_t$ represent standard Brownian motion. If a functional $F[B]$ is invariant under time reversal, what profound implication does this have for characterizing the statistical properties of $B_t$ in relation to its time-reversed counterpart?

Answer: The invariance dictates that $B_t$ and $B_{1-t}$ (for $0 \leq t \leq 1$) have the same finite-dimensional distributions, directly linking forward and backward statistical behaviors. (A)

Suppose a stochastic process $X_t$ satisfies the stochastic differential equation $dX_t = \mu(X_t)dt + \sigma(X_t)dB_t$, where $B_t$ is a standard Brownian motion. Under what condition does the scale function $s(x)$ guarantee that $X_t$ is inaccessible from the interval $(a, b)$?

Answer: When $s(x)$ is unbounded as $x$ approaches either $a$ or $b$, ensuring that the boundaries are impenetrable and act as reflecting barriers. (C)

Given a martingale $(M_t)_{t \geq 0}$ with continuous paths and $M_0 = 0$, and defining its quadratic variation process as $\langle M \rangle_t$, what is the precise interpretation of the statement $\langle M \rangle_t = 0$ for all $t \geq 0$ almost surely?

Answer: This implies that $M_t$ is constant almost surely, i.e. $M_t = 0$ for all $t$, indicating the absence of any stochastic fluctuations whatsoever. (A)

Let $X$ be a random variable representing the payoff of a derivative security at maturity $T$, and consider a risk-neutral measure $\mathbb{Q}$. What is the economic implication of the statement that $E^{\mathbb{Q}}[X] = 0$?

Answer: The derivative security is fairly priced, reflecting a market equilibrium where the expected payoff under the risk-neutral measure equals its initial cost, normalized to zero. (C)

Given a sequence of independent and identically distributed random variables $X_1, X_2, \ldots, X_n$ with mean $\bar{x} = E[X_1]$ and $S_n = X_1 + \ldots + X_n$, and considering Cramér's theorem concerning large deviations, what is the significance of the Legendre transform $\psi^*(a)$ in determining the rate of decay for $P(S_n \geq an)$ where $a > \bar{x}$?

Answer: $\psi^*(a)$ represents the exact exponential decay rate, such that $\lim_{n \to \infty} \frac{1}{n} \log P(S_n \geq an) = -\psi^*(a)$, accurately quantifying the speed of convergence. (C)

Suppose $b_n = -\log P(S_n \geq an)$ forms a sub-additive sequence, where $S_n$ is the sum of $n$ independent and identically distributed random variables. According to Fekete's lemma, what can be rigorously inferred about the asymptotic behavior of $\frac{b_n}{n}$?

Answer: $\lim_{n \to \infty} \frac{b_n}{n}$ exists and is equal to the infimum of the sequence $\{\frac{b_k}{k}\}_{k=1}^{\infty}$. (B)
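
Fekete's lemma can be watched numerically on a concrete sub-additive sequence; here $b_n = n + \sqrt{n}$ (an illustrative choice, not from the source), for which $b_n/n$ decreases to the infimum $1$:

```python
import math

# b_n = n + sqrt(n) is sub-additive because sqrt(m + n) <= sqrt(m) + sqrt(n).
# Fekete's lemma: b_n / n converges to inf_k b_k / k, which here equals 1.
def b(n):
    return n + math.sqrt(n)

ratios = [b(n) / n for n in range(1, 2001)]
tail = ratios[-1]        # b_n / n for n = 2000; close to the limit 1
infimum = min(ratios)    # here the ratios decrease, so the infimum is the tail
```

Since $b_n/n = 1 + 1/\sqrt{n}$ is decreasing, the running infimum coincides with the last term, matching the lemma's conclusion $\lim b_n/n = \inf_k b_k/k$.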

Consider a scenario where $P(X_1 \leq 0) = 1$ for a random variable $X_1$. Based on the provided context relating to large deviations theory, what is the precise expression for $\frac{1}{n} \log P(S_n \geq 0)$, where $S_n$ represents the sum of $n$ independent and identically distributed random variables each distributed as $X_1$?

Answer: $\frac{1}{n} \log P(S_n \geq 0) = \log P(X_1 = 0)$ (A)

In the context of Cramér's theorem, if the moment generating function $M(\lambda) = E[e^{\lambda X_1}]$ exists for a random variable $X_1$, and $\psi(\lambda) = \log M(\lambda)$, how does the Legendre transform $\psi^*(a)$ relate to the Chernoff bound for $P(S_n \geq an)$, where $S_n = \sum_{i=1}^{n} X_i$?

Answer: $\psi^*(a)$ is equal to the exponent in the tightest possible Chernoff bound, providing the precise exponential rate of decay for $P(S_n \geq an)$. (D)
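
For a standard normal $X_1$ one has $\psi(\lambda) = \lambda^2/2$ and hence $\psi^*(a) = a^2/2$ for $a \geq 0$, which a crude grid search over $\lambda \geq 0$ reproduces (grid size and range are arbitrary illustrative choices):

```python
# For X_1 ~ N(0, 1): psi(lambda) = lambda^2 / 2, so the Legendre transform is
# psi*(a) = sup_{lambda >= 0} (a*lambda - psi(lambda)) = a^2 / 2 for a >= 0.
# The grid search below is a sketch; grid size and range are arbitrary.
def psi(lam):
    return lam * lam / 2.0

def psi_star(a, grid=20000, lam_max=10.0):
    return max(a * (lam_max * k / grid) - psi(lam_max * k / grid)
               for k in range(grid + 1))

rate = psi_star(1.5)  # Chernoff/Cramér rate for P(S_n >= 1.5 n); exact: 1.125
```

The optimizing $\lambda$ equals $a$ here, and Cramér's theorem then gives $P(S_n \geq 1.5\,n) \approx e^{-1.125\,n}$ on the exponential scale.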

Consider a series of independent, identically distributed random variables and let $\psi(\lambda)$ represent the cumulant generating function. What specific optimization problem must be solved to compute the rate function $\psi^*(a)$ in Cramér's theorem, and what constraints, if any, apply to the optimization variable $\lambda$?

Answer: Maximize $a\lambda - \psi(\lambda)$ with respect to $\lambda$ subject to the constraint that $\lambda$ is real and non-negative. (C)

Given Cramér's theorem, under what precise condition concerning the relationship between $a$ and $\bar{x}$ (the mean of the i.i.d. random variables) does the large deviation principle become relevant for analyzing the tail behavior of $P(S_n \geq an)$, and why is this condition necessary?

Answer: The large deviation principle is relevant if and only if $a > \bar{x}$, because this ensures that the event $S_n \geq an$ is a rare event whose probability decays exponentially. (C)

Suppose one aims to establish a lower bound on the probability $P(S_n \geq 0)$ using large deviation theory. What specific assumptions or transformations are invoked to simplify the analysis, and what critical challenge arises when $a = 0$ in applying Cramér's theorem?

Answer: Translating $X_i$ by a constant to enforce a zero mean simplifies the analysis; however, when $a = 0$, the infimum of $\psi(\lambda)$ over non-negative $\lambda$ may be zero, thereby complicating the lower bound estimation. (A)

In the context of large deviations and Cramér's theorem, if we define $\psi(\lambda) = \log E[e^{\lambda X_1}]$ and $\psi^*(a) = \sup_{\lambda \geq 0} (a\lambda - \psi(\lambda))$, how does the non-negativity of $\psi^*(a)$ relate to the fundamental properties of moment generating functions and their implications for bounding probabilities?

Answer: $\psi^*(a)$ is always non-negative because it stems from maximizing $a\lambda - \psi(\lambda)$ and is fundamentally linked to the properties of moment generating functions, ensuring that the probability bounds remain meaningful. (D)

Consider a filtered probability space $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$ and a stochastic process $(X_t)_{t \geq 0}$. Which of the following conditions is sufficient to ensure that $(X_t)_{t \geq 0}$ is a martingale with respect to the filtration $(\mathcal{F}_t)_{t \geq 0}$?

Answer: $E[X_t | \mathcal{F}_s] = X_s$ almost surely for all $s \leq t$, $E[|X_t|] < \infty$ for all $t \geq 0$, and $X_t$ is $\mathcal{F}_t$-measurable. (B)

Let $(B_t)_{t \geq 0}$ be a standard Brownian motion. Which of the following stochastic integrals is pathwise the least regular (i.e., possesses the worst sample path properties)?

Answer: $\int_{0}^{t} B_{s}^{2} dB_s$ (B)

Suppose $(X_n)_{n \geq 1}$ is a sequence of independent and identically distributed random variables with characteristic function $\phi(t)$. According to Cramér's large deviation theorem, under suitable conditions, the probability $P(\sum_{i=1}^{n} X_i > na)$ decays exponentially in $n$. Which of the following large deviation rate functions, $I(a)$, correctly characterizes this exponential decay?

Answer: $I(a) = \sup_{t} [at - \log(E[e^{tX_1}])]$ (B)

Consider a Lévy process $(X_t)_{t \geq 0}$ with Lévy exponent $\psi(u)$. Which of the following statements is not generally true regarding the properties of $\psi(u)$?

Answer: $\psi(u)$ is infinitely differentiable. (B)

Let $X$ and $Y$ be random variables on $(\Omega, \mathcal{F}, P)$. Which of the following statements regarding conditional expectation is always true without additional assumptions?

Answer: $E[aX + bY | \mathcal{G}] = aE[X | \mathcal{G}] + bE[Y | \mathcal{G}]$ for any sub-$\sigma$-algebra $\mathcal{G} \subseteq \mathcal{F}$ and constants $a, b \in \mathbb{R}$. (B)

Let $(M_t)_{t \geq 0}$ be a continuous martingale with $M_0 = 0$. Define its quadratic variation as $[M]_t$. Which of the following statements is not a direct consequence of the definition or properties of quadratic variation?

Answer: If $M_t$ is a Brownian motion, then $[M]_t$ is absolutely continuous with respect to Lebesgue measure. (B)

Consider a Poisson random measure $N(dt, dx)$ on a space $E$ with intensity measure $\lambda(dx)$. For a measurable function $f: E \to \mathbb{R}$, the integral $\int_E f(x) N(dt, dx)$ is well-defined. Under which condition is the integral $\int_E f(x) N(dt, dx)$ a martingale?

Answer: $\int_E f(x) \lambda(dx) = 0$ (D)

Suppose $(B_t)_{t \geq 0}$ is a standard Brownian motion. Let $\tau = \inf\{t \geq 0 : B_t = a\}$ be the hitting time of level $a > 0$. Which of the following statements regarding the strong Markov property applied to $\tau$ is generally correct?

Answer: $(B_{\tau + t} - a)_{t \geq 0}$ is a Brownian motion independent of $\mathcal{F}_\tau$. (C)

Flashcards

Sigma-algebra

A family of subsets of a set, closed under complement, countable unions, and containing the empty set.

Measure

A function that assigns a non-negative real number or +∞ to subsets of a set. Satisfies measure axioms (non-negativity, null empty set, countable additivity).

Filtration

An increasing sequence of sigma-algebras, representing the information available at different points in time.

Conditional Expectation

The expected value of a random variable, given information about another random variable or event.

Martingale

A sequence of random variables where the expected value of the next variable, given the past, is equal to the current variable.

Optional Stopping

A procedure to stop a stochastic process based on the history of the process so far.

Central Limit Theorem

The convergence in distribution of the normalized sum of independent, identically distributed random variables to a standard normal distribution.

Brownian Motion

A continuous-time stochastic process with continuous paths and independent Gaussian increments; it enjoys scaling and symmetry properties.

Stopping Time (T)

A random variable T is a stopping time if the event {T = n} is in the sigma-algebra Fn for all n.

Sigma-Algebra FT

For a stopping time T, the sigma-algebra FT consists of the events A ∈ F∞ such that A ∩ {T ≤ n} ∈ Fn for every n.

Random Variable XT

The random variable XT is the value of the process X at the stopping time T.

Stopped Process (XnT)

The stopped process (XnT) is the process X up to time T, after which it remains constant.

T ∨ S (Max of Stopping Times)

If T and S are stopping times, then T ∨ S (max of T and S) is also a stopping time.

T ∧ S (Min of Stopping Times)

If T and S are stopping times, then T ∧ S (min of T and S) is also a stopping time

FS ⊆ FT (Information Increase)

If S ≤ T, meaning S always occurs before or at the same time as T, then FS is a subset of FT.

FT when T is constant n

If T is a constant value n, then FT is the same information as Fn.

Counting Measure

A measure that assigns the number of elements in a set.

Lebesgue Measure

A specific way of assigning 'size' to subsets of the real numbers. Standard way to measure length, area, volume.

Absolute Continuity (of Q w.r.t P)

Q is absolutely continuous with respect to P if every set of P-measure zero also has Q-measure zero.

Radon-Nikodym Derivative (dQ/dP)

A function that represents the 'density' of one measure with respect to another.

Radon-Nikodym Theorem

States that Q(A) can be expressed as the expectation under P of X times the indicator function of A, where X = dQ/dP.

Absolute Continuity

A probability measure is absolutely continuous with respect to another if the former will always be zero when the latter is.

Filtration Fn

A sequence of sigma-algebras that increases with index n.

Martingale (Xn)n≥0

An adapted, integrable sequence of random variables whose conditional expectation given the past equals the present value.

Cadlag Function

Right-continuous with left limits. For all t, the limit from the right equals the function's value at t, and the limit from the left exists.

Cadlag Stochastic Process

A stochastic process where, for any ω, the map t -> Xt(ω) is cadlag.

Finite-Dimensional Distribution

A measure on R^n, µ(A) = P((Xt1, ..., Xtn) ∈ A) for A ∈ B(R^n), where 0 ≤ t1 < ... < tn.

Knowing All FDDs

If we know all finite-dimensional distributions, then we know the law of X, since the cylinder sets form a π-system generating the σ-algebra.

Xt where t ∈ I

Random variables (Xt)t∈I, where I ⊆ [0, ∞).

Hölder Condition

For p > 1 and β > 1/p, ‖ρt − ρs‖p ≤ C|t − s|^β for all t, s ∈ I.

Kolmogorov’s Criterion (Conclusion)

There exists a continuous process (Xt)t∈[0,1] such that Xt = ρt almost surely for all t ∈ I, and |Xs − Xt| ≤ Kα|s − t|^α for every α ∈ [0, β − 1/p).

Dyadic Numbers

Numbers of the form k/2^n, where k and n are integers with n ≥ 0. Examples: 1/2, 1/4, 3/4, etc.

Process Resetting

Restarting a process at a chosen (stopping) time so that the restarted process has the same law, preserving the invariance properties.

Mean Value Property

A property of harmonic functions relating the value at a point to the average value on a sphere or ball around that point.

Harmonic Function Martingale

For a harmonic function u with E|u(x + Bt)| < ∞, the process (u(x + Bt))t≥0 is a martingale.

Reflection Principle

Given a standard Brownian motion (Bt) and a stopping time T, the reflected process is B̃t = 2BT − Bt for t ≥ T and B̃t = Bt otherwise.

Harmonic Function Condition

A function u such that E|u(x + Bt)| < ∞ for any x ∈ Rd and t ≥ 0.

Standard Brownian Motion

A continuous stochastic process with B0 = 0 and independent, stationary increments Bt − Bs ~ N(0, t − s) for s < t.

Increment Convergence

The increments of B∗ converging almost surely to the increments of B̃.

Reflection Principle Distribution

A property that states the reflected process (B̃t ) shares the same distribution as the original Brownian motion.

Sn

The sum of random variables X1 through Xn.

Sub-additive Sequence

A sequence (bn) is sub-additive if b(m+n) ≤ bm + bn for all m and n.

Super-multiplicative Sequence

If P(Sm+n ≥ a(m + n)) ≥ P(Sm ≥ am)P(Sn ≥ an), then P(Sn ≥ an) is super-multiplicative.

Fekete's Lemma

If a non-negative sequence bn is sub-additive, then lim (bn/n) exists and equals inf (bn/n).

Moment Generating Function M(λ)

M(λ) = E[exp(λX1)], expectation of exponential transformation of a random variable.

ψ(λ)

ψ(λ) = log(M(λ)), the logarithm of the moment generating function.

Legendre Transform ψ*(a)

ψ*(a) = sup (aλ - ψ(λ)) for λ≥0, the Legendre transform of ψ(λ).

Cramér's Theorem

Describes the exponential rate at which probabilities of rare events decay.

Study Notes

Advanced Probability Overview

  • Builds on foundational measure theory for advanced probability topics.
  • Emphasizes tools for rigorously analyzing stochastic processes, especially Brownian motion.
  • Focuses on applications where probability theory plays a crucial role.

Course Topics

  • Foundations: sigma-algebras, measures, filtrations, integrals, expectation, convergence theorems, product measures, independence, and Fubini's theorem.
  • Conditional expectation, including discrete and Gaussian cases, density functions, existence, uniqueness, and properties.
  • Martingales and submartingales in discrete time, optional stopping, Doob's inequalities, martingale convergence theorems, and applications.
  • Stochastic processes in continuous time, Kolmogorov's criterion, regularization of paths, and martingales.
  • Weak convergence: definitions, characterizations, convergence in distribution, tightness, Prokhorov's theorem, characteristic functions, and Lévy's continuity theorem.
  • Limit theorems: strong laws of large numbers, the central limit theorem, and Cramér's theory of large deviations.
  • Brownian motion: Wiener's existence theorem, scaling and symmetry properties, martingales, the strong Markov property, hitting times, sample paths, recurrence, transience, the Dirichlet problem, and Donsker's invariance principle.
  • Poisson random measures: construction, properties, and integrals.
  • Lévy processes: the Lévy–Khinchin theorem.

Prerequisites

  • Basic measure theory is helpful, especially for probability theory formulation.
  • Foundational topics will be reviewed, but consulting external resources like Williams' book is advised.

Introduction to Stochastic Processes

  • Stochastic processes are a core focus.
  • Time is a key component, studying how things change over time.
  • Initial focus is on discrete time processes.
  • Introduction to martingales and their properties.
  • Addresses fundamental differences due to interval topology in continuous time.
  • Discusses Brownian motion, its rich structure, and connection to Laplace's equation.

Conditional Expectation

  • This is a key object of study.
  • Involves integrating out randomness while retaining some dependence on partial information.
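
On a finite probability space this partial integration is explicit: conditional expectation given a partition-generated σ-algebra averages the random variable over each atom. A small sketch with made-up illustrative numbers:

```python
# Conditional expectation on a finite probability space: when G is generated
# by a partition, E[X | G] averages X over each atom with P-weights.
# The space, weights, and partition below are made-up illustrative numbers.
P = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
X = {1: 10.0, 2: 2.0, 3: 6.0, 4: 1.0}
partition = [{1, 2}, {3, 4}]

def cond_exp(X, P, partition):
    out = {}
    for atom in partition:
        p_atom = sum(P[w] for w in atom)
        avg = sum(P[w] * X[w] for w in atom) / p_atom
        for w in atom:
            out[w] = avg   # constant on each atom, hence G-measurable
    return out

Y = cond_exp(X, P, partition)
EX = sum(P[w] * X[w] for w in P)   # tower property: E[E[X | G]] = E[X]
EY = sum(P[w] * Y[w] for w in P)
```

The result is measurable with respect to the coarser σ-algebra yet preserves expectations, which is the defining property of conditional expectation.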

Stopping Time

  • Stopping time is another key object of study.
  • "Niceness" requires that when the time arrives, we know it has arrived: whether the time has occurred by time n is determined by the information available at time n.

Large Deviations

  • This is briefly introduced at the end of the course.

Measure Theory Review

  • σ-algebra definition

  • Measurable space definition refers to a set with a σ-algebra.

  • Defines Borel σ-algebra on a topological space.

  • Focuses on Borel σ-algebra B(R), denoted as B.

  • Measure definition: a countably additive function μ : Ɛ → [0,∞] with μ(∅) = 0.

  • Measure space is defined as a measurable space with a measure.

  • Definition of measurable function between measurable spaces.

  • Notational conventions: mƐ for measurable functions E → R, mƐ+ for non-negative measurable functions allowed to take the value ∞.

  • The integral μ : mƐ+ → [0,∞] extending μ(1A) = μ(A) exists and is unique.

  • States the linearity property of the integral.
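As a concrete illustration of the extension above, here is a minimal Python sketch (helper names and the finite measure space are hypothetical, not from the notes): for a simple function on a finite space the integral reduces to a weighted sum, integrating the indicator 1A recovers μ(A), and linearity holds.

```python
# Minimal sketch on a finite measure space (names are hypothetical).
# The integral extends the measure: integrating the indicator of A
# recovers mu(A), and the integral is linear.

def integrate_simple(f, mu):
    """Integral of f >= 0 against mu, given as a {point: mass} dict."""
    return sum(f(x) * mu[x] for x in mu)

# Counting measure on E = {0, 1, 2, 3}: each point has mass 1.
mu = {x: 1.0 for x in range(4)}

indicator_A = lambda x: 1.0 if x in (1, 2) else 0.0
print(integrate_simple(indicator_A, mu))   # 2.0 = mu(A)

# Linearity: integral of 2*1_A + 3*id equals 2*mu(A) + 3*integral(id).
g = lambda x: 2 * indicator_A(x) + 3 * float(x)
assert integrate_simple(g, mu) == 2 * 2.0 + 3 * (0 + 1 + 2 + 3)
```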

Convergence and Integration

  • Key properties of integrals and measurable functions are outlined
  • Details monotone convergence conditions and implications.
  • The integral with respect to μ is defined and used.
  • Simple function definition and properties.
  • Approximating functions with simple functions.
  • Definition of "almost everywhere" equality, defining versions of functions.
  • Provides a counterexample where monotone convergence fails without monotonicity.
  • Fatou's lemma: μ(lim inf fn) ≤ lim inf μ(fn) for non-negative measurable functions fn.
  • Integrable function definition using μ(|f|) < ∞; the set of such f is denoted L¹(E).
  • Extends μ to L¹ and defines μ(f) using positive and negative parts of f.
  • Dominated convergence theorem states conditions for μ(f) = lim μ(fn).
  • Product σ-algebra definition on E1 × E2.
  • Product measure: unique measure μ satisfying μ(A1 × A2) = μ1(A1)μ2(A2)
  • Fubini's/Tonelli's theorem outlines conditions for iterated integrals.
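The counterexample mentioned above, where convergence of integrals fails without monotonicity, can be sketched numerically. This is a discrete analogue (my choice of setting, not the notes' exact example) on a truncated copy of ℕ with counting measure: each fn has integral 1, yet fn → 0 pointwise, so the integral of the limit is 0 and Fatou's inequality 0 ≤ lim inf 1 is strict.

```python
# Discrete analogue of "mass escaping to infinity": f_n = indicator of
# {n} on N with counting measure. Every f_n integrates to 1, but the
# pointwise limit is 0, whose integral is 0.

E = range(100)                       # finite truncation of N, large enough here

def f(n):
    return lambda x: 1.0 if x == n else 0.0

def integral(g):                     # integral against counting measure on E
    return sum(g(x) for x in E)

print([integral(f(n)) for n in range(3)])    # [1.0, 1.0, 1.0]

limit = lambda x: 0.0                # pointwise limit of the f_n
print(integral(limit))               # 0.0, strictly below lim inf = 1
```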

Conditional Probability

  • Focuses on probability theory, assuming μ(E) = 1, and changes notation to E = Ω, Ɛ = F, μ = P

  • The definition of random variables (measurable functions), events (elements of F), and realizations (elements ω of Ω)

  • Defines conditional probability P(A | B) when P(B) > 0.

  • States how to interpret P(A | B) as P(A ∩ B) / P(B).

  • Describes rescaling the probability measure by 1/P(B)

  • Defines the conditional expectation of a random variable with respect to this rescaled measure

  • Introduces allowing B to vary.

  • Defines Y as the sum of conditional expectations of X given disjoint events Gn.

  • Describes how this process averages X over each compartment Gn to obtain the value of Y

  • Y is G-measurable.

  • The conditional expectation Y = E(X | G) satisfies E[Y 1A] = E[X 1A] for all A ∈ G.
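A minimal finite sketch of this defining relation (the toy sample space, values, and partition are hypothetical, chosen for illustration):

```python
# Omega = {0,...,5} with uniform P, X(w) = w, and G generated by the
# partition {0,1,2}, {3,4,5}. Y = E(X | G) averages X over each
# compartment, and E[Y 1_A] = E[X 1_A] for every A in G.

Omega = list(range(6))
P = {w: 1 / 6 for w in Omega}
X = {w: float(w) for w in Omega}
partition = [[0, 1, 2], [3, 4, 5]]

def E(Z, A=None):                    # E[Z 1_A]; A=None means A = Omega
    A = Omega if A is None else A
    return sum(Z[w] * P[w] for w in A)

Y = {}
for G in partition:
    avg = sum(X[w] * P[w] for w in G) / sum(P[w] for w in G)
    for w in G:
        Y[w] = avg                   # Y is constant on each compartment

for A in ([0, 1, 2], [3, 4, 5], Omega):
    assert abs(E(Y, A) - E(X, A)) < 1e-12   # defining relation holds
print("E[Y 1_A] = E[X 1_A] verified on the generating sets")
```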

  • Theorem (Existence and uniqueness of conditional expectation)

  • Given X ∈ L¹ and G ⊆ F, a random variable Y is constructed

  • It is then shown that this Y is G-measurable

  • The proof then shows that Y ∈ L¹ and E[X 1A] = E[Y 1A] for all A ∈ G

  • This establishes existence of the conditional expectation; uniqueness holds almost surely

  • If Y is σ(Z)-measurable, then Y = h(Z) for some Borel-measurable function h

  • Defines E(X | Z = z) := h(z), which makes sense even when P(Z = z) = 0

  • This allows the earlier definitions to be applied in such cases

  • We note some immediate properties of conditional expectation

  • Several properties follow directly from the defining relation.

  • For example, taking A = Ω in the defining relation gives E[E(X | G)] = E[X].

  • These identities hold in L¹, i.e. up to almost-sure equality.

Martingales and stopping

  • Gives context with random variables that "evolve with time"

  • A sequence of σ-algebras encodes the information we have at time n.

  • Definition (Filtration). A sequence of σ-algebras (Fn)n≥0 with Fn ⊆ Fn+1 for all n and F∞ := σ(F0, F1, ...) ⊆ F. Stochastic processes are studied relative to a filtration.

  • Definition (Stochastic process in discrete time). A stochastic process in discrete time is a sequence of random variables (Xn)n≥0.

  • Definition (Adapted process). We say that (Xn)n≥0 is adapted to (Fn)n≥0 if Xn is Fn-measurable for every n.

  • Definition (Martingale). An integrable adapted process (Xn)n≥0 is a martingale if E(Xn | Fm) = Xm for all m ≤ n.

  • Sub- and super-martingales are defined by the same formula with the equality replaced by ≥ and ≤ respectively. Note that if (Xn)n≥0 is a martingale, then it is both a sub-martingale and a super-martingale.
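As an illustration (my example, not from the notes), the simple symmetric random walk is the standard first martingale. A brute-force check over all length-4 step sequences verifies the martingale property E(S_{n+1} | Fn) = Sn: given any prefix of steps, the next step averages to zero.

```python
from itertools import product

# The simple symmetric walk S_n = X_1 + ... + X_n with i.i.d. steps
# X_k in {-1, +1} is a martingale. Enumerate all equally likely step
# sequences and average S_{n+1} over the paths sharing each prefix.

N = 4
paths = list(product([-1, 1], repeat=N))   # all 2^N step sequences

def S(path, n):
    """Position of the walk after n steps of the given path."""
    return sum(path[:n])

n = 2
for prefix in set(p[:n] for p in paths):
    matching = [p for p in paths if p[:n] == prefix]
    avg_next = sum(S(p, n + 1) for p in matching) / len(matching)
    assert avg_next == S(prefix, n)        # martingale property
print("martingale property verified for all length-2 prefixes")
```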

  • Definition (Stopping time). A random variable T : Ω → {0, 1, 2, ...} ∪ {∞} such that {T ≤ n} ∈ Fn for all n.

  • Important details: whether T has occurred by time n can be determined from the information available at time n. Equivalently, it suffices that {T = n} ∈ Fn for all n, because {T = n} = {T ≤ n} \ {T ≤ n − 1}.
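The standard example is a first hitting time; a small sketch (toy path and target set are hypothetical) shows why it is a stopping time while a last hitting time is not:

```python
# T = inf{n : X_n in B} is decided causally: whether T <= n depends
# only on X_0, ..., X_n. By contrast, the last visit sup{n : X_n in B}
# needs the whole future, so it is generally not a stopping time.

def hitting_time(path, B):
    """First index n with path[n] in B, or infinity if there is none."""
    for n, x in enumerate(path):
        if x in B:
            return n
    return float("inf")

path = [0, 1, 0, -1, -2, -1]
T = hitting_time(path, {-2})
print(T)                                   # 4

# {T <= 4} is decided by the first five values alone:
assert (hitting_time(path[:5], {-2}) <= 4) == (T <= 4)
# and {T = n} = {T <= n} \ {T <= n - 1}:
assert (T == 4) == ((T <= 4) and not (T <= 3))
```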

Fundamental theorem and other important notes

  • States that X_T 1{T<∞} is F_T-measurable. In the optional stopping setting, the stopped process (X_{T∧n})n≥0 is integrable.


Description

Covers measure theory for probability, stochastic processes, and Brownian motion analysis. Explores sigma-algebras, martingales, and convergence. Discusses stochastic processes in continuous time and tightness.
