# Notes on Probability

## Events

### Definition
An event $E$ is a subset of the sample space $S$.

### Example
Rolling an even number on a die: $E = \{2, 4, 6\}$.

## Probability

### Definition
The probability of an event $E$, denoted as $P(E)$, is a number between 0 and 1, inclusive, that represents the likelihood of the event occurring.

### Axioms of Probability
1. $0 \leq P(E) \leq 1$ for any event $E$.
2. $P(S) = 1$, where $S$ is the sample space.
3. If $E_1, E_2, E_3, \dots$ are mutually exclusive events, then
   $$
   P(E_1 \cup E_2 \cup E_3 \cup \dots) = \sum_{i=1}^{\infty} P(E_i)
   $$

### Basic Probability Rules
1. **Complement Rule**: $P(E^c) = 1 - P(E)$, where $E^c$ is the complement of $E$.
2. **Addition Rule**: For any two events $A$ and $B$:
   $$
   P(A \cup B) = P(A) + P(B) - P(A \cap B)
   $$
3. **Conditional Probability**: The probability of $A$ given $B$:
   $$
   P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \quad \text{provided } P(B) > 0
   $$
4. **Multiplication Rule**:
   $$
   P(A \cap B) = P(A \mid B) P(B) = P(B \mid A) P(A)
   $$
5. **Independence**: Two events $A$ and $B$ are independent if:
   $$
   P(A \cap B) = P(A) P(B)
   $$
   or equivalently, $P(A \mid B) = P(A)$ and $P(B \mid A) = P(B)$.

## Random Variables

### Definition
A random variable is a variable whose value is a numerical outcome of a random phenomenon.

### Types
1. **Discrete Random Variable**: A variable that can take on a countable number of distinct values.
2. **Continuous Random Variable**: A variable that can take on any value within a given range.

## Probability Distributions

### Discrete Probability Distribution
A probability distribution that specifies the probability of each possible value of a discrete random variable.

#### Probability Mass Function (PMF)
For a discrete random variable $X$, the PMF is defined as:
$$
p(x) = P(X = x)
$$
such that:
$$
\sum_{x} p(x) = 1
$$

### Continuous Probability Distribution
A probability distribution that assigns probabilities to ranges of values of a continuous random variable; any single value has probability zero, so probabilities are described by a density function.

#### Probability Density Function (PDF)
For a continuous random variable $X$, the PDF is defined as a function $f(x)$ such that:
$$
P(a \leq X \leq b) = \int_{a}^{b} f(x) \, dx
$$
and
$$
\int_{-\infty}^{\infty} f(x) \, dx = 1
$$

## Expectation

### Definition
The expected value (or mean) of a random variable $X$, denoted as $E[X]$ or $\mu$, is the weighted average of its possible values.

### Discrete Random Variable
$$
E[X] = \sum_{x} x \cdot P(X = x) = \sum_{x} x \cdot p(x)
$$

### Continuous Random Variable
$$
E[X] = \int_{-\infty}^{\infty} x \cdot f(x) \, dx
$$

## Variance

### Definition
The variance of a random variable $X$, denoted as $Var(X)$ or $\sigma^2$, measures the spread or dispersion of the possible values of $X$ around the expected value.

### Formula
$$
Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2
$$

### Discrete Random Variable
$$
Var(X) = \sum_{x} (x - E[X])^2 \cdot P(X = x) = \sum_{x} (x - \mu)^2 \cdot p(x)
$$

### Continuous Random Variable
$$
Var(X) = \int_{-\infty}^{\infty} (x - E[X])^2 \cdot f(x) \, dx = \int_{-\infty}^{\infty} (x - \mu)^2 \cdot f(x) \, dx
$$

## Standard Deviation

### Definition
The standard deviation of a random variable $X$, denoted as $\sigma$, is the square root of the variance.

### Formula
$$
\sigma = \sqrt{Var(X)}
$$
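
## Worked Check (Discrete)
The following Python sketch is not part of the original notes; the fair-die PMF, the `pmf` dictionary, and the `prob` helper are illustrative assumptions. It computes $E[X]$, $Var(X)$, and $\sigma$ for the die example above and numerically checks the complement and addition rules.

```python
from math import isclose, sqrt

# Fair six-sided die: P(X = x) = 1/6 for x = 1..6 (illustrative assumption).
pmf = {x: 1 / 6 for x in range(1, 7)}
assert isclose(sum(pmf.values()), 1.0)   # a PMF must sum to 1

# Expectation: E[X] = sum of x * p(x).
mean = sum(x * p for x, p in pmf.items())

# Variance two ways: E[(X - E[X])^2] and E[X^2] - (E[X])^2.
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())
var_alt = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2
assert isclose(var_def, var_alt)

# Standard deviation is the square root of the variance.
sd = sqrt(var_def)
print(mean, var_def, sd)   # 3.5, ~2.9167, ~1.7078

def prob(event):
    """P(E) = sum of p(x) over the outcomes x in E."""
    return sum(pmf[x] for x in event)

even = {2, 4, 6}   # the example event "roll an even number"
low = {1, 2, 3}    # a second event, used for the addition rule

# Complement rule: P(E^c) = 1 - P(E).
assert isclose(prob(set(pmf) - even), 1 - prob(even))

# Addition rule: P(A or B) = P(A) + P(B) - P(A and B).
assert isclose(prob(even | low), prob(even) + prob(low) - prob(even & low))
```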

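## Worked Check (Continuous)
A similar sketch can approximate the defining integrals numerically. The uniform density on $[0, 1]$, the `f` and `integrate` helpers, and the midpoint rule are illustrative assumptions, not part of the notes; the exact values are $E[X] = 1/2$ and $Var(X) = 1/12$.

```python
# Numerical check of the PDF, expectation, and variance integrals for a
# uniform random variable on [0, 1] (an illustrative choice of density).
def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Total probability: the integral of f over its support must be 1.
total = integrate(f, 0.0, 1.0)

# E[X] = integral of x * f(x) dx; Var(X) = integral of (x - mu)^2 * f(x) dx.
mu = integrate(lambda x: x * f(x), 0.0, 1.0)
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0.0, 1.0)

print(total, mu, var)   # ~1.0, ~0.5, ~0.0833 (exactly 1, 1/2, and 1/12)
```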