# Statistical Inference

## Point Estimation

### Definition

A **point estimator** is a statistic (a function of the sample) that is used to estimate a parameter of a population.

### Examples

* The sample mean $\bar{X}$ is a point estimator of the population mean $\mu$.
* The sample variance $S^2$ is a point estimator of the population variance $\sigma^2$.
* The sample proportion $\hat{p}$ is a point estimator of the population proportion $p$.

### Properties of Point Estimators

* **Bias**: The bias of an estimator $\hat{\theta}$ is defined as $Bias(\hat{\theta}) = E[\hat{\theta}] - \theta$.
  * If $Bias(\hat{\theta}) = 0$, then $\hat{\theta}$ is an **unbiased estimator** of $\theta$.
* **Variance**: The variance of an estimator $\hat{\theta}$ is defined as $Var(\hat{\theta}) = E[(\hat{\theta} - E[\hat{\theta}])^2]$.
* **Mean Squared Error (MSE)**: The MSE of an estimator $\hat{\theta}$ is defined as $MSE(\hat{\theta}) = E[(\hat{\theta} - \theta)^2]$.
  * $MSE(\hat{\theta}) = Var(\hat{\theta}) + [Bias(\hat{\theta})]^2$

### Methods of Finding Estimators

* Method of Moments
* Maximum Likelihood Estimation (MLE)

#### Method of Moments

The method of moments estimates the parameters of a population by equating the sample moments to the corresponding population moments and solving for the parameters.

##### Example

Suppose we have a random sample $X_1, X_2, \ldots, X_n$ from a population with mean $\mu$ and variance $\sigma^2$. We want to estimate $\mu$ and $\sigma^2$ using the method of moments.

* The first population moment is $E[X] = \mu$.
* The second population moment is $E[X^2] = Var(X) + (E[X])^2 = \sigma^2 + \mu^2$.

We can estimate these moments using the sample moments:

* The first sample moment is $\frac{1}{n}\sum_{i=1}^{n}X_i = \bar{X}$.
* The second sample moment is $\frac{1}{n}\sum_{i=1}^{n}X_i^2$.

Equating the population moments to the sample moments, we have:

* $\mu = \bar{X}$
* $\sigma^2 + \mu^2 = \frac{1}{n}\sum_{i=1}^{n}X_i^2$

Solving for $\mu$ and $\sigma^2$, we get:

* $\hat{\mu} = \bar{X}$
* $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}X_i^2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$

#### Maximum Likelihood Estimation (MLE)

Maximum likelihood estimation (MLE) estimates the parameters of a population by maximizing the likelihood function.

##### Definition

The **likelihood function** is the probability of observing the given sample, viewed as a function of the parameters.

##### Example

Suppose we have a random sample $X_1, X_2, \ldots, X_n$ from a population with probability mass function (PMF) $f(x|\theta)$, where $\theta$ is a parameter (for a continuous population, replace the PMF with the probability density function). The likelihood function is:

$$ L(\theta) = \prod_{i=1}^{n}f(x_i|\theta) $$

To find the MLE of $\theta$, we maximize the likelihood function with respect to $\theta$. It is often easier to maximize the log-likelihood function:

$$ \log L(\theta) = \sum_{i=1}^{n}\log f(x_i|\theta) $$
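As a concrete instance of this recipe, here is a minimal numerical sketch, not part of the original notes: it assumes NumPy and SciPy are available, and the Poisson model with rate $\lambda = 3$ and sample size 200 are illustrative choices. For a Poisson population the MLE of $\lambda$ has the closed form $\bar{X}$, so the optimizer should recover the sample mean. Optimizers conventionally minimize, so the code minimizes the negative log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=200)  # simulated sample; lambda = 3 is an arbitrary choice

def neg_log_likelihood(lam):
    # -log L(lambda) = -sum_i log f(x_i | lambda); we minimize the negative
    return -poisson.logpmf(x, mu=lam).sum()

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 20.0), method="bounded")
print(f"numerical MLE of lambda:       {result.x:.4f}")
print(f"closed-form MLE (sample mean): {x.mean():.4f}")  # the two should agree
```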
##### Example

Suppose we have a random sample $X_1, X_2, \ldots, X_n$ from a normal distribution with mean $\mu$ and variance $\sigma^2$. We want to estimate $\mu$ and $\sigma^2$ using MLE.

The likelihood function is:

$$ L(\mu, \sigma^2) = \prod_{i=1}^{n}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x_i - \mu)^2}{2\sigma^2}} $$

The log-likelihood function is:

$$ \log L(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2 $$

To maximize the log-likelihood function, we take the partial derivatives with respect to $\mu$ and $\sigma^2$ and set them equal to zero:

$$ \frac{\partial \log L(\mu, \sigma^2)}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i - \mu) = 0 $$

$$ \frac{\partial \log L(\mu, \sigma^2)}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i - \mu)^2 = 0 $$

Solving for $\mu$ and $\sigma^2$, we get:

$$ \hat{\mu} = \bar{X} $$

$$ \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{X})^2 $$
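The MLEs here coincide with the method-of-moments estimators found earlier. A short simulation, again a sketch rather than part of the notes, can confirm two facts at once: $\hat{\sigma}^2$ is biased, with $E[\hat{\sigma}^2] = \frac{n-1}{n}\sigma^2$, and its MSE matches the decomposition $MSE(\hat{\theta}) = Var(\hat{\theta}) + [Bias(\hat{\theta})]^2$. It assumes NumPy; the values $\mu = 5$, $\sigma^2 = 4$, and $n = 10$ are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, sigma2_true, n, reps = 5.0, 4.0, 10, 100_000

# Draw `reps` samples of size n and compute the MLE of sigma^2 on each.
# ddof=0 gives the divide-by-n form, i.e. (1/n) * sum (x_i - x_bar)^2.
samples = rng.normal(mu_true, np.sqrt(sigma2_true), size=(reps, n))
sigma2_hat = samples.var(axis=1, ddof=0)

bias = sigma2_hat.mean() - sigma2_true
variance = sigma2_hat.var()
mse = np.mean((sigma2_hat - sigma2_true) ** 2)

print(f"bias:         {bias:.4f} (theory: -sigma^2/n = {-sigma2_true / n:.4f})")
print(f"Var + Bias^2: {variance + bias**2:.4f}")
print(f"MSE:          {mse:.4f}")  # should match Var + Bias^2
```

Using `ddof=1` instead would give the unbiased sample variance $S^2$, which is exactly why $S^2$ divides by $n-1$ rather than $n$.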