Special Discrete Probability Distributions PDF
Summary
This document presents an overview of special discrete probability distributions, including the discrete uniform and Bernoulli distributions, and provides their definitions, moments, and properties. The text also introduces the binomial distribution, highlighting its key characteristics and properties. The discussion encompasses several aspects of these distributions, making it helpful for students or professionals in the field of probability theory or mathematical statistics.
# SPECIAL DISCRETE PROBABILITY DISTRIBUTIONS

- This chapter is devoted to the study of the following univariate distributions:
  - Rectangular
  - Binomial
  - Poisson
  - Negative Binomial
  - Geometric
  - Hypergeometric
  - Multinomial
  - Power-series

## 8.2. Discrete Uniform Distribution

- **Definition**: A r.v. X is said to have a discrete uniform distribution over the range [1, n] if its p.m.f. is defined as follows:
  - $P(X = x) = \frac{1}{n}$ for x = 1, 2, ..., n
  - $P(X = x) = 0$ otherwise
- *n* is known as the parameter of the distribution and lies in the set of all positive integers.
- The discrete uniform distribution is also called a **discrete rectangular distribution**.

### 8.2.1. Moments

- $E(X) = \frac{1}{n} \sum_{i=1}^n i = \frac{n+1}{2}$
- $E(X^2) = \frac{1}{n} \sum_{i=1}^n i^2 = \frac{(n+1)(2n+1)}{6}$
- $V(X) = E(X^2) - [E(X)]^2 = \frac{(n+1)(n-1)}{12}$
- **The m.g.f. of X is:**
  - $M_X(t) = E(e^{tX}) = \frac{1}{n}\sum_{x=1}^n e^{tx} = \frac{e^t(1-e^{nt})}{n(1-e^t)}$

## 8.3. Bernoulli Distribution

- **Definition**: A r.v. X is said to have a Bernoulli distribution with parameter *p* if its p.m.f. is given by:
  - $P(X = x) = p^x(1-p)^{1-x}$ for x = 0, 1
  - $P(X = x) = 0$ otherwise
- The parameter *p* satisfies 0 ≤ p ≤ 1; (1 − p) is often denoted by *q*.
- A random experiment whose outcomes are of two types, success *S* and failure *F*, occurring with probabilities *p* and *q* respectively, is called a **Bernoulli trial**.
- If for this experiment a r.v. *X* is defined such that it takes the value 1 when *S* occurs and 0 when *F* occurs, then *X* follows a Bernoulli distribution.

### 8.3.1. Moments of Bernoulli Distribution

- The *r*th moment about the origin is:
  - $\mu'_r = E(X^r) = 0^r \cdot q + 1^r \cdot p = p; \quad r = 1, 2, ...$
- $\mu'_1 = E(X) = p$, $\mu'_2 = E(X^2) = p$, so that $\mu_2 = \mathrm{Var}(X) = \mu'_2 - \mu'^2_1 = p - p^2 = pq$
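These moment formulas are easy to spot-check by direct enumeration. The sketch below (n = 7 and p = 3/10 are arbitrary illustrative values, not from the text) verifies them with exact rational arithmetic:

```python
from fractions import Fraction

# Discrete uniform on {1, ..., n}: check E(X) = (n+1)/2 and
# V(X) = (n+1)(n-1)/12 by summing over the support; n = 7 is arbitrary.
n = 7
pmf = {x: Fraction(1, n) for x in range(1, n + 1)}
mean = sum(x * w for x, w in pmf.items())
var = sum(x * x * w for x, w in pmf.items()) - mean ** 2
assert mean == Fraction(n + 1, 2)
assert var == Fraction((n + 1) * (n - 1), 12)

# Bernoulli(p): every raw moment E(X^r) = 0^r q + 1^r p equals p,
# hence Var(X) = p - p^2 = pq; p = 3/10 is arbitrary.
p = Fraction(3, 10)
for r in (1, 2, 3):
    assert (0 ** r) * (1 - p) + (1 ** r) * p == p
assert p - p ** 2 == p * (1 - p)
```

Using `Fraction` rather than floats makes the identities hold exactly, so the assertions test the algebra itself rather than floating-point approximations.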
## 8.4. Binomial Distribution

- The binomial distribution was discovered by James Bernoulli (1654–1705) in the late seventeenth century and was first published posthumously in 1713, eight years after his death.
- Let the occurrence of an event in a trial be called a success and its non-occurrence a failure.
- Consider *n* independent Bernoulli trials (*n* being finite) in which the probability *p* of success is constant for each trial; then *q* = 1 − *p* is the probability of failure.
- The probability of *x* successes and consequently (n − x) failures in *n* independent trials, in a specified order (say) SSFSFFFS...FSF (where *S* represents success and *F* represents failure), is given by the compound probability theorem as:
  - $P(SSFSFFFS...FSF) = P(S)P(S)P(F)P(S)P(F)P(F)P(F)P(S) \times ... \times P(F)P(S)P(F)$
  - $= p.p.q.p.q.q.q.p \, ... \, q.p.q = p^x q^{n-x}$
- But *x* successes in *n* trials can occur in $\binom{n}{x}$ ways, and the probability for each of these ways is the same, viz., $p^x q^{n-x}$. Hence the probability of *x* successes in *n* trials, in any order, is $\binom{n}{x} p^x q^{n-x}$.
- The probability distribution of the number of successes so obtained is called the **binomial probability distribution**.
- The probabilities of 0, 1, 2, ..., n successes, viz., $q^n, \binom{n}{1}q^{n-1}p, \binom{n}{2}q^{n-2}p^2, ..., p^n$, are the successive terms of the binomial expansion $(q + p)^n$.
- **Definition**: A random variable *X* is said to follow the binomial distribution if it assumes only non-negative values and its probability mass function is given by:
  - $P(X = x) = p(x) = \binom{n}{x} p^x q^{n-x}; \quad x = 0, 1, 2, ..., n; \quad q = 1 - p$ ... (8.3)
  - $P(X = x) = 0$ otherwise
- **The two independent constants *n* and *p* in the distribution are known as the parameters of the distribution.**
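As a quick sanity check, the p.m.f. just defined can be evaluated numerically. The sketch below (n = 10 and p = 1/2 are arbitrary illustrative values) confirms that the probabilities are the successive terms of $(q+p)^n$ and therefore sum to 1:

```python
from fractions import Fraction
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) p^x q^(n - x) for X ~ B(n, p)."""
    q = 1 - p
    return comb(n, x) * p ** x * q ** (n - x)

n, p = 10, Fraction(1, 2)          # arbitrary illustrative parameters
probs = [binom_pmf(x, n, p) for x in range(n + 1)]
assert sum(probs) == 1             # terms of (q + p)^n sum to (q + p)^n = 1
mean = sum(x * w for x, w in enumerate(probs))
assert mean == n * p               # agrees with E(X) = np (see Section 8.4.1)
```

The same `binom_pmf` helper (a name introduced here for illustration) can be reused to verify the worked examples in this chapter by brute-force summation.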
- *n* is also sometimes known as the **degree of the binomial distribution**.
- **The binomial distribution is a discrete distribution**, as *X* can take only the integral values 0, 1, 2, ..., n.
- Any random variable which follows the binomial distribution is known as a **binomial variate**.
- We shall use the notation **X ~ B(n, p)** to denote that the random variable *X* follows the binomial distribution with parameters *n* and *p*.
- The probability *p(x)* in (8.3) is also sometimes denoted by **b(x, n, p)**.

### Remarks:

1. $\sum_{x=0}^n p(x) = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x} = (q + p)^n = 1$
2. Let us suppose that *n* trials constitute an experiment. Then, if this experiment is repeated *N* times, the frequency function of the binomial distribution is given by:
   - $f(x) = Np(x) = N \binom{n}{x} p^x q^{n-x}; \quad x = 0, 1, 2, ..., n$
   - The expected frequencies of 0, 1, 2, ..., n successes are the successive terms of the binomial expansion $N(q + p)^n$, $(q + p = 1)$.
3. Physical conditions for the binomial distribution:
   - (i) Each trial results in two exhaustive and mutually disjoint outcomes, termed success and failure.
   - (ii) The number of trials *n* is finite.
   - (iii) The trials are independent of each other.
   - (iv) The probability of success *p* is constant for each trial.
   - Trials satisfying the conditions (i), (iii) and (iv) are also called **Bernoulli trials**.
4. Problems relating to the tossing of a coin, the throwing of dice, or the drawing of cards from a pack with replacement lead to the binomial probability distribution.
5. The binomial distribution is important not only because of its wide applicability, but because it gives rise to many other probability distributions. Tables for *p(x)* are available for various values of *n* and *p*.

## Example 8.1

- Ten coins are thrown simultaneously. Find the probability of getting at least seven heads.
- **Solution.**
- $p$ = probability of getting a head = $\frac{1}{2}$
- $q$ = probability of not getting a head = $1-\frac{1}{2} = \frac{1}{2}$
- The probability of getting *x* heads in a random throw of 10 coins is:
  - $p(x) = \binom{10}{x}\left(\frac{1}{2}\right)^x\left(\frac{1}{2}\right)^{10-x}=\binom{10}{x}\left(\frac{1}{2}\right)^{10}; \quad x = 0, 1, 2, ..., 10$
- The probability of getting at least seven heads is given by:
  - $P(X \ge 7) = p(7) + p(8) + p(9) + p(10)$
  - $=\left[\binom{10}{7} + \binom{10}{8} + \binom{10}{9} + \binom{10}{10}\right]\left(\frac{1}{2}\right)^{10} = \frac{120+45+10+1}{1024} = \frac{176}{1024} = \frac{11}{64}$

## Example 8.2

- A and B play a game in which their chances of winning are in the ratio 3 : 2. Find A's chance of winning at least three games out of the five games played.
- **Solution.** Let *p* be the probability that *A* wins a game. Then we are given:
  - $p = \frac{3}{5}$, $q = 1 - p = \frac{2}{5}$
- Hence, by the binomial probability law, the probability that out of the 5 games played, A wins *x* games is given by:
  - $p(x) = \binom{5}{x}\left(\frac{3}{5}\right)^x\left(\frac{2}{5}\right)^{5-x}; \quad x = 0, 1, 2, ..., 5$
- The required probability is:
  - $P(X \ge 3) = p(3) + p(4) + p(5) = \frac{10 \times 27 \times 4 + 5 \times 81 \times 2 + 243}{5^5} = \frac{2133}{3125} = 0.68256$

## Example 8.9

- In a binomial distribution consisting of 5 independent trials, the probabilities of 1 and 2 successes are 0.4096 and 0.2048 respectively. Find the parameter *p* of the distribution.
- **Solution.** Let X ~ B(n, p). In the usual notations, we are given:
  - n = 5, p(1) = 0.4096 and p(2) = 0.2048
- According to the binomial probability law:
  - $P(X = x) = p(x) = \binom{5}{x} p^x (1-p)^{5-x}; \quad x = 0, 1, 2, ..., 5$
- Now
  - $p(1) = \binom{5}{1} p (1-p)^4 = 0.4096 \dots$ (*) and
  - $p(2) = \binom{5}{2} p^2 (1-p)^3 = 0.2048 \dots$ (**)
- Dividing (*) by (**), we get
  - $\frac{\binom{5}{1}p(1-p)^4}{\binom{5}{2}p^2(1-p)^3} = \frac{0.4096}{0.2048} \Rightarrow \frac{5(1-p)}{10p} = 2 \Rightarrow \frac{1-p}{p} = 4 \Rightarrow \frac{1}{p} = 5 \Rightarrow p = \frac{1}{5} = 0.2$

## Example 8.10

- For a binomial variate *X* with parameters n = 6 and *p*, the probability function is:
  - $P(X = r) = \binom{6}{r} p^r q^{6-r}; \quad r = 0, 1, 2, ..., 6$
- If 9P(X = 4) = P(X = 2), find *p*.
- **Solution.** We are given: 9P(X = 4) = P(X = 2)
  - $9 \binom{6}{4} p^4 q^2 = \binom{6}{2} p^2 q^4$
- Since $\binom{6}{4} = \binom{6}{2} = 15$, cancelling $p^2 q^2$ gives:
  - $9 p^2 = q^2 = (1-p)^2 = 1 + p^2 - 2p$
  - $8 p^2 + 2p - 1 = 0$
  - $p = \frac{-2 \pm \sqrt{4+32}}{2\times 8} = \frac{-2 \pm 6}{16}$
  - $p = \frac{1}{4}$ or $p = -\frac{1}{2}$
- Since a probability cannot be negative, $p = -\frac{1}{2}$ is rejected. Hence $p = \frac{1}{4}$.

## 8.4.1. Moments of Binomial Distribution

- The first two moments about the origin of the binomial distribution are obtained as follows:
  - $\mu'_1 = E(X) = \sum_{x=0}^n x \binom{n}{x} p^x q^{n-x} = np \sum_{x=1}^n \binom{n-1}{x-1} p^{x-1}q^{n-x} = np (q+p)^{n-1} = np$
  - $\mu'_2 = E(X^2) = \sum_{x=0}^n x^2 \binom{n}{x} p^x q^{n-x} = \sum_{x=0}^n [x(x-1)+x] \binom{n}{x} p^x q^{n-x} = n(n-1)p^2 \sum_{x=2}^n \binom{n-2}{x-2} p^{x-2}q^{n-x} + np = n(n-1)p^2 + np$
- Hence $\mu_2 = \mathrm{Var}(X) = \mu'_2 - \mu'^2_1 = n(n-1)p^2 + np - n^2p^2 = np(1-p) = npq$.

## Example 8.11

- The mean of a binomial distribution is 3 and the variance is 4. Comment.
- **Solution.** If the given binomial distribution has parameters *n* and *p*, then we are given:
  - Mean $= np = 3$ ... (*)
  - Variance $= npq = 4$ ... (**)
- Dividing (**) by (*), we get $q = \frac{4}{3} \Rightarrow p = 1 - q = -\frac{1}{3}$
- Since *q* cannot exceed 1 (equivalently, the variance npq of a binomial distribution can never exceed its mean np), the given statement is wrong.

## Example 8.12

- The mean and variance of a binomial distribution are 4 and 4/3 respectively. Find P(X ≥ 1).
- **Solution.** Let X ~ B(n, p). Then we are given: Mean $= np = 4$ ... (*) and $\mathrm{Var}(X) = npq = \frac{4}{3}$.
- Dividing, we get $q = \frac{1}{3} \Rightarrow p = 1 - q = \frac{2}{3}$
- Substituting in (*), we obtain $n = 6$.
- $P(X \ge 1) = 1 - P(X = 0) = 1 - q^n = 1 - \left(\frac{1}{3}\right)^6 = 1 - \frac{1}{729} = 0.99863$

## Example 8.13

- If X ~ B(n, p), show that:
  - (i) $E\left[\frac{n-X}{n} - q\right]^2 = \frac{pq}{n}$
  - (ii) $\mathrm{Cov}\left(\frac{X}{n}, \frac{n-X}{n}\right) = -\frac{pq}{n}$
- **Solution.**
- Since X ~ B(n, p), $E(X) = np$ and $\mathrm{Var}(X) = npq$ ... (*)
- (i) Since $E\left(\frac{n-X}{n}\right) = \frac{n - np}{n} = q$, we have
  - $E\left[\frac{n-X}{n} - q\right]^2 = \mathrm{Var}\left(\frac{n-X}{n}\right) = \frac{1}{n^2}\mathrm{Var}(n-X) = \frac{1}{n^2}\mathrm{Var}(X) = \frac{npq}{n^2} = \frac{pq}{n}$ [From (*)]
- (ii) Since $\frac{n-X}{n} - q = p - \frac{X}{n} = -\left(\frac{X}{n} - p\right)$, we get
  - $\mathrm{Cov}\left(\frac{X}{n}, \frac{n-X}{n}\right) = E\left[\left(\frac{X}{n} - p\right)\left(\frac{n-X}{n} - q\right)\right] = -E\left(\frac{X}{n} - p\right)^2 = -\frac{1}{n^2}\mathrm{Var}(X) = -\frac{pq}{n}$

## 8.4.2. Recurrence Relation for the Moments of the Binomial Distribution (Romanovsky Formula)

- By definition, the *r*th central moment is
  - $\mu_r = E[X - E(X)]^r = \sum_{x=0}^n (x-np)^r \binom{n}{x} p^x q^{n-x}$
- Differentiating w.r.t. *p* (remembering that q = 1 − p), we get
  - $\frac{d \mu_r}{dp} = \sum_{x=0}^n \binom{n}{x} \left[-nr (x-np)^{r-1}p^x q^{n-x} + (x-np)^{r} \left(xp^{x-1}q^{n-x} - (n-x)p^xq^{n-x-1}\right)\right]$
  - $=-nr \sum_{x=0}^n (x-np)^{r-1} p(x) + \sum_{x=0}^n (x-np)^{r} p(x)\left(\frac{x}{p}-\frac{n-x}{q}\right)$
  - $=-nr \sum_{x=0}^n (x-np)^{r-1} p(x) + \frac{1}{pq} \sum_{x=0}^n (x-np)^{r+1} p(x)$ [since $\frac{x}{p}-\frac{n-x}{q} = \frac{xq-(n-x)p}{pq} = \frac{x-np}{pq}$]
  - $=-nr \mu_{r-1} + \frac{1}{pq} \mu_{r+1}$
  - $\Rightarrow \mu_{r+1} = pq \left(nr \mu_{r-1} + \frac{d\mu_r}{dp}\right)$ (Romanovsky formula) ...
- Putting r = 1, 2 and 3 successively in (8.7), and using $\mu_0 = 1$ and $\mu_1 = 0$, we get
  - $\mu_2 = pq \left(n\mu_0 + \frac{d\mu_1}{dp}\right) = pq(n + 0) = npq$
  - $\mu_3 = pq \left(2n \mu_1 + \frac{d \mu_2}{dp}\right) = pq\left[0 + n(q-p)\right] = npq(q - p)$
  - $\mu_4 = pq \left(3n \mu_2 + \frac{d\mu_3}{dp}\right) = pq\left[3n^2pq + n(1-6pq)\right] = 3n^2p^2q^2 + npq(1 - 6pq)$

## Example 8.14

- Show that the *r*th moment about the origin of the binomial distribution of degree *n* is given by:
  - $\mu'_r = \left(p\frac{d}{dp}\right)^r (q+p)^n = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x} x^r$ ... (*)
  (where *q* is treated as a constant in the differentiation)
- **Solution.** We shall prove this result by the principle of mathematical induction. We have:
  - $(q+p)^n = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x}$
- Differentiating w.r.t. *p* (keeping *q* fixed) and multiplying both sides by *p*:
  - $p\frac{d}{dp}(q+p)^{n} = p \sum_{x=0}^n \binom{n}{x} x p^{x-1} q^{n-x} = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x} x$ ... (i)
- Thus the result (*) is true for r = 1.
- Let us now assume that the result is true for r = k, say, so that
  - $\left(p\frac{d}{dp}\right)^k (q + p)^{n} = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x} x^k$ ... (ii)
- Differentiating (ii) w.r.t. *p* and then multiplying both sides by *p*, we get
  - $\left(p\frac{d}{dp}\right)^{k+1} (q+p)^{n} = p \sum_{x=0}^n \binom{n}{x} x^k \cdot x p^{x-1}q^{n-x} = \sum_{x=0}^n \binom{n}{x} p^x q^{n-x} x^{k+1} = E(X^{k+1})$
- Hence the result is true for r = k + 1 also, and by induction it holds for all positive integers *r*.

## 8.4.3. Factorial Moments of Binomial Distribution

- The *r*th factorial moment of the binomial distribution is:
  - $\mu'_{(r)} = E[X^{(r)}] = \sum_{x=0}^n x^{(r)} p(x) = \sum_{x=0}^n x^{(r)} \binom{n}{x} p^x q^{n-x} = \sum_{x=r}^n \frac{x!}{(x-r)!}\cdot\frac{n!}{x!\,(n-x)!}\, p^x q^{n-x} = n^{(r)} p^r \sum_{x=r}^n \binom{n-r}{x-r} p^{x-r} q^{n-x} = n^{(r)} p^r (q+p)^{n-r} = n^{(r)} p^r$ ...
  where $x^{(r)} = x(x-1)\cdots(x-r+1)$.
(8.8)
- $\mu'_{(1)} = E[X^{(1)}] = np = \text{Mean}$
- $\mu'_{(2)} = E[X^{(2)}] = n^{(2)} p^2 = n(n-1)p^2$, $\quad \mu'_{(3)} = E[X^{(3)}] = n^{(3)} p^3 = n(n-1)(n-2)p^3$
- Now $\mu_2 = \mu'_{(2)} + \mu'_{(1)} - \mu'^2_{(1)} = n(n-1)p^2 + np - n^2p^2 = np - np^2 = npq$
- $\mu_3 = \mu'_{(3)} + 3\mu'_{(2)} + \mu'_{(1)} - 3\left[\mu'_{(2)} + \mu'_{(1)}\right]\mu'_{(1)} + 2 \mu'^3_{(1)} = npq(q-p)$ [on simplification]

## 8.4.4. Mean Deviation about the Mean of the Binomial Distribution

- The mean deviation $\eta$ about the mean *np* of the binomial distribution is given by:
  - $\eta = \sum_{x=0}^n |x-np|\, p(x) = \sum_{x=0}^n |x-np| \binom{n}{x} p^x q^{n-x}$
- Since $\sum_{x=0}^n (x-np) \binom{n}{x} p^x q^{n-x} = E(X) - np = 0$, the negative deviations exactly balance the positive ones, so
  - $\eta = 2 \sum_{x=\mu}^n (x-np) \binom{n}{x} p^x q^{n-x}$, where μ is the greatest integer contained in np + 1.
- Writing $x - np = x(p+q) - np = xq - (n-x)p$ and using the identities $x\binom{n}{x} = n\binom{n-1}{x-1}$ and $(n-x)\binom{n}{x} = n\binom{n-1}{x}$, the general term becomes
  - $[xq-(n-x)p] \binom{n}{x} p^x q^{n-x} = n\binom{n-1}{x-1} p^x q^{n-x+1} - n\binom{n-1}{x} p^{x+1} q^{n-x} = t_x - t_{x+1}$,
  where $t_x = n\binom{n-1}{x-1} p^x q^{n-x+1}$.
- Summing over x from μ to n, the series telescopes and $t_{n+1} = 0$, so
  - $\eta = 2t_\mu = 2n\binom{n-1}{\mu-1} p^{\mu} q^{n-\mu+1} = 2npq \binom{n-1}{\mu-1} p^{\mu-1}q^{n-\mu}$ ... (8.9)

## 8.4.5. Mode of Binomial Distribution

- We have
  - $\frac{p(x)}{p(x-1)}= \frac{\binom{n}{x}p^x q^{n-x}}{\binom{n}{x-1} p^{x-1} q^{n-x+1}} = \frac{(n-x+1)p}{xq} = 1 + \frac{(n+1)p-x}{xq}$ ... (8.10)
- **Mode is the value of *x* for which p(x) is maximum.** We discuss the following two cases:
- **Case I.** When $(n+1)p$ is not an integer. Let $(n+1)p = m + f$, where m is an integer and f is a fraction such that 0 < f < 1. Substituting in (8.10), we get:
  - $\frac{p(x)}{p(x-1)} = 1 + \frac{(m+f)-x}{xq}$ ... (*)
- From (*), it is obvious that:
  - $\frac{p(x)}{p(x-1)}> 1$ for x = 1, 2, ..., m and
  - $\frac{p(x)}{p(x-1)} < 1$ for x = m + 1, m + 2, ..., n
  - $\Rightarrow p(0) < p(1) < p(2) < ... < p(m-1) < p(m) > p(m + 1) > p(m + 2) > ... > p(n)$
  - $\Rightarrow p(x)$ is maximum at x = m.
- **Thus, in this case there exists a unique modal value for the binomial distribution, viz., m, the integral part of (n + 1)p.**
- **Case II.** When (n + 1)p is an integer, say (n + 1)p = m. Substituting in (8.10), we get:
  - $\frac{p(x)}{p(x-1)} = 1 + \frac{m-x}{xq}$ ... (**)
- From (**), it is obvious that:
  - $\frac{p(x)}{p(x-1)} > 1$ for x = 1, 2, ..., m − 1
  - $\frac{p(x)}{p(x-1)} = 1$ for x = m
  - $\frac{p(x)}{p(x-1)} < 1$ for x = m + 1, m + 2, ..., n
- Now proceeding as in Case I, we have:
  - $p(0) < p(1) < ... < p(m - 1) = p(m) > p(m + 1) > p(m + 2) > ... > p(n)$
- **Thus, in this case the binomial distribution is bimodal and the two modal values are m − 1 and m.**

## Example 8.15

- Determine the binomial distribution for which the mean is 4 and the variance is 3, and find its mode.
- **Solution.** Let X ~ B(n, p); then we are given that:
  - $E(X) = np = 4$ ... (*)
  - $\mathrm{Var}(X) = npq = 3$ ... (**)
- Dividing (**) by (*), we get $q = \frac{3}{4} \Rightarrow p = 1 - q = \frac{1}{4}$
- Hence from (*), we obtain $n = \frac{4}{p} = 16$
- Thus the given binomial distribution has parameters n = 16 and p = 1/4.
- Here $(n+1)p = 17 \times \frac{1}{4} = 4.25$ is not an integer; hence the distribution is unimodal, the mode being the integral part of (n + 1)p, viz., 4.

## Example 8.16

- Show that for p = 1/2 the binomial distribution has a maximum probability at $X = \frac{n}{2}$ if *n* is even, and at $X = \frac{n-1}{2}$ as well as $X = \frac{n+1}{2}$ if *n* is odd.
- **Solution.** Here we have to find the mode of the binomial distribution with p = 1/2.
- (i) Let *n* be even, say n = 2m. Then $(n + 1)p = (2m+1)\times\frac{1}{2} = m + \frac{1}{2}$, which is not an integer. Hence in this case the distribution is unimodal, the unique mode being the integral part of (n + 1)p, i.e., $X = m = \frac{n}{2}$.
- (ii) Let *n* be odd, say n = 2m + 1. Then $(n + 1)p = (2m + 2) \times \frac{1}{2} = m + 1 = \frac{n+1}{2}$, which is an integer.
- **Since (n + 1)p is an integer, the distribution is bimodal, the two modes being $\frac{n+1}{2}$ and $\frac{n+1}{2} - 1 = \frac{n-1}{2}$.**

## 8.4.6. Moment Generating Function of Binomial Distribution

- Let X ~ B(n, p); then:
  - $M_X(t) = E(e^{tX}) = \sum_{x=0}^n e^{tx} p(x) = \sum_{x=0}^n e^{tx} \binom{n}{x} p^x q^{n-x} = \sum_{x=0}^n \binom{n}{x} (pe^t)^x q^{n-x} = (q + pe^t)^{n}$ ... (8.11)

## m.g.f. about the Mean of the Binomial Distribution:

- $E[e^{t(X-np)}] = e^{-tnp}\,E(e^{tX}) = e^{-tnp}\,M_X(t) = e^{-tnp}(q + pe^t)^{n} = (qe^{-pt} + pe^{qt})^{n}$
- $=\left[q\left\{ 1-pt + \frac{p^2 t^2}{2!} - \frac{p^3 t^3}{3!} + ... \right\} + p\left\{ 1 + qt + \frac{q^2 t^2}{2!} + \frac{q^3 t^3}{3!} + ... \right\}\right]^n$
- $=\left[(q+p) + \frac{t^2}{2!}pq(q+p) + \frac{t^3}{3!}pq(q^2 - p^2) + \frac{t^4}{4!}pq(q^3 + p^3) + ... \right]^n$
- $=\left[1 + \frac{t^2}{2!}pq + \frac{t^3}{3!}pq(q-p) + \frac{t^4}{4!}pq(1 - 3pq) + ... \right]^n$
- Expanding by the binomial theorem and retaining terms up to $t^4$:
  - $=1 + n\left[\frac{t^2}{2!}pq + \frac{t^3}{3!}pq(q-p) + \frac{t^4}{4!}pq(1-3pq)\right] + \binom{n}{2}\left[\frac{t^2}{2!}pq\right]^2 + ...$
- Now
  - $\mu_2$ = coefficient of $\frac{t^2}{2!}$ = $npq$
  - $\mu_3$ = coefficient of $\frac{t^3}{3!}$ = $npq(q-p)$
  - $\mu_4$ = coefficient of $\frac{t^4}{4!}$ = $npq(1 - 3pq) + 3n(n-1)p^2q^2 = 3n^2p^2q^2 + npq(1 - 6pq)$

## Example 8.17

- X is binomially distributed with parameters *n* and *p*. What is the distribution of