Bernoulli and Binomial Distribution PDF

Summary

These lecture notes cover the Bernoulli and binomial distributions in the context of probability distributions of discrete and continuous random variables. Worked examples and solutions are included to illustrate the concepts.

Full Transcript


FACULTY OF SCIENCE AND SCIENCE EDUCATION
DEPARTMENT OF MATHEMATICAL SCIENCES
FIRST SEMESTER 2021/2022 ACADEMIC SESSION
COURSE CODE: STA 211
COURSE TITLE: PROBABILITY I (2 UNITS)
LECTURER: MR. O. C. AYENI
LECTURE NOTE FOR WEEKS THREE AND FOUR (3 & 4)

TOPIC: PROBABILITY DISTRIBUTION OF DISCRETE AND CONTINUOUS RANDOM VARIABLES
SUB-TOPIC: BERNOULLI AND BINOMIAL DISTRIBUTIONS

INTRODUCTION
This unit concerns the meaning and classification of random variables into discrete and continuous types; distribution functions for discrete and continuous random variables are given, together with related examples.

RANDOM VARIABLES
A random variable is a function whose domain of definition is the sample space S of a random experiment and whose range is a set of real numbers.

Example 1
Suppose that a coin is tossed twice, so that the sample space is S = {HH, HT, TH, TT}. Let X represent the number of heads that can come up; for example, X(HH) = 2, X(HT) = X(TH) = 1 and X(TT) = 0. Since the domain of X is S and the range consists of real numbers, X is a random variable.

DISCRETE RANDOM VARIABLE
A discrete random variable can take on only finitely many or countably infinitely many distinct values. The discrete random variable X above assigns values to the elements of the sample space S as follows: X(HH) = 2, X(HT) = X(TH) = 1, X(TT) = 0. Examples of experiments whose outcomes are described by discrete random variables include the following.
a. An experiment of rolling a die once. Here the possible outcomes can be represented by a random variable X which can take any one of the values 1, 2, 3, 4, 5, 6; hence the random variable X is discrete.
b. An experiment of counting the number of defective items in several batches of items produced in a factory. The possible outcomes of this experiment are 0, 1, 2, 3, ..., so a random variable X that describes the outcomes is discrete, since X can take any of the values 0, 1, 2, ...

CONTINUOUS RANDOM VARIABLE
A random variable that takes on all the values in an interval of the real line, say R, is called a continuous random variable. A typical example is a person's height or weight.

DISCRETE PROBABILITY DENSITY FUNCTION
A table or a list of the distinct values x together with their associated probabilities f(x) = P(X = x) is called a discrete probability mass function (pmf), sometimes also called the probability density function, of a random variable X of the discrete type. Let R be the space of X. The probability mass function f(x) is a real-valued function and satisfies the following properties:
i. f(x) ≥ 0 for all x in R;
ii. Σ f(x) = 1, where the sum runs over all x in R.

Example 2
Find the probability function corresponding to the random variable X (i.e. the number of heads) when a coin is tossed twice, assuming that the coin is fair.

Solution
P(HH) = P(HT) = P(TH) = P(TT) = 1/4. Then
f(0) = P(X = 0) = P(TT) = 1/4,
f(1) = P(X = 1) = P(HT) + P(TH) = 1/4 + 1/4 = 1/2,
f(2) = P(X = 2) = P(HH) = 1/4.

X      0     1     2
f(x)   1/4   1/2   1/4

CONTINUOUS PROBABILITY DENSITY FUNCTION
This probability distribution cannot be given in tabular form, because the probability that a continuous random variable assumes exactly any one of its values is zero. That is,
P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b).
The function with values f(x) is called a probability density function for the continuous random variable X if
P(a < X < b) = ∫_a^b f(x) dx,
where the function f(x) has the following properties:
i. f(x) ≥ 0 for all x;
ii. ∫ f(x) dx = 1, where the integral is taken over the whole real line;
iii. ∫_a^b f(x) dx = P(a < X < b), where a and b are any two values of X satisfying a < b;
iv. F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt, the cumulative distribution function of X.

Example 3
Given that a function of the form f(x) = Cx^2, defined on a finite interval and zero elsewhere, is a density function,
a. find the constant C;
b. compute the probability that X lies in a given sub-interval.
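The two defining properties of a pmf, and the corresponding normalisation property of a pdf, lend themselves to a quick computational check. The following is a minimal Python sketch, added for illustration rather than taken from the notes: it verifies the pmf table of Example 2 and shows how property ii fixes the normalising constant of a density of the form Cx^2. The interval 0 < x < 3 used in the continuous part is purely an assumption for illustration (it is not taken from Example 3), and the availability of scipy is also assumed.

```python
# Minimal sketch (illustrative, not from the original notes) checking the
# defining properties of a discrete pmf and of a continuous pdf.
from fractions import Fraction
from scipy.integrate import quad  # assumes scipy is installed

# Discrete case: Example 2, number of heads in two tosses of a fair coin.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

assert all(p >= 0 for p in pmf.values())   # property i:  f(x) >= 0
assert sum(pmf.values()) == 1              # property ii: sum of f(x) = 1

# Continuous case: an assumed density f(x) = C*x**2 on (0, 3), zero elsewhere.
# The interval (0, 3) is an illustrative assumption only.  Property ii gives
# integral of C*x**2 over (0, 3) = 9*C = 1, hence C = 1/9.
C = 1 / 9
area, _ = quad(lambda x: C * x**2, 0, 3)
print(dict(pmf), area)                     # area is 1.0 up to numerical error
```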
Example 4
A continuous random variable X that can assume values between x = 1 and x = 3 has a density function f(x) defined in terms of a constant K.
a. Determine the constant K.
b. Find the required probability. (Assignment)

BERNOULLI DISTRIBUTION
Any scientific experiment, by convention, gives rise to two or more distinct outcomes. An experiment of the Bernoulli type, however, admits exactly two possible distinct outcomes, usually labelled "Success" and "Failure". Some examples of experiments of the Bernoulli type are listed as 1–4 below.
1. Tossing a coin once gives a sample space S = {Head, Tail}.
2. Inspection of several lots of items produced by a factory may give a sample space S = {Defective, Non-defective}.
3. Students sitting for an examination in a course may constitute an experiment with the appropriate sample space S = {Pass, Fail}.
4. An experiment on insect infestation of seeds in several laboratory pots may give rise to a sample space of the form S = {Infested, Not infested}.

It is of interest, however, to note that the two distinct outcomes in Bernoulli trials may or may not be equally likely, as is evident in the following examples.
1. In an experiment of tossing a fair coin once, the two distinct outcomes are equally likely.
2. In an experiment of rolling a balanced die, we may be interested in the occurrence of a number that is less than or equal to 4, so that the two distinct outcomes are specified in terms of E = {1, 2, 3, 4} and E′ = {5, 6}, with S = {E, E′}. Here the two distinct outcomes are not equally likely, since P(E) = 4/6 while P(E′) = 2/6.
3. Also, in an experiment of rolling a balanced die, we may have the outcome of interest to be the occurrence of an odd number, so that E = {1, 3, 5} and E′ = {2, 4, 6} are the two distinct outcomes of the sample space S = {E, E′}. Note that the two distinct outcomes here are equally likely.

In general, therefore, we may state that for a Bernoulli experiment with the sample space specified as S = {success, failure}:
the probability of success occurring is P(success) = p, and
the probability of failure occurring is P(failure) = 1 − p = q.

BERNOULLI PROBABILITY DENSITY FUNCTION
Recall that for a Bernoulli experiment there are two distinct outcomes, which may be expressed in terms of "Success" and "Failure". In order to define the probability density function, we let X denote a random variable that assigns numerical values to the two distinct outcomes of a Bernoulli trial, with X = 1 for a success and X = 0 for a failure. A random variable X that follows the Bernoulli distribution has the probability density function (pdf) expressed in equation (1):

f(x) = p^x (1 − p)^(1−x) for x = 0, 1, and f(x) = 0 otherwise,   ... eqn (1)

where 0 ≤ p ≤ 1. Equation (1) may be used to illustrate the fact that the two distinct outcomes of a Bernoulli experiment are disjoint and exhaustive (their probabilities sum to 1):
f(0) + f(1) = (1 − p) + p = 1.
Note that for the Bernoulli distribution:
E(X) = p and Var(X) = p(1 − p) = pq.

Example 5
Approximately 1 in 200 American adults are lawyers. One American adult is randomly selected. What is the distribution of the number of lawyers?

Solution
The distribution is Bernoulli, since it is a single trial, with p = 1/200 = 0.005:
f(x) = (0.005)^x (0.995)^(1−x) for x = 0, 1.
For x = 0, 1 we have f(0) = 0.995 and f(1) = 0.005.
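As a hedged illustration of equation (1) and of the stated results E(X) = p and Var(X) = pq, the short sketch below evaluates the Bernoulli pmf by hand for Example 5's p = 0.005 and cross-checks the moments against scipy.stats.bernoulli. The availability of scipy is an assumption; the hand computation uses nothing beyond the formulas in the notes.

```python
# Minimal sketch (illustrative, not from the original notes) of the Bernoulli
# pmf in equation (1) and of E(X) = p, Var(X) = pq.
from scipy.stats import bernoulli  # assumes scipy is installed

p = 1 / 200          # Example 5: probability a randomly chosen adult is a lawyer
q = 1 - p

def f(x, p):
    """Bernoulli pmf: f(x) = p**x * (1 - p)**(1 - x) for x in {0, 1}."""
    return p**x * (1 - p)**(1 - x)

print(f(0, p), f(1, p))            # 0.995 and 0.005
print(f(0, p) + f(1, p))           # 1.0: the two outcomes exhaust the sample space

# Mean and variance from the definitions E(X) = sum x*f(x) and
# Var(X) = E(X^2) - [E(X)]^2.
mean = sum(x * f(x, p) for x in (0, 1))
var = sum(x**2 * f(x, p) for x in (0, 1)) - mean**2
print(mean, var)                   # p = 0.005 and pq = 0.004975

# Cross-check against scipy's built-in Bernoulli distribution.
print(bernoulli.mean(p), bernoulli.var(p))
```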
Example 6
An urn contains 7 white and 11 red balls. Suppose a ball is drawn at random (such that the random variable X = 0 if a white ball is drawn) and a success is recorded with the selection of a red ball. State the appropriate pdf for this experiment; hence, or otherwise, obtain the mean and variance of X.

Solution
Given that X = 0 when a white ball is drawn, X = 1 implies the selection of a red ball.
The probability that a white (W) ball is drawn, denoted by P(W) = P(X = 0), is 7/18. Also, the probability of selecting a red ball is P(X = 1) = 11/18.
Note that there are two distinct outcomes in this experiment and our interest is in a single trial of drawing a ball. Hence the appropriate probability density function (pdf) is the Bernoulli pdf with p = 11/18, given below:
f(x) = (11/18)^x (7/18)^(1−x) for x = 0, 1, and f(x) = 0 otherwise.
The mean of X, that is E(X), is obtained directly from the definition of the Bernoulli distribution as
E(X) = p = 11/18.
Alternatively, the mean is obtained as
E(X) = Σ x f(x) = [0 × (7/18)] + [1 × (11/18)] = 11/18.
The variance of X is obtained from the definition as
Var(X) = E(X^2) − [E(X)]^2
       = {Σ x^2 f(x)} − {Σ x f(x)}^2
       = {[0^2 × (7/18)] + [1^2 × (11/18)]} − {[0 × (7/18)] + [1 × (11/18)]}^2
       = 11/18 − 121/324 = 77/324 ≈ 0.2377,
which agrees with Var(X) = pq = (11/18)(7/18) = 77/324.

THE BINOMIAL DISTRIBUTION
This distribution was discovered by James Bernoulli towards the end of the 17th century. It is a distribution dealing with dichotomous outcomes such as head or tail, male or female, good or bad, pass or fail, etc. Generally these dichotomous outcomes are denoted as success (S) or failure (F). It is a discrete probability distribution which makes use of integral values (i.e. whole numbers), which could be finite or countably infinite. Before we define the binomial distribution we need to make two crucial definitions.
i. A Bernoulli experiment is a random experiment having only two dichotomous outcomes.
ii. A Bernoulli distribution is the probability mass function of a Bernoulli random variable.
The probability of success of a Bernoulli distribution is usually denoted by p, while the probability of failure is denoted by q, with q = 1 − p. The binomial distribution arises when a Bernoulli experiment is repeated in n independent trials; it describes the number of successes in those n trials. In the binomial distribution B(n, p), the probability that an event will occur x times in n trials is
P(X = x) = C(n, x) p^x q^(n−x), x = 0, 1, 2, ..., n,
where C(n, x) = n! / [x!(n − x)!] is the number of ways of choosing the x trials in which the event occurs.

Example 7
A fair die is tossed five times. Find the probability that a 4 occurs twice.

Solution
P(obtaining a 4) = 1/6, since there is only one 4 out of 6 possibilities. So p = 1/6 and q = 5/6.
P(a 4 occurs twice) = C(5, 2) (1/6)^2 (5/6)^3 = 10 × (1/36) × (125/216) = 1250/7776 ≈ 0.1608.

Example 8
If X represents the number of heads obtained when a coin is tossed three times, construct the probability distribution of the experiment.

Solution
P(X = 0) = C(3, 0) (1/2)^0 (1/2)^3 = 1/8,
P(X = 1) = C(3, 1) (1/2)^1 (1/2)^2 = 3/8,
P(X = 2) = C(3, 2) (1/2)^2 (1/2)^1 = 3/8,
P(X = 3) = C(3, 3) (1/2)^3 (1/2)^0 = 1/8.
The obtained distribution is thus:

X       0     1     2     3
P(X)    1/8   3/8   3/8   1/8

Mean and Standard Deviation of the Binomial Distribution
The mean of the binomial distribution is np, and its variance is npq. Therefore, the standard deviation is √(npq).

Example 9
A fair coin is tossed 400 times; find the mean and the standard deviation.

Solution
Mean = np, where n = 400 and p = 1/2, so Mean = 400 × 1/2 = 200.
Standard deviation = √(npq) = √(400 × 1/2 × 1/2) = √100 = 10.

CONCLUSION
We have looked at probability distribution functions for discrete and continuous random variables. Two distributions, namely the Bernoulli and the binomial, have also been treated, with several examples relating to them.
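Finally, as an optional computational check that is not part of the original notes, the sketch below reproduces the binomial results of Examples 7, 8 and 9 using the pmf P(X = x) = C(n, x) p^x q^(n−x). The use of scipy.stats.binom for the cross-check is an assumption about the available environment; the direct calculation relies only on the formulas above.

```python
# Illustrative check (not part of the original notes) of Examples 7-9
# using the binomial pmf P(X = x) = C(n, x) * p**x * q**(n - x).
from math import comb, sqrt
from scipy.stats import binom  # assumes scipy is installed

# Example 7: fair die tossed five times, probability that a 4 occurs twice.
n, p = 5, 1 / 6
print(comb(n, 2) * p**2 * (1 - p)**3)      # 0.16075..., i.e. 1250/7776
print(binom.pmf(2, n, p))                  # same value from scipy

# Example 8: number of heads in three tosses of a fair coin.
for x in range(4):
    print(x, binom.pmf(x, 3, 0.5))         # 1/8, 3/8, 3/8, 1/8

# Example 9: fair coin tossed 400 times.
n, p = 400, 0.5
mean, sd = n * p, sqrt(n * p * (1 - p))    # mean = np, sd = sqrt(npq)
print(mean, sd)                            # 200.0 and 10.0
```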
