Basic Probability PDF

Document Details


Uploaded by FearlessPentagon1177

Vidhyadeep Institute of Engineering and Technology

Manisha N Sawant

Tags

probability, mathematics, random variables, statistics

Summary

These notes cover basic probability, including definitions and concepts such as random experiments, outcomes, sample spaces, and events. They include example calculations and explain mutually exclusive, exhaustive, equally likely, and independent events. Permutations and combinations are also discussed.

Full Transcript


BASIC PROBABILITY
PREPARED BY: MANISHA N SAWANT, A.S.H. DEPARTMENT, VIDHYADEEP INSTITUTE OF ENGINEERING AND TECHNOLOGY

INTRODUCTION
Probability theory is the branch of mathematics concerned with random (or chance) phenomena. It has attracted people to its study both because of its intrinsic interest and because of its successful applications to many areas within the physical, biological and social sciences, in engineering and in the business world. The words PROBABLE and POSSIBLE CHANCES are quite familiar to us. We use these words when we are not sure of the result of certain events; they convey a sense of uncertainty about the occurrence of events. Probability is the measure we use to quantify the degree of certainty of events.

RANDOM EXPERIMENT:
An experiment is called a random experiment if it satisfies the following conditions:
- It has more than one possible outcome.
- It is not possible to predict the outcome in advance.
Examples: tossing a coin, throwing a die.

OUTCOME:
The result of a random experiment is called an outcome. E.g. suppose the random experiment is "a coin is tossed". This experiment gives two possible outcomes: head or tail.

SAMPLE SPACE:
The set of all outcomes is called the sample space of the experiment. It is denoted by S. If a sample space is in one-one correspondence with a finite set, it is called a finite sample space; otherwise it is known as an infinite sample space.
Examples:
- Finite sample space: tossing a coin twice. S = {H, T} × {H, T} = {HH, HT, TH, TT}
- Infinite sample space: tossing a coin until a head comes up for the first time. S = {H, TH, TTH, TTTH, TTTTH, ...}

TRIAL AND EVENT:
Any particular performance of a random experiment is called a trial, and a combination of outcomes is called an event. E.g. tossing a coin is a trial, and getting a head or a tail is an event.

EXHAUSTIVE EVENTS:
Events are called exhaustive if together they account for all possible outcomes of the random experiment, i.e. A ∪ B = S. E.g. in tossing a coin there are two exhaustive events: head and tail.

MUTUALLY EXCLUSIVE EVENTS:
Events are said to be mutually exclusive if the occurrence of one of them precludes the occurrence of all the others in the same trial, i.e. A ∩ B = ∅. E.g. in tossing a coin, the events head and tail are mutually exclusive, since both cannot occur at the same time.

EQUALLY LIKELY EVENTS:
The outcomes of a random experiment are said to be equally likely if none of them is expected to occur in preference to the others. E.g. in tossing a coin, head and tail are equally likely events.

INDEPENDENT EVENTS:
Events are said to be independent if the occurrence of one event has no effect on the occurrence of the others. E.g. in tossing a coin, the event of getting a head on the first toss is independent of getting a head on the second, third and subsequent tosses.

FAVOURABLE EVENTS:
The favourable cases for an event in a random experiment are the outcomes which entail the occurrence of that event. E.g. in throwing two dice, the favourable cases for getting the sum 5 are (1,4), (4,1), (2,3), (3,2), i.e. 4 cases.

DESCRIPTION OF DIFFERENT EVENTS IN WORDS AND SET NOTATION

DEFINITION: PROBABILITY OF AN EVENT
If a finite sample space associated with a random experiment has n equally likely (equiprobable) outcomes, and r of these (0 ≤ r ≤ n) are favourable for the occurrence of an event A, then the probability of A is defined as
P(A) = r/n = (number of outcomes favourable to A)/(total number of equally likely outcomes).
NOTES:
- P(A) ≥ 0
- P(S) = 1
- If A1, A2, ..., An are finitely many mutually exclusive events, then P(A1 ∪ A2 ∪ ... ∪ An) = P(A1) + P(A2) + ... + P(An).
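To make the classical definition P(A) = r/n concrete, here is a small Python sketch (an illustration added to these notes, not part of the original slides) that enumerates the sample space for throwing two dice and counts the favourable cases for the event "the sum is 5":

```python
from itertools import product
from fractions import Fraction

# Sample space for throwing two dice: 36 equally likely ordered pairs
sample_space = list(product(range(1, 7), repeat=2))

# Favourable cases for the event A: "the sum of the two dice is 5"
favourable = [outcome for outcome in sample_space if sum(outcome) == 5]

print(favourable)                                    # [(1, 4), (2, 3), (3, 2), (4, 1)]
print(Fraction(len(favourable), len(sample_space)))  # P(A) = 4/36 = 1/9
```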
EXAMPLE: Three unbiased coins are tossed. Find the probability of getting (i) exactly two heads, (ii) at least one tail, (iii) at most two heads, (iv) a head on the second coin and (v) exactly two heads in succession.
SOLUTION: When three coins are tossed, the sample space S is
S = {HHH, HTH, THH, HHT, TTT, THT, TTH, HTT}, n(S) = 8
(i) A = {HTH, THH, HHT}, n(A) = 3, so P(A) = n(A)/n(S) = 3/8
(ii) B = {HTH, THH, HHT, TTT, THT, TTH, HTT}, n(B) = 7, so P(B) = n(B)/n(S) = 7/8
(iii) C = {HTH, THH, HHT, TTT, THT, TTH, HTT}, n(C) = 7, so P(C) = n(C)/n(S) = 7/8
(iv) D = {HHH, THH, HHT, THT}, n(D) = 4, so P(D) = n(D)/n(S) = 4/8 = 1/2
(v) E = {HHH, THH, HHT}, n(E) = 3, so P(E) = n(E)/n(S) = 3/8

PERMUTATION:
Suppose that we are given n distinct objects and wish to arrange r of these objects in a line. Since there are n ways of choosing the 1st object, after this is done n − 1 ways of choosing the 2nd object, and finally n − r + 1 ways of choosing the r-th object, it follows by the fundamental principle of counting that the number of different arrangements (or permutations) is
nPr = n(n − 1)(n − 2) ... (n − r + 1) = n!/(n − r)!
Notes:
(1) Suppose that a set consists of n objects of which n1 are of one type, n2 are of a second type, ..., and nk are of a k-th type, with n = n1 + n2 + ... + nk. Then the number of different permutations of the objects is n!/(n1! n2! ... nk!).
Example: the number of different permutations of the letters of the word MISSISSIPPI (1 M, 4 I's, 4 S's, 2 P's) is 11!/(4! 4! 2!) = 34650.
(2) If r objects are to be arranged out of n objects and repetition of an object is allowed, then the total number of permutations is n^r.
Example: the number of three-digit numbers that can be formed from the digits 4, 5, 6, 7, 8 (repetition allowed) is 5^3 = 125.

COMBINATION:
In a permutation we are interested in the order of arrangement of the objects; for example, ABC is a different permutation from BCA. In many problems, however, we are interested only in selecting or choosing objects without regard to order. Such selections are called combinations. The total number of combinations (selections) of r objects selected from n objects is denoted and defined by
nCr = n!/(r! (n − r)!)
Examples:
(1) The number of ways in which 3 cards can be chosen from 8 cards is 8C3 = 8!/(3! 5!) = 56.
(2) A club has 10 male and 8 female members. A committee composed of 3 men and 4 women is to be formed. In how many ways can this be done? 10C3 × 8C4 = 120 × 70 = 8400 ways.
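The counting formulas above can be checked with Python's standard math module; a minimal sketch added for illustration (math.perm and math.comb require Python 3.8+):

```python
from math import comb, factorial, perm

# nPr: arrangements of r objects chosen from n distinct objects
print(perm(5, 3))                 # 5!/(5-3)! = 60

# Permutations with repeated objects: MISSISSIPPI has 1 M, 4 I's, 4 S's, 2 P's
print(factorial(11) // (factorial(4) * factorial(4) * factorial(2)))   # 34650

# Arrangements with repetition allowed: three-digit numbers from 4, 5, 6, 7, 8
print(5 ** 3)                     # 125

# nCr: selections without regard to order
print(comb(8, 3))                 # 3 cards out of 8 -> 56
print(comb(10, 3) * comb(8, 4))   # committee of 3 men and 4 women -> 8400
```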
EXAMPLE: A card is drawn from a well-shuffled pack of 52 cards. Find the probability of getting (i) a king, (ii) a face card, (iii) a red card, (iv) a card between 2 and 7, both inclusive, and (v) a card between 2 and 8, both exclusive.
SOLUTION: Total number of cards = 52. One card out of 52 can be drawn in 52C1 ways, so n(S) = 52C1 = 52.
(i) Let A be the event of getting a king. There are 4 kings and one of them can be drawn in 4C1 ways, so n(A) = 4C1 = 4 and P(A) = n(A)/n(S) = 4/52 = 1/13.
(ii) Let B be the event of getting a face card. There are 12 face cards and one of them can be drawn in 12C1 ways, so n(B) = 12C1 = 12 and P(B) = n(B)/n(S) = 12/52 = 3/13.
(iii) Let C be the event of getting a red card. There are 26 red cards and one of them can be drawn in 26C1 ways, so n(C) = 26C1 = 26 and P(C) = n(C)/n(S) = 26/52 = 1/2.
(iv) Let D be the event of getting a card between 2 and 7, both inclusive. There are 6 such cards in each suit, giving a total of 6 × 4 = 24 cards, and one of them can be drawn in 24C1 ways, so n(D) = 24C1 = 24 and P(D) = n(D)/n(S) = 24/52 = 6/13.
(v) Let E be the event of getting a card between 2 and 8, both exclusive. There are 5 such cards in each suit, giving a total of 5 × 4 = 20 cards, and one of them can be drawn in 20C1 ways, so n(E) = 20C1 = 20 and P(E) = n(E)/n(S) = 20/52 = 5/13.

RESULTS:
- For any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
- If A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B).
- P(A′) = 1 − P(A).

EXAMPLE: A card is drawn from a well-shuffled pack of cards. What is the probability that it is either a spade or an ace?
SOLUTION: Let A and B be the events of getting a spade and an ace respectively.
P(A) = 13C1/52C1 = 13/52
P(B) = 4C1/52C1 = 4/52
P(A ∩ B) = 1C1/52C1 = 1/52
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 13/52 + 4/52 − 1/52 = 16/52 = 4/13

EXAMPLE: If A and B are mutually exclusive events with P(A) = 0.30 and P(B) = 0.45, find the probability of the following events: (i) A′, (ii) A ∩ B, (iii) A ∪ B, (iv) A′ ∩ B′.
SOLUTION:
(i) P(A′) = 1 − P(A) = 1 − 0.30 = 0.70
(ii) Since A and B are mutually exclusive, A ∩ B = ∅, so P(A ∩ B) = P(∅) = 0.
(iii) P(A ∪ B) = P(A) + P(B) = 0.30 + 0.45 = 0.75
(iv) P(A′ ∩ B′) = 1 − P(A ∪ B) = 1 − 0.75 = 0.25

CONDITIONAL PROBABILITY:
Let S be a sample space and A and B be any two events in S. The probability of the occurrence of event A, given that B has already occurred, is defined as
P(A|B) = P(A ∩ B)/P(B), P(B) > 0,
which is known as the conditional probability of the event A relative to the event B. Similarly, the conditional probability of the event B relative to the event A is
P(B|A) = P(A ∩ B)/P(A), P(A) > 0.
Properties:
- Let A1, A2 and B be any three events of a sample space S. Then P(A1 ∪ A2 | B) = P(A1|B) + P(A2|B) − P(A1 ∩ A2 | B), P(B) > 0.
- Let A and B be any two events of a sample space S. Then P(A′|B) = 1 − P(A|B), P(B) > 0.

EXAMPLE: Let A and B be two events with P(A) = 3/8, P(B) = 5/8 and P(A ∪ B) = 3/4. Find P(A|B) and P(B|A).
SOLUTION: From P(A ∪ B) = P(A) + P(B) − P(A ∩ B),
P(A ∩ B) = P(A) + P(B) − P(A ∪ B) = 3/8 + 5/8 − 3/4 = 1/4
P(A|B) = P(A ∩ B)/P(B) = (1/4)/(5/8) = 2/5
P(B|A) = P(A ∩ B)/P(A) = (1/4)/(3/8) = 2/3

EXAMPLE: If A and B are two events such that P(A) = 2/3, P(A′ ∩ B) = 1/6 and P(A ∩ B) = 1/3, find P(B), P(A ∪ B), P(A|B), P(B|A), P(A′ ∪ B) and P(B′).
SOLUTION:
P(B) = P(A′ ∩ B) + P(A ∩ B) = 1/6 + 1/3 = 1/2
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 2/3 + 1/2 − 1/3 = 5/6
P(A|B) = P(A ∩ B)/P(B) = (1/3)/(1/2) = 2/3
P(B|A) = P(A ∩ B)/P(A) = (1/3)/(2/3) = 1/2
P(A′ ∪ B) = P(A′) + P(B) − P(A′ ∩ B) = [1 − P(A)] + P(B) − P(A′ ∩ B) = 1/3 + 1/2 − 1/6 = 2/3
P(B′) = 1 − P(B) = 1 − 1/2 = 1/2
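As a quick numeric check of the first conditional-probability example above, here is a short Python sketch (added for illustration, using the values as given in that worked example and exact fractions to avoid rounding):

```python
from fractions import Fraction as F

# Values from the worked example: P(A) = 3/8, P(B) = 5/8, P(A ∪ B) = 3/4
P_A, P_B, P_AuB = F(3, 8), F(5, 8), F(3, 4)

# Addition theorem rearranged: P(A ∩ B) = P(A) + P(B) - P(A ∪ B)
P_AnB = P_A + P_B - P_AuB
print(P_AnB)         # 1/4
print(P_AnB / P_B)   # P(A|B) = 2/5
print(P_AnB / P_A)   # P(B|A) = 2/3
```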
THEOREM (MULTIPLICATION RULE):
Let S be a sample space and A and B be any two events in S. Then
P(A ∩ B) = P(A) · P(B|A), P(A) > 0, or equivalently P(A ∩ B) = P(B) · P(A|B), P(B) > 0.
Corollary: Let S be a sample space and A, B and C be three events in S. Then
P(A ∩ B ∩ C) = P(A) · P(B|A) · P(C|A ∩ B).

INDEPENDENT EVENTS:
Let A and B be any two events of a sample space S. A and B are called independent events if P(A ∩ B) = P(A) · P(B). Equivalently, P(A|B) = P(A) and P(B|A) = P(B). This means that the probability of A does not depend on the occurrence or non-occurrence of B, and conversely.
Remarks:
- Events A, B and C are said to be mutually independent if
(1) P(A ∩ B) = P(A) · P(B)
(2) P(B ∩ C) = P(B) · P(C)
(3) P(C ∩ A) = P(C) · P(A)
(4) P(A ∩ B ∩ C) = P(A) · P(B) · P(C)
- Events A, B and C are said to be pairwise independent if conditions (1)-(3) hold.

EXAMPLE: If A and B are independent events with P(A) = 1/4 and P(B) = 2/3, find P(A ∪ B).
SOLUTION: Since A and B are independent events,
P(A ∩ B) = P(A) · P(B)   ... (1)
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
= P(A) + P(B) − P(A) · P(B)   (from (1))
= 1/4 + 2/3 − (1/4 × 2/3)
= 1/4 + 2/3 − 1/6 = 9/12 = 3/4

EXAMPLE: A problem in statistics is given to three students A, B and C whose chances of solving it are 1/3, 1/4 and 1/2 respectively. What is the probability that the problem will be solved?
SOLUTION: P(A) = 1/3, P(B) = 1/4, P(C) = 1/2
P(A ∪ B ∪ C) = 1 − P((A ∪ B ∪ C)′)
= 1 − P(A′ ∩ B′ ∩ C′)
= 1 − P(A′) · P(B′) · P(C′)   (A, B and C are independent events)
= 1 − [1 − P(A)][1 − P(B)][1 − P(C)]
= 1 − [1 − 1/3][1 − 1/4][1 − 1/2]
= 1 − (2/3)(3/4)(1/2) = 1 − 1/4 = 3/4

TOTAL PROBABILITY:
If B1 and B2 are two mutually exclusive and exhaustive events of a sample space S with P(B1), P(B2) ≠ 0, then for any event A,
P(A) = P(B1) · P(A|B1) + P(B2) · P(A|B2).
Corollary: If B1, B2 and B3 are mutually exclusive and exhaustive events with P(B1), P(B2), P(B3) ≠ 0, then for any event A,
P(A) = P(B1) · P(A|B1) + P(B2) · P(A|B2) + P(B3) · P(A|B3).

BAYES' THEOREM:
Let B1, B2, ..., Bn be n mutually exclusive and exhaustive events of a sample space S and let A be any event with P(A) ≠ 0. Then
P(Bi|A) = P(Bi) · P(A|Bi) / [P(B1) · P(A|B1) + P(B2) · P(A|B2) + ... + P(Bn) · P(A|Bn)], i = 1, 2, ..., n.

EXAMPLE: Three boxes contain 10%, 20% and 30% defective finger joints respectively. A box is selected at random and a finger joint drawn from it is found to be defective. Determine the probability that it comes from (a) the first box, (b) the second box, (c) the third box.
SOLUTION: Let A = the selected finger joint is defective, and let B1, B2 and B3 be the events that the finger joint comes from box 1, box 2 and box 3 respectively.
P(B1) = 1/3, P(B2) = 1/3, P(B3) = 1/3
P(A|B1) = 0.1, P(A|B2) = 0.2, P(A|B3) = 0.3
P(A) = P(B1) · P(A|B1) + P(B2) · P(A|B2) + P(B3) · P(A|B3)
= (1/3)(0.1) + (1/3)(0.2) + (1/3)(0.3) = (1/3)(0.6) = 0.2
(a) P(B1|A) = P(B1) P(A|B1)/P(A) = (1/3)(0.1)/0.2 ≈ 0.1667
(b) P(B2|A) = P(B2) P(A|B2)/P(A) = (1/3)(0.2)/0.2 ≈ 0.3333
(c) P(B3|A) = P(B3) P(A|B3)/P(A) = (1/3)(0.3)/0.2 = 0.5

BERNOULLI TRIALS:
Independent repeated trials of an experiment with exactly two possible outcomes are called Bernoulli trials. Call one of the outcomes "success" and the other "failure". Let p be the probability of success in a Bernoulli trial and q the probability of failure. Then the probability of success and the probability of failure sum to unity (one), since "success" and "failure" are mutually exclusive and exhaustive (complementary) events. Thus
p = 1 − q, q = 1 − p, p + q = 1.
The probability of exactly k successes in the experiment B(n, p), i.e. in n Bernoulli trials with success probability p, is given by
P(X = k) = nCk · p^k · q^(n−k), k = 0, 1, ..., n.

EXAMPLE: Consider the simple experiment where a fair coin is tossed four times. Find the probability that exactly two of the tosses result in heads.
SOLUTION: Here n = 4, k = 2 and p = q = 1/2, so
P(X = 2) = 4C2 · (1/2)^2 · (1/2)^2 = 6/16 = 3/8.
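The three-box (finger-joint) Bayes' theorem example above can be reproduced in a few lines of Python; a sketch added for illustration:

```python
# Prior probabilities of choosing each box and the defect rate of each box
priors = [1/3, 1/3, 1/3]          # P(B1), P(B2), P(B3)
defect_rates = [0.1, 0.2, 0.3]    # P(A|B1), P(A|B2), P(A|B3)

# Total probability: P(A) = sum of P(Bi) * P(A|Bi)
p_defective = sum(p * d for p, d in zip(priors, defect_rates))
print(p_defective)                # approximately 0.2

# Bayes' theorem: P(Bi|A) = P(Bi) * P(A|Bi) / P(A)
posteriors = [p * d / p_defective for p, d in zip(priors, defect_rates)]
print(posteriors)                 # approximately [0.1667, 0.3333, 0.5]
```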
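Similarly, the Bernoulli-trials formula P(X = k) = nCk p^k q^(n−k) used in the four-toss coin example above can be checked directly (a small sketch, not part of the original notes; the helper name binomial_pmf is illustrative):

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n Bernoulli trials with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Fair coin tossed four times: probability of exactly two heads
print(binomial_pmf(4, 2, 0.5))    # 0.375, i.e. 6/16 = 3/8
```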
RANDOM VARIABLE:
A random variable is a real-valued function that assigns a number to each outcome of a random experiment. All the values it can possibly take are known in advance, but which value it actually takes is known only after the experiment is performed.

PROBABILITY DISTRIBUTION OF A RANDOM VARIABLE:
The probability distribution of a random variable is the set of its possible values together with their respective probabilities:

X      x1      x2      x3      ...    xn
P(X)   p(x1)   p(x2)   p(x3)   ...    p(xn)

where p(xi) ≥ 0 for all i and Σ p(xi) = 1.

Example: Two balanced coins are tossed; find the probability distribution for the number of heads.
Sample space = {HH, HT, TH, TT}
P(X = 0) = P(no head) = 1/4 = 0.25
P(X = 1) = P(one head) = 2/4 = 1/2 = 0.5
P(X = 2) = P(two heads) = 1/4 = 0.25
The probability distribution is:

X      0      1      2
P(X)   0.25   0.50   0.25

TYPES OF RANDOM VARIABLES:
(1) Discrete random variable
(2) Continuous random variable

DISCRETE RANDOM VARIABLE:
A random variable which can take only finite, countable, or isolated values in a given interval is called a discrete random variable. In other words, a discrete random variable is one which can assume any of a set of possible values that can be counted or listed; its range is finite or countably infinite. For example, the number of heads in tossing coins or the number of passengers in an auto can take only whole-number values such as 0, 1, 2, 3 and so on.
Note: Discrete random variables can be measured exactly.

CONTINUOUS RANDOM VARIABLE:
A random variable which can take all possible values, infinitely many of them, in a given interval is called a continuous random variable. In other words, a continuous random variable is one which can assume any value in an infinite spectrum of different values across an interval, and these values cannot be counted or listed. Examples: measuring the height of a student selected at random, finding the average life of a brand X tire, etc.
Note: Continuous random variables cannot be measured exactly.

Examples: For the following situations, determine whether a discrete or continuous random variable is involved.
Example 1: The number of hairs on a sea otter. Since hairs are something we can count, this is a discrete random variable. Even though the number of hairs may be so large that we wouldn't actually want to count them, there are no "half hairs" or fractional amounts of hair, only whole numbers of hairs.
Example 2: The length of a sea otter. Length is typically considered a continuous variable, since a sea otter will typically not measure exactly 5 feet; the length will differ by some fraction of a foot.
Example 3: The age of a sea otter. Age can sometimes be treated as discrete or continuous. For example, we generally report age as only a number of years, but sometimes we talk about a sea otter being three and a half years old. Technically, since age can take any value in an interval, it is treated as a continuous random variable unless we have a reason to treat it as discrete.

PROBABILITY FUNCTION:
If for a random variable X the real-valued function f(x) is such that P(X = x) = f(x), then f(x) is called the probability function of the random variable X. The probability function f(x) gives the measure of probability for the different values of X, say x1, x2, ..., xn.

PROBABILITY MASS FUNCTION:
If X is a discrete random variable, then its probability function p(x) is a discrete probability function, also called a probability mass function.
Properties:
- P(X = xi) = p(xi)
- p(xi) ≥ 0 and Σ p(xi) = 1

EXAMPLE: Does f(x) = x/6, x = 0, 1, 2, 3, 4, define a probability distribution? Justify your answer.
SOLUTION: Here n = 5; let x1 = 0, x2 = 1, x3 = 2, x4 = 3, x5 = 4.
Σ f(xi) = f(x1) + f(x2) + f(x3) + f(x4) + f(x5) = 0/6 + 1/6 + 2/6 + 3/6 + 4/6 = 10/6 ≠ 1
Hence the given function f(x) does not define a probability distribution.

EXAMPLE: A random variable X has the probability mass function given by:

x        1     2     3     4
P(X=x)   0.1   0.2   0.5   0.2

Find (i) P(2 ≤ X < 4), (ii) P(X > 2), (iii) P(X is odd) and (iv) P(X is even).
SOLUTION:
(i) P(2 ≤ X < 4) = P(X=2) + P(X=3) = 0.2 + 0.5 = 0.7
(ii) P(X > 2) = P(X=3) + P(X=4) = 0.5 + 0.2 = 0.7
(iii) P(X is odd) = P(X=1) + P(X=3) = 0.1 + 0.5 = 0.6
(iv) P(X is even) = P(X=2) + P(X=4) = 0.2 + 0.2 = 0.4
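The two probability-mass-function examples above can be verified mechanically; a short Python sketch (added for illustration) checks that f(x) = x/6 does not sum to 1 and computes the requested probabilities from the tabulated pmf:

```python
from fractions import Fraction as F

# f(x) = x/6 for x = 0, 1, 2, 3, 4: a valid pmf must sum to 1
print(sum(F(x, 6) for x in range(5)))                 # 5/3 (= 10/6), so not a valid pmf

# Tabulated pmf: P(X=1)=0.1, P(X=2)=0.2, P(X=3)=0.5, P(X=4)=0.2
pmf = {1: 0.1, 2: 0.2, 3: 0.5, 4: 0.2}

print(sum(p for x, p in pmf.items() if 2 <= x < 4))   # P(2 <= X < 4) -> 0.7
print(sum(p for x, p in pmf.items() if x > 2))        # P(X > 2)      -> 0.7
print(sum(p for x, p in pmf.items() if x % 2 == 1))   # P(X odd)      -> 0.6
print(sum(p for x, p in pmf.items() if x % 2 == 0))   # P(X even)     -> 0.4
```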
PROBABILITY DENSITY FUNCTION:
If X is a continuous random variable, then its probability function f(x) is called a continuous probability function, or probability density function.
Properties:
- P(a < X < b) = ∫_a^b f(x) dx
- f(x) ≥ 0
- ∫_{−∞}^{∞} f(x) dx = 1

MATHEMATICAL EXPECTATION:
If X is a discrete random variable having possible values x1, x2, ..., xn and probability function f(x), the mathematical expectation of X is defined and denoted by
E(X) = Σ xi · f(xi) = Σ xi · p(xi) = Σ xi · pi.
If X is a continuous random variable having probability density function f(x), the expectation of X is defined as
E(X) = ∫_{−∞}^{∞} x f(x) dx.
E(X) is also called the mean value of the probability distribution of X and is denoted by μ.
Properties:
- The expected value of a constant is that constant, i.e. E(c) = c.
- If c is a constant, then E(cX) = c · E(X).
- E(X²) = Σ xi² · pi for a probability mass function, and E(X²) = ∫ x² f(x) dx for a probability density function.
- If a and b are constants, then E(aX ± b) = a E(X) ± b.
- If a, b and c are constants, then E((aX + b)/c) = (1/c)[a E(X) + b].
- If X and Y are two random variables, then E(X + Y) = E(X) + E(Y).
- If X and Y are two independent random variables, then E(X · Y) = E(X) · E(Y).

VARIANCE OF A RANDOM VARIABLE:
Variance is a characteristic of a random variable X and is used to measure the dispersion (or variation) of X. If X is a discrete (or continuous) random variable with probability mass function (or probability density function) f(x), then the expected value of [X − E(X)]² is called the variance of X and is denoted by V(X).
V(X) = E(X²) − [E(X)]²
Properties:
- V(c) = 0, where c is a constant.
- V(cX) = c² V(X), where c is a constant.
- V(X + c) = V(X), where c is a constant.
- If a and b are constants, then V(aX + b) = a² V(X).
- If X and Y are independent random variables, then V(X + Y) = V(X) + V(Y).

STANDARD DEVIATION OF A RANDOM VARIABLE:
The positive square root of V(X) (the variance of X) is called the standard deviation of the random variable X and is denoted by σ, i.e. σ = √V(X).
Note: σ² is the variance V(X).

CUMULATIVE DISTRIBUTION FUNCTION:
An alternative way to describe a random variable's probability distribution is with cumulative probabilities such as P(X ≤ x).
The cumulative distribution function of a discrete random variable X, denoted F(x), is
F(x) = P(X ≤ x) = Σ_{xi ≤ x} p(xi).
The cumulative distribution function of a continuous random variable X is
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt, −∞ < x < ∞.
Example: Determine the probability mass function of X from the following cumulative distribution function:
F(x) = 0 for x < −2; 0.2 for −2 ≤ x ...
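As a final illustration (not part of the original notes), the expectation, variance, standard deviation and cumulative distribution function of the two-coin "number of heads" distribution from earlier can be computed as follows:

```python
from math import sqrt

# Distribution of X = number of heads when two balanced coins are tossed
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

mean = sum(x * p for x, p in pmf.items())              # E(X)
second_moment = sum(x**2 * p for x, p in pmf.items())  # E(X^2)
variance = second_moment - mean**2                     # V(X) = E(X^2) - [E(X)]^2
sigma = sqrt(variance)                                 # standard deviation

print(mean, variance, sigma)                           # 1.0 0.5 0.707...

# Cumulative distribution function F(x) = P(X <= x) at each support point
cdf = {x: sum(p for xi, p in pmf.items() if xi <= x) for x in pmf}
print(cdf)                                             # {0: 0.25, 1: 0.75, 2: 1.0}
```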
