
553.311 Lecture Notes Ch01.pdf




1 How many...?! —the art of counting

The beginning of probability is linked to gambling. Historically, the Italian mathematician Gerolamo Cardano developed the first probabilistic methods as a way of earning money while gambling. This chapter is all about coins, dice, words, cards and draws, assuming that we are in a fair setting. In other words, we will see the combinatorial definition of probability in action.

1.1 How to count?

How can we count the number of favorable and total possibilities? The area of mathematics that deals with counting is called combinatorics, also known as the art of counting. In this chapter, we will see how to apply some fundamental combinatorial principles: the multiplication principle, the codification technique and the division principle.

1.1.1 Weak Multiplication Principle

Before giving a formal statement, let us start with a couple of illustrative examples.

Example 1.1. (Rolling a die three times) Take a 6-sided die and roll it three consecutive times. After this process, we will get a sequence of three numbers such as (2, 6, 3). In general, we will have a sequence (a, b, c) where a, b and c are (not necessarily equal) numbers going from one to six. How many possibilities are there?

Of course, we can just write down all possibilities and count, but this is very tedious and we don't want to work so much. Alternatively, we can ask three easy questions: how many possibilities are there for a? And for b? And for c?

In each of the above questions, the answer is 6. Moreover, each choice is independent of the others. In this way, for each of the six possible values for a, we can choose six possible values for b. Hence we get

    6 + ... + 6 (6 times) = 6 · 6

possibilities for (a, b). In other words, to get the total number of possibilities for choosing (a, b), we only need to multiply the number of possibilities for a with the number of possibilities for b.
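The count in Example 1.1 can be checked by brute force; a minimal sketch in Python (not part of the original notes):

```python
from itertools import product

# Enumerate every sequence (a, b, c) of three die rolls explicitly.
outcomes = list(product(range(1, 7), repeat=3))

# The multiplication principle predicts 6 * 6 * 6 sequences.
print(len(outcomes))                # 216
print(len(outcomes) == 6 * 6 * 6)   # True
```

Enumerating is feasible here only because 216 is small; the principle lets us skip the enumeration entirely.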
After choosing (a, b), we need to choose c. Again, for each of the 36 possible choices of (a, b), we have 6 choices for c. Hence, arguing as above, the total number is 36 · 6 = 216. We conclude, in this way, that the total number of possibilities is

    6 (possibilities for a) · 6 (possibilities for b) · 6 (possibilities for c) = 216. △

JHU 553.311 Intermediate Probability and Statistics Lecture Notes (Fall 2024)

Example 1.2. (Rolling three dice) What if instead of rolling a die three times we roll three dice? Does anything change? By thinking of the three dice as having different colors, we can see that nothing changes if we want to compute probabilities. It is therefore important to remember that even if we cannot distinguish certain cases in the practical implementation of a random experiment, this does not mean that we should consider them as indistinguishable when we count. △

Example 1.3. (Tossing several coins) If we toss n coins, either successively or at the same time, for each coin there are two possibilities: heads or tails. In this way, the total number of possible outcomes is

    2 · · · 2 (n times) = 2^n.

That is, we have two choices for the 1st coin, another two for the 2nd one, another two for the 3rd one, another two for the 4th one, and so on. So when putting them together, we need to multiply 2 by itself n times. In this way, the probability of getting any particular outcome, assuming the coins are fair, is 1/2^n. △

In the examples above, we have counted the number of possibilities after performing several consecutive choices. To do so, we usually construct the possibilities one choice at a time: we perform one choice, then another, and so on until we have a final object. This is the so-called multiplication principle.

Theorem 1.A. (Weak multiplication principle) If we perform two consecutive choices and:

(1) The first choice has n possibilities.
(2) The second choice does not depend on the first choice, and it has m possibilities.

Then performing these two consecutive choices has nm possibilities.

Alternatively, assume we perform two operations A and B consecutively. If operation A can be performed in n possible ways, and operation B is independent of operation A and can be performed in m ways, then we can perform A and B consecutively in nm ways.

In Figure 1.1, we see an illustration of the multiplication rule: for each of the m ways to complete operation A, there are n ways to complete operation B.

[Figure 1.1: Graphic illustration of the Multiplication Principle (Theorem 1.A): a tree with m branches for operation A, each of which splits into n branches for operation B.]

1.1.2 Strong Multiplication Principle

Unfortunately, we are not always in the setting where the second choice is independent of the first choice.

Example 1.4. (Drawing two cards) After shuffling a deck of cards, a person draws two cards in order. If the two cards have the same value, the person wins the game. What's the probability that this person wins the game?

If we try to apply the multiplication principle as stated above to count the number of possibilities, we will find a problem: the second choice depends on the first choice! However, we should also note that the number of 2nd choices does not depend on the 1st choice. This last observation allows us to use the multiplication principle to count.

Regarding the total number of possibilities, note that, in our 1st choice, we can choose 52 possible cards, while, in the 2nd one, we can only choose 51 cards (although which cards we can choose depends on the 1st choice!). In this way, we see that there are in total 52 · 51 = 2652 possibilities.
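The total count 52 · 51 can be confirmed by enumeration; a sketch in Python (not part of the original notes), where a card is modeled as a (value, suit) pair:

```python
from itertools import permutations

# A card is a (value, suit) pair; values 1..13, four suits S, H, D, C.
deck = [(value, suit) for value in range(1, 14) for suit in "SHDC"]

# Ordered draws of two distinct cards: 52 choices for the 1st, 51 for the 2nd.
draws = list(permutations(deck, 2))
print(len(draws))   # 2652
```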
Regarding the number of favorable possibilities, we can choose the 1st card arbitrarily, but the 2nd card has to have the same value as the 1st one. This means that we have 52 1st choices, but only 3 2nd choices, as only three other cards have the same value. In this way, we have a total of 52 · 3 = 156 favorable possibilities. Hence the probability that the person wins the game is 156/2652 = 3/51 ≃ 0.0588... △

The above example motivates the following version of the multiplication principle:

Theorem 1.B. (Strong multiplication principle) If we perform two consecutive choices and:

(1) The first choice has n possibilities.

(2) The second choice has m possibilities independently of what we chose in the first choice, even though the 2nd choice's possibilities themselves might depend on the first choice.

Then performing these two consecutive choices has nm possibilities.

Alternatively, assume we perform two operations A and B consecutively. If operation A can be performed in n possible ways, and operation B can be performed in m ways, which might depend on the way we performed A, then we can perform A and B consecutively in nm ways.

Example 1.5. (Rolling dice: no repetition probability) You roll some (fair) 6-sided dice; what's the probability that no two dice show the same number? Of course, if we roll more than 6 dice, at least two will show the same number (why?!), so we will focus on the case of rolling n ≤ 6 dice.

If we roll two dice, we have seen that the number of total possibilities is 6^2. Now, the number of favorable cases is 6 · 5, because we can choose any of the six values for the 1st die, but then we can only choose five values for the 2nd die. In this way, the probability of not having a repeated number after rolling two dice is 5/6.

If we roll three dice, there are a total of 6^3 possibilities, of which only 6 · 5 · 4 are favorable.
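The strong multiplication principle's prediction 6 · 5 · · · (6 − n + 1) can be checked against a brute-force count; a sketch in Python (not part of the original notes):

```python
from itertools import product

def no_repeat_count(n):
    """Brute-force count of n-die rolls in which no number repeats."""
    rolls = list(product(range(1, 7), repeat=n))
    favorable = sum(1 for r in rolls if len(set(r)) == n)
    return favorable, len(rolls)

for n in (2, 3, 4):
    favorable, total = no_repeat_count(n)
    # Strong multiplication principle: 6 * 5 * ... * (6 - n + 1) favorable cases.
    predicted = 1
    for i in range(n):
        predicted *= 6 - i
    print(n, favorable, predicted, total)
```

For n = 2 this prints 30 favorable cases out of 36, matching the probability 5/6 derived above.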
This is the case because for the 1st die, we have 6 choices; for the 2nd die, we have 5 choices (the 5 numbers we didn't choose in the 1st choice); and for the 3rd die, we have 4 choices (the 4 remaining numbers, chosen in neither the 1st nor the 2nd choice).

In general, we leave as an exercise to show that the probability that, after rolling n ≤ 6 dice, no number is repeated is

    6 · 5 · · · (6 − n + 1) / 6^n,

i.e.,

    n      1    2      3          4            5              6              >6
    prob   1   5/6   5·4/6^2   5·4·3/6^3   5·4·3·2/6^4   5·4·3·2·1/6^5    0

△

In order to avoid the tedious notation of the example above, the factorial is a useful notation.

Definition 1.a. (Factorial) Let n be a non-negative integer. The factorial of n, written n!, is the number given by

    n! = n (n − 1) · · · 2 · 1

where, by convention, 0! = 1. In other words, n! is the product of the first n positive integers, where, by convention, the empty product of the first zero positive integers is one.

Remark 1.1. Observe that n! = n · (n − 1)!. ¶

In this way, the probability that, after rolling n ≤ 6 (fair) 6-sided dice, no pair of dice shows the same number is 6!/((6 − n)! 6^n). In general, expressions of the form

    n!/(n − k)! = n · (n − 1) · (n − 2) · · · (n − k + 1)

will be very common when applying the strong multiplication principle. See Theorem 1.F for a precise meaning of this number.

1.1.3 Codification Principle

So far, our choices have been very straightforward. However, sometimes we need to be more creative when codifying possibilities as a sequence of choices. The main idea is the following theorem.

Theorem 1.C. (Codification Principle) Assume that we construct the possibilities of some problem by performing a sequence of choices A1, ..., Ar, where a choice might depend on the previous ones.
If for every possibility there is a sequence of choices giving that possibility, and no two different sequences of choices give the same possibility, then the number of possibilities is precisely the number of ways we can perform the sequence of choices.

Remark 1.2. Note that two sequences of choices are equal when they have the same number of choices and both sequences have the same 1st choice, the same 2nd choice, the same 3rd choice... ¶

We now see some examples of how the codification happens in general.

Example 1.6. (Words with four consonants and one vowel) Ludwig Wittgenstein writes in the English alphabet a word with five letters uniformly at random¹; what's the probability that this word has four consonants and one vowel? Note that since the word is uniformly at random, it might be a word such as HAXWB that does not make any sense in English.

¹ Imagine Wittgenstein rolls five times a (fair) 26-sided die with the letters on its faces.

In total, there are 26^5 possible words that the person can generate. This is so because there are 26 choices for each letter of the English alphabet; therefore the multiplication principle gives us 26 · · · 26 = 26^5 possibilities.

To count the number of favorable cases, how can we construct a word with four consonants and one vowel? If we try to first choose the 1st letter, then the 2nd, then the 3rd, then the 4th and then the 5th, we can see that once we choose to put a vowel the number of choices changes. Hence we cannot apply the multiplication principle with this sequence of choices. However, we can do the following:

1) Choose the position where the vowel goes.
2) Choose which vowel to put.
3) Put the consonants in the remaining positions.
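Instead of enumerating all 26^5 words, one can sum over vowel/consonant patterns, which mirrors the codification above; a sketch in Python (not part of the original notes), assuming 5 vowels and 21 consonants:

```python
from itertools import product

VOWELS, CONSONANTS = 5, 21

# Each length-5 word determines a vowel/consonant pattern such as CVCCC.
# Summing 5^(#V) * 21^(#C) over the patterns with exactly one V counts the
# favorable words without listing all 26^5 of them.
favorable = 0
for pattern in product("VC", repeat=5):
    if pattern.count("V") == 1:
        favorable += VOWELS ** 1 * CONSONANTS ** 4

print(favorable)               # 4862025 = 5 * 5 * 21**4
print(favorable / 26 ** 5)     # ≈ 0.4092
```

The loop visits 2^5 = 32 patterns, of which exactly 5 contribute, recovering the 5 · 5 · 21^4 count derived below.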
Observe that this sequence of choices satisfies the conditions of the codification principle, and so it allows us to count the number of words with four consonants and one vowel. Moreover, we can use the multiplication principle, because the number of choices at each step does not depend on the previous choices. In this way, we get

    5 (possible positions) · 5 (possible vowels) · 21 · · · 21 (consonants in the four free positions) = 5 · 5 · 21^4.

Hence the probability that Wittgenstein generates a word with four consonants and one vowel is

    5 · 5 · 21^4 / 26^5 ≃ 0.4092... △

1.1.4 Division Principle

Sometimes we cannot apply the codification principle directly: we end up overcounting, in such a way that we count each case the same number of times. In these cases, we can still count, but to obtain the correct count we need to divide by the number of times we overcount each object.

Theorem 1.D. (Division Principle) Assume that we construct the possibilities of some problem by performing a sequence of choices A1, ..., Ar, where a choice might depend on the previous ones. If for every possibility there are exactly ℓ sequences of choices giving that possibility, then the number of possibilities is precisely the number of ways we can perform the sequence of choices divided by ℓ.

Example 1.7. (Words with three consonants and two vowels) Wittgenstein continues generating words as in Example 1.6; what's the probability that the word has three consonants and two vowels?

The total number of possibilities is the same as in Example 1.6: 26^5. However, counting the favorable cases is harder. We will proceed as in Example 1.6 and construct the word as follows:

1) Choose the position of the 1st vowel.
2) Choose the position of the 2nd vowel.
3) Choose the 1st and 2nd vowels.
4) Choose the consonants in order.
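The overcounting that this construction produces can be seen concretely on the two position choices; a sketch in Python (not part of the original notes):

```python
from itertools import permutations

positions = range(5)

# Choosing the 1st vowel position and then the 2nd gives ordered pairs;
# the word only remembers the unordered set of the two positions.
ordered = list(permutations(positions, 2))
unordered = {frozenset(p) for p in ordered}
print(len(ordered), len(unordered))      # 20 10
print(len(ordered) // len(unordered))    # overcount factor ℓ = 2

# Dividing the ordered count by ℓ gives the favorable words of Example 1.7.
favorable = 5 * 4 * 5**2 * 21**3 // 2
print(favorable / 26**5)                 # ≈ 0.1949
```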
Now, using the multiplication principle, we get 5 · 4 · 5^2 · 21^3 possibilities, since the 1st vowel can go in five possible positions and the 2nd one in the remaining four positions. Now, consider SQU₁I₂D and SQU₂I₁D, where the subscripts indicate which vowel was placed 1st and which 2nd. We can see that our construction procedure does actually repeat words. Fortunately, each word is obtained precisely twice: we just need to exchange the positions and values of the 1st and 2nd vowels. Hence, by the division principle, there are

    (5 · 4 · 5^2 · 21^3) / 2

possibilities. Therefore the probability that Wittgenstein gets a word with three consonants and two vowels is ≃ 0.194..., which is smaller than the probability of getting a word with one vowel and four consonants. △

Example 1.8. (Number of heads after tossing a coin n times) If we toss a coin n times, we will get a sequence of heads and tails, such as HTHHTTTH. Now, what's the probability that we get k heads after tossing a coin n times?

As shown above, the total number of possibilities is 2^n: for each toss, we either get heads or tails, i.e., two possibilities, so by the multiplication principle we have 2 · · · 2 (n times) = 2^n possibilities in total.

How can we construct all sequences with k heads? We only need to choose the tosses at which the heads appear. When we do this, we can choose n possible positions in our 1st choice, n − 1 in our 2nd one, and so on, until n − k + 1 in our kth choice. Now, there is a problem: H₁TH₂TT and H₂TH₁TT, where the subscripts indicate the order in which we have chosen the positions, are the same sequence of heads and tails. We are overcounting! Fortunately, we are counting each sequence k!
times, because, once the k positions are decided, we can choose any of the k positions in our 1st choice, any of the remaining k − 1 in our 2nd choice, and so on. Hence the total number of sequences of n tosses with k heads is

    n!/(k!(n − k)!) = (n!/(n − k)!) / k!,

where n!/(n − k)! is the number of ways we can choose the k positions in order, and k! is the number of times each set of k positions is counted. Hence the probability of getting k heads after tossing a (fair) coin n times is n!/(k!(n − k)! 2^n). △

The binomial coefficients provide a nice way of writing the above probability.

Definition 1.b. (Binomial Coefficient) Let n, k be non-negative integers such that k ≤ n. The binomial coefficient of n and k, written C(n, k) and read "n choose k", is the number given by

    C(n, k) := n!/(k!(n − k)!).

By convention, C(n, k) = 0 for k > n.

Remark 1.3. Observe that C(n, k) = C(n, n − k) and that, for n, k ≥ 1,

    C(n, k) = (n/k) · C(n − 1, k − 1).

¶

In this way, the probability that, after tossing a (fair) coin n times, we get k heads is 2^(−n) C(n, k). See Theorem 1.H for a precise combinatorial meaning of this number.

1.2 Sampling Objects

Many of the above settings can be made more abstract through the mental experiment of sampling objects. The advantage of this is that it allows us to apply a pre-existing count without the need to repeat the count in similar settings. In general, the mental experiment is as follows:

    Given a collection of n distinct objects, we pick k objects uniformly at random.

Unfortunately, the above setting is not completely clear, because of the following two questions:

(R1) After we sample an object, can we sample the same object again? If so, we say that we are sampling with replacement; if not, we are sampling without replacement.

(R2) Does the order in which we sample the objects matter?
If so, we will get an ordered sample, also called a variation; if not, we will get an unordered sample, also called a combination.

The case in which we find ourselves depends on the context of the problem, as the following four examples show.

Example 1.9. (Rolling a die) When we repeatedly toss a coin or roll a die, we are essentially performing an ordered sample with replacement. Why? Because the order in which the outcomes appear matters and every outcome can be repeated. In this way, for example, rolling a die k times is performing an ordered sample with replacement of size k from a collection of 6 distinct objects: the faces of the die. △

Example 1.10. (Choosing a Pokémon team) In the Pokémon games, each player can form a team of up to six Pokémon. In principle, there is no restriction on how many times a Pokémon can be put in the team. In this way, choosing the Pokémon for a player's team is the same as an unordered sample with replacement of size 6 from a collection of 151, 251 or n distinct objects: the Pokémon of the 1st (red, blue and yellow), 2nd (gold and silver) or one of the subsequent generations. However, note that before any Pokémon battle, the Pokémon have to be put in order. Hence choosing a Pokémon team for a battle in a game of the 1st generation is instead an ordered sample with replacement of size 6 from a collection of 151 distinct objects: the Pokémon of the 1st generation of the game. △

Example 1.11. (Poker hand) When we draw a hand in poker, the order in which we draw the cards does not matter. In this way, in poker, drawing a hand is the same as performing an unordered sample without replacement of size 5 from a collection of 52 distinct objects: the deck of cards. △

Example 1.12. (Rack-O hand) When we draw a hand in certain card games, such as Rack-O, we cannot change the location of the cards once they are in our hand. In this game, drawing a hand is the same as performing an ordered sample without replacement of size 10 from a collection of 60 distinct objects: the deck of 60 cards numbered 1 to 60.
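The four sampling regimes just described can be cross-checked against brute-force enumeration for small n and k; a sketch in Python (not part of the original notes):

```python
from itertools import (product, permutations, combinations,
                       combinations_with_replacement)
from math import comb, factorial

n, k = 6, 3
objects = range(n)

# Brute-force counts for each combination of ordered/unordered and
# with/without replacement.
counts = {
    "ordered, with replacement":      len(list(product(objects, repeat=k))),
    "ordered, without replacement":   len(list(permutations(objects, k))),
    "unordered, with replacement":    len(list(combinations_with_replacement(objects, k))),
    "unordered, without replacement": len(list(combinations(objects, k))),
}

# The corresponding closed forms (these are the entries of Table 1.1).
formulas = {
    "ordered, with replacement":      n ** k,
    "ordered, without replacement":   factorial(n) // factorial(n - k),
    "unordered, with replacement":    comb(n + k - 1, k),
    "unordered, without replacement": comb(n, k),
}

for name in counts:
    print(name, counts[name], formulas[name])
```

For n = 6 and k = 3 the four counts are 216, 120, 56 and 20, each matching its formula.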
△

Table 1.1 summarizes the results of Theorems 1.E, 1.F, 1.H and 1.I. These numbers are very useful to remember, since many problems can be codified using sampling from a collection of distinct objects.

                  with replacement      without replacement
    ordered       n^k                   n!/(n − k)!
    unordered     C(n + k − 1, k)       C(n, k)

Table 1.1: Sampling k objects from a collection of n distinct objects

1.2.1 Ordered Sampling/Variations with Replacement

The following theorem gives how many ordered samples with replacement there are.

Theorem 1.E. The number of ordered samples with replacement of k objects from a collection of n distinct objects is n^k.

Proof. We perform k consecutive choices, each of them having n possibilities. Hence, by the weak multiplication principle, the total number is just n · · · n (k times) = n^k. □

The advantage of theorems like the one above is that they simplify counting enormously.

Example 1.13. (Rolling a die three times—revisited) How many possible outcomes are there if we roll a six-sided die three times? We can see the outcomes of rolling a die three times as ordered samples with replacement of size 3 from a population of 6 distinct objects: the six faces of the die. Hence, by Theorem 1.E, there are 6^3 = 216 possibilities. △

1.2.2 Ordered Sampling/Variations without Replacement

The following theorem gives how many ordered samples without replacement there are.

Theorem 1.F. The number of ordered samples without replacement of k objects from a collection of n distinct objects is

    n!/(n − k)! = n (n − 1) · · · (n − k + 1).

Proof. We perform k choices: the 1st has n possibilities, the 2nd has n − 1, the 3rd has n − 2, and so on. In general, the ith choice has n − i + 1 possibilities. In this way, by the strong multiplication principle, we have n (n − 1) · · · (n − k + 1) ways. □

Example 1.14.
(Possible podiums in a race) A race has 100 runners; how many possibilities are there for the podium of the race, assuming that no tie can happen? We observe that selecting runners for the podium is the same as an ordered sample without replacement of size 3 from a collection of 100 distinct objects: the (obviously distinct) 100 runners². Hence, by Theorem 1.F, there are 100!/97! = 970200 possible podiums in the race. △

² We note that, mathematically, it is useful to think of people as objects when counting them, for the sake of making the abstraction easier to apply in full generality. However, outside mathematical considerations, it is not a good idea to abstract people as objects.

Example 1.15. (Books on a shelf) After moving to Baltimore, you want to start a small library for your apartment. You have a budget for buying six books out of the 30 distinct books that the bookshop near your apartment carries. How many libraries can you make on your apartment's bookshelf, if we assume that the order in which you position the books matters? Constructing your library is the same as doing an ordered sample without replacement of size 6 from a collection of 30 distinct objects: the 30 books in the collection of the local bookshop. Hence, by Theorem 1.F, there are 30!/24! = 427518000 possible libraries you can arrange in your new apartment. △

Example 1.16. (Turning over cards of a deck) A magician asks you to turn over, in order, the top five cards of a usual poker deck after you have well shuffled it. Before each card is revealed, the magician will make a guess; what's the probability that the magician guesses all five cards? Of course, if you believe the magician has a trick up her sleeve, the probability of her guessing the cards will be one. However, let's assume that this magician is just guessing at random.
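The counts in Examples 1.14 and 1.15 can be reproduced with Python's standard library, where math.perm(n, k) computes n!/(n − k)!; a quick check (not part of the original notes):

```python
from math import perm, factorial

# Podium of a 100-runner race: ordered sample without replacement of size 3.
print(perm(100, 3))                 # 970200
assert perm(100, 3) == factorial(100) // factorial(97)

# Six ordered books out of the bookshop's 30: ordered sample without
# replacement of size 6.
print(perm(30, 6))                  # 427518000
```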
Note that in this setting the order of the cards matters, as the magician is guessing them in the order they are turned over. In this way, turning over the cards in order is the same as an ordered sample without replacement of size 5 from a population of 52 distinct objects: the deck of cards. Hence, by Theorem 1.F, there are 52!/47! = 311875200 possibilities, and so the probability that the magician guesses all the cards is 47!/52! ≃ 3.2 · 10^−9, i.e., her chances are roughly 3 in a billion! △

1.2.3 Interlude: Anagrams and Ordering Objects

How many ways can we reorder/rearrange a sequence of objects? In particular, how many anagrams³ does a word have? The answer to this question is a neat formula. However, before giving the final answer, let's practice with some examples.

Example 1.17. (Anagrams of CANOPIES) How many anagrams does the word CANOPIES have? Note that CANOPIES doesn't have any repeated letters, so ordering its letters is just taking an ordered sample without replacement of size 8 from a collection of 8 distinct objects: the letters of CANOPIES. Hence, by Theorem 1.F, there are 8! = 40320 anagrams of CANOPIES⁴.

Now, what's the probability that an anagram of CANOPIES chosen uniformly at random starts with the letter A? And that it has all its vowels before its consonants? And that it has its vowels in lexicographic order? For all these questions, we need to use the combinatorial definition of probability and compute the number of favorable cases.

The number of anagrams of CANOPIES that start with A is the same as the number of anagrams of CNOPIES, since we can reorder the other letters freely. Arguing as above, we have 7! possibilities. Hence the probability that a random anagram of CANOPIES starts with A is 7!/8! = 1/8.

If all the vowels go before the consonants, an anagram of CANOPIES is the same as an anagram of AOIE followed by an anagram of CNPS. For the first one, we have 4!
possibilities; and for the second one, we also have 4!. Hence, by the multiplication principle, we have 4! · 4! possible anagrams with all the vowels before the consonants. Thus the probability that a random anagram of CANOPIES has all vowels before the consonants is

    4! 4!/8! = 4!/(5 · 6 · 7 · 8) = 1/70.

³ Recall that an anagram of a word is just another word obtained by reordering the letters. Note that an anagram doesn't have to be an existing word.

⁴ Observe two things: first, 8! = 8!/0!; second, 8! is quite a large number!

[Figure 1.2: The 4! arrangements of WAT₁T₂ and their relationship to the anagrams of WATT: exchanging the subscripts T₁ and T₂ pairs up the arrangements, so each anagram of WATT arises exactly twice.]

To construct an anagram of CANOPIES where the vowels go in lexicographic order, we only need to place the consonants; the vowels will then go in order in the remaining places. Hence choosing such an anagram is the same as choosing a position for each consonant, but this is just an ordered sample without replacement of size 4 from a collection of 8 distinct objects: the positions of the letters. Hence, by Theorem 1.F, we get 8!/4! possibilities, and so the probability that a random anagram of CANOPIES has its vowels in lexicographic order is 1/4! = 1/24. △

In view of the above example, we see that the number of ways we can reorder n distinct objects is n!. However, in many situations, we have repeated objects of the same type that are indistinguishable from each other. What happens then?

Example 1.18. (Anagrams of WATT) How many anagrams does the word WATT have? Unfortunately, not all letters are distinct. Because of this, the number of anagrams of WATT is not 4!.
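A quick brute-force check of this claim (not part of the original notes):

```python
from itertools import permutations
from math import factorial

# permutations treats the two Ts as distinguishable, giving 4! = 24 tuples;
# collecting them in a set collapses the duplicated words.
anagrams = set(permutations("WATT"))
print(len(anagrams))                    # 12
print(factorial(4) // factorial(2))     # 4!/2! = 12
```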
Now, for the sake of argument, assume that we can distinguish the two Ts as T₁ and T₂. In this case, the number of anagrams of WAT₁T₂ is precisely 4!. Now, if we exchange T₁ and T₂, we get the same anagram of WATT. In this way, for every anagram of WATT there are precisely two anagrams of WAT₁T₂, as Figure 1.2 shows. Hence, by the division principle, we have 4!/2! anagrams of WATT, because there are 4! anagrams of WAT₁T₂ and every anagram of WATT corresponds to exactly 2! anagrams of WAT₁T₂. △

Example 1.19. (Anagrams of WWATTTT) What if we have more repeated letters, as in WWATTTT? We proceed as in Example 1.18. First, we consider the number of anagrams of W₁W₂AT₁T₂T₃T₄, where we consider the Tᵢ and the Wᵢ as different letters. This 'new word' has 7! anagrams, but many of them give the same anagram of the word WWATTTT that we are interested in.

As before, to pass from an anagram of W₁W₂AT₁T₂T₃T₄ to an anagram of WWATTTT, we only need to forget the subscripts. In reverse, to pass from an anagram of WWATTTT to an anagram of W₁W₂AT₁T₂T₃T₄, we only need to add the subscripts to the Ws and Ts. However, this can be done in more than one way. On the one hand, to put the subscripts back on the Ws, we have 2! ways. On the other hand, to put them back on the Ts, we have 4! ways: we choose the subscript of the 1st T, then the one of the 2nd T among the remaining three, and so on. Hence each anagram of WWATTTT corresponds to 2! 4! anagrams of W₁W₂AT₁T₂T₃T₄. In this way, by the division principle, we conclude that there are 7!/(2! 4!) anagrams of WWATTTT. △

Using the division principle as in the examples above, we can get the general case. Recall that, when ordering objects, we assume that we cannot distinguish between objects of the same type: they are indistinguishable from each other.

Theorem 1.G.
Consider a list of n objects of r types, with n₁ objects of type 1, n₂ objects of type 2, ..., and n_r objects of type r. The number of ways in which we can order these objects is

    n!/(n₁! n₂! · · · n_r!).

In other words, a word with n letters, where the number of distinct letters is r and there are n₁ copies of the 1st letter, ..., and n_r copies of the rth letter, has

    n!/(n₁! n₂! · · · n_r!)

anagrams.

Remark 1.4. Note that, in the theorem, we have n₁ + ... + n_r = n. ¶

Proof. Number the objects of the same type so that they are distinguishable. After doing this, we have n! possible orderings. Now, given an ordering of the original collection of objects, we can obtain an ordering of the numbered collection of objects by numbering the objects of each type. There are nᵢ! ways of numbering the objects of type i, and so, by the multiplication principle, n₁! · · · n_r! ways of numbering the objects of all types. Hence each ordering of the original n objects corresponds to n₁! · · · n_r! orderings of the numbered objects. Hence, by the division principle, we have n!/(n₁! · · · n_r!) possible orderings. □

Example 1.20. In a football⁵ team, there are 11 players. A coach decides to use the following configuration for the female football team that she is coaching: 1 goalkeeper, 4 defenders, 3 midfielders and 3 forwards.

⁵ As it is called everywhere in the Americas except in Canada and the US.

If the team has 11 players, how many different assignments can the coach make, ignoring the side of the field they are positioned on? And if it is a bigger team of 20 players, where not every player plays, how many assignments are possible then?
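Theorem 1.G can be packaged as a small helper and checked by brute force on the word WWATTTT from Example 1.19; a sketch in Python (not part of the original notes):

```python
from itertools import permutations
from math import factorial

def orderings(type_counts):
    """Number of orderings of objects by type, as in Theorem 1.G."""
    total = factorial(sum(type_counts))
    for c in type_counts:
        total //= factorial(c)
    return total

# Brute-force check on WWATTTT: 2 Ws, 1 A, 4 Ts.
assert orderings([2, 1, 4]) == len(set(permutations("WWATTTT")))
print(orderings([2, 1, 4]))   # 7!/(2! 1! 4!) = 105
```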
Number the 11 players from 1 to 11; then we can view an assignment as an ordering of 1 object of type 1 (the goalkeeper), 4 objects of type 2 (the defenders), 3 objects of type 3 (the midfielders) and 3 objects of type 4 (the forwards). This is illustrated in Figure 1.3. Hence, by Theorem 1.G, there are

    11!/(1! 4! 3! 3!)

possible assignments. If we have 20 players, we proceed in the same way; however, we now have an extra 9 objects of type 5 (the players not on the field). Hence, again by Theorem 1.G, there are

    20!/(1! 4! 3! 3! 9!)

assignments for a team of 20 players. △

[Figure 1.3: A football formation with one goalkeeper (G), four defenders (D), three midfielders (M) and three forwards (F). We can see assignments as a problem equivalent to counting the anagrams of GDDDDMMMFFF, where the kth player is assigned to the type of position indicated by the kth letter of the anagram.]

1.2.4 Unordered Sampling/Combinations without Replacement

The following theorem gives how many unordered samples without replacement there are.

Theorem 1.H. The number of unordered samples without replacement of k objects from a collection of n distinct objects is

    C(n, k) = n!/(k!(n − k)!).

Proof. Observe that there are exactly k! ordered samples without replacement of size k from n distinct objects that give rise to the same unordered sample without replacement of size k from n distinct objects. To see this, note that performing an ordered sample without replacement of size k from n distinct objects is the same as doing an unordered sample without replacement of size k from n distinct objects, followed by an ordered sample without replacement of size k from the collection of k distinct objects sampled before. In this way, by the division principle, the number of unordered samples without replacement of k objects from a collection of n distinct objects is (n!/(n − k)!)/k!. □

Example 1.21.
(Number of heads after tossing a coin n times—revisited) If we flip a fair coin n times, what's the probability that we get exactly k heads?

The total number of possibilities is 2^n, since an outcome is an ordered sample with replacement of size n from a collection of two distinct objects: heads and tails. Now, a sequence of n heads and tails with exactly k heads can be codified as an anagram of the word
$$\overbrace{H\cdots H}^{k\text{ times}}\;\overbrace{T\cdots T}^{n-k\text{ times}}.$$
By Theorem 1.G, there are precisely $\binom{n}{k}$ such anagrams. Hence the probability of getting exactly k heads after n tosses of a fair coin is
$$\frac{1}{2^n}\binom{n}{k}. \qquad\triangle$$

Example 1.22. A candy dish has 20 jelly beans, of which 8 are yellow (lemon) and 12 are red (cherry). If you randomly select 5 jelly beans from the dish, what's the probability that 3 are red and 2 are yellow?

By the combinatorial definition of probability, we need to divide the number of favorable cases by the number of total cases. As the order in which we pick the jelly beans does not matter, there are
$$\binom{20}{5}$$
total possibilities, since this is the same as an unordered sample without replacement of size 5 out of a population of 20 objects. Now, to pick 3 red jelly beans and 2 yellow jelly beans, we first pick 3 red jelly beans out of the total of 12 red ones, and then 2 yellow jelly beans out of the total of 8 yellow ones. Each of these is an unordered sample without replacement, and so, by the multiplication principle, the number of favorable possibilities is
$$\binom{12}{3}\binom{8}{2}.$$
Hence the probability of picking 3 red and 2 yellow jelly beans after picking 5 jelly beans at random from the candy dish is
$$\frac{\binom{12}{3}\binom{8}{2}}{\binom{20}{5}} \simeq 0.397\ldots \qquad\triangle$$

Remark 1.5. Note that in the example above, although in practice we might not be able to distinguish the red jelly beans among themselves, we treat them as distinguishable. This is done because if
the jelly beans were numbered, the probability would still be the same. This is an important property to keep in mind when deciding whether to consider objects distinguishable or not when computing probabilities. ¶

Example 1.23. (Full House in Poker) A full house is a hand in poker with 3 cards of one value and 2 cards of another value. Below, we have two examples of full houses:

[Two example hands shown as card images: A A 7 7 7 and 3 3 5 5 5.]

What's the probability that the dealer hands you a hand (of 5 cards) that is a full house out of a well-shuffled deck (of 52 cards)?

Observe that this problem is very similar to the one of the jelly beans. On the one hand, the total number of possibilities for a poker hand is $\binom{52}{5}$, since drawing a hand is performing an unordered sample without replacement of size 5 out of the deck of 52 cards. On the other hand, to construct a hand that is a full house, we proceed as follows: 1) choose, in order, two distinct values, 2) choose (without order) three cards with the 1st value, and 3) choose (without order) two cards with the 2nd value. In this way, by the multiplication principle, the number of hands that are a full house is
$$\underbrace{13\cdot 12}_{\text{ordered choice of values}}\cdot\underbrace{\binom{4}{3}}_{\substack{\text{choice of three cards}\\\text{with the 1st value}}}\cdot\underbrace{\binom{4}{2}}_{\substack{\text{choice of two cards}\\\text{with the 2nd value}}},$$
because the 1st choice is just an ordered sample without replacement of size 2 out of the 13 different values, the 2nd choice is an unordered sample of size 3 out of the 4 suits of the chosen 1st value, and the 3rd choice is an unordered sample without replacement of size 2 out of the 4 suits of the 2nd chosen value. Hence the probability that we are handed a full house in poker is
$$\frac{13\cdot 12\cdot\binom{4}{3}\binom{4}{2}}{\binom{52}{5}} \simeq 0.00144\ldots,$$
i.e., it has a chance greater than one in a thousand, but smaller than two in a thousand. △

Example 1.24. (Full House in Dice Poker) In dice poker, one rolls five six-sided dice whose six faces are: ace, king, queen, jack, 10 and 9. The corresponding result is then interpreted as a poker hand. What's the probability that a player rolls a full house in dice poker?

In this case, the total number of possibilities is 6^5, since we are just doing an ordered sample with replacement of size 5 out of a collection of 6 distinct objects. Note that the sample is ordered, because we could roll all dice at once or just one die at a time without affecting the probability. For the number of favorable cases, we again perform a similar set of choices: 1) choose, in order, two faces, 2) choose the three dice that will show the first face, and 3) choose the two dice that will show the second face. Note that the order in which we choose the dice does not make a difference. In this way, the number of favorable cases is
$$\underbrace{6\cdot 5}_{\text{ordered choice of faces}}\cdot\underbrace{\binom{5}{3}}_{\substack{\text{choice of three dice}\\\text{with the 1st face}}}\cdot\underbrace{\binom{2}{2}}_{\substack{\text{choice of two dice}\\\text{with the 2nd face}}},$$
since the 1st choice is an ordered sample without replacement of size 2 out of a collection of 6 distinct objects (the faces), the 2nd choice is an unordered sample without replacement of size 3 out of a collection of 5 distinct objects (the dice), and the 3rd choice is an unordered sample without replacement of size 2 out of a collection of 2 distinct objects (the dice not selected in the 2nd choice). Hence the probability of rolling a full house in dice poker is
$$\frac{6\cdot 5\cdot\binom{5}{3}\binom{2}{2}}{6^5} \simeq 0.0385\ldots,$$
i.e., it has a chance greater than 3 in a hundred but less than 4 in a hundred. Note that this is significantly larger than in the case of poker.
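The probabilities in Examples 1.22 through 1.24 are easy to confirm numerically; the following quick check (an illustrative sketch, not part of the original notes) uses Python's `math.comb`:

```python
from math import comb

# Example 1.22: 3 red and 2 yellow jelly beans out of 12 red and 8 yellow
jelly_prob = comb(12, 3) * comb(8, 2) / comb(20, 5)

# Example 1.23: full house in 5-card poker
poker_favorable = 13 * 12 * comb(4, 3) * comb(4, 2)  # values, then suits
poker_prob = poker_favorable / comb(52, 5)

# Example 1.24: full house in dice poker
dice_favorable = 6 * 5 * comb(5, 3) * comb(2, 2)  # faces, then dice
dice_prob = dice_favorable / 6**5

print(round(jelly_prob, 3))  # 0.397
print(round(poker_prob, 5))  # 0.00144
print(round(dice_prob, 4))   # 0.0386
```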
△

1.2.5 Unordered Sampling/Combinations with Replacement (Optional)

The following theorem gives how many unordered samples with replacement there are.

Theorem 1.I. The number of unordered samples with replacement of k objects from a collection of n distinct objects is
$$\binom{n+k-1}{k} = \frac{(n+k-1)!}{k!\,(n-1)!}.$$

Proof. The proof of this result relies on the so-called stars and bars. We codify our sample as a sequence of n − 1 bars ( | ) and k stars (⋆), in such a way that the number of stars between the (i − 1)-th and the i-th bars represents the number of times the i-th object was picked (the stars before the 1st bar count for the 1st object, and the stars after the (n − 1)-th bar for the n-th object). For example, with n = 5 and k = 7,
$$\star\star\,|\,|\,\star\,|\,\star\star\star\,|\,\star$$
means that we picked the 1st object 2 times, the 2nd one zero times, the 3rd one one time, the 4th one 3 times and the 5th one one time. Hence we have reduced counting unordered samples with replacement of k objects of a collection of n distinct objects to counting anagrams of
$$\overbrace{|\cdots|}^{n-1\text{ times}}\;\overbrace{\star\cdots\star}^{k\text{ times}},$$
of which there are $\frac{(n+k-1)!}{k!\,(n-1)!} = \binom{n+k-1}{k}$ by Theorem 1.G. □

1.3 When to compute an expression?

It is important to note that many times in combinatorics one writes a formula of the form
$$\binom{12}{3}\cdot 4^7$$
without computing its numerical value. This is done for two reasons. On the one hand, it allows one to better simplify expressions when computing combinatorial probabilities. On the other hand, it reveals how the counting was done. For example, in this formula we can see that an unordered sample without replacement of size 3 out of a collection of 12 distinct objects was followed by an ordered sample with replacement of size 7 out of a collection of 4 distinct objects.

It is an important test of how well one masters counting whether one is able to see the combinatorial interpretation of formulas such as the one above.
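For a concrete check of Theorem 1.I and of the expression above, one can compare the stars-and-bars formula against brute-force enumeration (an illustrative sketch, not part of the original notes):

```python
from itertools import combinations_with_replacement
from math import comb

# Unordered samples with replacement of size k from n distinct objects,
# counted by brute force and compared with C(n + k - 1, k)
for n in range(1, 6):
    for k in range(6):
        brute = sum(1 for _ in combinations_with_replacement(range(n), k))
        assert brute == comb(n + k - 1, k)

# Evaluating the deliberately unevaluated expression of Section 1.3
print(comb(12, 3) * 4**7)  # 3604480
```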
About these lecture notes

These lecture notes have grown out of some rough notes by Dr. Fred Torcaso, which were originally typed by Dr. Zach Pisano and then significantly improved by Dr. Taylor Jones. The current version is a full rewrite, reorganization and extension by Dr. Josué Tonelli-Cueto, with the feedback of Dr. Taylor Jones.
