Summary

These lecture notes cover greedy algorithms, including coin changing, activity selection, Huffman coding, and the 0-1 and fractional knapsack problems. Greedy-choice strategies, counter-examples, and optimal-substructure arguments are examined.

Full Transcript

Chapter 15: Greedy Algorithm

About this lecture
- Introduce the greedy algorithm
- Look at some problems solvable by a greedy algorithm

Coin Changing
Suppose that in a certain country, the coin denominations consist of $1, $2, $5, and $10. You want to design an algorithm that makes change for any x dollars using the fewest number of coins.

An idea is as follows:
1. Create an empty bag
2. while (x > 0) {
       Find the largest coin c at most x;
       Put c in the bag;
       Set x = x - c;
   }
3. Return the coins in the bag

It is easy to check that the algorithm always returns coins whose sum is x. At each step, the algorithm makes a greedy choice (including the largest coin), which looks best for arriving at an optimal solution (a change with the fewest number of coins). This is an example of a greedy algorithm.

Does the greedy algorithm always work? No! Consider a new set of coin denominations: $1, $4, $5, $10, and suppose we want change for $8:
- Greedy algorithm: 4 coins (5, 1, 1, 1)
- Optimal solution: 2 coins (4, 4)

Greedy Algorithm
We will look at some non-trivial examples where a greedy algorithm works correctly. Usually, to show that a greedy algorithm works:
- We show that some optimal solution includes the greedy choice => selecting the greedy choice is correct
- We show the optimal substructure property => solve the subproblem recursively

Activity Selection
Suppose you are a freshman at a school, and there are many welcoming activities. There are n activities A1, A2, ..., An. Each activity Ak has a start time sk and a finish time fk. Target: join as many activities as possible!

To join activity Ak, you must join at sk and stay until fk. Since we want as many activities as possible, should we choose the one with:
(1) Shortest duration?
(2) Earliest start time?
(3) Earliest finish time?
(4) Latest start time? Latest finish time?
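The largest-coin-first idea above can be sketched as follows (a minimal sketch; the function name is my own, and termination assumes a $1 coin exists):

```python
def greedy_change(x, denominations):
    """Make change for x dollars by repeatedly taking the largest coin at most x."""
    coins = []                                   # the "bag" of coins
    for c in sorted(denominations, reverse=True):
        while x >= c:                            # take coin c while it still fits
            coins.append(c)
            x -= c
    return coins

# Optimal for the denominations {1, 2, 5, 10}:
print(greedy_change(8, [1, 2, 5, 10]))   # [5, 2, 1]
# ...but not for {1, 4, 5, 10}: greedy uses 4 coins, while (4, 4) uses only 2:
print(greedy_change(8, [1, 4, 5, 10]))   # [5, 1, 1, 1]
```

Scanning the denominations in decreasing order is equivalent to "find the largest coin at most x" at every step, because once a coin no longer fits it never fits again.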
Activity Selection (cont.)
Shortest duration may not be good:
A1: [4:50, 5:10), A2: [3:00, 5:00), A3: [5:05, 7:00)
Though not optimal, the number of activities in this solution R (shortest duration first) is at least half the number in an optimal solution O:
- One activity in R clashes with at most 2 activities in O
- If |O| > 2|R|, some activity in O would clash with nothing in R, so R could have included one more activity

Earliest start time may be even worse:
A1: [3:00, 10:00), A2: [3:10, 3:20), A3: [3:20, 3:30), A4: [3:30, 3:40), A5: [3:40, 3:50), ...
In the worst case, this solution contains 1 activity, while an optimal one has n - 1 activities.

Greedy Choice Property
To our surprise, earliest finish time works! We have the following lemma:
Lemma: For the activity selection problem, some optimal solution includes an activity with the earliest finish time.

Proof (by a "cut-and-paste" argument):
Let OPT = an optimal solution, and let Aj = an activity with the earliest finish time. If OPT contains Aj, done! Else, let A' = the earliest-finishing activity in OPT. Since Aj finishes no later than A', we can replace A' by Aj in OPT without conflicting with the other activities in OPT => an optimal solution containing Aj (since it has the same number of activities as OPT).

Optimal Substructure
Let Aj = an activity with the earliest finish time, and let S = the subset of the original activities that do not conflict with Aj. Let OPT = an optimal solution containing Aj.
Lemma: OPT - {Aj} must be an optimal solution for the subproblem with input activities S.

Proof (by contradiction):
First, OPT - {Aj} can contain only activities in S. If it is not an optimal solution for input activities S, let C be some optimal solution for input S
=> C has more activities than OPT - {Aj}
=> C ∪ {Aj} has more activities than OPT
=> contradiction.

Greedy Algorithm
The previous two lemmas imply the following correct greedy algorithm:

S = input set of activities;
while (S is not empty) {
    A = activity in S with the earliest finish time;
    Select A, and update S by removing activities having conflicts with A;
}

If finish times are sorted in the input, running time = O(n).

Pseudocode (assumes activities are sorted by finish time):
GREEDY-ACTIVITY-SELECTOR(s, f)
1  n = s.length
2  A = {a1}
3  k = 1
4  for m = 2 to n
5      if s[m] >= f[k]
6          A = A ∪ {am}
7          k = m
8  return A

Designing a Greedy Algorithm
- Greedy-choice property: a globally optimal solution can be reached by making a locally optimal (greedy) choice.
- Optimal substructure: an optimal solution to the problem contains optimal solutions to its subproblems.

0-1 Knapsack Problem
Suppose you are a thief, and you are now in a jewelry shop (nobody is around!). You have a big knapsack that you have "borrowed" from some shop before.
- Weight limit of the knapsack: W
- There are n items I1, I2, ..., In; item Ik has value vk and weight wk
Target: get items with total value as large as possible without exceeding the weight limit.

We may think of strategies like:
(1) Take the most valuable item first
(2) Take the densest item (with vk/wk maximized) first
Unfortunately, this problem is known to be very hard (NP-complete), so it is unlikely to have a simple good strategy. Let's change the problem a bit...

Fractional Knapsack Problem
In the previous problem, for each item we either take all of it or leave it; we cannot take a fraction of an item. Suppose now we allow taking fractions of the items; precisely, for a fraction c, a c part of Ik has value c·vk and weight c·wk.
Target: get as valuable a load as possible without exceeding the weight limit.

Suddenly, the following strategy works: take as much of the densest item (with vk/wk maximized) as possible.
- The correctness of this greedy-choice property can be shown by a cut-and-paste argument
- It is also easy to see that the problem has the optimal substructure property
=> together these imply a correct greedy algorithm.

However, the previous greedy algorithm (pick the densest) does not work for the 0-1 knapsack. To see why, consider W = 50 and: I1 : v1 =
$60, w1 = 10 (density: 6); I2 : v2 = $100, w2 = 20 (density: 5); I3 : v3 = $120, w3 = 30 (density: 4)
- Greedy algorithm: $160 (I1, I2)
- Optimal solution: $220 (I2, I3)

Encoding Characters
In ASCII, each character is encoded using the same number of bits (8 bits); this is called fixed-length encoding. However, in real-life English texts, not every character occurs with the same frequency. One way to encode a text is:
- Encode frequent chars with few bits
- Encode infrequent chars with more bits
=> called variable-length encoding

Variable-length encoding may save a lot of storage. Example:
- Suppose we have a 100K-char file consisting only of the chars a, b, c, d, e, f
- Suppose we know a occurs 45K times, and the other chars occur 11K times each
=> Fixed-length encoding (3 bits per char): 300K bits

Example (cont.): suppose we encode the chars as follows:
a -> 0, b -> 100, c -> 101, d -> 110, e -> 1110, f -> 1111
Storage with the above encoding: (45×1 + 33×3 + 22×4) × 1K = 232K bits (a saving of about 23%!)

Thinking a step ahead, you may consider an even "better" encoding scheme:
a -> 0, b -> 1, c -> 00, d -> 01, e -> 10, f -> 11
This encoding requires less storage, since each char is encoded in fewer bits... What's wrong with this encoding?
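To make the flaw concrete, here is a small sketch (the enumeration function is my own) that lists every text encoding to the bit string 0101 under this scheme:

```python
# The "better" encoding from the notes; it is NOT prefix-free (0 is a prefix of 00 and 01)
CODE = {'a': '0', 'b': '1', 'c': '00', 'd': '01', 'e': '10', 'f': '11'}

def decodings(bits):
    """Return every text whose encoding equals the given bit string."""
    if bits == '':
        return ['']
    results = []
    for ch, w in CODE.items():
        if bits.startswith(w):                    # try ch as the first codeword
            results += [ch + rest for rest in decodings(bits[len(w):])]
    return results

print(sorted(decodings('0101')))   # ['abab', 'abd', 'aeb', 'dab', 'dd']
```

Five different texts share the same encoding, so the original text cannot be recovered.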
Prefix Code
Suppose the encoded text is: 0101. We cannot tell whether the original text was abab, dd, abd, aeb, or ... The problem comes from one codeword being a prefix of another.

To avoid this problem, we generally want no codeword to be a prefix of another; such a code is called a prefix code (or prefix-free code). Let T = a text encoded by a prefix code. We can easily decode T back to the original:
- Scan T from the beginning
- Once we see a codeword, output the corresponding char
- Then recursively decode the remainder

Prefix Code Tree
Naturally, a prefix code scheme corresponds to a prefix code tree:
- Each char => a leaf
- Root-to-leaf path => codeword
E.g., a -> 0, b -> 100, c -> 101, d -> 110, e -> 1110, f -> 1111
[Figure: the corresponding code tree; edges are labeled 0 (left) and 1 (right), with a a child of the root, b, c, d at depth 3, and e, f at depth 4]

Optimal Prefix Code
Question: given the frequency of each char, how do we find the optimal prefix code scheme (or optimal prefix code tree)?
Precisely:
Input: S = a set of n chars c1, c2, ..., cn, where ck occurs f(ck) times
Target: find a codeword wk for each ck such that Σk |wk|·f(ck) is minimized

Huffman Code
In 1952, David Albert Huffman (then an MIT PhD student) thought of a greedy algorithm to obtain the optimal prefix code tree. Let c and c' be the chars with the least frequencies. He observed:
Lemma: There is some optimal prefix code tree with c and c' sharing the same parent, with these two leaves farthest from the root.

Proof (by "cut-and-paste"):
Let OPT = some optimal solution. If c and c' are as required, done! Else, let a and b be two bottom-most leaves sharing the same parent (such leaves must exist... why??). Swap a with c, and swap b with c' => an optimal solution as required (since it has at most the same Σk |wk|·f(ck) as OPT... why??).
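The scan-and-output decoding procedure above can be sketched as follows (a minimal sketch using the example codewords from the notes; outputting on the first match is sound precisely because the code is prefix-free):

```python
# Prefix code from the notes' example
CODE = {'a': '0', 'b': '100', 'c': '101', 'd': '110', 'e': '1110', 'f': '1111'}

def encode(text):
    return ''.join(CODE[ch] for ch in text)

def decode(bits):
    """Scan from the beginning; output a char as soon as a full codeword is read."""
    inverse = {w: ch for ch, w in CODE.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:        # a complete codeword: emit it and restart the buffer
            out.append(inverse[buf])
            buf = ''
    assert buf == '', 'input was not a valid code sequence'
    return ''.join(out)

print(decode(encode('badface')))   # badface -- decoding is unambiguous
```

Unlike the non-prefix scheme, every bit string produced by encode has exactly one decoding.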
[Figure: swapping c and c' into the two bottom-most sibling leaves a and b]

Optimal Substructure
Let OPT be an optimal prefix code tree with c and c' as required. Let T' be the tree formed by merging c, c', and their parent into one node. Consider S' = the set formed by removing c and c' from S but adding X, with f(X) = f(c) + f(c').
Lemma: If T' is an optimal prefix code tree for S', then T, obtained from T' by replacing the leaf node X with an internal node having children c and c', is an optimal prefix code tree for S.

[Figure: tree T for S and tree T' for S'; merging c, c', and their parent gives the node X, with f(X) = f(c) + f(c'); if T' is optimal for S', then T is optimal for S]

Proof:
If T is not an optimal tree for S, then there exists an optimal tree T'' for S with cost(T'') < cost(T). Let T''' be the tree T'' with the common parent of c and c' replaced by a leaf X. Then
cost(T''') = cost(T'') - f(c) - f(c') < cost(T) - f(c) - f(c') = cost(T'),
yielding a contradiction to the assumption that T' is an optimal prefix code tree for S'.

[Figure: tree T'' for S and tree T''' for S', formed by the same merging step; if T'' is optimal for S and better than T, then T''' beats T' for S']

Huffman Code
Questions: based on the previous lemmas, can you obtain Huffman's coding scheme? What is the running time? O(n log n) time, using a heap (how??).

Huffman(S) {   // build the Huffman code tree
1. Find the least frequent chars c and c'
2. S' = remove c and c' from S, but add a char X with f(X) = f(c) + f(c')
3. T' = Huffman(S')
4. Make the leaf X of T' an internal node by connecting two leaves c and c' to it
5. Return the resulting tree
}

Constructing a Huffman Code
HUFFMAN(C)
1  n = |C|
2  Q = C   // a min-priority queue keyed on frequency
3  for i = 1 to n - 1
4      allocate a new node z
5      z.left = x = EXTRACT-MIN(Q)
6      z.right = y = EXTRACT-MIN(Q)
7      z.freq = x.freq + y.freq
8      INSERT(Q, z)
9  return EXTRACT-MIN(Q)   // the root of the tree is the only node left
Complexity: O(n log n)

Practice at Home
Exercises: 15.1-2, 15.1-3, 15.1-4, 15.2-2, 15.2-3, 15.2-4, 15.2-6, 15.3-2, 15.3-3, 15.3-5
Problem 15-2: We have learned a solution to the maximum-subarray problem using divide-and-conquer. The problem can also be solved with a greedy algorithm. Write down the pseudocode and analyze the running time.
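The iterative HUFFMAN(C) pseudocode can be sketched in runnable form as follows (a minimal sketch; the function name and tuple-based tree representation are my own, and a counter breaks frequency ties):

```python
import heapq

def huffman(freqs):
    """Build a Huffman code tree from a dict char -> frequency.

    Mirrors HUFFMAN(C): repeatedly EXTRACT-MIN twice, merge the two
    least-frequent nodes, and INSERT the merged node back into the queue.
    Returns a dict char -> codeword.
    """
    # Heap entries: (freq, tie_breaker, tree); a tree is a char or a (left, right) pair
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    for _ in range(len(freqs) - 1):          # n - 1 merges
        f1, _, left = heapq.heappop(heap)    # EXTRACT-MIN
        f2, _, right = heapq.heappop(heap)   # EXTRACT-MIN
        count += 1
        heapq.heappush(heap, (f1 + f2, count, (left, right)))   # INSERT
    _, _, root = heap[0]

    codes = {}
    def walk(node, word):
        if isinstance(node, tuple):          # internal node: 0 = left edge, 1 = right edge
            walk(node[0], word + '0')
            walk(node[1], word + '1')
        else:
            codes[node] = word or '0'        # single-char corner case
    walk(root, '')
    return codes

# Frequencies (in thousands of occurrences) from the notes' 100K-char example
FREQS = {'a': 45, 'b': 11, 'c': 11, 'd': 11, 'e': 11, 'f': 11}
codes = huffman(FREQS)
cost = sum(len(codes[ch]) * f for ch, f in FREQS.items())
print(len(codes['a']), cost)   # a gets a 1-bit codeword; total cost 232 (i.e., 232K bits)
```

Each of the n - 1 iterations performs O(log n) heap operations, giving the O(n log n) bound stated in the notes; the resulting cost matches the 232K bits computed for the hand-built code earlier.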
