Algorithms Analysis And Design Chapter 9: Greedy Technique PDF
Document Details
Uploaded by PlayfulErbium
UKM
2012
A. Levitin
Summary
This document discusses the greedy technique in algorithm design. It covers the technique's applications with worked examples, including the change-making problem, minimum spanning trees, single-source shortest paths, and Huffman codes.
Full Transcript
Algorithms Analysis and Design
Chapter 9: Greedy Technique
A. Levitin, “Introduction to the Design & Analysis of Algorithms,” 3rd ed., Ch. 9. ©2012 Pearson Education, Inc., Upper Saddle River, NJ. All Rights Reserved.

Greedy Technique
Constructs a solution to an optimization problem piece by piece through a sequence of choices that are:
- feasible
- locally optimal
- irrevocable
For some problems, this yields an optimal solution for every instance. For most, it does not, but it can still be useful for fast approximations.

Applications of the Greedy Strategy
Optimal solutions:
- change making for “normal” coin denominations
- minimum spanning tree (MST)
- single-source shortest paths
- simple scheduling problems
- Huffman codes
Approximations:
- traveling salesman problem (TSP)
- knapsack problem
- other combinatorial optimization problems

Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm, give change for amount n with the least number of coins.
Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c.
Greedy solution: 25c + 10c + 10c + 1c + 1c + 1c, i.e., 6 coins.
The greedy solution is optimal for any amount with a “normal” set of denominations, but may not be optimal for arbitrary coin denominations.

Change-Making Problem Example
Find the minimum number of coins which satisfies the value.
Example: coins[] = {5, 10, 20, 25}, value = 50.
Possible solutions {coin * count}:
- {5 * 10} = 50 [10 coins]
- {5 * 8 + 10 * 1} = 50 [9 coins]
- {10 * 5} = 50 [5 coins]
- {20 * 2 + 10 * 1} = 50 [3 coins]
- {20 * 2 + 5 * 2} = 50 [4 coins]
- {25 * 2} = 50 [2 coins]
Best solution: {25 * 2} = 50 [2 coins].
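The greedy rule described above (repeatedly take the largest denomination that still fits) can be sketched in a few lines of Python. This is a minimal illustration, not from the slides; the function and variable names are my own, and the last example of a non-optimal greedy result is mine as well.

```python
def greedy_change(denominations, amount):
    """Greedy change-making: always take the largest coin that still fits."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins  # may leave a remainder if there is no 1-unit coin

# Slide example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c, n = 48c
print(greedy_change([25, 10, 5, 1], 48))   # [25, 10, 10, 1, 1, 1] -> 6 coins

# Second slide example: coins {5, 10, 20, 25}, value 50 -> best solution {25 * 2}
print(greedy_change([5, 10, 20, 25], 50))  # [25, 25] -> 2 coins

# For arbitrary denominations greedy can fail (my example, not from the slides):
print(greedy_change([1, 20, 25], 40))      # 25 + sixteen 1s... 16 coins, but 20 + 20 uses only 2
```

The failure case shows why the slides restrict the optimality claim to “normal” denomination sets.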
MST: Kruskal’s Algorithm
- Sort the edges in nondecreasing order of lengths.
- “Grow” the tree one edge at a time to produce the MST through a series of expanding forests F1, F2, …, Fn-1.
- On each iteration, add the next edge on the sorted list unless this would create a cycle. (If it would, skip the edge.)

[Worked example of Kruskal’s algorithm on a sample graph; figures not recoverable from the transcript.]

Shortest Paths: Dijkstra’s Algorithm
Single-source shortest paths problem: given a weighted connected graph G, find the shortest paths from a source vertex s to each of the other vertices.
Dijkstra’s algorithm: start a tree with s as the root. Among the vertices not already in the tree, find the vertex u with the smallest sum dv + w(v, u), where
- v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree),
- dv is the length of the shortest path from the source to v, and
- w(v, u) is the length (weight) of the edge from v to u.

[Worked example of Dijkstra’s algorithm, beginning by visiting vertex A; figures not recoverable from the transcript.]
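Dijkstra’s procedure as described above can be sketched with a min-heap priority queue (the implementation the efficiency notes later attribute an O(|E| log |V|) bound to). This is a minimal sketch under my own naming conventions; the sample graph is hypothetical, since the slides’ figures did not survive the transcript.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source.

    graph maps each vertex to a list of (neighbor, weight) pairs.
    """
    dist = {source: 0}
    pq = [(0, source)]                     # min-heap of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry: skip
        for v, w in graph.get(u, []):
            nd = d + w                     # dv + w(v, u) from the slide
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical directed graph, not the one from the slides' figures
graph = {
    "A": [("B", 3), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 2)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 3, 'C': 4, 'D': 6}
```

Note that, as the slides warn below, this gives wrong answers if any edge weight is negative.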
[The Dijkstra example continues; the next vertex visited is B. Figures not recoverable from the transcript.]

Notes on Dijkstra’s Algorithm
- Doesn’t work for graphs with negative weights.
- Applicable to both undirected and directed graphs.
- Efficiency:
  - O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue
  - O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue

Coding Problem
- Coding: assignment of bit strings to alphabet characters.
- Codewords: the bit strings assigned to the characters of the alphabet.
- Two types of codes:
  - fixed-length encoding (e.g., ASCII)
  - variable-length encoding (e.g., Morse code)
- Prefix-free codes: no codeword is a prefix of another codeword.
- Problem: if the frequencies of character occurrences are known, what is the best binary prefix-free code?

Huffman Code
Any binary tree with edges labeled with 0’s and 1’s yields a prefix-free code for the characters assigned to its leaves. An optimal binary tree, minimizing the expected (weighted average) length of a codeword, can be constructed as follows.

Huffman’s algorithm:
- Initialize n one-node trees with the alphabet characters, and set the tree weights to their frequencies.
- Repeat the following step n-1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees.
- Mark the edges leading to left and right subtrees with 0’s and 1’s, respectively.
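Huffman’s algorithm as stated above can be sketched with a priority queue of weighted trees. A minimal sketch, with my own naming: which child becomes “left” or “right” depends on pop order and tie-breaking, so the exact bit patterns may differ from the slides’ figures, but the codeword lengths (and hence the expected length) agree.

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Build prefix-free codes from a symbol -> frequency map (Huffman's algorithm)."""
    tick = count()  # unique tiebreaker so the heap never compares tree nodes
    heap = [(f, next(tick), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two trees of smallest weight
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):          # internal node: (left, right)
            walk(node[0], code + "0")        # left edge labeled 0
            walk(node[1], code + "1")        # right edge labeled 1
        else:                                # leaf: a symbol
            codes[node] = code or "0"        # one-symbol-alphabet edge case
    walk(heap[0][2], "")
    return codes

# Frequencies from the tree-construction figures below: A=3, H=2, C=5, E=8, I=7
print(huffman_codes({"A": 3, "H": 2, "C": 5, "E": 8, "I": 7}))
```

For these frequencies the resulting codeword lengths are A: 3, H: 3, C: 2, E: 2, I: 2 bits, matching the codes E = 01, I = 00, C = 10, A = 111, H = 110 derived in the figures.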
Huffman Code Approach
- Variable-length encoding of symbols.
- Exploits the statistical frequency of symbols; efficient when symbol probabilities vary widely.
- Principle:
  - use fewer bits to represent frequent symbols;
  - use more bits to represent infrequent symbols.
- Example input: A A B A A A B A

Huffman Code Example

symbol               A        B        C        D
frequency            13%      25%      50%      12%
original encoding    00       01       10       11
                     (2 bits) (2 bits) (2 bits) (2 bits)
Huffman encoding     110      10       0        111
                     (3 bits) (2 bits) (1 bit)  (3 bits)

Expected size (average bits per character):
- original: 0.13*2 + 0.25*2 + 0.5*2 + 0.12*2 = 2 bits/symbol
- Huffman: 0.13*3 + 0.25*2 + 0.5*1 + 0.12*3 = 1.75 bits/symbol

Huffman Code Data Structures
- A binary (Huffman) tree represents the code:
  - edge: code bit (0 or 1)
  - leaf: symbol
  - path to a leaf: that symbol’s encoding
  - Example: A = “110”, B = “10”, C = “0”
- A priority queue is used to build the binary tree efficiently.

Huffman Code Algorithm Overview: Encoding
- Calculate the frequency of symbols in the file.
- Create a binary tree representing the “best” encoding.
- Use the binary tree to encode the compressed file:
  - for each symbol, output the path from the root to its leaf;
  - the size of the encoding is the length of the path.
- Save the binary tree.

Huffman Code: Creating the Tree
- Place each symbol in a leaf; the weight of the leaf is the symbol’s frequency.
- Select two trees L and R (initially leaves) such that L and R have the lowest frequencies.
- Create a new internal node with left child L, right child R, and frequency frequency(L) + frequency(R).
- Repeat until all nodes are merged into one tree.

[Huffman tree construction, steps 1–5: the leaves A (3), H (2), C (5), E (8) and I (7) are merged pairwise into a single tree of total weight 25, yielding the codes E = 01, I = 00, C = 10, A = 111, H = 110. Figures not recoverable from the transcript.]

Huffman Coding Example
- Huffman code: E = 01, I = 00, C = 10, A = 111, H = 110
- Input: ACE
- Output: (111)(10)(01) = 1111001

Huffman Code Algorithm Overview: Decoding
- Read the file and the binary tree.
- Use the binary tree to decode the file: follow the path from the root to a leaf.
[Huffman decoding, steps 1–7: decoding 1111001 by repeated root-to-leaf walks through the tree built above yields A, then C, then E, i.e., ACE. Figures not recoverable from the transcript.]

Huffman Code Properties
- Prefix code: no code is a prefix of another code.
  - Example: Huffman(“I”) = 00, so 001 would not be a legal codeword for another symbol (“X”), because 00 is its prefix.
  - Decoding can stop as soon as a complete code is found; no end-of-code marker is needed.
- Nondeterministic: multiple Huffman codings are possible for the same input if more than two trees have the same minimal weight.
- Greedy algorithm: chooses the best local solution at each step (combines the two trees with the lowest frequencies), yet still yields the overall best solution.
- Optimal prefix code, based on statistical frequency. Better compression may be possible (depending on the data) using other approaches (e.g., a pattern dictionary).

Example

character   A     B     C     D     _
frequency   0.35  0.10  0.20  0.20  0.15
codeword    11    100   00    01    101

[Tree construction: B (0.1) and _ (0.15) merge into a node of weight 0.25; C (0.2) and D (0.2) merge into 0.4; the 0.25 node and A (0.35) merge into 0.6; finally 0.4 and 0.6 merge into the root of weight 1.0. Figures not recoverable from the transcript.]

- Average bits per character: 0.35*2 + 0.10*3 + 0.20*2 + 0.20*2 + 0.15*3 = 2.25
- For fixed-length encoding: 3 bits per character
- Compression ratio: (3 - 2.25)/3 * 100% = 25%
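The decoding walk and the closing example above can both be checked in a few lines. A minimal sketch with my own naming: instead of walking an explicit tree, it scans the bit string for complete codewords, which is equivalent precisely because the code is prefix-free.

```python
def huffman_decode(bits, codes):
    """Decode a bit string by scanning for complete codewords.

    Works left to right because the code is prefix-free: the first
    codeword matching the buffer is the only one that can.
    """
    inverse = {code: sym for sym, code in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:           # complete codeword: emit symbol, restart
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

# Codes from the tree-construction example
codes = {"E": "01", "I": "00", "C": "10", "A": "111", "H": "110"}
print(huffman_decode("1111001", codes))  # ACE, as in the decoding slides

# Average codeword length for the closing example
freq = {"A": 0.35, "B": 0.10, "C": 0.20, "D": 0.20, "_": 0.15}
final_codes = {"A": "11", "B": "100", "C": "00", "D": "01", "_": "101"}
avg = sum(freq[s] * len(c) for s, c in final_codes.items())
print(round(avg, 2))  # 2.25 bits per symbol, matching the slide
```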