CUNY Queens College
Heaps and HeapSort

Heap: A heap is a specialized tree-based data structure that satisfies the heap property. For a max-heap, each parent node is greater than or equal to its children, whereas for a min-heap, each parent node is less than or equal to its children.

Binary Heap: The most common type of heap, typically implemented as an array. For a node at index i:
- Parent: floor((i - 1) / 2)
- Left child: 2i + 1
- Right child: 2i + 2

HeapSort
Steps:
1. Build a max-heap from the input data.
2. Swap the root (maximum value) with the last item of the heap, reduce the size of the heap by one, and heapify the root.
3. Repeat step 2 until the heap is empty.
Time Complexity:
- Building the heap: O(n)
- Heapify operation: O(log n)
- Overall: O(n log n)

QuickSort
Steps:
1. Choose a pivot element from the array.
2. Partition the array into two parts: elements less than the pivot and elements greater than the pivot.
3. Recursively sort the two parts.
Time Complexity:
- Best and average case: O(n log n)
- Worst case: O(n^2) (when the smallest or largest element is always chosen as the pivot, e.g., for an already sorted array without randomization)

MergeSort
Steps:
1. Divide the array into two halves.
2. Recursively sort each half.
3. Merge the two sorted halves to produce a sorted array.
Time Complexity: Best, average, and worst case: O(n log n)

Radix Sort
Steps:
1. Sort the numbers based on each digit, starting from the least significant digit and ending with the most significant digit.
2. Use a stable sort (like Counting Sort) for each digit.
Time Complexity: O(d(n + k)), where d is the number of digits and k is the range of the digit values.

Counting Sort
Steps:
1. Count the occurrences of each element.
2. Compute the position of each element in the sorted array.
3. Place the elements in their correct positions.
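The HeapSort steps above can be sketched as follows (a minimal in-place version; the function names are my own):

```python
def heapify(a, n, i):
    # Sift a[i] down so the subtree rooted at i satisfies the max-heap property.
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)

def heapsort(a):
    n = len(a)
    # Step 1: build a max-heap bottom-up, O(n).
    for i in range(n // 2 - 1, -1, -1):
        heapify(a, n, i)
    # Steps 2-3: repeatedly move the max to the end and restore the heap, O(n log n).
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        heapify(a, end, 0)
    return a
```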
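The QuickSort steps can be sketched like this (a non-in-place version for clarity; a random pivot avoids the O(n^2) sorted-input case mentioned above):

```python
import random

def quicksort(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)  # random pivot choice
    # Partition into three parts; grouping equals handles duplicates cleanly.
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```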
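The MergeSort steps map directly to code (a minimal sketch returning a new list):

```python
def mergesort(a):
    if len(a) <= 1:
        return list(a)
    # Step 1: divide into two halves; step 2: sort each recursively.
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    # Step 3: merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```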
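Counting Sort and Radix Sort fit together: a stable counting sort on one digit is the subroutine Radix Sort applies digit by digit. A minimal sketch for non-negative integers (function names are my own):

```python
def counting_sort_by_digit(a, exp):
    # Stable counting sort of non-negative ints by the digit at place value `exp`.
    count = [0] * 10
    for x in a:                        # step 1: count occurrences of each digit
        count[(x // exp) % 10] += 1
    for d in range(1, 10):             # step 2: prefix sums give final positions
        count[d] += count[d - 1]
    out = [0] * len(a)
    for x in reversed(a):              # step 3: place elements; reverse keeps it stable
        digit = (x // exp) % 10
        count[digit] -= 1
        out[count[digit]] = x
    return out

def radix_sort(a):
    if not a:
        return []
    exp = 1
    # Sort by each digit, least significant first.
    while max(a) // exp > 0:
        a = counting_sort_by_digit(a, exp)
        exp *= 10
    return a
```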
Time Complexity: O(n + k), where k is the range of the input values.

Divide and Conquer Strategies
Definition: A technique that breaks a problem into smaller subproblems, solves each subproblem independently, and combines their solutions to solve the original problem.
Examples:
- MergeSort
- QuickSort
- Binary Search: search for an element in a sorted array by repeatedly dividing the search interval in half.

Greedy Algorithms
Definition: A technique that makes a series of choices, each of which looks best at the moment, to solve an optimization problem.
Examples:
- Kruskal's Algorithm: for finding the Minimum Spanning Tree (MST).
- Prim's Algorithm: for finding the MST.
- Dijkstra's Algorithm: for finding the shortest paths in a graph with non-negative weights.
Optimization: Greedy algorithms work well for problems with the greedy-choice property (a globally optimal solution can be reached by making locally optimal choices) and optimal substructure (an optimal solution to the problem contains optimal solutions to its subproblems).

Master Theorem
Definition: A formula for solving recurrences of the form T(n) = aT(n/b) + f(n), where a >= 1 and b > 1.
Cases (comparing f(n) = O(n^c) against log_b(a)):
1. If c < log_b(a), then T(n) = Θ(n^(log_b a)).
2. If c = log_b(a), then T(n) = Θ(n^c log n).
3. If c > log_b(a) (and f satisfies the regularity condition), then T(n) = Θ(f(n)).
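As a worked example of the Master Theorem, the MergeSort recurrence falls under case 2:

```latex
T(n) = 2T(n/2) + O(n), \qquad a = 2,\; b = 2,\; f(n) = O(n^1)
\log_b a = \log_2 2 = 1 = c \;\Rightarrow\; \text{case 2}
T(n) = \Theta(n^1 \log n) = \Theta(n \log n)
```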
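The Binary Search listed under Divide and Conquer above can be sketched as an iterative loop:

```python
def binary_search(a, target):
    # Returns an index of `target` in sorted list `a`, or -1 if absent. O(log n).
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # halve the search interval each iteration
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```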
Runtimes for Sorts
- Bubble Sort: O(n^2)
- Insertion Sort: O(n^2)
- Selection Sort: O(n^2)
- MergeSort: O(n log n)
- QuickSort: average O(n log n), worst O(n^2)
- HeapSort: O(n log n)
- Counting Sort: O(n + k)
- Radix Sort: O(d(n + k))

Runtimes for Data Structures
- Arrays: access O(1), search O(n), insertion/deletion O(n)
- Linked Lists: access O(n), search O(n), insertion/deletion O(1) (given a reference to the position)
- Hash Tables: access, search, insertion, and deletion O(1) on average (O(n) worst case)
- Binary Search Trees: access, search, insertion, and deletion O(log n) when balanced (O(n) worst case for an unbalanced tree)
- Heaps: access to the max/min O(1), search O(n), insertion/deletion O(log n)

Fractional Knapsack
Problem: Given items with weights and values, maximize the total value by taking fractions of items.
Solution: Sort the items by value-to-weight ratio and take as much of the highest-ratio items as possible.
Time Complexity: O(n log n) (due to sorting).

0-1 Knapsack
Problem: Given items with weights and values, maximize the total value without breaking items.
Solution: Use dynamic programming to build a table where the entry dp[i][w] represents the maximum value that can be attained with the first i items and weight limit w.
Time Complexity: O(nW), where n is the number of items and W is the knapsack capacity.

Dijkstra's Algorithm
Problem: Find the shortest paths from a source vertex to all other vertices in a graph with non-negative weights.
Solution: Use a priority queue to repeatedly select the vertex with the smallest tentative distance, update its neighbors' distances, and continue until all vertices are processed.
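The greedy solution to the Fractional Knapsack problem above can be sketched as (the `(value, weight)` item format is an assumption):

```python
def fractional_knapsack(items, capacity):
    # items: list of (value, weight) pairs; returns the max total value,
    # allowing fractions of items to be taken.
    total = 0.0
    # Greedy: best value-to-weight ratio first; the sort dominates at O(n log n).
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)       # take the whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total
```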
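The 0-1 Knapsack DP can be sketched with a one-dimensional table (a space-optimized variant of the dp[i][w] table described above):

```python
def knapsack_01(values, weights, W):
    # dp[w] = best value achievable with weight limit w using items seen so far.
    dp = [0] * (W + 1)
    for value, weight in zip(values, weights):
        # Iterate w downward so each item is used at most once.
        for w in range(W, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[W]   # O(nW) time, O(W) space
```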
Time Complexity: O(V^2) using a simple array; O((V + E) log V) using a priority queue (with a binary heap).

Trees
Definition: A connected acyclic graph.
Properties:
- A tree with n vertices has n - 1 edges.
- A tree is a special case of a graph in which any two vertices are connected by exactly one path.

Big O Notation and Other Notations
- Big O (O): upper bound on the time complexity.
- Big Omega (Ω): lower bound on the time complexity.
- Big Theta (Θ): tight bound on the time complexity.
- Little o (o): upper bound that is not tight.
- Little omega (ω): lower bound that is not tight.

Asymptotic Analysis
Purpose: To analyze the performance of algorithms as the input size grows.
Focus: Worst-case, average-case, and best-case scenarios.
Notations: Big O describes the upper bound; Big Omega describes the lower bound.
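Dijkstra's algorithm described above can be sketched with Python's heapq as the priority queue (the adjacency-list format `{u: [(v, weight), ...]}` is an assumption):

```python
import heapq

def dijkstra(graph, source):
    # graph: {u: [(v, weight), ...]} with non-negative weights.
    # Returns a dict of shortest distances from `source` to each reachable vertex.
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)          # vertex with the smallest tentative distance
        if d > dist.get(u, float('inf')):
            continue                       # stale queue entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd               # relax the edge (u, v)
                heapq.heappush(pq, (nd, v))
    return dist
```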