Full Transcript

# Algorithmic Complexity

## What?

* **Time Complexity**: How much time does an algorithm need as a function of the input size?
* **Space Complexity**: How much space does an algorithm need as a function of the input size?
* **Why do we care?** We want to write the fastest and most efficient code!
  * Get a job
  * Impress your friends
  * Make the world a better place

## How to measure time complexity?

### 1. Benchmarking

* Measure the actual time it takes for an algorithm to run.
* **Problems**
  * Different computers will have different results.
  * The same computer will have different results at different times.
  * Not very precise; hard to compare algorithms.

### 2. Counting Instructions

* Count the number of instructions an algorithm executes.
* **Problems**
  * Different instructions take different amounts of time.
  * Hard to keep track of all the instructions.
  * Still depends on the computer (to some extent).

### 3. Asymptotic Analysis

* Analyze the algorithm as the input size approaches infinity.
* **Big O Notation**: A way to describe the performance or complexity of an algorithm.
  * $O(n)$: Linear Time
  * $O(1)$: Constant Time
  * $O(\log n)$: Logarithmic Time
  * $O(n^2)$: Quadratic Time
  * $O(2^n)$: Exponential Time

## Big O Notation

* Describes the **upper bound** of an algorithm's time complexity.
* Describes how the time or space resources grow as the input size grows.
* We only care about the **dominant term**.
  * $O(n^2 + n) = O(n^2)$
  * $O(n + \log n) = O(n)$
* We can ignore constant factors.
  * $O(2n) = O(n)$
  * $O(n/2) = O(n)$

### Common Complexities

| Name        | Big O Notation | Example                 |
| ----------- | -------------- | ----------------------- |
| Constant    | $O(1)$         | Accessing Array Element |
| Logarithmic | $O(\log n)$    | Binary Search           |
| Linear      | $O(n)$         | Searching an Array      |
| Log-Linear  | $O(n \log n)$  | Merge Sort              |
| Quadratic   | $O(n^2)$       | Insertion Sort          |
| Cubic       | $O(n^3)$       | Matrix Multiplication   |
| Exponential | $O(2^n)$       | Tower of Hanoi          |
| Factorial   | $O(n!)$        | Generating Permutations |

## Examples

### 1. Constant Time - $O(1)$

```python
def get_first_element(items):
    return items[0]
```

No matter how big the list is, this function will always take the same amount of time.

### 2. Linear Time - $O(n)$

```python
def print_all_elements(items):
    for element in items:
        print(element)
```

The time this function takes is directly proportional to the size of the list.

### 3. Quadratic Time - $O(n^2)$

```python
def print_all_pairs(items):
    for i in items:
        for j in items:
            print(i, j)
```

For every element in the list, we iterate through the entire list again.

## Space Complexity

* How much memory does an algorithm need as a function of the input size?
* Includes input space and auxiliary space.
* Follows the same rules as time complexity:
  * Big O notation
  * Dominant term
  * Ignore constants

### Examples

```python
def sum_array(arr):
    total = 0
    for num in arr:
        total += num
    return total
```

$O(1)$ - We only use a constant amount of extra space.

```python
def double_array(arr):
    new_arr = [0] * len(arr)
    for i in range(len(arr)):
        new_arr[i] = arr[i] * 2
    return new_arr
```

$O(n)$ - We create a new array that is the same size as the input array.
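The complexity table lists binary search as the classic $O(\log n)$ algorithm, but the worked examples above only cover constant, linear, and quadratic time. Here is a minimal sketch (the function name and interface are illustrative, not from the original notes): each loop iteration halves the remaining search range, so a sorted list of $n$ elements needs at most about $\log_2 n$ comparisons, while using only $O(1)$ extra space.

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if it is absent."""
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2  # midpoint of the current search range
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1  # search range is empty: target not present
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns `3` after only two comparisons, whereas the linear `Searching an Array` approach from the table would inspect four elements.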