# Algorithmic Complexity

## What is Complexity?

* Complexity is a measure of the amount of resources (time, space, etc.) that a particular algorithm will consume when executed.
* **Time Complexity**: the amount of computer time an algorithm needs to run to completion.
* **Space Complexity**: the amount of memory an algorithm needs to run to completion.

In this course, we will primarily focus on **time complexity**.

## How to Measure Time Complexity?

### 1. Experimental Analysis

* Write a program implementing the algorithm.
* Run the program with inputs of varying size and composition.
* Use a method like `System.currentTimeMillis()` to get an accurate measure of the actual running time.
* Plot the results; a minimal sketch of such an experiment follows below.
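For illustration, here is a minimal sketch of such an experiment in Java. The class name `TimingExperiment`, the linear-scan loop being timed, and the input sizes are illustrative assumptions, not part of the original material:

```java
import java.util.Random;

// Minimal experimental-analysis sketch: time one algorithm (here, a simple
// linear scan) on inputs of doubling size and print the elapsed wall-clock time.
public class TimingExperiment {
    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int n = 1_000_000; n <= 16_000_000; n *= 2) {
            int[] a = rng.ints(n).toArray();   // random input of size n

            long start = System.currentTimeMillis();
            long sum = 0;
            for (int x : a) sum += x;          // the "algorithm" under test
            long elapsed = System.currentTimeMillis() - start;

            // Print sum so the JIT cannot discard the loop as dead code.
            System.out.println("n = " + n + ": " + elapsed + " ms (sum = " + sum + ")");
        }
    }
}
```

A real benchmark would also account for JVM warm-up and would likely prefer `System.nanoTime()` for its finer resolution; `System.currentTimeMillis()` is used here because it is the method named above.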

### 2. Theoretical Analysis

* Determine the running time as a function of the input size.
* **Input Size**: depends on the problem.
    * **Sorting**: the number of items to be sorted.
    * **Searching**: the number of items in the search space.
    * **Graph Problems**: the number of vertices and/or edges.
* Uses a high-level description of the algorithm instead of an implementation.
* Takes into account all possible inputs.
* Allows us to evaluate the efficiency of the algorithm independently of the hardware and software environment.

## Primitive Operations

* Low-level computations that are largely independent of the programming environment:
    * Evaluating an expression
    * Assigning a value to a variable
    * Comparing two numbers
    * Returning from a method
* Primitive operations take constant time, $O(1)$, even in the worst case.

## Counting Primitive Operations

* By inspecting the pseudo-code, we can count the number of primitive operations executed by an algorithm.

```java
Algorithm arrayMax(A, n)
    Input:  array A of n integers
    Output: maximum element in A

    currentMax ← A[0]                // 1 operation
    for i ← 1 to n − 1 do            // n operations
        if A[i] > currentMax then    // 2(n − 1) operations
            currentMax ← A[i]        // 2(n − 1) operations
    return currentMax                // 1 operation
```

* $T(n) = 1 + n + 2(n-1) + 2(n-1) + 1 = 5n - 2$
* The running time $T(n)$ is a linear function of $n$. A runnable Java translation of this pseudo-code is sketched below.
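The pseudo-code translates directly into Java. This is a minimal sketch; the signature `arrayMax(int[] a)` is an assumption, since the pseudo-code receives `A` and `n` as separate parameters:

```java
// Direct Java translation of the arrayMax pseudo-code, with the
// operation counts from the analysis above repeated as comments.
public static int arrayMax(int[] a) {
    int currentMax = a[0];                  // 1 operation
    for (int i = 1; i < a.length; i++) {    // loop control: n operations in total
        if (a[i] > currentMax) {            // indexing + comparison: 2(n - 1)
            currentMax = a[i];              // indexing + assignment: up to 2(n - 1)
        }
    }
    return currentMax;                      // 1 operation
}
```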
## Growth Rate of Running Time

* Changing the hardware/software environment:
    * affects the running time by a constant factor,
    * but does not alter the growth rate.
* The linear growth rate of the running time $T(n)$ remains unchanged, e.g. for:
    * $T(n) = 10n - 2$
    * $T(n) = 5n - 100$
* **Conclusion**: The growth rate of the running time is not affected by constant factors or lower-order terms.

## Asymptotic Notation

* To analyze an algorithm, we want to focus on the **order of growth** of the running time as the input size increases.
* We use **asymptotic notation** to express the order of growth of the running time:
    * **Big-O Notation**: $O(f(n))$
    * **Big-Omega Notation**: $\Omega(f(n))$
    * **Big-Theta Notation**: $\Theta(f(n))$

## Big-O Notation

* Given functions $f(n)$ and $g(n)$, we say that $f(n)$ is $O(g(n))$ if there are positive constants $c$ and $n_0$ such that $f(n) \le c\,g(n)$ for all $n \ge n_0$.
* Intuitively: $f(n)$ grows no faster than $g(n)$.

### Example

* $2n + 10$ is $O(n)$.
* We want to find constants $c$ and $n_0$ such that $2n + 10 \le cn$ for $n \ge n_0$.
* Let $c = 3$ and $n_0 = 10$.
* Then $2n + 10 \le 3n$ for $n \ge 10$.

## Big-Omega Notation

* Given functions $f(n)$ and $g(n)$, we say that $f(n)$ is $\Omega(g(n))$ if there are positive constants $c$ and $n_0$ such that $f(n) \ge c\,g(n)$ for all $n \ge n_0$.
* Intuitively: $f(n)$ grows at least as fast as $g(n)$.

### Example

* $2n + 10$ is $\Omega(n)$.
* We want to find constants $c$ and $n_0$ such that $2n + 10 \ge cn$ for $n \ge n_0$.
* Let $c = 2$ and $n_0 = 1$.
* Then $2n + 10 \ge 2n$ for $n \ge 1$.

## Big-Theta Notation

* Given functions $f(n)$ and $g(n)$, we say that $f(n)$ is $\Theta(g(n))$ if there are positive constants $c_1$, $c_2$, and $n_0$ such that $c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0$.
* Intuitively: $f(n)$ grows at the same rate as $g(n)$.

### Example

* $2n + 10$ is $\Theta(n)$.
* We want to find constants $c_1$, $c_2$, and $n_0$ such that $c_1 n \le 2n + 10 \le c_2 n$ for $n \ge n_0$.
* Let $c_1 = 2$, $c_2 = 3$, and $n_0 = 10$.
* Then $2n \le 2n + 10 \le 3n$ for $n \ge 10$.

## Relationships Between $O$, $\Omega$, and $\Theta$

* $f(n)$ is $\Theta(g(n))$ if and only if $f(n)$ is $O(g(n))$ and $f(n)$ is $\Omega(g(n))$.

## Properties of Asymptotic Notation

All three notations are transitive:

* If $f(n)$ is $O(g(n))$ and $g(n)$ is $O(h(n))$, then $f(n)$ is $O(h(n))$.
* If $f(n)$ is $\Omega(g(n))$ and $g(n)$ is $\Omega(h(n))$, then $f(n)$ is $\Omega(h(n))$.
* If $f(n)$ is $\Theta(g(n))$ and $g(n)$ is $\Theta(h(n))$, then $f(n)$ is $\Theta(h(n))$.

## Using Asymptotic Notation

* Big-O notation is typically used to describe an upper bound on the running time, e.g. the **worst-case** running time of an algorithm.
* Big-Omega notation is typically used to describe a lower bound on the running time, e.g. the **best-case** running time.
* Big-Theta notation is used when the upper and lower bounds match, e.g. to characterize the **average-case** running time exactly.
* Strictly speaking, each notation simply bounds a function, so any of the three can be applied to the worst, best, or average case.

## Common Growth Rates

* **Constant**: $\Theta(1)$
* **Logarithmic**: $\Theta(\log n)$
* **Linear**: $\Theta(n)$
* **N-Log-N**: $\Theta(n \log n)$
* **Quadratic**: $\Theta(n^2)$
* **Cubic**: $\Theta(n^3)$
* **Exponential**: $\Theta(2^n)$
* **Factorial**: $\Theta(n!)$

Several of these classes are illustrated with small Java methods after the plot description below.

### Plots of Common Growth Rates

The image shows a graph plotting common growth rates, illustrating how the time (y-axis) increases with the input size $n$ (x-axis) for the constant, logarithmic, linear, n-log-n, quadratic, cubic, and exponential functions.
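To make these growth rates concrete, the sketch below pairs several of them with small Java methods. All method names and bodies are illustrative assumptions rather than examples taken from the original material:

```java
// Illustrative Java methods whose running times fall into common growth-rate classes.
public class GrowthRateExamples {

    // Theta(1), constant: a single array access, independent of n.
    static int first(int[] a) {
        return a[0];
    }

    // Theta(log n), logarithmic (worst case): binary search halves the
    // search space on every iteration.
    static int binarySearch(int[] sorted, int key) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;      // unsigned shift avoids (lo + hi) overflow
            if (sorted[mid] == key) return mid;
            if (sorted[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;                          // key not present
    }

    // Theta(n), linear: one pass over the array.
    static long sum(int[] a) {
        long s = 0;
        for (int x : a) s += x;
        return s;
    }

    // Theta(n^2), quadratic: nested loops examine every pair of elements.
    static int countEqualPairs(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                if (a[i] == a[j]) count++;
        return count;
    }
}
```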