Full Transcript


# Algorithmic Efficiency

When attacking a problem, algorithm designers have many choices. This lecture addresses issues in selecting the *right* algorithm: one that is efficient, and one that makes efficient use of resources.

## Outline

- Efficiency
- Computational Complexity
- Asymptotic Analysis
- Algorithm Analysis

## Efficiency

* **Definition:** Efficiency is a comparison of the resources an algorithm uses to solve a problem.
* Resources include:
  * Processing time (CPU cycles)
  * Memory
* Efficiency is measured as a function of the size of the input (e.g. the number of elements to sort).
* Efficiency is crucial for:
  * Resource-constrained devices
  * Solving complex problems
  * Handling large datasets

## Computational Complexity

* **Definition:** Computational complexity is a measure of the amount of resources (typically time and memory) required to execute an algorithm.
* Expressed using asymptotic notation to describe how resource usage grows with input size.
* Two types:
  * **Time Complexity:** Measures the amount of time taken by an algorithm to run as a function of the length of the input.
  * **Space Complexity:** Measures the amount of memory space used by an algorithm as a function of the length of the input.

## Asymptotic Analysis

* Used to describe the limiting behavior of a function as its argument tends towards infinity.
* Focuses on the growth rate of the algorithm's resource usage.
* Provides a high-level understanding of an algorithm's efficiency.
* Three common notations:
  * **Big O notation ($O$):** Describes an upper bound on the time complexity.
  * **Omega notation ($\Omega$):** Describes a lower bound on the time complexity.
  * **Theta notation ($\Theta$):** Describes a tight bound on the time complexity (the function is bounded both above and below by the same growth rate).
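As a concrete illustration of two different growth rates, here is a minimal Python sketch (function names are illustrative) contrasting a linear search, which may examine every element, with a binary search, which halves the search interval at each step but requires sorted input:

```python
def linear_search(items, target):
    """O(n): in the worst case, every element is examined."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): halves the search interval on each step (requires sorted input)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 1000, 2))   # sorted even numbers: 0, 2, ..., 998
print(linear_search(data, 500))  # both find index 250
print(binary_search(data, 500))
```

For 500 sorted elements, the linear search may take up to 500 comparisons while the binary search needs at most about 9, which is the practical payoff the asymptotic notation captures.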
### Common Growth Rates

| Notation      | Name        | Example                      |
| :------------ | :---------- | :--------------------------- |
| $O(1)$        | Constant    | Array lookup                 |
| $O(\log n)$   | Logarithmic | Binary search                |
| $O(n)$        | Linear      | List search                  |
| $O(n \log n)$ | Log-linear  | Merge sort                   |
| $O(n^2)$      | Quadratic   | Bubble sort                  |
| $O(n^3)$      | Cubic       | Naive matrix multiplication  |
| $O(2^n)$      | Exponential | Enumerating all subsets      |
| $O(n!)$       | Factorial   | Enumerating all permutations |

## Algorithm Analysis

* **Definition:** The process of determining the amount of resources (e.g. time, memory) required by an algorithm.
* Two main approaches:
  * **Empirical Analysis:** Implement the algorithm and run it on various inputs to measure its performance.
    * Pros: Reflects real-world performance
    * Cons: Depends on hardware/software; limited to the test cases chosen
  * **Theoretical Analysis:** Analyze the algorithm's pseudocode to determine its time and space complexity.
    * Pros: Independent of implementation; considers all possible inputs
    * Cons: May not reflect real-world performance; requires mathematical reasoning

## Conclusion

Selecting an efficient algorithm and making efficient use of resources are the main goals of algorithm design. Efficiency can be determined by measuring computational complexity. Asymptotic analysis using "big O" notation is the most common way to describe resource use, and algorithm analysis can be performed either empirically or theoretically.
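The empirical approach described above can be sketched in a few lines of Python: time an implementation on inputs of increasing size and check whether the measurements match the predicted growth rate. This is an illustrative sketch (the sizes and the `measure` helper are arbitrary choices, not part of the lecture) using bubble sort, the quadratic example from the table:

```python
import time

def bubble_sort(items):
    """O(n^2) quadratic sort, from the growth-rate table above."""
    a = list(items)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def measure(sort_fn, n):
    """Time sort_fn on a worst-case (reverse-sorted) input of size n."""
    data = list(range(n, 0, -1))
    start = time.perf_counter()
    sort_fn(data)
    return time.perf_counter() - start

# For a quadratic algorithm, doubling the input size should
# roughly quadruple the running time.
for n in (500, 1000, 2000):
    print(f"n={n:5d}: {measure(bubble_sort, n):.4f} s")
```

The exact timings depend on the machine, which is precisely the "depends on hardware/software" drawback of empirical analysis; the *ratio* between successive timings, however, should approach the theoretical prediction.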