# Algorithmic Complexity

## What is algorithmic complexity?

Algorithmic complexity is a measure of how much time (time complexity) and space (space complexity) an algorithm needs to execute.

## Why is it important?

Algorithmic complexity matters because it helps us write scalable, performant code. The goal is to aim for the lowest complexity possible.

## How can we express it?

Algorithmic complexity is most often expressed using Big O notation. Big O notation describes the upper bound of the time or space complexity of an algorithm.

* **O(1)** - *Constant Time*: The execution time remains constant, regardless of the input size.
* **O(log n)** - *Logarithmic Time*: The execution time increases logarithmically as the input size grows.
* **O(n)** - *Linear Time*: The execution time increases linearly with the input size.
* **O(n log n)** - *Log-Linear Time*: The execution time grows proportionally to n multiplied by the logarithm of n.
* **O(n²)** - *Quadratic Time*: The execution time increases quadratically with the input size.
* **O(2ⁿ)** - *Exponential Time*: The execution time doubles with each additional element in the input.
* **O(n!)** - *Factorial Time*: The execution time grows factorially with the input size.

## Visual example

The picture is a graph that visualizes Big O notations. The x axis is labeled "Elements" and the y axis is labeled "Operations". The graph plots the following Big O notations: O(1), O(log n), O(n), O(n log n), O(n²), O(2ⁿ), O(n!). The O(1) or *Constant time* curve is a horizontal line that never increases. The O(log n) or *Logarithmic time* curve rises slowly as the number of elements grows. The O(n) or *Linear time* curve increases linearly with the number of elements. The O(n log n) or *Log-linear time* curve increases at a faster rate than O(n) as the number of elements grows. The O(n²) or *Quadratic time* curve bends upward much more steeply than O(n log n) as the number of elements grows.
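To make the classes above concrete, here is a minimal sketch (the function names are illustrative, not from the original notes) showing one function per common complexity class:

```python
from typing import List

def get_first(items: List[int]) -> int:
    """O(1): a single operation, regardless of how long the list is."""
    return items[0]

def binary_search(items: List[int], target: int) -> int:
    """O(log n): halves the (sorted) search space on every iteration.
    Returns the index of target, or -1 if it is absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def contains(items: List[int], target: int) -> bool:
    """O(n): may inspect every element exactly once."""
    for x in items:
        if x == target:
            return True
    return False

def has_duplicate(items: List[int]) -> bool:
    """O(n²): compares every pair of elements with nested loops."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Note how the shape of the code hints at its complexity: no loop for O(1), a loop that discards half the data for O(log n), one full loop for O(n), and nested loops for O(n²).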
The O(2ⁿ) or *Exponential time* curve climbs extremely fast, doubling with each additional element. The O(n!) or *Factorial time* curve grows fastest of all.

## Which complexity should I aim for?

When aiming for the best complexity, prefer classes earlier in this order:

$$O(1) < O(\log n) < O(n) < O(n \log n) < O(n^{2}) < O(2^{n}) < O(n!)$$
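The ordering above can be sanity-checked numerically. This sketch (the `growth` helper is hypothetical, introduced here for illustration) prints the approximate number of operations each class performs at a few input sizes, making the gap between polynomial and exponential growth visible:

```python
import math

def growth(n: int) -> dict:
    """Approximate operation counts for each Big O class at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": round(math.log2(n)),
        "O(n)": n,
        "O(n log n)": round(n * math.log2(n)),
        "O(n^2)": n ** 2,
        "O(2^n)": 2 ** n,
        "O(n!)": math.factorial(n),
    }

# Even at tiny input sizes, O(2^n) and O(n!) explode while the
# polynomial classes stay manageable.
for n in (4, 8, 16):
    print(n, growth(n))
```

At n = 16, O(n²) costs 256 operations while O(n!) already exceeds 20 trillion, which is why the exponential and factorial classes are avoided whenever possible.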