Algorithms and Complexity (Week 1, Lesson 2)
Uploaded by Deleted User, 2025
Summary
This document is a set of lecture notes introducing DSA (data structures and algorithms) and complexity analysis, including examples relating to time complexity and Big O notation, along with code snippets for the examples and exercises.
Full Transcript
Algorithms and Complexity
Introduction to DSA & Complexity
Winter semester 24/25

Task of the Day 1

```python
a = 0
b = 0
N = 4
M = 4
for i in range(N):
    a = a + 10
for i in range(M):
    b = b + 40
print(a, b)  # two sequential loops, so O(N + M) additions; prints 40 160
```

Task of the Day 2

```python
def first_non_repeating(s):
    # Step 1: Count the frequency of each character
    frequency = {}
    for char in s:
        if char in frequency:
            frequency[char] += 1
        else:
            frequency[char] = 1
    # Step 2: Find the first character with frequency 1
    for char in s:
        if frequency[char] == 1:
            return char
    return None
```

Task of the Day 3

```python
def longest_increasing_subsequence(nums):
    # dp[i] is the length of the LIS ending at index i
    dp = [1] * len(nums)
    for i in range(1, len(nums)):
        for j in range(i):
            if nums[i] > nums[j]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp)
```

Definition of Big Theta (Θ) Notation

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

Exponential Time Complexity: O(2^n)

Θ notation provides exact asymptotic bounds (both upper and lower). Average time complexity reflects realistic performance. f(n) remains within both bounds for large n.

Steps to Determine Average Time Complexity

1. Break the program into smaller segments.
2. Find all types and numbers of inputs and calculate the number of operations each takes to be executed. Make sure that the input cases are equally distributed.
3. Find the sum of all the calculated values and divide the sum by the total number of inputs. If the function of n obtained after removing all the constants is g(n), then in Θ notation it is represented as Θ(g(n)).

Example

In a linear search problem, let's assume that all the cases are uniformly distributed (including the case when the key is absent from the array). So, sum all the cases (the key present at position 1, 2, 3, …, n, and the key absent) and divide the sum by n + 1.

```python
def linearSearch(a, n, key):
    for i in range(n):
        if a[i] == key:
            return True
    return False
```

Average case time complexity
= Θ( (∑_{i=1}^{n+1} i) / (n + 1) )
⇒ Θ( ((n+1)(n+2)/2) / (n + 1) )
⇒ Θ(1 + n/2)
⇒ Θ(n)   (constants are removed)

When to Use Big Θ Notation

Use Θ when precise analysis is needed. It is suitable for average-case complexity. Challenge: it requires a uniform input distribution.

Big-Omega Ω Notation

Big-Omega (Ω) notation is a way to express the asymptotic lower bound of an algorithm's time complexity, since it analyses the best-case behaviour of the algorithm. It provides a lower limit on the time taken by an algorithm in terms of the size of the input. It is denoted Ω(f(n)), where f(n) is a function that represents the number of operations (steps) that an algorithm performs to solve a problem of size n.

Big-Omega in simple words

Big-Omega Ω notation is used when we need to find the asymptotic lower bound of a function. In other words, we use Big-Omega Ω when we want to express that the algorithm will take at least a certain amount of time or space.

Definition of Big-Omega

Given two functions g(n) and f(n), we say that f(n) = Ω(g(n)) if there exist constants c > 0 and n0 ≥ 0 such that f(n) ≥ c·g(n) for all n ≥ n0.

How to Calculate Big Omega

1. Break the program into smaller segments: break the algorithm into smaller segments such that each segment has a certain runtime complexity.
2. Find the complexity of each segment: find the number of operations performed by each segment (in terms of the input size), assuming the given input is such that the program takes the least amount of time.
3. Add the complexity of all segments: add up all the operations and simplify; let's say the result is f(n).
4. Remove all the constants: remove all the constants and choose the term having the least order, or any other function which is always less than f(n) as n tends to infinity. If that least-order function is g(n), then the Big-Omega (Ω) of f(n) is Ω(g(n)).
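The Θ definition above can be sanity-checked numerically. The following is a minimal sketch using a hypothetical function not taken from the lecture: f(n) = 3n² + 5n with g(n) = n², and hand-picked witnesses c1 = 3, c2 = 4, n0 = 5.

```python
# Hypothetical example (not from the lecture): f(n) = 3n^2 + 5n is Theta(n^2).
# The witnesses c1 = 3, c2 = 4, n0 = 5 are assumptions chosen by hand.
def f(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

c1, c2, n0 = 3, 4, 5
for n in range(n0, 2000):
    # The definition requires 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)
print("f(n) = 3n^2 + 5n is Theta(n^2) with c1=3, c2=4, n0=5")
```

A finite loop cannot prove the bound for all n, but 5n ≤ n² holds for every n ≥ 5, so the chosen witnesses are in fact valid.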
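The average-case derivation for linear search can also be checked empirically. This sketch uses an illustrative stand-in for linearSearch that returns the number of elements inspected instead of True/False (the helper name and the cost model of one step per inspected element are assumptions):

```python
# Stand-in for linearSearch that counts inspected elements.
def linear_search_steps(a, key):
    steps = 0
    for x in a:
        steps += 1
        if x == key:
            return steps
    return steps

n = 100
a = list(range(n))
# n + 1 equally likely cases: the key at each of the n positions, or absent.
costs = [linear_search_steps(a, k) for k in a] + [linear_search_steps(a, -1)]
avg = sum(costs) / (n + 1)
print(avg)  # close to (n + 2) / 2 = 51, i.e. Theta(n)
```

For n = 100 the measured average lands just under (n + 2)/2, matching the Θ(1 + n/2) = Θ(n) result of the derivation.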
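Step 2 above ("assume the input that takes the least time") can be sketched for linear search: the best-case input puts the key at index 0, so one comparison suffices regardless of n, which is why linear search is Ω(1). The step-counting helper is a hypothetical illustration, not part of the lecture code.

```python
# Hypothetical helper: count how many elements linear search inspects.
def linear_search_steps(a, key):
    steps = 0
    for x in a:
        steps += 1
        if x == key:
            return steps
    return steps

# Best case: the key sits at index 0, so the cost is 1 for every input size,
# i.e. the best-case running time is bounded below by a constant: Omega(1).
for n in (10, 1_000, 100_000):
    a = list(range(n))
    assert linear_search_steps(a, a[0]) == 1
print("best case: 1 comparison for every n")
```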
Example

```python
def printt(a, n):
    for i in range(n):
        for j in range(n):
            if i != j:
                print(a[i], "", a[j])
```

This nested loop performs n² iterations, so its running time grows like n². Linear functions g(n) = n, logarithmic functions g(n) = log n, and constant functions g(n) = 1 will always grow at a lesser rate than n² when the input tends to infinity; therefore, the best-case running time of this program can be stated as Ω(log n), Ω(n), Ω(1), or any function g(n) which is less than n² when n tends to infinity.

Big O – Upper Bound
It is like (≤): the rate of growth of the algorithm is less than or equal to a specific value. The upper bound of the algorithm is represented by Big O notation; only the function from above bounds it. The asymptotic upper bound is given by Big O notation. It is defined as the upper bound, and the upper bound on an algorithm is the most amount of time required (the worst-case performance).

Big Omega (Ω) – Lower Bound
It is like (≥): the rate of growth is greater than or equal to a specified value. The algorithm's lower bound is represented by Omega notation. The asymptotic lower bound is given by Omega notation. It is defined as the lower bound, and the lower bound on an algorithm is the least amount of time required (the most efficient way possible, in other words the best case).

Big Theta (Θ) – Tight Bound
It is like (==): the rate of growth is equal to a specified value. The bounding of the function from above and below is represented by Theta notation; the exact asymptotic behavior is given by Theta notation. It is defined as the tightest bound, and the tightest bound is the best of all the worst-case times that the algorithm can take.

Mathematically: Big Oh is 0