Big O Notation
Faculty of Computers and Information, Arish University
Dr. Naglaa Sobhy
Summary
This document explains the concept of Big O notation, a method to analyze the performance of algorithms. It explores different aspects of algorithm comparison, including benchmarking and asymptotic analysis. The document covers topics like time and space complexities, and demonstrates how Big O notation simplifies the understanding of algorithm efficiency.
COMPLEXITY ANALYSIS AND Big O() NOTATION
Prepared by Dr. Naglaa Sobhy, FCI-Arish University

Is the algorithm good?
- Correct: it solves the problem correctly.
- Efficient: in both time complexity and space complexity.
- Easy to implement.

Approaches to comparing algorithms
Computer scientists are often faced with the task of comparing algorithms to see how fast they run or how much memory they require. There are two approaches to this task.

The first approach is BENCHMARKING (running the algorithm on a real machine). Benchmarking means running the algorithms on a computer and measuring speed in seconds and memory consumption in bytes. A benchmark can be unsatisfactory because it is so specific: it measures the performance of a particular program written in a particular language, running on a particular computer, with a particular compiler and particular input data. From the single result that a benchmark provides, it is difficult to predict how well the algorithm would do on a different compiler, computer, or data set.

The second approach is the frequency count method (asymptotic analysis). Asymptotic analysis is a method used in computer science and mathematics to describe the behavior of algorithms as the input size grows. It depends on counting the number of machine instructions as a function of the input size n. For example, an algorithm running on an input of size n might take 6n^2 + 100n + 300 machine instructions. Asymptotic analysis:
- provides an estimate of the time complexity or space complexity (amount of memory) of an algorithm for large input sizes;
- relies on a mathematical analysis of the algorithm, independent of any particular implementation and input;
- is useful for comparing different algorithms and understanding their efficiency, especially when dealing with large data sets.

Time Complexity
Time complexity analysis defines the running time as a function of the problem size and tries to estimate how the execution time grows as the problem size grows.
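The two approaches can be contrasted with a minimal Python sketch. The function sum_of_squares below is a hypothetical stand-in for an algorithm under study, not something from the slides: benchmarking measures wall-clock seconds on one particular machine, while the frequency count method counts operations as a function of n.

```python
import timeit

def sum_of_squares(n):
    """Sum 1^2 + 2^2 + ... + n^2 with a simple loop."""
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

# Approach 1: benchmarking -- wall-clock time, valid only for this
# machine, this Python version, and this input.
seconds = timeit.timeit(lambda: sum_of_squares(10_000), number=100)
print(f"100 runs took {seconds:.4f} s on this machine")

# Approach 2: frequency count -- the loop body executes n times, so the
# instruction count grows like T(n) = c1*n + c2 for machine-dependent
# constants c1, c2. The count below is machine-independent.
def operation_count(n):
    return n  # one multiply and one add per iteration, ~n steps

print(operation_count(10_000))
```

The benchmark number changes from machine to machine; the operation count does not, which is why asymptotic analysis transfers across computers, compilers, and languages.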
Time complexity tells us how fast an algorithm runs.

Space Complexity (Memory Space)
Space complexity tells us how much memory is required by an algorithm.

History of Big-O Notation
Big O notation (with a capital letter O, not a zero), also called Landau's symbol, is used in complexity theory, computer science, and mathematics to describe the asymptotic behavior of functions. Basically, it tells you how fast a function grows or declines. Landau's symbol is named after the German mathematician Edmund Landau, who invented the notation. The letter O is used because the rate of growth of a function is also called its order.

What is Big-O Notation?
It is written O(f(n)), where f(n) describes how the number of operations grows with the input size n. It is used to measure the performance of an algorithm by giving the order of growth of its running-time function. It gives an upper bound on the function (the "worst case"). It describes:
- the efficiency of an algorithm;
- the time complexity factor of an algorithm;
- the space complexity of an algorithm.

The running time of an algorithm depends on how long it takes a computer to run the lines of code of the algorithm. This depends on:
- the speed of the computer;
- the programming language;
- the compiler that translates the program from the programming language into code that runs directly on the computer;
- the input.

The running time of an algorithm can be analyzed by considering two ideas: the size of its input and the rate of growth of the running time. The size of the input determines how long the algorithm takes to run. The rate of growth of the running time determines how fast the running time increases with the input size. To simplify the running-time function, we extract the most important part and disregard the less important parts.
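The idea of keeping only the most important part can be checked numerically. A small sketch, using the running-time function T(n) = 6n^2 + 100n + 300 from this document: as n grows, the ratio T(n) / 6n^2 approaches 1, so the 6n^2 term alone captures the growth and the 100n + 300 part can be disregarded.

```python
def T(n):
    # Example running-time function used in the text.
    return 6 * n**2 + 100 * n + 300

# The ratio T(n) / (6n^2) tends to 1 as n grows, showing that the
# lower-order terms 100n + 300 become negligible.
for n in (10, 100, 1_000, 10_000):
    print(n, round(T(n) / (6 * n**2), 4))
```

At n = 10 the lower-order terms still matter noticeably, but by n = 10,000 the ratio is within a fraction of a percent of 1, which is exactly why asymptotic notation drops them.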
Note: log n here means log_2 n. Among the common growth rates, the best algorithms are those whose rate of growth (time function) is O(log_2 n), and the worst are those whose rate of growth is O(2^n).

For example, if an algorithm running on an input of size n takes 6n^2 + 100n + 300 machine instructions, the 6n^2 term becomes larger than the remaining terms, 100n + 300, once n becomes large enough (n = 20 in this case). A chart showing the values of 6n^2 and 100n + 300 for n from 0 to 100 illustrates this. As another example, consider a chart of 0.6n^2 and 1000n + 3000, where the coefficient of n^2 has been reduced by a factor of 10 and the other two constants increased by a factor of 10: the value of n at which 0.6n^2 becomes greater than 1000n + 3000 increases, but there will always be such a crossover point, no matter what the constants are.

Asymptotic notation simplifies an algorithm's running-time (rate-of-growth) function by dropping constant coefficients and less significant terms. There are three forms of asymptotic notation: big-Theta notation, big-O notation, and big-Omega notation. The O() notation gives us what is called an asymptotic analysis: we can say without question that, as n asymptotically approaches infinity, an O(n) algorithm is better than an O(n^2) algorithm. Asymptotic analysis is the most widely used tool for analyzing algorithms.

Best case: when the algorithm takes the least running time with respect to the given inputs/data. The best case is described with big-Omega notation, Ω.
Average case: when the algorithm takes an average running time with respect to the given inputs/data. This is described with big-Theta notation, Θ.
Worst case: when the algorithm takes the highest running time with respect to the given inputs/data. The worst case is described with big-O notation, O.
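The crossover points mentioned above can be found with a short search. This is a sketch, assuming a simple linear scan upward from n = 1; it reproduces the n = 20 crossover stated in the text and locates the (larger) crossover for the rescaled constants.

```python
def crossover(f, g):
    """Smallest positive integer n at which f(n) first exceeds g(n)."""
    n = 1
    while f(n) <= g(n):
        n += 1
    return n

# 6n^2 overtakes 100n + 300 at n = 20, as the text notes.
print(crossover(lambda n: 6 * n**2, lambda n: 100 * n + 300))    # -> 20

# With the rescaled constants the crossover moves much later,
# but it still exists -- constants never change that.
print(crossover(lambda n: 0.6 * n**2, lambda n: 1000 * n + 3000))  # -> 1670
```

This is the numerical content behind dropping constants: changing the coefficients moves the crossover point, but for any positive constants the quadratic term eventually dominates.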