Module 2: Algorithm Analysis PDF

Summary

This document details module 2 of a computer science course, focusing on algorithm analysis. It covers the importance of algorithm analysis, how to measure the running time of an algorithm, and how to compare the efficiency of different algorithms. The document includes practical examples, such as a scenario where a program takes a significant amount of time to finish.

Full Transcript

CS 215 – DESIGN AND ANALYSIS OF ALGORITHMS

MODULE 2: Algorithm Analysis

Module Overview:

Welcome to Module 2. In this module, you will learn the importance of algorithm analysis and come to understand algorithm jargon such as O(n) or O(n log n) when you come across it while reading programming books. This module also discusses how to measure the running time of an algorithm, how to compare the efficiency of several algorithms, and how to estimate the running time without the need to execute the code.

This module is divided into two lessons:
Lesson 1: Framework for Analysing Algorithms
Lesson 2: Solutions to the Maximum Subsequence Sum Problem

Module Outcomes:

At the completion of this module, you should be able to:
• estimate the time required for a program;
• apply the formal framework for analysing algorithms;
• apply the general rules in calculating running times; and
• analyse algorithms and be able to compare them.

Lesson 1

Lesson Title: Framework for Analysing Algorithms

Learning Outcomes:

At the completion of this lesson, you should be able to:
• estimate the running time required for a program;
• apply the formal framework for analysing algorithms; and
• apply the general rules in calculating running times.

Time Frame: 2 weeks

Introduction

In Lesson 1 of the previous module, we defined an algorithm as a finite set of instructions for performing a computation or solving a problem. Once an algorithm has been designed for a given problem and assessed to be correct, the next important step is to determine how much resource, in terms of time and space, the algorithm will require. In this lesson, we will discuss how to estimate the running time required for a program, the formal framework used for algorithm analysis called asymptotic analysis, and how to estimate the running time of a program by applying a set of general rules.

Activity

Let's say you write a computer program to automate a certain task that takes one minute per run. This execution time is not bad considering the amount of time you spent writing the program, not to mention that performing the computation manually would take hours or even days. You get excited because suddenly you have plenty of time left to do other things, and you feel empowered to run the same program to analyse more data. With your computer's capability, your program takes less than a second to run with no input data and about one minute for a single piece of data. Now you want to try your program on 10,000 data items.

How long do you think your computer will run? You feel confident that it may take just a few hours, so you can run it overnight and it should be done by morning. Hence, you insert a few additional lines of code to read the data and start running it on the 10,000 items. You hit the sack and expect the computer to give results in the morning when you wake up. To your surprise, you find your computer still running and there are no results yet. What makes it worse is that the program makes your computer very slow because it is hogging your memory. This means that you will not be able to use your computer for other tasks. Hence, you decide not to use your computer and content yourself with other activities that do not need a computer. Evening comes and the computer is still running.
You are tempted to abort it, but you think that maybe by morning it will have produced results. The following morning comes and, to your dismay, the computer is still running. On the third day of running the program, you start wondering when it will finish. You then take out your calculator and compute the estimated running time of your program:

1 minute × 10,000 runs ÷ (60 minutes/hour) ÷ (24 hours/day) ≈ 6.94 days

Now that the program has already run halfway, you start to wonder whether to terminate it and waste 3.5 days of computer time, or let it run for another 3.5 days knowing that you will not be able to use your computer. You realize that you should have saved the results of every run so that, in the event that you need to abort, you would not have to restart from the beginning. This story is a very typical scenario depicting the ignorance of some programmers about the analysis of algorithms.

Analysis

Put yourself in the shoes of the programmer in the story above, then think about these questions:
1. What do you think went wrong?
2. Would you consider rewriting the program? What do you think the effect would be on its running time?
3. Would you consider replacing your computer? Would this make a difference to the program's running time?
4. If you are to write a program for a particular problem, what properties or characteristics should you ensure your program possesses?

Abstraction

We saw from the story above that writing a working program is not good enough. If the program is to be run on huge input sizes, the running time becomes a concern. Hence, it is important to learn how to write efficient algorithms and to approximate their running time without actually coding them. The analysis required to estimate the resources of an algorithm is a theoretical issue, and thus a formal framework is needed, which we shall cover later. First, let's define what algorithm analysis is.

What is Algorithm Analysis?

Algorithm analysis is a branch of computer science that involves sophisticated mathematical analysis to determine the best or most efficient algorithm for a particular task. Analysing an algorithm means predicting the resources that the algorithm requires, e.g., computational time. Resources such as memory, communication bandwidth, or computer hardware are sometimes of primary concern, but most often it is computational time that we want to measure.

When we talk about algorithm analysis, we are concerned with algorithm efficiency. When is an algorithm considered efficient? An algorithm is said to be efficient if it solves the problem within the required resource constraints, such as time and space.

Since we emphasize efficiency as a design criterion, we include careful analysis of the running times of all our algorithms. But how do we actually measure efficiency? Efficiency can be measured in two ways:

1. Empirical comparison
• Used to compare algorithm performance in practice on a set of instances
• It has the following limitations:
  o The algorithm needs to be implemented
  o Results may not be indicative of the running time on inputs not included in the experiment
  o The same hardware and software environments must be used in the comparison (see the timing sketch after this list)

2. Asymptotic algorithm analysis
• Uses a high-level description of the algorithm instead of an implementation for comparative purposes
• A tool that allows us to explain how an algorithm behaves as the input grows larger
• Takes into account all possible inputs
• Allows us to evaluate the speed of an algorithm independently of the hardware/software environment
• Characterizes running time as a function of the input size

In this Design and Analysis of Algorithms course, we will be measuring algorithm efficiency using asymptotic algorithm analysis.
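As a concrete, if crude, illustration of empirical comparison: the sketch below is mine, not from the module, and the class name, the two toy methods, and the input sizes are all assumptions. It times two ways of computing the same sum with System.nanoTime(). The absolute numbers it prints change from machine to machine and run to run, which is precisely the limitation listed above.

// Illustrative sketch of empirical comparison (not from the module).
// Both "algorithms" compute the sum 1 + 2 + ... + n, one by looping and one by formula.
public class EmpiricalComparison {

    static long sumByLoop(int n) {          // candidate algorithm A
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    static long sumByFormula(int n) {       // candidate algorithm B
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        int[] sizes = {1_000, 100_000, 10_000_000};
        for (int n : sizes) {
            long t0 = System.nanoTime();
            long a = sumByLoop(n);
            long t1 = System.nanoTime();
            long b = sumByFormula(n);
            long t2 = System.nanoTime();
            // The times below depend on hardware, JVM warm-up, and background load,
            // which is why empirical results do not transfer between environments.
            System.out.printf("n = %,d: loop = %d ns, formula = %d ns (results %d, %d)%n",
                    n, t1 - t0, t2 - t1, a, b);
        }
    }
}

Asymptotic analysis avoids this dependence by counting primitive operations as a function of the input size instead of measuring wall-clock time.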
Running time has been mentioned several times, so it is time to define it.

What is running time?
• It measures how long the algorithm takes to complete a task
• It normally depends on the problem size
• It is expressed as T(n) for some function T of the input size n
• It refers to the number of primitive operations or "steps" executed

In short, the running time of a program is expressed as a function of the size of its input. Here, the input size depends on the problem being studied (e.g., the number of items in the input, the total number of bits needed, the number of vertices and edges in a graph, etc.).

The running time of a program depends on factors such as:
1. The input to the program
2. The quality of code generated by the compiler used to create the object program
3. The nature and speed of the instructions on the machine used to execute the program
4. The time complexity of the algorithm underlying the program

When we look at input sizes large enough to make only the order of growth of the running time relevant, we are studying the asymptotic efficiency of algorithms. The order of growth gives a simple characterization of an algorithm's efficiency and also allows us to compare the relative performance of alternative algorithms: once the input size becomes large enough, the algorithm whose running time has the slower order of growth beats the other algorithms. Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs. We will see an example of this in our next lesson.

The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers. Such notations, which are called asymptotic notations, are convenient for describing the worst-case running-time function.

1. Θ-Notation (Theta)

Definition: For a given function g(n), we denote by Θ(g(n)) the set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

Instead of "f(n) ∈ Θ(g(n))", we write "f(n) = Θ(g(n))".

g(n) is an asymptotically tight bound for f(n).

Example No. 1: Show that (1/2)n² − 3n is Θ(n²).

Solution: Let f(n) = (1/2)n² − 3n and g(n) = n².

Find c1, c2, and n0 such that c1·n² ≤ (1/2)n² − 3n ≤ c2·n² for all n ≥ n0.

Let c1 = 1/14, c2 = 1/2, and n0 = 7. Checking at n = 7:

c1·g(7) = (1/14)(49) = 3.5,  f(7) = (1/2)(49) − 3(7) = 24.5 − 21 = 3.5,  c2·g(7) = (1/2)(49) = 24.5

3.5 ≤ 3.5 ≤ 24.5 TRUE, and the inequalities continue to hold for all n ≥ 7.

∴ (1/2)n² − 3n is Θ(n²)

Certainly, other choices for the constants exist, but the important thing is that some choice exists. Note that these constants depend on this function only; a different function belonging to Θ(n²) would usually require different constants.
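The constants chosen in Example No. 1 can be sanity-checked numerically. The sketch below is an illustration of mine, not part of the module; it evaluates c1·g(n), f(n), and c2·g(n) for a range of n ≥ n0 and reports whether the "sandwich" inequality ever fails.

public class ThetaCheck {
    // From Example No. 1: f(n) = (1/2)n² - 3n, g(n) = n², with c1 = 1/14, c2 = 1/2, n0 = 7.
    static double f(int n) { return 0.5 * n * (double) n - 3.0 * n; }
    static double g(int n) { return (double) n * n; }

    public static void main(String[] args) {
        int n0 = 7;
        boolean holds = true;
        for (int n = n0; n <= 1_000_000; n++) {
            double lower = g(n) / 14.0;   // c1 * g(n)
            double upper = g(n) / 2.0;    // c2 * g(n)
            if (!(lower <= f(n) && f(n) <= upper)) {
                holds = false;
                System.out.println("Sandwich inequality fails at n = " + n);
                break;
            }
        }
        System.out.println("c1*g(n) <= f(n) <= c2*g(n) for all tested n >= " + n0 + ": " + holds);
    }
}

A finite loop is of course not a proof; the algebraic argument (1/14 ≤ 1/2 − 3/n ≤ 1/2 for all n ≥ 7) is what establishes the bound, but such a check is a quick way to catch a bad choice of constants.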
2. O-Notation (Big Oh)

Definition: For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

Instead of "f(n) ∈ O(g(n))", we write "f(n) = O(g(n))".

f(n) = Θ(g(n)) implies f(n) = O(g(n)).

O(g(n)) gives an asymptotic upper bound.

The Θ notation asymptotically bounds a function from above and below, while O notation gives only an upper bound to within a constant factor; Θ notation is therefore a stronger statement than O notation. When we say that f(n) = O(g(n)), we are guaranteeing that the function f(n) grows at a rate no faster than g(n); thus g(n) is an upper bound on f(n).

Example No. 2: Show that 1000n = O(n²).

This translates to: "Show that 1000n grows at a rate no faster than n²."

Solution: Let f(n) = 1000n and g(n) = n².

Find c and n0 such that 1000n ≤ c·n² for all n ≥ n0.

Let c = 1 and n0 = 1000. Then 1000n ≤ n·n = n² for all n ≥ 1000; at n = 1000, f(1000) = 1,000,000 ≤ 1·(1000)² = 1,000,000. TRUE

∴ 1000n = O(n²)

3. Ω-Notation (Omega)

Definition: For a given function g(n), we denote by Ω(g(n)) the set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

Ω(g(n)) gives an asymptotic lower bound.

Just as O notation provides an asymptotic upper bound on a function, Ω notation provides an asymptotic lower bound. When we say that the running time (no modifier) of an algorithm is Ω(g(n)), we mean that no matter what particular input of size n is chosen for each value of n, the running time on that input is at least a constant times g(n), for sufficiently large n.

Example No. 3: Show that n + 2 is Ω(n).

Solution: Let f(n) = n + 2 and g(n) = n. Find c and n0 such that c·n ≤ n + 2 for all n ≥ n0.

Let c = 1 and n0 = 1. At n = 1: c·g(1) = 1 ≤ f(1) = 1 + 2 = 3, and n ≤ n + 2 clearly holds for every n ≥ 1. TRUE

∴ n + 2 is Ω(n)

Theorem: For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Example No. 4:
1. What are the relative growth rates of n³ and n²?
2. What are the relative growth rates of n² and 2n²?

Solution:
1. n³ grows faster than n², so we can say that n² = O(n³) or n³ = Ω(n²).
2. n² and 2n² grow at the same rate, so both n² = O(2n²) and n² = Ω(2n²) are true. We can also say that n² = Θ(2n²) or 2n² = Θ(n²).

4. o-Notation (Little Oh)

Definition: For a given function g(n), we denote by o(g(n)) the set of functions

o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.

o(g(n)) gives an upper bound that is not asymptotically tight.

The asymptotic upper bound provided by O notation may or may not be asymptotically tight. The bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not. The definitions of O notation and o notation are similar. The main difference is that in f(n) = O(g(n)), the bound 0 ≤ f(n) ≤ c·g(n) holds for some constant c > 0, but in f(n) = o(g(n)), the bound 0 ≤ f(n) < c·g(n) holds for all constants c > 0.

Example No. 5: Is 2n² = o(n²)?

Solution: No, since 2n² is bounded by n² both from above and from below, that is, 2n² = O(n²) and 2n² = Ω(n²), or 2n² = Θ(n²).

5. ω-Notation (Little Omega)

Definition: For a given function g(n), we denote by ω(g(n)) the set of functions

ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.

ω(g(n)) gives a lower bound that is not asymptotically tight.

f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).

Comparing Functions

Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the following, assume that f(n) and g(n) are asymptotically positive. Transitivity, reflexivity, symmetry (for Θ), and transpose symmetry hold; for example, if f(n) = Θ(g(n)) and g(n) = Θ(h(n)) then f(n) = Θ(h(n)), and f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).

Because these properties hold for asymptotic notations, we can draw an analogy between the asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:
• f(n) = O(g(n)) is like a ≤ b
• f(n) = Ω(g(n)) is like a ≥ b
• f(n) = Θ(g(n)) is like a = b
• f(n) = o(g(n)) is like a < b
• f(n) = ω(g(n)) is like a > b

Below are some of the rules that we need to familiarize ourselves with (a worked loop example follows the list):

1. If T1(n) = O(f(n)) and T2(n) = O(g(n)), then
   a) T1(n) + T2(n) = O(f(n) + g(n)) (intuitively and less formally, this is O(max(f(n), g(n))))
   b) T1(n) * T2(n) = O(f(n) * g(n))
2. If T(n) is a polynomial of degree k, then T(n) = Θ(n^k).
3. (log n)^k = O(n) for any constant k. This tells us that logarithms grow very slowly.
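To make rules 1(a) and 1(b) concrete, here is a small worked loop example. It is an illustrative sketch of mine, not taken from the module: two consecutive loops followed by a pair of nested loops, with the rules applied in the comments.

// Illustrative only: applying the sum and product rules to simple loops.
public static long ruleDemo(int[] a) {
    int n = a.length;
    long count = 0;

    for (int i = 0; i < n; i++) {          // first loop: O(n)
        count += a[i];
    }
    for (int i = 0; i < n; i++) {          // second loop: O(n)
        count -= a[i] / 2;
    }
    // Consecutive statements: rule 1(a) gives O(n) + O(n) = O(n + n) = O(n).

    for (int i = 0; i < n; i++) {          // outer loop: n iterations
        for (int j = 0; j < n; j++) {      // inner loop: n iterations each time
            count += (long) i * j;
        }
    }
    // Nested loops: rule 1(b) gives O(n) * O(n) = O(n * n) = O(n²).

    // The whole fragment: O(n) + O(n²) = O(n²) by rule 1(a), keeping the dominant term.
    return count;
}

Rule 2 gives the same conclusion if we first write the exact operation count as a quadratic polynomial in n: a polynomial of degree 2 is Θ(n²).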
In order to analyse algorithms in a formal framework, we need a model of computation, which is characterized as follows:
1. A normal computer in which instructions are executed sequentially
2. It has simple instructions (e.g., addition, multiplication, comparison, assignment) that take exactly one time unit
3. It has fixed-size integers (e.g., 32-bit integers)
4. It has no fancy operations that cannot be done in one time unit
5. It has infinite memory

Now that we know about algorithm analysis and the rules governing it, what is it that we are going to analyse? The most important resource to analyse is generally the running time, and typically the size of the input is the main consideration. We can analyse three cases:
1. Best Case – analysed only occasionally since it does not represent typical behaviour
2. Average Case – reflects typical behaviour but is difficult to compute; does not include bad input
3. Worst Case – represents a guarantee for performance on any possible input, including bad input

For this course, we will analyse the worst-case performance of algorithms.

To calculate the running times of algorithms, we will adopt several conventions to simplify the analysis, which are as follows:
• There are no particular units of time
• Throw away leading constants
• Throw away low-order terms
• Compute Big-Oh so as not to underestimate the running time of the program

Example No. 6: Below is a simple program fragment to calculate the sum ∑ i³ for i = 1, …, n:

public static int sum(int n) {
    int partialSum;
    partialSum = 0;
    for (int i = 1; i <= n; i++) {
        partialSum += i * i * i;   // add the cube of i to the running total
    }
    return partialSum;
}
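Applying the counting conventions above to this fragment gives a quick estimate of its running time. This is a sketch of the usual accounting; the per-line unit costs are modelling choices rather than part of the surviving transcript:
• The declaration of partialSum costs no time; partialSum = 0 and return partialSum cost one unit each.
• In the for loop, the initialization i = 1 costs one unit, the test i <= n is executed n + 1 times, and the increment i++ is executed n times.
• The body partialSum += i * i * i costs about four units (two multiplications, one addition, one assignment) and is executed n times, for about 4n units.
Adding these gives roughly 6n + 4 units. Throwing away the leading constant and the low-order term, as the conventions prescribe, the fragment runs in O(n) time.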
