

Republic of the Philippines
DON HONORIO VENTURA STATE UNIVERSITY
Bacolor, Pampanga
COLLEGE OF COMPUTING STUDIES
ALGORITHMS and COMPLEXITIES (CSAC313)

MODULE 2 – Growth of Functions & Asymptotic Notations (Week 6-8)

Preface

In this module you will learn how to characterize the behavior of an algorithm. Understanding asymptotic notations and the growth of functions is essential for evaluating and comparing algorithmic efficiency. By showing how performance scales as input sizes rise, this analysis supports well-informed decisions about algorithm design and optimization, so that you can select or create algorithms that satisfy your performance and resource needs.

Understanding how algorithms and data structures behave as the input grows is central to studying them. This understanding is usually expressed through asymptotic notations and the concept of function growth. These ideas offer a framework for evaluating and contrasting the efficiency of algorithms, especially with regard to their time and space complexity. Knowing the growth rates enables you to judge the feasibility and efficiency of algorithms with confidence, particularly when dealing with enormous amounts of data under computing constraints.

Objectives

At the end of the lesson, the student should be able to:
1. review and discuss the growth of functions and asymptotic notations;
2. solve and analyze algorithms using these notations;
3. recognize and differentiate non-recursive and recursive algorithms;
4. share ideas regarding the topics in this module;
5. ask questions during discussion or through the messaging tool.

Growth of Functions

The growth of a function describes how an algorithm's space or runtime needs expand as the size of the input increases.
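As a concrete illustration of growth in this sense, the following sketch (an illustration of mine, not taken from the module) counts the basic operations performed by three simple code shapes as n increases. The function names are hypothetical; only the loop structures matter.

```python
def constant_ops(n):
    # O(1): one operation, regardless of n
    return 1

def linear_ops(n):
    # O(n): one operation per element
    count = 0
    for _ in range(n):
        count += 1
    return count

def quadratic_ops(n):
    # O(n^2): nested loops over the input
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

# Doubling n leaves the constant count unchanged, doubles the linear
# count, and quadruples the quadratic count.
for n in (10, 20, 40):
    print(n, constant_ops(n), linear_ops(n), quadratic_ops(n))
```

Counting operations rather than measuring wall-clock time is the usual first step in analysis: it isolates the growth rate from hardware and implementation details.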
Performance can be strongly impacted by the different growth rates that different function types display, and these growth rates can be used to forecast an algorithm's behavior as the problem size increases.

Common Growth Rates

Constant Time: O(1)
  o Description: The function's runtime or space requirement does not change with the input size.
  o Example: Accessing an element in an array by index.
  o Graph: A horizontal line.

Logarithmic Time: O(log n)
  o Description: The function grows logarithmically with the input size. Logarithmic growth is slower than linear or polynomial growth.
  o Example: Binary search in a sorted array.
  o Graph: A curve that increases slowly and flattens out.

Linear Time: O(n)
  o Description: The function grows linearly with the input size.
  o Example: Iterating through an array.
  o Graph: A straight line with a slope proportional to n.

Linearithmic Time: O(n log n)
  o Description: The function grows in proportion to n log n. This is common in efficient sorting algorithms.
  o Example: Merge sort or quicksort (average case).
  o Graph: A curve that grows faster than linear but slower than quadratic.

Quadratic Time: O(n^2)
  o Description: The function grows proportionally to the square of the input size. This is typical in algorithms with nested loops.
  o Example: Bubble sort or selection sort.
  o Graph: A parabolic curve that grows quickly.

Cubic Time: O(n^3)
  o Description: The function grows proportionally to the cube of the input size.
  o Example: Some naive algorithms for matrix multiplication.
  o Graph: A steeply increasing curve.

Exponential Time: O(2^n)
  o Description: The function doubles with each additional element, leading to very rapid growth.
  o Example: Solving the Traveling Salesman Problem by brute force.
  o Graph: An extremely steep curve.

Factorial Time: O(n!)
  o Description: The function grows factorially with the input size, making it impractical for large inputs.
  o Example: Generating all permutations of n elements.
  o Graph: A very steep curve, even steeper than exponential growth.

Comparing Growth Rates

When comparing these functions, it is useful to understand their relative growth:

Logarithmic vs. Linear: Logarithmic growth is much slower than linear growth, so algorithms with O(log n) complexity are typically more efficient than those with O(n) for large inputs.

Linear vs. Quadratic: Quadratic growth is significantly faster than linear growth, so algorithms with O(n^2) complexity become impractical much sooner than those with O(n).

Polynomial vs. Exponential: Exponential growth is much faster than polynomial growth. Algorithms with O(2^n) complexity are often infeasible for even moderately large n, while polynomial complexities are more manageable.

Factorial Growth: Factorial growth is the fastest among commonly encountered complexities, making algorithms with O(n!) complexity impractical for even small n.

Visual Representation

Visualizing these functions helps to understand their growth:

Constant Time (O(1)): Horizontal line.
Logarithmic Time (O(log n)): Slowly rising curve.
Linear Time (O(n)): Straight line at an angle.
Linearithmic Time (O(n log n)): Curve rising faster than linear but not as steep as quadratic.
Quadratic Time (O(n^2)): Parabolic curve.
Cubic Time (O(n^3)): Steeply rising curve.
Exponential Time (O(2^n)): Extremely steep curve.
Factorial Time (O(n!)): Even steeper curve than exponential.

Therefore, for sufficiently large n:

  1 < log n < √n < n < n log n < n^2 < n^3

Constant factors and lower-order terms are absorbed by the dominant term; for example, f(n) = O(n + 1) = O(n).

Synthesis

Understanding the growth of functions and asymptotic notations is crucial for evaluating and comparing the efficiency of algorithms.
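The growth-rate ordering above (1 < log n < √n < n < n log n < n^2 < n^3) can be checked numerically. This small script is my own sketch, not part of the module; it evaluates each function at n = 1024 and confirms the values come out in increasing order.

```python
import math

def growth_values(n):
    # The seven functions from the ordering, evaluated at a single n.
    # log is taken base 2, as is conventional in algorithm analysis.
    return [1, math.log2(n), math.sqrt(n), n,
            n * math.log2(n), n ** 2, n ** 3]

vals = growth_values(1024)
# At n = 1024: [1, 10.0, 32.0, 1024, 10240.0, 1048576, 1073741824]
assert vals == sorted(vals)  # the ordering holds at this n
print(vals)
```

Note that the ordering is an asymptotic statement: for very small n some terms can tie or cross (at n = 2, for instance, n log n equals n), which is why the inequality is stated "for sufficiently large n."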
It helps in making informed decisions about algorithm design and optimization by providing insights into how performance scales with increasing input sizes. By analyzing these factors, you can better choose or develop algorithms that meet your performance and resource requirements. When designing or selecting algorithms, aim for lower time complexity to handle larger inputs efficiently. For example, choosing an O(n log n) sorting algorithm (like merge sort) over an O(n^2) one (like bubble sort) can make a significant difference in performance for large datasets.

Recursive functions are an effective technique for decomposing complicated problems into smaller, more manageable subproblems, and understanding the design and analysis of recursive algorithms is essential for solving problems and maximizing performance. Iterative functions, also known as non-recursive functions, offer an alternative to recursion in situations where recursion could cause performance problems or excessive memory use. By relying on loops and explicit state management, iterative functions can often be more practical and efficient for a variety of problems. Knowing both recursive and iterative techniques lets you select the best approach for the needs and limitations of the problem at hand.

References:

Printed/PDF File References:
[P1] Levitin, A. (2012). Introduction to the design & analysis of algorithms. Boston: Pearson.
[P2] Cormen, T. H. (2009). Introduction to algorithms. Cambridge, MA: MIT Press.
[P3] Dasgupta, S., & Papadimitriou, C. (2008). Algorithms. Boston: McGraw-Hill Higher Education.
[P4] Edmonds, J. (2008). How to think about algorithms. Cambridge: Cambridge University Press.
[P5] Kleinberg, J., & Tardos, E. (2006). Algorithm design. Boston: Pearson/Addison-Wesley.
[P6] Lewis, H., & Papadimitriou, C. Elements of the theory of computation (207).
Upper Saddle River, N.J.: Prentice-Hall.

Online:
[O1] Complexity theory from Wolfram: http://mathworld.wolfram.com/ComplexityTheory.html
[O2] Information and Computation journal: http://projects.csail.mit.edu/iandc/
[O3] Journal of Graph Algorithms and Applications: http://www.cs.brown.edu/sites/jgaa/
[O4] ACM Journal on Experimental Algorithms: http://www.jea.acm.org/about.html
[O5] ACM Transactions on Algorithms: http://talg.acm.org/
[O6] Journal of the ACM: http://jacm.acm.org/
[O7] The NP versus P source: http://www.win.tue.nl/~gwoegi/P-versus-NP.htm
[O8] Algorithms and Complexity resources: http://www.dcs.gla.ac.uk/research/algorithms/links.html

Library Resources References:
[LRR1] Madhav, Sanjay. Game programming algorithms and techniques, 2014.
[LRR2] Gupta, Er Deepak. Algorithm analysis and design, 2013.
[LRR3] Cormen, Thomas. Algorithms unlocked, 2013.

MODULE DISCLAIMER

It is not the intention of the author/s nor the publisher of this module to gain monetarily from the textual information, imagery, and other references used in its production. This module is for the exclusive use of a bona fide student of DHVSU under the department of CCS. In addition, no part of this module may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, and/or otherwise, without the prior permission of DHVSU-CCS.
