COMP002-INTRO-COMP-PROGRAMMING-PPT1.pptx
Polytechnic University of the Philippines
LESSON 1 in Introduction to Computer Programming Concepts
Prepared by Elaine B. Bolambot, MIT Faculty

At the end of the lesson, the learner will be able to:
- Define the fundamentals of Computer Programming concepts;
- Identify the different programming languages and their applications; and
- Describe the System Development Life Cycle (SDLC).

Introduction
What is/are the benefits/advantages of computers in your daily life? When people say that computers "run" our lives, they are only barely exaggerating. Pretty much every aspect of our day-to-day lives utilizes computers in a major way.

Introduction (continuation)
And what keeps the computers running? Programming! No computer does anything without code telling it what to do, and how to do it. Every time a grocery store needs a new inventory system, or an HR department needs a new way to track employee benefits, that's work done by a programmer.

What is a Computer and Computer Program?

What is a Computer?
A computer is an electronic device that is designed to perform various tasks by executing sequences of instructions or programs. It can process data, perform calculations, store and retrieve information, and perform various other operations based on the instructions it receives. A computer is a programmable electronic device that accepts raw data as input and processes it with a set of instructions (a program) to produce the result as output. A computer is designed to execute applications and provides a variety of solutions through integrated hardware and software components.

Charles Babbage
It is believed that the Analytical Engine was the first computer, invented by Charles Babbage in 1837. It used punch cards as read-only memory. Charles Babbage is also known as the father of the computer.

Other definitions of a Computer
- an electronic device capable of performing complex computations in a short time
- a fast electronic calculating machine that accepts input information, processes it according to a list of internally stored instructions called a program, and produces the resultant output information

What is a Computer Program?
A computer program, also known as software, is a set of instructions written in a programming language that tells a computer what tasks to perform. These instructions are composed of logical and sequential steps that guide the computer's hardware components to execute specific operations. Computer programs can range from simple calculations and data processing tasks to complex applications like word processors, video games, web browsers, and more.

What is a Computer Program?
A computer program is a sequence of instructions written using a computer programming language to perform a specified task by the computer.
Basic Terminologies:
- A computer program is also called computer software, which can range from two lines to millions of lines of instructions.

What is a Computer Program? (continuation)
Basic Terminologies: (continuation)
- Computer program instructions are also called program source code, and computer programming is also called program coding.
- A computer without a computer program is just a dumb box; it is programs that make computers active.

What is a Computer Program? (continuation)
Computer programs are created by software developers who write code using programming languages like Python, Java, C++, and many others. These languages provide a way for developers to communicate with the computer's hardware and achieve specific tasks or functionalities.
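As a concrete and purely illustrative example (not from the slides), here is a minimal Python sketch of such a program: a short sequence of instructions that accepts input, processes it, and produces output.

```python
# A tiny computer program: a sequence of instructions executed in order.
name = input("What is your name? ")   # accept raw data as input
greeting = "Hello, " + name + "!"     # process the data with an instruction
print(greeting)                       # produce the result as output
```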
Once a program is written, it needs to be compiled or interpreted, depending on the programming language, to produce machine-readable instructions that the computer can execute.

Simple definitions of a Computer Program
- a set of instructions telling a computer what to do;
- an algorithm written in a language that a computer can understand;
- the implementation (or concrete realization) of an algorithm in a language particular to the computing environment available to its author.

Uses of Computer Programs
Today computer programs are being used in almost every field: household, agriculture, medical, entertainment, defense, communication, etc. Listed below are a few applications of computer programs:

What is Computer Programming?

What is Computer Programming?
Computer programming is the act/process of writing, designing, and creating a set of instructions (code/computer programs) that a computer can understand and execute. These instructions are a sequence of steps written using a computer programming language to perform a specified task; they are written in programming languages and are used to create computer programs or software applications. Computer programming is the craft of implementing one or more interrelated abstract algorithms using a particular programming language to produce a concrete computer program.

What is Computer Programming? (continuation)
Programming involves breaking down complex tasks into smaller, manageable steps that can be expressed in a programming language. These steps are often logical and sequential, and they guide the computer in performing various operations and tasks. Programmers use programming languages to communicate with computers and instruct them to perform specific actions, such as calculations, data processing, interacting with hardware devices, and more.

What is Computer Programming? (continuation)
Programming languages serve as the means of communication between humans and computers. There are various programming languages available, each with its strengths and weaknesses, and they are used for different types of applications. Some popular programming languages include Python, Java, C++, JavaScript, Ruby, and more.

What is Computer Programming? (continuation)
Overall, computer programming is a skill that empowers individuals to create software solutions that address real-world problems and automate tasks, making computers versatile tools for various industries and domains.

Key Aspects of Computer Programming:
- Problem Solving: Programming involves analyzing problems and finding efficient ways to solve them using code. This requires logical thinking and creativity.
- Algorithm Design: Programmers create algorithms, which are step-by-step procedures for solving specific problems. An algorithm forms the foundation of a program's structure and logic.

Key Aspects of Computer Programming: (continuation)
- Coding: Writing code involves using a programming language to express the instructions that the computer will follow. This involves adhering to the syntax and rules of the chosen programming language.
- Testing and Debugging: After writing code, programmers test it to ensure it behaves as expected. Debugging involves identifying and fixing errors or bugs in the code that cause unexpected behavior.

Key Aspects of Computer Programming: (continuation)
- Optimization: Programmers strive to create code that is efficient in terms of execution speed, memory usage, and other resources.
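As a small, hypothetical Python sketch (the function and values are illustrative, not from the slides), several of these aspects appear in miniature below: a step-by-step algorithm, its coded form, a quick test, and a docstring that also serves as documentation. The remaining key aspects, documentation and collaboration, continue after the sketch.

```python
def average(values):
    """Return the arithmetic mean of a non-empty list of numbers."""
    total = 0
    for v in values:          # algorithm: accumulate the sum step by step
        total += v
    return total / len(values)

# Testing and debugging: check the result against a case verified by hand.
assert average([2, 4, 6]) == 4

print(average([5, 3]))        # 4.0
```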
Key Aspects of Computer Programming: (continuation)
- Documentation: Good programming practice involves creating documentation that explains the purpose, functionality, and usage of the code. This helps other programmers understand and maintain the code in the future.

Key Aspects of Computer Programming: (continuation)
- Collaboration: Often, programmers work in teams to develop large-scale software projects. Collaboration involves coordinating efforts, dividing tasks, and integrating different components into a cohesive application.

Programming Languages and Their Types

Programming Languages and Their Types
Programming languages are formal systems used to communicate instructions to computers. They provide a way for programmers to express their ideas and algorithms in a way that computers can understand and execute. There are numerous programming languages, each with its own syntax, semantics, and purposes. Programming languages can be broadly categorized into several types based on their characteristics and applications:

Programming Languages and Their Types (continuation)
Programming languages can be broadly categorized into several types based on their characteristics and applications:
High-Level Programming Languages
High-level languages are designed to be more human-readable and abstract, allowing programmers to focus on solving problems rather than dealing with low-level details of computer hardware. Examples include Python, Java, C++, Ruby, and JavaScript.

Programming Languages and Their Types (continuation)
Low-Level Programming Languages
Low-level languages are closer to the machine code that computers understand, making them less human-readable but more closely tied to hardware. They provide greater control over hardware resources but require more effort to write and maintain. Assembly languages are examples of low-level languages.

Programming Languages and Their Types (continuation)
Procedural Programming Languages
Procedural languages focus on procedures or routines that define a series of steps to perform a specific task. These languages encourage the use of functions and procedures to organize code. Examples include C, Pascal, and Fortran.

Programming Languages and Their Types (continuation)
Object-Oriented Programming (OOP) Languages
OOP languages organize code into objects, which encapsulate data and methods that operate on that data. They promote modularity, reusability, and easier maintenance. Examples include Java, C++, and Python.

Programming Languages and Their Types (continuation)
Functional Programming Languages
Functional languages treat computation as the evaluation of mathematical functions. They emphasize immutability and the avoidance of side effects, making programs more predictable and easier to reason about. Examples include Haskell, Lisp, and Erlang.

Programming Languages and Their Types (continuation)
Scripting Languages
Scripting languages are often used for automating tasks, creating small utilities, and web development. They tend to have simpler syntax and dynamic typing. Examples include Python, Ruby, and JavaScript.

Programming Languages and Their Types (continuation)
Markup Languages
Markup languages are used to describe the structure and presentation of documents, often for web content. While not traditional programming languages, they play a crucial role in defining how information is displayed. Examples include HTML, XML, and CSS.
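To make the paradigm-based categories above concrete, here is a hedged Python sketch that solves the same small task (summing the squares of a list of numbers) in a procedural, an object-oriented, and a functional style; the names and values are illustrative only.

```python
numbers = [1, 2, 3, 4]

# Procedural style: a routine built from explicit, sequential steps.
def sum_of_squares(values):
    total = 0
    for v in values:
        total += v * v
    return total

# Object-oriented style: data and the methods that operate on it live in a class.
class SquareSummer:
    def __init__(self, values):
        self.values = values

    def total(self):
        return sum(v * v for v in self.values)

# Functional style: the result is expressed as the evaluation of functions.
functional_total = sum(map(lambda v: v * v, numbers))

print(sum_of_squares(numbers), SquareSummer(numbers).total(), functional_total)  # 30 30 30
```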
Programming Languages and Their Types (continuation)
Domain-Specific Languages (DSLs)
DSLs are tailored for specific tasks or industries, offering specialized syntax and features to address specific problems efficiently. SQL (for databases) and MATLAB (for numerical computation) are examples of DSLs.

Programming Languages and Their Types (continuation)
Concurrency-Oriented Languages
These languages focus on managing concurrent or parallel execution of tasks, essential for multi-core processors and distributed computing. Examples include Go and Erlang.

Programming Languages and Their Types (continuation)
Compiled vs. Interpreted Languages
Languages can be classified based on how they are executed. Compiled languages are translated into machine code before execution, resulting in faster performance. Interpreted languages are executed directly by an interpreter, offering greater portability and flexibility.

Programming Languages and Their Types (continuation)
Static vs. Dynamic Typing
Languages can also be categorized based on how they handle data types. Static typing (e.g., C, Java) enforces type checking at compile time, while dynamic typing (e.g., Python, JavaScript) performs type checking at runtime.

Low Level vs High Level Programming Languages
Low Level Programming Language
It is a machine-dependent (0s and 1s) programming language. It is divided into two parts: Machine Language and Assembly Language.

Low Level vs High Level Programming Languages (continuation)
Machine Language
It is also called machine code or object code. Machine language is easier to read because it is normally displayed in binary or hexadecimal (base 16) form. It does not require a translator to convert the programs because computers directly understand machine language programs.

Low Level vs High Level Programming Languages (continuation)
Machine Language (continuation)
Sample machine language program to add 5 and 3 (using the Intel 486 processor instruction set):

Low Level vs High Level Programming Languages (continuation)
Assembly Language
Assembly language (ASM) is also a type of low-level programming language that is designed for specific processors. It represents the set of instructions in a symbolic and human-understandable form. It uses an assembler to convert the assembly language to machine language.

Low Level vs High Level Programming Languages (continuation)
Assembly Language (continuation)
Sample assembly language program to add 5 and 3 (using the Intel 486 processor instruction set):

Low Level vs High Level Programming Languages (continuation)
Low-Level Programming Languages (continuation)
- Proximity to Hardware: Low-level languages are closer to the hardware and are more closely related to the architecture of the computer's CPU. They provide a direct interface to the computer's hardware resources.
- Abstraction: They have minimal abstraction from the hardware. Programmers need to manage memory and hardware resources manually.

Low Level vs High Level Programming Languages (continuation)
Low-Level Programming Languages (continuation)
- Readability: Low-level languages are less human-readable due to their complex and hardware-specific syntax.
- Efficiency: Programs written in low-level languages tend to be more efficient in terms of execution speed and memory usage since they allow fine-grained control over hardware resources.
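For a rough contrast with the machine-language and assembly-language samples referenced above, the same "add 5 and 3" task in a high-level language such as Python is a single readable statement, with registers, opcodes, and memory layout handled for the programmer (a hedged illustration, not part of the original slides).

```python
# High-level version of the "add 5 and 3" sample: no registers or opcodes to manage.
result = 5 + 3
print(result)  # 8
```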
Low Level vs High Level Programming Languages (continuation)
Low-Level Programming Languages (continuation)
- Portability: Programs written in low-level languages are less portable across different hardware platforms since they are closely tied to the architecture of a specific machine.

Low Level vs High Level Programming Languages (continuation)
High Level Programming Language
A high-level programming language (HLL) is designed for developing user-friendly software programs and websites. This programming language requires a compiler or interpreter to translate the program into machine language (to execute the program).

Low Level vs High Level Programming Languages (continuation)
High Level Programming Language (continuation)
- Abstraction: High-level languages provide a higher level of abstraction from the hardware. They offer more intuitive and human-readable syntax, allowing programmers to focus on problem-solving rather than hardware intricacies.
- Memory Management: High-level languages often handle memory management automatically, reducing the risk of memory-related errors like buffer overflows.

Low Level vs High Level Programming Languages (continuation)
High Level Programming Language (continuation)
- Readability: High-level languages are more human-readable due to their syntax, which is closer to natural language and higher-level concepts.
- Efficiency: Programs written in high-level languages might not be as efficient as low-level counterparts in terms of raw execution speed and memory usage. However, modern optimizing compilers often bridge this gap.

Low Level vs High Level Programming Languages (continuation)
High Level Programming Language (continuation)
- Portability: High-level languages are generally more portable across different platforms, as long as the required interpreter or compiler is available.

Low Level vs High Level Programming Languages (continuation)
Low-level languages are typically chosen when performance and hardware control are critical, while high-level languages are preferred for faster development, maintainability, and ease of understanding.

Programming Paradigms
- Imperative Programming: Focuses on describing the steps required to solve a problem. It involves changing the program's state through statements that modify data.
- Declarative Programming: Focuses on the logic of the program and the end result.

Digital Data: Understanding the Foundation of Computing

Digital Data
Digital data is at the core of modern computing and information technology. It refers to the representation of information using binary digits (bits), which can take on the values of 0 and 1. All digital devices, from computers to smartphones, process and store data in this binary format.

Binary Representation
- Binary Digits (Bits): The smallest unit of digital data is a bit, which can represent either a "0" or a "1".
- Bytes: A group of 8 bits forms a byte, the basic unit for storing and processing data in most computer systems.

Binary Representation (continuation)
How then can a computer represent words and letters using bits? Bits can also be used to represent character data. In this case, computers make use of 0 and 1 as a replacement for dashes and dots.

Encoding Schemes
Types of Codes:
- ASCII: American Standard Code for Information Interchange; requires only seven bits for each character.
- EBCDIC: Extended Binary Coded Decimal Interchange Code; an alternative 8-bit code used by older IBM mainframe computers.
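Before the EBCDIC and extended ASCII tables, here is a short, hedged Python sketch of the idea behind these codes: each character is stored as a numeric code, which the built-in ord() and chr() functions expose directly (the sample text is illustrative).

```python
# Each character is stored as a numeric code (an ASCII/Unicode code point).
for ch in "Hi!":
    print(ch, ord(ch), format(ord(ch), "08b"))   # character, decimal code, 8-bit pattern

print(chr(65))   # 'A' -- the character whose code is 65
```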
EBCDIC Table

Types of Codes (continuation)
- UNICODE: uses 8, 16, or 32 bits, providing codes for 65,000 characters (representing the alphabets of multiple languages), and is becoming popular.
- Extended ASCII Code: makes use of a series of 0s and 1s to represent 256 characters (including uppercase and lowercase letters, numbers, and symbols).

Extended ASCII Code Table

Data Types
- Integers: Whole numbers (positive, negative, or zero) are represented using binary numbers. The size of the binary representation determines the range of values that can be represented.
- Floating-Point Numbers: Decimal numbers are approximated using floating-point notation in binary, consisting of a sign, exponent, and mantissa.

Data Types (continuation)
- Characters: Characters and symbols are represented using character encoding schemes such as ASCII or Unicode.
- Boolean Values: Represented as 0 or 1, Boolean values are used for logical operations.

Binary Arithmetic
- Binary Addition and Subtraction: Similar to decimal arithmetic, binary numbers can be added and subtracted using the same principles.
- Conversion Between Binary and Decimal: Techniques for converting between binary and decimal representations.

Data Compression
- Lossless Compression: Techniques that reduce the size of data without losing any information. Examples include Run-Length Encoding (RLE) and Huffman Coding.
- Lossy Compression: Methods that sacrifice some data to achieve higher compression ratios, often used for multimedia data (e.g., images, audio, video). Examples include JPEG and MP3.

Binary Logic and Operations
- Logical Gates: Basic building blocks of digital circuits that perform logical operations (AND, OR, NOT).
- Boolean Algebra: A mathematical system used to analyze and simplify binary logic expressions.

Binary Representation in Computer
- Binary Memory: Computers use binary storage at the hardware level, with each bit stored as a voltage level.
- CPU Operations: Central Processing Units (CPUs) perform arithmetic, logical, and control operations using binary instructions.
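To tie the binary-representation ideas above together, the following hedged Python sketch shows decimal/binary conversion, a small binary addition, and a toy run-length encoder of the kind mentioned under lossless compression; all values and the rle() helper are illustrative.

```python
# Conversion between decimal and binary representations.
n = 13
binary = bin(n)           # '0b1101'
back = int("1101", 2)     # 13
total = 0b0101 + 0b0011   # binary addition: 5 + 3 = 8
print(binary, back, total)

# Toy run-length encoding: a lossless scheme that replaces repeats with counts.
def rle(text):
    out, i = [], 0
    while i < len(text):
        j = i
        while j < len(text) and text[j] == text[i]:
            j += 1
        out.append(f"{text[i]}{j - i}")
        i = j
    return "".join(out)

print(rle("AAAABBBCCD"))   # 'A4B3C2D1'
```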
Software Concepts: Essential ideas in computing

Software Concepts
Software is a fundamental aspect of modern technology, encompassing a wide range of applications and systems that drive our digital world. Understanding software concepts is crucial for anyone involved in programming, development, or using technology.

Software Concepts (continuation)
Software Categories:
System Software: Software that manages and controls computer hardware, providing a platform for application software to run. Examples include operating systems (Windows, macOS, Linux) and device drivers.

Software Concepts (continuation)
Software Categories:
Features of System Software
- Close to the system
- Fast in speed
- Difficult to design
- Difficult to understand
- Less interactive
- Smaller in size
- Difficult to manipulate
- Generally written in low-level language

Software Concepts (continuation)
Software Categories: (continuation)
Application Software: Software designed to perform specific tasks or provide particular services to users. Examples include word processors, web browsers, games, and multimedia software.

Software Concepts (continuation)
Software Categories: (continuation)
Features of Application Software
- Close to the user
- Easy to design
- More interactive
- Slow in speed
- Generally written in high-level language
- Easy to understand
- Easy to manipulate and use
- Bigger in size and requires large storage space

Software Concepts (continuation)
Software Components:
- Functions and Procedures: Segments of code that perform specific tasks and can be reused throughout the program.
- Classes and Objects: Building blocks of object-oriented programming, where classes define blueprints for creating objects with attributes (data) and methods (functions).

Software Concepts (continuation)
Software Components: (continuation)
- Modules and Libraries: Collections of related functions, classes, and other resources that can be reused in different projects.
- Application Programming Interfaces (APIs): Define methods and data structures that allow different software components to interact with each other.

Software Development Life Cycle (SDLC)

Software Development Life Cycle
The Software Development Life Cycle (SDLC) is a systematic approach used by software developers to design, develop, test, and deploy software applications. It provides a structured framework for managing the entire software development process, from initial concept to final deployment and maintenance.

Phases of SDLC
Planning and Requirements Analysis/Gathering:
- Identify and document the requirements of the software application.
- Gather input from stakeholders, including end-users, clients, and business analysts.
- Define the features, functionalities, and objectives of the software.

Phases of SDLC (continuation)
System Design:
- Create a high-level design that outlines the software's architecture and components.
- Specify how different parts of the software will interact and communicate.
- Decide on technologies, databases, and platforms to be used.

Phases of SDLC (continuation)
Implementation:
- Write the actual code based on the design specifications.
- Develop the software components, modules, and functions.
- Apply programming languages, libraries, and frameworks.

Phases of SDLC (continuation)
Testing and Quality Assurance:
- Test the software to identify and fix defects (bugs) and ensure its functionality.
- Conduct various types of testing, including unit testing, integration testing, and user acceptance testing (UAT).
- Verify that the software meets the specified requirements and quality standards.

Phases of SDLC (continuation)
Deployment:
- Release the software to end-users or clients.
- Set up the necessary infrastructure and environments for the software to run.
- Provide installation instructions and support for users during deployment.

Phases of SDLC (continuation)
Maintenance and Updates:
- Monitor the software in its operational environment.
- Address any issues, bugs, or performance problems that arise.
- Provide updates, patches, and new features based on user feedback and evolving requirements.

Phases of SDLC (continuation)
- Waterfall Model
- Modified Waterfall Model
- Iterative Waterfall Model

Software Development Methodologies
Waterfall Model
- Sequential approach where each phase must be completed before moving to the next.
- Well-suited for projects with well-defined requirements and limited changes.
- Can result in longer development times and less flexibility.

Software Development Methodologies (continuation)
Agile Methodology
- Iterative and incremental development approach.
- Focuses on collaboration, flexibility, and responding to changes.
- Emphasizes regular feedback from end-users and stakeholders.

Software Development Methodologies (continuation)
Scrum
- An Agile framework that divides the project into time-bound iterations called sprints.
- Features regular team meetings and continuous development cycles.
- Supports adaptive planning and frequent adjustments.

Thank you!