Chapter1_Introduction to Computers and Python_Part3 (4).pptx

Full Transcript


Chapter 1: Introduction to Computers and Python (CSC1402)

Learning Objectives. Learn computer system components. Understand the data hierarchy from bits to databases. Understand the different types of programming languages. Understand the strengths of Python and other leading programming languages. Be introduced to Python programming and some Python programs.

What Is a Computer? A computer is an electronic device that can be programmed to accept raw data as input and process it to produce results as output. It is designed to perform calculations and make logical decisions phenomenally faster than human beings can; today's personal computers can perform billions of calculations in one second.

Computer System. A computer system consists of hardware (the equipment associated with the system), software (the instructions that tell the hardware what to do) and people: the computer programmer, who writes software, and the user (often called the end user), who purchases and uses software.

Computer Hardware: Input/Output Units. The input unit is the "receiving" section; it obtains information (data and computer programs) from input devices and places it at the disposal of the other units for processing. The output unit is the "shipping" section; it takes information the computer has processed and places it on various output devices to make it available.

Central Processing Unit (CPU). The CPU is the brain of the computer. Its electronic circuitry executes the instructions of a computer program, such as arithmetic, logic and input/output (I/O) operations, and it can perform billions of tasks per second. The CPU consists of the Arithmetic Logic Unit (ALU), which performs the arithmetic and logic operations on the data made available to it, and the Control Unit (CU), which controls and coordinates the activity of the other units of the computer.

Memory. Memory is the part of the computer responsible for storing data and information on a temporary or permanent basis. It can be classified into two broad categories: primary (main) memory and secondary memory.

Primary Memory or Main Memory. Random-access memory (RAM) is the computer's short-term memory: it stores the instructions and data currently being used and is temporary (volatile) storage. Read-only memory (ROM) stores start-up instructions: it is permanent, low-capacity storage.

Secondary Storage. Secondary storage is the long-term, high-capacity storage section. Programs or data not actively being used by the other units normally are placed on secondary storage devices until they are needed. Information on secondary storage devices is persistent: it is preserved even when the computer's power is turned off. Secondary storage takes much longer to access than primary memory, but its cost per unit is much lower.

CPU Machine Cycle. The machine cycle, or instruction cycle, is the sequence of steps a CPU goes through in order to execute a single machine-language instruction. The process breaks down into key stages: fetch, decode, execute and store.

Computer Software. Software consists of the programs that enable hardware to perform different tasks; it is divided into system software and application software. System software coordinates instructions between software and hardware and includes the operating system and utility programs. Application software consists of programs used to complete tasks and includes productivity software, entertainment software, educational and reference software, and personal software.

Operating Systems. An operating system manages the software and hardware on the computer.
Several different computer programs are typically running at the same time, and they all need access to the computer's central processing unit (CPU), memory and storage; the operating system coordinates all of this to make sure each program gets what it needs. It provides services that allow each application to execute safely, efficiently and concurrently with other applications. Linux, Windows and macOS are popular desktop operating systems, and modern operating systems use a graphical user interface. Google's Android and Apple's iOS are the most popular mobile operating systems.

Windows Operating System. In the mid-1980s, Microsoft developed the Windows operating system, consisting of a graphical user interface built on top of DOS (Disk Operating System). Windows is a proprietary operating system controlled by Microsoft and is by far the world's most widely used desktop operating system.

Linux Operating System. Linux is among the greatest successes of the open-source movement. With open-source development, individuals and companies contribute their efforts to developing, maintaining and evolving software in exchange for the right to use that software for their own purposes, typically at no charge. Open-source code is often scrutinized by a much larger audience than proprietary software, so errors often get removed faster; open source also encourages innovation. The Linux source code (the program code) is available to the public for examination and modification and is free to download and install.

Check Questions.
Information in the random-access memory (RAM) unit is persistent; it is preserved even when the computer's power is turned off.
Application programs are programs that are implemented to achieve specific tasks, such as Microsoft Word.
Proprietary code is often scrutinized by a much larger audience than open-source software, so errors often get removed faster.

Data Hierarchy

Bit. All data inside a computer is transmitted as a series of electrical signals that are either on or off. A bit (binary digit) is the smallest unit of data that a computer can process and store. A bit is always in one of two physical states, similar to an on/off light switch, and the state is represented by a single binary value, usually a 0 or a 1. Even the impressive functions performed by computers involve only the simplest manipulations of 0s and 1s.

Character. Digits (0-9), letters (A-Z and a-z) and special symbols ($ @ % & * ( ) – + " : ; , ? /) are known as characters. Since a computer only understands numbers, it needs a system that can represent characters and symbols in binary; a character encoding, or character set, is such a system. Humans don't naturally use binary to communicate, and computers don't natively understand our languages; hence the need for character encoding, which translates our text into a format that computers can process and understand. Computers process only 1s and 0s, so a computer's character set represents every character as a pattern of 1s and 0s.

Character Encoding. The American Standard Code for Information Interchange (ASCII) is an encoding standard for electronic communication. The ASCII encoding scheme represents a character as a group of 8 bits, or one byte. ASCII associates an integer value with each symbol in the character set, such as letters, digits, punctuation marks, special characters and control characters; it takes each character on the keyboard and assigns it a binary number.
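These character codes are easy to check from Python itself. The short sketch below uses only the built-in ord, chr and format functions to print the codes of the same letters used in the examples that follow; the choice of letters is purely illustrative:

# Look up the code of each character and show it as an 8-bit binary pattern.
for letter in "abc":
    code = ord(letter)             # e.g. ord('a') is 97
    bits = format(code, "08b")     # the same value written as 8 binary digits
    print(letter, code, bits)      # prints: a 97 01100001, then b and c

# chr() performs the reverse mapping, from the integer code back to the character.
print(chr(97))                     # prints: a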
For example, the letter 'a' has the binary number 0110 0001 (97), the letter 'b' has the binary number 0110 0010 (98) and the letter 'c' has the binary number 0110 0011 (99). Unicode contains characters for many of the world's languages.

Field, Record, File/Database. Moving up the data hierarchy, characters are grouped into fields, fields into records, records into files, and related files are organized into databases.

Evolution of Programming Languages. Programmers write instructions in various programming languages, some directly understandable by computers and others requiring intermediate translation steps. Early computers were programmed in machine language. Machine languages generally consist of strings of numbers (ultimately reduced to 1s and 0s) that instruct computers to perform their most elementary operations one at a time, and they are machine dependent: a particular machine language can be used on only one type of computer.

Example: suppose we want to represent the equation wages = rate x hours to calculate the weekly wages. In machine language this can be written as:

100100 0000 010001
100110 0000 010010
100010 0000 010011

where 100100 stands for LOAD, 100110 stands for MULTIPLY and 100010 stands for STORE.

Assembly Languages. Machine languages are cumbersome for humans, so instead of the strings of numbers that computers can directly understand, programmers began using English-like abbreviations to represent elementary operations. Using assembly-language instructions, the weekly-wage calculation can be written as follows:

LOAD rate
MULT hours
STORE wages

Assembler: an assembler is a program that translates a program written in assembly language into an equivalent program in machine language.

High-Level Languages. With assembly languages, programmers still had to use numerous instructions to accomplish even the simplest tasks, so high-level languages were developed in which single statements can accomplish substantial tasks. A typical high-level-language program contains many statements, known as the program's source code. The weekly-wage equation wages = rate x hours can be written in a high-level language as follows:

wages = rate * hours;

From the programmer's perspective, high-level languages are preferable to machine and assembly languages. Source code needs another computer program in order to execute, because computers can only execute their native machine instructions; interpreters and compilers convert source code into machine code the computer can understand. A compiler is a program that translates a program written in a high-level language into an equivalent machine-language program; the resulting file is called an executable. An interpreter translates only one statement of the program at a time and does not generate an executable.

Compiler versus Interpreter. A compiler converts the entire source code of a program into executable machine code for a CPU, while an interpreter takes a source program and runs it line by line, translating each line as it comes to it. A compiler takes a large amount of time to analyze the entire source code, but the overall execution time of the program is comparatively faster; an interpreter takes less time to analyze the source code, but the overall execution time of the program is slower. A compiler generates error messages only after scanning the whole program, so debugging is comparatively hard because the error can be anywhere in the program; with an interpreter, debugging is easier because it keeps translating the program until the error is met.
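To make the high-level-language example concrete in this course's own language, here is the same weekly-wage calculation as a small runnable Python sketch; the variable names follow the slide, while the rate and hours values are made up for illustration:

rate = 20.0                      # hypothetical hourly rate (made-up value)
hours = 40                       # hypothetical hours worked this week (made-up value)
wages = rate * hours             # one readable statement performs the whole calculation
print("Weekly wages:", wages)    # prints: Weekly wages: 800.0

A single high-level statement such as wages = rate * hours replaces the three machine-language and assembly-language instructions shown earlier, which is exactly the productivity gain the slides describe.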
Flow of Information During Program Execution.

Python. Python is a high-level programming language, released publicly in 1991 and developed by Guido van Rossum of the National Research Institute for Mathematics and Computer Science in Amsterdam. It has rapidly become one of the world's most popular programming languages and is particularly popular for educational and scientific computing.

Why Python? It is open source, free and widely available, with a massive open-source community. It is easier to learn than many other languages, enabling novices and professional developers to get up to speed quickly, and easier to read than many other popular programming languages. It enhances developer productivity with extensive standard libraries: programmers can write code faster and perform complex tasks with minimal code. There are massive numbers of free, open-source Python applications. Python is popular in web development and data science and is widely used in the financial community. There is an extensive job market for Python programmers across many disciplines, especially in data-science-oriented positions, and Python jobs are among the highest paid of all programming jobs.

Check Questions.
A bit is a binary digit that can assume one of two values and is the smallest data item in a computer.
High-level languages allow you to write instructions that look almost like everyday English and contain commonly used mathematical notations.
An interpreter executes source code directly, avoiding the delay of compilation.
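The last check question can be explored first-hand: Python ships with an interactive interpreter, started by running the python3 command with no arguments, which translates and executes each statement as soon as it is entered. A brief sketch of such a session follows; the >>> prompt is printed by the interpreter and the specific inputs are just examples:

>>> 7 * 6                  # each line is evaluated as soon as it is typed
42
>>> ord("A")               # the character code of 'A'
65
>>> print("Hello, Python")
Hello, Python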
