Introduction to Computing PDF
Cavite State University
Amiel E. Malicsi
Summary
This document provides an introduction to computing topics. It explains concepts like data and information, different data representations, and transformation into information. The document is presented as a set of slides.
Full Transcript
Data, Information, and Software

Let's talk about data!

Data
Data refers to raw, unorganized facts or figures that are collected and stored. It can be in the form of numbers, text, images, or any other type of input. Data, by itself, lacks context and meaning. It is the most basic form of representation and requires further processing to become useful.

Data Representations
Binary data refers to information that can be represented using only two possible values, typically 0 and 1. These values correspond to the off and on states of a switch in a digital system. In computing, binary data is fundamental because computers use binary digits, or bits, to store and process information. Non-binary data, on the other hand, encompasses information that is not restricted to two possible values; it can have multiple states or values beyond just 0 and 1.

Transforming Data into Information
Information is the processed and organized form of data. It is data that has been analyzed, structured, and given context. Information provides meaning and can be used to answer questions or make decisions. It is the result of data being transformed into a more meaningful and useful state.

Data Processing
The INPUT – PROCESS – OUTPUT (I-P-O) model is a conceptual framework in which input, in the form of data, is processed, resulting in the generation of output, typically in the form of information.

Stages of Data Processing

Systems
What is a system? It is a group of organized, interdependent components that interact with and complement one another to achieve one or more predetermined goals.

Characteristics of a System
Unitary Whole – a system is the sum of its parts, glued into one distinct entity.
Composed of Parts – a system is made up of functionally oriented parts.
Bounded – boundaries separate the system from its environment.
System Parts Interact With Each Other – the parts are related and have definite interactions and interdependencies.
Hierarchical – each system is likely to be part of another, larger system, just as it is likely to be divided into subsystems.
Goal-Oriented – the components all work toward a particular purpose or function.

Elements of a Computer System
1. Data – remember, data!
2. Hardware – simply refers to computer equipment, supported by auxiliary or peripheral devices. It refers to the physical components used in data preparation, data input, data storage, data computation and logic comparisons, control functions, and outputting information. It includes the central processing unit (CPU) and the storage, input, output, and communication devices.
3. Peopleware – refers to the human aspects of computing and information systems. It focuses on the human factors of technology, emphasizing how people interact with hardware, software, and processes.
4. Software!

Introduction to Software
What is software? Software comprises the non-physical components of a computer system, such as the machine-coded instructions used by the different hardware facilities. It refers to all computer programs that direct and control the computer hardware in data processing.

Types of Software
Systems Software – a collection of programs that facilitates the programming and operation of the computer. It is called systems software because it is an integral part of the computer system itself. Specifically, it supervises the operations of the CPU, controls the input/output functions of the computer system, translates programming languages, and provides other support services.
Examples: Operating Systems, Device Drivers, Firmware, Utility Software, Command Line Interfaces, Virtual Machine Managers

Applications Software – this is not an integral part of the computer system. These programs are written to solve a specific problem. Application programs may be written by programmers, or may be purchased or leased from computer vendors, software companies, or other computer users.
Examples: word processing software, spreadsheets, database management systems, web browsers, multimedia software, graphics and design software, accounting software, project management software

Development Software – an example of application software; these are tools and applications specifically designed to assist software developers in creating, testing, and maintaining software applications.
Examples: IDEs/code editors, version control systems, compilers and interpreters

Computer Language
Machine Language – also called machine code; a computer program developed in the language of machines to provide instructions that can be directly executed by the central processing unit of a computer. This language is strictly numerical and is expected to run as fast as possible. It may be considered the lowest level of representation of an assembled computer program, or a primitive, hardware-dependent programming language.

Assembly Language – also called symbolic machine code; a low-level programming language whose instructions correspond closely to the machine code instructions of the architecture. Every assembler has its own assembly language, because assembly normally depends on the machine code instructions; each assembly language is normally designed for one specific computer architecture.
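The gap between symbolic low-level instructions and what a programmer writes can be glimpsed even from a high-level language. As a sketch (Python is our own choice here, since the slides name no language), the standard `dis` module disassembles a function into bytecode, a symbolic instruction listing analogous to assembly for Python's virtual machine:

```python
import dis

def add(a, b):
    return a + b

# Show the symbolic, low-level instructions the Python virtual
# machine executes for this one-line high-level function.
dis.dis(add)
# The listing contains mnemonics such as LOAD_FAST and an add
# instruction (exact opcode names vary by Python version); each
# mnemonic stands for a numeric opcode, much as an assembly
# mnemonic stands for a machine-code instruction.
```

Running this shows how much bookkeeping hides behind a single high-level statement, which is exactly why high-level languages are preferred for readability.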
High-Level Language – programming languages designed to allow humans to write computer programs and interact with a computer system without specific knowledge of the processor or hardware the program will run on. They are used in programming because they allow programmers to write code that is more readable, maintainable, and portable.

Compiler – a compiler translates code from a high-level programming language into machine code before the program runs.
Interpreter – an interpreter translates code written in a high-level programming language into machine code line by line as the code runs.

Thanks! For questions and concerns, email me at [email protected]

Evolution of Computing

Learning Objectives
This lesson will enable the students to:
1. Know and understand the evolution of computing and algorithms;
2. Recognize selected key features in the field of computing; and
3. Appreciate the history of computing.

History of Computing
Why is it important to study history?

Abacus (~2000 B.C.)
One of the earliest known computing tools, used in ancient civilizations (China, Mesopotamia) to perform basic arithmetic like addition, subtraction, multiplication, and division using beads on rods.

The Term "Computer" (1613)
The first recorded usage of the term "computer" referred to people, not machines, who performed complex mathematical calculations by hand, particularly for astronomy and navigation.

Oughtred's Slide Rule (1620)
Invented by William Oughtred, the slide rule was a mechanical analog device used for multiplication, division, and complex functions like roots and logarithms. It was the engineer's tool of choice until electronic calculators emerged in the 1970s.

Pascaline (1642)
Blaise Pascal invented the Pascaline, a mechanical calculator that used gears and wheels to add and subtract numbers.
It was the first machine to represent numbers mechanically and perform calculations automatically.

Leibniz's Calculator (1672)
Gottfried Wilhelm Leibniz improved Pascal's design by creating a machine that could multiply and divide as well as add and subtract. Leibniz also developed the binary system, which became the foundation of modern computer systems.

Jacquard Loom (1801)
Joseph-Marie Jacquard's automated loom used punched cards to control the pattern of the cloth. This idea of "programming" a machine using punched cards influenced early computers, marking the first step toward programmable machines.

Analytical Engine (1837)
Charles Babbage designed the Analytical Engine, considered the first concept of a programmable computer. It had features like an arithmetic logic unit (ALU), memory, and the ability to use punched cards for instructions. Though never completed, it laid the foundation for modern computing.

Ada Lovelace's Algorithm (1843)
Ada Lovelace, a mathematician, wrote the first algorithm designed to be executed by Babbage's Analytical Engine. She is considered the first computer programmer. Lovelace also foresaw that computers could go beyond mere calculation to create music or art.

Hollerith Tabulating Machine (1890)
Herman Hollerith created a punched card machine to automate the U.S. Census. It used electrical circuits to read punched cards and greatly sped up data processing. This success led to the founding of IBM in the early 20th century.

Turing Machine (1936)
Alan Turing conceptualized the Turing Machine, a theoretical device that could manipulate symbols and perform any calculation given a set of instructions (an algorithm). His work laid the mathematical foundation for modern computing and algorithms.
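The Turing machine idea above is simple enough to simulate in a few lines. Below is a minimal sketch in Python (the language, the rule-table format, and the bit-flipping example machine are our own illustrations, not from the slides): a machine is just a table mapping (state, symbol) to (symbol to write, head move, next state).

```python
# Minimal Turing machine sketch (illustrative only).
# rules: (state, symbol) -> (write_symbol, move "L"/"R", next_state)

def run_turing_machine(rules, tape, state="start", halt="halt", blank="_"):
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)     # grow the tape to the right as needed
        head += 1 if move == "R" else -1
    return "".join(tape)

# Example machine: invert every bit, halting at the first blank cell.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "1011"))  # -> 0100_
```

Even this toy version shows the essential ingredients Turing identified: a finite rule table, an unbounded tape, and a read/write head.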
ENIAC (1943–1945)
The ENIAC (Electronic Numerical Integrator and Computer) was the first general-purpose electronic digital computer. Built by John Mauchly and J. Presper Eckert at the University of Pennsylvania, it used vacuum tubes and was capable of performing massive calculations much faster than human "computers."

Transistors (1947)
Invented at Bell Labs, the transistor replaced vacuum tubes in computers, marking a revolution in electronics. Transistors were smaller, faster, and more reliable, paving the way for computers to become smaller and more powerful.

Mainframe Computers (1950s)
Large mainframe computers like the UNIVAC and IBM 701 were developed. These were primarily used by government agencies and large corporations for scientific and business calculations, requiring entire rooms due to their size.

Integrated Circuits (1958)
Jack Kilby and Robert Noyce independently developed the integrated circuit, a revolutionary advancement that allowed multiple transistors and components to be packed into a single silicon chip. This drastically reduced the size and cost of computers.

IBM System/360 (1964)
IBM's System/360 was a family of compatible mainframe computers that allowed businesses to upgrade without losing their previous investments in software. It standardized commercial computing and introduced the concept of computer families.

Intel 4004 Microprocessor (1971)
The Intel 4004 was the first commercially available microprocessor, a computer's central processing unit (CPU) on a single chip. This invention by Intel marked the beginning of the microcomputer revolution and set the stage for personal computers.

Apple I (1976)
Steve Wozniak and Steve Jobs created the Apple I, a hobbyist computer kit sold to tech enthusiasts.
It was one of the first personal computers (PCs) designed for individual use, with a display and keyboard. Apple would later launch the Apple II, a massive success in the home computing market.

IBM PC (1981)
IBM released its first personal computer (PC), setting a new standard in the industry. It featured an Intel microprocessor and an operating system from Microsoft (MS-DOS), and it became widely adopted for business and home use, shaping the future of computing.

Apple Macintosh (1984)
Apple introduced the Macintosh, the first commercially successful personal computer with a graphical user interface (GUI) and a mouse, making computers more user-friendly and accessible to non-experts. This paved the way for modern computing interfaces like Windows.

World Wide Web and the Internet (1990s)
Tim Berners-Lee invented the World Wide Web in 1990, revolutionizing how people accessed and shared information globally. During this time, the internet and graphical user interfaces (GUIs) like Windows made computers widely accessible to everyday users.

Smartphones and Mobile Computing (2000s)
Smartphones, starting with early models like the BlackBerry and later the iPhone (2007), put computing power into handheld devices. These phones combined telephony, computing, and internet access, changing how people interacted with technology daily.

Cloud Computing and AI (2010s)
Cloud computing services like Amazon Web Services and Google Cloud allowed data storage and processing to move online, accessible from anywhere. Advances in artificial intelligence (AI), deep learning, and quantum computing opened new frontiers, enhancing automation and decision-making.

Quantum Computing and Edge Computing (2020s)
Quantum computers are being developed to solve complex problems beyond the capabilities of traditional computers.
Edge computing reduces latency by processing data closer to where it is generated (as in IoT devices). These innovations represent the cutting edge of computing technology.

History of Algorithms

Al-Khwarizmi's Algorithm (9th century)
Al-Khwarizmi, a Persian mathematician, introduced the concept of an algorithm in his work on algebra, which provided systematic methods for solving equations. His name is the root of the term "algorithm," and his work laid the foundation for mathematical problem-solving processes. His contributions helped the transition from arithmetic methods based on Greek geometry to more abstract algebraic techniques. He is regarded as the Father of the Algorithm.

Newton–Raphson Method (17th century)
Isaac Newton and Joseph Raphson developed an iterative numerical method to approximate the roots of a real-valued function. This algorithm is still widely used in numerical analysis today. It highlights the need for approximation methods when exact solutions are hard to find, especially in complex mathematical problems.

Euler's Graph Theory (18th century)
Leonhard Euler introduced graph theory, solving the "Seven Bridges of Königsberg" problem, which laid the foundation for algorithmic thinking in network theory. Euler's work gave birth to algorithms related to paths, circuits, and traversals that we still use in computer science for network optimization and connectivity.

Ada Lovelace's Algorithm (19th century)
Ada Lovelace, often considered the first computer programmer, wrote the first algorithm intended for Charles Babbage's Analytical Engine. She recognized that machines could go beyond number crunching to manipulating symbols, paving the way for general-purpose computation. Lovelace's contributions inspired the concept of algorithms in programming.
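The Newton–Raphson iteration described above repeatedly refines a guess x using x − f(x)/f′(x) until the update becomes negligible. A small sketch (Python chosen for illustration; the function names and the √2 example are our own), approximating the square root of 2 as the positive root of f(x) = x² − 2:

```python
def newton_raphson(f, df, x, tol=1e-12, max_iter=50):
    """Refine x with the update x_next = x - f(x)/df(x)."""
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:  # converged: the correction is tiny
            break
    return x

# Approximate sqrt(2) as the positive root of x^2 - 2 = 0,
# starting from the initial guess x = 1.
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
print(root)  # close to 1.41421356...
```

Note the characteristic trade-off the slide mentions: the method gives an approximation, not an exact solution, but it converges very quickly when the initial guess is reasonable.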
Turing Machines (1930s)
Alan Turing introduced the concept of the Turing machine, a theoretical model for computation that could simulate any algorithmic process. Turing's work laid the foundation for modern computing and algorithmic theory, particularly in defining the limits of what can be computed algorithmically. The Turing machine model is crucial in understanding algorithms' efficiency and complexity.

Simplex Algorithm (1950s)
George Dantzig developed the simplex algorithm, which is used to solve linear programming problems. This was one of the first major algorithms used for optimization in fields like economics and operations research. The simplex algorithm revolutionized decision-making processes by finding the best possible outcomes given a set of constraints.

Dijkstra's Shortest Path Algorithm (1960s)
Edsger Dijkstra developed an algorithm to find the shortest path between nodes in a graph, which is fundamental in network routing, transportation, and communication systems. It is still widely used in GPS navigation and internet traffic routing. This algorithm is an example of the practical applications of theoretical computer science in everyday life.

Quicksort Algorithm (1960s)
Tony Hoare introduced the Quicksort algorithm, one of the fastest and most widely used sorting algorithms. It significantly improved data sorting in computing by dividing and conquering the problem, speeding up processes in systems that require data arrangement. The efficiency of Quicksort is why it is still used in modern software.

RSA Algorithm (1977)
Ron Rivest, Adi Shamir, and Leonard Adleman created the RSA encryption algorithm, which is based on the difficulty of factoring the product of two large prime numbers. This is a key algorithm in modern cryptography, securing everything from internet transactions to confidential communications.
RSA ensures privacy and data security, which are critical in today's digital age.

Neural Networks (1980s)
The resurgence of neural networks, inspired by biological neural systems, led to the development of algorithms for pattern recognition, learning, and artificial intelligence. Neural networks, used in deep learning, became essential for tasks like image and speech recognition. The idea that machines can "learn" from data transformed the landscape of algorithms.

Genetic Algorithms (1990s)
Inspired by the process of natural selection, genetic algorithms use evolution-based methods to solve optimization problems. These algorithms are used in areas like machine learning, robotics, and engineering. By mimicking the process of evolution, genetic algorithms offer innovative solutions to problems where traditional methods fall short.

Google's PageRank Algorithm (1990s–2000s)
Developed by Larry Page and Sergey Brin, PageRank became the core of Google's search engine, ranking web pages based on their relevance and the number of incoming links. This algorithm revolutionized information retrieval on the web and is one of the reasons for Google's dominance in the search engine market. It is a great example of how algorithms impact daily life by organizing the vastness of the web.

Deep Learning Algorithms (2010s)
Deep learning algorithms, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), have become foundational in AI applications. These algorithms enable machines to interpret and process large datasets, as in image recognition, natural language processing, and autonomous vehicles. Deep learning has revolutionized the capabilities of AI, pushing the boundaries of what machines can achieve.
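Dijkstra's shortest-path algorithm from the timeline above can be sketched in a few lines. The version below (Python, with a made-up example graph; the variable names are our own) uses a priority queue so the closest unvisited node is always expanded first, which is the key idea behind its use in GPS and network routing:

```python
import heapq

def dijkstra(graph, source):
    """Return the shortest distance from source to every reachable node.

    graph maps each node to a list of (neighbor, edge_weight) pairs.
    """
    dist = {source: 0}
    queue = [(0, source)]  # (distance so far, node)
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            new_d = d + weight
            if new_d < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_d
                heapq.heappush(queue, (new_d, neighbor))
    return dist

# A small illustrative road network (weights could be travel times).
roads = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 5)],
    "B": [("D", 1)],
}

print(dijkstra(roads, "A"))  # A->C->B->D: B is reached at cost 3, D at 4
```

Note how the direct edge A→B (cost 4) loses to the detour A→C→B (cost 3): greedily expanding the nearest node still finds globally shortest paths because edge weights are non-negative.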
Quantum Algorithms (2020s to present)
With the development of quantum computing, researchers are working on algorithms like Shor's algorithm (for factoring large numbers) and Grover's algorithm (for searching unsorted databases). These quantum algorithms could potentially outperform classical algorithms for certain tasks, opening new doors in computing power and speed. Quantum algorithms represent the future frontier of problem solving and computational efficiency.

Key Figures in Computing
Charles Babbage – Father of the Computer
Ada Lovelace – First Computer Programmer
Alan Turing – Father of Modern Computing and AI
John von Neumann – Father of Modern Computer Architecture
George Boole – Father of Boolean Logic
Grace Hopper – Mother of Computing
Claude Shannon – Father of Information Theory
Steve Jobs & Steve Wozniak – Pioneers of Personal Computing
Vint Cerf & Bob Kahn – Fathers of the Internet
Tim Berners-Lee – Inventor of the WWW
Bill Gates – Co-founder of Microsoft and pioneer of software

Thanks! For questions and concerns, email me at [email protected]

Operating System

System Software vs. Application Software
System Software – the operating system and utility programs that control a computer system and allow you to use your computer. It enables the boot process, launches applications, transfers files, controls hardware configuration, manages files on the hard drive, and protects the computer from unauthorized use.
Application Software – programs that allow a user to perform specific tasks on a computer: word processing, playing games, browsing the Web, listening to music, etc.
The Operating System
An operating system is a collection of programs that manage and coordinate the activities taking place within a computer. It acts as an intermediary between the user and the computer, and between the application programs and the system hardware.

Functions of an Operating System
Interfacing with users (typically via a GUI); booting the computer; configuring devices; managing network connections; managing and monitoring resources and jobs; file management; and security.

Processing Techniques for Increased Efficiency
Multitasking – the ability of an operating system to have more than one program (task) open at one time. The CPU rotates between tasks, and switching is done so quickly that it appears as though all programs are executing at the same time.
Multithreading – the ability to rotate between multiple threads so that processing is completed faster and more efficiently. A thread is a sequence of instructions within a program that is independent of other threads.
Multiprocessing and Parallel Processing – multiple processors (or multiple cores) are used in one computer system to perform work more efficiently, with tasks performed simultaneously.
Memory Management – optimizing the use of main memory (RAM). Virtual memory is a memory-management technique that uses hard drive space as additional RAM.
Buffering and Spooling – a buffer is an area in RAM or on the hard drive designated to hold data used by different hardware devices or programs. Buffering, or spooling, is the placement of items in a buffer so they can be retrieved by the appropriate device when needed.

Differences Among Operating Systems
Command Line Interface – requires users to input commands using the keyboard.
Graphical User Interface – a graphics-based interface; used by most operating systems.

Categories of Operating Systems
Personal (Desktop) Operating Systems – designed to be installed on a single computer.
Server (Network) Operating Systems – designed to be installed on a network server; client computers still use a personal operating system, while the server operating system controls access to network resources.
Mobile and embedded operating systems are also common.

Operating systems also differ in the types of processors supported (desktop, mobile, or server processors; 32-bit or 64-bit CPUs) and in their support for virtualization and other technologies: new types of buses, virtualization, mobility, security concerns, power-consumption concerns, touch and gesture input, and the move to the cloud.

Operating Systems for Personal Computers and Servers
DOS (Disk Operating System) – DOS traditionally used a command-line interface and was the dominant operating system in the 1980s and early 1990s. PC-DOS was created originally for IBM microcomputers; MS-DOS was created for use with IBM-compatible computers. DOS commands can still be entered in Windows.

Windows – the predominant personal operating system, developed by Microsoft Corporation; versions run from Windows 1.0 through Windows 11.
Windows 1.0 (released in 1985) through Windows 3.x were operating environments for DOS, designed for personal computers.
Windows NT (New Technology) – the first 32-bit version of Windows, designed for high-end workstations and servers; replaced by Windows 2000.
Windows XP – replaced both Windows 2000 and Windows Me.
Windows Vista – replaced Windows XP; introduced the Aero interface and the Sidebar feature.
Windows 7 – released in late 2009; Home Premium was the primary version for home users and Professional the primary version for businesses; the Libraries feature gives you virtual folders.
Windows 8 – designed to be used with smartphones and desktop computers, with or without a keyboard or mouse; supports multi-touch input; includes the Start screen, tiles, and charms bar.
Windows 10 – replaced Windows 8 and 8.1; includes a new Start menu, Microsoft Edge, Cortana, multiple desktops and Task view, Action Center, and Tablet mode.
Windows 11 – the current version of Windows. On June 24, 2021, Windows 11 was announced as the successor to Windows 10 during a livestream. The new operating system was designed to be more user-friendly and understandable. It was released on October 5, 2021, as a free upgrade for Windows 10 users.
Windows Server – the version of Windows designed for server use; Windows Server 2022 is the latest version; supports both virtualization and cloud computing.
Windows Home Server – preinstalled on home server devices and designed to provide services for a home network; can be set up to back up all devices in the home on a regular basis.

Mac OS – the proprietary operating system for computers made by Apple; based on the UNIX operating system; originally set the standard for graphical user interfaces. Latest version: macOS 14 Sonoma.

UNIX – an operating system developed in the late 1960s for midrange servers; a multiuser, multitasking operating system. It is more expensive, requires a high level of technical knowledge, and is harder to install, maintain, and upgrade. "UNIX" initially referred to the original UNIX operating system, but now refers to a group of similar operating systems based on UNIX; the Single UNIX Specification is a standardized UNIX environment.

Linux – developed by Linus Torvalds in 1991; resembles UNIX but was developed independently. It is open-source software that has been collaboratively modified by volunteer programmers all over the world. It originally used a command line interface, but most recent versions use a GUI. It has strong support from mainstream companies such as IBM, NVIDIA, HP, Dell, and Novell, and individuals and organizations are switching to Linux and other open-source software because of cost.

Chrome OS – the first cloud operating system; essentially the Chrome Web browser redesigned to run a computer, in addition to accessing Web resources. It replaces traditional desktop operating systems and is currently only available preinstalled on Chrome devices.

Operating Systems for Mobile Devices
Windows Phone – the version of Windows designed for smartphones; Windows Phone 8 is based on the Windows 8 operating system.
Windows RT – designed for tablet use.
Windows Embedded – designed primarily for consumer and industrial devices that are not personal computers.
Android – a Linux-based operating system created with current mobile device capabilities in mind; developers can create applications that take full advantage of all the features a mobile device has to offer; an open platform. The current version is Android 14 (Upside Down Cake). Devices support multitasking, multiple cores, NFC mobile payment transactions, and Internet phone calls.
iOS – designed for Apple mobile phones and mobile devices; the current version is iOS 17. It supports multitasking and includes the Safari Web browser, the Siri intelligent assistant, FaceTime video calling, AirDrop to send items to others, and apps for email, messaging, music, and search.
BlackBerry OS and BlackBerry PlayBook OS – designed for BlackBerry devices.
Mobile Linux – other mobile operating systems based on Linux besides Android, such as Ubuntu, webOS, Firefox OS, and Tizen.

Operating Systems for Larger Computers
Larger computers sometimes use operating systems designed solely for that type of system; for example, IBM's z/OS is designed for IBM mainframes. Windows, UNIX, and Linux are also used with servers, mainframes, and supercomputers. Larger computers may also use a customized operating system based on a conventional operating system.

Utility Programs
A utility program is software that performs a specific task, usually related to managing or maintaining the computer system. Many utilities are built into operating systems (for finding files, viewing images, backing up files, etc.); utilities are also available as stand-alone products and as suites.

File Management Programs – enable the user to perform file management tasks, such as looking at the contents of a storage medium; copying, moving, and renaming files and folders; and deleting files and folders. The file management program in Windows 8 is File Explorer: to copy or move files, use the Home tab to copy (or cut) and then paste; to delete files, use the Delete key on the keyboard or the Home tab.

Search Tools – designed to search for documents and other files on the user's hard drive. Windows 8 has the Search charm to search for files, apps, and Store items. Search tools are often integrated into file management programs, and third-party search tools are also available.

Diagnostic and Disk Management Programs – diagnostic programs evaluate your system and make recommendations for fixing any errors found; disk management programs diagnose and repair problems related to your hard drive.

Uninstall and Cleanup Utilities – uninstall utilities remove programs from your hard drive without leaving bits and pieces behind; it is important to properly uninstall programs, not just delete them. Cleanup utilities delete temporary files, such as files still in the Recycle Bin, temporary Internet files, and temporary installation files.

File Compression Programs – reduce the size of files to optimize storage space and transmission time; they both zip and unzip files. Examples include WinZip (for Windows users) and StuffIt (for Mac users).

Backup and Recovery Utilities – make the backup and restoration process easier. Creating a backup means making a duplicate copy of important files; you can use a recordable or rewritable CD or DVD disc, a USB flash drive, or an external hard drive. Good backup procedures are critical for everyone: individuals should back up important documents, e-mail, photos, home videos, etc. Performing a backup can include backing up an entire computer (so it can be restored at a later date). You can do the backup manually or use backup utility programs (stand-alone or built into operating systems), and you can also back up individual files as they are modified.

Antivirus, Antispyware, Firewalls, and Other Security Programs – security concerns include viruses, spyware, identity theft, and phishing schemes. Security programs protect computers and users, and it is essential that all computer users protect themselves and their computers. These programs include antivirus programs, antispyware programs, and firewalls; many are included in Windows and other operating systems.

The Future of Operating Systems
Operating systems will continue to become more user-friendly and will eventually be driven primarily by voice, touch, and/or gesture interfaces. They are likely to continue to become more stable and self-healing, and to include security and other technological improvements as they become available. They will almost certainly include improvements in synchronizing and coordinating data and activities among a person's various computing and communication devices, and they may be used primarily to access software available through the Internet or other networks.

CAVITE STATE UNIVERSITY: A Quick Overview
Vision: The Premier University in historic Cavite recognized for excellence in the development of globally competitive and morally upright individuals.
Mission: Cavite State University shall provide excellent, equitable, and relevant educational opportunities in the arts, sciences, and technology through quality instruction and responsive research and development activities. It shall produce professional, skilled, and morally upright individuals for global competitiveness.
Truth ▪ Service ▪ Excellence

Quality Policy
“We Commit to the highest standards of education, Value our stakeholders, Strive for continual improvement of our products and services, and Uphold the University's tenets of Truth, Excellence, and Service to produce globally competitive and morally upright individuals.”

DCIT 21A: Introduction to Computing

About the Course
CS, 1st Semester; No. of Units: 3

Course Description
This course provides an overview of the computing industry and computing profession, including research and applications in different fields; an appreciation of computing in different fields such as biology, sociology, environment, and gaming; an understanding of ACM requirements; an appreciation of the history of computing; and knowledge of the key components of computer systems (organization and architecture), malware, computer security, the Internet and Internet protocols, HTML 4/5, and CSS.

Course Objectives
After completing this course, the students should be able to:
1. Explain fundamental principles, concepts, and evolution of computing systems as they relate to different fields
2. Expound on the recent developments in the different computing knowledge areas
3. Analyze solutions employed by organizations to address different computing issues

Points to Remember! (LECTURE)
▪ Quizzes and long examinations will be done face-to-face (F2F) by all students enrolled in the course. Quizzes are either announced or unannounced.
▪ Failure to take a quiz or submit an activity at the scheduled time automatically merits a grade of zero.
▪ A student who fails to take a long examination at the scheduled time will have a chance to take a different set of the exam; the student will need an excuse letter from his/her guardian and valid proof(s) for being absent on the given schedule.
▪ Take down notes. Whatever is discussed during the lecture time will be used for the course assessments.
▪ You are not allowed to take a picture/video/sound recording during the class.
▪ Avoid unnecessary noise or disruptive behavior, especially during lecture and/or laboratory classes.
▪ Students who are 15 minutes late are not allowed to enter the class.

Points to Remember! (LABORATORY)
▪ Laboratory activities will be performed in the Central Computer Laboratory, particularly in CCL 203, but expect that more library/research work and case studies will be conducted.
▪ Failure to take the laboratory activities means a grade of zero.
▪ A student who fails to take a practical examination at the scheduled time will have a chance to take a different set of the exam; the student will need an excuse letter from his/her guardian and valid proof(s) for being absent on the given schedule.

Points to Remember!
▪ You can e-mail me at [email protected], but personal consultation is preferred. Please visit me at the DIT Faculty Room 2 or the CEIT Library.
▪ Messages through FB Messenger will not be entertained.
▪ Announcements and reminders will be posted in our FB group, "1st Semester 2024-2025: AEMalicsi": https://www.facebook.com/groups/dcit21aemalicsi20242025/
▪ Individuals who are not enrolled in this course should not join nor be added as members of the group.

Reminders:
1. Be polite. Use po and opo when communicating online or in person, not only with your instructors but also with your elders anywhere. Be polite in words and in manners. Make it a habit at all times.
2. Use your CvSU email address for communication. Please visit the link, read the content, and download the presentation there to learn proper e-mail writing as a student: https://www.hmhco.com/blog/how-to-teach-email-writing-to-students
3. Avoid plagiarism. Turnitin, a plagiarism detection service, will be used to check the originality of students' works.
4. Do your assessments independently. Do not share your answers during a quiz or examination. The unity of a class studying together and being kind is good, but tolerating others who rely on your efforts isn't.
Individual efforts to study should be exerted for them to learn on their own and achieve excellence.
5. Always uphold the University tenets of Truth, Service, and Excellence.

Grading System
Lecture – 100% in total
▪ Midterm Examination – 20%
▪ Final Examination – 20%
▪ Attendance – 10%
▪ Outputs/Portfolio – 25%
▪ Quizzes/Long Exams – 15%
▪ Class Participation – 10%
Laboratory – 100% in total
▪ Laboratory Reports – 50%
▪ Practical/Written Exam – 30%
▪ Attendance/Participation – 20%

FINAL GRADE = (Lecture * 60%) + (Laboratory * 40%)

96.7 – 100.0 → 1.00
93.4 – 96.6 → 1.25
90.1 – 93.3 → 1.50
86.7 – 90.0 → 1.75
83.4 – 86.6 → 2.00
80.1 – 83.3 → 2.25
76.7 – 80.0 → 2.50
73.4 – 76.6 → 2.75
70.0 – 73.3 → 3.00
50.0 – 69.9 → 4.00
Below 50 → 5.00
INC – Passed the course but lacks some requirements.
Dropped – If unexcused absences reach at least 20% of the total class hours.

Total Class Hours/Semester: Lecture – 18 hrs., Lab – 108 hrs.

Introduction to the BSCS and Other Related Programs

CMO No. 25, series of 2015
The subject of the said CMO is "REVISED POLICIES, STANDARDS, AND GUIDELINES FOR BACHELOR OF SCIENCE IN COMPUTER SCIENCE (BSCS), BACHELOR OF SCIENCE IN INFORMATION SYSTEMS (BSIS), AND BACHELOR OF SCIENCE IN INFORMATION TECHNOLOGY (BSIT) PROGRAMS".

The field of computing is ever dynamic; its advancement and development have been rapid, and its evolution is a continuous process (O'Brien, 2008). To face the challenges of advancement, the Commission recognizes the need to be responsive to the current needs of the country. It is essential that the country's computing capability be continually developed and strengthened to be at par globally.

Bachelor of Science in Computer Science (BSCS)
The BS Computer Science program includes the study of computing concepts and theories, algorithmic foundations, and new developments in computing.
The program prepares students to design and create algorithmically complex software and develop new and effective algorithms for solving computing problems. The program also includes the study of the standards and practices in software engineering. It prepares students to acquire the skills and discipline required for designing, writing, and modifying software components, modules, and applications that comprise software solutions.

Bachelor of Science in Information Systems (BSIS)
The BS Information Systems program includes the study of the application and effect of information technology on organizations. Graduates of the program should be able to implement an information system, taking into account the complex technological and organizational factors affecting it, including components, tools, techniques, strategies, and methodologies. Graduates are able to help an organization determine how information and technology-enabled business processes can be used as strategic tools to achieve a competitive advantage. As a result, IS professionals require a sound understanding of organizational principles and practices so that they can serve as an effective bridge between the technical and management/user communities within an organization. This enables them to ensure that the organization has the information and the systems it needs to support its operations.

Bachelor of Science in Information Technology (BSIT)
The BS Information Technology program includes the study of the utilization of both hardware and software technologies, involving planning, installing, customizing, operating, managing, administering, and maintaining the information technology infrastructure that provides computing solutions to address the needs of an organization. The program prepares graduates to address various user needs involving the selection, development, application, integration, and management of computing technologies within an organization.

What is the goal of the programs?
The BSCS, BSIS, and BSIT graduates are expected to become globally competent, innovative, and socially and ethically responsible computing professionals engaged in life-long learning endeavours. They are capable of contributing to the country's national development goals.

Professions/Careers for BSCS Graduates
Primary Job Roles: Software Engineer, Systems Software Developer, Research and Development Computing Professional, Applications Software Developer, Computer Programmer
Secondary Job Roles: Systems Analyst, Data Analyst, Quality Assurance Specialist, Software Support Specialist

Professions/Careers for BSIS Graduates
Primary Job Roles: Organizational Process Analyst, Data Analyst, Solutions Specialist, Systems Analyst, IS Project Management Personnel
Secondary Job Roles: Applications Developer, End User Trainer, Documentation Specialist, Quality Assurance Specialist

Professions/Careers for BSIT Graduates
Primary Job Roles: Web and Applications Developer, Junior Database Administrator, Systems Administrator, Network Engineer, Junior Information Security Administrator, Systems Integration Personnel, IT Audit Assistant, Technical Support Specialist
Secondary Job Roles: QA Specialist, Systems Analyst, Computer Programmer

Thanks! See you next week.

Computing Disciplines

Computer Science (CS)
The Computer Science (CS) program includes the study of computing concepts and theories, algorithmic foundations, and new developments in computing. The program prepares students to design and create algorithmically complex software and develop new and effective algorithms for solving computing problems. The program also includes the study of the standards and practices in software engineering. It prepares students to acquire the skills and discipline required for designing, writing, and modifying software components, modules, and applications that comprise software solutions.
Information Systems (IS)
The Information Systems (IS) program includes the study of the application and effect of information technology on organizations. Graduates of the program should be able to implement an information system, taking into account the complex technological and organizational factors affecting it, including components, tools, techniques, strategies, and methodologies. Graduates are able to help an organization determine how information and technology-enabled business processes can be used as strategic tools to achieve a competitive advantage. As a result, IS professionals require a sound understanding of organizational principles and practices so that they can serve as an effective bridge between the technical and management/user communities within an organization. This enables them to ensure that the organization has the information and the systems it needs to support its operations.

Information Technology (IT)
The Information Technology (IT) program includes the study of the utilization of both hardware and software technologies, involving planning, installing, customizing, operating, managing, administering, and maintaining the information technology infrastructure that provides computing solutions to address the needs of an organization. The program prepares graduates to address various user needs involving the selection, development, application, integration, and management of computing technologies within an organization.

Computer Engineering (CpE)
Computer Engineering (CpE) is a program that embodies the science and technology of design, development, implementation, maintenance, and integration of software and hardware components in modern computing systems and computer-controlled equipment.
Data Science
Data Science (DS) is an interdisciplinary program designed to equip its graduates with integrated skill sets spanning mathematics, statistics, machine learning, databases, and other branches of computer science, with the aim of extracting new knowledge from data in various forms in order to provide actionable insights for decision makers in data-driven industries and sectors.

CS Computing Domains
▪ Algorithms and complexities
✓ Design and analysis of algorithms
✓ Automata theory and formal languages
✓ Computational science
▪ Architecture and organization
▪ Discrete structures
✓ Logic, sets, relations, functions, and proof techniques
✓ Graphs, trees, matrices, combinatorics, and recurrences
▪ Human–computer interaction
✓ Computer graphics and visual computing
▪ Information assurance and security
▪ Networks and communications
▪ Operating systems
✓ Parallel and distributed computing
▪ Programming languages (design and implementation)
▪ Software development
✓ Data structures and algorithms
✓ Object-oriented programming
▪ Software engineering
✓ Analysis and design
✓ Implementation and management
✓ Intelligent systems
▪ Social issues and professional practice

Job Roles for BSCS

Primary Job Roles
▪ Software Engineer – a computer science professional who uses engineering principles and programming languages to build software products, develop web and mobile applications, and run network control systems.
▪ Systems Software Developer – develops and executes software and applications for IT system components that are concealed from the public but contribute to the smooth operation of organizations.
▪ Research and Development Computing Professional – continually creates new innovations in computing technology and develops solutions to enhance the efficiency of old technology.
With experience, these scientists are usually offered significant salaries and broad exposure to career prospects as well.
▪ Applications Software Developer – in charge of developing software systems and applications based on clients' specifications or business needs. The use of system tools and programming code enables application software developers to customize programs, implement software solutions, modify test code, and update existing applications in order to improve efficiency and performance.
▪ Computer Programmer – writes, edits, and tests the code and scripts necessary for the correct operation of computer programs and applications. Programmers translate plans produced by engineers and software developers into computer-readable instructions.

Secondary Job Roles
▪ Systems Analyst – ensures that computer systems and infrastructure are operating as effectively and efficiently as possible inside a company. To satisfy requirements, systems analysts are responsible for investigating issues, identifying solutions, suggesting courses of action, and collaborating with stakeholders.
▪ Data Analyst – collects, organizes, and interprets statistical information to help colleagues and clients use it to make decisions.
▪ Quality Assurance Specialist – accountable for overseeing, examining, and recommending changes to a company's final products and processes to ensure they adhere to set quality standards.
▪ Software Support Specialist – responsible for ensuring that employees within a company have access to and are proficient in using company software.

Application of Computer Science in Different Fields

United Nations Sustainable Development Goals
92nd out of 167 countries

SDG 1. NO POVERTY
Application of CS: Mobile banking platforms enhance financial inclusion by providing access to banking services in remote areas, helping marginalized communities build financial resilience.
Blockchain technology is being used to improve transparency and accountability in financial aid distribution, ensuring that resources reach those in need.

SDG 2. ZERO HUNGER
Application of CS: Agriculture is benefiting from advancements in artificial intelligence (AI) and machine learning (ML). AI-driven predictive models are used to optimize crop yields, analyze soil health, and monitor environmental factors.

SDG 3. GOOD HEALTH AND WELL-BEING
Application of CS: AI and ML are revolutionizing healthcare through personalized medicine, diagnostics, and disease prediction models. Telemedicine platforms are enabling remote healthcare delivery, increasing access to medical services in underserved areas.

SDG 4. QUALITY EDUCATION
Application of CS: Digital education platforms and AI tutoring systems are enhancing learning opportunities globally.

SDG 5. GENDER EQUALITY
Application of CS: Research on AI algorithms for job matching and skill development has shown that these tools can help close the gender gap in industries where women are underrepresented, such as STEM fields.

SDG 6. CLEAN WATER AND SANITATION
Application of CS: IoT (Internet of Things) and AI are being used to monitor and manage water resources efficiently.

SDG 7. AFFORDABLE AND CLEAN ENERGY
Application of CS: AI-based models are helping optimize energy production from renewable sources such as solar and wind. Research has demonstrated how AI and big data are used to predict energy demand and enhance the efficiency of energy grids.

SDG 8. DECENT WORK AND ECONOMIC GROWTH
Application of CS: Freelance marketplace platforms allow individuals to find remote work opportunities globally. AI and automation are helping create new job opportunities while improving labor market efficiency.

SDG 9. INDUSTRY, INNOVATION, & INFRASTRUCTURE
Application of CS: Blockchain, AI, and cloud computing are transforming industries by improving operational efficiency and creating new business models.

SDG 10.
REDUCED INEQUALITIES
Application of CS: Data mining/AI is being used to develop more inclusive systems by analyzing social patterns and ensuring equitable access to services like healthcare, education, and financial services.

OTHER SDGs

Thanks! For questions and concerns, email me at [email protected]