IM - COMP 001 - Introduction to Computing.pdf
Republic of the Philippines
POLYTECHNIC UNIVERSITY OF THE PHILIPPINES
SANTA ROSA CAMPUS
City of Santa Rosa, Laguna

INSTRUCTIONAL MATERIAL
COMP 001 INTRODUCTION TO COMPUTING
Compiled by: Inst. Owen Harvey Balocon, Faculty, Bachelor of Science in Information Technology

Chapter 1. Introduction to Information Technology

Lesson 1.1 Introduction and History of Computer Technology

The electronic computer is one of the most important developments of the twentieth century. Like the industrial revolution of the nineteenth century, the computer and the information and communication technology built upon it have drastically changed business, culture, government and science, and have touched nearly every aspect of our lives. This text introduces the field of computing and details the fundamental concepts and practices used in the development of computer applications.

"We're changing the world with technology." – Bill Gates

Analog computers
Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.
One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in "real time"; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.

Digital computers
In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s (binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States, Britain, and Germany. The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage. For historical developments, see the section Invention of the modern computer.

Mainframe computer
These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer.
Other users submitted "batch jobs" to be run one at a time on the mainframe.

Supercomputer
The most powerful computers of the day have typically been called supercomputers. They have historically been very expensive and their use limited to high-priority computations for government-sponsored research, such as nuclear simulations and weather modeling. Today many of the computational techniques of early supercomputers are in common use in PCs. On the other hand, the design of costly, special-purpose processors for supercomputers has been supplanted by the use of large arrays of commodity processors (from several dozen to over 8,000) operating in parallel over a high-speed communications network.

Minicomputer
Although minicomputers date to the early 1950s, the term was introduced in the mid-1960s. Relatively small and inexpensive, minicomputers were typically used in a single department of an organization and often dedicated to one task or shared by a small group. Minicomputers generally had limited computational power, but they had excellent compatibility with various laboratory and industrial devices for collecting and inputting data.
One of the most important manufacturers of minicomputers was Digital Equipment Corporation (DEC) with its Programmed Data Processor (PDP). In 1960 DEC's PDP-1 sold for $120,000. Five years later its PDP-8 cost $18,000 and became the first widely used minicomputer, with more than 50,000 sold. The DEC PDP-11, introduced in 1970, came in a variety of models, small and cheap enough to control a single manufacturing process and large enough for shared use in university computer centres; more than 650,000 were sold. However, the microcomputer overtook this market in the 1980s.

Microcomputer
A microcomputer is a small computer built around a microprocessor integrated circuit, or chip. Whereas the early minicomputers replaced vacuum tubes with discrete transistors, microcomputers (and later minicomputers as well) used microprocessors that integrated thousands or millions of transistors on a single chip. In 1971 the Intel Corporation produced the first microprocessor, the Intel 4004, which was powerful enough to function as a computer although it was produced for use in a Japanese-made calculator. In 1975 the first personal computer, the Altair, used a successor chip, the Intel 8080 microprocessor. Like minicomputers, early microcomputers had relatively limited storage and data-handling capabilities, but these have grown as storage technology has improved alongside processing power.

Embedded processors
Another class of computer is the embedded processor. These are small computers that use simple microprocessors to control electrical and mechanical functions. They generally do not have to do elaborate computations or be extremely fast, nor do they have to have great "input-output" capability, and so they can be inexpensive. Embedded processors help to control aircraft and industrial automation, and they are common in automobiles and in both large and small household appliances. One particular type, the digital signal processor (DSP), has become as prevalent as the microprocessor.
DSPs are used in wireless telephones, digital telephone and cable modems, and some stereo equipment.

GENERATIONS OF COMPUTERS

The First Generation
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The Second Generation
Transistors replaced vacuum tubes and ushered in the second generation of computers. One transistor replaced the equivalent of 40 vacuum tubes, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable, although they still generated a great deal of heat that could damage the computer.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. Second-generation computers still relied on punched cards for input and printouts for output. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The Third Generation
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Third-generation computers were much smaller and cheaper than second-generation computers and could carry out instructions in billionths of a second.
Users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

The Fourth Generation
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.
Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.

The Fifth Generation
Fifth-generation computing devices, based on artificial intelligence (AI), are still in development. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. The goal is to develop devices that respond to natural language input and are capable of learning and self-organization. Some applications, such as voice recognition, are already in use today.

WISDOM OF THIS LESSON: Let us contribute to the advancement, evolution, and betterment of the human race.

Lesson 1.2 Understanding Information Technology

What is IT?
Information technology (IT) refers to the use of computers, software, hardware, networks, and other digital technologies to store, retrieve, transmit, and manipulate data or information. IT encompasses a wide range of technologies and practices that are used to manage and process information effectively in various domains, including business, healthcare, education, entertainment, and more.

Here are some key components and aspects of information technology:
1. Hardware: This includes computers, servers, storage devices, and networking equipment. Hardware forms the physical infrastructure on which IT systems and applications run.
2. Software: IT involves the development, installation, and maintenance of software applications and systems. This includes operating systems, productivity software, databases, and custom software solutions.
3. Networking: IT relies heavily on networks to connect devices and enable data communication. This includes local area networks (LANs), wide area networks (WANs), the internet, and various network protocols.
4. Data Management: IT professionals are responsible for managing data, which includes data storage, retrieval, backup, and security. Databases and data centers play a crucial role in this aspect.
5. Cybersecurity: IT security is a critical component of information technology. It involves protecting systems, networks, and data from unauthorized access, breaches, and cyber threats.
6. Cloud Computing: Cloud technology has become a central part of IT, allowing organizations to access and manage resources (such as servers, storage, and applications) remotely over the internet.
7. Programming and Development: IT professionals often engage in software development, coding, and programming to create custom solutions or modify existing software to meet specific needs.
8. IT Support: IT support teams provide assistance to users and organizations, helping them resolve technical issues and ensuring that IT systems run smoothly.
9. Digital Transformation: IT plays a pivotal role in modernizing and transforming businesses and organizations by enabling automation, data analysis, and the adoption of emerging technologies like artificial intelligence (AI) and the Internet of Things (IoT).
10. Project Management: Managing IT projects is crucial to ensure that technology implementations are completed on time and within budget while meeting the desired objectives.
11. Data Analytics and Business Intelligence: IT is instrumental in collecting and analyzing data to derive insights and support decision-making processes.
12. Mobile Technology: The proliferation of smartphones and mobile devices has expanded the scope of IT to include mobile app development and mobile device management.
13. E-commerce and Online Services: IT powers online businesses and services, from e-commerce platforms to social media networks.

Information technology is an ever-evolving field, and it continues to shape and revolutionize the way individuals and organizations operate and interact in today's digital age. It has become an integral part of our daily lives and has a significant impact on nearly every aspect of modern society.

The Differences of Information Technology, Computer Science, Computer Engineering, and Information Systems
Information Technology (IT), Computer Science (CS), Computer Engineering (CE), and Information Systems (IS) are related fields but have distinct focuses and areas of expertise.

Lesson 1.3 History of Computing

A computer might be described with deceptive simplicity as "an apparatus that performs routine calculations automatically." Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.
Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who laboured to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming.
They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer.
Early history

The abacus
The earliest known calculating device is probably the abacus. It dates back at least to 1100 BCE and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.
The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.

Analog calculators: from Napier's logarithms to the slide rule
Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. For example, working with base-2 logarithms, multiplying 2 by 8 reduces to adding 1 and 3 and then looking up the number whose logarithm is 4, namely 16. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labour-saving tool for tedious astronomical calculations.

Digital calculators: from the Calculating Clock to the Arithmometer
In 1623 the German astronomer and mathematician Wilhelm Schickard built the first calculator. He described it in a letter to his friend the astronomer Johannes Kepler, and in 1624 he wrote again to explain that a machine he had commissioned to be built for Kepler was, apparently along with the prototype, destroyed in a fire. He called it a Calculating Clock, which modern engineers have been able to reproduce from details in his letters. Even general knowledge of the clock had been temporarily lost when Schickard and his entire family perished during the Thirty Years' War.
The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector, so it was the first business machine too (if one does not count the abacus). He built 50 of them over the next 10 years.
Leibniz was a strong advocate of the binary number system. Binary numbers are ideal for machines because they require only two digits, which can easily be represented by the on and off states of a switch. When computers became electronic, the binary system was particularly appropriate because an electrical circuit is either on or off. This meant that on could represent true, off could represent false, and the flow of current would directly represent the flow of logic.
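To make the two-state idea concrete, the short C sketch below prints the eight bits of a small number by testing one bit at a time; the variable names and the sample value are illustrative, not taken from the lesson.

#include <stdio.h>

/* Print the 8-bit binary form of a small number, most significant bit first.
   Each printed digit is either 1 ("on") or 0 ("off"), mirroring a switch or circuit. */
int main(void)
{
    unsigned char value = 19;               /* 19 in decimal is 00010011 in binary */
    for (int bit = 7; bit >= 0; bit--)
        printf("%d", (value >> bit) & 1);   /* test one bit at a time */
    printf("\n");
    return 0;
}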
The Jacquard Loom
Calculators such as the Arithmometer remained a fascination after 1820, and their potential for commercial use was well understood. Many other mechanical devices built during the 19th century also performed repetitive functions more or less automatically, but few had any application to computing. There was one major exception: the Jacquard loom, invented in 1804–05 by a French weaver, Joseph-Marie Jacquard.

The Difference Engine
Charles Babbage was an English mathematician and inventor: he invented the cowcatcher, reformed the British postal system, and was a pioneer in the fields of operations research and actuarial science. It was Babbage who first suggested that the weather of years past could be read from tree rings. He also had a lifelong fascination with keys, ciphers, and mechanical dolls.

The Analytical Engine
While working on the Difference Engine, Babbage began to imagine ways to improve it. Chiefly he thought about generalizing its operation so that it could perform other kinds of calculations. By the time the funding had run out in 1833, he had conceived of something far more revolutionary: a general-purpose computing machine called the Analytical Engine.

Early business machines

Herman Hollerith's census tabulator
The U.S. Constitution mandates that a census of the population be performed every 10 years. The first attempt at any mechanization of the census was in 1870, when statistical data were transcribed onto a rolling paper tape displayed through a small slotted window. As the size of America's population exploded in the 19th century and the number of census questions expanded, the urgency of further mechanization became increasingly clear.
After graduating from the Columbia University School of Mines, New York City, in 1879, Herman Hollerith obtained his first job with one of his former professors, William P. Trowbridge, who had received a commission as a special agent for the 1880 census. It was while employed at the Census Office that Hollerith first saw the pressing need for automating the tabulation of statistical data.

Howard Aiken's digital calculators
While Bush was working on analog computing at MIT, across town Harvard professor Howard Aiken was working with digital devices for calculation. He had begun to realize in hardware something like Babbage's Analytical Engine, which he had read about. Starting in 1937, he laid out detailed plans for a series of four calculating machines of increasing sophistication, based on different technologies, from the largely mechanical Mark I to the electronic Mark IV.
The Turing machine
Alan Turing, while a mathematics student at the University of Cambridge, was inspired by German mathematician David Hilbert's formalist program, which sought to demonstrate that any mathematical problem can potentially be solved by an algorithm—that is, by a purely mechanical process. Turing interpreted this to mean a computing machine and set out to design one capable of resolving all mathematical problems, but in the process he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem ['Decision Problem']" (1936) that no such universal mathematical solver could ever exist.

Developments during World War II

ENIAC
In the United States, government funding went to a project led by John Mauchly, J. Presper Eckert, Jr., and their colleagues at the Moore School of Electrical Engineering at the University of Pennsylvania; their objective was an all-electronic computer. Under contract to the army and under the direction of Herman Goldstine, work began in early 1943 on the Electronic Numerical Integrator and Computer (ENIAC). The next year, mathematician John von Neumann, already on full-time leave from the Institute for Advanced Study (IAS), Princeton, New Jersey, for various government research projects (including the Manhattan Project), began frequent consultations with the group.

Early computer language development

Machine language
One implication of the stored-program model was that programs could read and operate on other programs as data; that is, they would be capable of self-modification. Konrad Zuse had looked upon this possibility as "making a contract with the Devil" because of the potential for abuse, and he had chosen not to implement it in his machines. But self-modification was essential for achieving a true general-purpose machine.

Interpreters
HLL coding was attempted right from the start of the stored-program era in the late 1940s. Shortcode, or short-order code, was the first such language actually implemented. Suggested by John Mauchly in 1949, it was implemented by William Schmitt for the BINAC computer in that year and for UNIVAC in 1950. Shortcode went through multiple steps: first it converted the alphabetic statements of the language to numeric codes, and then it translated these numeric codes into machine language. It was an interpreter, meaning that it translated HLL statements and executed, or performed, them one at a time—a slow process. Because of their slow execution, interpreters are now rarely used outside of program development, where they may help a programmer to locate errors quickly.

Compilers
An alternative to this approach is what is now known as compilation. In compilation, the entire HLL program is converted to machine language and stored for later execution. Although translation may take many hours or even days, once the translated program is stored, it can be recalled anytime in the form of a fast-executing machine-language program.

FORTRAN, COBOL, and ALGOL
FORTRAN took another step toward making programming more accessible, allowing comments in the programs.
The ability to insert annotations, marked to be ignored by the translator program but readable by a human, meant that a well-annotated program could be read in a certain sense by people with no programming knowledge at all. For the first time a nonprogrammer could get an idea what a program did—or at least what it was intended to do—by reading (part of) the code. It was an obvious but powerful step in opening up computers to a wider audience.
By 1954 Backus and a team of programmers had designed the language, which they called FORTRAN (Formula Translation). Programs written in FORTRAN looked a lot more like mathematics than machine instructions.

COBOL
About the time that Backus and his team invented FORTRAN, Hopper's group at UNIVAC released Math-matic, a FORTRAN-like language for UNIVAC computers. It was slower than FORTRAN and not particularly successful. Another language developed at Hopper's laboratory at the same time had more influence: Flow-matic used a more English-like syntax and vocabulary.

ALGOL
During the late 1950s a multitude of programming languages appeared. This proliferation of incompatible specialized languages spurred an interest in the United States and Europe to create a single "second-generation" language. A transatlantic committee soon formed to determine specifications for ALGOL (Algorithmic Language), as the new language would be called. Backus, on the American side, and Heinz Rutishauser, on the European side, were among the most influential committee members.

ACTIVITY: To understand the mentioned programming languages, print "Hello World" on your computer using C programming. A minimal sketch follows this activity.
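As a starting point for the activity, here is a minimal standard-C sketch; it uses only the C standard library (older Turbo C examples add the compiler-specific clrscr() and getch() calls discussed later in this material).

#include <stdio.h>

int main(void)
{
    printf("Hello World\n");   /* print the required text to the console */
    return 0;
}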
OPERATING SYSTEMS

Control programs
In order to make the early computers truly useful and efficient, two major innovations in software were needed. One was high-level programming languages (as described in the preceding section, FORTRAN, COBOL, and ALGOL). The other was control. Today the systemwide control functions of a computer are generally subsumed under the term operating system, or OS. An OS handles the behind-the-scenes activities of a computer, such as orchestrating the transitions from one program to another and managing access to disk storage and peripheral devices.

The IBM 360
IBM had been selling business machines since early in the century and had built Howard Aiken's computer to his architectural specifications. But the company had been slow to implement the stored-program digital computer architecture of the early 1950s. It did develop the IBM 650, a (like UNIVAC) decimal implementation of the IAS plan—and the first computer to sell more than 1,000 units.
The invention of the transistor in 1947 led IBM to reengineer its early machines from electromechanical or vacuum tube to transistor technology in the late 1950s (although the UNIVAC Model 80, delivered in 1958, was the first transistor computer). These transistorized machines are commonly referred to as second-generation computers.

Minicomputers
About 1965, roughly coterminous with the development of time-sharing, a new kind of computer came on the scene. Small and relatively inexpensive (typically one-tenth the cost of the Big Iron machines), the new machines were stored-program computers with all the generality of the computers then in use but stripped down. The new machines were called minicomputers. (About the same time, the larger traditional computers began to be called mainframes.) Minicomputers were designed for easy connection to scientific instruments and other input/output devices, had a simplified architecture, were implemented using fast transistors, and were typically programmed in assembly language with little support for high-level languages.

THE PERSONAL COMPUTER REVOLUTION

Before 1970, computers were big machines requiring thousands of separate transistors. They were operated by specialized technicians, who often dressed in white lab coats and were commonly referred to as a computer priesthood. The machines were expensive and difficult to use. Few people came in direct contact with them, not even their programmers. The typical interaction was as follows: a programmer coded instructions and data on preformatted paper, a keypunch operator transferred the data onto punch cards, a computer operator fed the cards into a card reader, and the computer executed the instructions or stored the cards' information for later processing. Advanced installations might allow users limited interaction with the computer more directly, but still remotely, via time-sharing through the use of cathode-ray tube terminals or teletype machines.

The microprocessor

Commodore and Tandy enter the field
In late 1976 Commodore Business Machines, an established electronics firm that had been active in producing electronic calculators, bought a small hobby-computer company named MOS Technology. For the first time, an established company with extensive distribution channels would be selling a microcomputer.

Apple Inc.
Like the founding of the early chip companies and the invention of the microprocessor, the story of Apple is a key part of Silicon Valley folklore. Two whiz kids, Stephen G. Wozniak and Steven P. Jobs, shared an interest in electronics. Wozniak was an early and regular participant at Homebrew Computer Club meetings (see the earlier section, The Altair), which Jobs also occasionally attended.

The graphical user interface
In 1982 Apple introduced its Lisa computer, a much more powerful computer with many innovations. The Lisa used a more advanced microprocessor, the Motorola 68000. It also had a different way of interacting with the user, called a graphical user interface (GUI). The GUI replaced the typed command lines common on previous computers with graphical icons on the screen that invoked actions when pointed to by a handheld pointing device called the mouse. The Lisa was not successful, but Apple was already preparing a scaled-down, lower-cost version called the Macintosh. Introduced in 1984, the Macintosh became wildly successful and, by making desktop computers easier to use, further popularized personal computers.
The IBM Personal Computer
The entry of IBM did more to legitimize personal computers than any event in the industry's history. By 1980 the personal computer field was starting to interest the large computer companies. Hewlett-Packard, which had earlier turned down Stephen G. Wozniak's proposal to enter the personal computer field, was now ready to enter this business, and in January 1980 it brought out its HP-85. Hewlett-Packard's machine was more expensive ($3,250) than those of most competitors, and it used a cassette tape drive for storage while most companies were already using disk drives. Another problem was its closed architecture, which made it difficult for third parties to develop applications or software for it.

Microsoft's Windows operating system
In 1985 Microsoft came out with its Windows operating system, which gave PC compatibles some of the same capabilities as the Macintosh. Year after year, Microsoft refined and improved Windows so that Apple, which failed to come up with a significant new advantage, lost its edge. IBM tried to establish yet another operating system, OS/2, but lost the battle to Gates's company. In fact, Microsoft also had established itself as the leading provider of application software for the Macintosh. Thus Microsoft dominated not only the operating system and application software business for PC-compatibles but also the application software business for the only nonstandard system with any sizable share of the desktop computer market. In 1998, amid a growing chorus of complaints about Microsoft's business tactics, the U.S. Department of Justice filed a lawsuit charging Microsoft with using its monopoly position to stifle competition.

Handheld digital devices
It happened by small steps. The popularity of the personal computer and the ongoing miniaturization of the semiconductor circuitry and other devices first led to the development of somewhat smaller, portable—or, as they were sometimes called, luggable—computer systems. The first of these, the Osborne 1, designed by Lee Felsenstein, an electronics engineer active in the Homebrew Computer Club in San Francisco, was sold in 1981. Soon most PC manufacturers had portable models. At first these portables looked like sewing machines and weighed in excess of 20 pounds (9 kg). Gradually they became smaller (laptop-, notebook-, and then sub-notebook-size) and came with more powerful processors. These devices allowed people to use computers not only in the office or at home but also while traveling—on airplanes, in waiting rooms, or even at the beach.

ONE INTERCONNECTED WORLD

The Internet
The Internet grew out of funding by the U.S. Advanced Research Projects Agency (ARPA), later renamed the Defense Advanced Research Projects Agency (DARPA), to develop a communication system among government and academic computer-research laboratories. The first network component, ARPANET, became operational in October 1969.
With only 15 nongovernment (university) sites included in ARPANET, the U.S. National Science Foundation decided to fund the construction and initial maintenance cost of a supplementary network, the Computer Science Network (CSNET).

E-commerce
Early enthusiasm over the potential profits from e-commerce led to massive cash investments and a "dot-com" boom-and-bust cycle in the 1990s. By the end of the decade, half of these businesses had failed, though certain successful categories of online business had been demonstrated, and most conventional businesses had established an online presence. Search and online advertising proved to be the most successful new business areas.

Social networking
Social networking services emerged as a significant online phenomenon in the 2000s. These services used software to facilitate online communities, where members with shared interests swapped files, photographs, videos, and music, sent messages and chatted, set up blogs (Web diaries) and discussion groups, and shared opinions. Early social networking services included Classmates.com, which connected former schoolmates, and Yahoo! 360°, Myspace, and SixDegrees, which built networks of connections via friends of friends. By 2018 the leading social networking services included Facebook, Twitter, Instagram, LinkedIn and Snapchat. LinkedIn became an effective tool for business staff recruiting. Businesses began exploring how to exploit these networks, drawing on social networking research and theory which suggested that finding key "influential" members of existing networks of individuals could give access to and credibility with the whole network.

Quantum Computing: The New Generation of Computing and Technology
Quantum computing is a cutting-edge field of computing that leverages the principles of quantum mechanics to process and store information in a fundamentally different way than classical computers. While classical computers use bits (0s and 1s) as the basic units of information, quantum computers use quantum bits, or qubits. Quantum computing has the potential to revolutionize fields such as cryptography, optimization, materials science, drug discovery, and more. However, building practical and scalable quantum computers is a significant technological challenge, and many research and development efforts are ongoing in both academia and industry to harness the power of quantum computing for real-world applications.

Chapter 2. Information Technology Advancement

Lesson 2.1 Human Computer Interaction

HCI helps us to understand why some software products are good and other software is bad. But sadly, it is not a guaranteed formula for creating a successful product. In this sense it is like architecture or product design. Architects and product designers need a thorough technical grasp of the materials they work with, but the success of their work depends on the creative application of this technical knowledge.
This creativity is a craft skill that is normally learned by working with a master designer in a studio, or from case studies of successful designs. A computer science course does not provide sufficient time for this kind of training in creative design, but it can provide the essential elements: an understanding of the user's needs, and an understanding of potential solutions.

"Part of the inhumanity of the computer is that, once it is competently programmed and working smoothly, it is completely honest." – Isaac Asimov

There are many different approaches to the study and design of user interfaces, including Interaction Design, User Experience Design (UX), Interactive Systems Design, Cognitive Ergonomics, Man-Machine Interface (MMI), User Interface Design (UI), Human Factors, Cognitive Task Design, Information Architecture (IA), Software Product Design, Usability Engineering, User-Centred Design (UCD) and Computer Supported Collaborative Work (CSCW).

VISUAL REPRESENTATION

1. Typography and text
For many years, computer displays resembled paper documents. This does not mean that they were simplistic or unreasonably constrained. On the contrary, most aspects of modern industrial society have been successfully achieved using the representational conventions of paper, so those conventions seem to be powerful ones.

2. Maps and graphs
Basic diagrammatic conventions rely on quantitative correspondence between a direction on the surface and a continuous quantity such as time or distance. These should follow established conventions of maps and graphs.

3. Schematic drawings
Engineering drawing conventions allow schematic views of connected components to be shown in relative scale, and with text annotations labelling the parts. White space in the representation plane can be used to help the reader distinguish elements from each other rather than directly representing physical space.

4. Pictures
Pictorial representations, including line drawings, paintings, perspective renderings and photographs, rely on shared interpretive conventions for their meaning. It is naïve to treat screen representations as though they were simulations of experience in the physical world.

5. Icons and symbols
The design of simple and memorable visual symbols is a sophisticated graphic design skill. Following established conventions is the easiest option, but new symbols must be designed with an awareness of what sort of correspondence is intended - pictorial, symbolic, metonymic (e.g. a key to represent locking), bizarrely mnemonic, but probably not monolingual puns.

ADVANCED USER INTERFACE TECHNIQUES

Virtual reality (VR)
The term virtual reality originally applied only to full-immersion VR, in which a simulated world is projected onto all walls of a room (CAVE – a recursive acronym for Cave Automatic Virtual Environment), or viewed via a head-mounted display (HMD), which uses motion tracking to change the view as you turn your head.
Interaction was always a challenge – data gloves could supposedly be used to pick up and interact with objects in the virtual scene.

Augmented reality
Augmented reality (AR) systems overlay digital information onto the real world, either using partially transparent head-mounted displays, or by taking a video feed of an actual scene and compositing it with computer-generated elements. (This is now becoming available in a few mobile phone applications, using the phone camera to provide the video feed.)

Tangible user interfaces
Tangible user interfaces (TUIs) use physical objects to control the computer, most often a collection of objects arranged on a tabletop to act as 'physical icons'. An immediate problem is that physical objects don't change their visible state very easily. You can include motors and displays in each object (expensive), or project overlaid AR information onto them, or just use them as multiple specialized mice/pucks that control elements of the display on a separate screen. In this case, it is necessary to track their positions, perhaps by using a large tablet device. If they are just being used as tokens to select a particular function or piece of data, an embedded RFID chip can be used to sense when they are placed within a certain distance of a reader.

Lesson 2.2 Introduction to Programming Languages

A programming language is a set of instructions and syntax used to create software programs. Some of the key features of programming languages include:
1. Syntax: The specific rules and structure used to write code in a programming language.
2. Data Types: The types of values that can be stored in a program, such as numbers, strings, and booleans.
3. Variables: Named memory locations that can store values.
4. Operators: Symbols used to perform operations on values, such as addition, subtraction, and comparison.
5. Control Structures: Statements used to control the flow of a program, such as if-else statements, loops, and function calls.
6. Libraries and Frameworks: Collections of pre-written code that can be used to perform common tasks and speed up development.
7. Paradigms: The programming style or philosophy used in the language, such as procedural, object-oriented, or functional.
Several of these features appear together in the short sketch at the end of this passage.

Examples of popular programming languages include Python, Java, C++, JavaScript, and Ruby. Each language has its own strengths and weaknesses and is suited for different types of projects.
A programming language is a formal language that specifies a set of instructions for a computer to perform specific tasks. It is used to write software programs and applications, and to control and manipulate computer systems.

Are you aiming to become a software engineer one day? Do you also want to develop a mobile application that people all over the world would love to use? Are you passionate enough to take the big step to enter the world of programming?
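To tie the feature list above together, here is a small illustrative C sketch (the names and values are made up for this example) that uses variables, data types, operators, and a control structure:

#include <stdio.h>

int main(void)
{
    int score = 78;                 /* variable of integer data type */
    double bonus = 2.5;             /* variable of floating-point data type */
    double total = score + bonus;   /* arithmetic operator */

    if (total >= 75.0)              /* control structure using a comparison operator */
        printf("Passed with %.1f points\n", total);
    else
        printf("Failed with %.1f points\n", total);

    return 0;
}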
The basic components of a computer are:
Input unit
Central Processing Unit (CPU)
Output unit
The CPU is further divided into three parts:
Memory unit
Control unit
Arithmetic Logic Unit

Most of us have heard that the CPU is called the brain of our computer because it accepts data, provides temporary memory space for it until it is stored (saved) on the hard disk, performs logical operations on it, and hence processes (here, converts) data into information.
We all know that a computer consists of hardware and software. Software is a set of programs that performs multiple tasks together. An operating system is also software (system software) that helps humans interact with the computer system. A program is a set of instructions given to a computer to perform a specific operation; a computer is a computational device that is used to process data under the control of a computer program. While executing the program, raw data is processed into the desired output format. These computer programs are written in programming languages, which are high-level languages. High-level languages are close to human languages and are more complex than the language the computer understands, which is called machine language, or low-level language.
So after knowing the basics, we are ready to create a very simple and basic program. Just as we have different languages to communicate with each other, we likewise have different languages such as C, C++, C#, Java, Python, etc. to communicate with computers.
The computer only understands binary language (the language of 0s and 1s), also called machine-understandable language or low-level language, but the programs we are going to write are in a high-level language, which is almost similar to human language. The piece of code described below performs the basic task of printing "hello world! I am learning programming" on the console screen (see the sketch after this passage). We should know that the keyboard, scanner, mouse, microphone, etc. are examples of input devices, and the monitor (console screen), printer, speaker, etc. are examples of output devices.
At this stage, you might not be able to understand in depth how this code prints something on the screen. The main() is a standard function that you will always include in any program that you are going to create from now onwards. Note that the execution of the program starts from the main() function. The clrscr() function is used to see only the current output on the screen, while the printf() function helps us to print the desired output on the screen. Also, getch() is a function that accepts any character input from the keyboard. In simple words, we need to press any key to continue (some people may say that getch() helps in holding the screen to see the output).
Between high-level languages and machine language there are assembly languages, also called symbolic machine code. Assembly languages are particularly computer-architecture specific. A utility program (an assembler) is used to convert assembly code into executable machine code. A high-level programming language is portable but requires interpretation or compiling to convert it into a machine language that the computer understands.
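The original material shows the program referred to above as a screenshot; a sketch of the same idea is given below. It uses the Turbo C-style functions named in the text (clrscr() and getch() come from the non-standard conio.h header, so it compiles as-is only on compilers that provide that header).

#include <stdio.h>
#include <conio.h>   /* non-standard header that declares clrscr() and getch() */

int main()
{
    clrscr();                                          /* clear the console screen */
    printf("hello world! I am learning programming");  /* print the message */
    getch();                                           /* wait for any key press */
    return 0;
}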
There have been many programming languages; some of them are listed below.

Most Popular Programming Languages:
C
Python
C++
Java
SCALA
C#
R
Ruby
Go
Swift
JavaScript

Characteristics of a programming language:
A programming language must be simple, easy to learn and use, have good readability, and be human recognizable.
Abstraction is a must-have characteristic of a programming language: the ability to define complex structures determines much of its usability.
A portable programming language is always preferred.
A programming language's efficiency must be high so that it can be easily converted into machine code and its execution consumes little space in memory.
A programming language should be well structured and documented so that it is suitable for application development.
Necessary tools for the development, debugging, testing, and maintenance of a program must be provided by a programming language.
A programming language should provide a single environment known as an Integrated Development Environment (IDE).
A programming language must be consistent in terms of syntax and semantics.

Basic Terminologies in Programming Languages:
Algorithm: A step-by-step procedure for solving a problem or performing a task.
Variable: A named storage location in memory that holds a value or data.
Data Type: A classification that specifies what type of data a variable can hold, such as integer, string, or boolean.
Function: A self-contained block of code that performs a specific task and can be called from other parts of the program.
Control Flow: The order in which statements are executed in a program, including loops and conditional statements.
Syntax: The set of rules that govern the structure and format of a programming language.
Comment: A piece of text in a program that is ignored by the compiler or interpreter, used to add notes or explanations to the code.
Debugging: The process of finding and fixing errors or bugs in a program.
IDE: Integrated Development Environment, a software application that provides a comprehensive development environment for coding, debugging, and testing.
Operator: A symbol or keyword that represents an action or operation to be performed on one or more values or variables, such as + (addition), – (subtraction), * (multiplication), and / (division).
Statement: A single line or instruction in a program that performs a specific action or operation.

Basic Example of Most Popular Programming Languages:
Here the basic code for the addition of two numbers is given in some popular languages (like C, C++, Java, Python, C#, JavaScript, etc.). A C version is sketched below.
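The original handout shows this example in several languages; only a C sketch is reproduced here, reading two integers and printing their sum.

#include <stdio.h>

int main(void)
{
    int a, b;

    printf("Enter two numbers: ");
    scanf("%d %d", &a, &b);       /* read two integers from the keyboard */
    printf("Sum = %d\n", a + b);  /* print their sum */
    return 0;
}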
Tips for learning a new programming language:
1. Start with the fundamentals: Begin by learning the basics of the language, such as syntax, data types, variables, and simple statements. This will give you a strong foundation to build upon.
2. Code daily: Like any skill, the only way to get good at programming is by practicing regularly. Try to write code every day, even if it's just a few lines.
3. Work on projects: One of the best ways to learn a new language is to work on a project that interests you. It could be a simple game, a web application, or anything that allows you to apply what you've learned, which is the most important part.
4. Read the documentation: Every programming language has documentation that explains its features, syntax, and best practices. Make sure to read it thoroughly to get a better understanding of the language.
5. Join online communities: There are many online communities dedicated to programming languages, where you can ask questions, share your code, and get feedback. Joining these communities can help you learn faster and make connections with other developers.
6. Learn from others: Find a mentor or someone who is experienced in the language you're trying to learn. Ask them questions, review their code, and try to understand how they solve problems.
7. Practice debugging: Debugging is an essential skill for any programmer, and you'll need to do a lot of it when learning a new language. Make sure to practice identifying and fixing errors in your code.

Lesson 2.3 The Internet

HISTORY OF THE INTERNET
1. Web 1.0. Web 1.0 is commonly regarded as the initial, or first, stage of the WWW. This first stage of the web was made up of web pages connected entirely by hyperlinks. The official definition of this version remains a moot point.
2. Web 2.0. The second stage of the WWW, Web 2.0, is identified by the shift from static to dynamic web pages, which are better organized and are completely based on serving web applications and user-generated content to end users. Web 2.0 is not only readable but also writable, with a connection to data. It allows end users to navigate and interact in a better way, and it encourages participation and data sharing.
3. Web 3.0. Web 3.0 is referred to as the new paradigm in the web and the evolution of Web 2.0 that will mark changes in the creation of websites and particularly in the way people reach those websites. It is the executable phase of the WWW with dynamic services: the read, write, and executable form of the WWW. It can read data much as a human does and can distribute or tailor useful content to end users.

Key differences to note in the terms Web 1.0, Web 2.0 and Web 3.0

4. Technological Convergence.
Lesson 2.3 The Internet

HISTORY OF THE INTERNET
1. Web 1.0. Web 1.0 is commonly regarded as the first stage of the World Wide Web (WWW). This first stage of the web was made up of web pages connected entirely by hyperlinks; strictly speaking, the official definition of this version remains a moot point.
2. Web 2.0. The second stage of the WWW, Web 2.0, is identified by the shift from static to dynamic web pages, which are better organized and built around serving web applications and user-generated content to end users. Web 2.0 is not only readable but also writable, with a connection to data. It allows end users to navigate and interact in a better way, and it encourages participation and data sharing.
3. Web 3.0. Web 3.0 refers to the new paradigm of the web and the evolution of Web 2.0, one that will change how websites are created and, in particular, how people reach those websites. It is the executable phase of the WWW, with dynamic services: Web 3.0 is the read, write, and execute form of the WWW. It can read data much like a human can, and it can distribute or tailor useful content to end users.

Key Differences to Note in the Terms Web 1.0, Web 2.0 and Web 3.0
In brief: Web 1.0 was read-only and static, Web 2.0 is read-write and dynamic, and Web 3.0 adds an executable, data-aware layer on top of reading and writing.

4. Technological Convergence. The term technological convergence is often defined, in very general and simplified terms, as the process by which telecommunications, information technology, and the media (sectors that originally operated largely independently of one another) are growing together. Technological convergence has both a technical and a functional side.

Types of Internet
1. Dial-Up
Dial-up Internet became commercially available in the 1990s. During that time, customers could purchase a modem from their telephone service provider. The most common form was the so-called 56K modem, named for its maximum speed of 56 kbit/s. A limitation of dial-up was that the Internet could not be used while someone was using the telephone, and vice versa, because the two services shared a single telephone line. For anyone who lived through that time, a familiar memory is asking a family member to get off the phone so the Internet could be used. Additionally, users needed to log in and out using details given by the telephone companies, and they were automatically disconnected after some time to allow a fair share of the then-limited resources. Although it might sound strange, dial-up is still in use in 2020, because it requires only a telephone line, which is handy for remote places and developing countries where other forms of Internet access are rare or non-existent.
2. DSL
DSL (Digital Subscriber Line) broadband Internet came after dial-up. Among users it was better known as ADSL (Asymmetric Digital Subscriber Line), because that was the most commonly installed DSL technology. It functioned over the same telephone network as dial-up, but thanks to technological advances and a DSL filter (or splitter), the Internet operated in a frequency band above that of regular telephone calls. This removed the most annoying limitation of dial-up Internet: voice calls and the Internet could now be used simultaneously.
3. Cable
The next advancement in the field was cable Internet access, which remains the most commonly used form of Internet access in 2020. It combined existing DSL telephone networks and cable television infrastructure to bring the best of both worlds. Providers supply the Internet from a cable modem termination system (CMTS) at their premises, through kilometers of underground cable, and finally through separate coaxial cables to the customer's modem and/or router.
4. Wireless
Wireless LAN, or WLAN, is becoming more and more common in residential homes because of smart devices such as smartphones, tablets, and virtual assistant devices. For home use, customers receive full-fledged wired cable or ADSL routers or modems. However, those devices can also send and receive data wirelessly on two frequencies, 2.4 GHz and 5 GHz. If you ever see the term "dual-band", this refers to the device's ability to use both bands. The main difference between the two bands is their speed and range. The 2.4 GHz band supports maximum speeds between 450 Mbps and 600 Mbps.
5. Satellite
For places that still rely on dial-up or where an Internet connection cannot otherwise be achieved, satellite Internet might be the only choice. The downside is that it requires a substantial investment, because a satellite dish must be purchased or rented at the location to send and receive data. Furthermore, the number of providers is much lower, and the speeds are not known to be fast either: generally between 12 Mbps and 100 Mbps, but most likely at the lower end.
6. Cellular
Have you heard about the 5G controversy and conspiracy theories? 5G is an advanced version of the widely used cellular networks 3G and 4G. Cellular Internet requires a SIM card and the mobile phone service provider to have enabled the service on your account. You don't need any additional gear; it's all built into smartphones, tablets, and hotspot devices. If you aren't familiar with hotspots, here's the short answer: a hotspot is a 3G, 4G, or 5G connection shared as a wireless network with anyone in the vicinity.
7. Fiber Optic
Instead of copper, thin glass fibers are used to transmit data, and because light signals travel rapidly, this reduces latency. Although speed may be affected by the bouncing of light inside the fiber, it is still mind-numbingly fast: providers offer speeds of around 1 Gbit/s for commercial use. It is also more reliable, because no electricity flows through the cable itself, and fiber can be operated with far less signal loss over greater distances than cable or DSL Internet. (A short calculation after this list shows what speeds such as 56 kbit/s, 100 Mbps, and 1 Gbit/s mean in practice for a typical download.)
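To put these speeds into perspective, here is a small Python calculation (not part of the original material; the 1 GB file size is just an illustrative choice). Remember that network speeds are quoted in bits per second, while file sizes are usually given in bytes, so there is a factor of 8 between them.

# Rough download time for a 1 GB file at the link speeds mentioned above.
FILE_SIZE_BITS = 8 * 10**9  # 1 gigabyte = 8 gigabits (using decimal units)

speeds_in_bits_per_second = {
    "Dial-up (56 kbit/s)": 56 * 10**3,
    "Satellite (12 Mbps)": 12 * 10**6,
    "Wireless 2.4 GHz (450 Mbps)": 450 * 10**6,
    "Fiber (1 Gbit/s)": 1 * 10**9,
}

for name, speed in speeds_in_bits_per_second.items():
    seconds = FILE_SIZE_BITS / speed
    if seconds >= 60:
        print(f"{name}: about {seconds / 60:.1f} minutes")
    else:
        print(f"{name}: about {seconds:.1f} seconds")

The output shows roughly 2,381 minutes (almost 40 hours) for dial-up, about 11 minutes for a 12 Mbps satellite link, under 18 seconds at 450 Mbps, and about 8 seconds on a 1 Gbit/s fiber connection.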
Lesson 2.4 Artificial Intelligence

Artificial Intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think and act like humans. It involves the development of algorithms and computer programs that can perform tasks that typically require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. AI has the potential to revolutionize many industries and has a wide range of applications, from virtual personal assistants to self-driving cars.

Before turning to the meaning of artificial intelligence, let us first understand the meaning of intelligence. Intelligence: the ability to learn and solve problems. This definition is taken from Webster's Dictionary. The most common answer one expects is "to make computers intelligent so that they can act intelligently!", but the question is: how intelligent? How can one judge intelligence? Intelligence, as we know, is the ability to acquire and apply knowledge. Knowledge is information acquired through experience. Experience is knowledge gained through exposure (training). Summing these terms up, artificial intelligence is a copy of something natural (i.e., human beings) that is capable of acquiring and applying the information it has gained through exposure.

Intelligence is composed of:
Reasoning
Learning
Problem-Solving
Perception
Linguistic Intelligence

Need for Artificial Intelligence
1. To create expert systems that exhibit intelligent behavior, with the capability to learn, demonstrate, explain, and advise their users.
2. To help machines find solutions to complex problems the way humans do, and to express those solutions as algorithms in a computer-friendly manner.
3. Improved efficiency: Artificial intelligence can automate tasks and processes that are time-consuming and require a lot of human effort. This can help improve efficiency and productivity, allowing humans to focus on more creative and high-level tasks.
4. Better decision-making: Artificial intelligence can analyze large amounts of data and provide insights that can aid in decision-making. This can be especially useful in domains like finance, healthcare, and logistics, where decisions can have significant impacts on outcomes.
5. Enhanced accuracy: Artificial intelligence algorithms can process data quickly and accurately, reducing the risk of errors that can occur in manual processes. This can improve the reliability and quality of results.
6. Personalization: Artificial intelligence can be used to personalize experiences for users, tailoring recommendations and interactions based on individual preferences and behaviors. This can improve customer satisfaction and loyalty.
7. Exploration of new frontiers: Artificial intelligence can be used to explore new frontiers and discover knowledge that is difficult or impossible for humans to access. This can lead to new breakthroughs in fields like astronomy, genetics, and drug discovery.

Technologies Based on Artificial Intelligence:
1. Machine Learning: A subfield of AI that uses algorithms to enable systems to learn from data and make predictions or decisions without being explicitly programmed (a small illustration of learning from data follows this list).
2. Natural Language Processing (NLP): A branch of AI that focuses on enabling computers to understand, interpret, and generate human language.
3. Computer Vision: A field of AI that deals with the processing and analysis of visual information using computer algorithms.
4. Robotics: AI-powered robots and automation systems that can perform tasks in manufacturing, healthcare, retail, and other industries.
5. Neural Networks: A type of machine learning algorithm modeled after the structure and function of the human brain.
6. Expert Systems: AI systems that mimic the decision-making ability of a human expert in a specific field.
7. Chatbots: AI-powered virtual assistants that can interact with users through text-based or voice-based interfaces.
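To make the phrase "learn from data without being explicitly programmed" concrete, here is a minimal sketch in Python (not from the original material; the training data, learning rate, and number of passes are arbitrary illustrative choices) that trains a single perceptron, the simplest kind of neural network, to reproduce the logical AND function:

# A single perceptron learning the logical AND function from examples.
# Each training example is (input1, input2, expected_output).
training_data = [
    (0, 0, 0),
    (0, 1, 0),
    (1, 0, 0),
    (1, 1, 1),
]

w1, w2, bias = 0.0, 0.0, 0.0   # parameters the model will learn
learning_rate = 0.1

def predict(x1, x2):
    # Output 1 if the weighted sum of the inputs exceeds zero, otherwise 0.
    return 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0

# Repeatedly nudge the weights in the direction that reduces the error.
for _ in range(20):
    for x1, x2, target in training_data:
        error = target - predict(x1, x2)
        w1 += learning_rate * error * x1
        w2 += learning_rate * error * x2
        bias += learning_rate * error

for x1, x2, target in training_data:
    print(f"{x1} AND {x2} -> {predict(x1, x2)} (expected {target})")

After a few passes over the examples, the weights settle on values that classify all four cases correctly. Nothing about AND was programmed in explicitly; the behavior was learned from the data, which is the essence of machine learning.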
The Future of AI Technologies:
1. Reinforcement Learning: An interesting field of artificial intelligence that focuses on training agents to make intelligent decisions by interacting with their environment.
2. Explainable AI: Techniques that focus on providing insights into how AI models arrive at their conclusions.
3. Generative AI: Through this technique, AI models can learn the underlying patterns in data and create realistic and novel outputs.
4. Edge AI: Edge AI involves running AI algorithms directly on edge devices, such as smartphones, IoT devices, and autonomous vehicles, rather than relying on cloud-based processing.
5. Quantum AI: Quantum AI combines the power of quantum computing with AI algorithms to tackle complex problems that are beyond the capabilities of classical computers.

"Artificial Intelligence is a TOOL, not a THREAT" Rodney Brooks

Chapter 3. Computer Number Systems

Lesson 3.1. Introduction to Number Systems
A number system is a method of writing numbers: a mathematical way of representing the numbers of a given set using digits or symbols in a logical, consistent manner. A numeral system represents a useful set of numbers, reflects the arithmetic and algebraic structure of numbers, and provides a standard representation. In the familiar decimal system, the digits 0 to 9 can be used to form all numbers; with these ten digits, infinitely many numbers can be written, for example 156, 3907, 3456, 1298, and 784859.

Types of Number Systems
Based on the base value and the number of allowed digits, number systems are of many types. The four common types of number system are:
Decimal Number System
Binary Number System
Octal Number System
Hexadecimal Number System

Decimal Number System
A number system with a base value of 10 is termed a decimal number system. It uses the 10 digits 0-9 to form numbers. Each digit in a number sits at a specific place whose place value is a power of 10. From right to left, the first place is called units, the second tens, then hundreds, thousands, and so on: units have the place value 10^0, tens 10^1, hundreds 10^2, thousands 10^3, and so on.

Binary Number System
A number system with base value 2 is termed a binary number system. It uses the 2 digits 0 and 1 to form numbers, and the numbers formed with these two digits are termed binary numbers. The binary number system is very useful in electronic devices and computer systems because it can be implemented using just two states, ON and OFF, i.e. 1 and 0. The decimal numbers 0-9 are represented in binary as 0, 1, 10, 11, 100, 101, 110, 111, 1000, and 1001. For example, 14 can be written as 1110, 19 as 10011, and 50 as 110010.

Octal Number System
The octal number system is one in which the base value is 8. It uses the 8 digits 0-7 to form octal numbers. Octal numbers can be converted to decimal values by multiplying each digit by its place value (a power of 8) and then adding the results, as the short example below shows.
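As a brief illustration (not part of the original material), the following Python snippet shows both the manual place-value method just described and Python's built-in conversion functions for the bases covered in this lesson:

# Manual conversion: multiply each digit by its place value and add the results.
def to_decimal(digits, base):
    # 'digits' is a string such as "1110" (binary) or "157" (octal).
    value = 0
    for position, digit in enumerate(reversed(digits)):
        value += int(digit) * (base ** position)
    return value

print(to_decimal("1110", 2))    # 14, matching the binary example above
print(to_decimal("110010", 2))  # 50
print(to_decimal("157", 8))     # 1*64 + 5*8 + 7*1 = 111

# Python's built-in helpers perform the same conversions.
print(int("1110", 2))  # 14
print(bin(50))         # 0b110010
print(oct(111))        # 0o157
print(hex(255))        # 0xff

The manual function above handles only numeric digits; for hexadecimal, int(s, 16) also accepts the letter digits A to F, which stand for the values 10 to 15.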