Republic of the Philippines
POLYTECHNIC UNIVERSITY OF THE PHILIPPINES
SANTA ROSA CAMPUS
City of Santa Rosa, Laguna

INSTRUCTIONAL MATERIAL
COMP 001 INTRODUCTION TO COMPUTING

Compiled by: Inst. Owen Harvey Balocon
Faculty, Bachelor of Science in Information Technology

Chapter 1. Introduction to Information Technology

Lesson 1.1 Introduction and History of Computer Technology

The electronic computer is one of the most important developments of the twentieth century. Like the industrial revolution of the nineteenth century, the computer and the information and communication technology built upon it have drastically changed business, culture, government, and science, and have touched nearly every aspect of our lives. This text introduces the field of computing and details the fundamental concepts and practices used in the development of computer applications.

"We're changing the World with Technology." - Bill Gates

Analog computers

Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.

One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in "real time"; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.

Digital computers

In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s (binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States, Britain, and Germany. The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage. For historical developments, see the section Invention of the modern computer.

Mainframe computer

These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer.
Other users submitted "batch jobs" to be run one at a time on the mainframe.

Supercomputer

The most powerful computers of the day have typically been called supercomputers. They have historically been very expensive and their use limited to high-priority computations for government-sponsored research, such as nuclear simulations and weather modeling. Today many of the computational techniques of early supercomputers are in common use in PCs. On the other hand, the design of costly, special-purpose processors for supercomputers has been supplanted by the use of large arrays of commodity processors (from several dozen to over 8,000) operating in parallel over a high-speed communications network.

Minicomputer

Although minicomputers date to the early 1950s, the term was introduced in the mid-1960s. Relatively small and inexpensive, minicomputers were typically used in a single department of an organization and often dedicated to one task or shared by a small group. Minicomputers generally had limited computational power, but they had excellent compatibility with various laboratory and industrial devices for collecting and inputting data.

One of the most important manufacturers of minicomputers was Digital Equipment Corporation (DEC) with its Programmed Data Processor (PDP). In 1960 DEC's PDP-1 sold for $120,000. Five years later its PDP-8 cost $18,000 and became the first widely used minicomputer, with more than 50,000 sold. The DEC PDP-11, introduced in 1970, came in a variety of models, small and cheap enough to control a single manufacturing process and large enough for shared use in university computer centres; more than 650,000 were sold. However, the microcomputer overtook this market in the 1980s.

Microcomputer

A microcomputer is a small computer built around a microprocessor integrated circuit, or chip. Whereas the early minicomputers replaced vacuum tubes with discrete transistors, microcomputers (and later minicomputers as well) used microprocessors that integrated thousands or millions of transistors on a single chip. In 1971 the Intel Corporation produced the first microprocessor, the Intel 4004, which was powerful enough to function as a computer although it was produced for use in a Japanese-made calculator. In 1975 the first personal computer, the Altair, used a successor chip, the Intel 8080 microprocessor. Like minicomputers, early microcomputers had relatively limited storage and data-handling capabilities, but these have grown as storage technology has improved alongside processing power.

Embedded processors

Another class of computer is the embedded processor. These are small computers that use simple microprocessors to control electrical and mechanical functions. They generally do not have to do elaborate computations or be extremely fast, nor do they have to have great "input-output" capability, and so they can be inexpensive. Embedded processors help to control aircraft and industrial automation, and they are common in automobiles and in both large and small household appliances. One particular type, the digital signal processor (DSP), has become as prevalent as the microprocessor.
DSPs are used in wireless telephones, digital telephone and cable modems, and some stereo equipment.

GENERATIONS OF COMPUTERS

The First Generation

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The Second Generation

Transistors replaced vacuum tubes and ushered in the second generation of computers. One transistor replaced the equivalent of 40 vacuum tubes, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable, although they still generated a great deal of heat that could damage the computer. Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. Second-generation computers still relied on punched cards for input and printouts for output. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The Third Generation

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. These machines were much smaller and cheaper compared to second generation computers and could carry out instructions in billionths of a second. Users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

The Fourth Generation

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet.
Fourth generation computers also saw the development of GUIs, the mouse, and handheld devices.

The Fifth Generation

Fifth generation computing is based on Artificial Intelligence (AI) and is still in development. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. The goal is to develop devices that respond to natural language input and are capable of learning and self-organization. Some applications, such as voice recognition, are already being used today.

WISDOM OF THIS LESSON: Let us Contribute to the Advancement, Evolution, and the Betterment of the Human Race.

Lesson 1.2 Understanding Information Technology

What is IT?

Information technology (IT) refers to the use of computers, software, hardware, networks, and other digital technologies to store, retrieve, transmit, and manipulate data or information. IT encompasses a wide range of technologies and practices that are used to manage and process information effectively in various domains, including business, healthcare, education, entertainment, and more.

Here are some key components and aspects of information technology:

1. Hardware: This includes computers, servers, storage devices, and networking equipment. Hardware forms the physical infrastructure on which IT systems and applications run.

2. Software: IT involves the development, installation, and maintenance of software applications and systems. This includes operating systems, productivity software, databases, and custom software solutions.

3. Networking: IT relies heavily on networks to connect devices and enable data communication. This includes local area networks (LANs), wide area networks (WANs), the internet, and various network protocols.

4. Data Management: IT professionals are responsible for managing data, which includes data storage, retrieval, backup, and security. Databases and data centers play a crucial role in this aspect.

5. Cybersecurity: IT security is a critical component of information technology. It involves protecting systems, networks, and data from unauthorized access, breaches, and cyber threats.

6. Cloud Computing: Cloud technology has become a central part of IT, allowing organizations to access and manage resources (such as servers, storage, and applications) remotely over the internet.

7. Programming and Development: IT professionals often engage in software development, coding, and programming to create custom solutions or modify existing software to meet specific needs.

8. IT Support: IT support teams provide assistance to users and organizations, helping them resolve technical issues and ensuring that IT systems run smoothly.

9. Digital Transformation: IT plays a pivotal role in modernizing and transforming businesses and organizations by enabling automation, data analysis, and the adoption of emerging technologies like artificial intelligence (AI) and the Internet of Things (IoT).
10. Project Management: Managing IT projects is crucial to ensure that technology implementations are completed on time and within budget while meeting the desired objectives.

11. Data Analytics and Business Intelligence: IT is instrumental in collecting and analyzing data to derive insights and support decision-making processes.

12. Mobile Technology: The proliferation of smartphones and mobile devices has expanded the scope of IT to include mobile app development and mobile device management.

13. E-commerce and Online Services: IT powers online businesses and services, from e-commerce platforms to social media networks.

Information technology is an ever-evolving field, and it continues to shape and revolutionize the way individuals and organizations operate and interact in today's digital age. It has become an integral part of our daily lives and has a significant impact on nearly every aspect of modern society.

The Differences of Information Technology, Computer Science, Computer Engineering, and Information Systems

Information Technology (IT), Computer Science (CS), Computer Engineering (CE), and Information Systems (IS) are related fields but have distinct focuses and areas of expertise.

Lesson 1.3 History of Computing

A computer might be described with deceptive simplicity as "an apparatus that performs routine calculations automatically." Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.

Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who laboured to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming. They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer.
Early history

The abacus

The earliest known calculating device is probably the abacus. It dates back at least to 1100 BCE and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.

The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.

Analog calculators: from Napier's logarithms to the slide rule

Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. For example, log₁₀ 100 + log₁₀ 1,000 = 2 + 3 = 5, which is log₁₀ 100,000, the logarithm of the product. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labour-saving tool for tedious astronomical calculations.

Digital calculators: from the Calculating Clock to the Arithmometer

In 1623 the German astronomer and mathematician Wilhelm Schickard built the first calculator. He described it in a letter to his friend the astronomer Johannes Kepler, and in 1624 he wrote again to explain that a machine he had commissioned to be built for Kepler was, apparently along with the prototype, destroyed in a fire. He called it a Calculating Clock, which modern engineers have been able to reproduce from details in his letters. Even general knowledge of the clock had been temporarily lost when Schickard and his entire family perished during the Thirty Years' War.

The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector, so it was the first business machine too (if one does not count the abacus). He built 50 of them over the next 10 years.

Leibniz was a strong advocate of the binary number system. Binary numbers are ideal for machines because they require only two digits, which can easily be represented by the on and off states of a switch. When computers became electronic, the binary system was particularly appropriate because an electrical circuit is either on or off. This meant that on could represent true, off could represent false, and the flow of current would directly represent the flow of logic.
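To make the on-and-off idea concrete, here is a small C sketch (not part of the original material; C is used because it is the language of the programming activity later in this lesson, and the function name print_bits is simply an illustrative choice). It prints a number the way a binary machine stores it: as a row of bits, each one a tiny switch that is either on (1) or off (0).

    #include <stdio.h>

    /* Print the eight low-order bits of n, most significant bit first.
       Each bit behaves like a switch: 1 means "on", 0 means "off". */
    static void print_bits(unsigned int n) {
        for (int i = 7; i >= 0; i--) {
            putchar(((n >> i) & 1u) ? '1' : '0');
        }
        printf("  (decimal %u)\n", n);
    }

    int main(void) {
        print_bits(0u);    /* prints 00000000 */
        print_bits(1u);    /* prints 00000001 */
        print_bits(13u);   /* prints 00001101, i.e. 8 + 4 + 1 */
        return 0;
    }

Each printed position carries a weight (128, 64, 32, ..., 2, 1), much as each rod of the abacus carries a weight, so a handful of two-state switches can represent a wide range of numbers.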
The Jacquard Loom

Calculators such as the Arithmometer remained a fascination after 1820, and their potential for commercial use was well understood. Many other mechanical devices built during the 19th century also performed repetitive functions more or less automatically, but few had any application to computing. There was one major exception: the Jacquard loom, invented in 1804-05 by a French weaver, Joseph-Marie Jacquard.

The Difference Engine

Charles Babbage was an English mathematician and inventor: he invented the cowcatcher, reformed the British postal system, and was a pioneer in the fields of operations research and actuarial science. It was Babbage who first suggested that the weather of years past could be read from tree rings. He also had a lifelong fascination with keys, ciphers, and mechanical dolls.

The Analytical Engine

While working on the Difference Engine, Babbage began to imagine ways to improve it. Chiefly he thought about generalizing its operation so that it could perform other kinds of calculations. By the time the funding had run out in 1833, he had conceived of something far more revolutionary: a general-purpose computing machine called the Analytical Engine.

Early business machines

Herman Hollerith's census tabulator

The U.S. Constitution mandates that a census of the population be performed every 10 years. The first attempt at any mechanization of the census was in 1870, when statistical data were transcribed onto a rolling paper tape displayed through a small slotted window. As the size of America's population exploded in the 19th century and the number of census questions expanded, the urgency of further mechanization became increasingly clear.

After graduating from the Columbia University School of Mines, New York City, in 1879, Herman Hollerith obtained his first job with one of his former professors, William P. Trowbridge, who had received a commission as a special agent for the 1880 census. It was while employed at the Census Office that Hollerith first saw the pressing need for automating the tabulation of statistical data.

Howard Aiken's digital calculators

While Vannevar Bush was working on analog computing at MIT, across town Harvard professor Howard Aiken was working with digital devices for calculation. He had begun to realize in hardware something like Babbage's Analytical Engine, which he had read about. Starting in 1937, he laid out detailed plans for a series of four calculating machines of increasing sophistication, based on different technologies, from the largely mechanical Mark I to the electronic Mark IV.
The Turing machine

Alan Turing, while a mathematics student at the University of Cambridge, was inspired by German mathematician David Hilbert's formalist program, which sought to demonstrate that any mathematical problem can potentially be solved by an algorithm—that is, by a purely mechanical process. Turing interpreted this to mean a computing machine and set out to design one capable of resolving all mathematical problems, but in the process he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem ['decision problem']" (1936) that no such universal mathematical solver could ever exist.

Developments during World War II

ENIAC

In the United States, government funding went to a project led by John Mauchly, J. Presper Eckert, Jr., and their colleagues at the Moore School of Electrical Engineering at the University of Pennsylvania; their objective was an all-electronic computer. Under contract to the army and under the direction of Herman Goldstine, work began in early 1943 on the Electronic Numerical Integrator and Computer (ENIAC). The next year, mathematician John von Neumann, already on full-time leave from the Institute for Advanced Study (IAS), Princeton, New Jersey, for various government research projects (including the Manhattan Project), began frequent consultations with the group.

Early computer language development

Machine language

One implication of the stored-program model was that programs could read and operate on other programs as data; that is, they would be capable of self-modification. Konrad Zuse had looked upon this possibility as "making a contract with the Devil" because of the potential for abuse, and he had chosen not to implement it in his machines. But self-modification was essential for achieving a true general-purpose machine.

Interpreters

High-level language (HLL) coding was attempted right from the start of the stored-program era in the late 1940s. Shortcode, or short-order code, was the first such language actually implemented. Suggested by John Mauchly in 1949, it was implemented by William Schmitt for the BINAC computer in that year and for UNIVAC in 1950. Shortcode went through multiple steps: first it converted the alphabetic statements of the language to numeric codes, and then it translated these numeric codes into machine language. It was an interpreter, meaning that it translated HLL statements and executed, or performed, them one at a time—a slow process. Because of their slow execution, interpreters are now rarely used outside of program development, where they may help a programmer to locate errors quickly.

Compilers

An alternative to this approach is what is now known as compilation. In compilation, the entire HLL program is converted to machine language and stored for later execution. Although translation may take many hours or even days, once the translated program is stored, it can be recalled anytime in the form of a fast-executing machine-language program.

FORTRAN, COBOL, and ALGOL

FORTRAN took another step toward making programming more accessible, allowing comments in the programs.
The ability to insert annotations, marked to be ignored by the translator program but readable by a human, meant that a well-annotated program could be read in a certain sense by people with no programming knowledge at all. For the first time a nonprogrammer could get an idea what a program did—or at least what it was intended to do—by reading (part of) the code. It was an obvious but powerful step in opening up computers to a wider audience. By 1954 Backus and a team of programmers had designed the language, which they called FORTRAN (Formula Translation). Programs written in FORTRAN looked a lot more like mathematics than machine instructions.

COBOL

About the time that Backus and his team invented FORTRAN, Hopper's group at UNIVAC released Math-matic, a FORTRAN-like language for UNIVAC computers. It was slower than FORTRAN and not particularly successful. Another language developed at Hopper's laboratory at the same time had more influence. Flow-matic used a more English-like syntax and vocabulary.

ALGOL

During the late 1950s a multitude of programming languages appeared. This proliferation of incompatible specialized languages spurred an interest in the United States and Europe to create a single "second-generation" language. A transatlantic committee soon formed to determine specifications for ALGOL (Algorithmic Language), as the new language would be called. Backus, on the American side, and Heinz Rutishauser, on the European side, were among the most influential committee members.

ACTIVITY: To understand the mentioned programming languages, print "Hello World" on your computers using C programming. (A sample C program is sketched a few paragraphs below.)

OPERATING SYSTEMS

Control programs

In order to make the early computers truly useful and efficient, two major innovations in software were needed. One was high-level programming languages (as described in the preceding section, FORTRAN, COBOL, and ALGOL). The other was control. Today the systemwide control functions of a computer are generally subsumed under the term operating system, or OS. An OS handles the behind-the-scenes activities of a computer, such as orchestrating the transitions from one program to another and managing access to disk storage and peripheral devices.

The IBM 360

IBM had been selling business machines since early in the century and had built Howard Aiken's computer to his architectural specifications. But the company had been slow to implement the stored-program digital computer architecture of the early 1950s. It did develop the IBM 650, a (like UNIVAC) decimal implementation of the IAS plan—and the first computer to sell more than 1,000 units. The invention of the transistor in 1947 led IBM to reengineer its early machines from electromechanical or vacuum tube to transistor technology in the late 1950s (although the UNIVAC Model 80, delivered in 1958, was the first transistor computer). These transistorized machines are commonly referred to as second-generation computers.
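Returning to the ACTIVITY above, the following is a minimal sketch of the requested C program; it is not part of the original handout, and the file name hello.c is simply an illustrative choice. The /* ... */ annotations are comments, the same kind of human-readable text, ignored by the translator, that the FORTRAN discussion describes.

    /* hello.c - a sample solution to the "Hello World" activity.
       Everything between these comment markers is ignored by the
       compiler and exists only for human readers. */
    #include <stdio.h>

    int main(void) {
        printf("Hello World\n");   /* write the greeting to standard output */
        return 0;                  /* signal success to the operating system */
    }

Because C is a compiled language, the program is translated to machine language once and can then be run as many times as needed, which is the compilation model described earlier; on many systems a command such as cc hello.c -o hello builds it, and ./hello runs it.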
Minicomputers

About 1965, roughly coterminous with the development of time-sharing, a new kind of computer came on the scene. Small and relatively inexpensive (typically one-tenth the cost of the Big Iron machines), the new machines were stored-program computers with all the generality of the computers then in use but stripped down. The new machines were called minicomputers. (About the same time, the larger traditional computers began to be called mainframes.) Minicomputers were designed for easy connection to scientific instruments and other input/output devices, had a simplified architecture, were implemented using fast transistors, and were typically programmed in assembly language with little support for high-level languages.

THE PERSONAL COMPUTER REVOLUTION

Before 1970, computers were big machines requiring thousands of separate transistors. They were operated by specialized technicians, who often dressed in white lab coats and were commonly referred to as a computer priesthood. The machines were expensive and difficult to use. Few people came in direct contact with them, not even their programmers. The typical interaction was as follows: a programmer coded instructions and data on preformatted paper, a keypunch operator transferred the data onto punch cards, a computer operator fed the cards into a card reader, and the computer executed the instructions or stored the cards' information for later processing. Advanced installations might allow users limited interaction with the computer more directly, but still remotely, via time-sharing through the use of cathode-ray tube terminals or teletype machines.

The microprocessor

Commodore and Tandy enter the field

In late 1976 Commodore Business Machines, an established electronics firm that had been active in producing electronic calculators, bought a small hobby-computer company named MOS Technology. For the first time, an established company with extensive distribution channels would be selling a microcomputer.

Apple Inc.

Like the founding of the early chip companies and the invention of the microprocessor, the story of Apple is a key part of Silicon Valley folklore. Two whiz kids, Stephen G. Wozniak and Steven P. Jobs, shared an interest in electronics. Wozniak was an early and regular participant at Homebrew Computer Club meetings (see the earlier section, The Altair), which Jobs also occasionally attended.

The graphical user interface

In 1982 Apple introduced its Lisa computer, a much more powerful computer with many innovations. The Lisa used a more advanced microprocessor, the Motorola 68000. It also had a different way of interacting with the user, called a graphical user interface (GUI). The GUI replaced the typed command lines common on previous computers with graphical icons on the screen that invoked actions when pointed to by a handheld pointing device called the mouse. The Lisa was not successful, but Apple was already preparing a scaled-down, lower-cost version called the Macintosh. Introduced in 1984, the Macintosh became wildly successful and, by making desktop computers easier to use, further popularized personal computers.
The IBM Personal Computer

The entry of IBM did more to legitimize personal computers than any event in the industry's history. By 1980 the personal computer field was starting to interest the large computer companies. Hewlett-Packard, which had earlier turned down Stephen G. Wozniak's proposal to enter the personal computer field, was now ready to enter this business, and in January 1980 it brought out its HP-85. Hewlett-Packard's machine was more expensive ($3,250) than those of most competitors, and it used a cassette tape drive for storage while most companies were already using disk drives. Another problem was its closed architecture, which made it difficult for third parties to develop applications or software for it.

Microsoft's Windows operating system

In 1985 Microsoft came out with its Windows operating system, which gave PC compatibles some of the same capabilities as the Macintosh. Year after year, Microsoft refined and improved Windows so that Apple, which failed to come up with a significant new advantage, lost its edge. IBM tried to establish yet another operating system, OS/2, but lost the battle to Gates's company. In fact, Microsoft also had established itself as the leading provider of application software for the Macintosh. Thus Microsoft dominated not only the operating system and application software business for PC-compatibles but also the application software business for the only nonstandard system with any sizable share of the desktop computer market. In 1998, amid a growing chorus of complaints about Microsoft's business tactics, the U.S. Department of Justice filed a lawsuit charging Microsoft with using its monopoly position to stifle competition.

Handheld digital devices

It happened by small steps. The popularity of the personal computer and the ongoing miniaturization of the semiconductor circuitry and other devices first led to the development of somewhat smaller, portable—or, as they were sometimes called, luggable—computer systems. The first of these, the Osborne 1, designed by Lee Felsenstein, an electronics engineer active in the Homebrew Computer Club in San Francisco, was sold in 1981. Soon most PC manufacturers had portable models. At first these portables looked like sewing machines and weighed in excess of 20 pounds (9 kg). Gradually they became smaller (laptop-, notebook-, and then sub-notebook-size) and came with more powerful processors. These devices allowed people to use computers not only in the office or at home but also while traveling—on airplanes, in waiting rooms, or even at the beach.

ONE INTERCONNECTED WORLD

The Internet

The Internet grew out of funding by the U.S. Advanced Research Projects Agency (ARPA), later renamed the Defense Advanced Research Projects Agency (DARPA), to develop a communication system among government and academic computer-research laboratories. The first network component, ARPANET, became operational in October 1969.
With only 15 nongovernment (university) sites included in ARPANET, the U.S. National Science Foundation decided to fund the construction and initial maintenance cost of a supplementary network, the Computer Science Network (CSNET).

E-commerce

Early enthusiasm over the potential profits from e-commerce led to massive cash investments and a "dot-com" boom-and-bust cycle in the 1990s. By the end of the decade, half of these businesses had failed, though certain successful categories of online business had been demonstrated, and most conventional businesses had established an online presence. Search and online advertising proved to be the most successful new business areas.

Social networking

Social networking services emerged as a significant online phenomenon in the 2000s. These services used software to facilitate online communities, where members with shared interests swapped files, photographs, videos, and music, sent messages and chatted, set up blogs (Web diaries) and discussion groups, and shared opinions. Early social networking services included Classmates.com, which connected former schoolmates, and Yahoo! 360°, Myspace, and SixDegrees, which built networks of connections via friends of friends. By 2018 the leading social networking services included Facebook, Twitter, Instagram, LinkedIn, and Snapchat. LinkedIn became an effective tool for business staff recruiting. Businesses began exploring how to exploit these networks, drawing on social networking research and theory which suggested that finding key "influential" members of existing networks of individuals could give access to and credibility with the whole network.

Quantum Computing: The New Generation of Computing and Technology

Quantum computing is a cutting-edge field of computing that leverages the principles of quantum mechanics to process and store information in a fundamentally different way than classical computers. While classical computers use bits (0s and 1s) as the basic units of information, quantum computers use quantum bits, or qubits. Quantum computing has the potential to revolutionize fields such as cryptography, optimization, materials science, drug discovery, and more. However, building practical and scalable quantum computers is a significant technological challenge, and many research and development efforts are ongoing in both academia and industry to harness the power of quantum computing for real-world applications.
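As a brief aside not drawn from the handout: the standard way to state the difference is that a classical bit is always exactly 0 or 1, while a qubit can exist in a superposition of both states,

    |ψ⟩ = α|0⟩ + β|1⟩,   with |α|² + |β|² = 1,

where α and β are complex amplitudes; measuring the qubit yields 0 with probability |α|² and 1 with probability |β|².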
