GEE-LIE-Chapter-1-Intro-to-ICT Course Packs PDF
Document Details
Cebu Technological University
2020
Prof. Liberty Grace Baay, Prof. Maria Rowena Lobrigas, Prof. Juanito Paner, Prof. Joyjean Basingcol, Prof. Frances Jay B. Pacaldo, Prof. Stephanie Loise Tanedo
Summary
This document contains the instructional materials for Living in the IT Era. It covers the history of computers, the generations and classifications of computers, the four basic computer periods, and the evolution of information technology and media, together with assessment questions for each lesson.
Department of Mathematics and Statistics
INSTRUCTIONAL MATERIALS FOR LIVING IN THE IT ERA (GEE-LIE)
PREPARED BY: PROF. LIBERTY GRACE BAAY, PROF. JOYJEAN BASINGCOL, PROF. MARIA ROWENA LOBRIGAS, PROF. FRANCES JAY B. PACALDO, PROF. JUANITO PANER, PROF. STEPHANIE LOISE TAÑEDO
AUGUST 2020

TABLE OF CONTENTS
Course Learning Outcomes
Intended Learning Outcomes
Introduction
Lesson 1.1 History of Computer
Lesson 1.2 Generations of Computer
Assessment
Lesson 1.3 Four Basic Computer Periods
Assessment
Lesson 1.4 Classification of Computers
Assessment
Lesson 1.5 Evolution of Information Technology
Assessment
Lesson 1.6 Evolution of Media
Assessment
Lesson 1.7 Media in the Digital Age
Assessment
References

GEE-LIE Living in the IT Era
LESSON 1. Introduction to ICT
Lecture Notes

Course Learning Outcome for the Lesson
Create a brief timeline of the history of Information Technology.

Intended Learning Outcomes
Summarize key events in the history of Information Technology; discover the milestones of Information and Communication Technology and its impacts and issues; and explain the role of technology in media and how it affects communication.

Introduction

The rise of information and communication technologies (ICT), that is, computers, software, telecommunications and the internet, and the large impact these new technologies are having on the way society functions, have prompted many to claim that we have entered a new era, often referred to as the 'Third Industrial Revolution', the 'information age' or the 'new economy'.

There are many technological advances that many of us assume have been with us forever, and for some of us, technologies like computers have been around longer than we have. Most of us are more interested in future enhancements than in the past. After all, why does the history of a piece of technology matter now? The study of the history of any technology is crucial because the original design often influences a future design that is not always related to, or even in the same field as, the original.

For instance, the punch-card design was used to compile health information and was adopted to tabulate the 1890 Census. By the 1950s, punch cards had created the need for concise computer languages to operate them. Meanwhile, transistors and circuits became more sophisticated, and inventors were able to adapt the punch-card language to electronic circuits. From the humble loom to the supercomputer, simplicity of design leads to the adaptation of the original idea to fit new applications. The study of history is how we rethink the old and find new applications.

In this course, we will learn how computers became what they are today: their evolution, the generations of computers, the four basic computer periods, the classification of computers, the evolution of media, and media in the digital age.

Lesson 1.1 History of Computer

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms. Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.

It seems a real miracle that the first digital computer in the world, which embodied in its mechanical and logical details just about every major principle of the modern digital computer, was designed as early as the 1830s. This was done by the great Charles Babbage, and the name of the machine is the Analytical Engine. The object of the machine may shortly be given thus (according to Henry Babbage, the youngest son of the inventor): It is a machine to calculate the numerical value or values of any formula or function of which the mathematician can indicate the method of solution. It is to perform the ordinary rules of arithmetic in any order as previously settled by the mathematician, and any number of times and on any quantities. It is to be absolutely automatic, the slave of the mathematician, carrying out his orders and relieving him from the drudgery of computing. It must print the results, or any intermediate result arrived at.

This is just one of the many historic events in the field of technology. Now, here is a brief timeline of the history of computers.

In 1801, in France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

In 1890, Herman Hollerith designs a punch-card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

In 1936, Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.

In 1941, John Vincent Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.

In 1943-1944, two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

In 1946, Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

In 1947, William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum. Transistors and vacuum tubes can do the same things in terms of how they alter the flow of an electric current in a circuit; however, vacuum tubes are much larger, and they are also much less reliable, since they eventually burn out, unlike transistors. So with transistors you can do a lot more. Vacuum tubes are really big, get very hot, and use a lot of electricity. Transistors, on the other hand, are very small, generate almost no heat, and use very little electricity.
In term of their purpose, both do the same thing -- control electricity. They just do it differently. Therefore, it is NOT possible to just remove a vacuum tube and "plug in" a transistor. Two identical functioning circuits have completely different designs, based on whether a tube or a transistor is used. In 1953, Grace Hopper develops the first computer language, which eventually becomes known as COBOL. COBOL (Common Business-Oriented Language) is a high-level programming language for business applications. It was the first popular language designed to be operating system-agnostic and is still in use in many financial and business applications today. COBOL was designed for business computer programs in industries such as finance and human resources. Unlike some high-level computer programming languages, COBOL uses English words and phrases to make it easier for ordinary business users to understand. Before COBOL, all operating systems had their own associated programming languages. This was a problem for companies that used multiple brands of computers, as was the case with the United States Department of Defense, which backed the COBOL project. Because of its ease of use and portability, COBOL quickly became one of the most used programming languages in the world. Although the language is widely viewed as outdated, more lines of code in active use today are written in COBOL than any other programming language. In 1954, the FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan. FORTRAN, in full Formula Translation, computer-programming language created in 1957 by John Backus that shortened the process of programming and made computer programming more accessible. 7 FORTRAN was the first important algorithmic language designed in 1957 by an IBM team led by John Backus. It was intended for scientific computations with real numbers and collections of them organized as one- or multidimensional arrays. In 1958, Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work. An integrated circuit, or IC, is small chip that can function as an amplifier, oscillator, timer, microprocessor, or even computer memory. An IC is a small wafer, usually made of silicon, that can hold anywhere from hundreds to millions of transistors, resistors, and capacitors. These extremely small electronics can perform calculations and store data using either digital or analog technology. In 1964, Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public. In 1969, A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users. C is a general-purpose programming language that is extremely popular, simple and flexible. It is machine-independent, structured programming language which is used extensively in various applications. 
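To make the idea concrete, here is a minimal sketch of the kind of array-based numerical computation FORTRAN was designed for. It is written in C purely for consistency with the later examples in this material and is illustrative only; the variable names and values are invented, not taken from the course pack.

```c
#include <stdio.h>

/* Illustrative only: the kind of array-based numerical work
   FORTRAN was created for, sketched here in C. */
int main(void) {
    double readings[5] = {2.0, 4.5, 3.25, 5.0, 1.75};  /* one-dimensional array */
    double sum = 0.0;

    for (int i = 0; i < 5; i++) {
        sum += readings[i];        /* accumulate each element */
    }

    printf("average = %f\n", sum / 5.0);
    return 0;
}
```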
C was the basics language to write everything from operating systems (Windows and many others) to complex programs like the Oracle database, Git, Python interpreter and more. It is said that 'C' is a god's programming language. One can say, C is a base for the programming. If you know 'C,' you can easily grasp the knowledge of the other programming languages that uses the concept of 'C' 8 It is essential to have a background in computer memory mechanisms because it is an important aspect when dealing with the C programming language. In 1970, The newly formed Intel unveils the Intel 1103, the first Dynamic Access Memory (DRAM) chip. Intel’s initial products were memory chips, including the world’s first metal oxide semiconductor, the 1101, which did not sell well. However, its sibling, the 1103, a one- kilobit dynamic random-access memory (DRAM) chip, was successful and the first chip to store a significant amount of information. It was purchased first by the American technology company Honeywell Incorporated in 1970 to replace the core memory technology in its computers. Because DRAMs were cheaper and used less power than core memory, they quickly became the standard memory devices in computers worldwide. In 1971, Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers. A floppy disk is a magnetic storage medium for computer systems. The floppy disk is composed of a thin, flexible magnetic disk sealed in a square plastic carrier. In order to read and write data from a floppy disk, a computer system must have a floppy disk drive (FDD). A floppy disk is also referred to simply as a floppy. Since the early days of personal computing, floppy disks were widely used to distribute software, transfer files, and create back-up copies of data. When hard drives were still very expensive, floppy disks were also used to store the operating system of a computer. In 1973, Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware. In 1974-1977, A number of personal computers hit the market, including Scelbi & Mark- 8 Altair, IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET. In 1975, The January issue of Popular Electronics magazine features the Altair 8080, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using 9 the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft. The Altair 8800 was one of the first computers available for personal use. The Altair 8800 revolutionized the computing world by allowing common people to have their own personal computer to use as they pleased. The Altair 8800 also was responsible for the rise of Microsoft as a computing industry giant. In 1976, Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the Apple I, the first computer with a single-circuit board, according to Stanford University. In 1977, Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished. In 1977, Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage. 
In 1978, Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program. In 1979, Word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional changes included getting rid of command mode and adding a print function. In 1981, The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC. In 1983, Apple's Lisa is the first personal computer with a GUI. It also features a drop- down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC 10 is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop." In 1985, Microsoft announces Windows, according to Encyclopedia Britannica. This was the company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities. In 1985, The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered. In 1986, Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides as speed comparable to mainframes. The Compaq Deskpro 386, announced in September 1986, was a landmark IBM PC compatible computer. The first fully 32-bit PC based on the Intel 386, its release took the leadership of the PC ecosystem away from IBM, and Compaq became the leader. In 1990, Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web. HTML is a computer language devised to allow website creation. These websites can then be viewed by anyone else connected to the Internet. It is relatively easy to learn, with the basics being accessible to most people in one sitting; and quite powerful in what it allows you to create. It is constantly undergoing revision and evolution to meet the demands and requirements of the growing Internet audience. In 1993, The Pentium microprocessor advances the use of graphics and music on PCs. Microprocessor is a controlling unit of a micro-computer, fabricated on a small chip capable of performing ALU (Arithmetic Logical Unit) operations and communicating with the other devices connected to it. Microprocessor consists of an ALU, register array, and a control unit. ALU performs arithmetical and logical operations on the data received from the memory or an input 11 device. Register array consists of registers identified by letters like B, C, D, E, H, L and accumulator. The control unit controls the flow of data and instructions within the computer. In 1994, PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market. In 1996, Sergey Brin and Larry Page develop the Google search engine at Stanford University. 
In 1997, Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system. In 1999, The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires. In 2001, Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI. In 2003, The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market. In 2004, Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches. In 2005, YouTube, a video sharing service, is founded. Google acquires Android, a Linux- based mobile phone operating system. In 2006, Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market. In 2007, The iPhone brings many computer functions to the smartphone. In 2009, Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features. 12 In 2010, Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment. In 2011, Google releases the Chromebook, a laptop that runs the Google Chrome OS. In 2012, Facebook gains 1 billion users on October 4. In 2015, Apple releases the Apple Watch. Microsoft releases Windows 10. In 2016, The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park. Read this article to know more about Quantum computer: https://www.sciencedaily.com/terms/quantum_computer.htm In 2017, The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic- based, digital architectures." The Molecular Informatics program brings together a collaborative interdisciplinary community to explore completely new approaches to store and process information with molecules. Chemistry offers an untapped, rich palette of molecular diversity that may yield a vast design space to enable dense data representations and highly versatile computing concepts outside of traditional digital, logic-based approaches. Given radical advances in tools and techniques to sense, separate, and manipulate at the molecular scale, what innovations can be injected into information technology, and what will the resulting systems be able to “compute”? 
By addressing a series of mathematical and 13 computational problems with molecule-based information encoding and processing, Molecular Informatics aims to discover and define future opportunities for molecules in information storage and processing. In 2018-Present, 1. Cloud Computing Business organization stores terabytes of data every day, which must be arranged, sorted and restored. A conventional computer cannot store large volumes of data which increases pressure to move towards Cloud. The Cloud will increase the productivity of an organization by saving time and money. Cloud is cheap, reliable for secure backup, and eases resource management. Cloud and its automation will collaborate with new upcoming technologies bringing profits to the organization. 2. The Internet of Things The internet of things has turned the dream of smart homes, devices, cars, and workplaces into a reality. It can forge compelling solutions for real-time problems by focusing on hardware-centric software, and firmware. IOT will merge with edge computing and enable a better range of IOT applications. Edge computing will improve IOT based application ability to detect obstacles, face identification, and security. 3. Artificial Intelligence Artificial Intelligence along with machine learning will be an unstoppable force in 2018. AI and its advanced algorithms will be assisting other technologies to build sophisticated software. Commoditized Artificial Intelligence will work in natural language processing, computer vision, and recommender systems. Conversational Artificial Intelligence will come across areas like supply chain, sales, manufacturing, and insurance. Amazon Alexa is the best example of Artificial Intelligence application. 4. Virtual Assistance Virtual Assistance technology will be a deciding vote for brand engagement. Amazon’s Alexa and Echo have proved to be current market leaders in the world of virtual assistance. Advancements can be made to the existing technology by merging with 14 voice assistant technology companies and media providers. Chatbots are one of the technology advancements brands can use for providing quick responses and 24/7 availability to the customers. 5. Augmented Reality AR is a technology which places a computer-generated image in front of a user creating a composite view of reality. Unlike virtual reality, it uses existing environment and imposes new information on top of it. Snapchat’s funny face filters and iPhone X Emojis are the best way to describe AR. Apple’s AR Kit will ease the work of many developers to include this technology in phones, software, etc. Brands like Amazon have started to work with AR 6. 3-D printing 3-D printing will change the perspective of many industries of manufacturing a product. HP has created 3-D printers which are portable and simple to use. 3-D printing can change the world by creating touchable pictures, human body parts, lightweight, and long-lasting cast for broken bones, and safer, stronger vehicles. It will also enhance design and innovation in a variety of sectors. 3-D printing will disrupt the market by saving time and money. The technology is expensive currently, but when the price will fall, it will be accessible. 7. Robotic Process Automation or RPA Like AI and Machine Learning, Robotic Process Automation, or RPA, is another technology that is automating jobs. RPA is the use of software to automate business processes such as interpreting applications, processing transactions, dealing with data, and even replying to emails. 
RPA automates repetitive tasks that people used to do. These are not just the menial tasks of a low-paid worker: up to 45 percent of the activities we do can be automated, including the work of financial managers, doctors, and CEOs.

8. Cybersecurity
Cybersecurity might not seem like emerging technology, given that it has been around for a while, but it is evolving just as other technologies are. That is in part because threats are constantly new. The malevolent hackers who are trying to illegally access data are not going to give up any time soon, and they will continue to find ways to get through even the toughest security measures. It is also in part because new technology is being adapted to enhance security. As long as we have hackers, we will have cybersecurity as an emerging technology, because it will constantly evolve to defend against those hackers.

You can't know where you are going until you know where you have been! All new technologies, including computers, evolve from an original, but that does not mean the original no longer has a purpose or is less valued. Much of our technology is repurposed for different applications, and that may be the best reason to study the history of computers.

Lesson 1.2 GENERATIONS OF COMPUTERS

A computer is an electronic device that manipulates information or data. It has the ability to store, retrieve, and process data. Nowadays, a computer can be used to type documents, send email, play games, and browse the Web. It can also be used to edit or create spreadsheets, presentations, and even videos. But the evolution of this complex system started around 1940 with the first generation of computers, and it has been evolving ever since. There are five generations of computers.

1. The First Generation: Vacuum Tubes and Plugboards (1951-1958)

The first generation of computers used vacuum tubes as their main logic elements; punched cards to input and externally store data; and rotating magnetic drums for internal storage of data in programs written in machine language (instructions written as a string of 0s and 1s) or assembly language (a language that allowed the programmer to write instructions in a kind of shorthand that would then be translated by another program, an assembler, into machine language). A short illustrative sketch of the difference between the two appears at the end of this section.

The following are the characteristics of the First Generation computers:
- Used vacuum tubes for circuitry
- Electron-emitting metal in vacuum tubes burned out easily
- Used magnetic drums for memory
- Were huge, slow, expensive, and many times undependable
- Were expensive to operate
- Were power hungry
- Generated a lot of heat, which would make them malfunction
- Solved one problem at a time
- Used input based on punched cards
- Had their outputs displayed in printouts
- Used magnetic tapes
- Used machine language
- Had limited primary memory
- Were programmed only in machine language

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.
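To illustrate the difference between machine language and assembly language, here is a small, purely hypothetical sketch in C: a toy "machine" whose numeric opcodes stand in for the strings of 0s and 1s a first-generation programmer worked with, while the mnemonic names play the role of the assembly shorthand. The opcodes and the machine itself are invented for this example, not taken from any real computer.

```c
#include <stdio.h>

/* Toy machine: invented opcodes, for illustration only. */
enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_PRINT = 3 };

int main(void) {
    /* "Machine language" view: raw numbers, no words.
       Mnemonic (assembly-style) view: LOAD 5, ADD 7, PRINT, HALT */
    int program[] = { OP_LOAD, 5, OP_ADD, 7, OP_PRINT, OP_HALT };
    int acc = 0;                       /* accumulator register */

    for (int pc = 0; ; ) {             /* pc = program counter */
        switch (program[pc]) {
        case OP_LOAD:  acc = program[pc + 1];  pc += 2; break;
        case OP_ADD:   acc += program[pc + 1]; pc += 2; break;
        case OP_PRINT: printf("%d\n", acc);    pc += 1; break;
        case OP_HALT:  return 0;
        }
    }
}
```

An assembler automates exactly this translation: the programmer writes mnemonics such as LOAD and ADD, and the assembler emits the corresponding numeric codes.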
2. The Second Generation: Transistors and Batch Filing (1959-1963)

AT&T's Bell Laboratories, in the 1940s, discovered that a class of crystalline mineral materials called semiconductors could be used in the design of a device called a transistor to replace vacuum tubes. Magnetic cores (very small donut-shaped magnets that could be polarized in one of two directions to represent data) strung on wire within the computer became the primary internal storage technology. Magnetic tape and disks began to replace punched cards as external storage devices. High-level programming languages (program instructions that could be written with simple words and mathematical expressions), like FORTRAN and COBOL, made computers more accessible to scientists and businesses.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.

The following are the characteristics of the Second Generation computers:
- Used transistors
- Faster and more reliable than first-generation systems
- Were slightly smaller, cheaper, faster
- Generated heat, though a little less
- Still relied on punch cards and printouts for input/output
- Allowed assembly and high-level languages
- Stored data in magnetic media
- Were still costly
- Needed air conditioning
- Introduced assembly language and operating system software

The first computers of this generation were developed for the atomic energy industry.

3. The Third Generation: Integrated Circuits and Multi-Programming (1964-1979)

Individual transistors were replaced by integrated circuits. Magnetic core memories began to give way to a new form, metal oxide semiconductor (MOS) memory, which, like integrated circuits, used silicon-backed chips.

Increased memory capacity and processing power made possible the development of operating systems: special programs that help the various elements of the computer work together to process information. Programming languages like BASIC were developed, making programming easier to do. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory (a minimal sketch of this idea appears after the list below). Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

The following are the characteristics of the Third Generation computers:
- Used ICs
- Used parallel processing
- Were slightly smaller, cheaper, faster
- Used motherboards
- Data was input using keyboards
- Output was visualized on the monitors
- Used operating systems, thus permitting multitasking
- Simplified programming languages (i.e., BASIC)
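As a rough illustration of that multiprogramming idea, the sketch below starts two tasks as separate processes and lets the operating system decide when each one runs. It assumes a Unix-like system with the POSIX fork() and wait() calls, and the task names are invented; this is a minimal sketch, not part of the original material.

```c
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Minimal multiprogramming sketch: the operating system, not the
   programmer, decides when each process gets the CPU. */
int main(void) {
    pid_t pid = fork();                /* create a second process */

    if (pid == 0) {
        printf("child task: payroll run\n");
        return 0;
    }
    printf("parent task: inventory report\n");
    wait(NULL);                        /* wait for the child to finish */
    return 0;
}
```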
4. The Fourth Generation: The Microprocessor, OS and GUI (1979 to Present)

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.

The following are the characteristics of the Fourth Generation computers:
- Used CPUs which contained thousands of transistors
- Were much smaller and fit on a desktop, laps and palms
- Used a mouse
- Were used in networks
- Were cheap
- Had GUIs
- Were very fast
- Register over 19 billion transistors in high-end microprocessors

5. The Fifth Generation: The Present and The Future

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

ASSESSMENT EXERCISE NO. 1 (History of Computer and Generations of Computer)
Exercise no. ______________ Date: ______________
Name: ______________ Student ID number: ______________
Course, Year and Section: ______________ Score: ______________

Instruction: Choose the letter of the correct answer.

1. What was the name of the first computer designed by Charles Babbage?
a. Analytical Engine   b. Difference Engine   c. Colossus   d. ENIAC
2. Which was the first electronic digital programmable computing device?
a. Analytical Engine   b. Difference Engine   c. Colossus   d. ENIAC
3. John Mauchly and J. Presper Eckert are the inventors of __________.
a. UNIAC   b. ENIAC   c. EDSAC   d. Ferranti Mark 1
4. Who invented the punch card?
a. Charles Babbage   b. Semen Korsakov   c. Herman Hollerith   d. Joseph Marie Jacquard
5. In the late __________, Herman Hollerith invented data storage on punched cards that could then be read by a machine.
a. 1860   b. 1900   c. 1890   d. 1880
6. Which electronic components are used in First Generation computers?
a. Transistors   b. Integrated Circuits   c. Vacuum Tubes   d. VLSI Microprocessor   e. ULSI Microprocessor
7. Which electronic components are used in Second Generation computers?
a. Transistors   b. Integrated Circuits   c. Vacuum Tubes   d. VLSI Microprocessor   e. ULSI Microprocessor
8. Which electronic components are used in Third Generation computers?
a. Transistors   b. Integrated Circuits   c. Vacuum Tubes   d. VLSI Microprocessor   e. ULSI Microprocessor
9. Which electronic components are used in Fourth Generation computers?
a. Transistors   b. Integrated Circuits   c. Vacuum Tubes   d. VLSI Microprocessor   e. ULSI Microprocessor
10. Which electronic components are used in Fifth Generation computers?
a. Transistors   b. Integrated Circuits   c. Vacuum Tubes   d. VLSI Microprocessor   e. ULSI Microprocessor
11. ENIAC belongs to __________.
a. First Generation Computers   b. Second Generation Computers   c. Third Generation Computers   d. Fourth Generation Computers
12. __________ is used as a programming language in first-generation computers.
a. FORTRAN   b. COBOL   c. BASIC   d. Machine Language
13. Which is not true about the Second Generation of computers?
a. Stored data in magnetic media   b. Bigger, cheaper, faster   c. Faster and more reliable than first-generation systems   d. Generated heat, though a little less
14. All are characteristics of a Third Generation computer except:
a. Used parallel processing   b. Output was visualized on the monitors   c. Slightly smaller, cheaper, faster   d. Did not use motherboards
15. Which characteristic below does not define a Fourth Generation computer?
a. Used CPUs which contained thousands of transistors   b. Smaller and fit on a desktop, laps and palms   c. Very fast   d. Expensive
16. The microprocessor that advanced the use of graphics and music on PCs.
a. FORTRAN microprocessor   b. Pentium microprocessor   c. Intel microprocessor   d. Python microprocessor
17. The following Bell Laboratories researchers invented the transistor and discovered how to make an electric switch with solid materials and no need for a vacuum, except:
a. William Shockley   b. John Bardeen   c. Walter Brattain   d. Joseph Marie Jacquard
18. They developed the Google search engine at Stanford University.
a. William Shockley and John Page   b. Walter Brin and Larry Book   c. Sergey Brin and Larry Page   d. William Brin and Joseph Page
19. The first portable computer with the familiar flip form factor.
a. Gavilan SC   b. Gavilan Mac Pro   c. Gavilan SD   d. Gavilan PG
20. The computer which features advanced audio and video capabilities.
a. Gavilan SD   b. Amiga 1000   c. Altair 8080   d. Radio Shack TRS
21. The person behind the first computer mouse, created in 1964.
a. Douglas Engelbart   b. Jack Kilby   c. Robert Noyce   d. Grace Hopper
22. It is used to store the operating system of a computer and allows data to be shared among computers, invented by Alan Shugart and his team in 1971.
a. Intel microprocessor   b. USB   c. Floppy Disk   d. OS
23. He developed Ethernet for connecting multiple computers and other hardware.
a. Robert Metcalfe   b. Douglas Engelbart   c. Robert Noyce   d. Grace Hopper
24. The following are all personal computers that existed in 1974-1977, except:
a. Scelbi   b. Mark-8 Altair   c. IBM 500   d. Radio Shack's TRS-180
25. It is the world's first minicomputer.
a. Gavilan SC   b. Mark-8 Altair   c. Altair 808   d. Radio Shack's TRS-180
26. The first computerized spreadsheet program, used by accountants in 1978.
a. VisiCalc   b. Altair 8080   c. VisCalc   d. Altair 8800
27. A personal computer that has an Intel chip, two floppy disks and an optional color monitor.
a. Altair 8080   b. Radio Shack's TRS-80   c. Acorn   d. IBM
28. A small chip that can function as an amplifier, oscillator, timer, microprocessor, or even computer memory.
a. Transistor   b. Vacuum tubes   c. Integrated circuits   d. None of the above
29. It was the first important algorithmic language, designed in 1957 by an IBM team led by John Backus.
a. COBOL   b. C++   c. FORTRAN   d. ALGOL
30. The central concept of the modern computer was based on his ideas.
a. Charles Babbage   b. Alan Turing   c. Joseph Marie Jacquard   d. Herman Hollerith
31. The computer that allowed non-geeks to write programs and make the computer do what they wished.
a. TRS-80   b. Altair 808   c. Acorn   d. IBM

Lesson 1.3 FOUR BASIC COMPUTER PERIODS

Information technology has been around for a long time. Basically, as long as people have been around, information technology has been around, because there were always ways of communicating through the technology available at that point in time. There are four main ages that divide up the history of information technology. Only the latest age (electronic) and some of the electromechanical age really affect us today, but it is important to learn about how we got to the point we are at with technology today.

1.3.1 Pre-mechanical Age (3000 B.C. - 1450 A.D.)

The pre-mechanical age is the earliest age of information technology. In 3000 B.C., the Sumerians in Mesopotamia (what is today southern Iraq) devised a writing system.
The system, called cuneiform, used signs corresponding to spoken sounds, instead of pictures, to express words. When humans first started communicating, they would try to use language or simple pictures or drawings known as petroglyphs, which were usually carved in rock. Early alphabets were developed, such as the Phoenician alphabet.

As alphabets became more popular and more people were writing information down, pens and paper began to be developed. Writing started off as just marks in wet clay, but later paper was created out of the papyrus plant. The most popular kind of paper was probably that made by the Chinese, who made paper from rags.

Now that people were writing a lot of information down, they needed ways to keep it all in permanent storage. This is where the first books and libraries were developed. Religious leaders in Mesopotamia kept the earliest "books", collections of rectangular clay tablets inscribed with cuneiform and packaged in labeled containers, in their personal "libraries." The Egyptians kept scrolls, sheets of papyrus wrapped around a shaft of wood. Around 600 B.C., the Greeks began to fold sheets of papyrus vertically into leaves and bind them together. The dictionary and encyclopedia made their appearance about the same time. The Greeks are also credited with developing the first truly public libraries around 500 B.C.

Also during this period came the first numbering systems. Around 100 A.D., the first 1-9 system was created by people from India. However, it wasn't until 875 A.D. (775 years later) that the number zero (0) was invented. And now that numbers existed, people wanted things to do with them, so they created calculators. The calculator was the very first sign of an information processor. The popular model of that time was the abacus.

1.3.2 Mechanical Age (1450 - 1840)

The mechanical age is when we first start to see connections between our current technology and its ancestors. A lot of new technologies were developed in this era, as there was a large explosion of interest in the area.

Johann Gutenberg in Mainz, Germany, invented the movable metal-type printing process in 1450 and sped up the process of composing pages from weeks to a few minutes. The printing press made written information much more accessible to the general public by reducing the time and cost it took to reproduce written material.

In the early 1600s, William Oughtred, an English clergyman, invented the slide rule, a device that allowed the user to multiply and divide by sliding two pieces of precisely machined and scribed wood against each other. The slide rule is an early example of an analog computer, an instrument that measures instead of counts.

Blaise Pascal, a French mathematician, invented the Pascaline around 1642. It was a very popular mechanical computer; it used a series of wheels and cogs to add and subtract numbers.

An eccentric English mathematician named Charles Babbage, frustrated by mistakes, set his mind to create a machine that could both calculate numbers and print the results. In the 1820s, he was able to produce a working model of his first attempt, which he called the Difference Engine; the name was based on a method of solving mathematical equations called the "method of differences". Made of toothed wheels and shafts turned by a hand crank, the machine could do computations and create charts showing the squares and cubes of numbers. He had plans for a more complex Difference Engine but was never able to actually build it because of difficulties in obtaining funds, though he did create and leave behind detailed plans.

Designed during the 1830s by Babbage, the Analytical Engine had parts remarkably similar to modern-day computers. For instance, the Analytical Engine was to have a part called the "store," which would hold the numbers that had been inputted and the quantities that resulted after they had been manipulated. Babbage also planned to use punch cards to direct the operations performed by the machine, an idea he picked up from seeing the results that a French weaver named Joseph Jacquard had achieved using punched cards to automatically control the patterns that would be woven into cloth by a loom.

Lady Augusta Ada Byron helped Babbage design the instructions that would be given to the machine on punch cards and helped describe, analyze, and publicize his ideas. She has been called the "first programmer". Babbage eventually was forced to abandon his hopes of building the Analytical Engine, once again because of a failure to find funding.

There were lots of different machines created during this era, and while we had not yet gotten to a machine that can do more than one type of calculation at once, like our modern-day calculators, we are still learning about how all of our all-in-one machines started. Also, if you look at the size of the machines invented in this time compared to the power behind them, it seems (to us) absolutely ridiculous to understand why anybody would want to use them, but to the people living in that time all of these inventions were huge.

1.3.3 Electromechanical Age (1840 - 1940)

Now we are finally getting close to some technologies that resemble our modern-day technology. The discovery of ways to harness electricity was the key advance made during this period. Knowledge and information could now be converted into electrical impulses. These are the beginnings of telecommunication. The discovery of a reliable method of creating and storing electricity, with the voltaic battery, at the end of the 18th century made possible a whole new method of communicating information.

The telegraph was created in the early 1800s. It is the first major invention to use electricity for communication purposes, and it made it possible to transmit information over great distances with great speed.

Morse code was created by Samuel Morse in 1835. Morse devised a system that broke down information (in this case, the alphabet) into bits (dots and dashes) that could then be transformed into electrical impulses and transmitted over a wire, just as today's digital technologies break down information into zeros and ones.
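Morse's idea, that any message can be reduced to two simple symbols, is easy to see in code. The following minimal C sketch (an illustration only, with a deliberately incomplete lookup table) maps a few letters to dots and dashes, much as digital systems map information to 0s and 1s.

```c
#include <stdio.h>

/* Tiny, incomplete Morse table, for illustration only. */
const char *morse(char c) {
    switch (c) {
    case 'S': return "...";
    case 'O': return "---";
    case 'E': return ".";
    default:  return "?";      /* letters not in this small table */
    }
}

int main(void) {
    const char *message = "SOS";
    for (int i = 0; message[i] != '\0'; i++) {
        printf("%c -> %s\n", message[i], morse(message[i]));
    }
    return 0;
}
```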
The company that he founded to manufacture and sell it eventually developed into the International Business Machines Corporation (IBM). Howard Aiken, a Ph.D. student at Harvard University, decided to try to combine Hollerith's punched card technology with Babbage's dreams of a general- purpose, "programmable" computing machine. With funding from IBM, he built a machine known as the Mark I, which used paper tape to supply instructions (programs) to the machine tor manipulating data (input on paper punch cards), counters to store numbers, and electromechanical relays to help register results. 1.3.4 Electronic Age (1940 - Present) The electronic age is what we currently live in. It can be defined as the time between 1940 and right now. The Electronic Numerical Integrator and Computer (ENIAC) was the first high-speed, digital computer capable of being reprogrammed to solve a full range of computing problems. This computer was designed to be used by the U.S. Army for artillery firing tables. This machine was even bigger than the Mark 1 taking up 680 square feet and weighing 30 tons - HUGE. It mainly used vacuum tubes to do its calculations. There are 4 main sections of digital computing. The first was the era of vacuum tubes and punch cards like the ENIAC and Mark 1. Rotating magnetic drums were used for internal storage. The second generation replaced vacuum tubes with transistors, punch cards were replaced with magnetic tape, and rotating magnetic drums were replaced by magnetic cores for internal storage. Also during this time high-level programming languages were created such as FORTRAN and COBOL. The third generation replaced transistors with integrated circuits, magnetic tape was used throughout all computers, and magnetic core turned into metal oxide semiconductors. An actual operating system showed up around this time along with the advanced programming language BASIC. The fourth and latest generation brought in CPUs (central processing units) which contained memory, 31 logic, and control circuits all on a single chip. The personal computer was developed (Apple II). The graphical user interface (GUI) was developed. 32 Assessment Exercise 2 (Four Basic Computer Periods) Exercise no. ____________________________Date: _________________________ Name:__________________________________Student ID number: ____________ Course, Year and Section: _________________Score:________________________ Instruction: Choose the letter of the correct answer. 1. A writing system invented by the Sumerians. a. Abacus b. Cuneiform c. Petroglyths d. Phoenician alphabet 2. Religious leaders in Mesopotamia kept the earliest "books"" a collection of rectangular clay tablets, inscribed with cuneiform and packaged in labeled containers. a. True b. False 3. He invented the Slide Rule. a. Charles Babbage b. Johann Gutenberg c. William Oughtred d. Blaise Pascal 4. It is the first major invention to use electricity for communication purposes and made it possible to transmit information over great distances with great speed. * a. Morse Code b. Telegraph c. Telephone d. Voltaic Battery 5. It was created by Alexander Graham Bell in 1876 a. Morse Code b. Telegraph c. Telephone d. Voltaic Battery 6. Lady Augusta Ada Byron helped Babbage design the instructions that would be given to the machine on punch cards and to describe, analyze, and publicize his ideas. 33 a. True b. False 7. The following technologies are invented during electromechanical age except: * a. Voltaic Battery b. ENIAC c. Mark I d. Radio 8. 
8. ENIAC stands for:
a. Electronic Numeric Integrator and Computer   b. Electronic Numerical Integrator and Computer   c. Electronic Numeric Integrator and Calculator   d. Electronic Numerical Integrator and Calculator
9. The fourth generation was based on integrated circuits.
a. True   b. False
10. The ______ generation of computers started with using vacuum tubes as the basic components.
a. 1st   b. 2nd   c. 3rd   d. 4th

Lesson 1.4 Classifications of Computers

Computers can be classified based on size and computing power. However, as technology advances, these classifications tend to overlap, as modern computers have become smaller yet more powerful, and relatively cheaper.

GENERAL CLASSIFICATIONS OF COMPUTERS

Personal Computer: a small, single-user computer based on a microprocessor. In addition to the microprocessor, a personal computer has a keyboard for entering data, a monitor for displaying information, and a storage device for saving data.

Workstation: a powerful, single-user computer. A workstation is like a personal computer, but it has a more powerful microprocessor and a higher-quality monitor.

Minicomputer: a mid-size computer that is less powerful and cheaper than a mainframe. It is a multi-user computer capable of supporting from 10 to hundreds of users simultaneously. It is also referred to as a mid-range server, and it is used in factories for process control, inventory and manufacturing control.

Mainframe: a powerful multi-user computer capable of supporting hundreds or thousands of users simultaneously.

Supercomputer: an extremely fast computer that can perform millions of instructions per second. Example: NASA's Pleiades supercomputer, which has 245,536 CPU cores and a total memory of 935 TB.

Types of Computers

Desktop computers are computers designed to be placed on a desk, and are normally made up of a few different parts, including the computer case, central processing unit (CPU), monitor, keyboard, and mouse.

Laptop computers are battery-powered computer devices whose portability makes it possible to use them almost anytime, anywhere.

Tablet computers are hand-held computers with touch-sensitive screens for typing and navigation.

Smartphones are hand-held telephones which can do things that computers can do, including browsing and searching the internet and even playing console games.

Wearables include fitness trackers and smartwatches that can be worn throughout the day.

Smart TVs are the latest television sets that include applications present in computers. For example, they can be used as a computer monitor and gaming monitor.
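One way to summarize the general classifications above is as structured data. The sketch below is written in C for consistency with the other examples in this material, with invented field names; it simply restates each class alongside its typical number of simultaneous users.

```c
#include <stdio.h>

/* Illustrative summary of the general classifications of computers. */
struct computer_class {
    const char *name;
    const char *users;          /* typical simultaneous users */
};

int main(void) {
    struct computer_class classes[] = {
        { "Personal computer", "1 (single-user)" },
        { "Workstation",       "1 (single-user, more powerful)" },
        { "Minicomputer",      "10 to hundreds" },
        { "Mainframe",         "hundreds to thousands" },
        { "Supercomputer",     "built for raw speed" },
    };

    for (int i = 0; i < 5; i++) {
        printf("%-18s %s\n", classes[i].name, classes[i].users);
    }
    return 0;
}
```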
Assessment Exercise 3 (Classification of Computers)
Exercise no. ______________ Date: ______________
Name: ______________ Student ID number: ______________
Course, Year and Section: ______________ Score: ______________

Instruction: Choose the letter of the correct answer.

1. It is a multi-user computer capable of supporting from 10 to hundreds of users simultaneously.
a. Personal Computer   b. Mainframe   c. Supercomputer   d. Minicomputer
2. It is a powerful multi-user computer capable of supporting hundreds or thousands of users simultaneously.
a. Personal Computer   b. Minicomputer   c. Supercomputer   d. Mainframe
3. It is an extremely fast computer that can perform millions of instructions per second.
a. Supercomputer   b. Mainframe   c. Minicomputer   d. Personal Computer
4. It is a single-user computer based on a microprocessor.
a. Personal Computer   b. Workstation   c. Minicomputer   d. Mainframe
5. It is a single-user computer but has a more powerful microprocessor and a higher-quality monitor than a personal computer.
a. Personal Computer   b. Workstation   c. Minicomputer   d. Mainframe
6. It is referred to as a mid-range server.
a. Mainframe   b. Minicomputer   c. Workstation   d. Supercomputer
7. These are the latest television sets that include applications present in computers.
a. Wearables   b. Tablets   c. Smart TVs   d. Smartphones
8. These include fitness trackers and smartwatches that can be worn throughout the day.
a. Smart TVs   b. Smartphones   c. Wearables   d. Tablets
9. These are hand-held telephones that can do things that computers can do, including browsing and searching the internet and even playing console games.
a. Desktop   b. Laptop   c. Smart TVs   d. Smartphones
10. These are computers designed to be placed on a desk, and are normally made up of a few different parts, including the computer case, CPU, monitor, keyboard, and mouse.
a. Smartphones   b. Laptop   c. Smart TVs   d. Desktop

Lesson 1.5 Evolution of Information Technology

The evolution of IT began in the 1970s.

Technological Evolution

Since World War II, the performance capabilities of computers and telecommunications have been doubling every few years at constant cost. For example, a decade ago $3,500 could buy a new Apple II microcomputer. Today, $6,800 (the same amount of purchasing power, adjusted for 10 years of inflation) can buy a new Macintosh II microcomputer. The Macintosh handles 4 times the information at 16 times the speed, preprogrammed and reprogrammable memory are both about 20 times larger, disk storage is about 90 times larger, and the display has 7 times the resolution and 16 times the number of colors. Comparable figures could be cited for other brands of machines. Equally impressive, users' demands for this power have increased as rapidly as it has become available.

Over the next two decades, data processing and information systems will probably be replaced by sophisticated devices for knowledge creation, capture, transfer, and use. A similar evolution can be forecast for telecommunications: personal video recorders, optical fiber networks, intelligent telephones, information utilities such as videotex, and digital discs will change the nature of media.

Cognition Enhancers

The concept of "cognition enhancers" can help us understand how we can use these emerging technologies. A cognition enhancer combines the complementary strengths of a person and an information technology. Two categories of cognition enhancers will have considerable impact on the workplace: empowering environments and hypermedia.

Empowering Environments

Empowering environments enhance human accomplishment by a division of labor: the machine handles the routine mechanics of a task, while the person is immersed in its higher-order meanings. The workplace is adopting many empowering environments: databases for information management, spreadsheets for modeling, computer-aided design systems for manufacturing. And word processors with embedded spelling checkers, thesauruses, outliners, text analyzers, and graphics tools are driving the evolution of a new field: desktop publishing.

Hypermedia

Hypermedia is a framework for creating an interconnected, web-like representation of symbols (text, graphics, images, software codes) in the computer. This representation is similar to human long-term memory: people store information in networks of symbolic, temporal, and visual images related by association.
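Hypermedia's web-like representation is essentially a set of nodes joined by associative links. The following minimal C sketch (invented names, illustration only) shows that data structure with a few symbols borrowed from this lesson.

```c
#include <stdio.h>

/* A node holds a symbol (text, image name, etc.) and links to
   associated nodes, as in a hypermedia web. */
struct node {
    const char *symbol;
    struct node *links[4];      /* associative links (up to 4 here) */
    int link_count;
};

int main(void) {
    struct node turing  = { "Alan Turing", {0}, 0 };
    struct node machine = { "Turing machine", {0}, 0 };
    struct node eniac   = { "ENIAC", {0}, 0 };

    /* associate related symbols with one another */
    turing.links[turing.link_count++] = &machine;
    turing.links[turing.link_count++] = &eniac;

    printf("%s is linked to:\n", turing.symbol);
    for (int i = 0; i < turing.link_count; i++) {
        printf("  - %s\n", turing.links[i]->symbol);
    }
    return 0;
}
```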
Evolution of Software Applications

The advancement of hardware alone was not sufficient to change the human lifestyle; it had to be supported by software and software applications. Let us see how software applications evolved over time.

Command Line Programs (1980s) - The first generation of software applications included compilers, device drivers, and similar tools, which were mainly command line programs.

Desktop Applications (1990s) - With the popularity of the graphical user interface (GUI), GUI-based desktop applications of many types and forms were released: office applications, audio and video players, utility programs, browsers, and more.

Web Applications (21st century) - With the availability of the web, the next generation of applications was developed with the World Wide Web in mind, so that they could be accessed from any location over the internet. The most popular web applications include email clients such as Gmail and Ymail, and social networking platforms such as Facebook, Twitter, Instagram, Pinterest, and Quora.

Mobile Applications (21st century) - The advance of computer technology has made smartphones affordable. The most popular mobile application development platforms are iOS, Android, and Windows, which are also the most popular mobile operating systems.

Evolution of Programming Languages

Software is developed using various programming languages. Programming started with machine language and evolved into new-age programming systems.

1st generation programming language (1GL) - Early programming was done in machine language, so machine language is the first generation programming language.

2nd generation programming language (2GL) - Also called assembly language programming; it is easy for the computer to process but still difficult for programmers to write.

3rd generation programming language (3GL) - Closer to ordinary English and hence easier for programmers to understand. Also called High-Level Languages (HLLs).

4th generation programming language (4GL) - Closer to natural language than 3GLs.

5th generation programming language (5GL) - Used mainly in artificial intelligence research.

ASSESSMENT

Exercise 4. (EVOLUTION OF INFORMATION TECHNOLOGY)

Exercise no. _____________________________ Date: ________________________
Name: __________________________________ Student ID number: ____________
Course, Year and Section: _________________ Score: _______________________

Instruction: Choose the letter of the correct answer.

1. They enhance human accomplishment by a division of labor: the machine handles the routine mechanics of a task.
a. Empowering environments
b. Hypermedia

2. A framework for creating an interconnected, web-like representation of symbols.
a. Hypermedia
b. Empowering environments

3. It is closer to natural language than 3GLs.
a. 4GL
b. 5GL

4. It is the first generation programming language.
a. Assembly language
b. Machine language

5. It is used mainly in artificial intelligence research.
a. 4GL
b. 5GL

6. The first generation of software applications, which included compilers, device drivers, etc.
a. Desktop application
b. Command line programs

7. These were developed so that they can be accessed from any location over the Internet.
a. Web application
b. Desktop application

8. It combines the complementary strengths of a person and an information technology.
a. Hypermedia
b. Cognition enhancers

9. It is more like ordinary English and hence easier for programmers to understand.
a. 2GL
b. 3GL

10. The evolution of IT began in what year?
a. 1970
b. 1960
Lesson 1.6 Evolution of Media

The media has transformed itself based on two things: (1) how information is presented; and (2) how the connection is established.

Woodcut printing on cloth or on paper was used in the early 15th century. It was in 1436 when Johannes Gutenberg started working on a printing press which used relief printing and a molding system. Now, the modern printing press delivers messages in print, such as newspapers, textbooks, and magazines.

Figure 1. Modern Printing Press

In the 1800s, the telegraph was developed, followed by the telephone, which made two-way communication possible. Message sending and receiving could now be done both ways simultaneously.

Figure 2. Telegraph in the 1800s

At the beginning of the 1900s, broadcasting and recorded media were introduced. Radio and television were used to send sound and video to homes and offices through the electromagnetic spectrum, or radio waves. Audio (lower frequency band) or video (higher frequency band) content can be received depending on the frequency used. Later on, the combination of audio and video information made the audience's viewing experience more exciting. Films and movies became popular as they catered to larger audiences.

As communication devices evolved and became pervasive, so did information distribution. A photo taken using a smartphone can immediately be uploaded and shared on Facebook, Twitter, or Instagram. Community websites such as OLX.ph, a Philippine counterpart of ebay.com, let their users buy and sell items online, eliminating the need to go to physical stores. In line with this development, the audience, regardless of their professions, can now interact with one another and are no longer disconnected. News sites can even get news stories, for example, from Twitter or other social media sites.

According to Claudine Beaumont, an author from The Telegraph, one good example of this happened on January 15, 2009, when dozens of New Yorkers sent 'tweets' about a plane crash in the city. News about US Airways Flight 1549, which was forced to land in the Hudson River in Manhattan, USA, immediately spread all over the country. All of the plane's engines shut down when it struck a flock of geese minutes after take-off from New York's LaGuardia Airport.

Figure 3 shows one of the first photos taken by a Twitter user, Jānis Krūms, showing the downed plane with survivors standing on its wings waiting for rescue. It was instantly forwarded across Twitter and used by numerous blogs and news websites, causing the TwitPic service to crash due to multiple views. In this regard, Twitter users were able to break the news of the incident around 15 minutes before the mainstream media alerted the public about the crash.

Figure 3. A screenshot of Jānis Krūms' tweet about a plane crash in the Hudson

This is a typical example of how individuals can now deliver content to everyone, and connections are no longer controlled by professionals.

What Does Media Do for Us?

Media fulfills several basic roles in our society. One obvious role is entertainment. Media can act as a springboard for our imaginations, a source of fantasy, and an outlet for escapism. In the 19th century, Victorian readers disillusioned by the grimness of the Industrial Revolution found themselves drawn into fantastic worlds of fairies and other fictitious beings.
In the first decade of the 21st century, American television viewers could peek in on a conflicted Texas high school football team in Friday Night Lights; the violence-plagued drug trade in Baltimore in The Wire; a 1960s-Manhattan ad agency in Mad Men; or the last surviving band of humans in a distant, miserable future in Battlestar Galactica. Through bringing us stories of all kinds, media has the power to take us away from ourselves.

Media can also provide information and education. Information can come in many forms, and it may sometimes be difficult to separate from entertainment. Today, newspapers and news-oriented television and radio programs make available stories from across the globe, allowing readers or viewers in London to access voices and videos from Baghdad, Tokyo, or Buenos Aires. Books and magazines provide a more in-depth look at a wide range of subjects. The free online encyclopedia Wikipedia has articles on topics from presidential nicknames to child prodigies to tongue twisters in various languages. The Massachusetts Institute of Technology (MIT) has posted free lecture notes, exams, and audio and video recordings of classes on its OpenCourseWare website, allowing anyone with an Internet connection access to world-class professors.

Another useful aspect of media is its ability to act as a public forum for the discussion of important issues. In newspapers or other periodicals, letters to the editor allow readers to respond to journalists or to voice their opinions on the issues of the day. These letters were an important part of U.S. newspapers even when the nation was a British colony, and they have served as a means of public discourse ever since. The Internet is a fundamentally democratic medium that allows everyone who can get online the ability to express their opinions through, for example, blogging or podcasting—though whether anyone will hear is another question.

Similarly, media can be used to monitor government, business, and other institutions. Upton Sinclair's 1906 novel The Jungle exposed the miserable conditions in the turn-of-the-century meatpacking industry; and in the early 1970s, Washington Post reporters Bob Woodward and Carl Bernstein uncovered evidence of the Watergate break-in and subsequent cover-up, which eventually led to the resignation of President Richard Nixon. But purveyors of mass media may be beholden to particular agendas because of political slant, advertising funds, or ideological bias, thus constraining their ability to act as a watchdog. In summary, media fulfills the following roles:

Entertaining and providing an outlet for the imagination
Educating and informing
Serving as a public forum for the discussion of important issues
Acting as a watchdog for government, business, and other institutions

It's important to remember, though, that not all media are created equal. While some forms of mass communication are better suited to entertainment, others make more sense as a venue for spreading information. In terms of print media, books are durable and able to contain lots of information, but are relatively slow and expensive to produce; in contrast, newspapers are comparatively cheaper and quicker to create, making them a better medium for the quick turnover of daily news. Television provides vastly more visual information than radio and is more dynamic than a static printed page; it can also be used to broadcast live events to a nationwide audience, as in the annual State of the Union address given by the U.S. president.
However, it is also a one-way medium—that is, it allows for very little direct person-to-person communication. In contrast, the Internet encourages public discussion of issues and allows nearly everyone who wants a voice to have one. However, the Internet is also largely unmoderated. Users may have to wade through thousands of inane comments or misinformed amateur opinions to find quality information.

The 1960s media theorist Marshall McLuhan took these ideas one step further, famously coining the phrase "the medium is the message (McLuhan, 1964)." By this, McLuhan meant that every medium delivers information in a different way and that content is fundamentally shaped by the medium of transmission. For example, although television news has the advantage of offering video and live coverage, making a story come alive more vividly, it is also a faster-paced medium. That means more stories get covered in less depth. A story told on television will probably be flashier, less in-depth, and with less context than the same story covered in a monthly magazine; therefore, people who get the majority of their news from television may have a particular view of the world shaped not by the content of what they watch but its medium. Or, as computer scientist Alan Kay put it, "Each medium has a special way of representing ideas that emphasize particular ways of thinking and de-emphasize others (Kay, 1994)." Kay was writing in 1994, when the Internet was just transitioning from an academic research network to an open public system. A decade and a half later, with the Internet firmly ensconced in our daily lives, McLuhan's intellectual descendants are the media analysts who claim that the Internet is making us better at associative thinking, or more democratic, or shallower. But McLuhan's claims don't leave much space for individual autonomy or resistance.

In an essay about television's effects on contemporary fiction, writer David Foster Wallace scoffed at the "reactionaries who regard TV as some malignancy visited on an innocent populace, sapping IQs and compromising SAT scores while we all sit there on ever fatter bottoms with little mesmerized spirals revolving in our eyes…. Treating television as evil is just as reductive and silly as treating it like a toaster with pictures (Wallace, 1997)." Nonetheless, media messages and technologies affect us in countless ways, some of which probably won't be sorted out until long in the future.

Key Takeaways

Media fulfills several roles in society, including the following: entertaining and providing an outlet for the imagination, educating and informing, serving as a public forum for the discussion of important issues, and acting as a watchdog for government, business, and other institutions.

Johannes Gutenberg's invention of the printing press enabled the mass production of media, which was then industrialized by Friedrich Koenig in the early 1800s. These innovations led to the daily newspaper, which united the urbanized, industrialized populations of the 19th century.

In the 20th century, radio allowed advertisers to reach a mass audience and helped spur the consumerism of the 1920s—and the Great Depression of the 1930s. After World War II, television boomed in the United States and abroad, though its concentration in the hands of three major networks led to accusations of homogenization. The spread of cable and subsequent deregulation in the 1980s and 1990s led to more channels, but not necessarily to more diverse ownership.
Transitions from one technology to another have greatly affected the media industry, although it is difficult to say whether technology caused a cultural shift or resulted from it. The ability to make technology small and affordable enough to fit into the home is an important aspect of the popularization of new technologies.

Assessment

Type of Exam: Essay
Mode of Exam: It may be handwritten or in encoded form.
Question (5 points each):
1. How does the evolution of media affect communication in:
a. Society
b. Education
c. Government, Business, and other Institutions

Lesson 1.7. Media in the Digital Age

Media normally refers to the means of communication that use unique tools to interconnect among people. The forms of media include television, radio, cellular phones, and the internet (which involves the access and use of various social media sites such as Facebook, Twitter, Instagram, and YouTube, among others). In the digital age, however, media can be considered as the message, the medium, and the messenger.

The Message

Media is considered to be the message itself for those who create and own the rights to content. The forms of content can be user-generated or professionally produced. User-generated content (UGC) is a form of content created and owned by the users of a system. UGC has grown exponentially, especially with wide internet coverage and easy WiFi access, increased social media participation, and affordable smart devices.

Media as creative content

Below is one of the many examples of media tools used, especially by millennials, to generate content.

Blog

Keeping a diary or a journal is a traditional method of recording one's thoughts or expressing one's emotions through writing. With the advent of the internet, expressing one's feelings and thoughts was given a new concept through online writing, or blogging. A blog is a combination of two words: web and log. It works the same way as pen and paper would, but privacy becomes irrelevant given that a blog post can be seen by anyone online. A person who writes blogs is called a blogger. One of the TV personalities in the Philippines who also does blogging to further express feelings, thoughts, and opinions is Maine Mendoza, also known as "Yaya Dub," in the Kalyeserye segment of the noontime show "Eat Bulaga."

The artist Maine "Yaya Dub" Mendoza as a blogger

Most recently, blogs have evolved into microblogs and video blogs. Microblogs have become popular due to the rise of Tumblr and Twitter, in which users instantly share limited content or short messages. Now, with the increasing number of YouTube viewers, people have explored uploading their own videos, ranging from travels and product reviews to any other random topics. Some vloggers, YouTubers, or social media influencers even make money based on the number of views they get and on the products they feature in their video blogs.

The Medium

The medium refers to the tool or tools used in sending a message from the source to the destination. Traditionally, for example, professionals send messages to the audience, such as a news anchor delivering the news on TV and/or radio. However, with the latest technologies, the so-called social media has become an avenue for information dissemination, even taking over news sites in bringing the latest or most up-to-date news. One such example is Twitter. Users of this application can constantly update other Twitter users about a certain topic.
On this platform, Twitter users can also share other users' content (in the form of retweeting) and discuss any topic with one another. Thus, the media, instead of just being an avenue for delivering messages, is becoming increasingly social, with the audience themselves creating their own content and interacting with one another.

The Messenger

The messenger is the one who delivers the message. This is why broadcasters, for example, being the messengers of news, are called "media." In the digital space, however, does "media" also refer to social network users who create content themselves but are not professional journalists? Hence, although the media can be the message, the medium, and the messenger, in the digital age the demarcation lines between them are somewhat blurry.

Assessment Task

QUIZ
Topics: Media in the Digital Age
Type of Exam: Essay – Give at least three (3) advantages and disadvantages of the digital age. (2 points each)

References

Online Sources:
https://www.livescience.com/20718-computer-history.html
https://dfarq.homeip.net/compaq-deskpro-386/
https://www.tutorialspoint.com/microprocessor/microprocessor_overview.htm
https://www.yourhtmlsource.com/starthere/whatishtml.html
https://www.darpa.mil/program/molecular-informatics
https://www.vistacollege.edu/blog/careers/it/trends-in-information-technology-for-2019/
"Why is it important to study the history of computers?" eNotes Editorial, 18 June 2018, https://www.enotes.com/homework-help/why-important-study-history-computers-1318216. Accessed 21 Aug. 2020.
https://www.webopedia.com/DidYouKnow/Hardware_Software/FiveGenerations.asp
https://www.geeksforgeeks.org/generations-of-computer/
https://www.britannica.com/technology/computer-programming-language#ref134615
https://openbookproject.net/courses/intro2ict/history/history.html
http://informationtechnoluogy.blogspot.com/#:~:text=Four%20basic%20periods%2C%20each%20characterized,Electromechanical%2C%20and.
https://www.google.com/search?q=modern+printing+press+machine&tbm=isch&safe=strict&bih=657&biw=1366&safe=strict&hl=en&sa=X&ved=2ahUKEwjU-NXc7rjrAhUTTZQKHdGiDOkQrNwCKAB6BQgBEJUC#imgrc=K--DPGbkPwjwPM
https://www.google.com/search?q=modern+telegraph+machine&tbm=isch&ved=2ahUKEwic9cKF87jrAhUwHKYKHcRBDfEQ2-cCegQIABAA&oq=modern+te&gs_lcp=CgNpbWcQARgBMgQIABBDMgQIABBDMgUIABCxAzICCAAyAggAMgUIABCxAzICCAAyAggAMgIIADICCAA6BAgjECc6BwgAELEDEENQ9BtYhyZg6TVoAHAAeAGAAc0BiAHOCJIBBTEuNy4xmAEAoAEBqgELZ3dzLXdpei1pbWfAAQE&sclient=img&ei=VllGX5zyILC4mAXEg7WIDw&bih=657&biw=1366&safe=strict#imgrc=lSJ6ABctfFuFVM
https://www.google.com/search?q=J%C4%81nis+Kr%C5%ABms+tweet+about+a+plane+crash+in+hudson&tbm=isch&ved=2ahUKEwjN4uqri7nrAhWHHaYKHVHPDpgQ2-cCegQIABAA&oq=J%C4%81nis+Kr%C5%ABms+tweet+about+a+plane+crash+in+hudson&gs_lcp=CgNpbWcQAzoECAAQGFDB3wVYkOoGYKvsBmgBcAB4AIABlwGIAbkckgEEMzMuNZgBAKABAaoBC2d3cy13aXotaW1nwAEB&sclient=img&ei=0XJGX422FIe7mAXRnrvACQ&bih=657&biw=1366&safe=strict&hl=en#imgrc=kBcEGPdfcfMToM
https://open.lib.umn.edu/mediaandculture/chapter/1-3-the-evolution-of-media/
https://www.google.com/search?q=yaya+dub+maine+mendoza+bora+escapade&tbm=isch&ved=2ahUKEwiHqOzQ67jrAhUlKqYKHf1rD-EQ2-cCegQIABAA&oq=yaya+dub+maine+mendoza+bora+escapade&gs_lcp=CgNpbWcQDDoFCAAQsQM6AggAOgQIABBDOgcIABCxAxBDOgQIABAeUNi70AFY4bvRAWDk-dEBaAVwAHgAgAGlAYgBphWSAQQyMy41mAEAoAEBqgELZ3dzLXdpei1pbWfAAQE&sclient=img&ei=kVFGX8ebAaXUmAX9172IDg&bih=789&biw=1600&safe=strict#imgrc=9YDXVMs1VcG9IM
https://www.google.com/search?q=media+in+the+digital+age&tbm=isch&ved=2ahUKEwjQz8rJ-rjrAhUkNaYKHQsgAJMQ2-cCegQIABAA&oq=media+in+the+digital+age&gs_lcp=CgNpbWcQAzIECAAQGDIECAAQGDIECAAQGDoICAAQsQMQgwE6BAgAEEM6AggAOgYIABAFEB46BggAEAgQHlDF0QFYzOsBYJXxAWgAcAB4AIABugGIAbcOkgEEMTEuN5gBAKABAaoBC2d3cy13aXotaW1nwAEB&sclient=img&ei=PGFGX9DBE6TqmAWLwICYCQ&bih=789&biw=1600&safe=strict#imgrc=obmgGr174wFe2M

Book:
Caoili-Tayuan, R. R., & Eleazar, M. V. (2019). Living in the Information Technology Era. C & E Publishing Inc.