Lesson 1.1: History of Computer
GEC-LIE: Living in the IT Era
Stephanie Loise Tanedo

Chapter 1: Intro to Information and Communication Technology

Course Learning Outcomes
Create a brief history of information technology.

Intended Learning Outcomes
1. Summarize key events in the history of information technology;
2. Discover the milestones of information and communication technology and their impact and issues; and
3. Explain the role of technology in media and how it affects communication.

INTRODUCTION

The rise of information and communication technologies (ICT) – that is, computers, software, telecommunications and the internet – and the large impact that these new technologies are having on the way society functions have prompted many to claim that we have entered a new era, often referred to as the ‘Third Industrial Revolution’, the ‘information age’ or the ‘new economy’.

There are many technological advances that many of us assume have been with us forever, and for some of us, technologies like computers have been around longer than we have. Most of us are more interested in advances or future enhancements than in the past. After all, why is the history of a piece of technology important now?

The study of the history of any technology is crucial because the original design often influences a future design that is not always related to, or even in the same field as, the original. For instance, the punched-card design was first used to compile health information and was then adopted to tabulate the 1890 Census. By the 1950s, punched cards had created the need for concise computer languages to operate the machines. Meanwhile, transistors and circuits became more sophisticated, and inventors were able to adapt the punched-card approach to electronic circuits. From the humble loom to the supercomputer, simplicity of design leads to the adaptation of the original idea to fit new applications. The study of history is how we rethink the old and find new applications.

In this course, we will learn the following:
1) History of Computer
2) Generation of Computers
3) Four basic computer periods
4) Classification of Computers
5) Evolution of Information Technology
6) Evolution of Media
7) Media in the Digital Age

1.1 HISTORY OF COMPUTER

“The more you know about the past, the better prepared you are for the future.” ~Theodore Roosevelt

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms. Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games, and stream multimedia in addition to crunching numbers.

It seems a real miracle that the first digital computer in the world, which embodied in its mechanical and logical details just about every major principle of the modern digital computer, was designed as early as the 1830s. This was done by the great Charles Babbage, and the machine is called the Analytical Engine.
The object of the machine may shortly be given thus (according to Henry Babbage, the youngest son of the inventor): it is a machine to calculate the numerical value or values of any formula or function of which the mathematician can indicate the method of solution. It is to perform the ordinary rules of arithmetic in any order as previously settled by the mathematician, and any number of times and on any quantities. It is to be absolutely automatic, the slave of the mathematician, carrying out his orders and relieving him from the drudgery of computing. It must print the results, or any intermediate result arrived at.

This is just one of the many historic events in the field of technology. Now, here is a brief timeline of the history of computers.

In 1801, in France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

In 1890, Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

In 1936, Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
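To make the idea of a Turing machine concrete, here is a minimal sketch in C. The tape contents, the single state, and the transition rule are all hypothetical illustrations chosen for this lesson, not anything Turing or the original slides specify; the machine simply flips every bit it reads until it hits a blank cell.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative sketch of a (trivial) Turing machine:
   one state that flips the symbol under the head and moves right,
   halting when a blank cell ('_') is reached. */

#define TAPE_LEN 32

int main(void) {
    char tape[TAPE_LEN];
    strcpy(tape, "10110_");                        /* '_' marks blank cells */
    int head = 0;                                   /* head starts at the left */

    while (tape[head] != '_') {                     /* halt on blank */
        tape[head] = (tape[head] == '0') ? '1' : '0';  /* transition rule */
        head++;                                     /* move head right */
    }

    printf("Final tape: %s\n", tape);               /* prints 01001_ */
    return 0;
}
```

A truly universal machine would read its transition table as data rather than hardcoding it, which is exactly the stored-program idea that modern computers inherit.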
In 1941, John Vincent Atanasoff, a professor at Iowa State University, and his graduate student Clifford Berry design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.

In 1943-1944, two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

In 1946, Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

In 1947, William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.

In 1953, Grace Hopper develops the first computer language, which eventually becomes known as COBOL. COBOL (Common Business-Oriented Language) is a high-level programming language for business applications. It was the first popular language designed to be operating-system-agnostic and is still in use in many financial and business applications today. COBOL was designed for business computer programs in industries such as finance and human resources. Unlike some high-level computer programming languages, COBOL uses English words and phrases to make it easier for ordinary business users to understand. Before COBOL, every operating system had its own associated programming language. This was a problem for companies that used multiple brands of computers, as was the case with the United States Department of Defense, which backed the COBOL project. Because of its ease of use and portability, COBOL quickly became one of the most used programming languages in the world. Although the language is widely viewed as outdated, more lines of code in active use today are written in COBOL than in any other programming language.

In 1954, the FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan. FORTRAN was the first important algorithmic language; design began in 1954 and the first compiler was delivered in 1957. It was intended for scientific computations with real numbers and collections of them organized as one- or multidimensional arrays.

In 1958, Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work. These extremely small electronics can perform calculations and store data using either digital or analog technology.

In 1964, Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to a technology that is more accessible to the general public.

In 1969, a group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.

C is a general-purpose programming language that is extremely popular, simple, and flexible. It is a machine-independent, structured programming language used extensively in a variety of applications. C is often described as the foundation of programming: if you know C, you can easily grasp other languages that build on its concepts. It is essential to have a background in computer memory mechanisms, because memory is an important aspect of working with the C programming language.
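As a small illustration of why memory matters so much in C, here is a minimal, hypothetical sketch (not from the original lesson) showing the two things C makes explicit that most higher-level languages hide: addresses accessed through pointers, and heap memory that the programmer must allocate and release.

```c
#include <stdio.h>
#include <stdlib.h>

/* Illustrative sketch: C exposes memory directly to the programmer. */
int main(void) {
    int x = 42;
    int *p = &x;                           /* p holds the address of x */
    printf("x lives at %p and holds %d\n", (void *)p, *p);

    /* Heap allocation is managed manually in C. */
    int *arr = malloc(3 * sizeof *arr);
    if (arr == NULL) return 1;             /* allocation can fail */
    for (int i = 0; i < 3; i++)
        arr[i] = i * 10;                   /* arr[i] is *(arr + i) */
    printf("arr: %d %d %d\n", arr[0], arr[1], arr[2]);

    free(arr);                             /* the programmer releases memory */
    return 0;
}
```

This is why a mental model of memory is a prerequisite for C: the language shows you addresses, allocation, and release directly instead of managing them for you.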
In 1971, Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers. A floppy disk is a magnetic storage medium for computer systems, composed of a thin, flexible magnetic disk sealed in a square plastic carrier. In order to read and write data from a floppy disk, a computer system must have a floppy disk drive (FDD). A floppy disk is also referred to simply as a floppy. Since the early days of personal computing, floppy disks were widely used to distribute software, transfer files, and create backup copies of data. When hard drives were still very expensive, floppy disks were also used to store a computer's operating system.

In 1973, Robert Metcalfe, a member of the research staff at Xerox, develops Ethernet for connecting multiple computers and other hardware.

In 1974-1977, a number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.

In 1975, the January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft. The Altair 8800 was one of the first computers available for personal use, and it revolutionized the computing world by allowing ordinary people to have their own personal computer to use as they pleased. The Altair 8800 was also responsible for the rise of Microsoft as a computing industry giant.

In 1976, Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the Apple I, the first computer with a single circuit board, according to Stanford University.

In 1977, Radio Shack's initial production run of the TRS-80 was just 3,000 units. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.

Also in 1977, Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

In 1978, accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

In 1979, word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in an email to Mike Petrie in 2000. "Additional changes included getting rid of command mode and adding a print function."

In 1981, the first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disks, and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.

In 1983, Apple's Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."

In 1985, Microsoft announces Windows, according to Encyclopedia Britannica. This was the company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

Also in 1985, the first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

In 1990, Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web. HTML is a computer language devised to allow website creation. These websites can then be viewed by anyone else connected to the Internet.
HTML is relatively easy to learn, with the basics being accessible to most people in one sitting, and quite powerful in what it allows you to create. It is constantly undergoing revision and evolution to meet the demands and requirements of the growing Internet audience.

In 1993, the Pentium microprocessor advances the use of graphics and music on PCs. A microprocessor is the controlling unit of a micro-computer, fabricated on a small chip, capable of performing ALU (Arithmetic Logic Unit) operations and communicating with the other devices connected to it. The microprocessor consists of an ALU, a register array, and a control unit. The ALU performs arithmetic and logical operations on the data received from memory or an input device. The register array consists of registers identified by letters like B, C, D, E, H, and L, plus an accumulator. The control unit controls the flow of data and instructions within the computer.
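To make the division of labor among these parts concrete, here is a tiny, hypothetical sketch in C (not from the original lesson): an accumulator register, an ALU function, and a control loop that fetches and executes a hardcoded instruction list. It is loosely inspired by 8085-style accumulator machines, greatly simplified.

```c
#include <stdio.h>

/* Illustrative sketch of a microprocessor's parts: an accumulator,
   an ALU, and a control unit that walks an instruction list. */

typedef enum { LOAD, ADD, SUB, HALT } Op;
typedef struct { Op op; int operand; } Instr;

/* The ALU: performs an arithmetic operation on the accumulator. */
static int alu(Op op, int acc, int operand) {
    switch (op) {
        case ADD: return acc + operand;
        case SUB: return acc - operand;
        default:  return acc;
    }
}

int main(void) {
    Instr program[] = {              /* a hardcoded example program */
        { LOAD, 10 },
        { ADD,   5 },
        { SUB,   3 },
        { HALT,  0 },
    };
    int acc = 0;                     /* the accumulator register */

    /* The control unit: fetch, decode, and execute in sequence. */
    for (int pc = 0; program[pc].op != HALT; pc++) {
        if (program[pc].op == LOAD)
            acc = program[pc].operand;
        else
            acc = alu(program[pc].op, acc, program[pc].operand);
    }

    printf("Accumulator: %d\n", acc);  /* prints 12 */
    return 0;
}
```

A real microprocessor performs this same fetch-decode-execute cycle in hardware, with the register array holding intermediate values and the control unit sequencing every step.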
In 1994, PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market.

In 1996, Sergey Brin and Larry Page develop the Google search engine at Stanford University.

In 1999, the term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

In 2001, Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

In 2004, Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

In 2005, YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

In 2006, Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.

In 2007, the iPhone brings many computer functions to the smartphone.

In 2009, Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

In 2010, Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

In 2011, Google releases the Chromebook, a laptop that runs the Google Chrome OS.

In 2012, Facebook gains 1 billion users on October 4.

In 2015, Apple releases the Apple Watch. Microsoft releases Windows 10.

In 2016, the first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

From 2018 to the present, several emerging technologies have shaped computing:

1. Cloud Computing. Business organizations store terabytes of data every day, which must be arranged, sorted and restored. A conventional computer cannot store such large volumes of data, which increases the pressure to move towards the cloud. The cloud increases the productivity of an organization by saving time and money; it is cheap, reliable for secure backup, and eases resource management. Cloud automation will collaborate with new upcoming technologies, bringing profits to organizations.

2. The Internet of Things. The Internet of Things (IoT) has turned the dream of smart homes, devices, cars, and workplaces into a reality. It can forge compelling solutions for real-time problems by focusing on hardware-centric software and firmware. IoT will merge with edge computing, enabling a better range of IoT applications. Edge computing will improve the ability of IoT-based applications to detect obstacles and to support face identification and security.

3. Artificial Intelligence. Artificial intelligence, along with machine learning, will be an unstoppable force in 2018 and beyond. AI and its advanced algorithms will assist other technologies in building sophisticated software. Commoditized artificial intelligence will work in natural language processing, computer vision, and recommender systems. Conversational artificial intelligence will reach areas like supply chain, sales, manufacturing, and insurance. Amazon Alexa is a prime example of an artificial intelligence application.

4. Virtual Assistance. Virtual assistance technology will be a deciding vote for brand engagement. Amazon's Alexa and Echo have proved to be the current market leaders in the world of virtual assistance. Advancements can be made to the existing technology by merging with voice assistant technology companies and media providers. Chatbots are one of the technological advancements brands can use to provide quick responses and 24/7 availability to customers.

5. Augmented Reality. AR is a technology that places a computer-generated image in front of a user, creating a composite view of reality. Unlike virtual reality, it uses the existing environment and imposes new information on top of it. Snapchat's funny face filters and iPhone X Emojis are the best way to describe AR. Apple's ARKit will ease the work of many developers seeking to include this technology in phones, software, and more. Brands like Amazon have started to work with AR.

6. 3-D Printing. 3-D printing will change how many industries think about manufacturing a product. HP has created 3-D printers that are portable and simple to use. 3-D printing can change the world by creating touchable pictures, human body parts, lightweight and long-lasting casts for broken bones, and safer, stronger vehicles. It will also enhance design and innovation in a variety of sectors. 3-D printing will disrupt the market by saving time and money. The technology is currently expensive, but as the price falls it will become accessible.

7. Robotic Process Automation (RPA). Like AI and machine learning, robotic process automation is another technology that is automating jobs. RPA is the use of software to automate business processes such as interpreting applications, processing transactions, dealing with data, and even replying to emails. RPA automates repetitive tasks that people used to do. These are not just the menial tasks of a low-paid worker: up to 45 percent of the activities we do can be automated, including the work of financial managers, doctors, and CEOs.
8. Cybersecurity. Cybersecurity might not seem like an emerging technology, given that it has been around for a while, but it is evolving just as other technologies are. That is partly because threats are constantly new: the malevolent hackers who try to illegally access data are not going to give up any time soon, and they will continue to find ways through even the toughest security measures. It is also partly because new technology is being adapted to enhance security. As long as we have hackers, we will have cybersecurity as an emerging technology, because it will constantly evolve to defend against them.

You can't know where you are going until you know where you have been! All new technologies, including computers, evolve from an original, but that does not mean the original no longer has a purpose or is less valued. Much of our technology is repurposed for different applications, and that may be the best reason to study the history of computers.

References

https://www.livescience.com/20718-computer-history.html
https://dfarq.homeip.net/compaq-deskpro-386/
https://www.tutorialspoint.com/microprocessor/microprocessor_overview.htm
https://www.yourhtmlsource.com/starthere/whatishtml.html
https://www.darpa.mil/program/molecular-informatics
https://www.vistacollege.edu/blog/careers/it/trends-in-information-technology-for-2019/
"Why is it important to study the history of computers?" eNotes Editorial, 18 June 2018, https://www.enotes.com/homework-help/why-important-study-history-computers-1318216. Accessed 21 Aug. 2020.
https://www.webopedia.com/DidYouKnow/Hardware_Software/FiveGenerations.asp
https://www.geeksforgeeks.org/generations-of-computer/
https://www.britannica.com/technology/computer-programming-language#ref134615