
[CC 201] Unit 1 - Computing - A Historical Perspective.pdf


Full Transcript


CC 201 Introduction to Computing
WVSU-CICT (Information Systems Department)
Unit 1: Computing: A Historical Perspective
1st year, First Semester

Presented by: Sigen Marc C. Miranda ([email protected]), Lecturer I, Information Systems Department, WVSU-CICT
Prepared by: Shambhavi Roy, Clinton Daniel, and Manish Agrawal, University of South Florida (https://digitalcommons.usf.edu/)

Class Profile
STEM graduates make up the majority of BSIS 1A at 67.65% of the class population, followed by HUMSS, TVL-ICT, and GAS.
[Chart: BSIS 1A class composition by senior high school strand - STEM 23 students, HUMSS 6, TVL-ICT 3, GAS 2.]

Updates! We are now in a REMOTE set-up!
Schedule (19-22):
- BSIS 1A (7:00AM-10:00AM): Unit 1: Computing: A Historical Perspective; Activity #1: Individual Activity
- BSIS 1A (7:00AM-9:00AM): Additional Unit Discussion; Activity #2: Group Discussion

Objectives
- Traced the evolution of computing from early computational devices to modern computers.
- Identified key figures and their contributions to the development of computing.
- Explained the impact of technological advancements on society.
- Recognized the significance of different computing eras and their defining characteristics.

Outline of this Unit
- How Information Technology Began
- The Era of Networked Computers and the Internet
- The Mobile Revolution
- The Era of Personal Computers
- The Internet and the World Wide Web
- Women in Technology

Unit 1.1: How Information Technology Began

This section provides a quick tour through the history of how we have reached the current state of "information technology everywhere." We show how innovative teams have responded to human needs and commercial incentives to create the technologies we take for granted today. If you find this interesting, we hope you will read more about these individuals and technologies on the Internet.

The history of information technology includes the history of computer hardware and software. Charles Babbage is credited with building the first mechanical computer in the 1820s. Over a hundred years later, in 1946, a team at the University of Pennsylvania publicly reported the first programmable, general-purpose computer. It was called the Electronic Numerical Integrator and Computer (ENIAC). The ENIAC weighed 30 tons and took up 1,800 square feet of space. It supported most hardware and software components that modern programmers recognize. The ENIAC could read inputted data, hold information in memory, step through programming instructions, create and call sub-programs, loop through code, and print output.

FIGURE 4 — A function table, an interface for the function table, and an accumulator for an ENIAC are displayed.

The ENIAC didn't have many of the modern peripherals we take for granted now, such as monitors, keyboards, and printers. To use the ENIAC, programmers had to write instructions (code) onto punched paper cards (cards with holes that could be read by computers). It took weeks of code-writing and debugging before the computer could do anything useful.

The US Army funded the development of the ENIAC to compute firing tables during WWII. Firing tables provided recommendations to gun operators on the optimal specifications to hit a target, taking into account terrain conditions, weapon wear-and-tear, ammunition type, etc. While the ENIAC was created to serve a specific military purpose, its general computing capabilities captured the imagination of the public.

The ENIAC was a computer like any modern computer, but it did not use software as we understand it today. Every instruction for every task was hard-coded by experts.
If the task was to be repeated, the instructions were written again on punch cards, which could take days. A lot of these instructions involved tasks such as reading data and writing outputs, which are common to all computer programs. Eventually, these shared tasks were aggregated into computer programs called operating systems. The Operating System (OS) is the brain of the computer and controls all of its parts. A computer mouse, keyboard, display monitor, motherboard, and storage drives are all components of a computer, and they act together as one computer only when the operating system recognizes them and orchestrates a symphony of coordinated actions to complete the task you tell the computer to perform. When you move your mouse, tap your screen, type on your keyboard, or make a phone call, it is the operating system that recognizes the action and tells the components how to act to bring about the desired outcome.

What Does an Operating System Do?

Operating systems are to computers what front offices are to schools. Most complex entities offer a "front office" that makes it simple for users to request and offer services. At most schools, for example, the front office is where students report absences and get their schedules, teachers report grades and request supplies, and parents make inquiries. The front office staff are experts in handling these requests and orchestrating the necessary actions to complete them. The front office staff also ensure that any administrative requirements, such as student privacy, are protected as these actions are performed. Operating systems perform the same role for computers. They receive inputs from users and applications and coordinate all necessary actions until the appropriate output is presented to users.

Operating systems evolved rapidly in the 1960s and 70s. In 1971, AT&T, the dominant phone company at the time, built an operating system called Unix (see Figure 5). Unix and its variants were freely distributed across the world in the 1970s and 1980s, and some of the world's best computer scientists and engineers volunteered their contributions to make Unix extremely capable. These experts were guided by the principle of using the "minimum number of keystrokes [to] achieve the maximum effect." Because of their powerful capabilities and low to no cost, Unix and its variants, including the popular Linux operating system, now power most computers around the world, including all major smartphones. Windows is another popular operating system, used extensively on desktops and in data centers.

FIGURE 5 — Scientists at AT&T Bell Labs created the Unix system, thereby laying the foundation for modern computers. Ken Thompson (seated) and Dennis Ritchie (standing) are credited with inventing Unix.

The Origin of Unix

Unix began as one developer's attempt to support games on office machines. Ken Thompson used his spare time at AT&T Bell Labs to write nerdy computer games and developed software to support his game, Space Travel. Because the software only supported one user (Ken Thompson), it became known at the lab as Thompson's Un-multiplexed Information and Computing Service, or Unics, and was later abbreviated as Unix. Eventually, Unix acquired capabilities for multiple users to share the resources of a central processor, so the Unix name is no longer really representative of the operating system's single-user origins. But names stick, and the Unix name remains popular.
The first edition of Unix was just 4,200 lines of code, indicating how powerful good computer code can be. Even in these earliest editions, Unix included games such as Blackjack, and this contributed to its popularity.

A powerful economic force also contributed to Unix's widespread adoption. AT&T funded the development of Unix, but 15 years earlier, in 1956, it had reached an agreement with the federal government that gave it monopoly status on long-distance telephony. In exchange, AT&T agreed not to sell any product that was not directly related to long-distance telephony. Since Unix was not such a product, AT&T could not sell it; instead, it eventually shared the source code to Unix with multiple organizations, and they released their adaptations to the world. One of the most popular of these adaptations was developed at UC Berkeley and was called Berkeley Systems Distribution (BSD) Unix. The licensing for BSD Unix allowed adopters to make their own modifications without releasing them back to the community. This was very useful to commercial vendors. Among the most popular current commercial releases tracing their lineage to BSD are the operating systems on all Apple products, including macOS on laptops and iOS on smartphones. The popularity of Unix is the result of technical excellence and economic incentives.

Unit 1.2: The Era of Personal Computers

Until the early 1980s, computers were too expensive for personal use. As the cost to manufacture computer components came down, IBM saw an opportunity to make small, self-contained personal computers (PCs) that had their own Central Processing Units (CPUs). Since Unix was designed for giant centralized machines and dumb terminals, there was a need for an operating system that could run on these personal devices. While IBM was not the first to make a personal computer, its entry into the market in 1981 with its PC, priced at $1,565 and significantly cheaper than others, revolutionized the market.

IBM partnered with Microsoft to create an operating system for personal computers. This operating system was called the Disk Operating System (DOS). Although DOS wasn't easy to use (users still needed to type commands manually on a line), the idea of owning a computer caught on, and the IBM PC started the PC revolution by becoming the world's first popular personal computer.

The IBM PC and Florida

The IBM personal computer was invented in Florida, in Boca Raton to be precise. Since 1967, IBM had operated a unit in Boca Raton to develop, build, and sell inexpensive computers. A team at this unit built the inexpensive personal computer (PC) by creating an open design in which components from multiple vendors could interoperate. The IBM PC used processors from Intel, operating systems from Microsoft, and components from several other vendors. Competition among these vendors brought costs down, while the popularity of personal computers helped many of these suppliers become large companies themselves. As of 2022, Microsoft (valued at $2 trillion) and Intel (valued at $130 billion) are each worth more than IBM itself ($118 billion). As of 2022, IBM, Intel, and Microsoft together employ over 600,000 people.

Along with inexpensive hardware, in the mid-1980s came user-friendly software. In 1985, Microsoft launched Microsoft Windows, an easy-to-use Graphical User Interface (GUI) based operating system. Microsoft also released Excel for financial calculations in 1985 and Word for text editing in 1989. Doug Klunder (Microsoft's first college hire) led the development of Excel, and Charles Simonyi (the world's first repeat space tourist) led the development of Word.
(Photo: Bill Gates)

Both programs leveraged the special capabilities of graphical user interfaces and propelled personal computers to widespread adoption in businesses. Computers also turned into convenient home devices for everyday users to write letters, manage personal finances, communicate with friends and family, create music, and watch entertainment shows. Between 250 million and 350 million personal computers are sold each year to meet this demand.

Unit 1.3: The Era of Networked Computers and the Internet

Communication is a very fundamental human activity, and information exchange has been one of the most popular uses of computers. Computer engineers recognized this need for information exchange early on and developed technologies for computers to talk to each other. The initial networks were limited in scope, connecting computers located within an office and allowing users within an office to send emails to each other and share expensive resources such as printers.

Mother of All Demos

One of the most famous demonstrations of technology happened on December 9, 1968. Douglas Engelbart delivered a 90-minute demonstration of essentially all the personal networked computer technology we use today. The demo included networking, graphical user interfaces, web-like pages (hypertext), images, the computer mouse, video conferencing, and word processing. Within about 20 years, the technology was available in stores. For its far-reaching impact, the technology industry gave it the name "mother of all demos."

As networks grew, network effects emerged. To understand the network effect, imagine a village with just two telephones connected to each other by a wire. The telephones will not be very useful, since they only connect two people in the village; conversations with all other villagers happen outside this network. However, as more people in the village connect to the network, every telephone in the network becomes increasingly useful: the same telephone allows its user to reach more people in the village. The free increase in benefit to the community as more members join a network is called the network effect. The network effect generated powerful incentives within the industry to network computers. By 1981, the core computer networking technology we use today was specified. Since that time, the development of computers has been closely associated with the development of computer networks, the Internet, and the World Wide Web.

Unit 1.4: The Internet and the World Wide Web

Since the beginning of the 21st century, computer networks have become robust and globally available, like water and electricity. Users and businesses have taken advantage of this networking capability to share information and do business with people around the world. The global network of computers that share information with each other is called the Internet. Information on the Internet can be linked to any other information on the Internet, and these links can be thought of as a web of information. Therefore, the information shared on the Internet is called the World Wide Web.

Information traveling on the Internet is like cars traveling on a highway system. The Internet is a vast network connecting many smaller local networks, in the same way that the highway network connects all local roads. You can start from any point and drive to any other point using the network of roads as long as the roads are connected. The World Wide Web enables the same capability for information.
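As a quick illustration of the network effect described in Unit 1.3, the number of possible pairwise connections in a network grows much faster than the number of members. The following is a minimal sketch in Python; the possible_connections helper is illustrative and not part of the original material:

    # Network effect, illustrative only: with n connected telephones,
    # the number of distinct pairs that can talk to each other is n * (n - 1) / 2.
    def possible_connections(n: int) -> int:
        return n * (n - 1) // 2

    print(possible_connections(2))   # 1  -- two phones, one possible conversation
    print(possible_connections(10))  # 45 -- five times the phones, forty-five times the possible conversations

Each new member therefore makes every existing connection point more valuable, which is the economic pull behind networking computers and, eventually, networking networks into the Internet.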
Networks and Development

Until about 1870, the United States was about as economically advanced as the rest of the world. But by 1920, the United States had advanced decisively in comparison. Much of this is attributed to the construction of five networks: water, sewage, electricity, roads, and telephones. By the end of the 20th century, the information network was added to this list of networks contributing to American prosperity.

LANs vs WANs

The Internet is built by connecting two types of networks: small networks within buildings and large networks that connect these small networks. The small networks connecting workers inside an office building or a school are called Local Area Networks (LANs). The network at your school or home is an example of a LAN. LANs help the computers in an office share files, emails, printers, and the Internet connection with each other. The networks that connect these small networks to each other are called Wide Area Networks (WANs). WANs are typically large networks spread across a wide geographic area, such as a state or country, and are used to connect the LANs within corporate and satellite offices. WANs are typically operated by Internet providers such as Verizon, Frontier, and Spectrum, and users pay subscription fees to access them.

Unit 1.5: The Mobile Revolution

Computing technologies have evolved rapidly, and the technology industry has succeeded in shrinking computers to the size of mobile phones. Soon after the PC became a household device, developments in related technologies like storage, battery capacity, screens, materials, and networking fueled the mobile revolution. As a result, the traditional phone has now been mostly replaced by powerful computers called smartphones, which users around the world carry in their pockets and purses.

While Windows was the dominant operating system for the personal computer era, two operating systems are dominant in the mobile era: Apple's iOS and Google's Android. Both of these operating systems trace their lineage to Unix. While the iPhone's iOS is a version of Unix, Android is based on Linux, a Unix-compatible operating system initially created in 1991 by Linus Torvalds when he was a student at the University of Helsinki in Finland.

(Photo: Steven Jobs shows off the iPhone 4 at the 2010 Worldwide Developers Conference.)
(Photo: Linus Torvalds, Father of Linux.)

Moore's Law

The availability of computers gave us the ability to design more powerful computers. This virtuous cycle was the basis of Moore's law, named after Gordon Moore, one of the founders of Intel, a computer chip pioneer. Gordon Moore noted that the number of transistors on a microchip doubled every two years. The transistor is the core component of a computer chip and is a tiny electronic device used to store and process information. Its size is measured in nanometers (a human hair is roughly 100,000 nanometers wide). To understand the kind of progress we have made, consider the fact that a cutting-edge computer microchip in 1970 had about 2,000 transistors in it, while Apple's M1 Ultra chip, released in 2022, has 114 billion transistors. (A quick arithmetic check of the doubling claim appears just below.)

Today, even street vendors with modest incomes in developing countries around the world own mobile phones, and people living in distant countries can be as accessible as your next-door neighbors, all thanks to the availability of free audio and video phone apps. You can use your phone to do office work while being entertained, whether at home or waiting in line at a grocery store.
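As a rough check of Moore's doubling claim, using the figures quoted above (about 2,000 transistors in 1970 and 114 billion in 2022), here is a minimal sketch in Python; the variable names are illustrative, not from the original material:

    # Moore's law sketch: transistor counts doubling every two years.
    transistors_1970 = 2_000          # cutting-edge chip in 1970 (figure from the text)
    doublings = (2022 - 1970) // 2    # 52 years -> 26 doublings
    projected = transistors_1970 * 2 ** doublings
    print(projected)                  # 134217728000, i.e. roughly 134 billion

The projection of roughly 134 billion transistors is the same order of magnitude as the reported 114 billion, so the doubling rule of thumb holds up remarkably well over five decades.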
Unit 1.6: Women in Technology

The World's First Programmer

Did you know that the person often regarded as the world's first programmer was a woman, Ada Lovelace? Early on, women didn't get sufficient credit for their work. The dedicated team of women programmers who worked on the ENIAC received recognition for their pioneering work only decades later.

Women were the original "computers," doing complex math problems by hand for the military before the machine that took their name replaced them.… During the 1940s and 50s, women remained the dominant sex in programming, and in 1967 Cosmopolitan magazine published "The Computer Girls," an article encouraging women into programming. "It's just like planning a dinner," explained computing pioneer Grace Hopper. "You have to plan ahead and schedule everything so that it's ready when you need it. Programming requires patience and the ability to handle detail. Women are 'naturals' at computer programming."
—Caroline Criado Perez, Invisible Women, 2019

However, the moment it became clear that there was money to be made in computers and that programmers needed to be brilliant, men ended up replacing women as programmers. Women faced significant hurdles in getting hired as programmers. Clearly, women had the programming skills, since they were already doing the job. However, we, as a society, have a brilliance bias: we rarely see women as brilliant. Rather than trying to figure out an applicant's suitability for a job, tech companies stereotyped male characteristics as brilliant: a nerdy attitude, unkempt hair and face, staying up all night to program, loitering on programming websites that often also have content that women may find offensive. Hiring managers failed to account for the fact that a girl programmer may look different and even express her love for programming differently.

For example, the tech-hiring platform Gild combed through applicants' social data to assess their suitability. According to Gild's data, frequenting a particular Japanese manga site is a "solid predictor of strong coding." Programmers who visit this site therefore receive higher scores. Which all sounds very exciting, but as O'Neil points out, awarding marks for this rings immediate alarm bells for anyone who cares about diversity. Women, who as we have seen do 75% of the world's unpaid care work, may not have the spare leisure time to spend hours chatting about manga online. O'Neil also points out that "if, like most of techdom, that manga site is dominated by males and has a sexist tone, a good number of the women in the industry will probably avoid it."
—Caroline Criado Perez, Invisible Women, 2019

More than 40% of women leave tech companies after ten years, compared to 17% of men. A report by the Center for Talent Innovation found that women didn't leave for family reasons or because they didn't enjoy the work. They left because of "workplace conditions," "undermining behavior from managers," and "a sense of feeling stalled in one's career." A feature for the Los Angeles Times similarly found that women left because they were repeatedly passed up for promotion and had their projects dismissed. Does this sound like a meritocracy? Or does it look more like institutionalized bias?
—Caroline Criado Perez, Invisible Women, 2019

Activity #1: ESSAY

Which chapter or discussion do you think was your favorite, and why?
- How Information Technology Began
- The Era of Networked Computers and the Internet
- The Mobile Revolution
- The Era of Personal Computers
- The Internet and the World Wide Web
- Women in Technology

AI-generated answers are prohibited. Submit your answers via Microsoft Forms.
