IT AI Primer Std. 7,8,9 (Pre-Senior).pdf

Full Transcript


IT & AI PRIMER CHAPTER 1 An Introduction to Information Technology "An Introduction to Information Technology," is a chapter designed to provide you with a foundational understanding of IT. This chapter covers the definition and components of IT, and a detailed look at the critical elements of IT infrastructure. It also honors the visionaries who have shaped the field and discusses the historical significance of the ENIAC, the first computer. Additionally, we will explore how information and technology combine to form the backbone of IT. This chapter aims to equip you with essential knowledge and an appreciation for the contributions that have driven the evolution of Information Technology. Before we start, it is important to understand the meaning of Information Technology. What is Information Technology? Information technology (IT) is the application of technology to store, study, retrieve, transmit, and manipulate data or information of an enterprise. Information Technology is considered a subset of Information and Communications Technology (ICT). The formal definition as per the Information Technology Association of America (ITAA) is: "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware." 1.1 Computer rendition of Information Technology Prepared by USO, for the exclusive use by students appearing in USO National Test. 1 IT & AI PRIMER In simple terms, Information Technology is anything and everything to do with systems, hardware, software, applications, and data. In the general media, the following terms are also used to refer to IT: Information Technology (IT) Information and Communications Technology (ICT) Data Communications and Technology (DCT) Creative Digital Technology (CDT) Design and Technology (DT) Computer Science and Technology (CST) What are the components of Information Technology? The diagram below represents the basic architectural model of Information Technology. The various steps are as follows: A user (human) inputs information or data into the system through the input device - hardware. The system then uses the hardware and software, collects and interprets the information or data received, and performs the action based on the user's requirement. The system then provides the completed information or data in the required format, back to the user, via the hardware. All these components - the hardware, software, the related networks, the data, people, processes and the Internet and Cloud services, all form Information Technology. Understanding these components helps us see how they work together to create the powerful and versatile systems we use every day in school, work, and our personal lives. Prepared by USO, for the exclusive use by students appearing in USO National Test. 2 IT & AI PRIMER USER Information is sent by the User for processing. TECHNOLOGY SOFTWARE HARDWARE Processor Input Output Device Storage Device Information is received by the User after it has been processed. USER 1.2 Components of Information Technology Prepared by USO, for the exclusive use by students appearing in USO National Test. 3 IT & AI PRIMER Did we always have so much data? How and why did this subject of Information Technology start? Humans have been storing, retrieving, manipulating, and communicating information for decision-making since the Sumerians in Mesopotamia developed writing in about 3000 BC. 
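The input-process-output cycle described above can be illustrated with a very small program. This is a toy sketch, not something from the chapter, and Python is used here purely as an example language: the user supplies data through an input device, the program processes it, and the result comes back through an output device.

```python
# A toy illustration of the input -> process -> output cycle.

def process(marks):
    """Processing step: compute the total and average of the marks."""
    total = sum(marks)
    average = total / len(marks)
    return total, average

# Input: the user enters data through the keyboard (an input device).
raw = input("Enter marks separated by spaces: ")
marks = [float(m) for m in raw.split()]

# Output: the processed information is returned via the monitor (an output device).
total, average = process(marks)
print(f"Total: {total}, Average: {average:.2f}")
```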
Users enter information into their technology of choice for computation and then retrieve the computed values for decision-making. For thousands of years, various means have been used to aid computation. Fingers were the first devices used for counting, followed by sticks and marks on clay tablets. Mechanical devices like the Abacus and Slide Rule were invented later, followed by Electromechanical devices like the ENIAC, which completely transformed the world of technology. Modern computing devices (i.e., computers) have been around for the last 75 years, and we are now living in the era of mobile computing, with people carrying computers in their pockets in the form of mobile phones. 1.3 L to R: Abacus and Slide Rule, both were used for counting Visionaries of Information Technology Throughout history, there have been several visionaries who have influenced the development in Information Technology. Countess Ada Lovelace, an English mathematician and writer was known for her work on Charles Babbage's mechanical general-purpose computer, the Analytical Engine. She was the first to recognise that the machine had applications beyond pure calculation and created the first algorithm intended to be carried out by such a machine in 1842-43. As a result, she is often regarded as the first computer programmer who recognised the full potential of a "computing machine". Prepared by USO, for the exclusive use by students appearing in USO National Test. 4 IT & AI PRIMER 1.4 Countess Ada Lovelace 1.5 Alan Turing Alan Turing, known as the ‘Father of Computing’, is the first person to have implemented the concepts of theoretical computer science and artificial intelligence. He modelled computation as a one-dimensional storage tape, giving birth to the Turing machine and Turing-complete programming systems. His work in creating the Colossus computer (the world’s first electronic digital programmable computer) during World War II allowed the Allied forces to break the codes of the German Enigma machine thereby enabling the Allies to intercept secret German messages. So which were the first computers, or computing machines? There were a number of different computing machines that were created in the 1930’s and 1940’s. The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer, developed in 1946 in the United States. It was capable of being reprogrammed to solve a range of computing problems. For its 50th anniversary, the room-sized ENIAC was recreated as a single integrated circuit chip. It was the ENIAC which formed the precursor to other commercial computers like EDVAC (Electronic Discrete Variable Automatic Computer), EDSAC (Electronic Delay Storage Automatic Calculator), UNIVAC (Universal Automatic Computer), and BINAC (Binary Automatic Computer). 1.6 The EDVAC (Electronic Discrete Variable Automatic Computer) Prepared by USO, for the exclusive use by students appearing in USO National Test. 5 IT & AI PRIMER 1.7 The EDSAC Electronic Delay Storage Automatic Calculator The ENIAC – The First Computer ENIAC was a computer built between 1943 and 1946. It was designed by John Mauchly and J. Presper Eckert. Here are some interesting facts about the ENIAC: The machine was built out of nearly 17,500 vacuum tubes, 7,200 diodes, and many miles of wire. It took up 1,800 square feet (170 m2) of space (equivalent to the size of a large room). The history of computers is complex, and many machines have been identified as the "first computer". 
ENIAC's claim is that it was more programmable than any of the previous ones. Because the computer was built out of vacuum tubes, it often broke. This meant that someone had to find the failed tube, (which was not an easy job), take out the bad tube, and put a working tube in. Making programs for the computer was difficult. Making the computer ready for one single program could take many days or even weeks. This was because the programming had to be done by pulling wires from one place to another. This was not hard for small programs, but if a large program was wanted, it was very hard work. Programming by pulling wires was used until 1948 when a special type of memory was added. This was called Read Only Memory(ROM), because the computer could read it but not write it. After that, programming was done by using switches, which took hours instead of days. When asked if it was possible to make a hydrogen bomb, ENIAC answered in 20 seconds. ENIAC took 70 hours to work out the value of pi to 2000 decimal places. A modern PC with a CPU of the size of 2x2 cm is much faster than ENIAC. A modern PC can work out a million (1,000,000) decimal places of the value of pi in about ten seconds. Prepared by USO, for the exclusive use by students appearing in USO National Test. 6 IT & AI PRIMER 1.8 ENIAC ENIAC was shut down forever on October 2, 1955. Now, only about 10 panels of the ORIGINAL 40 exist. Although the original is shut down, programs written for the ENIAC can still be run on a modern computer using an emulator. For the longest time, hardware was seen as the main innovation when it came to technology. A team of women including Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Fran Bilas, and Ruth Lichterman were the first programmers of the ENIAC, though their work was not widely recognized for over 50 years. The complexity of machine programming was greatly undervalued, perhaps because the first programmers were all women. Due to this mindset, historians had mistaken the women programmers to be models posing in front of the machine in all photographs. Information + Technology To further understand Information Technology, we must understand its foundational building blocks. As I.T. contains two very important words: Information, and Technology Although both these pieces are unique and different, they both need to work in parallel for any information to be relayed to the user. Prepared by USO, for the exclusive use by students appearing in USO National Test. 7 IT & AI PRIMER INFORMATION TECHNOLOGY Information Technology Business Context and Process Hardware Data Software Networking In short, Information Technology shall cover all of the following: “There is no reason anyone would want a computer in their home.” — Ken Olsen, founder, president, and chairman of Digital Equipment Corporation, in 1977. Prepared by USO, for the exclusive use by students appearing in USO National Test. 8 IT & AI PRIMER MEMORY CHECK Fill in the blank: The full form of ICT is Information and __________ Technology. Communications. Name the first Electromechanical device. The ENIAC. Who is the Father of Computing? Alan Turing Who designed the ENIAC Computer? John Mauchly and J. Presper Eckert When was the Colossus computer created? During World War II. True or False: To understand IT, we need to examine information and technology separately. False. Who sends and receives information in the processing system? User. What is one early mechanical device? Slide Rule. 
Who first recognized machines had applications beyond calculation? Countess Ada Lovelace. True or False: Hardware is part of technology. True Prepared by USO, for the exclusive use by students appearing in USO National Test. 9 IT & AI PRIMER Chapter 2 Technology: Hardware "Technology: Hardware," is a chapter dedicated to exploring essential computer hardware components. We will cover processors, their functions, memory and storage units, control units, and the Arithmetic Logic Unit (ALU). We will trace the evolution of computers from 1997 to 2000 and learn about the Semiconductor Chip Shortage that affected the tech industry in 2020. We will also discuss the evolution of storage devices, and the rise of Cloud Storage, and compare RAM vs ROM. Finally, we will explore input and output devices, rounding out your knowledge of the hardware components that power modern technology. This chapter aims to equip you with a solid understanding of computer hardware, its development, and its impact on the tech landscape. As with all devices, when we consider the components of a modern computer, we are familiar with the input device (e.g.: keyboard) that a user utilizes to enter information into a computer, as well as an output device (e.g.: monitor) to review the output. In this chapter, over the next few pages, we shall introduce the Processor, which acts as the brains of the computer, as well as Storage, which acts as the memory of the computer. With various developments made, the technology components have become more complex. Over the last few decades, software has grown in importance, just as significant advances have been made in the field of hardware. What we can see is referred to as the hardware. Hardware also refers to objects that one can touch. In any computer, the following are called ‘hardware’: Processors Storage units Input devices Output devices Prepared by USO, for the exclusive use by students appearing in USO National Test. 10 IT & AI PRIMER HARDWARE Processor Input Storage Output Device Device 2.1 Various elements of hardware Early computing machines had fixed programs. The calculator is an example of a fixed program computer which can do basic mathematics but cannot be used as a word processor or gaming console. Changing the program of a fixed-program machine requires re-wiring, re-structuring, or re-designing the machine completely. The earliest computers were not so much ‘programmed’ as they were ‘designed’. ‘Reprogramming,’ when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and finally the time-consuming process of physically re-wiring and re- building the machine. With the introduction of the stored-program computer, this process changed. A stored-program computer included an instruction set and could store in memory a set of instructions (a program) detailing all computations. A series of breakthroughs, such as miniaturized transistor computers, and the integrated circuit, caused digital computers to largely replace the older analogue computers. The cost of computers gradually became so low by the 1990s, and 2000s, that personal computers and mobile computers like laptops, smartphones and tablets became ever-present in industrialized countries. Prepared by USO, for the exclusive use by students appearing in USO National Test. 11 IT & AI PRIMER Processors: The processor is the brain of the computer. 
The development of the processor as a unique unit started in 1947 with the introduction of the Von Neumann Architecture in the EDVAC, where both the Control Unit and Arithmetic Logic Unit were controlled through Vacuum Tubes. 2.2 The Processor (chip) What is the CPU and what does it do? A central processing unit, or CPU, is a piece of hardware that enables our computer to interact with all the applications and programs that have been installed on it. A CPU interprets the program's instructions and creates the output that we are asking the computer to do. The CPU itself has the following three components. Memory or Storage Unit. Control Unit. ALU (Arithmetic Logic Unit). Memory or Storage Unit: This unit can store instructions, data, and intermediate results, and it supplies information to other units of the computer when needed. It is also known as the internal storage unit, the main memory, the primary storage, or Random Access Memory (RAM). Its size affects speed, power, and capability. Primary memory and secondary memory are two types of memory in the computer. The functions of the memory unit are: It stores all the data and the instructions required for processing the information. It stores intermediate results of processing. It stores the results of processing before these results are released to an output device. All inputs and outputs are transmitted through the main memory. Control Unit: This unit controls the operations of all parts of the computer but does not carry out any actual data processing operations. The functions of this unit are: It is responsible for controlling the transfer of data and instructions among other units of a computer. It manages and coordinates all the units of the computer. It obtains the instructions from the memory, interprets them, and directs the operation of the computer. It communicates with Input/Output devices for transfer of data or results from storage. It does not process or store data. Prepared by USO, for the exclusive use by students appearing in USO National Test. 12 IT & AI PRIMER ALU (Arithmetic Logic Unit): This unit consists of two subsections: Arithmetic Section: The function of the arithmetic section is to perform arithmetic operations like addition, subtraction, multiplication, and division. All complex operations are done by making repetitive use of the above operations. Logic Section: The function of the logic section is to perform logic operations such as comparing, selecting, matching, and merging of data. What processor does your computer or laptop have? What are its specifications? TECH SPOTLIGHT Intel Corporation is an American multinational corporation and technology company headquartered in California. Founded by Gordon Moore and Robert Noyce, it is the world's second largest and second highest valued semiconductor chip makers based on revenue, after being overtaken by Samsung in 2017. Intel introduced the first single-chip microprocessor, the Intel 4004 from 1969 to 1970; the Intel 8080 in 1974, followed by the 8085 in 1976, and the 8086 in 1978. Intel is the inventor of the x86 series of microprocessors, the processors found in most personal computers. Intel also supplies processors for computer system manufacturers such as Apple, Lenovo, HP, and Dell. 2.3 Left: Laptop with ‘Intel Inside’ logo Right: Intel i7 processor chip Prepared by USO, for the exclusive use by students appearing in USO National Test. 13 IT & AI PRIMER Timeline for the Development of the Processor 1947-1960 Bipolar transistor, invented in 1947. 
Transistors replaced vacuum tubes in 1955. Integrated Circuits (ICs), invented independently in 1958 by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, gave rise to the second generation of computers and paved the way for the development of microcomputers. ICs were used to build the Apollo guidance computer and the Minuteman missile. 1961-1970 Small, low-cost computers like calculators were developed with the help of integrated circuits and owned by individuals and small businesses. The CDC 6600, manufactured by the Control Data Corporation and released in 1964, is generally considered the first supercomputer. 1971-1980 ICs were replaced by microprocessors, which saw usage in terminals, cash registers, printers, industrial robots, and cruise missiles. 1981-1990 Microcomputers began to be used for multiple purposes. Shift from Complex Instruction Set Computing (CISC) to a Reduced Instruction Set Computing (RISC) chip design. 1991-2000 Multi-core processors were developed and used across many industries. Prepared by USO, for the exclusive use by students appearing in USO National Test. 14 IT & AI PRIMER Semiconductor Chip Shortage: A semiconductor chip is a crucial component that powers not only computers, but also a host of electronic items, right from smartphones and automobiles to washing machines, refrigerators, and even electric toothbrushes. Due to their wide usage, these chips have always been in great demand. With the onset of COVID-19, the pandemic-induced lockdowns forced chip-making facilities across the world to shut down. Meanwhile, as schools and colleges shifted to an online medium of education and as more people started to work from home, the demand for electronic items like laptops, tablets, and smartphones increased dramatically. Chip manufacturers struggled to create enough chips to meet the rising demand, giving rise to a huge backlog. The solution would appear to be to produce more chips. But it is not that easy. With this high demand, manufacturing cannot be scaled up on short notice, as it is difficult and time-consuming to set up chip factories (called foundries). Companies like Intel, Samsung, Texas Instruments, and Taiwan Semiconductor Manufacturing Company (TSMC) have all announced new chip fabrication plants (called fabs) over the past few months, but these fabs will take years to build. Intel CEO Patrick Gelsinger has said that the shortage could last till 2024. This shortage has also increased the pricing of semiconductor chips. Raw materials, foundry testing and assembly, logistics, and labour have all become more expensive. Manufacturers have passed this increase on to the semiconductor suppliers, who in turn are forced to pass their costs on to their customers to help stabilize the supply chain. Chip suppliers have raised prices by between 10% and 20% on average globally. Storage Devices: Storage devices are hardware components used to store data permanently or temporarily. Be it for a photo album, a movie, or a company's business systems, data storage has become essential. With evolving technology, data storage capacity and efficiency have both increased significantly within computers. Examples include hard disk drives (HDDs) and solid-state drives (SSDs), which keep our files, applications, and operating system safe and accessible. Removable storage like USB flash drives and memory cards provides portable solutions for transferring data between devices.
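To see this kind of storage on your own machine, the Python standard library's shutil.disk_usage function can report the capacity of a drive. The sketch below is only an illustration; the path "/" is an assumption (on Windows you would typically pass "C:\\"), and the comparison with 1.44 MB floppy disks is added purely for perspective.

```python
import shutil

# Query the capacity of a storage device (an HDD or SSD).
# The path "/" is an assumption; use "C:\\" on Windows.
usage = shutil.disk_usage("/")

GB = 1024 ** 3  # bytes in one gibibyte
print(f"Total space : {usage.total / GB:.1f} GB")
print(f"Used space  : {usage.used / GB:.1f} GB")
print(f"Free space  : {usage.free / GB:.1f} GB")

# For perspective: how many 1.44 MB floppy disks would this drive replace?
floppy = 1.44 * 1024 ** 2
print(f"Equivalent floppy disks: {usage.total / floppy:,.0f}")
```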
Prepared by USO, for the exclusive use by students appearing in USO National Test. 15 IT & AI PRIMER Let’s review the evolution of storage devices. Punch Cards, which had been used as early as 1775, were small pieces of paper that could hold data in the form of small punched holes. The holes were strategically positioned to be read by computers. The first computers like the Colossus and the ENIAC used Paper Tape as the input device. Magnetic Tape, invented in 1928, was a system for storing digital information and was made of a thin, magnetizable coating on a long, narrow strip of plastic film. Magnetic Drums and Magnetic Core Memory were then invented that worked on the same principle of magnetizable coatings. In the 1950s, the most widely installed computer was the IBM 650, which used magnetic drum memory onto which programs were loaded using punched cards. 2.4 Punch Cards 2.5 Magnetic Tape Philips introduced the compact audio cassette in 1963. Although originally intended for dictation machines, it quickly became a popular method for distributing pre-recorded music. In 1979, Sony's Walkman used the audio cassette tape, to create the first portable player for customized music lists. 2.6 Sony’s Walkman 2.7 Audio cassette tape The Floppy Disk, a portable storage device made of magnetic film encased in plastic, made it easier and faster to store data. It was invented at IBM by David Noble. An 8” version was launched in 1971 with a capacity of 1.2MB, followed by the 5.25” version in 1976. Subsequently, the smaller 3.25” version was introduced in 1981 with a capacity of 1.44MB. Prepared by USO, for the exclusive use by students appearing in USO National Test. 16 IT & AI PRIMER The Compact Disc was invented in 1980 by James T. Russel, which increased the storage capacity significantly to 700 MB since it used lasers to record and replay music. This led to the development of the CD-ROM in 1984. 2.8 (L to R): Floppy Disk 5.25”, Floppy Disk 3.25”, Compact Disc (also known as CD) In 1994, Compact Flash (CF), also known as “flash drives,” were launched. Compact Flash used flash memory in an enclosed disc to save digital data. Toshiba launched the SmartMedia, a flash memory card, in 1995 to compete with MiniCard and SanDisk. These evolved gradually to what we know today as the microSD card. In 2006, SanDisk released a microSD card, having the capacity to store about 2GB of data, for $99. The USB Flash Drive, invented in 1999, uses a NAND-type flash memory to store digital data. The Trek ThumbDrive, the world’s first USB flash drive, had the capacity to hold about 8MB of data, about 4 times that of a floppy disk, and was priced at about $28. 2.9 SD cards 2.10 USB Flash The Hard Disk Drive (HDD) uses magnetic storage to store and retrieve digital information with one or more rigid rapidly rotating disks coated with magnetic material. The platters are paired with magnetic heads, arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner. IBM 350 was the first disk drive, with the size of a large wardrobe, and data storage capacity of 3.75 MB, and was available for lease at $3,200 per month. Prepared by USO, for the exclusive use by students appearing in USO National Test. 17 IT & AI PRIMER With ever-increasing computation and storage needs for large organisations, the concept of servers arose, which evolved into Server Farms and Data Centers. 
A Server is a computer or a software system that provides services, data, or resources to other computers, known as clients, over a network. Servers host websites, manage email, store files, or run applications. They are designed to handle multiple requests simultaneously and are optimized for reliability, availability, and performance. Server farms: A server farm or server cluster is a collection of computer servers maintained by an organization to supply server functionality far beyond the capability of a single machine. Server farms often consist of thousands of computers, which require a large amount of power to run and to keep cool. They also include backup servers that can take over the functions of primary servers that may fail. Companies like Google, Facebook, Twitter, Microsoft, and Amazon create server farms for their services or internal usage. Google has different server farms that contain nearly 10,000 servers. 3.4 Facebook's Server Farm is located in Lulea, Sweden (70 miles south of the Arctic Circle). This Data Centre uses local natural resources to be more energy efficient, and the Arctic Circle provides natural cooling for the equipment. With the exponential growth of digital information, the need for storage and its related services grew. This is because businesses and individuals began to generate, store, and process vast amounts of data, making the traditional methods of data management inadequate. Hence, the demand for reliable, secure, and scalable storage solutions, coupled with the need for constant access to data, led to the development of specialized facilities known as Data Centers. These centers house critical IT infrastructure, including servers, storage devices, and networking equipment, ensuring efficient data management and uninterrupted access to information. Companies like Amazon, Google, and Microsoft pioneered cloud computing, offering scalable and flexible computing resources over the Internet. This shift from on-premise data centers to cloud data centers allowed businesses to offload their IT infrastructure to third-party providers, reducing costs and complexity. Prepared by USO, for the exclusive use by students appearing in USO National Test. 18 IT & AI PRIMER By 2017, Cloud Data Storage became the preferred medium for storage of data. With improvements in internet bandwidth and the falling cost of storage capacity, it is surely more economical for businesses and individuals to outsource their data storage to the cloud, rather than buying, maintaining, and replacing their own hardware. Cloud Storage is available through a number of providers such as Dropbox, Box, OneDrive and Google Drive. 2.11 Hard Disk Drive 2.12 Cloud storage is depicted in the above manner But what exactly is the cloud? Simply put, the cloud is the Internet: it is all the things we can access remotely over the Internet. When something is in the cloud, it means it's stored on Internet servers instead of our computer's hard drive. So, instead of buying, owning, and maintaining physical data centers and servers, all technology services can be accessed from a cloud provider. Cloud Storage: Over the last 5 years, the Cloud has become the preferred place for people and organizations to store their data as they are not investing in increasing their hardware footprint. The main reason for using the Cloud is that storing data and applications on the cloud can be done with a very small investment.
Additionally, the cost of maintaining the applications and the data goes up or down based on the usage of storage space. This avoids situations where companies have bought several computers, but they are not all being used regularly. The top 3 companies that are leaders in this providing cloud storage to large organizations are: 1. Amazon Web Services (AWS) 2. Microsoft Azure 3. Google Cloud Platform (GCP) The Banking industry produces the most activity within the cloud. As consumers, we are already using Cloud services on a smaller scale through Microsoft’s OneDrive, Google Drive, and Apple’s iCloud. And now cloud services have given rise to a whole new way of working – the ‘as a service’ model. In these instances, users pay directly for the usage of software applications and data without having to worry about any of the underlying technology elements. Prepared by USO, for the exclusive use by students appearing in USO National Test. 19 IT & AI PRIMER TECH SPOTLIGHT AWS was the pioneer in cloud storage and was an offshoot of the main Amazon e-commerce business. Amazon as a company realized that they had invested a great deal with computers in data centres. These computers were incurring running costs but were not being used to compute tasks regularly. They also realized that there was a significant seasonality to their business, e.g.- the traffic to the website went up during Christmas time, which required additional investment. Hence, they came up with a model where the unused compute power during low traffic durations could be rented out to other organizations. This offering became so popular that Amazon decided to form a separate company - Amazon Web Services (AWS). Now, AWS is far more profitable than the original Amazon.com. SanDisk is an Israeli American manufacturer of flash memory products, such as memory cards and readers, USB flash drives, and solid-state drives. SanDisk delivers flash storage solutions used in data centers, that are embedded in smartphones, tablets, and laptops. Founded in 1988 by Eli Harari, Sanjay Mehrotra and Jack Yuan, SanDisk was the third- largest manufacturer of flash memory in 2015. On May 12, 2016, SanDisk was acquired by hard drive manufacturer Western Digital in a US$19 billion deal. Western Digital Corporation (commonly referred to as Western Digital and often abbreviated as WDC or WD) is an American computer data storage company. Founded on April 23, 1970, by Alvin B. Phillips, a Motorola employee, as General Digital, initially (and briefly), WD was a manufacturer of MOS test equipment. It rapidly became a specialty semiconductor maker, with start-up capital provided by several individual investors and industrial giant Emerson Electric. It is one of the largest computer hard disk drive manufacturers in the world, along with its main competitor Seagate Technology. Prepared by USO, for the exclusive use by students appearing in USO National Test. 20 IT & AI PRIMER Where do you store your files? Do you use the Cloud? Every computer contains both RAM and ROM. RAM is a computer's short-term memory for active tasks, while ROM is long-term memory for essential startup instructions. Let’s view them in detail: RAM (Random Access Memory) is a type of volatile memory. The data in RAM is not permanently written. When the computer is powered off, the data stored in RAM is deleted. RAM is very fast, allowing quick read and write access to support active processes and applications. 
Hence, it is used to allow applications to quickly access data they are currently using. To summarise, RAM is the temporary memory that a computer uses to store data and instructions that are currently in use or being processed. ROM (Read-Only Memory) is a type of non-volatile memory. Data in ROM is permanently written and is not erased when the computer is powered off. ROM can be written only once, hence the term "read-only." ROM is slower compared to RAM, as it is mainly used for reading data rather than writing. It is used for instructions pre-written by a manufacturer, such as how to boot up the computer. ROM stores important instructions that the computer needs to start up and operate, such as the BIOS (Basic Input/Output System). Input and Output Devices: Input and Output devices are pieces of hardware used by a human (or other system) to communicate with a computer. A keyboard or mouse is an input device for a computer, while monitors and printers are output devices. Other devices for communication between computers, such as modems and network cards, perform both input and output operations. With technologies like smartphones, the boundaries between input and output devices have blurred. Monitors which were once output devices, and now also used as input device in touchscreen laptops, tablets, or mobiles. Prepared by USO, for the exclusive use by students appearing in USO National Test. 21 IT & AI PRIMER The following are examples of input and output devices: Input Devices Output Devices * Punched card/tape * Punched card/tape * Keyboard * Screen/Monitor * Pointing devices: Mouse/ * Speakers Trackpad/Trackball * Headphones * Composite devices: Joystick, * Printer Wii Remote, Gaming paddle * Automotive navigation system * Video input devices: * Plotter Webcam, Digital Camera/ Camcorder, Microsoft Kinect sensor,Scanner, Barcode reader * Audio input device: Microphone Keyboard - The famous “QWERTY” keyboard was designed in the 1800s for mechanical typewriters. It was actually designed to slow typists down to avoid jamming the keys on mechanical units. It is called QWERTY, because the first six typing keys on the top row of letters spell QWERTY. Nowadays, physical keyboards are being replaced by virtual ones with the screen itself acting as the input device. 2.13 Keyboard Mouse – The Mouse was conceived by Douglas Engelbart in 1963 at Xerox and did not become popular until 1983 with Apple Computer's Lisa and the 1984 Macintosh. It was not adopted by IBM PCs until 1987 – although compatible computers such as the Amstrad PC1512 were fitted with mice before this date. To make mice affordable for the consumer, relatively low- quality rubber ball-based versions were commonly used, instead of the more expensive optical versions that required dedicated mouse pads with embedded lines. As time went on, other kinds of optical mice, which could operate on ordinary surfaces, became affordable and are by far the most widely used today. Prepared by USO, for the exclusive use by students appearing in USO National Test. 22 IT & AI PRIMER Mice come as wired and wireless. While the wired mouse connects to the USB port through a cable, the wireless mouse has a dongle that is inserted in the USB port of the computer or laptop, and it runs on a battery. The wireless mouse connects to the dongle through the Bluetooth. 2.14 Wired Mouse 2.15 Wireless Mouse From tablet to touch: Surprisingly, the graphics tablet and stylus-based input pre-dates the mouse. 
Used in the 1950s and ’60s for the input of graphical data into computer systems, even Apple sold one called the BitPad as an accessory to the Apple II, in the 1980s. More recently, several tablets like the iPad have become popular. The same touch technology has also been built into the latest models of smartphones. 2.16 Touch screen and a stylus-pen used in place of a mouse Monitors - Originally, computer monitors were used for data processing while television receivers were used for entertainment. From the 1980’s onwards, computers (and their monitors) have been used for both data processing and entertainment, while televisions have absorbed some computer functionality. Multiple technologies have been used for computer monitors. Until the 21st century most monitors used cathode ray tubes (CRT) but they have largely been superseded by LCD (Liquid Crystal Display) technology. Throughout the 1990’s, the primary use of LCD technology as computer monitors was in laptops where the lower power consumption, lighter weight, and smaller physical size of LCD's justified the higher price and lower space requirements versus a CRT. Prepared by USO, for the exclusive use by students appearing in USO National Test. 23 IT & AI PRIMER 2.17 Computer monitor 2.18 Smart TV Printers - The first computer printer design was a mechanically driven apparatus by Charles Babbage for his ‘Difference Engine’ (the first mechanical computer) in the 19th century.The first electronic printer was the EP-101, invented by the Japanese company Epson, released in 1968. The first laser printer (IBM 3800) was introduced by IBM in 1976. In 1984, Hewlett-Packard released the LaserJet printer, and in 1988 the HP Deskjet was introduced using inkjet technology. Inkjet and LaserJet systems rapidly displaced dot matrix and daisy wheel printers from the market, thus setting off the ‘desktop publishing’ revolution. By the 2000s high-quality printers of this sort had fallen under the $100 price point and became more affordable for consumers to use. Starting around 2010, 3D printing has become an area of intense interest, creating a revolution in the manufacturing industry. A 3D printer is a device for making a three-dimensional object from a 3D model or other electronic data source through additive processes in which successive layers of material (including plastics, metals, food, cement, wood, and other materials) are laid down under computer control. It is called a printer by analogy with an inkjet printer which produces a two-dimensional document by a similar process of depositing a layer of ink on paper. 2.19 3D Printer Prepared by USO, for the exclusive use by students appearing in USO National Test. 24 IT & AI PRIMER Putting it all together: Apple put all the elements of a touch-driven User Interface (UI) together in the iPhone and then the iPad. Most previous devices, including Windows Mobile, relied on a stylus and a single point of contact with a resistive touch screen. Apple’s approach was to use a multi- touch capacitive screen, which is very sensitive to finger touches and can detect several touches at once to allow for “pinch” and “zoom” gestures. Text input relies on a virtual keyboard which mimics a physical keyboard, or a small keyboard connected over Bluetooth. This approach has quickly become the standard for mobile devices — essentially killing stylus-based mobile devices almost overnight. 
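The "pinch" and "zoom" gestures mentioned above come down to simple geometry: the screen tracks two finger positions and compares how far apart they are before and after the movement. The following is a hypothetical sketch of that calculation in Python, not Apple's actual implementation; the coordinates and function names are invented for illustration.

```python
import math

def distance(p1, p2):
    """Straight-line distance between two touch points (x, y) in pixels."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def zoom_factor(start_touches, end_touches):
    """Return >1 for a spread (zoom in), <1 for a pinch (zoom out)."""
    before = distance(*start_touches)
    after = distance(*end_touches)
    return after / before

# Two fingers start 200 px apart and end 300 px apart: zoom in by 1.5x.
start = [(100, 300), (300, 300)]
end = [(50, 300), (350, 300)]
print(f"Zoom factor: {zoom_factor(start, end):.2f}")
```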
2.20 Tablet (left) and Mobile phone (right) having the touchscreen TECH SPOTLIGHT Apple, founded by Steve Wozniak and Steve Jobs, is an American technology company that designs, manufactures, and sells devices, software, and services. iPhone, iPad, iPod are Apple’s flagship products that operate on MacOS and iOS systems. One of the most successful technology companies of the world, it started in Steve Job’s garage, and is headquartered in Cupertino, California. Some interesting facts about Apple Inc. are The name ‘Apple’ came from founder Steve Jobs’ love of the fruit. Since the first iPhone unveiled in 2007 at 9.41 a.m., in every iPhone ad, the time is shown as 9.41. iPhones sell for twice the price in Brazil than in the US. Apple was the first to make a digital colour camera. Prepared by USO, for the exclusive use by students appearing in USO National Test. 25 IT & AI PRIMER Can you name the various hardware parts that form part of your computer or laptop? Other Key Innovators IBM announced their IBM Personal Computer with MS-DOS 1.0 in August 1981. 100,000 orders were taken by Christmas. The design turned out to be far more successful than IBM had anticipated. This became the basis for most of the modern personal computer industry. BlackBerry helped innovate with a tiny but almost fully featured keyboard, often called a “thumb keyboard,” which allowed users to carefully type with one or both thumbs. This was a huge productivity improvement for many frequent texters and was very popular in its original form or as a “slider” that hides under the main phone. Prepared by USO, for the exclusive use by students appearing in USO National Test. 26 IT & AI PRIMER MEMORY CHECK Fill in the blank: ___________ units act as the memory of the computer. Storage. Fill in the blank: The _________ is an example of a fixed program computer. Calculator. Which device controlled the Control Unit and Arithmetic Logic Unit? Vacuum tubes. What is the main memory unit of a computer system? RAM. Which unit of the processor coordinates all the units of the computer? Control Unit. Under which section of the processor is data matching and merging performed? Logic Section. Which company introduced the first single-chip microprocessor? Intel. What innovation gave rise to the second generation of computers? Integrated Circuits (IC). True or False: The monitor acts as both an input and output device. True. True or False: Philips introduced the Walkman. False. Prepared by USO, for the exclusive use by students appearing in USO National Test. 27 IT & AI PRIMER Chapter 3 Technology: Software This chapter explores the world of computer software. We cover ‘Application Software’ (Apps), including video games, and spotlight key companies in the industry. We will learn about ‘System Software’ with a timeline of operating system evolution, device drivers, and utilities. Further, we will explore Programming Languages and provide a comprehensive timeline of their evolution, leading to the current Low Code / No Code Movement, which is transforming how software is developed and deployed. This chapter aims to provide a thorough understanding of software technology, its history, and its current trends, equipping you with the knowledge to appreciate the integral role software plays in the tech landscape. Software is a part of a computer system that consists of one or more computer programs and related data. It provides instructions for telling a computer what to do and how to do it. 
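As a concrete illustration of "instructions for telling a computer what to do", here is a very small program. Python is used only as an example; the chapter itself is not tied to any particular language.

```python
# Three instructions: store data, transform it, display the result.
name = "USO student"               # store a value in memory
greeting = "Hello, " + name + "!"  # process: build a new piece of information
print(greeting)                    # output the result on the screen
```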
Software is classified into ‘Application Software’ and ‘Systems Software’. Users do not interact directly with the hardware of the computer. Users input information into the computer and get information from an Application Software. This Application Software, in turn, interacts with System Software, that controls the Hardware. 3.1 The distinction between Application Software and System Software Prepared by USO, for the exclusive use by students appearing in USO National Test. 28 IT & AI PRIMER Application Software or Applications or Apps: Apps are what most of us mean when we think of software. Application software uses the computer system to perform special functions beyond the basic operation of the computer. A wide variety of application software exists to meet a range of business and consumer applications. Some of these are: Computer-aided design (e.g.- AutoCAD) Databases (e.g.- Oracle) Image editing (e.g.- Photoshop) Spreadsheets (e.g.- Excel) Telecommunications (e.g.- Skype) Video editing software (e.g.- TikTok) Video games (e.g.- Minecraft) Word processors (e.g.- Word) Application software products are designed to satisfy a particular need of a specific environment. All software applications prepared in the computer lab can come under the category of Application software. Application software may consist of a single program, such as Microsoft Word for writing and editing text. It may also consist of a collection of programs, often called a software package, which work together to accomplish a task, such as Microsoft Office. Application software is made keeping the user in mind and is therefore normally easy to use. It is, however, written in a high-level language that can only be changed, improved, or worked on by the programmer or developer. What is an Application Programming Interface? An Application Programming Interface (API) is a computing interface that defines interactions between multiple software intermediaries. It defines the requests that can be made, how to make them, the data formats that should be used, the conventions to follow and more. It can also provide extension mechanisms so users can extend existing functionality in various ways. An API can be entirely custom, specific to a component, or designed based on an industry standard to ensure interoperability. Through information hiding, APIs enable modular programming, allowing users to use the interface independently. Prepared by USO, for the exclusive use by students appearing in USO National Test. 29 IT & AI PRIMER What are video games? A video game is an electronic game that involves interaction with a user interface or input device – such as a joystick, controller, keyboard, or motion- sensing device, to generate visual feedback for a player. Video games are defined based on their platform, which includes arcade games, console games, and PC games. Video games are also a type of application software. They are created by programmers and then released to users. One of the earliest video games was Tetris, written by a Russian national, Alexey Pazhitnov. It was later released for various Western game machines, the crown jewel being its inclusion with Nintendo's Game Boy in 1989. Alexey made nothing from the game since under the Communist Regime it was owned by the people. Built on simple rules and requiring intelligence and skill, Tetris established itself as one of the great early video games. 
It has sold 202 million copies – approximately 70 million physical units and 132 million paid mobile game downloads – as of December 2011, making it one of the best-selling video game franchises of all time. Minecraft is a video game developed by Swedish video game developer, Mojang Studios. Created by Markus Persson in the Java programming language, Minecraft has since been ported to several other platforms and is the best-selling video game of all time, with 200 million copies sold and 140 million monthly active users as of 2021. Minecraft has been critically acclaimed, winning several awards and being cited as one of the greatest video games of all time. Minecraft is a 3D sandbox game that has no specific goals to accomplish, allowing players a great deal of freedom in choosing how to play the game. Due to its popularity, the game has also been used in educational environments to teach chemistry, computer-aided design and computer science. 3.2 Tetris 3.3 Minecraft Prepared by USO, for the exclusive use by students appearing in USO National Test. 30 IT & AI PRIMER System Software: System software is software (which may be a collection of programs) that directly operates the computer hardware, to provide basic functionality needed by users and/or other software. This helps us operate, control, and extend the processing capabilities of the computer itself. System software also provides a platform for running the application software. System software is generally prepared by the computer manufacturers. These software products comprise programs written in low-level languages, which interact with the hardware at a very basic level, thereby serving as the interface between the hardware and the end user. Some examples of system software are Operating Systems, Compilers, Interpreter, Assemblers, etc. System software is of three types: Operating Systems (OS); Device Drivers; and Utilities Operating systems (OS): These are essential collections of software that manage computer hardware and software resources and provide common services for computer programs (i.e., application software, device drivers and utilities) that run "on top" of them. Supervisory programs, boot loaders, shells and window systems are core parts of operating systems. Operating systems are specialized software that control and monitor the execution of all programs that reside in the computer, including application software. OS is found on many devices that contain a computer – from cellular phones and video game consoles to web servers and supercomputers. Why is OS important? Why do we need it? The reason why OS is important is because it makes the computer system easy and convenient to use. It also helps manage the resources of the computer system efficiently. For instance, the operating system helps keep track of primary memory, prevents unauthorized access to programs and data by using a password, keeps track of all devices, coordinates between other software and users etc. What are some of the commonly used OS? Five of the most common operating systems are Microsoft Windows, Apple macOS, Linux, Android and Apple's iOS. Prepared by USO, for the exclusive use by students appearing in USO National Test. 31 IT & AI PRIMER 1969 UNIX was developed by Ken Thompson and Dennis Ritchie, who would go on to develop the ‘C’ language later. 1976 Berkeley Software Distribution (BSD), created by University of California Berkeley, first Open Source OS, which later formed core of MacOS X. 1981 Development of MS-DOS begins. 
Microsoft commissioned by IBM to write the operating system for the IBM PC. 1983 Lisa OS debuted on Apple Lisa desktop computer and introduced the concept of the Graphical User Interface (GUI). 1990 Microsoft Windows 3.1 introduced, establishing the GUI in personal computers. 1991 Linus Torvalds releases an open-source, Unix-like OS kernel called Linux. 2001 First Mac OS X is released in the same year as Windows XP. 2004 Ubuntu was released as a Linux distribution based on the Debian architecture. This is currently the most popular operating system running in ‘cloud’ hosted environments. 2007 Android was released by Google, as a mobile operating system based on the Linux kernel. It was designed for touchscreen mobile devices. 2008 iOS 2.0 released one year after release of the iPhone. The first release of iOS to support third-party applications via AppStore. 3.5 Timeline of the evolution of Operating Systems Prepared by USO, for the exclusive use by students appearing in USO National Test. 32 IT & AI PRIMER In 2015, Apple releases watchOS, as the mobile operating system of the Apple Watch. TECH SPOTLIGHT Android is a mobile operating system developed by Google. It is based on the Linux kernel and designed primarily for touchscreen mobile devices such as smartphones and tablets. Android's user interface is based on direct manipulation, using touch gestures that corresponding to real-world actions, such as swiping, tapping and pinching, to manipulate on-screen objects, along with a virtual keyboard for text input. In addition to touchscreen devices, Android has also developed Android Auto for cars, and Android Wear for wrist watches, each with a specialized user interface. Variants of Android are also used on game consoles, digital cameras, PCs and other electronics. What operating system does your mobile phone and computer (or laptop) operate with? Device drivers: Commonly known as a driver, a device driver or hardware driver, these operate or control a particular type of hardware device attached to a computer. A driver provides a software interface (group of files with instructions) to hardware devices, enabling the operating systems and other computer programs to access hardware functions. A device driver does not use standard commands. Drivers are hardware- dependent and operating-system-specific. Each device needs at least one corresponding device driver, as a computer has at least one input and one output device. Each requires its driver as different hardware requires different commands. Without drivers, the computer would not be able to send and receive data correctly to hardware devices, such as a printer. Prepared by USO, for the exclusive use by students appearing in USO National Test. 33 IT & AI PRIMER Using the wrong device driver can prevent hardware from working correctly. For example, an HP printer will not work with a computer that only has a Canon printer driver. Different operating systems also need different drivers, a driver written for macOS cannot be used by Microsoft Windows. Keeping drivers up to date avoids potential problems when using programs with the new hardware. Utilities: These are computer programs designed to help analyse, configure, optimize or maintain a computer. Utilities are used for the limited purpose of changing the overall behaviour of hardware or other software. There are various types of utilities. These are: Anti Virus: The anti-virus scans the computer for computer viruses, intimates the user and removes them. 
It also protects the computer system from other threats such as identity theft, password protection etc. Eg: Norton, McAfee, Quick Heal Disk Cleaners: These find files that are unnecessary for any computer operation, or that take up considerable amounts of space. It intimates the user and helps users to decide what to delete when their hard disk is full. Disk Compressions: This utility transparently compresses/uncompresses the contents of a disk, increasing the capacity of the disk. Network Utilities: Network Utilities analyses the computer's network connectivity, configure the network settings, and check data transfer or log events. It helps users select which network to use. System Monitors: Monitors resources and performance of a computer system. Registry Cleaners: These clean and optimize the Windows Registry by removing old registry keys that are no longer in use. Prepared by USO, for the exclusive use by students appearing in USO National Test. 34 IT & AI PRIMER TECH SPOTLIGHT Started in 1975, with headquarters in Washington, Microsoft, was founded by Bill Gates and Paul Allen to create an operating system for the IBM PC, called MS-DOS (MicroSoft-Disk Operating System). This led to the development of the Microsoft Windows line of operating systems. Microsoft now develops, manufactures, licenses, supports, and sells computer software, consumer electronics, personal computers, and related services. Its best-known software products are the Microsoft Windows line of operating systems - The Microsoft Office Suite, the Internet Explorer and Edge web browsers. Its flagship hardware products are the Xbox video game consoles and the Microsoft Surface line-up of touchscreen personal computers. Their constant innovation, modern designs and easy-to-use software has made them immensely successful. Microsoft shareholders have benefitted a great deal from the success of the company, making three billionaires and 12,000 millionaires since inception. What is a Kernel? A Kernel is a computer program that is the heart and core of an Operating System. Since the Operating System has control over the system, the Kernel also has control over everything in the system. Programming Languages: For any Application or System Software, programming languages are required, as Software is written in one or more Programming Languages. While there are many programming languages in existence, each has at least one implementation environment, consisting of its own set of programming tools. These tools may be relatively self-contained programs such as compilers, debuggers, interpreters, linkers, and text editors, that can be combined to accomplish a task; or they may form an integrated development environment (IDE), which combines much or all of the functionality of such self-contained tools. Examples of IDE’s include Eclipse, IntelliJ and Microsoft Visual Studio. Prepared by USO, for the exclusive use by students appearing in USO National Test. 35 IT & AI PRIMER Programming languages define the syntax and semantics of computer programs. The software is written in high-level programming languages that are easier and more efficient for programmers to use as they are closer than machine languages to natural languages. High-level languages are translated into machine language using a compiler, an interpreter or a combination of the two. Recent trends in the evolution of programming languages include a focus on Open Source as a developmental philosophy, as well as support for Parallel Processing. 
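As noted above, high-level source code is translated into lower-level instructions by a compiler or an interpreter. One way to peek at this translation, shown here only as an illustrative sketch, is Python's standard dis module, which prints the bytecode instructions the Python interpreter produces for a function; the function add_tax is an invented example.

```python
import dis

def add_tax(price):
    """One line of high-level code..."""
    return price * 1.18

# ...is translated into several low-level bytecode instructions,
# which the interpreter then executes one by one.
dis.dis(add_tax)
```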
C#, Scala, Windows PowerShell and Swift are a few of the languages in the post-2000 world. The design for what would have been the first piece of software was written by Ada Lovelace in the 19th century but was never implemented. Before 1946, software as we now understand it—programs stored in the memory of stored-program digital computers—did not exist. A timeline of the evolution of the programming languages is below: FORTRAN (FORmula TRANslation), the first high-level programming language, started by John Backus and his team at IBM in 1954. COBOL (Common Business-Oriented Language), developed by Grace Murray Hopper in 1961 as a successor to FLOW-MATIC. BASIC (Beginners All Purpose Symbolic Instruction Code), developed at Dartmouth College, USA, by Thomas E. Kurtz and John George Kemeny in 1964. C, developed at Bell Laboratories, US, by Dennis Ritchie, one of the inventors of the Unix operating system, in 1972. Inspired by C, C++, introduced in the 1980s, helped usher in the era of object-oriented programming concepts that are still followed today in .NET and Java. Prepared by USO, for the exclusive use by students appearing in USO National Test. 36 IT & AI PRIMER The Internet Age of the 1990s saw the creation of languages like Visual Basic, Python, R, Java, JavaScript, and PHP, many of them scripting languages. Does one need to learn programming to understand IT? The answer is a big 'NO'! Historically, when software had to be developed, the option for an organization was always whether to 'Build' or to 'Buy'. The closest analogy to programming is the 'Build' option, which is like having a tailor make a shirt or dress customized to your requirements. This comes at a higher cost and a long wait time before the product is developed. The 'Buy' option, in the same analogy, is to pick up a ready-made shirt or dress from a store. The software equivalent is buying existing pre-built software. The cost and the wait time are significantly reduced, but the 'fit to requirements' is not as close as the 'Build' option. But now, a third option is available where we can 'Buy' some base functionality and then improve the fit by adding relevant modules as needed. This can be done either through a 'No-Code' environment where the required functionality can be added from a menu, or through a 'Low-Code' environment, which requires only 10% of the programming needed in a 'Build' scenario. Generally, 'Low-Code' platforms contain drag-and-drop features that initiate automatic code generation. In recent years, several platforms have become available for 'citizen developers' to use, who do not have a background in technology. These Low-Code and No-Code platforms allow non-techies to design, build and launch solutions quickly and painlessly. Wordpress.org and Salesforce are perfect examples of this 'Low Code-No-Code Movement'. Wordpress.org: More than one-third of all websites are run on Wordpress.org. Extremely simple to use, the website developer first begins by installing a free version. Several free and paid 'plug-ins' can be downloaded that offer added functionalities like Forms, SEO, Website Analytics, etc. Most of this can be done without any coding knowledge. Salesforce: It is the #1 Customer Relationship Management (CRM) software in the market. For organizations trying to develop a 360-degree view of their customers, this is the foundation of capturing customer data and getting insights about their purchase patterns.
While all infrastructure, security, data storage and operational aspects are taken care of by Salesforce, it also allows users to quickly create applications without writing any code.

MEMORY CHECK
Which software controls the computer hardware? System Software.
True or False: The joystick is an input device. True.
Which video game is considered one of the greatest and has won several awards? Minecraft.
How many types of system software are there? Three.
Which system software provides a group of files with instructions to hardware devices? Device drivers.
Which system software helps users choose and connect to a network? Network utilities.
Which program has control over everything in the system? Kernel.
Through what process are high-level languages translated into machine language? Compilation or Interpretation.
Which is the first high-level programming language? FORTRAN.
Which OS has Google developed for their tablets and mobile phones? Android.

CHAPTER 4
Technology: Networking
Enter the world of "Technology: Networking," where we delve into computer networks. This chapter explores the various types of networks, the evolution of the Internet, and advancements in 5G networks for mobile devices. We showcase key companies in the networking industry and discuss computer security, including ransomware threats and strategies for online safety. This chapter aims to provide a thorough overview of networking technology, spanning from fundamentals to cutting-edge developments, underscoring its crucial role in our interconnected world.

The power of technology is best felt when we can connect with others to exchange information. Advances in connectivity and networking have made possible some of the technologies we see and experience today. A computer network is a system in which multiple computers are connected to share information and resources. Files that are created and stored on one computer can be accessed from other computers on the same network. Printers and scanners can be connected to several computers through a network system.

How can we connect several computers? The following is required to set up any computer network:
Network cables: These cables are used to connect the various computers that will form the network.
Switch: A computer communicates with other devices via a port. When several computers need to be connected to form a network, a switch is required. It connects the computers, printers, scanners, etc., thereby managing and distributing network traffic.
Router: A router is a device that acts as the central point among the computers and other devices that are part of the network. It is equipped with ports, and computers and other devices are connected to it using network cables. Nowadays, routers also come in wireless models, which allow computers to connect without any physical cable.
Network card: A network card is a necessary component of a computer, without which a computer cannot be connected to a network. It is also known as the Network Adapter or Network Interface Card (NIC). Most branded computers have network cards pre-installed. Network cards are of two types: Internal and External Network Cards.
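Once this hardware is in place, programs on networked computers exchange data through software. The sketch below is a minimal, hedged illustration using Python's standard `socket` module; the address 127.0.0.1 (this machine) and port 50007 are arbitrary example values, and both the "server" and the "client" run in one file for simplicity.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007   # example address: this machine, an arbitrary port

def server():
    # Listen for one connection and echo a greeting back to the client.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            data = conn.recv(1024)            # bytes sent by the client
            conn.sendall(b"Hello " + data)    # reply over the same connection

def client():
    # Connect to the server and send a short message.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"from the client!")
        print(cli.recv(1024).decode())        # prints: Hello from the client!

t = threading.Thread(target=server)
t.start()
time.sleep(0.2)   # give the server a moment to start listening
client()
t.join()
```

In a real network, the server and client would run on different machines connected through the switch or router described above.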
With all the cables, network cards, etc., will all computers globally be connected? There are two types of networks that computers can be connected to. These networks are:
Local Area Network (LAN): A local area network (LAN) is a computer network that interconnects computers within a limited area such as a residence, school, laboratory, university campus, or office building. Ethernet and Wi-Fi are the two most common technologies in use for local area networks.
Wide Area Network (WAN): A wide area network (WAN) not only covers a larger geographic distance but also generally involves leased telecommunication circuits. It connects several LANs. The Internet is an example of a worldwide public WAN.

4.1 LAN vs WAN

What about the Internet? How did it come about and how does it work?
In 1969, work on the ARPANET was begun by the United States Department of Defense for research into networking. It is the original basis for what now forms the Internet. It was opened to non-military users later in the 1970s, and many universities and large businesses went online. The linking of commercial networks and enterprises in the early 1990s marked the beginning of the transition to the modern Internet, and generated rapid growth as institutional, personal, and mobile computers were connected to the network. By the late 2000s, its services and technologies had been incorporated into virtually every aspect of everyday life.

The Internet today is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link devices worldwide. It is a network of networks that consists of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and file sharing.

And here is what the Internet has grown into...
Most traditional communications media, including telephony, radio, television, paper mail, and newspapers, are being reshaped, redefined, or even bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspapers, books, and other print publishing are adapting to website technology, or are being reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, internet forums, and social networking. Online shopping has grown exponentially both for major retailers and for small businesses and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or even sell goods and services entirely online. Business-to-business and financial services on the Internet affect supply chains across entire industries.

With COVID-19, the entire world stayed connected despite being isolated, thanks to the Internet. Whether it is social media, news, online shopping, or attending online classes, none of these would have been possible without the Internet. This has also put a huge demand on all internet and related services.
Global Wi-Fi traffic has increased, with more uploads to cloud computing platforms, more video calls, and more social media connectivity. This has pushed countries to become digitally ready sooner rather than later.

How has the Internet evolved?
Web technology and the way it is used have evolved over the last few years, and each evolution has brought about new tools and techniques. These evolutions have popularly been called Web 1.0, Web 2.0, and Web 3.0. The main differences are:

Web 1.0 is the 'read-only Web' and is the earliest form of the Internet. Technologies such as HTML and HTTP allowed the publishing of static webpages that were read-only, i.e., the user was only able to read the content, not edit it. This era was driven by the rapid growth of applications like email, blogs, and real-time news retrieval.

Web 2.0 is the 'participative social Web' and is a better, more enhanced version of Web 1.0. Starting in the early 2000s, it made the internet more interactive because of web technologies like JavaScript, HTML5, and CSS. It allowed companies to create interactive web platforms like YouTube, Facebook, and Wikipedia.

Web 3.0 is the 'read, write, execute Web' and is the name given to the third evolution of the World Wide Web. It combines multiple technologies, such as AI and Blockchain, that allow data to be processed with near-human intelligence and power innovative applications shaped by each user's own choices. This lets each user become a content owner instead of just a content user.

What is the network used in mobile devices?
5G is the fifth-generation technology standard for broadband cellular networks and is the planned successor to the 4G networks which provide connectivity to most current cell phones. 5G networks are predicted to have more than 1.7 billion subscribers and account for 25% of the worldwide mobile technology market by 2025. Approximately every 10 years, we have seen a 20-50x improvement in data transmission speeds. In 2019, 5G began to surface as companies like Qualcomm, Apple, Verizon, and T-Mobile began discussing, and debuting products with, 5G capabilities. 5G is the fifth generation of broadband cellular networks, the evolution of which is shown below.

1G (1979): Analog system, dropped calls, giant cell phones. Frequency: 30 KHz; Bandwidth: 2 kbps; Average speed: 2 kbps; Range: N/A.
2G (1991): SMS, MMS, conference calls, long-distance call tracking. Frequency: 1.8 GHz; Bandwidth: 364 kbps; Average speed: 40 kbps; Range: 50 miles.
3G (2001): Cheap data transmission, GPS, web browsing, SD video streaming. Frequency: 1.6-2 GHz; Bandwidth: 3 Mbps; Average speed: 300 kbps; Range: 35 miles.
4G (2009): HD video streaming, wearable devices, high-speed applications. Frequency: 2-8 GHz; Bandwidth: 100 Mbps; Average speed: 25 Mbps; Range: 10 miles.
5G (2019): Internet of Things, cloud storage, remote surgical robots. Frequency: 3-30 GHz; Bandwidth: 10 Gbps; Average speed: 150 Mbps; Range: 1,000 ft.

In India, both Airtel and Jio announced the introduction of 5G at the Indian Mobile Congress (IMC) held in October 2022 in New Delhi. Telecom providers have stated that they are investing around $19.5 billion in building 5G networks by 2025. Some of the use cases demonstrated by the telecom providers include immersive mobile 5G cloud gaming, SmartAgri for farmers, and Gurushala (a cloud-based collaborative knowledge exchange platform for teachers and students).
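To get a feel for what the average speeds above mean in practice, here is a small, hedged Python sketch that estimates how long a 50 MB file (an assumed example size) would take to download at each generation's average speed, ignoring real-world overheads such as latency and signal quality.

```python
# Rough download-time estimate: time = file size / average speed.
# Speeds are the average speeds from the comparison above, in kilobits per second.
avg_speed_kbps = {
    "2G": 40,        # 40 kbps
    "3G": 300,       # 300 kbps
    "4G": 25_000,    # 25 Mbps
    "5G": 150_000,   # 150 Mbps
}

file_size_megabytes = 50                              # an assumed example file size
file_size_kilobits = file_size_megabytes * 8 * 1000   # 1 MB is roughly 8,000 kilobits

for generation, speed in avg_speed_kbps.items():
    seconds = file_size_kilobits / speed
    print(f"{generation}: about {seconds:,.0f} seconds")
```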
The main advantages of the 5G network are greater transmission speeds, low latency, greater capacity for remote execution, a greater number of connected devices, and the possibility of implementing virtual networks, among others. Simply put, it will work faster on cellular and other mobile devices.

TECH SPOTLIGHT
Cisco Systems, Inc. is an American multinational technology company headquartered in San Jose, California. Cisco develops, manufactures and sells networking hardware, telecommunications equipment and other high-technology services and products. Through its numerous acquired subsidiaries, such as OpenDNS, WebEx, Jabber and Jasper, Cisco Systems specializes in specific tech markets, such as the Internet of Things (IoT), domain security and energy management. Currently the largest networking company in the world, Cisco Systems was founded in December 1984 by Leonard Bosack and Sandy Lerner, two Stanford University computer scientists, who pioneered the concept of a local area network (LAN) being used to connect geographically disparate computers over a multiprotocol router system. The name "Cisco" is derived from San Francisco, the city near which the company was founded.

Computer security, also known as Cyber security or IT security, is the protection of computer systems from theft of and damage to their hardware, software, or information, and from the disruption or misdirection of the services they provide. The field is of growing importance due to the increasing reliance on computer systems and the Internet, on wireless networks such as Bluetooth and Wi-Fi, and on the growing number of "smart" devices, including smartphones, televisions, and sensors, that form part of the Internet of Things.

Several kinds of threats can affect our hardware or software. Previously, malicious code often spread from computer to computer via infected floppy disks. Today, it spreads via the internet. A virus may attach itself to an existing application and change or destroy it. When the virus executes its creator's code, the results can be disastrous; in extreme cases, data that a user has saved may be permanently lost.

In everyday conversation, people use the terms virus and malware interchangeably. But strictly speaking, a Virus is a specific type of malware. The two other main types are Trojans, which masquerade as harmless applications to trick users into executing them, and Worms, which can reproduce and spread independently of any other application. The distinguishing feature of a virus is that it needs to infect other programs to operate.

4.2 Computer systems when malware infects it

Malware, short for malicious software, is an umbrella term used to refer to a variety of forms of hostile or intrusive software, including computer Viruses, Spyware, Adware, Worms, Scareware, Trojans, Ransomware, and other malicious programs. Malware is closely associated with computer-related crimes, though some malicious programs may have been designed as practical jokes. Many early infectious programs, including the first Internet Worm, were written as experiments or pranks. Today, malware is used by both black hat hackers and governments to steal personal, financial, or business information.

The Morris Worm, or Internet worm, of November 2, 1988, was one of the first computer worms; it was launched from the computer systems of the Massachusetts Institute of Technology and distributed via the Internet.
It was written by a graduate student at Cornell University, Robert Tappan Morris, who said he "wanted to count how many machines were connected to the Internet". The incident gained significant media attention and resulted in the first felony conviction in the US under the 1986 Computer Fraud and Abuse Act. The Morris Worm infected about 10% of the computers connected to the Internet at the time: since roughly 60,000 computers were connected to ARPANET, the worm infected around 6,000 of them, and it took about 72 hours for all systems to function normally again.

Ransomware is a type of malware which blocks the owner's access to their data and demands a ransom to restore it. While some simple ransomware may lock the system without damaging any files, more advanced malware encrypts the victim's files, making them inaccessible, and demands a ransom payment to decrypt them. In most cases, the ransom is demanded in hard-to-trace cryptocurrencies such as Bitcoin, which makes it difficult to trace and prosecute the attackers. Ransomware attacks are carried out using a Trojan disguised as a legitimate file that the user is tricked into downloading or opening when it arrives as an email attachment. One of the most well-known examples is the WannaCry ransomware attack, a worldwide cyberattack in May 2017 by the WannaCry ransomware cryptoworm. This ransomware targeted computers running the Microsoft Windows operating system by encrypting data and demanding ransom payments in the Bitcoin cryptocurrency.

4.3 How Ransomware works

How can we stay safe on the Internet?
The growth of the internet has given rise to many important services accessible to everyone with a connection. One of these important services is digital communication. While this service allows communication with others through the internet, it also opens up the possibility of communication with malicious users. Here are some things all of us can do to stay safe while on the Internet:
Do not open an email or click on a link from someone you do not know. Doing so may allow hackers to take control of your computer.
Do not download anything without verifying with an adult first – downloading from unsecured sites can also download viruses which may corrupt stored files and harm your computer.
Never share your passwords with anyone, except your parents. Giving your password to your friends could lead to unwanted use of your account by them.
Use different passwords for all your accounts – if the same password is used for all accounts, and one of them is hacked, the same password can be used by the hacker to access all the others. (One way to create strong, distinct passwords is sketched after this section.)
Key rule to remember: if there is something that you would not do face to face, do not do it online. Also remember – online is still the real world, and just as one would not strike up a conversation with a stranger, the same principle applies online as well.
Keep in mind – once you have written something on social media, it cannot truly be deleted. If what you do or say is controversial, it can be copied many times and may come back to haunt you, even later in life when you apply to college, or even for a job.
Do not share any personal information about yourself – your name, contact information, or email address – with any stranger on the internet.

As essential as it is to browse the internet and get answers to our questions, it is imperative to stay safe. So, do exercise caution.
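As promised in the password tip above, here is a small, hedged Python sketch (an illustration, not a security product) that uses the standard `secrets` module to generate a strong, different password for each account; the length and the list of account names are just example values.

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    # Build a random password from letters, digits and a few symbols.
    # 'secrets' draws from a cryptographically strong random source,
    # unlike the ordinary 'random' module.
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a separate password for each account.
for account in ["email", "school portal", "gaming"]:
    print(account, "->", make_password())
```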
How many of the above do you follow to be safe on the internet?

MEMORY CHECK
What essential component does a computer require to connect to a network? Network card.
Which network connects computers in a local area? LAN.
What is the full form of WWW? World Wide Web.
Which network is the planned successor to the 4G networks? 5G.
What attaches itself to an application and can damage it? Virus.
What is used by both black hat hackers and governments to steal personal, financial, or business information? Malware.
Which malware demands ransom for blocked data access? Ransomware.
True or False: We can share our passwords with anyone. False.
Which generation of the Web is the 'read-only' Web? Web 1.0.
What is the central point connecting computers and other network devices? Router.

CHAPTER 5
AI - An Introduction
In this chapter, we embark on a journey to explore the fundamentals of AI, starting with an introduction to its concept and evolution. We explore the diverse applications of AI across various industries, revealing its transformative impact on technology and society. Looking ahead, we contemplate the future possibilities and ethical considerations surrounding AI. Finally, we conclude with a glossary to demystify key terms and concepts. Explore with us as we unravel the potential and implications of this groundbreaking field.

Artificial Intelligence (AI) is a fascinating field of computer science that aims to create machines capable of performing tasks that typically require human intelligence. These tasks include learning, reasoning, problem-solving, understanding natural language, and recognizing patterns. AI is becoming an integral part of our daily lives and is transforming various industries, from healthcare to entertainment. Let us address some common questions.

What is AI?
AI involves programming computers to mimic human thought processes and actions. There are different types of AI, including:
Narrow AI: Designed to perform a specific task, as in voice assistants like Siri and Alexa.
General AI: Aims to perform any intellectual task that a human can do. This type of AI is still in the experimental stages.

How and when did AI come about?
The first work on Artificial Intelligence (AI) can be traced back to the 1950s. In 1956, a group of scientists, including John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon, organized the Dartmouth Conference. This conference is considered the official birth of AI as a field of study. During the Dartmouth Conference, the term "Artificial Intelligence" was coined, and the attendees discussed the possibility of creating machines that could think and learn like humans. They laid the foundation for future AI research by exploring how computers could be programmed to solve problems, understand language, and learn from experience.

One of the earliest AI programs developed was the "Logic Theorist," created by Allen Newell and Herbert A. Simon in 1955. This program was designed to mimic human problem-solving skills and was capable of proving mathematical theorems. The Logic Theorist is often considered the first AI program and marked a significant milestone in the development of artificial intelligence.
These early efforts paved the way for the rapid advancements in AI that we see today, influencing everything from voice assistants and self-driving cars to healthcare and entertainment.

How has AI evolved over the years?
Artificial Intelligence (AI) has come a long way since its inception in the mid-20th century. From simple problem-solving programs to complex systems capable of learning and making decisions, AI's evolution is marked by significant milestones and advancements, as well as periods of both progress and setbacks. From its early beginnings to the present day, AI has grown to become a transformative technology with far-reaching implications. Understanding this evolution helps us appreciate the potential and challenges of AI as we look toward the future of this exciting field. Let's explore the key stages in the development of AI and its impact on our world.

The Birth of AI (1950s-1960s)
The term "Artificial Intelligence" was coined at the Dartmouth Conference, held in 1956 and organized by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon. This event is considered the official birth of AI as a field of study. Among the earliest AI programs was the Logic Theorist, created by Allen Newell and Herbert A. Simon in 1955; it could prove mathematical theorems and is often considered the first AI program. The General Problem Solver, also developed by Newell and Simon, in 1957, was designed to mimic human problem-solving.

The Golden Years (1960s-1970s)
AI research expanded to include natural language processing, robotics, and machine learning. ELIZA, created by Joseph Weizenbaum in 1966, was an early natural language processing program that could simulate a conversation with a human. Shakey the Robot, developed by SRI International starting in 1966, was the first general-purpose mobile robot able to perceive and reason about its environment.

The AI Winter (1970s-1980s)
Despite early successes, AI faced significant challenges, including limited computing power and unrealistic expectations. Funding and interest in AI research declined, leading to what is known as the "AI Winter." Despite the downturn, expert systems, which mimic the decision-making abilities of human experts, gained traction in the 1980s. These systems were used in various industries for tasks such as medical diagnosis and financial analysis.

The Revival and Growth (1990s-2000s)
Advances in computing power, the availability of large datasets, and new algorithms led to a resurgence in AI research. In 1997, IBM's Deep Blue became the first computer to defeat a reigning world chess champion, Garry Kasparov, in a match. AI began to be integrated into everyday applications, such as the recommendation systems used by Amazon and Netflix.

The Age of Machine Learning and Deep Learning (2010s-Present)
Machine Learning (ML), a subset of AI that enables systems to learn from data, became the dominant approach. Algorithms like decision trees, support vector machines, and neural networks gained prominence. Deep Learning, a subset of ML, uses multi-layered neural networks to analyze complex patterns in large datasets. This approach has revolutionized fields such as image and speech recognition.
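To make "learning from data" concrete, here is a tiny, hedged Python sketch of a single artificial neuron (a perceptron), the simplest ancestor of the deep neural networks described above. Instead of being programmed with the rule, it learns the logical AND function from four labelled examples; the learning rate and the number of passes are arbitrary example values.

```python
# Training data: inputs (x1, x2) and the desired output of logical AND.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

weights = [0.0, 0.0]   # one weight per input
bias = 0.0
learning_rate = 0.1

def predict(x):
    # Weighted sum of the inputs plus the bias, passed through a step function.
    total = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 if total > 0 else 0

# Repeatedly adjust the weights whenever the neuron makes a mistake.
for _ in range(20):
    for x, target in examples:
        error = target - predict(x)
        weights[0] += learning_rate * error * x[0]
        weights[1] += learning_rate * error * x[1]
        bias += learning_rate * error

for x, target in examples:
    print(x, "->", predict(x), "(expected", target, ")")
```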
In 2016, AlphaGo, developed by DeepMind, defeated the world champion Go player, showcasing the power of deep learning and reinforcement learning.

So how does AI work?
AI systems rely on several key technologies. These are:
Machine Learning (ML): Machine Learning is a subset of AI that enables computers to learn from data and improve their performance over time without being explicitly programmed. Recommendation systems on platforms like Netflix and YouTube that suggest content based on our viewing history are examples of this.
Neural Networks: A type of machine learning model inspired by the human brain, consisting of layers of interconnected nodes (neurons) that process data. Image recognition systems that can identify objects and faces in photos are one example.
Natural Language Processing (NLP): This enables computers to understand, interpret, and respond to human language. Examples are chatbots and virtual assistants that can hold conversations and answer questions (a toy chatbot sketch appears at the end of this section).
Computer Vision: This allows computers to interpret and understand visual information from the world. An example is self-driving cars that use cameras and sensors to navigate and avoid obstacles.

So what are the applications of AI?
AI is used in a wide range of applications that impact our everyday lives, often without us realizing it. Some of these are mentioned below:
Healthcare: AI in Diagnostics: AI systems can analyze medical images, such as X-rays and MRIs, to help doctors diagnose diseases more accurately. AI can also tailor treatments based on a patient's genetic makeup and medical history.
Education: AI-powered platforms provide personalized learning experiences, adapting to each student's pace and understanding. AI can assist teachers by grading assignments and exams, freeing up time for more personalized instruction.
Entertainment: Streaming services use AI to suggest movies, shows, and music based on user preferences. AI creates more intelligent and responsive non-player characters (NPCs) in video games.
Transportation: Autonomous vehicles use AI to navigate roads, recognize traffic signals, and avoid obstacles. AI systems optimize traffic flow in cities, reducing congestion and improving safety.
Customer Service: AI-driven chatbots handle customer inquiries and provide support, available 24/7. AI can analyze customer feedback to gauge satisfaction and identify areas for improvement.

5.1 Applications of AI - Left: In Healthcare; Middle: In Education; Right: In Customer Service

So, is AI here to stay? What is its future?
AI continues to advance rapidly, with several exciting trends on the horizon:
Enhanced Learning Algorithms: AI systems will become better at learning from smaller amounts of data, making them more efficient and accessible.
Integration with IoT: AI will be integrated with the Internet of Things (IoT), enabling smarter homes, cities, and industries.
Ethical AI: There will be a greater focus on developing AI systems that are fair, transparent, and accountable, to ensure they benefit society as a whole.
AI in Space Exploration: AI will play a crucial role in space missions, from analyzing data to controlling autonomous rovers and spacecraft.
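As a playful follow-up to the chatbots mentioned under Natural Language Processing above, here is a toy, hedged Python sketch of a rule-based chatbot. It only matches keywords, so it is nowhere near a real NLP system, but it shows the basic loop of reading text and choosing a reply; the rules and sample questions are made-up examples.

```python
# A toy, rule-based "chatbot": naive keyword matching, not real NLP.
rules = {
    "hello": "Hello! How can I help you today?",
    "name": "I am a small demo bot written in a few lines of Python.",
    "ai": "AI is the science of making machines do tasks that need human-like intelligence.",
    "bye": "Goodbye! Stay curious.",
}

def reply(message: str) -> str:
    text = message.lower()
    for keyword, answer in rules.items():
        if keyword in text:           # pick the first rule whose keyword appears
            return answer
    return "Sorry, I do not understand that yet."

# A short scripted conversation (replace with input() for an interactive chat).
for question in ["Hello there!", "What is AI?", "Bye!"]:
    print("You:", question)
    print("Bot:", reply(question))
```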
Ethical Considerations and Challenges:
While AI offers numerous benefits, it also raises important ethical and societal issues:
Privacy: The use of AI involves collecting and analyzing large amounts of data, raising concerns about how personal information is used and protected.
Bias: AI systems can sometimes reflect or amplify biases present in their training data, leading to unfair outcomes.
Job Displacement: Automation and AI may replace certain jobs, creating the need for new skills and career paths.

Companies like Tesla and Waymo are using AI to develop self-driving cars that can navigate roads, recognize traffic signs, and avoid obstacles without human intervention.
AI powers voice assistants like Siri, Alexa, and Google Assistant, which can answer questions, set reminders, and control smart home devices with just our voice.
The AI character in the movie "Iron Man," called JARVIS, inspired real-life AI assistants that help with daily tasks, although JARVIS is still more advanced than current technology.
Google Translate uses AI to translate text and speech between different languages, making it easier for people from around the world to communicate.
NASA uses AI to help rovers navigate the surface of Mars, making decisions about where to go and what to study, even when they are millions of kilometers away from Earth.
An AI-created painting called "Portrait of Edmond de Belamy" was sold at auction for $432,500, showing that AI can create valuable and recognized pieces of art.
AI-powered cameras and drones are used to monitor endangered species and track poachers in real time.
