

EVOLUTION OF COMPUTING

The Early Years
The first counting device was the abacus, originally from Asia. It worked on a place-value notion, meaning that the place of a bead or rock on the apparatus determined how much it was worth. [Figures: Chinese abacus, Roman abacus, Russian abacus.]

Napier's Bones (1600s)
Napier's bones is a manually operated calculating device created by John Napier of Merchiston for calculating products and quotients of numbers.

Pascaline (1642)
Blaise Pascal invented a mechanical calculator called the Pascaline. This calculating machine could add and subtract two numbers directly, and could multiply and divide by repetition.

Difference Engine (1812)
Charles Babbage, the "father of the computer", designed the difference engine, a machine that would be steam-powered, fully automatic, and commanded by a fixed instruction program.

Ada Lovelace (1840s)
Ada Lovelace, the world's first computer programmer, provided the first algorithm intended to be processed by Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. She suggested that a binary system should be used for storage rather than a decimal system.

Boolean Logic (1850s)
George Boole developed Boolean logic, which would later be used in the design of computer circuitry. [Figure: Venn diagrams for conjunction, disjunction, and complement.]

Hollerith's Tabulator (1890s)
Dr. Herman Hollerith introduced the first electromechanical punched-card data-processing machine, which was used to compile information for the 1890 U.S. census. Hollerith's tabulator became so successful that he started his own business to market it. His company would eventually become International Business Machines (IBM). [Figure: Hollerith card puncher used by the United States Census Bureau.]

Vacuum Tube (1904)
The first generation of computers is characterized by the use of vacuum tubes. The vacuum tube was developed in 1904 by the British engineer John Ambrose Fleming.
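Boole's three basic operations described above (conjunction, disjunction, and complement) are the same operations later realized as the AND, OR, and NOT gates of computer circuitry. A minimal sketch in Python; the function names are ours, chosen for illustration:

```python
# Boole's three basic operations as Python functions.
# These correspond to the AND, OR, and NOT gates used in circuitry.

def conjunction(a, b):
    """AND: true only when both inputs are true."""
    return a and b

def disjunction(a, b):
    """OR: true when at least one input is true."""
    return a or b

def complement(a):
    """NOT: inverts its input."""
    return not a

# Print the truth table that the Venn diagrams depict.
for a in (False, True):
    for b in (False, True):
        print(a, b, conjunction(a, b), disjunction(a, b), complement(a))
```

Every digital circuit, from Hollerith's successors to modern processors, is ultimately built from combinations of these three operations.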
A vacuum tube is an electronic device used to control the flow of electric current in a vacuum. It was used in CRT (cathode-ray tube) televisions, radios, etc.

FIRST GENERATION OF COMPUTER: Vacuum Tubes (1940-1956)

ADVANTAGES:
- These computers were designed using vacuum tubes.
- Computers of this generation had a simple architecture.
- They could calculate data in milliseconds.
- They were used for scientific purposes.

DISADVANTAGES:
- The computers were very costly, and very expensive for commercial purposes.
- They were very large in size and took up a lot of space and electricity.
- The speed of these computers was very slow.
- These computers heated up a lot, so cooling was needed to operate them.

Turing Machine (1936)
British mathematician Alan Turing described a hypothetical device, the Turing machine, designed to perform logical operations by reading and writing symbols.

Harvard Mark I (1944)
Howard Aiken, in collaboration with engineers from IBM, constructed a large automatic digital sequence-controlled computer called the Harvard Mark I. This computer could handle all four arithmetic operations and had special built-in programs for logarithms and trigonometric functions.

ENIAC (1946)
The giant ENIAC (Electronic Numerical Integrator and Computer) was developed by John W. Mauchly and J. Presper Eckert, Jr. at the University of Pennsylvania. It used 18,000 vacuum tubes and punch-card input, weighed thirty tons, and occupied a thirty-by-fifty-foot space.

1951: Mauchly and Eckert built the UNIVAC I, the first computer designed and sold commercially, specifically for business data-processing applications.
1950s: Dr. Grace Murray Hopper developed the UNIVAC I compiler.
1957: The programming language FORTRAN (FORmula TRANslator) was designed by John Backus, an IBM engineer.
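The Turing machine mentioned above can be sketched as a tape, a read/write head, and a fixed table of rules. This hypothetical example (the state names and rules are invented for illustration) flips every bit on its tape and then halts:

```python
# A minimal Turing machine sketch: a tape, a head position, and a
# fixed transition table. This toy machine inverts a binary string
# (0 -> 1, 1 -> 0), illustrating the "read and write" behaviour.

def run_turing_machine(tape):
    """Flip every bit on the tape, then halt at the blank cell."""
    tape = list(tape) + ["_"]          # "_" marks the blank end of the tape
    # (state, symbol read) -> (symbol to write, head move, next state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_",  0, "halt"),
    }
    state, head = "flip", 0
    while state != "halt":
        write, move, state = rules[(state, tape[head])]
        tape[head] = write             # write the new symbol
        head += move                   # move the head
    return "".join(tape).rstrip("_")

print(run_turing_machine("1011"))  # -> 0100
```

Despite its simplicity, Turing showed that a machine of this form, given a suitable rule table, can carry out any computation a digital computer can.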
SECOND GENERATION OF COMPUTER: Transistors (1956-1963)

The second generation of computers is characterized by the use of transistors. The transistor was invented in 1947 at Bell Labs by three American physicists: John Bardeen, Walter Brattain, and William Shockley. A transistor is a semiconductor device used to amplify or switch electronic signals, or to open or close a circuit. Transistors became the key ingredient of all digital circuits, including computers.

ADVANTAGES:
- Smaller in size compared to first-generation computers.
- Used less electricity.
- Did not heat up as much as first-generation computers.
- Better speed.

DISADVANTAGES:
- Still costly and not versatile; still expensive for commercial purposes.
- Cooling was still needed.
- Punch cards were used for input.
- The computer was used only for particular purposes.

Integrated Circuit (1958)
Jack St. Clair Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor created the first integrated circuits, or chips: collections of tiny transistors.

THIRD GENERATION OF COMPUTER: Integrated Circuits (1964-1971)

ADVANTAGES:
- Smaller in size compared to previous generations.
- Consumed less energy and were more reliable.
- More versatile.
- Produced less heat compared to previous generations.
- Used for commercial as well as general purposes.
- Used a fan for heat discharge to prevent damage.
- Increased the storage capacity of computers.

DISADVANTAGES:
- A cooling system was still needed.
- Still very costly.
- Sophisticated technology was required to manufacture integrated circuits.
- The IC chips were not easy to maintain.
- Performance degraded when large applications were executed.

Second Generation (1959-1965)
1960s: Gene Amdahl designed the IBM System/360 series of mainframe (G) computers, the first general-purpose digital computers to use integrated circuits.
1961: Dr.
Hopper was instrumental in developing the COBOL (Common Business Oriented Language) programming language.
1963: Ken Olsen, founder of DEC, produced the PDP-1, the first minicomputer (G).
1965: The BASIC (Beginner's All-purpose Symbolic Instruction Code) programming language was developed by Dr. Thomas Kurtz and Dr. John Kemeny.

Third Generation (1965-1971)
1969: ARPANET, the forerunner of the Internet, started.
1970: Dr. Ted Hoff designed the famous Intel 4004 microprocessor (G) chip.
1971: Intel released the first microprocessor, a specialized integrated circuit able to process four bits of data at a time. It also included its own arithmetic logic unit. The same year, PASCAL, a structured programming language, was developed by Niklaus Wirth.

Fourth Generation (1971-Present)
1975: Ed Roberts, the "father of the microcomputer", designed the first microcomputer, the Altair 8800, which was produced by Micro Instrumentation and Telemetry Systems (MITS). The same year, two young hackers, William Gates and Paul Allen, approached MITS and promised to deliver a BASIC interpreter. They delivered, and from the sale Microsoft was born.
1976: Seymour Cray developed the Cray-1 supercomputer (G). Apple Computer, Inc. was founded by Steven Jobs and Stephen Wozniak.
1977: Jobs and Wozniak designed and built the Apple II microcomputer.
1980: IBM offered Bill Gates the opportunity to develop the operating system for its new IBM personal computer. Microsoft has achieved tremendous growth and success today owing to the development of MS-DOS. The Apple III was also released.
1981: The IBM PC was introduced with a 16-bit microprocessor.
1984: Apple introduced the Macintosh computer, which incorporated a unique graphical interface, making it easy to use. The same year, IBM released the 286-based PC/AT.
1986: Compaq released the DeskPro 386 computer, the first to use the 80386 microprocessor.
1987: IBM announced the OS/2 operating-system technology.
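The ability to "process four bits of data at a time", mentioned above for the first microprocessor, means every value the chip handles is truncated to 4 bits, so arithmetic wraps around at 16. A toy sketch (the function name is ours, not an actual 4004 instruction):

```python
# Toy illustration of 4-bit arithmetic: results are masked to the low
# 4 bits, so values wrap around at 16, which is why a 4-bit processor
# must chain operations together to work with larger numbers.

def add4(a, b):
    """Add two 4-bit values, keeping only the low 4 bits."""
    return (a + b) & 0b1111  # mask off everything above bit 3

print(add4(9, 5))   # 14 still fits in 4 bits
print(add4(9, 9))   # 18 wraps around to 2
```

By contrast, the 16-bit processor in the 1981 IBM PC could represent values up to 65,535 in a single operation.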
1988: A nondestructive worm was introduced into the Internet, bringing thousands of computers to a halt.
1989: The Intel 486 became the world's first 1,000,000-transistor microprocessor.
1993: The Energy Star program, endorsed by the Environmental Protection Agency (EPA), encouraged manufacturers to build computer equipment that met power-consumption guidelines; when the guidelines are met, equipment displays the Energy Star logo. The same year, several companies introduced computer systems using the Pentium microprocessor from Intel, which contains 3.1 million transistors and can perform 112 million instructions per second (MIPS).

FOURTH GENERATION OF COMPUTER: Microprocessor (1971-Present)

ADVANTAGES:
- Smaller in size and much more reliable compared to other generations of computers.
- The heating issue on these computers is almost negligible.
- No air conditioner is required for a fourth-generation computer.
- All types of higher-level languages can be used in this generation.
- Less expensive and suitable for general purposes.
- These computers are cheaper and also portable.

DISADVANTAGES:
- Fans are required to operate these kinds of computers.
- The latest technology is required for making microprocessors and complex software.
- These computers are highly sophisticated.

FIFTH GENERATION OF COMPUTER (Present and Beyond)

ADVANTAGES:
- These computers are smaller in size and more compatible.
- Very powerful and cheaper in cost.
- Used for general purposes.
- Higher technology is used.
- Development of true artificial intelligence.
- Advancement in parallel processing and superconductor technology.

DISADVANTAGES:
- They tend to be sophisticated and complex tools.
- They push the limit of transistor density.

Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are being used today.
The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation, molecular computing, and nanotechnology may radically change the face of computers in the years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization.

Cloud Computing

Cloud computing, or something being "in the cloud", is an expression used to describe a variety of computing concepts that involve a large number of computers connected through a real-time communication network such as the Internet. Cloud providers claim that computing costs are reduced. Device and location independence enable users to access systems through a web browser regardless of their location or device. Virtualization technology allows servers and storage devices to be shared, increasing utilization.
