Specific Issues in Science, Technology, and Society: The Information Age

Lesson 6.1: The Information Age

What is the Information Age?
"Access to and the control of information" is the defining characteristic of this current era in human civilization.
a.k.a. the "Computer Age," "Digital Age," and "New Media Age"
Associated with personal computers and the internet.

Claude E. Shannon
An American mathematician, known as the "Father of Information Theory."
Published a landmark paper proposing that information can be quantitatively encoded as a series of ones (1) and zeroes (0), known as binary code.
He showed how all information media (e.g., telephone signals, radio waves, TV) could be transmitted without error using this single framework (see the sketch at the end of this section).

Milestones of the Information Age
The 1970s development of the internet by the U.S. Department of Defense, and the adoption of personal computers a decade later, paved the way for the Information, or Digital, Revolution.
– Digital refers to data or information represented in a format that can be processed by computers and electronic devices.
– From analog to digital technologies.
The development of fiber optic cables and faster microprocessors accelerated the transmission and processing of information.

World Wide Web
Developed by Tim Berners-Lee in 1989.
Used initially as an electronic billboard for companies' products and services; it became an interactive consumer exchange for goods and information.

Electronic mail
Introduced by Ray Tomlinson in 1971 as part of ARPANET.
Permitted near-instant exchange of information.
Widely adopted as the primary platform for workplace and personal communications.

Digitization of information has had a profound impact on traditional media businesses:
– book publishing
– the music industry
– major TV and cable networks
As information is increasingly described in digital form, businesses across many industries have sharpened their focus on how to capitalize on the Information Age:
– online selling (vs. physical stores)
– use of social media platforms in advocacy
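To make Shannon's insight concrete, here is a minimal Python sketch (not from the slides) showing how any text can be encoded as a series of ones and zeroes and recovered without loss:

```python
# Shannon's insight in miniature: any message can be represented as
# a sequence of ones and zeroes (binary code) and recovered exactly.
def to_bits(text: str) -> str:
    """Encode text as a string of 0s and 1s (8 bits per byte)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Decode a string of 0s and 1s back into text."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

message = "information age"
encoded = to_bits(message)
print(encoded[:24])                   # 011010010110111001100110
assert from_bits(encoded) == message  # recovered without error
```

The same bit stream could stand for a telephone signal, a radio broadcast, or a TV picture; that uniformity is what made a single transmission framework possible.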
History of Technology Timeline

1. 3.3 million years ago: The first tools
The history of technology begins even before the beginning of our own species. Sharp flakes of stone used as knives and larger unshaped stones used as hammers and anvils have been uncovered at Lake Turkana in Kenya. These tools were likely used by ancestors such as Australopithecus.

2. 1 million years ago: Fire
Fire was probably invented or discovered by an ancestor of Homo sapiens. Evidence of burnt material can be found in caves used by Homo erectus (1-1.5 million years ago).

3. 20,000 to 15,000 years ago: Neolithic Revolution
Humans moved from being hunter-gatherers to being agriculturists. People came together in larger groups. Clay was used for pottery and bricks, clothing began to be made of woven fabrics, and the wheel was also likely invented at this time.

4. 6000 BCE: Irrigation
The first irrigation systems arose roughly simultaneously in the civilizations of the Tigris-Euphrates river valley in Mesopotamia and the Nile River valley in Egypt. Since irrigation requires an extensive amount of work, it shows a high level of social organization.

5. 4000 BCE: Sailing
The first sailing ships were used on the Nile River. These ships also had oars for navigation.

6. 1200 BCE: Iron
Production of iron became widespread as it supplanted bronze. Iron was much more abundant than copper and tin (the metals that make up bronze), and thus put metal tools into more hands than ever before.

7. 850 CE: Gunpowder
Invented by Chinese alchemists as a result of the search for life-extending elixirs. It was used to propel rockets attached to arrows. Knowledge of gunpowder spread to Europe in the 13th century.

8. 950 CE: Windmill
Wind was first used to operate a mill about 5,000 years after the first sailing ships. The first windmills were in Persia: horizontal windmills in which the blades were set on a vertical shaft. European windmills were of the vertical type. The windmill may have been invented independently in Persia and in Europe.

9. 1044 CE: Compass
The first definitive mention of a magnetic compass dates from a Chinese book finished in 1044. It describes how soldiers found their way by using a fish-shaped piece of magnetized iron floating in a bowl of water when the sky was too cloudy to see the stars.

10. 1250-1300: Mechanical clock
Hourglasses and water clocks had been around for centuries, but the first mechanical clocks began to appear in Europe toward the end of the 13th century and were used in cathedrals to mark the time when services would be held.

11. 1455: Printing press
Johannes Gutenberg completed the printing of the Bible, the first book printed in the West using movable type. Gutenberg's printing press led to an information explosion in Europe.

12. 1765: Steam engine
Start of the Industrial Revolution: significant technological, economic, and social change, transitioning from agrarian economies to industrialized and urbanized ones. James Watt improved the Newcomen steam engine by adding a condenser that turned the steam back into liquid water.
– The condenser was separated from the cylinder that moved the piston, making the engine more efficient.
The steam engine became one of the most important inventions of the Industrial Revolution.

13. 1804: Railways
Richard Trevithick improved James Watt's steam engine and used it for transport. He built the first railway locomotive at an ironworks in Wales.

14. 1807: Steamboat
Robert Fulton put the steam engine on water. His steamboat (called the Clermont) took 32 hours to go up the Hudson River from New York City to Albany. Sailing ships took four days.

15. 1826-27: Photography
Nicéphore Niépce became interested in using a light-sensitive solution to make copies of lithographs onto glass, zinc, and a pewter plate. He used his solution to make a copy of an image in a camera obscura (a room or box with a small hole in one end through which an image of the outside is projected). Years later, he made an 8-hour-long exposure of the courtyard of his house, the first known photograph.

16. 1831: Reaper
For thousands of years, harvesting crops was very labor-intensive. Cyrus McCormick invented the mechanical reaper. The earliest reaper had some mechanical problems, but later versions spread throughout the world.

17. 1844: Telegraph
Samuel Morse became interested in the possibility of an electric telegraph in the 1830s and patented a prototype in 1837; it encoded messages in the form of dots and dashes. In 1844, he sent the first message over the first long-distance telegraph line, which stretched between Washington, D.C., and Baltimore. The message: "What hath God wrought."
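Morse's dots and dashes are themselves an early information code. A minimal Python sketch (the lookup table below is a small excerpt of International Morse Code, not a full implementation):

```python
# A small excerpt of International Morse Code: each letter becomes a
# sequence of dots and dashes, with spaces separating the letters.
MORSE = {"A": ".-", "H": "....", "T": "-", "W": ".--"}

def to_morse(text: str) -> str:
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(to_morse("what"))  # .-- .... .- -
```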
18. 1876: Telephone
Once it was possible to send information through a wire, the next step was actual voice communication. Alexander Graham Bell made the first telephone call on March 10, 1876, when he asked his assistant Thomas Watson to come to him: "Mr. Watson—come here—I want to see you."

19. 1876: Internal-combustion engine (ICE)
Nikolaus Otto built an engine that used the burning of fuel inside the engine to move a piston. It was later used to power automobiles.

20. 1879: Electric light
Thomas Edison invented a carbon-filament light bulb that burned for 13½ hours. Edison and others in his laboratory were also working on an electrical power distribution system to light homes and businesses. In 1882, the Edison Electric Illuminating Company opened the first power plant.

21. 1885: Automobile
The ICE improved, becoming smaller and more efficient. Karl Benz used a one-cylinder gas-powered engine to power the first modern automobile, a three-wheeled car that he drove around a track. However, the automobile did not make a commercial splash until 1888, when his wife, Bertha, exasperated with Karl's slow methodical pace, took an automobile without his knowledge on a 64-mile trip to see her mother.

22. 1901: Radio
Guglielmo Marconi had been experimenting with radio since 1894 and was sending transmissions over longer and longer distances. In 1901, his reported transmission of the Morse code letter S across the Atlantic from Cornwall to Newfoundland excited the world.

23. 1903: Airplane
Wilbur and Orville Wright made the first airplane flights (120 to 852 feet) near Kitty Hawk, North Carolina.

24. 1926: Rocketry
Robert Goddard was inspired by H.G. Wells's The War of the Worlds and the possibilities of space travel. He achieved the first test flight of a liquid-fueled rocket from his aunt's farm in Auburn, Massachusetts. The rocket flew 12.5 meters (41 feet) in the air.

25. 1927: Television
After the radio, the transmission of an image was the next logical step. Early television used a mechanical disk to scan an image. Philo T. Farnsworth was convinced that a mechanical system would not be able to scan and assemble images multiple times a second; only an electronic system could do that. He worked out a plan for such a system and successfully transmitted the first all-electronic TV image: a single horizontal line.

26. 1937: Computer
Charles Babbage, the "Father of the Computer," designed the first mechanical computer. John Atanasoff designed the first electronic digital computer.
– It would use binary numbers (base 2, in which all numbers are expressed with the digits 0 and 1), and its data would be stored in capacitors.
– His contributions are significant in the history of computing, particularly for introducing concepts like binary arithmetic and electronic computation.
He and Clifford Berry began building the Atanasoff-Berry Computer, completed in 1942.
– It was designed to solve systems of linear equations and used electronic switches (vacuum tubes) for computation, making it distinct from earlier mechanical computers.
ENIAC (1945), by John W. Mauchly and J. Presper Eckert, is often recognized as the first general-purpose electronic computer due to its broader capabilities and successful operation.
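The two ideas credited to Atanasoff above, base-2 numbers and solving systems of linear equations, can be illustrated in a few lines of Python (a sketch of the math only, not the ABC's actual circuitry):

```python
# Base 2: every number is expressed using only the digits 0 and 1.
n = 42
print(f"{n} in binary is {n:b}")  # 42 in binary is 101010

# A tiny system of linear equations, the kind of problem the
# Atanasoff-Berry Computer was built to solve:
#   2x + y = 5
#   x  - y = 1
a, b, c = 2.0, 1.0, 5.0    # coefficients of 2x + 1y = 5
d, e, f = 1.0, -1.0, 1.0   # coefficients of 1x - 1y = 1

m = d / a                  # eliminate x from the second equation
e2, f2 = e - m * b, f - m * c
y = f2 / e2                # solve for y, then back-substitute for x
x = (c - b * y) / a
print(x, y)                # 2.0 1.0
```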
27. 1942: Nuclear power
The Manhattan Project built the first atomic bomb and came to understand nuclear reactions in detail. Enrico Fermi used uranium to produce the first self-sustaining chain reaction.

28. 1947: Transistor
John Bardeen, Walter Brattain, and William Shockley gave the first public demonstration of the transistor, an electrical component that could control, amplify, and generate current. The transistor was much smaller and used less power than vacuum tubes and ushered in an era of cheap, small electronic devices.

29. 1957: Spaceflight
The USSR launched the first artificial satellite (Sputnik 1), starting the space race between the USSR and the USA and opening up a new front in the Cold War.

30. 1974: Personal computers
The first computers that emerged after the war were gigantic machines. Advances in technology (putting many transistors on a semiconductor chip) made computers both smaller and more powerful; they became small enough for home and office use. First personal computers:
– Kenbak-1 (1971)
– Altair 8800 (1974)
– IBM (1978)

31. 1974: Internet
Vinton Cerf and Robert Kahn produced TCP/IP (Transmission Control Protocol/Internet Protocol), which describes how data can be broken down into smaller pieces (called packets) and how these packets can be transmitted to the right destination. TCP/IP became the basis for how data is transmitted over the Internet.
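The packet idea behind TCP/IP can be sketched in a few lines: data is split into numbered pieces that may travel (and arrive) out of order, then be reassembled at the destination. This toy model is illustrative only; real TCP/IP headers carry far more than a sequence number:

```python
# Toy model of packet switching: split data into numbered packets,
# let them arrive out of order, and reassemble by sequence number.
def packetize(data: bytes, size: int = 4) -> list[tuple[int, bytes]]:
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    return b"".join(chunk for _, chunk in sorted(packets))

packets = packetize(b"hello, internet")
packets.reverse()                      # simulate out-of-order arrival
assert reassemble(packets) == b"hello, internet"
```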
32. 2012: CRISPR
Jennifer Doudna and Emmanuelle Charpentier developed CRISPR-Cas9, a method for editing genes, that is, making changes to DNA sequences. Gene editing has the potential to treat many diseases but also opens up the ethical gray area of creating designer humans.

33. 2017: Artificial intelligence
John McCarthy, Alan Turing, Marvin Minsky, Allen Newell, and Herbert A. Simon are considered the founding fathers of AI. AI development started in the 1940s and 1950s, when the early computing and mathematical theories were established.
– Combination of machine learning and deep learning.
The AlphaGo artificial intelligence program plays the Go board game and was announced as the world's best Go player. Go is a game with very simple rules but many possible positions. Through machine learning, AlphaGo became better at the game than any human.

Lesson 6.3: Artificial Intelligence

What is Artificial Intelligence?
A wide-ranging branch of computer science concerned with building smart machines capable of performing tasks that typically require human intelligence.
It is an interdisciplinary science with multiple approaches, but advancements in machine learning and deep learning are creating a paradigm shift in virtually every sector of the tech industry.
Refers to the simulation of human intelligence in machines that are programmed to think and learn like humans.

How Does Artificial Intelligence Work?
Norvig and Russell explore four different approaches that have historically defined the field of AI:
1. Thinking humanly (thought processes and reasoning)
2. Thinking rationally (thought processes and reasoning)
3. Acting humanly (behavior)
4. Acting rationally (behavior)
They focus particularly on rational agents that act to achieve the best outcome, noting that "all the skills needed for the Turing Test also allow an agent to act rationally" (Russell and Norvig 4).

How is AI used?
Broad categories of AI

1. Narrow AI
Sometimes referred to as "Weak AI," it operates within a limited context and is a simulation of human intelligence, often focused on performing a single task extremely well. These machines may seem intelligent, but they are operating under far more constraints and limitations than even the most basic human intelligence.

Examples of Narrow AI:
1. Google search
2. Image recognition software
3. Siri, Alexa, and other personal assistants
4. Self-driving cars
Much of Narrow AI is powered by breakthroughs in machine learning and deep learning.

2. Artificial General Intelligence (AGI)
Sometimes referred to as "Strong AI," this is the kind of AI we see in the movies, like the robots from Westworld or Data from Star Trek: The Next Generation. It is a machine with general intelligence that can apply that intelligence to solve any problem.

Examples of Artificial Intelligence:
1. Smart assistants (like Siri and Alexa)
2. Disease mapping and prediction tools
3. Manufacturing and drone robots
4. Optimized, personalized healthcare treatment recommendations
5. Conversational bots for marketing and customer service
6. Robo-advisors for stock trading
7. Spam filters on email
8. Social media monitoring tools for dangerous content or false news
9. Song or TV show recommendations from Spotify and Netflix

Artificial General Intelligence
The creation of a machine with human-level intelligence that can be applied to any task is difficult for AI researchers. AGI has long been the muse of dystopian science fiction, in which super-intelligent robots overrun humanity, but experts agree it's not something we need to worry about anytime soon.

Intelligent robots and artificial beings first appeared in ancient Greek myths. While the roots are long and deep, the history of artificial intelligence as we think of it today spans less than a century.

History of AI

1943
Warren McCulloch and Walter Pitts publish "A Logical Calculus of the Ideas Immanent in Nervous Activity." The paper proposed the first mathematical model for building a neural network.

1949
Donald Hebb proposes the theory that neural pathways are created from experiences and that connections between neurons become stronger the more frequently they're used (The Organization of Behavior: A Neuropsychological Theory). Hebbian learning continues to be an important model in AI (a sketch of the rule follows this section).

1950
Alan Turing publishes "Computing Machinery and Intelligence," proposing the Turing Test, a method for determining if a machine is intelligent.
Marvin Minsky and Dean Edmonds build SNARC, the first neural network computer.
Claude Shannon publishes "Programming a Computer for Playing Chess."

1952
Arthur Samuel develops a self-learning program to play checkers.

1954
The Georgetown-IBM machine translation experiment automatically translates 60 carefully selected Russian sentences into English.

1956
The phrase "artificial intelligence" is coined at the Dartmouth Summer Research Project on Artificial Intelligence, which defined the scope and goals of AI and is widely considered the founding event of AI as we know it today.
Allen Newell and Herbert Simon demonstrate Logic Theorist (LT), the first reasoning program.

1958
John McCarthy develops the AI programming language Lisp and publishes the paper "Programs with Common Sense," which proposed the hypothetical Advice Taker, a complete AI system with the ability to learn from experience as effectively as humans do.

1959
Allen Newell, Herbert Simon, and J.C. Shaw develop the General Problem Solver, a program designed to imitate human problem-solving.
Herbert Gelernter develops the Geometry Theorem Prover program.
Arthur Samuel coins the term "machine learning" while at IBM.
John McCarthy and Marvin Minsky found the MIT Artificial Intelligence Project.
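Hebb's rule is often summarized as "neurons that fire together wire together" and written as delta_w = eta * x * y: a connection strengthens in proportion to the joint activity of the neurons it links. A minimal sketch (the learning rate and activity values are invented for illustration):

```python
# Hebbian learning: the weight between two neurons grows in
# proportion to their joint activity (delta_w = eta * x * y).
eta = 0.1                              # learning rate (illustrative)
w = 0.0                                # connection strength
activity = [(1.0, 1.0), (1.0, 1.0), (0.0, 1.0), (1.0, 1.0)]

for x, y in activity:                  # pre- and post-synaptic activity
    w += eta * x * y                   # co-active pairs strengthen the link
    print(f"x={x}, y={y} -> w={w:.1f}")
# The more often the two neurons fire together, the stronger the
# connection becomes, as Hebb proposed.
```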
1963
John McCarthy starts the AI Lab at Stanford.

1966
The Automatic Language Processing Advisory Committee (ALPAC) report by the U.S. government details the lack of progress in machine translation research, a major Cold War initiative with the promise of automatic and instantaneous translation of Russian. The ALPAC report leads to the cancellation of all government-funded MT projects.

1969
The first successful expert systems, DENDRAL (a program for inferring chemical structures) and MYCIN (designed to diagnose blood infections), are created at Stanford.

1972
The logic programming language PROLOG is created.

1973
The "Lighthill Report," detailing the disappointments in AI research, is released by the British government and leads to severe cuts in funding for artificial intelligence projects.

1974-1980
Frustration with the progress of AI development leads to major DARPA cutbacks in academic grants. Combined with the earlier ALPAC report and the previous year's Lighthill Report, artificial intelligence funding dries up and research stalls. This period is known as the "First AI Winter."

1980
Digital Equipment Corporation develops R1 (also known as XCON), the first successful commercial expert system. Designed to configure orders for new computer systems, R1 kicks off an investment boom in expert systems that will last for much of the decade, effectively ending the first AI Winter.

1982
Japan's Ministry of International Trade and Industry launches the ambitious Fifth Generation Computer Systems (FGCS) project. The goal of FGCS is to develop supercomputer-like performance and a platform for AI development.

1983
In response to Japan's FGCS, the U.S. government launches the Strategic Computing Initiative to provide DARPA-funded research in advanced computing and artificial intelligence.

1985
Companies are spending more than a billion dollars a year on expert systems, and an entire industry known as the Lisp machine market springs up to support them. Companies like Symbolics and Lisp Machines Inc. build specialized computers to run the AI programming language Lisp. (A toy sketch of a rule-based expert system follows.)
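Expert systems like MYCIN and R1/XCON encoded specialist knowledge as if-then rules applied by an inference engine. A minimal forward-chaining sketch (the rules and facts are invented for illustration, not MYCIN's actual rule base):

```python
# Toy forward-chaining expert system: fire if-then rules against
# known facts until no new conclusions can be drawn.
rules = [  # (required facts, conclusion) - invented examples
    ({"fever", "positive culture"}, "suspect bacterial infection"),
    ({"suspect bacterial infection"}, "recommend antibiotics"),
]
facts = {"fever", "positive culture"}

changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)      # rule fires, adding a new fact
            changed = True

print(facts)  # both conclusions have been derived
```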
1987-1993
As computing technology improved, cheaper alternatives emerged, and the Lisp machine market collapsed in 1987, ushering in the "Second AI Winter." During this period, expert systems proved too expensive to maintain and update, eventually falling out of favor. Japan terminates the FGCS project in 1992, citing failure to meet the ambitious goals outlined a decade earlier. DARPA ends the Strategic Computing Initiative in 1993 after spending nearly $1 billion and falling far short of expectations.

1991
U.S. forces deploy DART, an automated logistics planning and scheduling tool, during the Gulf War.

1997
IBM's Deep Blue beats world chess champion Garry Kasparov.

2005
STANLEY, a self-driving car, wins the DARPA Grand Challenge. The U.S. military begins investing in autonomous robots like Boston Dynamics' "BigDog" and iRobot's "PackBot."

2008
Google makes breakthroughs in speech recognition and introduces the feature in its iPhone app.

2011
IBM's Watson trounces the competition on Jeopardy!

2012
Andrew Ng, founder of the Google Brain Deep Learning project, feeds a neural network using deep learning algorithms 10 million YouTube videos as a training set. The neural network learned to recognize a cat without being told what a cat is, ushering in a breakthrough era for neural networks and deep learning funding.

2014
Google's self-driving car becomes the first to pass a U.S. state driving test.

2016
Google DeepMind's AlphaGo defeats world champion Go player Lee Sedol. The complexity of the ancient Chinese game was seen as a major hurdle to clear in AI.

Artificial Intelligence, Machine Learning, and Deep Learning

Artificial Intelligence (AI)
Definition: the simulation of human intelligence in machines that are programmed to think and learn like humans.
Scope: a wide range of techniques and technologies:
– rule-based systems
– natural language processing
– robotics
Goal: create systems that can perform tasks that typically require human intelligence:
– reasoning
– problem-solving
– understanding natural language

Machine Learning (ML)
Definition: a subset of AI that focuses on the development of algorithms that allow computers to learn from and make predictions or decisions based on data.
Methods: includes supervised learning (training on labeled data), unsupervised learning (finding patterns in unlabeled data), and reinforcement learning (learning through feedback).
Goal: enable machines to improve their performance on a task over time without being explicitly programmed for each specific task (see the sketch at the end of this section).

Deep Learning (DL)
Definition: a specialized subset of ML that uses neural networks with many layers (hence "deep") to model complex patterns in large amounts of data.
Architecture: involves architectures such as convolutional neural networks (CNNs) for image data and recurrent neural networks (RNNs) for sequential data, among others.
Goal: automatically discover intricate structures in data, which can lead to breakthroughs in areas like image and speech recognition.

Summary
AI is the broad concept of machines simulating human intelligence.
ML is a subset of AI that focuses on learning from data.
DL is a further specialization within ML that uses deep neural networks to analyze large datasets.
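To ground the ML definition above, here is supervised learning in miniature: a program that is never told the rule y = 2x but learns it from labeled examples by gradient descent (pure Python; the data and learning rate are invented for illustration):

```python
# Supervised learning in miniature: learn y ~ w * x from labeled
# (x, y) examples by gradient descent on the squared error.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # invented labeled data
w, lr = 0.0, 0.05                            # weight and learning rate

for epoch in range(200):
    for x, y in data:
        pred = w * x                         # current prediction
        w -= lr * 2 * (pred - y) * x         # step down the error gradient

print(round(w, 2))  # close to 2.0: the pattern was learned from data
```

Deep learning stacks many such learned weights into multi-layer networks, which is what lets architectures like CNNs and RNNs discover far richer structure than this one-parameter example.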