The Psychology of Evolving Technology PDF

Document Details

Rhoda Okunev, 2023


Summary

This book explores the psychology of evolving technology, focusing on how social media, influencers, and new technologies are altering society. It details the history of computing, from early pioneers to modern advancements.

Full Transcript

The Psychology of Evolving Technology: How Social Media, Influencer Culture and New Technologies are Altering Society
Rhoda Okunev, Tamarac, FL, USA
ISBN-13 (pbk): 978-1-4842-8685-2
ISBN-13 (electronic): 978-1-4842-8686-9
https://doi.org/10.1007/978-1-4842-8686-9
Copyright © 2023 by Rhoda Okunev

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with every occurrence of a trademarked name, logo, or image, the names, logos, and images are used only in an editorial fashion and to the benefit of the trademark owner, with no intention of infringement of the trademark. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Managing Director, Apress Media LLC: Welmoed Spahr. Acquisitions Editor: Shiva Ramachandran. Development Editor: James Markham. Coordinating Editor: Jessica Vakili. Copy Editor: Kim Wimpsett.

Distributed to the book trade worldwide by Springer Science+Business Media New York, 1 New York Plaza, New York, NY 10004. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation. For information on translations, please e-mail [email protected]; for reprint, paperback, or audio rights, please e-mail [email protected]. Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales web page at www.apress.com/bulk-sales.

Printed on acid-free paper.

CHAPTER 1: History of Computing

The history of the personal computer and other technological advances is by no means a straight path or one person's idea. The current state of the art is the result of many advances. At times an invention was arrived at by chance or by mistake; some inventions were developed for wartime needs. The technological advances that affect so many parts of our lives today each have a story and an inventor behind them. This chapter takes a look at some of them.

To Begin With…

Ada Byron Lovelace, a mathematician from England and the daughter of the poet Lord Byron, wrote the first "computer program," a set of instructions for solving a complex math problem. When she was a teenager, she met the Cambridge mathematics professor Charles Babbage, and they carried on a long correspondence about mathematics and computing. In 1837 Babbage described his Analytical Engine, the first design for a general-purpose computer, which was to be programmed with punch cards. Lovelace recognized that a table of logarithms could be calculated and that the machine could be used as a calculating device, but she went further: although Babbage designed the machine, Lovelace envisioned one that could also process musical notes, letters, and images. She, in effect, imagined the modern-day computer. In her famous notes, Lovelace used the design of Babbage's machine to work out an algorithm for calculating the Bernoulli numbers, and that algorithm is believed to be the world's first computer program. At the time, in part because of a lack of funding, Babbage's Analytical Engine was never built.
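Lovelace's algorithm was written for Babbage's never-built engine, so no runnable version of it exists, but the quantity it targeted is easy to state in modern terms. The following Python sketch is purely illustrative, not Lovelace's method or notation: it computes the first few Bernoulli numbers, the values her program was designed to produce, using the standard recurrence (the function name is ours).

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions.

    Uses the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
    which yields B_1 = -1/2 in this convention.
    """
    b = [Fraction(1)]                        # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * b[j] for j in range(m))
        b.append(-s / (m + 1))               # solve the recurrence for B_m
    return b

if __name__ == "__main__":
    for i, value in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {value}")
```

Running the sketch prints B_0 through B_8 as exact fractions, for example B_2 = 1/6 and B_4 = -1/30.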
Alan Turing was an esteemed and renowned mathematician who received his PhD from Princeton. Turing is mostly known and remembered for his many ideas in artificial intelligence, cryptology, and computer science. During World War II, Turing helped break the Nazis' Enigma code in Britain, using cryptanalysis and the electromechanical "Bombe" machine. This saved the Allies in many conflicts, including, importantly, the Battle of the Atlantic, the longest continuous military campaign of World War II. Much of Turing's mathematical work remains secret because of his position in British intelligence.

Around 1948, Turing joined Max Newman in the Mathematics Department at the University of Manchester, where Newman had received a grant to build a new computing machine, the Manchester Mark I. It was one of the first electronic stored-program computers to be completed, going beyond ENIAC-era machines by holding its programs in memory, and it had many of the properties of the modern-day computer. After that, Turing was recruited by Ferranti Ltd. to help develop the Ferranti Mark 1, a machine that Ferranti would market commercially.

At the same time, Turing continued to work on his abstract mathematical ideas and on the question of whether computers were intelligent or could think, and in 1950 he published an article in Mind called "Computing Machinery and Intelligence" that introduced the Turing test. The Turing test examines whether a computer is intelligent, or able to think: according to Turing, if a computer can think like a human, then people will not be able to tell the computer's responses apart from a person's.

The modern-day CAPTCHA builds on the Turing test. The acronym stands for Completely Automated Public Turing test to tell Computers and Humans Apart, and it is used to differentiate computers from humans. Today CAPTCHA serves as a security measure, a form of the challenge-response authentication process, to help protect computer users from spam. It counters automated password attacks by asking users to complete a task that shows they are not an automaton trying to break into a password-protected account.
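CAPTCHA services differ in the puzzles they present, but the underlying challenge-response pattern is simple: issue a challenge a human can answer easily, then verify the answer before the protected action is allowed to proceed. The toy Python sketch below illustrates only that flow; the trivial arithmetic challenge and the function names are stand-ins, not how any real CAPTCHA library works.

```python
import random

def issue_challenge():
    """Create a simple challenge and its expected answer.
    A real CAPTCHA would use distorted images or puzzles instead of arithmetic."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", str(a + b)

def verify_response(expected, response):
    """Accept the request only if the response matches the expected answer."""
    return response.strip() == expected

# One challenge-response round, acting as a gate in front of a login attempt.
question, answer = issue_challenge()
print(question)

for attempt in (answer, "not a number"):      # one passing and one failing response
    if verify_response(answer, attempt):
        print(f"{attempt!r}: accepted, proceed with authentication")
    else:
        print(f"{attempt!r}: rejected, request blocked")
```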
Turing also thought it better to build a program that would simulate a child's mind rather than an adult's, because it would be a simpler model. So in 1948, working with his former colleague D.G. Champernowne, Turing wrote a computer program to play chess moves against an opponent. In 1950 the program was completed and dubbed the Turochamp. In 1952 an attempt was made to run it on a Ferranti Mark 1, but the computer lacked the power to execute the program. Instead, the algorithm's many, many pages of instructions had to be flipped through by hand to carry out each move on a chessboard, which took about half an hour per move. Turing's program lost its chess match, but the event was still significant and pushed forward the ideas behind artificial intelligence. Methods similar to Turing's machine learning have since been utilized and sharpened by Demis Hassabis to have computers play chess. Hassabis is a British neuroscientist, artificial intelligence researcher, and video game designer. He was a co-founder of DeepMind, which was bought by Google in 2014, and he now leads Google DeepMind.

Turing is also remembered for the award named after him, the Turing Award. It is recognized as the highest honor in computer science and is also known as the Nobel Prize of Computing. Turing is thought of as a major founder of, and thinker behind, many concepts in computer science and data science.

Bell Labs and Unix

Bell Labs, formerly known by other names such as Volta Laboratory and Bell Laboratories, grew out of the laboratory Alexander Graham Bell founded with the Volta Prize he received in 1880 for his invention of the telephone. In 1876 Alexander Graham Bell, Thomas Sanders, and Gardiner Hubbard filed a patent for the telephone, and the Bell Telephone Company was formed a year later. In 1899 it became part of the American Telephone & Telegraph Company, known as AT&T. Bell Labs researchers have won nine Nobel Prizes, the most of any company so far. The laboratory was later sold to Nokia and is now called Nokia Bell Labs.

During the 1960s, Ken Thompson, Dennis Ritchie, and others working at Bell Labs developed Unix, a multitasking, multiuser operating system. Multitasking means new tasks can be started before old ones are finished; multiuser means multiple users can have access to the operating system at the same time. The operating system's software resources provide common services for computer programs. By the early 1980s, Unix was seen as a universal operating system, suitable for computers of all sizes. This was essential for the development of the Internet and the reshaping of computing, not only for individual computers but for network computing as well.
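The multitasking property described above, where a new task can be started before an old one finishes, is visible directly in the Unix process model. The short Python sketch below is a minimal illustration, assuming a Unix-like system where the standard os.fork call is available: the parent process keeps working while the child process it has just started runs alongside it.

```python
import os
import time

# Minimal illustration of Unix-style multitasking: fork() starts a new
# process (the child), and the parent keeps running before the child is done.
# Requires a Unix-like system, where os.fork is available.
pid = os.fork()

if pid == 0:
    # Child process: the newly started task.
    for i in range(3):
        print(f"child task working ({i})")
        time.sleep(1)
    os._exit(0)
else:
    # Parent process: continues immediately, before the child has finished.
    print(f"parent continues while child process {pid} runs")
    os.waitpid(pid, 0)  # later, collect the child's exit status
    print("child finished; both tasks are done")
```

Run on a Unix system, the parent's message typically appears while the child is still printing, which is exactly the behavior the definition above describes.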
While working at Bell Labs, Thompson developed the B programming language, the precursor to the C programming language that Ritchie created in 1972 and 1973. Ritchie is known for creating C, which is still widely used in applications, operating systems, and embedded systems development. Thompson later left Bell Labs for Google, where he co-developed the Go programming language, announced in 2009, with his colleagues Rob Pike and Robert Griesemer. Moreover, in 1983, Thompson and Ritchie received the Turing Award for their development of generic operating systems theory and for implementing the Unix operating system.

The Kenbak-1, designed by John Blankenbaker and released in 1971, is considered to be the first personal computer. It had 256 bytes of memory, an 8-bit word size, and limited input and output. The computer did not have much power, so it was good only for learning the essentials of programming, not for running application programs. In 1973 the Xerox Alto was developed at Xerox's Palo Alto Research Center (PARC). The Alto was a major step in the development of personal computers because it had a graphical user interface (GUI), a bitmapped high-resolution screen, internal and external storage, a mouse, and some software. It was the first personal computer to be recognized as workable because of those capabilities.

IBM

International Business Machines (IBM), nicknamed Big Blue, is a multinational technology and IT consulting corporation headquartered in Armonk, New York. IBM started working on its first computers, which used punch cards, in the 1940s. During World War II, punch cards were used at Los Alamos for the Manhattan Project in developing the first atomic bombs. IBM does not deny the allegations that it also collaborated with the Nazis, who used punch-card technology of the kind developed to tabulate the United States Census to help run the concentration camps. Also during the war, IBM built the Harvard Mark I for the Navy, an Automatic Sequence Controlled Calculator that could add, subtract, multiply, and divide; at the time this was a novel idea. The company offered a range of hardware, software, and custom packages for the computer.

Starting in 1952, IBM worked with the Massachusetts Institute of Technology (MIT) Lincoln Laboratory on an air-defense computer, which did not go well. Then, in 1958, IBM worked with MIT on SAGE, a massive computing and communication system for the United States Air Force. It was there that transmitting digital data over telephone lines, duplex multiprocessing, and other mechanisms were developed for this real-time digital computer. In 1964 IBM had its first breakthrough with its System/360 family of mainframe computers, because they had interchangeable software and peripheral equipment. Earlier, in 1956, Arthur Samuel of IBM's Poughkeepsie, New York, laboratory had used machine learning to program an IBM computer to play checkers, with techniques that let the machine learn from its own mistakes. For the most part, IBM was one of the most respected computer companies of the 1970s and 1980s for its computers and technological research.

Bill Lowe was known as the father of the IBM personal computer. He received his physics degree from Lafayette College in 1962 and then joined IBM as a product test engineer. He moved up very quickly at IBM until he reached the position of lab director for his site. Lowe encouraged IBM to enter the PC business. The IBM Corporate Management Committee gave him approval to move forward and build a personal computer internally, and in 1980 Lowe selected a team to develop and launch the new product. To move quickly on the project, Lowe designed the computer with standard components and outsourced the development of the operating system to Microsoft and the processor to Intel. Following Lowe's strategy, the IBM PC was developed in one year. It was launched in August 1981 and sold far more units than had been projected, thereby legitimizing the personal computer business. IBM was the giant of the computer industry at that time and was expected to beat Apple's market share. However, because of the shortcuts IBM took to enter the market quickly, the nonproprietary parts of the products it released were easily copied by other manufacturers.
This outsourcing maneuver by Lowe opened the door to PC clones that imitated the IBM PC, because the machine was based on relatively standard integrated circuits, a basic card-slot design that was not patented, an operating system supplied by Microsoft, and a processor supplied by Intel. In other words, IBM's major contribution to the personal computer was the establishment of hardware architecture standards among manufacturers. Because IBM outsourced so much and took so many shortcuts to cut development time, other companies were able to copy and refine the design to the point where IBM was no longer a significant force in its construction. In effect, the IBM PC became only the prototype of the PC standard. In the process, Microsoft emerged as the dominant force, because it provided the operating system and utilities for PCs whether they were IBM machines or clones.

Apple Computer

Today the Apple Inc. name is known worldwide primarily for its iPhones, smartwatches, and macOS computers and tablets, but the company was founded as Apple Computer Company in 1976 by Steve Jobs, Steve Wozniak, and Ronald Wayne to sell the Apple I personal computer that Wozniak had developed. When the Apple II became a bestseller in 1977, Jobs and Wozniak incorporated the company as Apple Computer, Inc. The Apple II had color graphics and an open architecture. In 1979, VisiCalc, an easy-to-use spreadsheet program, was released for the Apple II. Soon after, in 1980, Apple went public on the success of its personal computers.

In 1983 Apple created the Lisa, one of the first personal computers sold with a GUI. The Lisa was named after Steve Jobs' daughter. A GUI is a graphics-based operating system interface that uses icons, menus, and a mouse to carry out commands such as opening, deleting, and moving files. Although a GUI is mainly operated with a mouse, keyboard shortcuts and the arrow keys can also be used. The graphics-based idea came to Steve Jobs when he visited a demonstration of the Xerox Alto, a computer with a graphical interface. Even though adding a GUI was a brilliant move, the Lisa became a big financial hit to Apple because of infighting, its high price, and its limited quantity of software. Jobs was pushed off the Lisa team and went to the Macintosh division of the company. It was there that he introduced a cheaper graphical interface system than the one used on the Lisa. However, the Macintosh had only 128 kilobytes of RAM, which limited its speed; this was done to keep the price point low. Once the slow speed of the computer hit the public media, Macintosh sales slipped.

Soon after, in 1985, there was more infighting and more power struggling in the company, mainly between Steve Jobs and CEO John Sculley, whom Jobs had hired because the computers were not selling well. Steve Jobs resigned and founded NeXT, and Wozniak removed himself amicably from Apple at that time. By 1990 Apple was losing market share in the personal computer arena, while other companies using the Microsoft Windows operating system expanded with Intel-powered IBM PC clones. With the company near bankruptcy, Apple brought Steve Jobs back to fix its operating strategy and restructure the business. After Jobs returned in 1997, Apple went back to being a profitable company under his leadership, and the iMac, iPod, iPhone, and iPad were launched to great success.
Apple Stores started to open, but Steve Jobs' health declined, and he died of pancreatic cancer in 2011. Steve Jobs' leadership had brought Apple to fame and glory. Tim Cook took over the CEO role after Steve Jobs' death, and the Macintosh and iPhone markets continue to thrive.

Summary

The mid-19th century was the start of a revolutionary time for the design of the computer, with Ada Lovelace and Charles Babbage first imagining a machine that could work through complex mathematical algorithms. The 20th century brought Alan Turing, who was able to use cryptology and the technology of a massive machine to save many lives in World War II. Turing also argued that computers could one day have the ability to think, and his work was the impetus for the field of artificial intelligence. After that, Bell Labs, IBM, and Apple were able to put the technology to use, having the foresight to put a personal computer in the hands of every person. In about a century the world had changed, and the next chapter will review some of the other discoveries that helped the computer become more prominent and powerful and made it more efficient and effective to use.

Bibliography

Part I: Scientific Advances
Chapter 1: History of Computing

1. Charles Petzold, The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine, Wiley Publishing, Inc., 2008.
2. Paul E. Ceruzzi, Computing: A Concise History, MIT Press, 2012.
3. James W. Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon, MIT Press, 2019.
4. Edwin Black, IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation, Three Rivers Press, 2001.
5. Walter Isaacson, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, Simon & Schuster, 2014.
6. Bob Goldsborough, "William C. Lowe, helped launch IBM's first personal computer, 1941–2013," Chicago Tribune, 2013.
