Ethics for the Information Age, Chapter 8: Computer Reliability

Document Details

Uploaded by IntelligentJasper852

2020

Tags

computer reliability, software engineering, ethics, technology

Summary

This chapter from "Ethics for the Information Age" discusses computer reliability, examining data-entry errors, software glitches, and notable system failures. It analyzes cases such as the Therac-25 accidents, the fatal Tesla Autopilot and Uber test-vehicle crashes, and the validation of computer simulations to highlight ethical concerns and the challenges of building reliable computer systems.

Full Transcript


Ethics for the Information Age, Eighth Edition
Chapter 8: Computer Reliability
Copyright © 2020, 2017, 2015 Pearson Education, Inc. All Rights Reserved

Learning Objectives (1 of 2)
8.1 Introduction
8.2 Data-entry or data-retrieval errors
8.3 Software and billing errors
8.4 Notable software system failures
8.5 Therac-25

Learning Objectives (2 of 2)
8.6 Tesla Version 7.0 (Autopilot)
8.7 Uber test-vehicle accident
8.8 Computer simulations
8.9 Software engineering
8.10 Software warranties and vendor liability

8.1 Introduction
Computers are an integral part of modern communication, transportation, retail, banking, finance, and health-care systems
– They save time and money and enable greater productivity when working correctly
– Failures can result in lost time, lost money, injury, or even death
Studying failures is a way to appreciate the complexity of building reliable computerized systems
This chapter also examines the increasingly important area of computer simulations, gives a high-level overview of software engineering, and discusses software warranties and the responsibility of software manufacturers for the quality of their products

8.2 Data-Entry or Data-Retrieval Errors

Two Kinds of Data-Related Failure
– A computerized system may fail because wrong data are entered into it
– A computerized system may fail because people incorrectly interpret the data they retrieve

Disfranchised Voters
– November 2000 general election: Florida disqualified thousands of voters
– Reason: people mistakenly identified as felons
– Cause: incorrect records entered in voter database
– Consequence: may have affected the outcome of the national presidential election

Act-Utilitarian Analysis: Database of Stolen Vehicles (1 of 2)
– Over 1 million cars stolen every year
– Just over half are recovered; say 500,000
– Assume NCIC is responsible for at least 20%: 100,000 cars recovered because of NCIC
– Benefit of $5,000 per car (owner gets car back; effects on national insurance rates; criminal doesn't profit)
– Total value of NCIC stolen-vehicle database: $500 million/year

Act-Utilitarian Analysis: Database of Stolen Vehicles (2 of 2)
– Only a few stories of false arrests
– Assume 1 false arrest per year (probably high)
– Assume harm caused by a false arrest is $55,000 (size of award to Rogan)
– Benefit surpasses harm by $499,945,000/year
– Conclusion: good to have the NCIC stolen-vehicles database

8.3 Software and Billing Errors

Errors When Data Are Correct
– Assume data are correctly fed into a computerized system
– The system may still fail if there is an error in its programming

Errors Leading to System Malfunctions
– Qwest sent incorrect bills to cell phone customers
– Faulty USDA beef price reports
– Spelling and grammar checkers increased errors
– New York City Housing Authority overcharged renters
– About 450 California prison inmates mistakenly released
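The act-utilitarian tally for the NCIC stolen-vehicle database above is simple enough to check directly; a minimal sketch using the chapter's assumed figures:

```python
# Act-utilitarian tally for the NCIC stolen-vehicle database,
# using the chapter's assumed figures (not measured data).
recovered_by_ncic = 100_000      # 20% of ~500,000 recoveries/year
benefit_per_car = 5_000          # dollars: car returned, insurance effects, etc.
false_arrests = 1                # assumed per year (probably high)
harm_per_false_arrest = 55_000   # dollars: size of the award to Rogan

benefit = recovered_by_ncic * benefit_per_car
harm = false_arrests * harm_per_false_arrest
net = benefit - harm

print(f"benefit = ${benefit:,}")  # $500,000,000
print(f"net     = ${net:,}")      # $499,945,000
```

The point of laying the arithmetic out this way is that the conclusion is extremely sensitive to the assumed inputs; doubling the harm estimate barely moves the net, which is why critics focus on whether dollar figures can capture the harm of a false arrest at all.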
Errors Leading to System Failures
– Ambulance dispatch system in London
– Japan's air traffic control system
– Los Angeles County + USC Medical Center laboratory computer system
– Boeing 777 (Malaysia Airlines flight)
– NASDAQ stock exchange shut down
– Insulin pump demo at Black Hat conference
– Jeep Cherokee

8.4 Notable Software System Failures

Patriot Missile
– Designed as an anti-aircraft missile
– Used in the 1991 Gulf War to intercept Scud missiles
– One battery failed to shoot at a Scud that killed 28 soldiers
– Designed to operate only a few hours at a time; kept in operation more than 100 hours
– Tiny truncation errors added up
– Clock error of 0.3433 seconds → tracking error of 687 meters

Patriot Missile Failure
(1) The radar system, doing a wide-area search, picks up the Scud missile. (2) The radar system isolates the proposed target. (3) A software error causes the system to produce a faulty range gate. The system loses track of the missile because it does not fly through this gate. (Figure from Science 255:1347. Copyright © 1992 by the American Association for the Advancement of Science. Reprinted with permission.)

AT&T Long-Distance Network
– Significant service disruption
  ▪ About half of telephone-routing switches crashed
  ▪ 70 million calls not put through
  ▪ 60,000 people lost all service
  ▪ AT&T lost revenue and credibility
– Cause
  ▪ Single line of code in an error-recovery procedure
  ▪ Most switches running the same software
  ▪ Crashes propagated through the switching network

AT&T Long-Distance Network Failure
(1) A single switch in New York City detects an error condition and reboots. When it comes back up, it sends an "OK" message to other switches. (2) Switches in Detroit, St. Louis, and Atlanta are so busy that handling the "OK" message causes them to detect an error condition and reboot. When they come back up, they send out "OK" messages to other switches, causing some of them to fail, and so on.

Denver International Airport
BAE built the automated baggage handling system
– Problems
  ▪ Airport designed before the automated system was chosen
  ▪ Timeline too short
  ▪ System complexity exceeded the development team's ability
– Results
  ▪ Added a conventional baggage system
  ▪ 16-month delay in opening the airport
  ▪ Cost Denver $1 million a day

Direct-Recording Electronic Voting Machines
– After problems with the 2000 election, Congress passed the Help America Vote Act (HAVA) of 2002
– HAVA provided money to states to replace punch-card voting systems
– Many states used HAVA funds to purchase direct-recording electronic (DRE) voting machines
– Brazil and India have run national elections using DRE voting machines exclusively
– In November 2006, one-third of U.S. voters used DRE voting machines

Direct-Recording Electronic Voting Machine
This Diebold voting machine uses a touch-sensitive screen to capture each voter's choices. (AP photo/Rogelio Solis)

Issues with DRE Voting Machines
– Voting irregularities
  ▪ Failure to record votes
  ▪ Overcounting votes
  ▪ Misrecording votes
– Lack of a paper audit trail
– Vulnerability to tampering
– Source code a trade secret, so it cannot be examined
– Possibility of widespread fraud through malicious programming

8.5 Therac-25
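Stepping back to the Patriot failure in Section 8.4: the accumulated clock error can be reproduced with a short calculation. This is a back-of-the-envelope sketch assuming the commonly cited account of the bug (time counted in 0.1-second ticks, with 0.1 chopped to a 23-bit binary fraction) and an assumed closing speed of about 2,000 m/s; the exact register layout is an assumption, not something stated in the chapter:

```python
# Back-of-the-envelope reconstruction of the Patriot clock drift.
# Assumption: 0.1 s stored as a chopped 23-bit binary fraction.
from fractions import Fraction

tick = Fraction(1, 10)                       # intended tick length: 0.1 s
stored = Fraction(int(tick * 2**23), 2**23)  # value actually stored (chopped)
error_per_tick = tick - stored               # ~9.5e-8 s lost on every tick

hours = 100                                  # battery left running > 100 hours
ticks = hours * 3600 * 10                    # ten ticks per second
drift = float(error_per_tick * ticks)        # accumulated clock error

closing_speed = 2000                         # m/s (assumed relative speed)
shift = drift * closing_speed                # how far the range gate is off

print(f"drift ≈ {drift:.4f} s")              # ≈ 0.3433 s, matching the slide
print(f"range-gate shift ≈ {shift:.0f} m")   # ≈ 687 m, matching the slide
```

The design lesson is that each individual truncation is harmless; it is only the unanticipated 100-hour uptime that lets 3.6 million tiny errors accumulate into a range gate hundreds of meters off target.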
Genesis of the Therac-25
– AECL and CGR built the Therac-6 and Therac-20
– Therac-25 built by AECL
  ▪ PDP-11 an integral part of the system
  ▪ Hardware safety features replaced with software
  ▪ Reused code from the Therac-6 and Therac-20
– First Therac-25 shipped in 1983
  ▪ Patient in one room
  ▪ Technician in adjoining room

Software Errors
– Race condition: the order in which two or more concurrent tasks access a shared variable can affect the program's behavior
– Two race conditions in the Therac-25 software
  ▪ Command-screen editing
  ▪ Movement of the electron-beam gun

8.6 Tesla Version 7.0 (Autopilot)

Introduction (1 of 2)
– October 2014: Tesla introduces technology package
  ▪ Ultrasonic sensors
  ▪ Camera
  ▪ Front radar
  ▪ Digitally controlled brakes
  ▪ Enabled the car to brake before collisions
– October 2015: Tesla releases Version 7.0 (Autopilot)
  ▪ Software to control speed and steering
  ▪ Tesla: "Like the systems that airline pilots use when conditions are clear"
  ▪ Tesla: "Driver still responsible" and supposed to keep hands on the steering wheel

Automation of Driving (SAE International)
– SAE Level 0 – No Automation
– SAE Level 1 – Driver Assistance (examples: anti-lock brakes, dynamic cruise control)
– SAE Level 2 – Partial Automation (steering and acceleration/deceleration)
– SAE Level 3 – Conditional Automation (all aspects of the dynamic driving task, with the expectation that the human responds to a request for intervention)
– SAE Level 4 – High Automation (all aspects of the dynamic driving task, even if the human does not respond to a request for intervention)
– SAE Level 5 – Full Automation
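Returning to the race-condition definition in the Therac-25 discussion above: the hazard can be made concrete with a deterministic toy sketch (purely illustrative, not the Therac-25's actual code). Two tasks each perform a non-atomic read-modify-write on a shared counter, and the result depends entirely on how their steps interleave:

```python
# Toy illustration of a race condition: two tasks each perform a
# non-atomic read-modify-write (increment) on a shared value.
# The final result depends on the interleaving of their steps.

def run(schedule):
    """Execute two increment tasks step by step in the given order.
    schedule is a sequence of task ids; each task reads the shared
    value on its first turn and writes back value+1 on its second."""
    shared = 0
    local = {}
    for task in schedule:
        if task not in local:
            local[task] = shared        # step 1: read the shared variable
        else:
            shared = local[task] + 1    # step 2: write the incremented value
    return shared

# Serialized: A reads, A writes, B reads, B writes -> both updates survive.
print(run(["A", "A", "B", "B"]))  # 2

# Interleaved: both read the old value before either writes -> one update lost.
print(run(["A", "B", "A", "B"]))  # 1
```

In the Therac-25, the analogous lost update meant an operator's on-screen edit could fail to reach the beam-control task, so the machine delivered a massive overdose while the console showed safe settings.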
May 7, 2016 Fatal Accident
– Tesla S traveling east on US-27A, a divided highway in Florida
– Semitrailer truck, driving west on the highway, turned left in front of the Tesla
– Car struck the trailer, killing driver Joshua Brown
  ▪ Car traveling 74 miles per hour (posted speed limit 65 mph)
  ▪ Autopilot engaged for 37 minutes before the collision
  ▪ Brown's hands on the wheel only 25 seconds during that time
  ▪ System provided Brown with 7 warnings to put his hands on the wheel
  ▪ Brakes not applied: the trailer was white, making it difficult to see, and tall, making its radar signature similar to that of an overhead sign

The Hand-Off Problem
– Drivers get bored when they have nothing to do
– Hand-off problem
  ▪ Takes 3–7 seconds on average for drivers to regain attention and take control
  ▪ In many emergency situations, the accident happens in less than 3 seconds
– Ford, Volvo, and Google all skipping SAE Level 3 automation
  ▪ Going straight to SAE Level 5
  ▪ Avoiding the hand-off problem

8.7 Uber Test-Vehicle Accident

Introduction (2 of 2)
– Travis Kalanick, former CEO of Uber, held that…
  ▪ development of autonomous vehicles was an existential threat to Uber
  ▪ it was vital for Uber to be among the first companies to develop autonomous-vehicle technology
– Uber's effort to catch up with Tesla and Waymo
  ▪ Opened an R&D center in Pittsburgh, Pennsylvania
  ▪ Hired 50 researchers from Carnegie Mellon University
  ▪ Quickly developed test vehicles

Uber Begins Self-Driving Car Pickups
– Pittsburgh, Pennsylvania
  ▪ Started offering pickups in September 2016
  ▪ Uber engineers in the front seat to monitor the system and take over when necessary
– San Francisco, California
  ▪ Started offering pickups in December 2016
  ▪ Experiment lasted only a week
  ▪ Video of an Uber car running a red light on day 1
  ▪ California Department of Motor Vehicles revoked the registrations of Uber's self-driving cars
– Arizona governor welcomed Uber to his state

Shift to One Human Safety Operator
– Two safety operators in initial tests
  ▪ One behind the steering wheel, responsible for taking control of the vehicle if necessary
  ▪ Other responsible for monitoring system performance and logging significant events on a laptop
– Operators assigned to 8-hour shifts
  ▪ Repeatedly traversed the same route
  ▪ 30-minute lunch break
– Fall 2017: Uber removes the second safety operator
  ▪ Operators complained it would be harder for them to stay alert
  ▪ Scientific research demonstrates the legitimacy of their concerns

"Bad Experiences"
– In March 2018 Uber's test vehicles still gave passengers frequent "bad experiences," such as braking too quickly
– Autonomous vehicles are subject to false positives – identifying a danger when there is none
  ▪ Car exhaust
  ▪ Steam
  ▪ Litter
– Quickly braking for no reason is a serious problem
  ▪ Disconcerting to passengers
  ▪ Car may get rear-ended
– Eliminating false positives can lead to false negatives – failing to identify a dangerous situation – and that is worse
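The false-positive/false-negative trade-off described above is essentially a thresholding problem: raising a detector's confidence threshold suppresses phantom obstacles but also discards real ones. A toy sketch with made-up detection scores (all numbers illustrative):

```python
# Toy model of the false-positive / false-negative trade-off.
# Each detection has a confidence score and a ground truth:
# True = real hazard (pedestrian), False = harmless (steam, litter).
detections = [
    (0.9, True),   # clearly a pedestrian
    (0.8, False),  # dense steam cloud that looks solid
    (0.6, True),   # partially occluded pedestrian
    (0.4, False),  # drifting litter
    (0.3, True),   # pedestrian in dark clothing at night
]

def count_errors(threshold):
    """Brake for every detection scoring at or above threshold."""
    false_pos = sum(1 for s, hazard in detections if s >= threshold and not hazard)
    false_neg = sum(1 for s, hazard in detections if s < threshold and hazard)
    return false_pos, false_neg

# A permissive threshold brakes for steam; a strict one misses pedestrians.
print(count_errors(0.5))   # (1, 1): one phantom brake, one hazard missed
print(count_errors(0.85))  # (0, 2): no phantom braking, two hazards missed
```

No threshold fixes both error types at once; only a better classifier can. That is why tuning out "bad experiences" without improving perception, as Uber did, shifts risk onto pedestrians rather than eliminating it.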
Effort to Eliminate "Bad Experiences"
– Uber chose to reduce "bad experiences" by turning off the emergency braking of its self-driving cars
  ▪ Human safety operator responsible for emergency braking and steering
  ▪ Engineers did not implement a way for the system to alert the operator when emergency braking was needed
  ▪ This sounds bad, but it actually makes sense. Why?
– Volvo XC90s in the Arizona test fleet came equipped with automatic emergency braking as standard equipment
  ▪ System deactivated when the car was under the control of the self-driving system
  ▪ Otherwise, two active systems could give conflicting commands

March 18, 2018 Accident (1 of 5)
– Uber test vehicle running a predetermined test route in Tempe, Arizona
  ▪ Dark conditions (time was 9:58 p.m.)
  ▪ Car northbound in the rightmost lane of N Mill Avenue, traveling at 43 mph (2 mph below the speed limit)
– Female pedestrian began crossing N Mill Avenue from west to east, pushing a bicycle
  ▪ Pedestrian wearing dark clothes
  ▪ Some illumination from streetlights
  ▪ No crosswalk; signs warned pedestrians not to cross
  ▪ Nearest crosswalk 360 feet to the north

March 18, 2018 Accident (2 of 5)
NTSB preliminary report with a figure showing the location of the crash: https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf

March 18, 2018 Accident (3 of 5)
– Response of the self-driving system to the pedestrian
  ▪ First detected the moving object six seconds before the collision
  ▪ Had trouble classifying the object
  ▪ When the car was 80 feet from the pedestrian (1.3 seconds before impact), it determined emergency braking was required
  ▪ System did not alert the driver because there was no alert mechanism
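The distance and time figures in the accident account above are mutually consistent, which a quick unit conversion confirms (no new data, just arithmetic on the reported numbers):

```python
# Consistency check on the reported figures: at 43 mph, how far does
# the car travel in the 1.3 seconds between the emergency-braking
# determination and impact?
speed_mph = 43
seconds_to_impact = 1.3

feet_per_second = speed_mph * 5280 / 3600       # ≈ 63.1 ft/s
distance_ft = feet_per_second * seconds_to_impact

print(f"{distance_ft:.0f} feet")  # ≈ 82 feet, matching the reported 80 feet
```

The same conversion makes the core problem vivid: 1.3 seconds is well under the 3–7 seconds the hand-off research says an inattentive human needs to retake control.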
March 18, 2018 Accident (4 of 5)
– Actions of the safety operator
  ▪ Had not been keeping her gaze fixed on the road ahead
  ▪ Told police she was looking at the computer interface
  ▪ However, according to the police report, she had been streaming the talent show "The Voice" to her smartphone
  ▪ Put her hands on the steering wheel and began to turn just before impact
  ▪ Braked the vehicle less than 1 second after impact
– Effects of the collision
  ▪ Vehicle hit the pedestrian at 39 mph
  ▪ Pedestrian died

March 18, 2018 Accident (5 of 5)
– Actions of the pedestrian before impact
  ▪ Did not look in the direction of the oncoming car until just before it hit her
  ▪ May have assumed there were no cars on the road: the Volvo may not have been visible to her when she stepped off the curb, due to distance, a bend in the road, and foliage
  ▪ Her judgment may have been impaired: toxicology tests returned positive results for methamphetamine and marijuana
– Consequences of the accident
  ▪ Arizona governor suspended Uber's testing program, saying public safety should have been Uber's top priority
  ▪ Uber shut down its Arizona testing facility and terminated 300 safety operators in Arizona

Summary: Classifying Notable Software System Failures

8.8 Computer Simulations

Uses of Simulations
– Simulations are better than physical experiments when…
  ▪ the experiment is too expensive or time-consuming
  ▪ the experiment is unethical
  ▪ the experiment is impossible
– Applications of simulations
  ▪ Model past events
  ▪ Understand the world around us
  ▪ Predict the future

Computer Simulations
We rely on computer simulations to predict the path and speed of hurricanes. (Courtesy of NASA)

Validating Simulations
– Verification: Does the program correctly implement the model?
– Validation: Does the model accurately represent the real system?
– Validation methods
  ▪ Make a prediction, then wait to see if it comes true
  ▪ Predict the present from old data
  ▪ Test credibility with experts and decision makers

Validating a Model
A computer simulation of an automobile accident can reveal roughly the same information as an actual crash test, and it is far less expensive. (Courtesy of Oak Ridge National Laboratory, U.S. Dept. of Energy)

"Predicting the Present"
You can validate a model's ability to predict 25 years into the future by using it to "predict the present" with data 25 or more years old. You can then compare the model's prediction of the present with current reality.

8.9 Software Engineering

Specification
– Determine system requirements
– Understand constraints
– Determine feasibility
– End products
  ▪ High-level statement of requirements
  ▪ Mock-up of the user interface
  ▪ Low-level requirements statement

Development
– Create high-level design
– Discover and resolve mistakes and omissions in the specification
– CASE tools to support the design process
– Object-oriented systems have advantages
– After detailed design, the actual programs are written
– Result: working software system
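Returning to the "predicting the present" idea from Section 8.8: the technique can be sketched with synthetic data. A toy growth model is fitted to two old observations and then asked to "predict the present," whose value is held back for comparison; all numbers are illustrative, not real measurements:

```python
# Toy "predict the present" validation of a simple growth model.
# Synthetic history: a quantity that grew roughly 2% per year.
history = {1970: 100.0, 1995: 164.1, 2020: 269.2}   # illustrative data only

# Fit an exponential-growth model using ONLY the two old data points.
years = 1995 - 1970
rate = (history[1995] / history[1970]) ** (1 / years) - 1

# Use the fitted model to "predict the present" (2020) from 1995.
predicted_2020 = history[1995] * (1 + rate) ** (2020 - 1995)

# Compare the prediction with the held-back present-day value.
error = abs(predicted_2020 - history[2020]) / history[2020]
print(f"fitted rate ≈ {rate:.2%}, relative error ≈ {error:.1%}")
```

A small error over a 25-year hindcast builds confidence that the model can be trusted for 25-year forecasts; a large one warns decision makers before any real prediction is acted on.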
Validation (Testing)
– Ensure the software satisfies the specification
– Ensure the software meets the user's needs
– Challenges to testing software
  ▪ Noncontinuous responses to changes in input
  ▪ Exhaustive testing impossible
  ▪ Testing reveals bugs but cannot prove none exist
– Test modules, then subsystems, then the system

Software Quality Is Improving
– The Standish Group tracks IT projects
– Situation in 1994
  ▪ 1/3 of projects canceled before completion
  ▪ 1/2 of projects had time and/or cost overruns
  ▪ 1/6 of projects completed on time and on budget
– Situation in 2009
  ▪ 1/4 of projects canceled before completion
  ▪ 5/12 of projects had time and/or cost overruns
  ▪ 1/3 of projects completed on time and on budget

Success of IT Projects over Time
Research by the Standish Group reveals that the success rate of IT projects in 2009 was twice that of 1994. Today, about one-third of software projects are completed on time and on budget.

Gender Bias
– In male-dominated fields, unconscious gender bias can affect important design decisions
  ▪ Example: seatbelts don't work well for pregnant women and their unborn children because crash dummies are modeled after men
– Men and women have different approaches to writing and debugging software and to using programming tools
– Even if women are present, their voices may not be heard
  ▪ Voting suppresses minority views
  ▪ Many decisions are made under time pressure
– Solutions
  ▪ Attract more women by improving job postings
  ▪ Assign mentors; eliminate sexual harassment

Bias in Training Sets for Artificial-Intelligence Systems
– When an AI system is trained with a biased data set, the bias can affect the system's performance across a diverse population
– Example: facial-recognition systems
  ▪ Training data: 75% of faces male, 80% of faces white
  ▪ Resulting performance: misidentified fair-skinned males 1% of the time, but misidentified the gender of darker-skinned females up to 35% of the time
– Example: Google Photos mislabeled black people as "gorillas"
– Example: photos of people cooking in a kitchen
  ▪ 67% of training photos showed women cooking
  ▪ Trained system correctly identified the gender of men in the photos less than 50% of the time

8.10 Software Warranties and Vendor Liability

Shrinkwrap Warranties
– Some say you accept the software "as is"
– Some offer a 90-day replacement or money-back guarantee
– None accept liability for harm caused by use of the software

Should Software Be Considered a Product?
– If software were a product, then…
  ▪ the theory of strict liability would apply to the software maker
  ▪ the maker would be liable for personal injury or property damage (but not economic loss) caused when the product was used as intended
  ▪ the primary impact would be on software in embedded systems, e.g., medical devices or automobiles
– Courts have resisted treating software as a product
  ▪ A software-controlled device may cause harm through no fault of the programmer
  ▪ Strict liability puts too much liability on the programmer
Case Study: Incredible Bulk
– Peter downloads Incredible Bulk for $49.95
– Game usable, but has annoying bugs
– Company never releases software patches
– Next year the company releases Incredible Bulk II for $49.95
– New game fixes all major bugs in the original game

Ethical Analyses
– Kantian analysis
  ▪ Company did no wrong
  ▪ It never promised to release bug fixes
  ▪ Peter could have read reviews before purchasing the game
– Social contract theory analysis
  ▪ Peter did not actually purchase the software; he purchased a license to use the software
  ▪ At some point before releasing Incredible Bulk II, the company fixed the bugs
  ▪ At that point it should have made the patches freely available to users

Copyright
This work is protected by United States copyright laws and is provided solely for the use of instructors in teaching their courses and assessing student learning. Dissemination or sale of any part of this work (including on the World Wide Web) will destroy the integrity of the work and is not permitted. The work and materials from it should never be made available to students except by instructors using the accompanying text in their classes. All recipients of this work are expected to abide by these restrictions and to honor the intended pedagogical purposes and the needs of other instructors who rely on these materials.
