Understanding the Swiss Cheese Model PDF

Document Details


Douglas A. Wiegmann, Laura J. Wood, Tara N. Cohen, Scott A. Shappell

Tags

patient safety, human error, system analysis, root cause analysis

Summary

This review article discusses the Theory of Active and Latent Failures, also known as the Swiss cheese model, which is used to understand accident causation in complex systems. The review explains the model in detail and highlights its potential applications to patient safety.

Full Transcript

REVIEW ARTICLE

Understanding the "Swiss Cheese Model" and Its Application to Patient Safety

Douglas A. Wiegmann, PhD,* Laura J. Wood, MS,* Tara N. Cohen, PhD,† and Scott A. Shappell, PhD‡

Abstract: This article reviews several key aspects of the Theory of Active and Latent Failures, typically referred to as the Swiss cheese model of human error and accident causation. Although the Swiss cheese model has become well known in most safety circles, there are several aspects of its underlying theory that are often misunderstood. Some authors have dismissed the Swiss cheese model as an oversimplification of how accidents occur, whereas others have attempted to modify the model to make it better equipped to deal with the complexity of human error in health care. This narrative review aims to provide readers with a better understanding and greater appreciation of the Theory of Active and Latent Failures upon which the Swiss cheese model is based. The goal is to help patient safety professionals fully leverage the model and its associated tools when performing a root cause analysis as well as other patient safety activities.

Key Words: patient safety, system analysis, root cause analysis, human error, human factors

(J Patient Saf 2022;18:119–123)

From the *Department of Industrial and Systems Engineering, University of Wisconsin-Madison, Madison, Wisconsin; †Department of Surgery, Cedars-Sinai, Los Angeles, California; and ‡Department of Human Factors and Behavioral Neurobiology, Embry-Riddle Aeronautical University, Daytona Beach, Florida.

Correspondence: Laura J. Wood, MS, Department of Industrial and Systems Engineering, University of Wisconsin-Madison, 1513 University Ave, Madison, WI 53706 (e-mail: [email protected]).

The authors disclose no conflict of interest.

Copyright © 2021 Wolters Kluwer Health, Inc. All rights reserved.

The Theory of Active and Latent Failures was proposed by James Reason in his book, Human Error.1 According to Reason, accidents within most complex systems, such as health care, are caused by a breakdown or absence of safety barriers across 4 levels within a sociotechnical system. These levels can best be described as Unsafe Acts, Preconditions for Unsafe Acts, Supervisory Factors, and Organizational Influences.2 Reason1 used the term "active failures" to describe factors at the Unsafe Acts level, whereas the term "latent failures" was used to describe unsafe conditions located higher up in the system.

Today, most people refer to Reason's theory as the "Swiss cheese model" because of the way it is typically depicted (Fig. 1). For example, each level within the model is often shown as an individual layer or slice of cheese.3 Absent or failed barriers at each level are represented as holes in the cheese (hence, the cheese is "Swiss"). When holes across each level of the system line up, they provide a window of opportunity for an accident or patient harm event to occur.

[FIGURE 1. Swiss cheese model.]

The Swiss cheese model is commonly used to guide root cause analyses (RCAs) and safety efforts across a variety of industries, including health care.4–12 Various safety and RCA frameworks that define the holes in the cheese and their relationships have also been developed, such as the Human Factors Analysis and Classification System (Table 1).2,4–8 The Swiss cheese model and its associated tools are intended to help safety professionals identify holes in each layer of cheese, or level of the system, that could (or did) lead to an adverse event, so they can be addressed and mitigated before causing harm in the future.2

Although the Swiss cheese model has become well known in most safety circles, there are several aspects of its underlying theory that are often misunderstood.13 Without a basic understanding of its theoretical assumptions, the Swiss cheese model can easily be viewed as a rudimentary diagram of a character and slices of cheese. Indeed, some critics have expressed the viewpoint that the Swiss cheese model is an oversimplification of how accidents occur and have attempted to modify the model to make it better equipped to deal with the complexity of human error in health care.14,15 Others have called for the extreme measure of discarding the model in its entirety.16–19 Perhaps the rest of us, who have seen numerous illustrations and superficial references to the model in countless conference presentations, simply consider it passé.

Such reactions are legitimate, particularly when the actual theory underlying the model gets ignored or misinterpreted. Nevertheless, there is much to the Swiss cheese model's underlying assumptions and theory that, when understood, makes it a powerful approach to accident investigation and prevention. Therefore, the purpose of the present article is to discuss several key aspects of the Theory of Active and Latent Failures upon which the Swiss cheese model is based. We will begin by discussing the "holes in the cheese" and reviewing what they represent. As we dive deeper into the model, we will further explore the theoretical nature of these "holes," including their unique characteristics and dynamic interactions. Our hope is that this knowledge of the Swiss cheese model's underlying theory will help overcome many of the criticisms that have been levied against it. A greater appreciation of the Theory of Active and Latent Failures will also help patient safety professionals fully leverage the Swiss cheese model and associated tools to support their RCA and other patient safety activities.

THE THEORY BEHIND THE CHEESE

Let us start by reminding ourselves why Reason's theory is represented using Swiss cheese. As stated previously, the holes in the cheese depict the failure or absence of safety barriers within a system. Some examples could be a nurse misprogramming an infusion pump or an anesthesia resident not providing an adequate briefing when handing off a patient to the intensive care unit. Such occurrences represent failures that threaten the overall safety integrity of the system. If such failures never occurred within a system (i.e., if the system were perfectly safe), then there would not be any holes in the cheese. The cheese would not be Swiss. Rather, the system would be represented by solid slices of cheese, such as cheddar or provolone.

Second, not every hole that exists in a system will lead to an accident. Sometimes holes may be inconsequential. Other times, holes in the cheese may be detected and corrected before something bad happens.20–25 The nurse who inadvertently misprograms an infusion pump, for example, may notice that the value on the pump's display is not right. As a result, the nurse corrects the error and enters the correct value for the patient. This process of detecting and correcting errors occurs all the time, both at work and in our daily lives.20 None of us would be alive today if every error we made proved fatal!
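The layered-barrier idea above can be made concrete with a small back-of-the-envelope sketch. Assuming, purely for illustration, that each level's barrier fails independently with some probability, harm requires an open hole at every level simultaneously, so the per-level probabilities multiply. All numbers below are invented; they are not estimates from the article.

```python
# Back-of-the-envelope sketch of "holes lining up." All probabilities are
# invented for illustration; they are not estimates from the article.
levels = {
    "Organizational Influences": 0.05,     # latent failures
    "Supervisory Factors": 0.08,           # latent failures
    "Preconditions for Unsafe Acts": 0.15,
    "Unsafe Acts": 0.20,                   # active failures
}

# If the layers fail independently, harm requires an open hole at every
# level at the same moment, so the probabilities multiply.
p_harm = 1.0
for p_hole in levels.values():
    p_harm *= p_hole

print(f"P(holes align across all 4 levels) = {p_harm:.6f}")
```

Even fairly leaky individual barriers, stacked this way, drive the joint probability far below any single layer's failure rate; conversely, a latent failure that keeps one hole permanently open effectively removes that layer from the product.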
This point leads to another aspect of the theory underlying the Swiss cheese model: namely, that the holes in the cheese are dynamic, not static.1,2 They open and close throughout the day, allowing the system to function appropriately without catastrophe. This is what human factors engineers call "resilience."26 A resilient system is one that is capable of adapting and adjusting to changes or disturbances in the system to keep functioning safely.26

Moreover, according to the Theory of Active and Latent Failures, holes in the cheese open and close at different rates. In addition, the rate at which holes pop up or disappear is determined by the type of failure the hole represents. Holes that occur at the Unsafe Acts level, and even some at the Preconditions level, represent active failures.1,2 They are called "active" for several reasons. First, they occur most often during the process of actively performing work, such as treating patients, performing surgery, dispensing medication, and so on. Such failures also occur in close proximity, in terms of time and location, to the patient harm event. They are seen as being actively involved in, or directly linked to, the bad outcome. Another reason why they are called active failures is that they actively change during the process of performing work or providing care. They open and close constantly throughout the day as people make errors, catch their errors, and correct them.

In contrast to active failures, latent failures have different characteristics.1,2 Latent failures occur higher up in the system, above the Unsafe Acts level; these include the Organizational, Supervisory, and Preconditions levels. These failures are referred to as "latent" because when they occur or open, they often go undetected. They can lie "dormant" or "latent" in the system for a long period of time before they are recognized. Furthermore, unlike active failures, latent failures do not close or disappear quickly.1,20 They may not even be detected until an adverse event occurs. For example, a hospital may not realize that the pressures it imposes on surgical personnel to increase throughput of surgical patients result in staff cutting corners when cleaning and prepping operating rooms. Only after there is a spike in the number of patients with surgical site infections does this problem become known.

It is also important to realize that most patient harm events are associated with multiple active and latent failures.1,2 Unlike the typical Swiss cheese diagram (Fig. 1), which shows an arrow flying through one hole at each level of the system, there can be a variety of failures at each level that interact to produce an adverse event. In other words, there can be several failures at the Organizational, Supervisory, Preconditions, and Unsafe Acts levels that all lead to patient harm. Although not explicitly stated in the Theory of Active and Latent Failures, research indicates that the number of holes in the cheese associated with accidents is greater at the Unsafe Acts and Preconditions levels and becomes smaller as one progresses upward through the Supervisory and Organizational levels.2,4–8

Given the frequency and dynamic nature of activities associated with providing patient care, there are more opportunities for holes to open up at the Unsafe Acts and Preconditions levels on a daily basis. Consequently, there are often more holes identified at these levels during an RCA investigation. Furthermore, if an organization were to have numerous failures at the Supervisory and Organizational levels, it would not be a viable competitor within its industry. It would have too many accidents to survive. A clear example from aviation is the former airline ValuJet, which had a series of fatal accidents in a short period of time, all because of organizational (i.e., latent) failures.27

TABLE 1. The Human Factors Analysis and Classification System for Health Care

Organizational influences
- Organizational culture: shared values, beliefs, and priorities regarding safety that govern organizational decision making, as well as the willingness of an organization to openly communicate and learn from adverse events
- Operational process: how an organization plans to accomplish its mission, as reflected by its strategic planning, policies/procedures, and corporate oversight
- Resource management: support provided by senior leadership to accomplish the objectives of the organization, including the allocation of human, equipment/facility, and monetary resources

Supervisory factors
- Inadequate supervision: oversight and management of personnel and resources, including training, professional guidance, and engagement
- Planned inappropriate operations: management and assignment of work, including aspects of risk management, staff assignment, work tempo, scheduling, etc
- Failed to correct known problem: instances in which deficiencies among individuals or teams, problems with equipment, or hazards in the environment are known to the supervisor yet are allowed to continue unabated
- Supervisory violations: the willful disregard for existing rules, regulations, instructions, or standard operating procedures by managers or supervisors during the course of their duties

Preconditions for unsafe acts
- Environmental factors
  - Tools and technology: a category encompassing a variety of issues, including the design of equipment and controls, display/interface characteristics, checklist layouts, task factors, and automation
  - Physical environment: a category including the setting in which individuals perform their work, consisting of such things as lighting, layout, noise, clutter, and workplace design
  - Task: the nature of the activities performed by individuals and teams, such as the complexity, criticality, and consistency of assigned work
- Individual factors
  - Mental state: cognitive/emotional conditions that negatively affect performance, such as mental workload, confusion, distraction, memory lapses, pernicious attitudes, misplaced motivation, stress, and frustration
  - Physiological state: medical and/or physiological conditions that preclude safe performance, such as circadian dysrhythmia, physical fatigue, illness, intoxication, dehydration, etc
  - Fitness for duty: off-duty activities that negatively impact performance on the job, such as the failure to adhere to sleep/rest requirements, alcohol restrictions, and other off-duty mandates
- Team factors
  - Communication: the sharing of information among team members, including providing/requesting information and the use of 2-way (positive confirmation) communication
  - Coordination: the interrelationship among team members, including such things as planning, monitoring, and providing back-up where necessary
  - Leadership: the team leader's performance of his or her responsibilities, such as the failure to adopt a leadership role or to model/reinforce principles of teamwork

Unsafe acts
- Errors
  - Decision errors: goal-directed behavior that proceeds as intended, yet the plan proves inadequate or inappropriate for the situation; these errors typically result from a lack of information, knowledge, or experience
  - Skill-based errors: "doing" errors that occur frequently during highly practiced activities and appear as attention failures, memory failures, or errors associated with the technique with which one performs a task
  - Perceptual errors: errors that occur during tasks that rely heavily on sensory information that is obscured, ambiguous, or degraded because of impoverished environmental conditions or a diminished sensory system
- Violations
  - Routine violations: often referred to as "bending the rules"; a type of violation that tends to be habitual by nature, engaged in by others, and tolerated by supervisors and management
  - Exceptional violations: isolated departures from authority, neither typical of the individual nor condoned by management

How the holes in the cheese interact across levels is also important to understand. Given that we tend to find fewer holes as we move up the system, a one-to-one mapping of these failures across levels is not expected. In layman's terms, this means that a single hole at the Supervisory level, for example, can actually result in multiple failures at the Preconditions level, or a single hole at the Preconditions level may result in several holes at the Unsafe Acts level.2,28 This phenomenon is often referred to as a "one-to-many" mapping of causal factors. However, the converse can also be true. For example, multiple causal factors at the Preconditions level might interact to produce a single hole at the Unsafe Acts level. This is referred to as a "many-to-one" mapping of causal factors.

This divergence and convergence of factors across levels is one reason, in addition to others,29,30 that the traditional RCA method of asking "5 Whys" is not a very effective approach. Given that failures across levels can interact in many different ways, there can be multiple causal factor pathways associated with any given event. Thus, unlike the factors that often cause problems with physical systems (e.g., mechanical failures), the causal pathways that lead to human errors and patient harm cannot be reliably identified by simply asking a set number of "why" questions in linear fashion.

Just because there can be several causal pathways for a single event, however, does not mean that every pathway will ultimately lead to organizational roots. There may be some pathways that terminate at the Preconditions or Supervisory levels. Nonetheless, the assumption underlying the Theory of Active and Latent Failures is that all pathways can be linked to latent failures.
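The one-to-many and many-to-one mappings described above can be sketched as a small causal graph. Every factor name below is hypothetical, invented purely for illustration; the point is that enumerating root-to-event pathways yields several branches, whereas a single linear "5 Whys" chain recovers only one of them.

```python
# Hypothetical causal factors for a retained-instrument event, mapped
# child -> contributing parents one level up. Every name here is invented
# purely to illustrate one-to-many and many-to-one mappings.
causes = {
    "retained instrument": ["count skipped", "handoff briefing omitted"],  # Unsafe Acts
    "count skipped": ["production pressure", "new staff untrained"],       # Preconditions
    "handoff briefing omitted": ["production pressure"],
    "production pressure": ["throughput-first scheduling"],                # Supervisory
    "new staff untrained": ["understaffed training program"],
    "throughput-first scheduling": ["revenue-driven priorities"],          # Organizational
    "understaffed training program": ["revenue-driven priorities"],
}

def pathways(event):
    """Enumerate every causal pathway from the event back to a root."""
    parents = causes.get(event, [])
    if not parents:  # no recorded parents: a latent root
        return [[event]]
    return [[event] + tail for parent in parents for tail in pathways(parent)]

for path in pathways("retained instrument"):
    print(" <- ".join(path))
```

This toy graph yields three distinct pathways to the same organizational root; a single chain of "why" questions would walk only one branch and miss the divergence at "count skipped" entirely.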
Therefore, the job of RCA investigators is to always explore the potential contributions of factors across all levels of the system. Whether or not latent failures exist becomes an empirical, not theoretical, question after an accident occurs.

IMPLICATIONS

There are several reasons why it is important to identify holes at each level of a system, or at least to look for them during every safety investigation. The more holes we identify, the more opportunities we have to improve system safety. Another important reason is that the type of hole we identify, be it active or latent, will have a significantly different effect on system safety if we are able to fill it in.

To illustrate this point, let us consider the classic childhood story about a Dutch boy and a leaking dam, written by Dodge31 in 1865. The story is about a young boy, named Hans Brinker, who observes a leak in a dam near his village as he is walking home from school. As he approaches the village, he notices that everyone in the village is panicking. They are terrified that the water leaking through the dam is going to flood the entire village. As a result of their fear, they are immobilized and unable to take action. Nevertheless, the young Dutch boy stays calm. He walks over to the dam and plugs the hole with his finger. The tiny village is saved! They all celebrate and dub Hans a hero.

Although the moral of this story is to stay calm in a time of crisis, there is another moral that we want to emphasize here. From our perspective, the hole in the dam is analogous to an active failure, such as a surgeon leaving a surgical instrument inside a patient. The Dutch boy's act of putting his finger in the hole to stop the leak is analogous to a local fix or corrective action for preventing the surgical instrument from being retained inside another patient. For instance, the corrective action might be to add the retained instrument to the list of materials that are to be counted before closing the surgical incision. Such a fix to the problem is very precise. It focuses on a specific active failure: a retained surgical instrument used in a specific surgical procedure. It can also be implemented relatively quickly. In other words, it is an immediate, localized intervention to address a known hazard, just like Hans plugging the hole in the dam with his finger.

Although such corrective actions are needed and should be implemented, they often do not address the underlying causes of the problem (i.e., the reasons for the leak in the dam). It is just a matter of time before the next hole opens up. But what happens when little Hans ultimately runs out of fingers and toes to plug up the new holes that will eventually occur? Clearly, there is a need for the development of other solutions that are implemented upstream from the village and dam. Perhaps the water upstream can be rerouted, or other dams put in place. Whatever intervention we come up with, it needs to reduce the pressure on the dam so that other holes do not emerge.

Let us return to our example of the retained surgical instrument. At this point in our discussion, we do not really know why the instrument was left in the patient. Perhaps the scrub nurse, surgical technician, or circulating nurse all knew the instrument had been left in the patient but were afraid to speak up because they were worried about how the surgeon would react if they said something. Maybe they were all in a hurry because they felt pressure to turn over the operating room and get it ready for the next surgical patient in the queue. Perhaps there was a shift turnover in the middle of the surgical procedure, and the oncoming surgical staff were unaware that the instrument had been used earlier in the case. There could have been new surgical staff present who were undergoing on-the-job training, which resulted in confusion about who had performed the count.

The point here is that unless we address the underlying latent failures that occurred during the case, our solution to add the retained instrument to the counting protocol will not prevent other "yet-to-be-retained" objects from being retained in the future. Moreover, as other objects do get retained over time, our solution of adding them to the list of items to be counted becomes highly impractical. Even worse, given that the solution of expanding our count list does not actually mitigate the underlying latent failures, other seemingly different active failures are bound to emerge. These may include such active failures as wrong-site surgeries, delayed antibiotic administration, or inappropriate blood transfusion. As a result, we will need to develop interventions to address each of these active failures as well. Like the little Dutch boy, we will ultimately run out of fingers and toes trying to plug up every hole!

This is not to say that directly plugging up an active failure is unimportant. On the contrary, when an active failure is identified, action should be taken to directly address the hazard. However, sometimes such fixes are seen as "system fixes," particularly when they can be easily applied and spread. Returning again to our retained instrument example, the hospital's leadership could decide to require every surgical team to count the instrument (and all others similar to it), regardless of the specific surgical procedure or type of patient. Such a process is considered a system fix because it was "spread throughout the system." In essence, however, what we have actually done is require the entire village to stand watch at the dam to ensure that all future leaks are caught and plugged before the village floods. We have not fixed the system.

To truly improve system safety, interventions need to also address latent failures. These types of interventions will serve to reduce a variety of harmful events throughout the system (i.e., across units, divisions, departments, and the organization). System improvements that address latent failures, such as conflicting policies and procedures, poorly maintained and out-of-date technology, counterproductive incentive systems, lack of leadership engagement, workload and competing priorities, lax oversight and accountability, poor teamwork and communication, and excessive interruptions, are ultimately needed to reduce future leaks and the subsequent demand for more "fingers and toes." Indeed, such system fixes release the villagers from their makeshift role as civil engineers plugging holes in the dam, freeing them to passionately pursue their lives' true calling to the best of their ability.

CONCLUSIONS

Numerous articles on the Swiss cheese model have been published. However, there are several aspects of its underlying theory that are often misunderstood.13 Without a basic understanding of its theoretical assumptions, the Swiss cheese model can easily be criticized for being an oversimplification of how accidents actually occur.16,17 Indeed, many of the criticisms and proposed modifications to the model reflect a lack of appreciation for the theory upon which the model is based. Nonetheless, the goal of this article was not to go point-counterpoint with these critics in an attempt to win a debate. Rather, the purpose of the present article was simply to discuss specific aspects of the Swiss cheese model and its underlying Theory of Active and Latent Failures. This theory explains, in system terms, why accidents happen and how they may be prevented from happening again. When understood, the Swiss cheese model has proven to be an effective foundation for building robust methods to identify and analyze active and latent failures associated with accidents across a variety of complex industries.2–8 A firm grasp of the Swiss cheese model and its underlying theory will prove invaluable when using the model to support RCA investigations and other patient safety efforts.

ACKNOWLEDGMENTS

The project described was supported by the Clinical and Translational Science Award program, through the National Institutes of Health National Center for Advancing Translational Sciences (grant UL1TR002373), as well as the University of Wisconsin School of Medicine and Public Health's Wisconsin Partnership Program. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or the Wisconsin Partnership Program.

REFERENCES

1. Reason JT. Human Error. Cambridge, England: Cambridge University Press; 1990.
2. Shappell SA, Wiegmann DA. A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Burlington, VT: Ashgate Press; 2003.
3. Stein JE, Heiss K. The Swiss cheese model of adverse event occurrence—closing the holes. Semin Pediatr Surg. 2015;24:278–282. doi:10.1053/j.sempedsurg.2015.08.003.
4. Reinach S, Viale A. Application of a human error framework to conduct train accident/incident investigations. Accid Anal Prev. 2006;38:396–406.
5. Li WC, Harris D. Pilot error and its relationship with higher organizational levels: HFACS analysis of 523 accidents. Aviat Space Environ Med. 2006;77:1056–1061.
6. Patterson JM, Shappell SA. Operator error and system deficiencies: analysis of 508 mining incidents and accidents from Queensland, Australia using HFACS. Accid Anal Prev. 2010;42:1379–1385. doi:10.1016/j.aap.2010.02.018.
7. Diller T, Helmrich G, Dunning S, et al. The human factors analysis classification system (HFACS) applied to health care. Am J Med Qual. 2013;29:181–190. doi:10.1177/1062860613491623.
8. Spiess BD, Rotruck J, McCarthy H, et al. Human factors analysis of a near-miss event: oxygen supply failure during cardiopulmonary bypass. J Cardiothorac Vasc Anesth. 2015;29:204–209. doi:10.1053/j.jvca.2014.08.011.
9. Collins SJ, Newhouse R, Porter J, et al. Effectiveness of the surgical safety checklist in correcting errors: a literature review applying Reason's Swiss cheese model. AORN J. 2014;100:65–79. doi:10.1016/j.aorn.2013.07.024.
10. Thonon H, Espeel F, Frederic F, et al. Overlooked guide wire: a multicomplicated Swiss cheese model example. Analysis of a case and review of the literature. Acta Clin Belg. 2019;75:1–7. doi:10.1080/17843286.2019.1592738.
11. Neuhaus C, Huck M, Hofmann G, et al. Applying the Human Factors Analysis and Classification System to critical incident reports in anaesthesiology. Acta Anaesthesiol Scand. 2018;62:1403–1411. doi:10.1111/aas.13213.
12. Durstenfeld MS, Statman S, Dikman A, et al. The Swiss cheese conference: integrating and aligning quality improvement education with hospital patient safety initiatives. Am J Med Qual. 2019;34:590–595. doi:10.1177/1062860618817638.
13. Perneger TV. The Swiss cheese model of safety incidents: are there holes in the metaphor? BMC Health Serv Res. 2005;5:71. doi:10.1186/1472-6963-5-71.
14. Li Y, Thimbleby H. Hot cheese: a processed Swiss cheese model. J R Coll Physicians Edinb. 2014;44:116–121. doi:10.4997/JRCPE.2014.205.
15. Seshia SS, Bryan Young G, Makhinson M, et al. Gating the holes in the Swiss cheese (part I): expanding professor Reason's model for patient safety. J Eval Clin Pract. 2018;24:187–197. doi:10.1111/jep.12847.
16. Hollnagel E, Woods DD. Cognitive systems engineering: new wine in new bottles. Int J Man Mach Stud. 1983;18:583–600.
17. Leveson N. A new accident model for engineering safer systems. Safety Sci. 2004;42:237–270.
18. Dekker SW. Reconstructing human contributions to accidents: the new view on error and performance. J Safety Res. 2002;33:371–385. doi:10.1016/S0022-4375(02)00032-4.
19. Buist M, Middleton S. Aetiology of hospital setting adverse events 1: limitations of the 'Swiss cheese' model. Br J Hosp Med. 2016;77:C170–C174. doi:10.12968/hmed.2016.77.11.C170.
20. Reason J. Human error: models and management. BMJ. 2000;320:768–770. doi:10.1136/bmj.320.7237.768.
21. Kontogiannis T. User strategies in recovering from errors in man-machine systems. Safety Sci. 1999;32:49–68.
22. de Leval MR, Carthey J, Wright DJ, et al. Human factors and cardiac surgery: a multicenter study. J Thorac Cardiovasc Surg. 2000;119:661–672. doi:10.1016/S0022-5223(00)70006-7.
23. King A, Holder MG Jr., Ahmed RA. Errors as allies: error management training in health professions education. BMJ Qual Saf. 2013;22:516–519. doi:10.1136/bmjqs-2012-000945.
24. Law KE, Ray RD, D'Angelo AL, et al. Exploring senior residents' intraoperative error management strategies: a potential measure of performance improvement. J Surg Educ. 2016;73:64–70.
25. Hales BM, Pronovost PJ. The checklist—a tool for error management and performance improvement. J Crit Care. 2006;21:231–235.
26. Hollnagel E, Woods DD, Leveson NC, eds. Resilience Engineering: Concepts and Precepts. Aldershot, UK: Ashgate Publishing, Ltd.; 2006.
27. National Transportation Safety Board. In-flight fire and impact with terrain, ValuJet Airlines flight 592, DC-9-32, N904VJ, Everglades, near Miami, Florida (Report No. AAR-97-06). 1997. Available at: https://www.ntsb.gov/investigations/AccidentReports/Reports/AAR9706.pdf. Accessed January 4, 2021.
28. ElBardissi AW, Wiegmann DA, Dearani JA, et al. Application of the Human Factors Analysis and Classification System methodology to the cardiovascular surgery operating room. Ann Thorac Surg. 2007;83:1412–1419. doi:10.1016/j.athoracsur.2006.11.002.
29. Card AJ. The problem with '5 whys'. BMJ Qual Saf. 2017;26:671–677.
30. Peerally MF, Carr S, Waring J, et al. The problem with root cause analysis. BMJ Qual Saf. 2017;26:417–422.
31. Dodge MM. Hans Brinker; or, the Silver Skates. New York, NY: James O'Kane; 1865.
