Swiss cheese model
- For the Swiss Cheese model in physical cosmology, see large-scale structure of the cosmos, galaxy filament, and supercluster.
The Swiss cheese model of accident causation is a model used in risk analysis and risk management, including aviation safety, engineering, and healthcare, and is the principle behind layered security, as used in computer security and defense in depth. It likens human systems to multiple slices of Swiss cheese, stacked side by side, in which the risk of a threat becoming a reality is mitigated by the differing layers and types of defenses "layered" behind one another. In theory, therefore, lapses and weaknesses in one defense do not allow a risk to materialize, since other defenses exist to guard against a single point of weakness. The model was originally propounded by Dante Orlandella and James T. Reason of the University of Manchester,[1] and has since gained widespread acceptance. It is sometimes called the "cumulative act effect".
Although the Swiss cheese model is respected and considered a useful method of relating concepts, it has been criticized as being applied too broadly, and without enough support from other models or investigation.[2]
Failure domains
Reason hypothesized that most accidents can be traced to one or more of four failure domains: organizational influences, supervision, preconditions, and specific acts. Preconditions for unsafe acts include a fatigued air crew and improper communication practices. Unsafe supervision encompasses, for example, pairing inexperienced pilots on a night flight into known adverse weather. Organizational influences encompass such things as reduction in expenditure on pilot training in times of financial austerity.[3]
Holes and slices
In the Swiss cheese model, an organisation's defenses against failure are modeled as a series of barriers, represented as slices of cheese. The holes in the slices represent weaknesses in individual parts of the system, and they continually vary in size and position across the slices. The system produces failures when holes in all of the slices momentarily align, permitting (in Reason's words) "a trajectory of accident opportunity", so that a hazard passes through holes in every slice, leading to a failure.[4][5][6]
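The alignment mechanism can be sketched numerically. In this minimal, hypothetical Python illustration (an assumption for exposition, not taken from the cited sources), each slice has an independent "hole" covering some fraction of its area, and a hazard becomes an accident only when every hole lines up at once:

```python
import random

# Hedged sketch: model each defensive layer as a cheese slice whose hole
# covers a fraction p of the slice. If the holes are independent, a hazard
# passes all n layers with probability prod(p_i) -- Reason's "trajectory of
# accident opportunity" requires every hole to line up simultaneously.

def accident_probability(hole_fractions):
    """Chance that one hazard finds an aligned hole in every slice."""
    prob = 1.0
    for p in hole_fractions:
        prob *= p
    return prob

def simulate(hole_fractions, trials=100_000, seed=42):
    """Monte Carlo check: count hazards that slip through every layer."""
    rng = random.Random(seed)
    hits = sum(
        all(rng.random() < p for p in hole_fractions)
        for _ in range(trials)
    )
    return hits / trials

layers = [0.1, 0.2, 0.3]              # three imperfect, independent defenses
print(accident_probability(layers))   # ~0.006, far below any single layer
```

Because the per-layer probabilities multiply, adding even a leaky extra slice shrinks the overall risk, which is the model's argument for defense in depth; conversely, correlated holes (a single organizational cause weakening several layers at once) would break the independence assumption and raise the true risk above this product.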
Frosch[7] described Reason's model in mathematical terms as a model in percolation theory, which he analyses as a Bethe lattice.
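The percolation reading can be made concrete with a standard branching-process calculation (a sketch under the usual textbook assumptions, not taken from Frosch's chapter): on a Bethe lattice with coordination number z, each node beyond the root has z − 1 children, so an infinite cluster of connected "holes" can exist only when the per-bond probability p exceeds the critical value p_c = 1/(z − 1):

```python
# Percolation threshold on a Bethe lattice (Cayley tree). Illustrative
# sketch only: with coordination number z, open-bond percolation from the
# root is a branching process with mean offspring p * (z - 1), so an
# infinite cluster is possible only when p exceeds p_c = 1 / (z - 1).

def survival_probability(p, z, iterations=10_000):
    """Probability that the open cluster containing the root is infinite.

    The extinction probability q of the branching process satisfies
    q = (1 - p + p*q) ** (z - 1); iterating from 0 converges to the
    smallest fixed point, and survival is 1 - q.
    """
    b = z - 1
    q = 0.0
    for _ in range(iterations):
        q = (1.0 - p + p * q) ** b
    return 1.0 - q

z = 4                                  # threshold p_c = 1/(z - 1) = 1/3
print(survival_probability(0.2, z))    # below threshold: no infinite cluster
print(survival_probability(0.5, z))    # above threshold: positive survival
```

In this analogy, sub-threshold systems confine failures locally, while super-threshold systems admit system-spanning failure paths.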
Active and latent failures
The Swiss Cheese model includes both active and latent failures. Active failures encompass the unsafe acts that can be directly linked to an accident, such as (in the case of aircraft accidents) pilot error. Latent failures include contributory factors that may lie dormant for days, weeks, or months until they contribute to the accident. Latent failures span the first three domains of failure in Reason's model.[3]
Applications
The same framework applies in healthcare. For example, a latent failure could be the similar packaging of two drugs that are then stored close to each other in a pharmacy. Such a failure would be a contributory factor in the administration of the wrong drug to a patient. Such research led to the realization that medical error can be the result of "system flaws, not character flaws", and that greed, ignorance, malice or laziness are not the only causes of error.[8]
Lubnau, Lubnau, and Okray[9] apply the model to the engineering of firefighting systems, aiming to reduce human errors by "inserting additional layers of cheese into the system", namely the techniques of Crew Resource Management.
The Swiss cheese model is one of many accident-causation models listed, with references, by Taylor, Easter, and Hegney.[10]
See also
- Accident
- Aviation safety
- Iteration
- Proximate cause
- Proximate and ultimate causation
- Root cause analysis
- Safety
- Systems engineering
- Systems modelling
References
- [1] Reason 1990.
- [2] "Revisiting the Swiss cheese model of accidents" (PDF). Eurocontrol. October 2006.
- [3] Douglas A. Wiegmann & Scott A. Shappell (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate Publishing, Ltd. pp. 48–49. ISBN 0754618730.
- [4] Daryl Raymond Smith; David Frazier; L W Reithmaier & James C Miller (2001). Controlling Pilot Error. McGraw-Hill Professional. p. 10. ISBN 0071373187.
- [5] Jo. H. Wilson; Andrew Symon; Josephine Williams & John Tingle (2002). Clinical Risk Management in Midwifery: the right to a perfect baby? Elsevier Health Sciences. pp. 4–6. ISBN 0750628510.
- [6] Tim Amos & Peter Snowden (2005). "Risk management". In Adrian J. B. James; Tim Kendall & Adrian Worrall (eds.). Clinical Governance in Mental Health and Learning Disability Services: A Practical Guide. Gaskell. p. 176. ISBN 1904671128.
- [7] Robert A. Frosch (2006). "Notes toward a theory of the management of vulnerability". In Philip E. Auerswald; Lewis M. Branscomb; Todd M. La Porte & Erwann Michel-Kerjan (eds.). Seeds of Disaster, Roots of Response: How Private Action Can Reduce Public Vulnerability. Cambridge University Press. p. 88. ISBN 0521857961.
- [8] Patricia Hinton-Walker; Gaya Carlton; Lela Holden & Patricia W. Stone (2006-06-30). "The intersection of patient safety and nursing research". In Joyce J. Fitzpatrick & Patricia Hinton-Walker (eds.). Annual Review of Nursing Research Volume 24: Focus on Patient Safety. Springer Publishing. pp. 8–9. ISBN 0826141366.
- [9] Thomas Lubnau II; Randy Okray & Thomas Lubnau (2004). Crew Resource Management for the Fire Service. PennWell Books. pp. 20–21. ISBN 1593700067.
- [10] Taylor, G.A.; Easter, K.M.; Hegney, R.P. (2004). Enhancing Occupational Safety and Health. Elsevier. pp. 241–245; see also pp. 140–141 and 147–153. ISBN 0750661976.
Further reading
- Reason, James (2000-03-18). "Human error: models and management". British Medical Journal. 320 (7237): 768–770. doi:10.1136/bmj.320.7237.768. PMC 1117770. PMID 10720363.
- Reason, James (1990-04-12). "The Contribution of Latent Human Failures to the Breakdown of Complex Systems". Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 327 (1241): 475–484. doi:10.1098/rstb.1990.0090. (read online: JSTOR)
- Reason, James (1997). Managing the risks of organizational accidents. Aldershot: Ashgate. ISBN 1840141042.
- Reason, James (1995). "A System Approach to Organizational Error". Ergonomics. 38: 1708–1721. doi:10.1080/00140139508925221.
- Bayley, Carol (2004). "What medical errors can tell us about management mistakes". In Paul B. Hofmann and Frankie Perry. Management Mistakes in Healthcare: Identification, Correction, and Prevention. Cambridge University Press. pp. 74–75. ISBN 0521829003.
- Garland, Daniel J.; Westrum, Ron; Adamski, Anthony J. (1998). "Organizational factors associated with safety and mission success in aviation environments". Handbook of Aviation Human Factors. Lawrence Erlbaum Associates. p. 84. ISBN 0805816801. — Westrum and Adamski relate Reason's Swiss Cheese model to Westrum's "human envelope" model, where "around every complex operation there is a human envelope that develops, operates, maintains, interfaces, and evaluates the function of the sociotechnological system" and where the system "depends on the integrity of this envelope, on its thickness and strength". Westrum models latent failures as voids within this envelope, and active failures as factors external to the envelope that are acting to breach it.
- Shappell, Scott A.; Wiegmann, Douglas A. (February 2000). "The Human Factors Analysis and Classification System—HFACS: The "Swiss cheese" model of accident causation". National Technical Information Service.
- Horn, John R.; Hansten, Philip D. (2004). "Sources of Error in Drug Interactions: The Swiss Cheese Model". Pharmacy Times.
- Perneger, Thomas V. (2005-11-09). "The Swiss cheese model of safety incidents: are there holes in the metaphor?". BMC Health Services Research. BioMed Central Ltd. 5 (71). doi:10.1186/1472-6963-5-71.
- Young, M.S.; Shorrock, S.T.; Faulkner, J.P.E (2005-06-14). "Seeking and finding organisational accident causes: Comments on the Swiss cheese model". Department of Aviation, University of New South Wales. — a reminder that while Reason's model extends causation to latent failures, this is not at the expense of eliminating active failure entirely.
- Nance, John J. (2005-04-12). "Just How Secure Is Airline Security?: The Swiss Cheese Model and What We've Really Accomplished Since 9/11". ABC News.
- Luxhøj, James T.; Kauffeld, Kimberlee (2003). "Evaluating the Effect of Technology Insertion into the National Airspace System". The Rutgers Scholar. 3.
- Howell, Elizabeth A.; Chassin, Mark R. (May 2006). "Right? Left? Neither!". Morbidity & Mortality Rounds on the Web. Agency for Healthcare Research and Quality. — the application of the Swiss Cheese model to a specific case of medical error