
"Swiss Cheese" Failure Model

"Swiss Cheese" Failure Model

11 comments · July 2, 2025

shelajev

what a strange place to link to. Even Wikipedia has a better entry [1].

The model itself is fun to think about: preventing failures by stuffing more cheese into the system. If you're interested, the classic example of the cheese failure is Chernobyl, where many different things had to fail in order to become a catastrophe.

--- [1] https://en.wikipedia.org/wiki/Swiss_cheese_model

hiddencost

Pretty sure they're just farming for site rep. The ads make it unusable.

joeatwork

The page doesn't say it, but this is why adding redundant safety systems and defense in depth stops working after a while: such systems end up running with (acceptable, unobserved) "holes", and the more complex the system, the harder it is to perceive the holes, until one day they line up and become very obvious indeed.

mattlondon

Well I think that this is actually the whole rationale for adding redundant safety systems: you are going to have "holes" even if you don't know it, so add another system and hopefully the holes don't line up. I don't think this is an argument for not adding more; if anything it is the opposite, surely?

Obviously at some point you say enough is enough, no more cheese. I guess the nuance is how much cheese is enough.
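To put rough numbers on that intuition, here is a minimal sketch; the independence assumption and the example probabilities are mine, not anything the model itself specifies:

    # Chance a hazard slips through every layer, assuming each layer's
    # "holes" are independent and cover a fraction p of that layer.
    def breach_probability(hole_probs):
        result = 1.0
        for p in hole_probs:
            result *= p
        return result

    print(breach_probability([0.10]))              # one layer: 0.1
    print(breach_probability([0.10, 0.10, 0.10]))  # three layers: 0.001
    print(breach_probability([0.01]))              # one better layer: 0.01

Under that (strong) independence assumption, three mediocre layers beat one much better one; the model's real warning is that the assumption breaks down when the holes are correlated, or drift into alignment unobserved.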

robviren

I worked as a field engineer in nuclear power, maintaining reactors. Every plant you go to usually requires taking a training module before you can get in. Every single one had at least a slide showing the Swiss cheese model of "defence in depth". I think it is a fairly good visual and applies well to safety. Every mitigation is silly until it's not.

pstadler

This model is applied in aviation safety. Mentour Pilot[0] references it from time to time in his videos, mostly when existing systems and/or procedures fail to prevent accidents from happening.

[0] https://youtube.com/@mentourpilot

tialaramex

Also a shout out in this context to ASRS, which is run by NASA. Their job is to take people's reports of things which didn't become accidents but could have, anonymize them, and then analyse them statistically.

https://asrs.arc.nasa.gov/

findthewords

I've heard that catastrophes occur when three things go wrong (at the same time).

taneq

Yeah, and that the obvious “cause” of the catastrophe is the third or fourth etc. failure in a row that allowed the situation to snowball into something truly damaging. Reminds me (on the other side) of the “five whys” approach to root cause analysis.

webdevver

im more of a salami tactics guy myself
