Patient Safety, Swiss Cheese and the Secret Service

I was listening to the news on my way to work last week, and heard a story about the review conducted after the well-publicized security breach at the White House. Like many people, I was shocked when the story of the fence-jumper first broke. How was it possible that some guy with a knife managed to get over the fence, cross the lawn, enter the White House and get deep into the building before he was stopped? The answer, according to NPR’s reporting of the Department of Homeland Security investigation is that a whole sequence of events made it possible:

It turns out that the top part of the fence that he climbed over was broken, and it didn’t have that kind of ornamental spike that might have slowed him down. Gonzalez then set off alarms when he got over the fence, and an officer assigned to the alarm board announced over the Secret Service radio there was a jumper. But they didn’t know the radio couldn’t override other normal radio traffic. Other officers said they didn’t see Gonzalez because of a construction project along the fence line itself. And in one of the most perhaps striking breaches, a K-9 officer was in his Secret Service van on the White House driveway. But he was talking on his personal cell phone when this happened. He didn’t have his radio earpiece in his ear. His backup radio was in his locker. Officers did pursue Gonzalez, but they didn’t fire because they didn’t think he was armed. He did have a knife. He went through some bushes that officers thought were impenetrable, but he was able to get through them and to the front door. And then an alarm that would’ve alerted an officer inside the front door was muted, and she was overpowered by Gonzalez when he burst through the door. So just a string of miscues.

The explanation rang true. Of course it was no “one thing” that went wrong; it was a series of events, no one of which in isolation was sufficient to cause a problem but which, strung together, led to a catastrophic system failure. The explanation also sounded familiar. It is a perfect example of the “Swiss cheese” conceptual model of patient safety.

First articulated by James Reason, the Swiss cheese model holds that serious adverse events in complex systems are generally the consequence of multiple failures, not the fault of a single individual. In the case of a serious patient harm event (e.g., operating on the wrong body part), thoughtful analysis inevitably finds that many things had to go wrong for the surgery to occur. Indeed, just as the Secret Service has multiple layers of barriers around the White House to prevent an intruder from reaching the President, patient safety experts speak of “layers of defense” within medical systems that are designed to ensure that small errors caused by human frailty don’t allow harm to “reach” the patient.

The “Swiss cheese” description derives from the visual shorthand of imagining a series of slices of Swiss cheese, each of which represents a system defense. In the case of the White House, the perimeter fence, the guard dog and the building alarm are each like separate slices of cheese. The holes represent imperfections or failures in each slice. For the intruder to get through them all, the holes in the cheese have to line up in a particular way. If the holes don’t line up (the fence fails, but the dogs respond), then the system works.
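The layered-defense intuition can be made concrete with a toy simulation (a sketch of my own, not from the post; the layer count and failure probabilities are made-up numbers, and real defenses are rarely fully independent):

```python
import random

# Toy sketch of the Swiss cheese model: each "slice" is an independent
# layer of defense, and each has some probability of failure -- a "hole".
# Harm gets through only when every layer fails at once, i.e. the holes
# all line up.

def estimate_harm_rate(failure_probs, trials=100_000, seed=42):
    """Monte Carlo estimate of how often all defense layers fail together."""
    rng = random.Random(seed)
    harms = sum(
        all(rng.random() < p for p in failure_probs)  # did every hole line up?
        for _ in range(trials)
    )
    return harms / trials

# Five fairly leaky layers (10% failure each) still make harm very rare:
# under independence, the chance all five fail is 0.1**5, or 1 in 100,000.
print(estimate_harm_rate([0.1] * 5))
```

The point of the sketch is that no single slice needs to be perfect; stacking several imperfect, independent defenses multiplies their small failure probabilities into a very small chance of harm reaching the patient.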

For a wrong-side surgery to occur, it may take a similar string of failures: maybe the surgical drape covered the surgeon’s pre-op marking, the patient had bilateral disease, the surgeon was working in an unfamiliar OR, and so on.

Addressing patient (and Presidential) safety is almost never about finding the single person who failed at his or her task, or about an easy fix. It is about understanding how complex systems work and creating a culture of safety to continuously improve them. I hope the Secret Service takes that approach, instead of just fixing the fence and firing the guy who was on his cell phone.

What do you think?

4 thoughts on “Patient Safety, Swiss Cheese and the Secret Service”

  1. I had heard/learned at one point that it takes, on average, five interrelated miscues/breakdowns to cause an accident of any kind.

  2. A very thoughtful application of Reason’s theory of human error. I completely agree that it is a series of small failures, all lined up, that leads to a potentially catastrophic event in complex systems. This approach would be clearly applicable to many daily-life situations where something goes seriously wrong, not just in medicine, if we only took the time to analyze them. But, of course, corrective action needs an in-depth examination and fair assignment of accountability. With your permission, I would like to use your example in my upcoming culture-of-safety lecture.

    1. Thanks for your comment. I agree that this kind of analysis has broad applicability. Sadly, the response to many such situations is to look for someone to blame instead of seeking understanding of the failure.
      I would be happy to have you use this example in your course!

  3. Root cause analysis would benefit greatly from eliminating silo thinking and keeping fiefdoms from turning discussion into food fights.
