
Doctors Don’t Die, Pilots Do (Sometimes) and Parachutes Work

Early in this series of posts we need to address the differences between healthcare culture and aviation safety culture, because there are some significant ones. I give many lectures and talks to doctors about “aviation safety models” and the high reliability mindset. Almost always, a well-intentioned member of the audience raises the same objection, along the lines of “pilots are more interested in safety ’cause they die if they make a mistake, and this stuff doesn’t apply in healthcare because doctors don’t die if they make a mistake.” The other point almost always raised is that there is far more variability in the biology of patients than in the mechanics of airplanes, and it goes something like, “my patients are not airplanes, my operating room is not an airplane cockpit, and I’m not a pilot, so this won’t work for me.” I don’t accept either statement as true.

There is no doubt that a major pilot error is likely to end up as one of those horrific scenes replayed on the 24-hour news networks and to result in the pilot’s injury or death, and there is no question that this serves as a huge incentive to get it right. However, I am a pilot, I know hundreds of other pilots, and my non-scientific, unofficial poll of pilot opinion reveals that 100% of them vehemently reject the suggestion that, given a certainty they would survive a crash, they would knowingly allow anything to occur that would harm or kill their passengers. Perhaps this is the aviation equivalent of “Primum non nocere” (“first, do no harm”), the famous dictum attributed to Hippocrates (although why a Greek would write in Latin has never been clear to me, and contrary to common belief it is not part of his famous oath), which admonishes physicians to always protect and act in the best interest of our patients.

I am also a surgeon and have taken up a scalpel tens of thousands of times to repair diseased or injured organs, but I remember only a small percentage of those patients and their procedures. The ones I do vividly recall are the patients (fortunately few) who didn’t survive. I did an equally unscientific poll among the hundreds of doctors I know, and they all agree with me when I admit that I have experienced no worse feeling than the thought that I might have missed something, or might have done something differently that would have changed a bad outcome. And yes, I do get to go home and my patient doesn’t. But under those sad circumstances, I always wish it could be the other way around. I believe that if a physician ever loses that feeling of profound loss over the death of a patient, it is time to seek other employment and stop practicing medicine.

While on the topic of feelings, there is no difference between the feeling in the pit of my stomach when things are not going well in the operating room and when things are not going well in my airplane, especially when I am carrying passengers. The impact of stress on my thinking, my alertness, and my skills, whether surgical or aviation, is the same when circumstances turn to unexpected events. Pilots are as concerned about the welfare of their charges (passengers) as doctors are about theirs (patients). If anything, it could be argued that a pilot is more likely to make an error, because his or her own life is also in jeopardy and the negative impact of stress makes clear thinking and accurate decision making more difficult. Take as evidence that high reliability training can prepare operators to perform accurately under emergent and unforeseen circumstances, as when Captain Chesley Sullenberger safely ditched a crippled Airbus in the Hudson River with no loss of life. This is a lesson that would be invaluable for surgeons and trauma docs, who must make urgent decisions and carry out difficult, complex procedures under the pressures generated by time compression and dire patient conditions. So, with that said, I reject the premise that doctors are, or should be, less vigilant against fatal error simply because they will survive those mistakes; there is no less pressure or fervor for a “safe landing” for our patients than pilots feel for their passengers.

The second common statement, that the high reliability mindset does not apply to healthcare since doctors are not pilots, is equally specious. Physicians are clearly data-driven and evidence-based in their practice of medicine. But that does not mean every medication or treatment needs to be proven over and over for each subset of patients. If a treatment is efficacious in a 30-year-old patient, it is equally valuable in a 40-year-old without another randomized, double-blind, controlled study to prove the point. This was the topic of a brilliant article by Gordon Smith and Jill Pell entitled “Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials” in the British Medical Journal (BMJ 2003;327:1459-61). The essence of their argument is that advocates of evidence based medicine have criticized the adoption of interventions evaluated using only observational data; as with many interventions intended to prevent ill health, the effectiveness of parachutes has not been subjected to rigorous evaluation by randomized controlled trials. They propose, tongue in cheek, that healthcare might benefit if the most radical protagonists of evidence based medicine organized and participated in a double blind, randomized, placebo controlled, crossover trial of the parachute. In short, some things work and do not need to be re-proven over and over in every domain of human behavior and error avoidance. Such is the case with the applicability of high reliability safety theory to patient safety.

The sources of human error are thoroughly studied and well known, and they apply to all human error, whether in healthcare, nuclear power plants, or airline cockpits. High Reliability Organization (HRO) safety theory embodies skills, methods, and techniques to mitigate human error and to catch small early missteps before they are allowed to spiral into disaster. These methods need only be understood and applied to patient safety, not reinvented and re-proven. Certainly the sources of equipment failure, and even some system failures, differ across organizations, but the hallmark of HROs is the way human operators are trained to deal with, and recover from, those failures to prevent catastrophe. It is exactly these principles that will reduce errors in patient care. So that’s what I think; what do you think?
