

Predictions, Baseball and other Lifesavers

Although usually attributed to Yogi Berra, it was really Yogi’s teammate Casey Stengel who said, “You should never make predictions, especially if they’re about the future”.

Cute, but is it really true? Is it so hard to predict the future? I don’t think so, and we actually do it all the time. For instance, “If you drive when you’re drunk, you’ll have an accident or end up in jail.” That’s a pretty reasonable prediction given our justifiable cultural and legal climate against hurting others with your own negligence. Or how about, “If you play with matches, then you’re going to get burned.” That’s another pretty good prediction about a very likely future outcome.

High reliability theorists have developed a “comprehensive model for evaluating the probability of a human error occurring throughout the completion of a specific task” – those of us down here on the ground would just call that a prediction. The methodology is known as the “Human Error Assessment and Reduction Technique,” which lends itself to the convenient acronym HEART and is based largely on research done for the nuclear power industry by J.C. Williams (Williams JC. A data-based method for assessing and reducing human error to improve operational performance. 4th IEEE Conference on Human Factors in Nuclear Power Plants, Monterey, CA, June 1988, 436–50). A HEART analysis examines all the factors and circumstances that existed at the time of an adverse event or significant error and uses those circumstances to build a predictive model of when bad things are likely to occur. Williams calls such sets of circumstances “error-producing conditions” (EPCs). The theory has been widely validated in accident studies, which found that these combinations of conditions form a common thread of contributing factors to major incidents. Mathematical modeling of these findings has produced formulae showing that when a given set of circumstances is in place, there is a high probability that an error will result, and those formulae serve as very useful predictive tools for determining when human error is more likely to occur.
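For readers who like to see the arithmetic, here is a minimal sketch of how a HEART-style calculation combines a task’s baseline error probability with the conditions present at the time. The baseline unreliability, multipliers, and weightings below are illustrative assumptions for this post, not values taken from Williams’ published tables; the standard form of the calculation scales the nominal unreliability by (multiplier − 1) × assessed proportion + 1 for each condition.

# A minimal sketch of a HEART-style calculation, assuming the standard
# HEART weighting formula. The nominal task unreliability and the EPC
# multipliers/proportions below are hypothetical, for illustration only.

def heart_error_probability(nominal_unreliability, epcs):
    """Scale a task's nominal error probability by each error-producing
    condition: 'multiplier' is how much worse that condition can make things,
    'proportion' (0 to 1) is how strongly it applies in this situation."""
    probability = nominal_unreliability
    for multiplier, proportion in epcs:
        probability *= (multiplier - 1.0) * proportion + 1.0
    return min(probability, 1.0)  # a probability cannot exceed 1

# Hypothetical case: a routine task done while fatigued, rushed, and
# with some information overload (illustrative numbers only).
epcs = [
    (11.0, 0.4),  # fatigue
    (6.0, 0.5),   # time and task compression
    (3.0, 0.3),   # information overload
]
print(heart_error_probability(0.003, epcs))  # about 0.08 vs. a 0.003 baseline

The exact numbers matter far less than the shape of the result: stacking even a few error-producing conditions multiplies the baseline risk many times over, which is exactly the kind of prediction the rest of this post is about.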

Those of us who advocate and practice the High Reliability Mindset understand that to prevent errors and accidents, we need to be able to predict the times when error is more likely than not to occur, and these models allow us to make those predictions. EPCs link traditional systems approaches to error with advanced human factors analysis of individual performance. We do this intuitively in our everyday lives – it’s not a crystal ball that drives that little voice in the back of our heads to warn us not to drive after drinking or play with matches; it’s just putting our common experiences of 2 + 2 together. When certain conditions exist, the likely outcome will be bad.

 

What are some common Error-Producing Conditions?

Take a guess (no peeking ahead) at some of the top EPCs described in the HEART analyses of bad outcomes and serious incidents. Start by thinking about the times you are most likely to forget something or mess something up, and you will see that the #1 most common condition that leads to error is fatigue. Accident studies have confirmed that high-performance crews are almost 50 times more likely to have an accident when they are fatigued. Fatigue was a major factor implicated in the NTSB investigation of the only recent commercial aviation accident, Continental Flight 3407, two winters ago in Buffalo, New York. This is such a big factor that airline pilots are limited to roughly eight hours of flying in a duty period, and if the flight lasts longer a relief pilot is on board to let the primary crew get some rest. Physicians don’t have laws that limit our time at work, so we must be extra vigilant to avoid errors when we know we are tired.

Another very common EPC is time pressure, or simply being rushed – known in the literature as time and task compression. Think about being really late for a very important event, running out of the house, letting the door lock behind you, and realizing you left your car keys inside. Think about being in the operating room suite where you are pressured to make a decision because of the deteriorating condition of the patient. This is a likely time for an error and an important time to be AWARE that you are more prone to mistakes; that awareness makes you more likely to avoid them.

Another common EPC is called “one-way decision gates,” which reflects the simple human trait (and especially the surgeon’s trait) of being so sure you are right that you are unable to call a mental “about face,” go back, and re-evaluate a decision based on new or evolving conditions. Once we make up our minds about something, most of us have trouble changing our decisions without some very direct prompting from a team member.

Another EPC is called “faulty risk perception,” which occurs when the operator fails to appreciate the various risks and pitfalls of the current situation and therefore takes the decision too lightly.

Yet another important EPC is known as “normalization of deviance,” also known as “bending the rules” – the tendency to cut corners, for example not really doing a thorough time-out since nobody else really does. Gotcha – that’s just the time nobody notices the patient was not given pre-op antibiotics. The case goes on, the patient gets a wound infection that may or may not have anything to do with the fact that you didn’t follow the American College of Surgeons guidelines – but tell that to the patient (and the jury).

Information overload has also been identified as an EPC. It occurs when the operator is overwhelmed with information or extraneous inputs and becomes distracted, unable to mentally process all the details, which leads to error.

Another EPC identified in these studies is poor information transfer, or simply not sharing important clinical facts. This has been addressed in this blog before and cannot be over-emphasized as a likely cause of bad outcomes.

The last of the top ten most common EPCs is inadequate standardization, which means that simple protocols and standards of care are not followed. Standards are there for a reason: the evidence supports them as the best way to treat a problem. Although there are exceptions, the weight of clinical studies has produced these standards and they should be followed.

So even if he didn’t come up with that zinger about predictions, it was Yogi who said, “The future ain’t what it used to be,” and that is something I do believe is true. There’s just not as much mystery about the future anymore. By adding a heightened awareness of these error-producing conditions to our practice of the high reliability mindset, we can predict the future. For advocates and adopters of the high reliability mindset, applying this theory to our common practices of patient care gives us the ability to anticipate when patients are at increased risk for bad outcomes and to maintain an especially high degree of vigilance during those times. This mindset is our only defense, and when we hear AND heed that little voice in the back of our heads, special measures can then be taken to reduce the likelihood of errors. We can strive to avoid conditions that lead to errors, but if those conditions are inevitable because they came bundled up with our patient, we can recognize that they exist and be on guard for the errors that are more likely to follow.
