

What you don’t know can kill you – or your patient

In the overall scheme of things, it’s not common for a modern-era pilot to be involved in a plane crash, and it’s even less common for a pilot to survive a crash and then go on to be in another. But such is the case of Francis Gary Powers, probably the most famous Cold War-era aviator, who flew the high-altitude U-2 spy plane for the CIA in the 1950s and ’60s.

Gary Powers with U-2 aircraft

In May of 1960, at the height of the Cold War, Dwight Eisenhower was missing a critical piece of evidence on Soviet missile strength. The American press was claiming that there was a huge “missile gap” between the US and the Soviet Union, to the tune of 1,000:1, and that America was in grave danger. Eisenhower simply didn’t believe it, in spite of the Russians’ claims and their threats to “bury” the US with their nuclear supremacy. Eisenhower was about to attend the Paris Summit with Soviet Premier Nikita Khrushchev, and he wanted to make a potentially risky bet: he would call Khrushchev’s bluff at the summit and not be bullied by his threats. But to be safe, Eisenhower needed pictures as evidence that there were, in reality, no missiles on the ground.

So on May 1, 1960, the fabled “May Day” Soviet celebration, Powers took off in his U-2 spy plane from Peshawar, Pakistan, and climbed an incredible 15 miles into the stratosphere. He crossed the southern Soviet border at the Pamir plateau on a 10-hour route over the key nuclear site at Sverdlovsk, the one Soviet site the U-2s had not yet photographed. The ambitious flight plan crossed the entire Soviet land mass, with a landing planned outside Oslo, Norway. But Powers didn’t know one critical piece of information about the airplane he was flying: Lockheed engineers had modified it from the original specifications to save weight, and the massive tail assembly was now held onto the body of the huge jet by only three small 5/8-inch bolts.

New advances in Soviet radar tracking technology allowed a young rocketeer, Mikhail Voronov, to spot the U-2 as Powers crossed north into Soviet airspace. He launched six S-75 Dvina surface-to-air missiles at Powers and, fortunately, only one left the launch pad. Powers was able to spot the rocket coming toward him and, not knowing about the structural weakness of his tail assembly, steered so the rocket would pass “harmlessly” behind him. The rocket detonated more than 200 feet behind the tail of Powers’ aircraft. But that was enough: with only three small bolts holding the tail section to the plane, it was easily torn off. The U-2 spun out of control and plummeted toward the ground at more than two miles a minute, but Powers was still able to eject.

With all the publicity of the shoot-down and of Powers being paraded through Red Square, the Paris Summit was a complete bust. British Prime Minister Harold Macmillan called it “the worst meeting I have ever attended,” Khrushchev stormed out, and no deals to ease the heated Cold War were made. There were certainly a lot of consequences from one small piece of unknown information. (The summit convened on Monday, May 16, 1960, a date historic for another reason that would become more than apparent in the future: it was the same day Theodore Maiman demonstrated the first working laser at the Hughes Research Laboratories in Malibu, California.)

After his “trial” and conviction for spying, Powers spent nearly two years in Soviet prisons. On February 10, 1962, he was brought to Berlin, where he was released in a trade for Rudolf Abel, a Soviet spymaster caught in America. By 1975 Powers had moved to Burbank, California, and was flying news helicopters for the local NBC station. He had flown the same helicopter for several years, and he knew this particular whirlybird had a history of faulty fuel gauges: they read empty even when the craft still had plenty of fuel. On August 1, 1977, Powers was coming back from getting film footage of a brush fire, but he didn’t know another critical bit of information about his familiar craft. Mechanics had fixed the fuel gauges after his last flight, and nobody told Powers anything about it. He flew toward home as the fuel gauges pointed to empty, ran out of fuel, and crashed in the Los Angeles area. One of America’s most famous pilots died because he didn’t know someone had fixed his fuel gauges. “After all he’d been through, what a way to go,” the FAA investigator told the local newspaper. Once again, what Francis Gary Powers didn’t know ended in a fatal accident. He was buried in Section 11 of Arlington National Cemetery.

Because of these events and other similar tragic communication failures, aviation safety has developed precise tools for the accurate communication of critical information. So what can those of us who strive to adopt the high reliability mindset in our practice of medicine learn from these events? Powers’ life was shaped, and eventually ended, by two crucial facts he didn’t know. Not knowing critical information isn’t always fatal, but if you tempt fate too often, it certainly can be. Precise, complete and accurate communication of information is among the most critical aspects of the HRO mindset and of high reliability team function. It is just as important to make sure that all members of the team have ALL of the important information, that they understand it, and that they will carry out their tasks to account for the impact of that information on the patient. In previous blogs I have talked about how the high reliability mindset teaches adopters the “Johari window” model of information management, which shows us how to move critical information out into the open so all team members can act on it. Joint Commission data posted on its web site indicate that the root cause of 67% of fatal sentinel events is a failure of accurate, timely and complete communication. Communication of critical data is a fundamental principle of error avoidance for HRO mindset adopters and highly reliable teams. What you don’t know can be fatal, and making critical decisions when information is hidden, unshared or unknown can end in disaster for our patients.


