## CONCATENATION AND EXPECTED VALUES

A bloody picture of a cyclist adorned my Facebook page.  The writer was succinct:

How I joined the walking dead:

1. Rented a bike with defective brakes.

2. Started riding through a long dark RR tunnel.

3. Encountered a multi-family group with very small children in tow coming the other way.

4. Wiped out trying to avoid scattering kids like bowling pins.

This is a classic description of an accident that fortunately was not tragic.  No single one of those incidents might have been sufficient, but together they produced a bloody rider.  There was a concatenation of events.  Sometimes, we have a concatenation of errors.

I had my own sixteen years ago this month:

1. Took part in a long distance bicycle tour only a few months after starting to ride a road bike.
2. Ended up wet and tired on a rainy day after crossing three Colorado passes, eager to get to the school where we would be camping.
3. Saw a car in the turn lane headed towards me.  I had limited experience riding a bicycle in traffic.
4. Assumed the driver saw me.
5. The car suddenly turned in front of me.
6. Too late, with wet brakes, I skidded and landed on my right hip, trying to avoid him.  I wasn’t the walking dead, but I didn’t walk normally for several months, and I’m lucky I can walk today.

It’s worth discussing the concept of the expected value of an event, like winning the lottery.  People see 2 winners in the last lottery and buy tickets because, after all, they could win.  It has to be somebody.  That is usually true.  If nobody wins, the jackpot grows until the payoff is so large, and so many tickets are sold, that somebody (or several people) almost certainly will win.
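The claim that somebody "almost certainly will win" can be made concrete.  A minimal sketch, using illustrative odds and ticket counts (the same round numbers as below, not any particular lottery):

```python
# Probability that at least one of n tickets wins, assuming each ticket
# independently wins with probability p: 1 - (1 - p)**n.
def prob_somebody_wins(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# Illustrative numbers: 1-in-110-million odds, 440 million tickets sold.
p = 1 / 110_000_000
n = 440_000_000
print(f"P(at least one winner) = {prob_somebody_wins(p, n):.2f}")  # about 0.98
```

With those numbers, the chance that nobody at all wins is only about 2% — which is why a huge jackpot with huge ticket sales almost always produces a winner.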

If the probability of an occurrence is extremely small, constant, and not zero, and the number of chances for it to happen is very large, the expected number of occurrences is their product.  A probability of 1 in 110 million of winning × 440 million lottery tickets sold gives an expected value of 4 winners.  It’s that easy.  Low-probability events, like automobile fatalities, occur every day, because so many people drive.  Expected values are just that: they are expected, but they are not guaranteed to occur.
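The arithmetic above, written out with the same illustrative numbers:

```python
# Expected number of winners = per-ticket win probability × tickets sold.
# Illustrative numbers from the text: 1 in 110 million, 440 million sold.
p_win = 1 / 110_000_000
tickets_sold = 440_000_000
expected_winners = p_win * tickets_sold
print(f"Expected winners: {expected_winners:.1f}")  # Expected winners: 4.0
```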

Aviation, perhaps more than any other endeavor, has taken safety to heart, because aviation is so unforgiving of errors.  Additionally, aviation has a large number of events, called flights, each with a low but non-zero probability of a crash.  Aviation has worked to improve those probabilities, and in commercial aviation there have been multiple years, often consecutive, without a fatality.

Non-commercial aviation isn’t as safe.  Nearly two decades ago, a 7-year-old was trying to become the youngest person ever to fly across the country.  Being the youngest, oldest, first, fastest, most disabled, most anything (the “-est”) is often the first cause in a cascade of events that leads to tragedy.

A 7-year-old had no business being at the controls of an aircraft.  Period.  Judgment is one of the last things to mature.

• They took off trying to beat a thunderstorm.  That was poor judgment, because wind shear in and around thunderstorms is unpredictable.  One must wait.
• The runway was at a high altitude, where thinner air gives aircraft less lift.
• Rainwater on the wings diminished lift.  Airfoils are delicate; distortions of their shape degrade performance.
• They turned to avoid part of the thunderstorm.  Turning decreases lift.  The overloaded, slow-moving plane, its airfoil distorted by rain, stalled and crashed, killing all aboard.

Remarkable recovery of evidence, and the piecing of it together, led to an understanding of why Air France 447 crashed in the mid-Atlantic in 2009.  Here’s a root cause analysis:

• Why did the plane crash?  It stalled.
• Why did the plane stall?  It was held in a nose-up attitude for the last part of the flight, until the wings lost lift.
• Why was the plane nose-up?  Because the co-pilots had taken control and believed the plane was losing altitude.
• Why did the co-pilots take control? Because the autopilot had shut off.
• Why did the autopilot shut off?  Because it wasn’t getting reliable airspeed information from the pitot tubes; the speed readings were faulty.
• Why didn’t the co-pilots keep on the same course as the autopilot? Because they trusted the instruments.
• Why weren’t the pitot tubes sending useful information?  Because they were faulty and needed to be replaced, but the airline was phasing the replacements in.
• Why was the airline allowed to phase them in?  That ends the questions.  That’s where action needed to occur.  Additional causes included the captain’s taking a rest break (not improper in itself), so he was not in the cockpit when called, and other crew miscommunications.
• What could have been done?  As soon as the “stall” alarm sounded, the crew needed only to push the nose of the aircraft down.  A plane stalls when it tries to climb too steeply for its speed: the angle of attack becomes too great for the wings to generate lift.

**********

This root cause approach to errors is what medicine needs.  When a surgeon operated on the wrong side of the head, he got a letter telling him not to do it again.  Nothing changed.  Here’s what happened.

• The patient in the ED had a subdural hematoma and needed emergency surgery.  There are emergencies where one must act in a matter of seconds, and there are emergencies where one needs to act quickly but can take a few minutes to think about the necessary approach.  A lot of people, in and out of the medical field, don’t understand that there is a huge difference between the two.  Unnecessary hurry is one of the three bad things in medicine (the others are lack of sleep and interruption).  A subdural hematoma needs to be evacuated, but unlike its cousin, the epidural hematoma, it doesn’t need to be done in the emergency department, and there is time to plan the procedure.
• CT scans were relatively new and displayed left-right orientation opposite to traditional X-rays.  I practiced when CT scans showed this orientation, and it was extremely confusing.
• Many people have trouble distinguishing left from right.  It isn’t a character flaw; it is a biological trait, akin to shyness.  Approximately 15% of women and 2% of men have this difficulty.
• Nobody spoke up to tell the surgeon they were concerned about which side he was operating on.

Without going into more detail, I will reiterate the comments I made to the head of the operating room, who assured me that they did it right 99.9% of the time.

“No,” I replied.  “You get it right 99.99% of the time, and that isn’t good enough.  Counts matter, and wrong side surgery cases must be zero.”

We need better system design to decrease the probability of the wrong thing’s happening.  The stronger our systems, the more events will have to occur for something to go wrong, and that means people will be safer.

We will never know if a better system saved a life.  But probabilistically, it will increase the expected value of success, and I trust expected values.