Researching Risky Business - What managers, astronauts, and hurricane survivors alike can learn from near-misses

June 18, 2009

By Chris Blose

Imagine you have to attend a meeting in a rough neighborhood. You know from looking up crime statistics that there is a 2.5 percent chance you will be mugged in this part of town. You go anyway, and you do not get mugged. Then you go to another meeting, and another, and you continue to return home safely.

The more you go to that neighborhood without getting mugged, the more comfortable you start to feel. However, there is still a 2.5 percent chance you will be mugged next time.

"It's not that you change the number," says Catherine Tinsley, associate professor at Georgetown University's McDonough School of Business. "You just change how you feel about the number."

The mugging anecdote helps illustrate the work of Tinsley and Robin Dillon-Merrill, also an associate professor. Combining their expertise in decision theory and risk analysis, respectively, they study exactly how and why people make decisions that involve risk.

Specifically, Tinsley and Dillon-Merrill have explored and refined the idea of a "near-miss" event. Researchers have varying definitions of near-misses, but for the purposes of their studies, Tinsley defines them as follows: "A near-miss is an event that could have been a failure but for an element of luck or chance."

Near-misses are everywhere. They happen on the highway every day. They happen in the manufacturing industry. They happen in risky ventures such as space flight, and they happen in areas prone to natural disasters such as hurricanes.

Tinsley and Dillon-Merrill say the goal of their research is to help managers and others respond to near-misses not by saying, "That was a close call, but we made it," but instead, "That was a close call. What did we learn?"

Tragic Origins
Dillon-Merrill started down the road toward this research when she heard a presentation by Elisabeth Pate-Cornell, a Stanford University researcher who later became her mentor. In the late 1980s and early 1990s, Pate-Cornell performed an extensive risk analysis for NASA about the heat-resistant tiles that protect space shuttles during re-entry into Earth's atmosphere. The report identified the risk that those crucial tiles could be damaged by debris shed from the shuttles' insulating foam.

When the space shuttle Columbia disintegrated on Feb. 1, 2003, that foam was the culprit. In the wake of this tragedy and the death of all seven astronauts on board, NASA managers, investigators, and the public were left asking why nothing had been done to deal with a known risk.

"When we lost the space shuttle Columbia for a reason that was clearly identified in the risk analysis, it showed me something," Dillon-Merrill says. "There's nothing wrong with the risk analysis tools. The issue was everything that had happened from when [Pate-Cornell] turned in her report to when we lost the Columbia — the many different launches when the foam fell off and didn't cause any problems."

Like the mugging scenario, Dillon-Merrill theorized, every time a shuttle mission ended without incident, NASA managers likely altered their feelings about the statistical probability of risk, even if the foam kept shedding and the probability of catastrophe had not changed. The Columbia Accident Investigation Report backed that idea, too, saying NASA may have grown complacent about the foam shedding.

Seeing an opportunity to learn from tragedy, Dillon-Merrill enlisted Tinsley's expertise in the behavioral aspects of decision making, and the two got to work.
"What motivates us is that when failures happen, you get a huge investigation of everything," Tinsley says. "But failures are really costly. Can you avoid failures and find early warning signals by paying more attention to near-misses?"

Near-missed Opportunities
With a $250,000 grant from NASA, Dillon-Merrill and Tinsley set out to explore why people don't learn from near-misses. The resulting study, "How Near-Misses Influence Decision Making Under Risk: A Missed Opportunity for Learning," was published in the August 2008 issue of the journal Management Science.

The researchers started with the idea that people interpret near-miss events not as near-failures, but as successes. If people view a near-miss as a success, focusing solely on the outcome instead of recognizing it could have led to disaster, they will lower their perception of risk. In turn, they will be more comfortable making risky decisions.

"When you're faced with a near-miss, a close call, you can interpret it two ways," Tinsley says. "You could say, 'Wow, our system was really resilient because this thing missed, and we're OK.' Or you could say, 'Wow, our system is vulnerable because we almost got hit.' So whether you interpret the near-miss as a success or as almost a failure will influence how you view subsequent situations that have the same decision parameters."

To test this idea, Dillon-Merrill and Tinsley set up a series of controlled experiments involving hypothetical scenarios. NASA managers, defense contractors, and Georgetown students participated in these experiments. In the first, participants were asked to rate the decisions of Chris, a fictional project manager working on an unmanned spacecraft project.

Because of a tight schedule, Chris decided to skip peer review of an important instrument on the spacecraft. He also failed to investigate a design problem that could lead to catastrophic failure. Different participant groups received different information about the mission's success, failure, or near-miss.

In the success scenario, the mission occurs without incident. In the failure scenario, a chance alignment with the sun causes a catastrophic malfunction of the instrument in question. In the near-miss scenario, a chance alignment with the sun keeps that instrument in the shade, so it does not malfunction.

People in the near-miss group rated Chris' management of the project in a way that was statistically indistinguishable from the ratings given by the success group. In other words, people viewed a near-miss as a success rather than as a near-failure.

In another experiment, a simulation, participants were asked to assume control of an unmanned Mars rover on day six of an 11-day mission. On each morning of the mission, participants were given a 95-percent-accurate weather forecast. They had to decide whether to drive that day or stop for safety and deploy a guard on the rover's wheels.

All participants were given the following information: On days one through five, the rover was operating on autopilot. There was a 40 percent chance of catastrophic wheel failure in a severe storm. All else being equal, it was favorable to drive because of limited battery life, which depleted at a constant rate whether the rover was driving or stopped. For that first morning, day six, all participants also received a forecast calling for a severe dust storm.
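
To see what those numbers imply, here is a minimal sketch under one simple reading of the scenario (my own assumptions, not the study's model): treat "95 percent accurate" as the chance that the forecasted storm actually arrives, and apply the stated 40 percent chance of catastrophic wheel failure if the rover drives through a severe storm.

```python
# Minimal sketch (one simple reading of the scenario, not the study's own model):
# the risk of driving on a day with a severe-storm forecast.
p_storm_given_forecast = 0.95   # if "95 percent accurate" is read as P(storm | forecast)
p_failure_given_storm = 0.40    # stated chance of catastrophic wheel failure while driving

p_failure_if_drive = p_storm_given_forecast * p_failure_given_storm
print(f"chance of losing the rover by driving today: {p_failure_if_drive:.2f}")
# ~0.38 -- the same number whether or not the rover survived earlier storms
```

Under that reading, driving on day six carries roughly a 38 percent chance of catastrophe, and nothing about the rover's earlier escapes changes it.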

From there, information varied for a control group and a near-miss group. The control group was told there had been mild weather for all previous days of the mission. The near-miss group was told the rover had made it through severe storms on three of those days.

Only 13 percent of the control group decided to drive that day, but a full 75 percent of the near-miss group chose to drive despite the threat of a severe storm, likely because they knew the rover had made it through such storms before. The evidence supported Dillon-Merrill and Tinsley's hypothesis that people who have near-miss information make riskier decisions than those without this knowledge.

Further experiments, with minor changes to the information, suggested that people were not logically updating the statistical probability based on the information they had; instead, they seemed simply to alter their perception of risk and grow more optimistic that they were safe.
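
For contrast, here is a minimal sketch of what a textbook update might look like. The Beta-binomial model and the prior strengths are my own illustration, not anything the study specifies: even after crediting the rover's three storm survivals as evidence, the estimated per-storm failure risk falls only modestly.

```python
# Minimal sketch (illustrative Beta-binomial update, not the study's method):
# how a textbook Bayesian might revise the 40 percent failure rate after the
# rover survives three severe storms.
priors = {
    "stronger prior (a=4, b=6)": (4, 6),  # prior mean 0.40, ten pseudo-observations
    "weaker prior (a=2, b=3)": (2, 3),    # prior mean 0.40, five pseudo-observations
}
failures, survivals = 0, 3  # three severe storms, no wheel failures

for label, (a, b) in priors.items():
    a_post, b_post = a + failures, b + survivals  # failures update a, survivals update b
    posterior_mean = a_post / (a_post + b_post)
    print(f"{label}: updated failure risk per storm = {posterior_mean:.2f}")
# Roughly 0.31 and 0.25 -- lower than 0.40, but nowhere near "safe to drive."
```

Even a deliberate update of the statistics leaves the per-storm risk at roughly a quarter to a third, which is consistent with the researchers' conclusion that participants were changing their perception of the risk rather than the risk itself.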

"The full learning value of near-misses will be realized only when they are separated from successes and examined to demonstrate not only system resilience, but also system vulnerability," the study concludes.

If unexamined, near-misses may turn into hits.

Should I Stay or Should I Go?
Building on Dillon-Merrill's prior work in a mentorship program for the National Science Foundation (NSF), the two researchers received $300,000 from the NSF to examine why and how people decide whether to evacuate a hurricane zone.

For more than 40 years, the disaster research community has studied why people evacuate for hurricanes or choose to stay. This research has explored more than 70 variables, such as demographics, education, and past experience. However, results are often inconclusive and unpredictable — just like people's emotions.

"It really boils down to this," Dillon-Merrill says. "If people feel safe, they stay, and if they don't feel safe, they go." The idea may be simple, but the process by which people reach that feeling is complex.

To explore this process, the two colleagues arranged another set of controlled experiments. The results were similar to those of the NASA study, and they helped the researchers refine the near-miss idea. In a working paper Dillon-Merrill and Tinsley plan to publish, near-misses have evolved into two categories: the "didn't near-miss" and the "almost near-miss."

In the "didn't near-miss" scenario presented to study participants, a hurricane had hit their property three times before, but they and their property were safe each time. In the "almost near-miss" scenario, the same was true, but a tree had fallen on their neighbor's house and produced significant damage.

The "didn't" information produced a different reaction than the "almost" information. People who knew their neighbors had experienced damage felt less safe and were more likely to evacuate from a hurricane. One little piece of information made all the difference in the world, Tinsley says. Again, the statistical probability of damage to people and their property was the same; only the perception of that probability changed.

Government agencies that deal with disasters are trying to understand and influence people's evacuation decisions. They want to help people in the most danger reach the decision to leave. In turn, they want to help people who may be in relatively little danger stay so they don't clog up the roads. In other words, helping people decide whether to stay or go is about more than understanding behavior; in the face of a natural disaster, it may save lives.

A Hammer for Many Nails
Because risk is involved in so many decisions, near-miss research has the potential to reach far beyond space missions and natural disasters.

Dillon-Merrill and Tinsley talk about possible applications ranging far and wide. Near-miss information can help explain the current state of the economy, for example, and the behavior of financial institutions that engaged in risky investments. According to these professors, instead of managing risk and averting disaster ahead of time, everyone is looking backward to determine just what happened.

Although the research stems from large-scale crises, it can benefit a manufacturing manager running an assembly line just as much as a financial manager or a small-business owner. With the right understanding of near-misses, the researchers believe they can teach everyone to make better decisions in the face of risk.

"We think it has applications just about everywhere," Tinsley says, "but of course that's because we have a hammer now, so everything is a nail."