Plan Continuation Bias

Tom Connor
Published in 10x Curiosity · 7 min read · Oct 8, 2020


We are so close, we may as well keep going… You may be suffering from “Get-there-itis”

Photo by Tim Gouw from Pexels

Looking through the list of cognitive biases, a good one to be aware of in the mental model bag of tricks is “plan continuation bias”: the tendency to continue with the original plan in spite of changing conditions and growing evidence that you should reconsider. You see this bias everywhere, especially when it combines with sunk cost bias. Kahneman and Tversky first highlighted that people hate losing something significantly more than they value an equivalent gain, so once you have something invested (be it money, time or effort) you run the risk of sticking to a bad plan for longer than you should.
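As a back-of-the-envelope illustration of that asymmetry, Tversky and Kahneman’s prospect theory captures it with a value function that weights losses more heavily than gains (the parameter values below are their 1992 estimates):

$$
v(x) = \begin{cases} x^{\alpha}, & x \ge 0 \\ -\lambda(-x)^{\beta}, & x < 0 \end{cases}
\qquad \alpha \approx \beta \approx 0.88,\quad \lambda \approx 2.25
$$

With λ around 2.25, losing $100 feels roughly as bad as gaining $225 feels good, which is why walking away from a plan we have already invested in registers as such a painful loss.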

Sidney Dekker writes in The Field Guide to Understanding Human Error:

Plan continuation means sticking to an original plan while the changing situation actually calls for a different plan. As with cognitive fixation, situational dynamics and the continuous emergence of incomplete, uncertain evidence play a role:

  • Early cues that suggest the initial plan is correct are usually very strong and unambiguous. This helps lock people into a continuation of their plan.
  • Later cues that suggest the plan should be abandoned are typically fewer, more ambiguous and not as strong. Conditions may deteriorate gradually.

These cues, even while people see them and acknowledge them, often do not succeed in pulling people into a different direction…When cues suggesting that the plan should be changed are weak or ambiguous, it is not difficult to predict where people’s trade-off will go if abandoning the plan is somehow costly. Diverting with an airliner, for example, … entails multiple organizational and economic consequences. People need a lot of convincing evidence to justify changing their plan in these cases. This evidence may typically not be compelling until you have hindsight.

Tim Harford’s excellent podcast Cautionary Tales has two episodes that describe plan continuation bias ending in disastrous consequences. In Rocks Ahead he tells of the incident in which the Torrey Canyon, one of the largest oil tankers in the world, ran aground on rocks while being steered on a course that in hindsight was far too tight for the massive ship. The evidence that the ship was heading for disaster mounted by the minute, but the captain was committed, and even though he could have backed out of the risky course he decided against it. The result was one of the largest oil spills in history.

Torrey Canyon

The episode Beverly Hills Supper Club describes a fire at a packed social club that should have caused no more than property loss but, because first workers and then patrons were unwilling to change their plans, ended in disaster.

An excellent article by Michael Roberto, Richard Bohmer and Amy Edmondson digs further into the concept they call “ambiguous threats”, which is central to why we are unwilling to change direction (“Facing Ambiguous Threats”, Harvard Business Review, November 2006):

The first impediment to dealing with ambiguous threats stems from cognitive biases …The human mind tends to protect itself from fear by suppressing subtle perceptions of danger. Moreover, we are prone to noticing and emphasizing information that confirms our existing views and hypotheses, while dismissing data that contradict them.

To make matters worse, in the face of vague evidence, we often escalate our commitment to existing courses of action, particularly when we have invested considerable time and money in them.

At NASA, for example, a few engineers, frustrated by a paucity of data about the effects of foam strikes on shuttle missions, had requested additional satellite imagery during Columbia’s final mission. The managers to whom they made the request had already signed off, during the Flight Readiness Review, on the determination that foam strikes did not represent a safety issue. They turned down the engineers’ request, and so beliefs about the harmlessness of foam strikes went unchallenged.

The authors go on to highlight three steps to assist in dealing with plan continuation bias:

Step one: Practice teamwork under pressure.

Organizations often do not have the luxury of taking their time during a crisis. Stress and anxiety run high as the clock ticks toward a potential failure. That’s why firms should not try to improvise in the midst of a recovery window. When time is of the essence, throwing assorted people together on a team and asking them to find a way to work effectively together is counterproductive.

Step two: Amplify the signal.

In countering the powerful psychological forces that muffle ambiguous threats, leaders need to develop processes for amplifying warning signs, even if they seem innocuous at first. Firms require mechanisms that make opening a recovery window legitimate, appropriate, and even welcome in the face of incomplete, but troubling, data. The opening of the window should initiate a brief but intense period of heightened inquiry, experimentation, and problem solving in which people feel free to ask uncomfortable questions about the potential threat. During this period, employees must be able to explore the significance of aberrant observations without fear of retribution should the threat prove harmless.

Step three: Experiment.

Evaluating ambiguous threats often requires rapid experimentation. However, organizations must take great care not to rely only on standard hypothesis-testing. When a potential catastrophe looms, formal scientific experiments may require too much time or too many valuable resources. Many firms therefore can benefit from less-formal, exploratory experimentation.

David Marquet, in Leadership is Language, presents a concept he terms “redwork” (doing work) and “bluework” (thinking work). He highlights that when people are deep in redwork, variability from routine is the enemy, and they are not looking for signs of danger in that variability. Switching modes requires a deliberate pause or intervention. Marquet calls this “Controlling the Clock” and highlights why making that call is often the job of leadership:

  • The team might be lost in redwork because of the stress of the clock.
  • The team might be lost in redwork because of the intensity of focus.
  • The team feels the pressure of obeying the clock most acutely.
  • Calling a pause is likely to be calling attention to a problem, or a possible problem.

Finally, the airline industry provides a very comprehensive (and successful) model of a practical approach to avoiding plan continuation bias. Known as Crew Resource Management (CRM), it focuses on the first step highlighted above and encourages as many stakeholders as possible to voice concerns.

Beverly Hills Supper Club Fire

Clearfield and Tilcsik write about the development of CRM as a solution for improving teamwork under pressure (Meltdown: Why Our Systems Fail and What We Can Do About It, p. 130):

All this changed [Plan continuation bias incidents] with a training program known as Crew Resource Management, or CRM. The program revolutionized the culture not just of the cockpit but also of the whole industry… It was no longer disrespectful to question the decisions of a superior — it was required. CRM taught crew members the language of dissent.

…many pilots thought it was useless psychobabble. They called it “charm school” and felt it was an absurd attempt to teach them how to be warm and fuzzy. But as more and more accident investigations revealed how failures to speak up and listen led to disasters, attitudes began to shift. Charm school for pilots has become one of the most powerful safety interventions ever designed.

…In a complex system, that’s often the right thing to do. Pausing gives us a chance to understand what’s going on and decide how to change course. But we often fail to pause even when we can. We press on even when our original plan no longer makes sense.

…Get-there-itis affects all of us, not just pilots. We become so fixated on getting there — whether “there” is an airport or the end of a big project — that we can’t stop even when the circumstances change.

Let me know what you think; I’d love your feedback. If you haven’t already, sign up for a weekly dose just like this.

Get in touch… — linktr.ee/Tomconnor

More like this from 10x Curiosity
