Plan Continuation Bias

One of the key risks in linear processes is plan continuation bias: how easy it is to become path dependent, increasing risk and closing off the capacity for adaptation the deeper you go into a course of action. It is therefore important for pilots to understand that continuation errors can occur, and to be aware of the risks of failing to analyze changes in a situation and consider their consequences in order to determine whether a different course of action is more appropriate. As workload increases, especially in single-pilot scenarios, there is less and less mental capacity to process these changes and consider the impact they could have on the original plan. The pull to continue the plan can prevent crews from realizing that they need to change their course of action. [Sources: 6, 12, 13]

Plan continuation bias can be defined as the tendency of individuals to continue with an original course of action that is no longer viable, often despite changing conditions and current information about the situation (APA, 2020). Perhaps the most familiar illustration is an airline pilot who is unexpectedly confronted with bad weather (or other changing conditions) on approach but, instead of diverting to a different runway or aborting the landing, decides to press on with the landing at the originally planned destination. A NASA study of nine major plane crashes in the United States between 1990 and 2000, in which crew error was considered a likely cause, found that crews' bias toward continuing the plan tends to increase as they get closer to their destination. [Sources: 6, 9]

In other words, the closer the aircraft is to the final approach and landing phase, the more likely it is that the crew will continue the operation even when the environment changes. [Sources: 6]

Simply put, when the journey is nearly complete, people tend to run on autopilot, ignoring changing and potentially hazardous environmental factors. A "mission at any cost" mentality tends to creep in and overwhelm the crew's abilities. Fatigue and stress are secondary factors, but ones with a major impact on a pilot's likelihood of deviating from rules and procedures. [Sources: 7]

Situational awareness (SA) failures occur when plan continuation errors prevent the pilot from detecting important cues, or when the pilot cannot recognize the meaning of those cues. Plan continuation errors are more common in single-pilot light aircraft operations, yet NASA does not devote forensic investigation resources to every small aviation incident. A 2004 NASA Ames human factors study, which analyzed 19 airline accidents between 1991 and 2000 attributed to crew error, found plan continuation errors to be a recurring factor. [Sources: 0, 5, 13]

In some incidents, we observe a snowball effect, in which decisions or actions at one stage of the flight increased the crew's vulnerability to making mistakes later. For example, a crew that continued a highly questionable approach during a thunderstorm found themselves in a high-workload situation that may have contributed to their forgetting to arm the spoilers. NASA's accident analysis has focused on human behavior in examples like these. [Sources: 7, 12]

The bias arises when people stick to a plan even when it seems wrong. Even when people see and recognize the cues suggesting a change of plan, those cues often fail to steer them in a different direction. When such cues are weak or ambiguous, it is not difficult to predict which way people will lean, particularly if abandoning the plan is somehow costly. Accident investigators often conclude that accidents result from this bias: the idea of breaking off or changing the approach becomes not merely aggravating, costly, or unpleasant, but literally unthinkable. Simply put, plan continuation bias is the tendency we all share to continue on a path we have already chosen, without carefully checking whether it is still the best, or even a workable, idea. [Sources: 1, 10, 11]

This particular form of cognitive bias is more complex (especially when viewed through the lens of plane crashes) than described here, but the concept as a whole is interesting because it can be studied in relation to the many decisions and paths we persist in pursuing despite current information, or even warning signs, suggesting they may not be the best or most appropriate course of action. In this article, I describe how this bias permeates our psychology, observe how it operates in plane crashes, and then examine its impact on financial markets. Investors will learn how to combat this bias and improve their trading. [Sources: 1, 9]

I think there is something in these stories that suggests a continuation bias (or what pilots call "press-on-itis"): the tendency of people to continue their original course of action despite changing conditions, even when the plan is no longer viable. In the event of a GPS failure, it may stem from a feeling that the technology will sort itself out, that we will find a better way or a way out right around the next corner. In aviation, this urge to keep going is more commonly known as "get-there-itis," and it is often fatal, especially among less experienced pilots. It is an odd name for what is formally called plan continuation bias, an unconscious cognitive bias toward continuing with the original plan despite changing conditions, and it can be fatal to general aviation pilots. [Sources: 0, 4, 6]

This bias can be especially strong during the approach phase, when only a few more steps are needed to complete the original plan, and it can prevent pilots from noticing subtle indications that the initial conditions have changed. Looking through the catalog of cognitive biases, "plan continuation bias" is one of the most useful to keep in your mental-model cheat sheet. And the further you stray down the wrong path, the stronger the bias becomes: task saturation sets in, situational awareness erodes, and you go fully defensive, no longer thinking ahead. [Sources: 0, 10, 12]

This quick and dirty mental simulation leaves out many important factors. Flying a light aircraft single-pilot requires good preparation, healthy routines, excellent motor skills, and an understanding of our cognitive biases. [Sources: 1, 5]

We can be just as guilty of sticking to the plan as the pilots described above. We bought a stock that was a good idea at the time, and we continue to hold it even after the reason for the purchase has disappeared or never materialized. In retrospect, we can see that we should have changed our plans to adapt to changing conditions. [Sources: 1, 9]

When studying and analyzing plane crashes, it is very easy to fall into what cognitive scientists call hindsight bias. Related pitfalls include automation bias, a tendency to over-rely on automated systems, which can lead to incorrect automated information overriding correct decisions; misplaced priorities; and, in human-robot interaction, the attribution error in which people make systematic mistakes when interacting with a robot. [Sources: 2, 12]


— Slimane Zouggari
