Automation Bias

If we think of the human brain as a computer, a cognitive bias is essentially a coding error that causes us to perceive input incorrectly or produce illogical output. [Sources: 2]

But there are other types of bias that are not necessarily cognitive. For example, social protection theory is one of the best-known ideas in social psychology. There are also cognitive theories that are not biases in themselves but are better described as webs of interrelated biases, such as cognitive dissonance, the mental discomfort that arises when conflicting ideas or beliefs collide in our minds. [Sources: 2]

Cognitive bias can also be used to understand the reasoning patterns and motivational processes that underlie a person's decision-making behavior. Research on cognitive-bias patterns in visualization likewise shows that the way a person processes information and makes decisions varies with the specific situation. [Sources: 4]

Automatic reasoning promotes uncritical acceptance of recommendations and sustains strong bias. Experiments have shown that this kind of supervisory control creates what is called automation bias: operators trust computer-generated solutions as correct and ignore, or fail to seek out, contradictory information. Cummings studied automation bias in an interface designed to monitor and reallocate GPS-guided Tomahawk missiles in flight. [Sources: 7]

Level 4 is unacceptable because it does not support target confirmation, and a short veto window increases automation bias, leaving no room for doubt or reflection. There must also be a means of quickly halting or aborting an attack. A ranked list of targets is especially problematic, because automation bias will tend to favor whatever sits at the top of the ranking if the operator is not given enough time and space to think. [Sources: 7]

This is a tendency to over-rely on automated systems, one that can allow incorrect automated information to override correct human decisions. Research showing how people trust automated systems over their own judgment already gives automation bias a rich history. When we consider how the rapid deployment of artificial intelligence and automation will amplify these distortions, we begin to understand the risks of letting machines guide human thinking. [Sources: 1]

Every day, systematic errors in our thought processes affect the way we live and work. In the name of self-awareness, here is a closer look at three recently described biases we are most likely to display in the modern world. Automation bias refers to a specific class of errors people tend to make in highly automated decision-making contexts, where many decisions are handled by automated aids (such as computers) and a human actor is largely present to monitor the actions being taken. [Sources: 3, 5]

The following are excerpts from representative examples of this research program. A number of recent studies on automation bias, understood as the use of automation as a heuristic replacement for vigilant information seeking and processing, have explored omission and commission errors in highly automated decision-making environments. Most research on the phenomenon had been conducted in solo-performance settings; this study examined automation bias in two-person crews versus solo performers under different instruction conditions. Training that targeted automation bias and its associated errors succeeded in reducing commission errors, but not omission errors. [Sources: 5]
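The distinction between the two error types is easy to blur in prose, so here is a minimal sketch that pins it down in code. All names are hypothetical, and the trial model (an automated aid that may or may not flag an event, and an operator who may or may not act) is a simplification of the monitoring tasks used in this research:

```python
from enum import Enum, auto

class ABError(Enum):
    NONE = auto()
    COMMISSION = auto()  # operator followed incorrect automated advice
    OMISSION = auto()    # operator missed an event the automation failed to flag

def classify_trial(automation_flagged: bool,
                   automation_correct: bool,
                   operator_acted: bool) -> ABError:
    """Classify one monitoring trial for automation-bias errors.

    Commission error: the aid issues a wrong directive and the operator
    acts on it. Omission error: a real event occurs, the aid stays
    silent (so it is incorrect), and the operator also fails to act.
    """
    if automation_flagged and not automation_correct and operator_acted:
        return ABError.COMMISSION
    if not automation_flagged and not automation_correct and not operator_acted:
        return ABError.OMISSION
    return ABError.NONE
```

Under this toy model, training that reduces commission but not omission errors would shift outcomes in the first branch while leaving the second unchanged, which is what makes omission errors the harder problem: there is no salient (wrong) prompt to second-guess.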

However, they found that task difficulty did not affect performance. We found evidence of omission errors: participants failed to detect 28.7% more prescribing errors when the CDS issued no warning, compared with a control condition without CDS. Interestingly, although participants relied too heavily on the automation, there was also evidence of disagreement with the CDS advice they were given. The problem is compounded by the "looking but not seeing" or inattentional blindness effect, in which participants made automation-bias errors despite having access to sufficient information to judge that the automation was wrong [12, 13]. [Sources: 0]

However, the automation-bias findings suggest that this extra layer of protection is being weakened or, in the worst case, that CDS is being commissioned, without proper supervision, as a substitute for clinicians' own efforts to detect errors. In addition, cognitive strategies such as asking people to consider the opposite outcome, rather than only the expected one, have been found effective in reducing anchoring bias (Mussweiler et al., 2000). A large body of social psychology research shows that many cognitive biases, and the errors that follow from them, can be corrected by establishing accountability before a decision is made, so that decision makers know they will need to give convincing reasons for their choices and for how they made them. And although humans are called "smart animals," Bayesian analysis experiments in the 1950s and 1960s showed that human judgment can be biased and lead to wrong decisions (Edwards et al., 1963; Ellis, 2018). [Sources: 0, 4, 6]
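The Bayesian experiments of that era typically used a two-urn chip-drawing task: the participant sees a sequence of colored chips and must report the probability that they came from the mostly-red urn. A minimal sketch of the normative answer, assuming (for illustration only) that urn A yields red chips 70% of the time and urn B 30%, with equal priors:

```python
def posterior_urn_a(n_red: int, n_blue: int,
                    p_red_a: float = 0.7, prior_a: float = 0.5) -> float:
    """Bayes-rule posterior that the observed chips came from urn A.

    Urn A yields red with probability p_red_a; urn B is its mirror
    image (red with probability 1 - p_red_a). Draws are independent.
    """
    like_a = (p_red_a ** n_red) * ((1 - p_red_a) ** n_blue)
    like_b = ((1 - p_red_a) ** n_red) * (p_red_a ** n_blue)
    return (prior_a * like_a) / (prior_a * like_a + (1 - prior_a) * like_b)
```

After 8 red and 4 blue chips, Bayes' rule already gives a posterior of about 0.97 for urn A; the classic finding was that people revise their estimates far more conservatively than this, which is one concrete sense in which human judgment departs from the normative model.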

From the above, it should be clear that there are lessons to be learned both from the psychology of human thinking and from the literature on human-machine interaction. This study found a risk of automation bias in electronic prescribing among senior medical students who will soon enter clinical practice as junior physicians. [Sources: 0, 7]

Knowing this list of biases will help you make better decisions and recognize when you have gone astray. Most people don't know how many types of cognitive bias there are: Wikipedia lists 184. We found 50 types of cognitive bias that arise almost every day, in small discussions on Facebook, in horoscopes, and on the world stage. [Sources: 1, 2]

Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink that sabotages your meetings with management to the anchoring effect that makes you overspend in a store during a sale. Cognitive bias is widely recognized as part of what makes us human. Cognitive bias is a psychological explanation for patterns of human thinking and rational judgment (Haselton et al., 2015) associated with remembering, evaluating, processing information, and making decisions (Hilbert, 2012; Tversky and Kahneman, 1974). Studies in psychology and behavioral economics have reported similar patterns of biased thinking, which they call cognitive biases. [Sources: 2, 3, 4]

This cognitive bias, identified in 2011 by Michael Norton (Harvard Business School) and colleagues, relates to our tendency to place more value on things we helped create. Countering a bias like this can mean falling back on good old-fashioned human curiosity, for instance finding a new favorite TV series yourself rather than letting a platform like Netflix recommend one. The Google Effect, also known as digital amnesia, describes our tendency to forget information that can be easily looked up on the Internet. [Sources: 3]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-017-0425-5

[1]: https://www.paconsulting.com/insights/what-is-automation-bias-how-to-prevent/

[2]: https://www.titlemax.com/discovery-center/lifestyle/50-cognitive-biases-to-be-aware-of-so-you-can-be-the-very-best-version-of-you/

[3]: https://www.visualcapitalist.com/50-cognitive-biases-in-the-modern-world/

[4]: http://www.braindigitallearning.org/article?num=N0230110302

[5]: https://lskitka.people.uic.edu/styled-7/styled-14/

[6]: https://journals.sagepub.com/doi/abs/10.1177/154193129604000413

[7]: https://www.icrac.net/icrac-working-paper-3-ccw-gge-april-2018-guidelines-for-the-human-control-of-weapons-systems/