Probability neglect, a type of cognitive bias, is the tendency to disregard probability when making decisions under uncertainty, and it is a common way in which people break normative rules of decision making. When probability is neglected, people focus on the “worst case” and ignore the question of how likely that worst case actually is, an approach that can also lead to overregulation. There are many related ways in which people violate the normative rules for reasoning about likelihood, including hindsight bias, neglect of prior base rates, and the gambler’s fallacy. [Sources: 0, 1, 4, 5]
Cass Sunstein, a senior adviser to Barack Obama, says that people show probability neglect when faced with vivid images of terrorism: when their emotions are intensely engaged, their attention is focused on the worst-case outcome rather than on how unlikely it is to occur. This bias can lead subjects to decisively violate expected utility theory, especially when a decision involves an outcome of much lower or higher utility that has only a small likelihood of occurring (for example, playing a lottery with a tiny chance of a large prize). [Sources: 3, 5]
However, this bias is distinct in that the actor does not misweight the probability but ignores it entirely. In a 2001 article, Sunstein addressed the question of how the law should respond to probability neglect. Again, the subject ignores probability when making a decision, treating every possible outcome as equally likely in his reasoning. [Sources: 0, 4]
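As a rough illustration (the payoffs and probabilities below are hypothetical, not drawn from the sources), the following Python sketch contrasts an expected-utility ranking with a probability-neglecting ranking that treats every listed outcome as equally likely.

```python
# Hypothetical decision: pay 5 for a lottery ticket that wins 1000
# with probability 0.001, or keep the 5. Numbers are illustrative only.

def expected_utility(outcomes):
    """Outcomes is a list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

def probability_neglect_value(outcomes):
    """Ignore probabilities: treat every listed outcome as equally likely."""
    utilities = [u for _, u in outcomes]
    return sum(utilities) / len(utilities)

buy_ticket = [(0.001, 995), (0.999, -5)]   # win 1000 minus the 5 stake, or lose the 5
keep_money = [(1.0, 0)]

print(expected_utility(buy_ticket))          # -4.0  -> worse than keeping the money
print(expected_utility(keep_money))          #  0.0
print(probability_neglect_value(buy_ticket)) # 495.0 -> looks attractive once odds are ignored
```

Once the probabilities are dropped, the rare jackpot dominates the comparison, which is exactly the pattern the bias describes.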
Researchers assume that probability is more likely to be neglected when the outcomes evoke strong emotion. In this respect, the neglect-of-probability bias is similar to the neglect of prior base rates. Subadditivity effect: the tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts. [Sources: 2, 4, 5]
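A small numeric sketch of the subadditivity effect, using made-up survey-style numbers rather than figures from the sources:

```python
# Hypothetical judgments (illustrative numbers only): people rate the probability
# of "death from any natural cause" below the sum of their ratings for its parts.
judged_whole = 0.58                      # "death from natural causes"
judged_parts = {"heart disease": 0.22,
                "cancer": 0.18,
                "other natural causes": 0.33}

print(sum(judged_parts.values()))                 # 0.73
print(sum(judged_parts.values()) > judged_whole)  # True -> the parts add up to more than the whole
```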
While government policy on potential hazards should focus on statistics and probabilities, government efforts to raise awareness of these hazards are most effective when they focus on worst-case scenarios. He noted that methods such as Monte Carlo analysis are available for studying probability, but that all too often “the probability continuum is ignored”. Dobelli described the US Food Act of 1958 as a “classic example” of the neglect of probability. [Sources: 4]
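The text mentions Monte Carlo analysis only in passing; as a minimal sketch of the idea (the loss model and numbers are assumptions, not from the sources), the snippet below estimates the probability that a hypothetical loss exceeds a threshold by random sampling, instead of fixating on the worst case.

```python
import random

def estimate_exceedance(threshold, trials=100_000, seed=42):
    """Monte Carlo estimate of P(loss > threshold) for a toy loss model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Toy model: a loss event occurs with probability 0.02, and its size
        # is drawn from an exponential distribution with mean 50.
        loss = rng.expovariate(1 / 50.0) if rng.random() < 0.02 else 0.0
        if loss > threshold:
            hits += 1
    return hits / trials

print(estimate_exceedance(threshold=200.0))  # small but nonzero, roughly 0.0004
```

The point is that the worst case (a loss far above the threshold) is possible but rare, and the simulation attaches a number to “how rare” rather than ignoring it.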
Bias blind spot: the tendency to see oneself as less biased than other people, or to be better at recognizing cognitive biases in others than in oneself. Conjunction fallacy: the tendency to assume that specific conditions are more probable than general ones. Base rate fallacy or base rate neglect: the tendency to ignore base rate information (general background information) and focus on specific information (information relevant only to the particular case). [Sources: 2]
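Base rate neglect is easy to see with Bayes’ theorem. The numbers below are hypothetical, chosen only to show why a positive result from a fairly accurate test for a rare condition is still most likely a false positive.

```python
# Hypothetical numbers for a rare condition and a fairly accurate test.
base_rate = 0.001           # P(condition)
sensitivity = 0.99          # P(positive | condition)
false_positive_rate = 0.05  # P(positive | no condition)

# Bayes' theorem: P(condition | positive)
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive

print(round(posterior, 3))  # ~0.019: despite the accurate test, the condition remains unlikely
```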
The gambler’s fallacy: the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. All these biases reflect a tendency to focus on irrelevant information when making a decision. Berkson’s paradox: the tendency to misinterpret statistical experiments involving conditional probabilities. [Sources: 2]
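A quick simulation makes Berkson’s paradox concrete; the setup is a toy assumption of mine, not taken from the sources. Two independent traits appear negatively correlated once we look only at cases selected on their combined value.

```python
import random

rng = random.Random(0)

# Two independent traits, e.g. hypothetical "talent" and "luck" scores.
population = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(100_000)]

# Selection: keep only cases where the combined score is high.
selected = [(a, b) for a, b in population if a + b > 1.5]

def correlation(pairs):
    n = len(pairs)
    mean_a = sum(a for a, _ in pairs) / n
    mean_b = sum(b for _, b in pairs) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in pairs) / n
    var_a = sum((a - mean_a) ** 2 for a, _ in pairs) / n
    var_b = sum((b - mean_b) ** 2 for _, b in pairs) / n
    return cov / (var_a * var_b) ** 0.5

print(round(correlation(population), 2))  # ~0.0: independent in the full population
print(round(correlation(selected), 2))    # clearly negative among the selected cases
```

Conditioning on the selection criterion induces the negative association, which is why naive conditional-probability reasoning about selected samples goes wrong.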
Irrational escalation: a phenomenon in which people justify increased investment in a decision on the basis of prior cumulative investment, despite new evidence suggesting that the decision was probably wrong. Choice-supportive bias: the tendency to remember one’s choices as better than they actually were. [Sources: 2]
When national security is at stake, cost-benefit analysis is much less promising because it is usually impossible to assess the likelihood of an attack. The availability heuristic, widely used by ordinary people, can lead to highly exaggerated perceptions of risk, as serious incidents lead citizens to think the risk is much greater than it actually is. Civil libertarians overlook this point, believing that the meaning of the Constitution does not change in the face of intense public fear. [Sources: 1]
Déformation professionnelle is a French term for the tendency to look at things from the point of view of one’s own profession rather than from a broader perspective. [Sources: 0]
— Slimane Zouggari
##### Sources #####
[0]: https://www.linkedin.com/pulse/cognitive-biases-every-risk-manager-must-know-part-2-sidorenko-crmp
[1]: https://muse.jhu.edu/article/527368/summary
[2]: https://behavioralgrooves.com/behavioral-science-glossary-of-terms/
[3]: https://www.cambridge.org/core/books/risk/quantifying-uncertainty/B41C7A211929DBA2B5CB4CEA4E3A66A1
[4]: https://en.wikipedia.org/wiki/Neglect_of_probability
[5]: https://nlpnotes.com/2014/03/22/neglect-of-probability/