Status Quo Bias

In behavioral economics, status quo bias describes how people give more weight to what they already have. In other words, loss aversion motivates people to stick with what they have. [Sources: 6]

People dislike uncertainty and don’t want to make choices. Rather than take the risk of trying an unknown drug that may have unknown effects, people stick with what they know, even if it’s potentially not as good as the alternatives. [Sources: 6, 11]

Attachment to the status quo leads people to keep their financial situation as it is rather than take risks that might improve their financial prospects. Status quo bias is evident when people choose to keep things the same, either by doing nothing (see also inertia) or by sticking to a previously made decision (Samuelson & Zeckhauser, 1988). [Sources: 2, 11]

This can happen even when transition costs are small and the stakes of the decision are high. For example, a person may decide to maintain their current situation because of the potential costs of switching to an alternative. When making important choices, people are more likely to choose the option that keeps things as they are; conversely, destabilizing their preferences increases their willingness to change. [Sources: 2, 3, 11, 12]

One contributing phenomenon is anticipated regret: sticking with the status quo is a strategy for reducing the regret we expect to feel if a change turns out badly. Choice overload also plays a role, since a larger set of options leads us to make worse decisions. Indeed, some researchers classify this bias as a form of decision avoidance: when there are many options and you are not sure which is best, accepting the default can be a way to escape decision-making pressure. This can undermine our decision-making and, out of fear of failure, prevent us from choosing the most beneficial option. [Sources: 1, 5]

Even when a new option or choice is proposed, we tend to stick with the default. If we keep our current choices to avoid the cost of deciding, this can even be seen as rational, since we save on computational costs. Moreover, people tend to value an option more highly once they have chosen it. [Sources: 3, 6, 10]

One explanation is that for individuals to change course, the alternative must be seen as roughly twice as beneficial as the current situation. The reason is simple: people naturally find change expensive, dangerous, and risky. This may be a form of risk aversion inherent in the status quo bias: people who do not want to lose their current reality choose to stay, even when staying costs them. [Sources: 0, 12]

Research shows that when people make a decision, they weigh potential losses more heavily than potential gains. As a result, they prefer to continue on the path they have already chosen, even if an alternative is objectively better. Whether you realize it or not, you are naturally inclined to choose the path of least resistance in decision-making. [Sources: 3, 12]
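The unequal weighting of losses and gains can be illustrated with a prospect-theory-style value function. This sketch is generic and not taken from the cited sources; the parameters (alpha = beta = 0.88, lambda = 2.25) are the commonly quoted Tversky and Kahneman (1992) estimates, used here only for illustration.

```python
# Illustrative prospect-theory value function, in which losses loom
# larger than equivalent gains (loss aversion). Parameter values follow
# the commonly cited Tversky & Kahneman (1992) estimates.

def subjective_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Felt value of a gain (x > 0) or loss (x < 0) relative to the status quo."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# A prospective gain and an equal-sized loss are not felt equally:
gain = subjective_value(100)    # roughly 57.5
loss = subjective_value(-100)   # roughly -129.5

# The loss outweighs the gain, so a switch whose upside and downside are
# symmetric feels like a net negative, favoring the status quo.
print(abs(loss) > gain)  # True
```

On this picture, a change must offer far more than it risks before it feels worthwhile, which is one way to read the "twice as beneficial" threshold mentioned above.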

It is much easier and safer to stick with your current course of action than to risk something new. Change can be intimidating, which is why many people prefer things to stay the way they are. [Sources: 11, 12]

Indeed, across everyday decisions such as moving house, changing cars, or even changing TV channels, there is a noticeable tendency to maintain the status quo and refrain from action (1). In clinical settings, when a surrogate’s decision conflicts with acting in the patient’s best interests or fulfilling the patient’s wishes, and the decision is therefore irrational, status quo bias may be the culprit. Once life-sustaining treatment has begun, clinicians can address the effects of status quo bias by recognizing signs of omission bias and by empathizing with surrogates who express or imply concerns about stopping supportive care and who then feel responsible or blameworthy for the patient’s death. [Sources: 5, 9]

The results of this study suggest that stated-choice studies may exhibit a status quo bias, especially for drugs that patients must take daily, such as maintenance drugs for asthma. In an open-choice experiment among asthma patients taking prescription maintenance drugs, researchers tested whether there is a status quo bias toward the current drug even when better alternatives are offered, and found that when given a choice between their current drug and an objectively better one, patients were biased toward their current drug. [Sources: 4, 11]

A default bias was observed in high- but not low-difficulty trials, resulting in suboptimal choice behavior; it appeared in 13 out of 16 subjects. Attachment to the status quo was even more evident in older participants, who chose to keep their initial investments rather than change them as new information emerged. [Sources: 4, 9]

The status quo bias is explained by a number of psychological principles, including loss aversion, sunk costs, cognitive dissonance, and mere exposure. The classic model of human decision-making is the rational choice or “rational actor” model: the idea that people will choose the option most likely to satisfy their preferences. In practice, when faced with a choice, it is not always obvious which decision is correct. [Sources: 1, 3, 5]

The status quo bias must be distinguished from a rational preference for the status quo, as when the current state of affairs is objectively superior to the available alternatives or when incomplete information is a serious problem. The bias is often invoked, for example, to explain why people fail to take advantage of investment and savings opportunities. David Gal and Derek Rucker have disputed the loss-aversion interpretation of the status quo bias, arguing that the evidence cited for loss aversion (the tendency to avoid losses rather than seek gains) is better explained by inertia (the tendency to avoid intervening in the course of things). [Sources: 4, 11]

In addition to a significant main effect of status quo bias in all four experiments, we show that conscientiousness, an internal locus of control, and the presence of self-interest significantly reduce susceptibility to status quo bias. Since the modulating parameters in DCM (here, the effect of rejecting the default) are expressed as a fraction of baseline connectivity, we conclude that rejecting the default induces prefrontal-STN dynamics that are largely absent when the status quo persists. Effective connectivity analysis showed that the inferior frontal cortex, a region more active during difficult decisions, exerted an increased modulating effect on the subthalamic nucleus (STN) during switches away from the status quo. [Sources: 7, 9]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.psychologytoday.com/us/blog/after-service/201609/how-powerful-is-status-quo-bias

[1]: https://thedecisionlab.com/biases/status-quo-bias/

[2]: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/status-quo-bias/

[3]: https://www.thoughtco.com/status-quo-bias-4172981

[4]: https://en.wikipedia.org/wiki/Status_quo_bias

[5]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5837876/

[6]: https://www.economicshelp.org/blog/glossary/status-quo-bias/

[7]: https://www.sciencedirect.com/science/article/pii/S0969698921003593

[8]: https://www.aeaweb.org/articles?id=10.1257/jep.5.1.193

[9]: https://www.pnas.org/content/107/13/6005

[10]: https://www.wheelofpersuasion.com/technique/status-quo-bias/

[11]: https://www.verywellmind.com/status-quo-bias-psychological-definition-4065385

[12]: https://corporatevisions.com/status-quo-bias/

System Justification

According to SJT, people are driven by a system-oriented, conscious or unconscious need to “protect, maintain, and justify existing social, economic, and political systems and arrangements” (Jost and Kay, 2010, p. 1148), which represents a distinct type of human motivation because it serves only to maintain the status quo (Jost and Banaji, 1994, p. 10). System justification theory seeks to understand how and why people provide cognitive and ideological support for the status quo, and what the social and psychological consequences of maintaining the status quo are, especially for members of disadvantaged groups (e.g., Jost & Banaji, 1994; Jost & Burgess, 2000). [Sources: 2, 4]

In system justification theory, John Jost argues that we are motivated to defend the status quo because doing so satisfies basic psychological needs for certainty, security, and social acceptance. System justification theory refers to the socio-psychological tendency to defend and bolster the status quo, that is, to see it as good, just, legitimate, and desirable. According to the original formulation of SJT (Jost and Banaji, 1994) and its subsequent refinements (e.g., Jost et al., 2004), this system-oriented motivation appears to be rooted in epistemic needs (e.g., to avoid uncertainty), existential needs (e.g., to reduce stress and threat), and relational needs (e.g., to share realities with others; Jost et al., 2008), and is most pronounced when people crave predictability and/or certainty within a strong system on which they depend (Jost, 2017). [Sources: 2, 4]

System justification theory holds that people are strongly motivated to view themselves, their social groups, and the structures that shape their lives favorably, and therefore tend to regard prevailing status hierarchies as fundamentally fair. In short, according to SIMSA, there is evidence that system-justifying effects may be an attempt by the disadvantaged to protect, defend, and strengthen their social identity. SJT postulates that an underlying ideology motivates justification of the social order in a way that fosters an often unconscious belief in their own inferiority among members of disadvantaged groups [3]. It assumes that people are motivated to defend, justify, accept, rationalize, and support the social, political, and economic systems in which they live and work (Jost, 2020). [Sources: 2, 3, 4]

Therefore, when the status quo persists (that is, economic inequality continues to grow), liberals show more zero-sum thinking, and when the status quo is challenged (that is, social inequality is reduced), conservatives show more zero-sum thinking. Similarly, studies 5A and 5B show that framing issues as challenges to the status quo enhances conservatives’ zero-sum thinking, while framing them as maintaining the existing social structure enhances liberals’. We predict that, even considering the same issues, conservatives will show more zero-sum thinking than liberals when the status quo is challenged, and the reverse when the status quo persists. Relatedly, people emphasize extreme dominance to justify punishing agentic women, and extreme weakness or passivity to justify punishing atypical men, because these gender rules legitimize and reinforce the gender status quo. [Sources: 0, 7]

Although conservatives are less likely than liberals to view the economic status quo as zero-sum, they are more likely to view challenges to the societal status quo that way [(101) = 0.61, p < 0.001]. Therefore, in Study 2, we examined the relationship between ideology and zero-sum thinking about social issues (where the status quo in the United States is often challenged) and economic issues (where the status quo is usually maintained). People often defend the existing social system even when it carries individual and collective costs, which seems self-contradictory [1]. And because system justification operates on personal fear and lack of self-esteem, a narcissist who believes he stands to gain personally, that is, has the opportunity to rise to the top, will be encouraged to defend the hierarchy [19]. [Sources: 3, 7]

Change is especially difficult where an ideological system proclaims an authoritarian culture of inequality, which, according to SJT, tends to become entrenched as a culture of justification [6]. A nation’s perceived connection with God further strengthens people’s confidence in the system’s legitimacy [7]. The ongoing debate around this phenomenon now centers on why the underprivileged so often continue to support the status quo. [Sources: 3, 4]

It is important to understand individuals’ views on the significance and scope of systems, since people can act as system justifiers to different degrees with respect to different systems [1]. Hence, the theory stresses perceivers’ motivation to maintain the social hierarchy (that is, the account is motivational, not just cognitive). On this view, perceivers see an actor who defies stereotypes (especially their status components) as violating descriptive and/or prescriptive norms; consequently, they feel entitled to unleash their own prejudices and punish the atypical actor. [Sources: 0, 3]

In contrast, the backlash-avoidance model argues that people fail to perform at their best because a justified fear of social rejection undermines their perceived entitlement and optimal self-regulatory focus (high promotion, low prevention). Existing social, economic, and political arrangements are preferred, and alternatives are disparaged, sometimes even at the expense of individual and collective interests. [Sources: 0, 1]

First, the SIH specifically proposes that violations of status, rather than violations of roles or stereotypes in general, should provoke backlash. Courtesy bias is the tendency to express a more socially acceptable opinion than one’s true opinion so as not to offend anyone. The bandwagon effect is the tendency to do (or believe) something because many other people do (or believe) the same thing. [Sources: 0, 1]

Irrational escalation is a phenomenon in which people justify increasing investment on the basis of prior cumulative investment, despite new evidence that the decision was probably wrong. These theories overlap in that both focus on how stereotype-related anxiety undermines people’s ability to perform at their best, even when performance is critical. The ambiguity effect is the tendency to avoid options for which the likelihood of a favorable outcome is unknown. [Sources: 0, 1]

 


 

##### Sources #####

[0]: https://www.sciencedirect.com/topics/psychology/systems-justification-theory

[1]: https://www.nextstageradicals.net/blog/cognitive-biases-hold-back-organisational-learning/

[2]: http://lex-biuro.pl/9zb5niis/social-justification-theory

[3]: https://mathias-sager.medium.com/why-people-justify-social-systems-that-disadvantage-them-58b9d9baf3de

[4]: https://www.frontiersin.org/articles/10.3389/fpsyg.2020.00040/full

[5]: https://www.jstor.org/stable/3792282

[6]: https://econtent.hogrefe.com/doi/10.1027/2151-2604/a000299

[7]: https://www.science.org/doi/10.1126/sciadv.aay3761

Belief Bias Effect

On this view, reasoning that relies on the believability of a conclusion rather than on logical analysis quickly, and often usefully, produces results based on pre-existing knowledge about the world rather than on the logical validity of the premises. Moreover, reasoning ability sometimes predicts a greater, not lesser, likelihood of judging arguments and evidence in line with prior beliefs (e.g., Shoots-Reinhard et al., 2021). [Sources: 1, 8]

Instruments designed to measure belief bias typically ask subjects to evaluate arguments that are or are not logically valid. In particular, the validity of an argument and the believability of its conclusion are the two key features that drive belief bias, especially in syllogistic reasoning. The match or mismatch between an argument’s validity and the believability of its conclusion (sometimes called congruence/incongruence) can also affect people’s belief bias. [Sources: 6, 8]

The reasoning-bias literature thus indicates that both novices and relevant experts, when assessing the strength of arguments, are influenced by the believability of conclusions as well as the validity of premises. If, as belief bias predicts, the believability of an argument’s conclusion affects readers’ judgments, then belief bias may pose a problem for developing adaptive thinking in statistics. To meet the learning goal of developing students’ statistical thinking, the statistics education community must consider factors such as belief bias that influence how students make decisions, reason about data, and respond to statistical inferences. [Sources: 10]

Confirmation bias occurs when people seek information to support their beliefs or hypotheses, but it can be reduced by considering alternative hypotheses and their consequences. A related misinterpretation bias leads people to read evidence in light of existing beliefs, typically evaluating confirming evidence differently from evidence that refutes their preconceptions. In a 2012 study, Adrian P. Banks of the University of Surrey explained that belief bias arises from how reasoning operates in working memory: beliefs affect the level of activation, and hence the likelihood of retrieval, thereby influencing reasoning. Beyond the syllogisms mainly used to test formal reasoning, evidence of belief bias also appears in research on informal reasoning, for example when people are asked to rate the strength, soundness, or reliability of an argument rather than whether it is logically valid. [Sources: 2, 5, 6]

Researchers usually investigate belief bias with syllogistic reasoning tasks that manipulate the believability of conclusions and the logical validity of the arguments (Dube et al., 2010; Klauer and Kellen, 2011; Trippas et al., 2013). For content-neutral syllogisms, the results are consistent with the belief-bias literature; for syllogisms with negative emotional content, however, participants were more likely to reason through invalid syllogisms rather than automatically accepting them as valid. [Sources: 2, 11]
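The validity-by-believability design described above can be made concrete with a small scoring sketch. The acceptance rates below are invented, and the two indices are one common convention for summarizing such data, not the specific analyses of the cited studies.

```python
# Hypothetical acceptance rates ("this conclusion follows") from a
# syllogistic-reasoning task crossing logical validity with the
# believability of the conclusion. Numbers are invented for illustration.

acceptance = {
    ("valid",   "believable"):   0.90,
    ("valid",   "unbelievable"): 0.60,
    ("invalid", "believable"):   0.70,
    ("invalid", "unbelievable"): 0.10,
}

# Logic index: how much validity drives acceptance.
logic_index = (
    (acceptance[("valid", "believable")] + acceptance[("valid", "unbelievable")])
    - (acceptance[("invalid", "believable")] + acceptance[("invalid", "unbelievable")])
) / 2

# Belief index: how much believability drives acceptance, regardless of logic.
belief_index = (
    (acceptance[("valid", "believable")] + acceptance[("invalid", "believable")])
    - (acceptance[("valid", "unbelievable")] + acceptance[("invalid", "unbelievable")])
) / 2

print(f"logic index:  {logic_index:.2f}")   # 0.35
print(f"belief index: {belief_index:.2f}")  # 0.45
```

A positive belief index on these (made-up) rates reflects the signature pattern: believable conclusions are accepted more often whether or not the argument is valid.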

Experimental results show that when subjects were given explicit instructions to reason logically, the belief bias effect was reduced. In another manipulation, a time-pressured group produced a higher percentage of incorrect answers, which the researchers attributed to a shift from logic-based to belief-based thinking. Subjects displayed belief bias in their tendency to reject valid arguments with unbelievable conclusions and to accept invalid arguments with believable conclusions. Likewise, when a conclusion drawn from statistics conflicted with prior opinion, subjects tended to place less confidence in the statistics. [Sources: 2, 10]

The better people perform on tests of resisting intuitive but wrong answers, the more accurately they assess the veracity of fake news, and the less likely they are to share it, even when the headlines align with their own partisan beliefs (Pennycook & Rand, 2019). In an empirical study of graduate students, Koehler (1993) likewise found that subjects gave more favorable ratings to research reports that reached conclusions they agreed with (results “consistent with beliefs”). In another study, a mediation analysis using bootstrap procedures showed that smoking light/low-tar cigarettes had a direct effect on the belief that one’s cigarettes are less harmful (b = 0.24, 95% bias-corrected bootstrap CI [0.13, 0.34], p < 0.001) and a significant indirect effect through the belief that one’s cigarettes are smoother (b = 0.32, 95% bias-corrected bootstrap CI [0.28, 0.37], p < 0.001), indicating partial mediation. These results are similar to earlier findings by Stupple et al. [Sources: 1, 3, 4, 10]
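The bias-corrected bootstrap intervals reported above belong to a general family of resampling methods. As a hedged illustration on synthetic data, here is a minimal bootstrap test of an indirect (mediated) effect; the variable names and effect sizes are invented, and a plain percentile interval is used rather than the bias-corrected variant.

```python
import numpy as np

# Percentile-bootstrap sketch of an indirect effect X -> M -> Y.
# Data are synthetic; coefficients below are chosen for illustration.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                       # e.g. light/low-tar smoking (standardized)
m = 0.5 * x + rng.normal(size=n)             # mediator, e.g. "smoother" belief
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome, e.g. "less harmful" belief

def indirect_effect(x, m, y):
    # a-path: slope of M ~ X; b-path: slope of M in Y ~ M + X.
    a = np.linalg.lstsq(np.column_stack([x, np.ones_like(x)]), m, rcond=None)[0][0]
    b = np.linalg.lstsq(np.column_stack([m, x, np.ones_like(x)]), y, rcond=None)[0][0]
    return a * b

boots = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)         # resample cases with replacement
    boots.append(indirect_effect(x[idx], m[idx], y[idx]))

lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect ~ {indirect_effect(x, m, y):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# A CI excluding zero is the usual evidence for (partial) mediation.
```

The bias-corrected variant adjusts the percentile cutoffs for skew in the bootstrap distribution, but the resampling logic is the same.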

In addition, participants were told that the letter “A” referred to a meaningless term, which may have made syllogistic reasoning easier for the older adults in this study than in previous ones. We found that prior beliefs made reasoning harder for older than for younger people in incongruent conditions, and boosted logical reasoning more for older than for younger people in congruent conditions. First, although we demonstrated an effect of age on belief bias in syllogistic reasoning, we did not fully match the educational attainment of the older and younger adults. Consistent with dual-process theories, older people are less likely to use analytical strategies and are more easily influenced by beliefs. [Sources: 3, 11]

Moreover, the influence of age on reasoning is largely due to bias arising from the conflict between belief and logic. Regarding the structure of arguments, a related bias that can interact with belief bias is figural bias: the tendency to be influenced by the order in which information is presented in the premises when solving a syllogistic reasoning problem. To minimize dissonance, people fall into confirmation bias, avoiding information that contradicts their beliefs and seeking evidence that supports them. Confirmation bias is the psychological effect whereby, when forming an opinion, an individual who favors a particular view tends to misperceive new incoming information as supporting that belief. [Sources: 4, 5, 6, 11]

Key takeaways: confirmation bias is the tendency to favor information that corroborates one’s existing beliefs or assumptions, and to seek, interpret, and remember information in ways that validate those beliefs while rejecting or ignoring conflicting evidence (American Psychological Association). People are prone to confirmation bias partly to protect their self-worth (to feel that their beliefs are correct). Related is belief perseverance, in which a person refuses to change their beliefs even when those beliefs are contradicted. [Sources: 5, 12]

From a legal standpoint, such a belief becomes a prejudice when one cannot set it aside to focus on the facts of the case at hand. When the validity of an inference contradicts belief, people are unlikely to accept the argument, and belief interferes with syllogistic reasoning (Dube et al., 2010; Trippas et al., 2013, 2018). In syllogistic reasoning, people do not strictly follow logical principles, and the reasoning process is often affected by beliefs (Evans et al., 1983, 2001). [Sources: 11, 12]

 


 

##### Sources #####

[0]: https://eric.ed.gov/?id=EJ892079

[1]: https://www.psychologytoday.com/us/blog/upon-reflection/202112/belief-bias-polarization-and-potential-solutions

[2]: https://en.wikipedia.org/wiki/Belief_bias

[3]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6990430/

[4]: https://www.science.gov/topicpages/b/belief+bias+effect

[5]: https://www.simplypsychology.org/confirmation-bias.html

[6]: https://effectiviology.com/belief-bias/

[7]: https://www.sciencedirect.com/science/article/pii/S1877042815009295

[8]: https://en.shortcogs.com/bias/belief-bias

[9]: https://www.verywellmind.com/cognitive-biases-distort-thinking-2794763

[10]: https://www.tandfonline.com/doi/full/10.1080/10691898.2009.11889501

[11]: https://www.frontiersin.org/articles/479235

[12]: https://lisbdnet.com/what-is-belief-bias-and-what-is-the-best-way-to-avoid-belief-bias-when-making-decisions/

[13]: https://www.forbes.com/sites/stephaniesarkis/2019/05/26/emotions-overruling-logic-how-belief-bias-alters-your-decisions/

Illusory Truth Effect

The illusory truth effect (also known as the validity effect, the truth effect, or the reiteration effect) is the tendency to believe false information after repeated exposure. It is a well-studied and well-replicated psychological phenomenon: if a lie is repeated often enough, people begin to believe it. Psychologists attribute the effect to the fact that we more easily process information we have encountered many times before. That ease creates a sense of fluency, which we then (erroneously) interpret as a signal that the content is true. [Sources: 0, 4, 16]

In other words, say something often enough and people start to believe it. If you tell people that a statement is false right after they hear or read it for the first time, the effect diminishes. And if you repeat a false statement too often, people may view the repetition as an attempt to persuade them and become less likely to believe the statement you are selling. [Sources: 7, 15]

Likewise, you cannot simply repeat a weak argument to people who are listening carefully; there, the illusion of truth does not work. Several studies have shown that people are more influenced by opinions and persuasive messages when they hear them more than once. Remarkably, when evaluating truth, people rely on whether information is consistent with their understanding and whether it feels familiar. Because of the way our minds work, what is familiar feels true; hence the illusion of truth. [Sources: 11, 12]

Familiar things take less effort to process, and that feeling of ease subconsciously signals truth; this is called cognitive fluency. In other words, statements are easier to believe when they are easy to process. With repetition, a statement becomes easier for the mind to process than competing ideas that are not repeated. [Sources: 7, 11]

Repeated statements are easier to process than new, non-repeated ones, leading people to believe that repeated claims are more likely true. Although some earlier studies included both true and false statements, research has shown that repetition increases the perceived truth of previously unfamiliar true and previously unfamiliar false statements by an equal amount (e.g., Hasher et al.), regardless of the objective facts. Psychologists use the truth effect, or more precisely the illusory truth effect, to describe this phenomenon, in which a repeated claim is judged more likely to be true simply because it has been repeated. [Sources: 2, 6, 10]

When a “fact” is palatable and repeated enough times, we tend to believe it, no matter how false it is. We can even persuade ourselves through repetition, taking the illusion of truth to another level. In other words, repetition makes almost any statement seem more true, whether or not it actually is. [Sources: 5, 8, 11]

One of the most striking features of the illusory truth effect is that it can occur even when a claim is known to be false, or with actual “fake news” headlines: wholly invented stories that, on reflection, people probably know are untrue. The effect tends to be stronger when statements concern a subject we believe we know well, and when statements are ambiguous enough that they are not obviously true or false at first glance. And although the perceived reliability of the source increases perceived truth, as one would expect, the effect persists even when sources are considered unreliable or when the source of the claim is unclear. [Sources: 9]

Psychological research has thus shown that any process that increases familiarity with false information, through repeated exposure or other means, can increase our perception of it as true. A recent study reported by the British Psychological Society on the illusory truth effect (Brashier, Eliseev, & Marsh, 2020) again found a tendency to treat statements as true on the basis of repetition. These findings indicate that, regardless of our particular cognitive profile, we tend to believe repeated information. The 2015 study mentioned above likewise showed that for many people, repeated statements are easier to process than new information, even when people know better. [Sources: 0, 3, 9, 15]

For example, a frequently repeated statement may well have been endorsed by several people, which can be a genuinely useful cue to its truth. Likewise, someone who relies more on intuition and wants quick, accurate-feeling answers may be more likely to use the fact that information has been repeated as a cue to its truthfulness. Research published in the Journal of Experimental Psychology has shown that the truth effect can sway even participants who initially knew the correct answer but were led to believe otherwise through repetition of a lie. [Sources: 0, 16]

After reproducing these results in another experiment, Fazio and her team attributed this phenomenon to processing fluency, that is, how easily people comprehend a statement. The researchers concluded that repetition is a powerful technique for increasing the perceived validity of statements, and that the illusion of truth is an effect that can be observed without even questioning a statement’s factual basis. The effect was first named and identified in a 1977 study by researchers at Villanova University and Temple University, in which participants judged a series of trivia statements as true or false. [Sources: 16]

A week later, participants saw the same trivia statements along with new ones and were asked to rate the veracity of each. As in a typical illusory truth study, half of the statements were repeated from the earlier stage of the experiment and half were new. As pre-registered, we quantified perceived truth as the proportion of “true” responses, compared between new and repeated items. [Sources: 1, 6]

As described above, given the psychometric properties of the task, we would expect an inverted-U relationship between the size of the illusory truth effect (the difference in perceived truth between repeated and new items) and overall perceived truth (averaged over repeated and new items; e.g., Chapman & Chapman, 1988). [Sources: 1]
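The repeated-versus-new measure described above can be sketched in a few lines. The ratings below are invented for a single hypothetical participant; the scoring, the proportion of “true” responses per condition, follows the description in the text.

```python
# Minimal sketch of how a truth effect is scored: perceived truth is the
# proportion of "true" responses, computed separately for repeated and
# new statements. Ratings are invented for illustration.

ratings = [
    # (statement_id, repeated?, judged "true"?)
    ("s1", True,  True),
    ("s2", True,  True),
    ("s3", True,  False),
    ("s4", True,  True),
    ("s5", False, True),
    ("s6", False, False),
    ("s7", False, False),
    ("s8", False, True),
]

def prop_true(items, repeated):
    subset = [judged for _, rep, judged in items if rep == repeated]
    return sum(subset) / len(subset)

p_repeated = prop_true(ratings, repeated=True)   # 0.75
p_new = prop_true(ratings, repeated=False)       # 0.50

# This participant's illusory truth effect is the repeated-minus-new
# difference in perceived truth.
truth_effect = p_repeated - p_new
print(f"perceived truth: repeated={p_repeated:.2f}, new={p_new:.2f}, effect={truth_effect:.2f}")
```

A positive difference indicates that mere repetition raised this participant's "true" judgments.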

Some studies have even tested how many times a message must be repeated for the illusion-of-truth effect to peak. Repeated often enough, information can come to be perceived as reliable even when its sources are not credible. In experimental settings, people also misattribute their previous exposure to stories, believing they read the news from another source when in fact they saw it in an earlier part of the study. [Sources: 5, 11, 15]

So if you hear something repeatedly, you are more likely to believe it. And even if you cannot believe it, you will rate the likelihood that it is true higher. [Sources: 7]

 


 

##### Sources #####

[0]: https://digest.bps.org.uk/2019/06/26/higher-intelligence-and-an-analytical-thinking-style-offer-no-protection-against-the-illusory-truth-effect-our-tendency-to-believe-repeated-claims-are-more-likely-to-be-true/

[1]: https://link.springer.com/article/10.3758/s13423-019-01651-4

[2]: https://artsandculture.google.com/entity/illusory-truth-effect/m0yqm57h?hl=en

[3]: https://www.jdsupra.com/legalnews/look-out-for-the-illusory-truth-effect-38750/

[4]: https://bigthink.com/neuropsych/repeating-lies-people-believe-true-studies/

[5]: https://fs.blog/illusory-truth-effect/

[6]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8116821/

[7]: https://www.philosophytalk.org/blog/say-it-enough-they%E2%80%99ll-believe-it

[8]: http://econowmics.com/illusion-of-truth-effect-you-repeat-i-believe/

[9]: https://www.psychologytoday.com/us/blog/psych-unseen/202001/illusory-truth-lies-and-political-propaganda-part-1

[10]: https://gizmodo.com/these-statements-are-both-true-but-one-is-easier-to-be-1687482541

[11]: https://www.spring.org.uk/2021/07/illusion-of-truth.php

[12]: https://visioncareconnect.healthcare/the-vblog-home/the-illusory-truth-effect

[13]: https://www.adcocksolutions.com/post/what-is-the-illusory-truth-effect

[14]: https://center-divorce-mediation.com/illusory-truth-effect-divorce-and-mediation/

[15]: https://www.psychologicalscience.org/news/repeating-misinformation-doesnt-make-it-true-but-does-make-it-more-likely-to-be-believed.html

[16]: https://en.wikipedia.org/wiki/Illusory_truth_effect

Rhyme As Reason Effect

This seems to be because rhymes are easier for the brain to remember and process. The rhyme-as-reason effect is a cognitive bias that makes people more likely to remember, repeat, and believe statements that rhyme than statements that do not. People tend to accept rhyming statements as true over statements that express the same meaning but do not rhyme. [Sources: 4, 7, 10]

The rhyme-as-reason effect (or Eaton-Rosen phenomenon) is a cognitive bias in which a statement or aphorism is judged more accurate or truthful when it is rewritten to rhyme. In experiments, subjects rated rhyming and non-rhyming variants of statements, and tended to rate the rhyming versions as more truthful (with meaning held constant). In the first experiment, the influence of rhyme as reason quickly disappeared when people were simply asked to distinguish the pleasantness of the rhyme from the substance of the actual statement. Once they were aware of the rhyme, they stopped automatically associating the sound of a sentence with its truthfulness. [Sources: 0, 1, 9, 11]

“Until we are explicitly aware of it,” McGlone says, “rhyme may even cause us to be more kind to a statement that we would otherwise disagree with.” What is perhaps hidden in the idiom is that rhyme can pass for reason. Not all aphorisms rhyme, but evidence suggests that this cognitive bias – the rhyme-as-reason effect – causes the aphorisms that do rhyme to acquire extra perceived value. [Sources: 0, 10, 16]

The main cognitive mechanism proposed to explain the rhyme-as-reason effect is the Keats heuristic, a mental shortcut by which people base their judgment of whether a statement is true on its aesthetic qualities. McGlone and Tofigbakhsh attribute the effect to this heuristic [McGlone 1999]: we humans confuse the validity of a sentence or statement with its aesthetic qualities. Since rhyme is an aesthetic quality, it lends a rhyming sentence greater perceived value. There is a reason rhymes are widely used in advertising and branding: they are a powerful influence. [Sources: 7, 13, 16]

We see it in action every day: catchy rhyming phrases that lodge in our brains and influence our behavior. But it wasn’t just repetition that made the phrase “strong and stable” stand out and stick in people’s minds: it was the use of consonance and something called the Keats heuristic. The two words share the same hard “st” sound at the beginning – a type of rhyme called consonance. [Sources: 1]

Aphorisms are short, catchy sayings or remarks that we usually accept as true or wise. However, the notorious vagueness of aphorisms makes it especially difficult to determine their truth conditions. If the persuasiveness of an aphorism depended critically on the clarity of its truth conditions, we should find it surprising that people put any faith in such statements. Attributing a claim to a highly reliable or prestigious source can lead people to approve of it, especially when they lack the knowledge to assess the underlying claims (Asch, 1952; Saadi and Farnsworth, 1934). [Sources: 8, 16]

Not only is this aphorism familiar to American college students, but these students believe it is a more accurate description of mate choice than novel statements that imply the same claim (for example, that people with different interests and personalities tend to be attracted to one another; McGlone and Necker, 1998). [Sources: 8]

People tend to react sharply positively to things familiar to them, which may explain why aphorisms persist even when they sound archaic. We asked people to rate the comprehensibility and apparent accuracy of unfamiliar aphorisms presented in their original rhyming form. An investigation of this question was reported in 2000 by Matthew S. McGlone and Jessica Tofigbakhsh [McGlone 2000], who found that rhyming aphorisms are rated as more accurate than their modified, non-rhyming versions. Some later articles repeated the term “Eaton-Rosen phenomenon,” with the result that those sources were added as citations in support of the term, even though they were all published after its initial use on Wikipedia. [Sources: 7, 8, 11, 16]

Companies use catchy phrases and rhyming slogans to influence consumers. Marketers have used this effect for decades, and it works even at the most basic levels. [Sources: 10, 14]

Likewise, when a commitment is costly and creates problems for people (such as being arrested), others notice it. This works well with charity races, but the same commitment effect has also been used to protect the environment. Climate protesters, for example, have been sharply attacked for their own past behavior, such as flying or using plastic. [Sources: 14]

In contrast, in communities where denial is the norm for one reason or another, the social cost of not denying is very high. For example, consider the well-worn observation that opposites attract. In conclusion, rhymes do affect human nature, and many sayings and maxims are harmless or can even have positive effects. [Sources: 8, 10, 14]

So if you need to convince people of something, consider putting your message in rhyme. Whatever the reason, it seems that if you want people to believe you, you should use rhyme, but not insist on it. In addition, remember that familiarity with a sentence makes it easier for people to remember and believe it. This means that, as far as is reasonable, you should repeat the sentence in its rhyming form to increase the chance that people accept it. [Sources: 4, 7, 11]

Thus, a rhyme such as “get up and move” becomes associated with the severity and completeness of an operation, which may not correspond to the actual situation. [Sources: 16]

We concluded that such judgments can show whether, and to what extent, particular features of linguistic structure contribute to poetic effect. In two experiments, we investigated the influence of deviant and parallel linguistic features on readers’ grammatical and literary-aesthetic evaluations of single sentences. We also examined the role that poetic form can play in people’s perception of the accuracy of aphorisms as descriptions of human behavior. In Experiment 2, PSAs were rated positively in both their rhymed and non-rhymed versions. [Sources: 2, 3, 5]

However, in some cases you may want to use additional techniques to reduce the impact of this bias. This thinking error can lead organizations to undervalue data analytics and the people who perform that function. Cognitive biases are systematic patterns of deviation from rationality that make us irrational in how we seek, evaluate, interpret, judge, use, and remember information, and in how we make decisions. [Sources: 7, 10, 16]

Rhyme, the stuff of musical songs and mischievous limericks, is often not taken seriously. It’s hard to tell right now – my social media bubble seems to be talking about it, but not always for the right reasons. [Sources: 0, 1]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.psychologytoday.com/us/articles/199809/sounds-true-me

[1]: https://medium.com/@chrislynch_mwm/strong-and-stable-a-lesson-in-the-use-of-consonance-rhyme-as-reason-and-the-keats-heuristic-ba7d2340d863

[2]: https://pubmed.ncbi.nlm.nih.gov/23841497/

[3]: https://www.sciencedirect.com/science/article/pii/S0304422X99000030

[4]: https://steemit.com/life/@jevh/17-september-today-s-term-from-psychology-is-rhyme-as-reason-effect

[5]: https://journals.sagepub.com/doi/10.1111/1467-9280.00282

[6]: https://hyperleap.com/topic/Rhyme-as-reason_effect

[7]: https://effectiviology.com/rhyme-as-reason/

[8]: https://pdfslide.net/documents/the-keats-heuristic-rhyme-as-reason-in-aphorism-interpretation.html

[9]: https://nlpnotes.com/2014/04/07/rhyme-as-reason-effect/

[10]: https://www.boloji.com/blog/2449/rhyme-as-reason

[11]: https://gizmodo.com/why-rhyming-phrases-are-more-persuasive-1524861998

[12]: https://www.semanticscholar.org/paper/Rhyme-as-reason-in-commercial-and-social-Filkukova-Klempe/022e8b9c5c09d88614758949bc0034e1ad142bbb

[13]: https://eqsales.com.au/blog/rhymeasreason

[14]: http://o-behave.tumblr.com/post/152018329652/bias-of-the-month-rhyme-as-reason-effect

[15]: https://schwa.consulting/rhyme-as-reason-or-why-rhymes-chime

[16]: https://chacocanyon.com/pointlookout/191211.shtml

Subjective Validation Bias

Subjective validation, sometimes called the personal validation effect, describes the tendency of people to believe or accept an idea or statement when it is presented to them in a personal and positive way. People influenced by subjective validation will perceive two unrelated events as related because their personal beliefs demand that they be related. Subjective validation is the process of accepting words, initials, statements, or signs as accurate because one finds them personally significant and meaningful. Basically, subjective validation is a confirmation bias toward information that bolsters personal self-esteem. [Sources: 0, 5, 7]

Subjective validation also involves selective memory, because the subject is unlikely to find meaning in every statement the reader makes. Subjective validation explains why many people are drawn to the apparent accuracy of pseudoscientific personality profiles. The overall effect of subjective validation shapes how a person assesses the accuracy of a reader’s statements: people tend to believe or accept an idea or statement when it is presented to them personally or positively. [Sources: 0, 3]

However, when Allison DuBois, the psychic behind the hit TV show Medium, was tested, the test did not use controls that would rule out subjective validation as an explanation for the high score given by the woman who evaluated DuBois’s readings. As a second source of bias in responses, we looked at measures of under- or overconfidence, that is, the difference between subjective and objective measures of confidence. [Sources: 0, 12]

Subjective validation deceives everyone, from the housewife who thinks her happiness depends on her blood type or horoscope, to the FBI agent who believes his criminal profiles are accurate, to the therapist who believes her Rorschach interpretations are insightful portraits of psychological disorders. For example, someone who loves bacon and comes across an article about how good bacon is for you will tend to believe it, because it “confirms” the case for eating more bacon. [Sources: 0, 3]

However, the presence of potential biases in such self-assessment tools can call into question the validity of the measured constructs. One view treats these response styles and confidence biases as undesirable; another holds that these “biases” can potentially be used as interesting indicators of key characteristics of the respondent. [Sources: 12]

The Barnum effect was first studied in 1948 by Professor Bertram R. Forer, who conducted the original experiment on this cognitive effect; hence its interchangeable name, the Forer effect. The variables used in the response-style calculations and the confidence difference are described in the next section. They are based on a survey built on the expectancy-value framework [33], administered at the beginning of the course, which generates various expectation ratings (such as perceived cognitive competence or the expectation of not encountering learning difficulties) and personal assessments. [Sources: 9, 12]

Cognitive biases affect the likelihood that visitors will share or talk about your product or service. There are many other cognitive biases to consider, but these are some of the most common and most relevant to marketers and SEOs. A cognitive bias is a tendency to think in a certain way, one that often leads to deviations from rational and logical decisions. If you do qualitative research, the questions you ask are subject to these biases. [Sources: 2]

Every person has their own biases, and it is dangerous to assume that everyone thinks the same way. Anchoring, for example, is the tendency to rely too heavily on a single trait or piece of information when making decisions (usually the first piece of information we receive on the issue). [Sources: 2]

Once you learn about cognitive biases, you can start to account for them and limit their impact on your visitors’ thinking and your own. Subjective confidence is then defined as the expected value of a learning outcome based on survey responses; both response styles and confidence differences are a potential source of bias in such data. More generally, a cognitive bias is a pattern of misjudgment, often triggered by a particular situation. [Sources: 2, 11, 12]

The more often a person sees your name, logo, or call to action, the more likely they are to buy from you. A related effect is that someone’s judgment of the logical strength of an argument can be influenced by the believability of its conclusion. And apparent effects can be illusory: conversions could increase due to an external factor such as a PPC campaign or a seasonal change. For example, a person hears that their favorite pastime turns out to be a great form of fitness training. [Sources: 2, 8]

This reliable but surprising effect offers some explanation for the high level of belief in the paranormal in society. Had such a comparison been made, there would have been a baseline against which to judge the statements’ 73% accuracy. [Sources: 0, 6]

Identifying a misjudgment, or rather a deviation in judgment, requires a standard for comparison. [Sources: 6, 10, 11]

 

— Slimane Zouggari

 

##### Sources #####

[0]: http://skepdic.com/subjectivevalidation.html

[1]: https://onlinelibrary.wiley.com/doi/abs/10.1002/9781119165811.ch96

[2]: https://cxl.com/blog/cognitive-biases-in-cro/

[3]: https://zims-en.kiwix.campusafrica.gos.orange.com/wikipedia_en_all_nopic/A/Subjective_validation

[4]: https://www.ijunoon.com/dictionary/Subjective+validation/

[5]: https://www.edunation19.in/2020/12/what-is-subjective-validation.html

[6]: https://www.oxfordclinicalpsych.com/view/10.1093/med:psych/9780198530114.001.0001/med-9780198530114-chapter-2

[7]: https://artsandculture.google.com/entity/subjective-validation/m03hfvwc?hl=en

[8]: https://www.alleydog.com/glossary/definition.php?term=Subjective+Validation

[9]: https://thedecisionlab.com/biases/barnum-effect/

[10]: https://www.researchgate.net/publication/328120833_Subjective_Validation_100_of_the_Most_Important_Fallacies_in_Western_Philosophy

[11]: https://en-academic.com/dic.nsf/enwiki/2246528

[12]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7292385/

Automation Bias

If we think of the human brain as a computer, a cognitive bias is essentially an error in the code that causes us to misperceive input or produce illogical output. [Sources: 2]

But there are other types of biases, not necessarily cognitive; for example, there is social protection theory, one of the most popular socio-psychological biases. In addition, there are cognitive phenomena that are not strictly biases in themselves, or rather are more like a web of related biases woven together, such as cognitive dissonance, the mental discomfort that arises when conflicting ideas or beliefs are held in our minds. [Sources: 2]

In other cases, cognitive bias can be used to understand the personal reasoning patterns and motivational processes that underlie a person’s decision-making behavior. Research on cognitive bias patterns in visualization likewise shows that the way a person processes information and makes decisions differs across specific situations. [Sources: 4]

Automatic reasoning promotes uncritical acceptance of suggestions and sustains strong biases. It has been shown experimentally that this type of control creates a so-called automation bias, in which operators trust computer solutions as correct and ignore, or do not look for, conflicting information. Cummings studied automation bias while researching an interface designed to monitor and allocate GPS-guided Tomahawk missiles in flight. [Sources: 7]

Level 4 is unacceptable because it does not support confirmation of the target, and a short veto window will increase automation bias and leave no room for doubt or reflection. There must also be a means to quickly stop or abort the attack. An ordered list of targets is especially problematic, as automation bias tends to favor the target at the top of the ranking when operators are not given enough time and space to think. [Sources: 7]

Automation bias is the tendency to over-rely on automated systems, which can allow incorrect automated information to override correct decisions. Research showing how people trust automated systems over their own judgment has already given us a rich history of automation bias. When we consider how the rapid deployment of artificial intelligence and automation will amplify these distortions in the future, we begin to understand the risks of allowing machines to guide human thinking. [Sources: 1]

Every day, systematic errors in our thought processes affect the way we live and work. In the name of self-awareness, here is a closer look at three newly described biases that we are most likely to display in the modern world. Automation bias refers to a specific class of errors that humans tend to make in highly automated decision-making contexts, where many decisions are handled by automated tools (such as computers) and a human actor is largely present just to monitor the actions taken. [Sources: 3, 5]

The following are excerpts from some representative examples of this research program. A number of recent studies on automation bias, treating automation as a heuristic replacement for vigilant information seeking and processing, have examined omission and commission errors in highly automated decision-making environments. Most research on this phenomenon has been conducted in single-person settings. This study examined automation bias in two-person crews versus solo performers under different instruction conditions. Training that targeted automation bias and its associated errors succeeded in reducing commission errors, but not omission errors. [Sources: 5]

However, they found that task difficulty did not affect task performance. We found evidence that participants made omission errors, failing to detect 28.7% more prescription errors when the CDS issued no warnings, compared with a control condition without CDS. Interestingly, while participants were found to rely too heavily on the automation, there was also evidence of disagreement with the CDS advice provided to them. This problem is further exacerbated by the “looking but not seeing” or inattentional blindness effect, in which participants made automation bias errors despite having access to sufficient information to judge that the automation was wrong [12, 13]. [Sources: 0]

However, research on automation bias shows that this extra layer of protection can weaken or, in the worst case, without proper oversight, lead clinicians to substitute the CDS for their own efforts to detect errors. In addition, cognitive strategies such as asking people to consider the opposite outcome, rather than only the expected one, have been found effective in reducing anchoring bias (Mussweiler et al., 2000). A large body of social psychology research has shown that many cognitive biases and the errors they produce can be corrected by establishing accountability before decisions are made, so that decision makers know they will need to give convincing reasons for their choices and for how they made them. Although humans are called “smart animals,” Bayesian analysis experiments in the 1950s and 1960s showed that human judgment can be biased and lead to wrong decisions (Edwards et al., 1963; Ellis, 2018). [Sources: 0, 4, 6]

From the above, it should be clear that there are lessons to be learned from both the psychology of human thinking and the literature on human-machine interaction. This study found a risk of automation bias in electronic prescribing among senior medical students who will soon enter clinical practice as junior physicians. [Sources: 0, 7]

Knowing this list of biases will help you make better decisions and understand when you’ve gone astray. Most people don’t know how many types of cognitive biases there are – Wikipedia lists 184. We found 50 types of cognitive biases that arise almost every day in small discussions on Facebook, in horoscopes and on the world stage. [Sources: 1, 2]

Along with their definitions, these are real-life examples of cognitive bias, from the subtle groupthink that sabotages your meetings with management to the anchoring effect that makes you spend too much money in a store during a sale. Cognitive bias is widely recognized as something that makes us human. Cognitive biases are a psychological explanation for patterns in human thinking and rational judgment (Haselton et al., 2015) associated with remembering, evaluating, processing information, and making decisions (Hilbert, 2012; Tversky and Kahneman, 1974). Similar patterns of biased thinking, termed cognitive biases, have been reported in psychology and behavioral economics research. [Sources: 2, 3, 4]

This cognitive bias, identified in 2011 by Michael Norton (Harvard Business School) and colleagues, is related to our tendency to place more value on what we help create. If we’re to counter this cognitive bias, finding a new favorite TV series on platforms like Netflix can take good old-fashioned human curiosity. The Google Effect, also known as digital amnesia, describes our tendency to forget information that can be easily accessed on the Internet. [Sources: 3]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://bmcmedinformdecismak.biomedcentral.com/articles/10.1186/s12911-017-0425-5

[1]: https://www.paconsulting.com/insights/what-is-automation-bias-how-to-prevent/

[2]: https://www.titlemax.com/discovery-center/lifestyle/50-cognitive-biases-to-be-aware-of-so-you-can-be-the-very-best-version-of-you/

[3]: https://www.visualcapitalist.com/50-cognitive-biases-in-the-modern-world/

[4]: http://www.braindigitallearning.org/article?num=N0230110302

[5]: https://lskitka.people.uic.edu/styled-7/styled-14/

[6]: https://journals.sagepub.com/doi/abs/10.1177/154193129604000413

[7]: https://www.icrac.net/icrac-working-paper-3-ccw-gge-april-2018-guidelines-for-the-human-control-of-weapons-systems/

Contrast Effect

A contrast effect is an enhancement or diminishment of normal perception, cognition, or related performance as a result of immediately preceding or simultaneous exposure to a stimulus of lesser or greater value in the same dimension. It is a cognitive bias that distorts our perception of something when we compare it to something else, exaggerating the differences between the two. For example, people feel better about their performance in a given field when surrounded by others who perform relatively poorly in that field than when surrounded by others who perform relatively well. [Sources: 0, 1, 2]

The contrast effect is usually classified as one of two main types of contextual effects, which are cognitive biases that occur when comparison with background information affects our assessment of certain stimuli. Contrast effects are ubiquitous in the perception, cognition and performance of humans and animals. This concept differs from contrast, which itself refers to the difference in color and brightness of an object from its environment or background. Explaining to yourself why the comparison presented to you is inappropriate, such as focusing on the absolute price of a product rather than its relative price, can help reduce the likelihood that you will experience a contrast effect. [Sources: 1, 2, 6]

The assimilation effect is related to the contrast effect; the difference is that assimilation decreases the perceived difference between the compared objects, whereas contrast increases it. Contrast effects can shape not only visual qualities such as color and brightness, but other kinds of perception as well, including the perception of weight. When available information is used to construct the representation of the target, an assimilation effect results, while available information that enters the mental representation of the reference standard leads to contrast effects. “Simultaneous contrast” is the term used when the stimuli are presented at the same time, whereas “successive contrast” applies when the stimuli are presented one after the other. [Sources: 1, 4, 8]

Successive contrast occurs when perception of a currently displayed stimulus is modulated by a previously displayed stimulus. The 17th-century philosopher John Locke noticed the contrast effect: he observed that the same water can feel hot or cold depending on whether the hand touching it had previously been in cold or hot water. As a complement to the definition above, the term also describes how psychological closeness in the social environment influences current self-expression and self-knowledge. [Sources: 2, 4, 8]

However, it is widely known that, in addition to edge contrast, other factors involving more complex, higher-level image analysis, which cannot be explained by simple local interactions, can affect the perception of brightness and color in a global context. [Sources: 5]

In the early 20th century, Wilhelm Wundt identified contrast as a fundamental principle of perception, and the effect has since been validated in many different fields. Comparing different colors and shades can cause a misperception of contrast. Simultaneous contrast, identified by Michel Eugène Chevreul, refers to how the colors of two different objects affect each other. While these are very different visual impairments, they both affect the automaticity of walking and indicate that designers must consider the associated cognitive factors that accompany complex interactions of visual parameters. [Sources: 3, 4, 6, 8]

Whenever researchers conduct attitude polls and questionnaires, they must take the processes of judgment and the resulting assimilation effects into account. People with simulated or real low vision demonstrate a relatively intact ability to judge room size and to update their position estimates after walking simple paths through visual space, even under conditions of severe degradation of visual acuity and contrast sensitivity. Research on ramps and steps has determined that enhancing the contrast of step transitions with directional illumination aids detection, but that providing high-contrast texture on these surfaces degrades detection. [Sources: 3, 8]

Some of the NIBS guidelines relate to visual accessibility and the perception of local and global characteristics of spatial behavior, and can draw on basic scientific approaches such as the methods described above. In general, for object detection, contrast is important to blurred vision, but more subtly, the contrast between object and background depends on the placement of the lighting. A related cognitive bias causes people to respond differently to a particular choice depending on how it is presented. As other cues become more available with increasing viewing duration, the perception of brightness and color shifts to become more consistent with those cues. [Sources: 3, 5, 7]

Another cognitive bias occurs when people place too much emphasis on a single aspect of an evaluation, producing errors in predicting the usefulness of a future outcome. Whether a piece of music is perceived as good or bad may depend on whether the music heard just before it was unpleasant or pleasant. There is also a bias in human information processing: the tendency to insufficiently revise one’s beliefs when new evidence is presented. [Sources: 4, 7]

— Slimane Zouggari

 

##### Sources #####

[0]: https://findanyanswer.com/what-is-contrast-effect-in-psychology

[1]: https://effectiviology.com/contrast-effect/

[2]: https://psychology.fandom.com/wiki/Contrast_effect

[3]: https://cognitiveresearchjournal.springeropen.com/articles/10.1186/s41235-020-00265-y

[4]: https://psynso.com/contrast-effect/

[5]: https://jov.arvojournals.org/article.aspx?articleid=2191951

[6]: https://zims-en.kiwix.campusafrica.gos.orange.com/wikipedia_en_all_nopic/A/Contrast_effect

[7]: https://mycognitivebiases.com/?p=1943

[8]: https://theartofservice.com/contrast-effect.html

Scope Neglect

Fortunately, there are many reasons to believe we can take advantage of scope insensitivity, because people have already discovered ways to make the most of other forms of non-extensibility. Scope neglect, or scope insensitivity, is a cognitive bias that occurs when the valuation of a problem fails to scale multiplicatively with its size. If we did not neglect scope, we would be more rational, and therefore perhaps happier and healthier, living in a world where everyone has more of what they want; without scope insensitivity it would not be so difficult to convince people to help distant others who need more, rather than nearby others who need less. Here I will look at one such use case: using scope insensitivity to prepare for high-stakes situations in low-stakes ones. [Sources: 2, 3]

The more anxious, depressed, or generally frustrated you are, the more likely you are to treat low-stakes situations as high-stakes, and thus you will have even more opportunities to practice scope-insensitivity judo than people who are calmer. Indeed, studies of scope neglect in which the quantitative variation is large enough to elicit any sensitivity show a small linear increase in willingness to pay corresponding to an exponential increase in scope. When you notice such a situation, consider whether the stakes are really high or whether it only feels that way because of scope insensitivity. Extension neglect is a type of cognitive error that occurs when sample size is ignored in evaluating a study in which sample size is logically relevant. [Sources: 2, 5, 6]
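The relationship described above, a small linear rise in willingness to pay against an exponential rise in scope, amounts to valuation that is roughly linear in the logarithm of scope. A minimal sketch, with entirely hypothetical numbers chosen only to illustrate the shape of the effect:

```python
import math

def insensitive_wtp(scope, base=50.0, slope=10.0):
    """Toy model of scope-insensitive valuation (hypothetical parameters):
    each tenfold increase in scope adds only a constant amount to WTP."""
    return base + slope * math.log10(scope)

# A hundredfold increase in the problem's size barely moves the valuation.
for scope in (2_000, 20_000, 200_000):
    print(f"scope {scope:>7} -> WTP ${insensitive_wtp(scope):.0f}")
```

Under this toy model, multiplying the scope by 100 raises willingness to pay by only 2 × slope, mirroring the near-flat valuations reported in scope-neglect studies.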

Two other hypotheses offered to explain scope neglect are the purchase of moral satisfaction (Kahneman & Knetsch, 1992) and the "good cause dump" (Harrison, 1992). The most widely accepted explanation, however, is the affect heuristic: people respond to a mental prototype of the problem, which can make their reaction disproportionate to its actual size. [Sources: 4, 5]

Baron and Greene (1996) found no effect of a tenfold change in the number of lives saved. [Sources: 0, 5]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://link.springer.com/article/10.1023/A:1007835629236

[1]: https://medium.com/@shravanshetty/scope-neglect-e76bfc623286

[2]: https://mapandterritory.org/scope-insensitivity-judo-a07f9166f165

[3]: http://zims-en.kiwix.campusafrica.gos.orange.com/wikipedia_en_all_nopic/A/Scope_neglect

[4]: https://www.thehindu.com/opinion/op-ed/what-is-scope-neglect-in-psychology/article24463617.ece

[5]: https://www.briangwilliams.us/global-catastrophic-risks/scope-neglect.html

[6]: https://en.wikipedia.org/wiki/Extension_neglect

Neglect Of Probability

Probability neglect, a type of cognitive bias, is the tendency to completely disregard probability when making decisions under uncertainty, and it is one of the easiest ways in which people regularly violate normative rules of decision-making. When probability is neglected, people focus on the "worst case" and ignore the question of how likely the worst case actually is, an approach that can also lead to overregulation. There are many related ways in which people violate normative rules for reasoning about likelihood, including hindsight bias, neglect of prior base rates, and the gambler's fallacy. [Sources: 0, 1, 4, 5]

Cass Sunstein, who served as a senior regulatory official under Barack Obama, says that people exhibit probability neglect when confronted with vivid images of terrorism: when their emotions are intensely engaged, their attention fixes on the worst outcome rather than on how unlikely it is to occur. This bias can lead subjects to decisively violate expected utility theory, especially when a decision involves an outcome of much lower or higher utility that carries only a small probability. [Sources: 3, 5]
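The gap between the normative calculation and probability neglect can be sketched in a few lines. The probability and dollar figures below are invented purely for illustration:

```python
def expected_loss(p, loss):
    """Normative approach: weight the worst case by its probability."""
    return p * loss

def neglected_loss(p, loss):
    """Probability neglect: react to the worst case as if it were certain."""
    return loss

p_attack = 1e-6           # hypothetical one-in-a-million risk
damage = 1_000_000_000    # hypothetical worst-case damage, in dollars

print(round(expected_loss(p_attack, damage)))  # 1000
print(neglected_loss(p_attack, damage))        # 1000000000
```

The normatively weighted loss is a thousand dollars; a decision-maker who neglects probability responds as if the full billion-dollar loss were on the table, a million-fold overweighting.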

However, this bias differs from those in that the actor does not misuse probability but ignores it entirely. In a 2001 article, Sunstein addressed the question of how the law should respond to probability neglect. Again, the subject ignores probability when making a decision, treating every possible outcome as equally likely in his reasoning. [Sources: 0, 4]

Probability is more likely to be neglected when the outcomes evoke strong emotion. In this respect, probability neglect is similar to the neglect of prior base rates. A related error is the subadditivity effect: the tendency to judge the probability of a whole event to be less than the sum of the probabilities of its parts. [Sources: 2, 4, 5]

While government policy on potential hazards should focus on statistics and probabilities, government efforts to raise awareness of these hazards must focus on worst-case scenarios to be most effective. He noted that methods are available, such as Monte Carlo analysis, for studying probability, but that all too often "the probability continuum is ignored." Rolf Dobelli described the US Food Act of 1958 as a "classic example" of probability neglect. [Sources: 4]

Bias blind spot: the tendency to see oneself as less biased than other people, or to be better able to recognize cognitive biases in others than in oneself. Conjunction fallacy: the tendency to assume that specific conditions are more probable than general ones. Base rate fallacy, or base rate neglect: the tendency to ignore base rate information (general statistical information) and focus on specific information (information relevant only to the particular case). [Sources: 2]

Gambler's fallacy: the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. All of these biases reflect a tendency to focus on irrelevant information when making a decision. Berkson's paradox: the tendency to misinterpret statistical experiments involving conditional probabilities. [Sources: 2]
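A quick simulation, a minimal sketch using only Python's standard library, shows why the gambler's fallacy is a fallacy: for independent fair-coin flips, the empirical frequency of heads immediately after a streak of three heads is still about one half.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate a long sequence of independent fair-coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(100_000)]

# Collect the outcome of every flip that follows three heads in a row.
next_after_streak = [
    flips[i]
    for i in range(3, len(flips))
    if flips[i - 3] and flips[i - 2] and flips[i - 1]
]

freq = sum(next_after_streak) / len(next_after_streak)
print(round(freq, 2))  # close to 0.5: past streaks do not change the odds
```

If the gambler's fallacy were correct, the frequency after a streak would dip well below 0.5; independence guarantees it does not.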

Irrational escalation: a phenomenon in which people justify increased investment in a decision on the basis of prior cumulative investment, despite new evidence that the decision was probably wrong. Choice-supportive bias: the tendency to remember one's choices as better than they actually were. [Sources: 2]

When national security is at stake, cost-benefit analysis is much less promising because it is usually impossible to assess the likelihood of an attack. The availability heuristic, widely used by ordinary people, can lead to highly exaggerated perceptions of risk, as serious incidents lead citizens to think the risk is much greater than it actually is. Civil libertarians overlook this point, believing that the meaning of the Constitution does not change in the face of intense public fear. [Sources: 1]

Déformation professionnelle is a French term for the tendency to look at things from the point of view of one's own profession rather than from a broader perspective. [Sources: 0]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.linkedin.com/pulse/cognitive-biases-every-risk-manager-must-know-part-2-sidorenko-crmp

[1]: https://muse.jhu.edu/article/527368/summary

[2]: https://behavioralgrooves.com/behavioral-science-glossary-of-terms/

[3]: https://www.cambridge.org/core/books/risk/quantifying-uncertainty/B41C7A211929DBA2B5CB4CEA4E3A66A1

[4]: https://en.wikipedia.org/wiki/Neglect_of_probability

[5]: https://nlpnotes.com/2014/03/22/neglect-of-probability/