Hard–Easy Effect

The “hard-easy effect” is a well-documented cognitive bias in the calibration of self-confidence: the tendency to overestimate the likelihood of success on tasks perceived as difficult and to underestimate it on tasks perceived as easy. The effect is generally treated as part of the broader phenomenon of overconfidence bias. While there is strong evidence of bias in judgment, it seems unlikely that other biases alone fully explain the systematic nature of the effect. Few studies have tried to determine the cause of the hard-easy effect, though it may also be related to other cognitive biases. [Sources: 1, 2, 4, 6]

Thus, if lottery X promises high payoffs, all greater than or equal to an external target, the agent tends to be underconfident on tasks perceived as easy and overconfident on tasks perceived as difficult. Most researchers who study the hard-easy effect assume that it is driven by other biases. Building on the hard-easy effect within overconfidence research, this article investigates how overconfidence bias affects the forecast accuracy of financial market participants. [Sources: 4, 7, 8]

Conversely, if lottery X promises negative outcomes that are all less than or equal to an external reference point m, the agent is prone to underconfidence on tasks perceived as difficult and overconfidence on tasks perceived as easy. Cognitive biases range from the bandwagon effect, where claims are accepted simply because many other people accept them, to confirmation bias, where people believe information that confirms what they already think or believe. Bikeshedding (the law of triviality) describes our tendency to spend a disproportionate amount of time on trivial matters. Many studies have shown that overconfidence is one of the cognitive biases that prevent people from making good decisions. [Sources: 2, 3, 4, 7]

The relationships between country, gender, science education, cognitive bias, and confidence bias are discussed. We need to examine these biases in order to overcome them and to make sure we think as clearly and critically as possible when making decisions and processing information. We can also get to know them, and even appreciate that we have at least some ability to make sense of the universe with our own mysterious brains. While it is impossible to list every potential cognitive bias at play in every life decision, there are steps you can take to train your brain to cope with these phenomena at a more general level. [Sources: 2, 3, 11]

If you revisit this page from time to time to refresh your memory, the spacing effect will help you retain these thought patterns and keep your prejudices and naive realism in check. Believing that you are rational despite the obvious irrationality of others is known as the bias blind spot. Some of what we remember later makes all of the systems above even more biased and more detrimental to our mental processes. During the experiment, the authors measured and recorded the accuracy of the participants’ predictions and each participant’s level of confidence. [Sources: 8, 10, 11]
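The calibration measure used in such experiments can be sketched in a few lines of Python. The data below are hypothetical, invented purely for illustration: each trial pairs a stated confidence with whether the answer was correct, and the overconfidence score is mean confidence minus proportion correct, computed separately for hard and easy item sets.

```python
# Sketch of a calibration analysis of the kind described above.
# The trial data are hypothetical; real studies elicit a confidence
# rating (e.g. 0.5-1.0 for two-alternative questions) with each answer.

def overconfidence(trials):
    """Mean stated confidence minus proportion correct.
    Positive values indicate overconfidence, negative underconfidence."""
    mean_conf = sum(c for c, _ in trials) / len(trials)
    accuracy = sum(ok for _, ok in trials) / len(trials)
    return mean_conf - accuracy

# (confidence, answered_correctly) pairs for two invented item sets
hard_items = [(0.9, True), (0.8, False), (0.85, False), (0.9, True)]
easy_items = [(0.7, True), (0.75, True), (0.8, True), (0.7, True)]

print(f"hard: {overconfidence(hard_items):+.3f}")  # positive -> overconfident
print(f"easy: {overconfidence(easy_items):+.3f}")  # negative -> underconfident
```

With numbers like these, the hard set yields a positive score and the easy set a negative one, which is exactly the signature of the hard-easy effect.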

While the latter is by far the more difficult task, you will eventually have to do it, and it cannot be done halfway. Becoming a “thought leader,” if you will, is beneficial in many ways, including earning the trust of the people you want to reach and building authority in the space where you gained your experience. [Sources: 3]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095920413

[1]: https://www.sciencedirect.com/science/article/pii/S0749597896900746

[2]: https://eric.ed.gov/?id=EJ1207685

[3]: https://www.verizon.com/business/small-business-essentials/resources/learning-learn-fighting-cognitive-biases-030221622/

[4]: https://thedecisionlab.com/biases/hard-easy-effect/

[5]: https://www.semanticscholar.org/paper/Naive-empiricism-and-dogmatism-in-confidence-a-of-Juslin-Winman/3e195dae70996ff707ffeaa2941b4ab057219d03

[6]: https://www.springerprofessional.de/en/a-target-based-foundation-for-the-hard-easy-effect-bias/11035798

[7]: https://link.springer.com/chapter/10.1007/978-3-319-46319-3_41

[8]: https://ro.uow.edu.au/buspapers/1601/

[9]: https://www.alleydog.com/glossary/definition.php?term=Hard-Easy+Effect

[10]: https://www.businessinsider.com/cognitive-biases-2015-10

[11]: https://betterhumans.pub/cognitive-bias-cheat-sheet-55a472476b18

Form Function Attribution Bias

Belief bias: the tendency for someone’s assessment of the logical strength of an argument to be influenced by the believability of its conclusion. Bandwagon effect: the tendency to do (or believe) something because many other people do (or believe) the same thing. Experimenter or expectation bias: the tendency of experimenters to believe, validate, and publish data that agree with their expectations for an experimental outcome, and to disbelieve, discard, or downweight data that appear to contradict those expectations. [Sources: 9]

Form-function attribution bias arises in human-robot interaction, where people make systematic errors when interacting with robots. Humans’ expectations and perceptions of a robot are based on its appearance (form) and the functions they attribute to it, and these attributes do not necessarily reflect the robot’s true capabilities. [Sources: 1, 9]

The term form-function attribution bias (FFAB) refers to the cognitive bias that occurs when humans are prone to perceptual errors that result in a skewed interpretation of a robot’s functionality. Instead of objectively assessing a robot’s capabilities, humans apply a cognitive label using the information available to them through visual perception. Such cognitive biases often determine how a person interacts with the world around them. In the business world, understanding these biases and how they affect your behavior is critical to becoming a better manager. [Sources: 1, 7]

The fundamental attribution error is a cognitive bias that causes people to underestimate the influence of situational factors on human behavior and to overestimate the influence of dispositional (personality) factors. The ultimate attribution error is a cognitive bias that makes people more likely to attribute positive behaviors to situational factors when performed by an out-group member rather than an in-group member, and more likely to attribute negative behaviors to dispositional factors when performed by an out-group member rather than an in-group member. Actor-observer asymmetry is a cognitive bias that causes people to attribute their own behavior to situational causes and the behavior of others to character factors. [Sources: 0]

Self-serving bias refers to people’s tendency to attribute success to internal factors and failure to external factors. When a person exhibits self-serving bias, attributing positive outcomes to themselves and negative outcomes to external forces, it helps them maintain positive self-esteem. [Sources: 3, 4]

For example, you might encourage a person exhibiting this bias to recall similar situations in which they behaved like the person they are judging because of situational factors. In other words, people have a cognitive bias toward believing that a person’s actions depend on what “kind” of person they are, rather than on the social and environmental forces acting on them. Jones and Harris (1967) hypothesized that people would attribute apparently freely chosen behavior to disposition (personality) and apparently chance-directed behavior to the situation. [Sources: 0, 6]

The theory was formed as a comprehensive explanation of how people interpret the basis of behavior in human interactions; however, studies point to cultural differences in attribution biases between people in Eastern collectivist societies and Western individualist societies. Given these large differences in the weight given to internal versus external attributions, it is not surprising that people in collectivist cultures tend to exhibit the fundamental attribution error and correspondence bias less often than people in individualist cultures, especially when situational causes of behavior are made salient (Choi, Nisbett, & Norenzayan, 1999). The fundamental attribution error (also known as correspondence bias or the over-attribution effect) is the tendency to over-emphasize dispositional or personality-based explanations for behavior observed in others while underestimating situational explanations. [Sources: 4, 5, 6]

In social psychology, attribution is the process of inferring the causes of events or behavior. In real life, we all make attributions every day, usually without being aware of the underlying processes and biases that lead to our conclusions. Over the course of a typical day, for example, you are likely to make repeated attributions about your own behavior and the behavior of those around you. [Sources: 8]

The researchers in these studies also found that self-serving bias is influenced by a person’s age and by whether they are explaining a success or a failure. Other factors may shape self-serving bias differently in men and women, not least because studies of gender differences in attribution have produced conflicting results. [Sources: 3]

External attributions ascribe behavior to situational forces, while internal attributions ascribe it to personal characteristics and traits. The second form of group attribution bias is closely related to the fundamental attribution error: people attribute a group’s behavior and attitudes to every member of that group, regardless of the degree of disagreement or the decision-making methods within it. Like self-serving bias, group attribution can also serve a self-enhancing function, making people feel better by generating favorable explanations for their in-group’s behavior. People in collectivist communities recognize that individual behavior is intertwined with the larger whole. [Sources: 3, 5, 8]

Attribution theory also explains why different people may interpret the same event differently, and what factors contribute to attribution biases. In psychology, attribution bias is a cognitive bias referring to the systematic errors people make when they evaluate, or try to find reasons for, their own behavior or that of others. It is important to note that such debiasing methods are primarily intended for people who exhibit the fundamental attribution error inadvertently, as a cognitive bias. [Sources: 0, 4]

Each of these biases describes a specific tendency that people show when reasoning about the causes of different behaviors. Researchers speculate that hindsight bias leads people to mistakenly believe that victims should have been able to foresee future events and take steps to avoid them. Hindsight bias, sometimes called the “knew-it-all-along” effect, is the tendency to view past events as having been predictable. Research has shown a link between hostile attribution bias and aggression: people who are more likely to interpret others’ behavior as hostile are also more likely to behave aggressively. [Sources: 4, 8, 9]

Hostile attributions of intent are discussed in relation to the development and maintenance of aggressive behavior in children. When it comes to other people, we tend to attribute causes to internal factors, such as personality characteristics, and to ignore or minimize external variables. This error is closely related to another attributional tendency, correspondence bias, which occurs when we attribute behavior to people’s internal characteristics even in heavily constrained situations. The differences in attributions made in the two situations were remarkable. [Sources: 2, 5, 8]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://effectiviology.com/fundamental-attribution-error/

[1]: https://waseda.pure.elsevier.com/en/publications/ffab-the-form-function-attribution-bias-in-human-robot-interactio

[2]: https://www.sciencedirect.com/science/article/abs/pii/S1359178917302720

[3]: https://www.healthline.com/health/self-serving-bias

[4]: https://en.wikipedia.org/wiki/Attribution_bias

[5]: https://courses.lumenlearning.com/suny-social-psychology/chapter/biases-in-attribution/

[6]: https://www.simplypsychology.org/fundamental-attribution.html

[7]: https://online.hbs.edu/blog/post/the-fundamental-attribution-error

[8]: https://www.verywellmind.com/attribution-social-psychology-2795898

[9]: https://uxinlux.github.io/cognitive-biases/

Exaggerated Expectation Bias

While these studies show convincingly that, in such experimental paradigms, fearful people often exaggerate the risk of adverse events, it is less clear whether this skewed risk perception concerns (a) the encounter with fear-related stimuli itself or (b) the unpleasant consequences such an encounter could have. In addition, it is unclear whether distorted risk perceptions are specific to the individual’s fear or generalize to all negative events. In an online survey (N = 630), we assessed the perceived risk of encountering fear-related stimuli and the expected negative outcomes of such encounters. [Sources: 9]

Dissociations between covariation bias and expectancy bias for fear-relevant stimuli have been reported, as has biased anticipation of threat and treatment outcome in patients with panic disorder and agoraphobia. Perceptual confirmation bias is the tendency for expectations to influence perception. [Sources: 9, 11]

Confirmation bias is the tendency to process information by seeking out, or interpreting, information consistent with one’s existing beliefs. This biased approach to decision-making is largely unintentional and often results in conflicting information being ignored. Existing beliefs can include expectations in a given situation and predictions about a particular outcome. [Sources: 0]

When interacting with people whom they believe to have certain personalities, perceivers will ask questions that selectively support their existing beliefs. Additionally, by being treated according to those expectations, the target may inadvertently change their behavior to meet them, providing further support for the perceiver’s confirmation bias. Defeatism can be encouraged by using absolute words such as “always” or “never” to create expectation bias. Exaggerated biases can even reverse the direction of an association (for example, a true OR > 1 is estimated as < 1).2 There are several classifications of bias. [Sources: 0, 4, 6]

Sometimes the term bias is also used to refer to a mechanism that causes a lack of internal validity.1 Biases can be classified by the direction of the change they induce in a parameter (e.g., the odds ratio (OR)). Bias toward the null gives estimates closer to the null value (for example, an OR lower and closer to 1), while bias away from the null gives estimates more extreme than the true value. One such bias occurs in longitudinal studies that relate baseline measurements of a continuous variable (e.g., diastolic blood pressure (DBP)) to an outcome (e.g., stroke). [Sources: 6]
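As a toy illustration of this direction-of-bias classification, the Python sketch below (all counts, sensitivity, and specificity values are invented for the example) shows one classic mechanism, nondifferential misclassification of exposure, pulling an observed odds ratio toward the null value of 1.

```python
# Toy illustration (hypothetical numbers) of bias toward the null:
# nondifferential misclassification of exposure pulls an OR toward 1.

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: rows exposed/unexposed, columns cases/controls."""
    return (a * d) / (b * c)

# True counts: exposed cases, exposed controls, unexposed cases, unexposed controls
a, b, c, d = 80, 40, 20, 60
print(f"true OR = {odds_ratio(a, b, c, d):.2f}")

# Measure exposure with 80% sensitivity / 90% specificity in BOTH groups
# (nondifferential): some exposed are recorded as unexposed and vice versa.
sens, spec = 0.8, 0.9
a2 = a * sens + c * (1 - spec)   # recorded exposed cases
c2 = c * spec + a * (1 - sens)   # recorded unexposed cases
b2 = b * sens + d * (1 - spec)   # recorded exposed controls
d2 = d * spec + b * (1 - sens)   # recorded unexposed controls
print(f"observed OR = {odds_ratio(a2, b2, c2, d2):.2f}")  # closer to 1
```

Because the measurement error is the same in cases and controls, the observed OR lands between the true value and 1, i.e., it is biased toward the null rather than away from it.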

In a matched study with a matched analysis, there is no bias if the risks of exposure-related disease are constant over time and no individuals are selected for more than one case. If the frequency of exposure in the control group is higher than expected, the result is bias toward the null. [Sources: 6]

A study that examined the two biases together showed that encounter-related and outcome-related biases were correlated: arachnophobic women overestimated the risk of a spider being in the room and also exaggerated the likelihood of negative consequences of such an encounter. [Sources: 9]

Confirmation bias also manifests in people’s tendency to look for positive instances. It matters because it can lead people to cling to false beliefs, or to give information that supports their beliefs more weight than the evidence warrants. Confirmation bias is the tendency to seek, interpret, focus on, and remember information in a way that confirms one’s preconceptions. [Sources: 0, 11]

Backfire effect (a form of confirmation bias): reacting to disconfirming evidence by strengthening one’s previous beliefs. Observer-expectancy effect: when a researcher expects a certain outcome and unconsciously manipulates the experiment or misinterprets the data to find it (see also the subject-expectancy effect). Experimenter or expectation bias: the tendency of experimenters to believe, validate, and publish data that agree with their expectations for an experimental outcome, and to disbelieve, discard, or downweight data that appear to contradict those expectations. [Sources: 2, 8, 11]

Outcome bias: the tendency to judge a decision by its eventual outcome rather than by the quality of the decision at the time it was made. Exaggerated expectation: the tendency to expect or predict more extreme outcomes than those that actually happen. Pessimism bias: the tendency of some people, especially those with depression, to overestimate the likelihood of bad things happening to them. [Sources: 2, 8]

Pseudocertainty effect (from prospect theory): the tendency to make risk-averse choices when the expected outcome is positive, but risk-seeking choices to avoid negative outcomes. Projection bias: the tendency to overestimate how much our future selves will share our current preferences, thoughts, and values, leading to suboptimal choices. The bias blind spot, ironically, is when we think other people are more biased than we are. [Sources: 2, 3, 11]

We found seven cognitive biases that affect classroom learning, independent study, and how many students feel. These thinking biases can influence us more than we imagine, especially since many of us may suffer from the bias blind spot. Understanding expectation bias is critical to being able to think clearly. [Sources: 3, 4]

This article identifies and describes some “stinking thinking” (ST) strategies that manipulators use to influence or control behavior, or to shape others’ ideas and preconceived expectations. ST techniques can be used in media, organizational management, or advertising campaigns to create biases and influence personal decisions, relationships, and judgments by deliberately pre-forming expectations. The phrase “stinking thinking” may sound a little derogatory or even funny, but these well-worn strategies are designed to negatively affect your thoughts and behavior, using words and phrases to shape your expectations and steer your actions. [Sources: 4]

Dismissing positive evidence by insisting that a positive example “doesn’t count” is another way to negatively influence and lower expectations. Polarizing perceptions to establish black-and-white thinking, with no room for shades of gray, is another expectation-bias tactic. [Sources: 4]

For example, people with an optimism bias tend to overestimate the likelihood of good events, while people with a self-serving bias tend to remember the past in a way that reflects better on them than reality did. On the other hand, this bias can make students worry about upcoming exams and exaggerate how awful they will be by imagining the worst-case scenario, causing unnecessary stress and anxiety about exams and exam preparation. [Sources: 3]

Unfortunately, this can lead students to expect feedback to have a greater impact on their exam performance than it actually does. When this happens, students may put too little effort into revision and find themselves unprepared for exams. In other words, even if a search strategy cannot guarantee the content of my beliefs (since there is no way of knowing in advance whether the evidence obtained will favor or disfavor my preferred hypothesis), my beliefs may be systematically less accurate because I have not sought the evidence one would expect to be most informative. [Sources: 3, 10]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.britannica.com/science/confirmation-bias

[1]: https://pubmed.ncbi.nlm.nih.gov/22122235/

[2]: https://cognitive-liberty.online/list-of-cognitive-biases/

[3]: https://blog.innerdrive.co.uk/7-cognitive-biases-holding-your-students-back

[4]: https://www.psychologytoday.com/us/blog/the-media-psychology-effect/201311/stinking-thinking-and-expectation-bias

[5]: https://link.springer.com/10.1007/978-94-007-0753-5_2219

[6]: https://jech.bmj.com/content/58/8/635

[7]: https://hbr.org/2014/05/is-the-possibility-bias-keeping-us-from-having-crazy-fun

[8]: https://uxinlux.github.io/cognitive-biases/

[9]: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01676/full

[10]: https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/confirmation-bias

[11]: https://en.wikipedia.org/wiki/List_of_cognitive_biases

End-Of-History Illusion Bias

I know that I have grown since childhood, but I cannot imagine changing in the future, because who I am, or who I think I am, feels fully formed, stable, and unchangeable. Young, middle-aged, and old people alike think that they have changed a great deal in the past but will change relatively little in the future. Most people believe that their personality, work situation, and values will not change much going forward, even though they have changed a lot before. [Sources: 4, 8, 10]

According to the end-of-history illusion (EOHI; Quoidbach, Gilbert, & Wilson, 2013), people underestimate the amount of future change they will experience. Their research was cited in a recent NPR article, “You can’t see it, but you’ll be a different person in 10 years,” which notes that people tend to underestimate how much they will change. Jordi Quoidbach, Daniel Gilbert, and Timothy Wilson report showing that people of all ages underestimate future changes in their personalities, preferences, and values. [Sources: 0, 6, 9]

Ideally, psychologists would ask people to predict the magnitude of their future changes and then follow them over time to compare actual changes with predictions. To work around this problem, the researchers instead asked more than a thousand people to rate either how much they had changed as a person or how much they expected to change. The results show that predictors report less expected change than the personality change actually experienced and reported by reporters. [Sources: 4, 6]

Moreover, compared with the changes actually reported by people ten years older, forecasters consistently underestimated how much their values and preferences would change. Quoidbach, Gilbert, and Wilson concluded from these data that people not only underestimate future change but also make poorer decisions as a result. It is also possible that participants overestimated their past changes, giving the impression that they underestimated future ones. [Sources: 4, 12, 14]

People also like to believe that they know themselves well, and the possibility of future change can threaten that belief. The illusion refers to the assumption that one will not change in the future and will essentially stay the same. Even more interestingly, the illusion is the same for people of all ages. [Sources: 8, 12]

Across ages 18 to 68, people of all ages described more change over the past 10 years than they predicted for the next 10. In a series of experiments, the researchers measured the personalities, values, and preferences of more than 19,000 people between the ages of 18 and 68, asking them to rate how much they thought they had changed over the past decade and how much they would change over the next. [Sources: 2, 5, 7]

In an article published in the journal Science, the researchers report a study in which participants were asked to rate how much their personalities, tastes, and values had changed over the past decade, and how much they expected them to change over the next. Their discovery rests on a series of studies showing that people tend to underestimate how much they will change in the future, even though they know how much they have changed in the past. For example, a 25-year-old predictor forecasts how much his personality will change by the time he turns 35. [Sources: 4, 12]

Regardless, the magnitude of the end-of-history illusion did not vary with age: forecasters consistently predicted that their personalities would change less over the next decade than reporters of the corresponding older age said theirs had. Regression analyses showed the same result, with the end-of-history illusion also present in predictions of changing preferences. The expected future change among people aged X was smaller than the past change reported by people aged X + 10. Unsurprisingly, younger participants reported more change over the previous decade than older respondents. [Sources: 3, 9, 14]

Conversely, the analysis showed that the actual rate of change did not vary in this way, while the research by Quoidbach, Gilbert, and Wilson shows that we believe the rate of change in our lives slows down with age. For core values, the researchers found that the end-of-history illusion also holds; while its magnitude decreased with age, it was still present across all age groups. Quoidbach’s team recognized the memory problem, so they examined data from a separate longitudinal study that actually tracked people over time (once in 1995-96 and again in 2004-06) to measure real personality change. [Sources: 6, 13, 14]

The evidence for the EOHI comes from several studies that compared people between 18 and 68 years old who were randomly assigned to either (a) report how much their personality, values, or preferences had changed over the past 10 years, or (b) predict how much they would change over the next 10 years. According to the EOHI, although people of all ages report past personal change, they mistakenly predict that future change will be small. The studies involved more than 19,000 people between the ages of 18 and 68, and the illusion persisted from adolescence to retirement. Based on these experiments, people seem to see the present as a turning point at which they have finally become who they will be for the rest of their lives. [Sources: 3, 7, 9]
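The cross-sectional comparison behind this evidence can be sketched in a few lines of Python. All numbers below are invented for illustration; the EOHI prediction is simply that past change reported by people aged X + 10 exceeds future change predicted by people aged X.

```python
# Hypothetical sketch of the cross-sectional EOHI comparison: predicted
# change reported by people aged X is compared with past change reported
# by people aged X + 10. All numbers are invented for illustration.

# age -> (mean predicted change, next 10 yrs; mean reported change, past 10 yrs)
survey = {
    20: (0.35, 0.60),
    30: (0.25, 0.50),
    40: (0.20, 0.40),
    50: (0.15, 0.30),
}

for age in sorted(survey):
    older = age + 10
    if older not in survey:
        continue  # no matching older cohort to compare against
    predicted = survey[age][0]    # forecasters aged X
    reported = survey[older][1]   # reporters aged X + 10
    gap = reported - predicted    # positive gap -> change was underestimated
    print(f"age {age}: predicted {predicted:.2f}, "
          f"reported at {older}: {reported:.2f}, gap {gap:+.2f}")
```

A consistently positive gap across cohorts is the cross-sectional signature of the illusion; note that, as the text cautions, the same pattern could also arise if reporters overestimated their past change.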

This “end-of-history illusion” has practical implications: it leads people to pay extra for future opportunities to indulge their current preferences. Again, people reported more change in their values than they expected going forward. For example, when asked about changes in musical taste, people reported significant changes over the past decade while downplaying the changes expected in the coming years. It would be easier to establish the end-of-history illusion with data about real change than by relying on participants’ memories of their past selves. [Sources: 5, 6, 10]

The problem is that we make decisions based on our current understanding of reality, not on what we know might happen in the future. This is one reason so many relationships fail: people are stuck in the present and unable to plan for future challenges and growth opportunities. If we make decisions without thinking about our future selves, we may end up living reactive lives without adequate planning or opportunities for personal growth. [Sources: 4, 7]

Let’s look at why this matters in our professional and personal lives, and at three simple steps for escaping this illusion and shaping our future better. The end-of-history illusion is a psychological illusion in which people of all ages believe that they have experienced significant personal growth and changes in taste up to the present, but will not grow or mature substantially in the future. [Sources: 4, 14]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.insidehighered.com/blogs/stratedgy/%E2%80%9Cend-history-illusion%E2%80%9D

[1]: https://www.newristics.com/selling-to-those-resisting-change.php

[2]: https://www.science.org/doi/10.1126/science.1229294

[3]: https://www.nytimes.com/2013/01/04/science/study-in-science-shows-end-of-history-illusion.html

[4]: https://nesslabs.com/end-of-history-illusion

[5]: https://www.psychologytoday.com/us/blog/reality-play/201301/the-end-history-illusion

[6]: https://digest.bps.org.uk/2013/01/14/the-end-of-history-illusion-illusion/

[7]: https://www.theknowledge.io/end-of-history/

[8]: http://econowmics.com/end-of-history-illusion/

[9]: https://www.sciencedirect.com/science/article/abs/pii/S009265661930090X

[10]: https://marcfbellemare.com/wordpress/8260

[11]: https://medium.com/science-and-values/the-end-of-history-illusion-fa7226657185

[12]: https://longnow.org/ideas/02013/01/23/time-and-the-end-of-history-illusion/

[13]: https://www.linkedin.com/pulse/essay-16-end-history-illusion-why-we-so-bad-future-jeff-guthrie

[14]: https://en.wikipedia.org/wiki/End-of-history_illusion

Hot-Cold Empathy Gap Bias

Participants in both the hot and cold phases were asked to state the minimum amount of money they would need to stop smoking “right now.” Participants in the cold state named lower monetary compensation for delaying smoking than participants in the hot state. [Sources: 6]

The effects of the empathy gap have been investigated in the field of sexual decision-making, where young people in an unaroused “cold state” failed to predict that, when “hot,” they would be more likely to make risky sexual decisions (for example, not using a condom). The hot-cold empathy gap is a cognitive bias in which people underestimate the influence of visceral drives on their attitudes, preferences, and behavior. More generally, an empathy gap is a cognitive bias that makes it hard for people to understand mental states different from their current one, or to reason about how such states affect judgment and decision-making. The cold-to-hot empathy gap occurs when someone in a cold (emotionally neutral) state finds it difficult to understand someone in a hot (emotional) state, usually because they underestimate the influence of emotions, impulses, and urges on the person in the hot state. [Sources: 3, 6, 10]

For example, a person who is currently calm may experience a cold-to-hot empathy gap when trying to predict how they will behave in a situation where they are upset. Conversely, someone who is currently passionate about a topic may experience a hot-to-cold empathy gap when trying to gauge how people who are less passionate feel about it. An empathy gap can prevent us from seeing that someone does not necessarily feel about us the way we feel about them, or lead us to underestimate how much our feelings for someone have influenced our past judgments. Failing to minimize the empathy gap can lead to negative outcomes in medical settings (for example, when a physician needs to accurately diagnose a patient’s physical pain) [2] and at work (for example, when an employer must assess an employee’s need for bereavement leave). [Sources: 3, 5, 6, 10]

When people are in an emotionally "cold" state, they cannot fully appreciate how a "hot" state will affect their preferences and behavior. As a result, people in a "cold" state underestimate how strongly visceral drives will influence their decision-making in future situations, and they do not take enough precautions to avoid entering a "hot" state. [Sources: 2]

Previous research has shown that people mispredict their own behavior and preferences across affective states. The same bias applies at the interpersonal level; for example, people who are not emotionally aroused underestimate the effect of arousal on other people's behavior. In short, the hot-cold empathy gap occurs when people underestimate the impact of visceral states on behavior and judgment. [Sources: 2, 9, 12]

The hot-cold empathy gap is thus the tendency to underestimate how much our preferences change between hot and cold emotional states. In some cases, however, we can harness the energy of a hot state to do something good that still serves us once we have calmed down and returned to a cold state. For example, if you are feeling calm and collected right now, your thought processes may be too "cold" to remind you of the power of intense emotion, passion, or pain. [Sources: 1, 4, 11]

Generally speaking, people underestimate the impact of pain more when they are in a cold state than in a hot one. Parents who report many negative emotions, by contrast, are more likely to overestimate their children's pain. [Sources: 4, 6]

Helping consumers understand the hot-cold gap can show them why they fail to follow through on their behavioral intentions. In the examples above, cold-state decision-making helps control and direct hot-state behavior. Conversely, decisions made in a hot state are usually unwise: the immediate email reply to a coworker who infuriated you, the impulse purchase made without checking your bank account, or the decision to end a relationship in the heat of an argument. [Sources: 11]

When we are in one state, we cannot effectively predict our behavior in another. Brain scans show that when we make hypothetical decisions, or decisions far removed from the here and now, our brains do not engage in the same way. When faced with real choices, the brain's reward centers, including the amygdala, become more active (Kang and Camerer 2013). [Sources: 4, 5]

The empathy gap makes it difficult to understand the point of view, or predict the actions, of someone in a different state of mind, whether that person is someone else or our past or future self. The main reason people experience empathy gaps is that human cognition is state-dependent: it is heavily influenced by our current mental state, which makes it difficult to correctly assess other mental states or predict their influence. [Sources: 5, 10]

The hot-cold empathy gap was proposed by George Loewenstein, a well-known and influential figure in behavioral economics. Loewenstein argued that "affect has the capacity to transform us, in a profound sense, as human beings; in different affective states it is almost as if we were different people." Through a series of studies he conducted or participated in, Loewenstein demonstrated the empathy gap in responses to pain, addiction, thirst, and fear. The neuroimaging work was an important step because no previous study of the hot-cold empathy gap had compared brain activity across hot and cold states or provided direct neural evidence consistent with the affective difference. The fMRI evidence of differential activity in the insula and amygdala offers new support for a biological encoding in the brain of the gap between hot (real choice) and cold (hypothetical choice). The finding that visceral responses are underestimated also suggests potential debiasing strategies for future studies of hypothetical choice. [Sources: 7, 8]

Further understanding of how the brain makes hot and cold (real and hypothetical) decisions may help individuals and society make difficult choices more effectively. For clarity and brevity, debiasing techniques are usually described with an emphasis on one main type of empathy gap: the intrapersonal, prospective cold-to-hot gap, that is, our difficulty predicting what we will think and do in a future situation that is more emotional than the current one. Whether you are trying to understand someone else's point of view or to put yourself in your own future shoes, make an active effort to visualize mental states different from your current one. Talk to your child about situations in which you miscalculated and how your emotional state played a role. [Sources: 4, 5, 7, 10]

If we no longer have feelings for someone, we may underestimate how much those feelings influenced our past judgments. We have all encountered people who cannot reconcile who they believe themselves to be with how they behave in a state of intense emotion. The same gap appears when we struggle to remember or understand our own actions in different states. [Sources: 0, 3, 5]

— Slimane Zouggari

 

 

##### Sources #####

[0]: https://www.npr.org/2019/11/27/783495595/in-the-heat-of-the-moment-how-intense-emotions-transform-us

[1]: https://www.research-live.com/article/opinion/bias-in-the-spotlight-the-hotcold-empathy-gap/id/4013735

[2]: https://longevity.stanford.edu/hot-cold-empathy-gaps-and-medical-decision-making/

[3]: https://nextbigwhat.com/the-empathy-gap-why-we-fail-to-understand-different-perspectives/

[4]: https://parentingscience.com/empathy-gap/

[5]: https://nesslabs.com/empathy-gap

[6]: https://en.wikipedia.org/wiki/Hot-cold_empathy_gap

[7]: https://www.frontiersin.org/articles/10.3389/fnins.2013.00104/full

[8]: https://thedecisionlab.com/biases/empathy-gap/

[9]: https://pubmed.ncbi.nlm.nih.gov/16045419/

[10]: https://effectiviology.com/empathy-gap/

[11]: https://www.researchworld.com/bias-in-the-spotlight-hot-cold-empathy-gap/

[12]: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/hot-cold-empathy-gap/

Declinism Bias

Declinism has been described as a "trick of the mind" and as "an emotional strategy, something comforting to curl up in" when the present seems unbearably dark. One contributing factor is the so-called "reminiscence bump": older people tend to best remember the events that happened to them between roughly ages 10 and 30. Because of this predisposition to rosy retrospection, older people are more likely to view their past in an unrealistically positive light, which makes the present and future seem inferior by comparison. [Sources: 0, 4]

Basically, as people get older they tend to experience fewer negative emotions and are more likely to remember the positive rather than the negative. While the positivity effect and the reminiscence bump concern memory, negativity bias concerns the here and now; combined with rosy retrospection (the tendency to view the past more favorably and the future more negatively), these biases feed declinism. Declinist belief can be traced back to Edward Gibbon's The History of the Decline and Fall of the Roman Empire, published between 1776 and 1788, in which Gibbon argued that the Roman Empire collapsed through the gradual loss of civic virtue among its citizens, who became lazy, corrupt, and inclined to hire foreign mercenaries to manage the national defense. [Sources: 0, 7, 9]

This thinking is closely related to declinism: the bias-driven belief that a society or institution is always getting worse over time, no matter what the facts say. Because rosy retrospection makes people view the past in an unrealistically positive way, the present suffers by comparison, which can lead people to believe that things are deteriorating. Another phenomenon that can feed declinist beliefs is pessimism bias, a cognitive bias that causes people to overestimate the likelihood of negative events and underestimate the likelihood of positive ones, especially when assuming that future events will turn out badly. In addition, negative biases and negative interpretations of events tend to be more pronounced in people with depression. [Sources: 4, 9, 10]

Just as there is evidence for a positivity effect, there are also studies pointing to a negativity bias: the idea that emotionally negative events have a greater impact on our thoughts and behavior than comparable positive events. Psychologists note that people tend to dwell more on negative thoughts and emotions than on positive ones. [Sources: 6, 9]

And if you are depressed, research such as a 2011 study suggests that negativity bias can affect you even more strongly. Negativity bias probably explains much of our perception that things are worse than they are at any given time, especially now in America. We think the present is terrible because of negativity bias, we think the past was better because of the reminiscence bump and the positivity effect, and we think the future will be worse because of declinism. Add something like confirmation bias, and we often become so entrenched in our beliefs that we lose the ability to break the endless chain of negativity. [Sources: 7, 10]

Confirmation bias is when we hold a pre-established worldview and then seek out facts to support it. It thus not only leads us to reject positive facts, but sometimes to reject negative information as well. Of all the recognized psychological factors that shape our perceptions and beliefs, confirmation bias is the most important to understand, because it is the most likely to distort our decision-making and the one we can most readily work to overcome. These biases affect every aspect of our lives and can be especially detrimental to business decision-making, a risk discussed in this article. [Sources: 5, 7]

Cognitive bias was first described by the Israeli psychologists Amos Tversky and Daniel Kahneman in the 1970s. They characterized it as a kind of thinking error that distorts how we process information about the world, affecting the accuracy of our decisions and the rationality of our judgments. To simplify the complex, multifaceted world we live in and make decisions relatively quickly, our brains create mental shortcuts, and these shortcuts give rise to cognitive biases. In total, more than 180 cognitive biases have been catalogued that hinder us from processing data, thinking critically, and perceiving reality. Understanding them will not fix the world, but it can help you see it in a different light, or at least understand why things seem so frightening. [Sources: 5, 10, 11]

Avoiding the declinism trap does not mean adopting a positive outlook on everything; rather, it means understanding the nuances of how you look at the past and anticipate the future. A little healthy nostalgia won't hurt anyone, but it is important to remember how distorted our memories can be: as we age and look back, we remember more of the positive than the negative. [Sources: 3, 6]

Therefore, when we compare the present with the past, we hold a rosier view of the past than of the present, and we come to think that everything is worse than it used to be. Today, many people believe the world really is getting worse. [Sources: 5, 6]

Declinism is the belief, often driven by cognitive bias, that a society or institution is in decline or headed for collapse, and this belief is now widespread. It is closely tied to rosy retrospection, our tendency to view the past more favorably and the future more negatively. A declinist sees the present and future in an overly negative light while romanticizing the past. [Sources: 3, 5, 7]

Declinism also includes our tendency to believe that the future will be worse than the past. But declinism is more than a tendency to recall the past with nostalgia; the tendency to block out the unpleasant moments of the past is part of it as well. [Sources: 1]

However, in some cases declinist tendencies can be beneficial, especially when they help people prepare for the future. One of the biggest problems with declinism is that it can become a self-fulfilling prophecy. Another is that an overly rosy view of the past can cause us to ignore our past mistakes and prevent us from learning from them; combined with the belief that the future will be worse, this can lead us to repeat the same behavior without giving ourselves, or anyone else, a chance at a better outcome. [Sources: 3, 4]

It also seems unlikely that the feeling of decline can be offset simply by arguing that we live in the best of all possible worlds: if we believe that society is as good now as it will ever be, we are more likely to believe the future will be worse. On the other hand, Josef Joffe has emphasized that "excessive anxiety about potential decline may be a good way to trigger it." [Sources: 0, 1, 8]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://en.wikipedia.org/wiki/Declinism

[1]: https://propagandacritic.com/index.php/everybody-is-biased/declinism/

[2]: https://findwords.info/term/declinism

[3]: https://nesslabs.com/declinism-rosy-retrospection

[4]: https://effectiviology.com/rosy-retrospection-and-declinism/

[5]: https://mahanakornpartners.com/the-effect-of-cognitive-bias-in-decision-making/

[6]: https://www.tomorrowsworld.org/commentary/declinism-or-just-cognitive-bias

[7]: https://www.psychologytoday.com/us/blog/reading-between-the-headlines/201702/declinism-why-you-think-america-is-in-crisis

[8]: https://www.clingendael.org/pub/2017/3/declinism-and-populism/

[9]: https://www.theguardian.com/science/head-quarters/2015/jan/16/declinism-is-the-world-actually-getting-worse

[10]: https://lifehacker.com/the-cognitive-biases-that-convince-you-the-world-is-fal-1822620516

[11]: https://www.visualcapitalist.com/24-cognitive-biases-warping-reality/

Curse Of Knowledge Bias

To pass on your knowledge to others, you not only need to put years of experience into words, which is difficult in itself, but you also need to look at things from the naive point of view of a beginner. Once you become an expert on a topic, it becomes much harder to explain the basics to someone without the same knowledge, partly because you can no longer remember what questions you had when you were new to the topic. [Sources: 2, 3, 12]

When a person gains knowledge and becomes familiar with a topic, it is difficult for them to set that knowledge aside and discuss the topic on a newcomer's terms. This article on the curse of knowledge in user experience will help you understand the concept, why people fall prey to it, and how to take it into account to improve your teaching, your communication, and your prediction of user behavior. Knowledge is the key to innovation, but used unwisely it can be a curse. [Sources: 4, 5, 12]

If you are new to your career, consider this an advantage, because you can offer fresh perspectives. If you are experienced, connect and collaborate with others: you never know who might give you a new insight into the project you are working on. [Sources: 12]

If you are talking to a friend or colleague, consider how knowledgeable they are before you start explaining. Never stop asking whether your audience has the prior knowledge you are assuming; it is hard to put yourself in your audience's shoes. As you absorb the jargon of a new environment, it doesn't take long to forget that other people don't know what you know. In other words, your knowledge can become a curse: a barrier that prevents you from talking about what customers care about in a language they understand. [Sources: 3, 8, 10]

The curse of knowledge makes you misjudge how another person sees, understands, or reacts to something, and since it is intrinsically difficult to notice, it can do real harm before you catch the mistake. It creates problems in creative work because it makes it hard to see and understand what others are thinking. And when someone familiar with the technical details wants to explain something in writing, it is difficult for them to express it in simpler, more understandable language. [Sources: 4, 11]

This often means that concepts, ideas, and information are not clearly presented, because the person presenting them assumes a certain level of knowledge and understanding in their audience. This is the curse of knowledge: the tendency to assume that people know what you know, which makes you believe they understand you better than they do. The curse of knowledge is a cognitive bias that prevents people from adequately accounting for the fact that others do not have the same information they do; it occurs when a person communicating with others unknowingly assumes that they share the background needed to understand. [Sources: 1, 3, 5, 7]

Consider Where's Wally: until you know where Wally is, he seems impossible to find, but once you have spotted him it is hard to imagine missing him. Similarly, a police officer giving directions along a route he already knows may fail to notice that the person asking is not mentally following along. [Sources: 4, 6]

Writers need to seek feedback and question their own writing to make sure the majority of their target audience understands what they are trying to say. When you better understand the curse of knowledge and the biases that shape how other people think, you can predict their behavior more accurately. Since most people are unaware of the curse of knowledge, understanding how this bias affects thinking gives you an edge in anticipating their behavior. [Sources: 1, 4, 5]

Thus, the curse of knowledge can be seen as a type of egocentric bias, causing people to rely too heavily on their own point of view when trying to see things from the point of view of others. It is linked to many other cognitive biases, such as the illusion of transparency, which causes people to overestimate how obvious their thoughts and emotions are to others, and the empathy gap, which makes it difficult for people to understand mental states different from their current one, or to appreciate how those states affect judgment and decision-making. [Sources: 1]

Rather, the curse of knowledge stems from the predictive and communicative difficulties people experience when they know something others do not. It is a cognitive bias in which we fail to account for the fact that other people do not share our level of understanding. [Sources: 1, 5]

It is a cognitive bias that occurs when someone mistakenly assumes that others have enough background knowledge to understand them. It leads people who are knowledgeable about a topic to find it nearly impossible to see it from the perspective of someone who knows little about it. It is also considered an egocentric bias, because it leads people to express ideas only in terms they themselves know, disregarding how others think. [Sources: 4, 7, 8]

Awareness of the bias can also be used to sharpen ideas in writing and other creative processes. The term "curse of knowledge" was coined by the economists Colin Camerer, George Loewenstein, and Martin Weber in a 1989 article in the Journal of Political Economy, to describe the cognitive bias that leads people to project their own knowledge and experience of the world onto others. The curse of knowledge makes it difficult for us to accurately reconstruct our own earlier, less-informed mental states, and it is hard to share our knowledge with others because we struggle to recreate our audience's state of mind. [Sources: 2, 4, 10]

Likewise, the unacknowledged assumption that your partner knows your opinions, or can infer your views from information you never conveyed, will likely make you feel that you have clearly communicated your wants and needs even though you haven't. As I noted in last year's post on the Dunning-Kruger effect, people who are poorly informed can still feel confident in their opinions, because they are not knowledgeable enough to question their unearned confidence. And if you know for certain that your idea is objectively superior to the competition, the curse of knowledge makes it easy to implicitly assume that your customers will see it the same way. [Sources: 11, 15]

Let the customer decide what is important to them; don't let the curse of knowledge tell you whether a feature or product concept is competitive or attractive based on your own implicit knowledge. [Sources: 11]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.ideastogo.com/articles-on-innovation/curse-of-knowledge

[1]: https://effectiviology.com/curse-of-knowledge/

[2]: https://towardsdatascience.com/the-curse-of-knowledge-8deb4769bff9

[3]: https://www.usertesting.com/blog/curse-of-knowledge

[4]: https://wam.fandom.com/wiki/Curse_of_Knowledge

[5]: https://userpeek.com/blog/the-curse-of-knowledge-in-ux/

[6]: https://www.teachonmars.com/en/blog/2021/02/curse-of-knowledge-trainers-cognitive-bias/

[7]: https://tactics.convertize.com/definitions/curse-of-knowledge

[8]: https://nesslabs.com/the-curse-of-knowledge

[9]: https://www.lifehack.org/articles/communication/are-you-suffering-from-the-curse-knowledge.html

[10]: https://cxl.com/blog/curse-of-knowledge/

[11]: https://www.mindtheproduct.com/3-ways-the-curse-of-knowledge-can-sabotage-product-people/

[12]: https://digitallearning.arizona.edu/news/curse-knowledge

[13]: https://en.wikipedia.org/wiki/Curse_of_knowledge

[14]: https://thedecisionlab.com/reference-guide/management/curse-of-knowledge/

[15]: https://www.psychologytoday.com/us/blog/i-hear-you/202104/whats-the-curse-knowledge-and-how-can-you-break-it

Attribute Substitution Bias

As one of the psychological processes underlying biased judgment, attribute substitution explains why people rely on heuristics and commit judgment errors. Attribute substitution is a psychological process believed to underlie a number of cognitive heuristics, biases, and perceptual illusions. [Sources: 3, 5, 10]

The heuristics-and-biases literature argues that substitution occurs when we replace a difficult judgment task with an easier one. Attribute substitution theory unifies a series of separate explanations of reasoning errors in terms of cognitive heuristics: when people face a computationally complex judgment, the complexity leads them to reformulate the problem in terms of a more easily computed attribute. [Sources: 1, 3, 11]

To reduce this bias, we need to engage the reflective, analytical thinking of System 2 and, wherever possible, rely on artificial intelligence (AI) algorithms and systems rather than on our heuristics and intuition. Attribute substitution has also been proposed as an explanation for human judgment errors in other classic reasoning problems, such as base-rate neglect and the conjunction fallacy (Kahneman and Frederick, 2002). On our account, this suggests that the problem with substitution, and with judgment bias in general, is not that people fail to realize they need to think more deeply, but that this deliberate processing does not succeed. [Sources: 6, 11]

Like most authors (e.g., Evans, 2010; Frederick, 2005; Kahneman, 2011; Stanovich, 2010), we believe the main driver of substitution bias is that reasoners tend to minimize cognitive effort and stick to simple intuitive processing. Substitution is believed to take place in the intuitive, automatic judgment system rather than in the more self-aware reflective system. These examples illustrate why I argue that substitution bias is a real problem, not only for individual decision-makers but for the analytic community. [Sources: 4, 6, 8, 11]

People sometimes answer a difficult question by substituting a simpler one; this idea became the basis of the cognitive-heuristics program. These examples, however, are not about ordinary people falling victim to substitution bias: I see the analytic community ignoring, belittling, and distorting cognitive phenomena and replacing them with formulas that are easier to calculate, which is a classic case of substitution. [Sources: 9, 11]

This indicates that the subjects did not use the base rate to assess likelihood, but substituted the more accessible attribute of similarity. In a 2002 revision of the theory, Kahneman and Shane Frederick proposed attribute substitution as the process underlying these and other effects. The preconscious, intuitive nature of attribute substitution explains how subjects can be influenced by stereotypes even while intending to make an honest and fair assessment of another person's intelligence. In a 1974 article, the psychologists Amos Tversky and Daniel Kahneman argued that a small set of heuristics (shortcuts for processing information) could explain a wide range of judgment and decision-making biases, including availability and representativeness. [Sources: 1]

Kahneman proposes replacing the attribute of fear with a calculation of the overall risks of travel. In this study, we focus on the bat-and-ball problem because it is one of the most tested and paradigmatic examples of human substitution bias (e.g., Bourgeois-Gironde & Van der Henst, 2009; Kahneman, 2011; Kahneman & Frederick, 2002; Toplak et al., 2011). Likewise, if a subject holds a stereotype about the relative intelligence of whites, blacks, and Asians, this racial attribute can substitute for the less tangible attribute of intelligence. Tversky and Kahneman suggested that these biases shape how people think and judge. [Sources: 1, 2, 6, 10]
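The bat-and-ball problem makes the substitution concrete: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. Intuition substitutes the easy computation "1.10 − 1.00 = 0.10," but the two constraints actually require the ball to cost 5 cents. A minimal sketch (the function and variable names are ours, not from the literature):

```python
# Bat-and-ball problem: total price and price difference are given;
# solving the two constraints gives the ball's true price.
#   ball + bat = total,  bat = ball + difference
#   => ball = (total - difference) / 2

def bat_and_ball(total=1.10, difference=1.00):
    ball = (total - difference) / 2
    bat = ball + difference
    return bat, ball

bat, ball = bat_and_ball()
print(round(ball, 2))  # 0.05 -- not the intuitive 0.10
print(round(bat, 2))   # 1.05
```

The intuitive answer of 10 cents fails the check (a $0.10 ball plus a $1.10 bat would total $1.20), which is exactly the verification step that substitution skips.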

Heuristics are useful in many situations, but they can also produce cognitive biases. I am not suggesting that analytic researchers make these substitutions deliberately. While heuristics can help us solve problems and speed up decision-making, they can also lead to errors. [Sources: 2, 11]

The substituted answer is the one people will choose if they engage in the postulated replacement process. This bias was later identified as the representativeness heuristic, in which probabilistic judgments are influenced by perceptions of similarity. Kahneman and Frederick developed the idea further, arguing that the target attribute and the heuristic attribute can be very different in nature. [Sources: 6, 9, 10]

Knowing how heuristics work, and the biases they can introduce, can help you make better, more accurate decisions. When someone answers a difficult question, they may in fact be answering a related but different question without realizing a substitution has occurred. Simon's research showed that humans are limited in their ability to make rational decisions, but it was the work of Tversky and Kahneman that launched the study of heuristics and of the specific shortcuts people use to make decisions easier. [Sources: 2, 10]

Heuristics are psychological shortcuts that allow people to solve problems and make judgments quickly and efficiently. As the example above shows, however, they can lead to inaccurate judgments about how common or representative things are. In the control version of the problem, participants are given directly the simpler statement they would otherwise substitute subconsciously. The representativeness heuristic involves making a decision by comparing the current situation with the most representative mental prototype. [Sources: 2, 6]

Some theorists argue that heuristics are actually more accurate than they are biased. Although imperfect, a heuristic can be incredibly effective at simplifying problem-solving. This account explains why such biases are unconscious and persist even when the subject is aware of them: the substituted attribute is highly accessible, either because it is automatically evaluated in normal perception or because it has been primed. [Sources: 2, 9, 10]

When it comes to making judgments, we are not as intelligent as we would like to think. While most participants could certainly have calculated the correct answers with pen and paper, even these sophisticated participants showed biased intuitions. [Sources: 3, 9]

Shoppers want choices to be simple, and the brand that is easiest to think about will often be the most attractive. Simple statistical models that account for regression to the mean predict performance better than intuitive estimates do. Neurobiological tools such as magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) have been used to study the neural correlates of various biases. In face-to-face conversations with strangers, for example, assessing their intelligence is computationally harder than registering their skin color. [Sources: 4, 7, 9, 10]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.encyclo.co.uk/meaning-of-Attribute_substitution

[1]: https://psychology.fandom.com/wiki/Attribute_substitution

[2]: https://www.verywellmind.com/what-is-a-heuristic-2795235

[3]: https://www.shortform.com/blog/attribute-substitution/

[4]: https://www.adcocksolutions.com/post/attribute-substitution

[5]: https://pubmed.ncbi.nlm.nih.gov/30845187/

[6]: https://link.springer.com/article/10.3758/s13423-013-0384-5

[7]: https://www.cambridge.org/core/books/heuristics-and-biases/0975337F864379F2729EAD873D804BA8

[8]: https://dl.acm.org/doi/book/10.5555/1823577

[9]: https://thedecisionlab.com/reference-guide/psychology/biases/

[10]: https://biasandbelief.wordpress.com/2009/06/01/attribute-substitution/

[11]: https://www.psychologytoday.com/us/blog/seeing-what-others-dont/202010/the-substitution-trap

Decoy Effect

The addition of MP3 player C, which buyers are likely to avoid (they could pay less for a model with more memory), causes MP3 player A, the dominating option, to be chosen more often than when only two options were offered. Since A is better than C in both respects, while B is better than C in only one, more consumers prefer A than before (paraphrased from Wikipedia). [Sources: 11, 14]

Therefore, C is a decoy whose sole purpose is to increase sales of A. Adding decoy C, which consumers avoid because they could pay less for a model with more memory, changes which option is chosen most often compared with the original two-option choice set: C shifts consumer preferences by serving as a basis for comparing A and B. In other words, the decoy effect is a phenomenon in which the attractiveness of B (relative to A) can be enhanced by adding an option D that B dominates (Li et al., 2019). Marketers call the decoy effect the asymmetric dominance effect: when consumers are given a third option alongside the two they already have, the new option creates an asymmetric advantage that can shift their preference. [Sources: 1, 2, 8, 14]
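The A/B/C mechanics above can be checked mechanically: a decoy works when the target dominates it on every attribute while the competitor does not. A minimal sketch, with invented prices and storage sizes purely for illustration (lower price is better, more storage is better):

```python
# Asymmetric dominance check for a hypothetical MP3-player lineup.
# All numbers are illustrative, not from the article's sources.

def dominates(x, y):
    """True if option x is at least as good as y on both attributes
    and strictly better on at least one."""
    at_least_as_good = x["price"] <= y["price"] and x["storage"] >= y["storage"]
    strictly_better = x["price"] < y["price"] or x["storage"] > y["storage"]
    return at_least_as_good and strictly_better

a = {"name": "A", "price": 400, "storage": 32}  # target
b = {"name": "B", "price": 300, "storage": 16}  # competitor
c = {"name": "C", "price": 450, "storage": 24}  # decoy

print(dominates(a, c))  # True: A beats C on price and storage
print(dominates(b, c))  # False: B is cheaper than C but has less storage
```

Because A dominates C while B does not, C is "asymmetrically dominated," and its presence makes A look like the clear winner even though the A-versus-B trade-off is unchanged.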

The decoy effect is technically known as asymmetric dominance and occurs when people's preference for one option over another changes as a result of adding a third, similar but less attractive option. In marketing, the decoy effect (also called the attraction effect or asymmetric dominance effect) is thus the phenomenon in which consumers tend to shift their preference between two options when a third, asymmetrically dominated option, the "decoy", is also presented. [Sources: 1, 4, 6]

In pricing terms, the decoy effect is the phenomenon in which adding a third pricing option causes the consumer to change their preference in favor of the option the seller is trying to promote. When consumers are presented with a strategically chosen decoy option, they become more likely to choose the more expensive of the two initial options. [Sources: 5, 12, 14]

The decoy effect can also be measured by how much more the consumer is willing to pay to select the target rather than the competitor. Advertisers and marketers use decoy prices to make the targeted option appear more attractive than similar lower-priced products, pushing customers who would usually buy the cheapest product toward the more expensive one. [Sources: 0, 1, 11]

Marketers use this particularly cunning pricing strategy to switch your choice from one option to a more expensive or profitable one; the promoted offer then seems like the only one that really makes sense. Price is the most subtle lever in the marketing mix, and a great deal of thought goes into setting prices so that we spend more money. Based on the related observation that consumers tend to choose products or services of average value and price, marketers also use the compromise effect to increase customers' desire to buy mid-range products. [Sources: 0, 5, 7]

Consumers may prefer a higher-quality product over a cheaper, lower-quality one when offered a third option that is relatively inferior to the first product in quality, price, or both. [Sources: 15]

The decoy manipulates the decision-making process by directing consumers' attention to the target option. The decoy is not meant to sell; its role is only to steer consumers away from the "competitor" and toward the "target", usually the most expensive or profitable option. When used as a marketing strategy, the decoy price not only increases profits but can also improve the overall image of the targeted product or service. [Sources: 0, 4]

Adding a decoy increases the likelihood of buying the higher-quality product. The decoy should be chosen so that it resembles the target option but is slightly inferior, in order to create the effect. Once added, the decoy shifts your choice from the competitor to the target. [Sources: 9, 10]

This makes you less likely to choose the competitor out of the three options, and more likely to choose the target over the decoy. And when you evaluate the target against the decoy, the only differentiating factor is price, which means you will tend to choose the target. That said, if (for example) the memory size of the decoy is too close to that of the target, the two products may appear nearly identical, and the decoy is unlikely to produce a preference reversal. While it is difficult to see how this confound alone would push people toward the target option, decoy research is strongest when it can rule such confounds out. [Sources: 2, 9, 10]

Note that, since strong prior preferences for either the target or the competitor are known to inhibit the decoy effect (Huber et al., 2014), our study deliberately focuses on scenarios in which no such preferences are established in advance. [Sources: 10]

When the decoy option sits right between our own choice and what marketers want to sell, we base our decision on what looks like the better deal rather than on what is best for our purpose. Decoys usually work on us unnoticed; whatever we end up choosing, we believe we chose it independently. By manipulating these key attributes of choice, the decoy guides you in a certain direction while giving you the feeling that you are making a rational, informed choice. [Sources: 0, 4, 13]

Thus, the decoy effect is a form of "nudge", which Richard Thaler and Cass Sunstein (pioneers of nudge theory) defined as "any aspect of choice architecture that alters people's behavior in a predictable way without forbidding any options." Not all nudges are manipulative, and some argue that even manipulative nudges can be justified if the goal is noble. The decoy effect has been demonstrated in many areas, such as medical decisions (Schwartz and Chapman, 1999), consumer choice, and gambling preferences (Huber et al., 1982; Heath and Chatterjee, 1995), which is consistent with the claim that the effect is persistent (Huber and McCann, 1982). [Sources: 2, 3, 4]

In this study, the decoy condition carried a higher visual load than the control condition. In any case, with the exception of the lottery tickets, the decoy successfully increased the likelihood that the target was chosen: as expected, when a decoy option was present, people were more likely to pick the target. Adding a decoy led people to judge the beer by quality and forget about the price. [Sources: 2, 4, 9, 13]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://designbro.com/blog/industry-thoughts/decoy-effect-marketers-influence-choose-what-they-want/

[1]: https://en.wikipedia.org/wiki/Decoy_effect

[2]: https://www.frontiersin.org/articles/10.3389/fpsyg.2020.523299/full

[3]: https://theconversation.com/the-decoy-effect-how-you-are-influenced-to-choose-without-really-knowing-it-111259

[4]: https://www.qut.edu.au/study/business/insights/the-decoy-effect-how-you-are-influenced-to-choose-without-really-knowing-it

[5]: http://humanhow.com/the-decoy-effect-complete-guide/

[6]: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/decoy-effect/

[7]: https://scitechdaily.com/think-you-got-a-good-deal-beware-the-decoy-effect/

[8]: https://www.segmentify.com/blog/the-3-option-decoy-effect-and-relativity

[9]: https://kenthendricks.com/decoy-effect/

[10]: https://www.nature.com/articles/palcomms201682

[11]: https://www.paulolyslager.com/decoy-effect-price-tables/

[12]: https://pony.studio/design-for-growth/decoy-effect

[13]: https://thedecisionlab.com/biases/decoy-effect/

[14]: https://www.intelligenteconomist.com/decoy-effect/

[15]: https://thinkinsights.net/strategy/decoy-effect/

Agent Detection

Agent detection bias, also known as illusory agency detection, hypertrophy of social cognition, or the hyperactive agency detection device (HADD), is the tendency to mistakenly explain phenomena in terms of a conscious agent. Agent detection is the tendency of animals, including humans, to presume the deliberate intervention of a sentient or intelligent agent in situations that may or may not involve one. Gray and Wegner argued that agent detection is likely "the basis for human belief in God", but that "mere over-attribution of agency cannot fully explain belief in God": the human capacity to form a theory of mind, and what they call an "existential theory of mind", is also needed to "give us the basic cognitive capacity to comprehend God". Moreover, measures of agency perception and intentionality were not associated with individual differences in supernatural beliefs, although they were associated with negativity bias. [Sources: 1, 2, 6]

The evolutionary and neurocognitive considerations discussed here indicate that the human bias toward agents is encoded in premotor mechanisms that re-identify an object across gaps in observation, and that overcoming it requires constructing, also within the premotor system, a structure-mapping that substitutes a mechanism for an agent as the generator of the action. If this model is correct, overriding the agent bias to construct a mechanical interpretation of observed events requires structure-mapping reasoning, implemented by the premotor action-planning system, that uses mechanisms rather than agents as the cause of otherwise unexplained changes in the observable attributes or features of the object. The plasticity of the mirror neuron system (MNS) at its input underwrites the agent bias: it allows observable non-biological movements to be mapped onto representations of first-person actions and their typically accompanying intentions, thereby representing inanimate non-agents as agents (Catmur et al., 2007; Heyes, 2010, 2012). [Sources: 3]

For example, probable Type 1 candidates include implicit perceptual learning of an agent's recurring properties, perceptual recognition of the kinematics and biomechanics of an agent's movements, recognition of faces and voices that signal the presence of a particular agent, attentional biases toward agents and persons, and the activation of basic emotional reactions to an agent. In the area of agent identification, several studies show that contextual information about the social roles of target persons, sometimes conveyed by gossip, can influence the tracking of their identities (Allen and Gabbert, 2013; Anderson et al.). A tracking mechanism based only on basic person recognition is not expected to be reliable for tracking target agents within a set of otherwise indistinguishable agents. [Sources: 0]

Understanding that HADD is integral to human nature is part of skeptics' fundamental knowledge base. Bruce Hood, author of Supersense, traces in his book the psychological research that has documented and described the human tendency to think about objects differently from agents. In most cases, an important function of perception-based tracking mechanisms is to track a set of characteristics that can be used to identify a target agent. [Sources: 0, 4]

We can perceive agency in non-living things if they act as though they were agents. If all unobservable causal processes are treated as agents, their role in re-identifying an object over time makes the identity of the object itself agent-dependent. A wide variety of perception and recognition processes for identifying an agent can fall into the category of Type 1 processes. [Sources: 0, 3, 4]

When HADD is activated, we tend to see a hidden agent working behind the scenes, forcing events to unfold the way they do, and perhaps even deliberately hiding their tracks. So, at a fundamental level, our brains process agents differently from objects from the moment we see them. Research has also shown that HADD is more likely to fire when the stimulus is ambiguous, so this is generally our default assumption: an object is an agent until we are sure it is just an object. [Sources: 4]

This may also explain why we can react emotionally to cartoon characters as if they were real: they are not alive, but we treat them as agents. Even as children, we imbue agents with an essence, a unique vitality. [Sources: 4]

The main claim of predictive processing (PP) is that the human mind is a self-taught Bayesian prediction engine. Like all minds, the human mind must be able to read its environment quickly in order to survive and thrive. One reason its internal model does not have to be perfect is that humans have limited cognitive energy. [Sources: 5]

Consequently, the mind cannot stop too often to check whether its internal model matches the input it receives from the environment. While other cognitive models also hold that input is filtered by top-down processes, PP radicalizes this idea: activities that consume a lot of cognitive energy tend to degrade cognitive performance. While this efficiency keeps the mind from updating its model too often, internal models can also be skewed by, among other things, an unrepresentative dataset. [Sources: 5]
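The Bayesian-prediction-engine claim can be illustrated with a single posterior update. The sketch below (Python; the prior and likelihood numbers are invented purely for illustration) shows how a mind with a prior that favors agents classifies an ambiguous stimulus as an agent even when the evidence barely discriminates, which is the HADD-style default described earlier.

```python
# Toy Bayesian update: is an ambiguous stimulus an agent or a mere object?
# All probability values below are hypothetical, chosen for illustration.

def posterior_agent(prior_agent, p_stim_given_agent, p_stim_given_object):
    """Bayes' rule: P(agent | stimulus)."""
    num = prior_agent * p_stim_given_agent
    den = num + (1 - prior_agent) * p_stim_given_object
    return num / den

# An ambiguous rustle: almost as likely under "object" as under "agent".
p = posterior_agent(prior_agent=0.6,          # HADD-style prior favoring agents
                    p_stim_given_agent=0.5,
                    p_stim_given_object=0.4)
print(round(p, 3))  # 0.652: the default verdict stays "agent"
```

Even weak, ambiguous evidence leaves the "agent" hypothesis on top, because the prior does most of the work; only strong disconfirming input would flip the verdict to "just an object".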

While the predictive mind strives for greater precision, performance requirements imply that the subject will sometimes get things wrong, trading accuracy for speed. While both precision and efficiency are important, it is the latter that contributes most to experience. [Sources: 5]

 

— Slimane Zouggari

##### Sources #####

[0]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4438138/

[1]: https://ixtheo.de/Record/1741104904

[2]: https://religions.wiki/index.php/Agent_detection_bias

[3]: https://www.frontiersin.org/articles/78945

[4]: https://theness.com/neurologicablog/index.php/hyperactive-agency-detection/

[5]: https://onlinelibrary.wiley.com/doi/full/10.1111/zygo.12575

[6]: https://en.wikipedia.org/wiki/Agent_detection