Denomination Effect

The denomination effect suggests that when a person holds two equal amounts of money in different physical forms, the form affects how readily the money is spent. Raghubir and Srivastava (2009) showed that when money exists in the form of a single large denomination (e.g., a ten-dollar bill) rather than many smaller denominations (e.g., ten one-dollar bills), people are less likely to spend it. Previous research has documented this effect: consumers are less likely to spend a single large bill (e.g., one $100 bill) than the equivalent amount in smaller bills (e.g., five $20 bills). [Sources: 0, 4, 11]

In other words, the denomination effect occurs when people are less likely to spend a large bill than its equivalent value in smaller bills or coins. This cognitive bias affects various forms of currency, with the result that people spend less readily when money takes the form of large bills than when they hold the equivalent amount in smaller bills or coins. The effect was only formally identified recently, in a 2009 research paper by marketing professors Priya Raghubir and Joydeep Srivastava, who conducted several experiments on the topic. [Sources: 2, 8]

In one experiment, Raghubir stood outside a gas station in Omaha. Raghubir and Srivastava conducted a series of experiments in the United States and China showing that people were much more willing to spend the same amount of money if they held lower-denomination bills rather than large bills. Participants were also more likely to break a larger, worn bill than to pay the exact amount in smaller denominations. [Sources: 6, 10]

Our findings supplement the denomination effect literature (Raghubir and Srivastava, 2009), which suggests that one reason people are less likely to spend large denominations is that holding money in large bills serves as a self-control device and helps them keep track of how much money they have (see also Raghubir et al., 2017). The denomination effect also provides insight into people's normative beliefs about money: consumers may regard smaller bills as more spendable because they use them more frequently. Note that when people receive the same amount in small-denomination bills (four five-dollar bills), they spend more than when they hold a single large bill (one twenty-dollar bill). The authors find that the physical appearance of money can amplify, reduce, or even reverse this effect. [Sources: 6, 7, 9]

Despite evidence that currency denomination can affect spending, researchers had not explored whether the appearance of money can do the same. In addition to providing some of the earliest evidence that the appearance of money can change spending behavior, this study offers insight into the relative power of denomination on spending. Specifically, we find that the appearance of money can amplify, attenuate, and even reverse the previously documented effect of denomination on spending. [Sources: 9]

So, like Serina, if you have ever wondered why we are more willing to spend low-value money than the equivalent amount in high-value bills, one explanation is the denomination effect. We tend to value smaller bills less than larger ones, without paying much attention to why we do it. [Sources: 0]

In other situations not covered in this article, consumers may be indifferent between two bills on hand, consistent with the principle of monetary invariance, or more likely to spend the smaller bill, consistent with the denomination effect discussed above. We hypothesize that a price-denomination match effect means people will pay with cash when a bill they hold matches the price to be paid, and with a card when it does not. [Sources: 7]

This means that if the denominations they hold do not match the price, consumers are less likely to rely on price information when choosing which bill to pay with. Study 2 shows that people choose the bill that matches the purchase price more often than bills that do not match, and that they make those decisions faster. We also examined price matching as a potential mechanism for this effect and found that people decide faster when they choose a bill that matches the price (Study 2). [Sources: 5]

We found that when the price matches a denomination in the customer's cash holdings, consumers are more likely to use cash than a debit card, indicating that price-denomination matching also affects the choice of payment method. The price-denomination match effect predicts that people are more likely to pay in cash when the price matches the value of a bill they hold, but not otherwise. Between the two conditions, participants were more likely to spend when they received four quarters rather than one dollar bill, consistent with the denomination effect. [Sources: 0, 5, 7]

Study 1B replicated the effect of Study 1A with higher spending levels and higher prices, exploring whether the effect generalizes to higher cost levels, a wider price set, and larger denominations. As in the earlier results, these effects did not depend on face value alone. [Sources: 5, 7, 9]

Another way consumers violate descriptive invariance is by spending more when using a credit card than when using cash (Raghubir and Srivastava, 2008), and by spending more when they hold smaller bills than larger ones (Mishra et al., 2006; Raghubir and Srivastava, 2009). Called the denomination effect, our tendency to avoid spending larger bills also suggests that people prefer to receive money in larger denominations. According to the 2009 study published in the Journal of Consumer Research, people tend to associate smaller bills with smaller, more varied purchases. This means that if you carry ten $10 bills, you will spend the money faster than if you carry a single $100 bill. [Sources: 3, 5]

A similar effect appears in finance, where the unit price of an asset shapes investor behavior. When a company's stock trades at a relatively high price before a split (say, five for one), investors are more likely to move to, and invest in, companies whose shares trade at lower absolute prices. Lower unit prices thus tend to increase people's absolute expenditure. [Sources: 8]

If a person receives £20, they are more likely to spend it when it is offered in smaller denominations (such as £1 coins or change) than when it is offered as a single £20 note. Using mental accounting theory (Thaler, 1985; Thaler and Shefrin, 1981), the authors suggest that people may treat large denominations of currency as "real money," while equivalent amounts in smaller denominations are treated as petty cash or change (Raghubir and Srivastava, 2009). [Sources: 1, 9]

You are not alone, and as research shows, our reluctance to break large bills can in fact be used to trick ourselves into spending less. This matching effect may extend to other payment methods, such as gift cards. These findings underscore the growing interest in the denomination effect. [Sources: 3, 9, 11]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://absolutedecisionsblog.wordpress.com/2018/03/23/the-denomination-effect-in-consumer-spending-behaviour/

[1]: https://www.adcocksolutions.com/post/no-16-of-36-the-denomination-effect

[2]: https://www.businessinsider.com/cognitive-biases-2015-10

[3]: https://au.finance.yahoo.com/news/how-to-save-money-using-cash-235843261.html

[4]: https://www.journals.uchicago.edu/doi/10.1086/689867

[5]: https://www.frontiersin.org/articles/10.3389/fpsyg.2020.552888/full

[6]: https://www.eurekalert.org/news-releases/630713

[7]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7509091/

[8]: https://internationalbanker.com/brokerage/cognitive-bias-series-6-denomination-effect/

[9]: https://academic.oup.com/jcr/article/39/6/1330/1825424

[10]: https://www.npr.org/templates/story/story.php?storyId=104063298

[11]: https://www.sciencedirect.com/science/article/abs/pii/S0148296321000990

Berkson’s Paradox

More broadly, however, we might speak of a Berkson effect whenever unusual selection forces are likely to create correlations that do not exist in the wider population. Such a correlation clearly does not hold once we restrict our analysis to the NBA. [Sources: 11]

An obvious reading of Berkson's paradox is that people who are good at basketball but short, and people who are tall but less skilled, can both compete in the NBA, while people who are both short and poor at basketball are excluded by the intense competition. Within the NBA, the importance of rare skill can easily outweigh the height advantage seen in the broader population: height appears unrelated to playing in the NBA because a short NBA player must have unusually high skill to break into that tight circle. [Sources: 5, 11]

As a result of bias in our data collection, we end up seeing that 100% of the least-educated group are successful (Figure 3, green), but only a fraction of the highly educated are (Figure 4, green). This reflects the trade-off between the GPA and SAT scores of the people sampled. [Sources: 4, 5]
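
The selection effect described above can be reproduced with a quick simulation. All numbers here are invented for illustration: GPA and SAT scores are drawn independently, and "admission" requires the combined standardized score to clear an arbitrary bar. The two scores, uncorrelated in the full applicant pool, become clearly negatively correlated among the admitted.

```python
import random

random.seed(0)

# Applicants with independent GPA and SAT scores (illustrative distributions).
applicants = [(random.gauss(3.0, 0.5), random.gauss(1200, 150))
              for _ in range(100_000)]

def admitted(gpa, sat):
    # Selection rule (assumed): combined standardized score must clear a bar.
    return (gpa - 3.0) / 0.5 + (sat - 1200) / 150 > 1.5

selected = [(g, s) for g, s in applicants if admitted(g, s)]

def corr(pairs):
    # Pearson correlation, computed by hand to stay dependency-free.
    n = len(pairs)
    mx = sum(g for g, _ in pairs) / n
    my = sum(s for _, s in pairs) / n
    cov = sum((g - mx) * (s - my) for g, s in pairs) / n
    vx = sum((g - mx) ** 2 for g, _ in pairs) / n
    vy = sum((s - my) ** 2 for _, s in pairs) / n
    return cov / (vx * vy) ** 0.5

print(f"correlation, all applicants: {corr(applicants):+.2f}")  # near zero
print(f"correlation, admitted only:  {corr(selected):+.2f}")    # clearly negative
```

Conditioning on the admission threshold is what manufactures the negative correlation: among admitted students, a low GPA can only have survived selection alongside a high SAT score, and vice versa.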

It is tempting to see surprising correlations and try to build a story around them, but they are often unsurprising once we realize that a narrow selection is trading off two desirable attributes. In general, when two factors both influence selection into a sample, we say they "collide" on selection (see Figure 2a). The lesson is that sample bias can produce spurious correlations between variables: we may reach one conclusion from an aggregate dataset, yet obtain the completely opposite result once we split the data into groups according to some criterion. [Sources: 1, 5, 7, 9]

One of the most famous examples of Simpson's paradox is the study of gender bias in graduate admissions at the University of California, Berkeley. Berkson's paradox itself is named after Joseph Berkson, who pointed out the selection bias that arises in case-control studies of causal risk factors for disease: because samples are drawn from hospital patients rather than the general population, spurious negative associations between a disease and its risk factors can appear. [Sources: 1, 7, 13]

This example is very similar to Berkson's original 1946 work, in which he noted a negative correlation between cholecystitis and diabetes among hospital patients, even though diabetes is a risk factor for cholecystitis. The paradox can give the impression of a negative correlation when the two variables are in fact positively correlated or entirely independent: it arises because members of the population in which both attributes are absent are observed at a disproportionately low rate. [Sources: 3, 5, 6]

According to Berkson's paradox, two attributes that appear related in a selected group may not actually be related in the general population. In other words, given two independent events, if we consider only the outcomes in which at least one occurs, the events become negatively dependent: within that subset, the presence of B lowers the conditional probability of A. The remedy for any occurrence of Berkson's fallacy is to properly define or characterize the population, and then statistically examine a representative portion of it to test the relationship between A and B. [Sources: 0, 13]
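
The "at least one occurs" formulation can be checked with a few lines of exact conditional-probability arithmetic. The probabilities below (P(A) = P(B) = 1/2) are toy values chosen for illustration, not taken from any study.

```python
from fractions import Fraction

# Two independent events with assumed toy probabilities P(A) = P(B) = 1/2.
p_A, p_B = Fraction(1, 2), Fraction(1, 2)

# Probability that at least one occurs: P(A or B) = P(A) + P(B) - P(A)P(B).
p_union = p_A + p_B - p_A * p_B        # 3/4

# Within the "at least one occurred" subset:
p_A_given_union = p_A / p_union        # P(A | A or B) = (1/2)/(3/4) = 2/3
p_A_given_B = p_A                      # independence: knowing B leaves P(A) = 1/2

# Learning that B occurred LOWERS the probability of A inside the subset.
print(p_A_given_union, ">", p_A_given_B)   # 2/3 > 1/2: negative dependence
```

Unconditionally A and B carry no information about each other; it is the restriction to "at least one occurred" that makes them negatively dependent, exactly as the paragraph states.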

In addition, a suitable procedure has been proposed for drawing inferences about a population from a biased sample that exhibits the Berkson paradox; in particular, this arises when there is an inherent selection bias in the study design. Like other threats to causal inference, once you know about collider bias you will see it lurking everywhere: it can introduce an association where the two factors were effectively independent, or diminish, exaggerate, or even reverse an existing association in ways that are difficult to predict. [Sources: 8, 9, 12]

In a recent article, we summarize how collider bias could play a role in Covid-19 research, identifying hundreds of demographic, genetic, and health-related factors that influence the likelihood of a UK Biobank participant being selected for Covid-19 testing. Observational errors and subgroup differences can easily produce statistical paradoxes in any data-analysis application: hidden variables, collider variables, and class imbalances all create them. In this article, we look at three of the most common types of statistical paradoxes found in data science. [Sources: 3, 9]

A prime example is the observed negative association between Covid-19 severity and cigarette smoking (see, for example, Griffith et al. 2020, published in Nature Communications), which may be a case of collider bias, also called the Berkson paradox. The most common illustration of the paradox is the spurious observation of a negative correlation between two positive traits: members of the selected population who have one positive trait tend to lack the second. [Sources: 3, 5]

Berkson's paradox broadly refers to the tendency of subpopulations created by some selection effect, such as a cutoff, to give the impression of correlations that do not exist, or that run in the opposite direction, in the larger population. More general definitions of Berkson's paradox (sometimes Berkson's bias) equate it with selection effects in general, but the term is mainly used in situations where a portion of a population or dataset is excluded in such a way that including it would weaken or reverse the observed correlation. [Sources: 11]

In fact, this erroneous observation rests on mistaken assumptions about cause and effect, and on bias in data collection. The paradox is most often described in medical statistics or biostatistics, following Joseph Berkson's original description of the problem. It is a statistical phenomenon in which trends that appear in separate data sets disappear or reverse when the sets are combined. [Sources: 1, 4, 8]

Because we are looking at the data from only one side, we keep seeing the same pattern even in data that point the opposite way. If an observer examines only the stamps on display, he will find a spurious negative relationship between beauty and rarity as a result of selection bias: a lack of beauty implies rarity among exhibited stamps, but not in the collection as a whole. [Sources: 1, 13]

Berkson's paradox states that two independent events become negatively dependent if we consider only the outcomes in which at least one of them occurs. It is one of the counterintuitive results obtained by reasoning about conditional probabilities. Berkson's paradox, also known as Berkson's bias, collider bias, or Berkson's fallacy, is a result in conditional probability and statistics that is often counterintuitive, and therefore a veridical paradox. [Sources: 0, 10, 13]

Berkson's paradox was originally identified in the context of epidemiological studies tracing the relationship between disease and exposure to potential risk factors. Berkson's original illustration involved a retrospective study examining a risk factor for disease in a statistical sample drawn from a hospital's inpatient population. [Sources: 2, 13]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://medium.com/@darshit0111/berksons-paradox-6d27225c47cb

[1]: https://www.sergilehkyi.com/7-data-paradoxes-you-should-be-aware-of/

[2]: https://brilliant.org/wiki/berksons-paradox/

[3]: https://towardsdatascience.com/top-3-statistical-paradoxes-in-data-science-e2dc37535d99

[4]: https://www.futurescienceleaders.com/blog/2021/03/berksons-paradox-a-statistical-illusion/

[5]: https://moontowermeta.com/berksons-paradox/

[6]: https://www.jaad.org/article/S0190-9622(21)00227-9/fulltext

[7]: http://corysimon.github.io/articles/berksons-paradox-are-handsome-men-really-jerks/

[8]: https://psychology.fandom.com/wiki/Berkson%27s_paradox

[9]: https://rss.onlinelibrary.wiley.com/doi/10.1111/1740-9713.01413

[10]: https://rf.mokslasplius.lt/berkson-paradox/

[11]: https://freddiedeboer.substack.com/p/beware-berksons-paradox

[12]: https://pubmed.ncbi.nlm.nih.gov/31696967/

[13]: https://en.wikipedia.org/wiki/Berkson%27s_paradox

Sunk Cost Fallacy

Commitment bias boils down to constantly trying to convince ourselves and others that we are making rational decisions. We do this by maintaining consistency in our actions and defending our decisions in front of the people around us, believing this will earn us more respect. [Sources: 3]

Barry M. Staw was the first to study and describe commitment bias. He hypothesized that this pattern stems from a need for consistency, which appears to be a basic human drive; inconsistency causes unpleasant feelings and is related to cognitive dissonance. Because people are reluctant to acknowledge the negative consequences of their decisions, confirmation bias also fuels escalation of commitment. Escalation of commitment, also known as commitment bias, is the deviation most closely related to the sunk cost fallacy. [Sources: 0, 4, 7, 12]

When commitment bias is at work, we carefully select information that makes our decision seem right, while minimizing or completely ignoring evidence that we made the wrong choice. We may be justifying our past behavior to ourselves or to others. [Sources: 7]

Again, my hypothesis is that we feel the pull of sunk costs in some cases but not in others because we want to tell flattering yet believable stories about our diachronic behavior: stories in which we have not been harmed by diachronic adversity, and honoring sunk costs can help achieve this. Another explanation for why we take sunk costs into account invokes prospect theory (Kahneman and Tversky, 1979), a descriptive theory of risky decision making. [Sources: 9]

Economists and behavioral scientists use the term "sunk cost fallacy" to describe the justification of further investment of money or effort in a decision on the basis of prior cumulative investment ("sunk costs"), despite new evidence that the future costs of continuing exceed the expected benefit. The sunk cost fallacy increases the likelihood that a person or organization will continue an endeavor in which they have already invested money, time, or effort, even if they would never have started it without that prior investment. [Sources: 4, 6, 8]
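
The rational rule implied above, weigh only future costs against expected benefits, can be sketched in a few lines. The function name and the dollar figures are hypothetical, chosen purely for illustration.

```python
def should_continue(future_cost: float, expected_benefit: float,
                    sunk_cost: float = 0.0) -> bool:
    """Rational stopping rule: the sunk cost is accepted as an argument
    but deliberately plays no part in the decision."""
    _ = sunk_cost  # already spent, unrecoverable, irrelevant at the margin
    return expected_benefit > future_cost

# Hypothetical project: $50k already spent, finishing costs $30k more,
# and the finished result is only worth $20k.
print(should_continue(future_cost=30_000, expected_benefit=20_000,
                      sunk_cost=50_000))   # False: stop, whatever the $50k says
```

The fallacy is precisely the temptation to let the `sunk_cost` argument change the answer; in a rational analysis the decision depends only on the comparison of what remains to be paid against what remains to be gained.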

The larger the sunk investment, the more people are inclined to invest additional funds, even when the added return is not worth it. The idea that a further investment in something you have already invested in will "make it work" is not only a product of the sunk cost fallacy; each additional investment deepens commitment to the endeavor and raises the likelihood of still further investment in sunk costs. Perhaps worse, escalation of commitment driven by past investment alone can block needed change and limit innovation. [Sources: 6]

Research by Ohio University psychologist Hal Arkes and his colleague Catherine Blumer shows that sunk costs can even discourage you from choosing the option you would most enjoy. When deciding whether to walk away, we usually take the costs already invested into account, which makes some options seem more expensive than they are; these are the costs we weigh most heavily when making a choice. [Sources: 6, 12]

Examples of sunk costs include losses already incurred or unrecoverable restocking costs. In a rational economic analysis, these costs should play no part in the decision-making process. People commit the sunk cost fallacy when they continue a behavior or endeavor as a result of previously invested resources (time, money, or effort) (Arkes & Blumer, 1985). [Sources: 2, 12]

There is also an interesting reverse phenomenon, the so-called "reverse sunk cost effect" (Heath 1995) or "pro-rata fallacy" (Baliga & Ely 2011), in which decision-makers respond to sunk costs by abandoning rather than completing the project on which the costs were spent. In the studies of Teger, and later Ross and Staw, situations in which abandoning a course of action costs more than completing it left decision-makers trapped in their current, costly behavior. In Tesco's case, those tasked with making the best decisions often escalated their efforts irrationally, and such irrational decision making proved clearly costly. [Sources: 4, 5, 9]

First, we aimed to examine the extent to which people escalate their involvement in a sequential decision-making situation. We report the results of an experiment inspired by the seminal study (Staw, 1976) that introduced the concept of "escalation of commitment," a variant of the sunk cost fallacy. A critical aspect of management decision-making is managing the risk of escalating commitment to a failing course of action, which arises when substantial resources such as time, money, and people have been invested in a project. [Sources: 3, 4, 5, 11]

You hope that additional effort or resources will turn things around. Sometimes it makes perfect sense to continue a project because of the resources you have already invested in it; but once you have spent a large amount of resources, that hope easily becomes wishful thinking. [Sources: 3, 9]

Don't escalate your commitment (despite evidence to the contrary) merely because you have already spent so much time, money, energy, and emotion on a course of action. Entrepreneurs are ripe for escalation of commitment, as evidenced by their willingness to keep investing in their businesses even when things go wrong. For example, one study found that those responsible for prior losses tend to view projects more positively, and are more likely to commit additional resources to them, than people who take over projects halfway through. Reinforcement, misrepresentation, and self-justification, three psychological forces we are all exposed to, can keep us locked into the projects or actions we initiate. [Sources: 1, 6, 10]

However, most management decisions involve additional factors, because others observe our behavior. Similarly, it is important for managers to be aware of the potential harm of decisions made solely on the authority of a powerful leader. [Sources: 1, 3]

This may sound counterintuitive, but it can provide excellent insight into your thinking process when making decisions. Challenge decisions by asking open-ended, detailed, sharply focused questions that help you understand the other party's positions and interests. It may be that the party directing you lacks the understanding, data, or cost-benefit analysis needed to decide whether to proceed. [Sources: 10, 12]

Second, I believe that once we understand why we feel the pull of sunk costs, it is no longer clear that honoring them is always irrational. As long as this desire is not outweighed by other considerations, it need not be irrational to take sunk costs into account. [Sources: 9]

The term "sunk" refers to the unrecoverable nature of costs already incurred. The fallacy is also invoked to describe bad decisions in business, government, information systems (especially software project management), politics, and gambling. [Sources: 8, 12]

Escalation of commitment is a pattern of human behavior in which an individual or group facing increasingly negative consequences from a decision, action, or investment continues the behavior rather than changing course. [Sources: 4]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://askinglot.com/what-is-escalation-bias

[1]: https://hbr.org/1987/03/knowing-when-to-pull-the-plug

[2]: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/sunk-cost-fallacy/

[3]: https://thinkinsights.net/strategy/escalation-of-committment/

[4]: https://en.wikipedia.org/wiki/Escalation_of_commitment

[5]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6041483/

[6]: https://leepublish.typepad.com/strategicthinking/2015/03/sunk-cost-fallacy.html

[7]: https://thedecisionlab.com/biases/commitment-bias/

[8]: https://psynso.com/escalation-of-commitment/

[9]: https://quod.lib.umich.edu/e/ergo/12405314.0006.040/–sunk-cost-fallacy-is-not-a-fallacy?rgn=main;view=fulltext

[10]: http://www.negotiations.ninja/blog/the-danger-of-the-escalation-of-commitment/

[11]: https://www.sciencedirect.com/science/article/pii/S0014292121001562?dgcid=rss_sd_all

[12]: https://neurofied.com/sunk-cost-fallacy/

Gambler’s Fallacy

By definition, independent random events cannot be predicted with certainty. Believing otherwise is mistaken, because past events do not change the likelihood of events that will occur in the future. [Sources: 2, 4]

The gambler's fallacy is best defined as the tendency to think that future probabilities are altered by past events when in reality they are unchanged. The gambler misjudges whether a series of events is truly random and independent, and wrongly concludes that the outcome of the next event will be the opposite of the recent run. Whenever we estimate the likelihood of an independent event from past outcomes, we are committing the gambler's fallacy. It stems from our tendency to assume that if a random event has happened many times in the past, it will happen more or less often in the future. [Sources: 2, 3, 5, 6]

So we try to rationalize random events in order to explain them and make them feel predictable; while we are busy rationalizing, we do not think clearly. Because the gambler's fallacy leads us to base expectations of future events on past performance, we do not make informed choices. It also reframes the question of "rationality" implied by the term "fallacy": we can ask whether it is truly illogical to behave as if past results helped us predict future outcomes. [Sources: 3, 4, 5]

Such ideas and decisions are wrong because past events do not change the probability of future events. A person commits the gambler's fallacy when they mistakenly believe that a past event or series of events will affect future events, that is, that a particular event is more or less likely depending on the outcome of a series of previous events. [Sources: 5]

The error, then, occurs when a person believes that a particular event in the past will affect future events. At its root is our tendency to judge the probability of an event by how often it has occurred in the past: the misunderstanding that, because of previous occurrences, an event becomes more or less likely, and in particular that when an event has occurred more frequently than usual, it is less likely to happen in the future. [Sources: 5]
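
This independence claim is easy to check empirically with a quick simulation: among fair coin flips, the flip that immediately follows a run of five heads still comes up heads about half the time. The streak length and sample size below are arbitrary choices for illustration.

```python
import random

random.seed(42)
# One million fair coin flips; True represents heads.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the outcome of every flip that immediately follows five heads in a row.
after_streak = [flips[i] for i in range(5, len(flips))
                if all(flips[i - 5:i])]

rate = sum(after_streak) / len(after_streak)
print(f"P(heads after five heads) = {rate:.3f}")  # close to 0.5, not lower
```

The gambler's fallacy predicts the rate should dip well below one half ("tails is due"); the simulation shows no such dip, because each flip is independent of the run that preceded it.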

There are also potentially distinct issues associated with sequential events. These could plausibly reflect a different process from the gambler's fallacy observed in gambling; it would be interesting to see whether similar patterns appear in investment scenarios, where decisions are regularly made from sequential information such as daily or hourly performance. One possible approach would be to use a developmental paradigm to test whether anything like the gambler's fallacy arises before an understanding of the properties of chance develops. In this light, the phenomena described here are easy to understand: beliefs about a relationship with previous results can provide a valuable (or, in the case of random events, the only apparent) source of information about what might happen next, and they increase our sense of predictability. [Sources: 4]

We often select past experiences that we believe should resemble future events, or that we believe reflect an ideal outcome. However, once a base rate is established, or at least fairly estimated, we can predict when certain events will return toward the average. We see regression toward the mean because each successive event tends to fall closer to the original baseline (assuming the baseline stays the same). [Sources: 1, 3]
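
A small simulation makes regression toward the mean concrete. The skill-plus-luck model and all its parameters are assumed purely for illustration: each person has a stable skill, each period's score adds independent luck, and the top performers of one period score markedly closer to the overall average in the next.

```python
import random

random.seed(1)
n = 50_000
# Stable underlying skill, plus independent per-period luck of equal spread.
skill = [random.gauss(0, 1) for _ in range(n)]
period1 = [s + random.gauss(0, 1) for s in skill]
period2 = [s + random.gauss(0, 1) for s in skill]

# Select the top decile of period-1 performers.
cutoff = sorted(period1)[int(0.9 * n)]
top = [i for i in range(n) if period1[i] >= cutoff]

avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
print(f"top decile, period 1:  {avg1:.2f}")   # far above the mean of 0
print(f"same people, period 2: {avg2:.2f}")   # still above 0, but much closer
```

The group was partly selected for good luck, and luck does not repeat; only the skill component carries over, so the follow-up average sits roughly halfway back toward the baseline.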

In fact, the same is true in both the frequency domain and the probability domain. Going one-on-one against LeBron James will not change the result, while playing someone at your own skill level makes the outcome more or less random. Unless you think your mindset and process genuinely change every week, the most likely outcome for any given week is your long-term return on investment. [Sources: 1, 8]

I find the gambler's fallacy so powerful because it is difficult for people to grasp how reversion to the mean occurs. The fallacy can be severe enough to distort decision making, because people tend to "act" after outlier events. [Sources: 1]

Outcome bias can be explained by traders' tendency to judge a decision by its result rather than by the quality of the decision at the time it was made. To illustrate, consider a trader using a system that relies primarily on Fibonacci retracements and extensions; to avoid outcome bias, the trader must think of trading as a portfolio of many decisions rather than judging each trade in isolation. [Sources: 6]

Confirmation bias can be explained as a tendency to seek, interpret, focus on, and remember information in a way that confirms a preconception. Distortion: after generalizing, people distort information to fit their existing beliefs. Framing effect (cognitive bias): people respond differently to the same choice depending on how it is presented. [Sources: 0, 6]

Anchoring can be defined as the tendency to rely on, or over-weight, one trait or piece of information when making decisions (usually the first piece learned on the topic). Cognitive biases are best explained as systematic errors in thinking that influence people's decisions and judgments; they arise as our brains try to simplify information processing. [Sources: 6]

Hindsight bias: sometimes called the "knew it all along" effect, the tendency to view past events as predictable once they have happened, even though there was little objective basis for predicting them. Gambler's fallacy: the tendency to think that future probabilities are altered by past events when they have not really changed. [Sources: 0, 7]

Focusing effect: the tendency to attach too much importance to one aspect of an event. Illusion of control: the tendency to overestimate one's degree of influence over external events. [Sources: 7]

Conjunction fallacy: the tendency to assume that specific conditions are more probable than general ones. Information bias: the tendency to seek information even when it cannot affect action. Base rate fallacy (base rate neglect): the tendency to ignore base-rate information (general prevalence) and focus on specific information (information relevant only to the particular case). [Sources: 0, 7]

Availability heuristic. The tendency to overestimate the likelihood of events that are more "available" in memory, which may depend on how recent the memories are or how unusual or emotionally charged they were. The gambler's fallacy, also known as the Monte Carlo fallacy, occurs when a person mistakenly believes that a random event is less or more likely to happen based on the outcome of a previous event or series of events. The hot-hand fallacy (also known as the hot-hand phenomenon, or simply the hot hand) is the belief that a person who has succeeded in a random event is more likely to succeed in further attempts. Both errors stem from a misunderstanding of the law of large numbers. [Sources: 1, 2, 7]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.studystack.com/flashcard-2548692

[1]: https://www.actionnetwork.com/education/jonathan-bales-endowment-effect-gamblers-fallacy-betting-fantasy-sports

[2]: https://www.investopedia.com/terms/g/gamblersfallacy.asp

[3]: https://thedecisionlab.com/biases/gamblers-fallacy/

[4]: https://www.nature.com/articles/palcomms201650

[5]: https://bizzbucket.co/gamblers-fallacy-why-it-matters-in-business/

[6]: https://tgfx-academy.com/lessons/lesson-2-psychological-biases/

[7]: https://uxinlux.github.io/cognitive-biases/

[8]: https://marro.io/bias/interpretation/probability.html

Hot-Hand Fallacy

Beyond its most famous findings, this line of research appears to have shown that people tend to overestimate the hot-hand effect and to spot patterns in randomness where none exist. The researchers argue that people misinterpret randomness and draw the wrong conclusions. [Sources: 0, 11]

The first of these biases is the gambler's fallacy, which leads a person to assume that a long run of heads makes tails more likely, and vice versa. Bettors' reaction to this belief – winners placing safer bets on the assumption that they are due to lose, and losers placing long-shot bets in the belief that their luck is about to change – has the opposite effect of lengthening their streaks. [Sources: 8, 10]

Perhaps people on winning streaks are simply better at betting than people who never heat up. The gambler's-fallacy idea is that during a losing streak a player's luck is due to turn, so he will start winning. Likewise, in gambling, when we are on a winning streak, the hot-hand fallacy leads us to believe our success will continue. [Sources: 9, 10]

In other words, the hot-hand fallacy is the belief that winning many times in a row increases your chances of winning the next bet, while the gambler's fallacy is the belief that losing many times in a row increases your chances of winning the next bet. Both beliefs are wrong. A hot-hand error occurs when a player thinks the probability of winning is greater after a run of wins. The tendency of winning players to exchange a lottery ticket for more tickets rather than cash is consistent with the hot-hand phenomenon: people who have won in the past believe they are more likely to win again. Belief in the hot hand stems from the illusion of control, in which people believe that they or others can influence randomly determined events. [Sources: 8, 10]

The hot-hand fallacy is a psychological phenomenon in which people judge a person to be "hot" or "cold" based on past performance, even though that performance has no bearing on future results. The hot hand is a social cognitive bias in which a person believes that past success predicts success in future attempts. Key point: the hot hand is the belief that after a string of successes, a person or organization is more likely to keep succeeding. Belief in the hot hand is shared by gamblers and investors alike, and psychologists believe it comes from the same source: the representativeness heuristic. [Sources: 6, 8]

While there is some truth to the idea that a person can be in good form, or that confidence can be reinforced by initial success, the hot-hand phenomenon goes beyond this: people make statistically inaccurate assumptions. The fact that a team or player does well over a short period does not contradict their overall average, but the hot-hand fallacy leads us to believe it does. Most likely we will place bets reflecting this logical error and, as a result, lose money. This is why it is so easy for a trader to believe, after a series of winning trades, that his hand is hot and he cannot lose. [Sources: 3, 9, 11]

Hot-hand bias occurs when a trader believes that because they are on a winning streak – because they have a hot hand – they cannot lose. Belief in the hot hand is also prevalent in sports, especially basketball, where a player's performance is believed to be significantly better during a streak of successful shots. When something like this happens, most observers and players assume that the player in question has a "hot hand" – that for some reason he has entered a state that makes shooting easier than usual (or even easier than usual, in the case of a born scorer like Thompson). When people see a streak like Craig Hodges hitting 19 three-pointers in a row, or some other outstanding performance, they usually attribute it to a hot hand. [Sources: 0, 2, 8, 11]

The idea that basketball players can have a hot hand – a stretch in which they magically seem to hit shot after shot – resonates with sports reporters and audiences alike. This column examines whether the hot-hand idea has any real foundation. In basketball, recent research has mainly focused on controlled settings such as shooting experiments, NBA three-point contests, and free throws, and researchers find evidence of a hot hand even under these controlled conditions (Arkes 2010, Miller & Sanjurjo 2018, Miller & Sanjurjo 2019). On the opposing view, belief in the hot hand is simply an illusion that arises because we humans are predisposed to see patterns in randomness: we see streaks even though the shooting data is essentially random. [Sources: 2, 5]

GVT concluded that the hot hand is a "cognitive illusion": people's tendency to detect patterns in randomness, and to regard perfectly typical streaks as atypical, led them to believe in an illusory hot hand. Importantly, GVT found that professionals (players and coaches) were not only victims of the fallacy, but that their faith in the hot hand was remarkably persistent. GVT generated significant interest in the hot hand across various fields and sports, including baseball (Green and Zwiebel, 2018), horseshoe pitching (Smith, 2003), tennis (Klaassen and Magnus, 2001), and bowling (Dorsey-Palmateer and Smith, 2004). Research by University College London psychology professor Nigel Harvey and graduate student Juemin Xu, published in the May 2014 issue of Cognition, found that gamblers on an online betting site who believed in the common gambler's fallacy ended up experiencing the opposite effect, the hot-hand delusion (via Cardiff Garcia). [Sources: 2, 5, 10]

There is also a hot-hand tendency in financial markets, where traders delegate their investment decisions to professional fund managers with proven track records, believing the managers can consistently sustain high performance. But just as a coin produces random streaks when tossed, a basketball player produces random streaks when shooting. Basketball players and fans alike tend to believe that a player's chances of making a shot are higher after a hit than after a miss. Put to the test, this intuition fails: players' field-goal percentage after streaks of hits was not significantly higher than after streaks of misses. [Sources: 1, 2, 11]

The results showed that these shooters made 57.3% of their shots after a streak of three or more hits and 57.5% after a streak of three or more misses. Slightly more than half of the shooters (52%) had fewer hit or miss streaks than expected, and the other 48% had more. Thus there is no consistent pattern linking hits, misses, and the next shot. [Sources: 4]
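The streak analysis described above can be sketched as a simulation. This is a minimal illustration with independent 50% shots, not GVT's actual shooting data (the hit probability, sample size, and seed are assumptions):

```python
import random

# Under independent shots with a fixed hit probability, the hit rate
# after a run of makes should match the hit rate after a run of misses.
def rate_after_run(shots, run_value, run_len=3):
    """Empirical hit rate on shots immediately following `run_len`
    consecutive outcomes equal to `run_value` (1 = make, 0 = miss)."""
    hits = total = 0
    for i in range(run_len, len(shots)):
        if all(s == run_value for s in shots[i - run_len:i]):
            total += 1
            hits += shots[i]
    return hits / total

random.seed(0)
shots = [1 if random.random() < 0.5 else 0 for _ in range(100_000)]
print(f"after 3+ makes:  {rate_after_run(shots, 1):.3f}")
print(f"after 3+ misses: {rate_after_run(shots, 0):.3f}")
```

Both rates land near 0.5, mirroring the near-identical 57.3%/57.5% figures above. Note that in short sequences the streak-conditioned rate is subtly biased downward (the Miller & Sanjurjo point mentioned earlier); a long sequence keeps that bias negligible.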

The research aimed to show that people are wrong to believe that players have hot hands and are more likely to make another successful shot after a series of hits. The study, which examined the intuitive concept of the hot hand and belief in shooting streaks, drew on mathematical psychology, behavioral decision-making, heuristics, and cognitive psychology. [Sources: 8, 11]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.thecut.com/2016/08/how-researchers-discovered-the-basketball-hot-hand.html

[1]: https://www.sciencedirect.com/science/article/pii/0010028585900106

[2]: https://www.scientificamerican.com/article/momentum-isnt-magic-vindicating-the-hot-hand-with-the-mathematics-of-streaks/

[3]: http://changingminds.org/explanations/theories/hot_hand.htm

[4]: https://www.samford.edu/sports-analytics/fans/2017/The-Hot-Hand-Myth-or-Reality

[5]: https://voxeu.org/article/basketball-s-hot-hand-myth-or-reality

[6]: https://www.investopedia.com/terms/h/hot-hand.asp

[7]: https://link.springer.com/article/10.3758/BF03206327

[8]: https://corporatefinanceinstitute.com/resources/knowledge/trading-investing/hot-hand/

[9]: https://thedecisionlab.com/biases/hot-hand-fallacy/

[10]: https://www.businessinsider.com/the-gamblers-fallacy-and-the-hot-hand-2014-4

[11]: https://capital.com/hot-hand-fallacy-bias

Plan Continuation Bias

One of the key issues is the possibility of plan continuation bias in linear processes, and how easy it is to become path dependent, increasing risk and closing off the capacity to adapt the deeper you go into a course of action. It is therefore important to understand that continuation errors can occur, and pilots must be aware of the risks of failing to analyze changes in a situation and consider their consequences in order to determine whether a different course of action is more appropriate. As workload increases, especially in single-pilot operations, there is less and less mental capacity to process these changes and consider the potential impact they could have on the original plan. Plan continuation bias can prevent crews from realizing that they need to change their course of action. [Sources: 6, 12, 13]

Plan continuation bias can be defined as the tendency of individuals to continue with an original course of action that is no longer viable, often despite changing conditions and current information about the situation (APA, 2020). Perhaps the most famous application of this concept is the airline pilot who is unexpectedly confronted with bad weather (or changing conditions) on approach but, instead of diverting to a different runway or aborting the landing, decides to press on with landing at the originally planned destination. A NASA study of nine major plane crashes in the United States between 1990 and 2000, in which crew error was considered a likely cause, found that pilots' plan continuation bias tends to increase as they get closer to their destination. [Sources: 6, 9]

In other words, the closer the aircraft gets to the final approach and landing phase, the more likely the crew is to continue with the plan even when conditions change. [Sources: 6]

Simply put, when the journey is nearly complete, people tend to run on autopilot, ignoring changing and potentially hazardous environmental factors. A "mission at any cost" mentality tends to creep in and overwhelm the crew's judgment. Fatigue and stress are secondary factors, but they have a major impact on a pilot's susceptibility to non-compliance with rules and procedures. [Sources: 7]

Situational awareness (SA) failures occur when plan continuation bias prevents the pilot from detecting important cues, or when the pilot cannot recognize the meaning of those cues. Plan continuation bias is more common in single-pilot light-aircraft operations, yet NASA devotes no resources to forensic investigations of every small aviation incident. A 2004 NASA Ames human factors study found persistent plan continuation bias; the study analyzed 19 plane crashes caused by crew error between 1991 and 2000. [Sources: 0, 5, 13]

In some incidents we observed a snowball effect, in which decisions or actions at one stage of the flight increased the crew's vulnerability to making mistakes later. For example, a crew that continued a highly questionable approach during a thunderstorm found themselves in a high-workload situation, which may have contributed to their forgetting to arm the spoilers. NASA's accident analysis has focused on human behavior of exactly this kind. [Sources: 7, 12]

The bias arises when people stick to a plan even when it is going wrong. Cues suggesting a change of plan, even when people see and recognize them, often fail to steer them in a different direction; when those cues are weak or ambiguous, and when canceling the plan is somehow costly, it is not difficult to predict which way people will go. Accident investigators often conclude that accidents result from this bias: the idea of breaking off or changing the approach becomes not merely inconvenient, costly, or unpleasant, but literally unthinkable. Simply put, plan continuation bias is the tendency all of us have to continue on a path we have already chosen, without carefully checking whether it is still the best, or even a sensible, course of action. [Sources: 1, 10, 11]

This particular form of cognitive bias is more complex (especially when viewed through the lens of plane crashes) than described here, but the concept is interesting because it can be studied in relation to the many decisions and paths we persist in pursuing despite current information, or even warning signs, suggesting they may not be the best or most appropriate course of action. In this article I will describe how this bias permeates our psychology, observe how it operates in plane crashes, and then examine its impact on financial markets. Investors will learn how to combat this bias and improve their trading. [Sources: 1, 9]

I think there is something in these stories that suggests plan continuation bias (or what aviation pilots call "get-there-itis"): the tendency of people to continue their original course of action despite changing conditions, even when the plan is no longer viable. In the event of a GPS failure, it may have to do with a feeling that the technology will sort itself out, that we will find a better way or a way out right around the next corner. In aviation this tendency to press on is commonly known as "get-there-itis", and it is often fatal, especially among less experienced pilots. It is a strange name for goal-driven persistence: plan continuation bias, an unconscious cognitive bias toward continuing with the original plan despite changing conditions, can be fatal to general aviation pilots. [Sources: 0, 4, 6]

This bias can be especially strong during the approach phase, when only a few extra steps are needed to complete the original plan, and it can prevent pilots from noticing subtle indications that the initial conditions have changed. Looking at the list of cognitive biases, plan continuation bias is one of the most useful to keep in your pack of mental-model cheat sheets. The further you stray down the wrong path, the stronger the bias becomes: task saturation sets in, situational awareness erodes, and you go fully defensive, no longer thinking ahead. [Sources: 0, 10, 12]

This quick and flawed mental simulation leads pilots to leave out many important factors. Single-pilot flying in light aircraft requires sound preparation, healthy routines, excellent motor skills, and an understanding of our cognitive biases. [Sources: 1, 5]

We can be as guilty of sticking to the plan as the pilots described above. We bought a stock because it seemed a good idea at the time, and we continue to hold it even after the reason for the purchase has disappeared or proven unfounded. In hindsight, we can see that we should have changed our plans to adapt to changing conditions. [Sources: 1, 9]

When studying and analyzing plane crashes, it is very easy to fall into what cognitive scientists call hindsight bias. Related entries: Automation bias – the tendency to over-rely on automated systems, which can lead to incorrect automated information overriding correct decisions. Module attribution error – in human-robot interaction, the tendency of people to make systematic errors when interacting with a robot. [Sources: 2, 12]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://generalaviationnews.com/2013/05/20/protect-yourself-from-get-there-itis/

[1]: https://timlshort.com/2019/04/24/plan-continuation-bias-in-financial-markets/

[2]: https://en.wikipedia.org/wiki/List_of_cognitive_biases

[3]: https://www.ignitecsp.com/blog/plan-continuation-bias-or-oh-crap-what-do-i-do-now/

[4]: https://sakasandcompany.com/get-there-itis/

[5]: https://www.cessnaflyer.org/flighttraining/item/799-plan-continuation-bias-just-another-name-for-get-there-itis.html

[6]: https://www.onlydeadfish.co.uk/only_dead_fish/2020/06/the-danger-of-blindly-following-machines.html

[7]: http://aviationsafetyblog.asms-pro.com/blog/the-internal-aviation-sms-threat-plan-continuation-bias

[8]: https://criticaluncertainties.com/tag/plan-continuation-bias/

[9]: https://wiseducationblog.com/2020/08/22/when-sticking-to-the-plan-makes-us-stuck-plan-continuation-bias-and-university-career-plans/

[10]: https://medium.com/10x-curiosity/plan-continuation-bias-60efcc2b4cbe

[11]: https://thedailycoach.substack.com/p/we-must-eliminate-plan-continuation

[12]: https://humansystems.arc.nasa.gov/flightcognition/article2.htm

[13]: https://skybrary.aero/articles/continuation-bias

Time-Saving Bias

When asked to estimate how much time can be saved by increasing speed, people tend to underestimate the time saved when starting from a relatively low speed and to overestimate the time saved when starting from a relatively high speed. Drivers were presented with a scenario of accelerating from a relatively low speed in order to reach their destination on time, and were asked to estimate the time that could be saved by switching to higher speeds. A bias toward overvaluing time savings at high speeds was also found in an earlier study by Svenson (1973), in which participants judged the effect of increasing the speed of a physical object and underestimated the time saved at lower speeds. This also indicates that the bias is not limited to purely cognitive tasks, because it persists when information about the problem is based on perceptual cues or on active driving. [Sources: 3, 4, 6]

A driver may then want to accelerate to make sure of arriving on time, but may misjudge the time that can be saved by increasing speed (Svenson, 2008). The time-saving bias in active driving cannot be attributed to underestimating average speed, because participants estimated their average speed accurately. In one condition, the average time participants saved by increasing speed from 100 km/h was 2.21 minutes, significantly less than the three minutes required, t(11) = -3.228, p = 0.004; the participants therefore drove faster than necessary, gaining more time than needed when accelerating from a low speed. The mistaken idea that driving faster saves more time than it really does is called the time-saving bias, which was explored in a study at the Hebrew University of Jerusalem. The results showed that participants were biased when judging how much time they saved, across judgments of distance, time, and speed. [Sources: 6, 7]

Accelerating from 10 km/h to 20 km/h saves 30 minutes per 10 km, but accelerating from 20 km/h to 30 km/h (the same speed increase) saves only 10 minutes, and accelerating from 30 km/h to 40 km/h saves only 5 minutes. By the time we reach the speeds that are probably of interest to us, the time savings are minimal. As in the case above, increasing speed by just 5 km/h saved only two minutes, yet starting from 65 km/h it feels as though it should save a significant amount of time. Acceleration saves time, but less and less as the starting speed increases: driving at 100 km/h instead of 90 km/h over 100 km saves a paltry 7 minutes. It turns out that the widespread, almost universal assumption that we will get where we are going much faster if we drive at higher speeds is, if not false, then at least far less significant than we might imagine. Once stated, it seems obvious. [Sources: 1, 7]
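The diminishing returns above fall directly out of the travel-time formula t = d / v. A minimal sketch of the arithmetic (the speed pairs are the ones from the paragraph above):

```python
# Travel time over a fixed distance is t = d / v, so the time saved by
# raising speed from v1 to v2 is d/v1 - d/v2 -- and the saving shrinks
# rapidly as the starting speed v1 grows, even for equal increments.
def minutes_saved(distance_km, v1_kmh, v2_kmh):
    return 60 * (distance_km / v1_kmh - distance_km / v2_kmh)

for v1, v2 in [(10, 20), (20, 30), (30, 40), (90, 100)]:
    print(f"{v1}->{v2} km/h over 10 km: {minutes_saved(10, v1, v2):.1f} min")
# The same 10 km/h increment saves 30.0, then 10.0, then 5.0 minutes;
# over 100 km, going 100 instead of 90 km/h saves only about 6.7 min.
```

The curve is hyperbolic in the starting speed, which is exactly what drivers' linear intuitions get wrong.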

Lowe believed that increasing his average speed to 65 km per hour would shorten his trip by two minutes. In a number of studies, I have found that consumers do indeed err in these judgments, overestimating the benefits of increasing an already high speed while underestimating the benefits of increasing a low one. [Sources: 5, 7]

To save time, we press the gas pedal a little harder to get there faster. In the active-driving study, an alternative speedometer display was used to counteract the biased judgments. The procedure was then repeated for a different speed (and distance). [Sources: 2, 6, 7]

Great efforts are made to persuade motorists not to speed, especially during holiday periods. Once we see how little time speeding actually saves, we should be less frustrated when stuck behind slower drivers, and we reduce the risk of crashes or run-ins with road patrols. [Sources: 1]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://nlpnotes.com/2014/03/23/time-saving-bias/

[1]: https://www.futurelearn.com/info/courses/logical-and-critical-thinking/0/steps/9130

[2]: https://www.tandfonline.com/doi/full/10.1080/00140139.2015.1051592

[3]: https://www.sciencedirect.com/science/article/pii/S1369847811000659

[4]: https://pubmed.ncbi.nlm.nih.gov/20728651/

[5]: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2383205

[6]: http://journal.sjdm.org/13/13309/jdm13309.html

[7]: https://www.carhistory.com.au/resources/blog/does-driving-faster-actually-save-more-time

Zero-Sum Bias

For example, zero-sum bias can lead people to believe there is competition for a limited resource in one place when that resource is in fact freely available elsewhere. Zero-sum bias can also affect the way people view deals and transactions: they mistakenly believe there must always be one person who benefits most from a transaction, even when both parties benefit equally, though in different ways. Because zero-sum bias makes people believe that for one person to gain something, another person must lose the same thing, it encourages people to see social relations as antagonistic. This means zero-sum bias can lead people to mistakenly believe that, within a group, they are competing for specific resources with other members of that social group. [Sources: 4, 12]

Applied to judgments between groups, the zero-sum heuristic concludes that a gain for another group (the outgroup) means a corresponding loss for one's own group (the ingroup). Zero-sum bias describes the intuitive judgment of a situation as zero-sum – treating the resources gained by one party as a corresponding loss to the other party – when in fact it is not. It is a cognitive bias that leads people to (incorrectly) view certain situations as zero-sum, mistakenly believing that one party's gains are directly offset by the other party's losses. [Sources: 2, 4, 5, 12]

Zero-sum bias is the cognitive bias toward zero-sum thinking: people tend to judge intuitively that a situation is zero-sum even when it is not. Discussions of zero-sum bias usually assume that people in general lean toward zero-sum thinking, and that the remedy is to be more willing to treat situations as positive-sum. Making people more likely to view a situation as positive-sum does sometimes make them more cooperative, but this factor may be overestimated. [Sources: 8, 11]

Placing two groups in a non-zero-sum situation, where cooperation leads to mutual gain and competition to mutual loss, as in the Robbers Cave experiment (Sherif et al., 1961), can reduce zero-sum bias (Wright, 2000). One of the main ideas of economics is that trade is mutually beneficial, making both sides better off than they were before. One of the mistakes we often make when thinking about trade is to view it as a contest rather than a win-win. [Sources: 5, 10]

A situation involving a set of parties is zero-sum if one party's gain is another's loss, while a situation is positive-sum if the parties can achieve a better overall outcome by cooperating with each other. Zero-sum thinking includes cases where a situation is treated as zero-sum, meaning that any party's gains are seen as directly offset by the losses of the other parties involved, or that one domain's expansion must come at the expense of another's contraction. [Sources: 3, 4]
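The zero-sum/positive-sum distinction above can be stated precisely: a situation is zero-sum when the parties' payoffs sum to zero in every outcome. A minimal sketch (the payoff numbers are invented for illustration):

```python
# A two-party situation is zero-sum when, in every possible outcome,
# one party's gain is exactly the other party's loss.
def is_zero_sum(payoffs):
    """payoffs: list of (party_a_gain, party_b_gain) outcome pairs."""
    return all(a + b == 0 for a, b in payoffs)

poker_pot = [(10, -10), (-5, 5)]     # redistributes a fixed pot
voluntary_trade = [(3, 4), (0, 0)]   # both sides can gain, or neither
print(is_zero_sum(poker_pot))        # True
print(is_zero_sum(voluntary_trade))  # False
```

Zero-sum bias is the error of applying the first model to situations that actually look like the second.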

For example, one might hypothesize that zero-sum thinking differs among people who are highly egalitarian (Katz and Hass, 1988), prosocial (Van Lange et al., 1997; Kurzban and Houser, 2005), or from collectivist cultures (Triandis, 1995). Emotional adaptations can include jealousy (Hill and Buss, 2008), and cognitive adaptations can include the zero-sum heuristic, because the gains of others often did mean one's own loss, especially for indivisible resources such as mates and senior positions in a dominance hierarchy. Since zero-sum thinking was highly beneficial to the survival of early humans, natural selection ensured that it remains an instinctive mode of thought for modern humans. [Sources: 1, 5]

This is a harmful way of looking at the world, not only for others but also for yourself. Zero-sum thinking has been defined as a general belief system about the antagonistic nature of social relations, shared by people in a society or culture and based on the implicit assumption that the world holds a limited quantity of goods, so that when one person wins, others lose, and vice versa [...] a relatively enduring and widely shared conviction that social relations are like a zero-sum game. In other words, people who think this way believe that one person's loss is another's gain. [Sources: 0]

In negotiations, some people try to make a principled argument based on the money they bring to the company, or on the (higher) salaries others receive for the same work. Note that you also need to consider how to create value for the other party. Some people leverage job offers paying more elsewhere, negotiating on the basis of their value to others. [Sources: 7]

In such a context, people gain more resources for themselves by taking resources from others. The ancient people who survived and reproduced most successfully were the ones who intuitively grasped that one entity's acquisition of resources could only come at the expense of another entity's loss. Thus, a creature that "benefits" from a situation, or gets more of the pie, can only do so at the expense of all the other creatures, which suffer a loss or receive less pie. [Sources: 1, 3]

If a creature takes a larger piece of the pie, it means that all other creatures should be content with a smaller piece. Therefore, whenever another creature takes one unit of this limited supply – be it a toy, a piece of land or prey – it means that you have one unit less of this resource at your disposal, which increases the chances of you having a hard time surviving. [Sources: 1]

Under non-zero-sum conditions, for example when resources are effectively unlimited, applying this heuristic leads to the skewed judgment that desired resources are scarcer than they really are. Conversely, the relative neglect of opportunity costs in the context of altruistic action can be seen as a form of positive-sum bias. [Sources: 5, 8]

If you feel there is no alternative to splitting a fixed pie, you may fail to look for ways to create value. For example, a child may mistakenly believe he is in a zero-sum situation with respect to his parents' love for him and his siblings, thinking that any love given to a sibling must be taken away from the love felt for him. This study offers only an introductory look at the roots of such thinking, but it reminds us that our common disagreements on economic issues are rooted in deeper views of human personality and the fundamental nature of human relations in economic life (and beyond). [Sources: 6, 7, 10]

But a vision enriched by economic history and theology positions humans not merely as mouths devouring the Earth's resources, but as productive gardeners and sub-creators, engraved with a divine creative spark. Rather than seeing economic life as buyers versus sellers and employees versus employers, we can rethink our work in the global economy as creators and servants, workers and contributors, laboring with our neighbors to paint a larger picture of God's abundance and harmony in society. [Sources: 10]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.goalcast.com/what-is-zero-sum-thinking/

[1]: https://academy4sc.org/video/zero-sum-bias-i-win-you-lose/

[2]: https://www.mendeley.com/catalogue/dd5c3469-a59d-35bd-b6c7-16567e8b5780/

[3]: https://www.lesswrong.com/posts/aAFanvZnmPJb666EQ/fight-zero-sum-bias

[4]: https://effectiviology.com/zero-sum-bias/

[5]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3153800/

[6]: https://psych.substack.com/p/zero-sum-bias-

[7]: https://www.psychologytoday.com/us/blog/statistical-life/201804/the-zero-sum-fallacy-in-negotiation-and-how-overcome-it

[8]: https://stefanfschubert.com/blog/2020/10/6/positive-sum-bias

[9]: https://cognitivebiases.net/zerosum-bias

[10]: https://blog.acton.org/archives/122444-win-win-denial-the-roots-of-zero-sum-thinking.html

[11]: https://dbpedia.org/page/Zero-sum_thinking

[12]: https://mmpi.ie/zero-sum-bias/

Disposition Effect Bias

Investor portfolios are concentrated mainly in low-volatility multimarket funds and real estate funds, with 90% of assets in real estate investment funds, although the largest positions are held in multimarket funds. A study by Odean and Barber (1999) found that brokerage investors trade too much – to the detriment of their returns – and tend to sell winners and keep losers (the disposition effect). One of the main researchers on the topic, Terrance Odean of the Haas School of Business, published results showing that investors with discount brokerage accounts in the United States sold winners more readily than losers, consistent with investors' emotional biases: loss aversion and the cognitive belief that winners and losers will revert to the mean. [Sources: 3, 14]

This bias leads an investor to hold losing positions for too long while selling assets too quickly after they show a profit. The disposition effect is a behavioral bias rooted in investors' unwillingness to realize losses. It refers to the tendency of investors to sell assets that have risen in value while keeping assets that have fallen in value. Alexander Joshi summarized the disposition effect as the tendency of investors to hold losing positions longer than winning ones, noting that investors exhibit risk-seeking by holding on to losers, because they dislike losses and hope to avoid realizing them. [Sources: 1, 4, 8]

Alternatively, investors may wish to lock in gains to avoid the risk that a winner falls back. Dacey and Zielonka showed that, despite the disposition effect, the greater the volatility of stock prices, the more likely investors are to sell losers. The disposition effect refers to the tendency of investors to sell winning stocks prematurely and to hold losing stocks for too long (Shefrin & Statman, 1985). It is one of the most robust and well-documented decision-making biases. [Sources: 4, 7]

The disposition effect revolves around the view that investors have an emotional bias toward loss aversion. The rational and emotional responses that lead investors to exhibit the disposition effect have much in common with another bias, loss aversion. The disposition effect has been described as one of the most robust regularities in individual investor behavior: investors hold depreciated stocks but sell appreciated ones, parting with stocks that have risen in value when they could hold them in the hope of higher returns. [Sources: 2, 4, 14]

Investors tend to sell assets that have brought them positive returns and are reluctant to let go of those that have produced losses. This article further notes that even seasoned investors are more likely to sell winning assets and hold on to losing ones. [Sources: 3, 13]

This paper calculates the proportion of gains realized and the proportion of losses realized to test whether investors exhibit the disposition effect. An alternative way to study the effect is to treat realized and paper gains and losses as independent not at the transaction level but at the account or investor level [13]. The disposition effect concerns how investors treat unrealized gains and losses on financial assets. [Sources: 3, 6, 9]
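The proportion-of-gains-realized measure mentioned above can be sketched as follows. This is a minimal illustration of the standard PGR/PLR calculation (Odean, 1998); the counts below are hypothetical, not data from any of the cited studies.

```python
# Sketch of the disposition-effect measure computed at the investor level:
# PGR = realized gains / (realized gains + paper gains)
# PLR = realized losses / (realized losses + paper losses)
# A disposition effect is indicated by PGR > PLR.

def disposition_measure(realized_gains, paper_gains,
                        realized_losses, paper_losses):
    """Return (PGR, PLR) for one investor's positions over a period."""
    pgr = realized_gains / (realized_gains + paper_gains)
    plr = realized_losses / (realized_losses + paper_losses)
    return pgr, plr

# Hypothetical investor who readily sells winners but sits on losers:
pgr, plr = disposition_measure(realized_gains=30, paper_gains=70,
                               realized_losses=10, paper_losses=90)
print(pgr, plr, pgr - plr)  # 0.3 0.1 0.2 -> positive gap = disposition effect
```

The difference PGR − PLR is the "DE" statistic reported later in the text for the SRD and warrant investor groups.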

However, many studies have shown that, even after controlling for rebalancing and stock prices, the disposition effect persists, and the investments investors choose to sell continue to outperform the losers they keep in the following months [cf. Odean (1998), Brown et al.]. Whichever argument is used, rational or behavioral, the presence of a disposition effect means that investors (individuals or institutions) do not earn optimal returns. Among the behavioral explanations of the disposition effect, the foremost holds that investors have an S-shaped value function, as posited by prospect theory. Likewise, investors exhibit the opposite of the disposition effect when they perceive an investment as progress toward a specific investment goal rather than as part of an overall portfolio. [Sources: 1, 4, 9]

At this level of analysis, Dhar and Zhu (2006) confirm a significant disposition effect on average, but show that a fifth of investors exhibit the opposite behavior and that the effect is stronger for less experienced investors. This article provides evidence that risk-averse investors are more prone to the disposition effect, that men are less prone to this cognitive bias, and that age is not associated with it. It also shows that gender is an important trait for understanding cognitive biases, and that an investor's experience may not necessarily soften the disposition effect in a liquidity-constrained market. [Sources: 3, 9]

Previous research has mainly focused on how demographic characteristics (e.g., age, gender), investor preferences (e.g., trading frequency), and the trading environment (e.g., the salience of a security's purchase price) influence the disposition effect (Taylor and Ogilvy, 1994; Chen et al., 2007; Da Costa et al., 2008). Our findings provide valuable guidance for individual investors making financial decisions based on their characteristics. Odean (1998, Journal of Finance, 53(5), 1775-1798) argues that the reasons for the disposition effect are more consistent with behavioral explanations, since his research showed that investors still exhibit the effect even when all of the arguments listed earlier are controlled for. A corollary of this logic is that the consistency of past performance acts as a fundamental incentive to realize a gain or loss if a mean-reversion belief affects the manager's decision-making process. [Sources: 1, 7]

The constant (beta0) measures the investor's propensity to sell stocks at a loss, and the sum of the constant plus the profit coefficient (beta0 + beta1) measures the investor's propensity to sell stocks at a gain. Profit is a dummy variable equal to 1 if the weighted average purchase price of stock j is less than its current market price, and 0 otherwise. Sale is a dummy variable equal to 1 if the quantity of asset j in the investor's account decreased between the previous month and today, and 0 otherwise. [Sources: 13]
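The regression described above can be sketched in a few lines. With a single dummy regressor, OLS reduces to group means: beta0 is the sale rate on losing positions and beta0 + beta1 the sale rate on winning positions. The account-month observations below are hypothetical, for illustration only.

```python
# Minimal sketch of the linear-probability specification
# Sale = beta0 + beta1 * Profit, estimated via group means
# (equivalent to OLS when the only regressor is a 0/1 dummy).

def fit_sale_on_profit(profit, sale):
    """Return (beta0, beta1) from lists of Profit and Sale dummies."""
    losers  = [s for p, s in zip(profit, sale) if p == 0]
    winners = [s for p, s in zip(profit, sale) if p == 1]
    beta0 = sum(losers) / len(losers)             # P(sell | position at a loss)
    beta1 = sum(winners) / len(winners) - beta0   # extra propensity on gains
    return beta0, beta1

# Hypothetical observations: Profit dummy, Sale dummy per account-month.
profit = [1, 1, 1, 1, 0, 0, 0, 0]
sale   = [1, 1, 0, 1, 0, 1, 0, 0]
b0, b1 = fit_sale_on_profit(profit, sale)
print(b0, b0 + b1)  # 0.25 0.75 -> b1 > 0 signals a disposition effect
```

A positive beta1 means winners are sold at a higher rate than losers, the regression analogue of PGR exceeding PLR.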

This table contains the dates (first column) on which at least one of the two investors opened a position (buy or sell). The results for the second group (SRD) and the third group (warrants) in Table 7 show that all four groups are prone to the bias, and that the disposition effect appears slightly lower for experienced traders (a DE of 0.045 for SRD investors and 0.043 for warrant investors) than for inexperienced ones (0.051 and 0.055, respectively). [Sources: 9]

Taken together, these results indicate that FSE can promote both risk-seeking in the domain of losses and risk aversion in the domain of gains, producing a greater willingness to sell winning stocks too early and to hold losing stocks too long. For example, [9, 10, 12, 15] explain social interaction as a form of social control and find that it strengthens the disposition effect, because investors want to preserve their reputation by postponing the recognition of losses when their transactions are visible to others. Overall, this paper suggests that social trading platforms can play a positive role in helping investors make better decisions. [Sources: 5, 7]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.investopedia.com/terms/b/bias.asp

[1]: https://www.scielo.br/scielo.php?script=sci_arttext&pid=S0034-75902015000100026

[2]: https://capital.com/disposition-effect-an-anomaly-in-behavioural-finance

[3]: https://www.emerald.com/insight/content/doi/10.1108/RAUSP-08-2019-0164/full/html

[4]: https://en.wikipedia.org/wiki/Disposition_effect

[5]: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7877604/

[6]: https://breakingdownfinance.com/finance-topics/behavioral-finance/disposition-effect/

[7]: https://www.frontiersin.org/articles/10.3389/fpsyg.2018.02705/full

[8]: https://www.nature.com/articles/s41598-021-02596-2

[9]: https://www.cairn.info/revue-finance-2009-1-page-51.htm

[10]: https://www.behavioraleconomics.com/resources/mini-encyclopedia-of-be/disposition-effect/

[11]: https://www.pbs.org/wgbh/nova/money/disposition.html

[12]: https://www.sciencedirect.com/science/article/pii/S0927539819300623

[13]: https://www.sr-sv.com/understanding-the-disposition-effect/

[14]: https://www.wrapmanager.com/wealth-management-blog/what-investors-need-to-know-about-the-disposition-effect

Dread Aversion

Likewise, if your neophilia leads to impulsivity, deliberately add time before making a decision. Dread can not only affect revision and, ultimately, results, but also contribute to the stress and anxiety around them. [Sources: 0, 5]

We would rather get negative experiences over with in order to avoid the dread of waiting for them. This desire, however, is not as strong as the desire to have positive experiences immediately. [Sources: 3]

Loss aversion is the tendency to prefer avoiding losses to acquiring equivalent gains. What separates attention to losses from loss aversion is that the former does not imply that losses carry more subjective weight (or utility) than gains. Some effects previously attributed to loss aversion can instead be explained by a simple attentional asymmetry between gains and losses. [Sources: 2]

Even when no choice is required, individual differences in the reactivity of the interoceptive system reflect how anticipated negative affect shapes evaluation, leading to a preference for avoiding losses over pursuing larger but riskier gains. Before discussing how to avoid loss aversion bias, consider another equally important related concept. Loss aversion bias is a cognitive phenomenon in which a person is affected more by losses than by gains; in economic terms, the fear of losing money outweighs the prospect of gaining an equivalent amount, and early losses can be traumatic enough to distort later decisions. To avoid myopic loss aversion, remember that you should not make buy or sell decisions in a panic, based on emotions. [Sources: 2, 6]
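The asymmetry between losses and gains described above is usually formalized with the prospect-theory value function. The sketch below uses the original Tversky-Kahneman (1992) parameter estimates (curvature alpha = 0.88, loss-aversion coefficient lambda = 2.25); these values are an assumption for illustration, not taken from the sources cited here.

```python
# Sketch of the prospect-theory value function, which formalizes loss
# aversion: a loss is weighted roughly twice as heavily as an equal gain.

def pt_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain/loss x relative to a reference point."""
    if x >= 0:
        return x ** alpha            # concave over gains
    return -lam * ((-x) ** alpha)    # convex and steeper over losses

# A $100 loss hurts more than a $100 gain pleases:
print(pt_value(100), pt_value(-100))  # ~57.5 vs ~-129.4
```

The kink at zero (losses scaled by lambda > 1) is what makes a -$100 outcome feel worse than a +$100 outcome feels good.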

But if you are avoiding risk because you are trying to avoid losses, know that the biggest risk is living without risk. Several studies of the effect of losses on decision making have found no loss aversion under risk and uncertainty. In general, the role of the amygdala in the anticipation of losses suggests that loss aversion may reflect a Pavlovian conditioned avoidance response. Traditionally, this strong behavioral tendency has been attributed to loss aversion. [Sources: 2, 6]

In other words, heightened feelings of insecurity and risk aversion contributed to an increased desire to save rather than invest. This has had an unusual effect on economic activity and risk premiums, especially in the large advanced economies. Having outlined how uncertainty and risk aversion may have affected some key parts of the global economy and financial markets in the post-GFC decade, let me now offer some hints as to what could be behind it all. A second issue, which may have contributed to heightened feelings of insecurity and risk aversion after the GFC, relates to concerns about excessive household debt in the large advanced economies. [Sources: 1]

But the key point was that, for any given rate of potential growth, neutral rates will be lower when uncertainty and risk aversion are high. At the start of their research, the authors believed that an asymmetry in anticipation best reflects how we approach future events. In this paper, using a new approach based on economic survey data to assess individual differences in anticipatory emotions, we find that the tendency to feel displeasure (dread) when anticipating future losses outweighs the pleasure (savouring) of anticipating future gains; that is, people are dread averse. Previous laboratory studies have shown that these anticipatory emotions influence decision making. [Sources: 1, 3, 8]
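The dread/savouring asymmetry described above can be illustrated with a toy function. The functional form and the weights are assumptions for illustration only, not the model used by the paper: savouring of a future gain enters with weight 1, while dread of a future loss enters with a larger weight.

```python
# Toy sketch of dread aversion: the anticipatory disutility of a future
# loss is weighted more heavily than the anticipatory utility (savouring)
# of an equal future gain.

def anticipatory_utility(future_outcome, savour_weight=1.0, dread_weight=1.5):
    """Felt utility today from anticipating a future gain or loss."""
    if future_outcome >= 0:
        return savour_weight * future_outcome   # savouring a future gain
    return dread_weight * future_outcome        # dreading a future loss

# Anticipating a $100 loss feels worse than anticipating a $100 gain feels good:
print(anticipatory_utility(100), anticipatory_utility(-100))  # 100.0 -150.0
```

With dread_weight > savour_weight, the net anticipatory value of a symmetric gamble is negative, which is why people prefer to get dreaded events over with quickly.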

Research shows that instant gratification is more deeply rooted in our DNA than dread. We often worry about the future, dreading the thought of coming misfortunes and savouring the thought of coming pleasures. In other words, we are more averse to what we dread than we are drawn to what we savour. Dread aversion can cause students not to think about or schedule their exams. [Sources: 3, 5, 9]

For example, if each group researches and explores a different part of a topic, then when the time comes to hear the other groups present their findings to the class, students may resist learning or using the other groups' knowledge. Students who rely too heavily on Google for homework and checking answers (hello again, Tool Law) may be lost when they need to recall that information. Uncertainty and risk aversion are difficult concepts to define: they cannot be observed directly and can mean different things in different contexts. [Sources: 1, 5]

Today, I will talk about uncertainty largely in a one-sided sense, that is, how people perceive the possibility of negative outcomes, with risk aversion, or animal spirits, referring to how people act in the face of that uncertainty. This is highly subjective and varies from person to person and from situation to situation. Measuring these effects requires identifying the sources of variation so that they can be reliably demonstrated in individual subjects. [Sources: 1, 2, 6]

This is the tendency to think that other people notice your behaviour and appearance more than they actually do. One study demonstrated it by having students wear an embarrassing shirt among other college students. [Sources: 5]

If your dog is overly afraid of these and other noises, such as fireworks, he may have noise aversion. Noise aversion is a fearful or anxious reaction to certain sounds, such as thunderstorms, traffic, construction work, or a vacuum cleaner. [Sources: 4]

Negativity bias is a powerful motivator, which much of modern media research forces us to confront. We found 7 cognitive biases that affect classroom learning, independent learning, and how many students feel. Loss aversion is not exclusively a subject of behavioural finance or behavioural economics. Behavioural economics is a branch of traditional economics that studies the influence of human psychology, ideology, or behaviour on individual or institutional economic decisions. Marketers use the concept of loss aversion bias to get people to buy as soon as possible. [Sources: 3, 5, 6]

Research shows that we crave a tasty snack right away but prefer to defer paying our bills. This tip will also help you overcome the planning fallacy, a related effect that explains why we often think tasks will take less time than they actually do. You may find that a few small changes to your dog's daily routine can make a big difference. [Sources: 3, 4, 5]

"The fear of you and the dread of you shall be upon every animal of the earth." "I dread getting the test results, because they might decide my whole life." [Sources: 7]

The volunteers were given the choice of either receiving a discount in one month or having an extra month to pay a bill. Avoiding the wait for a future bill was a stronger motivator than receiving a future discount. The team conducted three studies, drawing on a dozen additional studies to support their article. [Sources: 3]

 

— Slimane Zouggari

 

##### Sources #####

[0]: https://www.merriam-webster.com/dictionary/neophilia

[1]: https://www.rba.gov.au/speeches/2021/sp-so-2021-06-02.html

[2]: https://en.wikipedia.org/wiki/Loss_aversion

[3]: https://bigthink.com/smart-skills/dread-motivational-tool/

[4]: https://www.animalhospitalofspringfield.com/services/dogs/blog/scary-sounds-halloween-how-help-dog-noise-aversion

[5]: https://blog.innerdrive.co.uk/7-cognitive-biases-holding-your-students-back

[6]: https://www.wallstreetmojo.com/loss-aversion-bias/

[7]: https://wikidiff.com/aversion/dread

[8]: https://researchportal.bath.ac.uk/en/publications/dread-aversion-and-economic-preferences

[9]: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3822640