3.4 Cognitive Biases

Sally Walters and Laura Westmaas

Learning Objectives

  1. Understand why cognitive biases exist.
  2. Explain confirmation bias, how to avoid it, and its role in stereotypes.
  3. Explain anchoring bias.
  4. Explain hindsight bias.
  5. Explain overconfidence bias.
  6. Explain the framing effect.
  7. Explain counterfactual thinking.
  8. Explain cognitive dissonance.

What are cognitive biases?

Cognitive biases are patterns of flawed reasoning or memory common to all humans. Together with our perceptual biases, cognitive biases can interfere with successful communication.

Why do cognitive biases exist?

Research has shown that human thinking relies on a set of cognitive shortcuts; we all use them routinely, although we might not be aware that we are. We need these patterns of thinking in order to understand the world as quickly and efficiently as possible. While they give us a level of cognitive efficiency in terms of time and effort, they can also result in flawed problem-solving and decision-making.

Below you will find a description of several cognitive biases that impact communication.

Confirmation bias

Confirmation bias is the tendency to seek out information that confirms our existing beliefs and to ignore or discount information that disconfirms them. For example, one might believe that organic produce is inherently better: higher in nutrition, lower in pesticides, and so on. Under the influence of confirmation bias, one would pay attention to information that confirms the superiority of organic produce and ignore or disbelieve any accounts that suggest otherwise. Confirmation bias is psychologically comfortable: it lets us proceed to make decisions with our views unchallenged. However, just because something “feels” right does not necessarily make it so. Confirmation bias can lead people to make poor decisions because they fail to pay attention to contrary evidence.

A good example of confirmation bias is seen in people’s attention to political messaging. Jeremy Frimer, Linda Skitka, and Matt Motyl (2017) found that both liberal and conservative voters in Canada and the United States were averse to hearing the views of their ideological opponents. Furthermore, the participants indicated that their aversion arose not because they already felt well-informed about opposing views, but because they were strategically avoiding information that would challenge their pre-existing positions. As the researchers point out, confirmation bias can leave people on all sides of the political scene within their ideological bubbles, avoiding dialogue with opposing views and becoming increasingly entrenched and narrow-minded in their positions.

Avoiding confirmation bias and its effects on reasoning requires, first, recognizing that the bias exists and, second, working to reduce its effects by actively and systematically reviewing disconfirming evidence (Lord, Lepper, & Preston, 1984). For example, someone who believes vaccinations are dangerous might change their mind after deliberately writing down and considering the arguments in favour of vaccination.

Confirmation bias also plays a clear role in stereotypes, which are sets of beliefs, or schemas, about the characteristics of a group. John Darley and Paget Gross (1983) demonstrated how schemas about social class can influence memory. In their study, participants were given a picture of, and some information about, a Grade 4 girl named Hannah. To activate a schema about her social class, half of the participants saw Hannah pictured in front of a nice suburban house, and the other half saw her in front of an impoverished house in an urban area. Next, the participants watched a video of Hannah taking an intelligence test. As the test went on, Hannah got some of the questions right and some wrong, but the number of correct and incorrect answers was the same in both conditions. The participants were then asked to remember how many questions Hannah got right and wrong. Demonstrating that stereotypes had influenced memory, participants who thought Hannah came from an upper-class background remembered her getting more correct answers than did those who thought she was from a lower-class background. You can imagine how our stereotypes about certain groups can affect our behaviour toward members of those groups.

Anchoring Bias

Anchoring refers to the tendency to rely too heavily on a single piece of information when making a decision. Job seekers often fall into this trap by focusing on a desired salary while ignoring other aspects of the job offer, such as additional benefits, fit with the job, and working environment. Similarly, but more dramatically, lives were lost in the Great Bear Wilderness Disaster when the coroner declared all five passengers of a small plane dead within five minutes of arriving at the accident scene, halting the search for potential survivors. The next day, two of the passengers who had been declared dead walked out of the forest. How could a mistake like this have been made? One theory is that decision biases played a large role in this serious error: anchoring on the fact that flames had consumed the plane led the coroner to call off the search for any possible survivors (Becker, 2007).

Hindsight bias

From time to time, we all fall prone to the feeling, “I knew it all along!” This tendency is called hindsight bias: after an event, we construct a narrative that makes sense of it and makes its outcome seem predictable. Hindsight bias is the brain’s tendency to rewrite its knowledge of history after the fact. The feeling that we “knew it all along” is coupled with an inability to reconstruct the state of not knowing that formerly existed. Thus, we overestimate our ability to predict the future because we have reconstructed an illusory past (Kahneman, 2011).

Overconfidence Bias

Overconfidence bias occurs when individuals overestimate their ability to predict future events. Many people exhibit signs of overconfidence. For example, 82 percent of drivers surveyed feel they are in the top 30 percent of safe drivers, 86 percent of students at the Harvard Business School say they are better looking than their peers, and doctors consistently overestimate their ability to detect problems (Tilson, 1999). People who purchase lottery tickets as a way to make money are probably suffering from overconfidence bias: a person who drives ten miles to buy a lottery ticket is roughly three times more likely to be killed in a car accident on the trip than to win the jackpot (Orkin, 1991). Further, research shows that overconfidence leads to less successful negotiations (Neale & Bazerman, 1985). To avoid this bias, take the time to stop and ask yourself if you are being realistic in your judgments.
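
To get a feel for the arithmetic behind a claim like Orkin’s, consider a rough back-of-the-envelope calculation. The figures below are illustrative assumptions, not numbers taken from Orkin (1991): jackpot odds of about 1 in 14 million (typical of a 6/49-style lottery), a traffic fatality rate of about 1 death per 100 million miles driven, and the ten-mile trip counted as a 20-mile round trip.

  • Probability of winning the jackpot: 1 / 14,000,000 ≈ 0.00000007
  • Probability of a fatal accident on the trip: 20 miles × (1 / 100,000,000 miles) = 0.0000002
  • Ratio: 0.0000002 ÷ 0.00000007 ≈ 3

Under these assumed figures, dying on the drive is roughly three times more likely than winning the jackpot, consistent with the ratio reported above.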

Framing effect

Humans tend to avoid risk when making decisions about potential gains, although, as we will see, framing can reverse this tendency. The framing effect is the tendency for judgments to be affected by the framing, or wording, of a problem. When asked to choose between two alternatives, one framed to avoid loss and the other to maximize gain, people make different choices about the same underlying problem. Furthermore, emotional reactions to the same event framed differently are also different. For example, being told that the probability of recovery from an illness is 90% is more reassuring than being told that the mortality rate of the illness is 10% (Kahneman, 2011). Amos Tversky and Daniel Kahneman (1981) devised an example, now known as the “Asian disease problem,” to show the framing effect:

Imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows:

  • If Program A is adopted, 200 people will be saved. [72%]
  • If Program B is adopted, there is 1/3 probability that 600 people will be saved and 2/3 probability that no people will be saved. [28%]

Which of the two programs would you favour?

The majority of participants (72%, shown in brackets after each option) chose Program A as the best choice. Clearly, they favoured a sure thing over a gamble. In another version of the same problem, the options were framed differently:

  • If Program C is adopted, 400 people will die. [22%]
  • If Program D is adopted, there is 1/3 probability that nobody will die and 2/3 probability that 600 people will die. [78%]

Which of the two programs would you favour?

This time, the majority of participants (78%) chose the uncertain Program D as the best choice. But look closely, and you will note that the consequences of Programs A and C are identical, as are the consequences of Programs B and D. In the first problem, participants chose the sure gain over the riskier, though potentially more lifesaving, option. In the second problem, they made the opposite choice, favouring the risky option of 600 people possibly dying over the certainty that 400 would die. In other words, people are risk-seeking when choices are framed as losses and risk-averse when the same choices are framed as gains. The important message here is that the framing of the problem alone causes people to shift from risk aversion to risk acceptance; the problems are identical. Daniel Kahneman (2011) pointed out that even trained medical and policy professionals reason this way, demonstrating that education and experience may not be powerful enough to overcome these cognitive tendencies.
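
To see exactly why the pairs are equivalent, it helps to compute the expected outcomes using only the numbers given in the problem:

  • Program A: 200 of the 600 people are saved for certain, so the expected outcome is 200 survivors and 400 deaths.
  • Program B: 1/3 × 600 + 2/3 × 0 = 200 expected survivors, and therefore 400 expected deaths.
  • Program C: 400 of the 600 people die for certain, so the expected outcome is 400 deaths and 200 survivors.
  • Program D: 2/3 × 600 = 400 expected deaths, and therefore 200 expected survivors.

All four programs share the same expected outcome of 200 survivors and 400 deaths. Programs A and C differ from B and D only in certainty versus risk, and the two problems differ only in whether that outcome is described as lives saved or lives lost.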

Counterfactual thinking

All of us are prone to thinking about past events and imagining that they may have turned out differently. If we can easily imagine an outcome that is better than what actually happened, then we may experience sadness and disappointment; on the other hand, if we can easily imagine that a result might have been worse than what actually happened, we may be more likely to experience happiness and satisfaction. The tendency to think about and experience events according to “what might have been” is counterfactual thinking (Kahneman & Miller, 1986; Roese, 2005).

Imagine, for instance, that you were participating in an important contest and finished in second place, winning the silver medal. How would you feel? Certainly, you would be happy that you won the silver medal, but wouldn’t you also be thinking about what might have happened if you had been a little bit better — you might have won the gold medal! On the other hand, how might you feel if you won the bronze medal for third place? If you were thinking about the counterfactuals — that is, the “what might have been” — perhaps the idea of not getting any medal at all would have been highly accessible; you’d be happy that you got the medal that you did get, rather than coming in fourth.

Victoria Medvec, Scott Madey, and Thomas Gilovich (1995) investigated this idea by videotaping the responses of athletes who won medals in the 1992 Summer Olympic Games (see Figure 3.4.1). They videotaped the athletes first as they learned that they had won a silver or a bronze medal and again as they were awarded the medal. Then, the researchers showed these videos, without any sound, to raters who did not know which medal each athlete had won. The raters were asked to indicate how they thought the athlete was feeling, using a range of feelings from “agony” to “ecstasy.” The results showed that the bronze medalists were, on average, rated as happier than the silver medalists. In a follow-up study, raters watched interviews with many of these athletes as they talked about their performance. The raters indicated what we would expect on the basis of counterfactual thinking: the silver medalists talked about their disappointment in having finished second rather than first, whereas the bronze medalists focused on how happy they were to have finished third rather than fourth.

Figure 3.4.1. Gold, silver, and bronze medalists at the Olympic Games. Counterfactual thinking might be a factor here: does the bronze medalist look happier to you than the silver medalist? Medvec, Madey, and Gilovich (1995) found that, on average, bronze medalists were happier.

You might have experienced counterfactual thinking in other situations. If you were driving across the country and your car was having engine trouble, you might feel an increased desire to make it home as you approached the end of your journey; breaking down only a short distance from home would feel far more disappointing than breaking down far away. Perhaps you have noticed that once you get close to finishing something, you feel like you really need to get it done. Counterfactual thinking has even been observed in juries: jurors asked to award monetary damages to accident victims offered substantially more compensation when the accident had nearly been avoided than when it seemed inevitable (Miller, Turnbull, & McFarland, 1988).


Psychology in Everyday Life

Cognitive dissonance and behaviour

More than 60 years ago, Leon Festinger (1957) produced one of the most important findings in social-cognitive psychology: holding two contradictory attitudes or beliefs at the same time, or acting in a way that contradicts a pre-existing attitude, creates a state of cognitive dissonance, a feeling of discomfort or tension that people actively try to reduce. The dissonance can be reduced either by changing the behaviour or by changing the belief. For example, a person who smokes cigarettes while believing in their harmful effects is experiencing incongruence between action and belief; this creates cognitive dissonance. The person can reduce the dissonance either by changing their behaviour (e.g., giving up smoking) or by changing their belief (e.g., convincing themselves that smoking isn’t that bad, rationalizing that lots of people smoke without harmful effects, or denying the evidence against smoking). Another example is a parent who spanks their child while believing that spanking is wrong. The dissonance created here can similarly be reduced either by changing the behaviour (e.g., no longer spanking) or by changing the belief (e.g., rationalizing that the spanking was a one-off situation, adopting the view that spanking is occasionally justified, or reasoning that many adults were spanked as children).

Cognitive dissonance is both cognitive and social because it involves thinking and, sometimes, social behaviour. Festinger (1957) wanted to see how believers in a doomsday cult would react when told the end of the world was coming and, later, when the prophecy failed to come true. By infiltrating a genuine cult, Festinger was able to extend his research into the real world. The cult members devoutly believed that the end was coming but that they would be saved by an alien spaceship, as their leader had prophesied. Accordingly, they gave away their possessions, quit their jobs, and waited for rescue. When the prophesied end and rescue failed to materialize, the cult members presumably experienced an enormous amount of dissonance: their strong beliefs, backed up by their behaviour, had been contradicted by events. However, instead of reducing the dissonance by ceasing to believe that the end of the world was nigh, the cult members actually increased their faith, altering their belief to hold that the world had been saved by the demonstration of their faith. They became even more evangelical.

You might be wondering how cognitive dissonance operates when the inconsistency is between two attitudes or beliefs. For example, suppose a family’s sole wage earner is a staunch supporter of the Green Party, believes in the science of climate change, and is generally aware of environmental issues. Then the wage earner loses their job. They send out many job applications, but nothing materializes until a large petroleum company offers them full-time employment managing the town’s gas station. Here, environmental beliefs are pitted against the need for a job: cognitive dissonance in the making. How would you reduce your cognitive dissonance in this situation? Take the job and rationalize that environmental beliefs can be set aside this time, that one gas station is not so bad, or that there are no other jobs? Or turn down the job, retaining your environmental beliefs even though your family needs the money?


Key Takeaways

  • A variety of cognitive biases influence the accuracy of our judgments.
  • Overcoming cognitive biases may take awareness of their existence and active work.
  • Cognitive dissonance occurs when there is an inconsistency between two cognitions or between cognition and behaviour. People are motivated to reduce cognitive dissonance.

Image Attributions

Figure 3.4.1. 2010 Winter Olympic Men’s Snowboard Cross medalists by Laurie Kinniburgh is used under a CC BY 2.0 license.

References

Becker, W. S. (2007). Missed opportunities: The Great Bear Wilderness disaster. Organizational Dynamics, 36, 363–376.

Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20–33.

Festinger, L. (1957). A theory of cognitive dissonance. Evanston, IL: Row, Peterson.

Frimer, J. A., Skitka, L. J., & Motyl, M. (2017). Liberals and conservatives are similarly motivated to avoid exposure to one another’s opinions. Journal of Experimental Social Psychology, 72, 1–12.

Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.

Kahneman, D., & Miller, D. T. (1986). Norm theory: Comparing reality to its alternatives. Psychological Review, 93, 136–153.

Lord, C. G., Lepper, M. R., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47(6), 1231–1243.

Medvec, V. H., Madey, S. F., & Gilovich, T. (1995). When less is more: Counterfactual thinking and satisfaction among Olympic medalists. Journal of Personality and Social Psychology, 69(4), 603–610.

Miller, D. T., Turnbull, W., & McFarland, C. (1988). Particularistic and universalistic evaluation in the social comparison process. Journal of Personality and Social Psychology, 55, 908–917.

Neale, M. A., & Bazerman, M. H. (1985). The effects of framing and negotiator overconfidence on bargaining behaviors and outcomes. Academy of Management Journal, 28, 34–49.

Orkin, M. (1991). Can you win? The real odds for casino gambling, sports betting and lotteries. New York, NY: W. H. Freeman.

Roese, N. (2005). If only: How to turn regret into opportunity. New York, NY: Broadway Books.

Tilson, W. (1999, September 20). The perils of investor overconfidence. Retrieved March 1, 2008, from http://www.fool.com/BoringPort/1999/BoringPort990920.htm.

Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453–458.

License

3.4 Cognitive Biases Copyright © 2024 by Sally Walters and Laura Westmaas is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
