What Is the Confirmation Bias?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Confirmation bias is a cognitive bias that favors information that confirms your previously existing beliefs or biases.

For example, imagine that a person believes left-handed people are more creative than right-handed people. Whenever this person encounters someone who is both left-handed and creative, they place greater importance on this "evidence" that supports what they already believe. This individual might even seek proof that further backs up this belief while discounting examples that don't support the idea.

Confirmation bias impacts not only how we gather information, but also how we interpret and recall it. For example, people who support or oppose a particular issue will not only seek information that supports their position but will also interpret news stories in a way that upholds their existing ideas, and they will remember details in a way that reinforces these attitudes.

History of Confirmation Bias

The idea behind confirmation bias has been observed by philosophers and writers since ancient times. In the 1960s, cognitive psychologist Peter Wason conducted several experiments, known as Wason's rule discovery task, demonstrating that people tend to seek information that confirms their existing beliefs.

Signs of Confirmation Bias

When it comes to confirmation bias, there are often signs that a person is inadvertently or consciously falling victim to it. Unfortunately, it can also be very subtle and difficult to spot. Signs that might help you identify when you or someone else is experiencing this bias include:

  • Only seeking out information that confirms your beliefs and ignoring or discrediting information that doesn't support them.
  • Looking for evidence that confirms what you already think is true, rather than considering all of the evidence available.
  • Relying on stereotypes or personal biases when assessing information.
  • Selectively remembering information that supports your views while forgetting or discounting information that doesn't.
  • Having a strong emotional reaction to information (positive or negative) that confirms your beliefs, while remaining relatively unaffected by information that doesn't.

Types of Confirmation Bias

There are a few different types of confirmation bias that can occur. Some of the most common include the following:

  • Biased attention: This is when we selectively focus on information that confirms our views while ignoring or discounting data that doesn't.
  • Biased interpretation: This is when we interpret information in a way that confirms our beliefs.
  • Biased memory: This is when we selectively remember information that supports our views while forgetting or discounting information that doesn't.

Examples of the Confirmation Bias

It can be helpful to consider a few examples of how confirmation bias works in everyday life to get a better idea of the effects and impact it may have.

Interpretations of Current Issues

One of the most common examples of confirmation bias is how we seek out or interpret news stories. We are more likely to believe a story if it confirms our pre-existing views, even if the evidence presented is shaky or inconclusive. For example, if we support a particular political candidate, we are more likely to believe news stories that paint them in a positive light while discounting or ignoring those that are critical.

Consider the debate over gun control:

  • Let's say Sally is in support of gun control. She seeks out news stories and opinion pieces that reaffirm the need for limitations on gun ownership. When she hears stories about shootings in the media, she interprets them in a way that supports her existing beliefs.
  • Henry, on the other hand, is adamantly opposed to gun control. He seeks out news sources that are aligned with his position. When he comes across news stories about shootings, he interprets them in a way that supports his current point of view.

These two people have very different opinions on the same subject, and their interpretations are based on their beliefs. Even if they read the same story, their bias shapes how they perceive the details, further confirming their beliefs.

Personal Relationships

Another example of confirmation bias can be seen in the way we choose friends and partners. We are more likely to be attracted to and befriend people who share our same beliefs and values, and less likely to associate with those who don't. This can lead to an echo chamber effect, where we only ever hear information that confirms our views and never have our opinions challenged.

Decision-Making

The confirmation bias can often lead to bad decision-making . For example, if we are convinced that a particular investment is good, we may ignore warning signs that it might not be. Or, if we are set on getting a job with a particular company, we may not consider other opportunities that may be better suited for us.

Impact of the Confirmation Bias

The confirmation bias happens due to the natural way the brain works, so eliminating it is impossible. While it is often discussed as a negative tendency that impairs logic and decisions, it isn't always bad. The confirmation bias can significantly impact our lives, both positively and negatively. On the positive side, it can help us stay confident in our beliefs and values and give us a sense of certainty and security. 

Unfortunately, this type of bias can prevent us from looking at situations objectively. It can also influence our decisions and lead to poor or faulty choices.

During an election season, for example, people tend to seek positive information that paints their favored candidates in a good light. They will also look for information that casts the opposing candidate in a negative light.

By not seeking objective facts, interpreting information in a way that only supports their existing beliefs, and remembering details that uphold these beliefs, they often miss important information. These details and facts might have influenced their decision on which candidate to support.

How to Overcome the Confirmation Bias

There are a few different ways that we can try to overcome confirmation bias:

  • Be aware of the signs that you may be falling victim to it. This includes being aware of your personal biases and how they might be influencing your decision-making.
  • Consider all the evidence available, rather than just the evidence confirming your views.
  • Seek out different perspectives, especially from those who hold opposing views.
  • Be willing to change your mind in light of new evidence, even if it means updating or abandoning your current beliefs.

A Word From Verywell

Unfortunately, we all have confirmation bias. Even if you believe you are very open-minded and only observe the facts before coming to conclusions, some bias will likely shape your opinion. It's very difficult to combat this natural tendency.

That said, if we know about confirmation bias and accept that it exists, we can make an effort to recognize it by working to be curious about opposing views and listening to what others have to say and why. This can help us see issues and beliefs from another perspective, though we still need to stay conscious of the pull of our own confirmation bias.

American Psychological Association. Confirmation bias. APA Dictionary of Psychology.

Wason PC. On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology. 1960;12(3):129-140. doi:10.1080/17470216008416717

Satya-Murti S, Lockhart J. Recognizing and reducing cognitive bias in clinical and forensic neurology. Neurol Clin Pract. 2015;5(5):389-396. doi:10.1212/CPJ.0000000000000181

Allahverdyan AE, Galstyan A. Opinion dynamics with confirmation bias. PLoS One. 2014;9(7):e99557. doi:10.1371/journal.pone.0099557

Frost P, Casey B, Griffin K, Raymundo L, Farrell C, Carrigan R. The influence of confirmation bias on memory and source monitoring. J Gen Psychol. 2015;142(4):238-252. doi:10.1080/00221309.2015.1084987

Suzuki M, Yamamoto Y. Characterizing the influence of confirmation bias on web search behavior. Front Psychol. 2021;12:771948. doi:10.3389/fpsyg.2021.771948

Poletiek FH. Hypothesis-Testing Behavior. Psychology Press, 2013.


What Is Confirmation Bias? | Definition & Examples

Published on September 19, 2022 by Kassiani Nikolopoulou. Revised on March 10, 2023.

Confirmation bias is the tendency to seek out and prefer information that supports our preexisting beliefs. As a result, we tend to ignore any information that contradicts those beliefs.

Confirmation bias is often unintentional, but it can still lead to poor decision-making in psychology research and in legal or real-life contexts.


Confirmation bias is a type of cognitive bias, or an error in thinking. Processing all the facts available to us costs time and energy, so our brains tend to pick the information that agrees most with our preexisting opinions and knowledge. This leads to faster decision-making. Mental "shortcuts" like this are called heuristics.


When confronted with new information that confirms what we already believe, we are more likely to:

  • Accept it as true and accurate
  • Overlook any flaws or inconsistencies
  • Incorporate it into our belief system
  • Recall it later, using it to support our belief during a discussion

On the other hand, if the new information contradicts what we already believe, we respond differently. We are more likely to:

  • Become defensive about it
  • Focus on criticizing any flaw, while that same flaw would be ignored if the information confirmed our beliefs
  • Forget this information quickly, not recalling reading or hearing about it later on

There are three main ways that people display confirmation bias:

  • Selective search
  • Selective interpretation
  • Selective recall

Biased search for information

This type of bias occurs when you seek out only positive evidence, that is, evidence that supports your expectations or hypotheses, while systematically disregarding evidence that could prove them wrong.

For example, if you type "are dogs better than cats?" into a search engine, you will mostly get results in support of dogs. If you reverse the question and type "are cats better than dogs?", you will get results in support of cats.

This will happen with any two variables: the search engine "assumes" that you think variable A is better than variable B, and shows you the results that agree with your opinion first.

Biased interpretation of information

Confirmation bias is not limited to the type of information we search for. Even if two people are presented with the same information, it is possible that they will interpret it differently.

Suppose two readers encounter the same news article about climate change. The reader who doubts climate change may interpret the article as evidence that climate change is natural and has happened at other points in history. Any arguments raised in the article about the negative impact of fossil fuels will be dismissed.

On the other hand, the reader who is concerned about climate change will view the information as evidence that climate change is a threat and that something must be done about it. Appeals to cut down fossil fuel emissions will be viewed favorably.

Biased recall of information

Confirmation bias also affects what type of information we are able to recall.

A week after encountering such a story, the reader who is concerned about climate change is more likely to recall its arguments in a discussion with friends. By contrast, a climate change doubter likely won't be able to recall the points made in the article.

Confirmation bias has serious implications for our ability to seek objective facts. It can lead individuals to “cherry-pick” bits of information that reinforce any prejudices or stereotypes.

Consider a man who arrives at a busy emergency room complaining of pain. An overworked physician, believing this is just drug-seeking behavior, examines him hastily in the hall. The physician confirms that all of the man's vital signs are fine: consistent with what was expected.

The man is discharged. Because the physician was only looking for what was already expected, she missed the signs that the man was actually having a problem with his kidneys.

Confirmation bias can lead to poor decision-making in various contexts, including interpersonal relationships, medical diagnoses, or applications of the law.

Suppose, for example, that you are researching whether memory games can delay memory loss, and you expect that they can. Because of this, you unconsciously seek information to support your hypothesis during the data collection phase, rather than remaining open to results that could disprove it. At the end of your research, you conclude that memory games do indeed delay memory loss.

Although confirmation bias cannot be entirely eliminated, there are steps you can take to avoid it:

  • First and foremost, accept that you have biases that impact your decision-making. Even though we like to think that we are objective, it is our nature to use mental shortcuts. This allows us to make judgments quickly and efficiently, but it also makes us disregard information that contradicts our views.
  • Do your research thoroughly when searching for information. Actively consider all the evidence available, rather than just the evidence confirming your opinion or belief. Only use credible sources that can pass the CRAAP test.
  • Make sure you read entire articles, not just the headline, prior to drawing any conclusions. Analyze the article to see if there is reliable evidence to support the argument being made. When in doubt, do further research to check if the information presented is trustworthy.



Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We recollect information best when it amplifies what we already believe; relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.


Confirmation Bias In Psychology: Definition & Examples

Julia Simkus

Editor at Simply Psychology

BA (Hons) Psychology, Princeton University

Julia Simkus is a graduate of Princeton University with a Bachelor of Arts in Psychology. She is currently studying for a Master's Degree in Counseling for Mental Health and Wellness, beginning in September 2023. Julia's research has been published in peer-reviewed journals.


Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Confirmation Bias is the tendency to look for information that supports, rather than rejects, one’s preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association).

One of the early demonstrations of confirmation bias appeared in an experiment by Peter Wason (1960), in which subjects were asked to discover the experimenter's rule for sequencing numbers.

Its results showed that the subjects chose responses that supported their hypotheses while rejecting contradictory evidence, and even though their hypotheses were incorrect, they became confident in them quickly (Gray, 2010, p. 356).
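
To make the logic of the task concrete, here is a minimal sketch in Python, assuming the classic "2-4-6" version of the task: participants see the triple (2, 4, 6), and the experimenter's hidden rule is simply "any ascending sequence." A participant who tests only triples that fit a narrower hypothesis keeps receiving "yes" answers and never discovers the true rule:

```python
# Minimal sketch of Wason's (1960) rule discovery task
# (assumes the classic "2-4-6" setup, where the hidden rule
# is "any strictly ascending triple").

def hidden_rule(triple):
    """The experimenter's rule: the three numbers strictly increase."""
    a, b, c = triple
    return a < b < c

# A participant hypothesizes the narrower rule "numbers increase by 2"
# and tests only triples that should confirm it:
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
print([hidden_rule(t) for t in confirming_tests])  # [True, True, True]

# Every test "confirms" the hypothesis, yet the hypothesis is wrong.
# Only a triple the hypothesis predicts should fail can expose this:
print(hidden_rule((1, 7, 20)))  # True -> "increase by 2" was too narrow
```

The disconfirming test at the end is exactly what Wason's subjects rarely proposed: a case their own hypothesis predicts should fail.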

Though such evidence of confirmation bias has appeared in psychological literature throughout history, the term ‘confirmation bias’ was first used in a 1977 paper detailing an experimental study on the topic (Mynatt, Doherty, & Tweney, 1977).


Biased Search for Information

This type of confirmation bias explains people’s search for evidence in a one-sided way to support their hypotheses or theories.

Experiments have shown that people tend to ask questions designed to yield "yes" if their favored hypothesis is true, while ignoring alternative hypotheses that are likely to give the same result.

This is also known as the congruence heuristic (Baron, 2000, p.162-64). Though the preference for affirmative questions itself may not be biased, there are experiments that have shown that congruence bias does exist.

For Example:

If you were to search “Are cats better than dogs?” in Google, all you would get are sites listing the reasons why cats are better.

However, if you were to search “Are dogs better than cats?”, Google will only provide you with sites that believe dogs are better than cats.

This shows that phrasing questions in a one-sided (i.e., affirmative) manner will assist you in obtaining evidence consistent with your hypothesis.

Biased Interpretation

This type of bias explains that people interpret evidence concerning their existing beliefs by evaluating confirming evidence differently than evidence that challenges their preconceptions.

Various experiments have shown that people tend not to change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Additionally, people accept “confirming” evidence more easily and critically evaluate the “disconfirming” evidence (this is known as disconfirmation bias) (Taber & Lodge, 2006).

When provided with the same evidence, people’s interpretations could still be biased.

For example:

Biased interpretation was shown in an experiment conducted at Stanford University on the topic of capital punishment. It included participants who supported capital punishment and others who were against it.

All subjects were provided with the same two studies.

After reading the detailed descriptions of the studies, participants still held their initial beliefs and supported their reasoning by providing “confirming” evidence from the studies and rejecting any contradictory evidence or considering it inferior to the “confirming” evidence (Lord, Ross, & Lepper, 1979).

Biased Memory

To confirm their current beliefs, people may remember/recall information selectively. Psychological theories vary in defining memory bias.

Some theories state that information confirming prior beliefs is stored in memory while contradictory evidence is not (e.g., schema theory). Others claim that striking information is remembered best (e.g., the humor effect).

Memory confirmation bias also serves a role in stereotype maintenance. Experiments have shown that the mental association between expectancy-confirming information and the group label strongly affects recall and recognition memory.

Though a certain stereotype about a social group might not be true for an individual, people tend to remember the stereotype-consistent information better than any disconfirming evidence (Fyock & Stangor, 1994).

In one experimental study, participants were asked to read a woman's profile (detailing her extroverted and introverted skills) and assess her suitability for a job as either a librarian or a real-estate salesperson.

Those assessing her as a salesperson better recalled extroverted traits, while the other group recalled more examples of introversion (Snyder & Cantor, 1979).

These experiments, along with others, have offered insight into selective memory and provided evidence for biased memory, showing that people search for and better remember confirming evidence.


Social Media

The information we are presented with on social media reflects not only what users want to see but also the designers' beliefs and values. Today, people are exposed to an overwhelming number of news sources, each varying in credibility.

To form conclusions, people tend to read news that aligns with their perspectives. For instance, news channels present information (even the same news) differently from one another on complex issues (e.g., racism, political parties), with some using sensational headlines/pictures and one-sided information.

Due to this biased coverage, people rely on only certain channels or sites for their information, leading them to biased conclusions.

Religious Faith

People also tend to search for and interpret evidence with respect to their religious beliefs (if any).

For instance, on the topics of abortion and transgender rights, people whose religions are against such things will interpret this information differently than others and will look for evidence to validate what they believe.

Similarly, those who religiously reject the theory of evolution will either gather information disproving evolution or hold no official stance on the topic.

Also, irreligious people might perceive events that are considered “miracles” and “test of faiths” by religious people to be a reinforcement of their lack of faith in a religion.

When Does Confirmation Bias Occur?

There are several explanations why humans possess confirmation bias, including this tendency being an efficient way to process information, protect self-esteem, and minimize cognitive dissonance.

Information Processing

Confirmation bias serves as an efficient way to process information because of the limitless information humans are exposed to.

To form an unbiased decision, one would have to critically evaluate every piece of information available, which is unfeasible. Therefore, people tend to look only for the information that supports their desired conclusions (Casad, 2019).

Protect Self-esteem

People are susceptible to confirmation bias to protect their self-esteem (to know that their beliefs are accurate).

To make themselves feel confident, they tend to look for information that supports their existing beliefs (Casad, 2019).

Minimize Cognitive Dissonance

Cognitive dissonance also explains why confirmation bias is adaptive.

Cognitive dissonance is a mental conflict that occurs when a person holds two contradictory beliefs and causes psychological stress/unease in a person.

To minimize this dissonance, people adapt to confirmation bias by avoiding information that is contradictory to their views and seeking evidence confirming their beliefs.

Challenge avoidance and reinforcement seeking affect people's thoughts and reactions differently, since exposure to disconfirming information produces negative emotions that are absent when seeking reinforcing evidence ("The Confirmation Bias: Why People See What They Want to See").

Implications

Confirmation bias consistently shapes the way we look for and interpret information, influencing decisions at every level, from the home to global platforms. This bias prevents people from gathering information objectively.

During an election campaign, people tend to look for information confirming their perspectives on different candidates while ignoring any information contradictory to their views.

This subjective manner of obtaining information can lead to overconfidence in a candidate and to misinterpreting or overlooking important information, thus influencing their voting decision and, eventually, the country's leadership (Cherry, 2020).

Recruitment and Selection

Confirmation bias also affects employment diversity because preconceived ideas about different social groups can introduce discrimination (though it might be unconscious) and impact the recruitment process (Agarwal, 2018).

The existing belief that one group is more competent than another is the reason why particular races and genders are overrepresented in companies today. This bias can hamper a company's attempts at diversifying its employees.

Mitigating Confirmation Bias

Change in Intrapersonal Thought

To avoid being susceptible to confirmation bias, start questioning your research methods and the sources you use to obtain information.

Expanding the types of sources you use when searching for information can reveal different aspects of a particular topic and help you weigh the credibility of each source.

  • Read entire articles rather than forming conclusions based on the headlines and pictures.
  • Search for credible evidence presented in the article.
  • Analyze whether the statements being asserted are backed up by trustworthy evidence (tracking the source of evidence can establish its credibility).
  • Encourage yourself and others to gather information in a conscious manner.

Alternative Hypotheses

Confirmation bias occurs when people tend to look for information that confirms their beliefs/hypotheses, but this bias can be reduced by taking into account alternative hypotheses and their consequences.

Considering the possibility of beliefs/hypotheses other than one’s own could help you gather information in a more dynamic manner (rather than a one-sided way).

Related Cognitive Biases

Many cognitive biases can be characterized as subtypes of confirmation bias. Two of these subtypes are described below:

Backfire Effect

The backfire effect occurs when people’s preexisting beliefs strengthen when challenged by contradictory evidence (Silverman, 2011).

Therefore, attempting to disprove a misconception can actually strengthen a person's belief in that misconception.

One piece of disconfirming evidence does not change people’s views, but a constant flow of credible refutations could correct misinformation/misconceptions.

This effect is considered a subtype of confirmation bias because it explains people’s reactions to new information based on their preexisting hypotheses.

A study by Brendan Nyhan and Jason Reifler (two researchers on political misinformation) explored the effects of different types of statements on people’s beliefs.

While examining two statements, "I am not a Muslim, Obama says." and "I am a Christian, Obama says.", they concluded that the latter was more persuasive and resulted in people changing their beliefs, affirming that affirmative statements are more effective at correcting incorrect views (Silverman, 2011).

Halo Effect

The halo effect occurs when people use impressions from a single trait to form conclusions about other unrelated attributes. It is heavily influenced by the first impression.

Research on this effect was pioneered by American psychologist Edward Thorndike who, in 1920, described ways officers rated their soldiers on different traits based on first impressions (Neugaard, 2019).

Experiments have shown that when positive attributes are presented first, a person is judged more favorably than when negative traits are shown first. This is a subtype of confirmation bias because it allows us to structure our thinking about other information using only initial evidence.

Learning Check

When does confirmation bias occur?

  • When an individual only researches information that is consistent with personal beliefs.
  • When an individual only makes a decision after all perspectives have been evaluated.
  • When an individual becomes more confident in one’s judgments after researching alternative perspectives.
  • When an individual believes that the odds of an event occurring increase if the event hasn’t occurred recently.

The correct answer is A. Confirmation bias occurs when an individual only researches information consistent with personal beliefs. This bias leads people to favor information that confirms their preconceptions or hypotheses, regardless of whether the information is true.

Take-home Messages

  • Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses.
  • Confirmation bias happens when a person gives more weight to evidence that confirms their beliefs and undervalues evidence that could disprove it.
  • People display this bias when they gather or recall information selectively or when they interpret it in a biased way.
  • The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.

Agarwal, P. (2018, October 19). Here Is How Bias Can Affect Recruitment In Your Organisation. https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviewsaffect-recruitment-in-your-organisation

American Psychological Association. (n.d.). APA Dictionary of Psychology. https://dictionary.apa.org/confirmation-bias

Baron, J. (2000). Thinking and Deciding (3rd ed.). Cambridge University Press.

Casad, B. (2019, October 9). Confirmation bias. https://www.britannica.com/science/confirmation-bias

Cherry, K. (2020, February 19). Why Do We Favor Information That Confirms Our Existing Beliefs? https://www.verywellmind.com/what-is-a-confirmation-bias-2795024

Fyock, J., & Stangor, C. (1994). The role of memory biases in stereotype maintenance. The British Journal of Social Psychology, 33(3), 331–343.

Gray, P. O. (2010). Psychology. New York: Worth Publishers.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29(1), 85–95.

Neugaard, B. (2019, October 9). Halo effect. https://www.britannica.com/science/halo-effect

Silverman, C. (2011, June 17). The Backfire Effect. https://archives.cjr.org/behind_the_news/the_backfire_effect.php

Snyder, M., & Cantor, N. (1979). Testing hypotheses about other people: The use of historical knowledge. Journal of Experimental Social Psychology, 15(4), 330–342.

Further Information

  • What Is Confirmation Bias and When Do People Actually Have It?
  • Confirmation Bias: A Ubiquitous Phenomenon in Many Guises
  • The importance of making assumptions: why confirmation is not necessarily a bias
  • Decision Making Is Caused By Information Processing And Emotion: A Synthesis Of Two Approaches To Explain The Phenomenon Of Confirmation Bias

Confirmation bias occurs when individuals selectively collect, interpret, or remember information that confirms their existing beliefs or ideas, while ignoring or discounting evidence that contradicts these beliefs.

This bias can happen unconsciously and can influence decision-making and reasoning in various contexts, such as research, politics, or everyday decision-making.

What is confirmation bias in psychology?

Confirmation bias in psychology is the tendency to favor information that confirms existing beliefs or values. People exhibiting this bias are likely to seek out, interpret, remember, and give more weight to evidence that supports their views, while ignoring, dismissing, or undervaluing the relevance of evidence that contradicts them.

This can lead to faulty decision-making because one-sided information doesn’t provide a full picture.

Confirmation Bias


Confirmation bias refers to the tendency of individuals to interpret or favor information in a way that confirms their existing beliefs or hypotheses while ignoring or downplaying contradictory evidence. It is a cognitive bias that affects our thinking, decision-making processes, and overall mental health functioning.

What is Confirmation Bias?  

Confirmation bias occurs when people[1] only pay attention to information that supports what they already believe and ignore or downplay anything that goes against it. This can affect how we think and discuss things, as in debates or arguments. People tend to focus on evidence that backs up their side and ignore the other side's arguments. This makes it harder to have productive discussions and consider different perspectives or solutions.

According to research, about 25% of students[2] showed confirmation bias when they looked for new information after making an initial screening. Certain factors[3] can make confirmation bias stronger.

If someone strongly believes in something, has an emotional attachment to an idea, or wants to be accepted by a certain group, they’re more likely to be influenced by confirmation bias. Also, people who aren’t aware that our minds can have biases or who don’t have much exposure to different viewpoints are more likely to be affected by confirmation bias.


Types of Confirmation Bias  

Confirmation bias can manifest[4] in various forms, such as:

  • Biased research, which occurs when individuals selectively choose sources or interpret data in a manner that supports their preconceived notions.
  • Biased recall, in which individuals tend to remember information that aligns with their beliefs while conveniently forgetting or distorting information that contradicts them.
  • Biased interpretation, in which people tend to interpret ambiguous or neutral information in a way that confirms their preexisting beliefs.

Confirmation Bias in Mental Health

The psychological effects of confirmation bias[5] include:

  • Individuals pay more attention to information that confirms their negative thoughts, leading to distorted thinking patterns or irrational beliefs.
  • People may become resistant to considering alternative viewpoints when they are exposed to information that aligns with their existing beliefs.
  • Individuals may seek out information or interpret experiences in a way that confirms their negative self-perceptions.
  • Confirmation bias can hinder decision-making processes, as individuals may favor options that align with their existing beliefs or ideas.
  • Individuals may overlook alternative solutions or fail to consider different strategies, leading to impaired problem-solving skills.
  • When individuals selectively interpret social cues that confirm their negative beliefs, it can result in relationship difficulties.

Why Do People Get Affected by Confirmation Bias?

People are prone to confirmation bias due to various cognitive and psychological factors[6], such as:

  • The human brain tends to prefer information that is familiar, coherent, and requires less mental effort to process.
  • Individuals often have a natural inclination to preserve and defend their existing beliefs.
  • Confirmation bias can provide emotional comfort by reinforcing one’s worldview and validating their identity.
  • Individuals tend to seek approval from others and want to fit in with a larger group.
  • By sticking to commonly accepted ideas, individuals reduce the risk of being isolated or feeling insecure in society.

How Does Confirmation Bias Affect Decision-Making?

When people pay attention only to information that supports their existing beliefs and ignore evidence that goes against them, they can make errors in judgment and choose options that are not well founded.

It also makes it hard for them to consider different perspectives or use evidence-based solutions. Confirmation bias can prevent objective and rational decision-making[7], which can hold back personal growth, hinder problem-solving skills, and limit the chances of achieving the best outcomes.


How to Avoid Confirmation Bias  

To avoid the psychological effects of confirmation bias, there are some helpful strategies[8], including:

  • Recognize that your beliefs might not always be completely objective, and consider that other perspectives could be valid too.
  • Try to expose yourself to a variety of opinions and information sources, even if they go against your existing beliefs.
  • Take time to reflect on your own biases and think about how they might be affecting the way you think and make decisions.
  • Explore novelty by stepping out of your comfort zone to challenge your own assumptions.
  • Increase your knowledge and understanding of different concepts. The more you learn, the more prepared you’ll be to consider diverse perspectives.

Takeaway  

Confirmation bias involves seeking out and favoring information that confirms our existing beliefs while dismissing contradictory evidence. Being aware of this bias, actively seeking diverse perspectives, and cultivating open-mindedness can help mitigate its impact and promote more objective decision-making, supporting good mental health in the long run.

At A Glance  

  • Confirmation bias happens when people only pay attention to information that supports what they already believe.
  • Confirmation bias can manifest in various forms, such as biased research, biased recall, and biased interpretation.
  • Confirmation bias can hinder decision-making processes, problem-solving skills, and interpersonal relationships.
  • People are prone to confirmation bias due to their natural inclination and emotional comfort.
  • Confirmation bias prevents objective and rational decision-making, which can hold back personal growth.
  • Being aware of this bias, seeking diverse perspectives, and cultivating open-mindedness can help overcome the effects of confirmation bias on mental health.

Frequently Asked Questions (FAQs)

1. Is confirmation bias intentional?

Confirmation bias is often unintentional, arising from unconscious cognitive processes rather than any deliberate intention.

2. Are there any benefits of confirmation bias?

While confirmation bias may provide a sense of validation and comfort by reinforcing existing beliefs, its benefits are outweighed by the potential negative impacts on critical thinking, objective decision-making, and personal growth.

3. Can awareness of confirmation bias help to prevent it?

Awareness of confirmation bias can help individuals actively recognize and mitigate its influence, enabling them to approach information more objectively and make more rational judgments.

References:

  • [1] Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175
  • [2] Walenchok, S. C., Goldinger, S. D., & Hout, M. C. (2020). The confirmation and prevalence biases in visual search reflect separate underlying processes. Journal of Experimental Psychology: Human Perception and Performance, 46(3), 274–291. https://doi.org/10.1037/xhp0000714
  • [3] Cook, M. B., & Smallman, H. S. (2008). Human factors of the confirmation bias in intelligence analysis: Decision support from graphical evidence landscapes. Human Factors, 50(5), 745–754. https://doi.org/10.1518/001872008X354183
  • [4] Modgil, S., Singh, R. K., Gupta, S., & Dennehy, D. (2021). A confirmation bias view on social media induced polarisation during Covid-19. Information Systems Frontiers. Advance online publication. https://doi.org/10.1007/s10796-021-10222-9
  • [5] Friedman, H. H. (2017). Cognitive biases that interfere with critical thinking and scientific reasoning: A course module. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2958800
  • [6] Cook, M. B., & Smallman, H. S. (2008). Human factors of the confirmation bias in intelligence analysis: Decision support from graphical evidence landscapes. Human Factors, 50(5), 745–754. https://doi.org/10.1518/001872008X354183
  • [7] Lange, R. D., Chattoraj, A., Beck, J. M., Yates, J. L., & Haefner, R. M. (2021). A confirmation bias in perceptual decision-making due to hierarchical approximate inference. PLoS Computational Biology, 17(11), e1009517. https://doi.org/10.1371/journal.pcbi.1009517
  • [8] Lomangino, K. M. (2016). Countering cognitive bias: Tips for recognizing the impact of potential bias on research. Journal of the Academy of Nutrition and Dietetics, 116(2), 204–207. https://doi.org/10.1016/j.jand.2015.07.014



What Is Confirmation Bias In Psychology, And How Can You Overcome It?

The American Psychological Association defines confirmation bias as “the tendency to gather evidence that confirms preexisting expectations, typically by emphasizing or pursuing supporting evidence while dismissing or failing to seek contradictory evidence.” Confirmation bias might be thought of as an inclination to believe information supporting one’s existing beliefs while discounting opposing beliefs. 

Researchers believe that everyone experiences some degree of confirmation bias, whether they are conscious of it or not. Confirmation bias is a cognitive bias that supports a person’s personal beliefs or feelings. It is the mind's way of ignoring everything that does not support our ideas or views.

In social psychology, confirmation bias, also sometimes called myside bias, is an unconscious tendency not to judge new information objectively. In other words, confirmation bias is a psychological phenomenon in which people tend to seek information and evidence supporting their existing beliefs and discount anything that contradicts those beliefs.

The psychological advantage of confirmation bias is that it allows individuals to process information efficiently and to minimize the cognitive dissonance that occurs when encountering conflicting viewpoints or information. In other words, confirmation bias can make it simpler to think about the world. However, unexamined biases can prevent you from seeing and acting on important information or connecting with people who think differently than you, which can limit your opportunities.

Examples of confirmation bias in psychology

There are many ways in which people display confirmation bias. For example, two friends might hold different views about the best solution for climate change. One supports solar power and reads articles affirming her belief in the need for more investment in solar power. The other believes more in the importance of wind power and gravitates toward online articles that support his position. While they both read news stories about climate change, they interpret the news through the lens of their confirmation bias. This can make it difficult to see the strengths of opposing arguments.

Another example might be found in politics. During election time, many people notice positive things about their chosen candidate and negative things about the candidates they do not support. Confirmation bias can make it difficult for them to see any incoherence in their perspective. 

Myside bias and intelligence

Even if a person is generally highly skilled at problem-solving and reasoning, they may be just as likely as everyone else to fall into myside bias and miss crucial realities about the world around them.

Heuristics: Why the mind is susceptible to confirmation bias

If you regularly wonder why others cannot see what seems obvious to you, it may help to examine your own biases. Although we all want to be correct about our beliefs, it is impossible to be right all the time. If you consistently believe you are right and everyone else simply doesn’t get it, you may be experiencing myside bias.

Confirmation bias can reduce our ability to consider alternative hypotheses. Rather than weighing every possibility, the mind automatically takes mental shortcuts, called heuristics, to make its job faster and easier. For example, it often blocks opposing views and only lets you see what you want to see, sparing you the time and energy of making sense of contradictory ideas. If you constantly follow this shortcut, however, you may miss important realities in the world around you.

How confirmation bias can cloud our judgment

It can be difficult to stop falling prey to confirmation bias unless you can see that it is there. If you think you may be experiencing confirmation bias, you’re not alone. Research demonstrates the widespread prevalence of confirmation bias and shows that it has nothing to do with intelligence. It’s one of several cognitive biases that can shape our thinking even when we don’t know it.

When we let confirmation bias have its way, we can go through life with a skewed version of the truth about our experiences. Research suggests that it can be challenging to escape confirmation bias because it is so natural for the brain to take this shortcut when looking at information.

Studies continue to evaluate the impact of cognitive bias on scientific discovery and decision-making, and researchers are studying how to reduce the influence of confirmation bias in different settings. Continued research in social psychology might help us determine the impact of confirmation bias and the challenges it presents to us as individuals and society as a whole.

Help for overcoming confirmation bias

If you think that you may be discounting information that goes against your beliefs, it may help to speak with a psychologist or licensed therapist about confirmation bias. One form of therapy that may help is cognitive behavioral therapy (CBT). In CBT, people typically learn to challenge their inaccurate beliefs and think in new ways. 

If you feel hesitant to see a psychologist or therapist in person, you might consider online therapy. CBT is a type of therapy you can engage in remotely, and numerous studies suggest that online CBT can be as effective as seeing a counselor in person. 

With online CBT at BetterHelp , you can connect with a licensed therapist via audio, video, or live chat at a time that works for you. You can also message your therapist at any time through in-app messaging, and they’ll respond as soon as they can. This may be helpful if you encounter instances of bias and want to write down your thoughts in between sessions.

Also, it may help to speak with a psychologist or therapist who has knowledge of various cognitive biases and strategies for reducing the effects of confirmation bias. Take the first step toward curbing the effects of confirmation bias in your life and reach out to BetterHelp.

What is an example of confirmation bias?

Evidence and examples of confirmation bias can be seen stretching back for thousands of years, with the Greek historian Thucydides commenting on the phenomenon by saying, “...it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy.”

In modern times, one example of confirmation bias you may commonly see relates to horoscopes. These astrologically based predictions use an individual’s zodiac sign to give vague descriptions of their future. Because these descriptions are vague, a person may interpret the information provided by horoscopes in a way that supports their current perspective. 

What is confirmation bias and is it bad?

Confirmation bias is a term in psychology that refers to the tendency for individuals to only seek out information that supports a specific idea. In many cases, this leads people to accrue evidence that supports a position they currently hold without considering facts or figures that oppose that idea. 

Confirmation bias can be negative in many cases because it can lead to poorly informed decisions. In some cases, these may cause challenges in interpersonal relationships. For example, if a person is certain their partner is cheating on them, they may ignore all evidence that goes against this belief. As a result, they may accuse them of infidelity or break up with them, even if that person was never having an affair in the first place. 

Why does confirmation bias happen?

There are multiple reasons that confirmation bias may occur, including the following.

  • Desire To Be Correct: Many people fall prey to confirmation bias in order to affirm their ability to make proper decisions or to show their intelligence. Because we tend to value those who are correct, the quest for this trait may lead some to search only for information that confirms their previously held ideas. 
  • Processing Efficiency: Our brains are constantly taking in information from a variety of sources, and it may be difficult to discern what is relevant and what isn’t. To cope with this volume, some may subconsciously fall back on confirmation bias simply to reach a conclusion more quickly. 
  • Fear Of Losing Respect: Being wrong may make someone look foolish, or it could lead others to challenge their expertise in a specific field. As a result, individuals may choose to ignore viable information and focus on the points that confirm their existing beliefs.

What is the difference between confirmation bias and cognitive dissonance?

While confirmation bias and cognitive dissonance can be related phenomena, they have their own distinct definitions and differences. Confirmation bias refers to the tendency for people to place importance on, or seek out, information that supports their deeply entrenched beliefs, while cognitive dissonance is the discomfort one may feel when there is an inconsistency between one’s beliefs and one’s behavior. While a person experiencing cognitive dissonance may use confirmation bias as a way to reduce this discomfort, the two are not intrinsically linked, as a person may employ other strategies to combat these uncomfortable feelings. 

What are the 3 types of confirmation bias?

The three types of confirmation bias are selective recall, selective search, and selective interpretation.

  • Selective Recall: Through the process of selective recall, information that we identify as matching our current position on a subject is more easily recalled than information that counters our previously held notions. 
  • Selective Search: This form of confirmation bias involves looking only for evidence that works in favor of our argument. Selective search allows us to support our theories while disregarding or refusing to interpret evidence that points to the contrary. 
  • Selective Interpretation: Selective interpretation involves looking at the facts of a specific situation and choosing to interpret the outcomes to favor one’s argument. 

What's the opposite of confirmation bias?

The opposite of confirmation bias is falsification bias. This type of bias involves looking for information or evidence that works against your previously held view or position. While adopting this approach in every situation may be difficult, having a falsification mindset can help you avoid the pitfalls associated with confirmation bias by making you more skeptical and analytical. For example, if you are looking to prove a theory and everything you find supports your view, it may be beneficial to start looking for some level of evidence that disproves your theory. 

What is another word for confirmation bias?

Another name for confirmation bias is “myside bias,” so named because it implies that someone will conduct a biased search for information that supports their side instead of considering all evidence. This search can subsequently lead to a biased interpretation of the facts in order to reach the conclusion a person wants or needs to be true. In some cases, a person experiencing myside bias may also be prone to ignoring evidence, particularly if it does not fit their narrative.

One example of myside or confirmation bias can be seen in the paper Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence. In this study, 48 undergraduate students who either supported or opposed capital punishment read two purported studies, one that confirmed their preexisting beliefs and one that opposed them. The results confirmed what the researchers predicted: a majority of the students rated the study that confirmed their beliefs as more convincing. 

Can confirmation bias be good?

While confirmation biases can often impede human understanding, they may be effective in situations that involve overwhelming amounts of information. Because the information surrounding a specific subject may involve hundreds or thousands of articles, videos, and files, it may be hard to reach any conclusion when sorting through it all. As a result, it may be helpful to filter information once someone has a somewhat clear idea of what their conclusion might be. However, even this form of confirmation bias can be harmful, and in most cases, it is better to analyze the evidence closely from an unbiased perspective (even if it may disprove your point). 

How do you break confirmation bias?

One way to avoid the potentially negative implications confirmation bias can have is to cultivate elements of a falsification mindset. This involves looking for evidence that disproves your current position in order to ensure you are considering all sides of the subject. By doing so, you may be able to ensure that the conclusions you reach have a higher level of validity and will be less likely to be disproven. In some cases, you may find flaws in your initial evidence or position that may make you change your mind.

What is an example of the Dunning-Kruger effect?

An example of the Dunning-Kruger effect, the psychological bias that may lead individuals to overestimate their competence, can be seen in academic settings. Students may believe they have a strong grasp of a certain subject, even if they haven’t been attending class or studying frequently. As a result, they may not realize their actual skill level until they receive a failing grade; even then, a person under the influence of the Dunning-Kruger effect may believe the fault lies with the instructor or the course rather than with themselves. 



Social Sci LibreTexts

9.8: Confirmation Bias


  • Mehgan Andrade and Neil Walker
  • College of the Canyons


Confirmation bias is a person’s tendency to seek, interpret, and use evidence in a way that conforms to their existing beliefs. This can lead a person to make certain mistakes: poor judgments that limit their ability to learn, changes in beliefs made to justify past actions, and hostile behavior toward people who disagree with them. Confirmation bias can also lead a person to perpetuate stereotypes or cause a doctor to inaccurately diagnose a condition.

What is noteworthy about confirmation bias is that it supports the argumentative theory of reasoning. Although confirmation bias is almost universally deplored as a regrettable failing of reason in others, the argumentative theory explains this bias as adaptive behavior: it aids in forming persuasive arguments by preventing us from being distracted by useless evidence and unhelpful stories.

Interestingly, Charles Darwin made a practice of recording evidence against his theory in a special notebook, because he found that this contradictory evidence was particularly difficult to remember.

What Is the Function of Confirmation Bias?

  • Original Research
  • Open access
  • Published: 20 April 2020
  • Volume 87, pages 1351–1376 (2022)


  • Uwe Peters


Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive. I critically discuss three recent proposals of this kind before developing a novel alternative, what I call the ‘reality-matching account’. According to the account, confirmation bias evolved because it helps us influence people and social structures so that they come to match our beliefs about them. This can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily. While that might not be the only evolved function of confirmation bias, it is an important one that has so far been neglected in the theorizing on the bias.


In recent years, confirmation bias (or ‘myside bias’), that is, people’s tendency to search for information that supports their beliefs and ignore or distort data contradicting them (Nickerson 1998; Myers and DeWall 2015: 357), has frequently been discussed in the media, the sciences, and philosophy. The bias has, for example, been mentioned in debates on the spread of “fake news” (Stibel 2018), on the “replication crisis” in the sciences (Ball 2017; Lilienfeld 2017), the impact of cognitive diversity in philosophy (Peters 2019a; Peters et al. forthcoming; Draper and Nichols 2013; De Cruz and De Smedt 2016), the role of values in inquiry (Steel 2018; Peters 2018), and the evolution of human reasoning (Norman 2016; Mercier and Sperber 2017; Sterelny 2018; Dutilh Novaes 2018).

Confirmation bias is typically viewed as an epistemically pernicious tendency. For instance, Mercier and Sperber (2017: 215) maintain that the bias impedes the formation of well-founded beliefs, reduces people’s ability to correct their mistaken views, and makes them, when they reason on their own, “become overconfident” (Mercier 2016: 110). In the same vein, Steel (2018) holds that the bias involves an “epistemic distortion [that] consists of unjustifiably favoring supporting evidence for [one’s] belief, which can result in the belief becoming unreasonably confident or extreme” (897). Similarly, Peters (2018) writes that confirmation bias “leads to partial, and therewith for the individual less reliable, information processing” (15).

The bias is not only taken to be epistemically problematic, but also thought to be a “ubiquitous” (Nickerson 1998: 208), “built-in feature of the mind” (Haidt 2012: 105), found in both everyday and abstract reasoning tasks (Evans 1996), independently of subjects’ intelligence, cognitive ability, or motivation to avoid it (Stanovich et al. 2013; Lord et al. 1984). Given its seemingly dysfunctional character, the apparent pervasiveness of confirmation bias raises a puzzle: If the bias is indeed epistemically problematic, why is it still with us today? By definition, dysfunctional traits should be more prone to extinction than functional ones (Nickerson 1998). Might confirmation bias be or have been adaptive?

Some philosophers are optimistic, arguing that the bias has in fact significant advantages for the individual, groups, or both (Mercier and Sperber 2017; Norman 2016; Smart 2018; Peters 2018). Others are pessimistic. For instance, Dutilh Novaes (2018) maintains that confirmation bias makes subjects less able to anticipate other people’s viewpoints, and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520).

In the following, I discuss three recent proposals of the adaptationist kind, mention reservations about them, and develop a novel account of the evolution of confirmation bias that challenges a key assumption underlying current research on the bias, namely that the bias thwarts reliable belief formation and truth tracking. The account holds that while searching for information supporting one’s pre-existing beliefs and ignoring contradictory data is disadvantageous when what one takes to be reality is and stays different from what one believes it to be, it is beneficial when, as the result of one’s processing information in that way, that reality is changed so that it matches one’s beliefs. I call this process reality matching and contend that it frequently occurs when the beliefs at issue are about people and social structures (i.e., relationships between individuals, groups, and socio-political institutions). In these situations, confirmation bias is highly effective for us to be confident about our beliefs even when there is insufficient evidence or subjective motivation available to us to support them. This helps us influence and ‘mould’ people and social structures so that they fit our beliefs, which is an adaptive property of confirmation bias. It can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily.

I shall not argue that the adaptive function of confirmation bias that this reality-matching account highlights is the only evolved function of the bias. Rather, I propose that it is one important function that has so far been neglected in the theorizing on the bias.

In Sects. 1 and 2, I distinguish confirmation bias from related cognitions before briefly introducing some recent empirical evidence supporting the existence of the bias. In Sect. 3, I motivate the search for an evolutionary explanation of confirmation bias and critically discuss three recent proposals. In Sects. 4 and 5, I then develop and support the reality-matching account as an alternative.

1 Confirmation Bias and Friends

The term ‘confirmation bias’ has been used to refer to various distinct ways in which beliefs and expectations can influence the selection, retention, and evaluation of evidence (Klayman 1995; Nickerson 1998). Hahn and Harris (2014) offer a list of them including four types of cognitions: (1) hypothesis-determined information seeking and interpretation, (2) failures to pursue a falsificationist strategy in contexts of conditional reasoning, (3) a resistance to change a belief or opinion once formed, and (4) overconfidence or an illusion of validity of one’s own view.

Hahn and Harris note that while all of these cognitions have been labeled ‘confirmation bias’, (1)–(4) are also sometimes viewed as components of ‘motivated reasoning’ (or ‘wishful thinking’) (ibid: 45), i.e., information processing that leads people to arrive at the conclusions they favor (Kunda 1990). In fact, as Nickerson (1998: 176) notes, confirmation bias comes in two different flavors: “motivated” and “unmotivated” confirmation bias. And the operation of the former can be understood as motivated reasoning itself, because it too involves partial information processing to buttress a view that one wants to be true (ibid). Unmotivated confirmation bias, however, operates when people process data in one-sided, partial ways that support their predetermined views no matter whether they favor them. So confirmation bias is also importantly different from motivated reasoning, as it can take effect in the absence of a preferred view and might lead one to support even beliefs that one wants to be false (e.g., when one believes the catastrophic effects of climate change are unavoidable; Steel 2018).

Despite overlapping with motivated reasoning, confirmation bias can thus plausibly be (and typically is) construed as a distinctive cognition. It is thought to be a subject’s largely automatic and unconscious tendency to (i) seek support for her pre-existing, favored or not favored beliefs and (ii) ignore or distort information compromising them (Klayman 1995: 406; Nickerson 1998: 175; Myers and DeWall 2015: 357; Palminteri et al. 2017: 14). I here endorse this standard, functional concept of confirmation bias.

2 Is Confirmation Bias Real?

Many psychologists hold that the bias is a “pervasive” (Nickerson 1998: 175; Palminteri et al. 2017: 14), “ineradicable” feature of human reasoning (Haidt 2012: 105). Such strong claims are problematic, however. For there is evidence that, for instance, disrupting the fluency in information processing (Hernandez and Preston 2013) or priming subjects for distrust (Mayo et al. 2014) reduces the bias. Moreover, some researchers have recently re-examined the relevant studies and found that confirmation bias is in fact less common and the evidence of it less robust than often assumed (Mercier 2016; Whittlestone 2017). These researchers grant, however, the weaker claim that the bias is real and often, in some domains more than in others, operative in human cognition (Mercier 2016: 100, 108; Whittlestone 2017: 199, 207). I shall only rely on this modest view here. To motivate it a bit more, consider the following two studies.

Hall et al. (2012) gave their participants (N = 160) a questionnaire, asking them about their opinion on moral principles such as ‘Even if an action might harm the innocent, it can still be morally permissible to perform it’. After the subjects had indicated their view using a scale ranging from ‘completely disagree’ to ‘completely agree’, the experimenter performed a sleight of hand, inverting the meaning of some of the statements so that the question then read, for instance, ‘If an action might harm the innocent, then it is not morally permissible to perform it’. The answer scales, however, were not altered. So if a subject had agreed with the first claim, she then agreed with the opposite one. Surprisingly, 69% of the study participants failed to detect at least one of the changes. Moreover, they subsequently tended to justify positions they thought they held despite just having chosen the opposite. Presumably, subjects accepted that they favored a particular position, didn’t know the reasons, and so were now looking for support that would justify their position. They displayed a confirmation bias.

Using a similar experimental set-up, Trouche et al. (2016) found that subjects also tend to exhibit a selective ‘laziness’ in their critical thinking: they are more likely to avoid raising objections to their own positions than to other people’s. Trouche et al. first asked their test participants to produce arguments in response to a set of simple reasoning problems. Directly afterwards, they had them assess other subjects’ arguments concerning the same problems. About half of the participants didn’t notice that by the experimenter’s intervention, in some trials, they were in fact presented with their own arguments again; the arguments appeared to these participants as if they were someone else’s. Furthermore, more than half of the subjects who believed they were assessing someone else’s arguments now rejected those that were in fact their own, and were more likely to do so for invalid than for valid ones. This suggests that subjects are less critical of their own arguments than of other people’s, indicating that confirmation bias is real and perhaps often operative when we are considering our own claims and arguments.

3 Evolutionary Accounts of the Bias

Confirmation bias is typically taken to be epistemically problematic, as it leads to partial and therewith for the individual less reliable information processing and contributes to failures in, for instance, perspective-taking with clear costs for social and other types of cognition (Mercier and Sperber 2017: 215; Steel 2018; Peters 2018; Dutilh Novaes 2018). Prima facie, the bias thus seems maladaptive.

But then why does it still exist? Granted, even if the bias isn’t an adaptation, we might still be able to explain why it is with us today. We might, for instance, argue that it is a “spandrel”, a by-product of the evolution of another trait that is an adaptation (Gould and Lewontin 1979). Or we may abandon the evolutionary approach to the bias altogether and hold that it emerged by chance.

However, evolutionary explanations of psychological traits are often fruitful. They can create new perspectives on these traits that may allow developing means to reduce the traits’ potential negative effects (Roberts et al. 2012; Johnson et al. 2013). Evolutionary explanations might also stimulate novel, testable predictions that researchers who aren’t evolutionarily minded would overlook (Ketelaar and Ellis 2000; De Bruine 2009). Moreover, they typically involve integrating diverse data from different disciplines (e.g., psychology, biology, anthropology, etc.), and thereby contribute to the development of a more complete understanding of the traits at play and human cognition in general (Tooby and Cosmides 2015). These points equally apply when it comes to considering the origin of confirmation bias. They provide good reasons for searching for an evolutionary account of the bias.

Different proposals can be discerned in the literature. I will discuss three recent ones, what I shall call (1) the argumentative-function account, (2) the group-cognition account, and (3) the intention-alignment account. I won’t offer conclusive arguments against them here. The aim is just to introduce some reservations about these proposals to motivate the exploration of an alternative.

3.1 The Argumentative-Function Account

Mercier and Sperber (2011, 2017) hold that human reasoning didn’t evolve for truth tracking but for making us better at convincing other people and evaluating their arguments so as to be convinced only when their points are compelling. In this context, when persuasion is paramount, the tendency to look for material supporting our preconceptions and to discount contradictory data allows us to accumulate argumentative ammunition, which strengthens our argumentative skill, Mercier and Sperber maintain. They suggest that confirmation bias thus evolved to “serve the goal of convincing others” (2011: 63).

Mercier and Sperber acknowledge that the bias also hinders us in anticipating objections, which should make it more difficult for us to develop strong, objection-resistant arguments (2017: 225f). But they add that it is much less cognitively demanding to react to objections than to anticipate them, because objections might depend on particular features of one’s opponents’ preferences or on information that only they have access to. It is thus more efficient to be ‘lazy’ in anticipating criticism and let the audience make the moves, Mercier and Sperber claim.

There is reason to be sceptical about their proposal, however. For instance, an anticipated objection is likely to be answered more convincingly than an immediate response from one’s audience. After all, “forewarned is forearmed”; it gives a tactical advantage (e.g., more time to develop a reply) (Sterelny 2018: 4). And even if it is granted that objections depend on private information, they also often derive from obvious interests and public knowledge, making an anticipation of them easy (ibid). Moreover, as Dutilh Novaes (2018: 519) notes, there is a risk of “looking daft” when producing poor arguments, say, due to laziness in scrutinizing one’s thoughts. Since individuals within their social groups depend on their reputation so as to find collaborators, anticipating one’s audience’s responses should be and have been more adaptive than having a confirmation bias (ibid). If human reasoning emerged for argumentative purposes, the existence of the bias remains puzzling.

3.2 The Group-Cognition Account

Even if confirmation bias is maladaptive for individuals, it might still be adaptive for groups. For instance, Smart (2018) and Peters (2018) hold that in groups with a sufficient degree of cognitive diversity at the outset of solving a particular problem, each individual’s confirmation bias might help the group as a whole conduct a more in-depth analysis of the problem space than otherwise. When each subject is biased towards a different particular proposal on how to solve the problem, the bias will push them to invest greater effort in defending their favored proposals and might, in the light of counterevidence, motivate them to consider rejecting auxiliary assumptions rather than the proposals themselves. This contributes to a thorough exploration of them that is less likely with less committed thinkers. Additionally, since individuals appear to have a particular strength in detecting flaws in others’ arguments (Trouche et al. 2016), open social criticism within the group should ensure that the group’s conclusions remain reliable even if some, or at times most, of its members are led astray by their confirmation bias (Smart 2018: 4190; Peters 2018: 20).

Mercier and Sperber (2011: 65) themselves already float the idea of such a social “division of cognitive labor”. They don’t yet take its group-level benefits to explain why confirmation bias evolved, however (Dutilh Novaes 2018: 518f). Smart (2018) and Peters (2018) also don’t introduce their views as accounts of the evolved function of the bias. But Dutilh Novaes (2018: 519) and Levy (2019: 317) gesture toward, and Smith and Wald (2019) make the case for, an evolutionary proposal along these lines, arguing that the bias was selected for making a group’s inquiry more thorough, effective, and reliable.

While I have sympathies with this proposal, several researchers have noted that the concept of ‘group selection’ is problematic (West et al. 2007; Pinker 2012). One of the issues is that since individuals reproduce faster than groups, a trait T that is an adaptation that is good for groups but bad for an individual’s fitness won’t spread, because the rate of proliferation of groups is undermined by the evolutionary disadvantage of T within groups (Pinker 2012). The point equally applies to the proposal that confirmation bias was selected for its group-level benefits.

Moreover, a group arguably only benefits from each individual’s confirmation bias if there is a diversity of viewpoints in the group and members express their views, as otherwise “group polarization” is likely to arise (Myers and Lamm 1976): arguments for shared positions will accumulate without being criticized, making the group’s average opinion more extreme and less reliable, which is maladaptive. Crucially, ancestral ‘hunter-gatherer’ groups are perhaps unlikely to have displayed a diversity of viewpoints. After all, their members traveled less, interacted less with strangers, and were less economically dependent on other groups (Simpson and Beckes 2010: 37). This should have homogenized them with respect to race, culture, and background (Schuck 2001: 1915). Even today groups often display such homogeneity, as calls for diversity in academia, companies, etc. indicate. These points provide reasons to doubt that ancestral groups provided the kind of conditions in which confirmation bias could have produced the benefits that the group-cognition account highlights rather than maladaptive effects tied to group polarization.

3.3 The Intention–Alignment Account

Turning to a third and here final extant proposal on the evolution of confirmation bias, Norman (2016) argues that human reasoning evolved for facilitating an “intention alignment” between individuals: in social interactions, reasons typically ‘overwrite’ nonaligned mental states (e.g., people’s divergent intentions or beliefs) with aligned ones by showing the need for changing them. Norman holds that human reasoning was selected for this purpose because it makes cooperation easier. He adds that, in this context, “confirmation bias would have facilitated intention alignment, for a tribe of hunter-gatherers prone to [the bias] would more easily form and maintain the kind of shared outlook needed for mutualistic collaboration. The mythologies and ideologies taught to the young would accrue confirming evidence and tend to stick, thereby cementing group solidarity” (2016: 700). Norman takes his view to be supported by the “fact that confirmation bias is especially pronounced when a group’s ideological preconceptions are at stake” (ibid).

However, the proposal seems at odds with the finding that the bias inclines subjects to ignore or misconstrue their opponents’ objections. In fueling one-sided information processing to support one’s own view, the bias makes people less able to anticipate and adequately respond to their interlocutor’s point of view (Dutilh Novaes 2018: 520). Due to that effect, the bias arguably makes an intention alignment with others (especially with one’s opponents) harder, not easier. Moreover, since our ancestral groups are (as noted above) likely to have been largely viewpoint homogenous, in supporting intention alignment in these social environments, confirmation bias would have again facilitated group polarization, which is prima facie evolutionarily disadvantageous.

All three proposals about the adaptive role of confirmation bias considered so far thus raise questions. While the points mentioned aren’t meant to be fatal for the proposals and might be answerable within their frameworks, they do provide a motivation to explore an alternative.

4 Towards an Alternative

The key idea that I want to develop is the following. Confirmation bias is typically taken to work against an individual’s truth tracking (Mercier and Sperber 2017: 215; Peters 2018: 15), and indeed searching for information supporting one’s beliefs and ignoring contradictory data is epistemically disadvantageous when what one takes to be reality is and stays different from what one believes it to be. However, reality doesn’t always remain unchanged when we form beliefs about it. Consider social beliefs, that is, beliefs about people (oneself, others, and groups) and social structures (i.e., relationships between individuals, groups, and socio-political institutions). I shall contend that a confirmation bias pertaining to social beliefs reinforces our confidence in these beliefs, therewith strengthening our tendency to behave in ways that cause changes in reality so that it corresponds to the beliefs, turning them (when they are initially inaccurate) into self-fulfilling prophecies (SFPs) (Merton 1948; Biggs 2009). Due to its role in helping us make social reality match our beliefs, confirmation bias is adaptive, or so I will argue. I first introduce examples of SFPs of social beliefs. Then I explore the relevance of these beliefs in our species, before making explicit the adaptive role of confirmation bias in facilitating SFPs.

4.1 Social Beliefs and SFPs

Social beliefs often lead to SFPs with beneficial outcomes. Here are four examples.

S (falsely) believes he is highly intelligent. His self-view motivates him to engage with intellectuals, read books, attend academic talks, etc. This makes him increasingly more intelligent, gradually confirming his initially inaccurate self-concept (for relevant empirical data, see Swann 2012).

Without a communicative intention, a baby boy looking at a kitten produces a certain noise: ‘ma-ma’. His mother is thrilled, believing (falsely) that he is beginning to talk and wants to call her. She responds accordingly, rushing to him, attending to him, and indicating excitement. This leads the boy to associate ‘ma-ma’ with the arrival and attention of his mother. And so he gradually begins using the sounds to call her, confirming her initially false belief about his communicative intention (for relevant empirical data, see Mameli 2001).

A father believes his adolescent daughter doesn’t regularly drink alcohol, but she does. He acts in line with his belief and expresses it in communication with other people. His daughter notices and likes his positive view of her, which motivates her to increasingly resist drinks, gradually fulfilling her father’s optimistic belief about her (for relevant empirical data, see Willard et al. 2008).

A teacher (falsely) believes that a student’s current academic performance is above average. She thus gives him challenging material, encourages him, and communicates high expectations. This leads the student to increase his efforts, which gradually results in above-average academic performance (for relevant evidence, see Madon et al. 1997).

SFPs of initially false positive trait ascriptions emerge in many other situations too. They also occurred, for instance, when adults ascribed to children traits such as being tidy (Miller et al. 1975), charitable (Jensen and Moore 1977), or cooperative (Grusec et al. 1978). Similarly, in adults, attributions of, for example, kindness (Murray et al. 1996), eco-friendliness (Cornelissen et al. 2007), military competence (Davidson and Eden 2000), athletic ability (Solomon 2016), and even physiological changes (Turnwald et al. 2018) have all had self-fulfilling effects. Moreover, these effects don’t necessarily take much time to unfold but can happen swiftly in a single interaction (e.g., in interview settings; Word et al. 1974) right after the ascription (Turnwald et al. 2018: 49).

SFPs are, however, neither pervasive nor all-powerful (Jussim 2012), and there are various conditions for them to occur (Snyder and Klein 2007). For instance, they tend to occur only when targets are able to change in accordance with the trait ascriptions, when the latter are believable rather than unrealistic (Alfano 2013: 91f), and when the ascriber holds more power than the ascribee (Copeland 1994: 264f). But comprehensive literature reviews confirm that SFPs are “real, reliable, and occasionally quite powerful” (Jussim 2017: 8; Willard and Madon 2016).

4.2 The Distribution of Social Beliefs and Role of Prosociality in Humans

Importantly, SFPs can be pernicious when the beliefs at the center of them capture negative social conceptions, for instance, stereotypes, anxious expectations, fear, or hostility (Darley and Gross 1983; Downey et al. 1998; Madon et al. 2018). In these cases, SFPs would be maladaptive. Given this, what do we know about the distribution of social beliefs, in general, and positive ones, in particular, in ancestral human groups?

Many researchers hold that our evolutionary success as a species relies on our being “ultra-social” and “ultra-cooperative” animals (e.g., Tomasello 2014: 187; Henrich 2016). Human sociality is “spectacularly elaborate, and of profound biological importance” because “our social groups are characterized by extensive cooperation and division of labour” (Sterelny 2007: 720). Since we live in an almost continuous flow of interactions with conspecifics, “solving problems of coordination with our fellows is [one of] our most pressing ecological tasks” (Zawidzki 2008: 198). A significant amount of our beliefs are thus likely to be social ones (Tomasello 2014: 190f).

Moreover, when it comes to oneself, to group or “tribe” members, and to collaborators, these beliefs often capture positive to overly optimistic ascriptions of traits (e.g., communicativeness, skills, etc.; Simpson and Beckes 2010). This is well established when it comes to one’s beliefs about oneself (about 70% of the general population has a positive self-conception; Talaifar and Swann 2017: 4) and one’s family members (Wenger and Fowers 2008). The assumption that the point also holds for ‘tribe’ members and collaborators, more generally, receives support from the “tribal-instincts hypothesis” (Richerson and Boyd 2001), which holds that humans tend to harbor “ethnocentric attitudes in favor of [their] own tribe along with its members, customs, values and norms”, as this facilitates social predictability and cooperation (Kelly 2013: 507). For instance, in the past as much as today, humans “talk differently about their in-groups than their out-groups, such that they describe the in-group and its members [but not out-groups] as having broadly positive traits” (Stangor 2011: 568). In subjects with such ‘tribal instincts’, judgments about out-group members might easily be negative. But within the groups of these subjects, among in-group members, overly optimistic, cooperation-enhancing conceptions of others should be and have been more dominant particularly in “intergroup conflict, [which] is undeniably pervasive across human societies” (McDonald et al. 2012: 670). Indeed, such conflicts are known to fuel in-group “glorification” (Leidner et al. 2010; Golec De Zavala 2011).

Given these points, in ‘ultra-cooperative’ social environments in which ‘tribe’ members held predominantly positive social conceptions and expectations about in-group subjects, positive SFPs should have been overall more frequent and stronger than negative ones. Indeed, there is evidence that even today, positive SFPs in individual, dyadic interactions are more likely and pronounced than negative ones. For instance, focusing on mothers’ beliefs about their sons’ alcohol consumption, Willard et al. (2008) found that children “were more susceptible to their mothers’ positive than negative self-fulfilling effects” (499): “mothers’ false beliefs buffered their adolescents against increased alcohol use rather than putting them at greater risk” (Willard and Madon 2016: 133). Similarly, studies found that “teachers’ false beliefs raised students’ achievement more than they lowered it” (Willard and Madon 2016: 118): teacher overestimates “increase[d] achievement more than teacher underestimates tended to decrease achievement among students” (Madon et al. 1997: 806). Experiments with stigmatized subjects corroborate these results further (ibid), leading Jussim (2017) in his literature review to conclude that high teacher expectations help students “more than low expectations harm achievement” (8).

One common explanation of this asymmetry is that SFPs typically depend on whether the targets of the trait ascriptions involved accept the expectations imposed on them via the ascriptions (Snyder and Klein 2007). And since subjects tend to strive to think well of themselves (Talaifar and Swann 2017), they respond more to positive than negative expectations (Madon et al. 1997: 792). If we combine these considerations with the assumption that in ancestral groups of heavily interdependent subjects, positive social beliefs about in-group members (in-group favoritism) are likely to have been more prevalent than negative ones, then there is reason to hold that the SFPs of the social conceptions in the groups at issue were more often than not adaptive. With these points in mind, it is time to return to confirmation bias.

4.3 From SFPs to Confirmation Bias

Notice that SFPs depend on trait or mental-state ascriptions that are ‘ahead’ of their own truth: they are formed when an objective assessment of the available evidence doesn’t yet support their truth. Assuming direct doxastic voluntarism is false (Matheson and Vitz 2014), how can they nonetheless be formed and confidently maintained?

I suggest that confirmation bias plays an important role: it allows subjects to become and remain convinced about their social beliefs (e.g., trait ascriptions) when the available evidence doesn’t yet support their truth. This makes SFPs of these beliefs more likely than if the ascriber merely verbally attributed the traits without committing to the truth of the ascriptions, or believed in them but readily revised the beliefs. I shall argue that this is in fact adaptive not only when it comes to positive trait ascriptions, but also to negative ones. I will illustrate the point first with respect to positive trait ascriptions.

4.3.1 Motivated Confirmation Bias and Positive Trait Ascriptions

Suppose that you ascribe a positive property T to a subject A, who is your ward, but (unbeknownst to you) the available evidence doesn’t yet fully support that ascription. The more convinced you are about your view of A even in the light of counterevidence, the better you are at conveying your conviction to A because, generally, “people are more influenced [by others] when [these] others express judgments with high confidence than low confidence” (Kappes et al. 2020: 1; von Hippel and Trivers 2011). Additionally, the better you are at conveying to A your conviction that he has T, the more confident he himself will be that he has that trait (assuming he trusts you) (Sniezek and Van Swol 2001). Crucially, if A too is confident that he has T, he will be more likely to conform to the corresponding expectations than if he doesn’t believe the ascription, say, because he notices that you only say but don’t believe that he has T. Relatedly, the more convinced you are about your trait ascription to A, the clearer your signaling of the corresponding expectations to A in your behavior (Tormala 2016) and the higher the normative impetus on him, as a cooperative subject, to conform so as to avoid disrupting interactions with you.

Returning to confirmation bias, given what we know about the cognitive effect of the bias, the more affected you are by the bias, the stronger your belief in your trait ascriptions to A (Rabin and Schrag 1999), and so the lower the likelihood that you will reveal in your behavior a lack of conviction about them that could undermine SFPs. Thus, the more affected you are by the bias, the higher the likelihood of SFPs of the ascriptions because conviction about the ascriptions plays a key facilitative role for SFPs. This is also experimentally supported. For several studies found that SFPs of trait ascriptions occurred only when ascribers were certain of the ascriptions, not when they were less confident (Swann and Ely 1984; Pelham and Swann 1994; Swann 2012: 30). If we add to these points that SFPs of trait ascriptions were in developmental and educational contexts in ancestral tribal groups more often beneficial for the targets than not, then there is a basis for holding that confirmation bias might in fact have been selected for sustaining SFPs.
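
The belief-reinforcement dynamic that Rabin and Schrag (1999) model formally can be illustrated with a toy simulation. This is a minimal sketch of my own, not code from their paper: the function names (posterior, final_confidence) and the parameter values (p, q, n) are invented for illustration, and it simplifies their model by fixing the agent’s initially favored state. An agent who sometimes misreads disconfirming signals as confirming ones ends up markedly more confident than an unbiased observer facing the same kind of evidence:

```python
import random

def posterior(n_confirm: int, n_disconfirm: int, p: float) -> float:
    """Bayesian posterior that the initially favored state is true, given
    perceived signal counts; each signal matches the true state with
    probability p > 0.5."""
    ratio = (p / (1 - p)) ** (n_confirm - n_disconfirm)
    return ratio / (ratio + 1)

def final_confidence(q: float, p: float = 0.55, n: int = 40, seed: int = 0) -> float:
    """Simulate n signals. The favored state happens to be true, so each
    signal confirms it with probability p. With probability q, the agent
    misreads a disconfirming signal as confirming -- the confirmatory
    bias (q = 0 gives an unbiased observer)."""
    rng = random.Random(seed)
    n_confirm = n_disconfirm = 0
    for _ in range(n):
        confirms = rng.random() < p            # the signal as it truly is
        if not confirms and rng.random() < q:  # biased misreading
            confirms = True
        if confirms:
            n_confirm += 1
        else:
            n_disconfirm += 1
    return posterior(n_confirm, n_disconfirm, p)

print(f"unbiased: {final_confidence(q=0.0):.3f}")  # moderate confidence
print(f"biased:   {final_confidence(q=0.5):.3f}")  # typically near-certainty
```

On this sketch the extra conviction the biased agent acquires is exactly what, on the argument above, makes her expectation-signaling more effective; the same mechanism would equally entrench an initially false ascription, which is what suits the bias for sustaining SFPs.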

Notice that the argument so far equally applies to motivated reasoning. This is to be expected because, as mentioned above, motivated confirmation bias is an instance of motivated reasoning (Nickerson 1998). To pertain specifically to confirmation bias, however, the evolutionary proposal that the bias was selected for facilitating SFPs of social conceptions also has to hold for unmotivated confirmation bias. Is this the case?

4.3.2 Unmotivated Confirmation Bias and Negative Trait Ascriptions

Notice that when we automatically reinforce any of our views no matter whether we favor them, our preferences are neither required for the reinforcement process nor able to undermine it or the SFPs it promotes. This means that such a general tendency, i.e., a confirmation bias, can fulfil the function of facilitating SFPs more frequently than motivated cognitions, namely whenever the subject has acquired a social conception (e.g., as the result of upbringing, learning, or testimony). This is adaptive for at least three reasons.

First, suppose that as a parent, caretaker, or teacher you (unknowingly) wishfully believe that A, who is your ward, has a positive trait T. You tell another subject (B) that A has T, and, on your testimony, B subsequently believes this too. But suppose that unlike you, B has no preference as to whether A has T. Yet, as it happens, she still has a confirmation bias toward her beliefs. Just like you, B will now process information so that it strengthens her view about A. This increases her conviction in, and so the probability of an SFP of, the trait ascription to A, because now both you and B are more likely to act toward A in ways indicating ascription-related expectations. As a general tendency to support any of one’s beliefs rather than only favored ones, the bias thus enables a social ‘ripple’ effect in the process of making trait ascriptions match reality. Since this process is in ultra-social and ultra-cooperative groups more often than not adaptive (e.g., boosting the development of a positive trait in A), in facilitating a social extension of it, confirmation bias is adaptive too.

Secondly, in ancestral groups, many of the social conceptions (e.g., beliefs about social roles, gender norms, stereotypes, etc.) that subjects unreflectively acquired during their upbringing and socialization will have been geared toward preserving the group’s function and status quo and aligning individuals with them (Sterelny 2006: 148). Since it can operate independently of a subject’s preferences, a confirmation bias in each member of the group would have helped the group enlist each of its members for re-producing social identities, social structures, traits, and roles in the image of the group’s conceptions even when these individuals disfavored them. In sustaining SFPs of these conceptions, which might have included various stereotypes or ethnocentric, prejudicial attitudes that we today consider offensive negative trait ascriptions (e.g., gender or racist stereotypes) (Whitaker et al. 2018), confirmation bias would have been adaptive in the past. For, as Richerson and Boyd (2005: 121f) note too, in ancestral groups, selection pressure favored social conformity, predictability, and stability. That confirmation bias might have evolved for facilitating SFPs that serve the ‘tribal’ collective, possibly even against the preference, autonomy, and better judgment of the individual, is in line with recent research suggesting that many uniquely human features of cognition evolved through pressures selecting for the ability to conform to other people and to facilitate social projects (Henrich 2016). It is thought that these features may work against common ideals associated with self-reliance or “achieving basic personal autonomy, because the main purpose of [them] is to allow us to fluidly mesh with others, making us effective nodes in larger networks” (Kelly and Hoburg 2017: 10). I suggest that confirmation bias too was selected for making us effective ‘nodes’ in social networks by inclining us to create social reality that corresponds to these networks’ conceptions even when we dislike them or they are harmful to others (e.g., out-group members).

Thirdly, in helping us make social affairs match our beliefs about them even when we don’t favor them, confirmation bias also provides us with significant epistemic benefits in social cognition. Consider Jack and Jill. Both have just seen an agent A act ambiguously, and both have formed a first impression of A according to which A is acting the way he is because he has trait T. Suppose neither Jack nor Jill has any preference as to whether A has that trait but subsequently process information in the following two different ways. Jack does not have a confirmation bias but impartially assesses the evidence and swiftly revises his beliefs when encountering contradictory data. As it happens, A’s behavior soon does provide him with just such evidence, leading him to abandon his first impression of A and reopen the search for an explanation of A’s action. In contrast, Jill does have a confirmation bias with respect to her beliefs and interprets the available evidence so that it supports her beliefs. Jill too sees A act in a way that contradicts her first impression of him. But unlike Jack, she doesn’t abandon her view. Rather, she reinterprets A’s action so that it bolsters her view. Whose information processing might be more adaptive? For Jack, encountering data challenging his view removes certainty and initiates a new cycle of computations about A, which requires him to postpone a possible collaboration with A. For Jill, however, the new evidence strengthens her view, leading her to keep the issue of explaining A’s action settled and be ready to collaborate with him. Jack’s approach might still seem better for attaining an accurate view of A and predicting what he’ll do next. But suppose Jill confidently signals to A her view of him in her behavior. Since people have a general inclination to fulfil others’ expectations (especially positive ones) out of an interest in coordinating and getting along with them (Dardenne and Leyens 1995; Bacharach et al. 2007), when A notices Jill’s conviction that he displays T, he too is likely to conform, which provides Jill with a correct view of what he will do next. Jill’s biased processing is thus more adaptive than Jack’s approach: a confirmation bias provides her with certainty and simpler information processing that simultaneously facilitates accurate predictions (via contributing to SFPs). Generalizing from Jill, in everyday social interactions we all form swift first impressions of others without having any particular preference with respect to these impressions either way. Assuming that confirmation bias operates on them nonetheless, the bias will frequently be adaptive in the ways just mentioned.

4.3.3 Summing Up: The Reality-Matching Account

By helping subjects make social reality match their beliefs about it no matter whether they favor these beliefs or the latter are sufficiently evidentially supported, confirmation bias is adaptive: when the bias targets positive social beliefs and trait ascriptions, it serves both the subject and the group by producing effects that (1) assist them in their development (to become, e.g., more communicative, cooperative, or knowledgeable) and (2) make social cognition more tractable (by increasing social conformity and predictability). To be sure, when it targets negative trait ascriptions (pernicious stereotypes, etc.), the bias can have ethically problematic SFP effects. But, as noted, especially in ancestral ‘tribal’ groups, it would perhaps still have contributed to social conformity, predictability, and sustaining the status quo, which would have been adaptive in these groups (Richerson and Boyd 2005) inter alia by facilitating social cognition. Taken together, these considerations provide a basis for holding that confirmation bias was selected for promoting SFPs. I shall call the proposal introduced in this section the reality-matching (RM) account of the function of confirmation bias.

5 Supporting the RM Account

Before offering empirical support for the RM account and highlighting its explanatory benefits, it is useful to disarm an objection: if confirmation bias was selected for its SFP-related effects, then people should not also display the bias with respect to beliefs that can’t produce SFPs (e.g., beliefs about physics, climate change, religion, etc.). But they do (Nickerson 1998).

5.1 From Social to Non-social Beliefs

In response to the objection just mentioned, two points should be noted. First, the RM account is compatible with the view that confirmation bias was also selected for adaptive effects related to non-social beliefs. It only claims that facilitating the alignment of social reality with social beliefs (i.e., reality matching) is one important, and so far neglected, adaptive feature for which the bias was selected.

Second, it doesn’t follow from the fact that confirmation bias also affects beliefs that can’t initiate SFPs that it could not have been selected for affecting beliefs that can and do initiate them. The literature offers many examples of biological features or cognitive traits that were selected for fulfilling a certain function despite rarely doing so or even having maladaptive effects (Millikan 1984; Haselton and Nettle 2006). Consider the “baby-face overgeneralization” bias (Zebrowitz and Montepare 2008). Studies suggest that people have a strong readiness to respond favorably to babies’ distinctive facial features. And this tendency is overgeneralized such that even adults are viewed more favorably and treated as more likeable (but also as physically weaker and more naïve) when they display baby-faced features. While this overgeneralization tendency often leads to errors, it is thought to have evolved because failures to respond favorably to babies (i.e., false negatives) are evolutionarily more costly than overgeneralizing (i.e., false positives) (ibid).
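
The error-management logic behind this point can be illustrated with a back-of-the-envelope calculation. The numbers below are entirely hypothetical, chosen only to show how an asymmetry in error costs can favor an overgeneralizing trait:

```python
# Back-of-the-envelope error-management comparison (hypothetical numbers).
# If failing to respond favorably to an infant (a false negative) is far
# costlier than responding favorably to a baby-faced adult (a false
# positive), an overgeneralizing trait can have the lower expected cost
# even though it produces errors far more often.

p_false_positive = 0.20    # overgeneralizing trait: frequent but cheap errors
p_false_negative = 0.05    # narrow trait: rare but severe errors
cost_false_positive = 1    # e.g., misplaced warmth toward a baby-faced adult
cost_false_negative = 50   # e.g., withholding care from one's own infant

expected_cost_overgeneralizing = p_false_positive * cost_false_positive
expected_cost_narrow = p_false_negative * cost_false_negative

print(f"Overgeneralizing trait: {expected_cost_overgeneralizing:.2f}")  # 0.20
print(f"Narrow trait:           {expected_cost_narrow:.2f}")            # 2.50
```

With these stipulated values the overgeneralizing trait has the lower expected cost despite erring far more often, which mirrors the general shape of error-management reasoning (cf. Haselton and Nettle 2006).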

Might our domain-general tendency to confirm our own beliefs similarly be less evolutionarily costly than not having such a general tendency? It is not implausible to assume so because, as noted, we are ultra-social and ultra-cooperative, and our beliefs about people’s social standing, knowledge, intentions, abilities, etc. are critical for our flourishing (Sterelny 2007: 720; Tomasello 2014: 190f; Henrich 2016). Importantly, these beliefs, unlike beliefs about the non-social world, can and frequently do initiate SFPs that contribute to the outlined evolutionary benefits. This matters because if social beliefs are pervasive and SFPs of them significant for our flourishing, then a domain-general tendency to confirm any of our beliefs ensures that we don’t miss opportunities to align social reality with our conceptions and to reap the related developmental and epistemic benefits. Granted, this tendency overgeneralizes, which creates clear costs. But given the special role of social beliefs in our species and our dependence on social learning and social cognition, which are facilitated by SFPs, it is worth taking seriously the possibility that the benefits can often outweigh these costs.

While this thought doesn’t yet show that the RM account is correct, it does help disarm the above objection. For it explains why the fact that confirmation bias also affects beliefs that cannot initiate SFPs doesn’t disprove the view that the bias was selected for reality matching: the special role of social beliefs in our species (compared to other species) lends plausibility to the assumption that the costs of the bias’ overgeneralizing might be lower than the costs of its failing to generalize. I now turn to the positive support for the RM account.

5.2 Empirical Data

If, as the RM account proposes, confirmation bias was selected for facilitating the process of making reality match our beliefs, then the bias should be common and pronounced when (1) it comes to social beliefs, that is, beliefs (a) about oneself, (b) about other people, and (c) about social structures that the subject can determine, and when (2) social conditions are conducive to reality matching. While there are no systematic comparative studies on whether the bias is more frequent or stronger with respect to some beliefs but not others (e.g., social vs. non-social beliefs), there is related empirical research that does provide some support for these predictions.

Self-related Beliefs

In a number of studies, Swann and colleagues (Swann 1983; Swann et al. 1992; for an overview, see Swann 2012) found that the selective information processing characteristic of confirmation bias is “especially pronounced with regards to self-concepts” and so self-related beliefs (Müller-Pinzler et al. 2019: 9). Interestingly, and counterintuitively, the data show that “just as people with positive self-views preferentially seek positive evaluations, those with negative self-views preferentially seek negative evaluations” (Talaifar and Swann 2017: 3). For instance, those “who see themselves as likable seek out and embrace others who evaluate them positively, whereas those who see themselves as dislikeable seek out and embrace others who evaluate them negatively” (ibid). Much in line with the RM account, Swann (2012) notes that this confirmatory tendency “would have been advantageous” in “hunter-gatherer groups”: once “people used input from the social environment to form self-views, self-verification strivings would have stabilized their identities and behavior, which in turn would make each individual more predictable to other group members” (26).

Similarly, in a study in which subjects received feedback about aspects of their self that can be relatively easily changed (e.g., their ability to estimate the weights of animals), Müller-Pinzler et al. (2019) found that “prior beliefs about the self modulate self-related belief-formation” in that subjects updated their performance estimates “in line with a confirmation bias”: individuals with prior negative self-related beliefs (e.g., low self-esteem) showed increased biases towards factoring in negative (vs. positive) feedback, and, interestingly, this tendency was “modulated by the social context and only present when participants were exposed to a potentially judging audience” (ibid: 9–10). This coheres with the view that confirmation bias might serve the ‘collective’ to bring subjects into accordance with its social conceptions (positive or negative).

Other-related Beliefs

If confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should also be particularly pronounced when it comes to beliefs about other people, especially in situations conducive to reality matching. For instance, powerful individuals have been found to be more likely than relatively powerless subjects to prompt subordinates to behaviorally confirm their social conceptions (Copeland 1994; Leyens et al. 1999). That is, interactions between powerful and powerless individuals are conducive to reality matching of the powerful individuals’ social beliefs. According to the RM account, powerful individuals should therefore display a stronger confirmation bias with respect to the relevant social beliefs. Goodwin et al. (2000) found just that: powerful people tend to fail to take into account data that may contradict their social beliefs (capturing, e.g., stereotypes) about subordinates and attend more closely to information that supports their expectations. Relative to the powerless, powerful people displayed a stronger confirmation bias in their thinking about subordinates (ibid: 239f).

Similarly, if confirmation bias serves to facilitate social interaction by contributing to a match between beliefs and social reality, then the bias should be increased with respect to trait attributions to other people in subjects who care about social interactions compared to other subjects. Dardenne and Leyens (1995) reasoned that when testing a hypothesis about the personality of another individual (e.g., their being introverted or extroverted), a preference for questions that match the hypothesis (e.g., that the subject is introverted) indicates social skill, conveying a feeling of being understood to the individual and contributing to a smooth conversation. Socially skilled people (‘high self-monitors’) should thus prefer ‘matching questions’. In an interview setting, for instance, when testing the introvert hypothesis, an interviewer could ask questions that a typical introvert would answer ‘yes’ to (e.g., ‘Do you like to stay alone?’), confirming the presence of the hypothesized trait (ibid). Dardenne and Leyens did find that matching questions pertaining to an introvert or an extrovert hypothesis were selected most often by high self-monitors: socially skilled subjects displayed a stronger confirmatory tendency than less socially skilled subjects (ibid).

Finally, there is also evidence that confirmation bias is more pronounced with respect to social beliefs compared to non-social beliefs. For instance, Marsh and Hanlon (2007) gave one group of behavioral ecologists a specific set of expectations with respect to sex differences in salamander behavior, while a second group was given the opposite set of expectations. In one experiment, subjects collected data on variable sets of live salamanders, while in the other experiment, observers collected data from identical videotaped trials. Across experiments and observed behaviors, the expectations of the observers biased their observations “only to a small or moderate degree”, Marsh and Hanlon note, concluding that these “results are largely optimistic with respect to confirmation bias in behavioral ecology” (2007: 1089). This insignificant confirmation bias with respect to beliefs about non-social matters contrasts with findings of a significant confirmation bias with respect to beliefs about people (Talaifar and Swann 2017; Goodwin et al. 2000; Marks and Fraley 2006; Darley and Gross 1983), and, as I shall argue now, social affairs whose reality the subject can determine.

Non-personal Social Beliefs

One important kind of social belief is political belief, i.e., belief concerning social states of affairs pertaining to politics. Political beliefs are especially interesting in the context of the RM account because they are very closely related to reality matching. This is not only because subjects can often directly influence political affairs via voting, running as a candidate, campaigning, etc. It is also because subjects who are highly confident about their political beliefs are more likely to be able to convince other people of them too (Kappes et al. 2020). And the more widespread a political conviction in a population, the higher the probability that the population will adopt political structures that shape reality in line with it (Jost et al. 2003; Ordabayeva and Fernandes 2018).

If, as the RM account proposes, confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should be particularly strong when it comes to beliefs about political states of affairs. And indeed, Taber and Lodge (2006) found that “motivated [confirmation] biases come to the fore in the processing of political arguments”, in particular, and, crucially, that subjects “with weak […] [political] attitudes show less [confirmation] bias in processing political arguments” (767). In fact, in psychology, attitude strength, especially in politically relevant domains of thinking, has long been and still is widely accepted to increase the kind of selective exposure constitutive of confirmation bias (Knobloch-Westerwick et al. 2015: 173). For instance, Brannon et al. (2007) found that stronger, more extreme political attitudes are correlated with higher ratings of interest in attitude-consistent versus attitude-discrepant political articles. Similarly, Knobloch-Westerwick et al. (2015) found that online users who attached high importance to particular political topics spent more time on attitude-consistent messages than users who attached low importance to those topics, and “[a]ttitude-consistent messages […] were preferred”, reinforcing the attitudes further (171). While this can contribute to political group polarization, such polarization also boosts the group-wide reality-matching endeavour and can so be adaptive itself (Johnson and Fowler 2011: 317).

In short, then, while there are currently no systematic comparative studies on whether confirmation bias is more frequent or stronger with respect to social beliefs, related empirical studies do suggest that when it comes to (positive or negative) social beliefs about oneself, other people, and social states of affairs that the subject can determine (e.g., political beliefs), confirmation bias is both particularly common and pronounced. Empirical data thus corroborate some of the predictions of the RM account.

5.3 Explanatory Benefits

The theoretical and empirical considerations from the preceding sections offer support for the RM account. Before concluding, it is worth mentioning three further reasons for taking the account seriously. First, it has greater explanatory power than the three alternative views outlined above. Second, it is consistent with, and provides new contributions to, different areas of evolutionary theorizing on human cognition. Third, it casts new light on the epistemic character of confirmation bias. I’ll now support these three points.

For instance, the argumentative-function account holds that confirmation bias is adaptive in making us better arguers. This was problematic because the bias hinders us in anticipating people’s objections, which weakens our argumentative skill and increases the risk of our appearing incompetent in argumentative exchanges. The RM account avoids these problems: if confirmation bias was selected for reinforcing our preconceptions about people so as to promote SFPs, then, since in one’s own reasoning one only needs to justify one’s beliefs to oneself, the first point one finds acceptable will suffice. To convince others, one would perhaps need to anticipate objections. But if the bias functions primarily to boost one’s own conviction about particular beliefs so as to facilitate SFPs, then ‘laziness’ in critical thinking about one’s own positions (Trouche et al. 2016) shouldn’t be surprising.

Turning to the group-cognition account, the proposal was that confirmation bias is adaptive in and was selected for making group-level inquiries more thorough, reliable, and efficient. In response, I noted that the concept of ‘group selection’ is problematic when it comes to traits threatening an individual’s fitness (West et al. 2007; Pinker 2012), and that confirmation bias would arguably only lead to the group-level benefits at issue in groups with viewpoint diversity. Yet it is doubtful that ancestral groups met this condition. The RM account is preferable to the group-cognition view because it doesn’t rely on a notion of group selection but concerns primarily individual-level benefits, and it doesn’t tie the adaptive effects of the bias to conditions of viewpoint diversity. It proposes instead that the adaptive SFP-related effects of the bias increase individuals’ fitness (e.g., by facilitating their navigation of the social world, aligning them and others with their group’s conceptions, etc.) and can emerge whenever people hold beliefs about each other, interact, and fulfill social expectations. This condition is satisfied even in groups with viewpoint homogeneity.

The RM account also differs from the intention-alignment view, which holds that confirmation bias evolved to allow us to synchronize intentions with others. One problem with this view was that the bias seems to hinder the intention alignment of individuals by weakening their perspective-taking capacity and inclining them to ignore or distort people’s objections. The RM account avoids this problem because it suggests that by disregarding objections or counterevidence to one’s beliefs, one can remain convinced of them, which helps align social reality (not only, e.g., people’s intentions) with them, producing the adaptive outcomes outlined above. The account can also explain why confirmation bias is particularly strong in groups in which shared ideologies are at stake (Taber and Lodge 2006; Gerken 2019). For subjects have a keen interest in reality corresponding to their ideological conceptions. Since the latter shape social reality via their impact on behavior, and do so more effectively the more convinced people are of them (Kappes et al. 2020), it is to be expected that confirmation bias is more pronounced when it comes to ideological propositions in like-minded groups. And, as noted, the resulting group polarization can then itself be adaptive in strengthening the reality-matching process.

Moving beyond extant work on the evolution of confirmation bias, the RM account also contributes to, and raises new questions for, other areas of research in different disciplines. For instance, it yields predictions that psychologists can experimentally explore in comparative studies, such as the prediction that confirmation bias is more common and stronger when targeting social versus non-social beliefs, or when conditions are conducive to reality matching as opposed to when they are not. The account also adds a new perspective to research on SFPs and on how social conceptions interact with their targets (Hacking 1995; Snyder and Klein 2007; Jussim 2017). Relatedly, the RM account contributes to recent philosophical work on folk psychology, i.e., our ability to ascribe mental states to agents to make sense of their behavior. In that work, some philosophers argue that folk psychology serves “mindshaping”, that is, the moulding of people’s behavior and minds so that they fit our conceptions, making people more predictable and cooperation with them easier (Mameli 2001; Zawidzki 2013; Peters 2019b). There are clear connections between the mindshaping view of folk psychology and the RM account, but also important differences. For instance, the RM account pertains to the function of confirmation bias, not folk psychology. Moreover, advocates of the mindshaping view have so far left unexplored the conditions for effective mindshaping via folk-psychological ascriptions and the possible role of confirmation bias in it. The RM account begins to fill this gap in the research and in doing so adds to work on the question of how epistemic (or ‘mindreading’) and non-epistemic (or ‘mindshaping’, e.g., motivational) processes are related in folk psychology (Peters 2019b: 545f; Westra 2020; Fernández-Castro and Martínez-Manrique 2020).

In addition to offering contributions to a range of different areas of research, the RM account also casts new light on the epistemic character of confirmation bias. Capturing the currently common view on the matter, Mercier (2016) writes that “piling up reasons that support our preconceived views is not the best way to correct them. […] [It] stop[s] people from fixing mistaken beliefs” (110). The RM account offers a different perspective, suggesting that when it is directed at beliefs about social affairs, confirmation bias does often help subjects correct their mistaken conceptions to the extent that it contributes to SFPs of them. Similarly, Dutilh Novaes (2018) holds that the bias involves or contributes to a failure of perspective taking, and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520). The RM account, on the other hand, proposes that the bias often facilitates social understanding: in making us less sensitive to our interlocutor’s opposing perspective, it helps us remain confident about our social beliefs, which increases the probability of SFPs that in turn make people more predictable and mindreadable.

6 Conclusion

After outlining limitations of three recent proposals on the evolution of confirmation bias, I developed and supported a novel alternative, the reality-matching (RM) account, which holds that one of the adaptive features for which the bias evolved is that it helps us bring social reality into alignment with our beliefs. When the bias targets positive social beliefs, this serves both the subject and the group, assisting them in their development (to become, e.g., more communicative or knowledgeable) while also making their social cognition more effective and tractable. When it targets negative social beliefs, in promoting reality matching, the bias might contribute to ethically problematic outcomes, but it can then still support social conformity and predictability, which, perhaps especially in ancestral tribal groups, were adaptive. While the socially constructive aspect of confirmation bias highlighted here may not be the main or only feature of the bias that led to its evolution, it is one that has so far been overlooked in evolutionary theorizing on confirmation bias. If we attend to it, an account of the function of confirmation bias becomes available that coheres with data from across the psychological sciences, avoids many of the shortcomings of competitor views, and has explanatory benefits that help advance research on the function, nature, and epistemic character of the bias.

Notes

Mercier and Sperber (2017) and others prefer the term ‘myside bias’ to ‘confirmation bias’ because people don’t have a general tendency to confirm any hypothesis that comes to their mind but only ones that are on ‘their side’ of a debate. I shall here use the term ‘confirmation bias’ because it is more common and in any case typically understood in the way just mentioned.

Researchers working on folk psychology might be reminded of the ‘mindshaping’ view of folk psychology (Mameli 2001; Zawidzki 2013). I will come back to this view and demarcate it from my account of confirmation bias in Sect. 5.

It might be proposed that when participants in the experiment seek reasons for their judgments, perhaps they take themselves already to have formed the judgments for good reasons and then wonder what these reasons might have been. Why would they seek reasons against a view that they have formed (by their own lights) for good reasons? However, we might equally well ask why they would take themselves to have formed a judgment for good reasons in the first place even though they don’t know any of those reasons. If there is a general default tendency to assume that any view one holds rests on good reasons, then that would again suggest the presence of a confirmation bias. For a general tendency to think that one’s views rest on good reasons even when one doesn’t know them is a tendency to favor and confirm these views while resisting balanced scrutiny of their basis.

SFPs can also accumulate when they occur across different interactions, and in contemporary societies, overall accumulative SFP effects of negative social beliefs capturing, e.g., stereotypes might be stronger than those of positive social beliefs in individual dyadic interactions (Madon et al. 2018). However, in ancestral, ‘tribal’ groups of highly interdependent subjects, even accumulative SFPs of, e.g., stereotypes would perhaps still have contributed to conformity and social stability. I shall return to the possible SFP-related benefits of nowadays highly negative social conceptions, i.e., stereotypes, ethnocentrism, etc., below.

Relatedly, neuroscientific data show that a positive view of one’s own traits tends to correlate with a reduced activation of the right inferior prefrontal gyrus, the area of the brain processing self-related content, when the subject receives negative self-related information (Sharot et al. 2011). That is, optimists about themselves display a diminished sensitivity to negative information that is in tension with self-related trait optimism (ibid).

References

Alfano, M. (2013). Character as moral fiction. Cambridge: CUP.

Bacharach, M., Guerra, G., & Zizzo, D. J. (2007). The self-fulfilling property of trust: An experimental study. Theory and Decision, 63, 349–388.

Ball, P. (2017). The trouble with scientists. How one psychologist is tackling human biases in science. Nautilus . Retrieved May 2, 2019 from http://nautil.us/issue/54/the-unspoken/the-trouble-with-scientists-rp .

Biggs, M. (2009). Self-fulfilling prophecies. In P. Bearman & P. Hedstrom (Eds.), The Oxford handbook of analytical sociology (pp. 294–314). Oxford: OUP.

Brannon, L. A., Tagler, M. J., & Eagly, A. H. (2007). The moderating role of attitude strength in selective exposure to information. Journal of Experimental Social Psychology, 43, 611–617.

Copeland, J. (1994). Prophecies of power: Motivational implications of social power for behavioral confirmation. Journal of Personality and Social Psychology, 67, 264–277.

Cornelissen, G., Dewitte, S., & Warlop, L. (2007). Whatever people say I am that’s what I am: Social labeling as a social marketing tool. International Journal of Research in Marketing, 24 (4), 278–288.

Dardenne, B., & Leyens, J. (1995). Confirmation bias as a social skill. Personality and Social Psychology Bulletin, 21 (11), 1229–1239.

Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20–33.

Davidson, O. B., & Eden, D. (2000). Remedial self-fulfilling prophecy: Two field experiments to prevent Golem effects among disadvantaged women. Journal of Applied Psychology, 85 (3), 386–398.

De Bruine, L. M. (2009). Beyond ‘just-so stories’: How evolutionary theories led to predictions that non-evolution-minded researchers would never dream of. Psychologist, 22 (11), 930–933.

De Cruz, H., & De Smedt, J. (2016). How do philosophers evaluate natural theological arguments? An experimental philosophical investigation. In H. De Cruz & R. Nichols (Eds.), Advances in religion, cognitive science, and experimental philosophy (pp. 119–142). New York: Bloomsbury.

Downey, G., Freitas, A. L., Michaelis, B., & Khouri, H. (1998). The self-fulfilling prophecy in close relationships: Rejection sensitivity and rejection by romantic partners. Journal of Personality and Social Psychology, 75, 545–560.

Draper, P., & Nichols, R. (2013). Diagnosing bias in philosophy of religion. The Monist, 96, 420–446.

Dutilh Novaes, C. (2018). The enduring enigma of reason. Mind and Language, 33, 513–524.

Evans, J. (1996). Deciding before you think: Relevance and reasoning in the selection task. British Journal of Psychology, 87, 223–240.

Fernández-Castro, V., & Martínez-Manrique, F. (2020). Shaping your own mind: The self-mindshaping view on metacognition. Phenomenology and the Cognitive Sciences. https://doi.org/10.1007/s11097-020-09658-2.

Gerken, M. (2019). Public scientific testimony in the scientific image. Studies in History and Philosophy of Science Part A. https://doi.org/10.1016/j.shpsa.2019.05.006.

Golec de Zavala, A. (2011). Collective narcissism and intergroup hostility: The dark side of ‘in-group love’. Social and Personality Psychology Compass, 5, 309–320.

Goodwin, S., Gubin, A., Fiske, S., & Yzerbyt, V. (2000). Power can bias impression formation: Stereotyping subordinates by default and by design. Group Processes and Intergroup Relations, 3, 227–256.

Gould, S. J., & Lewontin, R. C. (1979). The spandrels of San Marco and the Panglossian paradigm: A critique of the adaptationist programme. Proceedings of the Royal Society of London. Series B, 205 (1161), 581–598.

Grusec, J., Kuczynski, L., Rushton, J., & Simutis, Z. (1978). Modeling, direct instruction, and attributions: Effects on altruism. Developmental Psychology, 14, 51–57.

Hacking, I. (1995). The looping effects of human kinds. In D. Sperber, et al. (Eds.), Causal cognition (pp. 351–383). New York: Clarendon Press.

Hahn, U., & Harris, A. J. L. (2014). What does it mean to be biased: Motivated reasoning and rationality. In B. H. Ross (Ed.), Psychology of learning and motivation (pp. 41–102). New York: Academic Press.

Haidt, J. (2012). The righteous mind . New York: Pantheon.

Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS ONE, 7 (9), e45457.

Haselton, M. G., & Nettle, D. (2006). The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review, 10, 47–66.

Henrich, J. (2016). The secret of our success . Princeton, NJ: Princeton University Press.

Hernandez, I., & Preston, J. L. (2013). Disfluency disrupts the confirmation bias. Journal of Experimental Social Psychology, 49 (1), 178–182.

Jensen, R. E., & Moore, S. G. (1977). The effect of attribute statements on cooperativeness and competitiveness in school-age boys. Child Development, 48 (1), 305–307.

Johnson, D. D. P., Blumstein, D. T., Fowler, J. H., & Haselton, M. G. (2013). The evolution of error: Error management, cognitive constraints, and adaptive decision-making biases. Trends in Ecology & Evolution, 28, 474–481.

Johnson, D. D. P., & Fowler, J. H. (2011). The evolution of overconfidence. Nature, 477, 317–320.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129 (3), 339–375.

Jussim, L. (2012). Social perception and social reality . Oxford: OUP.

Jussim, L. (2017). Précis of social perception and social reality: Why accuracy dominates bias and self-fulfilling prophecy. Behavioral and Brain Sciences, 40, 1–20.

Kappes, A., Harvey, A. H., Lohrenz, T., et al. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23, 130–137.

Kelly, D. (2013). Moral disgust and the tribal instincts hypothesis. In K. Sterelny, R. Joyce, B. Calcott, & B. Fraser (Eds.), Cooperation and its evolution (pp. 503–524). Cambridge, MA: The MIT Press.

Kelly, D., & Hoburg, P. (2017). A tale of two processes: On Joseph Henrich’s the secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Philosophical Psychology, 30 (6), 832–848.

Ketelaar, T., & Ellis, B. J. (2000). Are evolutionary explanations unfalsifiable? Evolutionary psychology and the Lakatosian philosophy of science. Psychological Inquiry, 11 (1), 1–21.

Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.

Knobloch-Westerwick, S., Johnson, B. K., & Westerwick, A. (2015). Confirmation bias in online searches: Impacts of selective exposure before an election on political attitude strength and shifts. Journal of Computer-Mediated Communication, 20, 171–187.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108 (3), 480–498.

Leidner, B., Castano, E., Zaiser, E., & Giner-Sorolla, R. (2010). Ingroup glorification, moral disengagement, and justice in the context of collective violence. Personality and Social Psychology Bulletin, 36 (8), 1115–1129.

Levy, N. (2019). Due deference to denialism: Explaining ordinary people’s rejection of established scientific findings. Synthese, 196 (1), 313–327.

Leyens, J., Dardenne, B., Yzerbyt, V., Scaillet, N., & Snyder, M. (1999). Confirmation and disconfirmation: Their social advantages. European Review of Social Psychology, 10 (1), 199–230.

Lilienfeld, S. O. (2017). Psychology’s replication crisis and the grant culture: Righting the ship. Perspectives on Psychological Science, 12 (4), 660–664.

Lord, C., Lepper, M., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47, 1231–1243.

Madon, S., Jussim, L., & Eccles, J. (1997). In search of the powerful self-fulfilling prophecy. Journal of Personality and Social Psychology, 72, 791–809.

Madon, S., Jussim, L., Guyll, M., Nofziger, H., Salib, E. R., Willard, J., et al. (2018). The accumulation of stereotype-based self-fulfilling prophecies. Journal of Personality and Social Psychology, 115 (5), 825–844.

Mameli, M. (2001). Mindreading, mindshaping, and evolution. Biology and Philosophy, 16, 597–628.

Marks, M. J., & Fraley, R. C. (2006). Confirmation bias and the sexual double standard. Sex Roles: A Journal of Research, 54 (1–2), 19–26.

Marsh, D. M., & Hanlon, T. J. (2007). Seeing what we want to see: Confirmation bias in animal behavior research. Ethology, 113, 1089–1098.

Matheson, J., & Vitz, R. (Eds.). (2014). The ethics of belief: Individual and social . Oxford: OUP.

Mayo, R., Alfasi, D., & Schwarz, N. (2014). Distrust and the positive test heuristic: Dispositional and situated social distrust improves performance on the Wason Rule Discovery Task. Journal of Experimental Psychology: General, 143 (3), 985–990.

McDonald, M. M., Navarrete, C. D., & van Vugt, M. (2012). Evolution and the psychology of intergroup conflict: The male warrior hypothesis. Philosophical Transactions of the Royal Society, B, 367, 670–679.

Mercier, H. (2016). Confirmation (or myside) bias. In R. Pohl (Ed.), Cognitive illusions (pp. 99–114). London: Psychology Press.

Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34 (2), 57–111.

Mercier, H., & Sperber, D. (2017). The enigma of reason . Cambridge, MA: Harvard University Press.

Merton, R. (1948). The self-fulfilling prophecy. The Antioch Review, 8 (2), 193–210.

Miller, R., Brickman, P., & Bolen, D. (1975). Attribution versus persuasion as a means for modifying behavior. Journal of Personality and Social Psychology, 31 (3), 430–441.

Millikan, R. G. (1984). Language, thought, and other biological categories. Cambridge, MA: MIT Press.

Müller-Pinzler, L., Czekalla, N., Mayer, A. V., et al. (2019). Negativity-bias in forming beliefs about own abilities. Scientific Reports, 9, 14416. https://doi.org/10.1038/s41598-019-50821-w .

Murray, S. L., Holmes, J. G., & Griffin, D. W. (1996). The self-fulfilling nature of positive illusions in romantic relationships: Love is not blind, but prescient. Journal of Personality and Social Psychology, 71, 1155–1180.

Myers, D., & DeWall, N. (2015). Psychology . New York: Worth Publishers.

Myers, D. G., & Lamm, H. (1976). The group polarization phenomenon. Psychological Bulletin, 83, 602–627.

Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220.

Norman, A. (2016). Why we reason: Intention–alignment and the genesis of human rationality. Biology and Philosophy, 31, 685–704.

Ordabayeva, N., & Fernandes, D. (2018). Better or different? How political ideology shapes preferences for differentiation in the social hierarchy. Journal of Consumer Research, 45 (2), 227–250.

Palminteri, S., Lefebvre, G., Kilford, E. J., & Blakemore, S. J. (2017). Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing. PLoS Computational Biology, 13 (8), e1005684.

Pelham, B. W., & Swann, W. B. (1994). The juncture of intrapersonal and interpersonal knowledge: Self-certainty and interpersonal congruence. Personality and Social Psychology Bulletin, 20 (4), 349–357.

Peters, U. (2018). Illegitimate values, confirmation bias, and mandevillian cognition in science. British Journal for the Philosophy of Science. https://doi.org/10.1093/bjps/axy079.

Peters, U. (2019a). Implicit bias, ideological bias, and epistemic risks in philosophy. Mind & Language, 34, 393–419. https://doi.org/10.1111/mila.12194.

Peters, U. (2019b). The complementarity of mindshaping and mindreading. Phenomenology and the Cognitive Sciences, 18 (3), 533–549.

Peters, U., Honeycutt, N., De Block, A., & Jussim, L. (forthcoming). Ideological diversity, hostility, and discrimination in philosophy. Philosophical Psychology. Available online: https://philpapers.org/archive/PETIDH-2.pdf.

Pinker, S. (2012). The false allure of group selection. Retrieved July 20, 2012 from http://edge.org/conversation/the-false-allure-of-group-selection .

Rabin, M., & Schrag, J. L. (1999). First impressions matter: A model of confirmatory bias. Quarterly Journal of Economics, 114 (1), 37–82.

Richerson, P., & Boyd, R. (2001). The evolution of subjective commitment to groups: A tribal instincts hypothesis. In R. M. Nesse (Ed.), Evolution and the capacity for commitment (pp. 186–202). New York: Russell Sage Found.

Richerson, P., & Boyd, R. (2005). Not by genes alone: How culture transformed human evolution . Chicago: University of Chicago Press.

Roberts, S. C., van Vugt, M., & Dunbar, R. I. M. (2012). Evolutionary psychology in the modern world: Applications, perspectives, and strategies. Evolutionary Psychology, 10, 762–769.

Schuck, P. H. (2001). The perceived values of diversity, then and now. Cardozo Law Review, 22, 1915–1960.

Sharot, T., Korn, C. W., & Dolan, R. J. (2011). How unrealistic optimism is maintained in the face of reality. Nature Neuroscience, 14, 1475–1479.

Simpson, J. A., & Beckes, L. (2010). Evolutionary perspectives on prosocial behavior. In M. Mikulincer & P. Shaver (Eds.), Prosocial motives, emotions, and behavior: The better angels of our nature (pp. 35–53). Washington, DC: American Psychological Association.

Smart, P. (2018). Mandevillian intelligence. Synthese, 195, 4169–4200.

Smith, J. J., & Wald, B. (2019). Collectivized intellectualism. Res Philosophica, 96 (2), 199–227.

Sniezek, J. A., & Van Swol, L. M. (2001). Trust, confidence, and expertise in a judge–advisor system. Organizational Behavior and Human Decision Processes, 84, 288–307.

Snyder, M., & Klein, O. (2007). Construing and constructing others: On the reality and the generality of the behavioral confirmation scenario. In P. Hauf & F. Forsterling (Eds.), Making minds (pp. 47–60). Amsterdam/Philadelphia: John Benjamins.

Solomon, G. B. (2016). Improving performance by means of action–cognition coupling in athletes and coaches. In M. Raab, B. Lobinger, S. Hoffman, A. Pizzera, & S. Laborde (Eds.), Performance psychology: Perception, action, cognition, and emotion (pp. 88–101). London, England: Elsevier Academic Press.

Stangor, C. (2011). Principles of social psychology . Victoria, BC: BCcampus.

Stanovich, K., West, R., & Toplak, M. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22, 259–264.

Steel, D. (2018). Wishful thinking and values in science: Bias and beliefs about injustice. Philosophy of Science. https://doi.org/10.1086/699714.

Sterelny, K. (2006). Memes revisited. British Journal for the Philosophy of Science, 57, 145–165.

Sterelny, K. (2007). Social intelligence, human intelligence and niche construction. Philosophical Transactions of the Royal Society B, 362, 719–730.

Sterelny, K. (2018). Why reason? Hugo Mercier’s and Dan Sperber’s the enigma of reason: A new theory of human understanding. Mind and Language, 33 (5), 502–512.

Stibel, J. (2018). Fake news: How our brains lead us into echo chambers that promote racism and sexism. USA Today . Retrieved October 8, 2018 from https://eu.usatoday.com/story/money/columnist/2018/05/15/fake-news-social-media-confirmation-bias-echo-chambers/533857002/ .

Swann, W. B. (1983). Self-verification: Bringing social reality into harmony with the self. In J. Suls & A. G. Greenwald (Eds.), Social psychological perspectives on the self (Vol. 2, pp. 33–66). London: Erlbaum.

Swann, W. B., Jr. (2012). Self-verification theory. In P. A. M. Van Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of theories of social psychology (pp. 23–42). Beverley Hills, CA: Sage Publications Ltd.

Swann, W., & Ely, R. (1984). A battle of wills: Self-verification versus behavioral confirmation. Journal of Personality and Social Psychology, 46, 1287–1302.

Swann, W. B., Jr., Stein-Seroussi, A., & Giesler, B. (1992). Why people self-verify. Journal of Personality and Social Psychology, 62, 392–406.

Taber, C., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769.

Talaifar, S., & Swann, W. B. (2017). Self-verification theory. In L. Goossens, M. Maes, S. Danneel, J. Vanhalst, & S. Nelemans (Eds.), Encyclopedia of personality and individual differences (pp. 1–9). Berlin: Springer.

Tomasello, M. (2014). The ultra-social animal. European Journal of Social Psychology, 44, 187–194.

Tooby, J., & Cosmides, L. (2015). The theoretical foundations of evolutionary psychology. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 3–87). Hoboken, NJ: Wiley.

Tormala, Z. L. (2016). The role of certainty (and uncertainty) in attitudes and persuasion. Current Opinion in Psychology, 10, 6–11.

Trouche, E., et al. (2016). The selective laziness of reasoning. Cognitive Science, 40, 2122–2136.

Turnwald, B., et al. (2018). Learning one’s genetic risk changes physiology independent of actual genetic risk. Nature Human Behaviour. https://doi.org/10.1038/s41562-018-0483-4.

von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34 (1), 1–16.

Wenger, A., & Fowers, B. J. (2008). Positive illusions in parenting: Every child is above average. Journal of Applied Social Psychology, 38 (3), 611–634.

West, S. A., Griffin, A. S., & Gardner, A. (2007). Social semantics: How useful has group selection been? Journal of Evolutionary Biology, 21, 374–385.

Westra, E. (2020). Folk personality psychology: Mindreading and mindshaping in trait attribution. Synthese. https://doi.org/10.1007/s11229-020-02566-7.

Whitaker, R. M., Colombo, G. B., & Rand, D. G. (2018). Indirect reciprocity and the evolution of prejudicial groups. Scientific Reports, 8 (1), 13247. https://doi.org/10.1038/s41598-018-31363-z.

Whittlestone, J. (2017). The importance of making assumptions: Why confirmation is not necessarily a bias . Ph.D. Thesis. Coventry: University of Warwick.

Willard, J., & Madon, S. (2016). Understanding the connections between self-fulfilling prophecies and social problems. In S. Trusz & P. Przemysław Bąbel (Eds.), Interpersonal and intrapersonal expectancies (pp. 117–125). London: Routledge.

Willard, J., Madon, S., Guyll, M., Spoth, R., & Jussim, L. (2008). Self-efficacy as a moderator of negative and positive self-fulfilling prophecy effects: Mothers’ beliefs and children’s alcohol use. European Journal of Social Psychology, 38, 499–520.

Word, C. O., Zanna, M. P., & Cooper, J. (1974). The nonverbal mediation of self-fulfilling prophecies in interracial interaction. Journal of Experimental Social Psychology, 10, 109–120.

Zawidzki, T. (2008). The function of folk psychology: Mind reading or mind shaping? Philosophical Explorations, 11 (3), 193–210.

Zawidzki, T. (2013). Mindshaping: A new framework for understanding human social cognition. Cambridge: MIT Press.

Zebrowitz, L. A., & Montepare, J. M. (2008). Social psychological face perception: Why appearance matters. Social and Personality Psychology Compass, 2, 1497–1517.

Acknowledgements

Many thanks to Andreas De Block, Mikkel Gerken, and Alex Krauss for comments on earlier drafts. The research for this paper was partly funded by the Danmarks Frie Forskningsfond Grant no: 8018-00053B allocated to Mikkel Gerken.

Author information

Authors and Affiliations

Department of Philosophy, University of Southern Denmark, Odense, Denmark

Department of Psychology, King’s College London, De Crespigny Park, Camberwell, London, SE5 8AB, UK

Corresponding author

Correspondence to Uwe Peters .

About this article

Peters, U. What Is the Function of Confirmation Bias? Erkenntnis, 87, 1351–1376 (2022). https://doi.org/10.1007/s10670-020-00252-1

Received: 07 May 2019

Accepted: 27 March 2020

Published: 20 April 2020

Issue Date: June 2022


Effectiviology

The Confirmation Bias: Why People See What They Want to See

The Confirmation Bias

The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs. For example, if someone is presented with a lot of information on a certain topic, the confirmation bias can cause them to only remember the bits of information that confirm what they already thought.

The confirmation bias influences people’s judgment and decision-making in many areas of life, so it’s important to understand it. In the following article, you will first learn more about the confirmation bias, and then see how you can reduce its influence, both in other people’s thought processes and in your own.

How the confirmation bias affects people

The confirmation bias promotes various problematic patterns of thinking, such as people’s tendency to ignore information that contradicts their beliefs. It does so through several types of biased cognitive processes:

  • Biased search for information. This means that the confirmation bias causes people to search for information that confirms their preexisting beliefs, and to avoid information that contradicts them.
  • Biased favoring of information. This means that the confirmation bias causes people to give more weight to information that supports their beliefs, and less weight to information that contradicts them.
  • Biased interpretation of information. This means that the confirmation bias causes people to interpret information in a way that confirms their beliefs, even if the information could be interpreted in a way that contradicts them.
  • Biased recall of information. This means that the confirmation bias causes people to remember information that supports their beliefs and to forget information that contradicts them, or to remember supporting information as having been more supporting than it really was, or to incorrectly remember contradictory information as having supported their beliefs.

Note: one closely related phenomenon is cherry picking. It involves focusing only on evidence that supports one’s stance, while ignoring evidence that contradicts it. People often engage in cherry picking due to the confirmation bias, though a person can also cherry-pick deliberately, fully aware of what they’re doing and unaffected by the bias.
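
To see how the biased favoring of information can entrench a belief, consider the following toy simulation. It is only a sketch: the update rule, step size, and weights are illustrative assumptions, not a model drawn from the psychological literature.

```python
import random

random.seed(1)

def biased_update(belief, supports, w_confirm=1.0, w_disconfirm=0.3):
    """Nudge a belief (the subjective probability that some claim is true)
    by one piece of evidence. The step size and the asymmetric weights are
    illustrative assumptions, not empirical estimates."""
    step = 0.05
    if supports:
        return belief + step * w_confirm * (1 - belief)
    return belief - step * w_disconfirm * belief

belief = 0.6  # a mild initial belief
for _ in range(200):
    # A perfectly balanced evidence stream: 50% supporting, 50% contradicting.
    belief = biased_update(belief, supports=random.random() < 0.5)

print(f"Belief after balanced evidence: {belief:.2f}")  # drifts toward ~0.75+
```

Even though the evidence stream is perfectly balanced, the asymmetric weighting pulls the belief upward, mirroring how giving more weight to supporting information can strengthen a belief that the evidence, taken evenly, does not support.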

Examples of the confirmation bias

One example of the confirmation bias is someone who searches online to supposedly check whether a belief that they have is correct, but ignores or dismisses all the sources that state that it’s wrong. Similarly, another example of the confirmation bias is someone who forms an initial impression of a person, and then interprets everything that this person does in a way that confirms this initial impression.

Furthermore, other examples of the confirmation bias appear in various domains. For instance, the confirmation bias can affect:

  • How people view political information. For example, people generally prefer to spend more time looking at information that supports their political stance and less time looking at information that contradicts it.
  • How people assess pseudoscientific beliefs. For example, people who believe in pseudoscientific theories tend to ignore information that disproves those theories.
  • How people invest money. For example, investors give more weight to information that confirms their preexisting beliefs regarding the value of certain stocks.
  • How scientists conduct research. For example, scientists often display the confirmation bias when they selectively analyze and interpret data in a way that confirms their preferred hypothesis.
  • How medical professionals diagnose patients. For example, doctors often search for new information in a selective manner that will allow them to confirm their initial diagnosis of a patient, while ignoring signs that this diagnosis could be wrong.

In addition, an example of how the confirmation bias can influence people appears in the following quote, which references the prevalent misinterpretation of evidence during witch trials in the 17th century:

“When men wish to construct or support a theory, how they torture facts into their service!” — From “Extraordinary Popular Delusions and the Madness of Crowds”

Similarly, another example of how people display the confirmation bias is the following:

“… If the new information is consonant with our beliefs, we think it is well founded and useful: ‘Just what I always said!’ But if the new information is dissonant, then we consider it biased or foolish: ‘What a dumb argument!’ So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief.” — From “Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts”

Overall, examples of the confirmation bias appear in various domains. These examples illustrate the many ways in which it can affect people, and show that this bias is highly prevalent, including among trained professionals who are often assumed to assess information in a purely rational manner.

Psychology and causes of the confirmation bias

The confirmation bias can be attributed to two main cognitive mechanisms:

  • Challenge avoidance , which is the desire to avoid finding out that you’re wrong.
  • Reinforcement seeking , which is the desire to find out that you’re right.

These forms of motivated reasoning can be attributed to people’s underlying desire to minimize their cognitive dissonance, which is the psychological distress that occurs when people hold two or more contradictory beliefs simultaneously. Challenge avoidance can reduce dissonance by reducing engagement with information that contradicts preexisting beliefs. Conversely, reinforcement seeking can reduce dissonance by increasing engagement with information that affirms people’s sense of correctness, even if they encounter contradictory information later.

Furthermore, the confirmation bias also occurs due to flaws in the way we test hypotheses. For example, when people try to find an explanation for a certain phenomenon, they tend to focus on only one hypothesis at a time and disregard alternative hypotheses, even in cases where they’re not emotionally incentivized to confirm their initial hypothesis. This can cause people to simply try to prove that their initial hypothesis is true, instead of actually checking whether it’s true, which leads them to ignore the possibility that the information they encounter could disprove this initial hypothesis or support alternative hypotheses.

An example of this is a doctor who forms an initial diagnosis of a patient, and who then focuses solely on trying to prove that this diagnosis is right, instead of trying to actively determine whether alternative diagnoses could make more sense.

This explains why people can experience an unmotivated confirmation bias in situations where they have no emotional reason to favor a specific hypothesis over others. This contrasts with a motivated confirmation bias, which occurs when the person displaying the bias is driven by some emotional consideration.
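
A classic illustration of this flawed, positive-test approach to hypothesis testing is Wason’s rule discovery task, in which people must figure out a hidden rule governing triples of numbers (famously introduced with the triple 2-4-6). The sketch below is a simplified, hypothetical rendering of that dynamic, not an implementation of the actual experiment:

```python
def hidden_rule(triple):
    """The experimenter's actual rule (unknown to the tester):
    any strictly ascending triple of numbers."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The tester's initial hypothesis: numbers increasing by exactly 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# A positive test strategy: only try triples that FIT the hypothesis.
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5), (100, 102, 104)]
print(all(hidden_rule(t) for t in positive_tests))  # True: every test 'confirms'

# A single negative test (a triple violating the hypothesis) would have
# exposed the mismatch immediately:
print(hidden_rule((1, 2, 10)))    # True  -> rule is broader than hypothesized
print(my_hypothesis((1, 2, 10)))  # False
```

Because every positive test fits both the tester’s hypothesis and the broader hidden rule, confirmation keeps accumulating while the hypothesis stays wrong; only a test designed to fail the hypothesis can reveal the mismatch.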

Finally, the confirmation bias can also be attributed to a number of additional causes. For example, in the case of the motivated confirmation bias, an additional reason why people experience the bias is that the brain sometimes suppresses neural activity in areas associated with emotional regulation and emotionally neutral reasoning. This causes people to process information based on how their emotions guide them, rather than on how their logic would.

Overall, people experience the confirmation bias primarily because they want to minimize psychological distress, and specifically due to challenge avoidance, the desire to avoid finding out that they’re wrong, and reinforcement seeking, the desire to find out that they’re right. Furthermore, people can also experience the confirmation bias due to other causes, such as the flawed way they test hypotheses, as when they fixate on confirming a single hypothesis while ignoring alternatives.

Note: Some of the behaviors that people engage in due to the confirmation bias can be viewed as a form of selective exposure. This involves people choosing to engage only with information that supports their preexisting beliefs and decisions, while ignoring information that contradicts them.

How to reduce the confirmation bias

Reducing other people’s confirmation bias

There are various things that you can do to reduce the influence that the confirmation bias has on people. These methods generally revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.

As such, these methods generally involve trying to get people to overcome their tendency to focus on and prefer confirmatory information, or their tendency to avoid and reject challenging information, while also encouraging them to conduct a valid reasoning process.

Specifically, the following are some of the most notable techniques that you can use to reduce the confirmation bias in people:

  • Explain what the confirmation bias is, why we experience it, how it affects us, and why it can be a problem, potentially using relevant examples. Understanding this phenomenon better can motivate people to avoid it, and can help them deal with it more effectively, by helping them recognize when and how it affects them. Note that in some cases, it may be beneficial to point out the exact way in which a person is displaying the confirmation bias.
  • Make it so that the goal is to find the right answer, rather than defend an existing belief. For example, consider a situation where you’re discussing a controversial topic with someone, and you know for certain that they’re wrong. If you argue hard against them, that might cause them to get defensive and feel that they must stick by their initial stance regardless of whatever evidence you show them. Conversely, if you state that you’re just trying to figure out what the right answer is, and discuss the topic with them in a friendly manner, that can make them more open to considering the challenging evidence that you present. In this case, your goal is to frame your debate as a journey that you go on together in search of the truth, rather than a battle where you fight each other to prove the other wrong. The key here is that, when it comes to a joint journey, both of you can be “winners”, while in the case of a battle, only one of you can, and the other person will often experience the confirmation bias to avoid feeling that they were the “loser”.
  • Minimize the unpleasantness and issues associated with finding out that they’re wrong. In general, the more unpleasant and problematic being wrong is, the more a person will use the confirmation bias to stick by their initial stance. There are various ways in which you can make the experience of being wrong less unpleasant or problematic, such as by emphasizing the value of learning new things, and by avoiding mocking people for having held incorrect beliefs.
  • Encourage people to avoid letting their emotional response dictate their actions. Specifically, explain that while it’s natural to want to avoid challenges and seek reinforcement, letting these feelings dictate how you process information and make decisions is problematic. This means, for example, that if you feel that you want to avoid a certain piece of information, because it might show that you’re wrong, then you should realize this, but choose to see that information anyway.
  • Encourage people to give information sufficient consideration. When it comes to avoiding the confirmation bias, it often helps to engage with information in a deep and meaningful way, since shallow engagement can lead people to rely on biased intuitions, rather than on proper analytical reasoning. There are various things that people can do to ensure that they give information sufficient consideration, such as spending a substantial amount of time considering it, or interacting with it in an environment that has no distractions.
  • Encourage people to avoid forming a hypothesis too early. Once people have a specific hypothesis in mind, they often try to confirm it, instead of trying to formulate and test other possible hypotheses. As such, it can often help to encourage people to process as much information as possible before forming their initial hypothesis (a minimal simulation of this pitfall appears after this list).
  • Ask people to explain their reasoning. For example, you can ask them to clearly state what their stance is, and what evidence has caused them to support that stance. This can help people identify potential issues in their reasoning, such as that their stance is unsupported.
  • Ask people to think about various reasons why their preferred hypothesis might be wrong. This can help them test their preferred hypothesis in ways that they might not otherwise, and can make them more likely to accept and internalize challenging information.
  • Ask people to think about alternative hypotheses, and why those hypotheses might be right. Similarly to asking people to think about reasons why their preferred hypothesis might be wrong, this can encourage people to engage in a proper reasoning process, which they might not do otherwise. Note that, when doing this, it is generally better to focus on a small number of alternative hypotheses, rather than a large number of them.
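To make the hypothesis-testing pitfall concrete, here is a minimal Python sketch (our illustration, not taken from any study cited here) of a Wason-style rule-discovery task: an agent who only generates tests that fit their specific hypothesis receives nothing but positive feedback, because the true rule is more general, and only a deliberately disconfirming test exposes the difference.

```python
import random

def true_rule(triple):
    """The experimenter's rule: any strictly increasing triple."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """The agent's overly specific hypothesis: numbers increase by exactly 2."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Positive-test strategy: only generate triples that fit the hypothesis.
for n in random.sample(range(100), 5):
    test = (n, n + 2, n + 4)
    assert true_rule(test)  # every confirming test gets positive feedback

# A disconfirming test: a triple that violates the hypothesis.
test = (1, 2, 3)
print(my_hypothesis(test), true_rule(test))  # False True: the hypothesis is too narrow
```

Because every hypothesis-confirming test also satisfies the broader true rule, an agent who never tests alternatives can accumulate unlimited “supporting” evidence for a wrong hypothesis.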

Different techniques will be more effective for reducing the confirmation bias in different situations, and it is generally most effective to use a combination of techniques, while taking into account relevant situational and personal factors.

Furthermore, in addition to the above techniques, which are aimed at reducing the confirmation bias in particular, there are additional debiasing techniques that you can use to help people overcome their confirmation bias. This includes, for example, getting people to slow down their reasoning process, creating favorable conditions for optimal decision making, and standardizing the decision-making process.

Overall, to reduce the confirmation bias in others, you can use various techniques that revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place. This includes, for example, making people aware of this bias, making discussions be about finding the right answer instead of defending an existing belief, minimizing the unpleasantness associated with being wrong, encouraging people to give information sufficient consideration, and asking people to think about why their preferred hypothesis might be wrong or why competing hypotheses could be right.

Reducing your own confirmation bias

To mitigate the confirmation bias in yourself, you can use similar techniques to those that you would use to mitigate it in others. Specifically, you can do the following:

  • Identify when and how you’re likely to experience the bias.
  • Maintain awareness of the bias in relevant situations, and even actively ask yourself whether you’re experiencing it.
  • Figure out what kind of negative outcomes the bias can cause for you.
  • Focus on trying to find the right answer, rather than on proving that your initial belief was right.
  • Avoid feeling bad if you find out that you’re wrong; for example, try to focus on having learned something new that you can use in the future.
  • Don’t let your emotions dictate how you process information, particularly when it comes to seeking confirmation or avoiding challenges to your beliefs.
  • Dedicate sufficient time and mental effort when processing relevant information.
  • Avoid forming a hypothesis too early, before you’ve had a chance to analyze sufficient information.
  • Clearly outline your reasoning, for example by identifying your stance and the evidence that you’re basing it on.
  • Think of reasons why your preferred hypothesis might be wrong.
  • Come up with alternative hypotheses, as well as reasons why those hypotheses might be right.

An added benefit of many of these techniques is that they can help you understand opposing views better, which is important when it comes to explaining your own stance and communicating with others on the topic.

In addition, you can also use general debiasing techniques, such as standardizing your decision-making process and creating favorable conditions for assessing information.

Furthermore, keep in mind that, as is the case with reducing the confirmation bias in others, some techniques will be more effective than others, both in general and in particular circumstances. You should take this into account, and try to find the approach that works best for you in any given situation.

Finally, note that in some ways, debiasing yourself can be easier than debiasing others, since other people are often not as open to your debiasing attempts as you yourself are. At the same time, however, debiasing yourself is also more difficult in some ways, since we often struggle to notice our own blind spots, and to identify areas where we are affected by cognitive biases in general, and the confirmation bias in particular.

Overall, to reduce the confirmation bias in yourself, you can use similar techniques to those that you would use to reduce it in others. This includes, for example, maintaining awareness of this bias, focusing on trying to find the right answer rather than proving that you were right, dedicating sufficient time and effort to analyzing information, clearly outlining your reasoning, thinking of reasons why your preferred hypothesis might be wrong, and coming up with alternative hypotheses.

Additional information

Related cognitive biases

There are many cognitive biases that are closely associated with the confirmation bias, either because they involve a similar pattern of reasoning, or because they occur, at least partly, due to an underlying confirmation bias.

For example, there is the backfire effect, which is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance. This bias can, for instance, cause people to increase their support for a political candidate after they encounter negative information about that candidate, or to strengthen their belief in a scientific misconception after they encounter evidence that highlights the issues with that misconception. The backfire effect is closely associated with the confirmation bias, since it involves the rejection of challenging evidence, with the goal of confirming one’s original beliefs.

Another example of a cognitive bias that is closely related to the confirmation bias is the halo effect, which is a cognitive bias that causes people’s impression of someone or something in one domain to influence their impression of them in other domains. This bias can, for instance, cause people to assume that if someone is physically attractive, then they must also have an interesting personality, or it can cause people to give higher ratings to an essay if they believe that it was written by an attractive author. The halo effect is closely associated with the confirmation bias, since it can be attributed in some cases to people’s tendency to confirm their initial impression of someone, by forming later impressions of them in a biased manner.

The origin and history of the confirmation bias

The term ‘confirmation bias’ was first used in a 1977 paper titled “Confirmation bias in a simulated research environment: An experimental study of scientific inference”, published by Clifford R. Mynatt, Michael E. Doherty, and Ryan D. Tweney in the Quarterly Journal of Experimental Psychology (Volume 29, Issue 1, pp. 85-95). However, as the authors themselves note, evidence of the confirmation bias can be found earlier in the psychological literature.

Specifically, the following passage is the abstract of the paper that coined the term. It outlines the work presented in the paper, and also notes the existence of prior work on the topic:

“Numerous authors (e.g., Popper, 1959) argue that scientists should try to falsify rather than confirm theories. However, recent empirical work (Wason and Johnson-Laird, 1972) suggests the existence of a confirmation bias, at least on abstract problems. Using a more realistic, computer controlled environment modeled after a real research setting, subjects in this study first formulated hypotheses about the laws governing events occurring in the environment. They then chose between pairs of environments in which they could: (1) make observations which would probably confirm these hypotheses, or (2) test alternative hypotheses. Strong evidence for a confirmation bias involving failure to choose environments allowing tests of alternative hypotheses was found. However, when subjects did obtain explicit falsifying information, they used this information to reject incorrect hypotheses.”

In addition, a number of other past studies are discussed in the paper:

“Examples abound of scientists clinging to pet theories and refusing to seek alternatives in the face of large amounts of contradictory data (see Kuhn, 1970). Objective evidence, however, is scant. Wason (1968a) has conducted several experiments on inferential reasoning in which subjects were given conditional rules of the form ‘If P then Q’, where P was a statement about one side of a stimulus card and Q a statement about the other side. Four stimulus cards, corresponding to P, not-P, Q, and not-Q were provided. The subjects’ task was to indicate those cards—and only those cards—which had to be turned over in order to determine if the rule was true or false. Most subjects chose only P, or P and Q. The only cards which can falsify the rule, however, are P and not-Q. Since the not-Q card is almost never selected, the results indicate a strong tendency to seek confirmatory rather than disconfirmatory evidence. This bias for selecting confirmatory evidence has proved remarkably difficult to eradicate (see Wason and Johnson-Laird, 1972, pp. 171-201). In another set of experiments, Wason (1960, 1968b, 1971) also found evidence of failure to consider alternative hypotheses. Subjects were given the task of recovering an experimenter defined rule for generating numerical sequences. The correct rule was a very general one and, consequently, many incorrect specific rules could generate sequences which were compatible with the correct rule. Most subjects produced a few sequences based upon a single, specific rule, received positive feedback, and announced mistakenly that they had discovered the correct rule. With some notable exceptions, what subjects did not do was to generate and eliminate alternative rules in a systematic fashion. Somewhat similar results have been reported by Miller (1967). Finally, Mitroff (1974), in a large-scale non-experimental study of NASA scientists, reports that a strong confirmation bias existed among many members of this group. He cites numerous examples of these scientists’ verbalizations of their own and other scientists’ obduracy in the face of data as evidence for this conclusion.”
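As an illustrative aside (our sketch, not part of the paper quoted above), the logic of Wason’s card task can be checked mechanically: for the rule “If P then Q”, only the P and not-Q cards can possibly reveal a falsifying P-and-not-Q combination, which is why choosing P and Q, as most subjects do, amounts to seeking confirmation rather than disconfirmation.

```python
# Wason selection task: rule "If P then Q". Each card has a P-side
# (P or not-P) and a Q-side (Q or not-Q); only one side is visible.
# The rule is falsified only by a card that is both P and not-Q.

def can_falsify(visible):
    """Could turning this card over reveal a P-and-not-Q counterexample?"""
    if visible in ("P", "not-P"):
        hidden_options = ("Q", "not-Q")
    else:
        hidden_options = ("P", "not-P")
    return any({visible, hidden} == {"P", "not-Q"} for hidden in hidden_options)

print([card for card in ("P", "not-P", "Q", "not-Q") if can_falsify(card)])
# -> ['P', 'not-Q']; most subjects pick P and Q instead
```

The gap between the cards subjects actually pick and the cards that can falsify the rule is precisely the confirmatory tendency the paper describes.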

Summary and conclusions

  • The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs.
  • The confirmation bias affects people in every area of life; for example, it can cause people to disregard negative information about a political candidate that they support, or to only pay attention to news articles that support what they already think.
  • People experience the confirmation bias due to various reasons, including challenge avoidance (the desire to avoid finding out that they’re wrong), reinforcement seeking (the desire to find out that they’re right), and flawed testing of hypotheses (e.g., fixating on a single explanation from the start).
  • To reduce the confirmation bias in yourself and in others, you can use various techniques that revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.
  • Relevant debiasing techniques you can use include maintaining awareness of this bias, focusing on trying to find the right answer rather than being proven right, dedicating sufficient time and effort to analyzing relevant information, clearly outlining the reasoning process, thinking of reasons why a preferred hypothesis might be wrong, and coming up with alternative hypotheses and reasons why those hypotheses might be right.

Other articles you may find interesting:

  • The Backfire Effect: Why Facts Don't Always Change Minds
  • Cherry Picking: When People Ignore Evidence that They Dislike
  • Belief Bias: When People Rely on Beliefs Rather Than Logic

IResearchNet

Confirmation Bias


Confirmation Bias Background and History

The confirmation bias is one example of how humans sometimes process information in an illogical, biased manner. Many factors of which people are unaware can influence information processing. Philosophers note that humans have difficulty processing information in a rational, unbiased manner once they have developed an opinion about an issue. Humans are better able to rationally process information, giving equal weight to multiple viewpoints, if they are emotionally distant from the issue.


One explanation for why humans are susceptible to the confirmation bias is that it is an efficient way to process information. Humans are bombarded with information in the social world and cannot possibly take the time to carefully process each piece of information to form an unbiased conclusion. Human decision making and information processing are often biased because people are limited to interpreting information from their own viewpoint. People need to process information quickly to protect themselves from harm. It is adaptive to rely on instinctive, automatic reflexes that keep humans out of harm’s way.

Another reason people show the confirmation bias is to protect their self-esteem. People like to feel good about themselves, and discovering that a belief that they highly value is incorrect makes people feel bad about themselves. Therefore, people will seek information that supports their existing beliefs. Another motive is accuracy. People want to feel that they are intelligent, and information that suggests one holds an inaccurate belief or made a poor decision suggests one is lacking intelligence.

Confirmation Bias Evidence

The confirmation bias is strong and widespread, occurring in several contexts. In the context of decision making, once an individual makes a decision, he or she will look for information that supports the decision. Information that conflicts with the decision may cause discomfort and is therefore ignored or given little consideration. People give special treatment to information that supports their personal beliefs. In studies examining the my-side bias, people were able to generate and remember more reasons supporting their side of a controversial issue than the opposing side. Only when a researcher directly asked people to generate arguments against their own beliefs were they able to do so. Often when people generate arguments against their beliefs, the arguments may be used selectively or even distorted or misremembered to ultimately support the existing belief. It is not that people are incapable of generating arguments that are counter to their beliefs; rather, people are not motivated to do so.

The confirmation bias also surfaces in people’s tendency to look for positive instances. When seeking information to support their hypotheses or expectations, people tend to identify information that demonstrates a hypothesis to be true rather than look for information that the opposite view is false.

The confirmation bias also operates in impression formation. If people are told what to expect from a person they are about to meet, such as that the person is warm, friendly, and outgoing, people will look for information that supports their expectations. When interacting with people who perceivers think have certain personalities, the perceivers will ask questions of those people that are biased toward supporting the perceivers’ beliefs. For example, if Maria expects her roommate to be friendly and outgoing, Maria may ask her if she likes to go to parties rather than if she often studies in the library.

Importance of Confirmation Bias

The confirmation bias is important because it may lead people to hold strongly to false beliefs or to give more weight to information that supports their beliefs than is warranted by the evidence. People may be overconfident in their beliefs because they have accumulated evidence to support them, when in reality much evidence refuting their beliefs was overlooked or ignored, which, if considered, would lead to less confidence in one’s beliefs. These factors may lead to risky decision making and lead people to overlook warning signs and other important information.

Implications of Confirmation Bias

The confirmation bias has important implications in the real world, including in medicine, law, and interpersonal relationships. Research has shown that medical doctors are just as likely to have confirmation biases as everyone else. Doctors often have a preliminary hunch regarding the diagnosis of a medical condition early in the treatment process. This hunch often interferes with considering information that may indicate an alternative diagnosis is more likely. Another related outcome is how patients react to diagnoses. Patients are more likely to agree with a diagnosis that supports their preferred outcome than a diagnosis that goes against their preferred outcome. Both of these examples demonstrate that the confirmation bias has implications for individuals’ health and well-being. In the context of law, judges and jurors often form an opinion about a defendant’s guilt or innocence before all of the evidence is known. Once an opinion is formed, new information obtained during a trial is likely to be processed according to the confirmation bias, which may lead to unjust verdicts. In interpersonal relations, the confirmation bias can be problematic because it may lead to forming inaccurate and biased impressions of others. This may result in miscommunication and conflict in intergroup settings. In addition, by treating someone according to expectations, that someone may unintentionally change his or her behavior to conform to the expectations, thereby providing further support for the perceiver’s confirmation bias.


University of Central Florida Pressbooks

Thinking and Intelligence

Pitfalls to Problem Solving

Learning Objectives

  • Explain some common roadblocks to effective problem solving

Not all problems are successfully solved, however. What challenges stop us from successfully solving a problem? Albert Einstein once said, “Insanity is doing the same thing over and over again and expecting a different result.” Imagine a person in a room that has four doorways. One doorway that has always been open in the past is now locked. The person, accustomed to exiting the room by that particular doorway, keeps trying to get out through the same doorway even though the other three doorways are open. The person is stuck—but she just needs to go to another doorway, instead of trying to get out through the locked doorway. A mental set is where you persist in approaching a problem in a way that has worked in the past but is clearly not working now.

Functional fixedness is a type of mental set where you cannot perceive an object being used for something other than what it was designed for. During the Apollo 13 mission to the moon, NASA engineers at Mission Control had to overcome functional fixedness to save the lives of the astronauts aboard the spacecraft. An explosion in a module of the spacecraft damaged multiple systems. The astronauts were in danger of being poisoned by rising levels of carbon dioxide because of problems with the carbon dioxide filters. The engineers found a way for the astronauts to use spare plastic bags, tape, and air hoses to create a makeshift air filter, which saved the lives of the astronauts.

Link to Learning

Check out this Apollo 13 scene where the group of NASA engineers are given the task of overcoming functional fixedness.

Researchers have investigated whether functional fixedness is affected by culture. In one experiment, individuals from the Shuar group in Ecuador were asked to use an object for a purpose other than that for which the object was originally intended. For example, the participants were told a story about a bear and a rabbit that were separated by a river and asked to select among various objects, including a spoon, a cup, erasers, and so on, to help the animals. The spoon was the only object long enough to span the imaginary river, but if the spoon was presented in a way that reflected its normal usage, it took participants longer to choose the spoon to solve the problem (German & Barrett, 2005). The researchers wanted to know if exposure to highly specialized tools, as occurs with individuals in industrialized nations, affects their ability to transcend functional fixedness. It was determined that functional fixedness is experienced in both industrialized and nonindustrialized cultures (German & Barrett, 2005).

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. For example, let’s say you and three friends wanted to rent a house and had a combined target budget of $1,600. The realtor shows you only very run-down houses for $1,600 and then shows you a very nice house for $2,000. Might you ask each person to pay more in rent to get the $2,000 home? Why would the realtor show you the run-down houses and the nice house? The realtor may be challenging your anchoring bias. An anchoring bias occurs when you focus on one piece of information when making a decision or solving a problem. In this case, you’re so focused on the amount of money you are willing to spend that you may not recognize what kinds of houses are available at that price point.

The confirmation bias is the tendency to focus on information that confirms your existing beliefs. For example, if you think that your professor is not very nice, you notice all of the instances of rude behavior exhibited by the professor while ignoring the countless pleasant interactions he is involved in on a daily basis. This bias illustrates that first impressions do matter and that we tend to look for information to confirm our initial judgments of others.

You can view the transcript for “Confirmation Bias: Your Brain is So Judgmental” here (opens in new window) .

Hindsight bias leads you to believe that the event you just experienced was predictable, even though it really wasn’t. In other words, you knew all along that things would turn out the way they did. Representative bias describes a faulty way of thinking, in which you unintentionally stereotype someone or something; for example, you may assume that your professors spend their free time reading books and engaging in intellectual conversation, because the idea of them spending their time playing volleyball or visiting an amusement park does not fit in with your stereotypes of professors.

Finally, the availability heuristic is a heuristic in which you make a decision based on an example, information, or recent experience that is readily available to you, even though it may not be the best example to inform your decision. To use a common example, would you guess there are more murders or more suicides in America each year? When asked, most people would guess there are more murders. In truth, there are twice as many suicides as there are murders each year. However, murders seem more common because we hear a lot more about murders on an average day. Unless someone we know or someone famous takes their own life, it does not make the news. Murders, on the other hand, we see in the news every day. This leads to the erroneous assumption that the easier it is to think of instances of something, the more often that thing occurs.

Watch the following video for an example of the availability heuristic.

You can view the transcript for “Availability Heuristic: Are Planes More Dangerous Than Cars?” here (opens in new window) .

Biases tend to “preserve that which is already established—to maintain our preexisting knowledge, beliefs, attitudes, and hypotheses” (Aronson, 1995; Kahneman, 2011). These biases are summarized in Table 2 below.

Learn more about heuristics and common biases through the article, “8 Common Thinking Mistakes Our Brains Make Every Day and How to Prevent Them” by Belle Beth Cooper.

You can also watch this clever music video explaining these and other cognitive biases.

Think It Over

Which type of bias do you recognize in your own decision-making processes? How has this bias affected how you’ve made decisions in the past, and how can you use your awareness of it to improve your decision-making skills in the future?

CC licensed content, Original

  • Modification, adaptation, and original content. Provided by : Lumen Learning. License : CC BY: Attribution

CC licensed content, Shared previously

  • Problem Solving. Authored by : OpenStax College. Located at : https://openstax.org/books/psychology-2e/pages/7-3-problem-solving . License : Public Domain: No Known Copyright . License Terms : Download for free at https://openstax.org/books/psychology-2e/pages/1-introduction
  • More information on heuristics. Authored by : Dr. Scott Roberts, Dr. Ryan Curtis, Samantha Levy, and Dr. Dylan Selterman. Provided by : University of Maryland. Located at : http://openpsyc.blogspot.com/2014/07/heuristics.html . Project : OpenPSYC. License : CC BY-NC-SA: Attribution-NonCommercial-ShareAlike

Mental set: continually using an old solution to a problem without results

Functional fixedness: inability to see an object as useful for any other use other than the one for which it was intended

Anchoring bias: faulty heuristic in which you fixate on a single aspect of a problem to find a solution

Confirmation bias: seeking out information that supports our stereotypes while ignoring information that is inconsistent with our stereotypes

Hindsight bias: belief that the event just experienced was predictable, even though it really wasn’t

Representative bias: faulty heuristic in which you stereotype someone or something without a valid basis for your judgment

Availability heuristic: faulty heuristic in which you make a decision based on information readily available to you

General Psychology Copyright © by OpenStax and Lumen Learning is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.



PLOS ONE

Opinion Dynamics with Confirmation Bias

Armen E. Allahverdyan

1 Department of Theoretical Physics, Yerevan Physics Institute, Yerevan, Armenia

Aram Galstyan

2 USC Information Sciences Institute, Marina del Rey, California, United States of America

Conceived and designed the experiments: AA AG. Performed the experiments: AA. Analyzed the data: AA AG. Wrote the paper: AA.

Abstract

Confirmation bias is the tendency to acquire or evaluate new information in a way that is consistent with one's preexisting beliefs. It is omnipresent in psychology, economics, and even scientific practices. Prior theoretical research on this phenomenon has mainly focused on its economic implications, possibly missing its potential connections with broader notions of cognitive science.

Methodology/Principal Findings

We formulate a (non-Bayesian) model for revising the subjective probabilistic opinion of a confirmationally-biased agent in the light of a persuasive opinion. The revision rule ensures that the agent does not react to persuasion that is either far from his current opinion or coincides with it. We demonstrate that the model accounts for the basic phenomenology of the social judgment theory, and allows one to study various phenomena such as cognitive dissonance and the boomerang effect. The model also displays the order of presentation effect – when consecutively exposed to two opinions, preference is given to the last opinion (recency) or the first opinion (primacy) – and relates recency to confirmation bias. Finally, we study the model in the case of repeated persuasion and analyze its convergence properties.

Conclusions

The standard Bayesian approach to probabilistic opinion revision is inadequate for describing the observed phenomenology of persuasion process. The simple non-Bayesian model proposed here does agree with this phenomenology and is capable of reproducing a spectrum of effects observed in psychology: primacy-recency phenomenon, boomerang effect and cognitive dissonance. We point out several limitations of the model that should motivate its future development.

Introduction

Confirmation bias is the tendency to acquire or process new information in a way that confirms one's preconceptions and avoids contradiction with prior beliefs [52] . Various manifestations of this bias have been reported in cognitive psychology [5] , [67] , social psychology [24] , [54] , politics [46] and (media) economics [31] , [51] , [57] , [73] . Recent evidence suggests that scientific practices too are susceptible to various forms of confirmation bias [12] , [38] , [43] , [44] , [52] , even though the imperative of avoiding precisely this bias is frequently presented as one of the pillars of the scientific method.


We demonstrate that the proposed revision rule is consistent with the social judgment theory [10], and reproduces the so-called change-discrepancy relationship [10], [35], [40], [45], [69]. Furthermore, the well-studied weighted average approach [9], [27] for opinion revision is shown to be a particular case of our model.

Our analysis of the revision rule also reveals novel effects. In particular, it is shown that within the proposed approach, the recency effect is related to confirmation bias. Also, repeated persuasions are shown to exhibit certain monotonicity features, but do not obey the law of diminishing returns. We also demonstrate that the rule reproduces several basic features of the cognitive dissonance phenomenon and predicts new scenarios of its emergence. Finally, the so-called boomerang (backfire) effect can emerge as an extreme form of confirmation bias. The effect is given a straightforward mathematical description in qualitative agreement with experiments.

The rest of this paper is organized as follows. In the next section we introduce the problem setup and provide a brief survey of relevant work, specifically focusing on the inadequacy of the standard Bayesian approach to opinion revision under persuasion. In the third section we define our axioms and introduce the confirmationally biased opinion revision rule. The fourth section relates our setup to the social judgment theory. The next two sections describe how our model accounts for two basic phenomena of experimental social psychology: opinion change versus discrepancy and the order of presentation effect. The seventh section shows how our model formalizes features of cognitive dissonance, followed by an analysis of opinion change under repeated persuasion. Then we study the boomerang effect – the agent changes his opinion not towards the persuasion, but against it – as a particular case of our approach. We summarize and conclude in the last section.

The Set-Up and Previous Research


The normative standard for opinion revision is related to the Bayesian approach. Below we discuss the main elements of the Bayesian approach, and outline certain limitations that motivate the non-Bayesian revision rule suggested in this work.


It is worthwhile to note that researchers have studied several aspects of confirmation bias by looking at certain deviations from the Bayes rule, e.g. when the conditional probabilities are available, but the agent does not apply the proper Bayes rule, deviating from it in certain aspects [31], [51], [57], [73]. One example of this is when the (functional) form of the conditional probability is changed depending on the evidence received or on the prior probabilities. Another example is when the agent does not employ the full available evidence and selects only the evidence that can potentially confirm his prior expectations [39], [48], [67]. More generally, one has to differentiate between two aspects of the confirmation bias, which can be displayed either with respect to information acquisition, or information assimilation (or both) [52]. Our study will concentrate on the information assimilation aspect; first, because this aspect is not studied sufficiently well, and second, because it seems to be more directly linked to cognitive limitations [52]. We also stress that we focus on belief revision, and not on actions an agent might perform based on those beliefs.
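For contrast with the non-Bayesian rule developed below, the following sketch (ours, using made-up numbers) shows a standard Bayesian update over two discrete hypotheses. Because the posterior is just a normalized product of the prior and the likelihoods, the order in which two pieces of evidence arrive cannot matter, which is one reason, noted later in the text, why the Bayesian approach cannot produce order-of-presentation effects.

```python
import numpy as np

def bayes_update(prior, likelihood):
    """Posterior is proportional to prior times likelihood, normalized over hypotheses."""
    posterior = prior * likelihood
    return posterior / posterior.sum()

prior = np.array([0.5, 0.5])   # two hypotheses, no initial preference
e1 = np.array([0.8, 0.2])      # evidence favoring hypothesis 1
e2 = np.array([0.3, 0.7])      # evidence favoring hypothesis 2

order_12 = bayes_update(bayes_update(prior, e1), e2)
order_21 = bayes_update(bayes_update(prior, e2), e1)
print(np.allclose(order_12, order_21))  # True: no primacy or recency effect
```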

Opinion Revision Rule

We propose the following conditions that the opinion revision rule should satisfy.

[conditions 1 and 2 are not reproduced]

Eq. (3) can be considered as a succession of such local processes.

[conditions 3 and 4 are not reproduced]

Conditions 3 and 4 are motivated by experimental results in social psychology, which state that people are not persuaded by opinions that are either very far, or very close to their initial opinion [10] , [17] , [69] .


5. F is a homogeneous function of order one: F(λp, λq) = λF(p, q) for any λ > 0.

Note that (8) does not satisfy conditions 2 and 3 above. We turn to the last ingredient of the sought rule, which, in particular, should achieve consistency with conditions 2 and 3.


To make the projection process (more) objective, we shall assume that it commutes with the probabilistic revision: whenever

[equations (10) and (11) are not reproduced]

This feature means that the projection is consistent with probability theory: it does not matter whether (3) is applied before or after (10).

It is known that (9) together with (10, 11) selects a unique function [30]:

[equation (12) is not reproduced]

The final opinion revision rule reads from (12, 8, 9):

[equation (13) is not reproduced]

It is seen to satisfy conditions 1–5.


The two processes were applied above in the specific order: first averaging (8), and then projection (9). We do not have any strong objective justifications for this order, although certain experiments on advising point to the order that led to (13) [72]. Thus, it is not excluded that the two sub-processes can be applied in the reverse order: first projection and then averaging. Then instead of (13) we get (3) with:

[equation is not reproduced]

Returning to (1), we note that k = x can be a continuous variable, if (for example) the forecast concerns the chance of having rain or the amount of rain. Then the respective probability densities are:

[equation is not reproduced]

Social Judgment Theory and Gaussian Opinions

Opinion latitudes

Here we discuss our model in the context of the social judgment theory [59], [10], and consider several basic scenarios of opinion change under the rule (16).

[equations (18) and (19), defining the latitudes of acceptance and rejection, are not reproduced]

where the latitude of non-commitment contains whatever is left out from (18, 19). Recall that the latitudes of acceptance, non-commitment and rejection carry (respectively) 95.4, 4.3 and 0.3% of probability.
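As a quick consistency check (ours, not from the paper), the quoted percentages correspond to the 2σ and 3σ regions of a Gaussian opinion density: acceptance within two standard deviations of the anchor, non-commitment between two and three, and rejection beyond three.

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

acceptance = Phi(2) - Phi(-2)           # within 2 sigma of the anchor
non_commitment = 2 * (Phi(3) - Phi(2))  # between 2 and 3 sigma
rejection = 2 * (1 - Phi(3))            # beyond 3 sigma
print(f"{acceptance:.3f} {non_commitment:.3f} {rejection:.3f}")
# -> 0.954 0.043 0.003, matching the 95.4 / 4.3 / 0.3% split quoted above
```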


Weighted average of anchors

Next, we demonstrate that the main quantitative theory of persuasion and opinion change – the weighted average approach [9], [27] – is a particular case of our model. We assume that the opinions p(x) and q(x) are given as

[equation is not reproduced]

Furthermore, we have

[equation is not reproduced]
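Since the displayed formulas did not survive extraction, here is a hedged numerical sketch of the weighted-average idea as it is usually stated in the persuasion literature: with Gaussian opinion densities, linearly mixing the densities with weights w and 1 - w moves the anchor (mean) to the corresponding weighted average of the two anchors. The specific anchors, width, and weight below are made up for illustration.

```python
import numpy as np

def gaussian(x, mean, sigma):
    """Gaussian probability density."""
    return np.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-10.0, 14.0, 4001)
dx = x[1] - x[0]
a, b, sigma = 0.0, 4.0, 1.0  # agent's anchor, persuader's anchor, common width
w = 0.7                      # weight the agent puts on his own opinion

p = gaussian(x, a, sigma)    # agent's opinion density
q = gaussian(x, b, sigma)    # persuasive opinion density

# Linear mixing of the densities averages the anchors:
mixture = w * p + (1 - w) * q
revised_anchor = np.sum(x * mixture) * dx   # mean of the mixed density
print(revised_anchor, w * a + (1 - w) * b)  # both are approximately 1.2
```

In the confirmation-biased setting, one would additionally let the weight depend on how far the persuasion sits from the agent’s current opinion, which is the behavior explored in the following sections.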

Opinions and bump-densities

Gaussian densities (with three latitudes) do correspond to the phenomenology of social psychology. However, in certain scenarios one might need other forms of densities, e.g., when the probability is strictly zero outside of a finite support. Such opinions can be represented by bump-functions

[equation is not reproduced]

Opinion Change vs Discrepancy


However, subsequent experiments revealed that the linear regime is restricted to small discrepancies only, and that the actual behavior of the opinion change as a function of the discrepancy is non-monotonic: the opinion change reaches its maximal value at some discrepancy and decreases afterward [10], [40], [45], [69].
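The paper’s equations are not reproduced here, so the following is only an illustrative toy rule (ours, not the model above): if the weight an agent gives to a persuasive anchor decays with its distance from his own anchor, the resulting opinion change grows roughly linearly for small discrepancies and then falls off, qualitatively reproducing the non-monotonic change-discrepancy curve just described.

```python
import numpy as np

def opinion_change(discrepancy, tau=3.0):
    """Toy rule: assimilation weight decays with the squared discrepancy."""
    weight = np.exp(-(discrepancy / tau) ** 2)  # distant persuasion is discounted
    return weight * discrepancy                 # change = weight x discrepancy

for d in np.linspace(0.0, 10.0, 11):
    print(f"discrepancy {d:4.1f} -> opinion change {opinion_change(d):5.2f}")
# The change peaks near tau / sqrt(2) and decreases afterward.
```

Only the qualitative shape is meant to match the experiments; the peak location and decay rate here are arbitrary.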


Order of Presentation

Recency versus primacy

When an agent is consecutively presented with two persuasive opinions, his final opinion is sensitive to the order of presentation [10], [13], [25], [34], [35], [50], [52]. While the existence of this effect is largely established, its direction is a more convoluted matter. (Note that the order of presentation effect is not predicted by the Bayesian approach; see (2).) Some studies suggest that the first opinion matters more (primacy effect), whereas other studies advocate that the last interaction is more important (recency effect). While it is not completely clear which experimentally (un)controlled factors are responsible for primacy and recency, there is a widespread tendency to relate the primacy effect to confirmation bias [13], [52]. This relation involves a qualitative argument that we scrutinize below.


Note that the primacy-recency effect is only one (though important!) instance of contextual and non-commutative phenomena in psychology; see [11] , [66] and references therein. Hence in section IV of File S1 we study a related (though somewhat less interesting) order of presentation effect, while below we discuss our findings in the context of experimental results.

Experimental studies of order of presentation effect

We now discuss our findings in this section in the context of experimental results on primacy and recency. The latter can be roughly divided into several groups: persuasion tasks [10], [50], symbol recalling [70], inference tasks [34], and impression formation [7], [9]. In all those situations one generally observes both primacy and recency, though in different proportions and under different conditions [34]. Generally, the recency effect is observed whenever the retention time (the time between the last stimulus and the data taking) is short. If this time is sufficiently long, however, the recency effect changes to the primacy effect [10], [50], [62], [70]. The general interpretation of these results is that there are two different processes involved, which operate on different time-scales. These processes can be conventionally related to short-term and long-term memory [70], with the primacy effect related to the long-term memory. In our model the longer time process is absent. Hence, it is natural that we see only the recency effect. The prevalence of recency effects is also seen in inference tasks, where the analogue of the short retention time is the incremental (step-by-step) opinion revision strategy [34].


At the same time, in another interesting study based on subjective probability revision, where the authors had taken special measures for minimizing the attention decrement, the results indicated a primacy effect [55] .

We close this section by underlining the advantages and drawbacks of the present model concerning the primacy-recency effect: the main advantage is that it demonstrates the recency effect and shows that the well-known argument relating confirmation bias to primacy does not hold generally. The main drawback is that the model does not involve processes that are supposedly responsible for the experimentally observed interplay between recency and primacy. In the concluding section we discuss possible extensions of the model that can account for this interplay.

Cognitive Dissonance

Consider an agent whose opinion probability density has two peaks on widely separated events. Such a density – with the most probable opinion being different from the average – is indicative of cognitive dissonance, where the agent believes in mutually conflicting things [10] , [26] .

The main qualitative scenario for the emergence of cognitive dissonance is when an agent – who initially holds a probabilistic opinion with a single peak – is exposed to conflicting information coming from a sufficiently credible source [10], [26]. We now describe this scenario quantitatively.


There are 3 options for reducing cognitive dissonance:

[the three options, referred to as (i)–(iii) below, are not reproduced]

The existence of at least two widely different probable opinions is only one aspect of cognitive dissonance [10], [26]. Another aspect (sometimes called Freud-Festinger’s law) is that people tend to avoid cognitive dissonance: if in their actions they choose one of the two options (i.e. one of two peaks of the subjective probability), they re-write the history of their opinion revision so that the chosen option becomes the most probable one [10], [26]. This aspect of cognitive dissonance has found applications in economics and decision making [2], [73]. The above points (i)–(iii) provide concrete scenarios for such a re-writing.

Repeated Persuasion

Here we analyze the opinion dynamics under repeated persuasion attempts. Our motivation for studying this problem is that repeated exposure to the same opinion is generally believed to be more persuasive than a single exposure.


We conclude by stressing that while repeated persuasions drive the opinion to its fixed point monotonically in the number of repetitions, it is generally not true that the first persuasion causes the largest opinion change, i.e. the law of diminishing returns does not hold. To obtain the largest opinion change, one should carefully choose the number of repetitions.
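As a hedged illustration of this last point, the toy distance-discounted rule from the earlier sketch (ours, not the paper’s rule (42)) can be iterated toward a fixed persuasive anchor: when the initial discrepancy is large, the first repetition barely moves the opinion, while later repetitions, made from within the zone where persuasion is effective, move it more, so the largest single change need not be the first.

```python
import math

def step(opinion, persuasion, tau=3.0):
    """One distance-discounted move toward the persuasive anchor (toy rule)."""
    d = persuasion - opinion
    return opinion + math.exp(-(d / tau) ** 2) * d

opinion, persuasion = 0.0, 5.0
for k in range(1, 6):
    revised = step(opinion, persuasion)
    print(f"persuasion {k}: change = {revised - opinion:.3f}")
    opinion = revised
# Changes grow (0.311, 0.407, 0.558, ...): no law of diminishing returns.
```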

Finally, note that the framework of (42) can be applied to studying mutual persuasion (consensus reaching). This is described in Section VII of File S1; see also [23] in this context.

Boomerang (Backfire) Effect

Definition of the effect


Experimental studies indicate that the boomerang effect is frequently associated with opinion formation in an affective state, where there are emotional reasons for (not) changing the opinion. For example, clear evidence of the boomerang effect is observed when the persuasion contains insulting language [1]. Another interesting example is when the subjects had already announced their opinion publicly, and were not only reluctant to change it (as for the usual conservatism), but even reinforced it in the light of contrary evidence [64] (in these experiments, the subjects who did not make their opinion public did not display the boomerang effect). A similar situation is realized for voters who have decided to support a certain candidate. After hearing the candidate criticized, the voters display a boomerang response to this criticism and thereby increase their support [53], [58].

Opinion revision rule

We now suggest a simple modification of our model that accounts for the basic phenomenology of the boomerang effect.

[equation (48) is not reproduced]

with obvious generalization to probability densities. The absolute values in (48) are necessary to ensure the positivity of probabilities.

[equation (49) is not reproduced]

We obtain (48) after applying (9, 10) to (49).

Scenarios of opinion change

[the analysis of these scenarios is not reproduced]

Thus, in the present model, the primacy effect (relevance of the first opinion) can be related to the boomerang effect.


The basic idea of the opinion revision rule is that no opinion change is expected if the persuasion is either too far or too close to the already existing opinion [15] , [36] , [60] . The opinion revision rule is not Bayesian, because the standard Bayesian approach does not apply to processes of persuasion and advising; see the second section for more details.


New effects predicted by the model are summarized as follows.

(i) For the order of presentation set-up (and outside of the boomerang regime), the model displays the recency effect. We suggested that the standard argument relating confirmation bias to the primacy effect does not hold in this model. In this context we recall a widespread viewpoint that both recency and primacy relate to (normative) irrationality; see e.g. [13]. However, the information which came later is generally more relevant for predicting the future. Hence recency can be more rational than primacy.

In many experimental set-ups the recency changes to primacy upon increasing the retention time; see e.g. [70]. Our model demonstrates the primacy effect only in the boomerang regime (i.e., only in a special case). Hence, in the future it needs to be extended with additional mechanisms, e.g. those related to “long-term memory” processes, which could be responsible for the above experimental fact. Recall in this context that there are several other theoretical approaches that address the primacy-recency difference [11], [34], [42], [56], [66].

(ii) The model can be used to describe the phenomenon of cognitive dissonance and to formalize the main scenario of its emergence.

(iii) Repeated persuasions display several features implying monotonic change of the target opinion towards the persuading opinion. However, the opinion changes do not obey the law of diminishing returns; in other words, the first persuasion does not always lead to the largest change. These findings may contribute to a better understanding of the widespread use of repeated persuasions.


In this paper we restricted ourselves to studying a few (two or three) interacting agents with opinions described via subjective probabilities. However, these probabilities can also represent an ensemble of agents, each one having a fixed (single) opinion, a useful viewpoint on subjective probabilities advocated in Ref. [37]. In the future we plan to explore this point and also address opinion dynamics for collectives of agents. This last aspect was recently extensively studied via methods of statistical physics; see [20], [63] for reviews.

Supporting Information

Acknowledgments

We thank Seth Frey for useful remarks and suggestions.

Funding Statement

This research was supported by DARPA grant No. W911NF-12-1-0034 and AFOSR MURI grant No. FA9550-10-1-0569. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

  • 25th May 2018

How confirmation bias stops us solving problems

This is the third blog in our Behavioural Government series, which explores how behavioural insights can be used to improve how government itself works.

Confirmation bias is the tendency to seek out, interpret, judge and remember information so that it supports one’s pre-existing views and ideas.

Confirmation bias can make people less likely to engage with information which challenges their views. An example of this is a recent study of 376 million Facebook users, which found that many preferred to get their news from a small number of sources they already agreed with.

Even when people do get exposed to challenging information, confirmation bias can cause them to reject it and, perversely, become even more certain that their own beliefs are correct.

One famous experiment gave students two scientific studies – one that supported capital punishment, and one that opposed it. The students denigrated whichever study went against their pre-existing opinion, and left the lab embracing their original position even more passionately.

The mental process which helps explain this behaviour is called motivated reasoning. What is worrying is that motivated reasoning may actually reduce our ability to understand and interpret evidence, and so make us less likely to be swayed by reasoned argument.

This is illustrated by a recent Danish study which showed elected politicians (hypothetical) satisfaction statistics for two different schools, then asked them to identify the best-performing one. Around 75% answered correctly when the options were labelled innocuously (e.g. “School A” and “School B”). However, these results changed dramatically when the options were framed in terms of public vs private services (e.g. “Private School” and “Public School”), a contentious issue in Danish politics.

Figure 1 shows that when the correct answer was in line with their pre-existing beliefs about public services (i.e. the politician strongly believed in the value of public services and the correct answer was that the public school was better), 92% of politicians chose correctly. But only 56% got it right when the answer was at odds with their beliefs (i.e. the politician strongly believed in the value of public services and the correct answer was that the private school was better).


Figure 1. Relationship between prior attitudes and correct interpretations of statistical data among 127 Danish politicians.


In our view, confirmation bias is one of the most pervasive and problematic cognitive biases that affects policy making. For that reason, it is also one of the hardest to tackle. However, we think that there are realistic improvements to be made.

Sign up to our mailing list to be among the first to hear about them when we release our full Behavioural Government report.


Dr Michael Hallsworth

Managing Director, BIT Americas


Dr Mark Egan

Principal Research Advisor

Related content

  • 15th Aug 2018

Central banking: when communication is the policy

  • 2nd May 2018

Behavioural Government: A major new initiative from BIT


  • 3rd Aug 2016

How can government make better use of data science? Insights from the first Data Science & Government Conference


Problem-solving and decision making

Problem-solving refers to a way of reaching a goal from a present condition, where the present condition is either not directly moving toward the goal, is far from it, or needs more complex logic in order to find steps toward the goal.

Types of problem-solving

There are considered to be two major domains in problem-solving: mathematical problem-solving, which involves problems capable of being represented by symbols, and personal problem-solving, where some difficulty or barrier is encountered.

Within these domains of problem-solving, there are a number of approaches that can be taken. A person may decide to take a trial and error approach and try different approaches to see which one works the best. Or they may decide to use an algorithm approach following a set of rules and steps to find the correct approach. A heuristic approach can also be taken where a person uses previous experiences to inform their approach to problem-solving.


Barriers to effective problem solving 

Barriers to effective problem-solving exist; they can be categorized by their features and the tasks required to overcome them.

The mental set is a barrier to problem-solving: an unconscious tendency to approach a problem in a particular way. Our mental sets are shaped by our past experiences and habits. Functional fixedness is a special type of mental set that occurs when the intended purpose of an object hinders a person’s ability to see its potential other uses.

Unnecessary constraints are another barrier that shows up in problem-solving, causing people to unconsciously place boundaries on the task at hand.

Irrelevant information is a barrier when information presented as part of a problem is unrelated or unimportant to that problem and will not help solve it. Typically, it detracts from the problem-solving process, as it may seem pertinent and distract people from finding the most efficient solution.

Confirmation bias is a barrier to problem-solving. It exists when a person tends to look for information that supports their idea or approach instead of looking at new information that may contradict their approach or ideas.

Strategies for problem-solving

There are many strategies that can make solving a problem easier and more efficient. Two of them, algorithms and heuristics, are of particularly great psychological importance.

A heuristic is a rule of thumb, a strategy, or a mental shortcut that generally works for solving a problem (particularly decision-making problems). It is a practical method, one that is not a hundred per cent guaranteed to be optimal or even successful, but is sufficient for the immediate goal. Working backwards is a useful heuristic in which you begin solving the problem by focusing on the end result. Another useful heuristic is the practice of accomplishing a large goal or task by breaking it into a series of smaller steps.

An algorithm is a set of steps for solving a problem. Unlike a heuristic, you are guaranteed to get the correct solution to the problem; however, an algorithm may not necessarily be the most efficient way of solving the problem. Additionally, you need to know the algorithm (i.e., the complete set of steps), which is not usually realistic for the problems of daily life.
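To make the contrast concrete, here is a small sketch (our example, with made-up numbers): both approaches look for a subset of expenses that exactly uses up a budget. The exhaustive algorithm is guaranteed to find such a subset if one exists, while the greedy heuristic is faster but can miss it.

```python
from itertools import combinations

def exhaustive_algorithm(costs, budget):
    """Algorithm: try every subset; guaranteed to find an exact fit if one exists."""
    for r in range(1, len(costs) + 1):
        for combo in combinations(costs, r):
            if sum(combo) == budget:
                return combo
    return None

def greedy_heuristic(costs, budget):
    """Heuristic: repeatedly take the largest cost that still fits; fast but fallible."""
    chosen, remaining = [], budget
    for cost in sorted(costs, reverse=True):
        if cost <= remaining:
            chosen.append(cost)
            remaining -= cost
    return tuple(chosen) if remaining == 0 else None

costs = [7, 5, 4, 3]
print(exhaustive_algorithm(costs, 9))  # (5, 4): brute force finds the exact fit
print(greedy_heuristic(costs, 9))      # None: greedy takes 7 first and gets stuck
```

Here the rule of thumb fails on an input where a solution exists, while the exhaustive search, at the cost of checking every subset, cannot miss it.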

Biases can affect problem-solving ability by directing a problem-solving heuristic or algorithm based on prior experience.

In order to make good decisions, we use our knowledge and our reasoning. Often, this knowledge and reasoning is sound and solid. Sometimes, however, we are swayed by biases or by others manipulating a situation. Several forms of bias can influence our decision-making process and problem-solving ability:

Anchoring bias – Tendency to focus on one particular piece of information when making decisions or solving problems

Confirmation bias – Focuses on information that confirms existing beliefs

Hindsight bias – Belief that the event just experienced was predictable

Representative bias – Unintentional stereotyping of someone or something

Availability bias – Decision is based upon either an available precedent or an example that may be faulty

Belief bias – Casting judgment on issues based on what someone already believes about the conclusion. A good example is belief perseverance, which is the tendency to hold on to pre-existing beliefs despite being presented with contradictory evidence.

Khan Academy

MCAT Official Prep (AAMC)

Sample Test P/S Section Passage 3 Question 12

Practice Exam 2 P/S Section Passage 8 Question 40

Practice Exam 2 P/S Section Passage 8 Question 42

Practice Exam 4 P/S Section Question 12

  • Problem-solving arises when a person is presented with one of two types of problems – mathematical or personal

  • Barriers to problem-solving exist, such as a person’s mental set, unnecessary constraints on their thinking, or the presence of irrelevant information

  • People can typically employ a number of strategies in problem-solving, such as heuristics, where a general rule of thumb is applied without a guaranteed result, or algorithms, where a complete set of steps guarantees a correct solution

• Biases can affect problem-solving ability by directing a problem-solving heuristic or algorithm based on prior experience.

Mental set: an unconscious tendency to approach a problem in a particular way

Problem : the difference between the current situation and a goal

Algorithm: problem-solving strategy characterized by a specific set of instructions

Anchoring bias: faulty heuristic in which you fixate on a single aspect of a problem to find a solution

Availability bias : faulty heuristic in which you make a decision based on information readily available to you

Confirmation bias : faulty heuristic in which you focus on information that confirms your beliefs

Functional fixedness: inability to see an object as useful for anything other than the use for which it was intended

Heuristic : mental shortcut that saves time when solving a problem

Hindsight bias : belief that the event just experienced was predictable, even though it really wasn’t

Problem-solving strategy : a method for solving problems

Representative bias:  faulty heuristic in which you stereotype someone or something without a valid basis for your judgment

Working backwards: heuristic in which you begin to solve a problem by focusing on the end result


Gary Klein Ph.D.

The Curious Case of Confirmation Bias

The concept of confirmation bias has passed its sell-by date.

Posted May 5, 2019 | Reviewed by Devon Frye

Confirmation bias is the tendency to search for data that can confirm our beliefs, as opposed to looking for data that might challenge those beliefs. The bias degrades our judgments when our initial beliefs are wrong because we might fail to discover what is really happening until it is too late.

To demonstrate confirmation bias, Pines (2006) provides a hypothetical example (which I have slightly modified) of an overworked Emergency Department physician who sees a patient at 2:45 a.m.—a 51-year-old man who has come in several times in recent weeks complaining of an aching back. The staff suspects that the man is seeking prescriptions for pain medication. The physician, believing this is just one more such visit, does a cursory examination and confirms that all of the man's vital signs are fine—consistent with what was expected. The physician does give the man a new prescription for a pain reliever and sends the man home—but because he was only looking for what he expected, he missed the subtle problem that required immediate surgery.

The concept of confirmation bias appears to rest on three claims:

  • First, firm evidence, going back 60 years, has demonstrated that people are prone to confirmation bias.
  • Second, confirmation bias is clearly a dysfunctional tendency.
  • Third, methods of debiasing are needed to help us to overcome confirmation bias.

The purpose of this essay is to look closely at these claims and explain why each one of them is wrong.

Claim #1: Firm evidence has demonstrated that people are prone to confirmation bias.

Confirmation bias was first described by Peter Wason (1960), who asked participants in an experiment to guess at a rule about number triples. The participants were told that the sequence 2-4-6 fit that rule. They could generate their own triples and they would get feedback on whether or not their triple fit the rule. When they had collected enough evidence, they were to announce their guess about what the rule was.

Wason found that the participants tested only positive examples—triples that fit their theory of what the rule was. The actual rule was any three ascending numbers, such as 2, 3, 47. However, given the 2-4-6 starting point, many participants generated triples that were even numbers, ascending and also increasing by two. Participants didn’t try sequences that might falsify their theory (e.g., 6-4-5). They were simply trying to confirm their beliefs.

At least, that’s the popular story. Reviewing the original Wason data reveals a different story. Wason’s data on the number triples (e.g., 2-4-6) showed that six of the 29 participants correctly guessed the rule on the very first trial, and several of these six did use probes that falsified a belief.
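
To make positive and falsifying probes concrete, here is a minimal Python sketch of the task (my own illustration; the particular triples below are assumptions, not Wason's stimuli):

# The hidden rule: any three ascending numbers.
def hidden_rule(a, b, c):
    return a < b < c

# A participant's working theory: even numbers ascending by two.
def my_theory(a, b, c):
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Positive probes: triples the theory predicts will fit. All of them
# pass, so they can never distinguish the theory from the hidden rule.
for triple in [(2, 4, 6), (10, 12, 14), (20, 22, 24)]:
    print(triple, hidden_rule(*triple))  # True every time

# A falsifying probe: a triple the theory predicts will NOT fit.
# It fits the hidden rule anyway, so the theory is disproved.
probe = (1, 3, 47)
print(probe, my_theory(*probe), hidden_rule(*probe))  # False, True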

Most of the other participants in that study seemed to take the task lightly because it seemed so simple—but after getting feedback that their first guess was wrong, they realized that there was only one right answer and they'd have to do more analysis. Almost half of the remaining 23 participants immediately shaped up—10 guessed correctly on the second trial, with many of these also making use of negative probes (falsifications).

Therefore, the impression found in the literature is highly misleading. The impression is that in this Wason study—the paradigm case of confirmation bias—the participants showed a confirmation effect. But when you look at all the data, most of the participants were not trapped by confirmation bias. Only 13 of the 29 participants failed to solve the problem in the first two trials. (By the fifth trial, 23 of the 29 had solved the problem.)

The takeaway should have been that most people do test their beliefs. However, Wason chose to headline the bad news. The abstract to his paper states that “The results show that those [13] subjects, who reached two or more incorrect solutions, were unable, or unwilling, to test their hypotheses.” (p. 129).

Since then, several studies have obtained results that challenge the common beliefs about confirmation bias. These studies showed that most people actually are thoughtful enough to prefer genuinely diagnostic tests when given that option (Kunda, 1999; Trope & Bassok, 1982; Devine et al., 1990).

In the cognitive interviews I have conducted, I have seen some people trying to falsify their beliefs. One fireground commander, responding to a fire in a four-story apartment building, saw that the fire was in a laundry chute and seemed to be just beginning. He believed that he and his crew had arrived before the fire had a chance to spread up the chute—so he ordered an immediate attempt to suppress it from above, sending his crew to the 2nd and 3rd floors.

But he also worried that he might be wrong, so he circled the building. When he noticed smoke coming out of the eaves above the top floor, he realized that he was wrong. The fire must have already reached the 4th floor and the smoke was spreading down the hall and out the eaves. He immediately told his crew to stop trying to extinguish the fire and instead to shift to search and rescue for the inhabitants. All of them were successfully rescued, even though the building was severely damaged.

Another difficulty with Claim #1 is that confirmation bias tends to disappear when we add context. In a second study, Wason (1968) used a four-card problem to demonstrate confirmation bias. For example: Four cards are shown, each of which has a number on one side and a color on the other. The visible faces show 3, 8, red, and brown. Participants are asked, “Which two cards should you turn over to test the claim that if a card has an even number on one face, then its opposite face is red?” (This is a slight variant of Wason’s original task.)

Most people turn over cards two and three. Card two, showing an “8,” is a useful test because if the opposite face is not red, the claim is disproved. But turning over card three, “red,” is a useless test because the claim is not that only cards with even numbers on one side have a red opposite face. Selecting card three illustrates confirmation bias.
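
As a concrete check on that logic, here is a minimal Python sketch (my own illustration; the hidden faces are one assumed possibility) showing which cards can actually falsify the claim "if even, then red":

# Cards as (visible_face, hidden_face); the hidden faces are assumed.
cards = [("3", "red"), ("8", "brown"), ("red", "7"), ("brown", "4")]

def is_even_number(face):
    return face.isdigit() and int(face) % 2 == 0

def worth_turning(visible):
    # A card can falsify "even => red" only if it might pair
    # an even number with a non-red opposite face.
    if visible.isdigit():
        return is_even_number(visible)  # an even card might hide a non-red face
    return visible != "red"             # a non-red card might hide an even number

for visible, _ in cards:
    print(visible, "worth turning:", worth_turning(visible))
# 3: False, 8: True, red: False, brown: True -- the "red" card tests nothing.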

However, Griggs and Cox (1982) applied some context to the four-card problem—they situated the task in a tavern with a barkeeper intent on following the law about underage drinking. Now the question took the form, “Which two of these cards should you turn over to test the claim that in this bar, ‘If you are drinking alcohol then you must be over 19’?” Griggs and Cox found that 73 percent of the participants now chose “16” and the beer—meaning the confirmation bias effect seen in Wason’s version had mostly vanished.

Therefore, the first claim about the evidence for confirmation bias does not seem warranted.

Claim #2: Confirmation bias is clearly a dysfunctional tendency.

Advocates for confirmation bias would argue that the bias can still get in the way of good decision making. They would assert that even if the data don’t really support the claim that people fall prey to confirmation bias, we should still, as a safeguard, warn decision-makers against the tendency to support their pre-existing beliefs.

But that ploy, to discourage decision-makers from seeking to confirm their pre-existing beliefs, won’t work because confirmation attempts often do make good sense. Klayman and Ha (1987) explained that under high levels of uncertainty, positive tests are more informative than negative tests (i.e., falsifications). Klayman and Ha refer to a “positive test strategy” as having clear benefits.
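
The logic of their argument can be shown with a minimal Python sketch (a toy setup of my own, not Klayman and Ha's materials; the numeric rules are assumptions). When the true rule is rare, a positive test is far more likely to catch a wrong hypothesis than a negative test is:

# A universe of instances, a rare true rule, and a wrong working hypothesis.
universe = range(1, 10001)

def true_rule(n):   # the actual (rare) rule
    return n % 50 == 0

def hypothesis(n):  # our working hypothesis (wrong, but overlapping)
    return n % 40 == 0

positives = [n for n in universe if hypothesis(n)]      # predicted to fit
negatives = [n for n in universe if not hypothesis(n)]  # predicted not to fit

# A probe is diagnostic when its prediction turns out to be wrong.
p_pos = sum(not true_rule(n) for n in positives) / len(positives)
p_neg = sum(true_rule(n) for n in negatives) / len(negatives)

print(f"chance a positive test falsifies: {p_pos:.2f}")  # ~0.80
print(f"chance a negative test falsifies: {p_neg:.2f}")  # ~0.02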

As a result of this work, many researchers in the judgment and decision-making community have reconsidered their view that the confirmation tendency is a bias that needs to be overcome. Confirmation bias seems to be losing its force within the scientific community, even as it echoes in various applied communities.

Think about it: Of course we use our initial beliefs and frames to guide our explorations. How else would we search for information? Sometimes we can be tricked, in a cleverly designed study. Sometimes we trick ourselves when our initial belief is wrong. The use of our initial beliefs, gained through experience, isn’t perfect. However, it is not clear that there are better ways of proceeding in ambiguous and uncertain settings.

We seem to have a category error here—people referring to the original Wason data on the triples and the four cards (even though these data are problematic) and then stretching the concept of confirmation bias to cover all kinds of semi-related or even unrelated problems, usually with hindsight: If someone makes a mistake, then the researchers hunt for some aspect of confirmation bias. As David Woods observed, "The focus on confirmation bias commits hindsight bias."

For all these reasons, the second claim that the confirmation tendency is dysfunctional doesn’t seem warranted. We are able to make powerful use of our experience to identify a likely initial hypothesis and then use that hypothesis to guide the way we search for more data.

How would we search for data without using our experience? We wouldn’t engage in random search because that strategy seems highly inefficient. And I don’t think we would always try to search for data that could disprove our initial hypothesis, because that strategy won’t help us make sense of confusing situations. Even scientists do not often try to falsify their hypotheses, so there’s no reason to set this strategy up as an ideal for practitioners.

The confirmation bias advocates seem to be ignoring the important and difficult process of hypothesis generation, particularly under ambiguous and changing conditions. These are the kinds of conditions favoring the positive test strategy that Klayman and Ha studied.

Claim #3: Methods of debiasing are needed to help us to overcome confirmation bias.

For example, Lilienfeld et al. (2009) asserted that “research on combating extreme confirmation bias should be among psychological science’s most pressing priorities.” (p. 390). Many if not most decision researchers would still encourage us to try to debias decision-makers.

Unfortunately, that’s been tried and has gotten nowhere. Attempts to re-program people have failed. Lilienfeld et al. admitted that “psychologists have made far more progress in cataloguing cognitive biases… than in finding ways to correct or prevent them.” (p. 391). Arkes (1981) concluded that psychoeducational methods by themselves are “absolutely worthless.” (p. 326). The few successes have been small and it is likely that many failures go unreported. One researcher whose work has been very influential in the heuristics and biases community has admitted to me that debiasing efforts don’t work.

And let’s imagine that, despite the evidence, a debiasing tactic was developed that was effective. How would we use that tactic? Would it prevent us from formulating an initial hypothesis without gathering all relevant information? Would it prevent us from speculating when faced with ambiguous situations? Would it require us to seek falsifying evidence before searching for any supporting evidence? Even the advocates acknowledge that confirmation tendencies are generally adaptive. So how would a debiasing method enable us to know when to employ a confirmation strategy and when to stifle it?

Making this a little more dramatic, if we could surgically excise the confirmation tendency, how many decision researchers would sign up for that procedure? After all, I am not aware of any evidence that debiasing the confirmation tendency improves decision quality or makes people more successful and effective. I am not aware of data showing that a falsification strategy has any value. The Confirmation Surgery procedure would eliminate confirmation bias but would leave the patients forever searching for evidence to disconfirm any beliefs that might come to their minds to explain situations. The result seems more like a nightmare than a cure.

One might still argue that there are situations in which we would want to identify several hypotheses, as a way of avoiding confirmation bias. For example, physicians are well-advised to do differential diagnosis, identifying the possible causes for a medical condition. However, that’s just good practice. There’s no need to invoke a cognitive bias. There’s no need to try to debias people.

For these reasons, I suggest that the third claim about the need for debiasing methods is not warranted.

What about the problem of implicit racial biases? That topic is not really the same as confirmation bias, but I suspect some readers will be making this connection, especially given all of the effort to set up programs to overcome implicit racial biases. My first reaction is that the word “bias” is ambiguous. “Bias” can mean a prejudice, but this essay uses “bias” to mean a dysfunctional cognitive heuristic, with no consideration of prejudice, racial or otherwise. My second reaction is to point the reader to the weakened consensus on implicit bias and the concession made by Greenwald and Banaji (the researchers who originated the concept of implicit bias) that the Implicit Association Test doesn’t predict biased behavior and shouldn’t be used to classify individuals as likely to engage in discriminatory behavior.

Conclusions

Where does that leave us?

Fischhoff and Beyth-Marom (1983) complained about this expansion: “Confirmation bias, in particular, has proven to be a catch-all phrase incorporating biases in both information search and interpretation. Because of its excess and conflicting meanings, the term might best be retired.” (p. 257).

I have mixed feelings. I agree with Fischhoff and Beyth-Marom that over the years, the concept of confirmation bias has been stretched—or expanded—beyond Wason’s initial formulation so that today it can refer to the following tendencies:

  • Search: to search only for confirming evidence (Wason’s original definition)
  • Preference: to prefer evidence that supports our beliefs
  • Recall: to best remember information in keeping with our beliefs
  • Interpretation: to interpret evidence in a way that supports our beliefs
  • Framing: to use mistaken beliefs to misunderstand what is happening in a situation
  • Testing: to ignore opportunities to test our beliefs
  • Discarding: to explain away data that don’t fit with our beliefs

I see this expansion as a useful evolution, particularly the last three issues of framing, testing, and discarding. These are problems I have seen repeatedly. With this expansion, researchers will perhaps be more successful in finding ways to counter confirmation bias and improve judgments.

Nevertheless, I am skeptical. I don’t think the expansion will be effective because researchers will still be going down blind alleys. Decision researchers may try to prevent people from speculating at the outset even though rapid speculation is valuable for guiding exploration. Decision researchers may try to discourage people from seeking confirming evidence, even though the positive test strategy is so useful. The whole orientation of correcting a bias seems misguided. Instead of appreciating the strength of our sensemaking orientation and trying to reduce the occasional errors that might arise, the confirmation bias approach typically tries to eliminate errors by inhibiting our tendencies to speculate and explore.

Fortunately, there seems to be a better way to address the problems of being captured by our initial beliefs, failing to test those beliefs, and explaining away inconvenient data—the concept of fixation. This concept is consistent with what we know of naturalistic decision making, whereas confirmation bias is not. Fixation doesn’t carry the baggage of confirmation bias in terms of the three unwarranted claims discussed in this essay. Fixation directly gets at a crucial problem of failing to revise a mistaken belief.

And best of all, the concept of fixation provides a novel strategy for overcoming the problems of being captured by initial beliefs, failing to test those beliefs, and explaining away data that are inconsistent with those beliefs.

My next essay will discuss fixation and describe that strategy.

Arkes, H. (1981). Impediments to accurate clinical judgment and possible ways to minimize their impact. Journal of Consulting and Clinical Psychology, 49, 323-330.

Devine, P. G., Hirt, E. R., & Gehrke, E. M. (1990). Diagnostic and confirmation strategies in trait hypothesis testing. Journal of Personality and Social Psychology, 58, 952-963.

Fischhoff, B., & Beyth-Marom, R. (1983). Hypothesis evaluation from a Bayesian perspective. Psychological Review, 90, 239-260.

Griggs, R. A., & Cox, J. R. (1982). The elusive thematic-materials effect in Wason’s selection task. British Journal of Psychology, 73, 407-420.

Klayman, J., & Ha, Y.-W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94, 211-228.

Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Kunda, Z. (1999). Social cognition: Making sense of people. Cambridge, MA: MIT Press.

Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4, 390-398.

Oswald, M. E., & Grosjean, S. (2004). Confirmation bias. In R. F. Pohl (Ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory. Hove, UK: Psychology Press.

Pines, J. M. (2006). Confirmation bias in emergency medicine. Academic Emergency Medicine, 13, 90-94.

Trope, Y., & Bassok, M. (1982). Confirmatory and diagnosing strategies in social information gathering. Journal of Personality and Social Psychology, 43, 22-34.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. The Quarterly Journal of Experimental Psychology, 12, 129-140.

Wason, P. C. (1968). Reasoning about a rule. The Quarterly Journal of Experimental Psychology, 20, 273-281.

Gary Klein, Ph.D., is a senior scientist at MacroCognition LLC. His most recent book is Seeing What Others Don't: The remarkable ways we gain insights.


5.8 Biases and Errors in Thinking

5 min read • December 22, 2022

Sadiyya Holsey

Dalia Savy

Haseung Jun

Errors in Problem Solving

Because of our mental concepts and other processes, we may be biased or think of situations without an open mind. Let's discuss what those other processes are.

Fixation is thinking from only one point of view; it is the inability to approach a situation from different perspectives 👀 Fixation is often used interchangeably with your mental concept.

Functional Fixedness 

Functional fixedness is the tendency to only think of the familiar functions of an object.

An example of functional fixedness would be the candle problem. Individuals were given a box of thumbtacks, matches 🔥, and a candle 🕯️. Then they were asked to mount the candle on the wall in a way that the candle wax would not drip while it was lit.

Most of the subjects were unable to solve the problem. Some tried to pin the candle to the wall with a thumbtack. The successful method was to attach the box to the wall using the thumbtacks, then put the candle in the box and light it.

Because of functional fixedness, individuals were unsuccessful because they couldn't see how a box 📦 could be more than just a container for something.

The following two heuristics can lead us to make poor decisions and snap judgements, which degrade our thinking.

Availability Heuristic

The availability heuristic is the tendency to judge something based on the examples that come to mind most easily. When someone asks you "What is the first thing that comes to mind when you think of . . .," you are tapping into the availability heuristic.

Rather than thinking further about a topic, you just mention/assume other events based on the first thing that comes to your mind (or the first readily available concept in your mind).

This makes us fear the wrong things. Many parents may not let their children walk to school 🏫 because the only thing they can think of is that one kid who went missing ⚠️ This is the very first thing that comes to their mind, and because of it, they fear their children suffering the same fate.

Therefore, we really fear what is readily in our memory.

Image Courtesy of The Decision Lab.

Representativeness Heuristic

The representativeness heuristic is when you judge something based on how well it matches your prototype. This leads us to ignore relevant information and is honestly the root of stereotypes.

For example, if someone was asked to decide who most likely went to an Ivy League school (when looking at a truck driver 🚚 and a professor 👩‍🏫👨‍🏫), most people would say the professor. This doesn't mean that the professor actually went to an Ivy League school; this is just us being stereotypical because of our prototype for a person who goes to an Ivy.

There are so many different types of biases and we experience each and every one of them in our everyday lives.

Confirmation Bias 

Confirmation bias is the tendency of individuals to support or search for information that aligns with their opinions and ignore information that doesn't. This eventually leads us to become more polarized ⬅️➡️ as individuals, and it is another way of experiencing fixation.

A key example is how many Republicans 🔴 watch Fox News to view a channel that confirms their political beliefs. People really dislike it when others have differing opinions and continue to seek out information that backs up their own beliefs.

Belief Perseverance and Belief Bias

Belief perseverance is the tendency to hold onto a belief even if it has lost its credibility. This is different from belief bias , which is the tendency for our preexisting beliefs to distort logical thinking, making logical conclusions look illogical.

Halo Effect 

The halo effect is when positive impressions of people lead to positive views about their character and personality traits. For example, if you see someone as attractive, you may think of them as having better personality traits and character, even though this isn't necessarily true.

Self-Serving Bias 

Self-serving bias is when a person attributes positive outcomes to their own doing and negative outcomes to external factors.

For example, if you do well on a test 💯, you may think it makes sense because you did a good job of studying to prepare for the exam. But if you fail the test, you may put the blame on the teacher for not teaching all the material or for making the test too hard.

Attentional Bias 

Attentional bias is when people’s perceptions are influenced by recurring thoughts.

For example, if marine biology has been on your mind a lot lately, your conversations may include references to marine biology. You would also be more likely to notice information that relates to your thoughts (marine biology).

Actor-observer Bias

Actor-observer bias is when a person might attribute their own actions to external factors and the actions of others to internal factors.

For example, if you see someone else litter, you might think about how people are careless. But if you litter, you might say it was because there was no trash can 🗑️ within sight.

Anchoring Bias 

Anchoring bias is when an individual relies heavily on the first piece of information given when making a decision. The first piece of information acts as an anchor against which all subsequent information is compared.

Hindsight Bias

Hindsight bias is when you think you knew something all along after the outcome has occurred. People overestimate their ability to have predicted a certain outcome even if it couldn't possibly have been predicted. People often say "I knew that."

Framing impacts decisions and judgments. It's the way we present an issue, and it can be a very powerful persuasion tool.

For example, a doctor could say one of two things about a surgery:

10% of people die 😲

90% of people survive 😌

Obviously, "10% of people die" is a much more direct way to phrase the same thing, and it makes the surgery sound scarier than "90% of people survive." Framing is a very important tool!

Key Terms to Review (17)

Actor-Observer Bias

Anchoring Bias

Attentional Bias

Belief Bias

Belief Perseverance

Candle Problem

Confirmation Bias

Functional Fixedness

Halo Effect

Self-Serving Bias

