Processing large amounts of information is difficult, and decision-making carries a heavy cognitive load. For most decisions, it is virtually impossible to take all the available information into consideration. This is why we rely on heuristics - mental shortcuts - that help us make decisions that are good enough. Most of the time this works well, but the way we process information can also lead to systematic mistakes. These mistakes are called cognitive biases and come in many shapes and forms. One common bias is confirmation bias, our inclination to favor information that supports the beliefs we already hold. It appears in three forms:
- Biased search for information - when we have a hypothesis, we tend to look for evidence that supports it. This leads to selective gathering of information, using methods that are themselves biased. How we phrase questions is one example: asking “Are you unsatisfied with your job?” carries an implicit expectation and steers the answer toward dissatisfaction.
- Biased interpretation of information - confirmation bias makes us interpret situations, especially ambiguous ones, so that they align with our beliefs. We overlook parts of the information, exaggerate the importance of others, or simply ignore data that contradicts our position. This phenomenon is commonly called cherry-picking and is a serious pitfall in science and research.
- Biased memory - confirmation bias also leads to selective remembering. In a study by Snyder and Cantor (1978), participants read a description of a fictional person named Jane. Two days later, half of the subjects evaluated her for a librarian job and the other half for a salesperson job. The librarian group remembered more introverted traits and the salesperson group more extroverted traits, even though both types of attributes were equally prominent in the description of Jane.
Figure 1: Illustration of the confirmation bias. Figure by the author.
Confirmation bias is particularly strong when it comes to emotionally loaded issues, as it is difficult to abandon a belief that you are deeply invested in. In a 1979 study, Lord, Ross, and Lepper gathered a group of subjects, half of them supporting capital punishment and the other half opposing it. The participants read two fictional papers, one confirming and one disconfirming their standpoint. Both the supporters and the opposers rated the paper that was in line with their belief as more convincing and better argued, and they were more critical of the other one. When presented with further evidence, they expressed even stronger favor towards the findings that supported their position - leading to greater polarization between the two groups. Confirmation bias has several consequences:
- Cherry-picking and attitude polarization - biased interpretation of information makes standpoints more extreme and widens the gap between different opinions. A discussion between two parties can end up more polarized than it started, because each of them highlights the information that is in their favor.
- Clinging to one’s standpoint - when presented with information that contradicts their opinion, people tend to cling even harder to that belief. This is why it can feel impossible to use pure facts to change someone’s mind - for instance, trying to convince someone who believes that the earth is flat. This phenomenon is called belief perseverance, and you do not have to be a flat-earther to fall victim to it. A study from 1992 showed a group of natural scientists doubting their ability to measure the capacity of a sphere by filling it with water, just because they had first calculated its volume using an incorrect formula.
- Boosted overconfidence - confirmation bias goes hand in hand with the overconfidence effect, our tendency to rate our own judgement as better than it actually is. It makes us overestimate our performance and believe that we are more right than we are. For instance, investors can behave arrogantly and ignore clues suggesting that their investment is bad. The overconfidence effect can be especially harmful in matters that require a cautious or humble approach.
Overall, confirmation bias impairs our ability to act adequately upon information and prevents objective decision-making. This can be seen in many areas - in politics, science, business, and healthcare, among others. Even the most brilliant scientists are susceptible, as no one is immune to cognitive biases. Being wrong is unpleasant, and although the scientific approach promotes reconsidering one’s ideas, it can be very tempting to cherry-pick data to find support for a promising hypothesis.
Confirmation bias can also affect the way we assess medical treatments. This is how advocates for alternative medicine find support for their cause: by attributing any improvement in their health to the pseudo-scientific practices they believe in. Doing so requires ignoring the enormous body of evidence behind evidence-based medicine, and sometimes even dismissing it as a conspiracy. In line with belief perseverance, their conviction can grow even stronger when they are presented with evidence against it.
Confirmation bias also appears in recruitment processes. It is easy to be swayed by an initial impression and start looking for validation of your gut feeling, which affects the way recruiters gather, interpret, and evaluate information about candidates. This is why it is important to adopt more objective selection methods - both to find the best candidates and to mitigate the damage caused by confirmation bias.
Three things to consider when using an interactive presentation tool like Mentimeter
- When presenting, the time to critically review the responses from the audience is limited. As Mentimeter is a live interactive presentation tool, you may have only a couple of seconds to comment on the results. When posing a question, you probably have an idea of what responses will come up - or even what you are hoping for. It is therefore likely that you will emphasize the responses that fit your thesis. Be conscious of confirmation bias when, for instance, summarizing a workshop! It is a good idea to align with the team by double-checking whether your key takeaways match theirs, and remember that you can export the data from your presentation so you can review it later in a less stressful setting.
- Because of our tendency to search for information in a biased way, it is easy to phrase questions so that the answers confirm what you already believe. When posing an Open-Ended question, try to keep it genuinely open. Instead of asking “Are you satisfied with your team's progress last week?”, ask “Please comment on the progress of the team last week!”. This allows for more nuanced answers that are less biased by the phrasing of the question itself. Be extra mindful of this when it comes to emotionally loaded questions.
- Reflect on who is joining your presentation - if you only invite flat-earthers to your “What is the shape of the earth?” event, you will doubtless leave it with a biased perception. A group of like-minded individuals amplifies confirmation bias, risking compromising the outcome of the session to an even greater extent. This makes it all the more important to invite relevant stakeholders to a meeting and to align decisions throughout the organization.
Big thank you to Diana Diez who co-wrote this article.