Cognitive biases
Cognitive bias is the tendency to act irrationally due to our limited ability to process information objectively. It's not always negative, but it can cloud our judgment and affect the clarity with which we perceive situations, people, or potential risks. Everyone is susceptible to cognitive biases, and researchers are no exception. Cognitive biases can therefore be a source of bias in research.
Where do cognitive biases come from?
Cognitive bias is an umbrella term for our systematic but imperfect response patterns to problems of judgment and decision-making. As the word “systematic” suggests, these patterns are not random. Although rooted in our beliefs and experiences, they often run counter to logic or probability.
Although we like to think of ourselves as rational beings who process all information before making a decision, this is often not the case. Everyone is subject to cognitive biases to different degrees.
Cognitive biases are hardwired into our brains and help us manage the information overload of everyday life. If we had to deliberate carefully before every action, it would be very difficult to function.
To work efficiently, our brains rely on our experiences and beliefs more than we realize. These become mental shortcuts (also called heuristics): rules of thumb that help us make judgments and predictions. Because this process is intuitive or subconscious, people often do not realize that they are acting on biases or preconceived ideas.
Causes of cognitive biases
Our tendency toward cognitive biases can come from many different sources. A few of them include:
Limited information processing capacity. Because our minds have a limited capacity to store and recall information, we simply cannot take into account all relevant information when we make an inference or decision. Usually we are forced to focus on a subset of the available information.
Emotions. If our decision involves those close to us, rather than complete strangers, we will evaluate the situation differently.
Motivation. Our judgments are influenced by our existing attitudes and beliefs. We are very likely to choose the beliefs and strategies that are most likely to help us reach the conclusions we want to reach.
Social influence. People tend to conform to the opinions previously expressed by others or to act in socially desirable ways. This can influence collective behavior, such as voting.
Heuristics or mental shortcuts. Our mind uses simple rules to achieve a conclusion in a “quick and frugal” manner. The goal is not to grasp the problem in all its complexity, nor even to arrive at the optimal solution, but to quickly arrive at a “good enough” solution while minimizing mental effort.
Age. There is evidence to suggest that older adults demonstrate less cognitive flexibility, meaning that as we age, we become more likely to exhibit cognitive biases.
Relying on mental shortcuts in our daily lives is effective and leads to faster decision-making when speed matters more than accuracy. However, cognitive biases can cause us to misinterpret events, facts, or other people, which in turn can affect our behavior in a wide range of situations.
Cognitive biases can negatively affect:
Our decision-making capacity, by limiting our receptivity to new or conflicting information.
How accurately we remember incidents, for example an event we witnessed firsthand; inaccurate or incomplete recollection of events can lead to recall bias.
Our anxiety levels, by causing us to focus only on negative events or aspects of our lives.
Our relationships with others, when we are too quick to judge their personality based on a single trait.
Our critical thinking, by leading us to perpetuate misconceptions or misinformation that can be harmful to others.
Anchoring bias
Anchoring bias is the tendency to rely on the first piece of information offered, especially when that information is numerical. Negotiators exploit anchoring bias by opening with a number that is deliberately low or high, knowing that it will set the bar for all subsequent offers.
"In one experiment, participants were asked to estimate the percentage of African countries in the United Nations (UN) in two ways:
First, they were asked whether the percentage was less than or greater than a given number (the anchor), which was determined randomly by spinning a wheel.
Participants were then asked to estimate the exact percentage of African states that are members of the UN.
Even though the anchor was completely arbitrary and unrelated to the question, it nevertheless influenced the participants who used it as a reference in their subsequent judgment. As a result, their answers were close to the anchor. For example:
If the anchor was 10, participants' average estimate of the true value was 25.
If the anchor was 65, their average estimate was 45.
This shows that under anchoring bias, irrelevant anchors have just as much impact as anchors that provide relevant information cues.”
Framing effect
The framing effect occurs when people make a choice based on whether the options presented to them are framed positively or negatively, for example in terms of loss or gain, reward or punishment.
"In a study of undergraduate students, respondents were presented with the following medical decision-making problem, described in a positive and a negative frame. Responses were recorded on a 6-point Likert scale ranging from 1 (very bad) to 6 (very good).
Positive: 100 patients took the drug and 70 patients felt better. How would you rate the effect of the drug?
Negative: 100 patients took the drug and 30 patients did not feel better. How would you rate the effect of the drug?
The results showed that framing influenced evaluations: when the drug's effect was described in a loss frame (30 patients did not feel better), respondents gave negative evaluations. When the effect was described in a gain frame (70 patients felt better), respondents gave positive evaluations."
Actor-observer bias
Actor-observer bias is the tendency to attribute our own actions to external factors and the actions of others to internal factors. For example, if you and a classmate both fail a test, you may attribute your failure to the difficulty of the exam questions, while attributing your classmate's failure to poor preparation.
“A student who is doing poorly in school has an appointment with a school counselor, and he arrives late. When asked about the reasons for his poor grades, the student cites external circumstances: too much work, family problems, and stress. The counselor nods in understanding, but in reality he has formed a different opinion: judging by the student's tardiness, he is convinced that the student is simply lazy and indifferent.
In other words, the counselor, as an observer, attributes the student's results to his personality traits, underestimating the role that circumstances may have played. For his part, the student, as the actor of his own behavior, attributes his poor results to situational forces, ignoring his own responsibility.
In reality, both factors are probably at play here.”
Availability heuristic
The availability heuristic (or availability bias) applies when we place greater value on information that is available to us or that comes to mind quickly. For this reason, we tend to overestimate the likelihood of similar things happening again.
"When asked whether falling airplane parts or shark attacks are a more common cause of death in the United States, most people say shark attacks. In fact, the risk of dying from falling airplane parts is 30 times higher than the risk of being killed by a shark.
People overestimate the risk of shark attacks because there are so many more news articles and movies about them. As a result, images of shark attacks are easier to conjure up. If you can quickly think of several examples of something happening, you're led to believe that it must happen often."
Confirmation bias
Confirmation bias refers to our tendency to seek out evidence confirming what we already believe, viewing the facts and ideas we encounter as further confirmation. Confirmation bias also causes us to ignore any evidence that appears to support an opposing view.
“Consider an article on climate change that presents arguments in favor of reducing fossil fuel emissions.
A week after reading the article, a reader concerned about climate change is more likely to recall these arguments in a discussion with friends. In contrast, a climate change skeptic is unlikely to be able to recall the points made in the article.
Due to confirmation bias, we tend to memorize and recall information that is more consistent with our existing ideas.”
Halo effect
The halo effect refers to how our perception of a single trait can influence how we perceive other aspects, particularly regarding a person's personality. For example, when we consider a person to be physically attractive, this often determines how we evaluate their other qualities.
“Imagine you are at the supermarket and trying to choose a snack. You see two cereal bars, one of which is labeled organic. Since you are health conscious, you choose the organic bar, thinking it is the better choice.
The truth is, just because a product is labeled organic or contains organic ingredients doesn't mean it's healthier. If you read the packaging, you'll see that the organic bar is still high in sugar.
The halo effect influences how consumers judge product quality based on a single product characteristic. By attributing a positive characteristic to their product, brands can influence customers' perceptions of the product's overall quality.”
Baader-Meinhof phenomenon
The Baader-Meinhof phenomenon (or frequency illusion) is the tendency to see new information, names or patterns "everywhere" shortly after they are first brought to our attention.
“As COVID-19 cases began to spike in Boston, Massachusetts, doctors noticed something unusual. They saw an increase in patients seeking appointments for swollen, discolored toes, with symptoms resembling frostbite. A doctor would normally see only one or two such cases each winter, but now 15 to 20 people were seeking appointments.
A similar increase was reported by doctors in different parts of the world, coinciding with the rise of the COVID-19 pandemic. As a result, many doctors concluded that “COVID toes” were a symptom of coronavirus infection.
However, in the months that followed, it became clear that only a small proportion of the people suffering from the skin condition actually had COVID-19. Why more people reported skin problems is still being investigated.
Doctors were misled by the Baader-Meinhof phenomenon. As the pandemic was at its peak, it was only natural that some of the patients with swollen toes would also test positive for COVID. This led doctors to perceive an association where there was none.
The Baader-Meinhof phenomenon can inflate the importance of recent stimuli or observations – in this case, people with frostbite-like toes who also had COVID.
In other words, after noticing something for the first time (here, the co-occurrence of COVID and swollen, discolored toes), we tend to notice it more often, leading us to believe that its frequency of occurrence is high.”
Belief bias
Belief bias describes the tendency to judge an argument based on the plausibility of its conclusion rather than on the strength of the evidence provided to support it.
"You come across the following statement:
Scientific studies have consistently shown that there is little nutritional difference between organic and conventional foods.
Because you firmly believe that an all-organic diet is superior to a conventional one, you are skeptical and quickly dismiss the argument, even though it provides scientific evidence."
Affect heuristic
The affect heuristic occurs when our current emotional state or mood influences our decisions. Instead of assessing the situation objectively, we rely on our “gut feelings” and react based on how we feel.
“Communicators often attempt to manipulate our feelings, influencing our opinions on controversial issues through their choice of words. “Nuclear speak” is one example. It is a form of deceptive language used to present nuclear energy and weapons in a positive light.
By using terms like “smart bombs” and “peacekeeping missiles” for nuclear weapons and “excursions” for reactor accidents, nuclear energy advocates downplay the risks of nuclear applications and emphasize their benefits. Although this language does not go unchallenged, it attempts to present nuclear concepts in a neutral or even positive way. As a result, the public associates a neutral or positive sentiment with the technology, leading to a framing effect.
According to the affect heuristic, when people perceive something, such as a technology, as highly beneficial, they automatically infer that it poses few risks. This ultimately influences how they feel about it. The next time they are asked for their opinion on the matter, they are more likely to consult their feelings rather than carefully consider all the facts.”
Representativeness heuristic
The representativeness heuristic occurs when we estimate the probability of an event based on its similarity to a known situation. In other words, we compare it to a situation, prototype, or stereotype that we already have in mind.
“Consider the following sketch:
Tom is 34 years old. He is intelligent but unimaginative and collects old jazz records. At school he was good at math but weak at social sciences and humanities.
Which statement is more likely?
A. Tom is an accountant who plays the trumpet for a hobby.
B. Tom plays the trumpet for a hobby.
Given the description, you might think that A is the more likely answer. However, this violates a fundamental rule of probability: the conjunction, or co-occurrence, of two events (e.g., “accountant” and “trumpet player”) cannot be more probable than either event taken separately.
As the amount of detail in a scenario increases, its probability can only decrease, but its representativeness (and hence its apparent plausibility) can increase.”
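To see why, it helps to put numbers on it. The sketch below uses made-up probabilities (the values are purely illustrative assumptions, not data from the original study) to show that the probability of two events occurring together can never exceed the probability of either event alone:

```python
# A minimal numeric sketch of the conjunction rule.
# All probabilities below are made-up, illustrative assumptions.

p_trumpet = 0.05                   # assumed: P(Tom plays trumpet as a hobby)
p_accountant_given_trumpet = 0.10  # assumed: P(Tom is an accountant | he plays trumpet)

# P(A and B) = P(B) * P(A | B); since P(A | B) <= 1, the joint
# probability can never exceed P(B) on its own.
p_accountant_and_trumpet = p_trumpet * p_accountant_given_trumpet

print(f"P(trumpet)              = {p_trumpet:.3f}")                 # 0.050
print(f"P(accountant & trumpet) = {p_accountant_and_trumpet:.3f}")  # 0.005

# Statement B ("plays trumpet") is therefore at least as likely as
# statement A ("accountant who plays trumpet"), whatever values we pick.
assert p_accountant_and_trumpet <= p_trumpet
```

Whatever values you substitute, the inequality holds: adding detail can only shrink a scenario's probability, even as it makes the description feel more plausible.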