Cognitive Biases and Fallacies: Examples & Differences
Cognitive biases are mental shortcuts. Fallacies are errors in reasoning.
Suppose your plane crashes in the middle of nowhere, leaving you and a dozen other survivors. You have three options:
Everyone leaves the crash site to look for food and help.
Half of the group leaves the crash site to look for food and help, while the other half stays put.
Everyone stays at the crash site.
What would you do?
This question was posed to me in a seminar. I felt certain that everyone should go look for food and help. I made my case to our group, and they ended up siding with me. But I was wrong, and so were they.
There was an expert in the room: a former US Air Force pilot. He explained that staying close to the plane was by far the best option for survival. His reasoning was that the search crew would have some idea of the plane’s last location by tracking the black box and smoke from the wreckage.
I had felt certain about my solution to this problem, but feeling certain doesn’t guarantee that your assessment is accurate. We humans have inherited many dispositions from our evolutionary past. Those dispositions include cognitive biases, and those biases can make us feel certain even when we’re wrong. In addition, they can trigger errors in reasoning—fallacies.
The difference between cognitive biases and fallacies is this: cognitive biases are tendencies to act, whereas fallacies are actual actions. By analogy, craving sugar is a tendency to act, whereas eating too much ice cream is an actual action.
Before exploring that difference, let’s first understand what cognitive biases and fallacies are.
Cognitive biases are mental shortcuts (called heuristics) that fill in gaps in our information and help us act decisively despite those gaps.
The concept of cognitive bias was first introduced by Daniel Kahneman and Amos Tversky. They distinguished two cognitive mechanisms for decision-making: fast thinking, which uses mental shortcuts, and slow thinking, which uses reasoning. Cognitive biases are examples of fast thinking.
Cognitive biases evolved as a way of saving time and energy while at the same time promoting survival. The human brain is limited in its capacity to process information. We make so many decisions during the day that it would be taxing on our brains to make every decision in a slow, thoughtful manner. Moreover, in the environment in which our prehistoric ancestors lived, making every decision in a slow, thoughtful manner could be deadly!
Cognitive biases evolved as a way of speeding up information processing, and gave our ancestors the ability to act decisively in life and death situations. For most of us today, quick life-and-death decisions are rare, yet we retain these biases as part of our evolutionary inheritance.
Here are some of the cognitive biases we all have:
Confirmation bias: A tendency to filter information in a way that confirms what you already think. For example, you seek information that validates your current liberal point of view by watching CNN and CNBC and reading the New York Times, and you discount news from sources like Fox and the Wall Street Journal. Confirmation bias is often what drives people to accept even the most absurd conspiracy theories: they only examine information that confirms the theory, not any information that disconfirms it.
Optimism bias: A tendency to be over-optimistic, and to underestimate the probability of undesirable outcomes. For example, instead of seeing the world for what it is, you overestimate your chances of succeeding at something. Overconfidence in positive outcomes is often the result of optimism bias.
Self-serving bias: A tendency to claim more responsibility for successes than failures. For example, many employees take credit for their wins but ignore their mistakes.
Availability bias (also called availability heuristic): A tendency to overestimate the likelihood of events based on recent and emotional memories. For example, you make a decision based on something that recently happened to you and avoid looking at the big picture.
Anchoring bias: A tendency to rely too much on one trait or piece of information. For example, you rely too heavily on a first impression and devalue subsequent information. Anchoring bias is often what drives people to hold onto misconceptions even after they’ve seen disconfirming evidence.
Attribution bias: A tendency to take credit for good outcomes but deflect blame for bad ones when evaluating your own performance, and to do the opposite when evaluating other people’s. For example, suppose I make a decision that has a good outcome. I’ll accept credit for it and say it’s because I’m a genius. If the outcome is bad, I’ll avoid blame and point to external circumstances instead. Things are different when I evaluate someone else’s performance: then I’m more willing to blame the person for failures, and more willing to attribute that person’s successes to external factors.
Here are the names of some other common cognitive biases: observer bias, the halo effect, ingroup bias, implicit bias, the bandwagon effect, the Dunning-Kruger effect, and hindsight bias.
Now that we understand what cognitive biases are, let’s look at fallacies.
A fallacy is an error in reasoning.
Reasoning, or argumentation, is the process of supporting a statement by appeal to other statements. The statement you’re trying to support is called the conclusion, and the statements that are supposed to support it are called premises.
Reasoning can be correct or incorrect in just the way that mathematical calculation can. When reasoning is performed incorrectly, we say that it commits a fallacy.
The telltale sign of a fallacy is this: even if your premises are true, they still tell you nothing about whether or not your conclusion is true. Let’s look at an example. Here are two arguments:
Fallacious Argument A:
If it’s 2021, then it’s the 21st century. (Premise, true)
It’s the 21st century. (Premise, true)
Therefore, it’s 2021. (Conclusion, true)
Fallacious Argument B:
If it’s 2016, then it’s the 21st century. (Premise, true)
It’s the 21st century. (Premise, true)
Therefore, it’s 2016. (Conclusion, false)
Argument A and Argument B have the same form. We can represent that form as follows:
Affirming the Consequent (Fallacy)
If P, then Q
Q
Therefore, P
Here ‘P’ and ‘Q’ are variables. In Argument A, the variable P has the value ‘it’s 2021’ and the variable Q has the value ‘it’s the 21st century’. In Argument B, the variable P has the value ‘it’s 2016’ and the variable Q has the value ‘it’s the 21st century’.
When we plug in these values for the variables, we end up with true premises in both of the arguments: it’s true that if it’s 2021, then it’s the 21st century; it’s true that if it’s 2016, then it’s the 21st century, and it’s true that it’s the 21st century.
Both arguments, then, have true premises. If we reason correctly from true premises, then we should arrive at a true conclusion every time. But notice what happens when we reason by affirming the consequent: sometimes true premises yield a true conclusion, and sometimes they don’t. This shows us that reasoning in this way is unreliable. Even if you have true premises, those premises still tell you nothing about whether or not the conclusion is true.
That’s why we call this form of reasoning a fallacy. It’s an example of incorrect reasoning: even if the premises are true, they still don’t give you any reason to accept the conclusion.
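If you like seeing this spelled out, here’s a minimal Python sketch (the `is_valid` helper and the lambda encodings are mine, purely for illustration) that brute-forces every truth-value assignment to P and Q. An argument form is valid only if no assignment makes all the premises true while making the conclusion false. Affirming the consequent fails that test, while modus ponens (if P, then Q; P; therefore Q) passes it.

```python
from itertools import product

def is_valid(premises, conclusion):
    """Brute-force validity check: an argument form is valid only if
    no assignment of truth values to P and Q makes every premise true
    while making the conclusion false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # counterexample: true premises, false conclusion
    return True

# Material conditional: "If A, then B" is false only when A is true and B is false.
implies = lambda a, b: (not a) or b

# Affirming the consequent: If P, then Q; Q; therefore P.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
               lambda p, q: p))  # False: invalid (counterexample at P=False, Q=True)

# Modus ponens: If P, then Q; P; therefore Q.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
               lambda p, q: q))  # True: valid
```

The counterexample the check finds is exactly the situation Argument B illustrates: Q (‘it’s the 21st century’) is true while P (‘it’s 2016’) is false.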
Here are some common fallacies:
Appeal to Authority Fallacy: Appeals to authority try to support a claim by pointing to the person who’s making it. If I say that there is an afterlife because Descartes believed in it, I’m committing an appeal to authority fallacy. A claim’s truth or falsity doesn’t depend on who’s making it, but my appeal to Descartes treats it as if it did, as if the claim must be true just because Descartes accepted it.
Ad Hominem Fallacy (also known as a personal attack): Ad hominem arguments try to refute an opponent’s claim by attacking the arguer rather than the argument. An example is the following argument: “It’s false that 2 + 2 = 4 because you are stupid.” A claim’s truth or falsity doesn’t depend on who’s making it.
Hasty Generalization Fallacy: A generalization is stronger or weaker depending on the size of the initial sample. Hasty generalizations are weak generalizations. A generalization is hasty when we endorse a general claim without having observed a sample large enough to be confident that the claim is true. For example, “All the parrots I’ve ever seen are yellow, so all parrots must be yellow.”
Straw Man Fallacy: The straw man is a logical fallacy that replaces something (a person, a viewpoint, an argument) with a distorted version that exaggerates the opponent’s position to make it easier to attack.
Here’s an example of the straw man fallacy:
Wife: “I’d rather go to a beach than a city.”
Husband: “Why do you hate cities?”
The wife never said that she hates cities. The husband misrepresents what she says to make her preferences seem more extreme than they are.
Here are the names of some other common fallacies: the sunk cost fallacy, the gambler’s fallacy, the slippery slope fallacy, and begging the question (circular reasoning).
How Cognitive Biases Can Trigger Fallacies
The difference between a cognitive bias and a fallacy is the difference between a tendency and an action. Tendencies to act are different from actions themselves. For example, we all have a craving to eat sugar and fat, but that’s different from actually eating sugar and fat.
Similarly, cognitive biases are tendencies to make judgments based on inadequate or irrelevant evidence. We commit fallacies when we actually act on those tendencies.
Cognitive biases can tempt you to commit fallacies. For example, humans have a tendency to want to belong to a group. If you are born in Alabama, you are likely surrounded by Christians, and you will tend to conform to your social surroundings by believing what Christians believe. If you are born in Kabul, you are likely surrounded by Muslims, and you will tend to conform to your social surroundings by believing what Muslims believe.
The tendency to conform to social norms can lead people to commit an appeal to popularity fallacy. Appeal to popularity happens when someone accepts a claim because it reflects popular opinion or a common belief among a specific group of people. Suppose, for instance, that I say, “There is an afterlife because most Christians around me believe in it.” This is an appeal to popularity fallacy because I’m believing the claim because it’s popular, not because there’s a reason to believe it.
Managing biases and fallacies: the good news and the bad
The bad news about cognitive biases: You can’t get rid of them. I explain why here.
The good news about cognitive biases: You can learn to manage them. I describe how here.
The bad news about fallacies: Fallacious arguments can be persuasive and can fool you. Politicians and marketers use fallacious arguments to get us to do certain things.
The good news about fallacies: You can learn about fallacies, spot them in other people’s reasoning, and take steps to avoid them yourself.
How Wisdom Seekers Approach Cognitive Biases and Fallacies
Wisdom seekers are committed to developing critical thinking skills, including understanding cognitive biases and fallacies. They understand that they can’t change the cognitive biases they have, but they can change whether they act on them. They know that acting on those biases can lead to false judgments, so they default to withholding judgment until they have adequate evidence to accept or reject something.