All of us are prone to logical fallacies and cognitive biases.
I know that I’m stupid sometimes—most of us are.
Still, we should all strive to be less stupid.
I’m deeply fascinated with studying logical fallacies and cognitive biases. Learning about human behaviours is helpful in public relations, where we deal with communication challenges daily.
Here we go:
- 1. Fallacy of Composition
- 2. Fallacy of Division
- 3. The Gambler’s Fallacy
- 4. Tu Quoque (Who Are You To Talk?)
- 5. Strawman
- 6. Ad Hominem
- 7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)
- 8. Fallacious Appeal to Authority
- 9. Red Herring
- 10. Appeal to Emotion
- 11. Appeal to Popularity (The Bandwagon Effect)
- 12. Appeal to Tradition
- 13. Appeal to Nature
- 14. Appeal to Ignorance
- 15. Begging the Question
- 16. Equivocation
- 17. False Dichotomy (Black or White)
- 18. Middle Ground Fallacy
- 19. Decision Point Fallacy (Sorites Paradox)
- 20. Slippery Slope Fallacy
- 21. Hasty Generalisations (Anecdotal Evidence)
- 22. Faulty Analogy
- 23. Burden of Proof
- 24. Affirming the Consequent
- 25. Denying the Antecedent (Fallacy of the Inverse)
- 26. Moving the Goalposts
- 27. No True Scotsman
- 28. Personal Incredulity
- 29. False Causality
- 30. Texas Sharpshooter
- 31. Loaded Question
- 32. Chesterton’s Fence
- 33. Survivorship Bias
- 34. The Dunning-Kruger Effect
- 35. Confirmation Bias
- 36. Heuristic Anchoring
- 37. The Curse of Knowledge
- 38. Optimism/Pessimism Bias
- 39. The Sunk Cost Fallacy
- 40. Negativity Bias
- 41. Declinism
- 42. The Backfire Effect (Conversion Theory)
- 43. The Fundamental Attribution Error
- 44. In-Group Bias
- 45. The Forer Effect (The Barnum Effect)
- 46. Cognitive Dissonance
- 47. The Hostile Media Effect
- 48. Cherry-Picking (The Fallacy of Incomplete Evidence)
- 49. The Spiral of Silence
- 50. The Yes Ladder
- 51. Bystander Effect
- 52. Reciprocation Effect
- 53. Commitment and Consistency
- 54. The Fallacy of Social Proof
- 55. Liking and Likeness
- 56. The Appeal to Authority
- 57. The Principle of Scarcity (FOMO)
- 58. Loss Aversion
1. Fallacy of Composition
Something can be true of a part without being true of the whole.
“We’ve got the best player in the world on our team, so we must also have the best team in the world.”
2. Fallacy of Division
Just because something is true of the whole doesn’t make it true of each part.
“We’ve got the best team in the world, so naturally, we must also have the best players in the world.”
3. The Gambler’s Fallacy
The gambler’s fallacy is the belief that random outcomes have memory, such as streaks of good or bad luck.
“I bet again because I’m on a roll right now!”
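The fallacy is easy to test empirically. Here is a minimal, illustrative simulation (the function name is mine, not from the article): for a fair coin, the chance of heads immediately after a streak of heads stays at roughly 50%.

```python
import random

def prob_heads_after_streak(trials=200_000, streak=5, seed=42):
    """Estimate P(heads) on the flip immediately after a run of
    `streak` consecutive heads. Past flips carry no memory into
    the next one, so for a fair coin this stays close to 0.5."""
    rng = random.Random(seed)
    run = 0            # length of the current heads streak
    seen = heads = 0   # flips observed right after a full streak
    for _ in range(trials):
        flip = rng.random() < 0.5  # True = heads
        if run >= streak:
            seen += 1
            heads += flip          # bool counts as 0 or 1
        run = run + 1 if flip else 0
    return heads / seen

print(prob_heads_after_streak())  # close to 0.5, not above or below it
```

Being "on a roll" changes nothing about the next flip; the streak exists only in hindsight.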
4. Tu Quoque (Who Are You To Talk?)
Dismissing a statement because the person making it fails to practise what they preach. Hypocrisy, however, doesn’t make the advice itself wrong.
“My friend said it’s great to go to the gym, but my friend is out of shape, so that can’t be good advice.”
5. Strawman
A strawman argument is when your opponent grossly misrepresents the intended meaning of your original argument and then attacks the misrepresentation instead.
“Atheists don’t believe in anything, so they don’t believe it matters if one does bad things.”
6. Ad Hominem
Instead of attacking the arguments at face value, you attack your opponent’s good name or character.
“Our prime minister’s wife left him, and if he can’t keep his wife happy, how could he run our country?”
7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)
To assume something is correct or incorrect based solely on the source’s credibility.
“My priest would never lie to me; my priest is telling me that God exists; therefore, God exists.”
8. Fallacious Appeal to Authority
Just because someone’s an expert in a specific area doesn’t automatically make them an expert in others.
“I’m not an epidemiologist; however, I am a doctor, and I think we must take measures to ensure herd immunity.”
9. Red Herring
Injecting a piece of information to mislead or distract from the main point.
“We’re being told that the environment needs saving, but the government has made it a habit of lying to us.”
10. Appeal to Emotion
We often empathise with people or groups claiming to be sad, hurt, or scared. But this is sometimes not relevant to the main point.
“Why do you point your finger at immigrants who commit crimes when they’re often victims of horrible traumas themselves?”
11. Appeal to Popularity (The Bandwagon Effect)
The false belief that many people can’t simultaneously be wrong about the same thing. Both logic and history have taught us that majorities can be objectively wrong. Still, it’s socially comforting to remind your opponent that you have the majority position on your side, as if that made your position correct by default.
“One billion flies can’t be wrong—eat shit.”
12. Appeal to Tradition
The false belief is that many people can’t be wrong about something for a long time. Just because human beings have been religious throughout history, this fact doesn’t prove the existence of supernatural powers.
“This can’t be wrong since this has been common practice forever.”
13. Appeal to Nature
The tricky notion that science is at odds with nature and therefore shouldn’t be trusted. The fact that we’ve evolved in unison with nature without vaccines doesn’t automatically make vaccines dangerous.
“We trust our natural immune systems, and therefore we don’t trust man-made vaccines.”
14. Appeal to Ignorance
The misconception that something must be true simply because it can’t be proven false.
“Since God can’t be disproven, God must exist.”
15. Begging the Question
A circular argument in which the conclusion is already assumed in one of the premises.
“God exists because it says so in the Bible.”
16. Equivocation
Using a word with multiple meanings interchangeably to gain the upper hand in an argument.
“I have the right to believe in God; therefore, believing in God is right.”
17. False Dichotomy (Black or White)
Assuming that something must be either A or B and not both. Media logic often dictates that narratives must be simplified and amplified to be easy to grasp quickly; as soon as something is two or more things simultaneously, we find it hard to wrap our heads around. A person could be good and bad simultaneously — or neither.
“You’re either with us (in every way), or you’re against us (in every way).”
18. Middle Ground Fallacy
The false notion that the correct answer must lie somewhere between extreme positions. If someone says that the water is cold and someone else says it’s hot, it’s easy to assume that the temperature lies somewhere in between. But it could still be hot — or freezing.
“Someone said this is dangerous and perfectly safe, so as long as I’m cautious, I should be fine.”
19. Decision Point Fallacy (Sorites Paradox)
The false assumption that something can’t be correct or incorrect because there’s no precise cut-off between two points.
“We haven’t had many pandemic deaths in Sweden because we know of other countries that have had lots of deaths—and since we haven’t had as many as those countries, we haven’t had many deaths.”
20. Slippery Slope Fallacy
The false assumption is that one step in a specific direction is bound to lead to many steps in that same direction. Sure, eating a piece of candy could lead to an over-consumption of sugar, leading to a severe disease that ultimately could cost you your life. But eating a piece of candy won’t lead directly to your death.
“If you eat that sandwich, you’ll get fat.”
21. Hasty Generalisations (Anecdotal Evidence)
Drawing broad, general conclusions from a small or unrepresentative sample — often a single piece of anecdotal evidence, especially when the circumstances are personal.
“I got mugged in the street yesterday, and I hate that society is becoming increasingly unsafe.”
22. Faulty Analogy
The attempt to disqualify an argument by making a point using an irrelevant analogy. Analogies can be powerful communication tools, but it’s easy to make unfair or inaccurate comparisons.
“Abortion is murder.”
23. Burden of Proof
The attempt to push the opponent to disprove your claims. It’s the person making a claim who has the burden of proof, but a popular technique is to push the responsibility over to the person you’re arguing with.
“I believe in God, and you must prove me wrong.”
24. Affirming the Consequent
Just because an if-then statement is true, observing the “then” part doesn’t prove the “if” part; the statement doesn’t run in reverse.
“A cat meows, so everything that meows is a cat.”
25. Denying the Antecedent (Fallacy of the Inverse)
Just because an if-then statement is true, the “if” part being false doesn’t make the “then” part false; the consequent can still hold for other reasons.
“A cat meows, so if it doesn’t meow, it isn’t a cat.”
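Both of these argument forms can be checked mechanically. This short illustrative sketch (the names are mine, not from the article) enumerates every truth-value combination and finds the counterexample that breaks each form:

```python
from itertools import product

def implies(p, q):
    """Material implication: 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

# Collect cases where all premises hold but the conclusion fails.
affirming_consequent = []  # premises: p -> q, q;     conclusion: p
denying_antecedent = []    # premises: p -> q, not p; conclusion: not q
for p, q in product([True, False], repeat=2):
    if implies(p, q) and q and not p:   # premises true, conclusion false
        affirming_consequent.append((p, q))
    if implies(p, q) and not p and q:   # premises true, conclusion ("not q") false
        denying_antecedent.append((p, q))

print(affirming_consequent)  # [(False, True)]: something meows without being a cat
print(denying_antecedent)    # [(False, True)]: not a cat, yet it still meows
```

The same single case, p false and q true, defeats both forms: a dog toy that meows satisfies “if it’s a cat, it meows” without being a cat.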
26. Moving the Goalposts
Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.
“Yes, there might be some innocent people in jail, but I was only talking about the guilty.”
27. No True Scotsman
Redefining a category on the fly, based on a false or biased ideal, so that any counterexample can be dismissed as not a “true” member.
“All real men have beards, so if you don’t have a beard, you can’t be a real man.”
28. Personal Incredulity
Just because you find something hard to believe or imagine doesn’t make it untrue.
“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”
29. False Causality
The false assumption is that correlation equals causation.
“Crime rates went up when the price of gas went up, so for the sake of everyone’s safety, we must lower our taxes on fossil fuels.”
30. Texas Sharpshooter
Deciding on your position first and then finding only data that supports it, like firing at a barn wall and painting the target around the bullet holes afterwards. This fallacy is especially prominent in the digital age, when it’s possible to find arguments online defending almost any imaginable position.
“I’ve found numerous studies supporting my position, and I have no idea if any studies are supporting your position as well.”
31. Loaded Question
To ask a question with an assumption already built into the question.
“Have you stopped beating your wife?”
32. Chesterton’s Fence
If we don’t understand or see the reason for something, we might be inclined to do away with it. However, even if we don’t understand it, most things have been put in place for a reason. We should therefore leave it be unless we fully understand its purpose.
“There’s a fence here, but I can’t see what it’s good for, so let’s do away with it.”
33. Survivorship Bias
I’ve already written about survivorship bias and Abraham Wald.
“All the warplanes returning from battle had damage to their wings, so we should reinforce the wings to make them safer.”
34. The Dunning-Kruger Effect
A cognitive bias whereby people overestimate their competence as they begin to progress in a new field. (Please note that the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.)
“I’ve just started learning about this, and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m an expert on this subject now.”
35. Confirmation Bias
Most of us tend to recall or interpret information in a way that reinforces our existing cognitive schemas.
“I refused to take my medicine and got well, so I always do my best to avoid treatment.”
36. Heuristic Anchoring
When faced with an initial number, we often compare subsequent numbers to that initial anchor.
“The third house shown to us by the broker was over our budget but still a bargain compared to the first two houses she showed us.”
37. The Curse of Knowledge
We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.
“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”
38. Optimism/Pessimism Bias
We find it easier to believe that negative things can happen to others than to ourselves. Some people are biased in the opposite direction: they overestimate the likelihood of adverse events.
“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us—only worse.”
39. The Sunk Cost Fallacy
Sometimes we stick to a behaviour simply because we’ve already invested time, money, and other resources. To abandon such an investment would force us to face an irreversible failure.
“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”
40. Negativity Bias
We tend to react more strongly to negative impacts than to positive effects of similar or equal weight.
“Our daughter graduated with honours from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”
41. Declinism
We tend to think that everything is in decline, especially in the face of new developments. This might be due to cognitive laziness: we don’t wish to change how we feel in tandem with the times.
“Everything was better in the past, so change is terrible.”
Read also: Social Media—The Good, The Bad, and the Ugly
42. The Backfire Effect (Conversion Theory)
When challenged, the outcome might be that we cling even firmer to our beliefs—instead of questioning ourselves.
“People hate us, but this proves us right about everything.”
The disproportional power of minorities is known as the conversion theory.
How does it work?
The social cost of holding a different view than the majority is high. This increased cost explains why minorities often hold their opinions more firmly. It takes determination to go against the norm (Moscovici, 1980).
In contrast, many majority members don’t hold their opinions so firmly. They might belong to the majority for no other reason than that everyone else seems to (Chryssochoou & Volpato, 2004).
“In groups, the minority can have a disproportionate effect, converting many ‘majority’ members to their own cause. This is because many majority group members are not strong believers in its cause. They may be simply going along because it seems easier or that there is no real alternative. They may also have become disillusioned with the group purpose, process, or leadership and are seeking a viable alternative.”
According to conversion theory, while majorities often rely on normative social influence, minorities strive for the ethical high ground.
Given the power of normative social influence, minorities must stick together in tight-knit groups that can verbalise the same message repeatedly.
43. The Fundamental Attribution Error
When someone else makes a mistake, we tend to attribute it to their character or behaviour. But when we make mistakes ourselves, we tend to attribute them to contextual circumstances.
“When I’m in a rush, people behave like idiots in traffic.”
44. In-Group Bias
We have evolved to be subjectively preferential to people who belong to the same social group. This isn’t necessarily bad behaviour per se, but we must watch out for situations where we are put in a position where we can’t be expected to be fair and objective.
“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”
45. The Forer Effect (The Barnum Effect)
We tend to fill any gaps in the information we’re given using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that vague statements might apply just as well to ourselves and to many others.
“I read my horoscope yesterday, and the information was uncannily accurate, so I’m certainly convinced that there are some things about the cosmos that influence our lives in a way that we can’t yet understand.”
46. Cognitive Dissonance
We tend to sort information based on our existing cognitive schemas. One outcome is that we tend to disregard any information that sits poorly with what we already believe while quickly absorbing anything that confirms our beliefs.
“The Earth is flat, and I haven’t seen any credible evidence to the contrary.”
47. The Hostile Media Effect
This can be seen as the media-science equivalent of the backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased towards their opposition. The effect is even stronger if the individual believes in a silent majority that is particularly susceptible to erroneous or misleading media coverage.
“I know the media is telling me I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”
The Hostile Media Effect
Do you think that the news media is biased against your beliefs? Well, they might be. And they might also not be.
Researchers have found that individuals tend to see the news media as biased against them—even when it’s not:
“The hostile media effect […] is a perceptual theory of mass communication that refers to the tendency for individuals with a strong preexisting attitude on an issue to perceive media coverage as biased against their side and in favour of their antagonists’ point of view.”
Source: Hostile media effect. (2022, October 25). In Wikipedia. https://en.wikipedia.org/wiki/Hostile_media_effect
Are we paranoid? Are we seeing bias in the news media that isn’t there? In short: Yes.
The hostile media effect doesn’t imply that the media is never biased. Still, science shows that opposing groups often regard the same articles as against them and favour their opponents.
The existence of the hostile media effect is scientifically well-established, but we still don’t know precisely why it persists:
“The hostile media perception, the tendency for partisans to judge mass media coverage as unfavorable to their own point of view, has been vividly demonstrated but not well explained. This contrast bias is intriguing because it appears to contradict a robust literature on assimilation biases — the tendency to find information more supportive, rather than more opposed, to one’s own position. […] content evaluations based on perceived influence on oneself vs influence on a broader audience suggested that the hostile media perception may be explained by perceived reach of the information source.”
Source: Gunther, A. C., & Schmitt, K. (2004). Mapping boundaries of the hostile media effect. Journal of Communication, 54, 55-70.
Research suggests that the primary driver could be fear of opponents gaining in strength, and the hostile media effect could therefore be seen as a psychological defence mechanism.
48. Cherry-Picking (The Fallacy of Incomplete Evidence)
This fallacy is closely related to Texas sharpshooter and the fallacy of division. Cherry-picking fuels most of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make a case for almost anything.
“Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls). Therefore, Napoleon is a myth.”
49. The Spiral of Silence
Most social animals harbour an instinctive fear of isolation, and in-groups maintain their cultural stability partially by excluding individuals with non-conforming opinions or behaviours. This can create a culture where group members self-censor their views and behaviours by going silent.
“My opinions are perceived as wrong, and it’s better for everyone if I stay silent.”
The Spiral of Silence
Rather than risking social isolation, many choose silence over expressing their true opinions.
“To the individual, not isolating himself is more important than his own judgement. […] This is the point where the individual is vulnerable; this is where social groups can punish him for failing to toe the line.”
— Elisabeth Noelle-Neumann
As the dominant coalition gets to stand unopposed, they push the confines of what’s acceptable down a narrower and narrower funnel (see also the opinion corridor).
“The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum—even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.”
— Noam Chomsky
Read also: The Spiral of Silence
50. The Yes Ladder
This is a marketing exploit where the persuader aims to get you to say yes to something substantial (“big ask”) by methodically getting you to say yes to something smaller first (“small ask”).
“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter, and via the newsletter, I downloaded a free photo book with pink umbrellas—and now I own five pink umbrellas.”
51. Bystander Effect
People are less inclined to offer support or aid when many others are present who could also do it.
“Everyone cares deeply about personal safety, so everyone will download our new CSR app to help each other.”
52. Reciprocation Effect
We often feel obligated to reciprocate if someone is friendly or generous towards us. While this is a beautiful and expected part of human behaviour, it’s something that special interests can take advantage of.
“I can’t believe the car broke down so fast—the guy I bought it from threw in so many extra features.”
53. Commitment and Consistency
Once we commit to something, we invest a part of ourselves in that decision. This makes it harder for many of us to abandon such commitments because it would mean giving up on ourselves. This bias is closely related to yes ladders, declinism, appeal to tradition, and sunk cost fallacy.
“I’ve made my decision, and therefore I’m sticking with it.”
54. The Fallacy of Social Proof
This fallacy is the commercial extension of the bandwagon effect; by showcasing social proof, we are comforted by decisions made by others. Ideally, we should always ensure that reviews and engagement displays are relevant (and accurate) before making any decisions, but this doesn’t always happen.
“Their product seems to have many happy users, so the risk of getting scammed is low.”
55. Liking and Likeness
“We prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion.
“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust every word he says?”
56. The Appeal to Authority
It isn’t easy to distinguish between perceived authority and indisputable authority. Many companies use testimonials from people with impressive titles—and it works. This fallacy is closely related to the fallacious appeal to authority.
“Several leading doctors recommended this product, so the ad’s claims must be true.”
57. The Principle of Scarcity (FOMO)
Most of us are scared of missing out (also known as FOMO, fear of missing out). This makes us perceive things as more valuable the rarer they are.
“I’m so happy I managed to snag that pink unicorn umbrella before the discount ran out!”
Read also: The Power of Artificial Scarcity
58. Loss Aversion
The pain of losing can psychologically be twice as powerful as the joy of winning. We often take disproportionate risks to avoid losing compared to the risks we’re prepared to take to win. This bias is closely related to commitment and consistency and the sunk cost fallacy.
“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
Arkes, H. R., & Blumer, C. (1985), The psychology of sunk costs. Organisational Behavior and Human Decision Processes, 35, 124-140.
Cialdini, R. (2006). Influence: The Psychology of Persuasion, Revised Edition. Harper Business: The United States.
Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with a foreword by former APA President, Dr Diane F. Halpern.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.
Forer, B. R. (1949). The Fallacy of Personal Validation: A classroom Demonstration of Gullibility. Journal of Abnormal Psychology, 44, 118-121.
Kahneman, D. (2011). Thinking fast and slow. Penguin: Great Britain.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognising one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121–1134.
Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49-68.
Simon, H. A. (1957). Models of man. New York: Wiley.
Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178-181.
Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183-206.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.
West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930–941.