
58 Logical Fallacies and Cognitive Biases

The fascinating science of being stupid.

All of us are prone to logical fallacies and cognitive biases.

I know that I’m stupid sometimes — most of us are.

Still, we should all strive to be less stupid.

I’m deeply fascinated with studying logical fallacies and cognitive biases. Learning about human behaviours is helpful in public relations, where we deal with communication challenges daily.

Here we go:

1. Fallacy of Composition

Just because something is true of one part doesn’t make it true of the whole.

“We’ve got the best player in the world on our team, so we must also have the best team in the world.”

2. Fallacy of Division

Just because something is true of the whole doesn’t make it true of every part.

“We’ve got the best team in the world, so naturally, we must also have the best players in the world.”

3. The Gambler’s Fallacy

The gambler’s fallacy is the belief that random patterns are stable, such as streaks of good or bad luck. In reality, independent random events are unaffected by the outcomes that came before.

“I should place another bet because I’m on a roll right now!”
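A quick way to see why this fails is a fair coin, used here as a stand-in for any independent random event. Each flip is unaffected by the flips before it:

P(heads on any single flip) = 1/2, regardless of previous outcomes.
P(five heads in a row) = (1/2)^5 = 1/32, roughly 3%.
After four heads in a row, P(heads on the fifth flip) is still 1/2.

The streak as a whole was unlikely, but the next outcome never becomes more or less likely because of it.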

4. Tu Quoque (Who Are You To Talk?)

Tu quoque dismisses an argument because the person making it fails to live up to it. But advice can be sound even if the adviser doesn’t follow it.

“My friend said it’s great to go to the gym, but my friend is out of shape, so that can’t be good advice.”

5. Strawman

A strawman argument misrepresents your original argument and then attacks the distorted version instead of what you actually said.

“Atheists don’t believe in anything, so they don’t believe it matters if one does bad things.”

6. Ad Hominem

Instead of addressing the argument on its merits, you attack your opponent’s good name or character.

“Our prime minister’s wife left him, and if he can’t keep a wife happy, how could he run our country?”

7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)

To assume something is true or false based solely on its source rather than its merits.

“My priest would never lie to me; my priest is telling me that God exists; therefore, God exists.”

8. Fallacious Appeal to Authority

Just because someone’s an expert in a specific area doesn’t automatically make them an expert in others.

“I’m not an epidemiologist; however, I am a doctor, and I think we must take measures to ensure herd immunity.”

9. Red Herring

Injecting a piece of information to mislead or distract from the main point.

“We’re being told that the environment needs saving, but the government has made it a habit of lying to us.”

10. Appeal to Emotion

As human beings, we empathise with people or groups who claim to be sad, hurt, or scared. But such feelings are sometimes irrelevant to the main point.

“Why do you point your finger at immigrants who commit crimes when they’re often victims of horrible traumas themselves?”

11. Appeal to Popularity (The Bandwagon Effect)

The false belief that many people can’t all be wrong about the same thing at the same time. Both logic and history teach us that majorities can be objectively wrong. Still, it’s socially comforting to remind your opponent that the majority is on your side, as if that made your position correct by default.

“One billion flies can’t be wrong — eat shit.”

12. Appeal to Tradition

The false belief that something many people have believed for a long time can’t be wrong. Just because human beings have been religious throughout history doesn’t prove the existence of supernatural powers.

“This can’t be wrong since this has been common practice forever.”

13. Appeal to Nature

The assumption that whatever is natural is good, and that science, being at odds with nature, shouldn’t be trusted. The fact that we evolved without vaccines doesn’t automatically make vaccines dangerous.

“We trust in safekeeping our natural immune systems, and therefore we don’t trust in vaccines.”

14. Appeal to Ignorance

The misconception that a claim must be true if it can’t be proven false.

“Since God can’t be disproven, God must exist.”

15. Begging the Question

A circular argument in which the conclusion is assumed in one of the premises.

“God exists because it says so in the Bible.”

16. Equivocation

Using a word with multiple meanings interchangeably to gain the upper hand in an argument.

“I have the right to believe in God; therefore, believing in God is right.”

17. False Dichotomy (Black or White)

Assuming that something must be either A or B and not both. Media logic often dictates that narratives must be simplified and amplified to be understood quickly; as soon as something is two or more things simultaneously, we find it hard to wrap our heads around. But a person can be both good and bad simultaneously, or neither.

“You’re either with us (on everything), or you’re against us (on everything).”

18. Middle Ground Fallacy

The false notion that the correct answer must lie somewhere between two extreme positions. If someone says the water is cold and someone else says it’s hot, it’s easy to assume the temperature is somewhere in between. But it could still be hot, or freezing.

“One person said this is dangerous and another said it’s perfectly safe, so as long as I’m cautious, I should be fine.”

19. Decision Point Fallacy (Sorites Paradox)

The false assumption that because there’s no precise cut-off between two points, nothing meaningful can be said about either. We see the decision point fallacy in news media coverage of the coronavirus pandemic all the time: since no one can say precisely what number of deaths is high or low, many argue that we can’t know anything about any number.

“We haven’t had many pandemic deaths in Sweden because we know of other countries that have had lots of deaths — and since we haven’t had as many as those countries, we haven’t had many deaths.”

20. Slippery Slope Fallacy

The false assumption that one step in a specific direction must lead to many more steps in the same direction. Sure, eating a piece of candy could lead to over-consumption of sugar, which could lead to severe disease and ultimately cost you your life. But eating a piece of candy won’t lead directly to your death.

“If you eat that sandwich, you’ll get fat.”

21. Hasty Generalisations (Anecdotal Evidence)

Drawing general conclusions from a small and subjective sample. We often generalise from anecdotal evidence, especially when the anecdotes are our own individual experiences.

“I got mugged in the street yesterday, so clearly society is becoming increasingly unsafe.”

22. Faulty Analogy

The attempt to disqualify an argument by making a point using an irrelevant analogy. Analogies can be powerful communication tools, but it’s easy to make unfair or inaccurate comparisons.

“Abortion is murder.”

23. Burden of Proof

The attempt to push the opponent to disprove your claims. The burden of proof rests with the person making a claim, but a popular technique is to push that responsibility onto the person you’re arguing with.

“I believe in God, and you must prove me wrong to change my mind.”

24. Affirming the Consequent

An if-then statement doesn’t work in reverse: from “if P, then Q” and the observation that Q is true, it doesn’t follow that P is true.

“A cat meows, so everything that meows is a cat.”

25. Denying the Antecedent (Fallacy of the Inverse)

The mirror image of affirming the consequent: from “if P, then Q” and the fact that P is false, it doesn’t follow that Q is false.

“A cat meows, so if it doesn’t meow, it isn’t a cat.”
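These two invalid forms are easiest to see next to their valid counterparts. A minimal sketch in standard propositional logic, with P and Q as placeholders:

If P, then Q. P is true. Therefore, Q is true. (Valid: modus ponens.)
If P, then Q. Q is false. Therefore, P is false. (Valid: modus tollens.)
If P, then Q. Q is true. Therefore, P is true. (Invalid: affirming the consequent.)
If P, then Q. P is false. Therefore, Q is false. (Invalid: denying the antecedent.)

With P = “it’s a cat” and Q = “it meows,” both invalid forms fail for the same reason: something other than a cat might also meow.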

26. Moving the Goalposts

Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.

“Yes, there might be some innocent people in jail, but I was only talking about the ones who did commit their crimes.”

27. No True Scotsman

To dismiss counterexamples to a generalisation by redefining the category using a false or biased ideal.

“All real men have beards, so if you don’t have a beard, you can’t be a real man.”

28. Personal Incredulity

Just because you find something hard to believe or imagine doesn’t make it untrue.

“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”

29. False Causality

The false assumption that correlation equals causation.

“Crime rates went up when the price of gas went up, so for the sake of everyone’s safety, we must lower our taxes on fossil fuels.”

30. Texas Sharpshooter

To decide on your position first and then find only the data that supports it. This fallacy is especially prominent in the digital age, when it’s possible to find arguments online defending almost any imaginable position.

“I’ve found numerous studies supporting my position, and I have no idea if any studies support your position as well.”

31. Loaded Question

To ask a question with an assumption already built into the question.

“Have you stopped beating your wife?”

32. Chesterton’s Fence

If we don’t understand or see the reason for something, we might be inclined to do away with it. But most things have been put in place for a reason, even when that reason isn’t obvious to us. We should therefore leave the fence in place until we fully understand its purpose.

“There’s a fence here, but I can’t see what it’s good for, so let’s do away with it.”

33. Survivorship Bias

Survivorship bias means drawing conclusions from the survivors of a selection process while overlooking those that didn’t make it. I’ve already written about survivorship bias and Abraham Wald.

“All our returning warplanes have sustained damage to the wings, so we should reinforce the wings to make the planes safer.”

34. The Dunning-Kruger Effect

A cognitive bias in which people tend to overestimate their competence as they quickly make initial progress in a new field.

“I’ve just started learning about this, and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m an expert on this subject now.”

35. Confirmation Bias

Most of us tend to recall or interpret information in a way that reinforces our existing cognitive schemas.

“I refused to take my medicine and got well, so I always do my best to avoid treatment.”

36. Heuristic Anchoring

When faced with an initial number, we often compare any subsequent numbers to that initial anchor.

“The third house shown to us by the broker was over our budget but still a bargain compared to the first two houses she showed us.”

37. The Curse of Knowledge

We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.

“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”

38. Optimism/Pessimism Bias

We find it easier to believe that negative things can happen to others than to ourselves. Some people are biased in the opposite direction; they overestimate the likelihood of adverse events.

“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us — only worse.”

39. The Sunk Cost Fallacy

Sometimes we stick to a behaviour simply because we’ve already invested time, money, and other resources. To abandon such an investment would force us to face an irreversible failure.

“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”

40. Negativity Bias

We tend to react more strongly to negative impacts than to positive effects of similar or equal weight.

“Our daughter graduated with honours from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”

41. Declinism

We tend to think that everything is in decline, especially in the face of new developments. This might be due to cognitive laziness; we don’t wish to change how we feel in tandem with the times. In the context of media science, I’ve written about declinism in The curious case of Cambridge Analytica and the big data techlash.

“Things are continually just getting worse.”

42. The Backfire Effect

When challenged, we might cling even more firmly to our beliefs instead of questioning ourselves. I’ve also written about this in How to fight populism.

“People seem to hate our ideology, but this only proves that we’re right about everything.”

43. The Fundamental Attribution Error

When someone else makes a mistake, we tend to attribute it to their character or behaviour. But when we make mistakes ourselves, we attribute them to contextual circumstances.

“People are idiots in traffic, especially when I’m stressed.”

44. In-Group Bias

We have evolved to be preferential to people who belong to our own social group. This isn’t necessarily bad behaviour per se, but we must watch out for situations where we can’t be expected to be fair and objective.

“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”

45. The Forer Effect (The Barnum Effect)

We tend to fill any gaps in the information we’re given using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that vague statements apply just as well to many others as to ourselves.

“I read my horoscope yesterday, and the information was uncannily accurate, so I’m certainly convinced that there are some things about the cosmos that influence our lives in a way that we can’t yet understand.”

46. Cognitive Dissonance

We tend to sort information based on our existing cognitive schemas (see the Forer effect). One outcome is that we tend to disregard any information that sits poorly with what we already believe while quickly absorbing anything that confirms our beliefs.

“The Earth is flat, and I haven’t seen any credible evidence to the contrary.”

47. The Hostile Media Effect

This can be seen as the media-science equivalent of the backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased towards their opposition. The effect is even stronger if the individual believes in a silent majority that is particularly susceptible to erroneous or misleading media coverage.

“I know the media is telling me I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”

48. Cherry Picking (The Fallacy of Incomplete Evidence)

This fallacy is closely related to the Texas sharpshooter and the fallacy of division. Especially online, this fallacy fuels most of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make almost any case seem plausible. As an example, see this fragmented yet powerful attempt by the author Rupert Furneaux to cast doubt on the existence of Napoleon Buonaparte, one of the most famous characters in history:

1. The name Napoleon is just a variation of Apoleon or Apollo, the God of the Sun, and he was named Buonaparte, which means “the good part of the day” (when the Sun shines).

2. Just as Apollo was born on the Mediterranean island Delos, Napoleon was born on the Mediterranean island Corsica.

3. Napoleon’s mother Letitia can be identified as Leto, Apollo’s mother. Both names mean joy and happiness, signalling the Sun keeping the night at bay.

4. Letitia had three daughters — as did Leto, Apollo’s mother.

5. Napoleon’s four brothers represent the four seasons. Three brothers became kings; the fourth became Prince of Canino (derived from ‘can,’ white, winter, ageing).

6. Napoleon was driven out of France by Northern armies, just as Apollo, the Sun God, was driven away by the North Wind.

7. Napoleon had two wives, as did Apollo. They represent the Earth and the Moon. Apollo never had any children with the Moon, but the Earth gave him a son, symbolising the fertilisation of all green plants on Earth. Napoleon’s son was allegedly born on the 21st of March, the spring equinox, when the plane of Earth’s equator passes through the Sun’s centre.

8. Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls).

9. Napoleon’s twelve generals are symbols for the twelve creatures of the zodiac, and his four generals represent North, West, South, and East.

10. Napoleon, the Sun Myth, always conquered the South but was always defeated by the cold winds of the North. Like the Sun, Napoleon rose in the East (he was born in Corsica) and set in the West (he died on St. Helena).

49. The Spiral of Silence

Most social animals harbour an instinctive fear of isolation, and in-groups maintain their cultural stability partially by excluding individuals with non-conforming opinions or behaviours. This can create a culture where group members self-censor their views and behaviours by going silent.

“My opinions are perceived as wrong, and it’s better for everyone if I stay silent.”

50. The Yes Ladder

This is a marketing exploit where the persuader aims to get you to say yes to something substantial (“big ask”) by methodically getting you to say yes to something smaller first (“small ask”).

“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter, and via the newsletter, I downloaded a free photo book with pink umbrellas—and now I own five pink umbrellas.”

51. Bystander Effect

People are less inclined to offer support or aid when many others could also step in.

“Everyone cares deeply about personal safety, so everyone will download our new CSR app to help each other.”

52. Reciprocation Effect

We often feel obligated to reciprocate if someone is friendly or generous towards us. While this is a beautiful and expected part of human behaviour, it’s something that special interests can take advantage of.

“I can’t believe the car broke down so fast — the guy I bought it from threw in so many extra features.”

53. Commitment and Consistency

Once we commit to something, we invest a part of ourselves in that decision. This makes it harder for many of us to abandon such commitments, because doing so would mean giving up on a part of ourselves. This bias is closely related to yes ladders, declinism, the appeal to tradition, and the sunk cost fallacy.

“I’ve made my decision, and therefore I’m sticking with it.”

54. The Fallacy of Social Proof

This fallacy is the commercial extension of the bandwagon effect; by showcasing social proof, we are comforted by decisions made by others. Ideally, we should always ensure that reviews and engagement displays are relevant (and accurate) before making any decisions, but this doesn’t always happen.

“Their product seems to have many happy users, so the risk of getting scammed is low.”

55. Liking and Likeness

“We prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion. This bias is closely related to the in-group fallacy described above. According to Social Media Examiner, a few extensions of this principle are:

We like people who are similar to us.
We like people who compliment us.
We like things that are familiar to us.
Cooperation toward joint efforts inspires increased liking.
An innocent association with bad or good things will influence how people feel about us.
Physical attractiveness creates a halo effect and typically invokes the principle of liking.

“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust every word he says?”

56. The Appeal to Authority

It’s difficult to distinguish between perceived authority and indisputable authority. Many companies use testimonials from people with impressive titles — and it works. This fallacy is closely related to the fallacious appeal to authority.

“Several leading doctors recommended this product, and that’s true because the ad with the doctors said so.”

57. The Principle of Scarcity

Most of us are scared of missing out (FOMO). This makes us perceive things as more valuable the rarer they are.

“I’m so happy I managed to snag that pink unicorn umbrella before the discount ran out!”

58. Loss Aversion

The pain of losing can be psychologically twice as powerful as the joy of winning. We often take disproportionate risks to avoid losses compared to the risks we’re willing to take to win. This bias is closely related to commitment and consistency and the sunk cost fallacy.

“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
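For readers who want the numbers behind the “twice as powerful” claim: prospect theory (Kahneman and Tversky) models this asymmetry with a value function that is steeper for losses than for gains. A minimal sketch of the standard formulation, using the parameter values from Tversky and Kahneman’s classic estimates:

v(x) = x^0.88 for gains (x ≥ 0)
v(x) = -2.25 · (-x)^0.88 for losses (x < 0)

The loss-aversion coefficient λ ≈ 2.25 means a loss weighs roughly twice as much as an equally large gain, which is why losing a sum tends to hurt more than winning the same sum pleases.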

Cover photo by Jerry Silfwer.

Reference List

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35, 124–140.

Cialdini, R. (2006). Influence: The psychology of persuasion (revised edition). New York: Harper Business.

Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.

Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with a foreword by former APA President Dr Diane F. Halpern.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.

Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118–121.

Kahneman, D. (2011). Thinking, fast and slow. London: Penguin.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.

Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49–68.

Simon, H. A. (1957). Models of man. New York: Wiley.

Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178–181.

Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183–206.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100(4), 930–941.
