All of us are prone to logical fallacies and cognitive biases.
I know that I’m stupid sometimes — most of us are.
Still, we should all strive to be less stupid.
I’m deeply fascinated with studying logical fallacies and cognitive biases. Learning about human behaviours is helpful in public relations, where we deal with communication challenges daily.
Here we go:
- 1. Fallacy of Composition
- 2. Fallacy of Division
- 3. The Gambler’s Fallacy
- 4. Tu Quoque (Who Are You To Talk?)
- 5. Strawman
- 6. Ad Hominem
- 7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)
- 8. Fallacious Appeal to Authority
- 9. Red Herring
- 10. Appeal to Emotion
- 11. Appeal to Popularity (The Bandwagon Effect)
- 12. Appeal to Tradition
- 13. Appeal to Nature
- 14. Appeal to Ignorance
- 15. Begging the Question
- 16. Equivocation
- 17. False Dichotomy (Black or White)
- 18. Middle Ground Fallacy
- 19. Decision Point Fallacy (Sorites Paradox)
- 20. Slippery Slope Fallacy
- 21. Hasty Generalisations (Anecdotal Evidence)
- 22. Faulty Analogy
- 23. Burden of Proof
- 24. Affirming the Consequent
- 25. Denying the Antecedent (Fallacy of the Inverse)
- 26. Moving the Goalposts
- 27. No True Scotsman
- 28. Personal Incredulity
- 29. False Causality
- 30. Texas Sharpshooter
- 31. Loaded Question
- 32. Chesterton’s Fence
- 33. Survivorship Bias
- 34. The Dunning-Kruger Effect
- 35. Confirmation Bias
- 36. Heuristic Anchoring
- 37. The Curse of Knowledge
- 38. Optimism/Pessimism Bias
- 39. The Sunk Cost Fallacy
- 40. Negativity Bias
- 41. Declinism
- 42. The Backfire Effect (Conversion Theory)
- 43. The Fundamental Attribution Error
- 44. In-Group Bias
- 45. The Forer Effect (The Barnum Effect)
- 46. Cognitive Dissonance
- 47. The Hostile Media Effect
- 48. Cherry-Picking (The Fallacy of Incomplete Evidence)
- 49. The Spiral of Silence
- 50. The Yes Ladder
- 51. Bystander Effect
- 52. Reciprocation Effect
- 53. Commitment and Consistency
- 54. The Fallacy of Social Proof
- 55. Liking and Likeness
- 56. The Appeal to Authority
- 57. The Principle of Scarcity (FOMO)
- 58. Loss Aversion
1. Fallacy of Composition
Fallacy of composition: “Since our top salesperson is a great public speaker, our entire sales team must also be excellent public speakers.”
The fallacy of composition, a prevalent cognitive bias in decision-making, arises when individuals erroneously infer that the attributes of a single component or a select few components within a larger system extend to the entire system.
This fallacious thinking may manifest in various contexts — from organizational strategy to market analysis — and can lead to misguided decisions with potentially adverse consequences.
To avoid falling prey to this fallacy, business leaders must engage in thoughtful and rigorous analysis, recognizing that the dynamics of complex systems may not always mirror the characteristics of their parts and that a more holistic approach is necessary to navigate the intricacies of today’s ever-evolving business landscape.
2. Fallacy of Division
Fallacy of division: “Our company is a market leader, so every employee within our organization must be an expert in their respective field.”
The fallacy of division emerges as a subtle yet significant cognitive trap, enticing decision-makers to mistakenly assume that the properties of a collective whole must inherently apply to its components.
This flawed logic can lead to erroneous conclusions and ill-informed decisions, particularly in organizational dynamics, where unique elements within a system may not conform to the overarching characteristics of the larger entity.
To counteract this fallacy, business leaders must adopt a nuanced approach, cultivating an understanding that the intricacies of complex systems demand careful consideration of the distinct attributes and interactions of their constituent parts rather than relying on simplistic generalizations that may obscure critical insights.
3. The Gambler’s Fallacy
Gambler’s fallacy: “We’ve had three failed product launches in a row; our next product is a guaranteed success.”
The gambler’s fallacy, a widespread cognitive bias often encountered in decision-making, stems from the erroneous belief that past events can influence the probability of future independent events.
This misleading notion can lead to faulty assumptions and misguided decisions, particularly in business contexts where uncertainty and randomness play a prominent role.
To mitigate the risks associated with the gambler’s fallacy, executives must develop a data-driven mindset acknowledging the independence of discrete events and leveraging statistical analysis to inform strategic choices, thereby fostering more accurate assessments of probability and more informed decision-making in an unpredictable business landscape.
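The independence of discrete events is easy to check empirically. Below is a minimal, hypothetical Python simulation (all names are illustrative) showing that in a fair coin-flip sequence, the chance of heads immediately after three tails in a row stays at roughly 50%, no matter how strongly intuition suggests a “correction” is due:

```python
import random

random.seed(42)

# Simulate a long sequence of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the outcome that follows every run of three tails in a row.
# If past flips influenced future ones, this conditional frequency
# would drift away from 0.5.
after_three_tails = [
    flips[i]
    for i in range(3, len(flips))
    if not flips[i - 1] and not flips[i - 2] and not flips[i - 3]
]

rate = sum(after_three_tails) / len(after_three_tails)
print(f"P(heads | three tails in a row) is roughly {rate:.3f}")
```

The conditional frequency lands close to 0.5: each flip is independent, so a streak of failures tells us nothing about the next outcome.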
4. Tu Quoque (Who Are You To Talk?)
Tu quoque: “Our competitor’s CEO is criticizing our environmental policies, but their own company has had pollution issues in the past.”
The tu quoque fallacy, colloquially known as the “who are you to talk?” argument, represents a pernicious rhetorical tactic employed to deflect criticism or undermine an opponent’s position by highlighting their perceived hypocrisy or inconsistency rather than addressing the substance of the argument itself.
In the context of business discourse, this ad hominem attack can derail productive conversations and obscure valuable insights, potentially stifling innovation and collaboration.
To foster a more constructive dialogue, organizational leaders must cultivate an environment that encourages open and honest communication, focusing on the merits of the ideas presented and discouraging personal attacks or appeals to hypocrisy, thereby empowering individuals to engage in reasoned debate and contribute to the collective pursuit of excellence.
5. Strawman
Strawman: “Our colleague wants to cut costs, but I doubt they’d be happy if we had to compromise the quality of our products and lose customers as a result.”
The strawman fallacy, a deceptive rhetorical manœuvre often encountered in business discourse, involves misrepresenting an opponent’s argument by constructing a distorted or oversimplified version of their stance, which is then easier to refute or discredit.
This misleading tactic can obstruct meaningful dialogue, engender hostility, and inhibit the exploration of nuanced perspectives necessary for driving innovation and informed decision-making.
To foster a collaborative and intellectually rigorous environment, organisational leaders must emphasize the importance of engaging with the substance of arguments presented, encouraging participants to actively listen, seek clarification, and challenge ideas constructively, ultimately advancing the collective pursuit of knowledge and organizational success.
6. Ad Hominem
Ad hominem: “I wouldn’t trust that marketing proposal – it was created by someone known for their disorganization.”
The ad hominem fallacy, a detrimental form of argumentation frequently encountered in professional discourse, occurs when an individual targets an opponent’s personal attributes or character traits rather than addressing the substance of their argument.
This diversionary tactic can hinder productive discussion, impede the flow of valuable insights, and foster a toxic work environment, undermining the collaborative spirit essential to organizational success.
To create a culture of open and respectful dialogue, business leaders must actively discourage ad hominem attacks, encourage team members to engage with the merits of ideas presented, foster an atmosphere of intellectual rigour, and promote an inclusive environment where diverse perspectives can flourish and contribute to the organization’s growth and innovation.
7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)
Genetic fallacy: “The marketing strategy proposed by our newest team member can’t be any good; they’ve only been with the company for a few months.”
The genetic fallacy, also known as the fallacy of origin or fallacy of virtue, is a flawed reasoning pattern that arises when an argument’s validity or worth is assessed based on its source or origin rather than the argument’s merits.
This cognitive bias can obstruct the objective evaluation of ideas in a business context, potentially leading to missed opportunities, stifled innovation, or unwise strategic decisions.
To counteract the influence of the genetic fallacy, organisational leaders must cultivate a culture of intellectual openness, emphasizing the importance of engaging with the substance of ideas, regardless of their origins, and fostering an environment where critical thinking, reasoned debate, and the free exchange of diverse perspectives can thrive, ultimately driving informed decision-making and organizational success.
8. Fallacious Appeal to Authority
Fallacious appeal to authority: “We should invest in this new technology because a famous entrepreneur mentioned it in a recent podcast.”
The fallacious appeal to authority is a deceptive form of argumentation that occurs when an individual invokes the opinion or endorsement of a purported expert to bolster their position, despite the expert’s lack of relevant expertise or credibility on the subject.
In a business context, this cognitive bias can lead to ill-informed decisions, misplaced trust, and potentially detrimental consequences for organizational performance.
To safeguard against the fallacious appeal to authority, business leaders must foster a culture of critical thinking, promoting evidence-based decision-making and encouraging team members to scrutinize the credibility and relevance of expert opinions, ensuring that strategic choices are informed by rigorous analysis and well-founded expertise, rather than mere assertions of authority.
9. Red Herring
Red herring: “We shouldn’t worry about our declining market share; after all, our office just won an award for its eco-friendly design.”
The red herring fallacy, a cunning diversionary tactic often encountered in professional discourse, involves introducing an unrelated or tangential issue to distract from the original argument or issue at hand.
This deceptive manœuvre can undermine productive dialogue, hinder the pursuit of meaningful solutions, and impede the collaborative exchange of ideas essential to driving innovation and organizational success.
To foster a focused and intellectually honest environment, business leaders must emphasize the importance of staying on topic and addressing the substance of arguments, cultivating a culture of active listening and disciplined discussion that allows for the thoughtful examination of critical issues, ultimately promoting well-informed decision-making and the organization’s ability to navigate complex challenges effectively.
10. Appeal to Emotion
Appeal to emotion: “We can’t outsource our manufacturing overseas; think about the impact on our local employees’ families.”
The appeal to emotion fallacy, a manipulative tactic frequently observed in professional and personal interactions, involves leveraging emotional triggers to persuade or influence others, sidestepping the merits of the argument or the rationality of the underlying facts.
In a business context, this fallacy can lead to hasty decisions, impede objective evaluation, and inhibit the collaborative exchange of ideas crucial for driving innovation and sound decision-making.
To counteract the appeal to emotion, organizational leaders must foster a culture of critical thinking, emphasizing the importance of evidence-based reasoning and rational deliberation while also acknowledging the role of emotions in human decision-making and encouraging employees to strike a balance between emotional intelligence and analytical rigour in navigating the complexities of the business landscape.
11. Appeal to Popularity (The Bandwagon Effect)
Appeal to popularity: “We should implement the same remote work policy as the leading tech companies; if it’s good enough for them, it must be good for us.”
The appeal to popularity, also known as the bandwagon effect, is a fallacious form of argumentation that relies on the widespread acceptance or popularity of an idea or course of action as sufficient evidence of its validity or efficacy.
In business, succumbing to this fallacy can lead to herd mentality, stifled innovation, and suboptimal decision-making. Organizations that follow prevailing trends risk neglecting rigorous analysis and thoughtful deliberation.
To counteract the bandwagon effect, business leaders must cultivate a culture that values independent thinking and evidence-based decision-making, encouraging team members to critically assess popular beliefs and practices and fostering an environment where diverse perspectives can be openly shared and debated, ultimately driving informed decision-making and sustained organizational success.
12. Appeal to Tradition
Appeal to tradition: “We’ve always used this software for our project management, so there’s no reason to consider alternatives now.”
The appeal to tradition fallacy, a pervasive cognitive bias in decision-making, occurs when an individual argues that a particular belief or practice should be maintained simply because it has been long-standing or customary.
In a business context, this fallacy can hinder innovation, stifle adaptation to changing market conditions, and perpetuate outdated or inefficient practices, potentially undermining an organization’s ability to compete and grow.
To counter the appeal to tradition, astute business leaders must foster a culture that embraces continuous improvement and adaptation, encouraging team members to evaluate long-held beliefs and practices critically and to consider novel approaches that may offer more effective solutions to the challenges of a rapidly evolving business landscape.
13. Appeal to Nature
Appeal to nature: “We should switch to a completely organic ingredient supplier, even if it’s more expensive, because natural products are always better.”
The appeal to nature fallacy emerges when an individual asserts that something is inherently good or superior simply because it is deemed natural or unaltered while dismissing or devaluing alternatives that may be perceived as artificial or synthetic.
In the business world, this fallacy can lead to suboptimal decision-making, an aversion to innovation, and an overreliance on traditional or ‘natural’ solutions that may not effectively address contemporary challenges.
To navigate this cognitive bias, savvy business leaders must encourage a culture of critical thinking and open-mindedness, promoting evidence-based decision-making that carefully evaluates the advantages and drawbacks of various options, whether they are rooted in nature or human ingenuity, thereby fostering an environment that supports innovation, adaptability, and sustainable growth.
14. Appeal to Ignorance
Appeal to ignorance: “No one has proven that our new public relations campaign won’t work, so it must be a good idea.”
The appeal to ignorance fallacy arises when an individual contends that a claim is valid simply because it has not been proven false, or vice versa, exploiting gaps in knowledge or evidence to bolster their argument.
In a business context, this fallacy can lead to misguided decision-making, overconfidence in unverified assumptions, and a disregard for the importance of thorough analysis and evidence-based reasoning.
To mitigate the risks associated with the appeal to ignorance, astute business leaders must cultivate a culture that values intellectual humility, emphasizing the importance of recognizing and addressing knowledge gaps, seeking reliable evidence to inform decision-making, and fostering an environment where team members are encouraged to continually learn, adapt, and refine their understanding of the complex and ever-evolving business landscape.
15. Begging the Question
Begging the question: “Our company’s products are the best on the market because we provide the highest quality.”
The begging-the-question fallacy, a subtle yet problematic form of circular reasoning, occurs when an argument’s conclusion is assumed within its premises, sidestepping the need for genuine evidence or logical support.
In the business world, this fallacy can lead to unfounded assumptions, superficial analyses, and misguided decision-making that may undermine an organization’s ability to navigate challenges and seize opportunities effectively.
To counteract the risk of begging the question, business leaders must foster a culture that values critical thinking, open inquiry, and evidence-based decision-making, encouraging team members to rigorously examine the premises of their arguments, identify and address any underlying assumptions, and engage in a constructive, reasoned debate that drives innovation, growth, and sustainable success.
16. Equivocation
Equivocation: “Our sales figures are certainly interesting, which means they’re worth considering for future strategy.”
Equivocation, a deceptive rhetorical strategy frequently encountered in professional discourse, occurs when an individual exploits the ambiguity or multiple meanings of a word or phrase to create confusion or mislead their audience, effectively avoiding a clear or direct response to an argument or question.
In a business context, equivocation can obstruct meaningful communication, hinder the effective exchange of ideas, and undermine trust among team members, ultimately impeding innovation and sound decision-making.
To promote transparency and intellectual honesty within an organization, business leaders must emphasize the importance of clear and precise language, encouraging team members to seek clarification when faced with ambiguous statements and fostering a culture of open dialogue that values the rigorous examination of ideas and constructive debate, driving informed decision-making and sustained organizational success.
17. False Dichotomy (Black or White)
False dichotomy: “We either need to cut costs drastically, or we have to increase our prices significantly — there’s no other way to improve our profit margin.”
The false dichotomy fallacy, also known as the black or white fallacy, arises when an individual presents a complex issue or decision as having only two mutually exclusive options, effectively oversimplifying the matter and ignoring alternative perspectives or potential solutions.
In a business context, this fallacious reasoning can stifle creativity, hinder comprehensive problem-solving, and lead to suboptimal decision-making, ultimately constraining an organization’s ability to adapt and innovate in a rapidly evolving landscape.
To counteract the risks associated with false dichotomies, business leaders must encourage critical thinking and open-mindedness, foster an environment that values exploring nuanced perspectives and diverse approaches, and empower team members to engage in collaborative problem-solving that drives innovation.
18. Middle Ground Fallacy
Middle ground fallacy: “Our team is divided on whether to invest in research and development or marketing, so let’s allocate half our budget to each and satisfy everyone.”
The middle ground fallacy, a deceptive form of argumentation, occurs when an individual asserts that a compromise or middle point between two opposing positions must inherently represent the correct or most reasonable solution, neglecting the possibility that one or both extremes may hold merit or that the optimal solution may lie elsewhere.
In a business context, this fallacy can lead to suboptimal decision-making, fostering a false sense of consensus and potentially overlooking innovative or superior solutions.
To guard against the middle ground fallacy, business leaders must promote a culture of critical thinking and open debate, encouraging team members to examine the strengths and weaknesses of various perspectives rigorously and fostering an environment that supports collaborative problem-solving and the pursuit of evidence-based, well-informed solutions.
19. Decision Point Fallacy (Sorites Paradox)
Decision point fallacy: “We can’t determine the exact point at which adding more features to our product will make it too complex for our users, so let’s keep adding features without considering the potential downsides.”
The decision point fallacy, also known as the Sorites Paradox, arises when an individual struggles to identify a precise threshold or turning point within a series of incremental changes, leading to flawed reasoning or indecision.
This cognitive bias can manifest in a business context when decision-makers become mired in the minutiae of continuous improvement or incremental progress, losing sight of the bigger picture and ultimately hampering their ability to make strategic choices.
To counteract the decision point fallacy, organizational leaders must foster a culture emphasising the importance of establishing clear objectives, maintaining a holistic perspective, and striking a balance between incremental progress and decisive action, empowering team members to navigate complex challenges and drive sustained success.
20. Slippery Slope Fallacy
Slippery slope fallacy: “If we allow our employees to work remotely for one day a week, productivity will plummet, and soon everyone will be demanding a completely flexible schedule, resulting in chaos and the collapse of our company culture.”
The slippery slope fallacy occurs when an individual argues that a specific action or decision will inevitably lead to a chain of negative consequences without providing sufficient evidence for this causal relationship.
In a business context, this fallacious reasoning can undermine productive dialogue, stifle innovation, and promote an overly cautious approach to problem-solving, ultimately inhibiting an organization’s ability to adapt and grow.
To guard against the slippery slope fallacy, business leaders must foster a culture that values evidence-based decision-making and encourages team members to critically examine their arguments’ logic and assumptions, promoting a balanced and objective assessment of potential risks and opportunities that drives informed decision-making and sustained success.
21. Hasty Generalisations (Anecdotal Evidence)
Hasty generalisations: “One of our remote employees missed a deadline last month, which clearly shows that allowing employees to work remotely leads to decreased productivity and a lack of accountability.”
Hasty generalizations, often fueled by anecdotal evidence, occur when an individual draws broad conclusions based on insufficient or unrepresentative data, resulting in potentially flawed or biased reasoning.
In a business context, relying on hasty generalizations can lead to misguided decision-making, suboptimal strategies, and an inability to effectively address complex challenges, ultimately impeding an organization’s success.
To counteract the risks associated with hasty generalizations, business leaders must emphasize the importance of thorough analysis, evidence-based decision-making, and critical thinking, encouraging team members to recognize the limitations of anecdotal evidence and consider diverse perspectives, fostering a culture that values rigorous inquiry and comprehensive problem-solving.
22. Faulty Analogy
Faulty analogy: “Managing a business is like riding a bicycle; once you’ve learned the basics, it’s all about maintaining balance and momentum, so we don’t need to invest in ongoing professional development for our employees.”
The faulty analogy fallacy arises when an individual draws a comparison between two concepts or situations that are not sufficiently alike, resulting in misleading or unsupported conclusions.
In a business context, relying on faulty analogies can impede effective problem-solving, foster misconceptions, and contribute to ill-advised decision-making, ultimately undermining an organization’s ability to innovate and succeed.
To guard against the pitfalls of faulty analogies, business leaders must cultivate a culture that values critical thinking, logical rigour, and evidence-based reasoning, encouraging team members to scrutinize their comparisons’ validity and seek out diverse perspectives that challenge assumptions and promote nuanced understanding.
23. Burden of Proof
Burden of proof: “Our new marketing strategy will boost sales by at least 20%; if you don’t believe me, prove me wrong.”
The burden of proof fallacy occurs when an individual asserts a claim without providing sufficient evidence, often shifting the responsibility onto others to disprove the assertion.
In a business context, this fallacious reasoning can hinder productive discourse, foster unwarranted assumptions, and contribute to flawed decision-making, ultimately impeding an organization’s ability to navigate challenges effectively and capitalize on opportunities.
To mitigate the risks associated with the burden of proof fallacy, business leaders must promote a culture of evidence-based reasoning, critical thinking, and intellectual accountability, encouraging team members to substantiate their claims with robust supporting evidence and to engage in a constructive, well-informed debate that drives innovative problem-solving and sustainable success.
24. Affirming the Consequent
Just because “if A, then B” is true, it doesn’t follow that observing B proves A; the conditional only runs one way.
“A cat meows, so everything that meows is a cat.”
25. Denying the Antecedent (Fallacy of the Inverse)
If “if A, then B” is true, it doesn’t follow that “if not A, then not B”; B may still occur without A.
“A cat meows, so if it doesn’t meow, it isn’t a cat.”
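Both of these argument forms can be checked mechanically with a truth table. The sketch below (a simple illustration; the helper names are made up) enumerates every combination of truth values and confirms that modus ponens is valid while affirming the consequent and denying the antecedent are not:

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def valid(premises, conclusion):
    # An argument form is valid only if the conclusion holds in every
    # row of the truth table where all premises hold.
    return all(
        conclusion(p, q)
        for p, q in product([True, False], repeat=2)
        if all(prem(p, q) for prem in premises)
    )

# Modus ponens (valid): if P then Q; P; therefore Q.
print(valid([implies, lambda p, q: p], lambda p, q: q))          # True

# Affirming the consequent (invalid): if P then Q; Q; therefore P.
print(valid([implies, lambda p, q: q], lambda p, q: p))          # False

# Denying the antecedent (invalid): if P then Q; not P; therefore not Q.
print(valid([implies, lambda p, q: not p], lambda p, q: not q))  # False
```

The counterexample in both invalid cases is the row where P is false and Q is true: something other than a cat can meow.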
26. Moving the Goalposts
Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.
“Yes, there might be some innocent people in jail, but I was only talking about the guilty.”
27. No True Scotsman
To disqualify someone or something based on a false or biased ideal.
“All real men have beards, so if you don’t have a beard, you can’t be a real man.”
28. Personal Incredulity
Just because you find something hard to believe or imagine doesn’t make it untrue.
“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”
29. False Causality
The false assumption that correlation equals causation.
“Crime rates went up when the price of gas went up, so for everyone’s safety, we must lower our taxes on fossil fuels.”
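The gas-price example can be illustrated with a short simulation. In the hypothetical sketch below, two completely unrelated series both drift upward over time, and time alone makes them correlate strongly; the variable names are invented for illustration:

```python
import random

random.seed(7)

# Two unrelated quantities that both trend upward over time (a gas price
# and a crime index, both hypothetical) will correlate strongly with each
# other even though neither causes the other: time is the confounder.
n = 200
gas_price = [1.0 + 0.01 * t + random.gauss(0, 0.05) for t in range(n)]
crime_idx = [50.0 + 0.2 * t + random.gauss(0, 1.0) for t in range(n)]

def pearson(xs, ys):
    # Plain Pearson correlation coefficient, computed from scratch.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(f"correlation is roughly {pearson(gas_price, crime_idx):.2f}")
```

A correlation near 1.0 here proves nothing about causation: both series were generated independently, and only their shared trend links them.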
30. Texas Sharpshooter
Deciding on a position first, then finding only the data that supports it. This fallacy is especially prominent in the digital age, when it’s possible to find arguments online defending almost any imaginable position.
“I’ve found numerous studies supporting my position, and I have no idea if any studies also support your position.”
31. Loaded Question
Asking a question with an assumption already built into it.
“Have you stopped beating your wife?”
32. Chesterton’s Fence
If we don’t understand or see the reason for something, we might be inclined to do away with it. However, most things have been put in place for a reason, even if we don’t understand it. We should therefore leave something be unless we fully understand its purpose.
“There’s a fence here, but I can’t see what it’s good for, so let’s do away with it.”
33. Survivorship Bias
I’ve already written about survivorship bias and Abraham Wald.
“All the warplanes that returned had damage to their wings, so we should reinforce the wings to make them safer.”
Read also: Survivorship Bias: Abraham Wald and the WWII Airplanes
34. The Dunning-Kruger Effect
A cognitive bias where people overestimate their competence as they begin to progress in a new field. (Please note that the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.)
“I’ve just started learning about this, and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m an expert on this subject now.”
35. Confirmation Bias
Most of us tend to recall or interpret information in a way that reinforces our existing cognitive schemas.
“I refused to take my medicine and got well, so I always do my best to avoid treatment.”
36. Heuristic Anchoring
When faced with an initial number, we often compare subsequent numbers to that anchor.
“The third house shown to us by the broker was over our budget but still a bargain compared to the first two houses she showed us.”
37. The Curse of Knowledge
We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.
“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”
38. Optimism/Pessimism Bias
We find it easier to believe that negative things can happen to others than to ourselves. But some people are biased in the opposite direction; they overestimate the likelihood of adverse events.
“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us — only worse.”
39. The Sunk Cost Fallacy
Sometimes we stick to a behaviour simply because we’ve already invested time, money, and other resources. To abandon such an investment would force us to face an irreversible failure.
“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”
40. Negativity Bias
We tend to react more strongly to negative impacts than to positive effects of similar or equal weight.
“Our daughter graduated with honours from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”
41. Declinism
We tend to think that everything will decline, especially in the face of new developments. This might be due to cognitive laziness; we don’t wish to change how we feel in tandem with the times.
“Everything was better in the past, so change is terrible.”
Read also: Social Media — The Good, The Bad, and the Ugly
42. The Backfire Effect (Conversion Theory)
When challenged, we might cling even firmer to our beliefs — instead of questioning ourselves.
“People hate us, but this proves us right about everything.”
The Conversion Theory: The Misrepresented Minority
The disproportional power of minorities is known as the conversion theory.
How does it work?
The social cost of holding a different view than the majority is high. This increased cost explains why minorities often hold their opinions more firmly. It takes determination to go against the norm. (Moscovici, S. (1980). Toward a theory of conversion behaviour. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology, 13, 209–239. New York: Academic Press.)
In contrast, many majority members don’t hold their opinions so firmly. They might belong to the majority for no other reason than that everyone else seems to. (Chryssochoou, X. and Volpato, C. (2004). Social Influence and the Power of Minorities: An Analysis of the Communist Manifesto. Social Justice Research, 17(4), 357–388.)
“In groups, the minority can have a disproportionate effect, converting many ‘majority’ members to their own cause. This is because many majority group members are not strong believers in its cause. They may be simply going along because it seems easier or that there is no real alternative. They may also have become disillusioned with the group purpose, process, or leadership and are seeking a viable alternative.”
According to conversion theory, while majorities often rely on normative social influence, minorities strive for the ethical high ground.
Given the power of normative social influence, minorities must stick together in tight-knit groups that can verbalise the same message repeatedly.
Read also: Conversion Theory: The Disproportionate Influence of Minorities
43. The Fundamental Attribution Error
When someone else makes a mistake, we attribute it to their character or behaviour, but when we make mistakes ourselves, we tend to attribute them to contextual circumstances.
“When I’m in a rush, people behave like idiots in traffic.”
44. In-Group Bias
We have evolved to favour people who belong to our own social group. This isn’t necessarily bad behaviour per se, but we should watch out for situations where we can’t be expected to be fair and objective.
“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”
Read also: Social Group Sizes (The Social Brain Hypothesis)
45. The Forer Effect (The Barnum Effect)
We tend to fill gaps in the information we’re given using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that its vague statements would apply just as well to many others.
“I read my horoscope yesterday, and the information was uncannily accurate, so I’m certainly convinced that there are some things about the cosmos that influence our lives in a way that we can’t yet understand.”
46. Cognitive Dissonance
We tend to sort information based on our existing cognitive schemas. One outcome is that we tend to disregard any information that sits poorly with what we already believe while quickly absorbing anything that confirms our beliefs.
“The Earth is flat, and I haven’t seen any credible evidence to the contrary.”
47. The Hostile Media Effect
This can be seen as the media-science equivalent of the psychological backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased against their side. The effect is even stronger if the individual believes in a silent majority that is particularly susceptible to erroneous or misleading media coverage.
“I know the media is telling me I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”
The Hostile Media Effect
Do you think that the news media is biased against your beliefs? Well, they might be. And they might also not be.
Researchers have found that individuals tend to see the news media as biased against them — even when it’s not:
“The hostile media effect […] is a perceptual theory of mass communication that refers to the tendency for individuals with a strong preexisting attitude on an issue to perceive media coverage as biased against their side and in favour of their antagonists’ point of view.”
Source: Wikipedia [4]
Are we paranoid? Do we see bias in the news media that isn’t there? In short: Yes.
The hostile media effect doesn’t imply that the media is never biased. Still, research shows that opposing groups often regard the same article as biased against them and in favour of their opponents.
The existence of the hostile media effect is scientifically well-established, but we still don’t know precisely why it persists:
“The hostile media perception, the tendency for partisans to judge mass media coverage as unfavorable to their own point of view, has been vividly demonstrated but not well explained. This contrast bias is intriguing because it appears to contradict a robust literature on assimilation biases — the tendency to find information more supportive, rather than more opposed, to one’s own position. […] content evaluations based on perceived influence on oneself vs influence on a broader audience suggested that the hostile media perception may be explained by perceived reach of the information source.”
Source: Journal of Communication [5]
Research suggests that the primary driver could be fear of one’s opponents gaining strength; the hostile media effect can therefore be seen as a psychological defence mechanism.
Read also: The Hostile Media Effect: How We Demonise the News Media
48. Cherry-Picking (The Fallacy of Incomplete Evidence)
This fallacy is closely related to Texas sharpshooter and the fallacy of division. Cherry-picking fuels most of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make a case for almost anything.
“Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls). Therefore, Napoleon is a myth.”
Read also: Napoleon the Sun God (And Why Most Conspiracies are Bullshit)
49. The Spiral of Silence
Most social animals harbour an instinctive fear of isolation, and in-groups maintain their cultural stability partially by excluding individuals with non-conforming opinions or behaviours. This can create a culture where group members self-censor their views and behaviours by going silent.
“My opinions are perceived as wrong, and it’s better for everyone if I stay silent.”
The Spiral of Silence
Elisabeth Noelle-Neumann’s well-documented theory of the spiral of silence (1974) explains how the fear of isolation through peer exclusion pressures publics into silencing their opinions.
Rather than risking social isolation, many choose silence over expressing their genuine opinions.
“To the individual, not isolating himself is more important than his own judgement. […] This is the point where the individual is vulnerable; this is where social groups can punish him for failing to toe the line.”
— Elisabeth Noelle-Neumann
When the dominant coalition stands unopposed, it pushes the confines of what’s acceptable down an ever-narrower funnel (see also the opinion corridor).
“The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum — even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.”
— Noam Chomsky
Read also: The Spiral of Silence
50. The Yes Ladder
This is a marketing exploit where the persuader aims to get you to say yes to something substantial (“big ask”) by methodically getting you to say yes to something smaller first (“small ask”).
“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter, and via the newsletter, I downloaded a free photo book with pink umbrellas — and now I own five pink umbrellas.”
51. Bystander Effect
People are less inclined to offer support or aid when many others could also step in.
“Everyone cares deeply about personal safety, so everyone will download our new CSR app to help each other.”
Read also: Kitty Genovese Murder and the Misreported Bystander Effect
52. Reciprocation Effect
We often feel obligated to reciprocate if someone is friendly or generous towards us. While this is a beautiful and expected part of human behaviour, it’s something that special interests can take advantage of.
“I can’t believe the car broke down so fast — the guy I bought it from threw in so many extra features.”
53. Commitment and Consistency
Once we commit to something, we invest a part of ourselves in that decision. This makes it harder for many of us to abandon such commitments because it would mean giving up on ourselves. This bias is closely related to yes ladders, declinism, appeal to tradition, and sunk cost fallacy.
“I’ve made my decision, and therefore I’m sticking with it.”
54. The Fallacy of Social Proof
This fallacy is the commercial extension of the bandwagon effect; by showcasing social proof, we are comforted by decisions made by others. Ideally, we should always ensure that reviews and engagement displays are relevant (and accurate) before making any decisions, but this doesn’t always happen.
“Their product seems to have many happy users, so the risk of getting scammed is low.”
55. Liking and Likeness
“We prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion.
“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust every word he says?”
56. The Appeal to Authority
It isn’t easy to distinguish between perceived authority and indisputable authority. Many companies use testimonials from people with impressive titles — and it works. This fallacy is closely related to the fallacious appeal to authority.
“Several leading doctors recommended this product, so the ad’s claims must be true.”
57. The Principle of Scarcity (FOMO)
Most of us are scared of missing out (also known as FOMO, fear of missing out). This makes us perceive things as more valuable the rarer they are.
“I’m so happy I managed to snag that pink unicorn umbrella before the discount ran out!”
Read also: The Power of Artificial Scarcity
58. Loss Aversion
The pain of losing can be psychologically twice as powerful as the joy of winning. We are therefore often willing to take disproportionate risks to avoid a loss compared to the risks we’re willing to take to win. This bias is closely related to commitment and consistency and the sunk cost fallacy.
“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
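The roughly two-to-one asymmetry mentioned above comes from Kahneman and Tversky’s prospect theory, and a tiny sketch can make it concrete. This is illustrative only: the coefficients lam ≈ 2.25 (loss aversion) and alpha ≈ 0.88 (diminishing sensitivity) are the median estimates from their experiments, not figures taken from this article.

```python
# Illustrative sketch of loss aversion via the prospect-theory value function:
# v(x) = x**alpha for gains, -lam * (-x)**alpha for losses.
# lam and alpha below are Kahneman & Tversky's median experimental estimates;
# treat them as assumptions, not universal constants.

def subjective_value(x, alpha=0.88, lam=2.25):
    """Perceived (subjective) value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = subjective_value(100)    # the joy of winning 100
loss = subjective_value(-100)   # the pain of losing 100

# The pain of the loss outweighs the joy of an equal gain:
print(round(-loss / gain, 2))   # 2.25
```

For equal amounts, the ratio of pain to joy is exactly lam, which is why a sure loss can push people into gambles they would never accept to chase an equivalent gain.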
Please support my blog by sharing it with other PR- and communication professionals. For questions or PR support, contact me via email@example.com.
Arkes, H. R., & Blumer, C. (1985), The psychology of sunk costs. Organisational Behavior and Human Decision Processes, 35, 124 – 140.
Cialdini, R. (2006). Influence: The Psychology of Persuasion, Revised Edition. Harper Business: The United States.
Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with a foreword by former APA President, Dr Diane F. Halpern.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43 – 52.
Forer, B. R. (1949). The Fallacy of Personal Validation: A classroom Demonstration of Gullibility. Journal of Abnormal Psychology, 44, 118 – 121.
Kahneman, D. (2011). Thinking fast and slow. Penguin: Great Britain.
Kruger, J., Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognising one’s own incompetence lead to inflated self-Assessments. Journal of Personality and Social Psychology, 77, 6, 1121 – 1134.
Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49–68.
Simon, H. A. (1957). Models of man. New York: Wiley.
Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178 – 181.
Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183 – 206.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124 – 1131.
West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930 – 941.
[1] Please note that the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.
[2] Moscovici, S. (1980). Toward a theory of conversion behaviour. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology, 13, 209–239. New York: Academic Press.
[3] Chryssochoou, X. and Volpato, C. (2004). Social Influence and the Power of Minorities: An Analysis of the Communist Manifesto, Social Justice Research, 17, 4, 357–388.
[4] Hostile media effect. (2022, October 25). In Wikipedia. https://en.wikipedia.org/wiki/Hostile_media_effect
[5] Gunther, A.C. and Schmitt, K. (2004), Mapping Boundaries of the Hostile Media Effect. Journal of Communication, 54: 55–70.