All of us are prone to logical fallacies and cognitive biases.
I know that I’m stupid sometimes — most of us are.
Still, we should all strive to be less stupid.
I’m deeply fascinated by logical fallacies and cognitive biases. Learning about human behaviour is helpful in public relations, where we deal with communication challenges daily.
Here we go:
1. Fallacy of Composition
Fallacy of composition: “Since our top salesperson is a great public speaker, our entire sales team must also be excellent public speakers.”
The fallacy of composition, a prevalent cognitive bias in decision-making, arises when individuals erroneously infer that the attributes of a single component or a select few components within a more extensive system extend to the entire system.
This fallacious thinking may manifest in various contexts — from organizational strategy to market analysis — and can lead to misguided decisions with potentially adverse consequences.
Business leaders must engage in thoughtful and rigorous analysis to avoid falling prey to this fallacy. They must recognise that the dynamics of complex systems may not always mirror the characteristics of their parts and that a more holistic approach is necessary to navigate the intricacies of today’s ever-evolving business landscape.
2. Fallacy of Division
Fallacy of division: “Our company is a market leader, so every employee within our organization must be an expert in their respective field.”
The fallacy of division emerges as a subtle yet significant cognitive trap, enticing decision-makers to mistakenly assume that the properties of a collective whole must inherently apply to its components.
This flawed logic can lead to erroneous conclusions and ill-informed decisions, particularly in organisational dynamics, where unique elements within a system may not conform to the overarching characteristics of the larger entity.
To counteract this fallacy, business leaders must adopt a nuanced approach, cultivating an understanding that the intricacies of complex systems demand careful consideration of the distinct attributes and interactions of their constituent parts rather than relying on simplistic generalizations that may obscure critical insights.
3. The Gambler’s Fallacy
Gambler’s fallacy: “We’ve had three failed product launches in a row; our next product is guaranteed to succeed.”
The gambler’s fallacy, a widespread cognitive bias often encountered in decision-making, stems from the erroneous belief that past events can influence the probability of future independent events.
This misleading notion can lead to faulty assumptions and misguided decisions, particularly in business contexts where uncertainty and randomness are prominent.
To mitigate the risks associated with the gambler’s fallacy, executives must develop a data-driven mindset. They must acknowledge the independence of discrete events and leverage statistical analysis to inform strategic choices. This will foster more accurate assessments of probability and more informed decision-making in an unpredictable business landscape.
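To make the independence point concrete, here is a minimal simulation sketch in Python (the 50% success rate and trial count are illustrative assumptions, not figures from any real launch data): it estimates the probability of a success immediately after three consecutive failures and shows that it matches the unconditional base rate.

```python
import random

random.seed(42)
BASE_RATE = 0.5      # illustrative probability of success for one independent trial
TRIALS = 1_000_000   # number of simulated independent trials

outcomes = [random.random() < BASE_RATE for _ in range(TRIALS)]

# Collect the outcomes that immediately follow three consecutive failures.
after_three_failures = [
    outcomes[i] for i in range(3, TRIALS) if not any(outcomes[i - 3:i])
]

print(f"P(success), unconditional:         {sum(outcomes) / TRIALS:.4f}")
print(f"P(success | three prior failures): {sum(after_three_failures) / len(after_three_failures):.4f}")
# Both values converge on ~0.5: with independent events, a losing streak
# tells us nothing about the next outcome.
```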
4. Tu Quoque (Who Are You To Talk?)
Tu quoque: “Our competitor’s CEO is criticizing our environmental policies, but their own company has had pollution issues in the past.”
The tu quoque fallacy, colloquially known as the “who are you to talk?” argument, represents a pernicious rhetorical tactic employed to deflect criticism or undermine an opponent’s position by highlighting their perceived hypocrisy or inconsistency rather than addressing the substance of the argument itself.
In the context of business discourse, this ad hominem attack can derail productive conversations and obscure valuable insights, potentially stifling innovation and collaboration.
To foster more constructive dialogue, organizational leaders must cultivate an environment that encourages open and honest communication. They must focus on the merits of the presented ideas and discourage personal attacks or appeals to hypocrisy. They must empower individuals to engage in reasoned debate and contribute to the collective pursuit of excellence.
5. Strawman
Strawman: “Our colleague wants to cut costs, but I doubt they’d be happy if we had to compromise the quality of our products and lose customers as a result.”
The strawman fallacy, a deceptive rhetorical manoeuvre often encountered in business discourse, involves misrepresenting an opponent’s argument by constructing a distorted or oversimplified version of their stance, which is easier to refute or discredit.
This misleading tactic can obstruct meaningful dialogue, engender hostility, and inhibit the exploration of nuanced perspectives necessary for driving innovation and informed decision-making.
To foster a collaborative and intellectually rigorous environment, organisational leaders must emphasize the importance of engaging with the substance of the arguments presented. They must encourage participants to actively listen, seek clarification, and challenge ideas constructively, ultimately advancing the collective pursuit of knowledge and organizational success.
6. Ad Hominem
Ad hominem: “I wouldn’t trust a proposal compiled by someone known for their disorganization.”
The ad hominem fallacy, a detrimental form of argumentation frequently encountered in professional discourse, occurs when an individual targets an opponent’s personal attributes or character traits rather than addressing the substance of their argument.
This diversionary tactic can hinder productive discussion, impede the flow of valuable insights, and foster a toxic work environment, undermining the collaborative spirit essential to organizational success.
To create a culture of open and respectful dialogue, business leaders must actively discourage ad hominem attacks, encourage team members to engage with the merits of ideas presented, foster an atmosphere of intellectual rigour, and promote an inclusive environment where diverse perspectives can flourish and contribute to the organization’s growth and innovation.
7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)
Genetic fallacy: “The marketing strategy proposed by our newest team member can’t be any good; they’ve only been with the company for a few months.”
The genetic fallacy, also known as the fallacy of origin or fallacy of virtue, is a flawed reasoning pattern that arises when an argument’s validity or worth is assessed based on its source or origin rather than the argument’s merits.
This cognitive bias can obstruct the objective evaluation of ideas in a business context, potentially leading to missed opportunities, stifled innovation, or unwise strategic decisions.
To counteract the influence of the genetic fallacy, organisational leaders must cultivate a culture of intellectual openness. They must emphasize the importance of engaging with the substance of ideas, regardless of their origins, and foster an environment where critical thinking, reasoned debate, and the free exchange of diverse perspectives can thrive. This will ultimately drive informed decision-making and organizational success.
8. Fallacious Appeal to Authority
Fallacious appeal to authority: “We should invest in this new technology because a famous entrepreneur mentioned it in a recent podcast.”
Fallacious appeal to authority is a deceptive form of argumentation in which an individual invokes the opinion or endorsement of a purported expert to bolster their position despite the expert’s lack of relevant expertise or credibility on the subject.
In a business context, this cognitive bias can lead to ill-informed decisions, misplaced trust, and potentially detrimental consequences for organizational performance.
To safeguard against the fallacious appeal to authority, business leaders must foster a culture of critical thinking, promote evidence-based decision-making, and encourage team members to scrutinize the credibility and relevance of expert opinions. This will ensure that strategic choices are informed by rigorous analysis and well-founded expertise rather than mere assertions of authority.
9. Red Herring
Red herring: “We shouldn’t worry about our declining market share; after all, our office just won an award for its eco-friendly design.”
The red herring fallacy, a cunning diversionary tactic often encountered in professional discourse, involves introducing an unrelated or tangential issue to distract from the original argument or issue at hand.
This deceptive manoeuvre can undermine productive dialogue, hinder the pursuit of meaningful solutions, and impede the collaborative exchange of ideas essential to driving innovation and organizational success.
To foster a focused and intellectually honest environment, business leaders must emphasize the importance of staying on topic and addressing the substance of arguments. They must cultivate a culture of active listening and disciplined discussion that allows for the thoughtful examination of critical issues. This will promote well-informed decision-making and the organization’s ability to navigate complex challenges effectively.
10. Appeal to Emotion
Appeal to emotion: “We can’t outsource our manufacturing overseas; think about the impact on our local employees’ families.”
The appeal to emotion fallacy, a manipulative tactic frequently observed in professional and personal interactions, involves leveraging emotional triggers to persuade or influence others, sidestepping the merits of the argument or the rationality of the underlying facts.
In a business context, this fallacy can lead to hasty decisions, impede objective evaluation, and inhibit the collaborative exchange of ideas crucial for driving innovation and sound decision-making.
To counteract the appeal to emotion, organizational leaders must foster a culture of critical thinking. They must emphasize the importance of evidence-based reasoning and rational deliberation while acknowledging the role of emotions in human decision-making and encouraging employees to strike a balance between emotional intelligence and analytical rigour in navigating the complexities of the business landscape.
11. Appeal to Popularity (The Bandwagon Effect)
Appeal to popularity: “We should implement the same remote work policy as the leading tech companies; if it’s good enough for them, it must be good for us.”
The appeal to popularity, also known as the bandwagon effect, is a fallacious form of argumentation that relies on the widespread acceptance or popularity of an idea or course of action as sufficient evidence of its validity or efficacy.
In business, succumbing to this fallacy can lead to herd mentality, stifled innovation, and suboptimal decision-making as organizations follow prevailing trends instead of engaging in rigorous analysis and thoughtful deliberation.
Business leaders must cultivate a culture that values independent thinking and evidence-based decision-making to counteract the bandwagon effect. They must encourage team members to critically assess popular beliefs and practices and foster an environment where diverse perspectives can be openly shared and debated. This will ultimately drive informed decision-making and sustained organizational success.
12. Appeal to Tradition
Appeal to tradition: “We’ve always used this software for our project management, so there’s no reason to consider alternatives now.”
The appeal to tradition fallacy, a pervasive cognitive bias in decision-making, occurs when an individual argues that a particular belief or practice should be maintained simply because it has been long-standing or customary.
In a business context, this fallacy can hinder innovation, stifle adaptation to changing market conditions, and perpetuate outdated or inefficient practices, potentially undermining an organization’s ability to compete and grow.
Astute business leaders must foster a culture that embraces continuous improvement and adaptation to counter the appeal to tradition. They must encourage team members to evaluate long-held beliefs and practices critically and consider novel approaches that may offer more effective solutions to the challenges of a rapidly evolving business landscape.
13. Appeal to Nature
Appeal to nature: “We should switch to a completely organic ingredient supplier, even if it’s more expensive, because natural products are always better.”
The appeal to nature fallacy emerges when an individual asserts that something is inherently good or superior simply because it is deemed natural or unaltered, while dismissing or devaluing alternatives perceived as artificial or synthetic.
In the business world, this fallacy can lead to suboptimal decision-making, risk aversion to innovation, and an overreliance on traditional or ‘natural’ solutions that may not effectively address contemporary challenges.
To navigate this cognitive bias, savvy business leaders must encourage a culture of critical thinking and open-mindedness. They must promote evidence-based decision-making that carefully evaluates the advantages and drawbacks of various options, whether they are rooted in nature or human ingenuity. Thus, they will foster an environment that supports innovation, adaptability, and sustainable growth.
14. Appeal to Ignorance
Appeal to ignorance: “No one has proven that our new public relations campaign won’t work, so it must be a good idea.”
The appeal to ignorance fallacy arises when an individual contends that a claim is valid simply because it has not been proven false, or vice versa, exploiting gaps in knowledge or evidence to bolster their argument.
In a business context, this fallacy can lead to misguided decision-making, overconfidence in unverified assumptions, and a disregard for the importance of thorough analysis and evidence-based reasoning.
Business leaders must cultivate a culture that values intellectual humility to mitigate the risks associated with the appeal to ignorance. They must emphasise the importance of recognising and addressing knowledge gaps, seeking reliable evidence to inform decision-making, and fostering an environment where team members are encouraged to continually learn, adapt, and refine their understanding of the complex and ever-evolving business landscape.
15. Begging the Question
Begging the question: “Our company’s products are the best on the market because we provide the highest quality.”
The begging-the-question fallacy, a subtle yet problematic form of circular reasoning, occurs when an argument’s conclusion is assumed within its premises, sidestepping the need for genuine evidence or logical support.
In the business world, this fallacy can lead to unfounded assumptions, superficial analyses, and misguided decision-making that may undermine an organization’s ability to navigate challenges and seize opportunities effectively.
Business leaders must foster a culture that values critical thinking, open inquiry, and evidence-based decision-making to counteract the risk of begging the question. They must encourage team members to rigorously examine the premises of their arguments, identify and address any underlying assumptions, and engage in a constructive, reasoned debate that drives innovation, growth, and sustainable success.
16. Equivocation
Equivocation: “Our sales figures are certainly interesting, which means they’re worth considering for future strategy.”
Equivocation, a deceptive rhetorical strategy frequently encountered in professional discourse, occurs when an individual exploits the ambiguity or multiple meanings of a word or phrase to create confusion or mislead their audience. This effectively avoids a clear or direct response to an argument or question.
In a business context, equivocation can obstruct meaningful communication, hinder the effective exchange of ideas, and undermine trust among team members, ultimately impeding innovation and sound decision-making.
To promote transparency and intellectual honesty within an organization, business leaders must emphasize the importance of clear and precise language, encouraging team members to seek clarification when faced with ambiguous statements and fostering a culture of open dialogue that values the rigorous examination of ideas and constructive debate, driving informed decision-making and sustained organizational success.
17. False Dichotomy (Black or White)
False dichotomy: “We either need to cut costs drastically, or we have to increase our prices significantly — there’s no other way to improve our profit margin.”
The false dichotomy fallacy, also known as the black or white fallacy, arises when an individual presents a complex issue or decision as having only two mutually exclusive options. This effectively oversimplifies the matter and ignores alternative perspectives or potential solutions.
In a business context, this fallacious reasoning can stifle creativity, hinder comprehensive problem-solving, and lead to suboptimal decision-making, ultimately constraining an organization’s ability to adapt and innovate in a rapidly evolving landscape.
To counteract the risks associated with false dichotomies, business leaders must encourage critical thinking and open-mindedness, foster an environment that values exploring nuanced perspectives and diverse approaches, and empower team members to engage in collaborative problem-solving that drives innovation.
18. Middle Ground Fallacy
Middle ground fallacy: “Our team is divided on whether to invest in research and development or marketing, so let’s allocate half our budget to each and satisfy everyone.”
The middle ground fallacy is a deceptive form of argumentation in which an individual asserts that a compromise or middle point between two opposing positions must inherently represent the correct or most reasonable solution, neglecting the possibility that one or both extremes may hold merit or that the optimal solution may lie elsewhere.
In a business context, this fallacy can lead to suboptimal decision-making, foster a false sense of consensus, and potentially overlook innovative or superior solutions.
To guard against the middle ground fallacy, business leaders must promote a culture of critical thinking and open debate. They must encourage team members to examine the strengths and weaknesses of various perspectives rigorously and foster an environment that supports collaborative problem-solving and the pursuit of evidence-based, well-informed solutions.
19. Decision Point Fallacy (Sorites Paradox)
Decision point fallacy: “We can’t determine the exact point at which adding more features to our product will make it too complex for our users, so let’s keep adding features without considering the potential downsides.”
The decision point fallacy, also known as the Sorites Paradox, arises when an individual argues that because no precise threshold or turning point can be identified within a series of incremental changes, no meaningful distinction exists. This leads to flawed reasoning or indecision.
This cognitive bias can manifest in a business context when decision-makers become mired in the minutiae of continuous improvement or incremental progress, losing sight of the bigger picture and ultimately hampering their ability to make strategic choices.
To counteract the decision point fallacy, organizational leaders must foster a culture emphasising the importance of establishing clear objectives, maintaining a holistic perspective, and striking a balance between incremental progress and decisive action, empowering team members to navigate complex challenges and drive sustained success.
20. Slippery Slope Fallacy
Slippery slope fallacy: “If we allow our employees to work remotely for one day a week, productivity will plummet, and soon everyone will be demanding a completely flexible schedule, resulting in chaos and the collapse of our company culture.”
The slippery slope fallacy occurs when an individual argues that a specific action or decision will inevitably lead to a chain of negative consequences without providing sufficient evidence for this causal relationship.
In a business context, this fallacious reasoning can undermine productive dialogue, stifle innovation, and promote an overly cautious approach to problem-solving, ultimately inhibiting an organization’s ability to adapt and grow.
To guard against the slippery slope fallacy, business leaders must foster a culture that values evidence-based decision-making and encourages team members to critically examine their arguments’ logic and assumptions. This promotes a balanced and objective assessment of potential risks and opportunities that drive informed decision-making and sustained success.
21. Hasty Generalisations (Anecdotal Evidence)
Hasty generalisations: “One of our remote employees missed a deadline last month, which clearly shows that allowing employees to work remotely leads to decreased productivity and a lack of accountability.”
Hasty generalizations, often fueled by anecdotal evidence, occur when an individual draws broad conclusions based on insufficient or unrepresentative data, resulting in potentially flawed or biased reasoning.
Relying on hasty generalizations in a business context can lead to misguided decision-making, suboptimal strategies, and an inability to effectively address complex challenges, ultimately impeding an organization’s success.
Business leaders must emphasize the importance of thorough analysis, evidence-based decision-making, and critical thinking to counteract the risks associated with hasty generalisations. They must also encourage team members to recognize the limitations of anecdotal evidence and consider diverse perspectives, fostering a culture that values rigorous inquiry and comprehensive problem-solving.
22. Faulty Analogy
Faulty analogy: “Managing a business is like riding a bicycle; once you’ve learned the basics, it’s all about maintaining balance and momentum, so we don’t need to invest in ongoing professional development for our employees.”
The faulty analogy fallacy arises when an individual draws a comparison between two concepts or situations that are not sufficiently alike, resulting in misleading or unsupported conclusions.
Relying on faulty analogies in a business context can impede effective problem-solving, foster misconceptions, and contribute to ill-advised decision-making, ultimately undermining an organization’s ability to innovate and succeed.
To guard against faulty analogies, business leaders must cultivate a culture that values critical thinking, logical rigour, and evidence-based reasoning. They must also encourage team members to scrutinize their comparisons’ validity and seek diverse perspectives that challenge assumptions and promote nuanced understanding.
23. Burden of Proof
Burden of proof: “Our new marketing strategy will boost sales by at least 20%; if you don’t believe me, prove me wrong.”
The burden of proof fallacy occurs when an individual asserts a claim without providing sufficient evidence, often shifting the responsibility to disprove the assertion onto others.
In a business context, this fallacious reasoning can hinder productive discourse, foster unwarranted assumptions, and contribute to flawed decision-making, ultimately impeding an organization’s ability to navigate challenges effectively and capitalize on opportunities.
To mitigate the risks associated with the burden of proof fallacy, business leaders must promote a culture of evidence-based reasoning, critical thinking, and intellectual accountability. They must encourage team members to substantiate their claims with robust supporting evidence and engage in a constructive, well-informed debate that drives innovative problem-solving and sustainable success.
24. Affirming the Consequent
Affirming the consequent mistakes “if P, then Q” for a statement that also runs in reverse: from “if P, then Q” and the observation that Q is true, it does not follow that P is true, since other things may also produce Q.
“Cats meow, so since this animal meows, it must be a cat.”
25. Denying the Antecedent (Fallacy of the Inverse)
Denying the antecedent makes the mirror-image mistake: from “if P, then Q” and the observation that P is false, it does not follow that Q is false, since Q may hold for other reasons.
“Cats meow, so since this animal isn’t a cat, it can’t meow.”
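Both invalid forms can be checked mechanically with a truth table. The short Python sketch below enumerates every combination of truth values for P (“it is a cat”) and Q (“it meows”) and prints the counterexample rows where all premises are true but the conclusion is false:

```python
from itertools import product

def implies(p, q):
    """Material implication: p -> q."""
    return (not p) or q

def counterexamples(premises, conclusion):
    """Rows where every premise is true but the conclusion is false."""
    return [
        (p, q)
        for p, q in product([True, False], repeat=2)
        if all(f(p, q) for f in premises) and not conclusion(p, q)
    ]

# Affirming the consequent: (p -> q), q, therefore p.
print(counterexamples([implies, lambda p, q: q], lambda p, q: p))
# -> [(False, True)]: "it meows" can be true while "it is a cat" is false.

# Denying the antecedent: (p -> q), not p, therefore not q.
print(counterexamples([implies, lambda p, q: not p], lambda p, q: not q))
# -> [(False, True)]: "not a cat" can be true while "it meows" is also true.
```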
26. Moving the Goalposts
Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.
“Yes, there might be some innocent people in jail, but I was only talking about the guilty.”
27. No True Scotsman
To disqualify someone or something based on a false or biased ideal.
“All real men have beards, so if you don’t have a beard, you can’t be a real man.”
28. Personal Incredulity
Just because you find something hard to believe or imagine doesn’t make it untrue.
“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”
29. False Causality
The false assumption that correlation equals causation.
“Crime rates went up when the price of gas went up, so for everyone’s safety, we must lower our taxes on fossil fuels.”
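A common way such spurious correlations arise is through a confounder, a third variable driving both series. Below is a minimal Python sketch with made-up numbers (the hidden factor and noise levels are purely illustrative): neither series causes the other, yet they correlate strongly because both track the same hidden driver.

```python
import random

random.seed(1)

# Hypothetical illustration: a hidden factor (say, overall economic stress)
# drives both series; neither series causes the other.
hidden = [random.gauss(0, 1) for _ in range(10_000)]
gas_price = [h + random.gauss(0, 0.5) for h in hidden]
crime_rate = [h + random.gauss(0, 0.5) for h in hidden]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

print(f"correlation: {corr(gas_price, crime_rate):.2f}")  # ~0.8, yet no causal link
```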
30. Texas Sharpshooter
Deciding on a position first and then seeking out only the data that supports it. This fallacy is especially prominent in the digital age, when it’s possible to find arguments online defending almost any imaginable position.
“I’ve found numerous studies supporting my position, and I have no idea if any studies also support your position.”
31. Loaded Question
To ask a question with an assumption already built into it.
“Have you stopped beating your wife?”
32. Chesterton’s Fence
If we don’t understand or see the reason for something, we might be inclined to do away with it. However, even if we don’t understand it, most things have been implemented for a reason. Therefore, we should leave it unless we fully understand its purpose.
“There’s a fence here, but I can’t see what it’s good for, so let’s do away with it.”
33. Survivorship Bias
I’ve already written about survivorship bias and Abraham Wald.
“All returning warplanes show damage to their wings, so we should reinforce the wings to make them safer.”
Read also: Survivorship Bias: Abraham Wald and the WWII Airplanes
34. The Dunning-Kruger Effect
A cognitive bias whereby people early in learning a new field overestimate their competence. (Note: the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.)
“I’ve just started learning about this, and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m an expert on this subject now.”
35. Confirmation Bias
Most of us tend to recall or interpret information in a way that reinforces our existing cognitive schemas.
“I refused to take my medicine and got well, so I always do my best to avoid treatment.”
36. Heuristic Anchoring
When faced with an initial number, we often compare subsequent numbers to that anchor.
“The third house shown to us by the broker was over our budget but still a bargain compared to the first two houses she showed us.”
37. The Curse of Knowledge
We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.
“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”
38. Optimism/Pessimism Bias
We find it easier to believe that negative things can happen to others than to ourselves. However, some people are biased in the opposite direction; they overestimate the likelihood of adverse events.
“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us — only worse.”
39. The Sunk Cost Fallacy
Sometimes, we stick to a behaviour simply because we’ve already invested time, money, and other resources. Abandoning such an investment would force us to face an irreversible failure.
“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”
40. Negativity Bias
We tend to react more strongly to negative impacts than to positive effects of similar or equal weight.
“Our daughter graduated with honours from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”
41. Declinism
We tend to think that everything is in decline, especially in the face of new developments. This might be due to cognitive laziness; we don’t wish to change how we feel in tandem with the times.
“Everything was better in the past, so change is terrible.”
Learn more: Social Media — The Good, The Bad, and the Ugly
42. The Backfire Effect (Conversion Theory)
When challenged, we might cling even firmer to our beliefs — instead of questioning ourselves.
“People hate us, but this proves us right about everything.”
Learn more: Conversion Theory: The Disproportionate Influence of Minorities
43. The Fundamental Attribution Error
When someone else makes a mistake, we attribute it to their character or behaviour. But when we make mistakes ourselves, we tend to attribute them to contextual circumstances.
“When I’m in a rush, people behave like idiots in traffic.”
44. In-Group Bias
We have evolved to favour people who belong to our own social group. This isn’t necessarily bad behaviour per se, but we must watch out for situations where we can’t be expected to be fair and objective.
“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”
Learn more: Social Group Sizes (The Social Brain Hypothesis)
45. The Forer Effect (The Barnum Effect)
We tend to fill any gaps in the information we’re given using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that vague statements apply to many others just as well as to ourselves.
“I read my horoscope yesterday, and the information was uncannily accurate, so I’m certainly convinced that there are some things about the cosmos that influence our lives in a way that we can’t yet understand.”
46. Cognitive Dissonance
We tend to sort information based on our existing cognitive schemas. One outcome is that we tend to disregard any information that conflicts with our existing beliefs while quickly absorbing anything that confirms our beliefs.
“The Earth is flat, and I haven’t seen any credible evidence to the contrary.”
Learn more: Cognitive Dissonance: Mental Harmony Above All Else
47. The Hostile Media Effect
This can be seen as the media-science equivalent of the psychological backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased in favour of their opposition. The effect is even stronger if the individual believes in a silent majority that is particularly susceptible to erroneous or misleading media coverage.
“I know the media is telling me I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”
Learn more: The Hostile Media Effect: How We Demonise the News Media
48. Cherry-Picking (The Fallacy of Incomplete Evidence)
This fallacy is closely related to Texas sharpshooter and the fallacy of division. Cherry-picking fuels most of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make a case for almost anything.
“Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls). Therefore, Napoleon is a myth.”
Learn more: Napoleon the Sun God (And Why Most Conspiracies are Bullshit)
49. The Spiral of Silence
Most social animals harbour an instinctive fear of isolation, and in-groups maintain their cultural stability partially by excluding individuals with non-conforming opinions or behaviours. This can create a culture where group members self-censor their views and behaviours by going silent.
“My opinions are perceived as wrong, and it’s better for everyone if I stay silent.”
Learn more: The Spiral of Silence
50. The Yes Ladder
This is a marketing exploit where the persuader aims to get you to say yes to something substantial (“big ask”) by methodically getting you to say yes to something smaller first (“small ask”).
“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter, and via the newsletter, I downloaded a free photo book with pink umbrellas — and now I own five pink umbrellas.”
51. Bystander Effect
People are less inclined to offer support or aid when many others are also able to help.
“Everyone cares deeply about personal safety, so everyone will download our new CSR app to help each other.”
Learn more: Kitty Genovese Murder and the Misreported Bystander Effect
52. Reciprocation Effect
We often feel obligated to reciprocate if someone is friendly or generous towards us. While this is a beautiful and expected part of human behaviour, special interests can take advantage of it.
“I can’t believe the car broke down so fast — the guy I bought it from threw in so many extra features.”
53. Commitment and Consistency
Once we commit to something, we invest a part of ourselves in that decision. This makes it harder for many of us to abandon such commitments because it would mean giving up on ourselves. This bias is closely related to yes ladders, declinism, appeal to tradition, and sunk cost fallacy.
“I’ve made my decision, and therefore I’m sticking with it.”
54. The Fallacy of Social Proof
This fallacy is the commercial extension of the bandwagon effect; by showcasing social proof, we are comforted by decisions made by others. Ideally, we should always ensure that reviews and engagement displays are relevant (and accurate) before making any decisions, but this doesn’t always happen.
“Their product seems to have many happy users, so the risk of getting scammed is low.”
55. Liking and Likeness
“We prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion.
“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust every word he says?”
56. The Appeal to Authority
It isn’t easy to distinguish between perceived authority and indisputable authority. Many companies use testimonials from people with impressive titles — and it works. This fallacy is closely related to the fallacious appeal to authority.
“Several leading doctors recommended this product, so the ad’s claims must be true.”
57. The Principle of Scarcity (FOMO)
Most of us are scared of missing out (also known as FOMO, fear of missing out). Scarcity makes us perceive things as rarer and more valuable than they are.
“I’m so happy I managed to snag that pink unicorn umbrella before the discount ran out!”
Learn more: The Power of Artificial Scarcity (FOMO)
58. Loss Aversion
The pain of losing can psychologically be twice as powerful as the joy of winning. We often take disproportionate risks to avoid losses compared to the risks we’re willing to take to secure equivalent gains. This bias is closely related to commitment and consistency and the sunk cost fallacy.
“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
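The “twice as powerful” claim can be made concrete with the value function from Kahneman and Tversky’s prospect theory. The sketch below is a rough illustration, assuming the commonly cited parameter estimates from the literature (curvature α ≈ β ≈ 0.88 and loss-aversion coefficient λ ≈ 2.25); it is not a calculation from this article’s own data.

```python
# Illustrative prospect-theory value function (parameters are literature
# estimates from Tversky & Kahneman's work, used here as assumptions).
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom larger than gains

def prospect_value(x: float) -> float:
    """Subjective value of a gain or loss x, relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

gain, loss = prospect_value(100), prospect_value(-100)
print(f"felt value of winning 100: {gain:+.1f}")  # ~ +57.5
print(f"felt value of losing 100:  {loss:+.1f}")  # ~ -129.4
print(f"|loss| / gain ratio:       {abs(loss) / gain:.2f}")  # ~2.25: losing hurts over twice as much
```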
Thanks for reading. Please support my blog by sharing articles with other communications and marketing professionals. You might also consider my PR services or speaking engagements.
Reference List
Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35, 124–140.
Cialdini, R. (2006). Influence: The psychology of persuasion (rev. ed.). New York: Harper Business.
Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.
Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118–121.
Kahneman, D. (2011). Thinking, fast and slow. London: Penguin.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognising one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49–68.
Simon, H. A. (1957). Models of man. New York: Wiley.
Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178–181.
Thaler, R. H. (1999). Mental accounting matters. Journal of Behavioral Decision Making, 12, 183–206.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100(4), 930–941.
PR Resource: More Psychology
Doctor Spin’s PR School: Free Psychology PR Course
Join this free Psychology PR Course to learn essential skills tailored for public relations professionals. Start now and amplify your impact on society today.
Learn more: All Free PR Courses
Mental Models: Be a Better Thinker
Mental models emphasise the importance of viewing problems from multiple perspectives, recognising personal limitations, and understanding the often unforeseen interactions between different factors.
“You only have to do a few things right in your life so long as you don’t do too many things wrong.”
— Warren Buffett
The writings of Charlie Munger, Vice Chairman of Berkshire Hathaway and long-time collaborator of Warren Buffett, inspire several of the models below. (It’s worth noting that these models are not exclusively Charlie Munger’s inventions but tools he advocates for effective thinking and decision-making.)
List of Mental Models
Here’s a list of my favourite mental models:
The iron prescription (mental model). Charlie Munger: “I have what I call an ‘iron prescription’ that helps me keep sane when I naturally drift toward preferring one ideology over another. I feel that I’m not entitled to have an opinion unless I can state the arguments against my position better than the people who are in opposition. I think that I am qualified to speak only when I’ve reached that state” (Knodell, 2016). Source: Knodell, P. A. (2016). All I want to know is where I’m going to die so I’ll never go there: Buffett & Munger – A study in simplicity and uncommon, common sense. PAK Publishing.
The Red Queen effect (mental model). This metaphor originates from Lewis Carroll’s Through the Looking-Glass. It describes a situation in which one must continuously adapt, evolve, and work to maintain one’s position. In the story, the Red Queen explains to Alice that in their world, running as fast as one can is necessary just to stay in the same place. The metaphor is often used in the context of businesses that need to innovate constantly to stay competitive, highlighting the relentless pressure to adapt in dynamic environments where stagnation can mean falling behind. Sources: Red Queen hypothesis. (2023, November 27). In Wikipedia. https://en.wikipedia.org/wiki/Red_Queen_hypothesis; Carroll, L. (2006). Through the looking-glass, and what Alice found there (R. D. Martin, Ed.). Penguin Classics. (Original work published 1871.)
Ockham’s razor (mental model). This principle suggests that the simplest explanation is usually correct: when presented with competing hypotheses, the one with the fewest assumptions should be selected. It’s a tool for cutting through complexity and focusing on what’s most likely true. Source: Ariew, R. (1976). Ockham’s Razor: A historical and philosophical analysis of simplicity in science. Scientific American, 234(3), 88–93.
Hanlon’s razor (mental model). This thinking aid advises against attributing to malice what can be adequately explained by incompetence or mistake. It reminds us to look for more straightforward explanations before jumping to conclusions about someone’s intentions. Source: Hanlon, R. J. (1980). Murphy’s Law book two: More reasons why things go wrong! Los Angeles: Price Stern Sloan.
Vaguely right vs precisely wrong (mental model). This principle suggests it is better to be approximately correct than precisely incorrect. In many situations, seeking precision can lead to errors if the underlying assumptions or data are flawed. Sometimes, a rough estimate is more valuable than a precise but potentially misleading figure. Source: Keynes, J. M. (1936). The general theory of employment, interest, and money. London: Macmillan.
Fat pitch (mental model). Borrowed from baseball, this concept refers to waiting patiently for the perfect opportunity — a situation where the chances of success are exceptionally high. It suggests the importance of patience and striking when the time is right. Source: Kaufman, P. A. (Ed.). (2005). Poor Charlie’s almanack: The wit and wisdom of Charles T. Munger. Virginia Beach, VA: Donning Company Publishers.
Chesterton’s fence (mental model). G. K. Chesterton: “In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, ‘I don’t see the use of this; let us clear it away.’ To which the more intelligent type of reformer will do well to answer: ‘If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it’” (Chesterton, 1929). Source: Chesterton, G. K. (1929). The drift from domesticity. In The Thing. London: Sheed & Ward, p. 35.
First-conclusion bias (mental model). This is the tendency to stick with the first conclusion without considering alternative possibilities or additional information. It’s a cognitive bias that can impede critical thinking and thorough analysis.
First principles thinking (mental model). This approach involves breaking down complex problems into their most basic elements and then reassembling them from the ground up. It’s about getting to the fundamental truths of a situation and building your understanding from there rather than relying on assumptions or conventional wisdom.
The map is not the territory (mental model). This model reminds us that representations of reality are not reality itself. Maps, models, and descriptions are simplifications and cannot capture every aspect of the actual territory or situation. It’s a caution against over-relying on models and theories without considering the nuances of real-world situations. Source: Silfwer, J. (2022, November 3). Walter Lippmann: Public Opinion and Perception Management. Doctor Spin | The PR Blog. https://doctorspin.net/walter-lippmann/
Bell curve (mental model). This curve is a graphical depiction of a normal distribution, showing how many occurrences fall near the mean value and fewer occur as you move away from the mean. In decision-making, it’s used to understand and anticipate variability and to recognise that while extreme cases exist, most outcomes will cluster around the average.
Compounding (mental model). Often used in the context of finance, compounding refers to the process where the value of an investment increases because the earnings on an investment, both capital gains and interest, earn interest as time passes. This principle can be applied more broadly to understand how small, consistent efforts can yield significant long-term results.
Survival of the fittest (mental model). Borrowed from evolutionary biology, this mental model suggests that only those best adapted to their environment survive and thrive. In a business context, it can refer to companies that adapt to changing market conditions and are more likely to succeed.
Mr. Market (mental model). A metaphor created by Benjamin Graham, representing the stock market’s mood swings from optimism to pessimism. It’s used to illustrate emotional reactions in the market and the importance of maintaining objectivity.
Source: Graham, B. (2006). The intelligent investor: The definitive book on value investing (rev. ed., updated with new commentary by J. Zweig). Harper Business. (Original work published 1949.)
Second-order thinking (mental model). This kind of thinking goes beyond the immediate effects of an action to consider the subsequent effects. It’s about thinking ahead and understanding the longer-term consequences of decisions beyond just the immediate results.
Law of diminishing returns (mental model). This economic principle states that as investment in a particular area increases, the rate of profit from that investment, after a certain point, cannot increase proportionally and may even decrease. Understanding when additional investment yields progressively smaller returns is essential. Source: Diminishing returns. (2024, November 15). In Wikipedia. https://en.wikipedia.org/wiki/Diminishing_returns
Opportunity cost (mental model). This concept refers to the potential benefits one misses out on when choosing one alternative over another. It’s the cost of the next-best option forgone. Understanding opportunity costs helps in making informed decisions by considering what you give up when choosing.
Swiss Army knife approach (mental model). This concept emphasises the importance of having diverse tools (or skills). Being versatile and adaptable in various situations is valuable, like a Swiss Army knife. This model is beneficial in uncertain and volatile situations, and there’s also a case to be made for generalists in a specialised world. Sources: Parsons, M., & Pearson-Freeland, M. (Hosts). (2021, August 8). Charlie Munger: Latticework of mental models (No. 139) [Audio podcast episode]. In Moonshots podcast: Learning out loud; Epstein, D. (2019). Range: Why generalists triumph in a specialized world. Riverhead Books.
Acceleration theory (mental model). This concept indicates that the eventual winner need not lead the race from start to finish. Mathematically, delaying maximum “speed” by prolonging the slower acceleration phase can get you across the finish line faster. Source: Silfwer, J. (2012, October 31). The Acceleration Theory: Use Momentum To Finish First. Doctor Spin | The PR Blog. https://doctorspin.net/acceleration-theory/
Manage Expectations—This concept involves setting realistic expectations for yourself and others. It’s about aligning hopes and predictions with what is achievable and probable, thus reducing disappointment and increasing satisfaction. Effective expectation management can lead to better personal and professional relationships and outcomes.
Techlash—This mental model acknowledges that while technology can provide solutions, it can also create anticipated and unanticipated problems. It’s a reminder to approach technological innovations cautiously, considering potential negative impacts alongside the benefits. Source: Silfwer, J. (2018, December 27). The Techlash: Our Great Confusion. Doctor Spin | The PR Blog. https://doctorspin.net/techlash/
World’s Most Intelligent Question—This mental model refers to repeatedly asking “Why?” to delve deeper into a problem and understand its root causes. By continually asking why something happens, one can uncover layers of understanding that might remain hidden.
Regression to the Mean—This statistical principle states that extreme events are likely to be followed by more moderate ones. Over time, values tend to revert to the average, a concept relevant in many areas, from sports performance to business metrics.
False Dichotomy—This logical fallacy occurs when a situation is presented as having only two exclusive and mutually exhaustive options when other possibilities exist. It oversimplifies complex issues into an “either/or” choice. For instance, saying, “You are either with us or against us,” ignores the possibility of neutral or alternative positions.
Inversion—Inversion involves looking at problems backwards or from the end goal. Instead of thinking about how to achieve something, you consider what would prevent it from happening. This can reveal hidden obstacles and alternative solutions.
Psychology of Human Misjudgment—This mental model refers to understanding the common biases and errors in human thinking. By knowing how cognitive biases, like confirmation bias or the anchoring effect, can lead to flawed reasoning, one can make more rational and objective decisions.
Slow is Smooth, Smooth is Fast—Often used in military and tactical training, this phrase encapsulates the idea that sometimes, slowing down can lead to faster overall progress. The principle is that taking deliberate, considered actions reduces mistakes and inefficiencies, which can lead to faster outcomes in the long run. In practice, it means planning, training, and executing with care, leading to smoother, more efficient operations that achieve objectives faster than rushed, less thoughtful efforts. Source: Silfwer, J. (2020, April 24). Slow is Smooth, Smooth is Fast. Doctor Spin | The PR Blog. https://doctorspin.net/slow-is-smooth/
Because You Are Worth It—This mental model focuses on self-worth and investing in oneself. It suggests recognizing and affirming one’s value is crucial for personal growth, happiness, and success. This can involve self-care, education, or simply making choices that reflect one’s value and potential.
Physics Envy—This term describes the desire to apply the precision and certainty of physics to fields where such exactitude is impossible, like economics or social sciences. It’s a caution against overreliance on quantitative methods in areas where qualitative aspects play a significant role.
Easy Street Strategy—This principle suggests that simpler solutions are often better and more effective than complex ones. In decision-making and problem-solving, seeking straightforward, clear-cut solutions can often lead to better outcomes than pursuing overly complicated strategies. Source: Silfwer, J. (2021, January 27). The Easy Street PR Strategy: Keep It Simple To Win. Doctor Spin | The PR Blog. https://doctorspin.net/easy-street-pr-strategy/
Scale is Key—This concept highlights how the impact of decisions or actions can vary dramatically depending on their scale. What works well on a small scale might not be effective or feasible on a larger scale, and vice versa.
Circle of Competence—This concept involves recognizing and understanding one’s areas of expertise and limitations. The idea is to focus on areas where you have the most knowledge and experience rather than venturing into fields where you lack expertise, thereby increasing the likelihood of success.
Fail Fast, Fail Often—By failing fast, you quickly learn what doesn’t work, which helps in refining your approach or pivoting to something more promising. Failing often is seen not as a series of setbacks but as a necessary part of the process towards success. This mindset encourages experimentation, risk-taking, and learning from mistakes, emphasising agility and adaptability.
Correlation Does Not Equal Causation—This principle is a critical reminder in data analysis and scientific research. Just because two variables show a correlation (they seem to move together or in opposition) does not mean one causes the other. Other variables could be at play, or it might be a coincidence.
Critical Mass—This mental model emphasizes the importance of reaching a certain threshold to trigger a significant change, whether user adoption, market penetration, or social movement growth. This model guides strategic decisions, such as resource allocation, marketing strategies, and timing of initiatives, to effectively reach and surpass this crucial point. Source: Silfwer, J. (2019, March 10). Critical Mass: How Many Social Media Followers Do You Need? Doctor Spin | The PR Blog. https://doctorspin.net/critical-mass-followers/
Sorites Paradox—Also known as the paradox of the heap, this paradox arises from vague predicates. It involves a sequence of small changes that don’t seem to make a difference individually but, when accumulated, lead to a significant change where the exact point of change is indiscernible. For example, if you keep removing grains of sand from a heap, when does it stop being a heap? Each grain doesn’t seem to make a difference, but eventually, you’re left with no heap.
The Power of Cycle Times—Mathematically, reducing cycle times in a process that grows exponentially (like content sharing on social networks) drastically increases the growth rate, leading to faster and wider dissemination of the content, thereby driving virality. If each cycle multiplies the number of new shares by a constant m, the cycle time is t, and the total time under consideration is T, the number of cycles is T/t, and total growth scales as m raised to the power T/t (see the short sketch after this list). The combination of exponential growth, network effects, and feedback loops makes cycle time a critical factor. Source: Silfwer, J. (2017, February 6). Viral Loops (or How to Incentivise Social Media Sharing). Doctor Spin | The PR Blog. https://doctorspin.net/viral-loop/
Non-Linearity—This mental model recognises that outcomes in many situations are not directly proportional to the inputs or efforts. It suggests that effects can be disproportionate to their causes, either escalating rapidly with small changes or remaining stagnant despite significant efforts. Understanding non-linearity helps in recognizing and anticipating complex patterns in various phenomena.
Checklists—This mental model stresses the importance of systematic approaches to prevent mistakes and oversights. Using checklists in complex or repetitive tasks ensures that all necessary steps are followed and nothing is overlooked, thereby increasing efficiency and accuracy. Source: Silfwer, J. (2020, September 18). Communicative Leadership in Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/communicative-leadership/
Lollapalooza—Coined by Munger, this term refers to situations where multiple factors, tendencies, or biases interact so that the combined effect is much greater than the sum of individual effects. It’s a reminder of how various elements can converge to create significant impacts, often unexpected or unprecedented.
Limits—This mental model acknowledges that everything has boundaries or limits, beyond which there can be negative consequences. Recognising and respecting personal, professional, and physical limits is essential for sustainable growth and success.
The 7Ws—This mental model refers to the practice of asking “Who, What, When, Where, Why” (and sometimes “How”) to understand a situation or problem fully. By systematically addressing these questions, one can comprehensively understand an issue’s context, causes, and potential solutions, leading to more informed decision-making. Source: Silfwer, J. (2020, September 18). The Checklist for Communicative Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/checklist-for-communicative-leadership/
Chauffeur Knowledge—This mental model distinguishes between having a surface-level understanding (like a chauffeur who knows the route) and deep, genuine knowledge (like an expert who understands the intricacies of a subject). It warns against the illusion of expertise based on superficial knowledge and emphasizes the importance of true, deep understanding.
Make Friends with the Eminent Dead—This mental model advocates learning from the past, particularly from significant historical figures and their writings. Studying the experiences and thoughts of those who have excelled in their fields can yield valuable insights and wisdom.
Seizing the Middle—This strategy involves finding and maintaining a balanced, moderate position, especially in conflict or negotiation. It’s about avoiding extremes and finding a sustainable, middle-ground solution. Also, centre positions often offer the broadest range of options.
Asymmetric Warfare—This refers to conflict between parties of unequal strength, where the weaker party uses unconventional tactics to exploit the vulnerabilities of the stronger opponent. It’s often discussed in military and business contexts.
Boredom Syndrome—This term refers to the human tendency to seek stimulation or change when things become routine or monotonous, which can lead to unnecessary changes or risks. Sometimes taking no action is the better choice, but remaining idle can be difficult.
Survivorship Bias—This cognitive bias involves focusing on people or things that have “survived” some process while inadvertently overlooking those that did not, because failures are far less visible. This can lead to false conclusions, since it ignores the experiences of those who did not make it through the process (see the simulation sketch after this list). 25
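To make the cycle-time arithmetic concrete (see annotation 21), here is a minimal Python sketch. It is an illustration only: the share multiplier m, the time window T, and the cycle times t are assumed values, not data from the article.

```python
# Minimal sketch of the viral-loop approximation: reach ~ m ** (T / t),
# assuming one initial share and a constant multiplier m per cycle.
# All parameter values below are illustrative assumptions.

def approximate_reach(m: float, total_time: float, cycle_time: float) -> float:
    """Approximate total reach after `total_time`, given `m` new shares
    per existing share and one sharing cycle every `cycle_time` units."""
    cycles = total_time / cycle_time  # number of sharing cycles in the window
    return m ** cycles                # growth is exponential in the cycle count

m = 1.5   # new shares per existing share, per cycle (assumed)
T = 10.0  # total time window, e.g. in days (assumed)

for t in (2.0, 1.0, 0.5):  # progressively shorter cycle times
    print(f"cycle time {t}: ~{approximate_reach(m, T, t):,.0f} shares")
```

Note how halving the cycle time squares the reach, since m^(T/(t/2)) = (m^(T/t))^2; this is why shortening the loop often matters more than nudging the multiplier.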
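And to see survivorship bias in numbers, here is a small Python simulation. The normal distribution and the survival threshold are invented purely for illustration; the point is that averaging only over survivors inflates the estimate because the failures are invisible.

```python
# Minimal simulation of survivorship bias: compare the true population
# average with the average computed from "survivors" only.
# The distribution and the cut-off are illustrative assumptions.
import random

random.seed(42)  # reproducible run

# Outcomes for 100,000 ventures, centred on zero by construction.
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Only ventures above the threshold remain visible afterwards.
survivors = [x for x in population if x > 1.0]

true_mean = sum(population) / len(population)
survivor_mean = sum(survivors) / len(survivors)

print(f"true population mean: {true_mean:+.3f}")    # ~0.000
print(f"mean among survivors: {survivor_mean:+.3f}")  # ~+1.5, heavily inflated
```

Judging, say, strategies only by the companies still standing makes the average “winning strategy” look far better than it is.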
Each mental model offers a lens for viewing problems, making decisions, and strategising, reflecting the complexity and diversity of thought required in various fields and situations.
Numerous other mental models are also used in various fields, such as economics, psychology, and systems thinking.
Learn more: Mental Models: How To Be a Better Thinker
PR Resource: How To Create Knowledge
How To Create Knowledge
“If you can’t explain it simply, you don’t understand it well enough.”
— Albert Einstein
This list covers types of reasoning and logical processes, methodological approaches, data analysis perspectives, and philosophical frameworks: the different ways knowledge can be approached, analysed, and interpreted.
Types of Reasoning and Logical Processes
Methodological Approaches
Data and Analysis Perspectives
Philosophical and Theoretical Frameworks
Learn more: How To Create Knowledge
PR Resource: Types of Intelligences
Howard Gardner: 10 Intelligence Types
“Gardner’s theory of multiple intelligences has revolutionized education, challenging the notion of a single, fixed intelligence and promoting a more diverse approach to teaching and learning.”
Source: 26
Howard Gardner’s theory of multiple intelligences expands the traditional view of intelligence beyond logical and linguistic capabilities. 27
Here’s a description of each type of intelligence as outlined in his theory:
Each intelligence type represents a different way of processing information, and the theory suggests that everyone has a unique blend of these intelligences. 29
Learn more: 10 Intelligence Types: Howard Gardner’s Theory
ANNOTATIONS
1. Please note that the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.
2. It’s worth noting that these models are not exclusively Charlie Munger’s inventions but tools he advocates for effective thinking and decision-making.
3. Knodell, P. A. (2016). All I want to know is where I’m going to die so I’ll never go there: Buffett & Munger – A study in simplicity and uncommon, common sense. PAK Publishing.
4. Red Queen hypothesis. (2023, November 27). In Wikipedia. https://en.wikipedia.org/wiki/Red_Queen_hypothesis
5. Carroll, L. (2006). Through the looking-glass, and what Alice found there (R. D. Martin, Ed.). Penguin Classics. (Original work published 1871.)
6. Ariew, R. (1976). Ockham’s Razor: A historical and philosophical analysis of simplicity in science. Scientific American, 234(3), 88–93.
7. Hanlon, R. J. (1980). Murphy’s Law book two: More reasons why things go wrong! Los Angeles: Price Stern Sloan.
8. Keynes, J. M. (1936). The general theory of employment, interest, and money. London: Macmillan.
9. Kaufman, P. A. (Ed.). (2005). Poor Charlie’s almanack: The wit and wisdom of Charles T. Munger. Virginia Beach, VA: Donning Company Publishers.
10. Chesterton, G. K. (1929). The Drift from Domesticity. In The Thing. London: Sheed & Ward, p. 35.
11. Silfwer, J. (2022, November 3). Walter Lippmann: Public Opinion and Perception Management. Doctor Spin | The PR Blog. https://doctorspin.net/walter-lippmann/
12. Graham, B. (2006). The intelligent investor: The definitive book on value investing (Rev. ed., updated with new commentary by J. Zweig). Harper Business. (Original work published 1949.)
13. Diminishing returns. (2024, November 15). In Wikipedia. https://en.wikipedia.org/wiki/Diminishing_returns
14. Parsons, M., & Pearson-Freeland, M. (Hosts). (2021, August 8). Charlie Munger: Latticework of mental models (No. 139) [Audio podcast episode]. In Moonshots podcast: Learning out loud. Moonshots. https://www.moonshots.io/episode-139-charlie-munger-latticework-of-mental-models
15. Epstein, D. (2019). Range: Why generalists triumph in a specialized world. Riverhead Books.
16. Silfwer, J. (2012, October 31). The Acceleration Theory: Use Momentum To Finish First. Doctor Spin | The PR Blog. https://doctorspin.net/acceleration-theory/
17. Silfwer, J. (2018, December 27). The Techlash: Our Great Confusion. Doctor Spin | The PR Blog. https://doctorspin.net/techlash/
18. Silfwer, J. (2020, April 24). Slow is Smooth, Smooth is Fast. Doctor Spin | The PR Blog. https://doctorspin.net/slow-is-smooth/
19. Silfwer, J. (2021, January 27). The Easy Street PR Strategy: Keep It Simple To Win. Doctor Spin | The PR Blog. https://doctorspin.net/easy-street-pr-strategy/
20. Silfwer, J. (2019, March 10). Critical Mass: How Many Social Media Followers Do You Need? Doctor Spin | The PR Blog. https://doctorspin.net/critical-mass-followers/
21. Let’s say the number of new social media shares per cycle is a constant multiplier, m. If the cycle time is t and the total time under consideration is T, the number of cycles in this time is T/t. The total reach after time T can be approximated by m^(T/t), assuming one initial share. When t decreases, T/t increases, meaning more cycles occur in the same total time, T. This raises the exponent in m^(T/t), which means a more extensive reach.
22. Silfwer, J. (2017, February 6). Viral Loops (or How to Incentivise Social Media Sharing). Doctor Spin | The PR Blog. https://doctorspin.net/viral-loop/
23. Silfwer, J. (2020, September 18). Communicative Leadership in Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/communicative-leadership/
24. Silfwer, J. (2020, September 18). The Checklist for Communicative Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/checklist-for-communicative-leadership/
25. Silfwer, J. (2019, October 17). Survivorship Bias — Correlation Does Not Equal Causation. Doctor Spin | The PR Blog. https://doctorspin.net/survivorship-bias/
26. Checkley, K. (1997). The First Seven…and the Eighth: A Conversation with Howard Gardner. Educational Leadership, 55, 8–13.
27. Theory of multiple intelligences. (2023, November 28). In Wikipedia. https://en.wikipedia.org/wiki/Theory_of_multiple_intelligences
28. See also: Silfwer, J. (2023, April 25). Theory of Mind: A Superpower for PR Professionals. Doctor Spin | The PR Blog. https://doctorspin.net/theory-of-mind-a-superpower-for-pr-professionals/
29. Gardner, H. (1983). Frames of Mind: The Theory of Multiple Intelligences. Basic Books.