
58 Logical Fallacies and Cognitive Biases

The fascinating science of being stupid.

Cover photo: @jerrysilfwer


All of us are prone to logical fallacies and cognitive biases.

I know that I’m stupid sometimes — most of us are.

Still, we should all strive to be less stupid.

I’m deeply fascinated with studying logical fallacies and cognitive biases. Learning about human behaviours is helpful in public relations, where we deal with communication challenges daily.

Here we go:

Logical fallacies and cognitive biases.

1. Fallacy of Composition

Fallacy of composition: “Since our top salesperson is a great public speaker, our entire sales team must also be excellent public speakers.”

The fallacy of composition, a prevalent cognitive bias in decision-making, arises when individuals erroneously infer that the attributes of a single component or a select few components within a more extensive system extend to the entire system.

This fallacious thinking may manifest in various contexts — from organizational strategy to market analysis — and can lead to misguided decisions with potentially adverse consequences.

Business leaders must engage in thoughtful and rigorous analysis to avoid falling prey to this fallacy. They must recognise that the dynamics of complex systems may not always mirror the characteristics of their parts and that a more holistic approach is necessary to navigate the intricacies of today’s ever-evolving business landscape.

2. Fallacy of Division

Fallacy of division: “Our company is a market leader, so every employee within our organization must be an expert in their respective field.”

The fallacy of division emerges as a subtle yet significant cognitive trap, enticing decision-makers to mistakenly assume that the properties of a collective whole must inherently apply to its components.

This flawed logic can lead to erroneous conclusions and ill-informed decisions, particularly in organisational dynamics, where unique elements within a system may not conform to the overarching characteristics of the larger entity.

To counteract this fallacy, business leaders must adopt a nuanced approach, cultivating an understanding that the intricacies of complex systems demand careful consideration of the distinct attributes and interactions of their constituent parts rather than relying on simplistic generalizations that may obscure critical insights.

3. The Gambler’s Fallacy

Gambler’s fallacy: “We’ve had three failed product launches in a row; our next product is a guaranteed success.”

The gambler’s fallacy, a widespread cognitive bias often encountered in decision-making, stems from the erroneous belief that past events can influence the probability of future independent events.

This misleading notion can lead to faulty assumptions and misguided decisions, particularly in business contexts where uncertainty and randomness are prominent.

To mitigate the risks associated with the gambler’s fallacy, executives must develop a data-driven mindset. They must acknowledge the independence of discrete events and leverage statistical analysis to inform strategic choices. This will foster more accurate assessments of probability and more informed decision-making in an unpredictable business landscape.
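For the statistically minded, the independence claim is easy to check numerically. Here’s a minimal Python sketch; the 30% success rate per launch and the number of simulated launches are made-up numbers purely for illustration:

```python
import random

random.seed(42)

P_SUCCESS = 0.3   # assumed success rate of any single, independent launch
TRIALS = 100_000  # number of simulated four-launch sequences

# Count how often the fourth launch succeeds, given that the
# first three in the sequence all failed.
conditional_successes = 0
conditional_total = 0
for _ in range(TRIALS):
    launches = [random.random() < P_SUCCESS for _ in range(4)]
    if not any(launches[:3]):  # three failures in a row
        conditional_total += 1
        if launches[3]:
            conditional_successes += 1

rate_after_three_failures = conditional_successes / conditional_total
# The conditional rate stays near the 30% base rate: past failures
# grant no "guaranteed success" on the next attempt.
print(round(rate_after_three_failures, 3))
```

However many failures precede it, the next independent event keeps its original probability — the simulation lands near 0.3, not near 1.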

4. Tu Quoque (Who Are You To Talk?)

Tu quoque: “Our competitor’s CEO is criticizing our environmental policies, but their own company has had pollution issues in the past.”

The tu quoque fallacy, colloquially known as the “who are you to talk?” argument, represents a pernicious rhetorical tactic employed to deflect criticism or undermine an opponent’s position by highlighting their perceived hypocrisy or inconsistency rather than addressing the substance of the argument itself.

In the context of business discourse, this ad hominem attack can derail productive conversations and obscure valuable insights, potentially stifling innovation and collaboration.

To foster more constructive dialogue, organizational leaders must cultivate an environment that encourages open and honest communication. They must focus on the merits of the presented ideas and discourage personal attacks or appeals to hypocrisy. They must empower individuals to engage in reasoned debate and contribute to the collective pursuit of excellence.

5. Strawman

Strawman: “Our colleague wants to cut costs, but I doubt they’d be happy if we had to compromise the quality of our products and lose customers as a result.”

The strawman fallacy, a deceptive rhetorical manoeuvre often encountered in business discourse, involves misrepresenting an opponent’s argument by constructing a distorted or oversimplified version of their stance, which is easier to refute or discredit.

This misleading tactic can obstruct meaningful dialogue, engender hostility, and inhibit the exploration of nuanced perspectives necessary for driving innovation and informed decision-making.

To foster a collaborative and intellectually rigorous environment, organisational leaders must emphasize the importance of engaging with the substance of the arguments presented. They must encourage participants to actively listen, seek clarification, and challenge ideas constructively, ultimately advancing the collective pursuit of knowledge and organizational success.

6. Ad Hominem

Ad hominem: “I wouldn’t trust a proposal compiled by someone known for their disorganization.”

The ad hominem fallacy, a detrimental form of argumentation frequently encountered in professional discourse, occurs when an individual targets an opponent’s personal attributes or character traits rather than addressing the substance of their argument.

This diversionary tactic can hinder productive discussion, impede the flow of valuable insights, and foster a toxic work environment, undermining the collaborative spirit essential to organizational success.

To create a culture of open and respectful dialogue, business leaders must actively discourage ad hominem attacks, encourage team members to engage with the merits of ideas presented, foster an atmosphere of intellectual rigour, and promote an inclusive environment where diverse perspectives can flourish and contribute to the organization’s growth and innovation.

7. Genetic Fallacy (Fallacy of Origin or Fallacy of Virtue)

Genetic fallacy: “The marketing strategy proposed by our newest team member can’t be any good; they’ve only been with the company for a few months.”

The genetic fallacy, also known as the fallacy of origin or fallacy of virtue, is a flawed reasoning pattern that arises when an argument’s validity or worth is assessed based on its source or origin rather than the argument’s merits.

This cognitive bias can obstruct the objective evaluation of ideas in a business context, potentially leading to missed opportunities, stifled innovation, or unwise strategic decisions.

To counteract the influence of the genetic fallacy, organisational leaders must cultivate a culture of intellectual openness. They must emphasize the importance of engaging with the substance of ideas, regardless of their origins, and foster an environment where critical thinking, reasoned debate, and the free exchange of diverse perspectives can thrive. This will ultimately drive informed decision-making and organizational success.

8. Fallacious Appeal to Authority

Fallacious appeal to authority: “We should invest in this new technology because a famous entrepreneur mentioned it in a recent podcast.”

Fallacious appeal to authority is a deceptive form of argumentation in which an individual invokes the opinion or endorsement of a purported expert to bolster their position despite the expert’s lack of relevant expertise or credibility on the subject.

In a business context, this cognitive bias can lead to ill-informed decisions, misplaced trust, and potentially detrimental consequences for organizational performance.

To safeguard against the fallacious appeal to authority, business leaders must foster a culture of critical thinking, promote evidence-based decision-making, and encourage team members to scrutinize the credibility and relevance of expert opinions. This will ensure that strategic choices are informed by rigorous analysis and well-founded expertise rather than mere assertions of authority.

9. Red Herring

Red herring: “We shouldn’t worry about our declining market share; after all, our office just won an award for its eco-friendly design.”

The red herring fallacy, a cunning diversionary tactic often encountered in professional discourse, involves introducing an unrelated or tangential issue to distract from the original argument or issue at hand.

This deceptive manoeuvre can undermine productive dialogue, hinder the pursuit of meaningful solutions, and impede the collaborative exchange of ideas essential to driving innovation and organizational success.

To foster a focused and intellectually honest environment, business leaders must emphasize the importance of staying on topic and addressing the substance of arguments. They must cultivate a culture of active listening and disciplined discussion that allows for the thoughtful examination of critical issues. This will promote well-informed decision-making and the organization’s ability to navigate complex challenges effectively.

10. Appeal to Emotion

Appeal to emotion: “We can’t outsource our manufacturing overseas; think about the impact on our local employees’ families.”

The appeal to emotion fallacy, a manipulative tactic frequently observed in professional and personal interactions, involves leveraging emotional triggers to persuade or influence others, sidestepping the merits of the argument or the rationality of the underlying facts.

In a business context, this fallacy can lead to hasty decisions, impede objective evaluation, and inhibit the collaborative exchange of ideas crucial for driving innovation and sound decision-making.

To counteract the appeal to emotion, organizational leaders must foster a culture of critical thinking. They must emphasize the importance of evidence-based reasoning and rational deliberation while acknowledging the role of emotions in human decision-making and encouraging employees to strike a balance between emotional intelligence and analytical rigour in navigating the complexities of the business landscape.

11. Appeal to Popularity (The Bandwagon Effect)

Appeal to popularity: “We should implement the same remote work policy as the leading tech companies; if it’s good enough for them, it must be good for us.”

The appeal to popularity, also known as the bandwagon effect, is a fallacious form of argumentation that relies on the widespread acceptance or popularity of an idea or course of action as sufficient evidence of its validity or efficacy.

In business, succumbing to this fallacy can lead to herd mentality, stifled innovation, and suboptimal decision-making. Organizations risk neglecting rigorous analysis and thoughtful deliberation in favour of following prevailing trends.

Business leaders must cultivate a culture that values independent thinking and evidence-based decision-making to counteract the bandwagon effect. They must encourage team members to critically assess popular beliefs and practices and foster an environment where diverse perspectives can be openly shared and debated. This will ultimately drive informed decision-making and sustained organizational success.

12. Appeal to Tradition

Appeal to tradition: “We’ve always used this software for our project management, so there’s no reason to consider alternatives now.”

The appeal to tradition fallacy, a pervasive cognitive bias in decision-making, occurs when an individual argues that a particular belief or practice should be maintained simply because it has been long-standing or customary.

In a business context, this fallacy can hinder innovation, stifle adaptation to changing market conditions, and perpetuate outdated or inefficient practices, potentially undermining an organization’s ability to compete and grow.

Astute business leaders must foster a culture that embraces continuous improvement and adaptation to counter the appeal to tradition. They must encourage team members to evaluate long-held beliefs and practices critically and consider novel approaches that may offer more effective solutions to the challenges of a rapidly evolving business landscape.

13. Appeal to Nature

Appeal to nature: “We should switch to a completely organic ingredient supplier, even if it’s more expensive, because natural products are always better.”

The appeal to nature fallacy emerges when an individual asserts that something is inherently excellent or superior simply because it is deemed natural or unaltered while dismissing or devaluing alternatives that may be perceived as artificial or synthetic.

In the business world, this fallacy can lead to suboptimal decision-making, risk aversion to innovation, and an overreliance on traditional or ‘natural’ solutions that may not effectively address contemporary challenges.

To navigate this cognitive bias, savvy business leaders must encourage a culture of critical thinking and open-mindedness. They must promote evidence-based decision-making that carefully evaluates the advantages and drawbacks of various options, whether they are rooted in nature or human ingenuity. Thus, they will foster an environment that supports innovation, adaptability, and sustainable growth.

14. Appeal to Ignorance

Appeal to ignorance: “No one has proven that our new public relations campaign won’t work, so it must be a good idea.”

The appeal to ignorance fallacy arises when an individual contends that a claim is valid simply because it has not been proven false, or vice versa, exploiting gaps in knowledge or evidence to bolster their argument.

In a business context, this fallacy can lead to misguided decision-making, overconfidence in unverified assumptions, and a disregard for the importance of thorough analysis and evidence-based reasoning.

Business leaders must cultivate a culture that values intellectual humility to mitigate the risks associated with the appeal to ignorance. They must emphasise the importance of recognising and addressing knowledge gaps, seeking reliable evidence to inform decision-making, and fostering an environment where team members are encouraged to continually learn, adapt, and refine their understanding of the complex and ever-evolving business landscape.

15. Begging the Question

Begging the question: “Our company’s products are the best on the market because we provide the highest quality.”

The begging-the-question fallacy, a subtle yet problematic form of circular reasoning, occurs when an argument’s conclusion is assumed within its premises, sidestepping the need for genuine evidence or logical support.

In the business world, this fallacy can lead to unfounded assumptions, superficial analyses, and misguided decision-making that may undermine an organization’s ability to navigate challenges and seize opportunities effectively.

Business leaders must foster a culture that values critical thinking, open inquiry, and evidence-based decision-making to counteract the risk of begging the question. They must encourage team members to rigorously examine the premises of their arguments, identify and address any underlying assumptions, and engage in a constructive, reasoned debate that drives innovation, growth, and sustainable success.

16. Equivocation

Equivocation: “Our sales figures are certainly interesting, which means they’re worth considering for future strategy.”

Equivocation, a deceptive rhetorical strategy frequently encountered in professional discourse, occurs when an individual exploits the ambiguity or multiple meanings of a word or phrase to create confusion or mislead their audience. This effectively avoids a clear or direct response to an argument or question.

In a business context, equivocation can obstruct meaningful communication, hinder the effective exchange of ideas, and undermine trust among team members, ultimately impeding innovation and sound decision-making.

To promote transparency and intellectual honesty within an organization, business leaders must emphasize the importance of clear and precise language, encouraging team members to seek clarification when faced with ambiguous statements and fostering a culture of open dialogue that values the rigorous examination of ideas and constructive debate, driving informed decision-making and sustained organizational success.

17. False Dichotomy (Black or White)

False dichotomy: “We either need to cut costs drastically, or we have to increase our prices significantly — there’s no other way to improve our profit margin.”

The false dichotomy fallacy, also known as the black or white fallacy, arises when an individual presents a complex issue or decision as having only two mutually exclusive options. This effectively oversimplifies the matter and ignores alternative perspectives or potential solutions.

In a business context, this fallacious reasoning can stifle creativity, hinder comprehensive problem-solving, and lead to suboptimal decision-making, ultimately constraining an organization’s ability to adapt and innovate in a rapidly evolving landscape.

To counteract the risks associated with false dichotomies, business leaders must encourage critical thinking and open-mindedness, foster an environment that values exploring nuanced perspectives and diverse approaches, and empower team members to engage in collaborative problem-solving that drives innovation.

18. Middle Ground Fallacy

Middle ground fallacy: “Our team is divided on whether to invest in research and development or marketing, so let’s allocate half our budget to each and satisfy everyone.”

The middle ground fallacy is a deceptive form of argumentation in which an individual asserts that a compromise or middle point between two opposing positions must inherently represent the correct or most reasonable solution, neglecting the possibility that one or both extremes may hold merit or that the optimal solution may lie elsewhere.

In a business context, this fallacy can lead to suboptimal decision-making, foster a false sense of consensus, and potentially overlook innovative or superior solutions.

To guard against the middle ground fallacy, business leaders must promote a culture of critical thinking and open debate. They must encourage team members to examine the strengths and weaknesses of various perspectives rigorously and foster an environment that supports collaborative problem-solving and the pursuit of evidence-based, well-informed solutions.

19. Decision Point Fallacy (Sorites Paradox)

Decision point fallacy: “We can’t determine the exact point at which adding more features to our product will make it too complex for our users, so let’s keep adding features without considering the potential downsides.”

The decision point fallacy, also known as the Sorites Paradox, arises when an individual struggles to identify a precise threshold or turning point within a series of incremental changes. This leads to flawed reasoning or indecision.

This cognitive bias can manifest in a business context when decision-makers become mired in the minutiae of continuous improvement or incremental progress, losing sight of the bigger picture and ultimately hampering their ability to make strategic choices.

To counteract the decision point fallacy, organizational leaders must foster a culture emphasising the importance of establishing clear objectives, maintaining a holistic perspective, and striking a balance between incremental progress and decisive action, empowering team members to navigate complex challenges and drive sustained success.

20. Slippery Slope Fallacy

Slippery slope fallacy: “If we allow our employees to work remotely for one day a week, productivity will plummet, and soon everyone will be demanding a completely flexible schedule, resulting in chaos and the collapse of our company culture.”

The slippery slope fallacy occurs when an individual argues that a specific action or decision will inevitably lead to a chain of negative consequences without providing sufficient evidence for this causal relationship.

In a business context, this fallacious reasoning can undermine productive dialogue, stifle innovation, and promote an overly cautious approach to problem-solving, ultimately inhibiting an organization’s ability to adapt and grow.

To guard against the slippery slope fallacy, business leaders must foster a culture that values evidence-based decision-making and encourages team members to critically examine their arguments’ logic and assumptions. This promotes a balanced and objective assessment of potential risks and opportunities that drive informed decision-making and sustained success.

21. Hasty Generalisations (Anecdotal Evidence)

Hasty generalisations: “One of our remote employees missed a deadline last month, which clearly shows that allowing employees to work remotely leads to decreased productivity and a lack of accountability.”

Hasty generalizations, often fueled by anecdotal evidence, occur when an individual draws broad conclusions based on insufficient or unrepresentative data, resulting in potentially flawed or biased reasoning.

Relying on hasty generalizations in a business context can lead to misguided decision-making, suboptimal strategies, and an inability to effectively address complex challenges, ultimately impeding an organization’s success.

Business leaders must emphasize the importance of thorough analysis, evidence-based decision-making, and critical thinking to counteract the risks associated with hasty generalisations. They must also encourage team members to recognize the limitations of anecdotal evidence and consider diverse perspectives, fostering a culture that values rigorous inquiry and comprehensive problem-solving.
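Why are tiny samples so treacherous? A quick simulation makes the point. This Python sketch assumes a made-up “true” on-time rate of 90% and compares how much the estimate swings when you observe three deadlines versus three thousand:

```python
import random
import statistics

random.seed(7)

TRUE_ON_TIME_RATE = 0.9  # assumed true rate: deadlines are hit 90% of the time

def estimate(sample_size: int) -> float:
    """Estimate the on-time rate from a random sample of deadlines."""
    hits = sum(random.random() < TRUE_ON_TIME_RATE for _ in range(sample_size))
    return hits / sample_size

# Many tiny samples (one employee's recent deadlines) scatter widely...
small_estimates = [estimate(3) for _ in range(1_000)]
# ...while large samples (a year across the whole team) converge.
large_estimates = [estimate(3_000) for _ in range(1_000)]

small_spread = statistics.pstdev(small_estimates)
large_spread = statistics.pstdev(large_estimates)
print(round(small_spread, 3), round(large_spread, 3))
```

With only three observations, estimates of a 90% rate routinely come out at 67% or even 33% — the anecdote about one missed deadline tells you almost nothing about remote work in general.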

22. Faulty Analogy

Faulty analogy: “Managing a business is like riding a bicycle; once you’ve learned the basics, it’s all about maintaining balance and momentum, so we don’t need to invest in ongoing professional development for our employees.”

The faulty analogy fallacy arises when an individual draws a comparison between two concepts or situations that are not sufficiently alike, resulting in misleading or unsupported conclusions.

Relying on faulty analogies in a business context can impede effective problem-solving, foster misconceptions, and contribute to ill-advised decision-making, ultimately undermining an organization’s ability to innovate and succeed.

To guard against faulty analogies, business leaders must cultivate a culture that values critical thinking, logical rigour, and evidence-based reasoning. They must also encourage team members to scrutinize the validity of their comparisons and seek diverse perspectives that challenge assumptions and promote nuanced understanding.

23. Burden of Proof

Burden of proof: “Our new marketing strategy will boost sales by at least 20%; if you don’t believe me, prove me wrong.”

The burden of proof fallacy occurs when an individual asserts a claim without providing sufficient evidence, often shifting the responsibility to disprove the assertion onto others.

In a business context, this fallacious reasoning can hinder productive discourse, foster unwarranted assumptions, and contribute to flawed decision-making, ultimately impeding an organization’s ability to navigate challenges effectively and capitalize on opportunities.

To mitigate the risks associated with the burden of proof fallacy, business leaders must promote a culture of evidence-based reasoning, critical thinking, and intellectual accountability. They must encourage team members to substantiate their claims with robust supporting evidence and engage in a constructive, well-informed debate that drives innovative problem-solving and sustainable success.

24. Affirming the Consequent

Just because an if-then statement is true doesn’t mean its reverse is: observing the consequence doesn’t prove the condition.

“A cat meows, so everything that meows is a cat.”

25. Denying the Antecedent (Fallacy of the Inverse)

From a true if-then statement, it doesn’t follow that negating the condition negates the consequence; the consequence may hold for other reasons.

“A cat meows, so if it isn’t a cat, it doesn’t meow.”

26. Moving the Goalposts

Manipulating the argument by changing the specifics of your initial claims — after being questioned or even proven wrong.

“Yes, there might be some innocent people in jail, but I was only talking about the guilty.”

27. No True Scotsman

To disqualify someone or something based on a false or biased ideal.

“All real men have beards, so if you don’t have a beard, you can’t be a real man.”

28. Personal Incredulity

Something isn’t untrue just because you find it hard to believe or imagine.

“I can’t believe that the universe and everything in it arose from nothing, so it can’t be true.”

29. False Causality

The false assumption that correlation equals causation.

“Crime rates went up when the price of gas went up, so for everyone’s safety, we must lower our taxes on fossil fuels.”
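A hidden third factor often explains why two unrelated things move together. This minimal Python sketch fakes the scenario with a made-up confounder (say, overall economic activity) driving two series that never influence each other directly; all names and numbers are illustrative assumptions:

```python
import math
import random

random.seed(1)

# A hidden confounder drives both series; neither causes the other.
N = 5_000
confounder = [random.gauss(0, 1) for _ in range(N)]
gas_price = [z + random.gauss(0, 0.5) for z in confounder]
crime_rate = [z + random.gauss(0, 0.5) for z in confounder]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(gas_price, crime_rate)
# Strong correlation, zero direct causation: both merely track the confounder.
print(round(r, 2))
```

The two series correlate strongly (around 0.8 here) even though, by construction, neither has any causal effect on the other — which is exactly why a correlation alone can’t justify the tax policy in the example above.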

30. Texas Sharpshooter

Deciding on a position first, then finding only data that supports it. This fallacy is especially prominent in the digital age, when it’s possible to find arguments online defending almost any imaginable position.

“I’ve found numerous studies supporting my position, and I have no idea if any studies also support your position.”

31. Loaded Question

To ask a question with an assumption already built into the question.

“Have you stopped beating your wife?”

32. Chesterton’s Fence

If we don’t understand or see the reason for something, we might be inclined to do away with it. However, most things have been implemented for a reason, even if we don’t understand it. Therefore, we should leave it be unless we fully understand its purpose.

“There’s a fence here, but I can’t see what it’s good for, so let’s do away with it.”

33. Survivorship Bias

I’ve already written about survivorship bias and Abraham Wald.

“All returning warplanes had damage to their wings, so we should reinforce the wings to make them safer.”

Read also: Survivorship Bias: Abraham Wald and the WWII Airplanes

34. The Dunning-Kruger Effect

A cognitive bias where people with limited competence in a new field overestimate their ability. (Please note that the Dunning-Kruger effect is under scientific scrutiny and lacks broad support from the scientific community.)

“I’ve just started learning about this, and I’m amazed at how much more I know now compared to before when I knew next to nothing, so I’m quite sure that I’m an expert on this subject now.”

35. Confirmation Bias

Most of us tend to recall or interpret information in a way that reinforces our existing cognitive schemas.

“I refused to take my medicine and got well, so I always do my best to avoid treatment.”

36. Heuristic Anchoring

When faced with an initial number, we often compare subsequent numbers to that anchor.

“The third house shown to us by the broker was over our budget but still a bargain compared to the first two houses she showed us.”

37. The Curse of Knowledge

We tend to assume that other people have at least enough knowledge to comprehend and digest what we’re saying.

“A preschool class came by the lab yesterday and asked about my work, so I talked about genome sequencing for a good half hour and got no follow-up questions.”

38. Optimism/Pessimism Bias

We find it easier to believe that negative things can happen to others than to ourselves. However, some people are biased in the opposite direction; they overestimate the likelihood of adverse events.

“We’re so blessed that those terrible things couldn’t ever happen to us.” / “What happened to them will also happen to us — only worse.”

39. The Sunk Cost Fallacy

Sometimes, we stick to a behaviour simply because we’ve already invested time, money, and other resources. Abandoning such an investment would force us to face an irreversible failure.

“I ordered too much food, so we’ll simply have to over-eat for a few weeks to get our money’s worth.”

40. Negativity Bias

We tend to react more strongly to negative impacts than to positive effects of similar or equal weight.

“Our daughter graduated with honours from college yesterday, but then on our way home, our car broke down and ruined the rest of the day.”

41. Declinism

We tend to think that everything will decline, especially with new developments. This might be due to cognitive laziness; we don’t wish to change how we feel in tandem with the times.

“Everything was better in the past, so change is terrible.”

Learn more: Social Media — The Good, The Bad, and the Ugly

42. The Backfire Effect (Conversion Theory)

When challenged, we might cling even more firmly to our beliefs — instead of questioning ourselves.

“People hate us, but this proves us right about everything.”

Learn more: Conversion Theory: The Disproportionate Influence of Minorities

43. The Fundamental Attribution Error

When someone else makes a mistake, we attribute it to their character or behaviour. When we make the same mistake, we attribute it to contextual circumstances.

“When I’m in a rush, people behave like idiots in traffic.”

44. In-Group Bias

We have evolved to be subjectively preferential to people who belong to the same social group. This isn’t necessarily bad behaviour per se, but we must watch out for situations where we can’t be expected to be fair and objective.

“I might be biased, of course, but I dare say, objectively, that my daughter was the best performer in the whole orchestra.”

Learn more: Social Group Sizes (The Social Brain Hypothesis)

45. The Forer Effect (The Barnum Effect)

We tend to fill any gaps in the information we’re given using our existing cognitive schemas. This is, for instance, why it’s so easy to think that a horoscope is eerily accurate: we fail to recognise that vague statements might apply to ourselves and many others alike.

“I read my horoscope yesterday, and the information was uncannily accurate, so I’m certainly convinced that there are some things about the cosmos that influence our lives in a way that we can’t yet understand.”

46. Cognitive Dissonance

We tend to sort inform­a­tion based on our exist­ing cog­nit­ive schem­as. One out­come is that we tend to dis­reg­ard any inform­a­tion that con­flicts with our exist­ing beliefs while quickly absorb­ing any­thing that con­firms our beliefs.

“The Earth is flat, and I haven’t seen any credible evidence to the contrary.”

Learn more: Cognitive Dissonance: Mental Harmony Above All Else

47. The Hostile Media Effect

This can be seen as the media-science equivalent of the psychological backfire effect. Studies have shown that people with strong opinions on a specific issue tend to believe that the media is biased towards their opposition. The effect grows even stronger if the individual believes in a silent majority that is particularly susceptible to erroneous or misleading media coverage.

“I know the media is telling me I’m wrong, but that’s perfectly understandable since their primary objective is to stop me from exposing the truth.”

Learn more: The Hostile Media Effect: How We Demonise the News Media

48. Cherry-Picking (The Fallacy of Incomplete Evidence)

This fallacy is closely related to the Texas sharpshooter fallacy and the fallacy of division. Cherry-picking fuels most of the reasoning behind popular conspiracy theories. In a world where information is abundant and easily accessible, it’s easy for anyone to make a case for almost anything.

“Apollo saved Greece from the dragon Python, and Napoleon saved France from the horrors of revolution (derived from ‘revolvo,’ something that crawls). Therefore, Napoleon is a myth.”

Learn more: Napoleon the Sun God (And Why Most Conspiracies are Bullshit)

49. The Spiral of Silence

Most social anim­als har­bour an instinct­ive fear of isol­a­tion, and in-groups main­tain their cul­tur­al sta­bil­ity par­tially by exclud­ing indi­vidu­als with non-con­form­ing opin­ions or beha­viours. This can cre­ate a cul­ture where group mem­bers self-cen­sor their views and beha­viours by going silent.

“My opinions are perceived as wrong, and it’s better for everyone if I stay silent.”

Learn more: The Spiral of Silence

50. The Yes Ladder

This is a mar­ket­ing exploit where the per­suader aims to get you to say yes to some­thing sub­stan­tial (“big ask”) by meth­od­ic­ally get­ting you to say yes to some­thing smal­ler first (“small ask”).

“I wasn’t going to buy the pink umbrella at first, but then I subscribed to their newsletter, and via the newsletter, I downloaded a free photo book with pink umbrellas — and now I own five pink umbrellas.”

51. Bystander Effect

People are less inclined to offer support or aid when many others are present who could also help.

“Everyone cares deeply about personal safety, so everyone will download our new CSR app to help each other.”

Learn more: Kitty Genovese Murder and the Misreported Bystander Effect

52. Reciprocation Effect

We often feel oblig­ated to recip­roc­ate if someone is friendly or gen­er­ous towards us. While this is a beau­ti­ful and expec­ted part of human beha­viour, spe­cial interests can take advant­age of it.

“I can’t believe the car broke down so fast — the guy I bought it from threw in so many extra features.”

53. Commitment and Consistency

Once we com­mit to some­thing, we invest a part of ourselves in that decision. This makes it harder for many of us to aban­don such com­mit­ments because it would mean giv­ing up on ourselves. This bias is closely related to yes lad­ders, declin­ism, appeal to tra­di­tion, and sunk cost fallacy.

“I’ve made my decision, and therefore I’m sticking with it.”

54. The Fallacy of Social Proof

This fal­lacy is the com­mer­cial exten­sion of the band­wag­on effect; by show­cas­ing social proof, we are com­for­ted by decisions made by oth­ers. Ideally, we should always ensure that reviews and engage­ment dis­plays are rel­ev­ant (and accur­ate) before mak­ing any decisions, but this doesn’t always happen.

“Their product seems to have many happy users, so the risk of getting scammed is low.”

55. Liking and Likeness

“We prefer to say yes to people we know and like,” says Robert Cialdini in Influence: The Psychology of Persuasion.

“He is gorgeous, successful, and speaks in a way that resonates with me, so why shouldn’t I trust every word he says?”

56. The Appeal to Authority

It isn’t easy to distinguish between perceived authority and genuine expertise. Many companies use testimonials from people with impressive titles — and it works. The appeal becomes fallacious when the cited authority’s expertise is irrelevant to the claim at hand.

“Several leading doctors recommended this product, so the ad’s claims must be true.”

57. The Principle of Scarcity (FOMO)

Most of us are scared of missing out (also known as FOMO, fear of missing out). This fear makes us perceive scarce things as more valuable than they are.

“I’m so happy I managed to snag that pink unicorn umbrella before the discount ran out!”

Learn more: The Power of Artificial Scarcity (FOMO)

58. Loss Aversion

The pain of losing can be psychologically twice as powerful as the joy of winning. We often take disproportionate risks to avoid a loss, far greater than the risks we’re willing to take to secure an equivalent gain. This bias is closely related to commitment and consistency and the sunk cost fallacy.

“Our last investment led to a loss of market share, so we must increase our investment to regain it.”
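The roughly two-to-one asymmetry is usually modelled with the prospect theory value function (Kahneman & Tversky). Below is a minimal Python sketch; the parameter values (α = β = 0.88, λ = 2.25) are the commonly cited estimates and are used here purely for illustration.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: gains are diminished by alpha,
    losses are diminished by beta and amplified by the loss-aversion
    coefficient lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# With these parameters, a $100 loss "feels" about 2.25 times
# as intense as a $100 gain of the same size.
gain_feel = prospect_value(100)
loss_feel = prospect_value(-100)
```

With α equal to β, the felt loss-to-gain ratio collapses to λ itself, which is why the rule of thumb that losses loom about twice as large as gains holds across stake sizes in this model.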


Thanks for read­ing. Please sup­port my blog by shar­ing art­icles with oth­er com­mu­nic­a­tions and mar­ket­ing pro­fes­sion­als. You might also con­sider my PR ser­vices or speak­ing engage­ments.

Reference List

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35, 124 – 140.

Cialdini, R. (2006). Influence: The Psychology of Persuasion, Revised Edition. Harper Business: The United States.

Cook, J. & Lewandowsky, S. (2011). The debunk­ing hand­book. St. Lucia, Australia: University of Queensland. 

Dwyer, C.P. (2017). Critical think­ing: Conceptual per­spect­ives and prac­tic­al guidelines. Cambridge, UK:  Cambridge University Press; with a fore­word by former APA President, Dr Diane F. Halpern.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integ­rated crit­ic­al think­ing frame­work for the 21st cen­tury. Thinking Skills & Creativity, 12, 43 – 52.

Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118 – 121.

Kahneman, D. (2011). Thinking fast and slow. Penguin: Great Britain.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognising one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121 – 1134.

Scott, P. J., & Lizieri, C. (2012). Consumer house price judgments: New evidence of anchoring and arbitrary coherence. Journal of Property Research, 29, 49 – 68.

Simon, H. A. (1957). Models of man. New York: Wiley.

Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178 – 181.

Thaler, R. H. (1999). Mental account­ing mat­ters. Journal of Behavioral Decision Making, 12, 183 – 206.

Tversky, A. & Kahneman, D. (1974). Judgment under uncer­tainty: Heuristics and biases. Science, 185, 4157, 1124 – 1131.

West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as meas­ures of crit­ic­al think­ing: Associations with cog­nit­ive abil­ity and think­ing dis­pos­i­tions. Journal of Educational Psychology, 100, 4, 930 – 941.

PR Resource: More Psychology

Mental Models: Be a Better Thinker

Mental mod­els emphas­ise the import­ance of view­ing prob­lems from mul­tiple per­spect­ives, recog­nising per­son­al lim­it­a­tions, and under­stand­ing the often unfore­seen inter­ac­tions between dif­fer­ent factors. 

“You only have to do a few things right in your life so long as you don’t do too many things wrong.”
— Warren Buffett

The writ­ings of Charlie Munger, Vice Chairman of Berkshire Hathaway and long-time col­lab­or­at­or of Warren Buffett and many oth­ers, inspire sev­er­al of the below mod­els.2It’s worth not­ing that these mod­els are not exclus­ively Charlie Munger’s inven­tions but tools he advoc­ates for effect­ive think­ing and decision-mak­ing.

List of Mental Models

Here’s a list of my favour­ite men­tal models: 

The iron pre­scrip­tion (men­tal mod­el). Charlie Munger: “I have what I call an ‘iron pre­scrip­tion’ that helps me keep sane when I nat­ur­ally drift toward pre­fer­ring one ideo­logy over anoth­er. I feel that I’m not entitled to have an opin­ion unless I can state the argu­ments against my pos­i­tion bet­ter than the people who are in oppos­i­tion. I think that I am qual­i­fied to speak only when I’ve reached that state” (Knodell, 2016). 3Knodell, P. A. (2016). All I want to know is where I’m going to die so I’ll nev­er go there: Buffett & Munger – A study in sim­pli­city and uncom­mon, com­mon sense. PAK Publishing.

The Red Queen effect (men­tal mod­el). This meta­phor ori­gin­ates from Lewis Carroll’s Through the Looking-Glass. It describes a situ­ation in which one must con­tinu­ously adapt, evolve, and work to main­tain one’s pos­i­tion. In the story, the Red Queen is a char­ac­ter who explains to Alice that in their world, run­ning as fast as one can is neces­sary just to stay in the same place. The meta­phor is often used in the con­text of busi­nesses that need to innov­ate con­stantly to stay com­pet­it­ive, high­light­ing the relent­less pres­sure to adapt in dynam­ic envir­on­ments where stag­na­tion can mean fall­ing behind. 4Red Queen hypo­thes­is. (2023, November 27). In Wikipedia. https://​en​.wiki​pe​dia​.org/​w​i​k​i​/​R​e​d​_​Q​u​e​e​n​_​h​y​p​o​t​h​e​sis 5Carroll, L. (2006). Through the look­ing-glass, and what Alice found there (R. D. Martin, Ed.). Penguin Classics. (Original work pub­lished 1871.)

Ockham’s razor (mental model). This principle suggests that the simplest explanation is usually correct. The one with the fewest assumptions should be selected when presented with competing hypotheses. It’s a tool for cutting through complexity and focusing on what’s most likely true. 6Ariew, R. (1976). Ockham’s Razor: A historical and philosophical analysis of simplicity in science. Scientific American, 234(3), 88 – 93.

Hanlon’s razor (men­tal mod­el). This think­ing aid advises against attrib­ut­ing to malice what can be adequately explained by incom­pet­ence or mis­take. It reminds us to look for more straight­for­ward explan­a­tions before jump­ing to con­clu­sions about someone’s inten­tions. 7Hanlon, R. J. (1980). Murphy’s Law book two: More reas­ons why things go wrong!. Los Angeles: Price Stern Sloan.

Vaguely right vs pre­cisely wrong (men­tal mod­el). This prin­ciple sug­gests it is bet­ter to be approx­im­ately cor­rect than 100% incor­rect. In many situ­ations, seek­ing pre­ci­sion can lead to errors if the under­ly­ing assump­tions or data are flawed. Sometimes, a rough estim­ate is more valu­able than a pre­cise but poten­tially mis­lead­ing fig­ure. 8Keynes, J. M. (1936). The gen­er­al the­ory of employ­ment, interest, and money. London: Macmillan.

Fat pitch (men­tal mod­el). Borrowed from base­ball, this concept refers to wait­ing patiently for the per­fect oppor­tun­ity — a situ­ation where the chances of suc­cess are excep­tion­ally high. It sug­gests the import­ance of patience and strik­ing when the time is right. 9Kaufman, P. A. (Ed.). (2005). Poor Charlie’s alman­ack: The wit and wis­dom of Charles T. Munger. Virginia Beach, VA: Donning Company Publishers.

Chesterton’s fence (mental model). G.K. Chesterton: “In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, ‘I don’t see the use of this; let us clear it away.’ To which the more intelligent type of reformer will do well to answer: ‘If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.’” (Chesterton, 1929). 10Chesterton, G. K. (1929). “The Drift from Domesticity”. Archived 6 November 2018 at the Wayback Machine In: The Thing. London: Sheed & Ward, p. 35

First-con­clu­sion bias (men­tal mod­el). This is the tend­ency to stick with the first con­clu­sion without con­sid­er­ing altern­at­ive pos­sib­il­it­ies or addi­tion­al inform­a­tion. It’s a cog­nit­ive bias that can impede crit­ic­al think­ing and thor­ough analysis.

First prin­ciples think­ing (men­tal mod­el). This approach involves break­ing down com­plex prob­lems into their most basic ele­ments and then reas­sembling them from the ground up. It’s about get­ting to the fun­da­ment­al truths of a situ­ation and build­ing your under­stand­ing from there rather than rely­ing on assump­tions or con­ven­tion­al wisdom.

The map is not the ter­rit­ory (men­tal mod­el). This mod­el reminds us that rep­res­ent­a­tions of real­ity are not real­ity itself. Maps, mod­els, and descrip­tions are sim­pli­fic­a­tions and can­not cap­ture every aspect of the actu­al ter­rit­ory or situ­ation. It’s a cau­tion against over-rely­ing on mod­els and the­or­ies without con­sid­er­ing the nuances of real-world situ­ations. 11Silfwer, J. (2022, November 3). Walter Lippmann: Public Opinion and Perception Management. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​w​a​l​t​e​r​-​l​i​p​p​m​a​nn/

Bell curve (men­tal mod­el). This curve is a graph­ic­al depic­tion of a nor­mal dis­tri­bu­tion, show­ing how many occur­rences fall near the mean value and few­er occur as you move away from the mean. In decision-mak­ing, it’s used to under­stand and anti­cip­ate vari­ab­il­ity and to recog­nise that while extreme cases exist, most out­comes will cluster around the average.
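The clustering is easy to verify with a quick simulation: for a normal distribution, roughly 68% of outcomes fall within one standard deviation of the mean. A minimal Python sketch, with illustrative sample sizes:

```python
import random

random.seed(1)

# Draw many outcomes from a standard normal distribution.
samples = [random.gauss(0, 1) for _ in range(100_000)]

# Count the share of outcomes within one standard deviation of the mean.
within_one_sd = sum(1 for x in samples if -1 <= x <= 1) / len(samples)
# Expect a value near 0.68: most outcomes cluster around the average,
# while extreme cases exist but are rare.
```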

Compounding (men­tal mod­el). Often used in the con­text of fin­ance, com­pound­ing refers to the pro­cess where the value of an invest­ment increases because the earn­ings on an invest­ment, both cap­it­al gains and interest, earn interest as time passes. This prin­ciple can be applied more broadly to under­stand how small, con­sist­ent efforts can yield sig­ni­fic­ant long-term results.
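The arithmetic is simple but easy to underestimate. A minimal sketch with illustrative numbers: 1% growth per period does not add up to 100% after 100 periods; it compounds to roughly 170% growth.

```python
def compound(principal, rate, periods):
    """Value after `periods` periods of growth at `rate` per period."""
    return principal * (1 + rate) ** periods

# 1% per period for 100 periods: the gains themselves earn gains,
# so the result is about 2.7x rather than the "linear" 2.0x.
final = compound(1.0, 0.01, 100)
```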

Survival of the fit­test (men­tal mod­el). Borrowed from evol­u­tion­ary bio­logy, this men­tal mod­el sug­gests that only those best adap­ted to their envir­on­ment sur­vive and thrive. In a busi­ness con­text, it can refer to com­pan­ies that adapt to chan­ging mar­ket con­di­tions and are more likely to succeed.

Mr. Market (mental model). This metaphor, created by Benjamin Graham, represents the stock market’s mood swings from optimism to pessimism. It’s used to illustrate emotional reactions in the market and the importance of maintaining objectivity.
Source: Graham, B. (2006) 12Graham, B. (2006). The intel­li­gent investor: The defin­it­ive book on value invest­ing (Rev. ed., updated with new com­ment­ary by J. Zweig). Harper Business. (Original work pub­lished 1949.)

Second-order think­ing (men­tal mod­el). This kind of think­ing goes bey­ond the imme­di­ate effects of an action to con­sider the sub­sequent effects. It’s about think­ing ahead and under­stand­ing the longer-term con­sequences of decisions bey­ond just the imme­di­ate results.

Law of dimin­ish­ing returns (men­tal mod­el). This eco­nom­ic prin­ciple states that as invest­ment in a par­tic­u­lar area increases, the rate of profit from that invest­ment, after a cer­tain point, can­not increase pro­por­tion­ally and may even decrease. It’s essen­tial to under­stand when addi­tion­al invest­ment yields pro­gress­ively smal­ler returns. 13Diminishing returns. (2024, November 15). Wikipedia. https://​en​.wiki​pe​dia​.org/​w​i​k​i​/​D​i​m​i​n​i​s​h​i​n​g​_​r​e​t​u​rns

Opportunity cost (mental model). This concept refers to the potential benefits one misses out on when choosing one alternative over another. It’s the cost of the next-best option foregone. Understanding opportunity costs helps make informed decisions by considering what you give up when choosing.

Swiss Army knife approach (mental model). This concept emphasises the importance of having diverse tools (or skills). Being versatile and adaptable in various situations is valuable, like a Swiss Army knife. This model is beneficial for uncertain and volatile situations. There’s also a case to be made for generalists in a specialised world. 14Parsons, M., & Pearson-Freeland, M. (Hosts). (2021, August 8). Charlie Munger: Latticework of mental models (No. 139) [Audio podcast episode]. In Moonshots podcast: Learning out loud. 15Epstein, D. (2019). Range: Why generalists triumph in a specialized world. Riverhead Books.

Acceleration theory (mental model). This concept indicates that the winner needn’t lead the race from start to finish. Mathematically, delaying maximum “speed” by prolonging the slower acceleration phase can get you across the finish line faster. 16Silfwer, J. (2012, October 31). The Acceleration Theory: Use Momentum To Finish First. Doctor Spin | The PR Blog. https://doctorspin.net/acceleration-theory/

Manage Expectations—This concept involves set­ting real­ist­ic expect­a­tions for your­self and oth­ers. It’s about align­ing hopes and pre­dic­tions with what is achiev­able and prob­able, thus redu­cing dis­ap­point­ment and increas­ing sat­is­fac­tion. Effective expect­a­tion man­age­ment can lead to bet­ter per­son­al and pro­fes­sion­al rela­tion­ships and outcomes.

Techlash—This men­tal mod­el acknow­ledges that while tech­no­logy can provide solu­tions, it can cre­ate anti­cip­ated and unanti­cip­ated prob­lems. It’s a remind­er to approach tech­no­lo­gic­al innov­a­tions cau­tiously, con­sid­er­ing poten­tial neg­at­ive impacts along­side the bene­fits. 17Silfwer, J. (2018, December 27). The Techlash: Our Great Confusion. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​t​e​c​h​l​a​sh/

World’s Most Intelligent Question—This men­tal mod­el refers to repeatedly ask­ing “Why?” to delve deep­er into a prob­lem and under­stand its root causes. By con­tinu­ally ask­ing why some­thing hap­pens, one can uncov­er lay­ers of under­stand­ing that might remain hidden.

Regression to the Mean—This stat­ist­ic­al prin­ciple states that extreme events are likely to be fol­lowed by more mod­er­ate ones. Over time, val­ues tend to revert to the aver­age, a concept rel­ev­ant in many areas, from sports per­form­ance to busi­ness metrics.
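A small simulation makes the principle concrete: model every observed score as underlying skill plus luck, then check how the top first-round performers fare in a second round. All names and numbers here are illustrative.

```python
import random

random.seed(42)

def two_noisy_scores(true_skill, noise=1.0):
    """Two independent measurements of the same skill, each with luck added."""
    return (true_skill + random.gauss(0, noise),
            true_skill + random.gauss(0, noise))

# Simulate 10,000 performers with normally distributed skill.
pairs = [two_noisy_scores(random.gauss(0, 1)) for _ in range(10_000)]

# Select the extreme first-round performers (score above 2.0)...
top = [second for first, second in pairs if first > 2.0]

# ...and observe that their second-round average falls back towards
# the population mean of 0, even though their skill is unchanged.
mean_second_round = sum(top) / len(top)
```

The extreme first-round scores were partly luck, and luck does not repeat, which is exactly why follow-ups to record-breaking performances tend to look more moderate.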

False Dichotomy—This logical fallacy occurs when a situation is presented as having only two mutually exclusive and exhaustive options when other possibilities exist. It oversimplifies complex issues into an “either/​or” choice. For instance, saying, “You are either with us or against us,” ignores the possibility of neutral or alternative positions.

Inversion—Inversion involves look­ing at prob­lems back­wards or from the end goal. Instead of think­ing about how to achieve some­thing, you con­sider what would pre­vent it from hap­pen­ing. This can reveal hid­den obstacles and altern­at­ive solutions.

Psychology of Human Misjudgment—This men­tal mod­el refers to under­stand­ing the com­mon biases and errors in human think­ing. By know­ing how cog­nit­ive biases, like con­firm­a­tion bias or the anchor­ing effect, can lead to flawed reas­on­ing, one can make more ration­al and object­ive decisions.

Slow is Smooth, Smooth is Fast—Often used in mil­it­ary and tac­tic­al train­ing, this phrase encap­su­lates the idea that some­times, slow­ing down can lead to faster over­all pro­gress. The prin­ciple is that tak­ing delib­er­ate, con­sidered actions reduces mis­takes and inef­fi­cien­cies, which can lead to faster out­comes in the long run. In prac­tice, it means plan­ning, train­ing, and execut­ing with care, lead­ing to smooth­er, more effi­cient oper­a­tions that achieve object­ives faster than rushed, less thought­ful efforts. 18Silfwer, J. (2020, April 24). Slow is Smooth, Smooth is Fast. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​s​l​o​w​-​i​s​-​s​m​o​o​th/

Because You Are Worth It—This men­tal mod­el focuses on self-worth and invest­ing in one­self. It sug­gests recog­niz­ing and affirm­ing one’s value is cru­cial for per­son­al growth, hap­pi­ness, and suc­cess. This can involve self-care, edu­ca­tion, or simply mak­ing choices that reflect one’s value and potential.

Physics Envy—This term describes the desire to apply the pre­ci­sion and cer­tainty of phys­ics to fields where such exactitude is impossible, like eco­nom­ics or social sci­ences. It’s a cau­tion against over­re­li­ance on quant­it­at­ive meth­ods in areas where qual­it­at­ive aspects play a sig­ni­fic­ant role.

Easy Street Strategy—This prin­ciple sug­gests that sim­pler solu­tions are often bet­ter and more effect­ive than com­plex ones. In decision-mak­ing and prob­lem-solv­ing, seek­ing straight­for­ward, clear-cut solu­tions can often lead to bet­ter out­comes than pur­su­ing overly com­plic­ated strategies. 19Silfwer, J. (2021, January 27). The Easy Street PR Strategy: Keep It Simple To Win. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​e​a​s​y​-​s​t​r​e​e​t​-​p​r​-​s​t​r​a​t​e​gy/

Scale is Key—This concept high­lights how the impact of decisions or actions can vary dra­mat­ic­ally depend­ing on their scale. What works well on a small scale might not be effect­ive or feas­ible on a lar­ger scale, and vice versa. 

Circle of Competence—This concept involves recog­niz­ing and under­stand­ing one’s areas of expert­ise and lim­it­a­tions. The idea is to focus on areas where you have the most know­ledge and exper­i­ence rather than ven­tur­ing into fields where you lack expert­ise, thereby increas­ing the like­li­hood of success.

Fail Fast, Fail Often—By fail­ing fast, you quickly learn what does­n’t work, which helps in refin­ing your approach or pivot­ing to some­thing more prom­ising. Failing often is seen not as a series of set­backs but as a neces­sary part of the pro­cess towards suc­cess. This mind­set encour­ages exper­i­ment­a­tion, risk-tak­ing, and learn­ing from mis­takes, emphas­ising agil­ity and adaptability.

Correlation Does Not Equal Causation—This principle is a critical reminder in data analysis and scientific research. Just because two variables show a correlation (they seem to move together or oppose each other) does not mean one causes the other. Other variables could be at play, or it might be a coincidence. 
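A short simulation shows how a hidden third variable can manufacture a strong correlation between two quantities that never influence each other. The classic illustration, hot weather driving both ice-cream sales and sunburns, is used here; all numbers are illustrative.

```python
import random

random.seed(7)

# A hidden confounder (say, daily temperature) drives both variables;
# neither variable causes the other.
n = 5_000
confounder = [random.gauss(0, 1) for _ in range(n)]
ice_cream = [c + random.gauss(0, 0.5) for c in confounder]
sunburns = [c + random.gauss(0, 0.5) for c in confounder]

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Strong correlation (around 0.8) despite zero causal link.
r = pearson(ice_cream, sunburns)
```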

Critical Mass—This men­tal mod­el emphas­izes the import­ance of reach­ing a cer­tain threshold to trig­ger a sig­ni­fic­ant change, wheth­er user adop­tion, mar­ket pen­et­ra­tion, or social move­ment growth. This mod­el guides stra­tegic decisions, such as resource alloc­a­tion, mar­ket­ing strategies, and tim­ing of ini­ti­at­ives, to effect­ively reach and sur­pass this cru­cial point. 20Silfwer, J. (2019, March 10). Critical Mass: How Many Social Media Followers Do You Need? Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​c​r​i​t​i​c​a​l​-​m​a​s​s​-​f​o​l​l​o​w​e​rs/

Sorites Paradox—Also known as the para­dox of the heap, this para­dox arises from vague pre­dic­ates. It involves a sequence of small changes that don’t seem to make a dif­fer­ence indi­vidu­ally but, when accu­mu­lated, lead to a sig­ni­fic­ant change where the exact point of change is indis­cern­ible. For example, if you keep remov­ing grains of sand from a heap, when does it stop being a heap? Each grain does­n’t seem to make a dif­fer­ence, but even­tu­ally, you’re left with no heap.

The Power of Cycle Times—Mathematically, reducing cycle times in a process that grows exponentially (like content sharing on social networks) drastically increases the growth rate, leading to faster and wider dissemination of the content, thereby driving virality. The combination of exponential growth, network effects, and feedback loops makes cycle time a critical factor. 21Let’s say the number of new social media shares per cycle is a constant multiplier, m. If the cycle time is t and the total time under consideration is T, the number of cycles in this time is T/​t, so the total number of shares grows as m^(T/​t). 22Silfwer, J. (2017, February 6). Viral Loops (or How to Incentivise Social Media Sharing). Doctor Spin | the PR Blog. https://doctorspin.net/viral-loop/
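Following the setup in the footnote (a constant multiplier m per cycle and T/​t cycles in total time T), total reach grows as m^(T/​t). A minimal sketch with illustrative numbers shows why shortening the cycle matters so much:

```python
def total_shares(m, cycle_time, total_time):
    """Total reach after total_time when each cycle multiplies reach by m."""
    return m ** (total_time / cycle_time)

# Same multiplier, same total time; only the cycle time changes.
slow = total_shares(2, cycle_time=4, total_time=24)   # 2**6
fast = total_shares(2, cycle_time=2, total_time=24)   # 2**12

# Halving the cycle time doubles the exponent,
# so reach is squared rather than doubled.
```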

Non-Linearity—This men­tal mod­el recog­nises that out­comes in many situ­ations are not dir­ectly pro­por­tion­al to the inputs or efforts. It sug­gests that effects can be dis­pro­por­tion­ate to their causes, either escal­at­ing rap­idly with small changes or remain­ing stag­nant des­pite sig­ni­fic­ant efforts. Understanding non-lin­ear­ity helps in recog­niz­ing and anti­cip­at­ing com­plex pat­terns in vari­ous phenomena.

Checklists—This men­tal mod­el stresses the import­ance of sys­tem­at­ic approaches to pre­vent mis­takes and over­sights. Using check­lists in com­plex or repet­it­ive tasks ensures that all neces­sary steps are fol­lowed and noth­ing is over­looked, thereby increas­ing effi­ciency and accur­acy. 23Silfwer, J. (2020, September 18). Communicative Leadership in Organisations. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​c​o​m​m​u​n​i​c​a​t​i​v​e​-​l​e​a​d​e​r​s​h​ip/

Lollapalooza—Coined by Munger, this term refers to situ­ations where mul­tiple factors, tend­en­cies, or biases inter­act so that the com­bined effect is much great­er than the sum of indi­vidu­al effects. It’s a remind­er of how vari­ous ele­ments can con­verge to cre­ate sig­ni­fic­ant impacts, often unex­pec­ted or unprecedented.

Limits—This men­tal mod­el acknow­ledges that everything has bound­ar­ies or lim­its, bey­ond which there can be neg­at­ive con­sequences. Recognising and respect­ing per­son­al, pro­fes­sion­al, and phys­ic­al lim­its is essen­tial for sus­tain­able growth and success.

The 7Ws—This men­tal mod­el refers to the prac­tice of ask­ing “Who, What, When, Where, Why” (and some­times “How”) to under­stand a situ­ation or prob­lem fully. By sys­tem­at­ic­ally address­ing these ques­tions, one can com­pre­hens­ively under­stand an issue’s con­text, causes, and poten­tial solu­tions, lead­ing to more informed decision-mak­ing. 24Silfwer, J. (2020, September 18). The Checklist for Communicative Organisations. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​c​h​e​c​k​l​i​s​t​-​f​o​r​-​c​o​m​m​u​n​i​c​a​t​i​v​e​-​l​e​a​d​e​r​s​h​ip/

Chauffeur Knowledge—This men­tal mod­el dis­tin­guishes between hav­ing a sur­face-level under­stand­ing (like a chauf­feur who knows the route) and deep, genu­ine know­ledge (like an expert who under­stands the intric­a­cies of a sub­ject). It warns against the illu­sion of expert­ise based on super­fi­cial know­ledge and emphas­izes the import­ance of true, deep understanding.

Make Friends with Eminent Dead—This men­tal mod­el advoc­ates learn­ing from the past, par­tic­u­larly from sig­ni­fic­ant his­tor­ic­al fig­ures and their writ­ings. Studying the exper­i­ences and thoughts of those who have excelled in their fields can yield valu­able insights and wisdom.

Seizing the Middle—This strategy involves find­ing and main­tain­ing a bal­anced, mod­er­ate pos­i­tion, espe­cially in con­flict or nego­ti­ation. It’s about avoid­ing extremes and find­ing a sus­tain­able, middle-ground solu­tion. Also, centre pos­i­tions often offer the broad­est range of options.

Asymmetric Warfare—This refers to con­flict between parties of unequal strength, where the weak­er party uses uncon­ven­tion­al tac­tics to exploit the vul­ner­ab­il­it­ies of the stronger oppon­ent. It’s often dis­cussed in mil­it­ary and busi­ness contexts.

Boredom Syndrome—This term refers to the human tend­ency to seek stim­u­la­tion or change when things become routine or mono­ton­ous, which can lead to unne­ces­sary changes or risks. Sometimes, tak­ing no action is bet­ter than tak­ing action, but remain­ing idle is some­times difficult.

Survivorship Bias—This cog­nit­ive bias involves focus­ing on people or things that have “sur­vived” some pro­cess and inad­vert­ently over­look­ing those that did not due to their lack of vis­ib­il­ity. This can lead to false con­clu­sions because it ignores the exper­i­ences of those who did not make it through the pro­cess. 25Silfwer, J. (2019, October 17). Survivorship Bias — Correlation Does Not Equal Causation. Doctor Spin | The PR Blog. https://​doc​tor​spin​.net/​s​u​r​v​i​v​o​r​s​h​i​p​-​b​i​as/

Each men­tal mod­el offers a lens for view­ing prob­lems, mak­ing decisions, and strategising, reflect­ing the com­plex­ity and diversity of thought required in vari­ous fields and situations.

Numerous oth­er men­tal mod­els are also used in vari­ous fields, such as eco­nom­ics, psy­cho­logy, and sys­tems thinking.

Learn more: Mental Models: How To Be a Better Thinker

PR Resource: How To Create Knowledge

How to cre­ate knowledge.

How To Create Knowledge

“If you can’t explain it simply, you don’t understand it well enough.”
— Albert Einstein

This list of how to cre­ate know­ledge presents aspects of reas­on­ing, meth­od­o­lo­gic­al approaches, data ana­lys­is per­spect­ives, and philo­soph­ic­al frame­works. It explains how know­ledge can be approached, ana­lysed, and interpreted.

Types of Reasoning and Logical Processes

  • Inductive reasoning. Generalising from specific observations to broader conclusions.
  • Deductive reas­on­ing. Starting with a gen­er­al state­ment or hypo­thes­is and reach­ing a spe­cif­ic conclusion.
  • Abductive reas­on­ing. Starting with an obser­va­tion and seek­ing the simplest and most likely explanation.
  • Probabilistic reas­on­ing. Making pre­dic­tions based on prob­ab­il­it­ies in uncer­tain situations.

Methodological Approaches

  • Empirical vs logic­al. Empirical—Deriving know­ledge from obser­va­tion or exper­i­ment­a­tion. Logical—Using struc­tured reas­on­ing and val­id argu­ments inde­pend­ent of empir­ic­al evidence.
  • Heuristic vs algorithmic. Heuristic—Applying prac­tic­al meth­ods or “rules of thumb” for imme­di­ate solu­tions. Algorithmic—Using sys­tem­at­ic pro­ced­ures for defin­it­ive, often optim­al solutions.

Data and Analysis Perspectives

  • Analytical vs syn­thet­ic. Analytical—Breaking down com­plex prob­lems into smal­ler com­pon­ents. Synthetic—Combining ele­ments to form a coher­ent whole.
  • Qualitative vs quant­it­at­ive. Qualitative—Focusing on non-stat­ist­ic­al aspects and qual­it­ies. Quantitative—Involving numer­ic­al data col­lec­tion and analysis.

Philosophical and Theoretical Frameworks

  • Rationalism vs empir­i­cism. Rationalism—Emphasising reas­on as the primary source of know­ledge. Empiricism—Stressing the import­ance of sens­ory exper­i­ence and evidence.
  • Positivism. Asserting that sci­entif­ic know­ledge is the true form of knowledge.
  • Hermeneutics. Focusing on the inter­pret­a­tion of texts, lan­guage, and symbols.
  • Phenomenology. Concentrating on the study of con­scious­ness and dir­ect experience.
  • Pragmatism. Considering prac­tic­al con­sequences as vital in mean­ing and truth.
  • Constructivism. Suggesting that know­ledge is con­struc­ted from exper­i­ences and ideas.
  • Deconstruction. Analysing philo­soph­ic­al and lit­er­ary lan­guage to uncov­er impli­cit assumptions.

Learn more: How To Create Knowledge

Logo - Spin Academy - Online PR Courses

PR Resource: Types of Intelligences

Howard Gardner: 10 Intelligence Types

“Gardner’s the­ory of mul­tiple intel­li­gences has revo­lu­tion­ized edu­ca­tion, chal­len­ging the notion of a single, fixed intel­li­gence and pro­mot­ing a more diverse approach to teach­ing and learn­ing.”
Source: Checkley, K. (1997). The First Seven…and the Eighth: A Conversation with Howard Gardner. Educational Leadership, 55, 8–13.

Howard Gardner’s the­ory of mul­tiple intel­li­gences expands the tra­di­tion­al view of intel­li­gence bey­ond logic­al and lin­guist­ic cap­ab­il­it­ies. [27]

Here’s a descrip­tion of each type of intel­li­gence as out­lined in his theory:

  • Linguistic intel­li­gence. This intel­li­gence involves effect­ively using words and lan­guage. It includes skills in read­ing, writ­ing, speak­ing, and com­mu­nic­a­tion. People with high lin­guist­ic intel­li­gence are typ­ic­ally good at telling stor­ies, mem­or­iz­ing words, and reading.
  • Logical-math­em­at­ic­al intel­li­gence. This form of intel­li­gence is about the capa­city to ana­lyze prob­lems logic­ally, carry out math­em­at­ic­al oper­a­tions, and invest­ig­ate issues sci­en­tific­ally. It involves strong reas­on­ing skills, pat­tern recog­ni­tion, and abstract thinking.
  • Musical intel­li­gence. This intel­li­gence rep­res­ents skill in per­form­ing, com­pos­ing, and appre­ci­at­ing music­al pat­terns. It encom­passes recog­nising and com­pos­ing music­al pitches, tones, and rhythms.
  • Spatial intel­li­gence. Spatial intel­li­gence involves the poten­tial to recog­nize and manip­u­late the pat­terns of wide spaces (like nav­ig­at­ors and pilots) and more con­fined areas (like chess play­ers and sur­geons). It includes skills like visu­al­iz­ing objects, cre­at­ing men­tal images, and think­ing in three dimensions.
  • Bodily-kin­es­thet­ic intel­li­gence. This type refers to using one’s whole body or parts of the body (like the hands or the mouth) to solve prob­lems or cre­ate products. It’s the abil­ity to manip­u­late objects and use vari­ous phys­ic­al skills. This intel­li­gence also involves a sense of tim­ing and the per­fec­tion of skills through hand-eye coordination.
  • Interpersonal intel­li­gence. This is about under­stand­ing and inter­act­ing effect­ively with oth­ers. It involves effect­ive verbal and non­verbal com­mu­nic­a­tion, the abil­ity to note dis­tinc­tions among oth­ers, sens­it­iv­ity to the moods and tem­pera­ments of oth­ers, and the abil­ity to enter­tain mul­tiple per­spect­ives. [28]
  • Intrapersonal intel­li­gence. Intrapersonal intel­li­gence is the capa­city to under­stand one­self and to appre­ci­ate one’s feel­ings, fears, and motiv­a­tions. This intel­li­gence involves hav­ing a deep under­stand­ing of the self, what one’s strengths/​weaknesses are, what makes one unique, and being able to pre­dict one’s own reactions/​emotions.
  • Naturalist intel­li­gence. This intel­li­gence refers to identi­fy­ing and clas­si­fy­ing nat­ur­al pat­terns. It involves under­stand­ing liv­ing creatures and bot­any and the abil­ity to observe nat­ur­al phenomena.
  • Teaching intel­li­gence. This form of intel­li­gence is evid­ent when indi­vidu­als, includ­ing very young chil­dren, suc­cess­fully teach oth­ers. It involves break­ing down com­plex con­cepts into sim­pler, teach­able parts and under­stand­ing how dif­fer­ent people learn.
  • Existential intel­li­gence. This type refers to the abil­ity to use intu­ition, thought, and meta-cog­ni­tion to ask (and con­sider) deep ques­tions about human exist­ence, such as the mean­ing of life, why we die, and how we got here.

Each intel­li­gence type rep­res­ents a dif­fer­ent way of pro­cessing inform­a­tion, and the the­ory sug­gests every­one has a unique blend of these intel­li­gences. [29]

Learn more: 10 Intelligence Types: Howard Gardner’s Theory

ANNOTATIONS
1 Please note that the Dunning-Kruger effect is under sci­entif­ic scru­tiny and lacks broad sup­port from the sci­entif­ic community.
2 It’s worth not­ing that these mod­els are not exclus­ively Charlie Munger’s inven­tions but tools he advoc­ates for effect­ive think­ing and decision-making.
3 Knodell, P. A. (2016). All I want to know is where I’m going to die so I’ll nev­er go there: Buffett & Munger – A study in sim­pli­city and uncom­mon, com­mon sense. PAK Publishing.
4 Red Queen hypo­thes­is. (2023, November 27). In Wikipedia. https://en.wikipedia.org/wiki/Red_Queen_hypothesis
5 Carroll, L. (2006). Through the look­ing-glass, and what Alice found there (R. D. Martin, Ed.). Penguin Classics. (Original work pub­lished 1871.)
6 Ariew, R. (1976). Ockham’s Razor: A his­tor­ic­al and philo­soph­ic­al ana­lys­is of sim­pli­city in sci­ence. Scientific American, 234(3), 88–93.
7 Hanlon, R. J. (1980). Murphy’s Law book two: More reas­ons why things go wrong! Los Angeles: Price Stern Sloan.
8 Keynes, J. M. (1936). The gen­er­al the­ory of employ­ment, interest, and money. London: Macmillan.
9 Kaufman, P. A. (Ed.). (2005). Poor Charlie’s alman­ack: The wit and wis­dom of Charles T. Munger. Virginia Beach, VA: Donning Company Publishers.
10 Chesterton, G. K. (1929). “The Drift from Domesticity.” In: The Thing. London: Sheed & Ward, p. 35.
11 Silfwer, J. (2022, November 3). Walter Lippmann: Public Opinion and Perception Management. Doctor Spin | The PR Blog. https://doctorspin.net/walter-lippmann/
12 Graham, B. (2006). The intel­li­gent investor: The defin­it­ive book on value invest­ing (Rev. ed., updated with new com­ment­ary by J. Zweig). Harper Business. (Original work pub­lished 1949.)
13 Diminishing returns. (2024, November 15). In Wikipedia. https://en.wikipedia.org/wiki/Diminishing_returns
14 Parsons, M., & Pearson-Freeland, M. (Hosts). (2021, August 8). Charlie Munger: Latticework of men­tal mod­els (No. 139) [Audio pod­cast epis­ode]. In Moonshots pod­cast: Learning out loud. Moonshots. https://www.moonshots.io/episode-139-charlie-munger-latticework-of-mental-models
15 Epstein, D. (2019). Range: Why gen­er­al­ists tri­umph in a spe­cial­ized world. Riverhead Books.
16 Silfwer, J. (2012, October 31). The Acceleration Theory: Use Momentum To Finish First. Doctor Spin | The PR Blog. https://doctorspin.net/acceleration-theory/
17 Silfwer, J. (2018, December 27). The Techlash: Our Great Confusion. Doctor Spin | The PR Blog. https://doctorspin.net/techlash/
18 Silfwer, J. (2020, April 24). Slow is Smooth, Smooth is Fast. Doctor Spin | The PR Blog. https://doctorspin.net/slow-is-smooth/
19 Silfwer, J. (2021, January 27). The Easy Street PR Strategy: Keep It Simple To Win. Doctor Spin | The PR Blog. https://doctorspin.net/easy-street-pr-strategy/
20 Silfwer, J. (2019, March 10). Critical Mass: How Many Social Media Followers Do You Need? Doctor Spin | The PR Blog. https://doctorspin.net/critical-mass-followers/
21 Let’s say the num­ber of new social media shares per cycle is a con­stant mul­ti­pli­er, m. If the cycle time is t and the total time under con­sid­er­a­tion is T, the num­ber of cycles in this time is T/t. The total reach after time T can be approx­im­ated by m^(T/t), assum­ing one ini­tial share. When t decreases, T/t increases, mean­ing more cycles occur in the same total time, T. This leads to a high­er power of m in the expres­sion m^(T/t), which means a more extens­ive reach.
22 Silfwer, J. (2017, February 6). Viral Loops (or How to Incentivise Social Media Sharing). Doctor Spin | The PR Blog. https://doctorspin.net/viral-loop/
23 Silfwer, J. (2020, September 18). Communicative Leadership in Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/communicative-leadership/
24 Silfwer, J. (2020, September 18). The Checklist for Communicative Organisations. Doctor Spin | The PR Blog. https://doctorspin.net/checklist-for-communicative-leadership/
25 Silfwer, J. (2019, October 17). Survivorship Bias — Correlation Does Not Equal Causation. Doctor Spin | The PR Blog. https://doctorspin.net/survivorship-bias/
26 Checkley, K. (1997). The First Seven…and the Eighth: A Conversation with Howard Gardner. Educational Leadership, 55, 8–13.
27 Theory of mul­tiple intel­li­gences. (2023, November 28). In Wikipedia. https://en.wikipedia.org/wiki/Theory_of_multiple_intelligences
28 See also: Silfwer, J. (2023, April 25). Theory of Mind: A Superpower for PR Professionals. Doctor Spin | The PR Blog. https://doctorspin.net/theory-of-mind-a-superpower-for-pr-professionals/
29 Gardner, H. (1983). Frames of Mind: The Theory of Multiple Intelligences. Basic Books.
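The sharing-cycle model in annotation 21 (reach ≈ m^(T/t), with multiplier m per cycle, cycle time t, and total time T) can be sketched numerically. The function and numbers below are illustrative assumptions, not data from this article.

```python
# Numerical sketch of the sharing-cycle growth model: with multiplier m per
# cycle and cycle time t, reach after total time T is roughly m ** (T / t).

def total_reach(m: float, cycle_time: float, total_time: float) -> float:
    """Approximate reach after total_time, starting from one initial share."""
    return m ** (total_time / cycle_time)

# Same multiplier, same total time: halving the cycle time squares the reach.
print(total_reach(m=2, cycle_time=24, total_time=96))  # 2**4 = 16
print(total_reach(m=2, cycle_time=12, total_time=96))  # 2**8 = 256
```

This is why cycle time matters so much in viral mechanics: shortening it adds exponents, not increments.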
Jerry Silfwer (https://doctorspin.net/)
Jerry Silfwer, alias Doctor Spin, is an awarded senior adviser specialising in public relations and digital strategy. Currently CEO at Spin Factory and KIX Communication Index. Before that, he worked at Kaufmann, Whispr Group, Springtime PR, and Spotlight PR. Based in Stockholm, Sweden.

The Cover Photo

The cover photo isn't related to public relations, obviously; it's just a photo of mine. Think of it as a 'decorative diversion', a subtle reminder that it's good to have hobbies outside work.


