
Social Media Algorithms and How They Rule Our Lives

A social media algorithm is not your friend—and it must be managed.

Cover photo: @jerrysilfwer

We all know how social media algorithms work, right?

Most people think they know how social media algorithms work, but they don't. This article sheds some light on how algorithms rely on iterative testing rather than sophisticated technology.

As a digital strategist, I've closely monitored social media algorithms for two decades. I've learnt that algorithms aren't your friends, and that we must manage them.

Let’s dive right in:

We Need Sorting and Labelling

You interact with social media, and the platform owner collects your user data to serve you more content, keep you engaged, and thus increase your exposure to third-party advertising.

“So what's the Original Sin of the Internet? Nearly all business models it supports require spying on consumers and monetising them.”
— Bob Sullivan

Order is necessary, of course, since there's a lot of content to structure:

In a wired world of online abundance, gatekeeping is critical.

Unfortunately, the actual inner workings of a social media algorithm have a much darker side. And yes, “darkness” is a reasonable analogy, because these algorithms are kept secret for many reasons.

And behind these curtains of secrecy, we don't find myriads of complex computing layers but manufactured filters designed by real people with personal agendas.

We are constantly being shaped, nudged, and presented with suggestions.

Social Algorithms are Not “Personal”

As users, we have a general idea of how algorithms work, but only a handful of people know the details. To prevent industrial espionage, we can safely assume that most social networks make sure that no single developer has full access to an algorithm in its entirety.

And even if you're a Facebook programmer, how would you know exactly how Google's algorithm works?

One might assume you have a personal Facebook algorithm stored on a server somewhere: an algorithm that tracks you personally, learns about you and your behaviour, and understands you better the more it knows. But this is not exactly how it works, for a good reason.

Humans are notoriously bad at consciously knowing themselves and understanding others. And our thinking is riddled with unconscious biases.

We're all biased (perception quote by Anaïs Nin).

However complex, applying various types of machine learning to learn about users and their interactions at the individual level would be both slow and expensive. Few social media users would be patient enough to endure such a lengthy process of trial and error.

Anyone familiar with data mining will know that more advanced techniques, like sentiment analysis from social media monitoring, require large data sets. Hence the term “big data.” 1

A social media algorithm gets its immense power primarily from harvesting data from large numbers of users, simultaneously and over time, not from creating billions of self-contained algorithms.

Put another way: algorithms are figuring out humanity at scale, not your behaviour.
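The contrast above can be sketched in a few lines of Python. This is purely illustrative: the interaction log, user IDs, and topics are invented, and no platform works exactly like this. The point is that pooling everyone's interactions sharpens one shared estimate, while per-user estimates stay sparse and noisy.

```python
from collections import defaultdict

# Hypothetical interaction log: (user_id, content_topic, engaged?)
interactions = [
    ("u1", "cats", True), ("u2", "cats", True), ("u3", "cats", False),
    ("u1", "news", False), ("u2", "news", True), ("u4", "cats", True),
]

def per_user_rates(log):
    """Per-user engagement rates: most users have only a few data points."""
    stats = defaultdict(lambda: [0, 0])  # user -> [engaged, total]
    for user, _topic, engaged in log:
        stats[user][1] += 1
        stats[user][0] += int(engaged)
    return {u: e / t for u, (e, t) in stats.items()}

def pooled_rates(log):
    """Pooled per-topic rates: every user's data sharpens one estimate."""
    stats = defaultdict(lambda: [0, 0])  # topic -> [engaged, total]
    for _user, topic, engaged in log:
        stats[topic][1] += 1
        stats[topic][0] += int(engaged)
    return {t: e / n for t, (e, n) in stats.items()}

print(per_user_rates(interactions))  # each user: only one or two samples
print(pooled_rates(interactions))    # "cats": 3/4 engaged, "news": 1/2
```

With millions of users, the pooled estimates become precise long before any single user has produced enough data to model individually, which is the economic logic behind learning "humanity at scale."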

Our Lack of Consistency

No one disputes that you are an individual. But is your behaviour consistent from interaction to interaction? Probably not. How you act and react is likely much more contextual and situational, at least relative to your perceived uniqueness and self-identification.

Social networks limit your options to tweak “your algorithm” yourself. I would be all over the possibility of adjusting Facebook's newsfeed or Google's search results in fine detail using various boolean rulesets, but that would probably only teach the master social algorithm a bit more about human pretensions, and little else.

Accurate personal algorithms that follow us from service to service could outperform all other algorithms. I would probably ask my algorithm to show me only serious articles written in peer-reviewed publications or published by well-educated authors with proven track records. But if the algorithm didn't take it upon itself to show me funny cat memes or weird YouTube clips now and then, I would probably get bored quite quickly.

When we understand that social media algorithms aren't trying to figure you out, it follows to ask how well they're doing at figuring out humanity.

Why Social Media Algorithms Aren’t Better

Social media algorithms are progressing, painstakingly. Understanding human behaviour at the macro level is not a task to be underestimated.

Google struggles to show relevant search results, and it still isn't uncommon for users to search quite a bit before finding the information they seek.

Facebook struggles with users complaining about what they're being shown in their newsfeeds.

Spotify struggles to suggest new music and often misses the mark by a mile.

LinkedIn struggles to be business-relevant while at the same time personally engaging (i.e. not boring).

Instagram struggles not to make people feel bad about themselves.

Pinterest struggles with interpreting personal visual taste and intent.

Netflix struggles with suggesting what to watch (“Why on Earth would I want to see Jumanji 2?”).

Amazon struggles with telling us what to buy (“Please stop, I regret clicking on those purple bath towels by mistake a year ago!”).

The practical engineering approach to these struggles is straightforward:

Take the guesswork out of the equation.

How Social Networks Have Evolved

The Silent Switch

Not too long ago, social media algorithms would deliver organic reach like this:

Social media algorithms before the silent switch (click to enlarge).

Today, social media algorithms deliver organic reach more like this:

Social media algorithms after the silent switch (click to enlarge).

It's the silent switch: social networks have demoted the publisher's authority and reputation and promoted the performance of individual pieces of content instead.

This algorithmic change has likely had profound and severe media implications for society, such as trivialization, sensationalization, and polarization.

Our job as PR professionals is to help organisations navigate the media landscape and to communicate more efficiently, especially in times of change.

Read also: The Silent Switch

Virality Through Real-Time Testing

A dominating feature of today's social media algorithms is real-time testing.

If you publish anything, the algorithm will use its data to test your content on a small statistical subset of users. If their reactions are favourable, the algorithm will show your published content to a slightly larger subset, and then test again. And so on.

Suppose your published content has viral potential, and your track record as a publisher has granted you enough platform authority to surpass critical mass. In that case, your content will spread like rings on water throughout more extensive subsets of users.
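The test-and-expand loop described above can be sketched as a simple simulation. This is not any platform's actual code; the seed size, engagement threshold, and growth factor are invented parameters, and real engagement is modelled here as a coin flip.

```python
import random

def iterative_rollout(engage_prob, audience=1_000_000, seed_size=100,
                      threshold=0.10, growth=4, seed=42):
    """Show content to a small subset; expand the audience only while
    the measured engagement rate clears the bar, then test again."""
    rng = random.Random(seed)  # deterministic for this sketch
    reached, subset = 0, seed_size
    while reached < audience:
        shown = min(subset, audience - reached)
        engaged = sum(rng.random() < engage_prob for _ in range(shown))
        reached += shown
        if engaged / shown < threshold:
            break                # failed the test: stop spreading
        subset *= growth         # passed: test a larger subset next round
    return reached

# Strong content keeps passing and reaches the full audience;
# weak content stalls at the seed subset.
print(iterative_rollout(engage_prob=0.30))
print(iterative_rollout(engage_prob=0.01))
```

Note how cheap this is for the platform: weak content costs only one small test, while strong content earns its reach one verified round at a time.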

How the Instagram algorithm works (via Marketing Tools).

This testing algorithm isn't as mathematically complex as one might think from a programming standpoint. The most robust approach to increasing virality is reducing cycle times: the faster each test-and-expand iteration completes, the faster content can spread.

YouTube's algorithm arguably does well in cycle times, but the real star of online virality is the Chinese platform TikTok.

How algorithms iterate to maximise engagement and content quality.

Read also: What You Need To Know About TikTok’s Algorithm

And this is where it gets pitch black. The complexity and gatekeeping prowess of today's social media algorithms don't primarily stem from their creative use of big data and high-end artificial intelligence; they stem from the blunt use of synthetic filters.

Social media algorithms could be much more sophisticated, using machine learning, natural language processing, and artificial intelligence combined with neural network models of human psychology, especially if we allowed these protocols to follow us individually across services and to lie to us just a bit.

But no. Instead, the social media algorithms of today are surprisingly straightforward and based on real-time iterative testing. Today, virality is controlled mainly via the use of added filters.

Underestimating the Effects of Filters

The algorithmic complexity is primarily derived from humans manually adding filters to the algorithms in their control. These filters are tested on smaller subsets before being rolled out at larger scales.

Most of us have heard creators on Instagram, TikTok, and sometimes YouTube complain about being “shadowbanned” when their reach suddenly dwindles from one day to the next, for no apparent reason. Sometimes this might be due to changes to the master algorithm, but most creators are probably affected by newly added filters.

Make no mistake about it: filters are powerful. No matter how well a piece of content would negotiate the master algorithm, if it gets stuck in a filter, it's going nowhere. And these filters aren't the output of some ultra-smart algorithm; humans with corporate or ideological agendas add them.
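A minimal sketch can show why a filter always beats the ranking algorithm. The filter rules and post fields below are invented for illustration; real platforms' criteria are secret, which is rather the point.

```python
# Hypothetical filter layer applied before ranking ever runs.
# The rules and post fields are invented for illustration only.
FILTERS = [
    lambda post: "blocked_phrase" in post["text"],  # keyword blocklist
    lambda post: post["author_flagged"],            # author-level flag
]

def distribute(post, rank_score):
    """A post that trips any filter gets zero reach, no matter how
    well it would have scored against the master ranking algorithm."""
    if any(f(post) for f in FILTERS):
        return 0.0        # effectively shadowbanned
    return rank_score     # otherwise, ranking decides reach

flagged = {"text": "blocked_phrase here", "author_flagged": False}
clean = {"text": "great content", "author_flagged": False}
print(distribute(flagged, rank_score=0.99))  # 0.0: the filter wins
print(distribute(clean, rank_score=0.42))    # 0.42: ranking decides
```

Because the filter runs first and returns unconditionally, even a near-perfect ranking score is irrelevant once a rule matches, which is exactly what a shadowbanned creator experiences.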

“There is no information overload, only filter failure.”
— Clay Shirky

TikTok serves as one of the darkest examples of algorithmic abuse.

Leaked internal documents revealed how TikTok added filters to limit content by people classified as non-attractive or poor.

And yes, this is where the darkness comes into full effect: when human agendas get added into the algorithmic mix.

“One document goes so far as to instruct moderators to scan uploads for cracked walls and ‘disreputable decorations’ in users’ own homes — then to effectively punish these poorer TikTok users by artificially narrowing their audiences.”
Source: Invisible Censorship — TikTok Told Moderators to Suppress Posts by “Ugly” People and the Poor

The grim irony here is that adding filters is relatively straightforward from a programming perspective.

We often consider algorithms advanced black boxes that operate almost above human comprehension. But with reasonably exact algorithms, it is the artificial filters we need to watch out for and consider.

The Power of Perception Management

“Since we cannot change reality, let us change the eyes which see reality.”
— Nikos Kazantzakis

What would happen if Google and Facebook filtered away a specific day? Everything that refers to that day wouldn't pass any iterative tests anymore. Any content from that day would be shadowbanned. And search engine results pages would deflect anything related to that particular day.

To paraphrase a popular TikTok meme: “How would you know?”

No one makes decisions based on actual reality; we all make decisions based on our limited understanding of that reality. Hence, if you control some parts of that reality, you indirectly control what people do, say, or even think. 2

Walter Lippmann: Public Opinion and Perception Management

No one is basing their attitudes and behaviours on reality; we're basing them on our perceptions of reality.

Walter Lippmann (1889–1974) proposed that our perceptions of reality differ from the actual reality. Reality is too vast and too complex for anyone to process. 3

  • One who effectively manages the perceptions of publics acts as a moral legislator, capable of shaping attitudes and behaviours according to the categorical imperative.

The research on perception management focuses on how organisations can create a desired reputation:

“The OPM [Organizational Perception Management] field focuses on the range of activities that help organisations establish and/or maintain a desired reputation (Staw et al., 1983). More specifically, OPM research has primarily focused on two interrelated factors: (1) the timing and goals of perception management activities and (2) specific perception management tactics (Elsbach, 2006).”
Source: Hargis & Watt (2010) 4

Today, our perceptions are heavily influenced by news media and influencers, algorithms, and social graphs. Therefore, perception management is more critical than ever before.

“We are all captives of the picture in our head — our belief that the world we have experienced is the world that really exists.”
— Walter Lippmann

Learn more: Walter Lippmann: Public Opinion and Perception Management

This is the truth about how social media algorithms control our lives:

Social media algorithms and filters influence our perceptions of reality and, by extension, our attitudes and behaviours.

The First Rule of Social Media Algorithms

Social networks don't want us talking and asking questions about their algorithms, despite these being at the core of their businesses. Because 1) they need to keep them secret, 2) the algorithms are blunter than we might think, 3) their complexity is manifested primarily by artificial filters, and 4) they don't want to direct our attention to how much gatekeeping power they wield.

And neither journalists nor legislators are exactly working hard to expose these apparent democratic weaknesses: journalists because they want their lost gatekeeping power back, and legislators because they see ideological opportunities to gain control over these filters.

Any PR professional knows that the news media has an agenda. And that we must manage that agenda. Otherwise, it might spin out of control.

A social media algorithm can be successfully negotiated and sometimes made to work for you or your organisation. But an algorithm with its filters will never be your friend.

Social networks are “good” in the same way the news media is “objective” or politicians are “altruistic”. As public relations professionals, we should act accordingly and manage the social media algorithms, just as we manage journalists and legislators.

Read also: Social Media: The Good, The Bad, The Ugly

Please support my blog by sharing it with other PR and communication professionals. For questions or PR support, contact me via jerry@spinfactory.com.

1 Batrinca, B. & Treleaven, P. C. (2015). Social media analytics: a survey of techniques, tools and platforms. AI & Society, 89–116.
2, 3 Lippmann, W. (1960). Public Opinion (1922). New York: Macmillan.
4 Hargis, M. & Watt, J. (2010). Organizational perception management: A framework to overcome crisis events. Organization Development Journal, 28, 73–87.
Jerry Silfwer
Jerry Silfwer, alias Doctor Spin, is an awarded senior adviser specialising in public relations and digital strategy. Currently CEO at KIX Index and Spin Factory. Before that, he worked at Kaufmann, Whispr Group, Springtime PR, and Spotlight PR. Based in Stockholm, Sweden.
