The Growing Threat of Disinformation in the Digital Age
Disinformation—deliberate falsehoods spread online—has become a significant threat to democracies worldwide, exploiting an information ecosystem that “prioritises what is popular over what is true” to cause widespread harm.
The fundamental issue lies in how political discourse unfolds on platforms designed for viral advertising, according to Renée DiResta, research manager at Stanford University’s Internet Observatory. “Social media algorithms will show you what you want to see, but they don’t have any kind of value judgment, and this is where we see things like radicalisation becoming an increasing problem because the recommendation engine does not actually understand what it is suggesting.”
While foreign influence operations from countries like Russia and increasingly China receive substantial attention, domestic disinformation presents an equally serious challenge. The 2020 edition of the Oxford Internet Institute’s Global Inventory of Organised Social Media Manipulation identified 77 countries where government or political actors deployed disinformation on social media to manipulate public opinion. Most of these campaigns operate domestically, researchers told Africa Check.
Among these countries are 12 African nations, including South Africa, where government agencies, politicians, parties, private contractors, citizens, and influencers engage in social media manipulation. Tactics range from attacking opposition parties to silencing critics through coordinated trolling campaigns.
Notable examples include the African National Congress’s pre-election “boiler room,” which reportedly mobilized party activists to target opposition leaders on Twitter, and the Economic Freedom Fighters’ “troll army” that focused on journalists. According to Amelie Henle, research assistant at the Oxford Internet Institute’s Computational Propaganda Research Project, the Gupta-backed influence operation involving the now-defunct British PR firm Bell Pottinger represents the largest disinformation campaign identified in South Africa in recent years.
Such campaigns ultimately undermine democracy itself. University of Washington associate professor Kate Starbird explains: “While a single disinformation campaign may have a specific objective—for instance, changing public opinion about a political candidate or policy—pervasive disinformation works at a more profound level to undermine democratic societies.”
Camille François, chief innovation officer at network analysis company Graphika, breaks down online disinformation into three interconnected “vectors”: manipulative actors, deceptive behavior, and harmful content.
Manipulative actors typically hide their identities and intentions, using sock-puppet accounts with false identities and trolls who intimidate those questioning the veracity of information. These actors often have diverse motives, including financial gain, tribal online affiliation, or influencing public opinion.
The financial motive was evident when a South African municipal employee posed as a racist white woman on Twitter in 2020 to drive traffic to his websites. Political influence was the clear objective of the Radical Economic Transformation (RET) disinformation campaign on South African Twitter in 2016-2017, which sought to undermine economic institutions and divert attention from state capture accusations.
Analysis of over 200,000 RET network tweets revealed that 98% were retweets, demonstrating how fake accounts amplified messages “to give the illusion that the content they are sharing resonates with a wider group.” The RET community remains active on South African Twitter, having merged with Economic Freedom Fighters supporters and displaced “the mainstream media from the centre of the conversation,” according to data analysis from the Superlinear blog.
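The retweet-share figure cited above comes down to simple counting. A minimal sketch of that calculation, assuming tweets are plain text strings and treating the classic “RT @user:” prefix as the retweet marker (a simplification; real analyses would use the platform API’s retweet metadata instead):

```python
import re

def retweet_share(tweets):
    """Return the fraction of tweets that are retweets.

    A tweet counts as a retweet if its text starts with the
    "RT @user:" prefix -- a rough heuristic, not the API's
    retweet field. Returns 0.0 for an empty list.
    """
    if not tweets:
        return 0.0
    rts = sum(1 for t in tweets if re.match(r"RT @\w+:", t))
    return rts / len(tweets)
```

Applied to a corpus like the RET tweets, a share near 0.98 would indicate a network built almost entirely on amplification rather than original content.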
Deceptive behaviors employed by disinformation actors include coordinated inauthentic activity to artificially inflate the perceived support for particular messages. In January 2021, both Twitter and Facebook removed accounts linked to Uganda’s ICT ministry for such behavior ahead of the country’s election. Tactics included using “fake and duplicate accounts to manage pages, comment on other people’s content, impersonate users and re-share posts in groups to make them appear more popular than they were.”
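One crude signal of the copy-paste amplification described above is many distinct accounts posting identical text. A minimal sketch, assuming posts arrive as (account, text) pairs; the function name and threshold are illustrative, and real coordination detection would also weigh timing, follower overlap, and account age:

```python
from collections import defaultdict

def coordinated_clusters(posts, min_accounts=3):
    """Group accounts that share identical post text.

    `posts` is a list of (account_id, text) pairs. Returns a dict
    mapping each normalised text that at least `min_accounts`
    distinct accounts posted verbatim to the set of those accounts.
    This flags only the most naive copy-paste behaviour.
    """
    by_text = defaultdict(set)
    for account, text in posts:
        by_text[text.strip().lower()].add(account)
    return {text: accounts for text, accounts in by_text.items()
            if len(accounts) >= min_accounts}
```

Platform investigators combine signals like this with network structure before attributing behaviour to a coordinated operation.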
The Oxford Internet Institute identified 58 countries where “cyber troops”—government or political party actors—deployed bots to manipulate public opinion, while human-run fake accounts were used in 81 countries. South African cyber troops utilized bots, fake human-run accounts, and hacked or stolen profiles.
Disinformation campaigns typically span multiple platforms, often beginning in anonymous online spaces before moving through closed groups like WhatsApp, then to conspiracy communities on Reddit or YouTube, and finally reaching open networks like Twitter. Media amplification often follows, especially if politicians or influencers repeat the falsehoods.
The content itself takes many forms, from photos used out of context to completely fabricated material. Visual elements like memes, misleading infographics, and “evidence collages” that compile information from multiple sources into a single, shareable image are commonly employed. Content designed to trigger emotional responses—particularly fear or anger—proves especially effective.
As DiResta reminds us: in an information war, our minds are the territory. “If you aren’t a combatant, you are the territory. And once a combatant wins over a sufficient number of minds, they have the power to influence culture and society, policy and politics.”