Social media manipulation campaigns have become increasingly sophisticated and widespread, according to a new report from the Oxford Internet Institute (OII) at the University of Oxford. Researchers have documented an alarming expansion of computational propaganda across the globe, with the number of countries experiencing organized social media manipulation rising from 28 to 48.
“The majority of growth comes from political parties who spread disinformation and junk news around election periods,” explains Samantha Bradshaw, co-author of the report. “More campaigns are using bots, junk news, and disinformation to polarize and manipulate voters.”
Political organizations worldwide appear to be adopting tactics similar to those deployed during Brexit and the 2016 U.S. Presidential election, creating a playbook for digital manipulation that transcends borders. This trend persists despite numerous democratic governments introducing legislation specifically designed to combat fake news online.
Professor Phil Howard, lead researcher on the OII’s Computational Propaganda project, points to a troubling development in how these anti-disinformation efforts are being implemented. “The problem with this is that these ‘task forces’ to combat fake news are being used as a new tool to legitimize censorship in authoritarian regimes,” Howard notes. “At best, these types of task forces are creating counter-narratives and building tools for citizen awareness and fact-checking.”
As platforms like Facebook and Twitter enhance their monitoring capabilities, disinformation campaigns are adapting by migrating to less regulated spaces. “There is evidence that disinformation campaigns are moving on to chat applications and alternative platforms,” Bradshaw observes. “This is becoming increasingly common in the Global South, where large public groups on chat applications are more popular.”
Messaging platforms like WhatsApp, Telegram, and Signal present unique challenges for monitoring and regulation because of their encryption and private group structures. These platforms have become fertile ground for spreading misinformation, particularly in regions with lower digital literacy and fewer fact-checking resources.
Despite platform efforts to identify and remove automated accounts, bot networks remain a prevalent tactic in the computational propaganda arsenal. These automated accounts serve multiple functions: spreading partisan content, strategically sharing posts to game algorithms, manipulating trending topics, and even mass-reporting legitimate content to trigger automatic removal systems.
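Platforms typically flag automation by combining simple behavioral signals such as posting frequency and repeated content. The sketch below is a toy illustration of that idea, not any platform's actual detection logic; the threshold of 50 posts per day and the equal weighting of the two signals are assumptions chosen for clarity.

```python
from collections import Counter

def bot_likelihood(posts_per_day, texts):
    """Toy heuristic score in [0, 1] combining two common bot signals:
    unusually high posting frequency and heavy duplication of post text.
    The 50-posts/day threshold and equal weighting are illustrative only."""
    # Signal 1: posting rate, scaled so 50+ posts/day saturates at 1.0.
    rate_signal = min(posts_per_day / 50.0, 1.0)
    # Signal 2: fraction of posts that are exact duplicates of another post.
    counts = Counter(texts)
    duplicates = sum(c - 1 for c in counts.values())
    dup_signal = duplicates / len(texts) if texts else 0.0
    # Average the two signals into a single score.
    return (rate_signal + dup_signal) / 2.0
```

Real systems weigh many more features (account age, network structure, coordinated timing), which is partly why, as the report notes, purely automated detection remains an arms race.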
“We suspect new innovation will continue to emerge as platforms and governments take legal and regulatory steps to curb this type of activity,” says Howard, suggesting that technological countermeasures often lead to an arms race rather than definitive solutions.
The financial scale of these manipulation efforts is substantial. “We estimate that tens of millions of dollars are spent on this type of activity,” Howard explains. “Some of the money may be spent on legitimate advertising on social media, but there is certainly a growing industry for fake accounts, online commentators, and political bots.”
This emerging industry undermines public discourse by distorting information ecosystems and eroding trust in legitimate media, scientific consensus, and public institutions. The growing sophistication of these campaigns creates a challenging environment for average citizens attempting to navigate increasingly polluted information spaces.
The OII report highlights how social media platforms, originally designed to connect people and democratize information access, have become weaponized tools that can undermine democratic processes. As techniques evolve and spread globally, the distinction between authentic public opinion and manufactured consensus becomes increasingly difficult to discern.
Experts suggest that addressing this complex challenge will require coordinated efforts from technology platforms, government regulators, educational institutions, and civil society organizations to develop media literacy, enhance transparency, and create more resilient information ecosystems.