Social Media Manipulation Poses Growing Threat to Democracy, Oxford Study Finds

Social media manipulation campaigns have increased by 150% across the globe over the past two years, according to a new report from the Oxford Internet Institute, highlighting an alarming trend that researchers say threatens democratic processes worldwide.

The University of Oxford’s Computational Propaganda Research Project examined what it calls “cyber troop” activity in 70 countries, finding that governments and political parties are increasingly deploying sophisticated digital tactics to shape public opinion, harass critics, and spread divisive messages.

“The use of computational propaganda to shape public attitudes via social media has become mainstream, extending far beyond the actions of a few bad actors,” the report states. “In an information environment characterized by high volumes of information and limited levels of user attention and trust, the tools and techniques of computational propaganda are becoming a common – and arguably essential – part of digital campaigning and public diplomacy.”

While propaganda has long been a tool of political influence, today’s digital landscape provides unprecedented opportunities for global reach through algorithmic targeting and automated amplification. Researchers attribute the growth in these campaigns partly to improved detection methods, but also to countries new to social media experimenting with computational propaganda during elections or as tools for information control.

Facebook remains the platform of choice for manipulation campaigns, with evidence showing 56 countries conducting cyber troop operations on the social network. Its market size, ability to influence users’ personal networks, and effectiveness at disseminating political content make it particularly valuable for propaganda purposes.

In response to the report, Facebook emphasized that showing users accurate information is a “major priority,” stating: “We’ve developed smarter tools, greater transparency, and stronger partnerships to better identify emerging threats, stop bad actors, and reduce the spread of misinformation on Facebook, Instagram and WhatsApp.”

The research reveals a concerning shift toward visual platforms like Instagram and YouTube, where disinformation spreads through quick, easily digestible content. Samantha Bradshaw, one of the report’s authors, told Reuters: “On Instagram and YouTube it’s about the evolving nature of fake news – now there are fewer text-based websites sharing articles and it’s more about video with quick, consumable content.”

This transition presents new challenges for content moderation. “It’s easier to automatically analyze words than it is an image. And images are often more powerful than words with more potential to go viral,” Bradshaw explained.

The Christchurch terrorist attack in March demonstrated how harmful visual content can spread rapidly across platforms. Facebook acknowledged that the livestreamed video was viewed thousands of times before removal, and that the first user report came 29 minutes after the broadcast began.

The researchers identified four types of accounts used in manipulation campaigns: bots (found in 50 countries), human-operated fake accounts (60 countries), cyborg accounts blending automation with human curation, and hijacked high-profile accounts. Human-controlled accounts were the most prevalent, used by 87% of countries studied.

Their analysis reveals disturbing patterns in how these accounts are deployed: 71% spread pro-government propaganda, 89% attack opposition figures, and 34% disseminate polarizing messages designed to divide societies. Three-quarters of countries studied used disinformation and media manipulation tactics, while 68% employed state-sponsored trolling to target political dissidents, opposition figures, or journalists.

The report notes a significant increase in trolling, doxxing, and harassment campaigns. In 2018, 27 countries were using state-sponsored trolls to attack political opponents or activists via social media. This year, that number rose to 47 countries.

While the researchers acknowledge that democracy faced challenges before the advent of social media, they raise fundamental questions about platforms’ role in public discourse: “Are social media platforms really creating a space for public deliberation and democracy? Or are they amplifying content that keeps citizens addicted, disinformed, and angry?”

Efforts to combat manipulation include Google’s recent initiative to help children identify fake news through lesson plans teaching them to spot phishing attempts, interact with chatbots, evaluate source credibility, detect fake URLs, and assess headlines critically. Mozilla has launched similar programs to counter misinformation.

As manipulation tactics grow more sophisticated, the researchers emphasize that healthy democracies require “access to high-quality information and an ability for citizens to come together to debate, discuss, deliberate, empathize, and make concessions” – raising urgent questions about whether current social media environments can support these democratic necessities.

