Social media manipulation has emerged as a pervasive threat to democracies worldwide, with coordinated campaigns found in all 81 countries surveyed, according to the Oxford Internet Institute's 2020 report on organized social media manipulation. That is up from the 70 countries identified in 2019, an increase of roughly 16%.
Government agencies, political parties, and private firms now produce disinformation on an industrial scale, with researchers finding evidence of it being deployed as part of political communication in 93% of the countries surveyed. The professionalization of this sector has created a troubling landscape in which public opinion is systematically manipulated.
“Our report shows misinformation has become more professionalised and is now produced on an industrial scale,” says Professor Philip Howard, director of the Oxford Internet Institute and co-author of the report. “Now, more than ever, the public needs to be able to rely on trustworthy information about government policy and activity.”
The research reveals millions are being spent on private sector “cyber troops” who work to drown out legitimate voices on social media platforms. These operations often employ citizen influencers—including volunteers, youth groups, and civil society organizations—who amplify manipulated messages that align with their ideologies.
Dr. Samantha Bradshaw, the report’s lead author and an Oxford Internet Institute alumna, notes how this activity has transformed: “A large part of this activity has become professionalised, with private firms offering disinformation-for-hire services.” The study identified state actors collaborating with strategic communications firms in 48 countries.
The financial scale of these operations is significant, with nearly $60 million spent on firms using bots and other amplification strategies to create the impression of trending political content. Additionally, approximately $10 million was directed toward social media political advertisements during the study period.
Social media platforms have attempted to combat these operations, with Facebook and Twitter removing more than 317,000 accounts and pages linked to “cyber troops” between January 2019 and November 2020. However, the problem continues to grow as tactics evolve.
Government involvement in computational propaganda is extensive. “In 62 countries, we found evidence of a government agency using computational propaganda to shape public attitudes,” the report states.
Political parties also heavily leverage social media manipulation, with researchers identifying 61 countries where politicians or parties used computational propaganda techniques during campaigns. “Social media has become a critical component of digital campaigning,” the report notes.
The manipulation techniques vary across political systems. “Cyber troop activity can look different in democracies compared to authoritarian regimes,” Dr. Bradshaw explains. “Electoral authorities need to consider the broader ecosystem of disinformation and computational propaganda, including private firms and paid influencers.”
Researchers documented various manipulation tactics across countries: 79 countries used human accounts for spreading disinformation, 57 employed bot accounts, and 14 utilized hacked or stolen accounts. Additionally, 76 countries deployed disinformation and media manipulation campaigns, 30 used data-driven targeting strategies, and 59 employed state-sponsored trolls to attack political opponents or activists—a significant increase from 47 countries in 2019.
The Oxford researchers developed their findings using a four-step methodology that included systematic content analysis of news articles, secondary literature reviews of public archives and scientific reports, country-specific case studies, and expert consultations. The research was conducted between 2019 and 2020.
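The tactic figures cited above are country-level counts: a country is tallied once for a given tactic if coders found any evidence of it, regardless of the scale of the operation. As a minimal sketch of that aggregation step, the snippet below tallies tactic counts from a hypothetical coding table; the field names and sample rows are illustrative assumptions, not the Oxford dataset or its actual coding scheme.

```python
# Hypothetical sketch: turning per-country coding results into the kind of
# aggregate figures the report cites (e.g. "57 countries employed bot accounts").
# The records below are invented for illustration only.
from collections import Counter

# Each record lists the tactics for which coders found evidence in one country.
coded_countries = [
    {"country": "Country A", "tactics": {"human_accounts", "bot_accounts", "state_sponsored_trolling"}},
    {"country": "Country B", "tactics": {"human_accounts", "data_driven_targeting"}},
    {"country": "Country C", "tactics": {"human_accounts", "bot_accounts", "hacked_accounts"}},
]

# Count how many countries show evidence of each tactic (one per country, not per incident).
tactic_counts = Counter(
    tactic for record in coded_countries for tactic in record["tactics"]
)

for tactic, n_countries in tactic_counts.most_common():
    print(f"{tactic}: evidence in {n_countries} of {len(coded_countries)} countries")
```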
As social media platforms become increasingly central to political discourse worldwide, the industrialization of manipulation poses significant challenges for democratic processes and information integrity. The report underscores the urgent need for stronger safeguards and increased transparency in digital political communication.