Meta Fights Russian Disinformation Flood as Tactics Evolve
Russian disinformation operations continue to plague Facebook and Instagram despite Meta’s ban on Russian state-controlled media, according to Nick Clegg, Meta’s president of global affairs. Speaking ahead of the company’s quarterly adversarial threat report, Clegg revealed that thousands of unsophisticated accounts are flooding the platforms with pro-Kremlin narratives about the invasion of Ukraine.
These accounts, while lacking the establishment and sophistication of previous Russian disinformation operations, present a different kind of challenge. Security experts suggest the strategy could be aimed at overwhelming Meta’s detection infrastructure through sheer volume rather than sophistication.
The tactical shift might also indicate Russia was caught off guard by the need for this type of information warfare. “It suggests they were unprepared and not expecting to need to use this type of information war tactic,” Clegg noted, pointing to the hastily assembled nature of these newer influence operations.
Tim Squirrell, Head of Communications and Editorial at the Institute for Strategic Dialogue (ISD), told Sky News that the findings align with broader patterns observed by independent researchers. “It’s unsurprising that there has been a proliferation of pro-Kremlin accounts trying to crowd out the information environment with their propaganda,” he said.
Squirrell highlighted a more concerning trend: networks of pro-Kremlin influencers who operate with apparent independence have proven "even more effective at spreading Moscow's views than either state media or Russian-origin propaganda." These influencers often maintain the appearance of being neutral commentators while amplifying Kremlin talking points.
Meta’s response to the threat has shown some effectiveness. “If the figures are correct, it indicates that downranking and demonetisation does have an impact,” Squirrell observed. However, he questioned why the company applies such measures selectively: “This begs the question as to why Meta refuses to do that for the large array of other toxic content on its platform.”
The selective enforcement points to broader concerns about Meta’s content moderation approach. “They’ve proven they respond under massive international pressure, but aren’t able to proactively do it when it might compromise their bottom line,” Squirrell added, suggesting that financial considerations may influence the company’s willingness to take action against harmful content.
While security researchers acknowledge Meta’s efforts to remove explicit propaganda campaigns, they emphasize that significant challenges remain. Squirrell noted the difficulty independent researchers face in accessing Meta’s data to verify the company’s claims and conduct their own assessments.
ISD’s own analysis suggests that misinformation and disinformation continue to circulate widely on Meta’s platforms despite the company’s interventions. Particularly concerning are narratives around critical events in the Ukraine conflict, such as the Bucha massacre.
“Posts questioning the massacre were three times as likely to be shared as those involving corroborated reporting,” Squirrell revealed, highlighting how conspiracy theories and false narratives can often gain more traction than factual information on social media platforms.
The challenge of combating Russian disinformation comes amid broader scrutiny of social media companies and their role in geopolitical conflicts. As digital platforms increasingly become battlegrounds for information warfare, the effectiveness of content moderation policies and the transparency of enforcement actions remain central concerns for researchers, policymakers, and the public.
For Meta, balancing free expression with the need to prevent the weaponization of its platforms represents an ongoing challenge—one that will likely intensify as propaganda tactics evolve and geopolitical tensions continue to play out in the digital sphere.