Russia Escalates Disinformation Operations Using AI, European Report Finds
Russian actors are significantly expanding their disinformation operations using artificial intelligence tools, according to a recent report from the European External Action Service (EEAS) cited by Ukraine’s Center for Countering Disinformation on March 20.
The comprehensive threat assessment documented 540 cases of foreign information manipulation and interference throughout 2025, involving approximately 10,500 social media channels and websites. Ukraine remained the primary target of these campaigns, which aimed to erode international support and undermine confidence in Ukrainian leadership and resistance efforts.
The EEAS report highlights a dramatic technological transformation in disinformation tactics. Analysis revealed that 27% of documented incidents incorporated AI-generated content, including synthetic text, fabricated audio, or manipulated video. These tools have enabled hostile actors to produce disinformation at unprecedented speed while requiring fewer resources than traditional methods.
“Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources,” the report stated, emphasizing how generative technologies are significantly reducing the cost barriers for conducting large-scale influence operations.
Among cases where attribution was possible, Russia was linked to 29% of disinformation incidents, while China was connected to 6%. The remaining 65% of cases could not be definitively attributed to specific state actors, highlighting the persistent challenge of identifying the sources behind sophisticated information manipulation campaigns.
The EEAS identified heightened vulnerability during major political events and news cycles, with nearly half of all recorded incidents coinciding with elections, protests, or international crises. Throughout 2025, election-related disinformation campaigns were tracked in several European nations, including Germany, Poland, Romania, Moldova, and the Czech Republic.
Security analysts note this represents a concerning evolution in Russia’s information warfare capabilities. While propaganda has long been part of Moscow’s strategy, the integration of advanced AI tools marks a significant escalation in both the scale and sophistication of these operations.
One specific example highlighted by Ukraine’s Center for Countering Disinformation involved a Russian narrative claiming Ukrainian drones deliberately targeted civilians in Russia’s Belgorod region. The center identified this as following a familiar propaganda template that relies on emotionally charged claims with limited verifiable evidence. The disinformation included stories about drones striking a woman’s car and pursuing an elderly resident walking with a goat—allegations that lacked independent verification.
The findings come amid growing international concern about the role of artificial intelligence in amplifying false information. Technology experts warn that as AI tools become more sophisticated and accessible, the challenge of distinguishing authentic from synthetic content will continue to intensify.
The EEAS report underscores the need for increased digital literacy and robust fact-checking mechanisms to counter these evolving threats. European officials have called for greater collaboration between governments, technology companies, and media organizations to develop more effective strategies for identifying and countering AI-generated disinformation.
As Russia’s war against Ukraine continues, information manipulation remains a critical component of Moscow’s hybrid warfare strategy. The latest assessment suggests that rather than diminishing, these efforts are becoming more technologically advanced and potentially more difficult to detect and counter.
Experts emphasize that understanding these new technological dimensions of information warfare will be essential for democratic societies seeking to protect their information ecosystems and electoral processes in the coming years.