Russian intelligence services have ramped up their disinformation efforts against Ukraine, deploying sophisticated AI-generated videos that mimic street polls to undermine public support for the war effort, according to Ukrainian officials.
The Center for Countering Disinformation (CCD), operating under Ukraine’s National Security and Defense Council, has issued a warning about this latest addition to Russia’s information warfare arsenal. In a recent social media statement, the agency detailed how the fabricated videos present seemingly ordinary Ukrainian citizens advocating for “peace at any cost,” regardless of the consequences for Ukraine’s territory or sovereignty.
“These videos are designed with remarkable attention to detail, creating a visual style that closely resembles authentic street interviews,” a CCD spokesperson explained. “However, comprehensive analysis using specialized detection tools confirms they are entirely AI-generated – not a single ‘person’ in these videos actually exists.”
The fabricated street polls follow a consistent narrative pattern. The AI-generated “Ukrainians” express growing war fatigue and suggest that peace should be the priority, while implying that Ukrainian authorities are needlessly prolonging the conflict. This narrative aims to shift blame for the war’s continuation away from Russia, the aggressor nation, and onto Ukraine’s leadership.
Intelligence experts note that this represents a significant evolution in Russia’s digital propaganda tactics. While Russia has long run disinformation campaigns across social media platforms, the sophistication of these AI-generated videos marks a clear technological escalation.
“What makes these videos particularly concerning is their potential to appear authentic to the average viewer,” said Dr. Mykhailo Vasylenko, a digital security analyst based in Kyiv. “Without specialized tools or training, many people might not recognize these as AI fabrications, especially when they’re shared across platforms without context.”
The CCD indicates this is not an isolated incident but part of a broader disinformation strategy. Previous waves of similar AI-generated content have promoted the notion that Ukrainian citizens support territorial concessions to Russia – a narrative that contradicts polling data showing strong Ukrainian opposition to ceding territory.
These tactics align with Russia’s larger information warfare strategy that has evolved significantly since the full-scale invasion began in February 2022. Russian propagandists have repeatedly attempted to create the impression of international support for their position by creating fake materials purporting to represent foreign media outlets, especially during high-profile international events.
The timing of this latest disinformation push coincides with ongoing discussions about potential peace negotiations and growing strain on international support for Ukraine. Security experts suggest Russia aims to exploit war fatigue both within Ukraine and among its international partners.
Ukrainian media literacy organizations are responding by publishing guidelines to help citizens identify AI-generated content. Key indicators include unnatural facial movements, inconsistent lighting, audio-visual misalignments, and backgrounds that may contain subtle distortions.
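To make one of those indicators concrete, the following is a minimal, hypothetical Python sketch, not one of the CCD’s detection tools, that flags abrupt frame-to-frame brightness jumps as a crude proxy for the “inconsistent lighting” cue; the file name and threshold are assumptions chosen purely for illustration.

import cv2
import numpy as np

def flag_lighting_jumps(video_path, threshold=15.0):
    # Crude heuristic: flag frames whose average brightness differs sharply
    # from the previous frame. Real detection tools combine far stronger signals.
    cap = cv2.VideoCapture(video_path)
    flagged, prev_brightness, index = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        brightness = float(np.mean(gray))
        if prev_brightness is not None and abs(brightness - prev_brightness) > threshold:
            flagged.append(index)
        prev_brightness = brightness
        index += 1
    cap.release()
    return flagged

# "suspect_clip.mp4" is a placeholder file name used only for this example.
print(flag_lighting_jumps("suspect_clip.mp4"))

A heuristic like this will produce false positives on legitimate cuts and camera movement; it illustrates the kind of signal automated checks look for, not a reliable verdict on whether a video is synthetic.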
“This represents the next frontier in information warfare,” said Oleksandra Matviychuk, a cybersecurity researcher at Ukraine’s Digital Transformation Ministry. “As AI technology becomes more accessible and sophisticated, distinguishing between authentic and fabricated content will become increasingly challenging.”
International digital security organizations have also expressed concern about the implications of these tactics beyond the Russia-Ukraine conflict. The deployment of realistic AI-generated videos to influence public opinion represents a threat to democratic discourse globally, as similar methods could be employed in election campaigns or other politically sensitive contexts.
Ukrainian officials urge citizens to verify information through multiple trusted sources and remain vigilant about the content they consume and share on social media platforms.