A Russian disinformation network is using sophisticated AI voice technology to target Ukrainian athletes and fans at the Winter Olympics, according to research by multiple security and media experts.
The network, identified as “Matryoshka,” employs a particularly deceptive tactic: genuine footage of trusted figures transitions seamlessly into AI-generated voiceovers that mimic their speech patterns, lending false statements an air of authenticity.
“What truly sets Matryoshka apart is the use of AI voiceovers to impersonate the voices of trusted figures,” explains Pablo Maristany de las Casas from the Institute of Strategic Dialogue (ISD), a global think tank that monitors disinformation campaigns.
The BBC Verify team has identified several examples of this technique in action. One notable case features Olympics chief Kirsty Coventry, where genuine footage from a Euronews press conference is manipulated partway through. The AI-generated voice, mimicking Coventry, falsely claims she was shocked that Ukrainian athletes came to Milan “for crazy political PR,” that they were behaving aggressively, and that she had “never encountered people this irritating, I swear.”
Analysis of the original press conference confirms Coventry made no such statements.
Darren Linvill, a media forensics expert at Clemson University who has been tracking these operations, elaborated on the methodology: “They take a real video of a real person but part-way through they switch to stock footage overlaid with a deepfake narration that sounds just like the real person so that they can insert absurd lies that appear more authentic.”
The campaign extends beyond targeting Olympic officials. BBC Verify has identified similar tactics used to create deepfakes of an American sports commentator covering the Winter Olympics. Canadian broadcaster CBC has also debunked an AI-generated video purporting to feature one of their journalists.
This represents a concerning evolution in disinformation tactics. While individual videos may have limited reach, collectively they demonstrate a coordinated effort to undermine international support for Ukraine amid its ongoing conflict with Russia. The Olympics, with its global audience and spirit of international cooperation, presents an attractive target for such influence operations.
This isn’t Matryoshka’s first deployment of voice cloning technology. The BBC previously investigated how the same operation cloned the voice of a British emergency services call handler last year, indicating a pattern of increasingly sophisticated attacks aimed at exploiting public trust in familiar voices and institutions.
“The operators of Matryoshka know that its content is more credible when it’s delivered, seemingly, by a trusted person,” notes Maristany de las Casas, highlighting the psychological effectiveness of the approach.
Media and cybersecurity experts warn that such AI-powered disinformation campaigns are becoming increasingly difficult to detect. The technology behind voice cloning has advanced rapidly in recent years, making it more accessible and the results more convincing.
The targeting of Olympic coverage is particularly strategic, as sporting events typically draw diverse international audiences who may not be familiar with the ongoing geopolitical tensions between Russia and Ukraine, potentially making them more susceptible to misinformation.
Security analysts suggest this campaign represents just one facet of a broader Russian information warfare strategy that has intensified since the full-scale invasion of Ukraine began in February 2022. These operations typically aim to sow division, create confusion, and weaken international resolve regarding support for Ukraine.
As the technology behind these deepfakes continues to evolve, media literacy experts emphasize the importance of verifying information through multiple sources and maintaining healthy skepticism toward emotionally charged or politically divisive content, especially when it appears to show trusted figures making uncharacteristic statements.