Videos purporting to show former Bank of England governor Mark Carney promoting cryptocurrency investment schemes have been circulating online, but fact-checkers confirm they are sophisticated deepfakes designed to scam unsuspecting investors.
The deceptive videos, which have appeared across social media platforms in recent weeks, show what appears to be Carney enthusiastically endorsing various crypto investment platforms that promise extraordinary returns. Using artificial intelligence technology to manipulate both Carney’s likeness and voice, the creators have produced realistic-looking content that has fooled many viewers.
“I have never endorsed any crypto scheme, trading platform or investment product – ever,” Carney stated in an official response released through his representatives. The former central banker, who led the Bank of England from 2013 to 2020, emphasized that he has consistently cautioned investors about the volatility and risks associated with cryptocurrency markets throughout his career.
Financial authorities have noted a troubling increase in deepfake videos targeting high-profile financial figures. These sophisticated scams represent an evolution of traditional investment fraud, leveraging artificial intelligence to create compelling and seemingly authentic endorsements from trusted figures.
The UK’s Financial Conduct Authority (FCA) has issued warnings about these particular videos, urging consumers to verify information through official channels before making any investment decisions. “Fraudsters are increasingly using advanced technology to impersonate trusted public figures,” said an FCA spokesperson. “Always check the FCA Register to ensure you’re dealing with a legitimate firm.”
Security experts point out several telltale signs that the Carney videos are fabricated. These include subtle inconsistencies in facial movements, unnatural voice patterns, and promotional language that would be highly inappropriate for someone of Carney’s position and professional background.
“The technology behind these deepfakes is becoming increasingly sophisticated,” explains Dr. Mira Patel, a digital forensics specialist at Cambridge University. “What’s particularly concerning is that they’re not just using Carney’s image, but they’ve synthesized his voice with remarkable accuracy, making the deception much more convincing to the average viewer.”
The scams typically direct viewers to investment platforms promising returns of 30% or more – red flags that financial experts say should immediately alert consumers. These platforms often use high-pressure tactics, creating artificial time constraints to rush potential victims into making hasty decisions before conducting proper due diligence.
This incident highlights the growing challenges posed by deepfake technology in the financial sector. According to cybersecurity firm Kaspersky, AI-generated investment scams increased by 135% in 2022-2023, with losses estimated at over £150 million in the UK alone.
For Carney, who now serves as the UN Special Envoy for Climate Action and Finance, these fraudulent representations threaten to undermine his reputation and the public’s trust in financial leadership more broadly. His team has been working with social media companies to remove the content and is exploring legal options against the perpetrators.
The Bank of England has also responded, reminding the public that neither current nor former central bank officials would ever endorse specific investment products. “These videos represent a concerning misuse of technology to exploit public trust in financial institutions,” a spokesperson noted.
Consumer protection groups advise exercising extreme caution with any investment opportunity promoted through social media, particularly those featuring celebrity endorsements. “Always verify information through multiple sources, and remember that legitimate financial advisors don’t promise extraordinary returns with no risk,” says Elaine Harper of the Consumer Financial Protection Network.
Technology companies, meanwhile, face mounting pressure to develop more effective methods for detecting and removing deepfake content before it reaches wide audiences. Meta, Google, and Twitter have all acknowledged the challenge and claim to be enhancing their detection capabilities, though critics argue these efforts remain insufficient given the rapid advancement of AI technology.
As this case demonstrates, distinguishing between authentic and manipulated content continues to become more difficult for average consumers. Financial literacy experts recommend that investors maintain healthy skepticism toward any investment opportunity that seems too good to be true – because it invariably is.
8 Comments
This is a sobering reminder that we need to be extremely cautious about what we see and hear, especially when it comes to financial matters. Deepfake technology is advancing rapidly, and investors must be vigilant to avoid falling victim to these types of scams.
It’s disheartening to see such sophisticated technology being used for malicious purposes. Scammers are always looking for new ways to exploit people, and deepfakes are a particularly insidious tool. Kudos to the fact-checkers for exposing this particular scam.
It’s good that Mark Carney has strongly denied endorsing any crypto schemes. Investors need to be very cautious about claims made in videos, even if they seem to feature a credible figure. Verification is key to avoid falling for these types of scams.
Absolutely. Carney’s clear statement is important to counter the deception. Investors should always verify claims, especially for high-risk products like crypto.
I’m glad the authorities are taking this issue seriously. Deepfake videos targeting financial figures are a worrying trend that could have serious consequences for investors. Increased public awareness and robust verification processes will be key to combating this threat.
This is a good reminder that we can’t always trust what we see and hear, even from seemingly reputable sources. Deepfake technology is advancing rapidly, and it’s crucial for the public to be aware of the risks. Fact-checking is essential to avoid falling victim to these types of scams.
Wow, this is really concerning. Deepfake videos are becoming more and more sophisticated, and it’s scary to think they could be used to scam people out of their money. I’m glad the authorities are aware of this issue and are working to address it.
I wonder how prevalent these deepfake scams have become in the crypto space. It seems like an easy way for bad actors to exploit people’s fears of missing out on the ‘next big thing.’ Regulators will need to stay on top of this emerging threat.