In the wake of escalating tensions between Iran and the United States, artificial intelligence has emerged as a potent tool in shaping public perception of the conflict. The widespread availability of AI video technologies has led to a flood of deepfake videos and manipulated images circulating across social media platforms, blurring the lines between fact and fiction.
Marc Owen Jones, associate professor of media analytics at Northwestern University in Qatar, describes social media as “a battlefield for competing narratives” where all parties involved are vying to win “hearts and minds.” According to Jones, who specializes in digital disinformation, the American side has been characterized by “videos intercut with Hollywood clips, a sort of memeification of communication designed to appeal to a far-right aesthetic.”
Iran, meanwhile, has adapted to this digital landscape, often using memes to mock the United States while deploying AI-generated content that appears to exaggerate Iranian military successes. This strategy may be aimed at pressuring Gulf states to advocate for de-escalation in the region.
The sophistication of AI-generated deepfakes has reached alarming levels. One striking example involved videos purporting to show the USS Abraham Lincoln aircraft carrier burning at sea—content so convincing that even former President Donald Trump reportedly contacted military officials to verify whether the footage was authentic. Trump later confirmed on his Truth Social platform that the carrier had not been attacked.
Other fabricated content includes videos claiming to show U.S. troops crying in defeat and buildings in Gulf cities being destroyed by military strikes. The increasing quality of these deepfakes makes them particularly difficult to identify. “The use of AI is legion and is increasingly hard to detect,” Jones warns.
The rapid dissemination of unverified information presents a significant challenge for the public. “In a fast-moving conflict, verified information is often delayed, which creates a vacuum that misinformation fills immediately,” Jones explains. “When people are worried, they crave information, but that information is often false.” This dynamic allows fabricated content to reach millions of viewers within minutes, well before fact-checkers can intervene.
The phenomenon extends beyond battle footage. Last week, rumors spread widely that Israeli Prime Minister Benjamin Netanyahu had died, with skeptics pointing to supposed visual glitches in a video released by Netanyahu’s office as evidence of AI manipulation. Despite Netanyahu subsequently releasing several “proof-of-life” videos to counter these claims, speculation about his death continues to circulate online.
Some misinformation appears to be part of coordinated campaigns designed to influence public opinion. “There are sketchy, anonymous accounts, with histories of multiple name changes and no discernible identity, sharing fake news and AI videos,” Jones notes. These accounts may be linked to state-backed actors or to individuals seeking to profit from sensationalist content. In many cases, automated bot accounts amplify certain narratives, creating a false impression of widespread support.
Not all AI-generated content is created with deceptive intent. Some videos are deliberately produced as parody or satire, mocking world leaders like Trump and Netanyahu. However, these satirical creations can still be misconstrued as authentic footage. Examples include videos depicting Trump as Iran’s new supreme leader or portraying Netanyahu as a malfunctioning robot.
The proliferation of misleading content is eroding public trust in media. “False information can spread up to ten times faster than accurate reporting on social media, and corrections are rarely as widely seen or believed as the original false claim,” Jones observes. “Outrage drives sharing before fact-checking can occur, which is exactly what bad actors exploit.”
As the conflict continues, Jones advises treating dramatic footage with heightened skepticism. “The fact that it looks real is no longer sufficient evidence that it is,” he cautions. For ordinary people trying to stay informed about the Iran-U.S. conflict, navigating this media environment demands an unprecedented degree of media literacy and critical thinking.