Fake Satellite Images Emerge as New Battleground in Israel-Iran Conflict

Social media users have grown adept at spotting obvious AI-manipulated celebrity photos or glitchy cityscapes, but a more sophisticated form of digital deception has emerged during the ongoing conflict between Israel, the United States, and Iran: fraudulent satellite imagery.

“For satellite images, we can safely say that the majority of people have very limited familiarity,” explains Symeon Papadopoulos, an AI researcher specializing in media verification at Greek research institute CERTH. “That makes them particularly prone to being misused, because if you change a small detail in a satellite image, most likely nobody will notice.”

While manipulating satellite imagery isn’t new—Russia infamously circulated fake satellite images of a downed Malaysian airliner in 2014, and similar deceptions appeared during India-Pakistan tensions last year—experts observe that the technique has become far more prevalent in the current Middle East conflict.

“It seems like the problem’s getting worse,” notes Brady Africk, an open-source intelligence (OSINT) analyst tracking the phenomenon.

One driving factor is the proliferation of AI tools, which now make it trivial to take authentic satellite images from services like Google Earth or Bing Maps and digitally alter them. These doctored images often suggest destroyed infrastructure or strategic damage, advancing military narratives that benefit one side of the conflict.

Complicating matters, many commercial satellite providers have restricted public access to high-resolution imagery during the conflict to prevent their data from being used for military targeting. This information vacuum has created fertile ground for fabricated images that exploit public unfamiliarity with how satellite imagery actually looks and functions.

“Many people link the complexity involved in capturing a real satellite image to a resilience against those images being faked, but there’s no such link,” Africk emphasizes. He reminds social media users that satellite images “are photos just like any other and can be vulnerable to similar manipulations.”

Telltale Signs of Manipulation

Recent examples demonstrate the increasing sophistication—and prevalence—of these fakes. In one widely shared X post, a user circulated what appeared to be satellite imagery of burning oil fields in Qatar. While Qatar’s liquefied natural gas (LNG) facilities were indeed targeted by Iranian missiles, the image was easily identifiable as AI-generated by the Gemini watermark visible in the lower-right corner.

Though the image mimicked the texture and coloration of authentic satellite photos, the fire and smoke patterns were inconsistent with how such phenomena appear from orbit. The AI-detection tool ImageWhisperer also flagged the image as likely AI-generated with 73% confidence.

In another high-profile case, Tehran Times, a state-linked English-language newspaper, posted satellite images purporting to show “An American radar in Qatar” before and after it was allegedly destroyed in an Iranian drone strike. The post was viewed over 950,000 times.

Fact-checkers quickly determined that not only was the location misidentified—the site is actually a US naval base in Manama, Bahrain—but the “after” image showed clear signs of AI manipulation. Building structures had inexplicably changed shape, architectural lines appeared inconsistent, and certain elements were artificially added.

What makes this case particularly troubling is that Iran did attack this US base, and verified satellite images from legitimate sources like Planet Labs and Airbus (published by The New York Times) documented authentic damage. The fake images circulated alongside real ones, creating confusion about the actual impact of the strike.
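Analysts performing this kind of verification often begin by differencing co-registered before-and-after frames. A minimal sketch of that idea in Python (using Pillow and NumPy; the file paths and the threshold value are hypothetical, and real imagery would need registration and lighting correction first):

```python
import numpy as np
from PIL import Image

def change_mask(before_path, after_path, threshold=40):
    """Flag pixels whose grayscale value changed by more than `threshold`
    between two co-registered frames. Returns the boolean change mask and
    the fraction of the scene that changed."""
    before = np.asarray(Image.open(before_path).convert("L"), dtype=np.int16)
    after = np.asarray(Image.open(after_path).convert("L"), dtype=np.int16)
    diff = np.abs(after - before)
    mask = diff > threshold
    return mask, mask.mean()
```

The intuition: genuine strike damage tends to produce a small, localized changed region, while an AI-regenerated “after” frame often lights up across the whole scene, because the model redraws buildings and terrain it should have left untouched.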

Impersonation Compounds the Problem

Adding another layer to the disinformation landscape, an account impersonating the Chinese geospatial intelligence company MizarVision circulated supposed satellite images of burning oil fields in Qatar. Not only were the images fabricated, but the entire account was fraudulent.

MizarVision, a legitimate Shanghai-based company, only publishes on Chinese platforms Weibo and WeChat. The imposter account, created in January and falsely claiming to be based in “Chinatown, Portland,” stole logos to post images with MizarVision watermarks before being taken down. The real company publicly clarified in February that any accounts using its name on Twitter/X were unauthorized impersonations.

One widely shared image from this fake account displayed a heavily filtered black-and-white “satellite” view of Qatar’s Ras Laffan refinery with multiple smoke plumes. Analysis revealed that all explosions appeared in nearly identical stages—a telltale sign of artificial cloning rather than natural occurrence.
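Spotting near-identical plumes like these can be partly automated with a naive copy-move (clone) check: hash fixed-size pixel blocks and look for exact repeats. A rough illustration in Python (Pillow and NumPy assumed; the block size is arbitrary, and this simplification only catches clones that are aligned to the block grid and not altered by recompression):

```python
import hashlib
import numpy as np
from PIL import Image

def find_cloned_blocks(path, block=16):
    """Hash every non-overlapping `block` x `block` grayscale tile and
    report tiles that appear more than once. Pasted duplicates (like
    cloned smoke plumes) collide; authentic imagery rarely repeats a
    16x16 pixel tile exactly."""
    img = np.asarray(Image.open(path).convert("L"))
    h, w = img.shape
    seen, clones = {}, []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = img[y:y + block, x:x + block]
            digest = hashlib.md5(tile.tobytes()).hexdigest()
            if digest in seen:
                clones.append((seen[digest], (y, x)))
            else:
                seen[digest] = (y, x)
    return clones
```

Production forensic tools use overlapping blocks, perceptual rather than exact hashes, and keypoint matching to survive rotation and JPEG compression, but the underlying idea is the same one the analysts applied by eye.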

Growing Challenge for Public Understanding

As satellite imagery becomes an increasingly central tool in both journalism and warfare, the rise of AI-manipulated visuals presents a significant challenge for public comprehension. False or altered images can spread virally across social platforms, shaping public perception and political narratives long before experts have time to analyze and debunk them.

In an era where conflicts unfold in real time on social media, developing digital literacy—and maintaining healthy skepticism toward dramatic “satellite revelations”—is essential. While genuine satellite data remains crucial for documenting global events, distinguishing authentic imagery from sophisticated fabrications will require ongoing vigilance from platforms, media organizations, and individual users alike.

The proliferation of these fakes represents not just a challenge to accurate reporting on the current conflict, but a troubling preview of how future wars might be visually misrepresented to global audiences.

