Viral Photo of Burqa-Clad Bangladeshi Cricketers Debunked as AI-Generated

A manipulated image purportedly showing two Bangladeshi women cricketers playing in burqas during a recent ICC Women’s World Cup match has been circulating widely on social media, prompting fact-checkers to investigate its authenticity.

The image, which depicts two players fully covered in black burqas standing on a cricket pitch, began spreading after New Zealand defeated Bangladesh by 100 runs in their World Cup fixture. It was accompanied by captions praising the players for supposedly demonstrating that “purdah is not a barrier, but a protection, and modesty is not weakness, but strength.”

An investigation by India Today’s Fact Check team has conclusively determined that the image is fabricated, created using artificial intelligence tools. Several telltale signs reveal the deception, including the logo of Google’s Gemini AI tool visible in the bottom-right corner of the image.

When analyzed with Google’s SynthID Detector, a tool that checks content for the invisible watermark embedded by Google’s generative AI models, the image was confirmed to have been either entirely created or substantially edited using artificial intelligence.

Further evidence against the image’s authenticity comes from match highlights and official coverage of the New Zealand-Bangladesh game, which clearly show all Bangladeshi players wearing their standard team jerseys throughout the match. No credible news outlets reported any Bangladeshi cricketers competing while wearing burqas, which would have been a significant development in international sports had it occurred.

The Bangladesh women’s cricket team, like most international cricket squads, adheres to standard sporting attire that allows for mobility and comfort during play. The team’s uniform typically consists of green and red jerseys reflecting the nation’s flag colors, paired with matching trousers.

While the viral image is definitively fake, it’s worth noting that Muslim women athletes have competed at the international level while honoring their religious practices through modified attire. Notable examples include Scottish cricketer Abtaha Maqsood, who competed wearing a hijab during the T20 World Cup in 2024. Similarly, Pakistani player Quratulain Ahsen covered her head with a hijab while representing her nation in the Under-19 T20 World Cup this year.

The incident highlights the growing challenge of misinformation in sports coverage, particularly when cultural and religious elements intersect with athletics. As AI image generation tools become increasingly sophisticated and accessible, the ability to create convincing fake images that play into existing cultural narratives poses a significant challenge for media literacy.

Sports governing bodies, including the International Cricket Council (ICC), have established uniform regulations that aim to balance competitive standards with cultural accommodation. While modifications for religious reasons are sometimes permitted, they typically require approval and must not interfere with player safety or performance.

This is not the first time manipulated images have spread during international sporting events, but it demonstrates how advances in AI are making detection increasingly important. Social media platforms continue to grapple with the rapid spread of such content, which can quickly reach millions before fact-checking efforts catch up.

Media experts advise viewers to approach unusual images with skepticism, particularly those that appear to show dramatic departures from established norms in professional sports, and to verify information through official sources before sharing.


