In the digital age, influence operates through a new set of rules where reach and engagement often trump substance and accuracy. This transformation in propaganda has become increasingly apparent in recent years, with social media platforms serving as the primary battleground for shaping public perception.

By early 2026, a curious phenomenon emerged when videos depicting American President Donald Trump and Israeli Prime Minister Benjamin Netanyahu as LEGO-style characters began circulating widely across social platforms. These animations placed the political figures in disturbing wartime scenarios—showing bombed schools, toy soldiers marching through blood, and miniature flag-draped coffins—all set to catchy, AI-generated music that combined humor, satire, and horror.

Some videos appeared on Iranian state television, while others spread through accounts claiming to be independent creators. The origin of much of this content remained deliberately ambiguous, yet the impact was unmistakable—millions of views and widespread engagement across platforms.

This ambiguity is a defining feature of modern propaganda. Groups like the self-described “Explosive News Team” maintain claims of independence despite narratives that closely align with state messaging. Meanwhile, official government accounts increasingly adopt similar visual languages and techniques, effectively blurring the line between grassroots creativity and coordinated propaganda campaigns.

What distinguishes this new wave of propaganda is not just its message but its format. War is now communicated through memes, animations, and short videos that borrow heavily from popular culture. Even established institutions have embraced this trend: the White House has experimented with content that merges military footage with video-game aesthetics referencing popular titles like Call of Duty.

In India, television channels often criticized as “Godi Media” have adopted similar approaches, reporting war updates with sports-like commentary that sometimes reduces human casualties to numerical scores. This transformation reflects the fundamental logic of the attention economy, where content value is determined not by accuracy but by engagement metrics.

The most effective content combines familiar formats with shocking elements—a LEGO animation of a bombing raid is more likely to be watched, shared, and discussed than a traditional news report. Crucially, users don’t need to agree with the content to amplify it; they only need to find it compelling enough to engage with.

This evolution has been developing for years. As early as 2015, ISIS produced stylized recruitment videos borrowing from gaming aesthetics to appeal to younger audiences. By 2020, China’s Xinhua News Agency released LEGO-style animations criticizing the United States’ COVID-19 response. Russia later adopted similar approaches in Eastern European campaigns, while in India, the BJP’s IT cell pioneered meme-driven propaganda strategies that were eventually adopted by mainstream media.

Satire and humor have become particularly effective tools in this landscape. They are engaging, digestible, and difficult to counter effectively. Factual rebuttals to memes or animated parodies often appear slow, serious, and mismatched in tone. As a result, spectacle consistently outpaces substance, with viral content shaping perceptions before fact-checking can catch up.

Recent analytics confirm this imbalance—social media videos related to ongoing conflicts generate billions of impressions, vastly exceeding the reach of traditional news coverage. For many users, especially younger demographics, war is encountered first as content, and only later, if at all, as verified information.

Generative AI has accelerated this shift dramatically by enabling rapid production of high-quality content at minimal cost. This allows both state and non-state actors to flood platforms with competing narratives. Attribution becomes increasingly difficult as governments, proxy groups, and independent creators produce similar content, often reinforcing each other’s messaging.

The effectiveness of modern propaganda cannot be measured solely by changed minds. Its impact works more subtly by shaping the environment in which people interpret events. Viral content influences what feels important, what appears credible, and what emotions become associated with conflicts. It creates an atmosphere defined by spectacle and emotional resonance rather than factual accuracy.

As the French sociologist Jacques Ellul observed decades ago, propaganda evolves alongside the systems that carry it. In today's algorithm-driven ecosystem, it increasingly takes forms designed to travel fast, far, and wide. When memes and viral clips become the primary vehicles for complex geopolitical realities, the line between information and entertainment dissolves.

The question is no longer just what people believe, but what captures their attention—and how often. In an age where virality determines visibility, the most powerful message isn’t necessarily the most truthful, but the one that travels furthest.

8 Comments

  1. The use of AI-generated animations to spread propaganda is quite worrying. It’s a tactic that can make even the most absurd and disturbing content seem more palatable and shareable. We need robust media literacy efforts to help the public identify and resist these manipulative tactics.

    • Jennifer Z. Garcia on

      Absolutely. Viral misinformation can have a real impact on how people perceive current events and political issues. Fact-checking and critical thinking will be key to combating the rise of AI-driven propaganda.

  2. Amelia Jones on

    The use of LEGO-style animations to depict disturbing wartime scenarios is a particularly unsettling tactic. It seems designed to appeal to a wide audience, including children, while normalizing and even trivializing real-world violence. We must remain vigilant against such manipulative propaganda.

  3. Robert Taylor on

    Fascinating how AI can be used to manipulate public opinion through viral content. It’s a concerning trend that blurs the line between fact and fiction. We’ll need to be vigilant in scrutinizing online content to avoid being swayed by misleading propaganda.

  4. Patricia T. Jackson on

    The blending of humor, satire, and horror in these propaganda videos is a concerning development. It seems designed to captivate and engage viewers, while also subtly shaping their perspectives. We’ll need to be extra vigilant in separating truth from fiction online.

  5. Isabella Martinez on

    The rise of AI-driven propaganda is a complex issue that touches on everything from media literacy to geopolitics. It’s a reminder that we need to be critical consumers of online content and seek out reliable, fact-based sources of information. Developing effective countermeasures will be crucial.

  6. This is a prime example of how AI can be weaponized to spread disinformation and sow social discord. The ability to generate content that appears organic and engaging, yet pushes a specific narrative, is a real threat to democratic discourse. We must address this challenge head-on.

  7. Ava Martinez on

    The ambiguity around the origins of this type of content is part of what makes it so insidious. It’s difficult to trace the source and motivations behind these viral videos. Strengthening digital literacy and media analysis skills will be crucial for the public to navigate this landscape.

A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.


Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.