In the digital age, influence is increasingly defined by reach and spectacle rather than traditional persuasion methods. This transformation of propaganda is reshaping how global conflicts are presented and perceived.
Beginning in March 2026, an unusual series of videos swept across social media platforms, depicting American President Donald Trump and Israeli Prime Minister Benjamin Netanyahu as LEGO-style characters in disturbing wartime scenarios. Set against AI-generated music, these videos blended dark humor with graphic imagery—showing bombed schools, toy soldiers wading through blood, and miniature flag-draped coffins.
The content spread rapidly through multiple channels, including Iranian state television and accounts claiming to be independent creators. While their exact origin remained ambiguous, their impact was undeniable—reaching millions and generating significant engagement across platforms.
This ambiguity is a hallmark of modern propaganda. Groups like the self-described “Explosive News Team” maintain claims of independence despite their content closely aligning with state narratives. Meanwhile, official government accounts increasingly adopt similar visual styles and distribution strategies.
The convergence creates accountability challenges for platforms, which must determine whether to classify such content as coordinated manipulation, state propaganda, or simply viral creative expression when considering removal actions.
What distinguishes this new wave of information warfare is its delivery format. War is no longer communicated primarily through speeches or traditional reporting, but packaged as entertainment—memes, animations, and short videos that borrow heavily from popular culture references.
This approach has been adopted globally. The White House has experimented with similar tactics, sharing content that merges actual military footage with video game aesthetics, referencing titles like Call of Duty and mimicking gaming interfaces. In India, certain news outlets dubbed “Godi Media” have presented conflict updates in formats resembling sports coverage, complete with casualty “scoreboards.”
The underlying strategy remains consistent across countries: make war content visually engaging, familiar, and shareable to maximize its reach.
This transformation follows the fundamental logic of the attention economy, where content value is determined by engagement metrics rather than accuracy or authority. The most successful propaganda combines familiar formats with shocking content—a LEGO animation of a bombing is more likely to be shared and discussed than conventional war reporting.
Importantly, users don’t need to agree with content to amplify it—they simply need to find it compelling enough to engage with.
This evolution has developed gradually over the past decade. As early as 2015, ISIS was producing stylized recruitment videos borrowing from gaming and cinema aesthetics to appeal to younger audiences. State actors soon followed, with China’s Xinhua News Agency releasing LEGO-style animations criticizing America’s COVID-19 response in 2020, and Russia adopting similar visual strategies in its information campaigns across Eastern Europe.
In India, the BJP IT cell pioneered these approaches, with mainstream media outlets quickly adopting similar techniques. The common thread across these examples is the recognition that in the digital age, influence depends more on content distribution than message substance.
Humor and satire have emerged as particularly effective tools in this landscape. They’re engaging, easily digestible, and challenging to counter effectively—factual rebuttals to memes often appear stodgy and mismatched in tone.
Recent metrics illustrate the dramatic imbalance in reach: social media videos related to ongoing conflicts have generated billions of impressions, far exceeding traditional news coverage. For many users, war is encountered first as viral content, with verified information arriving later, if at all.
Generative AI has accelerated this shift by enabling rapid production of high-quality content at minimal cost. This allows both state and non-state actors to flood platforms with competing narratives, creating a complex environment where attribution becomes increasingly difficult.
For platforms, this raises fundamental questions about the boundaries between creativity and manipulation, and who should make these distinctions.
Modern propaganda’s effectiveness cannot be measured solely by its persuasive impact. Its influence is more subtle and pervasive, shaping the environment in which people interpret events. Viral content influences what feels important, what appears credible, and what emotions become associated with conflicts.
As the French sociologist Jacques Ellul observed, propaganda evolves alongside the systems that carry it. In today's algorithm-driven ecosystem, it increasingly takes forms designed for maximum distribution, traveling fast, far, and wide.
The implications are profound. When memes and viral clips become primary vehicles for complex geopolitical information, the line between information and entertainment blurs dangerously. The question is no longer just what people believe, but what captures their attention most effectively.
In this landscape of viral reality, the most powerful message isn’t necessarily the most accurate—it’s the one that travels furthest.