In a digital era where misinformation spreads with unprecedented speed, a novel approach to combating fake news is taking root: inoculation through gaming. During a recent experiment, I found myself destroying the fictional town of Harmony Square as the “Chief Disinformation Officer,” using a fake news site called Megaphone to divide citizens and create chaos.

This 10-minute exercise in villainy is actually “Breaking Harmony Square,” a game developed through collaboration between the U.S. Departments of State and Homeland Security, University of Cambridge psychologists, and DROG, a Dutch anti-misinformation initiative. The game’s premise builds on psychological inoculation theory – just as vaccines prepare immune systems to fight disease, playing these games helps people identify manipulation tactics in real life.

Research shows participants who play such games develop improved abilities to spot manipulation techniques, greater confidence in their judgments, and reduced likelihood of sharing misinformation. Even with prior knowledge of some tactics, I noticed heightened awareness of manipulative content when I next visited Facebook and Twitter.

This gamified approach addresses what WHO Director-General Tedros Adhanom Ghebreyesus called an “infodemic” alongside the COVID-19 pandemic. “Fake news spreads faster and more easily than this virus and is just as dangerous,” he warned in February 2020. Studies have since confirmed that social media news consumers were more susceptible to false claims about COVID-19 and vaccines, while similar manipulation fueled the January 6 Capitol insurrection.

The digital landscape has become a battlefield of influence. Researchers Samantha Bradshaw and Philip Howard from the Oxford Internet Institute documented in 2019 how “computational propaganda” has become “pervasive and ubiquitous,” with government agencies and political parties in 70 countries employing tactics like bot networks and disinformation amplification.

“The idea is to empower people to make their own decisions – better decisions – by giving them simple tools or heuristics,” explains Anastasia Kozyreva, a research scientist at the Max Planck Institute for Human Development in Berlin, who led an analysis published in Psychological Science in the Public Interest.

Society has historically adapted to disruptive technologies – from bicycles to radio to telephones – but the digital revolution’s pace is unprecedented. “I don’t think that we have ever seen such a drastic change as in the case of the internet,” notes Kozyreva. More than half the global population is now online, with decisions increasingly influenced by digital information.

What makes this environment particularly problematic is the hidden manipulation, explains Filippo Menczer, professor of informatics at Indiana University. “It is completely impossible even for researchers – let alone any single user – to understand how and why they are being exposed to one particular piece of information and not another,” he says. These systems evolve “at rates that are orders of magnitude faster than our brain could possibly adjust.”

Menczer’s Observatory on Social Media has developed tools like Hoaxy and Botometer to visualize and measure the spread of misinformation on social media. Their research, along with an MIT study published in Science, revealed that false information spreads significantly faster than truth, especially in politics. The Pew Research Center found that suspected bot accounts share nearly two-thirds of tweeted links to popular websites.
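Botometer’s actual model is a supervised classifier trained on many hundreds of account features, and its scoring logic is not reproduced here. Purely as an illustration of the general idea, a toy heuristic might flag a few of the signals bot-detection tools commonly weigh (the feature names and thresholds below are hypothetical, not Botometer’s):

```python
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float       # average posting rate
    followers: int
    following: int
    default_profile_image: bool # account never set a custom avatar

def bot_score(acct: Account) -> float:
    """Crude 0..1 'bot-likeness' score from a few surface features.

    Illustrative only: real detectors use trained classifiers,
    not hand-picked thresholds like these.
    """
    score = 0.0
    # Extremely high posting rates are a classic automation signal.
    if acct.tweets_per_day > 50:
        score += 0.4
    # Following far more accounts than follow back suggests mass-following.
    if acct.following > 10 * max(acct.followers, 1):
        score += 0.3
    # A default profile image often marks a throwaway account.
    if acct.default_profile_image:
        score += 0.3
    return min(score, 1.0)
```

A typical human account (a few posts a day, balanced follower counts, custom avatar) scores near zero, while an account tripping all three signals scores 1.0. The point is not the thresholds but the approach: combining many weak behavioral signals into a single estimate.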

Beyond inoculation through gaming, experts recommend practical steps like turning off notifications, removing distracting apps, tightening privacy settings, and avoiding platforms where misinformation thrives. Michael Caulfield, director of blended learning at Washington State University Vancouver, teaches students to evaluate sources in “four moves,” providing them with context that helps them make better trust judgments.

However, University of Michigan professor Irene Pasquetto questions whether everyone who consumes misinformation actually wants to be better informed. “Kozyreva and colleagues seem to assume that people who are disinformed are aware of it and want not to be disinformed going forward,” she says. “I don’t think that’s necessarily the case.”

No one believes these interventions alone can fully protect against internet manipulation. Kozyreva acknowledges these tools are not “a silver bullet,” while Caulfield cautions against using education as an excuse to avoid policy solutions: “Just because you have driver’s education doesn’t mean that you abolish driving laws.”

Despite these challenges, Pasquetto remains optimistic about society’s capacity to adapt. “I see civil society fighting really hard to regulate the internet and rethink how it is designed,” she observes, noting extensive efforts in media literacy education, including Bad News, a game similar to Breaking Harmony Square but designed for children.

Ultimately, surrendering to cynicism may be more dangerous than any individual falsehood. As Caulfield warns, “If you remove truth from the equation, you start to live in a world where authoritarianism is really the only path forward. If you lose truth, you end up with power.”


