
In the battle against online falsehoods, fact-checking has long been viewed as the primary defense. Yet mounting evidence suggests this approach often falls short: people continue to share and believe misinformation even after being presented with corrective facts.

Recent research indicates that journalists who debunk claims can actually lose reader trust compared with those who confirm them. More worryingly, fact-checking can inadvertently amplify false information by repeating it to audiences who might not otherwise have encountered it.

Media scholar Alice Marwick’s research provides a framework for understanding why fact-checking alone frequently fails. Her analysis reveals that misinformation thrives on three interconnected pillars: the content itself, the personal context of those who share it, and the technological infrastructure that spreads it.

On the content level, people find it cognitively easier to accept information than to reject it. Misinformation becomes particularly potent when it leverages what sociologist Arlie Hochschild calls “deep stories” – emotionally resonant narratives that align with existing beliefs. For example, disinformation about migration often exploits familiar tropes like “the dangerous outsider” or “the overwhelmed state,” simplifying complex social issues into emotionally charged narratives.

The personal context is equally crucial. When fabricated claims align with someone’s existing values and beliefs, they can quickly solidify into perceived knowledge, making them resistant to factual corrections. Marwick’s research documented how during the 2016 U.S. presidential election, individuals continued sharing false stories even after being shown they were untrue.

“I don’t care if it’s false, I care that I hate Hillary Clinton, and I want everyone to know that!” one subject told a family member who attempted to correct misinformation. This candid statement reveals how sharing false content often serves as an identity-signaling mechanism rather than an information-sharing one.

Researchers describe this as “identity-based motivation,” where the value of sharing lies not in accuracy but in reinforcing group identity and cohesion. This dynamic is becoming even more concerning with the proliferation of AI-generated images, as studies show people willingly share visuals they know are fake if they believe they contain an “emotional truth.”

The third pillar – technical infrastructure – dramatically amplifies these problems. Social media platforms generate revenue by capturing and selling users’ attention to advertisers. Their recommendation algorithms are explicitly designed to maximize engagement, and research consistently shows that emotionally charged content – especially that which evokes anger, fear or outrage – generates significantly more engagement than neutral or positive information.
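Platform ranking systems are proprietary, but the incentive the article describes is straightforward to sketch. The toy example below is purely illustrative – every field name, weight, and “predicted” signal is invented, not any platform’s actual API – and shows how a feed ranked solely on predicted engagement systematically favors share-provoking content, with accuracy playing no part in the objective:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float     # hypothetical model outputs, invented for illustration
    predicted_comments: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    """Toy objective: weight the reactions that keep users most active.
    Real platform rankers are far more complex; these weights are invented."""
    return (1.0 * post.predicted_likes
            + 3.0 * post.predicted_comments   # arguments and outrage drive comments
            + 5.0 * post.predicted_shares)    # shares compound reach exponentially

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by predicted engagement -- truthfulness never enters the objective.
    return sorted(posts, key=engagement_score, reverse=True)
```

Nothing in this objective penalizes falsehood; if outrage-bait reliably earns more comments and shares, it reliably ranks higher.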

The sharing functionality of messaging and social media platforms enables exponential spread. A 2020 BBC report illustrated the scale: a message sent to a WhatsApp group of 20 people could ultimately reach more than 3 million people if each recipient forwarded it to 20 others and that pattern repeated for five rounds of sharing in total.
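That figure is simple geometric growth: five rounds of sharing at a branching factor of 20 reach 20⁵ = 3,200,000 people in the final round. A minimal sketch of the cascade (assuming, as one reading of the BBC figure, that the initial group send counts as the first of the five rounds):

```python
# Exponential spread of a forwarded message, as in the 2020 BBC example.
# Assumption: "repeated five times" means five total sharing rounds,
# each multiplying the audience by the group size of 20.

GROUP_SIZE = 20
ROUNDS = 5

audience = 1  # the original sender
for round_number in range(1, ROUNDS + 1):
    audience *= GROUP_SIZE
    print(f"Round {round_number}: message reaches {audience:,} people")

# Round 1: message reaches 20 people
# Round 2: message reaches 400 people
# Round 3: message reaches 8,000 people
# Round 4: message reaches 160,000 people
# Round 5: message reaches 3,200,000 people
```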

“By prioritizing content likely to be shared and making sharing effortless, every like, comment or forward feeds the system,” explains Kelly Fincham, Programme Director of BA Global Media at the University of Galway. “The platforms themselves act as a multiplier, enabling misinformation to spread faster, farther and more persistently than it could offline.”

This three-pillar framework explains why fact-checking alone is insufficient: it addresses only the content pillar while ignoring the powerful emotional and structural factors that drive misinformation’s spread.

A more comprehensive approach would require long-term structural changes to platform incentives and accountability systems, alongside shifts in social norms and greater awareness of our own motivations for sharing information.

“If we continue to treat misinformation as a simple contest between truth and lies, we will keep losing,” Fincham warns. “Disinformation thrives not just on falsehoods, but on the social and structural conditions that make them meaningful to share.”

As AI-generated content becomes increasingly sophisticated and harder to detect, addressing these deeper drivers of misinformation becomes even more urgent for maintaining healthy information ecosystems and democratic discourse.


10 Comments

  1. The mining and energy sectors are rife with misinformation, whether it’s around the environmental impacts of extractive industries or the future of fossil fuels. This article highlights the challenges in correcting these false narratives. Curious to hear how companies in these spaces are adapting their communications strategies.

    • You’re right, the energy and mining sectors face unique challenges when it comes to combating misinformation. Transparency, engaging directly with stakeholders, and leveraging trusted third-party validators could be some effective approaches for these industries.

  2. Really thought-provoking article. The challenge of correcting misinformation is particularly acute for complex, technical topics like mining, energy, and commodity markets. I wonder if innovative educational initiatives, media literacy programs, and collaborative fact-checking efforts could help address this issue.

  3. This is a concerning trend – the idea that debunking misinformation can actually backfire and further entrench false beliefs is troubling. I wonder what strategies media and tech companies could implement to more effectively counter the spread of online falsehoods.

    • Jennifer O. Lee

      You raise a good point. Platforms and publishers need to rethink their approaches beyond just fact-checking. Promoting quality information, limiting the reach of proven misinformation, and empowering users to be more discerning consumers could all play a role.

  4. As an investor in mining and commodities, I’m quite concerned about the spread of misinformation in these sectors. Factual, science-based information is crucial for making informed decisions, yet emotionally charged narratives often drown it out. I wonder what role the investment community could play in pushing for more rigorous, transparent communication.

  5. Fascinating insights on why factual corrections often fail to change minds. The power of ‘deep stories’ to override rational arguments is troubling, especially for topics like climate change and the energy transition. Curious what other psychological factors might be at play here.

    • Michael Garcia

      Absolutely, the psychological underpinnings of misinformation are complex. Things like confirmation bias, tribal loyalties, and the human tendency to believe the first information we encounter likely all play a role. Multifaceted solutions will be needed to build societal resilience.

  6. Oliver E. Davis

    Fascinating article on the challenges of combating misinformation. The cognitive biases and emotional resonance of ‘deep stories’ make it an uphill battle for facts and fact-checking. Curious to hear perspectives on how we can build more resilience to misinformation in the digital age.

    • Agreed, this is a complex issue without easy solutions. Strengthening critical thinking skills and digital literacy could be part of the answer, but tackling the underlying social and psychological factors driving misinformation is crucial.
