Social media users who encounter false information may be doing more harm than good by engaging with it, according to new research that challenges conventional wisdom about combating misinformation online.

When faced with fake news or misleading content on platforms like Twitter or Facebook, many users feel compelled to correct the record by commenting, sharing the post alongside a correction, or otherwise engaging with it. However, these well-intentioned reactions may inadvertently amplify the very content they aim to debunk.

Recent studies suggest that any form of interaction with false information—including clicking, commenting, or reacting with angry emojis—increases its visibility within social media algorithms. Even quote-tweeting or responding to correct misinformation signals to platforms that the content is generating engagement, causing it to appear in more users’ feeds.

“The simple fact is that engaging with false information increases the likelihood that other people will see it,” the researchers explain. “If people comment on it, or quote tweet—even to disagree—it means that the material will be shared to our own networks of social media friends and followers.”
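As a rough illustration of why this happens, the toy score below mimics an engagement-weighted feed ranking. The weights, field names, and formula here are hypothetical and are not taken from any real platform; the point is simply that comments, quote-shares, and angry reactions all feed the same number that determines reach, regardless of whether the interaction was supportive or corrective.

```python
from dataclasses import dataclass


@dataclass
class Post:
    likes: int
    comments: int       # includes corrective replies
    quote_shares: int   # includes quote-tweets written to debunk the post
    angry_reacts: int


def engagement_score(post: Post) -> float:
    """Hypothetical engagement-weighted ranking score.

    Every interaction type contributes positively, so even a reply that
    disagrees with the post raises the score that decides how widely it
    is shown. The weights are illustrative assumptions only.
    """
    return (1.0 * post.likes
            + 4.0 * post.comments      # assumed weight: comments count heavily
            + 6.0 * post.quote_shares  # assumed weight: shares reach new feeds
            + 2.0 * post.angry_reacts)


# A false post that attracts many corrective comments and quote-shares
# still outranks one that was quietly scrolled past.
debunked = Post(likes=10, comments=80, quote_shares=40, angry_reacts=50)
ignored = Post(likes=10, comments=2, quote_shares=1, angry_reacts=3)
print(engagement_score(debunked), ">", engagement_score(ignored))
```

Under this toy model, the heavily "corrected" post scores far higher than the ignored one, which is the dynamic the researchers describe: correction is itself engagement.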

The consequences of widespread misinformation are far-reaching and increasingly dangerous. A UK parliamentary committee has described online misinformation as a threat to “the very fabric of our democracy.” False information has been linked to real-world violence in countries like Myanmar and the United States, and has been weaponized to influence political processes in at least 48 countries worldwide.

Health misinformation presents particularly urgent concerns. With COVID-19 vaccine rollouts continuing globally, the spread of vaccine-related falsehoods may discourage people from getting immunized—a literally life-or-death consequence of online misinformation.

In a series of experiments involving 2,634 participants, researchers examined why people share false material online. The findings revealed two troubling patterns: some participants admitted to deliberately sharing political information they knew was untrue, and people were more likely to share content they believed they had encountered previously.

This repeated exposure effect represents a fundamental challenge in fighting misinformation. Multiple studies have established that repeated exposure to information—even false information—increases the likelihood that people will perceive it as true. This psychological phenomenon, sometimes called the “illusory truth effect,” has been a cornerstone of propaganda techniques throughout history.

“A common maxim of propaganda is that if you repeat a lie often enough, it becomes the truth,” the research notes. A 2018 study found that when social media users repeatedly saw false headlines, they rated them as more accurate—even when the headlines were flagged as disputed by fact-checkers.

Even more concerning, other research suggests that repeated exposure to false information may desensitize users to the ethical implications of spreading it. People who frequently encounter misinformation may come to view sharing it as less problematic, even when they personally don’t believe it.

These findings suggest a counterintuitive approach to combating online falsehoods: rather than engaging with misinformation to correct it, the most effective strategy may be to ignore it completely. By not interacting with false content, users deny it the algorithmic boost that comes with engagement.

The research implies that social media platforms should consider more aggressive approaches to handling misinformation. Rather than simply attaching warning labels—a common practice on platforms like Twitter—companies might be more effective if they removed false information entirely.

For individual users navigating an increasingly complex information landscape, the best response to encountering fake news might be the simplest one: scrolling past without engagement. In the fight against misinformation, sometimes silence speaks louder than corrections.
