New Study Finds Media Literacy Training Can Encourage Active Response to Misinformation
A groundbreaking study by researchers at Carnegie Mellon University reveals that media literacy training can significantly increase people’s willingness to actively counter false information they encounter online, rather than simply scrolling past it.
The research, published in the proceedings of the International Conference on Social Computing, Behavioral-Cultural Modeling and Prediction, takes a novel approach to combating misinformation. While most media literacy efforts focus on teaching people to distinguish between true and false information, this study examined whether training could motivate users to take action when they spot misleading content.
“Many of us have seen false or misleading content on our feeds, but quickly scroll past it and move on,” explained lead researcher Christopher King, who co-authored the study with Kathleen M. Carley. “We wanted to see if we could change that behavior.”
The experiment involved government analysts participating in a social cybersecurity training program called OMEN. Researchers measured participants’ willingness to respond to explicitly labeled false social media posts both before and after completing an interactive training session on countering misinformation.
Results showed a marked increase in participants’ readiness to take “higher-effort actions” after training, including commenting publicly with corrective information or messaging posters privately. This shift was most notable among those who previously only took minimal steps like reporting posts to platforms.
The social context of misinformation emerged as a crucial factor in determining responses. Participants overwhelmingly reported being more likely to correct people they knew personally than strangers, citing that friends and family members felt “more worth the effort.” Conversely, they were less inclined to engage with posts expressing extremely false beliefs, such as flat Earth theories, feeling such interventions would be futile or potentially inflammatory.
“These findings reinforce that misinformation is not purely a content problem but also a social one,” noted Dr. Carley, director of the Center for Informed Democracy and Social Cybersecurity at Carnegie Mellon. “The relationships between users significantly influence intervention decisions.”
Platform design also proved influential in shaping user behavior. Some participants expressed greater comfort in correcting misinformation on platforms offering anonymity, such as Reddit, which reduced concerns about personal conflict. Others indicated they would be more likely to report content on platforms with user-friendly reporting features and a reputation for taking user reports seriously.
The study’s implications extend to both social media companies and policymakers. For platforms, the findings suggest investment in more accessible correction and reporting functionality could empower users to play a more active role in combating misinformation. For educational initiatives, the results indicate that media literacy programs might be more effective if they include training on response strategies rather than focusing exclusively on detection skills.
Industry analysts note this research comes at a critical time, as major platforms like Meta, Twitter (now X), and YouTube face mounting pressure to address the spread of false information without imposing excessive content restrictions.
“Platforms are caught between demands for more aggressive moderation and concerns about censorship,” said Renee DiResta, research manager at the Stanford Internet Observatory, who was not involved in the study. “Empowering users to participate in the solution offers a potential middle path.”
The researchers emphasize that user intervention should complement, not replace, existing moderation and fact-checking systems. They highlight several benefits to this community-based approach, including strengthened community bonds through open dialogue, support for formal moderation efforts, increased awareness of misinformation, and the introduction of diverse perspectives to online discussions.
As social media continues to serve as a primary information source for millions of people worldwide, this research suggests that equipping everyday users with both the skills and motivation to respond to misinformation could significantly impact its spread and influence online.