New Study Reveals Gap Between Beliefs and Actions on Social Media Misinformation

A groundbreaking study published in Scientific Reports has uncovered significant disparities between what social media users believe should be done about misinformation and what they actually do when encountering it online.

Researchers from Carnegie Mellon University surveyed over 1,000 American social media users, examining how they respond to false or misleading content in their feeds and how relationships with those posting misinformation influence their actions.

The study, led by researchers King, Phillips, and Carley, shifts focus away from platform-level interventions like fact-checking and content moderation to examine individual user behavior—a critical but often overlooked factor in misinformation’s spread.

“While platforms continue developing technological solutions, understanding how everyday users respond to misinformation they encounter provides valuable insights for developing more effective countermeasures,” explained the research team.

One of the most striking findings revealed a substantial gap between ideals and reality. While respondents overwhelmingly believed that people should take high-effort actions like correcting misinformation through comments or direct messages, they admitted to taking far less action themselves when encountering problematic content.

The researchers categorized responses into three levels: “high-effort” actions (commenting to correct misinformation or messaging the poster directly), “low-effort” actions (reporting posts or blocking users), and “no effort” (simply ignoring the misinformation).

This belief-behavior gap suggests many users recognize the importance of combating misinformation but don’t follow through, possibly assuming others will handle the problem. The researchers note that addressing situational constraints like time limitations or lack of confidence could encourage more active countering behavior.

Social proximity also plays a significant role in how people respond to misinformation. Survey participants reported being more likely to intervene when false content came from friends or family rather than acquaintances or strangers. This finding suggests people either feel more comfortable confronting those they know well or feel a stronger responsibility to correct close contacts.

In a political landscape often characterized by polarization, the study revealed rare bipartisan consensus. Over 70% of Americans across the political spectrum—from strong Republicans to strong Democrats—believe individuals should counter misinformation shared by close contacts and correct themselves if they post inaccurate information.

This broad agreement presents a unique opportunity for social media companies to implement user-led interventions that could receive wider acceptance than more controversial measures like content removal or account restrictions.

Based on these findings, the researchers proposed several recommendations for platforms and policymakers:

First, social media platforms should develop features encouraging users to take accountability for accurate information, especially when close contacts share questionable content. Since users are more likely to counter misinformation from people they know, these relationships offer strategic intervention points.

Second, improving reporting functionality with greater transparency could increase user participation. Platforms should clarify how reports are managed and inform users about outcomes when they flag problematic content.

Third, digital media literacy programs should expand beyond teaching people to identify misinformation to include training on constructive countering methods. Educational efforts should frame responding to misinformation as a shared social responsibility.

“As misinformation tactics continue evolving, our countermeasures must adapt accordingly,” noted the researchers. “Our study demonstrates that while people generally believe they should counter misinformation, they often hesitate unless they’re close to the person posting it.”

The research comes at a critical time when social media companies face mounting pressure to address misinformation while balancing free speech concerns. By leveraging existing user beliefs about the importance of accuracy and focusing on improving user-led interventions, platforms may find more effective and widely acceptable approaches to combat the spread of false information.

13 Comments

  1. Bridging the gap between ideals and reality when it comes to combating misinformation is a huge challenge. This research highlights just how difficult it can be for even conscientious users to translate their beliefs into effective actions. Curious to learn more about the key barriers and influencing factors.

  2. Examining individual user responses to misinformation is a smart move. Platforms can only do so much – the human dynamics and psychology at play are just as critical. Curious to see how these insights can inform more holistic and effective strategies going forward.

  3. Elizabeth O. Lee

    Insightful study – the focus on user behavior rather than just platform-level solutions is a valuable perspective. Misinformation thrives on complex human factors, so understanding the psychology and social dynamics at play is essential. Looking forward to seeing the practical applications.

  4. The finding about the disconnect between beliefs and actions on misinformation is really striking. It speaks to the complex human factors at play, and the difficulty of translating ideals into real-world behaviors. This research seems like an important step toward more holistic, effective strategies.

  5. Patricia Williams

    This study underscores the importance of a multifaceted approach to tackling misinformation. Technological solutions alone are not enough – we need to deeply understand user behavior and motivations as well. Thoughtful, evidence-based strategies are crucial going forward.

  6. The finding about the gap between beliefs and actions is really thought-provoking. It suggests that even people who recognize the problem of misinformation can struggle to act on it in practice. Understanding those barriers is crucial. This is an important piece of the puzzle.

  7. Examining individual user responses, not just platform-level interventions, is a smart approach. Misinformation’s spread is fueled by complex human dynamics, so understanding the psychology and social factors at play is vital. Keen to see how these insights can be applied.

  8. The finding about the gap between ideals and reality when it comes to combating misinformation is really striking. It highlights how challenging this issue is, even for users who recognize the problem. Curious to learn more about the specific factors that influence people’s actions (or inaction).

    • Agreed, bridging that gap is key. User behavior is such a crucial but often overlooked piece of the misinformation puzzle. Looking forward to seeing how this research can inform more impactful solutions.

  9. Great to see a study that shifts the focus to individual user responses, not just platform policies. Misinformation’s spread is driven by human psychology and social dynamics, so this kind of granular understanding is crucial. Eager to see how these insights can inform more impactful solutions.

  10. Isabella Martinez

    Interesting to see the focus shift to individual user responses. So much attention has been on platform policies, but the human element is just as critical. Looking forward to the insights this research can provide on designing more effective misinformation countermeasures.

    • Absolutely. Platform-level interventions are important, but they’ll always be playing catch-up unless we can better understand and shape user behavior. This study seems like an important step in that direction.

  11. Fascinating study on individual user behavior around social media misinformation. Understanding how people actually respond, not just what they believe they should do, is so important for developing effective countermeasures. Curious to see more insights from this research.

A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2025 Disinformation Commission LLC. All rights reserved.