Public Backs Independent Fact-Checking as Meta Abandons Verification Efforts

Meta’s recent decision to end partnerships with independent fact-checking organizations for Facebook and Instagram has sparked widespread concern among communication experts and the American public. The move comes at a time when misinformation continues to threaten democratic discourse and public trust in institutions.

Meta CEO Mark Zuckerberg justified the decision by claiming the company’s fact-checking program “too often became a tool to censor.” However, new polling data from Boston University’s College of Communication reveals this stance contradicts public opinion, with 72 percent of Americans supporting social media platforms’ removal of inaccurate health information. This support transcends political divides, with 85 percent of Democrats, 70 percent of independents, and 61 percent of Republicans agreeing such content moderation is necessary.

Instead of working with professional fact-checkers, Meta is pivoting to a “community notes” approach where users themselves write and rate annotations on questionable content. This model mirrors the strategy implemented by Elon Musk on X (formerly Twitter). Yet the BU poll indicates Americans remain skeptical of this approach, with nearly two-thirds (63 percent) preferring verification from independent fact-checking organizations and less than half (48 percent) supporting the community notes model.

“Offloading content-moderation responsibilities onto users is yet another example of platforms shirking their duty to ensure the safety of their digital products,” notes Dr. Michelle Amazeen, Boston University associate professor of mass communication. “By abandoning content moderation, social media platforms risk enabling disinformation from those in power.”

The efficacy of crowdsourced verification remains questionable. While some studies suggest crowdsourcing can rival expert verification in certain contexts, other research highlights significant inconsistencies. Crowdsourcing tends to work well for assessing source credibility but struggles with reliably identifying disinformation. Partisanship often undermines its effectiveness, influencing which claims get verified, and distinguishing verifiable from unverifiable claims typically requires specialized training.

The real-world evidence appears to validate these concerns. Despite X’s community notes program, the platform continues to struggle with misinformation about elections, climate change, and other crucial topics, according to multiple reports from news organizations and researchers.

Meta’s departure from fact-checking carries significant implications beyond its own platforms. Meta has been the largest global funder of fact-checkers, so its decision threatens the financial sustainability of many independent verification organizations. The loss of resources could severely impair their ability to combat misinformation at a critical time for democratic discourse.

However, Amazeen suggests a potential silver lining: freed from Meta’s influence, fact-checkers might refocus their efforts on democratically important claims rather than viral but politically insignificant content. Under Meta’s program, the platform determined which claims were submitted for review, often prioritizing nonpolitical viral content while leaving politically charged claims unaddressed.

The BU poll offers another glimmer of hope: one-third of American adults expressed willingness to donate $1 to fund independent fact-checking through crowdfunding campaigns. Such grassroots support could help replace some of the financial resources these organizations are losing.

The debate over content moderation comes at a pivotal moment, with a new White House administration that critics say has previously used disinformation as a political tool. As political polarization deepens and social media’s role in shaping public discourse grows, the question of who should moderate online content—and how—has never been more urgent.

“The integrity of public discourse hangs in the balance,” Amazeen concludes. “Social media platforms must rise to the occasion, for their role in shaping the national conversation has never been more consequential.”


22 Comments

  1. This is a complex issue without any easy answers. While I understand Meta’s desire to reduce ‘censorship’, the public’s clear preference for content moderation is hard to ignore.

    • Elizabeth Lopez on

      I’ll be watching closely to see how the ‘community notes’ approach works and whether it can be as effective as professional fact-checking.

  2. Isabella Thomas on

    I’m a bit skeptical of Meta’s justification for ending partnerships with independent fact-checkers. Removing inaccurate health information seems like a reasonable and important form of moderation.

    • Robert Johnson on

      It will be important to monitor whether Meta’s new approach leads to the spread of more misinformation, even if it reduces their ‘censorship’ of content.

  3. The decision by Meta to end partnerships with independent fact-checkers is concerning, as it could lead to the further spread of misinformation on their platforms. I hope they reconsider this move.

    • Isabella Miller on

      The public’s strong support for content moderation to remove inaccurate health information is a positive sign, and I hope platforms take this feedback seriously.

  4. I’m not surprised to see the public’s support for content moderation, especially when it comes to misinformation about health and other important topics. Social media platforms have a responsibility to their users.

    • It will be interesting to see if Meta’s decision to rely more on user-generated annotations can maintain the same level of accuracy and credibility as their previous fact-checking partnerships.

  5. Elijah Y. Miller on

    This is a complex issue without any easy solutions. I appreciate the public’s nuanced perspective and hope that social media platforms can find a way to address misinformation while also upholding free speech.

    • Amelia X. Smith on

      It will be interesting to see how the ‘community notes’ model compares to professional fact-checking in terms of accuracy and effectiveness in the long run.

  6. I appreciate the public’s support for fact-checking and content moderation, even if the platforms don’t always see eye-to-eye with their users on the best approach.

  7. It’s encouraging to see that the public’s views on content moderation transcend political divides. Removing inaccurate health information should be a priority for social media platforms.

    • I hope Meta and other platforms can find a balanced approach that respects free speech while also addressing the public’s legitimate concerns about misinformation.

  8. Olivia Hernandez on

    The divide between Meta’s stance and the public’s preferences on content moderation is quite stark. I hope they reconsider their decision and find a way to work with independent fact-checkers.

    • William B. Taylor on

      At the end of the day, the platforms need to put the needs of their users first. Removing misinformation should be a priority, even if it means more ‘censorship’.

  9. Patricia Davis on

    This is an interesting development in the ongoing debate around content moderation on social media. It’s good to see that the public broadly supports the removal of misinformation, even if platforms like Meta are taking a different approach.

    • I’m curious to see how the ‘community notes’ model will work in practice and whether it can be as effective as professional fact-checking.

  10. This is a challenging issue with valid arguments on both sides. While I understand Meta’s desire for less ‘censorship’, the public’s strong support for content moderation to limit misinformation is hard to ignore.

    • I’ll be curious to see how the ‘community notes’ approach works in practice and whether it can be as effective as professional fact-checking in curbing the spread of harmful misinformation.

  11. Elizabeth Jackson on

    This speaks to the broader challenge of balancing free speech with the need to limit the spread of harmful misinformation on social media. It’s a delicate balance that platforms are still trying to figure out.

    • I’m curious to see how the public’s expectations around content moderation evolve as these debates continue.


A professional organization dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2025 Disinformation Commission LLC. All rights reserved.