Meta’s Fact-Checking Pivot Raises Concerns Among Misinformation Experts
Meta’s announcement this week that it will discontinue its fact-checking program has sparked significant concern within the scientific community. The program, established in 2016, paid independent groups to verify selected content on Facebook and has been a cornerstone of the platform’s misinformation strategy.
Joel Kaplan, Meta’s chief global-affairs officer, justified the decision by citing concerns over bias and censorship. “Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact-check and how,” Kaplan wrote on January 7.
The move comes at a time when misinformation researchers have accumulated substantial evidence supporting fact-checking’s effectiveness. Sander van der Linden, a social psychologist at the University of Cambridge who previously advised Facebook’s fact-checking program, emphasized that “fact-checking does work” in making information more trustworthy.
“Studies provide very consistent evidence that fact-checking does at least partially reduce misperceptions about false claims,” van der Linden noted. A 2019 meta-analysis examining over 20,000 participants found fact-checking had a “significantly positive overall influence on political beliefs.”
However, experts acknowledge limitations. Jay Van Bavel, a psychologist at New York University, points out that fact-checking becomes less effective with polarized issues like elections or Brexit. “People who are partisans don’t want to believe things that make their party look bad,” he explained.
Despite these challenges, fact-checking still serves crucial functions beyond changing individual minds. Alexios Mantzarlis, who directs the Security, Trust, and Safety Initiative at Cornell Tech, highlighted how Facebook’s current system flags false content with warnings and reduces its algorithmic distribution.
Kate Starbird, a computer scientist at the University of Washington, noted that measuring direct effects on user beliefs differs from assessing “the broader effects of having those fact-checks in the information ecosystem.”
Regarding Meta’s claims of political bias in fact-checking, researchers offer a straightforward explanation: conservative content tends to be flagged more often because it contains more misinformation. “When one party, at least in the United States, is spreading most of the misinformation, it’s going to look like fact-checks are biased because they’re getting called out way more,” Van Bavel explained.
This assessment is supported by recent research. A study published in Nature last year revealed that politically conservative users on X (formerly Twitter) were more likely to share information from news sites deemed low-quality by a representative group of laypeople.
“If you wanted to know whether a person is exposed to misinformation online, knowing if they’re politically conservative is your best predictor of that,” said Gordon Pennycook, a Cornell University psychologist who contributed to the analysis.
As a replacement for third-party fact-checking, Meta CEO Mark Zuckerberg has proposed adopting a system similar to X’s “community notes,” where corrections and context come from users rather than professional fact-checkers.
While research indicates crowdsourced correction systems can work, their effectiveness depends heavily on implementation. Van der Linden cited an analysis showing that community notes on X often appeared too late to prevent false claims from spreading widely.
“Crowdsourcing is a useful solution, but in practice it very much depends on how it’s implemented,” he cautioned. “Replacing fact-checking with community notes just seems like it would make things a lot worse.”
The timing of Meta’s decision is particularly concerning for media literacy advocates, as 2025 begins with multiple high-stakes elections worldwide and continued polarization around scientific topics like climate change and public health. The platform’s pivot away from professional verification represents a significant shift in how one of the world’s largest social networks approaches truth and accuracy in the digital information ecosystem.