Meta Abandons Fact-Checking Program in Favor of “Community Notes”

Meta has announced it will discontinue its fact-checking program, starting in the United States. The program was originally designed to combat the spread of misinformation across Meta’s platforms (Facebook, Instagram, and Threads), which together reach more than 3 billion users worldwide.

In a video posted to Instagram, CEO Mark Zuckerberg stated that fact-checking had resulted in “too many mistakes” and “too much censorship.” He emphasized that it was time for Meta “to get back to our roots around free expression,” particularly in the aftermath of the recent U.S. presidential election, which he described as a “cultural tipping point, towards once again prioritizing speech.”

The tech giant will replace its professional fact-checking system with a “community notes” model, similar to the approach used by X (formerly Twitter). This system relies on users themselves to add context or caveats to potentially misleading posts. The European Union is currently investigating this model’s effectiveness on X.

Meta established its independent, third-party fact-checking program in 2016, amid growing concerns about information integrity following Donald Trump’s election as U.S. president. The program funded fact-checking organizations—including Reuters Fact Check, Australian Associated Press, Agence France-Presse, and PolitiFact—to evaluate questionable content on Meta’s platforms. Posts deemed inaccurate or misleading received warning labels to better inform users.

Despite Zuckerberg’s claims that the program stifled free speech, Angie Drobnic Holan, head of the International Fact-Checking Network, strongly disagreed. In a statement responding to Meta’s decision, she noted: “Fact-checking journalism has never censored or removed posts; it’s added information and context to controversial claims, and it’s debunked hoax content and conspiracy theories. The fact-checkers used by Meta follow a Code of Principles requiring nonpartisanship and transparency.”

Evidence supports the program’s efficacy. In Australia alone during 2023, Meta displayed warnings on more than 9.2 million pieces of content on Facebook and over 510,000 posts on Instagram, based on fact-checkers’ assessments. Multiple studies have shown that such warnings effectively slow the spread of misinformation.

Meta’s fact-checking guidelines specifically prohibited fact-checkers from censoring political figures and celebrities, contradicting claims that the program restricted speech. During the COVID-19 pandemic, fact-checkers played a crucial role in limiting harmful misinformation about the virus and vaccines.

The program also served as a backbone for global anti-misinformation efforts by providing financial support to approximately 90 accredited fact-checking organizations worldwide.

The shift to a “community notes” model raises significant concerns about misinformation control. Reports from The Washington Post and the Center for Countering Digital Hate found that X’s similar approach has failed to adequately address falsehoods on that platform.

Meta’s decision will also create major financial challenges for independent fact-checking organizations, many of which have relied heavily on the tech giant’s funding. Without this support, these organizations may struggle to counter misinformation or combat efforts to weaponize fact-checking by other actors, such as Russian President Vladimir Putin’s recently announced state fact-checking network based on “Russian values.”

The change comes at a time when independent, third-party fact-checking remains crucial in the fight against global misinformation. Meta’s abandonment of its program signals a significant shift in how major social media platforms approach content moderation and information integrity, potentially impacting billions of users who rely on these platforms for news and information.


