
Meta Scraps Fact-Checking in Major Content Moderation Overhaul

Meta will discontinue its long-standing use of fact-checking organizations across its platforms, founder Mark Zuckerberg announced this week. The social media giant will instead implement a community notes system similar to X (formerly Twitter), allowing users to identify potentially misleading content.

In a five-minute video titled “More speech and fewer mistakes,” Zuckerberg explained the decision, claiming that fact-checking had become “biased” and overly intrusive. “It’s time to get back to our roots around free expression,” he stated. “Our system attached real consequences in the form of intrusive labels and reduced distribution. A programme intended to inform too often became a tool to censor.”

The sweeping changes will affect Meta’s three major platforms—Facebook, Instagram, and Threads—which collectively reach over 3 billion users worldwide. The community notes system is expected to roll out in the United States within the next few months.

This shift comes as tech companies prepare for Donald Trump’s return to the White House. The president-elect’s supporters have long criticized content moderation policies as censorship. When asked if his previous criticism influenced Meta’s decision, Trump responded simply: “Probably.”

Meta also plans to relocate its content moderation teams from California to Texas, a move Zuckerberg claims will “help us build trust” and address “concern about the bias of our teams.”

Samuel Woolley, former director of propaganda research at the University of Texas at Austin, described the relocation as “born out of both some practicality and also some political motivation,” noting the differing perceptions of California and Texas among the incoming administration.

The strategy appears to mirror Elon Musk’s corporate shift to Texas. Musk, who will lead Trump’s Department of Government Efficiency, praised Meta’s announcement on X, writing: “This is cool.”

Since 2016, Meta has partnered with more than 90 fact-checking organizations working in more than 60 languages, including PolitiFact, FactCheck.org, and AFP Fact Check. Under the current system, when fact-checkers identify false content, Meta limits its reach without removing it entirely unless it violates community standards.

While specific details of Meta’s community notes implementation remain unclear, Zuckerberg indicated it will resemble X’s system, where eligible users create annotations that appear beneath potentially misleading posts. These notes must receive positive ratings from other contributors before becoming visible to all users.

Research on X’s community notes has yielded mixed results. A University of Luxembourg study found the system reduced the spread of misleading posts by an average of 61.4%, but found that notes often appear too late to curb initial viral spread. The Center for Countering Digital Hate reported that 74% of misleading election-related posts with proposed notes never received the “helpful” status needed for platform-wide visibility.

Federal Trade Commission Chair Lina Khan expressed concern that Meta executives might be pursuing favorable treatment from the Trump administration, suggesting they could be attempting to secure a “sweetheart deal” with the White House.

Representative Alexandria Ocasio-Cortez told Business Insider that “Mark Zuckerberg is trying to follow in Elon’s footsteps, which means that… they’re going to use this guise of free speech to actually suppress critics of Trump and critics of themselves.”

Fact-checking organizations have criticized the move as politically motivated. Neil Brown, president of the Poynter Institute, which owns PolitiFact, stated: “Facts are not censorship. Fact-checkers never censored anything. And Meta always held the cards.”

The changes have garnered support from various conservative voices, including Republican Representative Randy Weber of Texas, who told Business Insider: “It seems like Meta is finally taking a page from Elon Musk’s playbook & letting Americans make decisions for themselves.”

While the initial rollout will be limited to the United States, Zuckerberg’s announcement referenced challenges in other regions, criticizing European content regulations as “institutionalising censorship” and claiming Latin American countries have “secret courts” that force content removal.

The European Commission promptly rejected these characterizations, with spokesperson Paula Pinho telling reporters in Brussels: “We absolutely refute any claims of censorship on our side.”

As Meta prepares this significant policy shift, experts continue to debate whether community-based moderation will effectively combat misinformation or merely open the floodgates to more problematic content across the company’s influential platforms.


7 Comments

  1. Linda E. Taylor

    I can see both pros and cons to Meta’s decision. On one hand, fact-checking has been criticized as biased and intrusive. But on the other, an unmoderated ‘community notes’ system could amplify misinformation and agenda-driven narratives. It’s a risky trade-off that will be important to monitor.

  2. Liam Rodriguez

    Removing fact-checkers could undermine trust in the information shared on Meta’s platforms. While the current system isn’t perfect, an unmoderated ‘community notes’ approach seems risky, especially with the potential for political agendas to influence what gets flagged. Curious to see how this plays out.

  3. Ava J. Martinez

    Meta’s move away from professional fact-checking is a curious one. While the current system has flaws, shifting to user-generated notes may create new challenges around credibility and consistency of information. It’s a high-stakes gamble for a platform of such scale and influence.

  4. Jennifer R. Jones

    Interesting move by Meta. While fact-checking has its flaws, removing it entirely raises concerns about misinformation spreading on their platforms. Community-driven notes could be a double-edged sword – empowering users but also potentially amplifying biases and agenda-driven narratives.

  5. Robert Thompson

    The decision to scrap fact-checkers and rely on community notes seems like a risky move. It could open the door to more unchecked misinformation, especially around sensitive topics like elections. Curious to see how this plays out and whether it actually leads to ‘more speech and fewer mistakes’.

    • I share your concerns. Crowd-sourced fact-checking is a tricky proposition and could easily be gamed by bad actors. Will be interesting to see if Meta has robust systems in place to prevent abuse of the community notes feature.

  6. This is a bold move by Meta, but I’m skeptical it will lead to ‘fewer mistakes.’ Relying on user-generated notes to identify misinformation is a gamble that could backfire if not implemented thoughtfully. The implications for the broader social media landscape are worth watching closely.
