In a dramatic shift for social media content moderation, Meta CEO Mark Zuckerberg announced plans to terminate the company’s U.S. fact-checking operations across Facebook, Instagram, and Threads, signaling a significant retreat from efforts to combat misinformation on its platforms.

The decision, revealed Tuesday, appears strategically timed as the Trump administration prepares to take office. Industry analysts view the move as both a political calculation to align with the incoming administration’s positions on content regulation and a business strategy to boost the company’s most prized metric: user engagement.

Research has consistently shown that false or misleading content spreads substantially faster on social media than accurate information. Studies indicate that false posts can circulate up to 20 times more quickly, especially when they contain provocative or outrageous claims such as conspiracy theories, inflammatory rhetoric, or calls to action. For Meta, this translates directly to increased ad revenue through heightened user activity.

“Absent the moderating force of fact-checking, we’ll see more content that’s hyper-partisan, vitriolic, and hostile,” warned Dr. Cody Buntain, an assistant professor at the University of Maryland specializing in social media disinformation. “The people that are already potentially more extreme, they’ll be more engaged in the platform. There’ll be more content that caters to their interests.”

The effects of reduced content moderation are already becoming apparent. Misinformation regarding the Los Angeles wildfires has spread rapidly across Meta’s platforms, mirroring patterns seen on X (formerly Twitter) after Elon Musk reduced moderation efforts. Zuckerberg explicitly acknowledged following Musk’s example in his announcement.

Meta’s decision to disband its fact-checking team stems from data showing that warning labels on questionable content reduce user interaction—directly contradicting the company’s primary business objective of maximizing engagement and time spent on its platforms.

Content designed to trigger emotional responses, whether positive or negative, drives engagement regardless of accuracy. When users react to misinformation—even to dispute it—they inadvertently boost the content’s visibility and contribute to Meta’s advertising revenue. The platform’s algorithms don’t distinguish between supportive engagement and critical responses; all interaction is valued equally from a business perspective.

This pursuit of engagement at any cost appears deeply ingrained in Meta’s corporate culture. In 2016, then-Facebook executive Andrew Bosworth argued in an internal email that even severe negative outcomes like facilitating terrorist attacks or contributing to suicides could be justified by the company’s mission to connect people. Though Zuckerberg publicly disagreed when the email surfaced in 2018, Bosworth was later promoted to chief technology officer in 2022.

Meta has similarly persisted in targeting younger users despite mounting evidence of social media’s negative impact on adolescent mental health. With Facebook’s core user base aging, the company has aggressively pursued younger demographics who, according to Pew Research Center data, demonstrate the highest levels of smartphone dependency across all age groups.

The abandonment of fact-checking represents a significant reversal from Meta’s post-2016 commitments to combat misinformation following widespread criticism of the platform’s role in election interference and the spread of false information. In previous years, such a decision might have triggered congressional scrutiny and public backlash.

However, with the political landscape shifting, Meta appears to have calculated that the business benefits of unrestricted engagement outweigh potential reputational damage, particularly in an environment where regulatory pressure may decrease under the incoming administration.

For users of Facebook, Instagram, and Threads, the change signals a fundamental shift in their information environment—one where determining the accuracy of viral content will increasingly fall to individuals rather than platform safeguards.
