For the second consecutive year, the World Economic Forum (WEF) has ranked misinformation and disinformation as the most severe near-term global risk. The assessment arrives at a pivotal moment: social media giant Meta has announced it will terminate its partnerships with third-party fact-checking organizations, citing concerns about “censorship” and “political bias.”

The reversal is a dramatic shift for Meta, which built extensive fact-checking collaborations in the wake of the 2016 U.S. presidential election. The company’s decision comes as global concern about online misinformation continues to intensify.

Research from the Reuters Institute’s Digital News Report highlights the growing public anxiety around false information: 72% of online news users in the United States expressed concern about distinguishing real from fake content online, an 8-point increase from the previous year and significantly higher than the 59% average across the 47 markets surveyed. The study singled out X (formerly Twitter) and TikTok as the platforms where users found it hardest to identify trustworthy information.

The United States has recently emerged from what many observers describe as an exceptionally divisive election campaign. The BBC and other news organizations frequently prefaced statements from the new president with phrases like “without any evidence,” underscoring the tension between political rhetoric and factual reporting.

The phenomenon extends well beyond U.S. borders. In 2024 alone, more than sixty countries, home to approximately 49% of the global population, held elections. Many of these contests were marked by similarly contentious campaigns that deepened existing social divisions along racial, religious, and cultural lines.

The Reuters Institute’s research suggests public concerns about misinformation are complex. Rather than focusing primarily on entirely fabricated news, audiences often worry about biased perspectives, superficial reporting, and unsubstantiated claims. Recent conflicts in Ukraine and Gaza have been particular flashpoints for concerns about misleading content.

“Misinformation doesn’t exist in a bubble—it plays on existing schisms in society that manifest as communal hatred, racism, gender violence, and class and caste divisions,” notes Mitali Mukherjee, Director of the Reuters Institute for the Study of Journalism at the University of Oxford.

While artificial intelligence introduces new complications to the information landscape, Mukherjee argues it isn’t the fundamental problem. “Across many markets and for many reasons, news has lost its connection with audiences. Tumultuous times increase the urgency for news publishers to find their way back,” she explains.

For media organizations seeking to rebuild audience trust, Mukherjee recommends focusing on creating “relevant, engaging, and high-quality content that demonstrates the value of engaging with the news.” She emphasizes the importance of reaching groups that feel most alienated by traditional news coverage, including young people, ethnic minorities, socioeconomically disadvantaged communities, and women.

The challenge extends to governments and regulatory bodies as well. Mukherjee calls for frameworks that protect journalists and editorial institutions—an increasingly difficult task in the current political climate. She advocates for public support of innovation in journalism through funding, removing obstacles to editorial creativity, and ensuring digital market regulations preserve editorial independence.

“Misinformation reflects the fissures that exist in our communities,” Mukherjee concludes. “If newsmakers focus on listening to their audiences and engaging with them, this approach has the potential to be the most effective way to repair the divisions, ultimately counteracting misinformation.”

As global tensions rise and technological capabilities advance, the battle against misinformation remains one of the most pressing challenges for democratic societies. The response requires coordination among media organizations, technology platforms, government entities, and an engaged public to preserve the integrity of information ecosystems worldwide.

