Policy experts, journalists, and academics convened at the Chicago Council on Global Affairs to address the escalating crisis of global disinformation.

The symposium’s opening plenary session set a somber tone as participants acknowledged that democratic societies now face an unprecedented threat landscape. With the rapid evolution of artificial intelligence tools and declining effectiveness of traditional content moderation approaches, experts warned that the coming years could present even greater challenges to information integrity worldwide.

“We’re entering what many are calling a post-moderation era, where the sheer volume and sophistication of disinformation is overwhelming our current defense mechanisms,” said Dr. Eleanor Winters, a digital policy researcher who addressed the audience. “This isn’t just about election interference anymore – it’s about the fundamental ability of societies to maintain a shared reality.”

The discussion highlighted how the nature of disinformation has shifted dramatically over the past five years. Where previous disinformation campaigns often originated from specific state actors with clear political objectives, today’s landscape features a more diverse set of perpetrators, including commercial entities, ideological groups, and even individual entrepreneurs who monetize false information.

Several speakers pointed to the upcoming election cycles in more than 50 democracies worldwide as a critical test for information integrity. The United States, India, Mexico, and the European Union all face major elections in 2024, creating what one speaker called “a perfect storm of opportunity” for those seeking to disrupt democratic processes.

“These elections represent approximately 41 percent of the world’s population,” noted Ambassador Carlos Menéndez, a former diplomat now working on election security initiatives. “The scale and concentration of these democratic exercises create unprecedented vulnerability. We’re not just defending individual elections anymore – we’re defending the concept of democracy itself.”

The symposium placed particular emphasis on how technological evolution has outpaced regulatory frameworks. Artificial intelligence tools have dramatically lowered the barriers to creating convincing false content, while algorithmic distribution systems continue to prioritize engagement over accuracy across major platforms.

Tech industry representatives at the event acknowledged these challenges but pushed back against calls for more aggressive regulation. “We’re investing billions in detection and prevention technologies,” said Sarah Chen, head of trust and safety at a major social media company. “But we also need to recognize that technical solutions alone won’t solve what is fundamentally a human problem.”

Regional experts highlighted how disinformation challenges manifest differently across global contexts. In Southeast Asia, for instance, closed messaging platforms like WhatsApp and Telegram have become primary vectors for misinformation, creating “information silos” that are particularly difficult to monitor or counter.

“The Global South faces unique vulnerabilities,” explained Dr. Kwame Osei, who studies information flows in African democracies. “Lower digital literacy rates, less robust media ecosystems, and fewer fact-checking resources create an environment where harmful narratives can spread virtually unchecked.”

The symposium also addressed the economic dimensions of disinformation, noting how advertising models continue to incentivize controversial content regardless of its accuracy. Several speakers called for fundamental reforms to digital business models, arguing that current profit incentives are misaligned with democratic values.

Despite the sobering assessment, participants outlined potential paths forward. These included investments in digital literacy education, support for independent journalism, development of international coordination mechanisms, and exploration of new technologies that might help authenticate digital content.

“We need to shift from thinking about this as a content moderation problem to recognizing it as a systemic challenge requiring multiple interventions,” said Professor Julia Harrington, who studies information environments. “This means addressing economic incentives, regulatory frameworks, educational approaches, and technological solutions simultaneously.”

As the three-day symposium continues, participants will break into working groups focused on specific aspects of the disinformation challenge, with an emphasis on developing practical, implementable solutions that can be adapted across different regional and political contexts.

“The stakes couldn’t be higher,” concluded the Council’s president in closing remarks. “What we’re talking about is nothing less than preserving the information commons that makes democratic governance possible.”


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.