The Growing Threat of Social Media Disinformation

Social media platforms have become battlegrounds for disinformation campaigns, creating significant challenges for democracy and public discourse, according to new research by Wharton School Professor Eric Clemons.

In an in-depth interview, Clemons highlighted the disturbing effectiveness of disinformation, particularly around complex topics like climate change. “The lie is easy and effective. The truth is long, slow, and complicated,” Clemons explained. “This makes it a very effective way to change opinions.”

Clemons draws a compelling comparison between processed food and manipulated information. “We are finally understanding that highly processed food isn’t the same as a balanced diet, and highly processed news isn’t the same as an informed electorate,” he noted. Modern disinformation campaigns leverage vast amounts of user data to create targeted messaging that resonates with specific audience segments.

The professor pointed out that social media platforms have accumulated unprecedented knowledge about users. “Recent studies have shown that social media models can predict the behavior of a user more accurately than their spouse,” Clemons said. This data enables what he calls “highly processed lying” – targeted falsehoods delivered precisely to those most susceptible to believing them.

Unlike traditional news outlets that must maintain credibility across their entire audience, social media platforms operate under different incentives. “There’s no penalty to Facebook or Twitter for highly focused, highly targeted lying, whereas this would destroy the brand of The New York Times or NPR,” Clemons explained. Traditional media outlets generally strive for universal plausibility, while social platforms prioritize engagement and entertainment.

The research also clarifies an important distinction between misinformation and disinformation. Misinformation involves sharing false information that the sharer believes to be true – a practice protected by the First Amendment. Disinformation, however, involves deliberately spreading known falsehoods, often as part of organized campaigns. “If I know something to be false, I’m actually working in a troll factory, or I’m working for a political campaign, and I design really effective arguments to spread falsehood, that’s disinformation,” Clemons explained.

Several factors make disinformation particularly difficult to control. Political motivations blur the lines, with one person’s disinformation campaign being another’s legitimate political strategy. The First Amendment, which Clemons acknowledges as fundamentally important, was designed around the principle that truth prevails in the free marketplace of ideas. However, this principle falters “when technology allows highly processed news” and when stories can be crafted specifically for individual consumption.

Traditional media outlets have been severely impacted by the rise of targeted disinformation. Clemons shared an example from Denmark, where a traditional broadcaster saw its market share reduced by approximately 75% while struggling to compete with more entertaining – though less factual – social media content. “Fun beats traditional media on reposting. It beats traditional media on market share. It’s just compelling. It wins,” he noted.

The problem is exacerbated by how people consume social media content. Drawing on the work of Nobel laureate Daniel Kahneman and his longtime collaborator Amos Tversky, Clemons explained that people engage with social media using “fast thinking” – reactive and emotion-based – rather than the analytical “slow thinking” required to evaluate factual accuracy. This explains why labeling false content often fails to reduce its spread.

When asked about potential solutions, Clemons suggested limiting social media platforms’ ability to share aggregate user data and restricting their capacity to target content to specific users based on beliefs. “If you want to limit the effectiveness of highly processed fake news, you take away the processing factory,” he said, while acknowledging the difficulty of implementing such restrictions under current regulatory frameworks.

Looking ahead, Clemons warned that disinformation will likely remain a persistent challenge. “Lying has been a tool of statecraft and warcraft forever,” he observed. “The difference is between traditional one-off lying, which is highly effective only when a society is falling apart, and highly targeted lying, which works most of the time.”

To address this growing problem without compromising free speech principles, Clemons recommends focusing on the data infrastructure that enables precision targeting rather than attempting to regulate speech itself – essentially depriving disinformation campaigns of the tools needed to effectively process and direct false information.


