
The rapid spread of misinformation across social media platforms has become an increasingly concerning issue in the United States, affecting public discourse on critical topics ranging from presidential elections and climate change to COVID-19 vaccines and variants.

A recent Pew Research Center study revealed that 86 percent of American adults now get their news through smartphones, creating an environment where false information can spread quickly and efficiently. Despite growing calls for social media companies to address this problem, experts suggest that platforms like Facebook, Twitter, and YouTube have little incentive to combat misinformation effectively.

“We see so much misinformation because the platforms have no real interest in deterring it,” explains Roger Entner, a technology and telecommunications analyst at Recon Analytics. “It is really easy and free to join the platform, and there is no profit in deleting the misinformation or preventing provocateurs from posting it.”

Entner points to a troubling economic reality behind the persistence of false information online: “Actually, the platforms profit from it because the more outrageous the content, the more people interact with it – this type of ‘engagement’ is what the platforms are looking for; people reacting to things. It doesn’t matter if it’s true or false as long as they engage.”

This creates a situation where content that generates strong reactions – regardless of accuracy – becomes more valuable to the platforms’ business models. “Everything gets sacrificed on the altar of monetization through engagement,” Entner adds.

Since social media companies are unlikely to eliminate misinformation entirely, experts recommend that users develop the skills to identify questionable content. This is particularly challenging because misinformation often masquerades as legitimate news — sometimes the product of genuine misunderstanding, but frequently designed to mislead.

William V. Pelfrey, Jr., professor in the Wilder School of Government and Public Affairs at Virginia Commonwealth University, distinguishes between two forms of problematic information. “Bad information comes in two flavors, unintentional and intentional. The latter, intentional disinformation, is far more dangerous,” he says.

The sources of disinformation can be varied and sophisticated. “There are many persons who purposefully distribute inaccurate information in an attempt to influence outcomes, such as an election,” Pelfrey notes. “Some countries, including Russia and China, have sophisticated disinformation organizations, which work hard to undermine the United States, thereby elevating other countries.”

The situation is further complicated by America’s deep political divisions, where contradictory information abounds and trust in opposing viewpoints has deteriorated significantly. “Human nature is to simply turn away when things get confusing,” Pelfrey explains. “If the government is telling you to get vaccinated, and social media is telling you that vaccines and Covid are a hoax, the easiest thing to do is ignore all of it.”

Dr. Donna Gregory, senior lecturer at the School of Nursing at Regis College, recommends three questions to evaluate information: “Who is posting it? What information are they sharing? What is their intent?” She emphasizes checking the qualifications and potential biases of sources, assessing whether claims are supported by data and recent studies, and considering the motivations behind the information.

“Disinformation is more likely to want to push you in one direction or another, instead of just sharing information,” Gregory notes.

Fact-checking has emerged as a crucial tool in combating misinformation, though it’s often misunderstood. True fact-checking involves verifying claims against reliable evidence, rather than simply cherry-picking data that supports a particular viewpoint.

“If the information comes from a single person, it is opinion—not fact,” warns Pelfrey, citing an example where seemingly authoritative health information turned out to be from “a part-time pharmacy employee. Not a scientist.”

Gregory recommends cross-referencing information across multiple credible sources: “If you find information about the delta variant on a news site, can you find this same information on the CDC website? What about other health care organizations such as the National Institute of Allergy and Infectious Diseases?”

As misinformation continues to proliferate online, developing these critical evaluation skills becomes increasingly essential for navigating today’s complex information landscape. Without significant changes to social media business models, the responsibility for distinguishing fact from fiction will largely remain with individual users.



© 2025 Disinformation Commission LLC. All rights reserved.