Social media platforms have been criticized for failing to effectively remove COVID-19 misinformation, according to a new report from the Center for Countering Digital Hate. The study found that 90% of reported posts containing coronavirus falsehoods remained visible online without warning labels.

The investigation, conducted by ten volunteers from the UK, Ireland, and Romania, identified 649 posts spreading dangerous misinformation across Facebook, Instagram, and Twitter during a one-month period spanning April and May. The posts included false claims about cures, anti-vaccination propaganda, and conspiracy theories linking 5G networks to the virus.

Among the misleading content were claims that COVID-19 sufferers could eliminate the virus by drinking aspirin dissolved in hot water or by taking zinc, vitamin C, and vitamin D supplements. Other posts claimed that wearing face masks could cause cancer.

Twitter proved to be the least responsive platform, taking action on only 3% of the 179 posts reported by volunteers. Facebook removed just 10% of 334 reported posts and added warning labels to another 2%. Similarly, Instagram, owned by Facebook, acted on only 10% of the 135 complaints it received.

Imran Ahmed, chief executive of the Center for Countering Digital Hate, criticized the social media giants for “shirking their responsibilities” in addressing the spread of harmful content. “Their systems for reporting misinformation and dealing with it are simply not fit for purpose,” Ahmed stated. “Social media giants have claimed many times that they are taking COVID-related misinformation seriously, but this new research shows that even when they are handed the posts promoting misinformation, they fail to take action.”

Facebook defended its performance, claiming the study sample was “not representative” of its overall efforts. A spokesperson for the company said: “We are taking aggressive steps to remove harmful misinformation from our platforms and have removed hundreds of thousands of these posts, including claims about false cures.” The company added that during March and April, it placed warning labels on approximately 90 million pieces of COVID-19-related content, deterring users from viewing the original content 95% of the time.

Twitter responded that it was prioritizing the removal of COVID-19 content “when it has a call to action that could potentially cause harm,” but would not take enforcement action on every tweet containing incomplete or disputed information. The platform stated it had challenged more than 4.3 million accounts targeting COVID-19 discussions with “spammy or manipulative behaviors” since March 18.

The findings come as the platforms face questioning from the UK’s Digital, Culture, Media and Sport sub-committee regarding their handling of coronavirus misinformation. Social media companies have implemented various policy changes during the pandemic to address harmful content, including Twitter’s recent move to label misleading tweets from high-profile accounts, including US President Donald Trump.

Rosanne Palmer-White, director of youth action group Restless Development, which participated in the survey, expressed frustration that young people doing their part to combat misinformation were being “let down” by social media companies.

Experts note that while platforms typically prioritize removing content posing immediate life threats, misleading information that presents less obvious dangers—such as anti-vaccination messaging—can ultimately prove just as harmful to public health efforts.


