Social Media Platforms Flooded with Mental Health Misinformation, Study Finds

A concerning wave of misinformation about ADHD and autism is spreading across social media platforms, according to a comprehensive new study from researchers at the University of East Anglia (UEA) and Norfolk and Suffolk NHS Foundation Trust. The research reveals that misleading content, particularly prevalent on TikTok, may be contributing to a troubling rise in self-diagnoses among young users.

The study, published in The Journal of Social Media Research, analyzed 5,057 social media posts across major platforms including TikTok, YouTube, Facebook, Instagram, and X. Researchers discovered alarming rates of inaccurate information, with TikTok emerging as the worst offender. More than half (52%) of ADHD-related videos and 41% of autism-related content on TikTok contained misinformation.

By comparison, YouTube showed a misinformation rate of approximately 22%, while Facebook performed somewhat better at just under 15%. These findings highlight significant disparities in content quality across different social media ecosystems.

“Mental health information on social media matters because many young people now turn to these platforms to understand their symptoms and possible diagnoses,” explained Dr. Eleanor Chatburn from UEA’s Norwich Medical School. “TikTok content has been linked to young people increasingly believing they may have mental health or neurodevelopmental conditions.”

The researchers acknowledge that social media can provide a valuable starting point for individuals questioning their mental health. However, they emphasize that online content should lead to proper clinical assessment rather than replacing professional diagnosis.

“While this questioning can be a helpful starting point, it’s important these questions lead to proper clinical assessment with a professional,” Dr. Chatburn noted. “As well as leading to misunderstanding of serious conditions and pathologizing ordinary behavior, misinformation can also lead to delayed diagnosis for people that actually do need help.”

The study identified a key factor in content reliability: the source. Information shared by health professionals consistently demonstrated higher accuracy than content from non-professionals, suggesting the importance of expertise in mental health discussions online.

Dr. Alice Carter, also from UEA, highlighted the dual nature of personal testimonials. “While lived-experience can play an important role, with personal stories helping people to feel understood and raising awareness of mental health conditions, it is vital to ensure that accurate and evidence-based information from clinicians and trusted organizations is also visible and easy to find.”

The researchers pointed to TikTok’s algorithm as a significant contributor to the problem. The platform’s design, which prioritizes engaging content, can create powerful echo chambers that amplify misinformation. “Once users show interest in a topic, they are bombarded with similar posts – creating powerful echo chambers that can reinforce false or exaggerated claims,” Dr. Carter explained. “It is a perfect storm for misinformation to go viral faster than facts can catch up.”

In response to the findings, the study authors are calling for improved moderation practices and greater promotion of evidence-based content across all social media platforms, but particularly on TikTok, given its popularity among younger users and its higher rate of misinformation.

TikTok has challenged the study’s conclusions. A spokesperson for the platform stated: “This is a flawed study that relies on outdated research about multiple platforms. The facts are that we remove harmful health misinformation and provide access to reliable information from the WHO, so that our community can express themselves about what matters to them and find support.”

The research comes amid growing concerns about social media’s influence on mental health perceptions and diagnoses, especially among impressionable young users. Mental health professionals have increasingly noted a trend of patients arriving at appointments with self-diagnoses based on social media content, highlighting the real-world implications of online misinformation.


13 Comments

  1. Michael Davis

    The findings on ADHD and autism misinformation are a wake-up call. Social media companies need to prioritize user safety over engagement and revenue.

  2. Liam Thompson

    The study’s findings on TikTok’s high rate of ADHD and autism misinformation are alarming. The platform must take immediate steps to address this issue and protect its users.

    • William Hernandez

      Absolutely. Social media companies have a moral and ethical responsibility to ensure their platforms don’t contribute to the spread of harmful mental health misinformation.

  3. This is a troubling trend that could have serious consequences for young people’s mental health. Platforms must address the proliferation of misinformation on their sites.

    • Oliver Moore

      Agreed. Social media has become a powerful, but largely unregulated, source of mental health information. Stricter content moderation is crucial.

  4. James Jackson

    This study underscores the urgent need for social media platforms to take mental health misinformation more seriously. The public deserves access to reliable, expert-backed resources.

  5. Linda Johnson

    Misinformation about ADHD and autism on TikTok is especially troubling. The platform’s young user base is particularly vulnerable. Stronger content moderation is clearly needed.

    • Agreed. TikTok has a responsibility to ensure its users are accessing accurate, science-based information on mental health conditions.

  6. Elizabeth Moore

    While social media has democratized mental health discussions, the prevalence of misinformation is deeply concerning. Platforms must do more to elevate expert voices and resources.

  7. William Johnson

    Concerning to see the rise in self-diagnoses fueled by misinformation on social media. Reliable mental health info is crucial, especially for youth. Platforms need to do more to combat this dangerous trend.

    • Michael Martin

      Absolutely. Social media algorithms often prioritize sensational, misleading content over factual information. This can have serious real-world consequences.

  8. Amelia Taylor

    While social media has made mental health discussions more accessible, the prevalence of misinformation is deeply concerning. Platforms must do more to elevate credible sources.

    • Absolutely. Unregulated mental health content on social media can be extremely harmful, especially for vulnerable young people.


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.