A groundbreaking study from the University of East Anglia has revealed alarming levels of mental health misinformation circulating across social media platforms, with TikTok emerging as the most problematic channel for inaccurate content.

The research, published in the Journal of Social Media Research, analyzed over 5,000 posts about mental health and neurodivergence across YouTube, TikTok, Facebook, Instagram, and X (formerly Twitter). Researchers found misinformation rates reaching as high as 56% on some platforms, raising serious concerns about how young people are learning about mental health conditions.

“Social media has become an important space where many young people learn about mental health, but the quality of this information can vary greatly,” explained Eleanor Chatburn from Norwich Medical School at the University of East Anglia. “This means that misleading content can circulate quickly, especially if there are no accessible and reliable sources.”

The study identified TikTok as particularly problematic, with significantly higher rates of inaccurate or unfounded content compared to other platforms. According to Alice Carter, who conducted the research as part of her doctoral thesis, 52% of TikTok videos related to ADHD and 41% of videos about autism contained inaccurate information.

“In contrast, YouTube averaged 22% misinformation, while Facebook averaged just under 15%,” Carter noted.

The research team found that content about neurodivergent conditions such as autism and ADHD contained higher levels of misinformation than the other mental health topics examined, which included schizophrenia, bipolar disorder, depression, eating disorders, OCD, anxiety, and phobias.

This systematic review represents the first comprehensive examination of mental health information across multiple social media platforms, revealing concerning patterns in how mental health content is created and consumed online.

Mental health professionals worry that exposure to such misinformation can have serious consequences. “TikTok content has been linked to young people increasingly believing they may have mental health issues or neurodevelopmental conditions,” Chatburn pointed out. “While this questioning can be a useful starting point, it is important that these questions lead to appropriate clinical evaluation with a professional.”

The researchers stressed that misinformation can lead to misunderstanding of serious conditions, pathologizing of common behaviors, and delayed diagnosis for people who genuinely need help. Additionally, spreading false ideas can fuel stigma and discourage people from seeking professional support.

“When people encounter misleading advice about treatments, especially those not supported by evidence, it can delay them from receiving appropriate care and ultimately make things worse,” Chatburn emphasized.

The study confirmed that content created by health professionals was consistently more accurate. However, such professional voices represent only a small fraction of mental health content circulating on these platforms. For instance, on TikTok, only 3% of professional videos about ADHD contained erroneous information, compared to a staggering 55% of non-professional videos.

The researchers also highlighted how platform algorithms exacerbate the problem. “TikTok’s algorithms are designed to boost content that quickly attracts attention, and this is a significant factor in the spread of misinformation,” Carter explained. “Once users show interest in a topic, they are bombarded with similar posts, creating powerful echo chambers that can reinforce false or exaggerated claims. It’s the perfect storm for misinformation to go viral faster than facts can be confirmed.”

Not all platforms performed equally poorly. YouTube Kids showed promising results, containing no misinformation about anxiety and depression, and only 8.9% about ADHD – results attributed to stricter moderation standards. However, standard YouTube was described as “highly inconsistent,” with video quality ranging widely depending on topic, channel, and influencer.

As social media continues to serve as a primary information source for younger generations, this research highlights the urgent need for improved content moderation, greater prominence of professional voices, and better digital literacy to help users distinguish between reliable and misleading mental health information.


14 Comments

  1. James Y. Jackson on

    While I’m not surprised by the findings, it’s still deeply troubling. Mental health is such a sensitive and important topic – we can’t allow it to become a breeding ground for false narratives and harmful pseudoscience.

    • John R. Davis on

      Absolutely. Mental health education should come from qualified professionals, not anonymous social media users. Platforms need to do more to elevate credible sources and fact-check content.

  2. Elizabeth O. Williams on

    As a parent, this study fills me with concern. We need to be vigilant about the information our kids are consuming online, especially when it comes to their mental well-being. More needs to be done to protect the vulnerable.

  3. Patricia Martinez on

    This is very concerning. Social media’s role in shaping young people’s understanding of mental health is clearly a double-edged sword. We need to find ways to promote reliable, evidence-based information and counter the spread of misinformation.

    • Oliver Jackson on

      Absolutely. Platforms like TikTok need to take more responsibility for the content they host and prioritize mental health resources from credible sources.

  4. Noah Williams on

    While social media can be a powerful tool for raising awareness and connection, this study shows the dark side of its influence on vulnerable populations. We need to find the right balance between free expression and responsible content moderation.

  5. Oliver Martin on

    As someone with personal experience navigating mental health challenges, I’m deeply concerned about the impact this misinformation could have on vulnerable individuals. We must do more to empower people with facts, not fiction.

    • Amelia Garcia on

      I share your concern. Inaccurate information about mental health can be incredibly harmful, leading to stigma, shame, and barriers to seeking proper care. This is a public health issue that demands immediate attention.

  6. Isabella Smith on

    This is a complex issue with no easy solutions. On one hand, we want to protect free speech and allow people to share their experiences. On the other, we have a duty to ensure accurate, safe information is readily available, especially for mental health.

    • Agreed. It’s a delicate balance, but the risks of spreading misinformation are too high, especially when it comes to sensitive topics like mental health. Platforms must do more to curate reliable content.

  7. Isabella Johnson on

    This study is a wake-up call. Social media platforms have a moral obligation to prioritize mental health resources and combat the spread of misinformation. The stakes are too high to ignore this problem.

  8. Amelia Taylor on

    Wow, 56% misinformation rate on some platforms? That’s truly staggering. This study highlights the urgent need for reform and a rethinking of how we approach mental health education and awareness on social media.

  9. James K. Lopez on

    I’m not surprised that TikTok is a hotspot for mental health misinformation. The short-form, viral nature of the platform makes it challenging to vet content thoroughly. This highlights the urgent need for better digital media literacy education.

    • Lucas Thompson on

      Yes, teaching critical thinking skills around online information is crucial, especially for younger audiences. Platforms, educators, and health professionals must collaborate to address this issue.

