
TikTok Mental Health Misinformation More Prevalent Than on Other Platforms, Study Finds

Inaccurate social media posts about attention deficit hyperactivity disorder (ADHD) and autism have been linked to a rise in young people self-diagnosing with neurodevelopmental conditions, according to researchers at the University of East Anglia (UEA).

A comprehensive review of 27 studies examining over 5,000 social media posts has revealed a troubling prevalence of mental health misinformation across popular platforms, with TikTok showing the highest rates of inaccurate content.

“When we looked closely at TikTok content, studies reported that 52% of ADHD-related videos and 41% of autism videos analyzed were inaccurate,” said Dr. Alice Carter from UEA. “By contrast, YouTube averaged 22% misinformation while Facebook averaged just under 15%.”

The research, published in The Journal of Social Media Research, found significant variation in misinformation rates across platforms and mental health topics. While YouTube Kids maintained zero misinformation about anxiety and depression—likely due to stricter content moderation practices—other platforms showed concerning levels of inaccuracy, with YouTube’s claustrophobia videos topping the chart at 56.9% misinformation.

Mental health and neurodevelopmental conditions proved particularly susceptible to misinformation compared to other health topics, raising concerns about the impact on vulnerable users seeking information about their symptoms.

Dr. Eleanor Chatburn from UEA’s Norwich Medical School emphasized the real-world consequences of this trend. “TikTok content has been linked to young people increasingly believing they may have mental health or neurodevelopmental conditions,” she explained.

“While questioning can be a helpful starting point, it’s important these questions lead to proper clinical assessment with a professional. Misinformation can lead to misunderstanding of serious conditions, pathologizing ordinary behavior, and delayed diagnosis for people who actually need help.”

The researchers identified TikTok’s algorithm as a significant factor in the spread of inaccurate content. “TikTok’s algorithms are designed to push rapidly engaging content, which is a major driver of misinformation,” said Dr. Carter. “Once users show interest in a topic, they are bombarded with similar posts—creating powerful echo chambers that can reinforce false or exaggerated claims.”

The study found that content created by healthcare professionals was generally more accurate, highlighting the importance of credible sources in countering misinformation. The findings present what researchers described as a “clear need for action”: improved content moderation and greater promotion of high-quality information from trusted sources.

Judith Brown, head of evidence and research at the National Autistic Society, called the spread of autism misinformation a “serious issue” that could prevent people from seeking appropriate support.

“People are being exposed to inaccurate and unreliable information which can lead to stigma and prejudice,” Brown said. “Social media companies should think about how to improve their platforms to prevent the spread of misinformation. People should be wary of information they find and know that it does not replace a professional assessment for autism.”

TikTok has disputed the findings, calling the research a “flawed study that relies on outdated research about multiple platforms.” A spokesperson stated: “The facts are that we remove harmful health misinformation and provide access to reliable information from the WHO, so that our community can express themselves about what matters to them and find support.”

The UK government responded by emphasizing platform accountability under the Online Safety Act, which requires social media companies to tackle illegal and harmful content or face enforcement action. A spokesperson noted that accurate mental health information is essential, as “misinformation can cause real harm and delay people from getting the help they need.”

The researchers have called for improved evidence-based content and enhanced content moderation across all platforms to help combat the spread of mental health misinformation online.


