Study Finds High Levels of Mental Health Misinformation on TikTok
A significant portion of TikTok posts about ADHD and autism contains misleading or unsubstantiated information, according to new research from the University of East Anglia (UEA). The comprehensive study, which examined over 5,000 social media posts across multiple platforms, found TikTok to be the worst offender for mental health misinformation.
Researchers investigated the accuracy of mental health and neurodivergence content across YouTube, TikTok, Facebook, Instagram, and X (formerly Twitter). Their findings revealed alarming rates of misinformation, reaching as high as 56 percent of examined content in some categories.
“Our work uncovered misinformation rates on social media as high as 56 percent. This highlights how easily engaging videos can spread widely online, even when the information isn’t always accurate,” said Dr. Eleanor Chatburn from UEA’s Norwich Medical School.
The study found that posts about neurodivergent conditions like autism and ADHD contained higher levels of misinformation than many other mental health topics. On TikTok specifically, researchers discovered that 52 percent of ADHD-related videos and 41 percent of autism videos analyzed contained inaccurate information.
By comparison, YouTube averaged 22 percent misinformation in mental health content, while Facebook performed better with an average of just under 15 percent inaccurate content.
Dr. Alice Carter, who conducted the research as part of her doctoral thesis, emphasized the stark contrast between professional and non-professional content: “Just three percent of professional videos about ADHD on TikTok contained misinformation, compared to 55 percent of videos by non-professionals.”
The study, published in The Journal of Social Media Research, represents the first systematic review to examine mental health and neurodivergence information across multiple social media platforms.
The researchers warn that misinformation about mental health conditions can have serious consequences. Dr. Chatburn explained: “Mental health information on social media matters because many young people now turn to these platforms to understand their symptoms and possible diagnoses.”
“TikTok content has been linked to young people increasingly believing they may have mental health or neurodevelopmental conditions. While this questioning can be a helpful starting point, it’s important these questions lead to proper clinical assessment with a professional,” she added.
The study identified several harmful outcomes of mental health misinformation, including delayed diagnosis for those who need help, reinforcement of stigma, and promotion of unproven treatments that could prevent people from seeking evidence-based care.
TikTok’s algorithm design was specifically highlighted as problematic. “TikTok’s algorithms are designed to push rapidly engaging content and this is a major driver of misinformation,” explained Dr. Carter. “Once users show interest in a topic, they are bombarded with similar posts—creating powerful echo chambers that can reinforce false or exaggerated claims.”
The research did find one bright spot: YouTube Kids contained no misinformation for anxiety and depression content, and only 8.9 percent for ADHD—a result attributed to the platform’s stricter moderation rules. Standard YouTube, however, was described as “highly inconsistent,” with video quality ranging from poor to moderately reliable.
The authors conclude with a call for health organizations and medical professionals to create and promote better evidence-based content on social media. They also recommend improved content moderation, standardized assessment tools for online mental health information, and clearer definitions of what constitutes misinformation in this space.
As social media continues to be a primary source of health information for many young people, the researchers stress the importance of ensuring accurate, evidence-based information is readily available and easily distinguishable from misleading content.
8 Comments
I’m not surprised to see TikTok identified as a hotbed for mental health misinformation. The platform’s short-form video format can make it challenging to provide nuanced, evidence-based information on complex topics. More regulation and user education may be required.
That’s a good point. The quick, attention-grabbing nature of TikTok content doesn’t lend itself well to in-depth discussions on mental health issues. Misinformation can easily gain traction in that environment.
Interesting study on the prevalence of mental health misinformation on social media, especially TikTok. It’s concerning to see such high rates of inaccurate content, particularly around sensitive topics like ADHD and autism. Platforms need to do more to combat this issue and promote reliable information.
Agreed. Social media algorithms that prioritize engagement over accuracy can enable the rapid spread of misleading content. Stricter moderation and verification processes are needed to address this problem.
The finding that over half of ADHD-related content on TikTok contains misinformation is quite alarming. These are sensitive topics that require nuanced, factual discussion. Platforms need to do more to prioritize reliable, science-backed information on mental health.
Absolutely. Spreading misinformation about conditions like ADHD and autism can have real, harmful consequences for vulnerable individuals and communities. Social media companies should be held accountable for the content they amplify.
While social media can be a valuable tool for raising awareness and connecting people, this study highlights the darker side of its impact on mental health information. Platforms need to strike a better balance between user engagement and content quality/accuracy.
This study underscores the need for greater digital literacy and critical thinking skills when it comes to evaluating health information online. Social media platforms have a responsibility to curb the spread of misinformation, but users also need to be more discerning consumers of content.