Up to 57% of social media posts about mental health contain misleading or inaccurate information, according to a comprehensive new systematic review that raises serious concerns about where people are getting their health advice online.
The study, which analyzed over 5,000 posts across major platforms including TikTok and YouTube, found misinformation rates ranging from 0% to 56.9%, highlighting significant concerns about content reliability in an era where social media has become a primary source of health information for many.
Researchers discovered striking differences between platforms, with TikTok content showing notably higher levels of inaccuracy compared to YouTube. The type of mental health topic also influenced accuracy rates, with posts about neurodivergent conditions like autism and attention deficit hyperactivity disorder (ADHD) containing more misinformation than general mental health content.
The surge in mental health-related social media content comes amid a broader digital health information boom, where influencers and content creators without medical credentials often garner larger followings than healthcare professionals. This trend has created a complex information environment where entertainment value and engagement metrics can outweigh accuracy and evidence-based practice.
“The prevalence of misinformation we’re seeing represents a genuine public health challenge,” said one of the study’s researchers, who requested anonymity pending final publication. “When people base healthcare decisions on inaccurate social media posts, it can lead to self-misdiagnosis, delayed treatment, or inappropriate self-management.”
The systematic review incorporated 27 separate studies and employed narrative synthesis to evaluate findings due to significant variability in how different researchers assessed and defined misinformation. This methodological challenge itself highlights the need for standardized approaches to evaluating health information quality online.
One consistent finding was the correlation between creator credentials and content accuracy. Posts from verified healthcare professionals or established health organizations generally contained more reliable, evidence-based information than user-generated content from non-professionals. However, these medically accurate posts often received less engagement and visibility on the platforms.
Mental health professionals are particularly concerned about the real-world impact of such misinformation. Simplified or sensationalized content about complex conditions can lead to inappropriate self-diagnosis, treatment delays, or the adoption of potentially harmful interventions. The risks are especially pronounced for vulnerable populations, particularly the young people who make up a significant share of social media users.
Platform dynamics also contribute to the problem. Algorithms designed to maximize engagement often amplify emotionally resonant or controversial content regardless of accuracy. Short-form video platforms like TikTok, with their emphasis on brevity and virality, present particular challenges for conveying nuanced health information.
The researchers have called for a multi-faceted response to address these challenges. Their recommendations include encouraging healthcare institutions to develop more engaging, accessible content that maintains scientific accuracy while reaching broad audiences. Additionally, they advocate for stronger content moderation policies specific to health information, clearer labeling of evidence-based versus opinion content, and improved digital health literacy education.
“This isn’t about censoring speech,” noted the research team in their conclusions. “It’s about ensuring people can make informed decisions based on reliable information, especially when it comes to something as important as mental health.”
Social media companies have begun implementing various measures to combat health misinformation, including adding information labels, reducing the visibility of questionable content, and partnering with health organizations. However, the effectiveness of these approaches remains uneven across platforms and topics.
The study highlights the critical importance of coordinated action among healthcare providers, technology companies, policymakers, and users themselves to improve the quality of mental health information online in an increasingly digital information ecosystem.
The research was published in the Journal of Social Media Research under the citation: Carter A et al. Quality, reliability and misinformation in mental health and neurodivergence content on social media: a systematic review. JSOMER. 2026;DOI:10.29329/jsomer.84.