
In a digital age where health advice is just a scroll away, social media platforms have become go-to resources for many seeking information about mental health and neurodevelopmental conditions. However, a comprehensive new study reveals alarming levels of misinformation circulating online about these sensitive topics.

Researchers from the University of East Anglia in the United Kingdom analyzed more than 5,000 posts across major social media platforms—YouTube, TikTok, Facebook, Instagram, and X—covering conditions including autism, ADHD, schizophrenia, bipolar disorder, depression, eating disorders, OCD, anxiety, and phobias.

The findings, published recently in the Journal of Social Media Research, indicate that content about neurodevelopmental conditions such as autism and ADHD contained the highest levels of misinformation. In some categories of mental health content on social media, up to 56% of posts contained inaccurate or misleading information.

TikTok emerged as the platform with the most problematic content. Researchers found that 52% of ADHD-related videos and 41% of autism-focused content on the platform shared inaccurate information. This stands in stark contrast to YouTube Kids, which employs stricter content moderation measures and showed no misleading content about anxiety and depression, with only 8.9% of ADHD content containing misinformation.

Eleanor Chatburn, a clinical psychologist at the University of East Anglia and study co-author, explained the concerning implications: “Our work uncovered misinformation rates on social media as high as 56%. This highlights how easily engaging videos can spread widely online, even when the information isn’t always accurate.”

The problem is compounded by social media algorithms that serve users content similar to what they’ve previously engaged with, potentially creating echo chambers of misinformation. This algorithmic reinforcement can lead individuals to self-diagnose incorrectly or delay seeking professional help for genuine conditions.

“When false ideas spread, they can feed stigma and make people less likely to reach out for support when they really need it,” Chatburn noted. “On top of that, when people come across misleading advice about treatments, especially ones that aren’t backed by evidence, it can delay them from getting proper care and ultimately make things worse.”

The mental health landscape has increasingly moved online during recent years, particularly following the COVID-19 pandemic when telehealth services expanded dramatically. This digital shift makes accurate online information more crucial than ever, as many individuals turn to social media for initial guidance before consulting healthcare professionals.

The researchers found that content created by healthcare professionals was significantly more accurate. However, such expert-generated posts represent only a small fraction of mental health content circulating on these platforms. The study’s authors suggest that health organizations and clinicians should take a more active role in creating and sharing evidence-based information online.

Alice Carter, who led the study at the University of East Anglia, acknowledged the value of personal experiences while emphasizing the need for expert voices: “While lived experience can play an important role, with personal stories helping people to feel understood and raising awareness of mental health conditions, it is vital to ensure that accurate and evidence-based information from clinicians and trusted organizations is also visible and easy to find.”

The study’s findings come amid growing concerns about the broader impact of social media on mental health, particularly among young users who are both more likely to use these platforms and more vulnerable to misinformation. Health advocacy groups have called for stricter content moderation policies, especially for health-related topics.

As social media continues to serve as an information gateway for millions seeking health advice, the research underscores the critical need for platforms, healthcare providers, and users alike to prioritize accuracy and evidence-based content when discussing mental health and neurodivergent conditions online.


17 Comments

  1. James Thompson

    While social media can be a valuable tool for connection and information-sharing, the risks of unchecked health misinformation are clear. Platforms need to work closely with experts to develop robust moderation policies and user education initiatives.

  2. Olivia Moore

    This study underscores the urgent need for a coordinated, multi-stakeholder approach to addressing the spread of online health misinformation. Policymakers, tech companies, educators, and healthcare providers must collaborate to find solutions.

    • Patricia Hernandez

      Agreed. Tackling this complex issue will require a concerted effort from all parties invested in public health and digital literacy.

  3. Oliver Smith

    This study highlights the urgent need for greater digital media literacy education. Empowering users to critically evaluate online health content could go a long way in combating the misinformation crisis.

    • Robert Taylor

      Agreed. Teaching people, especially young people, to spot red flags and verify information sources is crucial.

  4. Elijah Moore

    This study highlights the critical need for improved content moderation and user education on social media platforms. Responsible stewardship of digital spaces is essential to protect vulnerable communities from the risks of health misinformation.

  5. Patricia Johnson

    While social media has democratized access to information, the proliferation of health misinformation on these platforms poses a serious threat to public well-being. Urgent action is needed to address this growing crisis.

    • Jennifer Williams

      Absolutely. The stakes are too high to allow the spread of harmful pseudoscience to continue unchecked.

  6. Ava Jackson

    While social media has democratized access to information, it has also enabled the rapid spread of misinformation. Robust policies and enforcement are needed to ensure platforms prioritize factual, evidence-based content on sensitive health topics.

  7. Liam Garcia

    The high prevalence of misinformation on TikTok is particularly alarming, given the platform’s popularity with younger users. Platforms must prioritize user safety and digital literacy to address this public health issue.

    • William T. Jones

      Absolutely. TikTok should take immediate steps to improve content moderation and direct users to reliable mental health resources.

  8. Linda Miller

    Misinformation on mental health conditions can have serious consequences, leading to harmful behaviors and discouraging people from seeking professional help. Platforms must take this threat seriously and implement comprehensive solutions.

  9. Jennifer Johnson

    The high prevalence of autism and ADHD misinformation is particularly concerning, as these are often misunderstood conditions. Increased education and access to credible resources could make a real difference.

    • Ava Q. Taylor

      Absolutely. Providing authoritative information and destigmatizing these conditions should be a top priority.

  10. Isabella Thompson

    The high levels of misinformation around autism and ADHD are especially troubling, as these are often misunderstood conditions. Increased public awareness and access to credible resources could make a real difference.

  11. Patricia Hernandez

    This is a concerning trend. Social media platforms need to do more to combat the spread of health misinformation, especially around sensitive topics like mental health and neurodevelopmental conditions. Fact-checking and content moderation will be crucial.

    • James Rodriguez

      Agreed. Vulnerable populations deserve access to accurate, science-based information from trusted sources, not harmful pseudoscience.


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.