More than half of videos about Attention Deficit Hyperactivity Disorder (ADHD) shared on TikTok contain misinformation, according to a comprehensive new study from researchers at the University of East Anglia.

The research team analyzed over 5,000 social media posts across major platforms including TikTok, YouTube, Facebook, Instagram, and X (formerly Twitter), finding alarming rates of inaccurate mental health information, particularly on TikTok.

The study, published in The Journal of Social Media Research, revealed that 52 percent of ADHD-related videos and 41 percent of autism videos on TikTok contained incorrect or misleading information. This represents the first multi-platform examination of mental health and neurodivergence content accuracy across social media ecosystems.

“Our work uncovered misinformation rates on social media as high as 56 percent,” said Dr. Eleanor Chatburn from UEA’s Norwich Medical School. “This highlights how easily engaging videos can spread widely online, even when the information isn’t always accurate.”

The research found that TikTok consistently demonstrated higher rates of misinformation than other platforms. YouTube averaged 22 percent misinformation in mental health content, while Facebook performed better, with just under 15 percent of posts containing inaccurate information.

Posts about neurodivergent conditions like autism and ADHD were particularly problematic across all platforms, containing higher levels of misinformation than content about many other mental health topics examined, which included schizophrenia, bipolar disorder, depression, eating disorders, OCD, anxiety, and phobias.

The consequences of this widespread misinformation extend beyond simple confusion, according to the researchers. Dr. Chatburn warned of several potential harms: “As well as leading to misunderstanding of serious conditions and pathologising ordinary behaviour, misinformation can also lead to delayed diagnosis for people that actually do need help.”

She further explained that false information can perpetuate stigma, discourage people from seeking necessary support, and potentially lead individuals toward unproven treatments rather than evidence-based care.

The study found a notable bright spot: content created by healthcare professionals was consistently more accurate across all platforms. However, these professional voices represent only a small fraction of the mental health content currently circulating on social media.

In response to the findings, TikTok disputed the research, calling it “flawed” and stating that it “relies on outdated research about multiple platforms.” A spokesperson for the company said, “The facts are that we remove harmful health misinformation and provide access to reliable information from the WHO, so that our community can express themselves about what matters to them and find support.”

TikTok also highlighted its UK Clinician Creator Network, a group of 19 NHS-qualified clinicians who share medical expertise with over 2.2 million followers on the platform.

The University of East Anglia research team is calling for several interventions to address the problem. They recommend that health organizations and clinical professionals create and actively promote evidence-based content to counter misinformation. They also suggest improved content moderation by platforms, standardized assessment tools for evaluating online mental health information, and clearer definitions of what constitutes misinformation.

The study comes amid growing concerns about the role of social media in shaping public understanding of mental health conditions, particularly among young people who increasingly turn to these platforms for information. With rising rates of mental health diagnosis and awareness, ensuring accurate information reaches vulnerable populations has become increasingly critical for public health officials and platform governance alike.

YouTube, Facebook, Instagram, and X were approached for comment regarding the study findings but had not provided responses at the time of reporting.


A professional organisation dedicated to combating disinformation through cutting-edge research, advanced monitoring tools, and coordinated response strategies.

Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.