
Social media giants are facing an unprecedented legal reckoning as trials begin nationwide over allegations they have harmed children’s mental health. After years of denying responsibility, companies like Meta and TikTok must now defend themselves in courtrooms across the country, including before a jury for the first time.

The legal challenges come from multiple fronts: school districts, local and state governments, federal authorities, and thousands of families. Two significant trials are currently underway in Los Angeles and New Mexico, with more scheduled to follow.

Legal experts are drawing parallels to landmark litigation against the tobacco and opioid industries, with plaintiffs hoping for similar outcomes that resulted in massive settlements and fundamental changes to those businesses. The cases could challenge tech companies’ longstanding legal protections, including First Amendment shields and Section 230 of the Communications Decency Act, which has historically protected platforms from liability for user-posted content.

In Los Angeles, a bellwether trial centers on a 20-year-old identified only as “KGM,” whose case could set precedent for thousands of similar lawsuits. This trial focuses primarily on addiction, with plaintiffs alleging social media platforms deliberately designed features to make their products addictive to children.

“This is a monumental inflection point in social media,” said Matthew Bergman of the Social Media Victims Law Center, which represents over 1,000 plaintiffs in similar lawsuits. “When we started doing this four years ago no one said we’d ever get to trial. And here we are trying our case in front of a fair and impartial jury.”

Meta CEO Mark Zuckerberg testified in the Los Angeles trial on Wednesday, largely adhering to familiar talking points about the company’s age restriction policies. When plaintiff’s attorney Mark Lanier asked if people tend to use something more if it’s addictive, Zuckerberg responded, “I’m not sure what to say to that. I don’t think that applies here.”

Meanwhile, in New Mexico, Attorney General Raúl Torrez is pursuing a different angle against Meta, focusing on sexual exploitation of minors. Torrez’s team built their case by creating profiles posing as children on Meta’s platforms and documenting sexual solicitations they received and the company’s responses.

The New Mexico trial began in early February, with prosecuting attorney Donald Migliori arguing that Meta prioritized growth and engagement over youth safety. “Meta clearly knew that youth safety was not its corporate priority,” Migliori told the jury. Meta’s attorney Kevin Huff countered by highlighting the company’s content moderation efforts while acknowledging some harmful content inevitably slips through.

A third significant legal battle is scheduled for this summer in Oakland, California, where six public school districts will serve as bellwethers in a multidistrict litigation against social media companies. Attorney Jayne Conroy, who previously represented plaintiffs against pharmaceutical companies in opioid cases, sees strong parallels between the two fights.

“With the social media case, we’re focused primarily on children and their developing brains and how addiction is such a threat to their well-being,” Conroy explained. She noted the neurological similarities, adding, “The medical science is not really all that different, surprisingly, from an opioid or a heroin addiction. We are all talking about the dopamine reaction.”

Social media companies continue to dispute that their products are addictive. During questioning in the Los Angeles trial, Zuckerberg maintained that existing scientific research has not proven social media causes mental health harms. The psychiatric community has not officially recognized social media addiction as a disorder in the Diagnostic and Statistical Manual of Mental Disorders.

However, the companies face intensifying criticism from parents, educators, researchers, and lawmakers concerned about social media’s effects on youth mental health.

“While Meta has doubled down in this area to address mounting concerns by rolling out safety features, several recent reports suggest that the company continues to aggressively prioritize teens as a user base and doesn’t always adhere to its own rules,” noted Emarketer analyst Minda Smiley.

With inevitable appeals and potential settlement discussions, these cases could take years to resolve. Unlike Europe and Australia, where tech regulation has advanced more rapidly, the United States has moved much more slowly on implementing comprehensive guardrails for social media companies.

“Parents, education, and other stakeholders are increasingly hoping lawmakers will do more,” Smiley said. “While there is momentum at the state and federal level, Big Tech lobbying, enforcement challenges, and lawmaker disagreements over how to best regulate social media have slowed meaningful progress.”


14 Comments

  1. Oliver Johnson

    Challenging the legal protections that have shielded tech companies from liability is crucial. These platforms have been allowed to operate with impunity for far too long. I’m curious to see how the courts navigate these complex issues.

    • Agreed. Their immunity from liability has been a major roadblock to addressing the harms caused by social media. These lawsuits could set important new precedents.

  2. I hope these cases lead to meaningful reforms, not just financial settlements. Systemic changes are needed to address the root causes of social media’s mental health harms, such as algorithm design, targeted advertising, and content moderation failures. Anything less would be a missed opportunity.

    • Oliver Williams

      Agreed. Substantial changes to business practices and platform design must be part of the solution, not just payouts. These companies need to fundamentally rethink their priorities and accountability to users.

  3. While I’m glad to see these companies finally facing consequences, I worry the legal process will be long and drawn out. Justice needs to be swift to protect vulnerable youth. I hope the courts move quickly and decisively on these cases.

    • That’s a valid concern. The tobacco and opioid lawsuits took years to resolve. Expediting the process should be a priority to get meaningful reforms in place as soon as possible.

  4. This is a significant legal challenge for social media companies. Their platforms have had serious mental health impacts on young users, and they can no longer hide behind outdated liability shields. It will be interesting to see how these landmark cases play out and if they lead to real accountability and change.

    • Elizabeth I. Lopez

      Absolutely. For too long, these companies have prioritized profits over the wellbeing of their users, especially vulnerable youth. Hopefully the courts will finally force them to take responsibility.

  5. This is an important issue that deserves serious attention. As a parent, I’m very concerned about the mental health impacts of social media on children and teens. I hope these lawsuits lead to real changes that protect young users.

    • Me too. Social media has become a public health crisis, and these companies need to be held accountable. Prioritizing profits over child wellbeing is unacceptable.

  6. Lucas Y. Thompson

    The parallels to the tobacco and opioid lawsuits are apt. Social media has become an addiction for many, with devastating consequences. I’m glad to see plaintiffs taking these companies to task and pushing for meaningful reforms.

    • James C. Thompson

      Agreed. These platforms have manipulated algorithms and design features to keep people hooked, much like the tactics used by Big Tobacco and Big Pharma. Accountability is long overdue.

  7. These lawsuits raise fundamental questions about the role and responsibilities of social media platforms. Do they deserve the same free speech protections as traditional media? Should they be subject to stronger regulations around content moderation and algorithm design? I’m curious to see how the courts address these complex issues.

    • Those are great points. The legal frameworks that have governed these platforms are clearly outdated. This could be a pivotal moment in reassessing their rights and obligations in the digital age.



Company

Disinformation Commission LLC
30 N Gould ST STE R
Sheridan, WY 82801
USA

© 2026 Disinformation Commission LLC. All rights reserved.