
For years, parents, teenagers, pediatricians, educators and whistleblowers have sounded alarms about social media’s detrimental effects on young people’s mental health. Their concerns encompassed addiction, eating disorders, sexual exploitation, and suicide risk. This week, these warnings received unprecedented judicial validation in two separate landmark verdicts against major technology companies.

In a pivotal moment for tech accountability, juries in both Los Angeles and New Mexico delivered decisions against social media giants on Wednesday. The Los Angeles jury found Meta and YouTube liable for harming children using their platforms, while in New Mexico, a jury determined that Meta knowingly damaged children’s mental health and deliberately concealed evidence of child sexual exploitation occurring on its services.

The verdicts drew immediate celebration from tech watchdog organizations, families, and children’s advocates. “The era of Big Tech invincibility is over,” declared Sacha Haworth, executive director of The Tech Oversight Project. “After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years.”

These dual decisions represent a significant shift in public perception regarding technology companies, potentially triggering a wave of additional lawsuits and regulatory scrutiny. For years, social media platforms have maintained that any harms were unintentional byproducts of their services or the result of bad actors circumventing safety measures. They consistently downplayed research linking social media use to psychological damage.

During the Los Angeles trial, Meta CEO Mark Zuckerberg demonstrated this defensive stance when questioned about whether people use addictive platforms more frequently. “I’m not sure what to say to that. I don’t think that applies here,” Zuckerberg testified, illustrating the companies’ reluctance to acknowledge responsibility.

Both Meta and Google have indicated they disagree with the verdicts and are exploring legal options, including appeals. This resistance comes as the public increasingly demands accountability and substantive changes to platform operations.

Arturo Béjar, a former Meta engineering director who testified before Congress about Instagram’s harms in 2023, noted that jury trials help “level the playing field” against trillion-dollar companies. However, he emphasized that meaningful change will ultimately require regulatory intervention. “One thing that I saw working inside the company that effectively led to behavior change was when an attorney general or the FTC stepped in and required things of the company,” Béjar explained.

While both cases focused on child safety, they approached the issue from different angles. New Mexico’s lawsuit, filed by state Attorney General Raúl Torrez in 2023, built its case through undercover investigations. State officials posed as children on social media platforms and documented the sexual solicitations they received, as well as Meta’s inadequate response to these dangers. The jury ultimately found that Meta violated New Mexico’s consumer protection law.

The Los Angeles case involved a single plaintiff identified as KGM, who sued Meta, YouTube, TikTok, and Snap. TikTok and Snap reached settlements before trial. KGM’s case specifically targeted the platforms’ design features, arguing they were intentionally created to be addictive, particularly for younger users. This case serves as a bellwether trial among thousands of similar lawsuits, potentially setting the stage for broader settlements reminiscent of Big Tobacco and opioid litigation.

A crucial legal innovation in these cases was their focus on deliberate design choices and product liability, which allowed plaintiffs to circumvent Section 230 of the Communications Decency Act. This federal law has traditionally shielded internet companies from liability for user-posted content. Previous lawsuits focusing on content distribution often failed on these grounds.

“For the first time, courts have held social media platforms accountable for how their product design can harm users,” explained Nikolas Guggenberger, an assistant professor at the University of Houston Law Center. “This is a new legal territory that could reshape an industry long shielded by Section 230. Platforms will have to rethink their focus on engagement at any cost, which has outlived itself.”

Public opinion appears to be shifting dramatically against social media companies. A 2025 Pew Research Center poll found that 48% of teens believe social media harms people their age, up significantly from 32% in 2022.

As social media platforms face this reckoning, experts warn that artificial intelligence chatbots may become the next battleground in protecting young users. Sarah Kreps, director of Cornell University’s Tech Policy Institute, cautions about the cyclical nature of technological challenges: “You can ban today’s harm, but how do you know what tomorrow is going to bring? Whether it’s another social media app, AI or some other new technology, new things will crop up. And people will flock to those because where there’s demand you will see a supply come to meet that demand.”



© 2026 Disinformation Commission LLC. All rights reserved.