The first jury verdict in a series of social media child safety trials has dealt a significant blow to Meta, as a New Mexico jury found the company’s platforms harmful to children’s mental health. The verdict, delivered Tuesday, imposed a $375 million penalty on the social media giant, marking a watershed moment in the ongoing debate about tech companies’ responsibility toward young users.
While the fine represents only a fraction of Meta’s $201 billion annual revenue, the verdict signals a profound shift in public perception regarding social media companies and their obligation to protect young people online. For years, these companies have vehemently denied allegations that their platforms harm children through deliberate design choices that foster addiction and fail to shield minors from sexual predators and dangerous content.
The New Mexico case represents just one of several legal challenges Meta and other social media companies face this year, with lawsuits coming from various quarters—school districts, local and state governments, federal agencies, and thousands of families. These courtroom confrontations culminate after years of scrutiny over child safety practices and platform design choices allegedly contributing to depression, eating disorders, and even suicide among young users.
New Mexico Attorney General Raúl Torrez, who sued Meta in 2023, built a compelling case by creating child personas on the company’s platforms and documenting the sexual solicitations they received, along with Meta’s inadequate responses. Following a nearly seven-week trial, jurors determined that Meta violated the state’s Unfair Practices Act by making false or misleading statements about platform safety and engaging in “unconscionable” trade practices that exploited children’s vulnerabilities.
“We disagree with the verdict and will appeal,” Meta said in a statement, adding that the company “work[s] hard to keep people safe” and remains “confident in our record of protecting teens online.” However, prosecutors successfully argued that Meta prioritized profits over safety, engineering algorithms designed to maximize user engagement at the expense of child protection.
The legal outcomes could fundamentally challenge two critical protections tech companies have relied upon: First Amendment shields and Section 230 of the Communications Decency Act, which has historically protected platforms from liability for user-posted content. Beyond financial penalties, these cases might force significant operational changes that could impact user engagement and advertising revenue.
In a parallel California case, jurors are still deliberating in a landmark trial where a 20-year-old plaintiff identified as “KGM” claims platforms like Meta and YouTube deliberately designed addictive features targeting young users. TikTok and Snap strategically settled before the trial commenced. This case serves as a “bellwether” trial—essentially a test case that could determine how thousands of similar lawsuits proceed.
“This is a monumental inflection point in social media,” said Matthew Bergman from the Social Media Victims Law Center. “When we started doing this four years ago, no one said we’d ever get to trial. And here we are trying our case in front of a fair and impartial jury.”
School districts are also taking action, with a trial scheduled for this summer in Oakland, California. Attorney Jayne Conroy, who previously worked on cases holding pharmaceutical companies accountable for the opioid epidemic, sees striking similarities between that crisis and current social media litigation.
“With the social media case, we’re focused primarily on children and their developing brains and how addiction is such a threat to their well-being,” Conroy explained. “The medical science is not really all that different, surprisingly, from an opioid or a heroin addiction. We are all talking about the dopamine reaction.”
Social media companies continue disputing that their products are addictive. During testimony in the Los Angeles trial, Meta CEO Mark Zuckerberg maintained that existing scientific evidence hasn’t conclusively proven social media causes mental health harms. While social media addiction isn’t recognized as an official disorder in psychiatric diagnostic manuals, companies face intensifying criticism from parents, educators, researchers, and lawmakers.
Analyst Minda Smiley notes, “While Meta has doubled down in this area to address mounting concerns by rolling out safety features, several recent reports suggest that the company continues to aggressively prioritize teens as a user base and doesn’t always adhere to its own rules.”
The resolution of these cases could take years amid appeals and settlement discussions. Unlike Europe and Australia, where tech regulation has advanced significantly, the United States continues to move at what critics describe as a glacial pace in establishing comprehensive guardrails for social media companies.
12 Comments
While the financial penalty is relatively small for a company the size of Meta, the broader implications of this verdict could be significant. It signals that the tide is turning and the public is no longer willing to accept social media’s negligence when it comes to child safety.
Agreed. This case sets an important precedent and could embolden more legal challenges against tech firms that fail to protect minors on their platforms. It’s a wake-up call that they can no longer ignore the harms their products can cause.
While the $375 million fine is a drop in the bucket for Meta, the symbolic significance of this verdict should not be overlooked. It demonstrates that juries are willing to hold social media giants accountable for the harm their platforms can cause.
Absolutely. This sets an important precedent that could open the floodgates for more legal action against tech companies over child safety issues on their platforms.
This is a complex issue with valid concerns on both sides, but at the end of the day, the wellbeing of children has to be the top priority. Social media companies need to take more responsibility for the mental health impacts of their platforms, even if it cuts into profits.
Absolutely. While balancing innovation, growth, and user safety is challenging, these companies have a moral and ethical obligation to do more to protect vulnerable young users. This verdict is an important step in that direction.
This ruling is an important step, but there is still a long way to go in holding social media giants accountable. Lawmakers and regulators need to step up and enact stronger protections for minors on these platforms.
Absolutely right. Stronger policies and enforcement are needed to ensure tech companies put safeguards in place and prioritize the health and safety of young users over profits and growth.
It’s good to see the courts taking these issues seriously and pushing back against the tech industry’s long-standing denials about the harms of social media. Protecting young people should be the top priority, not maximizing profits.
Couldn’t agree more. This verdict shows that the tide is turning and social media companies will increasingly face consequences for prioritizing engagement over user wellbeing, especially when it comes to vulnerable populations like children.
This is an important verdict that could have far-reaching implications for social media companies. It signals a growing legal and public reckoning over the harms their platforms can pose to young users. Protecting children’s mental health online should be a top priority.
Agreed. Tech firms can no longer ignore the negative impacts of their products on vulnerable youth. Hopefully this spurs meaningful reforms to improve child safety and well-being on social media.