
New Mexico prosecutors are pushing for significant changes to Meta’s social media platforms as the landmark trial against the tech giant enters its second phase. Opening statements begin Monday in the three-week bench trial to determine whether Meta’s platforms constitute a public nuisance under state law.

The trial follows a first phase in which a jury found that Meta knowingly harmed children’s mental health and concealed information about child sexual exploitation on its platforms, ordering the company to pay $375 million in civil penalties.

New Mexico Attorney General Raúl Torrez and his team are now asking a judge to mandate fundamental changes to Meta’s business practices, targeting the core features that prosecutors argue contribute to social media addiction and child safety concerns. These include redesigning recommendation algorithms that prioritize user engagement, eliminating features like “infinite scroll” and constant push notifications, and strengthening age verification systems.

“The fact that we’re having a trial on nuisance is itself a remarkable outcome,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. “That theory is not well accepted as applied to the internet, and that theory doesn’t really fit the internet.”

The case represents a potential turning point in how technology platforms are regulated. According to Torrez, the jury verdict has already weakened the shield of immunity tech companies have enjoyed under Section 230 of the U.S. Communications Decency Act, which has protected platforms from liability for user-generated content for nearly three decades.

State prosecutors are seeking multiple safeguards, including requiring child accounts to have an associated parent or guardian and appointing a court-supervised child safety monitor to track improvements. The suit also aims to implement default privacy settings designed to prevent exploitation of minors.

Meta has vowed to appeal the initial jury verdict and issued stark warnings about potential consequences if forced to comply with the state’s demands. Company representatives have suggested Meta might eliminate Instagram and Facebook services in New Mexico entirely if mandated to implement what they describe as impractical changes.

“The state’s proposed mandates infringe on parental rights and stifle free expression for all New Mexicans,” Meta stated last week, invoking free speech protections that have traditionally shielded social media companies.

The company plans to argue that many of the state’s demands are redundant with existing safety measures. Meta executives maintain they continuously improve child safety features and address concerns about compulsive use. They also contend their platforms are being unfairly singled out among hundreds of apps used by teenagers, potentially leaving children vulnerable on platforms with fewer protections.

This New Mexico case stands apart from dozens of similar lawsuits filed by state attorneys general across the country. Most other cases are pursuing remedies in federal court, giving this state-level proceeding outsized importance. Torrez believes the case has potential to “change the paradigm of how this company does business, but also how Big Tech generally is expected to do business going forward.”

Legal experts note the proceedings venture into uncertain territory. Goldman pointed out that court-ordered age verification mandates, for example, lack clear Supreme Court precedent. “In practice a court order saying that Facebook had to impose age authentication would have no Supreme Court textual support,” he said. “The Supreme Court might bless it. We don’t know.”

The initial phase of the trial featured six weeks of testimony from a diverse array of witnesses, including teachers, psychiatric experts, state investigators, Meta executives, and former employees who left the company over concerns about its practices.

The case coincides with mounting public scrutiny of social media’s impact on youth. A separate Los Angeles jury recently found both Meta and YouTube liable for harms to children, lending further weight to longstanding concerns about the risks social media platforms pose to young users.

As the trial unfolds over the next three weeks, its outcome could reshape how social media companies operate and establish new precedents for holding technology platforms accountable for their societal impacts.


