Meta Executives’ Video Depositions Take Center Stage in New Mexico Trial
Prosecutors in New Mexico on Tuesday played previously unreleased video depositions of top Meta executives, seeking to bolster allegations that the social media giant concealed information about the harmful effects its platforms have on children.
The depositions of Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri represent crucial elements in the state’s legal action against Meta, which owns Facebook, Instagram, and WhatsApp. New Mexico officials allege the company violated state consumer protection laws by failing to adequately disclose dangers associated with its platforms.
At the heart of the case are accusations that Meta has not properly addressed or disclosed risks of social media addiction and child sexual exploitation on its platforms. The prosecution’s strategy hinges on demonstrating a pattern of corporate decision-making that prioritized growth and engagement over user safety, particularly for younger users.
Meta’s defense counsel Kevin Huff previously rejected these claims during opening statements on February 9, highlighting the company’s content moderation efforts and arguing that Meta does inform users about potential risks associated with its platforms.
In Tuesday’s proceedings, jurors watched Mosseri face a barrage of questions about Meta’s approach to user safety, profit motives, and potentially harmful platform features. Prosecutors particularly focused on policies affecting young users that might contribute to sleep disruption, unwanted adult interactions, and negative self-image issues related to beauty filters.
When repeatedly asked if Instagram should do everything possible to keep teens safe, Mosseri offered a measured response. “I think we should do what we can,” he stated, adding, “I think that there’s over 2 billion people on Instagram, which means there are millions of teens on Instagram. So when you say everything, I want to be clear that we are a large enough platform that sometimes some things will — so for instance, problematic content will be seen.”
Mosseri also asserted that at Meta “we will prioritize safety over profits.” Prosecutors quickly juxtaposed this claim against internal company documents, including audits, emails, and messages about proposed features that could have reduced compulsive Instagram use among teenagers or mitigated harmful social comparison but were not always implemented.
When questioned about Instagram’s decision to continue recommending teen accounts to adults despite concerns about child exploitation, Mosseri described the company’s belief in “proportional risk mitigation.” He explained, “We carved out a subset of adults that we thought might be more likely to be problematic. We basically tried to identify a subset of adults that might be risky and then remove them from… accounts you should follow.”
The New Mexico trial runs parallel to a similar case in Los Angeles, with both potentially setting precedents for thousands of comparable lawsuits against social media companies nationwide. These cases represent mounting legal pressure on tech platforms regarding their responsibilities to younger users.
Zuckerberg testified last month in the Los Angeles case about youth Instagram usage and has previously faced congressional questioning about youth safety on Meta’s platforms. During congressional testimony earlier this year, he apologized to families who believed social media contributed to personal tragedies but stopped short of accepting direct responsibility.
Notably, Mosseri maintained his position that social media platforms are not clinically addictive, telling the New Mexico court by deposition, “I’m not a scientist, but I don’t believe the latest science suggests that social media platforms are addictive.”
The ongoing trial highlights growing concerns about the impact of social media on youth mental health and safety, reflecting broader societal debates about technology companies’ responsibilities and the balance between connectivity benefits and potential harms in the digital age.
15 Comments
While tech innovation has brought many benefits, the potential harms to vulnerable users, especially children, need to be taken seriously. This trial may set an important precedent for the industry.
The allegations that Meta prioritized engagement over user safety, especially for children, are concerning. Transparency and accountability should be paramount for tech platforms with such outsized influence on society.
Agreed. Platforms need to put safeguards in place and be upfront about potential risks, rather than chasing growth at all costs.
This case highlights the complexity of regulating tech companies and balancing innovation with public interest. I’m curious to see how the court will view the evidence presented by the prosecution.
You raise a good point. Policymakers are still grappling with how to effectively oversee the tech industry and protect vulnerable users.
The depositions of Zuckerberg and Mosseri will be closely watched. Their testimony could reveal important insights into Meta’s internal decision-making and the company’s approach to managing potential risks.
While I understand the need for tech companies to grow their user base, the alleged failure to disclose risks to children is troubling. Hopefully this trial will lead to more robust transparency and accountability measures.
The depositions of top Meta executives could provide crucial insight into the company’s decision-making process and priorities. Curious to see if the prosecution can demonstrate a pattern of negligence or willful disregard for user safety.
Agreed. The video evidence could be a turning point in how the public and policymakers view Meta’s conduct.
It’s good to see regulators taking a closer look at the practices of major tech companies. Safeguarding user privacy and wellbeing should be a top priority, not an afterthought.
Absolutely. Responsible innovation requires a commitment to transparency and putting user interests first.
This trial underscores the need for stronger oversight and accountability measures in the tech industry. The health and safety of users, especially minors, should be the top consideration, not just growth metrics.
Well said. Responsible tech leadership requires a fundamental shift in priorities and a genuine commitment to protecting vulnerable populations.
This case highlights the difficult balancing act tech companies face between growth and user safety, especially when it comes to vulnerable populations like children. Curious to see how the depositions play out and what accountability measures, if any, emerge.
You’re right, this is a complex issue with no easy solutions. The impact of social media on youth mental health is a growing concern that companies need to address proactively.