New Mexico Attorney General Challenges Meta’s “PG-13” Teen Safety Claims
New Mexico Attorney General Raúl Torrez has issued a sharp rebuke to Meta executives over what he describes as misleading marketing of the company’s teen safety features on Instagram. In a strongly worded letter addressed to Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri on December 21, 2025, Torrez accused the tech giant of creating a “false sense of security” for parents through its PG-13 content moderation system.
The controversy centers on Meta’s October 2025 announcement that Instagram accounts for users under 18 would be governed by standards similar to PG-13 movie ratings. Under this system, Meta claimed teens would be protected from content featuring strong language, dangerous stunts, and substance use such as marijuana, while still allowing some suggestive elements comparable to those permitted in PG-13 films.
According to Torrez, this marketing approach deliberately misrepresents the level of oversight on the platform. “Meta’s misappropriation of the PG-13 label suggests a level of oversight that does not exist on the platform — making this announcement a dangerous promotional stunt,” Torrez stated in his letter.
The Motion Picture Association (MPA), which officially administers the film rating system, has already demanded that Meta discontinue using the PG-13 designation, describing it as “literally false, deceptive, and highly misleading.” Unlike Meta’s self-regulated content moderation, the MPA’s rating system involves independent review boards and standardized criteria.
Torrez’s demands extend beyond terminology changes. His letter calls for Meta to implement “meaningful safety protections,” including effective age verification systems, more aggressive removal of predators and harmful actors, algorithmic changes to prevent the promotion of dangerous content, and measures to address risks created by end-to-end encryption features.
This challenge comes amid broader legal battles between Meta and various state authorities. Torrez’s office filed a civil lawsuit against Meta in 2023, alleging the company has failed to adequately protect children from sexual exploitation, human trafficking, and mental health harms on its platforms. That case survived Meta’s attempts to dismiss it and is scheduled for trial in February 2026.
The New Mexico lawsuit is part of a growing legal pressure campaign against social media companies over youth safety. A separate multistate lawsuit backed by more than 40 attorneys general alleges Meta knowingly harms young users’ mental health by designing addictive features and algorithms that can lead to depression, anxiety, and body image issues among teenagers.
Digital safety experts note that self-regulated content moderation systems often lack transparency and consistent enforcement. “When a company borrows terminology from established rating systems, it creates confusion about the actual standards being applied,” said Dr. Eleanor Harding, director of the Digital Youth Safety Institute, who is not involved in the case.
Meta has defended its Teen Accounts as a significant safety improvement, highlighting features like default private accounts, content filters for sensitive topics, and parental supervision tools. Company representatives maintain they consulted extensively with parents and safety experts while developing the guidelines, and acknowledge that “no system is perfect.”
Torrez, who previously worked as a prosecutor specializing in internet crimes against children, has made platform accountability a cornerstone of his tenure as Attorney General. “These companies have prioritized profit over the safety of our children for too long,” he stated at a press conference earlier this year.
Industry observers suggest this confrontation reflects growing tensions between tech platforms’ self-regulation efforts and increasing demands for external oversight. With federal legislation on child online safety stalled in Congress, state-level actions like Torrez’s may become increasingly important in shaping social media safety standards.
The letter’s full text is available on the New Mexico Department of Justice website.


9 Comments
This highlights the ongoing challenges social media companies face in balancing teen user protections with their business models. The PG-13 labeling appears to be more about marketing than reality based on the AG’s letter.
You raise a fair point. The profit motive can sometimes lead to questionable trade-offs when it comes to user safety, especially for vulnerable populations like teens.
Interesting development in the ongoing social media teen safety debate. Curious to see how Meta responds to the New Mexico AG’s claims that their PG-13 labeling is misleading and doesn’t accurately reflect the content moderation on Instagram.
Agreed, it’s a complex issue with valid concerns on both sides. Effective teen safety measures are crucial, but the implementation details matter.
The New Mexico AG seems to have valid points about Meta’s marketing claims not aligning with the actual content moderation on Instagram. As a parent, I’d want to understand the true level of oversight before feeling assured about teen safety.
Exactly, transparency around platform policies and enforcement is key. Parents need accurate information to make informed decisions about their kids’ social media use.
As someone who closely follows the commodities and mining space, I’m curious to see if this teen safety issue has any broader implications for Meta’s advertising revenue, which depends heavily on sectors like mining and energy.
That’s an interesting angle. Any regulatory actions or negative publicity around Meta’s practices could potentially impact their advertising relationships across industries.
Kudos to the New Mexico AG for taking a strong stance and pushing back on Meta’s claims. Responsible social media governance is essential, especially when it comes to protecting young users.