Court Documents Reveal What Instagram Knew About Teen Exposure to Harmful Content
Newly unsealed court documents are raising serious questions about what Instagram knew about harmful content on its platform, and when, particularly regarding teens’ exposure to posts related to suicide and eating disorders.
The documents were revealed as part of ongoing litigation in In re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, a massive federal case involving several technology companies. Portions of the filings were recently made public, offering a rare look inside internal conversations at Meta, the parent company of Instagram.
According to a 56-page opposition brief filed in the case, internal materials show a significant gap between Instagram’s public safety commitments and internal discussions about how harmful content appeared on the platform.
The newly released documents reference internal data indicating hundreds of thousands of mentions of suicide on Instagram and acknowledge that certain types of harmful content had a disproportionately large teen audience.
In one internal PowerPoint presentation cited in the filing and posted on CourtListener, Instagram employees wrote: “Teens’ behavior on IG suggests a need for more support. We know that SSI (suicidal ideation) and ED (eating disorders) have a significantly disproportionate large teen audience.”
The presentation also noted that parents had asked the company for stronger tools to block harmful content from reaching teenagers and suggested that competitors like TikTok were “seen as offering more safety measures.”
These internal findings stand in stark contrast to public statements made by Instagram leadership. In 2019, Instagram head Adam Mosseri announced that the platform would block graphic self-harm content from appearing in searches, hashtags, and recommendations as part of broader safety improvements for young users.
Internal communications surfaced in the filings appear to show employees discussing the potential public relations fallout from media reporting on the issue. According to an internal message referenced in court records, employees discussed how harmful content surfaced in Instagram searches after a reporter from The Telegraph contacted the company in September 2020.
“On search we’re exposed with nowhere to hide,” read one internal comment cited in the filing. The email discussion also reportedly weighed whether restricting certain content in search results could conflict with other product priorities such as search functionality or shopping features.
The revelations come as a landmark civil trial unfolds in Los Angeles, where social media companies face allegations that their platforms were intentionally designed to be addictive to children and teenagers. The lawsuit centers on claims brought by a 20-year-old woman named Kaley and her mother, who argue that several social media companies designed their platforms in ways that fueled compulsive use and contributed to serious mental health struggles, including an eating disorder, anxiety, and depression.
The companies named in the broader litigation include Meta (Instagram and Facebook), YouTube, Snap, and TikTok. While Snap and TikTok have settled some claims outside of court, Meta and YouTube continue to contest the allegations.
Tech industry leaders have been called to testify during the proceedings. Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri have both appeared in court, defending the company’s efforts to address teen safety on its platforms.
During testimony reported by CNN, Mosseri acknowledged that extremely heavy social media use could be problematic for teens. He stated that scrolling for as much as 16 hours per day could be “problematic,” but argued it should not necessarily be considered “clinically addictive.”
Meta and other tech companies have repeatedly argued that scientific research has not conclusively proven that social media causes addiction or mental health disorders, though critics maintain that platform design can intensify harmful behaviors.
The outcome of this trial could reshape how social media platforms are regulated regarding younger users. Some companies have already begun implementing new safety measures, including age-based content filtering systems similar to movie ratings, aimed at limiting what types of posts can be recommended to minors.
The central question now before the court remains the same one raised by many industry critics: Did social media companies fail to fix harmful algorithms — or did they choose not to?