European Union regulators have launched a formal investigation into Snapchat over concerns that the platform fails to adequately protect children from online risks. The European Commission announced Thursday it is examining whether Snapchat complies with the bloc’s Digital Services Act (DSA), a comprehensive regulatory framework designed to safeguard internet users.
The Commission expressed serious concerns about Snapchat’s age verification systems, which require users to be at least 13 years old. Regulators suspect these measures are “insufficient” at keeping younger children off the platform. Additionally, they worry the platform is exposing teenagers to inappropriate content by failing to properly verify whether users are under 17 years old.
A particular concern is that adults could potentially pose as minors due to inadequate age verification protocols. The Commission suspects Snapchat isn’t doing enough to protect children from being contacted by “users with harmful intent, such as sexual exploitation or recruitment for criminal activities.”
“Snapchat appears to have overlooked the DSA’s high safety standards for all users,” said Henna Virkkunen, the Commission’s executive vice president for tech sovereignty, security and democracy.
In response, Snapchat stated it has “fully cooperated” with the Commission and is “engaging proactively, transparently and working in good faith” to meet the DSA’s safety standards. The company emphasized that user safety and well-being are top priorities, claiming its platform is designed with “privacy and safety built in from the start, including additional protection for teens.”
The investigation comes amid mounting global pressure on social media companies regarding youth protection. Just a day earlier, a California jury awarded millions in damages to a 20-year-old woman after finding that Meta and YouTube had designed their platforms to hook young users without regard for their well-being. Snapchat’s parent company, Snap Inc., was initially included in that lawsuit but settled for an undisclosed sum before trial.
In a separate case this week, a New Mexico jury imposed a $375 million penalty on Meta after determining the company knowingly harmed children’s mental health and concealed information about child sexual exploitation on its platforms.
The European Union has been increasingly active in regulating tech platforms. Earlier this year, the EU accused TikTok of breaching the DSA with “addictive design” features that lead to compulsive use by children. Facebook and Instagram have been under investigation since early 2024 over similar child protection concerns.
On the same day as the Snapchat announcement, Brussels also accused four major pornographic websites—Pornhub, Stripchat, XNXX, and XVideos—of failing to protect children from adult content following an investigation that began last year.
The DSA requires internet companies and online platforms to implement stronger user protections against harmful content and products or face significant penalties—up to 6% of annual revenue.
In preliminary findings regarding the pornography sites, regulators criticized the sites for allowing users, potentially including minors, to “self-declare” that they are over 18 by merely clicking a link. They deemed additional measures such as page blurring and warning labels insufficient, and called for more robust age verification tools.
XVideos pushed back against these findings, arguing that “adding age checks on four sites out of a million does nothing to prevent minors from accessing adult content” and would drive users to “less safe sites that are completely out of reach of regulators.”
Aylo, Pornhub’s parent company, defended its practices, stating its moderation and verification exceed legal requirements. “Our goal is to get age verification right,” a spokesperson said, noting that current website-level verification solutions often fail and raise serious data privacy concerns.
The pornographic sites now have an opportunity to formally respond before the Commission reaches its final decision, while the Snapchat investigation continues to gather evidence of potential DSA violations.
7 Comments
This investigation underscores the complexities of balancing innovation, user experience, and child safety online. It will be interesting to see how Snapchat responds and what solutions emerge from this process.
Protecting minors online is critical, especially for popular social media platforms like Snapchat. It’s good to see EU regulators taking this issue seriously and investigating Snapchat’s compliance with the DSA’s child protection standards.
Snapchat’s popularity with younger users makes this investigation particularly important. I hope they can work constructively with the EU to address the concerns and find ways to better protect vulnerable minors.
Absolutely. Social media platforms have a responsibility to put robust child safety measures in place, even if it means rethinking certain features or functionality.
While age verification can be challenging, platforms must do more to prevent minors from accessing inappropriate content or being exploited. Snapchat should work closely with regulators to strengthen its safeguards and build trust with users.
Agreed. Effective age verification is key, but platforms also need robust content moderation and reporting systems to protect young users.
The DSA sets a high bar for online platforms when it comes to safeguarding minors. This probe of Snapchat suggests regulators are serious about enforcing those standards across the industry.