The European Union has formally accused Meta of failing to implement adequate measures to prevent underage users from accessing Facebook and Instagram, a direct violation of the bloc’s stringent Digital Services Act (DSA).

In a statement released Wednesday, EU regulators said Meta Platforms lacks effective safeguards to stop children under 13 from creating accounts on its platforms. The European Commission, the EU’s executive branch, also criticized the company for insufficient efforts to identify and remove accounts created by underage users.

“The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children,” said Henna Virkkunen, an executive vice president at the European Commission.

According to the preliminary findings, Meta has fallen short in assessing risks that children under 13 face when exposed to “age-inappropriate experiences” on its social media platforms. This represents a serious concern for EU regulators, who have made child safety online a priority under the new digital regulations.

Meta has disputed the EU’s assessment, defending its current practices. “We have measures in place to detect and remove accounts for anyone younger than 13,” the company stated. Meta also highlighted the broader industry challenge of age verification, saying it “requires an industry-wide solution.” The company promised to share additional measures it plans to implement in the coming week.

The investigation, launched earlier this year, is part of the EU’s enforcement of the Digital Services Act, a comprehensive regulatory framework that came into full effect for large platforms in 2023. The DSA places stringent obligations on major tech companies to protect users from harmful content and ensure platform transparency.

Meta now has the opportunity to respond to these preliminary findings before the Commission makes its final decision. The stakes are high, as violations of the DSA can result in substantial financial penalties—up to 6% of a company’s annual global revenue. For Meta, whose revenue exceeded $134 billion in 2023, potential fines could reach billions of dollars.

This case represents one of the first major enforcement actions under the DSA, signaling the EU’s determination to hold tech giants accountable for their content moderation practices. The focus on protecting minors is particularly significant as concerns about social media’s impact on young people’s mental health have intensified worldwide.

Child protection advocates have long criticized social media platforms for what they perceive as insufficient age verification systems. Many platforms rely largely on self-reporting, making it relatively simple for underage users to create accounts by entering false birth dates.

The technology industry has struggled to develop effective age verification solutions that balance privacy concerns with child safety. Biometric verification, ID checks, and other more stringent methods raise questions about data protection, particularly for younger users.

Meta’s challenge is further complicated by its massive global user base—Facebook and Instagram together serve billions of users worldwide—making comprehensive enforcement of age restrictions technically complex and resource-intensive.

The EU’s action against Meta could set a precedent for how other jurisdictions regulate social media platforms with respect to the protection of minors. Several countries, including the United Kingdom and Australia, have been developing or implementing similar digital safety regulations with specific provisions for protecting children online.

As this case progresses, it will likely shape how major tech platforms approach age verification and minor protection globally, potentially spurring industry-wide innovation in age assurance technologies.

© 2026 Disinformation Commission LLC. All rights reserved.