Recent Jury Verdicts Highlight Global Push for Social Media Regulation to Protect Children
Back-to-back jury decisions this week have validated long-held concerns about social media’s impact on young users, underscoring the absence of meaningful federal regulation in the United States while other countries forge ahead with protective measures.
On Wednesday, a Los Angeles jury found both Meta and YouTube liable for harms caused to children using their platforms. Just a day earlier, a New Mexico jury determined that Meta knowingly harmed children’s mental health and concealed information about child sexual exploitation on its services.
While these verdicts represent significant victories for children’s advocates, many experts argue that without comprehensive federal regulation, platforms like Instagram, YouTube, and TikTok are unlikely to implement meaningful changes. Advocates have placed their hopes on the Kids Online Safety Act, legislation designed to protect children from the harmful effects of social media, gaming sites, and other online platforms. Though the bill won Senate approval earlier this year, it remains stalled in the legislative process.
Meanwhile, countries around the world are taking more decisive action with a variety of approaches to restrict children’s online activities and protect them from potential harms.
Australia has emerged as a pioneer in this space, becoming the first nation to ban children under 16 from social media entirely. The 2024 law makes platforms including TikTok, Facebook, Snapchat, Reddit, X, and Instagram liable for fines up to 50 million Australian dollars ($34 million) if they fail to prevent underage users from holding accounts.
While many parents have welcomed Australia’s approach, some experts have raised concerns about the effectiveness of age verification methods and potential impacts on young people’s free speech, social connections, and privacy. Critics also worry that proving age might compromise privacy for all users, not just minors.
Brazil has taken a different approach with legislation that took effect this month. The new law requires children under 16 to link their social media accounts to a legal guardian to ensure supervision. It also prohibits platforms from using potentially addictive features such as infinite scroll and automatic video play. Additionally, digital services must implement effective age verification mechanisms beyond simple self-declaration.
Following Australia’s lead, Indonesia plans to ban social media for children under 16 beginning this month. The regulation will prevent children from having accounts on “high-risk” platforms including YouTube, TikTok, Facebook, Instagram, Threads, X, Bigo Live, and Roblox. Implementation will phase in gradually from March 28 until all platforms comply, making Indonesia the first Southeast Asian country to restrict children’s access to social media.
Malaysia has also tightened regulations: since January 2025, major social media and messaging platforms with at least 8 million users have been required to obtain a license. Licensed platforms must implement age verification, content safety measures, and transparency rules. The country also plans to ban children under 16 from social media this year.
European nations are following suit. Spain’s Prime Minister Pedro Sánchez announced in February that the country plans to limit social media access for children under 16 to shield young people from harmful online content. France has approved legislation banning social media for children under 15, set to take effect at the start of the next school year in September. The bill would also ban mobile phone use in high schools, building on previous legislation prohibiting phones in primary and middle schools.
Denmark has introduced similar legislation to ban social media access for users under 15, while the United Kingdom recently announced it would consider banning young teenagers from social media platforms as part of broader efforts to protect children from harmful content and excessive screen time.
As these international regulations take shape, the U.S. continues to rely primarily on litigation rather than legislation to address the growing concerns about social media’s impact on young users’ mental health and safety.