Roblox is bolstering its safety protocols with a new age verification system and age-based chat restrictions, a move that comes amid ongoing scrutiny over child protection on the popular gaming platform.
Starting in December, users who wish to send private messages will need to complete a video selfie process that estimates their age. The rollout will begin in Australia, New Zealand, and the Netherlands before expanding globally in early January.
“Between the ages of about five to 25, the system can accurately estimate a person’s age within one or two years,” explained Matt Kaufman, chief safety officer at Roblox. The technology, provided by verification company Persona, analyzes facial features to determine age ranges.
For users whose appearance falls outside typical age indicators, alternatives are available. “If you disagree with the estimate that comes back, then you can provide an ID or use parental consent in order to correct that,” Kaufman noted.
The platform emphasizes that video selfies are deleted after processing, and users are not required to verify their age to access Roblox—only to use the messaging features.
After verification, users will be placed into one of six age brackets: under nine, nine to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. The system will restrict chat to users in the same or similar age groups, creating a more segmented communication environment.
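The bracketing scheme above can be thought of as a simple lookup plus a pairing rule. The sketch below illustrates the idea in Python; the bracket boundaries come from the article, but the "same or adjacent bracket" chat rule is an assumption for illustration, since Roblox has not published its exact pairing logic.

```python
# Hypothetical sketch of age-bracket assignment and chat gating.
# Bracket boundaries follow the article; the pairing rule is an assumption.

BRACKETS = [
    ("under_9", 0, 8),
    ("9_to_12", 9, 12),
    ("13_to_15", 13, 15),
    ("16_to_17", 16, 17),
    ("18_to_20", 18, 20),
    ("21_plus", 21, 150),
]

def bracket_for(age: int) -> int:
    """Return the index of the bracket containing `age`."""
    for i, (_, lo, hi) in enumerate(BRACKETS):
        if lo <= age <= hi:
            return i
    raise ValueError(f"age out of range: {age}")

def can_chat(age_a: int, age_b: int) -> bool:
    """Assumed rule: allow chat only within the same or an adjacent bracket."""
    return abs(bracket_for(age_a) - bracket_for(age_b)) <= 1
```

Under this assumed rule, a 10-year-old and an 11-year-old could chat (same bracket), while a 10-year-old and a 30-year-old could not (four brackets apart).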
This enhanced verification approach represents a significant expansion of Roblox’s safety infrastructure. Currently, the platform prohibits children under 13 from chatting with other users outside of games without explicit parental permission. Unlike some competitors, Roblox does not encrypt private conversations, allowing the company to monitor and moderate all communications.
The timing of these changes is notable as Roblox faces mounting legal challenges over child safety concerns. In June 2024, Kentucky’s attorney general filed a lawsuit alleging the platform fails to adequately protect young users from predators and inappropriate content. Similar legal actions have emerged in other states, putting pressure on the company to strengthen its safeguards.
Roblox’s initiatives mirror a broader industry shift toward more robust age verification systems. Google’s YouTube has begun testing AI-based age verification that analyzes viewing patterns to identify minors, while Meta’s Instagram is piloting technology that aims to detect when younger users misrepresent their age.
These developments come amid a global regulatory push for stronger online protections for children. Various countries and states have implemented or proposed laws requiring platforms to verify users’ ages and provide age-appropriate experiences. The UK’s Online Safety Act, California’s Age-Appropriate Design Code, and similar legislation in Australia have all contributed to this regulatory landscape.
Industry experts remain divided on the effectiveness of facial age estimation tools. While proponents highlight their non-invasive nature and improving accuracy, critics question reliability across diverse demographics and raise privacy concerns about collecting biometric data from minors.
For Roblox, which attracts over 70 million daily active users—many of whom are children—balancing an engaging platform with comprehensive safety measures represents both a technical challenge and a business imperative. The company’s stock has fluctuated in recent years partly due to investor concerns about regulatory risks and safety issues.
As digital platforms increasingly face scrutiny over their impact on young users, Roblox’s age verification system could establish a precedent for how gaming platforms approach child safety in an era of heightened awareness and regulation.
11 Comments
This is a positive step forward for Roblox in terms of child online safety. The age-based chat segregation and video selfie verification process could help prevent inappropriate interactions. However, they’ll need to ensure it’s implemented thoughtfully to avoid any unintended consequences.
Agreed. Roblox will have to be vigilant about minimizing potential privacy or fairness issues with the new system. Transparency and user feedback will be key as they roll it out.
As a parent, I’m encouraged to see Roblox taking more proactive measures to safeguard younger users. The age verification and chat partitioning are sensible ideas, though the technical implementation will be crucial. Curious to see how it impacts the overall Roblox experience.
The new age-based chat and verification system sounds like a reasonable approach for Roblox to address safety concerns, but the accuracy and fairness of the facial recognition technology will be critical. Striking the right balance between protection and user experience will be a challenge.
Interesting move by Roblox to improve child safety and privacy protections. The new age verification system and age-based chat restrictions sound like a reasonable approach, though I wonder how accurate the facial recognition technology will be in practice.
Yes, it will be important for Roblox to closely monitor the accuracy and fairness of the age estimation system. Protecting young users while respecting privacy is a delicate balance.
Kudos to Roblox for taking proactive steps to enhance child safety and privacy on their platform. The age-based chat and video selfie verification process seem like reasonable approaches, though the effectiveness will depend on the technical implementation and how it’s received by the community.
The new age-based chat restrictions and verification process are a step in the right direction for Roblox, but the platform will need to ensure it’s implemented in a way that protects younger users without overly disrupting the gaming experience. Getting the balance right will be critical.
While increased safety measures are welcome, I hope Roblox can implement the new age verification system in a way that doesn’t overly disrupt the user experience, especially for older teens. Careful testing and user feedback will be important as they roll this out globally.
It’s good to see Roblox responding to concerns over child protection with these new safety features. Segregating chat by age and verifying users’ ages through facial recognition could help prevent inappropriate interactions. However, Roblox will need to closely monitor the accuracy and fairness of the system.
Absolutely. Roblox will have to be very thoughtful about privacy, consent, and the user experience as they roll out these changes. Transparent communication with the community will be essential.