In one of the most ambitious online safety moves the gaming industry has seen in years, Roblox is rolling out a facial age estimation system that will determine whether players can access chat features on the platform. The policy marks a significant shift in how the massively popular gaming platform - home to hundreds of millions of users, many of them children - manages age-based access controls.
Rather than relying solely on the birthdate a user types in at sign-up (which, frankly, any kid can fabricate), Roblox is moving toward biometric age estimation powered by a device's camera.
The process is more straightforward - and more privacy-conscious - than it might first sound. When a user wants to access chat features, Roblox will prompt them to complete a brief facial scan via their device's camera. A third-party partner then analyzes the image or short video clip to estimate the user's age range.
Critically, no images or video footage are stored after the check is complete. The data is processed and immediately deleted, which Roblox says is central to the privacy-first design of the feature. The platform is not building a face database - it's simply using facial estimation as a gatekeeping tool in the moment.
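The check-then-delete flow described above can be sketched in a few lines. This is purely illustrative - the function names and the shape of the capture and estimation steps are assumptions, not Roblox's actual API:

```python
# Hypothetical sketch of the ephemeral age check described above.
# capture_frame and estimate_age_bracket stand in for the device camera
# and the third-party estimation service; neither name is real.
def run_age_check(capture_frame, estimate_age_bracket):
    """Capture an image, estimate an age bracket, and discard the image."""
    frame = capture_frame()                    # brief camera capture on-device
    try:
        bracket = estimate_age_bracket(frame)  # third-party analysis of the image
    finally:
        del frame                              # the image is not persisted
    return bracket                             # only the age bracket survives
```

The key design point is that only the derived age bracket outlives the check - the raw image or clip is gone the moment the estimate is made.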
The result of the scan places users into one of several age brackets, ranging from under 9 years old up to 21 and over.
Once a player is assigned to an age bracket, Roblox uses that classification to control who they can communicate with. The core principle is straightforward: players will only be able to chat with others in a similar age range, significantly limiting the potential for adults to interact directly with young children through the platform's messaging systems.
For the youngest users - those estimated to be under 9 years old - chat is disabled entirely by default. Parents retain the ability to grant permission if they choose, keeping parental oversight at the center of the experience for the platform's youngest audience.
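Taken together, those rules are simple enough to sketch. Note the assumptions here: the source names only the under-9 and 21-plus endpoints, so the intermediate bracket boundaries are illustrative, and "similar age range" is modeled as the same or an adjacent bracket:

```python
# Illustrative sketch of the pairing rules described above. Bracket boundaries
# and the adjacent-bracket reading of "similar age range" are assumptions.
BRACKETS = ["under-9", "9-12", "13-15", "16-17", "18-20", "21-plus"]

def can_chat(a: str, b: str, parental_permission: bool = False) -> bool:
    """Return True if two users may chat under the rules sketched above."""
    # Under-9 users have chat disabled by default; a parent can opt them in.
    if "under-9" in (a, b) and not parental_permission:
        return False
    # "Similar age range": the same bracket or the one directly adjacent.
    return abs(BRACKETS.index(a) - BRACKETS.index(b)) <= 1
```

Under this model, a 13-15 user could chat with 9-12 or 16-17 peers but never with a 21-plus adult, and an under-9 user chats with no one unless a parent explicitly allows it.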
Is the scan mandatory? Technically, no. The feature is described as optional by Roblox - but here's the practical catch: if you skip the verification process, you simply won't be able to access any chat features. So while no one is forced to submit to the scan, choosing not to means losing access to one of the platform's most central social tools.
This approach allows Roblox to maintain some degree of user autonomy while still pushing toward meaningful age verification at scale.
Roblox is taking a phased approach to the global rollout. The feature is launching first in Australia, New Zealand, and the Netherlands in early December, serving as a test group before the system expands internationally. The full global rollout - covering all regions where chat is available - is expected in early January.
This kind of staged deployment is smart from a technical and policy standpoint. It allows Roblox to monitor for friction points, address edge cases, and refine the experience before pushing it to its full, worldwide audience.
The facial age check for chat is just one piece of a much larger safety overhaul. Roblox has confirmed plans to extend age verification requirements to other areas of the platform as well, with those expansions expected to roll out in 2026. The facial age check initiative is part of a broader suite of over 145 safety improvements Roblox says it has launched throughout 2025 alone.
Roblox's user base skews young. On a platform that draws tens of millions of players, many of them under 13, the stakes around unsupervised chat interactions are real. There have been ongoing concerns about predatory behavior and inappropriate content in gaming environments more broadly, and Roblox has faced scrutiny on these issues before.
By tying communication access to biometric age estimation, Roblox is essentially acknowledging that self-reported age data isn't reliable enough on its own - and building a system that doesn't depend on users being honest.
Roblox has also signaled that it hopes its approach becomes a model for the wider tech industry. The company is actively encouraging other platforms to consider similar age-verification frameworks as part of a broader push to create safer digital environments for minors.
Whether competitors follow suit remains to be seen. But given increasing regulatory pressure on platforms regarding child safety - in the EU, the UK, Australia, and the United States - Roblox's proactive stance may soon look less like an outlier and more like a preview of where the industry is headed.