Highlights
- Roblox sparked a user revolt by requiring video face scans to unlock chat features.
- The new system segregates players into age buckets, breaking social gameplay.
- Mandatory global enforcement begins in January 2026 despite privacy concerns.
A massive storm is brewing in the metaverse as Roblox Corporation begins enforcing its most controversial safety update to date, triggering a widespread revolt among its millions of daily active users. Starting November 17, 2025, the platform began rolling out a strict new policy that requires users to submit video facial scans or government IDs to unlock basic chat features. What the company describes as a "safety gold standard" is being labelled invasive and destructive by the community, with many fearing the update will permanently fracture the social ecosystem that makes Roblox unique.
The controversy centres on a new mandatory video selfie system. To bypass restrictions, users must now record themselves looking front, left, and right, allowing an AI to estimate their age. While the rollout began as a voluntary phase, it is set to become mandatory for players in Australia, New Zealand, and the Netherlands in December 2025, followed by a strict global enforcement deadline in January 2026.
Although Roblox asserts that this biometric data is processed by a third party and deleted immediately, the requirement has sparked significant privacy concerns. Players are worried about potential data leaks, with critics arguing that forcing users, especially minors, to upload live video of their faces to a gaming platform is an overreach.
Silent Lobbies and Broken Gameplay Mechanics
The immediate impact on gameplay has been severe, stripping away the "human touch" that fuels the platform's popularity. The new system sorts players into rigid "age-based chat" buckets, strictly segregating communication. For example, users estimated to be 12 years old can only communicate with those under 15, cutting them off from older players entirely.
Technical failures are exacerbating the frustration. Users have flooded Reddit and community forums to report that the AI estimation is frequently incorrect, classifying users as younger than they actually are and locking them out of communication with their own friends.
This has broken the core mechanics of roleplay servers, particularly those simulating families or diverse communities, as the different age brackets can no longer interact. One popular thread described the update as a "breaking point," with veteran players threatening to quit because the collaborative spirit of the game has been dissolved by the verification walls.
Roblox Corporation maintains that these drastic measures are a necessary response to rising concerns regarding child safety. The platform has faced significant legal pressure in 2025, including a lawsuit from the Texas Attorney General.
By positioning itself as the first major platform to enforce such strict biometric checks, Roblox is attempting to prove it is a safe environment. However, as the January 2026 global mandate approaches, the disconnect between the company's legal strategy and the player experience is widening, leaving developers and fans hoping for a change before the silence in the lobbies becomes permanent.