AIbase Report — The well-known online gaming platform Roblox has officially launched a global age verification system, which will use AI-driven selfie scanning technology to more strictly manage interactions between adults and minors.
Roblox announced that by the end of 2025, all users with access to the platform's communication features must complete the new age check. The process combines AI facial analysis, identity verification, and a parental consent step for minors. The core goal is to accurately determine users' real ages and strictly limit adults' ability to contact children on the platform.
This marks a major shift in Roblox's safety strategy. Previously, the platform only required users to enter a birth date manually when creating an account. Under the new rules, adults will no longer be able to freely communicate with children on the platform unless they have a verified real-life connection.
This move is part of a broader Roblox initiative to strengthen monitoring of user behavior. The company said that selfie-based age estimation is more reliable than self-reported birth dates and can more effectively enforce communication rules across different age groups.
Currently, text chat on the Roblox platform is fully monitored and filtered, and image sharing is prohibited. Children under 13 are not allowed to use voice chat or send private messages, and parents can manage which features their children can access through parental controls.
Under the new safety framework, Roblox has tightened age restrictions on some content in recent months. For example, game experiences marked as "restricted" are now available only to users aged 18 and above, and content without an age rating is automatically blocked.
Roblox said that since early 2025 it has added more than 100 new safety features. These include Roblox Sentinel, an advanced AI system designed specifically to detect potential child sexual exploitation; upgraded voice filters; a tool that automatically shuts down suspicious servers; and another system that flags violations by analyzing users' avatars and in-game behavior.