Roblox is rolling out new features aimed at making the platform safer for minors, including a revamped friend system, privacy tools, and an age verification process that asks users to record a video selfie.
In Roblox’s old friend system, players had no way to distinguish between people they knew casually or only online and people they considered close friends. The platform’s new tiered system introduces Connections and Trusted Connections, the latter reserved for people players know and trust. To access Trusted Connections and its benefits, users first need to complete an age check, which requires them to submit a video selfie. Once they’ve submitted the video, the company says, it’s analyzed by an AI system trained on a “diverse dataset” to produce an age estimate. If the user appears to be under 13, they automatically lose access to any features not deemed age-appropriate.
If a user’s age can’t be estimated with “high confidence,” according to a blog post on the company’s site, it remains unconfirmed, and they’ll need to pass ID verification instead. The company says it will allow for parental consent in the future; biometric data is deleted after 30 days, except where retention is required by a warrant or subpoena. WIRED raised the issue of 13-year-olds not having government-issued IDs with chief safety officer Matt Kaufman. “That is a problem,” Kaufman says. “In North America or maybe the United States in particular, that's not common. In other parts of the world, it is much more common to have photo ID.” If a child can’t verify their age for lack of ID, they can be verified through their parents. If their parents are unable to do so for any reason, kids won’t be able to use Trusted Connections.
Teen users who pass the age check will be able to use the Trusted Connections feature to add anyone ages 13 to 17. Anyone 18 or older will need to be added either via an in-person QR code scan or via a phone number. With Trusted Connections, Roblox removes filters that screen for inappropriate language and personally identifiable information from party voice and text chats for users 13 and up. Those communications are still subject to Roblox’s community standards and moderation, but the company hopes removing filters will keep users on its platform rather than moving to spaces like Discord. By keeping players within Roblox, the company can monitor their activity. A spokesperson told The Verge that includes “any predatory behavior aimed at manipulating or harming minors, the sexualization of minors, engaging in inappropriate sexual conversations with or requesting sexual content, and any involvement with child sexual abuse material.”
Kaufman says the company wants to make Roblox “safe by default.” That’s why the company filters communications even for teenagers who haven’t verified their age. “If parents are uncomfortable with that, and it's the right decision for their family, parents can turn off communications through parental controls,” Kaufman says.
Roblox is one of the world’s biggest video game platforms, especially among kids. In a press briefing, Kaufman said roughly 98 million people from 180 countries use the platform, and that over 60 percent of users are over age 13. The company has struggled, however, with predators and the safety of minors. According to a 2024 Bloomberg report, police have arrested at least two dozen people who’ve used Roblox as a platform for grooming, abuse, or abduction. Roblox has also been the subject of several lawsuits, including a class action alleging the company harvests user data, including that of minors, and a federal lawsuit alleging a 13-year-old girl was exploited and sexually groomed via Roblox and Discord.