Roblox’s AI-Driven Age Verification Is a Total Fiasco

Just days after its launch, Roblox’s anticipated AI-driven age verification system has encountered significant issues.

The face-scanning technology, designed to assess users’ ages before granting access to the platform’s chat features, was introduced in the US and various other countries last week, following an initial rollout in select locations last December. Roblox says the system will let users chat safely with peers of similar ages.

The rollout has not gone smoothly, however. Players say the update has cut off their ability to communicate with friends, developers are urging Roblox to roll back the change, and, crucially, experts report that the AI is misidentifying younger players as adults and vice versa, failing to address the very problem it was created to solve: the influx of predators exploiting the platform to groom children.

WIRED has found that individuals are listing age-verified accounts for minors as young as 9 years old on eBay for as little as $4.

After WIRED alerted them to these listings, eBay representative Maddy Martinez stated that the company would be removing them due to policy violations.

Roblox’s chief safety officer, Matt Kaufman, told WIRED that implementing a change of this scale on a platform with more than 150 million daily users takes time.

“You can’t just flip a switch for something that hasn’t existed before,” he remarked. “Expecting perfection from the system overnight disregards the complexity of this effort.”

Kaufman expressed satisfaction with the initial reception, noting that “tens of millions of users” have already verified their ages, which he argued demonstrates that “the vast majority of our community values a safer, more age-appropriate environment.”

The company also responded to some criticisms in a recent update, stating: “We are aware of cases where parents verify ages for their children, resulting in kids being categorically aged to 21+. We are developing solutions to rectify this and will provide updates soon.”

Roblox announced the age verification requirement last July as part of a broader set of enhancements intended to make the platform safer. The company has recently faced heightened scrutiny following several lawsuits claiming it failed to protect its youngest users and enabled predators to target children.

Attorneys general from Louisiana, Texas, and Kentucky also filed lawsuits against the company last year, making similar allegations, while Florida’s attorney general has issued criminal subpoenas to investigate whether Roblox is “aiding predators in accessing and harming children.”

Roblox asserts that requiring users to verify their ages before allowing them to chat will help prevent adults from having unrestricted interactions with unfamiliar children.

Though verification is optional, users who decline lose access to the chat features, which are among the main attractions of the Roblox platform.

To verify age, users must submit a brief video using their device’s camera, which is analyzed by a company called Persona to estimate their age. Alternatively, users aged 13 and over can upload a government-issued photo ID.

Roblox claims that all personal data is “deleted immediately after processing.” Nonetheless, many users have expressed reluctance to undergo age verification due to privacy concerns.

Users who have verified their ages can only chat with a limited group of players close to their own age. For instance, those verified as under 9 can interact only with individuals up to 13, while players identified as 16 can chat with those aged between 13 and 20.
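
To make the age-band rule concrete, here is a minimal, purely hypothetical Python sketch of how such matching could work. The band boundaries, names, and the “adjacent band” check are assumptions for illustration only, based on the examples above; Roblox has not published its actual matching logic.

    # Hypothetical sketch of age-banded chat matching. The bands and the
    # adjacency rule are assumptions for illustration; this is not Roblox code.
    AGE_BANDS = [
        (0, 8),     # "under 9"
        (9, 12),
        (13, 15),
        (16, 17),
        (18, 20),
        (21, 120),  # "21+"
    ]

    def band_index(age: int) -> int:
        """Return the index of the band containing an estimated age."""
        for i, (low, high) in enumerate(AGE_BANDS):
            if low <= age <= high:
                return i
        raise ValueError(f"age out of range: {age}")

    def can_chat(age_a: int, age_b: int, max_band_gap: int = 1) -> bool:
        """Allow chat only if the two users' bands are the same or adjacent."""
        return abs(band_index(age_a) - band_index(age_b)) <= max_band_gap

    print(can_chat(16, 13))  # True: adjacent bands
    print(can_chat(16, 25))  # False: bands too far apart

Under this illustrative rule, a user verified as 16 could message players in the 13-to-15 or 18-to-20 bands but not an adult verified as 21 or older, roughly matching the ranges described above.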
