An undercover investigation into Roblox has exposed serious lapses in child safety on one of the biggest digital platforms used by kids today.
Despite parental controls and public reassurances, new research shows that children as young as five are being exposed to explicit content, interacting with adults, and bypassing weak verification systems.
The investigation, carried out by digital behaviour experts Revealing Reality, reported discovering “something deeply disturbing.”
“Despite the safety features in place, adults and children can easily interact in the same virtual spaces, with no effective age verification or separation,” the report stated.
Roblox is one of the largest video games in the world, averaging over 80 million players per day in 2024, with roughly 40% of them below the age of 13.
As part of the investigation, researchers created multiple fake Roblox accounts registered as users aged five, nine, ten, and thirteen, as well as one registered as over forty.
They found that the account registered as a 10-year-old could “freely access highly suggestive environments.”
These included virtual hotels with private rooms where characters wore sexually suggestive outfits. Within these spaces, children could engage in conversations “that often strayed into adult themes.”
A video posted by Revealing Reality shows one character encountering avatars making sexually suggestive noises and actions.
The 10-year-old account also entered a virtual dance club, where avatars were seen suggestively cuddling on a bed, entering private rooms with showers, and participating in similarly troubling scenarios.
In a statement shared with CCN, Matt Kaufman, Chief Safety Officer at Roblox, said: “At Roblox, trust and safety are at the core of everything we do. We continually evolve our policies, technologies, and moderation efforts to protect our community, especially young people.”
“In 2024 alone, we added more than 40 new safety enhancements, and we remain fully committed to going further to make Roblox a safe and civil place for everyone.”
A Roblox spokesperson also told CCN that Revealing Reality’s investigation “omits important contextual facts that are essential to an accurate understanding of safety on our platform.”
On Wednesday, April 16, Florida Attorney General James Uthmeier issued a subpoena to Roblox regarding its marketing to children.
The Attorney General’s office said it had received numerous reports of children being exposed to “graphic or harmful material on the gaming platform, as well as predatory adults being able to message minors on the app freely.”
“As a father and Attorney General, children’s safety and protection are a top priority,” Uthmeier said.
“There are concerning reports that this gaming platform, which is popular among children, is exposing them to harmful content and bad actors,” he added. “We are issuing a subpoena to Roblox to uncover how this platform is marketing to children and to see what policies they are implementing — if any — to avoid interactions with predators.”
This investigation follows claims from former Meta employee Kelly Stonelake that Meta knowingly allowed children under 13 access to its virtual reality platform, Horizon Worlds.
Stonelake, who worked at the company for 14 years, submitted her whistleblower statement through a complaint filed by the non-profit Fairplay.
The complaint alleges that Meta permitted underage users to register using adult accounts, enabling the company to collect data on them without proper parental consent.
Meta spokesperson Ryan Daniels said in a statement to Business Insider, “We’re committed to providing safe, age-appropriate experiences on our platform.”
He added that parents are required to manage accounts for children aged 10–12 on the Quest headset and must grant permission for them to access Horizon Worlds.
Daniels also noted that tools are available to report users suspected of being underage, and Meta will delete such accounts if they are confirmed to belong to pre-teens.
Earlier this month, Roblox introduced new safety features designed to limit the number of people children can interact with on the platform.
These updates are intended to give parents more control over what their children can access. Parents can now block their children from accessing specific experiences and block or report individuals on their child’s friend list.
However, these features only apply to users under 13 and accounts with parental controls enabled.
“The new safety features announced by Roblox last week don’t go far enough,” said Damon De Ionno, Research Director at Revealing Reality.
“Children can still chat with strangers who aren’t on their friends list. With six million experiences on the platform — many with inaccurate descriptions or ratings — how can parents realistically moderate what their kids are doing?” De Ionno added.
The updated safety features follow controversial remarks by Roblox CEO and co-founder David Baszucki, who suggested that concerned parents simply prevent their children from using the platform.
“My first message would be: if you’re not comfortable, don’t let your kids be on Roblox,” Baszucki told the BBC. “That sounds a little counterintuitive, but I would always trust parents to make their own decisions.”