BUSINESS

Roblox, Minecraft ordered to explain safeguards

Mico Virata

Australia’s online safety regulator has ordered major gaming platforms, including Roblox, Minecraft, Fortnite, and Steam, to disclose how they protect children from grooming, exploitation, and extremist recruitment, escalating global scrutiny over child safety in digital gaming spaces.

eSafety Commissioner Julie Inman Grant on Wednesday issued legally enforceable transparency notices requiring the companies to detail their safety systems, staffing, and cybersecurity measures. Firms must respond within 30 days or face penalties of up to $590,783 per day.

Inman Grant said gaming environments are increasingly being used as entry points by offenders targeting minors.

“What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,” Inman Grant said.

She warned that gaming platforms have become central social spaces for children, noting that “nine in 10 Australians aged 8 to 17 have played online games.”

“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalization and other off-platform harms,” she added.

The regulator said it is seeking detailed explanations on how platforms detect harmful behavior, moderate encrypted chats, and respond to potential abuse cases, particularly where automated systems struggle to monitor real-time interactions.

Microsoft, which owns Minecraft, said it was reviewing the request and reaffirmed its commitment to child safety.

“We continue to evolve our approach to meet the evolving threat and regulatory landscape,” a company spokesperson said in an email.

Roblox did not immediately respond to requests for comment.

The move comes as gaming companies face intensifying legal and regulatory pressure worldwide. Roblox recently settled cases in the US states of Alabama and West Virginia, agreeing to pay more than $23 million and tighten child safety controls on chat and gameplay features.

The platform is also facing more than 140 lawsuits in US federal courts alleging it failed to prevent child sexual exploitation.

Last week, Roblox announced new age-based accounts starting in June, separating users into “Roblox Kids” for ages 5 to 8 and “Roblox Select” for ages 9 to 15, as it attempts to strengthen safeguards for younger players.