Australia Just Ordered Roblox to Explain Its Child Safety Practices — What Parents Need to Know

April 25, 2026 · 6 min read

Australia's eSafety Commissioner issued legally enforceable transparency notices this week to four of the world's largest online gaming platforms: Roblox, Minecraft, Fortnite, and Steam. The notices require each company to explain, in detail, how they are identifying, preventing, and responding to child grooming and violent extremism on their platforms.

Companies that fail to comply face financial penalties and civil action.

This is not a strongly worded letter. It is a legal demand backed by enforcement power. And it comes in the same week that Roblox agreed to pay more than $23 million to settle child safety claims with the US states of Alabama and West Virginia, adding to a growing stack of legal pressure from regulators on multiple continents.

Here is what the action means, why it matters for parents in the US, and what you can do right now without waiting for regulators to finish their work.

What Australia's eSafety Commissioner Said

eSafety Commissioner Julie Inman Grant explained the reasoning plainly. Online games have become social hubs for young people. In Australia, nine in ten children aged 8 to 17 play online games regularly. That makes gaming platforms one of the primary spaces where predatory adults find and target children.

The concern is not just about chat. Inman Grant described two specific patterns:

  • Grooming through game environments. Predators make contact inside the game, build trust, and then move children to private messaging platforms like Discord or Telegram where there is less oversight and no moderation.
  • Radicalization through gameplay. Extremist content and narratives are embedded directly into games, normalizing violent ideologies and increasing the risk of real-world harm.

The transparency notices ask each platform to disclose their safety systems, staffing levels, and moderation practices in detail. The goal is to assess whether these companies are doing what they claim to be doing, or whether their safety commitments are mostly marketing.

Why Roblox Is in the Spotlight Again

Roblox is not a new target for regulators. The platform is now facing more than 140 US lawsuits alleging it failed to stop the sexual exploitation of children. This week alone, it settled child safety claims with Alabama and West Virginia for a combined total exceeding $23 million, following the Nevada settlement announced earlier this month.

The international pressure has been mounting for months:

  • Philippines: Regulators set a deadline for Roblox to demonstrate safety improvements or face a ban, and the Philippine National Police opened a formal investigation into the platform.
  • Indonesia: Rolled out new restrictions on gaming platforms over child safety concerns.
  • UK: A 19-year-old was jailed for grooming a 14-year-old he met on Roblox.
  • US: State AGs in Texas, Nebraska, Louisiana, and Nevada have all taken legal action. The Florida AG cited Roblox specifically in a sweep that produced 1,400 predator arrests.
  • Australia: The eSafety Commissioner has now issued legally enforceable notices requiring transparency on child safety practices.

This is not isolated. The same platform, the same concerns, the same pattern of reactive rather than proactive safety measures, and now the same legal pressure appearing on multiple continents.

What Makes Gaming Platforms Different from Social Media

Australia already banned under-16s from major social media platforms last year. Gaming platforms were excluded from that ban. The eSafety notices this week reflect a growing recognition that the distinction matters less than regulators originally thought.

Gaming platforms present specific challenges that social media bans do not address:

  • The game is the cover story. A child talking to an adult in Roblox looks like normal gaming. The interaction starts inside a shared game world, which provides the predator with built-in context and plausible normality.
  • In-game chat is largely invisible to parents. Unlike a text message on a phone, Roblox chat happens inside the game client. There is no native notification to parents, no accessible log, no alert when a new contact appears.
  • Moving off-platform is easy. Once contact is established inside the game, moving the conversation to Discord takes one message. At that point, parents have even less visibility than they did before.
  • Age verification has been weak. Until very recently, it was trivial to create a Roblox account claiming any age. Adults could enter spaces designed for children with minimal friction.

Commissioner Inman Grant's point about gaming platforms as social hubs is important for parents to internalize. Roblox is not primarily a game. It is a social network with games attached. The safety risks are social network risks, not just gaming risks.

What Roblox Is Doing in Response to Pressure

Under sustained legal and regulatory pressure, Roblox announced a major account overhaul launching in June 2026. The changes include age-based account tiers, tighter restrictions on who can contact children, and mandatory age verification for new accounts.

These are real improvements. Disabling chat for the youngest users removes the primary grooming vector. The new Roblox Kids tier for under-8s is a meaningful change from what existed before.

But as we noted in our coverage of the Nevada settlement, the June changes address restrictions, not visibility. Roblox is making it harder for certain contacts to happen. It is not building a dashboard that tells parents what is actually happening on their child's account.

Those are two different problems, and only one of them is being solved.

What This Means for Parents in the US

Australian regulatory action does not directly change anything for parents in the United States today. But it adds to a picture that parents should understand.

Roblox is under legal pressure from every direction. That pressure is producing real changes to the platform. But the pace of regulatory action is slow, and the focus of regulators is on platform-level accountability — what Roblox is required to disclose and what safety systems it is required to maintain.

Regulators are not focused on giving you, as a parent, better visibility into what your specific child is doing on the platform today. That is a different problem, and it is one that will not be solved by lawsuits or transparency notices.

What You Can Do Right Now

While regulators on multiple continents work through their processes, there are practical steps you can take today.

Set up Roblox's existing parental controls

Most parents have never configured them. Enable a parental PIN on purchases, restrict chat to "Friends Only," and make sure your child's account has the correct birth date set so age-based restrictions apply correctly. Our complete parental controls guide walks through each setting step by step.

Have a direct conversation about off-platform moves

Teach your child that any Roblox contact who asks to move the conversation to Discord, Telegram, or another messaging app is a warning sign that should be reported to you. Regulators and law enforcement consistently describe this off-platform move as the most common step in the grooming process. A child who knows what it looks like is harder to manipulate.

Check the friends list regularly

You can view your child's Roblox friends list by logging into their account or by using a monitoring tool. The question to ask is not just how many friends they have, but who the accounts belong to. New, unfamiliar usernames that appeared recently are worth asking about.

Do not rely on gaming-only parental control apps

Tools that work at the device level — screen time limits, app blocks — cannot see inside Roblox. They can tell you your child spent two hours on the platform. They cannot tell you who your child talked to or what happened in that time. For Roblox-specific monitoring, you need something that connects to the account itself.

The Bottom Line

Australia joining the growing list of regulators scrutinizing Roblox and other gaming platforms is significant. The global consensus is forming: gaming platforms have a child safety problem, and voluntary self-regulation has not been sufficient.

But the regulatory process moves on a timeline measured in years. Your child is on Roblox today.

The combination that actually works is straightforward: use the platform controls that exist, add independent visibility into your child's account activity, and have ongoing honest conversations about what your child is seeing online. No regulator can do that last part for you.

See What Roblox Won't Show You

While regulators wait for Roblox to explain its safety systems, BloxWatch gives parents direct visibility into their child's account: friends, games, chat activity, spending, and real-time alerts when something changes.

No device software to install. Works on every platform your child plays on.

Start Free 14-Day Trial

Free: The Parent's Roblox Safety Checklist

A practical guide to keeping your kids safe on Roblox. Includes warning signs to watch for, games to avoid, and step-by-step security settings.

No spam, ever. Unsubscribe anytime.