Roblox Exposes a Dangerous Gap in Child Safety Laws — What Parents Need to Know

April 23, 2026 · 7 min read

Teresa Huizar has spent 25 years working on child safety. As CEO of the National Children's Alliance, she oversees a network of nearly 1,000 Children's Advocacy Centers across the country. These are the places families turn to after the worst has already happened.

In a widely circulated op-ed published this month, she made a pointed argument: Roblox isn't just a company with a safety problem. It's a symptom of something bigger. America's child safety laws have not kept up with the digital spaces where kids now spend so much of their lives — and Roblox shows just how wide that gap has become.

If you have a child on Roblox, this affects you directly. Here is what Huizar's op-ed reveals, why it matters, and what you can do right now without waiting for Congress to act.

The Law Is Nearly Three Decades Behind

The last major federal law governing how companies handle children's data online — the Children's Online Privacy Protection Act, known as COPPA — was passed in 1998. That was the year AOL chat rooms and dial-up modems defined what "the internet" meant.

Since then, Congress has taken limited steps to address online exploitation. The PROTECT Our Children Act of 2008 strengthened law enforcement investigations. But no comprehensive federal law tackles the realities of today's platforms: immersive games with real-time chat, algorithmic recommendations, virtual economies, and millions of children spending hours a day in spaces that have no physical equivalent.

Huizar's central argument is that this legal vacuum is not an accident — it's a pattern. Each time a new platform explodes in popularity, predators find ways to exploit it. Lawmakers respond only after children are harmed. The cycle repeats.

From MySpace to Facebook, Snapchat, Discord, and now Roblox, the internet has repeatedly proven to be a ready avenue for predators to find and target children. The names of the platforms change. The mechanism — reactive legislation, children paying the cost — stays the same.

The Scale of the Problem Is Staggering

Huizar cites numbers that put the problem in context.

In 2024, the National Center for Missing and Exploited Children received 20 million reports of suspected child exploitation. Those cases involved grooming, financial sextortion, bullying, and exposure to pornography and violence.

According to Meta's own estimates, about 100,000 children are sexually harassed daily across Instagram and Facebook alone.

And the problem isn't abstract. Huizar describes a specific case that anchors the op-ed: a 15-year-old boy who died by suicide after being groomed and blackmailed into sending explicit photos to an adult he met on Roblox.

Roblox is not an outlier. It is a symptom.

What Roblox Is Actually Doing

Under mounting legal and public pressure, including lawsuits in Kentucky, Louisiana, and Texas and a criminal probe in Florida, Roblox has recently introduced selfie-based age estimation and begun sorting users into different age tiers. New parental controls are rolling out in June.

Huizar acknowledges these as welcome steps. But she identifies the painful pattern: the platform grew explosively, horrific stories emerged, the public got angry, and only then did the company act.

She argues that reactive change from companies is not sufficient — because companies face no legal obligation to protect children proactively. Platforms like Roblox, YouTube, and TikTok still decide for themselves what protections to adopt, resulting in a patchwork of policies shaped more by headlines and liability exposure than by child welfare.

No amount of parental vigilance, she writes, can replace a federal baseline that holds platforms accountable.

What Huizar Says Congress Should Do

The op-ed lays out three concrete changes she believes federal law should require.

Strong age verification

Twenty-five states already require age verification, typically via government-issued ID, to access pornography websites, and the Supreme Court upheld Texas's version of that requirement in 2025. That shows large-scale age verification is both technically possible and constitutionally viable when lawmakers insist on it. Yet platforms where millions of children spend hours every day face no equivalent requirement. Huizar argues Congress should extend that same standard to any platform where minors gather online.

Banning unsolicited direct messages to minors

One of the primary vectors for grooming is unsolicited contact. An adult posing as a teenager joins a game server or chat group, builds trust, then escalates. Prohibiting unsolicited direct messages to minors across all platforms would close one of the simplest and most commonly exploited pathways.

Standardized, data-rich reporting to NCMEC

A Stanford Internet Observatory report found that many reports platforms submit to NCMEC's CyberTipline are incomplete — missing key details like offender identity, timestamps, or location. In 2022, NCMEC identified 36 platforms whose submissions were largely incomplete, with only 10% of reports deemed actionable. The Justice Department has similarly noted that many reports contain "little more than a screen name or user ID."

Incomplete reports make it far harder for law enforcement to identify patterns and catch repeat offenders. Requiring standardized, complete reporting would change that.
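
To see why completeness matters, consider a purely hypothetical sketch of the two kinds of reports. The field names below are invented for illustration and are not NCMEC's actual CyberTipline schema; the contrast is the point.

```python
# Hypothetical illustration only: invented field names, not the real
# CyberTipline schema.

# Roughly what the Justice Department says many platforms submit today:
bare_report = {
    "screen_name": "xX_gamer_Xx",  # little more than a screen name
}

# What a standardized, data-rich report could be required to include:
complete_report = {
    "screen_name": "xX_gamer_Xx",
    "platform_user_id": "48120359",                # stable internal ID
    "incident_timestamp_utc": "2026-03-14T21:07:55Z",
    "offender_ip_addresses": ["203.0.113.42"],     # ties activity to a person
    "registered_email": "example@example.com",
    "flagged_content_refs": ["chat-log-7734"],     # the evidence itself
    "prior_reports_for_account": 2,                # surfaces repeat offenders
}
```

The first gives investigators a name that could belong to anyone. The second gives them a lead they can act on, and a way to connect it to other reports about the same offender.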

Why This Matters for Parents Right Now

The honest answer is that Congress moves slowly. Even if legislation passes, implementation takes years. The changes Roblox is rolling out in June 2026 are a step forward, but they will not be in place when your child logs in tomorrow morning.

The gap Huizar describes is real and it is open today.

That gap has two dimensions. The first is what the law requires of platforms. The second is what parents can actually see. Huizar's op-ed focuses on the first. But the second is equally important for families who cannot wait for federal action.

Even with the new Roblox controls that launch in June, parents will still be unable to:

  • See who their child is chatting with inside Roblox
  • Know which games their child has visited or how long they spent there
  • Get an alert when their child adds a new friend
  • See a log of recent activity or friend requests
  • Track spending in real time

Those aren't things that new legislation will fix. They are things platforms choose not to provide to parents — and that choice has not changed.

What You Can Do While Waiting for Congress

Huizar is right that parental vigilance alone is not enough. But it is not nothing, either. There are concrete steps you can take right now.

Set up Roblox's existing parental controls

Many parents never configure them at all. At minimum, set your child's correct birth date, restrict chat to "Friends Only," enable a parental PIN on purchases, and review the current friends list. See our complete Roblox parental controls guide for step-by-step instructions.

Have the conversation

Kids who understand why certain adults online make them uncomfortable are better equipped than kids who are just told "don't talk to strangers." Talk specifically about what grooming looks like on Roblox: someone asking to move the conversation off the platform, asking for photos, offering free Robux. Make it concrete.

Fill the visibility gap yourself

The law does not require Roblox to notify you when your child adds a new friend or chats with someone unfamiliar. But you can build that visibility yourself. Tools like BloxWatch connect directly to your child's Roblox account and surface the activity data that Roblox does not show parents: friends, games, spending, and changes to the account. That is not surveillance; it is the kind of awareness that lets you ask the right question at the right time.
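
If you are curious how that kind of visibility works under the hood, the core idea is simple: check the account's public data on a schedule and flag what changed. Here is a minimal sketch in Python, assuming Roblox's public friends endpoint (friends.roblox.com/v1/users/{userId}/friends) still returns a friends list for public profiles without authentication. It is a toy illustration of the polling principle, not how BloxWatch is actually built.

```python
# Minimal sketch: poll a child's public Roblox friends list and flag
# changes. Assumes the public endpoint still returns JSON shaped like
# {"data": [{"id": ..., "name": ...}, ...]}; verify before relying on it.
import time

import requests

CHILD_USER_ID = 123456789  # hypothetical: replace with your child's user ID


def fetch_friend_ids(user_id: int) -> dict[int, str]:
    """Return {friend_id: username} for a public Roblox profile."""
    url = f"https://friends.roblox.com/v1/users/{user_id}/friends"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return {f["id"]: f["name"] for f in resp.json()["data"]}


def watch(user_id: int, interval_s: int = 900) -> None:
    """Check every 15 minutes and print an alert on any friend change."""
    known = fetch_friend_ids(user_id)
    while True:
        time.sleep(interval_s)
        current = fetch_friend_ids(user_id)
        for fid in current.keys() - known.keys():
            print(f"New friend added: {current[fid]} (id {fid})")
        for fid in known.keys() - current.keys():
            print(f"Friend removed: {known[fid]} (id {fid})")
        known = current


if __name__ == "__main__":
    watch(CHILD_USER_ID)
```

The same pattern extends to games and spending. What a real service adds is doing this reliably, at scale, and turning raw changes into alerts a parent can actually act on.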

The Bottom Line

Teresa Huizar has seen what happens when children are harmed online. Her op-ed is not alarmist. It is a careful, evidence-based argument that the legal framework protecting children from exploitation has not kept pace with the platforms they use every day.

Roblox is changing, partly because it has been forced to. Congress may eventually act. Both of those things are true and still insufficient on their own.

The gap exists right now. And while it stays open, the responsibility sits with parents to know what is happening on their child's account — not because the law says so, but because no one else is going to do it for you.

Don't Wait for the Law to Catch Up

BloxWatch gives parents the visibility into their child's Roblox account that platforms aren't required to provide. See friends, games, spending, and online activity — and get alerts when something changes.

The legal gap is real. This is what you can do about it today.

Start Free 14-Day Trial

Free: The Parent's Roblox Safety Checklist

A practical guide to keeping your kids safe on Roblox. Includes warning signs to watch for, games to avoid, and step-by-step security settings.
