Indiana Sues Roblox and Discord After a Predator Lured and Killed 17-Year-Old Hailey Buzbee
On May 7, 2026, Indiana Attorney General Todd Rokita filed a lawsuit against Roblox Corp. and Discord Inc., naming them as defendants in the grooming and death of Hailey Buzbee, a 17-year-old from Fishers, Indiana.
Buzbee was lured from her home by Tyler Thomas, a 39-year-old from Columbus, Ohio, who investigators say had been communicating with her online for over a year. The two first connected while playing video games including Roblox and League of Legends. Their relationship moved to Discord and the encrypted messaging app Session. Buzbee's remains were found in Wayne National Forest in Ohio after Thomas led investigators to the site. Thomas has since been charged with federal exploitation offenses.
AG Rokita said Buzbee is one of at least three Indiana girls who were "groomed or enticed away from their homes while using Roblox."
"These companies, which cater to kids and young individuals, know full well that numerous predatory sex criminals have used these platforms to contact and lure their victims," Rokita said in a statement. "And yet, they continue promoting themselves as safe for children. That is more than reckless. It is a clear and ongoing violation of Indiana's Deceptive Consumer Sales Act."
What the Lawsuit Alleges
The lawsuit accuses both Roblox and Discord of failing to employ sufficient protections against online predators while knowingly marketing their platforms as safe for children. The core legal claim is under Indiana's Deceptive Consumer Sales Act — the same consumer protection framework other states have used in their own Roblox cases.
The complaint uses language that has appeared in nearly every other state AG action against Roblox this year: that the company "corralled children into their online communities, and then opened the gates wide to adult predators." But Indiana's version is anchored to a specific, named victim whose family has publicly supported the lawsuit.
The Buzbee family released a statement through the AG's office:
"Technology and social media companies have a responsibility to prioritize the safety and well-being of the children and families who use their platforms. Far too often, families are left navigating dangerous online environments without the transparency, protections, or urgency they deserve. Parents deserve the ability to make informed decisions when it comes to the safety of their children and currently that level of transparency and accountability is too often absent."
Rokita is seeking injunctive and declaratory relief to prevent further harm, disgorgement of profits from the allegedly unlawful conduct, and civil penalties of up to $5,000 per knowing violation.
What Roblox and Discord Said
Roblox's chief safety officer, Matt Kaufman, issued a statement pointing to the company's safety investments and defending the platform's current protections:
"Roblox has built a multilayered safety system that includes pioneering, industry-leading safeguards to help protect our users. You cannot exchange images or videos in chat, and we deploy a combination of advanced AI-powered detection, human moderation and rigorous filters designed to prevent the exchange of personal information."
Discord pushed back harder on the characterization of its platform, saying the lawsuit does not "reflect the platform we have built or the investments we have made in user safety." Discord noted it is a communication platform for people who already share gaming interests and said it offers teen-specific safety tools including "Teen Safety Assist" and a Family Center.
Neither response addressed the specific allegation that both platforms were used in Buzbee's case despite apparent warning signs.
Why Indiana's Lawsuit Is Different
Indiana is the latest in a long line of states to take legal action against Roblox in 2026. Texas, Nebraska, Florida, Louisiana, Nevada, Alabama, and West Virginia have all filed suits or reached settlements. Nearly 150 individual federal lawsuits have been consolidated into a multidistrict litigation. In April, Roblox agreed to pay a combined $35 million to Alabama, Nevada, and West Virginia.
But Indiana's case stands out in two ways.
First, it names Discord as a co-defendant alongside Roblox. Most prior state AG actions focused on Roblox alone. Indiana is specifically calling out the Roblox-to-Discord pathway as a systemic problem, not just a coincidence in one case. This matters because the migration from Roblox to Discord is how many grooming cases actually unfold: Roblox is where predators make initial contact, and Discord is where they deepen it.
Second, the anchor case involves a murder. Most Roblox lawsuits allege grooming, exploitation, and sexual abuse. Hailey Buzbee's case involves a predator who communicated with a child for over a year across multiple platforms and then physically removed her from her home. Her death is a stark endpoint to a process that is usually described in abstract terms.
AG Rokita put it plainly: "This pattern has repeated itself time and time again. We cannot stand idly by and allow it to continue."
The Roblox-to-Discord Pipeline in Practice
The Buzbee case follows a pattern that law enforcement and child safety researchers have documented repeatedly. Roblox serves as the point of first contact. Discord serves as the escalation channel. And parents who are only watching Roblox activity miss the transition entirely.
Roblox has some chat restrictions, particularly for accounts under 13. Those restrictions create pressure to move conversations elsewhere. Discord is the obvious destination: it is deeply embedded in gaming culture, Roblox game developers maintain Discord servers, and it feels like a natural extension of in-game friendships. A child being asked to join a Discord server is not a red flag in the way that being asked to download a new app might be.
This is the gap that Indiana's lawsuit is naming explicitly. A parent who checks their child's Roblox friend list and sees nothing alarming may not know that those same Roblox contacts have been talking with their child on Discord for months.
We wrote about this dynamic earlier this week in the context of a separate lawsuit: a predator who also used Roblox and Discord to groom a teen, then used Uber to transport her across state lines. Indiana's lawsuit, filed the same week, confirms this is not a single incident but a documented pattern that multiple AGs are now treating as a systemic failure.
What Roblox's Built-In Controls Miss
After the string of settlements and lawsuits this year, Roblox has announced new safety features. The company rolled out mandatory facial age estimation globally, making it the first major platform to require age checks for all users to access chat. Age-based accounts for younger users are planned for later in 2026.
These are real changes. But they share a limitation that runs through all of Roblox's current safety measures: they address what happens inside Roblox, not what happens when a relationship moves off the platform.
Roblox can restrict chat. It cannot prevent a child from telling a Roblox contact their Discord username. It can detect inappropriate keywords. It cannot see the Discord messages that come next. It can ban a reported user. It cannot prevent a banned user from creating a new account and reconnecting with a child who trusts them.
The Buzbee case makes this concrete. Tyler Thomas communicated with Hailey Buzbee for over a year before she went missing. The relationship had time to develop, to move platforms, and to escalate to in-person contact. Platform-level controls that operate at a single point in time cannot interrupt that kind of sustained, cross-platform grooming without additional visibility tools.
For a complete guide to what Roblox's controls cover and where they stop, see our Roblox Parental Controls Guide.
What Parents Can Do Right Now
The scale of what is happening — multiple state AG lawsuits, nearly 150 consolidated federal cases, a murder tied to a year-long online grooming relationship — can feel overwhelming. The goal of knowing all this is not to frighten you into banning Roblox. It is to give you enough context to supervise it appropriately.
A few specific things worth doing today:
- Ask about Discord specifically. If your child plays Roblox, ask whether they have a Discord account and who they talk to on it. This is the migration point most parents miss.
- Check Roblox friends against real-world knowledge. Look at your child's Roblox friend list. Can you name most of those people? If there are Roblox friends you have never heard them mention in real life, that is worth a conversation.
- Talk about the year-long grooming pattern. Tyler Thomas had been talking with Hailey Buzbee for over a year before she was lured away. Children need to understand that a long online friendship with someone they have never met in person is still a relationship with a stranger.
- Know what contact looks like before it escalates. Grooming does not announce itself. It looks like a gaming friend being unusually attentive, asking to move conversations to a different platform, or making a child feel special in ways that feel good but that they would feel awkward mentioning to a parent.
The Buzbee family's statement said it clearly: parents deserve the ability to make informed decisions, and right now "that level of transparency and accountability is too often absent." The legal system is trying to create accountability for the platforms. In the meantime, the best thing a parent can do is close the visibility gap themselves.
Close the Visibility Gap on Roblox
BloxWatch monitors your child's Roblox activity from the cloud and alerts you when something changes. See their friend list, know when new friends are added, and track which games they play and when they go online.
The goal is not to read every message. It is to know enough that you can have the right conversation at the right time, before a Roblox contact becomes something more.
Start Free 14-Day Trial