Nevada Sues Discord for Failing to Protect Children — What Roblox Parents Need to Know
On May 5, 2026, Nevada Attorney General Aaron D. Ford filed a lawsuit against Discord, calling the platform "the go-to chat option for child abusers, including in Nevada." Two days later, Indiana filed its own lawsuit naming both Roblox and Discord as co-defendants in the grooming and murder of a 17-year-old girl from Fishers.
Two state attorneys general sued Discord in the same week. That is not a coincidence.
For parents whose children use Roblox, the significance of these lawsuits is not mainly about Discord. It is about the path. Roblox is where the initial contact happens. Discord is where the relationship goes next. And most parents who keep a close eye on their child's Roblox activity have no visibility into what happens after that transition.
What the Nevada Lawsuit Alleges
Nevada's complaint focuses on three core failures:
- No meaningful age verification. Discord allows users to create accounts without providing any identity or age information, so an adult who is banned can create a new account in minutes. The lawsuit alleges that Discord's own recent attempts to improve age verification were rolled back after backlash from its user base.
- Adults can reach minors directly. There are few safeguards stopping adult strangers from finding minor users and messaging them privately. The design does not meaningfully separate adult accounts from child accounts.
- Misrepresentation to parents. Discord has marketed itself as a safe gaming communication platform while failing to disclose that the same lack of verification that makes it easy to use also makes it easy to abuse.
The legal claim is under Nevada's Deceptive Trade Practices Act, the same consumer protection framework Nevada used against TikTok, Snap, Meta, YouTube, and Kik in prior actions. AG Ford's office has cited specific Nevada criminal cases involving adults who sexually assaulted, groomed, or solicited sex from minors after using Discord to make initial contact.
Discord's response was predictable: the lawsuit "does not reflect the platform we have built or the investments we have made in user safety." The company highlighted its teen-specific tools and noted that approximately 80% of its users are adults over 18. That last statistic, offered in its own defense, is part of what plaintiffs find troubling: a platform that is 80% adults is one where a child user is surrounded by adults by design.
Why Indiana Filed Against Discord Too — the Same Week
Nevada's lawsuit focused on Discord alone. Indiana's lawsuit, filed two days later, named both Roblox and Discord as co-defendants in the same complaint, anchored to the grooming and death of Hailey Buzbee, a 17-year-old from Fishers, Indiana.
Tyler Thomas, a 39-year-old from Columbus, Ohio, communicated with Buzbee for over a year across Roblox, Discord, and the encrypted messaging app Session before she was reported missing. Her remains were later found in Ohio.
Indiana AG Todd Rokita put it plainly: Roblox is where predators make initial contact, and Discord is where they deepen it. His lawsuit named both because the exploitation did not happen on a single platform. It moved.
We covered the Indiana case in detail here: Indiana Sues Roblox and Discord After a Predator Lured and Killed 17-Year-Old Hailey Buzbee.
Why Discord Keeps Appearing in These Cases
The pattern that shows up in both lawsuits and in the criminal cases cited by Nevada's AG is consistent. Understanding why Discord is so frequently the escalation platform helps explain what parents should actually watch for.
Roblox has chat restrictions, particularly for accounts under 13. Those restrictions create pressure on anyone who wants to move a conversation beyond what Roblox allows. Discord is the natural next step: it is embedded in gaming culture, it is where Roblox game developers and communities maintain their servers, and it feels to a child like a normal extension of their gaming social life.
A 12-year-old being asked to join a game's Discord server is not an obvious red flag. That is part of what makes the platform useful to predators. The invitation does not need to be suspicious. It just needs to move the conversation somewhere that parents are not watching.
Once on Discord, the content restrictions are looser, the encryption features make private messages harder to monitor, and the account verification gaps that Nevada's lawsuit describes mean that a banned account can be replaced almost immediately.
What the Legal Pressure Has and Has Not Changed
In 2026, Roblox is under pressure from multiple directions: over 150 federal lawsuits consolidated into multidistrict litigation, settlements with Alabama, Nevada, and West Virginia totaling $35 million, and active state AG cases in Indiana, Texas, Nebraska, Florida, Louisiana, and others. In response, Roblox has introduced mandatory facial age estimation and is planning age-based account tiers later this year.
Discord is now facing its own wave of AG attention. Nevada adds to a list that already includes actions from other states citing similar failures: no age verification, adult access to minors, and a pattern of prioritizing user growth over child safety.
What neither wave of lawsuits changes is the cross-platform problem. Roblox can restrict what happens inside Roblox. Discord can add teen safety tools to Discord. Neither platform controls what happens when a child moves from one to the other, and neither lawsuit requires the platforms to coordinate with each other on cross-platform grooming patterns.
That gap is structural. It is not something either settlement can close without additional visibility tools on the parent side.
What Parents Can Do
The most useful question to ask right now is not which platform is safer. It is whether you know which platforms your child is using.
- Ask about Discord specifically. If your child plays Roblox, ask whether they have a Discord account, whose servers they have joined, and whether they message anyone on Discord whom they have not met in person. A straightforward question at dinner covers more ground than any parental control setting.
- Understand the invitation pattern. Children get added to Discord through game invitations, server links shared in Roblox chat, and direct messages from people they have met playing together. The initial contact looks like gaming, not grooming.
- Review Roblox friends as a starting point. The Roblox friend list represents the pool of people who could later invite your child to Discord. Knowing who is on that list matters not because of Roblox itself, but because of where those relationships can go.
- Use Discord's Family Center if your child is on the platform. Discord offers a feature that lets parents link their account to their child's and see which servers they have joined and who they have messaged recently. It does not show message content, but it shows activity.
The lawsuits in Nevada and Indiana are trying to make these platforms structurally safer. That process takes years. In the meantime, the visibility gap between Roblox and Discord is one that parents can close themselves, with the right information and the right questions.
For a complete breakdown of what Roblox's built-in controls cover and where they stop, see our Roblox Parental Controls Guide.
See Your Child's Roblox Activity Before It Moves Somewhere Else
BloxWatch monitors your child's Roblox account from the cloud. See their friend list, know when new friends are added, and track which games they play and when they go online.
The goal is not to read every message. It is to know who your child is connected to on Roblox, so you have the right starting point for a conversation before a gaming friendship becomes something more.
Start Free 14-Day Trial