The mother of a 15-year-old boy from California who tragically took his own life has filed a lawsuit against Roblox and Discord, asserting that her son was groomed and coerced into sending explicit images through these platforms. Rebecca Dallas initiated the legal action on Friday in San Francisco County Superior Court, claiming that the companies operated negligently and deceptively, contributing to both the sexual exploitation of her son and his subsequent suicide.
Ethan Dallas, described in the lawsuit as a creative and enthusiastic child who loved gaming and connecting with friends online, began using Roblox with his parents’ consent when he was 9 years old, with safety controls in place. When he was 12, he was allegedly targeted by an adult predator who posed as a child on Roblox and befriended him. According to the lawsuit, what began as innocent exchanges escalated into sexual discussions, and the predator encouraged Ethan to disable parental controls and move their conversations to Discord.
Once on Discord, the predator reportedly made escalating demands for explicit photos and videos, backed by threats that left Ethan feeling he had no choice but to comply. The lawsuit alleges that these interactions caused lasting psychological harm, culminating in Ethan’s death by suicide in April 2024.
The lawsuit brings claims against both Roblox and Discord for wrongful death, fraudulent concealment, negligent misrepresentation, and strict liability. The complaint contends that the platforms lacked basic protections, such as user screening and age verification, that could have prevented Ethan from ever encountering his predator.
Rebecca Dallas believed both Roblox and Discord were safe for her son to use. The lawsuit notes that Roblox claims to prioritize child safety yet allows users to create accounts without verifying their age. Similarly, Discord, a chat platform widely used by gamers, reportedly does not effectively verify the identities or ages of its users.
The suit further alleges that both platforms misrepresented their safety features and that their designs leave children vulnerable to predators. After Ethan’s death, it emerged that the predator who targeted him had previously been arrested in Florida for exploiting other children through the same apps.
Roblox has stated that it is committed to the safety of its users and has implemented numerous safety features and moderation processes to combat exploitation. Despite these claims, the suit asserts that the platform’s policies still permit children to easily create accounts and circumvent parental controls.
Discord also cited its efforts to keep users safe, including the use of technology to detect inappropriate content. The allegations in the lawsuit, however, describe pervasive exploitation on the platform, raising serious questions about the effectiveness of those safeguards.
This lawsuit is the ninth that the law firm Anapol Weiss has filed over alleged child exploitation on Roblox or associated platforms. The National Center on Sexual Exploitation has also placed both Roblox and Discord on its “Dirty Dozen” list, which identifies entities it says facilitate or profit from sexual abuse and exploitation.
Recent investigations have documented multiple prosecutions of adults for crimes such as grooming and sexual assault tied to communications on Discord. Separately, Louisiana’s attorney general is suing Roblox, alleging that the platform’s inadequate safety measures have made it a haven for predators.
Dallas is seeking a jury trial and compensatory damages. The case underscores the severe consequences that can follow when major online platforms fail to protect vulnerable children. Anyone in crisis can reach the Suicide & Crisis Lifeline by calling or texting 988.