A lawsuit filed last week against Bay Area tech companies Roblox and Discord alleges that the apps are hunting grounds for child predators and that they deceive parents into believing the platforms are safe for children.
Roblox Corp. is a San Mateo-based gaming app in which users can create their own games and interact with other users. Discord Inc., based in San Francisco, is a chat app where users can join topic-based conversations and is a popular communication platform among video game players.
The lawsuit, filed on Feb. 12, alleges that both Roblox and Discord work in tandem to enable sexual exploitation of children. It claims that predators tend to pose as children on these apps, befriend young users through direct messaging, move the conversation from Roblox to Discord, then manipulate children into sending explicit content.
Two law firms, Cotchett, Pitre & McCarthy LLP and Anapol Weiss, filed the lawsuit on behalf of a teenage victim who is an example of what the firms say is a broader issue of Roblox and Discord providing platforms that facilitate the sexual abuse of children.
“What happened is far from an isolated event,” said attorney Anne Marie Murphy in a press release. “The plaintiff is but one of countless children whose lives have been devastated because of Roblox and Discord’s misrepresentations and defectively designed apps.”
The plaintiff is an unnamed 13-year-old boy who had to move across the country after a predator who interacted with him via Roblox and Discord learned of his location.
The boy allegedly sent a man explicit videos and pictures in exchange for “Robux,” the digital currency used to buy avatar modifications and user experiences on the app. The predator is facing criminal charges and is suspected of similarly exploiting 26 other children, according to the suit.
The suit alleges that Roblox knowingly hosts and promotes sexually explicit content without adequate barriers preventing children from accessing it. The platform apparently allows users to create “Roblox pornography,” where avatars simulate engaging in sexual acts, according to the suit.
Hindenburg Research, a research firm that conducts investigations into companies, referred to Roblox as an “X-rated pedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech,” according to a report on Roblox the firm published in 2024.
The suit also alleges that Roblox and Discord are aware of the ease with which predators can interact with, groom and coerce children into sending pornographic material through their platforms yet they do not enforce adequate safety measures to protect minors.
“There are steps Roblox and Discord could take to protect children and make their apps safer,” said attorney Kristen Feden in the press release. “But time and time again, they have refused to invest in basic safety features that could protect against exactly the kind of exploitation the plaintiff suffered here.”
It is not the first time that Roblox and Discord have faced lawsuits for allegedly failing to prevent predators from exploiting child users on their platforms. They were named in a lawsuit in 2022 in San Francisco Superior Court, alleging that the apps misrepresented themselves as being safe for children and allowed adults to exploit a young girl through direct messaging.
Roblox declined to comment on the latest lawsuit but reaffirmed that it consistently aims to promote safety.
“We cannot comment on pending litigation,” said a spokesperson from Roblox. “With that being said, Roblox takes the safety of its community very seriously. We are constantly innovating and launching new safety features, including more than 40 safety features and policies in 2024.”
Discord did not respond to a request for comment in time for publication.