Is Roblox Getting Worse?

Roblox's Struggles Amid Safety Concerns and Lawsuits

Roblox, a multibillion-dollar company, has long faced criticism regarding the safety of its platform for young gamers. Despite announcing new measures in July, including an AI-powered age-verification system and privacy tools, concerns persist among researchers, experts, and lawyers. The fundamental question is whether these changes can effectively prevent individuals from exploiting players on the platform.

The Allure and Risks of Roblox for Kids

Launched in 2006, Roblox was designed to let kids build games with simple tools that would appeal to their peers. In-game currency (Robux) enables the purchase of avatar outfits and other items, and the open-world nature of most games, combined with chat and trading features, has made the platform a popular semi-social space, especially during Covid-19 lockdowns. For instance, the summer hit "Grow a Garden," reportedly created by a 16-year-old, helped push daily active users past 100 million by the end of July.

However, this freedom has made the platform difficult to moderate. Default chat settings allow unrestricted communication between users, and with so many people active on the platform, kids have no reliable way of knowing who is operating the avatars they encounter. Although Roblox introduced "Trusted Connections" to limit contact between older and younger users, experts remain skeptical that the feature fully protects minors.

The Legal Backlash

For years, Roblox has faced allegations that its platform serves as a haven for pedophiles, fascists, and nihilist groups such as No Lives Matter and 764. Now, a group of law firms across the US is preparing to file a significant number of lawsuits. Matt Dolman of Dolman Law Group anticipates hundreds of lawsuits pending by the end of September, and over 1,000 filed by this time next year. His firm alone has about 300 cases; the vast majority of clients are under 16, and approximately 60 percent of the cases involve girls.

Individual lawsuits have already been filed. In May, the parents of a 15-year-old in Indiana sued, alleging grooming by a man on the FBI watch list. In July, the parents of a 13-year-old in Alabama filed a lawsuit, claiming manipulation with Robux that led to an attempted sexual assault. Also in July, a lawsuit was filed on behalf of a 13-year-old from Iowa who was allegedly kidnapped and raped. These lawsuits claim that Roblox has created an unsafe environment for children while falsely marketing itself as safe.

Law firms like Stinar Gould Grieco & Hensley are representing over 400 victims of sexual abuse and grooming on Roblox and Discord. Lawyers from other firms, such as Wagstaff & Cartmell and Anapol Weiss, are also evaluating potential victims, with cases involving both boys and girls aged 10 to 15 at the time of the alleged exploitation.

Roblox's Response and the Reality

Roblox has stated that safety is a top priority. Spokesperson Stefanie Notaney cited rigorous safeguards, including restrictions on sharing personal information, links, and user-to-user image sharing, and a prohibition on sexual conversations. The company's chief safety officer, Matt Kaufman, has also said that safety has been part of Roblox's DNA since its inception.

However, the data tell a different story. The number of cases of suspected child sexual exploitation that Roblox has reported to the National Center for Missing and Exploited Children (NCMEC) has grown exponentially: from 675 cases in 2019, it more than tripled to over 2,200 in 2020 and exceeded 24,000 by 2024. In the first six months of 2025 alone, the number of reports was nearly as high as the total for all of 2024, and reports related to online enticement doubled.

Moderation Challenges

Roblox uses a combination of human moderators and AI- and machine-learning-driven automated systems to filter out problematic content. Nevertheless, researchers have shown that these measures can be bypassed, with perpetrators using coded language to evade detection. In many cases, conversations are moved to platforms like Discord or Snapchat, where there is less oversight.

Discord, where many Roblox players connect outside of the games, requires users to be at least 13 years old and uses machine learning and human moderation to prevent sexual exploitation. However, like Roblox, it has faced challenges: gamers on Discord managed to fool its age-verification features within days.

Roblox's "Trusted Connections" feature aims to keep communication from migrating to other platforms. But critics worry it may fail, given how easily similar features on Discord were circumvented.

Law Enforcement and Extremist Group Involvement

Law enforcement agencies across the US have been tracking grooming on the Roblox platform. Since 2018, at least two dozen people have been arrested for abducting or abusing victims they met or groomed on Roblox.

Moreover, nihilistic groups like No Lives Matter, CVLT, and 764, part of the Com network, have used Roblox to recruit children, encouraging self-harm, the sharing of explicit content, and, in extreme cases, suicide. Extremism researcher Ry Terran argues that the new safeguards give young teens more chatting opportunities while shifting the safety burden onto kids and parents rather than the company.

Update: 8/18/2025, 5:30 PM EDT: This story has been updated to include comment from Discord.