Discord app sued by New Jersey over child safety features: Complete drama explained

Discord app sued by New Jersey over child safety features (Image via X / @PopBase)

The New Jersey Attorney General has filed a lawsuit against social media company Discord, alleging that the company did not do enough to protect children and misled users about its safety features. The lawsuit, filed in New Jersey Superior Court by Attorney General Matthew Platkin and the state's Division of Consumer Affairs, claims the platform violated state consumer protection laws in its handling of child safety.

“Discord’s strategy of employing difficult to navigate and ambiguous safety settings to lull parents and children into a false sense of safety, when Discord knew well that children on the Application were being targeted and exploited, are unconscionable and/or abusive commercial acts or practices,” the legal filing read.



New Jersey lawsuit highlights concerns over Discord's child safety measures

The complaint alleges that the platform, whose large user base includes gamers and young people, has failed to put in place the protections for minors that the company promised. It also argues that the company's safety features are so complex and confusing that they may give parents a false sense of security.

Age verification is a central issue in the suit. The complaint contends that Discord's age verification process is ineffective, allowing children under 13 to access the platform simply by entering a fake date of birth.

Although the platform has a minimum age requirement in place, the lawsuit alleges that it is not properly enforced.

Additionally, the lawsuit scrutinizes the effectiveness of the instant messaging platform's Safe Direct Messaging feature. Plaintiffs argue that the tool was presented as being capable of automatically filtering out harmful or explicit content, but in practice, it allegedly failed to detect or block much of this material—particularly in private messages exchanged between users classified as “friends.”

In response, a Discord spokesperson said the company was surprised by the legal action, which it had not been aware of until that point, and pointed to its ongoing work on safety tools and features as part of its approach to protecting users.

“Given our engagement with the Attorney General’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today,” a Discord spokesperson said.

The case is part of a broader push by state officials across the U.S. to regulate social media platforms. In the past few years, lawsuits have been filed against major platforms, including Meta, TikTok, and Snap, alleging issues such as the enablement of predatory behavior and the financial exploitation of minors.

In many of these cases, attorneys general have cited growing concerns over child safety in digital spaces.


The lawsuit does not specify the amount of damages being sought but does request civil penalties. If successful, it could have significant implications for the social platform and other tech companies that cater to younger users.

Edited by Sugnik Mondal