
New Jersey Attorney General Matthew Platkin has filed a lawsuit against Discord over the platform's child safety measures. The suit alleges that Discord "misled parents regarding the effectiveness of its safety controls and obscured the dangers that children encounter while using the application," and argues that the company has not been transparent about the risks its youngest users face.
The Office of the Attorney General and the state's Division of Consumer Affairs concluded that Discord violated New Jersey's Consumer Fraud Act following a multiyear investigation. Although the specifics of the lawsuit remain confidential, Platkin's statements indicate that he plans to present evidence that Discord's practices compromised children's safety. He pointed out that the app's default settings allow users to receive friend requests from anyone, and that children under 13 can create accounts with alarming ease: according to Platkin, the only age verification when signing up is entering a date of birth.
In response to the allegations, Discord issued a statement expressing its commitment to user safety:
Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer. Given our engagement with the Attorney General’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today. We dispute the claims in the lawsuit and look forward to defending the action in court.
Over the years, Discord has rolled out a number of features aimed at safeguarding younger users. After a troubling report surfaced 35 criminal cases in which Discord users faced charges including kidnapping, grooming, or sexual assault, the platform introduced Family Center, a tool that lets parents monitor their children's activity on the app. In 2023, Discord launched Teen Safety Assist, which includes automatic content filters and notifications that alert users to violations of the app's guidelines. And in 2025, Discord helped found Roost, a nonprofit coalition with the specific mission of building open-source tools for child safety online.
Like many other social media platforms, Discord has faced scrutiny over user safety, particularly for minors, and the pressure for accountability is intensifying, as recent legislative initiatives show. In 2024, California lawmakers proposed measures restricting children's access to algorithm-driven social feeds, and this year Utah enacted an age verification law for app stores, both signs of growing concern for child safety in the digital landscape.
