Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or in society.
We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. We invest heavily in resources and education so parents know how our platform works and understand the controls that can contribute to creating a positive and safe experience on Discord for their children.
Our Family Center makes it easy for parents and guardians to stay informed about their teen's Discord activity, and our Parent Hub offers resources to learn more about these tools, along with tips on how to help teens stay safer on Discord.
We are an active supporter of the National Center for Missing and Exploited Children (NCMEC) and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement. Users who upload child sexual abuse material (CSAM) to Discord or who engage in high-harm activity towards children are reported to NCMEC and removed from the service.
Discord is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We've also partnered with the technology non-profit Thorn to build the right protections to empower teens with the tools and resources they need to have a safe online experience. You can read more here. We are a proud sponsor of the National Parent Teacher Association and ConnectSafely, and we partner with The Digital Wellness Lab to integrate their research on teen health and social media. We're also members of the Family Online Safety Institute, contributing to and learning from its important work.
We also partner with INHOPE, the global network combating online CSAM, and have become a member of the Internet Watch Foundation, whose resources we will use to expand our ability to identify and prevent child sexual abuse imagery. We're also a sponsor of events dedicated to increasing awareness of, and action on, child safety issues, such as the annual Dallas Crimes Against Children Conference.
You can read our Child Safety policies, developed with the latest research, best practices, and expertise in mind, here.
Discord disabled 128,153 accounts and removed 7,736 servers for Child Safety during the third quarter of 2023. We removed servers for Child Safety concerns proactively 86% of the time, and CSAM servers 87% of the time. We reported 51,916 accounts to NCMEC, a 40% increase from the previous quarter. We attribute this increase to our continued use of PhotoDNA and to the introduction of new technologies and hashing systems, such as our visual safety platform, PDQ, and CLIP, which you can read more about here. We observed a changing threat landscape and implemented new tools to remove CSAM at the message level while identifying and removing server owners who were distributing CSAM. As a result, we were able to remove a greater volume of CSAM content at the message and account level during this quarter, instead of removing servers outright.
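To illustrate how hash-based detection of known abuse material generally works (a sketch under assumptions, not Discord's implementation): systems such as PhotoDNA and PDQ derive a perceptual hash from each image and compare it against hash lists of known CSAM, treating small bitwise differences as a match so that re-encoded or lightly edited copies are still caught. The threshold and hash values below are hypothetical.

```python
# A minimal, illustrative sketch of perceptual hash-list matching in the
# style of PDQ. NOT Discord's implementation; threshold and hashes are
# hypothetical (PDQ documentation suggests ~31 bits as a starting threshold).

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two hex-encoded 256-bit hashes."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")

def matches_known_hash(candidate: str, hash_list: set[str], threshold: int = 31) -> bool:
    """Flag a candidate whose hash is within `threshold` bits of any listed hash."""
    return any(hamming_distance(candidate, known) <= threshold for known in hash_list)

# A hash differing by a single bit from a listed hash is still flagged,
# which is what lets perceptual hashing catch near-duplicate media.
known_hashes = {"f" * 64}           # hypothetical 256-bit hash, hex-encoded
candidate = "e" + "f" * 63          # differs from the listed hash by one bit
print(matches_known_hash(candidate, known_hashes))  # True
```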
We have continually invested more resources in combating CSAM on our platform, including a team solely focused on child safety as well as a dedicated engineering team.
Discord is committed to continually exploring new and improved safeguards that help keep younger users safe on our platform and online.
Deceptive Practices
Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, stealing authentication tokens, or participating in identity, investment, or financial scams is a violation of our Community Guidelines. You can read more about our Deceptive Practices policy here.
During the third quarter of 2023, 11,885 accounts and 2,389 servers were removed for Deceptive Practices. This was a 47% increase in accounts disabled and a 27.5% decrease in servers removed. The increase in accounts disabled was driven by more accounts disabled for sharing malware, and by a new tool that enables us to identify users attempting to steal Discord account tokens. The decrease in servers removed resulted from shifting our attention to proactively targeting malware sharing and token theft.
We issued 43,366 warnings to accounts for Deceptive Practices, which accounts for the significant increase in individual account warnings during the third quarter. Specifically, 40,179 accounts were warned for sharing and distributing malware.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent. You can read more about our Exploitative and Unsolicited Content policies here and here.
During the third quarter of 2023, 14,448 accounts and 2,157 servers were removed for Exploitative and Unsolicited Content. This was a decrease of 25.5% in the number of accounts removed, and a decrease of 18% in the number of servers removed.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines. You can read more about our Harassment and Bullying policy here, and our Doxxing policy here.
During the third quarter of 2023, 24,384 accounts and 736 servers were removed for Harassment and Bullying. This was a 128.5% increase in the number of accounts disabled, driven by an increase in accounts disabled for doxxing following internal tooling improvements.
Hateful Conduct
Hate or harm targeted at individuals or communities is not tolerated on Discord in any way. Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics. You can read more about our Hateful Conduct policy here.
During the third quarter of 2023, 4,189 accounts and 907 servers were removed for Hateful Conduct.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines. You can read more about our Identity and Authenticity policy here.
During the third quarter of 2023, 213 accounts and 56 servers were removed for Identity and Authenticity concerns.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information that may result in damage to physical infrastructure, injury to others, obstruction of participation in civic process, or the endangerment of public health. You can read more about our Misinformation policy here.
During the third quarter of 2023, 49 accounts and 24 servers were removed for Misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines. You can read more about our Platform Manipulation policy here.
We're focused on combating spam and minimizing users' exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures. You can learn more about how Discord combats spam here, and about AutoMod, a safety feature that enables server owners to automatically moderate certain abuse, including spam, here.
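As a rough sketch of the kind of rule-based filtering AutoMod exposes to server moderators (keyword and pattern rules applied to messages; the rules below are hypothetical examples, not Discord's actual rule set, which moderators configure per server):

```python
import re

# Hypothetical keyword rules in the spirit of AutoMod's keyword filters;
# real rules are configured by server moderators, not hardcoded.
BLOCKED_PATTERNS = [
    re.compile(r"free\s+nitro", re.IGNORECASE),      # common spam lure
    re.compile(r"discord\.gift\S+", re.IGNORECASE),  # fake gift-link spam
]

def should_block(message: str) -> bool:
    """Return True when a message matches any blocked pattern."""
    return any(pattern.search(message) for pattern in BLOCKED_PATTERNS)

print(should_block("Claim your FREE NITRO here!"))     # True
print(should_block("Anyone up for a match tonight?"))  # False
```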
During the third quarter of 2023, 3,515 accounts and 1,485 servers were removed for non-spam-related platform manipulation issues. An additional 12,845,746 accounts were disabled for spam or spam-related offenses, a 61% increase compared to the previous quarter. 99% of accounts disabled for spam were disabled proactively, before we received any user report.
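The proactive figures cited throughout this report describe enforcement actions taken before any user report was received. As a minimal illustration of that arithmetic using this paragraph's figures (the rounding is ours):

```python
# Illustrative arithmetic only: the proactive rate is the share of
# enforcement actions taken before any user report was received.
accounts_disabled_for_spam = 12_845_746
proactive_rate = 0.99

proactive = round(accounts_disabled_for_spam * proactive_rate)
reactive = accounts_disabled_for_spam - proactive
print(proactive, reactive)  # roughly 12,717,289 proactive vs 128,457 after a report
```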
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines. You can read more about our Regulated or Illegal Activities policies here and here.
During the third quarter of 2023, 47,211 accounts and 22,635 servers were removed for Regulated or Illegal Activities. Our rate of proactively removing servers was 74%.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for support. That said, platforms have a critical role to play in ensuring that these spaces do not normalize, promote, or encourage others to engage in acts of self-harm.
We may take action on content that seeks to normalize self-harming behaviors, encourages acts of self-harm, or discourages individuals from seeking help. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention. You may read more about this policy here.
We’re proud to partner with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. If a user reports a message for self-harm on Discord, they will be presented with information on how to connect with a volunteer Crisis Counselor. You can learn more here.
Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here. Since the launch of our partnership, there have been over 2,000 conversations started using Discord’s keyword.
During the third quarter of 2023, 1,090 accounts and 571 servers were removed for Self-Harm Concerns.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord. You can read more about our Violent and Graphic Content policy here.
During the third quarter of 2023, 17,024 accounts and 2,039 servers were removed for Violent and Graphic Content. We proactively removed servers for these issues 92% of the time, an increase from 71% in the previous quarter.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them. You can read more about our Violent Extremism policy here.
By partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, the Christchurch Call, and other organizations, we've made progress in our tooling, policy, and subject matter expertise to ensure violent extremism does not have a home on Discord.
During the third quarter of 2023, 8,300 accounts and 1,053 servers were removed for Violent Extremism. While the number of accounts disabled and servers removed declined, our proactive rate for removing servers for Violent Extremism increased from 77% to 98%. This increase can be attributed to investments in improving our proactive detection models for violent extremist content, as well as to our continued cross-industry work to bolster our tooling, policy, and awareness of emerging trends.