Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling and unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference.
We invest heavily in advanced tooling and education so parents know how our service works and understand the controls that help create a positive and safe experience for their children on Discord. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. We also continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.
Users who upload abuse material of children to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and its efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement.
In the fourth quarter of 2022, we reported 11,589 accounts to NCMEC. 11,520 of those reports concerned media (images or videos), many of which were flagged through PhotoDNA, a tool that matches uploads against a shared industry hash database of known child sexual abuse material (CSAM). Additionally, 69 high-harm grooming or endangerment reports were delivered to NCMEC.
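As a rough illustration of how hash-database matching works (a minimal sketch, not Discord’s or PhotoDNA’s actual implementation, which is proprietary and uses a perceptual rather than cryptographic hash), each upload is hashed and checked against a database of known-bad hashes, and a match triggers review and reporting:

```python
import hashlib

# Stand-in for the shared industry database of known hashes; in a real
# deployment this would be populated from a vetted external source, not
# hard-coded. The set here is hypothetical and empty.
KNOWN_HASHES: set[str] = set()

def media_hash(data: bytes) -> str:
    # SHA-256 is a simplification: it matches only byte-identical files,
    # whereas PhotoDNA's perceptual hash also matches resized or lightly
    # edited copies of a known image.
    return hashlib.sha256(data).hexdigest()

def should_flag(upload: bytes) -> bool:
    # A match means the upload corresponds to known material and should be
    # escalated for human review and reporting (e.g., to NCMEC).
    return media_hash(upload) in KNOWN_HASHES
```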
Discord disabled 37,102 accounts and removed 17,426 servers for Child Safety during the fourth quarter of 2022. Compared to the previous quarter, this was a 12% decrease in accounts disabled and a 20% increase in servers removed. Because we targeted and proactively removed networks of bad actors from Discord before they grew in size, fewer accounts participated in these spaces, and as a result, fewer accounts were disabled.
Our investment in and prioritization of Child Safety have never been more robust. We were able to remove servers hosting CSAM proactively 99% of the time. Removing CSAM is one of our top priorities, and we are proud of the cross-functional efforts that have enabled us to achieve this proactive takedown rate.
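“Proactively” here means before any user report. Assuming the standard definition of a proactive takedown rate (the figures below are illustrative, not Discord’s internal data), the metric works out as follows:

```python
def proactive_rate(proactive_removals: int, total_removals: int) -> float:
    # Share of all removals that happened before any user report arrived.
    return proactive_removals / total_removals

# Illustrative only: 99 of 100 hypothetical removals made proactively.
print(f"{proactive_rate(99, 100):.0%}")  # prints "99%"
```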
Deceptive Practices
Using Discord to distribute malware, share or sell game hacks or cheats, steal authentication tokens, or participate in identity, investment, or financial scams is a violation of our Community Guidelines.
We disabled 5,135 accounts and removed 2,418 servers for Deceptive Practices during the fourth quarter of 2022. This was a decrease of 42% in accounts disabled, and an increase of 13% in servers removed.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 15,963 accounts and removed 1,037 servers for exploitative and unsolicited content. This represents a 66% decrease in accounts disabled and a 43% decrease in servers removed when compared to the previous quarter.
This decrease was driven by our ability to identify and remove a specific abuse pattern, which reduced the number of servers created to host that content and, consequently, the number of accounts disabled for engaging with it.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
In the fourth quarter of 2022, 8,366 accounts and 561 servers were removed for harassment and bullying. This was a decrease of 26% in accounts disabled, and a 1% increase in servers removed.
Hateful Conduct
Hate or harm targeted at individuals or communities is not tolerated on Discord in any way. Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics. You can read this policy blog post to learn more.
In the fourth quarter of 2022, 4,505 accounts and 620 servers were removed for hateful conduct. This was a decrease of 37% and 25% respectively.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 2,325 accounts and removed 19 servers for identity and authenticity concerns.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information that may lead to significant risk of physical or societal harm. You can read more about our health misinformation policy here.
We disabled 88 accounts and removed 48 servers for misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
In the fourth quarter of 2022, 3,353 accounts and 828 servers were removed for platform manipulation issues unrelated to spam. An additional 36,825,143 accounts were disabled for spam or spam-related offenses.
We're focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures, and as a result of this work, 99% of accounts disabled for spam were disabled proactively, before we received any user report.
You can read more about how Discord fights spam here. You can also read this blog post about AutoMod, a safety feature that enables server owners to automatically moderate certain abuse, including spam.
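As a simplified sketch of how keyword-based auto-moderation of this kind works (this is not AutoMod’s actual implementation, and the blocked patterns below are hypothetical examples a server owner might configure):

```python
import re

# Hypothetical moderator-configured rules, in the spirit of AutoMod's
# keyword filters: messages matching any pattern are held back or removed.
BLOCKED_PATTERNS = [
    re.compile(r"free\s+nitro", re.IGNORECASE),          # common spam lure
    re.compile(r"\bdiscord\.gift/\S+", re.IGNORECASE),   # gift-style links often spoofed in scams
]

def automod_blocks(message: str) -> bool:
    # Return True if the message trips any configured filter.
    return any(pattern.search(message) for pattern in BLOCKED_PATTERNS)

# Spam-like text is caught; ordinary chat passes through.
assert automod_blocks("Claim your FREE NITRO here: discord.gift/abc123")
assert not automod_blocks("Anyone up for a game tonight?")
```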
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines.
In the fourth quarter of 2022, 57,739 accounts and 7,617 servers were removed for regulated or illegal activities. This was an increase of 55% and 10% respectively. Our rate of proactively removing servers for regulated or illegal activities was 60%. The increase in accounts disabled and in the rate at which we proactively removed these servers was driven by improvements in proactive tooling, which enabled us to identify and remove more of these servers and the accounts participating in them.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for healing. That said, platforms have a critical role to play in ensuring that these digital spaces don’t normalize or promote hurtful behaviors or encourage other users to engage in acts of self-harm. We recently expanded our Self-Harm Encouragement and Promotion Policy, which you can read more about here.
Actions may be taken on accounts or servers with content or behavior that promotes or encourages individuals to commit acts of self-harm. We may take action on content that seeks to normalize self-harming behaviors, as well as content that discourages individuals from seeking help for self-harm behaviors. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.
In September of 2022, we announced a new partnership with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here.
In the fourth quarter of 2022, 1,320 accounts and 545 servers were removed for self-harm concerns. This was an increase of 2% in accounts disabled, and a decrease of 10% in servers removed.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord.
In the fourth quarter of 2022, 10,714 accounts and 1,283 servers were removed for violent and graphic content. This was a decrease of 13% and 2% respectively.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
This blog post discusses our methods to address violent extremism. Through cross-industry partnerships with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.
In the fourth quarter of 2022, 7,223 accounts and 829 servers were removed for violent extremism. This was a decrease of 42% and 21% respectively. Our rate of proactively removing servers for violent extremism increased 20% during this quarter, to 65%.