Our mission at Discord is to give everyone the power to create belonging in their lives. We recognize that safety enables people to find belonging, and that’s why safety is one of our most important investments and priorities.
Safety is a collaborative and cross-functional effort at Discord. Our Engineering, Data, and Product teams build with safety principles in mind. Our Policy team takes a nuanced and sophisticated approach to developing our Community Guidelines and forms strategic partnerships with academics, civil society, industry peers, and community moderators to advance our collective understanding of online safety. Our Safety team works with cutting-edge technology to detect and respond to abuse, both proactively and in response to reports received from users, moderators, and trusted third-party reporters.
Safety is a vital priority for our company. Around 15% of all Discord employees are dedicated to working on safety, and every employee shares in the commitment to keeping Discord safe. These Transparency Reports provide insight into our continued investment in keeping Discord a safe place for people to find belonging.
This Transparency Report, our tenth since 2019, covers the fourth quarter of 2022, from October to December.
Community Guidelines Enforcement
Discord publishes and maintains a comprehensive set of Community Guidelines that explains what content and behavior is and isn’t allowed on Discord. We invest heavily in our proactive efforts to detect and remove abuse before it’s reported to us. Through advanced tooling, machine learning, specialized teams, and partnering with external experts, we work to remove high-harm abuse before it is viewed or experienced by others.
We encourage users, moderators, and trusted reporters to submit reports if they believe an account or server is violating our Community Guidelines. We analyze these reports to determine if the content or behavior violates our Guidelines. We may take a number of enforcement actions including but not limited to issuing warnings; removing content; temporarily or permanently disabling or removing the accounts and/or servers responsible; and reporting them to law enforcement.
This report details the actions that Discord has taken on accounts and servers that have violated our Community Guidelines and Terms of Service.
Discord issues warnings with the goal of preventing future violations of our Community Guidelines. For some high-harm issues such as Violent Extremism or Child Sexual Abuse Material (CSAM), a subcategory of Child Safety, we do not issue warnings; instead, we immediately disable the account and remove the content. In the case of CSAM, we also report the account to the National Center for Missing and Exploited Children.
The above chart breaks down warnings issued to individual accounts, servers, and server members. A text version of this chart is available here.
We issue two types of warnings for accounts: Individual Account Warnings, which are issued to individual users, and Server Member Warnings, which target members of a server and may be issued when warning or removing a server from Discord.
We issued 12,720 warnings to individual accounts, a decrease of 29% when compared to the previous quarter. The number of warnings issued to servers decreased by 17% to 1,926, and warnings issued to server members decreased by 5% to 1,228,882 accounts warned.
Accounts Disabled
A text version of the number of accounts disabled by category chart is available here.
We disabled 153,883 accounts for policy violations not including spam, a 17% decrease when compared to 185,756 accounts disabled in the previous quarter.
All categories saw fewer accounts disabled with the exception of Platform Manipulation and Regulated or Illegal Activities, which saw increases of 180% and 55% respectively.
An in-depth look at each of these categories, including Child Safety, can be found in the Enforcement Trend Analysis of this report.
Accounts disabled for spam or spam-related offenses decreased by 27% to 36,825,143 when compared to 50,510,769 spam accounts disabled in the previous quarter.
Servers Removed
The above chart breaks down the number of servers removed by category and whether they were removed proactively or reactively. A text version of this chart is available here.
We removed 33,232 servers, an increase of 9% from the 30,456 removed during the previous quarter.
We continue to invest in our ability to proactively detect and remove servers before they’re reported to us, especially for high-harm categories. We’re proud to share that 93.5% of servers removed for Child Safety were removed proactively, with CSAM servers being removed proactively 99% of the time. Overall, 73% of removed servers were removed proactively.
Appeals
The above chart shows the total percentage of disabled accounts that submitted an appeal, and the percentage of those appeals that were granted. A text version of this chart is available here.
Discord allows users to appeal actions taken on their accounts if they believe that the enforcement was incorrect.
We welcome appeals and take them seriously when they raise context or information that we may not have known. We review appeals and reinstate accounts if we determine that a mistake was made, or if we believe, based on the appeal, that the user has recognized the violation made for a low-harm issue and will abide by our Community Guidelines once back on Discord. We received appeals from 18% of disabled accounts. From these appeals, we reinstated 437 accounts, or 1.57% of users who submitted an appeal.
Reports
Reports Received By Category and Action Rates of Reports:
A text version of the reports received chart is available here.
We received 103,441 user reports during the fourth quarter of 2022. Overall, 16,576, or 16% of the reports we received, identified violations of our Community Guidelines leading to action taken.
We also received 7,023,882 one-click reports for spam on 2,979,464 unique accounts during this reporting period.
Enforcement Trend Analysis
Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of cross-industry programs such as the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues such as the annual Dallas Crimes Against Children Conference.
We invest heavily in advanced tooling and education so parents know how our service works and understand the controls that can contribute to creating a positive and safe experience on Discord for their children. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. We also continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.
Users who upload abuse material of children to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement.
In the fourth quarter of 2022, we reported 11,589 accounts to NCMEC. 11,520 of those reports were media (images or videos), of which many were flagged through PhotoDNA – a tool that uses a shared industry hash database of known CSAM. Additionally, 69 high-harm grooming or endangerment reports were delivered to NCMEC.
A text version of the reports filed to NCMEC chart is available here.
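PhotoDNA itself is a proprietary perceptual-hashing system, so the sketch below only illustrates the general pattern it relies on: computing a fingerprint of uploaded media and checking it against a shared database of known hashes. The file handling, the `knownHashes` set, and the use of SHA-256 as a stand-in for a perceptual hash are illustrative assumptions, not Discord's implementation.

```typescript
// Simplified, hypothetical sketch of hash-database matching. PhotoDNA uses a
// proprietary perceptual hash; an ordinary cryptographic hash stands in here.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// Hypothetical set of known hashes, e.g. loaded from a shared industry database.
const knownHashes = new Set<string>();

// Hash an uploaded file and check the digest against the known-hash set.
function isKnownMedia(filePath: string): boolean {
  const digest = createHash("sha256").update(readFileSync(filePath)).digest("hex");
  return knownHashes.has(digest);
}
```

In practice, perceptual hashes are compared by similarity rather than exact equality, which is what allows matches to survive resizing or re-encoding of the original media.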
Discord disabled 37,102 accounts and removed 17,426 servers for Child Safety during the fourth quarter of 2022, a decrease of 12% in accounts disabled and an increase of 20% in servers removed. Because we targeted and proactively removed networks of bad actors from Discord before they grew in size, fewer accounts participated in these spaces, and as a result, fewer accounts were disabled.
Our investment and prioritization in Child Safety has never been more robust. We were able to remove servers hosting CSAM content proactively 99% of the time. Removing CSAM content is one of our top priorities, and we are proud of the cross-functional efforts that have enabled us to achieve this proactive takedown rate for CSAM.
Deceptive Practices
Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, authentication token theft, or participating in identity, investment, and financial scams is a violation of our Community Guidelines.
We disabled 5,135 accounts and removed 2,418 servers for Deceptive Practices during the fourth quarter of 2022. This was a decrease of 42% in accounts disabled, and an increase of 13% in servers removed.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 15,963 accounts and removed 1,037 servers for exploitative and unsolicited content. This represents a 66% decrease in accounts disabled and a 43% decrease in servers removed when compared to the previous quarter.
This decrease was driven by our ability to identify and remove a specific abuse pattern, resulting in a decrease of servers created to host that content, and consequently, fewer accounts disabled for engaging with that content.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
In the fourth quarter of 2022, 8,366 accounts and 561 servers were removed for harassment and bullying. This was a decrease of 26% in accounts disabled, and a 1% increase in servers removed.
Hateful Conduct
Hate or harm targeted at individuals or communities is not tolerated on Discord in any way. Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics. You can read this policy blog post to learn more.
In the fourth quarter of 2022, 4,505 accounts and 620 servers were removed for hateful conduct. This was a decrease of 37% and 25% respectively.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 2,325 accounts and removed 19 servers for identity and authenticity concerns.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information that may lead to significant risk of physical or societal harm. You can read more about our health misinformation policy here.
We disabled 88 accounts and removed 48 servers for misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
In the fourth quarter of 2022, 3,353 accounts and 828 servers were removed for platform manipulation-related issues not related to spam. An additional 36,825,143 accounts were disabled for spam or spam-related offenses.
We're focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures, and as a result of this work, 99% of accounts disabled for spam were disabled proactively, before we received any user report.
You can read more about how Discord fights spam here. You can also read this blog post about Automod, a safety feature that enables server owners to automatically moderate certain abuse, including spam.
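As a rough illustration of the kind of rule-based filtering a feature like Automod lets server owners configure, the sketch below checks a message against a keyword list and a mention-count threshold. It is a minimal, hypothetical example: the `AutoModRule` shape, thresholds, and function names are invented and do not reflect Discord's actual Automod implementation or API.

```typescript
// Hypothetical, simplified illustration of a keyword-based auto-moderation rule.
interface AutoModRule {
  name: string;
  blockedKeywords: string[];     // keywords to block (case-insensitive)
  maxMentionsPerMessage: number; // crude mention-spam heuristic
}

interface ModerationResult {
  allowed: boolean;
  reason?: string;
}

function evaluateMessage(content: string, rule: AutoModRule): ModerationResult {
  const lower = content.toLowerCase();

  // Block messages containing any configured keyword.
  for (const keyword of rule.blockedKeywords) {
    if (lower.includes(keyword.toLowerCase())) {
      return { allowed: false, reason: `matched blocked keyword "${keyword}"` };
    }
  }

  // Block messages that mention an unusually large number of users.
  const mentionCount = (content.match(/<@\d+>/g) ?? []).length;
  if (mentionCount > rule.maxMentionsPerMessage) {
    return { allowed: false, reason: `too many mentions (${mentionCount})` };
  }

  return { allowed: true };
}

// Example usage:
const rule: AutoModRule = {
  name: "basic-spam-filter",
  blockedKeywords: ["free nitro", "click this link"],
  maxMentionsPerMessage: 5,
};
console.log(evaluateMessage("Get FREE NITRO here!", rule)); // { allowed: false, ... }
```

A real deployment would pair rules like these with rate limiting and account-level signals, but the configurable allow/block decision is the core idea behind this kind of server-level moderation.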
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines.
In the fourth quarter of 2022, 57,739 accounts and 7,617 servers were removed for regulated or illegal activities. This was an increase of 55% and 10% respectively. Our rate of proactively removing servers for regulated or illegal activities was 60%. The increases in accounts disabled and in the rate at which we proactively removed these servers were driven by improvements in our proactive tooling and our ability to remove servers before they grew in size.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for healing. That said, platforms have a critical role to play to ensure these digital spaces don’t attempt to normalize or promote hurtful behaviors or encourage other users to engage in acts of self-harm. We recently expanded our Self Harm Encouragement and Promotion Policy which you can read more about here.
Actions may be taken on accounts or servers with content or behavior that promotes or encourages individuals to commit acts of self-harm. We may take action on content that seeks to normalize self-harming behaviors, as well as content that discourages individuals from seeking help for self-harm behaviors. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.
In September of 2022, we announced a new partnership with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here.
In the fourth quarter of 2022, 1,320 accounts and 545 servers were removed for self-harm concerns. This was an increase of 2% in accounts disabled, and a decrease of 10% in servers removed.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord.
In the fourth quarter of 2022, 10,714 accounts and 1,283 servers were removed for violent and graphic content. This was a decrease of 13% and 2% respectively.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
This blog post discusses our methods to address violent extremism. Through partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we've made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.
In the fourth quarter of 2022, 7,223 accounts and 829 servers were removed for violent extremism. This was a decrease of 42% and 21% respectively. Our rate of proactively removing servers for violent extremism increased 20% during this quarter, to 65%.
Information Requests
When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users.
Law enforcement must provide valid legal documentation for the requested information. We review each information request to ensure legal compliance.
Discord may also disclose user data to law enforcement in emergency situations when we possess a good faith belief that there is an imminent risk of serious physical injury or death. You can read more about how Discord works with law enforcement here.
Legal Requests
Discord received 1,755 pieces of legal process during the fourth quarter of 2022, finding 1,524 both legally valid and specific enough for us to identify an account and produce relevant information. We work to limit disclosures of user information and content so they match the specific circumstances dictated by each request.
A text version of the requests from law enforcement chart is available here.
Emergency Requests
Discord received 225 emergency disclosure requests from law enforcement during this period. These requests originated from law enforcement agencies around the world. We disclose user data to law enforcement absent legal process only when there is imminent risk of serious physical injury or death. Our specialized team reviews each emergency request to authenticate the requestor and confirm the request meets our threshold for voluntary disclosure. We were able to identify 186 accounts based on the information provided by law enforcement for these requests and disclosed basic subscriber information in response to 103 emergency disclosure requests.
Intellectual Property Removal Requests
Our Community Guidelines and Terms of Service prohibit the sharing of content that infringes third-party intellectual property rights. In accordance with the Digital Millennium Copyright Act (DMCA), Discord will remove content and/or accounts in connection with claims of copyright infringement on Discord.
We review each DMCA notice to ensure that reports are valid and complete and that they are made in good faith.
Discord received 1,195 facially valid DMCA takedown notifications, of which 985 provided information sufficient for content removal upon request.
Our Commitment to Safety and Transparency
Creating a place where you can talk, hang out with friends, and find belonging in a welcoming and safe environment requires collaboration and dedication from all of us at Discord.
We’re proud of the work that we do to help keep people and communities on Discord safe. We hope that as a result of this work, you’re able to find belonging in the communities on Discord that you call home.