Our mission at Discord is to create welcoming spaces where people can find belonging. We recognize that safety enables people to find belonging, and that’s why safety is one of our most important investments and priorities.
Safety is a collaborative and cross-functional effort at Discord. Our Engineering, Data, and Product teams build products with safety principles in mind. Our Policy team takes a nuanced and sophisticated approach to developing our Community Guidelines and forms strategic partnerships with academics, civil society, industry peers, and community moderators to advance our collective understanding of online safety. Our Safety team works with cutting-edge technology to detect and respond to abuse, both proactively and in response to reports from users, moderators, and trusted third-party reporters.
Safety is a vital priority for our company. Around 15% of all Discord employees are dedicated to this area, and every employee shares in the commitment to keeping Discord safe. These Transparency Reports provide insight into our continued investment in keeping Discord a safe place for people to find belonging.
This Transparency Report, our eighth since 2019, covers the second quarter of 2022, from April to June.
Report Updates
The February 2022 update to our Community Guidelines introduced our policy prohibiting harmful misinformation. The data for actions taken on accounts and servers engaging in the spread of harmful misinformation will be detailed in a new category of the same name.
Community Guidelines Enforcement
Discord publishes and maintains a comprehensive set of Community Guidelines that explains what content and behavior is and isn’t allowed on Discord. We invest heavily in our proactive efforts to detect and remove abuse before it’s reported to us. Through advanced tooling, machine learning, specialized teams, and partnering with external experts, we work to remove high-harm abuse before it is viewed or experienced by others.
We encourage users, moderators, and trusted reporters to submit reports if they believe an account or server is violating our Community Guidelines. We analyze these reports to determine if the content or behavior violates our Guidelines. We may take a number of enforcement actions including but not limited to issuing warnings; removing content; temporarily or permanently disabling or removing the accounts and/or servers responsible; and potentially reporting them to law enforcement.
This report details the actions that Discord has taken on accounts and servers that have violated our Community Guidelines and Terms of Service.
Actions Taken on Accounts and Servers
Account and Server Warnings
Discord issues warnings with the goal of preventing future violations of our Community Guidelines. For some high-harm issues such as Violent Extremism or Child Sexual Abuse Material (CSAM), a subcategory of Child Safety, we do not issue warnings but instead immediately disable the account and remove the content. In the case of CSAM, we also report the account to the National Center for Missing and Exploited Children.
We issue two types of warnings for accounts: Individual Account Warnings, which are issued to individual users, and Server Member Warnings, which target multiple members of a server and may be issued when warning or removing a server from Discord.
Individual accounts were issued 13,562 warnings, an increase of 39% when compared to the previous quarter. The number of warnings issued to servers increased by 64.5% to 2,624, and warnings issued to server members decreased by 26% to 1,546,843 accounts warned.
Accounts Disabled
We disabled 726,759 accounts between April and June 2022 for policy violations not including spam, a decrease of 31% from the 1,054,358 accounts disabled in the previous quarter.
The decrease in the number of accounts disabled mostly came from Child Safety, which accounted for 73% of total accounts disabled this quarter and saw a decrease of 35.5% from the previous quarter. A more in-depth explanation of the changes in this category can be found in the Enforcement Trend Analysis section of this report.
We disabled an additional 27,733,948 accounts for spam or spam-related offenses. This is an increase of 6.5% from the previous quarter when we disabled 26,017,742 spam accounts.
Servers Removed
We removed 28,370 servers between April and June 2022, a decrease of 29% from the 40,009 servers that we removed during the previous quarter.
The decrease mostly came from Child Safety, which accounted for nearly 53% of servers removed and saw a decrease of 38.5% from the previous quarter. A more in-depth explanation of the changes in this category can be found in the Enforcement Trend Analysis section of this report.
We continue to invest in our ability to proactively detect and remove servers before they’re reported to us – especially for high-harm categories. Overall, 59% of servers removed were removed proactively – an increase of 7% from the previous quarter.
Appeals
Discord allows users to appeal actions taken on their accounts if they believe that the enforcement was incorrect.
We welcome appeals and take them seriously when they raise context or information that we may not have known. We review appeals and reinstate accounts if we determine that a mistake was made, or if we believe, based on the appeal, that the user has recognized a violation made for a low-harm issue and will abide by our Community Guidelines once back on Discord. We received 122,714 appeals, an increase of 8.5% from the previous quarter. 733, or 0.6%, of users who submitted an appeal had their account reinstated.
Reports
Reports Received By Category and Action Rates of Reports
Reports received during the second quarter of 2022 decreased to 153,550 when compared to 178,414 reports received during the first quarter of 2022.
Overall, 34,346 reports, or 22%, identified violations of our Community Guidelines and led to action being taken. This was down slightly from the 24% of reports actioned during the previous quarter.
We also received 14,132,392 one-click reports for spam on 7,785,111 unique accounts during this reporting period.
Enforcement Trend Analysis
Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of cross-industry programs such as the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference.
We have a dedicated team and invest heavily in advanced tooling and education so parents know how our service works and understand the controls that can contribute to creating a positive and safe experience on Discord for their children. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. We continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.
Discord disabled 532,498 accounts and removed 15,163 servers for Child Safety in the second quarter of 2022. Servers removed for CSAM increased to 6,640, up from 1,271, with a proactive removal rate of 95%. This is a significant increase from the 52% CSAM server proactive removal rate in the first quarter of 2022, and resulted from the introduction of new tools built to identify and detect servers hosting this content.
Users who upload abuse material of minors to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement.
In the second quarter of 2022, we reported 21,529 accounts to NCMEC, a 101% increase in reports made when compared to the first quarter of 2022. 21,425 of those reports were media (images or videos), of which many were flagged through PhotoDNA – a tool that uses a shared industry hash database of known CSAM. 104 high-harm grooming or endangerment reports were also delivered to NCMEC.
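PhotoDNA itself is proprietary, but the general pattern it relies on – matching uploaded media against a shared database of known hashes – can be sketched in a few lines. The database contents and the exact-match SHA-256 digest below are illustrative assumptions, not Discord’s actual pipeline; real perceptual hashes are designed to survive resizing and re-encoding.

```python
import hashlib

# Hypothetical stand-in for the shared industry hash database.
# Real deployments use PhotoDNA's proprietary perceptual hashes;
# the exact-match SHA-256 digest here is only for illustration.
KNOWN_HASHES: set[str] = set()  # populated from the shared database


def hash_media(data: bytes) -> str:
    """Digest an uploaded file (exact-match stand-in for PhotoDNA)."""
    return hashlib.sha256(data).hexdigest()


def is_known_match(data: bytes) -> bool:
    """True if the file's digest appears in the known-hash database."""
    return hash_media(data) in KNOWN_HASHES
```

In a system like the one described above, a match would then be escalated for review and reporting to NCMEC rather than acted on automatically without oversight.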
Of the 532,498 accounts and 15,163 servers removed for Child Safety, 497,267 accounts and 7,834 servers were removed for Sexualized Content Regarding Minors (SCRM), the largest sub-category within Child Safety. This represents a 30% decrease in accounts disabled and a 65% decrease in servers removed for SCRM from the previous quarter.
These decreases can be attributed to better operating procedures and our enhanced detection capabilities, which enable our team to identify these servers faster. We believe these interventions have resulted in a behavioral shift, with fewer SCRM servers created and fewer accounts joining these servers when they are created.
Deceptive Practices
Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, and theft of authentication tokens is a violation of our Community Guidelines.
We disabled 6,150 accounts and removed 1,677 servers for Deceptive Practices during the second quarter of 2022.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 106,645* accounts and removed 2,326 servers for Exploitative and Unsolicited Content, which remained the second-largest category of accounts disabled. Accounts disabled for this category declined by 27.5% when compared to the previous quarter. This decline was largely the result of successfully targeting the specific abuse types that accounted for the higher volume of actionable content during the first quarter.
For nonconsensual pornography, one of the high-harm issues that Discord targets proactively, 701 servers were removed, with a proactive removal rate of 36%.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
During the second quarter of 2022, 13,779 accounts were disabled for harassment-related behavior, and 598 servers were removed for this issue.
Hateful Conduct
Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct.
During the second quarter of 2022, 5,719 accounts and 713 servers were removed for hateful conduct. Compared to the previous quarter, this was a decrease of 35% and 25%, respectively.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 148 accounts and removed 12 servers for this issue.
Misinformation
We recently published a blog post discussing our new policy prohibiting the sharing of false or misleading information on Discord that is likely to cause physical or societal harm. The post explains the policy and our enforcement criteria in more detail.
We disabled 270 accounts and removed 73 servers for misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
During the second quarter of 2022, 3,400 accounts and 351 servers were removed for platform manipulation issues unrelated to spam. An additional 27,733,948 accounts were disabled for spam or spam-related offenses.
We're focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures, and as a result of this work, 90% of accounts disabled for spam were disabled proactively, before we received any user report.
You can read more about how Discord fights spam here.
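To illustrate what a proactive rate like the 90% figure above measures, here is a minimal sketch that computes the share of account disables that occurred with no user report on file at the time of action. The Enforcement record type is a hypothetical data model for illustration, not Discord’s schema.

```python
from dataclasses import dataclass


@dataclass
class Enforcement:
    """Hypothetical record of one spam-related account disable."""
    account_id: int
    had_user_report: bool  # was any user report on file before action?


def proactive_rate(actions: list[Enforcement]) -> float:
    """Share of enforcement actions taken before any user report."""
    if not actions:
        return 0.0
    proactive = sum(1 for a in actions if not a.had_user_report)
    return proactive / len(actions)


# Example: 9 of 10 disables happened before any report was received.
sample = [Enforcement(i, had_user_report=(i == 0)) for i in range(10)]
print(f"{proactive_rate(sample):.0%}")  # prints "90%"
```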
Regulated or Illegal Activities
Using Discord for the purpose of engaging in regulated, illegal, or dangerous activities is strictly prohibited, including selling or facilitating the sale of prohibited or potentially dangerous goods or services.
We disabled 27,494 accounts for engaging in this behavior, an increase of 39.5% from the prior quarter.
A total of 4,639 servers were removed for this category with a proactive removal rate of 60.5%. The overall increase in warnings, accounts disabled, and servers removed for this category was the result of more sophisticated methods for proactive detection being introduced during the quarter.
Self-Harm Concerns
Using Discord to glorify or promote suicide or self-harm is not allowed under any circumstance.
Action may be taken on accounts or servers that encourage people to cut themselves or embrace eating disorders, or that otherwise manipulate and coerce other users into acts of self-harm. These actions are taken only on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.
We recently announced a new partnership with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here.
We disabled 2,495 accounts and removed 620 servers for self-harm concerns.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty with the intent to harass or shock others is not allowed on Discord.
In the second quarter of 2022, 15,630 accounts were disabled for posting violent and graphic content. We also removed 1,286 servers for violent and graphic content.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
This blog post discusses our methods for addressing violent extremism. Through partnering and engaging in cross-industry work with Tech Against Terrorism, the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.
We want to acknowledge the horrific attack in Buffalo, New York that occurred on May 14, 2022. You can read our full statement and a summary of the actions that we took here.
In the second quarter of 2022, 12,531 accounts and 910 servers were removed for violent extremism.
Information Requests
When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users.
Law enforcement must provide valid legal documentation for the requested information. We review each information request to ensure legal compliance.
Discord may also disclose user data to law enforcement in emergency situations when we possess a good faith belief that there is an imminent risk of serious physical injury or death. You can read more about how Discord works with law enforcement here.
Legal Requests
Discord received 1,647 pieces of legal process during the second quarter of 2022 and found 1,448 of them both legally valid and specific enough for us to identify an account and produce relevant information. We work to limit disclosures of user information and content so they match the specific circumstances dictated by each request.
Emergency Requests
Discord received 251 emergency disclosure requests from law enforcement during this period. These requests originated from law enforcement agencies around the world. We disclose user data to law enforcement absent legal process only when there is imminent risk of serious physical injury or death. We were able to identify 224 accounts based on the information provided by law enforcement for these requests and disclosed basic subscriber information in response to 113 emergency disclosure requests.
Intellectual Property Removal Requests
Our Community Guidelines and Terms of Service prohibit the sharing of content that infringes third-party intellectual property rights. In accordance with the Digital Millennium Copyright Act (DMCA), Discord will remove content and/or accounts in connection with claims of copyright infringement on Discord.
We review each DMCA notice to ensure that reports are valid and complete and that they are made in good faith.
Discord received 840 facially valid DMCA takedown notifications, all of which provided information sufficient for content removal upon review.
Our Commitment to Safety and Transparency
Creating a place where you can talk, hang out with friends, and find belonging in a welcoming and safe environment requires collaboration and dedication from all of us at Discord.
We’re proud of the work that we do to help keep people and communities on Discord safe. We hope that as a result of this work, you’re able to find belonging in the communities on Discord that you call home.
Addendum: On November 8, we updated this report with the correct number of accounts disabled for Exploitative and Unsolicited Content. Initially the report incorrectly stated 147,249 accounts had been disabled in this category. It has since been updated to reflect the correct number, 106,645.