Discord Transparency Report: July – December 2021

We at Discord recognize that safety enables people to find belonging. That is why safety is one of Discord’s most important priorities and investments as a company.

Our Engineering, Data, and Product teams work with safety principles in mind when designing and building products. Our Policy team takes a nuanced and sophisticated approach to developing our Community Guidelines, our Community Moderation team trains and empowers moderators to keep their communities safe and healthy, and Trust & Safety works with cutting-edge technology to detect and respond to abuse, both proactively and from user reports. 

This expansive effort and commitment to safety brings together around 15% of all Discord employees. As Discord continues to grow, welcoming more users and evolving as a platform, we’re continuously expanding our efforts to help ensure that everyone can have a safer experience that allows for belonging on Discord. 

This Transparency Report is our sixth since 2019 and covers the second half of 2021, from July through December. With it, we've completed our reports covering 2019 through 2021, and we're looking to expand our transparency reporting efforts from 2022 onward.

We’re committed to providing more useful insight into the enormous effort that goes into making Discord a safe place for all and welcome your feedback on how to improve these reports.

Report Structure Updates

We spent the past few months thinking deeply about how we present our data to you, making sure the story we're telling is the most accurate representation of how Discord enforces its Guidelines. In past reports, we've changed how we categorize the types of policy violations on our platform. 

We recognize that those changes made it difficult to compare older reports to one another. Moving forward, our goal is to enable you to easily review and compare data from report to report. 

In this Transparency Report, we recategorized our safety matrix and shifted from 14 categories to 11 new categories:

A text version of the Safety Category Updates chart is available here.

We’re also restructuring the design of our Transparency Report. In the past, we showcased a high-level overview of user reports and actions taken. 

In this report, we will continue to do that while adding a new section to individually discuss trends and analysis and provide additional context on a category-by-category basis. We hope the new addition will provide deeper insight into the trends we’ve seen on Discord.

Please note that our Community Guidelines were updated recently to reflect several new and clarified policies. Since this report covers Discord’s safety efforts during the second half of 2021, the enforcement actions documented in this report are based on the previous version of our Community Guidelines.

Community Guidelines Enforcement

Discord publishes and maintains a comprehensive set of Community Guidelines that explains what is and isn’t allowed on the platform. Users and moderators are encouraged to submit reports if they suspect an account or server is violating our Guidelines. 

Discord reviews and analyzes these reports to determine if the content submitted violates our Guidelines. Depending on the severity, we may take a number of enforcement steps including but not limited to issuing warnings; removing content; disabling or removing the accounts and/or servers responsible; and potentially reporting them to law enforcement. 

Beyond reports, we invest heavily in our proactive efforts to detect and remove abuse before it’s reported to us. Through advanced tooling, machine learning, specialized Safety teams tackling specific abuse situations, and insight from experts outside the company, our team works to remove high-harm abuse before it is viewed by others.

This report details the actions that Discord has taken on accounts and servers that have violated our Community Guidelines and Terms of Service.

User Reports

Reports Received

A text version of the Reports Received chart is available here.

While Discord welcomed even more people onto the platform during the second half of 2021, the number of reports received decreased to 402,659, down 34,531 from the first half of 2021.

This decrease was the result of improving the user reporting experience by introducing a new one-click spam reporting feature built directly within the app. 

The one-click spam report feature, rolled out on all platforms in the second half of 2021, lets users report spam with a single click without leaving Discord. We received 35,027,133 one-click spam reports on 5,472,683 unique accounts from October through the end of December 2021. 


Reports Received by Category

A text version of the Reports Received by Category chart is available here.

Harassment and Bullying was the largest category of reports received, followed by Platform Manipulation, Hateful Conduct, and Child Safety. The Platform Manipulation category excludes reports submitted through the new one-click spam feature mentioned above. 


Action Rates of Reports Received by Category

A text version of the Action Rates of Reports Received by Category chart is available here.

Our team works hard to ensure that the most actionable and sensitive reports are routed to and resolved by the appropriate subject matter experts, and that high-harm reports, such as those related to Child Safety, are reviewed and resolved with urgency. 

On the whole, 28% of reports received were actioned, compared to 24% of reports actioned during the first half of 2021. Action rates were high for issues such as Child Safety, Identity and Authenticity, and Violent and Graphic Content. Harassment and Bullying, the largest category of reports received, had one of the lower action rates, at 18%, as behavior interpreted and reported as harassment often does not directly violate our Guidelines. 
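
To illustrate what an action rate means in absolute terms, the rough calculation below derives the approximate number of actioned reports from the rounded percentage above; it is an estimate for illustration, not an official count.

```python
# Approximate number of actioned reports implied by the figures above.
# The 28% action rate is a rounded figure, so the result is an estimate.
reports_received = 402_659
action_rate = 0.28

actioned_reports = reports_received * action_rate
print(f"~{actioned_reports:,.0f} reports actioned")  # ~112,745
```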


Actions Taken on Accounts and Servers

Account and Server Warnings

Warnings are issued to accounts and servers and are used frequently to correct problematic behavior that does not require immediate permanent removal from the platform. For some high-harm issues, such as Child Sexual Abuse Material (CSAM), a subcategory of Child Safety, we do not issue warnings; instead, we immediately remove the content, disable the account, and report it to the National Center for Missing and Exploited Children (NCMEC). 

The above chart breaks down warnings issued to individual accounts, servers, and server members. A text version of this chart is available here.

Account and server warnings increased only marginally when compared to the first half of 2021, with 24,071 individual accounts and 4,055 servers warned, an increase of 153 and 171 respectively. 

We’re introducing a new reporting metric in this section: Server Members Warned. Now, for spaces where problematic behavior is occurring and where the majority of server members are participating, warnings can be issued to all members in those servers. 

Server member warnings were issued to 2,288,201 accounts, a number nearly 100 times greater than the number of individual accounts warned. Server Member Warnings were used for servers engaged in Exploitative and Unsolicited Content and Platform Manipulation; this tool allows Discord to warn all of a server's members as an educational opportunity.


Accounts Disabled

A text version of the Number of Accounts Disabled by Category chart is available here.

We disabled 1,687,082 accounts between July and December 2021 for policy violations not including spam, a significant increase from the 470,465 accounts disabled during the first half of the year. 

Child Safety was the largest category of accounts disabled with 1,293,124 accounts, or nearly 77% of accounts disabled. A number of factors contributed to this, including our intentional and targeted set of efforts to detect and disable accounts engaging in issues concerning Child Safety, as well as the update to the safety matrix where parts of Exploitative and Unsolicited Content — the largest category of accounts disabled from the previous report — are now included in the Child Safety category.

We made significant improvements in our spam detection systems during the second half of 2021. Overall, 46,326,661 accounts were disabled for spam or spam-related offenses. This is an increase of 128% over the last reporting period, when we removed just over twenty million spam accounts. 
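
As a rough consistency check on that growth figure, the 128% increase can be inverted to estimate the prior-period total, which lines up with the "just over twenty million" spam accounts mentioned above; this is an illustrative back-of-envelope calculation, not an official figure.

```python
# Invert the stated 128% growth to estimate how many spam accounts were
# removed in the previous reporting period (H1 2021).
spam_accounts_h2_2021 = 46_326_661
growth = 1.28  # a 128% increase means the new total is 2.28x the old one

estimated_h1_2021 = spam_accounts_h2_2021 / (1 + growth)
print(f"~{estimated_h1_2021:,.0f} spam accounts removed in H1 2021")  # ~20,318,711
```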

Servers Removed

The above chart breaks down the number of Servers Removed by Category and if they were removed Proactively or Reactively. A text version of this chart is available here.

We removed 52,177 servers between July and December 2021, an increase from the 43,044 servers we removed during the first half of the year.

As with actions taken on individual accounts, Child Safety was the largest category of servers removed. That said, unlike accounts disabled, the number of servers removed was more evenly distributed across categories. 

We dedicate significant resources to proactively removing servers before they’re reported to us. These efforts currently target the highest-harm categories on our platform.

These high-harm areas include, from the Child Safety category, CSAM and Sexual Content Regarding Minors (SCRM), which is content that sexualizes and exploits minors but is not reportable to NCMEC; from the Exploitative and Unsolicited Content category, Non-Consensual Pornography; from the Regulated and Illegal Activities category, Cybercrime; and, lastly, Violent Extremism. 

The above chart shows a detailed breakdown of the high-harm areas that are proactively targeted.  A text version of this chart is available here.

Overall, 40.9% of these high-harm servers were removed proactively. As our tooling and techniques continue to advance, we aim to increase the rate of proactive server removals.


Appeals

The above chart shows the total percentage of accounts that submitted an appeal, and the percentage of users whose appeals were granted. A text version of this chart is available here.

Discord allows users to appeal actions taken on their accounts if they feel that the enforcement was incorrect. In this section, we’ll discuss our approach to how users can have actions taken on their accounts reviewed. 

We welcome appeals and take our users seriously when they make the effort to raise context or information we may not have had when making our decision. We review appeal requests and reinstate accounts if we determine that a mistake was made, or, for lower-harm issues, if we believe in good faith that the user has recognized the violation and will abide by our Community Guidelines once back on Discord.

We received 175,393 appeals for issues not including spam during the second half of 2021, a 68% increase in appeals from the previous reporting period. 5,661 accounts, or 0.34% of the 1,687,082 accounts disabled, were reinstated. Given the careful consideration that goes into reviewing appeals, this low rate of reinstatement reflects the high degree of accuracy of the actions we take.
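
The 0.34% figure above is the number of reinstated accounts divided by all non-spam accounts disabled; a related rate is the share of appeals that were granted. The short calculation below derives both from the figures in this section, for illustration only.

```python
# Reinstatement rates implied by the figures in this section (non-spam only).
accounts_disabled   = 1_687_082
appeals_received    = 175_393
accounts_reinstated = 5_661

print(f"{accounts_reinstated / accounts_disabled:.2%} of disabled accounts reinstated")  # 0.34%
print(f"{accounts_reinstated / appeals_received:.1%} of appeals granted")                # 3.2%
```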


Enforcement Trend Analysis

In this new addition to our Transparency Report, we’ll break down each category and go into detail about the subcategories and trends within each one. 


Child Safety

Discord has a zero-tolerance policy for anyone who endangers children. Users who upload abuse material of minors to Discord are reported to NCMEC and removed from the platform. We deeply value our partnership with NCMEC to ensure that grooming and endangerment cases are quickly escalated to authorities. 

In the second half of 2021, we reported 15,126 accounts to NCMEC. 14,906 of those reports concerned images or videos, many of which were flagged through PhotoDNA. The remaining 220 were high-harm grooming or endangerment reports. Overall, this represented a 56.8% increase in reports made to NCMEC when compared to the first half of the year.  

A text version of the Reports filed to NCMEC chart is available here.
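
For readers unfamiliar with how media flagging of this kind works, the sketch below illustrates the general idea behind hash-matching tools such as PhotoDNA: uploaded media is reduced to a compact fingerprint and compared against fingerprints of known abuse material, with near-matches escalated for review. The function names, fingerprints, and distance threshold are hypothetical placeholders for illustration; PhotoDNA's actual perceptual-hashing algorithm and matching criteria are proprietary and are not shown here.

```python
# Illustrative sketch of hash-based media matching (hypothetical names and
# values; not PhotoDNA's actual, proprietary algorithm).

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")

def is_known_match(fingerprint: int, known_fingerprints: list[int],
                   max_distance: int = 4) -> bool:
    """Return True if the fingerprint is close to any known-bad fingerprint.

    Perceptual hashes of near-duplicate images differ in only a few bits,
    so a small distance threshold also catches re-encoded or resized copies.
    """
    return any(hamming_distance(fingerprint, known) <= max_distance
               for known in known_fingerprints)

# Usage with made-up fingerprints: the upload differs from a known hash by
# one bit, so it would be flagged and escalated for human review.
known = [0b1011_0110_1100_0011, 0b0110_1001_0011_1100]
upload = 0b1011_0110_1100_0111
print(is_known_match(upload, known))  # True
```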

Beyond reporting to NCMEC, our team focuses on proactively removing users who are posting SCRM, which is the single largest individual category of accounts disabled, accounting for 1,124,307 accounts disabled and 11,314 servers removed.

Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. Our highest priority is to remove this content from our platform, and we work with industry peers and law enforcement to ensure that this effort extends beyond Discord. We’ve built a dedicated team for this work and invest heavily in advanced tooling and education so parents know how our service works and understand the controls that can contribute to creating a positive and safe experience for their children.

Discord has been an active partner in cross-industry programs and is a member of the Technology Coalition, the Family Online Safety Institute, the National Parent Teacher Association, and ConnectSafely, as well as an annual sponsor of the Dallas Crimes Against Children Conference. 


Deceptive Practices

Using Discord to distribute malware, share or sell game hacks or cheats, or steal authentication tokens is a violation of our Community Guidelines. 

For context, in our last Transparency Report we reported on hacks & cheats and malware as two distinct categories; they are now reported together under this new category. 

We disabled 19,643 accounts for Deceptive Practices. One notable trend is that accounts disabled for authentication token theft rose by 740% when compared to the first half of 2021. This increase was a result of a focus on account takeover cases. You can read more about our efforts to reduce account takeover and educate users here.

A total of 6,601 servers were removed for this category, with servers removed for token theft up more than 450% from our last report. 


Exploitative and Unsolicited Content

It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent, including non-consensual intimate materials.

We disabled 155,871 accounts and removed 7,604 servers for Exploitative and Unsolicited Content. This category was the second-largest category of accounts disabled.

Exploitative and Unsolicited Content is an area where we're constantly investing in proactive measures. For non-consensual pornography, leaks, and sexually degrading content, we proactively removed 19.3%, 21.7%, and 16.9% of servers, respectively. 


Harassment and Bullying

Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive threats, the sharing of someone’s personally identifiable information (also known as doxxing), and raiding are violations of our Community Guidelines.

In our last Transparency Report, we reported on doxxing and raiding as two distinct categories; they are now reported together here.

In the second half of 2021, 32,432 accounts were disabled for harassment-related behavior, and 1,867 servers were removed for this issue. 


Hateful Conduct

Discord doesn't allow the organization or promotion of, or participation in, hate speech or hateful conduct.

In the second half of 2021, 20,295 accounts were disabled and 2,772 servers were removed for this issue. The number of accounts disabled was down 21.5% when compared to the first half of the year, while the number of servers removed was up by more than 114% over the same time period. Accounts are most often disabled individually or alongside the removal of a server. 

When a server is removed, the owner's account is automatically disabled and, if necessary, so are the accounts of moderators and/or members when the violating behavior or participation is widespread. With updated operational procedures for Hateful Conduct, more servers may have been removed without disabling every member's account, which would explain this inverse trend. 


Identity and Authenticity

Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines. 

This was one of our smaller categories: we disabled 3,950 accounts and removed 17 servers for this issue.


Platform Manipulation

Spam, fake accounts, and self-bots are all examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.

In the second half of 2021, 16,490 accounts were disabled and 2,941 servers were removed for platform manipulation issues not related to spam. An additional 46,326,661 accounts were disabled for spam or spam-related offenses. 

We're focused on combating spam and minimizing users' exposure to spammers and spam content on Discord. We recently established a dedicated cross-functional anti-spam team combining Engineering, Data, Product, and Safety resources to tackle this issue, and we rolled out a one-click spam reporting feature that enables users to easily report spam. Currently, 95% of accounts disabled for spam are disabled proactively, before we receive any user report.

You can read more about how Discord fights spam here.

Regulated or Illegal Activities

Using Discord to engage in regulated, illegal, or dangerous activities is strictly prohibited, including selling or facilitating the sale of prohibited or potentially dangerous goods, phishing and other forms of social engineering, and DDoS attacks.

We disabled 66,774 accounts for engaging in this behavior. 41,447 accounts were disabled for engaging with regulated or illegal goods, an increase of more than 2,000%, while 25,327 accounts were disabled for other cybercrime-related behavior. The large increase in accounts disabled is the result of an internal operational shift that clarified how to action accounts engaging in these behaviors. 

A total of 8,101 servers were removed for this category, with 583 servers removed for engaging with regulated or illegal goods, an increase of more than 150%. 7,518 servers were removed for other cybercrime-related abuse, a decrease of 10% when compared to the first half of 2021. 73.4% and 52% of these servers, respectively, were removed proactively. 


Self-Harm Concerns

Using Discord to glorify or promote suicide or self-harm is not allowed under any circumstances. 

Actions may be taken on accounts or servers encouraging people to cut themselves or embrace eating disorders, or otherwise manipulating and coercing other users into acts of self-harm. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.

In the second half of 2021, 1,378 accounts were disabled for posting such content, an increase of 10% from the first half of the year.

We removed 412 servers for self-harm concerns. This reflects a 22% decline when compared to the first half of the year.


Violent and Graphic Content

Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty with the intent to harass or shock others is not allowed on Discord.

In the second half of 2021, 51,955 accounts were disabled for posting such content. 36,120 of these accounts were disabled for gore, and the remaining 15,835 accounts were disabled for content glorifying violence. These were increases of 129% and 33%, respectively, from the first half of the year.

We also removed 2,663 servers for violent and graphic content. Of these, 1,018 servers were removed for gore, and 1,645 servers were removed for content glorifying violence. The number of servers removed for gore remained the same as in the first half of the year, while the number of servers removed for glorification of violence increased by 57%. 

Violent Extremism

We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them. 

This blog post discusses our methods to address violent extremism on our platform. Through partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT) and other organizations, we've made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord. 

In the second half of 2021, 24,444 accounts and 2,182 servers were removed for violent extremism. This reflects a 10% increase in the number of servers removed since the last reporting period. We dedicate significant resources to proactively removing violent extremism from our platform; during this reporting period, 47.5% of these servers were removed proactively. 


Information Requests

When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users.

Law enforcement must provide valid legal documentation for the requested information. We review each information request to ensure legal compliance. 

Discord may also disclose user data to law enforcement in emergency situations when we possess a good faith belief that there is an imminent risk of serious physical harm. 


Legal Requests

Discord received 2,852 pieces of legal process during the second half of 2021. We complied with the majority of those requests, ultimately finding 2,559 of them both legally valid and specific enough for us to identify a user and produce the relevant information. We work to limit disclosures of user information and content so they match the specific circumstances dictated by each request. 

A text version of the Requests from Law Enforcement chart is available here.

Emergency Requests

Discord received 346 emergency disclosure requests from law enforcement during this period. These requests originated from law enforcement agencies around the world. We disclose user data to law enforcement absent legal process only when there is imminent risk of serious physical injury or death. 

We were able to identify 277 accounts based on the information provided by law enforcement for these requests and disclosed basic subscriber information in 162 emergency disclosure requests. 


Intellectual Property Removal Requests

For the first time, we are reporting on intellectual property removal requests from our platform. Our Community Guidelines and Terms of Service prohibit the sharing of content that infringes third-party intellectual property rights. In accordance with the Digital Millennium Copyright Act (DMCA), Discord will remove content and/or accounts in connection with claims of copyright infringement on our platform.

We review each DMCA notice to ensure that reports are valid and complete and that they are made in good faith.

Discord received 2,080 facially valid DMCA takedown notifications, of which 1,956 provided information sufficient for content removal upon review. 


Our Commitment to Safety and Transparency

Imagine a place where you can talk, hang out with your friends, and find belonging in a welcoming and safer environment: it doesn’t just happen. It takes immense effort, collaboration, and dedication from all of us at Discord. From Engineering to Data, Product to Policy, and Community Moderation to Trust & Safety, we strive to build the best platform for people around the world. 

We’re proud of the work that we do to help keep people and communities on Discord safe. We hope that as a result of this work, you’re able to feel safer and find your own sense of belonging in the communities on Discord you call home. 

Our commitment to safety and transparency is everlasting. We hope that these Transparency Reports, alongside our Policy & Safety Blog, will continue to educate and inspire others toward building a safer digital world.
