Discord

We recognize that safety enables people to find belonging. That is why safety is one of Discord’s most important priorities and investments as a company.

Our Engineering, Data, and Product teams work with safety principles in mind when building products. Our Policy team takes a nuanced and sophisticated approach to developing our Community Guidelines and forms strategic partnerships with academics, civil society, industry peers, and moderators to advance our collective understanding of online safety. Our Trust & Safety team works with cutting-edge technology to detect and respond to abuse, both proactively and in response to reports from users, moderators, and trusted third-party reporters.

This expansive effort and commitment to safety brings together around 15% of all Discord employees. We’re continuously expanding our efforts to help ensure that everyone can have a safer experience that allows for belonging on Discord. 

Transparency is one of the core values of our safety work at Discord. These Transparency Reports are a way to demonstrate our commitment and provide insight into the enormous effort that goes into keeping Discord a safe place for all. 

Report Structure Updates

As part of our ongoing commitment to be more transparent about the work that goes into keeping Discord safe, we’re doubling the number of Transparency Reports that we publish annually by moving from a semi-annual cadence to a quarterly cadence. This Report is our seventh since 2019, and our first quarterly Report. 

In making this transition, we’re changing the reporting period reflected in the data. This Report covers the first quarter of 2022, from January to March, and compares it to the fourth quarter of 2021, from October to December.

We recently updated our Community Guidelines to reflect several new and clarified policies which went into effect at the end of March. Since this Report covers Discord’s safety efforts during the first quarter of 2022, the enforcement actions documented in this report are based on the previous version of our Community Guidelines.

Community Guidelines Enforcement

Discord publishes and maintains a comprehensive set of Community Guidelines that explains what content and behavior is and isn’t allowed on Discord. We invest heavily in our proactive efforts to detect and remove abuse before it’s reported to us. Through advanced tooling, machine learning, specialized Safety teams tackling specific abuse situations, and partnering with experts outside the company, our team works to remove high-harm abuse before it is viewed or experienced by others.

We encourage users, moderators, and trusted reporters to submit reports if they believe an account or server is violating our Community Guidelines. We review and analyze these reports to determine if the content or behavior violates our Community Guidelines. Depending on the severity, we may take a number of enforcement steps including but not limited to issuing warnings; removing content; temporarily or permanently disabling or removing the accounts and/or servers responsible; and potentially reporting them to law enforcement. 

This Report details the actions that Discord has taken on accounts and servers that have violated our Community Guidelines and Terms of Service.

Actions Taken on Accounts and Servers

Account and Server Warnings

Discord provides warnings as an educational opportunity and as a support system to prevent future violations of our Community Guidelines. Warnings issued to accounts and servers are used frequently to correct problematic behavior that does not require immediate permanent removal from Discord. For some high-harm issues such as Child Sexual Abuse Material (CSAM), a subcategory of Child Safety, we do not issue warnings but rather immediately disable the account, remove the content, and report the account to the National Center for Missing & Exploited Children (NCMEC).

A text chart showing warning types such as "harassment and bullying" and the number of servers and individuals warned, totaling over 2 million.
The above chart breaks down warnings issued to individual accounts, servers, and server members. An accessible text version of this chart is available on GitHub.

We issue two types of warnings for accounts: Individual Account Warnings are issued to individual users who engage in problematic behavior; Server Member Warnings target multiple members of a community and may be issued either at the time a server is warned or when a server is removed from Discord.

Servers in violation of our policies against Exploitative and Unsolicited Content and Regulated or Illegal Activities are often removed without an intermediary warning. As a result, the proportion of Server Members warned for these categories looks more typical when the number of servers warned is combined with the number of servers removed.

Warnings issued to Individual Accounts increased by 17% when compared to the previous quarter, to 9,752 accounts, and the number of warnings issued to servers decreased by 14.5% to 1,595 servers. Server Member Warnings were issued to 2,086,826 accounts, an increase of 15.5% when compared to the previous quarter.

Accounts Disabled

A pie chart displaying the number of accounts disabled by category, including 146,897 accounts disabled for "Exploitative and Unsolicited Content."
An accessible text version of the number of accounts disabled by category chart is available on GitHub.

We disabled 1,054,358 accounts between January and March 2022 for policy violations not including spam, a decrease of 6.5% from the 1,127,220 accounts disabled in the previous quarter.

Child Safety was the largest category of accounts disabled with 826,591 accounts, or 78.5% of accounts disabled. This is the result of our intentional and targeted set of efforts to detect and disable accounts violating our policies regarding Child Safety. 

Overall, 26,017,742 accounts were disabled for spam or spam-related offenses. This is an increase of 31% from the previous quarter, when we disabled 19,794,949 spam accounts. 
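For readers who want to reproduce the quarter-over-quarter changes cited throughout this report, the sketch below applies the standard percent-change calculation to the spam figures above. The helper function is ours, written for illustration, and is not part of any Discord tooling.

```python
def percent_change(previous: int, current: int) -> float:
    """Quarter-over-quarter change, expressed as a percentage."""
    return (current - previous) / previous * 100

# Spam-related account removals cited in this report.
q4_2021_spam_accounts = 19_794_949
q1_2022_spam_accounts = 26_017_742

change = percent_change(q4_2021_spam_accounts, q1_2022_spam_accounts)
print(f"{change:.1f}% increase")  # ~31.4%, rounded to 31% in the report
```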

Servers Removed

A bar graph displaying the top proactive and reactive reasons for server removal, the highest being Child Safety with 24,706 servers removed.
The above chart breaks down the number of servers removed by category and if they were removed proactively or reactively. An accessible text version of this chart is available on GitHub.

We removed 40,009 servers between January and March 2022, an increase of 61% from the 24,841 servers that we removed during the previous quarter. This increase was due to our faster detection and response time when removing problematic servers. 

As with actions taken on individual accounts, Child Safety was the largest category of servers removed.

We dedicate significant resources to proactively removing servers before they’re reported to us. These efforts especially target the highest-harm categories on our platform.

These include, from the Child Safety category, CSAM and Sexualized Content Regarding Minors (SCRM), content that sexualizes and exploits minors but is not reportable to NCMEC; from the Exploitative and Unsolicited Content category, Non-Consensual Pornography; from the Regulated or Illegal Activities category, Cybercrime; and, lastly, Violent Extremism.

A bar graph showing the breakdown of Child Safety server removals, with SCRM (sexualized content regarding minors) as the top reason with 22,499 removals.
The above chart shows a detailed breakdown of the high-harm areas that are proactively targeted.  An accessible text version of this chart is available on GitHub.

Overall, 62.5% of these high-harm servers were removed proactively compared to 42% in Q4 of 2021. We aim to increase the rate of proactive server removals, especially for these high-harm issues.


Appeals

A bar graph showing the percentage of accounts that submitted a ban appeal versus the percentage of appeals granted, displayed by category of ban reason.
The above chart shows the total percentage of accounts that submitted an appeal, and the percentage of those appeals that were granted. An accessible text version of this chart is available on GitHub.

Discord allows users to appeal actions taken on their accounts if they feel that the enforcement was incorrect.

We welcome appeals and take our users seriously when they make the effort to raise context or information that we may not have known when we made our decision. We review appeal requests and reinstate accounts if we determine that a mistake was made, or if we believe in good faith that the user has recognized a lower-harm violation and will abide by our Community Guidelines once back on Discord. We received 113,231 appeals for issues not including spam, a 16% decrease in appeals from the previous quarter. 2,365 users who submitted an appeal, or 2%, had their accounts reinstated.

Reports

Reports Received By Category and Action Rates of Reports

Bar graph showing breakdown of total reports, reports actioned, and the percentage of reports actioned per category.
An accessible text version of the reports received chart is available on GitHub.

Reports received during the first quarter of 2022 increased marginally to 178,414, compared to 175,292 reports during the fourth quarter of 2021. We received 65,700 reports in January, 57,096 in February, and 55,618 in March.

We also received 21,200,699 one-click reports for spam on 11,011,054 unique accounts during this reporting period.

Enforcement Trend Analysis

Child Safety

Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Users who upload abuse material of minors to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement. 

In the first quarter of 2022, we reported 10,695 accounts to NCMEC. 10,641 of those reports were media (images or videos), many of which were flagged through PhotoDNA, a tool that uses a shared industry hash database of known CSAM. The other 54 were high-harm grooming or endangerment reports. Overall, this represented a 29% increase in reports made to NCMEC when compared to the fourth quarter of 2021.
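PhotoDNA itself is proprietary, but the hash-database matching it relies on can be illustrated with a generic sketch. The snippet below assumes a hypothetical compute_perceptual_hash helper and a locally loaded set of known hashes; it shows only the general flag-on-match pattern, not Discord's or Microsoft's actual implementation.

```python
import hashlib
from typing import Set

def compute_perceptual_hash(image_bytes: bytes) -> str:
    """Hypothetical stand-in for a perceptual hash such as PhotoDNA.

    Real perceptual hashes are robust to resizing and re-encoding;
    this placeholder only marks where that computation would happen.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes, known_hashes: Set[str]) -> bool:
    """Flag media whose hash appears in a shared database of known CSAM."""
    return compute_perceptual_hash(image_bytes) in known_hashes

# Usage sketch: a match would be escalated for human review and,
# when confirmed, reported to NCMEC.
known_hashes = {"<hash from a shared industry database>"}
if is_known_match(b"<uploaded media bytes>", known_hashes):
    print("match found: escalate for review and NCMEC reporting")
```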

Bar graph displaying media reports by Month (January, February, March) and the number of grooming or endangerment escalations that Discord reported to NCMEC.
An accessible text version of the reports filed to NCMEC chart is available on GitHub.

Discord disabled 826,591 accounts and removed 24,706 servers for Child Safety in the first quarter of 2022. Sexualized Content Regarding Minors (SCRM) is the single largest individual sub-category of accounts disabled within Child Safety, accounting for 718,385 accounts disabled and 22,499 servers removed.

Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of organizations such as NCMEC and is a member of the Technology Coalition. We’re also an annual sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the Dallas Crimes Against Children Conference.

We have a dedicated team and invest heavily in advanced tooling and education so parents know how our service works and understand the controls that can contribute to creating a positive and safe experience on Discord for their children. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. In recognition of Safer Internet Day in February, Discord and National PTA hosted an event to bring together parents, educators, and teens to discuss online safety. We continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.

Deceptive Practices

Using Discord to distribute malware, share or sell game hacks or cheats, or steal authentication tokens is a violation of our Community Guidelines.

We disabled 5,091 accounts and removed 1,797 servers for Deceptive Practices during the first quarter of 2022.

Exploitative and Unsolicited Content

It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.

We disabled 146,897 accounts and removed 3,525 servers for Exploitative and Unsolicited Content. This category was the second-largest category of accounts disabled. These numbers were similar to the previous quarter, with one notable trend being that accounts disabled for Non-Consensual Pornography increased by 132% as a result of increased proactive efforts to remove this content from Discord.

Harassment and Bullying

Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.

During the first quarter of 2022, 13,423 accounts were disabled for harassment-related behavior, and 716 servers were removed for this issue. Both the number of accounts disabled and servers removed stayed roughly consistent with the previous quarter. 

Hateful Conduct

Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct.

During the first quarter of 2022, 8,806 accounts and 965 servers were removed for Hateful Conduct. Compared to the previous quarter, this was a decrease of 6.5% for accounts and 18% for servers.

Identity and Authenticity

Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines. 

We disabled 154 accounts and removed 16 servers for this issue.

Platform Manipulation

Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.

During the first quarter of 2022, 2,905 accounts and 972 servers were removed for platform manipulation-related issues not related to spam. An additional 26,017,742 accounts were disabled for spam or spam-related offenses. 

We’re focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We recently established a dedicated cross-functional anti-spam team combining Engineering, Data, Product, and Safety resources to tackle this issue, and we rolled out a one-click spam reporting feature that enables users to easily report spam. In the first quarter of 2022, 90% of accounts disabled for spam were disabled proactively – before we received any user report.

You can read more about how Discord fights spam here.

Regulated or Illegal Activities

Using Discord for the purpose of engaging in regulated, illegal, or dangerous activities is strictly prohibited, including selling or facilitating the sale of prohibited or potentially dangerous goods or services.

We disabled 19,669 accounts for engaging in this behavior.

A total of 4,027 servers were removed for this category. Cybercrime had a proactive takedown rate of 51%, with 1,996 of the 3,883 servers removed proactively.

Self-Harm Concerns

Using Discord to glorify or promote suicide or self-harm is not allowed under any circumstance. 

We may take action on accounts or servers that encourage people to cut themselves or embrace eating disorders, or that otherwise manipulate or coerce other users into acts of self-harm. These actions are taken only on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.

We disabled 1,795 accounts and removed 594 servers for Self-Harm concerns.

Violent and Graphic Content

Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty with the intent to harass or shock others is not allowed on Discord.

In the first quarter of 2022, 16,069 accounts were disabled for posting violent and graphic content. 5,419 of these accounts were disabled for gore, and the remaining 10,650 accounts were disabled for content glorifying violence. This was a 34% decrease in accounts disabled for gore and a 64% increase in the number of accounts disabled for content glorifying violence.

We also removed 1,657 servers for violent and graphic content. Of these, 291 servers were removed for gore and 1,366 servers were removed for content glorifying violence. The number of servers removed for gore decreased by 22% while the number of servers removed for glorification of violence increased by 89%.

Violent Extremism

We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them. 

This blog post discusses our methods to address violent extremism. Through partnering and engaging in cross-industry work with Tech Against Terrorism, the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.

In the first quarter of 2022, 12,928 accounts and 1,034 servers were removed for violent extremism. This reflects a 9% increase in the number of servers removed since the previous quarter. We dedicate significant resources to proactively detecting and removing violent extremism. During this quarter, 40.5% of servers removed for Violent Extremism were removed proactively.

Information Requests

When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users.

Law enforcement must provide valid legal documentation for the requested information. We review each information request to ensure legal compliance. 

Discord may also disclose user data to law enforcement in emergency situations when we possess a good faith belief that there is an imminent risk of serious physical harm. You can read more about how Discord works with law enforcement here.

Legal Requests

Discord received 1,446 pieces of legal process during the first quarter of 2022. We complied with the majority of these requests, ultimately finding 1,296 both legally valid and specific enough for us to identify an account and produce the relevant information. We work to limit disclosures of user information and content so they match the specific circumstances dictated by each request. 

A text chart showing the breakdown of the 1,446 pieces of legal process received, including search warrants, subpoenas, and preservation requests.
An accessible text version of the requests from law enforcement chart is available on GitHub.

Emergency Requests

Discord received 182 emergency disclosure requests from law enforcement during this period. These requests originated from law enforcement agencies around the world. We disclose user data to law enforcement absent legal process only when there is imminent risk of serious physical injury or death. We were able to identify 145 accounts based on the information provided by law enforcement for these requests and disclosed basic subscriber information in response to 82 emergency disclosure requests. 

Intellectual Property Removal Requests

Our Community Guidelines and Terms of Service prohibit the sharing of content that infringes third-party intellectual property rights. In accordance with the Digital Millennium Copyright Act (DMCA), Discord will remove content and/or accounts in connection with claims of copyright infringement on Discord.

We review each DMCA notice to ensure that reports are valid and complete and that they are made in good faith.

Discord received 816 facially valid DMCA takedown notifications, of which 813 provided information sufficient for content removal upon review. 

Our Commitment to Safety and Transparency

Imagine a place where you can talk, hang out with your friends, and find belonging in a welcoming and safe environment: it doesn’t just happen. It takes immense effort, collaboration, and dedication from all of us at Discord. We strive to build the best service for people around the world. 

We’re proud of the work that we do to help keep people and communities on Discord safe. We hope that as a result of this work, you’re able to find belonging in the communities on Discord that you call home. 

We hope that these Transparency Reports, alongside our Policy & Safety Blog, will continue to educate and inspire others toward building a safer digital world. We’re committed to continuously providing more useful insight into this effort and welcome your feedback on how to improve these reports.
