Discord

2020 was a year of rapid growth for Discord. We felt honored to welcome many new people to our platform during the COVID-19 pandemic. Discord Trust & Safety worked hard in this time to keep pace with the growing user base. The team was able to respond meaningfully to an increased number of reports while maintaining proactive efforts to keep the highest-harm actors off the platform. We also sought to advance online safety in new ways by forming closer industry partnerships and by investing in community resources like the Discord Moderator Academy.

Today we’re pleased to bookend our reporting on 2020 with information on the second half of the year — between July 1 and December 31, 2020.

This Transparency Report iterates on the work we have done previously and expands the scope of the information we’re making public. For the first time, we are detailing the number of servers warned for violations of our platform rules, the number of child-safety issues flagged to the National Center for Missing and Exploited Children (NCMEC), the number of requests we received from law enforcement, and the rate at which we complied with legal process.

Tracking the rise of user reports

In December 2020 we announced that 140 million monthly active users were finding their place on Discord — up from 100 million in June 2020. Alongside the expanding user base, we saw a progressive month-over-month increase in user reports of violations of our Community Guidelines. This growth was not driven by a significant change in any one category but rather reflects an overall upward trend in reports across all categories.


A text version of the User Reports Received chart is available here.

Trust & Safety received a total of 355,633 user reports in the six months between July and December 2020. This represents an increase of slightly over 50% from the first half of the year. The team was receiving slightly under thirty thousand reports per month before the COVID-19 pandemic. Monthly totals at the end of 2020 ballooned to well over sixty thousand — nearly forty thousand more per month than at the start of the year.

Reports received by category

A text version of the User Reports by Category chart is available here.

The pie chart above represents the number of inbound user reports Trust & Safety received in the second half of 2020. The totals in this chart do not represent confirmed violations of Discord’s Community Guidelines but rather the raw number of reports received in each category.

We’ve maintained the groupings from our past Transparency Reports and broken out three new categories: Child Sexual Abuse Material (CSAM), Extremist or Violent Content, and Self-Harm Concerns. These three, previously grouped elsewhere, make up comparatively small portions of the overall report volume, but they represent some of our “highest priority” categories and therefore merit separate tracking and reporting.

CSAM in particular now stands separate from Exploitative Content. The CSAM category covers child-harm content or behavior, while Exploitative Content groups together other sexually exploitative or otherwise inappropriate content. It should be noted that Exploitative Content is itself distinct from general not-safe-for-work (NSFW) imagery.

General Harassment continued to be the most frequently reported issue on the platform. Cybercrime contributed the second-highest number of reports in the period — with its total reports increasing nearly 250% from the first half of 2020. CSAM and Extremist or Violent Content, which tend to be some of the highest-harm issue types, were reported much less frequently, each contributing slightly more than one percent of total report volume.

Action rates of user reports

Trust & Safety uses the designation “actioned” when they have confirmed a violation of our Community Guidelines and taken action in response. This can involve an account or server warning or deletion, a temporary account ban, removal of content from the platform, or some other action that may not be immediately visible to the person submitting the report.

A text version of the Action Rates of User Reports chart is available here.

The table above shows that the most harmful issue types generally had higher action rates from Trust & Safety. This may be partly explained by the team’s prioritization in 2020 of the issues most likely to cause damage in the real world. Trust & Safety continued to ensure that reports in high-risk categories were reviewed and resolved in under two hours.

Actions taken by Trust & Safety

Account deletions

Our approach to content moderation involves taking action proportionate to the level of harm. Account deletion is generally the most serious action that Trust & Safety can take at the user level. Users may be removed from Discord either for individual violations of our Community Guidelines or because they are part of high-harm groups that break our platform rules. Deletions are permanent unless the user appeals with additional details about the circumstances of their ban.


A text version of the Accounts Deleted chart is available here.

Spam continued to contribute the highest number of raw deletions of any category. Trust & Safety and our anti-spam tools removed a combined total of 3,264,655 accounts for spammy behavior in this period. Much of our anti-spam effort in the second half of 2020 went not only into account deletions but also into proactively blocking or stopping spammers at registration. The Spam total has been excluded from the graph above because it is so much higher than the other deletion totals.

We removed 266,075 non-Spam accounts in the second half of 2020 for violations of our Community Guidelines. Exploitative Content continued to account for the second-highest number of bans after Spam. CSAM and Extremist or Violent Content, while accounting for comparatively small portions of overall report volume, each contributed high numbers of account deletions — a result of our efforts to proactively remove this content from the platform.

Server deletions

Trust & Safety may remove a server from the platform if it is dedicated to behavior or content that violates our Community Guidelines. The most harmful of these servers are frequently deleted along with all of their members. We may additionally remove a server for repeated minor violations of our Community Guidelines or if its moderators fail to comply after receiving server-level warnings.


A text version of the Servers Deleted chart is available here.

Trust & Safety removed 27,410 servers between July and December 2020 for violations of our rules. Deletions for Cybercrime and Exploitative Content were most frequent in this period.

Server deletions for Cybercrime in particular jumped nearly 140% from the first half of 2020. This tracks with the marked increase in reports — more Cybercrime spaces than ever were flagged to Trust & Safety, and more were ultimately removed from our site. Exploitative Content deletions were slightly lower than in the first half of 2020, but only because we previously reported CSAM as part of that category. Looking at the two categories together shows that the total number of deleted servers in the second half of 2020 is in line with the first half of the year.

Deletions in almost all the other categories rose in the second half of 2020. The only exception came in Hacks and Cheats. Both the number of reports and deletions for this category dropped significantly from the first half of the year. Hacks and Cheats did, however, remain one of the more common reasons for server removal in the period between July and December 2020.

User and server warnings

A text version of the User and Server Warnings chart is available here.

We continued to lean into using warnings as a way to course-correct users and servers responsible for less severe violations of our Community Guidelines. The decision to employ a warning over permanent deletion is generally only made if little harm results from the behavior and if the context suggests a violating party has the potential to reform. While some users and servers continue to violate Discord’s rules after receiving a warning — and are later deleted — the majority are able to continue on our platform without the need for further intervention.

Trust & Safety issued 28,170 user warnings in the second half of 2020 as well as 3,999 server-level warnings. We do not issue warnings for CSAM or Cybercrime at either the user or server level — and we also do not warn servers for Hacks and Cheats or Self-Harm Concerns. Violations in these categories are generally severe enough to warrant immediate deletion.

Reports to NCMEC

Discord does not allow content that sexually exploits children. Users found to be uploading such content are reported to the National Center for Missing and Exploited Children (NCMEC) and removed from the platform. All child-harm material is subsequently purged from Discord. We also frequently investigate and report higher-harm cases of grooming or endangerment.

A text version of the NCMEC Reports chart is available here.

We reported 6,948 accounts to NCMEC during the second half of 2020. This figure represents distinct accounts reported and not the total number of reports submitted.

The majority of our reports were for child sexual abuse images flagged by PhotoDNA hashing software. Trust & Safety reviews all flagged images and reports confirmed instances of child-harm content. These totals are represented as “Media reports” in the graph above.

Another 83 of our reports were for higher-harm grooming or endangerment situations. Trust & Safety thoroughly investigates all cases that could result in immediate harm to a minor. We partner closely with NCMEC to ensure that time-sensitive escalations receive prompt attention.

Proactive server deletions

A text version of the Proactive-to-Reactive Server Deletions chart is available here.

We continued to invest in resources in the second half of 2020 to proactively detect and remove the highest-harm groups from our platform. Most of Trust & Safety’s proactive efforts went toward removing servers sharing Exploitative Content and servers dedicated to Violent Extremism.

Exploitative Content

Included in the broad category of Exploitative Content are two subcategories: Nonconsensual Pornography (NCP) and Sexualized Content Related to Minors (SCRM). SCRM is distinct from CSAM here in that these spaces may sexualize minors in less explicit but still violating forms.

We proactively deleted 1,919 Exploitative Content servers in the second half of 2020. While we were able to delete slightly more NCP servers in the second half of the year than in the first, the number of proactive SCRM deletions decreased — for the first time ever sinking below the number of reactive server deletions. It should be noted, however, that overall SCRM deletions remained higher than the number of deletions for either NCP or Violent Extremism.

We were disappointed to discover that during this period one of our tools for proactively detecting SCRM servers contained an error, resulting in fewer overall flags to our team. The error has since been resolved — and we’ve resumed removing the servers the tool surfaces.

Violent Extremism

We also worked in the second half of 2020 to take action against militarized movements like the “Boogaloo Boys” and dangerous conspiratorial groups like QAnon. We continue to believe there is no place on Discord for groups organizing around hate, violence, or extremist ideologies.

Trust & Safety proactively removed 1,504 servers for Violent Extremism in the second half of 2020 — nearly a 93% increase from the first half of the year. This increase can be attributed to the expansion of our anti-extremism efforts as well as growing trends in the online extremism space. One of the online trends observed in this period was the growth of QAnon. We adjusted our efforts to address the movement — ultimately removing 334 QAnon-related servers.

The ban appeals process

Users who have been removed from Discord are given the option to write in to Trust & Safety and appeal the action that was taken on their account. We review user appeals and may reverse a decision if new details about the ban are presented or if we discover a mistake was made in our work. Trust & Safety will not engage with appeals that present safety concerns.

A text version of the Percent of Users Unbanned chart is available here. It includes the total number of appeals in each category.

We received 82,166 non-Spam user appeals in the second half of 2020 — up from about 70,000 in the first half of the year. These users were each looking to reverse one of the 266,075 deletions issued by Trust & Safety. We reversed our decision in 3,937 cases — an unban rate of about 1.5% of all non-Spam account deletions. We granted an additional 3,061 appeals for users deleted for Spam.

A closer look at three ban appeals

Provided below are three sample cases to better illustrate how Trust & Safety approaches the ban appeals process. Identifying information has been removed to respect user privacy.

Case study 1: Targeted harassment account deletion

Trust & Safety deleted a user’s account as part of a larger takedown of servers involved in targeted harassment and incitement of violence. The user publicly commented about the ban on other social media platforms and was advised to write in to our support queue.

Result: Our reason for removing a user or server from Discord may not always be immediately clear to the public. Trust & Safety closely reviewed this user’s ban appeal and confirmed their violation of our Community Guidelines. We decided not to reinstate the account.

Case study 2: Extremism account deletion

Trust & Safety deleted a server with all members for promoting extremist ideologies. A user impacted by the ban wrote in explaining why they were in the space. They claimed they were studying online radicalization and clarified that they had not promoted extremist ideologies.

Result: We investigated and confirmed that the user did not present a risk to Discord. Trust & Safety reversed the ban but cautioned the user against joining such spaces in the future. We do not allow extremism — and we advise that all users report potentially dangerous spaces.

Case study 3: Temporary account ban

A user wrote in to Trust & Safety after receiving a warning and a one-day temporary ban for making threats. They recognized the reason for the warning and pledged to be more respectful.

Result: We do not generally lift temporary bans early. Trust & Safety felt that the one-day ban was proportionate to the level of harm, and they informed the user that they would regain access to the account the following day. The user was then able to return to Discord.

Requests from law enforcement

Discord has a legal obligation to respond to valid United States law enforcement requests seeking information on users who may be the subject of criminal investigation. All pieces of legal process undergo strict scrutiny by our Legal Operations team to make sure they are legally sufficient. Discord may challenge requests that are unclear, overbroad, or not legally valid.

A text version of the Requests from Law Enforcement chart is available here.

We received 3,662 total pieces of legal process over the course of 2020. We did not present law enforcement numbers in our previous report, so we are providing totals from both halves of 2020 here. These numbers include the requests we received through the Mutual Legal Assistance Treaty (MLAT) process. Discord currently requires that all legal requests from non-US government agencies be issued through proper MLAT procedures.

Of these, 3,401 pieces of legal process were found to be valid — meaning they were issued and signed by a legal authority and provided enough information for us to identify a user.

Emergency disclosure requests

Discord may disclose user information to law enforcement without legal process in cases of urgent threat to life. Law enforcement can follow these steps if they believe there is an emergency with one of our users that could result in death or serious physical injury. We do not disclose user data to law enforcement absent legal process unless we have investigated and confirmed a good-faith belief that physical injury will result without our action.

We received 325 requests for emergency disclosure from global law enforcement in the six months between July and December 2020. We made disclosures in 122 of those cases. In the remaining cases, we were either unable to identify the user or did not disclose information.

Strength through partnerships

We’ve benefited enormously in the past six months from forming closer partnerships with industry leaders, government agencies, civil society groups, and academics. We joined the Trust & Safety Professional Association in an effort to expand professional development for Discord’s Trust & Safety team; and, alongside Google, Facebook, Microsoft, Pinterest, and Reddit, were a founding member of the Digital Trust & Safety Partnership.

We partnered with the Technology Coalition, the Family Online Safety Institute, and ConnectSafely to reinforce our commitment to preventing child abuse online. We also joined Tech Against Terrorism and the Global Internet Forum to Counter Terrorism to expand our ability to identify and remove violent extremist groups from our platform. We’re looking forward to deepening these partnerships in 2021 and entering into others to help advance online safety.

We view these biannual reports as valuable opportunities to share information on company operations that directly impact our users. It is our hope that the increasingly clear, granular data presented here will help paint a fuller picture of the work we are doing to keep Discord safe.

Trust & Safety is working hard in 2021 to make sure that all user concerns receive swift and appropriate responses. The team is also continuing to invest in methods to proactively detect abuse — so our users are shielded from harmful content. We’re additionally hoping to empower moderators by investing in resources like the Discord Moderator Academy. We’re excited to rise to the challenge of tackling content moderation at scale.
