Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or in society.
We have a long-established team focused solely on child safety and a dedicated safety engineering team. Discord is committed to continually exploring new and improved safeguards that help keep younger users safe on our platform and online.
We collaborate with members of the industry, policymakers, law enforcement, and others to extend our efforts beyond Discord. We invest in resources and education to help parents and guardians understand how our platform works and how to create a safe and positive experience for their teens.
Discord actively supports the National Center for Missing and Exploited Children (NCMEC) and reports users who upload child abuse material or engage in other activity that seeks to exploit minors to NCMEC. As a member of the Technology Coalition, we collaborate with industry partners to drive critical advances in technology and adoption of best practices for keeping teen users safe online. Within the Technology Coalition, Discord helped pilot the Lantern initiative, a first-of-its-kind signal-sharing program for companies to enhance and strengthen how we detect attempts to sexually exploit and abuse children and teens online.
We also partner with Thorn and The Digital Wellness Lab to integrate their expertise and research on teen health and social media into our products and policies, including the development of our Teen Charter.
Discord is also a proud sponsor of the National Parent Teacher Association and ConnectSafely and is an active member of the Family Online Safety Institute.
We partner with INHOPE, the global network combating online child sexual abuse material, and are a member of the Internet Watch Foundation, whose resources we use to identify and prevent child sexual abuse imagery. We also sponsor events dedicated to increasing awareness of, and action on, child safety issues, such as the annual Dallas Crimes Against Children Conference.
At Discord, part of our holistic approach to safety also includes products and features for parents and teens. Our Family Center empowers parents and guardians to stay informed about their teen’s Discord activity, and our Teen Safety Assist features, which are enabled by default for teens, include safety alerts and sensitive content filters. We also publish our Guardians Guide to help teens stay safer on Discord.
You can read our Child Safety policies, which were developed with the latest research, best practices, and expert input, here.
Discord took action on 346,482 distinct accounts for Child Safety during this period. This included disabling 178,165 accounts and removing 7,462 servers. We reported 101,585 accounts to NCMEC as a result of CSAM that was identified by our hashing systems, reactive reporting, and additional proactive investigations.
Deceptive Practices
Using Discord for the purpose of financial scams, malicious conduct, or fraud services is a violation of our Community Guidelines. You can read more about our Deceptive Practices policy here.
Discord took action on 33,482 distinct accounts for Deceptive Practices during this period, including disabling 24,322 accounts and removing 1,207 servers.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share, distribute, or create sexually explicit content depicting other people without their consent. You can read more about our Exploitative and Unsolicited Content policies here and here.
Discord took action on 2,091,583 distinct accounts for Exploitative and Unsolicited Content during this period, including disabling 48,571 accounts and removing 1,835 servers.
Harassment and Bullying
Harassment and bullying have no place on Discord. Promoting, coordinating, or engaging in harassment, threatening to harm another individual or group of people, or sharing or threatening to share someone’s personally identifiable information (also known as doxxing) are violations of our Community Guidelines. You can read more about our Harassment and Bullying policy here and our Doxxing policy here.
Discord took action on 92,295 distinct accounts for Harassment and Bullying during this period, including disabling 19,283 accounts and removing 1,210 servers.
Hateful Conduct
Hate and threats of harm targeted at individuals or communities are not tolerated on Discord. Using hate speech or engaging in other hateful conduct are violations of our Community Guidelines. We define hate speech as any form of expression that either attacks other people or promotes hatred or violence against them based on their protected characteristics. You can read more about our Hateful Conduct policy here.
Discord took action on 67,969 distinct accounts for Hateful Conduct during this period, including disabling 5,457 accounts and removing 1,593 servers.
Identity and Authenticity
Using Discord to misrepresent your identity in a deceptive or harmful way, or to evade a permanent Discord-level enforcement action, is a violation of our Community Guidelines. You can read more about our Identity and Authenticity policy here.
Discord took action on 10,891 distinct accounts for Identity and Authenticity during this period, including disabling 10,004 accounts and removing 1,298 servers.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information. You can read more about our Misinformation policy here.
Discord took action on 26 distinct accounts for Misinformation during this period, including disabling 3 accounts and removing 3 servers.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines. You can read more about our Platform Manipulation policy here.
We are focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures. You can learn more about how Discord combats spam here and read more here about Automod, a safety feature that enables server owners to automatically moderate certain abuse, including spam.
Discord took action on 2,098 distinct accounts for non-spam-related Platform Manipulation issues during this period, including disabling 2,050 accounts and removing 732 servers. An additional 35,456,553 accounts were disabled for spam or spam-related offenses.
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines. You can read more about our Regulated or Illegal Activities policies here and here.
Discord took action on 795,569 distinct accounts for Regulated or Illegal Activities during this period, including disabling 26,810 accounts and removing 15,107 servers.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for support. That said, platforms have a critical role to play in ensuring that these spaces do not normalize, promote, or encourage others to engage in acts of self-harm.
We take action on content that seeks to normalize or encourage self-harming behaviors, as well as content that discourages individuals from seeking help for self-harm. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention. You can read more about this policy here.
We are proud to partner with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention in English and Spanish via trained volunteer crisis counselors in the US. If a user reports a message for self-harm on Discord, they will be presented with information on how to connect with a volunteer Crisis Counselor. You can learn more about Crisis Text Line here and about our partnership here. Since the launch of our partnership, over 3,000 Crisis Text Line conversations have been started by texting “DISCORD”.
As part of our efforts to provide resources for all users, we’re excited to partner with ThroughLine Care, a global network of vetted helplines. Non-US Discord users can find helplines in their country by going to discord.findahelpline.com. Learn more by checking out the Family Center.
Discord took action on 18,781 distinct accounts for Self-Harm Concerns during this period, including disabling 5,425 accounts and removing 4,785 servers.
Violent and Graphic Content
Real media depicting gore, excessive violence, or animal cruelty is not allowed on Discord. You can read about our Violent and Graphic Content policy here.
Discord took action on 92,567 distinct accounts for Violent and Graphic Content during this period. This included disabling 20,657 accounts and removing 3,163 servers.
Violent Extremism
Discord has a zero-tolerance policy against violent extremism, and it is against our Community Guidelines to organize, promote, or support violent extremist activities or beliefs. The term violent extremist organization describes groups that promote a political, ideological, or religious agenda and tolerate, advocate for, or use violence to achieve their goals. We also take reported off-platform harmful activities into account when evaluating potential violations of this policy.
We partner with industry-leading organizations, such as the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, the Christchurch Call, and other organizations across the world to combat extremism and understand global trends.
Discord also leads its own Safety Reporting Network for direct communication with expert third parties, including researchers, industry peers, and journalists, for intelligence sharing about potential extremist and hateful content or activity on the platform. We also take appropriate action based on wider platform trends and intelligence we receive.
Discord took action on 17,567 distinct accounts for Violent Extremism during this period, including disabling 16,309 accounts and removing 2,607 servers.