July 11, 2023

Discord’s Commitment to Teen and Child Safety

Today’s online communications platforms have the potential to be a force for good in people’s lives. At Discord, our mission is to give users the opportunity to talk online, foster connections, and find community. When it comes to our teen users, we’re committed to providing a safe environment where teens can build friendships, express themselves, learn, and explore. Safety is a top priority and a primary area of investment for us as a business.

We built Discord to be different and work relentlessly to make it a fun and safe space for teens.

Child safety is a societal challenge and responsibility, and we’re committed to continuing our work with industry partners and law enforcement to combat harmful content online and strengthen industry moderation practices and safety initiatives.

Discord continually evolves our efforts to better support teens and families on our platform. From new parental tools and updated child safety policies to new partnerships and resources, these updates are the result of a multi-year effort by Discord to invest more deeply and holistically in teen safety. We take our responsibility to safeguard the experience of our youngest users seriously, and we want to do all we can to empower teens and families with additional resources and opportunities to customize a Discord experience that’s tailored for them. From combating child sexual exploitation to supporting teens struggling with mental health, we want Discord to be a place for connection and community.

We invest talent and resources in safety efforts. From Safety and Policy to Engineering, Data, and Product, about 15% of all Discord employees are dedicated to working on safety. Discord has teams dedicated to and specializing in child safety, along with a new Mental Health Policy Manager. Creating a safer internet is at the heart of our collective mission.

We continue to innovate how we scale safety mechanisms, with a focus on proactive detection. Millions of people around the world use Discord every day; the vast majority are engaged in positive ways, but we take action on multiple fronts to address bad behavior and harmful content.

Our approach to safety is guided by the following principles:

  • Design for Safety: We make our products safe spaces by design and by default. Safety is and will remain part of Discord’s core experience.
  • Prioritize the Highest Harms: We prioritize issues that present the highest harm to our users and our platform. This includes harm to our users and society (e.g. sexual exploitation, violence, sharing of illegal content) and platform integrity harm (e.g. spam, account takeover, malware). 
  • Design for Privacy: We carefully balance user privacy and safety on the platform. We believe that users should be able to tailor their Discord experience to their preferences, including privacy.
  • Embrace Transparency & Knowledge Sharing: We continue to educate users, join coalitions, build relationships with experts, and publish our safety learnings, including our Transparency Reports.

Teens deserve to have a safe, welcoming space to explore their interests, connect with others, and find a place to belong.

We’re committed to providing a safe, welcoming environment to teens. To do so, we take a multi-faceted and holistic approach to teen safety. This includes:

  • Safety tools and resources that give users, parents, and educators control to customize their Discord experience for themselves or their teens.
  • An emphasis on education and learning from mistakes over punishment, when appropriate, through our Warning System.
  • Taking into consideration a user’s behavior off-platform when reviewing content under the Teen and Child Safety Policies.
  • Regularly updating our teen and child safety policies.
  • Partnerships with industry-leading organizations like the Family Online Safety Institute, Technology Coalition, National PTA, Digital Wellness Lab, INHOPE, and more. 
  • A trusted reporter process that allows trusted partners to surface violating content and reports directly to our Safety team.
  • Proactive detection and actioning of high-harm content.

Child and Teen Policies

As a part of our regular process to review and update our policies, we’ve partnered with leaders in the teen and child safety space to ensure our policies are crafted with the latest research, best practices, and expertise in mind. We took a holistic, nuanced, and trauma-informed approach to these updates in order to better support and educate our teen users while continuing to make Discord a hostile environment for bad actors. 

While we’ve highlighted the most recent updates below, more detailed information about our policies can be found here.

Our Policies

You can read an overview of the following policies here.

Teen Self-Endangerment Policy

Teen self-endangerment is a nuanced issue that we do not take lightly. We want our teen users to be able to express themselves freely on Discord, while we also take steps to ensure they don’t engage in risky behaviors that might endanger their safety and well-being.

In order to help our teenage users stay safe, our policies state that users under the age of 18 are not allowed to send or access any sexually explicit content. Even when this kind of content is shared consensually between teens, there is a risk that self-generated sexual media can be saved and shared outside of their private conversations. We want to help our users avoid finding themselves in these situations.

In this context, we also believe that online dating can result in self-endangerment. Under this policy, teen dating servers are prohibited on Discord, and we will take action against users who engage in this behavior. Additionally, older teens engaging in the grooming of a younger teen will be reviewed and actioned under our Inappropriate Sexual Conduct with Children and Grooming Policy.

Through our work and partnership with a prominent child safety organization, we determined that, when possible, we will warn teens who have engaged in sexually explicit behavior before moving to a full ban. One example is teens sharing explicit content with each other that is not their own.

Child Sexual Abuse Material (CSAM) and Child Sexualization Policy

Discord has a zero-tolerance policy for child sexual abuse, which has no place on our platform or anywhere in society.

We do not allow CSAM on Discord, including AI-generated photorealistic CSAM. When such imagery is found, it is reported to the National Center for Missing & Exploited Children (NCMEC). This ensures that the sexualization of children in any context is not normalized by bad actors.

Inappropriate Sexual Conduct with Children and Grooming Policy

Discord has a zero-tolerance policy for inappropriate sexual conduct with children and grooming. Grooming is inappropriate sexual contact between adults and teens on the platform, with special attention given to predatory behaviors such as online enticement and the sexual extortion of children, commonly referred to as “sextortion.” When we become aware of these types of incidents, we take appropriate action, including banning the accounts of offending adult users and reporting them to NCMEC, which subsequently works with local law enforcement. 

How We Work to Keep Our Younger Users Safe

Our Safety team works hard to find and remove abhorrent, harmful content and takes action, including banning the users responsible and engaging with the proper authorities.

Discord’s Safety by Design practice includes a risk assessment process during the product development cycle that helps identify and mitigate potential risks to user safety. We recognize that teens have unique vulnerabilities in online settings, and this process allows us to better safeguard their experience. Through this process, we think carefully about how product features might disproportionately impact teens and consider whether a product facilitates more teen-to-adult interactions or other unintended harm. During this process, our teams identify ways to mitigate safety risks with internal safety technology solutions and with insight and recommendations from external partners.

Discord uses a mix of proactive and reactive tools to remove content that violates our policies, from advanced technology like machine learning models and PhotoDNA image hashing, to partnering with community moderators to uphold our policies, to in-platform reporting mechanisms that surface violations.

Our Terms of Service require people to be over a minimum age (13, unless local legislation mandates an older age) to access our website. We use an age gate that asks users to provide their date of birth when creating an account. If a user is reported as being under 13, we delete their account; the user can appeal to restore it only by providing an official ID document verifying their age. Additionally, we require that all age-restricted content be placed in a channel clearly labeled as age-restricted, which triggers an age gate preventing users under 18 from accessing that content.
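
To illustrate the mechanics, below is a minimal sketch of what age-gate checks like these can look like. It is a hypothetical illustration only: the function names, the country-override table, and the constants are assumptions made for the example, not Discord’s actual implementation.

```python
from datetime import date

# Hypothetical sketch of account- and content-level age gates; not Discord's
# actual implementation. 13 is the global minimum unless local legislation
# mandates an older age.
DEFAULT_MINIMUM_AGE = 13
LOCAL_MINIMUM_AGE = {"XX": 16}  # hypothetical per-country override table

def age_on(today: date, date_of_birth: date) -> int:
    """Whole years elapsed between date_of_birth and today."""
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    return today.year - date_of_birth.year - (0 if had_birthday else 1)

def may_register(date_of_birth: date, country_code: str) -> bool:
    """Account-level gate applied at registration."""
    minimum = LOCAL_MINIMUM_AGE.get(country_code, DEFAULT_MINIMUM_AGE)
    return age_on(date.today(), date_of_birth) >= minimum

def may_view_age_restricted(date_of_birth: date) -> bool:
    """Channels labeled age-restricted trigger a separate 18+ gate."""
    return age_on(date.today(), date_of_birth) >= 18
```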

Our newest safety product launches focus on safeguarding the teen experience: safety alerts on senders educate teens on ways they can control their experience and limit unwanted contact, and sensitive content filters give teens additional controls to blur or block unwanted content. Rather than prescribing solutions for teens, we aim to empower them with respect for their agency, privacy, and decision-making power.

Additionally, we are expanding our own Trusted Reporter Network in collaboration with INHOPE for direct communication with expert third parties, including researchers, industry peers, and journalists for intelligence sharing.

Our Technology Solutions

Machine learning has proven to be an essential component of safety solutions at Discord. Since our acquisition of Sentropy, a leader in AI-powered moderation systems, we have continued to balance technology with the judgment and contextual assessment of highly trained employees, along with continuing to maintain our strong stance on user privacy.

Below is an overview of some of our key investments in technology:

Teen Safety Assist: Teen Safety Assist is focused on further protecting teens by introducing a new series of features, including multiple safety alerts and sensitive content filters that will be default-enabled for teen users.

Safety Rules Engine: The Safety Rules Engine allows our teams to evaluate user activity on the platform, such as registrations and server joins, along with associated metadata. We then analyze patterns of problematic behavior to make informed decisions and take uniform actions like user challenges or bans.
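
As an illustration of the pattern, a rules engine of this kind can be thought of as a set of declarative rules, each pairing a predicate over a platform event with an enforcement action. The sketch below is hypothetical: the event fields, thresholds, and action names are assumptions made for the example, not Discord’s real schema.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    kind: str                      # e.g. "registration" or "server_join"
    user_id: int
    metadata: dict = field(default_factory=dict)

@dataclass
class Rule:
    name: str
    matches: Callable[[Event], bool]
    action: str                    # e.g. "challenge" or "ban"

# Illustrative rules; real thresholds would be tuned from observed abuse patterns.
RULES = [
    Rule(
        name="rapid-server-joins",
        matches=lambda e: e.kind == "server_join"
        and e.metadata.get("joins_last_hour", 0) > 50,
        action="challenge",
    ),
    Rule(
        name="disposable-email-registration",
        matches=lambda e: e.kind == "registration"
        and e.metadata.get("email_domain") in {"disposable.example"},
        action="challenge",
    ),
]

def evaluate(event: Event) -> list[str]:
    """Return the uniform actions triggered by a single event."""
    return [rule.action for rule in RULES if rule.matches(event)]
```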

AutoMod: AutoMod is a powerful tool that lets community moderators automatically filter unwanted messages containing certain keywords, block profiles that may be impersonating a server admin or otherwise appear harmful to their server, and detect harmful messages using machine learning. This technology empowers community moderators to keep their communities safe without spending long hours manually reviewing and removing unwanted content on their servers.
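
The keyword-filtering portion of a tool like AutoMod can be approximated with a simple word-boundary match, as in the hypothetical sketch below. It is deliberately simplified: the shipped feature also supports things like wildcards, allow lists, and machine-learning detection.

```python
import re

def compile_filter(keywords: list[str]) -> re.Pattern:
    """Build one case-insensitive pattern matching any keyword on word boundaries."""
    escaped = "|".join(re.escape(k) for k in keywords)
    return re.compile(rf"\b(?:{escaped})\b", re.IGNORECASE)

def should_block(message: str, pattern: re.Pattern) -> bool:
    """True when the message contains a moderator-blocked keyword."""
    return pattern.search(message) is not None

# Example: a hypothetical moderator-configured keyword list for one server.
blocked = compile_filter(["free nitro", "buy followers"])
assert should_block("Click here for FREE NITRO!!", blocked)
assert not should_block("Enjoying the new server!", blocked)
```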

Visual safety platform: Discord has continued to invest in improving image classifiers for content such as sexual, gory, and violent media. These machine learning models are being improved to detect such content with higher precision and lower error rates, decreasing teen exposure to potentially inappropriate and mature content. We proactively scan images uploaded to our platform using PhotoDNA to detect child sexual abuse material (CSAM), and we report CSAM content and its perpetrators to NCMEC, which subsequently works with local law enforcement to take appropriate action.
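
Hash-based detection of known CSAM works by comparing a fingerprint of each uploaded image against a database of fingerprints of previously identified material. PhotoDNA itself is a licensed Microsoft technology whose internals are not public, so the sketch below substitutes a placeholder hashing function and a mocked escalation step purely to show the shape of such a pipeline.

```python
# Illustrative pipeline shape only. `perceptual_hash` stands in for a real
# perceptual-hashing algorithm, and the hash set and reporting hook are mocked.
KNOWN_HASHES: set[bytes] = set()  # fingerprints of known material

def perceptual_hash(image_bytes: bytes) -> bytes:
    raise NotImplementedError("stand-in for a licensed perceptual hash")

def report_and_escalate(uploader_id: int, digest: bytes) -> None:
    """Hypothetical hook: file a report to NCMEC and action the account."""
    ...

def scan_upload(image_bytes: bytes, uploader_id: int) -> bool:
    """Return True and escalate when an upload matches known material."""
    digest = perceptual_hash(image_bytes)
    if digest in KNOWN_HASHES:
        report_and_escalate(uploader_id, digest)
        return True
    return False
```

In practice, perceptual hashes are typically compared within a similarity threshold rather than by exact set membership, and the fingerprint database is maintained in cooperation with organizations such as NCMEC.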

Investing in technological advancements and tools to proactively detect CSAM and grooming is a key priority for us, and we have a dedicated team to handle related content. In Q2 2023, we proactively removed 99% of servers found to be hosting CSAM. You can find more information in Discord’s latest Transparency Report.

There is constant innovation taking place within and beyond Discord to improve how companies can effectively scale and deliver content moderation. Our approach will continue to evolve as time goes on, as we’re constantly finding new ways to do better for our users.

Partnerships

We know that collaboration is important, and we’re continuously working with experts and partners so that we have a holistic and informed approach to combating the sexual exploitation of children. We’re grateful to collaborate with the Tech Coalition and NoFiltr to help young people stay safer on Discord. 

With our new series of products under the Teen Safety Assist initiative, Discord has launched a safety alert that flags when a teen may want to double-check before replying to a new direct message. We partnered with the technology non-profit Thorn to design these features based on their guidance on teen online safety behaviors and what works best to protect teens. Our partnership is focused on empowering teens to take control of their safety and on how best to guide them to helpful tips and resources.

Additional Resources

Want to learn more about Discord’s safety work? Check out these resources below:

We look forward to continuing this important work and deepening our partnerships to ensure we continue to have a holistic and nuanced approach to teen and child safety.
