Discord is the place to talk online, whether that’s one-on-one, in small groups, or in larger communities organized around shared interests. We know that talking with a family member or close friend in a DM is different from a group chat, and that a group chat is very different from participating in a community with millions of users. Discord enables each of these different forms of communication across the more than 9 million active servers on our platform, but we don’t take a one-size-fits-all approach to privacy and content moderation across all of them.
There have been a lot of questions and discussions recently about what it means to feel safe online and how people should think about privacy on communications platforms like Discord. We want Discord to be a place where everyone can feel safe, find belonging, and hang out with the people who matter most to them.
Today we’re making some changes that clarify expectations and policies for different spaces on Discord. People use Discord in different ways for different experiences, and by explaining how those spaces are defined and what users can expect in each, we’re empowering users to better control their experience.
That’s why today’s product update distinguishes among different spaces and provides enhanced server badges to make those spaces easier to identify, manage, and navigate. Here’s more detail on how we’re taking a tailored approach to safety and privacy on Discord:
Friend Spaces
Friend Servers (as well as direct messages and group direct messages) are invite-only spaces designed for talking and hanging out with your friends or smaller groups. In these spaces, Discord:
- Responds to and investigates user reports about violations of our universal Community Guidelines.
- Scans for known malware and spam.
- Scans all files for Child Sexual Abuse Material and immediately reports any content and perpetrators to the National Center for Missing & Exploited Children (NCMEC) in the United States, which works with local law enforcement to take appropriate action. (A simplified sketch of this kind of known-content scanning follows this list.)
- Provides users and administrators with tools, including optional filters, to help them avoid unwanted content and undesirable behavior.
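To make the idea of known-content scanning concrete, here is a minimal sketch of matching uploads against a database of known-bad file hashes. This is an illustration, not Discord’s actual pipeline: the `KNOWN_BAD_SHA256` set and `scan_upload` function are hypothetical, and real systems rely on perceptual hashing and vendor-maintained hash lists rather than exact cryptographic digests.

```python
import hashlib

# Hypothetical set of known-bad SHA-256 digests. A production system would
# match against vendor-maintained hash lists and would use perceptual hashes,
# which survive re-encoding and minor edits, rather than exact digests.
KNOWN_BAD_SHA256 = {
    # SHA-256 of the empty byte string, seeded purely for demonstration.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if the file matches a known-bad hash and should be
    blocked and reported."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_SHA256

print(scan_upload(b""))       # True: matches the seeded demo hash
print(scan_upload(b"hello"))  # False: passes the known-hash check
```

Exact digests are shown here only because they are easy to demonstrate; in practice, perceptual hashing is needed so that re-encoded or slightly altered copies of known material still match.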
Community Servers
Community Servers are invite-only spaces designed for people to talk, connect, and learn from others who share a common interest. A server’s administrators must opt in for their server to become a Community Server. In Community Servers with more than 200 members, we take a more proactive and automated approach to safety, in line with our Terms of Service and Privacy Policy. In these spaces, Discord follows the same core procedures as for Friend Servers and also:
- May use automated means to detect violations of our policies. For example, we may implement tools to help admins proactively detect and prevent coordinated spamming (raids). Since 2022, our AutoMod feature has used server-defined rules to automatically block more than 45 million unwanted messages before they even had a chance to be posted. (A minimal sketch of this kind of rule-based filtering follows this list.)
- Supports admins by giving them the ability to develop and enforce custom server rules in addition to Discord's Community Guidelines.
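As a rough illustration of what admin-defined, rule-based filtering can look like, here is a minimal sketch of checking messages against keyword rules before they are posted. The `ServerRule` structure and `should_block` function are hypothetical and not AutoMod’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ServerRule:
    """A hypothetical admin-configured keyword rule."""
    name: str
    blocked_keywords: set[str] = field(default_factory=set)

def should_block(message: str, rules: list[ServerRule]) -> str | None:
    """Return the name of the first rule the message violates, or None.

    A real moderation system would also normalize obfuscated text
    (leetspeak, zero-width characters) and support allow lists and
    per-channel exemptions.
    """
    words = message.lower().split()
    for rule in rules:
        if any(word in rule.blocked_keywords for word in words):
            return rule.name
    return None

rules = [ServerRule("no-spam", {"freenitro", "giveaway"})]
print(should_block("Claim your freenitro now", rules))  # "no-spam": blocked before posting
print(should_block("hello everyone", rules))            # None: message goes through
```

The key design point is that the rules belong to the server: admins define what counts as unwanted content for their community, and the platform enforces those rules before a message ever appears.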
Discoverable Servers
Discoverable Servers are a subset of Community Servers that have met specific additional requirements to appear in Server Discovery, making them more visible to potential members. Anyone on Discord can join these servers. In addition to the Community Server policies listed above, in these spaces Discord:
- May use posted content to help develop and improve features, including Highlights Notifications. Any data we collect is handled in line with our Terms of Service and Privacy Policy; those policies are unchanged by today’s announcement.
- May proactively identify harmful content on the platform and enforce our Terms of Service and Community Guidelines.
As we implement these changes, we are embracing four guiding principles:
- Users First. Discord is designed to empower users to control their own experience, including whom they communicate with, which communities they join, and what topics and content they engage with. Similarly, our privacy and content controls are designed to match user expectations: the highest privacy in smaller friend spaces, and more content moderation by Discord and community administrators in larger spaces that are open to a wider audience.
- Transparency. We have an obligation to make clear to users at all times what space they’re in. We are deploying badges to indicate Community, Partner, and Verified Servers. This extends our existing commitment to transparency, reflected in the detailed policies and guidelines we publish for use of our service, our quarterly Transparency Reports with data and analysis on our content moderation actions, and blogs like this one.
- Collaboration. Privacy and safety present nuanced issues, and we know we can’t address them alone. That’s why we’re grateful to work with organizations including NCMEC, the Family Online Safety Institute, the Tech Coalition, Crisis Text Line, the Digital Trust & Safety Partnership, the Trust & Safety Professionals Association, ConnectSafely, Access Now, and the Global Internet Forum to Counter Terrorism. And we’re always listening to and learning from our users.
- Humility. Discord is a work in progress, and we will continue to evolve and grow. As we do, we will keep listening to you and refining our products and policies.
We are committed to developing and implementing solutions that prioritize both privacy and safety. These changes advance that goal by enabling individual users to tailor their experience and enabling server admins to better engage with and safely grow their communities.
For more information about these changes, please see our Help Center article.