First, we need to ensure that your account credentials and login information are as secure as possible.
Your settings give you control over who can contact you and what they can send you. You can access your privacy and safety settings in the Privacy & Safety section of your User Settings.
Here you can decide whether you want Discord to automatically scan and delete direct messages that contain explicit media content.
The last thing to do in your security settings is determine who can send you a friend request. You can find these settings in the Friend Requests section of your User Settings.
If you don’t want to receive ANY friend requests, you can deselect all three options. However, you can still send out friend requests to other people.
You should only accept friend requests from users that you know and trust — if you aren’t sure, there’s no harm in rejecting the friend request. You can always add them later if it’s a mistake.
As with any online interaction, we recommend following some simple rules while you’re on Discord:
Again, Discord will never ask you for your password either by email or by Discord direct message. If you believe your account has been compromised, submit a report to Trust & Safety here.
We understand that there are times when you might not want to interact with someone. We want everyone to have a positive experience on Discord and have you covered in this case.
If you have blocked a user but they create a new account to try and contact you, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.
Roles are one of the building blocks of managing a Discord server. They give your members a fancy color, but more importantly, each role comes with a set of permissions that control what your members can and cannot do in the server. With roles, you can give members and bots administrative permissions like kicking or banning members, adding or removing channels, and pinging @everyone.
You can find these options in the Roles section of your Server Settings.
Assign permissions with care! Certain permissions allow members to make changes to your server and channels. These permissions are a great moderation tool, but be wary of who you grant this power to. Changes made to your server can’t be undone.
You can learn more about implementing roles and permissions in Role Management 101 and our Setting Up Permissions FAQ article.
Server verification levels allow you to control who can send messages in your server. Setting a high verification level is a great way to protect your server from spammers or raids. You can find this option in the Safety Setup section of your Server Settings.
When enabled, server-wide two-factor authentication (2FA) requires all of your moderators and administrators to have 2FA enabled on their accounts in order to take administrative actions, like deleting messages. You can read more about 2FA here.
By requiring all admin accounts to have 2FA turned on, you protect your server from malicious users who might try to compromise one of your moderators' or administrators' accounts and then make unwanted changes to your server. If you are the server owner, you can enable the 2FA requirement for moderation in the Safety Setup section of your Server Settings.
You must have 2FA enabled on your own account before you can enable this option!
The Explicit Media Content Filter automatically detects and deletes images and uploads deemed explicit. Age-restricted channels are exempt from the Explicit Media Content Filter. Turning this filter on allows your server members to share content like images and embeds, while reducing the risk of explicit media being posted in channels that are not age-restricted.
You can find this option in the Safety Setup section of your Server Settings.
Administrators are the people who create Discord servers around specific interests. They establish the rules for participating, can invite people to join, and oversee the health and well-being of their community. They have broad administrative control, and can bring in moderators to manage community members. They can also ban or remove members and, if necessary, remove and replace moderators.
Administrators also choose moderators to play a vital role in Discord communities. The responsibilities of a moderator might vary, but their overall role is to ensure that their Discord server is a safe, healthy environment for everyone. They can do things like moderate or delete messages, as well as invite, ban, or suspend people who violate the server’s rules. The best moderators are typically seasoned, enthusiastic participants in one or more communities.
Admins and moderators are your first go-to when you encounter an issue in a server. They may be able to respond immediately and help resolve your concerns.
Each Discord server should have written rules for behavior to alleviate confusion or misunderstanding about the guidelines for that particular community. These rules, which supplement our Community Guidelines, are your tools to moderate efficiently and transparently. As communities grow, moderators can add more mods to keep their server a fun and welcoming place to be.
If the violation happened in a server, you should first reach out to the server’s moderators, who may be able to respond immediately and help resolve your concerns. In addition, please remember that you always have the ability to block any users that you don’t want to interact with anymore.
If the violation happened in a Direct Message, or if contacting the moderators doesn’t help, fill out the Report Form.
Please make sure to fill in all fields on the form. Providing a concise summary of the issue and including relevant message links will help us respond to your request quickly.
You’ll get an email confirming your report, and we’ll send another email when we’ve investigated the situation.
If there is a report you’d like to make about a Discord user or server and you don’t use Discord, you can also use the Report Form.
When filling out our Report Form, please select “Appeals, age update, other questions” in the “How can we help?” dropdown and “I’m the parent or guardian of a user” in the “Appeals, age update, or other questions” dropdown, then provide a concise description of the issue that you and your teen are having. For help gathering the information we will need to take action, please review the instructions here.
We also recommend that you check out our Parents and Educators section in this Safety Center, which contains useful information about how to keep your teen’s account safe.
If someone has posted comments about harming themselves in a server, you may consider reaching out to your server administrators or owner to let them know about the situation, so they can moderate their server as needed and provide support to the server member.
If you are still in touch with the user, you may wish to provide them with one of the help hotlines listed below.
You may not feel qualified to help a friend who expresses their desire to hurt themselves, and it may be helpful to ask a parent or another trusted adult for help in handling the situation.
You can always report concerning content to the Trust & Safety team using this form. You can read about what information we need to investigate here.
When we receive reports of self-harm threats, we investigate the situation and may contact authorities, but in the event of an emergency, we encourage you to contact law enforcement in addition to contacting us.
Please note that for privacy and security reasons we are unable to provide personal information such as contact information or location to someone who is not the account holder. If you are concerned that someone is in immediate danger, please contact law enforcement.
If you or another user you know is in urgent trouble, please contact authorities right away, even if you can only provide limited information. Law enforcement has investigative resources, can request information from Discord Trust & Safety that we aren't otherwise allowed to disclose, and can identify those users to get them help.
Support networks and online communities can play a key role in helping people who are experiencing mental health issues. We support mental health communities on Discord where people can come together, and we want these spaces to remain positive and healthy.
When we receive reports of users or communities discussing or encouraging self-harm, we review such content carefully, and we take into account the context in which comments are posted. We will take action on communities or users that promote, encourage, or glorify suicide or self-harm. This includes content that encourages others to cut or injure themselves or content that encourages or glorifies eating disorders.
To help keep age-restricted content in a clearly labeled, dedicated spot, we’ve added a channel setting that allows you to designate one or more text channels in your server as age-restricted.
Anyone who opens the channel will be greeted with a notification letting them know that it might contain age-restricted material and asking them to confirm that they are over 18.
Content that cannot be placed in an age-gated channel, such as avatars, server banners, and invite splashes, must not include age-restricted material.
Age-restricted content that is not placed in an age-gated channel will be deleted by moderators, and the user posting that content may be banned from the server.
Partnered servers on Discord should not contain age-restricted content.
It's worth mentioning that while having a dedicated place for your age-restricted content is permitted, there is still some material that isn't appropriate anywhere on Discord. Content that sexualizes minors is never allowed anywhere on Discord. If you're unsure of what is allowed on Discord, check out our Community Guidelines.
If you do not want to be exposed to age-restricted content on Discord, or if you are under 18 years old, we recommend turning on the explicit media content filter in your privacy settings. In your User Settings, select Privacy & Safety, and choose “Keep me safe” under Safe Direct Messaging. Choosing “Keep me safe” ensures that images and videos in all direct messages are scanned by Discord and explicit media content is blocked.
If you are under 18 years old, our age gate will restrict you from accessing age-restricted channels in servers.
If you believe your account has been compromised, submit a report to Trust & Safety here.
If you’re getting unsolicited messages or friend requests, this article explains how to change your settings.
Discord uses a proactive spam filter to protect the experience of our users and the health of the platform. Sending spam is against our Terms of Service and Community Guidelines. We may take action against any account, bot, or server using the tactics described below or similar behavior.
Receiving unsolicited messages or ads is a bad experience for users. These are some examples of DM spam for both users and bots:
Join 4 Join is the process of advertising for others to join your server with the promise to join their server in return. This might seem like a quick and fun way to introduce people to your server and to join new communities, but there’s a fine line between Join 4 Join and spam.
Even if these invitations are not unsolicited, they might be flagged by our spam filter. Sending a large number of messages in a short period of time creates a strain on our service. That may result in action being taken on your account.
While we do want you to find new communities and friends on Discord, we enforce rate limits to shut down spammers who abuse bulk joins or bulk friend requests. Joining a large number of servers simultaneously or sending out many friend requests at once may be considered spam, and we take action against accounts that do so. The majority of Discord users will never encounter our proactive spam filter, but if, for example, you send a friend request to everyone you see in a thousand-person server within a few minutes, we may take action on your account.
Instead of joining too many servers at once, we recommend using Server Discovery to find active public communities on topics you’re passionate about.
Servers dedicated to mass copy-paste messaging, or encouraging DM advertising, are considered dedicated spam servers.
Many servers have popular bots which reward active messaging. We don’t consider these bots themselves to be spambots, but spamming messages to trigger these bot prompts is considered abuse of our API and may result in our taking action on the server and/or the users who participate in mass messaging. Besides cheating those systems, sending a large number of messages in a short period of time harms the platform.
Invite reward servers are servers that promise some form of perk, often financial, for inviting and getting other users to join said server. We strongly discourage this activity, as it often results in spamming users with unsolicited messages. If it leads to spam or another form of abuse, we may take action including removing the users and server.
If a bot contacts you to be added to your server, or asks you to click on a suspicious link, please report it to our Trust & Safety team for investigation.
We don’t create bots to offer you free products. This is a scam. If you receive a DM from a bot offering you something, or asking you to click on a link, report it.
We understand the allure of free stuff. But we’re sorry to say these bots are not real. Do not add them to your server in hopes of receiving something in return, as they will likely compromise your server. If anything gets deleted, we have no way of restoring what was lost.
Using a user token in any application (known as a selfbot), or any other automation of your account, may result in account suspension or termination. Our automated system flags bots it suspects are being used for spam or other suspicious activity. The bot, as well as the bot owner’s account, may be disabled as a result of our investigation. If your bot’s code is publicly available, please remove your bot’s token from the published code to prevent it from being compromised.
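One common way for bot developers to keep a token out of published code is to load it from the environment at runtime. The sketch below is a general illustration of that practice, not an official Discord API; the variable name DISCORD_BOT_TOKEN is our own choice.

```python
import os

def load_bot_token() -> str:
    """Read the bot token from an environment variable instead of
    hardcoding it in source files that might be published."""
    # DISCORD_BOT_TOKEN is an assumed name; pick any name you set
    # in your shell or deployment environment.
    token = os.environ.get("DISCORD_BOT_TOKEN")
    if not token:
        raise RuntimeError(
            "DISCORD_BOT_TOKEN is not set; export it in your environment "
            "rather than committing it to code."
        )
    return token

# The token is then passed to your bot library at startup, for example:
# client.run(load_bot_token())
```

With this approach, a repository can be made public without exposing the token, and a leaked token can be rotated without changing any code.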
If you believe your account has been compromised through hacking, here are some steps you can take to regain access and protect yourself in the future.
Two-factor authentication (2FA) strengthens your account to protect against intruders by requiring you to provide a second form of confirmation that you are the rightful account owner. Here’s how to set up 2FA on your Discord account. If for some reason you’re having trouble logging in with 2FA, here’s our help article.
A distributed denial of service (DDoS) attack floods an IP address with useless requests, resulting in the attacked modem or router no longer being able to successfully connect to the internet. If you believe your IP address has been targeted in a DDoS attack, here are some steps you can take:
Not sure how to approach difficult or sensitive topics when it comes to talking about your experiences online? Check out our print-at-home fortune teller filled with questions and icebreaker prompts that can help jump-start a conversation about better digital health and safer online practices.
Everyone’s got a role to play in helping make your online communities a safe and inclusive space. Wanna find out yours?
For this year’s Safer Internet Day, Discord and NoFiltr, with help from the Youth Innovation Council, are launching the “What’s Your Online Digital Role?” quiz. We believe that everyone can play a part in helping make online communities a safe and inclusive space, and this interactive quiz can help you figure out what role best suits you when it comes to being a part of and building a safe community. Take the quiz here to find out.
We’re committed now more than ever to helping spread the message of Safer Internet Day. In continuing our mission of making your online home a safer one, Discord is partnering with Childnet UK and Internet Sans Crainte, two European Safer Internet Centers dedicated to increasing awareness and education about better online safety practices for youth.
In addition, we’ll be hosting a round table event in Brussels where policymakers, civil society thought leaders, and industry partners will come together to share insights, discuss challenges, and explore steps we can take together to make Discord and the internet a safer place for young people.
Wanna learn more about online safety and how you can keep yourself and others safer online? We’ve gathered these resources to give you a head start:
Discord has its own vocabulary. You might hear your teen or students using these words when talking about Discord.
Server: Servers are the spaces on Discord, made by specific communities and friend groups. The vast majority of servers are small and invitation-only. Some larger servers are public. Any user can start a new server for free and invite their friends to it.
Channel: Discord servers are organized into text and voice channels, which are usually dedicated to specific topics and can have different rules.
DMs and GDMs: Users can send private messages to other users as a direct message (DM), as well as start a voice or video call. Most DMs are one-on-one conversations, but users have the option to invite up to nine others to the conversation to create a private group DM (GDM), with a maximum size of ten people. Group DMs are not public and require an invite from someone in the group to join.
Go Live: Users can share their screen with other people who are in a server or a DM with them.
Nitro: Nitro is Discord’s premium subscription service. Nitro offers special perks for subscribers, such as the option to customize your Discord Tag, the ability to use custom emotes in every server, a higher file upload cap, and discounted Server Boosts.
Server Boosts: If your teen is a big fan of a community, they might want to boost the community’s server (or their own). Like Nitro, Server Boosts give servers special perks like more custom emotes, better video and voice quality, and the ability to set a custom invite link. Server Boosts can be bought with Nitro or purchased separately.
Student Hubs: Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students. For more information on Student Hubs, please check out our Student Hubs FAQs.
Below, you can see just a few of our favorite stories about what people are doing on Discord and why they love it. You can find even more stories about how people use Discord right here.
Cyndie, a parent of two from North Carolina, reflects on how her family uses Discord:
“There are four of us and we all have Discord installed on both our computers and phones. My oldest son is in an apartment, and the younger one is on campus, so we use Discord to make family plans. Everything gets dropped into that server. From dinner’s ready to internships and job offers. Usually it’s the silly, stupid stuff we just drop in that makes us all laugh, like when there’s a Weird Al question on Jeopardy. I can’t imagine life without it.”
Genavieve, a high-school student from California, talks about how her classes use Discord:
"I've been using Discord for the last two years as my main communication with my friends. We had too many people in our group chat and wanted a platform where we could all communicate with each other. Discord is a great way for a friend group of thirty people to stay in touch! Also, with distance learning in place, I’ve started using it with my AP Physics class too. It's been so important to feel connected to our teachers and each other when we are so isolated and in such a difficult class. Using Discord brought us closer together as a class — we are already a small class of 22 students, so being able to joke around and send memes helps us not feel so alone during the distance learning. The different channels and @mentions make it much easier to keep information straight. Screenshare makes it even easier, so we can show each other documents or problems we are working on to get feedback or troubleshooting advice.”
David, a physics and math tutor from New Jersey, talks about how he teaches students and connects with other teachers over Discord:
"I use Discord to tutor one of my students and to stay up to date with conversations and announcements in a group of physics teachers interested in physics education research. It's nice to see a side-by-side camera view of my desk with the student's work. I also really like the audio through the OPUS codec, which sounds very clean."
We work hard to ensure everyone on Discord is able to have meaningful conversations and spend time with their communities in a safe, positive, and inclusive manner.
From this menu, users can decide whether they want Discord to automatically scan and delete direct messages (DMs) that contain explicit media content. You can access this setting by going into User Settings, selecting the Privacy & Safety section, and finding the "Safe Direct Messaging" heading.
By default, this is set to “My friends are nice,” which means only DMs your teen receives from non-friends are scanned for explicit media. Choose “Keep me safe” to have all DMs your teen receives scanned for explicit media.
You can also control these settings on a server-by-server basis.
You can choose from the following options when deciding who can send your teen a friend request.
If you don’t want your teen to receive ANY friend requests, you can deselect all three options. However, your teen can still send out friend requests to other people.
If someone is bothering your teen, you always have the option to block the user. Blocking on Discord removes the user from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.
To block someone, your teen can simply right-click that user's @username and select Block.
If your teen has blocked a user but that user creates a new account to try and contact them, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.
If you or your teen would like to delete your teen’s Discord account, please follow the steps described in this article. Please note that we are unable to delete an account by request from someone other than the account owner.
Many online safety experts provide resources for parents to navigate their kids’ online lives.
ConnectSafely published their Parent’s Guide to Discord, which gives a holistic overview of how your teen uses Discord, our safety settings, and ways to start conversations with your teen about their safety.
For more information from other organizations, please go directly to their websites:
There are a few things that make Discord a great and safe place for teens:
To help your teen use Discord safely, it’s important to understand how Discord works and how you can best control your teen’s experience on it. We have listed a number of tips to do so here.
Just like with every other online service, the best way to ensure your teen stays safe online is to have clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.
Discord's Terms of Service require people to be over a minimum age to access our app or website. The minimum age to access Discord is 13, unless local legislation mandates an older age.
To ensure that users satisfy that minimum age requirement, users are asked to confirm their date of birth upon creating an account. Learn more about how we use this age information here. If a user is reported as being under 13, we delete their account unless they can verify that they are at least 13 years old using an official ID document.
Like on every internet platform, there is age-restricted content on Discord. Each user chooses which servers they want to join and who they want to interact with.
In servers, age-restricted content must be posted in a channel marked as age-restricted, which cannot be accessed by users under 18. For Direct Messages, we recommend that every user under 18 activates the explicit media content filter by selecting "Keep Me Safe" under the "Safe Direct Messaging" heading in the Privacy & Safety section of their User Settings. When a user chooses the "Keep Me Safe" setting, images and videos in all direct messages are scanned by Discord and explicit media content is blocked.
We believe that the best way to make sure that your teenagers are only accessing content that they should is to set clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.
Unlike other platforms where someone might be able to message you as soon as you sign up for an account (before you have added any friends or joined any servers), this isn’t the case on Discord. In order for another user to send a direct message (DM) to your teen, your teen must either (1) accept the other user as a friend or (2) decide to join a server that the other user is a member of.
Each user has control over the following:
Users should only accept friend requests from users that they know and trust. If your teen isn’t sure, there’s no harm in rejecting the friend request. They can always add that user later if it’s a mistake.
If your teen is ever uncomfortable interacting with someone on Discord, they can always block that specific user. Blocking a user removes them from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.
We have detailed all the controls you have to help make your teen’s account safer here. We recommend going through these settings together with your teen and having an open conversation about why you are choosing certain settings.
iOS and Android operating systems offer parental controls that can help you manage your teen's phone usage, including Discord, if needed. Apple and Microsoft offer similar controls for computers.
Privacy is incredibly important to us, including your teen’s privacy. We can’t share their login information with you, but we encourage you to discuss how to use Discord safely directly with your teen.
Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students.
For more information on Student Hubs, please check out our Student Hubs FAQs.
Even though the majority of Discord usage is in small, private, invite-only groups, we understand that there may be times when people in these groups behave in ways that make others uncomfortable or post content that isn’t allowed. Our Community Guidelines outline how all users should act on Discord and what we allow and do not allow. We recommend reviewing these with your teen so that you both know what behavior is and isn’t okay on the platform. Among other things, we do not allow:
If your teen encounters a violation of our Community Guidelines, such as harassment or inappropriate content, please file a report with details that you can gather. A member of our Trust & Safety team will review the content and take appropriate action. The Trust & Safety team strives to ensure bad users don't disrupt your teen’s experience on Discord. We also provide a number of tools to ensure that teens (and everyone else) have control over their Discord experience.
Protecting the privacy and safety of our users is of utmost importance to Discord.
Discord is a communications service for teens and adults who are looking to talk with their communities and friends online. We do not allow those under the age of 13 on our service, and we encourage our users to report accounts that may belong to underage individuals.
We’re pleased to share that the Better Business Bureau’s Children’s Advertising Review Unit (“CARU”) issued a report endorsing our practices relating to children’s privacy. CARU regularly monitors websites and online services for compliance with laws and best practices relating to children’s privacy and engaged with Discord as part of its routine monitoring work.
CARU finished its review of Discord in October 2020 and issued the following statement in connection with the release of its report:
“[T]he outcome we hope for is proactive corporate accountability on children’s privacy, and that is exactly what Discord delivered.”
— Dona Fraser, Senior Vice President, Privacy Initiatives, and Director of the Children’s Advertising Review Unit (CARU).
Discord appreciates CARU's thorough evaluation of our service and our practices. We look forward to continuing our work to improve online privacy and safety.
We’re all about helping millions of communities, small and big, find a home online to talk, hang out, and have meaningful conversations. That means we need to find the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for everyone.
Our Community Guidelines define what is and isn't okay to do on Discord. Every person on Discord should feel like their voice can be heard, but not at the expense of someone else.
If you come across a message that appears to break these rules, please report it to your server moderator or to us. We might take a number of steps, including issuing a warning, removing the content, or removing the accounts and/or servers responsible.
Learn more about our Community Guidelines here and Terms of Service here.
When we receive a report from a Discord user, the Trust & Safety team looks through the available evidence and gathers as much information as possible. This investigation is centered around the reported messages, but can expand if the evidence shows that there’s a bigger violation. For example, we may investigate whether the entire server is dedicated to bad behavior, or if the behavior appears to be part of a wider pattern.
We spend a lot of time on this process because we believe the context in which something is posted is important and can change the meaning entirely. We might ask the reporting user for more information to help our investigation.
Responding to user reports is an important part of our Trust & Safety team’s work, but we know there is also violating content on Discord that might go unreported. This is where we get proactive. Our goal is to stop bad actors and their activity before anyone else encounters it. We prioritize getting rid of the worst-of-the-worst content because it has absolutely no place on Discord, and because the risk of harm is high. We focus our efforts on exploitative content, in particular non-consensual pornography and sexual content related to minors, as well as violent extremism.
Please note: We do not monitor every server or every conversation. Privacy is incredibly important to us and we try to balance it thoughtfully with our duty to prevent harm. However, we scan images uploaded to our platform using industry-standard PhotoDNA to detect matches to known child sexual abuse material. When we have data suggesting that a user is engaging in illegal activity or violating our policies, we investigate their networks, activity on Discord, and their messages to proactively detect accomplices and determine whether violations have occurred.
When our Trust & Safety team confirms that there has been a violation of our Community Guidelines, the team takes immediate steps to mitigate the harm. The following are actions that we might take on users, servers, or both:
Discord also works with law enforcement agencies in cases of immediate danger and/or self-harm. In particular, we swiftly report child abuse material content and the users responsible to the National Center for Missing and Exploited Children.
Every user can appeal actions taken against their account. Through our investigative process, we go to great lengths to ensure that we’re only taking action when it’s warranted. But we’re not perfect. Mistakes might happen. Thus, appeals are an important part of the process.
Just as you deserve a chance to be heard when action is taken against you offline, you should have such a chance to be heard when an action is taken against your Discord account.
If you think we took unwarranted action against your account, you can reach out to us so we can review your case.
We’re committed to being transparent about bad behavior on the platform and how we respond to it. It’s an important part of our accountability to you and other Discord users. To share insights about what kind of bad behavior we’re seeing on Discord, and the actions we took to help keep Discord safe, we publish quarterly Transparency Reports. You can read our latest Transparency Report here.
As we continue to invest in safety and improve our enforcement capabilities, we’ll have new insights to share.
Discord is a voice, video, and text chat app that's used by tens of millions of people ages 13+ to talk and hang out with their communities and friends.
The vast majority of servers are private, invite-only spaces for groups of friends and communities to stay in touch and spend time together. There are also larger, more open communities, generally centered around specific topics. Users have control over whom they interact with and what their experience on Discord is.
More information about Discord and our community goals can be found here.
Law enforcement should send any additional questions and/or legal process to:
If needed for mail service, our physical address is as follows:
Discord, Inc.
444 De Haro St, Suite 200
San Francisco, CA 94107
If serving process by mail, please direct the mail to the Legal Department.