First, we need to ensure that your account credentials and login information are as secure as possible.
Your privacy and safety settings give you control over who can contact you and what they can send you. You can find them in the Privacy & Safety section of your User Settings.
We know it’s important for users to understand what controls they have over their experience on Discord and how to be safer. Part of delivering a better, safer experience is making sure people don’t see content they don’t want to – whether that’s intrusive spam or unwanted explicit images. This article covers direct message filters, which can help reduce the amount of unwanted content you see on Discord and promote a safer environment for you.
You can control these settings by going into User Settings, selecting the Privacy & Safety section, and finding the "Direct message filters" heading.
Automatically block direct messages that may contain explicit images. While the filter may successfully identify most explicit images, there may be some instances where it fails to do so. In such cases, you can block the user responsible and report the content that violates our Community Guidelines or Terms of Service.
Automatically send direct messages that may contain spam into a separate spam inbox.
These filters are customizable and you can choose to turn them off. By default, these filters are set to “Filter direct messages from non-friends.” Choose “Filter all direct messages” if you want all direct messages that you receive to be filtered, or select “Do not filter direct messages” to turn these filters off.
The last thing to do in your security settings is determine who can send you a friend request. You can find these settings in the Friend Requests section of your User Settings.
If you don’t want to receive ANY friend requests, you can deselect all three options. However, you can still send out friend requests to other people.
You should only accept friend requests from users that you know and trust. If you aren’t sure, there’s no harm in rejecting the friend request; you can always add them later if you change your mind.
As with any online interaction, we recommend following some simple rules while you’re on Discord:
Discord will never ask you for your password, whether by email or by Discord direct message. If you believe your account has been compromised, submit a report to Trust & Safety here.
We understand that there are times when you might not want to interact with someone. We want everyone to have a positive experience on Discord, and we have you covered in this case.
If you have blocked a user but they create a new account to try to contact you, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.
If you believe your account has been compromised, submit a report to Trust & Safety here.
If you’re getting unsolicited messages or friend requests, this article explains how to change your settings.
Discord uses a proactive spam filter to protect the experience of our users and the health of the platform. Sending spam is against our Terms of Service and Community Guidelines. We may take action against any account, bot, or server using the tactics described below or similar behavior.
Receiving unsolicited messages or ads is a bad experience for users. These are some examples of DM spam for both users and bots:
Join 4 Join is the process of advertising for others to join your server with the promise to join their server in return. This might seem like a quick and fun way to introduce people to your server and to join new communities, but there’s a thin line between Join 4 Join and spam.
Even if these invitations are solicited, they might be flagged by our spam filter. Sending a large number of messages in a short period of time puts a strain on our service, and that may result in action being taken on your account.
While we do want you to find new communities and friends on Discord, we enforce rate limits against spammers who abuse this through bulk joins or bulk friend requests. To shut down spambots, we take action against accounts that join servers too frequently or send out too many friend requests at once. The majority of Discord users will never encounter our proactive spam filter, but if, for example, you send a friend request to everyone you see in a thousand-person server within just a few minutes, we may take action on your account.
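Rate limits like the ones described above are commonly implemented with a token-bucket algorithm: each action spends a token, and tokens refill at a fixed rate up to a burst cap. The sketch below illustrates the general concept only; it is not Discord's implementation, and the rate and capacity numbers are made up for demonstration.

```python
import time

class TokenBucket:
    """Illustrative rate limiter: allow `rate` actions per second, bursting up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens based on elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# With a burst capacity of 5, a sixth back-to-back request is rejected.
bucket = TokenBucket(rate=1.0, capacity=5)
results = [bucket.allow() for _ in range(6)]
print(results)  # → [True, True, True, True, True, False]
```

The same idea works per-account and per-action (joins, friend requests, messages): bursts within the cap pass, sustained high-volume behavior gets throttled.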
Instead of joining too many servers at once, we recommend using Server Discovery to find active public communities on topics you’re passionate about.
Servers dedicated to mass copy-paste messaging, or encouraging DM advertising, are considered dedicated spam servers.
Many servers have popular bots that reward active messaging. We don’t consider these to be spambots, but spamming messages to trigger these bot rewards is considered abuse of our API and may result in our taking action on the server and/or the users who participate in mass messaging. Besides cheating those systems, sending a large number of messages in a short period of time harms the platform.
Invite reward servers are servers that promise some form of perk, often financial, for inviting and getting other users to join said server. We strongly discourage this activity, as it often results in spamming users with unsolicited messages. If it leads to spam or another form of abuse, we may take action including removing the users and server.
If a bot contacts you asking to be added to your server, or asks you to click on a suspicious link, please report it to our Trust & Safety team for investigation.
We don’t create bots to offer you free products. This is a scam. If you receive a DM from a bot offering you something, or asking you to click on a link, report it.
We understand the allure of free stuff, but we’re sorry to say these bots are not real. Do not add them to your server in hopes of receiving something in return, as they will likely compromise your server. If anything gets deleted, we have no way of restoring what was lost.
Using a user token in any application (known as a selfbot), or any other automation of your account, may result in account suspension or termination. Our automated system flags bots it suspects of spam or other suspicious activity. The bot, as well as the bot owner’s account, may be disabled as a result of our investigation. If your bot’s code is publicly available, remove your bot’s token from the published code to prevent it from being compromised.
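A common way to keep a bot token out of published code is to load it from an environment variable at startup instead of hardcoding it. This is a generic sketch; the variable name `DISCORD_BOT_TOKEN` is an example, not an official convention.

```python
import os

def load_bot_token() -> str:
    """Read the bot token from the environment so it never appears in source control."""
    token = os.environ.get("DISCORD_BOT_TOKEN")  # example variable name
    if not token:
        raise RuntimeError("Set DISCORD_BOT_TOKEN before starting the bot.")
    return token

# For demonstration only; in practice the variable is set outside the program
# (shell profile, systemd unit, container secrets, etc.).
os.environ["DISCORD_BOT_TOKEN"] = "example-token"
print(load_bot_token())  # → example-token
```

If a token has already been pushed to a public repository, treat it as compromised and regenerate it; removing it from the latest commit does not remove it from the repository's history.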
If you believe your account has been compromised through hacking, here are some steps you can take to regain access and protect yourself in the future.
Two-factor authentication (2FA) strengthens your account to protect against intruders by requiring you to provide a second form of confirmation that you are the rightful account owner. Here’s how to set up 2FA on your Discord account. If for some reason you’re having trouble logging in with 2FA, here’s our help article.
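For the curious, the rotating codes that authenticator apps produce come from the standard TOTP algorithm (RFC 6238): an HMAC over the current 30-second time window, truncated to a short decimal code. The sketch below shows that algorithm using only the Python standard library; it is not Discord's code, just the published standard.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    counter = timestamp // step                            # 30-second time window
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: SHA-1, 8 digits, time step 30, Unix time 59.
print(totp(b"12345678901234567890", 59, digits=8))  # → 94287082
```

Because the code depends only on the shared secret and the clock, a stolen password alone is not enough to log in, which is why enabling 2FA meaningfully hardens an account.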
A distributed denial of service (DDoS) attack floods an IP address with useless requests, resulting in the attacked modem or router no longer being able to successfully connect to the internet. If you believe your IP address has been targeted in a DDoS attack, here are some steps you can take:
To help keep age-restricted content in a clearly labeled, dedicated spot, we’ve added a channel setting that allows you to designate one or more text channels in your server as age-restricted.
Anyone who opens the channel will be greeted with a notification letting them know that it might contain age-restricted material and asking them to confirm that they are over 18.
Content that cannot be placed in an age-gated channel, such as avatars, server banners, and invite splashes, must not be age-restricted.
Age-restricted content that is not placed in an age-gated channel will be deleted by moderators, and the user posting that content may be banned from the server.
Partnered servers on Discord may not contain age-restricted content.
It's worth mentioning that while having a dedicated place for your age-restricted content is permitted, there is still some material that isn't appropriate anywhere on Discord. Content that sexualizes minors is never allowed anywhere on Discord. If you're unsure of what is allowed on Discord, check out our Community Guidelines.
The core of our mission is to give everyone the power to find and create belonging in their lives. Creating a safe environment on Discord is essential to achieve this, and is one of the ways we prevent misuse of our platform. Safety is at the core of everything we do and a primary area of investment as a business:
The fight against bad actors on communications platforms is unlikely to end soon, and our approach to safety is guided by the following principles:
Underpinning all of this are two important considerations: our overall approach towards content moderation and our investments in technology solutions to keep our users safe.
We believe that in the long term, machine learning will be an essential component of safety solutions. In 2021, we acquired Sentropy, a leader in AI-powered moderation systems, to advance our work in this domain. We will continue to balance technology with the judgment and contextual assessment of highly trained employees, while maintaining our strong stance on user privacy.
Here is an overview of some of our key investments in technology:
In the field of online safety, we are inspired by the spirit of cooperation across companies and civil society groups. We are proud to engage and learn from a wide range of companies and organizations including:
This cooperation extends to our work with law enforcement agencies. When appropriate, Discord complies with information requests from law enforcement agencies while respecting the privacy and rights of our users. Discord may also disclose information to authorities in emergency situations when we have a good faith belief that there is imminent risk of serious physical injury or death. You can read more about how Discord works with law enforcement here.
If you would like to learn more about our approach to Safety, we welcome you to visit the links below.
Many online safety experts provide resources for parents to navigate their kids’ online lives.
ConnectSafely published their Parent’s Guide to Discord, which gives a holistic overview of how your teen uses Discord, our safety settings, and ways to start conversations with your teen about their safety.
For more information from other organizations, please go directly to their websites:
We know it’s important for users to understand what controls they have over their experience on Discord and how to be safer. Part of delivering a better, safer experience is making sure people don’t see content they don’t want to – whether that’s intrusive spam or unwanted explicit images. This article covers direct message filters, which can help reduce the amount of unwanted content seen on Discord and promote a safer environment for your teen.
These settings can be controlled by going into User Settings, selecting the Privacy & Safety section, and finding the "Direct message filters" heading.
Automatically block direct messages that may contain explicit images. While the filter may successfully identify most explicit images, there may be some instances where it fails to do so. In such cases, you can block the user responsible and report the content that violates our Community Guidelines or Terms of Service.
Automatically send direct messages that may contain spam into a separate spam inbox.
These filters are customizable and you can choose to turn them off. By default, these filters are set to “Filter direct messages from non-friends.” Choose “Filter all direct messages” if you want all direct messages that you receive to be filtered, or select “Do not filter direct messages” to turn these filters off.
You can also control these settings on a server-by-server basis.
You can choose from the following options when deciding who can send your teen a friend request.
If you don’t want your teen to receive ANY friend requests, you can deselect all three options. However, your teen can still send out friend requests to other people.
If someone is bothering your teen, you always have the option to block the user. Blocking on Discord removes the user from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.
To block someone, your teen can simply right-click the person’s @username and select Block.
If your teen has blocked a user but that user creates a new account to try to contact them, please report the user to the Trust & Safety team. You can learn more about how to do this at this link.
If you or your teen would like to delete your teen’s Discord account, please follow the steps described in this article. Please note that we are unable to delete an account by request from someone other than the account owner.
Administrators are the people who create Discord servers around specific interests. They establish the rules for participating, can invite people to join, and oversee the health and well-being of their community. They have broad administrative control, and can bring in moderators to manage community members. They can also ban or remove members and, if necessary, remove and replace moderators.
Administrators also choose moderators to play a vital role in Discord communities. The responsibilities of a moderator might vary, but their overall role is to ensure that their Discord server is a safe, healthy environment for everyone. They can do things like moderate or delete messages, as well as invite, ban, or suspend people who violate the server’s rules. The best moderators typically are seasoned and enthusiastic participants in one or more communities.
Admins and moderators are your first go-to when you encounter an issue in a server. They may be able to respond immediately and help resolve your concerns.
Each Discord server should have written rules for behavior to alleviate confusion or misunderstanding about the guidelines for that particular community. These rules, which supplement our Community Guidelines, are your tools to moderate efficiently and transparently. As communities grow, moderators can add more mods to keep their server a fun and welcoming place to be.
There are a few things that make Discord a great and safe place for teens:
To help your teen use Discord safely, it’s important to understand how Discord works and how you can best control your teen’s experience on it. We have listed a number of tips to do so here.
Just like with every other online service, the best way to ensure your teen stays safe online is to have clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.
Discord's Terms of Service require people to be over a minimum age to access our app or website. The minimum age to access Discord is 13, unless local legislation mandates an older age.
To ensure that users satisfy that minimum age requirement, users are asked to confirm their date of birth upon creating an account. Learn more about how we use this age information here. If a user is reported as being under 13, we delete their account unless they can verify that they are at least 13 years old using an official ID document.
Like on every internet platform, there is age-restricted content on Discord. Each user chooses which servers they want to join and who they want to interact with.
In servers, age-restricted content must be posted in a channel marked as age-restricted, which cannot be accessed by users under 18. For Direct Messages, we recommend that every user under 18 activates the explicit media content filter by selecting "Keep Me Safe" under the "Safe Direct Messaging" heading in the Privacy & Safety section of their User Settings. When a user chooses the "Keep Me Safe" setting, images and videos in all direct messages are scanned by Discord and explicit media content is blocked.
We believe that the best way to make sure that your teenagers are only accessing content that they should is to set clear guidelines on what they should and shouldn’t be looking at or posting online, and make sure that you keep clear lines of communication with them.
Unlike other platforms where someone might be able to message you as soon as you sign up for an account (before you have added any friends or joined any servers), this isn’t the case on Discord. In order for another user to send a direct message (DM) to your teen, your teen must either (1) accept the other user as a friend or (2) decide to join a server that the other user is a member of.
Each user has control over the following:
Users should only accept friend requests from users that they know and trust. If your teen isn’t sure, there’s no harm in rejecting the friend request. They can always add that user later if it’s a mistake.
If your teen is ever uncomfortable interacting with someone on Discord, they can always block that specific user. Blocking a user removes them from your teen's Friends List, prevents them from messaging your teen directly, and hides their messages in any shared servers.
We have detailed all the controls you have to help make your teen’s account safer here. We recommend going through these settings together with your teen and having an open conversation about why you are choosing certain settings.
iOS and Android operating systems offer parental controls that can help you manage your teen's phone usage, including Discord, if needed. Apple and Microsoft offer similar controls for computers.
Privacy is incredibly important to us, including your teen’s privacy. We can’t share their login information with you, but we encourage you to discuss how to use Discord safely directly with your teen.
Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students.
For more information on Student Hubs, please check out our Student Hubs FAQs.
Even though the majority of Discord usage is in small, private, invite-only groups, we understand that there may be times when people in these groups behave in ways that make others uncomfortable or post content that isn’t allowed. Our Community Guidelines outline how all users should act on Discord and what we allow and do not allow. We recommend reviewing these with your teen so that you both know what behavior is and isn’t okay on the platform. Among other things, we do not allow:
If your teen encounters a violation of our Community Guidelines, such as harassment or inappropriate content, please file a report with details that you can gather. Our Trust & Safety team strives to ensure bad users don't disrupt your teen’s experience on Discord. We also provide a number of tools to ensure that teens (and everyone else) have control over their Discord experience.
Every user can appeal actions taken against their account. Through our investigative process, we go to great lengths to ensure that we’re only taking action when it’s warranted. But we’re not perfect. Mistakes might happen. Thus, appeals are an important part of the process.
Just as you deserve a chance to be heard when action is taken against you offline, you should have such a chance to be heard when an action is taken against your Discord account.
If you think we took unwarranted action against your account, you can reach out to us so we can review your case.
When our Trust & Safety team confirms that there has been a violation of our Community Guidelines, the team takes immediate steps to mitigate the harm. The following are actions we might take on users and/or servers:
Discord also works with law enforcement agencies in cases of immediate danger and/or self-harm. In particular, we swiftly report child abuse material and the users responsible to the National Center for Missing and Exploited Children.
We’re all about helping millions of communities, small and big, find a home online to talk, hang out, and have meaningful conversations. That means we need to find the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for everyone.
We currently employ three levers to moderate user content on Discord, while being mindful of user privacy:
Our Community Guidelines define what is and isn't okay to do on Discord. Every person on Discord should feel like their voice can be heard, but not at the expense of someone else.
If you come across a message that appears to break these rules, please report it to your server moderator or to us. We might take a number of steps, including issuing a warning, removing the content, or removing the accounts and/or servers responsible.
Discord is a voice, video, and text chat app that's used by tens of millions of people ages 13+ to talk and hang out with their communities and friends.
The vast majority of servers are private, invite-only spaces for groups of friends and communities to stay in touch and spend time together. There are also larger, more open communities, generally centered around specific topics. Users have control over whom they interact with and what their experience on Discord is.
More information about Discord and our community goals can be found here.
Please submit your inquiry in our new Law Enforcement Portal at: https://app.kodex.us/discord/signin.
Our online portal will guide you through how to submit a request. You will need to:
1. Verify your email address via the link sent to your email address (only valid law enforcement domains will be accepted);
2. Fill in the required fields in the webform;
3. Upload a copy of any relevant documents in PDF format (for example, a copy of the subpoena or search warrant, as well as any non-disclosure order you may have).
Our online portal also allows you to add information, ask us questions, and download the information when it is available.
It is our policy to notify Discord users and provide them with a copy of the legal process seeking their account information. We do not provide notice if we are prohibited by law or under exceptional circumstances such as child sexual exploitation investigations or threat to life emergencies. We ask that your non-disclosure order (18 U.S.C. § 2705(b)) or similar legal authority include a time limitation, as we provide delayed notice upon expiration of this limitation and when we believe the exceptional circumstance no longer exists. If no non-disclosure provision or exception is included with your legal request, Discord will confirm with law enforcement before providing user notice. In order to expedite your legal request for account information, we ask that you include or inform us initially of one of the following:
Discord reserves the right to not notify users if doing so would pose a risk to Discord or to the general welfare of its users.
If your data request places Discord on notice of an ongoing or prior violation of our Terms of Service, Community Guidelines or other policies, we may take action to prevent further abuse, including account termination and other actions that may notify the user that we are aware of their misconduct.
If needed for mail service, our physical address is as follows:
Discord, Inc.
444 De Haro St, Suite 200
San Francisco, CA 94107
If serving process by mail, please direct the mail to the Legal Department.
If someone has posted comments about harming themselves in a server, you may consider reaching out to your server administrators or owner to let them know about the situation, so they can moderate their server as needed and provide support to the server member.
If you are still in touch with the user, you may wish to provide them with one of the help hotlines listed below.
You may not feel qualified to help a friend who expresses their desire to hurt themselves, and it may be helpful to ask a parent or another trusted adult for help in handling the situation.
All Discord users can report policy violations right in the app by following the instructions here.
When we receive reports of self-harm threats, we investigate the situation and may contact authorities, but in the event of an emergency, we encourage you to contact law enforcement in addition to contacting us.
Please note that for privacy and security reasons we are unable to provide personal information such as contact information or location to someone who is not the account holder. If you are concerned that someone is in immediate danger, please contact law enforcement.
If you or another user you know is in urgent trouble, please contact the authorities right away, no matter how limited the information you can provide may be. Law enforcement has investigative resources, can contact Discord Trust & Safety for information we aren’t otherwise allowed to disclose, and can identify those users to get them help.
Support networks and online communities can play a key role in helping people who are experiencing mental health issues. We support mental health communities on Discord where people can come together, and we want these spaces to remain positive and healthy.
When we receive reports of users or communities discussing or encouraging self-harm, we review such content carefully, and we take into account the context in which comments are posted. We will take action on communities or users that promote, encourage, or glorify suicide or self-harm. This includes content that encourages others to cut or injure themselves or content that encourages or glorifies eating disorders.
Do not mislead Discord’s support teams. Do not make false or malicious reports to our Trust & Safety or other customer support teams, send multiple reports about the same issue, or ask a group of users to report the same content or issue. Repeated violations of this guideline may result in loss of access to our reporting functions.
If a credible threat of violence has been made and you or someone else are in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.
Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor to help you or a friend through any mental health crisis by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text line here.
You can find more resources about mental health here.
When we become aware of content that violates our Community Guidelines or Terms of Service, our Safety team reviews and takes the necessary enforcement actions, including: disabling accounts, removing servers, and when appropriate, engaging with the proper authorities. We may not review your report manually or respond to you directly, but we’ll use your report to improve Discord.
You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our quarterly Transparency Report.
Discord has its own vocabulary. You might hear your teen or students using these words when talking about Discord.
Server: Servers are the spaces on Discord. They are made by specific communities and friend groups. The vast majority of servers are small and invitation-only. Some larger servers are public. Any user can start a new server for free and invite their friends to it.
Channel: Discord servers are organized into text and voice channels, which are usually dedicated to specific topics and can have different rules.
DMs and GDMs: Users can send private messages to other users as a direct message (DM), as well as start a voice or video call. Most DMs are one-on-one conversations, but users have the option to invite up to nine others to the conversation to create a private group DM (GDM), with a maximum size of ten people. Group DMs are not public and require an invite from someone in the group to join.
Go Live: Users can share their screen with other people who are in a server or a DM with them.
Nitro: Nitro is Discord’s premium subscription service. Nitro offers special perks for subscribers, such as the option to customize your Discord Tag, the ability to use custom emotes in every server, a higher file upload cap, and discounted Server Boosts.
Server Boosts: If your teen is a big fan of a community, they might want to boost the community’s server (or their own). Like Nitro, Server Boosts give servers special perks like more custom emotes, better video and voice quality, and the ability to set a custom invite link. Server Boosts can be bought with Nitro or purchased separately.
Student Hubs: Discord Hubs for Students allow students to verify their Discord account with their official student email, and unlock access to an exclusive hub for students at their school. Within the hub, they can connect with other verified students, discover servers for study groups or classes, and share their own servers for fellow students to join. Hubs are not affiliated with or managed by a school or school staff. Servers in a Hub are student-run but may include non-students. For more information on Student Hubs, please check out our Student Hubs FAQs.
Below, you can see just a few of our favorite stories about what people are doing on Discord and why they love it. You can find even more stories about how people use Discord right here.
Cyndie, a parent of two from North Carolina, reflects on how her family uses Discord:
“There are four of us and we all have Discord installed on both our computers and phones. My oldest son is in an apartment, and the younger one is on campus, so we use Discord to make family plans. Everything gets dropped into that server. From dinner’s ready to internships and job offers. Usually it’s the silly, stupid stuff we just drop in that makes us all laugh, like when there’s a Weird Al question on Jeopardy. I can’t imagine life without it.”
Genavieve, a high-school student from California, talks about how her classes use Discord:
"I've been using Discord for the last two years as my main communication with my friends. We had too many people in our group chat and wanted a platform where we could all communicate with each other. Discord is a great way for a friend group of thirty people to stay in touch! Also, with distance learning in place, I’ve started using it with my AP Physics class too. It's been so important to feel connected to our teachers and each other when we are so isolated and in such a difficult class. Using Discord brought us closer together as a class — we are already a small class of 22 students, so being able to joke around and send memes helps us not feel so alone during the distance learning. The different channels and @mentions make it much easier to keep information straight. Screenshare makes it even easier, so we can show each other documents or problems we are working on to get feedback or troubleshooting advice.”
David, a physics and math tutor from New Jersey, talks about how he teaches students and connects with other teachers over Discord:
"I use Discord to tutor one of my students and to stay up to date with conversations and announcements in a group of physics teachers interested in physics education research. It's nice to see a side-by-side camera view of my desk with the student's work. I also really like the audio, which comes through the Opus codec and sounds very clean."
Roles are one of the building blocks of managing a Discord server. They give your members a fancy color, but more importantly, each role comes with a set of permissions that control what your members can and cannot do in the server. With roles, you can give members and bots administrative permissions like kicking or banning members, adding or removing channels, and pinging @everyone.
You can find these options in the Roles section of your Server Settings.
Assign permissions with care! Certain permissions allow members to make changes to your server and channels. These permissions are a great moderation tool, but be wary of who you grant this power to. Changes made to your server can’t be undone.
Server verification levels allow you to control who can send messages in your server. Setting a high verification level is a great way to protect your server from spammers or raids. You can find this option in the Safety Setup section of your Server Settings.
When enabled, server-wide two-factor authentication (2FA) requires all of your moderators and administrators to have 2FA enabled on their accounts in order to take administrative actions, like deleting messages. You can read more about 2FA here.
By requiring all admin accounts to have 2FA turned on, you protect your server from malicious users who might try to compromise one of your moderators' or administrators' accounts and then make unwanted changes to your server. If you are the server owner, you can enable the 2FA requirement for moderation in the Safety Setup section of your Server Settings.
You must have 2FA enabled on your own account before you can enable this option!
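Authenticator apps generate these 2FA codes using the standard TOTP algorithm (RFC 6238, built on RFC 4226's HOTP). As a minimal sketch of how a six-digit code is derived from a shared secret and the current time (this is the generic standard, not Discord's internal implementation):

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(base32_secret: str, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over the current time step."""
    secret = base64.b32decode(base32_secret.upper())
    return hotp(secret, int(time.time()) // step, digits)
```

Because both your authenticator app and the server derive the code from the same secret and clock, a stolen password alone is no longer enough to sign in.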
The explicit image filter automatically blocks messages in a server that may contain explicit images in channels not marked as Age-restricted.
You can control this feature so that it applies to all server members, or just to members without roles — or you can turn this feature off altogether.
Age-restricted channels are exempt from the explicit image filter. Turning this filter on allows your server members to share content while reducing the risk of explicit images being posted in channels that are not age-restricted.
You can access this setting by navigating to Server Settings, selecting Safety Setup under Moderation, and then finding the Explicit image filter heading.
When we receive a report from a Discord user, the Trust & Safety team looks through the available evidence and gathers as much information as possible. This investigation is centered around the reported messages, but can expand if the evidence shows that there’s a bigger violation. For example, we may investigate whether the entire server is dedicated to bad behavior, or if the behavior appears to be part of a wider pattern.
We spend a lot of time on this process because we believe the context in which something is posted is important and can change the meaning entirely. We might ask the reporting user for more information to help our investigation.
Responding to user reports is an important part of our Trust & Safety team’s work, but we know there is also violating content on Discord that might go unreported. This is where we get proactive. Our goal is to stop bad actors and their activity before anyone else encounters it. We prioritize getting rid of the worst-of-the-worst content because it has absolutely no place on Discord, and because the risk of harm is high. We focus our efforts on exploitative content, in particular non-consensual pornography and sexual content related to minors, as well as violent extremism.
Please note: We do not monitor every server or every conversation. Privacy is incredibly important to us and we try to balance it thoughtfully with our duty to prevent harm. However, we scan images uploaded to our platform to detect child sexual abuse material. When we have data suggesting that a user is engaging in illegal activity or violating our policies, we investigate their networks, activity on Discord, and their messages to proactively detect accomplices and determine whether violations have occurred.
We’ve been paying close attention to violent extremist groups and movements ever since we learned how the organizers of the 2017 Unite the Right Rally in Charlottesville, Virginia utilized Discord to plan their hateful activities.
Back then, Discord Trust & Safety was a team of one, just beginning to make difficult decisions about how to properly moderate the platform. Almost four years later, our Trust & Safety Team makes up 15% of Discord’s nearly 400 employees and splits its time between responding to user reports and proactively finding and removing servers and users engaging in high-harm activity like violent extremist organizing.
Trust & Safety has spent a lot of time since 2017 trying to ensure that another event like Charlottesville isn’t planned on our platform. Our team developed frameworks based on academic research on violent extremist radicalization and behavior to better identify extremist users who try to use Discord to recruit or organize. We keep up-to-date on research that can lend insight into how to evaluate and understand extremist behavior online, and our recent partnerships with organizations like the Global Internet Forum to Counter Terrorism (GIFCT) and Tech Against Terrorism (TAT) are intended to support this effort.
Categorizing violent extremism itself is difficult because not all extremists have the same motives or believe in the same ideas. Some individuals who adopt violent ideologies act on their beliefs by joining organized hate, terrorist, or violent extremist groups.
Others don’t want to officially identify themselves as belonging to a particular movement, and may instead form looser connections with others who have adopted the same worldview. Different cultural contexts also influence belief systems and behaviors, so violent extremist ideologies in one country will naturally be different from those on the other side of the world.
Violent extremism is nuanced, and the ideologies and tactics behind it evolve quickly. We don’t try to apply our own labels or identify a certain “type” of extremism.
Instead, we evaluate user accounts, servers, and content that is flagged to us based on common characteristics and patterns of behavior, such as:
It’s important to note that the presence of one or two of these signals doesn’t automatically mean that we would classify a server as “violent extremist.” While we might use these signs to help us determine a user or space’s intent or purpose, we always want to understand the context in which user content is posted before taking any action.
On the day of the Insurrection, our Trust & Safety agents were reviewing reports of hate speech, glorification of violence, and misinformation about what was transpiring. We feel very fortunate that our team was able to locate and remove many of the most harmful servers dedicated to coordinating violence on January 6.
Our ability to move proactively on servers advocating for violence was thanks to two main factors: first, we were able to surface reports from users on these spaces quickly; and second, our Trust & Safety agents dedicated to countering violent extremism had been tracking these spaces ever since allegations of election fraud regarding the 2020 U.S. presidential election had begun to spread.
We believe it’s important to talk about the line we walk with Discord users who discuss politics or organize political activities like protests. Many people are frustrated with how society works and how some governmental or societal systems are structured. Naturally, people have strong opinions on how things should or shouldn’t change.
Now more than ever, it’s important for meaningful conversations, debates, and exchanges of ideas to take place. We’re glad that users across the world have turned to Discord to discuss their opinions and beliefs, to organize, and to advocate for the change they want to see.
Discord Trust & Safety’s objective is to ensure that no harm comes to our users, or to society at large, because of actions taken on Discord, which is why we don’t tolerate activities that promote or advocate for violence. When we’re reviewing reports for violent extremism, while it’s sometimes clear when users or servers have crossed that line, in many cases there’s a lot more context to consider. One of the most difficult responsibilities of our work is balancing the mitigation of potential harm without appearing as if we are overstepping any boundaries or censoring meaningful conversation.
Because of these values, we plan to continue standing firm against ideologies of hate that violent extremist communities espouse, and we are excited to work with other platforms and organizations that seek to do the same. Stay tuned for more updates.
We know that we can’t solve violent extremism alone, but we’ll continue to do our best to make sure that the communities on Discord reflect our company values. We want Discord — and the internet as a whole — to be a space for positive interactions and creating belonging.
If you would like to report dangerous or harmful activity to the Trust & Safety team, please do so using our report form. If you’re unsure how to report a user or server, take a look at dis.gd/HowToReport.
In this situation, a user pretending to be your friend, or using a friend’s compromised account, reaches out asking you to check out their video, test a game they made, or practice running code they wrote. No matter the backstory, they’ll always ask you to download a program or click a link they provide, resulting in a malicious program entering your computer and/or compromising your account.
Another variation of this scheme involves a user asking you to “test” something for them, directing you to open the developer tools on your internet browser while logged into the web app. They’ll then ask you to show them your token — do not do this. With your token, malicious users can sign in and take over your account.
Discord will never ask you for your token, and you should never have any reason to open Discord’s Developer Console in the first place. Note that this is only applicable to Discord on your internet browser, and not the desktop or mobile application.
This is similar to the previous scheme in that usually it is, again, a trusted individual that DMs you. Sometimes it's in the form of a well-known bot or under the facade that they are an administrator for a server that you're active in. It may involve very genuine-looking links to websites as well. Like we said, if it's too good to be true, it likely is.
Discord impersonation involves a hacker who pretends to be messaging you from an “official Discord account” and offers entry to one of our community initiatives, such as the HypeSquad or Partner programs.
This is nearly always fake. Below are two screenshots, both of which present themselves as official Discord-sent messages. However, of these two conversations, only the right screenshot is actually from Discord.
On the right, you can note the blurple “System” tag next to the sender’s name, as well as the Reply space being replaced with a unique banner that only official system messages come with.
The DM on the left does its best to be convincing, though. It even sends an invite link to a real Discord-run server called Discord Testers and a somewhat-real-looking link to the supposed Discord HypeSquad form. Scammers often mix real Discord invite links (usually to public Discord servers) in with their malicious links to appear legitimate and lull you into a false sense of security.
If you suss out that a DM is a fake, report it as Spam using the red “Report Spam” button at the top of the DM.
This feature is one of many improvements that we’re working on to help identify and remove bad actors as soon as we’re aware of them.
One of the oldest scams is the temptation of “free Nitro.” While we can’t discount people who may be truly full of generosity and believe in gifting Nitro, getting a random DM from a stranger claiming to have chosen *you* for a Nitro giveaway is incredibly sus, and most likely a scam.
Discord will never ask you to scan a QR code in order to redeem a Nitro code. Do NOT scan any QR codes from people you don’t know or those you can’t verify as legitimate.
If you ever use QR Code Login to sign in to Discord, make sure you’re using the desktop app, or if you’re on the web app, that your URL bar says “https://discord.com/login” exactly as it's written.
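Checking that a login URL is exactly right matters because lookalike addresses can be very convincing. As an illustrative sketch (the lookalike hostnames below are made-up examples), a strict check verifies the scheme, hostname, and path all match exactly:

```python
from urllib.parse import urlsplit

def is_official_discord_login(url: str) -> bool:
    """Return True only for the exact official login page over HTTPS.

    Lookalike hosts such as discord.com.evil.example (a subdomain of a
    different site) fail because the hostname must match exactly.
    """
    parts = urlsplit(url)
    return (
        parts.scheme == "https"
        and parts.hostname == "discord.com"
        and parts.path == "/login"
    )
```

The same principle applies when you check the URL bar by eye: read the whole hostname, not just the part that starts with "discord."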
The above tactics are some of the ways that scammers may attempt to socially engineer you into giving up your information. Even if you don’t click any of their links, it's best to simply block and report them to us rather than engage further.
We encourage you to share this article with friends who may not be as informed as you — when everyone’s aware, our communities are safer than ever. Here’s a quick link back to Protecting Against Scams on Discord too.
Stay safe out there!
Internet safety doesn’t have to be exhausting. Below are some simple but effective ways to make sure you’re on guard against any potential ne’er-do-wells in your DMs, and even outside of Discord.
This may feel like a given, but a surprising number of security issues stem from people clicking on links before checking if they’re the real deal. Always double-check a link before you click it — link-shortening services can easily mask unsafe websites or programs. We recommend checking it against a resource like VirusTotal to see if someone has already flagged it as potentially dangerous.
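To illustrate why shortened or odd-looking links deserve extra scrutiny, here is a toy checker. The shortener list and rules are illustrative assumptions only, nowhere near exhaustive, and no substitute for a real scanner like VirusTotal:

```python
from urllib.parse import urlsplit

# Illustrative, deliberately incomplete list of common URL shorteners.
KNOWN_SHORTENERS = {"bit.ly", "tinyurl.com", "t.co", "goo.gl"}

def link_warnings(url: str) -> list:
    """Return reasons to double-check a link before clicking it."""
    warnings = []
    parts = urlsplit(url)
    host = parts.hostname or ""
    if parts.scheme != "https":
        warnings.append("not served over HTTPS")
    if host in KNOWN_SHORTENERS:
        warnings.append("shortened link: the real destination is hidden")
    # Punycode hostnames (xn--) can encode lookalike characters.
    if host.startswith("xn--") or ".xn--" in host:
        warnings.append("punycode hostname: may imitate another site")
    return warnings
```

An empty warning list doesn't mean a link is safe; it just means these particular red flags are absent.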
In addition, Discord has its own systems in place to remove malicious links and we’re constantly evolving those systems.
It’s not advised to download and run software that doesn’t come from a reputable source. Downloading and running programs that someone sends you unprompted is almost always a bad idea.
If a person claiming to have “special access to features” or new software says they need you to run it on your own computer, they’re misleading you in order to harvest your personal info with their shady programs. If it sounds too good to be true, it probably is.
There’s no reason to give it up, ever. Sharing your password not only gives away access to your account but also exposes any personal information you have tied to that account — and potentially any website where you use that password — making you vulnerable to more than just a single account takeover.
The above tips can be applied anywhere on the internet! Next, we'll share some Discord-specific tips to ensure you can be vigilant against baddies targeting your account or community:
Disabling DMs for a particular server is one of the best ways to prevent bad apples hiding inside larger communities from contacting you.
To adjust who can and can’t DM you, head into User Settings > Privacy & Safety, then scroll down to “Server Privacy Defaults.” From there, you’ll find the option to “Allow direct messages from server members.”
Feel free to adjust it as you wish, but do note that this new state only applies to servers joined after changing the toggle; it won’t retroactively affect your existing servers.
If you turn this option off, members of newly-joined servers can’t contact you via DM unless you’re friends with them beforehand. Receiving mail might be nice, but receiving suspicious messages from people you don’t know is less nice.
If you're in a server you trust and don’t mind being messaged by those in it, you can toggle the privacy setting on an individual basis. Head to that server on desktop or mobile and select its name to open the server's settings, and choose “Privacy Settings.” Once there, you’ll find the “Allow direct messages from server members" option. Turn that on, and you’re free to receive all sorts of DMs from everyone in that server, regardless of if you’re friends or not!
If you’ve joined a lot of communities, consider auditing the list to see whether you’re comfortable letting non-friends from each server message you, or whether staying open is inviting unnecessary risk into your inbox.
Understanding which permissions your mods and members have access to is key to keeping everyone in your server safe. If you're a server owner, have you checked your permissions list lately? Who has what perms? Did you know they had that access, and for how long have they had it?
If the answer to any of these questions was a resounding :shrug_emoji:, it’s time to do a review of your server setup to ensure that only those who really need powerful permissions have them.
Specifically, make sure that only moderators you trust have access to permissions that can change powerful server tools, including any bots or webhooks you might add to the server. Be vigilant for bots that are impersonating larger well-known bots.
Large, reputable moderation bots almost never need the Administrator permission to work properly. Only give a bot the permissions required for the tasks you need and no more, and look for a Verified checkmark on a well-known bot before adding it.
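Under the hood, a Discord permission set is a single integer bitfield. The flag values below come from the Discord API documentation (double-check them against the current docs); the "minimal moderation bot" set is an illustrative example of granting only what's needed while deliberately excluding Administrator:

```python
# Permission bit flags from the Discord API docs (verify against current docs).
KICK_MEMBERS    = 1 << 1   # 0x2
BAN_MEMBERS     = 1 << 2   # 0x4
ADMINISTRATOR   = 1 << 3   # 0x8 -- grants everything; almost never needed by a bot
MANAGE_MESSAGES = 1 << 13  # 0x2000

# A minimal set for a typical moderation bot: only what its tasks require.
MOD_BOT_PERMISSIONS = KICK_MEMBERS | BAN_MEMBERS | MANAGE_MESSAGES

def has_permission(granted: int, flag: int) -> bool:
    """Check whether a permission integer includes a given flag."""
    return granted & flag == flag
```

When you generate a bot invite link, the permissions integer is exactly this kind of combined value, so you can audit what a link actually asks for before authorizing it.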
If you need a refresher on how permissions on Discord work, you can check out the Help Center article here. If you have a basic understanding of the permissions system and want a more comprehensive look at what they mean in a moderation sense, we also have this article from the Discord Moderation Academy.
If you update your server’s links, make sure that your community and potential newcomers are aware of the changes and update any social media pages where you shared them. If possible, delete references to old invite links and make it known that those links have been updated.
This goes double for Partnered, Verified, or Level 3-boosted servers that utilize a vanity URL: if your server loses or changes its custom invite link, nefarious communities may swoop in and claim your old one. If this happens before you update your public-facing invites, people trying to join your community may instead join a server that’s looking to cause trouble.
In addition to updating existing links, consider implementing easy-to-follow community rules around invite sharing and encouraging members to always verify where a server invite leads and who it is coming from before clicking it.
Pro Tip: Try pasting one of those invites to a Discord message to preview where it leads to before opening it! (But of course, don’t make your invite testing look like a spam message by pasting a random invite in #general.)
If someone gains control of your Discord account, they will have as much control over your account as you do: they’ll be able to change your username, password, the email tied to the account, and any other information associated with it.
They’ll also be able to see any personal info associated with your account once they’re in it. While most people think of “personal info” as payment details or email, it also includes your private conversations and messages in DMs and servers alike. If you can see it, they’ll be able to see it.
Acting as the owner of any servers you own, they’ll be able to make any changes they want: server layout, server permissions, bots and webhooks, even kicking everyone out of the server. If your account moderates a server that a hacker is targeting, they might use you as a stepping stone to cause further damage within the community, or even impersonate you to trick unsuspecting members.
Some users may also target Discord accounts that have unique profile badges that are no longer available, such as the Early Supporter or Early Verified Bot Developer badges. If you have one of these unique badges, you should be extra-vigilant with your account.
If your account is taken over and the hacker changes the password, there isn’t much you can immediately do to stop them. However, if you have 2-Factor Authentication enabled on your account, the hacker will also be required to provide a 2FA code to change your password. We strongly recommend enabling 2FA, and you can learn how in our 2FA blog post.
You can report what happened to Discord here; doing so can help you regain ownership of your account. Let us help you!
We’ve created an additional blog post describing the types of scams floating around on Discord, which you can check out to familiarize yourself with signs that you may be dealing with a hacker who needs to be blocked:
We recommend sharing it with larger communities that may benefit from this knowledge.
With these recommendations in your pocket, you’ll be better able to foil any potential digital threats. Just like with keeping your IRL-self healthy and whole, taking preventative measures can keep your virtual self safe and secure.
Stay safe out there!
Above all else, the foundation of a good moderation team is familiarity. Knowing your fellow moderators, what they do, and where they live helps you relate to them. You’ll get to know people’s strengths and weaknesses, learn to understand them, and get a feeling for when they need help, and vice versa. Though you may all be in different time zones and have diverse backgrounds, you’re all working towards the same goal: keeping the community you care for safe. Who knows, you might find that you share a lot of the same interests along the way, make great new friends, or deepen existing friendships during your time together as moderators.
Here are a few basic things you should do to familiarize yourselves with one another:
A moderation team needs a clear structure and a unified understanding of server moderation, which has already been covered in Developing moderator guidelines. Now we’re going to expand on how to utilize each moderator's abilities further. A moderation team can range from a few members in a small server to a huge team of 30 or more staff, depending on the server size. The bigger your community gets, the more organized the team needs to be. While they are all moderators, that doesn’t mean they all do the same job.
Some of your moderators, especially experienced moderators, are likely to be in more administrative positions. They usually stay further away from general day-to-day channel moderation while newer moderators are focused on watching conversations and enforcing the server rules.
If you do have one of these larger mod teams, consider delegating certain moderators to the tasks and responsibilities they’d be best suited for, rather than creating a jack-of-all-trades, master-of-none situation. This allows you to divide the team into smaller sub-teams that talk to each other more frequently in designated channels about their specific mod duties.
Here are a few examples of sub-teams that are common within larger communities:
Moderators that primarily contribute to the community by enforcing rules, watching conversations, engaging members, solving member-to-member conflicts, and maintaining a visible moderation presence. The same type of moderator can also exist for voice channels, but that is mostly for very large communities.
Moderators that are extremely familiar with permissions, bots, programming, etc. Ideally, they aren’t just able to operate bots you’re using, but also maintain them. A custom bot tailored to your community is always something to consider. Having a bot hosted “in-house” by a moderator within your team adds an additional layer of security. The Bot Team is very valuable in making new and creative ideas possible and especially in automating day-to-day processes.
Most servers host events, from community-run events to events run by staff members. Event Supervisors oversee the community members hosting events, keep an eye out for new ones, and act as the general point of contact for hosting and announcing them.
Giving moderators a designated job, matched to your team's size, lets them dive more deeply into specific areas of moderation within your community, and overall makes managing and coordinating the team as a whole easier.
As the server size and number of moderators increase, it becomes harder to hear every voice and opinion. As a team, decisions need to be made together; they need to be consistent, equitable, and take into account as many different opinions as possible.
It’s important to establish a system, especially when making big decisions. Often, decisions need to be made in the moment, for example when someone posts offensive content. In most cases, a moderator will act on their understanding of the rules and punish offenders accordingly. At that very moment, the offending content has to be removed, leaving little to no time to gather a few staff members and make a decision together. This is where consistency comes into play. The more your moderators share the same knowledge and mindset, the more consistent moderation actions become. This is why it’s important to have moderator guidelines and a clear structure.
It’s very important to give every moderator freedom so they don’t have to ask every time before they can take action, but it’s also important to hear out as many opinions on any major server changes as possible, if time allows it.
Over time, a moderation team grows. They grow in many ways, in their abilities, in the number of moderators, but also grow together as a team. Every new moderation team will face challenges they need to overcome together and every already established team will face new situations that they have to adapt to and deal with. It’s important to give a moderator something to work towards. Mods should look forward to opportunities that will strengthen their capabilities as a moderator and strengthen the team’s performance as a whole.
You should let moderators know that they have the potential for growth in their future as a moderator. That can mean specializing in specific areas of moderation, like being introduced to managing or setting up bots. Perhaps over time they will progress to senior moderator and focus more on the administrative side of things.
The Discord Mod Academy can be a valuable resource in encouraging moderator growth as well. While they may be familiar with some of the concepts in the Discord Moderation Academy, no moderator can know everything and these articles have the potential to further refine their moderation knowledge and enhance their abilities.
Moderators are direct representatives of your community and as such should be a reflection of what an ideal community member looks like. Many things tie into showing this professionalism, ranging from how moderators chat with members in public to consistency in moderator actions.
The presence of a moderator should never make people uncomfortable. Moderators should strike a balance between “I can chat with this moderator like with any other member” and “this is a moderator and I need to take what they're saying seriously.”
Here are a few attributes that make a moderation team look and act professional:
Your team should share positivity, encourage conversation, show respect for others, and maintain a professional appearance. Make it clear to your moderation team that they are a highly visible and often heavily scrutinized group within your community, and that they should conduct themselves with those aspects in mind.
As with all group efforts, there is a possibility for friction to occur within a moderation team. This is undoubtedly something that you’re all going to have to address at some point in your mod team’s lifespan. It’s very important to deal with these situations as soon as they come up. Many people don't feel comfortable dealing with conflict directly because it can be uncomfortable. However, getting to potential problems before they become serious can prevent more severe issues from cropping up in the future. The goal of a problem-solving process is to make a moderation team better at handling conflict.
As a general problem solving process, you should:
With that in mind, there are also situations where you’d want to exercise more discretion. Something that might prompt this is when a moderator makes a mistake.
It can be both uncomfortable and embarrassing to be “called out” for something, so often enough the best option is to speak to someone privately if you think they’ve made a mistake. After the moderator understands the situation and acknowledges the mistake, the problem can be talked about with the rest of the team, mainly to prevent situations like these in the future.
Still, sometimes problems can’t be solved, things get more and more heated, and in the end separating is unavoidable: moderators leave on their own or have to be removed. Your team members should always be given the option to take a break from moderation, especially to avoid burnout. You can learn more about how to deal with moderator burnout here.
Taking a moment to look back at the history and progression of your community and your mod team can be a useful way to evaluate where your team is at and where it needs to be. The time frame is yours to choose, but common intervals are monthly, quarterly, or every six months.
You don’t always get to talk with your moderators every day. Most of us are volunteers who moderate as a hobby, with lives of our own, scattered all over the world. With all of that going on, it can be hard to find the time or space to discuss things you feel are lacking or could be changed regarding the server, and that’s why reviewing is important.
A Community Review can be done in many ways. For most, a channel asking a few basic questions, like how the community has developed since the last review, how activity and growth has changed, is a good way to start. Most importantly, you want to hear how the moderation team is doing. Talk about mistakes that have happened recently, how they can be prevented, and review some of the moderator actions that have been taken. A review allows everyone to share their thoughts, see what everyone is up to, and deal with more long term issues. It also allows giving your moderators feedback on their performance.
The essence of managing a moderation team is to be open-minded, to communicate, and to listen to each other. Endeavor to handle decisions and confront problems in an understanding, calm, and professional way. Give moderators opportunities to grow and designated tasks, but also time to review, take breaks, and rest. Remember, you’re in this together!
Before we discuss how to use a modmail bot, we should explore what they are and what purpose they serve. Simply put, modmail bots are bots that allow users to contact the moderators of a server via a direct message with the bot. The bot then relays these messages to the entire moderation team by creating a private channel on the server visible only to the server moderators (also referred to as a “ticket”). Moderators can then use this private channel to have the bot relay their own messages back to the original user, all while keeping a record of the entire conversation within the server. Moderators can even discuss the ticket together in that channel without the user seeing, since relaying a message to the user requires a specific command.
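The relay mechanism described above can be sketched as a toy in-memory model (plain Python, not a real Discord bot; the class and method names are illustrative): a user DM opens or appends to a ticket, staff chatter in the ticket stays internal, and only an explicit reply command reaches the user.

```python
class ModmailModel:
    """Toy model of a modmail flow: user DMs open tickets, staff reply via a command."""

    def __init__(self):
        self.tickets = {}     # user -> list of (author, message) visible to staff
        self.user_inbox = {}  # user -> messages actually relayed back to the user

    def user_dm(self, user: str, message: str) -> None:
        # A DM to the bot opens (or appends to) that user's private ticket channel.
        self.tickets.setdefault(user, []).append((user, message))

    def staff_message(self, mod: str, user: str, text: str) -> None:
        # Staff discussion in the ticket channel stays internal by default...
        self.tickets[user].append((mod, text))

    def staff_reply(self, mod: str, user: str, text: str) -> None:
        # ...and only an explicit reply command relays a message back to the user.
        self.staff_message(mod, user, text)
        self.user_inbox.setdefault(user, []).append(text)
```

Real modmail bots add features on top of this core loop, such as logging closed tickets, anonymous replies, and blocking abusive users, but the relay pattern is the same.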
There are numerous modmail bots online, each with their own styles and features. Some modmail bots can be self-hosted, while others are hosted publicly online! Here are some examples of modmail bots you can use:
It is important to do research into readily available bots to understand what your server needs and to choose whichever one best meets those needs. Please note, the content of this article is not endorsed by any developer or company related to the bots.
Before adding a modmail bot into your server, you should determine if you need one by considering the benefits and drawbacks of modmail bots. Let’s review the pros and cons of modmail bots to help with this decision.
One of the biggest benefits of having a modmail bot is better organization. Anytime a user has a question or concern about the server, instead of trying to figure out which moderator is online and available, they can quickly DM the modmail bot and get in contact with the entire moderation team while also notifying the active moderators online. This also reduces excessive pinging of the entire moderation team in public channels.
The next benefit of modmail bots is team discussions. In the above example, we’re using the ModMail bot by CHamburr. As you can see, a user has opened up a ticket that can be viewed by the entire moderation team. The team can then decide who wants to respond to the ticket, as well as discuss the ticket in question among themselves and more efficiently handle the issue. Team discussions are great for trickier questions that not everyone on the team might know the answer to.
In the image below, you can see an example of what moderators see from their side. The team can split their focus and knock out tickets depending on their priority, urgency, or even work together to pick up a ticket if another mod has signed off. The ability to read through moderator discussions and work together is a great team building exercise for moderator teams!
The next benefit of modmail bots is information logs. Most modmail bots have a modmail log that shows all of the modmail tickets that have been opened or closed, along with useful receipts of the conversations that happened in the channels. This is great for administrators and lead moderators who want to see how cases were handled or want to catch up on what they missed while offline. It also helps moderators look back on reporting history if an old report is continued at a later time or if they suspect someone is abusing modmail.
Another benefit of having a modmail bot is anonymity. Some modmail bots allow you to reply to the user anonymously which can decrease the chances of someone harassing a moderator, or other unpleasant situations that may come from having to action someone. Users can sometimes get upset or want to seek revenge after being banned, muted, or kicked from a server, so being able to communicate with someone and action them without revealing your identity is extremely valuable.
Moderators no longer need to act as the “middleman” between a user and the rest of the team. The moderator can speak on behalf of the team and use language such as “we” instead of “I” which can communicate to the user that it isn’t just one person handling their case, but rather the entire team while keeping their identity safe.
Something that you will notice when moderating is that there are usually some questions or concerns that come up frequently. Having a modmail bot can help you better understand what your community needs are and better visualize what the most frequent moderator tickets are about. With this knowledge, you can create custom quick-replies that you and your team can use to more efficiently resolve tickets! Having the same response to a scam DM that is used by all the moderators on your server makes responding to those types of tickets quick, easy, and consistent. Moderators can copy and paste replies, or use custom commands to reply with pre-made messages. Once you have set up quick replies, you can answer many tickets without needing to think of how to reply, making your replies consistent no matter who is responding to a ticket.
Here are a few examples of situations you can make tailored custom-replies for ease, speed, and consistency:
It is important to know the limitations and drawbacks of modmail bots before deciding if your community needs one. Let’s explore some of the ways these bots can be abused, as well as other negatives you should know about. Please keep in mind that every modmail bot is different, so you can work to choose, or even code, a modmail bot that may eliminate some of these drawbacks if you are concerned about them.
Modmail bots are vulnerable to spam, since anyone on the server can DM the bot to open up a ticket, whether it is a targeted spam attack or multiple users reporting the same thing. Some modmail bots allow you to block a member from opening up future tickets.
Depending on what kind of modmail bot you use, one drawback is that some modmail bots might not save the conversations between you and the user who made the ticket, only showing the reason the ticket was closed. Others will save the conversation, but not any comments made by other moderators inside the ticket channel, which can erase a lot of discussion and context from the replies that were made. If you want to be able to track actions, prioritize a bot that does save information logs.
Another drawback is the learning curve from using a new bot. Members that are new to Discord or that aren’t familiar with modmail bots can have trouble interacting and using the bot to report issues. Almost all modmail bots require a user to DM the bot, which can go against initial instincts and be challenging to new users. Some users have their DMs turned off which makes using a modmail bot more complicated if the user can’t receive DM replies from the bot. One solution to this is to ping the user in a public channel asking for them to change their DM settings so the bot can communicate with them.
Members figuring out a modmail bot for the first time can lead to “test” messages which will spam your modmail tickets. While these aren’t a big concern and are mostly harmless, depending on the size of your server, multiple users “testing” how the bot works can lead to a lot of time cleaning up the tickets.
Similarly to how a modmail bot can be a learning curve for your members, it can also be a learning curve for your moderation team. Learning how a modmail bot works can take time if your team has never used one before. Depending on how familiar your team is with using bots, the onboarding time may vary, so be sure to set aside time for learning and teaching how to use the bot. We have an article on Training and Onboarding New Moderators that should help with this!
Modmail bots can be used in a variety of ways. They not only serve as a way to communicate with moderators, but also help with organization and structure and can be used for other purposes. Let’s look at the different uses of a modmail bot, and how it can make life on the server easier.
Questions. Modmail bots help users who might have questions that they don’t feel comfortable asking in public. This can be questions related to the server, or more private and sensitive matters. By using a modmail bot, the moderation team can efficiently answer the question depending on who knows the answer, and the user gets a sense of privacy by not having to ask in public. As mentioned previously, having a set of quick-replies to frequently asked questions makes responding to questions a breeze. Some modmail bots even have custom command replies where you can create commands to reply with a specific text, depending on the command. For example, replying with the command “=hello” can reply to the ticket with a premade message meant to greet the user and open a conversation, such as: “Hello. Thank you for contacting the moderation team. How can we help you today?”
Bug Reports. You can implement a modmail bot in a support server that deals with bug reports. Some larger communities have two servers: one for the main community and one for support, depending on the community or product. Organizing bug reports using a modmail bot is an easy way to keep track of issues in one place. With the ability to give a closing reason, you can later search the modmail logs to look back at what sort of bugs were reported by different users.
Ban Appeals. Anonymity is important when dealing with troublemakers online. One way to use a modmail bot is to have a ban appeal server for users who would like to re-enter your community after being banned. With this setup, appeals go through the modmail system so that the banned users can’t talk in any channels and can only interact with the staff team through the bot. Moderators can then review the appeal together and reply to the banned user anonymously through the bot.
Community Reports. Sometimes a member of your community needs to report something to the moderators that isn’t rule breaking or has information regarding another user that is important for the team to know. A modmail bot is great for relaying these types of reports to the team so everyone is notified at the same time without having to utilize an emergency ping.
Server Feedback. Members of your server can provide feedback using a modmail bot. Using modmail for feedback removes the need for a feedback channel and guarantees that the team will see the feedback when it is sent.
Staff Applications. There are different ways you can handle staff applications for a server. Depending on who needs to see and evaluate the applications, it can make sense to use a modmail bot to receive them in smaller communities. When an application is made, a modmail ticket will conveniently open for the moderators or admins to look at. You can even comment within the ticket to evaluate each application, case by case! Keep in mind that there is a limited number of channels (in this case, tickets) that can be created in a channel category, so this approach will only work for smaller servers.
The above list covers just some of the many common ways that modmail bots can be used to strengthen communication within your server. The use cases for a modmail bot are limited only by your imagination and the type of bot you’re using!
When using a modmail bot, it is important that your moderation team is on the same page. Make sure there are clear protocols of how to handle tickets. An easy way to ensure that the end-user doesn’t get messages sent by multiple moderators is to have a moderator say that they will handle the ticket inside the ticket channel itself. This way, moderators aren’t stepping on each other’s toes and the ticket can be handled efficiently.
In the image below, we’re using a modmail bot that sends all messages sent in the ticket to the user, unless it has a specific command. Other modmail bots might only send a message to the user if a reply command is used, so be sure to read the bot documentation to know how your modmail bot handles communication between the moderators and the user sending a ticket.
Another good practice is to close any old or resolved tickets as quickly as possible. If the user has no further questions, it can be helpful to send a final message stating that if there are no further questions, the ticket will be closed soon. This way, you reduce the amount of open tickets and maintain a clean and organized modmail.
As mentioned before, anonymity in responding to tickets can help reduce the amount of harassment you and your moderators may receive from rule-breakers. When using a modmail bot you can use discretion when deciding whether to respond anonymously or with a more personalized response, depending on the situation. Here are a few scenarios where it’s okay to respond regularly:
In most cases, however, it is better to respond anonymously, especially when dealing with any modmail tickets related to actioning or punishing another user. This applies to handling ban appeals if you have such a system set up. Little is lost by replying to a question anonymously; much more is at stake if you reveal your name to someone who has been punished on your server.
Each server has its own tone and style when it comes to moderation. Some servers are more formal and may have an official tone, while others might have a more relaxed and easygoing tone. It is important to keep this in mind when responding to modmail tickets so that your tone in the modmail responses is consistent with the tone set by the server.
This article has explored the ways modmail bots can strengthen how your team communicates with your community as well as the direct advantages and disadvantages that exist when introducing such a bot to your server. In some ways moderation becomes significantly easier, but in other ways, negatives like spam from these bots can be overwhelming and introduce new problems to solve. Moderating a server can involve a lot of different moving parts and it is important to analyze your community’s needs as it grows, taking into consideration whether a modmail bot is included in that growth. Use what you’ve learned in this article to make a decision that’s right for you and your community.
If you decide that a modmail bot isn’t a good fit for your community, but you still want some bot help on your server, check out our Auto Moderation on Discord article!
There can be many causes for conflicts between your moderators, some of which occur more often than others. Some of these can easily be prevented, while others need more deliberate thought to stop them from escalating into a conflict.
Personality Clashes and Differences
The main reason for conflicts to arise is a difference in personalities among your team. Everyone perceives, analyzes, and responds differently to certain situations. While this should be a learning opportunity for everyone involved, it can lead to arguments when moderators cannot get along. While emphasizing that there are different approaches to the same issue, you should teach moderators to learn from each other instead of letting disagreements end in conflict. When a moderator constantly disagrees with someone else or ends up in a personal argument, try taking them aside and letting them know that they should work on their personal issues. You can help them as an objective observer - it can be very valuable to let members of your team know that you understand their individual perspectives and are working to balance them.
Try to avoid the “Halo/Horns effect” as much as possible. This is a type of cognitive bias where your view of someone’s actions can influence how you think and feel about that person in general, both positively as well as negatively.
Misunderstandings between your moderators can lead to a conflict as well, especially when there is a different understanding of how a situation should be handled. Having transparent, consistent communication where everyone is on the same page is very important in an online setting such as Discord. Text is often perceived differently than talking over voice or even in person. Oftentimes a language barrier can lead to misinterpreting a message as well. Try to prevent misunderstandings from happening with clear communication and when misunderstandings happen, step in where possible and explain how the situation should be handled. Foster a culture where it is not only allowed but encouraged to ask for clarification on someone’s statement if you have any doubts about what their intent was.
Bullying and Harassment
Making jokes is beneficial for maintaining a healthy and welcoming environment and even a way to deal with the tough situations moderators may face. However, it is important to make sure that jokes don’t push the boundaries of what is appropriate and lead to perceived bullying of server members behind closed doors, or even other members of your own team.
What is perceived to be a harmless in-joke in context could, out of context, seem like a malicious and unwarranted comment. When it is clear someone is not comfortable with certain jokes or feels targeted, you need to step in and clear up the situation: tell the moderators involved that you want a positive atmosphere where everyone feels welcomed and comfortable, while clearly identifying what line was crossed so it does not happen again. That’s not to say that all close-knit groups trend towards exclusionary jokes, but it is something that can happen without a team even being cognizant that it’s a problem. That is why it’s pertinent to keep inclusivity and good-faith intent in mind in situations like this.
A main cause of conflicts is unfair treatment among members of your team. This might be because there is a lack of equal opportunities, or because there are unrealistic expectations. Make sure that everyone on your team is treated equally, regardless of your personal relationships. This also means you and other moderators should have reasonable expectations of each other. Moderators often volunteer to help out in your community, so don’t expect them to be available at all times; they might have other responsibilities as well. Any mod team’s north star should be that “real life comes first,” and to be lenient amongst themselves in the face of that.
Sometimes it is not clear what responsibilities everyone has on the team. This can lead to misunderstandings and eventually to conflicts. Make sure everyone on the team receives adequate training and information on what different roles and positions should be doing. You might want to divide different responsibilities amongst members of your team or between different roles, such as moderators and administrators. Be as transparent and clear as possible on what everyone should be doing.
Sometimes moderators feel that they are competing against each other, especially if there are statistics or promotions available. You should always be clear that moderation is not about competition, it is about helping your community in the best way possible. Not every moderator has the same time available to help, so naturally there are going to be differences amongst your team. Moderation is not about quantity but quality - if this negatively skewed competitive atmosphere is a frequent issue, it may be worth looking at whether there are mechanisms within your community that encourage competition conflicts that could be removed or revised, such as quotas, public moderation statistics, or leaderboards.
Different Values and Standards
Moderators usually come from all over the world and might have different values. Sometimes, these might result in a conflict. Remind your moderators that they should always respect each other's values, regardless of their own opinion. Moderators should be able to explain why they feel a certain way in order for others to be more understanding of the situation. When someone has values that do not align with the server’s moderation philosophy, you might need to remove them from your team.
Unresolved Underlying Issues
It is not always possible to immediately see the reason for a conflict; this might be due to unresolved underlying issues. It is very hard to know these without having a conversation with everyone involved. There can be many more causes for a dispute to arise, but it is always important to find the underlying issue that causes a conflict; only then are you able to resolve it.
To minimize and prevent conflicts from happening, try as best as you can to get a comprehensive view of why and how conflicts occur in your server. It is very important to develop interpersonal relationships with all of your moderators and value their contributions to the server. Encourage moderators to take initiative in projects they are enthusiastic about, especially in collaboration with other moderators. Just as moderators set a positive example in your community, you as the leader of your moderation team should set a positive example within your team.
When you propose major changes to the server, listen to your team’s viewpoints before deploying and explain why a decision was made. Good communication is very important within a team, not only to prevent conflicts, but also to keep all moderators on the same page. All moderators should feel involved and informed when you are making major changes.
You can give your moderators regular feedback on how they are doing and what they can improve. Make sure moderators always feel welcome to provide feedback constructively and positively to each other and that they can always contact you or someone in charge in case a conflict does arise.
Try to discourage gossip within your team, both internal and external. When moderators start to talk about each other behind their backs, it becomes personal and can distort the relationship your moderators have and how they see themselves within the server. Instead, encourage moderators to form friendships with each other by organizing social events for your staff. During those events, you can learn about the different personalities in your team.
Last, but not least: you shouldn’t lash out over mistakes. Give feedback where it is appropriate and move on. Be quick to forgive and forget. Avoid belittling your moderators, and create a culture of de-escalating situations in private.
There are many ways to resolve conflict internally, and most of these will depend on the cause of the conflict. A good way of resolving conflicts is using the Thomas-Kilmann model. It should be stated that conflicts are not necessarily bad and shouldn’t be avoided at all costs. When you are working in a team, it is important to be able to challenge the status quo and question each other, to keep everyone aligned and up to date.
In the Thomas-Kilmann model, there are two dimensions when you are choosing a course of action: assertiveness and cooperativeness. Assertiveness is the degree to which you try to satisfy your own needs, while cooperativeness is the degree to which you try to satisfy the other person’s concerns.
Based on these dimensions, there are five ways to handle a conflict:
These different ways of handling a conflict describe your intention in resolving it. Sometimes situations unfold differently than you expect at first. Don’t jump to conclusions when you are dealing with a conflict, as the reasons for conflicts are often more complex than they first appear. Everyone tends to default to one way of resolving conflicts, so try to balance these approaches so you don’t end up overwhelmed or overwhelm others.
The first step is to identify the cause of the conflict. You might already know this based on previously sent messages, but sometimes you need to contact everyone involved separately and in private to determine the cause. It is your responsibility to determine how to handle a situation. While collaborating to resolve a conflict might look the most appealing, this will not always be possible. Try not to completely avoid a conflict: if you feel uneasy dealing with conflicts or don’t want to give moderators feedback, you might not want to be in a leading position.
If you need to resolve a conflict, choose a neutral place to work it out. This might be a separate server or a private group. None of the people involved in the conflict should have power over the others, so you or someone else should act as an objective observer. If the conflict is between other moderators, you should offer guidance, but don’t offer solutions: ultimately it is up to others to resolve their conflicts, you are not taking sides.
Remember that not all conflicts require consequences. Most conflicts are sparked by the passion of your moderators who are simply in disagreement on how to deal with situations. Try to turn a conflict into a learning opportunity for everyone involved. Let them explain how they view the situation and how they would have handled it or behaved differently. Afterwards, you should be able to identify specific disagreements that you can solve. Listen to everyone involved and give everyone an equal opportunity to express themselves. Remind everyone to stay respectful at all times, even if they disagree with each other.
As said before, you should discourage gossip within your team and encourage each other to give constructive feedback. It is important that everyone knows how to give and receive feedback to prevent a conflict from happening in future situations.
When you notice a moderator displaying behavior or taking actions that could be improved, you should give them feedback on how to improve in the future. Don’t rush into giving your feedback; everyone needs time to process. When a situation becomes heated, it is not the best time to give feedback. Remember to always give feedback in private!
If you are in doubt as to whether or not you should give feedback, see if you can recognize a pattern in their behavior or actions. Everyone makes mistakes, and that is perfectly fine: mistakes serve as a learning opportunity, and you should only give feedback if they become a pattern. Additionally, it’s recommended that you ask for consent to give feedback. While this may seem a little counterintuitive to helping the moderator improve or change, blindsiding someone when they’re not ready can result in backlash rather than progress.
When you do decide to give someone feedback, don’t focus on the person who made a mistake: focus on their behavior or action instead. By making feedback personal and equating it to an issue with the individual rather than their choices, you can come off as argumentative, unconstructive, and biased. Never exaggerate their behavior; be sure that you are clear and specific. When you make generalized or vague comments about someone, they will not be able to improve their behavior. Feedback should be actionable - there should be a suggestion or a change that the individual can work on as a result of the feedback. Otherwise it’s just airing your grievances, which is unproductive for all parties involved.
The next time you see someone doing something good after you have given them feedback in the past, give them a compliment! Positive, specific feedback is especially effective in encouraging them to repeat those good actions. Giving each other positive feedback is just as important as giving each other constructive or corrective feedback.
Remember to always treat others as you would like to be treated yourself. Negative feedback tends to be given when people are experiencing negative states like hunger, anger, loneliness, and tiredness. These states are known by the acronym HALT, and it is important to avoid giving feedback when you find yourself in one of them.
When someone is giving you feedback, it is very important to listen to what they have to say. Don’t jump to conclusions, react defensively, or disagree immediately. Take a moment to summarize the feedback as you understand it to make sure you have understood their feedback correctly.
You should always thank someone for giving you feedback, even if you disagree. It has to be clear that you welcome feedback in order to improve. If you do not thank them, they might not give you valuable feedback in the future.
Ideally, feedback is very clear and specific, in which case you can end the conversation by thanking them and reflecting back on what you are going to do with the feedback in the future. Other times, the feedback might not be understandable to you, and as a result it is unclear what you should do differently. In this case you should ask questions to understand their feedback better. To better understand what they want to achieve with their feedback, it is a good idea to ask the other person how they would have handled the situation or how they would have behaved. After this you should thank them and reflect back on what you are going to do with their feedback.
It is okay to let someone know why you disagree with their feedback, but remember to stay respectful at all times and explain clearly why you disagree. Everyone should feel heard, even if you are not acting on their feedback.
There are many reasons why you might want to remove a moderator from your team, including internal conflicts you are unable to resolve. Removing or demoting a moderator is not easy to do, but can be a necessary evil. Removing someone from your team should be in the best interest of your community or team and can often be in the best interest of that person as well. Be sure to give someone an opportunity to learn from their mistakes before removing them. Give a warning first and have a conversation in private with them, following the principles of giving and receiving feedback outlined above.
While it is not easy to deliver bad news, here are some tips to keep in mind when you do want to remove someone from your team.
Don’t delay the conversation
Once you have made the decision to remove someone from your team, don’t hesitate or wait for the conversation to take place. When someone is causing more issues than they are solving, they will need to be removed sooner rather than later, but make sure they have had enough opportunities to fix their mistakes and resolve their issues. Make sure you have this conversation in private, for example in direct messages or a voice call.
Keep it short and clear
When you are talking to someone you are removing, it is important to keep your message short and clear. Tell them they are being removed from the team, why you have made this decision, and when this will take effect. Be transparent in what reasons you have for removing them, but do not go into too much detail with specific examples. This may result in a conversation where you are arguing about these examples rather than informing them of their removal. You’re aiming for polite dismissal, not a lambasting of their character.
Stick to your decision
Despite your message being short, clear, and transparent, you might still receive counter arguments as to why they should not be removed and to give them another chance. It is important they can express themselves, but it should be clear at all times that the decision to remove them has already been made and is not up for debate. Always listen carefully to what someone has to say and appreciate their feedback, but unless someone has substantial evidence a mistake was made, repeat your decision and make it clear that your decision is final.
Be helpful and compassionate
Even though you are delivering bad news, you should take a supportive approach in the conversation. Remember that while it might be difficult for you to deliver bad news, it is most definitely difficult to be on the receiving end. Show empathy for them, especially when they have done a good job in the past.
In some cases, the moderator you’re removing may wish to receive feedback on what they can improve - giving this feedback in a constructive fashion is important, and will help them to avoid future problems. This feedback should include reflecting on the positive contributions they made to the team, helping them understand what the causes of conflict might have been while they were on your team, or simply trying to give them something positive to take forward and work on as a result of your conversation. This conversation shouldn’t reflect a reversal of your decision, but can be a useful point of reference if they want to join other servers or work on improving their skills down the line, or perhaps even re-apply to join the team in the future.
Inform your team
When someone has been removed from your team, make sure to inform the team of this decision. It is up to you whether or not you want to share the reason for their removal, but refrain from sharing details, as these are confidential, especially when they could be potentially harmful to someone. You also want to prevent anyone, including yourself, from speaking badly about them, as this sets the wrong precedent for your team. As with your decision, you should be straightforward and clear with your team. An example of this could be: “Today we have decided to remove Sarah from our team. I will not go into too much detail about why this decision was made, but we believe it is the best decision for the community and the entire team.”
In case someone was removed because of misbehavior, you might want to include that in the message, as this gives the team assurance you are not removing someone because of a personal conflict, and it shows a strong message about how you want moderators to conduct themselves.
In every group, a conflict or dispute will eventually arise that needs to be resolved. There are common causes, such as personality clashes and differences, poor communication, unclear responsibilities, and harassment. To minimize and prevent conflicts from happening, try to understand as much as possible why conflicts occur in your server. It is very important to develop interpersonal relationships with all of your moderators and value their contributions to the server. Try to discourage gossip within your team. When moderators start to talk about each other behind their backs, it becomes personal and can distort the relationships your moderators have and how they see themselves within the server.
You can resolve conflicts using the Thomas-Kilmann model. It is important to know conflicts are not necessarily bad and shouldn’t be avoided at all costs. When you are working in a team, it is important to be able to challenge the status quo and question each other, to keep everyone on the same page and sharp.
When you want to give feedback to someone, talk to them in private and be clear and specific. Avoid giving feedback when you feel hungry, angry, lonely, or tired. Focus on their behavior or actions, and never exaggerate them. Conversely, when someone is giving you feedback, it is very important to listen to what they have to say and to avoid jumping to conclusions, reacting defensively, or disagreeing immediately. Take a moment to summarize the feedback to make sure you have understood it correctly.
Sometimes you need to remove someone from your staff team. Don’t delay this conversation, keep it short and clear, and stick to your decision. When the conversation is taking place, be helpful and compassionate and do not forget to inform your team as well, but make it clear that certain things remain private for the safety and privacy of those involved.
Keeping all of those points in mind when managing a team environment will be integral to maintaining a healthy atmosphere full of various viewpoints that are united in the shared desire to keep your community safe.
There are millions of communities on Discord with a wide variety of interests. As such, some servers may want to integrate adult content as part of their community discussions. This is the kind of content that is suitable for some people, based on age restrictions, yet unsuitable for others. Channels for this content can provide an important space for adults in your server to discuss issues related to topics such as sexual health, safe sex, their relationships with their bodies, or a space to share and explore adult content. Adult content in any medium cannot be shared on Discord outside of channels that have been marked with the NSFW toggle. Maintaining such a space in any community requires a significant amount of oversight, effort, and proactive zeal from the moderation team. As such, keep in mind that creating this space is purely optional, and communities are free to decide whether such a space suits their culture and fits within the context of their community. This article will cover what falls under the umbrella of adult content, what it means, when and where it is allowed, how to maintain a space for it, and how to successfully set up and moderate such a space in compliance with Discord’s Terms of Service and Community Guidelines.
Adult content is anything that would be unsuitable for those under the age of 18 to view. This is synonymous with the term “NSFW” for the purposes of this article. NSFW is an acronym for the statement “not safe for work”, which is used as a shorthand to clearly indicate to others that a certain type of content may not be appropriate to look at in public, professional or controlled environments.
Please feel free to browse more on how Discord tackles safety on its platform in this section of the website. Check out our Safety Portal, particularly in the Parents and Educators section for further guidance.
The first step to setting up an adult content channel is to determine what method of age gating you need and how you want to set it up. People who should not have access to the content will try to get in, and the steps you take to keep them out are up to you. Please note that since things like server icons, invite splashes, server banners, user profile pictures, usernames, nicknames, and custom statuses cannot be age gated, they should not contain any adult content. Emojis containing adult content should only be hosted and posted in places that are age gated.
Discord only requires that you use the NSFW toggle, but depending on your server and the nature of the content shared, you may want to take a more active approach to ensuring the content is only accessed by adults.
The NSFW toggle must be turned on for any channels with NSFW content. Even if your server is exclusively 18+ and requires users to send a picture of their photo ID to join, the channel still needs this toggled on. In addition to keeping minors out of spaces with adult content, this toggle will also flag the channel as NSFW so that adult users can avoid it if they do not wish to see that content. Not marking NSFW channels appropriately opens the risk of action being taken on the server from Discord’s Trust and Safety Team.
Discord asks all users to submit their birthday upon account creation and has been asking users whose accounts were made prior to its rollout to provide their birthday upon attempting to open an NSFW channel. This will prevent users who have told Discord that they are under the age of 18 from seeing any content in the channel; they will instead be met with a page telling them that they are not old enough to view it.
If you run a server with NSFW content, you may want to consider preventing users from just joining and immediately opening NSFW channels. Whether you want to do this or not depends on you and what your channels are for.
If your main concern is to do your due diligence to abide by the Terms of Service and Community Guidelines for a server with some image only channels, you may not want to gate your server entirely. If your main concern is keeping your younger members safe from engaging in inappropriate discussions and your adult members safe from unknowingly interacting with minors in an inappropriate way, then you might want to set up additional levels of security to keep out any minors who may have given the incorrect age when prompted by Discord’s age gate.
This can be achieved by asking users how old they are as part of your onboarding process. Perhaps a user must supply their age in their introduction or through picking a role from a bot. Users who aren’t aware that there is NSFW content in your server that they may later want access to are less likely to pretend to be over 18 immediately upon joining.
It should be noted that this is less effective if your server is named something that makes it obvious that the server contains 18+ content because users who join will likely know that they will need to lie to access it.
If your server has certain channels designated for adult content and the server has minors in it that are there to access age appropriate content that’s also hosted in the server, it may be worth role locking the adult content channels.
This can be implemented by making it so that @everyone, a role all members have, or an under 18 role, is unable to see the channel. The main ways to do this would be to set the default permissions for the channel to neutral and set another role to deny access, or set the default user permissions to deny access and the over 18 role to grant access. This also allows you to later remove access to adult content from any users that are causing problems in the channel by simply removing the role that grants them access.
There are various ways that role permissions can be set up to prevent access to a channel. This is only an example, and is not the only option, and may not be the best option to go along with your overall server permissions setup.
Channel role gating also gives you an extra opportunity to have users read the rules for or regarding the adult content channels. This can be accomplished by asking users to react to a message within the rules, DM a secret phrase from the rules to a bot that assigns a role, ask a staff member directly for the role (with that staff member confirming the user has read the rules before assigning it), or a number of other systems.
When users open the channel in question, they will see the channel name and see the popup in the screenshot below. Most users will probably not read any channel rules or channel description beyond this. It is important to clearly state what the channel is for in the channel name to prevent users from just assuming it’s a general media channel based on Discord’s popup and the channel name.
For example, if you have a channel that is used for text-based advice about an adult topic and the channel name is vague and has image permissions enabled, users may assume that the channel is for posting content and not for seeking advice/discussing a heavy topic. To make things clearer, try to match the channel name and permissions to the purpose and context of the channel. In this case, changing the channel name to #-advice and disabling image permissions may help users to better understand what the channel is intended to be used for. You can also set a description for the channel where you can more clearly state the purpose, but keep in mind that channel descriptions are less visible to users and may not be seen by everyone entering the channel.
Channels focused on adult topics can provide users with a comfortable space to discuss personal issues and build closeness and trust between members of your community, or just be a space to blow off steam and share content they enjoy. These channels also have very specific risks and required mitigation strategies that will vary depending on the nature of the specific channel.
If you are running a channel on safe sex advice, your main concern will likely be the spread of misinformation and it will be of paramount importance to have reliable and accurate resources on hand. If you run a channel for sharing images, your main concern will likely be making sure that the images shared are legal and properly categorized into your channel. You have to consider what the specific risks in your channel are and ensure that you are writing policies that are specific to your needs and finding moderators that are knowledgeable and comfortable with those topics.
A webhook is sort of a “reverse API”: rather than an application you own (like a bot) calling another application to receive data when it wants it, a webhook is something you can give to someone else's application to send data to you directly as soon as there’s something to share. This makes the process very convenient and efficient for both providers and users.
Now let’s apply this definition to a more tangible example. Let’s say you aren’t around to reply to new e-mails for a while. Through the power of webhooks you could have a setup that replies automatically! When an event happens on a service (like receiving an email), that service activates your webhook and sends you the data relevant to the event that happened. That way, you have exactly the information you need to automatically reply to each new e-mail you receive.
So what does this mean for your server? Basically, Discord provides you with the ability to have a webhook that sends a message to your server when it’s activated, with the option to send the message to any channel along with having a cool name and avatar of your choice.
For example, let’s say you’re waiting for the latest chapter of your favorite webcomic series to release. You can set up a webhook that’s connected with an RSS feed. The feed will activate the webhook when the chapter is released and a message will be posted on your server to notify you about it. This can be applied to many different things - the potential is limitless!
If the service is capable of sending JSON Webhooks, Discord can often use these to create visually appealing embeds when it sends a message out. However, it can also accept raw text messages to pass along as well.
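To make the distinction concrete, here is a minimal sketch of the two kinds of JSON body a service might send to a Discord webhook: one carrying plain text in a `content` field and one carrying a richer `embeds` array. The message text is, of course, just an example.

```python
import json

# A plain-text webhook message: Discord posts the "content" string as-is.
text_payload = {"content": "Chapter 42 of the webcomic just dropped!"}

# A message carrying an embed instead: Discord renders it as a styled card.
embed_payload = {
    "embeds": [
        {
            "title": "New chapter released!",
            "description": "Chapter 42 is now live.",
        }
    ]
}

# Either dict is serialized to JSON before being POSTed to the webhook URL.
body = json.dumps(text_payload)
```

A single message can use either shape; the embed form is what produces the visually appealing cards described above.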
Webhooks are a great tool for services that have events that you’re interested in latching onto. This can be incredibly useful for Content Creator communities where their latest YouTube videos or Twitter posts can be funneled into channels for followers to have easy viewing access in your Discord community. You could also use this for a community that’s oriented towards an open-source technology by having events from GitHub be submitted directly to your server for people to keep track of.
There are a lot of platforms out there that provide you with the ability to handle everything by yourself, from creating the webhook through your Discord Server Settings to plugging it into the platform you want updates from. However, this often requires at least some familiarity with programming, as many of these platforms require developer accounts. There can be a few hoops to jump through to get everything set up just right, which can turn people off from using these types of webhooks.
However, times are changing, and the barrier to entry into the world of webhooks is much lower. These days, multiple third-party services exist that have handled all the hard technical work for you. They can interface with the platforms that would normally require you to do extra work to get the data you want. IFTTT and Zapier are two such services that let you plug several useful platforms directly into your Discord server through a clean web interface, letting you customize the type of data your webhook receives and thus what messages are sent to your server. Just keep in mind that some services restrict how many events can be sent to one webhook, so you may have to create multiple distinct webhooks for multiple functionalities. Note that these options and restrictions are entirely platform-dependent. Discord has no problem handling whatever data is thrown at its webhooks when activated, even if it comes from different sources.
There’s another use case for webhooks that is more unique. While they’re very useful for automated messages, they’re also great for one-off embed messages that have a very polished look! You have the ability to customize the name and the avatar of the webhook, but if you’re technologically inclined you can create your own JSON data to activate the webhook with. This allows you to fully define the aesthetics of an embed in your message, giving a clean look to whatever information you’re sending. One possible application is nice formatting for your server rules and information, rather than just sending multiple plain text messages.
Of course, there are other possible uses, the only limit is your creativity! If you’re not too keen on developing JSON data by hand, or have no idea what a “JSON” is, services like Discohook also make it easy for anyone to take advantage of this use case. As a note, the websites linked here are not endorsed by Discord and are only a suggestion from the authors of the article.
The process of setting up webhooks can be roughly explained in five simple steps!
Step 1: In your Discord server, you will need to create a webhook and copy the webhook URL. This URL is the path for your webhook to receive an HTTP POST request from a service when an event occurs.
Step 2: From this menu, you have the ability to style your webhook with a name and avatar. We recommend at least a name to distinguish its purpose.
Step 3: If you’re using a third-party service to handle events or send messages, plug your webhook URL into the service and configure the type of events that the service should trigger for.
Step 4: If your third-party service allows you to, configure additional options that will change the look and content of your message to the way you like.
Step 5: Now that you’ve created the connection, when the event happens, the data will be sent directly to Discord and your webhook will post a message in your preferred channel!
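The steps above can be sketched in a few lines of Python using only the standard library. The webhook URL here is a placeholder standing in for the one you copy in Step 1, and the message text is just an example; this shows the HTTP POST that a service fires against your webhook when an event occurs.

```python
import json
import urllib.request

# Placeholder: paste the URL copied from your channel's webhook settings here.
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

def make_webhook_request(content: str) -> urllib.request.Request:
    """Build the HTTP POST request that delivers a message to the webhook."""
    body = json.dumps({"content": content}).encode("utf-8")
    return urllib.request.Request(
        WEBHOOK_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = make_webhook_request("A new event just happened!")
# Actually sending it is one more line, omitted here because the URL
# above is a placeholder:
#   urllib.request.urlopen(req)
```

Third-party services like IFTTT or Zapier perform exactly this kind of request on your behalf whenever the event you configured fires.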
We’ve been talking about embeds a lot, but we haven’t quite explained what they are yet! In simple terms, embedding refers to the integration of links, images, GIFs, videos and other forms of content into messages on social media. An embed itself is a part of the message that visually represents content in order to increase engagement.
As a user of Discord, you have probably seen embeds more than a few times. They often are created automatically by Discord when a user sends a link, such as allowing you to view a YouTube video within Discord when a YouTube URL is shared. But did you know that Discord Bots and Webhooks can create their own embeds with all kinds of data? They’re pretty flexible! They can have colored borders, images, text fields, and other kinds of fancy properties.
If you're a developer, the options available to you to create a good-looking embed are powerful and versatile. For non-developers, platforms may give you visualizations that help you plug-and-play data into a website for customization purposes. There are also online tools to help you do this too!
Here are several of the properties that can be modified in order to style an embed, to give you an idea of what’s possible when you have the ability to style your messages:
Markdown is also supported in an embed. Here is an image to showcase an example of these properties:
Example image to showcase the elements of an embed
An important thing to note is that embeds also have their limitations, which are set by the API. Here are some of the most important ones you need to know:
If you feel like experimenting even further you should take a look at the full list of limitations provided by Discord here.
It’s very important to keep in mind that when you are writing an embed, it should be in JSON format. Some bots even provide an embed visualizer within their dashboards. You can also use this embed visualizer tool which provides visualization for bot and webhook embeds.
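As a hedged sketch, here is what an embed might look like written as JSON data, following Discord’s embed structure (the rule text and footer are invented for illustration; the color is a decimal RGB value that sets the card’s border color):

```python
import json

embed = {
    "title": "Server Rules",
    "description": "Please read these before participating.",
    "color": 5814783,  # decimal RGB value; rendered as the embed's border color
    "fields": [
        {"name": "Rule 1", "value": "Be respectful.", "inline": False},
        {"name": "Rule 2", "value": "No spam.", "inline": False},
    ],
    "footer": {"text": "Last updated by the mod team"},
}

# A webhook message carries embeds in a list, so one message can
# bundle several cards at once.
message = json.dumps({"embeds": [embed]})
```

Pasting JSON like this into an embed visualizer is an easy way to preview the result before activating the webhook with it.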
There are several benefits to using webhooks in your community including simplicity, versatility, automation, customization, and accessibility.
Simplicity. Webhooks are straightforward: they have a single function, which is to send a designated message when they are activated.
Versatility. Webhooks have many different use cases, all stemming from being able to send a message upon activation. This allows for many types of automation powered by other services, or allows you to send clean looking messages manually. Both of these setups have their own various uses ranging from posting updates and logging to posting custom messages for an aesthetic purpose.
Automation. With webhooks, there isn’t a need to constantly look for updates and post them yourself. Once set up, a webhook will automatically post any updates you need.
Customization. Each webhook can have a unique name and avatar, and each message it sends can be unique as well. You can have multiple webhooks with different looks in different channels and use each of them however you like.
Accessibility. Once you create a webhook, all you need is its URL and a service that will push messages to it. Since webhooks are hosted on Discord, they don’t need separate hosting like a normal bot, often sparing moderation teams a financial investment and making webhooks more easily available to all users.
Webhooks and bots, while having slight similarities, are actually very different from each other. There are several aspects we have to look at when comparing the two:
Even though this comparison is important for better understanding of both bots and webhooks, it does not mean you should limit yourself to only picking one or the other. Sometimes, bots and webhooks work their best when working together. It’s not uncommon for bots to use webhooks for logging purposes or to distinguish notable messages with a custom avatar and name for that message. Both tools are essential for a server to function properly and make for a powerful combination.
Thanks to their simplicity and accessibility, webhooks have become a staple in many communities. Despite their limitations and lack of functions compared to bots, they’re still very useful and play an important role in the automation and decoration of your server. Overall, webhooks are a great tool for pushing various messages, with limitless customization opportunities. Ultimately, the choice of using them depends solely on your needs and preferences.
Auto Moderation is integral to many communities on Discord, especially those of any notable size. There are many valid reasons for this, some of which may apply to your community as well. The security that auto moderation provides can give your users a much better experience in your community, make the lives of your moderators easier, and stop malicious users from doing damage or even joining in the first place.
If you’re a well established community, you’ll likely have a moderation team in place. You may wonder, why should I use auto moderation? I already have moderators! Auto moderation isn’t a replacement for manual moderation, rather, it serves to enrich it. Your moderation team can continue to make informed decisions within your community while auto moderation serves to make that process easier for them by responding to common issues at any time more quickly than a real-life moderator can.
Different communities will warrant varying levels of auto moderation. It’s important to be able to classify your community and consider what level of auto moderation is most suitable to your community’s needs. Keep in mind that Discord does impose some additional guidelines depending on how you designate your community. Below are different kinds of communities and their recommended auto moderation systems:
If you run a Discord community with limited invites where every new member is known, auto moderation won’t be a critical function unless you have a significantly larger member count. It’s recommended to have at least some auto moderation however, namely text filters, anti-spam, or Discord’s AutoMod keyword filters.
If you run a Discord community that is Discoverable or has public invites where new members can come from just about anywhere, it’s strongly recommended to have anti-spam and text filters or Discord’s AutoMod keyword filters in place. Additionally, you should be implementing some level of member verification to facilitate the server onboarding process. If your community is large, with several thousand members, anti-raid functionality may become necessary. Remember, auto moderation is configurable to your rules, as strict or loose as they may be, so keep this principle in mind when deciding what level of automation works best for you.
If your Discord community is Verified or Partnered, you will need to adhere to additional guidelines to maintain that status. Auto moderation is recommended for these communities so you can feel confident that you are consistently and effectively enforcing these guidelines at all times, so consider using anti-spam and text filters or Discord’s AutoMod keyword filters. If you have a Vanity URL or your community is Discoverable, anti-raid is a must-have in order to protect your community from malicious actors.
Some of the most powerful tools in auto moderation come with your community and are built directly into Discord. Located under the Server Settings tab, you will find the Moderation settings. This page houses some of the strongest safety features that Discord has to natively offer. These settings can help secure your Discord community without the elaborate setup of a third party bot involved. The individual settings will be detailed below.
AutoMod is a new content moderation feature as of 2022, allowing those with the “Manage Server” and “Administrator” permissions to set up keyword and spam filters that can automatically trigger moderation actions such as blocking messages that contain specific keywords or spam from being posted, and logging flagged messages as alerts for you to review.
This feature has a wide variety of uses within the realm of auto moderation, allowing mods to automatically log malicious messages and protect community members from harm and exposure to undesirable spam and words like slurs or severe profanity. AutoMod’s abilities also extend to messages within threads, text-in-voice channels, and Forum channels, giving moderation teams peace of mind that they have AutoMod’s coverage across these message surfaces without having to worry about adding more manual moderation work by enabling these valuable features.
Setting up AutoMod is very straightforward. First, make sure your server has the Community feature enabled. Then, navigate to your server’s settings and click the AutoMod tab. From there, you’ll find AutoMod and can start setting up keyword and spam filters.
Keyword filters allow you to flag and block messages containing specific words, characters, and symbols from being posted. You can set up one “Commonly Flagged Words” filter, along with up to 3 custom keyword filters that allow you to enter a maximum of 1,000 keywords each, for a total of four keyword filters.
When inserting keywords, you should separate each word with a comma like so: Bad, words, go, here. Matches for keywords are exact and aware of whitespace. For example, the keyword “Test Filter” will be triggered by “test filter” but not “testfilter” or “test”. Do note that keywords also ignore capitalization.
To have AutoMod filter messages containing words that partially match your keywords, which is helpful for preventing users from circumventing your filters, you can modify your keywords with the asterisk (*) wildcard character. This works as follows: a keyword like “word*” will also flag words that start with “word”, “*word” will flag words that end with “word”, and “*word*” will flag words that contain “word” anywhere within them.
Be careful with wildcards so as to not have AutoMod incorrectly flag words that are acceptable and commonly used!
AutoMod’s Commonly Flagged Words keyword filter comes equipped with three predefined wordlists that provide communities with convenient protection against commonly flagged words. There are three predefined categories of words available: Insults and Slurs, Sexual Content, and Severe Profanity. These wordlists will all share one rule, meaning they’ll all have the same response configured. These lists are maintained by Discord and can help keep conversations in your Community consistent with Discord's Community Guidelines. This can be particularly helpful for Partnered and Verified communities.
Both AutoMod’s commonly flagged word filters and custom filters allow for exemptions in the form of roles and channels, with the commonly flagged word filter also allowing for the exemption of words from Discord’s predefined wordlists. Anyone with these defined roles, or sending messages within defined channels or containing keywords from Discord’s wordlists, will not trigger responses from AutoMod.
This is notably useful for allowing moderators to bypass filters, allowing higher trusted users to send more unrestricted messages, and tailoring the commonly flagged wordlists to your community’s needs. As an example, you could prevent new users from sending Discord invites with a keyword filter of: *discord.gg/*, *discord.com/invites/* and then give an exemption to moderators or users who have a certain role, allowing them to send Discord invites. This could also be used to only allow sharing Discord invites in a specific channel. There’s a lot of potential use cases for exemptions! Members with the Manage Server and Administrator permissions will always be exempt from all AutoMod filters. Bots and webhooks are also exempt.
Spam, by definition, is irrelevant or unsolicited messages. AutoMod comes equipped with two spam filters that allow you to flag messages containing mention spam and content spam.
Mention spam is when users post messages containing excessive mentions for the purpose of disrupting your server and unnecessarily pinging others.
AutoMod’s mention spam filter lets you flag and block messages containing an excessive number of unique @role and @user mentions. You define what is “excessive” by setting a limit on the number of unique mentions that a message may contain, up to 50.
It is recommended to select "Block message" as an AutoMod response when it detects a message containing excessive mentions as this prevents notifications from being sent out to tagged users and roles. This helps prevent your channels from being clogged up by disruptive messages containing mention spam during mass mention attempts and mention raids, and saves your members from the annoyance of getting unnecessary notifications and ghost pings.
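As a rough sketch of the idea behind this filter (the mention formats below are how user and role mentions appear in raw Discord message content; the counting logic and limit are illustrative, not Discord’s actual implementation):

```python
import re

# Raw Discord message content encodes user mentions as <@id> or <@!id>
# and role mentions as <@&id>.
MENTION_RE = re.compile(r"<@[!&]?(\d+)>")

def exceeds_mention_limit(content: str, limit: int) -> bool:
    """Flag a message whose count of *unique* mentions exceeds the limit."""
    unique_ids = set(MENTION_RE.findall(content))
    return len(unique_ids) > limit

# Duplicates of the same mention count once, so this message has
# 3 unique mentions.
msg = "<@111> <@111> <@!222> <@&333> hello"
```

With a limit of 2, a message like `msg` would be flagged; with a limit of 3 or more, it would pass.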
This filter identifies spam at large by using a model that has been trained on messages that users have reported as spam to Discord. Enabling this filter is an effective way to block out a variety of messages that resemble spammy content reported by Discord users, and to identify spammers in your community who should be weeded out. However, this filter isn’t perfect and might not catch all forms of spam, such as DM spam, copypasta, or repeated messages.
You can configure AutoMod’s keyword and spam filters with the following automatic responses when a message is flagged:
This response will prevent a message containing a keyword or spam from being sent entirely. Users will be notified with an ephemeral message when this happens, informing them the community has blocked the message from being sent.
Discord will seamlessly block all messages containing matching keywords, spam content, and excessive mentions from your filters from being sent entirely regardless of the volume of messages, making this response especially effective for preventing or de-escalating raids where raiders try to spam your channels with repeated messages and excessive mentions.
This response will send an alert containing who-what-where information of a flagged message to a logging channel of your choice.
This alert will preview what the full caught message would’ve looked like, including its full content. It also shows a pair of buttons at the bottom of the message, ⛨ Actions and Report Issues. The Actions button will bring up a user context menu, allowing you to use any permissions you have to kick, ban, or time out the member. The alert also displays the channel the message was attempted to be sent in and the filter that was triggered. In the future, some auto-moderation bots may be able to detect these messages and action users accordingly.
This response will automatically apply a time out penalty to a user, preventing them from interacting in the server for the duration of the penalty. Affected users are unable to send messages, react to messages, join voice channels or video calls during their timeout period. Keep in mind that they are able to see messages being sent during this period.
To remove a timeout penalty, Moderators and Admins can right-click on any offending user’s name to bring up their Profile Context Menu and select “Remove Timeout.”
AutoMod is a very powerful tool that you can set up easily to reduce moderation work and keep your community's channels and conversations clean and welcoming during all hours of the day. For example, you may want to use three keyword filters; one to just block messages, one to just send alerts for messages, and one to do both.
Overall, it's recommended to have AutoMod block messages you wouldn't want community members to see. For example, high harm keywords such as slurs and other extreme language should have AutoMod’s “block message” and “send alerts” responses enabled. This will allow your moderation team to take action against undesirable messages and the users behind them while preventing the rest of your community from exposure. Low harm keywords or commonly spammed phrases can have AutoMod’s “block message” response enabled without the need to set up alerts. This will still prevent undesirable messages from being sent without spamming your logs with alerts. You can also quickly configure AutoMod’s keyword and spam filters in real time to prevent and de-escalate raids, adding spammed keywords or adjusting your mention limit in the event of a mention raid so the raid can’t cause lasting damage.
It's also recommended to have AutoMod send you alerts for more subjective content that requires a closer look from your moderation team, rather than having it blocked outright. This will allow your moderation team to investigate flagged messages with additional context to ensure there’s nothing malicious going on. This is useful for keywords that are commonly misinterpreted or sent in a non-malicious context.
None - This turns off verification for your community, meaning anyone can join and immediately interact with your community. This is typically not recommended for public communities as anyone with malicious intent can immediately join and wreak havoc.
Low - This requires people joining your community to have a verified email which can help protect your community from the laziest of malicious users while keeping everything simple for well-meaning users. This would be a good setting for a small, private community.
Medium - This requires the user to have a verified email address and for their account to be at least 5 minutes old. This further protects your community by introducing a blocker for people creating accounts solely to cause problems. This would be a good setting for a moderately sized community or small public community.
High - This includes the same protections as both medium and low verification levels but also adds a 10 minute barrier between someone joining your community and being able to interact. This can give you and anyone else responsible for keeping things clean in your community time to respond to ‘raids’, or large numbers of malicious users joining at once. For legitimate users, you can encourage them to do something with this 10 minute time period such as read the rules and familiarize themselves with informational channels to pass the time until the waiting period is over. This would be a good setting for a large public community.
Highest - This requires a joining user to have a verified phone number in addition to the above requirements. This setting can be bypassed by determined ‘raiders’, but it takes additional effort. This would be a good setting for a private community where security is paramount, or a public community with custom verification. This is a requirement many normal Discord users won’t meet, whether by choice or inability. It’s worth noting that Discord’s phone verification disallows VoIP numbers, preventing them from being abused.
Not everyone on the internet is sharing content with the best intentions in mind. Discord provides a robust system to scan images and embeds to make sure inappropriate images don’t end up in your community. The explicit media content filter offers varying levels of scrutiny:
Don’t scan any media content - Nothing sent in your community will go through Discord’s automagical image filter. This would be a good setting for a small, private community where only people you trust can post images, videos etc.
Scan media content from users without a role - Self-explanatory: this works well to stop new users from filling your community with unsavoury imagery. When combined with the proper verification methods, this would be a good setting for a moderately sized private or public community.
Scan media content from all members - This setting makes sure everyone, regardless of their roles, isn’t posting unsavoury things in your community. In general, we recommend this setting for ALL public facing communities.
Once you’ve decided on the base level of auto moderation you want for your community, it’s time to look at the extra levels of auto moderation bots can bring to the table! The next few sections are going to detail the ways in which a bot can moderate.
If you want to keep your chats clean and clear of certain words, phrases, spam, mentions and everything else that can be misused by malicious users you’re going to need a little help from a robotic friend or two. Examples of bots that are freely available are referenced below. If you decide to use several bots, you may need to juggle several moderation systems.
When choosing a bot for auto moderation, you should also consider their capabilities for manual moderation (things like managing mutes, warns etc.). Find a bot with an infraction/punishment system you and the rest of your moderator team find to be the most appropriate. All of the bots listed in this article have a manual moderation system.
The main and most pivotal forms of auto moderation are:
Each of these subsets of auto moderation will be detailed below along with recommended configurations depending on your community.
It’s important that your auto moderation bot(s) of choice adopt the cutting edge of Discord API features, as this allows them to provide better capabilities and integrate more powerfully with Discord. Slash commands are especially important, as each command can be configured on a case-by-case basis to control who can use it and where. This will allow you to maintain very detailed moderation permissions for your moderation team. Bots that support more recent API features are generally also more actively developed, and thus more reliable at reacting to new threat vectors and adapting to new features on Discord. A severely outdated bot could react insufficiently to a high-harm situation.
As mentioned above, one of the more recent features is Slash Commands. Slash commands are configurable per-command, per-role, and per-channel. This allows you to designate moderation commands solely to your moderation team without relying on permissions on the bot’s side to work perfectly. This is relevant because there have been documented examples in the past of permissions being bypassed on a moderation bot’s permission checking, allowing normal users to execute moderation commands.
One of the most common forms of auto moderation is anti-spam, a type of filter that can detect and prevent various kinds of spam. Depending on what bot(s) you’re using, this comes with various levels of configurability.
Anti-spam is integral to running a large private community, or a public community. There are multiple types of spam a user can engage in, with some of the most common forms listed in the table above. These types of spam messages are also very typical of raids, especially Fast Messages and Repeated Text. While spam can largely be defined as irrelevant or unsolicited messages, the nature of spam can vary greatly. However, the vast majority of instances involve a user or users sending lots of messages with the same content with the intent of disrupting your community.
There are subsets of this spam that many anti-spam filters will be able to catch. For example, if mentions, links, invites, emoji, or newline text are spammed repeatedly in one message, or across several messages, they will trigger most Repeated Text and Fast Messages filters. Subset filters are still a good thing for your anti-spam filter to have, as you may wish to punish more or less harshly depending on the type of spam. Notably, emoji and links may warrant separate punishments; spamming 10 links in a single message is inherently worse than having 10 emoji in a message.
Anti-spam will only act on these things contextually, usually in an X in Y fashion where if a user sends, for example, ten links in five seconds, they will be punished to some degree. This could be ten links in one message, or one link in ten messages. In this respect, some anti-spam filters can act simultaneously as Fast Messages and Repeated Text filters.
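To make the “X in Y” idea concrete, here is a minimal sliding-window sketch in Python. The thresholds, the `record_links` helper, and the idea of counting links per user are all illustrative assumptions, not any particular bot's implementation:

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds: ten links within five seconds triggers a punishment.
MAX_LINKS = 10
WINDOW_SECONDS = 5.0

link_history = defaultdict(deque)  # user_id -> timestamps of recent links

def record_links(user_id, link_count, now=None):
    """Record links a user just sent; return True once they hit X in Y."""
    now = time.monotonic() if now is None else now
    history = link_history[user_id]
    # One timestamp per link, whether they arrived in one message or many.
    for _ in range(link_count):
        history.append(now)
    # Discard timestamps older than the window.
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()
    return len(history) >= MAX_LINKS
```

Because the window counts links rather than messages, ten links in one message and one link in each of ten rapid messages trip the same threshold, which is how a single filter can serve as both a Fast Messages and a Repeated Text check.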
Sometimes, spam may happen too quickly for a bot to keep up. Discord imposes rate limits to stop bots from harming communities, and these limits can prevent a bot from deleting individual messages fast enough when those messages are being sent too quickly. This often happens in raids. As such, Fast Messages filters should prevent offenders from sending further messages; this can be done via a mute, kick or ban. If you want to protect your community from raids, please read on to the Anti-Raid section of this article.
Text filters allow you to control the types of words and/or links that people are allowed to put in your community. Different bots will provide various ways to filter these things, keeping your chat nice and clean.
A text filter is a must for a well moderated community. It’s strongly recommended you use a bot that can filter text based on a banlist. A banned words filter can also catch links and invites: adding http:// and https:// to the word banlist blocks all links, while adding specific full site URLs blocks individual websites. In addition, discord.gg can be added to a banlist to block ALL Discord invites.
A Banned Words filter is integral to running a public community, especially for Partnered, Community, or Verified servers who have additional content guidelines they must meet that a Banned Words filter can help with.
Before configuring a filter, it’s a good idea to work out what is and isn’t ok to say in your community, regardless of context. For example, racial slurs are generally unacceptable in almost all communities, regardless of context. Banned word filters often won’t account for context with an explicit banlist. For this reason, it’s also important that a robust filter contains allowlisting options. For example, if you add ‘cat’ to your filter and someone says ‘catch’, they could get in trouble for using an otherwise acceptable word.
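As a rough illustration of why allowlisting matters, here is a sketch of a substring-based banlist with an allowlist rescue; the `find_violations` helper and both word lists are hypothetical:

```python
import re

# Hypothetical banlist and allowlist, purely for illustration.
BANLIST = ["cat"]
ALLOWLIST = ["catch"]  # acceptable words that contain a banned word

def find_violations(message):
    """Return banned words found in the message, ignoring allowlisted words."""
    cleaned = message.lower()
    # Remove allowlisted words first so 'catch' never triggers the 'cat' entry.
    for allowed in ALLOWLIST:
        cleaned = re.sub(rf"\b{re.escape(allowed)}\b", "", cleaned)
    # A naive substring search, as many simple banlist filters use.
    return [banned for banned in BANLIST if banned in cleaned]
```

Note that the allowlist only rescues the words you anticipated: a word like ‘concatenate’ would still trip the substring match, which is why robust filters pair allowlists with whole-word matching options.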
Filter immunity may also be important to your community, as there may be individuals who need to discuss the use of banned words, namely members of a moderation team. There may also be channels that allow the usage of otherwise banned words. For example, a serious channel dedicated to discussion of real world issues may require discussions about slurs or other demeaning language; in this case, channel-based immunity is integral to allowing those conversations.
Link filtering is important to communities where sharing links in ‘general’ chats isn’t allowed, or where there are specific channels dedicated to sharing that content. This can allow a community to remove links with an appropriate reprimand, without treating that misstep with the same gravity as one would treat someone who used a slur.
Allow/ban-listing and templates for links are also a good idea to have. While many communities will use catch-all filters to make sure links stay in specific channels, some links will always be inherently unsavory. Being able to filter specific links is a good feature, with preset filters (like the google filter provided by YAGPDB) coming in very handy for protecting your user base without requiring intricate setup on your behalf. However, it is recommended you configure a custom filter as a supplement to ensure specific slurs, words, etc. that break the rules of your community aren’t being said.
Invite filtering is equally important in large or public communities, where users will attempt to raid, scam or otherwise assault your community with links intended to manipulate your user base, or where unsolicited self-promotion is potentially fruitful. Filtering allows these invites to be recognized instantly and dealt with more harshly. Some bots may also allow per-community allow/banlisting, letting you control which communities are approved to share invites and which aren’t. A good example of invite filtering usage would be something like a partners channel, where invites to other, closely linked communities are shared. These communities should be added to an invite allowlist to prevent their deletion.
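An invite filter with a per-community allowlist might look something like the following sketch (the invite pattern, the allowlist entries, and the `disallowed_invites` helper are illustrative assumptions):

```python
import re

# Hypothetical allowlist of partner communities whose invites may be shared.
INVITE_ALLOWLIST = {"partnerserver", "friendlyhub"}

# Matches the common shapes of Discord invite links and captures the code.
INVITE_PATTERN = re.compile(r"(?:discord\.gg|discord\.com/invite)/([A-Za-z0-9-]+)")

def disallowed_invites(message):
    """Return invite codes in the message that aren't on the allowlist."""
    codes = INVITE_PATTERN.findall(message)
    return [code for code in codes if code.lower() not in INVITE_ALLOWLIST]
```

A bot built on this idea would delete the message (and possibly reprimand the sender) whenever `disallowed_invites` returns anything, while leaving partner invites untouched.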
Discord also implements a native filter on links and files, though this filter is entirely client-side and doesn’t prevent malicious links or files being sent. It does, however, warn users who attempt to click suspicious links or download suspicious files (executables, archives etc.) and prevents known malicious links from being clicked at all. While this doesn’t remove offending content, and shouldn’t be relied on as auto moderation, it does prevent some cracks in your auto moderation from harming users.
Raids, as defined earlier in this article, are mass-joins of users (often selfbots) with the intent of damaging your community. Protecting your community from these raids can come in various forms. One method involves gating your server using a method detailed elsewhere in the DMA.
Raid detection means a bot can detect the large number of users joining that’s typical of a raid, usually in an X in Y format. This feature is usually chained with Raid Prevention or Damage Prevention to prevent the detected raid from being effective, wherein raiding users will typically spam channels with unsavory messages.
Raid-user detection is a system designed to detect users who are likely to be participating in a raid, independently of the quantity or frequency of new user joins. These systems typically look for users that were created recently or have no profile picture, among other triggers depending on how elaborate the system is.
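Such heuristics are often combined into a simple score. The sketch below is a hypothetical example of that idea; real systems tune the weights and use many more signals (join bursts, username patterns, shared avatars, and so on):

```python
# Hypothetical scoring heuristic for raid-user detection.
def raid_suspicion_score(account_age_days, has_avatar, default_looking_name):
    """Crude additive score: higher means more likely a raid account."""
    score = 0
    if account_age_days < 1:
        score += 2  # brand-new accounts are the strongest single signal
    elif account_age_days < 7:
        score += 1
    if not has_avatar:
        score += 1  # no profile picture set
    if default_looking_name:
        score += 1  # e.g. an auto-generated-looking username
    return score
```

A bot might then mute, kick, or flag for review any joiner whose score crosses a chosen threshold, with harsher responses at higher scores.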
Raid prevention stops a raid from happening, either by Raid detection or Raid-user detection. These countermeasures stop participants of a raid specifically from harming your community by preventing raiding users from accessing your community in the first place, such as through kicks, bans, or mutes of the users that triggered the detection.
Damage prevention stops raiding users from causing any disruption via spam to your community by closing off certain aspects of it either from all new users, or from everyone. These functions usually prevent messages from being sent or read in public channels that new users will have access to. This differs from Raid Prevention as it doesn’t specifically target or remove new users in the community.
Raid anti-spam is an anti-spam system robust enough to prevent raiding users’ messages from disrupting channels via the typical spam found in a raid. For an anti-spam system to fit this dynamic, it should be able to prevent Fast Messages and Repeated Text. This is a subset of Damage Prevention.
Raid cleanup commands are typically mass-message removal commands to clean up channels affected by spam as part of a raid, often aliased to ‘Purge’ or ‘Prune’.
It should be noted that Discord features built-in raid and user bot detection, which is rather effective at preventing raids as or before they happen. If you are logging member joins and leaves, you can infer that Discord has taken action against shady accounts if the time difference between the join and the leave times is extremely small (such as between 0-5 seconds). However, you shouldn’t rely solely on these systems if you run a large or public community.
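If you keep join and leave logs, the inference described above can be sketched as pairing each leave with its join and flagging tiny gaps (the log format and the `likely_removed_by_discord` helper are assumptions for illustration):

```python
# join_log / leave_log are hypothetical lists of (user_id, unix_timestamp).
def likely_removed_by_discord(join_log, leave_log, max_gap_seconds=5.0):
    """Flag users whose leave follows their join within a few seconds."""
    joined = dict(join_log)
    flagged = []
    for user_id, left_at in leave_log:
        joined_at = joined.get(user_id)
        if joined_at is not None and 0.0 <= left_at - joined_at <= max_gap_seconds:
            flagged.append(user_id)
    return flagged
```

A cluster of such near-instant departures in your logs is a reasonable hint that Discord's own detection intervened, though as noted you shouldn't rely on it alone.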
Messages aren’t the only way potential evildoers can introduce unwanted content to your community. They can also manipulate their Discord username or Nickname to be abusive. There are a few different ways a username can be abusive and different bots offer different filters to prevent this.
Username filtering is less important than other forms of auto moderation. When choosing which bot(s) to use for your auto moderation needs, this should typically be a later priority, since users with malicious usernames can just be nicknamed in order to hide their actual username.
So far, we’ve covered general auto moderation bots with a wide toolset. However, there are some specialized bots that only cover one specific facet of auto moderation and execute it especially well. A few examples and descriptions are below:
This bot detects raids as they happen globally, banning raiders from your community. This is especially notable as it’ll ban detected raiders from raids in other communities it’s in as they join your community, making it significantly more effective than other anti-raid solutions that only pay attention to your community.
Fish is designed to counter scamming links and accounts, targeting patterns in joining users to prevent DM raids (Like normal raids, but members are directly messaged instead). These DM raids are typically phishing scams, which Fish also filters, deleting known phishing sites.
Both of these bots are highly specialized link and file moderation bots, effectively filtering adult sites, scamming sites and other categories of sites as defined by your moderation team.
When choosing a bot for auto moderation you should ensure it has an infraction/punishment system you and your mod team are comfortable with as well as its features being what’s best suited for your community. Consider testing out several bots and their compatibility with Discord’s built-in auto moderation features to find what works best for your server’s needs. You should also keep in mind that the list of bots in this article is not comprehensive - you can consider bots not listed here. The world of Discord moderation bots is vast and fascinating, and we encourage you to do your own research!
For the largest of communities, it’s recommended you employ everything Discord has to offer. You should use the High or Highest Verification level, all of Discord’s AutoMod keyword filters and a robust moderation bot like Gearbot or Gaius. You should seriously consider additional bots like Fish, Beemo and Safelink/Crosslink to aid in keeping your users safe and have detailed Content Moderation filters. At this scale, you should seriously consider premium, self hosted, or custom moderation bots to meet the unique demands of your community.
It’s recommended you use a bot with a robust and diverse toolset while simultaneously utilizing AutoMod’s commonly flagged word filters. You should use the High Verification level to aid in preventing raids. If raiding isn’t a large concern for your community, Gearbot and Giselle are viable options. Your largest concerns in a community of this size are going to be anti-spam and text filters, meaning robust keyword filters are also highly recommended, with user filters as a good bonus. Beemo is generally recommended for any servers of this size. At this scale a self hosted, custom, or premium bot may also be a viable option, but such bots aren’t covered in this article.
It’s recommended you use Fire, Gearbot, Bulbbot, AutoModerator or Giselle. Mee6 and Dyno are also viable options; however, as they’re very large bots, they have been known to experience outages, leaving your community unprotected for large amounts of time. At this community size, you’re likely not going to be largely concerned about anti-raid, with anti-spam and text filters being your main focus. You’ll likely be able to get by just using AutoMod’s keyword filters and the commonly flagged words lists provided by Discord. User filters, at this size, are largely unneeded, and your Verification Level shouldn’t need to be any higher than Medium.
If your community is small or private, the likelihood of malicious users joining to wreak havoc is rather low. As such, you can choose a bot with general moderation features you like the most and use that for auto moderation. Any of the bots listed in this article should serve this purpose. At this scale, you should be able to rely solely on AutoMod’s keyword filters. Your Verification Level is largely up to you at this scale depending on where you anticipate member growth coming from, with Medium being default recommended.
First, make sure Mee6 is in the communities you wish to configure it for. Then log into its online dashboard (https://mee6.xyz/dashboard/), navigate to the community(s), then plugins and enable the ‘Moderator’ plugin. Within the settings of this plugin are all the auto moderation options.
First, make sure Dyno is in the communities you wish to configure it for. Then log into its online dashboard (https://dyno.gg/account), navigate to the community(s), then the ‘Modules’ tab. Within this tab, navigate to ‘Automod’ and you will find all the auto moderation options.
First, make sure Giselle is in the communities you wish to configure it for. Then, look at its documentation (https://docs.gisellebot.com/) for full details on how to configure auto moderation for your community.
First, make sure Gaius is in the communities you wish to configure it for. Then, look at its documentation (https://automoderator.app/docs/setup/) for full details on how to configure auto moderation for your community.
First, make sure Fire is in the communities you wish to configure it for. Then, look at its documentation (https://getfire.bot/commands) for full details on how to configure auto moderation for your community.
First, make sure Bulbbot is in the communities you wish to configure it for. Then, look at its documentation (https://docs.bulbbot.rocks/getting-started/) for full details on how to configure auto moderation for your community.
First, make sure Gearbot is in the communities you wish to configure it for. Then, look at its documentation (https://gearbot.rocks/docs) for full details on how to configure auto moderation for your community.
While there are certain “best practices” when it comes to moderation transparency, there is no single system that is right for everyone. The amount of transparency you need for your moderation system ultimately depends on your server rules, culture, and vision. This article will explain the pros and cons of transparency and ways that you can apply transparency to your moderation system.
Though the idea of moderation transparency is generally considered to be a good thing, it is important to understand that there are both pros and cons to transparency in moderation. Some of these pros and cons are described below.
To help you understand how the pros and cons apply to transparency, consider an example in which a moderator publicly warns another user not to call someone a “retard” because it violates an existing “No Slurs Allowed” rule.
Now that you are aware of some of the pros and cons of transparency in moderation, you must next understand the components of the moderation system so that you can consider ways in which these components can be made more or less transparent. Broadly speaking, a moderation system can be split into the following components:
Transparency and communication go hand-in-hand. The more you communicate these components to relevant users and the server as a whole, the more transparent your moderation system is.
There are several ways to implement transparency in each of these components, each with their own pros and cons. Each section here will establish ways in which a component can be made more or less transparent and a recommendation of the appropriate level of transparency for each. However, please keep in mind that every server’s needs are different and some of the pros and cons discussed may not apply to your server. It is always important to consider your specific community when it comes to implementing transparency.
Your server rules are the backbone of your moderation system. They describe how your members should conduct themselves and what happens if they don’t meet those expectations. In general though, your rules should be specific enough to ensure comprehension and compliance without being overly wordy or attempting to provide an exhaustive description of prohibited behaviors.
For example, giving a couple of examples of NSFW content for a “no NSFW content rule” may help people understand what you interpret as being NSFW, compared to other servers or Discord itself. However, too many examples may make the list seem fully comprehensive, and people will assume that items not on the list are fair game. Disclaiming that examples of rule-breaking content are non-exhaustive and that the moderators have the final say in interpreting if someone is breaking the rules can help to address users that are interested in testing the limits of the rules or being rules lawyers to escape punishment on a technicality.
Developing moderator guidelines is another important part of your moderation system. Similar to your rules guiding the conduct of your server members, your moderator guidelines help guide the conduct of your moderators.
Keeping your moderator guidelines visible to the rest of the server will encourage compliance from members, and enable them to defuse incidents without moderator intervention. Furthermore, providing basic standards of moderator conduct will help users know when it’s appropriate to report moderators to the server owner for misconduct and hold them accountable. However, you should avoid putting too much of your moderator guidelines out in the public in order to avoid rules lawyers deliberately misinterpreting the spirit of the guidelines to their advantage. After developing your moderator guidelines, balancing these pros and cons will help you determine how much of your guidelines you should present to the public.
Logging user infractions is key to ensuring that the entire moderation team has the same understanding of how often a user has broken the rules. Transparency between the mod team and the user in question is important for the user to understand when they have received a warning that brings them closer to being banned from the server. Informing the user of which moderator warned them is important for holding moderators accountable for the warnings they issue, but may leave moderators open to harassment by warned users. Having a procedure to deal with harassment that stems from this is one way to achieve accountability while still protecting your moderators from bad actors in your server.
Although the communication of infractions is vital to ensure understanding among your server members, it may be prudent to withhold information about exactly how close a user is to being banned so that they do not attempt to toe the line by staying just under the threshold for being banned. Furthermore, even though a public infraction log may be a good way to promote cohesion and transparency by showing examples of unacceptable behavior to the rest of the server and fostering discussion between the mod team and community, others may think that such a log infringes on user privacy or that these logs may constitute a “witch hunt.” It may also leave mods and users open to harassment over warnings given or received.
If you want to encourage a sense of community and understanding without taking away user privacy or inadvertently encouraging harassment, a better option may be to encourage users to bring up criticisms of rules or enforcement in a feedback channel if they wish to. Provided that the mod team ensures these conversations remain constructive and civil, creating a public medium for these conversations will help others understand how the mod team operates and allow them to provide feedback on how the server is run.
Everyone makes mistakes, and moderators are no exception. It is important to have a process for users to appeal their warnings and punishments if they feel that they were issued unfairly. If you decide to have a public infractions log, you may receive appeals on behalf of warned users from people who were uninvolved in the situation if they feel the warning was issued unfairly. While this can help with accountability if a user is too nervous to try to appeal their warning, it can also waste the time of your mod team by involving someone that does not have a complete understanding of the situation. In general, it is better to keep the appeal process private between the moderation team and the punished user, primarily via mediums such as direct messages with an administrator or through a mod mail bot. During the appeal process, it is best to ensure that you clearly and calmly walk through the situation with the appealing user to help them better understand the rules while maintaining moderator accountability.
In the end, there is not a single “correct” way to manage transparency in your moderation system. The appropriate level of transparency will vary based on the size of the server and the rules that you implement. However, walking through the steps of your moderation system one by one and considering the various pros and cons of transparency will help you determine for yourself how to incorporate transparency into your moderation system. This will help you build trust between moderators and non-moderators while preventing abuse on both ends of the system.
A parasocial relationship describes a one-sided relationship between a spectator who develops a personal attachment through various influences to a performer who is not aware of the existence of the spectator. It is strengthened by continuous positive exposure to its source, which mainly happens on social platforms.
In this section we’ll take a look at how parasocial relationships are developed and how to establish the severity of the level of parasocial relationships you are encountering from a moderation standpoint.
The establishment of parasocial relationships can be portrayed as such:
User A, in this example a popular content creator, uploads regular content on a big platform. User B, a member of User A’s audience, takes an interest in their content. User B reacts to User A’s content and observes them. While User A may know that people are enjoying their content, they are unlikely to be aware of every viewer’s existence. This total awareness becomes more unlikely the bigger the audience gets.
User B on the other hand is regularly exposed to User A’s content and takes a liking to them. The interest is usually defined by User A’s online persona: content, visual appeal, likeability, and even their voice can all be influencing aspects. User B perceives User A as very relatable through common interests or behaviors and starts to develop a feeling of loyalty, or even responsibility, during that phase of one-sided bonding. This behavior can be attributed to personal reflection in User A, as well as psychological factors like loneliness, empathy, or even low self-esteem. As a consequence, User B can easily be influenced by User A.
At this point User B might feel like they understand User A in a way nobody else does and may even begin to view them on a personal level as some sort of friend or close relative. They see this individual every day, hear their voice on a regular basis, and believe that they are connecting to them on a deep level. They develop an emotional attachment, and the stronger the parasocial relationship gets, the more attention User B pays to User A’s behavior and mannerisms. While User A most likely doesn’t know User B personally, User B will seek out interaction with and recognition from User A. That behavior is typically represented through donations on stream, where User A either reads out their personal message and name or publishes a “thank you” message on certain websites.
Additionally, User B tries to follow and engage with their idol on as many platforms as possible aside from their main source of content creation. These social media platforms are usually Instagram or Twitter, but can also include User A’s Discord server.
While that type of relationship is natural and sometimes even desired, it is important to define the levels of parasocial relationships and differentiate between their intensities for the safety of the community, the staff members, and the performer. In their article in the psychology-focused academic journal ‘The Psychologist’, researchers Giles and Maltby described three levels of severity of parasocial relationships based on the Celebrity Attitude Scale.
“Fans are attracted to a favourite celebrity because of their perceived ability to entertain and to become a source of social interaction and gossip. Items include ‘My friends and I like to discuss what my favourite celebrity has done’ and ‘Learning the life story of my favourite celebrity is a lot of fun’.”
The least harmful level is the general public and social presence. The targeted celebrity is subjected to gossip and mostly provides a source of entertainment. Their presence is mostly found in talks with friends, talk shows, on magazine covers, and similar public-facing media. Discord users on this level usually interact with the community in a relaxed, harmless way.
The next level is parasocial interaction. This level is characterized by a spectator developing an emotional attachment to a performer, resulting in intense feelings. The spectator wants to get to know the performer, then desires to be part of the performer’s life and considers the performer part of their own life. A result of that can be addictive or even obsessive behavior, which can be noticed in Discord servers, too.
“The intense-personal aspect of celebrity worship reflects intensive and compulsive feelings about the celebrity, akin to the obsessional tendencies of fans often referred to in the literature. Items include ‘My favourite celebrity is practically perfect in every way’ and ‘I consider my favourite celebrity to be my soulmate’.”
Spectators of that level usually ping the performer or message them privately in an attempt to be recognized. While that behavior is natural, anything that endangers safe interactions between themselves, the community, or the performer needs to be supervised carefully. Unrestrained abusive behavior, which can be found in unwanted intimate, borderline NSFW questions or comments, needs to be addressed and corrected accordingly.
The final level is considered the most intense level and also the most dangerous. It contains severe, harmful obsessions that can extend all the way to stalking and real-world consequences. Parasocial relationships to this degree will rarely be found on Discord, but have to immediately be reported if present.
“This dimension is typified by uncontrollable behaviours and fantasies about their celebrities. Items include ‘I would gladly die in order to save the life of my favourite celebrity’ and ‘If I walked through the door of my favourite celebrity’s house she or he would be happy to see me’.”
The vast majority of users won’t reach the level past seeing the performer as a source of entertainment, but moderators should be aware of the potential consequences of anything beyond that, as they can be harmful to the spectator, the performer, and the safe environment you are working to maintain for all.
Parasocial relationships on Discord can pertain to anyone who is perceived as being popular or influential, making them “celebrities” of Discord. Some examples of parasocial relationships on Discord can be found between a user and a moderator, a user and a content creator you are moderating for, or even a member of your moderation team and the content creator you are working for.
But what does all that mean for you, the mod? While Discord moderators are rarely as popular or influential as big content creators or celebrities, they are still observed by Discord users. While being a moderator puts you in a position of power and responsibility over the wellbeing of the server, some users perceive it as you climbing the social ladder in the Discord server. In their eyes, becoming a moderator changes your overall social status within your Discord community.
Being hoisted higher in the server’s hierarchy results in members quickly recognizing you and potentially treating you differently due to your influence, even becoming “fans” of you as a person. Some users will soak up any information they can get about you, especially if they realize that you have common interests. This may lead to the development of a parasocial relationship between users and you. Users you have never interacted with before might see you as a person they would get along with and seek out your attention, leading to a one-sided relationship on their part.
Having such an audience can be overwhelming at first. People will start to look up to you, and younger users especially can easily be influenced by online personas. They might adopt your behavior or even copy your mannerisms. Knowing that, you should always be self-aware of your actions and etiquette in public to promote a healthy, sustainable relationship with the users. Receiving special attention from users can quickly spiral into an arrogant or entitled attitude. There is nothing wrong with being proud of your position and accomplishments, but being overly arrogant will influence members’ behavior towards you.
If even one user decides that a moderator is not acting responsibly, that mindset can spread through the community in negative ways. They might belittle you in front of new members and give them the feeling that you won’t be there to help them, or they might not inform you of ongoing problems on the server during a temporary absence of moderators in the chat. A healthy user-moderator relationship is important to prevent or stop ongoing raids as well as make moderators aware of a user misbehaving in chat.
Additionally, it’s important to be mindful that your perceived fame does not start to negatively influence your judgment. For example, you may find yourself giving special attention to those who seem to appreciate you while treating users that are indifferent towards your position as a moderator more harshly. It also causes the dynamics within the staff team to change as fellow moderators might start to perceive you differently if you begin to allow bias to seep into moderation. They may start to second guess your decisions, feel the need to check up on your moderator actions, or even lose trust in your capabilities.
If you ever notice that you experience said effect, or notice one of your fellow moderators is experiencing it and letting it consume them, be supportive and sort out the negative changes. When confronting another moderator about it, make sure to do it through constructive criticism that doesn’t seem like a personal attack.
That said, the effects of parasocial relationships don’t always have to be negative in nature. If users manage to build such a connection to a moderation team, the general server atmosphere can grow positively. Users know what moderators do and don’t enjoy, which will lead them to behave in a way that appeals to staff and usually abides by the server rules. They will also be able to predict a moderator’s reaction to certain behavior or messages new people might use. As a result, they will attempt to correct mildly misbehaving users themselves without immediately getting staff involved, in hopes of receiving positive feedback from staff. Naturally, moderators won’t be able to know of every single person that tries to appeal to them through those actions, but once they are aware that such things happen in certain text channels, it will give them the opportunity to focus on other channels and provide their assistance there.
As mentioned before: in the case of content creators who frequently upload videos, streams, and other forms of media for their followers, the chance of a parasocial relationship developing can be even greater. This only intensifies when users join a creator’s Discord server, sometimes under the false assumption that there will be a higher chance of their messages being read and noticed. Your responsibility as a moderator is to neither weaken that bond nor encourage it while providing security for users, staff, and the content creator. Let the users interact in a controlled environment while maintaining the privacy of the content creator.
Some users might even feel like the content creator owes them some sort of recognition after long-term support, whether through engagement or donations. Such a demand can be intensified when they’re shown as “higher” in the hierarchy through dedicated Discord roles, such as Patreon/Donator or simple activity roles. When multiple people build a parasocial relationship with the same content creator and experience that phenomenon, they may see other active users or even moderators as “rivals.” They see the content creator as a close friend and feel threatened that others, especially those who financially support the content creator, perceive the creator the same way or are even closer to them. During such moments, it is recommended to keep the peace between users and let them know that the content creator appreciates every fan they have. While those providing financial support are appreciated, every viewer helps make the creator as big as they are and played a part in getting them to where they are today.
It may not only be users who feel closer to the creator by joining the server. Many beginner moderators may also find themselves feeling as though they are above the rest of the community because their idol has entrusted them with power on their server. Being closer to the creator than most of the users can easily fog your judgement; it is essential to prioritize being friendly and respectful to the users over these personal convictions. When adding moderators to your moderation team, it is important to keep an eye out for this kind of behavior to combat it, and to hold your teammates accountable should this behavior begin to manifest in one of them. Making sure your entire team is on the same page regarding your duties and standing in the community is essential to maintaining a healthy moderation environment. As a moderator for a content creator, this individual you may admire deeply has put their trust in you to keep their community safe. Developing an unhealthy parasocial relationship with them directly interferes with your ability to do that, failing not only the community but also the creator.
Despite the potential dangers of parasocial relationships, the fact that they develop at all may indicate that you are doing a good job as a moderator. While positive attention and appreciation are key factors in a healthy community, not everyone may like that sort of attention, and it is completely acceptable to tell your fellow moderators or even the users themselves about it. At some point you might feel like you have reached your limit and need a break from moderation and from managing parasocial relationships aimed at you and those around you. Moderator burnout is very real, and you should not hesitate to take a break when you need it.
Users will view you, as a moderator, as a leader that helps guide your designated Discord server in the right direction. As such, you will be a target for rude comments by users that have personal issues with the server while simultaneously getting showered with affection by other users who are thankful for what you do for the server. Never be afraid to ask for help and rely on the moderation team if things go too far for your personal boundaries or comfort level, even if you are an experienced moderator. Establishing a healthy relationship with the community is important, but being able to trust your fellow staff members is even more so. Nobody expects you to build an intimate relationship with every member, but knowing you can count on them and their support is essential for the team to function correctly.
The first step towards securing the server you moderate is securing your own Discord account. Your first line of defense is a strong and unique password. Some characteristics of strong passwords include:

- Long (at least 12 characters)
- A mix of uppercase and lowercase letters, numbers, and symbols
- Not based on personal information or a single common dictionary word
- Unique to your Discord account and not reused on any other site
You can also use a random password generator or a password manager to create a completely random password that will be nearly impossible to guess, but difficult to remember. Another option is to combine several random words together. The key, though, is that the words need to be completely random. Using a tool to help select words at random from the dictionary is a good way to help ensure their randomness.
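The random-words approach above can be sketched in a few lines of Python. This is a minimal illustration, assuming a tiny sample word list; a real passphrase should draw from a large dictionary (for example, the EFF long wordlist with several thousand entries) so the result has enough entropy. The key detail is using the `secrets` module, which is cryptographically secure, rather than `random`.

```python
import secrets

# Hypothetical, tiny word list for illustration only -- in practice, load a
# large dictionary (e.g. the EFF long wordlist) so attackers can't brute-force
# the small pool of words.
WORDS = [
    "orbit", "velvet", "canyon", "lantern", "meadow", "quartz",
    "harbor", "tundra", "saffron", "glacier", "ember", "willow",
]

def generate_passphrase(word_count=5, separator="-"):
    """Pick words uniformly at random with a cryptographically secure RNG."""
    return separator.join(secrets.choice(WORDS) for _ in range(word_count))

passphrase = generate_passphrase()
print(passphrase)  # e.g. "ember-canyon-willow-orbit-quartz"
```

Because `secrets.choice` draws from the operating system's secure randomness source, each word is chosen independently and uniformly, which is exactly the "completely random" property the paragraph above calls for.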
Once you have a strong password, you should also enable two-factor authentication, also known as 2FA. 2FA ensures that even if someone manages to guess your password, they won’t be able to get into your account without access to the device where the 2FA app is. You can also enable 2FA via SMS and receive your authentication code via text message. However, SMS 2FA is less secure than application-based 2FA because text messages can be intercepted or your phone number could be stolen. Although the chance of this is still low, you should still avoid enabling the SMS backup for this reason if possible.
You also need to make sure the devices where your Discord account is logged in and the device that has your 2FA app are physically secure. Make sure your computer is password protected and locked when you are physically away from it. If you use a public computer, make sure that you use incognito mode on the web browser to ensure that your Discord information is removed when you close the browser. For a phone or tablet, require a PIN code to unlock it so that it can’t be used by strangers.
Now that your account is nice and secure, there is one more thing you must closely monitor to ensure it doesn’t fall into the wrong hands: yourself.
The weakest link in any cybersecurity system is usually a human, and the security of your Discord account is no exception. Social engineering is the use of deception to manipulate individuals into divulging confidential or personal information that may be used for fraudulent purposes. People attempting to gain access to your Discord account may attempt to get you to log into a fake site, download a malicious file, or click on a suspicious link. Being able to identify these actions and avoiding potential pitfalls is an important part of keeping your account (and the servers you moderate) safe.
One of the most common and dangerous scams on Discord is a user or a bot sending out a direct message with a QR code, saying that you should scan it with Discord’s QR code scanner for free Nitro. This will generally be combined with instructions on how to access and use Discord’s QR code scanner. However, it is important to remember that Discord’s QR code scanner is only used to log in to Discord. Scanning the given QR code will allow the attacker to log directly into your account, bypassing your password and any 2FA you may have configured. If you accidentally scan a suspicious QR code, you should immediately change your password, as this will invalidate your current account token and log you out of all devices. You can also report any such scams directly to Discord Trust and Safety for further action. For more information on making reports, check out this article.
Another common attack is to encourage you to click on a link that redirects to a fake Discord website. Before clicking on any links from a user, ask yourself the following questions:

- Is the message unsolicited or from a stranger?
- Does it promise something that seems too good to be true, such as free Nitro?
- Does it pressure you to act quickly or secretly?
- Does the link look misspelled, shortened, or otherwise unfamiliar?
If you find that the answer to many of the above questions is “yes”, you should avoid performing whatever action they are requesting. You can also check any suspicious-looking URLs with various URL checkers, such as this one.
If the user is specifically asking you to click on a link that prompts you to log in to Discord, another option you have is to navigate directly to https://discord.com in your web browser and log in from there. If clicking on the user’s link still takes you to a login page, double check the URL of the website. One thing you’ll want to check is if the website starts with https:// instead of http://, or that there is a lock next to the beginning of the URL. Although some fake sites may still have an https:// designation, many of them will not. Other signs may be slight misspellings of the URL or visual tricks such as diiscrd.com or dlscord.com with a lowercase “l” instead of an “i”. If you notice any of these signs, it is highly likely that it is not actually Discord’s website and instead a fake website intended to trick you into entering your login credentials so that it can steal your account.
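These manual checks boil down to one rule: compare the link’s hostname, exactly, against a short list of domains you trust, so lookalikes fail the match. A minimal sketch (the allowlist below is a hypothetical example, not an official list of Discord domains):

```python
from urllib.parse import urlparse

# Hypothetical allowlist for illustration -- fill in the exact domains of the
# services you actually use.
TRUSTED_HOSTS = {"discord.com", "support.discord.com"}

def looks_legitimate(url):
    """Return True only for HTTPS links whose hostname exactly matches the
    allowlist. Lookalikes such as "dlscord.com" or "diiscrd.com" fail the
    exact comparison, as does plain http://."""
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in TRUSTED_HOSTS

print(looks_legitimate("https://discord.com/login"))  # True
print(looks_legitimate("https://dlscord.com/login"))  # False (lowercase "l")
print(looks_legitimate("http://discord.com/login"))   # False (not HTTPS)
```

The design choice here is exact matching rather than substring matching: checking whether a URL merely "contains" discord.com would wrongly accept addresses like discord.com.evil.example, while an exact hostname comparison rejects them.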
Creating a strong password, enabling 2FA, and following best practices for physical device security are the first steps towards keeping your Discord account secure. However, there may be people that try to trick you into giving access to your Discord account through various scams or other social engineering attacks. Being able to spot suspicious messages and users and being cautious when encountering strange links or files is another important part of keeping your account safe. Of course, anyone that is able to illicitly gain access to a moderator account on your server still has the potential to do great harm, such as banning users and deleting messages, channels, and roles. Be sure to share this information with the other moderators on your server so that you can each do your part to keep your community safe by keeping your accounts secure.
Discord communities are distinct from subreddits and attract different audiences. While there are often overlaps between those audiences, it will not always be the case, and it’s important to determine whether your community will benefit from having a Discord before you attempt to start one.
Successful Discord communities revolve around human connections and conversations and not just content. For a Reddit community to translate well into a Discord community, it should be centered around a topic its members are passionate about and are highly engaged with.
To start, ask yourself the following questions:

- Are your members passionate enough about the topic to chat about it in real time?
- Does your community revolve around conversation and human connection, not just content?
- Would your audience actively use a chat-based platform alongside the subreddit?
If you believe the answers to those questions are negative for your community, it might be helpful to take a step back and reconsider whether a Discord server would benefit it.
Typically, it’s best to keep the Reddit and Discord moderation teams separate. Your Discord is a separate ecosystem with its own needs, and it’s important to find users from within it who will help you develop and maintain it, and make it flourish.
When starting off, adding your existing subreddit moderator team usually works. However, it’s important to note that those mods might not always be as dedicated to this new platform as they are to the one that they came from. Looking into the future for your Discord server, things might change and the subreddit mods that helped it in the early days might end up having to take a backseat in favor of users who are brought in from within the server.
The owner of the server should be a dedicated mod from the subreddit who knows both the community and the inner-workings of Discord. Decisions made by the owner will be critical to the development of your server, so take a moment to review all of the potential candidates within your team to choose the best one for the task.
Make sure to bring in at least one user or subreddit mod who is knowledgeable about the Discord ecosystem and familiar with your existing community to help with the setup and administration of the server early on.
You now have a bunch of mods! Mods on your subreddit, mods dedicated to your Discord server, and mods that are both. But what do you do with all these mods? How do you tell them apart, and what permissions should they have? Communication between your different teams is key to the success of both of your communities.
Outside of the owner, you should ideally have at least two other moderators that are present on both teams. These shared mods will be able to efficiently relay information between the teams, coordinate collaborations between the Reddit and Discord communities, be able to take action in emergency situations, and mediate conflicts if they occur.
Here are a few best practices:
It’s important to be upfront with your community about the fact that your subreddit and Discord server are run and moderated by completely separate mod teams after the server starts taking shape. Set up escalation paths for both of your teams: direct issues related to your subreddit to Reddit modmail, and direct users experiencing issues within the server to its team.
All staff positions (except the owner and lead admins) should be independent of the user’s status on other moderation teams, and you should remind your staff team that becoming a mod on one platform does not guarantee a mod position on the other. Mods that participate in multiple teams must still uphold your activity requirements and meet all of your expectations, just like the rest of your team.
While your communities are linked together, they’re separate entities with different groups of regular visitors and contributors. When considering cross platform promotions, assess their relevance to each of your audiences, and determine whether they will find it helpful. A few best practices around this topic are:
Creating a Discord server can be a great way to broaden your subreddit-based community's horizons, giving your users a whole new way to interact with each other. However, it's important to remember that maintaining a Discord community can be a whole lot of work that some of your existing team members might not be interested in taking on. Finding the right person to lead your Discord and ensuring your community's new outpost is in good hands early on will ensure a lasting and smooth relationship between your subreddit and Discord, to everyone’s benefit.
Each server has to decide for itself what the most effective training method looks like. This article will present and explain some recommended methods that can simplify moderation training.
The “Buddy System” approach describes working in pairs or groups, in which two or more “buddies” work together on one task. Requiring newer moderators to work with each other or with an assigned team member allows you to better monitor progress and ensures aid is available when required.
It promotes active communication and trust between moderators and also allows them to keep an objective view on everything. Having multiple opinions about certain matters and receiving assistance from a reliable source prevents moderators from feeling pressured with moderation tasks and can even open your eyes to new viewpoints. Additionally, it allows its members to effectively share their moderation skills with each other. It also aids in terms of personal safety: if a moderator feels personally attacked by a user, they get immediate support from their “buddies” without feeling the need to tackle the issue alone.
Using this system allows less experienced moderators to quickly and effectively catch up to the moderation standards as they learn how to deal with specific matters first hand. It is important to remind the more experienced moderators on your team to allow new moderators to learn instead of wanting to quickly handle every task themselves. With time, certain actions will also become second nature for newer moderators, too!
This approach is most commonly used when onboarding new moderators. Here experienced moderators or the head of staff introduce each new member of the team personally and guide them through the most important aspects.
The difference between this system and the “Buddy System” is that each new moderator will be acquainted with the moderation tasks and responsibilities by a higher-up, and usually only once. After the walkthrough, most recruits are expected to manage certain moderation duties on their own while being supervised. It is crucial to support and reassure them so they are able to grow confident in their actions. Recruits can display and appropriately train their soft skills and be informed about moderation standards in a controlled environment, without fear of causing too much irreparable damage.
Another method of efficiently training both experienced and inexperienced moderators is by letting them regularly test their knowledge. This involves designing exemplary situations of some everyday issues happening within your Discord server to let the trial moderators explain how they would handle them. Such instances can include how to handle issues in audio channels, user disputes, DM Discord invitations, off topic discussion in the incorrect channel, and potential issues that may be encountered with bots. At the end you should provide some sort of “model” or “example” answer to let the recruits know where they need to improve.
One negative effect of this system could be the fear of failure that some inexperienced moderators might be exposed to. Reassure them that making mistakes is okay as long as you take responsibility for your actions and are willing to learn from them. Moderation is an ever-shifting and learned art, and mistakes are not to be punished when they happen every once in a while.
It may be tempting to conduct these regular exercises incognito (aka acting up on an alt to see how they do) or test them without warning. While this may yield more “everyday,” unbiased results, it has a high probability of backfiring. Blindsiding your moderation team with tests and exercises has the potential to do more harm than good, especially in terms of team trust and morale. It’s recommended that you don’t do this and instead practice transparency when conducting regular exercises in order to avoid a potentially inequitable situation.
Another form of training is to demonstrate how situations or scenarios are handled in your community via presentation or an actual demonstrative walk-through with moderation alt accounts. This is very useful for training where moderators have to use a wide range of commands, such as explaining moderation and Modmail bots. Ideally, this should take place in an audio channel or group call where you can share your screen. Not everyone is able or comfortable joining a voice chat and unmuting themselves, which is something to be considered beforehand.
An essential part of onboarding new moderators is to have an easily accessible document outlining the basic responsibilities and details on moderator tasks and different moderation teams. Such documents need to be designed for each server individually, but they usually contain general rules for staff, conduct expectations, and a punishment outline to set a standard and unity for moderation actions.
Recommended additions to such a moderation handbook are:
Be sure to include anything that is prevalent in your moderation culture or community that is also worth mentioning here! The more thorough the guidelines, the easier the document is to refer back to for any questions. For example, gaming servers should have a brief description of the featured gaming company’s Terms of Service and discussion about how to handle cheaters or users mentioning account sharing/selling in accordance with those rules. Another example would be bot support servers, which benefit from a brief description of commonly encountered issues and how to solve them, as well as an FAQ section to help users with simpler answers.
Another important topic new moderators should have easy access to is the commands for moderation bots. Having a quick reference in a separate text channel with the bots’ prefix and command format (command, user-id, [time], reason) will aid moderators in quickly responding to ongoing issues on the server. It lets them react fast without having to pause to look up the necessary command. Try to use the same prefix for your main moderation bots to avoid unnecessary confusion, particularly for users who are new to moderating.
Each server designs its own service to assist moderators with helping out users. Bigger servers tend to rely on a ticket or Modmail system, so properly introducing it with sample conversations or problem cases is essential for both the recruit and future users. Confident moderators are more willing to aid users in need than those who are still unsure of how the system works.
This kind of training should include commonly used commands and procedures. There should be conversation about how to redirect users to higher staff and how to close tickets appropriately so they don’t stack up and cause confusion. As long as the ticket system remains organized and those guidelines are established across the full team, this will prove to be an easy-to-use communication system that is much less daunting than it may seem!
No user should ever feel unsafe or threatened on a Discord server. Staff members are often exposed to harmful or disrespectful messages, some of them targeted directly at specific moderators.
An important aspect of onboarding moderators is therefore making them aware of how to react in such situations. They need to be able to create an environment in which they are comfortable working and not be afraid to ask for help if they feel threatened by users or, in extreme cases, other staff members. Staff members supporting each other and being able to communicate in such moments is crucial for an effectively working team.
Effective management promotes a feeling of professionalism and greatly simplifies the process of training new moderators. Important aspects that should always be covered when creating a training program are time management, deciding which staff are involved in training and why, effective communication, detailed content, and flexibility.
Choosing an appropriate approach. How do you want to introduce new members? What system did you conclude will work best for your server and staff team? Never be afraid to evolve an already existing plan into something that suits better for your current situation.
Deciding what topics to cover. What topics do you prioritize over others and think require more time investment? How are your moderators handling the learning process? Might there be anything in need of adjustment?
Selecting voluntary participants. Having a reliable team behind you to assist with training is of the utmost importance. Everyone involved should be aware of what exactly they need to teach and how to approach the training within a reasonable time frame.
Developing a timetable. Find common ground between everyone involved, including both the mentors and trainees, and settle on when to teach what topic. Recommended tools for easily managing bigger teams are Google Docs and similar, but bots and self-made timetables will suffice, too. Try to keep it simple and easily accessible for everyone involved.
Communication. Who responds to whom? Decide on how and when mentors report to the head of staff or other higher ups. Documenting helps with ensuring everyone is aware of their part and planning how to proceed further.
Flexibility. No matter how carefully you plan everything out, it can always happen that something doesn’t work according to plan. In such situations you need to be able to react spontaneously and be flexible, such as pitching in for someone who is unable to attend due to a last minute shift in schedule.
How to Deliver Effective Training
Before conducting training, you should evaluate what can make your training most effective. Below you will find some best practices to set up a training program and be able to “train the trainers”.
When you are giving a training, it is important to plan it properly. What are the most important topics to cover? What do you want new moderators to know after the training? How much time are you planning to spend on training? Some people struggle to focus for a longer period of time, so if your training takes over two hours, you should consider breaking it up into multiple shorter sessions. Make sure to accommodate any accessibility issues and provide notes for those who were unable to be present.
Each training should have objectives. You can create an outline of your training where you write down the topics you will cover and the objective of each topic. For example, if you are training moderators to use Modmail, your objective is that new moderators are able to handle Modmail tickets and use the commands available to them. For each topic, also write down how you want to deliver the training: are you going to give a demonstration? Will you use illustrations? Will you offer new moderators the chance to practice during your training and get hands-on experience? Publicly sharing this outline with trainees can help keep you on topic and allow them to come prepared with questions.
After creating your training, make sure to practice it at least once. First, go through the entire training yourself to verify that everything you consider important is covered. You can then give your training to someone on your team to catch anything you missed and to check whether your estimated time is accurate. If possible, go through the material with someone who is unfamiliar with moderation. Adjust your training appropriately and you are good to go!
For every training session, it is very important to have interaction with your trainees. People lose their attention after 15-30 minutes, so your training should include a discussion, practice, or some other sort of interaction. Some of these interactive methods are covered below. Interaction with your trainees is also important because it lets you verify whether your objectives are being properly conveyed.
Make sure there is a break once every one to two hours so trainees can focus on your training without feeling overwhelmed.
Every training can consist of a combination of different training methods. These include, but are not limited to, lectures, quick exercises, group discussions, practice runs, quizzes, videos, demonstrations, and more. When planning your training, you can write down which method would work best to convey each objective to your trainees. A group discussion is better suited to discussing moderation cases, while a demonstration with hands-on practice works better for showing how Modmail bots work.
Try mixing up multiple methods within a training to keep it fresh and keep people from losing their attention.
Preparation is key when you are giving any sort of training. Have quick notes ready with keywords that you are going to use during your training, so you do not lose track of where you are and what objective you are trying to convey. If you need any material such as a presentation, paper and pencil, example cases and such, prepare them beforehand so you do not lose time during your training setting these up. If you use any material, test them beforehand.
Don’t forget to clear your schedule beforehand and have some water ready if your training is mostly conducted over a voice chat or call. It is important to sleep well the night before and feel confident. This will let you remain focused and have an uninterrupted training session.
Make sure you are ready at least five minutes before the training starts to welcome arriving trainees and as a final check whether or not everything is ready and good to go. If your training takes place in a voice chat or call, test your microphone and if you have a wireless headset, make sure it is charged up beforehand!
When your training is done, finish with an exercise, group discussion, or something else that is interactive and fun to do. Ideally, this should summarize the entire training. Have some room at the end to answer any questions your trainees might have. Don’t forget to thank everyone who participated for their active contributions!
After each training, write down what went well and what could be improved. You can ask your trainees, after they have had some experience as moderators, what they missed during training that should be covered next time, as well as what information they did not find useful. You can then adjust your training for next time!
Having a training or onboarding process in place is very important to help new moderators get accustomed to moderation culture, bot commands, the server rules, and your moderation guidelines. There are several training methods, including buddy or mentor systems as well as exercises and demonstrations. A training document should outline the most important information new moderators need without overwhelming them with more information than they can absorb.
To have efficient training in place, there are some very important aspects to consider, such as your preparation, goals, how you will carry out the training of your new moderators, and planning. Considering all of these things when creating your training process will make onboarding as informational and effective as possible for you and your new recruits!
Which languages to choose depends entirely on the type of server you run. For a community server, it is recommended to adjust it to the community’s needs. Usually, the most commonly featured languages outside of English are German, French, Spanish, Turkish, and Russian. You can determine your featured languages through community analysis, server insights (if your server has them), and demand. What nationalities are represented the most in your server? Which users struggle the most with English upon joining the server, and if they’re struggling, what language do they usually communicate in? In addition to that, you can consider running a survey every time you and your staff feel ready to expand into a new language.
On the other hand, if you make a server for a brand or company, it is recommended to go by your target audience. Introducing features like internationalization requires carefully planned but steady steps.
If your server was originally English only, it’s not recommended to expand into too many languages at once for a variety of reasons. Agree on one language (preferably the most popular one in your server) and slowly add on from there, if the need presents itself. Once you figure out a pattern and a stable structure on how to approach the expansion, you can add more sections for different languages.
As your server grows, so does Discord. It is vital to keep in mind that your server can get a lot of incoming users who have never really used Discord before, so make sure your server’s design is user-friendly, easily accessible, and has a clear structure. Also, remember that the more languages you feature on your server, the more text and voice channels will appear for your staff.
It is recommended to hoist the most important text channels to the top of your server. That may include announcements, server-info, rules, and optionally giveaways and more. To ensure global accessibility, translate important parts like the rules and server information into every language you feature alongside your main language.
Additionally, you can add a server guide for easier coordination. Both new users and your staff will benefit from that. It should cover a list of accessible text and voice channels as well as a short but detailed description of each.
Furthermore, try to integrate every additional language in a way that utilizes it to its utmost potential. You can do so in the form of a feedback and suggestion channel in each language’s category. This lets you hear directly from the users that would benefit from that additional language chat the most. Consider having staff that are dedicated to interacting with each section instead of focusing all their attention on the global chat. Having these additional language chats makes it so that your international users don’t feel less validated only because they don’t want to or can’t communicate in the global chat.
You can implement that same line of thinking when organizing server events. Consider alternating between server-wide and international events as demand allows. For the latter, analyze your server’s activity for each region and host it at the time with the most community engagement for those users for the best results. You can apply the same system to the voice channels by creating a few global voice chats and then adding language specific channels. Your staff should only allow one common language to be spoken in the global chats and redirect every other language into their appropriate voice channels.
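The region-by-region activity analysis mentioned above can be sketched as a small script. The sample data and region handling here are illustrative assumptions; real numbers would come from your server insights or a logging bot.

```python
from collections import Counter

def peak_hour(message_hours: list[int]) -> int:
    """Return the hour of day (0-23) with the most messages.

    In practice, message_hours would come from your server insights or
    a logging bot, pre-converted to the region's local time.
    """
    return Counter(message_hours).most_common(1)[0][0]

# Hypothetical sample: local hours at which messages were posted in
# one language section over a week.
sample = [20, 21, 21, 3, 21, 20, 19]
print(f"Best event slot: {peak_hour(sample)}:00 local time")
```

Running the same tally per language section gives you a defensible starting time for each regional event.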
If you want to feature multiple languages equally, you can effectively combine multiple Discord servers into one. For the first half, it will be the same as before: have at least one international general-chat before you split the server into each language. That way every user is able to communicate with the entirety of the server instead of just their language-specific section. Let users choose their language via a reaction or verification, with the option to opt out and change regions. You can summarize the most important aspects in a common text channel, with the text translated into every featured language.
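The reaction-based language selection with opt-out can be sketched as a small piece of role logic. This is a minimal sketch under assumptions: the flag emoji, role names, and toggle-to-opt-out behavior are illustrative, and a real implementation would call your bot library's role methods with the result.

```python
# Hypothetical emoji-to-role mapping for the language selection
# reaction described above; adjust to the languages you feature.
LANGUAGE_ROLES = {
    "🇩🇪": "German",
    "🇫🇷": "French",
    "🇪🇸": "Spanish",
}

def toggle_language_role(current_roles: set[str], emoji: str) -> set[str]:
    """Return the user's updated role set after reacting with `emoji`.

    Reacting with a flag grants that language's role; reacting again
    removes it, which doubles as the opt-out / change-region mechanism.
    """
    role = LANGUAGE_ROLES.get(emoji)
    if role is None:
        return current_roles           # not a language reaction; ignore
    if role in current_roles:
        return current_roles - {role}  # opt out
    return current_roles | {role}      # opt in
```

Keeping the mapping in one place also makes it easy to add a new language section later without touching the toggle logic.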
The simplest way to manage the split server is by having the same channels and categories, but in different languages. This will heavily depend on the type of server you moderate, but recommended channels for every section are the ones you would usually include in a monolingual server. These channels should include things like rules, announcements, general-chat, bot-commands, media, optionally a looking-for-game channel, and more. Take care to ensure you have appropriate translation for each channel.
Moreover, community events can be a lot more diverse. You can host a variety of events for each language as well as let all of them participate in shared events. Take the time to analyze each nation’s activity and also take holidays or national days into account.
Internationalizing your server may mean that your moderation team grows to be larger than expected, so to make it easy for users to contact the appropriate staff you can separate moderators by nicknames, colors, or role hoist.
Nicknames: If you want to have the moderation team equally displayed throughout the server, you can arrange the moderators by nicknames. Simply add the language they moderate at the beginning of their nickname so they are alphabetically sorted below each other. Users will have an easier time recognizing the appropriate moderators, but with a large staff, this might end up looking a little busy. Some moderators tend to change their nicknames frequently and may not be big fans of having the tag in front of their name, either.
Colors: It is advised to have the colors of each moderator role in the same shade with a slight but still visible difference. This helps to differentiate moderators and regular users will be able to tell the difference immediately. Remember that newer users might not check each role and just contact the first seemingly online moderator they see.
Role hoist: If you display only the accompanying moderator role for each language, ensure that you have a common, non-displayed role for all moderators so one team is not less validated than the others. Users should definitely be able to contact the appropriate staff and ping the correct moderators to help them out with issues in chat. However, when displaying a lot of teams with many members each, some of them might be pushed off the screen. You run the risk that moderators displayed below the other teams seem less “important and validated” to the community, but the shared staff role should fix that problem easily.
While moderating a multilingual server it is crucial to have moderators that are native speakers or at least fluent in each language that you decide to expand to. Having moderators that are native speakers makes it so that they not only understand the textbook definitions of what is being said but understand the cultural contexts that may come with international chats. However, these moderators should also be accomplished and capable with the other responsibilities of being a mod that lie outside of just being fluent.
While community moderators should follow the basic tasks (moderate the chat, be accessible for questions, guide users), things in multilingual servers can be a bit more advanced. Moderators need to identify and address inappropriate behavior towards other cultures in an authoritative, but instructive way. Furthermore, they need to free the chat from toxic behavior and control discussions about sensitive topics.
In addition to that, they have to be open to cultural questions from users who chose a language they are not yet fluent in. Both the user and staff need to work their way around language barriers together. Moderators will be required to help users as well as resolve disputes, often at the same time, and they’ll have to rely on their communication skills to succeed in resolving both situations and any other issue that may require it. Without good communication skills, users won’t understand what you’re trying to convey to them; your tone should be assertive but tolerant.
Moderators can under no circumstances discriminate against users of other cultures or beliefs. It is their responsibility to create a civil, welcoming, and comfortable environment for all users. While you are always entitled to your own opinion, make sure to maintain a neutral stance in the public part of the server and try not to let your opinion cloud your judgement.
Managing several moderators from all around the world can turn out to be quite tricky. Organizing meetings and waiting for everyone’s approval for one date may be a real challenge to deal with. Getting the majority of the team on board could take days, especially with the difference in time zones.
One simple solution is to delegate some captains or representative moderators for each team. They will gather feedback, opinions and suggestions from the rest of the staff team and discuss them with the rest of staff during the meeting. That way it’s easier to organize staff meetings, but they can also directly inform the rest of their team who could not make it due to time conflicts. Depending on the size of the staff teams, you can appoint more than one moderator for that position.
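One practical difficulty in cross-region scheduling is simply seeing what a proposed meeting time means for each team. A minimal sketch using Python's zoneinfo; the city names and zones are hypothetical placeholders for wherever your staff actually live:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical regional teams and their IANA time zones.
TEAM_ZONES = {
    "Berlin": "Europe/Berlin",
    "New York": "America/New_York",
    "Tokyo": "Asia/Tokyo",
}

def local_meeting_times(utc_time: datetime) -> dict[str, str]:
    """Show a proposed UTC meeting time in each team's local time."""
    return {
        team: utc_time.astimezone(ZoneInfo(zone)).strftime("%a %H:%M")
        for team, zone in TEAM_ZONES.items()
    }

proposed = datetime(2023, 6, 10, 18, 0, tzinfo=ZoneInfo("UTC"))
for team, local in local_meeting_times(proposed).items():
    print(f"{team}: {local}")
```

Posting a table like this alongside the meeting proposal spares every representative from doing the conversion themselves, and makes impossible slots (3 AM for one region) obvious at a glance.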
When creating more than one staff role, make sure that everyone is comfortable with their position and their responsibilities. Misalignment can lead to misunderstanding later on, and it might create needless tension amongst the staff. Consider appointing one representative in each region that your server is branching out to. That will open the possibility to have regional staff meetings where language and culture specific issues and suggestions can be discussed without the need for an international meeting with the whole team.
While not every moderator will get the chance to get to know every other moderator from different regions in these larger moderation rosters, it is important that the team feels united regardless. Don’t leave it up to one team to tackle an issue; let them know that they can always ask for help from the rest of the staff.
Global Accessibility: Make sure that you have the bots available for every part of your server. That could either mean that you include them in the global section of your server, or you translate the commands and definitions in every language featured.
Moderation: Since not all moderators will be able to understand every single language, bots turn out to be very helpful for auto-moderation. Inform your fellow moderators about inappropriate words or phrases and ensure that you add them to the bot filter for easier moderation. Your list of banned or filtered words should contain common slurs not only in English, but also commonly banned words for every language that you offer.
Language Barriers: It may occur that users message the ModMail in poor English or in their native language. Find out the user's nationality and reply with a message in their native language that lets them know their request is being routed to an appropriately fluent moderator. To save these messages, simply create an extra text channel in the ModMail server and let moderators translate important phrases that might come in handy.
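The shared word filter can be sketched as a simple check that normalizes case and accents before comparing against a multilingual list. The placeholder words below are illustrative stand-ins, not a real filter list:

```python
import re
import unicodedata

# Hypothetical multilingual banned-word list; in practice this would
# hold the slurs and phrases your moderators report for each language.
BANNED_WORDS = {"badword", "schlechteswort", "motinterdit"}

def contains_banned_word(message: str) -> bool:
    """Check a message against the shared filter list.

    Normalizing case and stripping accents catches simple evasions
    such as "BadWörd" before the word-by-word comparison.
    """
    normalized = unicodedata.normalize("NFKD", message.casefold())
    normalized = "".join(c for c in normalized if not unicodedata.combining(c))
    words = re.findall(r"\w+", normalized)
    return any(word in BANNED_WORDS for word in words)
```

A dedicated moderation bot will offer more than this (regex rules, leetspeak handling, per-channel exemptions), but the principle of one shared, normalized list per language is the same.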
It's easy for people to underestimate the impact cultural differences can have. Culture influences values, rules, thought patterns, and perception.
That means that events happening in other countries may be viewed differently in each nation. News and social media don’t always portray the truth, which lets misinformation spread easily. Make sure to keep your staff updated about current situations so it’s easier for them to deal with discussions about sensitive topics and trolls. Communication is key, and the users of the server will massively benefit from it. While people are entitled to their own opinion and to ask for further information, make sure this happens in a calm and civil atmosphere under a moderator’s watch. If you as a moderator find that you’re ill equipped to talk about the topic, you should refrain from publicly voicing your opinion until you’re better informed. Conversely, you may want to refrain from allowing contentious topics like current events or politics in your server at all, which is something you should decide with your moderation team as a whole.
Making a server internationally available is a great idea and can be a boon to your community’s retention. It can be utilized by both communities and companies alike. But you have to be careful with your approach: internationalization is a deliberate and complicated process, and it should be treated as such. If it’s gone about in the wrong way, left uncompleted, or rushed, these international spaces could backfire. That may result in negative feedback, a disappointed community and staff, or deserted channels.
Make sure to inform your staff and community about every step you’re about to take, and give them a chance to voice their input. Feedback and suggestions from both your mods and your community will be essential to making sure this is the right fit for your server. Internationalization requires a lot of effort and prioritization in order to properly take care of many factors simultaneously, but if done right, it’s an unparalleled way to enrich your community.
An academic study interviewed dozens of moderators across multiple platforms and grouped moderation approaches into five different categories:
Being able to understand the perspective behind each of these approaches and then applying them to your own community as needed is a powerful ability as a community manager. This article will discuss in more detail what these five categories mean and how you can apply them within your own communities.
Moderators that nurture and support communities (nurturing-type moderators) focus on shaping the community and conversations that occur in the server among members to match their vision. The foundation for their moderation actions stem from their desire to keep the community positive and welcoming for everyone, not just long-time members. They seek to create a community with a good understanding of the rules that can then develop itself in a positive way over time.
These types of moderators may implement pre-screening of members or content in their communities by implementing a verification gate or using an automoderator to filter out low quality members or content and curate the conversations of the server to be better suited to their vision.
Although this passive behind-the-scenes guidance is one type of nurturing moderator, these types of moderators also often actively engage with the community as a “regular member.” For nurturing-type moderators, this engagement isn’t meant specifically to provide an example of rule-following behavior, but rather to encourage high-quality conversations on the server where members will naturally enjoy engaging with each other and the moderators as equals. They are leading by example.
While nurturing- and supporting-type moderators operate based upon their long-term vision for a community, moderators that are focused on overseeing and facilitating communities focus on short-term needs and the day-to-day interactions of community members. They are often involved in handling difficult scenarios and fostering a healthy community.
For example, these types of moderators will step in when there is conflict within the community and attempt to mediate between parties to resolve any misunderstandings and restore friendliness to the server. Depending on the issue, they may also refer to specific rules or community knowledge to assign validity to one viewpoint or to respectfully discredit the behavior of another. In both situations, moderators will attempt to elicit agreement from those involved about their judgment and resolve the conflict to earn the respect of their community members and restore order to the server.
Those in the overseeing and facilitating communities category may also take less involved approaches towards maintaining healthy day-to-day interaction among members, such as quickly making decisions to mute, kick, or ban someone that is causing an excessive amount of trouble rather than attempting to talk them down. They may also watch for bad behavior and report it to other moderators to step in and handle, or allow the community to self-regulate when possible rather than attempting to directly influence the conversation.
Where overseeing and facilitating community moderators emphasize interactive and communicative approaches to solving situations with community members, moderators who see themselves as fighting for communities heavily emphasize taking action and content removal rather than moderating via a two-way interaction. They may see advocating for their community members as part of their job and want to defend the community from those who would try to harm it. Oftentimes, the moderators themselves may have been on the receiving end of the problematic behavior in the past and desire to keep others in their community from having to deal with the same thing. This attitude is often the driver behind their no-nonsense approach to moderation while strictly enforcing the community’s rules and values, quickly working to remove hateful content and users acting in bad faith.
Moderators in this category are similar to the subset of moderators that view moderation from the overseeing and facilitating communities, specifically the ones that quickly remove those who are causing trouble. However, compared to the perspective that misbehavior stems from immaturity, moderators that fight for communities have a stronger focus on the content being posted in the community, rather than the intent behind it. In contrast to moderators in the overseeing and facilitating communities category, these moderators take a firmer stance in their moderation style and do not worry about complaints from users who have broken rules. Instead they accept that pushback on the difficult decisions they make is part of the moderation process.
Those that see themselves as governing and regulating communities see the moderation team as a form of governance and place great emphasis on the appropriate and desirable application of the community rules, often seeing the process for making moderation decisions as similar to a court system making decisions based on a set of community “laws.” They may also see themselves as representatives of the community or the moderation team and emphasize the need to create policies or enforce rules that benefit the community as a whole.
Moderators in this category may consciously run the community according to specific government principles, such as having a vote on community changes. However, they may also achieve consensus within the team about changes to the server without involving the community at large or even have one moderator make the final determination about community changes. This “final decision” power is usually exercised in terms of vetoing a proposed policy or issuing a ruling on an issue that is particularly contentious within the mod team or community. Such unilateral decision-making is rarely exercised, and the power to do so is usually granted only to very specific members of a team hierarchy, such as the server owner or administrative lead. Even so, moderators in this category find following procedure to be important and tend to involve others to some extent in making decisions about the community rather than acting on their own.
This tendency is also seen in the way that they approach rule enforcement. Moderators that see themselves as governing and regulating communities view the rules as if they were the laws of a country. They meticulously review situations that involve moderator intervention to determine which rule was broken and how it was broken while referring to similar past cases to see how those were handled. These moderators also tend to interpret the rules more strictly, according to the “letter of the law,” and attempt to leave no room for argument while building their “case” against potential offending users.
Moderators that see themselves as managing communities view moderation as a second job to be approached in a professional way. They pay particular attention to the way they interact with other members of the community moderation team as well as the moderation teams of other communities, and strive to represent the team positively to their community members. This type of moderator may appear more often as communities become very large and as there becomes a need for clearer, standard processes and division of responsibility between moderators in order to handle the workload.
Though this metaphor focuses more on moderator team dynamics than relationships between moderators and users, it can also shape the way moderators approach interactions with users. Managing-type moderators are more likely to be able to point users toward written rules, guidelines, or processes when they have questions. Managing-type moderators are also much less likely to make “on-the-fly” decisions about new issues that come up. Instead, they will document the issue and post about it in the proper place, such as a private moderator channel, so it can be discussed and a new process can be created if needed. This approach also makes it easier to be transparent with users about decision making. When there are established, consistent processes in place for handling issues, users are less likely to feel that decisions are random or arbitrary.
Another strength of this approach is evident in efficient on-boarding processes. When a community has clear processes for documenting, discussing, and handling different situations, adding new moderators to the team is much easier because there is already a set of written instructions for how they should do their job. This professional approach to moderation can also help moderators when they are attempting to form partnerships or make connections with other servers. An organized moderation team is much more likely to make a good impression with potential partners. If you want to learn more about managing moderation teams, click here.
As you read through this article, you may have found that some moderation category descriptions resonated with you more than others. The more experience you have moderating, the wider the variety of moderation approaches you’ll implement. Rather than trying to find a single “best” approach from among these categories, it’s better to consider your overall balance in using them and how often you consider moderation issues from each perspective. For example, you can nurture and support a community by controlling how members arrive at your server and curating the content of your informational channels to guide conversation, while also managing and overseeing the interactions of honest, well-intentioned community members and quickly banning those who seek to actively harm your community.
It’s perfectly natural that each person on your moderation team will have an approach that comes easier to them than the others and no category is superior to another. Making sure all moderation categories are represented in your moderation team helps to ensure a well-rounded staff that values differing opinions. Even just understanding each of these frameworks is an important component of maintaining a successful community. Now that you understand these different approaches, you can consciously apply them as needed so that your community can continue to thrive!
At some point during their Twitch careers, most streamers will ask the question, “Is having a Discord for my community necessary?” Most streamers are already struggling enough with configuring their overlays, cultivating relationships with other streamers, and taking the time to edit or design future content. Will working with yet another platform really help move my streaming career forward, or is it just another item on the endless list of distractions?
Most of the streamers who ask this question fail to understand how Discord fits into their high-level plan for scaling their brand and community. One of the biggest misconceptions is that Twitch can be used for every step of growing your channel, brand, and community. Twitch is only one of many platforms required for success. While Twitch is well-known for engaging and monetizing your active community base, it struggles with content discovery, data collection, and long term retention. Twitch should instead be viewed as a tool to engage and connect with an already-existing community.
The image above categorizes content platforms into three primary groups: Discovery, Engagement, and Activation. Content platforms that focus on Discovery are specifically designed to spread new, persistent content that can be viewed later. These platforms are ideal for finding new community members and fans to engage with by inviting them to join another more personalized platform, such as a fansite, email list, or Discord community. It is through this persistent community that you can advertise ways to support and engage directly with the primary content creators, such as supporting through Patreon or viewing/subscribing on Twitch.
The platforms above can obviously be used for different purposes, but they’re segmented as such because of how their primary business models work. Discovery platforms run on ads and focus on getting you to consume more content. Engagement platforms are more personalized, with content not designed for larger audiences, and frequently encourage users to take action. Activation platforms are designed around monthly or direct recurring payment models that provide an enhanced or personalized version of Engagement content.
With a better understanding of the primary purposes of each platform, you can begin to design your user experience in a way that provides new community members an easy and direct way to participate in and show dedication to the community. Regardless of how your Twitch viewers discover your channel or content, directing them to a centralized community hub will allow them to share their appreciation with others, and go from being an individual fan to being part of a community.
To get the most out of Discord and encourage members to join your stream, spend time with your community before and after each livestream. Before each stream, spend 15 minutes chatting with your active Discord members to ask them about their day or receive feedback about your content. As you begin your stream, post announcements on your social media platforms and use a bot to alert your Discord community that you’ve gone live. Moderators should continue to oversee discussions in your Discord, but encourage members to watch the livestream as it occurs. Bots and scripts can also be used to remind users to join the Discord community if they haven’t already, as well as to follow you on social media for updates and alerts. Finally, as your stream is ending, remind the community that you’ll be available in Discord for a short time to discuss the stream and answer any questions. Some streamers will also include special events for high-ranking members or subscribers, which can help encourage collaboration and active use of the server.
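The go-live alert can be sketched as a webhook call. Discord webhooks accept a JSON body with a "content" field; the role ID below is a placeholder, and many communities simply use an off-the-shelf bot for this instead:

```python
import json

# Hypothetical "stream pings" role that members opt into.
STREAM_PING_ROLE_ID = 123456789012345678

def golive_payload(streamer: str, stream_url: str) -> str:
    """Build the JSON body a bot would POST to the announcement webhook."""
    content = (
        f"<@&{STREAM_PING_ROLE_ID}> {streamer} is now live! "
        f"Come hang out: {stream_url}"
    )
    return json.dumps({"content": content})

# Usage (WEBHOOK_URL comes from the channel's webhook settings):
# requests.post(WEBHOOK_URL, data=golive_payload("YourName", url),
#               headers={"Content-Type": "application/json"})
```

Pinging an opt-in role rather than @everyone keeps the alert useful without annoying members who never watch live.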
One of the most effective ways to engage directly with your community is to create content with them. This can either be done while livestreaming, or as a scheduled event that the community can participate in. Events need not necessarily be games, but can also include Q&A sessions or discussions about a topic of interest. Special precautions must be taken while playing with viewers as you’re live to prevent “stream sniping”, where community members attempt to disrupt your stream or game. Moderators should take note and be ready to handle disruptions while engaging directly with the community, but most situations can be prevented by assigning restrictions or requirements for being on stream with the host. For example, requiring participants to be subscribers to the community, or to apply for a spot if the game has a limited number of slots, can help ensure everybody understands and follows the rules.
As you grow your Discord community, members will have different expectations of what your server offers, how it’s managed and what the server is used for. For example, a community with 500 members may not be expected to have a full events calendar, but they would definitely expect that roles have been properly configured so that subscribers appear as a different username color with more permissions and access to restricted channels. Below is a simple checklist for the functionality your server should have:
We hope these guidelines have helped you to determine if creating a Discord server for your Twitch audience is the right fit for you. The most essential thing to keep in mind is that your Discord server is not just an appendage to your Twitch presence. It serves as its own entity and community that needs to be fostered with the same, if not more, personal attention and care in order for it to blossom. With the right mindset and effort, your Discord community can bolster your Twitch presence and bring your community closer together, for longer.
Before we discuss what to consider when introducing mental health channels to a community, it is important to understand the difference between mental health and a mental illness, and to be aware of the realities of what these terms entail.
Mental health and mental illness are strongly intertwined, but very different. People with good mental health can develop a mental illness, while those with no mental illness can have poor mental health. Mental health reflects our emotional, psychological, and social well-being. Though the topic of mental health is still quite taboo in certain circles, there is help available for those who wish to seek it. Mental health is a crucial aspect of a person’s existence, as it affects our actions, emotions, and thoughts. A healthy mental state enhances effectiveness and productivity in work, education, and interpersonal relationships.
A mental illness is a disorder of the mind that affects not just our thinking but also our energy, mood, and occasionally our conduct or behavior. Such a diagnosis may make it difficult to cope with the many obligations of life. Common mental illness diagnoses include anxiety disorders, such as panic disorder, post-traumatic stress disorder, obsessive-compulsive disorder, and specific phobias; mood disorders like depression and bipolar disorder; and psychotic disorders such as schizophrenia. An excellent guide for identifying mental health challenges and pathways to care can be found here.
Being informed and aware about mental health issues is a great start when you’re considering how it may affect or apply to your community and team. As an illness can manifest in a variety of different ways, it is vital to be ready and have clear guidelines on these complex topics when dealing with online communities, both for your staff internally and for your community externally. An often overlooked aspect of community moderation is how it can affect the mental health of your team. Moderator burnout stems from situations that harm your team’s mental health, and understanding the signs and how to deal with it is essential to building a strong online community led by a moderation team that can act as a strong support system not only for your community, but for each other.
There is a lot to consider when deciding whether to add a channel dedicated to discussing mental health topics within your Discord server. Firstly, you have to determine whether or not your community needs a mental health discussion channel or thread. Consider whether your team has seen much talk about mental health-related content in your community and whether such conversations have remained respectful even without your team’s active intervention. If these conversations have devolved into arguments or negativity, re-evaluate whether this type of channel will benefit those in your server, and think carefully about whether you have the resources to guarantee that it remains a safe space for all users.
An easy way to gauge whether or not a mental health channel would fit into your server is to look at the general topic of your server. If your Discord server is more focused on gaming, it may not make sense to allow something that drastically differs from your server’s purpose. Conversely, a server focusing on community building might be a safer space for a mental health channel. Ultimately, it’s very subjective and all about listening to not just the needs of your community but the tone of your server.
A lot goes into developing an excellent community and introducing a mental health channel is not easy as you have to consider how to best moderate it. This includes how you’ll be utilizing auto moderation and text filters in the channel as accidental flags in highly emotional situations can be more harmful than helpful. If you do decide to create a mental health channel or thread, it is important to be aware of all possible situations and to be flexible and prepared for unexpected scenarios.
Understanding that some moderators aren’t comfortable in these situations is important, and if you decide to allow such a channel on your server, you need to be able to vet your moderators and future helpers. As moderators aren’t mental health professionals, they shouldn’t be treated as such, and they should also be presented with the option to opt out of spaces such as this. Therefore, it is vitally important that you establish clear internal guidelines and procedures to set expectations for the channel at a manageable level.
Discussing mental health in a positive way can be challenging as it is a sensitive topic with a variety of experiences attached to it. However, there are some ways to make this an easier experience from a moderation standpoint, the most important of which is having clear server rules and moderator guidelines for how to act when these discussions are taking place and having clear escalation protocols for if things go south. Talking with your moderators about mental health and challenging times in life as well as facilitating breaks for them demonstrates that you care about their well-being. This is an integral part of establishing a healthy environment for your moderators and solid internal relationships.
The creation of specific mental health-related guidelines ensures that users remain respectful of each other and that conversations stay within the boundaries your community has designated.
Some suggestions for creating these guidelines include:
Escalation protocols are another important foundation for when mental health discussions take place. Some ideas to consider when implementing them:
Moderation of mental health channels can be a touchy subject. On one hand, moderators should be firm in removing harassers and users who are disrespecting others. On the other hand, moderators should try their best not to over-moderate. There must be a balance to ensure that these channels that have very sensitive subject matters within them are moderated with care. Some ways to do this are:
There is an incredible range of resources available for mental health support, everything from emergency phone numbers to suicide hotlines and LGBTQ+ helplines. The Find a Helpline resource consists of a list of global helplines and hotlines. We also recommend looking at the TWLOHA resource, a non-profit movement dedicated to presenting hope and finding help for people struggling with their mental health.
Discord has also partnered with Crisis Text Line, a non-profit that provides text-based volunteer support for people in crisis. You can learn more about our integration and partnership and how to use it on Discord here.
Some more examples of support sites include:
Not all communities are prepared to host a mental health channel or thread, and there is a lot to consider before adding one to your server. While allowing talk about mental health can be incredibly beneficial, it is just as important to realize that moderators aren’t professionals and should not be put in those positions if they aren’t comfortable with it. That’s why we advise that you carefully consider your current environment, your community’s purpose, and the implications that might arise if you choose to add one.
Whatever you decide is best for your community, we believe that education is key. It may be helpful to think about adding a channel or post that gives people access to professional support lines, such as Find a Helpline and the TWLOHA resource, as well as the other resources laid out in this article. Generally, facilitating healthy discussion around important topics of any kind is essential in helping to develop stronger communities online. Make sure that your community is a safe space for everyone who chooses to call it home!
Any relationship between two members of a community can be described as an interpersonal relationship. These relationships exist on a wide spectrum. As you participate in a community, you are most likely going to develop connections to varying degrees with other members of the community. As a moderator, this may even be expected as part of your duties to promote community engagement and healthy conversations. That’s perfectly normal, as it’s very natural for people who spend a lot of time communicating to develop closer ties to one another.
Every kind of relationship, from mere acquaintances to romantic partners, can occur in a Discord community, and every relationship you form as a moderator will carry its own unique challenges and responsibilities in order to ensure you are performing your duties to the best of your ability. Any kind of interpersonal relationship can create difficulty in moderation, but as the nature of the relationship changes, so too does the unconscious bias you may experience.
A friendship between a moderator and a member of the community is the least problematic type of interpersonal relationship, but as these friendships form it is still important to take notice and be aware of them. As a moderator, it is your duty to be available to everyone in the community, even people who you may never see as friends, so you must resist the temptation to devote more time and attention to the people you more easily connect with. If your biases toward your friends begin to show up in your moderation efforts, many more serious and harder-to-diagnose problems can arise. Feelings of “elitism” or “favoritism” can start to take hold, and disgruntled members may take advantage of your friendships to excuse or justify their own behavior, so take care to make sure that you remain impartial.
A friendship between a moderator and a member of the community that persists for a long period can evolve into a closer and more open relationship. These relationships are built on trust or shared experience, and can be more difficult to impartially manage than regular friendships or acquaintances. This kind of relationship could come from the fact that this person is someone you know from another server, in real life, or possibly even a family member. No matter what the scenario, the closeness of this kind of relationship makes it very difficult, sometimes impossible, to remove your own partiality from the equation. Special care must be taken to ensure you engage and listen to other moderators on your team when someone you are closely involved with is in question. When in doubt, it may be best to remove yourself from the situation entirely, which we will discuss in more detail later in the article.
A romantic relationship between a moderator and a member of the community can (and does!) happen. As is natural, if you meet someone who shares common interests and has an attractive personality, over time your relationship may progress into something more profound. Romantic relationships are certainly the most difficult to manage as a moderator. The saying holds true, especially in new romantic relationships, that you will see your significant other through “rose-tinted glasses” which tend to blind you from their potential flaws or wrongdoings.
Additionally, other members can very quickly see a budding relationship as an opportunity for a fellow member to grab power through the moderator they are romantically involved with. As a best practice, you should remove yourself from any moderation decisions involving a user that you are in a romantic relationship with. Failing to do so can and has directly caused the death of some communities, especially when the romantic partners are both on the same moderator team.
This type of one-sided interpersonal relationship is rare among moderators because of the connection a moderation team usually has to the content creator or personality that they moderate for. More commonly, such a relationship develops when a user watches a friendly moderator carrying out their daily duties and interacting with the server. However, this type of relationship requires an extra level of care and awareness, as it can quickly become toxic if not managed appropriately. Always be aware of these relationships, and consider their existence when making certain moderation decisions. The DMA has an article exclusively dedicated to parasocial relationships for further reading.
One thing to keep in mind when evaluating your relationships in your communities, regardless of their nature, is that relationships and connections played out in the server are most likely visible to other members of the community. When interacting with your friends, close friends, or even your partner in a space shared with other people, such as your server, members of the community may pick up on the fact that you have these relationships. As with any kind of community, feelings of exclusion or the perception of “in-groups” can arise, especially when it comes to relationships between a “regular” server member and a highly public and visible one like a moderator. A responsibility you have as a moderator is to take this dynamic into account, along with the effects it can have on your members and how they view you and your friendships. Make sure that your friendships and relationships are not creating an exclusionary atmosphere for other community members, where they feel it’s unwanted or difficult for them to contribute.
On the subject of “visibility”: a moderator, whether they are consistently conscious of it or not, is someone in the server who has power over other users in that space. It is not always easy to balance being part of a community and cultivating relationships and friendships with being conscious of your role within that community as a moderator and the influence that imbalance may have. This difference in responsibility and position can make relationships and connections with other users in the server more complicated. You may not be directly aware of it when you’re chatting with fellow server members, but there will be users in your community who are keenly aware of your status as a moderator. This scrutiny can affect how they approach becoming friends with you as well as how they view your own relationships with other server members. Always keep this dynamic in mind and be aware of how your position may affect not just how users interact with you but also how they interpret your relationships and conversations with other members.
Just as it is natural for these relationships to form, it is also human nature to unconsciously develop and act on a bias toward the people closest to you. As a moderator, that natural bias is something you must actively resist, and take conscious steps to avoid. What happens when the friend of a moderator has a bad day and doesn’t act in the spirit of the rules of the community? In an ideal scenario, the moderator’s response would be the same reasonable response that would be expected if the offending member were anyone else. Your response to these situations will have a profound impact on your community’s attitude toward you as a moderator, as showing favoritism will quickly evaporate the community’s trust in your ability to be impartial. Moderators are human, and for inexperienced and seasoned moderators alike, this kind of scenario can prove to be one of the most significant tests of their ability to manage conflict.
In preparing for this scenario, the most important tool in a moderator’s arsenal is self-awareness. It is the burden of a moderator that their commitment to the community comes above any interpersonal relationships that may form during time spent engaging with it. Being ever-mindful of your responsibility and role in a community can help temper the depth of the relationships that you build.
As a recommended best practice, moderators should be careful about building interpersonal relationships of depth (close or romantic relationships) in the communities they moderate, including with other moderators. The only guaranteed way for a moderator to remain impartial in upholding the rules for all members is to maintain such friendships exclusively outside their community, but this isn’t always reasonable for communities that you are closely involved in. Should you find yourself in a difficult scenario involving a member with whom you have a close interpersonal relationship, here are some best practices for managing the situation:
The first step in successfully managing a scenario that involves someone you have an interpersonal relationship with is to take stock of your own investment. How are you feeling? Are you calm and capable of making rational judgment? Is your gut reaction to jump to the defense of the member? Or is the opposite true - do you feel the need to be overly harsh in order to compensate for potential bias? Carefully self-evaluate before proceeding with any action. The wrong type of moderator response in a scenario like this can often exacerbate or distract from the actual issue at hand, and potentially weaken your community’s trust in your capabilities as a moderator.
If in the course of your self-evaluation you realize that you cannot positively answer any or all of these questions, it may be necessary for you to more seriously evaluate whether or not you need to make difficult decisions regarding your position as a moderator. If your interpersonal relationship is preventing you from fulfilling your duties as a moderator, you may need to consider either abdicating your role as a moderator or ending the relationship until circumstances improve. Neither option is easy or ideal, but making tough decisions for the health of the community is your primary responsibility as a moderator.
Once you’ve determined that you’re capable of proceeding with moderation, evaluate the scenario to identify what the problem is and whether it immediately needs to be addressed. If there is no immediate need to step in, as a best practice it is usually better to defer to another moderator whenever your personal relationships are involved. Contact another member of your moderation team to get a second opinion and some backup if necessary.
If immediate action is required, a concise and direct reference to the rules is usually sufficient to defuse the situation. Use your best judgment, but be aware that the likelihood of “rules lawyering” is higher with someone who trusts you or sees you as a friend in these scenarios because moderation action can be seen as a violation of that trust or relationship. Clearly and fairly indicating the grounds for you speaking up is crucial to prevent further issues from arising.
Additionally, be careful about what is discussed in private with the person involved in this scenario following any action. There is a higher likelihood of them contacting you via DM to talk about your decisions because of the level of trust that exists between you. As a best practice, it is usually best to avoid litigating the rules of the server with any member, especially a member with whom you have an interpersonal relationship. Politely excuse yourself, or if prudent, redirect the conversation by giving the member a place to productively resolve their own issue.
As with any moderation action, once taken it is best practice to leave a note for your team about what action was taken and why. Another period of self-evaluation is a good idea after any action is taken. Ask yourself, was the action taken in alignment with the rules of your community? Was it fair to both the offending member, as well as the other members of your community? Was your decision affected by your bias towards the offending member? If necessary or unclear, ask your teammates for their outside perspective.
Taking moderation action when the offending member is one with whom a moderator has an interpersonal relationship can be one of the most difficult scenarios that a moderator can find themselves in. Set yourself up for success as a moderator by tempering the type of relationships you build within your community and cultivating the ability to self-evaluate. The best tool available to a moderator in these scenarios is self-awareness and the ability to recognize when their own biases prevent them from acting fairly. Remember that moderation is a team sport, and that team is your most valuable resource in impartially upholding the rules and values of your community.
Content creation is one of the coolest aspects of a community! Even those who do not create content themselves can celebrate the passion and excitement that comes with sharing art. Artists shouldn’t be relegated to a generalized #media channel where all users are posting photos; consider instead giving them their own designated area in the server. This shows these users that moderators see their contributions to the community and appreciate what they are doing. This area can be a channel dedicated to sharing art and content, or even an entire channel category, depending on how your moderation team wishes to interact with your community’s creators and how active this part of your community may be. Listen to their needs and expand and modify this category as necessary.
When building out a content creation realm in a server, it is important to keep in mind that your moderation team may encounter some new situations that don't apply to the rest of the server. Of course, content creators are subject to the same laws of the land in place for the entire community, but there are some unique rules of the road to consider, including:
Plagiarism. This is the practice of taking someone else’s work and claiming it as your own. Plagiarizing another content creator should not be tolerated within any creative space. It should be highly discouraged and acted upon with moderator intervention if your community brings an accusation of plagiarism to your moderation team. As moderators, it is important to understand the difference between plagiarism and finding inspiration in someone else’s work. Tracing another creator’s artwork is the most common form of plagiarism, whereas being inspired by an original character to try out a new pose, color scheme, or scene featuring them is inspiration. While creators are often looking out for each other and willing to bring concerns about plagiarism to moderation teams, it is important to be able to look for it yourself by familiarizing yourself with your artists’ styles and reverse image searching images of concern to your team. Be sure you can explain to your community why plagiarism is harmful when these situations arise.
Managing Constructive Criticism vs. Hate. Your content creation channels are going to be accessible to your entire server. This is so that the entire fandom can celebrate together, but also to drive interest from users to support your content creators. This means that the average user can come in and comment on content. There is a line between constructive criticism and hate. Watch out for it as moderators and be prepared to intervene should anything cross the line into attacks or hate-filled commentary that would give content creation an unwelcoming atmosphere. Oftentimes in creative communities it is an unspoken rule that you should not give constructive criticism unless it is specifically asked for. The average user may not realize this and could accidentally offend an artist. As a moderator, it’s important to help artists receive constructive criticism when they ask for it while shielding them from trolls or baseless hate. Sharing content can be intimidating, so it is especially important to ensure that content creation channels remain positive and respectful environments. One way you can mitigate this issue is by making it clear in your rules that unless an artist specifically asks for constructive criticism, feedback of that nature is not allowed.
Bumping. Art bumping may occur in an art channel where artists feel their content isn’t easily viewed by enough people. This is essentially the act of media getting bumped up in chat from other people sharing their media at the same time or from chatter about other works. An accusation of bumping usually comes up when a creator feels their art isn’t being noticed, or if they believe someone they do not have a good relationship with is intentionally bumping their work. In this case, it’s important to defuse the situation and not allow any forms of bullying by de-escalating the conflict. Maintaining an environment where users respect everyone's work is necessary for the peace of mind of creators and consumers alike. You can also consider building out a channel category instead of a single channel which would allow for a channel dedicated to posting art and a separate one for discussion. You may contemplate a rule of not posting art within a certain time frame of another creator posting, but be cautioned that this can lead to over-moderation by your community.
NSFW content. If your server allows Not Safe for Work content, it is important that you create a specific channel for it that can be marked as an NSFW channel, separate from your regular content creation channels. In line with Discord’s policies, this prevents users from seeing the channel unless they agree to a prompt confirming that they are not a minor. It is also important to consider that the implementation of an NSFW channel disqualifies you from being a Partnered or Verified Discord server. Make sure to keep the expectations around SFW and NSFW content creation in line with those of your entire server, and offer to answer any questions in DMs if a creator thinks a piece may toe the boundaries you enforce.
Advertising Commissions. If you have a blanket ban on advertisement in your community, you may not want to make an exception to the rule here. However, if you decide to allow advertising commissions in your server, you are allowing more commissions to flow to your creators. Do not allow other users to beg for free art or try to guilt creators with open commissions into providing free content. It may be the case that your moderation team will have to enforce boundaries if someone who commissions a creator within your community doesn’t pay them or revokes payment. Conversely, if a creator requires payment up front and then does not deliver the work or refund the commissioner, moderators should intervene and no longer allow them to accept commissions from other members.
To be clear, you are not responsible for their financial disputes or business transactions. Ultimately, creators should look into their specific payment provider website for policy information on fraud and filing disputes, both of which are out of your control. Your job as a moderation team is protecting creators from scammers who make themselves known within your community. You’ve created this space to cater to creators and need them to know that users who take advantage of them and creators who take advantage of users are not welcome here.
Low Quality/Low Effort Art. Something your moderation team should consider is whether or not you will be moderating low quality or low effort art. Lower quality art can potentially create a divide with more experienced artists or diminish the overall quality of your artistic channels. Expectedly, this is a very subjective and divisive topic. Moderating “low quality” or “low effort” art can run the risk of upsetting younger users or creators who are at the very beginning of learning how to create. When considering moderating low quality art, be sure to display empathy and compassion to avoid coming off as inconsiderate or rude. Be honest and realistic in your descriptions and requirements for these art spaces so that users have a better idea of what is and isn’t acceptable, both content and quality wise. Other ways to healthily promote higher quality art include role systems, automatic pins, and weekly artist highlights, which will be discussed in further detail below.
Off Topic Art. As a team, think about whether you want your artist channels to be dedicated to the purpose of your server or if you want to also allow off topic content. Once this rule is decided, check that your moderation team is on the same page for enforcement and nudging should you decide not to allow off topic art.
There are several ways to keep your community’s content creators engaged, which helps to showcase how much your moderation team values their contributions to the server. Discord has several native features that can showcase your community’s talent in emojis, stickers, banners, and server icons. While a banner and a server icon are branding elements and thus rarely changed, generating emojis and stickers (especially from within your community) is a good way to bond, celebrate inside jokes with your community, and show some love for your creators. Oftentimes communities will hold yearly opportunities like emoji elections, where creators can submit emojis for consideration and the community votes on them as a whole.
Continued engagement with your content creators is also important. If you are engaging your community with generalized game or server events, examine whether you can engage your content creators in the same way with art events or monthly prompts as this promotes community bonding. If your community has a system to reward winners for their work or participation in events, work to instill the same kind of system for art adjacent events or prompts.
Finally, some communities may want to introduce a special role for content creators, especially those who are active and constantly contributing quality work. This will showcase artists in the server to the rest of the server. Do keep in mind, however, that unique role colors can lead to inadvertent exclusivity and a social hierarchy within the server. This can also have the effect of alienating artists who do not yet have the role, which is why you should be careful when thinking about whether you want to introduce this role to your community. If you decide to bring a specialized role into your server, establish clear criteria for users to qualify for it as well as clear rules for moderators to grant it. This ensures that your moderation team can avoid accidentally leaving someone out and hurting their feelings. Avoid bringing a role into your server if your server has had problems with role-related hierarchies in the past. Listen to your community’s needs and anticipate potential problems!
Content creators are an exciting subset of fandom that should be welcomed into your community! Ensure that they have their own area to share all forms of content in, whether it be a channel or an entire channel category. Be aware that content creation areas often come with unique rule considerations that you may not have encountered previously in the daily moderation of your server. Talk to your moderation team about everything before launching this channel or category so that you are all on the same wavelength about enforcement before jumping in. Continuously engage your creators and involve them in the artistic aspects of the server, such as emoji and sticker creation. Art can bring people together, and having a healthy artistic space within your community will provide a new way for your community to bond and celebrate your fandom!
Let’s start with the obvious question: what is Patreon? In case you are unfamiliar with it, Patreon is a subscription service for content creators, similar to YouTube Memberships and Twitch Subscriptions. Supporters (hereafter referred to as “patrons”) pay a monthly fee to access content from the creators they support. This private content can take many forms, including text posts, polls, videos, images, merchandise, and more. The possibilities are truly endless, and you, as a content creator, have control over how much value you place on the content you produce and distribute over Patreon.
Patreon also uses tiered memberships based upon monetary commitment. For example, let’s say you are a comic book creator who can produce an entire comic in a month. You could charge patrons $5 to access works in progress, snippets of the story you’re writing, or behind-the-scenes streams of you working on an anticipated page or panel. At the next tier, $10 patrons could gain access to the whole comic as a reward, and maybe even to polls that help you decide what to do next. Further up, a $25 reward could give patrons complete access to your entire library, or a physical edition of your comic book shipped to them! The ideas are endless when it comes to what kind of rewards you can offer your patrons.
In fact, one of the new ways to explore rewards systems for patrons is via Discord.
For content creators that use platforms like Patreon, providing rewards to patrons is an absolute necessity because it directly benefits those who choose to support you. One of the simpler rewards creators can provide is a Discord reward system automatically managed by the Patreon Bot. For example, you can assign one or more roles to patrons in your Discord server based upon their subscription tier on Patreon. Once they have their designated roles, you can utilize channel permissions to provide your supporters with exclusive access to parts of your server like hidden channels, or perhaps even hoist them in the sidebar of your Discord server as an added bonus. This gives them closer access to you as a creator, as well as to your content and your community. This, in turn, can make them feel appreciated as supporters and lets them see the direct impact of their support on your content.
The first step to setting up Discord Rewards on Patreon is to have a Patreon account. Once you’ve started building your Patreon page (or have one completed), you’ll need to head into your “Tiers” section of your page.
Once in the “Tiers” section, pick the tier you’d like to grant access to rewards in your Discord server. We recommend making this the lowest tier at which patrons receive a special role in your (presumably public) Discord server. Once you’ve selected your chosen tier, you’ll need to connect your Patreon to Discord.
Now, let’s make this reward a reality with the following steps:
Please note, if you created the server that you plan on adding to your Patreon, you’ll have no trouble finding it in the server list mentioned in Step 4. If a friend, moderator, or maybe even a community manager created your server, you might not see it in this list. Make sure you have the Manage Server permission in order to see it. If you don’t, double check with the server owner and have them add it to one of your roles in order to successfully connect to Patreon.
With your Patreon page successfully linked with your Discord server, you may notice this message in Patreon:
To finalize reward set-up, you need to create the actual roles. We recommend creating a different role for each of your Patreon tiers and distinguishing the roles by naming them after the tier’s name or the amount of money pledged by patrons of that tier. For example, if you have a $50 per month Patreon tier named Top Contributor, you could name the corresponding reward role either “$50” or “Top Contributor.” This usually works out best for future rewards, because you can easily distinguish followers who send $5 a month from those sending $50 a month, and change role permissions accordingly if you are offering them different levels of Discord access.
Remember that it is important to keep the Patreon Bot’s role above all the other roles you make for this purpose so it can help manage them.
Now that your tier roles are in the server, there should be no more red warning text on the Patreon Tier Creation page. If there is, give it a quick refresh. The disappearance of this text means that you can now check the box “Gives patrons access to selected Discord roles.” If you are using multiple roles for each tier of your Patreon for organization in your server, make sure they’re all added. After setting up the channels and permissions described in the next step, double check that the correct role in your Discord server is associated with the proper tier on your Patreon page.
Success! You’ve completed the Patreon and Discord integration! Now let’s establish accessibility levels.
At this point, you’ll want to build out a structure for your server for these new tier roles. Head back to the channel list and decide how you’d like to reward your different tiers. Do you want each tier to have its own channel category, or do you want one channel category with different channels designated to different tiers? Make sure to give each category and/or channel the permissions associated with each role that you want to have access to that area.
We have an example based upon the channel list image below that shows some of our recommendations in action. This example utilizes one channel category for all patrons, but gives certain patron tiers different levels of access.
You’ll notice Patreon allows you to assign users multiple roles when they subscribe to a tier. This makes your private channels and categories easier to manage across the different role tiers. In our provided example you’ll see that all patrons have access to an exclusive general text and voice channel to talk to each other, regardless of tier. However, different tier levels are given access to different additional perks, such as sharing social media links for the creator to follow back, polls for future content, and a VIP text and voice chat to better talk to the creator they’re supporting.
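As a rough illustration of this layered access model, here is a minimal Python sketch of how tier access can be reasoned about. The tier names and channel names below are hypothetical placeholders, not part of Patreon’s or Discord’s actual configuration; the sketch simply assumes that each higher tier inherits every perk of the tiers below it:

```python
# Hypothetical tiers, ordered lowest to highest commitment.
TIERS = ["Supporter ($5)", "Top Contributor ($50)"]

# Channels newly unlocked at each tier (names are illustrative only).
TIER_CHANNELS = {
    "Supporter ($5)": ["#patron-general", "#patron-voice"],
    "Top Contributor ($50)": ["#content-polls", "#vip-chat"],
}

def channels_for(tier: str) -> list[str]:
    """Return every channel a patron at `tier` can access,
    assuming higher tiers inherit all lower-tier perks."""
    unlocked = []
    for t in TIERS[: TIERS.index(tier) + 1]:
        unlocked.extend(TIER_CHANNELS[t])
    return unlocked

print(channels_for("Supporter ($5)"))
print(channels_for("Top Contributor ($50)"))
```

In Discord itself this mapping is implemented with channel permission overwrites per role rather than code, but thinking of it as a cumulative lookup like this can help you audit which role should see which channel before you start clicking through permission menus.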
Congratulations on launching your new and/or updated Patreon page! The beauty of this system is that you can continue to create new roles and channel permissions to best serve your community as you continue to grow. Now that you’re done with the basic steps outlined in this guide, all you have to do is edit/add/drop/shift roles around as needed to ensure you are making the most of your Discord Rewards patron program.
Patreon rewards give your fans and supporters exclusives in exchange for supporting your work. Having this automated system to handle and manage your supporters means you can spend more time making new content and less time manually managing roles whenever new patrons sign up. It also makes managing permissions much simpler, because you have much more control over which roles a patron is assigned based on their subscription tier.
Starting a new job, position, or activity will always be exciting. Most moderators are eager to help a community that they love, and that they are an active part of, grow and prosper. You spend a lot of time there already, so why would you not want to do your part in helping that community be successful? However, as time passes, interests can change and initial enthusiasm can wane. As a moderator, you might find yourself spending time in other communities, or you realize your real-world schooling and work take priority over moderating, so anxiety can build as you try to juggle all of your responsibilities. Maybe your mental health is being affected by spending many hours a day on the internet, dealing with trolls and people who simply want to cause trouble. Dealing with negativity and conflict day in and day out can mentally drain you. When moderating begins to feel like a chore, as opposed to a hobby, that’s when moderator burnout may have set in.
Burnout is the emotional, mental, and physical exhaustion you feel after a prolonged period of stress brought on by certain activities. Spending too much time in a stressful environment can easily lead to feelings of exhaustion, feeling distant from the activity or task at hand, or simply just having negative feelings when thinking about doing the activity or task. These negative effects on your body and mind eat away at you until you feel like you are at your limit and just do not have enough energy and motivation in you anymore to continue moderating. Sometimes burnout makes it feel like the only thing that can make you happy is to stop being a moderator completely.
One of the most common signs of moderator burnout is noticing yourself becoming less and less active when it comes to moderating, as well as less active in the server itself. Moderating the server has begun to feel like a chore: something you feel forced to do, knowing it needs to be done, with no joy attached to the task at hand. Time seems to move slowly when you are moderating as you constantly check the clock, hoping that an hour of your time is enough on the server. Your time spent on the server dwindles as the days go by until you have either stepped down or completely withdrawn from any activity in the server, moderation or otherwise.
A growing lack of participation is another sign. If you know yourself well enough, you can probably tell when something is bothering you. Perhaps on any normal day you are social and engage with your fellow team members as well as regular server members, but recently you’re only chatting in public channels to give out the occasional public warning. You notice that you only really check the staff channels when there is a ping. Your account may be in the server and your name on the member list, but you’re no longer an active community member. At this point burnout has set in. There may be the urge to come in every day and give 100%, but then you run the risk of giving too much, too quickly. You start to dread the amount of work necessary to do your part and eventually start to taper off.
You might notice yourself making more mistakes than normal. Frustration is another part of burnout that can affect the mental aspects of moderating. Feeling like you are making too many mistakes, or are not doing as much as another moderator, is a hard thing to pin down. Feelings of inadequacy may lead to reprimanding yourself internally, being your own worst critic, finding yourself in a rut, and thinking everyone in the server is being difficult. You know you are making mistakes, and others on your team see it too, thinking they are helping by giving you constructive criticism. A lack of accomplishment makes you frustrated, especially in moderating. Nothing you do feels like it is being done right; add in the difficulties of dealing with server members, and frustration kicks in. When you tell members of the server to stop doing something and they continue to post against the rules, you begin to wonder if you matter or are even making a difference.
Burnout can easily affect your attitude. Moderating tests your patience, as some members of the server will purposefully push the line to see what they can get away with. As a new moderator, you might not be as strict, since you want the community to not only respect you but to like you. As time passes, your patience may begin to run thin and you put up with a lot less while getting more irritated. Your attitude shifts, and you can become bitter about what you have to deal with when moderating a server.
After taking time to look at the signs of moderator burnout, it is important to know how to avoid it so you and your team can find your groove again and remember what it feels like to enjoy being not only a moderator but a community member.
Everything is healthy in moderation, including, well, moderation. Knowing when you need breaks, and encouraging yourself as well as other moderators to take them, can really help with mounting stress levels. Moderating takes a toll on your mental health, so being able to step away and catch your breath can really help reset your focus. Reach out to other members on your team who can help with moderator duties and shoulder the workload, so those experiencing burnout can feel that it is okay to step away. A big part of working on a team is being honest with each other when things are good and when they are not so good. You may be surprised at how eager your team members are to prop you up when you’re feeling low. Remember: you’re not alone in this, so feel free to take those breaks and be assured that the server is not on fire and that there are eyes other than yours to share the workload. Offline life and your overall health should always come before any aspect of online life.
Make sure you are not taking on too much, but also have enough to do. This can be a tricky balance. Depending on your moderation experience and skills, it can be hard to determine what your workload capabilities are, especially as offline life changes. Be upfront and honest about what your team expects from you, but also let them know when you are feeling overwhelmed so they can help. Always keep an open line of communication so you can find ways to help yourself, as well as the others you moderate with. Something that has worked on larger servers, and that you might incorporate into your own, is a channel sign-up list: moderators sign up for the channels they enjoy moderating so that no one feels like they have to watch too many channels at one time. Delegating work makes it seem more manageable and less daunting. Encourage yourself and your fellow moderators to try new channels after a couple of weeks to change everyone’s scenery, as well as to give team members the chance to interact with server members who might only hang out in channels they don’t normally moderate. As a moderator, you might find yourself wanting to do more and seek to add to your responsibilities. Suggesting community events and helping organize and run them makes for a great change of pace from the normal moderating duty of watching chat. Events are great for bringing regular members together with moderators. It is a fun task that can bring activity levels up and spread some excitement.
Create an environment that is fun to moderate in. Having or suggesting a staff channel where you and your fellow moderators can be yourselves or vent is a great way to relieve stress and help your team get to know one another. Ask questions every day to encourage discussion and communication across the entire team. Building a team that works well together helps with communication between fellow moderators, so they can express how everyone is feeling and seek advice on stresses in the community. You can get to know each other’s personal lives and what else might be causing outside stress. Getting to know each other’s personalities can help determine whether someone might easily burn out or is just more introverted than other moderators on your team. Holding events, such as game nights where you all play a game online together, can really help you and other moderators feel at ease, bring enjoyment to everyone, and help everyone reset for the next day of moderating.
One of the biggest, and perhaps simplest, things you can do to help with moderator burnout is just being thankful. Typing the two little words of “thank you” can go a long way. As a moderator, you spend your free time helping the server, so let your fellow moderators know you appreciate that they chose this as their hobby.
To that end, specific positive feedback is one of the best ways to let someone know that they did a good job and exactly what it was that they did well. By being specific about what you’re thanking your moderators for, you’re letting them know that their hard work is seen, recognized, and valued. Recognize when they put in a lot of hours on the server and are there on a day-to-day basis. Finding ways to reward them, whether through gifting Nitro or a special recognition in the server, can be really fulfilling for them. It reassures them that they are an important part of the server and are making a difference when helping. Be gracious with your words and remind them that they are in this with you. It starts a chain reaction, and you will see moderators thanking other moderators for their hard work.
Moderator burnout can happen to anyone at any time. It is important to understand what it is and how you can help. Whether you are an owner, administrator, or another moderator, it is important to support your team and look for the signs of burnout so you can suggest ways that might help them in how they are feeling. Always have an open line of communication with your team that fosters honesty. Encourage them or yourself to step away when feeling stressed or overwhelmed, and thank each other for helping in the server. Servers are a lot harder to run when doing it alone; moderators that are excited to be there are an important part of making sure things operate smoothly.
Because you are dealing with young people, family-friendly servers may become what many refer to as Home Servers. These are online communities perceived to be safe spaces that people come back to time and again, like a home. Thus, such servers may attract users for different purposes than, for example, a community meant for college students might. The moderators of family-friendly servers can be viewed as both authority figures and older siblings or friends who may be looked up to by their community in unique parasocial relationships. They will also have the unique experience of seeing many of their young users mature and grow along with the server.
When forming this kind of environment, it’s important to make sure your moderators are naturally empathetic to the unique situations that will arise within member reports as a result of interpersonal relationships amongst younger users.
During the foundational stages of server building make sure to keep ideas such as privacy concerns, text filters, appropriate topics of discussions, and rule continuity at the forefront of your mind. Consider how you will handle users who are above the traditional age range of family-friendly servers who may not wish to abide by these rules.
While privacy concerns exist on any platform, they are something to be especially aware of when moderating family-friendly spaces. Rules that cover privacy and internet safety may be more strictly enforced to protect younger server members from any harmful actors looking to take advantage of less experienced internet users. Oftentimes, these concerns can culminate in a blanket rule stating that sharing any personally identifying information in the server is not allowed. This goes beyond exact locations and can include full names, ages, and even face reveals.
Revealing exact locations and full names is generally frowned upon in most online communities because they can be used to find more information about a potential doxxing target. Forbidding users from sharing their age adds a level of privacy that protects younger members seeking community on the internet from being taken advantage of by those with malicious intentions. Younger users may not understand how risky age differences can be to navigate. Face reveals can be reverse-image searched to learn further information about users who aren’t protecting their online privacy, or even be stolen and used in various forms of bullying. As a result, many family-friendly servers find it best to limit all personally identifying information beyond first name and country if users aren’t choosing to utilize an online persona to protect themselves.
As moderators, it’s important to help educate young users about how to maintain confidentiality and use discretion when revealing personal information. This starts with recognizing and not clicking malicious links, and extends into conversations about reporting anything that makes them uncomfortable and protecting their private information. Keep an eye on how others interact with younger users and act accordingly, as a victim may not realize they are a victim in some situations.
Text filter implementation and determining which topics of discussion are appropriate for the server may be stricter than in the average community, as a result of building a safe space for younger users. There is generally no room for crudeness in any form. This goes beyond using text filters to monitor hate speech, which is against Discord’s Community Guidelines, and extends to curse words, innuendo, illegal substances and activities, gore, dark humor making light of serious situations, any negative and harmful rhetoric from within the fandom your community supports, and more.
Be sure to adapt your text filters as new situations and filter evasions arise within your community. It is often recommended to keep your blacklist private to avoid exposing harmful terminology to users. Additionally, make sure your text filters mirror the topics of discussion you aren’t allowing in your server and clearly outline those preferences in your server rules by noting that your environment is strictly SFW and any NSFW content is not allowed.
Automoderation can be utilized to mute or ban users who are engaging in problematic discussion around hate speech, and moderators can use their judgment on other blacklisted words as long as your moderation bot is set to log filtered words in moderation channels.
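To make the idea of filter evasion concrete, here is a small Python sketch of a blacklist check that normalizes common character substitutions before matching. The word list and substitution table are placeholders for illustration only, not a real blacklist or the behavior of any particular moderation bot:

```python
# Illustrative blacklist filter with basic evasion normalization.
# BLACKLIST and SUBSTITUTIONS below are hypothetical placeholders.
BLACKLIST = {"badword", "slur"}

# Common character swaps used to dodge naive filters.
SUBSTITUTIONS = str.maketrans(
    {"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"}
)

def normalize(word: str) -> str:
    """Lowercase the word and undo common leetspeak substitutions."""
    return word.lower().translate(SUBSTITUTIONS)

def contains_blacklisted(message: str) -> bool:
    """True if any word in the message matches the blacklist after normalization."""
    return any(normalize(w.strip(".,!?")) in BLACKLIST for w in message.split())

print(contains_blacklisted("that's a b4dw0rd!"))   # evasion is caught
print(contains_blacklisted("perfectly fine text"))  # clean message passes
```

Real moderation bots use far more sophisticated matching, but this sketch shows why a static word list alone is not enough and why filters need updating as new evasions appear in your community.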
As mentioned in the previous section, rule continuity is a big deal in spaces meant for young people, as it ensures equal enforcement across the board. If you filter certain content, discussing that topic should also be disallowed in your server. This goes for both general chatting and for various other forms of server usage: any place where there can be user-generated content. One such example is meme or art channels. While it’s a great idea to foster and encourage a healthy and positive artistic environment, content that is NSFW, hate speech, or politically motivated, and creations that lean into age-restricted content like alcohol consumption or gore, should be monitored and discouraged. If you have an active music channel set up, it might be worth keeping an eye on playlists to make sure content being monitored elsewhere doesn’t slip by here.
With the release of server avatars, it is less likely that rule-breaking content will find its way into your server via this route, but consider moderating the statuses, profiles, and profile pictures of active users if it does. It also helps to have your moderators lead by example and follow in their own profiles the same rules you are looking to enforce elsewhere. Continuity is key for equal and unbiased rule enforcement.
While family-friendly environments exist to be safe spaces for young people, you may find yourself also having users who are above your target age range. Their presence should not be discouraged, but it’s important to ensure that your rules and welcome messaging clearly define your space as family-friendly so that the expectations are clear for ANY user who joins the server. Most will likely accept the rules you’ve put in place and the space you are trying to cultivate. However, others may challenge certain restrictions that they find childish or assume don’t apply to them. Establishing language around why your community is meant to be family-friendly, and the importance of keeping it safe for everyone as a result, will be helpful if these conversations arise. Users who rebel against rules that they do not agree with should be actioned accordingly.
If you notice a large portion of your userbase seeking out your community but growing frustrated with its family-friendly restrictions, you can consider opening locked channels for mature users who can either opt in or show proof of their age. If you decide to do this, make sure you have a separate set of rules in place for these channels, as well as an identification vetting process, so that younger users are not exposed to content you don’t want them to see. This requires moderation that is different from the rest of your server, so it’s important to consider whether your moderation team can handle this locked channel and whether it aligns with your server’s purpose.
Family-friendly communities exist to welcome Discord’s youngest users into safe spaces with the protection of moderation teams. Home Servers are SFW communities that limit topics of discussion more than a generalized server would in order to cultivate that safe environment. As a result, there’s special consideration to be taken around user privacy and educating users about their privacy, expanded text filters, appropriate topics of discussion, and rule continuity when building out the rules and structure of such an environment. Make sure your team is prepared to handle older users who may find these rules constricting by explaining their importance to your community’s purpose.
The core foundation of a server on Discord is the community that populates it. Your community is what you engage with, protect, and grow over time. Engagement is important to focus on, but it’s just as important to make sure you are facilitating positive and welcoming engagement.
Positive engagement can mean a lot of things, but in this article, we will be referring to the way in which moderation can affect the culture of the server you are moderating. As moderators, your policies, knowledge of your community, and deductive skills influence the way your community engages with each other and with your team.
When you establish and nurture your community, you are growing a collective group of people who all enjoy at least some of the same things. Regardless of your server topic, you are undoubtedly going to have members of a variety of ethnicities, sexual orientations, and identities from across the world. Ensuring that your space on Discord is a space where they belong means making it safe for them to feel like they can be themselves, wholly and without reservation. Your members are all humans, all community members, all people that deserve respect and deserve to be welcomed.
When you are establishing your community, it’s important to have a basic understanding of what kind of environment you would like your server to be. It’s good to break down the general moderation philosophy on what content and discussion you’d like your community to engage in and what content would be inappropriate given the space. Depending on the topic of your server these goals may be different, but some common questions you can ask to establish general boundaries are:
When it comes to the content you allow or moderate in your server, it’s important to, again, reflect on what type of community you are. It’s also important that you act quickly and precisely on this type of harmful behavior. Some users will slowly push boundaries on what type of language they can ‘get away with’ before being moderated.
When discussing moderation, a popular theory that circulates is the broken windows theory. This theory holds that signs of antisocial behavior, civil unrest, disorder, and visible crime in an area encourage further antisocial behavior and crime. Similarly, if you create an environment in which toxic and hateful behavior is common, the cycle will perpetuate into further toxicity and hatefulness.
‘Bad-faith’ content is a term that describes behavior done intentionally to cause mischief, drama, or toxicity in a community. Users who produce it are commonly referred to as bad actors, and they are the type of people who should be swiftly dealt with and addressed directly.
‘Good-faith’ content is a term that describes user behavior with good intentions. When such users form a positive foundation in your community, the members who join and interact with the established community will adapt and speak in a way that continues the positive environment that has been fostered. It’s important to note that while ‘good-faith’ users are generally positive people, it is possible for them to say wrong or sometimes even harmful things. The importance of this distinction is that these users can learn from their mistakes and adapt to the behavior you expect of them.
When users test the line, they are not acting in good faith. As moderators, you should be directly involved enough to determine what is bad-faith content and remove it. On the other hand, education is important in the community sphere for long-term growth. While you can focus on removing bad behavior from bad-faith users, reforming good-faith community members who are uneducated about harmful rhetoric should also be a primary goal when crafting your community. When interacting in your community, if you see harmful rhetoric or a harmful stereotype, step back and meaningfully think about the implications of leaving content up in channels that use this kind of language. Does it:
A core part of all de-escalation is your approach. Users heated up during a frustrating or toxic discussion are easy to set off or accidentally escalate into more toxicity. The key is to type calmly, and to make sure that however you approach someone to de-escalate, you do it in a way that is understood to be for the benefit of everyone involved.
Creating a healthy community that leaves a lasting, positive impact on its members is difficult. Moderators have to be aware, educated, and always on the lookout for things they can improve on. By taking the initiative on this front, your community can grow into a positive, welcoming place for all people, regardless of their race, gender, gender identity, or sexual orientation.
The vast majority of community members are interested and willing to participate according to the platform rules, even if they might not agree with every one of them. Sometimes people break rules or disagree, but their behavior can be quickly corrected and they can learn from their mistakes. If users continue to break the rules, they may be given longer-term or even permanent bans from the community or platform. Most users will accept their ban, but a small fraction will not.
A 2018 study by Stanford University estimated that 1% of subreddit communities on Reddit initiate 74% of all conflict on the platform. The users driving this conflict rank extremely low in the agreeableness personality trait and have no interest in getting along with others. Only a trained clinical psychologist can diagnose someone with a disorder, but a term commonly used to describe this behavior is HCP (high-conflict person). High-conflict personalities have four primary characteristics, which describe specific conflict behavior rather than constituting a diagnosis:
If you fail to use tact in your moderation technique and communication approach, you may find that you or your community become the target of a high-conflict person. They may spam your community; you may delete their posts and ban their accounts, but they can create more. Discord uses IP bans to prevent users from creating new accounts, but VPNs can be used to circumvent them. A truly motivated person can create armies of bot accounts for mass-spamming, dox members of your community, or DDoS ISPs or platforms to create fear in your community. If a high-conflict person gains access to money, they can pay somebody else to do the work for them.
Most moderators choose to simply wait out the harassment. Advanced harassment like this may go on for several days, or even weeks, but then stop abruptly as the individual turns their attention to something new in their life. In some cases the harassment can go on for months, continuing to escalate in new ways that may put the lives of your team or community members in danger.
What can you do to protect your community from high-conflict people? What motivates a person to behave like this? This article will help explain the motivations behind this persistent, destructive behavior and provide actionable steps to reduce or resolve their harassment.
A “nemesis” is an enemy or rival that pursues you relentlessly in the search for vengeance. A nemesis typically holds some degree of fascination for a protagonist, and vice versa. They’re an antagonist who’s bent on revenge, who doesn’t go away, and who seems to haunt the mind of the protagonist. They’ve moved past being an enemy to become something much more personal.
You might assume that a high-conflict person harassing your community is your nemesis, but this would be incorrect. You’re not going out of your way to obstruct their behavior; your primary focus is to engage with and moderate your community. If the harassment stopped, you would move on and forget about them. You resist their behavior only as long as it falls within your realm of influence.
In their mind, you have become their nemesis, and you must be punished for your insolence.
To them, you are the Architect of an oppressive Matrix, the President Snow of an authoritarian Hunger Games, the tyrannical Norsefire government in V for Vendetta. You or your community represent the opposite of what they believe. In one way or another, either by your direct actions or through your association with your community, you have wronged them and deserve to suffer for your behavior. It’s clear that you will never learn or understand what they see. You not only participate in creating the corrupt and unjust system that they are oppressed by and fight against, but as a moderator, you are the very lynchpin that maintains the corrupt system.
You may believe this sounds outlandish, and you would be correct. Most people don’t believe that the world is out to get them, and that they’ll be hunted down and persecuted for what they believe. These individuals have an overactive threat detection system that makes them believe that you or your community are actively plotting their downfall. They take your opposing stance as a direct challenge to their competence, authority and autonomy. They harass you and your community because they believe that you’re out to get them, or want to replace them and their way of life. The truth is, all you really want them to do is follow the rules and maintain a civil conversation.
Now that you have a better understanding of how somebody like this thinks, we’ll discuss the strategies you can employ to solve this problem. The goal is NOT to get them to seek help or change their mind; we aren’t attempting to fix people. Instead, our goal is to prevent or stop specific negative behaviors that keep recurring, so that you can protect your community and focus your energy elsewhere.
The key to getting an individual like this to change their behavior is “tactical empathy”. Tactical empathy is the use of emotional intelligence and empathy to influence another’s behavior and establish a deal or relationship. It does not mean agreeing with them, only grasping and recognizing their emotions and position. This recognition allows us to respond to our counterpart’s position in a proactive and deliberate manner.
The premise behind tactical empathy is that no meaningful dialogue takes place when we are not trusted or we are perceived as a threat. In order to get someone to stop harassing your community, you need to shift yourself from being the villain of their story to being just another random person in their lives. You must work to shatter the persona that they have projected onto you and show that you are not the enemy working to destroy them. You’re just a mod trying to keep your community safe.
Demonstrating that you understand and respect them as an individual will disarm them and allow them to focus their energy elsewhere. It will not change their opinion, but at least their behavior will change.
When somebody continues to harass or disrupt your community, they’re essentially holding your community hostage. If someone truly is holding your community “hostage”, they’re often doing so because they’re looking to open a dialogue for negotiation. Frequently, people take hostages because they need somebody to listen. They aren’t getting the attention that they believe they deserve, and attempt to cause as much disruption as possible in order to make their case.
You are a community moderator negotiating the peace of your community, not their lives, but these tactics can still apply.
Situation defusal can generally be broken into three primary processes, each designed to collect information and use it to dissuade the high-conflict person from believing that you’re an enemy or threat. These processes are the Accusations Audit, Mirroring to Understand, and Getting to “That’s Right”.
An accusations audit is where you focus not just on the things they believe, but on the things they believe you did wrong. An accusations audit is not based on logic; it’s based on the unfiltered emotions of the other person.
It’s important that you go through their early comments and messages to understand what prompted this behavior in the first place. This might have been banning them for breaking a rule or not properly punishing another community member that they got into an argument with. They might believe “I feel like you didn’t give me a chance to explain myself” or “I feel like you’re discriminating against me”.
Your understanding of their beliefs will be flawed and inaccurate, but you must do your best to piece it together into a coherent argument on their behalf. If possible, learn more about the other communities they’re a part of. Identify if they’re harassing any other communities, and the reasons for doing so. Are there any commonalities of note?
Once you believe you’ve figured out why they’re upset with you or your community, mirror their language to verify it. At this point, opening a dialogue might be incredibly difficult if they’re using throwaway accounts regularly. Chances are they do have a primary account they continue to use in other communities, which can help greatly with starting your dialogue. At this stage, you’re still working to collect information about what they believe, directly from the source. Examples of prompts you can use to verify their beliefs include, “It seems like you believe that I’m being unfair because I didn’t give you a chance to explain yourself.” or “If I understand correctly, you believe I’ve been discriminating against you instead of taking your opinion seriously, is that right?”
Chances are, the responses you receive will be filled with aggression, profanity, and insults. You must ignore all of this and continue working to understand their position and the events that led to them targeting your community. Negotiations like this are difficult even in voice-to-voice communication, and nearly impossible via instant or private messaging. They will be incredibly resistant at first, perhaps thinking you’re trying to trap them into admitting guilt or ignorance.
When you get them talking to you, mirror that language to get them to elaborate further on their beliefs. An example of dialogue might go something like the following:
Spammer: “It’s bullshit that mods ban strawberry jam lovers because the blueberry jam lovers are afraid of being exposed for who they really are.”
Mod: “Afraid of being exposed?”
Spammer: “Yeah, the blueberry jam lovers are secretly running the world and plotting against anyone who doesn’t believe in the same jam flavor preferences as they do.”
Realistically, blueberry jam lovers are not actually running the world or plotting anything nefarious, but in the mind of the spammer this is undeniably true. And while this example was intentionally mild, you can infer more severe types of conversations that would follow a similar format.
Regardless, as you dig further into what they believe, you’ll notice that the rabbit hole will go very deep and be filled with logical fallacies and obviously disprovable biases that make no sense. Remember that the truth or reality behind what they believe is completely irrelevant, and attempts to correct them will undermine your goals. Your job is to help them explain their beliefs to you to the best of their ability, and for you to understand their position to the best of your ability. Once you believe you’ve collected enough information, you can move to the final step, getting to “That’s Right.”
Once you believe you’ve completely understood their position and what they believe, you can repeat their entire position back to them. Demonstrate your understanding by effectively summarizing it concisely and accurately, regardless of how much you disagree with the position. Don’t focus on their behavior or the actions that resulted in them getting banned. Instead, focus exclusively on the ideology that drove their behavior. Do this until you’re able to get them to say “Yes, that’s right” at least 3 times, or by asking if there’s anything else that you forgot in your summary. If you did miss anything, repeat the entire position again while including the extra information. When reiterating their points, be very careful about restating things that are not true. Do your best to remove personal bias from the statements to focus them back to “absolute truths.”
Their actions are about trying to make a point. What you’re doing is getting them to make that point without taking action, because you have heard what they are trying to say. If you put enough effort into doing this correctly (and it doesn’t need to be perfect), they will know that you finally understand where they’re coming from, that they’ve been heard, and that their opinion has been validated. By demonstrating you understand their position, you go from being part of the problem to being a real person. They might not like you, but they will at least, if begrudgingly, respect you.
When you successfully reach this stage of the discussion, it’s essential to be careful with your choice of words. There’s a good chance the spammer will leave your community alone now that they know their opinion has been recognized. At the very least, you should see an immediate reduction in the number of times they attempt to cause harm.
If they do continue to harass you or your community, it’s possible that you failed to address the primary reason that they’re upset. Open dialogue with them again and follow the steps above from the beginning, or check to see that you haven’t fallen into a common pitfall or mistake.
Below is a list of common examples of mistakes people make during negotiations:
When using tactical empathy, remember that the purpose of the exercise is to bring their beliefs to the conscious mind and demonstrate understanding, not agreement. If you attempt to tell them what they should believe, you may instead get a “you’re right” and fail to see any change. The difference is subtle, but important. Make sure the other side actually feels heard and that you’ve fully understood their position.
As a reminder: do not attempt to correct or modify their opinion. The purpose of this process is not to change their position, only to mirror it so they stop identifying you and your community as a threat.
The methodology outlined in this article is designed for conversations in real-life, especially over the phone. It’s unlikely that you’ll be able to get the spammer on an audio call, so it’s essential to be patient with the process and careful with your wording. Formal grammar like punctuation can make a sentence feel more serious or threatening. Use casual phrasing and make an occasional spelling mistake to show you’re human. If you’re uncertain about tone, read the sentence out loud while sounding as angry as you can, and adjust accordingly.
The process outlined here can be easily undermined by others who aren’t involved in the process. If you’re working to negotiate with a spammer but another moderator is threatening them in a different conversation, you won’t see any changes in their behavior. Communicate with your team on the strategy you plan to use, and remember to ask for emotional support or step away if it becomes too taxing.
Some of you may believe that, after getting this far, you are on the path to rehabilitating a person like this. The mistake is believing you are further along than you really are, or that you’re qualified to help someone struggling to control their emotions. The truth is, getting to “that’s right” is only 1% of the process.
Even if you’re a clinical psychologist, you wouldn’t be getting paid for your work, at least not this work. Attempting to provide support via text chat will have diminishing returns. Attempting to show somebody like this the “error of their ways” may result in all of the work you have done being reversed.
Instead, you must focus on the people who want and need your help: the people in your community. Empower those who are truly deserving of your time and energy. At the end of the day, you’re a human and a moderator. Your primary focus is to make sure your community is safe and stays safe, and if you’ve managed to get the persistent spammer to stop, then you’ve accomplished what you aimed to do.
Whatever moderation roles a server may have, there should always be an authority role that can make calls at their discretion if they believe it is best for the community. A good example of how to do just that can be found here. Moderation administrators, leaders, managers, etc. should always be prepared to make judgment calls on the information provided to them, whether by mods or users. A very common misconception among moderation teams is that all information should be shared amongst the team for transparency. This can be a double-edged sword: disclosing private information that is not essential for a moderator to know opens more routes for unauthorized distribution of that information. If that occurs, it compromises the privacy and trust of the users the information concerns. In sensitive situations involving very volatile information, consider whether it may be beneficial to have it handled directly by a team leader or even the owner of the community.
Personally identifiable information (or PII) is any information that can identify a user, such as an email address, full name, phone number, IP address, exact location, or even their Discord user ID and username.
You should never disclose someone else’s personal information; sharing your own is acceptable only in an appropriate environment. Disclosing others’ info can be treated as doxxing, the disclosure of personal info by a third party (for example, someone posting another user’s address), and can in some instances be actioned by Trust & Safety as it may violate Discord’s Terms of Service or Community Guidelines. User IDs and usernames are acceptable to share as long as there is a justifiable need, but always consider whether disclosure could have repercussions for that user.
PII is very sensitive because it removes a user’s privacy and can result in them being targeted online or even in real life, so it should always be protected with the utmost discretion. Moderators may come into contact with PII in a message they have to delete, in someone maliciously doxxing another person, in a user accidentally sharing it without realizing the harm they are putting themselves in, or in information included in a report. This information typically should not be disclosed to anyone, and community leaders should consider removing it from bot logging channels to protect a user’s identity.
Also consider encouraging members of your community to learn how to safeguard their own information. You can include rules within your communities that discourage the sharing of even one’s own personal information. As important as it is to protect other users, it is just as important to help them protect themselves. Users may sometimes share their information out of good will or as a way of attempting to bond with others, but bad actors can use that information maliciously.
Personal matters can refer to a huge range of information; common examples include relationships, interpersonal conflicts, previous history, or things as simple as a DM or private conversation. As a moderator, you will very likely come across such information in reports, concerns, or even someone breaching trust by screenshotting and sharing private messages. This information is extremely important to protect, as people trust you to keep it private and use it only to take care of the issue at hand. Exposure of this information can be very harmful and can result in targeted harassment, bullying, or further negative consequences. Stories of such exposures can make people worried about reporting something for fear of it happening to them. In the end, this makes it very difficult for moderators not only to reassure users, but to rectify the situation.
Most public communities have ways of protecting their server with moderation tools, actions, and procedures. This includes moderator actions such as warnings, kicks, mutes, bans, etc. Moderation actions may be especially important when it involves a specific user. Moderation info can even include internal details such as protocol, procedure, censor lists, or even bot details.
Moderation information varies from server to server, and thus it is largely up to the discretion of each moderation team to establish their own server rules to enforce. Some may have full transparency with an open log channel, and some may take a more confidential approach and only speak with those involved. Both have their pros and cons, but be sure to weigh what could happen if people know who receives what penalties. For protocol, always remember to carefully decide what to share publicly, as disclosing a procedure can let someone use that information to evade moderators or even exploit the server. The same holds true for bots: disclosing bot details such as configuration or censor lists can allow users to evade the protections your team put in place.
There are many different forms of information that must be weighed heavily before disclosing to different people, whether they be users or other mods. Information can range from sensitive personal information such as emails, names, phone numbers, location, and IP addresses, to community-related information such as mod actions, previous incidents, and user history. Regarding users, very little should be shared with people who are not involved. When it comes to fellow mods, it is best to share as much information as is reasonable, aside from personal information, to ensure everyone is well informed.
Some questions to consider when speaking with users include:
As for mods and the more internal team members on servers: mods should of course be “in the loop” on the story of a situation, and it’s never recommended to keep mod teams in the dark. That being said, even with other moderators, be careful about sharing unnecessary information, especially personally identifiable information, not only because there is often little benefit, but primarily because it compromises a user’s privacy even behind closed doors. While there are fewer factors to consider, they are just as important as the ones you would weigh for another user.
Some things to consider when disclosing to moderators include:
Remember that if you aren’t sure if you should disclose something related to moderation, always ask an administrator/leader on your server for guidance, and always dispose of private information if it is not needed.
It may be easier to be fully transparent and not have to check every sentence before it is said or sent. That being said, there are many benefits to upholding a consistent, confidential environment where staff act with discretion when assisting with a variety of matters. There are many consequences if confidentiality is not upheld properly. Below are some examples of the benefits of protecting information as well as the consequences that can come with being overly transparent.
Keeping Pseudonymous. As stated in Discord’s Safety Principles, Discord is pseudonymous, which means that your account on Discord doesn’t need to be tied back to your identity. Users who provide information as evidence (or otherwise) may sometimes expose who they are; protecting this information reassures them that their personal life won’t be compromised by socializing or confiding in a server’s staff.
Trust. Users will hold high trust in a staff team if they are confident that their expectations of privacy will be respected by the team they confide in. If privacy is not upheld, users will find it difficult to trust the team, and may heavily contemplate or even refrain from contacting the moderation team again in the future.
User Safety. Diligent protection of user data prevents it from getting into the wrong hands. If information is not guarded, it can result in targeted harassment or bullying, as many private details can reveal information to malicious individuals.
Moderator Safety. Keeping moderation actions confidential and only disclosing information to people who need to know helps preserve moderator anonymity and reinforces the idea of a team decision. Disclosing moderation actions and who performed them can put a target on that mod, as people may hold them personally responsible for the action, which can result in harassment or disrespect from users who don’t understand the decision.
Personally identifiable information being shared outside of need-to-know groups can compromise users and make them feel they may need to abandon their Discord account to retain personal privacy. This leads to a loss of trust from the member, and perhaps even the loss of them as a member of your community.
There are multiple things to be mindful of when considering privacy and confidentiality, and it extends well beyond standard moderation. Often, privacy will come down to the way the server is configured. Some things to consider include:
Server Discoverability. If an LGBTQ+ server is in Server Discovery, a user may use an emote from that server in another one, and if someone clicks on the emote, it may accidentally expose the user as they may identify as LGBTQ+ privately but not publicly.
Public Join Messages. Some servers may have “welcome bots” or even Discord’s welcome feature that greets new users publicly upon joining. Server staff should take into account the type of community that they stand for, and consider if users may perhaps feel uncomfortable or exposed by being mentioned immediately upon joining.
Security. Automated security and “gatekeeper bots” may be used to prevent malicious users from joining a server on alt accounts or as part of malicious groups. While this seems perfectly normal, the part that has to be considered is what data you are requesting. Some of these bots may collect IP addresses, browser data, and various other forms of information. Users may not be comfortable in supplying information that could compromise who they are. Always make sure to read through the privacy statement of any bot that you add to ensure that you are not asking for too much information from regular members.
Bot Logging. Many servers have private log channels maintained by one or more bots, tracking joins, leaves, deleted or edited messages, and more. The main thing to be wary of: if personal information is posted for any reason, whether accidentally by misclick or maliciously to dox a user, it will usually appear in a moderator logging channel when deleted. After the situation has been dealt with, owners or admins should consider deleting the log message to prevent personal information from persisting within that channel.
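If you decide to scrub PII from archived logs, a rough first pass can be automated. Below is a minimal, hypothetical sketch (not an official Discord or bot feature) that flags a few common PII formats with regular expressions. Real PII detection is far messier, so treat any match as a prompt for human review, not a guarantee.

```python
import re

# Assumed patterns for illustration only; they will miss plenty of real PII
# and occasionally flag harmless text.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "phone": re.compile(r"\b\+?\d[\d\s().-]{7,}\d\b"),
}

def redact_pii(text: str) -> str:
    """Replace anything matching a PII pattern with a [REDACTED:<kind>] tag."""
    for kind, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{kind}]", text)
    return text
```

A helper like this could run over a log channel's export before it is shared with the wider team, so the original message never leaves the small need-to-know group.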
There are pros and cons to any level of disclosure a server offers its community and its staff. It is not black and white; there are gray areas both in full transparency and in revealing select information at moderator discretion. There must always be a balance, one that may shift depending on the situation at hand and the type of community present. Just as complete confidentiality will lead to distrust, total transparency will lead to users feeling unprotected due to a lack of privacy.
Schools that make use of a private Discord server gain lots of new opportunities. One of the biggest is a cultural shift away from formality and toward general bonding. Gone are the days of formal communications over e-mail; instead we have open chat discussions with teachers, fellow classmates, and group project members, and even private one-on-ones (similar to office hours!) when necessary.
In this section we seek to discuss the many ways that Discord can help virtual learning. Discord can function as a virtual classroom setting in ways that can generate new opportunities for continued and reliable teamwork while enabling new and exciting ways for students to connect with each other.
While certain permissions are essential to creating an environment that separates teachers’ and students’ private chats and subjects, a classroom server acts as the overall hub for everything necessary, from a full school, to a specific classroom, to separate groups working on projects, and even a variety of subjects being taught.
Simultaneously, a server can be set up to serve a variety of school-related functions including, but not limited to:
Discord naturally lends itself to communication in ways that allow students to help each other. If one student is stuck on an assignment, any other student can answer their question or join them in a study session to work out the problem together. Schoolwork can feel less solitary, there is less reliance on teachers and aides, and students taking the same subject in different classes can meet and help each other out. This also allows teachers to monitor the space to gain insight into where their students are struggling so they can adjust their lesson plans accordingly.
Interaction doesn’t have to be limited to typing in designated channels. Students can interact with each other and with teachers in subject-specific audio and/or video channels as well, allowing more personal interactions that cater to different learning styles. However, if you are using these channels to broadcast a lesson rather than promote group work, make sure you disable voice activity for students to prevent accidental noise from being broadcast throughout your online classroom. This helps create a controlled environment for everyone involved in the virtual classroom space. An example of how to do this is below.
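If your server uses a bot, one possible way to configure this is a sketch like the following, using the third-party discord.py library. This is an assumption-laden illustration, not an official Discord feature: it presumes a running bot with permission to manage the channel and a role named "Students". Denying the "Use Voice Activity" permission forces push-to-talk for that role.

```python
import discord

# Hypothetical sketch: assumes a running discord.py bot that can manage the
# channel, and a "Students" role. Denying "Use Voice Activity" forces
# push-to-talk, so idle microphones can't broadcast background noise
# during a lesson.
async def require_push_to_talk(channel: discord.VoiceChannel,
                               students: discord.Role) -> None:
    await channel.set_permissions(students, use_voice_activation=False)
```

The same effect can be achieved by hand through Server Settings, by editing the voice channel's permission overrides for the student role.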
Discord was created for housing online communities, which sets it apart from other e-learning platforms. Your students and staff will feel at home in the Discord user interface because it is easy to use and made for quick adaptation. The natural familiarity many students bring from personal use of the platform (and of similar platforms) will reduce the friction of adopting a new virtual learning atmosphere and help Discord transition more easily into their daily educational routines.
A common complaint in larger classroom settings, both in person and virtual, is that the environment is impersonal. In a sea of many faces, it is understandable that a student may feel like just another number on the attendance roster. However, Discord’s premise of providing an easily built space for informal communication can reduce this lack of personal connection and instead help students feel welcome, included, and easily engaged.
When creating your virtual classroom, it is important to keep in mind what you want the purpose of your server to be. A server for a specific math class taught by Mr. Wumpus at noon every day is going to be geared entirely towards education. By contrast, a more open-concept Algebra server that welcomes all students currently enrolled in the course under various teachers, giving them a common meeting space while splitting them into their respective virtual classrooms via permissions, allows for antics in addition to education.
Educational settings can also be fun if you choose to build them that way. There is always fun to be had in a healthy community when the school day finishes, and providing this hangout space for students will allow them not only to bond, but to associate positive feelings with school. Spaces can be designated to help students study together after school hours, or to come together for a study break and play one of their favorite games. A music bot can be used to listen to music during an essay-writing sprint, and a leveling system can reward students for participation. Community feelings lead to happiness, which can improve mental health and grades.
Discord also brings many advantages to educational settings outside of the classroom. In an e-learning environment, a server can be viewed as an extension of your physical campus onto the Internet. Similar to a real-life campus, Discord is a meeting place for students and staff to chat, get to know each other, and build a stronger community in places they feel safe. Shy students may even find it easier to bond in an online environment than in a physical one, which can lead to new friendships while also improving pre-existing ones. Students who perceive a welcoming environment and have positive feelings about their school and community will often get better grades because of higher motivation levels.
A further extension of a physical campus would be an alumni association. While school-related Discord servers are often used for study tools or virtual classrooms, they can also serve as special places for alumni to connect after graduation. Alumni can not only catch up with their old classmates, but mentor current students as well, for example through career fairs that draw on the school’s alumni network to advise current students about future career options.
As mentioned above, Discord’s widespread use for personal gaming and easy-to-use interface mean that many students may already use Discord in their free time, or may begin to adopt it into their lives after being introduced to it in a classroom setting. The persona someone presents online can differ from who they are in real life. When given the opportunity to set a profile picture, usernames and nicknames, and even statuses, things can get out of hand quickly. It may even be difficult for some teachers to identify which student is behind a given account if expectations aren’t established immediately upon server creation.
Given the likely chance that a student might not want to use a picture of themself as their profile picture, it is important to establish server guidelines about what is appropriate for that photo. While some users may not be comfortable using their full name or even their first name in their username, set the expectation that everyone has to change their nickname in your private school-related Discord servers to their real name. On a related note, users subscribed to Discord Nitro can also change their profile picture on a per-server basis, though Nitro is not required to use Discord as a whole. If a student is uncomfortable using their real picture for their platform-wide account, our recommendation is to have them make a secondary account specifically for school where they can use a photo of themselves. With Discord’s account switcher feature, having a second account for this purpose is less of a hassle. Finally, when students first enter the server, we recommend enabling Developer Mode to privately connect each individual User ID to a student, should someone alter or change their identity or account information in the future. This could be done through the school’s information management system, or just by using a spreadsheet.
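The spreadsheet approach can be as simple as a two-column file keyed by User ID, since IDs never change even if a student renames their account. Here is a minimal sketch; the file name and student names are hypothetical, and a real deployment might live in your school's information management system instead:

```python
import csv
from pathlib import Path
from typing import Optional

# Hypothetical file name for the private roster spreadsheet.
ROSTER = Path("student_roster.csv")

def record_student(user_id: int, real_name: str) -> None:
    """Append a Discord User ID -> student name pair to the roster."""
    is_new = not ROSTER.exists()
    with ROSTER.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["user_id", "real_name"])
        writer.writerow([user_id, real_name])

def lookup_student(user_id: int) -> Optional[str]:
    """Return the student behind a User ID, even if they later rename."""
    if not ROSTER.exists():
        return None
    with ROSTER.open(newline="") as f:
        for row in csv.DictReader(f):
            if int(row["user_id"]) == user_id:
                return row["real_name"]
    return None
```

Because the lookup is by ID rather than by display name, a nickname or username change never breaks the mapping.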
It is also imperative to reiterate the expectation that everything present in the virtual school ecosystem must be school appropriate. Profile pictures, usernames, and statuses are platform-wide features that appear in every server a user is in, and what is appropriate in some spaces may not be appropriate in school. Consider this ahead of time so you can establish rules, as well as consequences should those rules not be respected. Keep in mind that Discord’s Terms of Service and Community Guidelines already forbid users from using NSFW or otherwise illegal content as their username, profile picture, or status, which ensures profile content is acceptable to a degree. However, you may still be uncomfortable with, say, a student cursing in your virtual classroom, and that can be dealt with by you accordingly.
Discord’s very purpose is to foster communication by offering users an easy-to-understand interface that is adaptable to a variety of needs, making it a genuinely useful modern tool for e-learning and virtual classrooms. With the right permissions, students can be brought into environments where they can not only learn, but also have fun in a virtual extension of their physical campus. By utilizing a variety of ways to teach and to communicate amongst groups, Discord naturally fosters teamwork and open-forum discussion between students and teachers alike.
However, Discord is still a platform with millions of users exploring a variety of interests. It’s important to be prepared to handle students who may be using media that is inappropriate for school via platform-wide features like profile pictures, statuses, and usernames, even if that media is appropriate elsewhere. If you consider the logistics of creating a safe, school-appropriate environment and set expectations and guidelines upon entry, Discord is a great tool for your future educational needs!
Once you are confident in the channel permissions that lock access to your private moderation channels, it’s important to think about what you want those channels to look like. The larger the server, the more channels you may need to accommodate everything.
An important rule of thumb is to make sure that your moderation channels are exempt from any auto-moderated actions your moderation bot may take, or that the automoderator is configured not to act on messages from moderators. No blacklists should be in effect in these channels, so that punishments and happenings in the server can be discussed truthfully and respectfully. It’s also important to ensure that any message logging is configured to ignore channels that not all moderators have access to. For example, if all of your moderators can see a general action log channel, but you have a separate channel for a lead chat, deleted and edited messages from the lead channel should not be logged there for privacy reasons. More about this can be found below.
Structurally, it is recommended to place informational channels at the top of your moderation channel category to make sure everyone sees them. Anything that should be easily viewable by the whole team follows, such as an update channel and moderation/action logging channels. Channels restricted to smaller groups of moderators should sit closer to the bottom, as fewer people have access to them, and partnership channels meant to maintain relations, as opposed to direct moderation, can go at the very bottom. A sample of a large staff channel set can be found below.
We’ll now begin to outline a variety of channels that are often useful for moderation purposes. The below list is for your consideration when building your own moderation channel category, but by no means should you feel obligated to add every channel discussed below to your server. This is all about recognizing your needs and making sure they are met!
Every moderation team should have a developed moderation handbook that is easily accessible to all team members and updated regularly. However, some moderation teams like to have an overview channel for rule enforcement for quick referencing. This channel can contain information such as how you categorize punishments, an overview of popular commands for easy referencing after returning from a moderation break, and links to all guidelines and moderation forms for easy coordination.
We recommend that this channel be viewable by all moderators, but that Send Messages permissions for informational channels like this be limited to the server owner or administrators, to ensure only important and select information is fed into the channel. Additionally, denying Manage Messages permissions to everyone but the administrators is recommended to ensure that these informational messages aren’t deleted.
Action logs are the most important moderation channels out there, but also the busiest. Moderation action logs exist for a variety of purposes, and you can configure them however you see fit. Some recommended actioning channels include, but are not limited to: message logs for deleted and edited messages, member logs for joins, leaves, and name changes, and punishment logs for warnings, mutes, kicks, and bans.
Like your server, your moderation team can have a general channel. While it is important for your moderators to moderate, it’s also important for your team to bond. This is best achieved by having a space that is not dedicated to moderation. It exists to talk to each other, get to know each other, and build rapport in your team environment. Moderation can be stressful, and this is where you can go to take a break with your teammates. However, it is important to maintain the same set of moderation expectations here as you would in public channels. An occasional vent is understood and acceptable, but you should avoid speaking negatively of server members, as that can taint a moderation experience. While it’s important to bond with your teammates, it is also important to bond with your server members. Chatting in the server itself is just as encouraged as getting to know your fellow moderators.
If you enable the community server option for your server, you’ll have updates fed to a chosen channel. As these are major community-based updates for moderation purposes, it is often recommended to have them feed into a moderation update or memo-based channel for ease of viewing. Channels that serve this purpose can be used for a variety of reasons, including announcing extended absences from the team for vacation or mental health purposes to avoid burnout, taking team-wide votes, and making announcements about moderator removals, departures, promotions, and initiatives the team is pursuing. Many servers use a single channel as a catch-all update arena, as it serves a very specific purpose and will not be used daily.
Although there are a variety of ways to organize your team hierarchy, in addition to regular moderators most mod teams also have administrators who are responsible for overseeing the regular moderators. Some mod teams also make a distinction between regular moderators and moderators-in-training, who may have different permissions compared to regular moderators or otherwise be subject to additional scrutiny during their training period. If these distinctions exist within your own mod team, it may be wise to create separate administrator channels and training channels.
Administrator Channels - An administrator chat is necessary for speaking about private matters on the team. It serves to judge general moderation performance, handle punishments for problematic actions internally, and also acts as a place to handle any reports against moderators while ensuring privacy. Remember, it is important that these messages are not caught up in bot logging, so that moderators are not made aware they are being discussed privately before leads connect with them.
Training Channels - Moderator trainees will not yet have the general permissions granted to all moderators, and thus some teams consider locking their access to select channels. There will need to be a space for full moderators and leads to privately discuss the growth of the trainees, one the trainees do not gain access to even after promotion. Some teams go as far as establishing a unique action log channel used during training periods, only giving trainees access to the full history of the moderation team once they prove their ability to moderate without bias. Again, it is important to ensure that discussion channels about trainees are exceptions to basic bot logging, to avoid awkward occurrences of trainees seeing commentary about them that they should not see yet.
There are a plethora of moderation channels that can benefit a team in unique circumstances that don’t fit into the above categories. Consider the following when thinking about what fits best for how your community is run:
If your community is linked to an external community such as Reddit or Twitch, it would be useful to have separate moderation channels dedicated to this external community in addition to your Discord moderation channels. Reddit moderation channels specifically can be created by utilizing webhooks. Reddit moderation spaces housed on Discord often have r/modnews updates feeding into a special update area and channel sets unique to their external community needs. You can also have new posts and comments logged into a designated Reddit action log for easy reference without opening Reddit.
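A Discord webhook post is just an HTTP request with a small JSON body, which is what makes feeding Reddit activity into a channel straightforward. The sketch below is illustrative only: the webhook URL is a placeholder you would generate from your channel's integrations settings, and the message formatting is one possible choice.

```python
import json
import urllib.request

# Placeholder: generate a real URL from your channel's integrations settings.
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

def build_modlog_payload(action: str, subreddit: str, permalink: str) -> dict:
    """Format a Reddit mod action as a Discord webhook message body."""
    return {
        "username": "Reddit Mod Log",
        "content": f"**{action}** in r/{subreddit}\n{permalink}",
    }

def post_to_webhook(payload: dict) -> None:
    """Deliver the payload to Discord (requires a valid webhook URL)."""
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

In practice you would call `build_modlog_payload` for each new mod action pulled from Reddit and hand the result to `post_to_webhook`, giving your team a running Reddit action log without ever opening Reddit.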
Teams that have a separate Reddit or Twitch moderation team in addition to their Discord moderation team may have designated hangout spaces for all teams to get to know each other casually. Most importantly, they may have shared moderation spaces to discuss troublemakers whose activity spans multiple platforms, flagging problematic users for the other team. Easy and specialized communication across teams will help to keep all facets of your community safe!
There is no right or wrong way to set up a moderation channel category outside of ensuring you utilize the correct permissions. It’s important to consider the needs of your community and how those needs translate to the needs of your team when creating your moderation space. Having flexibility and a willingness to grow as your server grows and requires change is imperative. While action logs are the most useful kind of moderation channel from a punishment perspective, hangout spaces are important for establishing team cohesion and rapport. Identifying the needs of your team and making sure they are adequately met will help you create the strongest moderation environment possible.
One of the most important factors of keeping a community alive and healthy is activity. To maintain activity, moderators can use a few different methods which generally can be separated into two groups: active and passive. Active methods are those which require the presence and active participation of a moderator. Passive methods, on the other hand, do not require a constant presence from an individual and are often automated by using bots. Keep in mind that even passive methods will require occasional maintenance from a moderator.
One of the more popular passive methods is the XP system. XP systems, otherwise known as experience or leveling systems, grant users experience points (XP) and levels based on their activity in a server. Their main purpose is to reward member activity in the community. These systems exist in the form of bots. Usually they are just one function of a multi-purpose bot, but there are cases where the sole function of the bot is the leveling system.
The basic way these systems work is: a member sends a message (or is active in a voice channel); the bot awards them a set amount of XP, subject to a cooldown between awards; accumulated XP eventually crosses a threshold and the member levels up; and reaching certain levels can grant rewards, such as roles.
These four steps are only a simplification of the process, and there are many options to consider while using these systems. Depending on the bot you choose, you can get various options for configuration, but several are commonly available on most bots. Some of these options are: the amount of XP gained per message and the cooldown between gains, channels and roles excluded from XP gain, customizable level-up messages, role rewards tied to levels, and the ability to manually control or reset XP.
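The award-and-level loop described above can be sketched in a few lines. This is a minimal illustration rather than how any particular bot works: the 60-second cooldown, 15 XP per message, the "each level needs 100 more XP than the last" curve, and the role names are all assumptions chosen for the example.

```python
from typing import Optional

COOLDOWN = 60           # seconds between XP awards for a given user
XP_PER_MESSAGE = 15
LEVEL_ROLES = {5: "Regular", 10: "Veteran"}   # hypothetical reward roles

xp = {}          # user_id -> accumulated XP
last_award = {}  # user_id -> timestamp of the last XP award

def level_for(points: int) -> int:
    """Convert raw XP to a level: each level costs 100 more XP than the last."""
    level, needed = 0, 100
    while points >= needed:
        points -= needed
        level += 1
        needed += 100
    return level

def on_message(user_id: int, now: float) -> Optional[str]:
    """Award XP for a message, respecting the cooldown.
    Returns a reward role name if this message caused a rewarded level-up."""
    if now - last_award.get(user_id, float("-inf")) < COOLDOWN:
        return None  # too soon: spam earns nothing
    before = level_for(xp.get(user_id, 0))
    xp[user_id] = xp.get(user_id, 0) + XP_PER_MESSAGE
    last_award[user_id] = now
    after = level_for(xp[user_id])
    if after > before:
        return LEVEL_ROLES.get(after)  # None unless this level has a reward
    return None
```

The cooldown check is the built-in anti-spam measure discussed later in this article: rapid-fire messages simply stop earning XP.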
The main function of XP systems is to reward user activity. Their ability to passively lead users towards being active allows moderators to occasionally take a moment to step back from their usual activities of engaging with their community. The existence of a leaderboard appeals to the competitive nature of humans and pulls them to be more active. Rewards additionally add to this appeal. This is applicable in small, medium and some large communities.
Alternatively, XP systems can be used as a measure of security. By locking certain permissions behind levels, you can make sure that inactive and malicious users are prevented from committing certain offenses. This is mostly applicable in very large servers.
Both of these routes will require the utilization of roles. There are two main reasons for this: roles are the visible, persistent reward a bot can grant when a member reaches a level, and permissions on Discord are tied to roles, so gating access behind levels requires them.
As the goal of XP systems is to boost activity, it is important to note that they will also lure in users who believe any type of activity is acceptable. This is not the case. While the problem of spam is already resolved by the cooldown ability of most XP systems, there are still behavioral issues that need to be addressed.
Members need to be aware that rules still exist in the community and that they cannot simply do as they please. It is important to moderate those who blatantly misbehave in order to level up. Other contributions to the community that are not measured by activity in text or voice chat, such as artwork or stories, should also be rewarded.
Furthermore, activity that comes from channels which encourage spam-like behavior, such as bot channels, should not count towards the total. For that reason, XP gain should be disabled in channels of that sort.
Knowing that we have the ability to reset the XP in the server, we can use this option to create different types of XP systems. We can divide XP systems into three different types: cyclical, permanent and combined.
Cyclical systems reset XP points in a regular cycle. Cycle duration should be set based on activity, though it is not advisable to use this system in communities with very low activity. The constant resets in the leaderboard allow new members to climb the charts quickly, but this only lets the system be used for rewards. It is common to give out special rewards to the most active users at the end of the cycle, such as custom roles for the duration of the following cycle, which encourages continued activity to retain their rewards. This also gives moderators the opportunity to post announcements regularly at the ends of cycles. The biggest downside of this system is that not all bots have the ability to also remove leveling roles when the cycle ends.
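If your bot exposes XP totals, the end-of-cycle step can be sketched as follows. This assumes the XP store is a simple mapping of user IDs to points; real bots typically expose this through a reset command rather than code you write yourself.

```python
def end_cycle(xp, top_n=3):
    """Record the cycle's top members, then reset all XP for the next cycle."""
    winners = sorted(xp, key=xp.get, reverse=True)[:top_n]
    # The reset itself; note that reward roles earned during the cycle
    # must still be removed separately, which not every bot can automate.
    xp.clear()
    return winners
```

Capturing the winners before clearing is what lets you hand out the special end-of-cycle rewards and post the accompanying announcement.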
Permanent systems do not reset at any point in time. Occasionally, users that are banned or have left are removed from the leaderboards. They give a good look into who the most active and dedicated users in the community are. Permanent systems can be used both for rewards and security. Their biggest downside is that they are not very friendly towards new members, especially in older communities.
Combined systems are a combination of cyclical and permanent systems. They usually require the usage of two separate bots to keep track of rankings on both leaderboards. They take the best aspects of both systems, meaning they can be used both for rewards, connected to the cyclical system, and security, connected to the permanent one. This also allows the cyclical leaderboard to help involve new users, while the permanent one still gives good insight into who the most active members of all time are. The only big downside that remains is the issue of removing roles from a large number of members.
As with any system, there are several negative aspects to consider when it comes to using XP systems. All of these should be taken into consideration before you make a decision on whether you want to use one. Some of the most commonly voiced concerns would be the following:
The fact that these systems encourage sending a larger number of messages automatically brings to mind the idea of spam. Since spam is a violation of most server rules, as well as of Discord’s Terms of Service, this would be a huge problem. Fortunately, the issue is easily resolved thanks to the built-in anti-spam measures XP systems have, primarily the message cooldown. This, in combination with good moderation, makes it very unlikely that any spam is generated by the presence of the system.
Since most XP systems require utilization of roles for truly fulfilling their usage, the issue of role bloating comes to mind. It is important to manage and space the leveled roles properly in order to avoid creating an excessive number of roles with no real use in your server.
Some believe that rewards should be granted manually through qualitative judgement of content, rather than quantitatively. Bots cannot themselves tell the quality of the messages sent: since all types of conversation allow users to gain XP, there is no meter on conversation quality, and granting privileges as a result of such conversations can send the wrong idea about what sort of activity is encouraged. Only humans can truly judge content subjectively to determine quality. By combining proper moderation, for handling the judgement of quality, with the built-in preventive mechanisms of XP systems for quantity control, this issue can be kept to a minimum.
This would be one of the most difficult problems of the system to resolve. Many members, intending to increase their XP count and level, will attempt to hold conversations in the community. That part is perfectly fine. The problem arises when they start forcing conversation at any point in time solely to increase their message count. The result is communication that is completely unnatural and unhealthy.
This type of abuse of the system can’t really be stopped with the cooldown system since most of the time it is not spam. The only real way to prevent this is to use the No XP role, but the difficulty of telling organic and inorganic communication apart raises the question: Was there really any violation of the system?
Not every community finds an XP system useful. In many cases moderators can’t figure out how to add this sort of system and make it seem like a natural part of the server, or how it can fit the theme of the community. Considering the theme and purpose of your server is an important part of deciding whether to add a leveling system. Before you make the decision, ask yourself: how can I make this system a natural part of my community?
In the following section, several publicly available bots will be presented as options for what you can use for an XP System in your server. The following bots have been chosen based on data collected from a survey of various moderators, administrators and owners in Discord communities.
This list is not exhaustive. There are plenty of alternatives available online. All of the listed bots are free and public, though some features may be limited to paid editions or private versions of the bots. The content of this article is not endorsed by any bot or company related to the bots.
Gaius Play is an entertainment bot that also hosts an XP system. It takes both text and voice chat activity into account. The basic version comes with a preset configuration for XP gain and leveling, as well as commands for adding up to 6 reward roles, fully customizing and toggling level-up messages, manually controlling XP, ignoring activity in certain channels, and boosting XP gain within certain parameters (roles, channels, time periods). It also has the ability to remove leveled roles from all users, making it ideal for use in a cyclical system. Additionally, there are several premium features, such as unlimited reward roles and a tree leveling system. Users can also reset their own reward roles in order to change paths on the tree leveling system.
Amari is a bot that is solely focused on leveling systems. It only looks at activity in text channels. This is a very simple bot, containing commands for setting rewards, manually controlling XP, customizing and toggling level-up messages, as well as ignoring activity in some channels. It has the ability to have 2 leaderboards active at the same time, both of which can be reset at any point. Donor features allow for modification of the cooldown between messages, as well as modification of the XP gain per message.
Nadeko is a multi-purpose bot with a leveling system module. It detects activity exclusively from text chat and has a preset configuration for XP gain. It contains commands for setting up reward roles, toggling level-up messages, ignoring activity in channels and manually controlling XP. A big upside of this bot is that you have the option to host it yourself. This also adds the ability to set up in-bot currency rewards, as well as better overall control of the bot.
Tatsu is another multi-purpose bot with an XP system module. It has several basic commands, allowing creation of reward roles, toggling level-up announcements, modifying XP gain per message and cooldown, manual XP control and ignoring channel activity. It also features a dashboard and a global leaderboard alongside the local one.
With the available selection of bots and documentation explaining the setup of XP systems, using leveling modules is simpler than ever. The configuration options that exist on these modules allow for creative usage of leveling systems with the goal of passively increasing activity within the community.
Naturally, you have to consider several factors prior to deciding on using XP systems. If you are considering using one, think of how you can best integrate it into your community. In which way will you use it? Which type of system would suit your community best? Which bot would be the best for the task? Of course, there are negative aspects to consider as well, meaning you’d have to figure out how to control and minimize them. Carefully weigh all the pros and cons prior to making a final decision.
Content that sexualizes children has no place on Discord or in society. We don't tolerate any kind of text or media - real or generated - on our platform that portrays children in a sexual manner. This type of content causes serious harm to victims.
We consider Off-Platform Behaviors when reviewing content under this policy due to the high-harm nature of the offense.
We report illegal child sexual abuse material (CSAM) and grooming to the National Center for Missing & Exploited Children. Users who post this content are permanently banned from Discord.
Discord has a zero-tolerance policy towards individuals who engage in sexual grooming, extortion (sometimes referred to as “sextortion”), or the sexual exploitation of minors. Users who engage in this behavior will be permanently removed from Discord on a first offense. We consider anyone under the age of 18 to be a minor and anyone 18 or over to be an adult.
Given the high-harm nature of this content, we will also consider off-platform evidence as explained in our Off-Platform Behaviors Policy when reviewing content under this policy.
We take the safety of our younger users seriously and don't allow children under the age of 13 to have a Discord account. We want our teen users to be able to express themselves freely on Discord while also taking steps to ensure these users don’t engage in risky behaviors that might endanger their safety and wellbeing.
We also don’t allow servers that encourage or facilitate dating between teens, even if run by teens themselves. These types of spaces can make our teen communities targets for others who may want to exploit them and can inadvertently facilitate other violations of this policy.
While we understand that teens may want to help each other stay safe online, we cannot allow users or communities to put themselves in potential harm’s way. This means that we will take action against users or servers that we reasonably believe are trying to “bait” or entrap others into inappropriate interactions with teenage users.
We want teens to be safe on Discord, but we also don’t want to penalize them for expressing themselves or exploring their identities. We expect that some users might violate this policy without realizing it, so we provide warnings to teen users where possible. However, for their own safety, we permanently remove a user’s account from the platform if we see consistent violations of this policy or if the user appears to be in immediate risk of harm.
Reporting safety violations is critically important to keeping you and the broader Discord community safe.
All Discord users can report policy violations right in the app by following the instructions here.
If you or someone else is in danger we encourage you to consider contacting law enforcement right away and letting them know what's going on, regardless of the information you're able to provide.
If you or a friend needs mental health support, we encourage you to use the following resources:
If you’re outside of the United States:
If you’re in the United States:
Although we're not affiliated with the staff at any of these hotlines, they're trained to help and guide callers with ways to address the things that are bothering them.
The moderators are the people your community members look to, not just for enforcing server rules and maintaining the peace, but also as role models for what behavior is appropriate on the server. If your users see moderators ignoring or bending certain rules, they will learn that it is okay for them to do so as well, and they will call you out if you then attempt to hypocritically enforce those rules against them. As such, moderators should hold themselves to a higher standard than other users, especially in regards to civility and more subjective rules such as what is considered NSFW content. This applies to private interactions among the mod team as well.
For example, if a moderator is talking in chat and shares a suggestive picture, users will understand that other pictures that are equally as suggestive are okay to post. Not only will this encourage borderline rule-breaking behavior, it makes it more difficult for moderators to peacefully moderate NSFW content, because users will say, “Well, you posted this picture and the picture I posted is basically the same.” The same holds true in the way moderators respond to questions. If someone asks for help and moderators respond rudely or condescendingly, others will treat new users the same way and create a hostile environment.
That’s not to say that moderators can’t have fun, of course. Moderators can and should participate in chat regularly and engage with members as normal users. If a moderator entering chat is disruptive in and of itself, it usually means that moderators are not active enough in the server.
Ultimately, moderators should strive to be seen fondly by server members, yet respected in their positions of authority. Moderators that fail to enforce rules will be seen as unprofessional or “pushovers” by the server members, while moderators that enforce rules too strictly and/or do not participate in chat will be seen as aloof, aggressive, or out of touch.
One of the things you may often hear is that the “spirit of the rules” is more important than the “letter.” In other words, it is more important that people follow the intent of the rules than adhere to a literal or technical definition. As a result, moderators should focus on managing the problems in chat, including addressing unhealthy behavior that may not directly break a rule. It is appropriate to moderate people who are deliberately toeing the line to see what they can get away with (i.e., trolling). But what many moderation teams forget is that the rules are not infallible; moderators should use their judgment to enforce rules only when it makes sense, rather than blindly following the letter of the law.
There may be instances where the wording or specifics of the rules end up disallowing behavior that, in practice, does not go against the main principle of moderation. In these cases, moderators should refrain from warning the user without consulting the rest of their mod team and also seriously consider modifying the rules to more accurately reflect the expectations of the mod team in regards to server conduct.
For example: let’s say you have a rule that prevents users from cropping images to focus on sexual body parts, in order to prevent NSFW conversations from occurring in chat. However, someone ends up cropping an image of an in-game character to focus on her skirt from behind, discussing the outfit. In this case, it may not be appropriate to warn the user, since they are using this image to start an appropriate conversation, even if it technically breaks the rule about cropping pictures. The mod team should instead discuss ways that the rule can be rewritten to cover scenarios like these, rather than resign themselves to warning the user “because the rules say so.”
Remember: the rules exist to serve the community, not the other way around. Moderators should conduct themselves in accordance with the rules, and potentially even better, but they ultimately have the power to change them for the better of the server if need be. Treat your rules as a living document and remember that they are there to improve your community, not stunt it.
While certain rules readily offer an “instant ban” option in some cases (such as doxxing), a user’s conduct may reveal that they are only in chat to troll or otherwise cause trouble in a way that does not break one of the instant-ban rules.
Just as the rules exist to serve the community, so too does the progressive discipline system. Its purpose is to help your members understand their bad behavior and rectify it in the future, without unduly punishing them for occasional small mistakes. Conversely, this means that users who are clearly acting in bad faith on the server may not be afforded the same leniency and should be muted or banned depending on the circumstances, especially if the user in question does not have any previously normal chat history. While any user that immediately breaks rules without a message history could potentially be banned, some behavior you may want to consider in particular includes:
It’s important that disruptive users be addressed quickly before they sour the mood of the other server members (which could lead to additional infractions from users incited by the original bad behavior in the first place). Just as moderators should not use the rules to punish users who don’t practically deserve it, they should also be careful not to let disruptive users remain simply on the principle of following policy.
While the principles above are the most broadly important moderation principles, there may be other things you want to include in your moderator guidelines channel as well, such as:
Always keep in mind any peculiarities of your server and questions your moderators might have so that you can proactively address them before they become issues.
A moderation guidelines channel is an important tool for helping your moderators get acquainted with both the procedural aspects of moderation and the more subjective ones. Moderators should be aware that they are leading by example and hold themselves to a higher standard so that other users will be encouraged to follow their lead. This will help them perform their duties smoothly and allow them to readily de-escalate conflicts before they become an issue, encouraging a positive server culture. Finally, by encouraging your moderators to evaluate situations critically, you develop mods who understand both when users should be swiftly punished and when rules may need to be adjusted or clarified to allow greater flexibility.
If you’re interested in seeing example moderation guidelines, you can check out the link here. Hopefully these help you in developing your own moderation guidelines. Happy moderating!
The next screen will allow you to specify the particular type of abuse that’s occurring. You can always click back and change your first answer, so you can select the most relevant category.
Alternatively, you may select “Delete Message,” which, as a mod, will enable you to report the message to our Safety team while also removing the content from your server.
If a credible threat of violence has been made and you or someone else are in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.
Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor who can help you or a friend through any mental health crisis by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text Line here.
You can find more resources about mental health here.
When we become aware of content that violates our Community Guidelines or Terms of Service, our Safety team reviews it and takes the necessary enforcement actions, including disabling accounts, removing servers, and, when appropriate, engaging with the proper authorities. We may not review your report manually or respond to you directly, but we’ll use your report to improve Discord.
You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our quarterly Transparency Report.
We’re committed to making Discord a safe place for teens to hang out with their friends online. While they’re doing their thing and we’re doing our part to keep them safe, sometimes it’s hard for parents to know what’s actually going on in their teens’ online lives.
Teens navigate the online world with a level of expertise that is often underestimated by the adults in their lives. For parents, it may be a hard lesson to fathom—that their teens know best. But why wouldn’t they? Every teen is their own leading expert in their life experiences (as we all are!). But growing up online, this generation is particularly adept at what it means to make new friends, find community, express their authentic selves, and test boundaries—all online.
But that doesn’t mean teens don’t need adults’ help when it comes to setting healthy digital boundaries. And it doesn’t mean parents can’t be a guide for cultivating safe, age-appropriate spaces. It’s about finding the right balance between giving teens agency while creating the right moments to check in with them.
One of the best ways to do that is to encourage more regular, open and honest conversations with your teen about staying safe online. Here at Discord, we’ve developed tools to help that process, like our Family Center: an opt-in program that makes it easy for parents and guardians to be informed about their teen’s Discord activity while respecting their autonomy.
Here are a few more ways to kick off online safety discussions.
If a teen feels like they could get in trouble for something, they likely won’t be honest with you. So approach these conversations from a place of curiosity first, rather than judgment.
Here are a few conversation-starters:
Teens will be less likely to share if they feel like parents just don’t get it, so asking open questions like these will foster more conversation. Questions rooted in blame can also backfire: a teen may not be as forthcoming if they feel like the adult is already gearing up to punish them.
Read more helpful prompts for talking with your teen about online safety in our Discord Safety Center.
Our goal at Discord is to make it a place where teens can talk and hang out with their friends in a safe, fun way. It’s a place where teens have power and agency, where they get to feel like they own something.
Just because your teen is having fun online doesn't mean you have to give up your parental role. Parents and trusted adults in a teen’s life are here to coach and guide them, enabling them to explore themselves and find out who they are—while giving them the parameters by which to do so.
On Discord, some of those boundaries could include:
Using Discord’s Family Center feature so you can be more informed and involved in your teens’ online life without prying.
In the spirit of meeting teens where they are, we’ve also introduced a lighthearted way to spur conversations through a set of digital safety tarot cards. Popular with Gen Z, tarot cards are a fun way for teens to self-reflect and find meaning in a world that can feel out of control.
The messages shown in the cards encourage teens to be kind and to use their intuition and trust their instincts. They remind teens to fire up their empathy, while also reminding them it’s OK to block those who bring you down.
And no, these cards will (unfortunately) not tell you your future! But they’re a fun way to initiate discussions about online safety and establish a neutral, welcoming space for your teen to share their concerns. They encourage teens to share real-life experiences and stories of online encounters, both positive and negative. The idea is to get young people talking, and parents listening.
Sometimes, even as adults, it's easy to get in over your head online. Through our research with parents and teens, we found that while 30% of parents said their Gen Zer’s emotional and mental health had taken a turn for the worse in the past few years, 55% of Gen Z said the same about themselves. And while some teens acknowledged that being extremely online can contribute to that, more reported that online communications platforms, including social media, play a positive role in their lives by providing meaningful community connection. Understanding healthy digital boundaries and how they can impact mental wellbeing is important, whether you’re a teen, a parent, or any age in between.
When it comes to addressing each individual’s unique safety needs, there are resources available, such as Crisis Text Line. Trained volunteer Crisis Counselors are available 24/7 to support anyone in their time of need. Just text DISCORD to 741741 to receive free and confidential mental health support in English and Spanish.