303: Facilitating Positive Environments

⚠️ Content Warning: This article contains sensitive terms which are offensive and often used by bad actors in communities for the purpose of harassment. They are displayed here to provide context for what these terms mean for those unfamiliar with them, which will in turn allow moderators to make appropriate decisions about which content to filter in their communities and foster environments where everyone feels welcome.

Introduction

The foundation of a server on Discord is the community that populates it. Your community is what you engage with, protect, and grow over time. Engagement is important to focus on, but it’s just as important to make sure you are facilitating positive and welcoming engagement.

Why Positive Environments are Important

Positive engagement can mean a lot of things, but in this article, we will be referring to the way in which moderation can affect the culture of the server you are moderating. As moderators, your policies, knowledge of your community, and deductive skills influence the way in which your community engages with each other and with your team.

When you establish and nurture your community, you are growing a collective group of people who all enjoy at least some of the same things. Regardless of your server topic, you will undoubtedly have members of a variety of ethnicities, sexual orientations, and identities from across the world. Ensuring that your space on Discord is a space where they belong means making it safe for them to be themselves, wholly and without reservation. Your members are all humans, all community members, all people who deserve respect and deserve to be welcomed.

Establishing Community Boundaries in Moderation

When you are establishing your community, it’s important to have a basic understanding of what kind of environment you would like your server to be. It helps to break down your general moderation philosophy: what content and discussion you’d like your community to engage in, and what content would be inappropriate for the space. Depending on the topic of your server these goals may differ, but some common questions you can ask to establish general boundaries are:

  • What is the main topic of my server? When you’re thinking about the community and their impact on the growth of your server, it’s important to decide what kind of server you want to build on a basic conceptual level. If, for example, you are creating a politically-driven server, you might have different limits and expectations for content and conversation than a server based on Tetris or pets.
  • What topics do I expect users to engage in? Some servers will have the expectation that members are allowed to discuss more sensitive, controversial, or thought-provoking topics, while others may feel that these kinds of heavy debates are out of place. Video game servers tend to have a no-politics rule to avoid negative debates and personal attacks that are beyond the scope of the video game(s) in question. Servers centered around memes, IRL interests, or social communities can range across many more topics and have looser rules, while servers centered around mental health or marginalized communities can lean towards a stricter, on-topic-only community policy.
  • What would I like to foster in my community? While knowing what to avoid and moderate is very useful, having an idea of what kind of atmosphere you’d like the server to have goes far in setting the mood for the rest of the community at large. If users notice moderators are engaging in good-faith and positive conversations and condemning toxic or hateful discussion, it is more likely that your users will join in and participate in that positive conversation. If they see you and your mod team have taken the initiative to preserve the good atmosphere of the community, they are moved to put in the effort to reciprocate.
Evaluating Types of Harmful Rhetoric

This section will be more specific and will break down the most common ways in which a user can engage in harmful rhetoric, how to de-escalate discussions that attack marginalized communities, and how to properly address uncommon symbols used to attack communities.

Harmful Terms and Ableist Language

Members of your community may use obscure symbols or terms to send an offensive message while avoiding blatant attention or triggering filters. While some of these will be used under the guise of being “internet culture,” it’s important to be cognizant that these symbols and language can cause a lot of pain to many marginalized people. Understanding and taking these terms seriously will help mitigate the long-term damage this kind of behavior can cause to your server’s culture and to individual members. This includes not only symbols, but also popular terms that are used to intentionally sow discord and push harmful narratives.

These are all terms and symbols used to specifically target and belittle groups of people, and they are harmful to the growth of a welcoming environment for your community. Some are slurs, while others are generally harmful rhetoric:

Ableist Terms

  • Autistic, Autist, Retard, etc.: Very common ableist terms used to insult users’ intelligence. Commonly used as slurs to attack neurodivergent people, and should be avoided if possible.
    • To be specific, ‘retard’ is a slur used specifically to attack neurodivergent people. ‘Autistic’ can be used neutrally by autistic people to refer to themselves, and it should only be moderated if it is being used as an insult.
  • Handicap/Mentally Retarded/Defective: Similar to the above, used to refer to users in a demeaning manner, particularly to attack their intelligence or capabilities as people. Can also be used in an indirect manner with just as much harmful subtext; for example, calling a character in a game a “wheelchair character even the mentally defective could play”.

Racist Terms

  • Jap: Used during World War II, when Japanese Americans were held in internment camps in the US, this term was a derogatory way US citizens referred to Japanese people and is widely considered an ethnic slur against Japanese people.
  • Gypsy/Gypped: Whether referring to the Romani people (‘Gypsy’) or to being robbed or conned (‘gypped’), these terms are used as ethnic slurs against the Romani people. While ‘Gypsy’ still appears in some legal contexts, the words have slowly fallen out of use due to their long history of use as slurs.
  • Chink/Ching Chong: ‘Chink’ has historically been used as a slur against people of Chinese descent, and sometimes against people of Asian descent more broadly; ‘ching chong’ mocks the Chinese language and is commonly used alongside it.
  • Triple Parentheses, also known as (((echo))): This is a less common but more recently adopted symbol used to denote someone of Jewish origin, typically in order to target or harass them. This symbol is used to single out Jewish people and place a target on their back for their religion or ethnicity, and should not be tolerated.

LGBTQ+ Specific Slurs

  • Dyke/Lesbo: A term that originated as a slur against more masculine-presenting lesbian women, ‘dyke’ has been reappropriated by its community into a common slang term for lesbian women. While some people would not mind being called a dyke, be aware of its possible negative connotations for people who may be uncomfortable with the term.
  • Thing: Specifically in reference to pronouns, using ‘thing’ instead of a user’s preferred pronouns mocks the way they express their gender identity and is commonly used to invalidate or diminish trans and non-binary people.
  • Fag/Faggot/Homo: All terms used to refer to gay people, and all heavy slurs intended to belittle and attack them. These words are also commonly used in the real world when gay people are attacked, and should not be taken lightly.
  • Trap: A term that originated from anime, this word refers to men who dress and present in a feminine way and thereby ‘trap’ heterosexual people into being attracted to them. The word has been used outside its original context as a slur against transgender people, as if their existence were to ‘trap’ or ‘trick’ the people around them. Not everyone finds this term offensive, so your team should evaluate on a case-by-case basis whether it warrants moderation.

While some of these terms may be popular in certain spaces (such as gaming), it’s important to understand the history and weight behind them, and think accordingly about their place in your server long-term.

Creating an LGBTQ+ Friendly Environment 

Online platforms tend to have a large amount of hateful content and rhetoric directed against marginalized groups of people. When crafting a community, there has to be a common goal of acceptance and welcome for all of your members. In online communities, it is not uncommon for users to voice their disdain at other users for their choice of pronouns, gender presentation, or anything that relates them to the LGBTQ+ community.

What is an Ally?

An ally is someone who is not a part of the LGBTQ+ umbrella but who supports and ‘allies’ with the community to create an open and welcoming atmosphere. In your server, it is important to understand your role in being an ally to your community and to your users. Moderation is a key component in allowing people to present themselves openly in your server and grow long-lasting connections with other members.

What are Pronouns?

Pronouns are what people use to refer to a person without directly stating their name. Common pronouns are They/Them, She/Her, and He/Him. There are many others not covered here, but pronouns can be very important to someone’s identity and how they’d prefer to be addressed. Users may, at one point or another, intentionally refer to other users by incorrect pronouns as a “joke” in order to invalidate their identity, dehumanize them, and humiliate members of the community to the point that they no longer feel able to interact within the server.

Additionally, it’s worth noting that comments such as ‘there are only two genders’ are used to directly disrespect and undermine the trans community without appearing openly confrontational. This phrasing is used to skirt the rules by appearing much less antagonizing than the words truly are.

What are Important LGBTQ+ Terms to Know?

There are a few LGBTQ+ specific terms that are good to be aware of when interacting with LGBTQ+ members of your community. It’s also important to keep up with your communities: it can be helpful to do some research as terms and issues pop up. Quickly Googling a new term that you come across from a member can make them feel much more welcome in the community as a whole, and can help address sore spots as well.

  • Enby/NB/Genderqueer: Non-binary, a term to describe a person whose gender identity doesn’t fit into the gender binary (female and male). (NB is an acronym that is also used for “non-black” in some contexts.)
  • AMAB/AFAB: Terms to describe the gender someone was assigned at birth, usually in contrast to the gender they present as now. AMAB is ‘Assigned Male at Birth,’ meaning that someone’s birth certificate says “male” on it, and AFAB is ‘Assigned Female at Birth,’ meaning that someone’s birth certificate says “female” on it.

These are also some terms to be aware of that are related to LGBTQ+ issues that you may see being brought up: 

  • Chaser: A term to describe a cisgender person who fetishizes or objectifies transgender people (most often transgender women), and seeks out relationships with them.
  • TERF or Trans Exclusionary Radical Feminist: A term for gender-critical individuals who consider themselves feminists but do not acknowledge transgender women as women, and who promote the exclusion of trans women from women’s spaces and organizations.
Moderating Hateful Content

When it comes to the content you allow or moderate in your server, it’s important to, again, reflect on what type of community you are. It’s also important that you act quickly and precisely on this type of harmful behavior. Some users will slowly push boundaries on what type of language they can ‘get away with’ before being moderated.

When discussing moderation, a popular theory that circulates is the broken windows theory. This theory holds that visible signs of antisocial behavior, civil unrest, disorder, and crime in an area encourage further antisocial behavior and crime. Similarly, if you create an environment in which toxic and hateful behavior is common, the cycle will perpetuate into further toxicity and hatefulness.

What is Bad-Faith Content vs. Good-Faith Content?

‘Bad-faith’ content is a term that describes behavior done intentionally to cause mischief, drama, or toxicity in a community. The users behind it are commonly referred to as bad actors, and are the type of people who should be dealt with swiftly and addressed directly.

‘Good-faith’ content is a term that describes user behavior with good intentions. When such users form a positive foundation in your community, the members who join and interact with the established community will grow to adapt and speak in a way that continues the positive environment that has been fostered. It’s important to note that while ‘good-faith’ users are generally positive people, it is possible for them to say wrong or sometimes even harmful things. The importance of this distinction is that these users can learn from their mistakes and adapt to the behavior you expect of them.

When users toe the line of what is acceptable, they are not acting in good faith. As moderators, you should be directly involved enough to determine what is bad-faith content and remove it. On the other hand, education is important in the community sphere for long-term growth. While you can focus on removing bad behavior from bad-faith users, reforming good-faith community members who are uneducated about harmful rhetoric should also be a primary goal when crafting your community. When interacting in your community, if you see harmful rhetoric or a harmful stereotype, step back and meaningfully think about the implications of leaving that kind of language up in your channels. Does it:

  • Enforce a negative stereotype?
  • Cause discomfort to users and the community at large?
  • Create a space where users feel excluded from the community?
Ideas to Help Prioritize Inclusivity
  • Allowing users to have pronouns on their profile. Depending on your server, you may choose to have pronoun roles that members can directly pick from to display on their profile. This is a way to allow users to express their pronouns without isolating them. With a larger, more welcoming system for pronouns, it is much harder to tell whether someone has pronouns because they are LGBTQ+, because they’re an ally, or simply because it was part of setting up their roles. When servers have pronoun systems built into them, this can also allow for a community-wide acceptance of pronouns and respect for other users’ identities, and can deter transphobic rhetoric.
  • Discourage the use of harmful terms. It’s no secret that terms such as ‘retard’ and ‘trap’ are used in certain social circles commonly. As moderators, you can discourage the use of these words in your community’s lexicon.
  • Create strong bot filters. Automated moderation of slurs and other forms of hate speech is probably your strongest tool for minimizing the damage bad actors can do in your server. Also add the variations people commonly use to slip past the filter (for example, a commonly filtered slur spelled with an added or removed letter); a small sketch of this idea appears after this list.
    • A good document to follow for bot filters and auto moderation as a whole is also in the Discord Mod Academy, which can be found here
  • Educating your community. Building a community without toxicity takes a lot of time and energy. The core of all moderation efforts should be in educating your communities, rewarding good behavior, and making others aware of the content they are perpetuating.
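
If someone on your team is comfortable with a bit of scripting, the sketch below illustrates the general idea behind catching common filter evasion. It is a minimal, hypothetical Python example, not a Discord feature or any particular bot’s API: the term list, the character-substitution map, and the function names are all placeholder assumptions made for illustration.

```python
import re

# Hypothetical deny-list -- placeholders only; your team decides what belongs here.
RAW_BLOCKED_TERMS = ["exampleslur", "anotherbadword"]

# Common character swaps used to dodge filters (e.g. "b4dword", "sl0r").
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})


def normalize(text: str) -> str:
    """Lower-case, undo common character swaps, strip everything that isn't a
    letter, and collapse repeated letters so "b a a a d w o r d" still matches."""
    text = text.lower().translate(LEET_MAP)
    text = re.sub(r"[^a-z]", "", text)       # drop spaces, punctuation, leftover digits
    text = re.sub(r"(.)\1+", r"\1", text)    # collapse runs of the same letter
    return text


# Normalize the deny-list itself so both sides are compared the same way.
BLOCKED_TERMS = {normalize(term) for term in RAW_BLOCKED_TERMS}


def contains_blocked_term(message: str) -> bool:
    """Return True if any blocked term appears in the normalized message."""
    normalized = normalize(message)
    return any(term in normalized for term in BLOCKED_TERMS)


# Quick check of the idea:
print(contains_blocked_term("you are an ex4mple-sluuur"))  # True
print(contains_blocked_term("have a nice day"))            # False
```

Aggressive normalization like this trades some false positives (ordinary messages that happen to contain a blocked term once spaces and doubled letters are removed) for better coverage of evasion attempts, so it works best alongside human review rather than as a fully automatic punishment system.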

A core part of handling any de-escalation lies in your approach. Users who are heated up during a frustrating or toxic discussion are easy to set off or to accidentally escalate into more toxicity. The key is to type calmly, and to make sure that however you approach someone to de-escalate, you do it in a way that is understood to be for the benefit of everyone involved.

Closing

Creating a healthy community that leaves a lasting, positive impact on its members is difficult. Moderators have to be aware, educated, and always on the lookout for things they can improve. By taking the initiative on this front, your community can grow into a positive, welcoming place for all people, regardless of their race, gender, gender identity, or sexual orientation.

Other Resources

Below are resources for further research and discussion on different types of slurs, symbols, and hate speech not referenced explicitly in this document. 



Ready to test your moderator skills? Take the Discord Moderator Exam!