There’s not one industry standard, or even a societal definition, of what it means to troll online. In the most extreme cases, there are some who use trolling as a way to intimidate. But to others, online trolling isn’t meant to cause harm. For the instigator, it’s often all in good fun, which can mean playing a silly prank, or poking fun and razzing your friends to get a rise out of them.
Regardless of the intent, however, there’s no doubt that words and actions on the internet can have an impact, especially when they are unwelcome by those on the receiving end. We know from research that even people who witness this type of behavior can be affected.
Discord was created for people to hang out and talk together. We take a stance against bullying and harassment, because “it's the kind of behavior that impacts the culture and feelings of safety,” said Bri Riggio, a senior platform policy manager.
But how you define those behaviors isn’t so clear cut.
That’s why Riggio and her team study the effects that bad internet behavior can have on individuals, communities, and society. They define the terms and craft policies that aim to educate users and help them understand and learn from their mistakes.
One of the main challenges for the Platform Policy team is writing the rules for content that isn’t so clearly defined. Riggio notes that for the most severe violations, such as posting illegal material or making an imminent threat of physical harm, the line between what’s allowed and what’s prohibited is bright red. Discord has a strict policy for those types of violations and clear processes in place for removing content and banning the user.
But for content where there aren’t legal definitions or even clear societal ones, like for bullying and harassment, the line is more gray.
Landing on a definition starts with posing a fundamental question, said Riggio: “What is the problem that we’re trying to solve here on Discord?” How the team goes about answering that challenge becomes the guiding force behind each policy.
For bullying and harassment, the team looks at the user experience. We don’t want people to experience psychological or emotional harm from other users.
“The spirit of the policy is focused on individual and community experience and building healthy interactions,” Riggio said. Because ultimately, Discord is a place where everyone can find belonging, and experiencing these feelings hinders people from doing that.
The team conducted rigorous research, reviewing other platforms’ policies as well as guidance from industry groups, including the Global Internet Forum to Counter Terrorism and the Tech Coalition, which was formed so companies could share the best intelligence and resources when it comes to child safety online. They also reviewed the academic literature on how bullying and harassment play out in society.
They also brought in subject matter experts from different countries, such as academics who study harassment and legal scholars who can offer additional context for how platforms should be thinking about policies and enforcement. These experts provided feedback on early drafts of the policies.
They defined bullying as unwanted behavior purposefully conducted by an individual or group of people to cause distress or intimidate particular individuals, and harassment as abusive behavior towards individuals or groups of people that contributes to psychological or reputational harm that continues over time.
Policies like this one are only useful if they are enforceable. So the team identified two dimensions that must be taken into account when assessing content.
First, the intent of the instigator: signals that the user or server is posting content deliberately or intends to cause harm. Intent isn’t always easy to spot, but Riggio said that, just as in real life, there are signs. For example, a user directly harasses someone, or there have been multiple instances of harassment in the past.
“What happens in society happens online. What happens in your high school can sometimes happen on Discord,” she said.
And second, is the content targeted? Is it targeting a specific individual by name? Is it targeting a specific group? Here, there could also be dimensions of targeting users or groups because of certain protected characteristics, such as race or gender, which would then be considered a violation of Discord’s Hate Speech policy.
The term trolling has many dimensions, and it’s been used to normalize “edgy behavior.” “Trolling in and of itself is a complicated term, because some communities that intend to cause harm downplay their behavior with, ‘We're just trolling,’” Riggio said.
Trolling on Discord takes on different dimensions. From “server raids,” when a group of users joins a server at the same time to cause chaos, to “grief trolling,” when someone mocks or degrades the deceased or their next of kin, this type of negative behavior is rooted in disruption and harassment and is not tolerated on our platform, no matter how harmless the instigators may think it is.
“I hate the saying ‘sticks and stones may break my bones, but words will never hurt me,’ because words hurt,” said Patricia Noel, Discord’s mental health policy manager who is a licensed social worker and has a background in youth mental health. She collaborates cross-functionally within Discord and with external partners, such as the Crisis Text Line, with the goal of providing tools and resources to teens who may be experiencing mental health challenges.
Part of the enforcement strategy for bullying and harassment is to include a series of warnings, with escalating enforcement actions aimed at educating users, namely teens, that certain behaviors are not OK on Discord.
“This generation is really empathetic. But I also know that there are people who go online just for the sake of going online and being a troll,” Noel said. “Those might be the very same folks who are having problems at home, who are dealing with their own mental health and well being issues. They might not immediately recognize that they are doing harm to someone else.”
Understanding that young people are going to make mistakes is reflected in how the new warning system handles suspensions. For lower harm violations falling under the Bullying and Harassment policy, Discord will test suspensions rather than permanent bans. The idea here is that some teens may need a long pause before coming back to the platform, given how much growth and development occurs during each year of adolescence.
And while there will always be a zero-tolerance policy for the most serious violations, Discord will focus on remediation, when appropriate, for topics like bullying and harassment. This is so teens can have an opportunity to pause, acknowledge that they did something wrong, and hopefully change their behavior.
The challenge is that the behavior is so wide-ranging, said Ben Shanken, Discord’s Vice President of Product. He pointed to a recent analysis by Discord that looked at reports of bullying in large, discoverable servers.
“If you put these examples on an SAT test and asked, ‘Is this bullying?’ many people probably would answer the questions incorrectly,” he said, noting how Discord’s large teen user base is inclined toward “troll-y” behavior.
In one example, Shanken shared how teens were trolling each other by deceptively reporting their friends to Discord for supposedly being underage when in fact they were not.
It’s trolling. And while it’s not nice, it’s also part of how teens joke around. But instead of simply banning the offender, we build in an opportunity for them to learn from their mistake.
“If we ban a person, and they don't know what they did wrong, they likely won't change their behavior,” said Shanken. “Giving them a warning and saying, here's what you did wrong, flips the script in a more constructive way.”
At its core, Discord is a place where people can come together to build genuine friendships. Educating users, especially young people, through our warning system is one part of Discord’s holistic strategy to cultivate safe spaces where users can make meaningful connections.
“The nature of being a teenager and navigating relationships can be tricky. There are going to be situations where you hurt people's feelings. There may be moments where users lose their cool and say something really mean or even vaguely threatening,” Riggio said. “We want to build in more of a runway for them to make mistakes online but also learn from them.”