January 25, 2024

The Teachable Moments Hidden in Plain Sight

Discord helps teens build their “digital safety muscle” by developing policies and new features with their needs in mind.

For the producers of a famous 1970s TV drama series about teens and for teens, there was only one rule that every story had to follow: the protagonist always had to figure out what to do for themselves. No parent, coach, or trusted adult could swoop in to make all the decisions, even if they were there to offer support. The teen had to arrive at their own solution.

The US-aired episodes, known as “afterschool specials,” featured junior high and high schoolers dealing with issues relevant to their peer group—from drug abuse to relationships. The popularity of the series, which ran for three decades, is a testament to its formula of centering the teen experience. The stories didn’t dismiss young people as too naive. They didn’t finger-wag or lecture. They recognized the need for teens to come to their own decisions after weighing potential consequences.

While the afterschool special feels outdated today, the lessons are universal: as teens develop, they need to test the bounds of their independence and autonomy. This is backed up by leading research from organizations such as the Digital Wellness Lab at Boston Children’s Hospital, which studies and promotes healthy online practices for children and young adults.

It’s also the mindset behind Discord’s new initiatives and features aimed at keeping all users, especially teens, safe. These include Teen Safety Assist, which uses proactive filters and alerts that double as teachable moments woven throughout the platform.

Using research to help teens build their “safety muscle”

To do that, Discord works with organizations like the Digital Wellness Lab at Boston Children’s Hospital and the National PTA to understand the needs of one of our largest user bases.

“We are experts in technology, not experts in teenagers. So we need to make sure we are pulling in expert research when we’re thinking about what it means to be a safe, healthy, and ultimately empowering place for young people,” said Liz Hegarty, Discord’s global teen safety policy manager.

Hegarty and her colleagues are guided by the research, which is empathetic to teens' lived experiences and cautions against being overly paternalistic.

For example, the Digital Wellness Lab advises parents to resist the urge to step in and solve every problem for their teen, online or off. Instead, it recommends asking probing questions and talking through the decision-making process. This can clear the way for more conversations about digital boundaries.

“These conversations can help teens build their safety muscle,” said Hegarty.

We built our Family Center with this idea in mind. Launched in 2023, the Family Center is an opt-in tool that enables guardians, trusted adults, and teens to “link” their accounts. While it allows parents to see who their teen is connected with, the tool does not allow them to view the contents of their teen’s conversations with friends.

“The key is to balance respecting teen autonomy and agency with giving parents ways to check in with their teens,” said Savannah Badalich, Discord’s senior director of policy.

“Parents and trusted adults in a teen's life are here to coach them,” said Badalich, who leads the team responsible for key safety areas, such as teen safety and mental health. “Let them explore themselves, let them find who they are. But give them parameters in which to do so.”

There is a risk that a teen who feels constantly monitored may choose not to talk at all. This is why it’s important to follow the research.

“Leading experts say that sort of parental surveillance does not always help generate safety or create a trustworthy environment for teens to talk to their caregivers,” Hegarty said.

Hegarty notes that when building the Family Center, Discord decided to add an additional layer of support: as a teen connects their account with a parent’s, Discord surfaces information for the Crisis Text Line (for users in the U.S. only), a service that provides free, 24/7 mental health support via text, WhatsApp, and web chat.

The idea is to provide available resources to teens and to encourage them to seek help if they need it. It’s another way we embed subtle lessons throughout the experience.

Weaving in teachable moments

Discord’s Teen Safety Assist includes two recently released safety features: one that alerts a teen the first time they receive a direct message (DM) from another user, and another that blurs potentially sensitive content in DMs and group DMs with friends.

When an alert is triggered, the teen might see a pop-up that asks, “Unwanted message? If you don’t want to chat with this person, you can block or mute them.” From there, they can click “learn more” for a set of tips on handling the situation.

“It's this idea of educating as opposed to just punishing teenagers when they make mistakes,” said Hegarty. “In the moment, it’s important to give teens resources, autonomy, agency, and help.”

Hegarty points to research showing that teens are nervous about blocking other people: they worry that if the person finds out they’ve been blocked, they’ll look uncool. Meanwhile, if they report someone, they aren’t sure what will happen next, and they’re anxious about getting somebody in trouble.

Research from Thorn, a leading child welfare nonprofit, also suggests that when teens encounter a potentially harmful experience online, they are less likely to seek support offline, such as talking to a trusted adult. So Discord is trying to meet teens where they are.

“It’s not a paternalistic approach that protects teens from everything. Being overly restrictive risks leaving them without the skills they need to navigate the world once they turn 18,” said Badalich. “Instead, let’s educate them throughout their experience. Let’s give them tools to control their experience.”

Giving teens opportunities to learn

Risk-taking is part of growing up. It’s part of being a teenager.

“Because their brain is still developing, they need options, time, and education,” said Badalich, whose team is also working to sharpen Discord’s enforcement system with teens in mind.

When it comes to developing the policies and rules that govern the platform, Badalich and her team are rethinking what that looks like, with the understanding that teens need extra opportunities to learn.

“Keep up, take down, or ban have been the main levers. And we're continuing to explore even more ways to take necessary action,” said Badalich. For example, “When you think about someone talking about self-harm, do we really want to penalize them? Or do we want to figure out some other form of intervention?”

To be clear, when it comes to the most severe violations—such as those involving violent extremism and content that sexualizes children—Discord will continue to have a zero-tolerance policy.

But when it comes to building out more nuanced, research-driven policies, the policy team looks to lessons from restorative justice, a concept that recognizes that people are capable of change.

“Instead of just being purely punitive all of the time,” said Badalich, “Discord wants to move in the direction of where we are giving people opportunities to change, grow, and learn.”

Tags:
Partnerships
Policy
Parents and Teens
User Safety
