December 21, 2023

Upholding Our Commitment to Safety in 2023 and Beyond

At Discord, we know that creating a fun and engaging place to hang out with your friends online hinges on our ability to keep you safe. As head of Discord’s Safety team, protecting our users is both a business imperative and a personal passion for me.

Safety is core to who we are as a company and we continuously invest in new and innovative ways to keep our users safe. The work we did this year is a testament to that, and I am looking forward to continuing this work in 2024 and beyond.

Product Developments

Discord’s ground-up approach to safety starts with our product and features. Not only do we intentionally build products that help to keep our users safer, but we also implement a Safety by Design practice, which includes a risk assessment process during the product development cycle that helps identify and mitigate potential risks to user safety.

This year, we made key investments that help teens, parents, and all of our users have a positive Discord experience:

  • We launched Teen Safety Assist, which introduced new features aimed at better protecting teen users on Discord, including safety alerts when a user receives a DM from someone for the first time, and content filters to blur media that may be sensitive. These features are enabled by default for teen users. We collaborated with leading child safety organization Thorn to ensure this tool was grounded in the latest research and included language that resonates with teens. We plan to announce more features as part of this tool in 2024.
  • We developed a Warning System to help users more easily understand when and how they’ve broken Discord’s rules and provide them with resources to learn how to do better in the future. For teens specifically, this feature allows us to consider how they are learning and growing, giving them more opportunities to learn from mistakes rather than punishing them harshly for certain misbehaviors. Some violations are more serious than others, and in those cases, we’ll take the appropriate action, especially for violations related to violent extremism or sexualization of children.
  • As part of a Tech Coalition hackathon, we built and implemented a new visual safety technology called CLIP, which has shown positive results in detecting previously unknown CSAM as well as AI-generated CSAM. We have made this technology open source so that we can share our innovation with other organizations at no cost and contribute to the broader fight against CSAM online.
  • We unveiled Family Center, an in-app tool to help keep parents and guardians informed of how their teens use Discord while still giving teens autonomy and control over their experience. This is a pivotal step in helping parents and teens develop collaborative approaches to build positive online behaviors.
  • We made updates to Discord’s Moderator Toolkit, equipping server moderators with new ways to proactively identify potentially unsafe activity, such as server raids and DM spam, and take swift actions as needed. 
  • As a part of our regular process to review and update our policies, we partnered with leaders in the teen and child safety space to ensure our teen and child safety policies are crafted with the latest research, best practices, and expertise in mind. For example, as part of our zero-tolerance policy for child sexual abuse, we added a clear distinction that we do not allow AI-generated photorealistic CSAM. We took a holistic and nuanced approach to these updates in order to better support and educate our teen users while continuing to make Discord a hostile environment for bad actors.

Partnerships

We believe that everyone in the industry should share solutions, intelligence, and technology, which is why we continued to invest in our partnerships with a number of safety organizations this year. These organizations help us keep our platform and the internet safer by supplementing our expertise and insights and helping us continue to advance cutting-edge safety efforts.

As part of our extensive fight against child sexual abuse material (CSAM), we continued to deepen our partnership with the Tech Coalition. I joined the Tech Coalition’s Board of Directors and am excited to help shape the organization’s future and continue to collaborate with industry leaders to combat online sexual abuse and exploitation.

One of the most exciting parts of our partnership with the Tech Coalition this year was the launch of Lantern, a first-of-its-kind signal-sharing program for companies to enhance and strengthen how we detect attempts to sexually exploit and abuse children and teens online. Discord joined other industry partners like Google, Mega, Meta, Quora, Roblox, Snap, and Twitch to share signals about activity we detect that violates our policies and to utilize signals shared by other companies to surface violating activity on our own platform.

We also expanded our partnership with INHOPE, a global network combating CSAM online. Through this partnership, we are better able to collaborate with hotlines around the world working to address this important issue.

We joined the Internet Watch Foundation (IWF), the UK’s frontline defense against the spread of online child sexual abuse imagery. By becoming a member, Discord will be able to deploy IWF’s world-leading services to keep the platform safer, including the IWF’s URL List and Keywords List, as well as its Non-Photographic URL and Hash Lists.

Discord also joined the Christchurch Call, a community of over 120 members worldwide with a shared goal of eliminating violent extremism online. Partnership and transparency are cornerstones of our effort to counter extremism and we are committed to working collectively with this global community to make the internet and the world a safer place.

Discord is partnering with the European Center for Not-for-Profit Law to lead a pilot project to implement their Framework for Meaningful Engagement (FME). ECNL’s expertise and the FME will help us build and deploy AI systems in support of our safety work.

We continued our important partnership with the National PTA around family engagement efforts. Through their Build Up and Belong program, we supported local PTAs as they hosted events to help families build healthy digital habits and foster productive conversations about navigating online interactions. 

Additionally, we are proud board members of Digital Trust & Safety Partnership, which gathers tech platforms to collaborate on and publish online safety best practices. This includes conducting audits of our policies and processes to ensure safety is built into our systems by design. We also are active members of the Trust & Safety Professional Association, an organization that gathers Trust & Safety professionals from around the world for training and to share best practices across the industry.

Looking to 2024 and beyond

As technologies continue to evolve, so does the safety landscape, presenting new challenges that no company or government can tackle alone. The industry must continue to invest in safety features so that users on our platforms are safer. We can do this by evolving our features and policies to ensure they are up to date, holistic, and meet the needs of our users; by regularly sharing knowledge and progress; and by working collaboratively with each other and with legislators, advocacy groups, and academics to ensure a safer online experience for everyone.

While we’ve made a lot of progress over the last year, we know there is still much more we can and must do – on our own and in collaboration with others – to combat known and new threats. The work here never stops. As we look to the New Year, my team and I are as committed as ever to building on the important safety work we’ve done over the past twelve months, and we are confident that Discord will continue to be the best place for people to have fun online.

Tags:
Communications
User Safety
