May 12, 2022

How we investigate

When we receive a report from a Discord user, the Trust & Safety team reviews the available evidence and gathers as much information as possible. The investigation centers on the reported messages, but it can expand if the evidence points to a larger violation. For example, we may investigate whether the entire server is dedicated to bad behavior, or whether the behavior is part of a wider pattern.

We spend a lot of time on this process because context matters: where and how something is posted can change its meaning entirely. We may also ask the reporting user for more information to help our investigation.

Responding to user reports is an important part of our Trust & Safety team’s work, but we know there is also violating content on Discord that might go unreported. This is where we get proactive. Our goal is to stop bad actors and their activity before anyone else encounters it. We prioritize getting rid of the worst-of-the-worst content because it has absolutely no place on Discord, and because the risk of harm is high. We focus our efforts on exploitative content, in particular non-consensual pornography and sexual content related to minors, as well as violent extremism.

Please note: We do not monitor every server or every conversation. Privacy is incredibly important to us and we try to balance it thoughtfully with our duty to prevent harm. However, we scan images uploaded to our platform to detect child sexual abuse material. When we have data suggesting that a user is engaging in illegal activity or violating our policies, we investigate their network, their activity on Discord, and their messages to proactively detect accomplices and determine whether violations have occurred.
