The European Union’s Digital Services Act, or “DSA”, is an important regulation that affects how certain companies moderate content in the EU. We’ve been working hard on our approach to the DSA, and as the law comes into force for companies like ours, we want to outline some changes EU users are going to see across Discord.

Creating a place for safe connection is a top priority, and we invest heavily in it. Independent of any regulation, more than 15% of our employees are focused on safety, and we work tirelessly to enforce our Community Guidelines, removing content that violates them and the bad actors behind it. Today we’re giving you more information about the work we’ve been doing to meet our obligations under the DSA for our users based in the EU. As we updated our products and policies, we focused on what would be best for our users, and we are proud of what we have built. And while we’re starting with the EU, we plan to roll out certain parts of this system that improve safety to all of our users, so that our platform continues to be the best place to hang out and have fun with friends.

Transparency

Transparency has always been one of our core values at Discord. In October 2023, we announced our new Warning System, which is an in-app hub where users can easily review their account status. We want users to be able to easily understand what rule(s) they broke, what restrictions may have been placed on their account, and how they can do better in the future.

This information lives in a user’s Privacy & Safety settings, under the Account Standing tab, the centralized place where users can see what actions we have taken. For EU users, we are rolling out an experience that lets users log in and view their account standing even when the account is suspended. This enables better transparency: users can understand why their account was suspended and submit an appeal, if appropriate. We’re also adding more context to our action notices, including the content that violated our policies (as long as providing that additional context does not harm Discord or others). And for every report actioned under the DSA, we will send notices by email so users are alerted outside the app as well.

We also want people to understand how to exercise their rights under the DSA. To that end, we have made updates to our Safety Center and other articles to provide context on how to submit DSA reports, appeal decisions, and more, and we will be updating our user policies (including our Terms of Service) to account for DSA requirements.

Reports & Appeals

We’ve always made it easy for users to report violations of our Community Guidelines. To meet DSA requirements, we have created an additional, streamlined way for EU users to report illegal content to us. From the standard in-app reporting menu, EU users can open a dedicated report form and provide additional information about the content they are reporting. As mentioned above, users will receive an email informing them of the outcome of the report, as well as how to appeal that decision.

Reports from Law Enforcement & Trusted Flaggers

Law enforcement and Trusted Flaggers are important partners in keeping our platform safe. We have a dedicated Government Request Portal, and we have established a specific pathway within this portal to receive reports from Trusted Flaggers under the DSA. This will facilitate efficient and accurate communication between Discord, law enforcement, and Trusted Flaggers. Organizations designated as Trusted Flaggers can register via the portal. Once their account is validated, reports will be directed to a prioritized queue for resolution. More information about the Government Request Portal can be found in our Safety Center.

Accountability

Like other companies covered by the DSA, we will send information about the reports we receive, and how we respond to them, to a public database. Later in the year, we will also publish a new DSA-focused Transparency Report that provides additional insight into our content moderation actions, including how we handle user appeals and requests from EU governments to take down user content.

Keeping our users safe is central to everything we do at Discord, and we will continue our work with industry, parents, law enforcement, safety experts, non-profits, and regulators worldwide to do so. For more information on these changes under the Digital Services Act, please visit our Help Center.
