Child Safety

Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Users who upload abuse material of minors to Discord are reported to the National Center for Missing & Exploited Children (NCMEC) and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement.
In the first quarter of 2022, we reported 10,695 accounts to NCMEC. 10,641 of those reports were media (images or videos), many of which were flagged through PhotoDNA, a tool that uses a shared industry hash database of known CSAM (child sexual abuse material). The remaining 54 were high-harm grooming or endangerment reports. Overall, this represented a 29% increase in reports made to NCMEC compared to the fourth quarter of 2021.
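PhotoDNA itself is proprietary, so the sketch below only illustrates the general shape of this kind of matching: hash each uploaded file and check it against a database of known hashes. The `KNOWN_HASH_DB`, `media_hash`, and `flag_if_known` names are hypothetical, and SHA-256 stands in for a real perceptual hash; none of this reflects Discord's actual pipeline.

```python
import hashlib

# Hypothetical database of hashes of known abuse material, populated from a
# shared industry hash list. Name and structure are illustrative only.
KNOWN_HASH_DB: set[str] = set()

def media_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash such as PhotoDNA.

    A real perceptual hash stays stable across resizing and re-encoding;
    SHA-256 is used here only to keep the sketch self-contained and runnable.
    """
    return hashlib.sha256(data).hexdigest()

def flag_if_known(upload: bytes) -> bool:
    """Return True when an upload matches a known hash and should be
    escalated for human review and reporting (e.g. to NCMEC)."""
    return media_hash(upload) in KNOWN_HASH_DB
```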
Discord disabled 826,591 accounts and removed 24,706 servers for Child Safety in the first quarter of 2022. Sexualized Content Regarding Minors (SCRM) was the single largest sub-category within Child Safety, accounting for 718,385 of the accounts disabled and 22,499 of the servers removed.
Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of NCMEC and a member of the Technology Coalition. We’re also an annual sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the Crimes Against Children Conference in Dallas.
We have a dedicated team and invest heavily in advanced tooling and education so parents know how our service works and understand the controls that can help create a positive and safe experience for their children on Discord. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association (PTA) and ConnectSafely. In recognition of Safer Internet Day in February, Discord and the National PTA hosted an event that brought together parents, educators, and teens to discuss online safety. We continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.
Deceptive Practices

Using Discord to distribute malware, share or sell game hacks or cheats, or steal authentication tokens is a violation of our Community Guidelines.
We disabled 5,091 accounts and removed 1,797 servers for Deceptive Practices during the first quarter of 2022.
Exploitative and Unsolicited Content

It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 146,897 accounts and removed 3,525 servers for Exploitative and Unsolicited Content, the second-largest category of accounts disabled. These numbers were similar to the previous quarter, with one notable trend: accounts disabled for Non-Consensual Pornography increased by 132% as a result of increased proactive efforts to remove this content from Discord.
Harassment and Bullying

Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
During the first quarter of 2022, 13,423 accounts were disabled for harassment-related behavior, and 716 servers were removed for this issue. Both the number of accounts disabled and servers removed stayed roughly consistent with the previous quarter.
Hateful Conduct

Discord doesn’t allow the organization or promotion of, or participation in, hate speech or hateful conduct.
During the first quarter of 2022, 8,806 accounts and 965 servers were removed for Hateful Conduct. Compared to the previous quarter, this was a decrease of 6.5% for accounts and 18% for servers.
Identity and Authenticity

Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 154 accounts and removed 16 servers for this issue.
Platform Manipulation

Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
During the first quarter of 2022, 2,905 accounts and 972 servers were removed for platform manipulation issues unrelated to spam. An additional 26,017,742 accounts were disabled for spam or spam-related offenses.
We’re focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We recently established a dedicated cross-functional anti-spam team combining Engineering, Data, Product, and Safety resources, and we rolled out a feature that lets users report spam with a single click. In the first quarter of 2022, 90% of accounts disabled for spam were disabled proactively, before we received any user report.
You can read more about how Discord fights spam here.
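Several figures in this report (the 90% spam number above, and the proactive rates for Cybercrime and Violent Extremism below) use the same definition: the share of enforcement actions taken before any user report. As a minimal sketch of that calculation, assuming hypothetical action records that flag whether a user report preceded each action:

```python
from dataclasses import dataclass

@dataclass
class EnforcementAction:
    """Hypothetical record of one account disable or server removal."""
    target_id: str
    preceded_by_user_report: bool  # False means the action was proactive

def proactive_rate(actions: list[EnforcementAction]) -> float:
    """Fraction of actions taken before any user report was received."""
    if not actions:
        return 0.0
    proactive = sum(1 for a in actions if not a.preceded_by_user_report)
    return proactive / len(actions)

# Checking against the Cybercrime figures reported below:
# 1,996 proactive removals out of 3,883 total is roughly 0.514, i.e. 51%.
```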
Regulated or Illegal Activities

Using Discord for the purpose of engaging in regulated, illegal, or dangerous activities is strictly prohibited, including selling or facilitating the sale of prohibited or potentially dangerous goods or services.
We disabled 19,669 accounts for engaging in this behavior.
A total of 4,027 servers were removed for this category. Cybercrime, a sub-category, accounted for 3,883 of these servers and had a proactive takedown rate of 51%, with 1,996 of them removed proactively.
Self-Harm Concerns

Using Discord to glorify or promote suicide or self-harm is not allowed under any circumstance.
We may take action on accounts or servers that encourage people to cut themselves or to embrace eating disorders, or that otherwise manipulate and coerce other users into acts of self-harm. These actions are taken only against accounts glorifying or promoting self-harm, not against users seeking help or in need of medical attention.
We disabled 1,795 accounts and removed 594 servers for Self-Harm concerns.
Violent and Graphic Content

Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty with the intent to harass or shock others is not allowed on Discord.
In the first quarter of 2022, 16,069 accounts were disabled for posting violent and graphic content. 5,419 of these accounts were disabled for gore, and the remaining 10,650 accounts were disabled for content glorifying violence. Compared to the previous quarter, this was a 34% decrease in accounts disabled for gore and a 64% increase in accounts disabled for content glorifying violence.
We also removed 1,657 servers for violent and graphic content. Of these, 291 servers were removed for gore and 1,366 servers were removed for content glorifying violence. The number of servers removed for gore decreased by 22% while the number of servers removed for glorification of violence increased by 89%.
Violent Extremism

We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
This blog post discusses our methods to address violent extremism. Through partnerships and cross-industry work with Tech Against Terrorism, the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.
In the first quarter of 2022, 12,928 accounts and 1,034 servers were removed for violent extremism. This reflects a 9% increase in the number of servers removed since the previous quarter. We dedicate significant resources to proactively detecting and removing violent extremism. During this quarter, 40.5% of servers removed for Violent Extremism were removed proactively.