Child Safety

Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or the internet at large. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of cross-industry programs such as the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference.
We invest heavily in advanced tooling and education so that parents know how our service works and understand the controls available to help create a positive and safe experience on Discord for their children. As part of our ongoing commitment to parent engagement, Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. We continue to be a member of the Family Online Safety Institute, contributing to and learning from its important work.
Users who upload child abuse material to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and its efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement.
In the third quarter of 2022, we reported 14,366 accounts to NCMEC. 14,303 of those reports were media (images or videos), many of which were flagged through PhotoDNA, a tool that matches uploads against a shared industry hash database of known CSAM. 63 high-harm grooming or endangerment reports were also delivered to NCMEC.
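To illustrate the general shape of hash-database matching, here is a minimal Python sketch. PhotoDNA itself is a proprietary perceptual hash designed to survive resizing and re-encoding; this stand-in uses an exact SHA-256 lookup against a hypothetical hash set, so the database contents and function names below are illustrative assumptions, not Discord's or Microsoft's actual implementation.

```python
# Illustrative sketch only: PhotoDNA is a proprietary perceptual hash, so this
# stand-in uses an exact SHA-256 lookup against a hypothetical hash set to show
# the general shape of matching uploads against a shared hash database.
import hashlib

# Hypothetical set of hex digests standing in for a shared industry database.
KNOWN_HASH_DATABASE = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_media(data: bytes) -> str:
    """Return a hex digest for an uploaded media file."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """True if the media's hash appears in the shared database."""
    return hash_media(data) in KNOWN_HASH_DATABASE

if __name__ == "__main__":
    sample = b"test"  # placeholder bytes whose digest is seeded above
    if is_known_match(sample):
        print("Match found: escalate for review and reporting.")
    else:
        print("No match.")
```

A real deployment differs in one key way: a perceptual hash tolerates small transformations of the image, whereas the exact lookup above only catches byte-identical files.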
Discord disabled 42,458 accounts and removed 14,451 servers for Child Safety during the third quarter of 2022. This was a 92% decrease in the number of accounts disabled when compared to the previous quarter.
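As a back-of-the-envelope check on what that 92% decrease implies about the prior quarter, the stated figures can be inverted directly; since 92% is a rounded percentage, the implied count is approximate.

```python
# Back-of-the-envelope check: what prior-quarter count does a 92% decrease
# to 42,458 disabled accounts imply? (92% is rounded, so this is approximate.)
current = 42_458
decrease = 0.92
implied_prior = current / (1 - decrease)
print(f"Implied prior-quarter accounts disabled: ~{implied_prior:,.0f}")
# -> roughly 531,000 accounts disabled in the previous quarter
```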
Our investment in and prioritization of Child Safety have never been more robust. These decreases are the result of improvements in our efforts to detect and proactively remove child-exploitative content from Discord. Specifically, better operating procedures and enhanced detection capabilities have enabled our team to identify these servers faster.
Because we targeted and proactively removed networks of bad actors from Discord before they grew in size, fewer accounts participated in these spaces and, as a result, fewer accounts were disabled.
We removed servers hosting CSAM proactively 99% of the time. Removing CSAM is one of our top priorities, and we are proud of the cross-functional efforts that have enabled us to achieve this proactive takedown rate.
Deceptive Practices

Using Discord to distribute malware, share or sell game hacks or cheats, steal authentication tokens, or participate in identity, investment, or financial scams is a violation of our Community Guidelines.
We disabled 8,800 accounts and removed 2,137 servers for Deceptive Practices during the third quarter of 2022, increases of 43% and 27%, respectively, compared to the previous quarter.
Exploitative and Unsolicited Content

It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 47,570 accounts and removed 1,837 servers for Exploitative and Unsolicited Content. This represents a 55% decrease in accounts disabled and a 21% decrease in servers removed when compared to the previous quarter.
This decrease was driven by our ability to identify and remove a specific abuse pattern, which reduced the number of servers created to host that content and, consequently, the number of accounts disabled for engaging with it.
Harassment and Bullying

Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
During the third quarter of 2022, 11,347 accounts were disabled for harassment-related behavior, and 555 servers were removed for this issue.
Hateful Conduct

Discord does not allow the organization or promotion of, or participation in, hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics. You can read our latest policy blog post to learn more.
During the third quarter of 2022, 7,104 accounts and 829 servers were removed for hateful conduct. Compared to the previous quarter, this was an increase of 24% and 16%, respectively.
Identity and Authenticity

Using Discord to coordinate or participate in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 3,561 accounts and removed 13 servers for this issue. This was a significant increase as we continue to improve our methods for detecting and removing malicious bots.
Misinformation

In February 2022, we published a blog post introducing our policy prohibiting the sharing of false or misleading information on Discord that is likely to cause physical or societal harm; the post also explains our enforcement criteria in more detail.
We disabled 385 accounts and removed 52 servers for misinformation.
Platform Manipulation

Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
During the third quarter of 2022, 1,197 accounts and 656 servers were removed for platform manipulation issues unrelated to spam. An additional 50,510,769 accounts were disabled for spam or spam-related offenses.
We're focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures, and as a result of this work, 90% of accounts disabled for spam were disabled proactively, before we received any user report.
You can read more about how Discord fights spam here. You can also read this blog post, published in September, about AutoMod, a new safety feature that enables server owners to automatically moderate certain abuse, including spam.
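For server owners curious about what an AutoMod configuration looks like under the hood, below is a minimal Python sketch that creates a keyword-blocking rule through Discord's documented Auto Moderation REST endpoint. The bot token, guild ID, and keyword list are placeholders, and in practice most owners will configure AutoMod through the in-app Server Settings rather than the API.

```python
# Sketch: creating a keyword AutoMod rule via Discord's REST API
# (POST /guilds/{guild_id}/auto-moderation/rules). Endpoint and field
# names follow Discord's public Auto Moderation documentation; the
# token, guild ID, and keywords below are placeholders.
import requests

API_BASE = "https://discord.com/api/v10"
BOT_TOKEN = "YOUR_BOT_TOKEN"      # placeholder: bot needs MANAGE_GUILD
GUILD_ID = "123456789012345678"   # placeholder guild (server) ID

rule = {
    "name": "Block common spam phrases",
    "event_type": 1,       # MESSAGE_SEND
    "trigger_type": 1,     # KEYWORD
    "trigger_metadata": {"keyword_filter": ["free nitro", "steam gift"]},
    "actions": [{"type": 1}],  # BLOCK_MESSAGE
    "enabled": True,
}

resp = requests.post(
    f"{API_BASE}/guilds/{GUILD_ID}/auto-moderation/rules",
    headers={"Authorization": f"Bot {BOT_TOKEN}"},
    json=rule,
    timeout=10,
)
resp.raise_for_status()
print("Created AutoMod rule:", resp.json()["id"])
```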
Regulated or Illegal Activities

Using Discord for the purpose of engaging in regulated, illegal, or dangerous activities is strictly prohibited, including selling or facilitating the sale of prohibited or potentially dangerous goods or services.
We disabled 37,284 accounts for engaging in this behavior, an increase of 35.5% from the prior quarter. A total of 6,950 servers were removed for this category with a proactive removal rate of 63%.
Self-Harm Concerns

Using Discord to glorify or promote suicide or self-harm is not allowed under any circumstance. We recently expanded our Self Harm Encouragement and Promotion Policy, which you can read more about here.
We may take action on accounts or servers that encourage people to cut themselves or to embrace eating disorders, or that otherwise manipulate and coerce other users into acts of self-harm. These actions are taken only against accounts glorifying or promoting acts of self-harm, not against users seeking help or in need of medical attention.
In September we announced a new partnership with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here.
We disabled 1,297 accounts and removed 610 servers for Self-Harm Concerns.
Violent and Graphic Content

Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty with the intent to harass or shock others is not allowed on Discord.
In the third quarter of 2022, 12,300 accounts were disabled and 1,313 servers were removed for violent and graphic content.
Violent Extremism

We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
This blog post discusses our methods to address violent extremism. Through partnerships and cross-industry work with Tech Against Terrorism, the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve improved our tooling, policy, and subject-matter expertise to help ensure that violent extremism does not have a home on Discord.
In the third quarter of 2022, 12,489 accounts and 1,053 servers were removed for violent extremism.