Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or in society. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of cross-industry programs such as the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference. We are proud to announce a new partnership with INHOPE, the global network combating online child sexual abuse material (CSAM). Discord is a proud sponsor of the National Parent Teacher Association and ConnectSafely. We partner with The Digital Wellness Lab to integrate their research on teen health and social media, and are members of the Family Online Safety Institute, contributing to and learning from its important work.
We invest heavily in advanced tooling and education so parents know how our service works and understand the controls that help create a positive and safe experience on Discord for their children. We just launched our new Family Center and Parent Hub as part of our ongoing commitment to parent engagement and education.
Users who upload child abuse material to Discord are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement. We also recently published details about our Child Safety policies, which we updated in partnership with leaders in teen and child safety to ensure our policies are crafted with the latest research, best practices, and expertise in mind. You can read the summary here, and the policies here.
In the first quarter of 2023, we reported 20,126 accounts to NCMEC. 20,001 of those reports were for media (images or videos), many of which were flagged through PhotoDNA – a tool that matches images against a shared industry hash database of known CSAM. Additionally, 125 high-harm grooming or endangerment reports were delivered to NCMEC.
Discord disabled 54,835 accounts and removed 12,670 servers for Child Safety during the first quarter of 2023. Accounts disabled increased by 48%, while servers removed decreased by 27%. We removed CSAM servers proactively 99% of the time.
The increase in accounts disabled and in our reports to NCMEC is due to improvements in Discord’s identification of bad actors engaged in this behavior. Finding these bad actors early allows us to swiftly take action. Additionally, we launched in-platform reporting, which makes it easier for users to report content and more effective for our team to investigate. As has always been our policy, we review reports of inappropriate content involving a minor on an escalated basis and report directly to NCMEC when appropriate.
One tool that we leverage is our visual safety platform, a service that computes hashes of images uploaded to Discord and checks them against databases of known objectionable images, such as CSAM. Additionally, we have invested more resources in combating CSAM on our platform, including a team that focuses solely on child safety as well as a dedicated engineering team.
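To illustrate the general idea of hash matching, here is a minimal sketch in Python. It is not Discord’s implementation: the shared industry databases rely on proprietary perceptual hashing technology such as PhotoDNA, while this sketch uses a plain cryptographic hash and a hypothetical in-memory hash set purely for illustration.

```python
import hashlib

# Hypothetical database of hashes of known prohibited images.
# In practice this is a shared industry database of perceptual
# hashes (e.g., PhotoDNA), not plain SHA-256 digests.
KNOWN_PROHIBITED_HASHES: set[str] = set()


def is_known_prohibited(image_bytes: bytes) -> bool:
    """Check an uploaded image against the hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_PROHIBITED_HASHES


def handle_upload(image_bytes: bytes) -> str:
    """Block uploads that match a known hash; accept the rest."""
    if is_known_prohibited(image_bytes):
        # A real pipeline would also quarantine the content,
        # action the account, and escalate a report to NCMEC.
        return "blocked_and_escalated"
    return "accepted"
```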
Discord is committed to continually exploring new and improved safeguards that help keep younger users safe on our platform and online.
Deceptive Practices
Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, stealing authentication tokens, or participating in identity, investment, or financial scams is a violation of our Community Guidelines.
We disabled 6,635 accounts and removed 3,452 servers for Deceptive Practices during the first quarter of 2023. This was an increase of 29% in accounts disabled, and an increase of 43% in servers removed.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 22,123 accounts and removed 1,730 servers for exploitative and unsolicited content. This was an increase of 39% and 40% respectively, largely driven by advancements in proactive models that allow for faster detection and removal of specific networks of bad actors.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
In the first quarter of 2023, 12,489 accounts and 1,298 servers were removed for harassment and bullying.
Hateful Conduct
Hate or harm targeted at individuals or communities is not tolerated on Discord in any way. Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics.
In the first quarter of 2023, 4,352 accounts and 750 servers were removed for hateful conduct.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 356 accounts and removed 23 servers for identity and authenticity concerns.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information that may result in damage to physical infrastructure, injury to others, obstruction of participation in civic processes, or the endangerment of public health.
We disabled 124 accounts and removed 50 servers for misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
We're focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures.
In the first quarter of 2023, 3,122 accounts and 1,277 servers were removed for platform manipulation issues unrelated to spam. An additional 10,693,885 accounts were disabled for spam or spam-related offenses. This represents a decrease of 71% in the number of accounts disabled when compared to the previous quarter. 99% of accounts disabled for spam were disabled proactively, before we received any user report.
As with the decrease in accounts disabled for non-spam policy violations over the past few quarters, this decline reflects a number of positive trends and actions. First, it reflects less spam on Discord. Second, it reflects improvements in our systems for detecting spam accounts at registration, as well as our practice of quarantining suspected spam accounts rather than fully disabling them, which allows users to regain access to compromised accounts.
Learn more about how Discord combats spam here, and about AutoMod, a safety feature that enables server owners to automatically moderate certain abuse, including spam, here.
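As a rough illustration of the kind of rule-based filtering that automated moderation enables, here is a minimal sketch. The rule names and patterns below are hypothetical examples, not AutoMod’s actual configuration, which server owners set up within Discord itself.

```python
import re

# Hypothetical AutoMod-style rules: each maps a rule name to a
# regular expression that flags matching message content.
RULES = {
    "invite_spam": re.compile(r"discord\.gg/\w+", re.IGNORECASE),
    "repeated_chars": re.compile(r"(.)\1{9,}"),  # e.g. "!!!!!!!!!!"
}


def moderate(message: str) -> list[str]:
    """Return the names of any rules the message violates."""
    return [name for name, pattern in RULES.items() if pattern.search(message)]


# A server owner could block or quarantine messages that trigger any rule.
print(moderate("join now discord.gg/abc123 !!!!!!!!!!!!"))
# -> ['invite_spam', 'repeated_chars']
```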
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines.
In the first quarter of 2023, 41,441 accounts and 9,558 servers were removed for regulated or illegal activities. This was a decrease of 28% and an increase of 26% respectively. Our rate of proactively removing servers for regulated or illegal activities increased from 60% to 75%. Both the decrease in accounts disabled and the increase in our proactive removal rate were driven by improvements in proactive tooling, which allowed us to remove servers before they grew in size, resulting in fewer accounts disabled.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for support. That said, platforms have a critical role to play in ensuring that these spaces do not normalize, promote, or encourage others to engage in acts of self-harm.
We may take action on content that seeks to normalize self-harming behaviors, as well as content that discourages individuals from seeking help for self-harm behaviors. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.
We’re proud to partner with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. If a user reports a message for self-harm on Discord, they will be presented with information on how to connect with a volunteer Crisis Counselor. You can learn more here.
Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here.
In the first quarter of 2023, 1,294 accounts and 540 servers were removed for self-harm concerns.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord.
In the first quarter of 2023, 18,666 accounts and 2,348 servers were removed for violent and graphic content. This was an increase of 74% and 83% respectively. This increase was driven largely by advances in proactive tooling, which allowed for an increased focus on specific networks that glorified and promoted violence.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
By partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure that violent extremism does not have a home on Discord.
In the first quarter of 2023, 8,308 accounts and 934 servers were removed for violent extremism. This was an increase of 15% and 13% respectively. We removed servers for violent extremism proactively 60% of the time.