Child Safety
Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or in society. We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. Discord is an active supporter of cross-industry programs such as the National Center for Missing and Exploited Children (NCMEC) and is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We’re also a frequent sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference. We are a proud sponsor of the National Parent Teacher Association and ConnectSafely. We partner with The Digital Wellness Lab to integrate their research on teen health and social media, and with INHOPE, the global network combating online CSAM, and we are members of the Family Online Safety Institute, contributing to and learning from its important work.
We invest heavily in resources and education so parents know how our platform works and understand the controls that can help create a positive and safe experience for their children on Discord. In July we launched our Family Center, which makes it easy for teens to keep their parents and guardians informed about their Discord activity; alongside our Parent Hub, it is part of our ongoing commitment to parent engagement and education.
Users who upload abuse material of children to Discord, or who engage in high-harm activity toward children, are reported to NCMEC and removed from the service. We deeply value our partnership with NCMEC and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement. We also recently published details about our Child Safety policies, which we updated in partnership with leaders in teen and child safety to ensure our approach reflects the latest research, best practices, and expertise. You can read a summary in this article here, and the policies themselves here.
In the second quarter of 2023, we reported 36,481 accounts to NCMEC. 36,323 of those reports were media (images or videos), many of which were flagged through PhotoDNA, a tool that matches uploaded images against a shared industry hash database of known CSAM. Additionally, 158 high-harm grooming or endangerment reports were delivered to NCMEC.
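For general context on how hash-database flagging works: PhotoDNA itself is a proprietary Microsoft technology and Discord has not published its pipeline, but the basic idea of checking uploaded media against a shared set of hashes of known abuse material can be sketched roughly as below. A plain SHA-256 stands in for the perceptual hash (a real perceptual hash also catches resized or lightly altered copies), and all names and sample data are purely illustrative.

```python
"""Illustrative sketch of hash-database matching; not Discord's actual pipeline.

PhotoDNA is a proprietary perceptual hash. A plain SHA-256 stands in for it
here, which only catches exact byte-for-byte duplicates, whereas a perceptual
hash also matches near-duplicate images.
"""
import hashlib


def media_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash such as PhotoDNA."""
    return hashlib.sha256(data).hexdigest()


def matches_known_database(upload: bytes, known_hashes: set[str]) -> bool:
    """Flag an upload whose hash appears in the shared industry database."""
    return media_hash(upload) in known_hashes


if __name__ == "__main__":
    # Hypothetical database of hashes of known abuse material.
    known_hashes = {media_hash(b"example-known-bad-media")}

    print(matches_known_database(b"example-known-bad-media", known_hashes))  # True -> escalate and report
    print(matches_known_database(b"unrelated upload", known_hashes))         # False
```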
Discord disabled 155,873 accounts and removed 22,245 servers for Child Safety during the second quarter of 2023. This was an increase of 184% and 76% respectively when compared to the previous quarter.
We removed servers for Child Safety concerns proactively 95% of the time, and CSAM servers 99% of the time.
The increase in accounts disabled and in reports to NCMEC is due to improvements in Discord’s proactive tooling and in our identification of bad actors engaged in this behavior.
One innovative tool we have developed is our visual safety platform, which has expanded our capacity to proactively identify CSAM content. We have also continued to invest more resources in combating CSAM on our platform, including a team focused solely on child safety and a dedicated engineering team.
Discord is committed to continually exploring new and improved safeguards that help keep younger users safe on our platform and online.
Deceptive Practices
Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, stealing authentication tokens, or participating in identity, investment, or financial scams is a violation of our Community Guidelines.
We disabled 8,076 accounts and removed 3,296 servers for Deceptive Practices. This was an increase of 22% in accounts disabled, and a decrease of 5% in servers removed.
Exploitative and Unsolicited Content
It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent.
We disabled 19,404 accounts and removed 2,625 servers for exploitative and unsolicited content. This was a decrease of 12% in accounts disabled and an increase of 52% in servers removed. The increase in servers removed and the corresponding decrease in accounts disabled were the result of our improved ability to detect these servers while refining our process to remove only the accounts responsible for the content.
Warnings issued to server members for this category increased by 216%, accounting for the majority of the overall 90% increase in warnings issued to server members across all categories. This was driven by warnings generated by machine learning detection models for non-consensual pornography, which rose by 1,910%, from 65,237 to 1,311,319 warnings.
Harassment and Bullying
Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines.
In the second quarter of 2023, 10,671 accounts and 1,132 servers were removed for harassment and bullying.
Hateful Conduct
Hate or harm targeted at individuals or communities is not tolerated on Discord in any way. Discord doesn’t allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics.
In the second quarter of 2023, 5,874 accounts and 862 servers were removed for hateful conduct.
Identity and Authenticity
Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines.
We disabled 420 accounts and removed 119 servers for identity and authenticity concerns.
Misinformation
It is a violation of our Community Guidelines to share false or misleading information that may result in damage to physical infrastructure, injury to others, obstruction of participation in civic processes, or the endangerment of public health.
We disabled 64 accounts and removed 27 servers for misinformation.
Platform Manipulation
Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines.
We’re focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional team building sophisticated anti-spam measures.
In the second quarter of 2023, 3,839 accounts and 1,476 servers were removed for non-spam-related platform manipulation issues. An additional 7,971,335 accounts were disabled for spam or spam-related offenses. This represents a decrease of 25% in the number of accounts disabled when compared to the previous quarter. 99% of accounts disabled for spam were disabled proactively, before we received any user report.
This decrease reflects several positive trends and actions. First, it represents less spam on Discord. Second, it demonstrates the effectiveness of improvements we have made to detect spam accounts at registration and to quarantine suspected spam accounts without fully disabling them, which allows users to regain access to compromised accounts.
Learn more here about how Discord combats spam, and here about AutoMod, a safety feature that enables server owners to automatically moderate certain abuse, including spam.
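For reference, AutoMod rules can also be configured programmatically through Discord’s public developer API. The sketch below is a minimal example based on the documented Auto Moderation endpoint, assuming a bot with the Manage Server permission in the target server; BOT_TOKEN and GUILD_ID are placeholders, and the numeric field values should be verified against the current API documentation.

```python
"""Minimal sketch: creating an AutoMod rule via Discord's REST API (v10).

Assumes a bot token with the Manage Server permission in the target guild.
Field values follow Discord's published Auto Moderation documentation at the
time of writing; verify against the current docs before use.
"""
import requests

BOT_TOKEN = "YOUR_BOT_TOKEN"        # placeholder
GUILD_ID = "123456789012345678"     # placeholder server (guild) ID

rule = {
    "name": "Block spam content",
    "event_type": 1,     # MESSAGE_SEND
    "trigger_type": 3,   # SPAM: generic spam-content trigger
    "actions": [
        {"type": 1},     # BLOCK_MESSAGE
    ],
    "enabled": True,
}

response = requests.post(
    f"https://discord.com/api/v10/guilds/{GUILD_ID}/auto-moderation/rules",
    headers={"Authorization": f"Bot {BOT_TOKEN}"},
    json=rule,
)
response.raise_for_status()
print(response.json())  # the newly created rule object
```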
Regulated or Illegal Activities
Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines.
During the second quarter of 2023, 57,218 accounts and 17,809 servers were removed for regulated or illegal activities. This was an increase of 38% and 86% respectively. Our rate of proactively removing servers for regulated or illegal activities increased from 75% to 84%. These increases were driven largely by improved malware abuse detection capabilities.
Self-Harm Concerns
For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for support. That said, platforms have a critical role to play in ensuring that these spaces do not normalize, promote, or encourage others to engage in acts of self-harm.
We may take action on content that normalizes self-harming behaviors, encourages acts of self-harm, or discourages individuals from seeking help. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention.
We’re proud to partner with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. If a user reports a message for self-harm on Discord, they will be presented with information on how to connect with a volunteer Crisis Counselor. You can learn more here.
Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here. Since the launch of our partnership, there have been over 2,000 conversations started using Discord’s keyword.
During the second quarter of 2023, 1,175 accounts and 640 servers were removed for self-harm concerns.
Violent and Graphic Content
Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord.
During the second quarter of 2023, 21,256 accounts and 2,833 servers were removed for violent and graphic content. This was an increase of 14% and 21% respectively.
Violent Extremism
We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them.
By partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure violent extremism does not have a home on Discord.
During the second quarter of 2023, 10,385 accounts and 1,298 servers were removed for violent extremism. This was an increase of 25% and 39% respectively. We removed servers for violent extremism proactively 77% of the time, up from 60% in the last quarter. These increases can be attributed to investments in improving our proactive detection models for violent extremist content, as well as to our continued cross-industry work to bolster our tooling, policy, and awareness of emerging trends.