The Importance of Moderating In-Game Chat

10 min read

There are nearly 2.8 billion gamers in the world, and as you can imagine, not every one of them has perfect manners.

Emily R.
Published May 18, 2022

Trolls, bad actors, and even bots can dismantle trust within video game communities by harassing other players and posting offensive or spammy content within the chat platform.

Moderating multiplayer video game chat is essential to mitigating the harmful effects of toxic content on players, communities, and the game studio itself.

What is In-Game Chat?

In-game chat is a messaging solution that allows players to communicate over text-based or voice and video messaging platforms. A stellar chat end-user experience results in less customer churn, a higher lifetime value (LTV) rate, and a lower cost-per-install (CPI) rate for game studios.

What is Chat Moderation?

Chat moderation is the process of reviewing and censoring content and media posted on a platform by users to prevent discriminatory language, offensive imagery, and other forms of harassment from disrupting the community.

The Positive Impact of Safe Gaming Communities

Video games that provide the ability to chat also function as social platforms. The interactions players have within those communities hold the power to influence their quality of life, perception of themselves, and their actions. When chat is moderated effectively, these communities allow gamer-to-gamer relationships to flourish.

Some of the positive impacts of safe gaming communities on players are:

  • Making friends
  • Helping other players
  • Discovering new interests
  • Feeling a sense of belonging
  • Learning about interesting topics
  • Learning about themselves
  • Finding a romantic partner
  • Finding a mentor

Safe gaming environments provide a playful space where gamers can be themselves and hang out with their friends, even if they’re miles apart. While talking over a friendly game is a great way to strengthen relationships, unfortunately, there will always be bad actors looking to turn the chat toxic.

For more information: Gamer Communities: The Positive Side

Warning Signs of a Toxic In-Game Chat Environment

The Anti-Defamation League (ADL) splits the top nine gaming chat offenses into two categories: harassment and severe harassment. The ADL’s most recent survey shows that five out of six adult gamers have recently experienced one or more of these forms of harassment.

These offenses are signals that your gaming chat environment requires moderation:

Harassment
  1. Trolling or griefing another player (deliberately trying to upset or provoke them)
  2. Personally embarrassing another player
  3. Calling a player offensive names or using profanity

Severe Harassment

  1. Threatening a player with physical violence
  2. Harassing a player for a sustained period
  3. Stalking a player and monitoring or gathering information with the intent to use it to threaten or harass another player
  4. Sexually harassing another player
  5. Discriminating against a player based on age, gender, ethnicity, sexual orientation, or other identity factors
  6. Doxing a player, also known as “dropping documents”: researching and broadcasting private or personally identifying information about an individual or group

The ADL reports that harassment in multiplayer games has been on the rise for the past three years. It is critical that development studios implement a form of gaming chat moderation to mitigate the negative effects of a toxic messaging environment.

For more information: Hate is No Game: Harassment & Positive Social Experiences in Online Games 2021

The Impact of Toxic Gaming Communities

The pen is mightier than the Master Sword. The negative impacts of offensive content are as powerful as they are widespread. Video game chat harassment is felt as strongly by the studio that created the game as by the message recipient.

Felt by Studios:

  • Damage to brand reputation
  • High player churn rates
  • Loss of partnerships
  • In-game ad sponsorship termination

Felt by Players:

  • Feeling less social, more isolated, & upset after gameplay
  • Experiencing suicidal thoughts & self-harm
  • Mistreating other players & abandoning personal relationships
  • Worsening academic performance

The eight adverse effects above are only the tip of the iceberg regarding the impact of toxic gaming chat environments. For example, damage to a studio’s reputation plus loss of valuable partnerships could snowball into employee layoffs. Or, if a player’s academic performance is declining, it could potentially hold them back from future opportunities.

For more information: Playing Games Online in 2021: Toxicity, Misogyny & Missing Moderation

Predictors of Toxic In-Game Chat Communities

While there are documented patterns of toxic behavior tied to certain genres of video games, there are many other factors at play when it comes to predicting the level of harassment you might find within any gaming chat community.

Genre

Toxicity doesn’t have a genre, but certain types of multiplayer games can provoke a heightened sense of competition. These genres have the potential to bring out an unfriendly, hostile side of players, leading to taunting, threats, and other forms of harassment via the gaming messenger.

Genres of games typically associated with toxic gaming chat communities:

  • eSports
  • Shooter
  • Battle Royale
  • Fighting
  • Racing

Barrier to Entry

High barrier-to-entry games dissuade toxic players from joining simply to abuse others in the chat: because users must pay to join or submit personally identifying information, like a valid email address or phone number, they lose their anonymity. Games with a low barrier to entry, by contrast, attract users who are less interested in gameplay than in harassing other players or spamming the platform.

Common characteristics of low barrier-to-entry games:

  • Free-to-play games
  • No community guidelines
  • No user verification system

Size of Following

Similar to low barrier-to-entry games, titles with a high volume of users and a large player following make it easy for bad actors to blend into the crowd.

Why the size of a game’s following matters:

  • Games with a large following provide the cover of anonymity to bad actors and make it more difficult for moderators to find them and enforce consequences
  • Games with a smaller following have tighter-knit chat communities where bad actors can easily be identified and reported

Game Model

Unlike world-building or puzzle games that are best enjoyed solo, game models in which players directly interact or collaborate with others are primed for in-game chat harassment. The game models listed below encourage players, who are typically strangers, to communicate in what can feel like a high-stakes environment, especially if a reputation or winning streak is on the line. When the conditions are right, chatting in a game can turn into hostile territory.

Game models that require players to communicate:

  • Multiplayer
  • Role-playing
  • Matchmaking


Consequences

If players do not face any repercussions for their actions in a game, they are less likely to behave appropriately. Games must have consequences outlined and in place for a multitude of infractions, including abuse of the chat, if they want to ensure their communities are safe and inclusive.

Common characteristics of games without consequences:

  • Absentee moderators
  • No community guideline agreements
  • Absence of automod
  • Banned users can make new accounts
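
One lightweight way to keep consequences "outlined and in place" is a graduated escalation ladder, where each confirmed infraction moves a player one step up a documented list of penalties. The sketch below is a hypothetical illustration; the tier names and the number of steps are assumptions, not an industry standard.

```python
from dataclasses import dataclass

# Hypothetical consequence tiers; a real game would document these
# in its community guidelines and terms of service.
LADDER = ["warning", "24h_timeout", "7d_timeout", "permanent_ban"]

@dataclass
class PlayerRecord:
    infractions: int = 0

    def record_infraction(self) -> str:
        """Log one confirmed infraction and return the consequence."""
        step = min(self.infractions, len(LADDER) - 1)
        self.infractions += 1
        return LADDER[step]

record = PlayerRecord()
print(record.record_infraction())  # first offense -> "warning"
print(record.record_infraction())  # second offense -> "24h_timeout"
```

Note that the last bullet above, banned users creating new accounts, defeats any ladder like this, which is why user verification matters alongside the consequence system itself.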

For more information: Can Game Genre Type be a Predictor of a Toxic Community?

Types of Content that Require Moderation

Whether a video game chat system exclusively supports text-based messages or allows users to share audio and visual content, moderation is still essential for maintaining a healthy communication experience.

Text Chat

Text-based chat is the perfect solution for quick conversations between international players, as it requires less bandwidth than audio or video communication.

Voice Chat

Voice chat allows players to talk in real-time as if they were on the phone or in the same room. It’s a hands-free option that lets gamers chat and play without the hassle of typing.

Secondary Chat Apps

Secondary chat apps, like Discord, are messaging platforms that provide a popular alternative to in-game chat and enable players to stay connected even if they use different consoles or play different games.


Whether players use the game’s messenger or prefer to connect over a secondary chat app, most modern messaging platforms allow users to upload image files, videos, and links. This media is typically related to gameplay, like sharing a screenshot of a high score or a video recap of an awesome play, but it can be abused like any other method of chat listed here.

For more information: In-Game Chat 101 - Text, Voice, & Video Messaging

Methods of Moderation to Combat Toxic Content

The type of content will hold the greatest influence over the methods you use to moderate it. All moderation tactics fall into two primary categories: manual and automatic, also known as “automod.” Both methods have their advantages and disadvantages, and it is important to note that automod is a supplement to a human (manual) moderator, not a replacement.

Manual Moderation

Manual moderation requires an employee or a longstanding, trusted member of the userbase to act as a moderator, regularly reviewing message content and intervening when necessary to protect the safety and integrity of your gaming community.

Advantages of manual moderation:

  • More strategic
  • More effectively resolves self-reported issues

Disadvantages of manual moderation:

  • Slower time-to-resolution
  • Personal biases of moderators
  • Easier to circumvent
  • Difficult to scale

For more information: Human Content Moderation: Online Safety & Trust

Automated Moderation (Automod)

Automod relies on sentiment analysis, stance and intent detection, Natural Language Processing (NLP), and Machine Learning (ML) techniques to understand the meaning and intent of messages to decide if they are harmful.

Advantages of automod:

  • Designed to scale alongside the user base
  • Hard to circumvent
  • Quick time to resolution

Disadvantages of automod:

  • Unable to understand contextual clues, may issue false positives
  • Will always need to be used in tandem with manual moderation methods
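
The "in tandem" requirement is often implemented as a human-in-the-loop threshold router: automod acts on its own only when it is highly confident, and defers uncertain cases to a manual moderator. Below is a minimal sketch, assuming a toxicity score in [0, 1] arrives from an upstream model; the threshold values are illustrative placeholders, not recommendations.

```python
def automod_decision(score: float,
                     block_threshold: float = 0.9,
                     review_threshold: float = 0.5) -> str:
    """Route a message based on a toxicity score in [0, 1].

    In practice the score would come from an NLP/ML pipeline;
    the thresholds here are assumptions for illustration.
    """
    if score >= block_threshold:
        return "block"         # confident enough to act automatically
    if score >= review_threshold:
        return "human_review"  # uncertain: defer to a manual moderator
    return "allow"
```

Tuning the review threshold trades moderator workload against the false-positive risk noted above: a lower threshold sends more borderline messages to humans.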

For more information: How Effective is AI When it Comes to In-Game Chat Moderation?

How Has Chat Moderation within Video Games Evolved?

Language is constantly evolving; the latest slang and new meanings for old phrases catch on fast—especially in gaming communities. Moderation technology must develop in lockstep with gaming lingo to effectively protect players from receiving messages with inappropriate content.

Common ways players circumvent gaming chat moderation:

  • Garbling words (e.g., with an asterisk, “Sh*t,” or with spaces between letters, “W T F”)
  • Using euphemisms
  • Using new slang terms
  • Writing with emojis

Traditional methods of moderation, even those that employ a degree of automod, must go one step further than simply setting blocklists of terms they deem harmful. Modern game chat moderation tools come with a sophisticated set of tactics for moderators to leverage that can help them better identify truly offensive content at scale.
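
Going one step further than a plain blocklist usually starts with evasion-tolerant matching: accounting for the masking and spacing tricks listed above before comparing against the list. The sketch below is a simplified illustration, assuming a placeholder blocklist term ("badword") rather than a maintained list, and with no ML layer behind it.

```python
import re

# Placeholder blocklist; a real system would pair a maintained list
# with the ML-based techniques described in the automod section.
BLOCKLIST = ["badword"]

# Common single-character substitutions; "*" can mask any letter.
SUBS = {"a": "a@4*", "e": "e3*", "i": "i1!*", "o": "o0*", "s": "s$5*"}

def evasion_pattern(term: str) -> re.Pattern:
    """Build a regex for `term` that tolerates masking ("b*dword")
    and spacing between letters ("b a d w o r d")."""
    classes = ["[" + re.escape(SUBS.get(ch, ch + "*")) + "]" for ch in term]
    # Allow whitespace or light punctuation between each character.
    return re.compile(r"[\s._\-]*".join(classes), re.IGNORECASE)

def matches_blocklist(message: str) -> bool:
    return any(evasion_pattern(t).search(message) for t in BLOCKLIST)

print(matches_blocklist("b a d w o r d"))  # True
print(matches_blocklist("b@dw0rd"))        # True
print(matches_blocklist("good game!"))     # False
```

Loosening patterns this way raises the false-positive risk discussed earlier, which is another reason filters like this are paired with human review rather than left to act alone.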

For more information: How Has Moderation Evolved Over the Years?

When Moderation Becomes Censorship

Rather than promoting a healthy mix of automod and self-moderation to allow players to call the shots on what they want the community to be, some multiplayer games have enforced strict zero-tolerance policies regarding certain words and topics that come with stiff consequences. The goal of these policies is to force a rapid behavioral change within what the studios believe to be toxic communities.

However, many experienced gamers believe this strategy of moderation borders on censorship. This group prefers the following moderation tactics:

  • Mute
  • Block
  • Self-report

Many of these experienced gamers recount having been hit by the intense moderation tactics below out of the blue and feel they have been censored.

  • Automod instant censoring
  • Timeouts
  • Ban

This reaffirms the idea that automod is only to be used in addition to manual and self-moderation tactics. Even though moderation tools are evolving, they are not as sophisticated as a human when it comes to understanding context—for example, being banned for typing “Nazi” while playing a WW2 game.

For more information: Where Should the Line Be Drawn for Chat Moderation in Games and their Social Spaces?

Who is Responsible for Moderating In-Game Chat Communities?

If the number of bullets in the four subsections below seems disproportionate, it is because the responsibility of maintaining safe chat communities largely falls on the game industry and the studios themselves.

However, toxic content that requires moderation, like hate speech, becomes a human rights violation if it incites discrimination, hostility, or violence towards a person or a group defined by race, religion, ethnicity, or other factors. This discriminatory speech brings federal law, parents, teachers, and civil society efforts into play.

See how these groups can bolster chat moderation efforts:

The Game Industry

AAA and indie studios alike are ultimately responsible for ensuring the chat platform associated with their titles is a safe place for all players.

How studios and the game industry can support more effective chat moderation:

  • Develop a rating system that gives meaningful information to consumers about the nature of the game’s online community
  • Create tools for content moderation for in-game voice chat
  • Strengthen policy and enforcement of terms of service
  • Improve workplace inclusivity efforts
  • Establish an industry-wide effort to address white supremacy
  • Increase transparency on harassment and hate on platforms

For more information: Racism, Misogyny, Death Threats: Why Can’t The Booming Video-Game Industry Curb Toxicity?

Civil Society

Civil society organizations have the opportunity to investigate and engage with the research, practice, and experts focused on specific communities to keep gaming chat a safe place.

How civil society organizations can support in-game moderation efforts:

  • Increase focus on how games impact vulnerable populations
  • Support game scholars and practitioners who have expertise on these issues to increase research
  • Create educational collateral and distribute it online and through social media

For more information: Bullying and Cyberbullying Prevention Strategies and Resources

Caregivers & Educators

Multiplayer video games can often provide educational and enriching opportunities for kids in an appropriate setting. However, while the content itself might be harmless, parents and teachers need to be aware of the potential dangers of video game chat to protect their children and students.

How parents, caregivers, and educators can support video game chat moderation:

  • Learn about and make deliberate, proactive choices about safety controls
  • Demand games companies improve parental safety controls and education
  • Familiarize yourself with the games young people in your life play and have meaningful conversations about their experiences in online games

For more information: Parents Level Up on Monitoring Their Kids’ Video Games

Federal Government

Consistent with the First Amendment, legislators have an opportunity to create laws that hold perpetrators of severe online hate and harassment more accountable for their offenses.

How the federal government can support in-game moderation:

  • Strengthen laws against perpetrators of online hate
  • Urge the game industry to institute robust governance in their online game products
  • Improve training for law enforcement

For more information: What the Law Can (and Can't) Do About Online Harassment

The Future of Gaming Moderation

The gaming industry is expected to be worth a whopping $340B by 2027, so any lack of chat moderation advancements will not be due to a shortage of resources.

The future of gaming moderation is sure to include improvements to the AI and gaming chat APIs that power automated moderation tools, as well as wider recognition of “in-game chat moderator” as a formal role. Major gaming engines have even started to develop SDKs to assist engineers in developing video game chat systems. These ready-made components make it easy for studios to integrate and implement advanced moderation features.

Gamers want to enjoy a safe chat community, but they also want to avoid feeling censored if they’re just having a bit of innocent fun with close friends. As long as civil society groups, caregivers, educators, and legislators are fighting to protect vulnerable players, the game industry and studios should continue to prioritize safety within their gaming chat communities.
