

This article is part of GamesBeat’s special issue, Gaming communities: Making connections and fighting toxicity.

We’re probably never going to stamp out toxicity in online games. But it’s always interesting to see how we can fight the good fight.

As an example, game developers play a constant cat-and-mouse game against cheaters in titles such as Valorant and Call of Duty. Toxicity encompasses far more than cheating, but cheating offers a window into that contest, and cheating itself sparked plenty of fights and fueled further toxicity.

So the developers would scoop up the known cheaters and put them in a battle space nicknamed “cheater island.” Rather than cheating against fair-minded players, the cheaters fought each other, often not realizing their cheating was kind of pointless.


Yet the cheaters escalated their technology as well, selling cheats that were hard to detect. They also simply created a lot of accounts. Once the “banhammer” dropped on one account, the cheaters would shift to the next one. So the developers at Riot Games and Activision went a step further, developing an anti-cheat system that could access the player’s operating system and identify if the machine was used for cheating. If so, they would stop it from being used to create a new account. This was all part of the arms race between game developers and cheaters.

It seemed that such a system could work really well if the tech were shared across games and across companies. But there are privacy laws that could stop that from happening, introducing another complexity in the war against toxic behavior.

“This needs to be cross industry, this needs to be collaborative,” said Kimberly Voll, cofounder of the Fair Play Alliance, in an interview with GamesBeat. “These are problems that we’re seeing at a scale that exceeds any one gaming company. And so, it’s unrealistic to think that any one company is going to tackle this.”

At the same time, it’s hard to study the data in some of the reported cases without invading someone’s privacy. Welcome to the battle over fighting toxicity in games.

What’s at stake

Activision Is Sniping At Cheaters In Call Of Duty.

Toxic behavior in games can be a serious problem that ruins the experience for many players. All it takes is one cheater taking the crown in a Call of Duty: Warzone match to piss off the other 149 players in it. That happened a lot in Warzone, and it caused some top players to churn out in favor of other games.

After Activision introduced its Ricochet anti-cheat technology, many of the complaints about cheating subsided. Ricochet used a kernel-level driver that made it easier to detect cheating and to stop cheaters from creating lots of new accounts.

“We think about toxicity in games as a component of the player/user experience rather than strictly a trust and safety issue,” said Dave Matli, chief marketing officer at Spectrum Labs, in an email to GamesBeat. “We often engage with product and sometimes brand teams in addition to the trust and safety folks at a game studio because toxic interactions in a game can be directly attributed to player churn.”

We had a panel at our GamesBeat Summit 2023 event about how the game industry needs to get trust and safety right — before it’s forced to do so by regulators and others.

“The game industry has come of age,” said Hank Howie, game industry evangelist at Modulate, during the panel. “We are no longer this ancillary form of entertainment — we are the 800-pound gorilla of entertainment. It’s time to fully take on the mantle of leadership in the arena of trust and safety, at the CEO level of every company. To do anything less risks putting your company in financial peril, in addition to being in a morally bankrupt position.”

He was joined by Eve Crevoshay of Take This, a mental health advocacy nonprofit; a speaker from Windwalk, which focuses on building online communities; and David Hoppe of Gamma Law, to discuss the state of trust and safety as well as potential regulation.

“The days of anything goes and turning a blind eye, that is not going to fly even in the United States anymore, and certainly not in Europe,” Hoppe said. “First take a territorial approach and evaluate, based on the budget that you’re able to allocate at this stage, where those funds should be spent. The California law actually lays out very precisely what steps you are to take in terms of evaluating the current situation and determining the points that need to be focused on.”

Looming regulation

While things are better than they were a decade ago in terms of healthy play, there is some risk of regulation coming to the game industry over toxicity, Voll said.

“We still have a lot to do to get our act together. I think there’s a lot of work ahead of us,” Voll said. “There are a lot of accusations towards gaming companies that are not doing enough, and basically no recognition for how much they are doing. We could move backwards if we’re not careful.”

Earlier this year, the European Union’s Digital Services Act came into effect. California’s Age-Appropriate Design Code Act takes effect in July. Other states will not be far behind.

“I think the regulatory landscape not just in California, but at the federal level in the U.S. is heating up substantially,” Crevoshay said. “We’ve been speaking with the Senate Judiciary Committee, with Representative Trahan from Massachusetts. They’re all barking up this tree around not just child protection, but around the larger issue of extremist behavior in online spaces.”

The new laws introduce new privacy restrictions and rules around information gathering, targeted advertising and dark patterns. You can’t, for instance, trick a child into buying something based on the placement of a button in a game. Epic Games ran into this problem and was fined $520 million.

Nanea Reeves, CEO of Tripp, a maker of a mental health awareness app, believes soft regulation could serve a purpose as well.

Reeves admires how Eve Online has created a strong community with governing principles. CCP could share that knowledge with others who can follow in its footsteps.

“Maybe the regulatory approach is more about knowledge sharing,” Reeves said.

Humans are the best and worst

Fortnite Luna

Of course, it takes humans to figure out if other humans are being toxic. Some companies have hundreds of community managers whose job is to moderate communication for toxicity and take action in the worst cases.

Reeves, the Tripp CEO, said she understands that some people use the remoteness and distance of online games as an outlet for their toxicity. Games can be a safe space for that kind of expression, a place to adopt a different persona. But that doesn’t have to be the end destination for gamers, who can also develop spatial reasoning skills and find educational support through gaming.

Reeves acknowledges that games have a downside as well: children can be exposed to predatory behavior. She believes the law enforcement issues that arise from that can be handled better, and that game companies can do better at moderating communities and keeping spaces safe. We need guardrails, she said.

“I do think that the game industry, and this also extends beyond to any of these metaverse or open world communities, will need to ensure that we have the right kind of infrastructure for moderation,” Reeves said. “And community tools as well, so people can protect themselves, parents can ensure that their kids are engaging in safe behavior. And in that, the company has some responsibility here too. We’ve seen lots of bad behavior, especially with children and predatory adults. How do we protect people in these environments? And it’s not just games. It’s any online social community.”

Roblox has thousands of paid contractors who monitor its community for violations. But Roblox has 66 million daily active users. The community managers can’t keep up with the volumes of conversations that have to be monitored. So the company hasn’t allowed voice communication inside its platform for minors, and it has required age verification for adults who want to use voice chat.

AI technology can tackle a large percentage of the complaints about toxicity. But a system that is mostly effective could still leave millions of complaints for the thousands of human moderators to deal with. And in mature-rated games like Call of Duty, you have the added nuance of allowing most players to use swear words in the game. And then you have the problem of false reporting.

Fighting back

GGWP Is Calculating Player Reputation Scores.

Companies like Modulate, Spectrum Labs, GGWP and others have systems that address toxic incidents across the entire range of behaviors (vs. just chat). GGWP provides reports on the automated actions that it takes in games, said Dennis Fong, CEO of GGWP, in a message to GamesBeat.

“Anyone who has played games knows that those types of in-game actions are often the most disruptive and frustrating, because you can’t simply mute the player that’s being toxic to you,” Fong said. “It’s empowering to have AI be able to detect these types of behaviors without putting the onus on the user to file a report.”

That said, it’s often an eye-opening experience for customers. For every reported incident, another five to 10 go unreported, and customers are always surprised at just how often toxicity occurs in their games and flies under the radar.

To use a recent example: a GGWP customer recently launched their game and was instantly overwhelmed, having to review more than 6,000 reports a day.

“Once we went live with them, we applied our automated moderation to 5,800 of those (which included 1,500 sanctions and the remainder with reputation score adjustments or dismissals),” Fong said. “We then prioritized the remaining ~200 incidents daily with proposed sanctions for 175 on average. This allowed their moderators to be far more efficient with their time and also resulted in the community being happy that their reports were being actioned upon.”

The most popular games in the world receive billions of user-submitted reports a year related to text chat, and they are able to respond to less than 0.1% of them. GGWP uses AI for report handling, automatically validating, triaging and prioritizing reports and improving the response rate to 98%.
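As a rough illustration of how that kind of validate-triage-prioritize flow can be structured, here is a minimal Python sketch. The thresholds, field names and scoring are hypothetical placeholders for whatever model a vendor actually runs, not GGWP’s implementation.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Report:
    report_id: str
    reported_player: str
    text: str
    toxicity_score: float  # assumed to be filled in upstream by an AI model

def triage(reports: List[Report],
           dismiss_below: float = 0.2,
           sanction_above: float = 0.9) -> Dict[str, List[Report]]:
    """Split incoming reports into automated outcomes plus a small,
    severity-sorted queue for human moderators. Thresholds are illustrative."""
    dismissed, sanctioned, review_queue = [], [], []
    for report in reports:
        if report.toxicity_score < dismiss_below:
            dismissed.append(report)        # likely false or trivial reports
        elif report.toxicity_score >= sanction_above:
            sanctioned.append(report)       # clear-cut violations, auto-actioned
        else:
            review_queue.append(report)     # ambiguous cases need a human
    review_queue.sort(key=lambda r: r.toxicity_score, reverse=True)
    return {"dismissed": dismissed, "sanctioned": sanctioned, "review": review_queue}
```

Under a split like the one Fong describes, the vast majority of reports never reach a person; only the ambiguous middle band lands in the moderators’ queue, already ranked by severity.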

“GGWP is the first comprehensive anti-toxicity platform built for games. Our ambition is to transform game moderation by thoughtfully leveraging AI to identify and respond to toxic incidents and enabling games to improve their response rates to incidents from an average of less than 0.1% to 98%,” said Fong.

Sadly, there are a lot of things to watch out for, including bad actors who use their own technology to defeat the AI that detects problems.

Putting tech to work

Don’t Rage Quit.

Fortunately, technology can be used to help curb toxicity in games in several ways, Matli at Spectrum Labs said.

One way to curb toxicity is to filter out offensive words and phrases from chat messages. This can be done using machine learning algorithms that can identify and flag inappropriate content. Some games already use chat filtering systems that allow players to report offensive messages, which can then be reviewed by moderators.

Another way to curb toxicity is to use automated moderation tools that can identify and remove toxic behavior in real-time. For example, some games use natural language processing algorithms to analyze player chat and voice communications and flag toxic behavior.
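Here is a minimal sketch of those two layers combined: a blocklist as a cheap first pass, with a machine learning classifier on anything the list misses. The blocked words, model name and threshold are placeholders for illustration, not anything a particular studio uses.

```python
import re
from transformers import pipeline  # Hugging Face Transformers

# A tiny placeholder blocklist; real systems maintain large, per-language
# lists and update them constantly.
BLOCKED_WORDS = {"examplebadword", "exampleslur"}

# "unitary/toxic-bert" is one publicly available toxicity model; any
# comparable text-classification model could be swapped in.
toxicity_model = pipeline("text-classification", model="unitary/toxic-bert")

def flag_message(message: str, threshold: float = 0.8) -> bool:
    """Return True if the message should be surfaced to moderators."""
    words = set(re.findall(r"[a-z0-9]+", message.lower()))
    if words & BLOCKED_WORDS:
        return True                      # exact keyword hit
    result = toxicity_model(message)[0]  # e.g. {"label": "toxic", "score": 0.97};
                                         # label names vary by model
    return "toxic" in result["label"].lower() and result["score"] >= threshold
```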

“Voice chat is the frontier that is finally being conquered. It’s been almost there for years now. And in the last year or two, we finally see those walls break down,” Voll said. “We’re tracing this now, and we’ll see a big upswing in voice chat moderation.”

In order to combat toxicity, studios need to first have a clear picture of the bad behavior occurring on their platform. Modulate’s ToxMod tries to give them this insight and equips them to take action by using AI to identify toxic behaviors across the entire voice ecosystem and surfacing the highest-priority situations for moderators to review.

Modulate’s ToxMod uses custom machine learning and AI models to analyze what’s being said in games, taking into account not just the context but the emotion and nuance of the speaker and how others in the session are responding. This allows the company to understand the full picture of each scenario, identify when harm is happening (compared to innocent behaviors like friendly trash talk) and share with the studio a live-updating, prioritized queue of the worst harms happening in its games.

GGWP’s Fong said usernames are more than just identifiers; they are the first interaction point between players in a game’s community. Profanity filters don’t catch l33tspeak and the other workarounds gamers use to create inappropriate usernames, so GGWP uses AI to stay on top of inappropriate username trends.
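For illustration, here is a small sketch of the kind of normalization step a username filter might add before its blocklist check. The substitution map and banned terms are placeholders, and contextual systems like GGWP’s go well beyond this.

```python
# Map common character substitutions back to letters so names like
# "b4d_w0rd" don't slip past a naive blocklist check.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

BANNED_TERMS = {"badword", "slur"}  # placeholder terms for illustration

def username_is_allowed(name: str) -> bool:
    normalized = name.lower().translate(LEET_MAP)
    # Drop separators used to break words apart ("b.a.d_w.o.r.d").
    normalized = "".join(ch for ch in normalized if ch.isalnum())
    return not any(term in normalized for term in BANNED_TERMS)
```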

GGWP goes beyond simple keyword-based filters by using contextual AI to identify and respond to difficult-to-detect behaviors like sarcasm, gameplay criticism, bullying, hate speech, child grooming and much more in text chat.

Awareness of the problem

The Fair Play Alliance’s Report On Harms In Online Gaming.

Voll, who is also head of Brace Yourself Games as well as cofounder of the Fair Play Alliance, believes that awareness of the problem around toxicity has gotten better. But even in the age of AI, the problem has not gone away.

The Fair Play Alliance offers the Disruption and Harms in Online Gaming Framework, a report on problematic in-game conduct, with the goal to empower the game industry with the knowledge and tools to support player well-being and foster healthier, more welcoming gaming spaces around the world.

Social pressure can also be an effective way to curb toxicity. Some games use public shaming or peer pressure to discourage toxic behavior. For example, some games display the usernames of players who have been reported for toxic behavior, which can discourage them from continuing to act inappropriately, Matli said.

“It is definitely a multifaceted problem, which makes it challenging. I think that oftentimes there’s a tendency with these sorts of problems to think about them in terms of one thing as offering the solution,” Voll said. “But I think there are many contributing factors to why we are here. And therefore, our approach needs to be multifaceted in that sense.”

When Voll worked at another game company, the prevailing attitude was that there would always be jerks on the internet, and that once you got rid of them, everything would be great. That’s an oversimplification, she said.

“Human relationships are fraught to begin with. And when you take them into digital spaces, we are absent a lot of the things that facilitate those interactions to begin with,” Voll said. “They’re often strangers. These are dehumanizing spaces, and they are often crossing cultural boundaries. People don’t necessarily have the means to communicate effectively. They could be high-conflict spaces, too, with high stakes. All of these things just make this a big, complicated mess.”

What’s working and what’s not?

ToxMod’s Dashboard

Modulate, creator of purpose-built AI voice technology that improves the health and safety of online gaming communities, said its ToxMod is providing real value to game studios and players. “We have seen measurable upticks in key metrics like new and returning player retention, number of bad actors caught, moderator efficiency across top games in several genres.”

“Right now, we’re prioritizing the highest precision to avoid the risk of incorrectly flagging someone for friendly trash talk. This balance between coverage and accuracy is tricky to maintain, though, especially since some of the worst perpetrators, like violent extremists, use coded or cryptic language to disguise what they are doing. We’ve recently launched updates to our models to grow even more accurate on detecting violent radicalization and other complex harms, but staying on top of the ever-changing vocabulary in these areas will never be something we can just say is ‘done’.”

Matli at Spectrum Labs said the company has shown in testing with clients that positive player interactions can boost both new user retention as well as revenue per user. From this perspective, player experience (on games where players interact with one another) can be broken down into three components: Game Design (graphics, game mechanics, etc.), Gameplay (telemetry data on how users engage with the game) and Community Experience (how players interact with each other).

“All three drive the player experience, which drives retention, engagement and revenue,” Matli said.

Matli said that what works best for fighting toxicity in games is a combination of both removing toxic content (and bad actors who disproportionately produce it) as well as reinforcing and promoting healthy, prosocial behaviors that build a sense of community.

“From a technology standpoint, this is accomplished by using advanced behavior detection AI to support human moderators and help them scale the amount of content they can cover,” Matli said. “Basic AI systems work a lot like toxic keyword filters, but as platforms grow, they begin to attract bad actors who want access to the games’ audience for their own purposes.”

Games that are growing will soon need AI that can also capture metadata about the player (while preserving their privacy) in order to detect age gate violations, hate speech, threats, bullying and illegal solicitation. But to detect the most subtle forms of toxicity, such as radicalization or child grooming, where a bad actor tries to deceive another user by pretending to be their friend, only AI systems built on neural network transformers (similar to the technology behind ChatGPT) can track intent across a conversation spanning days in which no flagged keywords are ever used. This is what Spectrum Labs has built.
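Structurally, that kind of conversation-level tracking looks less like filtering single messages and more like scoring a rolling window of the whole exchange, as in the sketch below. The scorer here is a stand-in for a long-context model; Spectrum Labs has not published its implementation.

```python
from collections import defaultdict, deque
from typing import Callable, Deque, Dict, List

# Rolling per-conversation history. A production system would persist this
# across sessions so the window can span days, as described above.
HISTORY_LEN = 200

class ConversationMonitor:
    def __init__(self, scorer: Callable[[List[str]], float], threshold: float = 0.85):
        # `scorer` stands in for a transformer that rates the whole exchange
        # (e.g. grooming or radicalization risk), not individual lines.
        self.scorer = scorer
        self.threshold = threshold
        self.history: Dict[str, Deque[str]] = defaultdict(lambda: deque(maxlen=HISTORY_LEN))

    def on_message(self, conversation_id: str, sender: str, text: str) -> bool:
        """Return True when the conversation as a whole should be escalated,
        even if no single message would trip a keyword filter."""
        self.history[conversation_id].append(f"{sender}: {text}")
        return self.scorer(list(self.history[conversation_id])) >= self.threshold
```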

Reputation tracking

What Is Your Player Reputation Score? A Single Bad Day Won’t Affect It.

Because bad actors can falsely report innocent players as toxic, it matters to track a player’s history of both filing false reports and behaving toxically. That history feeds a reputation score, which becomes especially relevant for repeat offenders.

Not unlike real life, actions speak louder than words; often the most harmful behaviors are things people do to one another in the game itself. GGWP uses AI to automatically identify behaviors such as going AFK or rage-quitting in a team-based game. It looks at feeding, griefing, trolling and intentional friendly fire, and it even integrates with anti-cheat providers for detections.

Reputation management is a part of GGWP’s overall platform, which also includes a community health dashboard, insights, and moderator workflow. Fong said that the reputation analysis is powerful because it is comprehensive – it uses as inputs GGWP’s AI-detected events across chat and in-game actions, player reports, internal sources (e.g. prior sanction history), and external sources such as anti-cheat.

Every incident, whether it’s AI-detected, user-reported or identified through an external source (anti-cheat, moderation team, etc.), flows through the system and affects the user’s reputation score.

“Our reputation is truly a holistic and historical view on a user’s overall conduct and behavior within a community,” Fong said.
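Conceptually, such a score can be sketched as a weighted, time-decayed sum of incidents from all of those sources. The weights, half-life and 100-point scale below are invented for illustration; GGWP has not published its formula, but a decay term is one way to capture the idea that a single bad day won’t define a player.

```python
import time
from typing import Dict, List

# Illustrative weights per incident source; a vendor's real model and scale
# would differ.
SOURCE_WEIGHTS: Dict[str, float] = {
    "ai_chat": 1.0,
    "ai_gameplay": 1.5,     # griefing, intentional friendly fire, rage-quitting
    "player_report": 0.5,   # weighted lower because reports can be false
    "prior_sanction": 3.0,
    "anti_cheat": 5.0,
}

HALF_LIFE_DAYS = 30.0  # older incidents fade over time

def reputation_score(incidents: List[dict], base: float = 100.0) -> float:
    """Each incident is {"source": str, "severity": 0..1, "timestamp": unix seconds}."""
    now = time.time()
    penalty = 0.0
    for incident in incidents:
        age_days = (now - incident["timestamp"]) / 86400.0
        decay = 0.5 ** (age_days / HALF_LIFE_DAYS)  # exponential time decay
        weight = SOURCE_WEIGHTS.get(incident["source"], 1.0)
        penalty += weight * incident["severity"] * decay
    return max(0.0, base - penalty)
```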

Of course, reputation can be tracked on an anonymous basis, where a player account can be monitored without knowledge of who the player is. But as noted, it gets hard to track someone’s reputation across games or across companies.

GGWP’s Discord bot leverages the same chat and reputation models to help moderate Discord servers and brings Discord incidents into the same view within the GGWP platform. The system also takes additional factors into account to help determine player intention and context for every incident. For example, GGWP doesn’t penalize people for what looks like toxic chat if they’re just goofing off with their friends.

Detecting positive play

Some games have implemented reputation systems that reward positive behavior and penalize toxic behavior. Players with good reputations may have access to exclusive features or rewards, while those with poor reputations may be restricted from certain parts of the game, said Matli at Spectrum Labs.

Roblox has its own team of people whose aim is to reward players who have positive effects on the community. It has a team focused on “digital civility.” Electronic Arts also has executives like Rachel Franklin heading an effort for positive play.

Voll believes that consequences are important, but so is encouragement and environment.

“We, as humans, like to be encouraged to do the right thing,” Voll said. “And that doesn’t necessarily need to be, ‘here’s a carrot, or here’s a cookie or whatever.’ Those can work.”

Overall, technology can be a powerful tool in curbing toxicity in games. However, it’s important to note that technology alone may not be enough to solve the problem completely. It’s also important for game developers to take a proactive approach to addressing toxic behavior by creating a positive and inclusive gaming environment, Matli said.

Fong said GGWP can make the reputation scores available to the games to, for example, matchmake new players with positive veterans of the game.

“We also make the player reputation scores available to use by our customers through an API so it can be leveraged in other ways beyond moderation. For example, one of the single biggest reasons why a new player quits a multiplayer game is if they experience toxicity within the first few matches,” Fong said. “So to help combat that, games will purposely match new users with positive reputation veterans of the game to effectively help onboard them.”
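As a toy sketch of that matchmaking idea, assuming reputation scores are already available (for instance through an API like the one Fong mentions), the pairing could look like this; the threshold and data shapes are made up for illustration.

```python
import random
from typing import List, Tuple

def pair_new_players(new_players: List[str],
                     veterans: List[Tuple[str, float]],
                     min_reputation: float = 80.0) -> List[Tuple[str, str]]:
    """Pair each new player with a high-reputation veteran for early matches.
    `veterans` is a list of (player_id, reputation_score) tuples."""
    trusted = [pid for pid, score in veterans if score >= min_reputation]
    random.shuffle(trusted)  # avoid always leaning on the same veterans
    return list(zip(new_players, trusted))

# Example: pair_new_players(["newbie1"], [("vet1", 92.5), ("vet2", 61.0)])
# -> [("newbie1", "vet1")]
```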

Reeves at Tripp said that punishing bad actors may just force them to get more creative with their workarounds.

“As anybody who’s run an online game service knows, they will figure out any vulnerability. In aggregate, they’re smarter than we are. That’s how I always assume the way to play defense on it. If we can create incentives that are rewarding, we can get them to spend more in the game with good behavior.”

What more can be done

ToxMod Uses AI To Detect Extremism In Game Communities.

Voll would like to see more systemic support from platform holders and game engine providers.

“I think that some of the tooling is inaccessible for smaller companies, and a lot of what needs to go into supporting that is often very difficult,” Voll said.

What’s coming? Spectrum Labs said the next stage in toxicity prevention for games is making good AI tools more easily accessible to smaller game studios that want to build user safety into their design from the outset. To this end, Spectrum Labs is launching a new product, called SpectrumGO, which will provide a full, basic content moderation solution to smaller studios for $1,000 per month, covering up to 50 million API requests per month.

Modulate also recently announced that it has deployed the new “violent radicalization” detection category in its ToxMod voice chat moderation software.

This groundbreaking detection category makes ToxMod the gaming industry’s only voice moderation solution capable of identifying individuals promoting white supremacist and white nationalist radicalization and extremism in real-time, allowing community moderators to take immediate action.

In addition to the violent radicalization category, ToxMod continues to offer battle-tested detection of harms, including bullying, racial and cultural hate speech, gender and sexual hate speech, and more, helping game studios better moderate and address problematic player behaviors in their games.

With fines like $2,500 per violation in the California law, game companies need to pay attention.

“You need a carrot and stick approach,” said Crevoshay in the panel at the GamesBeat Summit. “Good design goes a really long way, both in a community and in the game itself in increasing pro-social behavior, increasing shared positive norms and aspirational ideas. But if you don’t also have the stick, it can very easily devolve into a problematic space.”

GamesBeat’s creed when covering the game industry is “where passion meets business.” What does this mean? We want to tell you how the news matters to you — not just as a decision-maker at a game studio, but also as a fan of games. Whether you read our articles, listen to our podcasts, or watch our videos, GamesBeat will help you learn about the industry and enjoy engaging with it. Discover our Briefings.
