
Blizzard Exploring Algorithms To Combat Overwatch Toxicity

by Imran Khan on Apr 03, 2018 at 07:31 PM

In an interview with Kotaku, Blizzard vice president and Overwatch lead designer Jeff Kaplan explained that the company is exploring how machine learning could be used to catch and punish toxic Overwatch players.

Player toxicity has been a focus for the team over the last year, with Kaplan even explaining that content creation has slowed down in order to address the problem of players being toxic to each other in the game. To this end, Kaplan spoke with Kotaku about machine-learning algorithms as a means to help combat the problem.

"We’ve been experimenting with machine learning," Kaplan said. "We’ve been trying to teach our games what toxic language is, which is kinda fun. The thinking there is you don’t have to wait for a report to determine that something’s toxic. Our goal is to get it so you don’t have to wait for a report to happen."

Right now, Blizzard is mostly focusing on language, as that's far easier for a computer to figure out than behavior.

“That’s the next step,” said Kaplan. “Like, do you know when the Mei ice wall went up in the spawn room that somebody was being a jerk?”

It is important to note that this hasn't actually been implemented yet; right now, Blizzard is just exploring the option. It does raise a lot of interesting questions, though, such as whether flagged cases would be reviewed by a person, and whether false positives could be common or even intentionally triggered by players. It is also possible the system is simply never used.

[Source: Kotaku]

 

Our Take
From an academic standpoint, I'd really be interested in finding out how they plan to use machine learning to understand griefing. I am unsure what it would do with text chat that a simple word or phrase trigger does not already accomplish, though.
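For readers curious about that distinction, here is a minimal sketch contrasting a plain word/phrase trigger with a small machine-learned text classifier. It is purely illustrative: the blocklist, training phrases, and labels are invented for the example, and nothing here reflects how Blizzard's actual system works.

```python
# Illustrative only: a word/phrase trigger vs. a tiny learned text classifier.
# The blocklist and training data below are made up for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

BLOCKLIST = {"noob", "trash"}  # hypothetical trigger words

def keyword_trigger(message: str) -> bool:
    """Flags a message only if it contains an exact blocklisted word."""
    return any(word in BLOCKLIST for word in message.lower().split())

# Tiny, invented training set; a real system would need far more data.
train_texts = [
    "gg well played everyone",
    "nice shot, thanks for the heal",
    "uninstall the game you are useless",
    "you are all garbage, worst team ever",
]
train_labels = [0, 0, 1, 1]  # 0 = fine, 1 = toxic

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

msg = "wow, great job throwing the match on purpose"
print(keyword_trigger(msg))              # False: no blocklisted word appears
print(model.predict_proba([msg])[0][1])  # learned probability the message is toxic
```

The point of the sketch is simply that a trigger list only catches exact matches, while a trained classifier can assign a toxicity score to phrasing it has never seen verbatim, which is presumably where the appeal of machine learning lies.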