In its ongoing fight with the toxic parts of the Overwatch community, Blizzard has turned to computers to help spot and report negative behaviour.
That's according to game director Jeff Kaplan, who told Kotaku that the games firm has been trying to use machine learning to spot toxic language. The firm is teaching the AI non-English languages, such as Korean, and in the long run hopes to teach it what toxic gameplay is.
This is the latest step Blizzard has taken to try and curb the negative aspects of the online shooter's community. In January of this year, Kaplan said that changes made to Overwatch's reporting and punishment systems have seen abusive chat fall by 17 per cent. He also told Kotaku that players are using the report function 20 per cent more.
To give you an idea of how bad things got, Blizzard put together a dedicated strike team to deal with toxicity in the game.
“We’ve been experimenting with machine learning,” he said.
“We’ve been trying to teach our games what toxic language is, which is kinda fun. The thinking there is you don’t have to wait for a report to determine that something’s toxic. Our goal is to get it so you don’t have to wait for a report to happen.”
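Kaplan doesn't describe what Blizzard's system actually looks like, but the general idea — learning from labelled examples which words mark a chat message as toxic, so a flag can be raised before anyone files a report — can be sketched with a toy naive Bayes classifier. The training data, labels, and function names below are purely illustrative, not Blizzard's:

```python
from collections import Counter
import math

# Toy labelled chat lines. A real system would train on millions of
# reported and cleared messages; these four are illustrative only.
TRAIN = [
    ("you are trash uninstall", "toxic"),
    ("worst player ever report him", "toxic"),
    ("nice shot great play", "ok"),
    ("good game well played team", "ok"),
]

def train(data):
    """Count word frequencies per label for a naive Bayes model."""
    counts = {"toxic": Counter(), "ok": Counter()}
    totals = Counter()
    for text, label in data:
        for word in text.split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def score(message, counts, totals, label):
    """Log-probability of the message under one label, with add-one smoothing."""
    vocab = len({w for c in counts.values() for w in c})
    s = math.log(totals[label] / sum(totals.values()))
    for word in message.split():
        s += math.log((counts[label][word] + 1) / (totals[label] + vocab))
    return s

def classify(message, counts, totals):
    """Pick whichever label gives the message a higher score."""
    return max(("toxic", "ok"),
               key=lambda label: score(message, counts, totals, label))

counts, totals = train(TRAIN)
print(classify("you are trash", counts, totals))  # → toxic
```

In practice this kind of classifier is only a starting point: production systems layer on obfuscation handling (leetspeak, spacing tricks), per-language models — hence Blizzard teaching the system Korean — and human review of borderline cases.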
In addition to the above, the firm is looking into how it can reinforce good behaviour. This is something that League of Legends developer Riot Games has been trying for years, with some degree of success. Both Blizzard and Riot are part of the new anti-toxicity initiative, the Fair Play Alliance, incidentally.
“We can start looking toward the future and talking about things like, what’s the positive version of reporting?” said Kaplan.
“Reporting is saying ‘Hey, Adrian was really bad and I want to punish him for that,’ but what’s the version where I can say ‘That Adrian guy was an awesome teammate and I’m so glad I had him’?”
Toxicity in video games was one of the biggest talking points of last year, in large part because Kaplan opened up about how hard dealing with the Overwatch community could be. A former Riot Games employee echoed this sentiment, too.
Games such as Rainbow Six: Siege and Fortnite have since made changes to encourage positive play and make reporting of negative behaviour easier. Even Microsoft has outlined how people can get banned from its services for toxic behaviour.