Activision introducing voice chat moderation for Call of Duty

Activision is rolling out in-game voice chat moderation in Call of Duty.

In a blog post on the franchise's website, the company wrote that the feature uses Modulate's ToxMod, an artificial intelligence system that can recognise toxic speech in real time. That includes hate speech, discriminatory language and harassment, Activision says.

This tech is being tested in beta form as of yesterday in Call of Duty: Modern Warfare 2 and Warzone, before a full rollout in the upcoming Call of Duty: Modern Warfare 3.

"This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system," the Call of Duty team wrote.

Activision also said that since Modern Warfare 2's launch last October, it has restricted voice and/or text chat for over one million accounts that have violated Call of Duty's code of conduct.

PCGamesInsider Contributing Editor

Alex Calvin is a freelance journalist who writes about the business of games. He started out at UK trade paper MCV in 2013 and left as deputy editor over three years later. In June 2017, he joined Steel Media as the editor of a new site. In October 2019 he left this full-time position at the company but still contributes to the site on a daily basis. He has also written for VGC, Games London, The Observer/Guardian and Esquire UK.