Spirit is using AI to curb toxic behaviour in game communities

London-based technology outfit Spirit AI is using neural networking and language processing to help combat toxic online communities.

Ally is a social intelligence tool created by the company to contextualise online discussions and determine whether participants are acting maliciously. Spirit AI hopes the tool will help curb cyberbullying and hostile interactions online.

Mitu Khandaker, creative partnerships director at Spirit AI, spoke to VentureBeat at Casual Connect Europe about the potential future of Ally.

“We started Spirit almost three years ago. This was a year into Gamergate, when myself and a lot of my colleagues were targets. I thought that if we really understand nuanced conversation, online harassment is a key area to try to tackle.”

Ally uses AI, natural language understanding, and machine learning to gauge the relationship between users and the intent behind their messages. Instead of flat-out blocking certain words, the tool attempts to determine context - what might be playful banter between friends can easily be harassment between strangers, after all.

Khandaker continued: “You can’t just ascertain harassment from keywords. Language changes. Bad language might be fine between friends, but then something that is completely innocuous-sounding that a stranger suddenly starts - someone might invent a new curse word, right? People try to circumvent these systems as much as possible if they’re trying to be malicious.”

The full interview goes on to discuss Spirit AI's work on believable NPC interactions within games. But with Ally, the outfit hopes it can help players experience worthwhile, believable human interactions outside of games as well as in them.

Staff Writer

Natalie Clayton is an Edinburgh-based freelance writer and game developer. Besides PCGamesInsider, she's written across the games media landscape and was named in the 2018 100 Rising Star list.