Activision has begun testing a new AI-powered voice chat moderation system for Call of Duty games.
The system will be rolled out globally to coincide with Modern Warfare 3’s release in November.
“Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-Powered voice chat moderation technology from Modulate, to identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more,” Activision said in a blog post.
“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.”
Since the release of Modern Warfare 2 last October, Activision claims to have restricted voice and/or text chat to over one million accounts that violated its Call of Duty code of conduct.
It said 20% of players didn’t reoffend after receiving a first warning, while those who did faced penalties including voice and text chat bans and temporary account restrictions.
Activision plans to reveal Modern Warfare 3’s multiplayer offering at the returning Call of Duty Next showcase on October 5.
The game will feature modernised versions of all 16 launch maps from 2009’s Modern Warfare 2. A selection of these will be included in the multiplayer beta, along with new Ground War experiences.
The second beta weekend is for all platforms and will support cross-play.
Set for release on November 10, Modern Warfare 3 is a direct sequel to last year’s Modern Warfare 2. Development is being led by Sledgehammer Games, in collaboration with Infinity Ward, while Treyarch is in charge of its Zombies mode.