Call of Duty will use AI to moderate voice chats
First-person shooters like Call of Duty are infamous for the toxicity of their lobbies and voice chats. Surveys have dubbed the franchise’s fan base the most negative in all of gaming, and a feud between two players once led to a SWAT team being summoned. Activision has been trying to crack down on this behavior for years, and part of the solution might involve artificial intelligence.
Activision has partnered with a company called Modulate to bring “in-game voice chat moderation” to its titles. The new system, built on an AI technology called ToxMod, will work to identify behaviors like hate speech, discrimination, and harassment in real time.
ToxMod’s initial beta rollout in North America begins today. It’s active within Call of Duty: Modern Warfare II and Call of Duty: Warzone. A “full worldwide release” (which does not include Asia, the press release notes) will follow on November 10th alongside the launch of Call of Duty: Modern Warfare III, this year’s new entry in the franchise.
Modulate’s press release doesn’t offer many details about how exactly ToxMod works. Its website notes that the tool “triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.” The company’s CEO said in a recent interview that the tool aims to go beyond mere transcription; it also takes factors like a player’s emotions and volume into account in order to differentiate harmful statements from playful ones.
Notably, the tool (for now, at least) will not actually take action against players based on its data; it will only submit reports to Activision’s moderators. Human involvement will likely remain an important safeguard, since research has shown that speech recognition systems can exhibit bias in how they respond to users with different racial identities and accents.
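Based on those public descriptions, the flow resembles a staged pipeline: a cheap triage pass flags candidate clips, a deeper contextual analysis weighs signals like emotion and volume, and anything above a threshold is packaged into a report for human moderators rather than acted on automatically. The Python sketch below illustrates that general shape only; Modulate has not published implementation details, so every name, signal, and heuristic here (VoiceClip, triage, analyze_toxicity, report_to_moderators, the score weights) is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VoiceClip:
    """A snippet of voice chat plus contextual signals (all fields hypothetical)."""
    transcript: str    # what was said
    emotion: str       # e.g. "angry" or "playful", inferred from tone of voice
    volume_db: float   # loudness, one contextual signal
    player_id: str

def triage(clip: VoiceClip) -> bool:
    """Cheap first pass: flag clips that might contain bad behavior."""
    flagged_terms = {"slur_placeholder"}  # stand-in for a real lexicon or model
    return any(term in clip.transcript.lower() for term in flagged_terms)

def analyze_toxicity(clip: VoiceClip) -> float:
    """Deeper pass: weigh conversational context, not just the words.
    An angry, shouted remark scores higher than the same words said playfully."""
    score = 0.5
    if clip.emotion == "angry":
        score += 0.3
    if clip.volume_db > 80:
        score += 0.1
    if clip.emotion == "playful":
        score -= 0.3
    return max(0.0, min(1.0, score))

def report_to_moderators(clip: VoiceClip, score: float) -> None:
    """The tool doesn't punish players itself; it files a report with context
    so human moderators can decide what action, if any, to take."""
    print(f"Report: player={clip.player_id} score={score:.2f} "
          f"context=({clip.emotion}, {clip.volume_db} dB): {clip.transcript!r}")

def moderate(clip: VoiceClip, threshold: float = 0.7) -> None:
    """Full pipeline: triage, then contextual analysis, then human escalation."""
    if triage(clip):
        score = analyze_toxicity(clip)
        if score >= threshold:
            report_to_moderators(clip, score)
```

The key design choice the article highlights is in that last function: the automated stages only filter and score, and the output is a contextualized report to a human rather than an automatic ban.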