We’ve all met toxic people online, and I’d go as far as to say that most people — if we’re being honest — have been toxic themselves. However, there is a distinction between getting heated during a tense game and outright toxicity, and some games — such as Call of Duty — have an outsized share of the latter. Now, Activision has partnered with Modulate to bring AI integration to Call of Duty to combat toxicity.
This week, Activision’s Call of Duty staff announced a new focus on voice chat toxicity, an issue the game is infamous for. Activision has partnered with Boston-based AI startup Modulate, which developed ToxMod, a cutting-edge AI-powered voice chat moderation system.
The Power of ToxMod: AI-Powered Voice Chat Moderation
Scheduled to debut in Call of Duty: Modern Warfare III on November 10, ToxMod employs advanced AI to instantly detect and counter toxic speech, including but not limited to hate speech, discriminatory language, and harassment.
North American players got a taste of this feature starting August 30, when an initial beta rolled out for Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full global launch (excluding Asia) is on the horizon, aligning with the release of Modern Warfare III.
While the beta supports English only, Activision plans to introduce additional languages in subsequent releases, ensuring a broader player base benefits.
Strengthening Anti-Toxicity Measures
ToxMod is not a standalone solution. Instead, it complements the moderation systems helmed by the CoD anti-toxicity team, which include text filtering across 14 languages and an in-game player reporting mechanism. Unfortunately, these measures have only gone so far in stopping widespread toxicity, and when hate is delivered through voice chat, moderation has depended on player reports, a mechanism that has itself seen abuse in the form of malicious reporting.
Nevertheless, since Modern Warfare II, over 1 million accounts have faced chat restrictions for breaching the CoD Code of Conduct. Interestingly, Activision states that 20% of players heeded their initial warnings and refrained from repeating offenses.
Michael Vance, CTO at Activision, and Mike Pappas, CEO at Modulate, both underscore the significance of this partnership. For Vance, it’s about creating a “welcoming experience for all players.” It is this point that resonates with me. Toxicity can be merely annoying, but too often it gatekeeps the inclusive and safe environment that gaming ought to cultivate. When a franchise of Call of Duty’s size is a hotbed for toxicity, it’s important for the developers to lead from the front with countermeasures, and that’s what we’re seeing.
Final Thought: A Positive AI Use-Case
There is a connected takeaway from this story that is worth highlighting. So many of the news stories surrounding novel applications of AI focus on the risks or paint a dystopian future. That is sometimes justified, but it is also the result of mainstream media narrative swings; “AI is amazing” has lost its allure, so now it’s time to assume the opposite stance. Modulate’s AI voice chat moderation tool, ToxMod, is an example of doing well by doing good, and with its product integrated into one of the biggest gaming franchises around, I’d say the company is doing very well indeed.