ToxMod is distinctive in its ability to analyze tone and intent in speech to determine what is and isn't toxic. It has been trained to distinguish between malice and friendly banter, and it does not detect or identify the ethnicity of individual speakers. The tool also includes a "violent radicalization" category that can flag terms and phrases related to white supremacist groups, radicalization, and extremism in real time. However, final enforcement decisions abide by Call of Duty's official Code of Conduct.
Key takeaways:
- Activision is partnering with AI company Modulate to integrate its voice moderation tool, ToxMod, into Call of Duty games to combat in-game toxicity.
- ToxMod identifies toxic speech in real time, including hate speech, discriminatory language, and harassment.
- The AI does not have the power to issue player bans; it observes and reports toxic behavior to Activision, which handles enforcement.
- ToxMod will launch worldwide in Call of Duty with the release of Modern Warfare III on November 10, starting with English-only moderation and expanding to more languages later.