Call of Duty: Modern Warfare 3 will use AI to filter toxic voice chats
Activision says 'ToxMod' can detect in-game toxicity like hate speech, harassment, bullying, sexism
(Web Desk) - Call of Duty maker Activision has announced that it will start using an AI-powered voice chat moderation tool to identify toxic speech in its upcoming title, Call of Duty: Modern Warfare III, which launches on November 10 this year.
According to a recent blog post, Activision has teamed up with Modulate to roll out a global voice chat moderation tool called ‘ToxMod’, which uses machine learning to detect in-game toxicity such as hate speech, harassment, bullying, sexism, and discriminatory language, among other things.
The AI-powered moderation tool will join Call of Duty’s existing anti-toxicity measures, which include text-based filtering across 14 languages for in-game chat, as well as an in-game player reporting system.
Since AI-powered tools are known to hallucinate and produce false positives, especially in languages other than English, the company’s support page indicates that the AI-based moderation system will only submit reports of toxic behaviour. The decision to enforce these rules, however, will remain with a human reviewer.