AI Moderator Takes On Its First Players in Call of Duty

Call of Duty: Modern Warfare III to Introduce AI for Moderation of Voice Chat

On the eve of the release of Call of Duty: Modern Warfare III, scheduled for November 10, Activision announced that artificial intelligence will be used to moderate the game's voice chat. The announcement was made in a statement on the company's website.

To make this idea a reality, Activision is partnering with Modulate. The new system, called ToxMod, aims to detect and curb hate speech, harassment, and other forms of aggressive behavior within the game.

While toxic behavior among gamers is not a new issue, it is especially relevant for the Call of Duty franchise. As the number of fans of the game grows, so does the scale of the problem.

The developers have already rolled out text-chat filters in 14 languages and built an in-game system for player reports. These measures have led to chat restrictions on more than one million accounts that violated the rules. Notably, 20% of players who received a warning did not reoffend, suggesting the feedback system has been effective.
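To illustrate the general idea of a warn-then-restrict text filter, here is a minimal sketch in Python. All names, word lists, and the escalation rule are hypothetical; this is not Activision's implementation, only a toy version of the kind of feedback loop described above.

```python
# Illustrative sketch only: a toy multilingual text filter with a
# warn-then-restrict escalation. Blocklists and thresholds are hypothetical.
from collections import defaultdict

# Hypothetical per-language blocklists (real systems use far richer models).
BLOCKLISTS = {
    "en": {"slur1", "slur2"},
    "de": {"schimpfwort"},
}

# Track how many violations each account has accumulated.
violations = defaultdict(int)

def check_message(account_id: str, language: str, text: str) -> str:
    """Return the action to take for a single chat message."""
    words = set(text.lower().split())
    if words & BLOCKLISTS.get(language, set()):
        violations[account_id] += 1
        # First offense gets a warning; repeat offenders lose chat access.
        return "warn" if violations[account_id] == 1 else "restrict_chat"
    return "allow"

if __name__ == "__main__":
    print(check_message("player42", "en", "gg slur1"))     # warn
    print(check_message("player42", "en", "slur1 again"))  # restrict_chat
```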

That said, the AI's speech recognition algorithms are not yet perfect. False positives are likely at first, since the system must cope with varying audio quality, accents, different languages, and speech impairments, which makes building a system that accounts for all these factors a genuinely difficult task.

Activision stresses that humans will always remain part of the moderation process. The system operates in real time, detecting toxic statements and categorizing them according to the context of Call of Duty. Before any sanctions are applied, moderators may review similar recorded instances to make better decisions, so penalties will not be immediate. As the system matures, both the process and its reaction speed are expected to improve.
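The workflow described above, where the AI only flags clips and a human decides on penalties, can be sketched roughly as follows. The class names, the scoring detector, and the 0.8 threshold are assumptions for illustration; this is not ToxMod's actual API or logic.

```python
# Minimal sketch of a "flag first, humans decide" moderation flow,
# assuming a hypothetical detector that scores voice-chat transcripts.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Flag:
    account_id: str
    transcript: str
    score: float   # detector's confidence that the clip is toxic
    category: str  # e.g. "harassment", "hate speech"

@dataclass
class ReviewQueue:
    pending: List[Flag] = field(default_factory=list)

    def submit(self, flag: Flag, threshold: float = 0.8) -> None:
        # Only high-confidence detections are queued; nothing is sanctioned
        # automatically -- a human moderator reviews each flag later.
        if flag.score >= threshold:
            self.pending.append(flag)

    def review(self, decide: Callable[[Flag], tuple]) -> list:
        # A moderator examines each flagged clip (and similar past clips)
        # before any penalty is applied.
        return [decide(f) for f in self.pending]

if __name__ == "__main__":
    queue = ReviewQueue()
    queue.submit(Flag("player42", "example toxic line", 0.93, "harassment"))
    queue.submit(Flag("player7", "borderline remark", 0.55, "harassment"))  # below threshold, dropped
    print(queue.review(lambda f: ("restrict_voice" if f.score > 0.9 else "no_action", f.account_id)))
```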

The AI moderation system is currently in beta testing among users in North America. Following the beta, a wider rollout is expected to accompany the game's worldwide launch.

Sources: reports, release notes, official announcements.