Activision will use Artificial Intelligence (AI) to clamp down on toxic voice chats in Call of Duty: Modern Warfare III.
Activision has partnered with Modulate, and will use the company's AI solution called ToxMod. The technology will attempt to identify instances of hate speech in real time and flag chats that it deems offensive.
Flagged data is stored on servers to make it easier to identify and ban offending accounts, while the processing itself is done on the local device. Modulate says it evaluates everything from the tone of a user's voice to the emotion behind the statements they make.
Modulate says it evaluates not only "what is being said" but also the context of "how it is said and how other players respond to it". Once a conversation is flagged, it is brought to the attention of moderators, who will then take action.
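To make the flow described above concrete, here is a minimal, purely illustrative sketch in Python of a flag-and-review pipeline: score a clip locally, flag it when it crosses a threshold, and queue it for human moderators. None of the names or thresholds below come from Modulate's ToxMod; they are assumptions used only to illustrate the general architecture the companies describe.

```python
# Hypothetical sketch of a flag-and-review moderation pipeline.
# These names do not reflect Modulate's actual API; they illustrate the
# flow described above: score on the local device, flag offensive clips,
# and escalate flagged clips to human moderators.
from dataclasses import dataclass, field
from typing import List


@dataclass
class VoiceClip:
    player_id: str
    transcript: str          # text derived from the clip (assumed)
    toxicity_score: float    # 0.0 (benign) to 1.0 (severe), assumed scale


@dataclass
class ModerationQueue:
    flagged: List[VoiceClip] = field(default_factory=list)

    def flag(self, clip: VoiceClip) -> None:
        # Flagged clips would be stored server-side for moderator review.
        self.flagged.append(clip)


def process_locally(clip: VoiceClip, queue: ModerationQueue,
                    threshold: float = 0.8) -> None:
    # Scoring happens on the local device; only clips above the
    # threshold are escalated to the moderation queue.
    if clip.toxicity_score >= threshold:
        queue.flag(clip)


if __name__ == "__main__":
    queue = ModerationQueue()
    process_locally(VoiceClip("player_42", "gg well played", 0.05), queue)
    process_locally(VoiceClip("player_99", "<offensive remark>", 0.93), queue)
    print(f"{len(queue.flagged)} clip(s) awaiting moderator review")
```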
"With this collaboration, we are now bringing Modulate's state of the art machine learning technology that can scale in realtime for a global level of enforcement. This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players," Activision's CTO Michael Vance said in a statement.