Toxic players are one of the biggest problems in Call of Duty. Although mechanisms exist to report or mute them, these measures do not always work. That is why Activision has begun testing a tool that uses artificial intelligence to combat abusive behavior.

In a post on the Call of Duty blog, Activision announced that it will implement a new large-scale, real-time voice chat moderation system. The company will use ToxMod, a tool developed by Modulate that uses artificial intelligence to detect discriminatory speech, hate speech, harassment, and other toxic behavior.

ToxMod works in three stages: first, it classifies and processes voice chat data to determine whether a conversation requires attention. If certain parameters are met, the system then analyzes tone, context, and perceived emotions to determine the type of behavior. This is achieved using machine learning models that, according to Modulate, understand nuance and emotional cues well enough to distinguish a joke from genuinely bad behavior.

“These classification models are not able to understand everything about a conversation at a glance, but they can look for clear signs of anger, distress, aggression, and even subtler sinister intent,” Modulate explains in one of its papers. Finally, the tool escalates the most toxic voice chats so that moderators can take appropriate action.
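As a rough illustration of this triage-analyze-escalate flow, here is a minimal sketch in Python. It is not Modulate's actual code: the stage functions, emotion scores, and thresholds below are all hypothetical stand-ins for the ML models the article describes.

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    NONE = 0
    LOW = 1
    HIGH = 2

@dataclass
class VoiceClip:
    speaker_id: str
    transcript: str       # assumed output of a speech-to-text step
    emotion_scores: dict  # hypothetical, e.g. {"anger": 0.8, "distress": 0.1}

def triage(clip: VoiceClip) -> bool:
    """Stage 1: cheap first pass deciding whether a clip needs attention."""
    return max(clip.emotion_scores.values(), default=0.0) > 0.5

def analyze(clip: VoiceClip) -> Severity:
    """Stage 2: deeper look at tone, context, and perceived emotion.
    A real system would run ML models here; this stub just thresholds anger."""
    anger = clip.emotion_scores.get("anger", 0.0)
    if anger > 0.9:
        return Severity.HIGH
    if anger > 0.6:
        return Severity.LOW
    return Severity.NONE

def moderate(clip: VoiceClip, queue: list) -> None:
    """Stage 3: escalate only the worst clips to human moderators."""
    if not triage(clip):
        return                   # most chatter never gets a second look
    if analyze(clip) is Severity.HIGH:
        queue.append(clip)       # human moderators take it from here

# Usage: only the high-anger clip reaches the moderator queue.
queue: list = []
moderate(VoiceClip("p1", "gg wp", {"anger": 0.05}), queue)
moderate(VoiceClip("p2", "<abusive line>", {"anger": 0.95}), queue)
print([c.speaker_id for c in queue])   # ['p2']
```

The key design point the article highlights is that the expensive analysis only runs on conversations the cheap first pass flags, and humans, not the model, make the final call.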

Call of Duty will fight toxicity in all languages

ToxMod understands 18 languages, including Spanish, and can grasp the full context of a conversation that mixes two or more of them. This matters because players in Call of Duty can often be heard uttering racist slurs in languages other than English.

Modulate combines language models, artificial intelligence, and human experts who are native speakers of each supported language and can identify specific harmful behavior. “Tackling toxic behavior is more complex and requires fluency in the language and the culture of its country of origin, as well as in the subculture and psychology of gaming and online behavior in general,” the company says.

In a previous post, Modulate explained that ToxMod defines specific risk categories, such as violent radicalization or child abuse. This is possible thanks to detection algorithms that identify repetitive patterns of behavior. For example, if a player uses extremist language once, it is logged as an offense; but if the behavior recurs or escalates over the course of a few weeks, the player is added to a risk category for moderators to evaluate.
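To make that escalation rule concrete, here is a minimal sketch of how repeated offenses could promote a player into a risk category. The function name, the two-week window, and the repeat threshold are assumptions for illustration, not Modulate's published parameters.

```python
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(weeks=2)   # assumed look-back period
THRESHOLD = 3                 # assumed repeat-offense count

offenses = defaultdict(list)  # player_id -> list of offense timestamps
risk_queue = []               # players flagged for moderator review

def record_offense(player_id: str, category: str, when: datetime) -> None:
    """Log a single offense; promote the player to a risk category
    if offenses keep recurring within the look-back window."""
    offenses[player_id].append(when)
    recent = [t for t in offenses[player_id] if when - t <= WINDOW]
    if len(recent) >= THRESHOLD:
        risk_queue.append((player_id, category))

# Usage: one incident is just an offense; three within two weeks
# flag the player for human review.
now = datetime.now()
record_offense("p9", "violent_radicalization", now - timedelta(days=10))
record_offense("p9", "violent_radicalization", now - timedelta(days=4))
record_offense("p9", "violent_radicalization", now)
print(risk_queue)   # [('p9', 'violent_radicalization')]
```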

Artificial intelligence is not infallible

It is important to note that ToxMod is not the ultimate solution to offensive behavior in Call of Duty. The tool will be integrated into the game's moderation system, which, among other things, includes text filtering in 14 languages as well as a mechanism for reporting players who violate the rules. The final decision on whether or not to punish a player will rest with the toxicity control team.

The first beta begins today in North America with integration into Call of Duty: Modern Warfare II and Call of Duty: Warzone. It will then roll out to the entire public with the launch of Call of Duty: Modern Warfare III on November 10. Activision has confirmed that moderation will begin in English, with Spanish and other languages added at a later date.

Source: Hipertextual

