Call of Duty creators use artificial intelligence to combat toxic behavior. AI banned 2 million players

The creators of Call of Duty have, with the help of artificial intelligence, banned over 2 million player accounts for toxic behavior in online multiplayer.

Online toxicity, fueled by users' false sense of anonymity and impunity, has plagued video game communities for years. It appears on discussion forums and in comments under reviews, but the biggest problem is aggressive players in multiplayer modes, where people play together over the global network.

Activision, the publisher of the Call of Duty games, is well aware of this and has decided to fight back. Through a decisive response to toxic behavior and, above all, the support of AI-based algorithms, it has banned over 2 million accounts belonging to players who abused other participants in online matches.

Artificial intelligence fights toxic players in Call of Duty

Offensive, rude, or hostile language, as well as any attempt to harass or discriminate against others: this is the kind of behavior that artificial intelligence now helps combat in Call of Duty's multiplayer games.

Activision uses AI to detect toxic behavior, among other things by analyzing voice chat for specific keywords and prohibited phrases. The rules are described in detail in the game's Code of Conduct, which, at least in theory, everyone playing Call of Duty online should have read.
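To illustrate the general idea only (Activision has not published the internals of its system), a keyword-based flagging step might look something like the minimal Python sketch below. The transcribe stub and the blocklist contents are hypothetical placeholders; a real pipeline would involve an actual speech-to-text model, contextual analysis, and human review before any penalty.

```python
import re

# Hypothetical blocklist; a real system would maintain a much larger,
# regularly updated list tied to the game's Code of Conduct.
PROHIBITED_PHRASES = ["example slur", "example threat"]

# Precompile one pattern per phrase, matched on word boundaries so that
# substrings inside harmless words are not flagged.
_PATTERNS = [re.compile(rf"\b{re.escape(p)}\b", re.IGNORECASE)
             for p in PROHIBITED_PHRASES]

def transcribe(audio_chunk: bytes) -> str:
    """Hypothetical speech-to-text stub; a real pipeline would run an
    ASR model here instead of decoding bytes directly."""
    return audio_chunk.decode("utf-8", errors="ignore")

def flag_toxic_speech(audio_chunk: bytes) -> list[str]:
    """Return the prohibited phrases found in a voice-chat snippet.
    A non-empty result would be queued for moderator review, not turned
    into an automatic ban."""
    text = transcribe(audio_chunk)
    return [phrase for phrase, pat in zip(PROHIBITED_PHRASES, _PATTERNS)
            if pat.search(text)]

if __name__ == "__main__":
    # Simulated voice-chat snippet, passed as raw bytes for the demo.
    hits = flag_toxic_speech(b"that was an example threat, seriously")
    print(hits)  # ['example threat'] -> escalated for human review
```

Matching on word boundaries avoids false positives from innocent substrings, but plain keyword lists cannot capture context or tone, which is one reason moderation systems pair automated detection with human reviewers.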

Players themselves report too few cases of aggression

The publisher of the popular series points out that only one in five cases of toxic behavior is reported by players themselves. That is why Activision decided to implement an AI-based moderation system; since its introduction, the company has recorded an 8% month-over-month decline in the number of aggressive players.

The new moderation system has had the biggest impact on the latest installment in the series, Modern Warfare III, where Activision reports that players' exposure to toxic behavior fell by almost 50%.

“We do not tolerate abuse or harassment, including derogatory comments based on race, gender identity, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin,” reads a statement issued by Activision.
