Call of Duty: AI-Based Moderation System ToxMod Set to Improve the Quality of the Gaming Experience
Toxicity in first-person shooter games like Call of Duty has long been notorious, fueled by toxic lobbies and voice chats. Research suggests the series has one of the most "toxic" fan bases in all of gaming. Perhaps the starkest illustration of this is that a fight between two players once escalated into real police intervention. Activision, which has been trying to change this situation for years, is now turning to artificial intelligence for help.
Activision has partnered with a company called Modulate to bring "in-game voice chat moderation" to its games. The new moderation system, which uses an artificial intelligence technology called ToxMod, will work to detect behaviors such as hate speech, discrimination, and harassment in real time.
An initial beta rollout of ToxMod is already underway in North America, starting with Modern Warfare II and Warzone. With the release of Modern Warfare III on November 10th, ToxMod will become available worldwide.
Modulate's press release offers few details about exactly how ToxMod works. The company states that the tool "triages voice chat to flag bad behavior, analyzes nuances of every conversation to detect toxicity, and enables moderators to quickly respond to each incident by providing relevant and accurate context." The company's CEO said in a recent interview that the tool aims to go beyond mere transcription: ToxMod also takes factors such as a player's emotions and volume into account to distinguish harmful statements from playful ones.
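Modulate has not published its actual pipeline, but the description above suggests a multi-signal scoring approach. The sketch below is purely illustrative: the names (VoiceClip, triage_score, FLAG_THRESHOLD) and the weights are invented for this article, not Modulate's real API. It combines a transcript-based toxicity score with prosody signals such as anger and loudness, and discounts detected laughter to separate banter from abuse.

```python
from dataclasses import dataclass

@dataclass
class VoiceClip:
    """One analyzed slice of voice chat (hypothetical schema, not Modulate's)."""
    transcript_toxicity: float  # 0..1, e.g. from speech-to-text plus a text classifier
    anger_score: float          # 0..1, prosody-based emotion estimate
    loudness_db_delta: float    # dB above the speaker's own baseline volume
    laughter_detected: bool     # playful banter often co-occurs with laughter

def triage_score(clip: VoiceClip) -> float:
    """Blend text and prosody into one priority score (illustrative weights only).

    Shouting and anger push the score up; detected laughter pulls it down,
    mirroring the stated goal of separating harmful speech from joking.
    """
    score = clip.transcript_toxicity
    score += 0.2 * clip.anger_score
    score += 0.1 * max(clip.loudness_db_delta, 0.0) / 10.0
    if clip.laughter_detected:
        score -= 0.15
    return max(0.0, min(1.0, score))

FLAG_THRESHOLD = 0.7  # clips at or above this go to human review, not auto-punishment

clip = VoiceClip(transcript_toxicity=0.65, anger_score=0.8,
                 loudness_db_delta=12.0, laughter_detected=False)
if triage_score(clip) >= FLAG_THRESHOLD:
    print("flag for moderator review")  # scores 0.93 here, so the clip is flagged
```

The key design idea reflected here is that no single signal decides: a loud, angry clip with a borderline transcript can still be flagged, while the same words said through laughter may not be.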
ToxMod will report Call of Duty players to Activision moderators (at least for now) rather than taking action against them directly. Since research has shown that speech recognition systems can exhibit bias when responding to users of different racial identities and accents, keeping humans in the loop will likely remain an important safeguard.
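That human-in-the-loop design can be pictured as a review queue rather than an automated enforcement path. The following minimal sketch continues the assumptions above (Flag and ModeratorQueue are invented names, not Activision's or Modulate's actual tooling): flagged incidents carry surrounding context and wait for a human decision.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Flag:
    """A flagged incident forwarded to human moderators (invented for illustration)."""
    player_id: str
    score: float
    context_transcript: str  # surrounding conversation so the reviewer sees context

@dataclass
class ModeratorQueue:
    """The AI reports; a person decides. No automated bans in this sketch."""
    pending: List[Flag] = field(default_factory=list)

    def submit(self, flag: Flag) -> None:
        # The flag only enters a review queue; keeping a human in the loop
        # guards against known biases in speech recognition systems.
        self.pending.append(flag)

    def next_case(self) -> Flag:
        # Surface the highest-scoring incident first.
        self.pending.sort(key=lambda f: f.score, reverse=True)
        return self.pending.pop(0)

queue = ModeratorQueue()
queue.submit(Flag("player123", 0.93, "...surrounding chat transcript..."))
case = queue.next_case()
print(f"moderator reviews {case.player_id} (score {case.score})")
```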
With ToxMod, Activision aims to improve the quality of the gaming experience by reducing toxicity within the Call of Duty community, giving players a safer and more enjoyable environment in which to interact. Bringing artificial intelligence into moderation systems marks a notable step in the ongoing effort to build a healthier gaming ecosystem.
In summary, Activision has partnered with Modulate to introduce a new moderation system, ToxMod, to its Call of Duty games. Using artificial intelligence, ToxMod will detect and help address toxic behavior such as hate speech and harassment in real time. Although ToxMod will only report players to Activision moderators for now, it represents an important step toward reducing toxicity and improving the overall experience for the Call of Duty community. Gamers can look forward to a more positive and enjoyable environment once ToxMod is in place.