Call of Duty Bans Over 350,000 Accounts This Year for Racist Names and Toxicity


For players of Call of Duty, toxic behaviour and racism are nothing new. Allegations of racism are rampant among the franchise’s fans, and many have condemned the publisher for issuing too few Call of Duty bans in response to player reports. Just last month, rapper T-Pain posted a video on TikTok titled “Shutting down some racists in Call of Duty”, in which he defeats a team after they used racist slurs. In the video, which currently has 2.5 million views, players exclaimed “F*** Black Lives Matter” and repeatedly used the n-word.

Activision Publishing, the company behind Call of Duty, has recently moved to curb such behaviour in its games. The changes apply to Call of Duty: Warzone, Black Ops Cold War, Modern Warfare, and Call of Duty: Mobile. In an “anti-toxicity progress report”, the company announced that it has implemented new filters to block offensive player names, clan tags, profiles, and text chat. The filters span 11 languages to catch inappropriate player behaviour. Moreover, Activision has banned over 350,000 Call of Duty accounts “for racist names or toxic behavior” over the past 12 months, based on player reports and a review of its player database.

“We are committed to delivering a fun gameplay experience for all of our players. There’s no place for toxic behavior, hate speech or harassment of any kind in our games or our society,” the report said.

Call of Duty bans aim to decrease in-game toxicity and racism

Call of Duty bans are nothing new to the fanbase; just this year, Activision banned 475,000 accounts for cheating. However, measures targeting racism are relatively new to Call of Duty, and many players have complained about a lack of moderation. Previously, Call of Duty: Warzone shut down its Plunder endgame chat because of the rampant racism displayed in the 100-player chats. Although the feature was originally designed to let players discuss the game and vent after a Plunder match, the conversations often devolved into players screaming slurs.

Looking ahead, the company plans further monitoring measures to curtail racism in the game, expanding player reporting options and its own moderation of the text chat function. It will also take steps to cut back on toxic behaviour over voice chat. This will involve expanded monitoring within its backend technology, “scrubbing databases to bring systems up to current standards”, a fair review of its enforcement policies, and further communication with players. However, Activision did not detail these steps in the report.

“Our goal is to give players the tools needed to manage their own gameplay experience, combined with an enforcement approach that addresses hate speech, racism, sexism and harassment,” the report said. 

Despite good intentions, the changes have drawn mixed reactions from players. Some lauded the company for making long-needed changes. Others argued that the measures came too late and do not go far enough to prevent harassment. Some players denied that racism and other forms of harassment exist in the fanbase at all.

Words by Elizabeth Karpen

Featured Image: Amr Taha


