They will call your mother names and tell you to kill yourself, and then they will go to sleep soundly

Games are supposed to be entertainment; they should give us pleasure. Yet they often cause frustration and stress. Not because of the developers' intentions, but because of other players. Toxic players. Why does this happen?

There is probably no one who plays multiplayer games regularly and has never encountered someone toxic. It starts when we get called a noob (from newbie, meaning a player with low skills) after losing a round. It gets worse when a toxic player insults our mother and three generations of our family and tells us to kill ourselves (players use the abbreviation kys, i.e. kill yourself). Although this type of behavior is usually severely punished by game developers, it is easier to find such a person than someone kind who understands that we all make mistakes sometimes. Where does this come from? Why do players insult one another instead of enjoying their favorite entertainment?

Players are toxic – it's a fact confirmed by research

Anyone who plays multiplayer games will probably agree that encountering a toxic person is nothing unusual. In fact, it is far easier and more common today than finding an understanding and cordial gaming companion. This is confirmed by my own observations, by those of my friends, and by a streamer who spends several hours in front of the computer almost every day, mainly playing online games such as Valorant or CS2.

I encounter toxic players in online games almost every day. It can even be said that games in which none of the players show toxic behavior are quite rare.

– says streamer Paweł 'saju' Pawełczak in an interview with TELEPOLIS.PL.

Additional confirmation comes from the “Multiplayer Games Toxicity Report 2023”, prepared by Harris Poll on behalf of Unity. The study was conducted from July 21 to August 9 last year and covered 2,522 players and 407 developers from the USA, Great Britain and South Korea. It shows that as many as 74 percent of players have witnessed toxic behavior in-game, an increase of 6 percentage points over an identical study a year earlier. Moreover, as many as 96 percent of players reacted to these behaviors in some way. Among game developers, 53 percent admitted that such cases of inappropriate behavior are becoming more frequent.

Where does this even come from? According to a 2023 report prepared by the VideoGames Europe organization (which brings together many publishers, including Epic Games, Take Two, Nintendo, EA and Activision Blizzard), as many as 68 percent of gamers find games helpful in relieving stress. In turn, 57 percent of us feel less anxious thanks to them, and for 53 percent they help fight loneliness and the feeling of isolation. It would therefore seem that playing should make us feel better. Games should give us pleasure and help us release negative emotions. Meanwhile, they are increasingly becoming a source of those emotions, mainly because of toxic players. Why?

Gamers are toxic because they often sit down to play after a long day at school or at work. Additionally, multiplayer games are inherently stressful (a player never has full control over the course of the match). And because stress from various sources accumulates, toxic behavior becomes an “outlet” for emotions. Some players sit down to play with the very intention of “having fun” at others' expense. By providing anonymity, games make it easier to redirect frustration toward fellow players.

– Natalia Koperska, a sports psychologist and personal development trainer, who also specializes in e-sports, tells us.

Conspiracy of silence

And what do the game developers themselves say about this? There is a certain discrepancy between what developers and publishers think and what players think. The already mentioned “Multiplayer Games Toxicity Report 2023” states that as many as 81 percent of players believe fighting toxicity should be a priority. In turn, 89 percent of developers admit that more could be done in this matter, yet they do not consider it a priority. That is because players are more likely to abandon a given title not because of other players, but because of lag, boring and repetitive gameplay, overly aggressive monetization, or a lack of new content.

While collecting material for this article, I decided to ask several publishers and developers for details. I sent questions to, among others, Riot Games, Ubisoft, PlayStation, Xbox, Activision Blizzard and Valve. In some cases I was ignored entirely, in others I was refused a response (e.g. by PlayStation). From the rest, to be completely honest, I received only perfunctory assurances of a strong commitment to fighting toxicity. At the same time, no developer was willing to reveal even the smallest statistics on toxic behavior in games.

Harassment and abuse are not acceptable in any area of life. Our industry has clear codes of conduct prohibiting harmful behavior such as hate speech, harassment and incitement to violence, and member companies impose serious penalties, including account bans and reports to law enforcement. The video game industry uses many tools, including filtering tools, advanced AI moderation technologies and experienced moderators, while players have tools to report, mute and block. Privacy settings and parental controls allow parents and players to manage online interactions, including disabling interactions with other players. We will continue to listen to the gaming community and parents and improve our processes and policies to ensure a safe and positive experience for all players.

– I found out from the Association of Video Game Producers and Distributors SPiDOR.

Ubisoft, in turn, admitted that its approach to managing harmful behavior rests on three pillars. The first is prevention: setting clear rules and raising awareness of the consequences of destructive behavior. The second is continuous technological innovation that helps identify and counteract such behavior more accurately. The third is protecting communities through direct intervention.

Why don't game developers want to talk openly about toxicity among players? I cannot answer this question. I partially understand the fear of disclosing specific statistics, and I am sure the developers have them. After all, showing how many players behave toxically would be something of an exposé, like walking into the middle of a crowded room and dropping your pants. A terrifying prospect. But is it really? Just as the sight of a naked person hardly shocks anyone anymore, because we all know what our bodies look like, talking openly about toxic players is not terrible when we are all aware of their existence. Ultimately, it is difficult to fight an opponent we do not know. It is hard to wage a war when you do not know who you are fighting.

At this point, it is also worth mentioning the Tribunal, a system that operated in League of Legends years ago. Launched in 2011, it let the community decide the fate of other players: everyone could review cases of toxic behavior and vote on punishment (as long as enough votes were cast). Players liked the system very much, but it was shut down in 2014. Riot Games claimed it was ineffective and that automated mechanisms handle the task much better. Yes, many players are punished, but each of us has more than once seen a toxic player go unpunished. I can recall many games in which people insulted me and my family and wished me dead, and absolutely nothing happened.

How to fight toxicity?

We know that gamers are toxic. We know they appear in many games, although no developer or publisher wants to reveal specifics. We know they can ruin a match. We know, finally, that this is a huge problem. Fighting it may be tilting at windmills, a battle we will never fully win. But fight we must. The question is: how?

I think that sanctioning toxic behavior and creating tools to limit it is the responsibility of game developers. What remains for us as a community is to educate, report and mute such players.

– says Natalia Koperska in an interview with us.

A similar opinion is shared by Paweł 'saju' Pawełczak, a Polish streamer known from Twitch, who argues that muting is a last resort because it deprives us of communication, which can often decide a win or a loss.

Dealing with such players is extremely difficult because there really is no proven method. You can mute such a player, but in my opinion this is a last resort, because we often lose in-game communication, and that too leads to losing. However, if no arguments or attempts to calm such a person down work, the only option left is MUTE.

– says the streamer in an interview with TELEPOLIS.PL.

Artificial intelligence may help here, specifically machine learning. Activision Blizzard, publisher of the Call of Duty series, already uses such a system, called ToxMod. The AI analyzes voice chat in real time and automatically detects words or phrases that are prohibited or may be considered offensive. Since implementing the mechanism, the developers have noticed a decrease in aggressive behavior among players: in Call of Duty: Modern Warfare III, exposure to toxic behavior dropped by as much as 50 percent. A similar system operates on FACEIT, a platform for playing, among others, Counter-Strike 2.
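ToxMod itself is proprietary and far more sophisticated, but the basic idea of flagging prohibited phrases in (transcribed) chat can be sketched in a few lines of Python. This is a toy illustration only: the blocklist below is invented, and real moderation systems rely on trained models and context, not hard-coded phrase matching.

```python
import re

# Invented example blocklist; real systems learn what counts as
# offensive from data rather than keeping a fixed phrase list.
BLOCKED_PHRASES = ["kys", "uninstall", "trash player"]

def flag_message(message: str) -> list[str]:
    """Return the blocked phrases found in a chat message."""
    text = message.lower()
    return [p for p in BLOCKED_PHRASES
            if re.search(r"\b" + re.escape(p) + r"\b", text)]

print(flag_message("gg wp"))                 # []
print(flag_message("kys you trash player"))  # ['kys', 'trash player']
```

The word-boundary anchors (`\b`) keep the filter from firing on harmless substrings, which is exactly the kind of false positive a naive filter would otherwise produce.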

In 2022, the GGWP platform was founded. The people behind it include former professional gamer Dennis “Thresh” Fong, Crunchyroll founder Kun Gao, and data and artificial intelligence expert George Ng. They created an AI that analyzes player behavior: the mechanism not only monitors what players write in chat, but also detects toxicity-related behaviors such as leaving matches, dying on purpose or attacking players on the same team. Additionally, the AI keeps learning, drawing on social media as well.
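GGWP's models are likewise proprietary, but the idea of combining behavioral signals (not just chat) into a per-player score can be sketched as follows. The signal names and weights here are invented for illustration; a real system would learn them from data.

```python
from dataclasses import dataclass

# Invented example weights for behavioral toxicity signals.
WEIGHTS = {"left_match_early": 3, "intentional_death": 5, "team_attack": 4}

@dataclass
class MatchReport:
    """Counts of flagged behaviors observed in one match."""
    left_match_early: int = 0
    intentional_death: int = 0
    team_attack: int = 0

def toxicity_score(report: MatchReport) -> int:
    """Combine behavioral signals into a single weighted score."""
    return sum(WEIGHTS[name] * getattr(report, name) for name in WEIGHTS)

print(toxicity_score(MatchReport(team_attack=2)))                            # 8
print(toxicity_score(MatchReport(left_match_early=1, intentional_death=3)))  # 18
```

Scoring behavior rather than words is what lets such systems catch players who never type a single insult but sabotage their team all the same.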

With the current pace of development in AI, machine learning and neural networks, this seems to be the future that can save us from toxic players. However, let us remember that we ourselves are the problem. I beat my breast and admit that I am sometimes toxic toward other players. I regret it, and I appeal to you and to myself: if we want games to give us pleasure, let's start by changing ourselves. Remember that in the end it's just a game. Losing or winning won't change anything in our lives, so why get upset? Ultimately, it's about the pleasure we get from our hobby.
