A new report from the UK’s Anti-Bullying Alliance shows just how bad online bullying has become. It finds that 76% of teens aged 11-16 think social media platforms need to do more to combat toxicity, and that one in five have avoided online gaming and social media because of bullying.
The report is a damning indictment of what online gaming has become. Since its inception, online gaming has been rife with racist and homophobic slurs, and the companies running these online spaces are not doing enough to combat them.
Bullying online, at least in gaming, started from a very simple place. Some of the most popular early online titles were action-based, and Doom’s famous deathmatches often left players palpably frustrated. That frustration frequently spilled over into online ‘smack-talk,’ though at the time it was mostly confined to forums, since Doom didn’t originally ship with any real chat features.
When voice chat arrived around the year 2000, things only got worse. With the ability to react to defeats in real time, with no ‘cooling-off’ period, it became all too easy to say horrible things in the heat of the moment. Since then, being horrible to other players has become what people expect from online gaming, to the point where the toxicity is now a running joke.
These days, any online game with a large number of active voice chatters is bound to have someone using a racial slur within minutes. Worse, things have only escalated as social media has made it easier to connect with people after playing, and young people are often the hardest hit of all.
The first thing platforms such as Xbox and PlayStation need to do is more self-reporting. Collecting data on the kinds of abuse happening on their platforms is the first step toward combating it; it’s hard to realistically fight a problem we don’t really understand in the first place.
Beyond that, harsher punishments for cyber-bullying need to be in place, and reporting of bullying in all its forms should be heavily encouraged. When a report of bullying can be verified, a temporary ban should be the first step, not the end result. Harsher punishments would be a strong deterrent, and honestly, permanent bans should not be out of the question.
Platforms wouldn’t even need to stop banned players from creating new accounts. So much value is tied up in an online gaming account that losing the ability to play online with it would be enough to strongly deter many people from sending a violent or bullying message on Xbox Live or PSN.
Apologists might point out that companies such as Microsoft are already taking steps against bullying and harassment. And while the community standards for platforms such as Xbox are fairly clear about what is and isn’t acceptable, those standards are nowhere near strictly enough enforced.
It is these companies’ responsibility to ensure that people using their services are protected from online harassment. If platforms talk a tough game but fail to enforce their own rules, it sends young people a strong message that bullying online is okay. It’s up to Xbox and PlayStation to make their services a safe and fun place for everyone, and right now that just isn’t the case.
Cyber-bullying and toxicity have become huge problems in online gaming, and platforms aren’t doing enough to stop them.
Last modified: September 23, 2020 1:16 PM