As virtual reality gears up to provide the new social meeting space, will it follow the same trend as social media and become a haven for harassment and bullying? We hope that the uglier side of online interaction gets left behind on the 2D screen. But it is clear that sexual harassment, at least, has already made the leap to this new, visceral format. Will this be the new normal, or will game developers and education tackle the problem? Will parents have another medium on which to fear for their child’s safety, or will their teenagers be part of the progressive generation we’ve been waiting for?
Bullying is not limited to the school playground. Even if we are lucky enough not to have been direct victims of it at any point, we have all heard the stories and are aware of the sometimes deadly consequences. If you thought it was bad when you were at school, you will doubtless be horrified by the idea of those taunts following you into your own private spaces via your phone. Social media is a place where real-life friendships continue in digital form. I wrote last fortnight about how it can enhance feelings of connection, and argued that virtual reality will complement this way of socialising.
However, you would be naive to believe that any online social space is a rosy-hued, harmonious little village. Harassment is directed at users in the comfort of their own homes by others who employ disturbing methods to intimidate and humiliate. Picture that village again as somewhere stiflingly close and introverted, a place with no escape.
Not only are young people exposed to bullies who know how to prey on fears and insecurities, but all users may face harassment at one time or another, either for who they are or for the views they hold. Online, users are protected by anonymity. Unfortunately, this seems mostly to benefit mean-spirited bullies who use anonymity as a way to say what they want without consequence. There are consequences, of course, but it is mostly the victim who feels them. Victims of sexual harassment or racist attacks online know all too acutely that the words are not harmless. And yet cyberbullying and harassment are often overlooked and downplayed, with victims painted as hypersensitive. Perpetrators are rarely punished except on well-moderated groups and forums.
It is apparent that this nasty behaviour has already started rearing its ugly head in virtual reality. Users have confessed to feeling uncomfortable about displaying their real-life identity for fear of harassment. Patrick Harris of Minority Media, speaking at Game Developers Conference 2016, told us that harassment in virtual reality is “way, way, way worse” than online. That is because the harassment is, for all intents and purposes, physical. Sure, that person isn’t really in the room with you. But if you’ve felt how viscerally phenomena in VR affect your body, you will understand that the threat can no longer be dismissed as just “words”.
When Patrick Harris conducted a rather horrible experiment to see how easy it was to harass a female user on a multiplayer prototype he created, he was able to stalk her and get right up in her face. He blocked her path and made obscene gestures. She was unable to play, and later called the experience “damaging”.
Let us not be coy about the fact that abusers target some demographics more than others. Whether or not the views expressed in gaming communities and online reflect real-life feelings, it is a clear trend that female players receive abuse of a sexual nature more than men do. However, as virtual reality becomes increasingly available to mainstream audiences, we must face the fact that all sorts of bullying and abuse will inevitably come with it. That is, if we allow it. But what if we don’t? What if virtual reality isn’t like the rest of the internet? What if we can create the safe space that forums and multiplayer games should be, but sometimes fail to be?
A good moderator, or team of mods, is the golden ticket to a harmonious online community. They are the police of a group and they tend to understand the best interests of their forum. Not every social space has mods, but the ones that do have a massive advantage.
But can you effectively moderate a virtual space as well as you can an online space? Well, at the moment, yes, you should be able to. With apps like Altspace being fairly sparsely populated, a single room might contain around 15–20 people at most. A moderator can effectively see the behaviour of everyone in the room. It is also likely that if an obvious act of abuse occurred, other users would intervene to stop it. But that is viewing social VR in its current state. When these apps become densely populated, and entire friendship groups migrate to the format as another medium to share gossip, photos and whatnot, will it be easy to police? Can a moderator pick up on the subtleties that accompany psychological bullying, as opposed to out-and-out abuse? Furthermore, will users like being policed?
Some users on Reddit have objected to the idea of further moderation and protective measures on social virtual reality apps for fear of suppressing organic interaction and imposing a sort of totalitarianism that they have previously enjoyed freedom from. But perhaps the answer isn’t just in moderation but in positive actions that users can take to protect themselves.
Reddit user NikoKunRift asks: “I wonder if social/multiplayer VR games might implement an optional “personal space” feature, that fades other players to invisible, from your perspective/client, if they get uncomfortably close to you.” Altspace already employs a similar idea, letting users block abusive avatars. Some believe this measure isn’t effective if the offending user can still see you while you can’t see them. Others wish for a toggle that makes their own avatar invisible to certain users, so harassers won’t know they are there. But not everyone likes how this would affect the immersive quality of the experience.
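To make the “personal space” idea concrete, here is a minimal sketch of how such a feature might work under the hood: each frame, the client checks the distance to every other avatar and fades any that come closer than a comfort radius. The function name, radii and linear fade are all illustrative assumptions, not taken from any real VR SDK.

```python
import math

COMFORT_RADIUS = 1.5   # metres: avatars closer than this are fully faded out
FADE_START = 3.0       # metres: avatars at or beyond this distance are fully visible

def avatar_opacity(my_pos, other_pos):
    """Return an opacity from 0.0 (invisible) to 1.0 (opaque) based on proximity.

    Positions are (x, y, z) tuples in metres. The fade only applies on the
    local client, so the other user's view is unaffected.
    """
    dx, dy, dz = (a - b for a, b in zip(other_pos, my_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    if distance >= FADE_START:
        return 1.0
    if distance <= COMFORT_RADIUS:
        return 0.0
    # Linear fade between the comfort radius and the fade-start distance.
    return (distance - COMFORT_RADIUS) / (FADE_START - COMFORT_RADIUS)
```

A gradual fade like this is arguably less jarring than an abrupt disappearance, which matters in a medium where users are sensitive to breaks in immersion.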
Other suggestions include user rating systems, where abusive users can be downvoted or lose points, with the score visible to all. This way, an abuser is preceded by their reputation. It makes them the loser. It is not hard to imagine the misuse of downvoting, though: it could itself be used to bully innocent users by making them unpopular.
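As a toy illustration of such a rating system, and of one simple guard against the pile-on misuse described above, the sketch below counts at most one downvote per voter per target and ignores self-votes. The class and method names are hypothetical, not any real platform’s API.

```python
from collections import defaultdict

class Reputation:
    """A minimal publicly visible downvote ledger for a social VR space."""

    def __init__(self):
        self.scores = defaultdict(int)  # target -> visible score
        self.voted = set()              # (voter, target) pairs already counted

    def downvote(self, voter, target):
        """Record a downvote; return False if the vote is rejected."""
        if voter == target or (voter, target) in self.voted:
            return False  # ignore self-votes and repeat votes from one user
        self.voted.add((voter, target))
        self.scores[target] -= 1
        return True

    def score(self, user):
        return self.scores[user]
```

One-vote-per-voter limits only the crudest abuse; a coordinated group could still sink an innocent user’s score, which is exactly the misuse the paragraph above warns about.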
If virtual reality becomes a microcosm of society, as social media has, then issues of abuse require a deeper-reaching approach than merely retrospective action. It is not enough to block an offending user or to protect victims case by case. A wholesale approach to reducing bullying and harassment must come in the form of education.
SchoolLife, a creation of GiantOtter, is an anti-bullying virtual reality game. Its co-founder, Geoff Marietta, says that experiencing other students’ points of view will help foster better relationships between pupils and reduce bullying. Students can role-play as bullies and victims, so that they understand the acute distress bullying causes and the complexities of those who engage in it. Virtual reality provides a fantastic medium for experiencing someone else’s point of view, as you are thrust into their shoes. We are yet to see the long-term effect of such apps, but it is generally understood that empathy is a valuable quality to instil in young people. At the very least, it can make people think twice about an off-hand comment.
In the future, I’d like to see such apps created for all kinds of scenarios. Teenagers could be introduced to them as part of their school education to teach them about responsible social interaction. If young people learn how damaging it is to be targeted anonymously, they might become the bright sparks that virtual reality deserves. They might be the ones to turn the medium into a fantastic, rich and inviting community. Harassers, bullies and abusers might become part of the old generation: generation internet. All the while, virtual reality will bound in giant strides towards progress and harmony!