
    The dark side of the metaverse


    When a new technology surprises us enough, we have a habit of describing it as “like something out of science fiction”. What we tend to forget when we reach for that phrase is that most science fiction tends toward the dark.

    Earlier this month, it was revealed that British police are investigating the first case of sexual assault related to conduct in the metaverse, the much-touted, virtual-reality-driven next generation of social media.

    An underage teenage girl was reportedly “gang raped” in the online world, although the names of the metaverse apps where the assaults allegedly took place were not disclosed. This may be the first such case British authorities have investigated, but it is not the first incident of its kind, and it will unfortunately not be the last.

    It is therefore worth thinking through what actually happened, much of which has been left vague in reports of the incident. The first important thing to stress is that sexual violence in the metaverse does not inflict direct physical harm: a virtual reality headset lets you see a virtual world and hear the real voices of other users, but it does not transmit the physical sensations that would accompany them.

    Attacks in the metaverse are therefore best understood as a heightened form of online abuse and harassment, and we should not underestimate how serious that is. When you are wearing a VR headset, your entire field of vision is taken over and, just as importantly, you can hear the real voices of the people attacking you.

    For those who have come to see online communities as safe spaces or as integral parts of their lives, having them made unsafe is as great a loss as losing access to a real-world community. And far less invasive forms of online abuse have been shown to cause severe trauma to their targets. This is not a trivial issue.

    SumOfUs, a nonprofit corporate responsibility advocacy organization, investigated assault in the metaverse in a report published in 2022, focusing on Horizon Worlds, the largest single metaverse community, run by Meta (the company formerly known as Facebook).

    Despite being the largest established metaverse community, Horizon Worlds is something of a flop. At its peak it boasted 300,000 monthly users, and the last time figures were reported they were falling rather than rising. Tech elites may have hailed immersive metaverse communities as the future, but the general public clearly feels otherwise.

    Even so, it appears to be a dangerous place for anyone who joins such communities with a female avatar. When SumOfUs sent in researchers, some quickly reported finding themselves in sexually aggressive situations; one reported that several men, swigging from virtual vodka bottles, performed simulated group sex on her avatar – all of it non-consensual.

    A video accompanied the report. Watched on mute, it looks more surreal than disturbing: the camera shakes and a floating vodka bottle obscures much of the view. Characters in Horizon Worlds are cartoonish floating torsos with no removable clothing, so all that actually happens on screen is some odd cartoon bouncing. The difference comes when you hear the real male voices that accompany the nonsensical action. The men are clearly enjoying the idea, and it is easy to imagine how anyone on the receiving end of such an attack could come to believe they would want to do the same in the real world.

    This puts us in a strange and uncomfortable position. We have not solved any of the problems of existing social media, where people interact through the relatively safe medium of a screen, with tools that make it easy to block others. The abuse, misinformation and targeted harassment on most of these platforms is getting worse, not better.

    With that in mind, launching new versions of these networks in which such attacks feel more real and invasive seems both foolish and reckless. For communities like this to grow, they need policing and agreed rules of the road. As it stands, most of us are voting with our wallets and giving the metaverse a miss.

    Meta points out that Horizon Worlds, like most metaverse applications, is strictly aimed at adults, so in theory an underage teenager clearly should not have been participating in it in the first place.


    On a practical level, what we can do now is pay attention to the people who have spent the most time thinking about this technology and about what actually works to protect children and vulnerable internet users.

    One veteran technology editor, with more than 30 years on the job and a parent himself, often expressed surprise that most parents let their children (and even teenagers) use the internet unsupervised. A rule in his house was that any internet-connected screen stayed in the family living room and never went upstairs.

    His view is that allowing children to browse completely unsupervised is like letting them wander unaccompanied around a strange town. It is a convincing argument.

    It is not clear whether the UK's current laws on sexual offenses are up to the task of dealing with the kind of assault the authorities are now investigating.

    It is tempting to call for new legislation, but do we actually need it? It would be worth convening some kind of expert taskforce to consider these questions and make recommendations: can new guidance be issued on how existing law applies to these situations, or is fresh legislation essential?

    The UK's criminal justice system is dysfunctional, but it is not broken for lack of new legislation; if anything, prosecutors already have plenty of options to choose from. It is better to think carefully before acting, and to choose measures that work over gimmicks.

    More importantly, it is worth remembering where online harm most often occurs. Fewer than 500,000 people worldwide use metaverse apps each month, while around 5 billion use some form of social media. Protecting the most people means paying attention to where the users actually are, whether that is Facebook, Instagram, X or TikTok. The metaverse, for the most part, can wait.
