A veteran content moderator shares his insights about working in virtual reality and the challenges he faces every day.
Tate Ryan-Mosley talks with Ravi Yekkanti, a content moderator for social VR experiences like Horizon Worlds, in her newsletter The Technocrat. Yekkanti works for an external vendor that monitors virtual worlds for various companies. He says hardly a day goes by on the job without encountering bad behavior.
Moderation in VR: Fundamental Differences from Social Media
Yekkanti has been a content moderator since 2014 and emphasizes that moderating VR content is very different from moderating traditional content. In VR, moderators deal directly with player behavior, which makes the work feel very real, Yekkanti says.
The big difference from “regular” online platforms is that it is not just text and images that need to be moderated. VR moderators primarily evaluate speech and behavior in real time. Moderation in virtual reality is therefore still considered relatively uncharted territory, as demonstrated by an experiment in early 2022 that revealed significant moderation issues at Meta.
Horizon Worlds, a relatively new VR platform, has repeatedly had to deal with harassment. Meta is trying to combat this with safety measures such as restricting personal interactions in Horizon Worlds’ 18+ areas.
Tracking Discrimination and Hate in VR
As a moderator, Yekkanti himself is part of the virtual world and cannot be recognized as ‘official’: VR moderators act incognito. If they were marked in some way, users could consciously adjust their behavior.
His own behavior and appearance can therefore also provoke negative reactions from certain players. His Indian accent alone has made him a target of ridicule, discrimination, and bullying, says Yekkanti.
Some Players Just Want to Do Bad Things
Moderators undergo technical and mental training in preparation for moderation in the Metaverse. They learn how to stay undetected, initiate conversations, and help other players navigate.
At the same time, they are trained to deal with problem behavior. Harassment is common in the metaverse, so well-trained moderators are becoming increasingly important for social VR platform providers. Meta’s XR chief Andrew Bosworth has described toxic behavior as an existential threat to Meta’s metaverse plans.
Yekkanti experiences this behavior almost daily. “Not all players behave the way we would like. Some just want to be mean from time to time. We prepare by thinking through the different types of scenarios we might encounter and how best to handle them,” explains the moderator.
Here’s How Content Moderators Handle Violations
Content moderators collect all relevant information, such as the name of the game, the participants, the length of each session, and the conversation history. In many cases, they also have to judge when players cross the line.
For example, using profanity out of frustration is considered borderline behavior. “There may be children on the platform, so we keep tracking it,” Yekkanti says. If it gets too personal, he has to step in as a moderator.
For serious violations of the code of conduct, he says there are several options, including muting the player or removing them from the game. Such incidents are also reported to the client so that it can take further action.
Content Moderation Saves Lives
Despite the problems, the moderator emphasizes that his work is fun and important. One example is the successful rescue of a kidnapped child who posted a plea for help on the platform.
Yekkanti then alerted a rescue team, and the child was saved. This experience taught him that his work has a real-world impact and contributes to user safety.