Wednesday, 18 May 2022

Why the Metaverse Has to Be Its Own Reality

NAB

No one wants to live in a virtual police state, but there’s a growing sense that behavior in the metaverse needs stricter legal enforcement along with positive “prosocial” modification.

That’s because the toxic behavior that has always existed online could actually get worse in the 3D internet as experiences cleave closer to reality.

A recent documentary investigation by UK broadcaster Channel 4 revealed a metaverse rife with hate speech, sexual harassment, pedophilia, and avatars simulating sex in spaces accessible to children.

Last December, psychotherapist Nina Jane Patel reported that her avatar had been virtually gang-raped on Facebook’s Horizon Venues.

Meta promptly responded by introducing a personal boundary feature that avatars can trigger to keep others at arm’s length, like a forcefield.
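
Conceptually, a feature like this is just a minimum-distance constraint enforced by the platform’s movement code: if one avatar tries to move inside another’s boundary, the system pushes it back to the edge. The Python sketch below is a minimal illustration under that assumption; the radius, class names, and clamping logic are hypothetical and are not Meta’s actual implementation.

```python
from dataclasses import dataclass
import math

# Hypothetical values and names for illustration only; this is not Meta's code.
BOUNDARY_RADIUS = 1.2  # assumed radius in metres


@dataclass
class Avatar:
    x: float
    y: float
    z: float
    boundary_enabled: bool = True


def clamp_approach(mover: Avatar, target: Avatar) -> Avatar:
    """Stop `mover` at the edge of `target`'s personal boundary, if one is active."""
    if not target.boundary_enabled:
        return mover
    dx, dy, dz = mover.x - target.x, mover.y - target.y, mover.z - target.z
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist == 0.0 or dist >= BOUNDARY_RADIUS:
        return mover  # already outside the boundary, or exactly coincident
    scale = BOUNDARY_RADIUS / dist
    # Push the mover back out to the boundary's edge along the same direction.
    return Avatar(
        target.x + dx * scale,
        target.y + dy * scale,
        target.z + dz * scale,
        mover.boundary_enabled,
    )
```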

But more profound action needs to be taken, and not just surface-level attempts designed to keep the regulators at bay.

“If something is possible to do, someone will do it,” computing and information researcher Lucy Sparrow tells The Guardian. “People can really be quite creative in the way that they use, or abuse, technology.”

As the virtual inexorably blurs with the real, abuse such as that described by Patel can trigger a deeply rooted panic response. “The fidelity is such that it felt very real,” said Patel, who is also co-founder of children’s metaverse company Kabuni. “Physiologically, I responded in that fight or flight or freeze mode.”

According to David J. Chalmers, the author of Reality+: Virtual Worlds and the Problems of Philosophy, “bodily harassment” directed against an avatar is generally experienced as more traumatic than verbal harassment on traditional social media platforms. “That embodied version of social reality makes it much more on a par with physical reality,” he says.

Stepping from a social media platform such as Facebook into the metaverse means a shift from moderating content to moderating behavior. So how should the regulatory environment evolve to deal with this?

One proposed solution is to use AI to tackle problems at scale as soon as they arise. “But AI still isn’t clever enough to intercept real-time audio streams and determine, with accuracy, whether someone is being offensive,” Andy Phippen, professor of digital rights at Bournemouth University, argues in the London School of Economics blog. “While there might be some scope for human moderation, monitoring of all real-time online spaces would be impossibly resource-intensive.”
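
Phippen’s point about scale becomes clearer if you sketch what such a system would have to do: transcribe and classify every live voice channel, continuously, for every room. The Python snippet below is purely illustrative; transcribe_chunk and toxicity_score are placeholders for whatever speech-to-text and classifier models a platform might plug in, and the threshold is an assumed value.

```python
from typing import Callable, Iterable, List

TOXICITY_THRESHOLD = 0.9  # assumed cut-off; a real system would tune and audit this


def moderate_voice_channel(
    audio_chunks: Iterable[bytes],
    transcribe_chunk: Callable[[bytes], str],
    toxicity_score: Callable[[str], float],
) -> List[str]:
    """Flag transcript segments whose estimated toxicity exceeds the threshold.

    Every live channel needs its own instance of this loop, which is why
    blanket real-time moderation is so resource-intensive.
    """
    flagged = []
    for chunk in audio_chunks:
        text = transcribe_chunk(chunk)  # speech-to-text on a short audio segment
        if toxicity_score(text) > TOXICITY_THRESHOLD:
            flagged.append(text)  # queue for human review rather than auto-punish
    return flagged
```

Even this toy version assumes accurate transcription and a reliable classifier, which, as Phippen notes, is precisely what current systems lack.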

Legal experts believe that if the metaverse becomes as important as tech CEOs say it will, we could increasingly see real-world legal frameworks applied to these spaces.

However, there are those who hope that the metaverse might offer an opportunity to move beyond the current top-down reactive enforcement model of online moderation.

Reddit, for example, relies partly on community moderators to police discussion groups. The Guardian reports that Disney-owned Club Penguin, a multiplayer children’s game, pioneered a gamified network of “secret agent” informants, who kept a watchful eye on other players.

A 2019 paper, “Harassment in Social Virtual Reality: Challenges for Platform Governance,” by researchers working with Facebook-owned Oculus VR, indicates that the company is exploring community-driven moderation initiatives in its VR applications as a means of countering the problems of top-down governance.

Government legislation such as the EU’s newly agreed Digital Services Act (which imposes harsh penalties on social media companies that fail to promptly remove illegal content) and the UK’s online harms bill could play a role in the development of safety standards in the metaverse.

But there are still unresolved legal questions about how to govern virtual bodies that go beyond the scope of the current web — such as how rules around national jurisdiction apply to a virtual world, and whether an avatar might one day gain the legal status necessary for it to be sued. The highly speculative nature of the space right now means these questions are far from being answered.

“In the near term, I suspect the laws of the metaverse are by and large going to derive from the laws of physical countries,” says Chalmers. But in the long term, “it’s possible that virtual worlds are going to become more like autonomous societies in their own right, with their own principles.”

 

