[imagesource: Unsplash / Jessica Lewis]
A BBC investigation into online virtual reality (VR) apps has found children using the Metaverse to access virtual strip clubs and other adult spaces, where they are exposed to grooming, sex, racism, and threats of violence.
A BBC researcher posing as a 13-year-old girl on these apps witnessed “erotic role-play” in which young people can “get naked and do unspeakable things”.
Besides seeing avatars simulating sex in rooms that resemble the “red-light district in Amsterdam”, she was also able to find sex toys and condoms and was pressured by numerous adult men.
The entrance to the virtual strip club:
It has become apparent that online safety regulations are playing catch-up, with the Metaverse posing “a toxic combination of risks” for underage users, according to Andy Burrows, head of online child safety policy at the National Society for the Prevention of Cruelty to Children (NSPCC).
The NSPCC, which is “shocked and angry” at the findings, says that online safety improvements are a matter of urgency.
The BBC has more from Burrows about the “extraordinary” findings of the investigation:
“It’s children being exposed to entirely inappropriate, really incredibly harmful experiences,” he said.
He believes technology companies have learned little from mistakes made with the first generation of social media.
“This is a product that is dangerous by design, because of oversight and neglect. We are seeing products rolled out without any suggestion that safety has been considered,” he said.
The virtual nature of the experience offers little protection, because, as one safety campaigner explained, VR is so immersive that children effectively have to act out sexual movements themselves.
Here’s a video from the investigation:
Catherine Allen, who is currently writing a report about VR for the Institution of Engineering and Technology, says that she found many experiences “quite traumatic and disturbing”:
She described one incident in a Meta-owned app where she encountered a seven-year-old girl. A group of men surrounded them both and joked about raping them. Ms Allen said she had to step between the men and the child to protect her.
“I shouldn’t have had to do that, but that’s because there’s no moderation, or apparently very little moderation.”
VRChat, the app on which the investigation mainly focused, said it was “working hard to make itself a safe and welcoming place for everyone”, adding that “predatory and toxic behaviour has no place on the platform”.
Meanwhile, the UK’s forthcoming Online Safety Bill makes no mention of VR and the Metaverse – a clear sign that regulators and legislators need to catch up quickly.
[sources:bbc]