Children can access sexual material and experience online grooming in the metaverse, according to an investigation by BBC News.
Using the metaverse app VRChat, a researcher was able to visit virtual strip clubs where avatars simulated sex. She was also shown sex toys and condoms and was approached by numerous adult men.
The app has a minimum age requirement of 13 and offers many innocent virtual settings in which users can meet, such as a McDonald’s restaurant. However, it also hosts numerous adult arenas that users of any age can access, including pole-dancing and strip clubs. The researcher set up her account with a fake profile, and her real identity was never checked; the only requirement was a Facebook account.
In addition to accessing sexual content and being subjected to attempted grooming, the researcher said she witnessed racist insults and a rape threat on the virtual platform, which users explore with 3D avatars.
She also cited comments from other users regarding the sexual content on the app, one of whom told her that avatars “can get naked and do unspeakable things.”
The findings add to concerns about privacy and security issues in the metaverse, an open-ended collection of digital experiences, environments and assets accessed largely through virtual reality headsets. This virtual world is expected to grow substantially in the coming months and years and to be increasingly woven into everyday life, from work to cinema trips. The technology is set to be led by big tech companies such as Facebook owner Meta, Microsoft and Google.
However, security experts have expressed concerns about the metaverse’s safeguarding practices, including how users’ real-world identities are verified.
Commenting on the story, Alexey Khitrov, CEO of ID R&D, said: “Fake news, malicious actors, adult content. There are many risks posed to kids by the metaverse or any environment where people hide behind avatars. The key to preventing children from accessing content that could be harmful to them is having age verification processes that work.
“To weed out fake identities from the real people, metaverse providers will need strong identity verification, both in the sign-up process and continuously as the platform is used, to avoid situations where a child might access an adult’s account. This can be achieved through deploying facial recognition technologies.”