Meta has rolled out a new feature allowing preteens, with accounts managed by their parents, to dive into its virtual reality world, Horizon Worlds, albeit with a few restrictions. The move is designed to give younger users a safer way to explore the platform.
Soon, parents will have the power to select age-appropriate virtual experiences for their preteens, from relaxing in The Space Station to exploring The Aquarium or playing the Spy School racing game. If a preteen has a particular world in mind, they can request access, or parents can choose from a pre-approved list curated to keep their children safe online.
In a bid to step up safety measures, Meta has introduced a rating system that classifies VR worlds into three categories: 10+, 13+, or 18+. The system lets parents approve all 10+ rated worlds in one go while ensuring 18+ worlds remain off-limits to younger users. The platform also hides follower suggestions from preteens, and by default their status appears as “offline” to others unless parents decide otherwise.
A notable feature is the always-active “Personal Boundary” setting. This ensures that avatars maintain a bubble with a two-foot radius, essentially keeping virtual strangers at bay and respecting personal space.
Meta’s recent initiatives also allow parents to personally approve who their children can chat with or invite into virtual spaces. Plus, there’s a prompt for users of Meta Quest 2 or 3 headsets to re-confirm their birthdate before delving into the VR realm.
Parent-managed accounts have been available since June 2023, and Meta’s efforts to create a safer environment are apparent. Still, skepticism lingers among parents who question how effectively Meta can safeguard their kids, fueled by past controversies surrounding the company’s handling of younger users on its platforms.
Earlier this year, internal documents brought to light in a lawsuit by the New Mexico Department of Justice exposed Meta for allegedly targeting underage users on its messaging platforms, despite knowing about the potential risks of adult interactions. Furthermore, another lawsuit from 42 U.S. state attorneys general claims Meta deliberately designs its products to captivate children, which could, in turn, harm their mental health.
With these steps, Meta seeks to balance innovation and safety, but earning parents’ full trust may be a longer journey.