Facebook will block kids from downloading age-inappropriate virtual reality apps

caption: Children play a virtual reality game at a Beijing 2022 Winter Olympics Live Site set up on February 7, 2022, in Beijing, China. (Getty Images)

Facebook's parent company, Meta, plans to roll out parental supervision tools for its virtual reality headset, as concerns mount over whether kids are safe while exploring the so-called "metaverse." The company is also launching new child-safety tools on Instagram.

Meta says its Quest headset is designed for people over the age of 13, but NPR and other outlets have reported that younger children appear to be using VR apps, including ones meant for adults, such as Meta's own Horizon Worlds. That's raised concerns that kids could become targets for predators and be exposed to inappropriate content in the apps.

Starting in April, Meta says, parents will be able to lock their kids out of apps they think are not age-appropriate. In May, the company will automatically block teenage users from downloading apps that the International Age Rating Coalition has rated as too mature for their age.

Parents will also have access to a dashboard where they can see what VR apps their child has downloaded, get alerts when they make purchases, track how much time their child is spending using the headset, and view their child's list of friends.

Meta is also creating a new "Family Center" hub that brings together parental supervision tools and educational resources across all of its apps, including Instagram. It will include video tutorials on using the new tools and suggestions for how parents can talk to their kids about using the internet.

Long-awaited Instagram features launch Wednesday

Instagram's first set of parental oversight tools launches in the U.S. on Wednesday and will expand globally over the next few months.

Originally announced in December, the tools will let parents see how much time their children spend on the photo-sharing app, set time limits, get notified if their child reports problems like bullying or harassment, and see what accounts their child follows and who follows them. But to use the new tools, both parents and their kids must opt in.

Vaishnavi J, Meta's head of youth and wellbeing, described the new features for both Instagram and virtual reality as "just one step in a much bigger, broader journey around safer experiences and meaningful conversations amongst families."

Child safety has become a major flashpoint for Meta over the past year. Leaked documents revealed plans to build a version of Instagram for kids under 13. Then, internal research disclosed by Facebook whistleblower Frances Haugen showed the company knew Instagram can be harmful to teenage girls. In September, Instagram said it was pausing work on the app for younger kids, but not halting it altogether.

Lawmakers have rallied around the issue, a rare area of bipartisan agreement. Senators Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn., last month introduced the Kids Online Safety Act, which would require apps to create stricter safety measures for users under 16, build parental supervision tools and protect the privacy of young users.

As Meta tries to respond to criticism that it has been too lax on child and teen safety, the company also faces stiff competition from other apps, like TikTok, that are more popular with younger people than its own properties.

It's also betting that the virtual reality metaverse will be the next big platform for playing games, communicating and buying things, and that it will help offset stalled growth at its original social network.

Editor's note: Meta pays NPR to license NPR content. [Copyright 2022 NPR]
