Campaigners contend that the absence of the settings violates the children's safety code, prompting the Information Commissioner's Office (ICO) to summon the technology company for questioning over the omission.
The virtual reality headset proved a popular gift this Christmas, with Meta making inroads in its efforts to bring the technology to the mainstream. But concerns persist over the risk of children being exposed to harmful content, as there are no mechanisms in place to block unsuitable material for under-18s.
The Center for Countering Digital Hate is among those raising awareness of the issue, pointing to in-house research that unearthed instances of abuse conducted via the VRChat social app.
In a statement, an ICO spokesperson said: “Online services and products that use personal data and are likely to be accessed by children are required to comply with the standards of our children’s code.
“We are planning further discussions with Meta on its children’s privacy and data protection by design approaches to Oculus products and virtual reality services. Parents and children who have concerns about how their data is being handled can complain to us at the ICO.”
Meta's grilling will see it quizzed over its apparent failure to comply with the ICO's age-appropriate design code for connected devices, which requires services likely to be accessed by children to treat their wellbeing as paramount.
Specific requirements include provisions to verify that users are aged 13 or over, rather than simply asking users to self-declare their age by ticking a box, as is currently the case.
If Meta is found to be in contravention of the code, sanctions range from a slap on the wrist to a fine of up to £17.5m or 4% of global turnover - in Meta's case equivalent to £2.5bn.
Regulators' attention has previously focused on content rather than devices, with a code of conduct drawn up to protect vulnerable users, especially children, from illegal material.