Facebook and Twitter are to be made safer for younger users after the government agreed to legislate for a code of practice for social media sites, setting out a minimum standard for “age-appropriate design”.
The move follows campaigning by peers and children’s charities, including the NSPCC and the Children’s Society, to have safeguards introduced across social media, such as applying the highest possible privacy settings to children’s accounts by default and writing terms and conditions “in a language children will understand”.
According to reports in the Telegraph, the plans for ‘enforced safeguards for children online’ are included in new amendments to the data protection bill, based on a previous proposal from Baroness Kidron, and are intended to head off a possible defeat over legislation ensuring no one under the age of 13 can create a social media profile.
Speaking to the press, the NSPCC said allowing children as young as 13 to sign up for an account means that “everything should be designed for a 13-year-old rather than assuming children can make sound choices like adults”.
Earlier this week, Facebook unveiled Messenger Kids, an app designed for use by pre-teens, effectively acknowledging that its efforts to prevent those under 13 from signing up had failed.
Under the amendments, the data watchdog, the Information Commissioner, has 18 months to “prepare a code of practice” after the data protection bill becomes law, with parents, children, child development experts and trade bodies consulted as part of the process. Non-compliance with the code could result in large fines, with the Information Commissioner able to fine those found breaching data protection regulations up to £17m or 4% of global turnover.
Last month, Google was forced to clarify its stance on child safety following reports that it had benefited from ads appearing on channels which supported child abuse.
A statement issued at the time read: “We take child safety extremely seriously and have clear policies against child endangerment. We recently tightened the enforcement of these policies and tackle content featuring minors where we receive signals that cause concern.”