Finding the balance between protecting children and letting them explore the digital world

Websites and apps used by young people must be designed with their protection and privacy in mind. That’s why we’re producing the Age Appropriate Design Code, writes Information Commissioner Elizabeth Denham



Whether playing online games, watching and sharing videos or interacting with friends via apps and social media, children today grow up with digital technology as a fundamental part of their daily lives.

This is a good thing. It is in all our interests to create an environment in which kids are engaged, informed, and responsible digital citizens of the future.

But while allowing them access to the many benefits that technology brings, we must protect their privacy and best interests when online. This is a challenging task, but one we have to get right.

When children can master a tablet before they learn to ride a bike, and when kids as young as 13 can sign up for social media accounts, the websites and apps they use must be designed with children’s rights and needs built in.

That’s why we’re producing the Age Appropriate Design Code: to let those who design the online services that children use know what we expect of them.

Introduced by the Data Protection Act 2018, the Code will provide guidance on the privacy standards we expect organisations to adopt when they offer online services and apps that children are likely to access and that will process children’s data.

We welcome this as an exciting opportunity that positions the UK as a world leader in this crucial area. To quote Baroness Kidron, who argued for this important addition to the Act, it represents “a step towards a better digital future for our children.”

The Code will further the concept of data protection by design, which is a key feature of the new Data Protection Act and the General Data Protection Regulation (GDPR).

We will consider privacy settings and notices, and whether the language used is clear and easy to understand for youngsters at different stages of their development. We will assess whether automated profiling of children or the use of geolocation data might be appropriate, and the transparency of marketing, product placement and advertising.

We will look at the strategies sites and apps use to personalise a child’s experience and encourage them to stay online longer, such as autoplay videos and the timing of social media notifications.

We have to make sure the Code is designed with children at its heart. This is why we’ve launched our Call for Evidence asking for views from providers of online services, child development experts, children’s advocacy services, academics and others with an interest in what the Code should say.

Children also have their own voice and the right to be heard as part of our work. We want to get this right, so we will also run a separate, direct consultation with children, parents and guardians, specifically tailored to their needs, to gather the best possible evidence to inform our thinking. We will be seeking specialist, innovative support to help us, and we will publish further information soon.

In the meantime, we have also published the final version of our guidance on children and the GDPR. It explains what the law requires of data controllers that process children’s personal data, and we make no apology for saying that we expect them to take these requirements seriously.

We hope our call for evidence and specialist research will help create a Code which works in the best interests of everyone, from developers and industry to government, parents and – most importantly – children themselves.

Like learning to ride a bike, it is all a matter of finding the right balance between protecting our children and granting them the freedom to explore the digital world.

Elizabeth Denham is the Information Commissioner