The Age Appropriate Design Code: Strong on principles but will it trigger change?

On 2 September 2021, following a 12-month transition period, the Age Appropriate Design Code (the Code) became enforceable in the UK.  The application of the Code to “organisations providing online services likely to be accessed by children in the UK” (emphasis added) means that it has potentially far-reaching implications for many organisations, not least social media platforms.  The importance of the Code to these organisations is clear: if you aren't fulfilling the requirements of age-appropriate privacy by design, you aren't complying with the UK General Data Protection Regulation (UK GDPR). 

At the launch of the Code in September 2020, the Information Commissioner Elizabeth Denham CBE said that “developers and those in the digital sector must act” and warned that “companies that do not make the required changes risk regulatory action”.  Fast-forward 12 months and some ‘Big Tech’ companies, such as Google (and YouTube), Facebook, Instagram and TikTok, have made changes to the design of their services, ostensibly to give children greater privacy protection.  In parallel, on 19 August 2021, the Information Commissioner’s Office (ICO) approved the first UK GDPR certification criteria focused on children’s online privacy, known as the Age Check Certification Scheme (ACCS). 

While measures which improve the protection of children online are to be welcomed, this area remains vulnerable to underenforcement for two reasons: first, certification under the ACCS is voluntary; and, second, the ICO does not have the capacity to audit and monitor the ongoing compliance of every organisation that falls within the scope of the Code.  

Background

The Code is a set of 15 flexible standards which require, among other things, that:

  • settings on online services must be “high privacy” by default, unless there is a compelling reason for a different default setting that is in the best interests of the child
  • only the minimum amount of personal data should be collected and retained
  • children’s data should not usually be shared
  • geolocation services should be switched off by default
  • nudge techniques should not be used to encourage children to provide unnecessary personal data or to weaken or turn off their privacy settings.

Key principles

Four of the Code’s key principles are considered in more detail below.

Age appropriate application

This principle requires organisations to adapt the design of their online services to the age range of the children accessing them and their different stages of development.  The level of data protection offered must also be tailored to these ranges and stages.  In practice, therefore, compliance with the Code requires either adaptation for the specific audience of the online service in question or the uniform application of the highest standards to all users, regardless of age. 

Data protection impact assessments (DPIAs) should be conducted to assess the risks posed to relevant age groups.  Organisations can use various methods to verify the ages of children accessing online services, from self-declaration, provision of documents and parental confirmation to more automated solutions such as AI or third-party age verification providers.  In practice, such solutions may not be practical or may themselves require intrusive data collection.  The ICO is currently considering how organisations can tackle age assurance, whether by verifying or estimating age, and will formally set out its position in the autumn.

Transparency

This principle builds upon Article 5 UK GDPR, which provides that personal data should be processed “lawfully, fairly and in a transparent manner […]” and Article 12 UK GDPR, which requires data controllers (the online services) to provide information about data processing to children in a manner they can comprehend. 

In practice, the principle of transparency requires online services to provide clear information about what they do with children’s personal data in a prominent place, and at the point at which personal data starts to be processed.  Privacy information should be presented in language children can comprehend, and child users should be prompted to speak to an adult before they activate any new use of their data.  More general information, such as full privacy policies, terms of use and community guidelines, should be easily accessible, should users (and their parents) wish to review it in detail.  Pop-ups of child-friendly ‘bite-size’ explanations are encouraged, even if displayed alongside more ‘legalistic’ explanations.  The Code provides more detail as to the level of privacy information that ought to be provided to different age groups.  It should be borne in mind that, where the legal basis for processing is consent, parental consent is required under Article 8 UK GDPR in respect of processing the data of children aged under 13.  Privacy information geared towards older children (especially those aged 13 and above) should empower them to make their own informed choices as appropriate.   

Default settings

Privacy settings are key to ensuring that users of online services have full control over the collection and processing of their data.  The Code specifies that privacy settings may be designed and used by organisations to enable users to toggle the extent of data processing for purposes other than the core service, i.e. to “‘improve’, ‘enhance’ or ‘personalise’ their online experience”.  However, the ICO warns that default settings can work to users’ detriment where they permit the collection of more data than users would allow if making an active choice. 
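For organisations implementing such settings, the following is a minimal sketch, in TypeScript, of what “high privacy” defaults could look like in practice.  It is illustrative only: the settings model, names and logic are our own assumptions, and the Code does not prescribe any particular implementation.

  // Illustrative only: a hypothetical settings model showing "high privacy
  // by default". Non-essential processing is opt-in, never pre-enabled.
  interface PrivacySettings {
    personalisation: boolean;                // "improve/enhance/personalise" processing
    geolocation: boolean;                    // location sharing, off by default
    profileVisibility: "private" | "public";
  }

  // Defaults reflect the Code's standards: everything off or private until
  // the user makes an active, informed choice to change it.
  const DEFAULT_CHILD_SETTINGS: PrivacySettings = {
    personalisation: false,
    geolocation: false,
    profileVisibility: "private",
  };

  // A setting may only change through an explicit user action, not a
  // pre-ticked box or a nudge; the default is never silently widened.
  function updateSetting<K extends keyof PrivacySettings>(
    settings: PrivacySettings,
    key: K,
    value: PrivacySettings[K],
    activeUserChoice: boolean
  ): PrivacySettings {
    if (!activeUserChoice) {
      throw new Error("Settings may only change via an active user choice");
    }
    const updated = { ...settings };
    updated[key] = value;
    return updated;
  }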

Data minimisation

This principle, echoing Article 5(1)(c) UK GDPR, requires that only the minimum amount of personal data necessary to deliver a service may be collected.  Together with Articles 5(1)(b) (purpose limitation) and 5(1)(e) (storage limitation) UK GDPR, which, respectively, require the purpose of the data collection to be “specified, explicit and legitimate” and prevent the storage of data beyond the time necessary for its processing, the data minimisation principle prohibits the collection of more data than is necessary “to provide the elements of a service the child actually wants to use”.

In practice, this means online services must determine what personal data needs to be collected for each element of the services they offer and give children the choice to use only individual elements of the full range of services offered.  The collection of children’s personal data to improve, enhance or personalise their online experience should not be bundled with the collection of personal data required to provide the core service and data should be collected only when a child user is “actively and knowingly” using a specific element of the online service.
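As a purely illustrative sketch of this unbundling, a service could map the minimum data fields required by each element of its offering and collect only the fields needed by the elements a child is actively using.  The element names and data fields below are hypothetical and not drawn from the Code.

  // Illustrative only: each element of the service declares the minimum
  // fields it needs; core-service and enhancement data are never bundled.
  type DataField = "displayName" | "approximateLocation" | "viewingHistory";

  const elementDataNeeds: Record<string, DataField[]> = {
    coreMessaging: ["displayName"],              // minimum for the core service
    localWeatherWidget: ["approximateLocation"], // only while actively used
    recommendations: ["viewingHistory"],         // optional enhancement, opt-in
  };

  // Collect only the fields needed for the elements the child is actively
  // and knowingly using at this moment.
  function fieldsToCollect(activeElements: string[]): Set<DataField> {
    const fields = new Set<DataField>();
    for (const el of activeElements) {
      for (const f of elementDataNeeds[el] ?? []) {
        fields.add(f);
      }
    }
    return fields;
  }

  // A child using only core messaging triggers collection of displayName
  // alone; viewingHistory is never collected without opting in.
  console.log(fieldsToCollect(["coreMessaging"])); // Set { 'displayName' }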

Recent changes made by major online services, ICO enforcement and compliance with the Code

As the Code makes clear, its focus is to set standards for the protection of children’s personal data.  Different online services will require different solutions. 

Google (and YouTube), Facebook, Instagram and TikTok have recently made adjustments to the design of their services.  In terms of enforcement of the Code, the ICO has been clear that data relating to children (which is afforded special protection under the UK GDPR) is a regulatory priority.  The ICO’s powers to enforce the Code will be used in tandem with its powers to sanction failure to comply with the UK GDPR and the Privacy and Electronic Communications Regulations (“PECR”, which derive from European Directive 2002/58/EC, also known as the e-privacy Directive), including powers to issue warnings, reprimands, stop-now orders and fines.  The ICO’s general powers in respect of “serious” breaches of the UK GDPR include fines of up to £17.5 million or 4% of annual worldwide turnover, whichever is higher.

Comment

The Code undoubtedly represents progress in protecting children’s data online.  However, only time and enforcement action will reveal whether the Code is an effective tool.  The ICO’s enforcement action to date has focused on breaches such as unsolicited direct marketing via telephone or email and, save for the fine imposed on Facebook in 2018 (following the Cambridge Analytica scandal), Big Tech companies have not yet been subject to fines or enforcement action by the ICO (although investigations are undoubtedly ongoing).  This is surprising given the length of time the UK GDPR has been in force and the regulatory action taken in Europe, such as in Italy and Ireland, and elsewhere globally.

The Code is “not a new law” but, rather, sets standards and gives greater clarity on the interpretation of the UK GDPR in so far as it applies to children’s data.  One thing is clear: where the ICO identifies harm or potential harm to children, it is likely to take more rigorous action against the relevant organisation than would be the case for other types of personal data. 

Unfortunately for UK data subjects, the focus of the ICO to date seems to have been on unsolicited calls, emails and hacking, not on the compliance of online services with the wider objectives of the UK GDPR.  Given the ICO’s limited resources (it is flying solo on enforcement of the UK GDPR post-Brexit), the burden of auditing and monitoring the ongoing compliance of every organisation that falls within the scope of the Code makes significant underenforcement in this area likely. 

The authors are grateful for the contribution of former intern Adel Msolly to this article.