The Information Commissioner's Office (ICO) has published a Code of Practice to protect children online. For app designers, and ultimately the various app stores, this means meeting 15 standards to protect children's privacy and keep their data safe online.
Kids are spending an increasing amount of time in front of screens, and the ICO has rightly considered that the responsibility should, at least in part, lie with app creators to have privacy and child protection built into an app's design.
For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play. – Information Commissioner
Here at Compare and Recycle, we have often recommended that parents take a proactive approach in disabling data collection and enabling as many privacy features as possible, but the speed and ease with which children can install apps makes this a daunting task for even the most vigilant of parents. Therefore, it's great to see the ICO taking steps to ensure that children are protected from within the digital world.
There are 15 standards that apps, including social media platforms, games and entertainment platforms, need to adhere to. The ICO has provided detailed information on each, but here's a bitesize summary of the key points:
1. Best Interest of the Child
Taken from the United Nations Convention on the Rights of the Child (UNCRC), the best interest of the child provides a framework to ensure that the best interest of the child is first and foremost:
“In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration.”
It is not difficult to apply this principle as a measure of support for a child's safety in the digital world. A child should have the right to privacy and to freedom from economic exploitation, and to be kept safe, protected and supported; an application should not only avoid the negative, but reinforce the positive aspects of childhood development. The further measures go into detail on how these rights are going to be actively protected.
2. Data Protection Impact Assessments
This is a format for companies to follow so that they can determine internally how much risk an app poses, and it should highlight areas they need to fix. Ideally, this will cut down on the excuses made by companies claiming ignorance.
3. Age-Appropriate Application
This requires that a “risk based approach to recognising the age of individual users” be applied by establishing what ages would be using the application, and that safeguards should be in place that take into account the ages of the expected users.
4. Transparency
No more jargon in terms and conditions; privacy information needs to be concise and written in clear language suited to the age of the child.
5. Detrimental Use of Data
When data has been gathered from usage, this cannot be used in any way that can be “shown to be detrimental to their wellbeing”. This covers behavioural advertising as much as it does advertising age-inappropriate messages or ones that may undermine parental authority or cause self-harm.
This could include design strategies used to extend user engagement, such as reward loops, auto-play features, infinite scrolling and notifications. Many triggers of addictive design should be covered by this; an example would be notifications that push children to maintain daily 'streaks' of use.
6. Policies and Community Standards
If a site relies on or uses chat rooms, communities or any form of user-generated content, it should adhere to clearly outlined content policies. If an online community is stated to be suitable for children, then there need to be mechanisms in place to ensure that it is, such as proven anti-bullying measures to identify and effectively deal with online bullies.
7. Default Settings
This simply requires that all apps suitable for children have the highest possible privacy settings enabled by default. This is helpful, as some apps take advantage of children who are oblivious to managing their privacy settings.
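As a rough illustration of "high privacy by default", the sketch below shows how an app might model it: every setting starts at its most protective value, and nothing is shared until it is explicitly relaxed. The interface and setting names here are hypothetical, not taken from the code itself.

```typescript
// Hypothetical settings model: every option defaults to its most
// protective value, so nothing is shared until explicitly enabled.
interface PrivacySettings {
  profileVisible: boolean;   // can other users see the profile?
  personalisedAds: boolean;  // behavioural advertising
  shareLocation: boolean;    // geolocation visible to others
  allowDataSharing: boolean; // passing data to third parties
}

// "High privacy by default": everything off until the user changes it.
function defaultChildSettings(): PrivacySettings {
  return {
    profileVisible: false,
    personalisedAds: false,
    shareLocation: false,
    allowDataSharing: false,
  };
}

const settings = defaultChildSettings();
console.log(settings.personalisedAds); // false until explicitly enabled
```

The point of the pattern is that a child who never opens the settings screen still gets the most protective configuration.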
8. Data Minimisation
Without complicating matters, this refers to collecting the minimal amount of personal data required to deliver an individual element of the app's service. We have seen Android and iOS start to disable constant streaming of location data, so putting this impetus back on the developer, along with the protection offered by the operating system, should go a long way towards minimising how much information apps and websites can gather.
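One way a developer might apply data minimisation is to have each feature declare the minimum data fields it needs, and collect only that subset. This is an illustrative sketch under assumed feature and field names, not a prescribed implementation:

```typescript
// Hypothetical example: each feature declares the minimum personal
// data it needs, and the app collects only the union of those fields.
type Field = "displayName" | "email" | "location" | "contacts";

const featureNeeds: Record<string, Field[]> = {
  leaderboard: ["displayName"],   // no need for email or location
  localWeather: ["location"],     // only while the feature is in use
  accountRecovery: ["email"],
};

function fieldsToCollect(activeFeatures: string[]): Field[] {
  const needed = new Set<Field>();
  for (const feature of activeFeatures) {
    for (const field of featureNeeds[feature] ?? []) needed.add(field);
  }
  return [...needed];
}

console.log(fieldsToCollect(["leaderboard"])); // [ "displayName" ]
```

Anything a feature does not declare is simply never requested, which keeps the collected data proportionate to what the child actually uses.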
9. Data Sharing
There are many cases where data sharing with third parties is acceptable, such as an educational app sharing data with the child's school, or safeguarding in one-off emergency situations, but this standard covers what data may be shared with third parties, with the child's best interest in mind. Think of the Cambridge Analytica scandal, in which third-party data was used to influence political messaging, and we can see the impact this can have.
10. Geolocation
Geolocation options must be switched off by default, unless there is a clear and compelling reason for geolocation to be switched on. If a child chooses to make their location visible to others, it should default back to off at the end of each session. It also needs to be obvious to the child that their location is being tracked: information should be provided at the point of sign-up, and a clear indication (e.g. a visible symbol) shown whenever tracking is active.
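The "default back to off at the end of each session" behaviour can be sketched in a few lines. This is an illustrative example with an assumed class name, not part of the code's text:

```typescript
// Hypothetical sketch: location sharing can be enabled for a session,
// but always reverts to off when the session ends.
class LocationSharing {
  private visible = false; // off by default

  enableForSession(): void {
    this.visible = true; // the child chose to share for this session
  }

  isVisible(): boolean {
    return this.visible;
  }

  // Called when the child closes the app or logs out.
  endSession(): void {
    this.visible = false; // defaults back to off, per the standard
  }
}

const sharing = new LocationSharing();
sharing.enableForSession();
sharing.endSession();
console.log(sharing.isVisible()); // false again after the session
```

The design choice here is that sharing is an opt-in per session rather than a sticky preference, so a one-off decision never silently persists.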
11. Parental Controls
If an app offers parental controls or monitoring, it needs to be clear to the child that their activity is being monitored by parents or guardians. The recommendations also describe the child's right to privacy under the United Nations Convention on the Rights of the Child (UNCRC) and how this expectation is likely to increase as they get older.
12. Profiling
This standard states that profiling should be switched off by default unless there is a compelling reason to justify keeping it on. Profiling is often used to provide users with relevant content, and much of it is controlled by cookies (small text files placed on your device so that a site or app can identify you). There will be stricter controls on what cookies can be used in an age-appropriate setting.
It will also need to be made clear if browsing history is used to recommend content or display advertisements, and this can't be hidden in a catch-all clause. This is one of the longer sections of the code on ICO.org.uk, and it is encouraging to see the ICO go into greater detail about what profiling covers.
13. Nudge Techniques
Encouraging children to provide unnecessary personal data, or to turn off privacy protections, is reprehensible. Any prompt that asks for privacy settings to be changed needs to be supplemented with an age-appropriate explanation of the functionality and the risk involved and, where possible, notification of a trusted adult.
What the new measures should encourage are nudges towards digital wellbeing, such as prompts to take breaks or pause, and safety features, so children do not feel obliged to keep playing even when tired.
14. Connected Toys and Devices
We are seeing an increasing number of products aimed at children that are connected to the internet, such as fitness bands or interactive teddy bears. These need to comply with all the measures within the code, and communicate the key features of how data is processed and what personal data is used, both at the point of purchase and during setup. Most importantly, the passive collection of personal data should be avoided, so that the device cannot simply listen in standby mode, and a hardware button is recommended so that connectivity can be switched off at any time.
15. Online Tools
Apps must allow children to exercise their data protection rights and report concerns, in line with the rights over personal data outlined in the GDPR. The rights to access, rectify, erase, restrict, port elsewhere and object all need to be supported in apps targeted at children. This could be in the form of a very basic icon that allows the child to indicate they are not happy with the service; when pressed, it should prompt the child to inform an adult.
What This Means for Parents and Guardians
The measures are in depth, considerate and a positive change for the industry that should empower children of all ages when using apps and websites. The full code is worth reading as it not only serves as a guideline for developers, but also as a guide for parents to aid in their understanding of how data could be used in the seemingly innocuous activity of using an application.
The timeline gives coders, UX designers and system engineers 12 months to put these standards into practice, but we should start seeing how this manifests in some of the more popular applications in the coming months.