How Can Organisations Protect Children in the Digital World?

The ICO fined TikTok £12.7 million for misusing children’s data after the platform allowed over 1 million children under 13 to set up accounts and access content. It is precisely this kind of failure that the ICO’s Children’s code addresses. The code was created to help companies understand their responsibilities when it comes to protecting children online.

The ICO says that 1 in 5 online users are children, so safeguarding measures to protect them in the digital world should be a priority. Laws that protect children are just as important online as they are offline.

The Children’s code comprises 15 standards:

1. Best interests of the child
2. Data protection impact assessments
3. Age appropriate application
4. Transparency
5. Detrimental use of data
6. Policies and community standards
7. Default settings
8. Data minimisation
9. Data sharing
10. Geolocation
11. Parental controls
12. Profiling
13. Nudge techniques
14. Connected toys and devices
15. Online tools

The code is not a new law, but it clearly outlines how the General Data Protection Regulation (GDPR) applies to children and to companies with child users. Its aim is to ensure that digital services, whether apps, games or websites, build in an automatic baseline level of data protection for child users. In practice, privacy settings should be set to high by default and organisations should not use nudge techniques to influence children to lower them. In addition, geolocation settings that reveal where a child is and profiling settings used to target content should be switched off by default.
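To make this concrete, a “high privacy by default” baseline might be expressed in a service’s account configuration along the lines sketched below. This is a minimal illustration in TypeScript; the setting names are assumptions for the example, not terms drawn from the code or any particular platform.

```typescript
// A minimal sketch of "high privacy by default" for child accounts.
// All setting names here are illustrative assumptions.

interface PrivacySettings {
  profileVisibility: "private" | "public";
  shareLocation: boolean;        // geolocation off by default for children
  personalisedContent: boolean;  // profiling-driven recommendations off by default
  dataSharingWithThirdParties: boolean;
}

// The protective baseline applied automatically to child accounts.
const childDefaults: PrivacySettings = {
  profileVisibility: "private",
  shareLocation: false,
  personalisedContent: false,
  dataSharingWithThirdParties: false,
};

function defaultSettingsFor(isChild: boolean): PrivacySettings {
  if (isChild) {
    // Start from the most protective position; any relaxation must be an
    // active, un-nudged choice by the child (or a parent), never the default.
    return { ...childDefaults };
  }
  return {
    profileVisibility: "public",
    shareLocation: false,
    personalisedContent: true,
    dataSharingWithThirdParties: false,
  };
}
```

The key design choice is that the protective value is the starting point: a child account begins from the highest privacy settings, and nothing in the interface should nudge the child towards lowering them.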

Key Points to Consider When Processing Children’s Data 

The code has a number of important points for companies to consider when processing children’s data. Here are a few to be aware of:

Data Protection Impact Assessments (DPIA)

A DPIA is a risk assessment that helps organisations work out whether children’s data is put at risk by the way it is being processed. The GDPR requires that a DPIA is carried out before beginning any processing that is likely to pose a high risk to the rights and freedoms of individuals. High-risk factors include processing genetic information, invisible processing where the data has not been obtained from the person directly, and providing a service aimed specifically at children. The ICO outlines further details of when a DPIA might be necessary.

An organisation’s DPIA should explain the nature and purpose of the processing, assess why it is necessary for the organisation and justify the lawful basis for carrying it out. The risks must then be evaluated, including risks to the child’s mental and physical health and wellbeing, such as exposure to misinformation, prolonged screen time and interrupted sleep patterns. The assessment should also set out how those risks can be mitigated, for example whether extra safeguards can be implemented or whether it is appropriate to give parents further information about the potential risks.

Transparency

The ICO says that transparency means being “clear, open and honest with your users” about how their data is being processed. TikTok breached Article 5 of the GDPR when it failed to make clear how child users’ data would be used.

Companies can comply with the transparency requirement by providing privacy information that can be easily accessed by both children and parents. The ICO says that information should be given in bite-sized chunks to make important privacy terms more straightforward. Companies can also use a “just in time” notice at the point at which personal data is about to be used, prompting the child to speak with an adult if they are unsure about proceeding. If companies know that children will be using their services, they should make the information as child-friendly as possible, for example through images or videos.
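A “just in time” notice might take a shape like the sketch below: before a feature touches personal data, the child sees a short, plain-language explanation and a prompt to ask an adult. This is an illustration only; the names (JustInTimeNotice, confirmWithUser, requestGeolocation) are hypothetical and not taken from the code or any real API.

```typescript
// A hedged sketch of a "just in time" notice shown before personal data is used.
// All names here are hypothetical stand-ins.

interface JustInTimeNotice {
  whatWeUse: string;        // which data the feature needs
  whyWeUseIt: string;       // the purpose, in plain language
  childFriendlyTip: string; // prompt to involve a trusted adult
}

async function requestGeolocation(
  confirmWithUser: (notice: JustInTimeNotice) => Promise<boolean>
): Promise<boolean> {
  const notice: JustInTimeNotice = {
    whatWeUse: "Your location",
    whyWeUseIt: "To show you places near you",
    childFriendlyTip:
      "Not sure? Ask a parent or another adult you trust before saying yes.",
  };
  // Only proceed if the child actively confirms after seeing the notice;
  // declining must be just as easy as accepting.
  return confirmWithUser(notice);
}
```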

Profiling

Profiling can be used to target users by tailoring certain content and deciding when and how often to serve them that content. In particular, the content put in front of children must not be capable of harming their mental and physical wellbeing. There are specific rules that cover profiling: Article 22(1) of the GDPR says that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

The concern is that children can be presented with wholly inappropriate content, such as graphic images, that can negatively affect them. Children are particularly vulnerable and can be exploited through the potentially dangerous content they are served.

Companies that use profiling to recommend content to child users bear greater responsibility than if the child had gone looking for that content themselves, because it is the service that selects the content based on how the child’s data has been processed. It is therefore important to identify child users so that they are not profiled in this way and are protected instead, as sketched below.
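One way a service might act on this is to gate behavioural recommendations on the outcome of an age check. This is a simplified sketch; the User shape and the feed functions are hypothetical stand-ins for whatever age assurance and recommendation logic a service actually runs.

```typescript
// A simplified sketch of switching profiling off for identified child users.
// The User shape and feed functions are hypothetical.

interface User {
  id: string;
  verifiedAdult: boolean; // result of whatever age-assurance check the service uses
}

function recommendContent(user: User, catalogue: string[]): string[] {
  if (!user.verifiedAdult) {
    // Child (or unverified) users get non-profiled content, e.g. editorially
    // curated or chronological, never behaviourally targeted.
    return curatedFeed(catalogue);
  }
  return profiledFeed(user.id, catalogue);
}

function curatedFeed(catalogue: string[]): string[] {
  return catalogue.slice(0, 10); // placeholder: first ten curated items
}

function profiledFeed(userId: string, catalogue: string[]): string[] {
  // Behavioural profiling would happen here, for verified adult users only.
  return catalogue.slice(0, 10);
}
```

Note the default in the branch: anyone not positively verified as an adult is treated as a child, so a failed or skipped age check never results in profiling.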

How Can Gerrish Legal Help?

Gerrish Legal is a dynamic digital law firm. We pride ourselves on giving high-quality and expert legal advice to our valued clients. We specialise in many aspects of digital law such as GDPR, data privacy, digital and technology law, commercial law, and intellectual property. 

We give companies the support they need to run their businesses successfully and confidently while complying with legal regulations, without the burden of keeping up with ever-changing digital requirements.

We are here to help you. Get in contact with us today for more information.
