Part 2: EdTech - Case Studies & Compliance

Schools across the UK spend close to £900 million each year on EdTech. But what do we really know about the privacy implications of this fast-growing industry?

As we noted in the first part of this two-part article, in 2019 EdTech made up 4% of companies operating in the digital sector, and the industry is projected to grow by 7% annually.

In the first part, we looked at the use of AI and surveillance in education. In this second part, we examine case studies from around Europe and set out some practical considerations for compliance.

The Use of Surveillance and AI in Education

Education is changing. New AI systems allow students to take control of their own education and can be used to recommend curricula suited to their abilities and interests. Children are also increasingly being taught how to use technology for the future, and some predict that educators could one day be replaced by it. More on this is set out in the first part of this series.

Case Studies: What happens when it goes wrong?

Despite the privacy implications, these practices are increasingly common, and data protection authorities around the world are reacting.

  • France - Authorities Preventing Surveillance

42 Silicon Valley, a software engineering school in France, recently came to the attention of the French data protection watchdog for deploying AI surveillance systems. The video systems recorded students in their relaxation areas and lacked measures to prevent unauthorised individuals from accessing the CCTV images.

On top of this, students were not informed of the monitoring. The school was ordered to stop the surveillance.

Two schools in Nice and Marseille had also requested permission to experiment with facial recognition technology. However, the CNIL (the French Data Protection Authority) found that processing the minors' biometric data would be neither proportionate nor necessary.

The CNIL considered that the experiments would violate the principles of proportionality and data minimisation under the GDPR, and that there were less intrusive ways of monitoring students.

  • Norway - Poor Security

The Norwegian Data Protection Authority, Datatilsynet, fined the Municipality of Oslo's Education Agency €120,000 over its use of a mobile app for communication between teachers, students and parents. The app's security was so poor that it could be accessed by unauthorised parties, and no technical measures were in place to protect special category data such as health information.

  • Sweden - Excessive Surveillance 

The Swedish Data Protection Authority, Datainspektionen, fined Skellefteå High School Board around €19,000 (200,000 SEK) for using facial recognition technology to verify 22 pupils’ attendance in lessons for three weeks. 

The pilot programme was put in place at the end of 2018 to track pupils' attendance: rather than taking a register, facial recognition would track their physical comings and goings. The school obtained parents' consent for the monitoring; however, the Datainspektionen considered that there was no valid legal basis for collecting the sensitive personal data.

Students were entitled to an expectation of privacy when they entered the classroom, and there were less intrusive ways that their attendance could have been monitored. 

The school had unlawfully processed sensitive biometric data, having failed to conduct an adequate data protection impact assessment or to consult the regulator before the processing began. It was the first fine issued in Sweden under the GDPR, and it shows that regulators are taking such practices seriously.

Practical Considerations

On video surveillance, the CNIL published guidance, following warnings it had issued to schools for excessive surveillance, explaining that any form of surveillance must comply with the GDPR.

It set out that any system placing students under constant and systematic surveillance would be considered excessive. Absent exceptional circumstances justifying such surveillance, cameras should be repositioned or removed so that only entrances and exits are filmed, or reprogrammed so that they do not record constantly during opening hours.

The use of cameras must therefore be fully justifiable and evidence-based, with a high threshold for this evidence. In general, constant CCTV is highly intrusive. The location of cameras should be chosen with care and they should not be placed in areas where individuals would expect privacy. 

When it comes to EdTech in general, the first port of call should always be a data protection impact assessment.

Schools, educational institutions and EdTech providers, or indeed any organisation considering surveillance by automated means, should therefore:

  • Consider proportionality: can the same aim be achieved by less intrusive means?

  • Ask whether facial recognition is really required, or whether a simple CCTV system would have the same effect.

  • Obtain feedback from all stakeholders: education providers, local authorities, pupils and students, teachers and staff, and parents.

  • Ensure that all correct privacy documentation is in place and that any notices are clearly displayed and easy to understand.

  • Conduct a Data Protection Impact Assessment to outline risks and see how they can be overcome or mitigated.

  • Liaise with the applicable Data Protection Authority where the EdTech or surveillance relies on biometric or other sensitive or special category data, or involves new or innovative technologies, especially where AI or “black box” tech is involved.

In our experience, businesses should work with data protection authorities, not against them, as they refine their products. Do not be afraid to ask regulators for advice and assistance, and more importantly, ensure that privacy risks are addressed and mitigated before roll-out.

If you have any questions relating to EdTech, or data privacy in general, please don’t hesitate to get in touch!

Article by Lily Morrison @ Gerrish Legal, April 2020
