GDPR - Can Biometric Data Processing be Lawful?
Following a major data breach in the UK, and with recent warnings from the Netherlands and Sweden that the technology is racing ahead of the law, we ask: when is processing biometric data justified?
Major Data Breach - The BioStar 2 case
A huge data breach was recently discovered by internet privacy researchers in Israel on the security platform app BioStar 2.
BioStar 2 is a biometric smart lock security platform: an app hosted on a central server which allows administrators to oversee and control anyone accessing secure areas of their facilities. The system used facial recognition and fingerprint data (that is, biometric data) to authorise user access. Users consented to this data being held on the system, and administrators could allow users into secure areas, manage user permissions, record activity logs and integrate the app with third-party security apps.
BioStar 2 was integrated with another access control system called AEOS, which is used by 5,700 organisations in 83 countries, including governmental bodies and the UK's Metropolitan Police. However, the researchers discovered that the BioStar 2 database was unprotected and unencrypted: it was possible for them to search through all of the data available on the app simply by manipulating the URL.
The breach exposed over 27.8 million records, including fingerprint data, facial recognition data, face photographs of users, usernames and passwords, and other personal details. Data uploaded by co-working offices in the US and Indonesia, Indian and Sri Lankan gym chains, Finnish businesses and British medical companies could all be viewed. The researchers could access the platform and watch in real time as data was uploaded, amended or deleted, and could even upload, amend and delete their own data on the platform themselves!
The researchers warned that they could use an existing profile and change the biometric data to match their own so that they could then easily access security protected facilities.
Whistle-blowers claim that the company was generally uncooperative and unresponsive after being alerted to the breach. A spokesperson has admitted that mistakes were made, but says that the real test will be how the company handles them. While we wait to see whether legislators and watchdogs will impose any sanctions, the incident raises the question: when is it legitimate to process and store biometric data?
What does the GDPR say?
Biometric data is information pertaining to someone’s physical characteristics, such as fingerprint data and facial recognition data.
Under the General Data Protection Regulation (2016/679) (“GDPR”) biometric data is considered to be a special category of personal data which requires:
a special legal basis for processing the special category of personal data; and
the performance of a data protection impact assessment, to confirm the processing is absolutely necessary.
Article 9 of the GDPR expressly prohibits the processing of special categories of personal data unless a special legal basis or exception applies.
It is possible to process biometric data on the basis of the data subject's explicit consent; however, with the toughened rules under the GDPR, relying on consent can be unpredictable. Article 9(2) of the GDPR provides other grounds on which companies may rely to process special categories of data, including biometric data, but such grounds are construed very narrowly.
Article 9(2)(a)-(j) sets out these circumstances. In addition to explicit consent, they include cases where the processing is necessary in the field of employment, social security and social protection law; to protect the vital interests of a data subject (or of another natural person) who is physically or legally incapable of giving consent; for the establishment, exercise or defence of legal claims; for reasons of substantial public interest; and for certain purposes relating to healthcare and public health data.
The exceptions also cover processing carried out by a foundation, association or other not-for-profit body with a political, philosophical, religious or trade union aim (subject to strict conditions), and processing of personal data which has been manifestly made public by the data subject.
These grounds should always be read in conjunction with applicable national law: for example, the UK Data Protection Act 2018, which supplements the GDPR in UK law, provides for additional conditions in some instances.
Dutch Guidance
Shortly after the data breach in the UK, the Court of Amsterdam issued guidance on biometric data when it was asked whether it was legitimate to require employees to use biometric verification for security purposes.
Employees were expected to scan their fingerprints on a cash register to confirm their identity before the money could be accessed. No consent was obtained for the use of this personal data: it was a company requirement.
The Dutch GDPR implementation act (the UAVG) allows biometric data to be used without consent, so long as the processing is necessary for authentication or security purposes (Article 29 UAVG). This implements the exception in Article 9(2)(g) of the GDPR, under which special categories of personal data may be processed where necessary for reasons of substantial public interest.
The Dutch court decided that the employer had not sufficiently documented its reasons for using biometric data or performed a proper data protection impact assessment, as may be required by Article 35 of the GDPR. Given that the GDPR generally prohibits biometric data from being processed unless an exception such as substantial public interest applies, the court felt that it was the employer's responsibility to document why alternative security measures would not have been appropriate and why only biometric data would suffice.
The decision shows how important it is to carry out a data protection impact assessment and to consider whether the personal data you are storing is absolutely necessary for the purposes you aim to achieve.
Sweden’s View
Shortly after the Dutch decision, Sweden stepped into the discussion! Issuing its first fine under the GDPR in August 2019, the Datainspektionen (the Swedish data protection authority) came down on facial recognition and biometric data, warning that the technology, while still in its infancy, has the potential to get out of control.
A high school in Skellefteå had been trialling facial recognition software to monitor and record children's attendance in lessons. The trial lasted three weeks and 22 pupils' faces were scanned, with their biometric data being stored and processed to track attendance.
The school was under the impression that it had obtained valid consent to the processing of the biometric data, which would have made its monitoring system lawful. However, the Datainspektionen advised that, given the school's authority over the children and the children's dependence on the school, consent could not be validly obtained. The facial recognition software also meant that the children were constantly monitored, depriving them of privacy and intruding on their everyday lives.
It issued guidance that, since there were other, less intrusive ways in which the school could have monitored the children, the processing of the data was not legitimate. Biometric data is sensitive personal data which deserves extra protection, and an explicit exception must apply for such data to be processed lawfully.
By choosing this case for its first fine, the Swedish GDPR watchdog has sent a clear message. It has advised that facial recognition technology is developing quickly and sees a great need to clarify how the rules apply to everyone.
Lessons to learn
There are a number of lessons to learn from the recent breaches and the timely guidance from the Dutch and Swedish regulators. Ultimately, processing biometric data can be lawful provided that you have an appropriate basis on which to do so. It is always worth keeping the following points in mind:
Firstly, perform a data protection impact assessment. Always be sure that your processing is absolutely necessary for the activities you are performing, and that there are no other methods available to you which would avoid processing personal data, whether sensitive or general.
Secondly, document your decisions. If you decide that your processing is legitimate and lawful, record your decision and the other options that you considered, to demonstrate that you explored alternative avenues before concluding that your processing was legitimate.
And lastly, use appropriate technical and security measures. It is crucial that, if you do decide to process personal data, the data is kept organised and secure, with credentials such as passwords stored in hashed or encrypted form rather than in plain text, and deleted as soon as it is no longer required.
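By way of illustration only, the short Python sketch below shows one common technical measure of this kind: storing a salted hash of a password (using the standard library's PBKDF2 implementation) instead of the plain-text password, so that a leaked database does not expose users' credentials in the way the BioStar 2 breach did. The function names and parameter values are our own illustrative assumptions, not a prescription for any particular system, and legal advice should always accompany technical choices.

```python
import hashlib
import hmac
import os

# Illustrative parameter only: choose a value appropriate to your own risk assessment.
ITERATIONS = 600_000  # PBKDF2-HMAC-SHA256 iteration count

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return a (salt, digest) pair to store instead of the plain-text password."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Recompute the digest from the supplied password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

# Example: only the salt and digest would ever be written to the database.
salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```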
If you have questions about the data processing you are carrying out or have any other legal issues you would like advice on, please don’t hesitate to get in contact here.
Article by Lily Morrison and Rebecca Willoughby @ Gerrish Legal, August 2019