How to Comply With Biometric Data Processing Standards

The Information Commissioner’s Office (ICO) is in the process of putting together guidance on biometric data and biometric technologies to help companies understand how to process biometric data securely and in accordance with the regulations.

The ICO expects the use of biometric data in businesses to increase in future, whether to enable a better user experience for customers or to create more efficient systems for employees. As such, it is important that companies know how to handle biometric data and its potential risks, such as inaccuracy and discrimination.

What Is Biometric Data Processing?

According to the UK GDPR, biometric data is defined in Article 4(14) as:

“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.

Some biometric data is considered “special category” data. The ICO says biometric data is treated as “special” because it can reveal information about, for instance, someone’s race, ethnicity or gender, and because it is linked to wider rights such as freedom of thought, conscience and religion, the right to bodily integrity, the right to respect for private and family life, and freedom from discrimination.

In another post, we speak about virtual reality and the risk to our personal biometric data which is captured when we use virtual equipment like headsets. 

Our biometric data encompasses distinctive features, such as fingerprints and facial characteristics, that are unique to each individual. That is why facial recognition can unlock our mobile devices for us and no one else. The potential ramifications of our biometric data falling into unauthorised hands could therefore be significant.

Biometric data can be captured in our everyday lives without us even thinking about it: scanning your fingerprint to enter the gym, providing an electronic signature when you receive a package (an example of behavioural biometric identification), or sending a voice note to a friend.

The ICO gives other examples of biometric data, including:

  • facial recognition

  • fingerprint verification

  • iris scanning

  • retinal analysis

  • voice recognition

  • ear shape recognition

  • keystroke analysis

  • handwritten signature analysis

  • gait analysis

  • gaze analysis (eye tracking)

Biometric Data Processing 

The ICO will be producing further guidance for companies but has outlined some key points for those using or considering the use of biometric data in their daily business activity. 

Consent

The ICO says that explicit consent is usually needed when gathering biometric data. As with the collection of other data, users must not feel forced into providing their sensitive data and must be given a genuine choice. Take the gym membership example again: if a gym offers fingerprint entry access but some members are not comfortable giving their fingerprints, the gym should provide an alternative means of entry, such as a PIN code or swipe card.

Data Protection Impact Assessment

Conducting a Data Protection Impact Assessment (DPIA) is mandatory for processing activities that are likely to result in a high risk to individuals’ rights and freedoms. The use of any biometric recognition system is highly likely to need a DPIA.

This requirement arises from data protection regulations, which stipulate that a DPIA must be performed when handling special category data on a significant scale or engaging in extensive systematic monitoring of publicly accessible areas. Most applications of biometric recognition systems align with these criteria. Even in cases where your system does not meet these specific criteria, a DPIA is still imperative if your processing activities correspond to high-risk processing operations.

Risks

Holding biometric data carries inherent risks, primarily related to accuracy, discrimination, and security. Inaccurate identification can lead to system errors, jeopardising the reliability of biometric authentication. Discrimination concerns arise when individuals or groups face unjust treatment based on their protected characteristics. Additionally, the security aspect involves unauthorised access to biometric data and the potential for system manipulation or spoofing, undermining its intended safeguards.

Addressing Risks

To safeguard biometric data effectively, you must implement suitable security measures. These should be determined through a comprehensive risk analysis, taking into account the processing context, the security threats you face, and the harm or distress that could result if the data were compromised. You should also assess the vulnerabilities your system might face from various forms of attack. To keep these security measures effective, test and review them regularly. Finally, any biometric data you use should be encrypted to enhance protection.
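By way of illustration, the short Python sketch below shows one common way to encrypt a stored biometric template at rest with a symmetric cipher. It is a minimal example only, assuming the widely used cryptography library; the file contents and key handling are hypothetical, and a real deployment would also need secure key management, access controls and retention/deletion policies.

```python
# Minimal sketch: encrypting a biometric template at rest using symmetric
# encryption (Fernet, from the "cryptography" Python library).
# Key handling here is illustrative only; in practice the key should be
# stored separately, e.g. in a dedicated key-management service.
from cryptography.fernet import Fernet

# Generate a key once and keep it apart from the encrypted data.
key = Fernet.generate_key()
cipher = Fernet(key)

# "template" stands in for a captured biometric template (e.g. a fingerprint
# feature vector serialised to bytes) -- hypothetical example data.
template = b"serialised-fingerprint-template"

encrypted = cipher.encrypt(template)   # store this, never the raw template
restored = cipher.decrypt(encrypted)   # decrypt only when verification is needed

assert restored == template
```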

How Can Gerrish Legal Help?

Gerrish Legal is a dynamic digital law firm. We pride ourselves on giving high-quality and expert legal advice to our valued clients. We specialise in many aspects of digital law such as GDPR, data privacy, digital and technology law, commercial law, and intellectual property. 

We give companies the support they need to successfully and confidently run their businesses whilst complying with legal regulations without the burdens of keeping up with ever-changing digital requirements. 

We are here to help you, get in contact with us today for more information.
