Gerrish Legal at the Transparency by Design Summit!

How can we encourage trust in track and trace apps? It has been an interesting few months for data privacy! We have watched in real time as the rules on using public health data have been applied by public and governmental bodies, and this has raised some interesting considerations for privacy lawyers.

In June 2020, our founder at Gerrish Legal, Charlotte Gerrish, spoke at the Transparency by Design Summit. The aim of the summit was to think about the concerns that individuals have when it comes to their personal data being used for the prevention of COVID-19, and how we, as tech-minded and privacy-literate individuals, could work towards a solution.

You can watch the Main Sessions of the Summit via Facebook or on YouTube. In the meantime, check out our overview below!

Privacy Concerns During COVID…

The General Data Protection Regulation (GDPR) requires all personal data to be collected and processed lawfully, and for the last two years the legal text has striven to build trust in the way that businesses use our personal data.

Governments, just like businesses, require a legal basis to collect health data under the GDPR. The WHO has declared the Coronavirus a global pandemic and, as such, government bodies are legally permitted to collect health data without the consent of the individuals involved, the legal basis being that the processing is necessary for reasons of substantial public interest.

However, while governments might have a legal basis to collect public health data, the public does not seem to have been pleased about it!

France was one of the first European countries to announce plans for an app to help control the spread of the virus. It said that the StopCOVID app would use Bluetooth signals to keep track of nearby devices anonymously, so as to limit the collection of personal data. However, there were immediate and major privacy concerns, with political parties worrying about how the data would be used after the crisis and 28% of the population stating they were certain they would not download the app.

While France has ploughed ahead and made its app available for download, the UK has become increasingly hesitant about the development of its NHS Covid-19 App. It should have been released at the same time as the French app, was pushed back to June, and remains delayed in July. After announcing that the NHS app would use the same technology as the French StopCOVID app, those plans were scrapped in favour of technology from internet giants Google and Apple: another worrying turn which has been putting the public off using the apps…
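For readers curious about how a Bluetooth-based app can keep track of devices without collecting personal data, here is a heavily simplified sketch of the decentralised approach these apps broadly follow. All names are our own and the real protocols (such as the Google/Apple exposure notification framework) use cryptographic key schedules rather than raw random tokens; this is illustrative only.

```python
import secrets

class Device:
    """Toy model of a phone in a decentralised contact-tracing scheme."""

    def __init__(self):
        self.own_tokens = []    # tokens this device has broadcast
        self.heard_tokens = []  # tokens heard from nearby devices

    def broadcast(self):
        # A fresh random token per interval: no stable identifier is
        # ever shared over Bluetooth, which is the data-minimisation idea.
        token = secrets.token_hex(16)
        self.own_tokens.append(token)
        return token

    def hear(self, token):
        # Contacts are stored locally; nothing leaves the phone here.
        self.heard_tokens.append(token)

    def check_exposure(self, published_positive_tokens):
        # If a user tests positive, only their own anonymous tokens are
        # published; matching happens on each device, not on a server.
        return any(t in published_positive_tokens for t in self.heard_tokens)

# Two devices pass each other in the street
alice, bob = Device(), Device()
bob.hear(alice.broadcast())

# Alice later tests positive and her tokens are published
published = set(alice.own_tokens)
print(bob.check_exposure(published))  # True: Bob is notified of exposure
```

The privacy argument rests on the fact that the published list contains only short-lived random tokens, so no central party learns who met whom.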

The concerns that have been raised all over Europe have got us thinking about the GDPR, and why, after two years of rules aimed at encouraging public trust in the use of personal data, people remain hesitant to share their information.

So, our founding lawyer attended the Transparency by Design Summit to speak about what we can do next in order to encourage trust in the use of our personal data. Here is what was discussed!

The need for adaptation?

The concerns raised during the pandemic have got some of us wondering whether the GDPR is fit for purpose and whether it has achieved its aim: perhaps the concerns show that the text needs to be adapted?

Or perhaps not. The GDPR was developed with flexibility and adaptability in mind, so in theory it should not stand as an obstacle to fighting the pandemic. For many it exists as a gold standard to aim for. Regulators do seem to understand this and have so far been quite sparing with the fines they have handed out, considering their wide authority and how far they technically could go. Indeed, since 2018 there have been 160,921 personal data breaches reported in the EEA, but only €153 million in fines: not a large number considering the huge discretion authorities have over monetary penalties, and how large the figure could have been. So, bearing this in mind, perhaps the starting point is that the GDPR does not actually need to be adapted, because its rules are not inherently at odds with fighting the pandemic: instead, we need to think about how we can use the adaptability of the GDPR to our advantage.

To tech-minded data privacy lawyers, the pandemic hasn't revealed any new issues with the GDPR. There have long been debates over how restrictive it can be for innovative processes, and there has been a feeling in the tech community for some time that this should be reviewed. So, maybe we should think of the issues revealed by the pandemic positively, in that they may force regulators to review the realities of how technology needs to be developed!

Indeed, we have seen strict rules being relaxed all over Europe. In Italy, Civil Protection Ordinance No. 630 has granted civil bodies extensive powers to process personal data related to the COVID-19 crisis, lifting restrictions on personal data. In France, an information notice has confirmed that the data strictly necessary for the accomplishment of the COVID-19 mission will be transmitted in the spirit that the GDPR intended. In Germany, limiting the use of personal data to what is in the public interest is not a new concept, and Section 22(1) of the German Federal Data Protection Act permits the processing of special categories of personal data for reasons of public interest. Data protection watchdogs have been clear from the beginning of the pandemic that they are aware of the need to adjust their approach to regulation and apply their authority within the larger social and economic situation.

So, perhaps the pandemic has shown that the GDPR does contain all the flexibility and adaptability it claims to possess: we just need to work on encouraging this flexibility in other areas! However, while we encourage this flexibility, we must also ensure that the aims of the GDPR are not forgotten.

Data Minimisation vs Tech

Despite the flexibility of the GDPR, we are seeing tech companies struggle with some elements of data protection. How can they ensure compliance?

Since the pandemic has called for the rapid development of new technologies, it is arguable that all of the principles set out in the GDPR are harder than usual to comply with, since they require start-ups to build their projects slowly and mindfully. Take, for example, the data minimisation principle, which requires that data is relevant and limited to what is necessary in relation to the purposes for which it was collected. Tech start-ups aiming to develop technology to fight the pandemic have not had the opportunity to explore alternatives to collecting data, and face an uphill battle to comply with the requirement of avoiding excessive data collection.

So, tech companies looking to develop such apps now have the challenge of following the privacy by design principles at a much faster speed than ever before.
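As a toy illustration of what data minimisation and privacy by design might mean in practice for such an app: the record below stores no name, phone number or location, only what an exposure-notification purpose arguably requires, and old records are deleted once they can no longer serve that purpose. The field names and the 14-day retention window are our own hypothetical choices, not a prescription.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ContactEvent:
    """A minimised contact record: no directly identifying fields."""
    token: str           # anonymous rotating identifier heard over Bluetooth
    timestamp: datetime  # when the contact occurred
    duration_min: int    # how long the devices were in proximity

# Storage limitation: keep data only as long as the purpose requires
# (here, a hypothetical 14-day incubation window).
RETENTION = timedelta(days=14)

def purge_expired(events, now):
    """Delete contact records older than the retention period."""
    return [e for e in events if now - e.timestamp <= RETENTION]

now = datetime(2020, 7, 1)
log = [
    ContactEvent("a3f1", now - timedelta(days=2), 12),   # recent, kept
    ContactEvent("9b07", now - timedelta(days=20), 30),  # stale, purged
]
log = purge_expired(log, now)
print(len(log))  # 1: only the recent contact remains
```

Designing the data model this way from the outset is far easier than stripping identifying fields out of a finished system, which is the point of privacy by design.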

To a tech-minded data privacy lawyer, steps towards compliance for these companies cannot be viewed from only a legal perspective. While data privacy experts can make efforts to implement the changes needed from their end once products have been designed, we think that a culture of data privacy must be promoted within tech start-ups developing such technologies.

Therefore, when looking to use tech to track the Coronavirus, we need to take steps towards designing such technologies with data protection at the heart of the process. Privacy compliance cannot be added on to a system at the end of the design process: it must be kept in mind throughout the process. By doing so, we can encourage the public to trust in the process, use the apps and ultimately achieve the aims of the apps.

Encouraging Trust

However, the past few months have shown that there is a lack of trust in the use of personal data. How can we change this?

We think that expecting trust is impossible without first encouraging people to learn about the rights and protections they have under the GDPR.

The GDPR has taken great strides towards protecting our personal data, and it is therefore important that society understands what personal data is and the value it can have. There is a lot of confusion around the development of the track and trace apps which require personal data, and about how the data is collected, used and stored. Most people understandably don't have time to research the science behind the technology, and so grow concerned about the possible ways their data could be used.

Unfortunately, while data protection watchdogs have been tasked with enforcement against companies that fail to follow the rules, the soft approach they have taken means there is arguably a lack of education for those looking to learn what rights they have under the GDPR. With only internet giants being made examples of, there is little guidance to draw from when it comes to smaller businesses. Tech-minded individuals should therefore encourage society to become data literate, so that everyone can understand the ways in which their data might be used. It is important that we demonstrate to society that we have rights under the GDPR, and that these rights cannot be given away and must be insisted upon!

It would be great to see some privacy certification schemes stepping up to certify tech companies that have followed this process.

As an illustrative example, think about restaurants and food hygiene certificates: we might not fully understand the criteria that kitchens are judged on, but we all trust a kitchen which has earned a hygiene certificate to serve safe food. In the same way, perhaps privacy certifications could be used to reassure the public that their data will be handled safely, even if some do not fully understand the intricacies of data processing.

We also wonder whether tech could be used to deliver data education in a way that is more digestible, rather than the current approach of soft regulation from data protection watchdogs, who are hesitant to hand out large fines and ultimately miss the mark on providing us with useful education. Every day, millions of tweets, TikToks, YouTube videos and Facebook posts (the list goes on!) are shared and consumed by the public. Could these contain educational information on how important our personal data is, and what we can do to enforce our rights?

So, for us, the Summit demonstrated that tech-minded individuals do want to be privacy compliant but face an unusually large battle in achieving this at the moment. However, it is in no way impossible: promoting a culture of data privacy and following the privacy by design process will demonstrate compliance and encourage trust. That being said, we cannot expect trust until we first tackle the lack of education around data protection.

We would love to hear your thoughts on how we can encourage education around data protection and whether this is the right approach to instil public confidence in track and trace apps. Please contact us and let us know your thoughts!

Article by Lily Morrison @ Gerrish Legal, first published on TechGirl in July 2020
