Chatbots and Privacy-by-Design: a few tips to ensure GDPR Compliance

Chatbots have experienced a boom in the wake of the Covid-19 pandemic. Available 24/7 and untouched by the successive closure/re-opening measures affecting brick-and-mortar shops, these tools have proven particularly handy for individuals and organizations alike during the strict confinements imposed worldwide.

Workers stranded at home and struggling with remote-working software; patients looking to receive expert health advice from within the safety of their homes; and of course customers looking to contact the customer service teams of their favourite e-commerce platforms – all have come into contact with these conversational assistants in one way or another.

According to a recent study conducted by Inbenta, a chatbot supplier based in Toulouse, France, chatbot editors even experienced a 20% increase in the number of conversations managed by their bots during the first confinement in France – a number which sometimes skyrocketed to 96% in the airline and hospitality sectors. 

One obvious question this raises for data controllers and data processors is how to ensure compliance of these chatbots with data protection laws. In the following article, we propose to go through some of the tips and measures you can start putting into place now to ensure the compliance of your chatbots with data protection laws – and as the GDPR represents the gold standard in the industry, the following discussion will focus on the different guidelines that can be gleaned from the GDPR framework, as well as the French data protection authority’s interpretation of these guidelines for chatbot-specific concerns.

Chatbot or not: the General Principles relating to processing of personal data

Before getting into the specifics of data processing for chatbots, a good place to start is with the GDPR’s general principles relating to the processing of personal data: these are the principles that should be respected whatever the kind of data processing you’re conducting, irrespective of whether or not you’re using a chatbot.

At Gerrish Legal, we’ve written extensively about the GDPR’s principles-based regulation and in all likelihood, these principles are already firmly embedded in your company culture and business model. Since a reminder is always welcome though, here’s a quick summary of what you must bear in mind when processing data. 

There are 6 key principles to keep in sight, which can be found at article 5 GDPR and require data to be: 

  1. processed lawfully, fairly and in a transparent manner (‘Lawfulness, Fairness and Transparency’); 

  2. collected only for specified, explicit and legitimate purposes (‘Purpose Limitation’); 

  3. adequate, relevant and limited to what is necessary in relation to the purposes for which it is Processed (‘Data Minimisation’); 

  4. accurate and where necessary kept up to date (‘Accuracy’); 

  5. not kept in a form which permits identification of Data Subjects for longer than is necessary for the purposes for which the data is Processed (‘Storage Limitation’); 

  6. processed in a manner that ensures its security, using appropriate technical and organisational measures (‘Security, Integrity and Confidentiality’).

There are two additional principles which can be added to this list, deriving from Chapter V and Articles 12 to 23 of the GDPR. These set forth the requirement that personal data: 

  • shall not be transferred to another country without appropriate safeguards being in place (‘Transfer Limitation’); and

  • be made available to Data Subjects, for them to exercise certain rights in relation to their Personal Data (‘Data Subject's Rights and Requests’).

 As always, if you’ve got any questions regarding the above principles and how to embed them in your business model from the get-go, don’t hesitate to contact us for advice.

A few tips and considerations specific to Chatbots

  • Q: Chatbots and Cookies?

One of the main technical considerations which comes up when discussing chatbots is cookies. This is because chatbot editors often choose to pair their conversational assistants with cookies, in order to ensure continuity of experience for the human on the receiving end of the conversation – even where the user decides to refresh the webpage or open a new tab, for instance.

This raises the obvious issue of cookie consent. Two main options are available to chatbot editors and operators looking to collect user consent: 

Consent can be collected by virtue of a cookie banner: the advantage here is that once user consent has been collected, the chatbot window can pop up and open automatically, without any specific action required from the user. 

If you pick this option though, you’ll want to pay particular attention to the formatting and information contained in your banner, especially if you’re reading this from France. The CNIL recently encouraged private and public organisations to audit their websites and mobile applications to ensure compliance of their cookie banners with cookie rules: according to the CNIL’s investigation, a large number of cookie banners do not, as it stands, allow users to accept or refuse cookies in an easy, user-friendly manner.

Consent can also be collected by virtue of the chatbot window itself: in this case, consent is obtained as a result of the user choosing to open the chatbot window of his or her own volition.

This option won’t exempt you from the requirement to display a cookie banner for the other cookies used on your website; it’ll also result in a more discreet chatbot which, depending on your business model and customer engagement strategy, may be a good or a bad thing.
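To make the two consent paths concrete, here is a minimal sketch (in Python, with hypothetical names – real implementations will differ) of a server-side gate that only issues the continuity cookie once one of the two signals described above has been received:

```python
from dataclasses import dataclass
from typing import Optional
from uuid import uuid4

@dataclass
class ConsentState:
    """Hypothetical record of the two consent signals discussed above."""
    banner_accepted: bool = False  # user accepted cookies via the banner
    opened_chat: bool = False      # user opened the chatbot window themselves

def continuity_cookie(state: ConsentState) -> Optional[str]:
    """Return a Set-Cookie value for the chat session, or None.

    The cookie that keeps the conversation alive across page refreshes
    is only issued once one of the two consent paths has been taken.
    """
    if state.banner_accepted or state.opened_chat:
        # Short-lived, narrowly scoped cookie: no cross-site use.
        return f"chat_session={uuid4()}; Max-Age=1800; SameSite=Strict; Secure"
    return None  # no consent yet: the chatbot must work without a cookie

# Usage: no cookie before consent, a cookie after either signal.
assert continuity_cookie(ConsentState()) is None
assert continuity_cookie(ConsentState(opened_chat=True)) is not None
```

The point of the single gate is that however consent arrives – banner or chat window – no identifier touches the user’s browser before it.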

  • Q: Any room for centenarian chatbots in a GDPR-compliant world?

One of the reasons you may want to equip your chatbot with a cookie in the first place is to keep a record of the previous chats conducted between your conversational agent and a given user. This raises the question of data retention periods.

The GDPR prohibits the indefinite retention of data and requires the data retention cycle (i.e. active usage, intermediate archiving and final deletion) to be adapted to the data processing purposes.

Consider getting in touch with your DPO or a legal professional at this stage, as for a number of data processing activities the law provides for specific data retention cycles. When looking at chatbots specifically, you may want to bear in mind that while some cases justify data retention (a customer-service complaint that needs to be followed up, for instance), others are unlikely to necessitate it (a chatbot dedicated to purchase assistance, for example).
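The three-stage cycle mentioned above (active usage, intermediate archiving, final deletion) can be sketched as a simple classifier. The periods below are purely illustrative assumptions – the right values depend on your processing purpose and any sector-specific rules, which is exactly what your DPO should help you determine:

```python
from datetime import datetime, timedelta

# Illustrative periods only -- not legal advice.
ACTIVE_PERIOD = timedelta(days=30)    # conversation usable by the bot
ARCHIVE_PERIOD = timedelta(days=365)  # restricted-access archive

def retention_stage(created_at: datetime, now: datetime) -> str:
    """Classify a stored conversation into the GDPR retention cycle."""
    age = now - created_at
    if age <= ACTIVE_PERIOD:
        return "active"    # normal use by the chatbot
    if age <= ACTIVE_PERIOD + ARCHIVE_PERIOD:
        return "archived"  # kept only for e.g. complaint follow-up
    return "delete"        # past its purpose: erase it

# A scheduled job would run this over the conversation store.
now = datetime(2021, 3, 1)
assert retention_stage(datetime(2021, 2, 20), now) == "active"
assert retention_stage(datetime(2020, 6, 1), now) == "archived"
assert retention_stage(datetime(2019, 1, 1), now) == "delete"
```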

  • Q: Truer-Than-Life Chatbots?

Some chatbots are so effective that we may come to forget that we’re conversing with robots, not actual humans. From a GDPR perspective though, it’s best never to lose sight of this.

Article 22 of the GDPR is very clear that: “the data subject shall have the right not to be subject to a decision based solely on automated processing”, if this automated processing results in legal effects or similarly “significantly affects him or her”.

The article provides for some narrow exceptions to this general rule, for instance in cases where: 

  • the automated processing is necessary for entering into, or performance of a contract between the data subject and a data controller;

  • the data subject has given explicit consent; or

  • the processing is authorised by Union or Member State law to which the controller is subject.

Even in these three cases though, the data subject must retain “at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.”
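One practical way to respect that residual right is to route any bot decision with legal or similarly significant effects to a human before it takes effect, and to let the data subject contest it. A minimal sketch, with hypothetical names and a deliberately simplified notion of "legal effect":

```python
from dataclasses import dataclass

@dataclass
class BotDecision:
    """Hypothetical outcome produced by a chatbot's automated logic."""
    summary: str
    has_legal_effect: bool  # e.g. refusing a claim, not answering a FAQ
    needs_human_review: bool = False
    contested: bool = False

def finalise(decision: BotDecision) -> BotDecision:
    """Hold Article 22-style decisions for a human before they apply."""
    if decision.has_legal_effect:
        decision.needs_human_review = True  # human in the loop, always
    return decision

def contest(decision: BotDecision) -> BotDecision:
    """Let the data subject contest the decision and trigger re-review."""
    decision.contested = True
    decision.needs_human_review = True
    return decision

faq = finalise(BotDecision("answered a FAQ", has_legal_effect=False))
claim = finalise(BotDecision("rejected a refund claim", has_legal_effect=True))
assert not faq.needs_human_review
assert claim.needs_human_review
```

Whether a given decision "significantly affects" someone is a legal judgement, not a boolean flag – the sketch only shows where that judgement plugs into the workflow.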

  •  Q: What about chatbots that are just a little bit too nosey?

There’s one crucial question that just can’t be avoided when talking about chatbots: that of special categories of personal data. This is because chatbots can come to process these special categories of personal data – sometimes by design (in the case of a chatbot specifically built to answer questions about a person’s health on a health practitioner’s website, for instance), but sometimes independently of the will of the editor: while most basic chatbots rely on a repertoire of pre-defined answers, some chatbots are now able to learn from previous conversations, resulting in conversation threads which may lead to an exchange of sensitive personal data between the chatbot and the user. 

Article 9 of the GDPR defines special categories of personal data as data:

Revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation. 

As a general rule, article 9 provides for a prohibition of the processing of these special categories of personal data. As is often the case with legal matters though, the article also provides for exceptions, i.e. special circumstances in which the prohibition shall not apply, for instance if: 

  • the data subject has given explicit consent to the processing of those personal data;

  • the processing is necessary for the purposes of preventive or occupational medicine, medical diagnosis, the provision of health or social care or treatment or the management of health or social care;

  • the processing is carried out in the course of its legitimate activities by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim; or

  • the processing is necessary for reasons of substantial public interest.

It’s very important to assess whether your chatbot is likely to come into contact with these special categories of personal data: this is because, wherever in the world you’re reading this article from, you’ll likely have to conduct a Data Protection Impact Assessment (DPIA) as a result of your activities.

It’s good practice to do a DPIA for any major project which requires the processing of data, but it is a requirement by Data Protection Authorities in both the UK and the EU whenever the processing involves the aforementioned special categories of data. Conducting a DPIA can entail a lot of work - and sometimes a lot of headaches too. It’s also something we’re experts at here at Gerrish Legal, so don’t hesitate to get in touch with us if this is something you’d like some assistance with.

One last thing: you may want to add some preventative features to your chatbot window, such as a banner warning users not to share special categories of data with the chatbot, or a function allowing users to delete past conversations containing sensitive data. 
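Both preventative features can be sketched in a few lines. The keyword heuristic below is a deliberately crude illustration – reliably detecting special-category data is much harder and should be designed with your DPO:

```python
import re
from typing import Optional

# Crude, assumption-laden heuristic for illustration only.
SENSITIVE_HINTS = re.compile(
    r"\b(diagnos\w*|illness|religion|trade union|ethnic\w*)\b", re.IGNORECASE
)

WARNING = ("Please don't share health, religious, political or other "
           "sensitive personal data in this chat.")

def check_message(text: str) -> Optional[str]:
    """Return a warning to display if the message looks sensitive."""
    return WARNING if SENSITIVE_HINTS.search(text) else None

def delete_conversation(store: dict, conversation_id: str) -> bool:
    """User-facing deletion of a past conversation (the second feature)."""
    return store.pop(conversation_id, None) is not None

store = {"conv-1": ["hello", "my diagnosis came back today"]}
assert check_message("My diagnosis came back today") == WARNING
assert check_message("Where is my parcel?") is None
assert delete_conversation(store, "conv-1") and "conv-1" not in store
```

In a real deployment the deletion function would also purge backups and archives within the retention cycle discussed earlier.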

Remember, chatbots can prove particularly handy to businesses, consumers and patients alike, especially in this day and age of lockdowns. In order to reap all the benefits offered by these conversational assistants, it’s best to approach them with a compliance-by-design mindset, with a special focus afforded to cookie consent, data retention periods, special categories of personal data and the precautions inherent to automated decision-making.

If you have any questions regarding your chatbot, or more generally relating to your compliance with the GDPR, please do not hesitate to contact us!

Article by Leila Saidi @ Gerrish Legal, March 2021
