Fashion & AI: the privacy of virtual fitting rooms

The retail fashion industry is once again embracing artificial intelligence, but this time it is bringing the technology to its fitting rooms. Artificial intelligence, along with virtual and augmented reality, is being harnessed to tackle the unprecedented challenge of contactless stores created by the ongoing pandemic.

‘Fit technology’ is by no means a new area: companies have been trying to perfect the art of online fittings for years, mainly to enhance the e-commerce experience. However, with safety measures rolled out in most brick-and-mortar stores around the globe, this technology is now being looked to as a rapid replacement for the fitting room experience that was once the main selling point of shopping in person.

Such technology is not without privacy concerns. Here at Gerrish Legal, we have a lot of experience advising companies working with AI and augmented reality technology and wanted to share our insights into reconciling cutting-edge algorithms with the right to privacy. 

What is the problem that AI-fitting rooms can solve?

Whilst most fitting rooms in fashion retail stores are closed due to the risk of creating contamination sites, recent studies show that even if such amenities were available, consumers no longer feel comfortable using them.

According to a white paper by First Insight, more than half of all women and men in the US no longer feel safe trying on clothes in fitting rooms, at 65% and 54% respectively, with a staggering 78% of women no longer feeling safe when testing beauty products.

This means that the fashion and beauty retail industry could take a significant hit: consumers will either refrain from buying items because of uncertainty over fit, or will buy products to try on in the comfort of their own homes, which could push up the rate of returns these retailers see. According to Vogue Business, fit is said to be the leading reason for fashion e-commerce returns, which explains the accelerated shift towards tech-driven fitting rooms.

How does the technology work? 

For such virtual fit technologies to work, precise measurements and personal preferences are required, which means handing over sensitive data in the hope that the pair of jeans you have been eyeing up all week might actually fit.

For example, one leading company in this space uses artificial intelligence: customers provide data on their height, weight and fit preferences, while 3D cameras capture 150 data points on their body in the space of 10 seconds, all through an app on their mobile phone.

The algorithm then combines this information with the company’s existing database of items, styles and sizes to recommend products from its brand partners.
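
To make this concrete, here is a minimal, hypothetical sketch in Python of how a measurement-based size recommendation might work. The size chart, thresholds and field names are invented for illustration and are not taken from any particular vendor’s system.

```python
# Hypothetical sketch of a size-recommendation step, assuming simple
# measurement-based matching. All values and names are illustrative only.

from dataclasses import dataclass

@dataclass
class BodyProfile:
    height_cm: float
    weight_kg: float
    chest_cm: float          # one of the many points a 3D scan might capture
    waist_cm: float
    fit_preference: str      # e.g. "slim", "regular", "relaxed"

# Illustrative size chart for a single garment from a brand partner
SIZE_CHART = {
    "S": {"chest_cm": (86, 94),  "waist_cm": (71, 79)},
    "M": {"chest_cm": (94, 102), "waist_cm": (79, 87)},
    "L": {"chest_cm": (102, 110), "waist_cm": (87, 95)},
}

def recommend_size(profile: BodyProfile) -> str | None:
    """Return the first size whose ranges cover the shopper's measurements."""
    # A relaxed fit preference nudges the shopper up a size, slim down a size.
    adjustment = {"slim": -2.0, "regular": 0.0, "relaxed": 2.0}[profile.fit_preference]
    chest = profile.chest_cm + adjustment
    waist = profile.waist_cm + adjustment
    for size, ranges in SIZE_CHART.items():
        lo_c, hi_c = ranges["chest_cm"]
        lo_w, hi_w = ranges["waist_cm"]
        if lo_c <= chest < hi_c and lo_w <= waist < hi_w:
            return size
    return None  # no confident match: better to say so than to guess

print(recommend_size(BodyProfile(178, 75, 98, 84, "regular")))  # -> "M"
```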

Another company takes the same data and plans to combine it with in-store QR codes, so that scanning a code sends personalised recommendations to your profile on its app. Similarly, 3D scanning technology is being developed to let tailors take measurements without coming into contact with their clients, and to create online fitting rooms where you can style clothes on yourself via your mobile or laptop.

What are the privacy concerns? 

In using such online fitting rooms, consumers hand over their personal data to the providers of the technology, as well as to the providers’ suppliers and brand partners (the retailers). The significant point is that this personal data is not limited to the usual contact details, such as name or email address, but will include sensitive biometric data.

Sensitive data is accorded extra protection under the General Data Protection Regulation (GDPR). Companies processing such data must rely on a lawful basis under Article 6 of the GDPR, which applies to the processing of all personal data. In addition, they will require a lawful basis under Article 9 of the GDPR, because the data they are processing is likely to constitute sensitive or special category personal data: body shape may equate to health data to the extent that it can indicate a physiological condition. For this kind of commercial processing of sensitive data, consent will most likely be the appropriate Article 9 ground, as the other bases offered by that provision are unlikely to apply.

However, even if, as a shopper, you do provide consent to such processing, you need to fully understand what you are consenting to. For example, could your sensitive data be used for e-marketing? Could it be sold to and used by AdTech companies? Will it be shared with only one retailer or with a consortium of retailers? Could it be fed into existing trend-forecasting AI tools, or used for machine learning and product development of the technology itself?

These questions should all be covered by the data controller’s privacy policy, which should be as transparent and accountable as possible. And since sizing can differ across brands and retailers, the process might require handing over such data to numerous data controllers and processors.

Furthermore, biometric data in the form of 3D scans of your face and body could also reveal sensitive data such as sex, racial or ethnic origins, and potential or existing health issues. This is significant as data bias is a huge issue with AI and automated technology.

A famous example is Amazon’s now-discontinued AI recruitment algorithm, which favoured female candidates for HR roles but rejected them for more technical positions. The bias was built into the algorithm because the system was trained on the company’s internal hiring records from the previous decade.

The data used to train these algorithms will therefore be important. Arguably, automated decision-making by these algorithms should not produce significant legal or similar effects on data subjects as described in Article 22 of the GDPR: in the worst-case scenario, data subjects will simply be recommended products that do not actually fit.

Nonetheless, the processing of sensitive data must be proportionate to its purpose, and such companies must implement adequate technical and organisational measures to mitigate any risks posed to the rights of data subjects under the GDPR.

Such measures can include anonymisation of personal data, so that the data subject can no longer be identified, or pseudonymisation, whereby the data cannot be used to identify a data subject without additional information or a key that is kept separately and securely, for example through encryption of the data. However, such measures can be difficult to apply where biometric data is concerned, especially if facial images are being used in an algorithm.
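
As a rough illustration, the sketch below shows one way pseudonymisation might be applied to fitting-room scan data, using only Python’s standard library. The key, store structure and customer IDs are invented for illustration; in a real deployment the key would live in a dedicated, access-controlled key-management system.

```python
# A minimal sketch of pseudonymisation, assuming scan records are stored
# against a keyed pseudonym rather than a name or email address.

import hmac
import hashlib

# Illustrative only: in practice this key is kept separately from the data,
# in a key-management service, otherwise pseudonymisation is defeated.
SECRET_KEY = b"kept-separately-in-a-key-management-service"

def pseudonymise(customer_id: str) -> str:
    """Derive a stable pseudonym so scan data can be stored without direct identifiers."""
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

# The fitting-room dataset stores measurements against the pseudonym only.
scan_store = {
    pseudonymise("customer-42"): {"chest_cm": 98.0, "waist_cm": 84.0},
}

# Without SECRET_KEY the record cannot be linked back to customer-42;
# with it, the controller can re-identify the record to honour a rights request.
print(pseudonymise("customer-42") in scan_store)  # -> True
```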

Similarly, the retention of such data can become an issue if a data subject were to withdraw their consent to the processing. Data controllers must be able to extract and delete one individual’s data from a data set at any given time, which is easier said than done when it comes to machine learning technology.
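
The deletion side is simple to sketch when records are keyed by a pseudonymous ID, as below; the harder problem alluded to above, removing that individual’s influence from a model that has already been trained on their data, has no equally short answer. The function and store names here are hypothetical.

```python
# Hypothetical erasure routine for stored scan records, assuming they are
# keyed by a pseudonymous ID as in the sketch above. Illustrative only.

from typing import Dict

def erase_subject(scan_store: Dict[str, dict], pseudonym: str) -> int:
    """Remove every record held for one data subject; return how many were deleted."""
    removed = 0
    if pseudonym in scan_store:
        del scan_store[pseudonym]
        removed += 1
    return removed

store = {"a1b2c3": {"chest_cm": 98.0}, "d4e5f6": {"chest_cm": 91.0}}
print(erase_subject(store, "a1b2c3"))  # -> 1
print(store)                           # -> {'d4e5f6': {'chest_cm': 91.0}}
```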

Next steps

If companies rolling out such technology successfully implement privacy by design, remove data bias from their algorithms and put in place secure technical and organisational measures, this could be a massive opportunity for the industry, one that could revolutionise the way we experience retail fashion.

However, to get this right, privacy must be at the forefront of all stakeholders’ minds, including those of the data subjects who will be handing over such sensitive data.

If you have any questions relating to privacy, artificial intelligence, or automated technologies, please do not hesitate to contact us! Likewise, let us know what you think about the future of virtual fitting rooms!

Article by Komal Shemar @ Gerrish Legal, first published on TechGirl in August 2020 / Cover photo by Alexandra Gorn on Unsplash
