Clifford Chance

Regulatory Investigations and Financial Crime Insights

New investigation reveals risks of biometric data processing for law enforcement purposes

The regulatory woes of facial recognition firm Clearview AI deepened last week when data regulators in the UK and Australia (the ICO and OAIC) announced a joint investigation into its personal information handling practices, and signalled that they would liaise with other global data protection authorities.

Clearview AI has reportedly amassed a database of as many as 3 billion images scraped from popular social media sites such as Facebook, Twitter and LinkedIn. The firm's service allows users to upload a photograph of a person; its facial recognition AI then matches the face against the database and returns links to the web pages where that individual appears online. Clearview asserts that it sources its data from publicly available websites. It markets its software tools to law enforcement and security professionals, and the service is reportedly used by 600 law enforcement agencies across the globe.
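
At a technical level, matching services of this kind generally work by converting each face into a fixed-length numerical "embedding" and comparing embeddings rather than raw pixels. The Python sketch below illustrates only that generic matching step: the embed_face function is a hypothetical stand-in for a real embedding model, and nothing here reflects Clearview's actual implementation, the details of which are not public.

import numpy as np

# Hypothetical stand-in for a trained face-embedding model: it maps an
# image identifier to a deterministic, L2-normalised vector so that the
# example runs end-to-end without any model weights.
def embed_face(image_id: str, dim: int = 128) -> np.ndarray:
    seed = abs(hash(image_id)) % (2**32)
    v = np.random.default_rng(seed).standard_normal(dim)
    return v / np.linalg.norm(v)

# A toy "database" of scraped images, keyed by source URL.
database = {url: embed_face(url) for url in [
    "https://example.com/profiles/alice.jpg",
    "https://example.com/profiles/bob.jpg",
    "https://example.com/profiles/carol.jpg",
]}

def search(probe_id: str, threshold: float = 0.9) -> list[tuple[str, float]]:
    # Cosine similarity between the probe embedding and each stored
    # embedding; because the vectors are unit-length, this reduces to
    # a dot product.
    probe = embed_face(probe_id)
    hits = ((url, float(probe @ emb)) for url, emb in database.items())
    return sorted((h for h in hits if h[1] >= threshold), key=lambda h: -h[1])

# With a real model, a new photo of the same person would land near the
# stored embedding; with this fake embedder, only an identical image id
# produces a match.
print(search("https://example.com/profiles/alice.jpg"))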

Facial recognition software has been an area of recent focus for large tech firms: IBM has announced that it will stop offering such software for "mass surveillance", while Microsoft and Amazon have both placed moratoria on providing their software to law enforcement agencies in the U.S.

The ICO and OAIC investigation follows on the heels of an investigation launched by Canadian data protection authorities in February 2020, which led to Clearview recently withdrawing its services from the Canadian market. The European Data Protection Board (EDPB), responding to a query from Members of the European Parliament, opined last month that law enforcement use of Clearview AI would "likely not be consistent with the EU data protection regime". The EDPB emphasised the need for appropriate safeguards for the rights of data subjects, especially when automated processing is applied to biometric data. It based its conclusion on the fact that using Clearview AI's services would entail European law enforcement agencies sharing personal data outside the EU for the purpose of biometric matching against Clearview AI's "mass and arbitrarily populated database of photographs and facial pictures accessible online".

These investigations highlight a number of issues regarding the use of automated facial recognition technology.

Law enforcement use falls outside the GDPR

Processing of personal data for law enforcement purposes falls outside the scope of the GDPR and is instead governed by the Law Enforcement Directive (Directive 2016/680) (LED). In the UK, the LED was incorporated into domestic law by Part 3 of the Data Protection Act 2018 (DPA). Part 3 of the DPA applies not only to the police but also to processors carrying out a law enforcement function on their behalf. It imposes strict conditions on the processing of biometric data, which includes the use of facial recognition software: to carry out such processing, law enforcement must be able to demonstrate either the data subject's consent or that it satisfies one of the conditions listed in Part 3, which include showing that the processing is necessary for reasons of substantial public interest.

Part 3 of the DPA provides fewer rights to data subjects than Part 2, which governs general processing in line with the GDPR, but Part 3 does include: the right to be informed; the right of access; the right to rectification; the right to erasure or to restriction of processing; and the right not to be subject to automated decision-making. These rights apply even where the personal information has been scraped from publicly available sources such as open social media profiles.

As data scraping generally gives the data subject no opportunity to consent, data regulators will likely be assessing whether Clearview's policies comply with the requirements of the DPA with regard to the processing of biometric data.

A UK court recently considered a similar issue in Bridges v SWP, holding that South Wales Police's use of live automated facial recognition software in public spaces was lawful because it was used for a legitimate aim, struck a fair balance between the rights of the individual and those of the community, and was not disproportionate. In October 2019, the ICO also issued an opinion setting out its concerns regarding law enforcement use of live automated facial recognition software, and in 2019 it launched an investigation into the use of such software at King's Cross Station. The resolution of that investigation should clarify the ICO's position on the use of this technology.

IP Interests

In addition to data protection concerns, scraping photographs, even from publicly available sources, may violate IP interests in the photographs as well as the terms and conditions of social media websites, which often restrict the commercial use of data obtained from them. Indeed, a host of tech firms, including Facebook, Google and Twitter, have sent cease-and-desist letters to Clearview AI asserting that scraping data from their websites violates their terms of use.

Conclusion

These investigations highlight the increasingly multi-jurisdictional nature of data regulation, and their outcomes could have far-reaching consequences for businesses that rely on data scraping. They are a reminder of the importance of undertaking a proper data protection impact assessment for any new product or technology that seeks to monetise data, covering each territory in which the product will be marketed. Amongst other matters, such an assessment must consider the source of the data (and any contractual or non-contractual obligations that might be infringed by using it), consent, and the potential transfer of data across borders.
