Clearview technology in wartime: à la guerre comme à la guerre?

May 3, 2022

In March, the Ukrainian government began using Clearview's facial recognition technology. It is presumably being used to spot Russian operatives in Ukraine, as well as to identify Ukrainian victims of war and fallen Russian soldiers. Some may say this is right – all is fair in love and war. But is it?


The US-based Clearview AI collects photos publicly available on the Internet, compiles them into a database (currently about 10 billion images) and allows its users – law enforcement agencies – to run image-based search requests. The requests are processed by what Clearview claims is a highly effective AI, allowing identification even in cases of facial damage.
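To make the mechanics concrete, here is a minimal sketch of how an image-based search over a face-embedding database typically works: faces are mapped to fixed-length vectors, and a query image is matched by vector similarity. Everything below – the 512-dimensional embeddings, the cosine-similarity matching, the random stand-in data – is a generic illustration of the technique, not Clearview's actual implementation.

```python
# Conceptual sketch of image-based biometric search: each indexed photo
# is represented by a fixed-length embedding vector, and a query photo
# is matched against the database by vector similarity.
# Illustrative only - the embeddings here are random stand-ins for the
# output of a real face-recognition model.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "database": one 512-dim unit vector per indexed photo.
database = rng.normal(size=(10_000, 512))
database /= np.linalg.norm(database, axis=1, keepdims=True)

def search(query: np.ndarray, top_k: int = 5) -> list[tuple[int, float]]:
    """Return indices and cosine similarities of the closest matches."""
    q = query / np.linalg.norm(query)
    scores = database @ q                      # cosine similarity per photo
    best = np.argsort(scores)[::-1][:top_k]    # highest-scoring candidates
    return [(int(i), float(scores[i])) for i in best]

# A "probe" embedding; in reality this would come from a face model
# applied to the uploaded image.
probe = rng.normal(size=512)
for idx, score in search(probe):
    print(f"photo #{idx}: similarity {score:.3f}")
```

The point of the sketch is that identification is probabilistic: a "match" is simply the closest vector above some threshold, which is precisely where false positives creep in.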

The database has already been used, among others, by Dutch and Belgian law enforcement, triggering a nationwide outcry in both cases. No wonder: from a data protection perspective, the company's activities are highly controversial.


First and foremost, under the GDPR, photographs processed for identification purposes constitute biometric data[1]. Biometric data fall under the special categories of data, whose processing is, in principle, prohibited[2]. Clearview states that it only collects public data. Yet individuals clearly do not manifestly make their photos public for the purpose of biometric identification, so the relevant exception[3] does not apply. Moreover, users of social networks – even those with open profiles – never explicitly consented[4] to their data being collected by a third party, sold to law enforcement, and used for intrusive searches with potentially adverse consequences.


Things are no better with retention periods. In its privacy policy, Clearview states it will store data ‘only […] for as long as it is necessary for the purposes set out in this Privacy Policy’. Yet the very purpose of its business is to sell access to as many law enforcement agencies as possible – and that can only be accomplished if the database of images keeps expanding. No wonder the retention period isn't specified: in practice, it's indefinite.


What about data subjects' rights? Notably, the privacy policy explicitly grants them only to California residents under the California Consumer Privacy Act (CCPA). The rights of other data subjects remain unclear – and even for Californians they are limited. First, pursuant to the CCPA, access is provided only to data collected in the 12 months preceding the request[5]. In addition, individuals cannot submit more than two requests per year[6]. And strikingly, to process an access or deletion request against its database of photos, Clearview requires – you guessed it – a photo.

Large-scale collection of biometric data without consent, virtually uncapped retention and unclear or limited data subjects' rights are worrying enough in peacetime. Using Clearview's technology in war is even more alarming.


When the living are identified, a false positive may lead to a wrongful arrest in ordinary circumstances – in wartime, it may result in death. Ukrainian officials are receiving training from Clearview on working with its database. Yet it remains an open question whether training can mitigate the anger and stress of war – and the decisions driven by them.
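To see why false positives matter at this scale, here is a back-of-the-envelope calculation. The database size is Clearview's own public figure; the false match rate is a purely hypothetical assumption made for the sake of the argument, not a measured property of Clearview's system.

```python
# Illustrative base-rate arithmetic with assumed numbers: even a very
# accurate matcher produces many false positives against a huge database.
database_size = 10_000_000_000    # ~10 billion indexed photos (Clearview's figure)
false_match_rate = 1e-6           # assumed chance a random photo wrongly matches

# In a one-against-many search, every indexed photo is a chance for a
# false match, so the expected count scales with the database size.
expected_false_matches = database_size * false_match_rate
print(f"Expected false matches per query: {expected_false_matches:,.0f}")
# -> 10,000 potential misidentifications from a single search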


As for identifying the deceased: sending photos of fallen Russian soldiers to their relatives, even when there is no mistake, crosses an ethical line. The fact that Russia started a war of unspeakable brutality shouldn't make us forget where that line is – and I am saying this as a person who had to flee that war.


So, à la guerre comme à la guerre? I suggest Ukraine instead recall that the ends do not justify the means – and reject Clearview.


[1] GDPR, Recital 51.
[2] GDPR, Article 9(1).
[3] GDPR, Article 9(2)(e).
[4] GDPR, Article 9(2)(a).
[5] CCPA, Section 1798.130.
[6] CCPA, Section 1798.100.
