
Behind the Face: Unmasking GDPR's Efficacy in Regulating Facial Recognition Technology in Law Enforcement.

Author: Liza Mozgunova, Qualified Solicitor of England and Wales, LLM

Facial Recognition Technology (FRT) has become pervasive in various applications, ranging from unlocking smartphones to law enforcement use. This article critically analyses the application of FRT in law enforcement within the framework of the General Data Protection Regulation (GDPR). It scrutinises the quality and accuracy of FRT data, addressing challenges related to hardware, software, and data biases. GDPR's principles of accuracy and data minimisation are examined in the context of FRT, emphasising the importance of balancing security and privacy. The specified, explicit, and legitimate purpose of FRT in law enforcement is discussed, highlighting potential risks of misuse and the need for careful consideration of purpose limitations. Additionally, storage limitations and the balance between data quality improvement and privacy concerns are explored. Data Protection Impact Assessments (DPIAs) are deemed crucial for FRT implementation, with case analyses demonstrating their significance in ensuring lawful and ethical use. The article concludes that GDPR provides adequate safeguards against FRT misuse by law enforcement, emphasising the importance of transparency and adherence to best practices. The EU draft AI Act's provisions on real-time biometric identification systems are discussed, offering additional layers of protection. However, challenges arise from the complexity of FRT techniques, the diverse actors involved, and contextual nuances. The article acknowledges civil society's role in advocating for fundamental rights and individuals' avenues for legal recourse under GDPR. In summary, while FRT presents potential risks, GDPR, along with supplementary legal instruments, offers a robust regulatory framework to safeguard against misuse in law enforcement contexts. Ongoing developments, such as the EU draft AI Act, contribute to the evolving landscape of facial recognition technology regulation.

Categories

Business Category
Public Services
Technical Category
Computer vision

Facial Recognition Technology (FRT) refers to algorithmic systems and the associated hardware that can analyse an individual’s face to make a claim of identity. FRT can be used to verify someone’s identity, for example to secure a login, or to identify an individual, for example someone captured in footage reviewed by law enforcement. There are two main types of FRT: facial verification and facial identification. Facial verification is one-to-one matching, whereas facial identification is one-to-many matching. In addition to facial verification and identification, there are systems that can classify people and draw inferences about their characteristics.

FRT looks for patterns in facial measurements to construct a ‘template’, and then compares the similarity between two templates. It is a statistical system in the sense that it estimates similarity with some degree of error. FRT is probabilistic software (and associated hardware), meaning a category of system that incorporates principles of probability and can therefore be inaccurate in its functioning.

FRT can be applied live or retrospectively, and it can be fully automated or used to aid humans. The application of FRT involves the collection, generation, processing and storage of data. Facial data is uniquely connected to a person’s appearance, and hence the identity of that person can be established from it. Two attributes are important: permanence and inseparability. Permanence relates to features that are relatively stable over time, whereas inseparability relates to facial features that cannot be dissociated from a person unless significant medical intervention occurs.
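To make the template-matching idea concrete, the following is a minimal illustrative sketch in Python. It assumes, purely for illustration, that a template is a fixed-length numeric feature vector and that similarity is measured with cosine similarity against a hypothetical threshold of 0.80; real FRT systems use far more sophisticated models, and none of the names or values below come from any particular product.

```python
# Minimal illustrative sketch of one-to-one verification and one-to-many identification.
# Assumption: a "template" is a fixed-length numeric feature vector produced by some
# face-analysis model; vectors and the 0.80 threshold are hypothetical.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Return the cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(template_a: list[float], template_b: list[float], threshold: float = 0.80) -> bool:
    """One-to-one verification: do two templates likely belong to the same face?

    The decision is probabilistic: scores near the threshold can yield
    false positives or false negatives.
    """
    return cosine_similarity(template_a, template_b) >= threshold

def identify(probe: list[float], watch_list: dict[str, list[float]], threshold: float = 0.80) -> str | None:
    """One-to-many identification: return the best-matching enrolled identity, if any."""
    best_id, best_score = None, threshold
    for identity, enrolled in watch_list.items():
        score = cosine_similarity(probe, enrolled)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id
```

The sketch makes visible the point made above: the output is a similarity estimate, not a certainty, and the binary result depends entirely on the threshold the deployer chooses.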

FRT has generated various commercial applications, such as unlocking a smartphone, but its application by law enforcement is particularly contentious. Although law enforcement has broad authority to detect and prevent crime, it must only process biometric data when doing so is strictly necessary and proportionate to achieve law enforcement purposes. This article provides a critical analysis of whether the General Data Protection Regulation sufficiently safeguards against the misuse of facial recognition technology by law enforcement.

Part I: Qualification of personal data as biometric data and distinctions in the operation of Facial Recognition Technology.

For personal data to be biometric data and fall under Article 9 GDPR, three specific criteria outlined in Article 4(14) GDPR must be satisfied. This requires a case-by-case assessment by the data controller to establish that the three conditions are met. The first condition requires the data to relate to the physical, physiological or behavioural characteristics of a natural person. The second condition requires the personal data to result from specific technical processing. The third condition requires the personal data to allow or confirm the unique identification of that natural person. The processing of personal data will not be subject to Article 9 if one of the three criteria is not met. It will not be considered special category data and will instead be subject to the GDPR requirements for the processing of non-special categories of personal data, which can be undertaken in accordance with the six legal bases set out in Article 6(1), consent being only one of them. If biometric or other data is not linked to an identifiable or identified natural person, it might not fall under GDPR at all. An example of an instance where data is not processed to uniquely identify a natural person is discussed below. It is important to note that a face template may not qualify as personal data if there is no reasonable and lawful way for the data controller to identify the individual from the template. This may occur if the data controller is not in possession of the original photograph or if the template is of poor quality or very basic. Article 9 GDPR should only apply if the technology is used to uniquely identify a natural person; for example, if video footage of a person is processed and generates biometric data, that data can then be used to identify the person uniquely.

It is important to acknowledge a few distinctions in the use of FRT. One purpose of FRT is to uniquely identify a natural person; another is to classify a natural person based on physical traits only, to which Article 9 would not apply. But simply matching one template against another should arguably not fall under the provisions of Article 9 GDPR, as it is not deemed to be a technique to uniquely identify a natural person.

The fact that FRT is used to detect that an individual was present at a particular location, or was detected in different places, does not necessarily imply that the data controller intends to uniquely identify that individual. FRT can detect that a person was present at different locations, but the act of detection alone does not mean that the data controller automatically has the purpose of uniquely identifying that person. The crucial question is whether the processing of templates has the objective of uniquely identifying an individual. When the data controller matches ‘person 1’ against ‘person 2’, the same reasoning should apply: the act of differentiating two templates does not imply that the purpose is to identify a specific person, because face templates do not necessarily make it possible to identify a particular person. For a face template to be considered biometric data, it must meet the condition of being used to allow or confirm the unique identification of that natural person. This indicates that the data controller must possess other identifying data associated with the facial template, which is used to match against the newly acquired facial template in order to uniquely identify the person.

The crucial argument is that unique identification takes place when a newly obtained template is matched against a pre-existing template that carries metadata (in the sense of data about data, or associated data). Without the pre-existing template and its associated identifying data, it is not possible to uniquely identify an individual using only a new face template. As a consequence, it is argued that a face template processed without associated identifying data cannot be considered personal data and as a result does not fall under Article 4(14) GDPR. Hence a face template that is used merely to match faces is not applied for the purpose of allowing or confirming the unique identification of a natural person. A distinction needs to be acknowledged between facial recognition proper, face characterisation and cases of a unique persistent identifier. Furthermore, facial recognition forms part of computer vision; these are related concepts, but again there is a distinction. Computer vision is a field of computer science concerned with interpreting visual data, whereas FRT is a subfield of computer vision specifically focused on the identification and verification of individuals based on their faces.

Facial recognition needs to meet the requirements in Article 4(14), whereas a unique persistent identifier can lead to the recognition of an individual’s identity only when it is linked to other unique identifying information. The application of Article 9 to FRT becomes relevant only when these intermediary templates are associated with another identifier and are employed to uniquely identify a natural person, whereas the face characterisation technique might not fall under the definition of personal data at all. Consequently, Article 9 GDPR should not apply to intermediate templates that do not entail the unique identification of a natural person. Notably, in most circumstances only the facial data of natural persons that was marked as a match is kept after the initial facial scan. This reasoning is essential when considering whether a misuse of FRT by law enforcement has occurred and whether the case falls under the governance of GDPR.

There is an argument that the real-time creation of biometric templates by law enforcement occurs without a lawful basis under Article 9 GDPR. As a starting point, the term ‘biometric template’ needs to be defined, and an FRT system does not necessarily create biometric templates simply because it captures the faces of people live. Two types of biometric templates need to be differentiated. The first type is created during enrolment in order to uniquely identify a natural person. The second type is an intermediary template generated during matching to check for matches or non-matches. Suppose law enforcement’s objective is to identify a specific person (Person 1) and it holds a biometric face template with associated identifying information. Law enforcement is not using the FRT system to uniquely identify everyone captured by the camera through intermediate templates; its focus is on identifying Person 1. Again, the mere fact that the FRT system captures the faces of people passing by does not imply that it generates biometric templates.
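A small, purely hypothetical sketch of this distinction follows. The class and field names are illustrative assumptions rather than any standard or real system’s data model; the point is only that the enrolment template is linked to identifying metadata, while the intermediate template is a bare feature vector that is discarded once a score has been produced.

```python
# Hypothetical sketch of the two template types discussed above; all names, fields
# and the 0.80 threshold are illustrative assumptions, not a real system's design.
import math
from dataclasses import dataclass

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

@dataclass
class EnrolledTemplate:
    """Created at enrolment: the vector is linked to identifying metadata and can
    therefore be used to uniquely identify a natural person (e.g. Person 1)."""
    vector: list[float]
    subject_id: str        # e.g. a case or watch-list reference (hypothetical field)
    source_image_ref: str  # reference to the original photograph (hypothetical field)

@dataclass
class IntermediateTemplate:
    """Created live from a camera frame solely to compute a match/no-match score;
    it carries no identifying metadata and is discarded after scoring."""
    vector: list[float]

def screen_frame(frame: IntermediateTemplate,
                 person_of_interest: EnrolledTemplate,
                 threshold: float = 0.80) -> bool:
    """Compare an intermediate template against the single enrolled target;
    a non-match means the intermediate template is simply deleted."""
    return similarity(frame.vector, person_of_interest.vector) >= threshold
```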

Article 4(14) emphasises that for data to be categorised as biometric, specific technical processing needs to allow or confirm the unique identification of that natural person. When there is a non-match, the processing does not extend to the unique identification of the individual: no identification is confirmed, so the data does not meet the definition of biometric data and the consent requirement stipulated in Article 9 GDPR does not apply. Furthermore, it would be difficult, and from an operational point of view perhaps impossible, to obtain informed consent from every person passing by. This creates the need to balance security and privacy when analysing FRT in a law enforcement context.

Part II: Quality of Facial Recognition Technology and Accuracy of Data.

Article 5(1)(d) of GDPR provides that personal data shall be accurate and, where necessary, kept up to date. The judgment in Peter Nowak v Data Protection Commissioner highlighted that the assessment of whether personal data is accurate and complete must be made in the light of the purpose for which that data was collected. It is uncertain whether law enforcement applying FRT will be able to comply with this premise. The principle mandates that input data is accurate and that the data sets used to train algorithms are unbiased and representative. Accuracy of personal data must be considered broadly and should go beyond simple correctness of, for example, a recorded age. Data quality and accuracy are related concepts. Law enforcement data controllers and processors must examine the image quality and biometric templates contained in the Watch List in order to avoid false matches, because poor-quality images increase inaccuracy. FRT relies on precise algorithms to analyse and match facial features. However, people naturally possess a wide range of characteristics, such as different gender presentations, ages and skin tones, and these variations can lead to inaccuracies in FRT’s ability to consistently identify and match faces. A significant challenge for FRT developers is to implement robust testing procedures that reflect this diversity. The algorithms of real-time FRT systems may mask biases in their programming and lead to detection errors, although this is not necessarily the result of law enforcement misconduct or misuse of FRT. Ideal compliance with the accuracy requirement by law enforcement is improbable because no FRT system is perfectly accurate. The requirement to adhere to the accuracy principle leads to a continuous accumulation of data to train the algorithm, which may still have defects.

Furthermore, Article 5(1)(c) of GDPR sets out the principle of data minimisation: the processing of personal data must be limited to what is necessary in relation to the purposes for which the data are processed and must not be excessive. It emphasises that the data collected should not be overly abundant and that processing should be restricted to what is necessary for the purpose. There are challenges in complying with this requirement because of functional limitations inherent in the technology being used.

The results of FRT are presented in terms of a similarity percentage because the technology is not entirely precise; it is based on statistical estimates. When FRT is used to provide binary decisions, such as determining whether two images match, it is the responsibility of the deployer to define a suitable similarity threshold. The threshold differs depending on the context. For example, a social media platform will apply a lower threshold because false positives can be tolerated, whereas in a law enforcement context, where FRT may be deployed to locate a person of interest in a public space, a higher threshold will be applied to ensure that matches are accurate, meaning there is theoretically a chance of missing a potential match as a result of the higher threshold.
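A minimal sketch of this context-dependent thresholding is shown below; the numeric thresholds are hypothetical assumptions chosen only to illustrate the trade-off, not recommended or real-world operating points.

```python
# Hypothetical sketch: turning a probabilistic similarity score into a binary
# decision with a context-dependent threshold. Values are illustrative only.
CONTEXT_THRESHOLDS = {
    "social_media_tagging": 0.60,        # false positives are tolerable
    "law_enforcement_watch_list": 0.90,  # false positives must be minimised,
                                         # at the cost of possibly missing a true match
}

def decide_match(similarity_score: float, context: str) -> bool:
    """Apply the deployer's chosen threshold for the given context."""
    return similarity_score >= CONTEXT_THRESHOLDS[context]

# The same score of 0.85 counts as a match for tagging but not for a watch list.
assert decide_match(0.85, "social_media_tagging") is True
assert decide_match(0.85, "law_enforcement_watch_list") is False
```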

In reality, the output generated by the comparison is not truly binary, because it is based on probability. The generation of a biometric template is affected by various factors such as light, image resolution, the position of the subject, weather conditions and camera angle. For example, when FRT is used in a controlled environment with reasonably good lighting and a high-resolution image, it can provide higher accuracy. This highlights the argument that performance and output can vary depending on the circumstances in which FRT is used and the conditions under which the biometric template was generated, as well as on the quality of the hardware devices. The accuracy of the source code and the configuration of the software are of utmost importance too, and depending on these parameters the rates of false negatives and false positives may vary. In summary, there are four paramount elements in the effective operation of an FRT system: (1) the quality of the hardware, (2) the quality of the software, (3) the level of the acceptable threshold and (4) the accuracy of the data.

Part III: Specified, Explicit and Legitimate Purpose and Storage Limitation

Article 5(1)(b) of GDPR and Article 4(1)(b) of the Directive on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, commonly referred to as the LED (Law Enforcement Directive), both provide that personal data must be collected for specified, explicit and legitimate purposes and not processed in a manner that is incompatible with those purposes. This requirement is mirrored in Article 8(2) of the Charter of Fundamental Rights of the European Union. The principle of purpose limitation is one of the central principles of European data protection law. The European Union Agency for Fundamental Rights (FRA) argues, following the case Promusicae v Telefonica de Espana SAU, that the data subject should be able to foresee the purpose for which their data will be processed.

The use of FRT by law enforcement can serve legitimate purposes in cases related to terrorism or cases where there is an immediate threat to public security or safety. Such cases commonly form the underlying rationale of the purpose limitation requirement as set out in data protection law that allows law enforcement actors to gain access to large-scale EU databases. When developing a computer system, including FRT, to prevent serious crime and terrorism and improve public safety, there is indeed a risk of misuse. This means that personal data, such as facial images in certain contexts, may be used for unintended purposes. The risk breaks down into three conditions: misuse can occur if (1) the quality of the hardware is poor, (2) the quality of the software is poor, or (3) misconduct or error by law enforcement agents takes place. Needless to say, when it comes to the interoperability of large-scale EU databases, security is paramount. Security measures need to prevent the FRT system from being extended to access and operate with the large-scale EU databases in the absence of a legitimate purpose. Moreover, the application of live FRT systems in an uncontrolled environment for the purpose of uniquely identifying a natural person must adhere to the principles of necessity and proportionality.

Article 5(1)(e) of GDPR mandates that personal data shall be kept in a form which permits the identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed. A notable concern is the risk that the ongoing expansion of facial image databases leads to their utilisation without a lawful basis. The European Court of Human Rights stressed in M.K. v France the paramount significance of the protection of personal data as a crucial aspect of an individual’s right to private life. The European Data Protection Board (EDPB) provides guidelines stating that real-time intermediate templates, generated for match or non-match comparison against the templates created from people’s data during enrolment, should be promptly deleted once a probability score has been obtained; they should not be stored or archived beyond their immediate use. This paper argues that intermediate templates created for match or non-match comparison represent a string of digits with no link to a specific identity. Hence they are not characterised as special category data, because this data alone cannot uniquely identify a natural person, and the processing is arguably not the processing of personal data at all. Subsequently, the requirements in Article 5(1)(e) of GDPR should apply only to face templates that allow the unique identification of a natural person. There should be a more careful consideration of the balance between the storage of this data and the advantage of keeping it. Storage of these templates allows technology developers to enhance the accuracy and correctness of FRT systems. If storage of such data is strictly prohibited, it might have long-term implications such as continued inaccuracy, bias and false positive matches, which harm individuals and could otherwise be prevented. The balance between the benefits of improved quality and the potential risks of storing such data needs to be considered. To improve the quality of FRT it would be beneficial to dedicate resources to research and the improvement of FRT systems. Any stored templates must be kept secure. Additionally, the data subject can exercise their right to erasure under Article 17 of GDPR.

Part IV: The importance of Data Protection Impact Assessment and relevant cases.

Article 25(1) of GDPR provides that data needs to be protected by design and by default. Further, the use of FRT by law enforcement is subject to the requirements of a Data Protection Impact Assessment (DPIA), and the data controller must consult the supervisory authority prior to processing where a DPIA under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk. The DPIA is a crucial instrument for determining the risks and the lawfulness of the utilisation of an FRT system. In the UK case Bridges v South Wales Police, the Court of Appeal concluded that the DPIA in that case was sufficient; the issue, however, was whether South Wales Police had complied with the fundamental principle of lawfulness. The Court concluded that the processing of the claimant’s facial image by the FRT system amounted to the processing of personal data and entailed biometric data processing, because the data captured by the FRT system singled the claimant out from all others. The argument in Part I of this paper contradicts the decision of the Court. Nevertheless, this paper maintains that the reasoning in Part I most accurately captures the meaning of the relevant provisions of GDPR concerning personal data, biometric data and the processing of special categories of data. There are important distinctions between FRT techniques and the access to, and availability of, the associated data which leads to the unique identification of a natural person.

In the US, Axon Enterprise, a technology developer, decided not to deploy an FRT system in law enforcement agents’ vest cameras as a result of a DPIA conducted at the research stage of the product. The UK Information Commissioner’s Office and the London Policing Ethics Panel conducted surveys which revealed that most people do not object to the use of FRT systems by law enforcement agencies for criminal investigations, as opposed to FRT systems used for commercial purposes.

Concluding observations

The analysis in this paper concludes that the risk of misuse of facial recognition technology by law enforcement can arise if one of these conditions is present: (1) poor quality of hardware, (2) poor quality of software, (3) inaccurate data, (4) an inappropriate match threshold, or (5) misconduct or error by a law enforcement actor. The first condition relates to the quality of electronic devices and is therefore not governed by GDPR. The second condition relates to programming and computer science and is not directly governed by GDPR, although it affects the accuracy of data. The third condition relates to the accuracy of data input and data processing and is governed by GDPR. The fourth condition relates to the discretion of the actor in setting the threshold and is governed by GDPR. Finally, the fifth condition relates to the code of conduct and best practices of law enforcement agents, where compliance might be strengthened through national legislation in the EU; it is to a certain degree governed by GDPR and complemented by codes of conduct and the LED, which is a targeted law addressed to competent authorities. Those conditions that fall under GDPR are sufficiently safeguarded. The issue is not a lack of specific law but the interdependency of contexts and the variety of actors involved in the lifecycle of an FRT system. The law faces a challenge in reflecting (1) the complexity and distinctions of FRT techniques, (2) the complex web of actors in the FRT lifecycle, (3) the enormous amount of data and (4) the context of the application of the technology.

In conclusion, GDPR provides sufficient safeguards against the misuse of facial recognition technology by law enforcement. GDPR, being the key legislative instrument in the data protection framework, is complemented by the EU directive on law enforcement and the processing of personal data (LED) and by the guidelines and opinions of regulatory and governing bodies overseeing data protection; in the UK, GDPR is complemented by the analogous law enforcement data protection regime as well as the Surveillance Camera Code of Practice and human rights law more broadly. Civil society organisations advocate for the protection of fundamental rights, the right to data protection being one of them. Typically, they assist individuals in representative actions and in accessing redress mechanisms in disputes concerning violations of fundamental rights. Individuals can exercise their right to an effective judicial remedy against a controller or processor under Article 79 of GDPR and the right to compensation under Article 82 of GDPR. These legal instruments in combination demonstrate that there is adequate protection and a robust regulatory system in place. Concerning law enforcement conduct specifically, it is expected that agencies are appropriately transparent and follow best practices.

It is important to recognise that the processing of personal data using an FRT system can occur on bases other than consent. Intermediary templates generated for match or non-match comparison are not characterised as special categories of data. Further, it is important to note the distinction between FRT techniques, whether the technique is facial recognition, face characterisation or a unique persistent identifier, and to establish whether the data leads to uniquely identifying a natural person. An FRT system can operate without any intervention by the data subject, unlike, say, a ‘stop and search’ procedure; however, GDPR and other legal tools prevent the arbitrary use of this technology. Finally, the EU draft AI Act prohibits the use of ‘real-time’ biometric identification systems in publicly accessible spaces for the purposes of law enforcement. There are, however, three exceptions to this prohibition, where the risks to fundamental rights are outweighed by a substantial public interest. Recital 21 of the draft AI Act provides that each use of a ‘real-time’ remote biometric identification system in publicly accessible spaces for the purpose of law enforcement should be subject to an express and specific authorisation by a judicial authority or by an independent administrative authority of a Member State.

Interestingly, there is a striking example of misuse of an FRT system. An FRT system is used in a service called ‘GlazBoga’, which translates into English, rather dramatically, as ‘the eye of God’. The service appears to operate in the Russian Federation and is run by a private actor, not law enforcement. However, the information on the service’s website states that it has obtained data from a national law enforcement agency and, based on processing these data sets, provides facial recognition services and the disclosure of various data associated with a natural person for a fee. This is an extreme case of misuse of facial recognition technology, among other concerns, and it does not appear to fall under GDPR jurisdiction, as the service operates in the Russian Federation. However, GDPR would apply if GlazBoga processed personal data of individuals in the EU, which is theoretically possible.