Face-off on Use of Biometric Technology in the UK

August 12, 2020

In one of the world’s first test cases on the legality of automated facial recognition and biometric technology, the English Court of Appeal handed down judgment in R (Bridges) v Chief Constable of South Wales Police on 11 August 2020. The court found that the South Wales Police Force’s use of this technology violated privacy, equality and data protection laws.

The facts

The appeal was brought against a decision of the English High Court handed down in September 2019. Mr. Bridges, the appellant, is a civil liberties campaigner and was supported by Liberty, a civil liberties membership organisation. Mr. Bridges challenged the legality of South Wales Police’s use of an automated facial recognition technology known as “AFR Locate” on two occasions and on an ongoing basis.

In the High Court, Mr. Bridges argued that the use of AFR Locate: (1) infringed his right to respect for private and family life under Article 8 of the European Convention on Human Rights; (2) breached requirements under the Data Protection Act 1998 and the Data Protection Act 2018 (the “DPA 2018”); and (3) involved a failure by South Wales Police to comply with the Public Sector Equality Duty (“PSED”) in section 149 of the Equality Act 2010, which required it to consider the potential for indirect discrimination on the grounds of sex or race, as the AFR Locate tool might produce a higher rate of false positive matches for female and/or minority ethnic individuals. The High Court dismissed the claim on all grounds and found that South Wales Police’s use of this automated facial recognition and biometric technology was lawful.

The appeal

Mr. Bridges sought and obtained permission to appeal the High Court’s ruling on five grounds. He argued that the High Court had erred in finding that: (1) the interference with his Article 8 rights by South Wales Police had been in accordance with the law; (2) the use of AFR Locate was a proportionate interference with those rights; (3) South Wales Police had conducted an adequate data protection impact assessment as required under the DPA 2018; (4) the Court did not need to reach a conclusion on whether South Wales Police had an ‘appropriate policy document’ in place as required under the DPA 2018; and (5) South Wales Police had complied with its obligations under the PSED.

The decision

The Court of Appeal allowed the appeal on Grounds 1, 3 and 5 above, and issued a declaratory judgment in favour of Mr. Bridges. A declaratory judgment states the parties’ rights or entitlement to relief but does not order a party to take any action or award any damages for violations of the law.

The Court declared that:

  • On Ground 1, the use of the live automated facial recognition technology on the two occasions in question and on an ongoing basis, which engaged Article 8(1) of the European Convention on Human Rights, was not in accordance with the law for the purposes of Article 8(2).
  • On Ground 3, and as a consequence of the finding on Ground 1, South Wales Police’s Data Protection Impact Assessment did not comply with section 64(3)(b) and (c) of the Data Protection Act 2018.
  • On Ground 5, South Wales Police did not comply with the Public Sector Equality Duty in section 149 of the Equality Act 2010, because it had not taken reasonable steps to investigate whether the algorithmic technology used by AFR Locate contained racial or gender bias.

The Court of Appeal rejected Mr. Bridges’ arguments on Grounds 2 and 4 and dismissed the appeal on those grounds.

Implications of the decision

Despite the unsuccessful outcome for South Wales Police, those hoping that the Court of Appeal’s decision will stop police trials of live facial recognition and biometric technologies, including AFR Locate, in their tracks may be disappointed. Speaking for South Wales Police, Chief Constable Matt Jukes welcomed the judicial assessment of the use of AFR Locate, stating: “I am confident this is a judgment that we can work with”. South Wales Police have committed to adjusting their policies in line with the judgment and to working alongside the Surveillance Camera Commissioner, the Home Office, the College of Policing and the National Police Chiefs’ Council to address the issues raised in the Bridges case, so that they can continue to make use of live facial recognition and biometric technologies.

However, the use of facial recognition and biometric technologies will continue to draw public interest both in the UK and elsewhere. Liberty have already called on the Government to ban the use of these technologies, with Megan Goulding, a lawyer for Liberty, commenting: “The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties … [i]t is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”

Companies that develop and offer facial recognition technology have been responding to these concerns. In June this year, IBM sent a letter to the United States Congress stating that it would no longer offer general-purpose facial recognition or analysis software, citing the potential for such technologies to result in human rights and privacy abuses, as well as to display racial and gender bias. In the same month, Amazon announced a one-year moratorium on police use of its Rekognition technology, stating that stronger government regulation of facial recognition technology is required.

These technologies are likely to remain popular with law enforcement, and improvements that increase accuracy and reduce inherent bias will only support their continued use. Nevertheless, parties interested in deploying facial recognition and biometric technologies face a difficult task in justifying their use when balancing the potential benefits against the risks identified by the Court of Appeal and the potential to violate data protection, equality, and human rights laws. Let’s face it: this is a discussion that seems bound to continue.