Facial Recognition - What is the impact of the Bridges case?

19th August 2020 Media Law

Last week, the Court of Appeal handed down its decision that the use of automated facial recognition technology by South Wales Police violated a complainant’s data protection rights. How ‘ground-breaking’ was this decision? 

Amy Smethurst discusses the verdict. 

What is the background?

Automated Facial Recognition (“AFR”) is software that automatically detects human faces in video footage and compares them with a database of facial images. South Wales Police (“SWP”) is the UK policing lead for AFR. “AFR Locate” is a pilot project run by SWP, which uses surveillance cameras to scan crowds in an attempt to identify people in real time. The digital images captured are processed and compared with a watchlist compiled by SWP for the purpose of the deployment.

Edward Bridges, a civil liberties campaigner, brought a claim against SWP over its use of the technology, on the basis that it breached his data protection and privacy rights. Bridges was in Cardiff on two occasions, in December 2017 and March 2018, when AFR had been deployed by SWP.

Mr Bridges’ claim was dismissed by the High Court, but he was granted permission to take his complaint to the Court of Appeal. The landmark ruling, handed down on 11 August 2020, saw the Court of Appeal find that SWP’s use of AFR breached privacy rights, data protection laws and equality laws.

The Judgment

Bridges was granted permission to appeal on five separate grounds. The Court of Appeal accepted three grounds of appeal and rejected two. The three grounds on which the appeal was allowed were:

First, the CA held that the use of AFR on the two occasions, and on an ongoing basis, engaged Article 8(1) of the European Convention on Human Rights, which guarantees the right to respect for private and family life. The interference with Article 8 was held not to be in accordance with the law because of an insufficient legal framework.

Second, the CA held that SWP’s Data Protection Impact Assessment did not comply with the Data Protection Act 2018. There was an insufficient legal framework for establishing who would be placed on the ‘watchlist’ and the areas where the technology may be deployed. The Data Protection Impact Assessment was also criticised for vesting too broad a discretion in the individual police officer to decide who should be on the watchlist.

Third, SWP was found to have breached its public sector equality duty by failing to properly check whether the algorithms were biased on the grounds of race or sex. SWP did not take into account the possibility that the AFR technology may produce a disproportionate number of false positives when used on women and ethnic minorities.

Importantly, the appeal on the ground of proportionality was rejected. Bridges’ complaint concerned the impact of AFR on himself, not on anyone else, and that impact could not be added to by the fact that other people were also affected. The CA upheld the High Court’s finding that AFR struck a fair balance between the rights of the individual and the interests of the community.

If AFR is held to be proportionate to the goal of identifying and preventing crime for the purposes of Article 8, its use can surely be justified in other circumstances too.

Case Reaction

Mr Bridges has said that he is “delighted that the court has agreed that facial recognition clearly threatens our rights”. He was supported in his claim by the civil rights group Liberty, whose lawyer Ms Goulding said: “this judgement is a major victory in the fight against discrimination and oppressive facial recognition”. The UK civil liberties campaign group Big Brother Watch said: “this is a huge step forward in the fight against facial recognition”.

These strong words suggest that the Court’s decision has been a devastating blow to the use of AFR. However, the comments of SWP paint a different picture.

SWP welcomed the judgment and confirmed it would not appeal the decision: “what the judgement has done is helpfully describe how we might strengthen deployment policies and influence codes of practice in how this technology is used across the UK”.

SWP went as far as to say that the case has been “a really helpful process…placing a rigorous test on our policies and the way that we approach things”.

It is clear that the decision has not deterred SWP from continuing to use the technology:

“We will continue our deployment and development of the technology when we have satisfied ourselves that we can meet the specific points identified in the conclusions of the Court of Appeal, and that work is underway as we now consider the comprehensive judgment.”

The Surveillance Camera Commissioner has also responded to the judgement in a statement on the GOV.UK website. His comments, too, signal acceptance:

“I very much welcome the findings of the court in these circumstances. I do not believe the judgement is fatal to the use of this technology, indeed, I believe adoption of new and advancing technologies is an important element of keeping citizens safe. It does however set clear parameters as to use, regulation and legal oversight.”

Conclusion

The decision is important for the regulation of AFR. It limits the discretion as to who can be placed on a watchlist and where the technology can be deployed, and it reinforces the duty to have due regard to the possibility of discrimination.

In July 2019, the Rt Hon Norman Lamb MP, Chair of the Science and Technology Committee, said:

The legal basis for AFR has been called into question, yet the Government has not accepted that there’s a problem. It must. A legislative framework on the use of these technologies is urgently needed. Current trials should be stopped and no further trials should take place until the right legal framework is in place.

This decision has come before the software is deployed on a national scale, providing the kind of legal scrutiny that both the Surveillance Camera Commissioner and the Science and Technology Committee have been waiting for. Had the software been implemented nationally before this decision, there would have been no limits on the discretion the police could exercise as to who they want to target and where they want to use it.

So, while the decision may have halted SWP in its long-running trial of AFR, facial recognition lives to fight another day. In SWP’s words: “there is nothing in the Court of Appeal judgement that fundamentally undermines the use of facial recognition to protect the public. This judgement will only strengthen the work which is already underway to ensure that the operational policies we have in place can withstand robust legal challenge and public scrutiny.”

Amy Smethurst is a Paralegal in our Media Law department, based in Manchester.