
Lords express regret at lack of guidance in Surveillance Camera Code of Practice


Members of the House of Lords have expressed regret at the Surveillance Camera Code of Practice because “it does not constitute a legitimate legal or ethical framework for the police’s use of facial recognition technology”.


In a motion debated on Wednesday (2 February 2022), peers also stated that the Code was not compatible with human rights.


Lord Clement-Jones, a Liberal Democrat peer, introduced the ‘Motion to Regret’ by warning of the dangers of automated facial recognition (AFR) in schools and elsewhere. He criticised the Home Office for the “incredibly brief” consultation on the updates to the Code over the summer.


Although the motion was withdrawn, Lord Clement-Jones said that it was nonetheless another marker of the concerns of some Lords about automated facial recognition technology. The debate also highlighted a number of key issues around automated facial recognition and artificial intelligence.


He argued that the Code does not address the objections to the use of live facial recognition raised by the Court of Appeal in its judgement of August 2020 in the case of Bridges v South Wales Police. The court ruled that South Wales Police’s use of automated facial recognition had not been in accordance with data protection legislation, the public sector equality duty and, crucially, Article 8 of the European Convention on Human Rights (the right to privacy).


The Court of Appeal set out minimum standards that had to be met for the use of AFR to be lawful, but Lord Clement-Jones said the previous Code did not provide these standards and the new version was no better. He criticised the Home Office approach to regulating surveillance cameras on several grounds:

  1. The Code continues to allow wide discretion to individual police forces to develop their own policies around the deployment of AFR including the categories of people included on a watchlist and the criteria for when to deploy

  2. The Code contains only four passing references to AFR, which provide “scant guidance” for creating a regulatory framework for its use

  3. There are no references to AFR in the Protection of Freedoms Act 2012 nor in any other UK law

  4. The Code has not been subject to proper democratic scrutiny

  5. There remains no explicit basis for the use of AFR by police forces in the UK

  6. The forthcoming guidance from the College of Policing is inadequate

  7. The revised code fails to provide any practical guidance on the public sector equality duty

He warned that live AFR is “an enormous interference with the right to privacy under Article 8 of the ECHR”. A ‘false positive’ can mean a person is stopped and searched by the police, and he argued that it was not right for the police to delegate this intrusive identification task to a machine.


He also argued that AFR can have a chilling effect on freedom of expression and the right to free association by making people wary of being recorded in public talking to certain people.


He said that “inherent bias” in AFR technology means it “disproportionately misidentifies women and BAME people, meaning that people from these groups are more likely to be wrongly stopped and questioned by police and to have their images retained as the result of a false match”.


He also warned that it is unfairly deployed by police in certain communities. “[Civil rights group] Liberty has raised concerns regarding the racial and socioeconomic dimensions of police trial deployments thus far – for example, at Notting Hill Carnival for two years running as well as twice in the London Borough of Newham. The disproportionate use of this technology in communities against which it ‘underperforms’ – according to its proponents’ standards – is deeply concerning.”


He quoted the Biometrics and Surveillance Camera Commissioner, Fraser Sampson, who wrote in a recent blog: “What we talk about in the end, is how people will need to be able to have trust and confidence in the whole ecosystem of biometrics and surveillance”.


He concluded by calling for a moratorium on the use of AFR while its impact on human rights was examined in more detail.


He said: “Rather than update toothless codes of practice to legitimise the use of new technologies like live facial recognition, the UK should have a root and branch surveillance camera review which seeks to increase accountability and protect fundamental rights. The review should investigate the novel rights impacts of these technologies, the scale of surveillance we live under and the regulations and interventions needed to uphold our rights.”


Lord Alton of Liverpool agreed with Lord Clement-Jones’s concerns about the Code and warned further that repressive regimes around the world were using video surveillance technology against their people. He pointed in particular to the genocide of Uighur Muslims in Xinjiang and a quote from the head of MI6, Richard Moore, who said “technologies of control… are increasingly being exported to other governments by China – expanding the web of authoritarian control around the planet”.


Lord Anderson of Ipswich also agreed. He said that AFR had made public space surveillance cameras more intrusive because of their ability to put names to faces and make that data searchable using AI-driven analytics. He warned that we are in the early stages of a technological revolution that cannot be stopped but, if we act now, at least controlled.


He criticised the Code as a “slim and highly general document” in contrast to the “detailed codes of practice issued under the Investigatory Powers Act 2016 and overseen by the world-leading Investigatory Powers Commissioner’s Office”.


Baroness Falkner of Margravine, a non-aligned peer and chair of the Equality and Human Rights Commission (EHRC), warned that society needs to be vigilant about commercial surveillance and espionage before criticising the lack of substance in the Code. “Articles 8, 9, 10, 11 and 14 – the general article against discrimination – of the European Convention on Human Rights are engaged in this, so the fact that we get a document as thin as this is truly remarkable,” she said.


She asked the minister a number of questions about the Code:

  1. Why was the EHRC, as the regulator of the public sector equality duty, not consulted after the Bridges case about how to strengthen the Code?

  2. In paragraph 10.4, why do authorities only have to publish a summary of an audit and review rather than a full report?

  3. In paragraph 12.3, regarding the adverse impact of AFR algorithms, why aren’t there any practical examples for authorities to consider?

She called for mandatory independent equality and human rights impact assessments which are followed up with robust mitigating action to address any shortcomings. Equality and human rights must, she said, be embedded in this new technology, and she warned that the EHRC would take enforcement and other legal action to ensure that AI does not bias decision-making or breach human rights.


Lord Rosser, a Labour peer, said that, according to the Secondary Legislation Scrutiny Committee of the House of Lords, the revised Code reflects the Court of Appeal judgement because it constrains the use of AFR by:

  1. Restricting it to “places where the police have reasonable grounds to expect someone on a watchlist to be”

  2. Directing that it cannot be used for ‘fishing expeditions’

  3. Requiring that biometric data of members of the public who are not of interest to the police be deleted immediately

  4. Monitoring its compliance with the public sector equality duty to ensure there is no unacceptable bias

  5. Making it clear which authority a person can complain to.

However, he said that despite the need for clear guidelines for the use of the technology, the safeguards and frameworks for controlling the spread of AFR are being built “in a piecemeal way in response to court cases, legislation and different initiatives over its use, rather than strategic planning from the Government”.


He said the House of Lords has previously called for a detailed review of the use of AFR technology, including:

  1. The process that police forces should follow to put facial recognition tools in place

  2. The operational use of the technology at force level, taking into account specific considerations around how data is retained and stored, regulated, monitored and overseen in practice, how it is deleted and its effectiveness in achieving operational objectives

  3. The proportionality of the technology’s use to the problems it seeks to solve

  4. The level and rank required for sign-off

  5. The engagement with the public and an explanation of the technology’s use

  6. The use of technology by authorities and operators other than the police

The Conservative peer Baroness Williams of Trafford, Minister of State at the Home Office, responded for the government. She began by rejecting the claim that the Code was out of date, saying that it was ‘principles based rather than technology specific’, which ensures it remains largely up to date.


The changes to the Code were made in response to legislative developments and the Bridges case, she said.


The consultation was held primarily among the police and commissioners, including the Information Commissioner’s Office, but it was in the public domain and many groups commented on it, including the National Police Chiefs’ Council, she said.


She said the Court of Appeal had confirmed that the police have common law powers to use AFR and other new technologies. She dismissed Lord Rosser’s concerns about data protection, saying: “The Data Protection Act is relevant but under legislation for operating ‘in accordance with law’, published police policies constitute law for these purposes, and the use of LFR was proportionate.” She also stated that live facial recognition (LFR) instantly deletes the biometrics of people who are not matched.


In response to Lord Clement-Jones’s concerns about the accuracy of AFR, she said it would depend on the technology and how it was used, pointing out that the technology is probabilistic, giving possible matches, not definite ones. But, she added, the technology is becoming more accurate.


As to the question of bias, she said that neither South Wales Police nor the Metropolitan Police had found any evidence that their algorithms are biased, but in any case the final decision is always taken by a human operator, so it is not a case of people’s privacy being jeopardised by machines.


On the ethical question of equipment being supplied by companies from repressive countries, she acknowledged that this was an important issue. The Foreign, Commonwealth & Development Office (FCDO) and the Cabinet Office will issue new guidance for buyers to exclude suppliers linked to modern slavery and human rights violations, she said, and the government will also bring forward a public procurement bill to strengthen the ability of public sector bodies to block companies with a history of misconduct.


Finally, she said that the Code was just the beginning and that further guidance for police use of AFR would be set out in the College of Policing’s national guidance. The police, she added, have a duty to use technology to protect the public, as long as they do so in a way that maintains public trust.


The video of the debate can be found on Parliament TV here: https://parliamentlive.tv/event/index/df408964-5e92-4237-97c4-c29df0d4b825?in=20:28:10
