Human rights experts say London police should stop using facial recognition

  • July 11, 2019
  • Tech

From eandt.theiet.org

The London Metropolitan Police have been urged to stop using live facial-recognition (LFR) technology, owing to “significant flaws” in the way it has been deployed and possible breaches of human rights.

A report from the University of Essex Human Rights Centre claimed it is “highly possible” that the Metropolitan Police’s use of LFR to date would be held unlawful if challenged in court.

In order to compile the report, authors Professor Peter Fussey and Dr Daragh Murray were granted access to the final six of the ten trials carried out by the Metropolitan Police, running from June 2018 to February 2019.

They found that the technology may be in breach of domestic law as there was no explicit legal authorisation for its use.

Furthermore, they said the LFR deployments lacked sufficient pre-test planning, and that the Metropolitan Police’s trial methodology focused primarily on the technical aspects of the trials without considering their wider impact.

They added that the mixing of trials with operational deployments “raises a number of issues regarding consent, public legitimacy and trust”, particularly when considering the differences between an individual’s consent to participate in research and their consent to the use of technology for police operations.

“It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public,” Fussey said.

“The Metropolitan Police’s willingness to support this research is welcomed. The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached, and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision-making processes.”

Across the six trials that were evaluated, the technology made 42 matches, of which only eight could be verified as correct: a success rate of roughly 19 per cent.
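For readers who want the arithmetic behind that figure, a trivial calculation reproduces it (the counts come from the report above; the variable names are ours):

```python
# Back-of-the-envelope check of the report's headline figure.
verified_correct = 8
total_matches = 42

accuracy = verified_correct / total_matches
print(f"Verified accuracy: {accuracy:.0%}")  # -> "Verified accuracy: 19%"
```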

In 2017, a civil rights group claimed that the technology’s deployment at that year’s Notting Hill Carnival produced a number of incorrect matches, one of which led to an erroneous arrest.

Duncan Ball, Deputy Assistant Commissioner, said the Met was “extremely disappointed with the negative and unbalanced tone” of the research and insisted the pilot had been successful.

The NeoFace system uses dedicated cameras to scan the structure of faces in a crowd and create a digital image, which is compared against a watchlist of images of people who have been taken into police custody.

If a match is found, officers at the scene are alerted.
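In outline, systems of this kind reduce each custody photograph to a numerical ‘embedding’ and compare faces seen in the live camera feed against those stored embeddings, raising an alert when the similarity is high enough. The sketch below illustrates that matching step only; it is not the actual NeoFace implementation, and the embedding format, the cosine-similarity measure and the 0.8 threshold are all assumptions made for the example:

```python
from __future__ import annotations
from dataclasses import dataclass

# Illustrative sketch of the watchlist-matching step described above.
# NOT NeoFace's actual code: embedding, metric and threshold are assumed.

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # features precomputed from a custody photograph

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def check_face(face_embedding: list[float],
               watchlist: list[WatchlistEntry],
               threshold: float = 0.8) -> WatchlistEntry | None:
    """Compare one face from the camera feed against the watchlist and
    return the best entry if its similarity clears the threshold."""
    best = max(watchlist,
               key=lambda e: cosine_similarity(face_embedding, e.embedding))
    score = cosine_similarity(face_embedding, best.embedding)
    return best if score >= threshold else None

# Example: a single-entry watchlist and one face seen by the camera.
watchlist = [WatchlistEntry("suspect A", [0.1, 0.9, 0.3])]
hit = check_face([0.1, 0.8, 0.35], watchlist)
if hit:
    print(f"Alert: possible match with {hit.name}")  # officers would be notified
```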

Police use of facial recognition is currently under judicial review in Wales, following the technology’s first-ever legal challenge, brought against South Wales Police by the civil liberties group Liberty.

Hannah Couchman, policy and campaigns officer at Liberty, renewed the group’s call for a ban on the technology after the research was published.

“This damning assessment of the Met’s trial of facial-recognition technology only strengthens Liberty’s call for an immediate end to all police use of this deeply invasive tech in public spaces,” she said.

“It would display an astonishing and deeply troubling disregard for our rights if the Met now ignored this independent report and continued to deploy this dangerous and discriminatory technology.

“We will continue to fight against police use of facial recognition, which has no place on our streets.”