Ministers are facing calls for stronger safeguards on facial recognition technology after the Home Office admitted that, at some settings, it is more likely to incorrectly identify black and Asian people than their white counterparts.
Following the latest testing by the National Physical Laboratory (NPL) of the technology’s use within the police national database, the Home Office said it was “more likely to incorrectly include some demographic groups in its search results”.
Police and crime commissioners said publication of the NPL’s finding “sheds light on a concerning in-built bias” and urged caution over plans for a national expansion.
The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.
Facial recognition technology scans people’s faces and cross-references the images against watchlists of known or wanted criminals. It can be used to examine live video footage, comparing the faces of people passing cameras with those on wanted lists, or by officers to target individuals as they walk past mounted cameras.
Images of suspects can also be run retrospectively through police, passport or immigration databases to identify them and check their backgrounds.
Concerns have been raised following the NPL’s report. Analysts who examined the police national database’s retrospective facial recognition tool at a lower setting found that “the false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)”.
The testing went on to find that the number of false positives for black women was particularly high. “The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” the report said.
The Association of Police and Crime Commissioners said in a statement that the findings showed “an in-built bias”.
It said: “This has meant that, in some circumstances, it is more likely to incorrectly match black and Asian people than their white counterparts. The language is technical but, behind the detail, it seems clear that the technology has been deployed into operational policing without adequate safeguards in place.”
The statement, signed off by the APCC leads Darryl Preston, Alison Lowe, John Tizard and Chris Nelson, questioned why the findings had not been released at an earlier opportunity or shared with black and Asian communities. “Although there is no evidence of adverse impact in any individual case, that is more by luck than design. System failures have been known for some time, yet these were not shared with those communities affected, nor with leading sector stakeholders,” it said.
The government announced a 10-week public consultation that it hopes will pave the way for the technology to be used more often. The public will be asked whether police should be able to go beyond their records to access other databases, including passport and driving licence images, to track down criminals.
Civil servants are working with police to establish a new national facial recognition system that will hold millions of images.
The former cabinet minister David Davis raised concerns after police leaders said the cameras could be placed at shopping centres, stadiums and transport hubs to hunt for wanted criminals. He told the Daily Mail: “Welcome to Big Brother Britain. It is clear the government intends to roll out this dystopian technology across the country. Something of this magnitude should not happen without full and detailed debate in the House of Commons.”
Officials say the technology is needed to help catch serious offenders. They say there are manual safeguards, written into police training, operational practice and guidance, that require all potential matches returned from the police national database to be visually assessed by a trained user and investigating officer.
A Home Office spokesperson said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation.
“Given the importance of this issue, we have also asked the police inspectorate, alongside the Forensic Science Regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports.”