‘Urgent clarity’ sought over racial bias in UK police facial recognition technology


The UK’s data protection watchdog has asked the Home Office for “urgent clarity” over racial bias in police facial recognition technology before considering its next steps.

The Home Office has admitted that the technology was “more likely to incorrectly include some demographic groups in its search results”, after testing by the National Physical Laboratory (NPL) of its application within the police national database.

The report revealed that the technology, which is intended to be used to catch serious offenders, is more likely to incorrectly match black and Asian people than their white counterparts.

In a statement responding to the report, Emily Keaney, the deputy commissioner for the Information Commissioner’s Office, said the ICO had asked the Home Office “for urgent clarity on this matter” in order for the watchdog to “assess the situation and consider our next steps”.

Those next steps could range from enforcement action, such as fines or a legally binding order to stop using the technology, to working with the Home Office and police on improvements.

Keaney said: “Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the police national database.

“We acknowledge that measures are being taken to address this bias. However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.

“While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right.”

Police and crime commissioners said publication of the NPL’s findings “sheds light on a concerning inbuilt bias” and urged caution over plans for a national expansion, which could include cameras placed at shopping centres, stadiums and transport hubs, unless adequate safeguards are put in place.

The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.

Facial recognition technology scans people’s faces and cross-references the images against watchlists of known or wanted criminals. It can be applied to live footage of people passing mounted cameras, comparing their faces with those on wanted lists so that officers can target individuals as they walk by.

Police officers can also retrospectively run images of suspects through police, passport or immigration databases to identify them and check their backgrounds.

Analysts who examined the police national database’s retrospective facial recognition technology tool at a lower setting found that “the false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)”.

The testing found that the number of false positives for black women was particularly high. “The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” the report said.
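To make the quoted figures concrete: the false positive identification rate (FPIR) is, broadly, the share of searches for people who are not in the database that nonetheless return a wrong candidate match. The sketch below uses hypothetical counts (not the NPL’s actual test data or methodology) purely to illustrate how percentages like 0.04% and 5.5% arise from raw counts.

```python
# Illustrative sketch only: FPIR as the fraction of "non-mated" searches
# (queries for people not on the watchlist) that wrongly return a match.
def fpir(false_positives: int, non_mated_searches: int) -> float:
    """Fraction of non-mated searches that incorrectly produce a match."""
    return false_positives / non_mated_searches

# Hypothetical counts chosen to mirror the reported disparity:
# 4 wrong matches in 10,000 searches vs 550 in 10,000.
print(f"{fpir(4, 10_000):.2%}")    # 0.04%
print(f"{fpir(550, 10_000):.2%}")  # 5.50%
```

The disparity the report describes is the gap between these rates for different demographic groups at the same search setting, not a difference in how the metric itself is defined.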

Responding to the report, a Home Office spokesperson said the department took the findings “seriously”, and had already taken action, including procuring and testing a new algorithm “which has no statistically significant bias”.

“Given the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports,” the spokesperson said.


