Home Office Hid Police Facial Recognition Flaws, UK Watchdog Outraged

Regulatory Concerns Over Biases in Facial Recognition Technology
The UK's data protection watchdog has expressed disappointment over the lack of transparency regarding significant biases in police facial recognition technology. Despite regular communication between the Information Commissioner's Office (ICO) and the Home Office, critical issues with the algorithm used by UK police forces for retrospective facial recognition (RFR) remained undisclosed.
Emily Keaney, deputy commissioner for the ICO, revealed that the regulator only recently became aware of historical bias in the algorithm used within the Police National Database (PND). She emphasized the importance of public confidence in the use of such technology and highlighted the potential for mistrust when biases are perceived.
"The ICO is here to support and assist the public sector to get this right," Keaney stated. The organization has requested urgent clarity from the Home Office to assess the situation and determine next steps.
Updated Accuracy Tests Highlight Algorithmic Disparities
Keaney’s comments followed updated accuracy tests published on December 4, conducted by the National Physical Laboratory and commissioned by the Home Office. These tests evaluated two algorithms: Cognitec FaceVACS-DBScan ID v5.5, currently used by the PND, and Idemia MBSS FR, which is planned for future deployments.
While Idemia's algorithm demonstrated near-perfect performance in both ideal testing conditions and realistic operational scenarios, Cognitec's algorithm showed notable weaknesses when identifying certain demographics under strict settings designed to minimize false positives.
In Cognitec's case, the algorithm correctly matched an image of a suspect to an individual in the PND 99.9% of the time when no restrictions were applied. However, when testers forced the system to return results only above a very high similarity threshold, its accuracy dropped to 91.9%.
Demographic Performance Variations
The strict setting revealed that the algorithm was most effective at identifying Asian subjects, with a 98% success rate. White subjects were correctly identified 91% of the time, while Black subjects were identified correctly in 87% of cases.
When the similarity threshold was lowered, though still kept high, false positive rates increased and disproportionately affected certain demographics. In these tests, Black females were more likely to be falsely matched to a reference image than Black males, with false positive rates of 9.9% and 0.4%, respectively.
Removing gender from the equation, false positive rates for White subjects (0.04%) were significantly lower than those for Asian (4%) and Black (5.5%) subjects.
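The trade-off the tests describe, where a stricter similarity threshold suppresses false positives but also misses genuine matches, can be sketched in a few lines. The scores and labels below are hypothetical illustrations, not the National Physical Laboratory's test data:

```python
def match_rates(scores, is_true_match, threshold):
    """Return (true positive rate, false positive rate) at a given threshold."""
    tp = sum(1 for s, t in zip(scores, is_true_match) if t and s >= threshold)
    fp = sum(1 for s, t in zip(scores, is_true_match) if not t and s >= threshold)
    positives = sum(is_true_match)
    negatives = len(is_true_match) - positives
    tpr = tp / positives if positives else 0.0
    fpr = fp / negatives if negatives else 0.0
    return tpr, fpr

# Illustrative probe results: (similarity score, genuinely the same person?)
probes = [(0.97, True), (0.92, True), (0.88, True), (0.71, True),
          (0.90, False), (0.65, False), (0.40, False), (0.30, False)]
scores = [s for s, _ in probes]
labels = [t for _, t in probes]

# Raising the threshold cuts false positives but also drops true matches.
for threshold in (0.6, 0.95):
    tpr, fpr = match_rates(scores, labels, threshold)
    print(f"threshold={threshold}: TPR={tpr:.2f}, FPR={fpr:.2f}")
```

Computing these rates separately per demographic group, as the NPL tests did, is what exposes the disparities reported above.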
Addressing the Concerns
The Register understands that RFR results are never used as evidence without first undergoing a manual review, reducing the risk of incorrect usage. Training and guidance have been reissued to police forces nationwide following the report.
The government has also asked the Inspectorate of Constabulary to review police use of facial recognition technology, with assistance from the Forensic Science Regulator, in light of the tests.
A Home Office spokesperson stated that they take the findings seriously. "A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation."
"Our priority is protecting the public. This game-changing technology will support police to put criminals and rapists behind bars. There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results."
Ongoing Debate and Expansion of Use
The tests were published as the Home Office launched a consultation to expand police use of facial recognition, despite widespread criticisms of the technology across its various types of deployment.
The UK government spends tens of millions on facial recognition technology every year and has consistently supported its efficacy since the PND launched in 2011.