Facial recog system used by Met Police shows racial bias at low thresholds
The UK Parliament has heard that a facial recognition system used by the Metropolitan Police during the King’s Coronation can exhibit racial bias at certain thresholds.
Speaking to the Science, Innovation and Technology Committee, Dr Tony Mansfield, principal research scientist at the National Physical Laboratory, said the NEC-based system used by the Met, the UK’s largest police force, was prone to bias against Black individuals on a set of test data created for his investigations.
“We find that if the system is run at low thresholds and easy thresholds, that it does start showing a bias against the Black males and females combined,” he told MPs.
Mansfield added that he believed the Met did not operate the system at these thresholds.
The testimony, given to the committee’s inquiry into the governance of artificial intelligence (AI), related to Mansfield’s report [PDF], produced by the National Physical Laboratory for the Met and South Wales Police.
It found that false positive identifications increase at lower face-match thresholds (0.58 and 0.56) and “start to show a statistically significant imbalance between demographics with more Black subjects having a false positive than Asian or White subjects.”
However, the system, based on NEC NeoFace V4 using HD5 Face and applied to a watchlist of 178,000 filler images, produced no false positive identifications at higher thresholds (0.64 or above) and, therefore, no bias. At the medium thresholds (0.60 to 0.62), it produced eight false positives but no statistically significant demographic imbalance.
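To make the threshold effect concrete, here is a minimal sketch, in Python, of the kind of analysis the report describes: counting false positive identifications per demographic group as the match threshold is lowered, then applying a chi-square test of independence to check whether any imbalance is statistically significant. The scores, group labels, and sample sizes below are hypothetical; this illustrates the general technique, not the National Physical Laboratory’s actual methodology.

# Illustrative only: hypothetical scores and labels, not NPL data or code.
from collections import Counter
from scipy.stats import chi2_contingency

# Each record: (best watchlist match score, demographic group) for a probe
# subject who is NOT on the watchlist, so any alert is a false positive.
probe_results = [
    (0.57, "Black"), (0.59, "Black"), (0.63, "Black"),
    (0.61, "White"), (0.52, "White"),
    (0.55, "Asian"), (0.60, "Asian"),
    # ... thousands more probes in a real evaluation
]

def false_positives_by_group(results, threshold):
    # Count alerts (score >= threshold) per demographic group.
    return Counter(group for score, group in results if score >= threshold)

# Lower thresholds admit more false positives; the question is whether
# they fall disproportionately on one group.
for threshold in (0.64, 0.62, 0.60, 0.58, 0.56):
    print(threshold, dict(false_positives_by_group(probe_results, threshold)))

# Chi-square test of independence at one threshold:
# rows = demographic group, columns = (false positive, no alert).
groups = ["Asian", "Black", "White"]
totals = Counter(group for _, group in probe_results)
fp = false_positives_by_group(probe_results, 0.58)
table = [[fp[g], totals[g] - fp[g]] for g in groups]
chi2, p_value, _, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f} p={p_value:.3f}")  # a small p suggests imbalance

With realistic sample sizes, a test of this general kind is how claims of a “statistically significant imbalance between demographics”, like the one quoted above, are typically assessed.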
Lindsey Chiswick, the Met’s director of intelligence, told MPs the system had been used three times since the research was published, including in the London boroughs of Camden and Islington, and during the King’s Coronation.
Across these deployments, which involved six police vans in all, there were four alerts and zero false positives. Two led to arrests; for the remaining two, officers decided an arrest was unnecessary, even though the identifications were correct.
She said that in each deployment, the police created a separate data set for the “watchlist” based on the location and the “intelligence case” for the deployment. After each deployment, the watchlist was deleted. None of the data from the live facial recognition deployments were stored, she said.
“I completely understand public concern around [areas] like facial recognition technology and AI, and there’s lots of debate going on around that. I’ve tried introducing facial recognition technology in as careful, proportionate and transparent [a] way [as] possible,” she told MPs.
Nonetheless, the introduction of facial recognition in policing has attracted criticism in the UK.
In 2017, the Met was urged to cancel its planned use of facial recognition software at Notting Hill Carnival, Europe’s largest street festival. Privacy groups, including Big Brother Watch, Liberty, and Privacy International, called for a U-turn on the use of the tech.
Speaking to MPs this week, Chiswick said the Met had used the technology at the carnival only once, “in the very early days of us learning to use” it. “It wasn’t a very successful deployment. We learned that the sheer number of faces, the density of the crowd, made it very difficult for the technology to operate,” she said.
In 2018, campaign group Big Brother Watch found that 91 per cent of people flagged by the Met’s facial recognition system turned out not to be on the watchlist. At the time, a Met spokesperson said the force did not consider these false positive matches “because additional checks and balances are in place to confirm identification following system alerts”.
It’s not a great look, coming just a week after the UK biometrics and surveillance camera commissioner, Professor Fraser Sampson, warned that the government was trying to scrap independent oversight of facial recognition even as the policing minister has made it clear he plans to “embed” the technology in the force.
“As an advocate of the accountable and proportionate use of new technology by the police I think this lacuna is problematic as much for the police themselves as for the communities they serve,” he said. ®