Essex Police has paused the use of live facial recognition cameras (LFR) after a study found they identified more black people than other ethnic groups.
The cameras are mounted on vans and designed to identify people on watchlists if they pass by.
Thirteen forces were using the technology by the end of last year, and the home secretary said in January that the number of LFR vans would increase from 10 to 50.
However, Essex Police said it had paused use of LFR after “potential bias in the positive identification rate” – although it now believes the issue has been corrected by updating the algorithm.
Nearly 200 people were recruited by University of Cambridge researchers to test LFR during one of the force’s deployments.
They found it correctly identified around half of those on the watchlist, and that it was “extremely rare” for someone to be flagged up if they weren’t on the list.
But the study found it was “statistically significantly more likely” to correctly identify black people than other ethnicities. It was also “more likely” to spot men than women.
Researchers said it raised “questions about fairness that require continued monitoring”.
Essex Police told Sky News the study was one of two it commissioned – and the second suggested no bias – but that it had paused deployments to work “with the algorithm software provider” to update the system.
It said “further academic assessment” had been done and it believed the system was ready to hit the streets again.
“We have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals,” the force said in a statement.
“We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”
‘Real risk of unfairness’
Beyond concerns about identifying some groups more than others, the study also examined how LFR had worked so far for Essex Police.
It said about 1.3 million faces had been scanned from August 2024 to February 2025, with 48 arrests, about one for every 27,000 faces.
There was only one mistaken intervention.
Researchers said different LFR systems and conditions could produce different results, and that more testing was needed “to build a fuller understanding of the technology’s performance”.
As the government looks to ramp up use of the technology, questions over privacy and the huge number of images taken remain a key concern.
The study said “proportionality, transparency and oversight” were vital in deciding when to use LFR, and the Information Commissioner’s Office (ICO) is also scrutinising the technology.
“All forces should also be conducting routine testing for bias and discriminatory outcomes – whether arising from technology design, training data, or watchlist composition,” said the ICO.
“Without this, there is a real risk of unfairness.”
The Home Office said a person’s image is “immediately and automatically” deleted if it does not match the watchlist and all deployments are “targeted, intelligence-led, time-bound, and geographically limited”.
It said more than 1,300 people suspected of serious crimes – including rape, domestic abuse and GBH – had been arrested in London thanks to LFR between January 2024 and September 2025.