The future is coming at Croydon fast. It might not look like Britain’s cutting edge, but North End, a pedestrianised high street lined with the usual mix of pawn shops, fast-food outlets and branded clothing stores, is expected to be one of two roads to host the UK’s first fixed facial recognition cameras.
Digital photographs of passersby will be silently taken and processed to extract the measurements of facial features, known as biometric data. They will be immediately compared by artificial intelligence to images on a watchlist. Matches will trigger alerts. Alerts can lead to arrests.
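In broad terms, the comparison step works like the sketch below: software turns each captured face into a numerical “embedding”, compares it with the embeddings stored for people on the watchlist, and raises an alert only if the similarity score clears a preset threshold. This is a minimal illustrative sketch, not the Met’s actual software; the cosine-similarity measure, the 128-dimension embeddings, the function names and the 0.64 threshold are all assumptions chosen for the example.

```python
# Illustrative sketch of a watchlist match loop of the kind live facial
# recognition systems use. NOT the Met's software; the embedding size,
# similarity measure and 0.64 threshold are assumptions for illustration.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face embeddings (feature vectors extracted from images)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.64) -> str | None:
    """Return the watchlist identity whose stored embedding best matches the
    captured face, provided the similarity score clears the alert threshold."""
    best_id, best_score = None, threshold
    for identity, stored_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, stored_embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id  # None means no alert; the capture is discarded

# Toy usage: random vectors stand in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {"subject_A": rng.normal(size=128), "subject_B": rng.normal(size=128)}
passerby = rng.normal(size=128)
print(check_against_watchlist(passerby, watchlist))  # most likely None
```

The threshold is the tuning knob at the centre of the policy debate: raising it makes false matches rarer, but it also lets more genuine matches slip through unnoticed.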
According to the south London borough’s most recent violence reduction strategy, North End and nearby streets are its “primary crime hotspot”. But these are not, by any measure, among the capital’s most dangerous roads.
Croydon’s crime rate ranks only 20th worst of the 32 London boroughs, excluding the City of London. The plan to install the fixed cameras later this summer for a trial period is not an emergency initiative; North End and nearby London Road could be anywhere.
Asked about the surveillance, most shopkeepers and shoppers approached on North End said they had not heard of the police plans, let alone the technology behind them.
To some, the cameras will be just another bit of street furniture to go alongside the signs announcing 24-hour CCTV and urging safe cycling. That, some say, should be cause for alarm. Others point to surveys that suggest the public, fed up with a rise in crime, is broadly on side.
Police forces started to trial facial recognition cameras in England and Wales from 2016. But documents released under the Freedom of Information Act (FoI) and police data analysed by Liberty Investigates and shared with the Guardian provide evidence of a major escalation in their use in the past 12 months. No longer a specialist tool, the technology is quietly becoming an everyday part of the police arsenal.
Police forces scanned nearly 4.7m faces with live facial recognition cameras last year – more than twice as many as in 2023. Live facial recognition vans were deployed at least 256 times in 2024, up from 63 the year before.
Forces are imminently expected to launch a roving unit of 10 live facial recognition vans that can be sent anywhere in the country.
Meanwhile civil servants are working with the police to establish a new national facial recognition system, known as the strategic facial matcher. The platform will be capable of searching a range of databases, including custody images and immigration records.
“The use of this technology could become commonplace in our city centres and transport hubs around England and Wales,” according to one funding document drafted by South Wales police and submitted to the Home Office, released by the Metropolitan police under FoI.
Campaigners liken the technology to randomly stopping members of the public going about their daily lives to check their fingerprints.
They envision a dystopian future in which the country’s vast CCTV network is updated with live facial recognition cameras. Advocates of the technology say they recognise the dangers but point to the outcomes.
This week David Cheneler, a 73-year-old registered sex offender from Lewisham, in south London, who had previously served nine years for 21 offences, was sentenced to two years in prison for breaching his probation conditions.
A live facial recognition camera on a police van had alerted officers to the fact that he was walking alone with a six-year-old child.
“He was on [the watchlist] because he had conditions to abide by,” said Lindsey Chiswick, the director of intelligence at the Met and the National Police Chiefs’ Council lead on facial recognition. “One of the conditions was don’t hang out with under-14-year-olds.
“He had formed a relationship with the mother over the course of a year and began picking the daughter up at school. Goodness knows what would have happened if he hadn’t been stopped that day; he also had a knife in his belt. That’s an example of the police really [being] unlikely to remember the face and pick the guy up otherwise.”
It will be powerful testimony for many – but critics worry about the unintended consequences as forces seize the technology at a time when parliament is yet to legislate about the rules of its use.
Madeleine Stone from the campaign group Big Brother Watch, which observes deployments of the mobile cameras, said the group had witnessed the Met misidentify children in school uniform, who were then subjected to “lengthy, humiliating and aggressive police stops” in which they were required to prove their identity and provide fingerprints. In two such cases the children were young black boys, and both were left scared and distressed, she said.
“And the way it works is that the higher the threshold, the less effective it is at catching people,” Stone added. “Police will not always necessarily want to use it at those settings. There’s nothing in law that requires them to use it at those settings. The idea that the police are able to write their own rules about how they use it is really concerning.”
Shaun Thompson, from London, has launched a judicial review, with the support of Big Brother Watch, into the Met’s use of the cameras after he was wrongly identified by the technology as a person of interest and held for 30 minutes while returning home from a volunteering shift with Street Fathers, an anti-knife group.
There is also the risk of a “chilling” effect on society, said Dr Daragh Murray, who was commissioned by the Met in 2019 to carry out an independent study into its trials. There had been insufficient thinking about how the use of these cameras would change behaviour, he said.
“The equivalent is having a police officer follow you around, document your movements, who you meet, where you go, how often, for how long,” he said.
“Most people, I think, would be uncomfortable if this was a physical reality. The other point, of course, is that democracy depends on dissent and contestation to evolve. If surveillance restricts that, it risks entrenching the status quo and limiting our future possibilities.”
Live facial recognition cameras have been used to arrest people for traffic offences, cultivation of cannabis and failure to comply with a community order. Is this proportionate?
Fraser Sampson, who was the biometrics and surveillance camera commissioner for England and Wales until the position was abolished in October 2023, is now a non-executive director at Facewatch, the UK’s leading facial recognition retail security company, which provides systems to retailers to keep shoplifters out of their shops.
He can see the value in the technology. But he is concerned that regulation and methods of independent oversight have not caught up with the pace at which it is advancing and being used by the state.
Sampson said: “There is quite a lot of information and places you can go to get some kind of clarity on the technology, but actually, when, where, how it can be used by whom, for what purpose over what period of time, how you challenge it, how you complain about it, what will happen in the event that it didn’t perform as expected? All those kind of things still aren’t addressed.”
Chiswick said she understood the concerns and could see the benefit of statutory guidance. The Met was taking “really quite small steps” which were being reviewed at every stage, she said.
With limited resources, police had to adapt and “harness” the opportunities offered by artificial intelligence. They were well aware of the potential “chilling effect” on society and its ability to change behaviour, and cameras were not deployed at protests, she added.
“Is it going to become commonplace? I don’t know,” Chiswick said. “I think we just need to be a bit careful about when we say [that]. I can think of lots of potential. Like the West End? Yeah, I can see that being, you know, instead of doing this static trial we’re doing in Croydon, we could have done it in the West End. And I can see a different use case for that. It doesn’t mean we’re going to do it.”
She added: “I think we’re going to see the use of technology, data and AI increasing over the coming years, and on a personal level, I think it should, because that’s how we’re going to become better at our jobs. But we just need to do it carefully.”