A rising number of British stores are using a facial recognition system powered by artificial intelligence to identify repeat shoplifters in what one human rights group has called the spread of “airport-style security” on the high street.
Simon Gordon, founder of UK surveillance company Facewatch, told CNN that demand for his product had grown “exponentially” as the incidence of shoplifting and violence in stores has soared in recent years.
“We’re just here trying to prevent crime,” he said.
It works like this: Once a store manager knows an item has been stolen — for instance, when taking an inventory of their stock — they will review the footage recorded by their security cameras to identify the thief.
Then, the manager will log into Facewatch’s system, which will also have captured video of all the customers who entered the store that day, to find the suspect in the firm’s footage and log the incident.
“We then review the incident, make sure it describes the suspected crime or disorder and then we set it live,” Gordon said. Any time the same person tries to enter that store again, the manager will receive an alert on their phone, and can ask the person to leave, or keep a close eye on them.
Before the alert is sent, one of Facewatch’s human “super-recognizers” double-checks that the suspect’s face matches one in the firm’s database of offenders.
If that person is either a prolific offender or has stolen something worth more than £100 ($131), their biometric data could also be shared with other stores in the local area that use Facewatch’s system.
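In outline, the process Gordon describes amounts to a short chain of checks. The sketch below is a purely illustrative Python rendering of that chain, using hypothetical names and thresholds (such as Incident, handle_reentry and SHARE_THRESHOLD_GBP); it is not Facewatch’s software, only a way to show the order of steps the company describes: human verification first, then an alert to the store, then sharing with nearby stores only for prolific offenders or thefts worth more than £100.

```python
# Illustrative sketch only: the names, thresholds and data structures here are
# assumptions made for clarity, not Facewatch's actual code. It mirrors the
# order of checks described above: an incident is logged, a human
# "super-recognizer" confirms the match before any alert is sent, and wider
# sharing with nearby stores applies only to prolific offenders or thefts
# worth more than £100.

from dataclasses import dataclass

SHARE_THRESHOLD_GBP = 100  # hypothetical cut-off for sharing with other stores


@dataclass
class Incident:
    face_id: str             # identifier for the face clipped from store footage
    value_stolen_gbp: float  # estimated value of the goods taken
    prior_incidents: int     # how many times this face has been logged before


def verified_by_super_recognizer(face_id: str, watchlist: set) -> bool:
    """Stand-in for the human check: is this face genuinely on the watchlist?"""
    return face_id in watchlist


def handle_reentry(incident: Incident, watchlist: set) -> list:
    """Actions triggered when a previously flagged person re-enters the store."""
    actions = []
    # The human review happens before any alert reaches the store manager.
    if not verified_by_super_recognizer(incident.face_id, watchlist):
        return actions  # no confirmed match, so no alert is sent
    actions.append("alert_store_manager")
    # Prolific offenders, or thefts worth more than £100, may be shared locally.
    if incident.prior_incidents > 1 or incident.value_stolen_gbp > SHARE_THRESHOLD_GBP:
        actions.append("share_with_nearby_stores")
    return actions


if __name__ == "__main__":
    watchlist = {"face-123"}
    repeat_offender = Incident("face-123", value_stolen_gbp=45.0, prior_incidents=3)
    print(handle_reentry(repeat_offender, watchlist))
    # -> ['alert_store_manager', 'share_with_nearby_stores']
```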
Human rights groups say this type of technology flouts people’s right to privacy and often makes mistakes.
“Something like Facewatch is basically normalizing what is airport-style security [for] something as mundane as going to get a pint of milk at the shops,” Madeleine Stone, senior advocacy officer at Big Brother Watch, a UK civil liberties campaign group, told CNN.
Recording shoppers’ biometric data, she said, is equivalent to asking them to “hand over their fingerprint or even a DNA sample just to walk into the shops.”
‘Not infallible’
Gordon, who also runs London’s oldest wine bar, said the system was accurate 99.85% of the time last month in identifying repeat offenders.
But mistakes do sometimes happen, he added. In those cases, the person incorrectly flagged as an offender can complain to Facewatch, and have their details wiped from the system.
“Sometimes you’ll get maybe someone who’s a doppelganger for someone else,” Gordon said, which will result in an incorrect alert. That’s happened “a few times,” although, once aware of the error, the company “immediately” deletes the person’s details.
“Nothing’s happened to them. They haven’t been thrown into prison,” Gordon said.
The system is legal in the United Kingdom, he noted, and businesses that install it must place a sign on their storefront informing customers that it is in use. Facewatch also retains shoppers’ data for only two weeks, half the time a regular CCTV security camera in the UK typically stores footage.
But for Stone, that’s not enough. She said “people shouldn’t have to be proactively proving their innocence,” and pointed to the well-documented potential for bias in AI-powered software.
“You could very easily be wrongly placed on watch lists and have your life really changed if you’re not able to access your local shops because some AI-powered technology has flagged you as a criminal, which you aren’t,” she said.
But Gordon is confident Facewatch’s system doesn’t have any bias and stresses that it is supported by human staff who’ve been trained in facial recognition.
“Artificial intelligence is amazing now,” he said. “In any decent algorithm, the bias has been removed.”
Rising crime
Gordon started Facewatch 13 years ago as a way to share footage captured by traditional security cameras with the police but became frustrated by a lack of response.
“The police couldn’t deal with it. They’re not really interested in this level of crime,” he said.
In the year ending in March 2022, levels of violence and abuse in stores in England and Wales were nearly double those before the pandemic, according to a recent survey of businesses by the British Retail Consortium.
The number of thefts also shot up during the 2022 calendar year in 10 of the biggest cities across England, Wales and Northern Ireland, the trade group told CNN. That has coincided with a steep increase in the cost of living, including in food prices. Food inflation in the UK, which stood at 18.4% in June, is much higher than in its European neighbors or the United States.
“Something like shoplifting has really complicated causes and impact — we shouldn’t rush to an AI-driven solution,” argues Stone of Big Brother Watch.
Gordon, however, said the technology was not meant to penalize small-scale, one-off or even accidental instances of theft, but to target repeat offenders, who can cost individual businesses thousands in lost revenue, as well as those who threaten staff.
“This is the private sector trying to keep their staff safe,” he said. “That’s the primary objective.”
Gordon is receiving requests for Facewatch’s product from businesses around the world and is looking to expand in the United States.
The system is gaining popularity amid fears over the potential for AI to destabilize society, whether through unprecedented levels of intrusion into people’s privacy or by replacing millions of jobs, among a host of other concerns.
Governments are stepping up efforts to regulate the technology. Last month, the European Parliament agreed to ban the use of real-time, AI-powered facial recognition technology in public spaces. The draft legislation, once formally approved, will be the first of its kind to set rules on how companies use AI.