Privacy Advocates Outraged by UK Supermarket’s Face Cameras

By Robert Bateman

U.K. supermarket chain the Southern Co-op is using facial-recognition technology to address shoplifting, a move that privacy advocates say violates shoppers’ privacy rights.

The supermarket has been using facial-recognition cameras, provided by the security firm Facewatch, in 18 of its stores across the south of England since 2018, a company spokesperson told Digital Privacy News.

The cameras scan the faces of anyone walking into the store and retain biometric data derived from the facial images of people suspected of having committed a crime.

“To see a supposedly ethical company secretly using rights-abusive tech like facial recognition on its customers in the U.K. is deeply chilling,” said Silkie Carlo, director of U.K. privacy advocacy group Big Brother Watch.

“This surveillance is well known to suffer from severe inaccuracy and biases, leading to innocent people being wrongly flagged and put on criminal databases,” Carlo told Digital Privacy News. 

“Live facial-recognition is more commonly seen in dictatorships than democracies,” she added. “This is a serious error of judgment by Southern Co-op — and we urge them to drop these Big Brother-style cameras immediately.”

Southern Co-op Responds

The Southern Co-op spokesperson told Digital Privacy News: “Already this year, we have seen an 80% increase in assaults and violence against our store colleagues.

“This is not acceptable,” the representative continued. “We’re working hard to protect them, but this is not at the expense of our customers’ rights.


“The number one reason for violence in our stores and within the wider retail sector is when a colleague intervenes after a theft has already taken place.

“Using facial recognition in this limited way has improved the safety of our store colleagues.”

The spokesperson said that “the system is GDPR-compliant and does not store images of an individual unless they have been identified as a repeat offender.”

The GDPR Factor

The General Data Protection Regulation (GDPR) is an EU privacy law, passed in 2016, that the U.K. retained in its domestic law upon leaving the EU in January 2020.

Still, advocates questioned whether Southern’s facial-recognition technology complied with GDPR.

“Under Europe’s data-protection regime, this is not lawful,” said Ella Jakubowska, policy officer at European Digital Rights (EDRi). 


“It is never going to be legitimate, necessary, or proportionate to put [in place] a technology that amounts to biometric mass surveillance for the purpose of low-level crimes,” she told Digital Privacy News.

‘Limited and Targeted’

The Southern Co-op said its use of facial recognition was “limited and targeted” and used only to “identify when a known repeat offender enters one of our stores.”

Jakubowska, however, argued that the supermarket was not using the technology in a targeted way.

“When you have, at the entrance to a supermarket, a technology that is going to scan every person’s face as they walk in — this is a form of untargeted mass surveillance.

“If you are scanning every person’s face, even if you are disregarding their data, that is not targeted, that is not limited — that is disproportionate,” Jakubowska said.

“It’s infringing on everyone’s privacy and data-protection rights — which are rights that are so important in enabling us to feel free, to live our lives how we choose, and to participate in society.”


The Southern Co-op said that “only images of individuals known to have offended within our premises, including those who have been banned or excluded, are used on our facial-recognition platform.”

That, Jakubowska contended, raises its own concerns.

“People are added to these ‘watch lists’ based on ‘behavior’ — but who gets to decide what behavior is suspicious?” she asked. “There’s no evidence of due process or procedure.”

U.K. Court Ruling

Last August, the U.K.’s Court of Appeal found that police had been unlawfully using facial-recognition cameras. The full extent of the use of facial recognition in the country’s private sector remains unknown.

Evan Greer, deputy director of Fight for the Future, an advocacy organization based in Boston, told Digital Privacy News that the use of facial recognition by private companies “poses just as much of a threat to public safety, privacy and human rights as use by government and police.

“Corporations already harvest entirely too much of our personal data — and they have regularly failed to keep it safe,” she said. “Private companies will absolutely use this type of technology to discriminate, engage in racist profiling and worse.”

Greer also described biometric surveillance by private companies as “a major security risk.”


“If a company gets hacked and leaks your credit-card number, you can get a new credit card,” she told Digital Privacy News. “But if a hacker obtains a biometric scan of your face, you can’t get a new face.”

Robert Bateman is a writer in Brighton, U.K.