Clearview’s Biometric Database Ruled ‘Illegal’ in Canada, EU

By Robert Bateman

Clearview AI’s biometric database was declared unlawful in Canada earlier this month, less than a week after a similar decision by a German regulator.

The New York-based tech firm has amassed a vast collection of more than three billion facial images by scraping publicly available data.

Clearview’s algorithmic software derives “faceprints” from these images, creating a trove of biometric information that is searchable by the company’s clients, including U.S. law-enforcement agencies.

In a Feb. 3 news release announcing the outcome of a yearlong investigation, Canada’s Office of the Privacy Commissioner (OPC) concluded that Clearview’s practices represented “mass surveillance” and were “illegal.”

“Canada is starting to look into the full picture of facial-recognition software uses, and Clearview is one example where many in Canada don’t like what we see,” said Victoria McIntosh, an independent privacy consultant based in Nova Scotia.

Clearview’s Response

In a statement, Doug Mitchell of IMK Advocates, an attorney for Clearview AI, told Digital Privacy News: “Clearview AI’s technology is not available in Canada and it does not operate in Canada.

“In any event, Clearview AI only collects public information from the Internet which is explicitly permitted under PIPEDA,” he said.

“Clearview is one example where many in Canada don’t like what we see.”

Victoria McIntosh, Canadian privacy consultant.

The Personal Information Protection and Electronic Documents Act (PIPEDA) is Canada’s private-sector federal privacy law.

Mitchell continued: “The Federal Court of Appeal has previously ruled in the privacy context that publicly available information means exactly what it says: ‘available or accessible by the citizenry at large.’

“There is no reason to apply a different standard here.

“Clearview AI is a search engine that collects public data just as much larger companies do, including Google, which is permitted to operate in Canada,” he said.

Probe After News Reports

The Canadian investigations into Clearview began last February, after news reports disclosed that 48 organizations across Canada were using Clearview’s services, including the Royal Canadian Mounted Police (RCMP). 

The OPC — along with provincial regulators in Quebec, British Columbia, and Alberta — made several recommendations to Clearview, including that it stop offering its services in Canada, cease collecting the facial images and biometric information of Canadians, and delete Canadians’ information from its database.

While Clearview had ceased its dealings with Canadian organizations by the end of July, the company refused to comply with the Canadian authorities’ other recommendations.

“Clearview AI’s technology is not available in Canada and it does not operate in Canada.”

Doug Mitchell, IMK Advocates, Clearview attorney.

“A weighing factor in the decision has been Clearview AI’s continuing dismissal of the OPC’s concerns on infringement of individual privacy rights and potential harms,” Canadian privacy consultant McIntosh told Digital Privacy News. 

“Clearview AI, in its unwillingness to work with the OPC and its recommendations, is insistent that business interests trump personal privacy,” she said. “When reviewing the regular reports, surveys and investigations conducted by the OPC, it’s clear many Canadians disagree.”

The OPC’s release stated that the authorities would “pursue other actions available” under Canada’s federal and provincial privacy laws if Clearview failed to comply with their recommendations.

GDPR Violations in EU

The OPC’s announcement came less than a week after a similar finding by a regulator in Hamburg, Germany, which concluded that Clearview had violated the EU’s General Data Protection Regulation (GDPR).

The German regulator delivered its decision Jan. 28, following a complaint by security researcher and Hamburg resident Matthias Marx.

The regulator concluded that Clearview had acted illegally by failing to obtain Marx’s consent before processing his biometric information.

“Europeans are having their photos harvested for a purpose they never intended or even contemplated,” said Alan Dahi, a privacy lawyer at NOYB (None of Your Business), the European Center for Digital Rights, which supported the complaint against Clearview.

“This is a clear violation of Europeans’ fundamental rights,” Dahi told Digital Privacy News.

“Europeans are having their photos harvested for a purpose they never intended or even contemplated.”

Alan Dahi, NOYB.

But Clearview CEO Hoan Ton-That told Digital Privacy News: “Clearview AI’s technology is not available in the EU.

“We look forward to engaging with the Hamburg DPA in an effort to resolve their concerns,” he said.

Clearview is not believed to have any European clients, but its database contains the facial images and biometric information of people in the EU.

Ruling’s Significance

Dahi cited “two significant positive aspects” of the ruling.

“The first is asserting jurisdiction over Clearview AI,” he said. “Even if based solely in the U.S., Clearview AI crosses the borders of its jurisdiction and reaches into the EU by harvesting photos of Europeans.

“If a foreign company enters European territory, even digitally, it needs to abide by European laws.

“The second is the finding that Clearview AI’s business model is not viable in the EU,” Dahi continued. “Clearview AI’s use of photos scraped from the internet is illegal and would need the profiled individuals’ consent.”

“Extracting face templates from photos scraped from the internet — without giving notice or obtaining consent … — violated BIPA.”

Rebecca Glenberg, ACLU of Illinois.

The Hamburg data-protection authority ordered Clearview to delete the complainant’s faceprint but stopped short of ordering the company to delete his images or to remove the personal data of all EU citizens on its database — an action that NOYB claimed was within the regulator’s powers.

Dahi said NOYB was considering an appeal.

Clearview also faces litigation in the U.S.

ACLU Suit in Illinois

The American Civil Liberties Union (ACLU) is among several plaintiffs suing Clearview in Illinois, alleging the company has breached the state’s Biometric Information Privacy Act (BIPA).

“BIPA requires notice and consent whenever a private entity takes a person’s biometric information,” said Rebecca Glenberg, senior civil liberties staff counsel at ACLU of Illinois.

“Our view is that extracting face templates from photos scraped from the internet — without giving notice or obtaining consent from the people whose biometric information was taken — violated BIPA,” Glenberg told Digital Privacy News.

In response, Clearview’s Ton-That said: “We will vigorously defend the Illinois-related cases.”

The ACLU’s Illinois case, filed last May, argued that Clearview’s biometric database could inflict harm on members of vulnerable communities, including survivors of domestic violence or sexual assault and undocumented immigrants.

In a memorandum included in court documents, dated last Oct. 7, Clearview’s lawyers argued that its actions were protected under the First Amendment, in part because the photographs used by the company were publicly available.

“That’s not a defense,” said ACLU’s Glenberg. “We’re not suing them for taking or using photographs from the internet.

“We will vigorously defend the Illinois-related cases.”

Clearview CEO Hoan Ton-That.

“What we’re complaining about is that they take the photographs and, using their own proprietary algorithm, turn them into face templates that are unique measurements of a person’s facial characteristics and can be used to match faces,” she explained.

“It’s not the photograph that violates BIPA, or the taking of the photograph from the internet,” Glenberg argued, “it’s extracting from that photograph a unique, personally identifying face template.”

Robert Bateman is a writer in Brighton, U.K.