
Clearview AI Raises Disquiet at Privacy Regulators


Facial-recognition company Clearview AI Inc., under regulatory and legal scrutiny in Europe, North America and Australia, increasingly has to tailor its business to regional privacy laws. Much of the attention trained on the company may hinge on the varying laws governing the use of the web-scraped photos on which its biometric profiles are built.

The data protection authority in Hamburg, Germany, for instance, last week issued a preliminary order saying New York-based Clearview must delete biometric data related to Matthias Marx, a 32-year-old doctoral student. The regulator ordered the company to delete biometric hashes, or bits of code, used to identify photos of Mr. Marx’s face, and gave it till Feb. 12 to comply. Not all photos, however, are considered sensitive biometric data under the European Union’s 2018 General Data Protection Regulation.

The action in Germany is only one of many investigations, lawsuits and regulatory reprimands that Clearview is facing in jurisdictions around the world. On Wednesday, Canadian privacy authorities called the company’s practices a form of “mass identification and surveillance” that violated the country’s privacy laws. Clearview said its technology is no longer available in Canada and that it would remove any data on Canadian citizens upon request.

Johannes Caspar, head of Hamburg’s data protection authority, shown here in 2016. Photo: Lukas Schulze/Zuma Press

Clearview draws on a database of about 3 billion photos it scraped from the internet, allowing it to search for matches using facial-recognition algorithms. Some law enforcement agencies use its technology to find perpetrators and witnesses, including an Alabama police department that said it found suspects in the Jan. 6 riot at the U.S. Capitol.
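In broad terms, systems of this kind reduce each face photo to a numeric template (the “biometric hashes” the Hamburg order refers to) and declare a match when two templates are sufficiently close. Purely as an illustration, and not a description of Clearview’s proprietary system, a minimal Python sketch of that comparison step might look like the following, assuming some face-embedding model has already turned photos into normalized vectors:

    import numpy as np

    def find_matches(query_embedding, database_embeddings, threshold=0.6):
        # Embeddings are assumed to be L2-normalized vectors produced by some
        # face-embedding model (a stand-in for any real system's templates).
        # For normalized vectors, cosine similarity is just a dot product.
        similarities = database_embeddings @ query_embedding
        return np.where(similarities >= threshold)[0]

    # Illustrative usage with random stand-in "templates".
    rng = np.random.default_rng(0)
    database = rng.normal(size=(1_000, 128))
    database /= np.linalg.norm(database, axis=1, keepdims=True)

    query = database[42] + rng.normal(scale=0.05, size=128)  # a noisy copy of entry 42
    query /= np.linalg.norm(query)

    print(find_matches(query, database))  # expected to include index 42

The sketch also makes Mr. Marx’s complaint concrete: the template, unlike a password, cannot be changed by the person it identifies.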

Biometric data is defined in the GDPR as information, such as a fingerprint or facial-recognition scan, that can identify a person—though a person’s photo isn’t automatically considered biometric, said Els Kindt, a researcher and associate professor in data protection law at KU Leuven in Belgium and Leiden University in the Netherlands. That definition is too narrow, she said, because a photo, too, can verify someone’s identity.

Good facial-recognition algorithms would recognize his face even if it changed or was partially covered, Mr. Marx noted. “I can’t change my biometric data,” he said.

Mr. Marx filed his complaint last February after Clearview confirmed to him that his images were in its database. The database contains pictures of him available online from local media reports, he said.

The Hamburg regulator said the GDPR applied to Clearview’s collection of Mr. Marx’s data because some of the images and accompanying texts identified him as a student. That qualifies as a behavioral trait covered under the privacy law.

“As of March 2020, Clearview AI has discontinued the few trial accounts for law enforcement in the EU. It never had any contracts with any customers in the EU, and is not currently available in the EU,” Clearview’s Chief Executive Hoan Ton-That said in an emailed…


