AI firm faces wave of complaints in Europe
US firm that copies facial images off social media sites to sell to government agencies faces challenge
Luxembourg — Clearview AI has been hit by complaints from across Europe for allegedly breaking the region’s tough privacy laws by scraping billions of facial images from social-media profiles and the internet.
In a concerted move on Thursday, campaigners including Privacy International and Noyb filed complaints with data watchdogs in Austria, France, Greece, Italy and the UK, urging regulators to declare that Clearview’s practices “have no place in Europe”.
Clearview, which scrapes photos from social media accounts with the goal of helping law enforcement agencies, has come under increased scrutiny in Europe. The UK privacy commissioner and its Australian counterpart last year opened a joint probe into how its facial-recognition technology uses people’s data.
“Extracting our unique facial features or even sharing them with the police and other companies goes far beyond what we could ever expect as online users,” Ioannis Kouvakas, legal officer at Privacy International, said in a joint statement on Thursday.
New York-based Clearview said in a statement that it “has never had any contracts with any EU customer and is not currently available to EU customers”.
Clearview “has helped thousands of law enforcement agencies across America save children from sexual predators, protect the elderly from financial criminals, and keep communities safe”, it said. “National governments have expressed a dire need for our technology because they know it can help investigate crimes like money laundering and human trafficking, which know no borders.”
The official body that represents data watchdogs from the 27-nation EU has said it is “particularly concerned” by certain developments in facial-recognition technology and by the “unprecedented” issues raised for data protection.
Sweden’s data regulator has fined the nation’s police authority for using Clearview’s technology to identify people, saying law enforcement agents had “unlawfully processed biometric data for facial recognition” and failed to do “a data protection impact assessment”.
Bloomberg News.