Lawsuit Says Clearview's Facial Recognition App Violates Illinois Privacy Laws

from the OR-DOES-IT-[dramatic-side-eye] dept

Clearview has gathered a whole lot of (negative) attention ever since its exposure by Kashmir Hill for the New York Times. The facial recognition app developed by Hoan Ton-That (whose previous app was a novelty that allowed users to transpose President Trump’s distinctive hairdo on their own heads) relies on scraped photos to perform its questionable magic. Rather than limiting themselves to law enforcement databases, cops can upload a photo and search a face against pictures taken from dozens of websites.

The company’s marketing materials claim cops have access to 3 billion face photos via Clearview — all pulled from public accounts linked to names, addresses, and any other personal info millions of unwitting social media users have uploaded to the internet.
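For readers wondering how a search like that works mechanically: tools of this kind generally reduce each face to a numeric embedding and then look for the nearest matches in the gallery. What follows is a minimal, hypothetical sketch of that idea only. embed_face(), build_gallery(), and search() are invented names standing in for whatever model and index a vendor might actually use; Clearview has never published its code.

    import numpy as np

    # Purely illustrative sketch of how a scraped-photo face search could work
    # in principle. embed_face() is a hypothetical stand-in for a face-embedding
    # model; none of this is based on Clearview's actual (unpublished) system.

    def embed_face(photo):
        """Hypothetical: map a face photo to a fixed-length feature vector."""
        raise NotImplementedError("stand-in for a real face-embedding model")

    def build_gallery(scraped_photos):
        """Embed every scraped photo once and normalize for cosine similarity."""
        vectors = np.stack([embed_face(p) for p in scraped_photos])
        return vectors / np.linalg.norm(vectors, axis=1, keepdims=True)

    def search(probe_photo, gallery_vectors, top_k=5):
        """Return indices of the gallery faces most similar to the probe photo."""
        probe = embed_face(probe_photo)
        probe = probe / np.linalg.norm(probe)
        scores = gallery_vectors @ probe           # cosine similarity per gallery face
        return np.argsort(scores)[::-1][:top_k]    # best matches first

The only point of the sketch is that once the photos, and the names attached to them, sit in one database, matching a new face against three billion entries is a routine similarity search. The scale of the collection is the real story.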

Its marketing materials also claim it has been instrumental in solving current crimes and generating suspect lists for cold cases. So far, very few of these claims seem to be based on fact. That’s only one of the company’s issues. Another is the heat it’s drawing from companies like Twitter and Facebook, which claim photo scraping violates their terms of service. That’s one for the courts, and it’s only a matter of time before someone sues.

Someone has sued, but it’s not an affected service provider. It’s some guy from Illinois trying to fire up a class action lawsuit against the company for violating his home state’s privacy laws. Here’s Catalin Cimpanu of ZDNet with the details:

According to a copy of the complaint obtained by ZDNet, plaintiffs claim Clearview AI broke Illinois privacy laws.

Namely, the New York startup broke the Illinois Biometric Information Privacy Act (BIPA), a law that safeguards state residents from having their biometrics data used without consent.

According to BIPA, companies must obtain explicit consent from Illinois residents before collecting or using any of their biometric information — such as the facial scans Clearview collected from people’s social media photos.

“Plaintiff and the Illinois Class retain a significant interest in ensuring that their biometric identifiers and information, which remain in Defendant Clearview’s possession, are protected from hacks and further unlawful sales and use,” the lawsuit reads.

Hmm. This doesn’t seem to have much going for it. And, believe it or not, it’s not a pro se lawsuit. Whether it’s possible to violate a privacy law by scraping public photos remains to be litigated, but it would seem the word “public” is pretty integral here. Unless Clearview found some way to scrape photos not published publicly, the lawsuit is dead in the water.

It shouldn’t take too long for a judge to declare public and private legally contradictory. This lawsuit was composed by a member of the bar, but it reads more like a Facebook note the lawyer published accidentally. From the lawsuit [PDF]:

Without obtaining any consent and without notice, Defendant Clearview used the internet to covertly gather information on millions of American citizens, collecting approximately three billion pictures of them, without any reason to suspect any of them of having done anything wrong, ever.

Wat? Just because Clearview is aggressively pitching LEOs doesn’t mean Clearview can only scrape photos of people it suspects of wrongdoing. Yes, it’s disturbing that Clearview has decided to make its stalker-enabling AI available to people who can hurt, maim, jail, and kill you, but there’s nothing in any law book that says collecting pictures of faces can only be done if the people are probably criminals — even if the targeted end users of this software are people who go after criminals.

Putting it in a sworn document doesn’t make it any less ridiculous. But it does get more ridiculous.

[A]lmost none of the citizens in the database has ever been arrested, much less been convicted. Yet these criminal investigatory records are being maintained on them, and provide government almost instantaneous access to almost every aspect of their digital lives.

Facebook is collecting photos of people, almost none of whom have ever been criminally charged. Those photos reside in Facebook’s database. Facebook is publicly searchable, and public profiles can be searched for photos, even by law enforcement officers. Is Facebook breaking state law by “collecting” photos of innocent people? No rational person would argue that it is. And yet, this is the same argument, and it’s no less stupid just because an actual lawyer is involved.

Look, I also don’t want Clearview pushing this “product,” much less to people with the power to do incredible amounts of damage to anyone the AI mistakes for a criminal. But this lawsuit isn’t going to fix anything. The lawsuit makes better points about Clearview’s end of the deal, which makes it easy for the company to look over law enforcement’s shoulder. Since Clearview hosts all the pictures on its own servers, it can see what cops are looking for and do its own digging into the personal lives of anyone cops might be thinking about targeting. That’s an ugly byproduct of this service, and Clearview hasn’t said anything about siloing itself off from government queries.

The claims in this suit are almost certain to fail. Clearview streamlines processes cops can perform on their own, like reverse image searches and browsing of social media accounts. Actions you can perform one person at a time without violating the Constitution (or state law), you can most likely do in bulk. For now. A more realistic approach would be to take edge cases to the Supreme Court, which has been more receptive to expanding the boundaries of citizens’ expectations of privacy in the digital era. This lawsuit may raise limited awareness about Clearview (and discovery could be very interesting) but it’s not going to end Clearview’s scraping or deter law enforcement from using it. And it’s certainly not going to earn a payout for the plaintiff.

Companies: clearview, clearview ai


Comments on “Lawsuit Says Clearview's Facial Recognition App Violates Illinois Privacy Laws”

BIPA says:

@Tim Cushing
You might want to check the Wikipedia article on that law.

Especially reference 12 (Law360): Facebook has been successfully sued over the same law.

And Facebook had a license for those images (granted by uploading them); the company doing the scraping most probably didn’t have any license for those images.

Just because somebody posted their picture publicly doesn’t mean you can use it for whatever you want.

BIPA says:

Re: GDPR

The question would be whether the GDPR applies at all, which it only does if Clearview does business within the EU, i.e. has clients within the EU.

In other words, if they don’t have clients in the EU and haven’t offered their service to any, the GDPR doesn’t matter.

Even if some people seem to think so, the EU never said its laws apply outside the EU; all it did was shift the relevant place of business from where the service provider is to where the client is.

Graham Cobb (profile) says:

Re: Re:

GDPR isn’t about privacy, it is about processing. Even if the pictures are public, that does not authorise anyone to process the pictures.

The regulation does allow some cases where processing is permitted without agreement from the subject: specifically for domestic purposes (so you can keep an address book for your own family use with names, addresses and photos scraped from the Internet if you wish).

Any other processing requires approval from the subject.

In effect, GDPR makes the question not one about ownership but about purpose and intention.

Moral rights (a concept not used in UK law as far as I know, but certainly valid in some European legal codes) are more about ownership, I believe.
