CBP signs Clearview AI deal to use face recognition for 'tactical targeting' (wired.com)

240 points by cdrnsf 6 hours ago

sailfast 5 hours ago

Always easier when you can avoid the law and just buy it off the shelf. It’s fine to do this, we say, because it’s not being done by the government - but if they’re allowed to turn around and buy it we’re much worse off.

digiown 4 hours ago

That's why it doesn't make sense to ban governments from doing things while still allowing private companies. Either it is illegal to surveil the public for everyone, or the government can always do it indirectly with the same effect.

I don't think the deal described here is even that egregious. It's basically a labeled data scrape. Any entity capable of training these LLMs is able to do this.

asveikau 4 hours ago

The difference is that a government can take personal liberty away from people in the most direct way. A private company can't decide to lock somebody away in prison or send them to death row. (Hopefully anyway.) So we put a higher standard on government.

That said, I do believe there ought to be more restrictions on private use of these technologies.

pixl97 4 hours ago

helterskelter 4 hours ago

WrongAssumption 3 hours ago

heavyset_go 3 hours ago

digiown 3 hours ago

kristopolous 4 hours ago

tintor 3 hours ago

mrguyorama 2 hours ago

keybored 30 minutes ago

bcrosby95 3 hours ago

koolba an hour ago

What would such a ban look like?

A private company can surely link its own cameras and data to create a private use database of undesirables. I’m certain that Walmart and friends do exactly this already. It’s the large scale version of the Polaroids behind the counter.

bad_haircut72 an hour ago

runlevel1 3 hours ago

Just like when Verizon sold its customers' precise location history to data brokers who then sold it to law enforcement agencies.[^1] Laundered.

[^1]: https://arstechnica.com/tech-policy/2025/09/court-rejects-ve...

Manuel_D an hour ago

That's not how the law works in the US. The government cannot have a third party take action on its behalf to do something that would be illegal for the government to do itself. This is why the Biden administration had a restraining order filed against it, on account of them pressuring social media companies to ban content it didn't like. This violated the First Amendment, despite the fact that it was a third party that was doing the actual banning at the behest of the government.

The government could legally create its own facial recognition technology if it wanted to. They're not avoiding the law, facial recognition isn't illegal.

mothballed an hour ago

That's pretty much how KYC works. The government can't just willy-nilly demand papers from everyone going into a bank to open an account, due to the Fourth Amendment. So they make the bank do it so it's a "private" act, and then, for instance, the IRS is authorized to do warrantless seizures on accounts that are now tied to names that were forced to be revealed under KYC laws.

Manuel_D an hour ago

duped 4 hours ago

This is why we should shun the people that build this stuff. If you take a paycheck to enable fascism, you're a bad person and should be unwelcome in polite society.

observationist 5 hours ago

snarky123 3 hours ago

"Tactical Targeting" - you just know someone's PowerPoint presentation used the word "synergy" in it too.

givemeethekeys 4 hours ago

How long before they bring the price down and local PDs start using it too?

nsriv 4 hours ago

Not sure if you're joking but Clearview's primary customers are local or metro police departments.

tcmart14 11 minutes ago

I'm sure the anti-vax crowd who were foaming at the mouth over the vaccine containing tracking chips will explain why this is needed and why it's not a big deal.

yababa_y 5 hours ago

local laws forbidding facial recognition tech have never been wiser

quantified 4 hours ago

225k USD per year sells us cheaply!

grvdrm 2 hours ago

I keep reading this as “CBS signs…” and can’t help thinking about that uncomfortable possible future moment.

mschuster91 5 hours ago

And this right here is why Clearview (and others) should have been torn apart back when they first appeared on stage.

I 'member when people warned that something like this had the potential to be abused for/by the government; we were ridiculed at best. And look where we are now, a couple of years later.

gostsamo 5 hours ago

"This cannot happen here" should be classified as a logical fallacy.

dylan604 5 hours ago

As stated in many of the comments in my code, where some else-branch claims "this shouldn't be happening."

jmyeet 4 hours ago

There are certain people who believe that average citizens can be held responsible for the actions of their government, to the point that they are valid military targets.

Well, if that's true then employees of the companies that build the tools for all this to happen can also be held responsible, no?

I'm actually an optimist and believe there will come a time when a whole lot of people will deny ever having worked for Palantir, for Clearview on this, and so on.

What you, as a software engineer, help build has an impact on the world. These things couldn't exist if people didn't create and maintain them. I really hope people who work at these companies consider what they're helping to accomplish.

Manuel_D 40 minutes ago

> average citizens can be held responsible for the actions of their government, to the point that they are valid military targets.

What do you mean by this? If a government conscripts "average citizens" into its military then they become valid military targets, sure.

I'm not sure why you think this implies that developers working for Palantir or Clearview would become military targets. Palantir builds software for the military. But the people actually using that software are military personnel, not Palantir employees.

some_random 3 hours ago

>There are certain people who believe that average citizens can be held responsible for the actions of their government, to the point that they are valid military targets.

Yeah we typically call those people terrorists or war criminals.

mikkupikku 2 hours ago

Or heroes, if they win.

some_random an hour ago

the_gastropod 4 hours ago

I never worked at a company that could broadly be considered unethical, I don't think. But it was always a bit disheartening how many little obviously unethical decisions (e.g., advertised monthly plans with a small print "annual contract" and cancellation fee) almost every other employee would just go along with implementing, no pushback whatsoever. I don't know what it is, but your average employee seemingly sees themselves as wholly separate from the work they're paid to do.

I have friends who are otherwise extremely progressive people, who I think are genuinely good people, who worked for Palantir for many years. The cognitive dissonance they must've dealt with...

throw-qqqqq 4 hours ago

> I don't know what it is, but your average employee seemingly sees themselves as wholly separate from the work they're paid to do.

Hannah Arendt coined the term “the banality of evil”. Many people think they are just following orders without reflecting on their actions.

cyanydeez an hour ago

"Tactical Targetting": Whitewash stochastic terrorism to attack brown people before midterms.

OutOfHere 5 hours ago

We need a Constitutional amendment that guarantees a complete right to anonymity at every level: financial, vehicular, travel, etc. This means the government must not take any steps to identify a person or link databases identifying people until there has been a documented crime where the person is a suspect.

Only if an anonymous person or their property is caught in a criminal act may the respective identity be investigated. This should be sufficient to ensure justice. Moreover, the evidence corresponding to the criminal act must be subject to a post-hoc judicial review for the justifiability of the conducted investigation.

Unfortunately for us, the day we stopped updating the Constitution is the day it all started going downhill.

_3u10 5 hours ago

That will be wildly unpopular with both parties and, most importantly, their constituents. I doubt even the Libertarian Party, should they win the presidency, House, and Senate, could pull it off.

OutOfHere 5 hours ago

Note that the Amendment would apply only to the government, not to private interests. Even so, it could be unpopular among advertisers and data resellers, e.g. Clearview, who sell to the government. I guess these are what qualify as constituents these days. The people themselves have long been forgotten as being constituents.

plagiarist 4 hours ago

What do you mean "even" the libertarian party? Libertarians would remove whatever existing laws there are around facial recognition so that companies are free to do whatever they like with the data.

quantified 4 hours ago

Maybe. Anonymity is where bad actors play. Better to have better disclosure and de-anonymization in some cases. If some live in fear (e.g. of cartels), go after the cartels harder than they go after you.

GVIrish 2 hours ago

> Maybe. Anonymity is where bad actors play.

The problem is when the government changes the definition of 'bad actor'.

OutOfHere 4 hours ago

> Anonymity is where bad actors play

That is a myth spread by control freaks and power seekers. Yes, bad actors prefer anonymity, but the quoted statement is intended to mislead and deceive because good actors can also prefer strong anonymity. These good actors probably even outnumber bad ones by 10:1. To turn it around, deanonymization is where the bad actors play.

Also, anonymity can be nuanced. For example, vehicles can still have license plates, but the government would be banned from tracking them in any way until a crime has been committed by a vehicle.

quantified 3 hours ago

wat10000 3 hours ago

Anonymity is where little bad actors play. The big ones don't need to be anonymous because their nefariousness is legal, or they don't get prosecuted. See: waves vaguely in the direction of the US government.

That said, the recent waves vaguely in the direction of the US government has demonstrated the weakness of legal restrictions on the government. It's good to have something you can point to when they violate it, but it's too easily ignored. There's no substitute for good governance.

neuroelectron 5 hours ago

Don't we already have facial recognition technology that isn't based on AI? Why is throwing AI into the mix suddenly a reasonable product? Liability waivers?

dylan604 5 hours ago

I think the facial rec systems you're thinking of will recognize faces, but not ID them. They need you to label a face, and then they recognize that face with a name from there on. Clearview is different in that you can give it an unknown face and it returns a name. Whether it's just some ML-based model vs. an LLM, it's technically still under the AI umbrella.

lazide 4 hours ago

Uh no? Facial recognition to names has been the bread and butter of facial recognition since the beginning. It’s literally the point.

dylan604 4 hours ago

porridgeraisin 4 hours ago

After the literal first one from the 1960s, which just measured distances between the nose, the mouth, and so on, everything else has been based on AI.

If memory serves, we had PCA- and LDA-based ones in the 90s, and then in the 2000s a lot of hand-crafted AdaBoost classifiers and (non-AI) SIFT features. This is where 3D sensors proved useful, and it's the basis for all sci-fi portrayals of facial recognition (a surface depth map drawn on the face).

In the 2010s, when deep learning became feasible, facial recognition, like the rest of AI, switched to end-to-end neural networks. That is what is used to this day, and it's pretty much the first iteration to work flawlessly regardless of lighting, angle, and whatnot. [1]
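
Concretely, the identification step in these end-to-end systems reduces to a nearest-neighbour lookup over a gallery of labeled embeddings. A minimal sketch of that lookup (the vectors and the 0.4 threshold below are made up; a real system would get the vectors from the network and tune the threshold on data):

    import numpy as np

    def identify(probe_emb, gallery, threshold=0.4):
        # Normalize, then pick the gallery identity with the smallest
        # cosine distance; reject if nothing is under the threshold.
        p = probe_emb / np.linalg.norm(probe_emb)
        best_name, best_dist = None, float("inf")
        for name, emb in gallery.items():
            g = emb / np.linalg.norm(emb)
            dist = 1.0 - float(np.dot(p, g))
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= threshold else None

    # Toy usage: random vectors stand in for real 128-d face embeddings.
    rng = np.random.default_rng(0)
    gallery = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}
    probe = gallery["alice"] + 0.05 * rng.normal(size=128)  # a noisy re-shoot
    print(identify(probe, gallery))  # -> alice

Scale that gallery up to billions of scraped, labeled photos and you have the Clearview model in outline.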

Note about the terms AI, ML, Signal processing:

In any given era:

- whatever data-fitting/function approximation method is the latest one is typically called AI.

- the previous generation one is called ML

- the really old now boring ones are called signal processing

Sometimes the calling-it-ML stage is skipped.

[1] All data-fitting methods are only as good as their data. Most of these were initially trained mostly on Caucasian faces, so many of them were not as accurate for other people. These days the ones deployed by Google Photos and the like of course work for other ethnicities as well, but many models still don't.

lenerdenator 5 hours ago

Wear a face mask in public. Got it.

estebank 5 hours ago

I think anything short of fully obscuring your face (à la ICE agent/stormtrooper) will be merely a mitigation and not 100% successful. I recall articles talking about face recognition being used "successfully" on people wearing surgical masks in China. In the US they ask you to remove face masks in places where face recognition is used (at the border, TSA checkpoints), but I would be unsurprised if that isn't strictly needed in most cases (asking people to remove them preemptively just ends up being faster for throughput).

quantified 4 hours ago

Probably room to add little cheek pads or other shape-shifters under the mask.

verdverm 4 hours ago

nullocator 3 hours ago

Your gait, I think, is more useful than your face anyway, and my understanding is that it's more difficult to disguise. So you'll need a wheelchair/scooter and a mask in public.

ajcp 2 hours ago

Putting a rock in your shoe instantly changes your gait signature.

mrguyorama an hour ago

dylan604 5 hours ago

Aren't we back to where this is illegal again, unless you're an ICE agent?

lenerdenator 5 hours ago

"Hey man, doctor's orders. Gotta wear it to get allergy relief. And no, can't ask about it... HIPAA stuff."

hackingonempty 4 hours ago

FireBeyond 4 hours ago

dylan604 4 hours ago

adi_kurian 3 hours ago

If you have not yet heard of it, look into gait recognition. Any battle for anonymity is a losing one, it appears.

lenerdenator 2 hours ago

In that case, guess it's time to start thinking of ways to make it unappealing to act upon the intelligence they've gathered upon us.

josefritzishere 5 hours ago

Skynet. "You only postponed it. Judgment Day is inevitable."