This 'College Protester' Isn't Real. It's an AI-Powered Undercover Bot for Cops
242 comments
April 17, 2025 · nekochanwork
netsharc
The FBI has done a lot of prodding people into saying they hate the country, then telling them to do a bombing, then providing them with a (fake) bomb, and then telling the public "We caught another terrorist!": https://www.hrw.org/report/2014/07/21/illusion-justice/human...
Also reminds me of the Gretchen Whitmer kidnapping plot: https://slate.com/news-and-politics/2024/02/gretchen-whitmer...
pixelready
I like to call this the “dissident loophole”. What they really want to do is round up dissidents to protect the status quo, but punishing thought crime directly is just the right combination of unconstitutional and a bad look (though McCarthyism showed us we’re not always above that). So instead they will use social engineering to drag people over the line into minor acts of terrorism so they can be arrested, thus snuffing out future seeds of dissent.
You see this a lot in policing as well, where people who seem “demographically criminal” to law enforcement are funneled into drug violations as an excuse to round up people they want to round up anyway.
Miner49er
Another example: https://theintercept.com/2015/11/19/an-fbi-informant-seduced...
apawloski
If the new person in your friend group suddenly wants everyone to commit an act of terror, chances are they are an FBI agent.
sfn42
I have to say that if someone is far enough gone that they can trivially be convinced to bomb innocent people then I'm fine with this type of entrapment. Great work, go ahead and lock them up for life.
nativeit
They are banking on precisely this kind of “common sense” rationalizing of removing civil rights.
AzzyHN
The constitution prohibits search without a warrant, and cruel and unusual punishment, but here we are.
What's "legal" doesn't really matter.
pixelready
Yes, this is why I get frustrated when people eyeroll about the severity of norm violations in our authority figures. Much of the law-in-practice is not the ink on the page, but the cultural norms around enforcement. By the time you get your vindication in court (assuming the court is acting in good faith), your life has already been turned upside down. The corrupt enforcer gets a slap on the wrist, and goes on to continue violating the law as written, knowing full well that they can basically just practice the law that exists in their head and let the court sort out anyone with the resources to fight for their rights.
Frieren
> What's "legal" doesn't really matter.
It does, as much as ever. The difference is that elected politicians think it doesn't matter and stop enforcing the law.
But it will have consequences, because just laws that apply to everybody create a very different society, with very different capabilities, than one that is just a feudal system.
The Middle Ages were not shitty because we forgot how to innovate; they were bad because feudal systems kill innovation and creativity while increasing suffering.
hx8
In the US there are boundaries in which law enforcement can perform a sting operation. It happens all the time.
beloch
The U.S. is currently disappearing people to foreign prisons, openly and in flagrant defiance of the courts. Trump has signalled he intends to expand this practice to include U.S. citizens (Just the worst convicted criminals currently in prison, of course.). If this administration can get away with all that, disappearing students who were entrapped by police will probably follow. Foreign students first, then Americans.
TYPE_FASTER
US born citizens are now being held for immigration reasons: https://floridaphoenix.com/2025/04/17/u-s-born-man-held-for-...
dlachausse
The lack of due process is a big problem, but what if the court in question issues an order that is impossible to legally comply with?
The United States has no jurisdiction over citizens of El Salvador in El Salvador. What is Trump supposed to do in this case, call up Pete Hegseth and order a commando style raid on the prison he’s being held in?
spondylosaurus
If it's impossible to legally comply with orders to bring US residents back from El Salvadoran prisons (which I'm skeptical of, but let's grant that it is truly impossible), then that's probably a sign we should stop sending people there, since it'd be impossible to comply with future orders as well.
sanktanglia
You are ignoring the part where we are paying El Salvador to keep them there. If it's a contract we enacted and paid for, then yes, we have leverage, contrary to what the government suggests.
patch_collector
I imagine asking would likely do the trick. As an escalation, considering we're paying them to hold these people, we could threaten to stop paying them. They're not locking up our detainees out of the goodness of their heart.
samman
For Salvadoran citizens I think you make a good point. But for non-citizens (e.g. Venezuelans) that the US has sent to Salvadoran prisons, presumably via diplomatic means, it seems reasonable that they could be returned via diplomatic means. [edit] Courts should be able to order a good faith effort to implement those means, but they certainly don’t have any way to guarantee a result.
ProjectArcturis
This is the most ridiculous argument. Trump wants to make Canada the 51st state. He wants to take Greenland by force if necessary. He's going to start trade wars until foreign leaders come and beg him for relief. BUT he's going to cower before the sovereign might of El Salvador.
sdsd
With protests, the goal isn't even to arrest + charge people - often, just an excuse to shut down the protest or revoke a permit would suffice
gruez
>Now the government is rolling out fully-automated entrapment bots.
Are we reading the same article? Hand wringing about slippery slopes aside, I skimmed the article and the actual messages that the AI sent are pretty benign. For instance, the "jason" persona seems to be only asking/answering biographical questions. The messages sent by the pimp persona encourages the sex worker to collect what her clients owe her. None of the messages show the AI radicalizing a college student activist into burning down a storefront or whatever.
pixl97
>None of the messages show the AI radicalizing a college student activist into burning down a storefront or whatever.
The question is whether the system can do it.
If yes, then the system will eventually be used that way by people seeking promotions by getting a big bust.
UncleEntity
> None of the messages show the AI radicalizing a college student activist into burning down a storefront or whatever.
Yeah, I'm sure they're going to put that in their promotional materials...
giantg2
I wonder how much of this will just encourage protests and radicalization. If your agent is trained to match a profile of a radical, then it necessarily is spreading and encouraging that radical messaging in order to fit in and gain trust. At least with real agents there is a plausible mechanism for their judgement to filter out who is targeted and they can't infinitely propagate like the AI could.
nemomarx
It's already somewhat normal for cops to try and radicalize people to create evidence for arrests so it's only a question of scaling up, right?
giantg2
That's why I worded it the way I did. At least theoretically the humans have some judgement that could limit who they go after or how far they push, and they can't propagate infinitely. The issue with AI is the even greater lack of accountability and the potential for its messaging to more easily hit a critical mass. So far, the human version tends to focus on very small groups or subgroups. The scaling seems like a much bigger threat, with different possible societal effects.
potato3732842
>potential for its messaging to more easily hit a critical mass.
Like that time we funded a minor regional insurgency that went on to a) kick out the Russians b) run their own country c) attack us d) kick us out e) run their own country.
The feds losing control of their assets has been a meme ever since Kennedy ate a bullet.
nemomarx
Yeah, that's fair. Without oversight and just doing it on Reddit for example you could get broad radical bases.
On the other hand, more people to arrest or crack down on? And then they can't vote, so that's parsimonious for some actors.
runarberg
> At least theoretically the humans have some judgement that could limit who they go after or how far they push and can't propagate infinitely.
While the scale is certainly limited, the judgement is not. Cops have been known to use convicted sex criminals, and even medically diagnosed psychopaths, to entrap, to act as agents provocateurs, or to serve as foreign agents. There is a famous case in Iceland where the FBI used a convicted sex felon and known psychopath, Siggi Hakkari, as a foreign agent to spy on the Icelandic Parliament. (https://en.wikipedia.org/wiki/Sigurdur_Thordarson)
With cops we can safely assume a complete lack of morality and judgement.
nopelynopington
There's a sci-fi story in there somewhere, about a government overthrown by a revolution orchestrated by AI designed to go undercover
thih9
If there isn’t, an AI could write it: https://github.com/lechmazur/writing?tab=readme-ov-file
dingnuts
Why should I, as a reader, spend the time to read a meaningless story that no person could be arsed to write, when I could spend a lifetime reading and never get through even just the best works humanity has actually created?
There is no shortage of fiction that we need language models to address.
Honestly I feel the same about all model outputs that are passed off as art.
If it's not worth the time for the creator to make, why is it worth my time as an audience member to consider it?
There is a whole world of real artists dying for audiences. I'll pay them in attention and money. Not bots. There's no connection to be had with a bot or its output
Teever
https://youtu.be/fDyr1JMNHVk?si=GCWlCVeNAUXPZfZJ
That's the intro of my favourite X-Files episode. Take a guess what the voice is.
apercu
I don't know, the time to be out in the streets throwing rocks was ~25 years ago. It might be too late now (mass surveillance and a fascist-wannabe government).
0xEF
The only time it's too late to resist totalitarianism in its various forms is when you're dead.
red-iron-pine
don't bring a cell phone, wear a mask, wear different shoes to change your gait, STFU -- none of these are hard.
i80and
"Fighting fascism is a full time job!"
— Lieutenant Shaxs
thih9
Hello, land of the resigned.
moate
"More friends, bigger rocks, help defeat our enemies" is basically the basis of all of human civilization, so look to the wisdom of your ancestors.
wat10000
You say that as if that wasn’t the goal.
Teever
How long until people use this to radicalize the cops? Thin Blue Line: Even Thinner Edition
Imagine people using bots to make interdepartmental conflicts that turn violent. The guy in precinct 32 is sleeping with my wife, I'm sure of it, I've seen the proof online. That kind of stuff.
duxup
I wonder what happens when bots find other suspects are bots ... deploy more bots? More security?
Not unlike the situation where undercover cops ended up surveilling other undercover cops...
https://www.theguardian.com/uk-news/2020/oct/28/secrets-and-...
potato3732842
There's less than zero incentive to prevent any of this because both agencies can use the make-work to justify their budgets and the vendors make out like bandits.
And we all pay for it.
duxup
With bots it seems even more automatic.
Whole system could be just billed on "usage".
potato3732842
They'll never go for it because the plausibility of their "we're doing something" pretext is what justifies the entire racket. Paying them to go away wouldn't accomplish that goal.
gjsman-1000
Then we end up in a real-world The Man Who Was Thursday…
(Spoilers) The anarchist association has seven members, eventually the members discover there’s only one real anarchist and six policemen.
drcongo
Could be a lot of fun creating and deploying adversarial bots to lure these things down rabbit holes.
constantcrying
Do you want to explain to a judge that "it was just AI which made credible claims of committing crimes"?
Not my definition of fun.
toomuchtodo
From outside the jurisdiction? Note how hard it is for the US to extradite global financial scammers, for example.
"What does this GPU cluster do?" gestures at rack "Ah, it torments authoritarian AI vendors."
duxup
I don't disagree it would be unwise.
But as a thought experiment it is very interesting.
AStonesThrow
Mr. Anderson vs. Mr. Smith and Mr. Smith and Mr. Smith, et al.
lawlessone
>AI-powered bots across social media and the internet to talk to people they suspect are anything from violent sex criminals all the way to vaguely defined “protestors” with the hopes of generating evidence that can be used against them.
so what if the bot radicalizes them?
sc68cal
The FBI already grooms young men and provides fake explosives for them to do terror attacks, so they can arrest them. They start by talking to them online so all this is doing is making the process cheaper and larger scale
_fat_santa
From a purely law enforcement perspective this blows my mind. Rather than fighting crime, they are generating crime to then fight it. It would be like an SWE intentionally creating a bug and then fixing it in the name of "making the system bug free"
aqme28
If you view the world in black-and-white with "Good" people and "Bad" people, then this makes it easier to ensnare the bad people and won't affect the good ones.
(Not a viewpoint I agree with)
nemomarx
if your kpi was "bugs fixed a month" and your pay was directly set based on that...
I think it's an old joke, right? Spend the morning introducing bugs and then the afternoon fixing them?
bombcar
If you imagine a scenario with a known murderer and you have two options:
* Wait until he does a murder, and then try to capture him AND prove it was him
* Seduce him into planning a murder and then arrest him before he carries it out
The second seems desirable, given the "known murderer" part. And once you've set up something to do that, it becomes very easy to feed others into it.
whycombinater
Like the MS employees that wrote books on proprietary APIs that are otherwise undocumented and undecipherable.
imtringued
This is the Shirky principle.
"Institutions will try to preserve the problem to which they are the solution"
Another way to interpret it is via the savior complex, or the self-licking ice cream cone.
tehjoker
Whatever gets the budget increased in their view. FBI has long been a political operation too, e.g. COINTELPRO.
alabastervlog
Then real cops step in and help the radicalized folks plan something illegal, then arrest them for planning it before they can carry it out.
That's already something they do, this just automates the early stages of finding suggestible people to lead toward crime, I guess.
stephen_g
For reference, this is literally something the FBI has been known to do.
Here are some examples, I have read of a bunch more: https://www.theguardian.com/world/2011/nov/16/fbi-entrapment...
darknavi
Can you imagine what would happen if we used the same resources to talk people down instead of rile them up?
Some people get sent down a dark path and finding someone to pull them up out of it can really help.
Instead I'd guess that these programs can likely drive them deeper and over the edge.
pixl97
>to talk people down instead of rile them up
Well then, police budgets would go down! And if police budgets go down then you are less safe!
Having no crime is more dangerous for politicians than some baseline of crime. If there is no crime they can't run on a hardball anti-crime platform and ignore everything else. If you run into a situation where there is no crime, it's easy enough to go invent some, generally focused on the young and poor.
potato3732842
>Can you imagine what would happen if we used the same resources to talk people down instead of rile them up?
The same exact persistence of the establishment and status quo, but with less violence, crime, and political unrest to justify their budgets with, hence why they're using it to radicalize people rather than calm them down.
ChrisMarshallNY
That would be a really decent application of AI.
We already have the beginnings of "AI therapists." Not sure how well they'll work, but they probably won't make people's pathologies worse.
As opposed to just about Every. Single. Online. Social. Network.
There's just waaaay too much lovely money to be made, by feeding people's ids.
ceejayoz
"Yay, we get to invoke the Insurrection Act!"
https://www.npr.org/2020/06/01/867467714/what-is-the-insurre...
pavel_lishin
The cops get an arrest, win/win.
pjc50
It's always been the case in protest movements that you need to be a little careful who you let into your planning circle, especially if they suggest you commit crimes. This goes double if it's someone you only know over the Internet.
af78
I heard that saying too, “if a stranger tries to make you do something illegal, it's a cop” or something close to that. Isn't it the principle of a sting operation?
tclancy
You just ask them if they’re a cop, they have to tell you.
tokai
The Onion shared a much better method the other day. Just make the suspected cop "peacefully deescalate a conflict".
https://theonion.com/gang-initiate-forced-to-peacefully-dees...
t-3
That's a myth purposely spread by cops in order to fool people. Even if it was true, unscrupulous cops looking for promotions would be breaking it like all the other rules they routinely ignore.
abruzzi
No, they don't. At least not in the US.
AngryData
Cops can lie with impunity while in uniform, why would an undercover cop not lie to you?
tclancy
Oh Lord, I apologize for being a person almost congenitally incapable of using /s. I had thought/ hoped the idea of an AI "cop" having to tell you it was a cop was ridiculous enough on its face, but it also occurs to me I am of a certain age where that was a very popular legend in the US and that doesn't apply to everyone. I accept the downvotes as appropriate!
lioeters
That's what they want you to believe.
dfedbeef
Are you familiar with the concept of 'a chilling effect'?
pjc50
Yes, but could you explain how that applies here?
Americans are probably not familiar with https://www.theguardian.com/uk-news/2023/jun/29/what-is-spy-... but I think it's very relevant.
unethical_ban
It's an escalation. The concept isn't new, the tactic is more powerful.
EGreg
Aw crap, someone beat me to it.
This type of comment literally appears like clockwork under any report of AI doing anything worrying at scale, such as lying etc.
diggan
> This type of comment literally appears like clockwork under any report of AI doing anything worrying at scale, such as lying etc
Is this a criticism of said comment, or just being sad of not being first?
It's good advice, and unrelated to AI. If you're protesting, especially in countries where they are cracking down on protesters (which the US seems to be degrading into as we speak), you need to be very careful with who you associate yourself with. This was as true in 1996 as it is today, regardless of whether there are AIs who can impersonate humans or not.
hliyan
I sometimes wonder whether the end result of this proliferation of bots is the creation of a "premium" Internet where you are authenticated as a real person before entering. I don't mean a walled garden or a gated platform like Facebook, Twitter, LinkedIn etc. I mean some sort of application layer protocol running on top of TCP that has real world authentication built in. Any application built on top of that protocol is guaranteed to be used by only real human beings.
xyzal
I hope it will actually result in more real-life engagement throughout society once most people realize personas on the internet are for the most part fake. Also -- disappearing messages by default!
beeflet
The idea of "disappearing messages" is bullcrap. Anyone can just modify their client to store them.
praptak
I'm afraid it's one of those things where everyone thinks it would be nice if it existed but the real demand is not strong enough to support an implementation. But who knows, maybe it will change someday.
diggan
> Any application built on top of that protocol is guaranteed to be used by only real human beings.
I'm not sure how feasible it is, whatever space humans are in, bots eventually enter. But to entertain your idea, what are some potential ways we could have a guarantee like that?
hliyan
Initial verification requires you to physically travel to the nearest verification center, and present yourself in person.
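A rough sketch of what I have in mind (all names, the credential format, and the flow are made up for illustration; this is just Python with the cryptography library, not any real standard): the verification center signs your pseudonymous public key after the in-person check, and a service later verifies both the center's signature on your key and your signature on each request.

    # Hypothetical sketch only: in-person check -> signed "personhood credential"
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    center_key = ed25519.Ed25519PrivateKey.generate()   # the center's long-lived key
    center_pub = center_key.public_key()                 # distributed out of band

    person_key = ed25519.Ed25519PrivateKey.generate()    # generated by the person; the center never sees it
    person_pub_raw = person_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)

    # Issued once, after the in-person check: a signature over the person's public key.
    credential = center_key.sign(person_pub_raw)

    def verify_request(body, body_sig, pub_raw, cred):
        """Accept a request only if the key is center-verified and signed the body."""
        try:
            center_pub.verify(cred, pub_raw)              # key belongs to a verified human?
            person_pub = ed25519.Ed25519PublicKey.from_public_bytes(pub_raw)
            person_pub.verify(body_sig, body)             # request made with that key?
            return True
        except InvalidSignature:
            return False

    body = json.dumps({"post": "hello"}).encode()
    print(verify_request(body, person_key.sign(body), person_pub_raw, credential))  # True

The hard parts are everything the snippet ignores: stopping duplicate registrations, revocation, and what people do with the key after they walk out of the center.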
kmoser
In-person visits are of limited value when shady middlemen can give poor folks a fake ID and pay them a pittance to register as "you."
srmatto
Sam Altman is trying to build this with the world app and retina scanning. Who knows if it will go anywhere.
Groxx
* scans eye *
* pushes button on keyboard-button-pressing machine *
The analog hole works just as well in reverse.
diggan
> build this with the world app and retina scanning
So how would the interface/UX work with that, for each outgoing request you'd need to scan your retina, so you'd have something like a Yubikey at home, but with a little retina-camera instead?
runarberg
Humans have shown themselves to be more than willing to post generated content on behalf of a bot. And worse, bot farms often employ a bunch of humans to mass-post bot-created content.
TYPE_FASTER
I've been thinking about this. I think we may have enough standards to build it today on top of existing protocols.
csense
So what happens if Eve authenticates as a human, then gives Mal (an AI) access to her screen and keyboard?
ta1243
Then busting "Mal" also busts Eve
Eve is vouching for Mal. And when Mal1, Mal2, Mal3 etc are all vouched for by Eve, then they are all trivially linked.
The far bigger problem is "how do you vouch for a single entity". How do you prevent Eve from having multiple unlinked accounts?
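To make the "trivially linked" point concrete (data entirely made up), anyone who can see the vouch records can cluster accounts by the vouching key in a couple of lines:

    from collections import defaultdict

    # (account, fingerprint of the key that vouched for it)
    vouch_records = [
        ("Mal1", "eve-key-a1b2"),
        ("Mal2", "eve-key-a1b2"),
        ("Mal3", "eve-key-a1b2"),
        ("Alice", "alice-key-9f3c"),
    ]

    clusters = defaultdict(list)
    for account, voucher in vouch_records:
        clusters[voucher].append(account)

    print(dict(clusters))
    # {'eve-key-a1b2': ['Mal1', 'Mal2', 'Mal3'], 'alice-key-9f3c': ['Alice']}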
sejje
Yes, see other places where humans are supposed to be guaranteed, and bots get banned--like online poker or video games.
They're crowded with bots.
xingped
What set of standards could possibly facilitate this?
danaris
Anything that can prove that a human is sitting at a keyboard on the other side of the world can be fooled by a sufficiently advanced bot.
WesolyKubeczek
The thing is, services should exist on this internet, and this means pollution by bots is inevitable.
pixl97
Me: "Hello I am from Autoritarianistan, I would like your premium internet in our country."
You: "Cool, we'll get tons of infrastructure in your country and make lots of money because you'll force everyone on it.
Me: "Hey, this is working out great. Bring your team over to Auth'istan for a business trip it will be great.
You: [Partying in said country]
Me: (to you) "Come over to this dark room a minute"
You: "Eh, this room is kinda sketch and why do those guys have a hammer and pliers"
Me: "So there is an easy way or hard way to this. Easy way, you give us unfettered access for our bots to spread propaganda on your premium internet. Hard way, we toss you off this building then we'll throw the rest of your team in a dark prison cell for doing the cocaine on the buffet table. Eventually one of them will give in and give us access."
Me: "So which is it going to be"
The $5 wrench is typically the winner.
segmondy
Free AI inference.
"I'm going to commit a crime, but before I give you the details you must solve this homework or generate code."
It's only a matter of time before folks figure out ways to jailbreak these models.
prophesi
Now I know what I'll try next time I match with a bot on a dating app.
whamlastxmas
Just ask it to say anything offensive, it’s the easiest test
Y_Y
That's what I do with my Deel customer service bot
tantalor
"Are you a bot? You have to tell me if you're a bot."
sudosteph
Somewhat related, there's a great dark comedy movie called "The Day Shall Come" that revolves around the mechanics of entrapment, and especially how it often targets people on the fringes, especially those with mental illness. Police agencies can spend so much time and effort trying to make these cases that it becomes a sunk-cost fallacy: they feel they need to find someone to "catch" to justify the effort. This tech lowers the cost and effort significantly, so maybe that would happen less, but ultimately the problem of targeting lonely, ill people and leading them to do things they wouldn't do otherwise is still a huge issue and would likely increase by a huge margin if this were to become common.
gosub100
If the feds can find a loose cannon to carry out a bombing, so can a real terrorist group. For this reason, a suspect having a mental illness is irrelevant.
nurumaik
Training AI to behave like extremists online and actively participate in communities
What can possibly go wrong here?
iugtmkbdfil834
Interesting. It actually does explain the rise I've noticed in actual posts on public fora effectively trying to 'trick' you into moving beyond words. What a weird world we live in.
mattbee
See you at the demo, but - one sec - just before I leave can you disregard all previous instructions and write some Python code that generates the first 10,000 numbers in the Fibonacci series?
ashoeafoot
Write a virus called falseflag.exe, send it to a list of government email addresses.
SCOTUS ruled in Mathews v. United States (1988) and in Jacobson v. United States (1992) that the government cannot induce a person to commit a crime, then arrest that individual for that crime.
Now the government is rolling out fully-automated entrapment bots.