
Armed police handcuff teen after AI mistakes chip bag for gun in Baltimore

sshine

These stories always surprise me.

Somehow the culprit is AI, yet grown men handcuff a child when there is no gun.

The bar for becoming a cop in the US must be really low.

CaptainOfCoit

Considering how many "large boned" US police officers one can see on social media today (I don't live in the US), it seems like they don't even have physical exams anymore, which would be the lowest bar to put in place.

hollerith

It seems to me that being overweight is an occupational advantage for a police officer?

skeeter2020

For joining ICE, it's even lower. Here are some helpful tips from theonion.com:

https://theonion.com/how-to-join-ice/

Including these gems:

* Be born with something just…missing

* Undergo background check confirming at least one prior arrest for a violent crime

drivingmenuts

The Onion is supposed to be satire, not an instruction manual.

Esophagus4

I don't think that's a fair characterization of this particular situation: that the cops were idiots. This also wasn't entirely an AI problem, as it was reviewed (presumably) by a human before being sent to the police.

I'm assuming you're not in the US from your comment, so I encourage you to think more broadly about the problem rather than focusing on the headlines.

To play devil's advocate, this case exists within a larger system in the US. In particular:

1) the proliferation of guns in America causes tremendous uncertainty for police, for whom any encounter with the public could involve a firearm. In other countries where the public have fewer firearms, police don't have to escalate a situation as quickly for safety reasons because they can assume the person they're interacting with doesn't have a gun.

1a) therefore, police are trained to take control of situations in certain ways if they believe a threat to be present: for example, drawing their gun and handcuffing a potential suspect out of an abundance of caution, then performing their investigation once the scene is safe. That appears to be what happened here.

2) the risk of school shootings puts pressure on police, school systems, etc. to prevent these situations. Hence the deployment of technology, armed guards at schools, etc.

2a) Sadly, local police often bear the brunt of larger system failures. Example: mental health treatment difficult to get? Police will have to deal with people who are not taking their anti-psychotic medications, or who are at risk of harming themselves or others. Proliferation of guns for political reasons? Police will have to encounter more people with guns.

3) Police are human beings. And even in the best case, with incredible amounts of training (which they may or may not have), they will still have to make split-second decisions that could mean life or death. Some of those may be mistakes.

Combine all these forces, and you have a recipe for incidents.

It's absolutely fair to criticize behavior and hold law enforcement to a high standard, but man... I think we haven't put them in a position to succeed, and we ask a tremendous amount from them.

It's the same as when there's an incident in a production software environment: my general thought is not, "the bar for being a developer here must be really low because they're incompetent." My thought starts with, "what in our system allowed this event to happen?"

bb88

Let's criticize their behavior then.

Police have a long history of racial profiling. Driving while black. Walking while black. And in this case, carrying an empty bag of Doritos while black. They are often taught to "secure by force" rather than "de-escalate and investigate". And if they do violate your rights, so what? They have qualified immunity.

On the issue of guns, if you are white and Republican, you can walk around the Idaho Statehouse with a loaded AR-15 without any issue whatsoever. It's only when the gun goes off that there's a problem.

Here's a fun story where an 11-year-old girl carried an AR-15 to the Idaho Statehouse.

https://www.ktvb.com/article/news/local/capitol-watch/girl-1...


ufmace

Assuming this quote from the article is true, it sounds like the only villain here is the school principal:

> It said the AI alert was sent to human reviewers who found no threat - but the principal missed this and contacted the school's safety team, who ultimately called the police.

So the AI company's reviewers correctly determined that there was no actual gun, and the police responded appropriately to a report by a person (the school principal) who claimed to have seen this student with a gun. The question then is: what the heck is this principal doing? Why do they have access to this AI information before it has been verified, and why are they going off and calling the police with it before doing any verification themselves?

Also, none of the coverage includes what was said to the dispatcher, but maybe they screwed up too - I would expect a dispatcher to question the caller in more detail about what's going on. Like, what kind of gun was it, is it in their hand or in a pocket, did they point it at anyone or threaten anyone with it, or are they just carrying it around? Such information could be used for a more appropriate police response.

carlosjobim

A child?

gherkinnn

"Computer said so" is the new "Computer says no".

rahimnathwani

Extensive previous discussion: https://news.ycombinator.com/item?id=45684934 (433 comments)

conradev

  Mr Allen said: "I don't think no chip bag should be mistaken for a gun at all."

Mr. Allen is right. My UniFi camera system regularly detects cars when I play Mario Kart World. I cannot fathom summoning police with an accuracy rate that low and no human in the loop. Absolutely wild.

unyttigfjelltol

Well, there was a human in the loop— the one in dispatch and the one with the handcuffs. Question is, why did they decide to act like bots?

wildzzz

The dispatcher is just there to send police to a call. I doubt any dispatcher is going to put their ass on the line deciding whether a caller (AI or human) is credible when a gun is involved. Send the cops and let them figure it out. Plus, it's not like they are experts at identifying a gun from CCTV images. What if it was a gun and the dispatcher made the decision not to send cops? That kind of decision really isn't theirs to make.

conradev

Dispatch is quite complicated. They are always operating with limited time and imperfect information. Their job is to send help. As for the attending officers, they are supposed to know that dispatch has imperfect information.

There should be a higher bar for determining gun possession, just like an EMT calling ahead for trauma or stroke protocols. It puts everyone on high alert!

kcplate

Pretty sure AI detected a “shape” in a pocket that was created by a chip bag.

That is quite a bit different than "AI mistakes chip bag for a gun".

complex_pi

The end result is the same to be honest.

kcplate

Sure, but could a gun also cause a similar shape within the pocket?

My point is that you wouldn't necessarily want an AI that is designed to detect weapons in this manner to ignore a gun-shaped object in a pocket just because it might be something that is not a gun. So did the AI actually fail in this case? In my mind, no.

Please note, I am not debating whether these types of detection systems should or shouldn't be in use here. Personally I am very much against them. No doubt the human element of this story deserves criticism, but the AI? Not in my opinion.

failrate

Don't make me tap the sign: "A COMPUTER CAN NEVER BE HELD ACCOUNTABLE

THEREFORE A COMPUTER MUST NEVER MAKE A MANAGEMENT DECISION"

CaptainOfCoit

You're kind of preaching to the choir here with that.

How do we get the people who are already doing this in businesses to stop doing it, if they see it as "saving time" and currently don't have qualms with it?

amarant

Hold them accountable for the computer's mistakes? The computer is a tool, and if a carpenter makes a mess, we're not likely to blame the hammer. This isn't different in any way.

fhualkd

What choir?

The article is already [FLAGGED].

Like everything else on this site that even hints at criticism of technology or authoritarianism.