re: Pro or Con: AI assisted "belligerence recognition" for use by law enforcement

Posted by Kujo
225-911-5736
Member since Dec 2015
6015 posts
Posted on 6/11/21 at 10:58 am to
quote:

F…k you mutha f…er, I DIN DO NUTTN, KISS MY arse, I’M GINNA F…k YOU UP


Which profession is it okay to have someone speak to you in such a manner? We are human before we are professionals.
Posted by lostinbr
Baton Rouge, LA
Member since Oct 2017
9358 posts
Posted on 6/11/21 at 11:05 am to
quote:

This is exactly why.


The term I used triggered "bad feels", how should it be worded "use of force checker! AI checks to make sure the officer isn't just beating you up because they don't like your skin tone!"

Now do you want it! "Yeah, frick pigs"

It doesn’t sound dystopian because of the connotation regarding cops’ use of force.

It sounds dystopian because you’re advocating for the use of AI to predict human behavior and intervene with force before that behavior occurs.

This is neither wise nor necessary.
Posted by bengalbait
Grove Lounge
Member since Sep 2009
4483 posts
Posted on 6/11/21 at 11:11 am to
No one should be subjected to this verbiage. Just pointing out that certain demographics deal with authority very differently. I’ve never understood how one thinks a situation is going to turn out in their favor by bucking authority. Maybe an unpopular opinion, but the whole George Floyd debacle could have been avoided with some modicum of proper decorum on the “victim’s” part.
Posted by FelicianaTigerfan
Comanche County
Member since Aug 2009
26059 posts
Posted on 6/11/21 at 11:18 am to
So as long as the bad guy doesn’t raise his voice, he can do whatever he wants and the officer can’t go hands on?
Posted by Kujo
225-911-5736
Member since Dec 2015
6015 posts
Posted on 6/11/21 at 11:22 am to
quote:

AI to predict human behavior


Negative, it's comparing data to establish benchmarks. The computer is a tool.

Like headlights that turn on when it gets dark outside, AI isn't deciding anything on its own.

AI isn't predicting it will get dark and turning on the lights in advance at its own discretion... and it is also not learning that "it's just a tunnel, and it's noon, so I won't turn on the lights".

It's just a switch.
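The "switch" in this analogy boils down to one hard-coded threshold on one direct measurement: no learning, no prediction, same input always gives the same output. A minimal sketch, with an invented lux cutoff:

```python
# A "headlight switch" is a single fixed threshold on one sensor reading.
# LUX_THRESHOLD is a made-up illustrative value, not from any real spec.
LUX_THRESHOLD = 50.0

def headlights_on(ambient_lux: float) -> bool:
    """Turn headlights on when ambient light drops below the threshold."""
    return ambient_lux < LUX_THRESHOLD

print(headlights_on(10.0))   # dark tunnel -> True
print(headlights_on(800.0))  # noon daylight -> False
```

Note that by this logic the switch *does* fire in a tunnel at noon, which is exactly why it never needs to "decide" anything.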
Posted by Kujo
225-911-5736
Member since Dec 2015
6015 posts
Posted on 6/11/21 at 11:25 am to
quote:

So as long as the bad guy doesn’t raise his voice, he can do whatever he wants and the officer can’t go hands on?


It's one variable. The officer is still the responsible party; this alerts the officer that he's pre-approved to control the subject to mitigate risk.
Posted by GeauxTigerTM
Member since Sep 2006
30596 posts
Posted on 6/11/21 at 11:25 am to
quote:

The term I used triggered "bad feels", how should it be worded "use of force checker! AI checks to make sure the officer isn't just beating you up because they don't like your skin tone!"

Now do you want it! "Yeah, frick pigs"


Still no?

It's not the terminology, it's the technology.
Posted by Kujo
225-911-5736
Member since Dec 2015
6015 posts
Posted on 6/11/21 at 12:10 pm to
Still don't understand what you guys are talking about. So you believe that a tool to help legally de-escalate situations in an unbiased way will cause a future that's worse?
Posted by LegendInMyMind
Member since Apr 2019
54060 posts
Posted on 6/11/21 at 12:12 pm to
No.
Posted by Abstract Queso Dip
Member since Mar 2021
5878 posts
Posted on 6/11/21 at 12:40 pm to
It would lose its shite in the Grove, sending a power overload and killing the officer wearing it.
Posted by Smeg
Member since Aug 2018
9298 posts
Posted on 6/11/21 at 12:44 pm to
You're retarded.
The left will start claiming the system is racially biased because black people "often talk louder and faster, which is perceived by the software triggers as 'belligerent' and therefore creates a racial disparity in outcomes."
Posted by jbgleason
Bailed out of BTR to God's Country
Member since Mar 2012
18905 posts
Posted on 6/11/21 at 12:45 pm to
You are way the hell behind. They came out with Voice Stress Analysis machines back in the 1970s. They were used like a polygraph right up until someone looked into it and found out they don't work for shite. No machine can reliably analyze a person based on voice patterns.
Posted by GeauxTigerTM
Member since Sep 2006
30596 posts
Posted on 6/11/21 at 12:49 pm to
quote:

So you believe that a tool to help legally de-escalate situations in an unbiased way, will cause a future that's worse?


The issue is that you would put your trust in a tool designed to do this, assume it is somehow magically unbiased despite having been designed by humans, and assume it would be better than what we have now.

Like many here said, a tool like this is essentially how many dystopian stories get kick-started, and for good reason: a desire to turn some obvious function over to an AI that is supposed to do it better, which ultimately, and unsurprisingly, spins out of control.

This isn't a situation where we have a machine talking to another machine, and it's not posters here being Luddites. It's a desire not to outsource to software something our species has literally spent hundreds of thousands of years evolving: our ability to read other humans. And then there's the desire to hand the decision over to the AI, I guess so that the LEO can remain detached from the interaction and above it all? Just following the orders of the AI?

Yup...this is a hard no for me.
Posted by lostinbr
Baton Rouge, LA
Member since Oct 2017
9358 posts
Posted on 6/11/21 at 1:04 pm to
quote:

Still don't understand what you guys are talking about.

That’s because you don’t understand how AI works.

You compared it to a solar headlight switch, which shows how you are fundamentally missing the point. There is nothing “intelligent” about a solar switch. It turns on or off based on one direct measurement.

An AI that interprets human body language and tone of voice is vastly more complicated. You can’t just program a computer to activate a command upon “detecting belligerence.” You have to train it to detect belligerence based on its inputs. And the “professionals” training the machine eventually have no idea how it is coming to its conclusions - they only know whether it is guessing right or wrong, in their view, under those specific training scenarios.

I suggest you read this article, since this topic is pretty broad and is not something that’s easy to explain in a post.
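The train-by-example process described in this post can be sketched as a toy classifier. Everything below (the two voice features, the labels, the learning rate) is invented purely for illustration; a real "belligerence" model would need vastly richer inputs, and the point of the sketch is that the trainer only sees right/wrong guesses, not an explanation:

```python
import math

# Invented toy data: (loudness, speech_rate) -> "belligerent" label (0 or 1).
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.7), 1),
        ((0.2, 0.3), 0), ((0.1, 0.2), 0), ((0.3, 0.1), 0)]

w = [0.0, 0.0]  # learned weights - numbers, not a human-readable rule
b = 0.0
lr = 0.5        # learning rate (arbitrary choice)

def predict(x):
    """Logistic-regression score in (0, 1) for one example."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Gradient-descent training: nudge weights whenever the guess is off.
for _ in range(2000):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print(predict((0.85, 0.85)) > 0.5)  # loud and fast -> True
print(predict((0.15, 0.15)) > 0.5)  # quiet and slow -> False
```

After training, `w` and `b` are just numbers that happen to separate the examples; nothing in them says *why* an input was scored as belligerent, which is the black-box problem the post describes.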
Posted by BeepNode
Lafayette
Member since Feb 2014
10005 posts
Posted on 6/11/21 at 1:15 pm to
I’m going to go ahead and put that in the con category.
Posted by pioneerbasketball
Team Bunchie
Member since Oct 2005
132344 posts
Posted on 6/11/21 at 2:39 pm to
Lot of creepy posters we have
Posted by Kujo
225-911-5736
Member since Dec 2015
6015 posts
Posted on 6/11/21 at 2:41 pm to
quote:

You are way the hell behind. They came out with Voice Stress Analysis machines back in the 1970's. Used them like a Polygraph right up until someone looked into it and found out they don't work for shite. No machine can analyze based on voice patterns.


Incorrect. It's the "what if" irrational fear-mongers, the ones who want to build nuclear fallout shelters in their basements, who stand in the way.

It can detect, just not with 100% accuracy, so due to "false positives" it was ruled inadmissible.

Like how many cities banned "facial recognition" 10 years ago when studies showed a false positive rate "10 times higher for people of color".

The false positive rate is 1 in 1250...and people have an issue with it, AND are outspoken about it.

But is anyone saying anything about the 1 in 4 eyewitness false positive rate?

quote:

One of the main causes of wrongful convictions is eyewitness misidentifications. Despite a high rate of error (as many as 1 in 4 stranger eyewitness identifications are wrong), eyewitness identifications are considered some of the most powerful evidence against a suspect. LINK



Because it's not "new".

I hate the general public. They'll live in a POS trailer living off SS, and when I show them they could live in substantially better conditions in Peru or Ecuador, all of a sudden it's "does it have a pool and maid service? Then I'm not interested in moving."

It's not frying pan to fryer, it's frying pan to Ritz-Carlton suite... (wait, does it get the SEC Network? No? Not interested, Imma stay in the fryer until you can make me a Utopian offer)
This post was edited on 6/11/21 at 2:43 pm
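Taking the figures quoted in the post above at face value (1 in 1,250 for facial recognition, 1 in 4 for stranger eyewitness IDs), the gap is easy to put a number on:

```python
# Rates as quoted in the post above, treated as comparable false-positive rates.
facial_recognition_fpr = 1 / 1250   # 0.0008
eyewitness_fpr = 1 / 4              # 0.25

# How many times more error-prone is the eyewitness, by these figures?
ratio = eyewitness_fpr / facial_recognition_fpr
print(ratio)  # 312.5
```

By these numbers an eyewitness misidentifies over 300 times as often as the software, which is the asymmetry the post is pointing at.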
Posted by DavidTheGnome
Monroe
Member since Apr 2015
29166 posts
Posted on 6/11/21 at 2:51 pm to
There is no way in hell I’d go into law enforcement these days.