re: Regarding AI and Driverless Cars...
Posted on 10/3/17 at 10:39 pm to TigerFanInSouthland
quote:
I ask this because they paralleled these two scenarios with driverless cars and AI. Let's say you were riding in your car and a wreck was about to happen in which either you or the people you crash into will die. How is the AI in the driverless car supposed to decide for you who dies and who doesn't?
I can't quite remember every facet of the arguments that were made, but I do believe this (and AI as a whole, given the road we're going down) is a very, very dangerous slippery slope.
It's really a mucked-up trolley problem, one that leaves you with only two choices. Humans are limited by reaction time, and if they have no obligation to resolve that dilemma (whichever choice they make is defensible), the same would have to be true for any AI-powered vehicle.
It's not an ethical problem if we're talking about machine learning; it's a risk management problem, and an easily solved one at that.
This post was edited on 10/3/17 at 10:40 pm
Posted on 10/4/17 at 12:06 am to bmy
quote:
It's not an ethical problem if we're talking about machine learning; it's a risk management problem, and an easily solved one at that.
In your scenario, the outcomes for the two options (or however many there are) will never have 100% probability, so it's really a simple choice for a computer to make. Running over the four people crossing the road may carry a fatality probability of 80%, versus slamming into the divider, which may carry a fatality probability of 50% for the vehicle occupants. Easy choice. Even if it were 80.0000% vs. 79.99999%, respectively, it would still be a 100% easy choice for a computer to make.
It really comes down to the accuracy of the programmed projection data and the quality of the sensors the computer uses to estimate the outcomes that feed its decision-making algorithm.
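The risk-management framing described above can be sketched in a few lines: estimate a fatality probability for each maneuver, then pick the minimum. This is a minimal illustration only; the function name, maneuver labels, and probability values are hypothetical (the 80%/50% numbers come from the post itself), not from any real autonomous-driving system.

```python
def choose_maneuver(options):
    """Return the maneuver with the lowest estimated fatality probability.

    `options` maps maneuver name -> estimated fatality probability in [0, 1].
    """
    return min(options, key=lambda name: options[name])

# Illustrative estimates from the post: hitting the pedestrians is
# modeled at 80% fatality risk, swerving into the divider at 50%.
options = {
    "continue_into_pedestrians": 0.80,
    "swerve_into_divider": 0.50,
}
print(choose_maneuver(options))  # -> swerve_into_divider

# As the post notes, even a near-tie is still a deterministic,
# "100% easy" choice for the machine:
close_call = {"option_a": 0.800000, "option_b": 0.7999999}
print(choose_maneuver(close_call))  # -> option_b
```

Note that everything hard lives upstream of this one-liner: the quality of the probability estimates (sensors and projection data, as the post says) determines whether the "easy choice" is actually the right one.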
Of course, I want my car to prioritize my life over some knuckledraggers schlepping their sorry asses across the street. Slam me into a divider?! I don't think so. Not after the money I shelled out on my Tesla AI car.
Bad car! No! Bad.
This post was edited on 10/4/17 at 12:11 am