
Joe_Cavalry All Day Every Day


Debate Info

Yup.: 19
Wait..., what? No!!!: 30
Debate Score: 49
Arguments: 43
Total Votes: 50

Argument Ratio

 
 Yup. (16)
 
 Wait..., what? No!!! (28)

Debate Creator

jolie(9810)



Should Driverless Cars Kill Their Own Passengers To Save A Pedestrian?

Yup.

Side Score: 19
VS.

Wait..., what? No!!!

Side Score: 30
3 points

But it mostly depends on who the passengers are in the car. LOL

Side: Yup.
1 point

Pedestrian lives matter.

Side: Yup.
jolie(9810) Clarified
1 point

Car owner privilege/culture needs to stop.

Side: Yup.
GoneFishing(126) Disputed
1 point

All lives matter. (Or would that comment be considered too soon?) :p

Side: Wait..., what? No!!!
1 point

I'm not sure all lives matter. I mean, have you ever heard about someone dying and then thinking to yourself, "Meh..."

Side: Wait..., what? No!!!
Cartman(18192) Disputed
1 point

They should look where they are going if they believe their lives matter.

Side: Wait..., what? No!!!

The disrespect for danger is a killer. I just had to write a piece about sidewalk safety. Fun stuff.

Side: Wait..., what? No!!!
1 point

You know? - My car has a climate control temperature slider, and I set the temperature. I also control the speed with the accelerator, select the limiter or cruise control and even how much concentration I invest in the driving and how much in how to solve the problems at work.

So - how about we put a "morality slider" in the car? After an hour's Buddhist meditation, I slide it over to "100% the other guy". But after a big fight with the girlfriend I reset it to "20% the other guy".

We make these moral decisions all the time when we drive a car. We will usually swerve to avoid the child in the pushchair - even though a huge truck is coming the other way - because we still think we can survive.

So, as the pilot of the car (the person who decides why he is in the car and where he is going) we can set the morality slider ourselves! Relieves the manufacturer of responsibility and puts the moral decision firmly on the pilot - where it should always be!

Side: Yup.
1 point

Sorry for the offense. Please choose your local supermarket for more info. Hope this helps :)

Side: Yup.
1 point

Passenger lives matter.

Side: Wait..., what? No!!!
jolie(9810) Clarified
1 point

If you're on foot, you probably can't afford a car because you have no job and are homeless..... either that or you are a health enthusiast (biker, jogger, etc.). In either case, you are an obstacle and should perish.

Side: Yup.
1 point

If you read Asimov's Three Laws of Robotics, then you will find that this situation cannot exist.

Side: Wait..., what? No!!!
2 points

That was a fictional book...

Side: Yup.
1 point

If you read the article, you will find that it can.

Side: Wait..., what? No!!!
1 point

No one should read the article because it is based on a ridiculous premise that doesn't even make sense in the first paragraph.

Side: Wait..., what? No!!!
DKCairns(868) Disputed
1 point

Jolie, you obviously have not read the laws of robotics, otherwise you would realise this is an irrelevant debate.

Side: Yup.
1 point

I was going to mention Asimov too.

Side: Wait..., what? No!!!

Uh. What? Why would a driverless car be forced into such a situation? The AI's functioning capabilities in terms of response are far superior to humans'. They would never need to have the option to choose. AI is not capable of functioning with moral aptitude. If you follow the rules of the road, you will not have to worry about choosing who should live or die.

Edit: read the article. Again, such a hypothetical situation like this is pretty far fetched, to the point of why even debate?

There are too many "what ifs" not answered to address this ethically. So I'll take the question directly: should AI be able to choose life or death in certain events? The answer would be no, they shouldn't.

Side: Wait..., what? No!!!
1 point

You read the article? Wow! You even summarized what it says! Thanks. I'm just interested in the sound bites.

Side: Wait..., what? No!!!
GoneFishing(126) Clarified
1 point

The ethical situations are pretty much endless in this scenario. To allow a machine to make an ethical decision without understanding ethics itself doesn't make any sense.

For fun: the passenger is a single parent of 3 kids with no existing relatives. However, at the time of the accident the kids were at school. (The parent was the only passenger.)

The pedestrians are convicted criminals assigned to highway clean-up. These criminals were all (let's say just two of them) convicted of DUIs that resulted in the death of others. These criminals have no children but a vast number of siblings, let's say 8 total.

Who should be saved? If we cannot even calculate the moral ramifications of this situation, how would a computer be able to?

Side: Yup.
1 point

It doesn't seem like a good business model. I would want the car that tries to save me if there was ever an "either or" situation.

Side: Wait..., what? No!!!