Moral Machine MIT Crowd Source
There have been numerous threads on autonomous cars. Some of those threads have touched on the moral dilemmas inherent in driving decisions. MIT has an online project to crowd-source these ethical decisions.
Moral Machine
It only takes a few minutes, but it makes you think.
You don't want to be fat and old. I've asked them to add some cyclists to the scenarios to see what happens.
D
-
Re: Moral Machine MIT Crowd Source
Fun. I scored 0% on protecting occupants, which was a big outlier, but I am working on the assumption that anyone who gets in the car takes on the risk, and the moral requirement is to protect the vulnerable user who did not make that choice.
-
Re: Moral Machine MIT Crowd Source
I scored:
100% on Saving More Lives
0% on Upholding The Law
Then realized that was the crosswalk signal and not the traffic light.
The second time around, this is what I took note of:
Social Value Preference
0%
justin rogers.
-
Re: Moral Machine MIT Crowd Source
Odd.... I did not differentiate between saving passengers versus saving pedestrians (very close to "does not matter"), whereas the mean response falls midway between "does not matter" and "matters a lot."
Kind of scary - at least for pedestrians and perhaps cyclists as well - that there appears to be a bias towards saving the lives of the car occupants over the lives of others.
-
Re: Moral Machine MIT Crowd Source

Originally Posted by
monadnocky
Kind of scary - at least for pedestrians and perhaps cyclists as well - that there appears to be a bias towards saving the lives of the car occupants over the lives of others.
Doesn't surprise me at all, considering the current traffic situation seems to be a bias towards saving TIME over saving the lives of others...
-
Re: Moral Machine MIT Crowd Source
I think I hate myself now.
-
Re: Moral Machine MIT Crowd Source
The moral answer depends on which ethical system you go by. The result could be simple utilitarianism, which won't be satisfactory in every case. There are other systems.
And where someone has to get hurt maybe the insurance companies would want to know who programmed, or even voted for, that response. Could deliberate harm be replacing accidental harm in this case?
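Just to make that concrete (a toy sketch, not anything a real car runs; the names, encoding, and numbers are all invented), here's how a utilitarian policy and a "no deliberate harm" rule can disagree on the very same brake-failure scenario:
Code:
# Hypothetical toy comparison of two ethical policies. Everything here
# (names, encodings, numbers) is invented purely for illustration.

def utilitarian_score(outcome):
    # Pure utilitarianism: the only thing that matters is total deaths.
    return -outcome["deaths"]

def no_deliberate_harm_score(outcome):
    # A rule-based policy: deliberately swerving into someone is forbidden,
    # however the body count compares; staying the course is "accidental."
    return float("-inf") if outcome["deliberate"] else -outcome["deaths"]

# One brake-failure scenario, two possible actions:
outcomes = [
    {"label": "stay course", "deaths": 3, "deliberate": False},
    {"label": "swerve",      "deaths": 1, "deliberate": True},
]

for policy in (utilitarian_score, no_deliberate_harm_score):
    best = max(outcomes, key=policy)
    print(policy.__name__, "chooses:", best["label"])
# utilitarian_score chooses: swerve       (fewer total deaths)
# no_deliberate_harm_score chooses: stay course (deliberate harm barred)
Note that they disagree on exactly the case the insurance question turns on: the utilitarian policy converts accidental harm into deliberate harm whenever that lowers the body count.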
-
Re: Moral Machine MIT Crowd Source
Ethics is an expansive field, and the more you study it, the more paradoxes and problems appear. Doing the right thing isn't always a straightforward matter of instinct or common sense.
Ethics - Wikipedia, the free encyclopedia
-
Re: Moral Machine MIT Crowd Source
I think they need to work on improving the reliability of those cars' brakes...
Apparently I like animals and don't like fat ladies.
(More seriously, my "save the animals" choices were done consciously just for kicks and to be perverse. I did not consciously attempt to go after the fat ladies, that was just a statistical fluke.)
-
Re: Moral Machine MIT Crowd Source
That was super interesting.
Apparently I'm into upholding the law. I had to go a second time, as I ignored the signals the first time. My take is that if there are two groups in the road and one group is following the law and one is not, that's a pretty easy decision. Darwin rules.
Just like I am on the road. People jaywalk in front of me with their heads in their iPhones, and I will likely yell at them. People in crosswalks get waved through. Without rules, and people following them, we do not have civilization.
Overall, the site shows some of the hard decisions of driverless cars. But it ignores the fact that cars protect their occupants with safety devices. I'm not sure running the car into a roadblock will hurt the occupants the same as hitting a similar number of pedestrians.
Jon
-
Re: Moral Machine MIT Crowd Source

Originally Posted by
stackie
Overall, the site shows some of the hard decisions of driverless cars. But it ignores the fact that cars protect their occupants with safety devices. I'm not sure running the car into a roadblock will hurt the occupants the same as hitting a similar number of pedestrians.
I also thought about this, but they specifically said that the vehicle's occupants would be killed.
But yes, unless the car is driving way too fast (and to really do that would require another failure of the car's systems, not just the brakes), pedestrians are much more likely to be badly hurt than those inside the car. An equal likelihood of fatalities doesn't seem right, and I found it difficult to ignore that when choosing between mowing down the pedestrians vs. hitting the barrier.
-
Re: Moral Machine MIT Crowd Source
I missed the signals; people walking illegally might change my opinions. I looked at it as: pedestrians win over occupants 100% of the time, and people win over animals 100% of the time. I did not distinguish between the types of people in the car or in the crosswalk, as that information would not be available in real life, and if all else is equal, don't swerve.
That said, I think the cars can be programmed to avoid these scenarios at least 10x, if not 100x+, better than a human driver, and will drastically reduce the number of car-related fatalities. The liability issue is thorny, as there will inevitably be failures and there is no obviously responsible party. But it's not as though the law allocates responsibility to drivers under the current system anyway, or we would see a dramatic increase in people being jailed for their driving, and likely an increase in safety. That's a whole other thread.
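Those priorities amount to a lexicographic rule, which is simple enough to sketch (purely illustrative Python; the scenario encoding is made up):
Code:
# Toy sketch of the priority order described above: avoid harming
# pedestrians first, then occupants, then animals; on a tie, don't swerve.
# The option encoding is invented for illustration.

def choose(stay, swerve):
    def badness(option):
        # Lexicographic comparison: pedestrian harm is weighed first,
        # occupants next, animals last. Smaller tuples are better.
        return (option["pedestrians"], option["occupants"], option["animals"])
    # Strict inequality means an exact tie defaults to staying the course.
    return "swerve" if badness(swerve) < badness(stay) else "stay course"

stay   = {"pedestrians": 2, "occupants": 0, "animals": 0}
swerve = {"pedestrians": 0, "occupants": 2, "animals": 0}
print(choose(stay, swerve))  # swerve: occupants take the risk, not pedestrians
The tuple comparison makes the ordering strict: no number of occupants or animals spared ever outweighs a single pedestrian harmed, and an exact tie defaults to not swerving.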
-
Re: Moral Machine MIT Crowd Source

Originally Posted by
sailor
That said, I think the cars can be programmed to avoid these scenarios at least 10x, if not 100x+, better than a human driver, and will drastically reduce the number of car-related fatalities.
Perhaps, but it isn't simply a question of programming the software. If the vehicle's sensors can't even distinguish between a light blue sky and the white side panels of an 18-wheeler trailer, then it's going to be a while before we come even close to a "take the pet to the vet" scenario.
-
Re: Moral Machine MIT Crowd Source
my ethic:
kill 'em all and let God sort 'em out.
-
Re: Moral Machine MIT Crowd Source

Originally Posted by
Mabouya
Apparently I like animals and don't like fat ladies.
I forget who my Most Killed Character was, but my Most Saved Character was also animals, and that was not "to be perverse" but rather from my sincere belief that they are the only victims in these scenarios who can legitimately claim not to understand the (possible) consequences of crossing the street.
But I also very consciously always chose to kill the vehicle's passengers rather than any pedestrians, because I believe there should be a higher standard of responsibility when you climb into a motor vehicle, even if you're 1) not the driver, 2) obeying all traffic laws, and 3) not able to control when/whether the brakes function reliably.
Of course, it's all a series of false dichotomies anyway; in any of those situations there's always a third (or fourth, or fifth) option that will have yet another outcome. Hmm...What if there were no hypothetical situations?