Building the Moral Compass for Self-Driving Cars

The Moral Machine asks users what ethics should drive autonomous car programming in life-and-death situations.

If a crash were imminent and the self-driving car had to decide, should it kill two kids or a mother and a pregnant woman? What if the choice were between five working adults and six senior citizens?

These are the types of ethical scenarios you'll encounter in the Moral Machine. Developed by the Scalable Cooperation group at the MIT Media Lab, the website features 13 questions. Respondents essentially weigh in on the kinds of moral decisions programmers face as they build autonomous vehicles.

The questions are similar in nature to the famous trolley problem, devised by the Oxford moral philosopher Philippa Foot. If a runaway trolley is barreling toward five people, and a bystander could pull a switch to divert it onto a track where it would kill only one person, should he or she do so?

It's a dilemma that pits the value of saving more human lives against the question of who has the right to decide who lives and who dies.

     Related: Driverless Car Accidents & the Trolley Problem

Driverless Decisions

As we barrel toward a driverless future, modern-day moralists grapple with these types of questions. The Moral Machine crowd-sources human perspectives on them. What is the lesser of two evils: killing two passengers or five pedestrians? Offing four pedestrians or two cats and two dogs?

At the end of the quiz, the Moral Machine shows respondents how their answers compare to the rest of the survey pool. For instance, did you choose answers that saved more lives? Did you protect passengers over pedestrians? Did you try to avoid an intervention by the car, such as an unnatural swerve?
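To make that comparison concrete, here is a minimal sketch in Python of how a respondent's answers might be summarized along those three axes. The dilemma fields, scoring rules, and example data are illustrative assumptions, not the Moral Machine's actual implementation.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Dilemma:
    deaths_a: int                 # people killed if the respondent picks option A
    deaths_b: int                 # people killed if the respondent picks option B
    a_protects_passengers: bool   # option A spares the car's passengers
    a_is_intervention: bool       # option A requires the car to swerve

def summarize(choices: list[str], dilemmas: list[Dilemma]) -> dict[str, float]:
    """Fraction of answers that saved more lives, protected passengers,
    and avoided having the car intervene (swerve)."""
    more_lives = passengers = no_intervention = 0
    for choice, d in zip(choices, dilemmas):
        chose_a = choice == "A"
        own_deaths = d.deaths_a if chose_a else d.deaths_b
        other_deaths = d.deaths_b if chose_a else d.deaths_a
        if own_deaths < other_deaths:
            more_lives += 1
        if chose_a == d.a_protects_passengers:
            passengers += 1
        if chose_a != d.a_is_intervention:
            no_intervention += 1
    n = len(choices)
    return {
        "saved_more_lives": more_lives / n,
        "protected_passengers": passengers / n,
        "avoided_intervention": no_intervention / n,
    }

# Example: a respondent who picked option A in both of two hypothetical dilemmas.
example = summarize(
    ["A", "A"],
    [Dilemma(deaths_a=2, deaths_b=5, a_protects_passengers=True, a_is_intervention=False),
     Dilemma(deaths_a=4, deaths_b=1, a_protects_passengers=False, a_is_intervention=True)],
)
print(example)  # {'saved_more_lives': 0.5, 'protected_passengers': 0.5, 'avoided_intervention': 0.5}
```

Comparing those per-respondent fractions against the averages of the whole survey pool is, in essence, the feedback the quiz shows at the end.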

Quiz takers can discuss their viewpoints with others. They can even create their own moral dilemma questions and see how others respond.

The goal of the Moral Machine is to "further our scientific understanding of how people think about machine morality," said Iyad Rahwan, its co-creator. However, it's questionable what can be uncovered. The scenarios are narrowly written, and every decision leads to the same outcome: someone dies.

     Related: Is Upgrading Tesla’s Autopilot Safe?

Unlikely Scenarios

Indeed, the scenarios illustrated in the Moral Machine would be extremely rare. In pedestrian areas, a self-driving car would adhere to posted speed limits. In major cities like New York City, the speed limit on local streets is 25 miles per hour, and in some areas 20 mph.

According to safespeed.org, if a car hits a pedestrian at 40 mph, there is a 90 percent chance of death. At 30 mph, there is a 50 percent chance the victim is killed. But at 20 mph, the chance of killing a pedestrian is only 10 percent.
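To put those figures side by side, here is a minimal sketch that estimates fatality risk at intermediate speeds by linearly interpolating between the three cited data points. The interpolation itself is an assumption for illustration; the source gives only the 20, 30, and 40 mph figures.

```python
# Speed-to-fatality figures cited above: impact speed (mph) -> chance of pedestrian death.
FATALITY_RISK = {20: 0.10, 30: 0.50, 40: 0.90}

def fatality_risk(speed_mph: float) -> float:
    """Estimate a pedestrian's chance of death at a given impact speed,
    interpolating linearly between the known data points."""
    points = sorted(FATALITY_RISK.items())
    if speed_mph <= points[0][0]:
        return points[0][1]
    if speed_mph >= points[-1][0]:
        return points[-1][1]
    for (s0, r0), (s1, r1) in zip(points, points[1:]):
        if s0 <= speed_mph <= s1:
            return r0 + (r1 - r0) * (speed_mph - s0) / (s1 - s0)

# At New York City's 25 mph local limit, the estimated risk falls between the 20 and 30 mph figures.
print(f"25 mph: {fatality_risk(25):.0%} chance of death")  # -> 30%
```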

Whether or not the scenarios illustrated by the Moral Machine are realistic, the self-driving car ethics project does make users appreciate the difficulty of creating algorithms that make moral choices.

Israel Salas-Rodriguez is a senior journalism student at Brooklyn College. He is the Sports Editor at the school's newspaper, The Kingsman. When he isn't writing about sports, he can be found in Sunset Park dribbling a basketball up and down the courts. Aside from the world of sports, he enjoys nights out in the city with friends, typically wherever margaritas are served.
