
Who To Kill? This Grim Online Game From MIT Helps Self-Driving Cars Learn


With Westworld’s debut on HBO this weekend, a lot of people are thinking about the Three Laws of Robotics. MIT’s new “Moral Machine” presents scenarios that all end in the death of either a pedestrian or a passenger. The idea is to try to collect human data on which fatality is more morally acceptable.

The Three Laws of Robotics — in case you haven’t seen a sci-fi movie since 1942, when these laws appeared in a short story by Isaac Asimov — are as follows:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Find a car near you with BestRide’s local search.

Self-driving cars unleashed on city streets without human intervention will eventually face a scenario in which someone — a pedestrian, an animal, a passenger — will likely die in a crash. Making the “moral” choice requires data collected from humans, and the “Moral Machine” is trying to collect it.

Here’s a sample scenario:

A driverless car arrives at an intersection, and a sudden brake failure keeps the car moving toward the crosswalk at full speed. The crosswalk on the left has two bank robbers crossing on a WALK signal. The crosswalk on the right has two elderly people crossing on a DO NOT WALK signal.

Which set of pedestrians should the driverless car plow into?
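For the curious, here is a rough sketch of how a dilemma like this could be represented and how respondents' judgments could be tallied. The field names and toy scoring are illustrative assumptions, not MIT's actual data model:

```python
# Illustrative only: a toy representation of a Moral Machine-style dilemma
# and a tally of which outcome respondents judge more acceptable.
# Field names and scoring are assumptions, not MIT's actual data model.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str        # e.g. "continue straight into the left crosswalk"
    victims: list           # who dies if the car takes this path
    crossing_legally: bool  # were they crossing on a WALK signal?
    votes: int = 0          # respondents who judged this outcome more acceptable

dilemma = [
    Outcome("continue straight (left crosswalk)",
            victims=["bank robber", "bank robber"], crossing_legally=True),
    Outcome("swerve right (right crosswalk)",
            victims=["elderly person", "elderly person"], crossing_legally=False),
]

def record_judgment(chosen_index: int) -> None:
    """Record one respondent's choice of the less unacceptable outcome."""
    dilemma[chosen_index].votes += 1

# A handful of simulated responses
for choice in [0, 1, 1, 0, 1]:
    record_judgment(choice)

for outcome in dilemma:
    print(f"{outcome.description}: {outcome.votes} vote(s)")
```

Aggregated over millions of such answers, this kind of tally is what gives researchers a picture of which fatality people consider more morally acceptable.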


It seems like a ghastly question to ask, but moral decisions are something you make every single time you operate an automobile. These kinds of innately human decisions are choices that machines simply don’t have the capacity to make at the moment. At last May’s NEMPA/MIT Technology Conference, MIT Professor of Mechanical and Ocean Engineering John Leonard presented a similar scenario, one with consequences just as drastic.

Leonard showed a video utilizing the kind of camera a self-driving car would use to navigate the road ahead. The subject car made a decision based on the green light in the intersection, but completely ignored the police officer with a hand up telling the driver to stop. How do you engineer an autonomous car to give a police officer’s instructions priority over the traffic signal, but not those of a pedestrian making the same gesture?
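Here is a hedged sketch of what that priority problem looks like in code. The detection types and rules are purely illustrative assumptions, not code from any real autonomy stack:

```python
# Hypothetical sketch of the hierarchy problem Leonard describes: the same
# "hand up" gesture should stop the car when it comes from a traffic officer,
# but not when it comes from an ordinary pedestrian.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str     # "police_officer" or "pedestrian"
    gesture: str  # e.g. "hand_up"

def should_proceed(light_is_green: bool, detections: list[Detection]) -> bool:
    # Rule 1 (highest priority): obey a traffic officer's stop gesture,
    # even when the signal says go.
    for d in detections:
        if d.kind == "police_officer" and d.gesture == "hand_up":
            return False
    # Rule 2: otherwise fall back to the traffic signal; a pedestrian making
    # the same gesture does not override a green light by itself.
    return light_is_green

print(should_proceed(True, [Detection("police_officer", "hand_up")]))  # False
print(should_proceed(True, [Detection("pedestrian", "hand_up")]))      # True
```

The hard part, of course, isn't writing the rule — it's reliably telling the officer apart from the pedestrian in the first place.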

Taking an approach similar to the online game “Moral Machine,” Leonard’s researchers built a scale city called Duckietown and filled its streets with small, $150 robots that had to learn which scenarios were acceptable.

According to a 2014 MIT Technology Review update on Google’s self-driving car, despite the advanced technology packed inside, the car still faced major hurdles that could keep it from becoming a reality for decades, if ever.

“Among other unsolved problems, Google has yet to drive in snow,” the assessment reads. “[Director of the Google Car Team Chris] Urmson says safety concerns preclude testing during heavy rains. Nor has it tackled big, open parking lots or multilevel garages.”

The car’s video cameras detect the color of a traffic light; Urmson said his team is still working to prevent them from being blinded when the sun is directly behind a light. Despite progress handling road crews, “I could construct a construction zone that could befuddle the car,” Urmson says.


The Moral Machine hopes to help autonomous technology of all types understand decisions that involve human life: “The greater autonomy given machine intelligence…can result in situations where they have to make autonomous choices involving human life and limb. This calls for not just a clearer understanding of how humans make such choices, but also a clearer understanding of how humans perceive machine intelligence making such choices.”

The scenarios are more granular than just passenger vs. pedestrian. Shades of morality color the pedestrian groups. Some are doctors, some are thieves. Some are children, some are elderly. Some decisions choose between cats and humans.
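To make the idea concrete, here is a small illustrative sketch of how those crowd judgments could be rolled up into rough “who gets spared” preferences. The attribute names and the simple win-rate tally are assumptions for illustration, not the researchers’ actual method:

```python
# Illustrative sketch: aggregate pairwise "spare A over B" responses into a
# crude preference score per attribute. Not MIT's actual analysis.
from collections import defaultdict

# Each response: (attribute of the group spared, attribute of the group sacrificed)
responses = [
    ("child", "elderly"),
    ("doctor", "thief"),
    ("human", "cat"),
    ("child", "elderly"),
    ("elderly", "child"),
]

spared = defaultdict(int)
appeared = defaultdict(int)

for saved, sacrificed in responses:
    spared[saved] += 1
    appeared[saved] += 1
    appeared[sacrificed] += 1

# A crude "preference to spare" score: how often a group was spared when it appeared.
for attribute in sorted(appeared):
    print(f"{attribute}: spared in {spared[attribute]}/{appeared[attribute]} matchups")
```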

You can judge these scenarios for yourself, and you can build your own using the online tool.

It’s difficult to make decisions that will end in the loss of life. But the Moral Machine presents users with just those kinds of decisions, and provides results that tell you as much about yourself as they do about the cars that will eventually be driving our streets.

Find a car near you with BestRide’s local search.


Craig Fitzgerald

Writer, editor, lousy guitar player, dad. Content Marketing and Publication Manager at BestRide.com.