3.1.4 Autonomous cars


What values should autonomous cars have?

Self-driving cars are one of the most prominent examples of robots that will greatly impact society over the next few years. But as we enjoy being transported effortlessly to the destination of our choice, we are also ceding control over the life-or-death decisions entailed in driving a car. While self-driving cars may (eventually) be safer than human-driven ones, accidents are still bound to happen.

What are the values on which autonomous cars should base their decisions in situations where only bad outcomes seem likely? Should the car default to a “safe” behavior, like braking? Or should it perhaps refrain from taking action at all, even if that leads to bad outcomes? Should it always save the passenger(s)? Or the people outside the car?

Should it learn its owner’s values and how they would react? Or some average of the population? Or should it make a calculation to minimize the amount of harm done? If so, what features should be taken into account? Age? Gender? Whether someone is breaking the rules of traffic? How much they donated to charity last year? Or to the car manufacturer? Their value to society?

And who should decide? The car manufacturer? The government? The car’s owner? What kind of self-driving car would you want?

Mind of the Universe: Robots in Society - Blessing or Curse by TU Delft OpenCourseWare is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Based on a work at https://online-learning.tudelft.nl/courses/mind-of-the-universe-robots-in-society-blessing-or-curse/.