Should Self-Driving Cars Have Ethics?
To design a "moral machine," researchers updated a classic thought experiment for the autonomous vehicle age. But do we really want artificial intelligence making decisions on who lives or dies?
by Laurel Wamsley
Oct 26, 2018
In the not-too-distant future, fully autonomous vehicles will drive our streets. These cars will need to make split-second decisions to avoid endangering human lives — both inside and outside of the vehicles.
To determine attitudes toward these decisions, a group of researchers created a variation on the classic philosophical exercise known as "the trolley problem." They posed a series of moral dilemmas involving a self-driving car whose brakes suddenly give out: Should the car …