
Should ethics or human intuition drive the moral judgments of driverless cars?

Date:
May 3, 2018
Source:
Frontiers
Summary:
Driverless cars will encounter situations requiring moral assessment -- and new research suggests that people may not be happy with the decisions their cars make. Experiments designed to test people's reactions to a driving dilemma that endangers human life revealed a high willingness for self-sacrifice, a consideration of the age of potential victims, and a readiness to swerve onto the sidewalk to save more lives -- intuitions that are sometimes at odds with ethically acceptable behavior or political guidelines.

When faced with driving dilemmas, people show a high willingness to sacrifice themselves for others, make decisions based on the victim's age and swerve onto sidewalks to minimize the number of lives lost, reveals new research published in open-access journal Frontiers in Behavioral Neuroscience. This is at odds with ethical guidelines in these circumstances, which often dictate that no life should be valued over another. This research hopes to initiate discussions about the way self-driving vehicles should be programmed to deal with situations that endanger human life.

"The technological advancement and adoption of autonomous vehicles is moving quickly, but the social and ethical discussions about their behavior are lagging behind," says lead author Lasse T. Bergmann, who completed this study with a team at the Institute of Cognitive Science, University of Osnabrück, Germany. "The behavior that will be considered right in such situations depends on which factors are considered to be both morally relevant and socially acceptable."

Traffic accidents are a major source of death and injury around the world. As technology improves, automated vehicles will outperform their human counterparts, saving lives by eliminating accidents caused by human error. Even so, there will still be circumstances in which self-driving vehicles must make decisions in morally challenging situations. For example, a car can swerve to avoid hitting a child who has run into the road, but in doing so it endangers other lives. How should it be programmed to behave?

An ethics commission initiated by the German Ministry for Transportation has created a set of guidelines, representing its members' best judgement on a variety of issues concerning self-driving cars. These expert judgments may, however, not reflect human intuition.

Bergmann and colleagues designed a virtual reality experiment to examine human intuition in a variety of possible driving scenarios. Different sets of tests were created to highlight different factors that may or may not be perceived as morally relevant.

Based on a traditional ethical thought experiment, the trolley problem, test subjects could choose between two lanes on which their vehicle drove at constant speed. They were presented with a morally challenging driving dilemma, such as an option to change lanes to minimize lives lost, a choice between victims of different ages, or a possibility for self-sacrifice to save others.

The experiments revealed that human intuition was often at odds with ethical guidelines.

Bergmann explains, "The German ethics commission proposes that a passenger in the vehicle may not be sacrificed to save more people; an intuition not generally shared by subjects in our experiment. We also find that people chose to save more lives, even if this involves swerving onto the sidewalk -- endangering people uninvolved in the traffic incident. Furthermore, subjects considered the factor of age, for example, choosing to save children over the elderly."

He continues, "If autonomous vehicles abide by guidelines dictated by the ethics commission, our experimental evidence suggests that people would not be happy with the decisions their cars make for them."

Professor Gordon Pipa, co-author, also based at the University of Osnabrück, continues, "It is urgent that we start engaging in a societal discussion to define the goals and constraints of future rules that apply to self-driving vehicles. This needs to happen before they become an integral part of our daily lives."

Bergmann explains that further research is needed. "While 'dilemma' situations deserve more study, other questions should also be discussed. Driving requires an intricate weighing of risks versus rewards, for example speed versus the danger of a critical situation unfolding. Decision-making processes that precede or avoid a critical situation should also be investigated."


Story Source:

Materials provided by Frontiers. Note: Content may be edited for style and length.


Journal Reference:

  1. Lasse T. Bergmann, Larissa Schlicht, Carmen Meixner, Peter König, Gordon Pipa, Susanne Boshammer, Achim Stephan. Autonomous Vehicles Require Socio-Political Acceptance—An Empirical and Philosophical Perspective on the Problem of Moral Decision Making. Frontiers in Behavioral Neuroscience, 2018; 12 DOI: 10.3389/fnbeh.2018.00031

Cite This Page:

Frontiers. "Should ethics or human intuition drive the moral judgments of driverless cars?." ScienceDaily. ScienceDaily, 3 May 2018. <www.sciencedaily.com/releases/2018/05/180503142637.htm>.
