
Who(what)'s driving and when?

Date:
February 26, 2016
Source:
Human Factors and Ergonomics Society
Summary:
Researchers are working to advance the state of knowledge about human factors aspects of autonomous passenger vehicles, including an assessment of the level of drivers' trust in the autonomous car, and how drivers will respond best to verbal prompts alerting them to driving conditions and the state of the vehicle.

For all the media attention they've been getting lately, self-driving cars come with many unknowns and potential obstacles to safe driving. A critical issue is the relative lack of research on the role of the human in the system. This human factors component may pose challenges more daunting than the technological, legal, and security concerns surrounding self-driving cars.

Two studies published recently in Human Factors: The Journal of the Human Factors and Ergonomics Society advance the state of knowledge about human factors aspects of autonomous passenger vehicles. One paper assesses the level of drivers' trust in the autonomous car by monitoring how often they interrupt a nondriving task to look at their surroundings; it presents the first empirical evidence of this connection.

The other study suggests that drivers will respond best to verbal prompts, as opposed to sounds or visual displays, alerting them to driving conditions and the state of the vehicle (for example, low tire pressure).

"Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving" is the work of Sebastian Hergeth, Lutz Lorenz, and Roman Vilimek of the BMW Group in Munich, and Josef F. Krems from Technische Universität Chemnitz, Germany.

In this study, 35 BMW Group employees ages 18 to 55 participated in a self-driving car simulation while engaging in a visually demanding nondriving task. The driving scenario was a standard three-lane highway with a hard shoulder, in which uneventful driving was periodically interrupted by incidents requiring the driver to take control. Although trust is difficult to quantify, eye-tracking glasses worn by the participants enabled the researchers to capture data about how frequently they looked away from the secondary task to observe the driving scene. Hergeth et al. then used these data to draw preliminary conclusions about drivers' levels of trust in the simulated car's automation.

The more the participants trusted the automation, the less frequently they looked at their surroundings. They also grew more trusting of the car as they learned the system: overall, more than half the drivers said they trusted the car more at the end of the trials than at the beginning. The researchers postulate that appropriate trust in automation is crucial for drivers to get the maximum benefit from self-driving vehicles.

In "Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride," human factors researchers Michael A. Ness, Benji Helbein, and Anna Porter of Lafayette College, Easton, Pennsylvania, studied the usefulness of speech alerts to help drivers perceive and remember driving conditions while engaged in a nondriving activity.

Eighty-five undergraduate students performed a word search task while watching three driving simulation videos, each showing a routine driving condition. The participants were randomly assigned to one of three alert conditions: sound icons, such as a jackhammer indicating construction ahead; a visual display with text; and speech alerts, such as "pedestrian" or "front hazard."

After watching the videos, participants reported what they recalled about the driving scenario, how useful and how annoying the alerts were, and how confident they would feel if they had to resume control of the car at the moment the video stopped. Participants who heard the speech alerts had better recall than those given the sound icons or visual displays. However, both types of auditory alerts were rated as annoying, and research shows that drivers tend to turn off annoying alerts.

Both research teams plan further investigations to assess how these findings affect safety, and how quickly and effectively drivers can take over the controls when necessary.


Story Source:

Materials provided by Human Factors and Ergonomics Society. Note: Content may be edited for style and length.


Journal References:

  1. M. A. Nees, B. Helbein, A. Porter. Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride. Human Factors: The Journal of the Human Factors and Ergonomics Society, 2016; DOI: 10.1177/0018720816629279
  2. S. Hergeth, L. Lorenz, R. Vilimek, J. F. Krems. Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving. Human Factors: The Journal of the Human Factors and Ergonomics Society, 2016; DOI: 10.1177/0018720815625744

