
Tracking drivers' eyes can determine ability to take back control from 'auto-pilot' mode

Date:
August 31, 2023
Source:
University College London
Summary:
A team of researchers has developed a new method to determine the attention levels of drivers and their readiness to respond to warning signals when using auto-pilot mode.

A UCL-led team of researchers has developed a new method to determine the attention levels of drivers and their readiness to respond to warning signals when using auto-pilot mode.

The research, published in Cognitive Research: Principles and Implications, found that people's attention levels and how engrossed they are in on-screen activities can be detected from their eye movements.

The findings suggest a new way to determine the readiness of drivers using auto-pilot mode to respond to real-world signals, such as takeover requests from the car.

Although fully autonomous driverless cars are not yet available for personal use, cars with a "driverless" auto-pilot mode are commercially available for private use in some locations, including Germany and certain US states.

When using the auto-pilot mode, drivers are able to take their hands off the wheel and participate in other activities, such as playing games on the car's integrated central screen.

However, current models may require the driver to take back control of the car at certain points. For example, drivers can use the auto-pilot mode during a traffic jam on a motorway. But once the jam has cleared and the motorway allows speeds above 40mph, the AI will send a "takeover" signal to the driver, indicating that they must return to full driving control.
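
As an illustration of that trigger, the takeover condition can be sketched as a simple speed-threshold rule. The function name and structure below are hypothetical, for illustration only, and do not come from any real vehicle system:

```python
# Toy sketch of the takeover rule described above. The 40mph threshold
# follows the article's example; everything else is an assumption.
TAKEOVER_SPEED_THRESHOLD_MPH = 40

def needs_takeover(autopilot_engaged: bool, allowed_speed_mph: float) -> bool:
    """Return True when the driver should be signalled to resume control."""
    return autopilot_engaged and allowed_speed_mph > TAKEOVER_SPEED_THRESHOLD_MPH

print(needs_takeover(True, 30))  # False: still in the jam, auto-pilot drives
print(needs_takeover(True, 55))  # True: jam cleared, driver must take over
```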

The researchers tested whether it was possible to detect if a person was too engrossed in another task to respond swiftly to such a "takeover" signal.

To do this, the team tested 42 participants across two experiments, using a procedure that mimicked a "takeover" scenario as used in some advanced models of cars with an auto-pilot mode.

Participants were required to search a computer screen containing many coloured shapes for target items, and to let their gaze linger on each target to show they had found it.

The search tasks were either easy (e.g., participants had to spot an odd 'L' shape among multiple 'T' shapes) or more demanding (e.g., participants had to spot a specific arrangement of the shape parts and their colours).

At certain points during the search task, a tone would sound, and participants were required to stop watching the screen as quickly as they could and press a button in response.

Researchers measured the time between the tone sounding and the participants pressing the button, and also analysed how their eyes moved across the screen during the search, to see whether attention levels could be detected from changes in their gaze.
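
As a rough illustration, such tone-to-button latencies might be computed from logged event timestamps as below; the event format and field names are assumptions made for the sketch, not the study's actual pipeline:

```python
# Hypothetical sketch: pair each tone onset with the next button press
# and report the latencies. The data format is illustrative only.
from statistics import mean

def response_times(events):
    """events: list of {"kind": "tone"|"button", "t": seconds} in time order."""
    tones = [e["t"] for e in events if e["kind"] == "tone"]
    presses = [e["t"] for e in events if e["kind"] == "button"]
    latencies = []
    for tone in tones:
        later = [p for p in presses if p > tone]
        if later:
            latencies.append(min(later) - tone)
    return latencies

log = [
    {"kind": "tone", "t": 10.00}, {"kind": "button", "t": 10.62},
    {"kind": "tone", "t": 25.00}, {"kind": "button", "t": 25.91},
]
print(mean(response_times(log)))  # average latency, about 0.77 s
```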

They found that when the task demanded more attention, participants took longer to stop watching the screen and respond to the tone.

The analysis showed that it was possible to detect participants' attention levels from their eye movements. An eye movement pattern involving longer fixations and shorter distances of eye travel between items indicated that the task was more demanding of attention.
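
Those two gaze markers (fixation duration and eye-travel distance) could be derived from a fixation sequence roughly as follows. The data format here is a hypothetical illustration, not the authors' analysis code:

```python
# Hypothetical sketch: compute mean fixation duration and mean eye-travel
# distance between consecutive fixations, the two markers described above.
from math import hypot

def gaze_markers(fixations):
    """fixations: time-ordered list of (x_px, y_px, duration_s) tuples."""
    mean_fixation = sum(d for _, _, d in fixations) / len(fixations)
    travels = [
        hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(fixations, fixations[1:])
    ]
    mean_travel = sum(travels) / len(travels)
    return mean_fixation, mean_travel

fixes = [(100, 200, 0.25), (130, 210, 0.40), (400, 220, 0.30)]
print(gaze_markers(fixes))  # longer fixations, shorter travel -> higher load
```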

The researchers also trained a machine learning model on this data and found that they could predict whether the participants were engaged in the easy or demanding task based on their eye movement patterns.
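
The summary does not name the model used, so the sketch below stands in with logistic regression on simulated gaze features, purely to illustrate the idea of predicting task demand from eye-movement patterns:

```python
# Illustrative only: classify easy vs. demanding trials from two gaze
# features. The features, distributions and model choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200  # simulated trials per condition
# Columns: [mean fixation duration (s), mean eye travel between items (px)]
easy = np.column_stack([rng.normal(0.25, 0.05, n), rng.normal(300, 60, n)])
hard = np.column_stack([rng.normal(0.40, 0.05, n), rng.normal(180, 60, n)])
X = np.vstack([easy, hard])
y = np.array([0] * n + [1] * n)  # 0 = easy task, 1 = demanding task

clf = make_pipeline(StandardScaler(), LogisticRegression())
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```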

Senior author, Professor Nilli Lavie (UCL Institute of Cognitive Neuroscience), said: "Driverless car technology is fast advancing and promises a more enjoyable and productive driving experience, where drivers can use their commuting time for other non-driving tasks.

"However, the big question is whether the driver will be able to return to driving swiftly upon receiving a takeover signal if they are fully engaged in another activity.

"Our findings show that it is possible to detect the attention levels of a driver and their readiness to respond to a warning signal, just from monitoring their gaze pattern.

"It is striking that people can get so consumed with their on-screen activity that they ignore the rest of the world around them. Even when they are aware that they should be ready to stop their task and respond to tones as quickly as they can, they take longer to do it when their attention is engrossed in the screen.

"Our research shows that warning signals may not be noticed quickly enough in such cases."

Larger datasets will be required to train the machine learning model further and improve its accuracy.

The research was funded by JLR and the Engineering and Physical Sciences Research Council as part of the jointly funded Towards Autonomy: Smart and Connected Control (TASCC) programme.


Story Source:

Materials provided by University College London. Note: Content may be edited for style and length.


Journal Reference:

  1. Anthony M. Harris, Joshua O. Eayrs, Nilli Lavie. Establishing gaze markers of perceptual load during multi-target visual search. Cognitive Research: Principles and Implications, 2023; 8 (1) DOI: 10.1186/s41235-023-00498-7

