Eye movement behavior during autonomous vehicle-human interaction.
David Whitney, Professor
Psychology
Closed. This professor is continuing with Spring 2024 apprentices on this project; no new apprentices needed for Fall 2024.
The advent of autonomous vehicles (AVs) over the last decade has changed the traditional role of drivers. The autonomous cars currently being developed and commercialized are not fully autonomous. In fact, commercially available AVs take control of only some driving functions, such as speed, or are highly automated systems that still need the driver's input and control under challenging conditions (for instance, heavy traffic or poor weather). To the extent that cars are not perfectly autonomous, some work falls on drivers' shoulders, including monitoring the system's behavior. This project aims to investigate whether drivers are capable of efficiently monitoring an AV system and to provide an accurate measure of driver inattentiveness.
One of the major challenges facing autonomous vehicle research is developing the ability to communicate the vehicle's next actions and intentions to humans, both pedestrians and drivers. In the first phase of this project, we will ask how drivers' attention changes when riding in an autonomous vehicle that they do not need to control. When driving a manual car, drivers need to selectively attend to the critical areas of the driving environment at all times. In an autonomous vehicle, however, the idea is that the driver can disengage from the driving environment. The question arises whether humans are able to supervise the system efficiently and whether automation increases the chances of the driver becoming inattentive. The present project uses state-of-the-art mobile eye-tracking devices to record drivers' eye movements while they interact with autonomous and manual cars. For this project, drivers will wear a mobile eye-tracker while performing a task in a driving simulator located in McLaughlin Hall.
The findings from this project will help clarify whether drivers can efficiently monitor an automated system. If we find that the distribution of attention in automated vehicles differs from attention during manual driving, we will provide evidence that when drivers delegate control of the vehicle to an automated system, their attention may shift away from critical areas. This information could then be implemented in an in-car monitoring system that helps drivers maintain attention or alerts them to a drop in attention. Our goal is also to contribute datasets of drivers' gaze in a more realistic environment to the BDD repository, where they can be used to train attention prediction models or be incorporated into autonomous driving models.
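To make the kind of comparison involved concrete, the minimal Python sketch below computes a simple gaze dispersion measure per driving condition. The file names, column layout, and the choice of dispersion metric are illustrative assumptions, not the project's actual analysis pipeline.

    import numpy as np
    import pandas as pd

    def gaze_dispersion(csv_path):
        """Mean Euclidean distance of gaze samples from their centroid.

        Assumes a CSV with normalized gaze columns 'x' and 'y' in [0, 1];
        this format is hypothetical, not the project's actual logs.
        """
        xy = pd.read_csv(csv_path)[["x", "y"]].to_numpy()
        centroid = xy.mean(axis=0)
        return np.sqrt(((xy - centroid) ** 2).sum(axis=1)).mean()

    # A wider spread under automation would suggest attention drifting
    # away from the road center relative to manual driving.
    for condition in ("manual", "autonomous"):
        print(condition, gaze_dispersion(f"gaze_{condition}.csv"))

In a full analysis, dispersion would likely be one of several statistics (fixation durations, saccade rates, dwell times on critical regions) compared across conditions.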
This project is funded by and conducted in collaboration with the Berkeley Deep Drive group. Previous work from this collaboration can be found here: https://deepdrive.berkeley.edu/node/232
Role: The research apprentice will be involved in one or two of these tasks:
1) Setting up experiments with the driving simulator using Matlab Simulink and Python.
2) Collecting data and running experiments with human subjects.
3) Analyzing eye-tracking and behavioral data: this includes applying deep neural networks and Python toolboxes to categorize the eye movement videos, and writing Matlab and Python code to investigate eye movement patterns (see the sketch after this list).
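As one concrete illustration of task 3, here is a minimal Python sketch of a standard velocity-threshold (I-VT) fixation/saccade classifier. The 30 deg/s threshold and the synthetic demo data are textbook defaults and assumptions, not values taken from this project.

    import numpy as np

    def classify_ivt(x, y, t, velocity_threshold=30.0):
        """Label each gaze sample as 'fixation' or 'saccade' (I-VT rule).

        x, y : gaze position in degrees of visual angle (assumed calibrated)
        t    : sample timestamps in seconds
        velocity_threshold : deg/s; 30 deg/s is a common textbook default.
        """
        velocity = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)  # deg/s
        labels = np.where(velocity < velocity_threshold, "fixation", "saccade")
        # The first sample has no preceding velocity; reuse its neighbor's label.
        return np.concatenate([labels[:1], labels])

    # Synthetic demo: steady gaze for 0.5 s, then a 60 deg/s rightward sweep.
    t = np.linspace(0.0, 1.0, 101)
    x = np.concatenate([np.zeros(50), np.linspace(0.0, 30.0, 51)])
    y = np.zeros(101)
    print(classify_ivt(x, y, t)[45:55])  # fixation labels give way to saccades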
Students will be trained to set up and record data with the Pupil Labs eye-tracker and to operate the driving simulator and simulation software required for the project. The apprentice will also meet with the supervisor weekly to discuss preliminary data and background literature, in addition to having the opportunity to attend weekly lab meetings. Finally, the student will have the opportunity to attend the November Berkeley Deep Drive meeting and meet the industry sponsors.
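For reference, loading a recording for analysis might look like the Python snippet below. The file and column names follow the Pupil Player CSV export as commonly documented, and the 0.8 confidence cutoff is a conventional choice rather than a project requirement; verify both against an actual recording.

    import pandas as pd

    # Load an exported gaze file and drop low-confidence samples before analysis.
    # "gaze_positions.csv" with norm_pos_x/norm_pos_y and a per-sample
    # confidence column follows the Pupil Player export format; check
    # against your own export before relying on these names.
    gaze = pd.read_csv("gaze_positions.csv")
    gaze = gaze[gaze["confidence"] > 0.8]  # conventional cutoff, an assumption here
    print(gaze[["gaze_timestamp", "norm_pos_x", "norm_pos_y"]].head())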
Most tasks and meetings will be remote, but some in-person data collection will be required.
Qualifications: Required: high motivation; interest in visual perception and cognitive science; interest in learning about driving research and simulators; experience with the programming languages Python and/or Matlab; and the ability to work independently on certain projects.
Desirable but not essential: basic knowledge of deep neural networks.
Day-to-day supervisor for this project: Zixuan Wang, Graduate Student
Hours: to be negotiated
Related website: https://deepdrive.berkeley.edu/node/232