Multimodal Distraction Detection

Source Organization:
University: Carnegie Mellon University
Principal Investigator: Maxine Eskenazi
PI Contact:
Project Manager: Courtney Ehrlichman
Funding Source(s) and Amounts Provided (by each agency or organization): $85,000
Total Dollars:
Agency ID/Contract/Grant Number:
Start and End Dates: January 2016 - January 2017
Project Status: Active
Subject Categories: In-Vehicle Technologies
Abstract: Distracted driving continues to cause traffic accidents despite prevailing legislation. The goal of this project is to automatically determine when a driver is becoming distracted. This information can be sent to an in-car warning system or used to shut down the distracting activity. While distraction can already be detected fairly reliably from signals such as speech, gas pedal and brake use, and steering wheel trajectory, adding automatic detection of head movement will make the detector more robust: drivers often turn their heads to talk to a passenger or, more frequently, to look at a smart device. In this project, we will gather data in a driving simulator from 50 subjects, who will first drive the course and then watch the recording of their drive and indicate where they were distracted. They will use their own smart devices to listen to and dictate email. Conditions will vary in the cognitive load of the email, the difficulty of the road, and the degree to which the subject must look at the smart device. The resulting database will be used to train and test a new, robust distraction detector.
Describe Implementation of Research Outcomes (or why not implemented):
Impacts/Benefits of Implementation (actual, not anticipated):
Project URL:
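The abstract describes fusing several driving signals (steering trajectory, pedal use, head movement) to flag distraction. As a minimal illustrative sketch only, and not the project's actual detector, the fusion idea can be shown with windowed features and a simple rule. All function names, thresholds, and the rule itself are assumptions introduced for illustration.

```python
import statistics

def window_features(steering, pedal, head_yaw, win=10):
    """Slice synchronized signals into fixed windows and compute simple
    per-window features (all feature choices are illustrative)."""
    feats = []
    for i in range(0, len(steering) - win + 1, win):
        s = steering[i:i + win]
        p = pedal[i:i + win]
        h = head_yaw[i:i + win]
        feats.append({
            # high variance suggests erratic steering corrections
            "steer_var": statistics.pvariance(s),
            # inconsistent gas/brake use
            "pedal_var": statistics.pvariance(p),
            # fraction of frames with gaze turned well away from the road
            # (30-degree yaw cutoff is an arbitrary illustrative value)
            "head_off_road": sum(abs(y) > 30 for y in h) / win,
        })
    return feats

def is_distracted(f, steer_thr=4.0, head_thr=0.3):
    """Fuse modalities with a simple rule: flag a window as distracted if
    steering is erratic or gaze is off the road too often. A trained
    classifier would replace this hand-set rule."""
    return f["steer_var"] > steer_thr or f["head_off_road"] > head_thr
```

In practice, the per-window features would feed a classifier trained on the labeled simulator data the project plans to collect, rather than the hand-set thresholds used here.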