Visual Navigation with Android Tablets
|University:||University of Pennsylvania|
|Principal Investigator:||Dan Lee|
|PI Contact Information:||200 S. 33rd Street, 203 Moore Building, University of Pennsylvania, Philadelphia, PA 19104 - 215-898-8112 email@example.com|
|Project Manager:||Courtney Ehrlichman|
|Funding Source(s) and Amounts Provided (by each agency or organization):||UTC|
|Agency ID/Contract/Grant Number:|
|Start and End Dates:|
|Abstract:||The goal of this project is to develop a visual navigation system for vehicles using Android tablets. Current vehicle navigation relies solely on static road maps and noisy GPS data, an approach that is prone to errors where GPS signals are unavailable, such as in urban canyons. We propose to use a windshield-mounted Android tablet as a sensor platform for augmenting in-vehicle navigation (Figure 1).
We begin by detecting and tracking salient features over time in the color camera images. Combined with the accelerometer and gyroscope data, this lets us distinguish between static (background) and dynamic (foreground) features in the images (Figure 1). The static features are then used to compute visual odometry (the motion of the vehicle) and to build a 3D model of the environment. This information can then augment the navigation system with local, real-time information and overlays.
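The static/dynamic split described above can be illustrated with a minimal sketch. Assuming a calibrated camera, features in normalized image coordinates, and negligible camera translation between consecutive frames (a reasonable approximation for distant background points over one frame interval), the gyroscope's angular velocity predicts the image-plane flow of every static feature; tracked features whose observed flow deviates strongly from that prediction are flagged as dynamic. The function names, threshold, and rotation-only flow model here are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def rotational_flow(pts, omega, dt):
    # Predicted image-plane flow (normalized coordinates) induced by a
    # pure camera rotation omega = (wx, wy, wz) [rad/s] over time dt [s].
    x, y = pts[:, 0], pts[:, 1]
    wx, wy, wz = omega
    u = (x * y * wx - (1.0 + x**2) * wy + y * wz) * dt
    v = ((1.0 + y**2) * wx - x * y * wy - x * wz) * dt
    return np.stack([u, v], axis=1)

def classify_features(prev_pts, curr_pts, omega, dt, thresh=0.01):
    # Label each tracked feature static (True) or dynamic (False) by
    # comparing its observed flow against the gyro-predicted flow.
    observed = curr_pts - prev_pts
    predicted = rotational_flow(prev_pts, omega, dt)
    residual = np.linalg.norm(observed - predicted, axis=1)
    return residual < thresh
```

In the full system, the surviving static features would feed the visual-odometry and 3D-reconstruction stages, while translational parallax would be handled with epipolar-geometry checks rather than this rotation-only model.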
Ultimately, we aim to create an Android application that makes drivers more aware of their environment and promotes safety and focus while driving. To this end, we are developing algorithms for several key operations (Figure 2): detecting the road and lane markings, identifying other vehicles on the road, recognizing pedestrians and cyclists, and identifying important road features such as stop lights, stop signs, and speed limit signs. Using moving-object detection and visual odometry, our application will compute and monitor the distances to nearby vehicles, pedestrians, and cyclists. This information can in turn be used to alert the driver to a potentially hazardous situation such as an impending collision. Additionally, lane detection and vehicle egomotion estimation will be used to detect unintended lane departure (drifting) and issue a warning when it occurs.
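The warning logic described above can be sketched in a few lines: a forward-collision alert fires when the time-to-collision (distance divided by closing speed) falls below a threshold, and a lane-departure alert fires when the vehicle sits near a lane boundary and is still drifting toward it. All function names, thresholds, and the lane half-width below are hypothetical placeholders chosen for illustration, not values from the project.

```python
def time_to_collision(distance_m, closing_speed_mps):
    # Seconds until impact if the closing speed stays constant;
    # infinity if the gap is constant or opening.
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def check_hazards(distance_m, closing_speed_mps, lateral_offset_m,
                  lateral_rate_mps, ttc_thresh=2.0, lane_half_width=1.8):
    # Return the list of warnings to raise for the current frame.
    warnings = []
    if time_to_collision(distance_m, closing_speed_mps) < ttc_thresh:
        warnings.append("forward collision warning")
    # Drifting: close to the boundary AND still moving toward it
    # (offset and lateral rate have the same sign).
    if (abs(lateral_offset_m) > 0.8 * lane_half_width
            and lateral_offset_m * lateral_rate_mps > 0):
        warnings.append("lane departure warning")
    return warnings
```

In practice the distances and lateral offsets would come from the moving-object detection and lane-detection modules, with the closing speed and lateral rate derived from the visual-odometry estimates.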
|Describe Implementation of Research Outcomes (or why not implemented):|
|Impacts/Benefits of Implementation (actual, not anticipated):||Android tablets are well suited to this task because they are inexpensive, readily available, and equipped with an array of useful sensors, including GPS, color cameras, an accelerometer, and a gyroscope, as well as wireless connectivity. Unlike many of the advanced safety features available in vehicles today, this project does not require any specialized hardware to be installed on the vehicle. The radar and LIDAR sensors typically used in modern vehicle safety systems can be expensive, and older vehicles cannot benefit from them at all. Our solution requires no external hardware modification, allowing for an immediate impact in both new and old vehicles.|