Understanding driver behavior, and the ethnography surrounding the task of driving, is essential to the development of human-centric driver assistance systems. Novel instrumented vehicles are used to conduct experiments in which rich contextual information about vehicle dynamics, the surround, and driver state is captured, providing both material for careful, detailed ethnographic studies and realistic data for developing algorithms that analyze multisensory signals for active safety. In this presentation, we will provide a systems-oriented framework for developing multimodal sensing, inferencing algorithms, and human-vehicle interfaces for safer automobiles. We will consider three main components of the system: the driver, the vehicle, and the vehicle surround. We will discuss issues and ideas for developing models of these components, as well as of the activities associated with the complex task of safe driving. The presentation will include discussion of novel sensory systems and learning algorithms for capturing not only the dynamic surround information of the vehicle but also the state, intent, and activity patterns of drivers. We will also introduce a new type of visual display called the "dynamic active display". These displays present the driving view and safety-critical visual icons to the driver in a manner that minimizes deviation of her gaze direction without adding unnecessary visual clutter. These contributions support the practical promise of "human-centric active safety" (HCAS) systems in enhancing safety, comfort, and convenience.
Monday, September 05, 2011
Free and open to the public