Traditional cameras use lenses to form an optical image of the scene, establishing spatial correspondences between the scene and the film or sensor array. These cameras do not sample the incident light fast enough to record any transient variations in the light field. This talk introduces diffuse imaging -- a signal processing framework for imaging using only omnidirectional illumination and sensing. We show that it is possible to construct images by computationally processing samples of the response to time-varying illumination. We also present a range-sensing system that uses neither scene scanning with a laser (as in LIDAR) nor multiple sensors (as in a time-of-flight camera). These technologies depend on novel parametric signal modelling and sampling theory.
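To give a flavor of the core idea, here is a toy sketch (not the speaker's actual method) of recovering range from the time-varying response to a light pulse: the round-trip delay of the echo, found by cross-correlation, determines the distance. All names, signal shapes, and parameter values below are illustrative assumptions.

```python
import numpy as np

C = 3e8   # speed of light, m/s
FS = 1e9  # sample rate, Hz (1 ns per sample; range resolution C/(2*FS) = 0.15 m)

def emitted_pulse(n):
    # Hypothetical emitted waveform: a short Gaussian pulse centered at 20 ns.
    t = np.arange(n) / FS
    return np.exp(-((t - 20e-9) ** 2) / (2 * (3e-9) ** 2))

def simulate_echo(pulse, distance_m, noise=0.01):
    # Attenuated copy of the pulse, delayed by the round-trip time, plus noise.
    delay = int(round(2 * distance_m / C * FS))
    rng = np.random.default_rng(0)
    return 0.5 * np.roll(pulse, delay) + noise * rng.standard_normal(pulse.size)

def estimate_distance(pulse, echo):
    # The cross-correlation peaks at the round-trip delay in samples.
    corr = np.correlate(echo, pulse, mode="full")
    delay = np.argmax(corr) - (pulse.size - 1)
    return delay * C / (2 * FS)

pulse = emitted_pulse(4096)
echo = simulate_echo(pulse, distance_m=45.0)
print(estimate_distance(pulse, echo))  # ~45 m
```

The talk's framework goes well beyond this single-delay picture, using parametric signal models and sampling theory to form full images from such temporal responses.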
Thursday, September 22, 2011
Free and open to the public