KAUST Department
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Computer Science Program
Visual Computing Center (VCC)
University of British Columbia and KAUST
Abstract
Depth cameras are a ubiquitous technology used in a wide range of applications, including robotics and machine vision, human-computer interaction, and autonomous vehicles, as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging, and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.
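For background on the continuous-wave ToF imaging the abstract builds on, the sketch below shows the standard homodyne "4-bucket" phase-shift scheme for recovering depth from four correlation samples. This is a generic textbook illustration, not the paper's multi-camera or Doppler method; the function name `tof_depth` and the simulated bucket values are assumptions for the example.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(b0, b90, b180, b270, f_mod):
    """Estimate depth (m) from four correlation measurements taken at
    phase offsets 0°, 90°, 180°, 270°, with modulation frequency f_mod (Hz)."""
    # Phase of the returned signal relative to the emitted waveform.
    phase = math.atan2(b90 - b270, b0 - b180) % (2 * math.pi)
    # Phase maps linearly to round-trip time, hence to depth, with an
    # unambiguous range of c / (2 * f_mod).
    return C * phase / (4 * math.pi * f_mod)

# Simulated example: a target at 2 m with 30 MHz modulation produces
# phase 4*pi*f*d/c; sampling the correlation at the four offsets
# recovers the depth.
f = 30e6
d_true = 2.0
phi = 4 * math.pi * f * d_true / C
buckets = [math.cos(phi - s) for s in (0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(round(tof_depth(*buckets, f), 3))  # → 2.0
```

The unambiguous range constraint (c / (2 f_mod), about 5 m at 30 MHz) is one reason multi-frequency designs such as the orthogonal frequency approach discussed in the paper are of interest.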
Citation
Shrestha S, Heide F, Heidrich W, Wetzstein G (2016) Computational imaging with multi-camera time-of-flight systems. ACM Transactions on Graphics 35: 1–11. Available: http://dx.doi.org/10.1145/2897824.2925928.
Sponsors
National Science Foundation [1553333, 1539120]
KAUST Office of Sponsored Research
Intel Partnership on Visual and Experiential Computing
Conference/Event name
ACM SIGGRAPH 2016