Type
Conference Paper
KAUST Department
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Computer Science Program
Visual Computing Center (VCC)
University of British Columbia and KAUST
Date
2016-07-11
Permanent link to this record
http://hdl.handle.net/10754/621386
Abstract
Depth cameras are a ubiquitous technology used in a wide range of applications, including robotics and machine vision, human-computer interaction, autonomous vehicles, and augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging, and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.
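As background for the orthogonal frequency design mentioned in the abstract, the sketch below (not taken from the paper; the frequencies, exposure time, and phase value are assumed for illustration) shows numerically why giving each camera's light source a distinct modulation frequency suppresses mutual interference: the cross-frequency correlation averages to nearly zero over the exposure, while the same-frequency homodyne product retains the depth-dependent phase term.

```python
import numpy as np

# Minimal sketch (not the paper's implementation) of orthogonal frequency
# design for multi-camera ToF: correlating two sinusoids at distinct
# modulation frequencies averages to ~0 over the exposure, while the
# same-frequency (homodyne) product keeps the depth-dependent phase term.
# All numeric values below are illustrative assumptions.

T = 1e-3                                   # exposure time (s), assumed
t = np.linspace(0.0, T, 200_000)           # dense time samples over one exposure

f1, f2 = 30e6, 31e6                        # modulation frequencies of camera 1 / camera 2 (assumed)
phase = 0.7                                # phase delay from light travel, encodes depth (arbitrary)

ref = np.cos(2 * np.pi * f1 * t)           # camera 1 reference (demodulation) signal
own = np.cos(2 * np.pi * f1 * t - phase)   # return from camera 1's own light source
other = np.cos(2 * np.pi * f2 * t)         # interfering light from camera 2

signal_term = np.mean(ref * own)           # -> cos(phase)/2, the useful measurement
interference = np.mean(ref * other)        # -> ~0, cross-frequency terms cancel out

print(f"same-frequency term : {signal_term:.4f} (cos(phase)/2 = {np.cos(phase) / 2:.4f})")
print(f"cross-frequency term: {interference:.2e} (approx. 0)")
```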
Citation
Shrestha S, Heide F, Heidrich W, Wetzstein G (2016) Computational imaging with multi-camera time-of-flight systems. ACM Transactions on Graphics 35: 1–11. Available: http://dx.doi.org/10.1145/2897824.2925928.
Sponsors
National Science Foundation [1553333, 1539120]
Conference/Event name
ACM SIGGRAPH 2016
DOI
10.1145/2897824.2925928