Doppler time-of-flight imaging

Published: 27 July 2015

Abstract

Over the last few years, depth cameras have become increasingly popular for a range of applications, including human-computer interaction and gaming, augmented reality, machine vision, and medical imaging. Many of the commercially available devices use the time-of-flight principle, where active illumination is temporally coded and analyzed in the camera to estimate a per-pixel depth map of the scene. In this paper, we propose a fundamentally new imaging modality for all time-of-flight (ToF) cameras: per-pixel radial velocity measurement. The proposed technique exploits the Doppler effect of objects in motion, which shifts the temporal illumination frequency before it reaches the camera. By carefully coding the illumination and modulation frequencies of the ToF camera, object velocities map directly to measured pixel intensities. We show that a slight modification of our imaging system allows color, depth, and velocity information to be captured simultaneously. Combining the optical flow computed on the RGB frames with the measured metric radial velocity allows us to further estimate the full 3D metric velocity field of the scene. The proposed technique has applications in many computer graphics and vision problems, for example motion tracking, segmentation, recognition, and motion deblurring.
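For intuition, the minimal Python sketch below illustrates the scale of the effect: the standard two-way Doppler relation Δf = 2·v·f/c that underlies the approach, plus a hypothetical normalization step. The function radial_velocity, the measurement names doppler_measurement and reference_measurement, and the calibration constant k are illustrative assumptions, not the paper's exact formulation.

```python
C = 299_792_458.0  # speed of light (m/s)

def doppler_shift(f_illum, v_radial):
    """Two-way Doppler shift (Hz) of illumination reflected by an object
    moving with radial velocity v_radial (m/s, positive toward the camera),
    using the standard non-relativistic approximation: df = 2 * v * f / c."""
    return 2.0 * v_radial * f_illum / C

def radial_velocity(doppler_measurement, reference_measurement, f_illum, k=1.0):
    """Hypothetical inversion (illustrative only): if one coded measurement is
    approximately proportional to the Doppler shift and a second measurement
    normalizes out albedo and distance, their ratio recovers the shift up to a
    calibration constant k, and hence the metric radial velocity."""
    delta_f = k * doppler_measurement / reference_measurement
    return delta_f * C / (2.0 * f_illum)

if __name__ == "__main__":
    f_illum = 30e6                      # 30 MHz illumination frequency
    print(doppler_shift(f_illum, 2.0))  # ~0.40 Hz for an object at 2 m/s
```

At a typical 30 MHz modulation frequency, an object moving at 2 m/s shifts the returned signal by only about 0.4 Hz, which is why the illumination and sensor modulation frequencies must be coded so that this tiny shift becomes visible as a per-pixel intensity change.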

Supplementary Material

MP4 File (a36.mp4)

Published In

ACM Transactions on Graphics, Volume 34, Issue 4
August 2015
1307 pages
ISSN: 0730-0301
EISSN: 1557-7368
DOI: 10.1145/2809654
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 27 July 2015
Published in TOG Volume 34, Issue 4

Author Tags

  1. computational photography
  2. time-of-flight

Qualifiers

  • Research-article
