
Towards an Ultrasound Probe with Vision: Structured Light to Determine Surface Orientation

Conference paper: Augmented Environments for Computer-Assisted Interventions (AE-CAI 2011)

Abstract

Over the past decade, we have developed an augmented reality system called the Sonic Flashlight (SF), which merges ultrasound with the operator’s vision using a half-silvered mirror and a miniature display attached to the ultrasound probe. We now add a small video camera and a structured laser light source so that computer vision algorithms can determine the location of the surface of the patient being scanned, to aid in analysis of the ultrasound data. In particular, we intend to determine the angle of the ultrasound probe relative to the surface to disambiguate Doppler information from arteries and veins running parallel to, and beneath, that surface. The initial demonstration presented here finds the orientation of a flat-surfaced ultrasound phantom. This is a first step towards integrating more sophisticated computer vision methods into automated ultrasound analysis, with the ultimate goal of creating a symbiotic human/machine system that shares both ultrasound and visual data.
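The surface-orientation step described in the abstract can be illustrated with a minimal sketch. Assuming the structured laser line has already been triangulated by the camera into 3D points expressed in the probe's coordinate frame (the triangulation itself is not shown), one can fit a plane to those points and report the angle between the probe's insonation axis and the surface normal. The function and variable names below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to Nx3 surface points via SVD.

    Returns (centroid, unit normal). The points are assumed to be
    laser-line samples already triangulated into the probe frame.
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

def probe_tilt_deg(normal, probe_axis=np.array([0.0, 0.0, 1.0])):
    """Angle in degrees between the probe's insonation axis and the
    surface normal; 0 means the probe is perpendicular to the skin."""
    cos_theta = abs(np.dot(normal, probe_axis)) / np.linalg.norm(probe_axis)
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

if __name__ == "__main__":
    # Synthetic flat phantom tilted 20 degrees about the x-axis, with noisy depth.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-20, 20, size=(200, 2))                # mm
    z = xy[:, 1] * np.tan(np.radians(20)) + rng.normal(0, 0.1, 200)
    pts = np.column_stack([xy, z])

    _, n = fit_plane(pts)
    print(f"estimated tilt: {probe_tilt_deg(n):.1f} deg")   # ~20.0
```

On the flat-surfaced phantom used in the initial demonstration this reduces to recovering a single tilt angle; on curved anatomy the same fit would give only a local tangent-plane estimate.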




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Horvath, S. et al. (2012). Towards an Ultrasound Probe with Vision: Structured Light to Determine Surface Orientation. In: Linte, C.A., Moore, J.T., Chen, E.C.S., Holmes, D.R. (eds) Augmented Environments for Computer-Assisted Interventions. AE-CAI 2011. Lecture Notes in Computer Science, vol 7264. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32630-1_6

  • DOI: https://doi.org/10.1007/978-3-642-32630-1_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32629-5

  • Online ISBN: 978-3-642-32630-1
