Abstract
Over the past decade, we have developed an augmented reality system called the Sonic Flashlight (SF), which merges ultrasound with the operator’s vision using a half-silvered mirror and a miniature display attached to the ultrasound probe. We now add a small video camera and a structured laser light source so that computer vision algorithms can determine the location of the surface of the patient being scanned, to aid in analysis of the ultrasound data. In particular, we intend to determine the angle of the ultrasound probe relative to the surface to disambiguate Doppler information from arteries and veins running parallel to, and beneath, that surface. The initial demonstration presented here finds the orientation of a flat-surfaced ultrasound phantom. This is a first step towards integrating more sophisticated computer vision methods into automated ultrasound analysis, with the ultimate goal of creating a symbiotic human/machine system that shares both ultrasound and visual data.
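The core geometric step described above — recovering the orientation of a flat phantom surface from structured-light points so the probe's tilt can be computed — can be illustrated with a short sketch. This is not the authors' implementation; it assumes the camera has already triangulated the laser stripe into 3D points, and simply fits a plane by least squares (SVD) and measures the angle between the probe axis and the surface normal.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD; returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

def probe_tilt_deg(probe_axis, surface_normal):
    """Angle in degrees between the probe axis and the surface normal."""
    a = probe_axis / np.linalg.norm(probe_axis)
    n = surface_normal / np.linalg.norm(surface_normal)
    cos_t = abs(np.dot(a, n))  # sign of the normal is arbitrary
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Simulated laser-stripe points on a flat phantom tilted 20 degrees
# about the x-axis (hypothetical data, not from the paper).
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
tilt = np.radians(20.0)
pts = np.column_stack([xy[:, 0],
                       xy[:, 1] * np.cos(tilt),
                       xy[:, 1] * np.sin(tilt)])

_, normal = fit_plane(pts)
angle = probe_tilt_deg(np.array([0.0, 0.0, 1.0]), normal)
print(round(angle, 1))  # recovered tilt, degrees
```

With noise-free points the recovered tilt matches the simulated 20-degree rotation; in practice the SVD fit also tolerates measurement noise on the stripe, which is why plane fitting rather than three-point geometry is the usual choice.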
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Horvath, S. et al. (2012). Towards an Ultrasound Probe with Vision: Structured Light to Determine Surface Orientation. In: Linte, C.A., Moore, J.T., Chen, E.C.S., Holmes, D.R. (eds) Augmented Environments for Computer-Assisted Interventions. AE-CAI 2011. Lecture Notes in Computer Science, vol 7264. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32630-1_6
Print ISBN: 978-3-642-32629-5
Online ISBN: 978-3-642-32630-1