
A gesture-controlled projection display for CT-guided interventions

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Interacting with interventional imaging systems in a sterile environment is a challenging task for physicians. Direct physician–machine interaction during an intervention is limited by sterility requirements and workspace restrictions.

Methods

We present a gesture-controlled projection display that enables direct and natural physician–machine interaction during computed tomography (CT)-based interventions. To this end, a graphical user interface is projected onto a radiation shield located in front of the physician. Hand gestures performed in front of this display are captured and classified using a Leap Motion controller. We propose a gesture set for controlling basic functions of intervention software, including gestures for 2D image exploration, 3D object manipulation, and selection. Our methods were evaluated in a clinically oriented user study with 12 participants.
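To illustrate how such a gesture set might be captured and classified, the following minimal sketch uses the Leap Motion SDK v2 Python bindings. It is an assumption-laden illustration, not the paper's implementation: the application hooks (scroll_slice, select_point, rotate_3d_model) are hypothetical stand-ins for intervention-software functions, and the mapping from the SDK's built-in gestures to actions is illustrative rather than the gesture set proposed here.

    import Leap

    # Hypothetical application hooks: stand-ins for intervention-software
    # functions (not taken from the paper).
    def scroll_slice(direction):
        print("scroll CT slices, swipe direction: %s" % direction)

    def select_point():
        print("select element under cursor")

    def rotate_3d_model(axis, angle):
        print("rotate 3D model by %.2f rad about %s" % (angle, axis))

    class GestureListener(Leap.Listener):
        def on_connect(self, controller):
            # Enable the SDK's built-in gesture recognizers.
            controller.enable_gesture(Leap.Gesture.TYPE_SWIPE)       # 2D image exploration
            controller.enable_gesture(Leap.Gesture.TYPE_SCREEN_TAP)  # selection

        def on_frame(self, controller):
            frame = controller.frame()

            # Classify the discrete gestures reported by the SDK.
            for gesture in frame.gestures():
                if gesture.type == Leap.Gesture.TYPE_SWIPE:
                    scroll_slice(Leap.SwipeGesture(gesture).direction)
                elif gesture.type == Leap.Gesture.TYPE_SCREEN_TAP:
                    select_point()

            # Continuous 3D manipulation: derive a rotation from the hand
            # motion relative to a frame 10 samples in the past.
            past = controller.frame(10)
            if not frame.hands.is_empty:
                rotate_3d_model(frame.rotation_axis(past),
                                frame.rotation_angle(past))

    if __name__ == "__main__":
        listener = GestureListener()
        controller = Leap.Controller()
        controller.add_listener(listener)
        raw_input("Press Enter to quit...\n")  # SDK v2 bindings target Python 2
        controller.remove_listener(listener)

In this sketch, discrete gestures (swipe, screen tap) cover 2D exploration and selection, while continuous hand motion between frames drives 3D object rotation.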

Results

The results of the user study confirm that the display and the underlying interaction concept are accepted by clinical users. Gesture recognition is robust, although there is still room for improvement. Gesture training times are below 10 min but vary considerably between participants. The developed gestures are logically connected to the intervention software and are intuitive to use.

Conclusions

The proposed gesture-controlled projection display counters current thinking in that it gives the radiologist complete control of the intervention software. It opens up new possibilities for direct physician–machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.



Acknowledgments

The work presented in this paper was partly funded by the Federal Ministry of Education and Research within the Forschungscampus STIMULATE under grant number 13GW0095A.

Author information

Corresponding author

Correspondence to A. Mewes.


About this article


Cite this article

Mewes, A., Saalfeld, P., Riabikin, O. et al. A gesture-controlled projection display for CT-guided interventions. Int J CARS 11, 157–164 (2016). https://doi.org/10.1007/s11548-015-1215-0


Keywords

Navigation