
VR Simulation of Novel Hands-Free Interaction Concepts for Surgical Robotic Visualization Systems

  • Conference paper
Medical Image Computing and Computer Assisted Intervention – MICCAI 2020 (MICCAI 2020)

Abstract

In microsurgery, visualization systems such as the traditional surgical microscope are essential: surgeons rely on a highly magnified stereoscopic view to perform their operative tasks. Maintaining a well-aligned visual perspective onto the operating field requires frequent, precise adjustments of the system's position. Each adjustment, however, forces the surgeon to reach for the device and remove their hand(s) from the operating field, disrupting the operative task at hand. To address this, we propose two novel hands-free interaction concepts based on head- and gaze-tracking that allow surgeons to efficiently control the 6D positioning of a robotic visualization system with minimal interruption to the main operative task. Both concepts were simulated purely in a virtual reality (VR) environment for a robotic visualization system, using an HTC Vive. They were evaluated within the VR simulation in a quantitative user study with 11 neurosurgeons at the Charité hospital and compared to conventional interaction with a surgical microscope. After a brief introduction to the interaction concepts in the VR simulation, neurosurgeons were 29% faster at reaching a set of virtual targets (position and orientation) in simulation than at reaching equivalent physical targets on a 3D-printed reference object.
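The head-tracking concept can be pictured as a clutch-style mapping: while the clutch is engaged, the change in the surgeon's head pose since engagement is scaled and applied to the system's 6D target pose. The following is a minimal sketch only; the paper does not publish code, and the function name, the pose representation as [x, y, z, roll, pitch, yaw] lists, and the gain parameter are our assumptions.

```python
def head_delta_to_pose(anchor_pose, head_start, head_now, gain=1.0):
    """Hypothetical clutch-style mapping: apply the scaled change in head
    pose since clutch engagement to the visualization system's 6D pose.

    Poses are [x, y, z, roll, pitch, yaw] lists (metres / radians).
    Componentwise addition of Euler angles is a small-angle simplification;
    a real controller would compose rotations properly.
    """
    return [a + (now - start) * gain
            for a, start, now in zip(anchor_pose, head_start, head_now)]

# Example: while the clutch is engaged, the surgeon pitches their head down
# 0.1 rad and moves 2 cm along x; with gain 0.5 the target pose shifts by half.
anchor = [0.0, 0.0, 0.4, 0.0, 0.0, 0.0]
target = head_delta_to_pose(anchor,
                            head_start=[0.0] * 6,
                            head_now=[0.02, 0.0, 0.0, 0.0, 0.1, 0.0],
                            gain=0.5)
# target == [0.01, 0.0, 0.4, 0.0, 0.05, 0.0]
```

A gain below 1.0 damps head motion, which matters here because the surgeon's head also carries their view of the operating field; the paper's gaze-based concept likewise needs some such decoupling between looking and commanding.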

Supported by BMBF (German Federal Ministry of Education and Research).




Author information

Correspondence to Fang You.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

You, F., Khakhar, R., Picht, T., Dobbelstein, D. (2020). VR Simulation of Novel Hands-Free Interaction Concepts for Surgical Robotic Visualization Systems. In: Martel, A.L., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2020. MICCAI 2020. Lecture Notes in Computer Science, vol 12263. Springer, Cham. https://doi.org/10.1007/978-3-030-59716-0_42


  • DOI: https://doi.org/10.1007/978-3-030-59716-0_42

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-59715-3

  • Online ISBN: 978-3-030-59716-0

  • eBook Packages: Computer Science; Computer Science (R0)
