Abstract
In microsurgery, visualization systems such as the traditional surgical microscope are essential: surgeons rely on the highly magnified stereoscopic view to perform their operative tasks. Maintaining a well-aligned perspective onto the operating field requires frequent, precise adjustments of the system's positioning. Each adjustment, however, forces the surgeon to reach for the device and remove their hand(s) from the operating field, a disruptive event for the operative task at hand. To address this, we propose two novel hands-free interaction concepts based on head- and gaze-tracking that should allow surgeons to efficiently control the 6D positioning of a robotic visualization system with minimal interruption to the main operative task. Both concepts were simulated entirely in a virtual reality (VR) environment using an HTC Vive in place of a robotic visualization system, and were evaluated within the VR simulation in a quantitative user study with 11 neurosurgeons at the Charité hospital, compared against conventional interaction with a surgical microscope. After a brief introduction to the interaction concepts in the VR simulation, neurosurgeons were 29% faster in reaching a set of virtual targets (position and orientation) in simulation than in reaching equivalent physical targets on a 3D-printed reference object.
Supported by BMBF (German Federal Ministry of Education and Research).
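The head-tracking concept described above maps the surgeon's head motion to the positioning of the robotic camera. A minimal sketch of one plausible mapping is shown below, restricted to translation only (the actual concepts cover full 6D pose); the clutch mechanism, function name, and gain value are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def head_to_camera_delta(head_at_clutch, head_now, gain=0.5):
    """Map relative head motion, captured while a 'clutch' is engaged,
    to a scaled 3D translation command (in metres) for the camera.
    A gain below 1 damps head motion so that small, deliberate head
    movements yield fine camera adjustments."""
    return gain * (np.asarray(head_now) - np.asarray(head_at_clutch))

# Clutch engaged with the head at (0, 1.7, 0) m; the surgeon then
# leans 4 cm to the right, commanding the camera 2 cm to the right.
delta = head_to_camera_delta([0.0, 1.7, 0.0], [0.04, 1.7, 0.0])
```

A clutch of this kind (an explicit engage/disengage trigger) is a common design choice in hands-free interfaces, since it separates intentional control motion from the incidental head movement that occurs continuously during surgery.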
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
You, F., Khakhar, R., Picht, T., Dobbelstein, D. (2020). VR Simulation of Novel Hands-Free Interaction Concepts for Surgical Robotic Visualization Systems. In: Martel, A.L., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2020. MICCAI 2020. Lecture Notes in Computer Science(), vol 12263. Springer, Cham. https://doi.org/10.1007/978-3-030-59716-0_42
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-59715-3
Online ISBN: 978-3-030-59716-0
eBook Packages: Computer Science (R0)