Device- and system-independent personal touchless user interface for operating rooms

One personal UI to control all displays in an operating room

  • Original Article
  • International Journal of Computer Assisted Radiology and Surgery

Abstract

Introduction

In the modern operating room, the surgeon performs surgery with the support of different medical systems that display patient information, physiological data, and medical images. Retrieving the desired information typically requires numerous interactions with the corresponding medical system, and these are usually carried out by the surgical team. Owing to the disadvantages of the conventional mouse, joysticks and physical keys are still present in the operating room, and surgeons often have to relay instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware.

Methods

To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of the surgeon's finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, where they emulate a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted medical system's display and the recognized gestures are translated into generic events and sent to the corresponding Android device. Finally, the application running on the Android device generates the matching mouse or keyboard events for the targeted medical system.
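The paper's code is not reproduced on this page; as a minimal sketch of the receiving side of such a pipeline, the Python fragment below shows how generic events could be replayed as USB mouse input. It is hypothetical, not the authors' implementation: it assumes generic events arrive as newline-delimited JSON over TCP (port 5555 is arbitrary) and that the Android device exposes a 4-byte boot-protocol mouse gadget at /dev/hidg1, in the spirit of the kernel virtual devices credited in the Acknowledgments.

    # Hypothetical sketch, not the authors' implementation.
    # Assumes: (1) generic events arrive as newline-delimited JSON over
    # TCP, and (2) a boot-protocol mouse gadget exists at /dev/hidg1.
    import json
    import socket
    import struct

    MOUSE_DEV = "/dev/hidg1"  # report bytes: buttons, dx, dy, wheel

    def send_mouse_report(buttons, dx, dy, wheel=0):
        """Write one 4-byte boot-protocol mouse report to the gadget."""
        # dx/dy are signed bytes, so clamp relative motion to [-127, 127].
        dx = max(-127, min(127, dx))
        dy = max(-127, min(127, dy))
        with open(MOUSE_DEV, "wb") as dev:
            dev.write(struct.pack("Bbbb", buttons & 0x07, dx, dy, wheel))

    def handle_event(event):
        """Map one generic interaction event onto simulated mouse input."""
        if event["type"] == "move":
            send_mouse_report(0x00, event["dx"], event["dy"])
        elif event["type"] == "click":
            send_mouse_report(0x01, 0, 0)  # press the left button
            send_mouse_report(0x00, 0, 0)  # release it

    def serve(port=5555):
        """Accept the tracking host and replay its events as HID input."""
        with socket.create_server(("", port)) as srv:
            conn, _ = srv.accept()
            with conn, conn.makefile("r") as stream:
                for line in stream:
                    handle_event(json.loads(line))

    if __name__ == "__main__":
        serve()

Because the emulated device looks to the host computer like an ordinary USB mouse, the medical systems require no software or hardware changes, which is the property the paragraph above emphasizes.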

Results and conclusion

To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances from the displays. Results from the System Usability Scale (SUS) and the NASA-TLX workload index indicated strong acceptance of our proposed user interface.
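The SUS score underlying such results follows Brooke's standard formula: ten items rated from 1 to 5, where odd (positively worded) items contribute (response - 1) and even (negatively worded) items contribute (5 - response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of this standard scoring follows; the example responses are invented for illustration and are not the study's data.

    # Standard SUS scoring (Brooke 1996); example responses are made up.
    def sus_score(responses):
        """Score one participant's ten SUS items, each rated 1-5."""
        assert len(responses) == 10
        total = 0
        for item, r in enumerate(responses, start=1):
            # Odd items are positively worded, even items negatively.
            total += (r - 1) if item % 2 == 1 else (5 - r)
        return total * 2.5  # scale the 0-40 sum to 0-100

    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0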


Acknowledgments

This work was partly supported by the China Scholarship Council (File No. 201206110030). We also want to thank Sergii Pylypenko for the virtual devices in the Linux kernel used to simulate the USB mouse and keyboard.

Author information

Correspondence to Meng Ma.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 25276 KB)


Cite this article

Ma, M., Fallavollita, P., Habert, S. et al. Device- and system-independent personal touchless user interface for operating rooms. Int J CARS 11, 853–861 (2016). https://doi.org/10.1007/s11548-016-1375-6

