
Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery

Published in Computer Supported Cooperative Work (CSCW)

An Erratum to this article was published on 12 March 2015

Abstract

While surgical practices are increasingly reliant on a range of digital imaging technologies, clinicians' ability to interact with and manipulate these digital representations in the operating theatre using traditional touch-based interaction devices is constrained by the need to maintain sterility. To overcome these concerns with sterility, a number of researchers have been developing ways of enabling interaction in the operating theatre through touchless interaction techniques, such as gesture and voice, that allow clinicians to control these systems. While there have been important technical strides in the area, there has been little work on understanding how these touchless systems are used in practice. With this in mind, we present a touchless system developed for use during vascular surgery. We deployed the system in the endovascular suite of a large hospital for use in the context of real procedures. We present findings from a study of the system in use, focusing on how, with touchless interaction, the visual resources were embedded and made meaningful in the collaborative practices of surgery. In particular, we discuss the importance of direct and dynamic control of the images by the clinicians in the context of talk and of other artefact use, as well as the work performed by members of the clinical team to make themselves sensable by the system. We discuss the broader implications of these findings for how we think about the design, evaluation and use of these systems.

Figures 1–13 (omitted)


Notes

  1. The other two clinicians are standing close to the left of the chief surgeon—the most salient figure in the images—and as such are partially obscured by him.


Author information


Corresponding author

Correspondence to Kenton O’Hara.


About this article


Cite this article

O’Hara, K., Gonzalez, G., Penney, G. et al. Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery. Comput Supported Coop Work 23, 299–337 (2014). https://doi.org/10.1007/s10606-014-9203-4

