ABSTRACT
Gesture interfaces enable touchless interaction with systems deployed in critical hospital environments such as operating rooms and intensive care units, among other areas where infection control is essential. However, defining a gesture vocabulary suited to the physical characteristics of these environments, the sensors used, and the tasks users perform while interacting remains a challenge for developers. Methodologies and guidelines proposed in the literature for gesture definition do not consider the characteristics of critical environments, so natural and innovative interaction fails to meet users' needs. In this paper, we discuss the main challenges that critical environments impose on the development of gesture interaction. As a result, we propose a set of recommendations to support developers of gesture interaction systems. Moreover, our research highlights and promotes a discussion of the need for methodologies specific to the critical environment context that support the definition of gesture vocabularies.
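The core difficulty the abstract describes, choosing gestures that suit sterile and sensor-constrained rooms, can be illustrated with a minimal sketch. The gesture names, commands, and dwell-time threshold below are hypothetical, not taken from the paper; the dwell check merely illustrates one way touchless systems guard against accidental activation when users' hands are occupied with clinical tasks.

```python
# Hypothetical sketch: a small touchless gesture vocabulary mapped to
# commands, with a dwell-time check to reduce accidental activations
# (a common concern in sterile rooms where hand movement is constant).

GESTURE_COMMANDS = {            # hypothetical vocabulary
    "swipe_left": "next_image",
    "swipe_right": "previous_image",
    "open_palm": "pause",
}

DWELL_FRAMES = 5  # a gesture must persist this many frames to fire


def interpret(frames):
    """Return the command for the first gesture held for DWELL_FRAMES
    consecutive frames, or None if no gesture is confirmed."""
    run_gesture, run_length = None, 0
    for gesture in frames:
        if gesture == run_gesture:
            run_length += 1
        else:
            run_gesture, run_length = gesture, 1
        if run_length >= DWELL_FRAMES and run_gesture in GESTURE_COMMANDS:
            return GESTURE_COMMANDS[run_gesture]
    return None
```

In practice, the per-frame gesture labels would come from a recognizer driven by a depth sensor such as the Kinect or Leap Motion Controller; the dwell threshold trades responsiveness against robustness to spurious detections.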
Index Terms
- Gesture-based Interaction Systems in Hospital Critical Environment: Challenges and Recommendations for Gesture Creation