EyeControl: wearable assistance for industrial maintenance tasks

ABSTRACT
Due to the explicit and implicit facets of gaze-based interaction, eye tracking is a major area of interest within the field of cognitive industrial assistance systems. In this position paper, we describe a scenario in which a wearable platform built around a mobile eye tracker supports and guides an industrial worker throughout the execution of a maintenance task. We discuss the potential benefits of such a solution and outline its key components.
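To make the "explicit" facet of gaze-based interaction concrete, the following is a minimal sketch of dwell-time gaze selection, one common hands-free confirmation technique: a maintenance step is acknowledged when the worker fixates a target region long enough. All names, thresholds, and the data format are illustrative assumptions, not the platform's actual implementation.

```python
DWELL_THRESHOLD = 0.8  # seconds of continuous fixation to trigger (assumed value)

def detect_dwell(samples, target, radius=50.0, threshold=DWELL_THRESHOLD):
    """Return True if gaze stays within `radius` px of `target`
    for at least `threshold` seconds.

    samples: time-ordered list of (timestamp, x, y) gaze points.
    target:  (x, y) centre of the target region in scene coordinates.
    """
    dwell_start = None
    for t, x, y in samples:
        inside = (x - target[0]) ** 2 + (y - target[1]) ** 2 <= radius ** 2
        if inside:
            if dwell_start is None:
                dwell_start = t           # fixation on target begins
            elif t - dwell_start >= threshold:
                return True               # dwell long enough: trigger
        else:
            dwell_start = None            # fixation broken, restart timer
    return False

# Example: gaze enters the target at t=0.2 and stays until t=1.2
stream = [(0.0, 300, 300), (0.2, 100, 100), (0.6, 105, 98), (1.2, 102, 101)]
print(detect_dwell(stream, target=(100, 100)))  # True
```

In a real system the samples would arrive as a live stream from the mobile eye tracker, and the dwell threshold would be tuned against the Midas-touch problem (unintended triggers from merely looking).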