DOI: 10.1145/3341162.3348384

EyeControl: wearable assistance for industrial maintenance tasks

Published: 9 September 2019

ABSTRACT

Owing to the explicit and implicit facets of gaze-based interaction, eye tracking is a major area of interest within the field of cognitive industrial assistance systems. In this position paper, we describe a scenario in which a wearable platform built around a mobile eye tracker supports and guides an industrial worker throughout the execution of a maintenance task. We discuss the potential benefits of such a solution and outline its key components.


Published in

UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers
September 2019, 1234 pages
ISBN: 9781450368698
DOI: 10.1145/3341162
Copyright © 2019 ACM
Publisher: Association for Computing Machinery, New York, NY, United States


