DOI: 10.1145/3317956.3318155

EyeMRTK: a toolkit for developing eye gaze interactive applications in virtual and augmented reality

Published: 25 June 2019

Abstract

For head-mounted displays, as used in mixed reality applications, eye gaze is a natural interaction modality. EyeMRTK provides building blocks for eye gaze interaction in virtual and augmented reality. Based on a hardware abstraction layer, it allows interaction researchers and developers to focus on their interaction concepts while enabling them to evaluate their ideas on all supported systems. In addition, the toolkit provides a simulation layer for debugging, which speeds up prototyping during desktop development.
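The two architectural ideas in the abstract, a hardware abstraction layer that interaction code programs against and a simulation layer that stands in for real hardware on the desktop, can be sketched in a few lines. The following is a minimal illustration in Python and is not EyeMRTK's actual API (the toolkit itself targets Unity/C#); all names here (GazeSource, SimulatedGazeSource, DwellSelector) are hypothetical:

```python
# Hypothetical sketch of the abstract's two ideas: a hardware abstraction
# layer for gaze sources, plus a simulation layer that replaces real
# eye-tracking hardware during desktop development. Not EyeMRTK's API.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class GazeSample:
    origin: Vec3       # gaze ray origin in world coordinates
    direction: Vec3    # normalized gaze ray direction
    confidence: float  # tracker confidence in [0, 1]

class GazeSource(ABC):
    """Hardware abstraction layer: each eye-tracker backend implements this."""
    @abstractmethod
    def current_gaze(self) -> GazeSample:
        ...

class SimulatedGazeSource(GazeSource):
    """Simulation layer: derives a gaze ray from the desktop mouse cursor,
    so interaction logic can be debugged without an HMD or eye tracker."""
    def __init__(self, mouse_ray: Callable[[], Tuple[Vec3, Vec3]]):
        self._mouse_ray = mouse_ray  # ray provider injected by the host engine

    def current_gaze(self) -> GazeSample:
        origin, direction = self._mouse_ray()
        return GazeSample(origin, direction, confidence=1.0)

class DwellSelector:
    """Dwell-based selection: fires when gaze rests on the same target for
    `dwell_frames` consecutive frames. It depends only on GazeSource, so
    the simulator and a real hardware backend are interchangeable."""
    def __init__(self, source: GazeSource,
                 hit_test: Callable[[Vec3, Vec3], Optional[str]],
                 dwell_frames: int = 90):
        self._source = source
        self._hit_test = hit_test
        self._dwell_frames = dwell_frames
        self._streak = 0
        self._last: Optional[str] = None

    def update(self) -> Optional[str]:
        """Call once per frame; returns the selected target id, or None."""
        sample = self._source.current_gaze()
        target = self._hit_test(sample.origin, sample.direction)
        if target is not None and target == self._last:
            self._streak += 1
        else:
            self._streak = 1 if target is not None else 0
        self._last = target
        return target if self._streak >= self._dwell_frames else None
```

Because the dwell selector sees only the GazeSource interface, the same interaction logic runs unchanged against the mouse-driven simulator during desktop prototyping and against a real tracker backend on the target device, which is the workflow the abstract describes.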





    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
    June 2019, 623 pages
    ISBN: 9781450367097
    DOI: 10.1145/3314111
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 25 June 2019


    Author Tags

    1. eye tracking
    2. gaze interaction
    3. unity
    4. virtual reality

    Qualifiers

    • Short-paper

    Funding Sources

    • German Research Foundation (DFG)
    • EPSRC

    Conference

    ETRA '19

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%


    Article Metrics

    • Downloads (last 12 months): 23
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 12 Feb 2025

    Cited By

    • (2024) Hands-free multi-type character text entry in virtual reality. Virtual Reality 28:1. DOI: 10.1007/s10055-023-00902-z. Online publication date: 3-Jan-2024.
    • (2024) Universal XR Framework Architecture Based on Open-Source XR Tools. XR and Metaverse, 87-98. DOI: 10.1007/978-3-031-50559-1_7. Online publication date: 23-Jan-2024.
    • (2023) Exploring the user experience of hands-free VR interaction methods during a Fitts’ task. Computers and Graphics 117:C, 1-12. DOI: 10.1016/j.cag.2023.10.005. Online publication date: 1-Dec-2023.
    • (2021) ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays. Sensors 21:6 (2234). DOI: 10.3390/s21062234. Online publication date: 23-Mar-2021.
    • (2021) Hands-free interaction in immersive virtual reality: A systematic review. IEEE Transactions on Visualization and Computer Graphics 27:5, 2702-2713. DOI: 10.1109/TVCG.2021.3067687. Online publication date: May-2021.
    • (2019) Enhancing Interaction with Augmented Reality through Mid-Air Haptic Feedback: Architecture Design and User Feedback. Applied Sciences 9:23 (5123). DOI: 10.3390/app9235123. Online publication date: 26-Nov-2019.
