Poster · DOI: 10.1145/2984751.2984754

Transparent Reality: Using Eye Gaze Focus Depth as Interaction Modality

Published: 16 October 2016

ABSTRACT

We present a novel eye-gaze-based interaction technique that uses focus depth as an input modality for virtual reality (VR) applications, together with a custom hardware prototype implementation. Comparing focus-depth-based interaction to a scroll-wheel interface in a user study with 10 participants playing a simple VR game, we find no statistically significant difference in performance (focus depth performs slightly better) and a subjective user preference for focus depth. This indicates that focus depth is a suitable interaction modality that merits further exploration. Finally, we give application scenarios and guidelines for using focus-depth interactions in VR applications.
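The abstract does not spell out how focus depth is obtained; a common approach with binocular eye trackers is to estimate it from the vergence angle between the two eyes' gaze directions. The sketch below is a hypothetical illustration of that geometry, not code from the paper; the function name, parameters, and the convention that yaw is measured inward from straight ahead are all assumptions.

```python
import math

def vergence_depth(ipd_m, left_yaw_rad, right_yaw_rad):
    """Estimate focus depth (meters) from binocular vergence.

    Hypothetical sketch, not the paper's implementation.
    ipd_m: interpupillary distance in meters (e.g. ~0.063).
    left_yaw_rad / right_yaw_rad: each eye's horizontal gaze angle,
    measured inward from straight ahead (positive = toward the nose).
    """
    # Total vergence is the sum of the two inward rotations.
    vergence = left_yaw_rad + right_yaw_rad
    if vergence <= 0:
        # Parallel or diverging gaze rays: focus effectively at infinity.
        return float('inf')
    # Symmetric-vergence triangle: half the IPD over tan of half the angle.
    return (ipd_m / 2.0) / math.tan(vergence / 2.0)
```

For example, with a 63 mm IPD and both eyes rotated inward by atan(0.0315 / 1.0), the estimate comes out at 1 m; in practice the raw estimate is noisy at larger depths and would be smoothed or binned before use as an input signal.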


Supplemental Material

uistp0110-file3.mp4 (mp4, 76.8 MB)


Published in

UIST '16 Adjunct: Adjunct Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology
October 2016, 244 pages
ISBN: 9781450345316
DOI: 10.1145/2984751

Copyright © 2016 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

UIST '16 Adjunct paper acceptance rate: 79 of 384 submissions (21%). Overall acceptance rate: 842 of 3,967 submissions (21%).
