DOI: 10.1145/3204493.3204555
AnyOrbit: orbital navigation in virtual environments with eye-tracking

Published: 14 June 2018

Abstract

Gaze-based interactions promise to be fast, intuitive and effective in controlling virtual and augmented environments. Yet there is still a lack of usable 3D navigation and observation techniques. In this work we (1) introduce AnyOrbit, an orbital navigation technique that provides an intuitive, hands-free method of observation in virtual environments, using eye-tracking to control the orbital center of movement; and (2) demonstrate the versatility of the technique with several control schemes and use-cases in virtual/augmented-reality head-mounted-display and desktop setups, including observation of 3D astronomical data and spectator sports.
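The core geometric idea in the abstract, a camera orbiting on a sphere around a center point that, in AnyOrbit, is selected by the user's gaze, can be sketched as below. This is a minimal illustration, not the authors' implementation (their Unity3D code is linked from the paper); the function name, tuple-based vectors, and y-up yaw-only convention are assumptions made here, and a full system would also handle pitch and set the pivot from the tracked gaze point.

```python
import math

def orbit_yaw(cam, pivot, angle):
    """Rotate a camera position around a pivot point about the world
    y-axis (yaw), preserving the camera-to-pivot distance.
    `cam` and `pivot` are (x, y, z) tuples; `angle` is in radians."""
    # Offset of the camera from the orbital center.
    ox, oy, oz = (c - p for c, p in zip(cam, pivot))
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    # Standard y-axis rotation applied to the offset vector.
    nx = cos_a * ox + sin_a * oz
    nz = -sin_a * ox + cos_a * oz
    return (pivot[0] + nx, pivot[1] + oy, pivot[2] + nz)

# Quarter-turn orbit around the origin: (1, 0, 0) moves to ≈ (0, 0, -1),
# staying at unit distance from the pivot.
print(orbit_yaw((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), math.pi / 2))
```

Because the pivot is an argument rather than a fixed scene origin, retargeting the orbit to wherever the user is looking only requires swapping in a new `pivot`, which is the essence of the gaze-controlled orbital center described above.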


Cited By

  • (2023) Eye Tracking in Virtual Reality: A Broad Review of Applications and Challenges. Virtual Reality 27, 2, 1481–1505. DOI: 10.1007/s10055-022-00738-z. Online publication date: 18 January 2023.
  • (2022) The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys 55, 3, 1–39. DOI: 10.1145/3491207. Online publication date: 25 March 2022.


    Published In

    ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
    June 2018
    595 pages
ISBN: 9781450357067
DOI: 10.1145/3204493
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. 3D navigation
    2. 3D user interface
    3. augmented reality
    4. eye tracking
    5. orbital mode
    6. orbiting
    7. virtual reality

    Qualifiers

    • Abstract

    Conference

    ETRA '18

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%



