DOI: 10.1145/3204493.3209579
AnyOrbit: orbital navigation in virtual environments with eye-tracking

Published: 14 June 2018

Abstract

Gaze-based interactions promise to be fast, intuitive, and effective for controlling virtual and augmented environments. Yet there is still a lack of usable 3D navigation and observation techniques. In this work: 1) we introduce AnyOrbit, an orbital navigation technique that provides an intuitive, hands-free method of observation in virtual environments, using eye-tracking to control the orbital center of movement; 2) we demonstrate the versatility of the technique with several control schemes and use cases in virtual/augmented-reality head-mounted-display and desktop setups, including the observation of 3D astronomical data and spectator sports.
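The abstract's central mechanism, orbital camera movement about a center selected by eye gaze, can be sketched minimally as follows. This is an illustration only, not the authors' implementation; the function and variable names (`orbit_camera`, `gaze_pivot`) are hypothetical, and the sketch assumes the gaze ray has already been resolved to a 3D hit point that serves as the orbital center.

```python
import math

def orbit_camera(pivot, radius, yaw, pitch):
    """Place the camera on a sphere of the given radius around the pivot.

    The pivot is the orbital center (here, a gaze-selected 3D point);
    yaw and pitch are the orbital angles in radians. Because the position
    is parameterised on a sphere, the camera's distance to the pivot is
    invariant as the user sweeps through angles.
    """
    x = pivot[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = pivot[1] + radius * math.sin(pitch)
    z = pivot[2] + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

# Hypothetical gaze hit point used as the orbital center.
gaze_pivot = (1.0, 0.5, -2.0)

# Two camera poses along one orbit: the start, and after a 45-degree sweep.
start = orbit_camera(gaze_pivot, 3.0, 0.0, 0.2)
swept = orbit_camera(gaze_pivot, 3.0, math.pi / 4, 0.2)
```

In a real engine the camera would also be oriented to look at the pivot each frame; the key property shown here is that the pivot-to-camera distance stays fixed while the viewpoint sweeps around the gazed-at object.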

Supplementary Material

MP4 File (a99-outram.mp4)



    Published In

    ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
    June 2018
    595 pages
    ISBN:9781450357067
    DOI:10.1145/3204493
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. 3D navigation
    2. 3D user interface
    3. augmented reality
    4. eye tracking
    5. orbital mode
    6. orbiting
    7. virtual reality

    Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)

    Article Metrics

    • Downloads (last 12 months): 22
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 25 Feb 2025.

    Cited By
    • (2025) Real-Time Gaze Estimation Using Webcam-Based CNN Models for Human–Computer Interactions. Computers 14, 2 (57). DOI: 10.3390/computers14020057. Online publication date: 10 Feb 2025.
    • (2024) RPG: Rotation Technique in VR Locomotion using Peripheral Gaze. Proceedings of the ACM on Human-Computer Interaction 8, ETRA (1--19). DOI: 10.1145/3655609. Online publication date: 28 May 2024.
    • (2023) Person-Specific Gaze Estimation from Low-Quality Webcam Images. Sensors 23, 8 (4138). DOI: 10.3390/s23084138. Online publication date: 20 Apr 2023.
    • (2023) Adaptive navigation assistance based on eye movement features in virtual reality. Virtual Reality & Intelligent Hardware 5, 3 (232--248). DOI: 10.1016/j.vrih.2022.07.003. Online publication date: Jun 2023.
    • (2021) Gaze Tracking Using an Unmodified Web Camera and Convolutional Neural Network. Applied Sciences 11, 19 (9068). DOI: 10.3390/app11199068. Online publication date: 29 Sep 2021.
    • (2020) Pleasant Locomotion -- Towards Reducing Cybersickness using fNIRS during Walking Events in VR. Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (56--58). DOI: 10.1145/3379350.3416184. Online publication date: 20 Oct 2020.
    • (2019) PanoFlex: Adaptive Panoramic Vision to Accommodate 360° Field-of-View for Humans. Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology (1--2). DOI: 10.1145/3359996.3364767. Online publication date: 12 Nov 2019.
    • (2019) Private Reader. Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (1--6). DOI: 10.1145/3338286.3340129. Online publication date: 1 Oct 2019.
