Research Article
DOI: 10.1145/3385959.3418444

Eye Gaze-based Object Rotation for Head-mounted Displays

Published: 30 October 2020

Abstract

Hands-free manipulation of 3D objects has long been a challenge for augmented and virtual reality (AR/VR). While many methods use eye gaze to assist with hand-based manipulations, interfaces cannot yet provide completely gaze-based 6 degree-of-freedom (DoF) manipulations in an efficient manner. To address this problem, we implemented three methods for rotating virtual objects using gaze: RotBar, which maps line-of-sight eye gaze onto per-axis rotations; RotPlane, which makes use of orthogonal planes to achieve per-axis angular rotations; and RotBall, which combines a traditional arcball with an external ring to handle user-perspective roll manipulations. We validated the efficiency of each method in a user study involving a series of orientation tasks along different axes. Experimental results showed that users could accomplish single-axis orientation tasks significantly faster and more accurately with RotBar and RotPlane than with RotBall. For multi-axis orientation tasks, on the other hand, RotBall significantly outperformed RotBar and RotPlane in terms of both speed and accuracy.
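The RotBall technique described above builds on Shoemake's classic arcball. As a hedged illustration of that underlying idea (not the paper's actual implementation, whose details are not given here), the following sketch projects a 2D gaze point in a normalized window onto a virtual unit sphere and derives a rotation quaternion from two successive sphere points; all function names are illustrative.

```python
import math

def map_to_sphere(x: float, y: float) -> tuple:
    """Project a 2D point in [-1, 1]^2 onto the unit arcball sphere."""
    d2 = x * x + y * y
    if d2 <= 1.0:
        z = math.sqrt(1.0 - d2)          # point lies on the front of the sphere
        return (x, y, z)
    norm = math.sqrt(d2)                 # outside the sphere: clamp to the equator
    return (x / norm, y / norm, 0.0)

def arcball_quaternion(p0: tuple, p1: tuple) -> tuple:
    """Quaternion (w, x, y, z) for the arcball drag from sphere point p0 to p1.

    Shoemake's observation: (p0 . p1, p0 x p1) is already a valid quaternion,
    rotating by twice the angle between p0 and p1 (the familiar arcball doubling).
    """
    cx = p0[1] * p1[2] - p0[2] * p1[1]
    cy = p0[2] * p1[0] - p0[0] * p1[2]
    cz = p0[0] * p1[1] - p0[1] * p1[0]
    dot = p0[0] * p1[0] + p0[1] * p1[1] + p0[2] * p1[2]
    return (dot, cx, cy, cz)
```

For example, dragging from the sphere's apex `(0, 0, 1)` to `(1, 0, 0)` yields the quaternion `(0, 0, 1, 0)`, a 180° rotation about the y-axis, consistent with the arcball's angle doubling.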




Published In

SUI '20: Proceedings of the 2020 ACM Symposium on Spatial User Interaction
October 2020
188 pages
ISBN:9781450379434
DOI:10.1145/3385959

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. eye gaze
  2. head-mounted display
  3. object rotation
  4. user interface

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • United States Office of Naval Research Global

Conference

SUI '20
SUI '20: Symposium on Spatial User Interaction
October 30 - November 1, 2020
Virtual Event, Canada

Acceptance Rates

Overall Acceptance Rate 86 of 279 submissions, 31%


Article Metrics

  • Downloads (last 12 months): 80
  • Downloads (last 6 weeks): 2
Reflects downloads up to 27 Jan 2025

Cited By

  • (2024) FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–18. DOI: 10.1145/3613904.3642589. Published 11 May 2024.
  • (2024) Object manipulation based on the head manipulation space in VR. International Journal of Human-Computer Studies 192:C. DOI: 10.1016/j.ijhcs.2024.103346. Published 18 November 2024.
  • (2023) Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction 7 (ETRA), 1–19. DOI: 10.1145/3591129. Published 18 May 2023.
  • (2023) Evaluating User Interactions in Wearable Extended Reality: Modeling, Online Remote Survey, and In-Lab Experimental Methods. IEEE Access 11, 77856–77872. DOI: 10.1109/ACCESS.2023.3298598. Published 2023.
