Object Manipulation Method Using Eye Gaze and Hand-held Controller in AR Space

Published: 29 November 2022

ABSTRACT

When manipulating virtual objects in AR space, the target object is often partially or completely occluded by other objects (hereinafter, the occlusion problem). In addition, the hand-ray manipulation commonly used in many AR devices requires the user to keep the arm raised within the range of the hand-tracking sensor, which causes the gorilla-arm problem. In this study, we propose an object manipulation method that combines eye gaze and a hand-held controller to mitigate the occlusion and gorilla-arm problems in AR environments. In the proposed method, the user controls the ray’s direction with eye gaze, and adjusts the length of the ray and selects objects with the hand-held controller.
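The interaction described above amounts to a gaze-directed ray whose reach is set by the controller rather than by arm movement. The following is a minimal sketch of that ray-update logic, assuming a unit gaze-direction vector from an eye tracker and a thumbstick axis value from the hand-held controller; all names and parameters (GazeRay, update_ray, stick_y, speed) are hypothetical illustrations, not taken from the paper or any specific SDK.

```python
# Minimal sketch (not the authors' implementation): a gaze-directed ray whose
# length is adjusted with a hand-held controller. Assumes the eye tracker
# supplies a unit direction vector and the controller supplies a thumbstick
# axis value in [-1, 1].
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class GazeRay:
    origin: Vec3          # eye/head position in world space
    direction: Vec3       # unit vector from eye tracking
    length: float = 1.0   # metres; adjusted with the controller

def update_ray(ray: GazeRay, gaze_dir: Vec3, stick_y: float, dt: float,
               speed: float = 1.5, min_len: float = 0.2,
               max_len: float = 10.0) -> GazeRay:
    """Point the ray along the current gaze direction and lengthen or
    shorten it with the controller thumbstick (stick_y in [-1, 1])."""
    ray.direction = gaze_dir
    ray.length = max(min_len, min(max_len, ray.length + stick_y * speed * dt))
    return ray

def cursor_position(ray: GazeRay) -> Vec3:
    """World-space position of the selection cursor at the ray's tip."""
    return tuple(o + d * ray.length for o, d in zip(ray.origin, ray.direction))
```

Selection would then be confirmed with a controller button press on whatever object the cursor intersects, so neither mid-air arm raising nor gaze dwell is required.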

Supplemental Material

vrst_movie.mp4 (MP4, 4.4 MB)

Published in

VRST '22: Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology
November 2022, 466 pages
ISBN: 9781450398893
DOI: 10.1145/3562939

        Copyright © 2022 Owner/Author

        Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 29 November 2022

        Qualifiers

        • abstract
        • Research
        • Refereed limited

        Acceptance Rates

Overall acceptance rate: 66 of 254 submissions, 26%
