DOI: 10.1145/3359997.3365693
Research Article

Visualizing and Interacting with Hierarchical Menus in Immersive Augmented Reality

Published: 14 November 2019

Abstract

Graphical User Interfaces (GUIs) have long been used to inform the user of the large number of available actions and options. In desktop applications, GUIs traditionally appear as two-dimensional hierarchical menus due to limited screen real estate, the spatial restrictions imposed by the hardware (e.g., 2D displays), and the available input modalities (e.g., mouse/keyboard point-and-click, touch, dwell time). In immersive Augmented Reality (AR), there are no such restrictions and the available input modalities differ (i.e., hand gestures, head pointing, or voice recognition), yet the majority of AR applications still use the same type of GUIs as desktop applications.
In this paper we focus on identifying the most efficient combination of hierarchical menu type and input modality for immersive applications using AR headsets. We report on the results of a within-subjects study with 25 participants who performed a number of tasks using four combinations of the most popular hierarchical menu types and input modalities in AR, namely: (drop-down menu, hand gestures), (drop-down menu, voice), (radial menu, hand gestures), and (radial menu, head pointing). Results show that the majority of the participants (60%, 15 of 25) performed fastest using the hierarchical radial menu with head-pointing control. Furthermore, the participants clearly indicated the radial menu with head-pointing control as the most preferred interaction technique due to its limited physical demand, as opposed to the current de facto interaction technique in AR, i.e. hand gestures, which becomes physically demanding after prolonged use, leading to the arm fatigue known as "gorilla arm".
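The paper itself does not include an implementation, but the core of head-pointing selection in a radial menu can be sketched as follows: intersect the head ray with the menu plane, take the cursor position relative to the menu centre, and map its angle to a sector. This is a minimal illustrative sketch, not the authors' code; the function name, the dead-zone parameter, and the clockwise-from-top sector layout are all assumptions.

```python
import math

def radial_menu_hit(x, y, num_items, dead_zone=0.1):
    """Map a head-pointing cursor position (x, y), expressed relative
    to the radial menu's centre on the menu plane, to an item index.

    Returns None while the cursor sits inside the central dead zone,
    so small head tremors do not trigger a selection. Item 0 occupies
    the sector centred on the top ("12 o'clock") direction, and
    indices increase clockwise.
    """
    if math.hypot(x, y) < dead_zone:
        return None
    # Angle measured clockwise from straight up, normalised to [0, 2*pi).
    angle = math.atan2(x, y) % (2 * math.pi)
    sector = 2 * math.pi / num_items
    # Shift by half a sector so item 0 is centred on "up" rather than
    # starting at "up".
    return int(((angle + sector / 2) % (2 * math.pi)) // sector)
```

For a four-item menu, pointing straight up selects item 0, right selects item 1, down item 2, and left item 3; confirming a selection (e.g. by dwell or a click gesture) is left out of the sketch.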



Published In

VRCAI '19: Proceedings of the 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry
November 2019, 354 pages
ISBN: 9781450370028
DOI: 10.1145/3359997

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. Gestural input
  2. Graphical user interfaces
  3. Head Pointing
  4. Interaction paradigms
  5. Interaction techniques

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRCAI '19

Acceptance Rates

Overall acceptance rate: 51 of 107 submissions, 48%


Cited By

  • (2025) Review of Multimodal Interaction in Optical See-Through Augmented Reality. International Journal of Human–Computer Interaction, 10.1080/10447318.2024.2442128 (1-17). Online publication date: 17-Jan-2025.
  • (2024) Virtual Task Environments Factors Explored in 3D Selection Studies. Proceedings of the 50th Graphics Interface Conference, 10.1145/3670947.3670983 (1-16). Online publication date: 3-Jun-2024.
  • (2024) Augmented Object Intelligence with XR-Objects. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3654777.3676379 (1-15). Online publication date: 13-Oct-2024.
  • (2024) Effects of User Interface Orientation on Sense of Immersion in Augmented Reality. International Journal of Human–Computer Interaction, 10.1080/10447318.2024.2352923 (1-15). Online publication date: 24-May-2024.
  • (2024) Breadth and orientation of pie menus for mid-air interaction: effects on upper extremity biomechanics, performance, and subjective assessment. Behaviour & Information Technology 44(1), 10.1080/0144929X.2024.2310097 (61-78). Online publication date: 29-Jan-2024.
  • (2024) MarkAR: Exploring the Benefits of Combining Microgestures and Mid-Air Marks to Trigger Commands in Augmented Reality. Virtual Reality and Mixed Reality, 10.1007/978-3-031-78593-1_11 (163-181). Online publication date: 27-Nov-2024.
  • (2023) VRoom – Development of a Virtual Reality learning environment for the utilization of color in an architectural context. Proceedings of Mensch und Computer 2023, 10.1145/3603555.3609314 (561-563). Online publication date: 3-Sep-2023.
  • (2023) Gesture-based Interaction for AR Systems: A Short Review. Proceedings of the 16th International Conference on PErvasive Technologies Related to Assistive Environments, 10.1145/3594806.3594815 (284-292). Online publication date: 5-Jul-2023.
  • (2023) A Review of Interaction Techniques for Immersive Environments. IEEE Transactions on Visualization and Computer Graphics 29(9), 10.1109/TVCG.2022.3174805 (3900-3921). Online publication date: 1-Sep-2023.
  • (2023) Analysis of Hand Movement and Head Orientation in Hierarchical Menu Selection in Immersive AR. 2023 IEEE International Symposium on Multimedia (ISM), 10.1109/ISM59092.2023.00052 (270-275). Online publication date: 11-Dec-2023.
