DOI: 10.1145/3359997.3365697
Research Article

Head-Fingers-Arms: Physically-Coupled and Decoupled Multimodal Interaction Designs in Mobile VR

Published: 14 November 2019

Abstract

This paper proposes a novel bimanual finger-based gesture called X-Fingers that provides interactive 2D spatial input control using vision-based techniques in mobile VR. This finger-based input modality can be coordinated with the movement of the user's arms or head to provide an additional input modality. Incorporating the arms or the head yields physically-coupled and physically-decoupled multimodal interactions, respectively. Given these two design options, we conducted user studies to understand how the nature of the physical coupling influences users' performance and preferences across tasks requiring varying degrees of coordination between the modalities. Our results show that physically-decoupled interaction designs are preferred when the degree of coordination within the multimodal interaction is high.
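The paper does not publish implementation code; the following is a purely illustrative sketch of the distinction the abstract draws between physically-coupled and physically-decoupled multimodal input. All names here (`fuse_coupled`, `fuse_decoupled`, the `weight` parameter, the selection/scroll task split) are hypothetical assumptions, not the authors' design.

```python
# Hypothetical illustration (not from the paper): two ways of combining a
# 2D finger-gesture signal with a head-orientation signal.

def fuse_coupled(finger_xy, head_xy, weight=0.5):
    """Physically coupled: both modalities drive the SAME 2D cursor,
    so the two streams are blended into a single output position."""
    return tuple(weight * f + (1 - weight) * h
                 for f, h in zip(finger_xy, head_xy))

def fuse_decoupled(finger_xy, head_xy):
    """Physically decoupled: each modality controls an independent part
    of the task (e.g. fingers select while the head scrolls)."""
    return {"selection": finger_xy, "scroll": head_xy}

# Normalized [0, 1] coordinates from two imagined trackers.
cursor = fuse_coupled((0.2, 0.8), (0.6, 0.4))
state = fuse_decoupled((0.2, 0.8), (0.6, 0.4))
```

The sketch makes the study's independent variable concrete: in the coupled case any head movement perturbs the shared cursor, whereas in the decoupled case the two channels can be coordinated without interfering with each other.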

Supplemental Material

MP4 video: a28-mohan-supplement


Cited By

  • (2023) A Review of Interaction Techniques for Immersive Environments. IEEE Transactions on Visualization and Computer Graphics 29(9), 3900–3921. DOI: 10.1109/TVCG.2022.3174805. Online publication date: 1 September 2023.
  • (2023) Real-time multimodal interaction in virtual reality – a case study with a large virtual interface. Multimedia Tools and Applications 82(16), 25427–25448. DOI: 10.1007/s11042-023-14381-6. Online publication date: 2 February 2023.


Published In

VRCAI '19: Proceedings of the 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry
November 2019
354 pages
ISBN:9781450370028
DOI:10.1145/3359997
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. interaction design
  2. multimodal interaction
  3. virtual reality

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRCAI '19

Acceptance Rates

Overall acceptance rate: 51 of 107 submissions (48%)

