Extended Abstract
DOI: 10.1145/3379336.3381506

Multimodal Interaction for Real and Virtual Environments

Published: 17 March 2020

Abstract

Multimodal interfaces can leverage information from multiple modalities to provide robust, less error-prone interaction. Early multimodal interfaces demonstrated the feasibility of building such systems but focused on specific applications. A key challenge in building adaptive systems is the lack of techniques for fusing input data. In this direction, we have developed a multimodal head- and eye-gaze interface and evaluated it in two scenarios. In an aviation scenario, our interface significantly reduced task time and perceived cognitive load compared with the existing interface. We have also studied the effect of various output conditions on users' performance in a Virtual Reality (VR) task. We are now extending our interface to include additional modalities and building novel haptic and multimodal output systems for VR.
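The abstract describes fusing head orientation (e.g., from an inertial sensor) with eye-gaze direction into a single pointing input. As a minimal illustrative sketch of one way such fusion can be computed (this is not the authors' implementation; the function names, axis conventions, and screen-plane geometry below are assumptions, and NumPy is used for the linear algebra), an eye-gaze ray measured in head coordinates can be rotated by the head pose and intersected with the display plane:

import numpy as np

def head_rotation(yaw, pitch):
    # Head-pose rotation from yaw (about the vertical axis) and pitch
    # (about the lateral axis), in radians. The axis order is an
    # assumption for illustration.
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    return Ry @ Rx

def fuse_gaze(gaze_dir_head, head_R, head_pos, screen_z):
    # Rotate the unit eye-gaze ray from head to world coordinates, then
    # intersect it with a display plane at z = screen_z to obtain the
    # fused point of regard.
    d = head_R @ (np.asarray(gaze_dir_head, dtype=float)
                  / np.linalg.norm(gaze_dir_head))
    t = (screen_z - head_pos[2]) / d[2]   # ray parameter at the plane
    return head_pos + t * d               # (x, y, screen_z)

# Head turned 10 degrees, eyes looking straight ahead in the head frame:
R = head_rotation(np.radians(10.0), 0.0)
print(fuse_gaze([0.0, 0.0, 1.0], R, np.array([0.0, 0.0, 0.0]), 0.6))

In a real system the head and gaze streams would additionally need temporal alignment and noise filtering before fusion; the sketch only shows the geometric combination step.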


Cited By

  • A study of button size for virtual hand interaction in virtual environments based on clicking performance. Multimedia Tools and Applications 82(10), 15903-15918 (15 Oct 2022). DOI: 10.1007/s11042-022-14038-w


Published In
IUI '20 Companion: Companion Proceedings of the 25th International Conference on Intelligent User Interfaces
March 2020, 153 pages
ISBN: 9781450375139
DOI: 10.1145/3379336
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Machine Learning
  2. Multimodal Interfaces
  3. Virtual Reality

Qualifiers

  • Extended-abstract
  • Research
  • Refereed limited

Conference

IUI '20

Acceptance Rates

Overall Acceptance Rate 746 of 2,811 submissions, 27%
