DOI: 10.1145/3594739.3605105

The Third Workshop on Multiple Input Modalities and Sensations for VR/AR Interactions (MIMSVAI)

Published: 08 October 2023

Abstract

Rapid technological advances are expanding the practical applicability of virtual and augmented reality (VR/AR); however, the extent to which new users can interact with these technologies is limited by current input modalities. Gaining an intuitive grasp of VR/AR applications requires that users be immersed in the virtual environment, which in turn depends on the integration of multiple realistic sensory feedback mechanisms. This workshop will bring together researchers from the fields of UbiComp and VR/AR to explore alternative input modalities and sensory feedback systems that facilitate the design of coherent and engaging VR/AR experiences comparable to those in the real world.

    Published In

    UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing
    October 2023
    822 pages
    ISBN: 9798400702006
    DOI: 10.1145/3594739

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Augmented reality
    2. Mixed reality
    3. Multi-sensations
    4. Multiple sensory modalities
    5. Realism
    6. User friendliness
    7. Virtual reality

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Funding Sources

    • NSFC
    • National Science and Technology Council of Taiwan

    Conference

    UbiComp/ISWC '23

    Acceptance Rates

    Overall Acceptance Rate 764 of 2,912 submissions, 26%
