Poster
DOI: 10.1145/3490632.3497832

User-Elicited Gestural Interaction With Future In-Car Holographic 3D Displays

Published: 25 February 2022

Abstract

Holographic 3D displays (H3D) have the potential to expand the interaction space for in-car infotainment systems by providing a larger depth range than other state-of-the-art 3D display technologies. This work explored how non-expert users would interact with non-driving-related tasks tailored to H3D visualization. In a gesture-elicitation study, N = 20 participants proposed mid-air gestures for a set of 33 tasks (referents) displayed either within or outside of participants’ reach. In a follow-up reverse-matching study with N = 21 participants, the resulting set of most frequently proposed gestures was refined. The final gesture set shows that techniques elicited for other 3D technologies are applicable to interaction with future in-car H3D displays.
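Gesture-elicitation studies of this kind typically quantify per-referent consensus with the agreement rate of Vatavu and Wobbrock (2015) before selecting the most frequently proposed gesture. A minimal sketch of that computation; the gesture labels in the example are hypothetical, not taken from this study:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent (Vatavu & Wobbrock, 2015):
    the number of ordered pairs of participants who proposed the same
    gesture, divided by the number of all ordered pairs, i.e.
    AR(r) = sum_i |P_i|*(|P_i|-1) / (|P|*(|P|-1))."""
    n = len(proposals)
    if n < 2:
        return 0.0
    groups = Counter(proposals)  # group identical proposals
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Five hypothetical proposals for one referent: three participants agree.
print(agreement_rate(["pinch", "pinch", "pinch", "grab", "swipe"]))  # 0.3
```

AR(r) ranges from 0 (no two participants agree) to 1 (perfect consensus); the winning gesture for a referent is simply the largest group.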


Cited By

  • (2022) Exploring New Depths: How Could Passengers Interact with Future In-Car Holographic 3D Displays? In: HCI in Mobility, Transport, and Automotive Systems, 35–61. https://doi.org/10.1007/978-3-031-04987-3_3. Online publication date: 26 June 2022.

    Published In

    MUM '21: Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia
    December 2021
    263 pages
    ISBN:9781450386432
    DOI:10.1145/3490632
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. datasets
    2. gaze detection
    3. neural networks
    4. text tagging

    Qualifiers

    • Poster
    • Research
    • Refereed limited

    Conference

    MUM 2021

    Acceptance Rates

    Overall Acceptance Rate 190 of 465 submissions, 41%
