Work in Progress

Investigating User Preferences for In-Vehicle Virtual Robots’ Anthropomorphic Appearance in Augmented Reality Head-Up Display

Published: 18 September 2023
DOI: 10.1145/3581961.3609872

ABSTRACT

As an interactive medium between drivers and vehicles, in-vehicle virtual robots significantly impact user performance and experience. Among them, robots with anthropomorphic appearances are the mainstream. As an emerging human-vehicle interaction interface, the Augmented Reality Head-Up Display (AR-HUD) has not yet been used to showcase in-vehicle virtual robots, so it is currently unclear what preferences users have for the anthropomorphic appearance of virtual robots presented on an AR-HUD. In this work, we conducted two online experimental studies focusing on two aspects of anthropomorphic appearance: human similarity and morphological completeness. Valid data sets from 257 participants were collected. Participants rated the in-vehicle virtual robots' anthropomorphic appearance on five preference dimensions: Pleasure, Fear, Trust, Comprehensibility, and Acceptance. We found that users prefer virtual robot appearances with medium human similarity and high morphological completeness, which aligns with current theories on robot appearance. Our future research will focus on developing and validating design principles for the appearance of in-vehicle virtual robots.
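
The abstract does not describe the authors' analysis pipeline or data format. The snippet below is only a minimal, hypothetical sketch (assumed column names, made-up example numbers) of how mean ratings on the five preference dimensions could be tabulated per appearance condition, for instance across levels of human similarity.

# Hypothetical sketch, not the paper's actual code: summarize preference
# ratings per appearance condition across the five dimensions from the abstract.
# The 'condition' column and the wide per-dimension layout are assumptions.
import pandas as pd

DIMENSIONS = ["Pleasure", "Fear", "Trust", "Comprehensibility", "Acceptance"]

def summarize_ratings(df: pd.DataFrame) -> pd.DataFrame:
    # Expects one row per participant x condition, with a 'condition' column
    # (e.g. low/medium/high human similarity) and one Likert-style rating
    # column per dimension. Returns the mean rating per condition.
    return df.groupby("condition")[DIMENSIONS].mean().round(2)

# Purely illustrative data to show the expected shape of the output table.
demo = pd.DataFrame({
    "condition": ["low", "medium", "high", "medium"],
    "Pleasure": [3, 5, 4, 5],
    "Fear": [2, 1, 3, 2],
    "Trust": [3, 5, 3, 4],
    "Comprehensibility": [4, 5, 4, 5],
    "Acceptance": [3, 5, 3, 5],
})
print(summarize_ratings(demo))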


    • Published in

      AutomotiveUI '23 Adjunct: Adjunct Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
      September 2023, 382 pages
      ISBN: 9798400701122
      DOI: 10.1145/3581961

      Copyright © 2023 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Qualifiers

      • Work in Progress
      • Research
      • Refereed limited

      Acceptance Rates

      Overall Acceptance Rate: 248 of 566 submissions, 44%


