DOI: 10.1145/3568294.3579959

Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)

Published: 13 March 2023

ABSTRACT

The 6th International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) will bring together HRI, robotics, and mixed reality researchers to address challenges in mixed reality interactions between humans and robots. Topics relevant to the workshop include the development of robots that can interact with humans in mixed reality, the use of virtual reality for developing interactive robots, the design of augmented reality interfaces that mediate communication between humans and robots, investigations of mixed reality interfaces for robot learning, comparisons of the capabilities and perceptions of robots and virtual agents, and best design practices. VAM-HRI 2023 will follow the success of VAM-HRI 2018-22 and advance this nascent research community.


Published in

HRI '23: Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction
March 2023
612 pages
ISBN: 9781450399708
DOI: 10.1145/3568294

      Copyright © 2023 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

Qualifiers: abstract

Overall Acceptance Rate: 242 of 1,000 submissions (24%)
