DOI: 10.1145/3371382.3374850

Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)

Published: 01 April 2020

ABSTRACT

The 3rd International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) will bring together HRI, Robotics, and Mixed Reality researchers to address challenges in mixed reality interactions between humans and robots. Topics relevant to the workshop include the development of robots that can interact with humans in mixed reality, the use of virtual reality for developing interactive robots, the design of augmented reality interfaces that mediate communication between humans and robots, comparisons of the capabilities and perceptions of robots and virtual agents, and best design practices. VAM-HRI 2020 will build on the success of VAM-HRI 2018 and 2019 and advance this nascent research community.


Published in

HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020, 702 pages
ISBN: 9781450370578
DOI: 10.1145/3371382

Copyright © 2020 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 192 of 519 submissions, 37%
