DOI: 10.1145/3526114.3558690

MoonBuddy: A Voice-based Augmented Reality User Interface That Supports Astronauts During Extravehicular Activities

Published: 28 October 2022

ABSTRACT

As NASA pursues Artemis missions to the Moon and beyond, it is essential to equip astronauts with the human-autonomy enabling technology needed for the elevated demands of lunar surface exploration and extreme terrain access. We present MoonBuddy, an application built for the Microsoft HoloLens 2 that uses Augmented Reality (AR) and voice-based interaction to assist astronauts with communication, navigation, and documentation during future lunar extravehicular activities (EVAs), with the goal of reducing cognitive load and increasing task completion. User testing of MoonBuddy under simulated lunar conditions yielded positive results overall, with participants indicating that the application was easy to use and helpful in completing the required tasks.
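The abstract does not describe MoonBuddy's implementation, so the following is only a minimal, hypothetical sketch of the interaction model it outlines: spoken commands routed to communication, navigation, and documentation tasks. The class, phrases, and handlers below are illustrative assumptions, not the authors' code; a real HoloLens 2 build would receive utterances from the platform speech recognizer rather than the simulated transcript used here.

# Hypothetical sketch of a voice-command dispatcher for a voice-based AR
# assistant. All names and commands are assumptions for illustration only.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class VoiceAssistant:
    """Maps spoken keyword phrases to task handlers and keeps an EVA log."""
    handlers: Dict[str, Callable[[], str]] = field(default_factory=dict)
    log: List[str] = field(default_factory=list)

    def register(self, phrase: str, handler: Callable[[], str]) -> None:
        """Associate a keyword phrase with an action."""
        self.handlers[phrase.lower()] = handler

    def on_phrase_recognized(self, utterance: str) -> str:
        """Dispatch a recognized utterance; unknown phrases are logged, not dropped."""
        action = self.handlers.get(utterance.lower())
        result = action() if action else f"Unrecognized command: '{utterance}'"
        self.log.append(result)
        return result


if __name__ == "__main__":
    assistant = VoiceAssistant()
    # Illustrative commands for the three task areas named in the abstract.
    assistant.register("message mission control", lambda: "Sent status message to mission control")
    assistant.register("navigate to waypoint alpha", lambda: "Showing AR route to waypoint Alpha")
    assistant.register("document this sample", lambda: "Photo and voice note attached to sample record")

    # Simulated recognizer output standing in for live speech input.
    for utterance in ["Navigate to waypoint Alpha", "Document this sample"]:
        print(assistant.on_phrase_recognized(utterance))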


Supplemental Material

• MoonBuddy Video Long Version.mp4 (mp4, 133.3 MB)
• MoonBuddy Video Preview.mp4 (mp4, 32 MB)


Published in

UIST '22 Adjunct: Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology
October 2022, 413 pages
ISBN: 9781450393218
DOI: 10.1145/3526114

Copyright © 2022 Owner/Author

        Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


        Qualifiers

        • poster
        • Research
        • Refereed limited

        Acceptance Rates

Overall Acceptance Rate: 842 of 3,967 submissions, 21%

