ABSTRACT
As NASA pursues Artemis missions to the Moon and beyond, it is essential to equip astronauts with the human-autonomy-enabling technology required for the elevated demands of lunar surface exploration and extreme terrain access. We present MoonBuddy, an application built for the Microsoft HoloLens 2 that uses augmented reality (AR) and voice-based interaction to assist astronauts with communication, navigation, and documentation during future lunar extravehicular activities (EVAs), with the goal of reducing cognitive load and increasing task completion. User testing of MoonBuddy under simulated lunar conditions has been positive overall, with participants reporting that the application was easy to use and helpful in completing the required tasks.
MoonBuddy: A Voice-based Augmented Reality User Interface That Supports Astronauts During Extravehicular Activities