
Extravehicular Intelligence Solution for Lunar Exploration and Research: ARSIS 5.0

  • Conference paper
HCI International 2022 - Late Breaking Papers. Design, User Experience and Interaction (HCII 2022)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13516))


Abstract

Augmented Reality Space Informatics System (ARSIS) 5.0 is a prototype system designed to help astronauts on Extravehicular Activity (EVA), developed for the 2022 NASA (National Aeronautics and Space Administration) SUITS (Spacesuit User Interface Technologies for Students) Challenge. ARSIS comprises a Ground Station and a HoloLens 2 application that work together to improve the autonomy, efficiency, and efficacy of communication between Mission Control and an astronaut on the Moon.

The core of the ARSIS system is its interactive menu navigation. The menus are built with the Mixed Reality Toolkit (MRTK) and offer redundant interaction control methods: hand tracking, eye tracking, and voice commands. Examples of interactive menus within ARSIS include procedures, biometrics, a geology sampling tool, and field notes.
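The idea of redundant interaction control can be sketched as follows: several input modalities all resolve to the same menu action, so the astronaut can use whichever is free or reliable at the moment. This is a minimal conceptual sketch in Python, not the actual MRTK (C#) implementation; all names (`Menu`, `bind`, the trigger strings) are illustrative assumptions.

```python
# Conceptual sketch: one menu item, reachable through any of several
# input modalities (hand tracking, eye tracking, voice commands).
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class MenuItem:
    name: str
    action: Callable[[], str]

@dataclass
class Menu:
    items: Dict[str, MenuItem] = field(default_factory=dict)
    # each modality maps its own trigger token to a menu-item name
    bindings: Dict[str, Dict[str, str]] = field(default_factory=dict)

    def add_item(self, item: MenuItem) -> None:
        self.items[item.name] = item

    def bind(self, modality: str, trigger: str, item_name: str) -> None:
        self.bindings.setdefault(modality, {})[trigger] = item_name

    def handle(self, modality: str, trigger: str) -> str:
        """Dispatch an event from any modality to the shared action."""
        name = self.bindings.get(modality, {}).get(trigger)
        if name is None:
            return "unhandled"
        return self.items[name].action()

menu = Menu()
menu.add_item(MenuItem("biometrics", lambda: "open biometrics panel"))
# the same item is reachable by pinch gesture, eye dwell, or voice command
menu.bind("hand", "pinch:biometrics_button", "biometrics")
menu.bind("eye", "dwell:biometrics_button", "biometrics")
menu.bind("voice", "show biometrics", "biometrics")
```

The design point is that redundancy lives in the bindings, not in the menu items themselves: adding a new modality never duplicates an action.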

The Mini Map is a persistent, dismissible panel that displays real-time environmental information as well as waypoints and beacons set by Mission Control. This overlay can be expanded into an interactive panel (the Mega Map) that lets the user resize, zoom, set waypoints, and ultimately guide an RC assistant, all through hand tracking. Using the Mega Map, the HoloLens 2 user can select destinations in the environment for an RC car, which drives itself to the selected destination. The user receives visual feedback in the HMD via static images and a video feed.
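The Mega Map handoff to the RC assistant can be sketched as two steps: convert the point tapped on the 2D map panel into world coordinates, then send that position to the rover as a driving goal. This is a hypothetical Python sketch; the names (`MapView`, `map_to_world`, `RoverLink`) and the simple linear map scale are illustrative assumptions, not the ARSIS API.

```python
# Sketch of the Mega Map -> RC car handoff: a tapped map pixel becomes a
# world-frame driving goal for the self-driving RC assistant.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MapView:
    origin: Tuple[float, float]   # world coords of the map's corner (m)
    meters_per_pixel: float

    def map_to_world(self, px: float, py: float) -> Tuple[float, float]:
        """Convert a tapped map pixel to a world-frame position."""
        ox, oy = self.origin
        return (ox + px * self.meters_per_pixel,
                oy + py * self.meters_per_pixel)

@dataclass
class RoverLink:
    goals: List[Tuple[float, float]] = field(default_factory=list)

    def send_goal(self, world_xy: Tuple[float, float]) -> None:
        # in the real system this would go over the network to the RC
        # car's self-driving stack; here we just queue it
        self.goals.append(world_xy)

view = MapView(origin=(100.0, 50.0), meters_per_pixel=0.5)
rover = RoverLink()
rover.send_goal(view.map_to_world(40, 20))  # user taps pixel (40, 20)
# goal queued at world position (120.0, 60.0)
```

Visual feedback (static images, video feed) would then close the loop back to the HMD user.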

Information overload is avoided through Arm-Retained Menus (ARMs), virtual informational overlays rendered over the user's forearms for easy visibility and access. ARMs provide additional access to commonly used features and functions such as Navigation, Emergency, Tools, and System Access. The Navigation ARM, located on the back of the left hand, provides quick access to the Mega Map and to functions related to the Map Navigation Beacons. The Emergency ARM, located on the back of the right hand, provides quick access to Biometrics and the LunaSAR system. The Tools ARM, located on the left palm, provides access to various tools, including the Record Path tool, which records the user's physical path of movement through an environment and visualizes it as an AR annotation, and the measurement tool, which allows the user to place points to measure distance.
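The geometry behind the measurement and Record Path tools is straightforward: the measurement tool reduces two placed points to a straight-line distance, and a recorded path is summed segment by segment. A minimal sketch, assuming points are 3D positions in meters (the function names are illustrative):

```python
# Sketch of the Tools ARM geometry: point-to-point measurement and
# total length of a recorded path of movement.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def distance(a: Point, b: Point) -> float:
    """Straight-line distance between two placed measurement points."""
    return math.dist(a, b)

def path_length(points: List[Point]) -> float:
    """Total length of a recorded path, summed segment by segment."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# two measurement points 5 m apart
print(distance((0, 0, 0), (3, 4, 0)))                  # 5.0
# an L-shaped recorded walk: 3 m then 4 m
print(path_length([(0, 0, 0), (3, 0, 0), (3, 4, 0)]))  # 7.0
```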

Mission Control is equipped with the Ground Station, which includes virtual reality (VR) and desktop software portals affording Mission Control three major functionalities. First, the HoloLens 2 HMD records topological data about the user's surroundings and transmits it to the Ground Station, where point cloud matching is used to render a low-resolution version of the environment. This gives Ground Station users a better understanding of the astronaut's immediate surroundings; future plans include implementing cloud anchors to increase the accuracy of this simulated environment and combat drift over time.

Second, this reconstruction supports the primary purpose of the Ground Station: Telestration for the HoloLens 2 user. The Ground Station user can create icons or paths, which are placed in the HoloLens 2 field of view as augmented reality (AR) annotations. Annotations are placed in accordance with the topology received by the Ground Station, ensuring they render correctly in the HoloLens 2 user's real-world environment. Third, the Ground Station can add or remove Navigational Beacons, as well as procedures, at run time to allow for flexible problem solving and communication. Topology reconstruction, Telestration, and Mission Updates can all be performed in real time with minimal latency, improving the effectiveness and efficiency of communication between Mission Control and an astronaut on EVA.
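The anchoring step of Telestration can be sketched as a nearest-neighbor snap: an annotation position chosen at the Ground Station is snapped to the closest point of the low-resolution cloud received from the HMD, so the icon renders on real-world geometry rather than floating in space. This is a conceptual Python sketch (a real system would use the reconstructed mesh and a spatial index); `snap_to_cloud` is an illustrative name.

```python
# Sketch of telestration anchoring: snap a Ground Station annotation to
# the nearest point of the point cloud received from the HoloLens 2.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def snap_to_cloud(annotation: Point, cloud: List[Point]) -> Point:
    """Return the cloud point closest to the requested annotation position."""
    return min(cloud, key=lambda p: math.dist(p, annotation))

cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
# Ground Station user drops an icon near (0.9, 0.1, 0.2)
anchor = snap_to_cloud((0.9, 0.1, 0.2), cloud)
print(anchor)  # (1.0, 0.0, 0.0)
```

A brute-force scan is O(n) per annotation; at the point densities involved, a k-d tree would be the natural next step, which is one reason cloud anchors are attractive for fighting drift.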


Abbreviations

AR: Augmented Reality

ARMs: Arm-Retained Menus

ARSIS: Augmented Reality Space Informatics System

CRUD: Create, Read, Update, Delete

EVA: Extravehicular Activity

HITL: Human In The Loop

HMD: Head-Mounted Display

HUD: Heads-Up Display

LunaSAR: Lunar Search and Rescue

MRTK: Mixed Reality Toolkit

NASA: National Aeronautics and Space Administration

RC: Remote Controlled

SLAP: Situational and Locational Awareness Package

SUITS: Spacesuit User Interface Technologies for Students

UI: User Interface

VR: Virtual Reality

XR: Extended Reality


Acknowledgements

The achievements our team has made, and will continue to make, are in large part due to our advisors Dr. Steve Swanson and Dr. Karen Doty. We appreciate all their contributions, and we also thank you, the reader, for supporting our efforts, and HCI International for inviting our team to participate in HCII 2022.

Author information

Correspondence to Karen Doty.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Laing, M., Cram, C., Frances, M., Tullis, A., Teogalbo, D.J., Doty, K. (2022). Extravehicular Intelligence Solution for Lunar Exploration and Research: ARSIS 5.0. In: Kurosu, M., et al. HCI International 2022 - Late Breaking Papers. Design, User Experience and Interaction. HCII 2022. Lecture Notes in Computer Science, vol 13516. Springer, Cham. https://doi.org/10.1007/978-3-031-17615-9_40

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-17615-9_40

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-17614-2

  • Online ISBN: 978-3-031-17615-9

  • eBook Packages: Computer Science, Computer Science (R0)
