Abstract
US dismounted Special Operations Forces operating in near-threat environments must maintain their situation awareness, survivability, and lethality to maximize their effectiveness on mission. As small unmanned aerial systems and unmanned ground vehicles become more readily available as organic assets at the individual Operator and team level, Operators must decide whether to divert attention and resources to controlling these assets via touchscreens or handheld controllers in order to benefit from their support capabilities. This paper provides an end-to-end overview of a solution development process that began with a broad, future-looking capabilities exploration to address this issue of unmanned system control at the dismounted Operator level, and narrowed to a fieldable solution offering immediate value to Special Operations Forces. An overview of this user-centric design process is presented, along with lessons learned for others developing complex human-machine interface solutions for dynamic, high-risk environments.
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Jenkins, M. et al. (2021). Contextually Adaptive Multimodal Mixed Reality Interfaces for Dismounted Operator Teaming with Unmanned System Swarms. In: Chen, J.Y.C., Fragomeni, G. (eds) Virtual, Augmented and Mixed Reality. HCII 2021. Lecture Notes in Computer Science(), vol 12770. Springer, Cham. https://doi.org/10.1007/978-3-030-77599-5_30
DOI: https://doi.org/10.1007/978-3-030-77599-5_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-77598-8
Online ISBN: 978-3-030-77599-5
eBook Packages: Computer Science (R0)