
Utility of Functional Transparency and Usability in UAV Supervisory Control Interface Design

Published in: International Journal of Social Robotics

Abstract

A basic notion of transparency in automated systems design is the need to support user tracking and understanding of system states. Many usability principles for complex systems design implicitly target the concept of transparency. In this study, we compared a “baseline” control interface mimicking an existing UAV ground control station (GCS), an “enhanced” interface designed for improved functional transparency and usability, and a “degraded” interface from which important design features were removed. Each participant was extensively trained in the use of one of the interfaces and all simulated UAV control tasks, and was then tested in four trials of a typical military concept of UAV operation with different mission maps and vehicle speeds. Results revealed that participants using the enhanced interface produced significantly faster task completion times and greater accuracy across all UAV control tasks. The enhanced features were also found to promote operator understanding of the system and to mitigate workload. By defining automation transparency as an overarching design objective and identifying specific transparency and usability issues in existing GCS designs, we were able to design and prototype an enhanced interface that more effectively supported human-automation interaction. Automation transparency as a high-level design objective may be useful for expert designers, whereas usability design guidelines, as “building blocks” of transparency, may be a useful tool for new system designers.



Funding

This research was supported by a grant from the National Aeronautics and Space Administration (NASA Grant No. NNX16AB23A). Terry Fong was the technical monitor. The views and opinions expressed are those of the authors and do not necessarily reflect the views of NASA.

Author information

Correspondence to Wenjuan Zhang or David Kaber.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Research involving human participants

The Institutional Review Board of North Carolina State University approved this study. All participants were provided information on the objectives, procedure, and potential risks of the experiment and signed an informed consent form prior to testing.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Zhang, W., Feltner, D., Kaber, D. et al. Utility of Functional Transparency and Usability in UAV Supervisory Control Interface Design. Int J of Soc Robotics 13, 1761–1776 (2021). https://doi.org/10.1007/s12369-021-00757-x

