Abstract
Augmented Reality (AR) is a tool that can improve human-computer interaction in flight operations. AR can help pilots integrate information from flight-deck interfaces and analyse multiple sources of messages simultaneously. Seventeen subjects aged 23 to 53 (M = 29.82, SD = 8.93) participated in this experiment; their flight experience ranged from zero to 3000 flight hours (M = 605.00, SD = 1051.04). Two types of AR HCI design (gesture-controlled and voice-controlled checklists) were compared with a traditional paper checklist. The results show that the AR gesture-controlled checklist induced the highest perceived workload compared with the AR voice checklist and the traditional paper checklist. The AR gesture checklist involves many complicated cognitive processes and physical movements, which induced the highest levels of effort and frustration on the NASA-TLX. The AR checklist application relied on the default HoloLens interactions, including cursor movement linked to head movement, the Air Tap gesture and the Microsoft voice recognition system. The technological features currently embedded in the HoloLens device are not yet certified for use in the cockpit. Improvements in the types of interaction and display on AR devices could change pilots' perceived workload while interacting with such innovative devices. This research demonstrates that AR integrated with voice command has the potential to deliver significant benefits on the flight deck in future flight operations.
1 Introduction
The application of augmented reality (AR) can help pilots interact with the interfaces in the flight deck and analyse multiple sources of information simultaneously. System developers take the importance of human-computer interaction into account when designing new operational systems, in order to optimise pilots' situational awareness and minimise workload [1]. Augmented reality differs from virtual reality (VR): AR overlays images on the real-world environment, whereas VR is based on a digital environment in which the user cannot see or interact with the real world. Augmented reality is a tool that can improve human-computer interaction in aviation [2]. However, such innovative AR devices need to be validated before implementation in aviation.
Augmented Reality (AR) is textual, symbolic or graphical information overlaid onto the user's view of the real world. In his survey of AR technology, Azuma [3] defines the key characteristics inherent to AR: a combination of real and virtual objects, real-time interaction, and 3D registration of virtual objects. The first use of AR in aviation was during World War II, with the development of head-up displays (HUDs) placing aiming gunsights in the pilot's field of view [4]. The concept was further developed during the 1960s with the inclusion of flight information. This early research found some utility for HUD technology in flight operations, but it presented significant human factors issues, often with distracting and disorientating effects for pilots [5]. It was not until later standardisation efforts that the full effectiveness of HUDs was realised [6].
Helmet-mounted displays (HMDs) offer the next evolution of HUD technology, incorporating HUD parameters (flight, weapons, navigation) and offering head-aimed, off-boresight weapon delivery, improving on the compromised situational awareness inherent in the traditional Cursor Control Devices (CCDs) used for this function [7]. In recent years there has been increased development of AR headsets (e.g. Google Glass, Microsoft HoloLens and Magic Leap One) with the potential to be used in commercial aviation. Comerford and Johnson described an augmented cockpit workspace in which a pilot could interact with virtual information windows, grouping data and supplementing the environment with overlaid data [8].
Over the past seventy years, the crew complement has been reduced from a five-person crew in the 1950s (Captain, First Officer, Navigator, Radio Operator and Flight Engineer) to two (Captain and First Officer). Increased automation and reduced pilot availability may see a further reduction to single-pilot operations. Studies have investigated the potential for a ground operator to assist the single pilot during high-workload situations [9]. Such assistance can also be offered by AR systems that guide the operator through flight tasks, compensating for the lack of a co-pilot. Tran and his colleagues explored the use of the Microsoft HoloLens to reduce reaction time during an emergency [10]. By re-enacting the TransAsia Flight 235 accident, they compared the performance of pilots using the HoloLens with that of pilots using ECAM systems; for the participants using the HoloLens, the relevant controls were indicated within the pilot's field of view.
In modern aviation, checklists and monitoring are considered essential for flight safety and for guarding against equipment or pilot errors. Two examples of accidents where the probable cause was improper checklist procedure are TransAsia Flight 235, where the pilots shut down the wrong engine in response to an engine failure [11], and the Beechcraft King Air B200 accident, where the pilot failed to identify full nose-left rudder trim prior to take-off [12]. Dismukes's study of checklist deviations highlighted the safety risk that arises when checklist procedures are not followed [13].
The execution of a checklist may be influenced by the medium used for its representation, typically either paper or digitised format. Studies have shown that paper remains the fastest checklist medium for pilots; however, paper checklists lack a pointer/memory system, leading to skipped items [14]. Digitised checklists provide a pointer and use colour to indicate completed items, and studies have shown they reduce error rates compared with traditional paper systems [15].
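To make the pointer/memory distinction concrete, the sketch below is a minimal, hypothetical Python model (not taken from the cited studies) of a digitised checklist that keeps a pointer to the next item to action and colour-codes completed items:

```python
# Minimal sketch of a digitised checklist: a pointer tracks the next item,
# and completed items are rendered in colour. Item names are illustrative.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Checklist:
    items: list
    completed: set = field(default_factory=set)
    pointer: int = 0  # index of the next item to action

    def current(self) -> Optional[str]:
        """Return the item the pointer rests on, or None when finished."""
        return self.items[self.pointer] if self.pointer < len(self.items) else None

    def complete_current(self) -> None:
        """Mark the pointed-at item as done and advance the pointer."""
        if self.pointer < len(self.items):
            self.completed.add(self.pointer)
            self.pointer += 1

    def render(self) -> str:
        """Colour-code completed items (green), as a digitised display would."""
        GREEN, RESET = "\033[92m", "\033[0m"
        return "\n".join(
            f"{GREEN}[x] {item}{RESET}" if i in self.completed else f"[ ] {item}"
            for i, item in enumerate(self.items)
        )

pre_landing = Checklist(["Landing gear down", "Flaps set", "Speedbrakes armed"])
pre_landing.complete_current()  # ticks the first item; pointer advances
print(pre_landing.render())
```

The pointer is exactly the memory aid that paper lacks: if the operator is interrupted, the checklist itself records where to resume.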
The potential for AR in HCI was first explored by Weiser with his vision of "Ubiquitous Computing", in which interactions with computing technology become transparent and indistinguishable from everyday life [16]. Although not all of the ideas presented in that paper have been realised, many are now recognisable as part of modern life: mobile and wearable computing, wireless communication and the Internet of Things. By combining real and virtual objects, AR leads to the concept of "Tangible Computing", a form of Ubiquitous Computing in which everyday objects become input and output devices for computer interaction [17, 18]. The concept of Tangible Computing can be taken further when we consider proprioception, the sense of the position and orientation of the body and limbs. This enables new HCI methods: direct manipulation via body sense, physical mnemonics to store and recall information, and gestural actions to issue commands [19].
Hand pose estimation has seen a great deal of computer vision (CV) research aimed at enabling gesture recognition without the data gloves traditionally used to capture hand movements. Given the complexity of the human hand and the multiple degrees of freedom of hand posture and movement, glove-based methods remain an effective way to acquire a model of hand functionality [20]. The limitations of glove-based systems (cost, restricted movement, complex calibration) have led to increased CV research into unencumbered gesture recognition, developing methods that determine the 3D hand pose for gesture classification, closely linked to the methods used for human body tracking. Erol and his colleagues summarised the earlier CV work on pose estimation, with techniques ranging from low degrees-of-freedom (DOF) tracking of simple gestures to complex learning methods for high-DOF hand models [21]. Modern methods employ deep learning techniques, with convolutional neural networks proving effective at interpreting the hand from a depth map [22].
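As an illustration of this depth-based approach, the following PyTorch sketch regresses 3D joint positions from a single depth-map crop. It is a minimal, hypothetical network in the spirit of methods such as [22]; the layer sizes, input resolution and joint count are assumptions for illustration, not the DeepPrior++ architecture.

```python
# Illustrative CNN that regresses 3D hand-joint positions from a depth map.
# Architecture details are assumptions, not a published model.
import torch
import torch.nn as nn

class HandPoseNet(nn.Module):
    def __init__(self, num_joints: int = 21):
        super().__init__()
        self.num_joints = num_joints
        self.features = nn.Sequential(            # input: 1 x 96 x 96 depth crop
            nn.Conv2d(1, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 12 * 12, 1024), nn.ReLU(),
            nn.Linear(1024, num_joints * 3),      # (x, y, z) per joint
        )

    def forward(self, depth: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.features(depth)).view(-1, self.num_joints, 3)

net = HandPoseNet()
joints = net(torch.randn(1, 1, 96, 96))  # -> tensor of shape (1, 21, 3)
```

A classifier for discrete gestures (such as Air Tap) would typically sit on top of such pose estimates or be trained end-to-end on the same features.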
2 Method
2.1 Participants
Seventeen aviation professionals (pilots, engineers and training pilots) aged 23 to 53 (M = 29.82, SD = 8.93) participated in the experiment. Their flight experience ranged from zero to 3000 flight hours (M = 605.00, SD = 1051.04). Because data were gathered from human subjects, the research proposal was submitted to the Cranfield University Research Ethics System for ethical approval. As stated in the consent form, participants had the right to terminate the experiment at any time and to withdraw their data at any moment, even after data collection.
2.2 Research Apparatus
Augmented Reality Device.
The AR device used in the experiment was a Microsoft HoloLens headset (Fig. 1). The headset comprises see-through holographic waveguides, two HD 16:9 light engines and built-in processors, displaying holograms at a resolution of 1280 × 720 px per eye with a field of view of 30° × 17.5° and a refresh rate of 60 Hz. Brightness and audio volume can be adjusted with four buttons located on top of the headset. The HoloLens has built-in sensors: an inertial measurement unit (IMU), four environment-understanding cameras, one depth camera, one 2 MP photo/HD video camera, four microphones and one ambient light sensor. Its audio output consists of two speakers located near the user's ears that can emit spatial sound.
Flight Simulator.
The experiment was run on the Cranfield University Large Aircraft Flight Simulator, running a representative simulation model of the Boeing 747 (Hanke, 1971). It comprises a realistic mock-up of a Boeing commercial aircraft cockpit with functioning flight controls, stick-shaker stall warning, over-speed alerts, primary flight and navigation displays, and a landing gear lever, among other features. The simplified overhead panel is composed of light switches, engine fire emergency levers and engine ignition switches (Fig. 2). The scenario is based on an Instrument Landing System (ILS) final approach: the aircraft is set at 2000 ft and eight nautical miles (NM) from the airfield. As soon as the simulation starts, participants must execute a pre-landing checklist by interacting with the AR device while flying the aircraft to landing.
Perceived Workload Measurement.
The NASA-TLX was applied to evaluate pilots' perceived workload across the AR voice, AR gesture and paper checklist conditions. NASA-TLX is a popular technique for measuring subjective perceived workload on six dimensions: mental demand, physical demand, temporal demand, performance, effort and frustration. Participants evaluated their perceived workload after each trial of the three pre-landing checklist modes. Analysing these six dimensions makes it possible to understand the safety concerns around applications of AR in relation to perceived workload and HCI in flight operations.
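For reference, the sketch below shows how an overall NASA-TLX score is conventionally computed from the six subscale ratings, either as the unweighted "Raw TLX" mean or weighted by the 15 pairwise comparisons; the rating values shown are hypothetical, not data from this study.

```python
# NASA-TLX scoring: six subscales rated 0-100; optional weights (0-5 per
# dimension, from 15 pairwise comparisons) sum to 15. Ratings are made up.
DIMENSIONS = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def tlx_score(ratings, weights=None):
    """Overall NASA-TLX score; unweighted 'Raw TLX' mean if weights is None."""
    if weights is None:
        return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)
    assert sum(weights[d] for d in DIMENSIONS) == 15, "weights must sum to 15"
    return sum(ratings[d] * weights[d] for d in DIMENSIONS) / 15

# Hypothetical ratings for one participant after one trial:
ratings = {"mental": 70, "physical": 80, "temporal": 55,
           "performance": 60, "effort": 75, "frustration": 65}
print(tlx_score(ratings))  # Raw TLX mean: 67.5
```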
2.3 Research Design
All participants carried out the following: (1) provided demographic data including age, gender, qualifications, type hours and total flight hours (five minutes); (2) familiarised themselves with the HoloLens AR device (15 min); (3) practised landing the aircraft with a checklist in the B747 simulator (ten minutes); (4) received a briefing on the AR checklist app, with a detailed explanation of item highlighting by voice control and gesture control (ten minutes); (5) performed landings using the HoloLens AR device with voice control (five minutes) and gesture control (five minutes) in random order; (6) evaluated their perceived workload with the NASA-TLX for the three modes of operational checklist (ten minutes).
3 Results and Discussion
Seventeen participants conducted flight operations in three checklist modes: traditional paper checklist, gesture-controlled AR checklist and voice-controlled AR checklist. One-way ANOVA was applied for data analysis, with Bonferroni post-hoc tests to identify pairwise differences for factors with more than two levels; partial eta-squared (η²) is reported as the measure of effect size. The descriptive results for the NASA-TLX scores and the six dimensions across the three checklist modes in the pre-landing phase are shown in Table 1.
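The following sketch illustrates this analysis pipeline (one-way ANOVA, partial eta-squared and Bonferroni-corrected pairwise comparisons) in Python with SciPy; the group arrays are placeholder values, not the study's data.

```python
# One-way ANOVA with partial eta-squared and Bonferroni post-hoc tests.
# Scores below are placeholders, NOT the study's NASA-TLX data.
from itertools import combinations
import numpy as np
from scipy import stats

groups = {
    "paper":      np.array([30.0, 35, 28, 40, 33]),
    "ar_voice":   np.array([32.0, 30, 29, 38, 31]),
    "ar_gesture": np.array([55.0, 60, 52, 65, 58]),
}

# Omnibus test across the three checklist modes
f_stat, p_val = stats.f_oneway(*groups.values())

# Partial eta-squared for a one-way design: SS_between / (SS_between + SS_error)
grand_mean = np.concatenate(list(groups.values())).mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups.values())
eta_p2 = ss_between / (ss_between + ss_error)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}, partial eta^2 = {eta_p2:.2f}")

# Bonferroni correction: multiply each pairwise p-value by the number of pairs
pairs = list(combinations(groups, 2))
for a, b in pairs:
    _, p = stats.ttest_ind(groups[a], groups[b])
    print(f"{a} vs {b}: Bonferroni-corrected p = {min(p * len(pairs), 1.0):.4f}")
```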
There was a significant difference in participants' perceived workload (overall NASA-TLX) among the three modes of pre-landing checklist, F(2, 48) = 7.49, p < .01, η² = 0.24. Post-hoc comparisons indicate that the AR gesture checklist induced a significantly higher level of workload than the traditional paper checklist and the AR voice checklist; overall, the AR gesture checklist caused participants the highest level of perceived workload in flight operations (Table 1). Furthermore, there was a significant difference in physical demand among the three modes, F(2, 48) = 9.15, p < .001, η² = 0.28. Post-hoc comparisons show that the physical demand of the AR gesture checklist was significantly higher than that of the paper and AR voice checklists. Participants were observed having to activate many checklist items during the pre-landing procedure using Air Tap gestures; physical demand was therefore highest for the gesture mode. There was also a significant difference in performance among the three modes, F(2, 48) = 21.34, p < .01, η² = 0.22. Post-hoc comparisons show that flight performance with the AR gesture checklist was significantly lower than with the paper and AR voice checklists. Participants struggled to accomplish the many physical movements and Air Tap gestures in addition to flying the aircraft safely to landing, so performance with the AR gesture checklist was the lowest of the three modes.
There was a significant difference in participants' effort among the three modes of checklist, F(2, 48) = 9.31, p < .001, η² = 0.28. Post-hoc comparisons indicate that the AR gesture checklist demanded more effort from participants than the paper and AR voice checklists. Participants had to complete the pre-landing checklist by moving the camera focus onto an item in the checklist and simultaneously activating it with an Air Tap, involving many cognitive processes and physical efforts; effort was therefore highest for the AR gesture checklist. There was also a significant difference in frustration among the three modes, F(2, 48) = 8.90, p < .01, η² = 0.24. Post-hoc comparisons indicate that the AR gesture checklist made participants feel a significantly higher level of frustration than the paper and AR voice checklists. During the experiment, a potential lack of compliance was witnessed, as some participants appeared to become frustrated with the physical effort and mental demand required to operate the AR device by Air Tap and, in particular, the landing gear. Although these instances did not result in non-compliance during the study, it is likely that some participants could become non-compliant in a single-pilot operation scenario [23]. However, mental demand (F(2, 48) = 2.69, p = .08) and temporal demand (F(2, 48) = 2.77, p = .07) showed no significant differences among the three modes in the current study (Table 1). Compliance with checklists and procedures is of great interest for human factors research and accident investigation, as non-compliance may account for a significant number of aviation accidents. The human-centred design of augmented visualisation aids affects human performance and cognitive processes by increasing the operator's capability to manage complex checklists.
4 Conclusion
Two types of interaction, gesture control and voice control, were compared with traditional paper checklists. The results show that gesture-controlled AR gives rise to unnecessary complexity and tends to be awkward to use. The AR gesture checklist involves many complicated cognitive processes and physical movements, inducing the highest level of frustration compared with the paper and AR voice checklists; accordingly, participants rated the AR gesture checklist highest in perceived workload. On the other hand, a voice-controlled AR checklist could establish human-centred HCI design in the flight deck for the pre-landing check: it was rated the lowest in workload while pilots performed the pre-landing checklist. The AR checklist application relied on the default HoloLens interactions, including cursor movement linked to head movement, the Air Tap gesture and the Microsoft voice recognition system. The technological features currently embedded in the HoloLens device are not yet certified for use in the cockpit. Improvements in the types of interaction and display on AR devices could change pilots' perceived workload while interacting with such innovative devices. This research demonstrates that AR voice command has the potential to deliver significant benefits on the flight deck in future flight operations.
References
Dorneich, M.C., Rogers, W., Whitlow, S.D., DeMers, R.: Human performance risks and benefits of adaptive systems on the flight deck. Int. J. Aviat. Psychol. 26(1–2), 15–35 (2016)
Luzik, E., Akmaldinova, A.: Psychological aspects of ensuring flight safety in civil aviation. Aviation 10(1), 25–35 (2006)
Azuma, R.T.: A survey of augmented reality. Presence Teleoper. Virtual Environ. 6(4), 355–385 (1997)
Prinzel, L.J., Risser, M.: Head-up displays and attention capture (NASA/TM-2004-213000). NASA Langley Research Center, Hampton, VA, United States (2004)
Barnette, J.F.: Role of head-up display in instrument flight (No. IFC-LR-76-2). Air Force Instrument Flight Center Randolph AFB TX, Dublin, OH, United States (1976)
Newman, R.L.: Improvement of head-up display standards. In: Head-Up Display Design Guide. Appendix, vol. 1. Crew Systems Consultants, Yellow Springs, OH, United States (1987)
Thomas, P., Biswas, P., Langdon, P.: State-of-the-art and future concepts for interaction in aircraft cockpits. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2015. LNCS, vol. 9176, pp. 538–549. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20681-3_51
Comerford, D., Johnson, W.W.: Potential capabilities in a future, augmented cockpit. Ergon. Design 15(1), 8–13 (2007)
Brandt, S.L., Lachter, J., Battiste, V., Johnson, W.: Pilot situation awareness and its implications for single pilot operations: analysis of a human-in-the-loop study. Procedia Manuf. 3, 3017–3024 (2015)
Tran, T.H., Behrend, F., Fünning, N., Arango, A.: Single pilot operations with AR-glasses using Microsoft HoloLens. In: IEEE/AIAA 37th Digital Avionics Systems Conference, pp. 1–7. IEEE, London (2018)
Kharoufah, H., Murray, J., Baxter, G., Wild, G.: A review of human factors causations in commercial air transport accidents and incidents: from 2000 to 2016. Prog. Aerosp. Sci. 99, 1–13 (2018)
Australian Transport Safety Bureau: Loss of control and collision with terrain involving B200 King Air, VH-ZCR at Essendon Airport, Victoria on 21 February 2017 (Investigation No. AO-2017-024). ATSB, Canberra, Australia (2018)
Dismukes, R.K., Berman, B.: Checklists and monitoring in the cockpit: why crucial defenses sometimes fail (NASA/TM-2010-216396, TH-084, ARC-E-DAA-TN1902). NASA Ames Research Center, Moffett Field, CA, United States (2010)
Degani, A., Wiener, E.L.: Human factors of flight-deck checklists: the normal checklist (NCC2-377; RTOP 505-67-41). Ames Research Center, Moffett Field, CA, United States (1991)
Rouse, S., Rouse, W.: Computer-based manuals for procedural information. IEEE Trans. Syst. Man Cybern. 10(8), 506–510 (1980)
Weiser, M.: The computer for the 21st century. Sci. Am. 265(3), 94–104 (1991)
Ishii, H., Ullmer, B.: Tangible bits: towards seamless interfaces between people, bits and atoms. In: Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, pp. 234–241. ACM Press, New York (1997)
Kato, H., Billinghurst, M.: Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In: Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality, pp. 85–94. IEEE, London (1999)
Mine, M.R., Brooks, F.P., Jr., Sequin, C.H.: Moving objects in space: exploiting proprioception in virtual-environment interaction. In: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, pp. 19–26. ACM Press, New York (1997)
Dipietro, L., Sabatini, A.M., Dario, P.: A survey of glove-based systems and their applications. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 38(4), 461–482 (2008)
Erol, A., Bebis, G., Nicolescu, M., Boyle, R.D., Twombly, X.: Vision-based hand pose estimation: a review. Comput. Vis. Image Underst. 108(1), 52–73 (2007)
Oberweger, M., Lepetit, V.: DeepPrior++: improving fast and accurate 3D hand pose estimation. In: IEEE International Conference on Computer Vision Workshops. IEEE, London (2017)
Stanton, N.A., Plant, K.L., Roberts, A.P., Allison, C.K.: Use of highways in the sky and a virtual pad for landing head up display symbology to enable improved helicopter pilots situation awareness and workload in degraded visual conditions. Ergonomics 62(2), 255–267 (2019)