ABSTRACT
As part of our research on multimodal analysis and visualization of activity dynamics, we are exploring the integration of data produced by a variety of sensor technologies within ChronoViz, a tool that supports simultaneous visualization of multiple streams of time series data. This paper reports on the integration of a mobile eye-tracking system with data streams collected from HD video cameras, microphones, digital pens, and simulation environments. We focus on the challenging environment of the commercial airline flight deck, analyzing the use of mobile eye-tracking systems in aviation human factors research. We report on techniques and methods that can be applied in this and other domains to successfully collect, analyze, and visualize eye-tracking data in combination with the array of data types supported by ChronoViz.
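A central step in this kind of integration is placing each sensor stream on a shared timeline, since the eye tracker, video cameras, and other devices sample at different rates with independent clocks. The following is a minimal sketch of that idea (hypothetical code, not ChronoViz's actual API): it maps each gaze sample to the nearest video frame within a small tolerance, a common first step before overlaying gaze points on video for visualization.

```python
from bisect import bisect_left

def align_to_frames(gaze_times, frame_times, tolerance=0.02):
    """Pair gaze samples with video frames on a shared timeline.

    Returns (gaze_index, frame_index) pairs for every gaze sample whose
    timestamp falls within `tolerance` seconds of some frame timestamp.
    Both input lists are timestamps in seconds, sorted ascending.
    """
    pairs = []
    for gi, t in enumerate(gaze_times):
        # Binary-search the insertion point, then compare the two
        # neighboring frame timestamps to find the closest one.
        i = bisect_left(frame_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
        if not candidates:
            continue
        fi = min(candidates, key=lambda j: abs(frame_times[j] - t))
        if abs(frame_times[fi] - t) <= tolerance:
            pairs.append((gi, fi))
    return pairs

# Example: a 60 Hz gaze stream aligned against a 30 fps video.
gaze = [k / 60.0 for k in range(6)]
frames = [k / 30.0 for k in range(3)]
print(align_to_frames(gaze, frames))
```

In practice the streams would also need an offset correction for clock skew (e.g., from a shared synchronization event such as a clapboard or timestamp beacon) before nearest-neighbor matching is meaningful.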
Let's look at the cockpit: exploring mobile eye-tracking for observational research on the flight deck