DOI: 10.1145/2702123.2702539

SwitchBack: Using Focus and Saccade Tracking to Guide Users' Attention for Mobile Task Resumption

Published: 18 April 2015

Abstract

Smartphones and tablets are often used in dynamic environments that force users to break focus and attend to their surroundings, creating a form of "situational impairment." Current mobile devices have no ability to sense when users divert or restore their attention, let alone provide support for resuming tasks. We therefore introduce SwitchBack, a system that allows mobile device users to resume tasks more efficiently. SwitchBack is built upon Focus and Saccade Tracking (FAST), which uses the front-facing camera to determine when the user is looking and how their eyes are moving across the screen. In a controlled study, we found that FAST can identify how many lines the user has read in a body of text within a mean absolute percent error of just 3.9%. We then tested SwitchBack in a dual focus-of-attention task, finding that SwitchBack improved average reading speed by 7.7% in the presence of distractions.
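The abstract's line-counting claim rests on a simple reading pattern: within a line, gaze drifts steadily rightward, and a large leftward jump (a "return sweep" saccade) marks the start of the next line. The sketch below is purely illustrative and is not the authors' implementation; the function names, the 0.4 threshold, and the toy gaze trace are all hypothetical. It also shows the mean absolute percent error (MAPE) metric the abstract reports (3.9% for FAST).

```python
# Illustrative sketch only -- NOT the FAST implementation from the paper.
# Idea: count return-sweep saccades (large leftward gaze jumps) to estimate
# how many lines of text the user has read.

def count_lines_read(gaze_x, return_sweep_threshold=0.4):
    """Estimate lines read from normalized horizontal gaze positions (0..1).

    A leftward jump larger than `return_sweep_threshold` (a hypothetical
    value) is treated as a return sweep to the start of a new line.
    """
    lines = 1 if gaze_x else 0
    for prev, cur in zip(gaze_x, gaze_x[1:]):
        if prev - cur > return_sweep_threshold:
            lines += 1
    return lines


def mean_absolute_percent_error(estimated, actual):
    """MAPE, the error metric reported in the abstract."""
    return 100.0 * sum(abs(e - a) / a
                       for e, a in zip(estimated, actual)) / len(estimated)


# Toy trace: two simulated lines of reading with one return sweep
# (the drop from 0.8 back to 0.15).
trace = [0.1, 0.3, 0.5, 0.8, 0.15, 0.4, 0.7]
print(count_lines_read(trace))  # 2
```

A real system would of course first map camera frames to gaze coordinates and smooth out tracking noise before applying any such heuristic.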

Supplementary Material

Supplemental video: suppl.mov (pn2281-file3.flv)
MP4 file: p2953-mariakakis.mp4




    Published In

    CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
    April 2015
    4290 pages
    ISBN:9781450331456
    DOI:10.1145/2702123


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze-tracking
    2. mobile
    3. reading
    4. situational impairments

    Qualifiers

    • Research-article


    Conference

CHI '15: CHI Conference on Human Factors in Computing Systems
April 18 - 23, 2015
Seoul, Republic of Korea

    Acceptance Rates

CHI '15 Paper Acceptance Rate: 486 of 2,120 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%



Cited By

    • (2024) PrivateGaze: Preserving User Privacy in Black-box Mobile Gaze Tracking Services. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1-28. DOI: 10.1145/3678595
    • (2024) The Ability-Based Design Mobile Toolkit (ABD-MT): Developer Support for Runtime Interface Adaptation Based on Users' Abilities. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-26. DOI: 10.1145/3676524
    • (2024) GazePrompt: Enhancing Low Vision People's Reading Experience with Gaze-Aware Augmentations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642878
    • (2024) Human I/O: Towards a Unified Approach to Detecting Situational Impairments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642065
    • (2024) Expert gaze as a usability indicator of medical AI decision support systems: a preliminary study. npj Digital Medicine 7(1). DOI: 10.1038/s41746-024-01192-8
    • (2024) KD-Eye: Lightweight Pupil Segmentation for Eye Tracking on VR Headsets via Knowledge Distillation. Wireless Artificial Intelligent Computing Systems and Applications, 209-220. DOI: 10.1007/978-3-031-71464-1_18
    • (2023) EV-Eye. Proceedings of the 37th International Conference on Neural Information Processing Systems, 62169-62182. DOI: 10.5555/3666122.3668838
    • (2023) Improving Mobile Reading Experiences While Walking Through Automatic Adaptations and Prompted Customization. Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-3. DOI: 10.1145/3586182.3616666
    • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871
    • (2023) Reading and Walking with Smart Glasses: Effects of Display and Control Modes on Safety. International Journal of Human–Computer Interaction 40(23), 7875-7891. DOI: 10.1080/10447318.2023.2276529
