
Smooth Gaze: a framework for recovering tasks across devices using eye tracking

Original Article · Published in Personal and Ubiquitous Computing

Abstract

A user's task is often distributed across devices: for example, a student listens to a lecture in a classroom while watching slides on a projected screen, takes notes on her laptop, and occasionally checks Twitter for comments on her smartphone. In scenarios like this, users move between heterogeneous devices and must cope with the physical and mental overhead of resuming a task. To address this problem, we created Smooth Gaze, a framework that records the user's work state and resumes it seamlessly across devices by leveraging implicit gaze input. In particular, we propose two novel and intuitive techniques, smart watching and smart posting, for detecting which display and target region the user is looking at, and for transferring and integrating content across devices, respectively. In addition, we designed and implemented a cross-device reading system, SmoothReading, which captures content from secondary devices and, based on eye tracking, generates annotations to be displayed on the primary device. A user study showed that the system supported information seeking and task resumption, and improved users' overall reading experience.
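The core idea behind smart watching, inferring which display and region the user is attending to from implicit gaze input, can be sketched as a simple dwell-time classifier over a gaze-sample stream. The sketch below is illustrative only, not the paper's implementation: the `Region` class, the coordinate layout, and the 300 ms dwell threshold are all assumptions made for the example.

```python
# Illustrative sketch of gaze-based display/region detection: classify a
# stream of gaze samples against known screen regions, and report a region
# as "attended" once gaze dwells on it continuously for a minimum duration.
from dataclasses import dataclass

@dataclass
class Region:
    """A rectangular display region in a shared coordinate space (hypothetical)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def attended_region(samples, regions, dwell_ms: float = 300):
    """Return the first region on which gaze dwells continuously for at
    least dwell_ms, or None. samples: iterable of (timestamp_ms, x, y)."""
    current, start = None, None
    for t, gx, gy in samples:
        region = next((r for r in regions if r.contains(gx, gy)), None)
        if region is not current:          # gaze moved to a different region
            current, start = region, t     # restart the dwell timer
        if current is not None and t - start >= dwell_ms:
            return current                 # dwell threshold reached
    return None
```

For instance, with a laptop screen and a projected screen defined as two regions, a 350 ms run of samples inside the projector's bounds would classify the projector as the attended display; brief glances shorter than the threshold are ignored.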




Acknowledgments

We thank all the volunteers, and acknowledge support from the National Key Research & Development Program of China (No. 2016YFB1001403), the National Natural Science Foundation of China (Nos. 61772468 and 61572437), and the Zhejiang Provincial Natural Science Foundation of China (No. LY15F020030).

Author information

Correspondence to Shiwei Cheng.


About this article


Cite this article

Cheng, S., Fan, J. & Dey, A.K. Smooth Gaze: a framework for recovering tasks across devices using eye tracking. Pers Ubiquit Comput 22, 489–501 (2018). https://doi.org/10.1007/s00779-018-1115-8

