Abstract
In public display contexts, interaction is spontaneous and has to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and a natural indicator of the user's interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate its content using only their eyes. GazeHorizon supports extemporaneous use and is optimised for immediate usability by any user, without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of horizontal gaze direction, and maps this input to rate-controlled navigation of horizontally arranged content. We evaluated GazeHorizon through a series of field studies, culminating in a 4-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that because eye movements are subtle, users cannot learn gaze interaction merely by observing others; explicit guidance is therefore required.
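To make the interaction model concrete, the following is a minimal sketch of the rate-control mapping described above: a person-independent estimate of horizontal gaze direction (here approximated by the pupil's position between the eye corners, in the spirit of the calibration-free PCR method [29]) drives the scrolling speed of horizontally arranged content. All function names, thresholds and constants below are illustrative assumptions, not the published implementation.

```python
def horizontal_gaze(pupil_x: float, inner_corner_x: float,
                    outer_corner_x: float) -> float:
    """Gaze estimate in [-1, 1] from the pupil's position between the eye corners.

    Roughly the idea behind calibration-free horizontal estimates such as the
    pupil-canthi-ratio (PCR) [29]: 0 ~ straight ahead, -1/+1 ~ far left/right.
    """
    centre = 0.5 * (inner_corner_x + outer_corner_x)
    half_width = 0.5 * abs(outer_corner_x - inner_corner_x)
    if half_width == 0.0:  # degenerate detection; treat as looking straight ahead
        return 0.0
    ratio = (pupil_x - centre) / half_width
    return max(-1.0, min(1.0, ratio))


DEAD_ZONE = 0.2    # hypothetical: gaze near the centre leaves content at rest
MAX_SPEED = 400.0  # hypothetical: maximum scroll speed in pixels per second


def scroll_velocity(gaze: float) -> float:
    """Rate control: scroll speed grows with how far off-centre the user looks."""
    if abs(gaze) < DEAD_ZONE:
        return 0.0
    direction = 1.0 if gaze > 0 else -1.0
    magnitude = (abs(gaze) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    return direction * magnitude * MAX_SPEED


# Per-frame update (dt = seconds since the last camera frame):
#     content_offset += scroll_velocity(horizontal_gaze(px, ix, ox)) * dt
```

The dead zone lets a user read the item in front of them without the content drifting, while looking further towards either edge scrolls proportionally faster in that direction.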
Notes
This article is an extended version of the work [30] we presented at UbiComp 2014. The previous paper highlighted only the deployment of GazeHorizon, whereas this article provides a complete report, with detailed observations, lessons learned and in-depth discussion of each field study.
References
Boring S, Baur D, Butz A, Gustafson S, Baudisch P (2010) Touch projector: mobile interaction through video. In: Proceedings of the CHI 2010, ACM Press, 2287–2296
Brignull H, Rogers Y (2003) Enticing people to interact with large public displays in public spaces. In: Proceedings of the INTERACT 2003, IOS Press, 17–24
Eaddy M, Blasko G, Babcock J, Feiner S (2004) My own private kiosk: privacy-preserving public displays. In: Proceedings of the ISWC 2004, IEEE computer society, 132–135
Hansen D, Ji Q (2010) In the eye of the beholder: a survey of models for eyes and gaze. IEEE Trans Pattern Anal Mach Intell 32(3):478–500
Jacob RJK (1991) The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Trans Inf Syst 9(2):152–169
Kukka H, Oja H, Kostakos V, Gonçalves J, Ojala T (2013) What makes you click: exploring visual signals to entice interaction on public displays. In: Proceedings of the CHI 2013, ACM Press, 1699–1708
Kumar M, Winograd T (2007) Gaze-enhanced scrolling techniques. In: Proceedings of the UIST 2007, ACM Press, 213–216
MacKenzie IS, Zhang X (2008) Eye typing using word and letter prediction and a fixation algorithm. In: Proceedings of the ETRA 2008, ACM Press, 55–58
Mardanbegi D, Hansen DW, Pederson T (2012) Eye-based head gestures. In: Proceedings of the ETRA 2012, ACM Press, 139–146
Marshall P, Morris R, Rogers Y, Kreitmayer S, Davies M (2011) Rethinking ‘multi-user’: an in-the-wild study of how groups approach a walk-up-and-use tabletop interface. In: Proceedings of the CHI 2011, ACM Press, 3033–3042
Morimoto CH, Mimica MRM (2005) Eye gaze tracking techniques for interactive applications. Comput Vis Image Underst 98(1):4–24
Müller J, Alt F, Michelis D, Schmidt A (2010) Requirements and design space for interactive public displays. In: Proceedings of the MM 2010, ACM Press, 1285–1294
Müller J, Walter R, Bailly G, Nischt M, Alt F (2012) Looking glass: a field study on noticing interactivity of a shop window. In: Proceedings of the CHI 2012, ACM Press, 297–306
Nakanishi Y, Fujii T, Kitajima K, Sato Y, Koike H (2002) Vision-based face tracking system for large displays. In: Proceedings of the UbiComp 2002, Springer, 152–159
Peltonen P, Kurvinen E, Salovaara A, Jacucci G, Ilmonen T, Evans J, Oulasvirta A, Saarikko P (2008) It’s mine, don’t touch!: interactions at a large multi-touch display in a city centre. In: Proceedings of the CHI 2008, ACM Press, 1285–1294
Ren G, Li C, O’Neill E, Willis P (2013) 3d freehand gestural navigation for interactive public displays. IEEE Comput Graph Appl 33(2):47–55
Schmidt C, Müller J, Bailly G (2013) Screenfinity: extending the perception area of content on very large public displays. In: Proceedings of the CHI 2013, ACM Press, 1719–1728
Sippl A, Holzmann C, Zachhuber D, Ferscha A (2010) Real-time gaze tracking for public displays. In: Proceedings of the AmI 2010, Springer, 167–176
Smith BA, Yin Q, Feiner SK, Nayar SK (2013) Gaze locking: passive eye contact detection for human-object interaction. In: Proceedings of the UIST 2013, ACM Press, 271–280
Smith JD, Vertegaal R, Sohn C (2005) Viewpointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis. In: Proceedings of the UIST 2005, ACM Press, 53–61
Turner J, Alexander J, Bulling A, Schmidt D, Gellersen H (2013) Eye pull, eye push: moving objects between large screens and personal devices with gaze & touch. In: Proceedings of the INTERACT 2013, Springer, 170–186
Turner J, Bulling A, Alexander J, Gellersen H (2014) Cross-device gaze-supported point-to-point content transfer. In: Proceedings of the ETRA 2014, ACM Press, 19–26
Vertegaal R, Mamuji A, Sohn C, Cheng D (2005) Media eyepliances: using eye tracking for remote control focus selection of appliances. In: Proceedings of the CHI EA 2005, ACM Press, 1861–1864
Vidal M, Bulling A, Gellersen H (2013) Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. In: Proceedings of the UbiComp 2013, ACM Press, 439–448
Vogel D, Balakrishnan R (2004) Interactive public ambient displays: transitioning from implicit to explicit, public to personal, interaction with multiple users. In: Proceedings of the UIST 2004, ACM Press, 137–146
Walter R, Bailly G, Müller J (2013) StrikeAPose: revealing mid-air gestures on public displays. In: Proceedings of the CHI 2013, ACM Press, 841–850
Zhai S, Morimoto C, Ihde S (1999) Manual and gaze input cascaded (magic) pointing. In: Proceedings of the CHI 1999, ACM Press, 246–253
Zhang Y, Bulling A, Gellersen H (2013) SideWays: a gaze interface for spontaneous interaction with situated displays. In: Proceedings of the CHI 2013, ACM Press, 851–860
Zhang Y, Bulling A, Gellersen H (2014) PCR: a calibration-free method for tracking horizontal gaze direction using a single camera. In: Proceedings of the AVI 2014, ACM Press, 129–132
Zhang Y, Müller J, Chong MK, Bulling A, Gellersen H (2014) GazeHorizon: enabling passers-by to interact with public displays by gaze. In: Proceedings of the UbiComp 2014, ACM Press, 559–563
Zhu D, Gedeon T, Taylor K (2011) Moving to the centre: a gaze-driven remote camera control for teleoperation. Interact Comput 23(1):85–95
Acknowledgments
This work was supported by the EU Marie Curie Network iCareNet under grant number 264738.
Cite this article
Zhang, Y., Chong, M.K., Müller, J. et al. Eye tracking for public displays in the wild. Pers Ubiquit Comput 19, 967–981 (2015). https://doi.org/10.1007/s00779-015-0866-8