
Attention as an input modality for Post-WIMP interfaces using the viGaze eye tracking framework

Published in: Multimedia Tools and Applications

Abstract

Eye tracking is one of the most prominent modalities for tracking user attention during interaction with computational devices. Most current eye tracking frameworks focus on recording the user's gaze during website browsing or other tasks on a digital device; what they have in common is that they do not exploit gaze as an input modality. In this paper we describe the realization of viGaze, a framework whose main goal is to make it easy to exploit eye gaze as an input modality in various contexts. To this end, it provides features for exploring explicit and implicit gaze-based interactions in complex virtual environments. The viGaze framework is flexible and can easily be extended with other input modalities typical of Post-WIMP interfaces, such as gesture or foot input. We describe the key components of the framework and a user study conducted to test it. The study took place in a virtual retail environment, a challenging pervasive setting with complex interactions that can be supported by gaze: participants performed two gaze-based interactions with products on virtual shelves and triggered an interaction cycle between the products and an advertisement monitor placed on the shelf. We demonstrate how gaze can be used in Post-WIMP interfaces to steer users' attention to certain components of the system. We conclude by discussing the advantages provided by the viGaze framework and highlighting the potential of gaze-based interaction.
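The gaze-based shelf interactions described in the abstract rest on a standard building block of gaze input: dwell-time selection, where fixating a target region for longer than a threshold counts as an activation. The sketch below illustrates that idea in plain Python. It is not the viGaze API (whose components are detailed in the full article); `GazeSample`, `Target`, and `dwell_selections` are hypothetical names introduced here for illustration.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # screen x coordinate of the gaze point
    y: float  # screen y coordinate of the gaze point
    t: float  # timestamp in seconds

@dataclass
class Target:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, s: GazeSample) -> bool:
        # Axis-aligned bounding-box hit test for a gaze sample.
        return (self.x <= s.x <= self.x + self.w and
                self.y <= s.y <= self.y + self.h)

def dwell_selections(samples, targets, dwell=0.5):
    """Return the names of targets selected by fixating them
    for at least `dwell` seconds, in order of selection."""
    selections = []
    current = None   # target the gaze currently rests on
    start = 0.0      # time the current fixation began
    fired = False    # selection already triggered for this fixation
    for s in samples:
        hit = next((t for t in targets if t.contains(s)), None)
        if hit is not current:
            # Gaze moved to a different target (or off all targets):
            # restart the dwell timer.
            current, start, fired = hit, s.t, False
        elif hit is not None and not fired and s.t - start >= dwell:
            selections.append(hit.name)
            fired = True  # fire once per fixation
    return selections
```

A usage example with a 500 ms threshold: a stream of samples resting on a "product" region for ~0.8 s yields one selection, while a brief glance yields none. Real frameworks add noise filtering and fixation detection on top of this loop; the dwell threshold trades off speed against accidental activations (the "Midas touch" problem).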




Author information

Correspondence to Johannes Schöning.


About this article


Cite this article

Giannopoulos, I., Schöning, J., Krüger, A. et al. Attention as an input modality for Post-WIMP interfaces using the viGaze eye tracking framework. Multimed Tools Appl 75, 2913–2929 (2016). https://doi.org/10.1007/s11042-014-2412-5
