Towards cognitively grounded gaze-controlled interfaces

  • Original Article

Abstract

Gaze-controlled interfaces have become a viable alternative to hand-input-based displays and are of particular value to the field of assistive technologies, allowing people with motor disabilities to partake in activities that would otherwise be inaccessible to them. The present paper gives an overview of the key problems associated with the user experience in gaze-controlled human–computer interfaces and introduces two areas of psychological research that could contribute to the development of gaze-controlled interfaces that give a more intuitive sense of control and are less likely to interfere with ongoing cognitive processes. Such interfaces are referred to as cognitively grounded. The two areas of psychological research that inform the design of cognitively grounded gaze-controlled interfaces are the sense of agency and embodied cognition. This overview builds on findings within these areas and outlines research questions essential to the design of such interfaces.
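
To make concrete the kind of interaction mechanism at issue, the sketch below illustrates dwell-time selection, a standard activation technique in gaze-controlled interfaces: an on-screen target is activated once gaze has rested on it for a fixed interval. This is a minimal, hypothetical Python sketch rather than code from the paper; the Target and DwellSelector names, the 0.5 s dwell threshold, and the per-sample update() loop are all illustrative assumptions.

    import time
    from dataclasses import dataclass

    @dataclass
    class Target:
        # A rectangular on-screen element selectable by gaze (hypothetical).
        name: str
        x: float
        y: float
        width: float
        height: float

        def contains(self, gx: float, gy: float) -> bool:
            return (self.x <= gx <= self.x + self.width
                    and self.y <= gy <= self.y + self.height)

    class DwellSelector:
        # Activates a target once gaze has rested on it for dwell_time seconds.
        def __init__(self, targets, dwell_time=0.5):
            self.targets = targets
            self.dwell_time = dwell_time
            self._fixated = None     # target the gaze currently rests on
            self._entered_at = None  # time at which the gaze entered it

        def update(self, gx, gy, now=None):
            # Feed one gaze sample (screen coordinates); return the activated target or None.
            now = time.monotonic() if now is None else now
            hit = next((t for t in self.targets if t.contains(gx, gy)), None)
            if hit is not self._fixated:
                # Gaze moved to a different target (or off all targets): restart the dwell timer.
                self._fixated = hit
                self._entered_at = now if hit is not None else None
                return None
            if hit is not None and now - self._entered_at >= self.dwell_time:
                self._entered_at = now  # require a fresh dwell before the next activation
                return hit
            return None

    # Illustrative use: one update() call per gaze sample from an eye tracker.
    buttons = [Target("yes", 0, 0, 100, 50), Target("no", 120, 0, 100, 50)]
    selector = DwellSelector(buttons, dwell_time=0.5)
    selected = selector.update(gx=30, gy=20)  # yields the "yes" target after ~0.5 s of sustained fixation

The fixed threshold and the timer reset after each activation are crude safeguards against unintended selections (the "Midas touch" problem); how such mechanisms shape the user's sense of control and interact with ongoing cognition is what the sense-of-agency and embodied-cognition research discussed in this overview speaks to.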

Author information

Corresponding author

Correspondence to Nadiya Slobodenyuk.

About this article

Cite this article

Slobodenyuk, N. Towards cognitively grounded gaze-controlled interfaces. Pers Ubiquit Comput 20, 1035–1047 (2016). https://doi.org/10.1007/s00779-016-0970-4
