DOI: 10.1145/2207676.2207777

Designing effective gaze mechanisms for virtual agents

Published: 05 May 2012

ABSTRACT

Virtual agents hold great promise in human-computer interaction with their ability to afford embodied interaction using nonverbal human communicative cues. Gaze cues are particularly important to achieve significant high-level outcomes such as improved learning and feelings of rapport. Our goal is to explore how agents might achieve such outcomes through seemingly subtle changes in gaze behavior and what design variables for gaze might lead to such positive outcomes. Drawing on research in human physiology, we developed a model of gaze behavior to capture these key design variables. In a user study, we investigated how manipulations in these variables might improve affiliation with the agent and learning. The results showed that an agent using affiliative gaze elicited more positive feelings of connection, while an agent using referential gaze improved participants' learning. Our model and findings offer guidelines for the design of effective gaze behaviors for virtual agents.
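The abstract refers to a model built around a small set of gaze design variables, such as how gaze is distributed between the interlocutor and referenced objects, and how much the head participates in each gaze shift. The sketch below is a minimal, hypothetical illustration of how such variables might be parameterized in a gaze controller; the names (GazeController, affiliative_ratio, head_alignment) are illustrative assumptions, not the authors' implementation.

```python
"""Illustrative sketch only: a simplified parameterization of agent gaze.
All names and values here are hypothetical, not the paper's model."""
import random
from dataclasses import dataclass


@dataclass
class GazeShift:
    target: str            # "interlocutor" or a referenced object, e.g. "map"
    head_alignment: float  # 0.0 = eyes only, 1.0 = head fully aligns with the eyes
    duration: float        # seconds to hold the gaze before the next shift


class GazeController:
    """Chooses gaze targets from two hypothetical design variables: the share of
    gaze directed at the interlocutor (affiliative gaze) and how strongly the
    head follows the eyes during each shift."""

    def __init__(self, affiliative_ratio: float, head_alignment: float):
        self.affiliative_ratio = affiliative_ratio
        self.head_alignment = head_alignment

    def next_shift(self, reference_objects: list[str]) -> GazeShift:
        # Probabilistically look at the listener (affiliative) or at a
        # task-relevant object (referential).
        if not reference_objects or random.random() < self.affiliative_ratio:
            return GazeShift("interlocutor", self.head_alignment, duration=2.0)
        return GazeShift(random.choice(reference_objects),
                         self.head_alignment, duration=1.0)


# Example: an "affiliative" configuration looks at the listener most of the time,
# while a "referential" configuration spends more gaze on discussed objects.
affiliative_agent = GazeController(affiliative_ratio=0.8, head_alignment=0.3)
referential_agent = GazeController(affiliative_ratio=0.3, head_alignment=0.9)
print(referential_agent.next_shift(["map", "landmark"]))
```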


Published in

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012, 3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676
Copyright © 2012 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
