ABSTRACT
Virtual agents hold great promise for human-computer interaction through their ability to afford embodied interaction using nonverbal human communicative cues. Gaze cues are particularly important for achieving significant high-level outcomes such as improved learning and feelings of rapport. Our goal is to explore how agents might achieve such outcomes through seemingly subtle changes in gaze behavior, and what design variables for gaze might lead to these positive outcomes. Drawing on research in human physiology, we developed a model of gaze behavior that captures these key design variables. In a user study, we investigated how manipulations of these variables might improve affiliation with the agent and learning. The results showed that an agent using affiliative gaze elicited stronger feelings of connection, while an agent using referential gaze improved participants' learning. Our model and findings offer guidelines for the design of effective gaze behaviors for virtual agents.
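To make the idea of gaze "design variables" concrete, the sketch below illustrates one such variable drawn from the physiology literature the abstract alludes to: how a gaze shift is split between eye and head rotation. This is a hypothetical illustration, not the paper's actual model; the function `plan_gaze_shift`, the `head_propensity` parameter, and the default oculomotor range are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class GazeShift:
    """Decomposition of a gaze shift into eye and head rotation (degrees)."""
    eye: float
    head: float

def plan_gaze_shift(amplitude_deg: float, head_propensity: float,
                    eye_range_deg: float = 35.0) -> GazeShift:
    """Split a desired gaze shift between the eyes and the head.

    Hypothetical design variables (not from the paper):
      amplitude_deg   -- total gaze shift toward the target
      head_propensity -- 0..1 tendency to recruit the head
      eye_range_deg   -- oculomotor range; beyond it the head must move
    """
    # The head must cover whatever the eyes cannot reach on their own,
    # plus an extra share governed by the propensity design variable.
    required_head = max(0.0, amplitude_deg - eye_range_deg)
    optional_head = head_propensity * (amplitude_deg - required_head)
    head = required_head + optional_head
    return GazeShift(eye=amplitude_deg - head, head=head)

# A small target shift with low propensity is handled by the eyes alone;
# a large shift forces head movement regardless of propensity.
small = plan_gaze_shift(20.0, head_propensity=0.0)
large = plan_gaze_shift(50.0, head_propensity=0.0)
```

In this toy formulation, raising `head_propensity` yields more head-aligned (mutual-gaze-like) orientations, while lowering it produces quick, eyes-only glances — the kind of subtle variation the study manipulates between affiliative and referential gaze.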
Index Terms: Designing effective gaze mechanisms for virtual agents