
Basic components of a face-to-face interaction with a conversational agent: mutual attention and deixis

Published: 12 October 2005
DOI: 10.1145/1107548.1107610

ABSTRACT

We present a series of experiments involving face-to-face interaction between an embodied conversational agent (ECA) and a human interlocutor. The main challenge is to provide the interlocutor with implicit and explicit signs of mutual interest and attention, and of awareness of the environmental conditions in which the interaction takes place. A video-realistic talking head with independent head and eye movements served as the conversational agent, interacting with a user during a simple card game that offered different levels of help and guidance. We analyzed user performance and how users perceived the quality of the assistance given by the ECA. The experiments showed that users benefit from the agent's presence and from its facial deictic cues.

Published in

sOc-EUSAI '05: Proceedings of the 2005 joint conference on Smart objects and ambient intelligence: innovative context-aware services: usages and technologies
October 2005, 316 pages
ISBN: 1595933042
DOI: 10.1145/1107548
Copyright © 2005 ACM


Publisher

Association for Computing Machinery, New York, NY, United States
