DOI: 10.1145/1088463.1088484
Article

Understanding the effect of life-like interface agents through users' eye movements

Published: 04 October 2005

Abstract

We motivate an approach to evaluating the utility of life-like interface agents that is based on human eye movements rather than questionnaires. An eye tracker is employed to obtain quantitative evidence of a user's focus of attention. The salient feature of our evaluation strategy is that it allows us to measure important properties of a user's interaction experience on a moment-by-moment basis in addition to a cumulative (spatial) analysis of the user's areas of interest. We describe an empirical study in which we compare attending behavior of subjects watching the presentation of an apartment by three types of media: an animated agent, a text box, and speech only. The investigation of users' eye movements reveals that agent behavior may trigger natural and social interaction behavior of human users.
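
The "cumulative (spatial) analysis of the user's areas of interest" mentioned above amounts to summing fixation durations per screen region and comparing those totals across the three media conditions. The Python sketch below illustrates only that step; it is not the authors' analysis pipeline, and the AOI names, rectangle coordinates, and the Fixation record are hypothetical stand-ins for whatever the eye tracker actually exports. The time stamp kept on each fixation is what a moment-by-moment analysis would additionally use.

    from collections import defaultdict
    from dataclasses import dataclass

    # Hypothetical areas of interest (AOIs) for one presentation condition,
    # given as (x_min, y_min, x_max, y_max) screen rectangles. Names and
    # coordinates are illustrative, not taken from the paper.
    AOIS = {
        "agent": (0, 0, 200, 300),         # the animated presenter
        "text_box": (0, 320, 400, 440),    # the textual description
        "apartment": (420, 0, 1020, 600),  # the apartment being presented
    }

    @dataclass
    class Fixation:
        t_ms: float         # onset time, usable for moment-by-moment analysis
        x: float            # fixation centroid, screen x
        y: float            # fixation centroid, screen y
        duration_ms: float  # fixation duration reported by the tracker

    def aoi_of(fix, aois):
        """Return the name of the AOI containing the fixation, or None."""
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
                return name
        return None

    def dwell_times(fixations, aois):
        """Cumulative (spatial) analysis: total fixation time per AOI."""
        totals = defaultdict(float)
        for fix in fixations:
            name = aoi_of(fix, aois)
            if name is not None:
                totals[name] += fix.duration_ms
        return dict(totals)

    if __name__ == "__main__":
        # Toy data standing in for one subject's recording in one condition.
        recording = [
            Fixation(0, 100, 150, 240),
            Fixation(400, 200, 380, 180),
            Fixation(700, 600, 300, 420),
        ]
        print(dwell_times(recording, AOIS))
        # -> {'agent': 240.0, 'text_box': 180.0, 'apartment': 420.0}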




      Published In

      ICMI '05: Proceedings of the 7th international conference on Multimodal interfaces
      October 2005
      344 pages
      ISBN:1595930280
      DOI:10.1145/1088463

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Author Tags

      1. animated interface agents
      2. eye tracking
      3. user study
      4. web-based presentation

      Qualifiers

      • Article

      Conference

      ICMI05

      Acceptance Rates

Overall Acceptance Rate: 453 of 1,080 submissions (42%)



      Cited By

      • (2014) A cross-platform, remotely-controlled mobile avatar simulation framework for AmI environments. SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, pp. 1-4. DOI: 10.1145/2669062.2669083. Online publication date: 24-Nov-2014.
      • (2012) 2nd International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2012). Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pp. 673-676. DOI: 10.1145/2370216.2370362. Online publication date: 5-Sep-2012.
      • (2011) Attentive user interface for interaction within virtual reality environments based on gaze analysis. Proceedings of the 14th international conference on Human-computer interaction: interaction techniques and environments - Volume Part II, pp. 204-213. DOI: 10.5555/2022466.2022491. Online publication date: 9-Jul-2011.
      • (2011) Virtual Worlds. Virtual Worlds and E-Commerce, pp. 1-15. DOI: 10.4018/978-1-61692-808-7.ch001. Online publication date: 2011.
      • (2011) Iterative User Involvement in Ambient Assisted Living Research and Development Processes. E-Health, Assistive Technologies and Applications for Assisted Living, pp. 217-243. DOI: 10.4018/978-1-60960-469-1.ch010. Online publication date: 2011.
      • (2011) Attentive User Interface for Interaction within Virtual Reality Environments Based on Gaze Analysis. Human-Computer Interaction. Interaction Techniques and Environments, pp. 204-213. DOI: 10.1007/978-3-642-21605-3_23. Online publication date: 2011.
      • (2011) Estimation of Interest from Physical Actions Captured by Familiar User Device. Whole Body Interaction, pp. 187-195. DOI: 10.1007/978-0-85729-433-3_15. Online publication date: 4-Apr-2011.
      • (2009) Analyzing the Benefits of a Novel Multiagent Approach in a Multimodal Biometrics Identification Task. IEEE Systems Journal, 3(4), 410-417. DOI: 10.1109/JSYST.2009.2035978. Online publication date: Dec-2009.
      • (2009) Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation. Universal Access in the Information Society, 8(4), 339-354. DOI: 10.1007/s10209-009-0144-5. Online publication date: 27-Oct-2009.
      • (2008) Uses of Eye Tracking Technology in Design. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 52(19), 1574-1578. DOI: 10.1177/154193120805201957. Online publication date: 1-Sep-2008.
