ABSTRACT
Non-verbal behavior, particularly eye movement, plays a fundamental role in interpersonal communication. To realize natural and intuitive human-agent interaction, virtual agents need to use this communicative channel effectively. Against this background, our research addresses the problem of emotionally expressive eye movement by describing a preliminary approach based on parameters extracted from real-time eye movement data (pupil size, blink rate, and saccades).
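The kind of parameter mapping the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual model: the function name, the arousal/valence input space, and all scaling constants are assumptions chosen only to show how pupil size, blink rate, and saccade amplitude might be driven by an emotional state.

```python
# Hypothetical sketch: mapping an emotional state (arousal, valence in [0, 1])
# to the three eye movement parameters named in the abstract.
# All constants and ranges are illustrative assumptions, not values from the paper.

def eye_movement_parameters(arousal: float, valence: float) -> dict:
    """Map arousal/valence in [0, 1] to illustrative eye movement parameters."""
    assert 0.0 <= arousal <= 1.0 and 0.0 <= valence <= 1.0
    pupil_mm = 3.0 + 2.0 * arousal           # pupils dilate with arousal
    blinks_per_min = 10.0 + 15.0 * arousal   # blink rate rises with arousal
    # Assumption: negative valence produces gaze aversion, modeled here
    # as larger saccade amplitude.
    saccade_deg = 5.0 + 10.0 * (1.0 - valence)
    return {
        "pupil_diameter_mm": round(pupil_mm, 2),
        "blink_rate_per_min": round(blinks_per_min, 1),
        "saccade_amplitude_deg": round(saccade_deg, 1),
    }

print(eye_movement_parameters(0.8, 0.2))
```

In a full agent, such parameters would then drive the face model's animation channels (e.g. MPEG-4 facial animation parameters) rather than being printed.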