EEMML: the emotional eye movement animation toolkit

Abstract

Eye movement plays an important role in face-to-face communication in that it conveys nonverbal information and emotional intent beyond speech. As "a window to the mind," the eyes and their behavior are tightly coupled with human cognitive processes. In this paper, we propose the Emotional Eye Movement Markup Language (EEMML), an emotional eye movement animation scripting tool that enables authors to describe and generate emotional eye movement in virtual agents. The language can describe eye movement parameters we derived from a facial expression database, as well as real-time eye movement data (pupil size, blink rate, and saccades). An EEMML script supplies our eye movement generator with one or more eye movement actions in sequence. The language is extensible, so new rules can be added quickly, and it is designed to plug into larger human-agent or agent-agent interaction systems. We present an evaluation in which subjects assessed EEMML and gave their feedback; the results indicate the validity of our approach.



Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 60873269), the International Science and Technology Cooperation Program of China (No. 2010DFA11990), and the Innovation Foundation of BUAA for PhD Graduates.

Author information

Corresponding author

Correspondence to Zheng Li.

About this article

Cite this article

Li, Z., Mao, X. EEMML: the emotional eye movement animation toolkit. Multimed Tools Appl 60, 181–201 (2012). https://doi.org/10.1007/s11042-011-0816-z

