Abstract
Hand-over-face gestures, a subset of emotional body language, are overlooked by automatic affect inference systems. We propose the use of hand-over-face gestures as a novel affect cue for automatic inference of cognitive mental states. Moreover, affect recognition systems rely on publicly available datasets; an approach is often only as good as the data it is trained on. We present the collection and annotation methodology of a 3D multimodal corpus of 108 audio/video segments of natural complex mental states. The corpus includes spontaneous facial expressions and hand gestures labelled using crowd-sourcing, and is publicly available.
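The abstract mentions labelling via crowd-sourcing. As a minimal illustrative sketch (not the authors' published pipeline), crowd labels for each video segment can be aggregated by majority vote, and the reliability of a fixed-size rater panel can be checked with Fleiss' kappa; the function and label names below are hypothetical:

```python
# Illustrative sketch: aggregating crowd-sourced segment labels by majority
# vote and measuring inter-rater agreement with Fleiss' kappa. This is an
# assumption about how such labels might be processed, not the paper's method.
from collections import Counter

def majority_label(labels):
    """Return the most frequent label among crowd annotations for one segment."""
    return Counter(labels).most_common(1)[0][0]

def fleiss_kappa(counts):
    """Fleiss' kappa for an N-items x k-categories matrix of rating counts.

    counts[i][j] = number of raters who assigned category j to item i;
    every item must be rated by the same number of raters.
    """
    N = len(counts)
    n = sum(counts[0])  # raters per item (assumed constant)
    k = len(counts[0])
    # Per-item observed agreement P_i
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N
    # Marginal category proportions and expected chance agreement P_e
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)
```

For example, three items each rated identically by four raters yield perfect agreement (kappa = 1), while an even split across two categories yields agreement below chance.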
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Mahmoud, M., Baltrušaitis, T., Robinson, P., Riek, L.D. (2011). 3D Corpus of Spontaneous Complex Mental States. In: D'Mello, S., Graesser, A., Schuller, B., Martin, J.-C. (eds) Affective Computing and Intelligent Interaction. ACII 2011. Lecture Notes in Computer Science, vol 6974. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24600-5_24
Print ISBN: 978-3-642-24599-2
Online ISBN: 978-3-642-24600-5