
How do users perceive multimodal expressions of affects?

Published: 01 October 2018


Cited By

  • Investigating the influence of agent modality and expression on agent-mediated fairness behaviours. Journal on Multimodal User Interfaces, 17(2): 65--77. Published online: 23 May 2023. DOI: 10.1007/s12193-023-00403-y
  • Medical and health systems. In The Handbook of Multimodal-Multisensor Interfaces, pp. 423--476. Published online: 1 July 2019. DOI: 10.1145/3233795.3233808

Published In

The Handbook of Multimodal-Multisensor Interfaces: Signal Processing, Architectures, and Detection of Emotion and Cognition - Volume 2
October 2018, 2034 pages
ISBN: 9781970001716
DOI: 10.1145/3107990

Publisher

Association for Computing Machinery and Morgan & Claypool
