DOI: 10.1145/3549865.3549908

Do Machines Better Understand Synthetic Facial Expressions than People?

Published: 09 September 2022

Abstract

The recognition of facial expressions has been a frequently addressed topic since the rise of Artificial Intelligence. From the point of view of human-computer interaction (HCI), interest is also increasing, as the emotions conveyed through facial expressions carry a great deal of information in non-verbal communication. Nowadays, neural networks are among the most widely used computational learning systems for recognizing and analyzing emotions. Generally, the main effort is devoted to increasing the performance of machine learning (ML) models in terms of accuracy. However, humans themselves are not particularly good at distinguishing emotions. In this work, we raise the question of whether the validation of such models should rely only on performance measures, or whether we should also try to emulate human behavior. We attempt to give a fair answer by performing two experiments with both human participants and machine learning techniques.
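
The comparison the abstract proposes comes down to scoring a model and human raters on the same labelled stimuli. As a minimal, hypothetical sketch (not taken from the paper: the emotion labels, sample counts, and prediction arrays below are made-up placeholders), comparing overall and per-emotion recognition accuracy could look like this in Python:

```python
import numpy as np

# Assumed label set: the six universal emotions commonly used in
# facial expression recognition studies (a placeholder, not the
# paper's exact protocol).
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def per_class_accuracy(y_true, y_pred):
    """Fraction of correctly recognized samples for each emotion class."""
    return np.array([
        (y_pred[y_true == c] == c).mean() if (y_true == c).any() else np.nan
        for c in range(len(EMOTIONS))
    ])

# Made-up ground truth and predictions (indices into EMOTIONS); in the
# paper's setting these would come from an ML model and from human
# participants judging the same synthetic expression images.
rng = np.random.default_rng(42)
y_true = rng.integers(0, 6, size=120)
y_model = np.where(rng.random(120) < 0.80, y_true, rng.integers(0, 6, size=120))
y_human = np.where(rng.random(120) < 0.70, y_true, rng.integers(0, 6, size=120))

for name, y_pred in (("model", y_model), ("human raters", y_human)):
    overall = (y_pred == y_true).mean()
    per_class = per_class_accuracy(y_true, y_pred)
    print(f"{name}: overall={overall:.2f}",
          {e: round(a, 2) for e, a in zip(EMOTIONS, per_class)})
```

Per-class scores matter here because humans and models typically diverge on specific emotions (e.g., fear vs. surprise confusions) even when overall accuracy looks similar.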


Cited By

  • (2024) On the Convenience of Using 32 Facial Expressions to Recognize the 6 Universal Emotions. Information Systems and Technologies, 625–634. https://doi.org/10.1007/978-3-031-45645-9_60. Online publication date: 14-Feb-2024.
  • (2023) UIBVFED-Mask: A Dataset for Comparing Facial Expressions with and without Face Masks. Data 8(1), 17. https://doi.org/10.3390/data8010017. Online publication date: 11-Jan-2023.
  • (2023) UIBVFEDPlus-Light: Virtual facial expression dataset with lighting. PLOS ONE 18(9), e0287006. https://doi.org/10.1371/journal.pone.0287006. Online publication date: 29-Sep-2023.


Published In

Interacción '22: Proceedings of the XXII International Conference on Human Computer Interaction
September 2022
104 pages
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. convolutional neural networks
  2. facial expression datasets
  3. facial expression recognition
  4. hci
  5. machine learning
  6. synthetic avatars

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Funding Sources

  • Ministerio de Ciencia e Innovación

Conference

Interacción 2022

Acceptance Rates

Overall Acceptance Rate 109 of 163 submissions, 67%

