DOI: 10.1145/2254556.2254678

Poster

Automatic natural expression recognition using head movement and skin color features

Published: 21 May 2012

Abstract

Significant progress has been made in automatic facial expression recognition, yet most state-of-the-art approaches are markedly more reliable on acted expressions than on natural ones. User interfaces that infer users' affective states from facial expressions need to be most accurate during naturalistic interactions. This paper presents a study in which head movement features are used to recognize naturalistic expressions of affect. The International Affective Picture System (IAPS) collection was used as the stimulus for eliciting different affective states. Machine learning techniques are applied to classify users' expressions based on their head position and skin color changes. The proposed approach achieves reasonable accuracy in detecting three levels of valence and arousal with a user-dependent model during naturalistic human-computer interaction.
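The abstract describes classifying affect from head position and skin-color changes with machine learning, but does not specify the tracker, feature set, or classifier. The following Python sketch is therefore purely illustrative: it summarizes per-frame head positions and mean skin color into window-level features, and fits a minimal user-dependent nearest-centroid classifier on synthetic data. All function names, feature choices, and numbers are assumptions, not the paper's method.

```python
# Illustrative sketch only: the paper's actual pipeline (face tracker,
# feature set, classifier) is not described in the abstract.
import numpy as np

def window_features(head_xy, skin_rgb):
    """Summarize one interaction window into a small feature vector.

    head_xy  : (T, 2) array of head-center positions per frame
    skin_rgb : (T, 3) array of mean facial skin color per frame
    """
    dxy = np.diff(head_xy, axis=0)
    speed = np.linalg.norm(dxy, axis=1)        # per-frame head speed
    dcol = np.abs(np.diff(skin_rgb, axis=0))   # per-frame skin-color change
    return np.array([
        speed.mean(), speed.std(), speed.max(),  # head-movement features
        dcol.mean(), dcol.std(),                 # skin-color features
    ])

class NearestCentroid:
    """Minimal user-dependent classifier: one centroid per affect level."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0)
                                    for c in self.classes_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Synthetic demo: three hypothetical arousal levels with increasing motion.
rng = np.random.default_rng(0)
X, y = [], []
for label, motion in [(0, 0.5), (1, 2.0), (2, 5.0)]:
    for _ in range(20):
        head = np.cumsum(rng.normal(0, motion, size=(30, 2)), axis=0)
        skin = 120 + rng.normal(0, 1 + motion, size=(30, 3))
        X.append(window_features(head, skin))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = NearestCentroid().fit(X, y)
acc = (clf.predict(X) == y).mean()
```

A user-dependent model like this is trained and evaluated on windows from a single user, which is consistent with the abstract's setup; the nearest-centroid classifier is only one simple stand-in for whichever learner the authors actually used.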

References

[1] R. W. Picard, Affective Computing. Cambridge, MA: The MIT Press, 2000.
[2] R. A. Calvo and S. D'Mello, Eds., New Perspectives on Affect and Learning Technologies, vol. 3. New York: Springer, 2011.
[3] Z. Ambadar, J. W. Schooler, and J. F. Cohn, "Deciphering the Enigmatic Face: The Importance of Facial Dynamics in Interpreting Subtle Facial Expressions," Psychological Science, vol. 16, no. 5, pp. 403-410, 2005.
[4] M. Hoque and R. W. Picard, "Acted vs. natural frustration and delight: Many people smile in natural frustration," in Proc. IEEE Int. Conf. Automatic Face & Gesture Recognition (FG 2011), 2011, pp. 354-359.
[5] J. Cohn and K. Schmidt, "The timing of facial motion in posed and spontaneous smiles," International Journal of Wavelets, Multiresolution and Information Processing, vol. 2, pp. 1-12, 2004.
[6] P. Ekman and W. V. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press, 1978.
[7] J. A. Russell, "A Circumplex Model of Affect," Journal of Personality and Social Psychology, vol. 39, no. 6, pp. 1161-1178, 1980.
[8] J. F. Cohn, "Foundations of Human Computing: Facial Expression and Emotion," in Proc. 8th ACM Int. Conf. Multimodal Interfaces (ICMI '06), 2006, pp. 233-238.
[9] M. E. Hoque, R. el Kaliouby, and R. W. Picard, "When Human Coders (and Machines) Disagree on the Meaning of Facial Affect in Spontaneous Videos," in Proc. 9th Int. Conf. Intelligent Virtual Agents (IVA 2009), 2009, pp. 337-343.
[10] R. A. Calvo and S. D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Transactions on Affective Computing, vol. 1, no. 1, pp. 18-37, 2010.
[11] S. Craig, A. Graesser, J. Sullins, and B. Gholson, "Affect and learning: an exploratory look into the role of affect in learning with AutoTutor," Learning, Media & Technology, vol. 29, no. 3, pp. 241-250, Oct. 2004.
[12] M. A. Nicolaou, H. Gunes, and M. Pantic, "Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space," IEEE Transactions on Affective Computing, vol. 2, no. 2, pp. 92-105, 2011.
[13] D. Glowinski, A. Camurri, G. Volpe, and N. Dael, "Technique for automatic emotion recognition by body gesture analysis," in Proc. IEEE Computer Vision and Pattern Recognition Workshops (CVPRW '08), 2008, pp. 1-6.
[14] P. Lang and M. Bradley, "International Affective Picture System (IAPS): Technical manual and affective ratings," 1997.
[15] M. S. Hussain, O. Alzoubi, R. A. Calvo, and S. D'Mello, "Affect Detection from Multichannel Physiology during Learning Sessions with AutoTutor," in Artificial Intelligence in Education, G. Biswas, S. Bull, J. Kay, and A. Mitrovic, Eds. Springer, 2011, pp. 131-138.
[16] P. Viola and M. Jones, "Rapid object detection using a boosted cascade of simple features," in Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR '01), 2001, pp. 511-518.
[17] M. A. Hall, "Correlation-based Feature Selection for Discrete and Numeric Class Machine Learning," in Proc. 17th Int. Conf. Machine Learning (ICML '00), 2000, pp. 359-366.



Published In

AVI '12: Proceedings of the International Working Conference on Advanced Visual Interfaces
May 2012
846 pages
ISBN:9781450312875
DOI:10.1145/2254556

Sponsors

  • Consulta Umbria SRL
  • University of Salerno

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. affective user interface
  2. emotion recognition
  3. facial expression recognition
  4. machine learning

Qualifiers

  • Poster

Conference

AVI'12
Sponsor:
  • University of Salerno

Acceptance Rates

Overall Acceptance Rate 128 of 490 submissions, 26%

Citations

Cited By

  • (2021) "Bi-modal Emotion Recognition via Broad Learning System," 2021 China Automation Congress (CAC), pp. 2143-2148, 22 Oct 2021. DOI: 10.1109/CAC53003.2021.9727610
  • (2021) "Detecting naturalistic expression of emotions using physiological signals while playing video games," Journal of Ambient Intelligence and Humanized Computing, vol. 14, no. 2, pp. 1133-1146, 9 Jul 2021. DOI: 10.1007/s12652-021-03367-7
  • (2021) "A proficient approach for face detection and recognition using machine learning and high-performance computing," Concurrency and Computation: Practice and Experience, vol. 34, no. 3, 25 Sep 2021. DOI: 10.1002/cpe.6582
  • (2019) "Facial Expression Recognition Using Computer Vision: A Systematic Review," Applied Sciences, vol. 9, no. 21, article 4678, 2 Nov 2019. DOI: 10.3390/app9214678
  • (2019) "Increased affect-arousal in VR can be detected from faster body motion with increased heart rate," Proc. ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pp. 1-6, 21 May 2019. DOI: 10.1145/3306131.3317022
  • (2017) "Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate," IEEE Transactions on Affective Computing, vol. 8, no. 1, pp. 15-28, 1 Jan 2017. DOI: 10.1109/TAFFC.2016.2515084
  • (2013) "Body Movements for Affective Expression: A Survey of Automatic Recognition and Generation," IEEE Transactions on Affective Computing, vol. 4, no. 4, pp. 341-359, Oct 2013. DOI: 10.1109/T-AFFC.2013.29
  • (2012) "A dynamic approach for detecting naturalistic affective states from facial videos during HCI," Proc. 25th Australasian Joint Conference on Advances in Artificial Intelligence, pp. 170-181, 4 Dec 2012. DOI: 10.1007/978-3-642-35101-3_15
