DOI: 10.1145/2797143.2797165

Research Article

Exploiting the Use of Ensemble Classifiers to Enhance the Precision of User's Emotion Classification

Published: 25 September 2015

Abstract

A growing number of studies in Human-Computer Interaction (HCI) attest to the importance of taking account of emotional factors in interactions with computer systems. By getting to know the emotions of users, artificial agents can influence human feelings with a view to stimulating them in particular or everyday activities. Thus, one of the great challenges in HCI is to enable computer systems to recognize and interpret the feelings of their users. This article sets out a functional Ensemble model for classifying emotions on the basis of the users' facial motor expressions. The results described here show that the proposed Ensemble classification achieves higher accuracy in classifying feelings than can be obtained with a single classifier.
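As a rough sketch of the general technique the abstract describes (majority-voting ensemble classification), and not the authors' actual pipeline, the Python example below compares individual classifiers against a hard-voting ensemble using scikit-learn. The base learners, the synthetic features standing in for facial-landmark measurements, and all parameters are illustrative assumptions.

from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Placeholder data: synthetic 20-dimensional feature vectors standing in
# for facial-landmark measurements, labelled with one of 6 emotion classes.
X, y = make_classification(n_samples=600, n_features=20, n_informative=12,
                           n_classes=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three heterogeneous base classifiers (an assumption; the paper's exact
# base learners may differ).
base = [("knn", KNeighborsClassifier()),
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("nb", GaussianNB())]

# Hard voting: each base model casts one vote per sample and the majority
# label wins.
ensemble = VotingClassifier(estimators=base, voting="hard")
ensemble.fit(X_train, y_train)

for name, clf in base:
    clf.fit(X_train, y_train)
    print(f"{name:>8} accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
print(f"ensemble accuracy: {accuracy_score(y_test, ensemble.predict(X_test)):.3f}")

On real facial-expression features the same comparison applies: the majority vote tends to smooth out the errors of any individual base model, which is the single-versus-ensemble effect the abstract reports.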




      Published In

      EANN '15: Proceedings of the 16th International Conference on Engineering Applications of Neural Networks (INNS)
      September 2015
      266 pages
      ISBN: 9781450335805
      DOI: 10.1145/2797143
      © 2015 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

      In-Cooperation

      • Aristotle University of Thessaloniki
      • INNS: International Neural Network Society

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 25 September 2015


      Author Tags

      1. Emotion Classification
      2. Ensemble Classification
      3. FaceTracker
      4. Human-Computer Interaction (HCI)

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      16th EANN workshops

      Acceptance Rates

      EANN '15 Paper Acceptance Rate: 36 of 60 submissions, 60%
      Overall Acceptance Rate: 36 of 60 submissions, 60%


      Article Metrics

      • Downloads (last 12 months): 3
      • Downloads (last 6 weeks): 1
      Reflects downloads up to 17 Feb 2025

      Cited By
      • (2023) A Comprehensive Study of Emotional Responses in AI-Enhanced Interactive Installation Art. Sustainability 15(22), 15830. DOI: 10.3390/su152215830. Online publication date: 10-Nov-2023
      • (2023) Survey on Emotion Sensing Using Mobile Devices. IEEE Transactions on Affective Computing 14(4), 2678-2696. DOI: 10.1109/TAFFC.2022.3220484. Online publication date: 1-Oct-2023
      • (2019) The Relation of Satisfaction, Self-Confidence and Emotion in a Simulated Environment. International Journal of Nursing Education Scholarship 16(1). DOI: 10.1515/ijnes-2018-0009. Online publication date: 23-Feb-2019
      • (2019) A module-based framework to emotion recognition by speech: a case study in clinical simulation. Journal of Ambient Intelligence and Humanized Computing 14(11), 15513-15522. DOI: 10.1007/s12652-019-01280-8. Online publication date: 1-Apr-2019
      • (2019) An intelligent and generic approach for detecting human emotions: a case study with facial expressions. Soft Computing. DOI: 10.1007/s00500-019-04411-7. Online publication date: 8-Oct-2019
      • (2017) Assessing users' emotion at interaction time. Soft Computing - A Fusion of Foundations, Methodologies and Applications 21(18), 5309-5323. DOI: 10.1007/s00500-016-2115-0. Online publication date: 1-Sep-2017
      • (2017) An Ensemble Classifiers Approach for Emotion Classification. Intelligent Interactive Multimedia Systems and Services 2017, 99-108. DOI: 10.1007/978-3-319-59480-4_11. Online publication date: 28-May-2017
