DOI: 10.1145/2818346.2820739

Accuracy vs. Availability Heuristic in Multimodal Affect Detection in the Wild

Published: 09 November 2015

Abstract

This paper discusses multimodal affect detection from a fusion of facial expressions and interaction features derived from students' interactions with an educational game in the noisy real-world context of a computer-enabled classroom. Log data of students' interactions with the game and face videos from 133 students were recorded in a computer-enabled classroom over a two-day period. Human observers provided live annotations of learning-centered affective states such as engagement, confusion, and frustration. The face-only detectors were more accurate than the interaction-only detectors, and multimodal affect detectors did not show any substantial improvement in accuracy over the face-only detectors. However, the face-only detectors were applicable to only 65% of cases due to face registration errors caused by excessive movement, occlusion, poor lighting, and other factors. Multimodal fusion techniques improved the applicability of the detectors to 98% of cases without sacrificing classification accuracy. Balancing the accuracy vs. applicability tradeoff appears to be an important feature of multimodal affect detection.
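The accuracy-vs-availability tradeoff described above can be sketched as a simple fallback fusion: prefer the more accurate face channel whenever face registration succeeds, and fall back to the always-available interaction channel otherwise. This is an illustrative sketch under assumed conditions, not the paper's implementation; the detector functions and scores below are hypothetical placeholders.

```python
# Illustrative sketch of the accuracy-vs-availability tradeoff: the
# face-based detector is more accurate but unavailable when face
# registration fails; falling back to an interaction-based detector
# raises coverage. Detector outputs here are hypothetical placeholders.

from typing import Optional

def face_detector(frame_ok: bool) -> Optional[float]:
    """Returns an affect score, or None when face registration fails."""
    return 0.9 if frame_ok else None

def interaction_detector() -> float:
    """Log-based detector: always available, typically less accurate."""
    return 0.7

def fused_prediction(frame_ok: bool) -> float:
    """Fallback fusion: prefer the face channel, else use interaction."""
    face = face_detector(frame_ok)
    return face if face is not None else interaction_detector()

# Coverage comparison over a stream where ~65% of cases have a usable face,
# mirroring the proportion reported in the abstract.
frames = [True] * 65 + [False] * 35
face_only = [face_detector(f) for f in frames]
fused = [fused_prediction(f) for f in frames]

face_coverage = sum(p is not None for p in face_only) / len(frames)
fused_coverage = sum(p is not None for p in fused) / len(frames)
print(face_coverage, fused_coverage)  # face-only covers 65%, fused covers 100%
```

The same idea generalizes beyond a hard fallback: the paper's point is that fusion schemes can recover availability lost to face registration errors without giving up the face channel's accuracy where it exists.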




Published In

ICMI '15: Proceedings of the 2015 ACM on International Conference on Multimodal Interaction
November 2015
678 pages
ISBN: 9781450339124
DOI: 10.1145/2818346

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. affect
    2. affect detection
    3. facial expressions
    4. interaction
    5. missing data

    Qualifiers

    • Research-article


    Conference

ICMI '15: International Conference on Multimodal Interaction
November 9-13, 2015
Seattle, Washington, USA

    Acceptance Rates

ICMI '15 Paper Acceptance Rate: 52 of 127 submissions, 41%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%



    Article Metrics

    • Downloads (Last 12 months)146
    • Downloads (Last 6 weeks)16
    Reflects downloads up to 16 Feb 2025

Cited By
    • (2024) Sensor-free Affect Detection in Learning Environments: A Systematic Literature Review. Revista Brasileira de Informática na Educação 32, 679-717. DOI: 10.5753/rbie.2024.4362. Online publication date: 21-Nov-2024
    • (2024) Preliminary study on the feasibility of approximating children's engagement level from their emotions estimation by a picture-based, three-model AI in a family-robot cohabitation scenario. Advanced Robotics, 1-19. DOI: 10.1080/01691864.2024.2415093. Online publication date: 22-Oct-2024
    • (2023) Food Choices after Cognitive Load: An Affective Computing Approach. Sensors 23(14), 6597. DOI: 10.3390/s23146597. Online publication date: 21-Jul-2023
    • (2023) Dyadic Affect in Parent-Child Multimodal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis. IEEE Transactions on Affective Computing 14(4), 3345-3361. DOI: 10.1109/TAFFC.2022.3178689. Online publication date: 1-Oct-2023
    • (2023) Engagement Detection and Its Applications in Learning: A Tutorial and Selective Review. Proceedings of the IEEE 111(10), 1398-1422. DOI: 10.1109/JPROC.2023.3309560. Online publication date: Oct-2023
    • (2023) The dynamics of Brazilian students' emotions in digital learning systems. International Journal of Artificial Intelligence in Education 34(2), 519-544. DOI: 10.1007/s40593-023-00339-0. Online publication date: 5-Jul-2023
    • (2023) Multi-modal Affect Detection Using Thermal and Optical Imaging in a Gamified Robotic Exercise. International Journal of Social Robotics 16(5), 981-997. DOI: 10.1007/s12369-023-01066-1. Online publication date: 31-Oct-2023
    • (2022) The Evaluation of Learner Experience in Serious Games. Research Anthology on Developments in Gamification and Game-Based Learning, 1521-1548. DOI: 10.4018/978-1-6684-3710-0.ch073. Online publication date: 2022
    • (2022) Psychological Measurement in the Information Age: Machine-Learned Computational Models. Current Directions in Psychological Science 31(1), 76-87. DOI: 10.1177/09637214211056906. Online publication date: 14-Feb-2022
    • (2021) What's Fair is Fair: Detecting and Mitigating Encoded Bias in Multimodal Models of Museum Visitor Attention. Proceedings of the 2021 International Conference on Multimodal Interaction, 258-267. DOI: 10.1145/3462244.3479943. Online publication date: 18-Oct-2021
