
4D Affect Detection: Improving Frustration Detection in Game-Based Learning with Posture-Based Temporal Data Fusion

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11625)

Abstract

Recent years have seen growing interest in utilizing sensors to detect learner affect. Modeling frustration has particular significance because of its central role in learning. However, sensor-based affect detection poses important challenges. Motion-tracking cameras produce vast streams of spatial and temporal data, but relatively few systems have harnessed this data successfully to produce accurate run-time detectors of learner frustration outside of the laboratory. In this paper, we introduce a data-driven framework that leverages spatial and temporal posture data to detect learner frustration using deep neural network-based data fusion techniques. To train and validate the detectors, we utilize posture data collected with Microsoft Kinect sensors from students interacting with a game-based learning environment for emergency medical training. Ground-truth labels of learner frustration were obtained using the BROMP quantitative observation protocol. Results show that deep neural network-based late fusion techniques that combine spatial and temporal data yield significant improvements to frustration detection relative to baseline models.
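The late fusion approach named in the abstract can be illustrated with a minimal sketch. In decision-level (late) fusion, separately trained modality-specific detectors each emit a probability of frustration, and the outputs are combined afterward, for example by weighted averaging. The function name, the averaging rule, and the example probabilities below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def late_fusion(spatial_probs, temporal_probs, w_spatial=0.5):
    """Decision-level (late) fusion: combine per-modality P(frustrated)
    estimates by weighted averaging, then threshold at 0.5.
    Weights and threshold are illustrative, not from the paper."""
    spatial_probs = np.asarray(spatial_probs, dtype=float)
    temporal_probs = np.asarray(temporal_probs, dtype=float)
    fused = w_spatial * spatial_probs + (1.0 - w_spatial) * temporal_probs
    return fused, (fused >= 0.5).astype(int)

# Hypothetical per-observation outputs from two separately trained detectors:
spatial = [0.80, 0.30, 0.55]   # e.g. a network over posture snapshots
temporal = [0.60, 0.20, 0.35]  # e.g. a network over posture sequences
probs, labels = late_fusion(spatial, temporal)
```

The key design property late fusion exploits is that each modality's detector can be trained and tuned independently before their decisions are combined, in contrast to early fusion, which concatenates raw features into a single model.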



Acknowledgements

We wish to thank Dr. Jeanine DeFalco and Dr. Benjamin Goldberg at the U.S. Army Combat Capabilities Development Command – Simulation and Training Technology Center (CCDC-STTC), Dr. Mike Matthews and COL James Ness at the United States Military Academy, and Dr. Robert Sottilare at SoarTech for their assistance in facilitating this research. The research was supported by the U.S. Army Research Laboratory under cooperative agreement #W911NF-13-2-0008. Any opinions, findings, and conclusions expressed in this paper are those of the authors and do not necessarily reflect the views of the U.S. Army.

Author information

Correspondence to Nathan L. Henderson, Jonathan P. Rowe, Bradford W. Mott, Keith Brawner, Ryan Baker or James C. Lester.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Henderson, N.L., Rowe, J.P., Mott, B.W., Brawner, K., Baker, R., Lester, J.C. (2019). 4D Affect Detection: Improving Frustration Detection in Game-Based Learning with Posture-Based Temporal Data Fusion. In: Isotani, S., Millán, E., Ogan, A., Hastings, P., McLaren, B., Luckin, R. (eds) Artificial Intelligence in Education. AIED 2019. Lecture Notes in Computer Science(), vol 11625. Springer, Cham. https://doi.org/10.1007/978-3-030-23204-7_13


  • DOI: https://doi.org/10.1007/978-3-030-23204-7_13


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-23203-0

  • Online ISBN: 978-3-030-23204-7

  • eBook Packages: Computer Science (R0)
