Multimodal Affect Detection from Physiological and Facial Features during ITS Interaction

Conference paper

Artificial Intelligence in Education (AIED 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6738)

Abstract

Multimodal approaches are increasingly used for affect detection. This paper proposes a model that fuses physiological signals measuring learners’ heart activity with their facial expressions to detect learners’ affective states while they interact with an Intelligent Tutoring System (ITS). It studies machine learning and fusion techniques that classify the system’s automated feedback from the individual channels and from their feature-level fusion. It also evaluates the classification performance of fusion models in multimodal systems, identifying the effects of fusion over the individual modalities.
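The core technique named in the abstract, feature-level fusion, amounts to concatenating the per-instance feature vectors from each channel before training a single classifier. Below is a minimal sketch of that idea, assuming feature vectors have already been extracted per instance; the synthetic data, feature dimensions, and the SVM classifier are illustrative assumptions, not the paper's actual pipeline.

    # Minimal sketch of feature-level fusion across two modalities.
    # Data, dimensions, and classifier choice are illustrative only.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 200                              # synthetic interaction instances
    heart = rng.normal(size=(n, 8))      # e.g., heart-activity features per window (assumed)
    face = rng.normal(size=(n, 12))      # e.g., facial-expression features (assumed)
    labels = rng.integers(0, 2, size=n)  # feedback class labels (illustrative binary)

    # Feature-level fusion: concatenate each instance's feature vectors
    fused = np.hstack([heart, face])

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

    # Compare each single channel against the fused representation
    for name, X in [("heart", heart), ("face", face), ("fused", fused)]:
        acc = cross_val_score(clf, X, labels, cv=5).mean()
        print(f"{name:>6}: mean CV accuracy = {acc:.2f}")

Concatenating features before classification is what distinguishes feature-level fusion from decision-level fusion, where each channel is classified separately and the per-channel outputs are combined afterwards.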

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hussain, M.S., Calvo, R.A. (2011). Multimodal Affect Detection from Physiological and Facial Features during ITS Interaction. In: Biswas, G., Bull, S., Kay, J., Mitrovic, A. (eds) Artificial Intelligence in Education. AIED 2011. Lecture Notes in Computer Science (LNAI), vol. 6738. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21869-9_73

  • DOI: https://doi.org/10.1007/978-3-642-21869-9_73

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21868-2

  • Online ISBN: 978-3-642-21869-9

  • eBook Packages: Computer Science, Computer Science (R0)
