Student mental state inference from unintentional body gestures using dynamic Bayesian networks

Original Paper
Journal on Multimodal User Interfaces

Abstract

Applications that interact with humans would benefit from knowing the intentions or mental states of their users. However, mental state prediction is not only uncertain but also context-dependent. In this paper, we present a dynamic Bayesian network model of the temporal evolution of students’ mental states and of the causal associations between mental states and body gestures in context. Our approach is to convert sensory descriptions of student gestures into semantic descriptions of their mental states in a classroom lecture situation. At model learning time, we use expectation maximization (EM) to estimate model parameters from partly labeled training data; at run time, we use the junction tree algorithm to infer mental states from body gesture evidence. A maximum a posteriori (MAP) classifier evaluated with leave-one-out cross-validation on labeled data from 11 students obtains a generalization accuracy of 97.4% over cases where the student reported a definite mental state, and 83.2% when we include cases where the student reported no mental state. Experimental results demonstrate the validity of our approach. Future work will explore use of the model in real-time intelligent tutoring systems.
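
To make the run-time inference step concrete, the sketch below (Python/NumPy) implements forward filtering over a simplified chain-structured DBN with one hidden mental-state node and one observed gesture node per time slice; on such a chain, the junction tree algorithm reduces to this forward recursion. All state names, gesture labels, and parameter values are illustrative placeholders, not taken from the paper, where the conditional probability tables are instead estimated with EM from partly labeled training data and the network also conditions on context.

import numpy as np

# Hypothetical label sets for illustration only; the paper defines its own
# mental-state and gesture vocabularies.
STATES = ["interested", "confused", "bored", "neutral"]
GESTURES = ["leaning_forward", "head_on_hand", "scratching_head", "sitting_still"]

rng = np.random.default_rng(0)

# Placeholder parameters; in the paper these come from EM on partly labeled data.
prior = np.full(len(STATES), 1.0 / len(STATES))           # P(S_1)
trans = rng.dirichlet(np.ones(len(STATES)), len(STATES))  # trans[i, j] = P(S_t = j | S_{t-1} = i)
emit = rng.dirichlet(np.ones(len(GESTURES)), len(STATES)) # emit[i, k] = P(G_t = k | S_t = i)

def map_mental_state(gesture_sequence):
    """Forward-filter the mental-state belief, then read off the MAP state."""
    obs = [GESTURES.index(g) for g in gesture_sequence]
    belief = prior * emit[:, obs[0]]                 # condition on the first gesture
    belief /= belief.sum()
    for k in obs[1:]:
        belief = (trans.T @ belief) * emit[:, k]     # predict one slice, then condition
        belief /= belief.sum()                       # renormalize to a distribution
    return STATES[int(belief.argmax())], belief      # MAP state and full posterior

state, posterior = map_mental_state(["sitting_still", "head_on_hand", "scratching_head"])
print(state, posterior.round(3))

Normalizing the belief at every slice keeps the recursion numerically stable over long gesture sequences; the MAP read-out on the final posterior mirrors the classification step the paper evaluates with leave-one-out cross-validation.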



Author information

Correspondence to Abdul Rehman Abbasi.


Cite this article

Abbasi, A.R., Dailey, M.N., Afzulpurkar, N.V. et al. Student mental state inference from unintentional body gestures using dynamic Bayesian networks. J Multimodal User Interfaces 3, 21–31 (2010). https://doi.org/10.1007/s12193-009-0023-7
