A Binary Decision Tree Based Real-Time Emotion Detection System

  • Conference paper
Advances in Visual Computing (ISVC 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4841)

Included in the following conference series: ISVC: International Symposium on Visual Computing

Abstract

This paper presents a real-time emotion detection system capable of identifying seven affective states (agreeing, concentrating, disagreeing, interested, thinking, unsure, and angry) from a near-infrared video stream. A Viola-Jones face detector is trained to locate the face within each frame. An Active Appearance Model is then used to place 23 landmark points around key areas of the eyes, brows, and mouth. A prioritized binary decision tree then detects, based on the movements of these key points, whether one of the seven emotional states occurs as frames pass. The completed system runs accurately and seamlessly on an Intel Pentium IV 2.8 GHz processor with 512 MB of memory, achieving a real-time frame rate of 36 frames per second.
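
The pipeline described above (Viola-Jones face detection, Active Appearance Model landmark fitting, then a prioritized binary decision tree over landmark motion) amounts to a simple per-frame loop. The Python/OpenCV sketch below shows one way such a loop could be wired together; it is not the authors' implementation. OpenCV's bundled Haar cascade stands in for the paper's trained Viola-Jones detector, fit_landmarks is only a stub for the AAM stage, and the index groups, thresholds, and tree tests (BROW_IDS, NOD_THRESH, classify_emotion, and so on) are illustrative assumptions.

```python
# Illustrative sketch only; not the authors' implementation.
from typing import Sequence

import cv2
import numpy as np

# OpenCV's bundled frontal-face Haar cascade stands in for the paper's
# trained Viola-Jones detector.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Hypothetical index groups into the 23-point landmark set.
BROW_IDS = list(range(0, 6))
MOUTH_IDS = list(range(17, 23))
ALL_IDS = list(range(23))

# Illustrative motion thresholds (pixels between consecutive frames).
NOD_THRESH = 4.0
BROW_THRESH = 2.0
MOUTH_THRESH = 3.0


def detect_face(gray: np.ndarray):
    """Viola-Jones stage: return the first detected face box (x, y, w, h) or None."""
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None


def fit_landmarks(gray: np.ndarray, face_box) -> np.ndarray:
    """Stub for the Active Appearance Model stage.

    A real implementation would fit the trained AAM inside `face_box` and
    return 23 (x, y) points around the eyes, brows, and mouth; here the
    points are simply spread over the box so the sketch runs end to end.
    """
    x, y, w, h = face_box
    xs = np.linspace(x, x + w, 23)
    ys = np.linspace(y, y + h, 23)
    return np.stack([xs, ys], axis=1)


def mean_dy(prev_pts: np.ndarray, curr_pts: np.ndarray, ids: Sequence[int]) -> float:
    """Average vertical displacement of the selected landmarks between frames."""
    ids = list(ids)
    return float(np.mean(curr_pts[ids, 1] - prev_pts[ids, 1]))


def classify_emotion(prev_pts: np.ndarray, curr_pts: np.ndarray) -> str:
    """Toy prioritized binary decision tree over landmark motion.

    Each node is a single threshold test; higher-priority tests come first,
    and the first test that fires decides the label. The features and
    thresholds are assumptions, not the paper's trained tree.
    """
    nod = mean_dy(prev_pts, curr_pts, ALL_IDS)      # whole-face vertical motion
    brow = mean_dy(prev_pts, curr_pts, BROW_IDS)    # brow raise/lower
    mouth = mean_dy(prev_pts, curr_pts, MOUTH_IDS)  # mouth opening/closing

    if abs(nod) > NOD_THRESH:
        return "agreeing" if nod > 0 else "disagreeing"
    if brow < -BROW_THRESH:
        return "angry"
    if brow > BROW_THRESH:
        return "interested"
    if mouth > MOUTH_THRESH:
        return "thinking"
    if mouth < -MOUTH_THRESH:
        return "unsure"
    return "concentrating"


def run(video_source=0):
    """Per-frame loop: detect face, fit landmarks, classify, repeat."""
    cap = cv2.VideoCapture(video_source)
    prev_pts = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        face = detect_face(gray)
        if face is not None:
            pts = fit_landmarks(gray, face)
            if prev_pts is not None:
                print(classify_emotion(prev_pts, pts))
            prev_pts = pts
    cap.release()
```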

Author information

Authors: A. Livingston, M.J. Seow, V.K. Asari

Editor information

George Bebis, Richard Boyle, Bahram Parvin, Darko Koracin, Nikos Paragios, Tanveer Syeda-Mahmood, Tao Ju, Zicheng Liu, Sabine Coquillart, Carolina Cruz-Neira, Torsten Müller, Tom Malzbender

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Livingston, A., Seow, MJ., Asari, V.K. (2007). A Binary Decision Tree Based Real-Time Emotion Detection System. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2007. Lecture Notes in Computer Science, vol 4841. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76858-6_43

  • DOI: https://doi.org/10.1007/978-3-540-76858-6_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76857-9

  • Online ISBN: 978-3-540-76858-6

  • eBook Packages: Computer Science, Computer Science (R0)
