
Using Automatic Facial Expression Classification for Contents Indexing Based on the Emotional Component

  • Conference paper
Embedded and Ubiquitous Computing (EUC 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4096)

Abstract

Within the last decade, the development of new technologies in the multimedia sector has advanced at a stunning pace. Owing to the availability of low-cost, high-capacity mass storage devices, private multimedia libraries containing digital video and audio items have recently gained popularity. Although attached meta-data such as the title, the actors' names and the creation time eases the task of finding preferred content, it is still difficult to locate a specific part of a movie one enjoyed before, since this requires remembering its time code. In this paper we introduce the BROAFERENCE system, which provides a solution to this problem. We propose meta-data creation based on recorded user experience derived from facial expressions, comprising joy, sadness and anger events as well as interest-focus data. In the following, the system layout, its functionality and the experiments conducted for system verification are introduced to the reader.
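The abstract describes the indexing idea only at a high level; the sketch below is a minimal illustration and not the BROAFERENCE implementation. It assumes a hypothetical per-video index that stores time-coded emotion events, such as a facial-expression classifier might emit while a viewer watches, and retrieves the time codes at which a requested emotion was observed. All class and function names, thresholds and values are invented for illustration.

from dataclasses import dataclass
from typing import List

# Hypothetical example only; none of these names come from the paper.

@dataclass
class EmotionEvent:
    time_code: float   # position in the video, in seconds
    emotion: str       # e.g. "joy", "sadness" or "anger"
    intensity: float   # classifier output in [0, 1]

class EmotionIndex:
    """Per-video index of emotion events recorded while a user watched it."""

    def __init__(self) -> None:
        self.events: List[EmotionEvent] = []

    def add_event(self, time_code: float, emotion: str, intensity: float) -> None:
        self.events.append(EmotionEvent(time_code, emotion, intensity))

    def find(self, emotion: str, min_intensity: float = 0.5) -> List[float]:
        """Return the time codes at which the requested emotion was detected."""
        return [e.time_code for e in self.events
                if e.emotion == emotion and e.intensity >= min_intensity]

# Usage: jump back to the scenes a viewer reacted to with joy.
index = EmotionIndex()
index.add_event(125.0, "joy", 0.8)
index.add_event(310.5, "sadness", 0.7)
print(index.find("joy"))  # -> [125.0]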




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kowalik, U., Aoki, T., Yasuda, H. (2006). Using Automatic Facial Expression Classification for Contents Indexing Based on the Emotional Component. In: Sha, E., Han, SK., Xu, CZ., Kim, MH., Yang, L.T., Xiao, B. (eds) Embedded and Ubiquitous Computing. EUC 2006. Lecture Notes in Computer Science, vol 4096. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11802167_53

  • DOI: https://doi.org/10.1007/11802167_53

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-36679-9

  • Online ISBN: 978-3-540-36681-2

  • eBook Packages: Computer Science, Computer Science (R0)
