Abstract

Affect and emotion are probably the most important facets of our lives. They make our lives worth living by enabling us to enjoy experiences, to appreciate the behavior of others, and to make decisions more easily. They reinforce or fade the memory of distinct events and make some of them unique in the sequence of episodes we undergo each day. They also modulate information when we interact with other people and play an essential role in fine-tuning our communication. The ability to express and understand emotional signs can hence be considered vital for interacting with human beings, and leveraging the power of emotion recognition hence seems obligatory when designing technology for people. This chapter introduces the physiological background of emotion recognition, describes the general approach to detecting emotion using physiological sensors, and gives two examples of affective applications.
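
The "general approach to detecting emotion using physiological sensors" mentioned above typically amounts to extracting features from windows of sensor data and feeding them to a trained classifier (cf. Picard et al. 2001; Kim and André 2008 in the references). As a minimal, purely illustrative sketch of that pipeline, the following Python fragment computes simple per-channel statistics over synthetic skin-conductance and heart-rate windows and trains a standard SVM; the signals, window length, features, and classifier choice are assumptions for illustration, not the chapter's actual method.

    # Illustrative sketch only: synthetic data, assumed features and classifier.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    def make_window(aroused: bool, n_samples: int = 256) -> np.ndarray:
        """One synthetic window of skin conductance (µS) and heart rate (bpm)."""
        scl = 2.0 + (1.5 if aroused else 0.0) + 0.3 * rng.standard_normal(n_samples)
        hr = 70.0 + (15.0 if aroused else 0.0) + 5.0 * rng.standard_normal(n_samples)
        return np.stack([scl, hr], axis=1)

    def extract_features(window: np.ndarray) -> np.ndarray:
        """Per-channel mean and standard deviation as a simple feature vector."""
        return np.concatenate([window.mean(axis=0), window.std(axis=0)])

    # Toy labelled dataset: 0 = calm, 1 = aroused.
    labels = rng.integers(0, 2, size=200)
    X = np.array([extract_features(make_window(bool(y))) for y in labels])

    X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print(f"toy held-out accuracy: {clf.score(X_test, y_test):.2f}")

In practice the windows would come from calibrated sensors and the labels from an emotion elicitation protocol rather than from synthetic data.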

Notes

  1. Speech analysis has also been used successfully to infer emotional processes in a user. The emotional coloring of the voice arises from ANS-controlled changes in the tension of the vocal tract muscles. Since speech analysis can only be performed while the person is talking, it is a rather sporadic source of information and will not be discussed here any further. Please refer to Peter and Beale (2008) and Tao and Tan (2009).

References

  • ACII (2005). 1st international conference on affective computing and intelligent interaction, Beijing, 2005. Berlin: Springer. ISBN 3-540-29621-2.

  • ACII (2007). 2nd international conference on affective computing and intelligent interaction, Lisbon, Portugal.

  • ACII (2009). 3rd international conference on affective computing and intelligent interaction, Amsterdam, The Netherlands.

  • ACII (2011). 4th international conference on affective computing and intelligent interaction, Memphis, USA.

  • Barreto, A., Zhai, J., & Adjouadi, M. (2007). Non-intrusive physiological monitoring for automated stress detection in human-computer interaction. In M. Lew, N. Sebe, T. S. Huang, & E. M. Bakker (Eds.), LNCS: Vol. 4796. HCI 2007 (pp. 29–38). Heidelberg: Springer.

  • Bishop, C. M. (2006). Pattern recognition and machine learning. London: Springer. ISBN 978-0-387-31073-2.

  • British Standards Institution (2004). BS 7986: Industrial process measurement and control – data quality metrics. Available from BSI Customer Services, email: orders@bsi-global.com.

  • Bush, G., Luu, P., & Posner, M. I. (2000). Cognitive and emotional influences in anterior cingulate cortex. Trends in Cognitive Sciences, 4(6), 215–222.

  • Cacioppo, J. T., Tassinary, L. G., & Berntson, G. G. (Eds.) (2000). Handbook of psychophysiology (2nd edn.). Cambridge: Cambridge University Press. ISBN 0-521-62634-X.

  • Ekman, P., & Davidson, R. J. (Eds.) (1994). The nature of emotion: fundamental questions. New York: Oxford University Press.

  • Ekman, P., & Friesen, W. (1976). Pictures of facial affect. Palo Alto: Consulting Psychologists Press.

  • Ekman, P., & Friesen, W. (1978). Facial action coding system: a technique for the measurement of facial movement. Palo Alto: Consulting Psychologists Press.

  • Ekman, P., Levenson, R. W., & Friesen, W. (1983). Autonomic nervous system activity distinguishes among emotions. Science, 221. The American Association for the Advancement of Science.

  • Haag, A., Goronzy, S., Schaich, P., & Williams, J. (2004). Emotion recognition using bio-sensors: first steps towards an automatic system. In André, et al. (Eds.), Lecture notes in computer science: Vol. 3068. Affective dialogue systems, proceedings of the Kloster Irsee tutorial and research workshop on affective dialogue systems (pp. 36–48). Berlin: Springer.

  • Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning: data mining, inference, and prediction (2nd edn., corr. 3rd printing). Springer series in statistics. New York: Springer. ISBN 978-0-387-84857-0.

  • Henry, M. P., & Clarke, D. W. (1993). The self-validating sensor: rationale, definitions and examples. Control Engineering Practice, 1, 585.

  • Horvitz, E., Kadie, C., Paek, T., & Hovel, D. (2003). Models of attention in computing and communication: from principles to applications. Communications of the ACM, 46(3), 52–59.

  • IEEE Transactions on Affective Computing (2012). http://www.computer.org/portal/web/tac. ISSN 1949-3045.

  • Kim, J., & André, E. (2008). Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12), 2067–2083.

  • Kim, K. H., Bang, S. W., & Kim, S. R. (2004). Emotion recognition system using short-term monitoring of physiological signals. Medical and Biological Engineering and Computing, 42, 419–427.

  • King, R. D., Feng, C., & Sutherland, A. (1995). Statlog: comparison of classification algorithms on large real-world problems. Applied Artificial Intelligence, 9(3), 289–333.

  • Knapp, R. B., Kim, J., & André, E. (2011). Physiological signals and their use in augmenting emotion recognition for human-machine interaction. In P. Petta, C. Pelachaud, & R. Cowie (Eds.), Emotion-oriented systems: the humaine handbook. Berlin: Springer.

  • Lane, R. (2000). Cognitive neuroscience of emotion. New York: Oxford University Press.

  • Lichtenstein, A., Oehme, A., Kupschick, S., & Jürgensohn, T. (2008). Comparing two emotion models for deriving affective states from physiological data. In C. Peter, & R. Beale (Eds.), LNCS: Vol. 4868. Affect and emotion in human-computer interaction. Heidelberg: Springer. ISBN 978-3-540-85098-4.

  • Mader, S., Peter, C., Göcke, R., Schultz, R., Voskamp, J., & Urban, B. (2004). A freely configurable, multi-modal sensor system for affective computing. In André, et al. (Eds.), Affective dialogue systems: tutorial and research workshop (pp. 313–318). Berlin: Springer.

  • Magjarevic, M., Gao, Y., Barreto, A., & Adjouadi, M. (2009). Comparative analysis of noninvasively monitored biosignals for affective assessment of a computer user. In A. J. McGoron, C.-Z. Li, & W.-C. Lin (Eds.), IFMBE proceedings: Vol. 24. 25th southern biomedical engineering conference 2009, 15–17 May 2009, Miami, Florida, USA (pp. 255–260). Berlin: Springer. doi:10.1007/978-3-642-01697-4_90.

  • Mandryk, R. L., & Atkins, M. S. (2007). A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies. International Journal of Human-Computer Studies, 65, 329–347.

  • Nasoz, F., Alvarez, K., Lisetti, C. L., & Finkelstein, N. (2004). Emotion recognition from physiological signals using wireless sensors for presence technologies. International Journal of Cognition, Technology, and Work—Special Issue on Presence, 6.

  • Olofsson, J. K., Nordin, S., Sequeira, H., & Polich, J. (2008). Affective picture processing: an integrative review of ERP findings. Biological Psychology, 77(3), 247–265.

  • Peter, C., & Beale, R. (Eds.) (2008). Affect and emotion in human-computer interaction. LNCS: Vol. 4868. Heidelberg: Springer. ISBN 978-3-540-85098-4.

  • Peter, C., Ebert, E., & Beikirch, H. (2005). A wearable multi-sensor system for mobile acquisition of emotion-related physiological data. In Proceedings of the 1st international conference on affective computing and intelligent interaction, Beijing, 2005. Berlin: Springer.

  • Peter, C., Schultz, R., Voskamp, J., Urban, B., Nowack, N., Janik, H., Kraft, K., & Göcke, R. (2007). EREC-II in use—studies on usability and suitability of a sensor system for affect detection and human performance monitoring. In J. Jacko (Ed.), LNCS: Vol. 4552. Human-computer interaction, part III, HCII 2007 (pp. 465–474). Berlin: Springer.

  • Petta, P., Pelachaud, C., & Cowie, R. (Eds.) (2011). Emotion-oriented systems: the humaine handbook. Berlin: Springer.

  • Picard, R. W. (1997). Affective computing. Cambridge: MIT Press.

  • Picard, R. W., Vyzas, E., & Healey, J. (2001). Toward machine emotional intelligence – analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10), 1175–1191.

  • Poh, M.-Z., McDuff, D. J., & Picard, R. W. (2010). Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Optics Express, 18(10), 10763–10774.

  • Posner, M. (2004). Cognitive neuroscience of attention. New York: Guilford Press.

  • Rani, P., Liu, C., Sarkar, N., & Vanman, E. (2006). An empirical study of machine learning techniques for affect recognition in human–robot interaction. Pattern Analysis & Applications, 9(1).

  • Rigas, G., Katsis, C., Ganiatsas, G., & Fotiadis, D. (2009). A user independent, biosignal based, emotion recognition method. In C. Conati, K. McCoy, & G. Paliouras (Eds.), Lecture notes in computer science: Vol. 4511. User modeling 2007 (pp. 314–318). Berlin: Springer. doi:10.1007/978-3-540-73078-1_36.

  • Schwartz, M. S., & Andrasik, F. (2003). Biofeedback: a practitioner’s guide (3rd edn.). New York: Guilford Press. ISBN 1-57230-845-1.

  • Tao, J., & Tan, T. (Eds.) (2009). Affective information processing. London: Springer. ISBN 978-1-84800-305-7.

  • van den Broek, E. L., Janssen, J. H., Westerink, J. H. D. M., & Healey, J. A. (2009). Prerequisites for affective signal processing (ASP). In International conference on bio-inspired systems and signal processing, biosignals, 14–17 Jan 2009, Porto, Portugal.

  • Verhoef, T., Lisetti, C., Barreto, A., Ortega, F., van der Zant, T., & Cnossen, F. (2009). Bio-sensing for emotional characterization without word labels. In J. A. Jacko (Ed.), LNCS: Vol. 5612. Human-computer interaction, Part III, ambient, ubiquitous and intelligent interaction (pp. 693–702). Berlin: Springer.

  • Vuilleumier, P., & Driver, J. (2007). Modulation of visual processing by attention and emotion: windows on causal interactions between human brain regions. Philosophical Transactions of the Royal Society B: Biological Sciences, 362(1481), 837–855.

  • Wimmer, M. (2007). Model-based image interpretation with application to facial expression recognition. Ph.D. thesis, Technische Universität München, Institute for Informatics.

  • Yerkes, R. M., & Dodson, J. D. (1908). The relation of strength of stimulus to rapidity of habit-formation. Journal of Comparative Neurology and Psychology, 18, 459–482.

  • Yoo, S. K., Lee, C. K., Park, Y. J., Kim, N. H., Lee, B. C., & Jeong, K. S. (2005). Neural network based emotion estimation using heart rate variability and skin resistance. In L. Wang, K. Chen, & Y. S. Ong (Eds.), Lecture notes in computer science: Vol. 3610. Advances in natural computation (pp. 818–824). Berlin: Springer.

Author information

Correspondence to Christian Peter.

Copyright information

© 2012 Springer-Verlag London Limited

About this chapter

Cite this chapter

Peter, C., Urban, B. (2012). Emotion in Human-Computer Interaction. In: Dill, J., Earnshaw, R., Kasik, D., Vince, J., Wong, P. (eds) Expanding the Frontiers of Visual Analytics and Visualization. Springer, London. https://doi.org/10.1007/978-1-4471-2804-5_14

Download citation

  • DOI: https://doi.org/10.1007/978-1-4471-2804-5_14

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-2803-8

  • Online ISBN: 978-1-4471-2804-5

  • eBook Packages: Computer Science (R0)
