
Emotion extraction based on multi bio-signal using back-propagation neural network

Published in Multimedia Tools and Applications

Abstract

This study proposes a system that recognizes human emotional states from bio-signals. The technology is intended to improve interaction between humans and computers, enabling effective and intelligent human–machine interaction. The proposed method recognizes six emotional states: joy, happiness, fear, anger, despair, and sadness, a set of states widely used in emotion recognition research. The results show that the proposed method can distinguish each emotion from all other possible emotional states. The method consists of two steps: 1) multi-modal bio-signal evaluation and 2) emotion recognition using an artificial neural network. In the first step, we present a method to analyze and assess human sensitivity using physiological signals: electroencephalogram (EEG), electrocardiogram (ECG), photoplethysmogram (PPG), respiration, and galvanic skin response (GSR). The experimental analysis shows that the proposed method achieves good accuracy and could be applied in many human–computer interaction devices for emotion detection.
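
The paper's full text is not included here, so the following is only a minimal sketch of the kind of back-propagation classifier the abstract describes, written in Python/NumPy. The layer sizes, learning rate, and synthetic feature data are illustrative assumptions, and the actual feature extraction from the EEG, ECG, PPG, respiration, and GSR channels is not specified by the abstract; the sketch assumes features have already been pooled into one vector per sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each sample is a feature vector pooled from the five
# bio-signals (EEG, ECG, PPG, respiration, GSR); all sizes are illustrative.
N_FEATURES = 25          # e.g., 5 statistical features per signal (assumed)
N_HIDDEN = 16            # hidden-layer width (assumed, not from the paper)
EMOTIONS = ["joy", "happiness", "fear", "anger", "despair", "sadness"]
N_CLASSES = len(EMOTIONS)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Parameters of a one-hidden-layer network
W1 = rng.normal(0, 0.1, (N_FEATURES, N_HIDDEN)); b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_CLASSES)); b2 = np.zeros(N_CLASSES)

def train_step(X, y_onehot, lr=0.1):
    """One back-propagation update with sigmoid hidden units."""
    # Forward pass
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))      # hidden activations
    p = softmax(h @ W2 + b2)                      # class probabilities
    # Backward pass: gradients of mean cross-entropy loss
    d_out = (p - y_onehot) / len(X)
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
    d_hid = (d_out @ W2.T) * h * (1 - h)          # sigmoid derivative
    dW1 = X.T @ d_hid; db1 = d_hid.sum(axis=0)
    # In-place gradient-descent update of the shared parameter arrays
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= lr * grad
    return -np.mean(np.sum(y_onehot * np.log(p + 1e-12), axis=1))

# Synthetic stand-in for real extracted bio-signal features
X = rng.normal(size=(120, N_FEATURES))
labels = rng.integers(0, N_CLASSES, size=120)
Y = np.eye(N_CLASSES)[labels]
for epoch in range(200):
    loss = train_step(X, Y)
print(f"final training loss: {loss:.3f}")
```

The abstract's one-versus-rest evaluation (distinguishing each emotion from all others) could be read off the same softmax outputs by thresholding each class probability, though the paper's exact evaluation protocol is not given here.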

Acknowledgements

This research was supported by a Korea University Grant and by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2015R1D1A1A01057975).

Author information

Corresponding author

Correspondence to Gilsang Yoo.

About this article

Cite this article

Yoo, G., Seo, S., Hong, S. et al. Emotion extraction based on multi bio-signal using back-propagation neural network. Multimed Tools Appl 77, 4925–4937 (2018). https://doi.org/10.1007/s11042-016-4213-5
