
Development of emotion recognition interface using complex EEG/ECG bio-signal for interactive contents

Published in Multimedia Tools and Applications.

Abstract

With changes in the interface paradigm, users may no longer be satisfied with purely behavior-based input devices such as the mouse and keyboard. In this paper, we propose a real-time user interface with emotion recognition, developed to support the shift toward a more human-centered interface paradigm. The proposed technology can recognize a user's emotions while interactive content is being used. Until now, most studies on emotion recognition interfaces have relied on a single bio-signal, which was difficult to apply in practice because of low accuracy. In this study, we developed a complex bio-signal emotion recognition system that combines an ECG-derived ratio reflecting autonomic nervous system activity with the relative power values of the EEG bands (theta, alpha, beta, and gamma) to improve that accuracy. The system builds a data map that stores user-specific probabilities for six emotions (amusement, fear, sadness, joy, anger, and disgust), and it updates per-channel weights to improve the accuracy of the emotion recognized from each EEG channel. To verify the benefit of the complex bio-signal, we compared results on the complex bio-signal data set against an EEG-only data set and found that accuracy increased by 35.78%. The proposed system can serve as a high-accuracy interface for controlling games and smart spaces.
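The abstract does not give implementation details, but the fusion it describes — EEG relative band power combined with an ECG-derived autonomic ratio, plus a per-user probability map nudged toward each recognized emotion — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the ECG ratio is the common LF/HF heart-rate-variability ratio, assumes conventional band boundaries, and uses a hypothetical exponential-moving-average update with learning rate `lr` for the probability map.

```python
import numpy as np

# Conventional EEG band boundaries in Hz (assumed; the paper may differ).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
EMOTIONS = ["amusement", "fear", "sadness", "joy", "anger", "disgust"]

def relative_band_power(eeg, fs):
    """Relative power of each EEG band for one channel (1-D signal)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2
    total = psd[(freqs >= 4) & (freqs < 45)].sum()  # power across all bands
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

def fuse_features(eeg_powers, lf_hf_ratio):
    """Concatenate EEG relative powers with the ECG autonomic (LF/HF) ratio."""
    return np.array([eeg_powers[b] for b in BANDS] + [lf_hf_ratio])

def update_probability_map(prob_map, emotion, lr=0.1):
    """Shift the user-specific probability map toward the observed emotion
    (hypothetical exponential-moving-average rule, not from the paper)."""
    target = np.array([1.0 if e == emotion else 0.0 for e in EMOTIONS])
    prob_map = (1 - lr) * prob_map + lr * target
    return prob_map / prob_map.sum()  # keep it a valid distribution
```

In this sketch the four relative band powers always sum to 1 for a given channel, so the fused feature vector is dominated neither by the EEG nor by the ECG term; a real system would also need the per-channel weight updates the abstract mentions, which are not reproduced here.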




Acknowledgements

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2015R1D1A1A01059253).

Author information

Corresponding author

Correspondence to Dongkyoo Shin.


Cite this article

Shin, D., Shin, D. & Shin, D. Development of emotion recognition interface using complex EEG/ECG bio-signal for interactive contents. Multimed Tools Appl 76, 11449–11470 (2017). https://doi.org/10.1007/s11042-016-4203-7
