
Computational framework for emotional VAD prediction using regularized Extreme Learning Machine

  • Regular Paper
  • Published in: International Journal of Multimedia Information Retrieval

Abstract

With the advancement of human–computer interaction and affective computing, emotion estimation has become a very interesting area of research. In the literature, the majority of emotion recognition systems suffer from the complexity of processing a huge amount of physiological data and of analyzing various kinds of emotions within a single framework. The aim of this paper is to present a rigorous and effective computational framework for recognizing and classifying human affect along the valence, arousal, and dominance (VAD) dimensions. In the proposed algorithm, physiological instances from the multimodal DEAP emotion dataset have been used for the analysis and characterization of emotional patterns. Physiological features were employed to predict VAD levels via an Extreme Learning Machine (ELM). We adopted feature-level fusion to exploit the complementary information of several physiological sensors in order to improve classification performance. The proposed framework was also evaluated in the valence–arousal quadrants by predicting four emotional classes. The obtained results prove the robustness and correctness of the proposed framework compared to other recent studies. They also confirm the suitability of the regularized ELM (R-ELM) for the estimation and recognition of emotional responses.
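The regularized ELM at the core of the framework can be sketched as follows: input weights and hidden biases are drawn at random and kept fixed, and only the output weights are learned in closed form via ridge-regularized least squares. This is a minimal illustrative sketch of the general R-ELM technique, not the paper's actual configuration — the feature dimensionality, hidden-layer size, regularization constant, and sigmoid activation below are all assumptions for demonstration.

```python
import numpy as np

def relm_fit(X, T, n_hidden=64, C=10.0, seed=None):
    """Train an R-ELM regressor: random fixed hidden layer,
    ridge-regularized output weights (illustrative settings)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)        # random hidden biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden-layer output matrix
    # Closed-form regularized solution: beta = (H^T H + I/C)^(-1) H^T T
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def relm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: map synthetic stand-ins for fused physiological features
# to three continuous valence/arousal/dominance targets.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16))                     # 200 samples, 16 features
T = 0.5 * X[:, :3] + 0.1 * rng.standard_normal((200, 3))  # 3 VAD-like targets
W, b, beta = relm_fit(X, T, n_hidden=64, C=10.0, seed=1)
pred = relm_predict(X, W, b, beta)
```

Because the hidden layer is random and fixed, training reduces to one linear solve, which is what makes ELM-style models fast compared with iteratively trained networks; the constant `C` trades training fit against the norm of the output weights.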




Acknowledgements

The authors would like to thank the iBUG group members and the DEAP dataset administrators for freely sharing all of the database files.

Author information


Corresponding author

Correspondence to Zied Guendil.


About this article


Cite this article

Guendil, Z., Lachiri, Z. & Maaoui, C. Computational framework for emotional VAD prediction using regularized Extreme Learning Machine. Int J Multimed Info Retr 6, 251–261 (2017). https://doi.org/10.1007/s13735-017-0128-9

