Abstract
With the advancement of human–computer interaction and affective computing, emotion estimation has become a very active area of research. Most emotion recognition systems in the literature suffer from the complexity of processing large amounts of physiological data and from analyzing various kinds of emotions within a single framework. The aim of this paper is to present a rigorous and effective computational framework for recognizing and classifying human affect along the arousal, valence and dominance dimensions. In the proposed algorithm, physiological instances from the multimodal DEAP emotion dataset are used for the analysis and characterization of emotional patterns, and physiological features are employed to predict VAD levels via a regularized Extreme Learning Machine (R-ELM). We adopt feature-level fusion to exploit the complementary information of several physiological sensors in order to improve classification performance. The proposed framework is also evaluated in the valence–arousal (V–A) quadrant space by predicting four emotional classes. The obtained results demonstrate the robustness and correctness of the proposed framework compared with other recent studies, and confirm the suitability of the R-ELM for the estimation and recognition of emotional responses.
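The abstract describes predicting VAD levels from fused physiological features with a regularized ELM. The paper page itself contains no code; the snippet below is only a minimal sketch of the general R-ELM idea (random hidden layer plus ridge-regularized output weights, as in Huang et al.). The class name `RegularizedELM`, the sigmoid activation, and all hyperparameter values are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

class RegularizedELM:
    """Minimal regularized ELM sketch: random hidden layer + ridge-regularized output weights."""

    def __init__(self, n_hidden=100, C=1.0, seed=0):
        self.n_hidden = n_hidden
        self.C = C  # regularization trade-off (larger C = weaker regularization)
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activation on randomly projected inputs (weights fixed after fit).
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Closed-form ridge solution: beta = (H^T H + I/C)^(-1) H^T y
        A = H.T @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Hypothetical usage with fused features from several physiological channels:
#   X_train: (n_trials, n_fused_features), y_train: (n_trials, 3) valence/arousal/dominance ratings
#   model = RegularizedELM(n_hidden=200, C=10.0).fit(X_train, y_train)
#   vad_pred = model.predict(X_test)
```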













Acknowledgements
The authors would like to thank all iBUG group members and the DEAP dataset administrators for freely sharing the database files.
About this article
Cite this article
Guendil, Z., Lachiri, Z. & Maaoui, C. Computational framework for emotional VAD prediction using regularized Extreme Learning Machine. Int J Multimed Info Retr 6, 251–261 (2017). https://doi.org/10.1007/s13735-017-0128-9