
SVM-based feature selection methods for emotion recognition from multimodal data

  • Original Paper
Journal on Multimodal User Interfaces

Abstract

Multimodal emotion recognition is an emerging field within affective computing that evaluates an emotional state by simultaneously using different physiological signals. Physiological signals such as the electroencephalogram (EEG), skin temperature, and the electrocardiogram (ECG), to name a few, have been used to assess emotions like happiness, sadness, or anger, or to assess levels of arousal and valence. Research efforts in this field have so far focused mainly on building pattern recognition systems, with an emphasis on feature extraction and classifier design: a different set of features is extracted from each type of physiological signal, and all these sets are then combined and fed to a particular classifier. A stage of the pattern recognition pipeline that has received less attention in this literature is feature selection, which is particularly useful for uncovering the discriminant abilities of individual physiological signals. The main objective of this paper is to study the discriminant power of the features associated with several physiological signals used for multimodal emotion recognition. To this end, we apply recursive feature elimination and margin-maximizing feature elimination to two well-known multimodal databases, DEAP and MAHNOB-HCI. Results show that EEG-related features have the highest discrimination ability. For the arousal index, EEG features together with galvanic skin response features achieve the highest discrimination power, whereas for the valence index, EEG features together with heart rate features do so.
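The recursive feature elimination procedure the abstract refers to (in the spirit of Guyon et al.'s SVM-RFE) can be sketched as follows: train a linear classifier on the current feature subset, drop the feature whose squared weight is smallest, and repeat until the desired number of features remains. This is a minimal illustrative sketch, not the paper's implementation; a least-squares linear discriminant stands in for a trained linear SVM so the example stays dependency-light, but the elimination loop itself is the standard RFE criterion.

```python
import numpy as np


def rfe_ranking(X, y, n_keep=1):
    """Rank features by recursive elimination.

    Returns feature indices ordered from first-eliminated to last-kept,
    so the tail of the list holds the most discriminant features.
    """
    remaining = list(range(X.shape[1]))
    eliminated = []
    while len(remaining) > n_keep:
        Xs = X[:, remaining]
        # Least-squares weights as a stand-in for linear-SVM weights w.
        w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        # RFE criterion: eliminate the feature with the smallest w_i^2.
        drop = int(np.argmin(w ** 2))
        eliminated.append(remaining.pop(drop))
    return eliminated + remaining


# Toy data: only feature 0 actually determines the class label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.sign(X[:, 0])  # labels in {-1, +1}
ranking = rfe_ranking(X, y, n_keep=1)
print(ranking[-1])  # the surviving feature is the informative one
```

In the multimodal setting studied here, each column of `X` would be one feature extracted from a physiological signal (EEG band power, heart rate statistic, and so on), and the survival order of the columns indicates which signals carry the most discriminant information.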




Acknowledgments

This work was supported by the “Automática” research group of the “Universidad Tecnológica de Pereira”. The authors would like to thank the project “Eficacia de un sistema basado en realidad virtual, como coadyuvante en el control emocional a través de estrategias psicológicas integradas al entrenamiento militar”, funded by Colciencias under code 111542520798, and the project “Desarrollo de un sistema basado en realidad virtual de baja inmersión para asistir intervenciones psicológicas enfocadas al control emocional”, funded by the “Universidad Tecnológica de Pereira” under code 6-11-3, which provided the resources to develop this work. The author C. A. Torres-Valencia was funded by the program “Formación de alto nivel para la ciencia, la tecnología y la innovación—Doctorado Nacional-Convocatoria 647 de 2014” of Colciencias.

Author information

Corresponding author

Correspondence to Cristian Torres-Valencia.

About this article

Cite this article

Torres-Valencia, C., Álvarez-López, M. & Orozco-Gutiérrez, Á. SVM-based feature selection methods for emotion recognition from multimodal data. J Multimodal User Interfaces 11, 9–23 (2017). https://doi.org/10.1007/s12193-016-0222-y
