
Emotion Recognition with the Help of Privileged Information


Abstract:

In this article, we propose a novel approach to emotion recognition with the help of privileged information, which is available only during training, not during testing. Such additional information can be exploited during training to construct a better classifier. Specifically, we recognize viewers' emotions from electroencephalogram (EEG) signals with the help of the stimulus videos, and tag videos with emotions with the aid of EEG signals. First, frequency features are extracted from the EEG signals and audio/visual features are extracted from the video stimuli. Second, features are selected by statistical tests. Third, a new EEG feature space and a new video feature space are constructed simultaneously using canonical correlation analysis (CCA). Finally, two support vector machine (SVM) classifiers are trained on the new EEG and video feature spaces, respectively. During emotion recognition from EEG, only EEG signals are available, and the SVM classifier trained on the EEG feature space is used; for video emotion tagging, only video clips are available, and the SVM classifier trained on the video feature space is adopted. Experiments on EEG-based emotion recognition and video emotion tagging are conducted on three benchmark databases, demonstrating that video content, as context, can improve emotion recognition from EEG signals, and that EEG signals available during training can enhance video emotion tagging.
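The abstract describes a four-step training pipeline: feature extraction per modality, statistical feature selection, CCA to build paired feature spaces, and one SVM per space, with only a single modality used at test time. The following is a minimal sketch of steps two through four using scikit-learn; it is not the authors' code, and the array names (X_eeg, X_video), feature dimensions, and chosen test (ANOVA F-test) are illustrative assumptions.

import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_train = 200
X_eeg = rng.normal(size=(n_train, 160))    # placeholder EEG frequency features
X_video = rng.normal(size=(n_train, 120))  # placeholder audio/visual features
y = rng.integers(0, 2, size=n_train)       # binary emotion labels, e.g. high/low valence

# Step 2: select discriminative features in each modality with a statistical test.
sel_eeg = SelectKBest(f_classif, k=60).fit(X_eeg, y)
sel_video = SelectKBest(f_classif, k=60).fit(X_video, y)
X_eeg_sel = sel_eeg.transform(X_eeg)
X_video_sel = sel_video.transform(X_video)

# Step 3: CCA constructs correlated EEG and video feature spaces simultaneously,
# letting each modality benefit from the other during training.
cca = CCA(n_components=20).fit(X_eeg_sel, X_video_sel)
Z_eeg, Z_video = cca.transform(X_eeg_sel, X_video_sel)

# Step 4: train one SVM classifier per new feature space.
svm_eeg = SVC(kernel="rbf").fit(Z_eeg, y)
svm_video = SVC(kernel="rbf").fit(Z_video, y)

# Test time, EEG-based recognition: only EEG is available, so only the
# EEG-side selection and CCA projection are applied (video was privileged).
X_eeg_test = rng.normal(size=(10, 160))
pred = svm_eeg.predict(cca.transform(sel_eeg.transform(X_eeg_test)))
# Video emotion tagging is symmetric: project video-only test features with
# the video-side CCA loadings and classify with svm_video.

The key design point the sketch illustrates is that the privileged modality influences only the learned projections and classifiers; at test time each classifier consumes a single modality.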
Published in: IEEE Transactions on Autonomous Mental Development ( Volume: 7, Issue: 3, September 2015)
Page(s): 189 - 200
Date of Publication: 30 July 2015

