EmoMA-Net: A Novel Model for Emotion Recognition Using Hybrid Multimodal Neural Networks in Adaptive Educational Systems
Abstract
Information
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Research-article
Funding Sources
- Xi'an Jiaotong-Liverpool University
Article Metrics
- Total Citations: 0
- Total Downloads: 21
- Downloads (last 12 months): 21
- Downloads (last 6 weeks): 21