EmotionSense: An Adaptive Emotion Recognition System Based on Wearable Smart Devices

Published: 30 September 2020

Abstract

With the recent surge of smart wearable devices, physiological and behavioral data can be collected in a more convenient and non-invasive manner. Based on such data, researchers have developed a variety of systems and applications to recognize and understand human behaviors, covering both physical activities (e.g., gestures) and mental states (e.g., emotions). In particular, it has been shown that different emotions cause distinct changes in physiological parameters. However, other factors, such as physical activity, also affect these parameters. Accurate emotion recognition therefore requires exploring not only physiological data but also behavioral data. To this end, we propose an adaptive emotion recognition system built on a sensor-rich wearable smartwatch. First, an activity identification method is developed to distinguish different activity scenes (e.g., sitting, walking, and running) using the accelerometer. Based on the identified activity scene, an adaptive emotion recognition method is proposed that leverages multi-mode sensory data, including blood volume pulse, electrodermal activity, and skin temperature, from which fine-grained features are extracted to characterize different emotions. Finally, the adaptive emotion recognition model is constructed and evaluated experimentally. An accuracy of 74.3% across 30 participants demonstrates that the proposed system can recognize human emotions effectively.
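
The abstract describes a two-stage pipeline: an accelerometer-based classifier first identifies the activity scene, and a scene-specific model then maps physiological features (blood volume pulse, electrodermal activity, skin temperature) to an emotion. The sketch below is a minimal illustration of that structure, not the authors' implementation; the feature choices, the random-forest classifiers, and all names (e.g., acc_features, AdaptiveEmotionRecognizer) are hypothetical assumptions.

```python
# Minimal illustrative sketch, NOT the authors' implementation: it assumes
# pre-windowed sensor arrays and uses scikit-learn random forests as
# stand-in classifiers; all names and feature choices are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ACTIVITIES = ["sitting", "walking", "running"]  # activity scenes named in the abstract


def acc_features(window):
    """Basic time-domain features from an (n_samples, 3) accelerometer window."""
    mag = np.linalg.norm(window, axis=1)  # per-sample acceleration magnitude
    return np.hstack([window.mean(axis=0), window.std(axis=0),
                      [mag.mean(), mag.std(), mag.max() - mag.min()]])


def physio_features(bvp, eda, temp):
    """Coarse statistics over blood volume pulse, electrodermal activity, and
    skin temperature windows (the paper extracts finer-grained features)."""
    return np.array([stat for sig in (bvp, eda, temp)
                     for stat in (sig.mean(), sig.std(), sig.min(), sig.max())])


class AdaptiveEmotionRecognizer:
    """Stage 1 predicts the activity scene from accelerometer features;
    stage 2 applies the emotion classifier trained for that scene."""

    def __init__(self):
        self.activity_clf = RandomForestClassifier(n_estimators=100, random_state=0)
        self.emotion_clfs = {a: RandomForestClassifier(n_estimators=100, random_state=0)
                             for a in ACTIVITIES}

    def fit(self, acc_X, activity_y, physio_X, emotion_y):
        acc_X, physio_X = np.asarray(acc_X), np.asarray(physio_X)
        activity_y, emotion_y = np.asarray(activity_y), np.asarray(emotion_y)
        self.activity_clf.fit(acc_X, activity_y)
        for scene in ACTIVITIES:  # one scene-specific emotion model per activity
            mask = activity_y == scene
            self.emotion_clfs[scene].fit(physio_X[mask], emotion_y[mask])
        return self

    def predict(self, acc_x, physio_x):
        scene = self.activity_clf.predict(acc_x.reshape(1, -1))[0]
        emotion = self.emotion_clfs[scene].predict(physio_x.reshape(1, -1))[0]
        return scene, emotion
```

Conditioning the emotion model on the identified scene is what makes the recognizer "adaptive": physiological changes caused by the activity itself (e.g., an elevated pulse while running) are less likely to be mistaken for emotional responses.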

Published In

ACM Transactions on Computing for Healthcare, Volume 1, Issue 4
Special Issue on Wearable Technologies for Smart Health: Part 1
October 2020
184 pages
EISSN: 2637-8051
DOI: 10.1145/3427421
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 30 September 2020
Online AM: 07 May 2020
Accepted: 01 February 2020
Revised: 01 December 2019
Received: 01 July 2019
Published in HEALTH Volume 1, Issue 4

Author Tags

  1. Emotion recognition
  2. activity identification
  3. multi-mode signals
  4. scene-adaptive
  5. wearable devices

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Fundamental Research Funds for the Central Universities
  • National Natural Science Foundation of China
  • Innovative Talents Promotion Program of Shaanxi Province
  • National Key R&D Program of China
