DOI: 10.1145/3399715.3399820

DELEX: a DEep Learning Emotive eXperience: Investigating empathic HCI

Published: 02 October 2020

Abstract

Recent advances in Machine Learning have unveiled interesting possibilities for investigating, in real time, user characteristics and expressions such as, but not limited to, age, sex, body posture, emotions, and moods. These new opportunities lay the foundations for new HCI tools for interactive applications that adopt user emotions as a communication channel.
This paper presents an Emotion-Controlled User Experience that changes according to the user's feelings and emotions, analysed at runtime. To obtain a preliminary evaluation of the proposed ecosystem, a controlled experiment was performed at an engineering and software development company, involving 60 volunteers. The subjective evaluation was based on a standard questionnaire commonly adopted for measuring the user's perceived sense of immersion in Virtual Environments. The results of the controlled experiment encourage further investigation, strengthened by the analysis of objective performance measurements and user physiological parameters.
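
As a rough illustration of the runtime loop the abstract describes, the sketch below classifies facial emotion from a webcam stream and feeds the predicted label to an adaptation hook. This is a minimal sketch under assumed components, not the authors' implementation: the Keras model file (emotion_cnn.h5), the emotion label set, and the adapt_experience function are hypothetical placeholders.

import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Hypothetical label set for a FER-style basic-emotion classifier.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# Hypothetical pre-trained CNN taking 48x48 grayscale face crops.
model = load_model("emotion_cnn.h5")
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def adapt_experience(emotion: str) -> None:
    # Placeholder for application-specific adaptation (colours, sound, pacing, ...).
    print("Adapting experience for emotion:", emotion)

capture = cv2.VideoCapture(0)  # webcam stream analysed at runtime
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face[np.newaxis, ..., np.newaxis], verbose=0)[0]
        adapt_experience(EMOTIONS[int(np.argmax(probs))])
    cv2.imshow("emotion-controlled UX sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
        break
capture.release()
cv2.destroyAllWindows()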




Published In

AVI '20: Proceedings of the 2020 International Conference on Advanced Visual Interfaces
September 2020
613 pages
ISBN: 9781450375351
DOI: 10.1145/3399715

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 02 October 2020


Author Tags

  1. Computer Vision
  2. Deep Learning
  3. User Emotions
  4. User Experience

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

AVI '20
AVI '20: International Conference on Advanced Visual Interfaces
September 28 - October 2, 2020
Salerno, Italy

Acceptance Rates

AVI '20 Paper Acceptance Rate: 36 of 123 submissions, 29%
Overall Acceptance Rate: 128 of 490 submissions, 26%


Cited By

  • (2024) The Generative Fairy Tale of Scary Little Red Riding Hood. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, DOI: 10.1145/3639701.3656303, pp. 129-144. Online publication date: 7-Jun-2024.
  • (2022) Machine In The Middle: Exploring Dark Patterns of Emotional Human-Computer Integration Through Media Art. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, DOI: 10.1145/3491101.3503555, pp. 1-7. Online publication date: 27-Apr-2022.
  • (2022) Combining filtered dictionary representation based deep subspace filter learning with a discriminative classification criterion for facial expression recognition. Artificial Intelligence Review, 55(8), pp. 6547-6566, DOI: 10.1007/s10462-022-10160-1. Online publication date: 1-Dec-2022.
  • (2021) Deep learning for emotion driven user experiences. Pattern Recognition Letters, 152(C), pp. 115-121, DOI: 10.1016/j.patrec.2021.09.004. Online publication date: 1-Dec-2021.
  • (2021) Attention monitoring for synchronous distance learning. Future Generation Computer Systems, 125(C), pp. 774-784, DOI: 10.1016/j.future.2021.07.026. Online publication date: 1-Dec-2021.
