DOI: 10.1145/3388818.3389161

Analysing the Compatibility of Identifying Emotions by Facial Expressions and Text Analytics when Using Mobile Devices

Published: 18 May 2020

Abstract

Affective computing is an important topic in Human-Computer Interaction, where user emotions and emotional communication can be used to improve the usability of a system. Several strategies exist for detecting user emotions, but it remains unclear which strategy is the most suitable and compatible for detecting emotions on mobile devices. Multimodal emotion recognition combines two or more strategies in order to identify the most meaningful emotion. Emotion identification through facial expressions and through text analytics has each achieved high accuracy, but combining the two and applying them practically in a mobile environment has yet to be done. Three prototypes that detect emotions from facial expressions and text data were developed through evolutionary prototyping, using state-of-the-art APIs and SDKs; the base of the prototypes is a keyboard for Android devices known as the "Emotional Keyboard". Prototypes 1 and 2 were evaluated through participatory design, and the compatibility of emotion identification through facial expressions and text data in the mobile context was reviewed. Evaluation of Prototype 3 remains future work, including building a confusion matrix to verify accuracies by cross-checking against the training and validation accuracies obtained when developing the neural network.
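The confusion-matrix evaluation proposed for Prototype 3 can be sketched in a few lines of plain Python. This is an illustrative example only: the emotion labels and the true/predicted sequences below are hypothetical, not data from the paper, and the actual prototype's label set may differ.

```python
# Hedged sketch: verifying a classifier's accuracy via a confusion
# matrix. Labels and example predictions are illustrative.

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]  # assumed label set

def confusion_matrix(y_true, y_pred, labels):
    """Build a matrix where rows are true labels and columns are predictions."""
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for t, p in zip(y_true, y_pred):
        matrix[index[t]][index[p]] += 1
    return matrix

def accuracy(matrix):
    """Overall accuracy = diagonal (correct) count over total count."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Illustrative test data: six utterances, one misclassification.
y_true = ["anger", "happiness", "neutral", "sadness", "anger", "neutral"]
y_pred = ["anger", "happiness", "neutral", "neutral", "anger", "neutral"]

cm = confusion_matrix(y_true, y_pred, EMOTIONS)
print(accuracy(cm))  # 5 of 6 correct
```

The resulting per-class rows also expose which emotions are confused with which (here, "sadness" misread as "neutral"), which is exactly the cross-check against training and validation accuracies that the abstract describes.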


Published In

IVSP '20: Proceedings of the 2020 2nd International Conference on Image, Video and Signal Processing
March 2020
168 pages
ISBN:9781450376952
DOI:10.1145/3388818

In-Cooperation

  • Nanyang Technological University
  • The Hong Kong Polytechnic University

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Affective Computing
  2. Emotion Analysis in Text
  3. Facial Expressions Recognition
  4. Human-Computer Interaction (HCI)
  5. Machine Learning
  6. Mobile Devices

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

IVSP '20
