
Emotionally-Aware Multimodal Interfaces: Preliminary Work on a Generic Affective Modality

Published: 20 June 2018
DOI: 10.1145/3218585.3218589

Abstract

In interactive systems, knowing the user's emotional state is not only important for understanding and improving the overall user experience, but also highly relevant in scenarios where such information can help users manage and express their emotions (e.g., anxiety), with a strong impact on their daily lives and on how they interact with others. Despite the clear potential of emotionally-aware applications, several challenges preclude their wider availability, among them the low translational nature of research in affective computing and the lack of straightforward methods for integrating emotion into applications. In light of these challenges, we propose a conceptual vision for how emotion can be considered in the scope of multimodal interactive systems and articulated with research in affective computing. Aligned with this vision, we present a first instantiation of a generic affective modality, and a proof-of-concept application enabling multimodal interaction with Spotify illustrates how the modality can provide emotional context in interactive scenarios.
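Although the paper itself presents no code, the sketch below illustrates the kind of integration the abstract describes: a generic affective modality reporting a recognized emotional state to an interaction manager, here expressed as a W3C MMI-style ExtensionNotification carrying an EMMA-annotated interpretation (the W3C Multimodal Interaction architecture that the authors' follow-up work, listed under Cited By, aligns with). This is a minimal illustration under stated assumptions: the affective_event helper, the source/target identifiers, and the categorical-plus-dimensional payload are all hypothetical, not the paper's actual interface.

import xml.etree.ElementTree as ET

EMMA_NS = "http://www.w3.org/2003/04/emma"       # W3C EMMA namespace
MMI_NS = "http://www.w3.org/2008/04/mmi-arch"    # W3C MMI architecture namespace

def affective_event(emotion: str, valence: float, arousal: float,
                    confidence: float, context_id: str) -> str:
    """Wrap a recognized emotional state in an MMI ExtensionNotification.

    The payload carries both a categorical label and dimensional
    (valence/arousal) coordinates, two common representations in
    affective computing, so consumers can use whichever they prefer.
    """
    mmi = ET.Element(f"{{{MMI_NS}}}mmi")
    note = ET.SubElement(
        mmi, f"{{{MMI_NS}}}ExtensionNotification",
        {"Source": "affective-modality",    # hypothetical component ids
         "Target": "interaction-manager",
         "Context": context_id})
    emma = ET.SubElement(note, f"{{{EMMA_NS}}}emma", {"version": "1.0"})
    interp = ET.SubElement(
        emma, f"{{{EMMA_NS}}}interpretation",
        {"id": "emo1",
         f"{{{EMMA_NS}}}confidence": str(confidence),
         f"{{{EMMA_NS}}}medium": "visual",   # e.g., facial expression input
         f"{{{EMMA_NS}}}mode": "face"})
    ET.SubElement(interp, "emotion").text = emotion
    ET.SubElement(interp, "valence").text = str(valence)
    ET.SubElement(interp, "arousal").text = str(arousal)
    return ET.tostring(mmi, encoding="unicode")

if __name__ == "__main__":
    # A consuming application (e.g., the Spotify proof of concept) could
    # react to this event by adapting playback to the reported state.
    print(affective_event("calm", valence=0.6, arousal=-0.3,
                          confidence=0.82, context_id="ctx-42"))

In such a setup, the interaction manager would route the notification to registered applications, which receive emotional context without needing any knowledge of the underlying affect-detection method.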



Published In

DSAI '18: Proceedings of the 8th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion
June 2018
365 pages
ISBN:9781450364676
DOI:10.1145/3218585

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 20 June 2018


Author Tags

  1. Emotional and affective interaction
  2. multi-platform design
  3. multimodal interfaces
  4. natural interaction

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

DSAI 2018

Acceptance Rates

DSAI '18 Paper Acceptance Rate: 17 of 23 submissions (74%)
Overall Acceptance Rate: 17 of 23 submissions (74%)


Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 22
  • Downloads (last 6 weeks): 0

Reflects downloads up to 17 Jan 2025

Cited By

  • (2024) Enhancing User Experience through Emotion-Aware Interfaces: A Multimodal Approach. Journal of Innovative Image Processing 6(1), 27–39. https://doi.org/10.36548/jiip.2024.1.003 (online: Mar 2024)
  • (2020) Enabling Multimodal Emotionally-Aware Ecosystems Through a W3C-Aligned Generic Interaction Modality. Wireless Mobile Communication and Healthcare, 140–152. https://doi.org/10.1007/978-3-030-49289-2_11 (online: 28 May 2020)
  • (2019) The AM4I Architecture and Framework for Multimodal Interaction and Its Application to Smart Environments. Sensors 19(11), 2587. https://doi.org/10.3390/s19112587 (online: 6 Jun 2019)
  • (2019) Design and Development for Individuals with ASD: Fostering Multidisciplinary Approaches Through Personas. Journal of Autism and Developmental Disorders 49(5), 2156–2172. https://doi.org/10.1007/s10803-019-03898-1 (online: 30 Jan 2019)
