DOI: 10.1145/3577190.3616878
Short paper

4th Workshop on Social Affective Multimodal Interaction for Health (SAMIH)

Published: 09 October 2023

Abstract

This workshop discusses how interactive, multimodal technology such as virtual agents can measure and train social-affective interactions. Sensing technology now makes it possible to analyze users' behaviors and physiological signals, and a variety of signal-processing and machine-learning methods can be applied to prediction tasks. Such social-signal-processing tools can be used to measure and reduce social stress in everyday situations, including public speaking at school and in the workplace.
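As an illustration of the kind of prediction task mentioned above, the sketch below classifies a user's state as "low stress" or "high stress" with a nearest-centroid classifier over two hypothetical features (mean heart rate and speech pause ratio). This is not from the paper: the feature choices, labels, and all numbers are made-up assumptions for demonstration only.

```python
# Illustrative sketch (assumptions, not the workshop's method): a nearest-centroid
# classifier predicting "high_stress" vs "low_stress" from two hypothetical
# features (mean heart rate in bpm, speech pause ratio). All numbers are toy data.
from statistics import mean


def centroid(samples):
    """Per-dimension mean of a list of feature vectors."""
    return [mean(dim) for dim in zip(*samples)]


def predict(x, centroids):
    """Return the label whose centroid is closest (squared Euclidean) to x."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))


# Toy training data: (heart rate, pause ratio) samples per condition.
train = {
    "low_stress":  [(68, 0.10), (72, 0.12), (70, 0.08)],
    "high_stress": [(95, 0.30), (102, 0.35), (98, 0.28)],
}
centroids = {label: centroid(samples) for label, samples in train.items()}

print(predict((100, 0.32), centroids))  # high-arousal sample -> high_stress
print(predict((69, 0.11), centroids))   # calm sample -> low_stress
```

Real systems in this space would of course use richer multimodal features and learned models; the point here is only the shape of the pipeline: sensed signals become feature vectors, and a classifier maps them to a social-affective state.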



Published In

ICMI '23: Proceedings of the 25th International Conference on Multimodal Interaction
October 2023, 858 pages
ISBN: 9798400700552
DOI: 10.1145/3577190

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. affective computing
    2. cognitive behavioral therapy
    3. large language models
    4. motivational interview
    5. social signal processing
    6. social skills training
    7. virtual agents

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Funding Sources

    • JST CREST
    • ANR

    Conference

    ICMI '23

    Acceptance Rates

    Overall Acceptance Rate 453 of 1,080 submissions, 42%
