The Effect of the Repetitive Utterances Complexity on User’s Desire to Continue Dialogue by a Chat-oriented Spoken Dialogue System

Published: 05 December 2022 Publication History

Abstract

In everyday conversation, participants commonly repeat all or part of the other's words, and such repetition is often accompanied by elements such as backchannels. Beyond confirming information, repetitive utterances have an empathic effect. However, if a dialogue system's repetitive utterances are monotonous, the user may quickly get bored, while if they are too complex, they may place a cognitive burden on the user. In this study, we define complexity as the number of elements and patterns associated with the repeated words, and examine the effect of the complexity of repetitive utterances on the user's perceived empathy and desire to continue the dialogue. The complexity of the repetitive utterances was divided into three conditions (low, moderate, and high), and templates of repetitive utterances were made for each condition. We constructed a chat-oriented spoken dialogue system that automatically generates repetitive utterances, and conducted a dialogue experiment with 12 subjects. No significant difference was found among the three complexity conditions on the evaluation items of perceived empathy and desire to continue the dialogue. On the other hand, when the users' negative attitudes towards robots and anxiety towards robots were taken into account, the results suggest that the stronger a user's negative attitudes and anxiety towards robots, the greater their desire to continue the dialogue after exposure to high-complexity repetitive utterances.
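To make the abstract's notion of complexity concrete, the sketch below shows one way a template-based system could vary the number of elements (e.g. backchannels) and patterns attached to a repeated keyword across the three conditions. The templates, backchannel phrases, and function names here are purely illustrative assumptions, not the authors' actual implementation.

```python
import random

# Hypothetical templates: "complexity" = how many elements (backchannels,
# follow-up phrases) and how many distinct patterns accompany the repeated word.
BACKCHANNELS = ["I see,", "Oh,", "Right,"]

def repeat_utterance(keyword: str, complexity: str) -> str:
    """Return a repetitive utterance built around a keyword from the user's turn."""
    if complexity == "low":
        # One fixed pattern, no extra elements: bare repetition.
        return f"{keyword}."
    if complexity == "moderate":
        # One added element (a fixed backchannel), still a single pattern.
        return f"{BACKCHANNELS[0]} {keyword}."
    if complexity == "high":
        # Multiple elements and multiple patterns, chosen at random.
        patterns = [
            f"{random.choice(BACKCHANNELS)} {keyword}, is that so?",
            f"{keyword}? {random.choice(BACKCHANNELS)} tell me more.",
        ]
        return random.choice(patterns)
    raise ValueError(f"unknown complexity: {complexity}")

print(repeat_utterance("hiking", "low"))       # hiking.
print(repeat_utterance("hiking", "moderate"))  # I see, hiking.
print(repeat_utterance("hiking", "high"))      # one of several varied patterns
```

In a full system, the keyword would come from an extractor applied to the user's utterance; here it is passed in directly to keep the sketch self-contained.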


Cited By

  • Situating Empathy in HCI/CSCW: A Scoping Review. Proceedings of the ACM on Human-Computer Interaction 8, CSCW2 (2024), 1–37. DOI: 10.1145/3687052. Online publication date: 8 Nov 2024.
  • Evaluation of Preference on Context-Aware Utterances based on Personality Traits using a Conversational Android Robot System. 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (2024), 22–28. DOI: 10.1109/RO-MAN60168.2024.10731410. Online publication date: 26 Aug 2024.

      Published In

      HAI '22: Proceedings of the 10th International Conference on Human-Agent Interaction
      December 2022
      352 pages
      ISBN:9781450393232
      DOI:10.1145/3527188

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. chat-oriented spoken dialogue
      2. complexity
      3. desire to continue dialogue
      4. perceived empathy
      5. repetitive utterances

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

• Waseda University Grants for Special Research Projects ("Tokutei Kadai")
      • JSPS KAKENHI

      Conference

      HAI '22
      HAI '22: International Conference on Human-Agent Interaction
      December 5 - 8, 2022
      Christchurch, New Zealand

      Acceptance Rates

      Overall Acceptance Rate 121 of 404 submissions, 30%

