DOI: 10.1145/3472307.3484668
Poster

Inferring Human Beliefs and Desires from their Actions and the Content of their Utterances

Published: 09 November 2021

Abstract

To create dialogue systems that provide the information a user needs at an opportune moment, it is important to infer the user's mental states, such as their beliefs and desires. Prior work on inferring beliefs and desires falls into two lines: one infers them from actions, the other from the content of utterances. However, no method has yet been established that integrates both kinds of inference. In this paper, we propose Multimodal Inference of Mind Simultaneous Contextualization and Interpreting (MIoM SCAIN), a system that sequentially infers users' beliefs and desires from both their walking behaviors and the content of their utterances. In our evaluation, we compared the inferences of MIoM SCAIN with those of baselines that use either walking behaviors or utterance content alone. MIoM SCAIN's predictions correlated more strongly with subjective judgements than those of the baselines, indicating that beliefs and desires can be inferred from walking behaviors and utterance content together.
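
The paper itself does not include code; as a rough illustration of the kind of sequential, multimodal Bayesian update the abstract describes, the sketch below combines a likelihood from observed walking behavior with a likelihood from utterance content to update a posterior over desire hypotheses. All names, hypotheses, and probabilities are hypothetical and are not taken from MIoM SCAIN.

# Hypothetical sketch: one Bayesian update over desire hypotheses,
# P(h | a, u) proportional to P(a | h) * P(u | h) * P(h), combining a
# walking-behavior observation a with an utterance u. Values are illustrative only.

def update_posterior(prior, action_likelihood, utterance_likelihood):
    unnormalized = {
        h: prior[h] * action_likelihood[h] * utterance_likelihood[h]
        for h in prior
    }
    z = sum(unnormalized.values())
    return {h: (v / z if z > 0 else 0.0) for h, v in unnormalized.items()}

# Toy hypothesis space: the user either wants coffee or wants to catch a train.
prior = {"desires_coffee": 0.5, "desires_train": 0.5}
# Walking toward a cafe is more probable if the user desires coffee.
action_lik = {"desires_coffee": 0.7, "desires_train": 0.2}
# The utterance "is there somewhere to sit down?" also fits the coffee hypothesis.
utterance_lik = {"desires_coffee": 0.6, "desires_train": 0.3}

posterior = update_posterior(prior, action_lik, utterance_lik)
print(posterior)  # {'desires_coffee': 0.875, 'desires_train': 0.125}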



      Published In

      HAI '21: Proceedings of the 9th International Conference on Human-Agent Interaction
      November 2021
      447 pages
      ISBN:9781450386203
      DOI:10.1145/3472307
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 09 November 2021

      Author Tags

      1. Bayesian inference
      2. Dialogue system
      3. Human Computer Interaction
      4. Partially Observable Markov Decision Processes
      5. Theory of Mind

      Qualifiers

      • Poster
      • Research
      • Refereed limited

      Conference

HAI '21: International Conference on Human-Agent Interaction
November 9 - 11, 2021
Virtual Event, Japan

      Acceptance Rates

      Overall Acceptance Rate 121 of 404 submissions, 30%
