
Estimation of Empathy Skill Level and Personal Traits Using Gaze Behavior and Dialogue Act During Turn-Changing

  • Conference paper
  • First Online:
HCI International 2021 - Late Breaking Papers: Multimodality, eXtended Reality, and Artificial Intelligence (HCII 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 13095)

Abstract

We explored gaze behavior toward the end of utterances and dialogue acts (DAs), i.e., verbal-behavior information indicating the intention of an utterance, during turn-keeping/changing to estimate several social skills and personal traits in multi-party discussions. We first collected data from participants in four-person discussions: scores on several personal indicators, namely the Big Five, which measures personal traits, and Davis’ Interpersonal Reactivity Index (IRI), which measures empathy skill level, together with utterances annotated with DA categories and gaze behavior. We then constructed and evaluated models for estimating the scores of these indicators from gaze behavior and DA information. The evaluation results indicate that using both gaze behavior and DAs during turn-keeping/changing is effective for estimating all of these scores with high accuracy. The scores can be estimated even more accurately by also using the gaze distribution to the current speaker and listener and the amount of speaking over the entire discussion. We also found that the IRI scores can be estimated more accurately than the Big Five scores.
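
As a rough illustration of the estimation pipeline the abstract describes (per-participant features from gaze behavior during turn-keeping/changing, DA categories, and the amount of speaking, regressed against IRI and Big Five scores), the following is a minimal sketch in Python with scikit-learn. The feature encoding, function names, and the choice of support-vector regression are assumptions made for illustration, not the authors' method.

```python
# Hypothetical sketch of the score-estimation pipeline suggested by the abstract.
# Feature encoding and the SVR model are assumptions, not the authors' implementation.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def participant_features(gaze_events, da_labels, speaking_time, total_time, da_vocab):
    """Build one feature vector per participant.

    gaze_events   -- (turn_type, gaze_target) pairs observed near the ends of the
                     participant's utterances, e.g. ("keeping", "listener") or
                     ("changing", "next_speaker"); hypothetical encoding
    da_labels     -- dialogue-act category of each of the participant's utterances
    speaking_time -- the participant's total speaking duration (seconds)
    total_time    -- duration of the whole discussion (seconds)
    da_vocab      -- fixed list of DA categories used for frequency features
    """
    gaze_targets = ["speaker", "listener", "next_speaker", "elsewhere"]
    feats = []
    # Gaze distribution near the end of utterances, split by turn-keeping/changing.
    for turn_type in ("keeping", "changing"):
        targets = [tgt for tt, tgt in gaze_events if tt == turn_type]
        n = max(len(targets), 1)
        feats += [targets.count(tgt) / n for tgt in gaze_targets]
    # Relative frequency of each dialogue-act category.
    n_utt = max(len(da_labels), 1)
    feats += [da_labels.count(da) / n_utt for da in da_vocab]
    # Amount of speaking over the entire discussion.
    feats.append(speaking_time / total_time)
    return np.asarray(feats)

def evaluate(X, y):
    """Cross-validated error of a support-vector regressor predicting one score
    (e.g. an IRI subscale or a Big Five trait) from the feature vectors in X."""
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
```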

Author information

Corresponding author

Correspondence to Ryo Ishii.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Ishii, R., Kumano, S., Higashinaka, R., Ozawa, S., Kinebuchi, T. (2021). Estimation of Empathy Skill Level and Personal Traits Using Gaze Behavior and Dialogue Act During Turn-Changing. In: Stephanidis, C., et al. (eds.) HCI International 2021 - Late Breaking Papers: Multimodality, eXtended Reality, and Artificial Intelligence. HCII 2021. Lecture Notes in Computer Science, vol. 13095. Springer, Cham. https://doi.org/10.1007/978-3-030-90963-5_4

  • DOI: https://doi.org/10.1007/978-3-030-90963-5_4

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90962-8

  • Online ISBN: 978-3-030-90963-5

  • eBook Packages: Computer Science, Computer Science (R0)
